Python Bytes - #333 Live From PyCon
Episode Date: April 22, 2023. Topics covered in this episode: Introducing Microsoft Security Copilot, PEP 695 – Type Parameter Syntax, Auto-GPT, Astral: Ruff is now a company, Extras, Joke. See the full show notes for this episode on the website at pythonbytes.fm/333
Transcript
Hey, welcome everybody and welcome. We've got a whole room full of people. We're recording this live.
How about we get a live from PyCon shout out?
There we go. Thank you all.
Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds.
This is episode 333, half of a beast, recorded April 21st, 2023.
And I'm Brian Okken.
And I'm Michael Kennedy.
And we are live at PyCon US 2023.
Yeah, it's excellent to be here.
Thanks, everyone, for coming.
It's awesome.
It's a real honor.
So, shall we kick off the show?
Let's kick off the show.
I think first, I'd just like maybe to get your thoughts real quick on PyCon, the Expo, people.
How's it feeling this year?
Well, it's the first time I've been back from the pandemic.
I didn't show up last year, so I'm pretty excited.
It's really good to see people.
There's people that I haven't seen in person since 2019, so it's pretty awesome.
How about you?
Same.
It's really great to reconnect with a bunch of people and see folks I haven't seen for a long time.
Really nice.
Yeah, it's been great.
Plus we got the Python staff this year.
Yeah, the Python, the staff of Python things.
So I want to talk about something that's all the rage,
and I do want to put that on the screen for the live stream people as well.
And that is more AI chat things. What do you all think about ChatGPT and all these things? Is this scary, or is it awesome? Like, we're getting... yeah, all right. I think that represents how I've felt. I feel like everyone out there is at different stages in this whole journey. But we saw a whole bunch of thumbs up and a lot of sideways. So we're not quite sure yet what we think of it. It's one of those things that's
kind of here, the cat is out of the bag. We can either rail against it or find good uses for it
to take advantage of it. So Microsoft has found a way to use the large language models behind OpenAI and the stuff that powers ChatGPT
to help security defenders, they say.
Like if I'm on the blue team trying to stop people from breaking into my company, I could
use a little bit of help with that.
And you can already use ChatGPT for all sorts of crazy programming and security type of
things, right?
You can say, hey, dear chat, I would love it if you could help me write a phishing email that looks convincing, or I would like you to help me identify things I might test for misconfiguration in nginx files and what I might do with that, you know?
Those are all bad things,
but this project here is called Microsoft Security Copilot.
It says empowering defenders at the speed of AI.
And so basically what this is, is it's ChatGPT, but instead of using a general purpose
language model, it's using a cybersecurity focused large language model that understands
things like don't let me get hacked, buffer overflows, configuration files, that kind of
stuff. So if you're in the space of cybersecurity, which Python is one of the most popular languages
out there for cybersecurity, right? Both sides of it, the good and the bad. But yeah, so basically
you give it a prompt, you ask it a question about a configuration file or some kind of environment, and it'll go and use that large language model. And it doesn't always get it right. This is one of the big challenges. Maybe some of the thumbs down from you all were like, you know, this large language model made up something about the world or whatever, but it was real confident. It was certain it was right, but it wasn't. So this has a feedback loop. You can say, no, no, that's actually not misconfigured, Security Copilot. That was okay, and here's why. And so you can have this loop that you would have with, you know, maybe a junior cybersecurity researcher or whatever.
Another thing is, I don't really know how all these large language models work, but much of it seems to be: we're going to go find a bunch of other people's work and take that, and we'll have a really cool system with this cool data, right? Like, we're going to scan repos, and maybe it doesn't matter if it's GPL if we filter the GPL out through some kind of neural net, or get all the Getty images, and now we can create really cool pictures if you ask for them. But Getty wasn't on board with that. So the data story is a little suspicious for these. But with this one, they explicitly say your data does not get shared back. It doesn't go anywhere. You can even lock it down about how other people are allowed to access it.
So that's kind of cool.
And yeah, they're basically trying to help people go through log files and other things on the server
where people are trying to hide their tracks,
behaving normally, but not really,
and pull those things out.
Now, I have no experience with this,
but I know I interviewed some folks on TalkPython
who are astronomers looking for
exoplanets. And they were able to take old Kepler data and apply some machine learning and computer
vision and discover 50 new exoplanets that people thought they had already analyzed. And guess what?
They were hiding. They couldn't be discovered by people, but by computers they could. I suspect the
same type of thing is true here. Like there's 10 million lines of log file, and these three are suspicious, but nobody really noticed, you know.
So anyway, if you're in cybersecurity, definitely give this a look.
So next, I want – I should have thought of this ahead of time.
But we've got a bunch of people here that can't see our screens, which is a good reminder that this is also an audio podcast. It's not just on YouTube, apparently. So the next topic I'll have to be careful talking about, but it's PEP 695, Type Parameter Syntax. Now, this PEP is for Python 3.12, and it's accepted. I don't know if it's already in some of the alphas or betas or not. Yeah, I don't know either. But it's accepted for 3.12: type parameter syntax. The abstract is: this PEP specifies an improved syntax for specifying type parameters within a generic class, function, or type alias.
It also introduces a new statement for declaring type aliases. What does that mean? Well, it has some great examples. So if we go down to the examples, there's the old way. Let's say I've got a function that takes something. We don't know what the type is, but it takes something and then returns the same type. Or it has to take two of the same typed things. Doesn't matter what they are. So like two ints, or two floats, or two lists, or two tuples, it doesn't matter what, but it's the same type. The old way to do that, which I still think is fairly recent, I think this might have been 3.11 for TypeVar, it's pretty new, I think. Yeah, I'm laughing because it's rolling over so quickly, right? Yeah. So anyway, the old way to do it was from typing import TypeVar.
I didn't even know you could do this.
And then you declare a new type, like in this example, _T = TypeVar("_T"). And then you can use that as the type of the arguments. And that's really kind of ugly syntax. The new proposed syntax is to just give brackets, like [T], after the function name to say, basically, it's a templated function, like all the other generic statically typed languages, like C and stuff, right? Yeah. So it definitely reminds me of the... Templates. Templates, thank you. In C++ and stuff. So it's definitely easier. I still, I'm not sure. So it's approved, so we'll get this in 3.12. It's definitely better than the old way, but it's still, I think we might be confusing people with this. What do you think?
I think types in Python are awesome,
but I think it can also go too far.
I mean, let's ask it since you all are here.
Let's ask how many people like typing in Python?
Almost uniformly, yeah.
Okay.
But it can get over the top sometimes, I think.
One of the things, though, is cool.
One of the bottom examples in this,
it shows the combining types.
So maybe a function that takes two of the same type things, maybe that's a little weird.
But it's not too weird if you think of lists of things.
If I want to say it can either be a list or a set of a certain type, but only one type, how do you say that without these generics?
Yeah, I know.
Yeah, I think-
It is incomplete.
And so it's the question of how far
are you going to push the language
to get that last couple of percent?
Anyway, it is looking a lot more like C, isn't it?
I'm glad I studied that,
but also glad I don't have to write it these days.
So something to look forward to in Python 3.12
is PEP 695.
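For reference, here's a minimal sketch of the old and new spellings, loosely based on the PEP's examples; the function and alias names are made up for illustration, and the new syntax needs Python 3.12 or later:

```python
from typing import TypeVar

# Old spelling: declare a standalone TypeVar, then reuse it.
_T = TypeVar("_T")

def first_old(items: list[_T] | set[_T]) -> _T:
    return next(iter(items))

# New PEP 695 spelling (Python 3.12+): the type parameter is declared
# inline, in brackets right after the function name.
def first_new[T](items: list[T] | set[T]) -> T:
    return next(iter(items))

# PEP 695 also adds a dedicated statement for declaring type aliases.
type Pair[T] = tuple[T, T]
```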
Yeah, absolutely.
While we're riffing on types, I just want to make a quick comment.
I got a message from somebody recently on this project.
It said, Michael, I discovered a bug in your code.
It doesn't run.
I'm like, oh, really?
It seemed like it ran last time I touched it.
But okay, what's going on?
Well, you just used the lowercase-L list bracket type, and only capital-L List works.
Like, no, the bug is you're in Python 3.9, not 3.10.
And this is a new feature.
And I think, I'm joking kind of,
but with all these changes so quickly,
like it starts to get,
you've got to be on the right version of Python
or this thing won't exist, right?
And it's going to be an error.
Yeah.
It used to be, oh, the last five versions is fine.
Now it's like, eh, the last version is fine.
We'll see.
Yeah.
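As a small illustration of that kind of version gotcha: the lowercase built-in generics (PEP 585) are only understood on newer Pythons, while the capital-L import from typing works further back. A tiny sketch, with made-up function names:

```python
from typing import List

# Newer Pythons understand the built-in generic directly (PEP 585):
def total(values: list[int]) -> int:
    return sum(values)

# Older versions need the capital-L List imported from typing:
def total_legacy(values: List[int]) -> int:
    return sum(values)
```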
Also, I'm working with some educators, and one of the tricky things in universities is that your curriculum kind of needs to be known ahead of time, and they kind of set that. And so with Python moving so fast, I wonder how educators are dealing with this, whether they're teaching 3.8 or 3.11.
All right. We got some teachers in the audience saying 3.11.
Kids, they like new shiny things anyway. Give them that. All right.
All right.
What's next here, Brian? What's my next one? I don't know either. No, I do. It has to do with
AI probably. So this one comes to us from Matt Harrison, who's here at the conference if you want to say hi. Obviously there's all this GPT stuff going
crazy, but one of the challenges is you can ask it a question and it'll give you an answer, right?
Like, hey, please write this code for me. And boom, here it is, you don't need to hire anybody. Just take this code and trust me, or whatever, right? You can ask it a couple of questions, but it has what's called, was it a token stack or something like that? It only has so much memory of the context of what you're asking it. And the ability to go ask it to do one thing, and then based on its response go do another, and then a third after that, it's not quite there yet. So there's this project called Auto-GPT. So if you have an OpenAI API key, basically, if you pay for OpenAI or somehow
have access to it, then you can plug it into this thing. And what it does is you give it a mission.
You say, dear AI thing, what I would like you to do is go search Google for this,
figure out what you find, and then get the top three most popular ones, go find their
web pages, take all the information out of that and summarize them for me. And then make a prediction
about, like, who's going to win the Super Bowl, 'cause I'm going to bet big on it. I don't know. So basically that's the idea. It says it has a couple of benefits over regular ChatGPT, for example, which is that you can't connect it to the internet. I don't know if you ever played
with it, but it'll say things like, I only know up to 2021. Sorry. This one has internet access.
It has long-term memory storage. It'll store in a database. So you can like have it go on and on
for a long time. File storage, all sorts of interesting things. So they have a video that
we'll link in the show notes. You can check out here. I'm going to mute it because I don't want
to hear this person talk, but they fire it up and it says, all right, we're going to get started. And what I want you to do, your role is: an AI designed to teach me about Auto-GPT, the thing that is itself, right? Very meta, self-referential. Your goals, as a list in Python, are: first, search what Auto-GPT is, then find the GitHub and figure out what it actually is from its GitHub project, then explain what it is and save your explanation to a file called auto GPT dot TXT, and then stop. And if you run it, you'll see, okay, well now it's gone out to Google, and it's done this thing, and it's pulled it in, and now it's starting to analyze it.
And why is this interesting? This is all Python code, right? So this thing is created in Python.
You run it with Python. I'm sure you can extend it in different ways with Python,
but yeah, it's, it's pretty nuts. You, you create these little things, you put them on a mission and you just say, go, you know, go get me tickets for this concert or go do this other thing. And
here's the plan I want you to follow
and you just set it loose.
So anyway, if you want to combine some Python
and some automating of the large language models,
there you go.
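To make the "set it loose on a mission" idea concrete, here's a tiny sketch of that kind of loop. This is not Auto-GPT's actual code; it assumes the 2023-era openai package with an OPENAI_API_KEY in your environment, and the mission steps and model name are just placeholders:

```python
# Rough sketch of the agent idea: give the model a mission, keep a running
# "memory" of what it said, and feed each answer back in as context.
import openai  # assumes openai < 1.0 (the API as it was in early 2023)

MISSION = [
    "Search for what Auto-GPT is.",
    "Summarize what you found.",
    "Write the summary as a short explanation.",
]

memory = []  # stands in for Auto-GPT's longer-term memory store

for step in MISSION:
    messages = [
        {"role": "system", "content": "You are an autonomous research agent."},
        *memory,
        {"role": "user", "content": step},
    ]
    response = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
    answer = response["choices"][0]["message"]["content"]
    memory.append({"role": "assistant", "content": answer})
    print(f"Step: {step}\n{answer}\n")
```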
This seems like something could definitely
easily be used for evil.
No, no way.
There's no way.
Yeah, I agree.
All right.
What do you got for the last one? So, we've talked about Ruff before, I think. So there's been an announcement that Charlie Marsh now has his own company and is hiring people. Charlie Marsh has formed a company called Astral, and he's made a good start.
He's starting with $4 million of investment money, so it's not a bad deal.
That is not a bad deal at all.
Bad deal to start a company.
But I'm kind of excited about it, actually.
Well, one, I'm happy for him.
Obviously, well, at least I hope it's a good thing for him.
But I just think it's neat that, I guess I just wanted to highlight and say,
congrats, Charlie, you're doing this.
So Ruff, if you're not familiar, is kind of like a Flake8 linter sort of thing.
But it's written in Rust, and it's really, really fast.
It's so fast, you can barely detect it's running.
Yeah, how many of you all have heard of RUFF?
R-U-F-F?
Pretty much everyone, and this thing's only been out like a year,
so that's a big deal.
Yeah, I ran it on the Python Bytes and the Talk Python code
and 20,000 lines of Python, and you're like,
did it actually run?
Did I give it the wrong files?
It might not have seen anything.
It's instant.
It's crazy.
One of the things Charlie's noticed is that it's becoming very popular,
but he's also getting a lot of requests.
So it's a very active project now, and I'm sure it's taking a lot of time.
So he's got things like new requests.
Let's do more of the extensions of Flake8, which is completely valid.
And then also, yeah, well, this was a good idea of taking part of the Python tool chain
and rewriting it in Rust.
What other stuff could we rewrite in Rust?
And I think that's where they're headed: making more Python things more Ruff-like or, you know, Rustifying them.
So I'm excited for it and to see what they come up with. And he's promising that a lot of this stuff is going to be open source, available to everybody. So awesome. Congratulations, Charlie. That's awesome. I would say, you know, when I got into Python nine or ten years ago, there seemed to be this really strong resistance to anything corporate, anything where people were trying to bring money in. It seemed really suspicious. Like, what is your motive here? Are you trying to corrupt our open source environment? And I think since then we've kind of found a way where there can be commercial interests that don't undermine the community, but also come in and benefit it. I mean, we saw Samuel Colvin with Pydantic.
We're seeing this now.
And a lot of them seem to fall...
And Textualize.
Textualize, absolutely.
Will McGugan, out with Rich.
Sorry, Will.
And a lot of them seem to fall under this, what's called the open core business model, where the essence of what they're doing they give away for free, like Rich, like Pydantic. But then on top of that, there's something that is highly polished and commercial, and that's where they're kind of working. And personally, I'm just really happy for these folks that this has happened. I think it creates more opportunity for people in Python. People who've worked on these projects for so long, it kind of pays off eventually, right?
The PayPal donate button,
there's no way that that's a job.
That's like a, it covered my dinner once a month
sort of thing.
I also get that there's a lot of people
that can't do this.
I mean, there's a lot of things
that people are happy with their normal job
but they're doing something cool on the side.
We still need to figure out
how to
compensate those people better. Yeah, we'll figure that out. One of the things I wanted to bring up is, I was talking about this announcement with somebody just yesterday, and they said, oh, Ruff, it's kind of like Black, right? I'm like, wait, I don't think that's quite right. I think of it more like Flake8, but I was curious about the overlap. So I went and looked in the FAQ, and the top question is, is Ruff compatible with Black?
So yes, it says Ruff is compatible with Black out of the box, as long as the line-length setting is consistent between the two, because Black has a weird line-length thing. I've had no problem with running them together, and I was like, also, should I run them together? And right in here, it says Ruff is designed to be used alongside Black, and as such will defer implementing stylistic lint rules that are obviated by auto-formatting. So what does that mean? It means they're assuming that you're running Black, so if running Black will do something, there's no point in Ruff checking it, because they know you've already done it. Yeah, don't let them fight, wrap this line, unwrap that line, wrap that line. Well, that, and also it's not their highest priority to check for lint errors that Black would have changed anyway. So yeah.
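If you want the two tools to agree on line length, that usually comes down to one matching setting in each tool's section of pyproject.toml. A minimal sketch, where 88 is just an example value (it happens to be Black's default):

```toml
[tool.black]
line-length = 88

[tool.ruff]
line-length = 88
```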
Indeed. All right, well, congrats, that's very cool. I think that might be it for our items, huh? What do you think? Oh yeah, for our main items. Our main items. You got some extras? I do have one extra. The one extra is... I'm like, Feickert? What? Matthew? Matthew Feickert.
Okay.
Yes.
Wanted us to bring up, which, sorry, Matthew, for me forgetting your name right away.
Former Python Bytes co-host, guest, attendee.
Yeah.
So I wanted to announce that the tickets are available.
It's now open.
You can buy tickets to SciPy 2023.
And SciPy 2023 is in Austin, Texas on
July 10th through the 16th.
So that's open if anybody wants
to go. Should be fun.
Anyone going to Austin to go to
SciPy? I know you've all used up
your conference going.
Some maybes out there.
I mean, Austin would be great to visit.
SciPy will give you a different flavor of Python. I think it'd be
great, but
I can't make it.
I'm coming home from vacation on the 10th or something like that,
which makes it a little tight to get all the way to Austin.
All right. Do you have any extras?
I have one extra, nothing major, kind of a follow-up here.
The mobile app, I talked about that.
The mobile app is officially out for Talk Python courses. And I would like people to try it out. If they find a bug, shoot me an email rather than write a one-star review and trash it, because we're working really hard to get it complete. It's been two and a half months I've been working on it. It's completely redone from scratch. It's very nice. But it needs a little testing across all the zillions of devices.
Android is out.
Do you notice, Brian, I did not say the Apple version is out, did I?
No.
Oh, no.
No, no, no.
Because when you submit something to Apple, what they tell you is rejected, rejected.
Your app does not do X, Y, and Z.
And Android's like, yeah, sure, that's good.
So we're now adding in-app purchasing because without it, you can't have your app.
So I'm going to work on that for the next week.
And then we'll have an Apple version y'all can test and it will be out, but it's just not out yet.
What are you going to sell for in-app purchases?
Courses.
I actually wrote some of them.
You know, I might even sell one of yours.
Yeah, the PyTest course.
Yes, exactly.
Nice. Awesome. Anyway, that's my extra. What's Android, by the way? Yeah, it's... no, just kidding. Let's not go there. This one, I'm going to take a chance, take a risk here, and turn my screen around for everyone, because this joke is very visual. You'll be able to see it over there, and you can see mine, but you know it already. This is what it's like releasing to production. We've got the senior dev, and we've got the junior dev. Here we go, here we go. What is this, Mr. Bean? Yeah, Mr. Bean. People are rocking all over, the junior dev is hanging on for life, there's like molten lava here in a second, that's the database, some of the developers are thrown into the lava, there you go, the scrum master was thrown into the lava, which is the database. Anyway, what do y'all think? You ever felt that way?
No, I'd definitely throw the Scrum Master into the lava.
Yeah, definitely, definitely.
But anyway, that's what I brought for our joke.
Nice.
I like it.
And I also took you off the camera.
There you go.
That's all right.
Well, this was fun doing a live episode.
It was very fun.
And thank you all for being here.
This is really awesome.
Yeah.
Thanks to everybody.
And thank you, everybody online for watching and showing up.
Yeah, absolutely.
Bye, y'all.