Python Bytes - #288 Performance benchmarks for Python 3.11 are amazing

Episode Date: June 14, 2022

Topics covered in this episode: Polars: Lightning-fast DataFrame library for Rust and Python; PSF Survey is out; Gin Config: a lightweight configuration framework for Python; Performance benchmarks for Python 3.11 are amazing; Extras; Joke. See the full show notes for this episode on the website at pythonbytes.fm/288

Transcript
Starting point is 00:00:00 Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds. This is episode 288, recorded June 14th, 2022. I'm Michael Kennedy. And I am Brian Okken. Brian, how are you doing? I'm excellent today. I hear you're a little busy. It's just, you know, being a parent and having side jobs and stuff like that. Of course. Well, it's better than the alternative.
Starting point is 00:00:24 Definitely. I was talking to somebody this weekend about like their one job and trying to balance job and life. And I'm like, I don't even remember what that's like with just one job. I know. Or you have a job where you go to work
Starting point is 00:00:35 and you do the work. And then when you go home, there's no real reason to do the job anymore. So you can just step away from it. It's got its glories. And yet I continue to choose the opposite, which I also love. All right, well, speaking of stuff people might love, you want to kick us off with your first item? Yeah, we're going to talk about polar bears. No, not polar bears, a project called
Polars. And actually it's super fun and cool. So Polars was suggested to us by several listeners; we got several people who sent it in, and I'm sorry I don't have their names, but thank you. Always send great stuff our way, we love it. Polars is billed as a lightning-fast DataFrame library for Rust and Python, and it is written in Python... no, it's written in Rust, but the full API is present in Python, and it's kind of neat, actually, how they've done it. So we've got the splash screen for the Polars project up on the screen. There's a user guide and an API reference, of course. But one of the things I wanted to talk about is some of their "why you would consider it."
Starting point is 00:01:48 So Polars is a lightning-fast DataFrame library. It uses an in-memory query engine, it says it's embarrassingly parallel in execution, and it has cache-efficient algorithms and an expressive API. They say that makes it perfect for efficient data wrangling, data pipelines, snappy APIs, and so much more.
Starting point is 00:02:13 But it also is fun. I played with it a little bit. It's zippy and fun. They have both the ability to do lazy execution and eager execution, whichever you prefer for your use. It's multi-threaded, and it has a notion of single instruction, multiple data (SIMD). I'm not exactly sure what that means, but it makes it faster, apparently. And I was looking through the introductory material; the user guide is actually written like a very well-written book. And it looks like the whole guide,
Starting point is 00:02:49 as far as I can tell, is written for the Python API. So I think that was part of the intent all along: to write it quickly in Rust, expose it to Rust users also, but also run it with Python. And it's just really pretty clean and super fast. In one of these benchmark results, Spark took 332 seconds and
Starting point is 00:03:15 they took 43 seconds. And I know, it's a hundred million rows in seven columns, so that's not just, let's load up a couple of pieces of data or something, right? So there's a lot of focus on making sure that it's fast, especially when you don't need everything, like doing lazy evaluation, or making sure you do multiprocessing. One of the things I thought was really kind of cool, looking through the documentation, is there's a section talking about parallelization that says: do not kill the parallelization. Because with Python, we know there are basically ways to use Polars that can kill parallel processing because of the GIL.
Starting point is 00:04:07 If you don't do it the way they've set it up, you can use it in a way that makes it a little slower, I guess, is what I'm saying. So there's a section on this talking about Polars expressions. These are all set up so that you can pass these expressive queries into Polars and have it run in the background and just make things really fast, and sort of skirt around the GIL, because you're doing all the work in the Rust part of the world and then collecting the data later. So there's like a "set up the query and then collect the query" step. That's kind of cool. So anyway, I just thought this looks really
Starting point is 00:04:52 fun. There's nothing to it; you don't have to know that it's in Rust. You just say pip install polars and it works. Yeah, that's great. Out in the audience, Sarab asks: why Rust and not C? Maybe an example there is pandas versus this. Also, probably the person who wrote it just really likes Rust, and I think Rust has a little bit more thread safety than straight C does. I'm not 100% sure, but this uses threads, as you point out, whereas the others, pandas and friends in C, don't. I also think we're going to see a lot more things like this, because I think some of the early faster packages for Python were written in C because Rust wasn't around, or wasn't mature enough. But I think we're going to see more people saying, well, I want it to be closer to
Starting point is 00:05:42 the processor for some of this stuff. Why not Rust? Because I think Rust is a cleaner development environment than C right now. Yeah, I agree. Absolutely. It's just a more modern language, right? You know, C is keeping up. C is never going to be old, I don't think. But yeah.
Starting point is 00:06:01 Yeah, yeah. I don't mean to say that C is not modern in the sense that people are not using it, but it doesn't embrace, in its sort of natural form, things like smart pointers. Yeah, and there's C++, maybe, but not C. There are safety features built into Rust that just make it easier to not do dumb things, I guess. Let's put it that way. Indeed.
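To make the lazy "set up the query, then collect it" idea above concrete, here is a minimal sketch, assuming a recent Polars release; the CSV path and column names are made up for illustration.

import polars as pl

# scan_csv is lazy: nothing is read yet, this only describes the query.
lazy_query = (
    pl.scan_csv("measurements.csv")
    .filter(pl.col("value") > 0)                      # a Polars expression
    .group_by("sensor_id")
    .agg(pl.col("value").mean().alias("mean_value"))  # another expression
)

# collect() runs the whole plan in the Rust engine, in parallel,
# and only then hands a DataFrame back to Python.
df = lazy_query.collect()
print(df)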
Starting point is 00:06:29 All right, well, let's jump on to my first item, which is a follow-up from last week: the Python Developers Survey 2021. Yes, you heard that right. I know it's 2022; these are the results from the survey that ran at the end of last year. So I'm going to kind of skim through this and just hit on some of the main ideas here. There's a ton of information,
Starting point is 00:06:43 and I encourage people to go over and scroll through it. This is done in conjunction with the folks over at JetBrains, the PyCharm team and all that. So it was collected and analyzed by the JetBrains folks, but put together independently by the PSF, right? So it's intended to not be skewed in any way towards them. All right. First thing: if you're using Python, is it your main language or your secondary language? 84% of the people say it's their main language, with 16% picking up the balance of not so much. It's been pretty stable over the last four years. What do you think of this, Brian? I think there's a lot of people like me. I think it started out as my secondary language and now it's my main language.
Starting point is 00:07:29 Yeah. Interesting. Yeah. And it got sucked in. Like, ah, maybe I'll use it to test my C stuff. Wait, actually, this is kind of nice. Maybe I'll do more of this. Yeah. The next question, or analysis, is always fraught with weird overlaps, but I like the way they ask this a little bit better than a lot of surveys. It says "Python usage with other languages": what other languages do you use Python with? Rather than a more general one where they ask, well, what is the most popular language, and you'll see weird stuff like, well, most people code in CSS. Like, I'm a full-stack CSS developer. No, you're not; everyone just has to use it. What is this? It's a horrible question. Yeah, right. So this is, if you're doing Python, what other languages do you bring into the mix? And I guess maybe just hit the top five: JavaScript, because you might be doing front end and back end; HTML and CSS, same reason; Bash and shell, because you're doing automation, builds and so on;
Starting point is 00:08:14 SQL. That's SQL; I'm surprised there's that much direct SQL, but there it is. And then C and C++, speaking of that language. Yeah, also, to sort of address the thing that I brought up before, Rust is at 6%. Last year it was at 5%, compared to C at 30 and 29. So they both grew by 1% this year. Okay. Yeah. I think they both grew.
Starting point is 00:08:36 That's interesting. Yeah, exactly. Another thing people might want to pay attention to is that you'll see year-over-year stuff all over the place in these reports, because they've been doing this for a while. So the top bar that's darker, or sorry, brighter, is this year, but they always also put last year. So for example, people are doing less Bash; you can see the lower bar is higher. And they're doing less PHP. Probably means they love themselves a little bit more. Don't go home crying. Okay.
Starting point is 00:09:06 Let's see. Languages for web and data science. This is kind of, if you're doing these things, what do you use alongside Python? So if you're doing data science, SQL is your most common other thing. If you're doing web, surprise, JavaScript and HTML are the most common other things.
Starting point is 00:09:22 Let's see. What do you use Python for? Work and personal is 50%, personal is 29%, and work is 20%. Kind of interesting that, of the people who use it for just one or the other of work or personal, more use it for side projects. I guess people who know Python at work, they want to go home and they're like, you know what?
Starting point is 00:09:40 I could automate my house with this too. Let's do that. I think that, yeah, I would take it like that. I think it isn't just even automating your house. It's just playing around with it at home. Like, yeah, I heard about this new web framework, FastAPI. I want to try it out, things like that. Yeah, absolutely.
Starting point is 00:09:56 I'm going to skip down here through a bunch of stuff. Where do you use Python the most? Web development, but that fell year over year. Data analysis stayed the same year over year. Machine learning fell year over year. And a bunch of stuff. So the growth areas year over year are education and desktop development, and then "other," which I think is pretty interesting.
Starting point is 00:10:17 Also, game development doubled. Doubled from one to two percent. I mean, from one to two, that might be within the margin of error type of thing, but still, it doubled. But I think, no, "other" didn't grow; I think it's just more spread out, I don't know, because I think there's still the same number of people using Python. All right, are you a data scientist? One-third yes, two-thirds no. That fits with my mental model of the Python space: one-third data science, one-third web and API, and one-third massively diverse other. That's the way I see the ecosystem. Python 3 versus 2, I think we're
Starting point is 00:10:51 asymptotically as a limit approaching Python 3 only, but year over year it goes 25% from 2017, then 16% Python 2, then 10%, then 6, then 5. And then there's just huge codebases that are stuck on Python 2. Like some of the big banks have like 5,000 Python developers working on Python 2 codebases that are so specialized and tweaked that they can't just swap out stuff.
Starting point is 00:11:16 So, you know, that might represent 5% bank usage, I don't know. I just feel bad for you. We're rooting for you, everybody out there using Python 2. Stick in there; let's approach that limit. Yeah, let's divide by n factorial, not n, for your limit there. Let's go. All right. Python 3.9 is the most common version, 3.10 being 16%, and 3.8 being 27% versus 35%. So that's pretty interesting. Yeah. I feel like this is, hey, this is what comes with my Linux and this is what comes with my Docker, so I'm using that. But maybe
Starting point is 00:11:50 it's more. Yeah, it's interesting, because you and I are in an interesting space, because we're always looking at the new stuff. So I'm at 3.10 and I can't wait to jump to 3.11. Yeah, and actually I switched to 3.11 for some projects. But there's a lot of people who are like, man, Python's pretty good and it's been good for a while,
Starting point is 00:12:10 so I don't need a lot of the new features. So. Yeah, for sure. I'm going to later talk about something that might shift that
Starting point is 00:12:19 to the right. I've actually been thinking, like, should I maybe install the 3.11 beta and see how stable that is on the servers? We'll see. That might be a bad choice. Yeah.
Starting point is 00:12:29 Might be a bad choice, might be a good choice. We'll see. Okay. Where do you install Python from? 38% just download the thing from python.org and run with that. The next most common option
Starting point is 00:12:39 is to install it via your OS package manager, apt, homebrew, whatever. And Alvaro has a great little recommendation out there for people who are stuck on Python 2. There probably is a support group for Python 2 users. Hi, my name is Brian and I use Python 2. Hi, Brian. All right.
Starting point is 00:13:03 Another one I thought was pretty interesting is the packaging stuff, the isolation stuff. But before we get there, really quick: web frameworks. FastAPI continues to grow pretty strong here. We've got Flask, which is now, maybe within the margin of error, just edged ahead of Django. But FastAPI almost doubled in usage over the last year. It grew nine percentage points, and it was at 12% last year, so now it's at 21%, which is a pretty big chunk to take out of established frameworks. Yeah. Well, and it looks like the third is "none." I haven't tried that one yet. Yeah. It gets a lot of attribute errors, but it's really efficient because it doesn't do much work. Yeah. People who maybe don't know FastAPI, the name would indicate it's only for building APIs.
Starting point is 00:13:49 But you can build web apps with it as well. And it's pretty good at that. Especially if you check out Michael's courses; he's got like two courses on building web apps with FastAPI. I do. And I also have some sort of template extensions for it that make it easier.
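For anyone who has only seen FastAPI used for JSON APIs, here is a minimal sketch of serving a plain HTML page with it. This is an illustration, not Michael's template extensions; the route and markup are made up, and it assumes FastAPI and uvicorn are installed.

from fastapi import FastAPI
from fastapi.responses import HTMLResponse

app = FastAPI()

@app.get("/", response_class=HTMLResponse)
def home() -> str:
    # Returning markup instead of a dict makes this a (tiny) web app
    # rather than a JSON API.
    return "<h1>Hello from FastAPI</h1><p>Not just for APIs.</p>"

# Run with: uvicorn main:app --reload   (assuming this file is saved as main.py)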
Starting point is 00:14:07 All right, data science libraries. I don't know how I feel about this one. Do you use NumPy? Well, yes, but if you use the other libraries, then you also use NumPy. Yeah, it's like all of these are using NumPy. Exactly. Yeah.
Starting point is 00:14:18 A bunch of other stuff. Look at that for unit testing: would it surprise you that pytest is winning? No. It just overtook "none" this year, didn't it? Yeah. All right. ORMs: SQLAlchemy is ahead, and then there's
Starting point is 00:14:36 the Django ORM. The Django ORM is tied to Django, while SQLAlchemy is broad, so there's that. And then kind of the "none" of the ORM world, raw SQL, is at 16%. That's pretty interesting. Postgres is the most common database by far at 43%, then you have SQLite, which is a little bit of a side case; you can use it directly, but it's also used for development. And then MySQL, then MongoDB, and then Redis and Microsoft SQL Server. So yeah. Huh, actually SQL
Starting point is 00:15:02 Server and Oracle are higher than I would expect, even though, you know, but that's okay. Well, I think what you're going to find is that there are certain places, especially in the enterprise, where it's like, we're a SQL Server shop, or we're an Oracle shop, and our DBAs manage our databases. So here you,
Starting point is 00:15:18 you put in a, you file a ticket and they'll create a database for you. Or there's a, you know what I mean? There's already an existing database and you're connecting to it or something. Yep, yep, yep, exactly. All right, let's keep going. Cloud Platforms, AWS is at the top.
Starting point is 00:15:31 Then you've got Google Cloud at 50%, then GCP, Google Cloud Platform, then Azure, then Heroku, then DigitalOcean. Linode has made it onto the list here, so, you know, former and sometime sponsor of the show; good for them. And let's see, how do you run stuff in the cloud? Let's skip over this.
Starting point is 00:15:49 I think there are a few more interesting things and then we'll call it. Compared to 2020, Linux and macOS popularity decreased by 5%, while Windows usage has risen by 10%. Wow. Yeah. The Windows people now more than double the macOS people and are almost rivaling the Linux people. I think that's just down to the growth of Python. I think Python's just making it more into the "everybody's using it" sort of thing.
Starting point is 00:16:16 Yeah. And there's also Windows Subsystem for Linux, which has been coming along pretty strong, which makes Windows more viable and gives it more parity with your cloud targets, right? And, out in the audience: it feels like it's because of WSL?
Starting point is 00:16:29 Yeah, maybe. Okay, let's see. A few more things. Documentation: it's cool they're asking about what documentation frameworks you use. This one's interesting to me: what's your main editor, VS Code or PyCharm? I ask this question a lot at the end of Talk Python, and it feels like VS Code, VS Code, VS Code,
Starting point is 00:16:44 VS Code is what people are saying all the time. But it's 35% VS Code, 31% PyCharm. And Brian, right there for you, 7% Vim. Okay. Yeah. I just teased you. To be fair, it's both; VS Code, it's all
Starting point is 00:16:59 three, or top four for me, but yeah. Yeah, exactly. Well, often you probably just use Vim bindings within the other two, right? Yep. Yeah. Let's see. I think another interesting breakdown is that if you look at the use scenarios, or the type of development done with the editors, you get different answers. So for data science you've got more PyCharm, and for web development, I think... hold on, did I read that right? Oh, interesting: for data science you have a lot more VS Code, for web development you have more PyCharm, and you have a lot less "other" in data
Starting point is 00:17:36 science, aka Jupyter, I suspect. Yeah. Okay, how did you learn about your editor? The first one here is "from a friend." So basically, friends push editors, like drug dealers. Like, get out of here, what are you doing? No, I think it's, if I'm watching somebody do something cool, I want to do it also because of how it looks. Yeah, exactly. You sit down next to your friend and you're like, how did you do that? That's awesome. I want that feature, right? I think you're probably right. Okay. Let's just bust through a few more things. One: do you know about, or what do you think about, the new Developer in Residence role?
Starting point is 00:18:12 This is Łukasz Langa; that's going on right now. 77% are like, the what? Never heard of it. So maybe we've got a little more advocacy work to do there, but he's been doing a great job really speeding things up and sort of greasing the wheels of open source contributions and whatnot. Yeah, but I'm going to take it like design,
Starting point is 00:18:32 because if design's done well, nobody knows it's there. And I think the same thing. I think if he's doing his job really, really well, most people won't notice. Things will just work. Yep.
Starting point is 00:18:43 Quick real-time follow-up. Felix out in the audience says, I use PyCharm because of Michael. It should have been one of the options in the survey because of Michael. Oh, come on. That's awesome. But no.
Starting point is 00:18:56 Let's see. There's a bunch of questions about that. And the final thing I want to touch on is Python packaging. Let's see here. Which tools related to Python packaging do you use directly? We've talked about Poetry, we've talked about Flit, pipenv and so on, and 81% of the people are like, I use pip for packaging, as opposed to Flit or something. And then sort of parallel to that is, for virtual
Starting point is 00:19:25 environments, what do you use for virtual environments, basically? Yeah, like 42% is, I just use the built-in one, or I use virtualenvwrapper. And then it's Poetry, pipenv, tox and so on, and there are a few where it's, I don't know what this is. Yeah. Well, I'm glad they included that, because one of the original questions didn't include the built-in venv, and that's, yeah, I think that's what most people use. Yeah, absolutely. All right, well, I think there's more in my progress bar here. This is a super detailed report; a link to it is in the show notes, so just go over there and check it out if you want to see all the cool graphs and play with the interactive aspects.
Starting point is 00:20:06 But thanks again to the PSF and JetBrains for putting this together. It's really good to have this insight and these projections of where things are going. Yeah. Hey, I'm going to grab the next one. Ooh, we did this smoothly this time. Nice.
Starting point is 00:20:20 So Gin Config, it's just Gin, actually; the project's called gin-config. And it's kind of a neat little thing. It's a different way to think about configuration files. So, like, you have your pyproject.toml or other .toml files, or you could have .ini files; there are a lot of ways to have configuration files. But Gin takes the perspective of: well, what if you're not really into all of that stuff, and you're a machine learning person, and you just have a whole bunch of
Starting point is 00:20:57 stuff to configure, and you're changing stuff a lot? Maybe let's make it easier. So I actually came across this because of Vincent Warmerdam; he's got an excellent intro to Gin on his calmcode site. And the idea is, for a function in your code that you want to be configurable, you just slap a gin.configurable decorator onto it. And then all of the parameters to that function are now something that can show up in a config file. And it's not an .ini file.
Starting point is 00:21:38 I actually don't know the exact syntax, but it just kind of looks like Python. You just have names, like Python. Like in this example that I'm showing, there's a file called simulate, and there's a function called simulate and a parameter called n_samples. And in your config file,
Starting point is 00:21:57 you can just say simulate.n_samples = 100, or something like that. Oh, wow. So it basically sets the default parameters for all the functions you're calling. Yeah, the ones that you want to be configurable, and you just do that. Now, you can still set defaults within your code, just like you normally would, and then you configure the ones that you want to be different from the defaults. So that's a possibility. And there's a whole bunch of, I'm going to expand this a
Starting point is 00:22:29 little bit. There's a whole bunch of different things that Vincent goes through, like required settings: you can specify, what is it, gin.required, as a function or as your parameter, and it makes it so that your user has to put it in their config file. That's kind of cool. And then, if you don't want somebody to configure something, you can mark it; he's got "blacklist the n_samples."
Starting point is 00:23:03 So in this example, he's got a simulate function with two parameters, a random function and n_samples; you want people to configure the random function, but you don't want them to touch n_samples. You can say, don't do that.
Starting point is 00:23:15 So it's kind of neat. There's a whole bunch of cool features around it, like being able to specify different functions. So you can name things, like in his example he's got random functions, and you can specify one or the other, like a triangular random function: you can specify a function and assign it to that. He's got named things. It's a really interesting way to think about configuration. And the
Starting point is 00:23:47 motivation section of the documentation for Gin says that modern machine learning experiments often just require configuring a whole bunch of parameters, and then you're tweaking them. You're going to add some and take some away, because some things you want configured and then you decide not to, and the goal is to have that be as easy and as simple as possible: not having to go through a config parser system, and making it as trivial as possible to add parameters. I think it's a really cool idea.
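Here is a minimal sketch of that pattern, assuming gin-config is installed. The function and parameter names are illustrative rather than Vincent's exact ones, and newer Gin releases spell the blacklist/whitelist options denylist/allowlist.

import random
import gin

@gin.configurable(denylist=["n_samples"])   # users may not override n_samples
def simulate(random_func=random.random, n_samples=100):
    return [random_func() for _ in range(n_samples)]

@gin.configurable
def train(learning_rate=gin.REQUIRED):      # must be supplied by the config file
    print(f"training with learning_rate={learning_rate}")

if __name__ == "__main__":
    # config.gin contains lines that look a lot like Python, for example:
    #   train.learning_rate = 0.01
    gin.parse_config_file("config.gin")
    train()
    print(len(simulate()))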
Starting point is 00:24:28 It is a cool idea. It reminds me of dependency injection a little bit. Yeah, you know, where you would configure, say, if somebody asks for a function that implements this, that goes here; this is the data access layer to use, or here's the ORM I want you to pick this time. It's not super common in Python, but it's pretty common in a lot of languages, and it feels a little bit like that: can we configure stuff so we have these parameters that we might use for testing or something, but they just get filled in automatically? Yeah. Even FastAPI has that, for example. Yeah. Pretty cool.
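For reference, the FastAPI flavor of that idea is its Depends mechanism. A minimal sketch, with a made-up stand-in for a real data access layer:

from fastapi import Depends, FastAPI

app = FastAPI()

def get_database():
    # Imagine this returning a real data access layer; here it's a stand-in dict.
    return {"engine": "sqlite", "url": "sqlite:///demo.db"}

@app.get("/status")
def status(db: dict = Depends(get_database)):
    # FastAPI calls get_database() and fills in the parameter automatically,
    # which also makes it easy to swap in a fake database during tests.
    return {"connected_to": db["url"]}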
Starting point is 00:24:45 Somebody in the audience asks, isn't Gin used with Go? I'm not sure about that, but gin-config is not an officially supported Google product, although it's under the Google GitHub organization. So maybe, yeah, maybe. It does look very Python-like for the config files, though, and that's cool.
Starting point is 00:25:20 Yeah. Good one. All right. Let me switch back before I swap over. Okay, here we go. Now, this next one, I think universally will be well accepted. Although the comment section about it was a little bit rough and tumble. Nonetheless, I think it should be universally exciting to everyone. And this comes to us from Eduardo Orochena, who sent over this article that said the, what's it called?
Starting point is 00:25:44 The Python 3.11 performance benchmarks are looking fantastic. And oh boy, are they. So we're talking beta code, six months out, right? And still, we've got some pretty neat stuff. This links over to an article with that same title by Michael Larabel. It basically says, look, we took a whole bunch of different performance benchmarks for Python and ran them on the Python 3.11 beta. This is the thing I was hinting at: you might really want to consider this if you're thinking, should we upgrade from 3.9 to 3.10?
Starting point is 00:26:18 Maybe you want to just go straight to 3.11, right? I mean, you know, as a sort of side thought, Brian, isn't it awesome that the one that goes crazy on performance is the one that goes to 11? All right, so they show all the stuff they're testing on, like an AMD Ryzen, 16 cores, 32 threads with hyper-threading; I mean, down to the motherboard and the chipset and the memory and all that. So pretty decent stuff. And then also the build commands and all sorts of things here. So pretty repeatable, I think, rather than just, hey, I ran it and here's a graph without axes or something like that. So you can kind of click through here and you see some
Starting point is 00:27:02 pictures, and it says, all right, well, there's PyBench, which I think is like the standard simple one. It says, look at this: the Python 3.11 beta is faster than 3.10, which, by the way, was slightly slower than the previous ones. But, you know, what is that, 10% or something? So already about 16% better. So that's already pretty awesome. But there's a whole bunch of other ones. They did one called Go; I don't know what these benchmarks are, and I don't think it has anything to do with the language Go, it's just the name of the benchmark. And then there's 2to3, and Chaos, which sounds like the most fun. But if you look at this Go one, it's almost 50% faster. 50% faster, that's insane, right? Yeah. Wow. And you come down to the 2to3 one; these are all estimates, 25, 20% faster, say 40% faster with the Chaos one. Come down to the float operations,
Starting point is 00:27:55 and Python 3.10 was already better than the others, but this is again maybe 30% faster. And as you roll into the next page, you just kind of see this across the board: better, better. Some of them are super better; some are a little bit better, like pathlib, better but not crazy. Ray tracing is, again, 40% better here. And you keep going; there's another one, this crypto pyaes one, some sort of encryption thing. So there's just a bunch. A couple of these, like this one at the end, you're like, oh wait, this one got way worse.
Starting point is 00:28:26 Be careful, because it says more is better on this composite, I guess, is the result here. Like, how much more computing power do you get per CPU cycle, or whatever? What is that? That's a massive jump. You saw a little bit better improvements
Starting point is 00:28:40 from 3.8 to 3.9 and 3.9 to 3.10, but 3.10 to 3.11 is like a 40, yeah, 41% better on the beta, before it's even final. Wow, that's pretty exciting, right? That's very exciting. And yeah, actually, I'm curious what some of these negative comments are. But the interesting thing is, they run lots of different benchmarks, and having them all come out faster kind of means, I take it as, you know, your mileage may vary, but it's going to be better for whatever you're doing, probably. Yeah. This feels like a thing where you could just install it and things get better. The negative comments are mostly like,
Starting point is 00:29:22 well, if Python was so slow that it could be made this much faster, then Python's a crappy language. That pretty much sums up like 65 comments right there. By the way, I interviewed Guido van Rossum and Mark Shannon a little while ago about this whole project of making Python five times faster; not 40%, but five times faster.
Starting point is 00:29:44 And the goal is to make it a little bit faster like this each release, for five releases in a row, and because of compounding, that'll get you to like 5x. So it looks like they're delivering, which is awesome. Yeah, this is good. Yeah. Cool, cool. All right.
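Those numbers come from a full benchmark suite on controlled hardware, but a quick way to get a feel for this yourself is to time the same toy workload under each interpreter you have installed. A rough sketch using only the standard library; the workload and run counts are arbitrary:

# bench_toy.py: run this under each interpreter and compare, for example
#   python3.10 bench_toy.py
#   python3.11 bench_toy.py
# This is a toy check, not the benchmark suite from the article.
import sys
import timeit

def workload():
    total = 0
    for i in range(100_000):
        total += i * i % 7
    return total

best = min(timeit.repeat(workload, number=50, repeat=5))
print(f"{sys.version.split()[0]}: best of 5 = {best:.3f}s for 50 calls")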
Starting point is 00:29:57 Yeah, I think that's it for all of our items. Yeah. Got any extras? No, I was going to pull up the, so yeah, this one goes to 11. If people don't know that, that's a Spinal Tap reference. Yeah, exactly.
Starting point is 00:30:13 All right, I got a few extras to throw out real quick. Python 3.10.5 is out with a bunch of bug fixes, like what happens if you create an f-string that doesn't have a closing curly brace, and just a bunch of crash and bug fixes. So if you've been running into issues, you know, maybe; there's a decent amount of stuff in the changelog here. Nice. People can check that out. Also, real quick, people on a Mac might check out Raycast, which is a replacement for the Command-Space Spotlight thing that has all these developer plugins, so you can do things like interact with your
Starting point is 00:30:42 GitHub repo through Command-Space and stuff. You can create little macros, and there's a bunch of extensions; this thing's free, at least if you're not on a team. But there's a bunch of different things you can get, like managing processes, doing searches, VS Code project management, from Command-Space and whatnot. The one that I set up is, I can now do Command-Space and then just type PyPI, and it'll search PyPI for whatever I type. Here's an example of typing PyPI, then FastAPI,
Starting point is 00:31:15 and it'll pull up all the FastAPI packages. So anyway, people might find that fun to check out. Yeah, that's cool. Yeah, it's pretty neat. All right, well, I think I'll not talk about my other one. And then joke, shall we close it out with a joke? Yeah, let's do a joke. So I think this ties really well back to the PSF survey. We talked about, well, what framework do you use? What data science framework do you use? Or what web framework do you want to use? Django or Flask or FastAPI or what? So here's one that is a
Starting point is 00:31:41 pretty interesting analysis. And the title is, why wouldn't you choose a parrot for your next application? Not a framework, but literally a parrot. And this is compared to machine learning. So it has a feature breakdown, like a feature table, and it has a parrot, which literally just has a picture of a parrot, versus machine learning algorithms with a neural network. And then it lists off the features: learns random phrases, check, check. Doesn't understand anything about what it learns, check, check. Occasionally speaks nonsense, check, check. It's a cute birdie parrot: check, fail. I mean, why wouldn't you choose this, Brian? This is funny. I love it. Yeah, it's pretty good stuff. It actually reminds me of,
Starting point is 00:32:27 like, I'd have to pull up this article, but I was reading about some machine learning stuff, about trying to get models closer and closer to reality. There's a whole bunch of tricks people do, and then there's some analysis that sometimes it's actually not doing anything more than just a linear regression. So, um, yeah, or a simple if statement. Yeah,
Starting point is 00:32:48 yeah, yeah, for sure. So they're using artificial intelligence to make the computer decide, and now it's an if statement. Like, it's just computers deciding things the old-fashioned way.
Starting point is 00:32:58 Yeah. Yeah. So awesome. All right. Well, thanks for being here. Thank you. Thanks everyone for listening.
Starting point is 00:33:03 Bye. Yep. See ya.
