Python Bytes - #445 Auto-activate Python virtual environments for any project

Episode Date: August 18, 2025

Topics covered in this episode:
- pyx - optimized backend for uv
- Litestar is worth a look
- Django remake migrations
- django-chronos
- Extras
- Joke

Watch on YouTube

About the show: Python Bytes ...445
Sponsored by Sentry: pythonbytes.fm/sentry - Python Error and Performance Monitoring

Connect with the hosts
- Michael: @mkennedy@fosstodon.org / @mkennedy.codes (bsky)
- Brian: @brianokken@fosstodon.org / @brianokken.bsky.social
- Show: @pythonbytes@fosstodon.org / @pythonbytes.fm (bsky)

Join us on YouTube at pythonbytes.fm/live to be part of the audience. Usually Monday at 10am PT. Older video versions available there too.

Finally, if you want an artisanal, hand-crafted digest of every week of the show notes in email form, add your name and email to our friends of the show list; we'll never share it.

Michael #1: pyx - optimized backend for uv
- via John Hagen (thanks again)
- I'll be interviewing Charlie in 9 days on Talk Python → sign up for (get notified of) the livestream here.
- Not a PyPI replacement, more of a middleware layer to make it better, faster, stronger.
- pyx is a paid service, with maybe a free option eventually.

Brian #2: Litestar is worth a look
- James Bennett
- Michael brought up Litestar in episode 444 when talking about rewriting TalkPython in Quart.
- James brings up scaling - Litestar is easy to split an app into multiple files.
- Not using pydantic - you can use pydantic with Litestar, but you don't have to. Maybe attrs is right for you instead.
- Michael brought up that Litestar seems like a "more batteries included" option. Somewhere between FastAPI and Django.

Brian #3: Django remake migrations
- Suggested by Bruno Alla on BlueSky, in response to a migrations topic last week.
- django-remake-migrations is a tool to help you with migrations, and the docs do a great job of describing the problem way better than I did last week:
- "The built-in squashmigrations command is great, but it only works on a single app at a time, which means that you need to run it for each app in your project. On a project with enough cross-apps dependencies, it can be tricky to run."
- "This command aims at solving this problem, by recreating all the migration files in the whole project, from scratch, and mark them as applied by using the replaces attribute."
- Also of note: the package was created with Copier.
- Michael brought up Copier in 2021 in episode 219.
- It has a nice comparison table with Cookiecutter and Yeoman. One difference from Cookiecutter is yml vs json. I'm actually not a huge fan of handwriting either, but I guess I'd rather hand write yml. So I'm thinking of trying Copier with my future project template needs.

Michael #4: django-chronos
- Django middleware that shows you how fast your pages load, right in your browser.
- Displays request timing and query counts for your views and middleware.
- Times middleware, view, and total per request (CPU and DB).

Extras
Brian:
- Test & Code 238: So Long, and Thanks for All the Fish - after 10 years, this is the goodbye episode.
Michael:
- Auto-activate Python virtual environment for any project with a venv directory in your shell (macOS/Linux): see gist.
- Python 3.13.6 is out.
- Open weight OpenAI models
- Just Enough Python for Data Scientists course
- The State of Python 2025 article by Michael

Joke: python is better than java

Transcript
Starting point is 00:00:00 Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds. This is episode 445. Can't believe that. Recorded August 18th, 2025. I am Brian Okken. And I'm Michael Kennedy. This episode is sponsored by Sentry, Python error and performance monitoring. Thank you, Sentry.
Starting point is 00:00:18 And if you'd like to get a hold of us, maybe give us some topic ideas that you'd like us to cover on the show. That'll be awesome. Or just comment on something or just say hi. You can reach us on both Blue Sky and Mastodon and the links are in the show notes. And if you'd like to join us live for the live performance, Pythonbytes.fm slash live, or you can also use that link to find out when we're going to record next or just find out where the YouTube page is so that you can subscribe and watch them later.
Starting point is 00:00:51 That'd be great. And finally, we'd like to get a hold of you later just to sell you stuff. Now, just to give you links. So we cover a lot of stuff in the show. You don't have to keep track of the links. They're in the show notes, but they're also the links plus extra information and things you might need to know if you are new to Python or new to a topic. Those are all in the email. So you can go to Pythonbytes.fm and sign up for the newsletter and we'll send that to you. And that's all the intro stuff. So Michael, we got some exciting news for number one. We have very exciting news.
Starting point is 00:01:27 And as part of this performance, Brian, I would like to recite The Raven by Edgar Allen. Oh, no, wait, this is not a performance. And I don't want to do that. I want to talk about Astrol, actually, because this is a really interesting announcement from them. Charlie and team are knocking stuff out of the park. They're delivering big time on the tooling for Python packaging,
Starting point is 00:01:47 managing Python, right? All the layer that you build upon when you're writing applications, right? So first they came out with Rough for formatting and fixing. issues. Then they came out with UV and then they upgraded UV to have its project management and Python management, not just alternate PIP installs and so on. And along all of that time, people have been saying, well, this is so amazing. It's so fast and it does all of the things instead of piecing together tools. But what if they rug pull us and we slip and we hit our head? What if they go and do something else? What if they start saying it's a half a cent per
Starting point is 00:02:23 PIP install or whatever, you know, what are we going to do? And Charlie has said that he's not planning on monetizing those types of things. Instead, he wants to build on top of it. So with this announcement, we get our first look at what that might be. So it's interesting in and of itself, but I think it's also interesting. And like, this could be the thing that supports the stuff that people are really starting to depend upon. And it's something called PYX, PyeX. I'm going to be interviewing Charlie on Talk Python in a couple of days, by a couple days, I think it's next a week. But pretty soon, I put a link to the event on the show notes. People can check that out. So we'll get the pronunciation. I'm going to say PYX for now, like TY and UV and so on,
Starting point is 00:03:06 unlike Ruff, not RUFF. So PYX, a Python native Python registry now in beta. So here's the deal with PYX. Think of this as an alternate PIPI, but not as a source of packages, not where, like when I publish something, I also have to publish the PYX, but more of a front end that adds a bunch of features or an intermediate layer, a middleware to PPI itself. Okay? And at the moment, the plan is for this to be a full on paid service for people that really need things like, I really need my project to 100% be able to build some thing that's only available as a source distribution, right? How often have you PIP installed something? And most things just come down as a wheel and they come flying in. But every now and then it'll
Starting point is 00:03:57 stop and say building something for 10 seconds the very first time before it gets cashed. Then it goes. Yeah, or longer. Yeah, definitely potentially longer. Like I don't use it anymore. We've talked about why is micro-whisky because they said, please stop using our project. We're not going to support it. Go find something else. And we talked about that. But that thing took forever to build. But also, if you're doing a GPU-aware types of things, you're doing a lot of stuff with PyTorch or machine learning more broadly, a lot of those things are huge and having the right platform built version is really, really tricky. So the idea is that UV and PYX are going to go together hand and glove.
Starting point is 00:04:38 So UV can work even better by having complete control over the back end as well, right? They say there's some limitations that UV was able. able to solve, but not all of them because they don't control piPI.org and its API and how things get built, right? So the ideas, you know, making it easier to install Pi Torch or Kuda-based libraries, that kind of thing. It also solves the problem. So why is everyone rebuilding the same packages over and over on their machine and what if those build tools are no longer the same? Why did set of tools break our recent release? How do we authenticate against our internal registry? The other thing this is going to do is it's going to be a first class private registry. So if you have a team, you're
Starting point is 00:05:20 like, let's publish our data access layer, not to the world, but to our other people who are working on the projects, to our other applications, and so on. So pretty exciting. I think there's a lot of stuff going on here. They're in beta. You can join the wait list. You can get in touch with them. It says, beyond the product itself, PYX is also an instantiation of our strategy. And this of what I opened with. Our tools, UV, Ruff, TY, et cetera, remain free, open source, and permissively licensed for other. And this project is pretty interesting because it's a glimpse into how they're planning to make that possible. Yeah, I'll be really curious to watch this. I'm excited for a lot of reasons. So, like, let's say this is only a commercial offering. Like, you have to,
Starting point is 00:06:01 it's a corporate sort of thing. You have to pay to get it. I'd be still okay with that because the alternatives, there's alternatives out there, like J. Frog Artifactory is around. And, and And there's other ways where you can, there's other products available to set up to mirror IPI and do other things like, as things come in, do like a security scan, your own security scans and things like that. And so a mirror in front end, but also something that you can publish to internal, publish internal things too. Those are really cool.
Starting point is 00:06:32 They're a little hard to set up, though. And I think I trust Astral to make this fairly easy for people. it also would be cool there's uh you know if you if you want to test things out locally like there's smaller businesses and smaller teams that aren't large corporations or even individuals that might have a need to try out what a what a repository might look like when you're doing those sorts of things or you're like you know developing on an airplane or something the local repo i don't know if they're going to get down to the point where it's a like a local person like a free thing that you can just run on your own um so it'll be interesting to watch i guess
Starting point is 00:07:07 I'm okay with either if they want to keep it like paid only or something that you know small fry can use as well but I'm excited to watch I am as well yeah there's I think somewhere Charlie said maybe possibly there's a free version but right now the plan is to be a paid project yeah yeah all right that's all I got to say about that for now until I talk to Charlie and learn a whole bunch more but I want to talk about Lightstar a little bit there is an article called um uh where I missed it oh here it is Sorry. James Bennett has an article called Lightstar is worth a look. And I was looking through our past history a little bit.
Starting point is 00:07:43 And it looks like we've talked about Lightstar, but it's been a while. So you brought up, this was earlier this year, I think, talking about rewriting court, or rewriting Talk Python in court. And Lightstar is one of the things you looked at, right? So that was episode, that was back. Oh, that was last year, November of 2024. So, Tiaz client, hello. That was a good one.
Starting point is 00:08:05 Um, so we haven't talked about it a bit. Um, so I was taking a look at it, uh, looking at Light Star. I had never played with this. And there is some interesting history around that it used to be, uh, Starlight, like, or something like that. Yeah, but it was not Starlet, Starlight. Yeah. And there was confusion. Whatever. Anyway, so this is an interesting, um, uh, James Bennett brought up a light star is worth a look. And right, he, he brought up like a lot of people do. If I'm going to do an API, why not Fast API? I mean, that seems like the obvious choice now for setting up an API or something and or a lightweight web framework. So why not?
Starting point is 00:08:47 There's a, there are a few reasons, right, that he brought up. So one of the things was in scaling and not, not scaling in traffic, but scaling in, I want more than one file. I want a bunch of, I want my application and a bunch of files. And there was some frustration that he talked about with trying to figure out how to do that within Fast API because there's the app dot. And how do you have multiple routings in different files if you have one app?
Starting point is 00:09:16 I don't know if this is easy or not. But he had some frustrations that it took him 40 pages into the documentation to the user guide to find that. However, it's kind of built in with Lightstar is that you don't, it's not a top level app. You can set it up, I guess, into multiple files easier, is one thing. The other thing was that Fast API is tied to Pidentic, and I don't know if you can break that or not,
Starting point is 00:09:46 but Lightstar is not tied directly to PIDANIC, and you can use Segalchemy. Yeah, it can use PIDANIC, but it also has other options. Yeah, he brings up maybe adders. You could use adders for validation, which some people forget that that's One of the cool things about adders is it can validate things. Or SQL Alchemy even can do some of that. So or a little combination of both, I guess. Anyway, so that's interesting.
Starting point is 00:10:13 If you'd rather use SQL Alchemy for that or adders, why not? The third option, what is the third thing he brought up is, I guess just their architecture. I had another note. Anyway, it's interesting that there's just, there's another option. there. So do you have any, do you know more about Lightstar? I interviewed the guys from Lightstar over on Talk Python, so I know a little bit more, but not a 10 more. I would say one thing, kind of a mental model I have about Light Star and I'm not sure how accurate this is, but you know, it's mine, so I get to have it, is fast API is kind of Flask-esque in that it,
Starting point is 00:10:52 it gives you enough to be, have a bunch of cool building blocks, but then you pick, you build the other pieces and so on, whereas Light Star seems to me a little bit more of the jing go philosophy where it comes with a bunch of batteries included like see all those little boxes down there yeah look at all like the different pieces right it's got middleware it's got data stores it's got multiple it's got ORM integration caching like a lot of those things don't come off in that sort of stuff don't come with fast API which is both a good and a bad thing right like if it comes with it but it's not what you want then it's just a hassle to battle against all the time yeah but that's how I sort of perceive light star is it's it's like fast API with more
Starting point is 00:11:34 options and like more stuff included yeah it's interesting yeah it's definitely interesting yeah all right also also interesting is our sponsor Sentry is back and I want to tell you about them so this episode is brought to you by Sentry of course it's been incredibly valuable for tracking down errors in our web apps other code that we run that's web apps, APIs, even the Talk Python mobile apps use it and send in errors when something goes wrong for somebody in some far-flung place in the world. And I've told you the story more than once about how Talk Python, I learned one user was encountering a bug through this entry. I fixed the bug and I let them know before they contacted me. It was pretty wild, right?
Starting point is 00:12:17 Yeah. So how does this work? I'll walk you through a few simple steps on how you might add error monitoring and distributed tracing through like some kind. of e-commerce app right that's got a javascript front-end flask back-in so if it's flask on the front-in react on the front end uh you want to make sure there's no errors during some checkout process for an e-commerce page for me anytime money and payments are involved always get a little extra nervous writing that code so here's what you do you enable distributed tracing and error monitoring both your flask back-in and your react front-end then you want to
Starting point is 00:12:53 add enough context to that front end and back in action so they can be correlated. And to do that, you enrich the spans, a sentry concept, with business context. So in your React checkout, you wrap the submit handler in a start span and add information to that. Then you want to see the requests. You build a real-time century dashboard. You spin up one using span metrics to track key attributes, part size, checkout duration, and so on, giving you a single place to see both performance and error data. That's it. When an error happens, You open up the entry on the error on Century and you get end-to-end requests and error tracebacks and so on. They also have a new product called Seer, which is C-E-E-R, is a thing that looks at your app and uses LLMs to understand what's going wrong and possibly even suggesting, you know, here's actually what's causing that bug and it could even potentially do a PR and help you fix it, things like that.
Starting point is 00:13:45 Really cool. So it's not just looking at the error, but even, you know, going further than that. So if your apps and your customers matter to you, you'll definitely want to set up. Century like I have over at Talk Python and Pythonbytes and so on. So visit Pythonbytes. com slash Sentry and use the code Python bytes, all caps. Just in word, that's Pythonbyte. FM slash Century.
Starting point is 00:14:05 Use the code Python bytes. The link is in your podcast. Player Show Notes. Thank you to Century for supporting the show. Awesome. What are you, Ryan? Yeah. Sleeping on the job.
Starting point is 00:14:13 Sorry. I, so for the next topic, I want to like follow up from last week. So just last week, I think it was last week. Yeah, 444, Be Gone Python of Yore. Two thirds of a beast. That took me too long to figure that up. So one of the things we talked about was an article called Stop Using Django's Squash Migrations. And we had some feedback.
Starting point is 00:14:39 So from Bruno Ala said, hey, just a follow up and a plug for one of my own projects, there is a project called Django remake migrations. So that's really kind of what I want to talk about. This looks pretty cool. So it sort of does a lot of the stuff. But one of the things that I like isn't just, there's a tool that you can use to do a lot of this work for you. You don't have to follow this step by step.
Starting point is 00:15:02 However, it's cool that it talked about the problem. So it says a Django admin command. There's a Django admin command to recreate all migrations in a project. This new one is like squash migrations, but it's on steroids. So it says the built-in squash migrations command is great, but it only works on a single app at a time, which means that you need to run it for each app in your project and on a project with enough cross-app dependencies that can be tricky.
Starting point is 00:15:30 Yeah, that sounds tricky. So this command aims to solve this problem by recreating all the migration files in the whole project from scratch and mark them as applied by using the replaces attribute. I don't know what that does, but there's some caveats like all migrations are marked as one as replaced once. So I'm glad I'm using Git when I'm using any of this stuff so I could try it. And if it doesn't work, roll it back.
Starting point is 00:15:57 But this looks pretty cool. So I wanted to just shout out to people to maybe try this as well if you want to try to remake your migrations easier. While I was looking at this, I was like scrolling down. And it says the package was created with the copier template. them and that rung a bell and it looked down and this was a while ago so michael brought up on this show talked about copier but that was back in 2021 so i'm going to go ahead and uh do a shout out to copier again because it looks pretty cool um copier is a is something like cookie cutter uh but it
Starting point is 00:16:37 it's a c l i app for rendering project templates but it looks kind of fun um uh so i'm gonna check this out I have some sort of project template needs coming up. And so I might take a look at this. They have a comparison too. So they're like, this is kind of like cookie cutter. Yeah, they know it's kind of like cookie cutter.
Starting point is 00:16:56 There's some differences here. They've got a table in the documentation. One of the things I kind of like is that the configuration files in YAML format instead of JSON for cookie cutter. I'm not really a fan of either. But if I got a handwrite stuff, I'd probably rather handwrite YAML files. So anyway, interesting to take a look at.
Starting point is 00:17:17 Nice. I think the biggest difference other than I think being a little more maintained these days is that it has, like with cookie cutter, you run a template, it generates a thing, and then you make changes to it, and that's it. With copier, you can apply, if there's a new version of the template, you can theoretically, at least it has the option to try to apply a migration from the old version to the new version. Oh, that's cool. Yeah, I think that's its biggest selling.
Starting point is 00:17:44 Point. Template updates. Yeah. That's actually awesome. Yeah. And it looks like it's composable too. So you can like if you might have different like say for instance in a Django you might have several applications that would you would add. You could use multiple copiers to pull an app different apps into different to one application. Right. Add a CRM action or something. Yeah. Yeah. So interesting. Very interesting indeed. Cool. All right. On to the next one. This is kind of like the Django show. And we'll more jingo shout out on top of this by the way i'm going to be doing a happy birthday happy 20th birthday jingo panel with a bunch of the creators and an hour and a half two hours oh wow neat now if you're listening to the audio version there's a good chance that's already over however you can still either check out the live stream or in a few weeks check out the talk by thought episode on it so keeping with that theme jingo chronos this one's a quick quick little topic but it's really cool. So I want to know how fast my page is loading. And I know there's the jingo toolbar and all that, but here's a nice way to add more information about performance right into your browser. So
Starting point is 00:18:53 it's jingo middleware that shows you how fast your pages load right in your browser. You simply add it as the first and last thing. You know the quick start. It says add it as your last installed app. And it has a start and an end that has to be the first and last middleware. So it gets the full picture of all the middleware actions and all that kind of thing. I also would recommend saying probably show in production false, unless you really, really want to show in production, something like that. But the idea is it just shows you how long did your middleware take to run, how long did your view take to run,
Starting point is 00:19:29 how long did your SQL queries and commands take as part of the middleware, as part of the view, and then how long did it take total? and how many queries, if you see the little screenshots, it's a 7Q, 5, Q, 12, Q total, guessing those are the number of queries. One of the really big problems when you're working with ORMs, like Jango has, but many, is they have lazy evaluation for relationships.
Starting point is 00:19:53 That can be tremendously slow because the programming model doesn't change if you do an eager query and join two things or you just use it. It's really just about how the query was structured, not how you program against it. So you might have, hey, I got 100,000, things back from the list as one query, and then I loop over it and I interact with a field
Starting point is 00:20:10 where each one of those is doing another query, 100 more. So 101 queries, hence they typically called the N plus one. So you would see something like that. You'd be like, whoa, the view is doing 107 queries. I think I got two things. What's happening? Well, N plus one, some version there probably. Anyway, if this is interesting to you, it's not super big in terms of popularity, but it's also pretty straightforward and simple. And yeah, people might like it. Remind me, where does the information pop-up? Is it in the browser when you're, like, developing or something? I think, I think it's at the bottom of the page. I haven't installed it and tried it, but it's, yeah, it's somewhere, like, in the page as part of it. Okay. Yeah. So basically, I believe what it does, it actually gives your template, your Django template, these pieces of information, and you show them how you want. So it gives you middleware CPU time, SQL time, view, count, total time, basically the stuff that was on the screen. So you can put it in, it provides it as data values. and then you would put it into your view as you see fit, right?
Starting point is 00:21:10 I believe that's how it works. Okay. Cool. That's neat. Indeed. Well, those are our topics. I have a little bit of an extra. Go for it.
Starting point is 00:21:22 I'm going to go back in time a little bit. And so test and code is a podcast I started in August of 2015. So it's August. So it's now, it's August 20. It's almost two days. in two days it'll be like 10 years um i think that's enough i'm looking at like episode two episode one was terrible so i deleted it a long time ago so episode two is a pie test versus unit test versus nose and i didn't even capitalize pie test correctly um that's bad and nose is just if i would
Starting point is 00:21:54 redo this it would just say don't you just pick pie test it's there's one option really anyway unit test is fine also but let's get real don't do don't use nose so what is this or why am I bringing this up? Well I just released on Friday episode 238 so long and thanks for all the fish where I announced that I am no longer doing testing code so closing the book on that chapter. I listened to that episode and yeah I think what I'll say is congratulations 10 years and 238 episodes is quite a run and those things will be around people can listen to them and enjoy them. Yeah so that's that's another thing is I'm seeing I'm not sure how long I'll leave it live because podcast hosting is something you have to pay for.
Starting point is 00:22:40 So I am... You know what? I say you take it right, a little web scraper and just generate a static site and just leave it there. That's something that might be able to do. I'm giving you work. How about that? Actually, so that, yeah, but the point was to remove work from my plate, but, you know,
Starting point is 00:22:57 we'll see. All right. Okay, leave that up in the air. All right. What you got for us? Do you have any extras? Yes, I have some extras. Let me get it really.
Starting point is 00:23:07 So we have Python 3136. Big question. Will this be the final 313 before 314? Probably not, but it's plausible, right? We're talking October, which is not, sadly, not that far away. Not ready for rain. Not ready for it. Anyway, 3136 is out.
Starting point is 00:23:28 And there's actually a lot of changes. If you flip through the library, I don't know how many pages that is. but we're talking a lot of pages. So there's actually a ton of changes here that you might want to check out, you know, like little security fixes and other stuff. Does it have that zip? There's like a,
Starting point is 00:23:45 there was some kind of zip issue where you could trick it and you do. And I don't think that was in Python. I think that was in PIP and UV also had it. So update your UV by the way, folks, as well, which is UV self-update, easy enough. Anyway, this is out because of our sweet Docker setup. I just did a rebuild on Python. bytes out of him and it said UV install Python and we have a 3136 powering everything
Starting point is 00:24:10 all of a sudden very nice anyway quick and easy yeah on top of that the final 13 the final countdown I was just listening that song uh I have this really cool script and I think this is kind of it's kind of interesting and please don't write me and tell me other things are out there that do this because I know there are some that sort of do this but it's interesting in and of itself but it's also interesting as we get better and better LLM and Agen Tech coding tools to just say, instead of just like depending on something that has 100 features and I'm going to take one or two of them and then deal with that, I just want this one thing. Could I just ask, have that one thing created?
Starting point is 00:24:46 The answer is probably yes. So I wanted something that would auto-activate Python virtual environments built with UV, named like I like, exactly as I navigate around my shell. So I asked a lot, I think, something like that. And it said, sure, here's a cool little bash script that as you enter a folder that either itself has a virtual environment or somewhere up in the hierarchy of it as a virtual environment, it'll just activate it. And if you leave that portion of the directory tree, it'll go back and unactivate it. And I know that Dura E&V does some things along these lines. But like I said, I just like, I think it's really cool that you can just say really all that needs to be is like a 35 lines of bash and then it never changes.
Starting point is 00:25:27 So I put up, I decided, someone was like, oh, that's really cool. Is that happening? I'm like, I'll just put that up as a gist and people can check it out. I like it. Also, something small that you can change if it's not quite what you want, change it. Exactly. I'm not looking for something that solves every problem for every programming language in the most. I just want this one little feature.
Starting point is 00:25:48 And I don't really, to be honest, I don't know enough bash to do this really well. But guess what? Either chat or cloud sure knows it. And it took care of it. So I'm really enjoying that. Yeah, like I'm totally going to use this, although I'm going to change VE&V to dot VENV because Brett Cannon convinced me that the dot is good.
Starting point is 00:26:06 I got tons of respect for Brett, but not my thing. I want to be able to look either via LS or in the Mac OS Finder and see that it has a virtual environment. Yeah. And you can't if it's a dot, right? Fair enough. Anyway, fair enough.
Starting point is 00:26:21 I want to just look at it and go, yep, it absolutely has a virtual environment, right? because that tells me what my next action needs to be. All right, I just spent a long time this summer writing up this huge long post. I don't know if it'll actually tell us how many minutes of reading time it is, if I ask. 24 to 34 minutes of reading time
Starting point is 00:26:39 for this article that I wrote. Okay. And it's called the State of Python 2025. I wrote this in partnership with JetBrains. And it takes the PSF JetBrain survey results and does like a ton of analysis on it and predictions and concrete actions people could take based on the trends that we're seeing and so on.
Starting point is 00:27:00 So I'll put that here now. I think I might make this one of next week's items and we can dive into some of the trends and recommendations and you can tell me how I'm wrong or right. But it just came out this morning. So I thought I'd go ahead and just throw it out there as an extra for now. Cool. Yeah.
Starting point is 00:27:15 Next, if you run local LLMs, there's a, as I do, and there's a really interesting option, You can now run Open AIs. Well, first of all, hey, guess what? OpenAI has a public open weights model you can use. And you can run it locally. So I'm running here in LM Studio in developer mode and programming against the GPTOSS, 20 billion parameter model. How cool is that?
Starting point is 00:27:40 That's pretty cool. Yeah. So I have some maybe. I don't know. It's super cool. I have some other things that I created like little utilities. I'm like, hey, I want this utility to work and it needs an LM. And I knocked it out real quick in combo with some agentic coding.
Starting point is 00:27:53 And it was using when, which was pretty good. But then when this came out, I'm like, let me try this and see if it'll give me better answers. Sure enough. So now I'm just running this locally on my Mac Mini and talking to it. And it's pretty good. So that's a cool option. You run it, Olama or other places as well.
Starting point is 00:28:09 And finally, just remind people that would just release the just enough Python for data scientist course for 29 bucks over at talk python so you're getting started in your data science journey or you feel like you've been doing it for a long time but you just don't quite have the software engineering techniques and tools and so on check this out just talk python dotfm click on courses be right there that's it from extras awesome um that's all over extras do we have do something funny for us i have a quick follow up before we do from pat decker says i saw your post on blue sky listen to 238 i agree with michael congratulations on 10 years bryan Thank you.
Starting point is 00:28:45 Thanks, Pat. Indeed. Okay, yes, we have a joke. Let's check it out. This one, you know, people don't like if I, they don't like generally when one language bashes on another. Well, some of them do, but not, not generally. So keep folks, before I show you this joke, it's supposed to be lighthearted.
Starting point is 00:29:00 Please don't email us to how we're mean. So the joke is, Python is better than Java, says someone, another person says, prove it. So the original person fires up Python, creates two strings, one name Python, one name Java, and then asks as writing in the interpreter, Python greater than Java as strings? True. True, says Python. It's greater.
Starting point is 00:29:23 Yeah, sure. Truth. It's true. There you have it. Anyway, I just thought that was funny. Yeah. It's too bad that Python doesn't have a better operator. Not necessarily greater, but just better.
Starting point is 00:29:34 Exactly. Well, there it is. Yeah, that's the joke. All right. Cool. Well, thanks again. Thanks everybody for listening. And see you next.
Starting point is 00:29:44 Yep. See you later. Bye all.
