Python Bytes - #474 Astral to join OpenAI

Episode Date: March 23, 2026

Topics covered in this episode:
- Starlette 1.0.0
- Astral to join OpenAI
- uv audit
- Fire and forget (or never) with Python's asyncio
- Extras
- Joke

Watch on YouTube

About the show

Sponsored by us! Support our work through:
- Our courses at Talk Python Training
- The Complete pytest Course
- Patreon Supporters

Connect with the hosts
- Michael: @mkennedy@fosstodon.org / @mkennedy.codes (bsky)
- Brian: @brianokken@fosstodon.org / @brianokken.bsky.social
- Show: @pythonbytes@fosstodon.org / @pythonbytes.fm (bsky)

Join us on YouTube at pythonbytes.fm/live to be part of the audience. Usually Monday at 11am PT. Older video versions available there too.

Finally, if you want an artisanal, hand-crafted digest of every week of the show notes in email form, add your name and email to our friends of the show list; we'll never share it.

Brian #1: Starlette 1.0.0
- As a reminder, Starlette is the foundation for FastAPI
- Starlette 1.0 is here! - fun blog post from Marcello Trylesinski
- "The changes in 1.0 were limited to removing old deprecated code that had been on the way out for years, along with a few bug fixes. From now on we'll follow SemVer strictly."
- Fun comment in the "What's next?" section: "Oh, and Sebastián, Starlette is now out of your way to release FastAPI 1.0. 😉"
- Related: Experimenting with Starlette 1.0 with Claude skills - Simon Willison
- Example of the new lifespan mechanism, very pytest fixture-like:

    @contextlib.asynccontextmanager
    async def lifespan(app):
        async with some_async_resource():
            print("Run at startup!")
            yield
            print("Run on shutdown!")

    app = Starlette(routes=routes, lifespan=lifespan)

Michael #2: Astral to join OpenAI
- via John Hagen, thanks
- Astral has agreed to join OpenAI as part of the Codex team
- Congrats Charlie and team
- Seems like Ruff and uv play an important role. Perhaps ty holds the most value to directly boost Codex (understanding codebases for the AI)
- All that said, these were open source, so there is way more to the motivations than just using the tools.
- "After joining the Codex team, we'll continue building our open source tools."
- Simon Willison has thoughts
- discuss.python.org also has thoughts
- The Ars Technica article has interesting comments too
- It's probably the death of pyx. Simon points out "pyx is notably absent from both the Astral and OpenAI announcement posts."

Brian #3: uv audit
- Submitted by Owen Lamont
- Pieces of uv audit have been trickling in. uv 0.10.12 exposes it in the CLI help.
- Here's the roadmap for uv audit
- I tried it out on a package and found a security issue with a dependency - not of the project, but of the testing dependencies, and only if using Python < 3.10, even though I'm using 3.14. Kinda cool.
- Looks like it generates a uv.lock file, which includes dependencies for all project-supported versions of Python and systems, which is a very thorough way to check for vulnerabilities.
- But also, maybe some pointers on how to fix the problem would be good. No --fix yet.

Michael #4: Fire and forget (or never) with Python's asyncio
- Python's asyncio.create_task() can silently garbage collect your fire-and-forget tasks starting in Python 3.12
- Formerly fine async code can now stop working, so heads up
- The fix? Use a set to hold a strong ref and a done callback to remove it
- Is there a chance of task-based memory leaks? Yeah, maybe.

Extras

Brian:
- Nobody Gets Promoted for Simplicity - interesting read and unfortunate truth in too many places.
- pytest-check - All built-in check helper functions now also accept an optional xfail reason. Example: check.equal(actual, expected, xfail="known issue #123"). This allows some checks to still cause a failure, because you no longer have to mark the whole test as xfail.

Michael:
- TurboAPI - FastAPI + Pydantic compatible framework in Zig (see follow up)
- Pyramid 2.1 is out (yes really! :) first release in 3 years)
- Vivaldi 7.9 adds minimalist hide mode.
- Migrated pythonbytes.fm and talkpython.fm to Raw+DC design pattern: Robyn + Chameleon package

Joke: We now have translation services

Transcript
Starting point is 00:00:00 Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds. This is episode 474, recorded March 23rd, 2026. I'm Michael Kennedy. And I'm Brian Okken. This episode is brought to you by us, all the wide array of things that we're doing: courses, books, and so on. Check them out. We have a lot of stuff on testing, a lot of stuff on Python. Can you imagine, Brian? That's what we're working on. I'm actually working on a brand new course. I'm super excited about it. It's going to be a little bit different than the ones I've done before, so it's going to be fun. Look for that in like two weeks. Also, if you're wise, if you've done the right thing, look for a very cool email in
Starting point is 00:00:41 your inbox after the show that talks about what we covered: extra info to make it more useful, background info if you don't know something about a topic we talked about, and so on. So subscribe to the newsletter: pythonbytes.fm, click newsletter, put in your info, and then you will be subscribed. And we're gentle with it. We don't give it away to third parties or anything terrible like that. We just send you stuff about the show and other things going on with us. Wait, if it's them and me and you, is that three parties? No, we're both in the same party. Yeah, we're in the same party. It's a star party. I would say it's a star party, don't you think? Yeah. So let's, um, what I want to kick this off with is a little talk about Starlette. So Starlette
Starting point is 00:01:29 is like 12,000 stars, but also it is, if you're not, if you could, don't remember, you've heard this before, but you can't remember. It's because this is what fast API is built on top of. But the exciting news is the Starlet is 1.0. It's no longer zerover. So congratulations, Starlet. The, I was looking at the release notes. The, the main thing is not necessarily, actually, there's a lot in the, if you, there's not a lot in the actual 1.0.0, but the release candidate, It has a lot of information. And as a summary, there is a, there's a blog post. Let me go blog post that talks about really everything that's in here.
Starting point is 00:02:09 And mostly don't really, if you've been keeping up with Starlit releases, there's probably okay, but you should test anyway. However, it's mostly a stability and versioning milestone. We're told the changes were limited to removing old deprecated code. that had been on the way out for years along with a few bug fixes. So it's not not necessarily, the intent isn't a interface break. However, if you were relying on deprecated features, you might break you. So definitely test.
Starting point is 00:02:45 One of the things I, a couple of things I wanted to bring up with this, though, is, it's pretty cool. So if you, you know, Starlett's awesome. And if you're unfamiliar with what it is, you can't do. It is, it's like a whiskey. The little Asky framework. So it's like whiskey, but asynchronous. And it's awesome. But there's been some cool changes.
Starting point is 00:03:09 One of the things I like it that before we get off the release notes is at the end, what's next? It says, oh, and Sebastian, Starlet is now 1.0. So now Fast API, 1.0? So Fast API is still zero over as well. And which also we've razzed them before. Also, I think it's high time that we get a 1.0 release of Fast API, maybe. I did want to bring up, I actually learned about this by looking at Simon Wilson's blog. And he is, he noted that also that Starlett 1.0 is out.
Starting point is 00:03:46 And one of the, and his article is really about experimenting with Claude Skills because this is a new release. And that's one of the ideas is the model that Claude was built on probably doesn't have it, right? Because it's a new release. So how to deal with that? So I'm not going to get into that article too much. What I wanted to look at was this example, which I thought was cool because one of the, one of the, it's not really a change, recent, super recent change, but one of the newer ways to do async context, there's an async context manager. So if you want to handle startup code and tear down code, there used to be on startup and on shutdown parameters, but now one of the better ways to do it is to use this lifespan model, which is kind of a generator.
Starting point is 00:04:38 And you can say you've got startup code and then teradone code after a yield. And I just thought that was cool because it's very reminiscent of high test fixtures that do the setup and tear down. after a yield. So interesting, interesting that that model gets around. So but yeah, the yield is is quite interesting. It's a bit mind-bending, but it works well. This is awesome. You know, Starlett's one of the fastest web frameworks out there for Python and it definitely is battle tested being a foundation of fast API. So good, good deal. Sebastian has it already shipped? Where is 1.0? What's going on? Yeah. That's cool. Very nice one. I want to switch things.
Starting point is 00:05:21 And talk about the news tojor. Astral to Join Open AI. I don't know if you all have heard this, but big news. It's certainly something that has set the internet, the Python internet on fire. First of all, I would like to take a moment. Well, first of all, to thank you, John Hagen for sending this in. It was news to me when he sent it in. About 10 minutes later, everything exploded in. Obviously, it's come to us from many directions. But thanks, John. And I want to be to take a minute and say, congratulations, Charlie, congratulations Astral Team. They really stepped up and built something incredible for Python tooling. And they were working for different ways to make this a business.
Starting point is 00:06:04 This is certainly one of the ways to make a successful business. It's just get lots of that sweet, sweet AI money. Now, with that out of the way, like, what does this mean for the people left without sweet, sweet AI money? Pools of AI money to swim. So there's, we don't know. there's a couple interesting things. I'm going to link to the blog post from Charlie. I think there's basically it says the team is moving to open AI.
Starting point is 00:06:31 The projects are moving to open AI. So why? Let's start with why. Why does open AI want them? I don't know exactly. I think nobody has said. So obviously Ruff and UV are super relevant for making agentic coding go faster. And to be clear, they're joining the Codex team, so the Agentic Coding Tools team of OpenAI, right?
Starting point is 00:06:55 So Ruff and UV are there as really awesome tools for Codex to install things, to validate. I think actually maybe the most important could be TY. So TY, we've talked about TY before, which is their language server plus type checker. And being a language server, you could point it at an entire project and say, I want to understand this. Like one of the things that drives me crazy, I love ClaudeCodeCode. about one of the things that drive me crazy is it runs in sort of this non-IDE, non-editor way, and it does not understand the code base really as a whole. It doesn't have a mechanism for saying, like, tell me, just tell me about where is this function used.
Starting point is 00:07:35 It's just like grepping and searching constant. Like, why do you have to grep? Can you just use an abstract text tree and just know the answer instantly? So if Codex deeply took Ty-Y in, it could know the answer instantly. Right? You know what I mean? In like I truly understand the project. It's not a file of loose text files and I'm surprisingly good at just navigating them in that way. This is an interesting take because I was thinking recently using Cursor that one of the issues with Cursor, I love Cursor also, but one of the issues is they can't use Microsoft's language server.
Starting point is 00:08:10 And you feel that because the Microsoft's language server is tighter than the one that Cursor uses. Yeah, exactly. They got kicked back a few years and work away back. And Ty and Ty, and Ty, and Ty, has the potential to be as good or better than the Microsoft one. So interesting. I think it's better.
Starting point is 00:08:30 Yeah, I think it's better. I would say Powerfly and Ti are sort of cutting edge right now. And certainly I'm using TiWi and my BS code like things. Okay, so those are all good reasons why Astral might join OpenA, why the Codex team might want Astral. Except Brian, I don't know if you've gone and looked at those projects. but the source code has been leaked on the internet all over the place. Like they're just open source.
Starting point is 00:08:54 They're on GitHub. So there's nothing preventing codex from just going, we're going all in on TY. We're going to contribute some stuff back to TY. We're going to create a fork and just keep that hanging around just a case. And really make that our foundation. And they could have done that with no work for no money, no acquisitions, no legal work. So the question is, what else is going on here, right? Well, you've got a handful of awesome engineers you just collected.
Starting point is 00:09:21 Exactly. So is this acquiring for technology or is this acquiring for AccuHire, Aquahire? And is it like I would like to have the astral team actually doing that work to implement Ty into Codex or whatever. This is my speculation, right? Or whatever it is that they're doing there, right? So I don't exactly know where it's going to go. I do think that this probably means the death of PYX.
Starting point is 00:09:46 So PYX, I had Charlie on the show, and... I kind of forgot about it, actually. Yeah, PYX. It's the safe enterprise packaging solution that front-end PPI. It's not a separate package warehouse or place, right? It's a front-end to PIPI, right? So if you upload to PIPI, it shows up here in this thing, but with lots more management, and you can put your private packages, right?
Starting point is 00:10:12 That kind of thing. That was going to be the way that Astral was looking to make money until pools of AI money showed up. So people have thoughts here. Very relevant. Simon Wilson has been writing so much about all of this AI stuff in the world. So he obviously, as a Python expert, has deep thoughts on this. And, by the way, that comment I made, Simon points out, if you both look at the Astral announcement and the Open to AI announcement,
Starting point is 00:10:38 they talk about rough, UV, Ty, nobody mentions PYX at all. In other words, those letters do not appear in the announcements. Where's the other tools do? So that's notable, right? So you can check out Simon Willison's thoughts on this as well. He's got a pretty balanced view of it. Not completely like, oh, this is going to be amazing. But he's a lot of different thoughts on it.
Starting point is 00:11:00 If you look at discuss dot Python.org, the thoughts are, well, the thoughts are strong. They're not necessarily, I don't know, it's positive. Bright Cannon has a lot of comments here, a lot of work on how he's doing stuff. to sort of take some of these ideas from Astral and make them more part of just core Python, which I think is great. For example, what happens? One of the biggest questions I had when I saw that
Starting point is 00:11:23 was, oh, what happens to Python build standalone? So there's the, oh, we can always fork it take, right? Which is true, but when there's infrastructure and build tools and stuff happening on the back end, I didn't think about that, yeah. Then I forked it, and it works fine up to Python 314, and it never works again, you know what I mean? And it didn't take the security
Starting point is 00:11:43 patches because the Python build standalone is like not quite regular Python. They have to do some tweaks to make it work weirdly. So there's like in this whole discuss thread, there's like, could we just have Python build standalone kind of be there's like regular Python? Like why why do we have to have this like more portable version that works kind of better be maintained and patched by other people, right? So Brett's actually talks a lot about that and there's some interesting things here. There are not as many incredibly positive bits of feedback here. There's a few, right? Like, Astral have earned a lot of trust.
Starting point is 00:12:17 As long as Open AI doesn't insist on integrating their equivalent copilot into it will probably be fine. It's probably even good news. So there's that. Simon points out that, like, technically codex could take UV away from the other coding agents in a way and make them, like, work with less good tools. So that could be like a bad lever. Mark says, hey, let's not jump to conclusions.
Starting point is 00:12:37 Yeah, I'll tell you what, the Twitter, the X, X thread is more serious. But there's a pretty deep conversation here on Discussed Python. It also was covered on Arstechnica. You go into the comments here. I don't know how many comments are. Do I need to refresh it? It's only 48, but this is probably the most concerned one. Like, oh joy, boarding the Titanic, dot, dot, dot.
Starting point is 00:13:05 Yeah. Talk Python did get a shout out because of it. for the PYX angle, which is fun. But yeah, I don't know. I'm going to go ahead, Brian. I want to be positive, but I got to say my first reaction was nuts. This is not good. So I don't know.
Starting point is 00:13:21 I hope it can be good, but I'm still holding my breath as to whether Microsoft buying GitHub was a good idea. You know what? I hear you on the get up thing, but they were like on the verge of going out of business, which I don't think people realized at that time. But it was, the finances were bad. And it was like, they need saving. So it wasn't like, well, either they just keep on their merry way
Starting point is 00:13:44 doing their own good thing or Microsoft gets them. It was like, or they kind of, you know what I mean? Like it was, that was the, so in a sense, GitHub is still going pretty strong. So it's better than it not being there. But I don't know. But I agree with you. Okay, back to the topic.
Starting point is 00:13:57 So with this, I trust Charlie and the Astral team. I think their heart is in right place. And to be clear, it says they're still working on the three tools that were not P-Y-X. That said, once you're inside a large organization that has its own motivations and its own goals, who knows what happens? It could be totally open. One good outcome could be that it's a shock to the system and say the core development team goes, you know what, we just need to like take Python build standalone inside of Python. Like, why is this an external thing? Why was a random dude maintaining it before Astral took it over? You know, that was a lot of work. We should,
Starting point is 00:14:34 this is what we do. It could be, There's a lot of things that made... That would totally make sense to be under the PSF umbrella. Yeah. Like, why do we have to keep patching every release of Python to make Python build standalone work? Can we just upstream those fixes and just make that Python? Yes, that would be a good outcome. Me as a bystander saying it makes sense, but I don't know what's involved with that.
Starting point is 00:14:56 Yeah, I have 100%. But Brett was sort of saying some things along those lines. If, you know, PIP could easily adopt a lot of the things that UV did that make UV special. I don't know why they didn't. I mean, you look at, you look at the comments, it's like, well, we're too busy. Like, this is like a volunteer thing for PIP. And the reason, um, Paul, the guy who maintains PIP is actually like, they've in a congratulatory way sort of said, look what Astral has shown as possible if you actually fund work on an open source problem, right? But they didn't, it wasn't funded on PIP and so PIP hasn't had those changes. But you could
Starting point is 00:15:31 retrofit a lot of the things that make UV fast. Because it's not all just fast because, of rust, although it is rust, it's a lot of it's fast because of new algorithms and tradeoffs and things like that, right? So those could easily be written back. Yeah. There's also deprecating old features that like there's stuff that PIP does that UV doesn't. Yeah, you just say, hey look, you pin PIP to what it is now and we're going to get a new better PIP. Yeah. If you need old PIP, use old PIP. I don't know, whatever. So that, I mean, all those are positive influences and on the most possible positive thing is this just rough UV and TY just keep on trucking.
Starting point is 00:16:09 Yeah. Right. So like for example, the other thing I said is like, hey, that TY and, what did I say? A project from meta, right? It wasn't an other open source project that was a competitor to TY. It was a sort of a similar type of company somewhat. Henry brings up a great point comparing PIP and UV. UV has paid developers.
Starting point is 00:16:29 PIP does not. So. Yes, absolutely. 100% Henry. And Henry had a bunch of great comments and thoughts in that discuss. thread as well. So yeah, I think we're all on the same boat of like I get that they all want some payday, but also I hope it doesn't go down in flames. Look, and I just want to put this out here. I do, I hope nobody has a bunch of negative hate that they're throwing out there for,
Starting point is 00:16:56 for Charlie and team. They've given us something and they haven't taken hardly anything from any of us. And what they've shown is really awesome. And we've all enjoyed the tools. And even if they were to go away, there's a lot to carry forward from what they've done. So I have no idea how much money they got, but it's probably life-changing amounts of money to some degree. And if you've got just an open-source project that you thought you could make install Python packages better, and then you got life-changing amount of money, you should take it, unless you're already rich, really rich. Well, there's also the, there's a couple of other aspects. There's investors already in Astral.
Starting point is 00:17:34 And they have a non-zero say in what happens. And then there's also the, there's, there been comments of, well, it's, it's open source so you can just fork it. That's not trivial to fork a large project. Yeah. I think if there were not infrastructure behind it, you actually could fork it pretty well and be okay. There's, there's something like 47 non-astral employee contributors who have done five or more
Starting point is 00:18:01 contributions to UV. Yeah. I mean, it's definitely doable. It's just like to get to fork it and get it to work once is one thing and then maintaining it for a long time is a completely different story. Yeah. And I mean like no new features. It just keeps doing what it does now, which would still be awesome.
Starting point is 00:18:19 But when you're talking like Python built standalone and every new release of Python results and being patched and like that, that is a whole different level of stuff there. All right. Enough. Enough talking about Astral. You want to talk about Astral and stuff? Yeah. Let's stop talking about.
Starting point is 00:18:34 you astral and talking about UV. Actually, it's funny, but I am not kidding. So UV has a new release and there's a new feature that I think is interesting. That's kind of a secret preview feature. And that's UV audit. So that's what I want to talk about. In the release on the 19th, we got a 0.10,000. 12, we got UV audit is now as part of the CLA help.
Starting point is 00:19:08 And this was submitted by Owen Lamont. So thanks for noticing this Owen and I'm pretty curious about. So I tried it out. And so you can do UV self update and you can get it. Now you can use UV audit. But it's interesting. And there's also a link to the roadmap. So looks like almost everything's done.
Starting point is 00:19:27 There's some integrated tests that need to be done and whatever. So it looks like they're heading forward to try to have this in place. And it's pretty cool. So UV audit is a dependency checker really on your project. And I was a little, I was a little bit when I tried to run it. I didn't quite understand what happened. It found some security vulnerabilities in one of my projects. But it wasn't something I recognized.
Starting point is 00:19:57 And so I looked at it was a dependency of a dependency of a dependency of my test environment and what was even more confusing to me was that that that project I looked at like a pip-depth tree and I didn't have that version so what what am I seeing why is there vulnerability there one of the cool things of how this works is it in some part of the process it creates a UV lock file so it can have any in a UV lock file which I kind of forgot about has has it's not just all of the versions for your project, like, pinned. It's also on all operating systems that you support and all Python versions.
Starting point is 00:20:44 So it's a broad range of dependencies and versions. And what happened was my project supports 3-9 still, or it did. Wait, yeah, 3-10 is the lowest that I was supporting. And even though I was running 3-14, in 3-10, one of the, virtual emve on 310. So it was talks, virtual and then file lock, I think, or something. In version 310, there's a file lock vulnerability
Starting point is 00:21:14 that got fixed later. And they just went up new versions. So it's kind of a neat idea. I really like this idea, but it's a little bit hard to interpret to go, oh, you have to look at all of the Python versions and all the operating systems you support, which if you're doing a Python project that's usable.
Starting point is 00:21:35 That's how it would. Anyway, I'm excited to have this get in place. And I think it's kind of a neat thing to easily, if we're using UV tools already, to say UV audit for a Pi project, Tomel-based project. And make sure you have not any vulnerabilities on any of the Python versions that you're supporting. So that's cool. Yeah.
Starting point is 00:21:56 When Owen said this in, this is awesome. Because I've been using PIP audit for both my local dev, but also for our Docker deploys. So one of the Docker build steps is to do a PIP audit against everything that had been installed. And I set it up so that my PIP audit, I can use UV to PIP compile the requirements file and then use Docker to run PIP audit
Starting point is 00:22:19 so that if it has to install stuff and there is a vulnerability, it doesn't get install, I don't know, like crypto miners or whatever on my computer, it installs it into this ephemeral Docker container, right? Which is sweet, but it's kind of slowish. It adds like 10 seconds to the build time of what otherwise was, you know, seconds. A couple, it doubled the Docker build time at least, I would say. So does PIP audit must do something on the installed packages?
Starting point is 00:22:47 Yeah, exactly. So what you do is you install it, then it looks at the virtual environment. So I have a Docker container I built, which is like the command I run. And so what it does is it has PIP audit already installed it. It maps in the requirements file. it PIPP, UV PIP installs them quickly, and then it PIP audits against them, you know what I mean? Yeah. And so I'm hoping that this would be great.
Starting point is 00:23:07 And if you go to the bottom at the coming soon roadmap deal, you'll see at the one of the unfinished things is support locked, hashed requirements.tXT files. Yeah. Which is the way I've been to. I'm just not a huge fan of the project management like UVInet ad, so on for non-packages. I like it for packages. where it manages the Pipe Project.tomel and there's like a lot. But if I just have an app, I'd rather just have a requirements file.
Starting point is 00:23:35 So I'd PIPPIP compile that thing. But I can't UV audit it. I have to PIP audit it. So anyway. I'm actually thinking about because of this and because of other, because UV sync is awesome, thinking about projects that just are normal, like for my own for work for requirements.
Starting point is 00:23:55 DotSpace projects switching to PIPTAML so that I can get this. But if it supported this also, that'd be cool. Yeah, it's how much work can it be to translate the format of a requirements.txti file to a workable format of a UV lock file? It's probably an hour of cloud code, I bet you. Oh, wait, you converted to a pie product tunnel and then write it, yeah. Yeah, exactly, exactly. Okay, so that looks really cool.
Starting point is 00:24:20 I would actually like to switch over and talk about something that I thought was pretty interesting. And this is an article I wrote. It's not so much about the article itself. It's about what the article is covering. Thing, go away. So this is, the article's entitled Fire and Forget, or never with Python's Async I.O. And so here's the deal.
Starting point is 00:24:41 This is, there's a couple of tasks on our web apps that have long running things. So let me just give you one example. One example is, and there's different ways to do this. One example is like maybe I want to just send an email. Now, send an email usually is quick, but sometimes it's really slow. And maybe somebody signs up for a course and I want to send him an email,
Starting point is 00:25:01 but I want to be able to quickly just get right back to them. Right. So, hey, you're in. So you could just say start the send email async function and let it go. What are you going to do if it can't be sent? Like, I don't know. Like, you know, you're not usually told it goes out and eventually it comes back five minutes later. This email address was not found at this server or whatever, right?
Starting point is 00:25:21 It's not like you can wait for the call to say. It's like the wait is for the call, the email to start sending, not for the success. of the email delivery, right? So why wait? So you just fired off like, okay, that task goes. And then we tell them, hey, welcome to your account or whatever, right? Something along those lines. Another one is, I have all the data, all the transcripts for the various podcast stored in the database in a certain formats. It's really nice to get them in and out. And the transcript generation stuff is always, it's always fraught like with acronyms and stuff, right? Like Pi Pi Pi, without some work, it's getting better and better, but without some work, it'll come back as like pie, the food, the letter P,
Starting point is 00:26:02 or something, or maybe it's the number pie or like it's, whatever it is it's not capital P, lowercase Y, capital P, capital I, right? And so there's like this automation that I run that fixes it. Really bad is guest names. It can wreck guest names, and sometimes it's no big deal, but sometimes it's bad enough. You're like, oh, that's kind of offensive to the guests. Let's see if we cannot do that to them, you know? And so I have this like post-processing that goes on with literally hundreds of changes and not just one word, but if you see these three words together, that actually means something separate than if you heard them individually. Like talk Python slash whatever, like that's supposed to actually be a URL. So fix it. Or Python byte slash whatever. And so there's this process
Starting point is 00:26:42 that runs and fixes them. But as I discover new problems, I go back and retroactively fix the last 10 years. And that process takes forever in terms of web request time. Right. So I'll say there's admin button I can go push and it'll go pull every transcript from the database, search through it for hundreds of phrases, each transcript for hours of transcripts, change them, put them back in the database. So that's like 20, 30 seconds if it fans out across processes. So that kind of stuff, like when I push the button, I don't really necessarily want to just wait. I want to just have it go, awesome, we're working and we have a little JavaScript that'll tell you it's progress, but we're not going to await it, right? So that's a long, long way to set up this.
Starting point is 00:27:22 And the reason I went into that extra background is, when I posted this, a couple of people said, well, you should never do this, it's an anti-pattern. I'm like, you shouldn't do it most of the time. But the alternative is what? Set up an entire separate Redis server with a messaging queue? That's a lot of overhead when I'd just like a little process in the background to run and correct the transcripts, you know? Anyway, so here's the news, here's the story: a lot of times you would write simple code like this. You would say, I'm going to do a little bit of work, and it's maybe async or whatever. And then you can't just call async functions. You have to start them,
Starting point is 00:27:55 right? So you have to say asyncio.create_task, or you have to await it, because if you don't create a task or await it, it just doesn't run. It's an unrun async coroutine, right? So you've got to await it. And up to Python 3.11, this code was fine. You would just say, okay, now it's on the background queue, and it's going to do its async await run just as if you had awaited it, but you get to carry on and welcome the user or whatever you're doing. Well, it's been this way since 3.6, maybe 3.5, right? So it's a lot, it's like seven years. But now, Python 3.12 onward, you can't do this anymore. If you do this, that task that was just started, where you say asyncio.create_task, is eligible for garbage collection on the next line of code. And it might not even start. I wanted to put that out there for people like,
Starting point is 00:28:40 you know, if you have this type of code, you'd better look at it, you'd better make a change. Because for whatever reason, the async event loop now holds weak references to the tasks that it runs. And that doesn't just apply to this pattern. Anything that somehow put something into that task queue, the loop, needs to keep track of it explicitly, or else it potentially gets deleted out of memory, right? So if you look at the documentation for create_task in the docs, it says it's going to wrap a coroutine into a task, schedule its execution, and return the task object. However, Python 3.12 onward, it's important that you save a reference to the result to avoid the task disappearing mid-execution.
Starting point is 00:29:25 That's new. So anyway, what is the fix? You create this weird set that holds all running tasks that you're not otherwise tracking. You kick the task off as background work, and you add the task to that set. And then you set up a callback so that when the task is done, it takes itself out of the set. So I don't know. Anyway, that could certainly be catching some people out in, like, super hard to debug race conditions.
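In code, the fix being described looks roughly like this; it follows the pattern from the asyncio `create_task` docs, and `do_background_work` here is a made-up stand-in for the real background jobs (welcome emails, transcript fixes):

```python
import asyncio

# Strong references to in-flight tasks: the event loop itself only
# keeps weak references, so without this set a fire-and-forget task
# can be garbage collected before (or while) it runs.
background_tasks: set[asyncio.Task] = set()
completed: list[str] = []  # demo only: records finished work

async def do_background_work(name: str) -> None:
    # Made-up stand-in for real work: send an email, fix transcripts...
    await asyncio.sleep(0.01)
    completed.append(name)

def fire_and_forget(coro) -> asyncio.Task:
    task = asyncio.create_task(coro)
    background_tasks.add(task)                        # keep it alive
    task.add_done_callback(background_tasks.discard)  # drop when done
    return task

async def main() -> None:
    fire_and_forget(do_background_work("welcome-email"))
    # Carry on immediately; this sleep only keeps the demo's loop
    # alive long enough for the task to actually run.
    await asyncio.sleep(0.05)

asyncio.run(main())
print(completed)  # the task ran even though we never awaited it
```

The `add_done_callback(background_tasks.discard)` line is what upgrades the loop's weak reference to a strong one without leaking finished tasks.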
Starting point is 00:29:48 So I wanted to bring that up, and then I wrote up a way to do it. So that's that. Fire and forget, maybe. You might just forget. What else we got? Extras. Yeah. Yeah. What's interesting about that is it actually requires you to understand memory management a lot better than we normally do in Python land. Normally we just kind of ignore it. Okay. So your solution, does a task still stick around then? Yes. Okay. Basically, I just took my example and stole what they suggested straight out of the docs. You create a global set, you run the task, you put it in the set, and then you wire up the task so that when it's done, it calls discard on itself, so the garbage collector doesn't grab it. Okay, yeah, you just need to upgrade your weak reference to a strong one. Okay, that works. Well, yeah, it works, but it's not obvious. It's definitely not like, oh yeah, I knew I had to do that. All right, extras, what do we got? Extras. I just got a couple. Let's see, go over here. Just a fun article, I didn't want to talk about it too much, but I found it interesting, called Nobody Gets Promoted.
Starting point is 00:30:54 Nobody Gets Promoted for Simplicity. And this is on a blog called Terrible Software. That's an awesome name for a blog. So there's a quote from Edsger Dijkstra: simplicity is a great virtue, but it requires hard work to achieve and education to appreciate. And to make matters worse, complexity sells better.
Starting point is 00:31:18 And it's just an interesting observation, and I've observed it too. It's unfortunate, but people with the complicated thing get rewarded. It's like the guy that stayed late nights and weekends to fix bugs might get a free pizza or something like that, or a promotion, but it was their bugs that they were fixing, and the person that just didn't write the bugs in the first place won't get that. Anyway. That person is not even that much of a hard worker. Look, they went home on time. I suck. I bring this up also because a lot of people are now in a management role,
Starting point is 00:31:54 even if they don't realize it, if they're letting AI generate a bunch of code for them, simplicity still matters. Don't reward your agent just because it gave you volumes of code. It might be a simple solution. It might be easier. Anyway, set that aside. It's just an interesting read.
Starting point is 00:32:12 I've been working over the weekend on catching up on pytest-check. I brought this up a while ago, a couple weeks ago as well, that I'm catching up on all the things, because now we're at, let's see, zero issues, zero pull requests. That's awesome. I mean, it's a small project, but the last one I just cleaned up was a feature request from three years ago, so I'm glad that I'm finally getting it in place. And so the thing is, I've got a bunch of these helper functions,
Starting point is 00:32:45 like equal and not equal and stuff like that. The addition is that even if a test is not marked as xfail, you can mark an individual check as xfail now. So some individual checks might be xfail, some checks might not. And an assert can also still cause the test to fail. Yeah, that's the feature. Very nice.
Starting point is 00:33:11 Yeah, that's my extra. All right. I got a few. I'll go quick, though. So I ran across this project called TurboAPI. That's a riff on FastAPI. We started with Starlette and FastAPI, and here we are in extras with a bit more.
Starting point is 00:33:24 So there's a couple of interesting things from this. This person said, what if all of the parts of the code that are actually not the Python that you write, what if that was all Rust? Or actually not Rust, Zig. But it doesn't really matter. You know, Rust, C++, Zig, whatever, this person likes Zig. I don't even know what Zig code looks like, but, you know, natively compiled code, right?
Starting point is 00:33:44 Would it be faster? Would it be cool? Would it be like you could use the same decorators, the same models? Technically it doesn't use Pydantic, it uses some other thing that is Zig-based instead of Pydantic as the foundation, but it understands the models the same. And seven times faster, and so on. So anyway, I thought this was cool.
Starting point is 00:34:02 It's like this Twitter thread. It's got 325,000 views, so hey, it caught some attention. How about that? And I don't know, it was kind of interesting. I looked at it and I'm like, ah, you know, I don't really hear too much about that. But the idea of it got me thinking about something, so I wanted to throw that out there.
Starting point is 00:34:20 It's kind of popular now. It's got a lot of GitHub stars and so on. I'm not suggesting people use this over FastAPI. That's tried and true and so on. But this is kind of an interesting idea. Anytime you publish benchmarks, you will receive an inordinate, an abnormal amount of attention, and not in a good way. So there's a follow-up post:
Starting point is 00:34:41 I learned a lot from the TurboAPI benchmark mess. There it goes. Interesting. Also, there's some pushback that, oh, you used agentic coding tools to help do this transition or write this code. I think this is very interesting, because a lot of the pushback to Astral being acquired by OpenAI wasn't that we think OpenAI is going to be bad. It was, I hate AI, I hate OpenAI, and therefore I hate this. You know what I mean? And it wasn't an assessment of, do I think this is better or worse for this project?
Starting point is 00:35:14 It was just like, I hate them because they're AI, because I hate AI. And so there's this really interesting tension. And if you want to kind of live in that moment, you can read this whole thing. I think that's pretty interesting. So anyway. Yeah. There's also, like, we get sent projects all the time. I'm like, oh, there's this new project, check this out.
Starting point is 00:35:38 And there's nothing behind it other than, like, two days' worth of AI-generated stuff that I don't think the person that wrote it even read or tried or anything. And I totally concur on that. I agree with you on that. If the thing solves a legitimate problem that wasn't solved before, and it was generated with AI, and it seems generated well, okay, it's still interesting. But yeah, we definitely get stuff that's just clearly AI-generated, and people are looking for promotion for it, and that's not the sort of light we're necessarily looking to shine. But for example, in, I think it was the Ars Technica thing,
Starting point is 00:36:17 literally one of the comments was, I hate OpenAI, this should have never happened. Or, you know, something like that. It was basically verbatim what I said. So I know you got some other extras, but I want to interrupt a little bit more. I have a couple ideas. Next time, I want to try to, like, hype up something. I think I'm going to put fake timing data in there then. Yeah, exactly.
Starting point is 00:36:42 Performance metrics, just to... or even just bad performance numbers. They're not fake, but they're bad, you know, to get people going. And the other thing is, what do you think about, like, you know, selling Python Bytes to OpenAI? You know what? Hey, I love our listeners, I love what we do, but I would not be against this if they come with the money. All right, so chances very, very low. Really quick: Pyramid. Pyramid, the web framework. Yes, Pyramid, which Python Bytes was originally based on from way, way back in the day, has a release. Now, you might think, Michael, why are you so psyched about this? Because we're running it on Quart, async Flask, whatever, right, these days. This is the first actual release of Pyramid in five years. So I'm like, wait, there's a new version of it? What's going on here?
Starting point is 00:37:32 Technically, there was a release three years ago, but it was like a very minor thing. Okay. And so there's not a lot going on here. Basically, setuptools had some issue with the way that it was managing packages. Like, Pyramid web apps are packages, and setuptools was saying the way that you start your web app or install it is about to be no longer supported.
Starting point is 00:37:54 So the thing actually pins setuptools to a version that will work. But if you have something else that depends on setuptools, you can't use it. So it's always a bit of an issue. The other one is they literally, as part of this release, added a new HTTP exception in pyramid.httpexceptions.
Starting point is 00:38:12 I love it. I love it. So anyway, that's that. Vivaldi. We're both fans of Vivaldi. I know a lot of Python people are fans of Vivaldi, kind of the privacy angle and the not-Chrome angle. They just released this really cool new auto-hiding style where you've got this really minimal UI that feels very Arc-like.
Starting point is 00:38:34 People liked that, liked the Arc browser a lot. Most people are listening, so they don't know, but you can't see my address bar. You can't see all the chrome around, like, just sort of very minimal. You can turn it into just the essence of the web. You can even auto-hide your tabs,
Starting point is 00:38:49 but that's a bit too much for me. I like to know, like, I've got 20 tabs open, how do I get back to the one I want more easily and sort of navigate that? You can add a bunch of hotkeys that make this work better. For example, I used to go to the address bar
Starting point is 00:39:02 and copy the URL because I needed to use it in some sort of writing. But if it's hidden now, then how do you do that? So I added a hotkey to just copy whatever page I'm on. So it kind of encourages you to be
Starting point is 00:39:12 a little more hotkey-driven, so that's fun. People should check that out. I saw you just turned that on, Brian, this show. Yeah. Yeah, we'll see how you take to it. It took some getting used to, but I definitely like it. You've got to be a little more mellow around the edges. If you get too quick, stuff pops over and covers the nav, and that's kind of annoying.
Starting point is 00:39:24 When that stuff pops down, it would be nice if it moved the window, actually, as if that thing existed in the UI, so it didn't cover your elements. But whatever. One of the places where I see I'm using it the most
Starting point is 00:39:39 is places like here, or at work during presentations, where I just want to show the thing that I'm showing. So yeah. Yeah, yeah. And certainly on small laptop screens and stuff, it's super cool. Okay, a couple of things. The reason I even brought up
Starting point is 00:39:57 TurboAPI is it got me thinking about, well, what if our code ran on pure Rust, like, all the way up until where the code that we write starts running, you know? Not the framework stuff, but just that close, right? So I looked around and found Robyn, R-O-B-Y-N, a web framework which is basically exactly that. And in order to test it, I wanted to do a spike and see if I could convert one of our apps, like Python Bytes, to Robyn. Would it be a lot faster?
Starting point is 00:40:25 I don't know. Well, there's a whole bunch of Chameleon templates, and it only supports Jinja. So I'm like, well, in order to do that, I'm going to have to create a Chameleon package. So I did. I added Chameleon support to Robyn so I could see if it was any better. Turns out, no, not really. Was it really worth it, actually?
Starting point is 00:40:41 But now the world has this cool Chameleon Robyn project. So Chameleon is a little bit more out there. And I was also playing around this week, just trying out ideas this weekend: this raw plus DC, the dataclasses pattern that I talked about, that, you know, removes the dependency on your ORM library, in terms of, like, if it doesn't get updated.
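For a rough idea of what that raw-SQL-plus-dataclasses pattern can look like, here is a minimal sketch with sqlite3; the episodes table, its columns, and the sample rows are invented for the example, not the actual Talk Python schema:

```python
import sqlite3
from dataclasses import dataclass

# A plain dataclass as the row type: no ORM in between, the SQL is
# hand-written, so the code shows the database's actual query language.
@dataclass
class Episode:
    id: int
    title: str

def get_episodes(conn: sqlite3.Connection) -> list[Episode]:
    rows = conn.execute("SELECT id, title FROM episodes ORDER BY id")
    # Column order in the SELECT matches the dataclass field order.
    return [Episode(*row) for row in rows]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE episodes (id INTEGER PRIMARY KEY, title TEXT)")
conn.executemany("INSERT INTO episodes VALUES (?, ?)",
                 [(1, "Pilot"), (2, "Async all the things")])

for ep in get_episodes(conn):
    print(ep.id, ep.title)
```

The query function owns the SQL, the dataclass owns the shape, and swapping database drivers touches only the query layer.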
Starting point is 00:41:02 That makes it easier for AIs to understand, because it knows the raw query language of your database and so on. I actually switched Talk Python and Python Bytes over to that this weekend, just playing around, and it's like 10% faster, lower memory usage. Why not? Oh, wow. Mm. Yeah, pretty sweet. Have you considered raw plus Marvel? No, I know. There's better stories over there, but no. I think that's it for our extras. We got a joke, though. And this one comes from you. This is pretty good stuff. I love this. You have my screen up here, so set the stage for us. What is going on here? Can you get the LinkedIn one? Oh, yeah. Oh, yeah. So there's the normal,
Starting point is 00:41:47 like, Google Translate, you just type stuff into whatever translate tool and it converts it. Somebody at Kagi wrote a LinkedIn-speak converter. So you can convert English to LinkedIn or back. So if you're looking at somebody's bio on LinkedIn, you can... and it works for resume-speak as well. You can just type something in, like: finished my last contract. What are you writing? I just finished my last contract. I'm looking to move to AI programming. Okay, anyone interested? So what do we get? What are we going to get? So, I just finished my last contract, this is the input, I'm looking to move on to AI programming, anyone interested in someone who lives every day like it's day one? Remember that, like, LinkedIn? Oh, yeah.
Starting point is 00:42:36 Yeah. So here's what we get. Hashtag new beginnings, rocket ship emoji. I'm thrilled to share that I'm officially wrapping up my last contract. It's been an incredible journey, but I'm even more excited for the next chapter: diving deeper into the world of AI programming, robot star emoji. I'm looking for the next challenge with a team that values growth and innovation. I'm someone who brings that day-one mentality to everything I do, every single day, hockey stick growth emoji. If you're looking for a passionate developer ready to push the boundaries of AI, let's connect, handshake, hashtag AI, hashtag machine learning, so on and so on. That's funny. That's pretty good, right?
Starting point is 00:43:12 You know, it's even better. You can type other stuff in here that's not in the list. AI bro. Custom. Just sunset my last gig, rocket ship. Ready to pivot into AI and scale some world-changing models. Who wants a builder who stays hungry and treats every day like day one? Let's disrupt, robot emoji, fire emoji, hashtag build in public. Okay, so go back to the LinkedIn one, though.
Starting point is 00:43:34 Hold on, hold on, hold on, you got more. COBOL. The COBOL programmer one wrote a program that stated this. It basically, like, printed hello world, except it says DISPLAY day one mindset, question mark, STOP RUN. Oh, what's that? Lolcat. Oh, wait, that's a lolcat. I has finished my last joby. Now I want to do AI codes. Who wants a kitten who lives every day like day one? You want me to go back to the LinkedIn one.
Starting point is 00:44:10 I'm getting distracted. I'm sorry, I apologize. Oh, and in the English side, just type: I'm a programmer. This is pretty good. I'm a passionate software engineer dedicated to building scalable solutions and solving complex problems through code. Okay, but what if you saw that? So do that, like, flip back and forth, like the little
Starting point is 00:44:26 button in the middle. Oh, I see. So that translates to: I write code for a living. I spend my days trying to fix things that shouldn't be broken in the first place and make sure the whole thing doesn't crash when more than 10 people use it. That round-tripping is incredible. That is incredible. Yeah, we did that. My daughter and I did that with a few job descriptions and round-tripped a few times, and you get some crazy things. So I kind of want to put my LinkedIn page in here and see what happens, just see. But yeah, anyway, I actually think it might be a useful tool, depending on how brain-dead you think your hiring manager is. Yeah. Or you could just de-buzzword it: put in what you see and then translate that to English. Yeah. Anyway, well done, Kagi team. That's awesome. COBOL is nice.
Starting point is 00:45:19 Yeah, I know, folks, we went a little bit long on this one, but the whole Astral thing, I think, deserves some conversation around it. Yeah, yeah. We'll see. We'll see. Thanks for being here, everyone. See you later. Bye, Brian.
