Python Bytes - #455 Gilded Python and Beyond
Episode Date: October 27, 2025

Topics covered in this episode:
- Cyclopts: A CLI library
- The future of Python web services looks GIL-free
- Free-threaded GC
- Polite lazy imports for Python package maintainers
- Extras
- Joke

Watch on YouTube

About the show

Sponsored by us! Support our work through:
- Our courses at Talk Python Training
- The Complete pytest Course
- Patreon Supporters

Connect with the hosts
- Michael: @mkennedy@fosstodon.org / @mkennedy.codes (bsky)
- Brian: @brianokken@fosstodon.org / @brianokken.bsky.social
- Show: @pythonbytes@fosstodon.org / @pythonbytes.fm (bsky)

Join us on YouTube at pythonbytes.fm/live to be part of the audience. Usually Monday at 10am PT. Older video versions available there too.

Finally, if you want an artisanal, hand-crafted digest of every week of the show notes in email form, add your name and email to our friends of the show list; we'll never share it.

Michael #1: Cyclopts: A CLI library

A CLI library that fixes 13 annoying issues in Typer.

Much of Cyclopts was inspired by the excellent Typer library. Despite its popularity, Typer has some traits that I (and others) find less than ideal. Part of this stems from Typer's age, with its first release in late 2019, soon after Python 3.8's release. Because of this, most of its API was initially designed around assigning proxy default values to function parameters. This made the decorated command functions difficult to use outside of Typer. With the introduction of Annotated in Python 3.9, type hints could be annotated directly, allowing for the removal of these proxy defaults.

The 13:
1. Argument vs Option
2. Positional or Keyword Arguments
3. Choices
4. Default Command
5. Docstring Parsing
6. Decorator Parentheses
7. Optional Lists
8. Keyword Multiple Values
9. Flag Negation
10. Help Defaults
11. Validation
12. Union/Optional Support
13. Adding a Version Flag

Documentation

Brian #2: The future of Python web services looks GIL-free

Giovanni Barillari:

"Python 3.14 was released at the beginning of the month. This release was particularly interesting to me because of the improvements on the 'free-threaded' variant of the interpreter. Specifically, the two major changes when compared to the free-threaded variant of Python 3.13 are:
- Free-threaded support now reached phase II, meaning it's no longer considered experimental
- The implementation is now completed, meaning that the workarounds introduced in Python 3.13 to make code sound without the GIL are now gone, and the free-threaded implementation now uses the adaptive interpreter as the GIL enabled variant.
These facts, plus additional optimizations, make the performance penalty now way better, moving from a 35% penalty to a 5-10% difference."

Lots of benchmark data, both ASGI and WSGI.

Lots of great thoughts in the "Final Thoughts" section, including:
- "On asynchronous protocols like ASGI, despite the fact the concurrency model doesn't change that much – we shift from one event loop per process, to one event loop per thread – just the fact we no longer need to scale memory allocations just to use more CPU is a massive improvement."
- "… for everybody out there coding a web application in Python: simplifying the concurrency paradigms and the deployment process of such applications is a good thing."
- "… to me the future of Python web services looks GIL-free."

Michael #3: Free-threaded GC

The free-threaded build of Python uses a different garbage collector implementation than the default GIL-enabled build.
The Default GC: In the standard CPython build, every object that supports garbage collection (like lists or dictionaries) is part of a per-interpreter, doubly-linked list. The list pointers are contained in a PyGC_Head structure.

The Free-Threaded GC: Takes a different approach. It scraps the PyGC_Head structure and the linked list entirely. Instead, it allocates these objects from a special memory heap managed by the "mimalloc" library. This allows the GC to find and iterate over all collectible objects using mimalloc's data structures, without needing to link them together manually.

The free-threaded GC does NOT support "generations".

By marking all objects reachable from known roots, we can identify a large set of objects that are definitely alive and exclude them from the more expensive cycle-finding part of the GC process.

Overall speedup of the free-threaded GC collection is between 2 and 12 times faster than the 3.13 version.

Brian #4: Polite lazy imports for Python package maintainers

Will McGugan commented on a LinkedIn post by Bob Belderbos regarding lazy importing:

"I'm excited about this PEP. I wrote a lazy loading mechanism for Textual's widgets. Without it, the entire widget library would be imported even if you needed just one widget. Having this as a core language feature would make me very happy."

https://github.com/Textualize/textual/blob/main/src/textual/widgets/__init__.py

Well, I was excited about Will's example for how to, essentially, allow users of your package to import only the part they need, when they need it. So I wrote up my thoughts and an explainer for how this works.

Special thanks to Trey Hunner's Every dunder method in Python, which I referenced to understand the difference between __getattr__() and __getattribute__().

Extras

Brian:
- Started writing a book on Test Driven Development. Should have an announcement in a week or so. I want to give folks access while I'm writing it, so I'll be opening it up for early access as soon as I have 2-3 chapters ready to review. Sign up for the pythontest newsletter if you'd like to be informed right away when it's ready. Or stay tuned here.

Michael:
- New course!!! Agentic AI Programming for Python
- I'll be on Vanishing Gradients as a guest talking book + AI for data scientists
- OpenAI launches ChatGPT Atlas
- https://github.com/jamesabel/ismain by James Abel
- Pets in PyCharm

Joke: You're absolutely right
Transcript
Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds.
This is episode 455, recorded October 27, 2025, and I'm Brian Okken.
I'm Michael Kennedy.
And this episode is sponsored by the wonderful people at Python Bytes, actually both of us and everybody here.
So please check out the work that we do for a little bit of money.
We've got Talk Python training with tons of wonderful courses.
His new book, of course, Michael's new book. And then also, if you want to check out pytest, there's The Complete pytest Course, or you can take the pytest course at Talk Python Training.
And as always, thank you to our Patreon supporters. You guys rock. We don't shout you out all the time, but we always appreciate you.
If you'd like to submit a topic idea or some feedback for us, or just say hi, check out the show notes at pythonbytes.fm.
There's links to our socials.
We've got the show, and both Michael and I are on Bluesky and Fosstodon, well, that's really Mastodon.
So wherever you're at there.
If you're listening to this and you're like,
sounds like they're recording live somewhere
we can see it. Yes, we are recording
and streaming, but you can watch all the back episodes
or participate in
the chat if you want to.
Check out pythonbytes.fm/live, with all the details there and a link to when we're going to record next.
Before we get started, one last thing.
We do send out a newsletter that's getting better and better as we go along,
thanks to both Michael and me working on it a little bit.
So there's a newsletter we send out with all the links.
So you don't have to take notes.
We'll send you out an email.
So sign up for the newsletter and we'll send you those links.
and extra details too.
It's a great resource.
Yeah, I think people really appreciate it, Brian.
You know, if you look at when you send out the email,
then look back at the campaign,
and how many people actually open the thing that we send out
or click on it?
Those are really close to 100%, which is ridiculous.
So that's a really good sign that other people might also like it.
Yeah, I think that people are using it in place of taking notes.
And also, if you're in a car or something,
you don't want to, like, try to remember.
You don't have to.
So awesome.
All right, Michael, what do you got for us right away?
Well, let me see what I can do here.
Let's see what Cyclopts says.
A cyclops.
It's like a little programming kitty, but it's a cyclops.
It's cute.
Like opts, O-P-T-S: Cyclopts.
I get it.
So we've heard of Click.
You and I were both just actually singing the praises of argparse, which is pretty interesting.
I am such a fan of argparse these days because it's just built in and it's simple and it's good enough.
But if you were building something where the CLI API or interface or dev user experience was super important,
you might choose something other than argparse.
So what are your options?
You could choose Click.
You could choose Typer.
Typer is actually built on Click.
So in a sense, it's like Starlette and FastAPI.
You choose one, you get the other in a sense.
So there's a blend there.
And there's others. There's a bunch of different CLI-building options.
But Cyclopts is one that sort of has the second-mover advantage over Typer.
And there's sort of a tagline, I can't remember where I saw it, but somewhere in here; it's not actually right here on this page.
But, you know, no shade at Typer, but it says what you thought Typer was going to be.
And by second mover advantage, it's like, all right, we've seen different interface things built based around type information.
And now that we've seen the pluses and the minuses, could we do a slightly different version that has more pluses and fewer minuses?
And so there's a whole page of the Typer comparison here.
And I'm not a super expert on Typer.
My CLI is not that great.
So there's 13 different things, 13 different components or features or aspects of the API that they chose to address to say,
Typer is great.
We're basically inspired by Typer.
That's what we're doing.
However, there's a couple of these 13 areas I think we should make better, right?
So you could actually go through and look at all of them.
Probably the biggest one is that Typer was made with Python 3.7 and 3.8, where Python's types were at an earlier stage in their life.
You had to say things like, this is an argument of int, or it is an int when it's actually an argument of int, or something like that.
And this Cyclopts thing was built once Annotated was introduced, which allows you to have sort of a dual role: like, it really is one of these, but its type could also be its native value, right, allowing the removal of proxy defaults.
Why does that matter? Because if you build a CLI with Cyclopts and you use types, you can still use the commands as regular functions, as if they just took an integer where they said they take an integer, and so on.
So, you know, things like that.
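To make that concrete, here's a minimal sketch of a Cyclopts app in the Annotated style; the deploy command and its parameters are made up for illustration:

```python
from typing import Annotated

from cyclopts import App, Parameter

app = App()

@app.command
def deploy(
    name: str,
    replicas: Annotated[int, Parameter(help="How many copies to run.")] = 1,
):
    """Deploy a service (hypothetical example command)."""
    print(f"Deploying {name} with {replicas} replica(s)")

if __name__ == "__main__":
    app()  # parses sys.argv and dispatches to the command
```

Because the defaults are real values rather than proxy objects, deploy("web", replicas=3) also works as a plain function call, which is exactly the point being made here.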
It's got about, I think, a thousand GitHub stars or so.
It's not crazy, crazy popular.
But if you're really loving that concept of having types for your TLI, you know, give
a thing a look.
Yeah, actually, I've got to give a shout-out to them for doing things like comparing to Typer.
Because, like you said, a lot of us, I use Typer as well.
And I've used Typer.
I've used Click and argparse.
And so you're probably not going to grab too many of the argparse people, because they're just using it because it's built in, maybe. But, like, the unittest people, Brian, what can you do? You've talked pytest for 10 years. I yell louder, that's what I do. But I think it's a good, a fair thing, because I'd be thinking, well, if I'm not going to use argparse, I'm probably going to reach for Typer or Click. Lately, Typer.
But having this comparison here,
and then even a migration from Typer,
I think that's cool to show that.
And there's always room for making CLI tools better.
We love CLI tools.
Yeah, there definitely are.
But yeah, so cool.
I ran across this and I built something with it recently,
and it was nice.
So we're checking out.
A little, I'm starting to add it. I've kind of, I've drunk the Kool-Aid now that maybe having an actual CLI, even for my own tools, might be a little bit better than just, here's a script I run, and maybe I'll pass it something and just look at sys.argv and see if it's in there.
Oh, man.
Let's do a little bit better.
And now that I've started to do that, I've been playing with these different ones, which is,
I think, where I kind of got inspired to go down this path and talk about it today.
Yeah, awesome.
Cool.
I want to talk about the present, the future, a little bit of both.
So there was this interesting article.
Of course, in Python 3.14, well, we had it in 3.13, the GIL is optional with the, what do they call it, the free-threading version, right?
Free threaded Python.
Not the GIL-less, not the Gilectomy: free-threaded.
I think that's where we landed from the PEP.
Okay, but is it kind of the same thing?
Yeah.
Okay.
Oh, right. The Gilectomy was a different project.
So, yeah, I was, um, gosh.
Anyway.
Larry Hastings was doing that.
But I believe, I mean, the free threaded stuff, there's a lot of things that had to happen.
And actually, my next topic is also on free threaded Python.
And it's not just, let's take out the GIL, but they had to rework memory management,
the structure of objects.
There's like a lot of moving parts in order to make this possible.
Okay, cool.
Um, well, okay.
So with the free-threaded Python.
So now, in 3.13, we had two versions of Python released.
So if you go to download Python, you get a choice.
And with uv, too, you can add a t to it, 3.14t, and you get the free-threaded version.
And also, I definitely encourage everybody to test, and publish that you've tested.
In CI, you go ahead and test it because the CI tools support both.
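As a side note, if you want a quick sanity check of which build a job is actually running on, here's a small sketch (sys._is_gil_enabled() was added in 3.13):

```python
import sys
import sysconfig

print(sys.version)
# On 3.13+, reports whether the GIL is actually active at runtime.
print("GIL enabled:", sys._is_gil_enabled())
# Build-time flag: 1 on free-threaded builds, 0 or None otherwise.
print("free-threaded build:", bool(sysconfig.get_config_var("Py_GIL_DISABLED")))
```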
Why do we care about this?
Well, because some people are migrating to it.
Because as of 3.14, it's no longer just an experimental thing.
So here, I'm going to cover an article by Giovanni.
Barillari?
Sorry, Giovanni.
I can't pronounce that one.
I'm pretty sure this is the same Giovanni that makes Granian.
Oh, okay.
I think the future of Python web services looks GIL-free.
So it starts out with the two major changes when comparing the free-threaded variant of 3.14 versus 3.13.
First, free-threaded support now reaches phase two, meaning it is no longer considered experimental.
So everybody's on board with, this is what we're going to at least try to do for a while.
Not just try to do, but the Python core team is supporting this, and we're going to do it moving
forward.
What does that mean?
It means that people can depend on a free-threaded version and change their code accordingly if necessary, which is great, especially for a lot of async stuff, like web services.
So secondly, the implementation is now complete, meaning that the workarounds introduced in 3.13 to make code sound without the GIL are now gone, and the free-threaded implementation now uses the adaptive interpreter, same as the GIL-enabled variant.
What does that mean?
It means that the additional optimizations, oh, okay.
Also, those facts plus additional optimizations make the performance penalty that we had in 3.13 way better, or way less annoying.
So in 3.13, with the free-threaded build, if you were doing non-async code, just straight code, you had to face like a 35% time penalty, which sucks.
Now it's like a 5 to 10% difference and I think it's going to shrink even more.
But so, if you don't need the free-threaded build, don't use it, or use it, whatever. But if it's time critical, don't right now, but I think it's going to get better. And especially if you are using async and stuff, it's way faster to use the free-threaded build, of course. So this article does a shout-out to Miguel Grinberg's article about performance. I think we covered it last time. We did. You did. Okay, no, maybe I did. But we did. I can't remember who started it.
So this has a whole, this article has a whole bunch more benchmarks, and I'm not going to go
walk through all of them, but it talks about doing a service with JSON responses, measuring both WSGI and ASGI implementations.
And yes, talking about using Granian, so that makes sense.
So I'm going to jump down.
So there's a bunch of metrics.
Basically, the end result is, if you're using ASGI, the free-threaded one is obviously the way to go.
And you don't really get much of a hit from the free-threaded implementation at all.
You get speed-ups and memory usage is reasonable.
So what I want people to do, if you're going to check this out, is to jump down to the final thoughts
because there's some great stuff here.
Essentially, so what do I want to pick out? On asynchronous protocols like ASGI, despite the fact the concurrency model doesn't change that much, it's a shift from one event loop per process to one event loop per thread. I think that's a big change, actually. Just the fact that we no longer need to scale memory allocations just to use more CPU is a massive improvement.
That's cool. Didn't know that. That's neat.
So for everybody out there coding a web application in Python, simplifying the concurrency paradigms and the deployment process of such applications is a good thing, obviously. And the conclusion being: to me, the future of Python web services looks GIL-free.
So if you're doing web services, try out the free-threaded version.
At least one developer there is completely happy with what we have now.
So, yeah, super nice.
A bit of real-time follow-up.
Yes, indeed. It is the same Giovanni.
All right.
Who creates Granian.
So with any of these web production app servers where you run your Python code when it's running,
whether it's Django, Flask, whatever, there's a process that runs your code.
Very often, the way that we've got them to work around the GIL, like in the web, the GIL is a much smaller problem.
It's still a problem, but it's less.
Because what we do often is we create a web garden.
So when you set up Granian, you can say, start four copies of yourself.
So that, yes, they all have the GIL, but we'll round-robin requests between the four.
If you got four cores, four CPUs, you basically can say, well, each one of these is dedicated to a CPU and it can kind of like match, right?
Yeah, so we're running on Granian, good stuff.
But the thing is, when you have this free threaded option, you can actually have true concurrency in your one worker.
So instead of scaling out four copies, you could have just one.
and to say, let that one take 10 concurrent requests
or whatever it needs to take.
Yeah, right?
So that's how the free thread it gets better.
And like, well, okay, why?
What's, like, you know, six of one, half a dozen of the other?
No, the memory becomes a problem
when you create these little web gardens.
Because if normally your server would use half a gig
and you create four, well, now it's two gigs.
And maybe that bumps you up in another tier and so on, right?
Yeah, and if you need that for, yeah,
if you need that memory for data, you don't have it.
Yeah, so I think that's kind of one of the angles: not just you're writing concurrent code, but the foundations your code runs on can do so more efficiently, because they're no longer needing to work around process limitations, like, well, we've got to fan out these processes because of the GIL. Okay, back to the regularly scheduled programming: free-threaded Python, let's keep going. So here's an interesting article from a snake, a very fast snake with jet engines, that is wearing a garbage collector.
So what I want to talk about is unlocking the performance in Python's free-threaded future
GC optimization.
So garbage collection is not something that we frequently talk about a ton in Python.
You know, it's just, we don't think a lot about GC, we don't think a lot about memory.
I don't know why. Like, as you know, Brian, in C and C++, people are, like, always on about it, right?
Like, oh my gosh, what size of pointer, what size of integer are we going to use for this, right?
And like, it's just way more sort of in the forefront, but it's nice every now and then to peel
back the covers and get a look at what's going on here.
So Neil Schemenauer wrote this article over here, from Quansight.
And they do a bunch of data science primarily.
But here, check this out.
This is news to me, actually, even though I'm still interested in these things.
It says, first, the most important thing to know is that free-threaded Python uses a different garbage collector than the default Python, the GIL-enabled one.
The gilded, I'd still want to call it the gilded python.
The gilded python.
So we have the default GC, which people probably understand pretty well.
When you create a number, like a thousand, when you create a list, you create an object from
a class, all of those things have a data structure at the top of them that allows them
to manage and track the memory.
And people think of the Gill as being a threading thing.
The GIL is really a protection mechanism for memory management in Python, right?
It's protection against a race condition in the reference counting.
So basically interacting with this structure that says seven things reference me and I
reference it.
However, you know what I mean?
Like that sort of deal.
Yeah.
And when the thing goes out of scope, it decrements that, and that's where the GIL comes in, to make that decrement safe and so on.
So that's the regular one.
But we don't even have that structure anymore in the free threaded one.
So, so what?
So instead, when you allocate stuff, it goes into a special memory heap managed by mimalloc, right?
And this allows the GC to loop over all these objects and figure out which ones are junk, a little bit like a mark-and-sweep garbage collector.
So there's a couple of interesting things.
One is that most of the mark-and-sweep garbage collectors and that type of thing have generations, like a generational one. So it has gen zero, where most objects start, then it'll check that frequently, and so on. That's also how the garbage collector that looks for cycles in the regular Python goes, right? But it doesn't work here, because of unmanaged C extensions and a bunch of stuff like that. So what it does is, it marks all the objects reachable from what it calls known roots. So instead of trying to scan the entire memory space, it says, well, what are the globals? What are the locals? It sort of follows those and marks them as active. And they can automatically be excluded from the search, which is kind of like a generational optimization. So pretty interesting.
But here's the takeaway. We were talking about making things faster by using free-threaded Python. With this one, the free-threaded GC collection is between 2 and 12 times faster than the 3.13 version. Wow. That's pretty wild, right? Yeah.
So depending on how your algorithms work, if you have a super pointer-heavy, allocation-heavy type of algorithm, then, you know, you'll probably see a bigger benefit than if you just allocate a few things and jam on those.
So anyway, if you want to peel back the covers and see the GC, I didn't even know they were different, but apparently they're different.
And presumably, maybe they could even have like multi-threaded GC stuff going on.
That might be cool.
Or background thread GC.
I know those are a lot of different things that happen in some of these systems, these garbage-collected systems.
Yeah. And finally, a little bit of real time, not quite real time, but real world follow-up, I'll say.
So for some reason, I can't remember what I was doing. Something I had to do, I had to go back, go through all the examples of my async programming course at Talk Python.
So I went back in there and I said, all right, well, let me just make sure everything's running, update all the dependencies.
I remember what it was. But anyway, something had to be updated. So I made sure all the dependencies got updated and penned them to the newer ones. And I'm like, well, there's a lot of changes here.
so they're all running. So I went around and I ran every example from the course, probably 25 little
apps. And many of them are like, well, here's a web app or here's just a way to do multi-processing.
Then we'll look at it. But a lot of them were like, let's actually do this with threads.
And in this example, doing I/O, you see it does make a difference.
So this example, doing computational stuff, no difference, or slower, because there's just overhead and it's still running one at a time, right? Well, I decided to use uv run --python 3.14t on those examples and run them free-threaded. The ones that used to have no benefit or a net negative are now like seven times faster. It would go like, synchronous one, 10.2 seconds; async, or threaded one rather, 10.25 seconds, you know. And now just put the t on the end: three seconds. That's pretty cool. That is pretty cool, man.
To me, it's really down to what are the framework,
what are the packages and libraries that you're using?
If it's green across the board for free threaded,
that's pretty interesting.
It opens up some real possibilities.
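Here's a tiny sketch of the kind of CPU-bound example he's describing; on a GIL build, the threaded version is no faster than running the work serially, while on a free-threaded build it should scale with your cores. Try it both ways, e.g. with uv run --python 3.14 and uv run --python 3.14t:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def burn(n: int) -> int:
    # Pure-Python CPU work: serialized by the GIL on a standard build.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=4) as pool:
        list(pool.map(burn, [5_000_000] * 4))
    print(f"threaded: {time.perf_counter() - start:.2f}s")
```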
Yeah, also, I'll be looking forward to, I don't know if we have a deadline or timeline for this, but when we can go back to having one version of Python, and it being the free-threaded one. It'll be, you know, maybe we can switch it so the default is the free-threaded one. And for a couple versions, there's the other one also. You've got to say 3.14g.
Yeah, or something like that.
And I think that would be good, because the default way we think about how to program, and how to program quickly, and the rules of thumb based on the GIL are gone. Will be gone. So we'll teach people differently.
Yeah.
So, yeah.
Anyway.
Yeah.
And before we move on real quick, a follow-up to your topic from the chat here. Also, if you're on Kubernetes, having to scale out, like the web garden type stuff, makes setting reasonable CPU and memory requests more difficult.
Oh, okay.
Yeah, so if you just have one process,
like this is the one, you know,
I think it's easier.
I believe that's what he was getting at.
Yeah, and also, I mean, from the C++ world,
yes, processes and threads both can work.
But thinking in threads is easier than thinking
about multi-processes.
100% because you have shared memory.
You don't have to proxy stuff over
and figure out how to sync it back up and copy it.
Through queues and all sorts of things, even.
Anyway, OK.
I can't resist.
I got to say, I got to put this out there in the world
and see what you think.
One thing right now, we have the gilded Python
and we have the free threaded Python.
And I can choose to run the free threaded one
as we've been talking about if I want that.
But what about this, Brian?
This is what I want to put out in the world.
I want, like right now if I have a project,
and I've got some data, and I need to process it in parallel on the gilded one,
I maybe need to do multiprocessing.
So that's going to spin up five little baby Python processes, all to go jam on that, right?
That's how you get around the GIL.
What about this?
What if you could pass a flag to multiprocessing, and it would start up free-threaded Python with threads, instead of a bunch of different processes that can't coordinate on whatever the algorithm is? You just say, run multiprocessing, but this time do it in threads, in free-threaded Python, as your little multiprocessing burst, and then come back to me with the answer. That would be cool. Yeah. Yeah. I'm not, you're not convinced? I mean, you could do it that way, because, I mean, your code is very different though, just that one function though. You know what I mean? You're like, this one function that I'm going to call multiprocessing on, I'm going to write it assuming it will be an advantage to be threaded, or it will be an advantage to be concurrent, but the rest of my code I don't have to rewrite. It'd be a cool way to sort of, kind of like Cython lets you say, this one function, if just this were way faster, it would make all the difference.
You could do that, but for free threading.
I would like to see that.
It would be interesting.
Yeah.
Yeah, or just a chunk, to say, like, yeah, this subsystem is done through free threading.
With free threading.
Yeah.
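There's no such multiprocessing flag today, but you can approximate the idea with concurrent.futures, since thread and process pools share the same API; a sketch (the runtime GIL check is 3.13+, so we assume the GIL is on for older builds):

```python
import sys
from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor

# If the GIL is off, threads give real parallelism with shared memory;
# otherwise, fall back to processes.
gil_enabled = getattr(sys, "_is_gil_enabled", lambda: True)()
Executor = ProcessPoolExecutor if gil_enabled else ThreadPoolExecutor

def work(x: int) -> int:
    return x * x  # stand-in for the one hot function you'd parallelize

if __name__ == "__main__":
    with Executor(max_workers=4) as pool:
        print(list(pool.map(work, range(8))))
```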
No, all right.
Go ahead.
Okay.
Completely different tangent: we're going to go back in time by one or two episodes.
The last few episodes, or two or three, we've been talking about lazy imports, because I thought they were coming in 3.15. They're proposed for 3.15, but they're not here, or they're not accepted yet. I haven't checked; I don't think anything's changed. So last episode, I did talk about lazy imports that you can use today. Responding to that, Bob Belderbos had a discussion on LinkedIn, and as part of that discussion, we had Will McGugan hop in and say that he's excited about the PEP, and that he has a lazy loading mechanism for Textual's widgets. Which totally makes sense: with widgets, you might just need a button, but you don't need all of the other widgets. It'd be cool if you could just load what you needed, and he's got that. So I checked it out, and I think this is sort of a small topic, but
I think it's cool.
So the idea that I want to highlight is, I presented last week a method for doing lazy loading now, or lazy importing now, on things that you depend on.
But what if you're like, you don't have to wait.
You can be lazy today.
But what about like, yeah, exactly.
What about if you're the package?
If you have a big package that you have a bunch of stuff that people might want to import,
making it so that you're net and you're not the problem,
that your package is going to import really fast
because behind the scenes you're doing lazy importing for people.
And that's essentially what Textual does.
And the implementation is really easy.
Basically, when you access anything, he's overriding the module's __getattr__ function.
So if you access a widget, it's going to try to just grab it out of a dictionary. And of course, right away, it's going to fail. And then, if it fails, he goes ahead and imports it, and then stores it in. So the next time somebody tries to grab it, it's going to be there. And so there's a little bit of misdirection here. But in the end, you get lazy loading on every widget access. So pretty cool.
And so I thought, you know, this is pretty neat. I don't want it to just be hidden. It's not hidden, it's an open source project, so I went ahead and wrote all this up in a new post called Polite Lazy Imports for Python Package Maintainers. Very nice. We'll link to that also. I just thought that the implementation is totally clever, and maybe for other package maintainers this is like, well, yeah, that's the way you do it, but it was new to me and I thought it was neat. So there we go. Also want to shout out: I didn't realize there was both __getattr__, which is used if it's not found, for missing items, and also another one called __getattribute__.
And I did look it up.
I really appreciate that Trey Hunner had posted this Every Dunder Method in Python article.
So pretty cool.
I totally have this bookmarked so that, if I want to understand a dunder method in Python, I go here.
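The distinction in one tiny class-based example:

```python
class Demo:
    present = "found by normal lookup"

    def __getattr__(self, name):
        # Called only AFTER normal lookup fails, i.e. for missing names.
        return f"fallback for {name!r}"

    # __getattribute__, by contrast, intercepts EVERY attribute access,
    # which is why you almost never need to override it.

d = Demo()
print(d.present)  # "found by normal lookup"; __getattr__ never runs
print(d.missing)  # "fallback for 'missing'"
```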
Yeah, very cool.
I know it for classes, but for modules as well, interesting.
Yeah, and packages.
Well, I guess it is for modules, but if you put that in a dunder init file, your __init__.py, then it's for your entire package.
Yeah, exactly.
It's great. Cool.
Anyway.
I love it.
Nice.
All right.
Is that all your extras?
Oh, that's just.
That's my main thing.
I do.
I do have one extra.
The one extra, so one of the things I've been doing is,
I paused Python People a long time ago, and I totally stopped doing Test & Code.
I'm focused on this and focused on work, but I'm also writing more.
And one of the things I'm doing is writing a book on test-driven development.
And I don't have an announcement yet, but in the next couple weeks, I think I might have an announcement.
But if you'd like to know, I guess what I'm going to do is I'm going to write the book,
but to motivate me and to make people not have to wait, I'm doing.
Doing it is like a, you know, like Manning does like the early access books and Pragmatic has
beta books.
I'm kind of doing that.
I'm doing a lean, trying to do a lean startup applied to books.
And I'm going to release it after I've got, I'm like, you know, do a rough first draft,
clean it up a little bit and then release it for every chapter.
And then as I go along, incorporate feedback from people.
And then once the whole thing is good and polished, I might bring on an editor or something
or maybe just release it.
But if you'd like to know more about that,
I'll announce it here,
but also if you'd like to know the minute it's available,
join the newsletter over at Python Test,
and I'll let you know.
Yeah, we really do appreciate when you join our newsletter.
You might think, oh, we can just announce these things
on social media or on the podcast,
but it's not the same people.
A lot of people miss it because they miss an episode
or like social media is just a screaming feed of stuff.
It really makes a difference.
So join the newsletter.
Also, I want to talk about, there's newsletters, and then there's newsletters. So this isn't really, I don't do a weekly, like, eight tips on testing. That would be cool; I just don't have time to do that. So it's mostly announcements, if there's something to keep track of. But people like you and me and others, we're using things like, I'm using Kit, but there's others; I've used other email things before. I don't keep track of who's subscribed, so I'm not going to sell you to anybody or anything.
It's just an announcement thing, and you can unsubscribe anytime.
And I don't even know, like once you're unsubscribed, you're gone.
I don't have the access to that.
So there are people that abuse newsletters, but I don't think there are a lot of people in the Python tech space that do, though.
So it's not spammy.
Anyway, moving on, do you have any extras for us?
I actually have a couple.
Let's jump over and hear about it.
Very exciting news.
I am really happy with this new course that I created, called Agentic AI Programming for Python Devs and Data Scientists.
Cool.
Yeah, and the idea is basically, so we have these agentic AI tools
like Claude Code and Cursor and Junie and all them.
But how do you actually be successful with them, right?
I've talked a few times about just insane productivity
and really good outcomes that I've had.
And people are like, how are you doing that?
So this course is like a real world set of guidelines
and examples and best practices for working with agentic AI.
So we spend an hour building a really detailed application. But it also talks, like, almost for an hour about guardrails and roadmaps, and how do you get it to do exactly what you want? You're like, when I say build an app, don't just build the app, but build it with uv, write pytest tests, format it with Ruff, with this TOML config, and, you know, how do you get it to give you what you want, not something in the vague, general space of what you ask for, you know what I mean? So, sort of a practical agentic AI programming course. I'm getting, like, crazy amounts of good feedback on this course, so people, check it out. The link is in the show notes; I think it's talkpython.fm/agentic-ai. So that one's fun.
What else? I'm also going to be talking with Hugo Bowne-Anderson. He's running the Vanishing Gradients podcast. He was on Talk Python a while ago, and now I'm going to be on his show: data science meets agentic AI. Sort of a follow-up from my Talk Python in Production book, plus this course I just talked about, those kinds of ideas. We're going to spend some time riffing on that. That's tomorrow night, U.S. time, so check that out. All right.
Also, OpenAI introduced an AI browser wrapping Google Chrome, called Atlas. This is interesting. I played with it. I'm not convinced. I mean, it's kind of fine. I don't know; I don't even know what to think about these things. That's not why I included it. Maybe it's sort of interesting as a side note, but I linked to the Ars Technica article. Holy smokes, you guys, check out the comments. They are not loving it. And it's not OpenAI's fault exactly. It's just this idea of AI taking over everything; if it's not everyone hating on it, there's at least a very vocal group of people who don't love it. And I just noticed that Kyle Orland wrote this, and he actually wrote a whole book on Minesweeper, which, you know, props to him for that. Oh, that's awesome, I've got to check that out. Geez, it's been a long time since I thought about Minesweeper. Anyway, this is actually just a really interesting cultural touch point, I think, so people can check that out.
All right. James Abel, I think he was behind PyBay this year, in the San Francisco area, said, hey, somehow we were talking about dunder name, if __name__ == '__main__', today. I have a package. It's real simple, but instead of writing this, if you want to get people started, you can just import my thing and say if is_main(), as a simpler, saner way. That maybe should be a built-in, don't you think, Python? Yeah. Or, what I would rather see, here's what I would like to see, but I don't know that Python has a mechanism for it: I would like to say at the top something to the effect of, register this function as main, and then when it's done parsing the file, then it runs it.
Because even this, which is super helpful, it's still, if you accidentally put code below it,
it's a problem, you know what I mean?
I would like Python to enforce like, we load the whole file and then it runs, like a cool
mechanism for that.
But still, this is pretty nice.
And at first, I thought it was just wrapping if __name__ == '__main__', right?
But in fact, it's doing something a little bit different here.
It's using the inspect stack to go back and then look at the value, because obviously it itself is not __main__, so it can't do that test.
So I don't know, kind of cute.
Yeah.
If you don't want to have a dependency on this library, you could take that bit of code and put it somewhere as a utility in yours.
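For the curious, here's a minimal sketch of how such a helper can work by inspecting the caller's frame; this is the general idea, not necessarily the ismain package's exact code:

```python
import inspect

def is_main() -> bool:
    # Look one frame up, at the module that called us. Its __name__ says
    # whether it's being run as a script; our own __name__ never is.
    caller = inspect.stack()[1].frame
    return caller.f_globals.get("__name__") == "__main__"
```

Usage would just be, at the bottom of a script: if is_main(): main().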
But, okay, I'm just going to throw this out there.
If you want to do something really fast,
make sure you time this,
because every time I've added inspect,
I love the inspect library,
but whenever I add it, it slows things down.
Okay.
Um, just, just, you know, benchmark it.
Benchmark it.
All right.
In line, in line.
All right.
Um, if you're an IntelliJ person, like PyCharm, wouldn't it be nice, Brian?
Wouldn't it be incredible?
If while you're working, a little pet could walk around in the bottom.
Anthony Shaw, I believe, is the one who added this to VS Code.
Yeah.
PyCharm people can love animals too.
So you could install the pet into your PyCharm now.
Oh, cool.
Nice.
I don't know that I'm going to be doing it.
This is a last-minute addition.
I just got a brand new M5 iPad.
How insane is that?
The last one I had, I got five years ago.
I'm like, ah, it's probably time.
And I got a 13-inch iPad Pro, which is an insanely large iPad.
But I've started using it as the second monitor for my laptop,
if I'm traveling or if I'm at a coffee shop or something.
And that's a super cool experience.
Just put them side by side.
If they're on the same network or on a cable, they just do, you know, real low-latency dual-monitor stuff. And yeah, it's pretty neat, so just a little shout-out to that. That's incredible.
It's like having a second laptop, for the price of a second laptop. Yes, it is. But here's the thing. Yes, I agree, but it's also a really nice reading device, and I do a lot of reading. And I was going to get one of the cheaper ones. One of the things that drives me: I have a MacBook Air as my main laptop computer, and it's the same chip as my main computer. I don't have a high-end computer, and it's plenty good for Docker stuff, for programming stuff, all these things. It's totally fine. But here's the thing. The iPad Air, while great, its peak brightness is something like five or six hundred nits. And if you're working, like, I like to work in my back outside area in the summer, that's kind of not relevant right now, but in general, you know, sit outside, enjoy the weather, get out of the office. And if it's at all bright, through a window or somewhere, it's really a pain. This thing is like the best screen you can buy on a Mac, period, whatever: a thousand nits. So you could push the computer to the side and just put the iPad in front of you and type on it. It's really nice. Anyway, that's the main reason: I want a brighter screen without having a MacBook Pro. I feel so bad for you, having to deal with working outside in such bright light. It's really horrible. I know. It's hard. It's hard being me. All right. Carrying on with the jokes. No, go ahead.
Well, before we get to the joke, I want to, I guess, highlight, going back to my announcement of possibly a book.
Japanol 7 says a TDD book by Brian.
Can't wait.
Also,
Talk Python training courses are great.
Kudos.
Yeah.
I'm looking forward to getting at that book, and I'm looking forward to that AI course.
Yeah.
Yeah, same.
All right.
I'm not looking forward to surgery.
I'll tell you what.
And it's getting to be weird, Brian.
I mean, like doctors using AI and stuff.
Actually, we probably are going to get better diagnoses for certain things with that. But here's a surgeon situation. There's a person who's in post-op, okay? They're lying there like, ah man, a little woozy from the anesthesia, coming out of it. The doctor, which is a robot with the ChatGPT, I don't think it's, is it the same, I don't know, some AI logo for a robot face. And the patient says, but why is the scar on the left if the appendix is on the right? The AI surgeon says, you're absolutely right. Let me try that one more time.
Please don't try it one more time.
It's so bad.
It's pretty bad.
It's pretty funny.
That actually drives me nuts when I'm like, this doesn't sound right.
Is this, you know, the user is calling you out.
I've made a mistake.
Oh, you're right.
Yeah.
Oh, well.
Yeah.
So, yeah, just get a second opinion.
So if OpenAI is going to operate on you, have Anthropic be the backup, I guess, is the moral of the story. I don't know. I don't think that's the moral of this one. You don't think so? Okay. Maybe somebody trained in medicine. Yeah.
Trained on medicine? I'm sure they are.
Trained with heavy medication. Yeah, exactly. All right, cool. Well, fun as always.
Well, definitely fun as always. And we'll see everybody next week.
