Python Bytes - #413 python-build-standalone finds a home
Episode Date: December 9, 2024

Topics covered in this episode: jiter, A new home for python-build-standalone, moka-py, uv: An In-Depth Guide, Extras, Joke

Watch on YouTube

About the show

Sponsored by us! Support our work through:
- ...
- Our courses at Talk Python Training
- The Complete pytest Course
- Patreon Supporters

Connect with the hosts:
- Michael: @mkennedy@fosstodon.org / @mkennedy.codes (bsky)
- Brian: @brianokken@fosstodon.org / @brianokken.bsky.social
- Show: @pythonbytes@fosstodon.org / @pythonbytes.fm (bsky)

Join us on YouTube at pythonbytes.fm/live to be part of the audience. Usually Monday at 10am PT. Older video versions available there too.

Finally, if you want an artisanal, hand-crafted digest of every week of the show notes in email form, add your name and email to our friends of the show list; we'll never share it.

Michael #1: jiter
- Fast, iterable JSON parser.
- About to be the backend for Pydantic and Logfire.
- Currently powers OpenAI / ChatGPT (along with Pydantic itself), at least their Python library, maybe more.
- jiter has three interfaces: JsonValue, an enum representing JSON data; Jiter, an iterator over JSON data; PythonParse, which parses a JSON string into a Python object.
- jiter-python is a standalone version of the JSON parser used in pydantic-core. The recommendation is to only use this package directly if you do not use Pydantic.

Brian #2: A new home for python-build-standalone
- By Charlie Marsh. See also Transferring Python Build Standalone Stewardship to Astral from Gregory Szorc.
- python-build-standalone is the project that has prebuilt binaries for different architectures.
- Used by uv python install 3.12, uv venv .venv --python 3.12, and uv sync.
- This is good stability news for everyone.
- Interesting discussion of prebuilt Python from Charlie.

Michael #3: moka-py
- A high-performance caching library for Python, written in Rust.
- moka-py is a Python binding for the highly efficient Moka caching library written in Rust. It allows you to leverage the power of Moka's high-performance, feature-rich cache in your Python projects.
- Features:
  - Synchronous Cache: supports thread-safe, in-memory caching for Python applications.
  - TTL Support: automatically evicts entries after a configurable time-to-live (TTL).
  - TTI Support: automatically evicts entries after a configurable time-to-idle (TTI).
  - Size-based Eviction: automatically removes items when the cache exceeds its size limit, using the TinyLFU policy.
  - Concurrency: optimized for high-performance, concurrent access in multi-threaded environments.

Brian #4: uv: An In-Depth Guide
- On the SaaS Pegasus blog, so presumably by Cory Zue.
- Good intro to uv, plus a nice list of everyday commands:
  - Install Python: uv python install 3.12 (I don't really use this anymore, as uv venv .venv --python 3.12 or uv sync install it if necessary)
  - Create a virtual env: uv venv .venv --python 3.12
  - Install stuff: uv pip install django
  - Add project dependencies
  - Build pinned dependencies
- Also discussion about adopting the new workflow.

Extras

Brian:
- PydanticAI - not sure why I didn't see that coming
- In the "good to know" and "commentary on society" area: Anti-Toxicity Features on Bluesky, and The WIRED Guide to Protecting Yourself From Government Surveillance

Michael:
- Go sponsor a bunch of projects on GitHub
- Registration is open for PyCon

Joke: Infinite loop
Transcript
Hello and welcome to Python Bytes where we deliver Python news and headlines directly
to your earbuds. This is episode 413, recorded December 9, 2024. And I'm Brian Okken.
And I'm Michael Kennedy.
This episode is sponsored by us. So check out the links in our show notes, but also
check out Talk Python Training and PythonTest.com. There are courses over there. And of course,
thank you to our Patreon supporters. We have links if you want to get a hold of us; you can reach us on Bluesky or Mastodon.
The links are in the show notes. And if you're listening to the show,
thank you. And please share it with a friend. Also, if you'd like to participate in the
discussion while we're recording, you can head on over to pythonbytes.fm slash live
and see when we're recording next.
But usually it's Monday at 10 a.m. Pacific time.
Sometimes it shifts though.
And during the holiday season,
who knows what we might do.
But so far we're sticking with that.
And if you'd like to get the links in your email inbox,
go ahead and sign up for the newsletter
at pythonbytes.fm.
And we will send you all of the links in the show notes right in your inbox.
So, Michael, let's kick it off.
Let's kick it off.
I want to talk about a little bit of jitter.
Maybe I've had too much coffee this morning or something.
I don't know. What do you think?
Maybe.
jiter is a thing from the folks at Pydantic. And the idea here is they need really fast JSON parsing
as the foundation of Pydantic, right? Basically Pydantic is about how do I exchange, validate
and transform JSON data with Python classes, right? Into Python classes and types. Yeah.
So you want that to be fast. The folks over at Pydantic created this thing called jiter, J-I-T-E-R,
and it is a fast, iterable JSON parser.
Now, if the Pydantic usage does not catch your attention,
OpenAI is also using jiter, which is pretty interesting.
Ask ChatGPT about it.
So the reason that they were interested in it is they want to be able to work with, I believe,
Pydantic as well, but they want to work with responses coming out of LLMs. And anyone who's
used LLMs until maybe very recently knows that they kind of spit out the answers in a little
progressive way, right? And so with this you can parse parts of the data as it comes down, which is pretty cool.
So there's some examples of partial in here you can go look for somewhere,
I think maybe on the docs website or something like that.
But you can give it a partially formed string and
it'll come up with perfectly good answers for it.
So that's pretty neat.
And that's one of its features.
The other is that it's faster than what I think the default
Rust JSON parser is, even for non-iterable, just straight parse it, which is, that's pretty
impressive. Okay. And then there's also, this is why we are talking about it, there's PythonParse,
which parses JSON strings into a Python object. So you can go and run that as well, which is pretty cool. There's a jiter example. Yeah. Anyway.
So you can go and parse it into different pieces. Basically, if you need a really fast JSON
parser with Python, you can use PythonParse and it'll parse into a structure, right? Yeah. So
awesome. I thought people might be interested in both an iterating JSON parser and also a really fast one.
Plus, it's being built by the folks at Pydantic, Sam Colvin and team.
And yeah, excellent. Nice work.
Oh, yeah. I think I've got several uses for this. This is cool.
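Here's a minimal sketch of that partial parsing, assuming the jiter-python API described in its README (a jiter.from_json function that takes bytes plus a partial_mode flag); treat the exact names as assumptions:

```python
# Hedged sketch: assumes jiter-python exposes from_json(bytes, partial_mode=...)
# roughly as its README describes.
import jiter

# A complete document parses straight into Python objects.
doc = jiter.from_json(b'{"model": "gpt-4o", "choices": ["a", "b"]}')
print(doc)  # {'model': 'gpt-4o', 'choices': ['a', 'b']}

# A truncated chunk, e.g. part of a streaming LLM response.
chunk = b'{"model": "gpt-4o", "choices": ["a", "b'
partial = jiter.from_json(chunk, partial_mode=True)
print(partial)  # incomplete trailing values are dropped instead of raising
```

That's what lets a streaming client re-parse the growing buffer on every chunk without waiting for the closing brace.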
Yeah, cool. I recently had Samuel Colvin on Talk Python with David Seddon to talk about
building Rust extensions, or integrating Rust with Python and things like that, and he
talked about this as one of the things they're building, which was like, oh, okay, this is pretty
interesting. Yeah, definitely. Well, I'm going to talk about Python pre-builds a little bit.
This is big news, Brian. I'm glad you're covering it.
So Python build standalone is a project
that we've talked about on the show,
but mostly we talked about it in association with UV.
Because if you use uv sync, or uv python install,
or a uv virtual environment, uv venv,
and then install and use Python there,
if it can't find it on your system, the Python on your system,
it's going to pull it from Python build standalone,
which is a separate project, not part of UV.
So we've discussed that.
But the big news right now is that Python build standalone
is now part of Astral or under the Astral umbrella, which is huge.
So yeah, we're going to link to an article from Charlie Marsh, head of Astral,
called A New Home for Python Build Standalone.
There's also a post that says Astral will be taking stewardship of this
project from Gregory Szorc.
Cool last name. Anyway, it's the foundational
project for building and installing portable Python distributions, and there's a link to Gregory's
announcement also, and the discussion around that. Python build standalone powers uv,
powers Rye, also pipx and Hatch and more, and it's got like 70 million downloads so far. Oh, wow.
Pretty big project, and definitely instrumental going forward with Python, or with Python
packaging and using Python. So Astral is really trying to make uv, along with this python-build-standalone project, the new way to install Python. And
for me it is. I'm using it every day now. So 100%. Same for me. So it's a pretty short article
talking about this, but it is kind of interesting. It talks about what the project
is at first, and talks about the future of standalone Python distributions.
Also, what they have in mind for the project:
it looks like they want to keep the project up to date
with Python releases, of course.
And then upstream changes
to the CPython build system possibly.
And then, third, is to remove some of the project's existing limitations.
Well, the existing ones.
It ships some musl-based Python builds.
They're incompatible with Python extension modules.
I don't know what that means, but...
I don't know what musl is, so I'm going to move on from that.
Okay.
And then improve the project's Python build and release process.
Just, you know, good stewardship for this project,
and I'm really happy about that.
Along with this, I was interested to read a thread from Charlie Marsh that said python-build-standalone downloads exploded in popularity, with over 70 million downloads all time.
I'm going to put the link to this thread on Bluesky into the show notes also, because
it's an interesting discussion. And I
learned something through here that I didn't understand,
didn't know before. It said that
the download from python.org
actually downloads an installer that builds Python from source on your
machine.
For Linux. For Linux? Okay, it says for Linux. So for Linux, yeah, because the macOS and the Windows ones
install way too fast. Building Python from source is like a 10-minute deal
if it runs the tests and stuff. So, okay, yeah, because I didn't think I was doing that. Anyway, you didn't get the
"vcvarsall.bat couldn't be found" error.
I didn't see that for a while.
So,
yeah,
I guess it's a bigger deal for people that are not running Windows or Mac,
but that's really like all the servers and stuff.
So,
yeah.
Well,
I think the other thing that's really non-obvious here is like,
what is this build standalone anyway? Why don't we just
download the installer and just run it or just take the result of the installer and clunk it out
into your machine or something? So my understanding is the non-standalone one depends on other
foundational things in the system, especially in Linux, but also in other places. If you want to
be able to just copy it over, you can't do that. And so one of the things that they're talking about,
one of the four points of the direction
that they're trying to go that Charlie laid out
was trying to upstream some of these changes
back into CPython itself.
I think it might be number one of the features.
Yeah, upstream, no, number two.
Upstream the changes to the CPython build system
because they have to patch Python
in order to make this actually build,
which is why it's a pain in the butt to maintain.
And then how many combinatorial variations of that
do you get for different platforms and stuff, right?
Yeah.
And so trying to say,
look, we've done these things to make it build
more simply with fewer dependencies.
Let's maybe make that part of Python.
I don't know about you,
but I have not seen a single problem with UV Python,
you know, Python build standalone Python
compared to system Python.
It's not like, oh, well, the certificates don't validate
or this thing doesn't work or it doesn't have SSL
or some weird thing like a dependency might be missing.
It seems fine to me.
And actually I'd be more,
if I'm like running a server or
something, I'd be more worried about installing it separately and building it on each of the
machines I'm installing it on than I would having, you know, one install that goes everywhere.
Yeah. Anyway, I can tell you that pythonbytes.fm is powered by Python 3.13.1
derived from, or gotten from, this method here. Yeah. Anyway, it's big news that actually
probably doesn't mean much to individual users, other than, I think, that we had a
little bit of a concern about whether or not, you know, this one project was sitting heavily on one
person, one developer, to maintain, and I'm glad that Astral is helping out with this now too.
Yeah, I agree. And if you read Greg's announcement, Transferring Python Build Standalone
Stewardship to Astral, he talks about how the Astral folks actually, for a while, have been
core contributors to the project, and they've
been working from the outside to help keep this thing going, because they realize how important
it is to this feature, right? Yeah. And I know also I read, I don't know if it was in this or
somewhere else, but I essentially read that Astral was really working on
it for several months anyway. Yeah, this is mostly an official announcement is all.
But yeah.
One final parting thought, Brian,
is right there where you are.
It says, this is in Greg's announcement,
as I wrote in my shifting open source priorities in March,
this is an interesting challenge that people can run into
with projects that are run by one person, right?
Yeah.
The guy had a kid, wanted to spend more time with the kid,
was feeling worn out by the projects
and decided, well,
and also talks about how he really just cares
way more about Rust
than he does about Python these days,
which is fine.
You're not married for life to a technology.
Go where your heart takes you.
But that's a challenge for projects
that are run by one person. I think it's worth reading this thing as well, just for people to
get a sense of, you know, when open source projects take off, but it's not necessarily a good fit.
Yeah.
Yeah. But thanks to Gregory for creating this and keeping it going. He's also known for the
PyOxidizer project, which came close, didn't quite get us to a single binary of our Python
apps. Interesting. Okay. Yeah, it's really cool that he made sure that this was in good hands
before shifting it over. Yeah, absolutely. Absolutely. All right. All right. On to the next.
On to the next thing. So I talked about, there's a theme here, I talked about the jitters from
having too much coffee. Well, let's talk about Moka.
Maybe if we put some hot chocolate and some sugar in with it, it'll be better.
No, probably not.
This project is by deliro, and it's called moka-py.
So Moka, let's work our way inside out.
Moka is a high-performance concurrent caching library for Rust.
Not a concurrent caching server like Redis.
Think SQLite, but for caching, right?
SQLite's written in C, not Rust, but it's an in-process sort of deal,
which is pretty neat.
And this itself is inspired by Caffeine for Java, right?
This is kind of like turtles all the way down, like ports all the way down.
So it provides a caching implementation on top of dictionaries.
It supports full concurrency of retrievals and a high expected concurrency for updates.
All right.
So thread safe, highly concurrent in-memory cache implementation sync and async can be
bounded by the maximum number of entries, the total weighted size, size aware eviction,
like kicking large things out versus
small things. You can have the cache controlled by least frequently used, by least recently used,
like I want to kick out things that are over two minutes, but if you've got room based on something,
that's fine. You can give them a time to live, a time to idle, right? Idle is a really cool
interesting one, like when was this last accessed?
So you've got something that's old but is used all the time in your app,
and then something that's somewhat new but it kind of hasn't got used that much.
It would be better to kick out that new one rather than the old one, right?
Mm-hmm, yeah.
Okay, so that's all just straight Moka.
moka-py is a Python binding for this.
Here we go again. Rust library
for Python. They're probably getting VC money from this. I'm telling you. Okay. No, just joking.
Sort of. So for the moka-py thing, it has a synchronous cache, which supports basically
thread-safe memory. It just wraps the thing. So time to live, time to idle, size, concurrency,
all these things that you can imagine. And so
there's a couple interesting ways. You can just say cache.set some value, or you can say cache.get
some value. That's one way to use it. Another one is you can use it as, this is actually pretty
amazing. You can use it as an LRU cache function decorator alternative. Oh, wow. Right? So one of
the things you can do that's really easy
to speed up Python code, without writing much code you have to maintain, is you just put a
functools.lru_cache decorator onto it, and it'll look at the hash value of all the inbound parameters
and say, if you pass me the same parameters, you're getting the same output, right? Yeah. It just does
that straight in Python memory, but this would be backed by this
high-performance, concurrent Rust internal library. It's still in-process, right? Yeah.
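The standard-library pattern being described there is functools.lru_cache, which keys the cache on the hash of the call arguments, entirely in-process:

```python
from functools import lru_cache

@lru_cache(maxsize=256)
def fib(n: int) -> int:
    # Same arguments -> same hash -> the cached result is returned
    # instead of re-running the function body.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

fib(30)  # computed once
fib(30)  # served straight from the in-memory cache
```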
Go ahead. Sorry. What? With the time to live and the time to idle, yeah, especially, that's
cool. Yeah, this is pretty cool. And there's so much talk about the thing,
Moka itself, the Rust version, supporting asynchronous behavior,
right? I'm like, okay, if it has all these asynchronous capabilities, what's the story with
Python and its async and await, right? Yeah. So I filed an issue, which I don't really like to do,
but that's how you ask questions apparently, and then you close it. So I said, hey, cool project,
since it says thread safe highly concurrent in-memory implementation,
what's the Python async story?
And so they responded, this will work if you put the decorator on there.
So remember how I was complaining that it's sort of weird
that functools and itertools don't support async?
This functools-like thing supports async and sync functions as well, right?
So they just have an implementation in the center that says,
is it a coroutine?
Do this.
Else, do that.
So you can use the caching decorator, like we talked about,
like the LRU cache thing already on async functions and sync functions.
So that's fine.
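Here's a hedged sketch of that decorator usage, assuming moka_py exposes a cached decorator with maxsize, ttl, and tti keyword arguments roughly as its README suggests (the exact names are assumptions):

```python
# Assumption: moka_py provides a `cached` decorator that works on both
# plain functions and coroutines, with maxsize / ttl / tti in seconds.
import asyncio
from moka_py import cached


@cached(maxsize=10_000, ttl=300, tti=60)
def lookup(user_id: int) -> dict:
    # Stand-in for an expensive call (DB query, HTTP request, ...).
    return {"user_id": user_id}


@cached(maxsize=10_000, ttl=300)
async def lookup_async(user_id: int) -> dict:
    # Per the issue discussion above, the decorator also handles coroutines.
    return {"user_id": user_id}


lookup(1)
lookup(1)  # second call is served from the cache
asyncio.run(lookup_async(1))
```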
And then I said, well, what about cache get and set?
And deliro says it probably doesn't make sense to do it.
It takes 230 nanoseconds.
So you can do 4.4 million calls a second.
And set is 1.3 million sets per second for a cache size of 10,000 that's fully occupied
on simply an M1 Mac.
So you know what?
Probably not.
But there might be some ways to expand this in the future.
I don't know.
But yeah, I would say probably not.
Probably not needed,
because you're going to probably add more overhead
just to juggle the async stuff, right?
Yeah.
And also, if the supported method is through the decorator
and whatever you need,
you could just put your code in a function to do it.
Yeah.
I mean, if that were Redis, you would absolutely want an async version because you're talking
to another server.
Yeah.
And there's that latency in the network and all.
But yeah, if you can do 4 million a second, then probably, I doubt you can do 4 million
awaits a second; it's much lower.
So the cache get and set really are just that. The benefit of those is probably just for when we want a really fast caching system or something.
Yeah, yeah, exactly.
And you, there's plenty of times where you say in this situation, I want to get this out of the cache and then keep it for a while.
Like if I had a user who logged in and I want to just hold their user account with all their details and I've used their ID as the key and their actual user object as the
object that goes in, that's fine. But you wouldn't use that as a cache decorator, because
typically you might see that coming out of a database or something like that, and then if you
pass the same user in, it's similar, but it's a different database object instead, right?
You can run into really weird situations where they're equivalent but they're not equivalent,
you know, and then you end up not using the cache.
So anyway,
I think that might be where you would do it.
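And a hedged sketch of that explicit get/set pattern, assuming a Moka class that takes a capacity plus optional ttl/tti in seconds (names taken loosely from the README), with a hypothetical load_user_from_db helper standing in for the database:

```python
# Assumption: moka_py.Moka(capacity, ttl=..., tti=...) with get/set methods.
from moka_py import Moka

users = Moka(2_000, ttl=600, tti=120)  # capacity, time-to-live, time-to-idle


def load_user_from_db(user_id: int) -> dict:
    # Hypothetical stand-in for a real database query.
    return {"id": user_id, "name": f"user-{user_id}"}


def get_user(user_id: int) -> dict:
    user = users.get(user_id)
    if user is None:
        user = load_user_from_db(user_id)
        users.set(user_id, user)
    return user


get_user(42)  # hits the "database"
get_user(42)  # served from the cache until TTL/TTI evicts it
```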
But anyway,
I think this is pretty cool.
People can check it out.
Yeah.
And it's,
it is not,
I don't believe it is like super popular here and,
you know,
a hundred stars,
to kind of shine a light on it.
But if you go over to the Moka thing,
you know,
it's got 1,700 stars, and this is kind of just a Python UI on top,
or API on top of it. Yeah, but it's
pretty recent. I mean, it's a few weeks
old, looks like. It's just a baby.
Just a baby. It's okay to have
100 stars. It's pretty good.
Yeah, it's pretty good, actually.
It looks cool. So, now you know.
Alright, I want to shift back to
UV. I'm kind of in a UV mood.
I'm missing the sun, apparently. There's an article from the SaaS Pegasus blog. When I learned about uv sync and started using that, and all the different ways to use uv, I realized it's a pretty powerful
tool. It's not really one thing; it's designed to be a lot. So I appreciate, you know,
articles like this, but I also really like this one. It starts out
with a funny meme of a whole bunch
of different commands to install Python and update it
and install, create a virtual environment
and sync your requirements.
And all of that is just done with UV sync now.
You can do it all in one, which is pretty sweet.
So-
I don't use UV sync.
I use uv venv --python 3.13 or something.
But, you know, same.
Yeah, I'm using both,
depending on whether or not I have a project set up already.
So it talks about what is UV, why use it,
and we're just going to assume that you already know
if you listen to this podcast, because it's really fast.
But there's a lot of discussion of different
workflows. It talks about adopting uv into your existing workflows.
But I'm going to pop down to the end, adopting uv into your workflow. There's this cool cheat
sheet. It's pretty much what the entire article talks about, the different parts. You can use uv
python install to install Python. You can use uv venv to create virtual
environments; it's really fast. And then install packages with uv pip install. But then also,
you can build your pinned dependencies: like we would have used pip-compile, you can use uv pip compile.
It's all in one place, all these different commands.
And these really are the commands.
The commands listed in this article are really the way I use UV as well.
So that's why I appreciated it.
And then a discussion about how to adopt this into your workflow and what that means. I mean, some of this, a lot of people might not have used
lock files before, but using lock files with uv is so easy that, you know, why not?
And pinning your dependencies, just some good workflow. It's good Python project practices
anyway. So why not? Yeah. Yeah, that's great. And there's even a few more that you could throw in for the tool equivalents table there. You know, for
installing CLI tools you could say pipx, yeah, and just create a virtual environment, install
the things, and put that on the path and all those sorts of things, versus uv tool install or
uv run, right?
Those kinds of things as well. So, yeah.
Yeah. It's missing that, which I'll feed back to Cory.
So one of the reasons why this
came up on my radar is I'm working on a project that uses
SaaS Pegasus. So I'm in touch with Cory a lot.
Yeah. But like the uv tool thing,
instead, I'm not using pipx anymore.
The UV tool install is like super cool.
Yeah, it's super cool.
It is.
I've also started using Docker for certain things as well.
So it's kind of similar.
But like, for example, Glances,
which is a Python-based server UI visualization tool,
you can just say Docker run Glances
versus installing Glances,
and you just leave the machine a little more untouched.
Yeah, one of the interesting things about this article
was the point of view.
Because at the start,
Cory talks about how he's not usually somebody to jump on multi-tool fads
like pipenv, or pyenv for installing Python, or for doing virtual environments better, and big project-wide
tools. And I like Hatch, but I'm not really a using-Hatch-for-my-entire-workflow sort of
person. I was using it just as a packager.
So I'm in the same boat of like,
I didn't really need an all-in-one tool,
but this one changed my mind
and I really like this all-in-one tool.
So yeah, I'm still not bought
into the project management side,
but I love using UV for the stuff.
Yeah, yeah.
Anyway, what do we got next?
We have a quick bit of follow-up here
that I just did some searching.
So over on pipx, so one of the things that you say,
like you could use pipx, or there
is an open issue on pipx that says integrate UV in some way,
right?
Because pipx is really just a wrapper
around create virtual environment,
pip install package, pip install dash u package, right?
And so if they just change the internals to say uv pip install,
then pipx would all of a sudden become super awesome.
This recommendation is unfortunately over half a year old,
but it does have 21 upvotes.
So you know what?
Yeah, who knows?
That's there.
Yeah, okay.
Yeah, okay.
But that's not what I want to cover next.
Come on, computer, respawn.
There we go.
I think that's it for our items, right? We're on to extras.
Yeah. Let's have extras now.
Yeah. Let's extra it up.
Extra.
So registered for PyCon, I did.
Oh, cool.
Yeah. Registration came out two days ago. I don't know. Whenever I posted some message on Bluesky and Mastodon saying, I registered. How about you? Whenever that was, that's when
the announcement came out. So I think a day and a half ago or something like that. So there's
early bird pricing and all details on there. If you want to go and check it out, it's normally
450 bucks for individuals, but you could save $50 if you register before January, which is pretty
cool. There's a bunch of stuff. It has all the detailed timeline, which is always interesting.
You know, like if I want to make sure I attend the PyLadies auction, when do I need to do that?
When is the main thing?
When is the job fair, et cetera?
So most importantly, the main conference is May 16 to May 18, 2025.
So there it is.
And, congruent with current times, the mask policy.
Hooray.
Optional and encouraged, but not required.
Yeah.
How about that?
Yeah.
Cool. Okay. I've got a few more real quick ones here. I recommend, you know what? It's something I came across just thinking like, why don't I support more open source projects? Looking at my
dependencies and stuff that I'm working on. Like how much, you know, if everybody who used Flask
put $1 towards it per month, everybody who used it in an important way, where it's
not just like, oh, I did a tutorial with Flask, but like, no, I have a project that is important to me
and I use Flask. If everyone put one dollar towards it, it would transform that project. If everyone who
used Gunicorn put one dollar towards it, that would transform it, right? So I decided, you know,
I'm going to just go to some projects and find the ones that I use most. And yeah, I found four that had sponsorships available.
I was going to support UV and Pydantic as well, but they, for some reason, they do like corporate sponsorships or I tried to do individuals and it didn't work.
And then some other ones like Beanie don't currently have sponsorships, but, you know, are really important for the database layer and stuff.
But just think about, you know, putting a couple of dollars towards some of these projects. It'll make zero difference to you if you have a
job as a software developer, and in the aggregate it'll make a big difference to the health of the
ecosystem. Yeah, it's interesting to think about it like that. Just, you know, a couple fewer coffees
a month and, yeah, you could probably cover like three or four projects. Yeah. Yeah.
Anyway,
I want to encourage people to do that,
but you know,
if you can't, obviously don't,
but I don't think it's a big deal.
Uh,
come here.
Computer.
Very slow for some reason.
Don't know why.
There we go.
All right.
Uh, this is the joke.
So, actually, I'm skipping the joke for a second.
We'll come back to it.
There's two things that I wasn't planning on covering,
but I'll throw out here really quick.
Uh,
yeah,
here's my register for PyCon.
Also, I wrote a quick post. People said,
oh my God, Hetzner, we moved to Hetzner,
and they changed this huge thing
where they changed their bandwidth and their price,
and it's like a nothing sort of deal,
like $5 a month more.
Anyway, I wrote that up so people can check that out
on Mastodon.
And then, yeah, that's it for all my items.
Okay.
And then I've just got the joke when you're ready for that.
So let's do yours.
I don't have much commentary on these.
I just have a few extra things I wanted to point out.
Pydantic AI was announced, which Pydantic AI is a Python agent framework designed to make it less painful to build production-grade applications with generative AI.
I don't have really any commentary about this other than I didn't see this coming, but interesting. Yeah. I've seen
messages or tweets or whatever from people who do machine learning stuff saying, you just need
Pydantic. I mean, a lot of this is like, I got a JSON thing here and then I'm going to call some
other thing with other JSON and just suggesting, hey, you could probably use Pydantic to make these connections.
I bet the Pydantic team noticed that as well.
Okay.
A couple of commentaries on maybe society.
And anyway, I'll leave it,
leave the couple of other articles I thought were interesting.
Bluesky announced, I guess this is old, this is from August,
but Anti-Toxicity Features on Bluesky.
And I just actually appreciate
some of these. I already had a troll come by. And so there are some things
where you can, if people, you can detach a quoted post. If somebody quotes you and you
don't want them to, you can detach yourself from that. And hiding replies: I had a troll, and you can't
delete replies, but I had somebody post a just idiotic reply to something I said, and it was
obviously just a bot or a troll, so you can hide that. And as Bluesky grows,
we'll get trolls also. If they're not affecting you yet, they may
in the future.
So I do appreciate that there are features around to protect yourself. So there's
that. And then, I don't know what to make of this really, but WIRED, a fairly mainstream
magazine I think, released The WIRED Guide to Protecting Yourself from Government Surveillance. Wow. This is a head shaker of, like,
I guess we need this. I wish we didn't, but wow. Yeah, there's that. So, yeah, you could probably say
that about some state governments as well, right? Every state's different, but yeah. Yeah, depending
on your gender and things, you know, it's touch and go some places. Yeah.
Anyway.
So that's a little bit of a downer.
So maybe we need something funny.
We do.
I don't want to spend all the time going down that deep rabbit hole.
Instead, let's go infinitely down the rabbit hole.
Yes.
So check this out, Brian.
Somebody who goes by Bits, very personal, on Bluesky posted what the comments seem to indicate is probably a textbook,
a printed, by the way, a printed textbook on LaTeX. Okay. In the index at the back, on
page 252, there's an entry for infinite loop, and it says see page 252. I love it so much. It's so simple. I love it. Yeah, it's a
really good, just like a little Easter egg in there, isn't it? Yeah. I haven't seen it for infinite
loop; I saw that somebody did that for recursion in some... Yeah. If you look in the comments,
it says that Kernighan and Ritchie have the same, I guess that's probably C or something,
the same under the index for recursion.
And it's pretty good.
People love it.
Yeah.
That's funny.
And there's somebody that says,
for those who can't be bothered,
Google search for recursion.
Did you mean recursion?
Yeah.
Apparently.
I kind of feel bad for people
that actually really need to know what that means.
Good luck.
Yeah, good luck with that, huh?
So, yeah.
All good.
All good here.
We know what recursion and infinite loops are, but we're going to break the loop and get out of here, right?
Yeah.
Yeah, let's break the loop and say goodbye until next time.
So, thanks a lot.
Bye.
Bye, all.