Python Bytes - #334 Packaging Organizations
Episode Date: May 5, 2023. Topics covered in this episode: rye - Python workflow tool, PyPI Organizations, 5 tips to learn any new Python library faster, Python gets down to (the) Metal, Extras, Joke. See the full show notes for this episode on the website at pythonbytes.fm/334
Transcript
Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds.
This is episode 334, recorded May 3rd, 2023.
I'm Michael Kennedy.
And I'm Brian Okken.
This episode is brought to you this time by us.
Support our work through our courses over at TalkPythonTraining.
Check out Brian's pytest course.
Check out a lot of the other courses I did there.
Check out the Test & Code podcast.
And we have Patreon supporters as well.
Link on the episode pages on the website.
Connect with us over on Fosstodon: @mkennedy, @brianokken, and @pythonbytes over there.
And if you're not part of the YouTube live stream and you want to be,
we generally do this on Tuesdays at 11 Pacific time in the morning.
But today we had to move it. So plus one, plus 23 hours. Yep. All right. Yeah. Brian, sometimes life happens.
Life does happen. And sometimes PyCons happen.
Yeah. Yeah. That was a lot of fun to see you there.
Yeah, it was a blast. I had a great time. How about you? You got a lot of work done?
I got a lot of recording done. I'm looking forward to releasing some of those episodes.
Met a bunch of great people.
Saw some old friends.
Got to hang out with you.
You brought the Python staff of power and battled the snake,
which was the episode album art from last time, which was a lot of fun.
So, yeah, it's starting to come back, isn't it?
I mean, not quite as many people or as many vendors,
but it's good to see it coming back to life.
Yeah, I really enjoyed it.
Should we kick in?
Let's do it.
Yeah, jump in.
What do you got?
Well, I was actually talking with one of our friends from PyCon whose name I'm blanking on right now.
So somebody from PyCon, thank you, and apologies for forgetting your name right now.
But I came across this — actually, several people mentioned Rye — because we had talked about huak.
H-U-A-K. And I think I got corrected on the intended pronunciation. Anyway, I'm on board with that.
Let's do it.
So that's what huak is: a Rust-based Python project workflow tool. Now we have another one. This one's called Rye, and I don't know which came first, but this one doesn't look that old. It's from Armin Ronacher.
Ronacher?
Okay, I've got to practice this stuff.
The dude who started Flask.
So I was curious about it because he usually kind of knows what he's doing.
And this is a really pretty interesting project.
So I tried it out this morning.
Rye, it says it's Armin's personal one-stop shop for all of his Python needs.
And there's a video, which I didn't watch, but it's a nine-minute intro video.
So what am I excited about with this?
So it's more than just a pyproject.toml workflow tool.
It does that.
So you can do things like initializing a project, adding a dependency, removing a dependency — that's kind of a neat trick — and then building a wheel and even adding lock files, which is kind of nice.
The other thing that I thought was really kind of cool is it also manages Python. I'm not going to find all the details here, but it manages Python itself. What are people using, pyenv or something, to install Python? This installs Python also.
So you can say rye fetch and then give it, like, 3.9, and it'll download and install Python 3.9 for you, which, like, wow, that's kind of cool.
And it does it in an interesting way.
So it uses these indygreg python-build-standalone releases,
which I was not familiar with.
But so there's these different standalone distributions,
builds of Python.
So neat.
I forgot to add, this is not a Windows thing yet. It's just Linux and macOS. So I tried it on Mac, and it's installing a bunch of stuff. But where does it install it? It doesn't install it in the normal place. It installs it in your home directory under .rye. That was surprising to me at first, but it's kind of neat that everything's there.
So, okay, so it does pyenv sort of stuff.
It can install Pythons for you, which is kind of cool.
With the command line, you can install a Python.
And it's pretty fast.
It also handles virtual environments.
So you can do things like run within your virtual environment.
You can run a command without actually going into it. You can say, like, rye run black, and it'll run black on your project.
You can say sync,
and it'll take all of the dependencies
in your pyproject.toml
and install them in your virtual environment.
That's kind of neat.
And also, it's kind of like pipx.
So you can do things like install a global tool,
like pipx install cards, or really whatever you want — black, for instance — if you're going to have any global tools on your machine.
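Roughly, the workflow Brian is describing looks like this. These command names match the Rye docs around the time of the episode, but treat the exact spellings and flags as an assumption and check the current docs:

```bash
rye fetch 3.9            # download and install a Python (indygreg standalone build)

mkdir demo && cd demo
rye init                 # start a project with a pyproject.toml
rye add rich             # add a dependency
rye remove rich          # ...and remove one again
rye sync                 # create the virtualenv and install everything from pyproject.toml
rye run black .          # run a tool inside the project's virtualenv
rye build                # build a wheel

rye install cards        # pipx-style global tool install
```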
It confused me at first because I tried it and I couldn't get the thing to run.
But it isn't invasive.
So it's not modifying your .zshrc or your RC files itself.
You have to do that.
So it's all going into a directory in your home directory, ~/.rye.
And then I think there's a bin directory in there or something like that,
that you just add to your path.
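For reference, the non-invasive setup he's describing looked roughly like this per the Rye README at the time — treat the exact path and file name as an assumption:

```bash
# You add this to your ~/.zshrc or ~/.bashrc yourself; Rye doesn't touch your RC files.
source "$HOME/.rye/env"   # puts Rye's shims/bin directory under ~/.rye on your PATH
```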
So it's doing that. The kind of neat thing about all that isolation in the .rye directory is that if you decide you don't like it, you can just delete that directory and all that stuff's gone. So, kind of cool.
Yeah, I do like that — you just blow away the directory and it's gone, or it's reset, you know? That's pretty cool.
It's also interesting to see that this is not a package
or tool within Python, but outside of Python, right?
I mean, it says it uses virtual environment
and it actually uses virtualenv,
which is a third-party package
that's a little faster than venv. But
it also uses pip, and it doesn't expose any of those. So it is kind of interesting that
it's outside. I kind of like that it's outside, because you can do things like pipx
and pyenv do. And for me it just worked better. I've still had trouble —
pyenv is a cool idea, but it mucks with everything in your environment,
and I don't like that part of it.
Yeah, exactly. Yeah, I don't like that either.
So...
I think the trend... I've been talking...
I did the Python packaging discussion
on Talk Python a while ago,
and there's been a lot of talk about this.
I talked to Ofek about Hatch,
and it's looking like there's a little more
interest in saying, like, okay, what if we had a tool that not just manages
environments and packages, but actually manages Python — like, for example, this one:
install the version of Python you want, and do a bunch of other things along those lines. So,
yeah, it's interesting. Another just a side note:
Simon Willison wrote up some notes when he tried Rye, so we'll link that article as well.
Okay, excellent. All right, what is next?
PyPI is next. Okay, so remarkably, PyPI has 452,000 packages.
That is insane.
But what is more insane is the traffic, right?
So one of the themes from PyCon this year
was about sustainability
and kind of building for the future.
Carol Willing gave a great keynote talking about,
you know, let's make sure that we're investing
in the front end story for Python, investing in the mobile and deployment story for Python.
And, you know, PyPI and the Python Packaging Authority, it's kind of there as well, right?
Like as we grow in traffic, we need things like somebody to watch over the security and some, you know, somebody to make sure all this stuff is developed and polished.
And people have been doing that, but to a large degree in a volunteer way, right?
But how does Python pay somebody to work on this?
Well, there's been some grants, I think.
I don't want to say the wrong company, but some of the big tech companies are hiring people to do specifically security work,
like supply chain type security for PyPI, which is great, right? But those are year-by-year things. So what
could be done? Well, right now, if I go over here on PyPI and I search for, like, botocore —
this is the way to talk to AWS, basically, from Python, to a large degree — who's it run by? Oh, it's Garnaat, amongst others.
Wait a minute. This is like the official AWS thing. Shouldn't this somehow be kind of more
official than that? And how do I find the other AWS projects? I can like email the person or
something. I don't know. I guess I could go over here and see the projects if I go to their user account, but it's kind of just, you know, AWS sort of hacked the system by
creating an account named AWS, but it's not really official. You don't get like a branded
sort of story, right? You just get a, you know, user profile. So GitHub has something like
this. Like if you went to github.com slash talk Python, this is the official organization.
We have 47 repositories.
We have six people involved and you can sort of see, right?
You can put this together officially as not Michael,
but like this group, right?
So the big announcement is
introducing PyPI Organizations, by Ee Durbin.
So today we are rolling out, today being a few days ago,
today we are rolling out the first step in our plan
to build financial support and long-term sustainability
for the Python Package Index,
while simultaneously giving our users
one of the most requested features: organization accounts.
So these are self-managed teams
with their own exclusive branded web
addresses. And the goal is to make PyPI easier for large community projects, organizations,
and companies to manage multiple sub-teams and multiple packages, right? So much like the GitHub
org that I was talking about, you can say, okay, this person is an admin of it. And this person
can have write access to that thing, but not this thing, right? So kind of those types of things, instead of,
hey, everyone in the company, here's the one and only username and password for PyPI — go.
Interesting. Yeah. Yeah. And like I opened this conversation with, the idea is to
increase sustainability. This last year, PyPI served
235 billion downloads for those packages and saw 57% year-over-year growth on download counts and
bandwidth alike. That's great, but it also means costs and effort and infrastructure and all that
is just going up. Also, these organizations are not required. AWS can
still manage things however they do now if they want. So these are opt-in, and it does cost —
it says a small fee. I would love to know whether small fee equals a hundred dollars a month or a dollar a
month. I don't know. Maybe it says somewhere; maybe I've got to go look. Yes, exactly.
If you got to ask, no, just kidding. It's a small fee.
If you've got to ask, it's not for you.
Anyway, PyPI organizations coming now.
Check it out.
Okay.
So is it just — we don't know this yet, but is it just for things like companies, or could it be for pytest or something like that?
Sure, it could be, like, for pytest.
I mean, it says for large community projects, for example. Yeah, like you talked about Flask — well, you talked about Armin, who was the original creator of Flask — but
that's now under the Pallets project, which runs a bunch of different projects and has a bunch of
contributors. Like, I think a Pallets organization would be potentially reasonable, you know? Yeah.
Yeah.
It's going to be tough to come up with like pricing for something like this,
because like we said, some of it is a bunch of volunteer organizations and some of it is, yeah, companies.
So yeah.
Interesting.
Yeah.
I don't know.
It probably says somewhere, but I didn't see it in this article on the PyPI.
Nice.
Cool.
Well, should we jump into the next topic? Yeah, go. Okay,
well, I like this article by Bob Belderbos. It's
five tips to learn any new Python library faster. And this
is actually near and dear to my heart, because we do this on a weekly
basis, you know, trying something new.
I think I have a condensed version of this, but let's kind of
walk through it. These are pretty good tips for when you're thinking about using something, or you hear about something cool and just want to learn about it. What do you do?
So his first step is:
I quickly read the main docs. So read the manual — RTFM. Especially, I like things like, if they
have a quick start or a getting started guide, I like to read that. If there's a video, go ahead and
watch it, especially if it's a short video. Go ahead and try that. So, okay, so you know what it does, great,
and you kinda know how it works.
And then you install, number two is install it.
I think this is funny.
I mean, obviously you have to install it
in order to play with it, but okay, install it.
Number three is explore the library.
Essentially play with it.
Play around, see what you can do. Bob has a great idea of doing this
within Jupyter notebooks to just kind of explore what it does. That's pretty great.
And then you have to kind of make it more deliberate. So instead of just playing around with the things it does, try to do something with it — try to have a task. This is called deliberate practice. But this is
where I think you're really going to start learning
something is actually trying to get something done. Because
often, there'll be extra features you don't need right
away. With a lot of projects, most people use 20% of the
functionality. So don't learn 100% of the functionality.
Learn the 20% that you need to get something done.
So, deliberate practice.
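As a made-up illustration of that explore-then-practice idea, in a notebook or REPL (rich is just a stand-in library here):

```python
# Poke around a new library before giving yourself a real task.
import rich
print([name for name in dir(rich) if not name.startswith("_")])  # what does it expose?

# Then a deliberate, concrete task: render one small table, rather than "learn everything".
from rich.console import Console
from rich.table import Table

table = Table(title="Things I tried this week")
table.add_column("Tool")
table.add_column("Verdict")
table.add_row("Rye", "pretty neat")
table.add_row("Frogmouth", "fun")
Console().print(table)
```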
And then maybe actually build something or change something.
So number five is build something.
So going into a deeper level is actually incorporate this library in a project of yours.
Or if it's similar to another project
that you're already using and you're just thinking about switching, go ahead and try switching it.
You might not stick with it, but kind of feel how easy it is to change over to this new
project. And you might not publish it, but it might be great.
Building something small is a good idea to just kind of get your hands dirty.
And then the bonus, number six, I always like it when people lie about how many tips there are.
The real bonus is teaching it.
And this is, like, massive.
It just ties a bow on the whole learning process: trying to teach it back to somebody. And this could be a "today I learned" thing on your blog, or an actual longer blog post, or maybe a little tutorial video or something,
or actually just sit down with somebody and try to talk to them about it. One of the things that I
find even just for ourselves, for this podcast, trying to look at it and go, if I was trying to
teach this to somebody, or tell
somebody why they should use it — what are the different points? Like just earlier, when
I was talking about Rye: what's different about it? Well, it's different because it handles the Python installs also, and it also kind of works like pipx — the different
things about it. And then actually try it. So I tried it out. I
set up an extra little project directory, tried it out a few times, saw what it
did, and looked at the pyproject.toml output. So, yeah,
just doing that. So, some great tips. Yeah.
I think, you know, teach it is really cool.
You don't have to be an expert at something to give a presentation on it,
as long as you're genuine about it. If you said, hey, I'm going to teach you
everything you need to know and how awesome this thing is, and really you just learned it last week
at a conference talk, people might call you out — you don't seem to really know as much as you
claimed. But if you say, I'm excited about this
thing, I just learned it, I wanted to share my excitement and just, you know, show you a quick
getting started thing, then people won't go, well, how does he know it? You start out with, well,
I just learned it, but I'm excited, and I think you'll be excited too, you know. So I do
think those presentations are awesome. I mean, there's plenty of places: there's user groups and meetups, regional conferences, online meetups.
Those people are always looking for presenters.
They're like, gosh, it's a week away and we don't have a speaker yet, right?
So reach out.
I mean, there's a whole ton of opportunity to do that or even brown bags at your company.
Yeah, I like that you brought up that be honest about it, that you can even say, I just got excited about this, learned a little bit, and I want to show you what it is.
Yeah, don't make yourself out to be an expert in the field.
But that's great.
This process actually is exactly how I got started blogging about testing: just learning
something new, writing an article about it, and then writing a bunch. So if you really want to deep dive
into a module or a package, you can write a series of them — a "getting started" one as one of
your first posts, and then move on to, like, well, how do I do this other thing with it? How do I
do authentication? We'll do a little article on that.
And then the whole thing could end up being a book.
I mean, you could make a book or a course
or something out of it eventually.
But if you just want to do a quick one,
this is a great way too.
Yeah.
All right.
And speaking of just like,
Bob opened his conversation here with like,
and some of the things I'm learning are PyScript,
Flet, PySimpleGUI, Playwright, HTML, Reddit, Leaflet.
Those are all awesome.
So I could see why you would want to learn those.
And Liz out there says, "I'm excited about this" type of presentations tend to be more
to the point — which, indeed, they do.
All right.
What's next?
We have GPUs next.
Brian, I know you heard that we can do all sorts of amazing things by programming GPUs.
And if we want to process tons of data, maybe we're doing medical research on protein folding,
we're running around a cluster of GPUs and we could solve some kind of huge computational
biological problem and make a big impact.
But if you're going to do that — if you're going to do something like CUDA — you better hope that you
have not just GPUs. That's way too broad. No, no, no. You have NVIDIA GPUs. And NVIDIA GPUs are
sometimes hard to come by. A couple of years ago, they were very hard to come by, unless you're
using the cloud where you can go get them, right? I have an awesome Mac here — I've got my M2 Pro Mac mini now, and that thing has 16 GPU cores on it. Can I do
CUDA or use PyCUDA on it? No, that's not an NVIDIA one. And so, honestly, on Mac especially, it's been
extremely hard to do any sort of GPU
computation, right?
They had the afterburner cards and those like weird external graphics cards on the older
Intel ones, but those are not even an option in the last four years.
So, you know, that's, that's a drawback.
And on Mac, the graphics language — like DirectX on Windows, or OpenGL on a lot of things — is
called Metal.
And so I want to introduce you to a library and an article called Python Gets Down to
the Metal, not like CPU, but GPU.
And it says, are you a Mac and Python person?
Do you have a trillion numbers to multiply together?
You don't have all day to wait for them?
It says, well, Python is quite slow.
Although the person acknowledges I've been a happy Python user for quite a while.
You know, for pure number crunching, Python is not as fast as compiled languages like C and Rust, which is totally true.
Is it plenty fast to drive your web API?
Probably.
It's probably really fast for that.
But if you literally have a trillion numbers, you want to just crunch them in a loop.
The advice is not to do that, right? The advice is to use something like Pandas or NumPy or Dask or something like that, which are really all C-based. Or if you had
an NVIDIA graphics card, maybe you could use some kind of thing like PyCUDA. But again, on this
super powerful computer, it is just, you can't do it. There's no way to get an Nvidia graphics card.
However, if you could somehow program this Metal —
it says Metal does have a way to program it,
it's just there's not a lot of tooling for it.
And it says there's a language to do on-GPU computation
that looks very C++-like.
And it says an M2 — this is not the Pro,
but the regular one — has 8 GPU cores, which means,
let's see, let's give some numbers here. Yeah, it says on the 8 GPU core version you could do about
a trillion floating point operations a second, and that's the base version, right? And so the
bigger models, like mine, get closer up towards 10 trillion, right? So mine's probably like six or seven trillion.
But, you know, the ultra max, I don't know, whatever.
The bigger ones can do even more than that, right?
So that is a ton of power.
And so we introduce metalcompute.
So pip install metalcompute.
And with this, it's a little bit like doing raw SQL database stuff,
in that you define a thing that you want to run on the graphics card that looks a lot
like C++, I think you would imagine. Brian, what do you say?
Yeah, yeah, it's just #include <metal_stdlib>, using namespace metal, write a function, do the operations. There's a lot of
"const device float* buffer". I mean, it's not easy, right? But it's also not that easy on
CUDA and other things as well, generally speaking. So they come up and say, all right, what we're going
to do is kind of come up with, you know, a vector and a matrix, then we're going to multiply
them and do a bunch of work and get the answers out. And boom, off it goes.
Prints out the answer.
Very cool.
It's kind of cool
that you don't have to compile it
or anything, though.
I mean, it is cool — or have it in a separate file.
It's just a string there.
That's neat.
It is.
And if this was my job
and someone said,
Michael, you're writing this program,
I would not do
what they are doing in this example and put triple quotes and put Metal code, which is like C++, in there.
I would write a .metal file, or call it .cpp or whatever is going to give you the best autocomplete and color syntax highlighting.
And then just do Path.read_text() and, in one line, get that out,
but have it in a separate file
so you could sort of more properly reason about it.
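Here's a rough sketch of the pattern Michael is describing — loading the Metal kernel source from its own file instead of a triple-quoted string. The file name is made up for illustration, and the metalcompute call that consumes the string is only hinted at in a comment, since the exact API is shown in the linked article:

```python
from pathlib import Path

# Keep the C++-ish Metal kernel in its own file (e.g. multiply.metal) so your
# editor gives you syntax highlighting and autocomplete, then pull it in as a string.
kernel_source = Path("multiply.metal").read_text()

# The article inlines this same source as a triple-quoted Python string instead.
# Either way, the string is what gets handed to metalcompute to compile and run
# on the GPU, along with your input arrays, as shown in the article's examples.
print(kernel_source[:80])  # quick sanity check that the kernel source loaded
```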
But anyway, it says, look, we run this together
and did a whole bunch of work,
took 70 milliseconds to do 10 multiplications,
not that impressive, because there's a startup cost.
What if we gave it a billion
multiplications, to multiply that vector and that matrix?
Oh, that takes 0.3 seconds — a billion times.
That's a lot.
Yeah.
So that's pretty good.
It does say, though, look, like part of the speed or part of the challenge is moving data into memory and then out of memory.
And so the more you move stuff in and out rather than kind of load it up with the data and then do operations on it, it will be slower.
So it talks a bit about some of the performance things that you can do to make it faster.
It gives some examples on how to do that.
There's probably some interesting trade-offs with the Apple Silicon having shared memory
between CPU memory and GPU memory, right?
You don't actually have to copy it between there.
But I suspect that you're going from Python to C++ memory
and back through some kind of serialization, right?
That's going to have some kind of cost.
Who knows?
Anyway, there's some cool examples of a Mandelbrot set being computed with this,
or Julia set, rather.
But yeah, if people have been dreaming of doing GPU processing on their Macs,
well, this might be worth checking out.
Nice.
The other thing that's kind of cool about it is, you know, it's not like
a library that takes Python code and compiles it or transpiles it to run on the GPU, which
would be awesome.
But at the same time, if it doesn't quite get it right, how much control do you have?
How useful is it?
You don't know.
Here, if you can just give it
the data as an array, then you're just writing straight Metal, which is a bit of a pain if you
don't know it. But at the same time, it means that it's super flexible — kind of like a DB API.
You open a connection, you say, here's a string, run that on the database, you can give it a
selector, some kind of query or update, and it kind of gets out of the way.
So it seems pretty flexible in that regard.
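To make that analogy concrete — this is the DB-API shape he means, shown with the standard library's sqlite3, not the metalcompute API itself:

```python
import sqlite3

# Open a connection and hand it raw SQL strings, much like handing a kernel string to the GPU library.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE nums (value REAL)")
con.executemany("INSERT INTO nums VALUES (?)", [(1.5,), (2.5,), (4.0,)])

total, = con.execute("SELECT SUM(value) FROM nums").fetchone()
print(total)  # 8.0
con.close()
```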
Yeah.
Yeah.
Interesting.
And this is Mac only, right?
So this is... Yeah, it's Mac only.
Take that, NVIDIA.
You can't run on a Mac.
No, just kidding.
I have both an NVIDIA card and this,
but I don't really have...
I would like to do more GPU stuff,
but I just don't have a trillion numbers
that need multiplying right now.
Well, you know, it's not a bad thing. Someday.
Someday. No. All right, over to you.
So we're on to extras.
I've got just one extra that I wanted to bring up, and that was Textual.
Will McGugan posted this picture, with no explanation.
Apparently it's — I'm showing a weird bird with a large mouth.
It's kind of an ugly thing, I think.
But it's the logo for Frogmouth.
And what is Frogmouth?
Frogmouth is the first — apparently the first of possibly more to come — applications that the Textual team is releasing that are built on
Textual. So Frogmouth is a markdown viewer and browser for your terminal, and it looks
pretty clean. I tried it out as well. I haven't been able to get these menu
bar things or the tree things on the side, but it's like a navigation thing where you can browse your markdown within Textual.
So it's kind of fun.
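If you want to try it, installation and usage look roughly like this, going by the Frogmouth README — treat the exact commands as an assumption and check the repo:

```bash
# Install it as a standalone tool (pipx, or rye install as Brian did)
pipx install frogmouth

# Open a markdown file in the terminal viewer
frogmouth README.md
```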
Yeah.
Continues to impress with what they can build with that stuff.
Yeah.
And this is the application that I tried
when I was talking about Rye.
I tried installing this as a global application
and it worked just fine.
Yeah, cool.
All right.
I would like to serenade the audience, Brian.
Oh, yay.
Yeah, I was going to talk about this as a main item: this thing called
Serenade, at serenade.ai. And it's really neat. What it does is you install a plugin for your
editor, where your editor equals VS Code or a JetBrains IDE, one of those two. And then you run this in the background,
and you can speak to it.
Like you hit a hot key or whatever
and you basically start speaking code-oriented operations.
So there's a cool example that it shows somewhere in here.
Let's see.
Yeah, if you go and click on the docs,
you'll see a bit of a video thing here and basically
with that on, you can go into it and you can say, you know, have it do tests; you can
navigate around. This is kind of cool. So if you're typing in the editor, you're kind
of good. I mean, I know if you have RSI and stuff, that's not ideal, but one of the challenges is,
like, okay, I need to leave this and navigate over there. So you can say things like, open some file name, and it'll actually go through your
editor and say, oh, that's over in this directory, you need to expand the section, and it'll jump to
it. Or you can say go to this function, or, you know, those kinds of things, you can speak to it,
and it'll do it. So anyway, it's really cool. Why is it not the main item? Because I'm
super excited about these kinds of things. The reason it's not is it hasn't really been touched
for coming up on a year. And that was just a merge of some PR. Is it still going? I don't know.
It was kind of working pretty well, but then it was throwing errors when I tried it, so I don't know. I
love the idea. If this thing comes back to life, you know,
someone out there let us know, cause this is super cool,
but it doesn't quite seem to be getting kept up with the editors and tools
and so on.
I liked your comment in the show notes.
Serenade seems to have gone silent.
It has gone silent.
Thank you.
But it's still worth checking out.
It's kind of cool.
And then Brian Skinn —
we've been talking a lot about packaging on this episode, and I did my packaging discussion,
and inspired by that, or maybe just inspired by the same discussion that inspired me, they're setting up a Python distribution packaging roundtable, not just of
the people inside the Python core devs' world, but the broader ecosystem, like Anaconda and data
science and that. All right. So they have 13 maintainers across nine projects lined up for
Tuesday, May 9th, and I'll link to it so you can go attend. So if you're interested,
you can go check that out. Yeah, so I'll put a link in the show notes. Yeah. Cool. Yeah.
That's it for the extras on my end. How about a joke? Uh, yeah, but before we go there, um,
I just wanted to say, I just remembered the person that told me about Rye in the first place,
and now I feel like a dork.
It was Paul Everett.
So thanks, Paul, from JetBrains.
Awesome.
Yeah.
Paul was quite the host at PyCon and did a bunch of awesome stuff for many people.
Much appreciated.
All right.
How about a joke?
All right.
Before I put this on the screen, I'll tell you about it.
You know, sometimes programming is just amazing.
You just get in the zone and you just go,
and look what I built, boom, boom, boom, boom.
Other times you end up with a bald patch where you've been tearing your hair out.
You're like, no, why?
You may end up yelling at the computer, right?
There's just like a bunch of stuff.
And this joke highlights the small wins
you might make in this situation, okay?
So it's just a person
with two monitors, a bunch of energy drinks crushed next to them, a bunch of wrappers,
empty coffee. They've clearly been here for a while and the arms are up like, yes,
wow. A different error message. Finally, some progress. Yes. Have I mentioned that I've been working with a guy named Lauren on a Flutter mobile
app?
Let me tell you, there are a lot of tools in that tool chain.
And more than once here, I've been like, yes, that's not the same error.
We're making progress.
Yeah.
This was me yesterday for half the day.
Working with a Docker compile that used Docker and Artifactory and a whole bunch of C++ stuff.
Indeed.
All right.
Well, let's hope for different error messages for anyone out there struggling.
And then eventually, more error messages.
That's a good one.
Like, may you have a different error message tomorrow.
Exactly.
May you live in interesting times, and may you have different error messages.
All right.
Well,
thanks for being here,
Brian.
Thanks everyone for listening.
All right.
Bye.
Bye.