Python Bytes - #467 Toads in my AI
Episode Date: January 26, 2026

Topics covered in this episode:
- GreyNoise IP Check
- tprof: a targeting profiler
- TOAD is out
- FastAPI adds Contribution Guidelines around AI usage
- Extras
- Joke

Watch on YouTube

About the show

Sponsored by us! Support our work through:
- Our courses at Talk Python Training
- The Complete pytest Course
- Patreon Supporters

Connect with the hosts
- Michael: @mkennedy@fosstodon.org / @mkennedy.codes (bsky)
- Brian: @brianokken@fosstodon.org / @brianokken.bsky.social
- Show: @pythonbytes@fosstodon.org / @pythonbytes.fm (bsky)

Join us on YouTube at pythonbytes.fm/live to be part of the audience. Usually Monday at 11am PT. Older video versions available there too.

Finally, if you want an artisanal, hand-crafted digest of every week of the show notes in email form? Add your name and email to our friends of the show list, we'll never share it.

Michael #1: GreyNoise IP Check
- GreyNoise watches the internet's background radiation: the constant storm of scanners, bots, and probes hitting every IP address on Earth.
- Is your computer sending out bot or other bad-actor traffic? What about the myriad of devices and IoT things on your local IP?
- Heads up: if your IP has recently changed, it might not be you (false positive).

Brian #2: tprof: a targeting profiler
- Adam Johnson
- Intro blog post: Python: introducing tprof, a targeting profiler

Michael #3: TOAD is out
- Toad is a unified experience for AI in the terminal.
- Front-end for AI tools such as OpenHands, Claude Code, Gemini CLI, and many more.
- Better TUI experience (e.g. @ for file context uses fuzzy search and dropdowns)
- Better prompt input (mouse, keyboard, even colored code and markdown blocks)
- Terminal within terminals (for TUI support)

Brian #4: FastAPI adds Contribution Guidelines around AI usage
- Docs commit: Add contribution instructions about LLM generated code and comments and automated tools for PRs
- Docs section: Development - Contributing: Automated Code and AI
- Great inspiration and example of how to deal with this for popular open source projects
- "If the human effort put in a PR, e.g. writing LLM prompts, is less than the effort we would need to put to review it, please don't submit the PR."
- With sections on: Closing Automated and AI PRs, Human Effort Denial of Service, Use Tools Wisely

Extras

Brian:
- Apparently Digg is back and there's a Python Community there
- Why light-weight websites may one day save your life - Marijke Luttekes

Michael:
- Blog posts about Talk Python AI Integrations:
  - Announcing Talk Python AI Integrations on Talk Python's Blog
  - Blocking AI crawlers might be a bad idea on Michael's Blog
- Already using the compile flag for faster app startup on the containers: RUN --mount=type=cache,target=/root/.cache uv pip install --compile-bytecode --python /venv/bin/python
  - I think it's speeding startup by about 1s / container.
- Biggest prompt yet? 72 pages, 11,000

Joke: A date (via Pat Decker)
Transcript
Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds.
This is episode 467 recorded January 26, 2026.
Ooh, that's a lot of 26s.
All right.
I'm Brian Okken.
I'm Michael Kennedy.
This episode is sponsored by all of us, all of you, and the products that we have going at Python Bytes.
And I guess not at Python Bytes, but Talk Python Training.
And at pythontest.com, we've got the pytest course,
Lean TDD, but Michael's got courses and books and all sorts of fun stuff.
And also, of course, Patreon supporters.
We still love you.
And yeah, if you'd like to connect with us and send us topics, maybe topics for the show,
we love that.
You can just send us through the contact form at pythonbytes.fm, but you can also find
us on the socials.
And the links are in the show notes.
We're on Blue Sky and Mastodon.
Yeah.
And if you'd like to see what we look like and watch us live or tape recorded,
you can go to pythonbytes.fm/live and, yeah, check that out.
And it tells you when we're going to show next time.
Yeah, they can even check out a glimpse of the Python staff of power just behind you there.
Yeah, it's right.
It's, yeah, there it is.
Yeah.
Yeah, and I'm still on the fence about going to Pycon this year.
And if I go to Pycon, I'll definitely bring the staff of power.
My fence has been built and decided upon. I got
my tickets, and I actually got a pretty amazing deal on Expedia. I was able to get four days hotel
just a block or two from the convention center and the water, and a round trip flight
for 600 bucks total. So I'm going. Oh, nice. Yeah. I'm going to try to ask work
if I can get away with not having to take vacation time to go. Yeah, wouldn't that be nice? Hey,
they talk about pytest there. I know you're doing a lot with that. Yeah. So, all right, well,
Well, should we do some topics for the show?
Let's talk.
Oh, I have the Toad one.
That is not what I want to talk about.
I want to talk about this thing called the GreyNoise IP Check.
Okay, so GreyNoise is a company that does research on cyber security breaches, et cetera, et cetera, right?
Yeah.
So they have published this thing called the GreyNoise IP Check.
And if you just go to the check page on greynoise.io, or way simpler, click the link in the show notes.
It will take you this place that says your IP address is, whatever your IP address is, and your IP address is clean.
Your IP has not been observed scanning the internet or contained in the common business services dataset.
So what is this for?
Like in the day of 30 things connected to your internet, your Wi-Fi, with your kids installing random junk off the internet, like, my kid is pretty good, she says, dad, is it okay if I install this?
Yes, it's okay or no.
And she actually completely reformatted her Windows 11 machine so she could play games but not worry about what was installed on it.
So, you know, not knocking the kid or anything.
But kids install stuff, other people install stuff, your TV, your smart power thing that goes to that lamp in the corner.
You know, there's a crap ton of stuff that is on your network.
And even if you run some sort of virus check on your computer, having that come up empty doesn't mean that there's not a problem with things that you are in control of, right?
So basically the idea of this thing is you can go here and it will tell you if something on your network has been being bad.
Okay.
Yeah.
And I don't know.
For me, I thought that was pretty cool.
So I threw it in as one of the topics so that other people can click on this and see, I guess a couple caveats.
If your IP address just changed recently, it's very plausible that you could get a warning and it was whoever had it before.
You know what I mean?
Like we had a power outage for just a couple hours. I think I talked about this last
time, because my mic had gotten all reset, because everything got powered 100% down when
some sort of transformer thing exploded.
Okay.
We got a new IP address from then, right?
But, you know, a couple weeks ago.
And so it turns out that, I guess things are still good.
That's good to know.
But if you just had your IP address change, then it could be picking up issues from whoever
had it before, right?
Kind of like you keep getting spam with that new phone number you got.
Yeah, but these, I don't know about you, but my IP address is pretty much rock solid stable,
unless something goes wrong with the internet or with power that it may change.
But other than that, it's the same for quite a while.
I'm slightly embarrassed to say I usually don't know my IP address.
And I actually don't quite get how, I mean, I get how IP addresses work.
But like for a house, is your whole house like the same IP address?
Or I don't know how that works.
Also, like, does this, can you use this, like, if you're at a cafe, to check to see if there's other, like, bad actors there?
Yeah, yeah, I guess you probably could, actually. Yeah. So, yeah, anyway, interesting. Okay, well.
Yeah, and I know mine because I have a lot of my stuff gated behind IP addresses, like the server.
You can't even SSH to the server unless your IP address is on a whitelist, right? So
periodically when it changes, I'm like, ah, what is my IP address again? So then I go copy it. I don't
really pay too much attention, but because of that, I kind of got to pay a little attention.
Yeah, I'm like a very cloud data, data in the cloud sort of person. So I guess. Perfect. All right.
Well, I'm very excited about this next topic that you have because I also saw and said,
that is killer. It is killer. Yeah. So what we're talking about is tprof. This is from,
or t-prof? I think it's tprof. It's the targeting profiler from Adam Johnson. And we've got a link to
the PyPI site.
But it just mostly says, yeah,
it's a targeting profiler,
but we're also going to link to his article
introducing it,
Introducing tprof, a targeting profiler.
So I, yeah, so it's, it almost speaks for itself.
What the, what he's talking about is if you run a profiler on your program,
it gives you everything.
And you do want to do that at first
to see all the hot spots: where are you spending your time?
And that's what you use profilers for, to see where,
you know, where you maybe could optimize some of your code.
But once you've pinpointed what you want to try to fix,
then you want to try to measure that.
And his pain point was running it over the whole program
again and again and again.
And you don't really need to.
So his is a targeting profiler, and you give it exactly the function
that you care about, or multiple functions.
And it'll report what the times are, like the min and the max and stuff.
And then he went further and said, there's a lot of times where you're just going to want
to do, like, a before and after function.
And it probably won't be named before and after, but, like, my search routine old
and my search routine new or something like that.
And you can have those be targeted and run the results and compare those.
Nice. It's like a showdown, right?
Instead of trying to remember the results, you just say,
here's the old way, here's the new way, how that turn out, right?
It's kind of got two ways to do it.
Well, so you can
pass the functions that you want to look at on the command line.
So you don't even have to actually change anything in your code to do this.
You can just target a couple functions and call those.
But you can also do a dash
X for a compare, and it'll do a delta time, like a percent faster or a percent slower,
which is pretty cool.
Or you can not have to do the command line thing and just use the context manager.
You can say which ones you're going to compare and run those before and after with tprof.
All those are, I'd probably use all of those at different times,
but I'm really excited to try this to optimize some code that I'm going to be working on.
It's pretty neat.
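The targeted before-and-after comparison Brian describes can be sketched with a plain context manager. To be clear, this is just an illustration of the idea, not tprof's actual API (see Adam Johnson's post for the real interface), and the search functions are made-up stand-ins:

```python
# Illustrative only: a hand-rolled "targeted" timer, not tprof's API.
import time
from contextlib import contextmanager

timings: dict[str, list[float]] = {}

@contextmanager
def target(label: str):
    """Record wall-clock time for just the code inside the block."""
    start = time.perf_counter()
    try:
        yield
    finally:
        timings.setdefault(label, []).append(time.perf_counter() - start)

def search_old(items, value):
    # Linear scan: the "before" implementation.
    for i, item in enumerate(items):
        if item == value:
            return i
    return -1

def search_new(items, value):
    # Dict lookup: the "after" implementation.
    return {item: i for i, item in enumerate(items)}.get(value, -1)

items = list(range(10_000))
for _ in range(50):
    with target("search_old"):
        search_old(items, 9_999)
    with target("search_new"):
        search_new(items, 9_999)

for label, samples in timings.items():
    print(f"{label}: min={min(samples):.6f}s max={max(samples):.6f}s")
```

The point is the same as tprof's: only the code inside the targeted block gets measured, so startup, imports, and everything else stay out of the numbers.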
Apparently, this uses the new Python 3.12 profiling API, which is really nice.
So using the modern lower touch profiler as well.
Yeah.
So of course, you have to run it with 3.12 or above.
But that doesn't mean that the code that you're testing against is necessarily just 3.12 and above.
It's just when you're running it, it'll be 3.12.
You know what, 3.12 is almost obsolete.
Seriously, like how insane is that?
What?
I mean, 3.10 is the oldest that is even allowed.
Like, 3.9 has gone into, like, we don't even do security patches anymore.
Yeah, yeah, that's true.
3.10 is on its way out, and then, like, in two years that thing's gone.
So it's not that old.
And we don't even mention 2.7 anymore, the legacy.
So good.
So good to be past those days, right?
I would also like to point out that this is just so good for profiling any real
application, like any real application. Because so much of what happens if you just, like, cProfile,
you know, my particular app, is all the time spent loading modules, all the time connecting to the
database, and just all of the, you know, setting up the logging, like all the stuff that
happens before you get to the one function you want to see what's going on in, and then it stops.
And you're like, the thing only takes 50 milliseconds, but I've got profiling for like two seconds.
and what is going on.
It's just so lost in the noise.
And then this one, you can just say, run it,
let all that, just ignore all that junk.
When you get to this function, start,
when you're done with the function, stop.
You know what I mean?
Yeah, and that's actually, oh, that part of it's overwhelming to me.
Every time I've had to reach, I don't have to, you know,
it's like a lot of debugging tools.
You don't have to reach for them all the time.
But when you do, it's a little overwhelming to look at all that profiling output.
And I still haven't.
100%.
I still haven't got my head around flame graphs, but, you know.
Yeah.
Catch them.
They're hot.
No, honestly, like, the fact that you can just say, just profile this function, and that actually
happens on the CLI, my profiling is back, baby.
Like, before it's just, I'm like, is it really faster?
Trying to dig through all that overhead and junk and decide, like, probably no.
But now maybe, maybe it really is.
Thanks, Adam.
Yeah, thank you, Adam.
And then one more thing is, like, a lot of times some things are slower the very, very first
times they run. So it might be worth writing a function that actually, like, runs it a bunch of times,
or runs it once before you get to it, and then profile. You know what I mean? Like, just, like,
because it might be the module loading that actually overwhelms the compute, but that only happens
once, or, you know, .pyc compilation. The context manager one would be great for that, then.
You could just call both, like, both the new and the old function once, and then do a loop
and run each, like, a hundred times.
Yeah, that would be totally perfect.
That would be really, really good.
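That warm-up idea, running the code once or twice first so one-time costs like imports or .pyc compilation don't skew the numbers, can be sketched like this. This is a hypothetical helper for illustration, not part of tprof:

```python
import time

def measure(func, *args, warmup=3, repeat=100):
    """Time func(*args), discarding the first `warmup` calls so
    one-time setup costs don't pollute the samples."""
    for _ in range(warmup):
        func(*args)  # absorb first-call costs
    samples = []
    for _ in range(repeat):
        start = time.perf_counter()
        func(*args)
        samples.append(time.perf_counter() - start)
    return min(samples), max(samples)

data = list(range(1_000))[::-1]  # reversed list as a toy workload
fastest, slowest = measure(sorted, data)
print(f"sorted: min={fastest:.6f}s max={slowest:.6f}s")
```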
You know what we don't talk enough about, though, Brian?
Honestly, I mean, what?
Toads.
Talk about Toad.
I know, and we live in Oregon.
We got them all over the place.
We do.
We do got them all over the place.
Or frogs, at least.
I don't know if we have toads.
I don't, you know what.
So with Toad, this comes from Will McGugan.
It's kind of his next big thing that he's been working on since textual.
Okay.
And the idea is it's kind of like a UI for Claude Code type things.
It actually works with something called OpenHands.
It works with Claude Code.
It works with Gemini CLI, probably others that I don't know about.
When I first saw this, I'm like, okay, well, I already have a terminal.
I can do this.
Like what I'm not really entirely sure what I need this for.
But the more I looked into it, it looks really quite neat.
So if you go through, it obviously renders pictures.
Apparently not there.
It renders pictures; like, one of
the pictures is the Mandelbrot set drawn.
Cool.
And it has like really nice, basically better input.
One of the challenges with things like Claude Code and friends
is you're just in the terminal.
And so can you use the mouse to like select part of your text
and cut it?
No, it's the terminal, all right?
You know?
The arrow keys work right and all this kind of things.
So it has a really nice, I'll just like put this little video
in the background while it's going here.
I can get it to play.
And it doesn't want to play.
Well, it has a really nice input for that kind of behavior, right?
It has a nice little web server that runs.
Let's see, a few more.
It uses fuzzy search.
So something you do all the time or should do all the time.
And if you're not, you're totally missing out, is adding files as specific context to what you're working on.
So you might say, hey, I'd like to work on the login page.
here are the notes that I, you know, use the notes that I took.
How well is it going to work?
We don't know.
You could say I'd like to work on at login.html and please see at login requirements.md, right?
And you, like, that will actually pop up a select.
And so this uses really nice, like fuzzy search and UI drop downs in the terminal.
It says better prompt input.
It uses, it has support for terminals within terminals.
So if you ask it to run a command that is like complex terminal output, it will actually embed that, which is really sweet.
So people should give this a look.
I think it's pretty nice, you know, pip install, I believe.
I'm going to check it out.
No, curl.
Curl install it.
Get it.
That's the new way, right?
uv is showing us the way.
But you can also uv tool install it if you prefer.
And that's probably the way I would install it because I already have scripts that basically manage all of my UV tools, check for updates and so on.
So check out Toad if you're doing like CLI, cloud code like things.
Yeah, cool.
Yeah.
Batrachian.
Batrachian.
I don't know.
The curl is pulling it from a URL called B-A-T-R-A-C-H-I-A-N.
Yeah.
Is that him?
Is that him?
That's him.
Okay.
Nice.
Cool.
You even got your animated Zoom.
Oh, yeah.
It's got the Codex one as well, and a bunch of others.
Unified AI
for your terminal.
Nice.
Nice.
Yeah, so this is, I don't know how this relates back to toads,
but it must in ways that I don't know.
Well done.
Well done, Will.
Well, it's the inkblot.
What do you see here?
I see a Toad.
Exactly.
Well, I want to talk about something that's sort of AI related as well.
So FastAPI just made, I'm linking to a merge request,
but we'll link to the actual page too.
But in the
contributing to FastAPI page, there was a new change to talk about contribution instructions
about LLM generated code and comments and automated tools for PRs.
So I'm guessing that FastAPI at least, but probably all very popular projects, are having
a problem with people doing PRs that they really haven't spent that much time on the PR,
but they want to get their name in or something.
I don't know.
But there's some really good highlight here.
And it's not, so I'm bringing this up,
not just because FastAPI is awesome.
And it is, but it's just sort of an interesting thing
that we're having these sorts of discussions.
And this is nice, concise verbiage,
if you want to add some of this prose to your own project
to say what is allowed and what's not.
So we'll link to the contributing guideline,
Development - Contributing,
And there's an automated code and AI section.
And the gist is you're encouraged to use AI tools to do things,
you know, whatever tools you have of hand to do things efficiently,
but also know that there is human effort on the other end for the pull request.
And they just basically want to make sure that you're not doing less effort than they have to do to just check it.
So please put a person in the middle.
So there's things,
like they will automatically close things that look like automated pull
requests from bots and whatever. And they have a section on human effort
denial of service, you know, using automated tools and AI to submit PRs or comments that we have
to carefully review and handle would be an equivalent of a denial of service attack on our human
effort. It would be very little effort for a person submitting the PR that generates a large
amount of effort on our side.
Please don't do that.
So I think this is completely fair.
And it said, we will block accounts that spam us with repeated automated PRs or comments,
use tools wisely.
And a nice, what is this, Spider-Man quote: with great power,
I mean, with great tools, comes great responsibility.
So I just think this was good on their part to throw this in of, yes, use tools.
but know that there's a human on the other side having to deal with it.
Yeah.
Keep it focused, right?
Like, don't submit a 7,000 line PR and say, I made it better.
You know, like, there was probably some small little part that need changed.
Stay on target, right?
Yeah, that's actually, now that you bring that up, it is very easy now to say,
oh, I want to refactor this function to be a new, like, even with, without AI,
I want to refactor this function to be a new function name,
and it's going to change tons of code.
that's still okay to do if it's focused,
even if it hits like hundreds of files,
if it's just that one thing, like separate those up
so that code reviewers can go,
oh yeah, you just change that function name, that's fine.
But if you do that plus, oh, plus I, you know,
formatted with black and plus I like, you know,
optimize this one function,
that's terrible to combine those together.
And with AI, it's very common for it to just churn through that kind of stuff
and generate a bunch of changes.
The curl team actually has blocked all AI contributions,
period for the same reason.
It's like, this is out of control.
We're not doing it.
I got a weird request for a private repository.
When I was starting the pytest course,
I kicked around the idea of having a
GitHub group or whatever, like whatever those are,
an organization, for testing code.
I decided not to, but that organization still has some of the
private code I used for the course. And I had somebody request that I give Claude access to my
private notes. No, I'm not going to do that. No. But anyway, I'll share one really quick,
weird Claude plus GitHub thing on the topic. Then we'll move on. I am working on this project where
I was doing something with Claude Code on this feature. And I had created a GitHub issue with a lot
of notes from myself about it on GitHub, but I didn't connect Claude code to it or anything. And then I was
like, hey, Claude, let's just brainstorm about what it would look like in this codebase to add this feature.
And it takes a second and it goes, I see that you've considered this, this, and this.
And now that I've got that background, I think this, I'm like, how do you know that?
What it had done is I had the GitHub CLI, gh, installed on my computer.
And so it used GH to go explore the GitHub repository and find the issue.
Issue 1274 or something.
Yeah, issue 1274 said you wanted to do it this way.
I'm like, okay, that is insane that it just got in there.
I didn't ask it to, you know what I mean?
Yeah, anyway.
Do I want gh on my computer?
I'm like, how did it figure that out?
It shouldn't have access to information, but it's okay.
How about we move on extra time?
Yeah.
Do you have any extras?
Yeah, sure.
I got a couple, and then we can flip over to you.
So, first of all, remember, Henry Schreiner pointed out that you can use --compile-bytecode as a flag
to uv. And it points out in the uv docs here that uv does not compile .py files to bytecode in the __pycache__ folder as .pyc files. Instead, that is lazily done at module import. And that really got my attention, because of all the Docker deployments that we do. For example, Python Bytes runs on Docker. So when I deploy a new version of pythonbytes.fm, what happens? It compiles the code, installs the dependencies with uv,
And then it starts, you know, there's other stuff, but effectively,
then it just starts Granian running Python Bytes, right, as a Quart app.
And from the time that it actually starts starting until it's all the way started,
like the website is unavailable, right?
Yeah.
Because it's not some super complicated Kubernetes thing.
It just restarts the Docker container.
And for me, for the extra simplicity of like having one to two seconds of downtime per couple
times a week, it's totally fine. But what I realized when Henry said that, and you talked about
uv not compiling these things, that that actually means, every time I start the Docker
container, literally every time, it has to compile the .pyc files for every library that it uses,
and it uses like well over 50. There's a lot of libraries for these web apps, right? Yeah. And I'm like,
wow. Okay, so there's no scenario where it will ever not have to generate those on App Start, right? Because I don't
ever shut down the Docker container and start it again. The only time, the only reason the
Docker container will ever shut down is because it needs a new version of code or dependencies.
So it rebuilds the container from scratch using layers, and then it will bring that back up.
So I added --compile-bytecode to all of the uv installs, because I don't care
if the build time is one second slower, if that means the actual launch time, which involves
down time is one second faster.
Okay, so does this
this pre-compile thing happen
then at the time you're building the Docker image?
Yes, which has nothing to do with
the uptime or anything.
It's not until you say restart the Docker container
with the new image that it actually shuts
down the old one and starts the new one.
So I might make the build time one second slower
but the launch time one second faster.
Yeah, and it probably makes like what
the Docker image a little bit bigger?
Probably.
Yeah, yeah, yeah, probably.
But it would have, you know, I guess it does make the image a little bit bigger.
But I'm not shipping it to Docker Hub or anything.
I don't really care.
Yeah.
I mean, even if you were, it's that, yeah, one second faster to do the flipover.
It's great.
I think it's, and all it is, is just: include --compile-bytecode on your uv install stuff.
Yeah, sweet.
So thanks, everyone.
You and Henry and whoever else.
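Concretely, the flag goes on the install line in the Dockerfile, so the .pyc files get baked into an image layer once at build time instead of being generated on every container start. A sketch based on the command from the show notes; the venv path and requirements file are illustrative:

```dockerfile
# Build-time: install deps and pre-compile .pyc files into this layer.
# The cache mount keeps uv's download cache warm across builds.
RUN --mount=type=cache,target=/root/.cache \
    uv pip install --compile-bytecode \
        --python /venv/bin/python \
        -r requirements.txt
```

The trade-off is exactly as described: a slightly slower build and a slightly larger image in exchange for a faster cold start.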
Okay, two things real quick.
I talked about the MCP server for Talk Python and
how that works, but I did a nice little write-up about it over on the Talk Python blog,
so I just want to point people to that proper write-up, which has some background on the MCP
server and the llms.txt information there. And another thing, Brian, people sent me a message
and said, so let me get this right. You just recounted the story of Tailwind CSS getting
destroyed by AI, and then you purposefully added AI stuff to Talk Python. Are you insane?
Well, okay, maybe, but I don't think so.
So I also wrote a blog post on my personal blog about why I think, you know,
hiding from AI, blocking AI crawlers is probably not going to serve you in the long term
and why that I added these things.
So sort of gave my background and why I thought that was worth doing.
So people are like, like Michael's kind of got two contradicting thoughts in his mind at the same time.
I don't think it's contradictory though.
I think it would be like you, you know, doing AI, having an AI generated thing on all of the content of Talk Python training.
Yeah.
And you wouldn't do that because that's what you make money off of.
So exactly.
This stuff is basically I'm saying if you have content you would like in Google, you probably want it in AI indexes as well.
Yeah.
If it's something you don't want publicly available in Google, then you probably
don't want it there as well, right? But this is, like, assuming you want to show up in search
results for regular search engines. You probably do with these. And also, just really quick while we're on it:
putting in the MCP sort of stuff here means it turns questions like, hey, what guests were on the show,
or what was this episode about, or maybe I could find the transcripts for a page, or whatever, it
turns that from scraping my website and hitting tons of code into database-indexed, single queries for a small
fragment of text. So if the AI is going to ask questions anyway, this is way less harm on my server,
if you will, than... Well, hopefully they do that instead of doing both, though. Yeah, I know.
The other thing is, so do you have to, is this a process now that you have to keep this updated on a
weekly basis or? No, it's all just driven by the database. Okay. So, yeah, and if you look at the,
my personal post version, you can actually see some pictures of, like, how
claude.ai, the chatbot, is used.
So I'll ask it, like, what are the last five episodes?
You can see it's actually calling the API endpoint getting recent episodes.
And then, or if you ask it more, it'll like search.
And then based on the search results, it'll get the details and so on.
Yeah.
One of my next projects, or in the near future, is to try to build one of these for
internal stuff.
So the internal tools can see internal APIs.
Yeah.
It's very, it's very neat.
And it's not that hard.
Also, you don't need a whole
framework like FastMCP or one of these things. It's just a couple of simple web requests.
It's not like implementing your own WebSockets or something. You can just add it to any
website. It's super easy. Cool. All right, any other extras? Nope, over to you. Okay. Let's see, what do I got?
I got a few. Do you remember Digg? I do. Digg is back, and I'm linking to a TechCrunch
article. There's a bunch of articles on it, but I did a quick skim. Basically,
they're trying to be what Digg used to be, but not sucky. And also
some interesting people, Kevin Rose and one of the co-founders from Reddit, are
putting it together. Interesting. And yeah, anyway, I'll be interested to watch to see if it
becomes something that we should care about. But anyway, interesting article. Also, there's a Python
community in there that just started. There's 99 members. Not 99 members of Python in the world,
there's more, but on the Digg community. And it was put together by somebody that,
actually, the same person that sent it to us. So this Wiggin person sent it to us,
which was just great, and also talked about us, the Python Bytes podcast. Yay. So, yeah.
Thank you.
Yeah, it's awesome.
Another interesting article I ran across was from, I think it's, I'm sorry, Marijke,
Marijke Luttekes: Why lightweight websites may one day save your life. Basically just an ode to
lightweight websites.
And this has a lot to do with: if the intent is for people to be able to use
your website, even if they're on a cell phone, even if they have, like, a bad connection,
then lightweight websites are a must.
That's just basically it.
Think about your target.
audience. If your target audience is on the move, make it light and fast, obvious.
Last, a shout out to an article called How to Parameterize Exception Testing in Python,
in pytest. And I'm glad people are still writing articles about pytest. And I'm going to let it,
I'm going to let it slide that he did the capitalization wrong. No capitals in pytest.
Anyway, it's fine.
You can fix it.
If you just right click on that and say inspect, you can fix his website.
Did you know you could edit other people's sites?
Wait, really?
It doesn't last, though.
Well, it does for you, but if you reload, or for anyone else, it doesn't.
The trick here, which is cool, is he's parameterizing whether or not something raises an exception.
And he's using a thing from contextlib called nullcontext, importing it and changing its name to does_not_raise.
And this is brilliant because this is very clear code to say, yeah, these cases that shouldn't raise an exception.
Other cases should be a zero division error or a type error or whatever.
Obviously, you're not going to write a test for division exceptions.
But for your own code, you just make sure that it's raising the right exceptions.
Yeah.
And throwing in a couple of cases for when it doesn't raise, this is good clean code and short article.
So I like it.
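The pattern from the article looks roughly like this; the division function and parameter values are toy stand-ins for your own code:

```python
# Parametrizing whether a call should raise, using nullcontext
# renamed to does_not_raise for readable happy-path cases.
from contextlib import nullcontext as does_not_raise

import pytest

@pytest.mark.parametrize(
    "a, b, expectation",
    [
        (6, 2, does_not_raise()),                  # happy path
        (6, 0, pytest.raises(ZeroDivisionError)),  # division by zero
        (6, "x", pytest.raises(TypeError)),        # wrong operand type
    ],
)
def test_division(a, b, expectation):
    # nullcontext does nothing; pytest.raises fails the test
    # unless the named exception is raised inside the block.
    with expectation:
        a / b
```

The nice part is that the happy-path and exception cases sit side by side in one parametrize list.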
That's it.
That's it.
I would have not seen that coming.
That's pretty good.
Yeah.
It's very, very good.
Well, that's it.
Do we have something funny?
This last thing here, this joke is really here, basically for you, Brian.
This was sent in by Pat Decker, I believe, and I think it's good.
I think you'll appreciate this coming from a testing perspective, okay?
Okay.
Okay.
So this was on Reddit.
And the idea here is two developers talking, or maybe a project manager, I don't know,
who says: your new date picker widget has crashed.
Really? That's impossible. I've tested it with negative numbers, special characters, null. What have you put in it? A date?
Yes.
Oops. I forgot the base case.
So, so busy testing all the edge cases and the error conditions that they forgot to see if it even worked.
Yeah. Yeah. That's a weird thing. I think that there's so many people think that testing is just about like the complicated edge cases.
But you got to get the happy path first and make sure those
work also. Yeah. Yeah, absolutely. That's funny. Pretty good one. Yeah. You've heard that tester-walks-into-
a-bar one, right? Yeah, I think so, but give me your variant. I'll, I've just got a bad memory, but: a tester walks into
a bar and orders half a beer, and one and a half beers, and a
negative beer, and a million beers to see what happens, and everything, everything works fine.
I thought the bar caught on fire.
An actual customer comes into the bar and orders one beer, and the bar catches on fire, and
there you go.
There you go.
Like that.
It's beautiful.
It's a beautiful thing.
It's a, it's like a fable that tells the moral story of testing.
Yeah.
And then my dad joke version, of course: a guy walks into a bar and says, ouch.
All right.
It does hurt.
Different kind of bar.
Sorry.
Spoiled your joke.
But, anyway.
All right.
Another fun episode.
Talk to you guys next week.
Bye, you all.
