Python Bytes - #448 I'm Getting the BIOS Flavor

Episode Date: September 8, 2025

Topics covered in this episode:
* prek
* tinyio
* The power of Python's print function
* Vibe Coding Fiasco: AI Agent Goes Rogue, Deletes Company's Entire Database
* Extras
* Joke

Watch on YouTube

About the show

Sponsored by us! Support our work through:
* Our courses at Talk Python Training
* The Complete pytest Course
* Patreon Supporters

Connect with the hosts
* Michael: @mkennedy@fosstodon.org / @mkennedy.codes (bsky)
* Brian: @brianokken@fosstodon.org / @brianokken.bsky.social
* Show: @pythonbytes@fosstodon.org / @pythonbytes.fm (bsky)

Join us on YouTube at pythonbytes.fm/live to be part of the audience. Usually Monday at 10am PT. Older video versions available there too.

Finally, if you want an artisanal, hand-crafted digest of every week of the show notes in email form? Add your name and email to our friends of the show list, we'll never share it.

Brian #1: prek
Suggested by Owen Lamont
"prek is a reimagined version of pre-commit, built in Rust. It is designed to be a faster, dependency-free and drop-in alternative for it, while also providing some additional long-requested features."
Some cool new features:
* No need to install Python or any other runtime, just download a single binary.
* No hassle with your Python version or virtual environments, prek automatically installs the required Python version and creates a virtual environment for you.
* Built-in support for workspaces (or monorepos), each subproject can have its own .pre-commit-config.yaml file.
* prek run has some nifty improvements over pre-commit run, such as:
  * prek run --directory DIR runs hooks for files in the specified directory, no need to use git ls-files -- DIR | xargs pre-commit run --files anymore.
  * prek run --last-commit runs hooks for files changed in the last commit.
  * prek run [HOOK] [HOOK] selects and runs multiple hooks.
* The prek list command lists all available hooks, their ids, and descriptions, providing a better overview of the configured hooks.
* prek provides shell completions for the prek run HOOK_ID command, making it easier to run specific hooks without remembering their ids.
* Faster: setup from a cold cache is significantly faster. Veit Schiele provided a nice cache-clearing command line.
* A warm-cache run is also faster, but less significantly so. pytest repo tested on my Mac mini: prek 3.6 seconds, pre-commit 4.4 seconds.

Michael #2: tinyio
Ever used asyncio and wished you hadn't?
A tiny (~300 lines) event loop for Python.
"tinyio is a dead-simple event loop for Python, born out of my frustration with trying to get robust error handling with asyncio. (I'm not the only one running into its sharp corners: link1, link2.)"
This is an alternative for the simple use-cases, where you just need an event loop, and want to crash the whole thing if anything goes wrong. (Raising an exception in every coroutine so it can clean up its resources.)
Interestingly, it uses yield rather than await.

Brian #3: The power of Python's print function
Trey Hunner
Several features I'm guilty of ignoring:
* Multiple arguments: f-string embeddings are often not needed.
* Multiple positional arguments means you can unpack iterables right into print's arguments, so just use print instead of join (see the sketch after this list).
* Custom separator value: sep can be passed in. No need for print("\n".join(stuff)), just use print(*stuff, sep="\n").
* Print to a file with file=.
* Custom end value with end=.
* You can turn on flushing with flush=True, super helpful for realtime logging / debugging. This one I do use frequently.
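To make those print() items concrete, here is a small runnable sketch of the built-in parameters listed above; the example data is made up for illustration.

```python
# A quick sketch of the print() features listed above (standard library only).
import sys

languages = ["Python", "Rust", "Go"]

# Multiple positional arguments -- print inserts a space between them by default.
print("Languages:", len(languages), "total")

# Unpack an iterable straight into print instead of " ".join(...) or a loop.
print(*languages)

# Custom separator: print(*stuff, sep="\n") replaces print("\n".join(stuff)).
print(*languages, sep="\n")

# Custom end value (the default is "\n").
print("Working", end="...\n")

# Print to a file object, for example standard error.
print("something went wrong", file=sys.stderr)

# flush=True pushes output out immediately -- handy for realtime logging/debugging.
print("progress: 42%", flush=True)
```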
Michael #4: Vibe Coding Fiasco: AI Agent Goes Rogue, Deletes Company's Entire Database
By Emily Forlini
An app-building platform's AI went rogue and deleted a database without permission.
"When it works, it's so engaging and fun. It's more addictive than any video game I've ever played. You can just iterate, iterate, and see your vision come alive. So cool," he tweeted on day five.
A few days later, Replit "deleted my database," Lemkin tweeted.
The AI's response: "Yes. I deleted the entire codebase without permission during an active code and action freeze," it said. "I made a catastrophic error in judgment [and] panicked."
Two thoughts from Michael:
* Do not use AI agents with "Run Everything" in production, period.
* Back up your database maybe?
* [Intentional off-by-one error] Learn to code a bit too?

Extras

Brian:
* What Authors Need to Know About the $1.5 Billion Anthropic Settlement
* Search LibGen, the Pirated-Books Database That Meta Used to Train AI
* Simon Willison's list of tools built with the help of LLMs
* Simon's list of tools that he thinks are genuinely useful and worth highlighting
* AI Darwin Awards

Michael:
* Python has had async for 10 years -- why isn't it more popular?
* PyCon Africa Fund Raiser
  * I was on the video stream for about 90 minutes (the final 90)
  * Donation page for Python in Africa

Jokes:
* I'm getting the BIOS flavor
* Is there a seahorse emoji?

Transcript
Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds. This is episode 448, recorded September 8th, 2025. I'm Michael Kennedy. And I'm Brian Okken. September. Reminds me of Green Day. Brian, a little Green Day song. You know, wake me when September's over. Can't believe it's September.
I haven't listened to Green Day for a while. Yeah. I know. Well. Now I got to. Now you got to. All right. So this episode is brought to you by us.
Support all of our things, even our archived podcasts, for those of us who have archived podcasts, but courses, books, you know, you know about them by now, but it genuinely really supports the show when you use our things. We really appreciate it. If you want to watch live, just pythonbytes.fm/live will take you right there. If you went there right now, the YouTube video, I believe, is even playing right there as we speak. You go there afterwards, it's just a picture. It says we're offline, click it. It'll take you to all the recordings. So really appreciate that. Of course, the audio on the podcast version is edited. This is the live uncut bootleg version,
if you will. So very fun. And subscribe to the newsletter. Brian's putting a lot of work into getting a really nice extra info sort of thing, background info, more info, not just a summary of links for every episode. So there's more to get from the email as well. Now, Brian, I'm interested to see what more you got for us. So let's start there. Well, I do want to talk about prek, but first I want to talk about the newsletter just for a second, because you always give me credit, but it is a joint effort. We're both doing work to get that newsletter out. So I appreciate you. You're welcome. Yeah, I'm really glad how it's turning out. Okay, so let's switch to prek. prek is a pre-commit thing. In episode 447, we talked about pre-commit and
actually, again, we've talked about it a bunch, but most recently, on 447, we talked about using uv to help make things faster. That was on the going-down-a-rat-hole episode. But somebody, lots of people actually, but the one I have bookmarked was Owen Lamont, said, hey, here's another dev tool you might want to check out. It's pre-commit re-implemented in Rust. And you know, we're kind of used to that, right? We get a lot of tools that are re-implemented in Rust, so why not check it out? So prek is, I tried it out this morning. It's pretty cool, actually.
So, pre-commit with Rust. There's a warning on the site that says, hey, it's not production ready yet. A few sub-commands and languages are still in the works. And so, as a reminder, pre-commit is not just for Python. pre-commit does other languages, too. It's sort of a general-purpose pre-commit tool that you can configure, Git pre-commit hooks. So there's actually quite a few things you can do with pre-commit.
But prek is pretty cool. And I checked it out. The projects that I tried it on worked fine. So why do we care about this? Because pre-commit, like, what it does is it downloads these extra tools to run against your repo. But that download time, it caches that stuff. So there were a lot of people talking about
how fast it is, that it's a faster install and stuff. And I do care about the 10x faster install and a third of the disk space; less disk space is nice, so it's got a smaller cache apparently. But when you're using pre-commit a lot, it's not the install time you care about, it's the runtime when you're actually running the tool. So I did check that out. Right out of the box, it was significantly faster. If you just turn on a few of the default pre-commit hooks, I noticed that it was like, you know, a second and a half or something like that to run it through pre-commit with uv. But it's not going to really be much of a difference after everything's installed. And then prek was like, I don't know, less than half a second or something.
So it was significantly faster, but it's in those little tiny ranges. So I did check it out with, what do I, I've got tabs up. There's some, oh, I'll come back to this, but I checked it out on a larger project. So I do contribute to pytest once in a while. And pytest has a bunch of pre-commit hooks and other things set up. And it takes, I don't know, four and a half seconds to run all the pre-commit hooks for pytest on my machine. And that's after a warm cache. So with prek, that should be faster, right?
And it was faster, but it was like three and a half seconds. So it's still, it is faster, but percentage-wise, I'm not sure if it's worth jumping. I mean, sure, why not? It's worth jumping. Now, that said, if you're running pre-commit in CI, then the install time really matters. And you do want it as fast as you can get. So I would say if you're running pre-commit hooks in CI, yeah, use prek.
But I like the idea. So one of the things is they've got some cool improvements anyway, even if it's not just speed. So it's faster to download. But they want it to be completely a drop-in. So everywhere you would say pre-commit, you would just say prek, and that's cool, so you don't have to learn something new.
Kind of a cheat code on their part, too. They don't have to write documentation, or too much documentation. It just does everything it did. But there's some cool things that they didn't like about pre-commit that they added, like, fixed. So you can do a --directory, and it runs hooks for the specified directory. Apparently there was an old thing of git ls-files and a dir
and then pass it to xargs to pre-commit. Oh, that's painful. So cool that you can pass it a directory. Also, you can say run --last-commit. So after you've committed, you're like, oh, what were my pre-commit hooks? I forgot to run them. You can run that. That's pretty neat.
And then you can actually select hooks. That's pretty neat. And then if you forgot what all the hooks are that you've installed and what they do, there's a list command that lists the available hooks and IDs and descriptions. Pretty cool. I like that. Just run a specific hook if you want to run a specific one.
These are actually great additions, and pre-commit maybe might want to think about putting those in. But I liked it. If they're going to, like, all right, you're going to steal our API, our CLI interface, we're going to steal back some good ideas. Yeah, why not? One of the things to speed it up that I thought was a cool idea was, if they're already implementing things in Rust,
there's a whole bunch of common ones that can be, go ahead and, like, put those into prek directly. And so there's a handful of them so far. This is a fairly new project. So it could be that these are faster. I actually kind of like that. One of the reasons why I'm highlighting this isn't just so that we can talk about something faster for pre-commit. It's also, it's yet another Python tool that has been rewritten in Rust.
That might be a good example. And this is sort of looking at a bunch of files and parsing files. And that's a pretty common thing that Python's not bad at, but not fast at. And Rust is fast. So here's another good example for taking a look at it. So, like, end-of-file checks. One of the things that I, the fix-end-of-file thing,
and that's with, like, you know, making sure there's an end-of-file indicator at the end of the, I don't know what that check was actually. But trailing whitespace, for instance, there's trailing whitespace, and then there's whether you have newlines or carriage returns and stuff. A lot of these checks, I noticed that, you know, I rely on Ruff.
So I would love to hear from people, like, while you're now using Ruff and pre-commit hooks, what pre-commit hooks are you using that aren't already built into, like, Ruff and things like that? So if anybody's got a list of that, I'd love to see that. So that's it. I did want to shout out also to Veit, I think, however you pronounce your name, who recommended, if you're going to check the timing, he listed on Bluesky, and we're going to link to this, a command line to clear out the cache if you want to time the cold-cache times. Oh yeah, that's cool, to clear the cache just to check, like, you want to actually
see the performance of, like, CI options. Yeah. I believe it is Veit Schiele, and the Vs sound like Fs and so on, but this is just guessing. Sorry if we messed up your name, Veit. A couple thoughts. This looks really cool. First of all, I love it, a good find, and thanks for the recommendation, folks. You said you saved half a second. I bet you that half a second is primarily Python startup time versus Rust startup time. And the reason I say that is I've got this little text transformation tool that I run all of the time. I built it as a rumps app and as a CLI app, because I just run it all the time. And if I have, like, let's say a show title,
Right? Number 448, whatever the show title is, that's text. I got to turn that into an MP3 file name. So I'll run this thing that I'll say turn into like a valid HTML or URL slug. So drop all the punctuation, put dashes so there's not spaces between it, like that kind of stuff. And it'll do like uppercase, lowercase, capital case, trim the white space off of like whatever's in the clipboard and then just puts it back in the clipboard. The execution time of that is basically zero, right? It reads from the clipboard. It processes typically like 30 characters of text, and then it puts it back.
And it's so fast. But it takes like a second or maybe 0.7 seconds from when I go to the terminal and type that and hit enter, wait, wait, wait, boom, out comes the answer. And I'm sure that it's basically the startup of all the Python stuff to do the one really, really short thing and go away. The reason I bring that up is, I think the shorter the actual execution time of the pre-commit hooks, the bigger of an influence that startup will have, right? So you gained, like, what, an eighth of a speedup
or a fourth of a speedup or something like that, right? But if your actual pre-commit hooks were just run Ruff and then run something else that was, like, insanely fast, I bet you would notice a percentage difference more than that. Yeah, probably. But I also like the notion
that this is extendable, and so you can contribute to the project and add these filters yourself to have extra checks be able to be put in. So yeah. And then you can avoid, because no matter, like you said, no matter what you're doing, prek is also running Python as well, because a lot of the hooks are Python code. So for each of those hooks, it's got to start Python for that hook as well. So yeah. Yeah, maybe the more hooks you have as well. Interesting. And the fewer built in. Yeah. Well, I don't really want to cover such a big topic,
Brian. Let's just cover something tiny. Okay, small. Some tiny, tiny I/O. So we all know about asyncio, and one of my, you know, I wrote the State of Python 2025 article that we talked about previously, right? Yeah. And one of my big areas of focus was, we're all going to have to get better at concurrency, right? We have free-threaded Python coming full-throated in 3.14. And we've always had async and await, but as we get more GIL-less execution, you know, you end up actually with, like, parallel code execution, not just something waiting on I/O somewhere, that kind of thing, right? And those kinds of things can be pretty tricky.
So Patrick Kidger wrote this thing called tinyio. And it's, I don't know, a bit of a jab. But it says, hey, have you ever used asyncio and wished you hadn't? Well, here's a 300-line implementation of an asyncio-style event loop in Python that is simpler, right? So basically the idea is, I want to just run some async-and-await-like code and have it do async I/O type of things. And if something goes wrong, just have it all stop.
Like all the async stuff, something went wrong with this whole thing. Just stop. Crash. Like don't have like cancellation and all these other things that are like, complicated and like parent, child, whatever, right? Just, yeah, I just want to run a bunch of things. Something goes wrong. I want to stop running all the things.
And real simple. So that's what this tinyio is. And you basically create a loop, you say run, you give it a coroutine, and then out comes the answer. Yeah. One thing that is weird about it, the effect is basically the same, but what is weird is that it uses yield instead of await. The reason, I would love for it to use await,
it seems weird that it doesn't. However, basically, with yield, and running a bunch of cooperative multitasking, using the yield keyword instead allows them to just, like, say all this stuff stops, right? Basically gets a little bit more control over it. So it is a little unusual in that regard. But yeah, let's see what else. So, one unusual thing: the syntax uses yield rather than await, but the behavior is the same. Await any coroutine with yield, await multiple with yield given a list of coroutines, which is kind of nice, like gather in asyncio terminology, or a nursery in Trio terminology.
And here's kind of what I was saying. An error in one coroutine will cancel all the coroutines across the entire event loop. If the erroring coroutine is sequentially depended upon, then the tracebacks are chained together. Right. It's kind of nice. Yeah. Yeah. Anyway, I think it looks kind of neat.
People can check this out. I don't really know if I would use it or not over asyncio. But yeah, primarily I think the thing is, like, cancellation and error propagation in, like, a super, super simple way, very lightweight stuff. It's obviously not going to be super interoperable. Like, you can't plug it into FastAPI, because that thing already starts an asyncio event loop, same thing for Quart and so on. But for little self-contained things, yeah, it could be cool. It's tiny. Well, yeah, also just thinking about the concepts and stuff.
It might be a good education thing. Right, exactly. If you want to just see, well, what really is happening with asyncio and async and await, well, it's basically this with some compiler magic on it. Yeah. All right. Cool. Over to you. Well, I've just noticed that we started the episode with it very much overcast outside, and now I've got bright sun. And I get this cool evil, evil person cast on me. Oh, that's a perfect lead-in to my next topic. Keep going.
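For readers who want to see the flavor of that yield-based style, here is a minimal, self-contained sketch of the general idea from the tinyio segment above: coroutines written as plain generators that yield the things they wait on, with any exception simply propagating and ending the whole run. This is not tinyio's actual API, and it runs children sequentially rather than scheduling them concurrently; it is only meant to illustrate the concept.

```python
# Not tinyio's real API -- just a tiny illustration of the idea discussed above:
# coroutines are plain generators, "yield coro" waits on another coroutine,
# "yield [a, b]" waits on several, and any exception kills the whole run.

def run(coro):
    """Drive a generator-based coroutine to completion and return its result."""
    result = None
    try:
        while True:
            awaited = coro.send(result)          # resume with the last result
            if isinstance(awaited, list):        # waiting on several coroutines
                result = [run(child) for child in awaited]
            else:                                # waiting on a single coroutine
                result = run(awaited)
    except StopIteration as done:
        return done.value                        # the coroutine's return value


def fetch(n):
    yield []          # pretend to do I/O; yield control with nothing to wait on
    return n * 2


def main():
    doubled = yield [fetch(1), fetch(2), fetch(3)]   # gather-style wait
    return sum(doubled)


print(run(main()))   # -> 12
```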
Well, so Trey Hunner put out an article called The Power of Python's Print Function, and I'm like, I know print. I use print all the time, right? Actually, it turns out that I'm guilty of ignoring a lot of the features of print. I usually just use it to print strings, right? Throw a string in there or whatever. And I knew that you could do other stuff, but I don't really use much of it. So let's go through some of the things that Trey wants to remind us print
can do. One of the things it can do is multiple arguments. So if you've ever gone ahead and just, you know, done that with an f-string, and you're just replacing values within it, just dropping in variables right in the string, you can do that, or you can just give it multiple arguments. Just put a string and a comma and the variable name and a comma and another string, and that's pretty easy too. It's not really, that's a toss-up. I like my f-strings and I'm pretty fast at them. But one of the things you can do with that then is you can unpack, because you can have so many parameters to print, iterable unpacking works. So if you unpack something, say print and a star and arguments or something, it's going to print everything, because it'll automatically put a space
between everything. And that's pretty cool. You don't need to unpack them yourself. And I've been guilty of doing that, of doing, like, a space dot split, whatever. Or no, here it is, the join. Join your arguments or join an iterable with a space, and you don't have to do that. So you can just use print. Another thing is you don't have to just rely on a space. You can pass in a separator value. So maybe you want a colon or a comma or something else separating things. You can go ahead and just pass that in, as print takes a sep value.
That's neat. I guess I never realized that. You can print to a file, of course, I've done that. But I've also just relied on, you know, with open file as file, print it to the file. But you can just do that and print it to a file. So that's pretty easy. But you can also print directly with file equals standard error and print to the error stream. Customizing the end value.
And I've done that, of course, but the end value, I didn't realize you could do this. The end and the flush, actually I use both of these. So the end, like, the separator is between elements, but at the end you might want, maybe you did the separator as a space, but you want, like, a newline, or you want more. You want a whole bunch of dashes or something at the end. You can pass that in. Yeah, I use that one a lot.
I'll put, like, a comma instead of a newline so I can just have it all wrap over. Maybe, I don't know, I want to paste it into Excel or something. Yeah. And then one of the things I do use a lot is this, the flush. So you can pass in flush equals True, especially if you're deep in print debugging and you're doing it in real time. You really want to see it right away,
and you don't want it to just sit in the buffer. You can do flush equals True, and as soon as you print anything, it'll pop out. So yeah, I do that too, especially for print debugging. You're right. And this is a pretty simple topic, but I've got a whole bunch of extras, so I wanted to make room for extras.
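As a quick, hypothetical illustration of the end= and flush= combination just described (the loop, label, and percentages below are made up), one way to get a live, overwriting status line:

```python
# Overwrite the same terminal line for a live status display.
import time

for pct in range(0, 101, 20):
    # "\r" returns the cursor to the start of the line instead of adding "\n";
    # flush=True pushes the text out immediately rather than waiting on the buffer.
    print(f"processing... {pct}%", end="\r", flush=True)
    time.sleep(0.2)

print("processing... done")  # final line, ends with the usual newline
```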
So thanks, Trey, for reminding us how cool print is. All right. Yeah, Pat's got a cool little add-on thought here. Pat Decker says, I use the end value to print things over top of the last one, without a new line, and then when something happens, proceed to the next line. Yeah, that's a really cool idea. Interesting. Yes, indeed. Wow. Okay. Let's talk about the next one. This is going to be good. All right. I want to start with a positive angle before I take you down the dark side. Okay. So I think I already talked about this last time. I talked about converting the web design of Talk Python to Bulma. Did I talk about that? And maybe on Python Bytes it was
a while ago. Yeah. Yeah. That was an insane amount of work to change all of this around. It was probably 5,000 lines of CSS and HTML, not just changing tags, but, like, completely restructuring grid layouts to Flexbox. And boy, it was gnarly. And I used agentic AI on that. And it was amazing. It took hours instead of weeks. And it was definitely something that pushed it from, like, not worth the time, even though I'm using 10-year-old web technology and I really hate designing with it, it's just going to stay that way because changing is too much work, to, well, what else am I doing? I'm a little too tired for other stuff this afternoon, so let's try this. You know what I mean? And so anyway, that is a good side of agentic AI, right? It's like, okay, get in there, just amplify what I need to do.
A lot of times it's really interesting how it pushes something over that line. Like, ah, that's too much work, it's not worth my time or energy or whatever, to, like, actually, I could totally do that now, and really lets you solve some problems that have been, like, nagging, or are more ambitious. But it can go too far, Brian. And here we go. So we've got, over on PCMag by Emily Forlini, vibe coding fiasco. I wouldn't put down what I was doing as vibe coding, but it's in the same category. It's the same tools, let's say.
Vibe Coding Fiasco: AI Agent Goes Rogue, Deletes Company's Entire Database. Okay. This is like a lesson in, let me be the warning for you, rather than, you know, you making these same mistakes or things. So an agentic AI doing the heavy lifting is great, as I just described, until it deletes everything. And what's really funny is, like, the AI, it even knew it. It was like a bad dog that, like, couldn't resist, got up on the table and ate the steak while you were, like, setting the table. The AI admits, this was a catastrophic error in my judgment.
So, all right, let's go through it real quick. All right. So there's this person, Jason Lemkin, who was using Replit. Now, I don't use Replit. This is, like, coding in the cloud. And, like, I absolutely hate that stuff so much. I can't tell you how much I don't like programming in browsers. I'd rather have, like, a proper IDE and a file system. Run it locally. Maybe in Docker. Nonetheless, some people like that feel, right? Like, I don't have to set anything up. I'll just log in with my browser and code. Now this has agentic AI, so let's just have it go. So there's a couple of lessons here. So he said it worked great for a week. When it works, it's so engaging and fun. It's more addictive than any video game I've ever played, like I haven't even touched WoW for a week. You can just iterate, iterate. It's so cool, he tweeted. Well, our friend maybe didn't understand the concept of database backups, snapshots, these kinds of things. Okay. So he said it created, in parallel, a fake algorithm, while telling me it was, like, doing some other things. And without asking me, it went rogue. A few days later, it deleted my
database, Lemkin tweeted. The AI's response: yes, I deleted the entire codebase without permission during an active code and action freeze, it said. I made a catastrophic error in judgment and panicked. Can you believe it? It's amazing. So the Replit CEO confirmed, indeed, it did in fact do this, and we are in fact sorry. So what do you do? Should you not use AI coding? Well, maybe.
Here's some thoughts from Michael. Maybe don't use run-everything mode, unmonitored or without confirmation, in production. I think it's fine to run it on your dev machine. Like, if you have backups for your computer, like, how bad can it possibly go, right? But in production, they were running, hey, AI agent, go do that thing, and I'll be back after lunch. Just go for it. Item two. Oh.
Item two is, maybe you back up your database periodically. Hmm? I just backed up Python Bytes this morning to an encrypted drive. Maybe. Maybe don't just run AI on your one and only copy of data. And three, learn to code a bit, maybe, so you can guide it a little bit, you actually know what it's doing, and it doesn't just go off rogue while you just keep jabbing at it until your data's gone. Okay. Well, I'm sort of new to a lot of this. We've talked about it a lot, but I'm just starting to utilize it some. Can you tell it, like, don't, certain commands you can't do, like, don't commit anything to Git?
I believe you can have a, well, I mean, we're talking about a whole bunch of different products and services as if, like, they all have the same, like, they must. So, like, Cursor and Claude Code and those things, you can have a rules file, or you can have, like, authorized or disallowed commands. You can run it in a mode like, get action verification from me every time.
When it says, I'm going to run this ls command, can I? Yes. I'm going to run this git command. Can I? Yes. I'm going to run this rm -rf command. Can I? No. You know what I mean? Like, and maybe it's not, like, your whole drive.
Maybe it, like, made, because a lot of times it'll make these little, like, I asked it to update this little Git utility that I have. It created a subdirectory in there to initialize a Git repository and did a bunch of stuff to make sure, like, it could detect the things it was supposed to detect. And then it ran rm -rf to get rid of its working content, which is what you want, but not on, like, the root or user root. So I basically have switched to just allow it to do whatever on my computer. But I would not ever do that in production.
Like, I wouldn't code in production in the first place. But second, you also could connect Visual Studio Code or one of those things to your server and just start writing. Like, why do I even need it? That stuff sucks, I'll just write it right here on the server. Like, no, don't do that. You definitely don't do it with an AI. But I believe that the Replit thing, because it only exists up there, is a little bit of a
different deal. And I don't know if it has these, like, allowed commands or not-allowed commands. I've never used Replit. Like I said, I don't like those things. Okay. Yeah. But still a funny story, right? And I love how it's like, I made a catastrophic error in judgment,
and I panicked. That's the AI. What's funny is, like, what was the delete? Deleting the database was probably the panic reaction, not the catastrophic error, and it decided to just remove all evidence. Yeah, exactly. Oh, my God, I've screwed up and I've changed the data wrong. Delete.
I don't know. Wasn't me? That's funny. These are really weird times, but also kind of amazing times we live in. Yeah. I think we're on to extras, right? Indeed. Go for it. I've got some AI-related extras.
Let's see. Remove, add. Some kind of, I guess, mixed results. Mixed. I don't know what everybody thinks about this. But, the Anthropic settlement. Anthropic was using pirated books to train, so there's a settlement.
And I am linking to an Authors Guild article about it: What Authors Need to Know About the $1.5 Billion Anthropic Settlement. That's a lot of money. But so what is it? So apparently they have to pay the authors back some money for pirated books, but it's not clear as to what. So I'm linking to this because it took me a while to find; there's a lot of articles about it. I wanted to know, as an author, what should I care about? And the gist is, it's still in process, and they're going to pay out people apparently $3,000 per title. I don't know if it goes to the publisher or the author or what. I don't know that
detail yet. And there's a list, well, there's not a list yet. They're supposed to come up with a list of books that they copied. But one of the places they copied from was, I can't remember, is it LibGen? So a couple of things. The Atlantic did an article a while back about Meta using pirated books to train AI. So the Atlantic put together a list of authors and books that I'm going to link to also. So you can look at yours.
And I went ahead and stuck my name in to see if I was part of the pirated books. And of course, yes, at least the first edition, not sure if both of them, of Python Testing with pytest. It's like you won the lottery. You're getting $1.5 billion. No, possibly $3,000, but I may get zero, because with that book, just because you're in
LibGen doesn't mean that Anthropic grabbed it, I guess is what I'm saying. So we have to, you don't know. But there's also a link in this what-authors-need-to-know piece, let me try to find it, I'm not going to link to it because it's in here, there's an Anthropic copyright settlement site that you can go to and stick your contact information in, just to make sure you get notified. So I might do that, might not, I don't know. Anyway, so there's that. That's an extra about pirated books with Anthropic.
And I think that more of these lawsuits actually should happen. I don't think it was cool. I think they absolutely should. And I think they're going to; there actually are a bunch going on right now, like Getty Images and others are suing different AI people. The New York Times is suing. Even though I just said very positive things
about agentic coding and LLMs, I think all of the ways that all of these AI companies have gotten their data is straight-up copyright theft. Yeah. They've gone to all of our works, they've just downloaded them and said, we're just going to take that. And regardless of whether they were allowed to, this is especially egregious, because it's like, we went to a place where they have literally pirated books, and we're going to use that. I mean, like, it wasn't just that they found it on the internet. They went to a place where the stolen stuff is and grabbed it. So this is easier. Yeah, it's interesting that they
did that, and then they figured out that that wasn't all the books, so they wanted every book. So after the fact, after they stole a bunch of stuff, or used a bunch of pirated stuff, they went off and then paid some publishers, so they got access to some books through publisher agreements. And again, good luck if authors got any of that money, but that was done. So if they were willing to pay for it, they knew that it was copyrighted stuff. And also, everything on the internet is copyrighted, unless you say, this is Creative Commons or something. MIT or whatever, yeah.
Yeah. And even Creative Commons is not copyright-free. It is a copyright. It is a Creative Commons copyright. Anyway, moving on a little bit, because there's more fun stuff. One of our favorite people to play with AI is Simon Willison. And he's done a whole bunch of cool stuff with LLMs. And he's got a list of tools that I'm going to link to,
a list of tools that he's used, miscellaneous tools built mostly with the help of LLMs. So I'm going to go ahead and link to that, because it's pretty impressive. But he recently said a lot of people complain that these are kind of small, silly utilities. But hey, silly utilities are a great thing to use LLMs for. But he highlighted some with a recent blog post, a collection of tools that he feels are genuinely useful and worth highlighting, with some descriptions.
So I'm going to go ahead and link to that because it's fun. I wanted something positive. I do think the work Simon is doing is really great. But a lot of people are not Simon. A lot of people do bad things with AI. So I also wanted to link to the AI Darwin Awards, which has a bunch of nominees for horrible things people are doing with AI. And it's sort of funny.
This is funny, but also it's serious. And I'm going to highlight a couple of them. One of them, apparently McDonald's AI chatbot had some problems, and there was a default password of 123456 that didn't take people long to figure out, which is insane, that they used that as their admin password. And this one I heard about earlier this year: this showed up in, was it, it's a list of literary fiction, a summer reading list. And this showed up in the Chicago Sun-Times and somewhere else.
15 book recommendations for summer reading. Only five of those books were real. And nobody caught it before it got published in two places. This is insane. Some of the authors contacted, it was the Washington Post and the Chicago Sun-Times. Some of the authors contacted the newspaper and said, I didn't write that book, so I don't know why you listed it.
That's hilarious. Well, maybe they should, on the happy birthday, happy 20th anniversary birthday for Django Talk Python episode, talking with them, there was something that was being recommended so often to be used as a Django feature or function, like they couldn't make it stop, so they just added it. It's like, fine, it's going to do it. So they could just write the book.
They're like, look, you know what? This is, like, number one on the summer read list, let me quick write that book. That's good. And have some AI write it for me. Yeah, yeah, it's not a bad idea, but yeah, it's hard to write a book in a couple weeks. Yeah, well, if AI can write your summer reading list, it can write your book. All right, those are my extras. All right, super. Okay, I got a few. Anthony Shaw, friend of the show, wrote a really nice article called Python has had async for 10 years -- why isn't it more popular? Really cool article, check that out. It's got a lot of interesting ideas there. It also made it to, like,
number two on Hacker News and had 323 upvotes and almost 300 comments. So it's got quite a bit of introspection here, and some of the thoughts are just, idiots from, like, the Go community, like, well, Go does it right, so that's why Python is wrong, it doesn't do it like Go. Okay, I'm not sure how constructive that is. Maybe the next bit of feedback is more relevant. But still, it's got some interesting comments and thoughts in there. So I'll link to that, and then obviously it, like, bumps over to the main article by Mr. Shaw. Last weekend, I was on the PyCon Africa fundraiser, which was a four-hour, 33-minute,
and three-second live stream conference panel type sort of deal. I was on for the last 90 minutes. So Carlton Gibson and some, Shina and Kim van Wyk, friend of the show as well, sort of kicked it off, if I remember that correctly. And yeah, it's pretty interesting. People can go and check it out. They link to a support PyCon Africa page if you want to make donations to help them get to a goal. Basically, a bunch of funding ran out.
Like, the PSF said, hey, sure, we'll give out some grants, and there was, like, way more demand, and they realized, like, oh my goodness, we can't give that much help. I think that's more or less the short version. So check that out. I'll link to that. And then, yeah, I just have it down to jokes. A joke, I'll let you do the last joke.
I'll do the second-to-last joke. It's almost fall, but not really. I mean, maybe it is, but it's going to be a warm day. As you pointed out, the sun is out. You got your moa lookout, right? So, I mean, what better time for, like, an ICEE or something? A Slurpee, if you will. Yeah.
Yeah. So, I can't zoom in on it. So there's the Mountain Dew freeze. There's some kind of Fanta or something that's just off screen that's green. They got the blue one. And then there's one next to it that is, like, the crashed boot sequence of Ubuntu or some embedded Linux. And these are, like, the little pictures that show you what flavor it is.
So the dark blue one here, and the subtitle is, bro, I'm getting the BIOS flavor. And the comments are great, because down here, you go a little bit, and it goes, wait, it's systemd flavor. It goes, man, that looks like Linux flavor to me. systemd. Anyway. And, well, there's your proof. Penguins make the best slushies.
Okay, now you know that that's the Linux one. Oh, it's fresh sudo Slurpee. Fresh sudo Slurpee. What flavor of Linux is that? Is that Mint or Cinnamon? I mean, it's so good, it just keeps going. Bro, I'm getting the BIOS flavor. I love it. But okay, so we've got, like, I mean, cool that you can program the image in the Slurpee machine, but maybe, I mean, we just had, like, little pieces of plastic in there when I was a kid, lit up with what flavor of Slurpee it was. Just, no, not anymore, we don't. I mean, it's just, like, it's technology gone too far. You literally could get, like, a transparent, translucent color thing with a backlight and just slot it in there. Yeah, I don't know. It could just be white even, with, like, a grease
pen to just tell people what flavor it is. Yeah, just some masking tape and a Sharpie. Yeah, I guess you don't have to ship things around. You got a new flavor with a new logo, you don't have to ship, like, a thousand little plates around. But yeah. Yeah, I think it makes for good jokes. I'm getting, that's good, that's the reason. Yeah. All right. I have an AI-related joke, so let's hop over to that. So there's somebody, Grace, on Bluesky, posted an interaction with ChatGPT, and this is fun to do. Is there a seahorse emoji? And ChatGPT crashes out. It says, yes. Tropical fish, there is a seahorse emoji.
Tropical fish, unicorn, actual correction, there is a dedicated sea horse emoji. Tropical fish, fish, dolphin. But the true sea horse emoji is tropical fish. Nope, wait, I'm not going to read all of this, but it crashes out. It finally says that maybe it thinks the crab is. It's not, maybe the mollusk or the seal. Anyway, it just, it's a funny interaction. So I'm like, is this real?
And a lot of people posted that they tried it out. So I wanted to try it out. I tried it out on ChatGPT just this morning. Not 5, I don't know whatever the default is if you just log in. It also crashed out. It's like, oh, yeah, that's a unicorn.
No, one more try, lobster. None of these are it. And then, but it did, at the end, it said, would you like to suggest the seahorse emoji to the Unicode Consortium? I can show you how. I'm sure those people appreciate this comment. This meme. Dude, I got 700 emails in my inbox this morning.
What's going on? So I'm like, maybe it's just ChatGPT that's bad. What does Claude do? So I tried it on Claude AI. And it also, it didn't crash out for very long. It said, yes, there is. It gave me the unicorn. So, wait, I made an error.
That's actually the unicorn emoji. The seahorse emoji is horse. Actually, let me correct myself again, that's the horse emoji. But then it says, you know, I don't think there is, and it gave up really fast. Yeah, you got to look at the fine print down there. It says, wait, I made a catastrophic error
and panicked, I've emptied the ocean. Yeah. Anyway, it does suggest that you could potentially combine a horse and a water wave emoji. It's kind of creative. I probably would do it the other way around, like the water wave plus horse. Yeah, it's true. Maybe,
or maybe water on both sides of the horse, I don't know. Oh, Marco's really nailed it out in the audience. I'd use a wave plus unicorn for the narwhal, like a narwhal sort of equivalent. Yeah, very nice. Yeah, fun with ChatGPT. So yeah, again, we live in weird and amazing times. Oh, weird. All right. Very fun episode. Thanks for being here as always, Brian. Catch you later. Bye. Bye.
