Python Bytes - #479 Talking About Types

Episode Date: May 11, 2026

Topics covered in this episode:
- httpxyz one month in
- Learn concurrency - a deep dive into multithreading with Python
- pip 26.1 - lockfiles and dependency cooldowns
- Python 3.15 sentinel values from PEP 661
- Extras
- Joke

Watch on YouTube

About the show

Sponsored by us! Support our work through:
- Our courses at Talk Python Training
- The Complete pytest Course
- Patreon Supporters

Connect with the hosts:
- Michael: @mkennedy@fosstodon.org / @mkennedy.codes (bsky)
- Brian: @brianokken@fosstodon.org / @brianokken.bsky.social
- Show: @pythonbytes@fosstodon.org / @pythonbytes.fm (bsky)

Join us on YouTube at pythonbytes.fm/live to be part of the audience. Usually Monday at 11am PT. Older video versions available there too.

Finally, if you want an artisanal, hand-crafted digest of every week of the show notes in email form? Add your name and email to our friends of the show list, we'll never share it.

Michael #1: httpxyz one month in
- The first version of httpxyz contained just the fixes to get zstd working, the fixes to get the test suite running on Python 3.14, and some 'housekeeping' changes related to the renaming.
- End of March: a compatibility shim that allows you to use httpxyz even with third-party packages that import httpx themselves, as long as you import httpxyz first.
  - Importing httpxyz automatically registers it under the httpx name in sys.modules, see https://httpxyz.org/httpx-compatibility/
- Fixed a whole bunch of performance-related issues by forking httpcore.

Brian #2: Learn concurrency - a deep dive into multithreading with Python
- Nikos Vaggalis
- "Whenever you are trying to speed up code using multiple cores, always ask yourself: 'Do these threads need to talk to each other right now?' If the answer is yes, it will be slow. The best parallel code splits a big job into completely isolated chunks, processes them separately, and merges the results at the finish line."
- Good overview of thread concurrency with Python and how that's been improved dramatically with free-threaded Python.
- Defines lots of terms you come across, including "embarrassingly parallel multithreading".
- There's a counter example that's nice:
  - Start with a shared resource, a counter, and multiple threads updating it.
  - Attempt to fix with threading.Lock(), which fixes it, but slows things down. Good explanation of why.
  - Proper fix with concurrent.futures, separating the work of different threads so that they can be independent and their results can be combined when they're all finished.

Michael #3: pip 26.1 - lockfiles and dependency cooldowns
- Python 3.9 is no longer supported.
- Experimental: installing from pylock files.
- Dependency cooldowns (see my post about this).
- Lifting several 2020 resolver limitations.

Brian #4: Python 3.15 sentinel values from PEP 661

    MISSING = sentinel("MISSING")

    def next_value(default: int | MISSING = MISSING):
        ...
        if default is MISSING:
            ...

- Takes a name str as a constructor parameter.
- Intended to be compared with the is operator, similar to None.
- Sentinel objects can be used as a type, also similar to None, and can be combined with other types with |.
- Unlike None, sentinel values are truthy. (Ellipses ... are also truthy.) This seems like a strange choice, but I guess it must have made sense to someone. It does force you to use is instead of depending on False-ness, so I guess it'll make code using sentinels more readable.
- Interesting that the PEP was started in 2021, and we're finally getting it this year.

Extras

Brian:
- Before GitHub - Armin Ronacher
- tenacity - cross-platform multi-track audio editor/recorder; learned about it from Armin's article

Joke: Make it myself
- Seems similar to what people think about software now

Links:
- httpxyz one month in
- httpxyz.org/httpx-compatibility
- Learn concurrency - a deep dive into multithreading with Python
- pip 26.1 - lockfiles and dependency cooldowns
- my post about this
- Python 3.15 sentinel values from PEP 661
- Before GitHub
- tenacity
- Make it myself

Transcript
Starting point is 00:00:00 Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds. This is episode 479, recorded May 11th, 2026, and I'm Brian Okken. And I'm Michael Kennedy. This episode is sponsored by all of our stuff, like the pytest book and pytest courses and Talk Python Training and so much else. And also Patreon supporters. Thank you, Patreon supporters. If you've got a topic that you think we ought to talk about, or just are interested in, don't worry about whether or not we've heard it or not.
Starting point is 00:00:34 Just send it on over anyway. You can reach us on Mastodon and Bluesky. And there's a contact form right on the show notes, right there on pythonbytes.fm as well. So you can check that out. Thanks to everybody that watches us live on YouTube. If you'd like to be a part of the audience and submit comments and stuff, you can head on over to pythonbytes.fm
Starting point is 00:00:57 slash live and be part of that. And you don't need to take notes, because you can just get our newsletter. So go to pythonbytes.fm also and subscribe to the newsletter, and we'll send you... We don't send much other than just the links and all the information. Well, there's a lot of content in the weekly email that we send out. It's not just the links to everything we talked about; it's also some background information. And you'll especially want it this time, because we've got some cool stuff. So, speaking of cool stuff, Michael, want to kick us off?
Starting point is 00:01:32 Yeah, let's kick it. Let's kick it off. Let's talk about HTTPX YZ. So recall this is, Michael had created a fork, and we covered why I forked HGTGX, and it was part of the larger churn around in code, I believe, and so on and all of that. So this is the one-month check-in, and it's pretty interesting. I think it's got some ideas worth paying attention to you. So basically a couple things worth talking about is they got the first version out.
Starting point is 00:02:05 It contains some fixes to get ZSTD working, get the test running, housekeeping, and so on. But then more importantly, end of March, some fixes came out. compatibility shim that allows you to use HGPX. This is, okay, this is the thing that really made me want to cover this, because they released a compatibility shim that allows you to use HDPX, Y, Z. even when third-party packages use HGPX themselves. So this is my dilemma. I look at my project and I see that I'm using HGPX.
Starting point is 00:02:35 I'm like, huh, it'd be really cool to switch to HGPX Y, Z. Seems like the right thing to do. And then I go look at my PIP compiled output and it says HGPX because these seven packages are using HGPX. I'm like, huh, well, that's kind of useless. Like, why would I use one when I have to effectively use these others all the time when I interact with those libraries, right? I'm still in the same boat, and I don't see a life raft or any way to get out of that boat.
Starting point is 00:03:00 Here's that thing, and it's pretty interesting. If, all you got to do is if you import at the startup of your app, HGPX YZ, it also tells it the world that it is HGPX. Okay. So for example, you can assert that HGPX is in system modules of HGX YZ, right, that kind of thing. So if you import HGPX somewhere along the way, some other app, does, other part of your app does, it will still return basically a compatible later, even at a type instance. So if you ask, is this thing that I got from HTTP XYZ.
Starting point is 00:03:36 Response of some type, is that actually a HTTPX. Response? Yes. So really interesting compatibility. And honestly, I don't know how it does that. That's pretty interesting. Yeah. How do you say you are, I derive from this class in another library without having the other
Starting point is 00:03:52 library present? You know what I mean? Well, as long as the endpoints are there, you're going to fulfill it. But it is a... Yeah, but it's not duct typing. This is strong type. It's asking is instance, right? That's what I think's interesting.
Starting point is 00:04:04 Yeah. Well, yeah. I don't know. I feel like I'm going to go on our typing rant this episode. So prepare yourself, people. Prepare yourself. So this is not uncommon, like, Pillow has done this. Open CV has done this.
Starting point is 00:04:16 YAML, pi yamil have done this and so on. Well, but done this being it's, it's, they've got a different name that you use you import from the library. But the hijacking some other project's name, that's, I don't know, what I feel about that. I don't know a way around it other than this. No, in this case, I think it's a good thing. It seems like this could be abused.
Starting point is 00:04:42 Yes, agreed, agreed. Okay. Well, welcome to Python imports, I guess, huh? Yeah. So I think this is actually pretty interesting. And also, this is kind of the flow of this project, is like, oh, we noticed that HTTP core is actually the root of many of the problems,
Starting point is 00:04:57 which is also a similar project. So we have to create XDP core XYZ and fix a whole bunch of issues, some serious, and so on. So there you go. Got some nice quotes here. And then finally, it's on Codeberg, which I think is not the right choice.
Starting point is 00:05:14 I understand the desire. But in practice, I don't think it's the right. I think it should be on GitHub, even with all the GitHub issues. Because, you know, you look at it, it's got 39 stars. Like, HTTPX has 15,000. Right.
Starting point is 00:05:29 So on GitHub. I don't know. I think that that's, you know, Mikhail says, look, I'd love to make GitHub a little bit less dominant. Yes, I agree. But, you know, I don't necessarily know that I would try to fight two battles in one. You're already trying to make HTTPX not less dominant. I see it. I have sympathy.
Starting point is 00:05:47 There's, you can get rid of a lot of junk if you don't be, if you're not on GitHub. Yeah, I know. It's like, what are you optimizing for? Are you optimizing for the developers, the project's benefit, or are you optimizing for exposure? I don't think so. I think it's whether or not you want contributors, because the exposure's through like Pi Pi Pi not really GitHub. And maybe a podcast, people talk about it. So yeah, anyway, I think it's the cool project. I'm going to look into maybe use it more now that that gym thing exists. Yeah, cool. And while I'm doing that, Brian, maybe you want to talk to us about concurrency? I'll get back to you on what I've learned. I do.
Starting point is 00:06:25 Yeah. Well, while I'm doing that, yeah. So concurrency. So I ran across this article. It's on Geek Uni. I don't know what that is, but it looks like it's some interesting stuff way back to 2013. But anyway, lean concurrency, a deep dive into multi-threading with Python. And this is the author Nicos.
Starting point is 00:06:48 And I actually really like this article. So I do, I'm familiar with multi-threading in multiple languages, and I don't use it a lot in Python, and I do have a lot of project, or at least a couple of projects coming up that I want to use some concurrency. And so a refresher is a good idea. And like the world has changed a little bit recently in the, I mean, the big recently in Python, because we've got, we've got gildless Python. We have free-threaded. and it changes the dynamics a little bit.
Starting point is 00:07:22 So it's a good time to have a refresher on concurrency with Python. And this is, so we are going to talk about multi-threading in this article, but one of the great things that I like about this is it doesn't assume anything. So it's talking about like in data science or cryptography or data processing, some of the places where you might want concurrency, but there is really lots of places where you want it. And then filling in some of the terms that you may not have, be familiar with, like concurrency versus sequential. We should probably should know that.
Starting point is 00:07:55 But things like global interpreter lock and, you know, I guess we already know those things. But there's a one of the terms is like embarrassingly concurrent. And I kind of forgot what that meant. So it's good there. And that just means, well, it means that you can separate your, the word. into things that can work without any interactions. Those are embarrassingly concurrent. Anyway.
Starting point is 00:08:23 Yeah, one super, super common example of that is, like, video game graphics. Each pixel is computed independent of each other pixels. So if you've got a million pixels, like, as much parallelism as you can just compute each pixel at the same time, they're not going to interfere with each other, right? Yeah. It is funny that it's embarrassingly. It's not really embarrassingly. It's just ideal.
Starting point is 00:08:45 Exactly. that what a glass half empty perspective that is so um i just encourage people to look through this there's a some great walkthroughs about different things and also um one of the uh uh it talks about free threaded concurrency as well if you want to catch up with the gill and getting rid of the gill and stuff um the and even talks about how to install the uh free threaded python which is nice um which of course you're going to just use UV, but there's, even with UV, you have to do the, like, 3.14T. You have to add the T to the end to get it to work. The, one of the cool things I think is that it, there's no, it's not a magic bullet.
Starting point is 00:09:27 You can't just use free threaded Python with threads and have it be magically fast. And that's one of the cool, like Fibonacci is a hard one because you're sort of, you could really do it faster. You depend on, on previous ones. So the, um, it's, there's, it's, it's slightly. interesting with a Fibonacci, but how many times you really calculate Fibonacci? So the better examples below that, and it's a very simple example of just having a bunch of threads update a counter, but it's, and you won't do this, but it'll be, it's a, it's a cool example because it's, it's an
Starting point is 00:10:01 obvious one where you've got shared data. There are a bunch of threads all accessing the counter. So it's a very simple toy example, but it, but you could see how that relates to a lot of other work. The cool thing of One of the interesting things about this is it talks about some of the problems with concurrency, how to get rounded, problems with concurrency and shared data. And one of the first things I reached for because I came from C in C++ plus is Lox when I want to have shared data. And within this example, Lox turns out to be the wrong solution because it actually slows everything down. And, you know, I'm familiar with that with other languages as well. But the cool thing that I've never really understood before that I'm grateful for this article to talk about is the threadpool executor or executor, thread pool, I don't know, threadpool executor of how to collect, to launch a bunch of threads, have them come up with their own answer without talking to each other at all, no shared data, and then collect the return values.
Starting point is 00:11:08 And that's the really good way. So with concurrency, not surprising, the best way to do it is to re-architect your algorithms so that they can take advantage of it. And what I'm excited about is that now we don't have to do, like, we don't have to rely on async I.O. You can still. But for a lot of parallel processing, we can just use threads now, finally, in Python. Threads are okay. So anyway, good article.
Starting point is 00:11:35 It's very exciting. Good introduction. Yeah, cool. And it's very excited. I think it's going to give us a chance to talk about all sorts of fun things. You know, we in Python land have not obsessed about all these design patterns, race conditions, making sure you lock all of your activity. Yeah.
Starting point is 00:11:52 But we're going to have to start. I have to start. And I'm not sure that we necessarily should not have. The guild does not protect you against corrupt, multi-step race conditions. Well, we've, I mean, we've kind of benefited that the threads don't really run at the same time up until now. Yeah, yeah, I know, but like that's totally fine for one operation. But let's suppose it I like, you know, the classic bank example, I take, I debit some money of this account, I do 10 lines of work and then I credit the other account.
Starting point is 00:12:20 Nothing says that the Gill will not stop halfway through. It's just, it's not very likely. So the, you know, those problems are there in the code, which I think is going to be a double whammy. But it also lets us start to think about the fun things like double-checked locking, which is such a cool idea, you know, like, uh, there. I think it's going to be really fun. I think it's a really fun.
Starting point is 00:12:40 So for example... Yeah, so check this out. So I'll put this in the show notes. It's just off a wait or put it and look it up on Wikipedia. So what you can do is you can say if you have a single threaded thing, you can check like if this object is null. Let's say it's a singleton. If this object is null, create a new one. So you only create it once and then return it.
Starting point is 00:12:56 But it's like a lazy create. But if like the default multi-threaded version would be every time you call this function, take a lock, check if it's null. If it is, create it, return it. But you can do things like check outside the lock if it's null and then take a lock and then check again if it's null because maybe two things hit one of them locked, one of them created it. You still want the same one. But you can like skip the locking ever. Except for like during the once, one and only creation. Like there's really fun patterns like that. So I'm kind of actually looking forward to it, you know? I'm not looking forward to the race conditions. So yeah, yeah, yeah. Yeah. Speaking of locks, let's lock vials instead of memory and execution.
Starting point is 00:13:36 So it turns out that PIP 26.1 is out. And it comes with features that I've been very excited about. Most importantly, dependency cool downs. I don't remember when I wrote about this. It was December last year. I wrote about Python supply chains made easy, and I wrote a follow-up one for doing that production. One of the really most important significant bits of this was,
Starting point is 00:14:01 you know, instead of you finding out, let's let others find out that something's bad. And the way you do that is you simply just go over and you just say UV PIP whatever, install, compile, whatever, exclude new or one week. And this was a thing that UV had that PIP did not have, right? So chalk one more thing for PIP. Sorry, for UV over PIP. But now PIP itself actually has the same concept. They have different CLA flags, which I don't know. I feel like they should have just said, well, what are people using now? Let's use that same CLA flag. But so it is. So it has this. Let's go down to the cool downs. Here you go. So you can say dash-dash uploaded prior to, and you can put some kind of time in there, a duration. So P-N-D, where N is the number of days? Kind of, okay.
Starting point is 00:14:53 Anyway, so you say P3D is like prior to three days is the value. And then you can install some. And that says it will only install the older, you know, stuff older than three days. And this is super valuable. It sounds like, oh, Michael, this is like security through obscurity and it's not going to make a difference. Like almost all of these major takeovers are like three hours later it was found, five hours later it was found. Rarely do these things sit around for a week unless they're extremely rare. But the major projects take over like the, oh my gosh, remember which, that LLM library that used for like facade over all the different LLMs.
Starting point is 00:15:26 That one was taken down two hours after it was gotten in there. You know what I mean? So these, I think these are meaningful differences. Why the actual difference in CLI, I don't know. Maybe there's a good reason, but I don't know. Anyway, I'm really glad to see Pip having this. Also, Python 3-9 is not supported. It's end-of-life over six, seven months ago,
Starting point is 00:15:48 so it seems fine that it shouldn't be here anymore. I mean... I don't even remember 3-9. Yeah, I don't know. What was 3-9? We don't even know. So, honestly, I don't know why you would have any support for any older version. You might say, well, Michael, somebody might be on Python-39,
Starting point is 00:16:02 and they got to install stuff. Like, just pin PIP to 25, right? Like, PIP 25 is what they were using when they had Python 3-9 anyway. Just install a different version of P. I think that's like a no-brainer and should just be like a default. New features. This is interesting. Like, first of all, I have a very positive about this.
Starting point is 00:16:19 I think this is a great move, and I'm really happy to see it. Even if I have some criticisms like P&D, which is just weird, and there's already existing ones. So this is also a little like that. So Brett Cannon. Good. Okay. No, what are P&D files? I don't know. No, no, that's how you specify the three days ago. You say, uploaded prior to equals P3D instead of just three days.
Starting point is 00:16:42 Who's going to remember that? Okay, yeah. Like the UV one, you just put in quotes, one week, seven days. Well, I mean, it's easier. Yeah, whatever. The parsing is probably easier or something. I don't know. I know, but it's used by millions of people written by ones. You could ask AI, help me parse this. Anyway, yeah, I got it.
Starting point is 00:16:57 So there's this PEP 751, which was created 2024, and it was, I believe it was accepted 14 months ago. I'm not entirely sure. It's not super obvious, but the resolution is March 2025. So it's been around for a long time, about a year plus, okay? And this is the file format to record Python dependencies for installation and reproducibility by Brett Cannon. He's been working on this forever. And the idea is this is basically the pylok.t.comel. And that's, if you go over here, it's actually in the Python packaging user guide on how to do it.
Starting point is 00:17:31 This is great. So why am I talking about this? Because the deal is PIP now supports installing these somewhat as an experimental feature. So a year after its acceptance, switch standardized pylok.t.comel, PIP 26.1 gains experimental support for such files, for reading and installing from them. Okay, so this is great. I think it's just, you know, PIP is the most common way to install stuff in Python still. even if it's not the most hyped or most rust implemented way to do it. It's still the most common way, right?
Starting point is 00:18:03 And so having this support it is kind of like table stakes, right? So I think this is really cool. There's a bunch of caveats and limitations and so on and so on. But the criticism is like if this is officially accepted by the core developers and Python and the steering council, why did it take a year for PIP to support it? That seems pretty out of sync. That's a long time from, yeah, we have this official feature and our official installer can't use it. it. But I'm glad that it has it. So that's good. Maybe there's more to it than I know again, but who knows.
Starting point is 00:18:32 Some stuff about the older, like newer and older resolvers, the new resolver now does many of the things that the older resolvers were being kept around for. So I believe there's now a plan to remove the older legacy resolvers. Like if you've got this version less than or equal to and you've got that other version and so on, like how do you resolve that, right? Then a few security fixes, not necessarily going to go into it too much, but there you go. A brand new PIP, 26.1. And I feel like this is a big release, which is why I brought it up. It's got the dependencies cooldowns, which I think is critical these days. And it's got lock file support.
Starting point is 00:19:05 Yeah. Yeah, no, it's really cool. And it's definitely moving the right direction. Yeah. Also, shout out, this is, yeah, go ahead. With wacky, CLA names. But okay. Yeah, yeah.
Starting point is 00:19:16 Also, just shout out. This is by Richard C. So, thanks, Richard. Cool. I actually am excited about another PEP as well. So we're going to stick with the PEP thing for a little bit. this time sentinel values and also a blast from the past because um i have searched still up anyway this uh sentinel values is pep 661 and it was created in 2021 and it's five years later
Starting point is 00:19:41 uh so it's taken a while however it is uh resolved as of and and final as of uh for python 315 so we'll get we'll get sentinels finally um there's a there's uh the the the PEP 661 has a bunch of information, but it's also, it says important, this is a historical document, even though it's just finally here, but it's been being worked on for five years. The actual information then is in the Sentinel information now in the documentation. So we can click over and look at, I already had it up, look at Sentinel values. And they're kind of neat. So I'm excited about this.
Starting point is 00:20:23 I don't quite understand some of the information, the decisions, but that's okay. So we often use like none for a sentinel value or some, or NAN if you're doing like floating point stuff. But this, you just sentinels are built in that you just have to, you don't have to import anything. You just say sentinel and then give it a quote, a name like in the example, in the docs, it's missing. So a sentinel parentheses and then the quote missing.
Starting point is 00:20:51 You can name it whatever. But the object, one of the interesting things, it acts kind of like none. You use is, like, if you want to check a variable or an element, if it is the sentinel, you can say is missing. And then in the types, the type is handled interestingly. The object is both the value and the type. So you can say, like default, like some variable is of type int,
Starting point is 00:21:18 or the pipe operator, or missing. So you can have that together. And it isn't missing altogether. Missing isn't the magical sentinel value. You can have anything like not available or whatever you want to call it. So about, I guess, you know, I think that it's very much about time that we have a sentinel value built into the language. Interesting. Okay, so everything so far is sounds great. The part that I don't quite get is the, is that it's, it's not truthy, or it is truthy, that's it. So if you're going through a list, like an array, and, and you want to check to see if it's sentinel, you have to say is your sentinel, you know, if you use the is operator to make sure it's there. You can't, like none, you could just
Starting point is 00:22:08 say if, like, you know, if the value, and if that's true, that means it's not none, because none is, evaluates to false. All the sentinels evaluate to true. Um, so you have to treat it differently. All right. I'm already feeling a pep to enhance the sentinel coming on. To make it, to flip the values. But what do you mean? You know what?
Starting point is 00:22:29 I love the idea of a sentinel. I love it because it lets you check. You know, it's like if you're doing a, you know, finding the index of this thing, this item in a string and it's negative one, which is also an integer, but like that means it's not there, right? Like it's,
Starting point is 00:22:44 all those kinds of checks are just odd. And I love having the idea of a sentinel, a thing where you say, say, I can check this and it's the same type, but if it's, but if it is this, that means actually we weren't able to process it, right? Because yeah, something weird came back. What I mean is, so the way you create the sentinel value, like the, it's a singleton or one-off type of thing, like that's why use is, right? And so you say Sentinel and you give it a string, which I guess is the, what it prints when you try to like string or rep or the value, right? Yeah. And that's cool.
Starting point is 00:23:18 Why not give it a default, I'm even true, I guess, for compatibility at this point you would have to do, but why not give it a second variable you could pass like a falseness? Like is it true or false? Like comma false. And then boom, okay, if this sentinel comes up and I say if value, no, this is not a value because it indicates the absence of a value. You know what I mean? Yeah. Can I continue to rant?
Starting point is 00:23:41 I like that. It wouldn't break the current implementation, but you could make it more usable, I think. Yeah, exactly. Just default the value to true on that second parameter. So you don't have to set it. It doesn't break anything. But allow you to pass false. So if you if it or anything like where it's truthiness, truthiness has to be checked, then it can be what it says it is. Because you might have a sentinel for two different outcomes, right? That's plausible, I guess, although it's not usually the way it's used. Yeah. But like a lot of, I mean, for sentinels to mean, like I've got a big vector of stuff. I want to know if there's, if it's not filled out.
Starting point is 00:24:21 Yeah. I think the default should be false. Primarily, sentinels are used in the case where something didn't work out right. Normally you get a value, but sometimes you get a special value that means you couldn't get a value. That should be false. I agree. Yeah, I guess the one part where I say, you know, maybe it's, maybe it's, it's going to be more readable because it's going to force you to say is missing or is, you know, not there or whatever your value is. But I don't know.
Starting point is 00:24:49 I do like that you don't have to do that with none. You can just say that it's just value as false. I do like the idea. I think that you should get on that whole pep thing for extending that, Michael, won't you? Yeah, before you scroll that screen, let me work on my second version of the pep. I somewhat dislike the truthiness,
Starting point is 00:25:07 although I get it. And like, I'll just give a shout out to the quirky out in the audience that I guess even the Sentinel is an object, which kind of makes it hard to be false, but like you can implement Dunderbool, right? So then you could do it. Empty strings are objects and they evaluate the false. Yeah, yeah, yeah, exactly.
Starting point is 00:25:24 Like you can, there's a magic method to say false. They just return the value of the parameter that goes in and when you created it, right? I think it's like a five-minute implementation and 47-hour documentation and negotiation afterwards. But here's the other one. I think this is more subtle. There's an example saying, like,
Starting point is 00:25:42 How do I express in the type system that something might be of the value expect or the value that is the Sentinel? And this was much, I think this is a much more significant, probably not going to use this after all. Sort of thing in my mind. Let me set the stage. If I have a function that returns none or integer and I say X plus Y where X came from that function, my type checker will say, oh, you cannot plus that with another integer because none, does not allow addition with integer. So you'd say, if is not none,
Starting point is 00:26:17 X plus Y will work without the type checker failing. But if you just say X plus Y, types fall apart, it gets all upset. I'm pretty sure that's what's going to happen here with Sentinels. Your type has to be, I return an int or missing. An int or the Sentinel. The Sentinel does not have type information based on an integer.
Starting point is 00:26:36 And so any int operation on it will fail. And the type checker is going to warn. So you're going to have to add all of, these like I know it said it was an int or sentinel but it's not any longer now it's just an end and I feel like there's a lot of clumsiness that's gonna go go along with like trying to work with things that express themselves as int or sentinel so could we just add another another type another parameter to the creation like it's a sentinel of int it's a signal of we're we're not gonna get it for 315 but maybe we get it later
Starting point is 00:27:08 no no I'm just saying like wouldn't it be nice if you could have it also be the type so you don't have to express in the type system that it is something else. You know, I don't know. To me, I think that's going to be, I think it's going to be rugged. I think it's going to be rugged. So, yeah. I think we're, I think we're getting into like old man yells at cloud territory, so we may have to move on. Cut me off, Ryan. I'm telling you. I told you I was going to be on like a type run this time. But no, I'm glad you covered it because this is interesting. I love the idea of Sentinels. I've used them. I haven't used them very much in Python, but I used them back in my C++ and C Sharp days. And I really like them. I think they provide a
Starting point is 00:27:49 really interesting goal. Yeah. Do you have any extras for us? Yeah, you know what? I got some stuff to talk about. Let me swap it over. I didn't have any extras written down, but then I decided I have extras. So I haven't talked about this yet, but if I go over at Talk Python training, I've done some cool stuff. So, no, there's so many. You warn me of this pop-out thing, but I'm now starting to feel it. So over at Talk Python Training, we now have German subtitles for every single course, for all 280 hours of courses. So, like, I pull up Vincent Wormmerdams' LLM Building Blocks course.
Starting point is 00:28:31 And now if I click on the CC, I get English, German, and this one even has Spanish. How about that? Nice. Yeah, so I can turn on like German transcripts and you know they're on the screen maybe those are too big so I can make them a little smaller Put them up at the top so they're not in the way. So I've done a ton of work on the transcripts over here and now guess what? Vincent's speaking German now You'll show an ein Kleiner Datenbank knocked right let's go So anyway if you were a German speaker check out the courses and turn on the subtitles for German if you like I'm wondering how like when we're going to have AI tools that can
Starting point is 00:29:08 not just do different translations for subtitles, but can we get, like, dubbing in the original speaker's voice? That'd be cool. Hold that thought. All right, that's my extra. We have German subtitles for all the courses at Talk Python, all 280 hours. And I would just like to point, I said this in the blog post, but just to listen to 280 hours of audio is seven weeks,
Starting point is 00:29:31 almost two months of work. Not the translating, not the double-checking, just to listen. two is seven weeks worth of work 40 hours you know eight hours a day five days a week sort of thing so that is a mega mega project i've been working on and you got a lot of courses up i know it's crazy so i'm working through the spanish ones now i've got maybe four courses converted over spanish but spanish is going to be the next language so don't feel left out spanish speaking folks it's coming cool soon yeah that's my extra all right well um i'm gonna roll back to we were talking about
Starting point is 00:30:06 What was it? It's not the place, the non-Github place that we had somebody at. Cudberg. Cudberg, yeah. So on that topic, Armand Ronecker wrote an article about GitHub. So before, before GitHub, basically it's talking about this. Like we're at an inflection point, not sure where we're going to go forward. But before GitHub, it was a weird place.
Starting point is 00:30:33 We had basically source control's role. you're on or do something wacky or it was it was a very different experience so we're and i'll have to agree i am even at the state of gethub right now i'm still using it a lot and other things but i also understand some of the gripes people have um there's political stuff and with AI and everything but aside from that and whether or not they should have been able to just steal everybody's content or intellectual property because open source does not mean please just, you know, train your AIs on it. But anyway, regardless of that, there's another, there's another aspect of it that the GitHub is really built around community building and getting as many collaborators as possible. And it doesn't often work that way.
Starting point is 00:31:26 I mean, we often get a lot of people submitting issues and griping about things, but not a lot of people contributing actual good. code contributions. And it's even different, it's even worse now with a lot of people just submitting AI slop as stuff. So I totally get projects moving away. So there's an interesting discussion about like, you know, GitHub is slowly dying. I don't think it's going to die. I think it'll stick around for a long time.
Starting point is 00:31:55 And I think for many years it'll be still the dominant one. But there is some interesting shifts. At first there were like just a few people. moving over or not a few a lot of people moving over to Codeburg but it was smaller projects but now there's some bigger ones and and one of the things that I'm grateful for is he brought up tenacity and I'm like what's tenacity I don't know what that is and it's a it's a total tangent for the topic but it's a cross-platform multi-track audio editor and recorder that I'd never heard of
Starting point is 00:32:26 and it looks pretty cool so I might check that out it's open source thing anyway that's very cool I've used audacity and I I bet it is a riff on audacity, right? Yeah, yeah, yeah. Now, I'm definitely checking this out. This is cool. Yeah, so I definitely want to check that out, even though I'm happy with my audio situation right now.
Starting point is 00:32:46 But anyway, the interesting thing around this, that I, it's an interesting topic, and it's not a really long article. One of the things he talks about really is when, if we go to Codeburg or other roll your own stuff and have source code like all over the place, we do lose some things because even though you can clone, you can do things that conversations are less.
Starting point is 00:33:09 We have less conversations around issues, around the direction of some projects. And if it goes away, if somebody deletes their repo, we lose all of that. So there's some archive possibly needed, some history that would be helpful. Anyway, interesting discussion around that here. So thanks, Armin.
Starting point is 00:33:28 Yeah, a very nice article, Armin. And I've really appreciated his thought pieces and his writing over the last year or two. So good job. Yeah. Also, Hugo out in the audience said, hey, don't forget, Python 3145 is now out with the new, which is actually the old GC,
Starting point is 00:33:47 and 315 beta 1, the feature freezes out. So, yes, thank you so much for that. Like, most notably, the garbage collector GC has changed back to the full pausing one, not the incremental one. Everything old is new again. Yeah. Exactly. Here we go.
Starting point is 00:34:02 back around. All right. Tell us a joke. Ready for a joke. So I've been thinking about like everybody's saying, oh, now that with AI and everything, I can write my own software. I don't need software engineers. So it reminded me of this XKCD. So I'm bringing this up. So it's a make it yourself, make it, make it myself, XKCD. And there's two people looking at a couple boxes and I'll just read it out to you. It's mostly text. They want $80 for this. I could make one myself for, $10 in parts, an hour of work, a trip to the hardware store, another $30 for parts, another few hours of work, two more trips to the store for $20 more in parts, another hour to redo the first hour of work because I messed it up, and $80 to buy this
Starting point is 00:34:47 when the one I made breaks. So that is so true. It is 100% true. Yeah. Yeah, exactly. And I don't, that's why I think this whole vibe coding thing, you know, there was the SaaS apocalypse. And I think that's going to be a very focused and very narrow impact because it's like, yeah, you can make it and you got to back it up and maintain it and run it and operationally and secure it.
Starting point is 00:35:11 And you know what? Let's just let them run it for 10 bucks. Yeah, I think I'm hoping that it will drive, I mean, not, I think it'll change the dynamics of how people make money about software. But I do think that I'm hoping we get more good SaaS out there and some of the, even if it's helped. being made by AI, but the good ones will stick around, even if they have to get re-coded. Yep, indeed.
Starting point is 00:35:38 All right. Well, thanks a lot for a great episode. Thanks, everybody for listening. Yeah, thanks. Bye. See you later.
