Python Bytes - #459 Inverted dependency trees

Episode Date: November 24, 2025

Topics covered in this episode:

- PEP 814 – Add frozendict built-in type
- From Material for MkDocs to Zensical
- Tach
- Some Python Speedups in 3.15 and 3.16
- Extras
- Joke

About the show

Sponsored by us! Support our work through:

- Our courses at Talk Python Training
- The Complete pytest Course
- Patreon Supporters

Connect with the hosts

- Michael: @mkennedy@fosstodon.org / @mkennedy.codes (bsky)
- Brian: @brianokken@fosstodon.org / @brianokken.bsky.social
- Show: @pythonbytes@fosstodon.org / @pythonbytes.fm (bsky)

Join us on YouTube at pythonbytes.fm/live to be part of the audience. Usually Monday at 10am PT. Older video versions available there too.

Finally, if you want an artisanal, hand-crafted digest of every week of the show notes in email form? Add your name and email to our friends of the show list, we'll never share it.

Michael #0: Black Friday is on at Talk Python

What's on offer:

- An AI course mini bundle (22% off)
- 20% off our entire library via the Everything Bundle (what's that? ;) )
- The new Talk Python in Production book (25% off)

Brian: This is peer pressure in action

- 20% off The Complete pytest Course bundle (use code BLACKFRIDAY) through November, or use save50 for 50% off, your choice.
- Python Testing with pytest, 2nd edition, eBook (50% off with code save50), also through November
- I would have picked 20%, but it's a PragProg-wide thing

Michael #1: PEP 814 – Add frozendict built-in type

- by Victor Stinner & Donghee Na
- A new public immutable type frozendict is added to the builtins module.
- We expect frozendict to be safe by design, as it prevents any unintended modifications.
- This addition benefits not only CPython's standard library, but also third-party maintainers who can take advantage of a reliable, immutable dictionary type.
- To add to existing frozen types in Python.
Brian #2: From Material for MkDocs to Zensical

- Suggested by John Hagen
- A lot of people, me included, use Material for MkDocs as our MkDocs theme for both personal and professional projects, and in-house docs.
- This plugin for MkDocs is now in maintenance mode.
- The development team is switching to working on Zensical, a static site generator, to overcome some technical limitations with MkDocs.
- There's a series of posts about the transition and reasoning:
  - Transforming Material for MkDocs
  - Zensical – A modern static site generator built by the creators of Material for MkDocs
  - Material for MkDocs Insiders – Now free for everyone
  - Goodbye, GitHub Discussions
- Material for MkDocs is still around, but in maintenance mode; all Insiders features now available to everyone.
- Zensical is / will be compatible with Material for MkDocs, and can natively read mkdocs.yml to assist with the transition.
- Open source, MIT license; funded by an offering for professional users: Zensical Spark

Michael #3: Tach

- Keep the streak: pip deps with uv + tach
- From Gerben Decker: We needed some more control over linting our dependency structure, both internal and external. We use tach (which you covered before, IIRC), but also some home-built linting rules for our specific structure. These are extremely easy to build using an underused feature of ruff: "uv run ruff analyze graph --python python_exe_path .".
- Example from an app I'm working on (shhhhh, not yet announced!)

Brian #4: Some Python Speedups in 3.15 and 3.16

- A Plan for 5-10%* Faster Free-Threaded JIT by Python 3.16
  - 5% faster by 3.15 and 10% faster by 3.16
- Decompression is up to 30% faster in CPython 3.15

Extras

Brian: LeanTDD book issue tracker

Michael: No. 4 for dependencies: Inverted dep trees from Bob Belderbos

Joke: git pull inception

Transcript
Starting point is 00:00:00 Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds. This is episode 459, recorded November 24th, 2025, and I am Brian Okken. And I am Michael Kennedy. And we want to thank everybody that supports us through the stuff we sell to you guys and provide services like the courses at Talk Python Training, The Complete pytest Course. Our Patreon supporters, of course, we love you. And also, like, everybody that buys books and stuff. Connect with us if you'd like to, like to send us some topic ideas
Starting point is 00:00:34 or have a comment about what we said on the show. Feel free to connect with us. There's a contact us form on pythonbytes.fm. But you can also reach us on Bluesky or Mastodon, and those links are in the show notes. If you are listening to us and occasionally would like to maybe see what we look like
Starting point is 00:00:54 or see the topics that we're putting on the screen, you can join us on YouTube at pythonbytes.fm/live. And there you can also subscribe to the show and get notified when the upcoming ones are going on. Usually Monday at, we need to change this, usually Monday at 11, but all the older versions are there too. And if we ever change it for personal reasons, you can get notified there. And of course, please sign up for the newsletter. We love to send you all of the links of the things we talked about in the show, right to your inbox, and we don't spam you. It's just the show notes, really. What do you have for us up first, Michael?
Starting point is 00:01:36 Actually, we have some pre-topics. We have a topic zero topic. How about that? Yeah. Topic zero. So I don't know what people notice. I mean, if you've read your email or listen to the radio or anything like that, you might know that it is the week of Black Friday. And I think there's a lot of restraint on our end, Brian, by keeping it to the week. Like, I was getting Black Friday sales in October, and this is wrong. This is, this is disrespecting Halloween amongst other things. Yeah. Anyway, Black Friday, I launched the Black Friday sale over at Talk Python.
Starting point is 00:02:10 And this year is different than last year. Traditionally, I've just sort of done, hey, get the Everything Bundle, get all the courses, own them forever sort of thing. But I wanted to do a couple of different deals. So now there are three options you can pick from. You can pick, I have what's called the AI Python bundle. So two of our most popular courses, Agentic AI Programming with Python, one that I wrote.
Starting point is 00:02:30 And that's if you want to use things like Claude or Cursor or Junie or whatever to build applications alongside you. And we have LLM Building Blocks in Python by Vincent Warmerdam, which is basically, if I'm going to build something that uses an LLM as an API. Really cool stuff in both courses, really well reviewed and received. You can get that for like half off as a special mini bundle. Then the whole library thing is still on.
Starting point is 00:02:58 You can get it at super discounts. And I'm also offering the Talk Python in Production book. It's doing really well; it's been at number one in software engineering on Amazon for a good while, which is amazing. It's nice. It's got pure five-star reviews on Gumroad, at least as of yesterday. I haven't looked today, but I think that's still true. So really well received.
Starting point is 00:03:19 And that's also on sale at a discount. So that's Black Friday at Talk Python: talkpython.fm/black-friday. Check it out, please. It's, you know, as you said at the top, Brian, it's a really great way to support the show. Make sure that we keep doing what we're doing here on the podcast. And with that, how about you?
Starting point is 00:03:36 Well, I was actually not planning on doing a Black Friday thing, but I saw your topic in there. So I thought, yeah, why not? I'll do a Black Friday thing, too. So The Complete pytest Course bundle, I put that on sale. And you have to use a coupon code: if you go and say, yeah, I want to get this, and you type in BLACKFRIDAY and hit apply, then it's a 20% discount. But, you know, I don't think that looks big enough.
Starting point is 00:04:05 I think we'll also do a save50. So if you do save50, apply that, that's better. It's $29 now. So you get half off. 20% off or half off, your choice. So doing that, it'll be in the show notes. But, okay, so I was trying to match Michael, because he was doing 20% on some of the stuff, so I did the BLACKFRIDAY for 20%. But then I read my newsletter from Pragmatic, and Pragmatic is doing 50% off for Black Friday. So you can get all Pragmatic books; if you use save50, you get 50% off, and that includes
Starting point is 00:04:46 Python Testing with pytest. So you can grab the video course or the book or both. So those are my Black Friday deals. Awesome. And yeah, I mean, I know sometimes it feels overly salesy, but I did not do Black Friday one year, and I got a ton of email like, Michael, where's your Black Friday sale? I've been hoping that you would do one. So I know at least a subset of people genuinely look forward to it. So I appreciate it. Personally, if there's something I really want for my learning path, I try to get it whenever. But there are some things where it depends on your budget, and there are some things that I don't need to have right away but would like to use.
Starting point is 00:05:24 And there's a lot of stuff that I've purchased on Black Friday sales. And I appreciate that independent retailers, independent people like you and me do the Black Friday thing for people that, the full price is just a smidge out of their price range. That's good. Yeah, absolutely. All right. Let's talk about Python stuff.
Starting point is 00:05:43 I mean, that was Python stuff, but, you know, other Python stuff. Pure Python. And I'm talking language-level Python. Okay. There's a lot more to Python than just the language. I think some people get confused. I saw this comment on Reddit of all places. Somebody said, hey, what podcast should I listen to to learn Python?
Starting point is 00:06:00 And like, why would you do that? That's stupid. Why would you listen to a podcast? Like, just, it's a simple language. Just go here and take the tutorials. Like, a little bit broader. But today, no, actually, we're talking built-ins. We're talking runtime.
Starting point is 00:06:11 So PEP 814. This one is a really natural addition now that Python 3.14 came out. This is Add frozendict built-in type. It comes from Victor Stinner, who does a ton of performance stuff as a core developer. Thank you for that, Victor. And the idea is that when you have a frozen type, you get a lot of guarantees that you can work against that make your life easier, fewer bugs, et cetera, right?
Starting point is 00:06:39 With a regular dictionary, you might say, oh, I need to return this dictionary to a thing, and you don't do a copy, you just hand it over. And then somebody changes it. You're like, wait, you weren't supposed to change it. I'm just trying to give you some data, right? So these frozen types are great: if you don't want something to change, it's not going to change.
Starting point is 00:06:53 Also, the code that would try to change it would get an error, right? So you're like, no, you're not supposed to do this, figure something else out, right? The intention is sort of conveyed through the exceptions it would cause, I guess. But also, it's really good for concurrency, which is the link back to 3.14. Now that we have true parallelism, we, I don't know how many people are doing this, but find yourself a re-entrant lock and use it in your code if you're writing true parallel code, right? Because race conditions, deadlocks, well, it starts with race conditions, ends with deadlocks, and hopefully fixes. Anyway, you want to do that, and you've got to lock mutable code. But if you're reading,
Starting point is 00:07:27 if two things are reading from the same data structure and it's not changing, well, there ain't no problem with that. You can read with a hundred threads in parallel, and they're all going to get the same answers, deterministically, right? So having frozendict adds dictionaries as a type that is basically purely thread-safe, if you use a frozendict instead of a regular one. Cool, right? That is cool. I wonder if this will ripple up into other data types, like classes and stuff. That is a very good question.
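You can sketch that read-only behavior with today's standard library. frozendict itself isn't in any shipped Python yet, so this uses types.MappingProxyType as the nearest stand-in; the settings values here are made up:

```python
from types import MappingProxyType

# frozendict is still only a PEP; today the closest built-in stand-in is
# types.MappingProxyType, a read-only view over a regular dict.
# The settings values are made up for illustration.
settings = MappingProxyType({"host": "example.com", "port": 8080})

print(settings["host"])  # reads work fine, from any number of threads

try:
    settings["port"] = 9090  # any write raises TypeError
except TypeError as err:
    print(f"blocked: {err}")
```

One caveat: MappingProxyType is only a view, so code that still holds the underlying dict can mutate it, which is exactly the gap PEP 814's frozendict is meant to close.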
Starting point is 00:07:52 So I have an answer for you as well. So there are some interesting things. I link to the discussion, not the PEP. And there's a really thoughtful conversation by lots of people that you know from the show. It's like, well, okay, that's cool, but do I equate a dictionary with a frozendict? Because what you'll see from the PEP is that frozendict actually derives from something that is not a dictionary. So it's not, if you said
Starting point is 00:08:14 isinstance, is a frozendict a dictionary, that would say no. I'm not so sure how I feel about that, but that's how it is. So the question is, well, can you say equal equal? And is it ordered and so on? It is ordered, but is that order dependent in the equals and so on, right? So there's a lot of stuff coming along, and here you can go see all the conversation. So I also, just like you, Brian, wondered, well, what else is frozen in Python? Like, what did we already have, you know, before we had a frozen dictionary? So I put together a list of existing frozen types and structures in Python. Tuple. Frozenset was the motivation for frozendict.
Starting point is 00:08:47 Like, well, if we got a Frozen set, why came in a version dictionary? You know what I mean? String, obviously, bytes, ranges, memory views. Memory views are interesting to put over like a buffer, but you don't want it to be written to. But you've also got mapping proxy types. You've got a record class in the form of data classes. And then data classes, you can say Frozen is true. So this is as close as you're going to get.
Starting point is 00:09:09 probably to what you were asking for. Right, frozen classes. You technically could more or less do that if you had private variables and only read-only properties, but that's a lot of work. You might as well just create a frozen data class. I don't know. But yeah, so there's a whole bunch of stuff that I put in here that people can check out. I put a little extra data sheet because I'm like, huh, I also want to know this.
Starting point is 00:09:33 Yeah. Yeah. All right. I'm looking forward to Python 3.15 being more frozen. How about that? More frozen. Yeah, more frozen might be faster, too. Yeah, it definitely has the possibility for faster, or safer. You could more easily, yeah, document it, though. Yeah, gotta document it. I'm bringing up an interesting topic that was suggested by John Hagen. Thanks, John. Material for
Starting point is 00:09:57 MkDocs. So I use MkDocs, and I use Material for MkDocs. Material for MkDocs is a theme for MkDocs, but it also adds some cool stuff to it. So it's like a theme-plus-plus or something. And I use it for both private projects, personal projects, and at work, because it's just so easy to work with. Anyway,
Starting point is 00:10:20 there's some news about Material. So the news that came out is that the team behind it is putting the Material for MkDocs project in maintenance mode. So for new updates, I don't,
Starting point is 00:10:34 well, actually, I don't know, because I didn't really fully read this. But let's see, changes to the issue tracker... where people are encouraged to open bug reports... okay, they'll fix critical bugs and security vulnerabilities. That's right. That's fair, because a lot of people use it, including me, so I care about that. But why are they putting it in maintenance mode? It's because of the effort they're putting into this: they're switching over to a new project called Zensical. Interesting name. A modern static site generator built
Starting point is 00:11:14 by the Material for MkDocs team. So there's backwards compatibility, and we're going to link to a post, it's actually a four-post series, talking about the transition, because it affects a lot of people. So they're transforming Material for MkDocs, they're just starting Zensical, they've got a Material for MkDocs Insiders program, and they're closing down GitHub Discussions on Material for MkDocs. So this is, it doesn't seem like it's profit-driven, because they were actually making a decent amount off of GitHub sponsorships for this, but maybe
Starting point is 00:11:54 that's part of it. But the gist of it is, at the top, they said they want Zensical to overcome technical limitations of MkDocs. There are so many people that were using it, though, and not using any other form of MkDocs. And in part of this discussion, they were talking about how a lot of the stuff in the forum and a lot of stuff in GitHub issues were really not their problem. They were really issues with MkDocs or issues with some other dependencies they were using. So Zensical is going to just wrap it up.
Starting point is 00:12:28 It's going to be a replacement for MkDocs and Material for MkDocs, and I think a few of the other plugins that used to work with all of this; they're pulling those in so they have a support model, so that they can actually support everything that matters to people that use it. The new website, zensical.org, is just fun to look at. There's this cool animation thing. Yeah, it's really cool. So for people listening, you've got to check it out.
Starting point is 00:13:00 Zensical.org. It's this black background with this kind of yellow laser light thing going around in circles, or, I don't know what that shape is, like a donut. But, thank you, Henry Schreiner. Henry jumped in on the show: it's also a Rust rewrite, and it's five times faster. So, a lot faster. I'm pretty excited to try it out. I haven't tried this yet, but since I'm a heavy Material for MkDocs user, I'll definitely be checking this out. So one of the things I was curious about is the whole, like, how does this work? Like, is it going to go away, or what's the open source model? It's still going to be open source and MIT-licensed, and there's a compatibility
Starting point is 00:13:51 path; they're trying to make it backwards compatible, so when you switch over, you can use your mkdocs.yml file just as is, and hopefully it works. I haven't tried this yet, and, you know, I actually haven't customized it too much, so it's probably going to work fine for me. But I do want to check out this backwards-compatibility thing to switch it over. I like some of the cool features that they're trying to build in. And faster is great.
Starting point is 00:14:17 The other thing, so how are they making money, is the Zensical Spark model. So what's Zensical Spark? It's basically, actually,
Starting point is 00:14:56 I think it's more than this, but part of it is companies that use it, or groups, teams that use it, even startups. It's not just for enterprise people, but groups can go ahead and sign up for this and have a better support model. So they can prioritize supporting people that are paying them, which makes sense. But those fixes will help everybody, so that's good. One of the cool things that they're doing also during the transition is, with Material for MkDocs, one of the ways they made money is they had both an open version that's just free for everybody, and then they would release new features to an Insiders group. So you could get new features right away, and then they would sort of trickle into, like, the general population features. I don't know if there's going to be something similar for the Spark thing. But for Material for MkDocs, they've just opened up the floodgates. Anything that was in the Insiders is now part of the open source thing. So if you want to stick with that, you get all the features.
Starting point is 00:15:35 So anyway, just news for people that use MkDocs. Yeah. Nice. I'm optimistic here, if their website's anything to go by. No, I'm optimistic. I think it's the kind of thing that'll be really polished and really nice if they get a decent number of customers.
Starting point is 00:15:52 So good. And I do like that they're actually putting the people on there, the people with their pictures. These are the people behind it. It's not some
Starting point is 00:16:04 faceless people making money off you. Exactly. Cool. That's right. Keep Python independent. Okay. So I want to talk about something that was sent in to us by Gerben Decker. And this is sort of the hat trick,
Starting point is 00:16:20 if you will, Brian. Two weeks ago, you talked about pipdeptree and uv pip tree, right? And then last week, you talked about deptree. This time I want to talk about a couple of things. I want to talk about uv.
Starting point is 00:16:35 I mean, you've got to look at this. This is a bit of a statement here. I know, it's ruff: ruff analyze graph, and you give it a virtual environment and some place to start, a file or something like that. And it will generate basically an architectural layer diagram. This portion of your application depends on these other modules, which depend on those modules, and is there any form of circular weirdness or whatever, right?
Starting point is 00:17:02 So if you want to understand how that works, this is, like, it'll give you just a list. So this thing, Tach, which I'm covering and he sent in, is a Python tool to visualize and enforce dependencies in your architecture, and it basically takes that output that I described and then turns it into a UI, okay? Okay. So there's a little graph that shows how it works, but basically you just create a TOML file, and then you run a command, and it will scan your code.
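For reference, a Tach config is a small TOML file roughly shaped like the sketch below. The module names are made up to match the layers discussed here, and the exact schema is from memory, so check the Tach docs before relying on it:

```toml
# Hypothetical tach.toml; module names mirror the layered app discussed
# on the show, and the schema is a best-effort sketch, not verified.
source_roots = ["src"]

[[modules]]
path = "views"
depends_on = ["view_models", "cms"]

[[modules]]
path = "view_models"
depends_on = ["services", "infrastructure", "db"]

[[modules]]
path = "services"
depends_on = ["infrastructure", "db", "cms"]

[[modules]]
path = "infrastructure"
depends_on = []
```

The point of declaring the layers is that the check command can then fail the build when, say, infrastructure starts importing from views.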
Starting point is 00:17:30 It will generate, more or less, that output. And then you can upload that, not your code, just this little TOML file that basically has module names and nothing else in it, to this service that will turn that into a graph. Okay? So I did that by creating one for a project that I'm not talking about yet, that I'm working on here, and you can see, I've got this cool little thing,
Starting point is 00:17:53 and it shows, like, how is it put together? What pieces depend on what, right? What depends on services? And then if you interact with it, you can click on, like, I want to see what views depend upon. Oh, that's a lot. But if I click on view models, you can see there's a nice layered architecture, right?
Starting point is 00:18:08 View models depend on services, infrastructure modules, and DB models, but they don't depend on the stuff above them that depends on them. So you don't get these weird cycles and other things. And there's, like, a little CMS component. It's used by the views, it's used by the services, and nothing else.
Starting point is 00:18:24 So you can get this, like, cool little understanding of different parts of your app. What do you think? This is cool. So, maybe because we were talking about pipdeptree before: this is not the third-party stuff. This is your own code. So how does your own code, within itself, like a set of submodules, depend upon each other? Because if you look at this,
Starting point is 00:18:47 when you're actually looking at the code, this view model section probably has 30 or 50 classes in it. This infrastructure bit has maybe 20 different things, like utilities for web stuff, URLs and so on, and text parsing and numeric converting libraries and, you know, stuff like that. This DB thing has, I think, five database models and a helper class or, you know, something. But they just show up as these blocks. So you can kind of get a sense, like, how do these things interact? The lines are drawn by analyzing all the individual files, I believe, seeing which ones point to things from other categories. Oh, nice.
Starting point is 00:19:21 This is cool. Yeah. So I uploaded this picture. I created these shareable graphs, so you can actually play with this graph yourself as well. Yes. And then Christian asks, is this unmaintained? Yes, I believe Tach has become unmaintained, but you can still run it, and it still looks
Starting point is 00:19:38 like it's okay. So that's a good point. I did notice after I put this together that somewhere it's got, like, a... although the GitHub repo, Tach that is, at least doesn't have an explicit statement that it's gone into maintenance mode or anything. But if you look at Tach, it points to gauge.sh. And if you go to gauge.sh, this product has changed and it's becoming some, I don't know, some AI thing that means nothing to me.
Starting point is 00:20:06 So, I don't know. But I imagine they'll probably keep it running, and apparently it's also open source, so you could go and fork it and just run it yourself or whatever. I don't know. So there is some component in this whole chain that is in question, and it's the visualization of it, not the parsing. I think Tach, which does the parsing, still works fine, but then it uploads to Gauge. And the Gauge thing is in question, about whether that will still work. But, you know, you could always go and fork it yourself.
Starting point is 00:20:34 I mean, how hard would it be to make? Given the file, which I don't have handy, I can pull it up, but given a file that lists these relationships, how hard is it to graph it, right? You could AI your way to something that draws these pictures out of Tach in, like, an afternoon, if you really had to. The visualization is left as an exercise for the reader. Exactly. Like, you could visualize it if you knew the right series of prompts. I don't know.
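As a sketch of that afternoon project, here's a tiny text-only renderer for the kind of module-to-module mapping Tach extracts; the module names are invented:

```python
# Toy sketch: render a module dependency mapping (the kind of data Tach
# extracts from your code) as an indented text tree. Names are made up.
deps = {
    "views": ["view_models", "cms"],
    "view_models": ["services", "db"],
    "services": ["infrastructure", "db"],
    "infrastructure": [],
    "db": [],
    "cms": [],
}

def render(module, indent=0, seen=frozenset()):
    """Depth-first walk of the graph; `seen` stops accidental cycles."""
    lines = ["  " * indent + module]
    if module not in seen:
        for dep in deps.get(module, []):
            lines.extend(render(dep, indent + 1, seen | {module}))
    return lines

print("\n".join(render("views")))
```

From there, feeding the same mapping to Graphviz or a JS graph library is the only extra step to get the pretty clickable version.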
Starting point is 00:20:56 Or you want to write it yourself. No, anyway, that's a good point, Christian. Thank you. All right. Over to you, Brian. Well, I wanted to talk about upcoming speedups in Python. So there are a couple of news articles that came by that I noticed, talking about some efforts to speed up Python in the future.
Starting point is 00:21:14 So we're at Python 3.14 right now, 3.15 is in the works, and 3.16 later, right? So one of the things, this is a post by Ken Jin, A Plan for 5-10% Faster Free-Threaded JIT by Python 3.16. And the highlight is 5% faster for 3.15, 10% faster for 3.16. And some of the names popped out as to why I should pay attention to this. This was brought up during the Python Core Dev Sprint in Cambridge,
Starting point is 00:21:48 hosted by Arm. They planned this project. The planners of the project that were present are Savannah Ostrowski, Mark Shannon, Ken Jin (this person), Diego Russo, and Brandt Bucher, accompanied by other CPython core team members as well. And also, I'm like, well, 5% to 10%, that's not that big a deal. It says, you might wonder, 5% seems awfully conservative.
Starting point is 00:22:15 However, note that this figure, is a geometric mean, the number can range from slower to significantly faster. All numbers are high performance figures. So I think that means they're optimistically thinking it's going to be a lot faster. So it's interesting that the LLM involvement here is a little interesting. Plan for 315 profiling support via LLVM 21. Oh, not LLLN. That's probably performance-driven compiler optimization stuff.
Starting point is 00:22:49 Yeah. Yeah. Where you profile it, and then the compiler says, let me look at the profiler output and actually re-optimize based on how you actually ran, you know, like that sort of thing. I bet it's something like that. Yeah, but it's also looking forward to some support from LLVM 20 and then LLVM 21. So some of those lower-level improvements are going to help with the JIT.
Starting point is 00:23:13 So trace recording, better machine code, register allocation, top-of-stack caching, reference count elimination. That's an interesting one. More constant promotion, and basic free-threading support. So I'm looking forward to seeing, there are some graphs and some more information that's really above my pay grade. Error bars and stuff? There must be min and max things. We don't know. Yeah, it looks like it's got, like, those bar graphs with the I's,
Starting point is 00:23:43 like max and min and stuff. So, oh wow. Yeah. So, the potential, yeah. Anyway, interesting. I'm glad that we're getting some JIT speedups. So this is cool. Another speedup, this is from Emma's blog, I forgot to grab what Emma's last name is, but Emma's blog: Decompression is up to 30% faster in CPython 3.15. This is cool. So, you know, this is based on Zstandard, Zstandard being added to the standard library. So really, I didn't get why we cared about Zstandard before, but hey, if we can decompress that much faster, that's great. There's compression and decompression happening all over the place that you don't even really know is going on. You know, I actually ended up, for the very first time, I think, ever, using compression in my
Starting point is 00:24:42 Python app, not because I somehow received a zip file or a tarball or something and wanted to process it, but as part of the operation of the app. So I'll run this by you; tell me what you think. So this project that I'm working on, that shall not be named, has to store text, about 250K of text per database record for, like, a certain thing. But you never search it, you never index it, I mean, you don't have a database index.
Starting point is 00:25:17 Like, you find it through other keys, and you just want to process it. And I'm like, well, how long does it take to compress and decompress that? How much smaller does it get? So I used xz, which is also built into Python, and it goes down to about a tenth of the size. So for caching and database records, instead of being 250K, it's like 25K. And you want to process it as part of a web request? You're like, yeah, let me just unzip it and send it back to you or whatever. And, yeah, I've found it really useful. So I'm all of a sudden super excited about fast decompression inside of CPython. Yeah, little bits matter if you do the right ones.
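A rough sketch of that compress-before-storing idea with the stdlib lzma (xz) module; the sample text and the ratio it achieves are illustrative, not numbers from the actual app:

```python
import lzma

# Stand-in for a ~250K text blob; real prose compresses less dramatically
# than this repetitive sample, but the workflow is the same.
text = ("GET /api/items 200 12ms user=42 cache=miss\n" * 6000).encode()

compressed = lzma.compress(text)
print(f"{len(text):,} bytes -> {len(compressed):,} bytes "
      f"({len(compressed) / len(text):.1%} of original)")

# Round-trips losslessly, so it's safe for database records and caches.
restored = lzma.decompress(compressed)
assert restored == text
```

On the show's numbers, the same idea took a 250K record down to roughly 25K; the exact ratio depends entirely on the text.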
Starting point is 00:25:46 Yeah, and by the way, it's, like, millisecond-level time to do that. So you might think, well, it's going to make your site slow, right? But if you can store 10 times as much data without running out of disk space or cache or whatever, like, that also helps a lot, you know. Yeah, it helps with how much you're paying for database services as well. Yeah. All right. One more thing from Henry on the compression thing: zstd
Starting point is 00:26:12 is used by python-build-standalone. So having the ability to handle that without dependencies is great. And faster next year, too. Yeah, thanks for the background info there. And shout out, I'm just such a fan of python-build-standalone as a way of getting Python on a computer, your computer, a server, whatever. It's just ridiculous with uv plus python-build-standalone now, right? Seconds, not minutes, right? You know, uv, make a virtual environment, get a version of Python.
Starting point is 00:26:35 Yeah, it just changed, it's one of those things that changed everybody's workflow. And I don't even think about having to install a different version anymore. It just happens. It just does. All right. You got any extras while you've got your screen up? Sure. The one extra I've got is that with the Lean TDD book, I'm putting it out while I'm writing it.
Starting point is 00:27:00 This week, I'm doing Thanksgiving prep. So I'm not going to release a chapter this week. But there'll probably be one next week. And that one's on building on Lean. The last one was building on TDD. What I wanted to talk about, though, is I appreciate the issues. So I decided to just use a GitHub repo as an issue tracker for this. And I've gotten a couple so far, not very many, but maybe that's a good thing.
Starting point is 00:27:29 A typo came in. And actually, it's one that, like, I was just missing every time I was reading. And so I appreciate that. And then one of the ones that I really appreciate is, like, discussion of extra topics I should build in. One of the things I've done is release the headings and subheadings for different chapters and different parts of the chapters.
Starting point is 00:27:53 And I'm hoping, in the hopes that people will go, oh, you got to make sure you don't forget to talk about this. And that's what I got from, looks like Jonah, I'm not going to try to pronounce that, but I'll say Jonah. In the chapter on considering considerations for applications for applying lean T&D talked about not just technical considerations, but what were they doing to social considerations. Because, and I face this too, and I'm like, oh, yeah, I didn't even think about this. There's a lot of people just meant like just opposed to doing tester and development or opposed to doing different kinds of tests for various reasons. and I wasn't planning on talking about that,
Starting point is 00:28:37 but I think it's very essential. I'm glad that I'm getting people's feedback. Could be the other way. They only want to do TDD and they want to do integration tests because they're impure, yuck. There's a whole spectrum of things you've probably got to speak to. Yeah. Go ahead.
Starting point is 00:28:53 And these are, they even brought up some other things. Like, we don't have time to test. Testing makes it harder to refactor. Your tests are not compatible with our architecture. I did a release even though the tests were failing, because it's too important to get the code out there. Yeah, these are great things to talk about. Yeah, for sure.
Starting point is 00:29:10 I set up an issue tracker discussion board for the Talk Python in Production book too. And it's been pretty active. I really appreciate it. There's a couple of things I could respond to. So you're waiting on me. I haven't forgotten. I haven't forgotten. And I told you that this was a follow-up to the dependency thing.
Starting point is 00:29:27 But here's another. This one's by Bob Belderbus. I didn't think it was big enough to make its own thing. But check this out. Bob says, I need to see which Packer. were pulling in a certain dependency instead of what packages depend. So you can actually go and do UV tree and give it a package and say dash-dash invert to answer who depends upon this.
Starting point is 00:29:47 And look at this picture. I see that's not really helpful. But you look at it and it actually says six was pulled in by bleach, which was pulled in by Jango Bleach, which was pulled in by its platform. Also six was pulled in Python date you till. Why is that even there by faith? So it's like a reverse of your depth tree. you can focus it on a node and turn it and basically filter it and restrict it just to the stuff
Starting point is 00:30:08 that works with that leaf node I guess so you'd look at it yeah so if you're thinking about cleaning up your dependencies and say well I don't want to pull in six anymore it might not matter because one of your dependents a bunch of your dependencies are already pulling it in right exactly and so you could also ask the question like well do I really need that dependency yeah or could I find something else that would allow me to get rid of this thing this this lower level dependency that's causing a problem you know that's often a question trying to answer all right Well, that is it for... Well, now I want an extra feature that says,
Starting point is 00:30:39 hey, UV, can you tell me which packages don't have any transitive... Aren't transitive dependencies? Yeah. Anyway, cool. Okay, let's talk a joke. So do you remember, Brian, last week? It was like so many things went down.
Starting point is 00:30:56 And one of the things that we had... We had AWS go down two weeks ago and take out a huge chunk of the internet, I think, for 14 hours because of DNS. Because of course it was. It was DNS to the database, which was the foundation of all the other things. And then we had Azure go down at the beginning of the last week. And then, was it Friday, Thursday?
Starting point is 00:31:14 I don't know. GitHub itself went down. Not the web page, but all the Git-based operations like Git pull, get push, et cetera, that talked to GitHub over the Git, CLI and other tools, was failing for hours. And so my question is, my joke is, which is a question, can GitHub even push fixes to GitHub when is down? I'm like, what do they do? How do you fix GitHub if get is a surely.
Starting point is 00:31:38 And I'm like, oh my gosh, surely. There's a lot of ways that you deploy code within GitHub by doing some kind of Git pull or get pushed with a web hook or something. Like that might have been a huge portion of the problem. Yeah. Yeah, the fix was probably instantaneously known, but you can't roll back or you know, get rollbacks. Can't do that.
Starting point is 00:31:59 Exactly. They're like how, who is going to log in here and fix this? I cannot fix this. though i'm curious how they did it then i mean this is this funny but also it probably was an issue exactly so i thought we might have a um an educational joke this week can gethub even push to get hub when get up is down this looks like a um who is the the guy in the matrix neo who is that actor keanu reeves yeah kiana reeves i'm not sure who this actually is but from like bill and ted days yeah exactly whoa man how'd they even fix it
Starting point is 00:32:33 Anyway, I don't know how they fixed it, but when it came back, I was very excited that it was back. Awesome. All right. Well, a fun episode, and I hope everybody in the U.S. or anybody that celebrates a tricky day outside of the U.S. I hope you have a wonderful Thanksgiving. And everybody else. Hope you have a great week. Also, talk to you next week.
Starting point is 00:32:53 Bye, everyone. Bye.
