Python Bytes - #369 The Readability Episode

Episode Date: January 30, 2024

Topics covered in this episode: Granian, pytest 8 is here, Assorted Docker Goodies, New GitHub Copilot Research Finds 'Downward Pressure on Code Quality', Extras, Joke. See the full show notes for this episode on the website at pythonbytes.fm/369

Transcript
Starting point is 00:00:00 Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds. This is episode 369, recorded January 29th, 2024. I am Brian Okken. Hey, I'm Michael Kennedy. This episode is sponsored by us.
Starting point is 00:00:18 Check out our courses at Talk Python Training and the Complete pytest Course. And thanks to our Patreon supporters and really everybody that just spreads the love and shares our podcast with other people. Thanks so much. The first item is going to be Granian. Oh, neat. Yes. Before we get into that, I just want to set the stage: you know, when you're running
Starting point is 00:00:38 Python apps, web apps, in production, there's usually something that talks to the web browsers, and then there's the part that runs your Python code, right? The part that talks to your web browser is Nginx, Apache, Caddy, one of these things that does SSL; it has no knowledge of Python. But then behind that step, we've got uWSGI, and we've got Gunicorn, and those types of things, right? So that's where your Python code runs. Usually it'll create multiple of them. And both those two that I named, I'm big fans of. uWSGI is awesome, super fast, low memory usage,
Starting point is 00:01:15 but it only does WSGI, does no async stuff. And that's a huge drawback, right? It doesn't do async. It doesn't allow you to properly scale async and await. Gunicorn, on the other hand, allows you to use Uvicorn workers inside there, which is kind of like one more chain in that loop. But when you deploy it that way, you can do async and await, which is awesome. But there's this newish thing coming along called Granian, from the Emmett framework, which is a new Python web framework. This is a Rust-based HTTP server for Python applications. So a Rust version of uWSGI or Gunicorn? Yeah. Oh, neat. That's kind of cool. It has 1,500 stars on GitHub, it's been coming along for a while, and it's created by a guy named Giovanni, I believe.
Starting point is 00:02:08 And Giovanni says, well, why build this thing? A couple reasons. It's a correct HTTP implementation, supporting versions 1 and 2 and working on HTTP/3, which is awesome. It avoids the Gunicorn, Uvicorn, http-tools dependency composition when deploying in production. So this natively supports async and await, right in it, along with WSGI. So whatever kind of app you've created, you can just run it right there in this thing without chaining stuff together. And one of the things that's nice about it is it's not a ton faster, but it's way more stable. There's less jitter in its performance profile, which I think is super cool. I'll talk about that in a second. But yeah, it has HTTP 1 and 2. Excellent. Supports HTTPS and WebSockets directly. I'm not going to send HTTPS traffic to it. I'm just using, for now,
Starting point is 00:03:00 Nginx or whatever. But it also supports RSGI, which is a Rust server gateway interface, I guess, like ASGI and WSGI. So it does all the Python things, plus it has a Rust-direct version, if that was the way you went down. And it's super easy to run. But from the performance perspective, if you look, it'll compare it down here against Uvicorn. And Hypercorn is another one I should have mentioned.
Starting point is 00:03:27 That's like a parallel to Uvicorn plus Gunicorn, but that one, I think, handles it all directly; it's from Phil Jones, which is great. But if you look at, let's just say, the ASGI get, it says it'll do, numbers please, 1.3 million requests at, okay, that just totally doesn't mean anything, that could be over three weeks, 85-86,000 requests per second. Or maybe this one's a little bit better, a different get, at 94,000. Compare that against the Uvicorn one, which is 19,000 versus 94,000, or the Hypercorn at 12,000 versus 94,000, which is great. But if you look at the variation, like response time: on, let's say, Uvicorn, it's on average 8.7 milliseconds.
Starting point is 00:04:16 That's really good. But the max is 320 milliseconds. Whereas if you look at this one, it's 2.7, but the max is only 8.6, right? So that variability, or jitter, however the heck you say it, is way more stable. And if you just kind of look across the board, another example is 6 versus 70, and so on. So I thought that was pretty cool. So I switched: if you come over here, Brian, to Python Bytes, this is running on Granian right now over on my Docker cluster, for the moment.
Starting point is 00:04:44 So I just thought I'd see how it goes, and it's been going perfectly from what I can tell. So it's pretty easy to switch then? Yeah, I mean, all you gotta do is pip install granian and then change the start command, no matter how you run it.
Starting point is 00:04:57 If you're running in systemd on, like, a VM, you change the systemd exec command. If you do it in Docker, you just change the entry point command. Or uwsgi-this or gunicorn-that, too. It's basically just another startup command. Okay, interesting. Cool. Indeed, indeed.
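For anyone who hasn't seen it, the switch really is mostly a new start command pointed at the app you already have. Here's a minimal sketch; the module and app names are made up for illustration:

```python
# app.py -- a minimal ASGI application that Granian can serve directly.
# The module and app names here are hypothetical.

async def app(scope, receive, send):
    # The server calls this for each connection; "http" scopes are requests.
    if scope["type"] != "http":
        return
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain")],
    })
    await send({"type": "http.response.body", "body": b"Hello from Granian"})

# Started with something like:
#   granian --interface asgi app:app
# or, for a WSGI app, swapping the interface:
#   granian --interface wsgi app:app
```

The same app runs unchanged under Uvicorn or Hypercorn, which is part of what makes trying Granian low-risk.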
Starting point is 00:05:14 One thing I did want to add for people who are considering this, I've got to move this over. One thing it doesn't do, well, you can set it up so it'll do logging, but it doesn't do easy logging out of the box. So I actually was messing around, like, maybe I should just do my own logging. Not "my app started up; hey, somebody clicked this button," but just request/response logging, which is pretty common. So I actually ended up playing with it and using Loguru to come up with a color-coded,
Starting point is 00:05:43 I added some middleware that does color-coded request/response logging, and it does all sorts of cool stuff. Like, see how some of the sizes in this log are red and some are white? If it's over 500K in the request size, then it colors it red. Or if the response time is too slow, it'll color it yellow. Or if it gets really slow, it'll color it red. Or if it's a 400 or 500 code error, it'll color that part of the request red. So you can look right at it and see right away. So I've decided doing your own Loguru request/response stuff is pretty excellent, actually. So that's kind of a cool consequence of playing around with this as
Starting point is 00:06:22 well. Cool. So is that, your use of Loguru, a custom thing where you look at the response times and color them differently or something? Yeah, exactly. Like, it'd be easy enough to just print: the format, the code is this, the URL is that. But I said it would be a lot more useful if it was color-coded and meant stuff, right? Like, if it's a 404, it should be a different color than if it's a 200. Or if it's a redirect, it should be a different color. If it's really, really slow,
Starting point is 00:06:49 if you see these times like 10 milliseconds, nine milliseconds, eight milliseconds, 12 milliseconds, if that was a second, maybe that's a problem. Color that a different color.
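The color logic being described can be sketched independently of Michael's actual code. This toy version uses plain ANSI escape codes instead of Loguru, and the thresholds (the 500K size, the timing cutoffs) are invented to match the spirit of the discussion:

```python
import time

# ANSI escape codes for terminal colors.
RED, YELLOW, GREEN, RESET = "\033[31m", "\033[33m", "\033[32m", "\033[0m"

def color_for(status: int, ms: float, size_bytes: int) -> str:
    # Error codes, huge payloads, or very slow responses: red.
    if status >= 400 or size_bytes > 500_000 or ms > 1000:
        return RED
    # Redirects or slightly slow responses: yellow.
    if status >= 300 or ms > 100:
        return YELLOW
    return GREEN

def logging_middleware(handler):
    """Wrap a request handler, the same shape as Pyramid or FastAPI
    middleware: begin the request, pass it down, time it, log in color."""
    def wrapped(request):
        start = time.perf_counter()
        response = handler(request)  # pass it down to the framework
        ms = (time.perf_counter() - start) * 1000
        color = color_for(response["status"], ms, len(response.get("body", b"")))
        print(f"{color}{response['status']} {request['path']} {ms:.1f} ms{RESET}")
        return response
    return wrapped
```

The point of picking color by meaning (status class, latency, size) rather than printing everything one way is exactly the readability win discussed here: a 404 jumps out differently than a 200.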
Starting point is 00:06:57 And I did that by installing middleware. And this is in Pyramid, but it could also be whatever, right? You could do this in FastAPI or whatever. It just says: begin the request, do something, pass it down to the framework, and then end the request; it times it, and logs it, and colors it there. Okay, neat. Yep. Indeed, indeed. Well, I also want to talk about something new and old at the same time. Nice. pytest has been around for a while, but pytest 8 is brand new. pytest 8 just came out this weekend, and I'm super excited to
Starting point is 00:07:32 start running with it. Actually, I've already started running with it. We're gonna put in the show notes a highlighted blog post, "pytest 8 is here," and links to the full changelog. But what they did was they spread it out: if people are running the old 7.x pytest, which is what I was using before, the changelog just takes a little bit to parse, because they spread it along the RC1, RC2,
Starting point is 00:08:04 and the final 8.0 release. The changes are all there, so I pulled the highlights out. There are two things I'm really excited about. One is, when you had a failure, that would just be a red block of exception stuff, and now there's a whole bunch of cool differences. There are improved diffs when an assertion fails, especially if you do -vv, so very verbose, or verbose verbose, however you want to think about that. You get a colored diff instead of the big chunk of red. That's awesome.
Starting point is 00:08:40 Back to this color thing, right? Yeah, it's also more colors. Normal syntax highlighting: we're used to syntax-highlighted code, so error reports are now syntax highlighted. And the different sections of the error report are separated better. And then also, there's better support for standard library containers for diffing. There was usually a pretty good tuple diff, for instance, but if you had big lists, it was a little bit hard to read.
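A hypothetical test file shows the kind of assertions this is about; none of these names come from pytest itself, they're just illustrations of what the assert rewriting explains for you:

```python
# test_compare.py -- illustration only; run with: pytest -vv test_compare.py
# When an assertion fails, pytest 8 shows a syntax-highlighted, colored diff,
# and its assert rewriting now explains comparisons beyond ==.

def response_time_ok(times_ms):
    # Hypothetical helper: all responses must come back within 250 ms.
    return max(times_ms) <= 250

def test_container_diff():
    expected = {"name": "granian", "workers": 2, "interface": "asgi"}
    actual = {"name": "granian", "workers": 2, "interface": "asgi"}
    # If these dicts differed, -vv would show a per-key colored diff
    # instead of one big red block.
    assert actual == expected

def test_comparison_rewrite():
    times = [8.7, 12.0, 250.0]
    # Rewrites cover comparisons like != and <=, not just ==, so a failure
    # here points at the offending value rather than just "assert False".
    assert response_time_ok(times)
```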
Starting point is 00:09:13 It's a little bit better now. And then, more comprehensive assert rewrites for other comparisons: not just equal, but things like not equal, less than or equal, other comparisons. So that's really cool; it helps people debug their code. That's super nice, because you want to be able to just say, "are these two things the same?" and not write code around "how do you do that," right? Yeah. Yep. And also for comparisons like less-than or less-than-or-equal, it's really nice for pytest to go out and really tell you why that check failed and highlight the part of your data where it failed. So really, really fun to see
Starting point is 00:09:52 that. The next thing I wanted to talk about for pytest 8 is probably pretty obscure for people that are not using xfail. So xfail is a way to say, "I expect this test to fail." And for a lot of people: why would you expect it to fail? But for large organizations, it's pretty common to file a defect that you don't have control over; you can't just go fix it, somebody else is responsible for fixing it. So that's how we use it: I mark a test as expected to fail and give it a defect number. And that's not new. But if it passes, there are decisions on what to do if an xfail passes. I've talked about that a bunch on the other podcast.
Starting point is 00:10:47 However, the change for pytest 8 is that, with xfails, the traceback never showed up before. The change is that now there's a way to turn that on: with the -r command, you can turn on xfail tracebacks.
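For anyone who hasn't used the marker, the workflow Brian describes looks roughly like this; the parse() function and the defect number are invented for the example:

```python
import pytest

def parse(name):
    # Hypothetical buggy function: someone else owns the fix (see DEF-1234).
    return name.encode("ascii", "ignore").decode()

@pytest.mark.xfail(reason="DEF-1234: parser drops non-ASCII characters")
def test_parse_keeps_accents():
    # Expected to fail until the defect is fixed; if it starts passing,
    # pytest reports it as XPASS so you know to revisit the marker.
    assert parse("café") == "café"
```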
Starting point is 00:11:04 So that's really nice for CI test runs, to be able to see what the failure was in CI and not have to try to rerun it again. So, really cool. Excited about that. There's lots more; check out the changelog. The reason why they bumped to eight,
Starting point is 00:11:21 I think, is because they changed the way collection works, and that backwards-incompatible behavior change made it so it made sense to bump the number. I think that's the reason; I'm not sure. But also, it's good to bump the number every once in a while so that you can deal with deprecations: you can get rid of the code that you were meaning to get rid of. So I encourage people to run pytest 8 and check it out, and turn on the full strict mode and everything, so that if anything breaks, you can roll back if you need to. So, awesome. This is like Christmas for you, right? Yeah, a major
Starting point is 00:12:03 version release. Yeah. And it also made me, when I was looking at this, realize that I think I need to add more color to my blog. It's just black and white, and it's not very fun. So I'm going to have to add more color. Color's always fun. I mean, that's the theme of this episode. Last week it was "let's just announce open source projects we're releasing." This is like the color episode.
Starting point is 00:12:30 The color episode, yeah. Or, as Midnight in the audience puts it, it's a readability episode. Very good. Indeed. Well, let's harken back to a couple of things on my side here. So I talked about this Docker work that I've been doing. That also kind of led me to the Granian stuff and playing with that as well. I also forgot to give a quick shout-out to the people, like both Andy Shapiro and Bill Crook, who pointed out, hey, you should check out Granian. So I want to give them credit for sending that in; thank you. But as I was doing all these Docker things, I was, you know, when you're in an unusual or an unfamiliar,
Starting point is 00:13:00 uncomfortable space compared to what you normally do, like right now my little browser in this episode has maybe nine tabs, and that's kind of a lot. But when I was doing the Docker stuff, I'd have 30, 40, 50 tabs, and then I would close 30 of them because I'd solved some problem along the way. It was just tabs everywhere. So I was just exploring stuff all over the place, right? Like, wow, I must not know what I'm doing right now. But you know, that's how you get to where you know what's going on. Along the way I'd be like, oh, people talk about, oh my gosh, totally giving up on doing this, I'm using OrbStack, or Podman, or whether they support this OS.
Starting point is 00:13:35 I'm like, what are these things? So I just want to go through a host of Docker goodies that I think people will enjoy. Not exactly Python-specific, but certainly relevant if you're doing Python in any form of containers, right? So, the first one of three. There are actually more, Brian; I'm not going to cover them all here. I'm saving some for later, because it would be a little out of control at that point. So, OrbStack. If you use Docker, and you use Docker on Mac or Windows, typically what you
Starting point is 00:14:00 do is you get Docker desktop, right? So that gives you the Docker commands that allows you to run Docker locally. I think it might use, maybe it uses VirtualBox or something on Mac and it uses probably Windows subsystem for Linux on Windows. But you can run like Linux VM, Linux containers on top of some hidden thing of Linux, right?
Starting point is 00:14:20 So OrbStack is kind of that. They say, say goodbye to slow and clunky containers and VMs. It's a light and easy way to run Docker containers and Linux on your machine, right? So basically gives you a nice UI around it. It is 100% compatible with Docker. So you could say Docker run, Docker exec, Docker compose up or whatever it is you say.
Starting point is 00:14:43 And instead of using the Docker engine, it'll use this OrbStack engine, which is pretty neat. It also has its own CLI, if you want to work with it directly. But basically it's, one, open source, and two, more lightweight. They've got a bunch of cool commands, but they show, down here somewhere, speed. If you're going to, it says, Open edX, I guess that's probably the Docker Compose setup for edX, provision a development environment for it, it's pretty long still, because I guess that's a beast of an app: 17 minutes on OrbStack, but 45 minutes on Docker Desktop. To build PostHog, whatever that is, it's like a quarter or a third of the time. If you're on a laptop, it uses like less than 25% of the battery as well. Or it depends,
Starting point is 00:15:33 if you're using Supabase or Kubernetes or whatever, right? But pretty cool, right? Yeah, yeah. And, oh, another thing, I think, let me look. Yes, this is a big deal. You and I were just talking about this before we pressed record. It says you can run Linux machines without a fuss. So one of the things I can do is I can go to, like, Parallels, and I can run Linux, Ubuntu or whatever, on my Mac, my M2 Mini, my M2 Pro Mini, however those words go together, whatever order makes the right sense there.
Starting point is 00:16:01 I can run them, but I can only run the ARM64 versions, because I only have an ARM64 processor. But this one allows you to run Intel machines on Apple Silicon with Rosetta, which then allows you to run Intel-based Docker images and basically be closer to what your production environment is, if you're using Apple Silicon. So that's also a nice feature of this. That's pretty cool. Yeah. It does cost money if you're a company, but it has a free version if you're just a person. So, not an endorsement, but I thought people might find this to be pretty useful. It looks pretty cool. Yeah. Next one: Podman. Podman is "the best free and open source container tools," they say. So you can manage Docker containers, pods (and that term, I believe, comes from Kubernetes; the unit of execution in Kubernetes is called a pod),
Starting point is 00:16:50 And images with Podman. So yeah, it lets you seamlessly work with containers and Kubernetes from your local environment. So this is also really cool. A lot of people are doing interesting stuff. GitHub Action plugins. It's got a Visual Studio Code plugin and different things. And then the third one, actually 3.5 because it kind of is two, is this is pretty interesting.
Starting point is 00:17:13 So one of the things that's kind of inspiring, once you get all this Docker stuff going, is: okay, that open source, big, complicated thing, in some technology I don't know how to run, like PHP or whatever? If it has a Docker container, or a Docker Compose set of containers, I can run it. All I've got to do is tell Docker to update it when there's a new one, and just run it. That's super easy to do, right? So, taking that kind of to the extreme, there's this thing called CasaOS. Have you heard of this? No, me neither, but it has 20,000 GitHub stars, which is pretty cool. They call it your personal cloud operating system: community-based open source software focused on delivering a simple personal cloud experience around the Docker ecosystem. So
Starting point is 00:17:55 basically, if there's a thing that runs in Docker, this is like an OS and a platform for running all of that. So it gives you a UI into this OS that they give you. And it says, look, you can collect all your precious data. It'll tie together Google Drive, Dropbox, iCloud, OneDrive, et cetera, et cetera, hard drives and everything, into just one drive view. And then you can just access it, and, like, map that drive over to your TV or your computer or whatever. And there's somewhere in here where they've got all these apps that you can go just grab and install. It supports a bunch; a lot of them are unfamiliar to me, because I haven't done this enough,
Starting point is 00:18:33 but, like, CouchPotato, DuckDNS, PhotoPrism, and they just plug into this thing. This is pretty interesting, right? I think. Yeah, you think, right? Yeah. I don't know what I would do with it, but, you know. Yeah, so one of the things that I think this is sort of coming out of is they have this thing called ZimaCube which, instead of running all your stuff on the internet, is kind of like a really fancy NAS.
Starting point is 00:18:57 Okay. Network-attached storage, but it also runs Docker and all these things. So it says you can have up to 164 terabytes of SSD, but then it also runs all of these things. And this is on Kickstarter, and it was already funded at around 1.1 million US, which is a lot apparently. So this is kind of the intended destination for that,
Starting point is 00:19:20 but I think you can run it anywhere. So pretty cool, right? Yeah, I mean, some people think over a million dollars is a lot. I don't, you know. Chump change. But, you know. Yeah, yeah. So, I don't think this thing has shipped yet; I think it's still in development. Their Kickstarter ended, though. But yeah, so there's a whole bunch of different fun things: OrbStack, Podman, CasaOS, and ZimaCube. So, I have a question about OrbStack. That doesn't generate Docker images, though, does it? I mean, I have to have OrbStack stuff on the server as well, right?
Starting point is 00:19:55 No, it will manage. It is a transparent API or CLI to the Docker CLI. Oh, okay. So if you just install this, then you can go to your command line and type Docker build whatever, and it'll download and do all the things. But then when you ship it to production,
Starting point is 00:20:15 you could have real Docker there. Okay, okay. Right? Yeah, I didn't understand. I think it's mostly on the desktop side of things. So it's like a simpler, lighter way to do desktop stuff.
Starting point is 00:20:28 Possibly you could run it on your own; like I said, it does have its own CLI for doing things its way, but I think people will just use it as a Docker Desktop alternative. Yeah. And it looks like the business and commercial use pricing is slightly cheaper than Docker Desktop right now, so there's benefit there. That's right. I didn't remember that Docker had gone commercial on that side as well, so that makes them more comparable, right? As opposed to, yeah, there's this other free thing. It's not terrible, though. I mean, we just had to re-up our Docker at work, and it's,
Starting point is 00:21:02 what, I think I paid 300 bucks for five seats per year. That's not bad. Yeah, that's not bad. Not when it's not your money; it's fine. It's not my money. It's easier to spend not-your-money. Yeah. So, yeah, I love not-your-money spending. But anyway, pretty cool. I'll look forward to checking that out. I actually think it'd be fun to have one of those ZimaCubes. I do too. I would really consider it. Wouldn't it be awesome to just have all those cool apps running, plus 100 terabytes of storage?
Starting point is 00:21:33 Yeah, so it's like the cloud, but it's at home. So it's just closer in? It's more like fog. It's way lower down. It's fog. Personal cloud is fog. Personal fog. That's nice. Okay, cool.
Starting point is 00:21:51 Okay, so that's some good news. And next up, I don't know if this is good news or bad news; it depends on your perspective. So, I'd like to talk about GitHub Copilot and other assistive AI stuff. Visual Studio Magazine came out with this article called "New GitHub Copilot Research Finds 'Downward Pressure on Code Quality'." So the question really was: if I'm coding with Copilot, if I'm using
Starting point is 00:22:27 Copilot to help me write some stuff, is Copilot kind of like having a junior developer? Is it more intern, or is it more senior dev? And their question, really, was: is it more senior dev, or more akin to the disjointed work of a short-term contractor? Which I thought was appropriate, because a contractor might be very skilled, but they might not care about maintenance too much. Yeah, exactly. So, yeah, the technical debt is not a problem when you're done next week. Yeah. One of the best ways to not deal with legacy code is to switch jobs. Anyway, the answer of this white paper is
Starting point is 00:23:21 summarized by: "We find disconcerting trends for maintainability. Code churn, the percentage of lines that are reverted or updated less than two weeks after being authored, is projected to double in 2024 as compared to its 2021, pre-AI baseline. We further find that the percentage of 'added code' and 'copy/pasted code' is increasing in proportion to 'updated,' 'deleted,' and 'moved' code. In this regard, AI-generated code resembles an itinerant contributor, prone to violating the DRY-ness (don't repeat yourself) of the repos visited." So this isn't that surprising to me, but it's interesting that there was a study done by GitClear,
Starting point is 00:24:08 and also interesting that it was, you know, printed in Visual Studio Magazine. I guess the magazine probably doesn't have any ties to Microsoft or GitHub. But anyway,
Starting point is 00:24:23 interesting. Indeed. So, Bart out in the audience has a different analogy I'd like to adopt: it's not junior or senior, it's a parrot that recycles what it found on the web. So I'm thinking more Stack Overflow copy-and-paste action type of stuff here.
Starting point is 00:24:39 And Grant says: I read this too, the research makes sense. Sounds right, but Copilot has benefits for senior devs, like applying good patterns faster. Yeah, which is pretty interesting. There was some comment in the article that essentially stuff
Starting point is 00:24:54 still got done faster, or at least got to something working faster. It's just that people often go back later and modify and refactor the code. And that's not necessarily bad, but, you know. I get the sense that Copilot is very focused on "what prompt did you give me? I'm going to do that,"
Starting point is 00:25:16 rather than "I understand what your entire project is. I've seen all 200 files. I've thought that through. Now, in that context, the answer to your question is X rather than Y," right? I don't think it does that. It would be a very high level of token usage that it would have to take into account. I think it probably just goes: all right, well, you asked me a sentence; here's the answer. Yeah. I actually want to play with it more. I haven't really played with Copilot too much. It's not something I can use at work, but on personal projects, I think it'd be fun to use it more. But we're still, I mean, it seems like ChatGPT is kind of old now,
Starting point is 00:25:56 and Copilot's, like, really old. But really, we're just starting to use these tools in development. Like they said, pre-2021. So it's still only a couple of years that we have under our belt working with these things. And I think the tools can get better. I both worry about the developers that are going to lose jobs and stuff because of this, but I also think the tools will probably get better, like you said, if it can look at your entire project and say, hey, in this context, this
Starting point is 00:26:31 is the right call. You're repeating something; you already implemented that, let's call this function instead. That would be great. And also, if we could have AI tools to help keep a style similar, or just a general philosophy similar, around a project, I think there's room for that once it gets there. Yeah, I wouldn't be surprised to see that happen. Yeah. All right. Extra? Extra time? Extra, extra.
Starting point is 00:26:58 Oh, yeah. So my extras are, like: did you know that pytest 8 is out? pip install -U pytest. But also, if pytest is one of those things that you've always been thinking about doing, head on over to courses.pythontest.com, and you can learn it really fast using a course. Or you can grab the book, of course. So, yeah.
Starting point is 00:27:25 Excellent. Yeah, that's very exciting. Do you have any extras? I do have a couple things. I want to quickly give a shout-out to one. This has been around for plenty long, although it's changed behind the scenes, not in a way anyone would
Starting point is 00:27:36 really necessarily notice, but it has. And that is: I want to encourage people to join our newsletter. If you go to pythonbytes.fm, right below the hero image there's a thing that says newsletter. You go over there, become a friend of the show, put your information in there. This is actually a revamp, as part of the work that I did with ListMonk, the private, self-hosted email stuff, moving away from
Starting point is 00:27:59 Mailchimp and others. We talked about that last week, two weeks ago, some number of weeks ago. And that means it doesn't go anywhere; we don't share it with anyone. But Brian and I are planning some fun stuff and trying to do more with newsletters and reaching out and connecting with you all. So we would love for you to go to pythonbytes.fm, click on newsletter, and put your information in there. We won't share it, but we'll try to make it worth your while.
Starting point is 00:28:22 But we also haven't emailed a lot on it in the past. So when we start using it, don't think that we bought your name off some list; it's that we're just starting to use it more. Yes, absolutely. It drives me crazy when people come sign up for your newsletter and then mark it as spam, which means other people have a harder time getting it. It's like, you typed your information in there, and marking it spam doesn't unsubscribe you.
Starting point is 00:28:47 Just please use the unsubscribe. Just use the unsubscribe. Yeah. I actually wrote a ton of software; I have a whole separate Docker thing running that monitors for people marking stuff as spam, because there are ways that you can receive hooks about that information and automatically unsubscribe people if they do that, even if they don't unsubscribe.
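The moving parts being described can be sketched in a few lines; the payload shape and field names here are hypothetical, since each email provider defines its own webhook format:

```python
# Sketch of spam-complaint handling: the email provider POSTs a webhook
# when a recipient marks a message as spam, and we unsubscribe them ourselves.
# The payload fields and subscriber-store shape are made up for illustration.

def handle_complaint_webhook(payload: dict, subscribers: dict) -> bool:
    """Return True if the complaint resulted in an unsubscribe."""
    if payload.get("event") != "spam_complaint":
        return False  # ignore deliveries, opens, bounces, etc.
    email = payload.get("recipient", "").strip().lower()
    sub = subscribers.get(email)
    if sub is None:
        return False
    # Unsubscribe even though they never clicked the unsubscribe link.
    sub["status"] = "unsubscribed"
    sub["reason"] = "marked as spam"
    return True
```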
Starting point is 00:29:04 But, you know, there's kind of a sense that the damage is still done a little bit. Email is a complete nightmare. All right, let's get away from email, because it makes me upset. All right, so: Pydantic. Pydantic's awesome. Sydney Runkle, who works at the Pydantic company,
Starting point is 00:29:20 was on Talk Python recently. They released a brand new version, 2.6.0, and Samuel Colvin said this is probably the biggest, most important release since 2.0. If you scroll through the release notes for Pydantic 2.6, there is a lot going on. Even just the list of new contributors is massive, and apparently there's a ton of speedup, plus some other things going on that you can check out. So if you're using Pydantic, everything's excellent.
Starting point is 00:29:50 Just no more Python 3.7 because we've already had the thanks and goodbye to Python 3.7. We're on to 3.8 as the minimum reasonable Python these days. Cool though, huh? Yeah, very cool. A lot of contributors.
Starting point is 00:30:04 There's a lot going on here. It's a popular library. Like if we go over here and we see like, okay, well, how many things depend on it? Where's the used by 318,000 projects? It lists one, two, three, four, five.
Starting point is 00:30:17 It lists like six and it says plus, as in there's more. It says plus 317,946. Like that's not really totally representative, but okay. I understand the UI. Anyway, yeah, it's used by a lot of people.
Starting point is 00:30:28 Well, it's used by more projects than it has stars. So some people are using it and don't like it, apparently. Yeah, come on, star this up, people. The only reason you don't see a star from me is that I'm not logged in. All right, and finally, I wrote a new essay called Use Custom Search Engines Way More. This is not DuckDuckGo versus Google versus Bing,
Starting point is 00:30:47 but rather: if you use a proper browser like Vivaldi or Firefox or even Chrome, you can go and set custom search engines for all sorts of cool stuff. One I set was PyPI, Brian. So if I go to my address bar... I don't know if I've set it up on my streaming account here. Let me see. No, only on my proper one; this is like a separate user account over here. But I can go and just type pypi space pytest, and it will search PyPI.org's search results directly for pytest or whatever it is you type there. I didn't know you could do this. Isn't this awesome? So if you want to search Unsplash for stock photos,
Starting point is 00:31:27 you just type u space and then the thing, or so space to search directly on Stack Overflow. So instead of searching for it, scrolling through until you find the Stack Overflow result, you just go boom, a super short cut. Or gh for GitHub, to search only repositories, not users,
Starting point is 00:31:45 or whatever you want to type in. Incredibly easy. So that's my essay, my quick little. So that's not built in already. You have to, well, it's supported, but you have to like configure it on your browser? You have to type. Yeah, you have to type.
Starting point is 00:31:59 Basically, you go to Vivaldi search, and then you find, just go enter a new search engine, or there's ways to do it in Firefox. There's a way to do it in Chrome. They're all different. But then you just figure out, if you just search a site, like if you search Stack Overflow,
Starting point is 00:32:11 you'll see it's stackoverflow.com slash search question mark q equals some string. So you just put percent s there and save that as the search engine. Okay, I think I'll do a pb for Python Bytes. Oh, you know what? I'm feeling like we could totally do this. I mean, the URL is right up there. Yeah.
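As a tiny sketch of what the browser is doing with those templates: each nickname maps to a URL pattern, and your query replaces the %s placeholder. The nicknames and exact URL patterns below are illustrative guesses at the ones mentioned on the show:

```python
# How custom search engines work under the hood: nickname -> URL template,
# with %s substituted by your (URL-encoded) query.
from urllib.parse import quote_plus

engines = {
    "pypi": "https://pypi.org/search/?q=%s",
    "so": "https://stackoverflow.com/search?q=%s",
    "gh": "https://github.com/search?q=%s&type=repositories",
}

def expand(shortcut: str, query: str) -> str:
    """Build the final search URL the way the address bar would."""
    return engines[shortcut].replace("%s", quote_plus(query))

print(expand("pypi", "pytest"))
print(expand("so", "pytest fixtures"))
```

Typing "pypi pytest" in the address bar effectively runs that first expansion and navigates to the result.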
Starting point is 00:32:31 Q equals a. Yeah, why not? Yeah, skip one step. Beautiful. Anyway, that's my set of extras. Very cool. Thanks. Well, how about funny stuff? This one's quick and short. Okay. It's a picture, but you don't need to know anything about the picture. It's just a lawyer arguing a case. Hey, Brian. It says, your honor, my client didn't know they were pushing to the main branch. That's funny.
Starting point is 00:32:54 Yeah, that's it. Didn't know I was pushing to the main branch. This is my defense. I'm sorry I took down the website during Black Friday. I thought it was my fork. That's funny. Yeah, how often have you like, well, I guess you don't do this too much,
Starting point is 00:33:08 but it's a three-day weekend. I want to make sure that I have my stuff pushed to the central repo. So push. Because you're working at home, right? You want to like sync it back up or whatever. Yeah, but make sure you're on a branch. So, okay.
Starting point is 00:33:24 We were talking about junior versus senior a little bit with the AI stuff, so I wanted to share a little picture of this too. I saw it on Mastodon: junior versus senior developer, as a timeline. For the junior developer, "working on project" encompasses the entire time. For the senior developer, "finding the motivation to start" takes up like 80 or 90% of the time, and then actually doing it is at the end. And the total time is
Starting point is 00:33:54 equal. Yeah, that's amazing. I would have altered it to make the senior like the total time is like a little bit less. It's just Yeah, yeah. And the junior like a little bit less. It's just, yeah. Yeah, and the junior needs a little bit of finding the motivation, but just a tiny bit. Yeah, yeah. It's still a good one. Anyway, cool. Well, thanks again for, oops, such a great episode.
Starting point is 00:34:17 Good to talk to you this week. It's good to have everybody showing up for the live show. Thank you very much. What's that link again, if people want to go watch the live show? They can go to pythonbytes.fm/live. All right, cool. Plus, if you just go to pythonbytes.fm, it's right at the top there. Yeah, thanks a lot. We make it easy for people. All right, see you next week.
