Python Bytes - #208 Dependencies out of control? Just pip chill.

Episode Date: November 19, 2020

Topics covered in this episode:
pip-chill - Make requirements with only the packages you need
Windows update broke NumPy
Build Plugins with Pluggy
LINQ in Python
Klio: a framework for processing audio files or any binary files, at large scale
Collapsing code cells in Jupyter Notebooks
Extras
Joke
See the full show notes for this episode on the website at pythonbytes.fm/208

Transcript
Starting point is 00:00:00 Hello and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds. This is episode 208, recorded November 11th, 2020. I am Brian Okken. And I'm Michael Kennedy. And it's getting cold outside. It is getting cold outside. I feel like winter is coming. I went out on the deck and I'm like, well, there's something wrong with my deck. It's really slippery. Oh, that's ice. What a weird time of year it is.
Starting point is 00:00:23 Who do I call to get this fixed? Exactly. Gotta be some sort of contractor I can get to. I'm sure there's something wrong with the sun. We're gonna need some help here. Yeah. This episode is brought to you by us. We'll tell you more about what we're doing, other than just how you can support us, a little later. But first, speaking of cold, I want to talk about something called pip-chill. Oh, I see. Have you heard of this before? I have heard of this. I've just heard of it, but it looks fantastic. This drives me crazy, this whole change with pip where pip is super, super picky about the dependencies, and then having things like Dependabot or other automatic tooling upgrade stuff as things come out. If they ever get out of
Starting point is 00:01:02 sync, then people break in some ways. For example, boto3 and botocore have been driving me crazy, and it sounds like this would fix it. Normally you've got a requirements.txt file, or you've got the versions that you're working with, and you want to put those in a setup file or some sort of way to capture them. One of the ways you can do that is pip freeze. If you already have an environment, like a virtual environment, set up with everything you need, you say pip freeze and it spits out all of the things that you have installed and all the versions that you've got. Now, there's the worry that something gets pushed in the future and that's a breaking change. To me, I never appreciated that; it never seemed to be something I cared about. But once things like GitHub started saying, there's a new version of this, or the thing you have installed has a security vulnerability that if you don't upgrade, you're subject
Starting point is 00:01:59 to on the web, which is a bad place to be. Then putting that version information in there explicitly allowed me to know, oh, I need to go and update the server because there's this critical vulnerability that GitHub told me about. That's what won me over to putting the pip freeze style with versions in there. Oh, yeah. Yeah, and I think that's good. But you might want to just have the version,
Starting point is 00:02:20 the thing that you are really using. Like, let's say you're using Black. Do you want to list Black, or do you want to list Black and all of its dependencies? Yeah, exactly, and freeze just gives you everything. So what pip-chill does is it just shows you the stuff that you installed. Let's say you installed just Black and you ran pip freeze, you'd see a whole bunch of stuff. But if you install pip-chill, pip dash chill, and run pip chill, you'll get pip-chill, it'll tell you that it's there, but it also just shows you Black and what version of Black you have. Hopefully Black itself is specifying its dependencies in a way that makes sense.
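For anyone curious what that looks like in practice, here is a rough sketch of the idea behind pip-chill, not its actual implementation, using only the standard library (Python 3.8+): list every installed distribution that no other installed distribution requires.

from importlib import metadata

def top_level_packages():
    # Map every installed distribution by lowercased name.
    installed = {d.metadata["Name"].lower(): d for d in metadata.distributions()}
    required = set()
    for dist in installed.values():
        for req in dist.requires or []:
            # A requirement string looks like "click>=8.0; extra == 'cli'".
            name = req.split(";")[0].strip()
            for sep in ("==", ">=", "<=", "~=", "!=", ">", "<", "[", " "):
                name = name.split(sep)[0]
            required.add(name.lower())
    # Whatever nothing else depends on is "top level" (this sketch ignores
    # name-normalization edge cases like dashes vs. underscores).
    return {name: d.version for name, d in installed.items() if name not in required}

for name, version in sorted(top_level_packages().items()):
    print(f"{name}=={version}")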
Starting point is 00:03:07 There's definitely times where you want to have everything, all the versions nailed down, like you said, for security updates and stuff like that. But there's a whole bunch of times where, like for instance, I've got internal projects where we actually have, we actually vet all of the versions and put them in a different repository,
Starting point is 00:03:25 but for different combinations, you don't need to be that specific. So pip-chill is a way to just list the ones that you've installed. And I think it's really cool. I think it's really cool too. One of the things that's neat is it toggles between here's what you installed and here's everything, which is what you were talking about with pip freeze. But there's another mode where you do dash v, and it'll show you the stuff you installed, and then it'll show you, commented out but present, all the things, the libraries and their
Starting point is 00:03:54 versions that were installed but are not top level, so they're commented out, and then there's a little comment like installed as a dependency of jinja2-time or installed as a dependency of cookiecutter. So you can do that to get a look and say, oh, what does my virtual environment look like? Why is this thing here? Oh, I see, it came from this other place. Yeah. And then also you can have that with or without versions. So if you know there's a dependency vulnerability for a particular thing, you can say, well, which version am I getting? So yeah, this is really nice. I'm liking it. You've got like a version checker thing that tells you if there's vulnerabilities, right? Yes. What are you using? Well, for vulnerabilities, I think just
Starting point is 00:04:35 having a pinned version in a requirements.txt file is sufficient for GitHub to say there's a problem, as long as the requirements file is pinned. But if you sign up with Dependabot, and there are some things that are super, super annoying about Dependabot, what is nice is that it will show you anytime there's a thing. So you say, like, weekly, notify me of updates
Starting point is 00:04:58 to my requirements file. And so every Monday morning I get a list of like, here are the updates that I could upgrade for like my web framework or database access or whatever. And like put in the automatic PR for it. It'll show you like the change log and then you can accept that and it'll update your requirements. Yeah. Okay.
Starting point is 00:05:17 And then there's also a workflow that some people use where they have a smaller file, like a requirements.in or something. It's just a list of packages they're using. And then they'll just get the latest of whatever those few things are, and then test it and then freeze it. So you've got like a two-stage thing. What you deploy is a frozen set of packages with all the versions, but you've got some other way to generate that, to say which ones you should install.
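One common tool that implements this two-stage pattern, not something named in the episode, just an example, is pip-tools: you keep the loose list in a requirements.in and compile it into a fully pinned requirements.txt.

# requirements.in - just the packages you directly depend on
fastapi
uvicorn

# Then, to produce the fully pinned requirements.txt you actually deploy:
#   pip install pip-tools
#   pip-compile requirements.in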
Starting point is 00:05:48 Yeah, that's an interesting way to do it, because then you basically have just PyPI and pip giving you the latest. Yeah, just to round it out, the thing that is super duper annoying about Dependabot is, like, on Talk Python Training, that site has 30, 40 packages it's using, if you count up the dependencies, pip freeze style, not pip-chill style. And every change is its own PR. And I swear, almost weekly,
Starting point is 00:06:14 I'm manually merging merge conflicts between Dependabot and Dependabot. It's like, well, this requirement line changed and that requirement line changed. And they won't give it to you as one, here are the changes for the week. It's like, no, no, here's a bunch of changes, and you'll probably be in here merging this manually, so enjoy that. Anyway, it's still really cool to have it automatically done. I do like this idea of having some part of the CI going, this is what I depend upon, install what I need,
Starting point is 00:06:43 generate the pinned versions, because then when you go to production and you say pip install -r, it will actually upgrade the stuff that needs upgrading rather than saying it's already fulfilled. One of the requests I have for the people working on pip-chill is I think an option to not list itself would
Starting point is 00:06:59 be good, because I'm just installing it so I can run it. That actually makes a lot of sense, to just omit itself, and maybe you could toggle it back on with a dash v or something. But maybe it's like some kind of worm, and its goal is to just get into all projects, and by listing itself it's just going to slowly make its way out. Maybe. Speaking of problems you might want to avoid, not that that is one, but one that you might want to avoid is if you work on Windows, which, from the Stack Overflow survey and the PSF survey, we found a very significant number of people in the Python space do.
Starting point is 00:07:34 And I've heard people in Python talk about this thing called NumPy. You heard of NumPy? No, NumPy is something that definitely is the foundation of the many, many, many data science libraries, right? Well, big news sent in by Daniel Mulkey. Recent Windows update broke OpenBLAS, which is a dependency of NumPy, speaking of dependencies. So effectively, if that doesn't work, NumPy doesn't work. That sounds less good.
Starting point is 00:08:04 Yeah. So there's a whole long developer community thread, and I mean long, people talking about it, suggesting fixes that may or may not be fixes, etc., etc. It starts like this, I'll just read the quick overview so people know what to be on the lookout for. I'm a NumPy developer. We've been trying to track down some strange issues where, after updating to Windows 10 2004, suddenly code that worked no longer works. Here's the NumPy issue and the corresponding issue in OpenBLAS. The problem can be summarized as: when calling fmod,
Starting point is 00:08:37 something is changed so that, much later, calling an OpenBLAS assembly routine fails. And we're way down here, Brian. The only difference I can see is that in the register that Visual Studio exposes after the call to fmod, ST0 is NaN. The bugs people have to track down. Yeah, geez. But the fact that it doesn't work is not ideal. So Steve Dower and some of the other Microsoft people commented there. There's a fix, but it will take until January 2021 to be released,
Starting point is 00:09:10 but Matt P. posted a workaround that says, for all of those at home following along and looking for a quick fix, NumPy has released a bug fix, 1.19.3, to work around this issue. The bug fix broke something else on Linux, so we had to revert the fix in release 1.19.4. So the workaround on Windows broke the Linux version, but as a Windows person you may not care. So on Windows you could pin it, numpy==1.19.3, for now, and just don't put that into production on Linux. Sounds easy enough, I guess. Yeah, I mean, if you're a package maintainer this seems ugly, but if you're an end user, I'm either installing it on Windows or I'm installing it on Linux, it's not a problem. Yeah, okay. I don't know if pip actually has a platform flag, like in the pip install dash r, you know, the
Starting point is 00:10:03 requirements can you specify this platform gets this requirement that platform gets i'm guessing no well no i mean it's part of the wheel so the wheels can say this wheel is appropriate for only certain platforms yeah but if you're building an application not something that's on pip yeah i don't know i don't know how to do it but you can this is like a fix but just be aware that that will make it not work on linux so yeah anyway i just want to put this on people's radar so they're not down looking at registers going why is st0 man that's odd i'm glad i don't have to do things like that we can all be thankful we don't have to
Starting point is 00:10:35 I mean, that's why we work in Python, right? Or let's reverse it: I'm really grateful that other people are looking at that so I don't have to. Yes, thank you, people. Speaking of thank you, Brian and I are both doing a bunch of stuff online, and people often ask how they can support us, support the podcast, and just get better at Python, and we're trying to help people do that, right, Brian? We are, yeah, lots of ways. One of the things we're doing is this podcast, and if people want to just support us directly, there's a Patreon link in all of the show notes. You can donate. We have, like, 60, I just checked, we have 65 people doing this now. That's cool. Awesome. Thank you, everyone. And you're teaching some classes. I am. The most recent one that I worked on should maybe, maybe be out. It's completely done.
Starting point is 00:11:09 we have 65 people doing this now. That's cool. Awesome. Thank you, everyone. And you're teaching some classes. I am. The most recent one that I worked on should maybe, maybe be out. It's completely done.
Starting point is 00:11:18 I just put my head down and did the whole thing. It's like a four and a half hour FastAPI course for building APIs with FastAPI. So I'm really excited about that one. And hopefully it may even be out. But people can just check out training.talkpython.fm and it'll be over there soon if it's not by the time you hear this. Yeah, nice.
Starting point is 00:11:36 Indeed. I'll just keep plugging away at it, man. Just keep plugging away. This is going to be the 30th course that we've created over there and we're not done. We're going to keep plugging away. Keep plugging. Maybe you need a plugin architecture. Maybe I do.
Starting point is 00:11:49 Tell me about it. Well, so one of the things I love about PyTest is the plugin system. So you can, I mean, plugins can provide extra fixtures, extra flags. They can modify the output. They can do all sorts of fun things. Anyway, the plugin architecture that PyTest uses is not built into PyTest. It's something called Pluggy. And there weren't, I mean, there are a couple little how-to guides, but they were kind of sparse and I was having trouble. I wanted to do a plugin in a little application before and struggled. But now here's apparently two Python talks,
Starting point is 00:12:27 one of them held at PyGotham and one at PyCon India, and an article around it. So the article is Build Plugins with Pluggy. It's kind of a nice article. It starts with a small command line application that looks for books or something, and it has one output format. And the argument is like,
Starting point is 00:12:48 wouldn't it be cool if you could have more different kinds of output? And one of the ways to do that in an application is to allow plugins, so that anybody can extend the application with a different output format. Yeah, one thing that's really nice about that is you don't have to understand the whole program
Starting point is 00:13:03 and all the code to contribute to it. You just have to understand that little tiny plugin interface. And if you can handle that, then you basically can extend that. Yeah, and one of the beauties of those sorts of things. So I've written plugins for PyTest, and I don't understand the internal architecture of PyTest. And I don't really understand how to do the plugin system that well. I mean, pretty good. But you can take something that's already done and copy it.
Starting point is 00:13:25 So you can say, oh, here's another one that also changes the output. I can copy that and change it myself. Yeah. Yeah. This is kind of how that works. He takes part of the application and walks through how you would change the architecture so that, instead of doing the formatting internally, you could do it as a plugin. And then it talks about the architecture of Pluggy, because you've got a host and a plugin and hooks and hook specs and all sorts of stuff like that.
Starting point is 00:13:52 And it's actually easy to get lost in this if you're just reading it. But I suggest walking through the code as he's doing it and actually running the stuff and looking at it. And there aren't really any super easy ways to do plugin systems in Python, I don't know if any language has a better system, but Pluggy's pretty good. If you follow along, you can kind of get with it. Yeah, very cool. I haven't looked to see if the talks are online yet, but I'm looking forward to watching them if they do go up. I suspect they probably will, in the modern age of everything being online.
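To make the host, plugin, hook, and hook spec vocabulary concrete, here is a minimal sketch using Pluggy's public API; the book-finder app and all the names in it are made up for illustration and are not taken from the article.

import pluggy

hookspec = pluggy.HookspecMarker("bookfinder")
hookimpl = pluggy.HookimplMarker("bookfinder")

class OutputSpec:
    """The host declares the hooks that plugins may implement."""

    @hookspec
    def format_book(self, title, author):
        """Return a string representation of a book, or None to pass."""

class PlainTextPlugin:
    """A plugin implements one or more of those hooks."""

    @hookimpl
    def format_book(self, title, author):
        return f"{title} by {author}"

pm = pluggy.PluginManager("bookfinder")
pm.add_hookspecs(OutputSpec)
pm.register(PlainTextPlugin())

# Hook calls take keyword arguments and return one result per answering plugin.
print(pm.hook.format_book(title="Fluent Python", author="Luciano Ramalho"))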
Starting point is 00:14:24 Yes. Yeah, probably. Awesome. So let me talk about a little query syntax that I've been wanting in the Python space for a long time. And you almost have it. It's so close in some ways, but so far in others.
Starting point is 00:14:44 For example, with list comprehensions, generator expressions, all those types of things, we can do a lot of in-memory, database-like things, right? We could go and say, if I've got a list of numbers, I could get the square of the number for all the numbers, if the number is, you know, every number that's odd, or some weird thing like that. Right, that's sort of a query type of thing. There's like a where clause, there's a from clause, there's a select transformation bit. You could even do paging with slices on the end of it, so like skip three pages of five, so, you know, slice 10 comma 15 or something like that. Except there's a few things that are just not there. The one that
Starting point is 00:15:27 drives me the craziest is sorting. Like, if you could just do an order by in a list comprehension, oh, that would be nice, but you can't. And you also can't put a dot sort on the end, because that returns None. So if there was some way to put a little bit more together, that would be great.
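To be fair, the existing pieces do compose around a comprehension, just not inside it as a single fluent chain. A quick sketch of the where / select / order by / take idea with nothing but built-ins:

numbers = [9, 4, 7, 2, 5, 8, 1]

# "where n is odd, select n squared, order by descending, take the first three"
page = sorted((n * n for n in numbers if n % 2), reverse=True)[:3]
print(page)  # [81, 49, 25]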
Starting point is 00:15:54 And probably the best implementation of this, I would say, across all the languages is this thing called LINQ, Language Integrated Query, in C#. Yeah. And I know it's not about C#, but studying that, there's a really interesting way of basically taking lambda expressions, applying them to collections, and doing those database-like things, but much more, where you can do joins and you can do other sorts of operations like paging or filtering by type and all kinds of neat stuff. So Adam sent over this project that he ran across called linqit, which adds LINQ-type in-memory query syntax, like almost database syntax, to Python lists. Okay. So, for example, if you had a list of programmers, you could say programmers where,
Starting point is 00:16:33 you know, lambda e goes to e.experience greater than 15, except for Elon Musk, where it's this type of person, you know, the category of developer, take three to get only three of them, skip, you know, you could skip two times three to skip two pages and take three to get the third page of them, you could select out just certain variables and then apply more queries to that, and do this really cool fluent interface on top of these collections, which I think is just super neat, and I really like this interface. So if people like this fluent-style programming and they kind of think in database types of thinking, here's a really cool way to apply that to Python. This is kind of cool. Yeah, I like it a lot too. I like it a whole lot. So previously, and by previously I mean really previously, I don't remember exactly, let me see,
Starting point is 00:17:27 I'll use our little search over at Python Bytes. Back in episode 106, so what is that, like three, four years ago or something like that, we talked about this thing called asq from Sixty North, which is a cool project by, well, the guys at Sixty North, who do a lot in Python, and it's basically a similar type of thing, it adds LINQ queries. But the one thing I didn't like, what didn't spark joy in me, I guess, on that one, is you had to convert everything to this query object, and then you could do queries on it, and then you could convert it back to a list, which is like so close, you know. But what's neat about this other project, this linqit project, is the things that you do the queries against are derived from list, so anytime a list is expected anywhere, you could just pass the result of this around. You don't have to do conversions
Starting point is 00:18:24 like oh now it's a queryable thing. Oh, now it's not anymore. Now it is again. It's kind of always this thing that can stand in as a list but actually has this query capability, which sounds minor, but it seems like a big usability benefit, actually. Yeah, definitely.
Starting point is 00:18:37 Because you want to be able to write functions that can return this and send it to... Maybe you want one of these objects again so you can do further queries. Exactly. Or you can pass it to something that expects a list. That's good. Yeah, it's pretty good. And if you want to upgrade a regular list to one of these, you just pass it to the constructor of this derived list thing, and now it's this queryable list. So it's not quite as nice as true language built-in functionality, but still, this is quite neat, actually.
Starting point is 00:19:05 Yeah, I think I'll play with it because I think that there's some use cases that aren't obvious until you start playing with it. And it's really legible, right? Like, I would like to, from this group, where these are true, select this thing, where that is true about the sub thing you got, right? Like, it's a pretty natural way of reading code. I kind of like it. Yeah, I'm just chuckling about your example, though. I'm sure you didn't make it up. No, this comes from the documentation.
Starting point is 00:19:21 I kind of like it. Yeah, I'm just chuckling about your example, though. I'm sure you didn't make it up. No, this comes from the documentation. What is it? So I've got programmers. I'm looking for somebody with greater than 15 years of experience. I don't want Elon Musk.
Starting point is 00:19:36 Something AV type. I don't know what that is. Yeah, I don't know what that is either. Take three random people, I guess. That's the first three. It's like limit three. Yeah, the first three. If there could be thousands, just give me the first three. It's like limit three. Yeah, the first three. If
Starting point is 00:19:45 there could be thousands, just give me the first three. First three, find out what they had for lunch. And if it was a hot lunch and it was not from Pizza Hut, get the last one and get the last slice of the pizza by the, yeah. Yeah. So what do you end up with? I believe it's a list of pizza slices, which correspond to the last slice of pizza each programmer ate. Oh, okay. I think. I haven't read it yet. I've got to write some code to make that happen.
Starting point is 00:20:10 But yeah, I think that's what you end up with. Yeah. Okay. Cool. Anyway, yeah, pretty neat. But this idea of having this sort of natural query language that mimics databases but is in memory could have some legs. I like it.
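For reference, the example they're walking through looks roughly like this. It's a paraphrase, with the chained method names as recited in the episode, so check the linqit README for the exact API; the data classes and values are hypothetical.

from linqit import List  # pip install linqit

class Lunch:
    def __init__(self, origin, hot, slices):
        self.origin, self.hot, self.slices = origin, hot, slices

    def is_hot(self):
        return self.hot

class Developer:
    def __init__(self, name, experience, lunch):
        self.name, self.experience, self.lunch = name, experience, lunch

# Hypothetical data, just enough to run the chain.
elon_musk = Developer("Elon Musk", 20, Lunch("Home", True, ["crust"]))
avi = Developer("Avi", 16, Lunch("Mama's", True, ["corner", "middle", "last"]))

programmers = List()            # a linqit List is derived from the built-in list
programmers.append(elon_musk)
programmers.append(avi)

last_slice = (
    programmers.where(lambda p: p.experience > 15)   # "where" clause
    .except_for(elon_musk)                           # drop one specific element
    .of_type(Developer)                              # filter by type
    .take(3)                                         # paging: only the first three
    .select(lambda p: p.lunch)                       # project out an attribute
    .where(lambda l: l.is_hot() and l.origin != "Pizza Hut")
    .last()                                          # the last matching lunch
    .slices[-1]                                      # ...and its last slice
)
print(last_slice)  # "last"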
Starting point is 00:20:23 Yeah. What have you got for the last one here? The last one of yours? Well, this one, I just thought we should announce this because it sounded pretty neat, although the details of it are a little over my head. So, Spotify deals with music, right? Yes, probably a lot of music. So one of the things they talked about, and I'm going to cut to the chase, it's an application or a framework called Klio, I think I'm pronouncing that right, K-L-I-O, for processing audio files, or, they say, any binary files, especially large ones, at scale,
Starting point is 00:20:56 things like, you know, maybe pictures or any other binary data. Anyway, so it's an application. It's used for pipelines. It's intended for large-scale input and output of all these files, scalability, reproducibility, working with pipelines and streaming and batching, and to try to get it to be easy to use and easy to read so that you can have a close collaboration between researchers and engineers. It's Python-based.
Starting point is 00:21:23 It looks pretty cool, and for a lot of people doing research and engineers. It's Python-based. It looks pretty cool. And for a lot of people doing research and data pipelines for these either audio files or vision or pictures or something like that, this might be cool. Yeah, it could be all sorts of stuff, right? I can imagine anytime there's a bunch of computation and a bunch of steps, people collecting the data,
Starting point is 00:21:41 someone's going to analyze the output on the other end. It might be a pretty neat use case. Apparently it's built on top of Apache Beam. And so it's also cloud agnostic. It's intended to work with cloud services, but you can use lots of different ones. I'm linking to a couple announcements and an article about it.
Starting point is 00:22:01 It looks like a large effort and probably a large effort for Spotify to make it open source. I think that's pretty cool that they're trying to have this be around. One of the articles says this might be useful for people doing like even comparing dolphin songs or something. There's a lot more use cases other than just music to deal with audio files and this is kind of neat. Yeah, quite cool. Is this a project that Lynn Root was working on? I saw her posting about, I think it was her, she was posting about something. I don of neat. Yeah, quite cool. Is this a project that Lynn Root was working on? I saw her posting about, I think it was her,
Starting point is 00:22:26 she was posting about something. I don't know. Anyway, it looks like a really cool project from Spotify. Does she work on Spotify? She used to. I think she still does. I just tried to pull up her GitHub repo,
Starting point is 00:22:36 but I couldn't find it, or her Twitter account, but I couldn't find it for some reason. Anyway, yeah, this is neat. I'm a little bit surprised they open sourced this, but at the same time, it seems useful. And why are you surprised? This seems fairly proprietary?
Starting point is 00:22:48 No, no, it seems, I guess, like a little mix of a somewhat specialized for their world a little bit, and then not at all related to what Spotify does. You know, like it's not, I don't know, it doesn't seem like it feeds back anything to Spotify necessarily. But I mean, it definitely looks useful. So quite cool. Okay.
Starting point is 00:23:07 Yeah. Yeah. Nice. And nice to see Spotify doing that. Also nice is getting feedback from listeners, right? Yes. Say like a good quarter of our entire history of our show has been like, and did you know about this?
Starting point is 00:23:20 I'm sure you do. You just didn't cover it. No, we didn't know about that. But thank you. Now we do. And so Marco Gorelli sent in something and says, hey, recently you mentioned that it would be really cool if you could have a Jupyter notebook that was more of a report thing, where you could reveal the code, but by default the code cells are collapsed and hidden, so you can just go through
Starting point is 00:23:39 it, and if you say, oh, I actually want to see the code for this particular part, show me. Yeah. So he sent in an article, I guess, walking through how you would do that with a notebook. He's like, hey, I actually wrote something about that, so here you go, in case people care about it. So the idea is basically, set up a Jupyter notebook, am I reading this right, set up a Jupyter notebook, and you can tell it to build an HTML file, a static HTML file with all the data that's in there. Like you would see on GitHub or something, where it's not actually live and computing, but you see the output and you see the cells. Right. And then go tweak that HTML file.
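For the export step, here is a small sketch using nbconvert's Python API; the notebook file name is hypothetical, and the little bit of JavaScript/CSS that actually collapses the code cells, the part the linked article adds to the HTML, isn't shown here.

import nbformat
from nbconvert import HTMLExporter

notebook = nbformat.read("report.ipynb", as_version=4)        # hypothetical notebook
body, _resources = HTMLExporter().from_notebook_node(notebook)

with open("report.html", "w", encoding="utf-8") as f:
    f.write(body)  # tweak this HTML (add the collapsing JS) before sharing it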
Starting point is 00:24:16 I think you tweak the HTML file to just add a little bit of code that will use some JavaScript to collapse those. So then you can just share that HTML file and people can toggle stuff open and closed, and they get more of an article-style output. Oh, nice. Yeah, so nothing super deep or whatever, but if you're in that space, you're like, I really wish I could share this notebook, but those 30 lines of Python, they need to be there, but they're not exactly what I want people to see, I just want them to see the steps, then I think this is pretty neat. Oh, that's cool. Yeah, so definitely useful, short and sweet. Indeed. I love it when that happens, when we come up with something, we're like, I wish this could do this, I wish this existed, and did you know it does? Yeah, exactly, exactly. Well, that's our six items. Do you have anything extra
Starting point is 00:25:01 to talk about? I do have a couple of things that are all small little things. One, there was this tweet I was mentioned in. Let me just read the tweet to you and you just tell me how you feel about this. Okay. Okay. So Michelle A. Renard tweeted, the difference between Western and Chinese societies can be compared to M. Kennedy's difference between software 1.0 and software 2.0. And it links to the sanders media showdown
Starting point is 00:25:26 article. What? So I saw this, I was like, okay, this is weird. Like, we get copied on all sorts of things, like, hey, we're gonna copy 20 people that talk about stuff on Twitter because we published this article and we want you to cover it, and I'm always like, yeah, not so much. But if you hover over this, it says this is actually an AI bot pundit trained to sound informed and profound. The content is mined, the links are popular, and no tweet is the opinion of the author of the bot. I just thought it was really weird that there's this AI bot going around on the internet, you know, using machine learning and stuff
Starting point is 00:26:00 to just randomly be a pundit. Weird, right? Yeah. Anyway, Michelle A. Renard is out there. People do all sorts of weird stuff on Twitter. Yeah, but randomly mentioning just other random people on the internet. Like, hey, that person over here. Anyway, I just thought it was funny and amusing, so I put it up there.
Starting point is 00:26:17 Yeah. It would have been even, I guess, more to the point if they would have said the difference between Python 2 and Python 3. I know. Well, in defense of the bot, even if it was software 2 and software 3, it would have been closer. Yeah, pound software, even. Yeah. And also, Apple just had their big event a little bit ago. Have you got your new Silicon Mac? Have you ordered it, the Apple Silicon? No, I have not. Have you? I ordered a new Mac. I actually was thinking about getting one of those, which was really interesting, because I think it's going to have a lot of knock-on implications.
Starting point is 00:26:49 Like, for example, if you're shipping C code as part of, say, a wheel, do you now have to deal with different platforms on the Mac? Will your package actually work on a Silicon Mac? I don't know. I don't either. But it's going to be interesting. I mean, the stuff needs to be recompiled, so I'm mostly wondering what these Apple Silicon Macs will mean for Python and its packaging ecosystem. Are we going to run into
Starting point is 00:27:17 a place where, well, you can't use NumPy on the new Macs for like three months? I have an idea. Okay. I think if we get a whole bunch of Patreon followers, they can help buy each of us one of these things, and then we can test stuff for people. Yeah, there you go. Perfect. I actually did buy a new-ish 16-inch MacBook Pro that's still Intel, because I was hoping to get one of these new shiny ones, but, well, if they're not going to make a new one, and my current Mac is sort of dying, I'm going to have to buy something. So actually the whole talk about that encouraged me to buy
Starting point is 00:27:49 not the Apple Silicon kind of one. But I'm pretty excited to see where that goes. They've got some huge ML performance speedups and a lot of interesting things. Like, the new MacBook Pro, I think, has 17 hours of battery life? 20 for video playback? Or something insane like that.
Starting point is 00:28:04 That's incredible. I've got like six. I know, the MacBook Air doesn't even have a fan, like it literally can't make a sound, because it has no mechanical pieces, besides typing, I guess. Anyway, just wanted to shout that out, put that out there, and see what people thought about the new Apple Silicon being a different compilation target and what that's going to mean for Python. And think about it, I mean, I guess it really comes down to what happens to CPython, right? Because it'll run under emulation mode if it's not native. But if they upgrade CPython to run natively on Apple Silicon, then what does that mean for packaging? Could be like a whole deal. I don't know.
Starting point is 00:28:33 then what does that mean for packaging? Could be like a whole deal. I don't know. Yeah, a whole big thing. Yeah, hopefully not. You shouldn't joke about it, though. It's not funny. Everything's funny. All right, then tell us a joke. This, though. It's not funny. Everything's funny.
Starting point is 00:28:45 All right, then tell us a joke. This was sent to us, well, not sent to us, it was just sent out by Data HQ. It's a new database. They're doing some interesting things. They just posted, 10x engineers are the future. Engineers with 10 exes. Beautiful.
Starting point is 00:29:03 Beautiful. I've got one for you as well this one comes to us by Richard Carnes I don't know this one's a little bit political but we could probably pull it off do anything anyway why did the data scientists get in trouble with animal
Starting point is 00:29:18 welfare? Because she was caught trying to import pandas. Oh, dear. Pretty good, right? Yeah, thanks to Richard for writing in with that one. That was a good one.
Starting point is 00:29:33 Come on. That's funny. Indeed. Well, thanks again, Michael. Yeah. Fun as always. Same time next week. Yeah.
Starting point is 00:29:41 We'll do it again. Bye. See y'all. Thank you for listening to Python Bytes. Follow the show on Twitter at Python Bytes. That's Python Bytes as in B-Y-T-E-S. And get the full show notes at pythonbytes.fm. If you have a news item you want featured, just visit pythonbytes.fm and send it our way. We're always on the lookout for sharing something cool. This is Brian Ocken, and on behalf of myself and Michael Kennedy, thank you for
Starting point is 00:30:03 listening and sharing this podcast with your friends and colleagues.
