Stuff You Should Know - How AI Facial Recognition Works

Episode Date: February 4, 2020

With the development of increasingly smart artificial intelligence and lots more cameras spread around than ever before, we have reached a critical point in the US and other countries where governments can easily track everyone, everywhere, all the time. Learn more about your ad-choices at https://www.iheartpodcastnetwork.com. See omnystudio.com/listener for privacy information.

Transcript
Starting point is 00:00:00 On the podcast, Hey Dude, the 90s called, David Lasher and Christine Taylor, stars of the cult classic show, Hey Dude, bring you back to the days of slip dresses and choker necklaces. We're gonna use Hey Dude as our jumping off point, but we are going to unpack and dive back into the decade of the 90s.
Starting point is 00:00:17 We lived it, and now we're calling on all of our friends to come back and relive it. Listen to Hey Dude, the 90s called on the iHeart radio app, Apple Podcasts, or wherever you get your podcasts. Hey, I'm Lance Bass, host of the new iHeart podcast, Frosted Tips with Lance Bass. Do you ever think to yourself, what advice would Lance Bass
Starting point is 00:00:37 and my favorite boy bands give me in this situation? If you do, you've come to the right place because I'm here to help. And a different hot, sexy teen crush boy bander each week to guide you through life. Tell everybody, ya everybody, about my new podcast and make sure to listen so we'll never, ever have to say. Bye, bye, bye.
Starting point is 00:00:57 Listen to Frosted Tips with Lance Bass on the iHeart radio app, Apple Podcasts, or wherever you listen to podcasts. Hey, everybody, it's me, Josh, and I'm here to tell you it's official. We're going to be in Vancouver, BC, and Portland, Oregon this March. On March 29th, we'll be at the Chan Center in Vancouver,
Starting point is 00:01:16 and on March 30th, we'll be at the Arlene Schnitzer Concert Hall in Portland. So come see us. Tickets go on sale this Friday. Go to sysklive.com for ticket links and info and everything you need. We'll see you guys in March. Welcome to Stuff You Should Know,
Starting point is 00:01:34 a production of iHeartRadio's How Stuff Works. Hey, and welcome to the podcast. I'm Josh Clark. There's Charles "Scooter Computer" Bryant, and Jerry "Matthew Broderick in War Games" Roland. And this is Stuff You Should Know. That's good. Thanks. What's your name?
Starting point is 00:02:02 It's Josh. Okay, Josh. Ally Sheedy in War Games. Okay. Clark. I know, Aaron Cooper's doing right now. Man, how cute was she in that movie? Like, I think they designed that movie for like every 13-year-old boy in America
Starting point is 00:02:19 to fall in love with Allyshede. Wrong. I think you're talking about Short Circuit. I never saw that, believe that. What, with Johnny Five? I mean, I know the movie. You gotta see it. Really? It's pretty awful.
Starting point is 00:02:32 Okay. Especially with, oh, what's his name? Who is the sleazeball from Fast Times at Ridgemont High who is the ticket scalper? Damone. Yes. Yeah. He plays an Indian, like Asian Indian character,
Starting point is 00:02:52 like full-on brown face and everything. It's really bad. The movie's bad enough, but then now when you go back and see that too, you're like, I can't believe this. I can't believe it. I think he's Italian. Oh, easily.
Starting point is 00:03:08 Maybe Jewish. Or maybe just a straight up white guy. He's definitely not Asian Indian. No, he's not. No. But anyway, go see Short Circuit. Okay. See what you think.
Starting point is 00:03:17 Ally Sheedy just keeps looking at the camera going, I'm so sorry. That was a big hit though. She didn't have anything to be sorry about. So I guess War Games is kind of in your wheelhouse. Oh, yeah. It was a little old for me. Yeah.
Starting point is 00:03:32 Cause I saw that when I was like 12-ish and you would have been, I don't know, how much younger are you? I would have been seven. Seven. Yeah, that's a little young for War Games, I would think. I mean, I still watched it wherever, but I was like, that's hilarious, the computer's called Whopper.
Starting point is 00:03:48 Yeah. I mean, it was right in my wheelhouse. Right. I remember at the end of War Games, they lock in, you know, they're decoding the, the code one number and letter at a time. Very suspenseful. And it, yeah, very suspenseful.
Starting point is 00:04:01 And it finally locks in and me and my friends memorized it. So we could go home and plug it into an Apple II to see what happened. What happened? Nothing. Oh, okay. Cause I actually, How do you plug in a number anyway?
Starting point is 00:04:13 What does that even mean? Oh yeah, that's true. You know? Yeah. I remember a very rudimentary program you could run, where you could type in like four lines of whatever, I don't even know if you call it code and with a phrase and it would run the phrase like a thousand times
Starting point is 00:04:33 all over your screen in a big scroll. And that just thought that was the coolest thing ever. I feel like I remember what you're talking about. It's just like five lines. It was like, the only part I remember is 20 go to 10. And 10 was the phrase I think. Ready, set. Something like, I don't remember.
Starting point is 00:04:51 And I was like, man, let's just play Castle Wolfenstein. That was a good one. Yeah, I never did Oregon Trail. I never did as well. I was Castle Wolfenstein. But like Wolfenstein on the PC. Oh yeah. Like move left arrow, right arrow, shoot dash.
Starting point is 00:05:05 Yeah. Which is some sort of a bullet. Yeah. It was fun. That was fun. It's just like the height of technological gaming. It was, it was at the time. But now Chuck, we've reached the height of technology
Starting point is 00:05:17 which is being tracked everywhere you go. Look at you. All the time by whoever wants to do that. I'm gonna change your name to Josh Smooth Operator Clark for that transition. Very nice. Yeah. I like Allie Sheedy Clark.
Starting point is 00:05:32 Josh Allie Sheedy and Wargames Clark. Yeah, this is a good one. Did you put this together? It was just you and- This is Dave Ruse. Dave Ruse, yeah. Good stuff. And hi Dave, we finally got to meet Dave.
Starting point is 00:05:44 And his family? And his lovely family. That we cursed awfully in front of in Seattle. Felt terrible. Thanks for adding yourself to the mix. And he was like, you know, whatever. He was fine. His kids were adorable.
Starting point is 00:05:59 They were. They were great. They couldn't look at me though. Oh really? They were probably just intimidated by your presence. No, no, it was cause I cursed so badly. So this is good stuff though. Facial recognition technology
Starting point is 00:06:11 that they've been kind of at since the 1950s, which they rolled out as a test in 2002 at the Super Bowl in New Orleans. Did not go that well. No, it was a little clunky back then, but it's gotten a lot better since then. Let me explain why. For anyone who's listened to The End of the World
Starting point is 00:06:29 with Josh Clark, the AI episode in particular, everything associated with artificial intelligence got way better starting around 2007 when neural nets became a viable form of machine learning. Because you don't have to train a computer what constitutes a human face and what to look for. You just feed it a bunch of pictures of faces and say, these are human faces.
Starting point is 00:06:55 Learn what a human face is. And they train themselves. And so around about 2007, 2008, 2009, everything that had to do with machine learning got way smarter because we started using neural nets. And facial recognition software is no exception. Yeah, and there were a few things that kind of converged all at the same time
Starting point is 00:07:15 or around the same time. Social Meads kind of coming on the scene right in that wheelhouse was a big deal. Facebook, this is staggering. Facebook just by itself processes 350 million new photos through its facial recognition software every day. A day. And every time one comes through,
Starting point is 00:07:37 Mark Zuckerberg goes, mwah. Like you think it's neat when you go, when you put a picture up and it says, like, would you like to tag Emily, your wife? Because that's her. And you think, oh, well, that's super easy. Thanks, Facebook. But then you don't think like, wait a minute.
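That tag suggestion, incidentally, is labeled training data in action, and the learn-from-examples idea the hosts describe can be sketched in miniature. This is a toy stand-in, not any real system's code: a single logistic "neuron" trained on made-up two-number "images", where faces cluster in one spot and non-faces in another.

```python
import math
import random

def train_classifier(examples, labels, epochs=200, lr=0.5):
    """Learn weights from labeled examples alone -- no hand-written
    rules about what a 'face' looks like. A single logistic neuron
    trained by gradient descent stands in for a real neural net."""
    n = len(examples[0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(examples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted "face" probability
            err = p - y                      # how wrong we were
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Toy "pixel" data: faces cluster near (1, 1), non-faces near (0, 0).
random.seed(0)
faces     = [[1 + random.gauss(0, 0.1), 1 + random.gauss(0, 0.1)] for _ in range(20)]
non_faces = [[random.gauss(0, 0.1), random.gauss(0, 0.1)] for _ in range(20)]
w, b = train_classifier(faces + non_faces, [1] * 20 + [0] * 20)

print(predict(w, b, [1.0, 1.0]))  # high probability: learned "face"
print(predict(w, b, [0.0, 0.0]))  # low probability: learned "not a face"
```

A real face network learns millions of weights over raw pixels the same basic way: nudge the weights a little after every labeled example it sees.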
Starting point is 00:07:51 Right. How do they know that's my wife? Oh yeah. And you know, it's like with everything else. There's privacy people. That we're like, whoa, do you guys realize what's going on? And then the 99% of the sheep, they're like, huh? No, they're like, no, it's great.
Starting point is 00:08:06 Like I don't have to go in and like make, like click two links or two buttons to make this happen. Yeah, to tag somebody. Way easier. So that was one thing. There's way more photos out there. Good ones. For those machines to learn on.
Starting point is 00:08:19 Yeah, like good, high quality photos. Right. 350 million a day just on Facebook alone. Which means, you know, Facebook alone, which means the machines were getting smarter. They were getting better and better at training themselves. And then lastly, that has led to a ubiquity in facial recognition.
Starting point is 00:08:40 That the better the machines have gotten, the easier it has been to put together datasets for them to train on, which is lots and lots of pictures of people. The cheaper the technology's gotten, which means the more people that are now using facial recognition than ever. Yeah, Amazon has a service called recognition with a K,
Starting point is 00:09:02 which is not a good look. No, it looks very German. There's something about her placing a C with a K. Shoot, stop foolish. That just looks creepy. Like when you spell America with a K. Yeah. It means something.
Starting point is 00:09:14 It means like bad America. Ice cube. Yet they went right full steam ahead and called it recognition with a K. You have to say it like that. You do, I think. And you have to be like squeezing the air out of a syringe while you're saying it too.
Starting point is 00:09:30 So they have this, I didn't even know about this, but it's ubiquitous and it's not super expensive. And that means that law enforcement agencies, they don't have to like create their own. They can just say, well, just sign up for Rekognition. Right, exactly. Because it's there and because it's relatively cheap, you can just get a subscription.
Starting point is 00:09:53 Not just law enforcement agencies. If you have a photo sharing app or whatever and you want some facial recognition technology, you just contract with Amazon and Amazon goes, here you go, here's our data and you can, like our code and you can put it onto your platform. Anybody can use it. So it is kind of everywhere and that makes a lot of people,
Starting point is 00:10:16 including me, very, very nervous because as this guy, Woodrow Herzog, if he's not Werner Herzog's brother, I'll be disappointed. Oh, there's a lot of Herzogs. But a Woodrow and a Werner, come on. Maybe. Anyway, Woodrow Herzog is a professor of computer science and I believe privacy or civil liberty.
Starting point is 00:10:44 He basically says, look, there is no way we're going to reap the benefits of facial recognition without ultimately sliding irreversibly into a dystopian surveillance state where it's happening right now. And if we don't do something about it, it's never going to change back. We're about to fully give up our privacy
Starting point is 00:11:07 because it's one thing to have your phone tracked. You can give up your phone, get yourself a burner phone, like your Jesse Pinkman or something like that. And then you just throw that phone away. You can't be tracked anymore. You can't get a burner face. And if that does become a thing down the line,
Starting point is 00:11:23 it'll be very, very expensive. So the average person can't get a burner face. We'll be tracked by our face everywhere we go. And as we add more and more cameras and this technology becomes cheaper and cheaper, we will be living in a world where there will be zero privacy and we'll be monitored and tracked because it will be so easy.
Starting point is 00:11:45 And it will be sold to us like it's being sold to us now. That's like everything. That it's a law enforcement tool to get the bad guys. That's right. But it's eventually going to extend to include everybody. But what do you have to worry about? You're an upstanding citizen. It doesn't matter if you're tracked.
Starting point is 00:11:59 That's not true. That's just not the case, everybody. It's not the case. All right, we're gonna call that soap box soliloquy number one of what I guarantee will be probably three or four. Okay. Let's talk a little bit about how it works.
Starting point is 00:12:16 It is biometric authentication. It's like a fingerprint or a retina scan. And basically what it does is it is precise measurements of a face to calculate every person's very unique visual geometry. Like how far apart your eyes are. Sure. How far apart your pupils are from your nostrils.
Starting point is 00:12:37 Yeah, your facial geometry. How your face is all set up. I think, yeah, it's even gotten into things like facial hair, skin tone, skin texture. Yeah, I'm sure it'll get just more and more specific. Yeah, and because the machines are getting better and better and easier and easier to train on this stuff, you can just add more and more data to it
Starting point is 00:12:56 and the recognition will just become increasingly good. Yeah, and if you wanna throw off facial recognition software and freak out every human you meet, just shave your eyebrows. Oh yeah, that would be a little freaky. Have you ever seen that? You've seen it before, I'm sure in movies and stuff. It's an interesting thing.
Starting point is 00:13:19 I remember a kid in industrial arts class did that one year. He was like a little, you know, kind of a ninth grade burnout and he just showed up one day with no eyebrows. Like you would, I think like not having a nose would be more easily accepted. Sure. There's something just uncanny when someone shaves their eyebrows.
Starting point is 00:13:37 Like one day they have them, the next day they don't. Was it like immediately recognizable what the thing was or was it like there's something off today? That was it. Whereas if you came in the next day without a nose, the first thing you'd say is, what happened to your nose? What happened to your nose, Todd? Yeah, and Todd would be like, I can't rent a bit.
Starting point is 00:13:56 I fell off. So those measurements we were talking about, what happens then is they compare that just like a fingerprint with a database of images and depending on what this is for, it could be like just within your company or it could be the FBI's database of mug shots or it could be the DMV's database
Starting point is 00:14:19 of driver's license photos, which we'll get into. Yeah, what's interesting is each stage of the way there's a different algorithm that does, you know, each increasingly sophisticated step until you finally have basically like all of the different data points for that, you know, what makes up that facial geometry and then you can compare it to all the other data points.
Starting point is 00:14:41 We think of like a computer running like a, you know, a picture, you've got your input picture and then running, you know, all the pictures next to it. That's not what it's doing. It's running the numbers basically. It's doing computer stuff. Yeah, I love that first step, which is you have to teach the computer what a face is.
Starting point is 00:15:01 Yeah. So, I mean, it seems silly, but of course that's what it is. Well, yeah, because I mean, if you show it a picture of a person standing next to a fire hydrant. Yeah, they zoom in on the fire hydrant and say, hello, handsome. So, this is what a human face looks like, huh?
Starting point is 00:15:16 Yeah, or no, that's a butt. Yeah. And then it starts, you know, closer, closer. All right, now that's a face. You've know what a face is. Right. And now move on to step two, which is. Stop screwing around.
Starting point is 00:15:26 Yeah, so now you know what a face is. You've got to normalize it for the photo, which means there are not that many. Well, that's not true. You put it in a docker's. That normalizes it. Yeah. You isolate that face and then you have to make sure
Starting point is 00:15:40 that it's normalized as far as looking at the camera. So, if you get a photo of someone from a CCTV, let's say, and it's sort of a three-quarter, they have the ability to make it as if it is looking straight at you. Yeah, the computer can pretty accurately predict what the rest of the face looks like. Mm-hmm.
Starting point is 00:16:01 Face, you know, head on, I guess, face on. Maybe. And when it normalizes it like that, it makes it much easier to compare to other pictures because, as we'll see, most of the pictures, or most of the data points that it's comparing it to, are taken from databases of pictures that have been taking of people face on.
Starting point is 00:16:21 Right. So that's why it wants to go through that. Like mug shots or driver's license. Yeah, well, just spoiled it, but yes. Well, we already said that. Oh, we did? Yeah, I did. Okay, I missed it.
Starting point is 00:16:31 Well, I know. Sorry. So, from there, you have more algorithms still that isolate parts of the face. And this is where my old theory that, like, there are only so many sort of facial combinations, so that's why you have doppelgangers. We gotta do an episode on doppelgangers.
Starting point is 00:16:50 Yeah, there's only so many things you can do with two eyes, two eyebrows, a nose, and a mouth, and cheekbones. Right. And a chin. Okay, what else? I mean, there's not a whole lot. There's lips.
Starting point is 00:17:00 Lips, sure. So, what about, That's about it. These are called 11s, the ridges between your eyebrows. Well, if you want to get super specific. But that's what I'm saying. I think they're getting more and more specific. Oh, yeah, yeah.
Starting point is 00:17:13 But my whole point is, and we'll learn in here, in facial recognition, they do use doppelgangers. Yeah. But put a pin in that. So, they recognize all these features, and then each feature becomes what's called a nodal point, or nodal point. I think nodal.
Starting point is 00:17:30 I think. And this is where you're gonna get your super exact angles and distances between all these parts as a flat, two-dimensional thing. Right. Which my question was, because below here, it talks about Apple and their iPhone have a 3D facial recognition.
Starting point is 00:17:49 Is that, is two-dimensional superior to 3D? I don't know. Or is it just because that's what all the pictures are in the databases, so that's what they do? I don't know. What I know is my phone usually unlocks when I look at it. You know what I hate is having to take off my sunglasses. I know.
Starting point is 00:18:06 It's the worst. So, I found, I've got some wayfares that I don't have to take off. Really? But my aviators, I do have to take off. Interesting. Do they keep trying to make you into a mav when you have on the aviators?
Starting point is 00:18:19 Yeah. They go, hey. What is that? That was Tom Cruise laughing and chewing gum. Okay? Wow. I feel like, oh, okay, we gotta keep going. So, I was about to take a break unnecessarily.
Starting point is 00:18:35 So, when the computer's running through the pictures, it just goes like, no, no, no, no, no. Millions and millions of times. And then finally, it goes yes. But when it says yes, and it spits out another picture. It's not like this is that person. No, you want it to be because we all watch NCIS. We all watch CSI.
Starting point is 00:18:56 I don't. Law and order. I don't. We all watch. Party down. I do. Andy Griffith, like all that stuff. Shit, he got me there.
Starting point is 00:19:07 Matlock. Nope. The whole chame. So, we want it to just spit out and be like, here's your person of interest, right? But what it's really doing is it's producing a similarity score that is probabilistic. It's saying there's this percent chance
Starting point is 00:19:25 that this is the same person as the picture, or the person in the picture that you uploaded. Yeah, it's a bit of a guess. It is. A sophisticated guess. It is. And the better computers get at this, the likelier it is that if they say this is a,
Starting point is 00:19:42 there's a 99% chance it's the same person, that it's the same person. Right. But as we'll see, it's up to the human user to determine what is an acceptable threshold of a confidence. Is it 50%? No.
Starting point is 00:19:57 Is it 75%? No. Frankly, it really should be about 99% or higher. It should be the confidence setting. Yeah. The setting for the confidence level. Isn't that what Amazon's recognition says the threshold should be?
Starting point is 00:20:10 I'm glad you said that, man, because it really is creepy. And I couldn't put my finger on it, and it's exactly, I mean, I knew the K looked weird or whatever, but it hadn't hit me that, just how creepy it is. And just how off the mark, or potentially on the mark that name is.
Starting point is 00:20:25 Like if my name was spelled C-H-U-K, I'm sinister. A little bit. You'd be more sinister. I don't think you could ever be truly sinister. I appreciate that. All right, let's take a break. I'm gonna go work on sinistering up a bit, and we'll talk a little bit more
Starting point is 00:20:41 about some of the uses of FR right after this.

So as with all technology, it has to be abbreviated into two letters, the second of which is R. Do they call it FR? I've seen it. Okay. I know.
Starting point is 00:23:20 Yes, I have. That was just being silly, but it doesn't surprise me. Nope, so in FR, Facial Recognition Technology, there are some beneficial uses for it. Yeah, like we said, you don't gotta tag people. So cool. Chief among them, for people like you and me, that's the pinnacle as it stands.
Starting point is 00:23:40 You don't have to tag people yourself. Facebook does it for you. That's what we're trading everything for. I gotta calm down. Okay, there are some other genuinely beneficial uses too. There's a nonprofit company called Thorn that scans missing persons pictures against pictures of children in child porn videos
Starting point is 00:24:05 or suspected human trafficking to get matches. And apparently they've rescued 100 kids so far from using that technology. There's a pretty beneficial use of Facial Recognition Software. Dating apps, let's say you want to, you can get pretty specific on what kind of face you find attractive, which is interesting.
Starting point is 00:24:27 But you can say, I really think... I like guys with high cheekbones and... But no, you would go find... Small lips. It would be more like somebody could be like, oh, I really find Christian Bale attractive. And they get a picture of Christian Bale in this dating app and I would come up.
Starting point is 00:24:45 But I wouldn't because I wouldn't be in the dating app because I'm happily married. Do you think you look like Christian Bale? I'm told that a lot. Really? Yeah. That's weird. I don't think you look anything like him.
Starting point is 00:24:53 I don't either, but people say. Interesting. I don't know what I would do if I was dating now. I guess I would just go to a service and say an Allie Sheedy type. Sure. For more games era. But they'd be like, okay, sir,
Starting point is 00:25:05 you would just upload the picture. You don't have to come into the office, which is really not even open to the public. Yeah, that's true. And just tell us you're interested in an Allie Sheedy type, like a weirdo. You mean dating apps don't have offices where they just field complaints and interested parties?
Starting point is 00:25:19 You sit down and they videotape you with the VHS camera and put you on with some other guys on a tape. That's how they used to do it. Oh yeah, that was one of the subplots of singles, the Cameron Crow movie. Oh yeah. It was expect the best was the name of the dating service
Starting point is 00:25:36 and you would make a videotape and, like, watch videotapes of people, you know, saying who they are. How do you remember that? I was a big Singles fan. That's all a bunch. I got you. Yeah, Expect the Best. You, me, pack of cigarettes and some coffee.
Starting point is 00:25:51 We don't need anything else. Gesundheit. So what else here? This was Taylor Swift and her security team on tour used it to scan the audience to see if any of the creeps who have harassed and stalked her were in the audience. That's super beneficial. Sure.
Starting point is 00:26:10 No one should have to go through that. Also, cops use it in myriad ways, but in particular, especially beneficial, when they use facial recognition to identify people who can't identify themselves. Yeah, that's interesting. Somebody in the midst of a psychotic break perhaps, somebody wasted on shrooms, somebody-
Starting point is 00:26:30 Sir, you're not Jesus Christ. Who has amnesia, like our friend Benjamin Kyle, who apparently knows who he is now, but he's decided not to disclose it publicly. Remember the guy? He was found behind a Burger King near a dumpster, had zero recollection of who he was or how he got there. I think I remember that.
Starting point is 00:26:50 And there was this international publicity, publicizing who he was and that he couldn't remember who he was, and somebody finally came forward and identified him. So now he knows who he was, but he went a decade without knowing. Wow. By the way, when I said, sir, you're not Jesus Christ,
Starting point is 00:27:08 I was making fun of the guy on mushrooms, not someone in the midst of a psychotic break. Oh, I see. Just want to be very specific. I think that was very clear. All right, so I want to make sure. Everybody knows that. So those are some of the good ways that it can be used.
Starting point is 00:27:23 Now let's talk about all the bad ways. Yeah, I mean, when you're talking about the government, you're talking about law enforcement, when you're talking about things like what's going on allegedly in China with CCTVs everywhere, trained to single out ethnic minorities and religious groups, just walking down the street, going about their day, tracking them.
Starting point is 00:27:47 Right, being tracked, yeah. It gets into much different territory than tagging people in dating apps. Yeah, it's pretty difficult to attend your religious service if you're not allowed to attend your religious service and you're being tracked everywhere you go. Yeah, that's why places like, and this is the most predictable thing in the world,
Starting point is 00:28:06 San Francisco, Oakland and Berkeley, and then Somerville, Maine. I knew the Mainers would be there. Sure. Yeah, they're not into this. Live for your die. That's right. They have banned law enforcement from using facial recognition all together in California
Starting point is 00:28:22 as a state and has put a three-year moratorium on the use of it on body cams. Which is a big one. And the ACLU is basically, I know this is jumping ahead, but they're at the point where they're like, we need to tap the brakes here for a few years. And like, cause there's no legislation about this yet.
Starting point is 00:28:40 And it's just going full steam ahead. Yeah, I really, I don't want to run past that. There is, aside from Berkeley, San Francisco, who's the other one? Oakland. Oakland and Somerville, Maine. There are no laws, state, local or federal, governing the use of facial recognition technology by law enforcement.
Starting point is 00:29:03 Yeah, it's just happening very fast. Whatever they want to do, they can do. And in some cases, they do all sorts of stuff with it. They will use it like, NYPD very famously used, what you were talking about with doppelgangers. There was a guy who was caught stealing beer at a CVS. This is amazing. Not even a Duane Reed, a CVS.
Starting point is 00:29:28 And they said, but this guy looks a lot like Woody Harrelson. We don't have a good shot of him to use in official recognition software. You know, we do have, there's lots of great picks of Woody Harrelson. So they went and got a pick of Woody Harrelson and they came up with a match. And they think it was the guy on video in CVS.
Starting point is 00:29:46 And so the Georgetown School of Law produced a study called Garbage and Garbage Out. And they were basically like, that's not okay. You really shouldn't be doing that. But that's the level of legality as it stands right now. It's just open season. And it's just basically whatever you wanna do, you can do as far as facial recognition is concerned.
Starting point is 00:30:10 And that story in particular, it's like some people are like, awesome, the system works. Other people are like, what about poor Woody Harrelson? He was really in danger right then of being implicated in this beer stealing scheme from CVS. And Woody said, what dude? I love that guy. I do too.
Starting point is 00:30:27 Man, true detective the first season. First four episodes, just amazing. That's called using a probe photo when you use, when you say, hey, that looks like someone, they also did the same with one of the New York Knicks apparently. I could not for life when we find out who. Yeah.
Starting point is 00:30:44 It's like he's being protected or something. Maybe. No one said who it was. A couple of numbers for you though. The FBI receives about 50,000 facial recognition search submissions a month for their database. So that's the other thing. If you don't have even the money for a subscription
Starting point is 00:31:02 to Amazon recognition or you don't have an IT person who's capable of assembling it and putting it, you know, using it, you can just submit these requests to the FBI. So there's a lot of different avenues you could take as law enforcement to use facial recognition technology to catch suspected criminals. Yeah.
Starting point is 00:31:23 I was about to say bad guys, but as we'll see, that's not always the case. So here's some more numbers though, because it needs to be regulated, but when it works, it really works. Yeah, it really does though is the thing. Yeah. There was one department where they said it lowered
Starting point is 00:31:40 the average time required for an officer to identify a subject from an image from 30 days to three minutes, which kind of brings home the point. There's another number in here that's interesting, but. 17. It brings home the point that like, this is something that human policemen were doing,
Starting point is 00:32:00 officers were doing with their eyeballs by flipping through books. Yes. For 30 days straight, saying like, ah, it doesn't look like this person. This is like a chance to really speed up that process and to spend more time in theory catching bad guys. Yes.
Starting point is 00:32:16 I'm not arguing for it. I'm just saying they were doing this anyway just through manpower. Right. I think the thing is anytime you add artificial intelligence, it automatically makes the side using the artificial intelligence unfairly advantaged. It's not like the criminals are able to use AI
Starting point is 00:32:37 to steal beer from CVS more effectively, but the cops are using AI to catch them stealing beer more effectively. And it's kind of like, yes, it makes sense to catch like child pornographers and human traffickers and rapists and murderers and violent criminals with this stuff. But using that kind of technology
Starting point is 00:32:56 to catch somebody who stole beer from a CVS, that's when it starts to feel like, what kind of society are we moving toward? You know? Well, I think someone- Not, hold on, let me keep going here for a second because I don't want people to be like, what are you in favor of the guy stealing beer from CVS?
Starting point is 00:33:11 No, I'm not. I think you're a scumbag if you steal beer from CVS. But I also think that it's overkill to use facial recognition technology to catch that person. Use old-fashioned police tactics or don't catch them. Yes, yeah. That's just kind of the fairness of the old West in New York City.
Starting point is 00:33:28 I think I might be on the other side because I don't think we need to set a fair playing ground between criminals and cops in saying like, it's unfair that cops can use this stuff and criminals are just out there not able to use these same techniques. Okay, so my, the fairness thing, it doesn't just end at the law and order thing, right?
Starting point is 00:33:49 Like it's not just with cops using it, that they have this huge advantage. I totally get how people would be like, no, give the cops that huge advantage. I don't have an issue with that in and of itself. I think my issue comes a step or two down the road where the government or the cops acting on behalf of the government use that
Starting point is 00:34:13 against everyday citizens who have no recourse whatsoever. Right. That lopsidedness that's so evident when you're using AI to catch somebody's stealing beer from the CVS, it's really easy to kind of follow that a little further across to the horizon and see just how unfair life could be and how oppressive that could be using that technology.
Starting point is 00:34:37 I think that's ultimately what I'm saying. All right. I hopefully dug myself out of that hole by now. So, and this gets into some of the controversies and the arguments. If you're scanning mug shots for rapists and arsonists and murderers and violent criminals and you're catching people,
Starting point is 00:34:59 you're not gonna find a lot of people that say, well, that's not fair, go back and use, take a month to look through a mug shot book instead and waste a bunch of time and don't be efficient. So I think most people would say, if you're looking at mug shots, although we should point out that a mug shot doesn't mean, that just means you were arrested.
Starting point is 00:35:17 That doesn't mean you were guilty of anything. Right. So there were plenty of opportunities for false positives and people being put in jail that shouldn't be. Right, but- But there's not a lot of people who are like, no, don't use mug shot databases. Right, exactly.
Starting point is 00:35:32 If you're scanning driver's license databases or other just general public databases, that's when it gets super tricky because we can't avoid the fact that, as the Center on Privacy and Technology stated very plainly, what that means is everyone is in a perpetual lineup, essentially.
Starting point is 00:35:53 Right. If you have a driver's license, you're part of a police lineup. Yeah, whether you like it or not, whether you know it or not. And if that computer says, here's the guy, it's Chuck Bryant, they will say, oh, he doesn't strike me as very sinister and the computer will be like, trust me, this is the guy
Starting point is 00:36:10 with like an 80-something percent confidence interval. Chuck, suddenly you're going to get visited by the cops and maybe you'll even get arrested because you were a little cagey when they talked to you and you set off their cop radar or whatever. Right. And then the next thing you know, you're in court being charged with a crime that you didn't commit
Starting point is 00:36:28 because a computer implicated you. Right. And the cops thought that you were acting cagey and let's say that you were a very, very poor person and you don't have any money to mount a decent defense, the best you can afford is a free public defender who has 50 other cases and is not really paying very much attention to you
Starting point is 00:36:45 and you're in jail now because you got convicted wrongly because you were putting a lineup just because you had a driver's license. Yeah, I think for me, and this is total my privilege coming through as well, like I'd want to see some numbers and if one of every 10,000 arrest and conviction of a real criminal or an a rapist and a murderer
Starting point is 00:37:08 and there's three people that get falsely identified and have to go through the system and may or may not be acquitted, I'd want to see those numbers. But again, that's coming from a privileged position as someone who could afford a legal defense. Right. Who is white.
Starting point is 00:37:24 Yeah, exactly. That's another one too, is that people of color bear an inordinate burden, a disproportionate burden when it comes to facial recognition technologies we'll see. Well, I mean, you might as well go ahead and talk about that. I think from the beginning, even with social media, there were certain facial recognition, early facial recognition technologies that admitted
Starting point is 00:37:47 like we're not as good as recognizing faces with darker skin. It's just not that good. Yeah, I think something like darker skin men and women were recognized 12% and 35% were misidentified compared to 1% and 7% of light skinned men and women. And they say it's because of the data sets that these machines have been trained on.
Starting point is 00:38:12 Which is crazy. It's not purposefully, but it makes sense if you live in like a generally like, like the white people are in power and it's like whiteness is the most celebrated part of the society or whatever, that's what you're gonna have more pictures of. And when you feed just a bunch of pictures
Starting point is 00:38:31 from your society into a machine and say, learn what faces are, it's gonna go, oh, white men, I got you. Well, there just are more white people numbers wise. So that probably has something to do with it. Right, yes, that's a really, that is an excellent point as well, for sure. But the fact of the matter is,
Starting point is 00:38:48 the data sets that the machines are learning on are largely white and largely male. And so they're just not as good at recognizing the differences in faces among people who aren't white males. Yeah, let's read these quotes. There's a couple of good quotes here. The first one is from Woodrow Hartzog.
Starting point is 00:39:13 What? I was gonna read it as Werner, I don't know if I can. I should get Nolan here, he does a good Werner. Oh yeah? The most uniquely dangerous surveillance mechanism ever invented, it's an irresistible tool for oppression that's perfectly suited for governments to display unprecedented authoritarian, I'm sorry,
Starting point is 00:39:32 authoritarian control and an all-out privacy-eviscerating machine. That was dead on. I just realized it's Hartzog, so it's spelled differently. It's H-A-R-T-Z-O-G; Herzog is H-E-R-Z-O-G. I'm glad that we didn't figure that out beforehand though. You wanna take the other one though? Also, I have to say I detected a hint
Starting point is 00:39:55 of Michael Caine in there too. There might have been a little bit. It's hard to get Michael Caine out of my system. What's the other one? Oh, from Microsoft President Brad Smith. Yeah. So Brad Smith says that when combined with ubiquitous cameras and massive computing power
Starting point is 00:40:10 and storage in the cloud, a government could use facial recognition technology to enable continuous surveillance of specific individuals, like they're supposedly doing in China, as an aside. It could follow anyone, anywhere, or for that matter, everyone, everywhere, at any time, or even all the time. And he wasn't, this wasn't a sales pitch. He was speaking out against this to Congress saying like,
Starting point is 00:40:31 guys, we gotta, we have to do something about this because this is the, the path we're heading down. And that's why Seth Abramowitz changed his name to Brad Smith. It sounds like a total, like, made up. Brad, it does. Yeah, like, I just wanted to blend in. So you've got scanning against mugshots,
Starting point is 00:40:52 scanning against driver's licenses, and then there's a new one that just came out. The New York Times just released this expose on January 18th, just a few days ago, on a company called Clearview AI. And apparently, even among Silicon Valley, there has been this longstanding kind of unspoken thing where let's steer clear of this facial recognition technology
Starting point is 00:41:16 because it's such a tool of oppression, potentially. And Clearview AI said, hey, we're not from Silicon Valley. Well, we're just gonna do our own thing. So now, there's this, From Sacramento. There's this tool that's available to law enforcement agencies that they're using. Remember that one guy who had a quote saying
Starting point is 00:41:33 that it went from 30 days to three minutes? They were almost certainly using Clearview AI. And the reason Clearview AI has such an advantage is because they've gone to this place that everyone else said was off limits, which is scraping social media. So rather than the 41 million driver's license and mugshot pictures that are available in the FBI's database,
Starting point is 00:41:57 Clearview AI is this app that you can subscribe to for a year for like 2000 to $10,000. And they have three billion pictures, including links to the social accounts of the people whose pictures come up so that you can not only see who it is, you can find out where they're at right then.
Starting point is 00:42:16 And it's a hugely invasive thing. And there's no legislation on this whatsoever. And it's only just recently come out that this company even exists or that this app exists and that law enforcement is using this stuff because again, there's basically no laws saying you can do this, you can't do that. And again, Woodrow Hartzog has basically said
Starting point is 00:42:36 there's no way we're going to realize the benefits of this without the incredibly disproportionate drawbacks. And he just calls for an all out ban of the technology. He's basically saying it's not worth it. All right, let's take another break. Oh my gosh, we haven't taken our second break yet? Nope. Okay. And we'll be right back to talk about the rest of this stuff
Starting point is 00:42:58 right after this.
Starting point is 00:45:12 I think we should talk a little bit, like we've talked about the false positives. And I think within Amazon, their contention is that what you're talking about with like these studies out of MIT that said that there are too many false positives, they're saying, wait a minute,
Starting point is 00:45:44 you're talking about facial analysis, not facial recognition. And those are two different things. I did not understand this at all. I went and looked it up and there's- I didn't fully get it either. It sounds like some tap dancing to me. I looked it up and there's like not a distinction
Starting point is 00:45:58 between those two, aside from in this quote. Oh, really? Yeah, it's basically the same thing. And also it doesn't even make sense as a defense. So basically what they're saying is that that they were being called out by MIT's Media Lab. They did a 2018 study. That's the one that found that there was like a 12
Starting point is 00:46:17 and 35% misidentification among darker-skinned men and women. And especially women, I think. Yeah. And Amazon said, no, no, no, you guys are using facial analysis, not facial recognition. It's like, no, that's not the case at all.
Starting point is 00:46:34 They're doing facial recognition. All right, I'm glad it wasn't just me because you see I wrote, I don't get it next to this. It was a bad jam, I guess. But I think their point was, well, you're trying to tell the gender of somebody and if you're doing binary gender stuff, like you're trying to say this is male or female,
Starting point is 00:46:55 you can't really use facial recognition for that, especially among darker skinned people. And they said that you shouldn't use that, especially in cases of people's civil liberties or whatever. But it still remains the case that if you are a darker skinned person and you're being looked at by a police department that has their threshold for a confidence level set low,
Starting point is 00:47:21 there's a chance that a false positive is going to be put out there. And that can be trouble for you if you don't have the money to mount a defense. And even if you do have the money, you shouldn't have to mount a defense and spend money on that to be acquitted of a crime just because the computer's not so good
Starting point is 00:47:37 at distinguishing black people like it is among white people. Yeah, and when it comes to where this is going to end up legally, you might wanna look at the Fourth Amendment. It gets really dicey on how you interpret the Constitution when you talk about illegal search and seizure. Is this a search or a seizure?
Starting point is 00:47:58 Probably not, because it depends on what we're talking about with the Supreme Court. You've probably been stopped at a DUI checkpoint. And that's stopping everybody. That's sort of the same thing. It's like, if you're in a car, we're gonna stop you and check you out
Starting point is 00:48:15 because the public, the public has said, that's okay, it's reasonable, it's not super invasive. And if you're stopping drunk drivers, it's just putting someone out for a few minutes. Yeah, the court said if it's minimally invasive and the public good or the potential for public good,
Starting point is 00:48:33 which is in this case, getting drunk drivers off the road is high enough, then it's okay to basically search everybody without probable cause. Yeah, same with TSA checkpoints. When it comes to official rulings, obviously we don't have one in facial recognition yet, but if you look at Carpenter v. United States,
Starting point is 00:48:51 the court ruled 5-4 that police violated Fourth Amendment rights of a man when they asked for his cell phone location data without a warrant from T-Mobile. So hopefully this nuance will prevail and it just won't, it looks like it probably won't be some blanket ruling that just says, yep, you can use it for whatever you want.
Starting point is 00:49:10 Right. If it even gets to that point. And if the court hears it, which it probably would. So the other thing that has become worrisome for people though is it's becoming, our society is becoming increasingly surveilled, right? Like ring, the ring doorbell. Sure. They market to law enforcement basically saying like,
Starting point is 00:49:32 you can, these people like will pay to have video cameras put on their house and you can go get these videos. It's on neighborhood pages all the time. People like my car got broken into who can help me out with their cameras. Right. You're being marketed to law enforcement.
Starting point is 00:49:44 Your TV has a camera in it. Your smart speaker has a microphone in it. So the more that we are surveilled and the more ubiquitous facial recognition technology gets, the easier it will be to not just scan a picture of somebody stealing beer from a CVS against a mugshot database or driver's license database, but to say this person right here
Starting point is 00:50:09 that you're looking at that the camera's following. That's this, that's Chuck Bryant right there. And everywhere you walk, there's a little icon next to your head, Chuck Bryant. If you click it, it'll show you your Facebook page or a map up to your house or whatever they want to know. Your police record, it doesn't matter. And that this is what we're increasingly
Starting point is 00:50:30 getting closer to. And some people say this is what they're already doing in China. Yeah. And London has, they were one of the first on the CCTV train. Yes, but they use humans, which is fair.
Starting point is 00:50:43 Right. Oh, for recognizing faces? Yeah, they have people like actually looking at the individual monitors looking for crime. This is the idea of this is just tracking people who are just doing nothing wrong. Yeah, but there are plenty of people on the other side we should point out that are like,
Starting point is 00:51:00 you know what, if you're catching bad guys, that's great. If you're a good guy, you got nothing to hide, so you shouldn't sweat it. Yeah, I can never remember the name of the article. I'll try to find it, but there's a, man, I wish I could remember off the top of my head, but there's this amazing article from a few years back that basically says like, that's a terrible argument
Starting point is 00:51:21 that even if you have nothing to hide, you still are a human being. And if somebody wanted to put together, yeah, but if somebody wanted to put together like a dossier on yours, embarrassing things that you've done or said or thought or whatever, and put it all together and condensed it, you can make anybody look bad.
Starting point is 00:51:44 No one should want to live in a situation where like that could conceivably happen in there. A police state? Yeah, a police state. Yeah. Good stuff. I guess we'll see how it pans out. I'm not saying.
Starting point is 00:51:58 Police state. FR is good stuff, police state's good stuff. Yeah, we'll see what happens. Write in, Woodrow Hartzog, and let us know what to do. If you want to know more about facial recognition technology, you can go onto the internet and start reading stuff about it. Definitely read the New York Times expose about Clearview AI.
Starting point is 00:52:20 It came out January 18th. Yes. Okay, since I said that, it's time for listener mail. All right, no it's not. You know what it's time for. Oh yeah, I know what it's time for. You ready? Yeah, you say it.
Starting point is 00:52:30 For administrative details. All right, this is part two. This is where we thank people on the show that have sent us kindnesses via snail mail. Siggy, S-I-G-G-I sent me some hand knitted socks. Not you for some reason, I don't know why. I got some socks too. Oh really?
Starting point is 00:52:58 Yeah, I didn't know who they were from. So. They may be from Siggy. I think probably what it was is you left them with my desk. Okay. And I thank you for it, Chuck. All right. Do another one while I'm pulling up my list.
Starting point is 00:53:10 My computer's acting up. Julie Shoup made us t-shirts. Shoup. This is good stuff, faux band name, tour shirts. Super fun. Thanks a lot, Julie. Very cool. You're still working, so I'm gonna keep going.
Starting point is 00:53:23 Thalia Dawes, is our pal from Australia, sent my daughter a couple of books. Oh. She's a very lovely lady who has a very adorable and whip smart daughter about the same age who listens to our show. And I was just like, man, I wish she lived here. We could go on a play date.
Starting point is 00:53:41 Yeah. They both seem like lovely humans. There's such things as planes. Yeah, go to Australia for a play date. So at our Portland main show, Chuck, we had like a lot of, we got a lot of neat gifts. Jim Diefenbacher made us amazing crosshatch portraits. Oh yeah.
Starting point is 00:54:00 The prints of them. Yeah, those are great. Of us, like, of a photo we took, I think on like our West Coast tour from 2015. Yeah, it brought back some memories. Yeah, it's just really great stuff. And you can see Jim's work at jimdiefenbacher.com, J-I-M-D-I-E-F-F-E-N-B-A-C-H-E-R.com.
Starting point is 00:54:18 And they were framed in everything. Yeah, very sweet stuff, Jim. We got some home-tapped maple syrup from Andy Hunsberger from Elgin, I-A. Okay. Is that Iowa? Yeah, yeah, okay. I was about to say the wrong state.
Starting point is 00:54:37 What were you gonna say? I don't know. I think I went to say- Illinois. Have you ever seen Gary Goldman's bit on abbreviating the states? No. Dude, just look it up.
Starting point is 00:54:45 One of the great comedy bits I've ever seen. Okay. It's hysterical. Let's see. Oh, another at the Portland show. We got a letter from Togue Braun from Down East Dayboats. From Lloyd Braun? Toge Braun.
Starting point is 00:54:59 Oh. And Down East Dayboats mission is to bring sustainable, delicious scallops from Maine to the world. And she said that scallops have varietals like oysters and that Maine has the best. So check out DownEastDayboat.com. I love scallops. And Togue Braun, feel free to send us some scallops
Starting point is 00:55:18 as long as they've been appropriately refrigerated the entire time. Yes. I got another children's book. Are You a Good Egg? And that was from Peter Deuschel, along with some stuff you should know, Coasters. Yeah, yeah.
Starting point is 00:55:31 Thanks again, Peter. I think we thanked him last episode for the Coasters, too. Oh, really? Didn't know about the children's book. Sarah Law, who is a SYSK Army member, she came to the Toronto show and she brought us a bunch of Canadian goodies. Oh, yeah.
Starting point is 00:55:47 Everything from Japanese cheesecakes and tarts from Uncle Tetsu, which is so good. And I think some other stuff, too, like coffee crisps, which are my favorite. Yeah, so thanks a lot, Sarah, as always. Why is everything from Japan awesome? They just... It's really good.
Starting point is 00:56:05 They don't necessarily invent much. They just take other people's inventions and perfect them. Yeah, and it seems like they take a lot of pride in doing things right. Yeah. I think he could say that, probably. Yeah. Because we got from Matt an assortment
Starting point is 00:56:20 of food things from Japan that came in today, including our beloved Kewpie mayonnaise. I love that stuff. It's been too long. It's been too long. Thanks a lot, Matt. God bless you. Let's see, Leah Harrison gave us some amazing goodies, too, including coffee crisps and Canadian smarties,
Starting point is 00:56:39 which are way better than American smarties because they involve chocolate. Super smarties. A student named Maria Styling wrote us a letter for an honors English project because she had to write someone who inspired her. And she asked this and I told her we'd answer, how do we choose a topic?
Starting point is 00:56:56 Maria, we choose a topic. It's not... It's pretty lo-fi. We just send each other one each week on whatever happens to grab our fancy. We're always looking around our world and thinking, huh, I wonder about that. Yeah, and that's as easy as it gets.
Starting point is 00:57:11 And we'll just send each other an email. And 99 times out of 100, we'll say, great, let's do it. Yeah. Boring, I know. Let's see. Oh, Michael C. Lerner, who's an attorney at Law & Reno, sent us a letter about getting the word out about the National Consumer Law Center,
Starting point is 00:57:29 for which Lerner does a lot of pro bono work for people who are poor and getting screwed over because of debt, as he put it. So he pointed us to the National Consumer Law Center and the Practicing Law Institute's Consumer Financial Services Answer Book. So if you are in debt and you're getting pushed around, go check those things out, says Michael C. Lerner.
Starting point is 00:57:48 Good stuff. Van Ostrand, we got to thank him again. Our buddy from Washington sent us a book by his friend, Andy Robbins, called Field Guide to the North American Jekyllote. It's pretty awesome. Yeah, it's very fun. Paul Esbeth from Mars Community Brewing Company in Chicago
Starting point is 00:58:04 gave us a bunch of beer at the Chicago show. Thank you for that. I got one more. Okay. I'll go ahead and finish up and then you can round us out. Man, I have a whole page left. All right. Robert Highland from WAMO.
Starting point is 00:58:16 This just came in today. Okay. He works for WAMO. He sent us each their 70th anniversary Superboat. Oh, wow. Thanks a lot. He's like, you guys talk a lot about WAMO products. Does it bounce?
Starting point is 00:58:27 I have not dropped it on the floor yet. Let's find out after this. I'll give it a try. I'm going to do a couple more and then maybe we'll split these up because they're for both of us for another episode. That's up to you. He can blaze through them too. No, there's too many.
Starting point is 00:58:39 Okay. So let's see. The Crown Royal people again for hooking us up. Very sweet. They've hooked us up many, many times. And they gave us a nice congratulations because we got the best curiosity award from the I Heart Podcast Awards last year.
Starting point is 00:58:56 Oh, yeah. That's how old this one is. Mick Sullivan gave us a copy of his book, The Meat Shower, which is amazingly illustrated. You can check it out on the past and the curious.com. Meat Shower. Yeah. That just sounds really gross.
Starting point is 00:59:08 It really does. Let's see. And I'll round everything out with Danielle Dixon, who is a real-life marine biologist, Chuck, at the University of Delaware. And she sent us a couple of copies of her kids' book, Sea Stories, children's books based on real science. You can check it out at s-e-a-s-t-o-r-y-books.com.
Starting point is 00:59:30 All right. You going to save the rest? I'm going to save the rest. We'll split them up. All right. Thanks everybody who sent us stuff. And thank you also just for saying hi to anyone who does.
Starting point is 00:59:40 You can say hi to us by sending us an email. Wrap it up, spank it on the bottom, send it off to stuffpodcasts at iHeartRadio.com. Stuff You Should Know is a production of iHeartRadio's How Stuff Works. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen
Starting point is 01:00:00 to your favorite shows. Thanks, guys. Love you, bye. Thank you. Love you too.
