Stuff You Should Know - Will Deepfakes Ruin the World?

Episode Date: July 30, 2019

Very recently, thanks to a new type of AI, it’s gotten much easier to create convincingly realistic videos of people saying and doing things they’ve never said or done. Will fake videos undermine our shared sense of reality and lead to the death of truth? Learn more about your ad-choices at https://www.iheartpodcastnetwork.com. See omnystudio.com/listener for privacy information.

Transcript
Starting point is 00:00:00 On the podcast, Hey Dude, the 90s called, David Lascher and Christine Taylor, stars of the cult classic show, Hey Dude, bring you back to the days of slip dresses and choker necklaces. We're gonna use Hey Dude as our jumping off point, but we are going to unpack and dive back into the decade of the 90s.
Starting point is 00:00:17 We lived it, and now we're calling on all of our friends to come back and relive it. Listen to Hey Dude, the 90s called on the iHeart radio app, Apple Podcasts, or wherever you get your podcasts. Hey, I'm Lance Bass, host of the new iHeart podcast, Frosted Tips with Lance Bass. Do you ever think to yourself, what advice would Lance Bass
Starting point is 00:00:37 and my favorite boy bands give me in this situation? If you do, you've come to the right place because I'm here to help. And a different hot, sexy teen crush boy bander each week to guide you through life. Tell everybody, ya everybody, about my new podcast and make sure to listen so we'll never, ever have to say. Bye, bye, bye.
Starting point is 00:00:57 Listen to Frosted Tips with Lance Bass on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts. Welcome to Stuff You Should Know, a production of iHeartRadio's How Stuff Works. Hey, and welcome to the podcast. I'm Josh Clark, there's Charles W. Chuck Bryant. Here's our brand new producer from now on, Josh.
Starting point is 00:01:22 Hello, Josh. Have we said your last name, Josh? We don't do that. No, okay. Sometimes we say Jerry's. Jerry quit. She really did. But she did it like that kind of like quiet, silent way where she just stopped showing up, yeah. Jerry didn't quit everyone, we don't think.
Starting point is 00:01:39 We're not, it's not entirely certain. Until I see her sitting in that chair, then I'm assuming she's quit. What if Chuck, she sent us a video of herself saying, I quit, I'm so sick of you guys, I'm done with this forever. Would you believe it then? But the lips didn't quite match up. Yes. Then it would be a deep fake.
Starting point is 00:02:01 It would be a deep fake. I saw a tweet, I don't remember who it was, but they were, maybe Ryan Lizza or somebody was complaining, said, why did we call these things deep fakes, and somebody schooled them on it. It was kind of nice to watch. Who did you say this was? Ryan Lizza, I think.
Starting point is 00:02:17 I don't know who did this. He's like a CNN correspondent, journalist. So, first of all, we want to issue a COA here that maybe kids shouldn't listen to this one. We're talking about some really dark, harmful stuff. Yeah, really despicable, gross stuff. The only thing I can think of that would be worse
Starting point is 00:02:40 than covering this would be to do one on like snuff films. I kept thinking of that while I was reading this. I don't know, man, I don't want to pit types of despicable media against one another, but I think revenge porn might have a leg up on this. Well, which this sort of is as well. Right, it's definitely a close cousin of it, at least. Yeah, but this one isn't for kids.
Starting point is 00:03:05 And I was shocked and dismayed because I didn't know about this. And when I saw it's a podcast on fake videos, I thought, well, how fun, because I love those videos of David Beckham kicking soccer balls into cans from a hundred yards out on the beach. That's for real. No, yeah.
Starting point is 00:03:25 And it's just coincidences holding that Pepsi can up. I saw it with my own eyes. I thought that's what this is about. I was like, well, those are fun. It is kind of. But then I wanted to take a bath after this. I can understand. So we should probably start out after the COA
Starting point is 00:03:42 by saying what a deep fake is. Deep fake, D-E-E-P-F-A-K-E, all one word, is a type of video where somebody is saying or doing something that they never actually said or did. Which you say, okay, this is nothing new. This has been around for a while. Like people have doctored photos and videos and stuff like that for basically as long
Starting point is 00:04:04 as there's been videos. Or CGI. Sure. This is different. This is in the same ballpark, but this is an entirely different league. Like this league plays on Saturday and Sunday afternoons, not Tuesday night, you know what I mean?
Starting point is 00:04:18 Like this is something totally, just let it simmer for a little while. And you'll be like, wow, it's a really good analogy. This is just, it's different. It has a lot of the same in principle, but because they are so realistic and they're getting more and more realistic by the day, they actually, in a lot of people's minds, pose a threat.
Starting point is 00:04:37 Not just to individual people, as we'll see, but possibly to society at large, say a lot of people who are really worried about this kind of stuff. Yeah, and we're not talking about, I'm assuming the fake lip reading thing. That's deep fake, right? Or is that just no manipulation of video whatsoever?
Starting point is 00:04:57 And that's just people using their voice. So what that is, is the, yeah, it's just somebody pretending, like they're just fake lip reading and then doing a voiceover of it. So they're not manipulating video at all? No. Okay.
Starting point is 00:05:11 No, they're just doing a really bad job of lip reading. Those are hysterical. They are hilarious. I would put those up with the GI Joe PSAs. Yeah. Like pork chop sandwiches. Like those are just all-time classic, can watch them any time and still laugh.
Starting point is 00:05:26 Have you ever seen the GI Joe's action figures on like the dead road kills? No. It's sad because it's a dead animal, but there'll be like a dead squirrel in the road and someone will pose like a GI Joe action figure with his like foot on his head. Like it's a trophy game.
Starting point is 00:05:44 It's kind of funny. Yeah, I can see that. It's, let's put it this way. It's as funny as you can make a picture of a dead animal. Right. That I assume got hit by a car. Yeah, hope. I mean, maybe they are like killing squirrels just to-
Starting point is 00:05:57 After reading this, I don't doubt anything and it makes me hate the internet even more. All right, so let's get into this a little bit. Okay, Chuck, calm down. We're not allowed to have personal positions on these things. Oh, that's right. So this is totally a neutral thing. Okay.
Starting point is 00:06:11 So there's this really interesting Gizmodo article that talked about the history of kind of, not necessarily deep fakes, but altering videos. Like presenting a doctored video as reality. Yeah. And apparently there was a long tradition of it at the beginning of cinema where people got their news from newsreels.
Starting point is 00:06:31 Like you actually go to a movie theater to see the news because you were just an early 20th century yokel living in Kansas or something like that. Yeah, and after reading this bit, I thought that was a very Gizmodo way to say, here's one not so interesting fact that really has not much to do with this. Oh, I love it.
Starting point is 00:06:50 Really? I personally selected it and put it in here. Yeah, I thought it was kind of funny. I think it's great. Yeah, they used to fake real life events and recreate them. Don't try to backpedal. And that has nothing to do with deep fakes. It does because one of the big problems or threats
Starting point is 00:07:06 from deep fakes is it's a way of seeing what you think is news, but it's not. It's a sham, it's recreated. Yeah, the difference I see is they were recreating real news events and just like here you didn't see it. So this is what it may have looked like. But they were passing it off as real.
Starting point is 00:07:24 Therein lies the tragedy of all time. I thought it was a very thin Gizmodo-y. Ah, whatever, we'll edit this part out. Webster's defines deep fakes. I like it, I put it in there specifically because I thought it was good. That's all right. Okay, so we'll take another tack.
Starting point is 00:07:40 Okay, why don't we talk about deep fakes? So Chuck, let's talk about deep fakes. We can just cut there. So deep fakes actually are super new. Yeah. And the reason they're called deep fakes is because in late 2017, I think November, this guy who was a redditor, a guy who posts on Reddit.
Starting point is 00:07:59 That's your first warning sign. Not necessarily, Reddit's pretty sharp and smart and got some good ideas going on. Of all the social media platforms, I throw my two cents in with Reddit. All right. But there was a redditor called Deepfake, D-E-E-P-F-A-K-E, all one word.
Starting point is 00:08:17 And he said, hey world, look at what I figured out how to do. And he started posting pornography but with celebrities faces transposed on it. And he said, this is just my hobby, but here's how I did it. And he said that he used, I'm assuming it's a him. I don't know if it's a man or a woman.
Starting point is 00:08:36 I'm gonna go with a man. Okay. And he said, I just used Keras and TensorFlow. And these are a couple of basically open source AI programs that this guy was smart enough to figure out how to use to train to create these videos where you take a celebrity's face and put it on a clip from a porn movie
Starting point is 00:08:58 and it looks like the celebrity is doing what you're seeing. Right. And at first it was kind of hokey and not very, it was very obviously not real. Yeah, I think the scary part was how quickly and easily it could be done. Motherboard who we used to write for every now and then. Remember that?
Starting point is 00:09:20 I tried to forget. They tried to forget for sure. Yeah, Vice hit us up what? Like, I feel like seven or eight years ago. I'm trying to forget and you're making it really hard. So you guys wanna write some blogs for Motherboard? He said, sure. So we did.
Starting point is 00:09:32 We did. We wrote 10. Yeah, you can probably go find those on the internet if you wanna learn how to drive a stick shift or something. The fine people at Motherboard scrubbed those from the internet forever. Let's hope so. So this DeepFake character figures this out.
Starting point is 00:09:47 Another guy released a downloadable desktop software that said, here, you can do this awful thing too. Right, within like two months of DeepFake coming out and saying, look what I did and here's how I did it. Somebody said, that's a really good idea. I'm gonna turn it into an app and make it, give it to everybody. That's right.
Starting point is 00:10:05 And now people can, at this time, this was a very short time ago, people, and it's really come a long way in the past, whatever, not even two years. Right. Because this was late 2017, right? Late 2017, early 2018 when it really first popped up. Yeah, so this thing was downloaded 100,000 times
Starting point is 00:10:22 in the first month alone. And some people used it for fun stuff, like putting Nick Cage in movies he wasn't in. Yeah, those are called derp fakes. Yeah, they've all got fun names, don't they? Dude, Nicholas Cage's Yoda is patently objectively hilarious. I didn't like that one. I thought the, I don't know,
Starting point is 00:10:44 the Raiders of the Lost Ark thing was interesting, I guess, but none of them made me laugh. Like, maybe I just don't have that kind of sense of humor. Yeah. But yeah, I never was like, oh my God, that's hysterical, it's Nick Cage's face. I understand, I understand where you come from.
Starting point is 00:10:59 I don't think I was like, you know, in stitches or anything like that, but it's pretty great. Okay, okay. It's just not my thing. You're not a gizmodo reader, are you? No, none of this is my thing, but that doesn't mean we can't report on it. However, since it started happening,
Starting point is 00:11:14 it became pretty clear pretty quickly that this could be a bad thing in the future and not just for putting your ex-girlfriend's face on a sex video and, you know, saying, look what she did. You could put a world leader up there and really cause a lot of problems. Yes, hypothetically you could. And that's really, as we'll see,
Starting point is 00:11:40 this new technology, this deep fake technology, it poses at least two risks, two immediately obvious risks, and they're hyper-individualized and hyper-macrosocial risks, but they both stem from the same root, it's the same seed to keep the metaphor going and on track. That's right. So let's talk about the technology behind this
Starting point is 00:12:06 because this stuff is just totally fascinating. Surely you agree. It is AI, it was created by a guy named Ian Goodfellow. Just this particular type of AI, he didn't make the deep fake stuff. No, no, no. But basically what this model, you know, everyone knows AI is basically when you teach a machine
Starting point is 00:12:25 to start teaching itself. It starts learning on its own, which is a little creepy, but the model that they're using these days is called an artificial neural net, which is machine learning, and basically what they've done in this case is, all you have to do is show it a lot of data for it to start to be able to recognize that data
Starting point is 00:12:46 when you aren't showing it that data. Yeah, and it learns on its own what makes a cat a cat. The classic example is an AI that can pick out pictures of cats. Right, that's easy enough. But you don't tell the AI, here's what a cat is, find pictures of cats in this data set. It's, here's a bunch of stuff, and figure out what a cat is
Starting point is 00:13:07 and they get really good at picking it out. You can also turn it the opposite way, once you have an AI trained on identifying cats, and get it to produce pictures of cats, but they're usually terrible and often very, very bizarre. Anyone would look at it and be like, a human didn't make this, it's just off in some really obvious ways.
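To make the cat example a little more concrete, here is a minimal sketch of that kind of training, assuming TensorFlow and Keras (the two libraries mentioned later in the episode) and a hypothetical folder of labeled photos; the folder name, image size, and layer sizes are illustrative assumptions, not anything from the episode.

```python
# Minimal sketch: train a network to recognize cats by showing it lots of labeled
# examples, then ask it about pictures it has never seen. Assumes a recent
# TensorFlow 2.x and a hypothetical folder layout photos/cat/... and photos/not_cat/...
import tensorflow as tf
from tensorflow.keras import layers

train_ds = tf.keras.utils.image_dataset_from_directory(
    "photos",                      # hypothetical directory of labeled images
    label_mode="binary",           # cat vs. not-cat
    image_size=(128, 128),
    batch_size=32,
)

model = tf.keras.Sequential([
    layers.Rescaling(1.0 / 255),               # scale pixel values to 0..1
    layers.Conv2D(16, 3, activation="relu"),   # the network learns its own visual features
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(1, activation="sigmoid"),     # probability that the image is a cat
])

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, epochs=5)
# After training, model.predict() gives a cat score for photos it was never shown.
```

The point isn't the particular layers; it's that nobody hand-writes a definition of "cat", the network infers one from the examples it's shown.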
Starting point is 00:13:29 And what Ian Goodfellow figured out was a way around that problem. Yeah, so I'm not sure I agree with his wording here, but we'll say what he calls it. He set up two teams and one is a generator and one is a discriminator and he calls it Generative Adversarial Network. So basically his contention is that these two are adversarial.
Starting point is 00:13:53 I saw it more as like a managerial in nature. Okay, very bureaucratic of you. Yeah, I mean, isn't that what it felt like to you? The discriminator is like, yeah, I'm gonna need you to come in on Saturday. That's what it kind of felt like. So you've got these two networks and they're both trained on the same data set,
Starting point is 00:14:13 but the generator is the one that's producing these fake cats and then there's a discriminator or what I like to call a manager saying, these look good, these don't look so good. Right, the other way, the way that Goodfellow has proposed it is that the discriminator is going through and looking at these generated pictures and trying to figure out if it's real
Starting point is 00:14:35 or if it's fake, if the generator created it or if it comes from the data set. And based on the feedback that the manager gives the generator, the generator is going to adjust its parameters so that it gets better and better at putting out more realistic pictures of cats. Yeah, I don't get the adversarial part unless it gets mean in how it delivers that message.
Starting point is 00:14:59 The way, the reason they call it adversarial is I saw it put like, it's like an art forger and a detective and the art forger is putting out, or an appraiser is a better way to put it. The art forger is putting out forged art and the appraiser is like, this is fake, this is fake, this is fake, well, I'm not sure about this one. I don't know if this one's fake.
Starting point is 00:15:21 This is real, this is real, this is real. And then at that point, the generator has become adept at fooling an AI that's trained to identify pictures of cats, creating pictures of cats that don't exist. Okay, it's adversarial. They're trying, the generator's trying to fool the discriminator.
Starting point is 00:15:40 And the discriminator's trying to thwart the generator. That's the adversarial part. But in the end, they're really on the same team. Yeah, okay, I guess that's where it loses me. You have a really positive image of corporate America. What is that to do with anything? The manager's on the same team as everybody. Come on, get on board, get on the trolley, Josh.
Starting point is 00:16:00 So if you wanna look up Mona Lisa Talking, there was a video this year from Samsung that showed how you could do this. And all the stuff is on YouTube, if you wanna see Nick Cage as Indiana Jones. Which is pretty funny. Or Yoda. Which is hilarious.
Starting point is 00:16:17 Or if you wanna see Mona Lisa Talking, it looks fairly realistic. Like, hey, they brought that painting to life. Right, yeah, yeah. And if you scroll down a little bit, they did one with Marilyn Monroe too. They brought her to life? Yeah, they did.
Starting point is 00:16:30 Interesting. Well, I mean, this is just set up for TV commercials. Yes. They've already done stuff like this. Right, this is... Like Fred Astaire dancing with a dirt devil or whatever that was. What this could bring is creating entirely new movies
Starting point is 00:16:49 and bringing back dead actors and actresses. Or I guess just actors now these days, right? Well, I mean, you've seen some of the... Are you talking about the de-aging people or just creating, like, bringing back someone that's been long dead and... No, no, I'm saying, like, you call actors and actresses just actors these days.
Starting point is 00:17:07 Oh, that part. Yeah. Do whatever you want. Okay. Motion picture performers. Maybe even television performers. But bringing them back and giving them... Like, they could star in an entirely new movie.
Starting point is 00:17:20 Yeah, that's what you're doing, sure. Because it's so realistic in life. Yeah. They're not at that point yet. No. Because they're just now getting to where the de-aging looks decent, depending on who it is. Like the Sam Jackson stuff and Captain Marvel.
Starting point is 00:17:36 My friend... Looks pretty good. It looked amazing. Yeah, and this Will Smith stuff in this new Ang Lee movie looks really good. What's that one? It's an Ang Lee movie where he plays some sort of assassin that... Aladdin?
Starting point is 00:17:51 Yeah, that's it. Have you seen that? No. That's good. No, I have no interest. Oh, that's good. To go back and kill the younger version of himself, or the young one's trying to kill the older one
Starting point is 00:18:01 is what it is, maybe. It sounds a lot like Looper. Sort of, but it looks pretty good. Like, it looks like young Will Smith. Slightly uncanny, but not as bad as... I think some people are easier than others. Like the Michael Douglas stuff in Ant-Man and the Marvel stuff is kind of creepy looking.
Starting point is 00:18:17 I haven't seen that. I mean, I've seen parts of it, but I didn't notice that they were de-aging Michael Douglas. They took Michael Douglas back in scenes to like the 70s and stuff and it's just like, doesn't look great. But anyway, that's sort of off track. A little.
Starting point is 00:18:31 But not really. I mean, it's kind of similar type of stuff, I guess. No more than that Gizmodo article to start. Yeah, that's a good point. But the whole reason we should point out that people are doing this stuff with celebrities and stuff like that is just because there's more data out there.
Starting point is 00:18:45 It's a lot easier when you have a gazillion pictures of Brad Pitt on the internet to do a fake of Brad Pitt. Yeah, because the data set that the AI is trained on is just more and more robust and the more pictures there are, the more angles the AI has seen Brad Pitt look around at and so can recreate this, these faces, because the AI has seen like every possible pose or expression or whatever Brad Pitt's ever made.
Starting point is 00:19:13 We've seen all of them. But the thing is that Mona Lisa and Marilyn Monroe thing that Samsung showed, they showed that you could make a pretty convincing deep fake with just one picture, one pose, right? So that's a big deal. But again, the bigger the data set, the better. And that's why, like you said, celebrities
Starting point is 00:19:33 and world leaders were the earliest targets. But over time, with the advent of other software and the fact that people now post tons of stuff about themselves and pictures of themselves on social media, it's become easier and easier to make a deep fake video of anybody. There's like holes, there's software that scrapes social media accounts for every picture and video
Starting point is 00:19:59 that's been posted. Yeah, public accounts. Right. Big distinction. Yeah, for sure. There's, and then there's other sites and other apps that say, oh, this picture of this person you're targeting, your classmate or whatever,
Starting point is 00:20:15 they probably have a pretty good match with this porn star. So go find videos of this porn star. And then the next thing you know, you run it through that app that deep fake app came out with and you've got yourself a deep fake video and you've officially become a bad person. All right, that's a good place to take a break.
Starting point is 00:20:33 And we'll talk more about these bad people right after this. Listen to Hey Dude the 90s called on the iHeartRadio app, Apple Podcasts or wherever you get your podcasts. Hey, I'm Lance Bass, host of the new iHeart podcast,
Starting point is 00:21:48 frosted tips with Lance Bass. The hardest thing can be knowing who to turn to when questions arise or times get tough or you're at the end of the road. Ah, OK, I see what you're doing. Do you ever think to yourself, what advice would Lance Bass and my favorite boy bands give me in this situation? If you do, you've come to the right place
Starting point is 00:22:05 because I'm here to help. This I promise you. Oh god, seriously. I swear, and you won't have to send an SOS, because I'll be there for you. Oh man. And so will my husband, Michael. Um, hey, that's me. Yep, we know that, Michael. And a different hot, sexy teen crush boy bander each week to guide you through life step-by-step. Oh, not another one. Uh-huh. Kids, relationships, life in general can get messy. You may be thinking, this is the story of my life. Just stop now. If so, tell everybody, ya, everybody, about my new podcast and make sure to listen so we'll never, ever have to say bye, bye, bye. Listen to Frosted Tips with Lance Bass on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts.
Starting point is 00:22:59 All right, so you mentioned before the break, this person's face that I just stole off the internet fits this porn actor's body, which is a consideration if you're making that, because to look right, they have to bear a passing resemblance. I think that's right. Okay, that's what I was just about to say. So yeah, what they're doing now is they're pairing these applications with facial recognition software to make this a lot easier, and that's what most of this is about, is like
Starting point is 00:23:32 Let's just see how easy we can make this and how much we can democratize this, where any schmoe can take any single photo and do the worst things possible with it. But also how convincing they've become as well. Yeah, I mean, the other big change is it's looking better and better, quicker and quicker, which is pretty scary, right? Did you see the Obama one? Yeah, yeah. Had it not been for Jordan Peele's voice obviously being not Obama, I would have been like, this is really convincing. Oh, really? Yeah. See, I didn't think the lips matched up at all. Oh, I thought it looked pretty close. Yeah.
Starting point is 00:24:10 So what we're talking about is, Jordan Peele did basically a demonstration video to raise awareness about how awful this is by doing it himself, and did a video of Obama, like, you know, referring to Trump as a curse word, a dipstick, and basically saying, like, hey, this is Obama, and what, you know, people are doing. He basically is describing what's happening right as you're watching it. Right, and I thought it looked kind of fake. He's describing a deep fake through a deep fake. Yes. And Jordan Peele, he did it in conjunction with BuzzFeed and another production company. But in their defense, they were making this in like early 2018, like April 2018, and since then even more technology has come out that is dedicated to matching the movement of the mouth
Starting point is 00:24:54 Yeah, to whatever words you want the person to say. Yeah, and you can also, like, use only parts of it, so it's even more convincing, right? So like if Obama had a lead-in that actually worked, you could just keep that in there and then take out certain words, and you can manipulate it however you want to. Right, and the AI can go through and find like phonemes and stuff like that to make the new words that the person never said. It's becoming extremely easy. Let's just put it like this. It's becoming extremely easy and it's widely available, for anybody to make a video of somebody doing something or saying something that they never did or never said, and to make it convincing enough that you may
Starting point is 00:25:37 believe it at first. Yeah, which, like we said, you know, the obvious scariest implications that aren't just of the personal variety are in politics. Yeah, when you could create real fake news that actually put people in jeopardy, or put the entire world in jeopardy, by like announcing a nuclear strike or something like that, right? Yeah, Marco Rubio, I can't remember when it was, but within the last year or two, basically said that deep fakes are the modern equivalent of a nuclear bomb, that you could threaten America to that degree with a deep fake. He's probably being hyperbolic. I think that is a little hyperbolic for sure. And we're not the only ones. Yeah, there are other people that say, people that know what they're talking about, not just,
Starting point is 00:26:22 you know, schlubs like us, but other people that say, like, hey, listen, this is probably not like a nuclear bomb going off. We should keep our eye on it, but there are other, bigger fish to fry when it comes to stuff like this, for sure. And then there are other people who are saying, well, that's not to discount, like, the real problem it can pose, right? Sure, like, we're already in a very polarized position in this country, and so the idea of having realistic, indistinguishable-from-reality videos, yeah, of like world leaders or senators or whoever, saying whatever,
Starting point is 00:26:58 Yeah, it's not going to help things at all. No, it's not going to bring everyone together, like, look at this hilarious deep fake. It's going to be like, see, look. And it's just going to erode that trust that is necessary for a democracy to thrive. And to take it to its logical conclusion, this one researcher put it like this: like, eventually we're going to lose our ability to agree on what is shared objective reality, right?
Starting point is 00:27:29 And at that point, what we would face is what's called the death of truth. Yeah, like there is no such thing anymore. And on one hand, that's horrible. It's a horrible idea, the idea that nothing's real, because there's such a thing as deep fakes and anybody could make something like this. Yeah, but on the other hand you can kind of say, you get Nick Cage as Yoda, right? Exactly. On the other hand, though, you can say the fact that people know that deep fakes are out there means that it's going to be easier and easier to be like, that's obviously not real, right? It's just too unbelievable. Yeah, so it may actually make us more discriminating and more discerning of the news than we are today.
Starting point is 00:28:15 Yeah, that's the only thing that salvaged my brain from this dreck, yeah, of talking about this today. It was like, well, we'll go tell our listeners, at least be on the lookout, be wary. Mm-hmm. Take everything with a grain of salt, right? Because we're already in a place where, like, you don't even need some deep fake video. Right, it's happened all over the place. You can see something that's photoshopped, or a real photo that someone just writes a false story about. Yeah, that's a good one. You can just come up with a false narrative from a picture or whatever. There's a guy on the street and he's laying there bleeding, and you can just say,
Starting point is 00:28:51 this person was attacked yesterday by a group of angry Trumpers, or an Antifa on the other side, and it'll get passed around 20 million times, and then the retraction gets seen by 800 people. Exactly. That's, that's not a deep fake. No, that's just, that's low-hanging fruit. That's a lo-fi fake. Imagine inserting into that, and this is what we're talking about, into that climate, like, video where you're looking at the person, seeing with your own eyes what they're saying. And a lot of people, who aren't like, I thought the Obama video looked pretty fake, you thought it looked pretty real. Mm-hmm. Everyone's eye is different and ear is different. Like, a lot of people will believe anything they see like this, right? And we'll talk about, like, how to discern deep fakes in a second, but
Starting point is 00:29:40 we're getting to the point, people seem to be in wide agreement, that very soon it will be up to digital forensic scientists to determine, yeah, whether a video is authentic or not. And that's because you or I will not be able to distinguish it from reality. Yeah, and I imagine that every country will have their own team that will be hard at work doing that stuff. And probably already does, yeah, since the end of 2017. Yeah, or at least they're scrambling to catch up. Because when the video comes out of, you know, the leader of North Korea saying, we want to drop bombs on America, yeah, at 2 o'clock this afternoon.
Starting point is 00:30:24 Right, that's gonna send our DARPA team scrambling to try and disprove this thing before we push the button, right? It's like WarGames. It is, but way, way worse. Yeah, so just to reiterate one more time, the one thing that you and I can do, and the one thing that you guys out there listening can do, to keep society from eroding, is to know that deep fake videos are very real, and just about anybody with enough computing power and patience to make one can make one. And the very fact that those things exist should make you question anything you see or hear with your own eyes that seems unbelievable or sensational. Unfortunately, I think this Stuff You Should Know crowd is pretty savvy. Very. So we're sort of preaching to the choir here.
Starting point is 00:31:07 Yeah, but maybe they can go pass it along, preach to their older relatives on Facebook. Yeah, exactly. Take this to Thanksgiving dinner. Just explain it to folks. We should talk about porn a little more. Should we take a break first? Sure, are you okay with that? Yeah, I feel bad now. No, okay. We'll wait and talk about porn in about 60 seconds. On the podcast Hey Dude, the 90s called, David Lascher and Christine Taylor, stars of the cult classic show Hey Dude, bring you back to the days of slip dresses and choker necklaces. We're gonna use Hey Dude as our jumping off point, but we are going to unpack and dive back into the decade of the 90s.
Starting point is 00:31:54 We lived it, and now we're calling on all of our friends to come back and relive it. It's a podcast packed with interviews, co-stars, friends, and non-stop references to the best decade ever. Do you remember going to Blockbuster? Do you remember Nintendo 64? Do you remember getting frosted tips? Was that a cereal? No, it was hair. Do you remember AOL Instant Messenger and the dial-up sound like Poltergeist? So leave a code on your best friend's beeper, because you'll want to be there when the nostalgia starts flowing. Each episode will rival the feeling of taking out the cartridge from your Game Boy, blowing on it, and popping it back in, as we take you back to the 90s. Listen to Hey Dude, the 90s called on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Starting point is 00:32:36 Hey, I'm Lance Bass, host of the new iHeart podcast Frosted Tips with Lance Bass. The hardest thing can be knowing who to turn to when questions arise or times get tough, or you're at the end of the road. Okay, I see what you're doing. Do you ever think to yourself, what advice would Lance Bass and my favorite boy bands give me in this situation? If you do, you've come to the right place, because I'm here to help. This I promise you. Oh god, seriously. I swear. And you won't have to send an SOS, because I'll be there for you. Oh man. And so will my husband, Michael. Um, hey, that's me. Yeah, we know that, Michael. And a different hot, sexy teen crush boy bander each week to guide you through life step by step. Not another one.
Starting point is 00:33:16 Kids, relationships, life in general can get messy. You may be thinking, this is the story of my life. Just stop now. If so, tell everybody, yeah, everybody, about my new podcast and make sure to listen so we'll never, ever have to say bye, bye, bye. Listen to Frosted Tips with Lance Bass on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts. All right, Chuck, you promised talking about porn. Yeah, in this research it says one of the defenses people make in favor of deep fake porn is that it doesn't actually harm anyone. Is anyone actually saying that? Yeah, a lot of people who, I shouldn't say a lot, I've seen at least one quote from people who make this stuff saying, like, this is the media
Starting point is 00:34:04 drumming up a moral panic, like, what's the problem here? What's the issue? It's not like they're going and hacking into, like, a star's iCloud account, sure, getting naked pictures of them and then distributing that, and like, this is really a private naked picture of a celebrity. They're just trying to fool people into thinking they've done that to them. I think that they would say they're just creating some fantasy thing that's not even real. It doesn't really exist. I'm not defending it, I'm just telling you what the other side is saying. Yeah. Well, and that's a perfect example of why these are very bad people, because it is harmful
Starting point is 00:34:46 to everyone involved, to the person whose face you're using, to the adult film actor who did a scene, right, and wants credit. And, well, yeah, I mean, regardless of how you feel about that stuff, someone did something and got paid a wage to do so, and now it's being ripped off. And there are real people's faces involved, and real bodies involved, and real lives. It's, you know, it's not a moral panic, but, you know, it's not like we need to march this to the top of Capitol Hill right now. Well, that's funny, because Congress held hearings on it this year. Well, yeah, but I have a feeling it's a little bit more about the political ramifications than
Starting point is 00:35:25 putting your ex-girlfriend's face on a porn body. Oh, yeah. Yeah. Yeah, I see what you mean, although they could, you know, they could put your governor's face on a porn body, right, and get them removed from office. Yeah, you know, this video was just dug up. Yeah, look at this, look at your governor. Yeah, or, yeah, look what he's doing, for sure. But even take that down to the less political level, like you were saying, you could ruin somebody's marriage if it was shaky, you were on the rocks before, sure. Hey, here's a sex tape of your husband or your wife. You know, we know. Yes, blackmail is another one, too. There's a lot of ramifications of this, and it seems like the more you dig into it,
Starting point is 00:36:03 the more it becomes clear that really the big threat from this is to the individual whose face is used on the deep fake porn. Right, they could do a video of us holding hands walking down the street, right? Or they could just use that video of us doing that, the one that exists in reality. I'm glad you picked up on that one. There was, geez, just a couple of weeks ago, because I saw this when I googled this under news, so it's very recent, there was an app
Starting point is 00:36:38 that we won't name, that undressed women, basically. What you could do is just take a picture of any woman, plug it into this app, yeah, and it would show her what she would look like nude. Not her body, but it would just do it so fast and so realistically that you could nude up some woman, right, with the touch of a button. Yeah, and like it would replace her clothes in the picture with nude lady clothes. Right, so with a birthday suit. Right, birthday suit, that's what I was looking for. And it's just as awful as you think. And the creator actually even shut it down within like three or four days. Yeah, but like, what was this guy thinking?
Starting point is 00:37:12 Like, yeah, this is a great idea. I'm like, oh, people have a problem with this? Well, I'll shut it down, right? Like, really? In his defense, he's probably like 14. Well, I guess that's a good point. Even if you plugged in a picture of a man, it would show a woman's nude body, and you know what that means. And the person who created this app says, well, I just did that because there are way more pictures of naked women on the internet, and I was gonna do a man's version, but I had to go to baseball practice and never got a chance to. Yeah, that's pretty amazing. And of course that person is anonymous, right?
Starting point is 00:37:46 We didn't, as far as I know. Yeah, which means that they really must be good, because they weren't unmasked on the internet despite the outrage against this. Oh, you're probably right. I wonder if it's just Pharma Bro. That guy. Yeah, just pin everything on him. He's still in jail. Is he really? Yeah. Good. So that's a good segue into what you can do if this happens to you, right? There's a lot of outrage, call your congressman. There's a lot of outrage against this kind of thing on the internet. So if you are targeted and you end up in a deep fake porn clip or video or whatever, sure, um,
Starting point is 00:38:28 you could drum up some moral outrage on the internet, or go to the site that it's being hosted on directly and say, hey, I know this is messed up, you know this is messed up, please take this down. I didn't consent to this. This is an invasion of my privacy. Get rid of this video. And porn websites are good about that, actually. Yep. They don't want that stuff on there. No, and it's not just porn websites. Like, I mean, Pornhub, Reddit, Gfycat, some other sites have banned all kinds of deep fake videos. And apparently Gfycat, I think it's G-F-Y,
Starting point is 00:39:03 I have no idea. I don't either. I'd never heard of it until I started researching this. But this site actually created an AI that's trained to spot deep fake videos and remove them from the site, which is a big tool that they need to be sharing with everybody else, right? But if you can contact the site and say, hey man, take this down, this is me, they will probably take it down, just because everybody knows this is really messed up. Yeah, they have plenty of videos to work from that are real, right? They don't need this stuff. But there's no laws that say they have to take it down either. Well, not yet.
Starting point is 00:39:43 There's this guy Hany Farid, he studies digital forensics at Dartmouth, and they are hard at work, like, again, this just started, you know, very recently, so all of this stuff, they're just scrambling to get ahead of it, right, as far as sniffing this stuff out. Well, the whole world was caught off guard by this. Oh, yeah, like this guy just went, hey, look what I can do, I'm gonna change the world here. Nick Cage is so funny. Oh wait, what is Nick Cage doing as Yoda? Oh my god. So professionals, there are some pretty easy to spot things if you're a pro. Unless it's just really bad, your average layman can't spot those, but if you're a pro you're gonna look for like bad compression, stuff that, like, you know, lighting that looks off. Yeah, not blinking is a big one. Yeah, like Michael Caine.
Starting point is 00:40:29 Don't blink. That's right. Maybe he's just a big deep fake, right, his whole career. Sound is a big thing, like... wait, hold on, hold on. I want to say why blinking is not a thing. Oh, sure. Because it's fascinating to me. Probably wasn't to you, because you didn't like that Gizmodo article. But the reason why not blinking is a thing in deep fakes is because deep fake AI that is trained on data sets probably is being shown photos of somebody not blinking, right? So they don't learn that people blink. So the AI doesn't know to add blinking when it renders the new face onto this video.
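As one concrete version of the blink cue, here is a minimal sketch of the eye-aspect-ratio trick some forensic tools use; it assumes you already have six eye landmark points per frame from some face-landmark detector, and the 0.2 threshold and landmark ordering are illustrative assumptions, not a standard.

```python
# Minimal sketch of a blink check: the eye aspect ratio (EAR) drops sharply when an
# eye closes, so a long clip with almost no EAR dips is a (weak, increasingly dated)
# red flag. Assumes you already extracted 6 (x, y) eye landmarks per frame elsewhere.
import numpy as np

def eye_aspect_ratio(eye_points):
    """eye_points: 6 (x, y) landmarks around one eye, ordered around the contour."""
    p = np.asarray(eye_points, dtype=float)
    vertical = np.linalg.norm(p[1] - p[5]) + np.linalg.norm(p[2] - p[4])
    horizontal = np.linalg.norm(p[0] - p[3])
    return vertical / (2.0 * horizontal)

def blinks_per_minute(ear_per_frame, fps, closed_threshold=0.2):
    """Count open-to-closed transitions of the eye aspect ratio across a clip."""
    ears = np.asarray(ear_per_frame, dtype=float)
    closed = ears < closed_threshold
    blinks = np.count_nonzero(closed[1:] & ~closed[:-1])  # frames where the eye just closed
    minutes = len(ears) / fps / 60.0
    return blinks / minutes if minutes > 0 else 0.0

# People typically blink well over a dozen times a minute; a talking-head clip that
# never dips below the threshold deserves a closer, frame-by-frame look.
```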
Starting point is 00:41:09 But all they can do is just say, all right, well, now we'll program it to blink, right? That's the big problem, Chuck. It's like everything they can spot. In fact, when they list out all the things to look for, like blockiness, compression, fixed pattern noise, I'm sure there's some deep faker that's like, check, check, check, thanks for the list of stuff we need to work on. And it's still maybe, like you were saying, at a point where possibly you or I could look at a shoddy video and be like, yeah, I see this, this, and this is a little wrong, like there is a little bit of a compression relic or remnant or whatever, like they're not blinking, the shadows are off a little bit. Yeah, but there's also plenty of videos where you do need to be like a digital forensic scientist to find them, or an AI to find it. Yeah, you can also use your ear holes, because,
Starting point is 00:41:56 you know, look at the room that the person is in, and would it sound like that in a room like that. Yeah, that's one of the things audio specialists look at, is like, you know, if you have Obama in a concert hall speaking and it sounds like he's in someone's closet, or us. Yeah, it sounds like a can, like our earliest episodes. Yeah, exactly. That's a pretty, you know, pretty strong indicator. It is. So there are things you can do. When BuzzFeed tweeted that Jordan Peele Obama deepfake, they included a list of things to do to spot a deepfake. Um, I wonder how many times you've said deepfake in this episode. I don't know.
Starting point is 00:42:34 Don't jump to conclusions. Okay. Consider the source, it's a big one. That's what's gonna guide us through here. People like, this is Jordan Peele, I can trust that. No, but I mean like if you go on to a site, I can spot like a fake news site a mile away. Oh, yeah, like you can just tell, it's just, it's off, it's uncanny. It's our sense of uncanny that's gonna guide us through this. Yeah, you can always tell because the screen is black and the text is fluorescent green in Comic Sans. And then another one is check where this thing is and isn't. Yeah, this is kind of like the opposite tip
Starting point is 00:43:10 we always give people, where if you see the same thing in basically the same wording throughout the internet, you should question that. If you see a deepfake video in only a couple of places, and it's news, but you don't see it on like ABC or CNN or Fox News or wherever, if you don't see it on like a reputable news site, yeah, you should probably question it. Yeah, Donald Trump threatens nuclear war, we have the video, from slappy.com. That's probably a good indicator, slappy.com. I'm sure the good people at Slappy are like, hey, we make hamburgers. But I should look it up, I should probably check and see what that is. Everyone else is right now. What else? Look closely at their mouth. Yeah, and then here's kind of a no-brainer.
Starting point is 00:43:53 It's like, slow it down, slow the video down, slow your roll, and like really look at it closely. Yeah, because that's where you're gonna see like strange lighting changes and stuff. But it's all legal. It is. So we were kind of talking about that, like the best way to get a video taken down is to contact the website, just be like, bro, come on, this is awful. Yeah. There are no laws that protect you directly, but a lot of people are saying, well, we've got revenge porn laws that are starting to pop up around the country. It's a very short
Starting point is 00:44:28 trip from revenge porn to deep fake porn. Yeah, it's virtually the same thing. It's involuntary pornography. Yeah, it's even more involuntary, because with revenge porn the person even posed for the picture or whatever initially, for whatever context or reason, sending that out with no intention for it to get out. Okay, with deep fake porn this person never even posed or engaged in this act or anything like that. Yeah, so it's, in a way, maybe even worse than revenge porn, which feels like bitter acid in my mouth to say. So you can make a case, though, that these revenge porn statutes that protect people could be extended to this as well. But that's for personal stuff. Yeah, for like
Starting point is 00:45:14 national stuff, or a public figure or something like that, especially when it comes to politics, you could make a really strong case that these deep fake videos, even the most misleading, nefarious deep fake video you can imagine, would be protected under the First Amendment. Yeah, I could see a satire defense being mounted, yeah, in the future. Like, you know, what's the difference between doing a really good deep fake and doing an animated cartoon like South Park, right, which shows people saying and doing things they wouldn't do either. Yeah, it is very slippery and thorny, and a very fine line.
Starting point is 00:45:51 But even if the person who makes the deep fake says, nope, I did not mean this as satire, it was meant to be misleading and I wanted to see what effects it had, short of shouting fire in a crowded theater, they could probably still get away with it under the First Amendment. Yeah, it's interesting to see where this is gonna go. Yep. Hopefully right down the toilet. Nope, it's just gonna get more and more realistic, and we're gonna end up inadvertently falling into the simulation. That's what's gonna happen, Chuck, prepare for it. That's great. Okay. Just try to put a smile on your face regardless.
Starting point is 00:46:29 I'm smiling. If you want to know more, am I? If you want to know more about deep fakes, it's so hot right now, just go on the internet and you can read all sorts of news articles about it. Since I said that, it's time for listener mail. I think the first thing that turned me off was the name. Mm-hmm. Anytime I see something that's like not a real word, but they're like squeezed together two words and it's all lowercase or something, right? Oh gosh, yeah, it's just the worst of the internet. I know what you mean.
Starting point is 00:47:04 It's terrible. All right. Hey guys, just finished the Neanderthal episode, and there was mention that their language could have some remnants in our modern languages. That was a really good demonstration. I automatically remembered that during How Swearing Works, you guys mentioned that a different part of our brains activates when hearing or using swear words. So maybe, just maybe, that small percentage of our Neanderthalian DNA activates when we stub our toes or hit our shins, to unleash our primitive and original language. It's neat. How about that? I like this person. Anyways, love the podcast, guys, grateful for the knowledge and entertainment. And thank you, Jerry, or should we just say thank you, Josh, right? Josh T., for keeping the quality of these podcasts awesome.
Starting point is 00:47:49 We're not thanking Jerry anymore. No, just Josh T. From my overpriced apartment in Redlands, California, that is Falcon. No, it's his last name. Wow. The end is silent, though, so it's Falco. Thanks a lot, Falcon. We appreciate you. That was a great name. Thanks for swooping in with that idea. And I'm sorry, everybody, for that. If you want to get in touch with us like Falco did, you can tweet to us, we're at SYSK Podcast. I'm at Josh um Clark. We're on Instagram, we're on Facebook. Where else are we, Chuck? Uh, we're on every deep fake. Yeah, we might be on Gfycat. Who knows. Gfycat, Slappy. You can also send us a good old-fashioned email.
Starting point is 00:48:35 Spank it on the bottom after you wrap it up, of course, and send it off to stuffpodcast@iheartradio.com. Stuff You Should Know is a production of iHeartRadio's How Stuff Works. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows. On the podcast Hey Dude, the 90s called, David Lascher and Christine Taylor, stars of the cult classic show Hey Dude, bring you back to the days of slip dresses and choker necklaces. We're gonna use Hey Dude as our jumping-off point, but we are going to unpack and dive back into the decade of the 90s.
Starting point is 00:49:23 Listen to Hey Dude, the 90s called on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Hey, I'm Lance Bass, host of the new iHeart podcast Frosted Tips with Lance Bass. Do you ever think to yourself, what advice would Lance Bass and my favorite boy bands give me in this situation? If you do, you've come to the right place, because I'm here to help. And a different hot, sexy teen crush boy bander each week to guide you through life. Tell everybody, yeah, everybody, about my new podcast and make sure to listen so we'll never, ever have to say bye, bye, bye. Listen to Frosted Tips with Lance Bass on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts.
