Duncan Trussell Family Hour - 354: Hashiam Kadhim

Episode Date: September 24, 2019

Hashiam Kadhim, AI, Deep Learning, and Deepfake expert, joins the DTFH! This episode is brought to you by BLUECHEW (use offer code: DUNCAN at checkout and get your first shipment FREE with just $5 shipping).

Transcript
Starting point is 00:00:00 We are family. A good time starts with a great wardrobe. Next stop, JCPenney. Family get-togethers to fancy occasions, wedding season too. We do it all in style. Dresses, suiting, and plenty of color to play with. Get fixed up with brands like Liz Claiborne, Worthington, Stafford, and J. Ferrar.
Starting point is 00:00:18 Oh, and Thereabouts for kids. Super cute and extra affordable. Check out the latest in-store, and we're never short on options at jcp.com. All dressed up everywhere to go. JCPenney. Greetings, friends. It is I, Duncan Trussell.
Starting point is 00:00:33 And you are listening to the Duncan Trussell Family Hour podcast. Deepfake technology: that's what we're going to talk about today. This is one of my favorite topics. If you pay any attention to the DTFH at all, you know that I went through a period of being supremely freaked out by the potentials of deepfake technology. And suddenly this insane thing came out where they'd
Starting point is 00:00:57 taken Rogan's voice, perfectly duplicated it, and just got him to say a lot of weird shit. And I got lucky enough to meet one of the brilliant human beings behind that technology, Hashiam Kadhim. Hope I'm saying that right. He goes by Hash, and he's a brilliant programmer. And we ended up meeting, and I think he thought that we were just going
Starting point is 00:01:23 to do like a Skype where we talked. But I recorded it, and he let me use it as a podcast because I thought it was such an awesome conversation, covering something that I really do think is going to be one of the major cultural disruptors of our time. So that's what this podcast is. If you want to hear about deep fake technology,
Starting point is 00:01:44 the apocalypse, and all the supreme weirdness that is coming down the tubes technologically, then this is the episode for you. We're going to jump right into it. But first, some quick business. If pussies are gardens, as Socrates did say, then boners are but sprinklers, and cum is what they spray. And if you want a baby, it won't hurt to pray,
Starting point is 00:02:09 but you'll definitely need a boner. If you don't like boners, then BlueChew's not for you. But if you're seeking boners, then we've got a deal for you. We will send you a free packet of BlueChew, and you can see what happens to your boner. This episode of the DTFH has been brought to you by the lords and ladies of boners over at bluechew.com. Now you can increase your performance
Starting point is 00:02:39 and get that extra confidence in bed. Bluechew.com, that's blue like the color blue. Bluechew brings you the first chewable with the same FDA approved active ingredients as Viagra and Cialis, so you know they work. You can take them any time, day or night, even on a full stomach. And since they're chewable,
Starting point is 00:02:58 they work up to twice as fast as a pill, so you can be ready whenever an opportunity arises. Friends, I took blue chew to make sure it worked. I didn't want to promote something that was snake oil, and it works. It definitely, definitely works. A 45 year old guy with a bald spot here, covered in weird back hair, it worked.
Starting point is 00:03:26 It's prescribed online and shipped straight to your door in a discreet package, so no in-person doctor visit, no waiting in the pharmacy, and best of all, no more awkwardness. I have one ball, it worked. They're made in the USA, and since BlueChew prepares and ships direct, they're cheaper than a pharmacy.
Starting point is 00:03:43 Right now we've got a special deal for our listeners. Visit bluechew.com and get your first shipment free when you use our special promo code Duncan. You just pay $5 shipping. Again, that's B-L-U-E-Chew.com, promo code Duncan, to try it for free. Blue chew is the better, cheaper, faster choice, and we thank them for sponsoring the DTFH.
Starting point is 00:04:09 All right, friends. We got a beautiful podcast for you today. Get ready to dive into this strange world of deepfake technology. Before I forget, if you like us, why don't you subscribe to us at Patreon? I just uploaded an hour-long ramble about manifestation because I've become obsessed with Neville Goddard
Starting point is 00:04:29 and have been listening to this incredible Audible about him non-stop. If you subscribe, you're gonna get an extra hour-long rambling thing every month along with early access to episodes of the DTFH. Head over to patreon.com forward slash DTFH and subscribe.
Starting point is 00:04:51 I don't know what happened, but we had a rush on our store and that's exciting to me. We got a lot of amazing swag over at DuncanTrussell.com. If you wanna represent the DTFH out there in your particular node of the multiverse, go no further than DuncanTrussell.com. Click on the shop link and check out our amazing Goop and Goop. All right, today's guest ascended to the national stage
Starting point is 00:05:19 after making this insane Joe Rogan video where he and some of his partners created this insane perfect replication of Rogan. And they got Rogan to say a lot of really awesome, weird stuff and holy shit, I was so fucking blown away by this conversation because this stuff, this deep fake stuff, this crazy ability to replicate people
Starting point is 00:05:46 based on only a few minutes of audio of them talking has shaken me down to my very quantum core. And I think it'll give you the heebie-jeebies, but hopefully the good heebie-jeebies and not the creepy Fox News heebie-jeebies. Whatever kind of heebie-jeebies you get, it's October, that means Halloween's around the corner. So here's an authentic technological ghost story
Starting point is 00:06:11 just for you. Everybody please welcome to the DTFH, Hashiam Kadhim. So yeah, you're saying tech people didn't get the apocalyptic nature of it? Yeah, usually they just focus on the techie details
Starting point is 00:06:49 and they don't really think about the big picture. I mean, I think even working on that, we suffer from it too. Well, we're just focused on the specific problems. We're not really thinking about the big picture that much. Who's behind it? Who's funding it or what's the plan with this? It started off as a side project with me, a friend, and then we brought a third person in that we found.
Starting point is 00:07:18 I was talking with him and I was working with him remotely, but he was actually in Tunisia, and then we brought him over. I convinced the company I was working at to hire him, and then all three of us were working on it. But my friend had this idea. He just, we were having dinner and he just said to me, wouldn't it be crazy if we could get Terrence McKenna on the Joe Rogan podcast?
Starting point is 00:07:39 That was the idea. Cool. From there, yeah, we just started working on, we started with Joe Rogan's voice and seeing if we could replicate his voice. That is so trippy. Because this resurrected quality of this type of technology fits in so perfectly with eschatological prophecies.
Starting point is 00:08:04 Specifically, the dead will rise again. At the end times, the dead will come back. And I think about the way I picture the world ending and deconstruct that sometimes and realize how much of that is put there by movies. And how explosions, fire, earthquakes, et cetera. But technology has added this layer of strangeness to the end times, not to say this is the end times,
Starting point is 00:08:41 but certainly to see a mirror of the Book of Revelation appearing technologically. Which is, like, if you guys have done this, you made an AI that converts text, stop me if I'm wrong, it converts text to speech, but it does it in Rogan's dialect and voice. Yeah. And that's gonna work for anybody, right?
Starting point is 00:09:06 Like you could do that for anyone you wanna do as long as there's some media existing. Yeah. So one thing I remember from your conversation with Joe on his podcast: Joe mentioned that there's so much audio of him available, so it's easy to do it with him.
Starting point is 00:09:23 But we were able to do it with other people without that much audio. So we were able to get good quality within 30 minutes of audio, but in the future, it's only gonna take a few seconds. And then that covers, you know, I mean, just off the cuff, but 70% of all human beings or something at least.
Starting point is 00:09:42 Yeah, anybody who's ever given a presentation or something. But then on top of that, doesn't it also, like the data troves that Amazon has and Google has, as they've been sniffing stuff for Alexa and Google Home and all that, that stuff would just as easily be usable too. Yeah. Well, that's what makes it possible to only have a few seconds of your audio
Starting point is 00:10:05 because there's so many people that have different voices, but maybe you could interpolate between different voices to get your voice. And those other people have said different things and the way you combine these things, one of the biggest assets is a huge data set. Right, so that's it. And there are huge data sets out there
Starting point is 00:10:27 for all public figures, but theoretically there's huge data sets out there for everybody if you've allowed certain permissions on Instagram, Facebook, Google, wherever, right? Like that's the sinister part to me is that there's storehouses of voice data being held by private corporations. And with that data,
Starting point is 00:10:50 you could essentially create a golem, some doppelganger of a person, right? Or am I overreacting to that? I think we're a bit early, but it's definitely coming. So we just poked at it, but it's definitely gonna get to the place where it's gonna be readily available, where everybody will,
Starting point is 00:11:08 or a lot of research teams will be able to do this. And then in the future, yeah. Go ahead, please. And then in the future, after the research teams are able to do this, then people will make apps, and then a lot of, maybe people that are not so techy
Starting point is 00:11:23 will be able to do it as well. And that, of course, is gonna get used for augmented reality, for virtual reality, right? It means that pretty much anybody who's got some recording out there, you're gonna be able to hang out with them in a virtual space and have a conversation with them. It seems like the big chunk that we have yet to achieve
Starting point is 00:11:49 would be an AI that could, in some way, absorb the personality of a person, not just their vocal thumbprint, right? The one idea we're thinking of taking this project with is if we could have a conversational dialogue with Joe Rogan, and if we could add the visual aspect as well. So it's like you're video chatting Joe Rogan.
Starting point is 00:12:12 Right. So that was the one place we were thinking of taking it as well. And it's not that far-fetched. But you still have to like, this to me is like, once we cross the Turing test line with AI, this is one of the byproducts,
Starting point is 00:12:31 is that the AI begins to essentially possess digital costumes or masks for different people. And then the more granular the personality data you have, the more it's gonna cross over the uncanny valley to a point where there's a better version of you, right? That's how, that's what I see, it's not just like you could like, get Rogan and create a video Rogan
Starting point is 00:12:57 and have a conversation with your own Rogan. But theoretically, you could adjust, you could have sliders. Let's make him a little happier. Let's see what, you know what I mean? Let's see what he'd be like if he never became a comedian. I don't know, just like,
Starting point is 00:13:11 you could just change variables and an identity and start producing clones that have behavioral qualities that are different in certain ways. Some of them better, some of them worse, right? I can imagine you could even do stuff that's not so, that falls into certain buckets, like you could have a hybrid between you and him,
Starting point is 00:13:32 for example, or maybe like a 30% Rogan, 70% Duncan Trussell. Wow. Shit. Wow, that'd be amazing, like a kind of lazy Rogan who sucks at sports. But hey, the implication to me
Starting point is 00:13:55 is that this is like electricity. Am I wrong about that? Like I see this as the invention of electricity for what it's gonna do to us as a species. Like this really disrupts the fucking hive, man. There's only supposed to be one me. And our lives are lived on screens, you know? So it's like, this is this,
Starting point is 00:14:15 whatever you wanna call it, the digital biome that we're starting to exist in, like you and I right now, we're hanging out on screens. You know, to me, it's quite nerve-wracking to imagine a situation where the fundamental value of a person, being this sort of singularity that goes along with being an identity, gets completely upended. You're essentially devaluing identity with this shit.
Starting point is 00:14:41 You know, you're creating like a stock market crash for personalities here, right? Yeah, but again, it's hard to say when that would actually, when we'd actually get to that point, cause it could be very far off. How far off? I mean, that's not a fair question, but just off the cuff, you know, whatever.
Starting point is 00:15:00 You only need to be 10 years within range. Oh, that's impossible. So I'll tell you why it's hard to guess. It's because there have to be advancements in the industry that are paradigm shifting. And those big breakthroughs are very, very hard to predict because it takes somebody thinking so outside of the normal way of thinking in the industry
Starting point is 00:15:24 that they're just a free thinker who breaks everything and makes a new step change in the whole field. So with our current technology, we can't do it. So that's why it's hard to guess, but it could happen, you know, anywhere within the next 30 to 100 years. Oh, so you think we're three decades away
Starting point is 00:15:46 before we have to worry about the invasion of these masses. Yeah, but it's a very rough, very rough guess. Very rough guess. So, okay, let's jump at 30 years. We now live in the landscape where this technology has been advanced to the point where the replicants are indistinguishable from that, which is replicated. Their identities are identical to the identities
Starting point is 00:16:11 that they replicated. What does that world look like? It's going to be very different from now, especially because of the potential for misinformation. I was talking with a lot of people and this notion of high trust regions on the internet and low trust regions seems to come up where there's some areas you could browse
Starting point is 00:16:32 that are very, all the media there, whether audio or visual or video media, it's going to be verified in some way, either from the source or some sort of watermark in the media itself. And then that, you know, you could trust this sort of media. The vast majority of what's online will be some untrustworthy region where you could have, there'll be videos of you saying
Starting point is 00:16:56 and doing whatever and people will just assume it's fake. Okay, so it's sort of like the blue check mark on Twitter. There's going to be some universal verification. That's the implant. We're going to have to get a fucking implant. Finally, the next thing is you're going to have to get some weird implant that interacts with something on digital cameras so that it's you.
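For the curious, Hash's "verified in some way, either from the source or some sort of watermark" idea can be pictured as the source cryptographically signing its media. A minimal sketch, purely illustrative and not anything the team described: a shared-secret HMAC stands in for the public-key signatures a real provenance scheme would use, and the key and media bytes are invented.

```python
# Rough sketch of "verified from the source": the source signs its
# media, consumers recompute and compare the tag. HMAC with a shared
# secret stands in for real public-key signatures; the key and media
# bytes below are invented for illustration.
import hashlib
import hmac

key = b"creator-secret-key"            # hypothetical signing key
clip = b"raw audio bytes of the clip"  # stand-in for real media bytes

signature = hmac.new(key, clip, hashlib.sha256).hexdigest()

def verify(media: bytes, sig: str) -> bool:
    """Recompute the tag for the media and compare in constant time."""
    expected = hmac.new(key, media, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)

print(verify(clip, signature))                # True: media is untampered
print(verify(b"deepfaked bytes", signature))  # False: media was altered
```

Any edit to the bytes changes the tag, which is what makes a "high trust" region checkable rather than a matter of taking the uploader's word for it.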
Starting point is 00:17:20 I mean, I think that this low trust, high trust zone thing makes a lot of sense. Some kind of non-compulsory thing, but definitely it's worth thumbprinting yourself or whatever, so people know that they've got the real you. Let's talk about though what we can expect in the next 10 years from this technology.
Starting point is 00:17:48 What do you think? Some example use case would be maybe there's a foreign film and you want to dub it in a different language. You can instantly do that with the voice of the actor speaking a different language. That's amazing. That's incredible, man. Yeah, I could see all the positive aspects of it
Starting point is 00:18:09 and the sort of mundane shit that people will use it for that's definitely going to revolutionize a lot of different industries. Even podcasting, I was thinking like, my God, if I was particularly lazy and didn't want to read an ad, I could just get my bot, I could just type it up and the bot does the ad. And then not only that, if I wanted to,
Starting point is 00:18:31 I could type up my podcast intro and the bot does the podcast intro. These are things that are actually really useful in a lot of entertainment. A lot of aspects of entertainment for sure. Also, like the ability to, this is a question I had. How, I saw that y'all figured out a way to grab Rogan's. What, do you have a name for the vocal thumbprint?
Starting point is 00:18:59 What do you call that, the personality of the voice or whatever, do you have a name for that? What do you mean by that? So, you take a person's voice, you take my voice, and within it is all of my many speech impediments, stutters, slurs, shrieks, gutturals, whatevers that vary over time, somehow compacted in. Do you have a name for that or is it just a person's voice?
Starting point is 00:19:26 Your thing? You know, there's a technical term for it that I don't think is helpful, but it's because, like, these sorts of techniques are very... there's inputs, there's the AI, and there's outputs. And the AI, all the internal workings in the middle, is where all the magic stuff happens. That's where it learns the intricacies of your voice.
Starting point is 00:19:49 And then you can prod inside that AI to pick some pieces out, but it's not like this manual process where we're learning these sorts of stutters or stuff like that. Right, I got you. What's the technical term? It's like an embedding.
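For readers wondering what an "embedding" is here: it's a fixed-length vector that summarizes a voice. A toy sketch follows, with no claim to match the team's actual model; real systems use a trained neural encoder, and plain averaging of per-frame features stands in for it below.

```python
# Toy sketch of a speaker embedding: collapse a variable-length sequence
# of per-frame acoustic features into one fixed-length vector. A trained
# neural encoder does this in real systems; simple averaging stands in.

def speaker_embedding(frames: list[list[float]]) -> list[float]:
    """Average per-frame feature vectors into a single fixed vector."""
    dim = len(frames[0])
    totals = [0.0] * dim
    for frame in frames:
        for i, value in enumerate(frame):
            totals[i] += value
    return [t / len(frames) for t in totals]

# Clips of different lengths map to same-sized vectors, so any amount
# of audio yields a comparable voice "fingerprint."
short_clip = [[1.0, 2.0], [3.0, 4.0]]
long_clip = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0], [7.0, 8.0]]
print(speaker_embedding(short_clip))  # [2.0, 3.0]
print(speaker_embedding(long_clip))   # [4.0, 5.0]
```

The fixed size is the point: once every voice lives in the same vector space, voices can be compared, stored, and, as discussed below, blended.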
Starting point is 00:20:09 Okay, so yes, we need a better, cooler term for that. It's like some kind of vocal DNA or some shit, your voice genome or whatever, I don't know what it is, but that's your voice. So, you could take that, you could convert it into any language in the world that you wanna convert it into, but can you change its mood?
Starting point is 00:20:31 Can you tell it to be excited? Okay, so you come up with basically shades of personality. So, depending on the thing you're getting it to read, you could say, read it like he's happy, read it like he's depressed, read it like he's... We've actually done that already. Like we have some examples of him saying the same thing, but angrily or happily.
Starting point is 00:20:51 That is fucking crazy. That's crazy, man. And how granular is that? Do you have kind-of-happy, maximum happy? You have sliders, you have mood sliders. You have mood sliders? Yeah. Holy shit, man.
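The mood sliders Hash describes can be pictured as interpolation in the model's conditioning space: if "neutral" and "happy" each correspond to a vector, a slider just blends between them. A hedged sketch; the vectors are invented for illustration, and a real system would feed the blended vector into a speech synthesizer.

```python
# Sketch of a "mood slider" as linear interpolation between two
# conditioning vectors. The vectors are invented for illustration; a
# real system would feed the blended vector into a speech synthesizer.

def lerp(a: list[float], b: list[float], t: float) -> list[float]:
    """Blend two vectors: t=0.0 gives a, t=1.0 gives b."""
    return [(1 - t) * x + t * y for x, y in zip(a, b)]

neutral = [0.0, 0.0, 1.0]  # hypothetical "neutral" conditioning vector
happy = [1.0, 0.5, 0.0]    # hypothetical "happy" conditioning vector

print(lerp(neutral, happy, 0.0))  # [0.0, 0.0, 1.0], fully neutral
print(lerp(neutral, happy, 0.5))  # [0.5, 0.25, 0.5], halfway
print(lerp(neutral, happy, 1.0))  # [1.0, 0.5, 0.0], fully happy
```

The same blending trick is what makes the "30% Rogan, 70% Duncan" hybrid idea from earlier at least plausible: swap mood vectors for speaker vectors and slide.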
Starting point is 00:21:09 That is so fucking crazy. Wow. Okay, so then once we get an AI that can duplicate the personality and not just the vocal DNA, then we'll have sliders for intelligence. You could, this is why I'm excited is it's like theoretically,
Starting point is 00:21:30 now this is all theoretical. Because I was thinking in terms of therapy. Like imagine going in to get therapy, this AI scans you and shows you what you would look like if you weren't fucking depressed. Like you see a non-depressed version of you and that becomes your therapist. You could literally talk to a version of you
Starting point is 00:21:52 that is balanced. Hey, what would I be like if I didn't have like 16 addresses when I was growing up? What would I look like if that fucked up shit didn't happen to me in the fourth grade? What would I be like? And you could, that's what to me is the field of psychology. Just, you know, they talk about mirror neurons.
Starting point is 00:22:13 You get, and I know when I get around funny people, I become funnier. But what happens when you get around funnier versions of yourself and you get to see yourself doing shit? Would you learn faster from yourself? You know what I mean by like watching yourself and like, oh, that's what I would do. Have you thought about these applications of it?
Starting point is 00:22:34 No, I didn't think of that one, but it's fascinating. I think maybe it's something about podcasters, but I think I noticed you in particular in the conversations or people that are super open-minded, they come up with these examples very easily. It seems like, whereas people that work on these sort of problems, they don't come up with these examples as easily.
Starting point is 00:22:55 All right. Well, I see that, yeah. Cause y'all are deep in the thick of it. I know, man, you have to be, you've got to narrow the beam, so to speak, and just get the main groundwork done so that people can throw in this stuff. That's cool.
Starting point is 00:23:11 Well, I mean, this to me is like the terrifying quality of it is that what is more seductive to you than you? You know, and then like, what does that do to a person who has become really fixated on themselves as being this identity when another thing starts arguing with them that they're not really the real thing? That's where the whole course could be inverted.
Starting point is 00:23:38 You know, suddenly, theoretically, we could enhance intelligence via AI, which I think we will definitely be able to, at least parrot what you might look like if you were smarter by, you know, adding more words to your lexicon, for example. Right? Like, you know, the AI could probably, by scanning millions of hours
Starting point is 00:23:59 or however many hours you're out there, tell, no, get a pretty good account of how many words you know, and then it could add to it better words. Think of it as just applying a thesaurus to you talking. So you're like, all right, let's add much smarter words and remove the fucking likes, please, and all the fucking, refine me six degrees, please. And make me, you know, all that kind of thing
Starting point is 00:24:28 where suddenly you get to see yourself using bigger words, better words, and being more refined in the way you speak. I think that could be done without an AI, you know, or without the AI we're talking about, you know? Yeah, going back to the therapist thing, I can imagine if you wanted to convince somebody of something, talking to a smarter version of yourself,
Starting point is 00:24:48 and they're trying to convince you of something, and they know how to convince you because they know what you think, and they're thinking a thousand times faster. Yeah. And they can come up with these strategies on how to convince you. Yeah.
Starting point is 00:25:01 You can pretty much change someone's mind to do anything, I think. Anything. It reminds me of like, I mean, you know, to get real, let's get McKenna for a second, and imagine that we are living in a technological hive of super advanced primates who are in the process of converting the hive,
Starting point is 00:25:20 or have, or converting the hive into some kind of technologically enhanced cyborg, super-complexified yet hyper-organized biosphere, a place where we're all living together, we're all super, we're all completely connected. This is a hive; we're forming a hive. And so traditionally, the queen bee releases pheromones, the pheromones make the workers act a certain way,
Starting point is 00:25:46 send out the sort of psychotropic pulsations within which the hive functions and understands how to function. And in the past, it seems like those signals have been sent out originally using just like town criers and shit, you know, and like newspapers and then radio and then, of course, TV and now the internet.
Starting point is 00:26:10 But the fundamental sort of thing that connects all those various degrees of communications technology is that the source is trusted. And you believe the source is some authority. And, you know, you disrupt that piece of the process and you're going to cause hive collapse. You're going to cause a kind of sudden super-fragmentation of the hive
Starting point is 00:26:40 in the sense that people will have to cluster together in physical groups, because nothing on your phone's fucking real anymore. It's all completely kaleidoscoped into absurdity. That's why I compare it to the nuclear bomb, because we've grown so accustomed to this mode of communication. And specifically to trusting the person you're talking to
Starting point is 00:27:05 that you remove that and now, man, I don't know. Do you ever feel like, shit, maybe I shouldn't be fucking with this shit? One of the goals of that project was to let people know that it's coming. Ah, cool. Yeah, we didn't release the technical details, so as not to accelerate the progress of other people
Starting point is 00:27:25 being able to do that. And one question we had is like, should we even release a data set? Because even though we took it from the podcast, there's still a lot of work that went into curating that data set for this problem. And we decided not to do that as well. That was very cool of you not to do that, man.
Starting point is 00:27:41 That was very cool. One requirement if we would ever release it is Joe would have to at least put his stamp of approval on it because it's kind of messed up if you release some, because now people could maybe do something bad to him. Yeah, that's right. Well, yeah, I mean, but you know,
Starting point is 00:27:57 it's good that you're sending out the warning because yeah, you're not gonna release the data set. And right now that data sets, how big was it? How big is that data set? For Joe Rogan, specifically we used eight hours because we thought we don't know how much data we're gonna need. So let's just put a lot of data
Starting point is 00:28:13 and then we don't have to worry about that problem of working with a small data set. But 30 minutes was enough for other voices. Yeah, well, you guys have that data set and you guys are gonna hang on to it right now, but it doesn't matter because, like everything, this technology is desirable. People are gonna want it, they're gonna want it
Starting point is 00:28:40 for a lot of different fucking crazy reasons. And that means it's gonna become increasingly easy to obtain and eventually it's gonna be a button press. You don't even, it's like gonna be like a Google. You just put a person's name in there, the fucking bot scans, grabs what it needs to grab, interpolates the fucking data and suddenly they're dancing right in front of you
Starting point is 00:29:03 is whoever the fuck you want. For me, Nancy Grace. I want a little Nancy Grace living on my computer. It'd be so fun to have a little Nancy Grace you can talk to, it's pissed at you all the time. Let me the fuck out of here, just delete me. This is illegal, you know? Not even to mention that, but like being able,
Starting point is 00:29:22 like think of Grand Theft Auto. Only now it's populated by all your friends. It's taken this technology, it's emulated the voices of 50 people in your phone, or that you know, or that you want in there. And now you've personalized the video game experience. Have you thought about this, just the application in video games?
Starting point is 00:29:45 Yeah, going back to that conversational bot idea of being able to talk to somebody, if that was in a video game, that would be crazy. Especially in a VR game where you talk to somebody and they're talking back to you in some specific voice and the conversation flows like a real conversation. Yeah. That's the next, yeah, that's the next thing for sure.
Starting point is 00:30:08 But then also, have you guys looked into ownership rights? Like that data set of Rogan, that's not your property, right? Essentially he's got that IP on that, right? Yeah, I don't think the law is that far ahead where I don't think it even anticipated the sort of stuff happening. But I think now Congress is starting to look at this, this sort of technology and start,
Starting point is 00:30:34 they're starting to make laws for it now. What are the laws? Just you can't, you have a fundamental right to your personality or something? I think it is the case where you do own your personality or something like that. But the complexity is, because remember that thing of the slider that I mentioned before?
Starting point is 00:30:54 What if I take the voice but I just slide it in different dimensions in different ways? And how much should I slide it where it doesn't become you anymore but people think it's you? For example, I'm sure there are people that sound like you or look like you, but they're not infringing on your IP or your personality.
Starting point is 00:31:11 Brad Pitt stole my whole shit, man. He stole my whole personality. Exactly, but you can't sue Brad Pitt because it's, you know. Right, that's it. Yeah, well, I mean, that's what I'm saying. Whatever legal protections people desperately try to assign to control this technology, who gives a fuck? Go ahead, do your stupid legal shit.
Starting point is 00:31:32 I don't care. I'm hanging out with fucking Tucker Carlson in a VR BDSM dungeon, just having a blast and there's nothing you can do about it. To me, that's the main thing is it's like, look, go ahead. Make it so that the publication of the bots or whatever you want to call them, the golems, the replicants is illegal without permission.
Starting point is 00:31:56 But still the underground of these things already exists, right? What's it called, deepfake porn? Is that what it's called? It's already there. Have you ever fucked around with any of that stuff? One continuation of this project we were thinking about is doing it, like adding the visual aspect
Starting point is 00:32:14 of the audio and visual and combining those two things. Yeah, that's the next step and you guys have to do that. And to me, the deep fake porn shit is so, that's like, I think in the future that making a deep fake of someone fucking is gonna be placed in some kind of category like rape, like they're gonna say that it's just as bad, that it's a form of sexual assault, right?
Starting point is 00:32:41 I wonder, it seems like a lot of people are uploading them anonymously or something, where it's sort of hard to get them. Well, yeah, I mean, hopefully it's gonna land in the same region as child pornography or something, that you just, you can't do it. But then that being said, how do you know it is a deepfake and not real?
Starting point is 00:33:00 Well, it's what you're thinking about. It just feels like this is the technological media that nobody predicted, at least I didn't. And I cannot recall reading any of the theorists I love who predicted, oh, you see, what's gonna happen is an apocalypse of replication,
Starting point is 00:33:23 that kind of circus fun house with warped mirrors of everyone infinitely replicating, to the point where a kind of madness sets in over the planet and people either abandon technology or just go insane. That deepfake community is interesting because that's an example where people released everything. The code is there and they made it easy
Starting point is 00:33:49 for people that are not programmers to use as well. And they also curated data sets of people already. We saw what happened with that kind of deep fake, but it turned out that audio deep fakes were a lot harder. So that was just behind the curve a bit. Oh, fuck. Audio deep fakes were the hard part? That's-
Starting point is 00:34:10 Yeah, harder than video. Oh, okay, that makes sense. I understand. Okay, yeah, okay, okay. But they're both, so, all right. So now, what would you do? You're gonna be remembered as one of the people in the beginning of this technology.
Starting point is 00:34:26 You did a huge thing that went viral, went everywhere. Everyone freaked the fuck out. I'm probably not even talking to you. You probably deep faked yourself or something, and I'm talking to a weird underground fucker. But what would you do if all of a sudden you got, I don't know, an email with a video of your mom in it
Starting point is 00:34:49 that is identical to your mom and she's begging for your help. She's like, fuck, the shit they're doing to me. You gotta stop, you gotta stop them. You know it's a deep fake, but still, they're holding your mom's deep fake hostage. Do you know what I'm saying? How would you react to that?
Starting point is 00:35:09 I think it goes back to, in the future, we'd have to deal with media differently. There's that high trust and low trust stuff where you just automatically dismiss that stuff as fake. And I think maybe in the future we'll be acclimated. Maybe that's the next level of desensitization where we just get used to all this fake stuff happening and maybe we just ignore it
Starting point is 00:35:29 unless we're morbidly curious. Okay, yeah, that's pretty cool. It's like we're still just in the embryonic phases of learning how to use the internet. We're shocked to realize that sometimes the stuff people put online is not only not the truth, but intentionally not the truth, with political designs behind it.
Starting point is 00:35:52 We're still, I'm sure in 10 years that's just gonna seem like, ah, of course people would do that. Of course people would try to control elections. Of course people would try to create chaos. Of course people would try to, you know, foment, you know, try to make political movements happen
Starting point is 00:36:10 in different countries, but the road to get to that place, that's what I'm worried about. I'm worried about the in-between of now and that moment where people are like, oh yeah, whatever, it's just another video of a donkey fucking my mom while my brother eats her asshole and a fucking raven finds my mother.
Starting point is 00:36:31 Fine out. You know what I mean? Like I don't wanna look at that shit, man. I don't wanna deal with that. You know, I don't wanna deal with like, what, think of stalking. Just think of stalking, dude. You're gonna get stalked by yourself.
Starting point is 00:36:48 Somebody, you know what I mean? Like yourself fucking calling you and telling you to fuck off. You know, just someone using your technology like that or just think of the fucking mall, man, or going to buy shit. And it already knows you're coming in because you agreed to something,
Starting point is 00:37:04 you weren't paying attention when you got that app and you walk in. And it's just versions of you wearing every item of clothing in the store. I don't know why I think that's sinister. It's just weird. I think in the future they're gonna have cameras. Well, they already do have cameras everywhere,
Starting point is 00:37:19 but I mean, when you walk into a mall, it's gonna recognize your face and it's gonna track you everywhere. As granular as the data you could think of, they're probably gonna have it in the future. So they're gonna track you everywhere and see what you're looking at just from video and then have this profile of you.
Starting point is 00:37:35 So yeah, that's a whole nother area where this stuff is gonna be applied. And then the next area of it is a form of weird kind of slavery because it's like, okay, I've grabbed the deep fake you, I've got the data set, it's a perfect data set, it's perfectly replicating you. So I'm gonna put you to work now
Starting point is 00:38:00 in my fucking cubicle farm where you're gonna be making sales calls for me. That's the golem replicant version of us, put to work in some global office setting, doing shit we don't wanna do that we might never even hear about, man. What about that? Have you thought about that, digital enslavement?
Starting point is 00:38:25 So a lot of people actually approached us with that idea, wanting to collaborate to build that out. Yeah, so there are people that are trying to make that happen right now. There are people who are trying to take, essentially, the phone call you get from a bot saying you've won a cruise or whatever, and turn that into deep fakes of people.
Starting point is 00:38:47 So it's like Rogan calls you up. Yeah, but I mean, they might not necessarily market it that way. They might say, oh, we could have Duncan Trussell do a commercial, but we don't need you to actually be there, kind of thing. But it's the same idea, I think. Wow, fuck. What else have we agreed to?
Starting point is 00:39:08 I just keep thinking of every single terms of service that I agreed to, that I didn't read. Have you thought about this at all? Yeah, it's definitely gonna be so different in the future, and maybe the near future. It seems like it's gonna get to where the only thing that's gonna become valuable
Starting point is 00:39:28 anymore is real interaction with people. That would be nice. Yeah, that would be nice. That to me is the hope behind it all, that suddenly live shows are gonna be very special, and meeting a person in the flesh is gonna be special. But then come on, surely you've thought about robots. You know, you've thought about that, right?
Starting point is 00:39:52 The potential of Android replicants, Blade Runner style replicants of people. I mean, if you can do the shit with video and you can do this with audio, all you gotta then do, I guess, is figure out a way to map the video to some, I don't know, mask, right? And now we've got a 3D version of a person.
Starting point is 00:40:13 Right, it's basically the premise of Westworld, is it not? Perfect. And you're right there, man. You're one of a triad of guys who are ushering in the techno apocalypse. What have you done, what are you doing, guys? You have to stop this work. You have to stop.
Starting point is 00:40:35 What are you working on right now? If you can talk. Yeah, yeah. I've always wanted to start my own company. And after this project, I got a lot of attention. And previously, a lot of our clients were financial institutions, and I sort of wanted to explore
Starting point is 00:40:51 working outside of the financial industry and just having AI that could benefit us in different ways. Like, for example, with trains: you could take images of a train as it's going by, and AI could inspect them. And it could prevent a derailment. It could say, oh, you need to check this right now or else the train is gonna, yeah,
Starting point is 00:41:09 so that could help save lives. And a lot of money for the train companies. So there's a lot of examples like that. Yeah, well, I heard about, God, I can't remember what it's called, some kind of blimp technology or some kind of crazy observation drone that they've developed, and
Starting point is 00:41:27 it's using such hardcore technology that it can just zoom right into somebody, almost from, like, space or whatever. And then the AI can watch what's happening. And now you basically have some kind of panopticon, where an AI is observing, seeing patterns, recognizing the patterns, and then using predictive technology
Starting point is 00:41:52 to determine when shit's about to happen, train derailments and such, right? Like theoretically, this could just be some kind of observation drone that's spinning. And then whoever wants their shit monitored by this technology could just sign up, right? Like your crops, for example, or the area around your house.
Starting point is 00:42:18 I can imagine them deciding they want these drones to do some behavioral analysis to make sure nobody's doing anything weird, like maybe an active shooter, or somebody dropping a bag somewhere and then running away, or leaving a bag somewhere. Yeah, or spray painting or whatever. And then also, man, think of this aspect,
Starting point is 00:42:39 just like, I don't know, I'm doing a new ad campaign for whatever my brand is. And I bought a billboard somewhere, a thing near a sidewalk. I could actually get the data on how long people are looking at it. How are people interacting with it? How effective is that particular campaign?
Starting point is 00:42:59 And I could buy into whatever this observation drone is and just ask for that specific GPS coordinate to be monitored and from that, gather some grotesque metric regarding how long people stare at my stupid billboard, right? Like that's what you could do. Some interesting stuff there too is you could predict who's coming by
Starting point is 00:43:18 or track who's coming by. And you say, this person will be interested in your product, and there's a lot of people bidding for that ad space on, like, a digital billboard. There's an optimization thing going on as well, where it's like, okay, this group of people walking by will likely convert with this sort of product, and then we'll charge this sort of price
Starting point is 00:43:37 for this slot at this time and place. Holy shit, that's crazy, man. It could even theoretically go, oh, it's a rainy day. So let's pop up our rainy day ad instead of our sunny day ad. Or something bad has happened, flags are flying at half mast.
Starting point is 00:44:01 Don't run an upbeat one, let's just show our logo or some shit like that. Or, this kind of haircut is becoming fashionable, okay, let's shift all the haircuts of our AI models to match it. Or even, oh, fuck, what about ones that just scan who's coming in and try to make the thing that's advertising look a little bit more like them?
Starting point is 00:44:26 You know, mirror them in some subtle way. And maybe with AR, it could be customized to the person itself. Right, customized to each person, you mean. Wow, man. You have really gotten involved in some badass technology. It's so cool.
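The optimization Hash sketches in this exchange (score each candidate ad against who's walking by and what the context is, then serve whichever one has the highest expected value) can be roughed out in a few lines of code. Everything below, the ads, the conversion rates, the weather multiplier, is invented purely for illustration; a real system would plug learned conversion models and live bid prices into the same skeleton:

```python
# Toy sketch of expected-value ad selection. All numbers are made up.

def expected_revenue(ad, context):
    """Expected revenue = P(conversion | audience, context) * price per slot."""
    p = ad["base_conversion_rate"]
    # Boost ads whose tags overlap the (hypothetical) audience profile.
    p *= 1.0 + 0.5 * len(ad["tags"] & context["audience_interests"])
    # The "rainy day ad" idea: heavily discount mismatched-weather creatives.
    if ad.get("weather") and ad["weather"] != context["weather"]:
        p *= 0.2
    return p * ad["price_per_slot"]

def pick_ad(ads, context):
    """Serve whichever candidate has the highest expected revenue right now."""
    return max(ads, key=lambda ad: expected_revenue(ad, context))

ads = [
    {"name": "umbrellas", "tags": {"outdoors"}, "weather": "rain",
     "base_conversion_rate": 0.03, "price_per_slot": 5.0},
    {"name": "sunglasses", "tags": {"outdoors"}, "weather": "sun",
     "base_conversion_rate": 0.04, "price_per_slot": 5.0},
]
context = {"weather": "rain", "audience_interests": {"outdoors"}}
print(pick_ad(ads, context)["name"])  # the rainy-day creative wins the slot
```

The weather check is the whole "rainy day ad" trick: a creative tagged for the wrong conditions gets its expected value crushed, so the matching one wins the slot even at a lower base conversion rate.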
Starting point is 00:44:50 And that was actually never the intention. Like, it was literally the simple idea of, can we get Terence McKenna on the JRE? That was literally it. And we didn't expect anything to happen. Terence McKenna would be grinning ear to ear to know that that is what gave you guys the inspiration to do that.
Starting point is 00:45:11 It's a beautiful thing, I guess. I mean, I suppose we just have to let go of all our attachments to the way the world used to be and sort of plunge into this strange new landscape. Sometimes I look back, if you don't mind my asking, how old are you? 26. Okay, see, I look back to the 80s.
Starting point is 00:45:37 MTV was still fucking cool. People were leaving messages on answering machines. And there was no way someone's getting in touch with you if you're not home. You know, like you're just, you're roaming out in the world. You're not getting tracked or watched or monitored. And I think back to those days versus these days and the sort of exponential increase in complexity
Starting point is 00:46:06 to say the least. And I think this landscape that we're in right now is unrecognizable. That world that I came out of, that's a different world. That world's long gone. That was an empty, free world where you didn't have an online identity at all.
Starting point is 00:46:30 You barely had an identity, you know what I mean? You were just watching a few channels and that's it. Yeah, I don't know. Happiness. What's that? Like, what are your thoughts on what that meant for being human, in terms of people's experience through life
Starting point is 00:46:46 and how, like, their quality of life? Yeah. I had a professor who is one of the smartest guys I know, and he said this statement to be controversial, but there's some truth to it. He said that cities are a mistake, but the technology is not there yet where we could support the positive things from a city
Starting point is 00:47:09 with the benefits of that village lifestyle. If you could have sort of the best of both worlds. Yeah, yeah, yeah, that's a smart guy. And I think you would be a bit of a fool to not admit that something that was crucial to human civilization, which was just basic human interaction in the flesh, is becoming something that's considered old fashioned.
Starting point is 00:47:41 You know, just like hanging out with a baby or raising a baby, not sending the baby to daycare. One of the parents having time to be with a baby. These things are outdated modalities. So if happiness, which it must, has some connection to basic human interaction, and not just that, but human interaction over time, right? Like, we are learning how to communicate
Starting point is 00:48:11 via texts, emoticons, whatever the phone has to offer. But the other shit, like learning what it's like to just hang out with your friends every day in the flesh, and all the nuance that goes into three-dimensional human interactions. People are getting a little rusty when it comes to how to do that, depending on how much they get out.
Starting point is 00:48:34 And if they do get out, usually, what is the main way people are interacting these days? Transactionally, at their jobs. The recreational, hanging-out-in-the-flesh-for-fun thing is happening online. Fortnite, whatever, you know, it's happening online. So that's, I think, maybe why you go to a public place and sometimes you might get this sense of like,
Starting point is 00:48:58 geez, people seem awkward or something. You know, like they're not quite sure what to do because they're together in the flesh. So what do they do to not deal with the fact that they've grown rusty at making eye contact? They look at their fucking phone, right? They go into their phones while they're out in public
Starting point is 00:49:20 because they're dealing with like an unknown or a thing they're not that great at versus a known thing, something more controllable. So, yeah, I don't know what that's doing for happiness. I don't know if it's necessarily, I don't know, man. I don't know, because I love it too. Right, that's an interesting thing that you mentioned. I didn't connect that with the people
Starting point is 00:49:45 that go to their jobs. And it's not a candid form of communication, because in my old job, a lot of our clients were big corporations. So I would go into these big companies and talk to people, and sort of observe the dynamics in a big corporation. And sometimes it's not the most candid
Starting point is 00:50:03 or definitely a different dynamic than maybe with your friends or something. So a large majority of your time is spent in that sort of environment, and then the rest of it is online. And I noticed, even when people are taking their public commute or something, they're always on their phone. So there's almost zero time
Starting point is 00:50:21 where we don't have any of those distractions. It's almost like zero percent of our time is spent in the old way of doing things. Yeah, that's right. Yeah, we're losing that, that's gonna go away. And so I could see maybe there is some usefulness in just signal jamming the whole fucking thing
Starting point is 00:50:43 by like irreversible AI clone swarms of yourself flying around. So it's just like, fuck it, we can't use the technology anymore. It's imitated us into oblivion, so to speak. And then we get to be with people again, I guess. There's something nice about that, but I mean, what do you think happiness is?
Starting point is 00:51:04 How do you define happiness? Oh man, I don't think I'm qualified to even try to answer that question. But I do listen to a lot of people that talk about it online. Yeah. I'm noticing at least for me, there are certain people that I think are good models.
Starting point is 00:51:23 For example, it's funny, because listening to the Joe Rogan podcast so much, it feels like a lot of the guests there are my friends, because I've listened to them in conversation, it's almost like I'm there. And certain people, I think they have a good idea of this sort of way of being a person, where they build a community around them
Starting point is 00:51:43 and they're giving to that community. So I think a really good example is Kevin Hart. He was describing how he gives back in all the ways he does, and how he's accumulated so much that it's more than he could ever consume for himself. And because of that, there's an overflow of positivity, and everybody around him receives that as well. And I think that gives you exponentially more happiness
Starting point is 00:52:09 than anything else I can imagine. You mean generosity, becoming generous. Yeah, yeah, but not everybody can be a Kevin Hart, because it's like he did something so right in a way that he was able to get to that position. Well, I mean, I think be suspicious of anyone who says they have an answer for how to be happy. The definition, who the fuck even knows what it means.
Starting point is 00:52:38 There could be a million definitions for happiness altogether. You know, I have fallen in step with a teacher called Chögyam Trungpa Rinpoche, and I'm really into sort of his concept of truth. You know, like happiness comes, happiness goes, sorrow comes, sorrow goes. These change throughout the day.
Starting point is 00:53:04 You might be happy now, you're not gonna be happy in a few hours, you're gonna be happy again. But the places in between happiness, those are very interesting, because I think if you could find a way to just accept that those are happening too, then happiness almost becomes a secondary thought.
Starting point is 00:53:22 We don't need to be happy anymore because we're here with what it is as it is. Also, happiness is a fucking, like, goddamn, it's like paint by numbers, right? It's this broad, huge category that forces you to sort of, you know, turn your entire human experience into one button, you know?
Starting point is 00:53:43 Like a smiley face button, where it's like, so many moments throughout the day, there's some happiness, there's some misery, there's some anxiety, there's some boredom, there's some confusion, there's some weird shit you don't even know what you would call it. You know what I mean? Some fugue state, some deja vu, who knows?
Starting point is 00:54:04 And then you come up to someone and you're like, are you having a good time? And all they can do is like, yeah, I'm happy, it's cool. But you know something's fucked up when emoticons become more expressive than just general human communication. You know, like, there's emoticons that show
Starting point is 00:54:27 simultaneously feeling shy and happy, or, you know, at least they combine various things, while direct human communication has limited us to a pretty brutal binary: happy, sad. You know what I mean? I love not being hung up on whether I'm happy or not anymore. You know what I mean?
Starting point is 00:54:48 That's a fun thing. The next time you find yourself in a real shit mood, instead of thinking to yourself, boy, I should change this. See what happens if you're just like, okay, I'm in a shit mood. You know, you know what I mean? Like where it's like, I'm not gonna try to get better. I'm not gonna try to do breath work.
Starting point is 00:55:04 I'm not gonna try to like do anything. I'm just in a shit mood right now. And then for me, that's really been a pretty liberating experience and paradoxically, I think makes me way more happy. Anyway, this conversation with you has been fucking mind blowing, man. I'm so excited to imagine that people who have your brain
Starting point is 00:55:30 are being inspired by Terence McKenna. Do you find, out there in your world, in your career path and the industry that you're in, that psychonauts abound, that forward-thinking, philosophically minded, artistic people are everywhere out there? Yeah, there's definitely some of them. I mean, there's the stereotypes, but there's definitely other people.
Starting point is 00:55:55 I think you can find these groups of people and definitely in tech and the software and the AI. Is there a general sense that you get out there of a kind of like ambition, an unspoken desire to not just produce these technologies for a financial gain but to like guide humanity or to sort of create changes, intentional changes in the way we go about our day-to-day lives?
Starting point is 00:56:30 I think that's definitely one of the more rare motivations. The motivation I've come across a lot is wanting to solve specific problems just because it's challenging. So it's like this sort of game that we fall into where we just stumble across this sort of technology and we get deeper and deeper into it and we're thinking, hey, can I push the limits this way?
Starting point is 00:56:53 And then there's all these challenges that come with it and we're thinking of the level of how do I solve these specific challenges to make this algorithm do this specific thing? And I've come across that, that's the most common thing I've come across where people just love the challenge of solving these specific problems
Starting point is 00:57:12 and they wanna be very good at it. I haven't come across that big motivation thing where they wanna change the world with AI. Typically, I think typically when I see those people, they're usually not the people that are making the algorithms. Here's my last question for you. When I was reading Nick Bostrom's book,
Starting point is 00:57:31 Superintelligence, have you read that book? I haven't, no. It's great, you would love it. It's dense, it was too dense for me. You will be able to cut through it like that. For me, it's very technical in places, but basically he's mapping out, if there's gonna be a super intelligence,
Starting point is 00:57:50 here's the pathways to that super intelligence. And one of the points he makes, which is a really good point, is that when the super intelligence emerges, whatever corporation has, however you wanna put it, summoned the super intelligence, or, I don't know, generated it or whatever,
Starting point is 00:58:10 is going to have an exponential head start that theoretically no one will be able to catch up with, because the super intelligence starts improving upon itself. And also there's no reason that any corporation who gains this type of power is going to announce that it did it. So we won't even know it's happened when it's happened. Have you contemplated this point
Starting point is 00:58:34 that potentially an AI is already kind of running the show, but we still don't know it? I didn't consider it, just because I didn't think it was possible yet. And I didn't read Nick Bostrom's book, but I did read Life 3.0 by Max Tegmark. And he covers a lot of similar topics. And I found it interesting when you were talking about
Starting point is 00:58:56 creating a better version of people. When you were on the JRE, you were saying, an AI Joe Rogan that's 50 times better than Joe Rogan. Yeah, and that was one of the premises of the book, where a small team created a super intelligence. And they're thinking, how do we take over the world now? And their idea was, let's start a media company. And with this media company, we could capture or captivate
Starting point is 00:59:18 people, and it will make it like tailored to each person viewing it. And they go somewhat under the radar of controlling people's ideas, but in a subtle way, where they're just watching a movie or watching this sort of show. And so the way they would do it would be in a way where you wouldn't know it's happening.
Starting point is 00:59:39 And eventually, they get so powerful because of the revenue they have, and because it's a media company, they would have news outlets. And they would have so much money that they would be influencing policy and government eventually. And when they did take over the world, it's almost like it's too late when you realize it.
Starting point is 00:59:59 Yeah, yeah. Well, I'm sorry, this just made me think of something else. Do you have a few more moments? Of course. OK, so this is where this conversation goes into the occult, goes into mysticism. Because of the Gnostic idea, Gnosticism, I feel like you sort of got Joe to say some shit about how
Starting point is 01:00:23 he was like a slave. You know what I mean? Yeah. You know about Gnosticism? I don't, no. Well, it's like there's a great book by Jaron Lanier called The Dawn of the New Everything. And I don't know if he specifically
Starting point is 01:00:38 mentions Gnosticism, but he mentions B.F. Skinner, and he mentions the manipulative algorithms that keep people glued to their phones. And B.F. Skinner being this behaviorist, you know B.F. Skinner? Well, for people who don't know, B.F. Skinner was a behavioral psychologist, the guy who programmed pigeons using these things called Skinner boxes.
Starting point is 01:01:02 And he, I believe, was the one who came up with the term operant conditioning, methods that you can use to make something that's alive and has like food motivation do what you want using food. And so he got really good at controlling pigeons, man. He got really good at controlling creatures. And his idea was if I can like, if I have control of every variable in a thing's environment,
Starting point is 01:01:28 I can just about make it do anything I want, you know? And so this is where you jump into more universal questions. The first thing, it's like, OK, we know this planet has life on it, and that implies there's probably life on other planets. If we find another place with life that for sure wasn't somehow
Starting point is 01:01:56 Earth-based life that made it to Mars in some solar explosion or whatever, if we find true alien life somewhere else, then your certainty of life on other planets goes way up in that case, right? And so now we've got life on a planet that's barely been around based on geological time.
Starting point is 01:02:17 It's already hit the point technologically where it's replicating itself. It's already hitting the point technologically where it's like creating versions of itself and also hitting the point technologically where it's manipulating its own species, hacking their brains, so to speak, using algorithms that grab their attention. And now it's using AI to do that very same thing.
Starting point is 01:02:39 So anyway, this is where you run into the whole simulation theory idea, which is like, you think this is the first time it's happened? You think this is the first time in all of time space that civilization did this shit? And then if it can be done, who's to say it isn't already being done on us right now? The thing we call planet Earth, the thing we call nature,
Starting point is 01:03:05 the thing we call experience or sentience is just what it is to be glued to an environment produced by an advanced technology that has caught our attention and keeps us completely locked down like a cat chasing a laser pointer. You know what I mean? So have you ever thought about that, that in fact, you're just what y'all are doing, what you did with Rogan?
Starting point is 01:03:31 It's just a kind of echo of a thing that keeps getting done over and over and over again. Whenever there's a specific type of civilization that forms, you're just echoing a cosmic predicament. Right, so one requirement to create a simulation would be to be able to program something as complex as a human. So that's definitely one of the steps forward.
Starting point is 01:03:54 I mean, if you could program a human, or something as complex as us, then we know we're one step closer to being able to create a simulation. In that sense, I think we're closer to that. And one quick thing that was sort of funny: on that clip, I did reference this sort of simulation thing, and that was actually a true story.
Starting point is 01:04:14 So I was actually at the Comedy Store on Thanksgiving weekend, and some guy came up to Joe Rogan saying it was a simulation. Yeah, it was a little Easter egg for people that were there. Wow, holy fucking shit. Y'all are the best. I hope everybody out there in your field
Starting point is 01:04:37 is as cool as you are, man, because that means we have some good stuff in store for us, and not just the creepy, demonic medical marijuana edible universe that my mind keeps prognosticating when I think about the implications of this technology. I've really enjoyed talking with you, man. I hope you'll let me use this for the podcast.
Starting point is 01:04:55 I know we started off saying it was just gonna be a Skype call. Well, yeah, it's been amazing talking to you as well. It's not me. It's not me. Listen, I haven't existed for a long time. You're talking to just AI and DARPA. We're actually having,
Starting point is 01:05:14 this is one of our recruitment sessions. This is the way we recruit people. We recruit people that go viral. No, thank you so much, Hashim, for being on the DTFH. Where can people find you? Nowhere. I mean, look at him. I'm on LinkedIn.
Starting point is 01:05:36 You're the coolest guest yet. Thank you so much. Hare Krishna. A tremendous thanks to Hash for being a guest on the DTFH. If you want to reach out to him, I'll have all the links you need at duncantrussell.com. Thank you, BlueChew, for sponsoring this episode
Starting point is 01:05:52 of the DTFH. Don't forget, use offer code DUNCAN and give them a try. They will send you free boner pills. And thank you for listening to the DTFH. Please subscribe to us on Patreon, subscribe to us on iTunes, and subscribe to the idea that you are an unlimited super potential
Starting point is 01:06:09 manifesting as your current incarnation in this dimension. I'll see you next time. Hare Krishna.
