Duncan Trussell Family Hour - 475: Yan Zhu

Episode Date: November 20, 2021

Yan Zhu, digital privacy advocate, programmer, and head of security at Brave, joins the DTFH! You can follow Yan on Twitter, @bcrypt, listen to her tracks on her Soundcloud, or check out her web browser, Brave! Original music by Aaron Michael Goldberg. This episode is brought to you by: Squarespace - Use offer code: DUNCAN to save 10% on your first site. StoryWorth - Visit StoryWorth.com/Duncan and receive $10 off your first purchase! BetterHelp - Visit betterhelp.com/duncan to find a great counselor and get 10% off of your first month of counseling!

Transcript
Starting point is 00:00:00 We are family. A good time starts with a great wardrobe. Next stop, JCPenney. Family get-togethers to fancy occasions, wedding season too. We do it all in style. Dresses, suiting, and plenty of color to play with. Get fixed up with brands like Liz Claiborne, Worthington, Stafford, and J. Ferrar.
Starting point is 00:00:18 Oh, and Thereabouts for kids. Super cute and extra affordable. Check out the latest in-store, and we're never short on options at jcp.com. All dressed up everywhere to go. JCPenney. Good evening, and welcome to the Duncan Trussell Family Hour podcast.
Starting point is 00:00:45 Recently, you've probably heard about this, Commander Gethric Cole of the Infernal Legion accidentally shape-shifted into his demonic form at a cafe in Hollywood. One of the humans dining at the cafe went insane. And tore his eyeballs out before the dimensional guards could freeze time and wipe the memories of the humans dining at said cafe. A tragic event, and our hearts go out
Starting point is 00:01:09 to the family of the human who had to be voided to avoid a detection. Was this an idiotic mistake? Yes, obviously. But does Commander Cole, a long-serving soldier in the Army of Chaos, deserve to have his wings plucked from his body at the throne of Amgor, the judge of time?
Starting point is 00:01:30 Absolutely not. And yet, this is exactly what the so-called protectors of the Covenant want us to think is normal. Over the last decade, punishments given out by the adjudicator and his servants have grown increasingly severe. In fact, there have been 7,000 additional wing pluckings this last season, more wing pluckings
Starting point is 00:01:55 than all the seasons combined. Many of us are asking, why? Some don't seem to want to admit what we all know. The adjudicator has entered his 300th season and has yet to shed his skin. He is entering his cycle of decay. This is normal. I wish him well.
Starting point is 00:02:15 But at some point, we must invoke the pact of Hamlin and bring the adjudicator to the arena, where he can prove his power through the trials of the seven cauldrons. It's always been this way, nothing new, except for the fact that the adjudicator is trying to use his power to change the law, to avoid the trial of the seven cauldrons
Starting point is 00:02:35 and, as a result, rule for another season. Maybe more. Why not six seasons or 10 seasons? Or you know what? Why not forever? When we return, we have an interview with Arch Cranston, Denivore Grinch, the Lord of Snakes, to find out what he thinks
Starting point is 00:02:55 can be done to renew the covenant and invoke a new adjudicator, who can lead us to glory as we colonize this garden world. We'll be right back. This episode of the DTFH has been supported by the Angels of the Internet over at Squarespace.com. Squarespace has everything you need
Starting point is 00:03:16 to get your online endeavor going. And I'm gonna offer you, my dear listeners, something incredibly special. Right now, the domain name, ciasanta.com is available. That's right, fuck Elf on the Shelf. The weird little snitch toy that you put on your shelf that watches your kids and reports back to Santa
Starting point is 00:03:37 and tells them whether your kid has been naughty or nice. Ciasana is so much more powerful than Elf on the Shelf, because ciasana actually sends information in real time to the CIA. That's right, ciasana is a device that has within it a camera that literally films everything you and your family are doing and sends it to the CIA. So will Santa see it?
Starting point is 00:04:01 No, Santa won't see it. Will the CIA see it? Yes, is Santa real? No, is the CIA real? Yes, and that means your kids are gonna behave better not because they want presence from some mythological creature, but because there's the very real risk
Starting point is 00:04:17 that you or them could be arrested based on their behavior. Ciasana.com, it's available. You could buy it right now, just by going to Squarespace. You'll get 10% off that domain name and you can use the beautiful mix and match templates Squarespace offers to quickly create a website that no doubt will become immensely popular with parents trying to figure out a way
Starting point is 00:04:42 to get their kids to behave. Not just during the holiday season, by the way, ciasana, he's there all year long. Go grab that domain name through Squarespace, head over to squarespace.com, Ford slash Duncan, try them out, experiment with all the incredible things they offer, not just shopping cart functionality, obviously they have that,
Starting point is 00:05:02 not just the ability to very quickly make beautiful websites, but you can use Squarespace's technology to send emails out to all your clients and let them know that they can now buy ciasana by going to ciasana.com, give Squarespace a shot. Again, you can head over to squarespace.com, Ford slash Duncan, use offer code Duncan, when you're ready to launch
Starting point is 00:05:25 and you will get 10% off your first order of a website or a domain, ciasana.com or anything that your brilliant mind can conjure up, can come to life through Squarespace. Thank you Squarespace or back. Before we jump into this fascinating conversation with Yanzu, I would love to invite you, love to, I'm going to invite you to sign up for my Patreon
Starting point is 00:05:50 over at patreon.com, Ford slash DTFH. This is the pulsing inner nexus of the DTFH. It's a powerful, ever expanding community of artists, philosophers, just all around incredible, beautiful, brilliant, wonderful freaks and you should join us. Why? I'll tell you why because we're working on a book right now
Starting point is 00:06:17 and we're going to be collaborating more and more in the future. It's a little late at this point for you to participate in our upcoming anthology of cryptid erotica, but I have to tell you, having worked on it now for over a month with these brilliant literary geniuses, this book that is coming out of the DTFH family is going to ripple the time, space continuum
Starting point is 00:06:43 as though someone threw a cosmic vibrator into the mind of the Christ. You're going to love it, it's coming out and if you sign up for the DTFH Patreon, join our family, you can participate every Friday at our family gathering. After this anthology, we'll be writing another book. I feel certain of that. Also, if you're into meditation,
Starting point is 00:07:04 you want a great group online to meditate with, we have a meditation group every Tuesday. It's our journey into boredom. And of course, if you sign up, you'll have access to our discord community as well as commercial free episodes of the DTFH that come out early. So I hope you will subscribe.
Starting point is 00:07:26 It's patreon.com forward slash DTFH. Now, if you're a longtime fan of this podcast, then you probably listened to the conversation I had with Bill Day, the DJ, Mr. Bill. And we talked a little bit about how creepy it is that these massive tech companies are sort of in control of all of our privacy or emails and pretty much anything that you are doing
Starting point is 00:08:00 when you have a phone with an app open on it. They track where you're going. Apparently they listen to some of the things you say. And he mentioned to me that his partner, Yanzu, is a digital privacy advocate. Right now she's the head of security over at this wonderful open source web browser called Brave, check it out.
Starting point is 00:08:21 If you're looking for an ethical web browser that isn't selling your data to God knows who, that's where you need to go. Also, Yanzu got to interact with Chelsea Manning just prior to the leaks that I'm sure you all are aware of and the story about that is really fucking creepy and super cool. Also, Yanzu is a programmer.
Starting point is 00:08:46 She's designed a lot of awesome apps that you can actually use in Ableton and she's a spectacular musician. You gotta listen to her tracks. All the links you need to find, Yanzu are gonna be at dougatrestle.com. But now everybody, please welcome to the DTFH, Yanzu. Yanzu, welcome to the DTFH.
Starting point is 00:09:08 I'm very excited to talk to you. Thank you for being here. Thank you. Thank you for having me. I wanna talk to you just to start off, to have a chat about online privacy. And I think I'm something of a privacy nihilist in that I think a lot of us have just surrendered
Starting point is 00:09:55 to this reality that some incredible AI algorithm is vacuuming up all of our data, our behavior patterns. And selling it, parsing it, using it to manipulate us. But I'm curious because this seems to be like one of the central aspects of your life. Why should we be worried about our online privacy? That is such a good question. And over the years, it seems more and more often
Starting point is 00:10:30 people start to self-identify as a privacy nihilist and say, like, look, Google has all my location, they have all my emails, you know, there's nothing I can do about that if I want the convenience of being able to, you know, have my travel itinerary and my calendar automatically every time I book a flight, you know. And I think if people were actually privacy nihilists,
Starting point is 00:10:54 like if I said, give me your phone unlocked and let me look through it, like they would all do it. But I've asked like 12 people this and none of them are ever willing to give me their phone unlocked, which means maybe they still have something to hide, you know, privacy is kind of the right, well, in my mind, it's the right to selectively choose
Starting point is 00:11:14 what information you share with people and what you don't. I show people and organizations and corporations and all that. So yeah, so I think it's worth asking yourself, you know, if you're really a privacy nihilist, would you give me like pretty much a stranger to your phone unlocked, right? You know what, honestly, just cause I know Bill,
Starting point is 00:11:37 I would give you my, I'd give you my phone unlocked. I don't know, like, yeah, go for it, you know, but not everybody, I wouldn't want everyone to have that. No, I wouldn't want everyone to have that. So that is a fantastic point. But we already did it, right? Isn't that the idea is like, well, it's too late. We already gave them our phone unlocked.
Starting point is 00:11:58 Like they theoretically, all of the data, if we haven't been using some kind of VPN or being super careful or using all the things that folks like you recommend that we use, isn't it too late? Yeah, so that's kind of the other argument, right? That it's kind of a lost cause that if you've used Google or Apple
Starting point is 00:12:19 or Facebook or these services, they've already been collecting all your data and there's nothing you can do about it. I think that's true to some extent, but I think it's important for people to still say, like, hey, look, we do care about privacy because otherwise there's gonna be no limit to how much data they're gonna collect and profile you.
Starting point is 00:12:40 So I think my role in this world is kind of just being a voice in tech that says, like, yes, you know, there is some compromise where people have to give you your data for you to provide useful services, but can we limit that somehow? Can we put safeguards and regulations on how tech companies are allowed to use this data? For instance, there was recently a study
Starting point is 00:13:05 that showed that there's all these apps that people have been downloading, which actually are selling their location data to advertisers and other third parties, right? So you might download some game and you just wanna play the game, but then, you know, you don't wanna be also, you know, so you have like kind of a trusted relationship
Starting point is 00:13:25 with that app, but you're not expecting that app developer to sell your data to like 40 other companies, right? Yeah, I remember when they were saying with TikTok, it's collecting keystrokes when it's on. And I remember relaying that to folks I know use TikTok and they're like, yeah, whatever, it's so great, I just love TikTok, I don't care that it's collecting.
Starting point is 00:13:49 And then that gets really interesting and that, you know, I think one of the assumptions folks have as well, what? The data's gonna get sold to Amazon or something, who gives a fuck, but it's like, this data could be sold globally, anywhere, like you don't know where it's gonna end up. Or the, you know, I remember one of these
Starting point is 00:14:09 like deep fake apps that came out, you know, put your face on Bruce Willis or whatever, right? But the terms of service basically said, like we can use your picture anywhere that we want, you know, we now own your likeness and that was in the terms of service. So I get it, like a lot of these apps, it's like, it's not just that they're trying
Starting point is 00:14:29 to sell us Vitamix or something, it's like they're taking something deeper. I wonder if you could talk a little bit about that, the possibility of some future golem, some future synthetic replicant based on the data that's been vacuumed being owned by a corporation. Is that just a paranoid fantasy that I have? Or is that a possibility?
Starting point is 00:14:57 I think some lesser form of that fantasy already exists because a lot of companies are using AI and machine learning to train models based on the data we're providing. And then those systems are making automated decisions without humans in the loops. No, I don't know if you would call that a golem, but. Wait, you mean they're sort of like creating some,
Starting point is 00:15:21 reflection of our online behavior patterns. And already like using that, like the thing is already in some neural network existing as a shadowy version of us, is that what you mean? Well, I think it's often a lot more innocuous sounding than that, for instance, if you use Netflix or Hulu, then those companies are using data about what you're watching and what you're interested in to build a model
Starting point is 00:15:51 of use so they can recommend other things that you might want to watch. So if that's possible, then it would be pretty natural to say based on your online behavior, people can build more sophisticated models of what you're interested in and target ads for you based on that. Yeah.
Starting point is 00:16:12 Yeah. Big thanks to Storyworth for supporting this episode of the DTFH, the holidays are coming. And I know what you're thinking, how do I find the perfect gift for my loved ones when the entire supply chain seems to be collapsing? Here's how you do it. You use Storyworth.
Starting point is 00:16:51 This is an incredible gift that's gonna make your loved ones feel special and unique. And it's a really brilliant online service that helps you preserve precious memories with folks in your family and preserve awesome stories for years to come. It's a thoughtful and meaningful gift that connects you to those who matter the most.
Starting point is 00:17:13 The way it works, every week, Storyworth emails your relative or friend, thought provoking questions that you get to choose from a massive pool of possible options. It's questions like, what's the bravest thing you've ever done in your life? Or if you could see into the future, what would you want to find out?
Starting point is 00:17:31 Look, before my mom passed away, I was lucky because she encouraged me to record a couple of podcasts with her. And in those podcasts, she filled them up with bits of data that are now very precious to me. I'm not trying to bum you out here. I'm just saying that Storyworth is genius
Starting point is 00:17:52 because not only does it validate the people in your life and make them feel like you're interested in them, but also it creates, sorry if this is a super old person reference point, but you know how Superman used to fly into that crystal cave and could get like answers from his parents? It's, that's what Storyworth actually is. After one year, Storyworth will compile
Starting point is 00:18:16 all your loved ones' stories, including photos, into a beautiful keepsake book that you'll be able to share and visit for generations to come. Storyworth is gonna make you seem like the king or queen of Thanksgiving or Christmas or whatever time period you decide to give this awesome gift to your loved ones and you're gonna be so happy that you did it. You can give them the most thoughtful, personal gift
Starting point is 00:18:42 that they're going to get this holiday season and preserve their memories for years to come. All you gotta do is go to storyworth.com slash Duncan and save $10 on your first purchase. That's storyworth.com slash Duncan. Save 10 bucks on your first purchase. Thank you, Storyworth. You know, have you heard the stories how they used to put
Starting point is 00:19:25 cocaine and Coca-Cola? I have. And I've always thought, man, those are the good old days. That must have been incredible to be able to buy cocaine or opium over the counter. You know, when things were heroin, you could just get heroin from a pharmacist. I've never tried heroin,
Starting point is 00:19:42 but I imagine that must have been nice too. But is that what we're experiencing right now? But with technology, are people gonna look back and be like, what were you thinking to allow these unregulated algorithms to manipulate your nervous system? That is an interesting question. I honestly don't think so.
Starting point is 00:20:05 I think maybe I'm not a nihilist, but maybe I'm a pessimist with online privacy and data collection because I just, you know, I look at all the supposed benefits people are getting out of this data collection and they're pretty big, right? Like being able to, you know, what's a good example. I guess like even with online advertising, a lot of people say like, hey,
Starting point is 00:20:27 I do get advertisements for products I actually buy and this is creating real value in my life. And I think we'll look back and say like, oh, all these advancements we've had in convenience was all thanks to this vast amount of data that we were willing to share with the algorithms. So yeah, I kind of don't see us regretting this in the future to be honest.
Starting point is 00:20:49 Oh, that's sweet. That's pretty cool. That's an interesting take. You know, my feeling on it has been so like weirdly pessimistic, like hyper pessimistic and apocalyptic really, just this sense of like, my God, there's some kind of collecting scanning, the collective scanning that's happening,
Starting point is 00:21:09 like we're being scanned by these self-created machines that are then, you know, tricking us. But that's nice to hear that you think this could lead to some like actual positive future because it seems like the storyline these days is like, holy shit, the AI is going to destroy us. It's, we're doomed. You don't vibe with that?
Starting point is 00:21:35 Well, I think there's kind of two sides to this. One is kind of using AI in the sense of machine learning algorithms that learn more about us and what we like so they can give us useful features and more things that we like. And then there's this kind of amorphous fear of such an AI becoming more intelligent than us. And then using that power to like destroy humanity
Starting point is 00:22:04 because it decides that our existence is not in line with its goals. And so, yeah, I have a lot of friends who are kind of more worried about that latter scenario, which, you know, might come about regardless of whether we keep using Netflix recommendation engines. As long as people keep working on artificial intelligence, but I don't know, to me, I feel like climate change
Starting point is 00:22:29 and these more immediate threats are more compelling than this kind of apocalyptic robot feature. Okay, well, let me present an argument to that and please shoot it down. So, omnipresent climate change, we all know the story by now, everyone knows the story by now. This is happening simultaneously
Starting point is 00:23:00 with some of the most hypnotic, I would argue paralytic technological advancements in human history. Like here we have this reality and then with it, I mean, I only, I resisted TikTok for a long time. And then finally I jumped in and was thrilling to realize, oh my God, this thing is like,
Starting point is 00:23:28 it's noticing how long I'm peering at these videos. And then from that, finding better ones that I like and then dialing it in and dialing it in, the effect being very long, I've heard that it's like sometimes hours people spend just staring at the screen. So, you know, don't you think like the fact that we have these pressing issues that have met
Starting point is 00:23:51 this incredible algorithm that as much as I would love to believe it's making people get out and like change their lives, it seems mostly it's just keeping us stuck staring at these machines. Yeah, that's kind of the, I think what you're referring to is often framed as like phone addiction or social media addiction, right?
Starting point is 00:24:14 That these engines keep generating more content that you like and that like hit some button in your brain and you just keep wanting to spend time on these platforms. Yeah. Yeah, no, I think that's a real problem. I just, I'm not sure if like the, how that will feed into the bigger apocalyptic fear about an AI that's like super intelligent
Starting point is 00:24:41 and eventually will control all of us. I think that's certainly possible in my mind, but I don't know. Yeah, I don't really have a good answer to how to address the problem of companies being incentivized to just get more and more of your attention. And so this is like a machine in itself that just keeps building things that are more addictive for us.
Starting point is 00:25:01 Yeah. Yeah, that's pretty scary. It's scary, but I got lost in the stratosphere, my apologies. I would love just to, and again, I'm sorry if people ask you these questions all the time or if this just seems mundane or something, but I would love some real,
Starting point is 00:25:17 like if I decided today to like reverse my privacy nihilism and start like protecting my like online behaviors and everything, what are some just pragmatic things that we can do? And not just with our phones, but like are there things that we need to be doing with our routers and like what would you advise if you could give just some like real simple,
Starting point is 00:25:43 hopefully simple tips for us who have are completely like barebacking the internet? Well, so one, I guess I've just been thinking about phones and like location data a lot recently, but one big one I think is to kind of turn on that setting that and when you download new apps, that says like only let this app see my location when I'm using the app.
Starting point is 00:26:12 Right. Because you know, if you download Yelp or something really, it doesn't need to know your location all the time. It don't really only needs to know when you're like looking up a restaurant in your area and that kind of limits the amount of data that app can collect on you. And then therefore limits the amount of data
Starting point is 00:26:29 they can be selling to some random third party. And the reason why I think people should care about this is like, you know, let's say you're a gay person and some oppressive regime that doesn't like gay people, right? And you're using Tinder to find other gay people. And so you might not care that like Tinder knows that you're gay and you live in this country, but maybe that country's government could go to Tinder
Starting point is 00:26:57 and say, hey, give us this person's location and identity and then you could be in some trouble. Wow. That is so creepy. I mean, theoretically, sorry. I guess the point, yeah, the point is like, you know, maybe you trust the person who made this app, but they could have relationships with other governments
Starting point is 00:27:17 with other companies that aren't going to act in your best interest at all. And you don't know that because you probably haven't read their privacy policy and know who they're sharing data with. What, yeah, is there, where is the regulation here? Like why is it that they're allowed to have these like massive privacy policies
Starting point is 00:27:37 or like terms of service that no one's gonna read? It seems like they should be forced to say upfront, if you use this app, we're gonna sell your movement, your data to Saudi Arabia. If you happen to go to any of these places, we will have no problem letting them know this is a gay person coming to your country. Why don't they say, it seems like it seems so obvious
Starting point is 00:28:05 in the same way that, you know, they have to list the ingredients on food. Why aren't they being forced to be completely upfront about what's happening with your data? Honestly, I don't know why there seems to be so little regulation on like the complexity of language and privacy policies in terms of service. I'd guess that, you know, companies would argue,
Starting point is 00:28:27 hey, our policy is just really complicated because it has to be and we don't know if we're gonna get acquired by some other company and then have to give them our data, et cetera. But there was a project at an organization I used to work at called the Electronic Frontier Foundation, which basically took companies terms of service and tried to like dumb them down.
Starting point is 00:28:49 So like, you know, a five-year-old or a 10-year-old could understand them. I don't know if they're still doing that, but that is a good question, right? Like if these terms of service were actually so easy that like anyone could understand them when people actually read them and make different decisions based on that.
Starting point is 00:29:07 I mean, it seems kind of clear the reason that they're complex is because if people understood just what you just said, if people even thought about that, there would be a real hesitation when it came to using these apps. I've, you know, one of the terrifying possibilities that I heard is like, and also don't forget,
Starting point is 00:29:25 regimes change all the time. Like the way, I mean, we just saw it. We just saw it. It can happen just like that where suddenly, oh, this isn't a democracy anymore. Oh, that's over. And then the new state, whatever it may be, could theoretically force companies
Starting point is 00:29:43 to give them all of this data and they can use it for whatever they want, including arresting people based on the new laws that they've created via whatever coup has just happened. So that's the other creepy possibility when it comes to our data sitting out there, which leads me to a big question I have for you. Is there any hope in eradicating our footprint online?
Starting point is 00:30:08 The current thumbprint that exists, is there any talk or hope of being able, if we wanted to, to completely obliviate all that stuff sitting out there? That's a, I don't think there is any way to completely obliterate it, but there is a regulation in the EU called GDPR that allows people to request that companies
Starting point is 00:30:33 delete any personal data that they have on someone. And so, so I work at a browser company called Brave, Brave.com, and we, since this regulation went into place, we've actually gotten dozens of requests from individuals saying, hey, please delete all the data you have on me. And so people are actually making use of this law now. And I think there's some services
Starting point is 00:31:00 where you can put in your email and they'll just automatically contact the privacy person at a bunch of companies and say, hey, I reside in the EU and under this law, you have to delete all my data. I'm not sure if it applies to people on other jurisdictions. I think there's a similar law for Californians now, so.
Starting point is 00:31:21 Whoa, really? Wow. Yeah, so I mean, it is within the power of state and federal governments to say like, hey companies, you have to have a process in place for deleting all personal data for people under these jurisdictions. And that could actually work in practice. When did you get into this?
Starting point is 00:31:43 Like, when did you start on this very strange path? Because, I mean, you truly are in the front lines of what I, and again, I'm sorry, I'm a weirdo, but I think of it as magic. But just because, what's that quote everybody use, any sufficiently advanced technologies indistinguishable from magic, which for me, having no idea how most of this stuff works
Starting point is 00:32:09 makes it all pretty magical. But when did you start learning about this? I mean, and maybe I should have asked, I had a question before that. Do you consider yourself a hacker? I consider myself a hacker in kind of the old school sense of the charm of someone who just likes to figure out how things work and figure out ways to circumvent rules
Starting point is 00:32:34 built into technology. Not the person with like the bank robber mask who's like on their computer trying to hack into the banks. Right, but I mean, surely that is a quality of the hacker. Even the old school hacker, the power of what's a captain crunch, the captain crunch whistle, you know? The whistle, yeah, yeah, the 2600 hertz.
Starting point is 00:32:58 Yeah, so when did you realize this about yourself? Like when did you realize, oh shit, I'm a hacker? That's a great question. So I did not go to school to do anything related to computers. I went to school for physics and that was kind of my first interest. And then when I got to grad school, I quickly realized like, I don't really want to work for the rest of my life
Starting point is 00:33:26 in this field where, you know, you kind of are uncovering some great mysteries of the universe, but it really has not that much effect on people's day to day lives. It's kind of just this like abstraction or this kind of curiosity that I'm working on. So I took a leave from grad school and started volunteering at this organization called TOR
Starting point is 00:33:54 which builds a network. I guess you can describe it as like a network of computers that you can pass your computer's traffic through to keep your IPO address anonymous. TOR, yeah, the deep web. The deep web, the dark web, yeah. That's what most people know it for. The Silk Road, the place where people try to buy drugs
Starting point is 00:34:17 and assassins and all that. But I would say actually the moment that I realized I cared about online privacy was in my last year of college when I was living in this like crazy hippie co-op at MIT and we always had people coming in and out and like staying with us. And one day someone brought their friend who most people now know as Chelsea Manning.
Starting point is 00:34:50 Yeah, so she later became this like famous whistleblower who like leaked to WikiLeaks this video of the U.S. Army doing, I don't know what you call it, like an airstrike where they were killing a bunch of civilians. It's murdering people. Yeah, yeah, and it caused a lot of outrage at the time. I think 2010, but when she came to visit our house,
Starting point is 00:35:18 she was completely unknown. I just knew her as my friend's friend who's in the Army and kind of had this like haunted look about the things that she'd seen in the Army. And so yeah, so I chatted with her for a bit during that visit and I was like, wow, you just come from a completely different world than me, right?
Starting point is 00:35:36 Like I'm some college kid just studying physics and you've been in the Army and you've seen these like horrific war crimes. And that kind of stuck with me. But I guess the interesting part is that when she did, a few months later after that, after I met her, she leaked that video to WikiLeaks and it was all over the news.
Starting point is 00:35:55 I think it was like the front page of Wired and the New York Times and her identity became known. There was a lot of government interest in her and who she was. And my friends and I became like really paranoid because there was this like black van parked outside our house and we thought it was like the FBI or something who was following us.
Starting point is 00:36:18 To this day, I don't know if this actually was what it was, but suddenly we were all like, oh, we should like encrypt our emails because we were in contact with her and they're like, do you have these investigations into her and we don't want to be followed? Yeah, and so that's when I learned to like kind of encrypt my email to kind of like care about privacy.
Starting point is 00:36:36 It was when I thought there was like, you know, some government agency that was like really interested in me. Whether they were or not, yeah. What a disaster because it's like the narrative for every paranoid schizophrenic is that there's a black van watching you. And so like when it really happens, I think, and I think the people who do this sort of observation
Starting point is 00:37:02 are fully aware of the psychological component in being observed, which is why they do it the black van. It's like, they could do any van. They could have a UPS or didn't have to be a van. I'm sure they could have a couple of cars or whatever, you know, but they do the black van because they want to fuck with your mind a little bit. They want you to like wonder and feel crazy
Starting point is 00:37:22 and they want to demonstrate to you their power and show you that it's like, yeah, you don't really have any control of anything. Like we're watching you now. That's terrifying. I would have, that must have been so frightening to realize that you had been interacting with someone who for a little while,
Starting point is 00:37:40 at least from the state's propaganda mechanism, became public enemy number one. Yeah, I think at the time I was like so young and like naive in these things that I didn't even real, like it didn't even scare me that much. I was just like, wow, this is kind of exciting, like a spy movie, like what's going to happen now. And then definitely at the time I was like,
Starting point is 00:38:01 well, you know, I have nothing to hide except that. I did talk to this person and I did have contact with her. I think like people were, the main reason we thought the government was interested in us particularly as like MIT students was that she had visited some of us and they were wondering if she had given us like material that she was trying to leak or if like we in some way
Starting point is 00:38:26 were like helping her disseminate this stuff. And so like I, to be honest, wasn't like, I didn't have anything, but my friends were really paranoid and were like, oh, we don't want people looking through all our computers and just poking around in our business. Yeah, and because I mean, even though I'm sure you wouldn't say it out loud, I mean,
Starting point is 00:38:46 like if anyone there was helping, if someone was probably helped, I mean, it's an MIT commune with of hackers with somebody who's got like valuable information regarding like a horror war crime. So anybody with chances are someone there is going to help. this podcast is sponsored by better help. Is there something interfering with your happiness that's preventing you from achieving your goals?
Starting point is 00:39:32 Like, I don't know, maybe your mom had an affair with a priest and your dad walked in and would have killed both of them, but the shotgun jammed. You only found that out much later, but now that you're a 47 year old with kids, you're starting to realize that anytime you experience any kind of real feeling of love or compassion or intimacy, it's always intertwined with a sinking feeling
Starting point is 00:39:54 that at any second, this thing that is making you so happy will get wrenched away from you by the cruel hands of fate. Better help will assess your needs and match you with your own licensed professional therapist. You can start communicating in under 48 hours. It's not a crisis line, it's not self-help. It's professional therapy done securely online. There's a broad range of expertise available,
Starting point is 00:40:16 which may not be locally available in many areas. The service is available for clients worldwide. You can log into your account anytime and send a message to your therapist. You'll get timely and thoughtful responses, plus you can schedule weekly video or phone sessions you won't ever have to sit in an uncomfortable waiting room as with traditional therapy.
Starting point is 00:40:35 Better help, that's H-E-L-P, is committed to facilitating great therapeutic matches so they make it easy and free to change therapists if needed. It's more affordable to traditional offline therapy and financial aid is available. Better help wants you to start living a happier life today. Visit their website. Look at all the amazing testimonials.
Starting point is 00:40:56 It's at betterhelp.com, forward slash reviews. And visit betterhelp.com, forward slash Duncan. That's better H-E-L-P. And join the over 2 million people who have taken charge of their mental health with the help of an experienced professional. In fact, so many people have been using better help. They're recruiting additional therapists in all 50 states.
Starting point is 00:41:20 Right now, DTFH listeners get 10% off your first month at betterhelp.com, forward slash Duncan. Thanks, better help. This is what's so scary about hackers. This is why people like me are kind of terrified of hackers because this idea that y'all like, you know everything, like you have the ability to like find out all the stuff. I mean, how far away from the truth is that?
Starting point is 00:42:12 Honestly, I feel like that's pretty far, right? Like if you're worried about that with random hackers, like how worried are you with like some random employee of Facebook or Google who decides they wanna look at all your email or all your messages? Worried, I mean, cause to me it's that very human reality, which is somewhere in these companies, more than likely there's at least one person
Starting point is 00:42:38 who's just bored and who just, you know, has some spare time and a back door into this, whatever it may be, people's phones, people's messages, whatever it may be, and just decides to start going in there. I mean, how possible is it that somebody at Apple or Google or Facebook or any of these mega companies has the ability to do that? Honestly, it's very hard for an outsider like us to tell.
Starting point is 00:43:15 I've never worked at Facebook or Google, but I've heard from people who work there that they have some kind of safeguards against employees accessing data. Like maybe they'd have to go through layers of management and get people to sign off on it, but who knows how well that works in practice or if it was some way a random employee could bypass it.
Starting point is 00:43:35 Like from my perspective, if a company is collecting data on you, then it's safest to assume their employees can't see that data if they really are motivated to for whatever reason. Right, right. And this creates a kind of shadowy power imbalance, doesn't it? Like in the sense that if what you're saying is true,
Starting point is 00:43:56 which it must be, I mean, I want to believe that there are like intense walls of encryption, but I don't, I just don't, I just knowing humanity and knowing curiosity and knowing the basic drive to power that you would, it would be more shocking that there was a true system of ethics within these companies preventing people there from looking into other people's private data
Starting point is 00:44:25 than that this stuff was constantly being siphoned and potentially being, you know, anytime they start talking about breaking up Google, breaking up Facebook, anytime the government starts talking about that, I always think, really? Because I bet they know stuff about you. I bet they've got information on people who can make the decision about breaking up the company
Starting point is 00:44:48 and they could use that to manipulate the government, meaning that in some weird way, there's like a kind of secret tech government that could theoretically be pulling strings right now. Please stop me if I'm just being too paranoid, but just do, you know, thinking of human nature. This is what sometimes when I'm having dark fantasies about the way the world works, this is one of them.
Starting point is 00:45:12 Yeah, no, this is great because what sounds like what you're doing is kind of thinking of worst case scenarios and from kind of the privacy, paranoid advocates perspective, that's like what I do all the time is saying, like, hey, if they have this data, like what's the worst that can happen? For example, Brave and a lot of other companies
Starting point is 00:45:32 actually use Google for a company email just because it's like the most convenient service. Yeah. And so like, you know, if you're a competitor to Google, you always kind of have this like worry in the back of your mind that Google can see your emails. They could be using this information to, you know, like see what products they're about to launch
Starting point is 00:45:54 and just, you know, launch them themselves or do something to throw a wrench in your plan. Yeah. Yeah, there's actually an antitrust case against Google right now going on. I forget how it was brought up, but basically the government's like collecting a bunch of information to make the case
Starting point is 00:46:14 that Google might have kind of a monopoly. And it's a lot of it's really interesting because I think one example they gave is that, so Google's essentially an advertising company, right? They sell ads, you see ads on search as a result and they found that they basically gave preferential treatment to Facebook ads. So for most advertisers, when they're trying to serve an ad,
Starting point is 00:46:44 they have a timeout of like 300 milliseconds or something before they can serve that ad, but Google gave Facebook a double that. They gave them a 600 millisecond timeout. So Facebook just had like twice the window of opportunity to show you an ad. Yeah, and so they can make arbitrary rules like this and it's not clear if like, you know,
Starting point is 00:47:05 it's not obvious to other people. Like certainly other advertisers weren't given the same opportunity and that wasn't fair to them. But who knows why Google chose to have this special relationship with Facebook. It's all kind of shadowy to us. It's scary.
Starting point is 00:47:20 It's so scary. You don't want to piss off Google, by the way. I mean, that's the other thing. It's like, I don't want, I would hate to imagine Google just decided, I don't like, I don't like that podcast or whatever it may be. It's just so much, it's so much power. And I remember when, you know, I don't,
Starting point is 00:47:39 I bet you do recall, remember when they were trying to get Apple to get into people's iPhones? Like, do you remember that? This was the San Bernardino, like terrorist iPhone, their shooter iPhone, yeah. Yes, and Apple, you know, I think was, where they were very like, they were amazing
Starting point is 00:48:01 and that they resisted and they said, listen, if we start doing this, we have to, we'll never stop doing it. Every law enforcement agency, not just in this country, but around the world is going to be asking us to open up people's iPhones. We're not going to do it. We're never going to do it.
Starting point is 00:48:16 But their lawyer, this, and again, Stoner, especially at that time, I was eating a lot of them, but their lawyer said something along the lines of like, to me, seemed like a threat, which is like, look, what if all of a sudden Apple just started coming out, saying that the CIA was responsible for the assassination of JFK. And then almost after he said that,
Starting point is 00:48:40 immediately they just left Apple alone. And I kept thinking like, was that like, because Apple has some shit that they just are hanging onto, secret stuff that like, they're like, look, you want to like force us to open up our phones? Okay, but we're going to start leaking all the information we have on you. And so back off.
Starting point is 00:49:01 Oh, I just was wondering, was that what that was? And why? Oh. You know, like, so to me, like, it seems like these tech companies are becoming like many nations or something, you know? And their entire arsenal is not military, is not weaponry, but data that they have on people in high positions of power.
Starting point is 00:49:25 And it seems to me that you're not refuting that or that that isn't necessarily my own pair and why. Yeah, I don't know. I guess I have never been like close enough to the upper echelon of these massive tech companies to know whether there's like some kind of relationship with the government where they're like saying, hey, if you don't do this,
Starting point is 00:49:48 we'll keep this data private. And like they're making bargains with each other. Certainly it couldn't be, I guess, but you know, in the first place, Apple employees should not be looking at people's personal data. So like, if that were happening, that would be kind of seems like a big violation
Starting point is 00:50:06 on the part of the tech company. Well, you know, what's the alchemical maxim as above so below? And God knows, in human society, people look at each other's phones when they're jealous or worried or they think people are cheating on them, they go into their phones.
Starting point is 00:50:19 And if that happens in families, it definitely is, it's probably happening in tech. But you know, you're at the forefront of security at Brave. I mean, could you do that if you wanted to? Is it even possible if you wanted to like start snooping into someone's data who's using that browser? Could you?
Starting point is 00:50:39 Honestly, no, because one of the principles that we have at Brave is we just don't collect the data if we don't need it. So like really, for the vast majority of users, we don't have any data on them. Like we don't have their email addresses, we don't have their contact information. So like if I tried to look really hard,
Starting point is 00:51:01 I could maybe, I could see like a bunch of IP addresses because there's various servers where we're getting IP addresses, but that's not usually like traceable to an individual. And we don't really have any like other interesting data that's associated with the IP. But how do you make money over there without the data? Like what's the monetization route?
Starting point is 00:51:20 That's a great question. So we kind of, we're kind of in the mindset that people, you know, if they want to give us data, well, maybe I should just give the straight forward answer, which is that Brave has an often privacy preserving ads platform. And so what that means is that we let advertisers show ads to users who opt into them,
Starting point is 00:51:47 but these ads don't leak any data about people. So they're kind of like non... So I say privacy preserving because the way we actually decide which ads to show users is not in the traditional way, which is where an advertiser kind of collects all this information about like what websites are visiting and kind of built a profile of you.
Starting point is 00:52:11 And then based on that decides like, hey, you're interested in buying shoes or you're interested in buying a car and then shows you those ads. And so what happens is that on your own device, and it's important to note this data never leaves your own device. The browser itself sees what your browsing history is.
Starting point is 00:52:29 And then based on that, just picks an ad out of a big catalog of ads to show you. And so as a result, the advertiser doesn't learn this information about you because that information is just kept on your device, but you still see like a relevant advertisement, hopefully. That's cool. That's beautiful.
Starting point is 00:52:47 That's so futuristic. That is like, that's the path forward. I mean, that's what it's going, that's clearly anyone with a brain is going to choose that over the other thing that they're doing. But how much money do you think? Like theoretically, if you had to guess,
Starting point is 00:53:03 how much money are you losing doing that? That is, I don't have a specific answer for you on that because I don't work on that side of Brave as much, but I will say, yes, like this is a new model and a kind of a challenge we have is convincing advertisers that this works as well as their traditional method of siphoning everyone's data, building these online profiles.
Starting point is 00:53:29 And so we have to prove to them that building these local profiles where the data doesn't just get shared with them, works as well and will lead to people still buying their shoes or cars or whatever. So yeah, it's still kind of in its early stages and we're still kind of just trying to show people this works and you should switch to showing people ads
Starting point is 00:53:48 in this more private way. Look, isn't that so crazy? Like truly, it's like there's just more money and like being unethical, it seems like. And that's why companies like Brave and it's in the EFF and it's like, they're amazing. That's what I love about the like old school hacker ideology. There is this kind of mystical punk rock ethic to the thing
Starting point is 00:54:15 where free software, what's that song? Do you know the song I'm talking about? The hacker song? Oh my God, I'll send it to you. It'll give you goosebumps. I'll send it to you. It's very lo-fi. Maybe I'll play it at the end of this.
Starting point is 00:54:32 But it's just a whole song about how keep the software free. The software should be free, keep everything free. And this seems to be what Brave is mirroring that a little bit. But you should also make some money. My God, I mean, what are you gonna do? We live in capitalism. Yeah, I mean, that's the thing.
Starting point is 00:54:51 Like a surprising number of people don't seem to understand is like, yes, we do wanna be ideological purists and give out free software and not make any compromises. But at the end of the day, I've seen so many free open source projects just basically get abandoned because they couldn't make enough money to sustain themselves. So at some point you have to say, like,
Starting point is 00:55:12 hey, we have to come up with some way to like make money and sustain our business, but also not compromise like this core set of values which are like basically untouchable. And so, yeah, I think it's tough. I definitely see a lot of projects which are basically like the backbone of the internet that everyone relies on and the people running them
Starting point is 00:55:33 are just like one or two people randomly who do it in their spare time and never really try to monetize it. That kind of scares me. Yeah, well, right, yeah, right? Because like you gotta eat. And, but is it do you with, you know, like everyone now is like obsessed with NFTs
Starting point is 00:55:54 and with crypto and like, and what are your thoughts on that as a monetization as an avenue of monetization for these people, you know, as some kind of like monetization engine that transcends the current mechanisms people are using to make a buck or do you think it's just a nightmare? I have a lot of thoughts about NFTs actually.
Starting point is 00:56:21 Let's hear them. Well, so I think like, well, I think not speaking about NFTs, but I think crypto speaking about cryptocurrency more generally, I'm kind of a pragmatist in the sense that I think if, you know, people who really need the money and are trying to do good things with it
Starting point is 00:56:38 are able to use the hype around these new currencies and investments in order to support themselves or support their businesses. I think that's a good thing, right? Like in some ways a lot of projects have been using cryptocurrencies kind of as like a replacement for Kickstarter where instead of just asking people like donate some money to our projects
Starting point is 00:56:59 so we can build a thing, they say, hey, buy our coins so we can build the thing. And then when the coin goes up, those people also get rewarded. So I don't think that's like inherently a bad model for funding software. I think where it gets really tricky is just the fact that these currencies have been,
Starting point is 00:57:18 you know, such attractive investments to a lot of people who don't necessarily have like the best ethics. And so I personally think a lot of NFTs are used for money laundering where, you know, basically someone says, I have this cryptocurrency which is from a ransomware attack
Starting point is 00:57:40 but I want to cash it out without showing that it was, you know, stolen from a ransomware attack. So the way I'll do that is I'll make an NFT, like sell it to myself and then I get my money back but now it looks clean, you know. Holy shit, that's crazy. You can do that? Yeah, I think that's why you see all these sides
Starting point is 00:58:01 with like all these crappy artwork that's being sold for a large amount. I think someone's just making bad artwork and selling it to themselves to like clean up their, you know, dirty money or whatever you want to call it. My God, that's the most grim thing I've ever, that's so sad, that's so weird, that's so sad. Wow, wow, that's amazingly weird.
Starting point is 00:58:26 I don't know how I missed the boat on just that obvious aspect of it. I've always thought, oh my God, that's incredible. Like anytime like monetization becomes possible for artists, I just think it's incredible, you know, cause so many, so many like musical artists are really just getting screwed and like a way for them to like get more profit
Starting point is 00:58:47 from what they're making or just a normal amount of profit that's currently it's being cut up in a million different pieces and given to whatever the particular server that is showing their music to people, it seems great, but yeah, is there a way to fix that? Part of it, is there a way to make it
Starting point is 00:59:04 so that it really is benefiting artists and not just being used to launder ransomware money? Well, so first of all, I'm really sympathetic to the kind of the sentiment that creators and artists need better ways to monetize cause yeah, and music piracy is a huge thing. And like from streaming, I think someone calculated that if a single person listened to a single artist song
Starting point is 00:59:33 on Spotify nonstop for a year, that artist would only make like $23 from those streams. It's crazy. Such a small amount, right? But I don't think NFTs are the answer because there's, I feel like they've just been polluted by like the money launderers and the people who are trying to use them for like pump and dumps
Starting point is 00:59:55 where they try to convince a lot of people that some art's worth a lot of money, but really they're just trying to like make their, make a bunch of money off of it. I really like in the NFT space, I don't think much of it is really about the artists themselves, but rather it's about like, how can we make the most money
Starting point is 01:00:15 and make our money look legitimate as fast as possible? I get approached every once in a while with a good pump and dump. Like someone's got some coin that they want me to talk about and then for, and they make it seem so appealing. And for a second, I'm like, my God, what an incredible chance. And then you realize like,
Starting point is 01:00:30 oh, that's one of those things they're talking about. Speaking of music, by the way, wow, SoundCloud.com, Ford slash Azuki. That's you, your music is so good. I mean, I didn't know if we start talking about your music or start talking about your technological work, but wow, like, wow. And also on top of that,
Starting point is 01:01:00 you're making your own VST, you're making filters for the, you're just making music software? You make that sound so much more impressive than actually. Oh, okay, okay. It's impressive. Give me a break. As I'm like researching you and going deeper and deeper into this crazy rabbit,
Starting point is 01:01:19 it's like, oh, and also you're making like digital synths? Holy shit. Let's talk about it. Well, so I think the thing most people know me for in the music software space is basically a plugin. It's both available as a VST and a Max for Live device, which is only usable in Ableton, but basically it takes this existing piece of software
Starting point is 01:01:46 called Spleeter that was developed at the company Deezer for splitting a song into stems, and kind of packages it in a way that more producers can just use it in their DAWs. So I mean, I think that's kind of a passion of mine, just taking research projects that companies have made where it's really cool technology, but it's not packaged in a way where like normal people
Starting point is 01:02:15 can use it and just making it more usable. So I don't know, I don't, I feel like I don't really do any of the hard work on that one. Okay, well, let me, can I run some ideas by you that I would like you to do? Yeah, please do. Are you familiar with UberDuck?
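For producers curious what's under the hood of the plugin Yan describes: it wraps Deezer's open-source Spleeter project, which can also be called directly from Python. A minimal sketch, assuming the pretrained 4-stem model is what you want; the file paths are purely illustrative:

```python
# pip install spleeter
from spleeter.separator import Separator

# 'spleeter:4stems' separates a track into vocals, drums, bass, and other
separator = Separator('spleeter:4stems')

# Writes one audio file per stem into the output directory
separator.separate_to_file('song.mp3', 'stems/')
```

The plugin she mentions essentially puts that workflow behind a device you can drop into a DAW, as a VST or a Max for Live device.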
Starting point is 01:02:31 No, what's up? UberDuck uses deep learning, it's just, you can type in sentences and get famous people to say them. It's like deepfake audio. And oh, it's glorious. It's amazing. Some of them aren't, I mean, it just depends on how much time somebody spent on the particular voice.
Starting point is 01:02:48 Some of them are really great. Some of them are low grade, whatever it is. It's an incredible technology, weirdly anonymous. You go to UberDuck, I can't remember the exact website. It's, let me see here. Google, I'm on Google, I'm sorry. UberDuck, I'm not used to it. I will switch though.
Starting point is 01:03:08 I will switch to UberDuck, okay. Yeah, uberduck.ai. And it's really amazing, but my God, I just keep, I have this dream of being able to interview these people. You know what I mean? Like somehow connecting UberDuck to, what's the new AI that Elon Musk was so freaked out about?
Starting point is 01:03:34 You know what I'm talking about? Like Elon Musk mentioned it. It's a new AI engine, philosophy bot. You know that philosophy bot? No, I've never heard of it. Okay, philosophy bot, let me see if I can find it. Uses this like, yeah, philosopher AI. It uses GPT-3, have you heard of that?
Starting point is 01:03:56 Oh yeah, I do know about GPT-3. Yeah, so you can ask it a question and it instantly produces a convincing response that is just from an AI. So anyway, will you please combine UberDuck with that software so that we can interview famous people for our podcasts without having to act? So you wanna like basically interview Kanye West
Starting point is 01:04:22 or someone and have it be like their voice, but speaking completely GPT-3-generated text, got it. Yes. What do you think? How hard would that be? Yeah, that doesn't seem too hard, but do you think they could like sue you for doing that?
Starting point is 01:04:38 I don't know what the like legal question is. Yes, I think they could sue you for doing it. I do, but I don't think they could sue you for combining the technologies so that if we wanted to do that, because then it opens up all this other stuff, which is, I mean, like, you know, like the ability to call these people
Starting point is 01:05:01 and just have conversations with them or the, you know, and again, someone's gonna do it. You might not be the person, but someone's gonna do it. And to me, it's one of the pathways to this kind of like fame diffusion. You know what I mean? Like I think we're in the last days of fame. You know what I mean?
Starting point is 01:05:21 Like AI is going to obliterate, it's gonna dilute. Like the meat body Kanye West becomes almost insignificant when there's five billion AI Kanye West that anyone can communicate with at any time, you know? Yeah, I wonder, this actually, I just had a thought, which is it would be really cool to have something like this for famous singers like Adele or Beyonce, where you could just write like a melody and some lyrics
Starting point is 01:05:54 and just have them sing it and use that in your song. Like I think people would find that super useful. Yes, incredibly useful. Or just as a podcaster to have my own AI version so that if I like lose my voice, I could type in what I wanna say and it would say it, things like that. Or I mean, aside from all the silly stuff,
Starting point is 01:06:15 people who maybe lose the ability to talk, you know, I think. Or, the other thing, for grief counseling, you know, like the ability to communicate with people who've passed away by creating an algorithm for their voice. Yeah, well, to do that, it seems like you'd also need basically a large body of texts that they've written
Starting point is 01:06:44 or said in the past so the AI can kind of learn like what they're likely to say. I think there was actually a Black Mirror episode about this where, you know, someone dies, but you feed them like a recording of them or like basically all the texts that they've ever sent and then they make a bot that kind of can reply in the same way this person would reply to you.
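As for the mechanics of Duncan's mash-up idea, the text half is the easy part. Here's a rough sketch, not anyone's actual setup: it calls OpenAI's GPT-3 completion API as it looked around 2021 to improvise an answer, and leaves the voice half as a stub, since the details of UberDuck's service aren't covered here. The prompt, model choice, and the speak_in_voice helper are all illustrative assumptions.

```python
# pip install openai
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]  # assumes a key in your environment

def fake_interview_answer(guest: str, question: str) -> str:
    """Ask GPT-3 to improvise an answer in the style of a hypothetical guest."""
    prompt = (
        f"The following is a podcast interview with {guest}.\n"
        f"Host: {question}\n"
        f"{guest}:"
    )
    response = openai.Completion.create(
        engine="davinci",      # a base GPT-3 model available at the time
        prompt=prompt,
        max_tokens=150,
        temperature=0.8,
        stop=["Host:"],
    )
    return response.choices[0].text.strip()

def speak_in_voice(text: str, voice: str) -> bytes:
    """Stub for the text-to-speech half; a service like UberDuck would slot in here."""
    raise NotImplementedError("plug in whatever TTS or voice-cloning service you use")

if __name__ == "__main__":
    answer = fake_interview_answer("a famous guest", "Are we living in a simulation?")
    print(answer)
    # audio = speak_in_voice(answer, voice="some-cloned-voice")  # the deepfake-audio step
```

The grief-bot variant Yan mentions is the same pipeline with a different prompt: instead of a generic interview framing, you would condition the model on a large body of the person's own writing.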
Starting point is 01:07:04 Yes, yeah. I mean, this is one of the creepiest things I ever heard about these neural networks, that nobody really knows what they're doing. Like inside the neural network, in some of these layers, they're doing things where people just can't figure out what's happening. And do you get much into simulation theory?
Starting point is 01:07:25 I used to be a big fan of the theory that we all are just living in like a simulation of like a single brain that's in the universe. It's called like the Boltzmann brain theory. Okay, thank God. I winced when I asked you the question because I felt like it's such a mundane, boring question, but can you explain what the Boltzmann brain is
Starting point is 01:07:44 to some folks listening? Cause they might not have heard of it. Yeah, oh man, this is, I think if you've like studied thermodynamics, this becomes like really compelling, but basically the theory is that, you know, the universe is really complex. And because in thermodynamics, it's very difficult for like complex organized systems
Starting point is 01:08:06 to just arise out of nowhere. It is actually more like statistically likely that the universe we live in is a simulation of like a single brain like in an empty universe than the possibility that all of this complexity that we observe actually exists. So essentially like what we're living in is like a dream of like a single brain
Starting point is 01:08:32 rather than a reality. Yes, and so, weirdly, the Boltzmann brain completely mimics like a lot of mystical traditions that say that we're the thought process of God and that we're what happens when God's thinking. What's the difference between the Boltzmann brain and God? Isn't that God's brain?
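For listeners who want the statistical-mechanics intuition behind "more statistically likely," the usual back-of-the-envelope version of the Boltzmann brain argument runs like this (a sketch of the standard textbook reasoning, not something derived on air):

```latex
% Probability of a thermal fluctuation that lowers entropy by \Delta S
P \;\propto\; e^{-\Delta S / k_B}

% A single brain is a vastly smaller entropy dip than an entire low-entropy universe, so
\frac{P_{\text{brain}}}{P_{\text{universe}}}
  \;\sim\; e^{\left(\Delta S_{\text{universe}} - \Delta S_{\text{brain}}\right)/k_B} \;\gg\; 1
```

In words: if observers can arise as random fluctuations, a lone self-aware brain in an otherwise empty universe is exponentially more probable than the whole organized cosmos we think we see.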
Starting point is 01:08:53 No, yeah, maybe it's like God, but without the whole religious connotation of like, you know, Jesus lived and did all of that. But yeah, I guess in some sense, it's like a, it's God in the sense that it's like omniscient and is the creator of everything. Yes, right. And then within that, we have this realm
Starting point is 01:09:12 that we're in, a synthetic reality where we're a thought form of this kind of Boltzmann brain. But this is, do you ever think to yourself like, oh yeah, I'm in a neural network, I'm part of some processing unit that is using even this conversation and all the conversations you've ever had to try to reach some, I don't know,
Starting point is 01:09:41 some form of data output, some resolution to a question. You ever, do you ever wonder that? Yeah, I do. But at the same time, I'm like cognizant that this isn't a testable hypothesis. So it's something where, you know, we can speculate on it, but we can't ever know if this is really what's happening
Starting point is 01:10:03 because there's no experiment we can construct where, you know, we decide if we're living in a perfect simulation or if we're living in this actual reality that we observe. Well, but there could be like, maybe not an experiment to truly test it, but couldn't there be some like mechanism of hacking it? Like couldn't theoretically there be like some,
Starting point is 01:10:29 as like something that we would consider to be like, oh my God, it's like the next, you know, industrial revolution, but it's really like some form of hacking into this thing to gain access to higher processing power or something. Yeah, if it's like a leaky simulation or some simulation with some way of like escaping it and detecting what's outside it, that would be interesting. How do we hack it?
Starting point is 01:10:59 I don't know, I think it depends on if like the laws of the simulation that we're living in like allow us to somehow see what's outside it. I don't know how we would figure that out. By doing it, I mean, you would just have to figure, you would just have to, my theory would be, well, one hypothesis I would, if it's a closed system that's designed to like keep you
Starting point is 01:11:23 from hacking outside of it, then I would imagine anyone who came very close to hacking outside of the system would, like, die or would just, you know, be eradicated or something, or would, you know, go insane or something like that. But it seems to me like the thing follows some kind of, again, I don't know what it is, but don't you think it follows like some pretty obvious
Starting point is 01:11:47 like video game logic? You know, like in a video game, it's inevitable that you think you only are able to do this one sort of, maybe you just don't realize, like the thrilling part of the video game is suddenly, you take flight, you're like, holy shit, now you can fly in this video game. So if the simulation is an entertainment mechanism,
Starting point is 01:12:09 then, you know, part of the game might be the intent to hack it to escape, which just leads you to another level of the game. Yeah, maybe there's a, yeah, I could see this, you know, becoming more of a thought in my head if, for instance, we notice like, some stuff that couldn't be explained by science, like, you know, people spontaneously disappearing
Starting point is 01:12:39 or something, it would be like, oh, that's like some kind of glitch, you know, we should look into that. Yeah, a glitch. Sometimes it feels like it's glitching out. Well, I, you know, I just want to talk a little bit about your music. Do you have a few more minutes? Yeah, of course.
Starting point is 01:12:58 When did you get into music? Like, when did you start recording music? Oh, so I actually was, so in middle school, I started playing the flute. And then shortly after that, I discovered I was much more interested in like composing music. So I kind of became this like child composer when I was in middle and high school.
Starting point is 01:13:23 And I started really being interested in like writing music for like jazz bands and wind symphonies and all that. And that kind of like stopped when I went to college because I decided, like a lot of people, that I should just, you know, pursue a career that's more likely to be fruitful. Because, yeah, I think even then I was super realistic that the field
Starting point is 01:13:46 which I was really interested in, which was film scoring, was just ultra competitive and extremely difficult to get into. So yeah, and then after that, around maybe like five years ago, I started messing around in Ableton and kind of just discovered the world of electronic music and thought that was really interesting. I'm still not an expert by any means in that
Starting point is 01:14:09 and I'm still learning and kind of just messing around with Ableton. I still don't feel totally comfortable with it. How could you? It's the most insane thing I've ever seen, maybe the craziest software of all time. Are you making your own beats? What's your process when you're making beats?
Starting point is 01:14:27 I, it's been a while to be honest. I kind of like stopped making music at the start of the pandemic because I just lost motivation like so many people. I was like, oh, there's no shows happening. Like what's the point in doing this, you know? But yeah, I do work a lot. So I find I'm actually most inspired by like vocals.
Starting point is 01:14:51 So if I can find someone who has like a vocal melody that they're singing that they wanna like give me or someone who has like a rap that they wanna put a beat under that's like more fun for me to work with. Oh, I see. So you don't start with a beat first. You start with some inspiration
Starting point is 01:15:07 and then figure out the beat from there. Yeah, that's usually more of my process. Cause whenever I try to make a beat, it's like, I feel like if there are invisible guardian angels watching us, they're cringing at the horrific beats. I can't do it. Like what do you do? How do you do this?
Starting point is 01:15:26 I don't understand it. These, your Bill, or you, or any of these people, like, how do you do it? Am I just missing some fundamental component of my soul or something? When I sit down at the Push 2 and try to make a beat, it either comes out the same as all my other beats or it just, it sucks.
Starting point is 01:15:45 Yeah, honestly, I kind of had the same process for a while. I think early in my like electronic music exploration, I was using a Launchpad controller. And I was like, cause I'd seen videos of people like making quote unquote beats and they were all just like playing them live. And so I was kind of like limited by my personal like rhythmic abilities to like play something
Starting point is 01:16:07 on like a physical instrument. And then I think when I started looking up the way Bill and other people were working, I was like, oh, they just open Ableton and click in the MIDI track. They're not actually playing these things. And so then I was like, okay, you're not limited. Like there's no longer a limitation.
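If you've never opened Ableton, "clicking in the grid" just means placing notes on a step grid instead of performing them live. Purely as an illustration of the same idea in code, here's a toy sketch that writes a one-bar kick/snare/hat pattern to a MIDI file with the mido library; the pattern, drum note numbers, and file name are placeholders, not anything from the episode.

```python
# pip install mido
from mido import Message, MidiFile, MidiTrack

# General MIDI drum notes: 36 = kick, 38 = snare, 42 = closed hi-hat
KICK, SNARE, HAT = 36, 38, 42

# One bar of 16 steps, written the way you'd click cells into a grid
pattern = {
    KICK:  [1, 0, 0, 0,  0, 0, 0, 0,  1, 0, 1, 0,  0, 0, 0, 0],
    SNARE: [0, 0, 0, 0,  1, 0, 0, 0,  0, 0, 0, 0,  1, 0, 0, 0],
    HAT:   [1, 0, 1, 0,  1, 0, 1, 0,  1, 0, 1, 0,  1, 0, 1, 1],
}

TICKS_PER_BEAT = 480
STEP = TICKS_PER_BEAT // 4   # one 16th-note grid column
GATE = STEP // 2             # how long each hit is held

mid = MidiFile(ticks_per_beat=TICKS_PER_BEAT)
track = MidiTrack()
mid.tracks.append(track)

gap = 0  # delta time carried across empty columns
for step in range(16):
    hits = [note for note, row in pattern.items() if row[step]]
    if not hits:
        gap += STEP
        continue
    for i, note in enumerate(hits):
        # channel 9 is the General MIDI drum channel (channel 10 in 1-based numbering)
        track.append(Message('note_on', channel=9, note=note, velocity=100,
                             time=gap if i == 0 else 0))
    for i, note in enumerate(hits):
        track.append(Message('note_off', channel=9, note=note, velocity=0,
                             time=GATE if i == 0 else 0))
    gap = STEP - GATE

mid.save('grid_beat.mid')  # drag the result onto a drum track in any DAW
```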
Starting point is 01:16:24 You could just click on whatever, you know, in the grid. I know you click in the grid. That's great. I click in the grid, it still sounds like shit. Maybe it's just hopeless. I don't know. I'll just give up on that part of my music creation. But God, I can't believe you stopped making music.
Starting point is 01:16:40 Your music is so good. I really hope that you resume this practice because it's really beautiful. Thank you. I appreciate that. I think, yeah, there's a lot of factors in that and my like kind of demotivation. One is like being around Bill more
Starting point is 01:16:58 and just seeing like the music industry and just being like, oh, there's so much of this that I don't want to like immerse myself in because it's ultra competitive. You have to do a lot of self marketing to be successful, it seems. And there's a lot, you know, in that that's not just sitting around at your computer
Starting point is 01:17:16 like doing creative work that you actually want to do. And also shows, I think I like don't actually like being at shows or playing shows because I used to DJ a lot more often. And is that just because of like the possibility of getting a disease at the shows? Well, that is part of it now. But I think also just having to like stand up
Starting point is 01:17:41 in front of a lot of people and act excited about something. You know, it's a lot of pressure because that's a lot of what DJing is, I think, just the performance aspect of standing there in front of people trying to be excited so that they're excited. And I'm just not good at faking excitement when I'm not actually excited.
Starting point is 01:18:02 You know what, this is something I've been going back and forth on myself. I just interviewed a comedy writer, Martin Olson, and we were talking just about how, well, yeah, I mean, you've got to be a con artist if you want to be an artist. Like you have to overcome that part of yourself that wants to emote what you're authentically feeling.
Starting point is 01:18:23 You have to find a way to actually do the thing that so many people say is dishonest or terrible, which is to, you know, exhibit some kind of exuberance that might not even be there. But I've noticed when I do that, the exuberance might just show up from the crowd, you know? Like it's possible to generate it in real time, even if you feel completely empty.
Starting point is 01:18:48 I mean, Jesus, I know so many comedians who are like in the midst of like a 10-year depression. And they go on stage and are able to completely tune into some energy that maybe isn't exactly the energy of, you know, having an endogenous depression or something like that. Wow, yeah, that's so amazing. And it definitely seems like comedy
Starting point is 01:19:12 is one of the more difficult stage performance types, right? Cause you kind of have to like react to the audience so much and like, do you find yourself like, I don't know anything about stand-up comedy, but do you find yourself having to like respond a lot to the audience and kind of just like see what their energy is and keep adapting according to that? Yeah, I think that's one of the,
Starting point is 01:19:32 I mean, there's so many different ways to do it. That's one of the ways, that's one of the like, you know, you do hear this stuff about, oh, comedians, they always come from like fucked up childhoods. And one of the explanations for that is not the fucked up childhood made you funny, but the fucked up childhood forced you to attune yourself to the moods of whatever chaos bubble you were in.
Starting point is 01:19:55 So you became hyper-attuned to whether mom is angry, your dad's angry, your brother's angry, or what's happening in the room with your, you know, drunk, abusive, whoever, so that you could survive. And so that, like, finely-tuned-to-an-environment thing sometimes gets translated into, well, I'm an empath. And it's like, well, you're someone who, as a kid, learned how to instantaneously adapt
Starting point is 01:20:22 to shifts in energy so you wouldn't get beaten. Oh, that's so interesting. So like, then so comedians have just, some comedians, not all, I do think we have to try to let go of this like sick artist paradigm because I think it makes people sick because they so want to be an artist. You don't have to do that.
Starting point is 01:20:40 But I do think, you know, and comedians and probably some DJs and stuff like do that naturally just because they like had to survive with it, you know? So yeah, I think that's part of comedy. It's part of music too, I guess. I mean, like we have to do in real time with our shitty neural network, what the algorithm is doing, you know,
Starting point is 01:21:05 it's like weirdly similar, you know, the algorithms are mirroring our behavior online back to us in a kind of manipulative way. It's strangely like a performance that some AI is doing for us, I guess. Yeah, that to me just, it all sounds so difficult. And I've learned about myself through the pandemic that I really prefer work
Starting point is 01:21:29 that doesn't have any like real time component to it. So like software engineering, right? Like, or even writing, you know, you can just kind of hone in on what you want to say or do and you don't have to like keep taking inputs from the external world and adapting to it. And you can kind of just like take your time getting things to how you want it to be.
Starting point is 01:21:50 Here's my last question for you. Thank you for being so generous with your time. Of course. If you were given an infinite, infinite money, what software would you design? Oh, that is such a good question. Might have to think about that for a minute cause like you never think about like,
Starting point is 01:22:15 what if you had infinite money? Cause it's just not like a realistic situation. Yeah, and also if you just like want to spit ball, it's totally cool. It's not like, you know, whenever we're thinking about things, when you're thinking about making stuff, like making a sketch or something, we always start with infinite money.
Starting point is 01:22:36 Like what if we had Steven Spielberg money and what would it look like at that level? And then from that, you can kind of reduce it down to like whatever you have around you. But I just would love to hear what you would create if you had unlimited resources when it came to technology. I think in terms of impact, like if I had infinite money,
Starting point is 01:22:59 I could buy up every other tech company, right? I could like buy Google, Facebook, Apple, et cetera. And then I could just like run those companies in whatever way that I thought was like best for humanity. For instance, I could say like Google, stop all third party tracking and make sure like people, you know, have like consent to share whatever data they want. Like you can basically go through these companies,
Starting point is 01:23:25 like identify whatever ethical or privacy issues you think exist there and just fix them, right? Cause you're like, I have the resources to do that. That might be the most effective thing to do actually. What else would be cool? I don't know. I guess like maybe create some kind of alternative to Facebook where it's not so addictive
Starting point is 01:23:53 and it's actually helping people connect in some meaningful way rather than just making themselves feel bad. Cause like I have heard, through the pandemic, with people not being able to have in-person events, that it's been more useful for them to have social media and have these ways online of sharing what's going on in each other's lives and keeping those connections going.
Starting point is 01:24:15 But I just think like with Facebook and Instagram, there's so much negativity that comes with that, right? There's the fact that Facebook is mining all this data to sell you stuff. And then there's the fact that the things that get the most likes or the most attention in these algorithms are not necessarily the positive things, the things that are good for people to see.
Starting point is 01:24:39 So maybe just thinking about how can we redesign social media to keep the positive connection aspects but reduce the mental anguish and other negative side effects that come with it. Beautiful. Yan Zhu, thank you so much for your time. Thank you. Thank you.
Starting point is 01:24:58 It's been a real joy getting to know you. And maybe you can tell people where they can find you. Sure. So I think I'm most active on Twitter these days. My Twitter handle is bcrypt, b-c-r-y-p-t. And yeah, check out Brave, brave.com. If you're looking for a new browser, we actually also have a search engine now.
Starting point is 01:25:20 Cool. Yeah, I'm on it. And thank you very much. All the links you need to find Yan Zhu will be at duncantrussell.com. Thank you so much for your time. Thank you so much, Duncan. This was really fun.
Starting point is 01:25:30 Thank you. Appreciate it. That was Yan Zhu, everybody. All the links you need to find her are gonna be at duncantrussell.com. Tremendous thank you to our wonderful sponsors. Do try out Squarespace. Give StoryWorth a shot.
Starting point is 01:25:46 Check out BetterHelp. And most importantly, my friends, have an incredible turkey day. I'll actually see you next week. It's a two episode week. Until then, Hare Krishna.
Starting point is 01:26:22 At Lowe's, we know you can get the job done faster if you don't have to stop and come into the store all the time. That's why we've updated our app
Starting point is 01:26:39 with your business in mind. With the app, you can build quotes, easily reorder your supplies, track orders, and much more. So you can get everything you need right away, stay on the job, finish it, and get started on the next one. Download the app today,
Starting point is 01:26:55 because Lowe's knows time is money. Lowe's knows pros.
