The Joe Rogan Experience - #1736 - Tristan Harris & Daniel Schmachtenberger

Episode Date: November 18, 2021

Tristan Harris is a former Google design ethicist, co-founder and president of the Center for Humane Technology, and co-host of the Center for Humane Technology's "Your Undivided Attention" podcast with Aza Raskin. Daniel Schmachtenberger is a founding member of The Consilience Project, aimed at improving public sensemaking and dialogue.

Transcript
Starting point is 00:00:00 The Joe Rogan Experience. Train by day, Joe Rogan Podcast by night, all day. Gentlemen, thank you for being here. I keep doing these podcasts where I just talk to people, so please introduce yourself and tell people what you do. I am Tristan Harris and came on this show about a year ago after The Social Dilemma came out. That's probably where most people know me. And I used to be a design ethicist at Google studying how do you ethically influence people's attention and thoughts and behaviors.
Starting point is 00:00:37 And I really enjoyed the conversation last year. The reason that today I'm here with Daniel Schmachtenberger, who's really a person I've learned so much from the last few years and why I thought it'd be a good through line, is that the issues of social media, which I know we're going to talk about today, are connected to a number of other issues that are going wrong in society that are all kind of interconnected. And I've learned a tremendous amount from Daniel, and I thought it would help really clarify some of these issues for everyone. Well, thank you, Daniel. Thanks for coming aboard. Thanks for having me here. What a daunting task, how to ethically influence people.
Starting point is 00:01:11 And what a weird thing that this industry that didn't exist 20 years ago has such a, I mean, think about life on earth. And then 20 years ago, all of a sudden this social media thing sort of evolves. And now you have to wonder how much of an effect it has on our just day-to-day lives and how to ethically influence people. What the fuck does that even mean? Well, first of all, I should say- How do those thoughts even get, how does that get worked out? Actually, I should first say that there wasn't at Google a department that said,
Starting point is 00:01:48 how do we ethically influence people? I actually sort of, as was shown in the film, The Social Dilemma, wrote this presentation worried about how technology was influencing people's thoughts, concerns, behaviors, et cetera. And I studied persuasive technology at Stanford, which is a whole discipline and field, the idea that technology can influence people. And it was out of my own personal concern that when that presentation went viral at Google, I kind of worked my way into this position that never existed before, which was how could we create a framework for what it means to ethically influence other people? And a lot of that has to do with asymmetries of power.
Starting point is 00:02:23 I mean, when I was a kid, I was a magician. We talked about this before. Magic is about an asymmetric relationship. The magician knows something about your mind that you don't know about your own mind. That's what makes the trick work. And actually, across some of the issues I think we're going to talk about today are ways that there is an asymmetric relationship between what technology knows about us and what we don't know about ourselves. When you were studying at Stanford, what year was this? This was 2002 to 2006. I was an undergrad. And then in 2006, I got involved with Professor B.J. Fogg, who, again, actually studied ways that persuasive technology could be used for positive purpose. Like, how do you help people be healthier? How do you help people floss? How do you help people work out more often? Things like that. It could be used in a positive way.
Starting point is 00:03:08 But I got concerned because it was all of this increasing arms race to use persuasive tools to harvest and capture people's attention, now known as the race to the bottom of the brainstem, to go down the brainstem into more social validation, more social narcissism, all of that. And that's one of the arms races we see everywhere, which is like in every single thing. If one oil company doesn't drill for that oil well, the other one will. If one attention company doesn't add the beautification filter, the other one will. If one company doesn't do narcissism, social validation hacking and likes and variable rewards, the other one will. And it's true across so many of the other issues that we're facing, whether it's like, if I don't build the drone
Starting point is 00:03:49 for everyone, then someone else is going to build the drone for everyone. So that's how I think. Did you realize it back then? I mean, 2002, 2006, you're talking about a completely different world in terms of social media. Totally. It's before the iPhone, actually. Yeah. 2007, right? iPhone came out in 2007. We were studying persuasive technology. And I've said in the iPhone, actually. Yeah. Yeah. 2007, right? iPhone came out in 2007. We were studying persuasive technology. And I was, as I've said in the past, partners with the co-founder of Instagram in the persuasive technology class. So we were actually studying how would you apply persuasive technology to people before the iPhone even existed.
Starting point is 00:04:20 And, you know, what bothered me is that I think when people think about how do you ethically persuade people, you just get into a whole bunch of ethical cop outs, like, well, we're just giving people what they want, you know, or if they don't want this, they'll, they what they're doing. And what concerned me was that the ethical framework wasn't really there. Not that I had one at the time, by the way. I studied at Google for three years to try to develop, like, what does it mean to ethically influence 3 billion people who are jacked into the system? And this is before Cambridge Analytica, before the Facebook files, and Francis Haugen talking about that, you know, we now have the receipts for all these things. So we talked about all these things in the social dilemma. But now there's the evidence with Francis Haugen's whistleblowing that, you know, Instagram makes body image issues worse for one in three teenage girls. I know I'm going fast, but that's the broad strokes. Do you know the conspiracy theory about her?
Starting point is 00:05:15 Tell me. The conspiracy theory amongst the tinfoil hat folk is, first of all, she started a Twitter account right before she went there and was immediately verified. Right. And then instantaneously was on all these major media outlets, major network television shows and being interviewed. And she was saying something that a lot of people felt like was a call to authoritarian intervention into social media, that it was government censorship was the solution, and regulation was the solution to dealing with this problem, and that it seemed like she was a sanctioned whistleblower.
Starting point is 00:05:58 She was saying all the things that they wanted to hear, and that's why they put her in the position to make a big, loud noise. What did you think about that when it came up? I always have to do this. You know, when something like that happens, like, maybe, maybe, because, you know, the government would do that. Like, most certainly they would love to have control over social media. They would love to be able to censor things like the Hunter Biden laptop story. They would love to be able to,or things like the Hunter Biden laptop story. They would love to be able to hide Joe Biden's medical records or Kamala Harris's time as a prosecuting attorney. There's a lot of stuff they would like to do, or district attorney rather. There's a lot of stuff they would like to do with access to
Starting point is 00:06:41 information. I mean, you're seeing it right now in terms of one of the things that's been fascinating about COVID is during this pandemic and during this terrible time of paranoia and dealing with this disease and fear and anxiety, you're seeing this narrative from social media networks that absolutely walk step in step with the government, where if the government wants certain information censored, it's being censored across major social media platforms. Yep. That has to be coordinated.
Starting point is 00:07:16 There's no way it's not. And there's no way they're incentivized to not have people discuss certain things. incentivize to not have people discuss certain things because we've said before you know it's one of the major points of The social dilemma is that things that are controversial whether they're true or not are the things that are the most? Clicked on the most shared the most and that's where I really am that's where the money is Yeah So there's got to be some sort of incentive for them to not do what they do with every other subject, whether it's immigration or gun control or abortion or anything.
Starting point is 00:07:53 The algorithm favors. Do they censor on immigration? Or you're saying that as an example, if something goes viral. Yeah, yeah, yeah. Not the censorship. Right. They don't censor on immigration. I mean, the border crisis is a great example of that.
Starting point is 00:08:06 Right. Like, the government would probably like us to not see all those Haitian immigrants storming across the border. But, my God, those were shared like crazy. Totally. You know? So, why was COVID information shared? Well, because there was a narrative that they could say, well, this is dangerous misinformation and we could protect people, even though some of it turned out to actually be accurate, like the lab leak hypothesis. At least that it's a hypothesis.
Starting point is 00:08:33 It's a hypothesis that at least is being considered by virologists. But the point is that who the fuck are they to decide what can and can't be discussed? And when they're doing something step in step with the government, I get concerned. So when someone comes along and this person who's a whistleblower says something needs to be done, you know, we're endangering young girls lives. We're doing this. We're doing that. We need some sort of government intervention. I mean, this is essentially calling for censorship and calling for government control of social media, which freaks people out.
Starting point is 00:09:09 So she's pretty clear that she's not calling for censorship. But the reason I asked you, I was curious how it came across your radar, because I happen to know and hear a little bit about this from her. We interviewed her on our podcast. And the story that goes viral about her saying that she's a psyop or that she's a plant, that's an incendiary, inflammatory, controversial story. So when that gets suggested, is it just going to fizzle out or is it going to go viral? How ironic. It's going to go viral. Exactly. And in fact, when you kind of realize everything,
Starting point is 00:09:43 I mean there's some things that are real conspiracy theories and there's some things that are real psyops and that's a real thing. But notice how many things we think of as psyops, conspiracies, et cetera now. And it's because anything that has that incendiary quality goes viral. And I happen to know, for example, I think one of the things that claims in there is that she's funded by this billionaire, Piero Midiar. But I happen to know from talking to her that that happened at the very, very end of what she was doing. And it was
Starting point is 00:10:09 a tiny grant of like $150,000 for us in the nonprofit world. That's like a tiny amount of money basically just to support her flight costs. And I happen to also sort of hear from her how much of the media was constructed at the last minute, like she was working this one newspaper, The Wall Street Journal, to do this sort of procedural rollout of specific stuff that she thought was concerning. I guess what I'll just say is like, what if she's just a good faith person who saw that virality was driving people crazy and that it was it was harmful to teenage girls? And it's true that the government would see some of that and say, hey, we could use that for something else. We could use that. She could be a tool for us to do something else. But I guess in the aim of complexity and nuance and not jumping to conclusions and this sort of thing, my perception from talking to her now extensively, she's a very
Starting point is 00:11:02 good faith actor who was concerned that this was going to drive the world apart. I should be really clear that this is not my position. This is just the conspiracy theory. I literally don't have an opinion on her. I do have an opinion on algorithms, and I do have an opinion on what it does do to young girls' self-esteem. Well, you have teenage daughters.
Starting point is 00:11:20 Yes. I just think, I mean, and young girls are a point of focus. For why they're a point of focus more than young boys, I'm think it mean in young girls are a point of focus because for Why they're a point of focus more than young boys. I'm not entirely sure I guess it has to do with their emotional makeup and and there's higher risk of self-harm Due to social media and Jonathan Haidt talked about that in his book the coddling of the American mind It's it's very clear that it's very damaging and I my kids, you know, my 13-year-old does have, like, interactions with her friends. And I do see how they bully each other and talk shit about each other.
Starting point is 00:11:52 And they get so angry and mad at each other. It is a factor. But it's an algorithm issue, right? There's multiple things here. So the first thing is, just to kind of set the stage a little bit, I always use E.O. Wilson, the sociobiologist, who sort of defined what the problem statement for humanity is. He said the fundamental problem of humanity is we have paleolithic emotions and brains, like easy brains that are hackable for magicians.
Starting point is 00:12:21 We have medieval institutions, you know, government that's not really good at seeing the latest tech, whether it was railroads or now social media or AI or deep fakes or whatever's coming next. And then we have godlike technology. So we have paleolithic emotions, medieval institutions, godlike technology. You combine that fact,
Starting point is 00:12:40 that's the fundamental problem statement. How do we wield the power of gods without the love, prudence, and wisdom of gods? Which is actually something that Daniel taught me. And then you add to that the race to the bottom of the brainstem for attention. What is their business model? Just to review the basics. Everybody knows this now, but it's engagement.
Starting point is 00:12:55 It's like, how do I get that attention at all costs? So algorithms is one piece of that. Meaning when you're on a news feed, like I don't want to just show you any news. I want to show you the most viral, engaging, like longest argumentative comment threads news, right? So that's like pointing a trillion dollar market cap AI at your brain saying, I'm going to show you the next perfect boogeyman for your nervous system. The thing that's going to make you upset, angry, whether it's masks, vaccines, Francis Haugen, whatever the thing is, it will just drive that over and over again and then repeat that thing. And that's one of the tools in the arsenal to get attention is that the algorithms.
Starting point is 00:13:31 Another one is technology making design decisions. Like how do we inflate people's sense of beautification filters? In fact, just recently, since we talked last time, I think it's a MIT Tech Review article showing that they're all competing, first of all, to like inflate your sense of beauty. So they're doing the filters. People know this stuff. It's very obvious. But they're competing for who can give you a nicer filter.
Starting point is 00:13:54 Right. And then now, instead of waiting for you to actually add one, TikTok was actually found to actually do like a 2% like just bare beautification filter on the no filter mode. Because the thing is, once they do that, the other guys have to do it too. So I just want to name that all of this is taking place in this race to capture human attention, because if I don't do it, the other guy will. And then it's happening with design decisions like the beautification filters and like the follow you. And if you follow me, I'll follow you back and the like button and check, pull, refresh
Starting point is 00:14:21 the dopamine stuff. That's all design. Then there's the algorithms, which is I'm pointing a thing at your brain to figure out how can I show you an infinite feed that just maximally enrages you? And we should talk about that because that thing drives polarization, which breaks democracy. But we can get into that. Daniel, let's bring you in here. So how did you guys meet and how did this sort of dynamic duo come about? Yeah, I was working on studying kind of catastrophic risks writ large. You've had people on the show talking about risks associated with AI and with CRISPR and genetic engineering and with climate change and environmental issues.
Starting point is 00:14:58 Pull up to the microphone there. And escalation pathways to war and all these kinds of things. Basically, how can shit hit the fan? Right. And I think it's a pretty common question of, basically, how can shit hit the fan? Right. And I think it's a pretty common question of like, how long do we have on which of these? And are we doing a good job of tending to them so that we get to solve the rest of them? And then for me, it was there were so many of them. What was in common driving them? Are there any kind of like societal generator functions of all the catastrophic risks that
Starting point is 00:15:23 we can address with to make a more resilient civilization writ large. Tristan was working on the social media issues. And when you had Eric on, he talked about the twin nuclei problem of atomic energy and kind of genetic engineering and basically saying these are extremely powerful technologies that we don't have the wisdom to steward that power well. Well, in addition to that is all things computation does, right? There's a few other major categories. And computation has the ability to, as you mentioned with Facebook, get to billions of people in a very, very short period of time compared to how quickly the railroads expanded or any other type of tech. How fast can TikTok get to a billion people, a billion users, which they did in like a few years versus before that it took software
Starting point is 00:16:04 companies like Microsoft even longer than that. Before that it took railroads even longer than that. So the power of this tech is you can compress the timeline. So you're getting a scale of a billion people. You're impacting a billion people in deeper ways much faster, which means that if you're blind to something, if you don't know what you might be doing, the consequences show up faster than you can actually remediate them. When we say exponential tech, we mean a number of things. We mean tech that makes more powerful versions of itself. So I can use computer chips to model how to make better computer chips, and then those better computer chips can recursively do that. We also mean exponential
Starting point is 00:16:35 speed of impact, exponential scale of impact, exponentially more capital returns, exponentially smaller numbers of people capable of achieving a scale of impact. And so when he's mentioning godlike powers and kind of medieval institutions, the speed at which our tech is having influences in the world and not just first order influences, the obvious stuff, but the second and third order ones. Facebook isn't trying to polarize the population. It's an externality. It's a side effect of the thing they're trying to do, which is to optimize ad revenue. It's a side effect of the thing they're trying to do, which is to optimize ad revenue. But the speed at which new technologies are having effects on the world and the total amount of consequences way faster than regulation can keep up with.
Starting point is 00:17:12 And regulation- And just by that alone, we should be skeptical of any government's ability to regulate something that's moving faster than it. Faster than it can appraise of what the hell is even happening in the first place. So we already- Well, not only that, you need someone who really understands the technology, and you're not going to get that from elected officials. You're going to need someone who's working on it and has a comprehensive understanding
Starting point is 00:17:33 of how this stuff works, how it's engineered, where it goes. I mean, I'm skeptical of the government being able to regulate almost everything. Right. Well, and so there's maybe a few things to say about that. So one is the complexity of all issues. Like, climate change is almost everything. Right. Well, and so there's maybe a few things to say about that. So one is the complexity of all issues, like climate change is really complex. Yeah. Like the where the nuclear pathways of escalation or the way a satellite or GPS could get knocked out triggers a nuke somewhere. That's also really complex. Social media is really complex. CRISPR, you know, bio stuff is complex. So in general, like one of the ways to summarize the kind of problem from our friend Zach Stein's kind of work is that the complexity of humanity's problems is going up like this,
Starting point is 00:18:09 but the capacity to meet them is like not really meeting it. And then you add in social media and you polarize people and divide them into like, they don't even know what's true because everyone's got their own personalized version of reality. And instead of even trying to try to meet that, it goes down. And in fact, social media also rewards the most cynical take on anything. So anytime a government institution has ever said something dumb, like when the guy asked Zuckerberg, how do you make money? And he says, Senator, we sell ads. That thing goes viral. And when that goes viral, everybody saw that.
Starting point is 00:18:40 And they didn't see that, you know, the five senators who I talked to who actually do really get these things pretty decently. And I'm not going to say like, let's just like regulate it, but just to notice, right? So the cynical take about every time an institution makes a mistake, that thing goes viral, which means we lose trust in so many things. Because no matter what the issue is, do you want to... You notice that you were bringing up the conspiracy theory of might the government have a incentive to make a plant like Frances? And so it's plausible, but plausible doesn't automatically mean is. One of the challenges is when someone has a confirmation bias, they hear something that's plausible and they just assume that it is without doing the due diligence of saying, what would I need to know? And you do
Starting point is 00:19:17 a good job of checking that. We could also say, would Facebook have an incentive to say that she's a plant and try to hire a bunch of PRs. And they were helping to spread that story, by the way. I'm not saying they're responsible for it. I understand. I actually think that what happened is organically, again, the cynical take goes viral. And then if you're Russia or China or you're Facebook in this case, you can be like, hmm, that's a really helpful cynical take from my perspective. from my perspective. In fact, one of the things that Facebook does try to do is turn the social media debate into a censorship or free speech debate, because they know that divides the political class, because they know that the right doesn't want censorship, obviously.
Starting point is 00:19:54 And so they say, the more they can spin whatever Frances is doing as she's claiming censorship, the more they can divide any possibility for actual action. In fact, I'll tell you just a quick story, really quick, is during the three-hour testimony that Frances gave, if you watch the full three hours, she had both people on the left and the right, and I've been working on this for eight years, I have never seen someone create a bipartisan consensus the way that she did. She actually did if you watch the video. And there was a senator there on the right who typically had been very skeptical of these issues. And the next day I talked to her, she was going to meet with that senator. And he later said, I can't meet with you.
Starting point is 00:20:31 Why? Because the story went viral saying that she was a Democratic operative. And he said, my base will hate me if I meet with you. So the very thing we're talking about, which is the ability to regulate anything is being broken and shattered because the incendiary controversial take on everything goes viral. Now, again, I'm not saying that we're this like easy world we should therefore regulate. It's just like, but noticing the mind warp, like part of what I wanted to do today is like, how do we reverse engineer this like bad trip we've been on for the last 10 years? Like it's like a psychedelic trip where we've all fractured into
Starting point is 00:21:02 this different reality where the controversial psyop interpretation of everything, the conspiracy-minded interpretation of everything. Again, some things are real. Some of those things have deserved to be seen that way. But just to understand how deep the mind warp has been the last 10 years. It's so funny you say the right doesn't want censorship. Isn't that a crazy statement? Are we shifted the polar, you know, the polar... What do you mean?
Starting point is 00:21:29 It used to be the left didn't want censorship. The ACLU used to defend Nazis. I mean, what the fuck has happened? Like, our poles have shifted. Like, north is south and south is north. It just shows you that so much of what ideology is, is tribal. It's like you find a group that agrees to a certain pattern of behavior and thought, and you subscribe to that. No, I am a right-wing conservative. I am a left-wing progressive.
Starting point is 00:22:02 And then you just follow the playbook. And it makes it so much easier than having your own individual nuanced thoughts on complex and difficult issues like this. But the fact that he couldn't talk to her because his base would somehow or another think that she actually is a democratic operative and she does work for the government and is some sort of an attempt at censorship. And I'm sure not only is Facebook amplifying that, but all of the different Russian troll pages on Facebook are amplifying that, which confuses the water. Totally. Well, also, if I'm Russia or China,
Starting point is 00:22:35 Facebook is like the best weapon I've ever had against the United States. Yes. Oh, my God. You've got an F-35. I don't need F-35. I've got Facebook. I can destroy your entire coherence as a society. And they have. And you won't get anything done. And all of your energy will be spent on waste, infighting, and heat. We talked about this recently, but there's, I'm sure you saw the story. There was 20, top 20 Christian sites on Facebook. 19 of them were run by a Russian troll farm.
Starting point is 00:23:01 I'm glad you actually mentioned that. Or excuse me, it was an Eastern European troll farm. Macedonia, I think it was. Totally. 140. This is an important stat, actually. I'm glad you actually mentioned that. Or excuse me, it was an Eastern European troll farm. Yeah. Macedonia, I think it was. Totally. This is an important stat, actually. I'm glad you brought it up. This is as recent as October 2019. 140 million Americans per month were reached by essentially troll farms actively.
Starting point is 00:23:20 There's three categories of pages in which they're... So for Christian pages, the top 15 out of 15 Christian pages were all run by troll farms. So all of the Christians in the country have, were receiving content and 85%. This is, this is a secondary point. 85% of the Christians who saw that stuff in their feed, they didn't actually accept an invitation from the group or the page to say, yes, I want to subscribe to you. Facebook, because they're optimizing for growth,
Starting point is 00:23:49 they changed the way the system works so that if a page invites you, that's enough for it to start putting the content in your feed. So there's an example in Francis's work where there was a QAnon person who invited 300,000 people in one day. 300,000 people. And because Facebook's optimizing for growth and engagement, those people didn't have to say, yes, I want to join that group. Just by being invited, it started testing like we want to optimize for growth. So, it puts it in your feed. And if you click on it, it auto adds you to the group. Oh, my God. Out of the top 15 pages for African-Americans, two-thirds of those top 15 pages were run by troll farms.
Starting point is 00:24:27 Of the top 15 pages for Native Americans, one-third of those pages were run by troll farms. So we're not living in an authentic reality. Reality, quote-unquote, is getting more virtual. If you read Chinese military doctrine, specifically look at the 36 stratagems, don't ever attack a superior opponent directly. Turn the enemy against themselves based on their existing fault lines. Population-centric unconventional warfare, right? Like that's kind of ancient doctrine. It's just Facebook makes that amazingly easy because it automatically already puts people into tribal groups that whatever the content is in that group is going to keep getting upregulated, optimizes for inflammation and tribal identity and those types of things. And so you don't have to kinetically attack a country to make the country so turned against itself that the polarized population supports a polarized representative class, which means you get gridlock on everything, which means you can't do effective governance, which means another country that does autocratic governance just wins geopolitically. It seems absolutely insane that they could, through one page, inviting people,
Starting point is 00:25:33 instantaneously start to distribute all of their information on those people that they invited. So why would Facebook even allow that? So if I'm designing Facebook, you would probably say— Wait, wait. You just said the government should regulate social media. It should be illegal is what I said. It should be illegal. Yeah. Well, this is – I don't think the government should regulate.
Starting point is 00:25:52 But I do think there should be rules in terms of like if you're a regular person that, say, has a specific group of interests. Like say you only like motor cars. You like vehicles. You like hot rods or whatever, and that's what you're interested in. You use Facebook when you're off duty at work and you just want to check some stuff out, and all of a sudden you get QAnon shit
Starting point is 00:26:18 because they invited you into this QAnon group, and you start getting all this information. You start getting radicalized. It seems like... QAnon group and you start getting all this information, you start getting radicalized. It seems like... And again, I don't know what we should do in terms of regulation, but I don't think that social media groups should be able to just distribute information to people based on this concept of universal growth. Yeah.
Starting point is 00:26:42 Well, I mean, think about it. If we were just designing... Or unlimited growth. Yeah. Well, I mean, think about it. If we were just designing unlimited growth. Yeah, exactly. I mean, if we were designing Facebook with a feature called groups and groups had a feature called invitations and you could invite people, wouldn't you design it so that people have to accept the invitation for the group before it shows up in your feed? Why would Facebook not do it that way? Right. Because what happened is starting in, I think it's like 2018, people stopped posting as much on Facebook. So you and I, and maybe we used to post a lot more in 2016, 2017. If we stopped posting as much, oh shit, we can't harvest all that attention from people. You were doing all this labor. What do you mean?
Starting point is 00:27:14 What caused it to slow down? Oh, just like people, I mean, people being more skeptical maybe of Facebook or just realizing they don't want to share as much or just usage burning out, more people moving to Instagram or TikTok or something. People are getting older as well, right? It's like older user base. Totally. And so now if I'm Facebook, I want to find new sources of free unpaid content creators. Where can I tap that pool of content? Oh, I've got this thing called Facebook groups where people are posting all the time.
Starting point is 00:27:39 So I'm going to start putting that stuff in people's feeds to just – so now I'm fracking for attention. I'm going lower into all these other places to backfill this attention-harvesting, we are the product machine. And how do you know, since there isn't rigorous identity, if a user that says they're a user is really who they are or if they're a troll farm or if pretty soon they're an AI GPT-3 algorithm? You should explain what AI GPT-3 is. The ability to generate text-based deepfakes so so people know what a deep fake is Well, there's a whole reddit thread with people arguing with each other that are all fake. Do you know about that? No, I don't actually hear I'm gonna send it to Jamie It's Duncan just sent this to me the other day and I was like what in the fuck I could only look at it for
Starting point is 00:28:21 a couple moments before I started freaking out, but the idea that I could only look at it for a couple moments before I started freaking out. But the idea that it's not far off, like this ability that deepfake AI has to recreate, especially in text. Yes, exactly. That's specifically what GPT-3 is. It's a text model that trains on trillions of parameters and basically the entire corpus of the internet. So you're basically ingesting everything everyone has ever said online ever, including stuff in your voice or in my voice. And then you could say, GPT-3, write me an argument about why social media is great,
Starting point is 00:28:54 written by Tristan Harris, using his words and phrases. And it'll do that. It'll actually be able to take my style of speech, and it'll generate text there. You could also say, do you want to do the vaccine? The ability to say, make arguments for vaccines or against vaccines and say, only use real data, and then be able to show the financial vested interest of anyone arguing on the other side, and just have it be able to create more data than people can parse in any reasonable amount of time. Like an academic looking paper that's 10 pages long saying why the vaccine is not safe with
Starting point is 00:29:29 citing real charts, real graphs, real statistics, and the real vested interests of people who are, say, positively pointing out that the vaccine is safe, who maybe they have some connection to Pfizer or something like that. And it'll generate that full 10 page or 20 page document. And it'll take a team of statisticians a while to decode that thing. And you can flood the internet with that kind of text. And it's already, we already have, through OpenAI and the GPT-3 algorithm, the ability
Starting point is 00:29:54 to pass the Turing test in many areas. We should explain what the Turing test is. Meaning that if you're reading the text, you can't tell that it wasn't produced by a human. Right. Turing test is the idea that if you, that's how you find out if someone, it's a very good robot. So you've already got an AI. So this is the
Starting point is 00:30:09 Reddit thread. So this is these are all, why do human babies cry? These are all robots. This is all bots arguing with each other. This is what happens when you give birth to a human baby. Oh, my bad. I thought you were just trying to answer the question. No worries. thought you were just trying to answer the question.
Starting point is 00:30:25 No worries. No, I'm trying to answer the question of how babies cry. YTA, I don't know what that means. And you are discussing, I can't even fathom the level of toxicity in this post. These are all bots. I am disgusted that you are making fun of others. Don't you know that people in this sub
Starting point is 00:30:40 are supposed to be empathetic of others' feelings? I'm sorry, but you're being a cunt. These are all robots. This is wild. Because if you just read this and you didn't know, I don't really care if you disagree with my opinion as long as you don't call me a pedophile. If you were a real man, you would be with a young girl
Starting point is 00:30:59 and take care of her, and you would be a sex offender. This is wild shit. One of the things people don't know, it actually was just developed over the summer. and you would be a sex offender. Like, this is wild shit. Yeah. One of the things people don't know, it actually was just developed over the summer. They announced that OpenAI, just to track since we came and talked about some of these things last time, in August 2020, OpenAI released a video
Starting point is 00:31:15 of using the same technology of machines generating stuff to actually write programming code. So you tell the GPT-3, I want an asteroid video game. And it's like, do-do-do-do-do-do, and it writes all the code, and then it puts a little graphic of a starship thing in the middle, and then there's rocks that are flying. And you say, I want the rocks to move faster,
Starting point is 00:31:35 and then the rocks move faster through the asteroid game. Only requiring natural language input, no programming. So you're just saying, you're typing in natural text, I want an asteroid video game that when I move left, it moves left. I want the asteroids to move faster. Actually make the starship bigger. And then it just changes, and it does it all for you. Now, it's not perfect, but this is AGI.
Starting point is 00:31:54 You're just typing it in text. That's right. But also voice to text, so you could just say it. You combine these things together. Alexa, make me a Pong game. Yeah, exactly. Alexa, code me the Unreal Engine. I mean, that one's going to be harder.
Starting point is 00:32:08 Not yet. Right, but the point is that's where we're headed, right? Right. And part of this is, again, we have the power of God. This is actually it right here. Here it is. This is the one. Make the person 100 pixels, and it's doing it all itself.
Starting point is 00:32:19 Yep. Wow. And it writes the code in the right-hand side. So this video that Jamie pulled up on YouTube is OpenAI Codex Live Demo and you can see this all happening while this person types in the data and they're actually explaining it now, how this is going to work. Once you see it later, set its position to 500 pixels down and 400 pixels from the left and then it just does that.
Starting point is 00:32:43 Oh my God, look how quick it codes it. Yeah. Wow. Now make it controllable with the left and right keys, the right arrows, right? Boom, and then now you can move it. So it does it progressively, right? It's adding the code in. Yeah.
Starting point is 00:32:56 Wow. Yeah. And it's going to be accessible to more and more people, too. Go ahead. This is an example of a kind of deep point to think about for the state of the world as a whole is one of the things that exponential tech means is exponentially more powerful. I hate to tell you this, but get this thing right up in your face. Exponentially more powerful tech that's also exponentially cheaper, which also means more
Starting point is 00:33:18 distributed. And so pretty soon this level of tech will not only be getting better but available to everybody. So what happens when you have an internet where not only do you have an AI that is curating the Facebook feed for the most sticky stuff, which usually means the most toxic stuff, and that's an AI that is curating human-made content. But now you have AIs that are creating content that also get to maximize for stickiness. And then you have the relationship between the curation and the creation AIs. Like how does anyone ever know what is true about anything again? So AI can create fake stories and the fake stories can be boosted up by these troll farms. Which themselves could be run by fake accounts and fake logic. Oh my God.
Starting point is 00:34:02 But wait, it goes one step further. So that's just distributed AI, right? But we also have drones making continuously better drones with continuously better ability to swarm and weaponize them that also becomes easily accessible. We also have CRISPR making biotech capability, something that you don't have to be a state actor to have, small actors can have. So there's this question of how do we make it through having decentralized exponential tech, which means decentralized catastrophic capability. Godlike powers, decentralized godlike powers. Decentralized godlike powers in terms of biology as well as in terms of technology.
Starting point is 00:34:39 That's right. So social media is an instance of one case. So I don't think we should just gloss over the CRISPR thing. For people who don't understand what CRISPR is, CRISPR is a gene editing tool. I think it's on the second iteration now or is it on the third? Something like that. They're getting better and better at it. The idea is eventually it's going to get to the point where it's like a home computer,
Starting point is 00:34:59 like where you are going to be able to edit genes. Yeah. So how do you stop that? Or what do you do about that? And if you wanted to have any kind of regulation about something like that, what is the regulation? Is the regulation that you have to have some specific level of clearance before you have access to it? But if that's the case, then you put it in control of the government and then also bad actors and other governments are going to just distribute it wildly.
Starting point is 00:35:30 And how do you control that someone would have to have some kind of access to get it if one of the its is something that you just need internet access for, like open AI or the ability for cyber weapons, right? Cyber weapons hitting infrastructure targets. It's like now the only way to regulate that is universal surveillance on everyone's use of their home computer. And we don't want that future. So in general, like, because this might sound like just disaster porn, which I want to be really clear that, I mean, I think our goal is coming on. There is a way through this.
Starting point is 00:36:00 Yeah, our goal in coming on was to be able to talk about framing the problem so we know what we're trying to solve. We're not trying to say, hey, we've just got this social media problem. Let's frame it really clearly. Yeah. Okay, you've got your coding problem and you have this biology problem with CRISPR. How does a civilization navigate this without killing itself? Well, Daniel's going to be able to speak to a lot more of this.
Starting point is 00:36:22 I just wanted to connect it first to social media so people see the through line. So I actually think that social media is its other kind of, it doesn't seem as dangerous, right? It just feels like this thing where people are sharing cat videos and their opinions and their political ideas and sharing links. But it's actually just like this. And in the same way that that dangerous capacity, we're now seeing what that dangerous godlike power was doing of steering 3 billion people's thoughts, personalized to them, the thing that would most outrage, you know, boogeyman their lizard brain and their nervous system, that's a godlike power. When you have a godlike power, there's sort of two choices. There's two attractors with that power. One is, think of it like a bowling alley.
Starting point is 00:36:59 You've got one gutter on the left and one gutter on the right. On the left, you've got a dystopia, centralized control saying like, here's how we're going to control that godlike power. That's like China controlling its internet. That's like Mark Zuckerberg having a total monopoly on what people can and can't say. Like those are both dystopias, that centralized power. The other gutter in the bowling alley is like, take your hand off the steering wheel and let this thing go for everyone. Like anyone can make anything go viral. Let's add the devious licks, which is by the way, a TikTok challenge for anybody to basically trash their high school bathroom. And it teaches you how to do it. And these videos go viral and it's just like, everyone's crashing. What is a devious lick?
Starting point is 00:37:36 I probably shouldn't have gone there. It's a quickly, it's a, um, uh, high school teacher told me this. There's all these horrible things that are going viral at the point. A high school teacher told me this. There's all these horrible things that are going viral at the point. Virality is a godlike power. And devious licks is a challenge that basically you're challenging your fellow high school-aged friends around the world to trash their high school bathroom. So you, like, flush a Big Mac with, like, shit and all this horrible stuff down the toilet at the same time. They put, like, pee. This is awful.
Starting point is 00:38:04 They put, like, pee in the soap dispenser. They do all this horrible stuff down the toilet at the same time. They put like pee, this is awful. They put like pee in the soap dispenser. They do all this awful stuff. And it just, you're just spreading a disaster meme. You're just teaching people how to create a decentralized catastrophe instead of a drone hitting something. And they do this just for TikTok likes? Well, they do it because it's getting attention and engagement. There's another one that's a self-harm challenge for teenage girls. They're saying basically, you know, this is teaching who can do a cutting. It's like a cutting challenge, I think is what it's called. So the point is that these decentralized— Do we know where this comes from?
Starting point is 00:38:32 Are these things from troll farms? I don't know. But they could be. Because some of them probably are, right? Yeah. There's a concept called stochastic terrorism. There's a good article on Edge, which basically is the idea—let's say there was a foreign state actor that wanted to mess things up in the U.S. population. Trying to control a specific person to do a specific thing is hard.
Starting point is 00:38:53 But trying to get an already kind of disenfranchised group more radicalized that makes it more likely that some of them do some harmful stuff is easy. Think about – you last texted me, Joe, on January 6th. I think we had a quick text exchange because I think that's an example of, I don't, I mean, and I'm not going to claim that everyone is just, that's an example, I think I would say of, I can basically go into a group of the Boogaloo Boys or, you know, Stop the Steal groups or something like that. And I can just seed stuff that's like, hey, let's get our guns out. Let's do this. And I just, just, just hinting at that idea. I'm not telling one person to go do something. I'm not controlling anyone.
Starting point is 00:39:26 I'm just hinting, and there's a wide enough group there that people can take action. So that's one of the other decentralized power tools. But I just wanted to close the thought on the bowling alley. We've got the bowling alley. One gutter is like, let's lock it down with surveillance. Let's lock it down with Mark Zuckerberg controls everything. Let's lock it down with the government tells us what we can and can't do on computers.
Starting point is 00:39:47 And the other gutter, which is the decentralized power for everyone, which without people having the wisdom to wield that godlike power or like not at least not evidence in people's own usage of it right now. Also, we've incentivized people to do destructive things just for likes. Right. So in certain places, there is an incentive for those things to happen. It's not just by accident. It's like by design and incentivized. Wait, you just said it's super important. It's a population that is getting continuously more radicalized on all sides that simultaneously
Starting point is 00:40:17 has continuously more powerful tools available to them in a world that's increasingly fragile. And so if you have an increasingly fragile world, meaning more interconnected global supply chains that have where a collapse somewhere leads to collapse everywhere, more sensitive infrastructure, you know, things like that. If you have an increasingly fragile world, you have more and more radicalized people and you have those radicalized people having access to more and more powerful tech. That's just fragility across lots of different dynamics. And this is why the social media thing is so central is it's a major part of the radicalization process. It's both a major part of the radicalization process and is itself an example of the
Starting point is 00:40:58 centralized control censorship, which we don't want, and the decentralized viral memes for everyone, which radicalize and enrage people and polarize democracies into not working. The thing is, in those two gutters, the gutters are getting bigger every day. Like on each side, you've got more potential for centralized control, you've got China basically doing full control over its internet, you know, doing a bunch of stuff to top down control. And the other side, you have more and more decentralized power in more hands, and that gutter is growing. So the question is, how do you basically – we have to bowl a strike down the center of that alley, but it's getting thinner and thinner every day. And the goal is how do we actually sort of – it's almost like a test, right?
Starting point is 00:41:37 We are given these godlike powers, but we have to have the wisdom, love, and prudence of gods to match that set of capacities. You were just mentioning what China is doing to kind of regulate its internet. That's because you're worth speaking about. Yeah. Have you been following this? Yeah. That's what terrifies me is that we have to become like China in order to deal with what they're doing. I feel like one step moving in that general direction is a social credit score system. And I'm terrified of that. And I think that that is where vaccine passports lead to.
Starting point is 00:42:08 I really do. And I think this idea that they're slowly working their way into our everyday lives and in this sort of inexorable way where you have to have some sort of paperwork or some sort of a Q code or something on your phone or QR code. That scares the shit out of me because you're never going to get that back. Right. Once the government has that kind of power and control, they're going to be able to exercise it whenever they want with all sorts of reasons to institute it. I'm worried about that too.
Starting point is 00:42:37 But I will say also, just to also notice that everywhere there is a way in which a small move in a direction can be shown to lead to another big boogeyman, and that boogeyman makes us angry, social media is upregulating the meaning of everything to be its worst possible conclusion. So like a small move by the government to do X might be seen as this is the first step in this total thing. I'm not saying that they're not going to go do that. I'm worried about that too. But to also just notice the way that social media amplifies the degree to which we all get kind of reactive and triggered by that. 05.05 The thing that I think is worth mentioning
Starting point is 00:43:09 is what China is doing regarding its internet, because it's seeing real problems and we might not like their solution. We might want to implement a solution that has more civil liberties than we should. 05.11 Let's explain what they're doing. 05.13 Yeah. 05.13 So I'll do it quickly. So it's almost, it's quite literally as if Xi Jinping saw the social dilemma because they've, in the last two months, rolled out a bunch of sweeping reforms that include things like, if you're under the age of 14 and you use Doyen, which is their version of TikTok, when you swipe the videos, instead of getting like the influencer dancing videos and soft pornography, you get science experiments you can do at home, museum exhibits and patriotism videos. Wow.
Starting point is 00:43:51 So you're scrolling and you're getting stuff that's educating because they want their kids to grow up and want to be astronauts and scientists. Yeah. They don't want them to grow up and be influencers. And I'm not, when I say this, by the way, I'm not, just to be clear, I'm not praising that model, just noticing all the things that they're doing. Well, I'll praise it. If you can influence people, that's a great way to do it.
Starting point is 00:44:08 They also limit it to three hours, sorry, 40 minutes a day on TikTok. For gaming, let me actually do the TikTok example. So they do 40 minutes a day for TikTok. They also, when you scroll a few times, they actually do a mandatory five-second delay saying, hey, do you want to get up and do something else? Because when people sit there infinitely scroll, even Tim Cook recently said mindless scrolling, which was actually invented by my co-founder of the Center for Humane Technology, Azar Raskin. He was in the social dilemma. He's the one who invented that infinite scroll thing. China said, hey, we don't want people mindlessly scrolling. So after you
Starting point is 00:44:41 scroll a few videos, it does a mandatory five second like interlude. They also have opening hours and closing hours. So from 10 PM until six in the morning, if you're under 14, it's like it's closed. Meaning one of the problems of social media for teenagers is if I'm not on at one in the morning, but all my friends are on and they're still commenting on my stuff, I feel the social pressure. I'm going to be ostracized if I don't participate. And if your notifications are on, your phone keeps buzzing. Totally. And even if they're not on, it's like, oh, but I want to see if they said something about my thing. And so it's a, we call a multipolar trap. If I don't participate, but the other guys are, I'm going to lose out. And Facebook and these companies, they know that by the way,
Starting point is 00:45:19 even Netflix said their biggest competitor is sleep. So one of the, because they're all competing for attention. So when you do this mandatory thing where you say we're going to close from 10pm to six in the morning, suddenly everyone, if you're in the same time zone, it's another important side effect, can't use it at the same time. So these are some examples for their military, by the way, when you if you're a member of the Chinese PLA army, you you get a lockdown smartphone, it's like a light phone, It's like hyper locked down. You can't do anything. By contrast, we know that Russia and China go into our veterans groups on Facebook and they actually try to sow disinformation, try to radicalize veterans. Hey, Afghanistan
Starting point is 00:45:56 happened. Don't you really piss? Let me show you 10 videos. And this is like dosing people with more mental health problems. So in a bunch of different ways, we see that- Specifically, if you want to drive civil war in a meaningful way in the US, take the people who have real tactical capability and radicalize them. And so target those groups in particular. And that's like, it makes sense why their military wants to lock down the ability for external influence. Of course. So while we're spending all this money building physical borders, building walls, or spending
Starting point is 00:46:27 $50 billion a year on the passport controls and the Department of Homeland Security and the physical, Russia trying to try to fly a plane into the United States, we've got Patriot missiles to shoot it down. But when they try to fly an information, like precision-guided information bomb, we, instead of responding with Patriot missiles, we respond with, here a white glove Facebook algorithm that says which zip code or Facebook group would you like to target? Right? So it changes the asymmetries. Typically, what made us powerful, what made the US powerful was the geographic board, you know, we have these huge oceans on both sides gives us a unique place in the world. When you move to the digital world,
Starting point is 00:47:01 it erases that geographic asymmetry of power. So this is an imminent national security threat. This is not just like, hey, social media is adding some subtle pollution in the form of mental health or, hey, it's adding a little bit of polarization, but we can still get things done. It's an imminent national security threat to our continuity of our model of governance, which we want to keep. Have you spoken to people in power? Have you spoken to congresspeople about this? Yes, but I'm hoping many more of them watch this because I think people need to see the full scope. And I really do want to make sure we're not sounding like just full disaster porn because we want to get to the point- Don't worry about that. Go full disaster porn.
Starting point is 00:47:34 Better that than not. It's not meant to scare people. Just to get an appraisal of what is the situation that we're in. It's going to scare. The reality is going to scare people. Reality is scary. It should scare people because we're so far behind the eight ball. There's a really important point Tristan was just at that we actually need to double click on, which is that democracies are more affected by what's happening with social media than authoritarian nations are. And for a number of reasons, but do you want to? Well, and we sort of hinted at it earlier, but when social media's business model is showing each tribe, their boogeyman, their extreme reality, it forces a
Starting point is 00:48:10 more polarized political base, which means to get elected, you have to say something that's going to appeal to a base that's more divided. And in the Facebook files that Francis Haugen put out, they showed that when Facebook changed the way its ranking system worked in 2018 to something called meaningful social interactions, I won't go into details, they talked to political parties in Europe. So here we are, it's 2018, they do an interview with political parties in Poland and Hungary and Taiwan and India. And these political parties say, Facebook, we know you changed your ranking system. And Facebook like smugly responds, yeah, everyone has a conspiracy theory about how we change our ranking system because those stories go viral. And they're like, no, no, no, we know that you changed how your ranking system works
Starting point is 00:48:53 because we used to be able to publish, here's a white paper on our agriculture policy to deal with like soil degradation. And now when we publish the white paper, we get crickets. We don't get any response. And we tested it. And the only thing that we get traffic and attention on is when we say negative things about the other political parties. And they say, we know that's bad. We don't want to do that. We don't want to run our campaign that's about saying negative things about the other party. But when you change the algorithm, that's the only thing we can do to get attention.
Starting point is 00:49:20 It shows how central the algorithm is to everything else. If I'm Tucker Carlson or Rachel Maddow or anybody who's a political personality are they really um saying things just for their tv audience are they also appealing to the algorithm because most more and more of their attention is going to happen downstream in these little clips that get filtered around so they they also need to appeal to how the algorithm is rewarding saying negative things about the other party so what that does is it means you elect a more political representative class that's based on disagreeing with the other side and being divided about the other side, which means that it throws a wrench into the gears of democracy and means that democracy stops delivering results. In a time where we have more crisis,
Starting point is 00:49:58 we have more supply chain stuff and inflation and all these other things to respond to. And instead of responding effectively, it's just division all the way down. But it's been division all the way down. But it's been division from the jump even long before there was social media. So all social media is doing – It's putting gasoline on you. Yeah. They're taking advantage of a trend that already existed. It's not like – but my opponent is reasonable.
Starting point is 00:50:17 Right. But I feel like I'm just a better choice. And you could disagree because he's a great guy. But this is how I feel. No one's doing that. Totally. Totally. But notice, though, in this example, how specific the change was.
Starting point is 00:50:28 Those political parties that before 2018, they could get elected in those countries because they hadn't gone that as partisan maybe as we were yet. They could have gotten elected and getting attention by saying, here's a white paper about our agriculture policy. But after 2018, the algorithm has the master say. Everyone has to appeal the algorithm. If I'm a small business, I have to appeal to the algorithm. If I'm a newspaper, do I just like write the articles I want to write or the investigative stories, the fourth estate that we need for democracy to work? No, I have to write the clickbait title that's going to get attention. So I have to exaggerate and say Joe Rogan just takes horse dewormer because that's going to get more attention
Starting point is 00:50:59 than saying he took ivermectin. Particularly in this world where no one's buying paper anymore. Correct. Everyone's buying everything, clicking online. And very few people are even subscribing. So you have to give them these articles and then have these ads in the articles. And those publishers, and that's also driven by the business models of these central tech companies, especially Facebook, Twitter, and Google. There's two feedback loops that he just mentioned. Politically, if you have Facebook and other platforms like this polarizing the population, then the population supports a more polarized representative class. But the representatives to be elected are doing political ads. And so the political ads then further polarize the population.
Starting point is 00:51:41 And so now you have this feedback loop. And then the same is also true with media. The media has to – meaning newspapers, television, still has to do well on the Facebook algorithm because more and more, there's a monopoly of attention happening there and it's someone seeing a clip there that has them decide to subscribe to that paper or keep subscribing to it or whatever it is. So you end up having the algorithm radicalizing what people want to pay attention to where then the sources of broadcast have to appeal to that, which then in turn further radicalizes the population. So these are runaway feedback loops.
Starting point is 00:52:17 And what's the solution? Actually, you asked me this last time, and I dodged the question. And part of it is because it's connected to a set of broader issues that I think is actually really deep in Daniel's line, which is actually the reason I wanted us to do this together this time. There's obviously many steps to this, right? So once you've kind of let this cancer sort of spread, if you take out the thing that was causing the cancer, we've now already pre-polarized everyone's beliefs. Like when you say, what's the solution to all this? Like all of our minds are running malware. Like we're all running bad code. We're all running confirmation bias. Except no one thinks that they are. Right. Everyone thinks the other ones are, but not me. But the point is that all of us,
Starting point is 00:53:04 like it doesn't matter, like people on all sides of the political aisles and all tribes, we've all been shown our version of the boogeyman, our version of the inflated thing that got our attention and then made us focus on that and then make us double down and go into those habits of those topics being the most important. And so we have to realize that. I almost think we need a shared moment for that. I wish the social dilemma was a little bit more of a – it was a shared moment. But I think there's almost like a truth and reconciliation like moment that we need to unwind our minds from the cult factory. Because it's a cult factory that found each of the little tribes and then just sucked them together and made them into self-reinforcing. Matthew Feeney So let's say we take any issue that some people care about and think is central, whether we take social justice or climate change or US-China relations. If half of the population thinks that whatever – half the population has a solution they want to implement, carbon taxes or whatever.
Starting point is 00:53:59 Other half of the population is polarized to think that that is bad and terrible and going to mess everything up. The other half of the population is polarized to think that that is bad and terrible and going to mess everything up. So that other half are still political actors and they're going to escalate how they counter that. How do you get enough cooperation to get anything done, especially where there are real issues and not just have all the energy become waste heat? In autocracy, let's take China as an example where you don't have to – where you don't have so much internal dissent. You don't have that issue. So you can actually do long-term planning.
Starting point is 00:54:35 So one of the things that we see is we have decreasing ability to make shared sense of the world. And in any kind of democratic society, if you can't make shared sense of the world, you can't act effectively on issues. But the tech, the types of tech that are decreasing our ability to make shared sense of the world are also increasing the speed at which tech is changing the world and the total consequentiality of it. And that's one way to start to think about like this bowling alley example is we're having faster and faster, more and more profound consequential effects and less and less ability to make sense of it or do anything about it. So underneath the AI issue, the CRISPR issue, the US-China issue, the how do we regulate markets issue, the how do we fix the financial crisis issue is, can we make sense of anything collectively, adequately, to be able to make choices effectively in the environment we're in? And that's underlying it. Tristan was laying out that you got these
Starting point is 00:55:25 two gutters, right? You've got decentralized catastrophe weapons for everyone if we don't try to regulate the tech in some ways and that world breaks. Or to say if we don't want decentralized catastrophe weapons for everyone, maybe we do something like the China model but where you have ubiquitous surveillance and that's a dystopia of some kind. And so either you centralize the power and you get dystop surveillance and that's a dystopia of some kind. And so either you centralize the power and you get dystopias or it's decentralized and you get catastrophes. And right now, the future looks like one of those two attractor states, most likely. Catastrophes or dystopias. We want a third attractor. How do you have a world that
Starting point is 00:55:57 has exponential tech that doesn't go catastrophic where the control mechanisms to keep it from going catastrophic aren't dystopic? And by the way, we're not here saying like, go buy our thing or we've got a new platform. This is not about, this is just about describing what is that center of that bowling alley that's not the gutters that we can skate down. The closest manifesting example of this so far, although we need to do one more construction, I think, which is this, but is Taiwan because Taiwan, actually, I think I talked about it last time we were here, is a, they've got this digital minister, Audrey Tang, who has been saying,
Starting point is 00:56:35 how do you take a democracy and then use technology to make a stronger democracy? So you can look right now at the landscape. You say, okay, we can notice that China, countries like China, autocratic countries, are employing the full suite of tech to make a stronger authoritarian autocratic society. They're adding surveillance. They're doing cameras everywhere. They're doing Sesame credit scores. They're using TikTok to educate their people instead of turn them into influencers. They're using the full suite of tech to create their kind of autocracy that
Starting point is 00:57:05 they want to see. By contrast, open societies, democracies, Western democracies, are not consciously saying, hey, how do we take all of this tech and make a stronger democracy? How do we have tech plus democracy equals stronger democracy? One of the other reasons I wanted to talk to you is so far, I think the tech reform conversation is like how do we make social media like 20 percent less toxic and then call it a day or like take a mallet and break it up and then call it a day. That's not enough when you understand the full situation assessment that we're kind of laying out here of the skating down the middle of the bowling alley. competes with that thing because we can't just also allow that thing is going to outperform the China autocratic model is going to outcompete a, you know, democracy plus social media that like is 20% less toxic, isn't going to outcompete that thing. Well, ultimately, in the long run, it's going to but the fascinating is they're willing to forego any sort of profits that they
Starting point is 00:58:00 would have from these children from 10pm to 6am.m. in order to make a more potent society of more influential, not influencer, but influential, more educated, more positive people that are going to contribute to society. This is something that I think you can only do if you have this inexorable connection between the government and business, and that's something that they have with corporations and with the CCP over there. They have this ability because they're completely connected. The government is- What did the senator tell us about China's-
Starting point is 00:58:32 Oh, yeah. This is a great point. We were talking with a sitting senator who was saying, or at some national security conference, talking to a foreign minister of a major EU country and said, who do you think the CCP, the Chinese Communist Party, considers to be the greatest rival to its power? You would say the United States, right? Right. He said it's not the United States. They consider their own technology companies to be the greatest threat to their power.
Starting point is 00:59:01 Oh, so that's why when someone like Jack Ma steps out of the line, they lock him up in the brig for a few months and shut his mouth. Notice that cryptocurrency, oh, that's a threat to our financial system. Oh, Bitcoin specifically. Oh, TikTok, that's a threat to the mental health of our kids.
Starting point is 00:59:16 Oh, Facebook, we don't want that in our country. That would open up our military to foreign hacking. So they see correctly that technology is the new source of power of basically what guides societies. It is the pen that is writing human history. And it doesn't have, if you let just for-profit motives, again, coupled with like, how do I get as much attention out of people as possible in the race to the bottom of the brainstem to suck it out of people, that thing doesn't work with societies. That breaks it. So they see that appropriately and then say,
Starting point is 00:59:43 let's do something about it. Now, the cynical view is obviously they're a communist country that's just doing their thing. That's a cynical perspective. But a post-cynical perspective is they're also appropriately recognizing that there's a certain threat that comes with allowing unregulated technology. So one way to think about this, Tristan was just saying that they recognize the power of new technologies and the need to be able to employ them if they want to be effective. We can see how much the world responded, how much the U.S. responded to the possibility of a nuclear bomb with the Manhattan Project, just even the possibility that the Germans would get it and how that would change everything asymmetrically. And so we make basically an indefinite black budget, find all the smartest scientists in the world, because that much asymmetry of tech will determine who runs the world. It's important to also say there are some people
Starting point is 01:00:33 who will have just like a knee-jerk reaction that says, oh, you guys are just being catastrophic. Yeah, you guys are just trying to scare us, disaster porn. There have always been these risks. We always come through them. Really, until World War II and the bomb, there was no way for us to actually mess up the habitability of the world writ large. We could mess up little local things. And, in fact, that happened.
Starting point is 01:00:52 Most previous civilizations did go extinct for different reasons. But World War II was the first time we had truly globally catastrophic tech. And we had to build an entire world system, mutually assured destruction, the Bretton Woods world, the IGO world, to basically not use that tech. And we had to build an entire world system, mutually assured destruction, the Bretton Woods world, the IGO world to basically not use that tech. Well, now that was basically the first catastrophe weapon. And then we had only two superpowers that had it. So you could do mutually assured destruction. And it's really hard to enrich uranium and make nukes. It's not hard to do these types of tech, right? That's the whole point. And we have now dozens of catastrophe weapons, many dozens of actors, including non-state actors who have them. And so we're like, oh, we're in a truly new phase. This isn't the same as it's always been. We're in a novel time of risk. And the exponential technologies with kind of
Starting point is 01:01:39 computation at the center, AI, and these other ones we're talking about are so much more powerful than all forms of legacy power that only the groups that are, and these other ones we're talking about, are so much more powerful than all forms of legacy power, that only the groups that are developing and deploying exponential tech will influence the future. That's like the big story. And then we would say, well, which groups are developing and deploying exponential tech? Well, China is, autocratic nations are. Facebook is, Google is, like major corporations that are also top-down, non-democratic systems are, and they're becoming – like Facebook has 3 billion people. The US has 300 million people, right? We're talking about something that has a global scale of influence but is really a top-down system.
Starting point is 01:02:16 A corporation though. So you either have corporations that are wielding the power of all this technology for mass behavior modification, surveillance of everyone, perfect sort of understanding of their psychological traits, and then moving them that scale. But in the big tech corporation model, they're doing it for a for-profit motive, whereas in the CCP model, they're doing it for their ideological goals. But neither of them are democratic. Neither of them have some kind of participatory governance, jurisprudence of, for, and by the people. And the open societies are not innovating. And how do we develop and deploy exponential tech in an open society way? And that's fundamentally what we're saying has to be like the central imperative of the world right now, is you're
Starting point is 01:02:56 not going to be able to compete with groups that are developing and deploying exponential tech if you are not also. But how do we do that in a way that preserves the civil liberties that we care about, actually advances them, and can advance participatory governance and collective intelligence? And that's not even— Well, the simple way is you don't, right? The simple way is you lock things down and become an autocratic—yeah. So you either beat China by becoming China or you figure out a third way. And we'd like to see there be a third way.
Starting point is 01:03:24 I'd like to see a third way too, but I don't see it. That's what's terrifying to me. A little more about Taiwan is actually worthwhile. We're moving in the direction of China more than we're moving in the direction of some new utopia. Currently, yes. Yes. Right.
Starting point is 01:03:40 So what about Taiwan? You can't even mention that. See what happened with John Cena? No, what happened? You can't even mention that. See what happened with John Cena? No, what happened? You didn't see that? No. John Cena was, there was an opening weekend for Fast and the Furious 9, I believe. And John Cena accidentally or inadvertently said that Taiwan is going to be the first country that sees the movie.
Starting point is 01:04:02 Well, China doesn't recognize Taiwan as a country. And if you want to do business with China, you can't say that. That was on full display when, and it made people very skeptical of the World Health Organization when one of their spokespeople was having a conversation with a journalist
Starting point is 01:04:16 and when she brought up Taiwan's response and other countries have done it like this, but Taiwan's response and he disconnected his line. Oh, I did see that. Did you see that? Yeah. And then came back on and glossed over very quickly. He said, China's done a wonderful job.
Starting point is 01:04:29 Let's move on. And she was like, but Taiwan. And he's like, China's amazing and China this and China that. Well, John Cena, by saying that Taiwan was the first country that was going to see Fast and the Furious, pissed off China. And then John Cena made a video where he spoke Mandarin. And in it is like this weird video. You should watch it because you haven't seen it. Let's show it to him. Show it to him. Because he's apologizing to China in the weirdest
Starting point is 01:04:56 way, saying, I really, really respect China and I'm so sorry. And I made a mistake. I was very, very tired. So this is the perfect example of a kind of dystopia that we don't want to go to a future where people are all accommodating or can't feel or think their actual thoughts because they have to appeal to some source of power. Exactly. And the source of power is financial because $160 million was the opening weekend for Fast and Furious 9 and $134 million of it came from China. I had many, many interviews and in one of them I made a mistake. Everyone asked me
Starting point is 01:05:42 if I can use Chinese. People at Fast and Furious 9 gave me lots of interview information. I made a mistake. I have to say right now, it's so, so, so, so, so, so important. I love and respect China and Chinese people. I'm so, so sorry for my mistake. I'm sorry, I'm sorry, I'm very sorry. You have to understand, I love and respect China and Chinese people.
Starting point is 01:06:26 I'm sorry. That's it. He doesn't even say what he's sorry about. Right. I mean, this is wild shit. Yeah. When you see this guy who is, you know, one of our big major action movie stars.
Starting point is 01:06:39 Right. Just on his knees apologizing to China. Right. Who hadn't said anything bad about China. Not at all. All he did was say Taiwan is a country, which you can't say. And if someone talks about something
Starting point is 01:06:53 that's not the mainstream narrative in the tech companies currently. I can't believe you never saw that. I think I'd seen it in like a John Oliver video or something briefly. So I would add taste of it. But yeah. I mean, that's a good example.
Starting point is 01:07:04 That's where you get your news? I don't know, but I was working out sometime. Um, I, so this is a good example of, we don't want to live in dystopias where our thought and our ideas and our free expression and our ability to figure out what's true in an open-ended way. Cause we don't know what's true. Um, we need to protect that, but we also remember the last time I ended our conversation talking about Orwellian dystopias and Huxleyan dystopias, that quote about amusing ourselves to death. Orwell feared a world where we would ban books and censor information. Huxley feared a world where we'd be drowned in irrelevance and distraction. So that's kind of another version of the two gutters. Right now we're kind of getting a little bit of both. We're getting a
Starting point is 01:07:44 little bit of, hey, we don't like the way that the companies are doing this sort of censorship or deplatforming of people. We also don't want the unregulated virality machines where the craziest stuff and the most controversial stuff that confirms our biases goes viral, because both those things break society. Right. So let's get back to that again. What's the solution? Well, let me just make one narrow solution for that one, because it's funny because Frances, in her own testimony, says Facebook wants you to believe in false choices between free speech and censorship. There is a solution that Facebook themselves knows about for this particular problem, which is actually just to remove the reshare button, basically the retweet button,
Starting point is 01:08:23 the reshare button. What they foundweet button, the reshare button. What they found in their own research, Facebook spent something like a billion dollars or something, multi-billion dollars on integrity, content moderation, all that stuff. And they said in their own research, it would be more effective than the billions of dollars they spent on content moderation to just remove the reshare button after people click it twice. In other words, you can hit reshare on a thing and it goes to all your friends. And then all those friends, they still see a reshare button and they can click reshare and then it goes to all their friends. After that, there's no reshare button.
Starting point is 01:08:58 If you just remove the instant frictionless, like make my nervous system twitch and then boom, I'm resharing it to everybody. If you just remove that one thing, you keep freedom of speech, but you kill an irresponsible reach, just like instant reach for everyone. But you also kill the ability to retweet something or reshare something that's interesting. You could still copy and paste a thing and share it. You can still do that. Yeah. Okay. But I think that we have to ask, there's a story about Steve Jobs that I've referenced to several times. Someone, you'd appreciate this because it's about podcasts. Someone showed him the latest version of the podcast app and someone wanted to make on the iPhone. This is on the iPhone early days. And they're like, what if we made it so in the podcast app, we had a reshare button and you could see a
Starting point is 01:09:41 feed of all the stuff that your friends were looking at. I mean, it sounds like kind of, it's just like social media and it'd be really engaging and people would get sucked into podcasts. But Steve Jobs' response was no. If something is truly that important and truly that meaningful, someone will copy and paste it as a link and be like, you got to check out this interview with Joe Rogan, which I hope people do with this episode, because it's crossing a threshold of significance of what is truly worth our undivided attention, which is also the name of our podcast. We call it that. As opposed to just publicizing something and spending a bunch of money or doing a bunch of PR work.
Starting point is 01:10:15 Or creating influencer culture or just rewarding, again, the controversy and the conspiracy theory and the thing. And again, I shouldn't use that phrase because it sounds like you're always one-sided on it, but just the most kind of aggressive take on anything, the most cynical take on everything being rewarded. It's nowhere is it written that like a virality-based information ecosystem where you have the, people are familiar now with the metaphor of like a lab that's doing gain-of-function research and the idea that something could get leaked out of the lab, just like that as a metaphor. Well, today we have the TikTok Institute of Virology, we have the Zuckerberg Institute of Virology, and they're testing what makes the memes go as
Starting point is 01:10:51 viral as possible, right? They're like, hey, if we make it have this photo, if we present it this way, and if we have these reshare buttons, except their goal is to create these memetic pandemics. Their goal is to have every idea, especially the ones that most excite your nervous system and your lizard brain, go as viral as possible. And then what you can even say that the Zuckerberg Institute of Virology released this memetic virus and it shut down the global democracy world because now we don't have shared sense making on anything. But we do if everyone's intelligent and objective and they just, you know, they don't use the reshare button for nonsense the problem is that people are you know we were impulsive and we and we
Starting point is 01:11:30 also don't spend a lot of time researching a lot of things that we read you know if someone says her to be smug and yes I mean I have and they know that and they prey on that right they prey on hey you're you're right you should be or should totally reshare that thing she You should feel great about it. Well, it's also, you know, things are, if true, spectacular. And it takes oftentimes hours to find out something is true or not true. Right. Or we don't even know. Right.
Starting point is 01:11:57 It's very difficult to even trust. Actually, talk about how social media kills science. You want to do that construction? Well, scientific bias, right? We talked about that. If someone has kind of an emotional bias towards what they already generally think is true, even if they're following the scientific method well, what experiment they decide to do, what they go looking for will be influenced because there's a lot of things to look at, right? Am I trying to do science on natural supplements versus on drugs versus on vaccines versus like I'll have an intuition
Starting point is 01:12:29 that is the basis of a hypothesis or a conjecture that then I'll do the scientific method on, intuition can be biased. So a point that Tristan was saying earlier that I think is really important is that this model of Facebook and it's not just Facebook, it's TikTok, it's all of the things that do this kind of attention harvesting, which ends up, it doesn't intend to polarize. It's a byproduct. It's a second order effect, an unintended consequence. But in the way that the unintended consequence of cigarettes was, it had an externality, which was lung cancer, and oil companies had oil spills, and so then government had to regulate it.
Starting point is 01:13:09 These companies are fundamentally different because their externality is a polarized population, which in a democracy decreases the capacity of government directly. So the big oil companies and big pharma companies and whatever can do lobbying and campaign budget support and whatever they can to affect government. But their mode of operation is not directly decreasing the effectiveness of government. The mode of Facebook's operation directly polarizes the population, which polarizes the representative class, which creates more gridlock, which decreases the capacity of government, both relative to other nations that don't have that issue and relative to their own internal tech issues, right, and their own internal domestic issues. So it can't even regulate Facebook. In the example I gave at the beginning of the senator
Starting point is 01:13:51 who was going to meet with Francis but then couldn't because these stories that polarized them went viral. She should have shamed that senator. That's what she should have done. So that would have gone viral too. And that would have gone viral, and then he would have backed down because then his constituents would have been mad at him and go, hey, man. What's interesting is that Instagram doesn't have a share feature.
Starting point is 01:14:11 Yeah. I just realized that. The share feature isn't actually the key. TikTok is not really emphasizing shares. But Facebook is. Facebook is, but the key is virality, right? So share is one way to get virality. TikTok is just looking at engagement
Starting point is 01:14:25 and upregulating the things that get most engagement. This is actually a key point, because let's say that we tried to make a piece of legislation based on thinking it was about shares. Then Facebook would just move to the TikTok algorithm. Just by what you look at the most, that thing gets reshared to other people. And it gets viral based on unconscious signals as opposed to explicit click signals. So there's a deeper point here, which is not, is there a piece of regulation that we can put in, even if we trust the government to do that? It's, let's even say we had a very trustworthy government.
Starting point is 01:14:51 It's, can the government regulate at the speed that tech can outmaneuver it? Well, and here's the other question. If you didn't have any algorithms whatsoever, wouldn't you be now open to being manipulated by troll farms just simply by volume you know if they have a hundred thousand accounts at each individual location they have a hundred thousand locations and they're just pumping out different instagram pages and tiktok pay and we don't even really even know how many they actually have because they discover them like facebook shuts down two billion fake accounts per quarter i'm'm sure they get all of them. Holy shit.
Starting point is 01:15:25 And this is before AI. Yeah, before the GPT. You're obviously being sarcastic by saying they get all of them. They don't. No, no, yeah exactly. Just to let people know that we're totally paying attention. That's insane. 2 billion per quarter fake accounts. Yep. And did they Is there a centralized area where these are coming from is it all russian troll farms are they some of them political ones that are used against opponents like well one of the problems is that we don't know because actually that's not true facebook has been um i really want to celebrate all the positive moves they make by the way this is not just so i'm clear and we're clear like this is about finding what's an earnest solution to these problems right all
Starting point is 01:16:04 the times they make great decisions that are moving in the positive direction. We need to celebrate that. They do these quarterly reports, I think called like information quality reports or they publish every quarter how many accounts they take down. But it's just like a PDF.
Starting point is 01:16:16 Like they're just putting out a post as opposed to letting external researchers know, for example, in each country, what are the top 10 stories that go viral? How do they find out that someone's a troll post? I don't know. I mean, there's classifiers that they build. There's activity.
Starting point is 01:16:30 You can tell. Like, usually, if you've ever had this happen, you use Facebook and you click around a bunch, and then it says, like, you look like you're clicking around too much. Have you ever gotten one of those messages? No. There's occasionally, like,
Starting point is 01:16:40 a person can trigger a thing that makes it, they're like, are you real? And they're trying to figure out if you're real. If you click around around a lot you're not real there's because these bots a lot of what they're doing is they're going around harvesting information so they want to click around various profiles and they like download the information and there's like ways more than a person could be able to do just by clicking yeah it's a hard problem right because essentially because the tech is getting better at simulating the behavior of a human and then simulating. There are people who are proposing that social media and the internet as a whole needs rigorous identity layer. I would say that's a requisite need for where we're going.
Starting point is 01:17:14 I was thinking that about Twitter a long time ago. And we kind of have that with Facebook, but it's not rigorous, obviously. Facebook, but it's not rigorous, obviously. But Twitter, you can have jackmeoff69, and that's your Twitter handle with some weird Gmail account and then just post nonsense. And so there's no ability for justice in that system. There's also no ability and accountability. There's also no ability for the user reading somebody else's thing to know who they are, right? For the veterans group or whichever group to know, is this a Russian troll farm that is pretending to be a Christian or whatever? And especially, is this even a human or is this an AI in the very near future? So the ability to be able to know this is a human and this is actually the human that they say they
Starting point is 01:17:57 are is not an adequate solution, but it's an example of a solution. Now, this, of course, then creates other issues. One of the things we've liked about the internet is anonymity. Because if there's rigorous identity, who has access to that information? And am I now centralizing data? So how do you become a whistleblower? Right. Which is huge. Or how do you how does something like Arab Spring happen?
Starting point is 01:18:19 Right. How can you be safely anonymous? There's a whole decentralized community. There's a great movement called Radical Exchange run by Glenn Weil, and they're trying to create part of this third attractor, this like what's the center of the bowling alley that's a digital version of democracy. What are decentralized ways of proof of personhood? There's a project called IDENNA.
Starting point is 01:18:37 There's a bunch of things. People can look up Radical Exchange's work. It's part of a whole movement of which Taiwan is included, which is that I don't know if we really got to Taiwan is included, which is that, I don't know if we really got to the Taiwan example, but- We didn't, I don't think. We showed John Cena instead. Oh, that's right. That's right. Well, people need to get it because it's an example of what is working. It's a solution. It's a direction of how we can go, which is you can only fit so many people into a town hall to deliberate, right? So there's sort of a limit to our idea of
Starting point is 01:19:04 democracies kind of guided by ideas from 200 years ago. They've created a system called polis, which is a way of gathering opinions about various ideas, and then sort of seeking who wants more funding for various things, less funding for various things. And whenever there's an unlikely agreement, so they sample a bunch of people, so you sit over here, you sit over here, they get these clusters, like these people kind of like this, these people kind of like these other things. Whenever there's an unlikely agreement between those clusters, in other words, consensus, rough consensus, that's what they sort of boost to the top of that system. So everyone's seeing areas of common ground, common ground, common ground, as opposed to fault line of society, fault line of society, be more angry, join the common thread, et cetera.
Starting point is 01:19:42 And then you're invited into a civic design process where you actually say, hey, I don't like the tax system. And they're like, great, we're going to invite 30 of the people who were part of that rough consensus. We're like, let's improve the tax system. Let's talk about how we're going to do it. And they do a combination of in-person stuff. This is a little bit before COVID and Zoom calls, and then do like these mechanisms to kind of get an idea of where do people agree and then how would we make it better they've done this with air pollution they have a huge air pollution problem because of the lithography that they do with chips and things like this they do it with with budgeting they also have a transparent budget for the entire taiwan government so people can see like imagine if the u.s federal budget you don't know if you want to do this but lived on a
Starting point is 01:20:20 blockchain where you had transparency into all the you know what was getting funding funding. And that would create more trust in government because you could see essentially who's getting the big contracts and for how much. And that was more accessible. And there was a civic participatory process where more people could contribute and participate in identifying areas of inefficiency. You could even imagine a place where citizens could get rewarded by saying, hey, this is inefficient. We could do it better this way. And if you identify places where it could get more efficient, you could get money or resources by making the whole system work better for everyone. If you ran a current audit of the US government through blockchain, you'd have a goddamn revolt. They would go, holy shit, this whole thing is corrupt. This is infested down to the roots. And that would be a real problem. And I think
Starting point is 01:21:07 the Nancy Pelosi's of the world would really have a hard time with that. 19.20 I heard some clip that you did where you were talking about your pot thoughts of people being in big buildings and the pipes everywhere. And just like how weird some aspects of civilization are. So think about how weird democracy is. Like as an idea, the idea that you can have some huge number of people who don't know each other, who all believe different stuff and want different stuff, figure out how to actually agree and work together as opposed to just tribalize against each other and do their own thing. Like it's actually a wild idea to think that that would be possible at any scale, maybe a tiny scale where they can all talk. Well, that's when it started. It was a tiny scale.
Starting point is 01:21:45 Right. And we've always had a scale issue, right? In 1776, you could all go into the town hall and fit. And so I wasn't just hearing a proposition that a special interest group had created and I get a vote yes or no, which will inherently polarize the population, right? Very few propositions get 99% of the vote. They get 51% of the vote because they benefit something and they harm some other things. And the people who care about what gets harmed are fighting against it. That polarizes the population against each other. Social media then just drives that like crazy. Just like Facebook saying, hey, this is a conversation about censorship or free speech. And boom, you just split the population in half. As opposed to, hey, we all agree we could do a little
Starting point is 01:22:20 bit less virality. We could stop the teenager use in these ways and we'd be better for everyone. The proposition creation isn't designed to polarize the population. It just does. Because as soon as you get beyond the scale of we can all actually inform what a good proposition would be by being in the town hall having a conversation. Just define a proposition. I mean, not everybody knows what a proposition is. Something you would vote on.
Starting point is 01:22:39 What's a good way to go forward? Before we make a choice on what a good way to go forward is, we have to do some sensemaking of what is even going on here. Like what are the values? What's going on here? That was – so the point was a conversation. That happened at a smaller scale. Also, if you had a representative, the level of tech at the time was something that a very well-educated person could understand most of, right? They could understand a lot of the tech landscape. Obviously, we're in a situation now where the scale issue of democracy has been completely broken. So almost nobody – we're supposed to have a government of form by the people but nobody really understands the energy grid
Starting point is 01:23:13 issues or first strike nuclear policy or monetary policy or anything like that. And everyone's voice can't be heard, right? Now, what Taiwan was working on is, is the tech that is particularly in the West breaking democracy, could that same tech be employed differently to actually make 21st century democracy more real? So the same AI that can mess up the information landscape for everyone, could we use that type of AI that understands language to be able to see what does everyone think and feel, and actually be able to parse that into something we can understand. So there's an online environment that says, here's the distribution of people's values.
Starting point is 01:23:48 Here's the various values people care about. Here's the emotions they have. Here are the kind of facts about it. And then is there a place where we can actually craft propositions together? So there's a way to make it to be able to utilize these same tools to make democracy more realized, to make collective intelligence more realized. But right now, as we were saying, autocracies are working on employing these tools. Corporations are working on it, both of which are top down. Democracies really aren't. Outside of Taiwan and Estonia and a few small examples.
Starting point is 01:24:21 What would be the incentive? Who would be incentivized to use that other than the people? And it's pretty clear that the people don't have control over Facebook, don't have control over Twitter, certainly don't have real control over the government. You know, you have control over elected officials who it's almost universally agreed will lie to you to get into office and then not do what they said they were going to do, which is the standard operational procedure. So what's the incentive and how would these get implemented? So again, at a small scale, 1776, your representative couldn't lie all that much because everybody, they lived in the same town, right? And you could all go see what was going on. And so can we recreate things like, as you were mentioning, people would freak out if they could
Starting point is 01:25:03 actually see how government spending worked? Can we create transparency at scale? Can we recreate things like, as you were mentioning, people would freak out if they could actually see how government spending worked? Can we create transparency at scale in a way that could create accountability at scale? We could. Could we have places where there's direct democracy and people can actually engage in the formation of what a good proposition is, not just voting yes or no on a proposition or yes or no on a representative? Can I stop you there? How would you say we could? How would you do that? How would you say we could? How would you do that? How would you have that transparency? And who would be incentivized to allow this transparency?
Starting point is 01:25:31 If the transparency has not existed up until now, why would they ever allow some sort of blockchain type deep understanding of where everything's going? I don't think they are incentivized, which is actually why this show is interesting. Because if we're really talking about a government of, for, and by the people, where the consent of the governed is where the power of government comes from, like ultimately, if, and the founding fathers said a lot of things about that the government will decay at a certain point, particularly when people stop taking the responsibility to actively engage. And so if tech has incentives to produce things that are catastrophically problematic for the world, and we need to regulate that somehow, and the
Starting point is 01:26:19 issues are too complex for individuals to understand, so you need institutions. But how do you make sure the institutions are trustworthy? We have to create new 21st century institutions, but they have to arise of, for, and by the people, which means there's a cultural enlightenment that has to happen, right? People actually taking responsibility to say, we want institutions we can trust, and we want to engage in processes of recreating those. Well, how do you get people to be enthusiastic about some sort of a radical change like that other than some sort of catastrophic event like a 9-11? Well, this is why we're talking about all the impending catastrophic events is to say we don't want to wait till after they happen. Maybe it has to do it before something happens.
Starting point is 01:27:00 It would be, but it seems like that's the only way people really change the way they think about things is something, some almost like cultural near-death experience has to take place. Well, it's like the problem of humanity is paleolithic emotions, medieval institutions, and godlike tech. One of the paleolithic emotions is it can't be real until, oh, shit, it actually happened. Right. But the test is we are the one species who has the capacity to know this about ourselves, to know our Paleolithic emotions are limited in that way, and say we're going to take the action, the leap of faith that we know we need to do. We're the only species that can do that.
Starting point is 01:27:36 If a lion was in this situation or a gazelle, they can't understand their own mind and realize they have the one marshmallow mind or the short-term bias or recency bias. They're trapped inside of their meat suit. This is a beautiful idealistic notion. However, in real world application, most people are just fucking lazy and they're not going to look into this and they're not going to follow through. And this is why most people that really study tech-mediated catastrophic risk are not very optimistic. And they think things like, we have to chip human brains to be able to interface with the inevitable AIs, or we have
Starting point is 01:28:10 to have an AI overlord that runs everything because we're too irrational and nasty. And the question is, there's always been a distribution of how rational people are and how kind of benevolent they are. And we have never, with that distribution, been very good stewards of our power, right? We've always used our power for war and for environmental destruction and for kind of class subjugation. But with the amount of power we have now, those issues become truly globally catastrophic. And this is the thing is like, and this is what almost every ancient prophecy kind of speaks to.
Starting point is 01:28:45 As you get so much power that you can't keep being bad stewards of it, either the experiment self-terminates or you are forced to step up into being adequate stewards of it. So the question is what would the wisdom to steward the power of exponential tech – what would the minimum required level be? And that's like the experiment right now. That's the opportunity for us. The opportunity, but you're talking about a radical shift in human nature. Well, it's a possibility. In human conditioning. Why don't you give some examples? Okay. So we can look at some cultures that have certain traits quite different than other cultures as a result of the conditioning of those cultures more than as a result of the genetics.
Starting point is 01:29:28 We can see that if you look at Jains. What are the Jains? The Jains are a religion that is highly emphasizing nonviolence, even more than the Buddhists. Where are they? Asia. They won't kill plants. They only eat fruits and things that come from plants. So you can see a whole population that won't hurt
Starting point is 01:29:45 bugs based on conditioning across the whole scope of human nature, the genetics in it. You go to a larger culture like the Buddhists and you can see that for the most part, you've got like 10 million plus people over 3,000 years in lots of different environments that mostly don't hurt anybody, including bugs, as a result of the way they condition people. Education, conditioning, culture. You can see that Jews have had historically a level of education that is higher than the embedding society that most everyone around them has as a result of investing in that. And so we're like, can cultures value certain things and invest in developing those in people? It doesn't mean that everyone suddenly has the wisdom God's to match the power of God's but
Starting point is 01:30:27 can we create a gradient that's like this is where there used to be this concept I'm building what's it I'm sorry I'm hearing what you're saying yeah but I don't see it yeah I'm hearing what you're saying like idealistically yes but I don't see the motivation for the shift. I feel like this is – it's a big ask. It's a big ask. And a big ask has to come with some sort of a master plan to get people to shift their perspective. Well, if you take a look at the like attractor of catastrophes and the attractor of dystopias, those are the likely ones. Right.
Starting point is 01:31:02 But we don't see it. Like people don't give a shit until it's happening. Which is why one of those two will probably happen. Yeah. Well, and with social media, they do see it. I think there's a unique moment. The reason I thought this would be an interesting conversation with the three of us is that social media has become the case.
Starting point is 01:31:18 Like, we can now all see it. We can now, I mean, it took, unfortunately for some people, seeing the receipts, which is what France has provided to things that we all predicted back, you know, unfortunately for some people, seeing the receipts, which is what France has provided, to things that we all predicted back eight years ago. But now people understand that that is a consequence of unregulated exponential technologies that are steering people at scale, making things go viral at scale and dangerous at scale. So that's a case we can now see that thing. Can we leverage the understanding of that to realize what bigger thing needs to happen? That doesn't mean – yeah. Before we get to the incentive, just imagine as a thought experiment for a minute that Facebook changed what it was optimizing for.
Starting point is 01:31:53 Because Facebook is this 3 billion person AI optimized behavior machine, right? Like that's a huge – it's not like normal companies. And it's important to understand that. And it's optimizing for engagement, which usually ends up looking like outrage, desire, addiction, all those types of things. But let's say that we – could we assess for are people being exposed to different ideas than the ones they're used to? Are they actually uptaking those ideas? Are people expressing ideas that have more nuance and complexity? And you were actually upregulating for those things.
Starting point is 01:32:26 There's a lot of actually quite constructive content on the internet. And imagine that you could actually personalize development and education. This is why you started to say, Winchester was saying what China is doing, where the kids are seeing museum and science experiments and patriotism. You're like, yeah, that actually kind of makes sense. And it makes sense, but it only makes sense when you have an autocratic government that has complete control over the corporations and their motivations. Like if the corporation's motivations were specifically designed to rake in the most amount of profit, like Facebook's is, you'd never be able to trick them into doing that. There's no way. They'd be like, fuck you.
Starting point is 01:33:02 We're not going to do it. That infringes upon our rights as a business to maximize our profits. We have an obligation to our stakeholders. They would never do it. And we can see how the government took major corporations that had such an effect on society and made them public utilities or regulated them in the past. Yes. Now, could we have a situation where because of conversations like this, there was enough bipartisan public demand that rather than being totally polarized as a representative class, the representative class had to unify to say, actually, these platforms are so powerful that they can't be harming our citizens and harming our democracy. We actually have to put some regulation not on who gets to, but what gets radically upregulated. Right, but the problem is the way they would do it is the same way they do the Build Back Better bill, where it's 40,000 pages and no one can read the whole thing.
Starting point is 01:33:52 And inside of it, there's a bunch of shit about how they can spy on your bank account. And it locked you down if you spend more than $600 and you have to go to a committee to decide whether or not you get your money back and make everybody scared and paranoid. I mean, this is the kind of behavior that our government engages in on a regular basis. This is not just a big ask for us to get people to be motivated to make this radical change, but it's a big ask to the government. It's like, hey, you fucks have to start being honest now. And that's not going to happen.
Starting point is 01:34:23 Yeah, it's a tricky proposition. It's not just tricky. It changes the way the government has been treating human beings through every single day of our lifetime. So do you trust Facebook to hold this power? Do you trust the government to hold it? No. Do you trust individuals to be resilient against all of this power pointed at them that is so radically asymmetric? More that. More that. More I trust people to wake up to the fact that you do have control over your news feed. You don't have to look at it.
Starting point is 01:34:54 You do have control over what you share and retweet. You should be more objective about the information that you consume, you should try to find fact-checkers that are independent and are unbiased and are not motivated by financial means. There's fact-checkers that are clearly connected to parties. We know this. So I could similarly argue you're trying to ask too much of human nature. It's way easier than that.
Starting point is 01:35:21 Ask that of the government and ask that of corporations. At least human beings don't – they have a personal understanding of the consequences of what they're doing, and they don't have this diffusion of responsibility that both government and corporations have. The thing about the diffusion of responsibility is one person in a corporation doesn't feel evil when the corporation dumps pollutants into some South American river. But that is happening and it is a part of it. But when an individual takes responsibility for their own actions and if we can somehow or another coach or explain or educate individuals about their consumption and what kind of impact their consumption has
Starting point is 01:36:06 on their psychology, on their future, on the way they view and interface with the world, that could change. The reason why these algorithms are effective is because they play to a part of human nature that we don't necessarily have control over, what we like to argue over, what we like to engage with. You know, I brought this up on your podcast before, but I'll bring it up again.
Starting point is 01:36:28 I have a good friend, Ari Shafir, and he ran an experiment on YouTube where he only looked up puppy videos. And that's all that got recommended to him. The problem with the algorithm on YouTube, except for what you were talking about before with the QAnon thing, that's fucked, is that it accentuates the things that people are actually interested in. But when
Starting point is 01:36:49 Facebook fucks up, when they do something like that where someone just invites people into a group and you can mass invite, I'm assuming through some sort of a program, right? They're not doing it individually one by one. So if some QAnon group mass invites a million people and then is all of a sudden distributing disinformation to that million people, then you've got a problem with the company.
Starting point is 01:37:11 There's a problem with the way that company distributes information, because you're not allowing people to make the decisions that they could make to clean up their algorithm, to clean up what they get influenced by, to clean up what their news feed looks like. That's a problem, because it's not as simple as saying, you're giving people choices and this is what they choose. No, you're allowing someone to radicalize, like intentionally radicalize, people with disinformation, whether they're willing participants or it's unbeknownst to them. Yeah, and we don't want the Nestle Coca-Cola vending machine in the preschools, because do the kids actually have the ability to win the two-marshmallow
Starting point is 01:37:53 experiment in the presence of that much advertising? Do they understand marketing? We want to spot asymmetries of power. And the challenge here is, the asymmetry is, I've got a trillion-dollar-market-cap company that has observed 3 billion people's behaviors and click patterns. So I know more about people than they even know about themselves. Yuval Harari gives this example: he's gay, and his partner, like, makes two clicks on TikTok and it knows exactly what he wants, right? When you have that degree of asymmetry, and it's designed with that much power on one side of the table, I mean, a system is inhumane if the power is that asymmetric, right? So if it's influencing me more than I understand my own brain, like a magician, that's
Starting point is 01:38:34 that's not gonna we're not gonna be able to get out of that because if it's playing to my confirmation bias and i don't know that i have confirmation bias i'm just run by confirmation bias that's a form in which i'm essentially a foot soldier in someone else's culture war. If it's playing to my social validation, I don't know that it's playing to my social validation. I don't even know I have a thing called social validation, that that's an exploitable part of me. That's an asymmetric interaction. So I mean, you're right, by the way, as a part of an ecosystem of solutions, we do need a cultural, I mean, Daniel calls it a cultural enlightenment, but you can just simply say we need a mass education about how technology influences us that matches. Everyone who uses social media deserves to know how it works.
Starting point is 01:39:12 And the Carnegie Endowment did a sort of meta-analysis of the problem of misinformation. If you look at, like, 100 organizations surveying how we deal with this problem, I think, like, 98% of them said the number two result was: at least do digital literacy education for everyone, right? Everyone should understand more about how this stuff works. So, to your point, we should be educating everyone to be better stewards of their own attention, their own information consumption. But when you have bad actors that are manipulating this stuff at scales and at levels that people don't understand, and we're about to have GPT-3 printing, you know, basically full research papers that justify everything you've ever believed,
Starting point is 01:39:50 that's not going to be an adequate solution. So we have to change something at the systemic level. The question is just, how do we get that to happen? Yeah, no solutions, right? Well, if we're willing to, I mean... You know what I'm saying? Like, we have these ideas of what needs to happen, but this is like saying, what we need to do is get everybody to stop eating processed food and exercise regularly and only drink water. Well, you can get like 10 people
Starting point is 01:40:19 to do that. You can get highly motivated people to do that. You can get really intelligent, really conscientious people that are considering the impact that their choices have on their future. But that's not normal human nature. And also you're dealing with the fact that most people are very unhappy with what they do for a living. They're very unhappy with their lives, their personal lives. There's a good percentage of people that are not happy with most aspects of their existence. Well, they seek distractions. They seek distractions, and those distractions might be the only comfort that they have: arguing about global warming with people online, or arguing about Second Amendment rights. Like, we've got to take into consideration the motivation for people to engage in these acts in the first place. And a lot of it is just that they're very, very unhappy with their existence.
Starting point is 01:41:17 So that's why they get trapped doing this. When you talk to someone who's like, hey, I realized that I've got to get off of social media: I don't do anything anymore, I wake up in the morning, I have fresh-squeezed vegetable juice, and then I go on a nice long hike... those are rare fucking humans. They do exist. But the idea that we're going to change the course of the vast majority of our civilization and have most people behave in that manner is very unrealistic. So this is why... now add increasing technological automation and radical technological unemployment to that. Meaning automating more of our jobs, et cetera.
Starting point is 01:41:55 Right. Is that good or bad? Does that make people less happy or more happy? Well, for the most part, it creates a huge, radically unemployable underclass, whereas at least in feudalism you still needed the people to do labor. Now you won't need the people to do labor. So this is why there are a number of people in the upper wealth class who believe in universal basic income, because it's at least the cheapest way to deal with those people. Now you add the metaverse to that, and this is the entry into the matrix world, right?
Starting point is 01:42:23 Right. Now this is where we have to get to, because that's what I'm really worried about. What I'm really worried about is that the solution will be to disengage from the actual material world, to find yourself almost completely immersed, and do whatever you can to just feed yourself and pay for the metaverse, or whatever it is. Whether it's Zuckerberg's version of it, which, by the way, I saw a fucking commercial for, which is so strange.
Starting point is 01:42:48 It's a bunch of, like, incredibly diverse, multiracial kids. And they're sitting around bobbing their heads to a fucking lion or a tiger that's talking to them and dancing. Have you seen that? I haven't seen it. Please find that. Cause it's like,
Starting point is 01:43:02 what are you selling? The fuck are you selling? Like it doesn't even show what you're selling. It's like this weird change from Facebook to meta, right? And so it's showing this ad. It's very attractive. It's interesting. You see all these people.
Starting point is 01:43:16 They look cool. And they're all bobbing their head. And then the tiger's talking to them, telling them anything is possible. You're like, oh, cool. Anything's possible. But you're watching. You're like, what are you saying? Like, I don't even know what you're saying.
Starting point is 01:43:28 Like, what is this? Like, watch this, because it's so fucking weird. It's not loading. Of course not. Too many people are connected to the metaverse. It's failing. So it's the same thing, but different message. I thought when the audio started that was weird. Go ahead.
Starting point is 01:43:45 But see, it's the same thing. It's like cultivated, multiracial, multiethnic groups. This is the dimension of imagination. So the toucans are dancing, the pelicans are dancing,
Starting point is 01:44:03 the tiger lets go of the buffalo. Look, everybody's bobbing their head. No one's going, what the fuck is going on? They're all bobbing their heads, right? But what is this? This is going to be fun. No, it's not. We're fucked.
Starting point is 01:44:32 This is not going to be fun. This is a trap. This is a trap. They're going to lure you into this, and you're not going to give a shit about your regular life anymore, because it's going to be so much more exciting. And next thing you know, they're going to say, listen, it'd be much more involving if you put a gasket in the back of your head and they could just connect you straight to a pipe. And then you're in the matrix.
Starting point is 01:44:53 So, a few things to say. The competition for attention and the attention economy was always about the metaverse. We've been living in the metaverse for a long time, because it's about how do you capture and own and control people's personal reality. That's what Instagram, Facebook, TikTok, YouTube, the whole thing, that's what these things are. One of the things that this makes me think about that's subtle, actually, and I know you've had Jonathan Haidt on the show talking about teenage mental health problems: when you look at when self-harm and depression start ticking up in the graph for the kids, the 13-to-17-year-olds, there's a point in a specific year period where that ticks up. And you know what that year was?
Starting point is 01:45:32 It was like 2009 to 2011. What changed in that period? The iPhone. The iPhone. And then social media. We had social media before that. Right, but we didn't have access all the time. What changed is when it went on mobile.
Starting point is 01:45:42 Right. Now, what changes when it's on mobile? Because it goes from – What's that? You have it all the time. You have it went on mobile. Right. Now, what changes when it's on mobile? Because it goes from... You have it all the time. You have it all the time. It becomes your new 24-7 metaverse. I would say that it's the... When you virtualize people's experience so fully
Starting point is 01:45:55 and that virtual world doesn't care about protecting and nurturing the real world. When you virtualize people's relationships in a way that they don't protect, they don't care about nurturing your offline relationships when they virtualize your online relationships. In the same way that, you know, we have a technosphere that doesn't care about nurturing the regenerative capacity of the biosphere, we have a virtual reality that's not trying to protect and nurture the real reality that's underneath that. And it depends on that real reality. on that real reality. So if the economy depends on the earth working in this fundamental way,
Starting point is 01:46:27 and it's not trying to protect to make sure those fundamental capacities keep going through deforestation and so on, that's very similar to a virtual reality that's not protecting the social fabric that it depends on. It depends on that thing for it being higher. And if you want to say anything to that. Yeah. So why did Facebook buy Instagram and buy WhatsApp and the various things they did is because a monopoly of attention is the play. And a monopoly of attention is a really big deal to be able to get. But as soon as new devices come out, you're going to get attention in different ways. So AR and VR as new platforms, obviously, you've got to lead the way and having the monopoly of attention there and increasingly hypernormal stimuli. And the cell phone took us up to something like 50 screen time
Starting point is 01:47:06 from say 25 screen time with only the laptop the ar can take us up to like approaching 100 screen time or you know engagement time right and then persistent tracking of reality across all those domains so we can see why this is super problematic and and pernicious that was just speaking to how the metaverse is a natural extension of what they've already been doing right where's the middle lane we've got you know we've got the gutters on each side what's the middle lane there is oh I remember remember what Tristan was just saying about if the... Daniel, get on that microphone. If I have virtual relationships online, but they're actually debasing the integrity of my in-person relationship. So when we're talking,
Starting point is 01:47:58 we're actually looking at our phones. We would say, from the Center for Humane Technology kind of perspective, what is humane tech? One of the definitions would have to be, and Daniel was mentioning earlier that tech plus democracy has to make a better democracy: similarly, if you want to think about what humane tech means, tech plus any of the foundations of what it means to live a meaningful human life and a sustainable civilization, tech has to make those things better. So tech plus families has to actually increase the integrity of families. Otherwise, it's fundamentally not humane. It's misaligned with what is foundational to being human. Tech plus democracy has to make better democracy. Tech plus individual human mental well-being. Right, but it's not, right? Tech plus democracy is debasing it fundamentally. Currently, but there are ways of actually, first of all, aligning and choosing your business models to be in alignment with that thing. So, I mean, not to give Apple too much praise, but they just added the Johnson & Johnson guy to their board, and they're choosing to go into health, when they could just say, hey, we're going to build our own maximize-engagement-machine metaverse thing. And I'm sure they're working on one. But they're choosing their business models, and their business model isn't maximizing attention. That's why, when
Starting point is 01:49:08 you use FaceTime, it doesn't have, like, here's comments, here's notifications, here's the hearts and likes and thumbs-up floating across the screen as you're using FaceTime, because you're the customer, not the product. Well, Apple's a fantastic example of what is possible when a company does have the most superior product in the market, right? Like, it's kind of widely acknowledged, when it comes to the phones, when it comes to the operating system that exists on the phones, and when it comes to the operating system that exists on the computers. And then there's the fact that Apple controls all of the hardware. The problem that Windows has is, you've got Lenovo and Dell and Razer, all these different companies that are making different hardware. And then
Starting point is 01:49:52 you have the operating system that engages with that hardware, but there's all these different drivers because you got different video cards. There's so many different things that it's very difficult for them to make this one perfect experience. Whereas Apple's like, you know what? What we're going to do is we're going to control all the hardware. So they make the best laptop they can possibly make. They make the best phone they could possibly make. And they've done such a good job with it that they have this massive, loyal fan base. And then through that, they decided, you know what?
Starting point is 01:50:21 We're going to give you the option to not have advertisers track you and apps track you everywhere. That is a wonderful solution. And when Tim Cook announced that, he said, we cannot let a social dilemma become a social catastrophe. They're going after the social media business model of surveillance advertising, and that's one of the steps. And that's a good example. Maybe they can do something with the social media. Maybe Apple can use the same idea that they have and the same ethics and create a social media app and we can all jump on it. Something with no algorithms, something that doesn't accentuate certain
Starting point is 01:50:55 likes or hates. A stronger move is, I mean, they're never going to do this, but imagine that they said, hey, we now know how to diagnose what the problem is. It's these sorts of engagement-based business models that have infinite virality, that treat us as the product and not the customer. And we know it's toxic for democracy. You know, we could just take it off the shelf. We could say those things don't exist on our shelf.
Starting point is 01:51:14 Here's a crazy move for you. It's a good time to do that now. They could do that. Now, here's the thing. If they did that, people would be cynical. They would say, wait, hold on. Apple is doing that only so they can basically keep a bigger monopoly on their App Store. Notice there's this whole lawsuit right now with Epic Games, and Facebook is trying to dial that up, because they don't want Apple to be this big
Starting point is 01:51:31 top-down, take-control force with their App Store. But these social media apps are free apps. Yeah. If they decided to say, listen, we think there's a real problem with these apps, so we're not going to make them available, they could simply do that. They could simply do that, and say, we're going to have something available where we don't have any kind of control over what your feed is. Right. And, you know, we were just talking about this last night. One of the things we talk about is that there's always a cynical take when someone takes an action, and there's an optimistic, good-faith take. The cynical take on Frances is, she's a whistleblower who's a secret operative, da-da-da-da-da.
Starting point is 01:52:08 The cynical take on the government wanting to regulate social media is, it's just because they want to take control. Or if the media is ever criticizing social media, it's just because the media is upset that they don't have a monopoly on truth anymore. There's a partial truth in each of these things. If Apple takes a strong move against these social media companies... and the privacy thing that you just mentioned, where they're now protecting people's privacy, they prevent cross-app tracking: there's an article saying that they make an extra billion per year out of that change. They make a billion per year. So the cynical person says, oh, they're just doing that so they can get more money for themselves. How do they make an extra billion per year off of that? Because somehow the extra advertising goes through their network, or something like that, because you're not using cross-app tracking through the other companies. Somehow people start spending more money
Starting point is 01:52:46 on their system. So now there's a cynical take there, but here's the move. If they wanted to prove that they're actually a good faith actor, this is your idea last night, they could take the billion dollars or even just a large chunk of it that's not legal fees and say, we're going to spend that billion dollars on solving the rest of the social dilemma. We're going to fund nonprofits that are doing digital literacy education. We're going to put $500 million into nonprofits that are doing digital literacy education and other sort of humane tech. We're also going to put another $500 million into R&D to do more features that help address the social dilemma and actually move our whole product ecosystem further in the direction of protecting society and not manipulating society.
Starting point is 01:53:27 That might be the only solution. If a company that's as massive as Apple, that has so much capital, I mean, they are literally one of the most wealthy corporations that's ever existed. But we would need a mass... If not the, right? Yeah. I think they're up there with Saudi Aramco. They may be the most...
Starting point is 01:53:42 They keep going up and down. But they would need a public will and a support base of people. And that's why your audience is really interesting also, because, like Daniel said, this is a "we the people" type moment. We have to actually ask, what is the best way out of this thing? There isn't an easy answer, right? It's not like, hey, we're going to just tell you, just do X and it's all over, we fixed it all. We have to navigate through this thing. So we have to find levers that are at least a little bit more attractive than other levers. This is one of them. Taiwan is another one. It's an example that works. We could, you know, Biden could invite Audrey Tang
Starting point is 01:54:13 to come to the United States and actually say, we're going to build a civic tech ecosystem. The decentralized web community that's building these Ethereum-based, new Web3 things could actually say, we're going to take this as the central design imperative. We're going to do digital democracy that helps us do the bowling alley thing and walk that thin tightrope that we've got to walk. These are the kinds of things that could happen, but we would need there to be a public zeitgeist that this has to happen. And I know it sounds dystopian if we don't do that. It's not an easy problem. It's not an easy problem, but one thing we can show is that people are happier and more successful if they follow
Starting point is 01:54:46 this path than the path of wanton destruction. Because we know that about alcoholics and gambling addicts, right? If you have an uncle that is an alcoholic and you see him, you're like, wow, I don't want to be like that guy. You learn. If you see someone just ruin their life with gambling, you go, wow, that's scary. I know a lot of people that are ruining their lives with social media. I know people where it's radically exacerbated their mental health problems. And I personally have had a great increase in my peace of mind by never engaging with people online. I know you told me not to look at the YouTube comments. I don't look at any comments, and I really don't.
Starting point is 01:55:23 And it's so much healthier for you. Oh my God, I'm so much happier. Right. It's incredible. I've told that to friends, and occasionally they dip their toes back in the water and then they go, fuck, why did I do that? And you know, they'll, they'll do an episode or maybe they don't like something that they
Starting point is 01:55:35 said, and then they go read, and I'm like, my God, man, get out of there. And I don't engage on Twitter. I don't engage in the comments on Instagram, and I don't even look at Facebook. And because of that, what I take in is my choice. Like, I look at things that I'm interested in. And most of my social media, it's not really social media consumption, but most of it is YouTube. And most of it is, like, educational stuff or complete distractions. What was the Thanksgiving study?
Starting point is 01:56:05 I was going to say, I was just thinking the same thing. There's a Thanksgiving study where, after 2016, they looked at zip codes that had the most political media advertising. And the more of that media you had, the shorter your Thanksgiving dinner was. They did this mass study tracking people's locations and how long they were at their Thanksgiving dinner location. And basically, in the places that were most bombarded with, like, polarizing media, Thanksgiving dinner was shorter. Because they argued?
Starting point is 01:56:34 And people, yeah, and people, I think, stood further apart or something like this. They actually had the geolocation from their phones too, right? The people who had right versus left views interacted less at dinner. Exactly, that was what it was: people with right versus left views interacted less at dinner. And we're about to head into Thanksgiving. And I actually would say that Facebook and Twitter, their business model has been ruining Thanksgiving dinner, because their business model is personalizing confirmation bias for everyone, so that when you show up... So in the
Starting point is 01:57:03 same way that that's an epitome of the problem, that's your personal version of the social dilemma. Like, we could also say, what would be the first step for each person listening to this, that we can do during Thanksgiving dinner? That's like putting our phones at the door and actually trying to have a conversation about the mind warp that's taken place. Yeah, it's hard, because when people get together and they haven't seen each other for a while, they want to argue about things that they feel the other person's wrong about, right? Because they've got so much of their time invested in these echo chambers. But you just mentioned something that was so interesting, which was, what if people started to understand that the echo chamber was affecting them and affecting the integrity of their family? So rather than try to save everybody on the other side from Trump or Biden or whatever it was that they thought, they did this other thing, which is they actually tried to save people from excessive social media exposure.
Starting point is 01:57:55 damaging, but it's very difficult. And I think it's got to be an across the board decision that you make. And it's got to be with your own health. It's got to be with relationships. It's got to be with honesty. There's got to be a lot of things that you do that you change. If we can influence people in any way that's positive, it's to understand where the pitfalls are. Where's the traps? There's a lot of them out there. Now, when we think about the social media issue to a degree, we can take the solution that you propose and just say maybe the individual can just remove themselves from it. We would argue that this is actually impossible population-wide currently because there are companies that just can't succeed if they don't market on there compared to their competitors. I'm not saying remove yourself from it.
Starting point is 01:58:42 That's not what I said. What I said is, don't engage in anything personal. Like, you can read people's thoughts on things. You can go and watch a YouTube video. You can stare at people's butts on Instagram. But if you get involved in engagement, that's when things get fucked up. The problem is that that is the only form of self-expression a lot of people have. When you're dealing with something that people think is a critical
Starting point is 01:59:10 issue, how do you express yourself? How do you get your point of view? If you think your point of view is significant, how do you get it across? Well, you have to engage. That's a problem. One thing I wanted to share: we interviewed Dan Vallone from an organization called More in Common on our podcast, and he does this work on what he calls the perception gap. What they found in their work is, the more time someone spends on social media, the more likely they are to actually misperceive what the other tribes believe. So, first of all, we get hit by a double whammy, because you're talking about participation on social media, and you could sit there looking at stuff but not participating. Well, it turns out the people who participate the most,
Starting point is 01:59:47 the extreme voices participate more often than the moderate voices. That's what they find in their work. And when they participate, they share more extreme views, so their stuff goes more viral. So we're looking at this weird funhouse mirror: when we think, oh, we're getting a sample of what everybody believes, we're not getting a sample of what everybody believes. We're getting a sample of what the most extreme people believe.
Starting point is 02:00:06 So if you actually ask, in their research, what percentage of Republicans would you estimate believe racism is still a problem in the U.S.? I think people estimate, like, 40% or something like that, and the answer is closer to, like, 65% or 70%. So we are misperceiving, because we're seeing through the stereotypes and the straw men and the bad-faith examples of everyone.
Starting point is 02:00:29 So part of this mind warp is, we have to actually, again, understand that we're seeing a very specific view. And even if, as a result of this podcast, like 50% of people on Facebook stopped participating, per what you just said earlier, the problem is that the small remaining group, the most extreme voices there, would be identified by the algorithm, and it would just maximally upregulate them. So we just have to realize what game we're in, what unfair fight we're in, so that we can unplug ourselves from the matrix. You called me Morpheus last time, I think, when I was here.
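To make the funhouse mirror concrete, here is a minimal toy simulation of the dynamic being described: extreme voices participate more and draw more engagement, so an engagement-weighted feed overrepresents them, and it gets worse when moderates go quiet. This is a sketch under invented assumptions; the population model, the 10x engagement edge, and every number here are made up purely for illustration, not Facebook's ranking system or More in Common's actual methodology.

    import random

    random.seed(42)

    def make_population(n=10_000):
        # "Extremity" in [0, 1]; most people sit near the moderate middle.
        return [min(abs(random.gauss(0, 0.25)), 1.0) for _ in range(n)]

    def feed_sample(population, k=1_000, boost=9):
        # Mimic an engagement-ranked feed: weight each voice by
        # 1 + extremity * boost before sampling (boost=9 assumes the most
        # extreme voices draw roughly 10x the engagement of the calmest).
        weights = [1 + e * boost for e in population]
        return random.choices(population, weights=weights, k=k)

    pop = make_population()
    print(f"true average extremity:        {sum(pop) / len(pop):.2f}")

    feed = feed_sample(pop)
    print(f"average extremity in the feed: {sum(feed) / len(feed):.2f}")

    # Now the most moderate half stops participating entirely:
    remaining = sorted(pop)[len(pop) // 2:]
    feed_after = feed_sample(remaining)
    print(f"feed after moderates go quiet: {sum(feed_after) / len(feed_after):.2f}")

Run it and the feed's average extremity lands well above the population's true average, then climbs again once the moderate half drops out: the funhouse mirror in miniature.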
Starting point is 02:01:03 Well, there's also a problem with tribal identity. And it's fucking silly that we only have two groups in this country. And because we really have broken it down to two political groups, we are so polarized. We don't have a broad spectrum of choices where you can say, well, I like a little bit of this, I like a little bit of that. And to be in the center is to be a fence-sitter and to inspire the ire of both sides. But there's many more people who are in the center than we think. Most people are in the center.
Starting point is 02:01:33 That's why this show works. Exactly. But most people don't think that because when they look on social media, they just see people at the extremes and they're like, am I going crazy? Is the world gone crazy? And the answer is, you're not wrong.
Starting point is 02:01:43 Your mammalian instincts that things are upside down, that's not wrong. But it's not because of some master global conspiracy; it's because social media is just showing us the craziest of the craziest voices on all sides. I just realized how much you look like Terence McKenna. Look at that. For real. It's kind of creepy. We both got the white and beard thing going. You got the glasses?
Starting point is 02:02:00 Yeah, it's true. For real. It's a real problem. You're like Terence if he was a little more buff. Sorry. Go ahead. So let's say that we could have a bunch of people get off social media. Yes. That's one of the exponential tech risks that we've talked about.
Starting point is 02:02:18 But that doesn't actually do much about the fragility that comes from decentralized drones. It doesn't do much about the fragility from decentralized cyber weapons. Right. Oh, you're a bummer, man. You had a little bit of a solution. Debbie Downer over here. Well, no, the reason I'm bringing it up is because an individualistic-only answer doesn't work, when other individuals and other groups,
Starting point is 02:02:34 small and large, have the capacity to affect everything so significantly. Right, but it does significantly impact the health of the overall population. If we're more healthy, mentally and physically, we can make better choices. But the next step is not just that we make better individual choices, but that those who can, work to make new, better systems. And so when you think about the founders of this country, they didn't just remove themselves from
Starting point is 02:02:57 believing in whatever the dominant British Empire thought at the time. They removed themselves from that and then said, we actually need to build a more perfect union. And they invested themselves radically to do so. And it wasn't a huge percentage of the population, but it was working to build something that could apply to a much larger percentage of the population. So we need some sort of a radical solution in terms of the way we interface with each other, the way we do business, the way we govern, the way we do everything. Yes. And so let's say you have people who start pulling themselves off social media and saying, I actually want to engage with other people where I really seek to understand their ideas. Before I just jump in and criticize, I want to make sure I get their values and what it's like to be them.
Starting point is 02:03:37 And so they first, they remove themselves from the toxicity. Second, they work to actually start making sense of the world better and being in better relationship with each other. Next, they say, I want to make a platform that facilitates this for other people. And then I want to come on Joe's podcast and talk about the platform and get a lot of people on there. So we start to actually get the beginning of a new attractor, a new possibility. Don't you put that out there. Don't you do it. Because then there's a lot of people that think they have the solution. What this sounds like is kind of a radical reboot of the U.S.,
Starting point is 02:04:08 but there's the January 6th version of that, which we don't want. There is a different version. I just wanted to tell you, in the Taiwan example, the way that happened is actually, a bunch of activists stormed the parliament, except they didn't try to break the glass and the windows and break through everything. They sat outside the parliament.
Starting point is 02:04:24 They brought in all these Ethernet cables, and they set up a Wi-Fi network, and they had a bunch of hackers build this alternative civic engagement platform where people could debate ideas right there using technology. So they did storm the parliament, but they didn't storm it to hurt people. They did it to create the better form of government. But to debate ideas in a system where, when unlikely consensus is found, that's what gets upregulated. So they were designing it so that the better angels of our nature are appealed to, rather than the lower angels of our nature.
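For a sense of what "upregulate unlikely consensus" can mean mechanically, here's a minimal sketch, loosely inspired by civic tools like Polis that came out of that Taiwanese movement. The scoring rule and the sample statements are our own invention for illustration, not the platform's actual algorithm.

    # Rank statements by agreement *across* camps instead of raw engagement.
    # A statement only scores high if both camps support it, so content
    # that thrills one side and enrages the other sinks to the bottom.

    def bridge_score(support_a: float, support_b: float) -> float:
        # support_a / support_b: fraction of each camp agreeing (0..1).
        # The weaker camp's support caps the score.
        return min(support_a, support_b)

    # Hypothetical statements with (camp A, camp B) agreement rates:
    statements = {
        "The other side is evil and should be banned": (0.90, 0.05),
        "Platforms should publish engagement data":    (0.70, 0.65),
        "Kids should get digital literacy education":  (0.80, 0.75),
    }

    ranked = sorted(statements.items(),
                    key=lambda item: bridge_score(*item[1]),
                    reverse=True)
    for text, (a, b) in ranked:
        print(f"{bridge_score(a, b):.2f}  {text}")

Sorting by min() rather than by total engagement is the whole trick: the divisive statement that 90% of one camp loves still scores 0.05 and drops to the bottom of the feed.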
Starting point is 02:04:54 And it's possible to do that. That's a real working example. I want people to really check that out. It's a real thing. We're not just pointing at a random idea. People do say, it's obviously a much smaller country. But it's not as homogenous as people think. They think Taiwan, they think everyone's the same. There are, I think, 20 indigenous cultures or languages there. So they actually have quite a lot of plurality. And they need that: democracy needs plurality, deliberation, and compromise. You have to have those three things working. Are you aware of the agent provocateur aspect of January 6th? Say more.
Starting point is 02:05:33 I don't exactly know what the reality is, but what people are insinuating is that there were federal agents involved in instigating the violence, instigating the entering into the Capitol, and that there's this one guy in specific that they've got isolated on video. They've shown him over and over again. He's faced no legal consequences. They know this guy's name. They know exactly who he is. All these other guys are in jail. All these other guys who got into the Capitol, I mean, so many of them are facing these massive
Starting point is 02:06:01 federal charges and four-plus years in jail. This one guy is like, we have to go in there. We have to take it back. We have to get inside there. And people start calling him a fed in one of these videos. And I think he takes off and runs away. But this is what it seems like. It seems like, and this is something that governments have done forever.
Starting point is 02:06:22 You take a peaceful protest. What's the best way to break up a peaceful protest? You bring in agent provocateurs to turn it into a non-peaceful, a violent protest. Smash windows, light things on fire. Then you can send in the troops and you can clean up the mess. And then you don't have any protest anymore. This was the World Trade Organization in Seattle in 99 or whatever it was. That's what they did.
Starting point is 02:06:46 It's been documented that that is what happened. I mean, like literal government agents went in wearing Antifa outfits. This is pre-Antifa, right? Smashing windows, lighting things on fire, and they were all eventually released conveniently. Well, this guy, do you know about this, Jamie? See if you can find it. released conveniently well this guy do you know about this Jamie you know see if you can find it because it's a curious case of this one particular individual who's like yelling in these various groups that we have to get in
Starting point is 02:07:14 there where any like he did it pre January 6th he did it during the January 6th thing and then this guy's faced no legal charges whatsoever people like what the fuck is going on here because when you see some kind of organized debacle like that and then you see people in insisting that we have to take this further and we have to go inside and then if you find out that those people are actually federal agents that are doing that you're like well what is happening here and how is that possible? And how is this legal? That's a problem. Yeah. I haven't seen
Starting point is 02:07:50 this one. I remember the umbrella man who was breaking windows at the George Floyd riots. I think they found out that that guy was a cop. And I think that was like a rogue human. But I'm not sure if that's true. So this is where it's interesting in this case.
Starting point is 02:08:07 I don't know the case at all, but is it that somebody in government actually initiated him doing it as an agent provocateur to shut down the protest? Or was he someone who happened to be in government who was himself radicalized, who acting on his own because of radicalization did the thing? Or is he an agent provocateur, but he's doing so independently just because he's a fucking psycho? You know, some firemen start fires. Right. But notice that whichever view you have, you probably had a motivated interest to see it that way.
Starting point is 02:08:36 Yeah. I didn't have any view on it. Right. That's the thing. I'm looking at it like this, like, what is it? What is this video? Yeah. I'm watching this guy, like this one big, beefy looking federal agent guy
Starting point is 02:08:45 telling them they gotta go inside and I think he was wearing a MAGA hat. And, you know, he's like a guy in his 50s and he's like, I'll tell you what we gotta do. We gotta get inside there. We gotta go inside the Capitol.
Starting point is 02:08:56 And these people are like, inside? Isn't that illegal? Like, what the fuck? This guy's taking it to the next level. But he's doing it, like, multiple times. There is a real problem with intelligence agencies doing that kind of shit. Totally.
Starting point is 02:09:11 Because they do do it. And I think they do it thinking, look at this group of fucking psychos. Like, we've got to stop this from escalating. So here's the way: we get them to do something really stupid, then we can put fences up and create a green zone, and then we lock this down.
Starting point is 02:09:31 Meet Ray Epps, the Fed-protected provocateur who appears to have led the very first 1/6 attack, the January 6th attack, on the US Capitol. So let's watch some of this, because it's fucking crazy. It's really weird. This guy is doing this, like, over and over and over again. There's a video of it, but this is an article about it. So this is an article that's in Revolver. Hold on, I'll get the video. We'll find the video, because the video is fucking strange. Ray Epps video.
Starting point is 02:10:03 Here it is. Well, that's 20 minutes long. Well, just watch. We'll see some of it. Oh, these are guys that are watching it. What about that one? It goes to a website. These are on Twitter.
Starting point is 02:10:20 Arrest Ray Epps. So some people are hip to it. But most people, like, including you guys, have no idea that this is a person, right? You never heard of this before. I don't know why it's playing a video. Oh, these fucks with the clicks. Sorry. Oh my God. Please log in. Log in. I want you to log in. We need to track you. God. One of the things that was so cool about C-SPAN was the idea of being able to actually see what was happening inside of proceedings. Yes. And we know that the idea of a modern liberal democracy is that we want to leave markets to do most
Starting point is 02:11:04 of the innovation and provisioning of resources, because they do a good job, but we still want rule of law, because there are places where markets will have a financial incentive for things that really harm everybody, like complete destruction of environments, or organ trades, or whatever it is. And so rule of law is intended to be a way that, if you have a government that is of, for, and by the people, and it's given a monopoly on violence, it can check the predatory aspects of markets, where the basis of the law, because of voting, is the collective values of the people. But the state only has integrity and can check the markets if the people check the state. And this was where, again, at a much smaller scale, it was easier to have transparency and being able to see what was happening. The
Starting point is 02:11:40 larger scale messed that up, and also having so many things that were issues of national security where it just can't be talked about. Then it becomes very hard to say, well, how do we have enough transparency that the people, even if they wanted to, could engage in being able to see what was going on, so that we could have trustworthy institutions? What terrifies me is if the solution to this is an autocratic government that controls all aspects of society so none of this ever happens. That scares the shit out of me, because that seems to be where... there's that fuck. Let's play this.
Starting point is 02:12:09 The Capitol. Tomorrow. Do it from the beginning. I don't even like to say it. Tomorrow. We need to go into the Capitol. Into the Capitol. Tomorrow.
Starting point is 02:12:19 Hear that what? I don't even like to say it because I'll be arrested. Well, let's not say it. We need to go. I'll say it. All right. We need to go in. Shut the fuck up, Boomer. To the Capitol. The president is not speaking.
Starting point is 02:12:33 We are going to the Capitol where our problems are. It's that direction. Please spread the word. All right. No, David, one more thing. Can we go up there? No? When we go in, leave this here?
Starting point is 02:12:48 Yeah, you don't need to get shot. Can you arrest us all? Who wants us? Who wants us? Who wants us? Okay, I think we've seen enough. There's a lot of instances. It goes on for quite a while. There's a lot of videos of this guy, which is really fascinating because I think these methods that they've used forever are kind of subverted by social media
Starting point is 02:13:16 because you have 100,000 different cameras pointed at this guy. When someone starts screaming loudly, people start filming it, and then you get a collection of these, and you can go, oh, what is happening here? Like, I don't think they realized that people would be so cynical that they would go over all these various videos and find this one guy who's not being prosecuted or arrested. He's not being prosecuted or arrested. Ding.
Starting point is 02:13:42 Congratulations. Yeah, I don't know. No, he's not. Look at that guy. Congratulations. No, he's not. Look at that guy. Yeah. I mean, if you had to guess, if you had like 50 bucks, what are you going to put your chips on, red or black? I might put my chips on the result of stochastic terrorism. If I was China, I would have wanted to infiltrate the Facebook group that guys like him were in and just radicalize as much as possible so that some of them were motivated to do it
Starting point is 02:14:06 earnestly. And so it was like some patsy, but I don't even know who it is. For sure there's some of that going on there. There's a lot of stuff going on with January 6th. And it's a lot of sad humans who don't have a lot going on in their life. Did you see the
Starting point is 02:14:20 Into the Storm? Is that what it was? The HBO documentary on QAnon? No. Did you see it? No. It's fascinating. It's really good. And it's a multi-part documentary series about QAnon and the people that are involved. And one thing you get out of it is that these people found meaning in this nonsense. They found meaning and they really thought they were part of something bigger than them
Starting point is 02:14:46 and it gave them hope and happiness. And what I got out of that is, well, these are, this is exactly what we were talking about earlier, the people that are getting sucked into this distraction life, is that most people don't feel like they live a meaningful existence. So when something
Starting point is 02:15:02 like this comes up and you get radicalized, whether it's by China or Russia or that guy, and he's saying... That guy's just basically incendiary, right? He's just throwing gasoline on the fire. But you're saying, is there something out there that you can connect to that's bigger than you? And they're saying, yes, there is. You can be a part of this group.
Starting point is 02:15:23 You can be a patriot. Are you a patriot? Do you want to storm the Capitol? And then you've got the fucking president who's saying, you know, we have to make a big movement. We have to do a big thing. They stole this election. You're like, holy shit. You know, we have to go there and it's a show of force. And then they pull them off of Twitter, and they're like, oh my God, the conspiracy is even bigger than I thought. Twitter's involved.
Starting point is 02:15:52 And it becomes something that is all-encompassing. It involves every aspect of their life. They wake up in the middle of the night to check Twitter. They take a leak and they check it, to make sure, you know, are we moving in the right direction? Has Q released a new drop? And these fucking people get completely locked into it. And at the end of this documentary on HBO, which is really excellent, I can't recommend it enough, you see a lot of them are realizing, like, this is all bullshit.
Starting point is 02:16:11 And they're like, what have I done with my life? There's a Reddit channel called QAnon Casualties, which is like people, especially who struggle with family members, who've fallen down different rabbit holes, and I guess that's one of them. And as people come out of it, just what happens. I have a friend who just reached out about that about his own wife. He asked me like, what can he do? Yeah. Well, I mean, I think what you're pointing to our friend,
Starting point is 02:16:31 Jamie wheel, uh, who's here in Austin, we had him on our podcast to talk about this. When we think about social media, a lot of times people think about it as a, as an information problem, misinformation, disinformation, it's actually about meaning and identity, which is what you're pointing to. People are getting meaning and purpose from a thing and it's therefore it's not a matter of like well let's just like tell people the true thing or the fact check thing there's a sense of meaning purpose narrative what i'm participating in that's that's bigger than myself that people are seeking and part of that which is exacerbated by social media because it's mass alienation and loneliness and those are exactly the kinds of people that can be pulled in various directions,
Starting point is 02:17:07 which includes also some of the decentralized ways that they can use those tools to cause havoc. Something I was thinking is, in the founding of this country, it was understood that both high quality education and a fourth estate, right, a kind of free and open press were considered prerequisite institutions for democracy to work. You had to have- You said what a fourth estate, right, a kind of free and open press were considered prerequisite institutions for democracy to work. You had that- You said what a fourth estate is? Journalism, right? Some kind of... But at that time, so both education and newspaper were the result of a printing press where you didn't just have a nobility class who had access to books
Starting point is 02:17:39 when books were really hard to get, but we could print a newspaper so everybody could know what was going on. We could print textbooks so everyone could get educated. If you could have a – at least that was the idea, right? If we have a population where everyone can make sense of the world, like they've learned how to make sense of the world. They've got history and civics and science and like that and they know what's going on currently, then they can all go to the town hall and participate in government. So it was – acknowledge it without something like a fourth estate, a shared way to make sense of the world together, democracy doesn't work. Facebook in particular is not just a destruction of the fourth estate. It's like an anti-fourth
Starting point is 02:18:15 estate. Rather than sharing something where everybody gets the same information to then be able to go debate, right now two different people will have Facebook feeds that have almost nothing in common and are polarized, right, identifying your fellow countrymen as your most significant enemy, and saying that everything they think is wrong and a conspiracy and a lie, or something like that. Right. But how do you rectify that and still have independent media? Right. So one of the things, as I said, that's interesting is that as we started to scale more, one of the things that newspapers, and then TV and broadcast, became able to do was scaled propaganda: give the same message to everybody. And there was this whole big debate in World War I, and then going into World War II, that democracy requires propaganda, because people are too dumb to make sense of the world adequately, so we have to propagandize them so they aren't fighting the war effort while we're at war. And one of the things that is interesting, just as a potential... and you'll say, yeah, but how do we get there?
Starting point is 02:19:12 Because how do you incentivize the Zuckerbergs or whatever? And the enactment is a real tricky thing. But we could use the tools of social media, which is the ability to personalize a huge amount of content to the individual, to actually make real democracy possible, where you don't need to give everyone propaganda because they're dumb. You can actually help people understand the issue progressively better in a personalized way: see how they're already leaning, expose them to the other ideas, see that they're understanding it. And you can imagine that real democracy could actually be possible at scale, if you could personalize the kinds of education and civic virtue that would be necessary for people to engage in a democracy. Let me add on to that, because this example you just showed me, right, with this guy, I had never seen that video. Imagine a Thanksgiving dinner
Starting point is 02:19:59 happening a few weeks from now, where one set of people have all been exposed to this guy, and this is, like, the central way that they see January 6th: through the lens of that guy. If you're in one of the other filter bubbles, all you see is just the violent, crazy, whatever, the right... You're not even operating on a shared reality. So when you talk about January 6th, normally, if we have a shared printing press, or we have a shared fourth estate, we've at least been exposed to some of the same things. Yeah. But when you show up at that Thanksgiving dinner table, when we argue about January 6th... And you haven't seen something. You haven't seen it, but you assume... Our brains are not built for this. Part of our paleolithic emotions
Starting point is 02:20:34 is that we were built to assume: my brain constructs reality from my eyeballs, so I was built, evolutionarily, to assume that your brain is constructing reality from some of the shared stuff that I'm seeing with my eyeballs. So all my biases are to assume other people are talking about the same reality. And there's a little bit of a magic-trick optical illusion, because we both saw, quote unquote, January 6th, but we were exposed to completely different media sets. So now, when we get in a conversation, it completely breaks down, because we're not actually even talking about the same thing, but we don't even get to that layer of detail. And one of the things, in a humane technology world... I think I mentioned to you, in the More in Common research, they found that the more you use social media, the more likely it is that you are not able to predict
Starting point is 02:21:17 what someone else's views are on a topic. You think all Republicans are racist or something like that if you're on the Democrat side. Or, if you're on the Republican side, you believe that all Democrats are LGBTQ, when only 6% of Democrats are LGBTQ. So we are far off in terms of our estimations of other people's beliefs. And in the current world, the more you use social media, the worse your estimations are. In a humane future, the more you use social media, the better our shared understanding, and my understanding of your understanding, would be. And so you can imagine there's some sensemaker out there who's showing both sides of these different filter bubbles and helping us bridge-build, so we're actually even able to have a shared conversation. Those are the kinds of people that Daniel was just talking about
Starting point is 02:21:56 would get kind of upregulated to be at the center of our shared undivided attention. Let's say I wanted to ask, how do I increase trust in our institutions that are processing things too complex for an individual to figure out on their own, like the reality of climate change or COVID? Well, let's say that, C-SPAN-like, I had debates happen inside those institutions, where people who had real expertise but had conflicting views had a long-form, facilitated debate. But not the type of debate that is just oriented towards rhetoric and gotchas to try to win; one that is earnestly trying to seek better understanding. And there's a facilitated process, and the people agree to it.
Starting point is 02:22:36 One of the things they agree to is: what would I need to change my mind about this? If the answer is nothing, then you don't even engage in the debate. If we can't even say what would change our mind, we're not really responsible participants of a democracy, because we're not really open to it. And each of the debaters has to read each other's content first, and agree to a facilitation process that's long-form. And we start with: what do we all agree on? Say we're looking at climate change. What do we all agree on? That means both around the values that are relevant and the facts of the matter, so that when we go to what we disagree on, we know what our shared basis to derive a solution looks like. Then we try to formalize our disagreement.
Starting point is 02:23:12 I believe X. I believe not X. And we say, what would it take to solve this? Do we need to do a new piece of science? Do we disagree because of an intuitive hunch or a different value? We could do a much... Or because Al Gore flies in a private plane, or... there's these common cynical narratives also that get in the way, right? Because we're all just like, oh, well, it's just this one thing, or Ben Shapiro says, oh, the media doesn't like social media
Starting point is 02:23:30 because it's losing its monopoly on truth. Partial truth, but not complete. Go on. But we could have people who had different views, but were earnest and wanted to know what was true more than to hold their own view, be able to engage in a process that could bring us to: what is shared knowledge? Where are there disagreements? What is the basis of the disagreement? What would it take to solve that? And actually have that be what is informing the institutions, and all of that be transparently overseen. That would be a huge step in the right direction. There are groups like Braver Angels and Search for Common Ground. In fact, my organization, we are actually standing up this thing called the Social Cohesion and Technology Council.
Starting point is 02:24:08 I think I got the name right. We're taking these groups, like Braver Angels and Search for Common Ground, because this is what they do. They actually run these civic processes with people who come from very different perspectives. And it takes multiple hours. They bring them together for facilitated conversations. But what we're doing with this council we're putting together is actually matching them with technology designers, so they can take the lessons of, how do you do conflict mediation? How do you get shared reality from very different
Starting point is 02:24:30 identity-held realities? And then also apply that to how you design technology. Because you can imagine a world where Facebook's like, oh, do you want to see more bridge-building content on January 6th? Do you want to see more bridge-building stuff on climate change? And you could imagine sorting for that thing, right? They could design it very differently. Right, but you would have to change people's intuitions and change human nature, because human nature is to seek conflict. Most people seek conflict. This is a fundamental question about how we view human nature.
Starting point is 02:25:01 It is true that the worse vices, the worse devils, or whatever you call them, the worse angels of human nature are there within us. Right. But if that's what we assume is true, that that is the full story
Starting point is 02:25:11 of who we are when we look in the mirror, then this story is over and we should just go do something else. No, no, no, no, no. We should just go and drink margaritas
Starting point is 02:25:17 and order a beer. No, no, no. I just think most people live these sort of unrealized lives. You have a giant percentage of the population that is disenfranchised.
Starting point is 02:25:29 Totally. And they're angry. And they look for things that make them angry. Yes. So I think we have to address it at the root level before we address it even at a social media level. That's why you had Johann Hari on, saying the opposite of addiction is not sobriety,
Starting point is 02:25:43 it's connection. Yes. People need meaning, purpose, connection. And you can imagine a world where social media is like, hey, here's some drum circles or dance events, obviously post-COVID or whatever, but social media could be steering us. We're making life choices every day, right? Now, when you look at a screen, it's basically allocating decisions of where time blocks are going to land in your calendar, right?
Starting point is 02:26:00 Most of those time blocks are like, spend another five seconds doing this with your phone. Right. But imagine social media becomes a GPS router for life choices, where it's actually directing you to the kinds of things that create more meaning. Now, of course, the deeper thing is work: inequality, and meaning not existing in a lot of the work that people do. Agreed. And relationships. Yeah, absolutely. Right.
Starting point is 02:26:20 I mean, who are the angriest people? Incels, right? When people get really angry, you accuse them of being incels. But we can imagine a world that facilitates ways for people to, you know, go to dance events together, where they meet other people in a more facilitated environment. As opposed to, you're going to sit there at home and, like, let's just get you swiping, and Tinder's profiting from attention the way casinos do, so you match and then you never message someone, right? Like, they profit from just that machine working that way. Right. And then we also have the emergence of the metaverse, where people are just going to be more incentivized to go into that because it's going to be very exciting. Which is why, in a humane future, the online world has to care about actually regenerating the connective tissue of the offline world.
Starting point is 02:26:56 If it doesn't do that, it's not going to work. Apple could be in a position to do that. You take it back to something similar to people exercising and not eating too much sugar, because too much sugar is a hypernormal stimulus, right? You take the sugar, fat, and salt from evolutionary food, which are the parts that create the dopamine hit, and just make fast food out of it. And in the same way, what fast food is to real food is just the hypernormal stimulus. That's what porn is to real intimacy. That's what dating apps are to actually meeting people. It's what social media is to real human relationships. It's kind of just the hypernormal
Starting point is 02:27:29 stimulus. We know that GDP is not a good metric of the health of a society, because GDP goes up when war goes up. It goes up when addiction goes up. On the question of what is a good measure of the health of a society, one metric that I like, and no one's applying this metric, it's just a thought experiment, is that the inverse of the addiction in the society as a whole is a good measure of its health. A healthy society produces less addiction, meaning more sovereign individuals, because addiction creates a spike in pleasure and then an erosion of their baseline of pleasure and baseline of health and fulfillment in general. One of the reasons we're so susceptible to hypernormal stimuli, like you're saying, is because we live in environments that are hyponormal: not enough of the type of stimuli that we really need, which is mostly human connection, creativity, and meaning.
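(A rough way to see the thought experiment Daniel describes: the sketch below is purely illustrative, not anything anyone actually computes. The usage fields and the "addicted" threshold are invented for the example; it just shows how, from an opted-in sample like the one Tristan mentions next, a society-level health score could be defined as the inverse of the addicted fraction.)

```python
# A toy sketch of the "inverse of addiction" societal health metric described
# above. Everything here is hypothetical: the usage fields, the threshold for
# "addicted," and the scoring are invented for illustration, not a real method.

def looks_addicted(daily_hours: float, night_sessions: int) -> bool:
    # Crude, made-up proxy for compulsive usage patterns.
    return daily_hours > 5.0 or night_sessions > 10

def societal_health_score(opted_in_users: list) -> float:
    # Health ~ 1 minus the fraction of sampled users showing addictive patterns.
    if not opted_in_users:
        return 1.0
    addicted = sum(
        looks_addicted(u["daily_hours"], u["night_sessions"])
        for u in opted_in_users
    )
    return 1.0 - addicted / len(opted_in_users)

# Example: anonymized, opt-in usage stats for a small sample.
sample = [
    {"daily_hours": 1.5, "night_sessions": 2},
    {"daily_hours": 6.2, "night_sessions": 14},
    {"daily_hours": 3.0, "night_sessions": 4},
]
print(societal_health_score(sample))  # ~0.67: one of three users looks addicted
```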
Starting point is 02:28:15 Right. And so at the basis of it, how do we actually increase those is the only way that we become unsusceptible to the supply-side marketing that appeals to... Yes. And it's interesting to think about: if Apple were to take the small percentage of people who opt into tracking their usage statistics, they could actually measure, for a given country, hey, this is the percentage of people that are addicted based on usage patterns. Again, it's privacy-respecting and everything, and report that back to society, so there was a better feedback loop between how healthy the society was and its use of technology. And then actually have ways of saying, how do we make... I mean, again, Apple's in this
Starting point is 02:28:51 really unique position where their business model is not addicting people or polarizing people. They could actually make their whole system about, how do we have deeper choices in the real world? Well, there is a movement in society currently to try to get people to recognize, through radically induced introspective thought brought on by psychedelics, what the problems of our society are. Not necessarily the problems of these choices, and you're
Starting point is 02:29:21 talking about, like, indulging primarily in these choices, whether it's porn or fast food or gambling or alcohol, but the problems that people have. There are certain psychedelic compounds that allow you to see yourself in an incredibly, ruthlessly introspective way that'll allow you to make radical changes. And those are all illegal right now. And there's a lot of great work being done right now with MAPS, Rick Doblin's organization, which has worked to try to introduce these compounds specifically to help soldiers deal with PTSD. It's a big one. And I think through that and through their advocacy and the understanding that this stuff is very effective, whether it's through MDMA or through psilocybin, through some of the research they're doing with that, there's a way to get a view outside of the pattern,
Starting point is 02:30:18 this, like, deeply cut groove that you're stuck in. The default mode network. Yes. And I think if we're dealing with anything that is a potential real solution for radically reengaging thought, for changing the way we interface with each other and with society in general, I think that's it. And I think the fact that that is illegal currently is one of the big problems, one of the big barriers to us changing the way our culture operates and what we find to be important. Yeah, I mean, you remember, so many of the, like, founding writings of the country said we need freedom of religion, but we actually need a religious people. And what they were saying is, like, we don't care if it's Confucianism or whatever, but you need people that have some transcendent values and morals that bind them to more than just their own self-interest and that give them something like love thy neighbor and give the benefit of the doubt and things like that. So it was acknowledged that this democracy only worked with a comprehensively
Starting point is 02:31:23 educated people, where they meant both a kind of moral education and development of people as well as being able to understand the issues, right? They need to understand science and economics and things like that. They also need to understand the importance of compromise over culture wars, of seeking to understand each other's perspectives. So on the psychedelic renaissance: religion has decreased in its overall influence on society a lot, and the psychedelic renaissance is the beginning of a new way that people are starting to access transcendent experiences and then reflection. I would say that
Starting point is 02:31:57 by itself, I have seen narcissists and sociopaths get more severe that way using psychedelics, because it just creates reinforcement. Yeah, and gurus. So it has to be, like, psychedelics in a community of practice where there are checks and balances on each other. Exactly. And I think also it can move us away from this concept, to use Terence McKenna's words, of a dominator culture. You know, you can have advancement without having a dominator culture. And you can have advancement where you seek to engage in the greater good of the whole.
Starting point is 02:32:35 And, you know, the choices can be made that way. And I think in many ways it's one of the things that Apple does. When Apple is talking about this world where they're creating less impact from advertising by not having you tracked across all apps, that is so attractive to people. I mean, look, Android is just a big data-scooping machine, right? I mean, they're tracking everything and anything.
Starting point is 02:33:23 And it's one of the things they said about TikTok. When software engineers first looked at it, they were like, Jesus Christ, this is tracking everything. It's one of the most invasive of the applications. Why TikTok is not considered a major, immediate national security threat, I still don't understand. I mean, if Russia in the Cold War was running the media programming for the United States for all of its youth, like, that's insane. There's actually a specific strategy the CCP uses called borrowing mouths to speak. So you can imagine, when any Western voice in the U.S. speaks positively of the CCP, you can just add a little inflation; they just get a little bit more boost, because you're more trusting of someone who's a Western voice than of someone who's from the CCP or China. And so that's one of the invisible ways you can steer a culture. But going back to the Apple point, we all sound like
Starting point is 02:34:15 we're promoting Apple in this podcast, and I just wanted to say this. Well, we're kind of promoting good-faith companies that are moving in the right direction. Yeah. And you had John Mackey on, from Whole Foods. We went to Whole Foods last night and talked about how that's creating an environment for health and trying to at least couple better. It's not perfect. It's just coupling better towards, we can make money by helping things be healthier. Right. Apple could say, we're going to couple our business model, pull a Johnson & Johnson, whatever you think of Johnson & Johnson, but you can say, we're going to orient our business
Starting point is 02:34:40 towards long-term health of people. We're going to do Apple Fitness. We're going to do things like that. We're going to change the app stores to, you know, push down in the shelf space all the toxic social media stuff, if not take it off completely, and put back on, like, what are the things that help people connect with each other, and connect with each other in person? I mean, part of that is it's actually kind of hard to host. There are these certain people in a community who are kind of the event hosts. They're the people that bring people together. And right now, I mean, they're good at it, but imagine that was just, like, 100 times easier. I'm not trying to sell anything.
Starting point is 02:35:07 I don't have any product or anything here in thinking about this, but there are people who work on, how do we make it easier to bring people together in the physical world? And if we made that a lot easier than it is today, so that it was happening more often, then when you thought about what you wanted to do,
Starting point is 02:35:21 instead of, I could open up this app or that app, I'd feel in my own community, in my physical community, more opportunities, a more populated menu of life choices where I can, you know, do dancing, connect with people, drum circles, whatever the thing, sewing clubs, book clubs, just the things that bring people together. And you could even have, you know, all the public spaces, libraries and town halls and squares and things like that, have better instrumentation so that it was easier to book for communities, right? Again, this is part of a longer-term trend and transition of how you get out of this. But I do think that we have to make the choices that are fulfilling as easy to make as the choices that are not fulfilling but have the hypernormal stimuli
Starting point is 02:35:57 instant hit. I was thinking about something, Joe, when you were asking like what are the solutions and jumping quickly to why some proposed solutions don't work, which is true. It's like you think about what are the nutrients the body needs. You can die just from a vitamin C deficiency even if you have all the B vitamins, vitamin D, et cetera. And so it's like the body doesn't need a nutrient. It needs minimum levels of lots of nutrients. The same is like how do you get buff? Which muscle do you work?
Starting point is 02:36:22 Well, you have to work all the muscles and you have to work them in lots of different ways. How do you make a functional civilization? Do you do that through culture? Do you do it through kind of collective efforts or individual efforts? Do you do it through technology? Do you do it through changing markets or states? We'd propose that there's some stuff in all those areas that has to happen simultaneously that drives virtuous cycles. And otherwise, it's kind of like answering the question of like,
Starting point is 02:36:45 which nutrient do you need or which muscle do you need to work out? Like the problems are complex. They have many different causes. And all of the kind of single solutions might do something but end up failing. And so we have to also, and this is again, something that's very hard to do and attention spans are getting shorter and shorter, is how do we actually look at a whole ecosystem of solutions that collectively can start to address it even though any of them individually can't? Yeah. So we're going to hit the brakes and go backwards or go in a completely different direction
Starting point is 02:37:16 than the culture seems to be going in. So we – you know what I mean? We have to not just stop our forward – it's not even forward momentum. The general direction that we're going in. Well, I think with things like The Social Dilemma, which was seen by 150 million people. Helped a lot. And Frances Haugen's stuff coming out, and people having a negative reaction to the metaverse. I don't know that many people who saw it and were like, yeah, let's totally do that.
Starting point is 02:37:38 Obviously, they have asymmetric marketing power. They're going to put billions of dollars into funding this thing. They're hiring 10,000 people. Let's talk about that. What are they doing? Because that commercial where the tiger's talking to the buffalo and then all the kids are dancing, I don't know what the fuck's happening. I mean, I don't know what's happening in that example, but it's a race to control the whole experience. I mean, the reason that Facebook is doing the metaverse: Zuckerberg doesn't like the fact that Apple has controlled his destiny by controlling
Starting point is 02:38:05 the operating system inside of which Facebook has to sit. And then all the various ways that whether they make advertising like they did recently, the privacy tracking stuff, it makes him not have control over his destiny. And so if you own the whole platform, bottom to top, it's a classic vertical integration. If I own the entire stack, I can control the entire thing. And then I own my own destiny. That's what this is really about. And it's going to become a land grab between these companies for who can sort of own the next metaverse platform. It's a fascinating, sorry to interrupt, but it's a fascinating observation too,
Starting point is 02:38:38 fascinating thing to observe when you're watching someone who has ungodly amounts of wealth clearly, ambitiously pursuing more in a very transparent way. What's interesting, to psychoanalyze him a bit, is that he has 55% of the ownership and voting shares of Facebook. He has a very unique position. There's never been, I don't think, a company as massively powerful as his, in the sort of close-to-trillion-dollar market cap range, where it's all basically controlled by his psychology. I mean, I talked to Facebook
Starting point is 02:39:08 insiders, you know, occasionally, and they'll tell me that at the end of the day, with a lot of these decisions, it really does come down to Mark in a way. He's a young guy. Yeah. How old is he? He's actually the same age as me, so he must be 37. And then, I think what he cares the most about is being seen as an innovator. If I had to name it, it's not, like you said, it's not the money. I think people always say, oh, he's just greedy, he just wants the money. No, I think it's that he wants to be seen as an innovator. And if the world said the way you can be an innovator is not by building more stuff that basically hollows out the physical world so we can make this unsustainable virtual world
Starting point is 02:39:48 that collapses society. You can be an innovator by actually fixing and helping to support the planet that we're on, the actual world that we're living in, the social fabric that needs to be strengthened. But obviously he's been trapped by a set of incentives. I mean, one of the most interesting things is if he came out and said,
Starting point is 02:40:03 post all these Facebook Files, and said, like, I'm actually trapped by the shareholder model and the corporate form. Like, I am trapped by the fact that I have a fiduciary obligation to shareholders. But he doesn't say that. So in a way, he has a fiduciary duty to lie to himself about the gap between what his incentives are and what the world needs to basically sustain itself. Well, also imagine if you've created something, and whether or not he created it is a different debate, but you're the controller of something that's so massively influential on a global scale. And maybe he thinks that at least he's not evil. Like, he may be trying to make money and he may be trying to come off as an innovator, but he's not
Starting point is 02:40:46 an evil person. I don't get an evil sense off of Mark Zuckerberg. I get a kind of robotic sense. He's odd. He's odd in the way he communicates. But maybe that's, like, a social awkwardness in dealing with his own public image being broadcast to the world, and it comes off clunky. You know, people come off clunky when they're concerned with how people view them, right?
Starting point is 02:41:07 Maybe that's it. But imagine just giving that up. I'm going to back out now, like, you know, Jeff Bezos leaving Amazon, and he's going to, like, hand it over to another CEO. Imagine him handing over Facebook to some other person and watching them fuck it up, or watching them take this insanely powerful thing and actually make it more evil, or make it more destructive but more profitable.
Starting point is 02:41:34 That's a total possibility, right? Imagine some ruthless capitalist, some really ruthless CEO, got a hold of Facebook. And they said, listen, our bottom line is, we're trying to increase the amount of money that our shareholders get off of this. And what we're going to do is we're going to make these choices. And these choices might not be that popular with analysts and with people that are, you know, sort of trying to examine culture and the impact that social media has on it. But for us, it's going to be a windfall. We were speaking with a friend who is in a senior position at Google working on AI, and he has come to the conclusion that a lot of people in senior positions in AI have come to: that something like artificial general intelligence is inevitable, and inevitable near term.
Starting point is 02:42:23 And near term, like how many years? Depends upon who you're talking to, but this was forms of it being inevitable in the five-year time period. Jesus. Well, how come that's debatable? Because some really intelligent people think it's off by, like, 50 years. Partly it has to do with how you define artificial general intelligence. Are you defining it as something that is truly sentient and self-aware, or simply something that can beat us at all games?
Starting point is 02:42:47 Okay. Beating us at all games is already here, isn't it? Well, pretty much. And so then let's say you start applying that to beating us at the games of how to concentrate all the capital, right? Because ultimately the market is a game. And most of the market is high-frequency trading run by AI now already. If you do the super version of that, then you concentrate
Starting point is 02:43:08 all capital into one thing. Yeah. It's actually worth noting. I mean, it's a chess game, and you can out-compete: if you can see more moves ahead on the chessboard against the other AIs, you beat the other AIs, and then you just move faster. I know, but I mean, that's what's terrifying, is that it moves completely into the realm of AI and it's outside of human comprehension. We're not even in the game
Starting point is 02:43:31 anymore. So specifically, his thinking was, it's inevitable that that will happen. It will be dystopic. There's no way for it to not be dystopic. We at least think the Google dystopia is less bad than the China dystopia. Oh my God. So we're in a full-blown race, right, to get there. And there are many people actually who understand, and other people are like, well, actually the only answer is to jack our brains in so that the meat suit is somewhat useful to the AGI, so now we're in a race to do that.
Starting point is 02:44:03 When people understand the catastrophic risks and they don't see any good possibility out, then oftentimes they will actually accelerate some version of a catastrophe as the only reasonable solution. And this is why it's so important to actually define the design criteria right and have people committed to finding solutions even though they're really hard. And why I think something like this is interesting: I truly believe that a lot more people focused on what we need to be trying to solve is actually useful. We think there's a lot of super smart people at a lot of universities and in institutions and in their garages who care about these things, who could be working on solutions more. Think about the entire blockchain and Ethereum community. I mean, these are some very smart people, but they're not necessarily working on this.
Starting point is 02:44:47 So there are very, very smart people. So let's start with the right design criteria, right? The design criteria of: if you're adding tech that affects society, it has to actually be increasing the quality of democracy. It has to be increasing the integrity of markets. It has to be increasing the quality of families and mental health. You look at what the foundational things are. If it's not doing that, it failed the design criteria. Similarly, the idea that we have these dystopias and these catastrophes, we need – and the
Starting point is 02:45:14 catastrophes come from not being able to create order in the presence of this excessive tech. The dystopias come from top-down order. So what that means is, rather than have imposed order or no order, we need emergent order, which is what democracies and markets are supposed to do. But they have to be upregulated, a new, more perfect union that's upregulated to the level of tech we have. Because the tech has advanced so far, we need new versions of it. So how do we bring about emergent order of, for, and by the people that can direct the tech to not be catastrophic, but isn't dystopic? I just want a lot more people thinking about that. I want a lot more smart people at MIT
Starting point is 02:45:51 and Stanford and the State Department and wherever, and in Ethereum, working on those issues, proposing things, finding out what's wrong with them so that the collective intelligence of the world is centrally focused on how do we make it through the metacrisis? How do we make it through the fact that we are emerging into the power of gods without the ability to steward that well? What would it take to steward it well? What will it take? Well, a lot of people working on what it will take and coming up with partial answers is part of the answer, right?
Starting point is 02:46:26 That's what we're kind of saying right now. When we started the Manhattan Project, we didn't know all the ways that it was going to come together, right? So there is a leap of faith. We have to be comfortable with the uncertainty. It's one of the developmental qualities that's needed. It's like we don't know how to make it through this. Like, got it. Step into that reality.
Starting point is 02:46:42 Take a breath into that. Yeah. And we have to figure out how we're going to do this. We have to refresh the way people felt after they saw The Social Dilemma. Because the problem is they waited about two weeks and then they got right back to their normal consumption. Maybe not even two weeks. 100%. And I think that we have a very short memory when it comes to things that are impactful and really sort of jog your view of reality.
Starting point is 02:47:09 It's so easy to slide right back into it. And I think there has to be an effort where we remind people, we remind each other, we remind ourselves, whether it's a hashtag, whether it's some sort of an ethic, a movement, an understanding. Like we're moving in the wrong direction. And we need to like establish that as a real clear parameter. Like we've got a problem here. I think people do get that. A lot of people.
Starting point is 02:47:42 They do. The Social Dilemma, people got it. A lot of people. The Facebook Files stuff. People's negative reaction to the metaverse. Yes. Yes, there's a lot of power on the other side of the table, right? We've got trillions of dollars of market power.
Starting point is 02:47:53 The question is, are we, the people, the culture, going to be able to identify what we don't want and then steer ourselves in the direction of what we do? But are we operating in an echo chamber where we're talking to a lot of people that are aware of it? So when you say people are aware of it, what percentage are we talking about? Is it even 20? Most people who watched The Social Dilemma walked away with something like, tech is a problem, it's kind of generally scary, and it seems to be bad for teenagers and families. What they didn't get is that it's fundamentally incompatible with democracy, because it polarizes the population, polarizes the representative class, creates gridlock, and makes it less effective relative to other forms of government.
Starting point is 02:48:29 This is also like – here we are. We're at the three-hour mark here. So we've been having this conversation, and even though we're doing our best to try to pin down solutions, this is a very ethereal thing. It almost seems ungraspable, you know? You can nail down all these problem issues, but then when it comes to real-world application, like, what the fuck do you do? Well, this show is going to air by bouncing off of satellites that are in outer space to be able to go to people's phones and computers, using the most unbelievably advanced technology. That's pretty ethereal.
Starting point is 02:49:19 It's actually very hard for people to grasp the whole scope of the technological complexity. When you have that much technological complexity and that much technological power, we also have to be able to work on complex social systems that can make us able to wield it. And we just haven't had the incentive and motive to do that. But hopefully, recognizing where it goes if we don't is incentive for enough people to start working on innovation more. But this technology, this fascinating and super complex technology, is disconnected from human nature, from these thousands and thousands of years of human reward systems that are baked into our DNA. That's part of the problem. All these-
Starting point is 02:49:57 Well, only because we have a whole system, a trillion-dollar market cap system, dependent on hacking and mining those human vulnerabilities. Because they've already done that. If we subtract that thing and we instead reflect back in the mirror not the worst angels of our nature but the better angels of our nature, we see examples of people doing the hard thing instead of the easy thing. We see examples of people hosting events for each other and being better to each other rather than being nasty to each other. We're just not reflecting the right things back in the mirror. We do reward when people do those things, right?
Starting point is 02:50:26 Occasionally, but the social media algorithms don't reward them, by and large, right? They take a couple examples where the positive thing happens, but mostly we see the most engaging, outrageous, controversial thing. And so we have to reflect back something else in the mirror. I think it's like, if you remember the 1984 ad, I bring it back to Apple, the famous ad for the Macintosh, where there's the booming Big Brother on the screen and a woman wearing a Macintosh t-shirt running down the aisle, and she takes this hammer and throws it at the screen and it blows up. And it says, on January 24th, 1984, Apple will introduce Macintosh, and you will see why 1984 won't be like 1984.
Starting point is 02:51:07 Was it 1984 that Apple came out with that computer? Yeah. It was, actually. And the reference was, we have to defeat Orwell. We have to defeat that fiction. That can't come true. And, I mean, it was specifically against IBM as being Orwell. Oh, you haven't seen it? It's worth seeing; people should check it out.
Starting point is 02:51:21 So this is pre-internet. This is it. Let's watch it. We are one people. With one will, one resolve, one cause. Our enemies shall talk themselves to death,
Starting point is 02:51:38 and we will bury them with their own confusion. We shall prevail. On January 24th, Apple Computer will introduce Macintosh. And you'll see why 1984 won't be like 1984. Wow. It's powerful. That is powerful.
Starting point is 02:52:02 We ended our last conversation with me giving you... I was in high school. Yeah. I was actually born that year, which is crazy. That is powerful. We ended our last conversation with me giving you- I was in high school. Yeah. I was actually born that year, which is crazy. That's nuts. And you can imagine, that was aired during the Super Bowl, by the way. So that was seen by like, it was actually rated the most successful television advertisement in TV history. Wow.
Starting point is 02:52:18 Steve Jobs had a direct role in it. You can imagine, we have to not let the Orwell-Huxley two-gutters thing happen. We have to throw a hammer down the middle and create a future where technology is actually humane and cares about protecting the things that matter to us. One thing that gives me hope is that these kinds of conversations are very popular. Right. You know, like, The Social Dilemma is very popular. The one we did got like nine million views.
Starting point is 02:52:45 Yeah, this one will probably be similar. People are interested in this conversation because they know it's a real issue, at least the kind of people that are tuned into this podcast. And I think it's going to be, like, little baby steps in the correct direction. And you know what I said about psychedelics, it's one of those seemingly frivolous discussions. People that don't have any psychedelic experiences,
Starting point is 02:53:16 they don't realize the dramatic transformative impact that those things can have on cultures. But we don't have much time. No one has much time. Not a fucking human that's ever lived. A hundred years ain't shit. And during this time of this lifespan that we've experienced, we've seen so much change and so much almost unstoppable momentum in a general direction,
Starting point is 02:53:37 and it doesn't seem good. But recognizing it, discussing it, and having documentaries like The Social Dilemma, having folks like you guys come on and discuss what is really going on. And we didn't even really get into a lot of the real technological dilemmas that we have. We basically glossed over the idea of drones and the idea of CRISPR and many of these other problems. Just watching that text-to-code thing, going, oh my God, the barrier to entry has been eliminated. If you can type now, you know, you can code.
Starting point is 02:54:14 It's wild. But hopefully, through conversations like this and you putting attention on it, I mean, you are part of the sense-making world. You are helping people make sense of the world. And when you put your attention on it, I mean, I'm grateful for you creating this opportunity to talk about these things, because, you know, they're heavy conversations and they're hard to look at. Well, and it's actually important that you have these long-form podcasts, right, that are two-plus hours as opposed to five-second clips or tweets, when we talk about how tech has to enhance the fundamentally important things. So we saw how tech, specifically social media tech with the
Starting point is 02:54:51 polarization algorithms, messed up the fourth estate, and is also messing up education. It doesn't matter what you learn if you can't remember anything and you have no control of your attention. And so one of the things is, the tech has to actually be increasing people's attention, right? Their attention span, their control over their own attention, and their memory. If we were able to measure those things, we could say, let's upregulate that. If you want a democracy to work, the tech should upregulate people's perspective-seeking: how much are they actually seeking other people's perspectives? And if I have a short attention span, I can't hold multiple perspectives simultaneously, because you just can't fit that in. You just say one cynical perspective, and I'm right, and that's the only thing I'm going to think. And so imagine that, instead of the short clickbait thing, because otherwise I'll bounce, if I actually read the longer thing and my post had more nuance, that actually got upregulated. So it created an incentive to take in other perspectives, to try to parse them and synthesize them. That would be really different.
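(To make that last idea concrete, here is a minimal sketch of the kind of re-weighted feed ranking being described. It is illustrative only: the nuance and perspective scores are assumed to exist as inputs, which is itself a hard modeling problem, and the weights are made up.)

```python
# Hypothetical feed scoring that upregulates nuance and perspective-seeking
# instead of raw engagement. All inputs and weights are invented for the
# sketch; scoring "nuance" for real is itself an unsolved modeling problem.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    engagement: float        # clicks/likes/replies, normalized to 0..1
    nuance: float            # assumed 0..1 score, e.g. from some classifier
    perspectives_cited: int  # distinct viewpoints the post engages with

def engagement_rank(posts: list) -> list:
    # Status quo: whatever is most engaging floats to the top.
    return sorted(posts, key=lambda p: p.engagement, reverse=True)

def humane_rank(posts: list) -> list:
    # Sketch: down-weight raw engagement, reward nuance and perspective-taking.
    def score(p: Post) -> float:
        return (0.2 * p.engagement
                + 0.5 * p.nuance
                + 0.3 * min(p.perspectives_cited, 5) / 5)
    return sorted(posts, key=score, reverse=True)

posts = [
    Post("Outraged one-liner", engagement=0.95, nuance=0.1, perspectives_cited=1),
    Post("Long piece steelmanning both sides", engagement=0.4, nuance=0.9, perspectives_cited=4),
]
print(engagement_rank(posts)[0].text)  # the outrage wins
print(humane_rank(posts)[0].text)      # the nuanced piece wins
```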
Starting point is 02:56:03 We've got to incentivize kindness. Yes. You know, this willingness to engage in nonsensical arguments is just so common. Twitter is the best example. It's like a mental institution, with people just throwing shit at each other. It's so wild to watch, when you don't see examples like this in real life.
Starting point is 02:56:03 It's like accentuating the worst examples of the way people can communicate, but doing so in this weird public square. It itself is a virtual reality. If you think about just what it does, it's ranked by what's the most engaging. So it's like every dramatic event that happened with anyone anywhere,
Starting point is 02:56:34 like little drive-bys, like, oh, you just cut me off on the freeway and I'm upset for a second. Anywhere that happens, it just collects it all into this one efficient feed. And then people are responding as if it's all happening at the same time. It's already this weird, chronologically distorted reality
Starting point is 02:56:59 because it's pulling from all these different moments and making it seem as if it's all one moment. So people need to just see... I mean, Twitter is just bad. And it affects people's minds, where they think that this is the world that they live in, when it is this concentrated form of it. It's like, you know, my friend Peter Attia was on the other day, and he was talking about how bad juice is for you.
Starting point is 02:57:26 Like, people think that juice is good for you, like orange juice. He's like, it is such a sugar rush to your liver that your liver is like, what the fuck is all this? You're drinking like 11 oranges' worth of juice, and it's just going straight to your liver, and your liver has a really hard time processing it. This is almost like that, the social media version of that. Like, your brain gets all these
Starting point is 02:57:52 impactful moments without all the regular life space in between them that you'd have if you lived a normal existence. Instead, it concentrates it from billions of people all around the world and shoves it right through your fucking brain. And we're not designed for that. I had a period where I intentionally went and curated my Facebook algorithm, where I followed all of the groups that look at police violence, Cop Block and those ones.
Starting point is 02:58:16 And so my feed just became filled with cops killing black guys and, you know, escalating violence in ways that didn't look useful. Now, of course, those videos also didn't show what happened beforehand, to possibly justify it or not. So they were selected for a reason. But even where they were egregiously wrong, they might be a very small statistical percentage of
Starting point is 02:59:02 all police interactions. But even though I knew I was curating my feed for this purpose, it emotionally affected me intensely, just watching that many in a row. By the time I'd watched 12, it felt like this is everything that's happening. Right, right. That's the whole world. And then I got rid of those and curated ones that were, like, pro-police, thin-blue-line kind of ones. And you saw people aggressing against cops and you saw what they have to deal with. And I was like, man, these guys are heroes. And again, it only took like 12 videos. And even though I was knowingly doing it to myself, it was that emotionally compelling, because we are evolutionarily used to seeing a world that is representative of the world. But when there's so much data that 0.01% of it is more information
Starting point is 02:59:24 than I can possibly take in, and it can be totally statistically unrepresentative, but it still affects what I feel the world is, you can see how earnest people can get completely polarized. That's such a good point. And it is earnest people. And the fact that you were consciously curating it and it still had this effect on you, but you at least can objectively express it to other people.
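(Daniel's experiment is easy to reproduce in miniature. The numbers below are invented, but the sketch shows the statistical point he's making: select only the rare extreme events out of a huge population of interactions, and the resulting twelve-item feed looks nothing like the base rate.)

```python
# Toy simulation of the distortion Daniel describes: even if extreme events
# are a tiny fraction of all interactions, a feed that selects for them
# looks like they are everything. All numbers are invented for illustration.

import random

random.seed(42)

N = 1_000_000           # interactions in the "real world"
EXTREME_RATE = 0.0001   # 0.01% of them are egregious
FEED_SIZE = 12          # about what it took to shift his felt reality

world = ["extreme" if random.random() < EXTREME_RATE else "mundane"
         for _ in range(N)]

base_rate = world.count("extreme") / N

# A curated, engagement-driven feed samples only from the extreme tail.
feed = [event for event in world if event == "extreme"][:FEED_SIZE]
feed_rate = feed.count("extreme") / len(feed)

print(f"share of extreme events in the world: {base_rate:.4%}")  # ~0.01%
print(f"share of extreme events in the feed:  {feed_rate:.0%}")  # 100%
```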
Starting point is 02:59:24 And hopefully that gets into some people's brains and they see how dangerous this stuff is. And this is also why these troll farms exist because they can really influence the way our culture moves and behaves and the way it thinks about itself. Gentlemen, thank you very much for being here. This was terrifying and daunting. I mean, I feel good, but I also don't.
Starting point is 03:00:38 It's hard. I get it. I feel like we should come back another time and talk about more concrete pathways for this stuff. Yeah, let's do it. Let's do it. Let's give it a couple of months and hope things don't turn to nuclear war. How do all these issues fit together? And how do we actually deal with the fact that we've created so much technological power, and we've had such a huge impact on our environment through the whole industrial use of technology, that the world's increasingly fragile? These aren't just separate issues. They are related. There is a kind of global meta-crisis, and there is a need for
Starting point is 03:00:38 real innovative solutions in how we think about it. And I think because of this, more people will at least be thinking about that and then be talking about it. And that means more of the collective intelligence of the world hopefully being able to start to work on solutions. And I am hopeful about that. I'm hopeful about that too.
Starting point is 03:00:54 Let's end on a positive note. Gentlemen, thank you very much. Really, really appreciate you. Thank you. Bye, everybody.
