Tech Won't Save Us - Pronatalism and Silicon Valley’s Right-Wing Turn w/ Julia Black

Episode Date: May 18, 2023

Paris Marx is joined by Julia Black to discuss tech billionaires’ embrace of pronatalism and how it’s part of a broader rationalist project to remake society and protect their privileged positions...

Julia Black is a senior correspondent at Insider and previously worked at Esquire and Vox. Follow Julia on Twitter at @mjnblack.

Tech Won’t Save Us offers a critical perspective on tech, its worldview, and wider society with the goal of inspiring people to demand better tech and a better world. Follow the podcast (@techwontsaveus) and host Paris Marx (@parismarx) on Twitter, and support the show on Patreon.

The podcast is produced by Eric Wickham and part of the Harbinger Media Network.

Also mentioned in this episode:
Julia wrote about pronatalism, Lex Fridman, and Sam Altman.
Paris wrote about eugenics in Silicon Valley.
Marc Andreessen wrote “It’s Time to Build” in April 2020.
Timnit Gebru gave a presentation on the TESCREAL bundle of ideologies. Émile Torres made a thread on Twitter about it.

Support the show

Transcript
Starting point is 00:00:00 in their PR for what they're doing, they say, AI is going to change the world. It is going to yield radical abundance, like all these kinds of meaningless phrases. And it's like, okay, but break it down for me. Like, how's that going to work? Is she still going to have a job? Are her kids still going to be going to school? Like, what does this world look like that you're pitching? Hello and welcome to Tech Won't Save Us. I'm your host, Paris Marx, and this week my guest is Julia Black. Julia is a senior correspondent at Insider and previously worked at Esquire and Vox. Now, Julia has been doing some fantastic work recently with pieces on pronatalism, on Lex Fridman's podcast, on Sam Altman, and, you know, just on this kind of aspect of the tech industry in general, as it has been moving to the right and adopting some very concerning
Starting point is 00:01:02 ideologies. Julia is also the reporter who broke the story about Elon Musk having twins with one of the executives at Neuralink. And we talk about that a little bit in this conversation as well. In this episode, we touch on a lot of aspects of the various pieces that Julia has written recently
Starting point is 00:01:21 and topics that are related to it. And the real goal here is to understand what these tech elites are thinking and saying behind closed doors, the kind of world that they are trying to build through their various projects and through the ideas and the viewpoints and, as I say, the ideologies that they are really spreading into the world. And I think for a long time, we kind of praised the tech billionaires, or at least, you know, maybe not we specifically as people who listen to this podcast, but, you know, they were generally praised, they were generally held up. And in the
Starting point is 00:01:54 past few years, in particular, we have seen a really significant shift in their public personas, and also just what they are kind of openly saying and admitting and the discourses that they are kind of engaging in, right? And that has really serious consequences for the world that we live in and, you know, where our society might be going in the future. Because even as more people are recognizing the impact that these people are having on our world, they still have a lot of influence. They have a lot of power.
Starting point is 00:02:23 They have a lot of wealth. They have a significant ability to kind of reshape the direction of society basically and so we not only need to understand these things but if we ever hope to kind of challenge it you know we need to be aware of what they're doing and what they are trying to accomplish and i guess really what is kind of behind all of these actions that they are taking. And so that is why this conversation with Julia is so important, because we dig into so many aspects of this. We also refer back to some conversations I had previously with Emil Torres and Timnit Gebru. So I think that you'll see a lot of connections with topics that we've been talking about in the past,
Starting point is 00:02:59 but approaching them from a different lens and a very important lens. So with that said, I hope you enjoy this conversation. And if you know, approaching them from a different lens and a very important lens. So with that said, I hope you enjoy this conversation. And if you do make sure to leave a five-star review on Apple podcasts and Spotify, you can also share the show on social media or with any friends or colleagues who you think would learn from it. And if you want to support the work that goes into making the show every single week, you can join supporters like David from Manchester in England, Sean from Philadelphia, Freya in Oahu, and Snoop Boop from Wellington in Aotearoa, New Zealand, by going to patreon.com slash techwontsaveus, where you can become a supporter and, you know, help us keep making the show. So thanks so much and enjoy this week's
Starting point is 00:03:35 conversation. Julia, welcome to Tech Won't Save Us. Thank you so much for having me, a big fan of the show. Thanks so much. Obviously, always appreciate that. And you have been doing a bunch of fantastic work over at Insider recently with a number of pieces kind of digging into this aspect of the tech industry that we're pretty interested in here at the show. You know, you've written major pieces on pronatalism and of course, looking at Elon Musk and other kind of tech people's interest in that kind of ideology, I guess, or that kind of approach to reproduction. You know, Lex Friedman's podcast, which, of ideology, I guess, or that kind of approach to reproduction. You know, Lex Friedman's podcast, which, you know, I'm sure is known to a lot of
Starting point is 00:04:09 people who are interested in the tech industry because of his relationships with many of these powerful people, and a recent article on OpenAI CEO Sam Altman. And I'm sure that we'll get to many of the things mentioned in many of these articles. But before we dig into those, I'm wondering, you know, how you approach these pieces. Like, what are you trying to understand when you're looking at these people in these movements within the tech industry and have us as readers learn about what's happening in the tech industry right now? Yeah, I think that fundamentally, my interest has kind of always been, what are the conversations that are happening behind closed doors in these
Starting point is 00:04:45 hyper powerful spaces, very wealthy people? Why don't they want us to know what they're talking about behind closed doors? And how are the effects of those conversations going to trickle down to the rest of society? So yeah, I mean, I'm definitely really interested in individuals and personalities and trying to understand kind of who are these people who are shaping the future. At least they think they are. But that does seem to be a big part of it, right? Like, I think you can always see the tech industry as shaping aspects of society in many ways, right? But it does feel like, especially recently, there's this kind of renewed interest in shaping a lot of like, not just the society that we
Starting point is 00:05:26 live in, but also how we think about, you know, how we should live in a way, right? Like I was thinking, you know, as I was rereading your pieces and going over these things about like Marc Andreessen's piece back in 2020, you know, it's time to build or whatever the hell that was called, where he was kind of making this kind of forceful argument for Silicon Valley to get much more involved in many different aspects of the hell that was called, where he was kind of making this kind of forceful argument for Silicon Valley to get much more involved in many different aspects of the society that we live in. And it seems like many of the people that you are talking to and that you are profiling are kind of interested in this in many different ways as well, right?
Starting point is 00:05:57 They were explicitly trying to think about how human society exists right now and to think about how they can change it or reshape it so that it better fits with how they think it ultimately should operate. And usually that's a way that kind of puts them in control of much more aspects of what is actually going on. 100%. I mean, I think that something that frightens me sometimes is to feel like their ambitions are growing. These people want to now colonize more and more of our minds, colonize more and more of the earth, of space even, get involved in politics in unprecedented ways. It really doesn't seem like designing apps is enough for them anymore. It seems like,
Starting point is 00:06:38 especially figures like Elon Musk, really want to have control over society, want to have control over this long-term future they talk about. So yeah, it's just these growing ambitions and kind of like, whoa, I think we need to take a step back and make sure that we're all aware of this and we're having conversations about this. So that's what I always try to do with my work is try not to put too much subjective value judgment on it, but just put it out there and like, make sure everyone knows what they're talking about. Because yeah, I do think their ambitions to change the way the rest of us live are growing. Absolutely. And you know, there's obviously one approach to this,
Starting point is 00:07:14 where, you know, your kind of values are stuck right in there. But then there's another approach, which I think you do really well, where even just laying out like what these people are saying and what they're doing kind of tells you a lot about, you know, the ambitions that they have and kind of the world that they want to see and create. Yeah, I mean, obviously, there's a big debate in journalism always over like how objective we need to be. I'm a pretty traditional reporter, I think, and that I just try to talk to a lot of different people and put the facts on paper and let people decide. But I also think it's ridiculous to pretend that like, we're not humans with ideas and values ourselves that get projected onto that. So yeah, I think just as it's fair for like Elon Musk to have his values
Starting point is 00:07:56 projected onto his projects, the rest of us get to say, what about our values? What about the way I want to live my life? So yeah, that does naturally happen a little bit. It makes perfect sense, right? Like, obviously, your values are also going to affect kind of the types of stories that you want to tell the topics that you're interested in and want to pursue. And you know, I think that shows in the work that you're doing, obviously, which is fantastic and essential work. And so, you know, you've talked about some of these people and some of the approaches that are driving some of these people. And on the show in the past, we have talked about effective altruism and we have talked about long-termism, but do you want to talk to us a bit about those
Starting point is 00:08:32 particular ideologies and how they link up to the pronatalism that you were also describing late last year in kind of a long feature that you wrote on that movement, I guess, and how it's gaining traction among tech circles. Yeah, 100%. I mean, what's been really surreal is to see like the rest of the world become kind of aware of this stuff. I've been sounding insane talking about it dinner parties for about a year now. I'm sure you're familiar with the work of Tim Gebru and Emil Torres. And maybe some of your listeners are too. Both been on the show in the past. Yeah.
Starting point is 00:09:06 Yeah, yeah. And they've coined this term that I'm really interested in. I think it's called TESCRIAL. I think that's how you pronounce it as an acronym. So that goes transhumanism, extropianism, which is like anti-entropy, improving the human condition, singulitarianism, which is we're going towards the singularity, cosmism, going to space, rationalism, effective altruism, and long-termism. And those last three are all kind of connected as these ideologies that no one had heard of a year ago, but are suddenly super pervasive in these Silicon Valley communities. And I think in a way, a lot of this is tied with tech's kind of
Starting point is 00:09:45 sharp fear to the right, which no one really noticed until recently. I frankly didn't really notice it until a year ago. So yeah, my life kind of changed in terms of the way I see the world about a year ago, a little over a year ago now, when I first got this tip about Elon Musk's pro-netalist objectives. And so I talked to a source in Austin who basically said to me, Elon has a lot of kids, everyone knows that, but they don't realize that A, he has more than the public realizes, and B, it's part of this worldview, this agenda. And simultaneously, he was out there tweeting it. As much as this sounded crazy, and I almost didn't even pursue
Starting point is 00:10:25 the lead, he was on Twitter saying these words that I now know to recognize as buzzwords about civilization. And he talks a lot about fertility rates and population, but it's all tied into this long-termist worldview. These people talk about preserving the light of consciousness and just really these sci-fi ideologies. They seem different, but they're all interconnected. So once I got onto that track, what happened was I decided to do some reporting. Pretty soon I found that Elon had these twins with one of his employees, Siobhan Zillis. And yeah, I just kind of expanded my understanding from there, talking to more and more people involved in this movement and realizing how methodical it is. And again, how it all kind of connects to this like global or cosmic
Starting point is 00:11:11 domination. Like there are some really fascist leaning ideologies in here, even in a weird way that's hard to explain, like ties back to the free speech absolutism and what Yolande's doing with Twitter. And it's about like elevating this elite class. And you might do that through having lots of kids who perpetuate your own DNA. And yeah, elevating this elite class to basically be in a position where they have dominion over the rest of us. Like it really does sound straight out of Dune or something, which is again, one of their favorite sci-fi novels. Yeah, it's like this fascinating subculture that might be niche, but again, just the way that a few individuals are able to exert influence on the rest of us, you got to start paying attention
Starting point is 00:11:58 when their ideas are coming straight out of sci-fi. Absolutely. And niche, but growing in appeal, especially in these elite circles. I don't know if you saw this, but Marc Andreessen in his Twitter bio now has that he is a test realist. So adopting that term and explicitly saying that I ascribe to these ideologies. Oh my God. You know what? Marc Andreessen blocked me so long ago. I don't even know why. I've never written about the guy, but I hear he blocks a lot of people. So I have not seen that, but that's wild. I'm going to have to go look into that. Yeah. I saw a screenshot and I'm not blocked on my podcast account, so I can get in and double check through that. Wow. Yeah. No, I mean, that's really interesting to hear because again, I think that in talking about this and reporting on it and writing about it, it's forced them to come out into the open with it a little bit. And at the end of the day, that's all I can ask. All I can ask is that they like explain themselves to people and be kind of upfront and honest about it. So I think these ideologies have been spreading. And it's just that now awareness of them is growing. But that's wild.
Starting point is 00:13:10 And I do think that the more they are open and upfront about it, the more that we can then have a conversation about what they're actually trying to do, rather than just kind of like the PR angle that we usually get on a lot of these stories and, you know, a lot of what they're actually involved in, right? That's one of the, you know, kind of good things about Elon Musk's turn recently is that instead of talking about him as kind of the man building the future and saving us from climate change and all this stuff, we can have a real talk about his politics and how that kind of intersects with the various things that he's been doing. Because, you know, obviously you talk about how he's been tweeting about this kind of population stuff, but he's also been talking about it since many years before that. It's included in his 2015
Starting point is 00:13:45 biography. And you note that, you know, you had a source tell you that even way back earlier than that, he was talking about Genghis Khan and how Genghis Khan had kind of like, you know, his DNA was throughout the human race. And Elon Musk was someone who was kind of interested in those ideas. And we know that Jeffrey Epstein was as well. Yeah. No, I mean, I've now spoken with quite a few people who've known him for a long time and said that this has been a long-term interest of his. I've also spoken with his father on the record and he said some pretty shocking things to that effect. You know, why should my son be any different from a monarch? He should be spreading his seed. He should be perpetuating our superior DNA. Again, it's like, okay, I just want to make
Starting point is 00:14:27 sure you're willing to say this all on the record out loud forever and a year and people can make of that what they will. But let's be honest. That's absolutely wild. Yeah. His dad is a piece of work and clearly has influenced his formation as a person quite a lot. Yeah. I do think it's fascinating. Elon Musk is often very open about his kind of work and clearly has influenced his formation as a person quite a lot. Yeah, I do think it's fascinating. You know, Elon Musk is often very open about his kind of difficult relationship with his father and often dislike of his father. But there are also so many ways where you can see these two men being like, very similar to one another and holding a lot of similar views. Yeah. And now that Elon has however many children, because I don't think we really know,
Starting point is 00:15:05 you're starting to see some of the patterns recreate themselves. So yeah, that's all I'll say on that. Absolutely. But, you know, kind of drilling down on this just a little bit more, like there's an aspect to this that I'm really interested in, right? Because obviously they seem to think very clearly that, you know, because of their position in society, because they are wealthy, that that means that, because of their position in society, because they are wealthy, that that means that they're also smarter than everybody else, right? That IQ is inherently linked with wealth. We know that IQ is quite a racist kind of measure of intelligence that was created many decades or whatever ago, and that has explicitly been applied in this way. But they
Starting point is 00:15:41 very much use it to kind of argue that they are smarter than everyone else, that they deserve to be in the position that they're in and that they want to kind of maintain that dominance into the future. You know, in the past, these tech billionaires were trying to like see how long they could live themselves. And now as they're getting older, they're thinking about, you know, their future generations and how those generations are going to kind of keep that position in society. And we know that they both have a lot of interest in kind of IVF, a lot of their children are conceived through IVF. And they're also very interested in kind of choosing genes and kind of investing in companies that have a lot to do with genetics and, you know, the genetics of children or embryos or
Starting point is 00:16:21 whatnot. Can you talk to us about that kind of aspect of this and what they're trying to achieve through all of that? Yeah, definitely a few different layers there. Just to start with the IQ thing. Absolutely. I think that's correct. Again, according to sources who know Elon, for example, they tell me he absolutely believes your wealth is a direct reflection of IQ and therefore the orders of magnitude of wealth that he has over the rest of us he also believes he is basically that much smarter so that's terrifying yeah it is especially when you see his tweet what's also weird though is that for such smart people and I am willing to grant that they are really smart I something that does drive me crazy sometimes when I talk about Elon Musk is I'll refer to him as a genius or something and people go, oh, come on, he's not a genius.
Starting point is 00:17:07 And I'm like, you know what? I'm willing to grant like when it comes to engineering rockets, he's really, really smart. The guy is clearly not an idiot. But there are certain areas where you realize that that like IQ, whatever that means, doesn't apply to every area of intelligence. Certainly doesn't always apply to every area of intelligence, certainly doesn't always apply to emotional intelligence, for example. But also, I mean, just talking to geneticists around this pronatalism story, there's a lot of debate over whether any of these technologies that they're counting on actually work. And, you know, anyone who's studied genetics kind
Starting point is 00:17:40 of understands that just having two super high IQ parents doesn't guarantee you a super high IQ kid. So there is some fundamental, maybe misunderstanding of like, how the science is going to work out, not to mention all the social factors, like how are kids going to feel about being treated like a project? How are they going to feel about being one of 13 children, all those things that might influence who they turn out to be. But I think at the end of the day, my best theory for what they're really doing here, there's a phrase that people in this sphere throw around a lot called the Overton window. It's basically this idea in social sciences that's like, if you expand the Overton window, you're expanding the realm of ideas that are socially acceptable to talk about.
Starting point is 00:18:27 So I think that in a way what they're really doing is just kind of getting everyone more comfortable with talking about this eugenics language. So whether or not the tech is there yet, it doesn't really matter because maybe one day CRISPR, for example, gene editing will be legalized. And by that point, I think that certain people would love for society to be at a point where we're comfortable talking about superior IQ or picking eye color or, God forbid, picking skin color, which that's something that we may be nearing capabilities for technologically, but like society is still not comfortable with any of that stuff. I think there's kind of, it's almost like a propaganda movement at this point. It's less about whether or not the tech
Starting point is 00:19:14 is actually there. And it's more about helping people feel comfortable with the idea of using these technologies to quote unquote, improve the general population, which, yeah, as I actually was listening to an episode recently with Malcolm Harris, and he noted like the long history of eugenics, obviously in the US, but also specifically at Stanford and like in this tech world and how long eugenics has been part of the agenda. And again, it's just this matter of like, it's not that the actual technology is reemerging so much as the ideas and the acceptability of such ideas is reemerging. And that's where this all comes back to like, the way that these people are newly interested in politics and really interested in
Starting point is 00:19:55 the social aspect of what's going on in the world and wanting to influence that. You know, I think that the fascinating thing there is how, again, like, as you're saying, I went back recently and like looked at some of the comments from some of these people who are like, you know, very clear geneticists at Stanford. And like, the kind of language that they're using is so similar to the language that we hear today, like just being completely repeated. So I think, you know, what you're saying about it, it kind of being like a propaganda movement to try to make us more comfortable with and familiar with these kind of like eugenicist terms and
Starting point is 00:20:30 arguments and framings of these issues is really spot on because that does seem to be what it is about, right? And in particular, about ensuring that at a moment when I think these people are feeling that their position in society is being challenged in a way that it hasn't for a long time, that they are trying to kind of create like a narrative justification or something like that, an ideological justification even for why they hold the position that they do in society and why they shouldn't be removed from that position or challenged in that position. Yeah, 100%. And this is also where it all kind of comes back to that test grill thing where all these seemingly disconnected ideologies actually have a lot to do with each other,
Starting point is 00:21:14 because what it all comes down to at the end of the day is kind of the rationalism element, which is stripping everything of emotion and fairness. What is that? Who needs fairness? Like all decisions should just be made as if by a computer for the sake of pure optimization. And so once you start seeing the world that way, it becomes a lot easier to think about like inequality. And again, these systems we have that people are starting to challenge. But I think certain people in this movement, I want you to actually think, okay, like, that makes sense, because I'm inferior, and you're superior. And so like, you get more than me. And, you know, to fight that would just be to fight rationalism. Like, I do think that at the end of the day, that's the single ideology that kind of encompasses the rest of them is just wanting to optimize everything and strip
Starting point is 00:22:06 everything of feelings. You know, it's frankly that like, I don't know where that phrase came from Fox news or something like facts don't care about your feelings. I think that's Ben Shapiro. There you go. Yeah. He may not be a tech guy, but like, I am willing to bet there are conversations being had between people like Ben Shapiro and people like Elon Musk, if not those two people specifically interested in this as well, and kind of seeing it repeated through the pieces that you had done, right? And how there's this real kind of desire and real interest in quantification, right? Everything needs to be quantified. We need to be able to track how everything is doing. You know, you can see this broadly through Silicon Valley, but particularly in these ideologies, right? Both, you know, you can see this broadly through Silicon Valley, but particularly in these ideologies, right?
Starting point is 00:23:09 Both, you know, effective altruism is about how do we ensure that our philanthropy is, you know, having the most effective outcome as possible. Long-termism is like, how do we ensure that we maximize the number of like people who are doing well, well into the future? And we need to not think about, you know, kind of the particular issues that we're motivated by right now, but we need to think on a much longer timescale, even if people are suffering today, well, you know, what about all the people in the future, and we could protect them, right? And then with pronatalism as well, like some of the people you were talking to were saying how they are like, kind of tracking their children, like from the time that I guess, even before they're born, because they're looking at these
Starting point is 00:23:40 kind of genetic, you know, profiles or whatever that are done of the embryos and choosing them based on that, and then tracking them as they they get older and saying that after a few generations, they'll have all this data on kind of their family lineage and all this stuff. What do you make of this whole kind of drive to kind of remove any kind of social or even moralistic thinking? And I would say sometimes they use this quantification to act like what they're doing is moral and just to think so purely about numbers in terms of all of it. Yeah. It's the single reason I'm most fascinated by all of these people and also the single reason I'm most alienated from them. I cannot relate to this just from my personal point of view. Yeah. This ability to strip everything down to data and to numbers and to metrics and
Starting point is 00:24:26 yeah to me that that leaves out so much about like the human experience that makes life wonderful and at times I find myself feeling kind of sorry for my subjects because I have this great personal life that I feel like I keep very separate from all of this tech stuff that I report on. And like, it's so far from this world. And sometimes it feels like they're coming for these parts of life that should be untouched. And, you know, I was emailing you about like, this phrase has been going around my mind, like Soylent world, I just feel like we're moving towards living in Soylent world where like, everything can be reduced to efficiency. And yeah, I mean, Soylent might be the best way to deliver calories in the most efficient manner.
Starting point is 00:25:14 But I don't want to eat Soylent for the rest of my life. I want to eat burgers and Chinese food and, you know, like the wonderful things that may not be good for me. And it just feels like in so many areas of life are coming for them. And without this understanding of what makes humans tick and and you see it applied to dating, romance, people are talking about how. And I was listening to your episode about Elon and Twitter. That's been something really amusing to kind of watch play out is him thinking that you can apply this kind of engineering thinking to this product. But like, Twitter is not really a tech product. It's a people product. It's about understanding consumer behavior and what people want and what people enjoy. And so I feel like it can really backfire once you start stripping out like that irrational stuff. You know, I think he's paying the price with Twitter. It's not working because he's assuming that everyone thinks like him and they really
Starting point is 00:26:15 fundamentally do not. And so, yeah, it's once you start applying that thinking to society at large and politics, I think there's a lot of ways that could backfire, not only for all of society, but like even for the people themselves. I think it would be wise to listen to some advice from people with higher EQs than them, even if they have a higher IQ. Julia, how dare you want to do things in life that don't just make you the most productive as possible? Yeah, I mean, it feels that way sometimes when I read about their approach to life. Yeah. And, you know, you see it reflected in like, you know, the hustle bros and people like that who are getting more attention on Twitter these days. And, you know, we know who they are since a while and that kind of perspective on
Starting point is 00:27:00 these things. But I feel like, you know, as you were talking there, I was also thinking about how this also kind of affects our approach to addressing issues in society beyond kind of the personal experience where it feels like constantly kind of the drive with kind of policy and governments now as well is that we need to be able to collect data on everything so that we can actually understand, you know, what is actually going on here. And if we don't have data, then we can't possibly think of a way to like address a problem. And it seems like not just, you know, as you're saying, like the personal life needs to be tracked, and everything about it needs to be kind of known, and you need to be maximizing your kind of personal productivity and whatnot. But also that kind of view of Silicon Valley, that has kind of taken hold over the
Starting point is 00:27:42 past couple of decades, then kind of filters out to the rest of society where everything has to be datafied. Everything has to be kind of absorbed into this logic that they have. And it kind of degrades everything around us because not everything can be captured from that form of knowing, I guess. Yeah. I mean, again, I think once you start thinking through like the political implications, it gets kind of frightening. And obviously, Silicon Valley has been very fixated on what's happening in San Francisco recently as like this microcosm for America and why liberal values are bad for America. And I do think that at a certain point, everyone can agree on the problems that we're facing as a society. You know, no one wants to see homelessness and poverty and those kinds of struggles. But what's interesting is that I think Silicon Valley is like chomping at the bit to apply their tech thinking to solving these problems. And again, without realizing that these are very human problems, like I have a lot of friends who work in that kind of world and social services, and you cannot
Starting point is 00:28:50 deal with that stuff on a data level. Like, it's so complex. I don't know, it's just the idea that you would try to solve society's ills with technical data points, to me seems misguided. And that said, I do wish that we could see more cooperation and collaboration between government and tech. I think it would do both worlds a lot of good to have a more open line of communication, but I do not want to live in a technocracy because, yeah, it just, I think it could get really nightmarish and fascist very quickly. Yeah. And, you know, you can see how kind of divorced from reality a lot of these people are, right? Like, you know, as I was saying before, you see in Andreessen's kind of argument, it's time to build how they want Silicon Valley to serve this role that you're talking about, where they're much more involved in many other aspects of society, without really recognizing about how their efforts to kind of alter and change and improve the physical world haven't really worked out over the past 10 years or so when they have tried to kind of engage in that way. And then, you know, you also talked in your
Starting point is 00:29:54 piece about Sam Altman about how I believe he hasn't been to a grocery store in four or five years. So like, you know, there's this kind of very kind of base level understanding of what most people experience that just isn't there. Yeah, 100%. This was a really interesting issue to kind of engage with Sam on. I think he did surprise me in that he sort of agreed that tech should not be left to rule society. And he is begging for more regulation around AI, for example. That said, I kind of tried to make this argument to him and in my piece that as long as they are stepping into these roles that are so powerful and so influential,
Starting point is 00:30:31 again, they kind of owe it to society to almost act like politicians in their communications. Like they owe it to us to explain things and they owe it to us to take feedback and have this be kind of an open discussion about how these technologies and products are going to shape our lives. And frankly, I found it pretty troubling, the fact that he really felt unable to articulate after being pushed like several times on it, just how AI is going to change our lives. I really wanted him to be able to spell out, I kind of kept trying to find different angles to explain it to him. But I was like, okay, picture whatever you want to think of as your average American, 40 something year old mom of three making $50,000 a year, whatever, pick whatever person
Starting point is 00:31:14 and just like try to talk to that person and tell her what her life might look like in 10 years and how it might be affected by your products. And this question just like blew his mind. He just had no answer. The problem with that is that in their PR for what they're doing, they say AI is going to change the world. It is going to yield radical abundance, like all these kinds of meaningless phrases. And it's like, okay, but break it down for me. Like, how's that going to work? Is she still going to have a job? Are her kids still going to be going to school? Like, what does this world look like that you're pitching? Because you're just using these very vague terms right now. And you're promising it's going to be so fantastic. Unless it kills us all,
Starting point is 00:31:53 by the way, which is this weird side note they always add. One or the other, you know, whatever. Well, it's like, yeah, I just don't think you can sell your product that way and get us all to accept these products into our lives so willingly without being able to spell it out for your quote unquote average person. What is it actually going to mean? Because, you know, I think a lot about when it comes to AI right now, I think a lot about where we were with social media, maybe 10, 15 years ago. I remember conversations about social media where I so fundamentally did not understand what was coming. And I don't think anyone did. I was a teenager and in my early 20s, it makes sense that I didn't. But I really don't think anyone did except the tech leaders who were
Starting point is 00:32:39 having these closed-door conversations I've been talking about. And so they knew that it was all about data gathering. They knew that it was about these algorithms that could change behavior, get us to spend more and more time on our phones, on these platforms. But I remember looking at Instagram the first time I got it and going, I don't understand how they're going to monetize this product. I just post pictures. It's so fun. What are they getting out of me? And of course, the answer is they're getting so much out of me. And now I'm an addict. And now I'm like on there shopping for things I don't need. And I'm on there, like completely changing my tastes and preferences and political views because of what I'm seeing on a day to day basis. And so I wonder how society would be different if
Starting point is 00:33:22 we'd been actually informed of the risks and consequences before we signed up for Instagram. You know, it's like before we'd gotten ourselves addicted. And so I feel like what's happening with AI right now is there's this like refusal to spell things out and to actually make concrete what's going to happen to people's lives, how it's going to change our brains, how it's going to change our jobs, how it's going to change our daily lives. And so it's like, I don't know, I feel like we're all being opted in, whether or not we want to be. And I wish we knew a little bit more about how that's going to shape things in 10 years.
Starting point is 00:33:59 So it would be nice if we could get some communication on that. It's a great point, though, right? Because I guess maybe one thing that I would see is at least a bit more positive in this moment versus maybe back then with the introduction of social media and kind of, you know, how all these apps kind of exploded in the early 2010s was that I feel like there is a bit more of a recognition that all of the kind of promises of the big companies are not going to work out as they're saying, you know, as you're talking about Sam Altman and an open AI are saying a lot of things about what chat GPT and these AI tools might mean for us, but they're kind of PR speak, right? And
Starting point is 00:34:36 I feel like the kind of critical voice, the kind of questioning voice saying, hold up now, like, what is this actually going to mean for us rather than just kind of taking the PR at face value is a lot more kind of prominent in the conversations today. And the notion and the idea that regulation is necessary, obviously, there's a question about what regulation is necessary and what it would look like. But at least that is like there in a way that I guess it wasn't so much earlier on when we were maybe more easily duped about what these potential technologies were going to be and what they were going to mean for us. Yeah. I mean, I would hope we've learned some lessons from the social media fiasco, and maybe there's reason to be optimistic. And again, that's why I think the work of
Starting point is 00:35:20 journalists is important to get this awareness out there. But I don't know. Then again, every time I talk to anyone in AI who knows what they're talking about, I come away from the conversation just petrified. I mean, I don't know. Even this morning, I had coffee with an AI founder who was a really nice guy. And he was really optimistic, as it is his job to be. And he was talking about, I was like, okay, what do you think it means for people? And he said, I think it means augmented humans. And so I'm like, okay, what is, what do you mean by that?
Starting point is 00:35:53 He's like, you know, I mean, we already have some of it. We have Google Translate. We have this ability to like speak other languages using this technology, even if we don't inherently have that skill. And I know for me personally, like Google Maps is a big one. I love that I have a terrible sense of direction and yet can get anywhere. So I think that that's one positive look at it. But then by the end of the conversation,
Starting point is 00:36:17 he was like conceding, but like, yeah, you know, we do have this open source approach and definitely possible that there could be bad actors who use it in really bad ways. And yeah, misinformation is going to be really bad. You know, it's like, I don't know, we just have a lot of weighing of the benefits and the risks to do. And, and as you say, like maybe regulation can help with that. But again, I don't know that I totally trust the competence in Washington to, you know, like that conference, the meeting of the AI minds with Joe Biden and Kamala Harris, and just like, as if that was a productive conversation, what could I have possibly achieved? I did not have faith that that's like, making a difference. It kind of feels just for show.
Starting point is 00:37:00 Yeah, getting all the CEOs of the big AI companies together with the top of the government is probably not the best way to like learn about, you know, the proper approach to these things. Yeah. And like Biden like stopped in for three minutes for like a photo. Like, no, actually, well, first of all, don't get me started on gerontocracy issues and like the fact that we're going to have an 80 year old president no matter what, like, that's who we trust to understand AI. I don't know. Absolutely. And, you know, I think what you're saying there about the potential for like augmented humans brings up something else, right? Because in the Altman piece, you were also talking about, and of course, this is something that's not just with Sam Altman, but many of these people, how they kind of would like to see, or they imagine that we're going to see more of kind of emerging between human and machine. And, you know, like I was talking with Emily Bender about, Sam Altman has tweeted out that we're all stochastic parrots, you know, kind of relating
Starting point is 00:37:54 our intelligence to the intelligence of a large language model, which, you know, I think is kind of degrading human intelligence quite clearly. But also you hear people like Altman and Elon Musk talking about like, maybe we live in a simulation and maybe this is just all a big computer. It seems a bit wild on one hand, but then on the other hand, you can see that if you think that humans are like machines and if you think that we are already in a computer, then maybe like the stakes are a bit lower in that sense because, you know, it's not all really real anyway. A hundred percent. And this goes to show like how these themes are all connected because this kind of goes back
Starting point is 00:38:28 to the beginning of the conversation. Like what you didn't mention about long-termism is it's not just about saving future lives. It's about saving future simulated lives. Like there's this idea that a flourishing human future could enable us to upload our consciousness to microchips that will one day be floating around space after Earth has been destroyed by an asteroid. So we owe it to those floating microchips to enable their ultimate happy life. That's how
Starting point is 00:38:58 crazy this stuff sounds when you really dig into it, which to me is not life. But that's the same way I feel about Mars. It's another reason that Elon's whole ideology is it, which to me is not life. But that's the same way I feel about Mars. It's another reason that Elon's whole ideology is so alienating to me. I do not want to live on Mars. I do not want my children to live on Mars. I think Earth is pretty great and we should maybe focus on saving it. But yeah, it just goes back to this idea of like, you have to dig into the personalities and the worldviews of these people who are shaping these technologies because they're really alien to, I think, your average person. And I think it's kind of important to expose that because, yeah, once you realize like,
Starting point is 00:39:33 oh, they think a simulated life is the same as a quote unquote human life. They think that a brain chip interface is a good idea. They think that like an optimized diet and an optimized sleep schedule and an optimized daily life is the best way to have a happy life. It's like value systems really matter. Yeah. Again, I just, I find that hard to relate to when I look at my own life, which my most wonderful moments are the ones spent with like friends, loved ones, family, very outside of technology. I do not want technology integrated into every aspect of my existence. So yeah, I hope that in exposing some of those preferences of these people in power, we can start to push back on them a little bit and say, well, maybe the rest of the population
Starting point is 00:40:23 doesn't want to live your way. And that's fine if you do, but I don't think you get the right to enforce this hyper-technological lifestyle on everyone. You could just schedule your socializing time. You have a block of 2.25 hours for socializing, and you can fit it in there. And you need a bit of that to make you more productive. It's all about balance. Of course, you should should have a healthy diet and you should have a decent sleep schedule, but like pure rationality is not the way to a happy life because you're going to want to eat that burger. You're going to want to stay out late or, you know, like have your fun. And yeah, I mean, even just to like, come back to the question of like love, it's so interesting
Starting point is 00:41:04 talking to some of these rationalist types,. They fundamentally don't believe that love exists. They will reduce that to some hormonal delusion that is only purely for the purpose of biological propagation. And it's like, I don't know, that's, that's no way to live. Just gets really depressing. Yeah, I feel like you see that with Musk. But, you know, another of the people you've written about is Lex Friedman, right? And I feel like you very much see that kind of approach with his kind of, you know, lifestyle, his diet that he's talked about. You know, he also seems to kind of probably take that approach to love or just kind of steps back from it altogether. I don't know the guy very well or him very well, but that was kind of what I got from your piece.
Starting point is 00:41:50 But I feel like he also kind of links this kind of tech politics with this right-wing politics really well, right? Or it's a good kind of entry point to talk about that. Because obviously we have this large right-wing media ecosystem that exists that both you have like kind of the Fox News piece of this, but you also have digital media organizations and a whole range of like YouTubers and influencers who are kind of pushing this kind of right-wing politics. But you also have this kind of increasing coming together of that kind of right-wing politics with this kind of like increasingly right-wing tech politics as well, right? And I feel like Lex Friedman's podcast is one of the places where those kind of perspectives find one another. So you want to talk to us a bit about that kind of approach and that linkage and how tech has kind of taken this lurch to the right and how that's reflected in their politics, but also kind of the media that they kind of consume and promote and all these sorts of things. Yeah, I mean, it definitely applies to all the stories I've written recently
Starting point is 00:42:50 in this whole crowd. But Lex is a great example. Just to start with the love question. I mean, Lex's approach to love is very interesting. He does display this intense, like romanticism at times and talks a lot about the Russian soul. And, you know, he's like this hopeless romantic. But at the same time, he kind of seems to be creeping towards this like advocacy for like the sex robot era. Like he talks a lot about, and this goes back to that question of like, whether or not these people see humans and machines as inherently distinct like he talks a lot about forming very intimate relationships with robots and ai algorithms and he talks about how his ultimate dream is to have this startup this ai startup that creates like a personalized algorithm for every human and it's going to be integrated into like everything in our lives. He has this, he goes off and waxes poetic about like the relationship between
Starting point is 00:43:50 your AI enabled refrigerator and yourself and how the refrigerator is going to be there for you when you have these like late night ice cream binges. And I'm going to forget that, like, it's going to have a memory now and you're going to feel connected to that refrigerator. And yeah, I mean, he's got these robot dogs who he forms very intimate connections with. And even these Roombas, he did this experiment where he trained Roombas to scream, I think, and then tested human reactions to like, if you hurt the Roomba, are you going to feel bad? Because I think his argument is kind of you should, we should show machines the same kind of empathy as we should for humans. So again, it starts to go to pretty weird places. And you also start to then combine that with like the profile of his listeners, which leans a little put this there are a lot of young single men who listen to his podcast. which leans a little, I don't know how to put this. There are a lot of young single men who listen to his podcast and he talks a lot about being unlucky in love and
Starting point is 00:44:51 not knowing how to find connections. So yeah, again, it kind of gets to this like Overton window question of, are we just expanding what's okay to talk about in terms of forming correct connections with bots? And I mean, you're already starting to see a ton of this with AI bots. You have all these men who are interacting with their AI girlfriends, and then they switch the algorithm and the girlfriends, it was like they've been lobotomized, they said. And these men were like heartbroken. And I was recently talking to a psychologist who said that she's been seeing a lot of patients who are forming these connections
Starting point is 00:45:25 with AI chatbots and, you know, like teenagers who don't want to hang out with their friends anymore because they've got their chatbot friend and a married man who like was seeking romantic marriage advice from a chatbot. And he was really benefiting from these conversations. And so again, like as a society, whether or not we're aware of it, we're getting more comfortable with these ideas of like forming connections, romantic relationships, and I think possibly soon enough, sexual relationships with robots. And again, I just, I think we need to talk about it. Yeah, it's wild because we know that these things aren't intelligent, right? Like, like they're not actually talking back to us in the way that they might make it
Starting point is 00:46:09 look or the way that we might imagine. Like, you know, you're not actually getting advice from the chat bot. It's not actually your friend that doesn't like remember your conversations or anything. Like it's just kind of responding to your prompts. Like, you know, but again, like if you dig into someone else's ideology and value system and way of thinking, I think a lot of people disagree with you there, which, yeah, maybe you and me sounds crazy. But like, I think that they would argue that if the algorithm is sophisticated enough and if the AI is convincing enough, what is the difference?
Starting point is 00:46:42 I mean, you know, like the test a lot of people give is what if you were told that you're living in a simulation right now? Like, it feels pretty real, but what if the tech is just that good? And like, would you want to end it? Even like, if your life turns out to be a simulation? Don't get me wrong, I think that's why I do think it's important to try to at least put yourself in the mind of someone who thinks that way. And then ask yourself, how much power do we want to give a person like that? And the answer lately seems to be a lot. Yeah, it's shocking and scary, really, when you really got to the conservative question there. Yeah, I mean, I think you're starting to see some sort of unlikely alliances form again between this tech world and this far right conservatism. And especially if you look at the pro-natalism question, you look at people like Lex Friedman, who's a pretty right-leaning podcaster.
Starting point is 00:47:44 Like this just keeps coming up again and again. And frankly, as a woman, I'm pretty freaked out because I think that we are turning to some pretty traditional ideas of gender roles in a lot of these conversations. I mean, not to mention the anti-trans sentiment in this world. And again, just these weird kind of unlikely connections, you would think, why do tech people care about trans issues? Like, stay in your lane. But it all kind of ties back to this, like, hyper rationalism. And like, well, that's the way biology, quote, unquote, works. Like, that's the way it's always been. That's the way it should be. It's just, you start to see people kind of meddling in stuff that I would rather they get their hands out of.
Starting point is 00:48:26 If you really break down the pronatalism ideology, whether or not it's convenient for them to say out loud, it is about the reason fertility rates are dropping is because women have entered the workforce. And that's what happens in modern developed societies is there's more gender equality. So yeah, if you're saying women need to start having 13 babies again, I have questions for how that's going to play out in terms of gender equity. And it was really interesting to kind of dig into the backstory a little bit of Siobhan Zillis, who was Elon's executive. She still works at Neuralink. She was on the board of OpenAI. She was like this really top rising figure in AI. And I've spoken with a lot of people now who know her who say like,
Starting point is 00:49:11 yeah, I haven't seen her in two or three years. She just dropped off from that. She entered Elon's orbit and cut all ties. And by all reports, their children were conceived with IVF, which kind of fits into this narrative around this was more of a project than a romantic relationship. So yeah, I've talked to women in tech who are really sad about all that. And they say, what does this say about how we're viewed in the workplace? Are we just breeders to them? This doesn't really make me feel respected in these tech workplaces. So yeah, I definitely think that gender is one of the biggest social issues that's going to be affected by this turn towards conservative values in tech. And it's something we should all keep an eye on.
Starting point is 00:49:54 Absolutely. You know, like you even see it in the recent interview that, you know, and Musk has been saying a lot of things. Musk has been kind of making explicitly anti-trans statements, even though he has a trans daughter saying that has been kind of making explicitly anti-trans statements, even though he has a trans daughter, saying that this is kind of like communist indoctrination and all this kind of stuff that's coming from schools. But he also did an interview with Fox News recently where he seemed, based on what he was saying, it seemed like he was very critical of birth control and women's access to abortions and things like that at a moment where there's a pretty strong right-wing project in the United States in particular to crack down on access to abortions
Starting point is 00:50:32 and I imagine birth control as well. But the right has also taken a particular interest in trans issues and has been passing bills across the country to kind of, you know, make life hell for trans people. And so the tech industry and people in the tech industry are very powerful and very influential, right? I feel like there has been kind of a changing in how the public thinks about these people in the past few years, but they do still hold this position in society that comes with a lot of influence and where people really do listen to them. And so I think as you see these growing links between, you know, these powerful individuals and this kind of growing and also very powerful right wing movement, it's pretty scary. Yeah, I mean, not to mention, like,
Starting point is 00:51:16 Elon has literally purchased the public square. So he has seized the means of communication for society. And now his tweets are elevated above other people's. And those tweets are often things like a woman's most important role is as a mother or something. Or, you know, he'll And the whole alliance kind of reminds me of the rise of Trump, because there was this moment early on where you were like, how on earth are Trump and evangelicals going to form an alliance? Surely they realize they don't share any values. He does not represent an evangelical way of life. But it's just strategic at the end of the day. And so I think that, yeah, whether or not abortion and birth control are top of mind for Elon, if increasing the population of elites is on his mind, then he'll form alliances and he'll find people who have similar goals and
Starting point is 00:52:18 maybe they'll kind of blend their reasons for wanting those things as a strategic way of thinking. So yeah, just again, like, I think people really need to catch up on how involved in politics the tech world is trying to get and is succeeding in getting. Absolutely. You know, I guess to kind of start to close off our conversation, it feels like one thing that is really kind of common here is using kind of this aesthetic of science or this notion of science to kind of justify perspectives that are not really scientific, right? Like you were talking about how, you know, they make this really strong link between IQ and wealth, and also the notion that IQ is something
Starting point is 00:52:59 that you pass on to your children, which, as you say, is not something that's really kind of backed up by the facts, right? But then you also see it in like, Lex you say, is not something that's really kind of backed up by the facts, right? But then you also see it in like, Lex Friedman himself is someone who kind of positions himself as a scientist or, you know, as a researcher, but does not really have those credentials, right? And maybe you can talk to us a bit about that. But that does seem to be one way that they kind of act to justify some of these things is by using platforms and expressions and, you know, concepts that we do inherently trust. You know, we trust in science generally, and we trust in the ability of people to have free speech and kind of to say what they want.
Starting point is 00:53:37 But then it seems like these concepts are kind of like turned on their heads to be used to justify these things that are quite scary and against, you know, what we actually would usually be supporting and in favor of. I wonder what you make of that, or if you have any opinions on it. Well, I mean, I almost had to laugh when you said we trust science because that has been systematically dismantled over the last few years. I mean, again, really intentionally attacking the public's trust in science. And well, I do feel like even the people who are, you know, say against vaccines and stuff like that act like their skepticism is scientific, right? Because, and you see, even see Elon Musk say it,
Starting point is 00:54:18 right? I've seen him tweet about it where they'll say, this is the scientific method where we question things and blah, blah, blah. So there does seem to still be that use of the term totally totally it's like a co-opting of it which again is has been really effective i think um and and i think it was even like yesterday you saw elon tweet out some like graphics about quote-unquote black on black crime and and yeah they they try to make it look legitimate they they but as soon as you click through i think that's a hilarious aspect of like the community notes program that ilan has introduced which i actually don't think is the worst thing in the world because it often ends up like catching him and his cohort in lies um or in total misrepresentations um and so yeah that was one that was like quickly
Starting point is 00:55:06 dismantled by like the actual data community who studies these things and understands these social phenomena and like but yeah it's it just all again goes back to this idea like how is this all connected how how could free speech absolutism possibly have to do with um you know cosmism and pronatalism like how is it that all these things are but like it's a very strategic program that's being run and and again i think it has a lot to do with like why acquire twitter that's that was this big question on everyone's minds and we've all speculated about it i've heard a few different theories that I think make sense, but certainly I think part of it had to be just this realization of like the power of controlling, um, this public square and, and yeah, being able to like
Starting point is 00:55:56 introduce these, I'm just asking questions, ideas. And that's what Lex does on his podcast, which is kind of the thesis of my piece. Um, and so, yeah, you're, it all comes back to like broadening the Overton window. And, you know, once you plant these just questions in people's minds, it starts to make them more socially acceptable to ask. It makes it easier to lie about them. It makes it easier to get people to just like swallow whatever you're giving them. So, so yeah, I do think it's all pretty strategic and it's kind of working. And so I think it's important to expose that fact. Yeah. I would just say, you know, you often see discussion of the Overton window on the left as well, right? This notion that, you know, if we just push it a bit more to the left, get people more open to like, you know, ideas that like public health care and things like that. But you can also see how it's very effectively being used by the right to get these things that we've been talking about into the public consciousness. And with Elon Musk's ownership of Twitter in particular, you know, he has talked
Starting point is 00:57:05 about this as being like a platform that allows free speech and allows different opinions. And you see him often very frequently responding to outright conspiracy theories and acting like, wow, I didn't know this or looking into it more and all this kind of stuff. And so it shows how his kind of, you know, his kind of whole headspace, his information ecosystem is filled with, you know, because he frequently attacks mainstream journalism, but is filled with these kind of conspiracy theories and right wing outlets. And that is not only shaping his perspective on the world, but our views that he is trying to see elevated and to get more prominence on this platform and thus, you know, to further influence a lot more people. Yeah. I mean, it was a crazy moment a couple of weeks ago. I don't know if you caught
Starting point is 00:57:49 in one or another of Elon's lawsuits, his lawyers basically tried to make this argument that, oh, are you sure he said that statement or could that have been a doctored AI video? And that just, to me, it was so transparent as like part of this wider effort to get us all to question our reality and to get us to question quote unquote truth, which the chaos that is about to come with like AI generated video and audio and imaging is like so horrifying to me. But I think it's important to remember like these people thrive in chaos. Like, I always remember this tweet by Peter Thiel's biographer, Max Chavkin, who said, like, what you need to understand is like, apocalyptic society. And I don't think it really matters to them if chaos reigns, because they'll be okay. And in fact, can probably find ways to benefit from that. But like, the next election is going to be so screwy in terms of just like, did Biden really make that speech? Did Trump really say that thing? And now I'm just going on a tangent. But like, when you start to think about the mental health effects of that, like how it's already starting to feel, I'm now having like weekly moments where I'll see an image and I'll be like, is that real? Did that really happen? And I think that like the effects that that has on the mind have never been studied before. And I think we're
Starting point is 00:59:19 about to learn in real time how that kind of drives you insane. So it should be exciting. Move fast and break things indeed. Break our minds. Yeah. Julia, this has been a fantastic conversation and so insightful into this kind of whole worldview and everything that these people are putting out into the world and that your work touches on so frequently. Thank you so much for taking the time. It's really been fantastic to chat. Thank you. Maybe next time we can. It's really been fantastic to chat. Thank you.
Starting point is 00:59:45 Maybe next time we can cover all the things I'm optimistic about. Nah, then we wouldn't have much to talk about. That's true. Thank you so much. It's been really fun. Julia Black is a senior correspondent at Insider. You can follow her on Twitter at MJNBlack.
Starting point is 01:00:03 You can follow me at @parismarx, and you can follow the show at @TechWontSaveUs. Tech Won't Save Us is produced by Eric Wickham and is part of the Harbinger Media Network, and if you want to support the work that goes into making the show every week, you can go to patreon.com slash techwontsaveus and become a supporter. Thanks for listening.
