Factually! with Adam Conover - Robots, Robots, Robots! with Kate Darling

Episode Date: June 1, 2022

When we picture robots, we normally think of an artificial being created in our own image. But what if this were deeply misleading? Author of The New Breed, Kate Darling, joins Adam to separate fact from science fiction and discuss the potentials and perils of real-life robots. They get into the ethical issues involved with autonomous weapons systems and vehicles, why robots don't need to look like people, and why robots might be better thought of as animal companions, rather than human replacements. You can purchase Kate's book here: http://factuallypod.com/books Learn more about your ad choices. Visit megaphone.fm/adchoices See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

Transcript
Starting point is 00:00:00 You know, I got to confess, I have always been a sucker for Japanese treats. I love going down to Little Tokyo, heading to a convenience store, and grabbing all those brightly colored, fun-packaged boxes off of the shelf. But you know what? I don't get the chance to go down there as often as I would like to. And that is why I am so thrilled that Bokksu, a Japanese snack subscription box, chose to sponsor this episode. What's gotten me so excited about Bokksu is that these aren't just your run-of-the-mill grocery store finds. Each box comes packed with 20 unique snacks that you can only find in Japan itself.
Starting point is 00:00:29 Plus, they throw in a handy guide filled with info about each snack and about Japanese culture. And let me tell you something, you are going to need that guide because this box comes with a lot of snacks. I just got this one today, direct from Bokksu, and look at all of these things. We got some sort of seaweed snack here. We've got a buttercream cookie. We've got a dolce. I don't know, I'm going to have to read the guide to figure out what this one is. It looks like some sort of sponge cake. Oh my gosh. This one is, I think it's some kind of maybe fried banana chip. Let's try it out and see. Is that what it is? Nope, it's not banana. Maybe it's a cassava potato chip. I should have read the guide. Ah, here they are. Iburigako smoky chips. Potato
Starting point is 00:01:15 chips made with rice flour, providing a lighter texture and satisfying crunch. Oh my gosh, this is so much fun. You got to get one of these for themselves and get this for the month of March. Bokksu has a limited edition cherry blossom box and 12 month subscribers get a free kimono style robe and get this while you're wearing your new duds, learning fascinating things about your tasty snacks. You can also rest assured that you have helped to support small family run businesses in Japan because Bokksu works with 200 plus small makers to get their snacks delivered straight to your door.
Starting point is 00:01:45 So if all of that sounds good, if you want a big box of delicious snacks like this for yourself, use the code factually for $15 off your first order at Bokksu.com. That's code factually for $15 off your first order on Bokksu.com. I don't know the way. I don't know what to think. I don't know what to say. Yeah, but that's alright. Yeah, that's okay. I don't know anything. Hello and welcome to Factually. I'm Adam Conover. Thank you for joining me once again as I talk to an amazing expert about all the things that they know that I don't know that you probably don't know. Both of our minds are going to get blown together. We're going to have a fantastic time. Now, before we get started, I want to make a big announcement. I am going on tour this summer. I have a brand new hour of stand-up. No one has seen it before. I'm taking it all across the country to the following cities. We got Phoenix. We got Boston. We got Washington,
Starting point is 00:02:51 D.C. slash Arlington, Virginia. We got Nashville, Tennessee. We got Spokane, Washington, Tacoma, Washington, and the Bell House in New York City to round it out. If you live in or near any of those places, head to adamconover.net slash tourdates to pick up tickets to these shows. And here's a special guarantee. If you come to the show, I will take a selfie with you afterwards should you want to. Only at your request, but I will happily do it with every single person who attends one of these shows. Again, brand new hour of stand-up, so excited for you to see it. adamconover.net slash tourdates for tickets. And of course, please don't stop watching The G Word on Netflix. I have loved hearing everyone's responses, seeing them on social media. Dare I say it, the show seems to be resonating with people, and nothing could be more important to me. So I thank you so much for
Starting point is 00:03:45 watching. Please keep checking it out if you haven't already and let me know what you think on social media or by emailing factually at adamconover.net, which is my special address. And I always love hearing from you, but enough promo. Let's get to this week's show because we have such a fun one for you this week. This week, we are talking about robots. You know robots. You love robots. The square guys, they go beep, beep, boop, boop. I cannot get enough of them.
Starting point is 00:04:15 But haven't you ever wondered where robots came from? Like, we were imagining robots long before we could actually create them. And now, we are making robots that literally resemble the science fiction robots that we previously used to imagine. So where the hell does this concept originate? And what does it have to say about humanity that we have this constant companion in our fantasy lives? Well, the concept of the robot dates back to the Jewish tradition of the Gollum. This is a being conjured from inanimate clay who's created in order to protect medieval Jewish ghettos and occasionally run some errands. A Gollum isn't quite human, but it can do human things. And, you know, does it sometimes get out of pocket and need to be put down? Yes,
Starting point is 00:04:57 of course, like all artificial humans do. But, you know, if something sounds familiar about this story, well, it's because the creation of the golem out of clay mimics the creation of humanity from dust in the Abrahamic religious tradition. So isn't that fascinating? When we talk about a robot, we're almost talking about an artificial human created in our image. And when we tell that story, we are recapitulating the origin myth of the Abrahamic religions, that God created humanity in his image. And there's just as much depth with the story of Frankenstein, the 19th century story written by Mary Shelley of a man who used science rather than Jewish magic to reanimate a dead body. And the result of that story was chaos. Everyone rejects the super strong,
Starting point is 00:05:44 super ugly, super emo brute who vows revenge on the misguided dork who created him. Now, the lesson from both of these stories is the same. Don't breathe life into a person. You are not God. It goes wrong. Now, the first time the word robot actually appears was when it was coined by the Czech playwright Karel ÄŒepik in his 1920 play, Rossum's Universal Robots. This play was about scientists, mass-producing workers who didn't have a soul or feelings.
Starting point is 00:06:12 And eventually, of course, the robots decided to take over. And pieces of this story have been reappropriated and reused in science fiction ever since. Data from Star Trek is a robot in search of his human soul. The robots from the Matrix subjugate all of humanity. So this meta story, the story of humans who create intelligent artificial life in our image, has permeated our culture and it has influenced the real world development of real world robots too. The great science fiction author Isaac Asimov's
Starting point is 00:06:45 Three Laws of Robotics, which are all about making sure robots don't injure or disobey human beings, have actually guided the making of robots today. And companies from Honda to Boston Dynamics have tried to make science fiction a reality and build robots that actually resemble and move and work like humans.
Starting point is 00:07:03 This story also influences our real life anxieties about robots. I mean, hell, Andrew Yang ran an entire presidential campaign off of the premise that robots would someday replace us. But here's the thought. What if this entire story of what robots are and what they might be in the future is off? What if there is another way to think
Starting point is 00:07:27 about our interactions with the non-human entities we create? What if there's a story that can provide us a more realistic, less apocalyptic, more helpful view of what these things could be? Well, as our expert today compellingly argues, there is a way to think about robots that makes a lot more sense. Instead of comparing them to ourselves, they're a lot more analogous to a much more familiar human companion, animals. If you start to think about robots as compared to our relationship with animals, you might be able to get a much more accurate and thorough understanding of what they are, what they
Starting point is 00:08:02 might be, their promise, and their potential perils. So, to elaborate on this fascinating idea and to discuss how animals can help us make sense of robots, our guest today is Kate Darling. She's a research specialist at MIT's Media Lab and the author of The New Breed, What Our History with Animals Reveals About Our Future with Robots. Please welcome Kate Darling. Kate Darling, thank you so much for being on the show.
Starting point is 00:08:29 Thanks for having me. So you are a robot ethicist. What does this mean? Yes, some people call me that. I know it sounds very science fiction-y, but it's basically just thinking through kind of the near-term effects of robotic technology from a social, legal, ethical perspective. I think there are actually quite a few very interesting questions
Starting point is 00:08:49 that are coming up in the near future around integrating these machines that can sense, think, make autonomous decisions and learn. So that's my job. And wait, you said some people call you that. What do you call yourself? And what you said, some people call you that. What do you call yourself? Usually I just call myself a researcher. It's hard. I reside at the intersection of disciplines.
Starting point is 00:09:15 I originally went to law school, but I'm not a lawyer and I'm not technically an ethicist. So I exist in the space in between. So, look, I'm so excited to talk to you because I love robots, as does everybody. Like, you just sort of grow up. I just grew up with a love of robots. I don't know. You see a robot, you get happy. It's like same with, they're in the same bucket as like chimpanzees.
Starting point is 00:09:38 There's other things that, you know, you just, you like to think about them. You like to look at them. You know what I mean? You're like, hey, a movie with a robot in it? I'm there, right? But that is a science fiction robot, obviously. How do you define the kind of robots that you're like, hey, a movie with a robot in it, I'm there, right? But that is a science fiction robot, obviously. How do you define the kind of robots that you're talking about? Let's make sure we're all talking about the same thing, first of all. Well, we're never all talking about the same thing because there's no universal definition of robot. And yeah, it's, well, I mean,
Starting point is 00:10:00 there's no universal definition of anything, really, if we want to get super philosophical about it. But I think robot is a particularly problematic one because we're so influenced by science fiction, like you said, that if you do a Google image search for robot, you get a bunch of humanoid robots with like a torso and two arms and legs and a head and they usually have eyes. that's not necessarily how a roboticist would define a robot, but it's what we all imagine in our heads when we think robot. And that's kind of, that's been a fascination of mine for a while, that we have this image of robots that's very humanoid, but that a robot can be something that looks very different or does very different things or has a different type of intelligence. But one of the interesting
Starting point is 00:10:45 things about robots and, and forgive me if we're going off course right at the beginning here, but I find this really interesting that, that robot originated as a science fictional concept. Um, well it's what probably, uh, over a century old, right? Like I would guess the beginning of the, of the early 20th century, it came out of like out of like Jewish mythology about the Gollum, right? And things like that. And, you know, it was a fictional concept, but almost more than any other science fictional concept, it has influenced technology in the real world. Like we had probably like 40 years of just like science fiction authors going, I think this is what a robot will be. And then people started trying to invent them basically. And the, and the weird thing that you say about, well, when you look up Google image
Starting point is 00:11:29 search for a robot, you see them with, with heads and arms and legs. And that's not really what they are, except that there are people in companies who are constantly trying to create robots like that. Like the Sony, I forget the name of Sony's, but Sony had a famous one for many years that they would like trot out at trade shows and stuff like that. Oh, you mean the Honda one, the Asimo one? Thank you. That's what I was thinking. Yeah. Honda Asimo. Right. And like, you know, I look at those and go, okay, this is clearly a little more science fictional than reality, but you have people who are constantly, I don't know, half, half the shit that Elon Musk says about robots. He just literally read in a science fiction novel the other day. So it's that, that interplay is really funny to me. I mean, because that's not true of
Starting point is 00:12:09 like cars or other, you know, like the Internet that we have was not invented by science. It was not in science fiction. It's different from the science fiction that we had. But robots were often trying to create the fictional reality in a way, it seems like. Totally. I think, yeah, and you hit the nail on the head. Like, we have all these technologies like cryptocurrency, where people can't really picture in their minds what that is. But for robot, we have this very specific idea of what it is and should be. And it comes from science fiction, or it comes even, I mean, we've always been fascinated with recreating ourselves.
Starting point is 00:12:42 If you go back to the ancient Greeks and their automata that they had, I mean, it's incredible how throughout history we've tried to like recreate a human. And I think that that is still part of what, like you said, robotics companies are doing today. A lot of people in robotics and in artificial intelligence are trying to recreate a human body or human ability, human skill. And while I totally understand that impulse and I understand why we're fascinated by it, I also think it's a little bit boring to recreate something that we already have. And we have the ability to create different things and things that could actually supplement us. Why are we so focused on recreating a human? That is a really good question. Why is that our fascination? I mean, it's our fascination in our lives to create. We want to procreate, right? We want to create more life, or at least many people do. A lot of people have a goal in life of creating a tiny new human and
Starting point is 00:13:45 raising it up. And so maybe there's some sort of broad human psychic need to create, yeah, reproductions of ourselves that walk around. But it's true. What is the utility of that type of robot? Like probably not that high because we can already do all those things. Like you can, you don't need a robot to wash your car because you could pay a human to wash your car. Right. Or, and like, even if you had a car washing robot, why would it look like a human? I think sometimes like you could make a much more efficient, a car wash is basically a robot that washes your car and it's much cheaper and more efficient to build than a humanoid robot that goes around
Starting point is 00:14:25 with a sponge and a hose, you know? But a lot of people who make the humanoid robots are still making the argument, oh, well, the robots need to look like humans because we have a world that's built for humans. We have stairs and we have, you know, door handles and things that the robots need to be able to operate. And even that I think is a little bit short-sighted. When you really think outside the box, you could think about our infrastructure and how it's created for able-bodied humans and not for wheelchairs or strollers or assistive devices. And suddenly you've solved two problems because you can create a world or an infrastructure that can accommodate a wheeled robot, which is much
Starting point is 00:15:05 easier to create than, you know, a two-legged thing that walks around. And you have a more accessible world for people. Yeah. These are incredible points. Well, I want to know when you are, apart from science fictional robots, when you're talking about what are the ethical problems or dilemmas of real robots, of the robots we actually have, what kind of robots are you talking about? Or the kind of robots we intend to create in the very near future? So for me, a robot is a physical machine that can sense its environment and make
Starting point is 00:15:36 some sort of autonomous decision based on what it's sensing and then act on its environment again. And so anything from autonomous weapon systems, I think the ethical issues there are pretty obvious. Autonomous vehicles, robots in the home that might have microphones and cameras that are recording everything that you do in the intimacy of your own home. Well, is that science fiction? No, that already exists. Okay. All right. We're already in the worrisome future. Okay, got it. Yes. So I think lots of issues. And it's always struck me so interesting that we know there are issues, but people are freaking out about them in a way that I think is also a little bit short-sighted because people are like, well, this is totally new.
Starting point is 00:16:25 We're in this new science fiction world. We have these robots that can make decisions that no one could anticipate, not even their creators. What are we going to do about responsibility for harm? What are we going to do about people developing emotional connections to these machines? And I think one of the things that we forget is that we do have a long history of engaging with entities that can sense and think and make autonomous decisions and learn, they're called animals.
Starting point is 00:16:50 And we have integrated them into the workforce. We have created weapons out of animals. We have sent animals on spy missions, changed our communication with each other. Send them to space. Totally, yeah. Anything that we don't want actual humans to do, we send animals. And that's also the great use case for robots, even better use case for robots because you're not killing them.
Starting point is 00:17:13 Wow. I never thought of that connection. And that's absolutely right. Like if you look at the long history of the domestication of the horse, which, you know, now the horse has finally been freed from, at least in the United States, from the burden of being a mechanical animal that we force to do our labor, right? Horses generally are very rarely used for that, whereas they used to be a system of transportation, like literally horsepower. We would power things using horses. That's why we have the phrase. And we just had like, well, millions and millions of horses living in misery. But we, we created that animal over the process of domestication, training them, you know, breeding them, et cetera, et cetera, you know,
Starting point is 00:17:57 over the course of thousands and thousands of years. And like, yeah, now that you mentioned it, hold on a second, that is completely analogous to like trying to create a robot to do certain things that we don't want to do. Am I right? Yeah. And I've always been so struck by how perfect the analogy is. Obviously, it breaks down in some cases. Robots aren't exactly like animals.
Starting point is 00:18:19 But I think that getting people away from this human comparison and towards thinking about how we've had these other types of skill that we've drawn on in the past. I think it really kind of gets people away from this kind of technological determinism that the robots are coming to take our jobs and that the robots are coming to replace our relationships and that type of thing. It gets us thinking, even in the design process, more creatively about how we can actually harness the skill and the strength of robotics and artificial intelligence to be partners in what we're trying to achieve instead of just trying to, again, recreate what we already have. And so looking at the diversity
Starting point is 00:18:57 in the animal world and all the different skill sets and sensory abilities that animals have that we've been able to make use of, I think is a really great starting point for thinking about what possibilities we have in robotics. Yeah, that's really cool. Okay. So it helps ground our conversation about it because instead of saying, oh, wait, what if we invent a robot that people love the robot so much they stop taking care of each other and they only want to make out and kiss the robot because they love the robot so like that sort of like horror story apocalyptic story well we already have these other non-human things that we love they're called pets and we understand that pets are like complementary to the rest of human life we're not like we i love my dog my dog loves me but not in the same way. I love my girlfriend. Yeah. Your dog doesn't replace your girlfriend. Yeah. And no one freaks out. Like if you don't
Starting point is 00:19:50 have a girlfriend and you get a dog, no one's going to be like, Oh, you're never going to get a girlfriend now. Right. But the, but the dog also fulfills a certain set of human needs. Like when I, when I I've thought a lot about when I'm staring into my dog's eyes and I'm petting her little head and I'm going like, why do I do this? I do it because it fulfills a human need in me to care for another living thing. Like it makes me feel good to simply do it. And there's like, you know, I don't know. I believe there's research on, you know, elderly folks when they, when they have a pet or a plant to take care of, it like improves their overall wellbeing because that have sort of responsibility is like good for you in like a small way and in little pet
Starting point is 00:20:29 sized doses. Like it's, it does, it does, it gives me something that I want, but it doesn't like replace a baby. You know what I mean? But it gives me a little taste of it maybe in a way that's enriching to my life. And so maybe a robot could do the same thing. Is that the idea or no? It is the idea. And it's a little bit freaky, but a lot of research in human-robot interaction is showing that robots can have a similar kind of emotional role in people's lives. And where, okay, so there's this baby harp seal robot called the Paro, and they use this in nursing homes and with dementia patients. And so you were saying like the old people, when they get a pet or a dog or something to take care of, it actually improves their lives. Okay. So they're giving them these
Starting point is 00:21:15 robots and the robot is an animal therapy replacement. It makes these little sounds, it responds to your touch, and it really gives people the sense of kind of nurturing something. And it's a little bit freaky, like I said, because it's not a living thing. So are we, you know, tricking these people? And yet it has really, really good effects. It's really good for people's health and well-being, and they've been able to use it as an effective alternative to giving people medication to calm them down. There are a lot of contexts where we can't use live animals. So even though it's kind of controversial, I think that it's really interesting to see
Starting point is 00:21:50 kind of the net benefit of using a robot and also really interesting that we will respond to the robot in the same way that we respond to a therapy dog, even though we know that it's not real. Yeah. Well, I mean, I could easily imagine myself having the same reaction if there was a robot that behaved
Starting point is 00:22:10 like even 50% as complexly as my dog, right? You know, that looked at me, that wanted me to do a thing, that made a little sound. Like I would just start treating it that way, you know? I believe that about myself. But yeah, I don't, I don't, it's funny that way. You know, I believe that about myself. But yeah, I don't. I don't. It's funny that you would describe that that that baby harp seal robot as controversial. It doesn't sound like it would be bad to me, especially like, yeah, if someone is very elderly,
Starting point is 00:22:37 has, you know, really struggling with dementia, maybe they can't actually care for a regular animal, but you still want to give them that experience. How is that different from, you know, a child playing with a baby doll, right? And having the experience of playing with a regular animal, but you still want to give them that experience. How is that different from, you know, a child playing with a baby doll, right? And having the experience of playing with a baby doll, except which is something that children have been doing since literally pre history. Um, except that in this case, it's like a little bit more complex. It gives a little bit more feedback. It, it, it does its job a little bit better than an inert baby doll, right? Yeah. It's no longer just solely in the realm of imagination. It's actually like giving you these biological cues
Starting point is 00:23:09 as Sherry Turkle, who's one of the critics of this, describes it. But I do, you know, I think to some extent, people are right to freak out about this because I think one of the things they worry about is that we're just creating a society where we give people robots instead of human care. And that's, I mean, I think that of the things they worry about is that we're just creating a society where we give people robots instead of human care. And that's, I mean, I think that's absolutely correct because the robots can't replace human care.
Starting point is 00:23:31 Like in this case, they can be a sufficient animal therapy replacement, but they're not going to replace, you know, a person. But that seems like a broader societal issue. And I think with robots, we tend to focus a lot on either blaming them for societal problems or trying to fix societal problems by using robots, both of which I think is the wrong way to think about robots. Yes. Oh, I agree with you entirely. I mean, my criticism of the robots are coming for our jobs, automation coming for our jobs argument, which we've talked about on the show before in at least one or two episodes that you can go find. But is that there's, you know, it lays all the blame with the technological. The robots are going to become so good that we're not going to be able to do jobs anymore. And I'm looking at it from a labor perspective thinking, hold on a second. There's no way that the cost of a robot is ever going to fall below the cost of abusing and underpaying a human being to do a job.
Starting point is 00:24:29 You know, like robot taxis are, I don't think, ever going to take over for a driver who you can pay less than minimum wage because the robot is probably going to cost more. know, for Uber to have to take care of like a million dollar robot car and make sure nothing bad happens to it is a lot more work for Uber than Uber saying, hey, you bring your own Honda Fit and we'll pay you five dollars an hour. Right. And so now there are certainly cases in which like automation can replace things that humans are doing. But in that case, the problem is what is the human who is setting up the job doing? You know, like problem is what is the human who is setting up the job doing? You know, like what is the what is the human infrastructure of capitalism, of the corporate machine doing?
Starting point is 00:25:10 It's not the robot. It's like what are humans doing to each other is the fundamental question that like we need to focus on. And so as a result, I always find the sort of like, you know, super intelligence robots coming for our jobs. We all need to like, I, you know, I have no strong opinion on UBI, right? There's that's a whole thing, but that's often presented as we have to do this because of the robots. And I'm like, this is not the conversation we need to be having. Yeah. We don't need to get into UBI.
Starting point is 00:25:38 If you please UBI people don't start emailing us. We'll do a different episode about UBI. As soon as you mention it, like suddenly a thousand nerds zoom up to you and they're like, yes, but what do you think about UBI though? And we're not, different episode, different episode. But do you, does any of that track for you? Oh, totally. Yeah.
Starting point is 00:25:55 I think you're absolutely right. I mean, look, if there's anything the pandemic has taught us is that robots are not waiting in the wings to take everyone's jobs. Otherwise it would have happened, right? There's all these labor shortages and the robots are not swooping in to take jobs. Right. We don't have the technology. Like you said, it's not cost effective. And I think for me, the even bigger issue is that I think we kind of take our system of labor for granted and the ways that we think about labor and, you know,
Starting point is 00:26:28 companies trying to automate tasks or even put the humans in positions in an assembly line or on a factory floor where they're doing just a rote task that the companies are hoping to eventually automate away. I think that's a really uncreative way to think about human labor, not to mention probably very unethical. Like you're basically putting humans in these roles that are like very unhealthy and unfulfilling for anyone. And so a more creative view of labor would look at, OK, what are the strengths of people? What are the strengths of machines? And how can we actually rethink the work process to combine those instead of just trying to automate away the pesky humans? But it's really hard to change. I mean, our whole political and economic system is set up for this unbridled corporate capitalism that
Starting point is 00:27:24 chases short-term profit incentives. And so there's not a lot of longer-term thinking about how we could have a more productive society. And I think that's the issue. And like you said, we focus so much on the robots are coming to take the jobs and we give the robots so much agency that we forget that these decisions are being made by people, about people, and they are choices. They are choices. We could be doing something different. Yeah.
Starting point is 00:27:50 And as always, I think this is a theme we return to on this show a lot, the technology that we really need to be focusing on isn't like silicon. It's like our societal technology of how did we organize our society. That is somewhere that we can also advance. That's an area that we have not been able to make any advancements there in the last couple of decades, but that is a form of human technology as well. It's just like the social infrastructure, the way we organize our world. Totally. And it means we all have a say in this. We all have a little bit of power in shaping the future because it's about the political and economic systems that we
Starting point is 00:28:25 are all choosing. It's not just the people designing the robots shape the future. Yeah. Well, I'd love to talk a little bit, Ben, that would have been a great note to end on, but we're only 20 minutes into the podcast. But like, what a great big picture ending. So we'll have to work our way around there again for the end of the conversation. I mean, okay. So, so I'd love to talk more about the social robotics that you were, that you were talking about all the different dimensions of that. So you talked about the, the baby seal, the pleo robot for just being a companion robot for very elderly folks. And I'm curious, are there other examples in that world
Starting point is 00:29:03 of social robotics that are either happening or close to happening that are interesting to talk and think about? Yeah. Well, I've been really fascinated by social robots ever since I got into robotics because having these robots that interact with people on a social level and then seeing both through my own experience but also through a lot of research in the space, seeing that people will consistently treat the robots like they're alive, even though they know perfectly well that they're just machines is so fascinating to me. And the fact that people can develop genuine empathy for robots and genuine emotional connections. Let's see. You know, I think the main problem that social robotics has commercially is that people's expectations are too high for what a robot should be able to do. And so if you try to create like a little humanoid robot that can like talk to people and like fetch things, that's not going to work. It's going to fall over. It's not going to understand your commands.
Starting point is 00:29:59 And people are going to be like, why should I pay, you know, $1,500 for this thing, which is still like the price point for these robots. But something with like a slightly more clever design, something that mimics more of an animal or like a mythical figure that you've never even like seen in real life, you don't have as many expectations for what that robot is going to do. And so let's see, one of the ones that was recently announced is Amazon has this new home robot called Astro. And it's kind of interesting because they basically took all the functionality of Alexa, but they put it in this like little character on wheels that has a really cute face and can dance and move its body. And I think companies are starting to understand that that is very powerful. Like it doesn't even need to have the functionality of Alexa for people to enjoy a robot that is
Starting point is 00:30:52 physically moving around and making cute eyes at them. And what we've seen is that people who have gotten social robots in their homes, a lot of companies have like had these robots and gone out of business, but the people who got them actually really bonded with them. And so I think that the social use case, almost like, you know, why we have pets. We used to have animals in our homes to guard our grain stores and guard our houses. But now we have pets just because for many reasons, including what you said about like, we like this, the experience of caring for something or interacting with something. And I think that there's a use case for robotics there as well, where it hasn't quite hit the market yet, but I think we're going to see more companion
Starting point is 00:31:36 robots. Well, and, you know, I'm a big video game player and a whole sort of design pattern in video games is caring for a virtual creature or a virtual character of some kind. You know, there's the Tamagotchi, obviously. There's, you know, Pokemon is entirely, you know, based like a big thing that people like in Pokemon. They add more features every generation. They add more features where you just play with your Pokemon and feed them treats and you increase your bond with them. And people then go out of their way to take the same Pokemon from cartridge to cartridge, from generation to generation. And, you know, I think if you ask those people, they wouldn't think it's alive, but they're there or, you know, that they have any ethical obligation to it or that they're really caring for it in a real sense.
Starting point is 00:32:21 But they are doing it as a form of play and they enjoy it, you know. for it in a real sense, but they are doing it as a form of play and they enjoy it, you know? And I've played, first time I played one of those games with my girlfriend, Lisa, she was like, oh, I love this one. Oh, I'm going to feed him. You know, like that's, you have that reaction. And I could easily see that being put onto some kind of robot that sounds unquestionable that that'll be a big Christmas hit in some years hence, right? That I can completely imagine what you're talking about. And now the question is, if I would want to have that, if it'll become so successful, people have it in their house for like 10 years, or if they, you know, pack it up, you know, by the time it's spring. But if it's a toy or if it's like truly an addition to our lives in the same way that a pet is. Oh, totally. And I think that, you know, that's one of the challenges is making the robot interesting enough that people want to keep engaging with it.
Starting point is 00:33:15 And again, matching people's science fictional expectations of like all the cool things that a robot can do, which robots mostly can't do. a robot can do, which robots mostly can't do. So I think it's going to take some very clever design tricks and some tricks to keep people engaged, which also then runs into some ethical issues because unlike a pet, there's probably a company behind this robot that's trying to keep you engaged. So you run into then all sorts of interesting questions, but I do think we're going to get there. I think that even more, I mean, what research shows is that even more so than a Tamagotchi or a video game or anything on a screen, physical robots, people will bond to much more easily, more strongly. And we love anything in our physical space. We're very physical creatures. Our brains are scanning our environments, trying to separate things into objects and agents. And yeah, the robots are definitely objects that move like agents and trick our brains.
Starting point is 00:34:15 And so people will name their Roombas, and people are definitely going to love having a slightly more sophisticated social robot. Well, so let's talk about a couple of the ethical issues. Like you mentioned, companies will be designing these. And I agree that perhaps they could design them in ways that take advantage of us. You know, there's plenty of, again, in the world of video games, there's plenty of companies that will take advantage of something that people enjoy in a video game and push the mechanics so far it can become abusive or addictive.
Starting point is 00:34:47 You know, it'll suck money out of players pockets or they'll feel compelled to keep pouring money into it, for instance, because it's giving them a certain sort of reward that they they can't get enough of. And I think about how, look, you know, like dogs and cats have taken advantage of us. Right. Like dogs, the reason, like the real reason we have dogs is because they were eating our trash, right? They were just like, come to human settlements, eat our trash. And eventually they evolved to sort of take advantage of what we find cute, right? Like to sort of resemble human babies in a certain way to such a degree that, you know, we invited them into our homes. And so in a sense, you could look at them as like parasites on human society, right? Or symbiotes at the very least.
Starting point is 00:35:31 They, you know, through the process of evolution over thousands of years, they managed to sort of take advantage of certain features of what we like and give that to us. Now, if you are giving a designer at a robot company the remit of, hey, make humans love this device, make them love it so much, I could see that maybe getting to some weird places. Does that make sense to you as a concern? Yeah. That's probably my main concern that is just being totally overlooked about social robots. Because we have a long history of persuasive design and technology from like the ways that casinos are designed to, you know, shopping malls and like the music. And then of course, you know, online, the internet
Starting point is 00:36:18 and the apps, like all like so, so much effort and money and investment goes into just getting people to stay on the app a few seconds longer. And so take that and put that in a robot that you now also have like an emotional connection to. And it's fun. Like you said, it's like in the video games, like people love to do this. So people are going to do this. And then what are you using that engagement for is the question. Are you trying to improve the user's life or are you trying to manipulate the user's behavior so that they buy more of a certain product? Right. And we've seen this a little bit with pets. People have really exploited, their entire industry is that exploit people's emotional connection to their dogs. I know you said you have a dog, so I'm going to tread lightly here, but there's like entire like doggy hotels and spas and dog ice cream and like all these things that people will spend money or even like the medical procedures that people will spend money on for their dogs did not exist before dogs started becoming like really part of the family. And I think it's getting, it's ramped up even more like,
Starting point is 00:37:26 and, and, and to what extent is that taking advantage of people's love for their pets? And to what extent is that something that people actually want? Yeah, no, that's a really good point that there's a lot of products that are like, you know, Hey, you should buy the expensive dog food. Cause this one is made with with real beef or whatever. It's taking advantage of your human assumptions about about food, but applying them to don't you want your dog to be happy? Don't you want your dog to be healthy? And it's like, look, your dog would literally be happy eating like rat carcasses. Your dog is dogs are scavengers. They they eat trash there. They don't need fancy food.
Starting point is 00:38:05 If you want to give your dog fancy food, more power to you. But yes, you're right. Those companies are taking advantage of our emotional connection to our dogs. And so, yes, if you imagine a robot pet that is not only creating that emotional bond, but can also communicate that it feels sad and then offer you the chance to purchase something that will make it happy. Yeah, exactly. communicate that it feels sad and then offer you the chance to purchase something that will make it happy. Right. Click the,
Starting point is 00:38:26 click the micro payment and your dog will get a new skin, a new, a new, you know, cosmetic feature. And it will, and it will seem so happy and joyous and thankful. And you'll have enriched your,
Starting point is 00:38:37 your fake pet's life or your digital pet's life. That is, I could see that turning into an abusive design pattern. And I like the way that you put it is, is are you creating the technology to enrich people's lives? Or are you trying to do it to suck money from their pockets or abuse them? Bringing it back to video games, I always think of the difference between, you know, there are some games that, you know, like Nintendo, right? When Nintendo, when you buy a Nintendo game, it might really, they're trying to make you like it. They're designing it in such a way
Starting point is 00:39:08 to really keep you hooked. But really, they just want you to pay your $40 to $60. And then after that, you have a good time. And best case scenario, they want you to buy a new game next year, right? As opposed to the types of games where, hey, it's free upfront, but we have designed this game
Starting point is 00:39:22 to suck progressively more and more money out of your pocket to, you know, instead of just loving the character and enjoying your time in the fictional world, we want you to love the character and we want you to buy outfits for the character. We want you to buy new weapons for the character, et cetera, et cetera. If you're a video game player, you're familiar with these design patterns and you probably feel one of them is enriching and one of them is kind of predatory and we could have that same thing going on with our robots is what you're saying oh totally and we're already seeing kind of the very beginnings of this so sony has this robot dog called the aibo um it originally came out in the 90s was super popular in japan and in the u.s and like in japan even you know some of the
Starting point is 00:40:03 aibos that that still exist from the 90s that have broken down beyond repair at this point, families are having funerals for them at Buddhist temples. So it's like people really made the Aibo part of their family. And there's a new version of the Aibo that costs almost $3,000 just for this robot dog. And additionally, you have to subscribe to a cloud service. And if you don't pay for the cloud service, the dog will like lose all of its memories and some of its abilities and things that you've taught it. Yeah. So, and, and, and, you know, you could make the argument or Sony could make the argument that, you know, it's expensive to like maintain these servers. And, you know, we need to be actually we make the robot cheaper by only charging the people who are actively using it.
Starting point is 00:40:55 But they also haven't set the price for the cloud service yet. You get the first three years free and then they're going to figure out how much to charge you. And so then they can make the decision. Am I charging based on how much people like this dog? Or am I charging based on how much it's going to cost me to run these servers? And it's really hard to tell what they're going to do. Well, God, that is so fascinating. We have to take a break. I love talking to you, but we have to take a really quick break. And then I have a lot more questions. We'll be right back with more Kate Darling. OK, we're back with Kate Darling.
Starting point is 00:41:40 We've been talking about the ethical problems with robots and social robots. I do want to ask a prurient question. Sex robots, are these a thing? Are these going to be a thing? Or should we be concerned about them being a thing? Are they a concern that you have? And certainly that's another human emotion that, you know, needs to be fulfilled. And I could imagine it being helped out by some sort of robot, you know?
Starting point is 00:42:05 Well, now we get to the question again, what's a robot? What's a sex robot? Because if you go to a sex shop right now, you'll find plenty of like, you find smart vibrators and like all sorts of things that, you know, you could call a robot. But I think what you're talking about is kind of the humanoid, the human shape sex robot, right? I guess I don't even know what I'm talking about. I'm just curious about that. Say, yeah, sorry, please go on. Well, because I would say we already have sex robots to some extent in various forms.
Starting point is 00:42:55 I think that, you know, the holy grail for a lot of people in the sex robot industry is probably creating kind of a humanoid sex robot that's actually compelling and doesn't cost, you know, $14,000 or whatever they cost right now. Right now, they're basically like, they're basically dolls that like have a heartbeat and can like say some things. Yeah. feet and can like say some things. So that's definitely something that exists, but it's not anywhere close to having sex with a human. And I wonder whether we'll ever get there because I think recreating humans is not only boring, but also very difficult. So yeah. I guess there's the question of like what, it seems again, like that goal of replicating a human exactly would be off. And what you'd want to do instead is figure out, like, what is the experience? What is the like the emotional connection that people need to have?
Starting point is 00:43:36 You know what I mean? That people are missing. And how could you provide that? What is the version of Amazon Astro, but for but for sex, right. Where you're not trying to do verisimilitude. You're not trying to create a fake person. You're trying to create like some sort of like happy, loving emotion of some kind. Maybe, maybe the innovation is in like how the, how the device speaks to you rather than, I'm sorry, we don't need to get into it too much. I feel uncomfortable in the area, but okay. Let me ask ask you something else. Something that's always fascinated me thinking about robots is the moment at which we become ethically beholden to them. we're happy that horses are not hauling us up and down the street anymore, generally, because sort of, we all know like, like that sucked for the horses. Right. And we, and, and we, we had like our whole society was built on like misery of horses and also at the same time, misery of people still is built on the misery of people in many ways. But you know, we're, we've, we're happy
Starting point is 00:44:39 to not have unhappy horses all over the place. Right. And you, you know, you had people who were going, Oh my God, this is bad for the horses. Do you feel that, you know, as we create robots that we have more and more of an emotional connection to, are we one day going to start to object to what we are doing to them, you know? And is that, how do you think about that question at all? And if so- Oh, yeah. I think about it all the time. Well, and also because I'm so glad you started with the horses because usually the whole robot rights conversation starts in the science fiction realm where in all of our sci-fi, robot rights is basically, it's the question, when are robots sufficiently like us to deserve moral consideration?
Starting point is 00:45:26 But animals aren't necessarily like us. And we've been very selective about what animals we've given moral consideration to. So yes, in the United States, we love horses now. We don't want to see horses mistreated. We don't eat horse meat. In other countries, people still eat horses. It's very like, you know, I'm sure you wouldn't want to eat your dog. In other countries, eating dog meat has been a thing. Like, it's super...
Starting point is 00:45:59 The animals that we relate to emotionally or culturally are the ones that we want to protect. And so, to answer your question, will we, when we emotionally relate to robots, want to give them moral consideration? I think the answer is definitely. I've already seen in some of the research that I've done and others have done that people genuinely empathize with robots that respond in a lifelike way. They don't want to see them, quote unquote, mistreated. A lot of people have seen these videos by this company, Boston Dynamics, that makes these very lifelike way. They don't wanna see them quote unquote mistreated. A lot of people have seen these videos by this company Boston Dynamics that makes these very biologically inspired robots.
Starting point is 00:46:30 And there was one that they released in 2015 of a robot being kicked and the robot looks kind of like a dog and people got very upset. And they were like, why does this bother me so much? It's very visceral to watch that. And then of course you have stories like, you know, Westworld where it's clear to watch that. And then of course you have stories like, you know, Westworld where it's clear to us that if the robots look very lifelike, it's not a comfortable experience
Starting point is 00:46:53 for us to watch them being quote unquote mistreated. And people are going to have a problem with that. And if they don't have a problem with it for themselves, they're going to have a problem with it for their kids seeing that so i think we're definitely going to get into the that conversation but the robot rights conversation is going to be about that and it's going to have nothing to do with what the robots inherently feel or how intelligent they are because we're so far away from creating anything approaching human consciousness and it's not even going to look like our consciousness. And we haven't even given animals rights really. I mean, we still own them completely and we still eat a lot of them. So I, I, I would, I don't think we're anywhere close to having like a, anything close to human rights for robots, nor might we ever, we may not ever get there.
Starting point is 00:47:41 Yeah. I mean, it's funny because the science fiction version is always the robot saying like, I am a man, you know, I feel like there's a famous episode of Star Trek where Data is like on trial for whether or not he's, you know, sentient and has human rights. It's an incredible episode of television. Measure of a Man is the title. Yes. I remember there was an early, I was fascinated by a teenager. There was some piece of like there was like a comic book set in the world of the Matrix about like the moment at which robots began to like protest for their rights in like the history of the Matrix, you know. Like, and that's always connected to the intellectual question in that Star Trek episode of what is what sorts of things deserve rights? You know, are the robots conscious? Do they have an inner life? You know, et cetera, et cetera.
Starting point is 00:48:33 But in the real world, like we choose what we have an ethical obligation towards generally. And this is this is my own intuition, but generally based on how much it acts like us. this is, this is my own intuition, but generally based on how much it acts like us, you know, like if a human seems to, if a human talks to us and, you know, uses the same language as us, and like, it seems to behave the same way we say, oh, this is a person with another inner life, just like mine. And therefore it has rights. And I have responsibilities towards this person. And the, and the less we see ourselves reflected in another being, the less we are likely to like give it that consideration. Right. People, people who are less like us, we tend to offer them less rights, you know, and animals like the more an animal. Because my dog looks into my eyes and seems to exhibit human emotions because I'm able to anthropomorphize my dog.
Starting point is 00:49:21 I'm like, oh, she has rights. I have to treat her very well, but a crab does not, you know? And so I just intuitively extend less of that to a crab. So my point is it'll probably, my intuition is it'll take more of the form of people will just sort of spontaneously start feeling empathy for robots that behave in a certain way. And that'll be what we have to reckon with. Not with the robots like marching in the streets, but we'll have to reckon with our own emotions about our ethical responsibilities towards animals. We'll start just like arising. Does that sound right to you? That sounds spot on to me. I think that's absolutely how it's going to happen. And that you're right that I think we very much default to anything we relate to as humans.
Starting point is 00:50:07 There's also that story about the Save the Whales movement. Like people did not care about whales at all. Like we were still killing and eating whales. And then someone recorded them singing and suddenly everyone's like, oh, the whales can sing. Let's save the whales. And that's how the Save the Whale movement took off. Like Greenpeace had a field day with that. And so, you know, I think that that's our default, whether that's a good or a bad default.
Starting point is 00:50:30 You know, if you look at human rights, that hasn't worked out well for us. I also think that maybe for animals, it hasn't worked out so well, like protecting the cute animals and not the other ones. Yeah. I think generally. Cows are cute by the way, you know, like that's part of, it doesn't always work because cows are cute and very sensitive, and so are pigs. And yet those are the animals that we slaughter in the largest numbers. Totally.
Starting point is 00:50:51 Yes. And yes. So yeah, and I think that some cultural defaults flow into that as well. Maybe that's also a product of capitalism. We have a whole infrastructure to make sure people keep eating cows and pigs. But yeah, I think you used a good word, which is we need to reckon with this. And I think that at the moment that we start having emotions for something that is clearly not alive and we know it doesn't feel anything, maybe that is going to be the time at which we reckon with our default in general and think more deeply about this.
Starting point is 00:51:30 I think it would be good. Yeah. Well, it makes me want to know, one of our really deep things we do as humans is anthropomorphizing, that we say that things are like us. We do it to animals. We do it you know, the same way, you know, we talk to our dogs as though they can speak back to us. We, you know, Coco the gorilla, right? Like, oh, animals can talk. Then they have our same emotions. We also do it to, you know, non-human things. Is the degree to which we immediately go to anthropomorphization, is that worrisome?
Starting point is 00:52:04 Is that a problem in our relationship with robots, or how do you look at that? I would say it depends. It's definitely something that happens. I mean, we're obsessed with ourselves. Like we talked about earlier, we want to recreate ourselves. We see ourselves in everything. And I don't think that part is ever going away, because robots are this really interesting technology where not only do we love to project ourselves onto anything with a face or that moves in an autonomous way, but robots are also often designed to look like something that moves in a way we understand, just for usability purposes. So I think we're definitely going to anthropomorphize the robots. I used to say it depends on the use case for the robot, because if it's a social robot, you want people to anthropomorphize
Starting point is 00:52:56 it. If we're using it in health and education to engage people, great. That's a great effect. If it's a factory arm, it kind of depends on whether anthropomorphizing it makes things more or less dangerous for people. But one of the things that I've more recently read and written about is the use of animals and robots in the military and during wartime. And it's pretty interesting that there are bomb disposal robots out there, and have been for over 10 years now, that soldiers become very emotionally attached to. They'll give them names and medals of honor; they'll have funerals for them. Peter Singer, in his book Wired for War, even talks about a soldier risking his life to save a robot on the battlefield. And so I used to think, this is bad, right? We don't want people anthropomorphizing and becoming so emotionally
Starting point is 00:53:53 attached to something that's supposed to function as a tool that they're actually being inefficient or risking their lives. And then I was reading all about how people used to do this with animals in war, too. They would become emotionally attached to the horses. They would try to save the horses in dangerous situations, which, I mean, seems bad, but... We made a whole movie about a horse, like the movie War Horse, an entire movie about people trying to save the life of a heroic war horse over and over again.
Starting point is 00:54:21 Everyone loves the horse. The whole movie's focused on a horse that doesn't even know it's in a movie, to quote John Mulaney. Right. But the thing is, oh yeah, sorry, the thing is that these bonds with the animals really helped soldiers in what were very difficult situations. And so in situations where you need to emotionally bond with something, having a robot there might actually be a net positive. I can't say whether it's a net positive or negative, but it's useful in some contexts to have that emotional connection. So yeah, I would say it really depends. Again, my main concern is that the sex robot will have compelling in-app purchases, not that people are treating these things like they're alive. I don't think that's inherently bad. Yeah. Well, that's really interesting. Maybe in the future, the movie will be, instead of War Horse, it'll be War Drone: people who are in love with a drone, following the drone from place to place. I mean, that actually raises a good question. You mentioned at the beginning autonomous weapons and robot weapons, and we're nearing the end here,
Starting point is 00:55:35 but I'd be remiss if I didn't ask you: what are your concerns there? Obviously, many of them. I have many concerns. But what troubles you the most? What troubles me the most is the ways in which we project responsibility onto the machines. When we say, oh, the algorithm did it, we couldn't possibly have anticipated this. And that worries me not just in autonomous weapon systems, but in how we're deploying AI generally in the world today. I see a lot of, oh, you know, the algorithm did it. It caused harm.
Starting point is 00:56:10 We're so sorry about that. But it's not really our fault. Oh, look, I work with Netflix, okay? I understand people who give too much weight to the algorithm and use it as an excuse for their human decisions. Right. Oh, the algorithm just decided people are not gonna watch this. No human decided to cancel this. The algorithm did it, right? That's all over the place.
Starting point is 00:56:30 It's infected my business. It's infected businesses around the world, I'm sure. So yes, I know what you're talking about. Yeah, so people distancing themselves from responsibility, I think, is a problem. And I think it's a problem specifically with these kinds of autonomous technologies. I think the animal analogy helps a little bit again, because we've used animals as weapons in war. We've set pigs on fire and sent them into the battlefield.
Starting point is 00:56:57 That's an autonomous weapon system right there. Or I just saw an article the other day where the Russians are using dolphins to defend some naval base. Wow. We have a long history of using animals for all these different things. We have a long history of thinking about who's responsible. Animals can be owned by organizations. They can be trained by one person and handled by another. So we have some history that we can look back on instead of just throwing our hands up and being like, oh, well, the robots should be responsible
Starting point is 00:57:29 or the robots did it. Yeah. And the strange thing, bringing it back to our point from the beginning about technology, is that we tend to use our science-fictional lens on AI and robots to reduce our own responsibility. We say, well, the robots are coming for our jobs, no matter what we do; it's superintelligence; it's the singularity; it's a force of nature. The humans had nothing to do with it, when in reality it's humans all the way down. And we don't do that with animals. We don't say the horses are coming for our jobs, or the drug-sniffing dogs are coming for our jobs, right? We know that they were trained and put there by humans. And in fact, I think people know that if a canine
Starting point is 00:58:16 cop does something he shouldn't do, it's the people who are at fault. No one blames the dog, right? We know that it's a human problem caused by human actions. And we need to retain that same understanding when it comes to robots, if I'm understanding you correctly. Exactly. And that's why I think it's such a powerful analogy, because it changes our conversations in really fundamental ways. And it lets us take some agency back. It's not just about not having responsibility. It's also that we need to have choices in what we're doing with these machines, and we need to recognize that we do have choices. So what should people be thinking about their own interactions with their own robots when they're watching their Roomba putter around on the floor?
Starting point is 00:59:05 Right. And, by the way, Roomba, let's be honest, this is not even a robot. Roombas suck so much. In the words of one of my favorite podcasters, Rob Zacny: you get a Roomba and you have to become a robot foreman. You suddenly have to keep track of your Roomba and figure out where the hell it went, like, oh, God, it got stuck in the corner. You have to babysit a vacuum cleaner. Terrible.
Starting point is 00:59:36 But when people are, I'm sorry, I got distracted by a tangent. When people are thinking about their own interactions with this kind of technology, what do you hope they can take away from this conversation to think a little bit differently about it? Well, one thing that we learned in some of our research around empathy for robots is that empathic people are more empathic to robots than less empathic people. And so if you find yourself naming your robot or having feelings for it, whether they're negative or positive, I guess, because you seem to have very strong feelings about Roombas, Adam. Yes, I hate them so much. I don't even have one and I hate them. But either way, you're projecting feelings onto them. And I think that's not necessarily a bad thing. I think it says a lot about us as people and our willingness to be relational with others. So, yeah, don't feel foolish. If you're doing it, it probably means you're empathic.
Starting point is 01:00:45 And I think our emotions about robots are, like, part of the equation that we need to take into account when we are thinking about them. It has been such a pleasure talking to you. The ideas that you're sharing are so cool. Your book is called The New Breed, right? People can get it. If you folks want to pick it up, you can get it at our special bookshop, factuallypod.com slash books, or wherever books are sold. And Kate, where else can people find your work? I'm an active Twitter user. I'm grok, G-R-O-K, underscore. Amazing. Kate, thank you so much for being on the show. Thank you so much for having me. This was great. Well, thank you once again to Kate Darling for coming on the show. I hope you enjoyed that conversation as much as I did.
Starting point is 01:01:30 Once again, if you want to pick up her book, The New Breed, you can head to factuallypod.com slash books. That's factuallypod.com slash books. I want to thank everybody who supports our Patreon at patreon.com slash adamconover. If you sign up, you can get bonus podcast episodes, you can join our live nonfiction book club, and you get exclusive standup from me that I do not post anywhere else. And I want to thank everyone who supports our Patreon at the $15 a month level. That's Adam Simon, Adrian, Alison Lipparato, Alan Liska, Anne Slagle, Antonio LB, Aurelio Jimenez,
Starting point is 01:02:07 Beth Brevik, Brandon Sisko, Camu and Lego, Charles Anderson, Chris Staley, Courtney Henderson, David Conover, Drill Bill, M, Hillary Wolkin, Jim Shelton, Julia Russell, Kelly Casey, Kelly Lucas, Lacey Teigenhoff, Mark Long, Michael Warnicke, Michelle Glittermum, Miles Gillingsrood, Mom Named Gwen, Mrs. King Koch, and Nicholas Morris. If you want to join their ranks, head to patreon.com slash adamconover. Of course, I also want to thank our producer, Sam Roudman, our engineer, Ryan Conner, Andrew WK for our theme song, and the fine folks at Falcon Northwest for building me the incredible custom gaming PC I'm recording this very episode for you on. You can find me online at adamconover.net,
Starting point is 01:02:56 where you can find, once again, all my tour dates. Thank you so much for listening, and we will see you next time on Factually.
