StarTalk Radio - C-3PO and the Rise of Robots, with Anthony Daniels

Episode Date: March 23, 2020

What’s the difference between a robot and an android? Should laws protect robots? Neil deGrasse Tyson explores the rise of robots with “I Am C-3PO” author and Star Wars actor Anthony Daniels, comic co-host Chuck Nice, and robot ethicist Kate Darling, PhD. NOTE: StarTalk+ Patrons and All-Access subscribers can watch or listen to this entire episode commercial-free here: https://www.startalkradio.net/show/c-3po-and-the-rise-of-robots-with-anthony-daniels/ Thanks to our Patrons Leon Galante, Tyler Miller, Chadd Brown, Oliver Gigacz, and Mike Schallmo for supporting us this week. Photo Credit: StarTalk. Subscribe to SiriusXM Podcasts+ on Apple Podcasts to listen to new episodes ad-free and a whole week early.

Transcript
Starting point is 00:00:00 Welcome to StarTalk, your place in the universe where science and pop culture collide. StarTalk begins right now. This is StarTalk. I'm Neil deGrasse Tyson, your personal astrophysicist. And today I got with me Chuck Nice. Hey, Neil. Just bumped that. Boom, what's up? And there's like someone between us here.
Starting point is 00:00:27 I know, I know. We're always fist bumping in somebody's face. In somebody's face. Today, we're talking about robots. In fact, that's not only what we're going to do, that's the title of the show. Talking about robots. Talking about robots.
Starting point is 00:00:39 Talking about robots. A little on the nose. And we have as our studio guest, Kate Darling. Kate, welcome. I don't get a fist bump? Double fist bump. Give it a... Bam!
Starting point is 00:00:49 Oh, look at that. There you go. There you go. And while you're here with us, you are a robot ethicist. Didn't even know that was a thing. We'll get into that in a minute. From the MIT Media Lab, of course, in Cambridge, Massachusetts. And what we're featuring today is my interview with Anthony Daniels.
Starting point is 00:01:08 Anthony Daniels. Yeah. Oh, she's getting all nerdy, nerding out on that one. Anthony Daniels, the actor who portrayed C-3PO. Oh, my God. Oh, dear. Oh, my God. Oh, oh, oh, oh.
Starting point is 00:01:23 Oh, you want the gig. How lovely. Oh, no, I love it. R2, R2. You want the gig. In particular, we're not just talking about robots. We're talking about relationships with robots. Okay. Between humans and robots.
Starting point is 00:01:37 And we don't even know what that means entirely. Not at the moment. The movie AI kind of covered it. Yeah, okay. All right, all right. But, of course, C-3PO is from the Star Wars franchise. One of the most successful movie franchises there ever was. And just let me get a little bit of background on you, Kate.
Starting point is 00:01:55 So, did you come to this from robotics? No. No. Well, I've always loved robots, but I'm a social scientist. Nice. I have a legal background. I did social sciences, and now I study human-robot interaction from a social, legal, and ethical perspective.
Starting point is 00:02:11 Wow. So it's good to learn that someone such as you exists in that world. Yes. We should have somebody like you in all of the potentially troubled places where technology is going. You mean everywhere? Yeah, like everywhere. So you have a book that may be coming out in 2021. Yeah.
Starting point is 00:02:34 I have a title here. Is this the right one? The New Breed. What Our History with Animals Reveals About Our Future with Machines. That is an awesome title. That really is so so let me go get to my first clip with anthony daniels he's the only actor that was in all nine star wars movies wow all 90 official not the right not the the the the fan off ramps right exactly and he's also the author of imc 3po thePO, The Inside Story. Cool. You see what he did there?
Starting point is 00:03:06 Yes. Very clever. You see what he did there. So what is it about C-3PO? Is it his performance, the way he speaks, that people could relate to him so deeply? Oh, gosh. C-3PO is amazing. I think what it is, actually, is that C-3PO looks like a robot but acts kind of like a human.
Starting point is 00:03:26 Like, he's very flawed and has all these human emotions. And I think people just relate to him, ironically, because he's so human-like. Oh, so it's the opposite. It's not like he's the perfect robot and we're finding a way to relate to that. Right. It's that he has enough human in him so that he's an imperfect robot right that's what we're related is that what you just told me yeah yeah well you know that makes sense because that is what makes us human they're like you know the
Starting point is 00:03:57 fact that we are flawed and imperfect and and kind of annoying and well definitely not kind of definitely annoying you know so i kind of liked it when he definitely. Not kind of. Definitely annoying. You know? So I kind of liked it when he got a little excited. Oh, oh, oh. What shall we do? That was just kind of fun. It was like, hey, that's kind of cool.
Starting point is 00:04:14 Yeah, exactly. So it's the human side of the robot that we're relating to. Sometimes, yeah. In that case? In that case, for sure. Okay. Okay. And so what is it about him, other than his costume, that told you he's a robot in how he's interacting with people?
Starting point is 00:04:31 What is the evidence that he's a robot, other than he's a shiny metal thing? Right. I mean, a lot of it is the design. I'm trying to remember anything, like, specifically robotic that he said. See, I don't think so. Other than I know 80 trillion languages. Right, exactly.
Starting point is 00:04:46 Yeah, something like that maybe. We don't know human wood, so that's a robot talent. Right. But I think a lot of the visual and the way he moves. Yes, and he does. It looks like he's actually doing the dance, the robot. You know what you want to do if you want to be a contemporary robot, though, is fall over a lot.
Starting point is 00:05:02 Oh, this will be YouTube videos. Yeah. Well, let's go to a clip. So I sat down with Anthony Daniels. And did you know he wasn't always a fan of sci-fi? Really? Well, remember, he's an actor. Until he got that first check.
Starting point is 00:05:19 Love me. Oh, dear. I do believe I love me some sci-fi. Oh, R2, R2, where's the nearest bank? Let's find out. What was he up to before he landed where he did? Check it out. Maybe I'd been traumatized.
Starting point is 00:05:36 You know, I never thought of this. I had been bashed around the head by 2001 A Space Odyssey to the point where I never wanted to see another spaceship. Because that was a long movie with very little dialogue. There's no character development except for the how, the computer. Well, you're right there. So it's a very different genre. It's not even science fiction.
Starting point is 00:05:57 It's a science portrait, in a way. It was almost a philosophical treatise on man and man versus space. Man, machine, and space. And so there we are. So I didn't want to go because back then the only robot that I remembered really were the Daleks
Starting point is 00:06:17 on television. Oh, the Daleks. Yeah, so of Doctor Who. Yeah, Doctor Who. Doctor Who with sink plungers on their faces. I mean, cute, and as a kid I adored them, but as an acting role, not so much. Then there was Robbie, before them, Robbie the robot in, what's it called?
Starting point is 00:06:33 The Forbidden Planet. The Forbidden Planet. And he was this kind of lumbering thing made of Michelin tires, really, it seemed to me. Yeah, because he had these horizontal segments. That's right, and he lumbered in a kind of unprepossessing way,
Starting point is 00:06:46 I felt. But anyway... Are you judging the acting talents of a robot? No, but it was... You just gave a critique. No, no, no.
Starting point is 00:06:54 He lumbered. He didn't pull off that movement convincingly. He lumbered, and when you read on page 95 that I met Mr. Kanishita, who designed him,
Starting point is 00:07:03 and I said joyfully, oh, what was it like to see your design come off the paper? And he went not so good. I didn't mean him to kind of lumber. And I said, oh so you get why 3PO kind of teeters around because it's more characterful, more forgiving
Starting point is 00:07:18 more human in a way. Yeah, because Robbie the robot was, I'm going to say robotic. No, I don't know. He just didn't, there was no, he had the arms, that was it.
Starting point is 00:07:32 Whereas C-3PO, there were sort of body gestures that could help communicate a mood. And it's all I had, really. Right, because there's no, this is not a moving mouth here. And you'd be surprised how many people think it's makeup that I'm wearing. No, it's a solid. We, because there's no, this is not a moving mouth here. And you'd be surprised how many people think it's makeup that I'm wearing. No, it's a solid... We all saw
Starting point is 00:07:47 Goldfinger, so just a couple years ago. Ah, yeah. She was prettier than me. So he was referencing his book, I Am C-3PO, and he said, on page 95. Right. So Kate, how has, we talked about a few generations of robots there, how has our concept of robot evolved from the beginning until now?
Starting point is 00:08:06 It used to be that anything that was even remotely automata and could move on its own was viewed as a robot, and now we have a little bit more that we expect a robot to be able to do in behavior. Right, because in fact now, so when does it become an android versus a robot? An android is a robot that looks deceptively human-like. Lieutenant Commander Data. Yes, yes, Data, my favorite android. And then there's C-3PO, which is more of a humanoid robot.
Starting point is 00:08:34 So like head, torso. I forgot the word humanoid goes in there. Yeah, so androids look realistic. Humanoids just have a kind of human-shaped torso, head, arms, legs. So not R2-D2? Not R2- arms, legs. So not R2-D2? Not R2-D2. So what's R2-D2? He's just a robot.
Starting point is 00:08:48 Just a robot. Damn, with no oid on it. Right. Poor R2. He's just a robot. Okay, so, but presumably all three kinds of robots are still legit in storytelling today. Oh, yeah, for sure. Okay.
Starting point is 00:09:04 But we don't have the big Robbie the Robot kinds anymore. Those, the lumbering. Right. With the circuits turning in his... Warning, Real Robinson. Warning. Right, that was a Robbie Robot style.
Starting point is 00:09:15 Exactly. In Lost in Space. Lost in Space. Warning, Real Robinson. Danger. Not warning. Danger, Real Robinson. And with the arms would be flailing.
Starting point is 00:09:25 Right, right. And then Dr. Smith would be like, Oh, warning. Danger, real Robertson. And with the arms would be flailing. Right, right. And then Dr. Smith would be like, oh, dear. Oh, dear. You know. At what point do you inform
Starting point is 00:09:35 a person who might be trying to design a robot in terms of its personality or its character or how they would best be an actor doing so?
Starting point is 00:09:45 I mean, I do, I work with roboticists a lot in social robotics. And, you know, we have more and more robots coming into shared spaces and they have to, you know, interact with people. And not all roboticists... What's a shared space? Oh, you know, a workplace, household, public areas. Oh, okay, okay. Like Stop and Shop has robots roaming the aisles now.
Starting point is 00:10:01 Do they? Yeah. You could call, they are indeed a robot, but it's more like an obelisk on wheels with googly eyes attached. It looks like a penis. Wow. What? All right.
Starting point is 00:10:13 Good. It does, though. I mean. I'm just saying maybe to you. Okay. So it goes up and down the aisles? Yeah. And there's no one controlling it?
Starting point is 00:10:25 No one's controlling it. There's no joystick? It actually moves about like a Roomba. Like, you know, except it doesn't have to touch things. And I assume it doesn't bump into the orange aisle. It doesn't bump into the orange stack, right. But I think, I've never engaged with them personally, but I think you can ask them questions
Starting point is 00:10:38 and they will direct you to places in the store. All right, I got it. The next time I see one, I'm doing all kinds of experiments on it. Oh my God. I was going to say., I'm doing all kinds of experiments on it. Oh, my God. I was going to say. I might have gone to Stop and Shop
Starting point is 00:10:49 last weekend with Daniela, who's one of the students in the personal robotics lab who is obsessed with this robot. And we might have, like, put stuff in front of the robot
Starting point is 00:10:56 to see what it did. You might. Just might have. And what happened? Yeah. Well, they might have done it. They didn't. Oh, you might have.
Starting point is 00:11:01 If you had. We might have. What do you think would have happened if you had actually done it? If you had. If you had done it. If we had. What do you think would have happened if you had actually done it? If you had done it.
Starting point is 00:11:06 If we had. What do you think might have happened? What were you testing it for? We didn't get kicked out yet. We're going to go back. Well, we just wanted to see what it would do because the purpose of the robot is to find hazards on the floor and alert someone to come pick them up. And so we wanted to know what's the hazard. Clean up all four.
Starting point is 00:11:22 Clean up all four. All right. Yep. Now, I wonder what it would do if you just laid down in the floor, like on the floor in front of it.
Starting point is 00:11:29 Like, what it would do if it's actually... It'll go around you. See, that's a really bad robot. No, why? What are you talking about? So you'll recognize a spill,
Starting point is 00:11:39 but I just had a damn heart attack. And you'll just go around me. And you'll go around me? Really? So somebody drops a jar of pickles and it's a huge monumental problem. We need, you know,
Starting point is 00:11:50 somebody to get here right away. But you fall and you can't get up and the robot, you're just in the way. Exactly. Instead of hitting my meta-alert bracelet, you're just like,
Starting point is 00:11:58 okay, excuse me. Like, really? So that's the thing though. Like when designing robots, you have to think about like what's going to be frustrating to people when they're interacting with it. And they're going to be like,
Starting point is 00:12:08 why isn't it helping me do X when they don't understand that building a robot is really, really hard. And they only have very limited capabilities. And so roboticists really need to think not just about how they're working, but how people are going to perceive them. They only have limited capabilities now.
Starting point is 00:12:23 Right. Yeah. Stop covering for them. So only have limited capabilities now. Right. Yeah. Stop covering for him. So Anthony Daniels, as an actor, he's best known for C-3PO, but he almost didn't take the gig. Oh. Yeah.
Starting point is 00:12:40 Let's find out why. Okay. Check it out. So there we were thinking about playing a robot. And the thing that really changed my mind was reading the words that I had not written. George and his team had written them, pretty much George. And clearly he had invented a machine with more human characteristics than he could apply to a human being.
Starting point is 00:13:02 You couldn't get away with Han Solo being the character of Threepio, if you see what I mean. So Threepio is allowed to have intense humanity because he isn't a human. He isn't human. That's deep. Not really. Yes, it is. Because what you're saying is, with a machine who is sort of human, but it's still a machine, you can take it to human places that would be unconvincing if written for a human character. And slightly uncomfortable. I never thought about that.
Starting point is 00:13:35 You will now. Thank you. In your next lecture, you can talk this out. There is a film called Bicentennial Man. I never got to see that. It's interesting. Robin Williams. Beautiful guy.
Starting point is 00:13:50 I had luck to meet him a couple of times. We didn't talk about it, but that was a slightly uncomfortable film because the storyteller was... He was transitioning from a robot that arrived in a packing case, you know, from Amazon or somewhere. And then...
Starting point is 00:14:05 Pack that. And then pack that thing. Whatever the Amazon equivalent was back when that movie was made. And then gradually he metamorphoses into a human, and it's slightly uncomfortable because it veers towards pushing our humanity buttons. Like, what does it take to be human?
Starting point is 00:14:22 And why are we slightly uncomfortable through the uncanny valley and beyond? So tell us about the uncanny valley, because it's a great name, but it still has to be defined for people to know what it is. It is often used in games or in visuals or in film or in computer terms. The Turing test almost gets there, but it's when something is almost real, looks great, and it speaks nicely and has great skin, for instance, in a real world.
Starting point is 00:14:51 But there's something that's not quite right. There's something that we sniff as a human being that's not quite there. So it's even unconscious within us, perhaps. It's innate within us. Innate, that's a better word, right. You don't even know how to verbalize it. Verbalize it, yeah.
Starting point is 00:15:06 And so people have coined this phrase, the uncanny valley, because you know there's something not quite right. Kate, do all humans respond to the uncanny valley the same way? So people have tested this theory empirically with very mixed results, but most people who work in robotics seem to think that there's something there. And they're wrong. They're wrong?
Starting point is 00:15:29 Yeah, they are. Let me save them a lot of money. You're wrong. Save all the academics who have researched this. Save all the academics who are researching this forever. You're wrong. What you're talking about is the perception of normal humanity.
Starting point is 00:15:41 That's why you can't put your finger on it because it doesn't exist. We feel the same way about human beings that may have some type of brain disorder. And we talk to them and we go, oh, something not quite right here. But you don't say they're not a human being, but that's really what your perceptions are telling you. So what you're talking about is the normal perception of humanity as opposed to what makes someone human. And they're two different things. I think you're right. I think I personally think it's about.
Starting point is 00:16:15 I know I'm right, Kate. Oh, oh. Let me tell you something. No, I'm joking. I'm joking. I'm joking. So that comedy thing doesn't work out. I'll hire you in the lab.
Starting point is 00:16:23 All right. But go ahead. I'm joking. So that comedy thing doesn't work out. I'll hire you in the lab. All right. But go ahead. Well, for me, Uncanny Valley has always been about expectation management because you're expecting something to behave a certain way.
Starting point is 00:16:32 If it looks human, you're expecting it to blink like a human and not twitch its face. And if it doesn't, that kind of unsettles you. If it does something that you're not expecting. So what do programmers, yourself, what do you all do in the media lab to either exploit the uncanny valley or to dodge it? I don't think anyone wants to exploit it. But I also don't understand why we would try to create something that looks like a human or talks like a human because we can create anything we want. Why create, like, why try to, like, risk this uncanny valley creepiness factor
Starting point is 00:17:05 when we can create an R2D2 that communicates in beeps and boos? Do you tell this to your peeps back at MIT? Oh, yeah.
Starting point is 00:17:12 Like, everyone, I think, in the social robotics field agrees that making, you know, human-like robots is not as interesting
Starting point is 00:17:18 as making something that has expression. Something better? Yeah, you can make something better. Something better than humans. Animators have honed this technique
Starting point is 00:17:27 for hundreds of years, how you can make something like Bambi that looks like a deer, but actually looks better than a deer to us. Okay, all right. So I agree with you, but from a different direction. So I think the future of AI and robots
Starting point is 00:17:44 is not to try to mimic a person. Okay. A person is not even an ideal form. No. Right, right. For tasks that you want to conduct, the human body is like,
Starting point is 00:17:55 why would you design that? Right. That's not the case. Even with the people who don't have legs but they run track. On the blades. On blades.
Starting point is 00:18:03 Yeah. We're not trying to duplicate the bones of a foot and then put flesh on it and say, now you're... No. It's like, we got something better. Something better. Something better. Here's something that'll spring and propel you forward faster. So in your lab, are people thinking of the task they need, not trying to duplicate a human?
Starting point is 00:18:20 Because we can just... People make babies all the time. Why do you need to make a robot human? We have real humans. Well, I think people have like this fascination with recreating ourselves, but like, I really don't see the point. I think we're all in agreement here. Wow.
Starting point is 00:18:32 Yeah. I mean, I've never thought of it that way, but you're right. I think, you know, when you look at sci-fi movies, like Alien comes to mind and the so-called android robot is so human that it's indistinguishable. But the problem is it doesn't have a soul. So it can't make any more. It's a sociopath. Let's get to that next. Okay.
Starting point is 00:18:59 We got to take a break. When StarTalk returns, more about the evolving relationship between robots and humans. We're back with Star Talk. Neil deGrasse Tyson, your host. Chuck Nice. That's right. Kate Darling. Kate Darling. Kate Darling.
Starting point is 00:19:38 Kate Darling. In from Boston. Thanks for coming. From Cambridge, specifically. The MIT Media Lab. Good stuff happens. Every time something amazing is happening, it's traceable, the MIT Media Lab. Good stuff happens. Every time something amazing is happening, it's traceable back to the MIT Lab. It's funny how that works.
Starting point is 00:19:50 Just not only like art science and robotic science and computing and culture. Yeah. So congratulations to all y'all. Oh, yeah. It's just me. I mean. You the one. All right. I just want. It's just me. I mean. You the one. All right.
Starting point is 00:20:06 No credit. Excellent. I just want to pick up on where we left off. This idea that you're talking about in the Alien series, there was a human who was not human. Right. So not even humanoid, android. Android. Android.
Starting point is 00:20:21 Yeah. And you're cool until you realize they would make a different ethical choice than you would. Yeah. Yeah. So do you have to program this in? Is this something they can learn? There's a whole field called machine ethics that looks at can you program ethics into machines? And it turns out that's really, really hard because we don't even fully understand or agree on humans.
Starting point is 00:20:41 We haven't programmed ethics into us. Yes. So... You can't program something that you ain't got yourself. Yeah. So I would prefer maybe not to create robots that have to make those kinds of ethical decisions. But there are people who are trying to solve that problem.
Starting point is 00:20:58 Okay. And so, but it would also be a way if some, so let's get back to the concept of soul. The religious person would say the soul gives you a sense of right and wrong and purpose and these sorts of things. And that was the idea with Bishop. Bishop didn't have a soul, so if it meant that bringing back this life form to earth that could potentially wipe out all humans, it doesn't make a difference because it's in the interest... It's an interesting experiment. Right. It's in the interest of experimentation and exploration. So who cares?
Starting point is 00:21:28 I think this is people's greatest fear about scientists gone astray. Yes. Yeah. Without a doubt. So will that be the hardest thing to program into robots? A soul?
Starting point is 00:21:39 That's three lines of code. I think that's all. Well, Japanese actually believe that certain things have souls. Tell me about the Japanese. Yeah. So, like, there's this Japanese roboticist who creates these very, very lifelike androids. Like, he's made one of himself.
Starting point is 00:21:54 Hiroshi Ishiguro is his name. Ishiguro. Ishiguro. And, like, it seems that in, you know, eastern cultures that have a history of Shintoism and believe that, you know, even objects can have a soul, like they have funerals
Starting point is 00:22:09 for sewing needles, for example. It seems that they're more... Yeah. Yeah. That must have been a badass sewing needle. If you were to give it a funeral,
Starting point is 00:22:17 that must have sewn some good stuff. See? With a darn, a lot of socks. Poor Needy, we knew him well. Needy? Is that the... Needy, that, we knew him well. Needy, is that the?
Starting point is 00:22:26 Needy, that's what we called him. That's your nickname. Yes, exactly. So tell me, I was unfamiliar with this, so keep going. Yeah, and we don't have that concept in more Judeo-Christian society. We have, oh, things are alive and have a soul. Things are not alive, don't have a soul. And so there's this idea.
Starting point is 00:22:43 Not only humans have soul. Or that only humans, yeah, depending on, you know, yeah. But that's why some people say that the Japanese are much more accepting of robots and this idea of having humanoid and android robots around because they're like, hey, that's cool. Are they also more accepting of robots in the Uncanny Valley?
Starting point is 00:23:01 They might be. Again, like I said, the empirical testing on the uncanny valley has kind of been mixed so there's not a good scientific basis for it but anecdotally yes is it part of the fact that in their culture they have a greater need for robots i mean it is clear that they have like in japanese health care uh they don't have enough people and you have a a great advancement of robotics in that particular arena are you confusing robots with automation no i'm talking about actual robot care i mean in different like for instance in a hospital like for um the delivery of certain. A robot will do that.
Starting point is 00:23:45 Rather than an orderly or something. Rather than an orderly, right. You know, so for instance, or just even go outside of healthcare. Hotels that you go to where they have robot check-in and it'll be like a Tyrannosaurus Rex will check you into the hotel. Yeah.
Starting point is 00:24:00 You know, a robotic, a robotic Tyrannosaurus. Because it's a novelty. That's how much into robots they are that we are not. You know, a robotic, a robotic terrarium. Just for fun. Because it's a novelty. That's how much into robots they are that we are not, you know? So, and what about Shintoism enables that or empowers it or drives it? Well, some people would say that that makes them more willing to accept robots as this thing that's alive but not really alive. Oh, so the simple element of inanimate objects having souls, that alone would be sufficient. So that is one reason people think the Japanese are more accepting of robots.
Starting point is 00:24:34 Another reason is, like you said, the need. As robots come more into these shared spaces and people interact with them more, people just get used to them. And then there's also the fact that their science fiction and pop culture tends to be less dystopian when it comes to robots. Like they have Astro Boy. They have these positive stories.
Starting point is 00:24:50 I grew up with Astro Boy. Astro Boy bounds away on his mission today. Rocking high to the sky. How come I don't remember that? Because I just made it up. No, I'm joking. That was the actual song.
Starting point is 00:25:02 That was the actual song. I remember Astro Boy. Yeah. And there was, well, they also had Speed Racer. There was a lot of sort of early anime. That was the extra song. That was the extra song. I remember Astro Boy. Yeah. All the, and there was, well, they also had Speed Racer. There was a lot of sort of early anime.
Starting point is 00:25:08 That was the early Japanese anime. Yeah. That made it to American television. That's right. And yeah, it was all very, very happy stories. Yeah.
Starting point is 00:25:14 And. But we have a lot of Terminator and stories of the robots taking over. They have less of that. That is true. We are so messed up here. So, so are,
Starting point is 00:25:24 are the, is the Japanese culture a good bellwether for the global acceptance and trajectory of robots? I would say not necessarily. I think that maybe the ways that they will want to use robots are different. Okay. Like the fact that they like androids, and I don't really think that we do in Western society. Right on, yeah. But it'd be interesting to see if all countries have equal access to this technology,
Starting point is 00:25:50 what they'll come up with relative to their own cultural needs. Right, for sure. So, Anthony Daniels, we're featuring my clips with him. He has an interesting perspective on what makes C-3PO more human than a robot. Ooh. That's his perspective, because he was, he is C-3PO. Cool. So let's check it out.
Starting point is 00:26:15 George came up with this idea of this kind of figure, this Art Deco figure. Then he employed Ralph McQuarrie, who made this life-changing painting that I saw of the character. And then Liz Moore, the sculptor, turned that into 3D and made this beautiful face that people recognize. And interesting, I only just realized the other day because I was trying to cheat in Photoshop. Because some robot faces are just scary. shop because some some some some robot faces are just scary and this is actually very it's it's got it's got curiosity in it and it's but it's you want to know what he's thinking because he clearly is thinking and partly it's that sort of wide-eyed uh almost babyish stare with with big eyes um and what was interesting liz had actually and I never realized it until recently, created
Starting point is 00:27:05 something that wasn't machine perfect, it wasn't symmetrical about a center point. It is actually, as in a human face, I tried to flip it in Photoshop to double it up to make it perfect. And it doesn't work because he is asymmetric. And that is one of the clues, I think, to his humanity. That's an interesting philosophical point because there's been research on symmetry and there's a whole off-ramp from that research that says, maybe it's not an off-ramp, maybe it's an on-ramp, that a little bit of asymmetry brings interest to a character,
Starting point is 00:27:39 to an image, to a painting, to art. Perfection, there's nothing more to say. It's like somebody did it already. Right, right, right. Now, just between you and me, you do have a very symmetric face. Let me just stare into the camera here. I personally don't. If you cut me in half, if you cut me...
Starting point is 00:27:56 Here we go. It is not symmetric. Yeah, you are so symmetric. Would I like... You are perfect. I'd probably like to be, but it's too late now. I think we have to go, but it's too late now. I think we have to go
Starting point is 00:28:07 with what we've got. Were you hitting on C-3PO? No, I was just saying. His book
Starting point is 00:28:14 had a picture of him and the robot, and so I put the other half of him next to his head, and
Starting point is 00:28:21 it was him. I'm saying. But what a genius design tactic to actually purposely put in asymmetry. Tell me about perfection. I mean, I hadn't heard about
Starting point is 00:28:31 this asymmetry thing before. That's really interesting. But one of the tricks that a lot of robot designers use in social robotics is to, you know, if you're going to give it a face, don't make it as human like as possible and don't give it too many features. Don't necessarily give it eyebrows or a nose.
Starting point is 00:28:49 Just eyes is enough. Things that we automatically respond to, like he was saying, like the big eyes, the babyish face, things that we kind of evolutionarily respond to are the best design tricks. Oh, okay. That's right, because babies,
Starting point is 00:29:06 their head grows only by a factor of three and the body grows by a factor of five or six. So babies have a disproportionately large head to their body. Yeah, I pushed one out of me. Tell me about it. Oh, is it? Okay.
Starting point is 00:29:23 You should have just built it in the lab yeah you you had the power you have the power you don't have to do you don't have to do it you don't have to biologically recreate okay the rest of us we'd do that if we could right yeah so i think that the argument from evolutionary biology standpoint at least what i learned from my colleagues here at the American Museum of Natural History, here's a commercial, is that in order to prevent mammals from killing their children, the children have to look cute. Yeah. And so the, not that everyone would kill their children, but I'm just saying. I would. Most would.
Starting point is 00:30:02 No, it's not that you would want to kill them all the time. There are occasions in the arc of raising children where if they weren't cute, we go extinct a long time ago. Have you done research into what our relationship with robots says about us? Ooh. A little bit. Psychologically? us? Ooh. A little bit.
Starting point is 00:30:25 Psychologically? Emotionally? A little bit. I could talk about this all day, but it's kind of like, you know how when you go on a date with someone and they're really mean to the waiter and you're like, that's a red flag? Some of our research indicates that
Starting point is 00:30:38 if you're mean or violent to a lifelike robot, that might say something about you as a person. Wow. You know, that makes sense. So Boston Dynamics has these videos online of robots being abused. And I know clearly that that's a thing. That's not a person. And I got to tell you, it is so hard to watch
Starting point is 00:31:04 because they're hitting it with bats and they're kicking it and they're knocking it over. It's a robot that's trying to walk. Yes, it's trying to walk. And basically— I've seen those. You've seen those? Yeah. And it's really disturbing.
Starting point is 00:31:15 Yeah, people get really upset. Like the first time that they put one out that looked kind of like a dog and they named it Spot. And then they're like kicking it and it's like struggling to stay on its feet. They named it Spot, and then they're, like, kicking it, and it's, like, struggling to stay on its feet. People got so upset that PETA, the animal rights organization, was getting a bunch of phone calls and had to issue a press statement. And they didn't even take it seriously. They were like, yeah, we're not going to lose any sleep over this. It's not a real dog.
Starting point is 00:31:38 But there actually might be something there. Okay. Okay, so would you preemptively, I mean, is this like, what's that movie? Tom Cruise? Yeah, yeah, The Minority Report. The Minority Report. Is this how you would pre-diagnose someone's propensity to, you sort of already said so, because in a date, someone behaves in a way that is to someone who they have power over. I mean, right now robots are still really
Starting point is 00:32:05 primitive and we're still able to mentally compartmentalize. But as the design gets more and more lifelike, I mean, we do definitely draw connections between animal abuse and child abuse in the same household legally. If you have a case of one, you look for a case of the other. And it's possible
Starting point is 00:32:21 that... Strong correlations already established. Okay. All right. And if you have a robot that can mimic pain and suffering and you enjoy inflicting that on it, that might be an indicator that you might also enjoy
Starting point is 00:32:34 torturing an animal. But we don't know. We don't have the evidence. This requires more research. All right, so it's certainly evidence that you're a dick. That's for sure. I feel like it kind of is. So it's certainly evidence that you're a dick. That's for sure. I feel like he kind of is.
Starting point is 00:32:48 So it's interesting. So in the dating scene, these are like secondary cues. They can be really nice to you, but the waiter not so much. If they kick the Roomba, it's over. Given the examples of the Boston Dynamics and people kicking it, and you feel the emotion for something that is not alive. Right. Where do laws ultimately have to land with regard to rights for robots?
Starting point is 00:33:18 Well, it depends. So I believe in evidence-based policy. Really? Yes, I actually do. What's wrong with you? Unlike most legislators. Have you been checked out? I know, I know.
Starting point is 00:33:31 But like really, like it would be nice to have some evidence. And if, for example, we found out that it was actually desensitizing to people to behave really violently
Starting point is 00:33:41 towards life like robots, then, you know, there's some question of whether we should regulate and say you're not allowed to do certain things to certain types of robots. Because it's fostering behavior that would be counter to the interest of civilization. So to me, only if it actually has an impact on that behavior. Actually, I have to say that makes a lot of sense.
Starting point is 00:34:01 That would be evidence-based legislation. That's evidence-based. That makes a lot of sense. Very hopeful be evidence-based legislation. That's evidence. That makes a lot of sense. Very hopeful there. But it's tough to research. So actually, my last clip of this segment, I talked to Anthony Daniels about robots today. Just to get a sense of what is...
Starting point is 00:34:18 Because, you know, that character dates from the 70s. Right. So just what were his thinking about the interaction of humans and robots today? Let's check it out. And one of the frustrations we have now with machines that pretend to be human
Starting point is 00:34:31 and certainly in Japan there are companies working on human. Oh, do I need leading the way on that? Every time I see a new robot, it's a Japanese robot. Yeah, well,
Starting point is 00:34:40 they like that kind of thing. They've slightly taken it to their own in the sense of social interactions with machines, human to machine, human-cyborg relations. Indeed, George was there first. Oh, yes. Some of them, we're in early stages of real robotics,
Starting point is 00:35:00 and we have to think what we want from that. But when you have something that pretends to be human and then sort of suddenly malfunctions, it's like, well, we're talking about Stepford Wives. Suddenly I'm alerting to all these... You've given us a full review of 20th century robots here. This is great. And 20th century film writers, script writers,
Starting point is 00:35:23 who now very, I think, cogently have adopted this slightly outer world, nether world, where we are going, not in my lifetime, I hope, because I need the work, you know. So let's not move. Actually, I'll come back to that, because in Japan it's widely known that they are looking for really human-relatable, probably bed-sized machines that people can relate to. But then you have to look at what kind of figure physically do you supply?
Starting point is 00:35:58 Because if it's too humanoid and it starts clicking, then it's a little scary, isn't it? Right. If it's too mechanical, then you're it's a little scary isn't it if it's too mechanical then you're relating to i don't know in a can of fizzy drink like where's the balance between who i want to believe i'm relating to because if i get too fond of you and you're a machine it's not going to end happily there are all kinds kinds of off-ramps there for where that would go. Let's not go there. That's for the second series. So is there any thinking in your lab about human-robot relationships?
Starting point is 00:36:39 Bonding? Yes. And where does that land? Bonding? Yes. And where does that land? Well, for me and a lot of my colleagues, I feel like we're, as humans,
Starting point is 00:36:52 capable of a lot of different types of relationships. And to me, the relationship with a robot isn't necessarily the replacement of a human relationship. It's more like how we would treat a pet or something completely different and new. So it's not something that I worry about. That's enlightened, though. But maybe not. But I think that's an enlightened outlook.
Starting point is 00:37:08 It's not clear to me that that's where that's going to go. I think people, you know, if people can have imaginary friends, then they can have a robot that becomes a friend. That becomes a friend. Okay, but why is that bad? No, no, I'm asking you. Yeah. Is there, should we, I don't mean to imply it's inherently bad.
Starting point is 00:37:29 I'm asking you, have you guys thought about whether or not it's. Well, what I, what keeps me up at night isn't. That's what we want to know. Isn't that someone might bond or have like a friend as a robot. It's that a company is making that robot and maybe is using the robot to emotionally manipulate that person. But that already happens in toys. No, no, it's called advertising. Yeah.
Starting point is 00:37:49 That's even better. Manipulate all the time. Don't even take a robot. Doesn't even take a robot. You're absolutely right. Big psychological brain screw called advertising. But no, I remember
Starting point is 00:37:59 it was like a furry or a Furby or something, but it's a little robot and it says things like, I love you and you're my friend. And it's like, you know, I was like, I would never get that for my kid. Like, that's the loneliest kid
Starting point is 00:38:15 in the world that needs this toy that's like giving it love and affection and reinforcement. There was an episode of The Twilight Zone where there's a guy isolated on a on an asteroid somewhere it's early before they knew how right what space was really going to be but anyway he's on an asteroid and this asteroid apparently has a breathable atmosphere but holding that aside holding aside these complex these these holding that aside right okay um
Starting point is 00:38:41 they he couldn't be rescued for like a long time and he's slightly going crazy so they brought him a robot a female robot okay and it says turn here and then she comes to life comes to life of course it's played by an actual actress but it doesn't matter she's a robot and matter she's a robot and then it's they they're they're companions and they're there for like a year and then the rescue mission finally comes but there's no room on the ship for her no wow no this was a deep story oh my god what happens. They said, either nobody gets back, or we're going back without your companion. He says, no, but she's this, isn't it? I love her. Guy takes out his gun, shoots her in the head.
Starting point is 00:39:34 What? And then the springs come out and the thing, and it said, let's go. Who does that? It's in the show. The guy in love with her? No, another guy. No, the other guy who's trying to save his fellow astronaut. That's so cool.
Starting point is 00:39:48 To remind him that she's just a robot. She's just a robot. So talk to me. I mean, robots can fill a void like that. They're already being used as an animal therapy replacement in nursing homes because we can't use real animals. And so you bring in this baby seal robot that gives you the sense of nurturing something and people become very attached to them wow no so i'm asking yeah the ethics of the story i just shared with you oh well i i mean i i think it's unethical to
Starting point is 00:40:19 shoot lady robot but uh okay but that wasn't psychologically damaging for otherwise they all die because there's only one seat on that rescue well yeah yeah okay that's the construct yes and you're an ethicist talk to me why couldn't she sit on the outside she doesn't breathe there oh good boy they just strapped her to the bottom of the ship. Chuck, I forgot about this. Yeah. You know. Thank you. Now we don't have to resolve the ethical issue. Right. Oh, my gosh.
Starting point is 00:40:50 Oh, wow. Chuck solved that problem. Okay. No, but tell me, how would you, where? Can it be unhealthy, though, this bonding that you're talking about? It can be, yeah. I mean, if it's being used to manipulate someone or if they're bonding with something.
Starting point is 00:41:07 So it sounds like she was meant to be a tool and they didn't anticipate that he would bond with her this much. Yes. And this happens in the real world. This happens with soldiers bonding with their bomb disposal robots where they treat them like pets and they get really upset if they get broken. Particularly if they save your life, it doesn't matter. Yeah.
Starting point is 00:41:25 So Peter Singer has written about soldiers actually risking their lives. Peter Singer, the Princeton philosopher. No. So there are two Peter Singers. Peter Singer, there's a Peter Singer who has written a book called Wired for War about military robots. Oh. And apparently soldiers have risked their lives to save the robots
Starting point is 00:41:46 that they work with. they're actually missing the point of that robot. Yeah, well, or the people. The point of the robot is to save their lives.
Starting point is 00:41:53 Right. Yes. Exactly. Yes. But, kind of, you bond with something if it saves your life,
Starting point is 00:41:58 though. And I don't think the people who deployed that really anticipated that response. Real interesting. Okay, so here's the question then,
Starting point is 00:42:05 if you're going to make it empirical. There's the risk to his psychological health having no companion for a year versus the risk to his psychological health of having a companion that you put a bullet through her head. Which of those is worse? I mean, not having a background in psychology, my guess would be it's... Ethically, even.
Starting point is 00:42:27 I mean, you know, we get pets and we know they're going to die. And this is a similar thing. That's true. All right. We got to land this plane. We have a whole other segment. Wow. Okay.
Starting point is 00:42:41 We're going to take a break. When we come back, more of the relationship between humans and robots on StarTalk. Hey, we'd like to give a Patreon shout-out to the following Patreon patrons, Leon Galante and Tyler Miller. Guys, thanks so much for your support, because without you, we couldn't make this show. And if you would like your very own Patreon shout-out, make sure you go to patreon.com slash startalkradio and support us. We're back. StarTalk.
Starting point is 00:43:38 We're exploring the relationship between robots and humans. Yes. Featuring my interview with Anthony Daniels, who recently published the book, I Am C-3PO. Ooh. Yeah. And we have with us, as sort of an expert commentator, Kate Darling.
Starting point is 00:43:52 Kate, reintroducing you to those who, whoever comes in only in the third segment, I don't know who that is. Animals, that's who. So, anyway, thanks for bringing your MIT Lab perspectives for us. And I was delighted to learn that Anthony Daniels was affiliated with academia. Let's check it out.
Starting point is 00:44:14 Maybe I shouldn't call you Anthony Daniels. I should call you Professor Daniels. I'm fundamentally an academic, so my radar perks up. Well, you can call me a professor, but I know where you're going because I'm not an academic, so my radar perks up. You can call me a professor, but I know where you're going, because I'm not a professor. I am a kind of visiting professional at Carnegie Mellon University. Carnegie Mellon, one of the leading institutions in computer science and robotics and everything automated.
Starting point is 00:44:41 But years ago, I kind of got connected with it, curious circumstances, through the Robot Hall of Fame. Oh. They invited me. It's an institution in the Science Museum there in Pittsburgh. They contacted me. Would I come and accept an award for C3PO to be part of their exhibit? Yes, of course.
Starting point is 00:45:02 How could you not? But on the way there... So where is the Hall of Fame? It's in the center of Pittsburgh. It's in the Science Museum. Oh, very good. And for instance, they've got C-3PO,
Starting point is 00:45:14 they've got R2-D2, they've got a machine that can pot a ball every time. Get that hoop every time. A basketball. A basketball. It can do it mechanically every time perfectly.
Starting point is 00:45:26 No matter where you put it in the, or just from that one spot. I think you, yeah, it's cheating, isn't it? Yeah,
Starting point is 00:45:31 yeah. It's rubbish, isn't it? I know. And they've got one of the original arms of that original, could pick up an egg
Starting point is 00:45:38 and put it there and just do that all the time. They've also got, Oh, so they have the history. They've got the history. And at the time,
Starting point is 00:45:45 that would have been quite remarkable to get a machine to do anything. It was the They've also got... Oh, so they have the history. They've got the history. And at the time, that would have been quite remarkable to get a machine to do anything. It was the first, what was the industrial robot there was. There it is. And it's got to be able to pick up an egg and not break it. Not break it,
Starting point is 00:45:56 but also put it exactly to replicate. And, you know, the definition of a robot has changed now from the early Asmodean days to where we are, a machine that can do something kind of that's useful, just doesn't need a human to do it. They also have, for instance, a room as large as this,
Starting point is 00:46:13 which is a medicine dispensary, which is apparently far, far more accurate than having a human dispenser in a hospital. It's dishing out the drugs, but in a good way. If you visited that Hall of Fame, let's assume they have the drugs, but in a good way. If you visited that Hall of Fame, let's assume they have all robots, what would be your favorite robot, Kate? My favorite robot?
Starting point is 00:46:32 They only have real ones, right? Like, not science fiction. Oh, no, C-3PO's in there. Is WALL-E there? I like WALL-E. WALL-E. You're allowed. Even if they just have a drawing of WALL-E, we'll give you Wall-E. You're like, well, that's cute.
Starting point is 00:46:46 I like that. Okay, how about you? Alien Covenant, which wasn't the best movie in the world, but Michael Fassbender. Oh, yeah. Hot robot. Hot robot.
Starting point is 00:47:00 But not as hot as, what's his face in AI? He was the male sex robot. I forget his name. He's gorgeous. God, did I just go gay for robots? I think I did. You totally did.
Starting point is 00:47:12 I totally did. Totally did. Anyway, Michael Fassbender plays two robots. He plays himself. He plays Walter, who has no emotions, but then he plays Walter's evil twin, who does. And they're both robots? And they're both robots? And they're both robots, but the one without emotions, believe it or not, easily manipulated by the one who has emotions
Starting point is 00:47:31 because when you're evil, you can do evil. But when you don't have any emotions and you're just susceptible to anything. So that's your favorite robot? Yeah. Which one? The evil one. I got it. I can't lie lie the evil one is my favorite yeah you know why because i don't have it in me to be that and i think in some i think maybe if i did it what i would feel differently maybe you need it to complete you oh wouldn't that be cool
Starting point is 00:48:01 so at what point do you think about the good and evil that a robot might or might not do, either because they're programmed to or because they learn it on their own? Right. Yeah. I don't think we think about it in terms of inherent good or evil, but more how is the technology being used? By those who should know the difference between good and evil. Yes.
Starting point is 00:48:25 Right. Humans. But see, now apparently at some point, these machines will be programmed by algorithms written by people, even if they're written by other machines, written at some point by people,
Starting point is 00:48:38 and good and evil will be kind of inherent in that algorithm. Yeah. It's going to be a whole mess of gray. Oh, okay. Thank you. Thank you for that. Thank you.
Starting point is 00:48:52 Thank you for those hopeful words. I'm very, very hopeful, yes. Well, I had to take the conversation there. Okay. The future of AI infused in robots. Let's see what Anthony Daniels says. I am a little frightened by AI, and you're quite right. I come in to deal with the talks of the students with an objective eye
Starting point is 00:49:13 that I don't really understand much of the science, but I have an outer perspective. And gradually through practice, you know, I'm enjoying virtual reality, augmented reality, all these sort of things that are gradually bringing the theatrical user, if you will, the entertainment user, into the scene. So you're not just a sit-back participant.
Starting point is 00:49:36 You are actually involved. So maybe the gradualism of this prevents anyone from even noticing the day that AI takes over. I think kind of that's already happening. But in the world of entertainment, which is what the Entertainment Technology Center is about, the growth in involving entertainment is very marked, you know, with all these headsets coming onto the leap motion and all that kind of thing. But in a world where robots are going to industrialize jobs
Starting point is 00:50:08 and take jobs away from human beings. Which they've already begun doing. Yeah, we'd better look to what humans can do apart from twiddle their thumbs. Wow. Interesting. So let me ask you now then. Whatever you were not thinking about evil in the moment how about evil in the future how about ai turning evil how about have have let me ask you a tighter
Starting point is 00:50:33 question has ai as portrayed in film gone the right places that we all should be thinking about no they're missing something yes they're missing a ton. And I love science fiction. I think that science fiction opens people's minds to thinking about what's possible. But we have so many dystopian stories of robots taking over that people are fearing robot uprisings when that's very premature
Starting point is 00:50:59 and we should be worried about other things that are happening right now. Wait, you didn't say that it wouldn't happen. You just said it's premature. Well. That's funny. People worry about robot uprising? Right.
Starting point is 00:51:10 Not yet. Not for at least another two years. Not yet. That's so 2027. No, so tell me. So where should we be focused? Well, there are a lot of issues right now with privacy, data security, supplement versus replacement of human ability, with reinforcing racial, gender stereotypes in the design of these technologies.
Starting point is 00:51:34 All of this is happening right now. There's autonomous weapon systems being developed. There's things we should be concerned about that aren't the robots becoming smart and taking over the world. I think that that one also tends to be a lot of rich white dudes worry about that one
Starting point is 00:51:50 because they don't have to worry about the other stuff. They're just like, my only danger is that a robot is going to kill me. I don't have to worry about like facial recognition
Starting point is 00:52:01 holding me up at the airport. That's sociologically insightful. You're right. That makes a lot of sense. Plus, you've seen the racial sinks in bathrooms. Yeah. The racist sinks.
Starting point is 00:52:13 No. I don't know about this. You don't know about those? No. They can't see black hands. Yeah. And we've done this with all types of technology because it's all white dudes building it.
Starting point is 00:52:22 Right. And so, right. I just thought the sink didn't work. You put your hands hands on waiting for the water because it's an automatic has happened to me and nothing's maybe that sink doesn't work so that i go to another sink then i wave my hands more eventually you know it'll hit but if i so if you do the experiment you put in a darker surface or a lighter surface it's it's it's reflecting reflection of light reflecting reflecting the light right so what you're saying is white men, say it, white men. White men, because there's also a lot of gender stuff that happens.
Starting point is 00:52:50 Right, right. We'll design things thinking that they are the model of what it is and should be and capturing their concerns. But it's also not their fault. Like, we all view the world through our own experiences. And so the problem is that we don't have diverse teams building technology. That's really where it is. It is their fault if they're not hiring you.
Starting point is 00:53:10 Or Chuck. Or Chuck. They need to hire Chuck, really. There you go. Can a black man have clean hands? I'm trying to stave off viral infections. Dang! Yes, sir, you're hired. We'll put you in.
Starting point is 00:53:26 But we don't want any trouble. No, but you raise a very important point. I'm stereotyping here, but if white men are programming all of the code that will be the future of AI, it could have remarkably biased consequences. As an unintended consequence. Yeah.
Starting point is 00:53:47 Right. You're speaking about this as though this is in the future, but this is actually happening right now. Well, gotcha. Listen. Yeah, I know. That's why my hands are dirty. Chuck can't clean his hands, and you don't know when.
Starting point is 00:54:00 Oh, God. Thank God for hand sanitizer. All right. So how about this? Let's try to land this plane. Okay. What do you think is our largest ethical dilemma going forward? I think the thing I worry about the most is that a lot of AI, the way it's built right now, relies on data collection.
Starting point is 00:54:20 They need massive amounts of data, and so I worry about privacy because there's no incentive to curb that right now. Interesting. Because if you want to know all about humans, you've got to know everything they do. But then other people will also have access to that data. Governments will, companies will. It's already happening in China
Starting point is 00:54:37 where they're collecting information privately, but then the government forces them to turn over that information, including facial recognition that happens on just the streets of the cities where they're like, all that camera data. How to identify foreign nationals. Exactly.
Starting point is 00:54:54 Exactly. So how do we lean into, because I do think there are so many positive use cases for this tech, so how do we lean into those positive use cases and curb some of this stuff? That's a challenge. No, don't ask us that question. I'm asking you.
Starting point is 00:55:10 I didn't bring you here to ask that. No. If I had an answer, then my job would be, you know. Over. Yeah. All right. So let's get some parting thoughts. Chuck, what's your parting thought here?
Starting point is 00:55:23 You know, I'm really disturbed by the fact that there are racist things, man. I'm sorry. You did not know about that, Chuck. I did not know about that at all. And this has happened to me. It's like I feel violated by sinks now. That's all I can think about. I'm sorry.
Starting point is 00:55:45 Okay. Sorry to take you off the rails there, Chuck. So, Kate, give us something hopeful here. Reflecting on it all. So, you know how people are sometimes nice to robots and then they feel
Starting point is 00:55:59 silly about it? Like, they'll say excuse me to their robot vacuum cleaner, or they'll say please me to their robot vacuum cleaner or they'll like say please or thank you to Alexa, the Amazon's assistant. I don't think people need to feel silly about that because I think that what that is saying is that their first instinct is to be kind to another. And so what I really, really love about robots is that they are kind of a reflection of our own humanity in a way. You mean our interactions with the robots?
Starting point is 00:56:28 Yeah, our interactions with them. That's cool. That's cool. Yeah. I like that. So if you're a good person, a robot will tease that out of you. So here's what I think. Not that you asked.
Starting point is 00:56:40 We don't care, but go ahead. Says the media lab professionals. I was going to the media lab professionals want to treat him like a robot this is the real side of who we got here um i think and i don't even know if i have foundation to think this way i think the apocalyptic scenarios are overplayed. I think we always deal into our base lowest fears because fear tends to always override our joys. That's natural, I think, for survival, right? You can, you know, if you're not afraid of something and it kills you, then you're dead. Gone is the gene to be afraid of stuff that will kill you, right? So you're taken out of the gene pool.
Starting point is 00:57:29 So I think that's been overplayed. My worry is that our distraction with the evil prevents us from thinking more creatively about the good. The good that robotic AI can bring to this world. And I don't want to lose out on the creative solutions that they can bring. So, Kate, I put it entirely on your shoulders to fix the problem.
Starting point is 00:57:58 Because this office is not called the Media Lab. This is just Neil's office. Right. Or the Cosmic Crib. We're going to send you back home, back to your peeps, and we want you
Starting point is 00:58:09 to solve this problem. Challenge accepted. Excellent. Chuck. Always a pleasure. Good to have you. Kate, thanks for coming down
Starting point is 00:58:18 from Boston. Thank you so much for having me. All right. And you've been watching, possibly listening, to this episode of StarTalk,
Starting point is 00:58:25 Robots and Humans. And I just want to thank Kate and Chuck for doing the show. Absolutely. As always, I bid you to keep looking up.
