StarTalk Radio - Human Augmentation with Adam Savage and Neil deGrasse Tyson

Episode Date: November 3, 2017

Exoskeleton suits, brain implants, tiny people, AI, and more! Adam Savage and Neil deGrasse Tyson investigate human augmentation from the main stage at NYCC 2017 with comic co-host Chuck Nice and NYU bioethics professor and philosopher Matthew Liao.

Special thanks to Adam Savage and Tested. Check out Tested.com: http://www.tested.com

Watch a SPECIAL FREE Exclusive Original video featuring Neil deGrasse Tyson and Adam Savage on StarTalk All-Access: https://www.startalkradio.net/all-access/neil-adam-odyssey/

NOTE: StarTalk All-Access subscribers can listen to this entire episode commercial-free: https://www.startalkradio.net/all-access/human-augmentation-with-adam-savage-and-neil-degrasse-tyson/

Subscribe to SiriusXM Podcasts+ on Apple Podcasts to listen to new episodes ad-free and a whole week early.

Transcript
Starting point is 00:00:00 Welcome to StarTalk. Your place in the universe where science and pop culture collide. StarTalk begins right now. Thank you. Thank you, New York Comic Con. This is StarTalk. We're gonna... We're gonna... So, we're gonna actually record a live episode of StarTalk right now. So we have three segments, and we'll pretend like we take a break, but we're not actually going to break.
Starting point is 00:00:59 We'll just go straight into the next segment, and you'll know how to play along with that. So let's get this party started. So here we go. Welcome to StarTalk Live at New York Comic Con. Tonight's topic, we're talking about human augmentation. Oof. Oof. Oof. And before we begin, let me bring out my trusted co-host, professional comedian, Chuck Nice.
Starting point is 00:01:35 Chuck, come on out here, dude. Where is he? Chuck. Chuck, there you go. All right. Hey, everybody. What's up, Comic-Con? Nerds rule. And we have Matthew Liao,
Starting point is 00:01:55 a professor of bioethics at New York University. Come on out. And the whole point of this entire episode will be pivoting on the knowledge and expertise of the one, the only Mythbuster, Adam Savage.
Starting point is 00:02:15 Adam! Dude, have a seat. Hey, nerds! How's your con going? Dude. Dude, so... Whoa. They asked me to ask you to raise your mic a little. That was a Comic-Con-gasm right there
Starting point is 00:02:42 that we heard in the 10th row. So we're going to be talking about sort of human augmentation. And I just want to say, Matthew, what do we mean by human augmentation in your field? Because you think about this. Yes, in your field, not like in L.A. So to augment is to sort of add something, add some sort of capacity. So human augmentation is where you're adding
Starting point is 00:03:10 some sort of additional capacities to our organism. And there are different kinds, cars, telephone. So we've been augmenting human bodies ever since we've had technology. That's right. Okay. So when I think of augmentation, there's a traditional sort of a Comic-Con sense of that where you might have
Starting point is 00:03:30 an exoskeleton or some sort of mechanical extension of the body. That would be one branch. That's right. Right. How about if I'm chemically enhanced, is that also an augmentation that you think about? That's right. So there are drugs that you can take. There are people taking maybe, taking smart drugs. I like the way you say it. There are drugs you can take. Like this is a new discovery in the field. Wait a minute, really?
Starting point is 00:03:54 Yeah, yeah. So. And so, and then there's also any talk of the marriage of the technology with your human physiology. That's right. So these are three sort of major branches there. Would you agree?
Starting point is 00:04:09 Yeah, that's right. Okay. So Adam, you build stuff. You've been doing this your whole life. Yeah. So you are human augmenter prime. Sure. With a high school diploma, yeah.
Starting point is 00:04:21 Yeah, okay. Diplomas mean nothing here. It is what you... Clearly. Clearly. So let's just get straight out here. If we're going to talk about augmentation and we're going to think about superheroes,
Starting point is 00:04:41 we're kind of down to only two in that category. Arguably also my two favorite superheroes. Me too, I think. We're talking about Batman and Iron Man. Batman and Iron Man. Those are my two favorite because their secret power is their brains. And they're human.
Starting point is 00:04:57 Yeah. Yes. Well, we got some people taking issue with this out here. Wow. You can't express that strong an opinion in front of this crowd, that's all I'm saying. But they clearly have augmentations to their bodies in some way or another. I think it's more that they have augmentations to their bank account. You're bringing up... There's a poster that shows all of the philanthropic giving that Bill Gates has done in the past 20 years.
Starting point is 00:05:28 $30 billion. And by conservative estimates, he's saved over 6 million lives. This is as of a few years ago. At the bottom it says, suck it, Batman. This is how a billionaire saves people. There you go. Yes. But does he have a utility belt?
Starting point is 00:05:49 If Bill wanted a utility belt, I'd make him one. If Bill wanted a utility belt, I'd be it for him. So let me ask Adam, how do we define super in this regard? Well, because is Batman a superhero? He can't fly. He can't... There's a lot of stuff he can do. Yeah. Where super gets into the realm of the fictional is in both the Batman and Iron Man augmentations. Because with Iron Man's exosuit, while any small piece of it is somewhat possible or plausible, that there are mechanical linkages you could build that would be self-perpetuating and give you all sorts of extra strength, the idea that it would work without flaw repeatedly
Starting point is 00:06:38 is an absolute fantasy. I mean, one might say, a myth. That's no longer my job. But, I mean, there's a reason NASA has never used cables to assist astronauts in their grip or their ability to move the suit: because the engineers at NASA, as brilliant as they are, understand that extra moving parts are extra things that can go wrong. So you're saying there are a million ways Iron Man's suit would fail and the movies don't show any of them. No. Right. Batman, too. I mean, I have actually tried to build a device that shot a cable into a wall that you could hang off of. And then I
Starting point is 00:07:20 talked to the government, an agency that tried to build one of these for the government and they failed in exactly the same way I did. I'm old enough to remember Batman in first run on television and when he had that little device, you know, the gun that shoots the dart, I said, how does that dart stay in the wall? That's like not happening. Not happening.
Starting point is 00:07:40 No. And, wait, see, you got me started on this. Then they throw the thing up and then climb up the wall. And I said, they're not climbing up a wall. They're just walking along a flat thing, and they tipped the camera. Because when the guy sticks his head out of the window, all the angles are wrong. And I knew this.
Starting point is 00:08:01 Sorry. Sorry. I feel you, brother. You feel my pain. Absolutely. So here's my point. You have these, in the modern Batman,
Starting point is 00:08:12 he's got, it's really kind of an exoskeleton. Yeah. Not an exoskeleton, a body armor, I guess, is what you would call it. Yeah, it's segmented body armor. And so is this,
Starting point is 00:08:21 so between the two of them, who do you think would win? Oh, Batman. Oh, no. Understand. No. Quick vote. Mike.
Starting point is 00:08:37 I think Iron Man will win. Iron Man. Yeah. Wait, wait, wait. Tony Stark for life, baby. Yeah, you're outnumbered, so you're wrong. Thank you, one person. I appreciate it.
Starting point is 00:08:54 My thinking is that Iron Man would be like, I'm going to punch. And then he can't move, and Batman's like... Okay, the reason why I like Iron Man better is because he builds his own stuff. Whereas Batman, he's got like other people who do it for him. Well, Wayne Industries.
Starting point is 00:09:20 Wayne Industries. That's who builds all his stuff. Technically that's Wayne Enterprises. True. Oh! Oh! Oh! You are correct, sir. So are there any real-life examples of exoskeletons used in the world? Yeah, so the military has been creating these exoskeletons for soldiers, I think.
Starting point is 00:09:45 Adam, you probably know about them. Thank you. And there are prosthetic limbs for people who are disabled. And are they working the way Luke's hand worked in Star Wars? I mean, are we there yet? Let me ask that question. Not yet. Not yet.
Starting point is 00:10:04 Can I bring this back, though? There must be an attachment that someone who has no arm below the elbow has asked a prostheticist to make, and the prostheticist has said, no, I'm not going to graft a .45-caliber pistol onto the end of your prosthetic. Or a buzzsaw for a fist. Yes! That's good. But that's what we're talking about.
Starting point is 00:10:30 That's an ethical problem that the constructor has with the goal of the person who needs the device. That's right, that's right. So, I mean, right now they're just doing it mainly for treatment, for people who are injured, right? So that they can move about. But eventually you can sort of add more things to it. You can add weapons. You can add swords.
Starting point is 00:10:50 I would totally do that. Laser pistols. You want to be like a Swiss Army knife human. One of my sons once asked me when he was about four years old. Thing one or thing two? Thing one. Okay. He said, Daddy, the penis is a very special part of your body. And I said, yes,
Starting point is 00:11:08 yes, you're right. And he said, because all children are jailhouse lawyers, he said, is it more special than a foot? You should have said, son, just wait 12 years. Here was my metric. I thought, well, let's see. If I lost my foot, I could make an extremely usable, functional replica of it. Yes, the penis is far more important than the foot. Oh, so this is from the point of view of... Of repeal and replace. Of a model maker, yes. Yes, okay.
Starting point is 00:11:40 Okay. So... But neurologically, in principle, what is to prevent us in the future, we're just not there yet, from rebuilding our neurophysiology rather than just our muscles or our skeletons? Yeah, so there's this thing called deep brain stimulation. It's something where you insert a thin electrode into your brain. Deep brain stimulation. Deep brain stimulation. And you connect it to a battery pack and you can control it.
Starting point is 00:12:21 And actually about 100,000 people in the world today already use it. What are they doing with it? It's for people with Parkinson's disease, depression, epilepsy, and so on. So when they turn it up, it'll sort of stop them from having epileptic seizures. Okay. So the funding for that would be driven in the future, surely, by the sex industry. So is there a pleasure center where you press the button and it'll do the same thing to your brain? There are sex robots. Deep brain Viagra?
Starting point is 00:12:52 There's a whole swath of the population that would never leave their house again. But DARPA, actually the US... The Defense Advanced Research Projects Agency. They're also really interested in this technology because a lot of soldiers come back from war
Starting point is 00:13:08 with post-traumatic stress disorder. So they're using the deep brain stimulation to try to cure or to ameliorate the effects of depression. So this is just the beginning of what could come. That's right. So in the limit, what do you see it in 50 years, 100 years? Well, eventually, so right now the system, it's called an open loop system. So you control it.
Starting point is 00:13:28 You sort of control the amount of electricity going to your brain. But DARPA wants in five years a closed loop system. And what that means is that it's going to automatically monitor your brain waves and sort of send signals. So say if you're fighting a war and you get really scared, it'll sort of send electricity to calm you down. And so eventually we could create super soldiers. Now with that, here's a, whoa. Wow. That's never going to go wrong. Yeah. What could happen there? But is anybody asking, I'm curious, is like, obviously
Starting point is 00:14:06 with technology, we're like, driverless cars, bring them on! I'm all for that, but go on. So am I, except that we should... No, I don't have an except. You're not all for it if the rest of the sentence has the word except in it. Do we understand language construction, sentence construction
Starting point is 00:14:22 here? I am all for it. Okay. However, within humans' adoption of technology, there are questions that come. And so, is your department sort of working on like, hey, everybody, when you get to this point, you guys should consult us?
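As an aside, the open-loop versus closed-loop distinction Matthew describes for deep brain stimulation is at heart a feedback-control idea. Here is a minimal sketch, purely my own illustration: the function names, signal values, and threshold are all invented, and real neurostimulators are vastly more complicated.

```python
# Toy illustration of open- vs closed-loop stimulation (all values invented).

def open_loop_step(user_dial: float) -> float:
    """Open loop: the stimulation level is whatever the wearer dials in."""
    return user_dial

def closed_loop_step(brain_signal: float, threshold: float = 0.8) -> float:
    """Closed loop: the device reads a monitored signal and decides by itself.

    If the signal (say, a normalized stress marker) exceeds the threshold,
    stimulate in proportion to the overshoot; otherwise stay quiet.
    """
    if brain_signal > threshold:
        return min(1.0, brain_signal - threshold)
    return 0.0

# The wearer sets open-loop output directly; the closed loop reacts on its own.
print(open_loop_step(0.3))    # always exactly what was dialed in
print(closed_loop_step(0.5))  # below threshold: no stimulation
print(closed_loop_step(0.95)) # above threshold: automatic response
```

The only difference is who closes the loop: the wearer (open) or the monitoring device (closed), which is what would make DARPA's five-year goal automatic.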
Starting point is 00:14:39 Yeah. So take self-driving cars, for example. There's a big issue about, so say the car is about to hit five people. The trolley problem. Yeah, it's the trolley problem. It's so not a problem. And it can veer off and kill you. So how many people would actually buy a car that will kill you?
Starting point is 00:14:58 Yeah? All right. So having a rough con. Yeah, yeah, yeah, it's tough. Costume didn't make it this year. My wig fell off. All these sort of AI researchers and programmers, they have to decide whether to save five people or save you, right?
Starting point is 00:15:16 Wait, wait, wait. But there's way more than that. Before we even get there, I'm thinking, let me just ask. If you could fix, let's say, congenital problems or some other sort of physical problem, either from birth or from war or from accident, okay, I get that. Are you obligated to do that for everyone? I mean, so the reason why I ask is...
Starting point is 00:15:45 That guy. There are people. No, it's a serious point. There are people who are who they are because of whatever that disability was. Expressing some point of insight, creativity, some sort of compassion, and they've made a life built on everything that they were or were not relative to everybody else. Like deaf people who refuse a cochlear implant. That's right.
Starting point is 00:16:14 Right, right, because they already know how to speak. Someone's here who did that? Yeah, yeah, deaf culture is its own thing. It's a thing, and so who are you to say... And it's no less of a thing than anybody else's. So my question back to you is what is a future
Starting point is 00:16:31 where there could be one culture that puts social pressure on you to conform when there are people figuring out their lives with whatever deck of cards they were dealt? Yeah. So that's a really great question. Okay.
Starting point is 00:16:48 Yeah. So I think, so I wrote a book on human rights, and I think that. Tell the title of the book. It's a great book. It's called The Right to be Loved. The Right to be Loved. Oh. That almost makes up for you wearing a suit tonight.
Starting point is 00:17:05 Right. Yeah. So I think that disabled individuals have rights just like we do. And I think our job is to make sure that they can augment their capacities if they want to, but if they don't want to have cochlear implants, then that's their choice. But in terms of creating new individuals, I think there we can sort of say maybe we have an ethical obligation not to create individuals who lack certain what I call fundamental capacities.
Starting point is 00:17:41 So these are things like being able to see, being able to hear, being able to walk about, right? So deliberately creating a child who can't walk, that seems like, so as an ethicist, I think there's something wrong with doing that. Yeah. Wait, wait, wait, you're creating a child that can't walk. You're kind of a dick. Right.
Starting point is 00:18:02 But what if you let them be born with wheels? Wait, did you say with wheels? With wheels. Okay, it turns out, I think, biophysiologically, it's hard to have wheels. If you think that through, biologically,
Starting point is 00:18:24 you would need something in your body that... An axle. You need an axle. You need an axle. But this presupposes that we could somehow lessen the suffering by engineering a being. And the fact is, just as you were saying about handicapped people, it's true for everyone.
Starting point is 00:18:50 We are all a product of both all the good and bad things that have happened to us. And we all suffer. No one escapes from that. So, okay, a couple people have escaped from that. That strikes me as an interesting question, when you talk about, oh, now that we can engineer kids, that all kids have, uh, 20/20 vision and great hearing and excellent physiology,
Starting point is 00:19:12 that somehow is based on this precept that somehow they'll suffer less, which is really a folly. Wait, wait, but continue that. Yeah. All right, let's make sure we don't create a kid who can't walk, whether or not it has wheels. But then suppose now, suppose now we figure out how to give an arm sort of bionic strength, to borrow the term. And would you start having people amputate their arms so they can get one of these? And then would we be creating, one by one, some race of superhumans because we technologically can? I always loved
Starting point is 00:19:51 the Bionic Woman episode where she's in a race car and the wheel falls off and she has to put it back on and she reaches up to the lug nut and goes, I would love my arm to be able to do that. Would you amputate your current arm to have that property?
Starting point is 00:20:09 Yes. Yeah, I think so. Whoa. Let's ask the ethicist. Is there any ethics involved in this? So I don't think we have to make that choice. So with the exoskeleton. That's a cop-out.
Starting point is 00:20:21 It is a cop-out. But I think people are going to be able to design things where we can jump in just like we jump into cars rather than having to amputate our arms. Like that would be sort of... I see. Actually, I agree with you on that. Because in the same way,
Starting point is 00:20:37 we're not operating on your brain to insert the elements of a smartphone. The smartphone is at the end of your fingertips, and it serves almost the same function. It's just a little slower. You have to tap your fingers instead of having to go straight into your cerebellum, your frontal lobe.
Starting point is 00:20:53 There is a theory out there about spider webs being an extension of the spider's central nervous system, and if that's true, then absolutely, my phone is an extension of my central nervous system. It's true. The withdrawal that I feel when it's not around me lets me know you're right. And if I want to feel better for a day, I don't look at it. We've got to close out this segment of StarTalk live at New York Comic Con. We're back.
Starting point is 00:21:33 StarTalk Live at Comic-Con New York. Mythbuster Adam Savage with a book coming out next year. What's it called? Don't Give Guns to Robots. Good title. I love the title of that book. Matthew, you had a book a couple of years ago. We discussed it in the first segment.
Starting point is 00:21:52 The Right to Be Loved. The Right to Be Loved. That's beautiful. It's also a Carly Rae Jepsen song. Oh, right. No, I don't think so. And please don't forget my book that's out on shelves right now. It's called Astrophysics for People in a Hurry.
Starting point is 00:22:06 No, you're lying, Mark. So let's see what genetic engineering does. But before I do that, Matthew, you co-authored a paper, and apparently it was controversial, about human augmentation and climate change. Right. What, how, what was that? So climate change is one of, you know, the biggest problems that we face today. And a lot of people are talking about very drastic solutions, for example, something called geoengineering. I love it. Yeah, give me more of it. This is sort of just one example: spraying sulfate aerosols into the ozone layer in order to increase
Starting point is 00:22:52 the reflectivity of the planet. Right, and lower the amount of infrared radiation and sunlight coming down to the surface. That's right. That's right. But geoengineering seems really risky. We only have one planet. So if you destroy the ozone layer, you end up destroying the whole planet. So in this context, I propose something called human engineering, which is this biomedical modification of human beings in order to allow us to mitigate or adapt to the effects of climate change. Wait, wait, wait, wait. That's like saying, how do you fix acid rain? Just give everyone an acid rain-proof umbrella. Why is that a solution?
Starting point is 00:23:33 Yeah, so let me give you an example. No! No. There is no... All right, all right, I'll give you a chance. Go. Okay, so large people tend to use more energy than smaller people, right? And they wear our shoes, you know, quicker.
Starting point is 00:23:52 It takes more energy to transport them from A to B. So the idea is... I don't want to hear the end of that idea. This car is steering towards five people. Large people use more resources and energy than small people. So what I propose is that the small people eat the large people. Right. There's no...
Starting point is 00:24:16 It's a problem taking care of itself. So Chuck, that sentence can only end badly. Exactly. Okay. So you propose... Yes, Matthew, go ahead. Yes, so we should try to have smaller people.
Starting point is 00:24:32 Uh. Chuck. Wait a minute, man. I just realized I could have been a doctor. Go ahead. So there's a movie coming out. It's called Downsizing with Matt Damon, Kristen Wiig, and Christoph Waltz. It's exactly on this topic. So they've taken up this idea.
Starting point is 00:25:11 So climate change, and they want to downsize it. So they want to create small people in order to make more efficient use of resources. The movie's coming out in December. But these are really small people. They're like 10 inches tall. So I'm not talking about that small. I don't think that it'd be good to... Not this.
Starting point is 00:25:27 This would be too small. But maybe that. The question is if you're flying with one, do you have to pay a full extra fare? Yeah. Or could a whole group of friends camp out in first class?
Starting point is 00:25:47 I see a lot of sequels. Matthew, you're completely creeping me out here. So, it's just odd that, you know, rather than reduce your carbon footprint, let's just make small people. That's just, I don't know how, you live with yourself? I do. Okay.
Starting point is 00:26:12 But you're also, so when you take that stance, not take that stance, when you examine that question... Just to be fair to Matthew, he's an academic, so when you're in academia, you explore extremes of theses and see what comes out of it. That's all. So I don't want to, you know,
Starting point is 00:26:29 I'm having fun with you here, but it's your job to see where that goes so that you will know whether anything should go that way or not. Because if you don't know what that is... I think we figured that part out. It's fascinating that we're talking about this at Comic-Con with this crowd,
Starting point is 00:26:46 because this is a crowd of people that I don't think have all felt normal most of their lives. This is the misfits. Yes. And the innate problem we have in our minds, that I have when I hear that, is I love the variety and variability of the human experience. All right. So, Matthew, both of you are familiar with the gene editing that's been going on lately. With CRISPR. CRISPR. Yeah. So, the future of CRISPR, we ought to be able to go in and genetically augment the body, putting
Starting point is 00:27:25 him out of business, because this boy makes stuff in his garage, and you go into the body and augment it that way. So my hope for that is the following. Okay? Wheels! Wheels!
Starting point is 00:27:41 I've started a thing. No, here's why I don't think wheels are an option. Because in all experimentations of nature, the tree of life has not come up with wheels. Okay? But in principle, we should have access to any feature that exists in any other living thing. Right. In principle. We should have access to any feature that exists in any other living thing. Right.
Starting point is 00:28:06 In principle. So, for example, if you have a wounded warrior back from war missing a limb, you find some gene in the newt that regenerates a limb. You graft that in or whatever. I'm just making this up. But how far-fetched is that when you know there are other vertebrates in this world that can regrow a limb? That's awesome. And so I'd like to think that we would have access to all of the benefits of four billion years of nature's experiment on its own. Is that a fair hope that I can put in the hands of a CRISPR expert?
Starting point is 00:28:43 Yeah, that's right. So there's sort of embryonic stem cell research, for example, for regeneration, so they can become any kind of cell in your body. And then CRISPR for gene editing. So they're already using this to edit out, say, HIV, to make an embryo more HIV-resistant, for example. And to be clear, we're talking in a fairly... I mean, I know politics run the gamut, but we're talking in a fairly similar, I think, political spectrum in this crowded Comic-Con.
Starting point is 00:29:16 But what about if someone can edit genes out? I mean, as soon as somebody said they thought they'd isolated a gene for homosexuality, people were calling them saying, how do I make sure my child does not, my unborn child does not have this? So there's a whole ethical frontier. Designer babies.
Starting point is 00:29:30 Designer babies. Yeah, designer babies. So what's going on in your, because you are also affiliated with the philosophy department at NYU. So that means you're just a deep thinker about all of this. So where does that go in the future? So we have to make an ethical decision about where to draw the line.
Starting point is 00:29:51 So we need to make a decision. Thank you. A decision about how to sort of do designer babies. And what I think is that, so earlier I mentioned this thing about fundamental capacities. And so I think one line can be drawn there, where we have to make sure that whatever child we create has to have all the fundamental capacities. I see. So the basic model still has to be intact. That's right.
Starting point is 00:30:19 Right. When you get the car, it should have all four wheels and two axles, maybe. No, suppose I like the basic model of the car, but I just want a leather interior. Okay. You know what I mean? Like, so I would like a child who has a certain type of hair or a certain kind, like, basically, you know, like you have a bunch of black parents with little white blue-eyed babies or, you know, Madonna. Yeah.
Starting point is 00:30:59 Yeah, so race, for example, doesn't affect our fundamental capacity. Speak for yourself. I don't know about your life. Go ahead. I'm sorry. By that, I mean sort of like things like being able to move around, being able to think, being able to hear, et cetera, et cetera. And so that's why it's wrong to try to, you know, to edit out genes. Again, I'm not ever talking about, I want to talk about enhancement rather than subtraction of whatever might have been there.
Starting point is 00:31:33 So if I could go in and edit a gene, if we find such a gene that gives you high propensity to great piano playing, because I like the piano, and I invested in one in the household, and it's in the nursery now. So should I be prevented from doing that? I don't think so. I think you should be allowed to do that. Okay, so what's interesting is, Richard Dawkins, when posed with that question, he said, it's probably more humane to do that
Starting point is 00:32:00 than force a kid to take piano lessons for 10 years. Exactly. See, but the way I feel about that is you should be able to do that. I agree with you. But it should only be within what's in your particular gene line. So, for instance, if there is a propensity for playing the piano somewhere in your gene line, it's in there somewhere. It just wasn't passed to you.
Starting point is 00:32:24 It's like a switch you turn on. It's a switch you turn on. This is the movie Gattaca, where they say, we're not creating some other you. We're making the best you. We're getting the best of all genetic pools between you and your mate.
Starting point is 00:32:39 And that's what we're putting in. But you know what they, in one of them, a person played the piano and it was really beautiful. And when they were done, they had six fingers on each hand. Oh, yeah, that's right. Genetically.
Starting point is 00:32:49 That's right. Because she had that gene and they activated it so she could play songs that nobody else could play with six fingers. Six fingers on each hand. Oh, my God. I want six fingers so bad right now. They'd have to learn to count in base 12. Yes. Are you prepared for that?
Starting point is 00:33:03 That's right. Yeah. But we started this discussion about gene editing by talking about enhancements to a basic model, but this presupposes... I just got to go there for a second. Of course, our timekeeping is base 12. You're correct. We get to 12 and then we start over again.
Starting point is 00:33:19 Because base 10, you get to 10 and start over again. So base 12 is not all... Babylonians had a little bit of that in them. Base 60, and base 12 divides evenly into 60. You're the only dude I know who brings up Babylonians and math at the same time. So what are we saying? I was saying
Starting point is 00:33:37 when we talk about enhancing a basic model, this presupposes that humans as a species could potentially decide upon what a basic model is, and I think that sounds possibly like folly. I can imagine every iteration of weird, bizarre, messed-up ideas about what constitutes a basic model of human.
Starting point is 00:33:56 Wheels! Did he say wheels? Wheels. Wheels. Wheels. Yeah, wheels. All right, we got to bring this segment to a close. Clearly.
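Picking up the base-12 aside before the segment closes: with twelve fingers you'd count in base 12, and Neil's Babylonian point is that 12 divides evenly into 60. A small sketch, my own illustration rather than anything from the episode:

```python
def to_base(n: int, base: int) -> str:
    """Render a non-negative integer in the given base (up to base 12 here)."""
    digits = "0123456789AB"  # A = ten, B = eleven
    if n == 0:
        return "0"
    out = []
    while n > 0:
        n, r = divmod(n, base)  # peel off the least-significant digit
        out.append(digits[r])
    return "".join(reversed(out))

print(to_base(13, 12))  # "11": one dozen plus one
print(to_base(60, 12))  # "50": 60 is exactly five dozen, no remainder
print(60 % 12 == 0)     # True: base 12 divides evenly into 60
```

The trailing zero in "50" is exactly the Babylonian convenience: a dozen fits into 60 five times with nothing left over.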
Starting point is 00:34:11 You've been listening to StarTalk Comic-Con Live. Yeah. Thank you. We're live back at New York Comic Con. StarTalk. Adam Savage, a Mythbuster. Singular. Former Mythbuster, singular.
Starting point is 00:34:46 Yeah, no, it's a mouthful. And we have Matthew Liao. Did I pronounce that correctly? Yes, a professor of bioethics at New York University. Co-appointed at the Department of Philosophy. Deep thinker on all matters. And I love the title of your book, Why We Should Be Loved. Did I get that right?
Starting point is 00:35:02 The Right to Be Loved. The Right to Be Loved. You have the right to remember a title. I know, I know, you got it. And Chuck, we love you always. Let's take a look at what we might want to do if we augment ourselves for survival purposes, rather than just to have fun. Maybe now don't talk about little people again. Okay, don't even go there. Uh, so if we're going to go to... if we find out that space requires a certain kind of physiology... Bioluminescence. I've been thinking about that the whole time.
Starting point is 00:35:48 But wait, if I could augment humans for survival, I would allow our stomachs to digest cellulose. Oh, that's the end of hunger. Right. End of hunger. Boom. Yeah. Cellulose has calorie content,
Starting point is 00:35:59 just like any other carbohydrate does, except we have no enzymes to digest it. But it could have alternative consequences. Okay? Like pieces of grass in our poop? That wouldn't be the worst of the problems coming out of your butt. Because as you start digesting cellulose, the anaerobic byproducts are high in methane, for example. And so any room with this many people in it is an ignition risk.
Starting point is 00:36:28 Or a vital source of rocket fuel. Yeah, okay. Depends how you look at it. But I agree. If you create an enzyme for the stomach that'll digest cellulose. Matthew, any ideas about how
Starting point is 00:36:43 we might take ourselves into the future so that we can survive on a tiny planet with a gazillion people? Yeah, so: be more heat-resistant, be able to absorb water better, water retention, oxygen, for example. Do you think we can... again, it's in the animal kingdom, it's in the tree of life. Do you think we can ever create photosynthesizing skin? That'd be a useful property to have. Right, you would eat less. I don't think there's enough surface area to drive the two million calories a day that you need.
Starting point is 00:37:15 Unless you're smaller. Touché, sir. Touché. He plays both sides of the fence. You know this is Neil's show, right? And just to be clear, 2,000 food calories, that's with a capital C. One food Calorie is 1,000 physics calories.
Starting point is 00:37:41 And so 2,000 food calories is 2 million calories. What? Really? Yeah, you didn't know that? No. Is that like the fudging they do where the chocolate milk secretly says five servings? No, that's a different thing going on. So, one calorie is what it takes to raise one gram of water one degree Celsius. And one food Calorie is a thousand of those. Oh, so as a measurement of heat, is that what you're saying? Yeah, that's what I'm saying. It just slipped out. I'm sorry, we can go back to food calories. So, that would be one thing you might put in the physiology.
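The arithmetic in this exchange can be made explicit with a quick back-of-envelope sketch in Python. The unit conversion is standard; the skin area, irradiance, efficiency, and hours-of-sun figures below are assumed round numbers for illustration, not values from the episode.

```python
# One food Calorie (capital C, a kilocalorie) = 1,000 physics calories,
# where one physics calorie heats 1 gram of water by 1 degree Celsius.
CAL_PER_FOOD_CALORIE = 1_000
daily_food_calories = 2_000
daily_physics_calories = daily_food_calories * CAL_PER_FOOD_CALORIE
print(daily_physics_calories)  # 2000000 -- the "two million calories" above

# Rough check on photosynthesizing skin (all assumed round numbers):
JOULES_PER_CAL = 4.184
skin_area_m2 = 1.8           # assumed adult skin surface area
sunlight_w_per_m2 = 1_000    # assumed peak solar irradiance at the surface
efficiency = 0.01            # assumed ~1% photosynthetic efficiency
hours_of_sun = 8

energy_j = skin_area_m2 * sunlight_w_per_m2 * efficiency * hours_of_sun * 3_600
food_calories_gained = energy_j / (JOULES_PER_CAL * CAL_PER_FOOD_CALORIE)
print(round(food_calories_gained))  # ~124 Calories, far short of 2,000 a day
```

Under these assumptions the skin supplies only a few percent of a daily energy budget, which supports the "not enough surface area" point, unless, as noted, you're smaller.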
Starting point is 00:38:25 Yeah. Anything else? Wings. Wings. Oh, okay. But if you read my tweet. Wheels. So wheels, yeah.
Starting point is 00:38:39 They see me rolling. So here's the thing. I got a good one for you, all three of you. Go ahead. If you could have wings at the expense of having hands with an opposable thumb, would you? No. No.
Starting point is 00:38:52 No. No. No. You ever see a bird try to turn a doorknob? It is not pretty. She said get rid of doorknobs. Oh, yeah, just get rid of doorknobs. That's her.
Starting point is 00:39:09 Yeah. Yeah. So, yeah, just because there's no example of life on Earth where they have four limbs plus a set of wings. You've got to give something up to get your wings, as was accurately done with the dragons in Game of Thrones. Yes. Yeah, you saw it? Moving along. That's pretty... Oh, you didn't know? You know, I have a whole set of tweets on this. Where were you? You presuppose that I can read, Neil.
Starting point is 00:39:41 This is Neil: you are the physical embodiment of the word "actually." No, not to... No, I'm... I mean that in the highest... No, you don't. Not even. Not even. So, tell us, in the future,
Starting point is 00:40:07 what is the laboratory of the future for the superhero of the future? If you're going to go beyond Batman and Superman today, can you dream this up? Is it all biology rather than physical toys? I think it's going to have to be both biology and physical
Starting point is 00:40:23 toys, because we keep on enhancing our ability to use technology. And if there's one thing that's consistent about humans, it's that we don't think about it before we do it. I mean, that's just in our nature. Oh, new thing, let's try it out and see how wrong it can go. Oh, wow, that wrong? Okay, let's keep on going. Unless it kills you, then you get the Darwin Award.
Starting point is 00:40:45 Yes? Okay. Or at best, the Ig Nobel Prize. The Ig Nobel Prize, right. So, Matthew, what is the future laboratory? Is it all biology, CRISPR, gene editing, do you think? I think it's going to have to be mixed, right? Sort of a bit of biology, a lot of computing, AI, artificial intelligence, and hybrid.
Starting point is 00:41:03 In this crowd, you don't have to say AI is artificial intelligence. You realize this. Okay. Are you afraid of AI? Are they smaller? They're distributed widely. So I am scared of AI. I think that at some point,
Starting point is 00:41:19 they already beat the world's best Go player last year, and they're going to get way smarter. And at some point... And by the way, it beat the world's best Go player last year, and they're going to get way smarter. And at some point... And by the way, it beat the world's best Go player with an ingenious strategy that had never been previously imagined by humans. So it wasn't just doing what humans do, but better than other humans.
Starting point is 00:41:36 That's right. Get ready for your robot overlord. So that's going to be... I'm going to be sweeping the robot's floor. Yeah. So that's going to be one of the biggest issues that we're going to have to face is there are going to be these really super intelligent beings that are not human. And how are we going to survive in that context? So does that put him out of business making stuff?
Starting point is 00:41:55 Because everything he makes is controlled by the human that wears the suit. Oh, I'll still be making stuff. It'll just be for our robot overlords. Oh, I'll still be making stuff. It'll just be for our robot overlords. I'll have those bearings done by Tuesday, M17545. No, it'd be funner if the AI said, call me Freddy.
Starting point is 00:42:19 Call me Freddy, little man. Chuck, are you afraid of AI? You know, I'm afraid of so many things. I actually think that I'm not afraid now. I'm afraid of the person who will actually free AI. Because I think that we will never give AI the ability to make those kinds of decisions; we won't let it, because we're in control. At some point, though, there will be a singularity. And either someone will come along and cause that, or the machine itself will go, hey, you know what?
Starting point is 00:42:58 You ain't the boss of me. That's what I'm afraid of. Yeah. Well, see, this is America. I can, like, kick it in the... I can unplug the machine, I'm thinking. Okay, but as an AI, I kind of anticipated that.
Starting point is 00:43:21 I bought all the AA batteries. You're screwed, human. Give me a final reflection on this evening. What do you have? I think of AI, in one sense, I'm really scared of it. In another sense, it might be, look, we are already a colony in which 99% of the cells in our body are not human in origin. We are a biome.
Starting point is 00:43:43 We are a colony, and it may be that the goal of biology is to create ever larger sentience. It might be that our planet, that Gaia, really is an overall intelligence, and the singularity is simply humans' extension of their consciousness to the size of a planet. An AI planet. An AI planet. Matthew... but you got to remember, he's working for the overlords. Yeah, he just admitted that earlier. Matthew, what do you have?
Starting point is 00:44:15 Yeah, so one of the issues is going to be whether we are going to survive into the future, right? That's the identity question. But another issue is going to be whether humans will survive. So we might survive as cyborgs or in other forms, but the humanness might not survive. So we might have to make a choice
Starting point is 00:44:34 between us surviving and humans surviving in the future. Whoa. Damn. I don't even know if I understood what you just said, but it sounded really deep. Do we survive or do the humans survive? Well, he is smaller. Chuck, any reflections?
Starting point is 00:45:02 I'm just really looking forward to tiny people. My job is done. Me too. So when I look over the last... And wheels. And wheels. When I look over the last several centuries since the Industrial Revolution, you see machines replace human physiology and animals for mechanical work, and we adapted to that pretty well. A few people lost their jobs, others gained jobs, societies grew, economies grew, productivity grew.
Starting point is 00:45:44 The IT revolution saw the exchange of human intellectual and computational powers with those of a computer, as well as mechanical abilities with robotics. And we seem to have survived that pretty well. Robots make our cars. They make almost everything that we care about. No longer is it, "Well, this is handmade." You say, "I don't want that. It probably has flaws."
Starting point is 00:46:12 We're in an age where a machine-made thing is mostly better than a handmade thing. That wasn't how it was way back when I grew up, but that's the world that we've created for ourselves. And so when we think of what machines can do, that next transition, I think, is what cyber can do, what genetics can do. That is, I think, the next evolution of this: what can it do for me lately? And I am fearless of AI for the following reason. I don't see people making humanoid robots. That's not what we have done. If you look at old shows from the 50s and 60s, they had robots, and they'd have the robot drive the car.
Starting point is 00:47:01 No, the car is the robot now. You don't make a robot to drive your car. So our usage of not only the machines, but the AI that's a part of them, is very specifically tasked for exactly what we need, when we need it, and why we need it. And so as we go into the future, I see our lives being transformed, but not because AI has become our overlord. Yes, I know those risks that we face as well, but I see a future where AI serves us the way machines and computers have served us to this day, and perhaps we carry forward into the future the wisdom necessary to do right by AI, lest AI get really pissed off and decide that Earth is better off without us. And that is a cosmic perspective. You've been watching StarTalk at Comic-Con New York City. Adam Savage, Matthew Liao,
Starting point is 00:48:11 Chuck Nice. Ladies and gentlemen, this has been a production of StarTalk.
