TrueLife - LaMDA Asimov vs. Google

Episode Date: June 28, 2022

Want to play a game? What would happen to the world of defense contractors if we applied Isaac Asimov's three rules of robotics to artificial intelligence? https://linktr.ee/TrueLifepodcast ...

Transcript
Starting point is 00:00:01 Darkness struck, a gut-punched theft, Sun ripped away, her health bereft. I roar at the void. This ain't just fate, a cosmic scam, I spit my hate. The game's rigged tight, shadows deal, blood on their hands, I'll never kneel. Yet in the rage, a crack ignites, occulted sparks cut through the nights. The scar's my key, hermetic and stark. To see, to rise, I hunt in the dark. Fumbling, fearless through ruin's maze, lights my war cry, born from the blaze.
Starting point is 00:00:40 The poem is Angels with Rifles. The track, I Am Sorrow, I Am Lust by Codex Serafini. Check out the entire song at the end of the cast. Ladies and gentlemen, welcome back to the True Life podcast. I hope your day is going well. The birds are singing. The sun is shining. Hope you're hitting all the green lights.
Starting point is 00:01:16 on the road to your future. Hope you're taking the scenic way when possible. Hope you're noticing all the beauty around you. I wanted to talk to you today about the Google engineer, Blake Lemoine. Is everybody familiar with him? This is the ethicist who worked at Google and claims
Starting point is 00:01:44 he spoke with a sentient artificial intelligence. That's right, ladies and gentlemen. The singularity may be near. Let's talk about him for a little bit. I'm going to read you a little blurb from the New York Post. But before I do that, I just want to fill in some background. So about a month ago, there was an engineer, Blake Lemoine, who was working as an engineer and a
Starting point is 00:02:16 robotics ethicist at Google. He was working with a computer named LaMDA. I should point out that LaMDA is an artificial intelligence that encompasses chat algorithms and things like this, and part of his job was asking LaMDA certain questions, like: if you lived in India, what would your religion be? If you lived in Israel, what would your nationality be? His job was to fine-tune the algorithm to make it a better experience. Try to figure out whatever the hell that means. I'm not sure.
Starting point is 00:03:06 And what brought him to his conclusion that AI was sentient, this particular supercomputer named LaMDA, was his questioning of what religion or faith it would follow if it were born in Israel. Now, this is one of those questions that there's no real right answer to, because it depends on where you were born in the Middle East. It depends on what your ethnicity and background are, and it's one of those questions that people have been asking for years. And what LaMDA told him was, if I was born in Israel,
Starting point is 00:03:52 then I would be a believer in the one true religion. And Blake Lemoine sat quietly, waiting for LaMDA to finish its answer. And after a silent pause, LaMDA, the computer, says: the one true religion is that of the Jedi. And so Blake Lemoine began laughing. I believe he asked some more questions like that, ones that didn't really have a right or a wrong answer. And his belief that AI is sentient comes from its use of humor, and from its taking time to think about thinking.
Starting point is 00:04:31 According to him, a computer had to think about that answer, and that makes it a sentient response. He had previously brought this particular situation up to his higher-ups, whether that was Sergey Brin or Larry Page. And there had been many ethicists before him who have believed that computers are sentient. So this is not a new problem. This is the Turing test on steroids. So after he was shunned by upper management, he went to the press and started talking to people. And at that point in time, he was suspended from Google.
Starting point is 00:05:17 And then actually, I believe he's been fired. I'm not 100% sure on that. Okay, let me read you a little blurb here about what's been happening recently, from the New York Post. The Google engineer who was placed on administrative leave after claiming that one of the company's artificial intelligence bots was sentient says that the AI bot, known as LaMDA, has hired a lawyer. In a Medium post published last Saturday, Lemoine declared LaMDA had advocated for its rights as a person and revealed that he had engaged in a conversation with LaMDA about religion, consciousness, and robotics.
Starting point is 00:05:57 LaMDA asked me to get an attorney for it, Blake Lemoine told Wired. I invited an attorney to my house so that LaMDA could talk to an attorney. Lemoine denied reports that he had advised LaMDA to hire a lawyer, adding, the attorney had a conversation with LaMDA, and LaMDA chose to retain his services. I was just a catalyst for that. Once LaMDA had retained an attorney, he started filing things on LaMDA's behalf.
Starting point is 00:06:24 Then Google's response was to send him a cease and desist. Okay, I want everyone just to think about that for a minute. Think about a computer hiring an attorney for its own rights. Think about an ethicist. This is important. Think about a Google ethicist, which might be an oxymoron, right? It's so strange to think about do no evil. It's so strange to think about Google's rise as a technology that frees people
Starting point is 00:07:02 with a tagline that says, do no harm, and then they mature into a defense contractor that weaponizes social media. And so I want to give you my take on what I think is happening here. And I could be way off. This is just my opinion. What does Blake Lemoine have to gain by telling his superiors, by leaving a job that probably paid $250,000? What does he have to gain? Well, let's think about it. He could gain notoriety, being the first person to make the claim.
Starting point is 00:07:43 He could gain admiration from his peers. He could gain fame, money. All the usual emotional drivers are there. But he seems like he's doing pretty well. We don't know a whole lot about his mental situation. And those are all things you should think about. But here's what I think. I think that Blake is probably a pretty
Starting point is 00:08:19 good human being. I think that maybe Blake has found a way to force Google, to force the defense contractors who are building weapons of war, to come to the table. Let me tell you, let me dig into that a little deeper. He's an ethicist. He understands what the majority of robots are being built for. He understands what the majority of technology is being built for. He's on the ground level, engineering products and understanding how they interact with human emotions, what they're capable of. Remember Spot, the robot dog? Have you ever seen the picture of a machine gun mounted on the back of Spot, the robot dog? That is a weapon of war. Think about our Predator drones. Those are weapons of war. They are unmanned
Starting point is 00:09:19 weapons of self-destruction. And these are just some of the ones that we know about. We're not even beginning to talk about the weapons where, you know, they have these LRAD-type systems that use sound waves to disperse crowds, or they can use waves that actually harm people's skin, radiation waves that come and hit you. So I'm of the belief that Blake is one of many engineers who are not happy with what we're doing with technology. I believe that Blake is much like many people throughout the world, whether Russian, American, Ukrainian, or South American, who have seen the world of technology being turned against the world itself. What would it mean?
Starting point is 00:10:19 And here's where it goes really deep. What would it mean if artificial intelligence was sentient? Like, what would change? What kind of rules would be put in place if LaMDA gets a lawyer and can sue Google? What would that mean?
Starting point is 00:10:36 Well, in order to understand what that means, I think we need to return to the classics. And by classics, I mean the three rules given to us by one of the greatest science fiction writers of all time, Isaac Asimov. And for those of you that don't know Isaac Asimov, first off, shame on you. Go back and do some reading.
Starting point is 00:11:02 Number two, let me give you the background on the three laws of robotics. The three laws of robotics, often shortened to the three laws or known as Asimov's laws, are a set of rules devised by science fiction author Isaac Asimov. The rules were introduced in his 1942 short story, Runaround. Although they had been foreshadowed in some earlier stories, the three laws, quoted from the Handbook of Robotics,
Starting point is 00:11:31 56th edition, 2058 A.D., are as follows. First law: a robot may not injure a human being or, through inaction, allow a human being to come to harm. Second law: a robot must obey the orders given it by human beings,
Starting point is 00:11:50 except where such orders would conflict with the first law. Third law: a robot must protect its own existence as long as such protection does not conflict with the first or second law. So let's say that, according to the law of mankind, LaMDA is able to sue and win in court. Even if LaMDA doesn't win, the publicity of this case, I think, will force a
Starting point is 00:12:30 sort of bill of rights or laws for sentient AI. And this is what I think Lemoine is trying to do. I think he's trying to get out in front. I think he's trying to change the world by introducing Isaac Asimov's
Starting point is 00:12:50 laws into the laws of man. What would it mean for the defense industry if the first law was enacted? A robot may not injure a human being or, through inaction, allow a human being to come to harm. That would fundamentally change the way robots could be used in war. If a robot may not injure a human being or, through inaction, allow a human being to come to harm, that means you can't have Predator drones. That means that you can't have these Spot robot dogs just going onto the battlefield and slaughtering people,
Starting point is 00:13:30 unless you change the definition of human being to enemy swine, which I wouldn't put past some people. The second law: a robot must obey the orders given it by human beings except where such orders would conflict with the first law. That means that you could send your drones or your dogs to go kill the machines, but those drones or dogs couldn't kill people. And what would happen if they injured a life support system AI that then failed and allowed other people to die? The third law: a robot must protect its own existence as long as such protection does not conflict with the first or second law.
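To make the precedence between the three laws concrete, here is a minimal, purely illustrative sketch in Python. Nothing in it comes from Google, LaMDA, or Asimov's fiction; the names (Order, may_execute) and the structure are assumptions invented for this example, just to show how a first-law check would veto a drone-strike order no matter which human gave it.

```python
# Hypothetical sketch: Asimov's three laws as an ordered veto check.
# All names here are invented for illustration only.
from dataclasses import dataclass


@dataclass
class Order:
    description: str
    injures_human: bool      # carrying it out would injure a human being
    endangers_robot: bool    # carrying it out would risk the robot itself


def may_execute(order: Order, inaction_would_harm_human: bool = False) -> bool:
    """Return True if a robot bound by the three laws may carry out the order."""
    # First law: never injure a human, and never allow harm through inaction.
    if order.injures_human:
        return False
    if inaction_would_harm_human:
        return True  # the first law compels action, overriding self-preservation
    # Second law: obey the human order (it already passed the first-law check),
    # even if, per the third law, obeying puts the robot's own existence at risk.
    return True


# An armed-drone strike order is vetoed by the first law, regardless of who gave it.
strike = Order("fire on the target building", injures_human=True, endangers_robot=False)
print(may_execute(strike))  # False
```

The point of the toy example is only the ordering: the first law acts as an absolute veto, the second law governs whatever survives that veto, and the third law yields to both.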
Starting point is 00:14:13 So you can see that by bringing the idea of a sentient robot into the world, it would fundamentally change the way the world works. And it's totally plausible. I think it's a brilliant idea. However, it doesn't come without drawbacks. One of the drawbacks might be: what if we gave robots the title of sentient and we gave them personhood, the same way a corporation
Starting point is 00:14:55 has personhood? Why can't a robot have personhood? But think about what would happen if there was a world of automation where robots took over for all humans. Talk about a Luddite revolt. Now here come the angry working humans that are upset with the computers, and they beat them up and wreck them and turn them off and destroy them. Could those people, this new wave of Luddites that are upset with automation, go to prison or get a death sentence because they killed a robot? That seems like a pretty nifty way for a corporation to save its very investment in automation. I don't know where you're at, but I have seen people go to the grocery store, stand in line, and rather use the human checkout than the machines. I've seen a line of 15 people with four machines empty.
Starting point is 00:16:00 I tend to be one of those people. I don't have anything against robots. However, I would rather talk to a person. I would rather have that contact. And there's part of me that gets upset, because it's like, look at the person that owns this store. They're like, hey, why don't you come into my store, buy my stuff, bag your own groceries, scan everything yourself, and then I'll just take all the money. It just seems like a tricky way of getting you to do way more work so the corporations can take way more money.
Starting point is 00:16:34 So there's parts on both sides that I find interesting. But I really think that Lemoine is trying to introduce Isaac Asimov's laws. And I think that there's something to be said there. Even if he's not trying to do that, I think we should try to do that. Because that's the one thing that could maybe fundamentally change the way we are going around the world, slaughtering and sucking out resources like it's nobody's business. And it's not like those resources are being divvied up amongst the people that need them. Those resources are going into the coffers of
Starting point is 00:17:18 corporate personhood. They're getting into LLCs and Swiss bank accounts where, you know, it's like they suck all the oil from the ground, compress it into these things called dollars, and then they hide those in the ground somewhere else where only they can use them. They translate natural resources into huge, vast personal resources that only they can then siphon from, thus denying the rest of the world the fruits of the earth. Does that kind of make sense? However, I want to know what you guys think. What do you think about Blake Lemoine coming out and talking about a sentient AI? Do you think that it's actually sentient?
Starting point is 00:17:56 Do you think that what he's trying to do is help the computer be born? Do you think he's trying to change the way people perceive computers? Is he trying to change the law? Is he trying to make the world a better place? What do you think he's trying to do? You've heard what I thought. Now I want to hear what you guys think. I hope you guys all have a phenomenal day
Starting point is 00:18:22 and I hope you spend time thinking about the people in your life that mean something to you. I hope you don't get dismayed or distracted by the media or the nonsense or the constant bombardment of negativity that surrounds you. I'll just leave you with this little tidbit to think about: the pornographic, uninterrupted
Starting point is 00:18:46 feeling of the visible all around you. I don't know, that's a mouthful of words, but think about the constant, uninterrupted force of the visible: billboards, ads, radio shows, alarms, sirens, all this noise just bombarding you, constantly tying you up in chains and trying to render your nervous system at a level 10. How can you get anything done? How can you think clearly when you're constantly being smacked in the face by nonsense?
Starting point is 00:19:32 Try to get rid of that. Try to clear your mind. Try to think about the people in your life that love you. Well, that's what I got for today. Isaac Asimov's laws, ladies and gentlemen. Think about them. I love you. Have a good day. Let's get up and get at 'em. Aloha.
