Command Line Heroes - Humans as Robot Caretakers

Episode Date: November 16, 2021

HitchBOT was an experiment in stewardship: A small, rudimentary robot unable to move on its own, depending on the kindness of passersby to help it along its journey. Until it met an untimely end. Trust is a two-way street, and because robots are not powered by their own moral code, they rely on humans to supply both empathy and support. Dr. Frauke Zeller shares HitchBOT's origin story. Eli Schwartz recounts his heartbreak upon learning what happened in Philadelphia. Dr. Julie Carpenter analyzes why it all went down. And Georgia Guthrie epitomizes the outpouring of sympathy that followed. Together, they tell a layered story about humans, and how we respond to robots. With HitchBOT, we find a little hope in the shadow of its demise. If you want to read up on some of our research on robot-human interaction, you can check out all our bonus material over at redhat.com/commandlineheroes. Follow along with the episode transcript.

Transcript
Starting point is 00:00:00 The story begins the way a lot of hitchhiking adventures start out. There's an innocent faith in the kindness of strangers. But when our hitchhiker arrived in Philadelphia in the summer of 2015, it was the end of the road. Guys, over here. Hitchhiking can be dangerous. I think I found something. At least parts of something.
Starting point is 00:00:29 Especially when you're a robot. You know, humans spend a lot of time worrying about the evil things robots might do to them. We worry whether we can trust our robots. But we don't often wonder whether our robots can trust us. And the answer to that question matters more than you might think. I'm Saron Yitbarek, and this is Command Line Heroes, an original podcast from Red Hat. This season, we're exploring robots every way we can. But this time, we're flipping things around a bit and looking at ourselves. Can a robot trust a human? And why should we humans
Starting point is 00:01:14 care whether they can or not? To answer those questions, we begin with that hitchhiking robot you just heard about. Its name was Hitchbot, and it was designed to travel the world, relying on the kindness of strangers who gave it a ride from town to town. Hitchbot had a larger goal, though. It was designed to explore not just new landscapes, but new dimensions of human empathy.
Starting point is 00:01:42 Till today, we don't really know what happened. Frauke Zeller is an associate professor in the School of Professional Communication at Ryerson University in Toronto. She's deeply interested in human-robot interaction, and she studies this problem by putting robots in odd situations. She teamed up with David Harris-Smith from McMaster University, and the pair started thinking of ways to push robots into uncomfortable situations, areas where they'd have to rely on human kindness. We came up with the idea of a hitchhiking robot because nobody would ever expect a robot to be hitchhiking. A hitchhiking robot. A robot with no ability to walk on its own, thumbing a ride all the way across a country. It was Smith who suggested they leave it on the side of the road
Starting point is 00:02:32 and abandon little Hitchbot to its fate. What better way to see whether a robot really could trust the humans it encountered? And because I was trained more in human-robot interaction, I said, we can't do that. You know, first rule of thumb, you'd never leave your robot out of sight. It wasn't that the hardware was especially valuable. Inside Hitchbot's body, there was just a tablet, a GPS, a mic, and a camera. Still, Zeller worried for the robot's survival.
Starting point is 00:03:02 To make Hitchbot's travels as safe as possible, she knew they'd have to appeal to basic human empathy. Step one, Zeller wanted the body of the robot to be unintimidating. And they went all in on that idea. We wanted people to not fear there's something absolutely alien and super shiny, so that you might be more intimidated actually to touch it because you might break it. Yard sale chic. We want people to see something they all know, right? They have in their basements. So the body was a beer cooler, and it had pool noodles as arms and legs and rubber boots, and the face was a cake saver. People think that was such a cheap project. It wasn't.
Starting point is 00:03:47 It was the brains of Hitchbot that took a lot of effort. Custom-designed software lived inside that beer-cooler body. But yeah, the body itself was, as I would put it, yard sale chic. The robot would have to be carried into cars and back out onto the road. It'd get stuck sitting on gravel, mud, dirt. But this helplessness was by design. We decided to have a size of, let's say, a 5-6-year-old child so that people would form some kind of trust, but also hopefully that it would convince people it needs help.
Starting point is 00:04:21 Two things are happening here. Hitchbot is being embodied, for one. It has a real body that's designed to inspire care. But also, Hitchbot's circumstances encourage physical interaction. You have to help with things like seatbelts. You have to take on some responsibility. And to Zeller's surprise, once Hitchbot started traveling on its own, the general public found new ways to express empathy. They started to paint eyebrows and eyelashes on it and, you know, nail polish. And they designed jewelry for the robot. It even came back once with a little backpack. Thanks to that custom software in its brain, Hitchbot could chat with strangers.
Starting point is 00:05:05 They could ask it where it was headed, where it was from. You wouldn't believe how many people started to get really concerned on social media. Lots of people said, oh, I'm so close to just jump in my car and drive there and try to find it, make sure it's okay. It was amazing. It was really wonderful. Some robot purists missed the point, though. It was so low-key, they weren't impressed. Many people kept telling us, it's not a robot. And I said, it is, because a robot has to have at least one moving part.
Starting point is 00:05:35 And that was the arm with the thumb going up for hitchhiking, right? But I think that really helped with the trust building, because there weren't any unforeseen movements or sudden movements. Zeller wasn't trying to impress the crew at NASA or compete with the dancing robots of Boston Dynamics. She was trying to inspire care and love. To make that happen, embodiment and physical interaction did a lot of the work. She also had a secret bonus tool to work with, social media. After all, she teaches professional communication. Designing social media strategies is part of her work. And soon, Hitchbot had a massive online following.
Starting point is 00:06:15 We did find out that we had thousands, really thousands of people forming a connection, being really interested, being worried, you know, about the robot and following it every day, who never actually physically met the robot. So things were going well. They were going amazingly. Hitchbot made its way across Canada. It made its way across Germany next. And then the Netherlands. The world seemed to be rooting for this little mechanical pal with its pool noodle arms. Hello, I am Hitchbot.
Starting point is 00:06:50 We did hear from so many people in the U.S., bring it to the U.S., we can do that too, that would be lovely. This summer I am traveling across the United States of America from Boston to San Francisco. So eventually, Zeller's team did bring Hitchbot to the States. It was once again left on the side of the road. This time near Boston. By now, it had become something of a celebrity. And who knows?
Starting point is 00:07:21 Maybe it was that fame that changed everything. Maybe Hitchbot, as a well-known robot, couldn't give off the same kind of vulnerability from earlier days. All we really know is that all that trust that humans had earned went suddenly, horribly missing. It seemed like another ordinary day when just outside Boston, Zella gave Hitchbot her usual farewell. She tapped it on the head and said, goodbye, Hitchbot. Never expecting an answer, of course. It said suddenly, I think I changed my mind. Days later, Hitchbot was found, beaten on the ground.
Starting point is 00:08:06 It had made it about a five-hour drive from Boston to the streets of Philadelphia. There, somebody ripped it limb from limb and left it on the curb. There were no suspects, no leads. Just a photo of the torn-apart robot, which was sent back to its makers. For Eli Schwartz, an IT professional from Massachusetts, that photo said it all. That is just the epitome of human society in a single photograph. This pure, innocent creature crushed under the weight of Pennsylvania. Eli is a pretty big Hitchbot fan.
Starting point is 00:08:53 When he and his friends heard that Hitchbot had started its American journey in their home state, they were desperate to be of help along the way. They tracked Hitchbot's whereabouts as it began to ride west, waited, ready. So at this point, I had already gotten out and gotten gas. We had bought sandwiches at the store and put them in the fridge,
Starting point is 00:09:14 and we were in full go-bag mode. For Schwartz, the idea of taking care of Hitchbot was a beautiful thing. He planned to give it a ride all the way to DC. It was an interesting concept in and of itself, this sort of pure idealism that's like, we've built this pool noodle robot and it has no functions. There is no prize. There is no
Starting point is 00:09:37 punishment. If you don't do it, it's just get it on its way. Just move this thing that is humanoid and has its own personality and Twitter account to the other side of the country. Go. No rules. I thought that was noble. It's the most pure experiment. Ultimately, Schwartz and his friends missed their chance, and Hitchbot cruised onward to Pennsylvania, where, well, it met its end. Maybe the journey would have gone better if Schwartz had been able to pick it up. He was compelled to help. The single least threatening robot that I think has ever been designed by humans. And yet, someone didn't feel that way. Someone felt they
Starting point is 00:10:19 needed to rip it apart, even behead it. The vandal was never caught. And Schwartz was left wondering, was Hitchbot's destruction an indictment of Americans? Was it an indictment of humans in general? I remember being physically sad, genuinely, genuinely gutted. So my first thought was, why are we like this? Why can't we have nice things? Zeller had to bring all her communication skills to bear after Hitchbot was destroyed. She suddenly needed a whole crisis communication strategy. They'd done such a good job inspiring empathy for Hitchbot that, when it was destroyed, the world began mourning. We weren't prepared at all for the huge wave of emotion and people were so sad.
Starting point is 00:11:10 And again, expressed their sadness and mourning, really mourning, they talked about this, through their creativity. So we had lots of songs being sent to us, images, pictures, everything. It was wonderful in a way, heartbreaking, of course. One child even sent Zeller a couple of dollars in the mail, hoping it would help her build Hitchbot again. So how did this happen? If so many people loved Hitchbot, why the sudden violence?
Starting point is 00:11:40 It turns out that no matter how much embodiment and physical interaction you build into a robot, there's always an underlying risk, a threat. For all the care robots can elicit from us, they can also elicit a brutal and violent response. It could be science fiction. They've seen too many Terminator movies. I don't know. Julie Carpenter is a research fellow at California Polytechnic State University. She studies human behavior toward emerging technologies. She offered several possible reasons
Starting point is 00:12:14 why humans would betray a robot's trust. We might not recognize the robot as somebody's possession. Lack of really understanding that the robot is not autonomous, so it's private property. Or we might be curious about pushing boundaries. Will I feel something if the robot reacts? Will the robot scream in pain? Will the robot cry? Will the robot laugh and mock me? Or we might just be afraid of robots on an existential level. They could have more of a real concern about data collection and privacy issues.
Starting point is 00:12:49 And they're like, I didn't consent to be part of this. We see that sometimes when people attack security robots in stores. Whatever the reason for human-on-robot violence, studies find that most of us are more like Hitchbot's fans. We hate to see a robot being abused. And Carpenter's research has actually found humans very often take stewardship over robots, even start feeling protective of them like pets. She studied soldiers, for example, who specialize in explosive disposal. They work alongside squat little robots that look
Starting point is 00:13:26 like miniature tanks. And these soldiers start off thinking of the robots as a simple tool. But soon, they form an attachment. It moves. Animation, we learned, and how something moves in your space as if it has intent, because the robot in this case does have intent. It has a goal it's following through with. And something in us just goes, oh, this thing looks like it's got internal motivation and thinking. That can be enough to make us care, even without all the embodiment tricks that Zeller used to promote a little empathy. But Carpenter is quick to point out that ultimately, it's the human side of this trust relationship that really matters.
Starting point is 00:14:10 They're not going to trust us because they're not sentient. They just simply follow goals, their own goals that we give them and rules that they learn or follow. When a robot like Hitchbot gets betrayed by human beings, it's human beings who have to wonder what it says about them and what it says about the way we work with robots in the future. Hitchbot itself isn't especially worried. They don't really identify risk the same way we do. They certainly don't identify physical risk or care about
Starting point is 00:14:46 physicality the way we do. We understand life is finite, that we can be in physical danger, emotional danger. Robots don't have those things to be concerned about. Breaching the trust of a robot really means that you're breaching the trust of all those humans who care about the robot. So when we ask whether humans can be trusted with a random vulnerable robot like Hitchbot, we're really asking whether humans are ready to support each other as we enter a robotic future. And there were instructions on what you do with it. And it's been to Germany, Netherlands.
Starting point is 00:15:28 It's been across Canada. Well, it's a sad end for a much-loved robot just two weeks into its U.S. tour. Funeral arrangements are yet to be made. But Hitchbot will be memorialized on its website. Even in its gruesome death, Hitchbot had become a media sensation. Fans around the world were downcast. Whoever had done this crime had gotten away with it, slinked back into the shadows. And all the trust, kindness, and care that Hitchbot had inspired seemed to be threatened. But folks weren't about to let Hitchbot die a meaningless death.
Starting point is 00:16:10 I can't get over how much people love robots. I really had no idea before that situation happened. When Hitchbot was destroyed, a lot of people living in Philadelphia felt their city was somehow responsible. It's a place with a reputation for being a bit rough, a little inhospitable to outsiders. Georgia Guthrie found herself at the center of a group that wanted to push back on those stereotypes. She worked at a place in Philly called The Hacktory, a makerspace where people could share tools and trade information about technology. And Guthrie thought The Hacktory might be able to respond to the attack on Hitchbot.
Starting point is 00:16:51 The next day, I saw that there were probably like five articles, maybe more, in our local press. And then there were some national press talking about this hitch bot situation and so i started tweeting at the reporters who wrote the stories and saying hey we're a group in philly we're offering to fix the robot and send it on its way and then it just snowballed i could not believe where it went after that. The body parts were actually shipped back to Zella in Canada. So rebuilding Hitchbot wasn't in the cards. But Guthrie still didn't want to drop the idea. Our whole reason for existing was to be an accessible, friendly place for people to learn about technology and then to have some technical thing that a lot of people loved be destroyed in our city,
Starting point is 00:17:45 it just felt like we needed to respond. The mayor was calling her, worried about how this all looked. Others wanted to have a parade for Hitchbot. They wanted to make things right. So Guthrie decided to show the world that Philly cared about robots. They wouldn't use Hitchbot's actual name, that belonged to Zeller's team, but they brought an informal group together at the hacktory
Starting point is 00:18:10 to brainstorm ideas for a new kind of robot in Hitchbot's honor. A robot inspired by Hitchbot's travels. It would boast a programmable piece of software, one that could be shared with the whole world, and housed in any basic body. Their software could be built into a teddy bear, for example. Or, if you felt like borrowing Hitchbot's look, you could bring to life an old beer cooler with some pool noodle arms.
Starting point is 00:18:38 The idea was you would have this kit and then you would give it to someone else and then upon receiving it, they would have to do an act of this kit and then you would give it to someone else and then upon receiving it they would have to do an act of kindness and then they would share the act of kindness with some hashtags that we came up with. Our initial name for it was Philly Lovebot. Inspiring kindness in the memory of Hitchbot was a way to heal the city's relationship with robots, and with itself. So in a way, Hitchbot did accomplish its goal. It might not have made it all the way across America, but it did get people to prove they cared about robots, cared more than some thought possible.
Starting point is 00:19:17 Meanwhile, robots themselves can be very forgiving of our occasional betrayals. They can be rebuilt. Hitchbot, for example, was remade by Zeller's team and shipped to France, where it starred in a play about its adventures. Bonjour. Et bonjour. Comment ça va? Je suis en grande forme. (Hello. And hello. How are you? I'm in great shape.) The play's director, Linda Blanchet, says that the audience doesn't just see a robot when they watch the play. They see a mirror. And when they learn that Hitchbot was ripped to pieces, the audience is often brought to tears.
Starting point is 00:20:00 To find success in this robotic revolution, we're going to need all the trust-building exercises we can find. That's why experiments like Hitchbot are so much more than a quirky adventure. We're learning how to respect and care for robots. And that's going to make all the difference to the humans who rely on those robots down the road. And what about Hitchbot's vandal, the robo-murderer who slipped back into the shadows? Folks like that might always be lurking around. But what Hitchbot's story really tells you
Starting point is 00:20:36 is that there are way more people ready to build something up than take it apart. Next time, the boundaries of trust get pushed to the limit. We're learning about robots that get turned into weapons. Who's responsible when good robots do bad things? To make sure you don't miss an episode, follow or subscribe wherever you get your podcasts. I'm Saranya Barg, and this is Command Line Heroes,
Starting point is 00:21:04 an original podcast from Red Hat. Keep on coding. Hey, I'm Jeff Ligon. I'm Director of Engineering for Edge and Automotive at Red Hat. One of the most exciting things about edge computing right now is the potential to join forces with AI. There's so much data on the ground that businesses can use to improve services. But running sophisticated AI workloads at the edge is just not a do-it-yourself operation. You get buried in the details very quickly.
Starting point is 00:21:38 Specialized hardware, custom-built this and that, workloads in the cloud and at the edge. How do you pick the right devices? What's the OS? How do you update everything? At Red Hat, we don't think those details should be where you have to focus. You can hand that complexity to us. Our edge solutions provide a consistent operational experience for even the most complex workloads. From the data center to the cloud to the farthest edge. Learn more at redhat.com slash edge.
