Consider This from NPR - Does artificial intelligence deliver immortality?

Episode Date: June 11, 2024

Michael Bommer likely only has a few weeks left to live. A couple years ago, he was diagnosed with terminal colon cancer. Then, an opportunity arose to build an interactive artificial intelligence version of himself through a friend's company, Eternos.Life, so his wife, Anett, can interact with him after he dies. More and more people are turning to artificial intelligence to create digital memorials of themselves. Meanwhile, Katarzyna Nowaczyk-Basińska, a research associate at the University of Cambridge, has been studying the field of "digital death" for nearly a decade, and says using artificial intelligence after death is one big "techno-cultural experiment" because we don't yet know how people will respond to it. Artificial intelligence has opened the door for us to "live on" after we die. Just because we can, should we? For sponsor-free episodes of Consider This, sign up for Consider This+ via Apple Podcasts or at plus.npr.org. Email us at considerthis@npr.org. Learn more about sponsor message choices: podcastchoices.com/adchoices. NPR Privacy Policy

Transcript
Starting point is 00:00:00 Jason and Melissa Gowan didn't spend much time thinking about death until the couple faced serious health scares back to back. And then all of a sudden we were pretty mortal and we were very concerned about what was going to happen to our kids. That fear led them to an AI company called You Only Virtual. It creates AI chatbots modeled after deceased loved ones. Jason and Melissa became two of their first users so that their young sons would have a tool for memorializing them when they died. They uploaded audio and video of themselves to the company's cloud and watched their digital avatars come to life. Jason, who's a comedian, says his is pretty spot on. He like makes little jokes and he like references funny stories and things that happen to us.
Starting point is 00:00:52 It's such a surreal thing and it's also, it like lifts my heart to see my son react to it. Both parents' health conditions have stabilized, but they're comforted knowing that their AI likenesses, known as Versonas, are ready to go. We built these Versonas as a fail-safe just in case, so that on big days like wedding days or college graduation or high school graduation or on a day you just need a pick-me-up from a parent who passed. We have that there just in case. You Only Virtual is one of a number of companies at the intersection of AI and memorialization. Users like Jason and Melissa Gowan are finding peace of mind in this space, but many tech ethicists are concerned. For some people, using the simulation of a deceased loved one might be comforting and helpful during this very difficult time, while for others it might be emotionally draining or even devastating.
Starting point is 00:01:55 Katarzyna Nowaczyk-Basińska is a research associate at the University of Cambridge. She's been studying the field of digital death for nearly a decade and says the whole thing is one big techno-cultural experiment because we don't know yet how people will respond to it. In a recent paper, she and her colleagues outlined some of the potential red flags this technology could raise, like how it will impact the grieving process, who owns the data, how it's used. And then there's this whole question about consent. It will be crucial to seek explicit consent whenever possible from the so-called data donor. So the person whose data is used to create the grief bot, ideally before the death of that
Starting point is 00:02:39 person. Nowaczyk-Basińska says as immortality AI inevitably grows, her priority is to keep users safe, to protect their digital rights. We have two options here. We can either work towards a future where we make the most of this tool by promoting values such as diversity, empathy, care, trust and respect. Or we can remain as we are, where the most important value is profit. Consider this. Artificial intelligence has opened the door for us to live on after we die. Just because we can, should we?
Starting point is 00:03:21 From NPR, I'm Mary Louise Kelly. It's Consider This from NPR. Michael Bommer likely only has a few weeks left to live. A couple of years ago, he was diagnosed with colon cancer. The doctors told him it was terminal. And then an opportunity arose to build an interactive AI version of himself through a friend's company called Eternos. Michael and Anett, who are based in Berlin, took a little time away from their day to talk with us. And I asked Michael to tell me about the moment he decided to create an AI version of himself. Like a year ago, I sat with my wife in one of these more teary-eyed exercises,
Starting point is 00:04:11 talking about what comes. And my wife said, hey, one of the things I will miss most is being able to come to you, ask you a question, and you will sit there and calmly explain the world to me. And then I posted on Facebook to all my friends, hey, guys, it's time to say goodbye. And Rob called me after that and said, hey, we all thought you might make it through, but hey, here's a gift. Why don't we do this together?
Starting point is 00:04:43 And I, of course, immediately said yes, because I already had the thought myself to do something with voice synthesizing. But now adding AI to that was a great thing for me. Anett, I do have a question for you. When Michael first told you about this idea, what did you think? Well, I thought, well, yeah, let's do it. Really? He has a lot of projects in our life.
Starting point is 00:05:15 And at this moment, it was a little bit silent in our daily routine. And I thought, wow, that makes this part in this life a little bit better, filling out with to-dos. So Michael, tell me, how did it work to build it, to program it? I understand it's AI, so like you said, it has access to all kinds of knowledge and information that you don't have, and it will keep learning. But the things, it sounds like Anett wants to ask you, are things only you would know.
Starting point is 00:05:58 How do you program it? So there's two steps to that. First, you need to give it my voice. And this happens with 300 sentences you record. And out of these 300 sentences, these are specific sentences, you create the voice with all the nuances of a voice. And the second part is that you fill it with content. Now, in my case, because we were so short on time, I simply told 150 stories about my life,
Starting point is 00:06:32 early life, midlife, late life. What I would recommend back to me as a young person, what would I recommend to my children, my grandchildren. So to give it all the content around life and living, all the content about my history. And that's the content from which the AI is created. Normally, this will take weeks and months, right? In my case, we needed to put it into more or less mere days. And out of that, you create the AI. Now, when the AI wants to answer a question, the question goes into, you can imagine it like a cloud. And in the cloud is all the knowledge which I left for the AI. And it picks the parts of the things I talked about which fit the answer and puts them together into a string, into an answer.
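What Michael is describing, in lay terms, is a retrieval setup: his recorded stories form a private knowledge base, and each question pulls back the passages that best match it before an answer is assembled in his voice. The short Python sketch below illustrates that general idea only; the stories, function names, and simple word-overlap ranking are invented for illustration and are not how Eternos actually built Michael's AI, which would also involve voice synthesis and a language model.

from collections import Counter

# Hypothetical stand-ins for a few of the ~150 recorded life stories.
stories = [
    "When the car makes a strange noise, stay calm; our auto mechanic can usually sort it out.",
    "I proposed on a rainy evening in Berlin, and we laughed the whole way home.",
    "My advice to my grandchildren: de-escalate first, reflect, and only then act.",
]

def score(question: str, story: str) -> int:
    """Count the words a question and a stored story have in common."""
    q = Counter(question.lower().split())
    s = Counter(story.lower().split())
    return sum((q & s).values())

def answer(question: str, top_k: int = 2) -> str:
    """Return the most relevant stored passages, stitched together.
    A real system would hand these passages to a language model and
    render the result in the data donor's synthesized voice."""
    ranked = sorted(stories, key=lambda story: score(question, story), reverse=True)
    return " ".join(ranked[:top_k])

print(answer("What should I do when the car is making noises?"))

In practice, products in this space typically rank passages with learned embeddings rather than raw word overlap, but the retrieve-then-compose flow is the same idea Michael sketches here.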
Starting point is 00:07:28 Now, sometimes there is also a knowledge base where the AI can take knowledge from the internet and ask a question to the internet. Say, last time, the car was making noises, right? And so it went out on the internet and asked, what causes these noises in cars? And it took this, you know, mainstream answer back. Are you sharing with it then things that you want to make sure Anett knows? Like when the car starts making a weird noise and you're not around, she can ask and have an answer of, well, last time it was
Starting point is 00:08:06 this. Why don't you check? Yeah. So I didn't do that in that depth. What I did more is try to convey my principles, my principle in life, so to speak. Always de-escalating. As I say, straight at your home safe. Say, hey, I'm sure our auto mechanic can help you.
Starting point is 00:08:28 So reinforcing, you're good, right? And then, hey, by the way, these noises could come from this and this and that. So the principles which I gave, de-escalating, stay calm, reflect, whatever, which is my nature, right? That's in this AI. I'm listening to you, and I'm thinking of something I read, which was a reference to this kind of AI as immortality tech. Um, is immortality a part of it? I mean, because there is a piece of you, an essence of you, that will carry on and carry on interacting with people you love. No, I see my AI as an intelligent, digital memoir.
Starting point is 00:09:16 And so if you write your memoir, that's not eternal life. So I see it more as a tool. I want to give my knowledge and experience, and then I'm gone. I'm gone and I'm gone, and I want the next generations to inherit my experience and my knowledge as much as possible. Anett, have you talked to it yet? Only for testing. For testing, yeah.
Starting point is 00:09:43 And in this moment, I love the time with him. You're talking to the real Michael while you can. Yeah. Anett, do you think it will really feel like him, like your husband? No. For me, it is a machine. It is a machine. Exactly. Not warm, not touching, not... it isn't human. Is there any part of this that frightens you,
Starting point is 00:10:10 worries you? Um, no, I'm not afraid about this one. It is, uh, for me, a tool, and should I get afraid, I can close this tool and not use it. So therefore, in this moment, I'm happy to have it and to try, and when I fail, then I'm fair, but I'm not afraid about it. I'm leaving it behind, right? If it's used or not, if they hang it as a picture, like a picture of me at the wall, or they put it in a drawer, I don't care. I cannot influence that. But I can leave it, right? I can leave it behind. I like that way of thinking about it. It's like a picture of you or a painting, which feels very normal to leave behind for people who will be grieving.
Starting point is 00:11:06 Yeah. What type of questions can you imagine asking, Anett? I assume perhaps to read me a poem, or I could ask him when we met or when we got married. Or I can ask him, okay, tell me about what he proposed. So a little bit of remembering together all the nice things we had. If people have a problem with grief, and this is independent of an AI or not AI, it is a problem with grief.
Starting point is 00:11:55 And unfortunately, our society is treating grief very badly, because we distance ourselves from people out of a good meaning, but it's not good, because if you're grieving, you need to grieve openly, and you need to be open to embrace people who are grieving. But people are often isolated in their grief, and then they turn to whatever kind of remembrance they can to try to relive what they had. And that's bad. Yeah. In terms of technology used for that.
Starting point is 00:12:32 Well, I want to thank you both for being so open and talking this through with us. I'm sorry for everything you are dealing with. And I appreciate your sharing it with us. Thank you. Thank you very much for having us. Thank you. That is Michael Bommer and his wife, Anett, speaking with us from Berlin. Thank you. Bye-bye. Bye-bye. This episode was produced by Kathryn Fink. It was edited by Courtney Dorning. Our executive producer is Sami Yenigun. And one more thing before we go.
Starting point is 00:13:09 You can now enjoy the Consider This newsletter. We'll still help you break down a major story of the day. And you'll also get to know our producers and hosts and have some moments of joy from the All Things Considered team. You can sign up at npr.org slash consider this newsletter. It's Consider This from NPR. I'm Mary Louise Kelly.
