Everything Is Content - Everything In Conversation: Can AI Cure Grief?
Episode Date: March 11, 2026

Happy Wednesday EICorporeal beings. Last month, a BBC documentary posited the question, 'can AI "cure" grief?', which got us wondering: should we want to cure grief?

In AI Confidential with Hannah Fry, Fry speaks to Justin Harrison, founder of You, Only Virtual. He 'created the concept of the Versona after facing the dual challenges of his own near-fatal motorcycle accident and his mother's terminal cancer diagnosis. During this difficult period, he sought a way to preserve the unique bond he shared with his mother, leading to the development of YOV's AI technology. The Versona allows individuals to digitally recreate and continue conversations with their loved ones, even after they've passed, providing lasting emotional connection.'

Justin Harrison is not the only person working on this emerging technology. There are others, such as Project December, an online AI platform created by video-game designer Jason Rohrer that allows users to simulate text-based conversations with specific personalities, including deceased individuals, and HereAfter AI, which promises to 'Preserve memories with an app that interviews you about your life. Then, let loved ones hear meaningful stories by chatting with the virtual you.'

Is this something that we need? Is it something we want?

Thank you so much for being so generous and sharing your experiences with us on such a sensitive topic; we're always so grateful.

Love you! and hope you enjoy the episode O, R, B xxx

AI Confidential with Hannah Fry
The Dead Have Never Been This Talkative
Alexis Ohanian thread
The Year of Magical Thinking - Joan Didion
The Obliterated Place

Hosted on Acast. See acast.com/privacy for more information.
Transcript
I'm Beth and I'm Oenone.
And this is Everything in Conversation.
A little snack of content to get you ready for Friday's Main.
We would love for you to take part in these conversations.
We love when you agree and disagree with us.
And you can give us your opinions by following us on Instagram at Everything is Content Pod.
That is where we decide on topics and open the floor for all of your thoughts.
So in a new documentary for the BBC titled AI Confidential with Hannah Fry,
Fry speaks to Justin Harrison, founder of You Only Virtual.
He created the concept of the Versona after facing the dual challenges of his own near-fatal motorcycle accident and his mother's terminal cancer diagnosis.
During this difficult period, he sought a way to preserve the unique bond he shared with his mother,
leading to the development of YOV's AI technology.
The Versona allows individuals to digitally recreate and continue conversations with their loved ones,
even after they passed, providing lasting emotional connection.
The BBC documentary posits the question,
can AI cure grief, which got us wondering if we should want to cure grief?
Justin Harrison is not the only person working on this emerging technology.
There are others such as Project December,
which is an online AI platform created by video-game designer Jason Rohrer
that allows users to simulate text-based conversations with specific personalities,
including deceased individuals, and HereAfter AI,
which promises to preserve memories with an app that interviews you about your life,
then let your loved ones hear meaningful stories by chatting with the virtual you.
There are ethical concerns and risks around digitally reanimating deceased people's voices without their consent,
as well as the risk of falsifying memories, creating emotional dependency,
and even psychological harm.
Beth, you actually sent this into the group,
and I wondered if you had any initial thoughts,
on this idea of rendering the voice of someone you love who is deceased in order to help you cope with
their passing. So I do first want to say, thanks. Shout out my best friend Jess because she
suggested this topic and sent it to me and I sent on to you because I do think in my own like
AI exhaustion, I do avoid some things like this, but actually I really like Hannah Fry and I
really liked this episode, which is about, I mean, it's about a few things, it's called, like, The Boy Who Tried
to Kill the Queen, which we also talked about in our AI episode last week. But this particular
facet of AI, I don't think we even touched on this when we did. We touched on basically everything.
I don't think we really went here for AI and its ability to, yeah, basically render dead people
in voice and probably eventually in physical face. Physical face, yeah, I'm going to stick with that.
I was initially, and I talked to Jess about this, as a person who
has a lot of anxiety located around death and grief,
was very intrigued, but also having watched Black Mirror,
and we'll get into this because a lot of people flag this for us.
Initially, and also AI skeptical, I was initially very unnerved by this.
I think there's something so sacred about death and grieving.
And so to blend it with AI,
which is the thing that I'm very skeptical and very afraid of,
it became immediately like a horror story at first.
my opinions maybe have slightly changed since reading our DMs,
but my initial reaction was, oh no, no, no, no, this is unholy.
This is meddling with something which we should not touch.
Yeah, I think I had the same initial reaction.
My gut reaction was more I know how much I would want it.
Just my immediate thought was, touch wood, cross my heart, whatever else.
If, and I guess when, I lose my mom,
I would absolutely do anything to have a phone call with her,
because a lot of my relationship with my mom, now that I'm older,
is on the phone. I ring her every time I walk Astrid. So my fear around it was more: I know how
tempting this would be. At the same time, it's kind of innately feeling like grief is not something
that we should meddle with. And Ellie said, Black Mirror episode San Junipero is far too close to this,
and said, it's literally Black Mirror: Be Right Back, season two, episode one. And Sarat said,
dying is a natural part of life, I wish AI would just leave one thing alone. And I guess it's tricky
because we don't know there's not enough examples of this being used.
I guess we could be proved wrong.
It could be that in the future, this is standard.
You sign a consent form, people gather information, or like one of the apps I mentioned
at the top, it's actually you feed in information for this AI to kind of train on who you
are and learn about you.
And maybe we live in a post-grief world where no one actually does process death in the same
way.
we are so used to having digital relationships with our families, especially if they don't live near us.
We're quite used to a lot of the communication, like I said, being on the phone via text, whatever.
Maybe we just cease to grieve altogether.
I think that's a really interesting point.
One I hadn't really thought about is that it's not a far cry from actually how we already talk to people.
People will be used to talking to their mum, their dad, just on the phone for the most part because we all live such dispersed lives.
And it's a really worrying thing then, actually. I mean, grief does send people into all kinds of states.
And people say like it was, it's a kind of madness where people will say like the lights came on and off.
And this is not me saying like that wasn't a communication, but like we are looking for contact.
And in the sense of like, well, I could just have it.
You could almost fool yourself that I can talking to them.
They're not gone.
But the process of grief is the reconciling of they should be here.
I've gotten used to them being here.
I'm so attached to them.
My reality is built around the fact they're here and now I must accept that they're not.
And that's a huge thing.
That's why grief is so hard.
It is your whole attachment system and your neural pathways saying like this is not
reality.
We need to get back to the real world.
And there was a really good piece in Time magazine last summer called The Dead Have Never
Been This Talkative by Tharin Pillay.
And it references a post on X, which went viral last year, from Reddit co-founder Alexis
Ohanian, who is married to Serena Williams.
And he had basically used, I think, Midjourney, to animate a picture of his mom, to show
a video of them actually hugging, which didn't exist in real life. I think she had died when he
was quite young. And the piece explores first false memory, as you sort of referenced earlier,
this idea that even without AI, each of us are capable of distorting our own memories really
to such a severe degree and how AI allows that to happen much more rapidly and go into
overdrive. And in studies, they've shown that even viewing a single AI manipulated visual can affect
the original memory immediately. So while I do think there's the trauma, there's
the trauma and therapeutic argument for it, for anything like grief, which is such a natural
but also such a difficult process, it feels like implanting misinformation, like putting a
corrupted file over the top of what actually happened and losing the original. Also in the piece,
they talked to a neuroscientist about the neurobiology of grief and basically how it is
reconciling in the brain that someone's not here, but they should be here, but they're
not here, they're nowhere. And to a grieving person, that's like a hole in reality. And then to
suddenly introduce into this like delicate process, this technology that allows that person to
sort of be there and to have that connection, I feel that we have to be very careful. Because
the idea of disrupting someone's healing from what is for most people the hardest thing they will
ever go through. It feels like we are, it feels like we're breaking our brains in a lot of ways
to incorporate AI, this feels, like I say, like something unholy.
We had a message from Megan, which reads,
my dad died two years ago, and I think such a big part of grieving has been adjusting
to not being able to speak to him.
I have some of his voicemails saved, and I've only been able to bring myself to listen
to them in the last month.
I found them so comforting and will absolutely treasure them forever,
but I can't imagine relying on an AI version of my dad to get me through this.
I worry it would have complicated what is already an incredibly individual complex process.
I don't think you ever complete grieving.
And there's a lot of misconceptions about the stages of grief.
They were actually developed for the person who was dying rather than the bereaved.
I didn't know that.
I thought it was really interesting.
Grief is far from linear, but I worry that this sort of technology removes us further from reality
and can keep us stuck in the all-consuming grief stage for longer,
rather than being able to adjust and move forward with our lives.
Such a complex topic.
and I worry it's exploiting vulnerable people.
I thought that was such an interesting message
because I was reading about grief and those stages
and the last stage obviously is acceptance
where finally you're able to accept this person
is no longer with us,
but you're able to carry on your life.
And lots of people cite that as kind of one of the most human experiences
you can go through.
It is, as everyone always quotes:
there are only two things certain in life, death and taxes.
And I do think Megan is right.
like I said, I do think that rather than aiding us to grieve, it actually just keeps you stagnant
in that grieving moment. And to go back to what you said about sort of like falsifying memories,
there was something else I was reading, which is of course, AI, and we've seen this with so
many examples now, could kind of go off-piste. You could have your deceased relative saying
things they never would have said in real life, advising you in ways they never would have advised you.
It could actually become quite a dangerous thing if somehow your kind of AI
bot evolves past the point of whoever the loved one is that you're speaking to and says things
they would never say in real life. And yet I think it is partly just like the unknowableness of this
technology that I am finding it hard to see what the positives would be. There was a Reddit thread
about the Alexis Ohanian video that I mentioned, where quite a few people in it were saying that they
had tried to do something similar to Alexis, tried to make a chatbot in particular speak and answer
like a late loved one using their text, etc.
And they had succeeded to some degree.
And then they said, oh, this is actually awful.
And they regretted it.
And actually it delayed their grief and still haunts them now that they tried to do that.
It was such a twisting of like, but my last conversation with them was this.
But then there was this follow up.
And I do think, I'm not saying that there's no utility for this.
But the idea that, and actually the guy that she speaks to in the piece seems really level-headed.
And I would say it was very sad, because he does still talk to his mum on the phone, and he says something like, grief is just this ultimate terrible thing, it's this hopelessness, and he's not interested in having that. So he's living in a world where he can still talk to his mum. I found that almost unbearably sad. But at the least, I thought, okay, he's driven by grief and compassion and believes wholeheartedly he's doing something for other people. Whereas it does feel like this will make its way into the hands of people like Sam Altman, the worst,
the worst ghouls of Silicon Valley. I do not trust those people. I do not trust the people
that think human beings are lesser, who think we will all be replaced by AI, who sort of worship
at the altar of AI to have any stake in how people grieve. That to me feels like the worst
possible outcome, just like these are ghouls. I don't want them to touch this. I don't want
to be a profit margin in having people rely on a technology where they can talk to their dead
loved ones. Something about that just gives me the ultimate willies. We had a message from Izzy
and she said, I'm training to be a clinical psychologist and have worked in teams for people experiencing
psychosis and a team for supporting families and children with cancer. And we had so many
important conversations about death and dying. The increasing use of AI across the board feels
like it's homogenising all human experiences with the goal of streamlining and optimizing everything,
stripping us of what it really means to be human. We haven't evolved to be happy all the time.
Sometimes we are sad. Sometimes we grieve. It's part of the organic human experience.
And AI just seems to want to sanitise every element of texture in our lives. And that,
relating to what you just said, this idea of sort of everything being taken over by AI,
the way that it can infiltrate, it's kind of like, to what end?
I thought that was such a poignant point to make, that the human experience,
well, if you go by a Buddhist point of view, it is suffering.
Like we, we are not here to be comforted and coddled and feel like everything is perfect all
the time because actually what that produces is probably a vessel of a person.
You need to have the highs in order to experience the lows, sorry to say the most trite thing
in the world.
But I do think that there is something, and we did have messages about this where grief is
obviously so harrowing.
And I think like you, Beth, it is one of my
biggest fears, losing people close to me. I kind of preemptively almost try to feel sadness,
almost like to experience it before it's even happened because I am so terrified of it.
But we can get through it and we do get through it. And the process of loss is something which
every human before us has experienced, and everyone to different extents. And also this idea,
and Izzy also said this in her message, like you said, about the costs of it. She said
that I don't know the specifics, but I imagine it isn't a cheap service to access, which feels exploitative of people
who are vulnerable and grieving.
But it also, again, creates this kind of hierarchy
of who experiences suffering.
If you have enough money, you can actually kind of numb yourself
to some of the most base human experiences, e.g. grieving.
It's very odd in the ways that we are kind of separating
because, again, all of this is paid for.
Ultimately, this is a business.
And it's extracting from you,
your human experiences in exchange for your cold, hard cash.
Whereas other people who can't afford it
are having to go through that pain and suffering.
Even that in and of itself is quite a weird dystopian thing to think about.
Yeah, that is true because there will be,
these things don't exist out of the goodness of someone's heart,
even though, as I say, I think this man is genuinely coming from a place of good.
The minute there's profitability, the paywalls shoot up.
And as Dapo says in our DMs, I think it's a way to exploit the grieving,
yet another example of a wider societal failing,
in the sense that Western societies do not handle grief well at all,
which made me think of what Hannah Fry says at the end of this documentary
just after this conversation she's had
where she has reckoned with her dad dying earlier that year of filming
and saying, I really would have probably done this.
And she says there's something so thin about the intimacy it offers.
And once we start replacing real relationships with artificial ones,
I worry it's very difficult to go back.
And that stuck with me because I guess what AI has allowed us to do
more and more is be singular, to be a singular,
to be alone with the technology in a room
and it's replacing the real people
that would otherwise need to be in those rooms.
So in other more trite senses,
like actors on set,
or not trite, obviously, it's the arts,
but like actors on set or people on the other end of the phone
when you've got a problem with your dishwasher,
it's also becoming life partner and friend.
And in the case of grief,
and I'm so interested in the different cultures
and the way that grief happens,
it will allow us more and more to do grief alone in a room with someone that's not a person.
And instead of talking to another living person about the person that's died,
you're talking to the person that's died who isn't really there.
And I think that is very linked to what Dapo says about how badly in Western societies,
we process grief.
We are still quite buttoned up.
We have our rituals.
We do gather, but we don't, you know, do the full fall-to-the-floor wailing,
which they do in other cultures. Here it is a very stiff-upper-lip period of mourning,
life goes on. People say this all the time, like the first year of grief is hard, but then the
second year is harder, the third year is hardest of all, because people forget we don't talk about
the dead. It feels twisted that the solution we've come up with to keep a person's memory
alive is to weirdly keep the person alive. But again, it's just alone in a conversation with nobody,
rather than sitting around a table, sitting around a fire and talking about them. Like these cultural
practices of grief do not spring up out of nothing. They are need-based and they are tried and tested.
And I think my wish would be for us to see the flaw in this way that we grieve
and to follow it to a more communal end point rather than going, brilliant,
we won't ever need to grieve.
Again, as you say, I can just go and talk to the person that died about it.
And I think when she says there's something thin about it, that is what it is.
It doesn't represent a healthy outcome in any way, even if there are some therapeutic uses for this.
Yeah, when you were saying about how, you know, people's memories were changed,
because of, you know, generating stuff that then muddied what was real and what wasn't.
We had a message from Harriet and she said,
as a recent griever from the sudden and tragic loss of my big brother,
I put serious thought into whether I want to explore this idea more,
as in the first few weeks, all I wanted was to hear his voice and feel his presence.
Since then, I've had repetitive and traumatic dreams about him
and our last conversation with one another.
I feel like this idea of an AI ghost would exacerbate the subconscious tricks my mind is playing on me
and would ultimately lead to negatively impacting my sleep
and significantly drawing out the grieving process
as in a way you're refusing to let go.
My mind might change in the future,
but in this first year of grieving,
it is a delicate path to tread.
And being guided by a professional, i.e. a therapist, has been a guiding light.
If I took matters into my own hands and without guidance,
something like this could have caused me to take 10 steps backwards.
And I think it's so interesting,
it's so complicated, because of course the impulse is like,
of course you're going to want this. And I understand grief can have catastrophic impacts on people's
lives. But I would say that, for the most part, if it's grief when it's expected, like losing an elderly
grandparent or an elderly parent, it's the pain. I know my mum still gets really sad when she thinks
about her mom and her mom was older and that was years ago now. I understand that it like never
goes away. But for the majority of people, you learn to live with grief. And actually we had a
really beautiful message from Anand about losing his dad and then separately, because I replied,
he then replied to that and said that his dad had always explained grief like you have a ball
in a jar and the ball never goes away but the jar just gets bigger and you learn to live with the
grief and the weight of the grief kind of gets smaller. And I think that's kind of, it's a process
that people go through. But what this is making me think is we're just creating a solution to, yes,
grief is a problem. But I really don't think it's a problem that needs, you know,
to be solved. I think that's my main issue with this: I think that you can't cure grief
as in you can't make it go away because you can never truly bring the person back. So what you're
doing is just interrupting grief. You're actually just stopping it. You're not curing it.
You're actually stopping people from experiencing that process. And we will probably experience
multiple griefs over our lifetime. If you live a long life, it's like you're going to lose
your parents, your friends. There's all sorts of times when you might experience it.
And so I just don't think this is a good thing to have, even though God would I feel tempted
presented with it if I lost someone that I really loved.
It reminds me. And, like I said, I'm really fascinated by grief.
And I have to say, like, I'm just kind of blown away by the vulnerability and the willingness
to talk in our DMs from people that are recently bereaved and reckoning with it.
I do think it just speaks to how brave and brilliant you are.
And just that power of love is like it does sustain even if it does wreck you.
And I mean, there's so many books about this that I've read.
But I remember reading The Year of Magical Thinking by Joan Didion, which is about, I think, her husband dying and then her daughter dying within a relatively short space of time, considering how long her life was.
And it's just this idea when people are grieving, they do things which would be considered odd, but actually are very in line with how grief functions.
Like you say: all right, well, I can't give away their shoes.
I must keep their shoes there because they will need them when they come back.
To enter into something else, to put other kinds of magical thinking in there,
which in any way suggests that there is return or that death is not in that direction,
I think it's just really, again, it poses a solution to something that isn't a problem,
even if it is the most deep pain and the most unraveling thing.
and I think that there are problems with how we grieve, as I say, but that's not one of them. The existence
of grief isn't a problem. The existence of grief is, it's always, it's just: you pay as deeply as you've loved.
And I do, I read a lot about it, part of my anxiety, but like, I just, I think it exposes something: the way
that we grieve, the way that we love, like, it's human. And anything that's not human coming into
that space automatically makes me wary and makes me feel like someone is going to
get badly exploited here. We got a message from Han who said, my nana died recently and when my
aunt and I went to the funeral place to write the notes that would be attached to the flower
arrangements at the funeral, she used ChatGPT to write the note that would be from her and her
siblings. I was so annoyed and upset. My family know that I'm hugely anti-AI and tease me about it
constantly. When she did this, I rolled my eyes and said, I can't believe you're using
ChatGPT for this. But she shrugged me off and said she wasn't sure what to write. And I know grief is hard,
and it's hard to convey what you want to say. My own note was barely 10 words
long, but at least I'd written it all myself. I'd rather have sat there for an hour to come up
with something. I just think that things are so impersonal now. And this, I think we're seeing more and
more of this. And I really am so, this is what I really detest about ChatGPT. It is, it's the
frictionlessness that is robbing us of even the most cathartic and important moments, to say,
imperfectly, this is what you meant to me. ChatGPT does not know. It's basically like opening up a
greetings card or a sympathy card and saying, that's exactly how I feel about you, this person.
That was my, like, north star. I understand the impulse, but the idea of streamlining
or making grief less full of friction, or even taking away the kind of lovely parts of grief,
which is saying, this is what you meant to me, this is when we are gathering to talk about it.
I almost can't believe that there are, and obviously everyone's relationships are different.
Sometimes maybe it's appropriate to not do the thinking yourselves, but for the most part,
if you love someone, I think you do owe it to them to sit with it and really try
to tell them how you feel, not outsource it to, like, basically a greetings card machine.
Yeah, it is that thing of it being frictionless. And I guess we live in a lot of discomfort now,
which is exacerbated by social media and our closeness to so much information that like years
ago we wouldn't have. So I do understand the impulse or the tendency or the want to reduce
friction in areas when it does feel like life is a struggle or it does feel like we're so close
to seeing so much more of the world than we used to. So it's difficult because it's almost like
the, I remember writing this in one of my Substacks, but like, the cure is the poison. The thing that's
making us all sick is the thing that we're reaching to to try and like use as an antidote for it.
And that just isn't going to work, I don't think. The problem is I think that so many people,
it's so interesting when we hear from our listeners, because actually, more often than not,
you guys do have a similar outlook to us. And I'm always interested in that, because I think
there will be a lot of people who won't think about the ways in which this maybe
feels inhumane, the ways in which this could be ethically or morally quite ambiguous from
the position of like the deceased, the deceased person who's being kind of like digitally
reanimated. And people that are in the throes of grief and who maybe haven't had a minute
to interrogate this, who are presented with this option, who think, you know what I could afford
that, are going to use this. And we've seen the uptick, even since we
did that AI episode, in the way that AI has evolved. And that was only, I think, this time last year,
actually, that we did our AI double part deep dive. It is so much more entrenched within, you know,
day to day life, within work. We've already seen, I think the CEO of one of the, the app thing,
I can't remember who it was, but I remember reading it and he was basically saying like every single
person in this kind of role will be obsolete within, you know, the next couple of years because AI is going
to take it over. The advancement of AI imagery is, like I said, I get tricked by it all the time
now. I will not be surprised if we come back a year later and actually it's extremely normal for
people to go to a funeral. And after that funeral, the done thing is you then download this app
and you basically just live in a continuous cycle of a continued conversation with someone
who's no longer here. And I find that really scary, but I now feel pretty certain. Actually,
unfortunately, whenever these technologies emerge,
they are just adopted almost immediately.
I think there is, I mean, there's a use case here
which argues for closure and like final conversations,
which I guess is sort of what Hannah discusses.
And even people in our DMs saying, like, I would revisit this
At some point in the future, right now, the grief is too fresh,
but I'm going to let myself feel how I feel,
where it is an elderly relative or a parent,
perhaps there was some warning, but not enough.
And you do have that.
You're able to close the circle and have one final conversation
where you hear them say things that you know that they said before or meant, things like,
I love you, if you don't have a recording of them saying that, for example.
But there's also a use case for this where someone dies perhaps prematurely.
So, you know, when someone dies and they're very young, for example, where you are able to
see what happens next almost and go, but if they would have lived and we would have had these
conversations and one of the most powerful pieces of writing, basically I've ever read my life
on anything, but on grief especially, and I'm sure a lot of people will be acquainted with this,
is an advice column by Cheryl Strayed called Dear Sugar. And one of, I mean, so many of them are
cult favourites, but one of them is called The Obliterated Place, which is a, it's a letter
that she wrote to a man; a 58-year-old man had written in to her after his 22-year-old son
was killed by a drunk driver. And it's just such a
shattering exchange. He writes his in a list and she replies in a list. We will link it; I will say, just tread
carefully, it wrecks me every single time. But at the nexus of his grief, I mean, he says, like,
he struggles to keep living, and he feels like he is this, like, living dead dad, and all he thinks about is
what his life, what his son's life would have been, basically, if he'd lived beyond 22 years, which I
imagine is what everyone thinks in those cases. And in her reply, Cheryl, in her list,
writes, number 20, when my son was six, he said,
we don't know how many years we have for our lives, people die at all ages.
He said it without anguish or remorse, without fear or desire.
It's been healing to me to accept in a very simple way that my mother's life was 45 years long,
that there was nothing beyond that.
There was only my expectation that there would be.
My mother at 89, my mother at 63, my mother at 46.
Those things don't exist, they never did.
Number 21: Think: my son's life was 22 years long. Breathe in.
Number 22: Think: my son's life was 22 years long. Breathe out. Number 23: There is no 23. And I think it's that
bit of writing about grief and accepting the reality that every life is just inherently precious, whether
it's minutes long or 100 years long. But to live in reality is to live with what we did get.
And that is how we honour people rather than saying, but it could have been. The fact is it never was.
And I just think there is such scope to get lost in being like, well, I'll just see what they would have been like at 89. I'll just ask the AI to tell me, because I'm sure it's not even beyond the realms of possibility now to say: here's all the information about this person and what they were like; what would they be like in 20 years' time? And the reality is you just don't know, because it's not real. And that's where I feel like I don't want to lean further into the what ifs, because grief is a challenge of acceptance.
And just as I was thinking about this, that piece of writing played on my mind: sometimes that is how long a life is, and it almost dishonours the years that you were alive to say, but what else, what else, just because we so badly wish there had been more. And I think that is the most vulnerable potential use for this.
Totally. And when you were saying earlier, you know, there's a case study where it's just used for this final resolution, I think that would be a good idea: if there was a thing where you were allowed to send one message and you got one response, if there was something inbuilt within the app that actually was thinking about its user. But as far as I've read, all of these things are totally open-ended. You could be calling your mum for the next 30 years.
And that's where it feels scary.
That's where dependency can come in.
That's where all sorts of issues could arise.
If there was, like, a service where you send a text, you get a call back like a voicemail, and, you know, that's it, you can't ever do it again. But none of these things seem to have that built in. I'm sure they have psychologists working on them, but how much do I trust those people when they're in the pocket of these kinds of creators?
And it's interesting because I lost a friend when we were in our 20s.
And when you're saying about that final message,
I actually did.
I WhatsApped her a really long message.
And then I remember I kept WhatsApping her intermittently.
And I also knew that she wasn't getting them.
But it was that thing of, I didn't get to say goodbye, so I just sent her a message, even knowing there would be no response.
And I think there are ways that, you know,
you can feel like you're saying something without being able to say it.
And I just think you're right about that thing of allowing someone's life to be everything that it was, and not dishonouring them by being like, I didn't get enough of it.
And we had a message from Sadia, who said: this is very timely for me, as I sadly lost my darling mum on the 2nd of February.
She was elderly, but it was still unexpected.
We miss her terribly and not being able to verify her Ramadan recipes is just one example of how much she's missed each and every day.
AI, I work in tech for context, is no substitute for the nuance and texture of familial love.
I would give anything, anything to have her here, but she's not.
And missing her and navigating grief is, in my book, the price of our love for her.
I want to feel that I loved her with every fibre of my being.
Maybe it's early days for me and I'll change my mind.
But right now, the very thought of it is abhorrent to me.
And like you said, Beth, you are all so generous in sending messages.
We were actually quite nervous about posting this.
It's obviously so sensitive to talk about,
but so many of you were so ready to share how you felt.
And I think Sadia's message is something that I've thought about,
which people say a lot now, but I do think it's true,
which is grief is the price you pay for love.
And I know that that's kind of a bit like a tea-towel kind of slogan,
but it is this feeling of every single moment of grief,
everything that you experience when you lose someone.
It shows you exactly how much they meant to you. And experiencing that level of love and that level of outpouring of grief,
it is love. So as much that's devastating and it's sad and it's cruel and it's unfathomable,
I find death really confusing to think about, to think about someone not being here.
And I remember when I first lost a grandparent, I physically could not get my head around
the fact that they were never coming back. But at the same time, it is also, I guess,
an honour to experience that feeling, because it just shows that you had so much love for someone. And I do think that this, I guess, somewhat undermines that, if you're just saying, I no longer have to go through that pain, I can just bypass it by pretending it hasn't happened. Yeah, and reading Sadia's message, I was just so thankful, as we say, that people were sending these messages. I mean, I hope that her memory is a blessing and that you are doing as okay as you can be,
and it is gratitude for people talking from, you know, the coalface of it. And grief, there's no rulebook to it; it is very instinctive. It's doing what you can to make sense of a time that logically your brain cannot make sense of. Sometimes that's therapy. Sometimes it's spending time with the person's things or being with family. Other times it's quitting your job and moving and changing things. There's no one way to process grief. I think it is very instinctual and there is no rulebook. And this AI almost applies a rulebook to it out of nowhere. And I think what I'm getting at is
you have to follow your instincts.
I think it's so interesting that people are saying, even as desperate as I am to talk to that person, something in me, some kind of primal animal part of me, knows that that's not right, that that wouldn't work. And actually, in the piece I referenced earlier, the Time piece, one of the neuroscientists they talk to, Mary-Frances O'Connor, has written a book about this that explores basically what's happening in the brain when we grieve, and how enormous a thing it is, and how discombobulating and huge.
And I do think to offer a person, in that very specific time, this glimmer of, here you go, you can talk to them for the low, low sum of whatever it is, I think that is the most dangerous part. Not that it would exist in some abstract way, but that your algorithm would properly sense you're going through something and serve you up one more conversation, two more conversations. I can't get away from the kind of commerce of it. But then on the other side, I do understand that we've always used technology to grieve, since, like, the advent of photography and video; we do rely on those things. But I think the point is those things are real, and those things did happen. And even if you're returning to a memory and making some alterations, oh, that day was happier than it was, or, we were always laughing, you at least are leaning on something real. And I do think, beyond that, my memories are just too sacred. And also,
I know that I wouldn't like to be AI'd, but I also know that potentially I could be. And that AI, I mean, likely she would not be saying, like, fuck this, I fucking hate this, why have you done this to me, which is what I would say if I really came back from the dead. I would be like, stop doing this, you mad cows, you know, like, move on. So either the AI me would be saying things like that, that I wouldn't say in real life, or it would be doing some kind of twisted version of my own personality, which is what you see in the documentary, Hannah's AI rendering of her own voice expressing, like, okay, it's a little bit weird. I find that so uncanny. I just,
And I hope everyone that knows me knows that is my sincerest wish. Watch my TikTok videos, you know, text me, but please do not. This voice you're hearing right now, of which there are so many samples, you could have me saying all sorts of things. Just please do not.
Yeah.
And you're so right. And I promise I'll tell everyone not to, but, I mean, the amount I'm vaping, I'll probably go before you.
It's the fact that with pictures and videos and voicemails,
they're not interactive.
So yes, you're revisiting, but you're not getting a response.
And I think that's the main thing.
And I think this idea of technology is confusing, because AI is technology, but it's technology like we've never experienced before.
And we had a message from Jess kind of saying,
we see already with how small children are so frequently given an iPad when they're distressed
as a distraction rather than allowing them to be upset and learn how to self-soothe,
that technology is stopping us from developing the ability to solve our own emotional problems.
And this feels like another symptom of that.
And to go back to an earlier message as well, it does feel like, if you really strip it back, that actually what we are doing is becoming the robots: we are becoming more numbed, more passive, more frictionless, so that we're more functional in a society that needs cogs in a system of capitalism that props up the richest people. Because the things that make you rich, really, are your human emotions, your experience and depth of life, and those things.
Obviously, you have to have your Maslow's hierarchy of needs, and have all of your safety and enough money to live a comfortable life, and a lot of people are falling below that. But beyond that, our human experiences, our emotions, the love and loss we experience, that is all it really is to exist, beyond everything else. And I think that we should cling on to that as almost a mark of resistance, because this is all I keep thinking: it just feels like this technology is designed to kind of smooth us out. And if we're outsourcing our grief, that does feel like the final frontier of stripping us of what it really means to be alive.
Yeah, I couldn't have said it better. I think that's what my thoughts come down to. It's anti-life, almost. It's anti-reality, anti-messiness, and it's anti-mortality. And we're a culture that is obsessed with living forever; we fear death, but we crave dominion. And as carbon-based life forms, as the Time piece points out, we are naturally finite and impermanent. We need to live within that reality rather than trying to fool ourselves that we are anything other, and I can't see my opinion changing anytime soon.
Thank you so much for listening and for all of your opinions and takes on this topic and for
being so vulnerable and open and honest with us.
Honestly, it's a real honour to have your trust with some of the things that you tell us.
So thank you so much.
You can follow us on Instagram at Everything is ContentPod.
And please do leave us a rating or review wherever you're listening if you haven't already.
We'll see you as always on Friday.
Bye!
