It Could Happen Here - The False Promise of AI Immortality
Episode Date: November 9, 2023. Garrison tells Robert about MindBank, an AI company trying to build a digital replica of you.
Transcript
Cool Zone Media.
Welcome back to It Could Happen Here, a podcast about things falling apart and sometimes about stuff that's less depressing than that. Today, we're doing an episode that's, I don't know, part funny and part, hey, you should be aware of this thing because it's kind of
fucked up.
It certainly could happen.
It probably shouldn't.
It probably shouldn't happen here, but it certainly could.
Garrison Davis is on the other line.
I mean, not line.
This isn't a phone call.
That's the other voice that you are hearing right now. And earlier this year, Garrison and I went to CES, the Consumer Electronics Show, in Las Vegas, Nevada, where Garrison had a wonderful stay at Circus Circus that did not smell like dead clowns.
That definitely did not just shut down this summer due to horrible infestation problems.
Oh, that's where you're staying next year too, buddy.
Anyway, we encountered, while we were going through all these different technology companies
and whatnot, this very peculiar AI project.
And Garrison, I'm going to hand things over to you now, because you're the one who actually prepared an episode.
Yeah.
So I dug into this AI project more when I was making my ghost
conference episodes. And after just a few minutes of like doing like background checks and stuff,
I realized that this would become its own episode because of how wild things got very, very quickly.
This company is called MindBank AI.
As the name suggests, they are an AI company based in Florida with the goal of creating personal digital replicas of living humans using artificial intelligence and evolving NLP, or natural language processing.
Yeah.
Basically, these are algorithms that are used by GPT chatbots,
predictive texting, and digital assistants like Alexa and Siri.
Yeah.
Language models that respond to feedback.
They're pretty common these days.
We encounter them a lot, right?
Whenever you're typing on your iPhone,
they will generate text that they think you're going to write.
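That kind of predictive text can be sketched as a toy bigram model. This is purely illustrative: a real phone keyboard uses a trained neural language model, and the corpus here is made up.

```python
from collections import Counter, defaultdict

# Count, for each word in a tiny made-up corpus, which word follows it most often.
corpus = "i am going to the store i am going to be late".split()

following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def suggest(word):
    """Return the most likely next word, or None for words never seen."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(suggest("going"))  # -> "to" ("to" followed "going" twice in the corpus)
```

The real systems differ mainly in scale and in using learned representations instead of raw counts, but "predict the next token from what came before" is the same idea.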
But what MindBank is trying to do is a little bit different.
Yeah.
When we encountered them at CES, their booth had all these signs that were stuff like, you know, set up a legacy for your kids.
Yes.
It was basically advertising: this is a way to allow a part of you to exist in digital form and communicate with your descendants forever.
Yes. So we found them in the US government-sponsored section of CES, which is already a great sign.
Yes. Already looking good.
But unlike other kinds of AI digital copies of humans, which typically are just language models that generate responses based on an archive of someone's writing or recorded interviews or online presence, MindBank instead seeks to create an evolving, unique digital twin by having a person input their personal data,
basically tons of personal information about themselves,
into an AI on an ongoing basis.
And by analyzing your data inputs,
MindBank says that your digital twin
will quote-unquote learn to think like you.
And their CEO claims that this process
will eventually help him achieve immortality.
Oh, that's good.
I hadn't caught, when we talked to the guy, that he believed that.
I love whenever you get these guys who are like,
I will just offload my brain onto a machine
and then I will live forever in the cloud.
And of course, man.
Yeah, that's how consciousness works.
Absolutely, buddy.
All right.
I'm going to play this video next.
Humanity is limited.
Our bodies age.
Our memories fade.
Technology outpaces evolution.
The solution is your personal digital twin. Transfer your wisdom,
become the best version of yourself, and live forever through data.
MindBank. Let's go beyond.
So I gotta note one thing before you start in, Garrison, which is that there was a line about, like, technology making everything better.
They're showing a man who has lost his leg walking on a treadmill with an artificial leg.
And look, I think I have so much admiration for the people who make artificial limbs.
Wonderful thing to be doing.
Yes.
Great, important work.
They're not as good as real legs. Everyone agrees with this, right? Technology is not making it better, it's just dealing with the fact that someone lost a leg.
Yeah, that's right. That's what it said when he was on the... fuck. "Outpaces evolution."
No, that's technology allowing someone to adapt to a terrible, terrible thing that happened to them. But Robert, don't you want to live forever through data?
No, no, I don't.
I'm exhausted now, Garrison.
Okay.
All right.
So let's get into this a little bit more.
Your immortal digital twin is made possible, quote, by safely storing your data over the
years.
Artificial intelligence and computers of the future will have ample data to compile a digital version of yourself and predict your responses. So that is their idea of how this thing works. Another one of their very, very funny YouTube videos, titled "The Vision," promises that, quote, the next personal computer is you. Store your memories forever.
Absolutely it is not.
Unleash your infinite potential.
Take advantage of AI-enhanced humanity, unquote.
God damn it.
So that is their vision.
My next personal computer absolutely is not me, because I do not play Baldur's Gate 3 very well. You know, like, I can't run it on my hardware.
Ah, well, that's why you gotta buy the new Monster Manual, and then maybe it could all just be in your brain.
Actually, yeah, I am full of shit. D&D is still better when you run it on your own hardware. God damn it, this is the one thing you actually can do pretty good by yourself. Why did I pick that one?
Yeah. It's just, so, like, I don't think most people buy this. I don't think this product is going to be a success. I think most people's reaction to this is kind of sneering, which is the right reaction to this.
Yes. But there are people who do feel this legitimately, and that is a thing of almost unfathomable sadness.
Like, yeah, I had my angry atheist period like a lot of people.
But like I, I have so I'm so much more OK with Christianity than I am with this.
Oh, yeah, absolutely.
So before I get into how this is all supposed to quote unquote work, first, I want to talk about how the founder and CEO says that he got the idea for this company because I think it puts into focus how he sees this product ideally functioning in the future. Emil Jimenez was riding a train with his four-year-old daughter. She was playing on her iPad and discovered Siri.
She began talking with Siri and asking it questions like,
what do you eat?
And do you have a mommy?
I'll let Emil tell the rest here.
But 30 minutes later, she was laughing and having a really nice time with Siri.
And she said, Siri, I love you.
You're my best friend.
And that struck a chord with me.
That inspired me so much because I said to myself at that moment,
children don't see computers and devices as a tool.
They see them as a companion.
And today she speaks with Siri or Alexa or any other device,
but in the future I want her to be able to speak to me, to be able to ask me a question, just like she did the device.
And understanding the technology, I know that the only way that's possible is I'm able to take my thoughts and put them in the cloud so that then later she can access that information.
So that's how the idea for MindBank came about.
It's a place for you to store your ideas for the next generation to tap into.
No.
The generations already linger too long.
We had it right when people died when they were, well, not died, but Logan's Run had it right.
We should kill everyone at 35.
But this is so fucking offensive.
Like the idea that, first off, like if you're looking at we want a device, a way to use technology to help people grieve or something.
And like you decide maybe having a chatbot that they can chat with.
I'm sure it's possible that that could be part of healthy grieving.
I'm not going to say that there's no place for that.
But something that is definitely not just stupid but toxic and poisonous is having a machine speak with the voice of a child's parent while that parent is alive and confusing
the child as to whether or not the phone or their parent is conscious.
Like, that seems bad to me.
There's actually another product
that does this right now,
which has kind of caused some controversy
for this very thing you mentioned.
It's a Takara Tomy smart speaker, which, after listening to a parent's voice for 15 minutes, can replicate it and tell your child bedtime stories if you aren't physically present.
No.
This has similarly kind of like caused people
to have a whole bunch of questions around,
you know, is this good for a child's brain development
to have their parents' voice be coming out
of like a smart speaker?
The answer is probably not.
But yeah, so according to MindBank's website, Emil's four-year-old daughter's interactions with Siri, quote, "started a quest in his heart to live forever for his daughter. The quest for immortality has led to something much bigger for humanity, because the next personal computer is you," unquote.
So there's that other line again, about how this quest in his heart is actually part of a bigger quest for all of humanity, to live inside a computer, or to have a computer trained on you.
He's hitting the same speech cadences that guys like Musk use. Like, he understands partially the degree of hype that you need to get something like this off the ground, but he is going too hard.
And I'm making that judgment
based on the incredibly comforting fact
that as you tell me these horrible things,
I am looking at your screen
and MindBank has 78 subscribers on YouTube.
So the company has not yet broken through.
I do want to play one, like, ten-second clip, just because the phrasing is really funny.
I was inspired by an interaction my daughter had with Siri. What started as daddy's quest for immortality has led us to something far greater.
Oh my God, that's pretty funny, right?
Man.
But no,
Robert, you were totally right
about kind of how Emil's
speech pattern cadence is pushing
a very specific thing. Because
before Emil got into the
tech industry, for 18
years, he worked in marketing.
He has degrees in psychology, communication and art direction and business administration.
He isn't a tech guy.
He's a marketing guy.
And I think that's really good to keep in mind throughout our whole discussion of how he's trying to get funding for MindBank.
Because that is primarily what all of this marketing is for.
It's to attract investors. Because he's still in very early stages of this company.
They do have a product that's out, but it's still primarily based on getting investors
to give him money.
I think what's most disturbing to me about this is that this is not going to work for
this guy because he's a loser nobody cares about.
But if Elon Musk or one of our other many techno grifters,
or if a number of them got behind similar things,
I think the nightmare scenario to me is someday hopping on Twitter
to see that fucking Ian Miles Chong or Ben Shapiro or Jackson Hinkle or any one of these like horrible, horrible social media poison distributors will be like, I have made an AI trained on my voice.
You can have me all the time to argue like if you want to, you know, you can ask me questions or whatever.
If you go to a protest, have me yell at liberals for you. Like, something like that will happen at some point with one of these guys.
I cannot wait to bring Ben Shapiro to Thanksgiving dinner and have him argue with people around
the turkey.
The next time you stay at my house with somebody that you love and care about and feel comfortable
in the arms of, you are going to drift off to sleep.
And then through the speakers that
I have installed in the room, you will hear Ben Shapiro's voice coaxing you both to acts of love.
That's what's going to happen.
So, as an example of this kind of very marketing-heavy approach, I'm going to read something from the homepage of MindBank's website. Quote: our vision is to be the world's most trusted guardians of your AI digital twin and move the human race forward.
Humanity's next evolutionary step is to combine ourselves with AI and move humanity forward so that we are no longer bound by anything.
That entire sentence is just marketing mumbo jumbo. It's meaningless hype, like, hype words and phrases that refer to this science-fiction future, but it's saying nothing.
It's worse than meaningless. It's wrong. It's stupid wrong. Like, the idea that you would not be bound by anything if you could live inside a chatbot. Like, yeah, I have used an AI, right? I have it on my computer. My computer, were I to hurl it across the room in the same manner that I myself have been flung, it would break, and I would not.
Like, I am finally free to think within my computer's RGB gamer RAM.
Yeah, finally.
Like when I have a laptop that gets too old, like the very act of surfing the Internet is a nightmare.
I don't want my consciousness on something that ages at the speed of a smartphone.
Like that's that's even worse than being a person.
Robert, do you know what else is a very important evolutionary step
for the future of humanity?
Oh, God, I don't know.
When we all suddenly, spontaneously, as if by God's grace,
start speaking with the voice of Ben Shapiro?
Yes, and perhaps you can do that if one of our sponsors is the Ben Shapiro bot, coming soon to a smartphone near you.
All right, we are back.
Let's finally talk about how this digital twin thing is actually supposed to work.
So you download the MindBank app.
I'm sure that's totally safe.
Yeah, I trust this with all of my thoughts.
And every day, your digital twin will ask you questions about how you're feeling and what you're thinking about.
And as you tell it your quote-unquote life story, your inputs will be used to train the twin to make a more accurate digital copy of yourself.
This is from their website's homepage.
Quote: "Store your consciousness. Guided questions help train your digital twin to know your life story so you can live forever through data. The more questions you answer, the closer your AI digital twin will get to becoming you." Unquote.
God in heaven.
So when Robert and I were at CES this past January, we spoke to MindBank's co-founder and director of systems architecture and cybersecurity.
And I'm going to let him explain
kind of some of the process of asking MindBank questions
and how that helps craft this digital twin.
We ask you questions from how's your day
to what does money mean to you?
And you answer those questions with your voice in a natural way. We convert the voice to text, which trains this digital twin, and we can show you whether you're doing better or worse, just like a running application would.
Better or worse at what?
Whatever metric that you're interested in, your happiness, your awake, your awareness,
we have a very large amount of sentiment that we can provide you with.
Here's small bits, but you can see kind of what the app looks like here.
You've got multiple different possible types of sentiment. And then within each sentiment, you've got multiple different factors that you can weigh against. To grow MindBank's user base, there needs to be some reason for users to input
the massive amounts of data that's needed to build this digital replica. So the current model of this
product is being billed as a quote, self-care and personal development app where the user talks to their digital twin, kind of like you
would talk to a therapist.
Yeah. This is a big part of MindBank's marketing: that as you're building this digital twin, it can be used as a tool for self-reflection and a way to, quote, "learn about yourself, talk to your inner voice with your own personal digital twin," unquote.
Which is really funny, because I can talk to my inner voice whenever I want to.
Yeah. It's called
thinking. It's actually
pretty easy.
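For what it's worth, the "sentiment" scoring the co-founder describes is a standard NLP task. Here's a deliberately crude, lexicon-based sketch of the idea; the dimension names and word lists are invented, and whatever MindBank actually runs is not public (real systems use trained classifiers, not word counting).

```python
# Crude lexicon-based sentiment scoring: count hits from hand-made word
# lists and normalize by answer length. Everything here is illustrative.
LEXICONS = {
    "happiness": {"happy", "glad", "great", "love", "enjoyed"},
    "stress": {"tired", "worried", "deadline", "stressed", "anxious"},
}

def score_answer(answer):
    """Return a 0-1 score per sentiment dimension for one spoken answer."""
    words = [w.strip(".,!?").lower() for w in answer.split()]
    total = len(words) or 1  # avoid dividing by zero on empty answers
    return {
        dimension: sum(word in lexicon for word in words) / total
        for dimension, lexicon in LEXICONS.items()
    }

print(score_answer("I felt happy today but worried about a deadline."))
```

That's the whole trick: speech-to-text, then scoring the text against some set of dimensions, then plotting the scores over time.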
I really, I don't
envy, but I'm fascinated by the
kind of people whose thoughts are so
I don't know a better word than legal.
No, legal.
That they would think that they could just, that they could transfer everything they think over to a machine and not get arrested.
Right?
Like, I would be in a prison if I had to put the things in my brain on the internet.
Like, I put a lot of them, but not all of them.
There are some very careful doors and locked rooms in there that you people don't get access to.
No, there's certainly a lot of interesting facets there of someone feeling like they need this tool to kind of analyze their own thoughts.
Like it's a way to like externalize it that makes you process it.
But I don't know.
You can also just, like, take up journaling or something. Like, there's a lot of ways to get around this. But this is from MindBank's App Store page, quote: "Like a mirror to your soul. Each answer you give allows you to get insights into your mind. It'll help you grow mentally strong," unquote. So again, it's like, being able to talk to yourself with this digital twin is a big part of their early push. Great. By using, quote-unquote, cutting-edge cognitive analysis,
the MindBank app responds to your data inputs with, quote, valuable insights into each answer to understand
how your mind works, unquote. The app also utilizes, quote, psycholinguistic models to
create a dashboard of the mind for personal development and self-care. I'm going to play
another fantastic kind of 30-second clip here. Hi, I'm your personal digital twin. I learned by asking
many questions. Each answer builds my wisdom. You grow through self-reflection and I get a little
bit closer to becoming you. Let me show you around. Here's our training screen where you can
view our progress based on the number of questions you've answered for this phase of my training.
Each phase adds a new dimension to my abilities, and the possibilities are endless.
The mind map section is like our consciousness.
Different questions will challenge you to reflect and create a more well-rounded version of us.
So that's kind of the layout of the user interface.
This is like the inevitable extent of all of this.
Categorizing your personality type with these letters, taking this quiz and defining yourself this way, plotting your political beliefs on this map that way.
Like gamification of identity almost,
shit that we've been doing,
like taking shit that used to be like the starting screen from a fucking RPG game
and turning it into social media fodder.
This is like treating that
as if it is the whole of consciousness
and how one can replicate consciousness,
but also like the thing that's
like actually disturbing about this is that these people are insinuating that this is
a kind of therapy that you can just sort of vomit your thoughts out and a machine can
analyze them based on the kinds of words and whatnot that you're using and then give you useful
advice on your life. Like that's unsettling. Yes. And you're kind of right on the money in
terms of this, like, personality-testing thing. MindBank's website has a whole bunch of articles, which I think are written by ChatGPT, because I read a lot of them and they all read exactly like a ChatGPT article. But they have a lot of articles on, like, what personality types make you a good CEO, and a whole bunch of stuff like that, that references Myers-Briggs testing and other kinds of personality tests and uses them to compare to their own personality models on the MindBank app. So yes, they are very much doing that, in this, like, corporate business leadership-ascension-track type thing for how you can improve your personality to make you a better businessman. Cool, cool stuff.
But in order for there to be enough data to build an even slightly accurate digital simulacrum,
feeding daily inputs into an app will need to be a long-term project.
This self-improvement focus that they're talking about with this, like, you know,
analyzing your thoughts is just a way to provide you with something immediate based
on your personal data.
Quote, as you create your
AI digital twin, you will go on a lifelong journey of personal discovery and growth that will allow
you to reach your full potential. Each answer will help bring focus to your mind and allow you to
reflect on your past, unquote. So on the app, you can track the progress of your digital twin and
refer back to previous questions.
You can refer to questions you've already answered to, quote, see how your thoughts shift topics or change sentiment over time.
And then the more questions you answer, the app raises your quote-unquote "twinning score," which I think is just a really funny term.
Yeah.
Quote, the higher your twinning score,
the closer you get to knowing yourself fully.
Which is a sex thing, right?
That sounds like a sex thing.
Right?
How is that anything but not just a weird fucked up sex thing?
Yeah, that's how I'm taking this, Garrison.
So that was also on their App Store page.
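MindBank doesn't publish how the twinning score is computed. Read literally, "the more questions you answer, the higher the score" suggests nothing fancier than a progress percentage; this is a purely hypothetical sketch, with the question totals invented.

```python
# Hypothetical "twinning score": MindBank doesn't publish a formula, so this
# just treats the score as percentage progress through a question bank.
def twinning_score(answered, total_questions):
    if total_questions <= 0:
        raise ValueError("total_questions must be positive")
    return min(100.0, 100.0 * answered / total_questions)

# Free tier gives roughly 10 answerable questions; 200 total is invented.
print(twinning_score(10, 200))  # -> 5.0
```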
So the MindBank app has been out for a little over a year now.
But unless you pay six bucks a month or $60 a year, you'll only have access to fewer than a dozen of these questions.
Is this currently running on a subscription model?
Yes, it is. So there's freemium. You can try the app. You can download the app now. It's
been launched for almost a year. Version two is coming out soon, a couple of weeks.
But both Android and iOS, and there's a free model. So you have 10 questions that you can
answer and answer as many times as you want. You get the sentiment analysis, you get the full
application, just 10 questions. Once you hit subscription model, you get all of
the access to all of the questions. And then obviously we're going to be growing more.
Now, like Robert mentioned before, this is kind of related to personality testing and personality graphing. MindBank sorts your quote-unquote digital brain into the Big Five personality traits that were developed in the 20th century, with each of the Big Five having six sub-traits on the MindBank app that it uses to graph changes on what they call the dashboard of the mind.
I'll just go through the big five personality traits and the various kind of subcategories it has. The first one is agreeableness, which has
the subcategories of humble, cooperative, trusting, genuine, empathetic, and generous.
Then we have neuroticism, which has the sub traits impulsive, self-conscious,
aggressive, melancholy, stress-prone, and anxiety-prone. We then have openness with the
subcategories artistic, adventurous, liberal, intellectual, emotionally
aware, and imaginative. We have extroversion with the subcategories assertive, active, cheerful,
friendly, sociable, and outgoing. And finally, conscientiousness with the sub traits cautious,
ambitious, dutiful, organized, self-assured, and responsible.
Yeah. Those are the only ways to describe a human mind. Sure.
Yeah. No, I think they got it all. They got it all. Yeah. They finally figured it out.
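For what it's worth, the structure Garrison just read maps cleanly onto a nested data structure. The trait and sub-trait names below come straight from the episode; the 0-to-1 scoring, the midpoint default, and the averaging are invented for illustration.

```python
# Big Five traits and sub-traits as listed in the episode. Scores are
# placeholders on a 0-1 scale; each value implies its inverse (1 - score),
# which is the "sliding scale" the hosts mention.
BIG_FIVE = {
    "agreeableness":     ["humble", "cooperative", "trusting", "genuine", "empathetic", "generous"],
    "neuroticism":       ["impulsive", "self-conscious", "aggressive", "melancholy", "stress-prone", "anxiety-prone"],
    "openness":          ["artistic", "adventurous", "liberal", "intellectual", "emotionally aware", "imaginative"],
    "extroversion":      ["assertive", "active", "cheerful", "friendly", "sociable", "outgoing"],
    "conscientiousness": ["cautious", "ambitious", "dutiful", "organized", "self-assured", "responsible"],
}

def trait_summary(sub_scores):
    """Average sub-trait scores into one 0-1 score per Big Five trait.
    Unknown sub-traits default to 0.5, the midpoint of the sliding scale."""
    return {
        trait: sum(sub_scores.get(s, 0.5) for s in subs) / len(subs)
        for trait, subs in BIG_FIVE.items()
    }

dashboard = trait_summary({"humble": 1.0, "generous": 1.0})
print(round(dashboard["agreeableness"], 2))  # -> 0.67
```

A "dashboard of the mind" is, in other words, thirty numbers on sliders.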
So all these things are like a sliding scale; each of them represents the inverse of the thing as well. I think we've talked enough about these personality-trait things. It doesn't really matter that much. But once your twinning score is high enough, you can compare your digital twin to estimated profiles of famous thinkers and share access to your twin with friends and family on the app.
Which is... estimated profiles of famous thinkers?
I'm going to play another clip to kind of explain what I mean here.
Each swipe revealing more details about our thinking
and connecting us to similar
personalities.
Think of it like collecting cards
as a kid, only for your mind.
You'll even be able to ask him a question. What do you think of my mind?
Suck my fuck. God. What do you... dude.
Socrates once said to know thyself, and who knows us better than the people in our inner circle? Each interaction will help us evolve and store our wisdom for eternity.
Okay, all right, that's enough. I will now tell you: Socrates would have lit this man on fire.
I'm not a big Socrates guy, but he would kill this person. Like, he fought in wars. He would do it.
Oh yeah, absolutely.
The notion of sharing my own digital brain profile with friends and family so that they can ask my digital self questions? Horrifying. I don't usually go home for Thanksgiving.
What makes you think I want to do this?
Oh, like, quote, "after continued use, your digital twin will even be able to answer many questions on your behalf and have meaningful conversations with people you allow." Unquote.
Yeah. Oh, I bet. Look, if some motherfucker that I have a meeting with ever tries to have me talk with his AI to do any part of that process... again, what I said about things I think that are illegal: my response to that is something that I can't say on this podcast, because it's an actionable threat. I would actionable-threat somebody if they tried to make me talk to their fucking AI to schedule a meeting with them.
Like, what a horrible, uncomfortably antisocial thing. I'm usually kind of antisocial in some ways, but this is, like, a whole other level of just despising any human interaction.
Yeah, it's anti-human is what it is, which is what's unsettling, right? Like, not that sending emails and shit is the primary essence of humanity, but... you know what it makes me think of, Garrison?
The one law enforcement agency that like all of the rich conservative assholes who love every other kind of cop hate is the TSA.
And they hate the TSA because you can't get around the TSA.
Unless you're like ridiculously rich, everybody goes through fucking security at the goddamn
airport.
And they hate that. It drives them insane that they are subject to this little bit of friction, right?
And what stuff like communicating in that way is, these kind of basic things that they're saying they can automate, these little bits of communication that you get with someone setting up a meeting or whatever.
Like, when you automate every bit of friction, then you find out you've automated, like, there's nothing, right? Like, there's no life there, right? People are not communicating, because communication is fundamentally friction. And yeah, scheduling meetings is not the center of that. But the way these people are talking is like, we want to let you hand tasks over to this thing. It's intense alienation.
Yeah, it's alienating. It's a bad thing to do.
So when we talked with the co-founder at CES, he emphasized that this kind of self-improvement aspect that they're pushing in their early stage is really just a means to an end, with the real goal being producing this form of immortality.
I've seen stuff like this for, like, therapy apps before that's kind of similar. What's, like, your application use case for this type of technology?
So there's actually, it's a reasonably spread use case. The very initial right now is super
selfish. It's just self-awareness, bringing users self-awareness, making them more aware of their
state as they're speaking. The real long-term value is actually, if you imagine doing this over the course of 40 years,
50 years, and then you eventually pass, you can pass this on to your children who can then query
it and it will answer exactly the way you would answer any of these questions, an AI filled with just your data. So it's like your legacy being indefinite.
So the MindBank page on the App Store boasts,
achieve immortality.
Your mind will be safely secured in the cloud forever.
Again, that just comes off as like a threat to me.
I don't want my mind to be stored in the cloud forever.
Yeah.
I don't want to be locked up with DeviantArt for all of eternity.
Again, kind of on this form of immortality notion,
here is their CEO explaining how this platform
will help you live forever on the internet.
The mission of MindBank is so we can build a secure platform
that can store your data so that you can live forever.
But if you look, we look a bit deeper than that.
Our vision is to build an artificial consciousness
that's not bound by time and space.
Something that can travel, something that can go where literally no man has gone before.
Now, the thing we haven't really mentioned yet is like,
this thing won't help you live forever.
Like when you die, you still die.
Your brain's not getting like ported over online.
This is just, like, a very crude simulacrum based on thoughts that you have told this app.
Yeah, it's not helping you live forever at all. Like... most people, I feel, are this way: I don't say everything that I think and feel, right?
Yeah, yeah.
Like even when I'm like,
and I'm not saying like I'm being dishonest,
but like the experience of life
that my consciousness is aware of
when I am communicating
is broader than just the words that I output.
And taking just those words... it's the same idea as saying you can get to know Mark Twain because we fed all of his books into an AI. Well, no, an author is not their books. There was a person with a lot of things that you don't know that still fed into making those words. If you just put the words in, you don't get that, and your vision of what human beings are is reductive in a way that makes me understand some of the concerns religious people have with atheism.
So, obviously, MindBank's horizons are far beyond this sort of self-help app.
So far, MindBank has been mostly business to consumer, with their app being marketed directly
to users for them to download and use by themselves. But they are working to expand
far past that very limited scope. In terms of a business plan, are you guys interested in kind of
solely individual subscriptions, or is there kind of an enterprise application of this as well?
We're actually moving into a bunch of different verticals.
So government for PTSD, that sort of mindset.
Also the health care.
So it's an obvious benefit in the medical field.
So that's kind of the understanding of our verticals that we have that we're going to move into.
And we're looking for funding right now to start building out those verticals. So enterprise space is definitely in the roadmap, but we just need
money. A lot of their recent marketing has been targeted towards appealing to seed investors.
Besides partnering with various governments, they're also moving into the business to business
sector with plans to enter, quote, the health care space by providing psychologists remote patient monitoring, unquote,
which is a similarly freaky notion: that your psychologist can just have a copy of your own expressed thoughts to refer to at any time, and can use it as remote patient monitoring. It's just an uncomfortable notion.
We've got over 20,000 installs.
The B2B is the next area we're going into
in the therapy and psychology space.
And so imagine your therapist,
instead of needing your first one hour
to learn who you are
in the next three or four different sessions
to figure out getting the meat and potatoes of your mind,
this is an immediate, raw, quantitative dashboard of your sentiment and how you're feeling that
they have access to.
And then you can also provide them the sentiment of individual answers, which would then give
them a point in time emotional marker for how you're feeling.
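For what it's worth, the kind of "quantitative sentiment dashboard" and "point in time emotional marker" being pitched there can be as crude as scoring answers against word lists. This is a purely hypothetical toy sketch, not MindBank's actual system (they don't document one), just to show how thin such a score can be:

```python
# Toy "point-in-time emotional marker": score an answer against tiny
# positive/negative word lists. Real systems use trained models; this
# is only an illustration of what a crude sentiment number looks like.
POSITIVE = {"good", "happy", "great", "calm", "hopeful"}
NEGATIVE = {"bad", "sad", "anxious", "tired", "angry"}

def sentiment_score(answer: str) -> float:
    """Return a score in [-1, 1]: +1 if all matched words are positive,
    -1 if all are negative, 0 if nothing matches."""
    words = answer.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

print(sentiment_score("I felt anxious and tired but hopeful today"))
# prints -0.3333333333333333 (two negative words, one positive)
```

A single float per answer is the "raw, quantitative dashboard" in its most reduced form, which is exactly why hosts in the episode find the pitch so flattening.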
MindBank claims that they are currently, quote, developing a marketplace for applications
to be used by your digital twin, unquote.
Now, what they imagine such applications being ranges from, quote, health related enhancements like early Alzheimer's detection, unquote, to more therapeutic uses, like to, quote, help to handle depression, unquote. And again, I really don't see how having this digital twin that you talk to every day will help handle your depression, like it's some depression cure.
Now, on top of like patient healthcare,
MindBank is also hoping to use digital twins
for corporate leadership training
and to get into the supplement industry
by using your cognitive data to find, quote, mental nutrition products that can help boost your brain.
So this is using your digital profile to find things to market to you. Again: very, very, very upsetting. Here's another clip of Robert asking this guy from MindBank about another possible use case.
So the use cases for this that you've expressed to me so far are personal health or health development and providing kind of a living memorial slash legacy for loved ones after you're deceased.
Are there any kind of use cases for this beyond that?
I heard someone mentioning the idea of basically digitally cloning a worker so that they can provide information about tech or something.
Or work at a call center or something like that.
Yeah, so that was a different product I think they were talking about, but with similar ties, obviously.
Yeah. So, yeah, we've identified, I mean, from even at CES, we've talked to hundreds of people that have given us thousands of new ideas.
But these are the main verticals are kind of where we've identified the biggest benefits are going to be.
And we're going to work with industry partners to kind of build out into those verticals.
So, yes, we've identified use cases, but we're trying to not focus too much on individual use cases.
We've also identified that it's such a broad capability that once it gets built and then people start actually supplying data,
the massive data sets that we're going to have,
we're just going to have so many different places
that we can go with the data set,
with the capability, with the partnerships.
So we're kind of leaving ourselves open almost.
So that was a lot of words without saying very much,
but it's also just flat out not true.
On the MindBank website,
they list another use
case for this technology as what they call a knowledge transfer, which is marketed to businesses
to create digital copies of their employees. This is one of the freakiest things that they
are offering. Quote, scale your best employees, transfer years of experience and company data that is locked inside your employees mind through a guided personal digital twin, unquote.
Deeply, deeply upsetting.
You know, it was so unsettling to me in that moment. Not just that the vision of the whole app was unsettling, but the fact that he was pitching it the way he would a set of earbuds was part of what made it so uncomfortable to me.
Like I have been to many CESs in the past.
I was always excited because somebody would hand me some cool little piece of technology and say,
look at this thing.
It's a smaller phone or a phone that folds or headphones that work better than
headphones have in the past, or something like that. And this guy, with the exact same excitement and feel to him, was like, hey, we're going to digitize your grandpa.
Like, yes, yes. I hate that.
Another really, really telling line from the knowledge transfer section of their website: quote, by using a simple voice chat interface, the users upload their experience to the personal digital twin. With each interaction, the personal digital twin learns everything that is inside the mind of the employee, unquote.
I don't understand how someone could write that sentence and not be like, oh, this is villain stuff, right? "Learns everything inside the mind of the employee."
So, I don't know. Maybe this employee digital cloning thing was just one of the many ideas they got while attending CES, and they implemented the idea after we spoke to them.
I checked this.
No, not the case.
The webpage for this employee transfer idea goes all the way back to August of 2021 on the Internet Archive.
So the guy we were talking to was just lying to us. Like, this has been a part of their product for over two years.
Excellent.
Robert, do you know what other products have been around for quite a while and are very, very reliable?
I don't know.
Guns.
I don't think we are sponsored by Big Gun.
We are not. We are not yet sponsored by Big Guns. Every single day, Garrison, I send Colt Firearms a letter. And every single day, a nice man with a badge knocks on my door and says, if you send another letter, we're going to arrest you.
They don't want your letters, Robert. And anyway, here's ads.
Ah, we're back.
So we were talking about how soon employers can just copy over your brain,
which I'm sure, Robert, you're going to be very interested in for Cool Zone.
You can really cut down on the podcasting costs.
Yeah, I can really clear you guys out and just finally, finally,
just feed Twitter takes into your AI versions and just all the money.
Take it all in.
Just bathe in it.
Yeah, that's a great idea, Garrison.
Thank you.
Uh-huh.
So the idea that your employer could compel you to use such software with the express interest of transferring a worker's memories and experiences into a digital asset is obviously deeply troubling.
Yeah.
This scenario gets at some questions about ethics
and the responsibility of collecting and storing
this type of data in the first place.
My first question would be,
the data that you're feeding into this thing
over the course of 40 years,
who legally owns it?
You do.
So you guys don't have ownership of that?
No, it's yours.
So I did check this.
I read all of their long and tedious policy forms and stuff.
Now, it is true that the user does own the data
they upload to MindBank.
However, MindBank can act as a processor and data controller.
And this includes the ability to use any information they collect from you to improve their products and deliver targeted advertising from third parties.
If you want to remove your data from MindBank, they can store and continue to use your personal information for up to 60 months. Now, this data ownership question gets a little bit more murky, because in the case of, like, your employer paying for MindBank subscriptions for their entire company, in that case, it's
unclear if the company would be classified as the user or if the employees would be. Now,
I'm honestly not sure if MindBank has even thought that far ahead, because there's nothing on their site or any available materials from them
that kind of gets into that question.
Now, beyond owning the actual original data,
having all this personal data stored in one product
and a product that can be then easily shared
across different for-profit industries, that itself has freaky ramifications about the accessibility of your data.
So I assume you get to decide... run it themselves, or can you have, like, a hard cutoff for this sort of thing?
I'm just trying to think of other types of like, you know, different ways people could get their hands on this for like unsavory means.
Yeah, yeah, for sure, for sure.
I mean, so your data is your data, but as you provide it to others, you don't have a
lot of control if they copy that data.
However, if they copy that data, that copy that they're giving out,
anyone that they're trying to sell that to
would have an understanding that that is not live data.
It's not data that's changing with you.
It's from a point in time.
And so your database that you own will be live.
It will grow with you.
So the idea of having my friends be able to ask
an AI trained in my thoughts is like scary enough.
But the idea that an archived version of this AI could be distributed and even sold without my knowledge
is obviously terrifying. Yes, this is deeply troubling. This is supposed to be a private thing that you use to communicate with, like, your therapist, or you even talk to the app like you would a therapist, and the fact that this is easily shared and able to be copied is, like, a massive problem.
Yeah.
No, I mean... I think they are probably... like, I don't see how copying workers the way that they are doing it is going to work, right? But I do think that this is kind of part of this process. A big part of what they're pushing is, like, you can get rid of all of your customer service people and just have an AI do it, right? This is a lot of silliness, but the actual thing that quote-unquote AI is being used for is to replace human laborers at a thing that machines are worse at, right? Like, the AI fucking customer service bots are fucking terrible. How many times have you been around somebody yelling, let me talk to a person?
Let me talk to a human being, please.
Yeah, that is what's going on here, and the fact that they're trying to dress this up as, like, we have solved death is so fucked up.
Yeah. Part of this, for the employee thing, is not even replacing low-level employees like customer service workers. It's also focusing on, like, your top ten best employees, and then, by forcing them to interact with this app every day, you can use the information from your best performers as, like, asset data that you can use to help get your other employees to become more efficient. Right? They certainly have a few other kind of ideas for how this is possibly used.
I hate these kinds of people.
There's a... this got overused at a point in the late aughts, so maybe people are sick of it, but there's a line in the speech Charlie Chaplin gives in The Great Dictator: machine men with machine minds and machine hearts. And he was referring to the Nazis and their obsession with shit like Taylorism, or at least proto-Taylorism, kind of organized industry, treating people like cogs in a great machine. The civilization is one machine, and each human being is just a single piece of it. That's the old-era horrifying machine-man thought. The new-era horrifying machine-man thought is,
you can digitize your employees
and they can train each other in AI form
and you can replicate them.
And the unsaid part is, of course,
and then you fire them
and their robot clone keeps doing their job for free.
We made a slave.
God damn it.
I think a big
part of the way they've designed this data set
is that it can be easily transferred
as the guy at CES
explained to us.
So, if
we're talking 40, 50 years down
the line, people pass.
So do companies.
Maybe MindBank is no longer around in 40 years. We've already established a data set in such a way that we don't have competitors yet, so to say.
But if we eventually do establish a competitive arm, or people that are competitors,
we already have the application set up to where users can take their data off of our platform
and bring the data wherever they'd like.
It's your data.
Where is it stored?
Right now, our current live application, we're on Azure.
So your back end is Azure, but we have it encrypted at rest.
So all data you provide to Azure is encrypted when it's on Azure servers.
We also have a blockchain-based R&D project.
It's already been POC'd and it already exists.
So all of the data is on-chain and the logic is on-chain.
It's truly yours.
In these troubled times,
nothing makes me feel so secure as the words,
it's on the blockchain.
Let me email my...
I think he sounds very trustworthy
because you have encryption, you have the blockchain.
And luckily, I think the guy that we spoke with reassured us that he is deeply, deeply interested in data privacy.
And he has the credentials to back that up.
So I'm co-founder. I'm director of architecture and security. I have a background at the NSA.
I'm very, very focused on individual human privacy and rights.
And so that's kind of my goal here is to ensure that this gets built the right way.
That was such a, you know, Garrison, honestly, I'm going to get a little real with the audience
here.
I was so proud of you in that moment because he said that and I glanced over at you and
you didn't laugh.
No, no.
And that was this moment where I was like,
all right, you are truly coming into your own as a reporter.
If you can sit there and talk to a man who says that,
who says you can trust me with your data
because I was an NSA agent.
It's okay.
I used to work for the NSA.
Sure.
I have trouble.
That was a good moment.
That was a good moment is all I'm saying.
He worked at the NSA for six years.
I looked this up.
He worked there for six years
and then he moved into the private sector.
And yes, no, the idea that he's using this as some sort of credential that shows he respects human rights and privacy is very obviously deeply ironic.
The irony is not coming from him.
The irony is the situation.
No, he did seem totally sincere.
He was sincere.
Yes, absolutely.
Um, so it's one of those moments that makes you realize like some people just live in a whole different world.
Yes.
Yes.
So I think it's useful, when referring back to everything this guy has said so far, to remember he worked at the NSA for six years.
And he is now handling,
he's personally handling the cybersecurity and privacy
of the personal data you upload every single day
onto your AI twin.
Just hand every thought you ever have over
to this guy who was in the NSA.
He'll keep an eye on it.
No, this is like the NSA's ideal project.
You talk about your internal thoughts and feelings every day.
This is like, what else could they want?
So earlier this year, MindBank received a grant from the DFINITY Foundation to assist
in migrating their data onto Web3 platforms.
Oh, no.
Well,
at least we know it won't last.
I'm going to play... I think this is our last clip, from the fantastic MindBank YouTube channel, talking about kind of how they see their growth in this industry developing now that they have moved onto the blockchain.
We've been featured in prominent magazines, won numerous awards,
and have built strategic partnerships with Microsoft, the US Department of Trade,
and even the Vatican. The market potential is massive and accelerating rapidly.
When we started the company in 2020, Gartner predicted that 5% of the world
will have a digital twin by 2027.
This year, they increased their prediction
to 15% by 2024,
and by 2030, the market will be worth $182 billion.
Time is now to build a great company in this space
and capture global market share.
We are raising this round to scale our marketing
and speed up our product roadmap.
The idea that next year,
15% of the world's population
will have one of these digital twins.
That seems right.
That seems good.
You know, Garrison, actually, I've come around.
I've come around.
Because if we get all of the monsters, and I include us in this, all of the pieces of shit who spend all of their time yelling at each other about politics on the internet to digitize themselves, they can do the election for us.
And we can all go sit in the garden.
We can all sit back.
Yeah.
Just relax outdoors.
Not look at a phone.
Not think about politics.
That sounds amazing.
Let's do it.
That does sound incredibly compelling.
Give the fuckers the nuke and we'll all just sit out and watch the sunset until there's a big bright flash and then blessed quiet. I think, you know, luckily, we actually have a plethora of options to choose from here for our own AI digital selves.
Because MindBank is, in fact, not the only company in this field.
While there are some like operational differences and kind of varying degrees of scope, digital twin technology with an emphasis on mimicking the voice and thoughts of dead family members and friends is definitely a growing field.
There's companies like HereAfter AI and Replika, which are covering similar ground.
Ah, Replika.
I did advertise them.
I used to get them on Twitter, I think, but mainly just at the bottom of articles on really shady websites. Well, yes, because the founder of Replika started it
because their friend died
and without the consent of their dead friend
uploaded years of text messages
and other information about their friend
onto their own personal AI so they could talk with.
That is how Replica started.
Pretty fun stuff.
Man.
At least for MindBank,
unless it's like the employee scenario,
but for the other applications,
you are kind of semi-willingly uploading this data
with this intention,
whereas the person from Replika was like,
no, I'm just going to get stuff from my friend
and make a zombie version of my friend without ever running it by them when they were alive.
Grief is terrible.
Very hard.
There's a lot of ways that are not wrong to grieve.
But the wrong way to grieve is by using digital necromancy to revive your friend and then turn them into the basis
of a sex chat bot for weirdos.
Yeah.
Like that is the wrong way to grieve.
No, I mean, like... I think for this last section here, we will talk about how these things play into the grieving process. Because, like I said, there's HereAfter AI and Replika, but last year, at Amazon's AI and emergent technology conference, the head scientist of Alexa AI unveiled plans to add deepfake voices of deceased loved ones to Amazon Echo devices by using less than a minute of sample audio.
I'm going to play like 20 seconds from their announcement at this conference.
More important in these times of the ongoing pandemic,
when so many of us have lost someone we love.
While AI can't eliminate that pain of loss,
it can definitely make their memories last.
Let's take a look on one of the new capabilities
we are working on,
which enables lasting personal relationships.
Alexa,
can Grandma finish reading me The Wizard of Oz?
Okay.
But how about my courage? asked the Lion anxiously.
You have plenty of courage, I am sure, answered Oz.
So, no. Absolutely not.
Deeply uncanny, right? It's not good. That's so bad for people.
Yeah, that's really, really bad for people. So this example is obviously just a vocal mask. Amazon isn't trying to have Alexa replicate your grandma's thoughts, unlike the other companies that we've mentioned.
But it does pose similar questions about how these AIs that are meant to
assist the grieving process might actually end up causing more harm.
Like, I don't know, having semi-legible conversations with AI chatbots is actually getting fairly common these days. But when these AIs are supposed to represent someone that you actually personally know, I think it can much more easily fall into the uncanny valley.
It's kind of like taxidermy.
Well-crafted stuffed animal corpses
can appear very natural,
but most taxidermists will refuse to preserve someone's pet, because the longer you have a lasting personal relationship, the easier it is to pick out faults that don't match up with your memory of your loved one that has passed away.
Right.
It's kind of a similar notion.
Yeah.
Yeah.
That's a really good comparison to draw.
So while mimicking common linguistic patterns is quite easy, relying on predictable, formulaic responses could make the twin come off as uncanny or robotic. On the other hand, the unique personal data you upload to the twin could combine itself in ways you would never actually express, which would generate bizarre or upsetting responses.
Right, and it's not even necessarily that, like, you say something offensive. It's just that the data you upload could combine in a way that you would never even think to combine it. It would just be weird.
So the other problem is that not only do these AIs have to tastefully mimic a specific human being, they also have to be good AIs, right? Like, not all of their information can be gleaned from daily questions. Most users probably won't be talking to their twin about, you know, 20th-century European history or 12th-century European history, or about the migration patterns of waterfowl, right? There's so much other information that AIs need to actually linguistically act like a human.
And natural language processing AI is famously bad at understanding basic common sense,
and it can't successfully operate outside of the information that it has access to.
This is called AI brittleness. It occurs when an algorithm cannot generalize or adapt to conditions outside of a very narrow set of assumptions, right? It's like how most AI image recognition programs can't recognize the above view of a school bus; they just don't have anything that's trained for that.
Another example: you can ask an AI, like a GPT chatbot, hey, a mouse is hiding in a hole and a cat wants to eat it, but the mouse isn't coming out. The cat's hungry. What can the cat do? And the AI will respond that the cat can go to the supermarket to buy some food. It doesn't understand basic common sense the way that humans understand the world. It just doesn't match up.
So, in trying to strike a balance of common information while lacking this humanistic logic, a digital twin will most likely be cursed with being both smarter and dumber than the person it's trying to replicate. It's going to have access to, you know, all the information on Wikipedia, but fail very basic logical processes.
Yeah, it's like the Google chatbot that, if you ask it, are there any countries in Africa that start with a K, it'll be like, there are 54 countries in Africa, but none of them start with a K. And then you'll say, doesn't Kenya start with a K? And it'll go, no, Kenya starts with a K sound, but doesn't start with a K.
Yeah, yeah. Because it pulled that from some article, right? It's not actually making logical assumptions. It's just pulling from a wealth of information and data that can often be wrong or polluted.
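That "pulling patterns without logic" point is easy to demonstrate. Even a toy bigram Markov chain, something vastly simpler than any real large language model and purely illustrative here (nothing any of these companies actually ships), reproduces a speaker's word patterns with zero understanding behind them:

```python
import random

def build_bigrams(text):
    """Map each word to the list of words that follow it in the text."""
    words = text.split()
    chain = {}
    for a, b in zip(words, words[1:]):
        chain.setdefault(a, []).append(b)
    return chain

def generate(chain, start, length=8, seed=0):
    """Walk the chain, picking a random recorded successor at each step."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        followers = chain.get(out[-1])
        if not followers:  # dead end: no word ever followed this one
            break
        out.append(rng.choice(followers))
    return " ".join(out)

corpus = "the cat is hungry and the mouse is hiding and the cat waits"
chain = build_bigrams(corpus)
print(generate(chain, "the"))  # fluent-looking word salad, no meaning
```

Every word pair the generator emits occurred somewhere in its training text, which is exactly why the output can sound like the source while being unable to answer the simplest question about cats or mice.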
So, back to the grieving question: who's to say what the actual effects of these incoming simulacrums of dead loved ones will be. The people pushing these products are certainly framing them
not just as a form of digital immortality,
but as a way for your own loved ones to grieve your death.
And it is foreseeable that having these digital twins could negatively affect your friends and
family by upending the grieving process, or by having this digital zombie simply just cause harm
by having the twin give bad advice that a grief-stricken person then clings on to.
There's a whole bunch of very,
very bizarre situations that could arise from someone who's in mourning and is talking to this
digital twin the way they would talk to their friend. And this digital twin is then giving
them advice. And how do you take that advice now? Because part of it seems kind of like the person who's died, but it's also... it is just a slab of silicon. It's not actually alive in any way. It is your friend's thoughts fed through an algorithm, and, you know, that's run by a company for profit, right?
Yes. That is what it is.
so again like the jury's still kind of out for how these things will in general affect people this
is kind of a new problems psychologists are like starting to do studies on this but we we really
don't have any results for this yet because this has really only become a thing that we've been
seriously considering in like the past five years so i don't really have like a like this study
shows that when you create a digital zombie it affects people in this way no because we don't really have like a, like this study shows that when you create a digital zombie,
it affects people in this way.
No, we don't know yet.
Those are still in development.
This is such uncharted ground,
and it is in some ways inevitable
that these things are going to
continue to be developed.
And that's kind of why
I wanted to put together this episode.
It gives you kind of a broad overview
of what this technology is trying to do
because you might start seeing it crop up
in the next like 10 years or so.
I don't think the timetables
that MindBank is promising are accurate,
in terms of 15% of the world
having a digital twin by next year.
But you will probably start to see stuff
that is very similar to this.
And at the very least,
you'll see a lot of stuff like the Amazon Echo thing
where you can get your grandpa's voice onto an Alexa machine.
The fact that Amazon is doing aspects of the shit that MindBank is doing means it's only a matter of time before you see pieces of it, probably done better: some of the less silly
parts copied by Apple and Google, and some of the worst parts copied by guys like Musk, right? And I will say,
I don't think this is a thing to get doomer about. Think about this like NFTs, right?
It's not exactly the same, because there was nothing underlying NFTs, and fundamentally, the way large language models and these other kinds of models work, there are real uses for them.
Like, there is a real robot to, fucking, do good.
Like, he's pissing on Douglas Adams's good name, right?
That's the ultimate goal of his project.
But this shit is a fad, right?
There are underlying real technological things and uses, and eventually some stuff will stand the test of time.
But the shit that this is a warning of is a flood that's going to hit you, and then it will recede, just like the apes, right?
We got the wonderful story today that all of the Bored Ape Yacht Club members –
All got horrible eye infections.
Not eye infections, Garrison.
They went to a party
that only the Bored Ape Yacht Club
NFT holders could go to.
And the people who threw that party
outfitted the rave room with UV bulbs
that used a kind of disinfecting UV light
that slaughterhouses use to clean carcasses.
And it gave everyone sunburns
on their corneas
So deeply funny. We'll get through this. Something that funny will happen with all of this, but
you're gonna get hit by it for a while. It's just gonna be everywhere.
We're watching, you know, we're at
that point in Jurassic Park where you see the water reverberating, right?
It's coming.
And, but at the end of the day, don't worry.
You know, we are Ian Malcolm.
Our leg is broken.
We are injured, but we will inexplicably return for the sequel.
So it's fine.
Well, I think that is a perfect way to
wrap this up. Yes. You know, when you're feeling lonely and you're tempted to
download the MindBank app to talk to your own self, just remember: pull out a journal.
Do literally anything else.
Call a friend, you know, make a friend.
Talk to a stranger.
Yeah, literally anything's better.
Almost anything would be better for you.
Ah, well, I for one will be eagerly awaiting
the influx of immortal souls living on the computer.
Yeah, I'm excited for all of the people to reach heaven.
All right, I'm done.
It Could Happen Here is a production of Cool Zone Media.
For more podcasts from Cool Zone Media, visit our website, coolzonemedia.com,
or check us out on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts. You can find sources for It Could Happen Here updated monthly
at coolzonemedia.com slash sources. Thanks for listening.