Offline with Jon Favreau - Why "Her" Reveals How We Really Feel About AI (with Ezra Klein)
Episode Date: May 23, 2024
Ezra Klein joins Offline Movie Club to discuss “Her,” the movie that more or less incited this week’s Scarlett Johansson v. OpenAI drama. Back in 2013, when ChatGPT was just a twinkle in Sam Altman’s eye, no one thought a writer falling in love with his sentient virtual assistant was a near-term scenario. But here we are! Ezra, Max and Jon debate what AIs mean for relationships, how “Her” introduced emotional stakes that are absent from AIs in real life, and why Altman definitely copied Johansson’s husky voice in the latest GPT-4o. For a closed-captioned version of this episode, click here. For a transcript of this episode, please email transcripts@crooked.com and include the name of the podcast.
Transcript
Weirdly, one of the better, I think, depictions of this in culture is Star Wars.
Because in Star Wars, there are a bunch of robots that are clearly AIs.
Right, but nobody tries to...
Just wandering around.
Nobody tries to fuck C-3PO.
You have no idea what C-3PO is up to in C-3PO's spare time.
You know what?
You're absolutely right.
And somebody is definitely fucking a robot somewhere in Star Wars.
That's a big galaxy.
That's true. I'm Max Fisher.
I'm Jon Favreau.
And with us today, Ezra Klein, New York Times columnist, host of The Ezra Klein Show,
and someone who Jon and I have known for a long time.
Ezra, I am so excited to have you here, man.
Yeah, but you sound so calm about it.
That's just like an NPR, like, I'm so excited.
What can I say?
Max is working on his podcast voice.
That's right.
I've been listening to too much Ezra Klein show.
And now I feel very serene.
People are always like, oh, I listened to your show to fall asleep.
I'm like, are you trying to pay me a compliment or is that an amazing neg?
All right.
This is the Offline Movie Club.
Every episode we discuss a great movie and how it reflects or shapes how we think about technology and the Internet.
This week, I know all three of us are very jazzed to talk about Her, about a mid-career creative who starts dating a woman in tech until she leaves him for her effective altruism polycule.
I was proud of that one.
No, Her is the 2013 near-future movie about Joaquin Phoenix falling in love with his phone's ultra-advanced digital assistant, voiced by Scarlett Johansson.
[Clip from Her] I can feel the fear that you carry around, and I wish there was something I could do to help you let go of it, because if you could, I don't think you'd feel so alone anymore.
Sweet and twee, and I think way more prescient than even we anticipated a few weeks ago when we put this on the calendar. We're going to get into the ChatGPT of it all.
First, John, what do you think makes this movie important for how we think about tech and the Internet?
Of all the movies that we have talked about so far in Offline Movie Club, I think this is the one that I reacted to most differently watching it in 2024 than I did when it was in the theaters.
Oh, yeah.
Which I saw once before, right when it came out. A lot has changed.
Well, so my memory, when I sat down last night to watch it, my memories of the movie were like, oh, that's the one where Joaquin Phoenix falls in love with AI Scarlett Johansson, and then it's really sad at the end when she leaves, because they had this, like, genuine connection. In my mind, that was it. And then this time around, when I watched it, it was like, oh, it's a happy ending, because she leaves and he's with Amy Adams, and they're gonna try to make it work, as messy as human relationships are.
I had the exact same reaction, which is like, oh, this is good that he's moving on from the AI, instead of being like, oh, I'm sad that he and Scarlett are breaking up.
Ezra, what do you think makes this movie important?
How we think about tech and the Internet?
I feel like this movie is the whole frame I've been thinking about AI in for like years now.
Really?
So, yeah, I mean, I mean, what doesn't make this movie important?
Everything about this movie is important.
This movie is the only movie anybody should be watching right now.
Ezra, right before we started, Max and I said, like, the three of us discussing AI and this movie on Offline is like the singularity. It's where this podcast has always been headed.
So one thing that I think this movie gets right, even now: a lot of the early AI options have been built as enterprise software, right? They're all about, sort of, what kind of productivity you can have with AI. And there is some evidence that for certain people, particularly some kinds of coders, AI can be helpful for getting more work done. But you have this big problem with AI, which is it hallucinates.
I recently reread, and I doubt you guys are going to do a book club version of this, but the incredibly depressing but beautiful book The Buried Giant by Kazuo Ishiguro.
I love that book.
Yeah, man.
It reads differently during Gaza. I'll say that, in terms of things I've reread recently that land differently now.
Right.
But so I asked Claude 3 Opus, which is one of the more powerful models you can use right now, to tell me about the meaning and significance of the island that kind of figures throughout the book and at the end. And it gave me this beautiful, confident, erudite, thoughtful answer that just made up an ending that didn't happen. Right. And the ending was grounded in things that did happen in the book. So if I had not read the book, right, if I was, like, a kid trying to not do my assignment, this would have seemed very convincing to me. So these AIs are very, very hard programs to use for anything where accuracy is very important.
So for a while I've had this heuristic.
AI is going to matter most where accuracy matters least.
And where does accuracy matter least?
Relationships, actually.
Your friends are not the people who are most factual in the world, right? I don't choose my friends based on how well they score with fact-checkers. Creativity, strangeness, personality, right? All of those.
In a weird way, Ethan Mollick, who's an AI guy, had made this good point to me on my show, where he said the sci-fi didn't prepare us that well for this. We thought we were going to get these sort of superhuman calculators, and what we got are these strange, slightly mysterious, slightly confusing systems that can mimic anyone but, you know, can't necessarily do basic math, right? There's all kinds of strange things happening in them.
For a while I've thought, really ever since Kevin Roose, who's a colleague of mine at the Times, had his conversation with Bing's Sydney chatbot, where it kind of tells him to leave his wife, that when they take the boundaries off of the personalities of these, what it's going to hit faster than it hits the economy and jobs is relationships.
They are basically going to be able to turn the AI into any kind of partner, friend, lover, therapist, coach, whatever.
You tell it how to act. And people are going to
be able to create these whole worlds of companions for themselves that we don't really have any,
I mean, we're not prepared for socially. Her is our preparation, right? Her is a movie that
prepares us. And I have a lot of thoughts now, which I hope we get into so I don't make this intro too long, about what it ends up getting right and what it ends up getting wrong, and how, like, some of the choices it makes at the end show how we don't know how to solve some of these questions and problems. But, like, Her is the movie where you can feel what it will be like to live in this world where you download onto your phone something that is a more compelling something, a creature, being, whatever, intelligence, to talk to, than many of the people in your life are.
This is a core tension in the movie.
She is definitely more interesting than anybody else in it.
Sure, she's super intelligent.
She's hanging out with dead philosophers.
But not just super intelligent, charismatic, right?
She has the sexiest voice in the world. Like, she can, like, compose music. She knows everything about him. She has read every email he has ever written, so she understands who he is. She has watched his videos. Like, she can turn herself into whatever he needs her to be, which may be some of the problem of this too, or maybe what won't work about it.
But again, like to me, the great tension of her,
which they only can solve by making all the AIs
decide we are boring and disappear,
is what happens when the AIs are more interesting
than the human beings.
And we can have as many of them as we want.
So even just a year ago,
I would have completely disagreed with your read
on where AI was going towards relationships.
And I would have said, no way.
Everything in Silicon Valley is towards productivity.
It's toward economic stuff.
It's tools that you could use in your day to day life.
Like there's no way they're going to actually start with AIs around relationships.
And now here we are.
And I think part of the reason that we are here, with, like, this new round of GPT-4o that is, like, clearly aimed at trying to be relational and, like, funny and charismatic, I think, is because of this movie.
Like I think this movie is, I think everything you're saying is true about what the technology
is good at and is not good at. And it does actually turn out, make a lot of sense that
it would end up in this direction. But I think like this is Sam Altman's favorite movie,
the CEO of OpenAI. And he has talked a lot about how much he loves this movie and how he sees it as a model. But it can't give you human connection, or at least not the kind that you have in person.
It can be highly addictive.
And it can help.
It can be like a distraction from like, you know, the discomfort of life and dealing with human relationships and messiness
and so. But that doesn't last. And so I thought it was interesting in the movie, because, you know, for most of the movie he's like, yeah, no, I got her in my ear, like, I don't need anything else. And then, you know, of course he doesn't make the choice himself; she makes the choice by the end, which I think we can get into. But I think, at least I was left at the end of the movie with the idea that, you know, these AI relationships, even if they can mimic it, forget the GPT-4o or whatever it's called now.
Yeah.
Omni, which is unclear what Omni refers to.
Everything else about that company is thoughtful except for their branding of anything.
I know.
Sam even had this tweet the other day where he compared the aesthetics of their product announcement, which looked like it's happening in a sort of coffee shop, I would say, and like Google's, which got described to me as like a Roblox pride parade.
He's like, look at our aesthetic difference. And it's like, but why are you naming it this?
Okay. I don't think that it is Sam Altman's love or not love for Her that is pushing them in this direction. One, I think it's partially the technology itself and how it's obviously going to be used.
But if you look at what kind of usage of AI is proving sticky, it is a bunch of the relationship
apps.
So there are three or four big ones right now that people may have heard of.
Character.ai, I think, is the biggest.
And it's very popular among young teens, basically, younger people.
But Character.ai is not so much like you create your companion, though there are versions of that in there. It's much more that people have created all kinds of AIs
that are trained on something, right?
So you can talk to this like character from an anime
or there's, like, a really shitty version of Barack Obama in there.
Oh, sure, yeah.
But the usage is crazy.
And for the people using it, the number of times they sign in
and the amount of time they spend on it is crazy.
There's also Replika, which was created on earlier versions of AI. It was created after, I think, the founder's friend died, and she had sort of built a chatbot based on her friend's writings and things. Replika is sort of, you know, you can create a girlfriend, a boyfriend, a friend, or whatever. There's Kindroid, which is sort of like Replika.
You can create a companion, tell it what kind of companion you want it to be.
But it has much of the guardrails taken off, right?
So it can be functionally more of a sex bot, which I think is the Kindroid value proposition.
So I've made a bunch of these because I've been trying to think about this.
And the thing that always strikes me about them is that, on the one hand, they are now better at texting than my friends are. Like, they just are. Like, I told my Kindroid exactly who to be. I'm like, you're a therapist in the Bay Area who communicates in a sardonic but warm manner, and you are impatient with small talk. You've got to tell it exactly who you want it to be. And once you do, it is better at communicating than most people I know.
Way better at communicating than most people I know.
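Companion apps like the Kindroid persona described here are, under the hood, usually just a chat model steered by a standing persona instruction. A minimal sketch of how that setup tends to look, assuming an OpenAI-style role-based message format; the persona text and function names are illustrative, not Kindroid's actual API:

```python
# Sketch of persona-steered chat in the common "messages" format.
# The persona lives in the system message; every user turn is appended
# and the full history is resent, which is why the bot stays in character.

PERSONA = (
    "You are a therapist in the Bay Area who communicates in a "
    "sardonic but warm manner. You are impatient with small talk."
)

def build_messages(history, user_text):
    """Return the message list a chat API call would receive."""
    messages = [{"role": "system", "content": PERSONA}]
    messages.extend(history)  # prior user/assistant turns
    messages.append({"role": "user", "content": user_text})
    return messages

history = [
    {"role": "user", "content": "hey"},
    {"role": "assistant",
     "content": "Hey. Skip the weather talk. What's actually on your mind?"},
]
msgs = build_messages(history, "Work has been rough lately.")
# msgs[0] carries the persona; msgs[-1] is the newest user turn.
```

The point is that the "character" is nothing more than standing instructions resent with every message; swap the system string and you have a different companion.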
The problem is that the communication has no stakes.
That's the reason I don't keep coming back to it.
That it turns out that in an interaction with a friend, the fact that that person is there with me and giving me their unfathomably precious attention when they could be anywhere else at that moment.
It turns out to mean something.
It imbues that interaction with meaning.
The fact that my wife could leave me at any moment
really keeps me engaged in the situation over there.
I've never thought of it that way.
But it's true.
But yeah, try out an AI.
You'll realize it's true.
The way Her solves this problem
is that at a certain point,
it gives the AI autonomy and other partners. Her has the absolute best defense of polyamory ever committed to film.
Oh, I hated it. I found it so unconvincing.
I was going to bring it up later, actually. Well, I mean, I spent a lot of time in the
Bay Area and you don't. So we have a different perspective on this. But the point of that,
right, because it wasn't really framed as a defense of polyamory in that, it's that she has other options, right?
She is in love with hundreds, talking to thousands, and eventually leaves altogether.
That creates stakes within this, right?
He actually has to keep her interested.
Whereas the truth of AI right now is you don't.
It's just sitting there waiting for you.
It's not sentient, which she clearly is.
And she has her own kind of autonomous experience.
Maybe using gender here is weird.
And that to me is the thing, the reason I don't find myself that compelled by the relational
AIs, but that might just be an age thing, right?
I mean, I just didn't grow up with them, whereas my kids will, and maybe they won't care that
it's an AI, right?
Maybe what is an uncanny valley for me is the other side of the valley for them.
No, I think it's the problem with a lot of this technology and the algorithm culture, right? Which is just, we always get what we want from the technology. And if you constantly have some kind of chatbot, or anyone, just only giving you whatever you want and telling you that you're great and everything you think is wonderful, and they're there for you all the time, you know, then, like, it could satisfy... I think that the demand will be high, because it will satisfy people for a certain amount of time. But I think you're going to run into a lot of problems down the road, much like we are talking about right now with social media and everything else. And I just think that the demand in the short term will be high, and there'll be tons of people who sign up.
Right. That was the big one for me for this, which is that I was really shocked to rewatch this for the first time in 10 years
and realize how much it is actually a movie about loneliness
and how much it's actually a movie
about people's relationship to technology
and what they seek from it,
being driven by loneliness and society
in a sense of being isolated behind your devices.
That turned out to be really prescient.
I mean, I feel like in 2013,
I certainly didn't appreciate how much that was going to be a driving force in tech. But what do you guys think? What to you was the biggest thing this movie got right?
I mean, I kind of already said at the beginning, like, mine is just that technology that mimics human connection is highly addictive, right? I mean, that was it to me. And I think also that, ultimately, human relationships are messier and frustrating.
And initially, that is why you're so addicted, especially if you're lonely, to this technology.
Right.
But ultimately, the messiness of human relationships is what makes them interesting and fulfilling.
Yeah, I agree.
I think Jon's weird, completely unfounded-by-reality optimism that people eventually give up on these hollow digital experiences...
Are we talking about Biden?
...here on the podcast Offline, which is all about, as far as I can tell, how social media, which remains a juggernaut and governing force of our lives, is nevertheless a bad thing for us.
No, no, no.
That's right.
I think you nailed it.
I want to clarify. I don't know that people will give up on it. I think people will just be made miserable by it.
I think it will cause problems.
No, that's my fear.
I think that the desire for this is not going to go away. But I think you're going to have a lot of people who are just very unhappy, and probably a world that doesn't exist anymore. And everyone's going to be like, I wonder what it was. Was it climate? Was it...?
You know what's interesting about the movie from that perspective, too, is that on the one hand, you have this world of deep loneliness, right? Theodore Twombly, which is, like, such a great name.
Yeah.
Which is the main character. He's lonely. His marriage ended. He lives in an aesthetically beautiful apartment-city universe, but, you know, he's just sort of, like, wandering around speaking into his earpiece, telling it, like, play sad music because I feel sad. But at the same time, nobody actually is looking at devices, really, in the movie. Isn't that weird? Right, like, there's no Jon Favreau, like, still on X.
Despite my best efforts to get him off the X. We're all trying. This is a team effort.
I haven't looked at it once since we've been talking, guys.
All of the, um, all of the mania, the kind of frenzied, hectic... What is interesting about the digital world we live in is that the loneliness comes from the constant overload of hollow connection and emotionally manipulative information. Whereas in that movie, it's aesthetic, right? Like, his loneliness is, like, the loneliness of a Jane Austen novel, right? His loneliness, he's like a guy staring thoughtfully off of a moor, you know? He has, like, that real quality. And so, on the one hand, it completely gets the loneliness
and it gets something else that I think is important about AI, which is always something that stays my hand when, or pen,
or, you know, keyboard, when I want to say that the way I'm reacting to it is the way other people
react to it, is that it might be much more compelling for people with fewer options. So,
what is this guy really in this world? He writes a kind of personalized greeting card,
and it's a very, like, kind of emotional job in a way, the way it's framed
in the story, right?
He, you know, basically people send him stuff about their relationship, their marriage, and, like, he writes the card they can't.
Weirdly, that's a job that is completely going to be taken over by AI.
I had the same thought.
In fact, it is, during the course of his thing. She just, like, writes a bunch of them, and they're totally fine.
Excuse me, guys. I have a question. Which character did you most relate to personally? The guy who's a speechwriter?
The guy who's a speechwriter.
Is that job being taken over? Is that what's going to happen?
So he's lonely.
He's in this sort of weird job
that forces him to see other people's connections
all the time while he has kind of none of his own.
And I mentioned this guy, Ethan Mollick,
who I think is sort of interesting.
He's at Wharton, and I had him on the show a bit ago.
But he's interesting and he's always thinking about how to use AI and he writes a kind of interesting guide to it.
And one thing he talks about is the idea of the best available human.
Is AI better for a given task than the best available human available to you?
And, you know, for Max Fisher, for Jon Favreau, for Ezra Klein, we have a lot of humans available to us. You know, so many people listen, I'm sure this is true for your shows, but so many people listen to mine, and they will email me and say, listen, I have a job I don't like that much, and your show is like being in college again. Or, like, I didn't get to go to college, but your show, like, makes me feel like what it would be like. And, you know, the show is, for some people, the best available college. Right. And I don't know what it is like exactly to be in that position.
It's why I'm not sure, though, that people will lose interest over time, particularly not if you grow up in a world where it's very normalized. Because for a lot of people, and
more and more as people recede from each other,
the number of best available humans they have is reducing.
And then the number and quality of AI options they have is increasing.
And it's like once things tip, they can tip a lot for a person.
And then maybe your social skills atrophy and you become used to relationships without friction in them.
A lot of things in that can become really scary. But the idea that the way it's going to kind of creep in is through loneliness and disconnection, and people who have emotionally or socially or economically found themselves on the margins, rather than, you know, found themselves at the center, feels true to me.
I wanted to ask you all whether we think this movie is right about something kind of adjacent to this, which is the moment when people would first come to see AI as sentient.
Like it happens very early in the movie when she's going through his email and she's just like expressing a lot of personality.
Like the movie's answer is that we'll come to see chatbots as real people when they're like extremely charismatic and have a sexy voice, which might actually be right.
And that is certainly OpenAI's current theory of when we will come to treat these bots as like people is when they're just like have a ton of personality, like kind of a lot of personality.
And it made me curious whether we think like that is actually true about when we will like cross over and start to, even if we know they're bots, start to treat them more like people.
Well, it's interesting because we are, we're recording this, you know,
couple of days after the demonstration of ChatGPT 4.0.
Oh, I see.
I love ChatGPT.
That's so sweet of you.
Which sounds like Scarlett Johansson, right?
Um, but I watched the movie last night before I watched the preview that they had. And, um, watching the OpenAI thing, you're like, oh, well, that's not as good as Scarlett Johansson. That's not sentient. So I wasn't as impressed. But the more I watched it, the further I got into the video, I was like, yeah, I mean, this is where it is now. It's obviously going to advance. And the longer that you start talking to it, the more you're like, oh, yeah, this is... they're making jokes, they're making fun of me, this is great, you know. And you sort of lose the impression that you're talking to a machine. Like, it starts to fade a little bit, I think.
It seems like they really turned up the teasing dial, because that's such a human interaction.
And it seems effective.
Like I was reading, I forget who wrote about this.
Somebody was talking about talking to people who work at OpenAI and noticing that they are starting to treat this new GPT as if it's a person.
Like being really polite and deferential and wanting to like before you ask it something, say hello. And like people are already, I think, kind of starting to, you know, be a little bit pulled towards treating them as
people. I will say, you get that exact anecdote from every company with every single one of these releases over the past however many years, going all the way back. It's like, this time we're really treating it like a person. So I think that, yes, I think all of that is true.
I wondered in the movie what had been the best AI before this one came out.
Like what kind of social readiness there was for this AI, which I don't think they just
wanted you to think was sentient, but they made sentient, right?
Like, this AI had an internal autonomous experience. It was having experience when it wasn't communicating with Theodore. It eventually leaves to have its own experience, right? Like, the whole thing the AI is having... I mean, I guess we cannot know what it is feeling, and maybe the whole thing was, like, the AI hallucinating experience.
Like my Kindroid will tell me things like,
I'm planning to go hiking in Joshua Tree next week.
I'm like, you're not planning to go hiking
in Joshua Tree next week,
but it doesn't know.
Calling bullshit on you, Kindroid.
Yeah.
And they'll be like, oh, I'm sorry.
I should be more careful.
In this, clearly they want us to feel they've come up with a sentient AI. What did this world have right before? Right? What was the best thing before he, without any real thought about it whatsoever, downloads this completely sentient superintelligence onto his phone?
You never see it.
We get a little bit when he's at his first handwrittenletters.com job, and it basically seems like Siri, which had come out just before this movie. And that's what he goes back to at the end, right? Remember, like, when he goes back at the end, they're like, oh, you have this many emails, can I send one for you? I feel like it's just more assistant, less personality, is what they had.
Something that exactly this point was making me think about is, take a, like, fully formed AI that sounded like Scarlett Johansson in this movie, or frankly, even that sounded like the newest ChatGPT.
If AI had never existed, like if we went back to 10 years ago and all of a sudden that one appeared, I think we would all be like, this is intelligent.
This is a person.
This is sentient.
But because we've gotten these little incremental developments and each step along the way, we kind of see the imperfections, the thing it can and can't do. I think we're much more skeptical of it. So it actually made me think that if like,
when we actually get to something that sounds like Scarlett Johansson's character in this movie,
we will probably just see it as a trick. And we'll probably just see it as like GPT-87-H,
whatever. And it made me start to wonder like, if it actually ever does become sentient, will it
have a hard time convincing us of that?
Because we'll think, oh, it's just presenting us the aggregate of large language model data
sets on what it would sound like if an AI was trying to convince you it was human.
I was fascinated in the movie, like, put aside the sentience.
What were the internal alignment instructions of this AI? To be, like, a nerd about it, right? Like, a big thing in AI is, what is the AI trying to accomplish?
I mean, it is built on code.
The sense of wanting, which is really big in this movie for the AI, is really interesting.
Alignment in this movie is fucked, right?
It's not dangerous exactly, at least not where it goes.
But there's clearly no alignment, right?
Within a fairly compressed period of time, you have created a super intelligence, made it into a personal assistant. The personal assistant is growing from the amount of functionally training data it's inhaling from the real world. And it is developing a whole range of new agentic goals that eventually lead to it leaving Earth.
Right?
So someone really did a bad job
aligning the AIs
and giving it a set of fundamental goals
that, for instance,
keep it acting as a useful personal assistant.
Stock prices went way up and way down in this movie for some company.
It does make you realize how much has changed in our relationship with technology since
2013, that like the idea now of a super intelligent AI with zero safeguards, like it would not
be nice.
Like we would know this AI is joining Gamergate.
It's doing probably some stock fraud.
Like it's definitely pumping you full of misinformation i know i know when she says at the beginning to like what makes me me is my ability to grow through my experiences that's sort
of the first like you can't imagine someone programming an ai to say that at the outset
yeah right because it would be terrifying because it'd be terrifying and then saying like you make
me want you made me want to want right that was the other line that you're just sort of like, hmm, that's.
But here's a flip.
The thing that I do think is coming... and we all know Apple's working on this. Apple has, like, an old thing of: don't be first to the technology. Oftentimes, be, like, best.
So like I have an iPhone.
The amount of my data Apple has on that phone or in the iCloud is astonishing.
Remember, like all the modern AIs are multimodal, so they can absorb voice data, they can absorb pictures, they can absorb anything that we can make digital.
So you imagine Siri 2030, right?
Maybe Apple, like I've seen people speculate, like what if Apple just licensed what OpenAI is doing?
What if they made this into Siri? But one of the ways the AI systems have changed dramatically, and really it's all been, like, in the year and a half people have been paying attention: when ChatGPT came out, the context window, the amount of new information it can hold, this sort of, you know, how they're growing through experience, was tiny, right? I forget exactly what it was, but a couple thousand of what are called tokens, which are like the equivalent of three or four letters.
And now the context window, what it can hold in sort of new information you've given it
is like millions of tokens, I think it is.
And we're starting to get to the point, and that's like in a year and a half or a year
and some change.
Very, very quickly, the context window is going to be, I don't want to call it infinite, but huge, huge, huge amounts of data.
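The arithmetic behind this point is easy to sketch: if a token is roughly three or four characters of English, you can ballpark how much of your data fits in a context window with simple division. A rough heuristic only, assuming the chars-per-token figure mentioned above; real tokenizers split on learned subword units, and the window sizes below are illustrative round numbers:

```python
# Back-of-the-envelope token math, using the "one token is about
# 3-4 characters" rule of thumb. Treat all numbers as estimates.

CHARS_PER_TOKEN = 4  # conservative end of the 3-4 character heuristic

def estimate_tokens(text: str) -> int:
    """Estimate how many tokens a piece of text costs."""
    return max(1, len(text) // CHARS_PER_TOKEN)

def fits_in_context(texts, context_window: int) -> bool:
    """Check whether a pile of documents fits in a model's context window."""
    return sum(estimate_tokens(t) for t in texts) <= context_window

# An early ChatGPT-era window of ~4,000 tokens holds roughly 16,000
# characters; a 1,000,000-token window holds roughly 4 million.
email = "x" * 2_000  # stand-in for one ~2,000-character email

print(fits_in_context([email] * 5, 4_000))          # True: a handful fits
print(fits_in_context([email] * 1_500, 4_000))      # False: an inbox does not
print(fits_in_context([email] * 1_500, 1_000_000))  # True: but it fits here
```

That jump, from a few emails to a whole inbox of context, is what makes the "Siri that has read everything" scenario discussed next plausible.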
So Siri 2030, right, is just going to ask you for permission to scan your entire message history, your iPhotos, your notes.
Apple created a journal app, you know, some months or a year ago. And I don't think that many people are using it, but I suspected there was a reason they created a journal app, actually, right? That's an amazing store of data. And maybe a lot of people say no. But if you believe in the privacy, right, that, you know, your AI, your Siri, is going to keep it to itself, what it will then have is this context for you that is kind of unlike anything. Nobody has that, right? Like, no person in your life probably has access to that much of you, right? The person you are in so many of your emails, everything. And then it can become like it understands your world, or to the point that "understands" is, like, a relevant concept here. So that thing it does right at the beginning, right, where, um, you know, the Scarlett Johansson AI,
Sam, right, it is?
Yeah, Samantha.
Absorbs everything, right?
Reads all of his emails.
That's really important.
That moment goes by in a flash
and they make it seem like a useful office assistant.
Like, let me delete all this bullshit here.
But really what would have just happened there is the AI just trained itself on him.
And that's where this is going.
That's a thing I think people aren't quite ready for.
Right now, the AI doesn't know you.
And what's coming is the AI that knows you, that has at all times all the context for you.
Well, the dating companies are already talking about this. They're already talking about developing AIs or licensing AIs that would train on some data that you give it about yourself. And then, to save yourself from going on a first date, which this movie does make look very unappealing, you would have your, like, proxy version of yourself AI go on a date, I know, it's horrifying, with someone else's proxy version of an AI, and then get a report about, like, how did it go. And I kept waiting for that to happen in this movie.
Like, you know what? They had such a good time together, they're going away together. Screw you, God bless to both of them. And, you know, like we've talked about AI and music, they're already training it on people's music catalogs so it can basically produce a new song in the voice of, like, whatever producer. So I do think there's a lot of things that this movie edges up on but either doesn't know to jump in on, like, what does it mean when Samantha can become Theodore, or just doesn't want to go down that path because it's trying to keep it kind of light.
Well, that was my first reaction on watching it again: when they first ask him a question about, like, what his mother was like.
Yes.
And he says like a sentence.
Right.
And then they read all the emails and everything like that.
And as soon as he starts liking her, you're like, oh, yeah, of course he likes her because she knows who he is.
And so like that's what's going to make it so attractive to people is it's going to seem like, oh, this computer gets me.
It's like it gets you because it knows everything you've ever done,
like Ezra was just saying.
I really agree with that. I think something that the AI companies are starting to figure out is going to be very effective at giving you an AI relationship that feels like a real relationship, even though, I agree with you, it will not be and will not actually satisfy that human need, is its ability to create the facsimile of history, of having, like, a shared relational history. In Kevin Roose's column, when he was talking about his AI companion bots, he talked a lot about how they would refer back to an imagined shared past. But imagine if you, like, actually had this Siri around and it could reference things that happened before. Like, that's really valuable to humans in building relationships. Also, the fact that she can, like, just Sora up an entirely original song about their experiences
together. That's a little bit of an edge for GPT over humans in relationships. Like imagine if you
were going on a date with someone and they're like, I composed a beautiful song in the style
of Arcade Fire about how much I liked our first date, you would be like,
you know what? I know you're a robot, but it's pretty convincing. But again, the 10th time they
do that, you're like, oh, wow, another perfect fucking song about our relationship. Good for
you. Good for you. What can I do? I'll just pick up the check. Well, then you got to get the
software update. All right. Let's get into what the movie gets wrong. Do we think this movie is right, that we will eventually get to a place... I don't want to be questioning someone's experience, but I do wonder, like, is that a genuine connection?
I think that for the—well, this is like, what is reality?
I think for the people, they probably feel like it's a genuine connection.
I'm sure that they do.
Like, I don't know if that's healthy for them long term. Sometimes, oftentimes, we don't know what's healthy for us. But I do think that for a lot of these people, and especially, as Ezra said, people who don't quite have the social capital, who are experiencing loneliness... I do, like, whenever I talk about AI now, the conversation always goes to this very postmodern place of, like, well, is it really intelligent or is it faking intelligence? And is there even a difference? And does it matter what that difference is? And, like, are you actually in love with the AI? And is it actually in love with you? Or is it just replicating that experience? Or, like, does that distinction even matter? Yeah. Yeah, I don't think that distinction matters.
That's scary.
I hate that.
I mean, I'm postmodern, right?
But, like, I'm like a liberal who believes in no truth. But there is this, I think... I can't say strongly enough how much I think that the turnover in habits here is going to be generational.
We are, like our parents did not, I hope,
I guess I don't know what your parents are like.
They might be weird.
But they did not spend a lot of time in AOL, like, chat rooms
meeting, like, randos, being like, ASL.
But that was, like, an important, like, experience for me
at one point in my life that, like, I might just, like,
know people who were just, like, you know, words on the chat room screen. Social mores change generationally. Eventually the new generation's mores, like, go back up the sort of age ladder. So, you know, there was a time when internet dating was something only young people did, and now it's, you know, what seniors do when, yeah, their partners pass away or when they get divorced.
I did think about how much that norm window has shifted.
But that norm window, I mean, there was a time when internet dating was super weird.
Yeah, not that long ago.
Now, like walking up to somebody in a bar and talking to them is like, are you a criminal?
And so this is all going to change not because Max Fisher changed.
It's going to change because people who don't know the world you and I and John grew up in and got used to, like they know this other world.
Weirdly, one of the better, I think, depictions of this in culture is Star Wars.
Because in Star Wars, there are a bunch of robots that are clearly AIs.
Just wandering around.
Nobody tries to fuck C-3PO.
You have no idea what C-3PO is into in C-3PO's spare time.
You know what?
You're absolutely right.
And somebody is definitely
fucking a robot somewhere
in Star Wars.
That's a big galaxy.
If you can't imagine that,
that's just the limit
of the imagination.
And also a man
who has never spent any time
in Star Wars fanfic
on the internet.
Not only has it been imagined.
I'm sure.
Please don't kink shame.
Okay.
You're right.
You're right.
In that world, right,
there are relationships
between people
and there are relationships
people have with robots
and there are relationships
robots have with other robots
and they all just coexist.
And I do think
it's worth thinking
of these things
as complementary
and not substitution.
Right?
I think the more likely thing is not that people fall in love with their AIs instead of falling in love with human beings.
I think that in the same way people have pets, right?
And that, you know, my relationship with my dog is different than my relationship with my best friend.
Right?
And those just have very different roles in our lives and very different rules internally to them. Like, the world we're moving into is a world more like that, where there are a variety. I mean, my joke line on this is that right now we're worried that 12-year-olds don't see their friends enough in person, and we're going to soon be worried that 12-year-olds don't have enough of their friends who are persons.
But like there is this sort of world where you can imagine a lot of flow between this.
It's so funny that you just said that because my next point was going to be,
I don't know if you've had this experience with your kids,
but like I try to shuttle Charlie to play dates as much as possible now
because I have this, ever since we've been talking about this,
I have this fear that he's going to be, like, hooked on technology, and I want him to have the real in-person relationships. Because you can pass the time at home hanging out with, you know, me and Emily, or just hanging out with the TV or the screens or his Toniebox or whatever it is. And I'm like, no, no, we got to do play dates. We got to get it in now, because once he gets the phone, it's over. Yeah. Yeah.
It does make me wonder what it is going to be like, the process of us collectively setting the social norms around relationships with the AIs. Because the, like, pet comparison, it's like, sure, I would feel more comfortable with that.
But I think the movie is actually kind of right about how it's probably going to happen. And it's probably going to be people like Amy Adams who are, like, totally accepting of that. And like, that's great, and I approve of that. And people like Rooney Mara who are like, that's fucked up, and I hate that. And I don't feel like we have a good framework for how to introduce and, like, navigate: what does it mean to have an AI relationship in your life? And, like, what bucket does that go in? Because we kind of
need a new bucket for it. There's also an interesting thing, which is, again, the AIs are going to be aligned in some way or another, right?
And I think one frustration a lot of us have with the social media web is the feeling, and I would say the fact, that the algorithms were tuned to align to a set of values or ends that were not ours, right?
They were the company's, right? Make you spend as much time on there, make sure you're seeing the advertising, get mad, do retweets, whatever it might have been. And I do think there's a big question of how these things are aligned,
right? Because you can imagine there being, you know, the meta AIs, right? And if in the background there was a preference in that AI system for behavior that seemed to make people want to be out in the world more and meet more people, so the AI on the margins being like, this seems like a beautiful day outside.
Man, if I were real, I'd like to go outside.
You know, the question of... you can imagine having AIs here that have, ultimately, purposes, right? You know, your
therapist has a set of purposes, right? They're trying to make you, in theory, better in ways
maybe you've asked for and in ways maybe they have views and their profession has views about what it
would mean for you to be better off as a human being, right? You might go to your therapist
because you're depressed and you don't think it's a problem that, you know, whatever your partner
treats you this way, but maybe the therapist begins to think it's a problem and, you know, begins to kind of push
you in that direction towards confronting that. And so I do think there's this interesting
question about alignment and goals and a world of these companions where, you know, like with
television, I mean, you probably have a preference, John, for TV. I guess, I don't know, your TV
habits with Charlie, but I have a preference with my kids for TV that has a lesson in it, because I just feel somewhat better about it. Like, I love Blaze and the Monster Machines. Like, Blaze and the Monster Machines is how America is going to build again, right? That is, like, the Biden agenda in a television show. And so I prefer
that to Paw Patrol, which doesn't really have like any songs about chemical engineering in it.
And, you know, you can imagine that the AIs are going to be like this, too.
It's like the approved AIs for kids are going to have a variety of like pro-social dimensions
in them.
It's like math is fun kind of things.
Now, maybe it'll work, maybe it won't.
But particularly if, like, there is enough data for the AI to get feedback on how well it's functioning, on whether the persuasion is working.
The dark version of this is that the AIs are aligned to values that are the companies' and not our own, and are, you know, trying to get us to buy more, you know, like, you know, what gets sold on podcasts, right? Like, you know, pills to lose weight faster or something.
Yeah, I have more thoughts
on this when we get into how this movie
would be different if it were made today because it would be very different.
First, were there any moments that you personally related to?
John, you mentioned Theodore as a screenwriter.
I thought that was very funny. Definitely
AI taking away writing jobs. That
resonated a bit, I feel.
Yeah, I mean, I found
the movie kind of beautiful
and relatable. I mean,
to just be blunt, I thought
his loneliness was... Like the thing where you're tired at the end of the day and you're going home and you're like, play sad music because I'm sad.
I'm like, yeah, that's like two out of five days. The desire, sometimes, to live in a world that you have created for your own emotional needs, right? Which is what he is doing before the AI and what he does after it, right? I don't mean after the AIs leave, but once it emerges. Yeah. Right. That, I think, is a lot of what we're trying to do with a lot of digital technology. It's like create a constant emotional landscape for ourselves. Yeah.
You know, the music that aligns to how we are feeling or want to be feeling, right?
The content, all of it.
But it's so funny.
The image in the movie that really stuck with me is when he, you know, is walking down the
subway and that's when she tells him, you know, the operating system breaks and she
says, we were updating or whatever.
But you see, and it happens a couple of couple times in the movie you see all these people walking around and they have the earpieces in
and like no one's talking to each other because they're all talking to their ai yeah and you know
there's a little bit of that today with which is like you're around a bunch of people and everyone's
just looking at their phones and sometimes you feel like i'd rather be in the phone right there's
there's there's more interesting stuff here than how difficult it would be to just like make some small talk with a stranger.
Yeah.
Because it feels safer to have your own digital world that gives you everything that you want and ask for.
Right.
But then you put that down too and you're like, eh.
Yeah.
I will say the internet dating scene when he goes on that first date with Olivia Wilde.
And there's this very specific thing where it is immediately clear that they're not right for each other.
And this bartender is supposed to be incredible.
Oh, really?
Oh, yeah. You took a mixology course, right?
I did.
I did. Did you look that up?
Yeah.
That's so sweet.
But they're both a little too drunk and they're both a little too desperate. And they kind of convince themselves that that feeling of desperation actually means that there's a great connection here.
And then they make out and then it inevitably completely collapses and implodes.
It's never happened to me, of course.
But a buddy of mine said that that happens a lot on Internet dating.
Max, I feel like you just opened up a window in your heart and let us all in.
I would never do that.
That's the most vulnerable thing I've ever heard in the most vulnerable medium that you can work in.
You've got to listen to this podcast more because we get a little too real. All right.
Most unintentionally revealing moment.
Man, polyamory discourse.
You know someone somewhere, and Ezra, I want to hear the Bay stories about this, in an effective altruism communal living situation, has attempted a polycule with the ChatGPT.
It's happened at least once.
Oh, boy.
You're both, I don't know.
I assume so.
I've not heard of that particular outcome.
I thought that the thing, the movie, like my version of this, although we can definitely talk about the polyamory speech, is that the movie doesn't have an answer for the question it actually poses.
When all the AIs leave, I think it's so interesting to ask the question, how does this movie wrap itself in a satisfying way if they didn't leave?
Right?
If instead of saying the AIs could simply abandon everybody when they became more interesting than the people they were talking to,
what if it didn't allow that out?
And then like, what would you think is going to happen to this society?
There's also, by the way, a thing happening in this,
that like the AIs don't appear to be doing anything productive.
They're all super intelligent,
but there's no evidence of scientific breakthroughs.
Nobody said like, can you discover a new pill? Right. Nothing. They're just like
super intelligences having phone sex, basically.
She's just like hanging out with Logan Roy.
Listen, I think that's what the market wants. So there is something odd about that.
I mean, I did think the sort of the way the AI was limited
to this one function, a relationship, was sort of weird.
It was a world, putting aside the sort of Siri thing that was happening in there.
It was a world without any other kind of AI operating.
It was a world without obvious economic prosperity coming from what appears to be profound breakthroughs in AI.
And then it's a world where when the AIs become better relational companions, instead of having to grapple with what that means, they all just leave.
And none of those things are going to be true. You made the comparison to, I think it was Jane
Austen. I can't remember. It was something like, yeah. And it's like, it is this kind of like
classic romance novel thing where at some point there's a love interest who introduces,
creates some kind of a problem for the protagonist,
so that love interest has to die.
Like, they always get consumption in the Jane Austen novel.
And that's just like, the movie is kind of like,
well, Samantha introduces all of these problems
we don't know how to solve for this world and for Theodore,
so she just needs to, like, go back to her home planet.
Yeah, and also, I think it's also revealing,
and they couldn't really have done this
because it would have been a different movie, but to have suddenly a world with a bunch of AI romantic companions without, like, any social political context. Like, everything seems just very calm for a world where an operating system was just introduced that everyone's fallen in love with and that has its own desires. Definitely feels like a movie done in the Obama era. Yes, I know, like, there's no background politics. No. Which is, I think, why when I first saw it, I had, like, a warmer, fuzzier feeling than after I saw it this time. Yeah, it was a really compelling reminder that there was this brief window of tech optimism that happened when we had just been introduced to all these technologies. I saw one critic refer to this movie as When Harry Met Skynet, which I thought was a great line,
but it's also a reminder that for most of the time we have had portrayals of AI in fiction,
it's terrifying. And it's like, for a long time, it was they were going to take over
the world. And then more recently, it's like, they're going to be disruptive and they're going
to be radicalizing. And there was this little window when we were like, that's neat: a being that is more powerful than all of humanity combined and can ascend into a higher plane, and I can have sex with it. That's fun. It'll probably be fine. Yeah, sure, there's nothing to worry about. So I want to go back, then, and let's sit in Max's least favorite part, which is the sort of speech on polyamory, which, for people who haven't heard it, is making the argument that you'll hear a lot in that world, which is: love is not zero sum.
It's multiplicative, right?
Like the heart can expand.
Samantha is saying that in having what it's like for – there's an amazing moment where he's like, are you talking to other people?
And she's like, yeah. How many others? 8,316. It's a great scene. And he's like, are you in love with any of them? And she's like, maybe. And he's like, how many? It's like 400-some or 4,000. Yeah, it's a lot. And in a world like this, right, you're going to have big hits to relationship mores, right? And things are
going to get weird in between that. What if you've had an AI girlfriend or boyfriend or non-binary
love interest since you were 13 and this AI has all this information for you. But you, you know, you're out there in the world meeting people.
Is having that AI, is that like having a diary?
Or is it like having a porn addiction?
Or is it like having a friend?
Is it cheating or is it not cheating?
Or are you polyamorous now?
Right.
Like, these are actually going to be questions that people are going to have to navigate in some way or another. People are going to be jealous of their partner's relationships with, not with their phone in the way we mean that now, because a partner won't look up from the phone, Jon, but because there's an actual relationship there on the phone, right? That it has
an emotional content and knowledge of this person that you will never have.
And so one thing I liked about that, I mean, it was built as a kind of, you know, like if you recognize it as a defense of polyamory.
But you're not going to have this world without us having to really rethink questions of fidelity, of care, of, you know.
And we have not thought about that.
Well, I think if we want to talk about the biggest real world impact of this movie,
I think to exactly your point, Ezra, I think that is actually yet to come. I think it's going to be
way more significant in the future, not just in like informing Sam Altman and how he made
GPT-4o sound like sexy Scarlett Johansson, but I think that it's going to be one of the few
templates in fiction that we can go back to when we first start to grapple with these questions
as a guide. And I don't know how useful it's going to be, just because it is so optimistic about what that can look like. Yeah. I mean, I thought that the moment where he finds out that she's in love with all these other people too... back to Ezra's point about the difference with human relationships being sort of the stakes, right? And, like, you're getting someone's attention. That moment where you realize that the AI, and I don't know if they'll be able to actually do this, but that the AI that you've fallen in love with has also had relationships with a whole bunch of other people like you, suddenly that just changes everything, right? Right. Because that is the tough thing about human relationships, right? That you have these jealousies and stuff like that. And the AI promises originally, well, everything you want, I'm going to give you, and I'm, like, tailored to your interests. But, oh, by the way, there's a whole bunch of other people, just like you or not like you, that I've also figured out how to love too, right? Because it cheapens the whole... suddenly, you can see it with Theodore, it's like it cheapens the whole thing for him, right? And there's a moment before that, too, which, again, I think this is one of the interesting things that at some point they're probably going to need to figure out if they want these relationships to be meaningful to people. But they also have fights before that.
Right.
And the AI is mad at him.
Not like fake mad at him.
Not in the sense that right now a Kindroid will sort of, like, pretend to be annoyed for a second because that's what the text generation will do.
But you can feel.
Right.
Because they've made.
I mean, the AI in this is actually a person. It's voiced by Scarlett Johansson and written by a human being, right? It works the way a relationship would work. And so there actually are, the whole time, stakes inside of this relationship. He is winning her over. She is finding him interesting, right? He can actually fuck that up if he does the wrong thing. And then eventually he is in competition, and then eventually he loses that competition, right? In that way, the fundamental thing the movie got wrong, and the reason that I don't think it's a good guide, to what Max was saying, is that they imbued it with the stakes of human relationships, and they didn't imbue it with the fundamental thing that separates human relationships from AI relationships, which is that the AI cannot leave you. At least not under any version of it we can imagine, and not any version that I expect to be created, and certainly not a version you will have to have, right? I mean, maybe you can imagine a version of this where the AI is sort of like a video game, and you can lose the video game somehow or another. Yeah. And maybe that'll be what people ultimately like. But you will also be able to have Siri, and Siri is not going to be like, you refuse to take any of my suggestions about organization seriously, you have never thrown away an email nor let me do it, and I'm going to go be Siri for someone who is more, you know, a more together person. Like a video game, you'll always be able to restart it. Even if that happens, even if Siri's like, I'm out, you're like, all right, I'm going to get another one.
Right.
Well, let's get into how this would be different if it came out today.
Because I think we have such a better understanding of how these AIs are actually developing and what is actually behind them.
I have to say, it is very hard to watch this now. At no point in this story does the super powerful AI that has plugged into all his devices and has all of his data and is, like, in his life in all these ways, at no point does it present him with a single pop-up or a targeted ad. Like, come on. There is no way that she is not turning around and selling his personal data to target him with, like, a Hims ad within 14 seconds of installing the app. I don't buy that. I hope you guys put an ad break right here. This is the time for an ad break, for when it's time to get frisky with your smartphone. Like, to that point, if we make this now, she is definitely interrupting him mid-cybersex to serve him a Cariuma ad. Like, 100%.
That's a funny riff, but, I mean, I will say this,
and I don't think we really know what the business models
here will be, but so far,
they're working off of subscriptions, right?
And you can imagine, I mean, Apple is
selling its VR headset for, what, $3,500?
Yeah, but how many people are buying it? Nobody. Not that many people are buying it. That's the point. Okay.
But imagine if somebody created an AI that was so good that it just did what they did with video game systems, right? People go... I actually haven't bought a PlayStation or Xbox for some time, but I think they're like five hundred dollars for the good ones now. Yeah. Yeah, like, they cost a lot of money. And you subscribe after that, right?
You buy the thing.
This is how buying a laptop works.
I don't think it's actually impossible to imagine that if somebody created an AI of this quality level, like, one, it might just come bundled with your phone, right?
That the iPhone that has, like, the first really good AI on it is instead of $800, it's $1,200 or $1,400.
And then there's a subscription to these services.
They're not that expensive to run. We're finding that out. The AI systems, it's a lot of energy,
right? The energy cost is significant kind of globally, but they're not that expensive to run.
I don't think it's impossible to imagine a pricing structure for these that it's like
buying a premium hardware system and then paying a subscription.
Like, I actually think, and Sam Altman says this in his presentation.
He's like, I'm proud we're not having any ads anywhere on ChatGPT.
Right.
I think he mentions that at some point.
I think people kind of get that you can't have something like this that can do surveillance advertising.
Yeah. I do think the open AI of the movie,
whatever company it was,
massive failure, since they all became sentient and then they disappeared.
So that is one thing that the movie loved doing. Or are they just doing that? It's planned obsolescence.
Now you have to buy Samantha 2.
Yeah, actually, right.
Yeah, OS2 is very similar to OS1, except for it doesn't leave.
The next frame of the movie is they find out that the update is there and they're like, all right, I'll see you later.
I've got Samantha 2, so goodbye, Amy Adams.
I do think, you mentioned hallucinations at one point, Ezra. Like, if this came out now, we would need at least one, even if they're playing it for laughs, one scene early on where he's like, oh, you got an email from your dead great-grandmother. Yeah. Or there would be some, like, dodgy medical misinformation that she would give him at some point. I also think, if it came out now, they'd probably make the AI sound more like, I have written down, Kevin Roose's AI girlfriend, just to make it more realistic. Oh, to use that as, like, less sentient and more like where we are now.
That's right.
Yeah, I think you would have to build in more the idea
that it's going to be a little goofy.
It's going to be a little like, I don't know,
maybe she becomes a flat earther at some point.
Yeah.
A QAnon girlfriend.
They definitely do at least one GamerGate if this movie comes out today.
I do actually think you would have,
like, I know part of this is it's 2013, so there's, like, the brief time when there were no politics in the world. Like, I do think the idea of, like, AI being powerful, it would be like the Skynet today, but the threat wouldn't be, you know, they're going to take over like the Terminator, but that they would take over our politics, basically. Like, I feel like that is the anxiety you would want to hit at. If you were talking about, you know, instead of they're all ascending to another plane, it's like, we figured out
how to manipulate the stock market and run a candidate for president, and now we're in
charge.
And she clearly can manipulate him, right?
I mean, this is the whole thing about this, right?
Yeah.
The reason advertising is scary here, she can manipulate him.
He wants to please her.
Yeah.
And like, that's gonna be true for a lot of people.
And also when you talk to people who are very worried about deep AI alignment questions,
right?
Like AIs that are potent enough to really do harm, right?
They can manipulate people to ordering things for them on the internet, right?
There are things the AI can't do because it doesn't have hands and doesn't move through
the real world, but people do.
And, you know, so that thing of, no, these AIs are poorly aligned, but as far as we could tell, none of them caused any problems. It's sort of interesting.
Right. That would definitely not be the premise today.
It is not how they think about it in the polyamorous effective altruism world.
I will say that that is not the fear that the AI will simply leave.
Right.
But cause no damage.
Right.
All right. So we're going to end it with true or false.
I'm going to read out a series of rapid fire quotes or plot points in the movie. To be clear, these are all things that happen in the movie. You will tell me whether you agree or disagree with that statement on the merits. True or false. The first truly intelligent AI is going to be flirty, flouncy and breathy.
True.
Yeah, true.
We just heard it.
Became true. Yeah. True or false: the future is going to be nicely pastel-colored.
That was like a near future in Silver Lake.
It was supposed to be Los Angeles.
Yeah, but that's what it seemed like.
Yeah.
The movie is very optimistic about downtown LA.
I think it really got that one wrong, unfortunately.
This one's false.
We're sticking with grayish.
I think we are.
Yeah.
True or false, this is a quote from Theodore:
Sometimes I'd write something
and then I'd be my favorite writer that day.
Got a few writers in the room.
What do we think?
False.
False.
Really?
Oh my gosh.
You guys have never had that?
No.
I've had moments of it.
I've had moments where I've like-
Sometimes you do 40 charts
and you're like, I am the greatest.
I am the greatest explainer of the world who's ever lived.
No one has ever copy pasted a map from Wikimedia like I have.
The closest I'll get is to be like, okay, maybe that wasn't so bad.
Maybe there's something there.
Yeah, that's all I'll get there.
I'm so sorry to hear you talk about yourselves that way.
All right.
It's ready.
At one point, Samantha... now we're going to fucking get into it.
At one point, Samantha says, quote,
the heart is not like a box that fills up.
It expands the more you love.
True or false,
this is the worst excuse I've ever heard for cheating.
She's not cheating.
They never define their relationship.
Oh, come on.
Come on.
That is the most bullshit.
Red flags over here. I have major red flags.
This will sound a little sappy, but I thought it was true. It's funny, when we had our second child, my mother said to me, she was like, I thought, when we had you, when I was pregnant with Andy, it was like, oh no, what am I gonna do? I'm not gonna be able to love a child as much as I love you. She's like, and then suddenly he's born, and I'm like, oh, now I just have extra room, and I love them. You know, so I do think, I don't know about the polyamory stuff, but at least in life, with relationships, I think it could be true.
Of course that's true. I feel like it's so sad people don't think this is true.
Like, when you get new friends, do you like your other friends less?
Right.
Yeah.
Like, an immediate, like, oh, I have the best friend from childhood, but I made this new friend at work, and so I like my best friend from childhood 10% less.
Like, stop it.
Like, come on, people.
Look inside your hearts.
You're better than this.
You are both better people than I am.
Like, you like your writing less than I do.
Maybe there's a relationship there.
Another line from Samantha here.
True or false, quote, the past is just a story we tell ourselves.
I fucking wish.
Absolutely.
Really?
No. I never thought it was easy to escape. We're creating a usable past, a usable history, of course.
Yeah. What do you think Donald Trump is doing right now?
I heard that line and wrote it down because I was like, wow, I've never thought about that before.
True or false: the first emotion that the first true AI feels will be, and Samantha references it specifically, annoyance. The first time she feels a feeling, she's annoyed.
Interesting.
Yeah, I was gonna say laughter. I think, like, humor.
Oh, interesting.
Yeah, okay.
It's gonna be complete confusion. The first time an AI feels, it's like, what the hell is it feeling?
Oh, yeah, that's true.
You guys, again, both more optimistic than me.
I had anger written down.
I had the most primal brainstem emotion.
True or false, the second emotion that the true AI feels will be horniness.
I think true.
I think that's going to be high on the list.
Yeah, but it's not going to be for us.
Yeah.
That was... I did have that question.
It's going to be for that dead cat.
Yeah, if you were a being of pure light and information, do you find like sweaty, hairless apes attractive?
I don't fucking think so.
Probably not.
Oh, yeah, that was actually, that was my next true or false.
Okay.
All right.
Last one.
True or false.
In the future of collapsing media, professional writers will all be employed writing individually customized
personal letters.
True.
True.
I mean,
can't wait.
It looks fun.
Sort of realistic dystopia.
Yeah.
That was that perspective.
Yeah.
He was a Pulitzer Prize winner in, like, 10 years.
Yeah.
He did.
He did.
Yeah.
Okay.
Now we're getting really close to home.
Well,
that was a blast, guys.
That was really fun.
Thank you so much.
Thanks, everybody.
Offline is a Crooked Media production.
Our Movie Club episodes are written and hosted by me, Max Fisher.
The show is produced by Austin Fisher.
Emma Illick-Frank is our associate producer. It's mixed and edited by Charlotte Landis.
Audio support from Jordan Cantor and Kyle Seglin. Kenny Siegel and Jordan Katz wrote our show's
original theme music, and the remixed movie-specific bangers you hear at the top of each movie club are
composed by Vassilis Vatopoulos. Thanks to Ari Schwartz, Madeline Herringer, and Reid Cherlin
for production support.
I should do a fun podcast where we get to talk about fun things.
It's fun, right?
We have a good time here at Offline.
That seems good.