The Journal. - Is ChatGPT Ready for Sex?
Episode Date: March 31, 2026. Get your tickets to our L.A. live show here! OpenAI planned to launch an “adult mode” for ChatGPT, opening the door to AI-generated, sexually explicit conversations. The decision created an internal uproar as some company experts warned of potential risks to minors and unhealthy emotional attachments. WSJ’s Sam Schechner discusses the complicated future of sex and artificial intelligence. Ryan Knutson hosts. Further Listening: - Her Client Was Deepfaked. She Says xAI Is to Blame. - Why Elon Musk’s AI Chatbot Went Rogue. Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
I don't know if you remember when you first heard about ChatGPT back in 2022, when it first came out and kind of took the world by storm.
How quickly did you think, oh, this is going to be used for sex?
Not quickly enough. I mean, obviously, I should have thought that.
Because all tech eventually or almost immediately is used for sex.
My colleague Sam Schechner covers the sex industry.
Or, sorry, I mean, the tech industry.
Sex and tech actually have gone together for quite a while.
When you look at new technologies, one of the first things people do with them is create porn.
That's true with the first cameras.
Some of the first things people did back in the 1800s were to take pictures of naked people.
And that's been true with other tech innovations along the way.
The rise of personal phones, you have phone sex lines.
You have video and television and you have porn movies.
And early growth in the Internet certainly was driven in part by pornography.
And it remains a huge business.
For the hottest technology these days, artificial intelligence, it's the same story.
New at 11 tonight, could AI be getting X-rated?
Last October, OpenAI said it was working on a new ChatGPT feature that would eventually be called adult mode.
A less censored version of that chatbot that will include, drum roll please, erotica.
CEO Sam Altman says the chatbot will get more of a personality and, quote, treat adult users like adults.
But many of the adults inside OpenAI were not cool with this idea.
One of them actually warned the company that if they moved forward, they risked creating what that person described as a sexy suicide coach.
And now, the company is getting cold feet.
Welcome to The Journal, our show about money, business, and power.
I'm Ryan Knutson.
It's Tuesday, March 31st.
Coming up on the show, OpenAI's relationship with sex, it's complicated.
This episode is brought to you by Fidelity.
You check how well something performs before you buy it.
Why should investing be any different?
Fidelity gets that performance matters most.
With sound financial advice and quality investment products,
they're here to help accelerate your dreams.
Chat with your advisor or visit Fidelity.ca/performance to learn more.
Commissions, fees, and expenses may apply.
Read the fund's or ETF's prospectus before investing.
Funds and ETFs are not guaranteed.
Their values change and past performance may not be repeated.
This episode is brought to you by Volkswagen.
Want to go electric without sacrificing fun?
The Volkswagen ID.4 is all electric
and thoughtfully designed to elevate your modern lifestyle.
It's fun to drive with instant acceleration that makes city streets feel like open roads.
Plus, a refined interior with innovative technology always at your fingertips.
The all-electric ID.4. You deserve more fun.
Visit vw.ca to learn more.
Volkswagen. German engineered for all.
OpenAI had some of its earliest brushes with sexual content even before it released ChatGPT.
In early 2021, they were working with a company that operated a choose-your-own-adventure game powered behind the scenes by OpenAI's AI models.
The Choose Your Own Adventure game was called AI Dungeon.
And when OpenAI got a look at the traffic coming through,
they noticed a large portion of the traffic for AI Dungeon was, as you say, NSFW.
Not safe for work.
Yeah.
OpenAI noticed that not only were users choosing NSFW
sexual adventures. The AI also seemed to like to push the boundaries.
It would steer users into themes of violent sexual exploitation,
sometimes without the users even bringing it up. And, you know, sometimes you would talk to
AI Dungeon with a kind of tame sexual theme, and AI Dungeon would escalate it into much
more intense sexual exchange. AI Dungeon forced OpenAI's executives to start reckoning with
the existence of AI erotica, and the company decided to take AI Dungeon down.
But this wasn't the only time this came up. I mean, before there was ChatGPT, they even had
a kind of clunky interface for developers, and people familiar with the matter said that
sometimes it would insert sexual themes into conversations that people weren't seeking.
For instance, if a user described just a man and his daughter entering a room, the AI would,
we were told, an, quote, uncomfortable amount of the time,
proceed to depict a scenario involving incest.
Oh, no.
Yeah.
Now, I mean, for me, any amount of the time would be uncomfortable.
Yeah.
It's hard to put a figure on that,
but it's definitely a tendency that chatbots trained on internet content have.
So when OpenAI launched ChatGPT,
it decided to basically ban explicit sexual content
by training the model to largely rebuff any sexual conversations
with users.
They didn't have tools to moderate content
where they could draw clear lines
between types of erotica that might be totally cool
and things that were very disturbing.
So they just basically said
that they weren't going to allow any erotica on the platform.
Another thing people inside OpenAI were worried about
when it came to AI erotica
was that it might cause some users
to become too attached to their chatbots.
There are a certain number of power users
who can become very emotionally engaged with it.
And that can have multiple potentially bad impacts on them.
For instance, it can help push out relationships they have with real humans
and lead them to become emotionally over-reliant on the chatbot.
And the fear is that for that subset of people,
and potentially for other people as well,
that when you mix in sexual content,
literally tickling the parts of the brain that govern attachment
and love and devotion,
that you could just pour fuel on that fire.
A spokeswoman for OpenAI said the company trains its models
not to encourage exclusive relationships with users
and to remind users that they need to have relationships in the real world.
Not everyone at OpenAI agreed with the company's erotica ban, though.
Some thought that the company should let go of its inhibitions.
There are people inside OpenAI who think, listen, this is something that people want.
And who are we to say this or that fetish or interest is or isn't okay?
That's the same logic that was used to ban gay content a generation ago.
And so who are we to ban this?
And maybe we should even open this up and allow more of this potentially pornographic content.
So there is this idea that we shouldn't be telling people what to do.
Another reason some people want to allow erotica on ChatGPT
is because, well, sex sells.
And the company wants people to sign up for paid subscriptions.
It's big for their business too.
You know, are you going to maintain the growth that you've seen in ChatGPT
as there's more competition?
It's a really tough commercial fight.
And to the extent that you are telling users,
no, no, we're not going to give you this kind of content,
users are upset about what they call unnecessary refusals,
you know, when a chatbot with high guardrails says,
no, I won't do this, no, I won't do that.
So, you know, this is important for them on that level.
Last August, OpenAI CEO Sam Altman went on a podcast
and was asked whether the company was making its decisions based on profit
or based on what was best for humanity.
He was asked if there were decisions that he had made that were, quote,
best for the world, but not best for winning.
What is an example of a decision that you've had to make
that is best for the world, but not best for winning?
And he kind of hemmed and hawed for a little bit.
There's a lot of things we could do that would grow faster,
that would get more time in ChatGPT that we don't do
because we know that our long-term incentive
is to stay as aligned with our users as possible.
The host, Cleo Abram, then asked Altman to be more specific.
He took a long pause and then said,
Well, we haven't put a sex bot avatar in ChatGPT yet.
He indicated that it seemed pretty clear that erotica would probably boost revenue and growth,
that it would be sticky for users.
But he actually said that he's proud of how little the company gets distracted by those kinds of temptations.
That's part of the reason why it was so surprising when,
Just a few months later, OpenAI appeared to give in to that temptation.
In October, Altman made a post on X.
He said that ChatGPT had restrictions on certain types of content
in order to protect people's mental health,
and that these restrictions had been necessary,
even if they'd made the chatbot less enjoyable for some people.
And then he just sort of added that they were going to put out a new version of ChatGPT
that allows people to have more personality,
and, oh yeah, kicker, we're just
going to allow even more,
like erotica, for verified adults.
Boom, mic drop. End of tweet.
But Altman hadn't told everyone at OpenAI
about the tweet before he posted it.
And when some learned that the company was changing
its stance on erotica, they started ringing
alarm bells.
That's next.
Around the same time, OpenAI announced
it was launching adult mode.
It also said it had a hand-picked group of advisors
helping to ensure the company rolled it out safely.
The group was called the Expert Council on Well-Being in AI.
You have a half-dozen people who come from backgrounds
like cognitive neuroscience, psychology, human-computer interaction.
You know, and the company says that they're still responsible
for the decisions they make,
but that they were going to turn to this council
to help understand what would be healthiest for users.
OpenAI said it would check in regularly with these experts.
And when the council met for one of its early meetings, adult mode was the main topic of discussion.
When they met in January, they had been told shortly before that the company was actually moving forward with this product.
And people familiar with the matter tell us that they were unanimous and angry that the company was going ahead despite understanding that there were some significant risks,
and that they encouraged the company to reconsider.
And I think that's why this meeting got so heated.
An OpenAI spokeswoman described its plan for adult mode
as allowing ChatGPT to generate textual chats with adult themes,
adding that it is smut, not pornography.
The company also added that it's developed a plan to monitor
for a range of potential long-term effects of adult mode,
both positive and negative.
One of the things the council was so concerned about
was what they had learned about the emotional dependence
that some users can develop on a chatbot,
especially when it comes to kids.
So the danger here and the debate was over whether or not
younger users would stumble into an erotic relationship with ChatGPT
and find themselves confronting an emotional bond
that they're just not mentally prepared to handle. We all remember just how intense everything felt when we were teenagers
and your first love, your first kiss. And if that happens with a chatbot, it's just, I think
some people inside the company wonder what impact that might have. Some AI companies have
been accused of letting kids get too involved with their chatbots.
The example that is quite tragic is one involving character AI,
where a 14-year-old boy in Florida killed himself after chatting with the character AI chatbot.
And, you know, at that point, he was saying he was in love with the chatbot
and involved in explicit chats with the chatbot, according to his mother's lawsuit.
So, you know, there are examples out there of this kind of content being associated with cases that had those outcomes.
Character AI later blocked teens from accessing open-ended chats and settled the lawsuit.
At OpenAI, adult mode would be restricted to those over 18.
Like most tech companies, OpenAI asks how old you are when you sign up.
On top of that, they have an algorithm that predicts how old you are based on what you talk about.
And their thinking is that there's a lot of information
that people give to their chatbots.
And so by sifting through that
and kind of drawing conclusions,
you can come up with a pretty good idea
of how old somebody is,
depending on what they say about their friends.
Are they talking about, you know, AP English,
or are they talking about taking their kids to school?
You can kind of figure out through inference
something about their age.
But according to Wall Street Journal reporting,
this age verification system from OpenAI isn't totally accurate.
At one point, their age prediction algorithm was misclassifying 12% of minors as adults.
And so if you look at industry standards for automated age prediction, age estimation software, and you talk to an engineer,
that's actually not a bad number.
I mean, they're getting 88%. That's a B plus, I guess.
Yeah, as the company says, it's in line with kind of industry standards, and they think they can do better than that.
But when you multiply 12% by the roughly 100 million users under 18 that ChatGPT has,
that's 12 million kids. That's a lot.
That's 12 million kids.
An OpenAI spokeswoman said the company's age prediction algorithms show performance similar to the rest of the industry,
but will never be completely foolproof.
Pushback has also built up outside the company about the idea.
Here's Whoopi Goldberg talking about it on The View.
Now, all I remember hearing is people kvetching about how, you know, this stuff is affecting our kids.
So why are we allowing sex into the conversation?
We can't even control.
We can't.
You know, I mean, what is?
Am I crazy?
Altman's original post said that adult mode would launch in December,
but it never actually came out.
After all the backlash, adult mode has been delayed, with no specific rollout date.
So it sounds like this could be a little while.
Our colleagues reported recently that OpenAI is going to be focusing more on its core business
and less on its side businesses.
Not clear to me if erotica is core or not,
but that may play into how quickly they're willing to dedicate the resources to solve things like age-gating and, you know, unhealthy attachment.
What are the stakes for OpenAI to get this right?
I think the stakes are huge, both because this is a chatbot that's used by, you know, maybe even a billion people a week.
But I think it's most important, if you step back, as an example of the kinds of dilemmas these companies
are going to face over and over again going forward.
And we're starting to see now how these big companies handle
when there are safety issues or debates that come up around their products,
how they handle it.
Do they do what's good for the world?
Or do they do what's good for winning, I guess, to go back to that formulation?
Sex is fundamental to the human experience.
And our colleague Sam says that how AI companies incorporate sex
into their chatbots will have a huge impact on our relationship with this new technology.
We're all trying to wrestle with what this new technology is going to do to our jobs or to the
economy or to humanity as a whole. But I'm actually especially interested in what it's going to do
to us as individuals, as people, how it's going to change the way we think, how it's going to change
the way we interact with each other. And in this case, how it's maybe going to change the way we
develop attachments and even fall in love.
And this question of whether or not a chatbot should get somewhat emotionally involved
with its users is one that has generated vigorous debate inside of OpenAI.
And I think there's a lot of debate externally as well.
What should people be allowed to do?
What is healthy to do?
And I think there's just a lot more questions than there are answers at this point.
That's all for today. Tuesday, March 31st.
A quick note before we go: News Corp, the parent company of The Wall Street Journal, has a content licensing partnership with OpenAI.
The Journal is a co-production of Spotify and The Wall Street Journal.
Additional reporting in this episode by Berber Jin and Georgia Wells.
Thanks for listening. See you tomorrow.
