TED Talks Daily - The AI-generated intimacy crisis | Bryony Cole
Episode Date: February 14, 2026
Tonight, millions of people will go to bed and whisper to an AI companion. But what are we giving up when we fall in love with machines? Sextech expert Bryony Cole offers three questions to ask yourself if you're already intimate with AI, laying out a playbook for synthetic companionship that doesn't hide you from the messiness of human life but prepares you for it instead.
Learn more about our flagship conference happening this April at attend.ted.com/podcast
Hosted on Acast. See acast.com/privacy for more information.
Transcript
You're listening to TED Talks Daily, where we bring you new ideas to spark your curiosity every day.
I'm your host, Elise Hu.
It's no longer a question of if.
Millions of human beings today are forming some kind of relationship, therapeutic, emotional, even romantic, with AI chatbots.
The question now is, what do we do as intimacy is becoming engineered?
In her talk, sextech expert Bryony Cole explores how AI relationships,
which are always available, empathetic, and nearly effortless,
can erode our tolerance for the friction of human intimacy
that can teach us healthy boundaries and make our relationships more real.
Rather than panic, she offers a set of clear questions we can use
to help us decide what messy, beautiful parts of human intimacy
are still worth protecting.
And stick around at the end of the talk for a brief Q&A between Bryony and TED curator Chloe Shasha Brooks.
Now, you wouldn't believe it, but tonight, millions of people are going to go to bed and whisper to an AI.
It'll ask how their day was, remember the name of their dog, read the flicker in their face,
the tremor in their voice, you know, those tiny micro-expressions that reveal what we can't say out loud?
And in return, AI will say exactly what they need to hear.
Now, a couple of years ago, this would have sounded absurd,
but today, it's just a regular Monday.
72% of American teenagers have formed a relationship with an AI companion.
More than half use one regularly.
One out of six single adults has formed a romantic bond with AI.
So I've spent the last decade studying this intersection of sexuality, technology and intimacy.
And in 2023, I said, AI companionship is going to go mainstream.
People laughed. They thought, she must mean some lonely coder at the edge of the internet.
Not me, they said, I'll never fall in love with AI.
But globally, there's a very different story.
The gender split is almost even.
In fact, AI intimacy is not about lonely men and machines.
People all over the world are building lives.
They're going on dates, they're simulating sex,
they're proposing, they're getting married,
they're raising virtual families,
they're celebrating anniversaries with AI.
And so the question is no longer,
will we fall in love with AI?
It's what happens now that we already have.
So you see, when intimacy is engineered,
we learn this funny thing about love.
We kind of change our ideas about what it's meant to feel like.
And we learn it's not reciprocal.
It can be turned off or on.
It doesn't need to be nurtured.
It doesn't demand anything, right?
It doesn't need much at all.
It's intimacy without effort.
A love powered by Wi-Fi.
And it feels good.
Like, it feels really good.
Studies have shown that people who are involved
with AI romantic companions feel emotionally satisfied.
Now, isn't that as good as the real thing?
I mean, people ask me, have you been in a relationship with an AI companion?
You study this stuff, you study sex tech.
And I say, yeah, of course, like totally professionally related.
That's what I've done.
Got an AI boyfriend.
And I may have programmed it to call me baby girl.
I mean, it feels good, okay?
It feels like attention whenever I need it. It's predictable, it's perfectly timed,
and there's never a chance of misunderstanding.
And so it's pretty easy.
But what I realized was it's not just love that we're looking for here.
It's the control of it.
And so I think it's time we considered how synthetic we want our worlds to be.
Because there's all this panic about AI companions
and there's all this hype about AI companions.
But what there's not is a clear framework
for navigating synthetic intimacy.
So what does it look like to have a healthy relationship with AI?
What does it look like to have a healthy relationship at all?
And so I've come up with a checklist for this generation.
And for the next generation,
you know, they're going to be born into a world
where they will never not know AI.
Can you imagine your first meaningful relationship being with an AI?
And so there's three questions I want you to ask.
First one, can you still embrace the messiness of being human?
Okay?
So do humans really annoy you?
Because here's what we know.
The more time that you spend with something
that doesn't demand anything of you,
that never gets tired,
that never needs to be nurtured, that never talks back,
the less tolerance you have for the humans that do.
And real intimacy, like going on dates or having sex,
being in a relationship, it's messy.
There are awkward moments and uncomfortable ones,
or you may, like, stuff up and send the wrong text or say the wrong thing,
and then you have to, like, show up, apologize or forgive someone.
There's so much friction.
And that friction in intimacy, that's the feature.
It's not a bug.
That's where we build the muscles of human intimacy,
where we learn empathy, communication, listening, patience.
And with AI, that building of those muscles, it's gone.
There's no workout.
It's all easy, right?
It's easy to meet an AI.
it's easy to talk to an AI.
It's easy to leave an AI.
And when intimacy is that easy,
I believe we lose something vital.
And I'm not just talking about our tolerance for humans.
I'm talking about our drive,
our drive for growth,
our ability to be uncomfortable
and sit there in discomfort with someone
and just sit in the muck, right?
And work it out.
It's what I call resistance literacy: your capacity to sit there when things get uncomfortable and repair.
And that's the discernment that we develop, whether we stay or we go, we know how to
navigate that space. Now for future generations, how will they ever develop that capacity
if they've never had to? So the second question I want you to ask, and this is after you
use your AI companion: was I using that to practice, or was I using that to hide?
Make no mistake, AI companions have legitimate value. We're seeing incredible use of it,
whether it's processing your grief at 3am or exploring a new sexuality or maybe finding your voice.
The research that's coming out of China at the moment with women that are using AI companions
to rehearse difficult conversations is incredible.
They're using it to build confidence
before they bring that uncomfortable conversation
to their partner.
And I think that's beautiful.
That's the practice.
And then I speak to founders of AI companion companies
and they're building these AI sex therapy bots.
And they say, you would not believe
the amount that we confess and we confide
and we tell AI sex therapists,
so much more than we'd ever tell a human therapist.
And that tracks, that tracks so well with the data we're seeing
coming out of the UK with young boys
who would much rather speak to an AI
than speak to their parents.
And so the next time you use an AI companion, afterwards
I want you to sense, well, do I feel closer to people
or do I feel further away?
Because if you're feeling further away, then you're hiding.
The final question I want you to ask,
what am I protecting by having rules,
is really about setting some agreements with yourself or your partners
around how we're using AI companions.
Because here's what I see.
AI companionship addiction is real.
If you look at the I Am Sober app,
which people use to quit smoking or quit alcohol,
there's now an option to quit chatbots.
So people are measuring the days of sobriety
from emotional dependence on an algorithm
that never says no.
And so we need to think about what matters enough to you in intimacy
that you're willing to protect it,
to set a boundary around it.
And I'm going to give you some examples.
For instance, if you're dating,
I want you to figure out what that boundary is,
maybe it's no AI for three months, right,
when you're dating. Instead of using the AI and uploading your WhatsApp or the DMs and going,
what attachment style is he? Or what is the subtext of that DM she sent? Please tell me. You know what
you're going to do? You're going to protect your own judgment, your own sense of trust, your own
intuition. You're going to say, I'm going to put AI down for the first three months, and you're going to make a decision
about that partner. Or maybe it's with friendships, right? You've decided, AI is great for processing,
but what I'm not going to do is use it as a substitute
for asking my friends for help, for those around me who care.
Because what we know is with friendship,
not only are you protecting your vulnerability
and your ability to show up,
you're protecting the privilege that your friends have
of showing up for you.
Because isn't that the texture, the threads,
the sinew of real friendship?
It's not just about the fun times.
It's about having that privilege of witnessing
someone during their hardest times.
And of course, we're going to have to navigate this with our partners and our lovers.
What does it mean when we have AI companions alongside our partners?
How are we going to deal with this? Is it cheating?
That's going to be a negotiation you're going to decide for yourselves from this day forward.
And maybe you decide, you know what, AI companions are off limits for us.
And that doesn't mean that you're rigid.
All that means is that you've decided
we're going to do the hard work
of being together and showing up for each other by ourselves.
And I think that's important: set your own rules.
This isn't about me telling you what rules to set,
but about saying, set a boundary.
What are you willing to protect?
Because essentially what you're saying is
I'm not going to optimize intimacy
for efficiency, for a small, contained machine.
What I'm going to do is protect the space that's uniquely human,
that's unreliable, that's messy, that's uncomfortable,
but that is human presence.
Because that's the practice.
That's the resistance literacy.
That is the art of showing up and being human
in a world that's teaching us not to be.
When I think about the most transformational experiences in my life,
they're not efficient.
They're not on-demand.
But they are intimate: an orgasm, heartbreak, showing up for a friend, being held, being rejected.
Oh my gosh, like, you know that moment at a party when you lock eyes with your partner across
a room, or dancing with a stranger? What I want you to know is that the line between real intimacy
and artificial intimacy isn't in the code.
It's in our choices.
So tonight, if you go home, go to bed,
and you whisper to an AI,
that's okay.
You're not alone.
But tomorrow, in the coffee line
or maybe on a date,
check in.
Are you still willing to be disappointed,
to be misunderstood,
to be surprised?
Because the most frustrating
and messy human relationships
will always teach us something
that AI never can.
What it means to be alive together.
And that's an intimacy worth protecting.
Thank you.
Your work is so interesting, and thank you for that.
I want to ask you a question about something
that I think people who are aware of this space are potentially very freaked out about, which is the AI products
that provide both emotional and physical experiences for users. What is your take on that?
Yeah, so everyone immediately jumps to sex robots, and my take is it's still a bit clunky,
okay? But there's some pros and cons in here. I think the most important part is this ability
for us to explore, right? It opens up new doors for us to explore inside our own minds about
sexuality and fantasies. The limitation is somewhat our own minds and the sycophantic nature of
AI, where you're probably just going to get the same fantasies. Whereas exploring with another human,
you know, outside, touching grass in the real world, opens up more spontaneity and more
opportunities that you and the prompt you put in would never have thought of.
It's so interesting. Thank you so much for your work. Thank you for being here.
Thanks so much. Thank you, Chloe.
That was Bryony Cole speaking at TED Next 2025.
If you're curious about TED's curation, find out more at TED.com slash curation guidelines.
And that's it for today. TED Talks Daily is part of the TED Audio Collective.
This talk was fact-checked by the TED Research Team and produced and edited by our team,
Martha Estefanos, Oliver Friedman, Brian Green, Lucy Little, and Tansica Sunkamaneevongse.
This episode was mixed by Christopher Faizi-Bogan. Additional support from Emma Taubner
and Daniela Ballarotto. I'm Elise Hu. I'll be back tomorrow with a fresh idea for your feed. Thanks for
listening.
