TED Talks Daily - Love, intimacy and connection in the age of AI | Bryony Cole

Episode Date: March 7, 2026

Relationships were never meant to be efficient, says sextech expert Bryony Cole, and yet AI companions are increasingly designed to be exactly that. As intimate relationships between humans and AI become more common, Cole challenges us to think more deliberately about how we shape our connections to machines — and with each other. (This conversation, hosted by TED's Whitney Pennington Rodgers, was part of an exclusive TED Membership event. TED Membership is the best way to support and engage with the big ideas you love from TED. To learn more, visit ted.com/membership.) Learn more about our flagship conference happening this April at attend.ted.com/podcast. Hosted on Acast. See acast.com/privacy for more information.

Transcript
Starting point is 00:00:03 You're listening to TED Talks Daily, where we bring you new ideas to spark your curiosity every day. I'm your host, Elise Hu. Relationships were never meant to be efficient, says sextech expert Bryony Cole. And yet AI companions are increasingly designed to be exactly that. Last month, we shared her TED Talk on this feed about what happens when you develop an intimate relationship with an AI and why it's time for us to think deliberately about how we shape our connections to machines and with each other. And so the question is no longer, will we fall in love with AI? It's, what happens now that we already have?
Starting point is 00:00:42 Bryony recently joined TED curator Whitney Pennington Rodgers for a follow-up conversation about how the rapid advancement of AI is forcing us to expand our understanding of love and what it might look like. They discuss the opportunities and ethical questions that come with it and what we can do to build more connected, authentic relationships in these uncharted times. How comfortable are we with a world in which, yeah, everything is automatic, everything is efficient, including our thoughts and our needs? And do we want to live in that world? That's coming up right after a short break. And now, our conversation of the day.
Starting point is 00:01:29 Hello and welcome, TED members. I'm Whitney Pennington Rodgers. I'm a curator here at TED, and I am thrilled to be here with you for today's live event. As technology has become more embedded in our daily lives, it's also quietly reshaping something deeply human: how we connect, how we experience intimacy, and how we understand relationships. From dating apps to AI companions, the tools we use are influencing not just who we connect with, but what we expect from connection itself. To dig deeper into what this moment means, not just for sex and romance, but for all of our relationships,
Starting point is 00:02:04 we're joined by someone who has been at the forefront of these conversations, sextech expert Bryony Cole. Hello, Bryony. Hello. Thanks for having me. Yeah, thank you so much for being here with us. To speak to this realistic part of the moment we're living in: I think when people hear the terms sextech or AI relationships,
Starting point is 00:02:25 there is this sense that this is futuristic or maybe something that's very fringe. And you talk a lot about what the landscape looks like now and what we're seeing. Can you share some of the technologies that are out there at this moment, and maybe some surprising ways people are using them in their everyday lives? Oh, yeah. I mean, I think that it's almost like the technology isn't surprising me anymore. It's us, right? So in the last year, in 2025, there were over 300 AI companion apps developed and released on the market. And I actually don't find the features that interesting anymore. Certainly in the future, we'll see that they will be able to track our eye movements
Starting point is 00:03:09 and, you know, they already track tone of voice, those sorts of things. It's actually just the amount that are on the market, and who is using them, and how we're using them. So, as I said before, you know, you can't go too far without realizing, oh my goodness, this is actually not just the cliche guy in the garage. This is everyone. It's definitely young people using it more and more, but also older people that are feeling lonely and looking for support, people that understand technology as well as people that are a little less dexterous with it. It's people that are using it for caregiving, or have lost a loved one, or are in long-distance relationships, or divergent users, or people that are navigating grief or illness. There's all these different
Starting point is 00:03:54 reasons now why people are turning to AI for companionship. And I think, you know, I was at a 40th a couple of weeks ago, and it became very obvious that we're all using AI for these emotional moments as well, in front of people. So not just for that companionship, but for writing a 40th speech. There's just something you can tell when you go to a moment that's meant to be about you reflecting on your friend, and this is going to be amazing, and then: I think that guy used AI to create that talk. Or, you know, I've even heard anecdotes of people using it to write eulogies. These things in relationships that were never meant to be efficient, and suddenly we're saying, oh my gosh, this is infiltrating my day-to-day relationships. I think when I look on the frontier,
Starting point is 00:04:48 which is, you know, three years ago, what I was seeing was, okay, we're all going to use this for companionship. I think even in the last couple of months, that idea has shifted even more, and this is really a peek into what's around the corner, particularly if we aren't more careful, and we can get into that around the guardrails or the questions we'll ask ourselves. But I think what I'm seeing now, and always a good signal, is
Starting point is 00:05:14 what are these sensational stories in the media around this topic? And it's about people deciding to have families with AI. And so it's not just about your AI girlfriend or boyfriend or you're falling in love or you're dating an AI for companionship, it's now this extension of what was previously exclusively a human domain. And there was a recent story about a young boy who's decided he wants to, you know, have this AI wife and they want to adopt a human child in the future and raise an AI family. Now, I think that's going to be pretty difficult legally to do.
Starting point is 00:05:56 But we're seeing those ideas. People are going, I could raise a family. And then, you know, as I said in the talk, this isn't just about, you know, the lonely men that have decided to use this for companionship or for this idea of raising a family. We also have a gender split where women are also using this. And in that sort of future-facing context, women that are facing fertility issues and can't actually have children are thinking or talking about what it would mean to raise an AI family. So if I can't have a child biologically, what would it mean to have AI children and simulate that? So I think that's where I'm more and more surprised about
Starting point is 00:06:43 how people are deciding to use it. You know, the technology is already here; we're not putting that back in the box. And quite often that then dictates the next innovation or features we're seeing. I mean, there's so much there, and it's incredible to see that there aren't really these patterns of, it's always this person doing this thing. It really does seem like it runs the gamut. And when you think about adoption, are you seeing that there are certain misconceptions for people who maybe are slow to pick this up, or have some hesitation or, you know, questions around the usage of AI in this way for companionship?
Starting point is 00:07:26 What are some of, I guess, the common misconceptions you're seeing coming from that segment of the population? That are resistant to using AI for emotional needs? That's right. I'm thinking of my dad right now, who just turned 80, who will ring me up and be like, Bryony, can you use ChatGPT for this thing for me, on my behalf? But I actually think the resistance there is more around, well, how do I use it? But it's becoming so easy to use that, you know, that adoption curve is so much quicker than, say, 10, 15 years ago, when social media first came out and people didn't really... what's a status update? How do I use this social network?
Starting point is 00:08:13 and I was getting those same calls from my parents. I think if you're a critical thinker, you're already over the adoption curve, and you've started using it and saying, oh, this is actually providing a lot of support, almost frictionless in terms of how I can talk to AI. The resistance or skepticism there is more about, oh, this feels a bit extractive.
Starting point is 00:08:40 What is going to happen here? And I saw some questions in the chat there about, you know, what happens when suddenly our most innermost thoughts are now landing on a tech platform, and what does that mean for me? So I think there's first a question where people are sort of thinking about this in terms of, where's my data going? Which we already saw with things like Facebook
Starting point is 00:09:05 or when people are using phones and downloading apps. There was that, oh, I've got to read the terms and conditions. And then very quickly as a society, a lot of us have kind of gotten over that and gone, oh, yeah, just press agree. But there's that initial resistance there. And then, I think, the second thing we think about when we're using this, when we go, I don't know if this actually feels good, is because you're recognizing the dependency that you have on the platform. Because it feels really good, but what's the actual sacrifice?
Starting point is 00:09:38 What is the cost here if I go and use AI for my therapy needs rather than talking to a therapist? If I use it for a difficult situation at work or a friendship, how is that going to weaken the bonds? And I think that's what we're thinking about: oh, is this actually supportive, or is this extractive? And I think the next step I'd like people to take is, you know, thinking about, well, how could it be useful to me?
Starting point is 00:10:06 Is it, you know, a tool to practice vulnerability, or articulating my needs, or building some sort of emotional literacy? And where does that line exist for me, where I can sort of hold back? Does that make sense? Perfect sense. And it sounds like, in the argument of whether or not synthetic intimacy, as you call it in the talk, becomes something that replaces human intimacy or is a thing that, you know, we use to support it, it's a personal decision. You don't think it should be either-or; it seems like they can work together in one way or another.
Starting point is 00:10:46 That's the hope, right? I think that's the hope because I think we're looking towards a future where it is very hybrid and there are going to be people that will, you know, use AI inside their relationships and there will be people that will say, you know, I don't want to do that because I can already see that I want to keep this relationship just for me and I want to use my own humanity to decide about this person
Starting point is 00:11:10 rather than using AI to settle an argument with my partner. I do think it's a personal choice. You know, I keep coming back to social media, but you can't deny it feels very similar, and we're seeing those warning signs of, you know, social media addiction. Now we're seeing it with AI companionship addiction. I think it's going to come back to our own personal frameworks we build of, like, how do I want to use this? How much time do I want to spend on this?
Starting point is 00:11:40 because, I mean, obviously the tech companies are building this for engagement. And we see, you know, now Instagram might have pop-ups like, you've spent this much time on Instagram, now's the time to jump off. AI companionship apps are already doing that as well. So they're already putting in safeguards, I guess you could call them. But ultimately, it's on us as humans to decide, at what point do I feel this is, you know,
Starting point is 00:12:09 something I want to use every day or once a week, and to create those rules and those boundaries for ourselves. And it's really important to start thinking about that. And I think the easiest way to think about it is: how do I use social media, and how do I feel about that? How am I going to use AI, and how do I feel about using AI? And you give us some great questions towards the end of your talk that can help us decide these things for ourselves, which is really helpful. And you started to talk a little bit about how this sort of expands beyond romantic relationships, that this is just opening the door for helping us to understand what AI could mean for relationships beyond this and how it challenges our assumptions about friendship or other relationships. What do you think
Starting point is 00:12:57 are some of the ways that this is upending the way we think about connection more broadly, and not just from a lens of romance? Yeah, I mean, every facet, every relationship in our life, I think, is going to be impacted by this, because people are using AI as a tool to either support or sometimes replace it. So yes, certainly in dating and in relationships and our intimate lives, but also, we talked about raising children and families. Parenting is a really good example, where I've talked to people who said, I didn't know if I should change the nappy at this time, so I asked ChatGPT. So we're outsourcing all those little moments. And if you think about, from when you wake up to when you put your head on the pillow at night, how many different people you're talking to... I think the biggest place we're not looking right now, which is kind of funny because we use this so much
Starting point is 00:13:58 at work anyway, is at work and with our colleagues. And I think those relationships at work, especially because, you know, we've moved towards this remote-work, working-from-home situation: how do we build teams when we're outsourcing a lot of those difficult conversations or those emotional moments to an AI? Because at work, it's still about fostering healthy relationships with your colleagues and with team dynamics. And what happens when Sally decides that she doesn't like some of my work and she doesn't want to give the honest feedback, and instead puts that into an AI companion and deals with that, or deals with her burnout or stress from work? I think we're not clocking that that
Starting point is 00:14:48 is actually going to have a huge impact on organisations. And then if you look at the leaders in organisations, how are they using their voice and their own... I always struggle to use the word authenticity, because I think it just gets overused. But if we're talking about organizations leading the future, and those leaders are using AI to do all the visionary work and to have that true authentic voice, and they're outsourcing that to the AI, I think we're starting to lose a lot of the stuff that makes companies, and building amazing products and services, great, which is just the humanity and our ideas.
Starting point is 00:15:30 And this sounds very contrary to any, you know, business best practice, but it's not just about efficiency when we're at work. It is about, you know, what makes us a great team together. And what makes visions truly visionary is that they come from somewhere inside: I think from the heart, you know, from our creativity, our imagination, a sense of intuition, these things that are distinctly human qualities. I mean, you get into this a little bit in the talk, and we have so many questions that are coming in from our members, and I'll start to bring some of those in now.
Starting point is 00:16:09 In fact, there's a good one from Sheila, which feels connected to this. She mentions that if 70% of Gen Z are going to have committed relationships with AI chatbots, how can we rebuild the modern village? Which I think also connects to what you're suggesting around other spaces, other places of community, workplaces and governments and that sort of thing. How do we rebuild that if AI becomes such a presence in our lives?
Starting point is 00:16:37 Yeah, what a beautiful question, Sheila. I love this idea of rebuilding the village. And I think when we talk about rebuilding, we're not talking about going backwards. There's no going back to the village. So we have to update what we really think a village could be now that we have this presence in our lives. And I think it's, you know,
Starting point is 00:17:04 what's historically so beautiful about villages is they just sort of existed. It was about proximity. We were in the same place and space, and there was this level of shared care there. I think today, rebuilding a village... a village isn't like a place that we just happen upon, that we find because we're there. It's actually a practice. A village is something we practice. And what makes a village is frequency of interaction. So that consistency, whether it's, I don't know, the school pickup or the same barista, those sorts of things: it's so important that we have that frequency of interaction. It's not just a one-off event where we're going to meet new people. I think the other thing is, you know, this great quote by Esther
Starting point is 00:18:00 we expect from one person today what we used to receive from a village. And she's talking about all these different relationships in our lives that we can depend on, not just this one person. And now I think the update is instead of what we expect from a village, you know, we expect this AI to be that, we actually have to look at our ecosystem of our relationships and be really intentional about that, right? This is a village as a practice.
Starting point is 00:18:29 And so in that, we want to take the non-romantic relationships in our life and prioritize them as well. So whether that's friendships that are within proximity, or, you know, scheduling calls with friends that are long-distance, having that practice of, who are the other non-romantic relationships in our life that can also provide a village, and being really thoughtful about that. I think that's something that, yeah, we sort of fell off the deep end when we thought a lover's going to do everything. And now we think our AI companion lover is going to be this village.
Starting point is 00:19:05 I think the other thing to think about, and I always try to sort of tread across the bridge of, okay, AI is not terrible, I mean, it can be scary, but it's also got these opportunities, is: okay, well, what is the opportunity here with rebuilding a village and with technology? How can we use the technology as the connective tissue? Whether that, you know, is scheduling the FaceTime... there are some benefits of using an AI companion to schedule in these friendships
Starting point is 00:19:37 or find new, fun, novel, interesting ways to have shared experiences with our village is also really useful. So thinking about, yeah, those frequencies of interaction, that ecosystem of all the other relationships in our life that are human, and making sure they're frequent, and then figuring out, well, how do AI companions fit in this? Is it as a scheduling tool? What is the way I'm going to use AI in a productive way to make my village? And now back to the episode. I love that, and I love you sort of drawing this picture of us, you know, having the village, where we rely on different people for different things. And then that sort of narrows to the
Starting point is 00:20:34 space of one person for everything. And now we're in this moment where we can broaden it back out. And in a lot of ways, it sounds like we can really customize what our village looks like, using AI to support that, like what you can get from certain people. And I wonder, in doing that, are there certain things that you think you lose in those connections when you are customizing things in a way that really suits you? Is there some element of not having control over that that is really meaningful? Oh, I mean, I do think that, like, spontaneity and things, you know, the tough stuff, is actually what builds great relationships.
Starting point is 00:21:15 I think, you know, we more and more want customization. That's just sort of what we've been designing for. But for me, just off the top of my head, I think, yeah, the things that happen that are unintended consequences of our friendship, of being out in the world, of, I don't know, your dog running off and doing a poop on the neighbor's lawn or something. Like, they're really important ways that we build our humanity, and we slip up, and we strengthen the bonds. But I think in terms of using a tool like AI for personalisation, for figuring out, hey, I really love doing this practice with yoga and so does this friend, let's schedule that in, or let's
Starting point is 00:21:58 use this to find out the best teacher. Yeah, I mean, that feels really good. And Muriel asked a question connected to this as well. They mentioned that there's a French idiom that it's better to live alone than in bad company, and AI makes them wonder the opposite: is it better to have a relationship with an AI than being alone with a sense of isolation? What's your feeling around that? Well, I mean, when I hear the word alone, I think about what we talk about so much today, right, in our world, which is loneliness.
Starting point is 00:22:34 And I think about how loneliness isn't really about being alone. It's about disconnection. And so I think the question there is about connection and what makes a rich connection. And while an AI companion can provide a connection, in terms of richness and depth, you know, it can't witness us in the way that humans can. It can provide perhaps a different way.
Starting point is 00:23:06 You know, an AI can recognize when, you know, if it's got the visual compatibility with you, it can recognize when you're about to tear up or it can hear the tone in your voice when you're anxious. And that is a form of connection of being able to mirror and respond, but it can't wipe away your tears or it can't, you know, offer you a genuine forgiveness
Starting point is 00:23:29 in a way that I think deepens connection. Do I think it's better than total disconnection? I actually do. I think that, you know, we have seen these cases in the research that find that it has actually helped people through difficult times. Now, that doesn't necessarily have to be for years or in a relationship. It might be someone navigating menopause, or navigating the loss of a loved one, or these really hard moments where, I think, in today's world, humans are burnt out. And sometimes we are burnt out on relationships
Starting point is 00:24:05 and showing up for each other. And even though I talk about it being the magic, I think if there's an ability for, you know, you to have a connection and go through a hard time and have that companion provide some level of that, I do think that's worth something. And I think for the people on the other side that are trying to support you,
Starting point is 00:24:24 if it allows them to show up more fully and show up for you as well, I think that's a useful way of having a connection with AI. Maybe not a deep reciprocal one, but certainly a reflective one. Well, all of this raises a lot of questions around safety, security, and you touched on some of this already, but even bigger questions around morals and what's acceptable when we're thinking about approaching AI
Starting point is 00:25:02 relationships. What do you think are some of the ethical questions we should be asking right now, at this moment, before these tools become even more embedded into our lives? It's, you know, who's designing them? Who is behind this? And remembering this is a business model. You know, I think we already are in the practice of telling our deepest, darkest thoughts to, you know... it started, I guess, with journaling, right? And writing them down. Or with people that we barely know, with strangers, we already do this. And certainly then we saw it with the internet, and we're in the practice of this.
Starting point is 00:25:41 But I think in this moment in time, we'll either decide, well, are we going to demand more of these companies? Or are we going to think about how we're going to share those thoughts? Because I think that's a really important question around data and privacy that we've been asking technology companies for a very long time.
Starting point is 00:26:01 And I wonder, and hope, that this is the moment that we wake up and ask, well, where is that going? Because those are the most vulnerable thoughts that we hold. And they're with a company, even if it feels like a human. They're part of a business model. And so I think that's really important. I think consent is embedded in that and important to ask about. And who are the people that are designing these and that are writing these LLMs? We need the future to involve, really, everyone, to involve so many different points of view. And, you know, minority groups, caregivers, psychologists,
Starting point is 00:26:41 this can't just be people with technical skills. It needs to be people that have a deep understanding of the human experience and of empathy and of psychology, to really understand how we navigate this in an ethical way. And, you know, for the most part, it is the Wild West. It reminds me very much of my earlier career in sextech, where there are very few guardrails, and people are scrambling, governments, institutions are scrambling, to catch up and to legislate around this.
Starting point is 00:27:11 But in the interim, I think the gate needs to be with ourselves: how are we going to use these tools, and how do we invite more people with more diverse experiences into these companies, to look at them and to build with them? Because there's no not building them; they're happening. I mean, in thinking about these conversations around safety for the users of this, it definitely brings up questions around children or
Starting point is 00:27:49 young adults who are not used to navigating some of these complex spaces. And we have an interesting question from Jennifer M. around whether there's a way to harness AI for sex education, for young people who are uncomfortable speaking to adults. How do you think about that use for it? Yeah, I mean, there's a company at the moment, Mojo, that is doing this for sex therapy, and I think there are great uses of this. And we've seen this previously with technology and tools, to allow people to understand topics that are taboo, that are off-limits, that feel like, oh my gosh, am I normal,
Starting point is 00:28:25 and be able to ask someone about that, with someone being, you know, in this case, AI, in a way that feels safe. And so, yes, there's a great opportunity for that, and we're already seeing that it exists. And I think it's really helpful for things like sexual health and STIs, and understanding those things that, especially children, feel a little bit embarrassed to ask about. I do think it should be a supplement, though. I think with sex education, so much of the curriculum that's missing is actually about the things that make us human. It's the listening.
Starting point is 00:29:04 It's the communication. It's the showing up. It's having the difficult conversations. It's asking if you got that STI test. And so, that critical skill of being able to communicate with a partner, I think there still needs to be that human element. And that's why sex educators and sexologists are so fantastic, because the way they talk about these things to young kids and deliver it is in a way that gives people permission. And I actually think we're going to need more of that skill,
Starting point is 00:29:36 not just around sex and intimacy, but around relationships. I see that with kids now and we joke about the Gen Z stare, right, and people not being able to read social cues or have that body language, that's only exacerbated by, using AI and not having those cues of learning body language and physical touch and having those difficult conversations. So I would say yes and hybrid model, absolutely. It's an incredible tool just like the internet has been for developing our knowledge, but it also needs to be paired with the things that keep us distinctly human and in relationship. Well, I'm thinking about sort of ensuring that companies are really thoughtful about security and safety and developing these products.
Starting point is 00:30:34 Ashley M. asked, how do we get AI companies to train their LLMs on consent principles? You know, what's the incentive for them to do that? Pressure. Public pressure. You know, and again, I think we just look back to what's worked and what hasn't, and there's plenty that hasn't, still, with social media. But I think organizations like the Centre for Humane Technology are doing an incredible job, and that activism, that pressure point, is the only sort of lever that we can pull on as everyday people who aren't in the companies and aren't driven by the bottom line, to demand that
Starting point is 00:31:14 and to ask for that. And sometimes that means using your own, the way you use it or not use AI to send that message. But I think that's sort of what I look to and try to be a voice for is asking for that. There are so many questions in here, and we won't get to all of them that are connected to this idea of safety and security. And I think that this just speaks to what you're suggesting, like when you're thinking about these conversations around social media
Starting point is 00:31:45 and people really having real concerns around privacy, I'll pose one to you, though, here that I think is useful for us to think about. Peter O asks if you're worried about our most intimate thoughts being recorded and used by our AI friend provider into the future forever. Where does your thinking fall in that space right now? It does. I mean, yes, I am worried. and I also look at our usage of, you know, things like everything that we do already with the apps and things. And I think people are going to do this.
Starting point is 00:32:28 People are going to use these tools and, you know, have this experience where AI will have access, like with friend to their deeper thoughts. In fact, it won't really be an AI companion in many ways. It will be an AI extension of ourselves, right, that can, almost hear our thoughts and whisper in our ear as we're walking down a busy street. And then, you know, there's not even a need to pick up a screen and to text that AI. It's actually happening with, you know, a little chip here. I think that, you know, or the friend, which if you haven't seen that, is sort of a necklace
Starting point is 00:33:05 that lives around your neck and takes in your whole world. And so this is, I would say, Peter, this is already happening. People are already using this. I think in the future where we're going to be having these conversations, not as I've fallen in love with my AI, but actually AI is an extension of me. Some people are going to think that's utopia. A lot of us are going to think that's dystopian and choose not to use it. And I think the reality is the Thrutopia is it's going to land somewhere in the middle,
Starting point is 00:33:38 where yes, there will be a lot of movement before we get to that place. I think there will be people that think that that's amazing. But the Thrutopia here is there's probably going to be some good and there's going to be some terrible stuff that will happen, and this will be the new normal. And so the most important thing we can do right now is remember that intimacy, and our innermost thoughts and sharing those, that is a skill and a system that we need to develop for ourselves
Starting point is 00:34:08 before we start outsourcing it to technology and making it automatic. It is something that we practice and learn how to do and get comfortable with ourselves before we do it with others. And how comfortable are we with a world in which, yeah, everything is automatic, everything is efficient, including our thoughts and our needs? And do we want to live in that world? And it'll be a personal decision. Well, speaking of scaring everyone, yes. Everyone's like, oh my goodness.
Starting point is 00:34:40 But also, you know, I think that there are a lot of opportunities. here, which you'd highlight throughout this conversation and in the talk. And I mean, thinking about the future, if you think just even five, 10 years from now, there's so many things that you've highlighted that are distinctly human traits. And I think that there are a lot of people who are designing these technologies who hope to find ways to have AI take on these traits that feel distinctly human. So if we get this right, and, you know, as a society, what do you think technology could make possible in terms of deeper connections or emotional well-being with AI? Yeah, I think we're already seeing the positives there.
Starting point is 00:35:22 And it is, you know, the step in between, before using a therapist. So mental health care, or the grieving process, or, you know, going through those uncomfortable situations where we can't call a friend or we, you know, keep looping on the same thought. Like today, we see a lot of people using AI companions through breakups. And that, you know, instead of going to your friend and talking ad nauseam about that breakup,
Starting point is 00:35:56 there is a step in between. And so I think there is this moment where we go, okay, it's not a replacement, but how can we use this for positives in our lives that provide a bit of reprieve? Those are sort of the elements that I'm hopeful about while still retaining our humanity, you know. But I think the point I keep coming back to is just intimacy, relationships, they're not
Starting point is 00:36:28 efficient. We shouldn't want them to be efficient. And so how do we, if we just like take the technology off the table for a second, I think, I think a big trend in the future is going to be in efficiency. It's a luxury already, but what does it look like to have slow relationships? What does it look like to build in a way that feels a little bit uncomfortable and a little, like, slowing everything down? And I think that's sort of where I'm hopeful in the future is that we can take a bit of that and that actually feels good. good. That feels different. And for younger generations when I talk to them, that's a foreign concept
Starting point is 00:37:17 because everything's on demand. And so how do we learn when to take technology off the table, and what does it feel like to go slow? Because that's one of the best things about human relationships, when it's, you know, feeling a little bit slow. And then in those moments where we need relief, emotional relief, we can use companions, with the hope that it makes us better in relationships, it makes us better humans. That's the optimistic view that we always see with companies. Oh, you know, you'll download this and you'll learn how to date better.
Starting point is 00:37:56 But the onus will always be back on you. Am I using this to practice a conversation? Am I using this to just completely make it efficient and outsource it? Well, that sounds like an answer to this question from Brian L, about whether humans can ever be as compelling or desirable as the best AI will become. And, you know, it sounds like the answer is that that's not the goal, right, for us to be the same as AI, but to contribute something different. So, I mean, I wonder if you feel like, from the place of how we keep up, is there something we should be trying to do to keep up here? Yeah, I think, you know, the AI relationship and the human relationship really speak to two different sides of our hearts. And we keep, which is so normal right now, fusing them together and thinking about replacement.
Starting point is 00:39:01 And, you know, it does deliver a lot of the ideas of relationship and intimacy that we previously have had from humans. But I think we need to remember that this is a different form of connection. And whether we call it a relationship or a connection, it's different. It's distinct. It offers us different benefits than being with a human, and it can provide support.
Starting point is 00:39:32 But when I think about real relationships, this idea, you know, we talked about friction, being uncomfortable, all of that stuff that feels uncomfortable in the moment, a relationship is about a two-way transformation. And really, you can never have a two-way transformation with an AI companion. So just really understanding how we can start to, in our minds, conceive of these things as really two distinct relationships that will dominate the future, but seeing it as an entirely new category.
Starting point is 00:40:11 And I think that helps also for young people, when we think about education around relationships, right? I think there should be education and curriculums in school about what it looks like to have a healthy relationship with AI, just like what it looks like to have a healthy relationship, and sex education, and all those things. I think this is where we need to start educating people: this is actually a new column of relationship we've created. And I think that's a really important way to prepare for the future.
Starting point is 00:40:47 What do you think are some practices that can help keep us grounded in our own humanity, in this moment but at any point in time? Yeah, I love that question. I think, okay, so when you're using the technology, I do think it is asking: is this bringing me closer to people in my life, or is this actually isolating me further? I think, you know, in relationships, there is a lost art happening at the moment of curiosity and of listening, of being a good listener.
Starting point is 00:41:24 Because we're used to this world where everything is now on demand and, you know, efficient. I think if you're with someone on Valentine's Day, you know, yes, use AI to plan the date if you want. I think a good conversation to have with your partner is, how do you use AI? Do you have an AI companion? And to talk through how that feels. I think that's going to be a negotiation that we're going to have in every single relationship. Is it cheating? How are you using it? How am I using it?
Starting point is 00:41:58 What gender have you got? What voice? I think that's a really interesting discussion. Maybe not sexy enough for Valentine's Day. But I think you put it on the agenda. And really, this idea of curiosity and listening, it's about presence. And we talk about it so much. But I think that's the thing that sometimes feels hard to do.
Starting point is 00:42:24 We've all got attention spans of goldfish. But how do we stay present with that person? How do we read their facial expressions, their cues? Yes, technology will be better at doing it in the future, but I don't want to lose that skill. And how do I really feel into someone's energy? Because that's a human skill. That's something that we can do really well if we tap into it.
Starting point is 00:42:51 Beyond conversation, there's a universal language there of energy, being in the room with someone, how that feels, how do I feel? And so checking in with yourself, checking in with your partner. It doesn't have to be rocket science, doesn't have to be this therapist version of 20 questions, but just being present. It's important. Well, in this last minute that we have, I know that something really exciting came out of the talk for you. So could you tell us a little bit about what's next? Yeah, great. Well, I continue to go down this rabbit hole and to still talk to people that are at the frontiers and try and make sense of how we do this. A lot of this conversation, and a lot of the questions that we're getting at the
Starting point is 00:43:42 moment is sense making because we don't have that guide, right? We don't have this way to navigate the world. And so we're all trying to look to someone to have the answers. Tell me it's bad. tell me I need to do this, tell me I need to use it for five minutes. And so out of the TED talk and, you know, I shared those questions, which are really just like the beginnings of a blueprint has become a book. And talking about this guide, which I call the intimacy operating system, because it is a system now that we need to own for ourselves. It's like, what are these questions that we need to ask ourselves?
Starting point is 00:44:18 What are the questions we need to ask others? and provide people with a pathway through using this in the future and what the future of intimacy looks like in every area of your life, whether that is dating, falling in love, you know, getting married, but also with friendships and with parenting and with your colleagues at work, how do we navigate a world where there are these two distinct relationships and what does that mean for me? And so, yeah, I'm really exciting to be writing a book on it
Starting point is 00:44:49 and looking at intimacy is not something that we lose to technology because there are plenty of benefits, but something that we as humans can now actively design for. I'm very excited to see that, and I know so many others on this call are we didn't get to all of the questions, but you answered so many, so thank you so much, Brian, for sharing your insights here and for this conversation.
Starting point is 00:45:17 Thank you. Thanks, Whitney. That was Brian E. Cole in conversation with Whitney Pennington Rogers at a TED membership event in 26. Ted membership is the best way to support and engage with the ideas you love from TED. To learn more, visit TED.com slash membership. If you're curious about TED's curation, find out more at TED.com slash curation guidelines. And that's it for today. Ted Talks Daily is part of the TED Audio Collective.
Starting point is 00:45:46 This episode was produced and edited by our team, Martha Estefanos, Oliver Friedman, Brian Green, Lucy Little, and Tonzika Sungmar Nivong. This episode was mixed by Lucy Little. Additional support from Emma Tobner and Daniela Ballerazo. I'm Elise Hu. I'll be back tomorrow with a fresh idea for your feet. Thanks for listening.
