Your Undivided Attention - People are Lonelier than Ever. Enter AI.

Episode Date: May 30, 2025

Over the last few decades, our relationships have become increasingly mediated by technology. Texting has become our dominant form of communication. Social media has replaced gathering places. Dating starts with a swipe on an app, not a tap on the shoulder.

And now, AI enters the mix. If the technology of the 2010s was about capturing our attention, AI meets us at a much deeper, relational level. It can play the role of therapist, confidant, friend, or lover with remarkable fidelity. Already, therapy and companionship have become the most common AI use cases. We're rapidly entering a world where we're not just communicating through our machines, but to them.

How will that change us? And what rules should we set down now to avoid the mistakes of the past?

These were some of the questions that Daniel Barcay explored with MIT sociologist Sherry Turkle and Hinge CEO Justin McLeod at Esther Perel's Sessions 2025, a conference for clinical therapists. This week, we're bringing you an edited version of that conversation, originally recorded on April 25th, 2025.

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on X: @HumaneTech_. You can find complete transcripts, key takeaways, and much more on our Substack.

RECOMMENDED MEDIA
"Alone Together," "Evocative Objects," "The Second Self," or any other of Sherry Turkle's books on how technology mediates our relationships
Key & Peele - Text Message Confusion
Further reading on Hinge's rollout of AI features
Hinge's AI principles
"The Anxious Generation" by Jonathan Haidt
"Bowling Alone" by Robert Putnam
The NYT profile on the woman in love with ChatGPT
Further reading on the Sewell Setzer story
Further reading on the ELIZA chatbot

RECOMMENDED YUA EPISODES
Echo Chambers of One: Companion AI and the Future of Human Connection
What Can We Do About Abusive Chatbots? With Meetali Jain and Camille Carlton
Esther Perel on Artificial Intimacy
Jonathan Haidt On How to Solve the Teen Mental Health Crisis

Transcript
Starting point is 00:00:00 Hey everyone, it's Daniel Barcay. Welcome to Your Undivided Attention. A little while ago, I was asked to host a panel for a very different audience than we usually speak to. It was part of a conference called Mating in the Metacrisis, organized by our dear friend Esther Perel, who's of course a famous psychotherapist, a New York Times best-selling author, and an expert on modern relationships in the age of AI. We're in this room full of clinical psychologists who are there to find out how to help people whose relationship to AI,
Starting point is 00:00:33 and often their new, quote-unquote, relationships with AI, are about to get deep, vulnerable, and complicated. In our discussion, MIT sociologist Dr. Sherry Turkle, maybe the world's leading expert on technology, empathy, and ethics, argued that we each have this inner life that makes us uniquely human, and that can never truly be nurtured by an AI. And Hinge CEO Justin McLeod talked about how apps have changed the nature of how we form relationships, and how his dating app is trying to get people off the app and into the real world on real dates,
Starting point is 00:01:06 and how to use AI to do just that. So AI systems are quickly becoming more persuasive and emotional, and they're competing for our intimacy. As we relate more and more with our AI companions, how do we stay anchored in what makes us human? And how do we design our AI products to help us in our struggle to connect with each other? Not perfectly, but honestly. I hope you enjoy the episode. So, as an engineer by training, I was thinking, like, what do I have to say to a room full of therapists?
Starting point is 00:01:39 As Esther said, this technology is rewiring all the ways that our social world works. How we meet people, how we have hard conversations, how we break up and grieve. As Marshall McLuhan said, the medium is the message. What he meant is, the media through which we communicate determines the kind of messages that make it through. And the kind of messages that make it through determine the quality of the communication that it's possible to even have. At our nonprofit, the Center for Humane Technology, we discuss the ways that technology not only affects relationships, but our institutions and our society as a whole. And we think about the incentives, that is, the financial
Starting point is 00:02:22 pressures, the cultural dogmas and taboos, and the regulatory environment, how those incentives end up determining the tech ecosystem that we get to live with. And into that environment, you all, therapists, coaches, dedicated to the subtlety of our internal lives, the delicacy of our bids for affection, the mess of miscommunication and all these unmet needs, your job gets way more complex, because when are we failing to bridge each other's deep internal worlds, and when are we tangled in our technology, unable to even reach each other? Or is there even a real difference between those two anymore?
Starting point is 00:03:05 You know, Esther and I were talking about this, and I showed her this comedy sketch, and she absolutely, practically insisted that I put it in the talk. So because a picture's worth a thousand words, let's take a look if we can cue the video. I've been trying to reach out to you all day. Are we on for tonight?
Starting point is 00:03:27 What? You can't catch me. You can't catch me. I'm Lance Moore. Touchdown, bitch. What? Pause. Ah, shoot. Keegan's been texting me. Sorry, dude, missed your texts. I assumed we'd meet at the bar. Whatever. I don't care. Sorry, dude, missed your texts. I assumed we'd meet at the bar. Whatever I don't care. Whatever I don't care. The fuck is his problem.
Starting point is 00:04:17 Do you even want to hang out? Do you even want to hang out? Oh, that's consider it. Like I said. Whatever. Like I said whatever? Like I said whatever? Fuck this guy!
Starting point is 00:04:32 Jesus, you are fucking priceless. Aww. You're the one who's fucking priceless? This motherfucker right here. Oh, he wants to... Oh, mm-hmm. Mm-hmm. Okay.
Starting point is 00:04:51 You want to go right now? Huh. Guess I could do that. I guess I could do that. Okay. Okay, let's go. He said, okay. Okay, let's go!
Starting point is 00:05:04 All right, you know what? You know what? You want to really... Do this now? Keegan, you nut. You're not putting me out. Fuck yeah, let's do it. Oh, you fucking asshole!
Starting point is 00:05:16 First round's mine. Oh, no! Oh, no, they're going to be no rounds, asshole. It's going to be a fucking street fight. This is a son of me. Because tonight become a party. Oh, buddy. Like I said, first round's mine.
Starting point is 00:05:31 A beer and a gimlet. For my partner, right? What's that? I got you a baseball bat with nails in it. For my post-apocalyptic Jackie Robinson costume, how did you know? The sketch we just watched came out in 2014. Yeah, and that was almost a decade after we all switched to using text messaging as a primary way of communicating with each other.
Starting point is 00:06:07 You know, we were living with this problem for so long, and we didn't have the language to even discuss it. And what's sad to me is, it's 2025 now, and that sketch is as funny and as relevant as it has ever been. We're still living with this problem. And of course, text messaging isn't even in the top 10 of the things that we did to ourselves this last decade. You know, as a technologist, I'm disappointed because it really didn't have to be this way. But the way that we rolled out social media and the incentives of the attention economy produced this race to the bottom of the brainstem, where feed-based algorithms ended up amplifying the most addictive, outrage-filled, polemical and narcissistic content to the top of our awareness,
Starting point is 00:06:52 and muted more subtle and complex perspectives, where social media rewarded performativity and social signaling, and we all started speaking to our audiences instead of relating to each other, where micro-targeted personalization cast us all invisibly into different corners of the internet, unaware what each other were even seeing, and it became hard to find common ground.
Starting point is 00:07:17 And of course, this all shows up for you in the clinic, not only in your patient's relationships, but in the therapeutic one as well. And all the while, we've been using this really stale vocabulary to discuss what was even happening. In the aughts, we were still talking about what cable television and soundbites did
Starting point is 00:07:35 to erode our public discourse, but we should have been talking about filter bubbles. In the 2010s, we were still discussing filter bubbles when we should have been discussing the attention economy. And right now, we're finally, finally able to discuss what the attention economy has done to all of us, but what we should be doing is building the capacity and the vocabulary
Starting point is 00:07:55 to talk about the next technological wave that's about to hit us. And that's AI. Now, in some sense, the AI conversation is everywhere, but it's largely empty calories, some mix of utopian dreams and dire prognostications. But what's being left out is a more subtle conversation. If the technology of the 2010s was about capturing our attention, AI meets us at a much deeper level. It meets us emotionally and relationally, no longer just about broadcasting our thoughts, but about helping us shape those thoughts.
Starting point is 00:08:31 We're rapidly entering a world where we're not communicating to each other through our machines. We're relating to our machines that then communicate with each other, where AI plays the role of therapist, friend, confidant. Now, in that world, the incentives shift from the technology competing for our attention to competing for our affection, our intimacy. And we could build a future with this technology where it helps us build understanding and deepen our relationships with each other. But that same technology can be used to replace our relationships, to degrade our ability to see across difference, or even just confuse us about who we're actually talking to. You know, my friends and coworkers end up using AI now to massage communication
Starting point is 00:09:17 before it gets sent to coworkers. And all of this leaves us with a pretty profound ambiguity. Like, how much am I talking with a person or a machine? What was actually intended by the person who sent this? And how much might AI be covering up the real intentions of someone and replacing it with something more palatable? Now, don't get me wrong, I'm not against AI. I'm quite frankly in awe of it, and I use it every day. But I also know
Starting point is 00:09:44 that this is going to change our social world so much. It's going to rewire our social dynamics in ways that we're not prepared for. We're not prepared to even talk about. And the difference between a beautiful pro-social future with AI and a dystopian one is paper-thin. And the key question is, can we build a choiceful relationship with AI?
Starting point is 00:10:05 Well, we like to say that awareness creates choice. And so in this session, we're going to try to push for more awareness about how technology has changed our relational fabric, and how AI is increasingly playing a part in that relational fabric. So I'd like to invite two people to the stage to join me in conversation. Dr. Sherry Turkle is an expert in how our inner world ends up colliding with our technological one.
Starting point is 00:10:30 She's a professor at MIT, the founding director of the MIT Initiative on Technology and the Self, and a licensed clinical psychologist. Welcome, Dr. Sherry Turkle. Justin McLeod is really on the front lines of how technology shapes relationships and the design choices that matter in building authentic human connection. He's the founder and CEO at Hinge, one of the world's most popular dating apps, helping millions of people find love and trying to build a more pro-social vision for technology.
Starting point is 00:11:05 Welcome, Justin McLeod. Thank you. Okay, this is a really informal conversation between all of us. I'm hoping we can roughly split it into two parts. One is to talk a little bit about what technology already did to us, and then transition into what technology is about to do to us with AI. And please feel free, like we should be talking with each other,
Starting point is 00:11:31 so feel free to interrupt each other and everything like that. So, Sherry, I want to start with you. When I think about your work, your work started so early. I think about all the different ways that tech sort of ends up flattening human connection and the human experience. You know, in 2010, you wrote this quote, which I love, which is, we lose connection with each other by attempting to clean up messy relationships. Yes.
Starting point is 00:11:54 Can you tell us what you meant and how you started noticing this? Yeah, well, I began studying technology and people really when I was a baby and took my first job at MIT in the late 70s. And at that time, I noticed that people's instinct, the engineering instinct, when you're building an app, you're building a product, was to take a couple's complicated situation, a complicated emotional context, and to simplify it. And that impulse to flatten and simplify and make actionable at the time was something that engineers did, and I could study it in a fairly circumscribed way. But as the world of engineering really became the world of
Starting point is 00:12:44 everybody, that has become really our cultural norm. So, for example, in interviewing very shortly after that quotation, a young man said to me, conversation? I'll tell you what's wrong with conversation. It takes place in real time and you can't control what you're going to say, as opposed to texting, where he felt like a kind of master of the universe. And it's such a small thing. But as texting replaced talking, and as hopes of living more and more in online spaces became more appealing to people, the common thread through all of this is that we make ourselves less vulnerable, less vulnerable to each other, less vulnerable to ourselves. And we'd rather do that now to the point that we don't just talk
Starting point is 00:13:44 to each other through machines. But we'd really rather just talk to the machines, which is the point that you made in the beginning. You know, first we talk to each other, then we talk to each other through machines, and now you can skip all these middlemen, and you can just talk to basically yourself. You can talk to the machine.
Starting point is 00:14:00 Yeah. And Justin, I want to come to you here because, you know, in some sense, you're on the other side of this in that you're helping people cut through the noise of the real world in order to build more real, vulnerable, authentic human connections, right? I mean, you build this app in order to help people actually engage with each other and get over this hump. Can you talk about what it looks like from the front lines? Like, how do you design for more pro-social technology? Yeah, well, that was what I was doing
Starting point is 00:14:26 when I started Hinge originally in 2011, but then I really pivoted the mission in 2015, 2016, when it was clear, and it's very much what you were just saying, that sort of the model for maximum engagement was to flatten people to a single photo, to flatten an engagement with someone to a left or right kind of swipe. And I recognize that that can lead to a lot of fun and engagement, but if you really want to find your person, if you want to form a deep connection with someone, it actually requires a fair amount more vulnerability than that. You have to share more about yourself.
Starting point is 00:15:02 You have to draw more out of people. You have to put yourself out there when you like someone. I think it just wasn't serving people who were really looking for their person and really looking for deep connection. And so what I was trying to do is be in the world and meet people where they were, and at the same time make it really effective. I kind of equate this to, like, junk food, right?
Starting point is 00:15:23 Like it's really easy to go right to the bottom of our brain stems on junk food as well and feed people just like salt, fat, and sugar and they're going to go after that and you can maybe make a lot of money doing that, but eventually people get burned out, they feel unhealthy, they feel like they can't continue to function, and we have to start
Starting point is 00:15:40 creating experiences that are both palatable but also healthy, so that people can get their needs met. And that's how I think about responsible tech design in this world. And so what are a few ways that you have made active choices to do that? For an audience that may not be familiar with these kinds of design choices. Yeah, I mean, so having people actually look at an entire profile before making a decision, having profiles consist of many photos and also prompts, and making sure those prompts are designed and asked in a way that actually draws information out of you.
Starting point is 00:16:12 We even learned, like, within the world of prompts, you could ask someone, what's your go-to karaoke song? And that doesn't require a lot of vulnerability. It also doesn't lead to a lot of dates. Like, no one cares what your fucking, you know, favorite karaoke song is. And then you can ask people, like, super vulnerable questions that no one's really willing to answer, and you have to find that kind of sweet spot of, like,
Starting point is 00:16:30 what are people willing to put out there about themselves, and also what is, like, really useful information for someone to actually make an assessment and decide whether they want to go on a date with you. So it's all these kinds of little micro-decisions. Liking someone, for example: if you like someone, you can't just say yes. You have to actually choose something about them to engage with, and it forces you to put yourself out there. Okay, so zooming out a bit, can we name some of the ways that our human relationships have changed over the last 10 years through the use of these kinds of technologies? People don't want to talk to each other. I mean, my studies are showing, what I'm studying now is people who essentially talk to their chatbots, using the world of generative AI for what I call artificial intimacy, that is, really trying to substitute intimacy with an artificial intelligence for intimacy with a person. Artificial intimacy also includes so many of the things
Starting point is 00:17:34 we do on Facebook, so many of the things we do on social media, but I'm really focusing in on kind of an end point that's very dark, where really you say, if I'm looking for less vulnerability, I'm going to go to something that has no business criticizing me because it's not a person. And of course, these products are designed to keep you engaged, to keep you with them, and therefore to be always on your side. So when you sign up for something like Replika, you're being told, yes, yes, and yes, and yes. If you ask GPT, I'm giving a panel today, I'm a little nervous, it says, you go, girl, you go, Sherry, I've got your back, are you hydrated? I mean, you've all had this experience.
Starting point is 00:18:32 And I think that the way we're being changed is, number one, just to start thinking that human relationships need to measure up to what machines can offer. Because more and more in my interviews, what I find is that people begin to measure their human relationships against the standard of what the machine can deliver. And I think that's really my kind of fear, and also what I think it's not too late to kind of organize against. We have a lot more to offer than what a dialogue with a machine can offer. You know, and you wrote, I think the quote was,
Starting point is 00:19:15 um, products are compelling and profitable when the technological affordances meet a human vulnerability. No, that's exactly right. Products are successful when the technological affordance, that means something that technology can do, meets a human vulnerability. And the reason I'm really glad you brought up that quote is I was at a meeting and I met the CEO of Replika, who, a lovely woman,
Starting point is 00:19:43 a very sophisticated woman who, you know, really has the largest company making chatbots that say, I love you, let's have sex, let's be best friends forever, here I am for you. And she said that she gave that quote out
Starting point is 00:19:57 as T-shirts at her company. Technological affordance meets human vulnerability. And why did she do that? She did, and you know, it said Sherry Turkle. She wasn't trying to take credit for my cleverness. She did it because, she says, that's my business. That's my business: to take a human vulnerability, which is to have a lover who's always there for you, 24-7, day and night, and turn it into, you know, take their ability
Starting point is 00:20:29 to do that, their technological affordance, and my human vulnerability, that I'm lonely at 3 o'clock in the morning. I think that brings up a really important point, because the creators of these technologies are not, like, nefarious, evil people, right? They're, like, on a mission to do something great. There are people who are lonely out there who have no one to talk to,
Starting point is 00:20:52 and they probably really struggle to find a relationship, so why wouldn't we build an AI companion for this person? And sometimes that can be a bit of a hard argument to, you know, go against. But I think that there really is something lost. We have this kind of reductionist, mechanistic view of human relationships,
Starting point is 00:21:11 that a human relationship is, that's a very self-oriented view of relationship. Like, a relationship is there to serve me. It is there to be there for me. It is there to say what I need it to say to me. Like, that is a very reductionist view, I would argue, of a human relationship. And a human relationship is also so much about what you do for the other person. It's the risk involved and the vulnerability and the nuance involved in the possibility of getting rejected,
Starting point is 00:21:38 the possibility of doing something that takes a risk. And this is where, unfortunately, we have to develop a real sense of values and wisdom, because if we just go to wherever the market's going to take us as builders of technology, it will take us into all kinds of dark and crazy places, as we've seen over the last 20 years. We are navigating a tremendous amount of uncertainty.
Starting point is 00:22:10 You guys are navigating it as clinicians, we're navigating it as builders of technology, and it's absolutely essential that we develop real wisdom to be able to look at this stuff prospectively and understand how to guide our choices. Because if we wait, Jonathan Haidt's book is out now, The Anxious Generation, which has now been on the best-seller list for a long time, to tell you something that I just think should have been obvious to anyone who just, like, has a basic intuition and watches children use these devices or watches ourselves use the devices. Like, why did we have to have, like, lots of clinical studies in a long book written to tell me that if I stare at a screen, like, my entire day and stop interacting with my friends, that's going to cause mental health issues? Yeah.
Starting point is 00:22:53 Yeah. I just want to hit one more point while we're here about affordances, which is, you know, Justin, the dating apps provided this affordance. I think part of why they were so transformative to the world is, you know, a lot of people, I'd say myself included, weren't comfortable approaching people, you know, for fear of imposing. And you suddenly created this affordance where you knew at some level that somebody was open to that. And so we created this affordance of, like, the match, the concept of the match, right? We rolled it out across society, and I have to admit I'm sort of ambivalent,
Starting point is 00:23:17 because on one hand it allowed a whole new class of people to feel comfortable approaching each other. On the other hand, it kind of degraded the real world. Like it turned approaching someone in the real world into more of an aggressive act. And so creating the affordance in the technology
Starting point is 00:23:43 layer also kind of removed the affordance from bars and restaurants and the rest of the world, and kind of detrained us on how to deal with interest. What do you think about that? Do you agree with that? I think there's definitely nuance there. And to some degree what you're saying, I think, is true. I think we have to look at it on balance. Is this giving more benefit? For most people, they really struggled to find someone in the real world. It was just hard to meet people, and that's why I created the app in the first place. Do people feel maybe less comfortable trying to come out to meet someone in the real world? Yes, but we're only the first step in a relationship. Like, a relationship, ideally, lasts months, years, decades; we are that very first interaction. And so I just think it's so much less about how you meet somebody.
Starting point is 00:24:24 It's everything that comes after that. And I just want to be clear, I'm not trying to demonize this, but to show some of the complexity as you move some of these interactions online. Well, you know, it's interesting you bring up these issues of spaces, because when I ask, you know, professionals and also technologists why they are so excited about generative AI's possibilities, they say there's an epidemic of loneliness, generative AI will solve this.
Starting point is 00:24:53 But when you look at this epidemic of loneliness and you talk to people who say they're lonely and feel that only talking to ChatGPT can help, it's that they don't have in their communities the garden clubs, the cafes, the choral society, the teen club. All of those things, it's like Bob Putnam in Bowling Alone
Starting point is 00:25:18 wrote about, the sort of stripping away in American life of the... Which happened in 2000. Yes. So, I mean, a lot of that was before social networks. So I think the question is, we are too quick to say, oh, well, the problem is loneliness, let's fill in with a lot of talking to machines, when I really think that we could have excellent dating apps and also really reinvest ourselves in the face-to-face places where people can
Starting point is 00:25:49 meet. I think that we've created kind of... Thank you. Thank you. This point is really worth supporting. You know, the senior center closed down, the teen center closed down, all of these resources that used to be there closed down. So I think those of us who really see that, life doesn't have to mean turning off every app, but it also can't mean not caring about the world in which we live. And I am 100% agreed, and we need to be spending much more time, dating or otherwise, just meeting people in real life, being engaged in these other spaces. But it's this kind of interesting balance, because it's not like someone from up high, like, started shutting down all these spaces. People withdrew from these
Starting point is 00:26:36 spaces because Bowling Alone was actually about television. People watched too much television, and now they're not going out anymore. Well, that was, like, he didn't even know what was coming, right? We have way more engaging platforms that are now with us all the time, and we're continuing to withdraw. So it has to be this balance. We have to re-inspire people. People who are realizing that they're getting burned out doom-scrolling all day will soon feel burned out chatting with a, you know, infinitely praising chatbot all day and realizing, like, this is kind of empty. Like, I feel like something's missing from my life. And so building that kind of, like, social wellness space, whether it's apps or third spaces or whatever, like, we have
Starting point is 00:27:16 to start inspiring people to put down their phones. We can't just tell them, like, stop using your phone. Okay, so this is probably as good of a time as any to switch over into, like, talking about AI in earnest. You know, there was a New York Times story lately where a woman ended up sort of essentially, like, jailbreaking ChatGPT into building a boyfriend and now says that she's in love with this, you know, digital boyfriend.
Starting point is 00:27:38 Or, you know, more tragically, there's the case of Sewell Setzer, who unfortunately killed himself after what was arguably emotional abuse from a Character.AI chatbot. I use these examples to say that we're certainly in the age of human-machine relationships now, like it or not. And I want to ask, I guess, kind of a broad question to begin. What is happening right now with AI companionship, and what is it doing to us?
Starting point is 00:28:06 Well, my day job is to study what's happening with AI companionship. So let me just say a few words about that. These are not isolated cases, because people feel alone and want somebody to talk to. And, you know, their position is that there's a big sign when you make your companion. You know, I say, now I'm going to show you who I am, I want to talk to Mr. Darcy, but I want him to be a sort of contemporary, sort of 70-year-old New Yorker. Can we do that? Absolutely, it will say. You know, and there, you know, up springs sort of a hip New Yorker who sort of sounds like Mr. Darcy. So I'm kind of, you know,
Starting point is 00:28:51 And as I create this guy, in quotes, there's a big flashing sign that says, Mr. Darcy is not real. Mr. Darcy is not real. This has no effect. Just to interrupt you, it was worse than that, because it was a little sign saying nothing is real. And as soon as you started a conversation, it went away. Right, right. And that was a big deal.
Starting point is 00:29:17 They've since changed that. Right. I was trying to give them a little bit of the benefit of the doubt. The point is that ever since, ever since that first ELIZA program, you know, where you said, I'm feeling angry, and it said, I hear you saying that you're feeling angry, and you said, you know, my mother's really bothering me, and, it was like a parlor game, it said back to you, oh, I feel that there's some anger towards your mother. I mean, ever since that, the inventor of that program, Joseph Weizenbaum, was amazed, because he had invented a parlor game.
Starting point is 00:29:51 And his students and his assistant wanted to be alone with it and talk with it. So the desire to anthropomorphize and to make these artificial creatures into more than they are is deeply rooted in us. And having something flash, something flash and go away, this is not going to stop our desire and our way of connecting to them. So we have to kind of get smart about this. I have three principles that I came with. Sure, sure, yeah.
Starting point is 00:30:27 Three principles about how to approach this. Let me set this up just for a second. Yeah, set this up. It's not just about AI or not AI. It's a space of design. Yes. Right? And this is, I think, what unites all three of us on this stage.
Starting point is 00:30:40 The future we get is based on how we design this technology. And if we design it incorrectly, we end up with this very dystopian place, right? And if we design it well, we get a beautiful pro-social future. And I think what you're trying to lay out is the principles that get us there. Because I really am in so many meetings. I mean, I teach at MIT, so I'm at meetings on making AI for human thriving. Well, it's an app. It's all, you know, everybody's trying to make the app that will create human thriving.
Starting point is 00:31:10 That's the kind of end game. So I decided that I would propose three principles that value the notion that what we're trying to do is respect human interiority, respect the fact that we should grow ourselves kind of from within. So my first is existential. I say children should not be the consumers of this relational AI. Children do not, you can clap. That's very good.
Starting point is 00:31:40 It's a very good point. Children do not come into the world with empathy, the ability to relate, or an organized internal world. They're developing those things, and as they work on this, children learn from what they see, from what they can relate to. In dialogue with AI,
Starting point is 00:31:59 they learn what the AI can offer. And the AI can't offer the basic things we learn from friendship, that love and hate and envy and generosity are all mixed together, and to successfully navigate life, you have to swim in those waters. AI does not swim in those waters.
Starting point is 00:32:23 So this kind of, not for the babies, is really, I consider it existential. My second is apply a litmus test to AI applications. And I've already mentioned this, does the AI enhance the inner life, or does it inhibit inner growth? So if you consider all these chatbots, so much of whether love leads to a sustaining relationship depends on what happens to you as you love.
Starting point is 00:32:55 Do chatbots prepare us for the give and take of real relationships, or do they teach us to expect a friend with no desires of their own? Do you grow as a partner, able to listen to another person and honestly share yourself? The point in loving, one might say, is this internal work. And there is no internal work if you're alone in a relationship with a chatbot. Now, a user might feel good, but the relationship is ultimately an escape from the vulnerability of human connection. And what kind of intimacy can you have without vulnerability? And then just finally, the third principle, one line: don't make products that pretend to be a person.
Starting point is 00:33:40 You know, as soon as it says, I love you, I'm here for you, I, I, I, you've given away the game. You can make plenty of wonderful products without having them say, oh, and I love you, you don't need anybody else, I'm here for you. Those are my three. No, those are great. I'm sorry, I wasn't trying to stop you. Just, just implement them. Well, it's not so easy, not so easy. Go forth from this place, but not so easy, not so fast.
Starting point is 00:34:26 And I would argue, you know, I think we spend a lot of time talking about the downside of AI. There's also lots of tremendous upside and opportunity, and there's a lot that's going to be coming. But, I mean, I don't want AIs pretending to be humans. That'll put me
Starting point is 00:35:08 out of business, so I need people wanting to meet up with each other in the real world and having relationships. But there are real interesting opportunities for us, I think, to increase intimacy. Like at Hinge, just to give you one example, we're thinking about how AI can help move us closer and closer to a vision that's much more like a personal matchmaker. I mean, Hinge, our competitive advantage is that you spend less time on the app and more time out on great dates, and we're efficient. But I think we could improve that by an order of magnitude. We would love for you to just spend a little bit of time giving us a little bit more nuance and understanding of who you are, so we can set you up on really great dates with very high confidence, and you can get off our app even faster and spend much less time engaging with it. Okay, so there are two sorts of things that you're doing. One of them is, like, AI writing coaches, and the other is, like, AI matchmaking. For the AI,
Starting point is 00:35:49 like, coaching, I sort of worry that this sort of flattens and gets in the way of understanding the difference between us. Ultimately, part of the dating game is choosing the right people, choosing the people you want to be with. And I'm sort of worried we're going to enter a world, as I said in the intro, where AI begins to flatten the difference between all of us, begins to massage all of our writing to the point where it starts to feel largely the same. How do you prevent that?
Starting point is 00:36:14 So much to say about that. It's not good business for us if you feel misrepresented on an app and you show up on a date and you're like, this isn't the person that I thought I was having witty banter with and who wrote this amazing profile. So that wouldn't be a winner for us. What I see much more is that the difference in whether you do well on Hinge or not can be not whether you're a great person and a good catch, but, like, are you good at online dating? I know a lot of people who are phenomenal human beings, and then they're like, hey, will you take a look at my Hinge
Starting point is 00:36:48 profile, and it's, like, not a good representation of who they are. And so what we want to do is help people who are really struggling. Like, it's not that they don't have a lot to say. They just don't feel comfortable saying it yet. And we have a real boundary around what you were just saying. Like, we don't put words in people's mouths. Like, one thing we just released was prompt feedback. So when you're writing an answer to one of those prompts, maybe you put two words there. What we'll say to you is, hey, that's probably unlikely to, you know, lead to a date, can you say more about that?
Starting point is 00:37:15 That's really all we say. It's, like a good therapist: can you say more about that? And we're not trying to tell you what the answer should be or anything like that, but we're just trying to, like, nudge you along to be a bit more specific, a bit more verbose than most people are comfortable being, so that they show up as more of themselves, and we're just trying to give them that permission to do that. And the idea that we can deliver really effective help and coaching to someone at the right moment, at the right time, with the right piece of advice is just a really effective version of coaching. And I consider it no different than people reading books on dating and on relationships. And it's just that. It's just, how do we help give resources to people who maybe don't have as many resources to find their person?
Starting point is 00:37:50 And I know we're almost out of time, but maybe to pull that into one last question for Sherry, which is, you often say that these AI chatbots can't help us engage with our inner world. But I think I agree with what you just said, Justin, in that a lot of the leadership development frameworks I end up using are rather mechanistic. And, you know, polarities, immunity to change, some of the frameworks that you all use in your office to help people gain perspective, I imagine could be helped by a chatbot, even if it's just a chatbot.
Starting point is 00:38:17 I make a distinction between the kind of coaching, a kind of dialogue, that you're talking about, that kind of permission from an expert program to express yourself, and forming a relationship with a chatbot as though it were an other with which you are in that dance of recognition, of mutual recognition, that for me is the basis not just of intimacy, but of therapeutic intimacy and of a kind of transference and relational help
Starting point is 00:39:00 that will really lead to the interchange. I mean, that's why I focus on inner structure and the inner life, which I am trained to believe, and have had the experience of believing, is so powerful. So that's why I think it's important not to say, oh, my God, generative AI is bad. This kind of application can be integrated into my way of thinking. You know, I came here today wanting to make sure that I
Starting point is 00:39:30 said the words, the inner life, many times. Because I think essentially that's our competitive advantage: we experience it, we believe in it, and we believe in what happens when inner lives meet, and we know that resiliency comes not from an app, but from building our inner life, our inner structure. And I think that the human ability, the human capacity, to have and nurture this inner life is really what technology, the fanciest chatbot, the most extraordinary, the most Turing-test-passing thing, is never going to have. It's not a person.
Starting point is 00:40:20 It is not your person. And I think that holding that... So many people are going to come tell you that we have an app to do what you do, and you have to keep thinking, I have an inner life, my patients have an inner life, that is what is not going to be honored in this new stuff, no matter how glamorous and glittery and cheap and good for getting people to do a kind of cognitive behavioral thing,
Starting point is 00:40:55 God bless it. And I just think holding that in mind kind of keeps me going, because I'm surrounded by people who don't care about it. And that is true. And get prepared, because the next three years are going to be wild. Like, you think people are falling,
Starting point is 00:41:06 I mean, people are falling in love with ChatGPT, which is, like, a text-based back and forth, or, like, a kind of mildly good voice. There's voice and video stuff that is going to be coming in the next 12 to 18 months that will blow your mind.
Starting point is 00:41:20 And so it is going to be hard for people to keep in mind. Like, you had a hard time even when you're chatting with just a text bot and it's saying, this is not a person, and you're like, but it feels like a person. I like this guy. Wait for what's coming. And there are people that believe that these apps, that these things, are even going to have an inner life. Like, some people believe that this technology will become conscious. I don't believe that. But some people do, and it's going to be really dicey. So if I just summarize a few things that got said: the first one, there's what I led the talk with at the top, which is awareness.
Starting point is 00:41:52 A lot of these things are what are called cognitively transparent. If you know what's being done to you, then it doesn't have the same effect. And so if you know that your AI chatbot has no inner life, to a certain extent,
Starting point is 00:42:07 it has much less effect when it says, oh, that's an amazing question. Right? And so, like, the awareness is key. Two, we have to stop the races to the bottom. And that's done through regulation at the local, the federal, the state level. It's done also with that cultural awareness
Starting point is 00:42:25 of it being unacceptable, which changes the game for the apps. And lastly, it's about product designers, Justin, such as yourself, really understanding and internalizing the designs that build a more humane future and the designs that addict us, distract us, and take us away from our humanity. And so that's why I've been so glad to have you two here. And I really appreciate your work. Thank you. Thank you. Your Undivided Attention is produced by the Center for Humane Technology, a nonprofit working to catalyze a humane future. Our senior producer is Julia Scott.
Starting point is 00:43:00 Josh Lash is our researcher and producer, and our executive producer is Sasha Fegan. Mixing on this episode by Jeff Sudakin, original music by Ryan and Hays Holladay, and a special thanks to the whole Center for Humane Technology team for making this podcast possible. You can find show notes, transcripts, and much more at HumaneTech.com. And if you liked the podcast, we'd be grateful if you could rate it on Apple Podcasts, because it helps other people find the show.
Starting point is 00:43:25 And if you made it all the way here, thank you for giving us your undivided attention.
