TED Talks Daily - What makes us human in the age of AI? A psychologist and a technologist answer | TED Intersections

Episode Date: September 14, 2024

AI has the potential to impact the way humans interact with the world — and each other. Social psychologist Brian S. Lowery and AI technologist Kylan Gibbs dive into the ramifications of emerging technologies on people's mental health and social dynamics. Hear why Gibbs thinks, counterintuitively, the more we use AI, the less real it will feel — and how Lowery suggests we fulfill our intrinsic need for connection amid dizzying technological advances. (This conversation is part of "TED Intersections," a series featuring thought-provoking conversations between experts exploring ideas at the intersection of their experience.)

Transcript
Starting point is 00:00:00 TED Audio Collective TED Intersections, featuring speakers taking on subjects at the intersection of their expertise. We've paired up social psychologist Brian S. Lowery with AI technologist Kylan Gibbs to talk through technology in the near future, whether we're too worked up about AI, and what differentiates humans from other creatures, including artificial ones, after the break.
Starting point is 00:02:27 And now, our TED Talk of the day. What makes a human a human? It's one of those questions, isn't it? I mean, there's like two ways I would look at it. Like one is for my personal life and one is for my work life. Like one thing that's interesting is like there's been points when I've been spending like four to five hours a day interacting with AI.
Starting point is 00:03:00 And the interesting thing that happens in that is the things that you notice, like when you first start interacting, oh, this is really realistic. Similar to, like, when people first had, like, black and white TV. And they're like, wow, this is, like, real life. But then as you get used to it, you start to kind of realize, like, the things that make it less authentic. And I think something that I realize with AI is, like, there's certain ways that we interact that are just more spontaneous. Like, there's something about the predictability of AI that teaches you about, like, the spontaneity of being human. The ways they communicate, the naturalness, the contextual awareness, these little things that all add up.
Starting point is 00:03:30 That's on the technical side. On the other side, I think there's something of just the shared experience of being human that actually differentiates it from other animals' experience. You have a traumatic moment in your life and then you start to resonate with other people's. Like, I feel like every time I've had something that's nearly catastrophic, it like opened up like a new door of empathy. And then you start to be like, oh man, that really hurt. You know, or like when you cry about something, you're like, wow. And then you start to remember, like, this is what usually happens to me. It's like,
Starting point is 00:04:00 I start crying about something. And then I think about all the things that like I did for my mom or my grandma and like all these kinds of things and the things that I felt. And I feel like that, there's something in that kind of, like, shared experience where we have these things that, like, differentiate us. We're all different people. But there's something about, like, those common feelings that it all kind of arises from. But anyway, that's one thought. I love that answer. And I want to say that you're not normal in that way.
Starting point is 00:04:22 Here's why I don't think you're normal. People anthropomorphize anything. It doesn't have to even be that good, right? It doesn't have to be anywhere near as good as, like, AI for people to treat it like it has some, like, human characters. People treat their cars like they're people. So I'm surprised that when you interact with it a lot, it feels less real to you. There's something about, like, I think about, like, resolution. It's like the way of seeing the world.
Starting point is 00:04:48 And, like, you kind of increase this, right? It's like the same reason you can't look at TV that's not 4K now. And it's someone, I think, who worked on, like, early VR was saying, you know, the interesting thing about it was, like, when you stepped out of it, you're like, oh, the real world is actually amazing. And it's actually really hard to recreate that in technology. And I think the same is true for AI. It's like maybe for some people when they interact of it, you're like, oh, the real world is actually amazing. And it's actually really hard to recreate that in technology. And I think the same is true for AI. It's like, maybe for some people when they interact with it, the thing that they see
Starting point is 00:05:10 is some commonality. But the thing that I always notice is like, well, this is very different from the conversations I have with my parents. Even though it's something similar, there's something off. And it's those little things that it's like, that's like, I think what over time will add up as people use AI more, is they'll start to recognize, and I can't even point at them, like what are those nuances though that make us human? You just know it when you see it and you're like, and it's missing in AI. Yeah.
Starting point is 00:05:33 So, I mean, that's also interesting because what you just suggested is that the more people use AI, the less real it's going to feel to people. You think that's what's going to happen? I mean, there's probably another case. It's the same way as your Instagram and Facebook feed isn't a real conversation. There are certainly kids especially who would look at those kinds of feeds and feel like, oh, that's a real representation of my friends or my favorite celebrities or whatever actually think. But it's completely, well, I shouldn't say completely, largely false. And I do think
Starting point is 00:06:05 something similar will have will happen with ai where some people for sure will like almost be in like encaptured and they will believe that like that's the most realistic thing that exists and then start compare people to that but i i think that you know if you have that degree of empathy you'll be like oh there's something off here it's the same way as like even if you use a zoom call yeah there's something off it's like i can't you, you can't, it's hard to pick it up, but, like, I'm not picking up all the signals. And it's the very little, like, nuances that you probably just subtly pick up as well.
Starting point is 00:06:31 So you don't think that the technology is going to advance quickly enough where it'll overcome those little things fast enough to capture all of us. You're not worried about that? I am definitely worried about that. Mainly because I think for most people it's easy, right? So the thing about AI is it's so beholden, at least if you think about like the chatbot styles, it's so beholden to what we want. And that's kind of like what people I think want in their, a lot of people
Starting point is 00:06:56 want in their life or they need is like the sense of control. And the AI gives you the sense that like, I can control this anthropomorphic thing. And that's honestly one of my fears is that people get used to that. And what does it mean when like, I get used to interacting with something that is beholden to only my views and interests and then I go in there with a human who has their own interests. Do you think people want control? I think people want to be
Starting point is 00:07:17 controlled. Maybe it's this form of control though. To be controlled is a predictability, I guess. Yeah, people want the world to make sense, don't you think? Yes, yes. I think they also want the world to be, like there's something about like preferring predictability over optimality.
Starting point is 00:07:37 So like I've even felt it like when you have like, you know, a mental health moment, you have friends who have mental health moments. Things that I've always seen as interesting is, like, your brain and your mind prefer to stay in a state that's familiar, even if it's worse. So if you're in, like, a depressed state, you almost would rather, like, stick in that than break outside of it, right?
Starting point is 00:07:56 So there's something about, like, things that are familiar rather than actually, like, better. And I don't know. There's, like, a bias towards, you know, you almost identifying then with those kinds of states. Yeah, there's, there's, there's research on this. One, it's called the status quo bias. People like things that are already there. And two, people like to have what they believe about themselves affirmed. They really believe them, even if they're not positive. So that, that is true. So I'd be like, what does that look like in AI? Yeah.
Starting point is 00:08:26 I mean, it's definitely interesting to me that people seem to love, like, you know, you talk to a lot of these things, and they sound like computers, and they sound like AI, but people love it because it's kind of like it's familiar, it's controllable. You know, if you start to add lots of personalities and those kinds of things, it makes sense in context. But I found it interesting that, like,
Starting point is 00:08:44 as we started developing these AI systems that people interact with, they all have this similar voice. It's a very AI voice. You can tell that you're talking to an AI. Maybe that's intentional, but there is something there where I think people have a preference to getting what they want from humans from humans and from AI from AI. But that could blend. Like, I think that there's, you know, there's already lots of, you know, people in certain demographics who spend a lot of time on the Internet and they start to identify like that's their favorite form of interacting with people. And so I do think that there's a reality where, like, as we move into the future, there will be people who bias towards that for whatever reasons, you know,
Starting point is 00:09:20 whether it's the comfort of knowing that someone's not judging them, whether it's like the format that it speaks to you with that will kind of bias towards preferring those types of interactions. But on the other hand, I always think that there will just be a distribution of people. You'll have some people who really don't like it. Like I was saying, the more that I interact with that, now I find it almost painful because I just pick up on so many of these issues that you're like, I can't even use it at a certain point.
Starting point is 00:09:48 And you'd think that I'm in the AI space. I write like 20-page docs. I don't use AI for a single bit of it because it does remove that voice. And I do also wonder, though, is as people interact with it more, will they either identify the differences or start to conform to the things that they're trained with AI? It's the same as like, you know, if you interact with the same thing, your partner, for example, right?
Starting point is 00:10:12 You start to be biased by the communication because you're talking so much. You mean they're training you. They're training you. Yeah, I can agree. Right? Your partner is probably like, you know, they have a preferred way of communicating. You get used to it, these kinds of things. So I do wonder if as people interact with AI more, that they'll kind of all converge.
Starting point is 00:10:28 That's probably one of my biggest fears, actually, of AI, is it causes convergence of, you know. I'm concerned about the exact opposite. So I'm going to shift a little bit. So when we talk about AI, the way you're describing it, it's usually like dyadic interactions. Like I'm interacting with one AI, one agent. Yeah, that's true. But really what people do is interact with multiple people, right? You interact in some community or some small group setting.
Starting point is 00:10:51 And I'm surprised there's not more of that in AI. So you're also in gaming. Like, my understanding, I don't really game, but my understanding is a lot of the gaming is about connecting with the people. And it's a community kind of experience. So there's two things. One, I'm really surprised that AI seems so focused on these, like, one-on-one interactions as opposed to, like, multiple AI agents creating a more immersive social experience. I love you, bro.
Starting point is 00:11:15 That's literally what we do. Okay, good. So that's one. The other thing, like, the reason I worry less about convergence and more about divergence is if you could produce a more immersive social experience, now everybody's having their individual social experiences. Like now what I worry about with AI, with VR, with all these kind of technologies that are expanding what we can control about our social environment, about our physical perceptions in the environment, is that we all inhabit our own singular world. Like, that is more frightening to me than, like, you know,
Starting point is 00:11:53 that we all converge to the same experience. And now, back to the episode. Well, my mom's a grade 7 teacher, and one thing that she said is really interesting is if you went back like 20 years, everybody was watching like the same TV shows. And they'd come to class, and they'd all be talking about, you know, Pokemon or whatever it was. And now everybody watches their own favorite YouTube channel. And it's the siloing of reality. So, yeah, like what we do is when we work with games, for example,
Starting point is 00:12:22 one of the interesting things is like as people play through games, it's basically the same thing. You can have a million people go through a game. And it's some differences, but you're largely going to hit the same points. And so one of the things that we think about is, you know, what does that mean for agency? So, you know, the way we interact with media changes the way that we feel agency in the world. So if we see inert media that we can't change, it also gives you the sense that you can't change the world. So to your point, one of the things that we do with, we want to do with games is,
Starting point is 00:12:49 how do you make it so that each person can actually influence that outcome? As you add more agents into that, that you see, okay, I interact with this one and it has a cascade effect. Yes. Right? I love it. I mean, even in some of the stuff we've done here, the magic actually happens when you do have those agents interacting. Because then you're also not just
Starting point is 00:13:06 seeing that one-to-one interaction, but the emergent effect of basically that interaction. And another thing is, if your main controls that you have in a computer is like point and click, or the games jump and shoot, we're trying to see what does it mean if social skills, like interaction like this, are the ways that you actually interact with the games or technology and the agents.
Starting point is 00:13:26 Like, that's a very different way of conversing or of dialogue than, like, button presses, right? And I think that changes the way that you sense agents in the world. Because I think the way that most people change the world is by speaking, interacting, interacting with other humans, not by pressing buttons. I mean, arguably, it's the case in some way. Yeah, you know, the other thing that's interesting to me is I don't think people have an understanding
Starting point is 00:13:49 of the systems they exist in, right? People think about themselves as existing in, like, individual relationships, and they have a harder time understanding system effects, like I affect you, which affects your partner, which affects your partner's parents, right? That is a harder thing to grasp. But I think there's something that's fundamentally human about that.
Starting point is 00:14:08 Like you are also impacted by all these different things going on. Like we had the person come and put on our makeup, and now I'm looking beautiful and that's affecting everybody else around me. It's flowing. Yeah, exactly. I mean, how does that fit in? I just haven't heard people talk about it in that way, which is surprising to me. Because that, I think, is what fundamentally makes humans human, is interaction in complex social situations.
Starting point is 00:14:35 In these, like, nested systems. Yeah. And, like, they all affect each other, right? Like, you think that your, like, small activity doesn't affect whatever higher level political stuff. But it's all aggregate. Yes. And it's all interlinking as well. Yeah.
Starting point is 00:14:46 I mean, it's like the AI thing is interesting too because I often hear people talk about it as like this evolution, right? You have like singular cells to monkeys to humans to AI. Or it's like you could flip it, right? Where it's like more like, you know, cells to organs to human to AI. It's a system overarching that just because it's trained on us and we do these things, we actually influence that system. Then it's kind of this, now that people are interacting with it, it has this interplay.
Starting point is 00:15:10 And that's interesting, too, when it becomes like, AI isn't this singular entity. It is more of an institution or a system almost that is kind of overarching something else. Yeah, and it's also weird because it's like our vision of who we are. We talk about AGI. It's like we don't even know what intelligence is and we think we're going to produce something that has it. It's just an interesting situation where we talk about it, as you said, it's natural evolution,
Starting point is 00:15:37 but in fact we're creating it and it's not clear that we know exactly what it is we're creating. I actually think that one of the most interesting things is that we're starting to work on AI at a point where, like, I still think we're figuring out, you know, ourselves. Like, you know, neuroscience is, like, very early on in its days. And yet we're creating things that are, like, based on our own intelligence. We don't really understand even what's going on inside. And so to your point on, like, what are the effects, we don't really know yet. You know, we don't know.
Starting point is 00:16:05 Every, you know, every year a new paper comes out is comes out that changes how people think about child rearing, right? Like how to bring up a child well, like all those kinds of things. And now we're creating systems that will, you know, kind of be overreaching other humans. Like what does that mean? I don't know. I do actually think, you know, I happen to be an AI. We happen to be at this point in time. But if we could pause for a second, I think it would be a good another few centuries of figuring out, you know, I happen to be an AI, we happen to be at this point in time, but if we could pause for a second, I think it'd be good another few centuries of figuring out, you know, what we are
Starting point is 00:16:28 and understanding that a little bit better before we created something that was in our image, because like, we're kind of just, you know, it's, it's kind of like taking a photograph and like painting it, right? And it's like, you're not actually getting the person and painting it, right? Like, it's like, there's something about the life that's missing there. So I do agree. I think that we're honestly kind of premature, but I, I think it's just how, I guess, you know, life goes, that things come out when they naturally should. So, I mean, you work in AI. So what are you, what's the most exciting thing for you in AI? Like, what's your hope for it?
Starting point is 00:17:02 I think it's kind of back to that agency question. So, I mean, the way we, you know, you read a news story, you read a book, you watch a movie, you watch a TV show. I mean, this is specific to, like, my domain. Like, there's something about communication that we're having right now where, like, I'm adapting to the things that you say, to your body
Starting point is 00:17:20 language, to all those kinds of things, right? To, like, you know, the people in the room. All these kinds of things. And so when you have AI able to sort of help that adaptation so that you have that agency and the things that you interact with. I don't necessarily believe in like fully personalized media because I think we need like a shared social context. Like we reason to watch a movie because then we can all talk about it, right? We watch a TV show, we can all talk about it.
Starting point is 00:17:42 But there is something about the fact that we're all interacting with these internet objects. And so, you know, the way that technology feels, you're on a screen, it doesn't change. You're in a movie, it doesn't change. You're on, you know, watching Netflix, it doesn't change depending on what you do. And I think that changes the way that we see our own agency in the world. And so, I hope with AI that one of the things that it does is kind of opens this door to agency in the way that we interact with media and technology in general, such that we do notice that effect that you have on systems. Because even if it's small, right, where I take a certain action and it completely changes an app or changes an experience, maybe that helps us learn that we have an effect in the social systems as well that we're affecting. So something to that effect.
Starting point is 00:18:25 So you wanted to make our agency more transparent. Yeah. And do you think it does that? Because right now, I'm not sure it doesn't obfuscate our agency. No, I don't necessarily. No, yeah, I agree. I mean, this is why I think also, like, media and games is the domain I mainly focus on. And I think it's interesting, especially because young people use it a lot.
Starting point is 00:18:44 And so, like, I've heard, like heard very veteran game developers say, how people interact with games kind of trains kids how they should interact with the world. So even people who tend to be professional players in different games have different psychological profiles because they bias towards certain ways of interacting and seeing the world. The same way, I guess, if you trained in something academic, you have a different way of viewing it. And so if we make games and media in a way that you feel that sort of social impact as well, maybe it opens the door to another realm of understanding.
Starting point is 00:19:16 But yeah, I agree that a lot of the systems that we have today give you maybe a false sense also of agency, where like we were talking about the AI systems, where you actually feel like you're controlling this thing, whereas maybe it's also biasing and controlling or having some influence over you as well. So where do you think things are going? So there's obviously a huge race among some very, very well-resourced organizations
Starting point is 00:19:41 over AI, right? You know, Microsoft, Google, I mean, are the biggest maybe. And they are very quickly going to need to monetize it because this is what those companies are designed to do. Like, what do you foresee? Because I look at social media as an example. I think at the time when it first came out, people were really excited as a new way to connect with people,
Starting point is 00:20:07 a way to stay connected to people. You know, you can't, you couldn't otherwise catch up with people you lost contact with, that sort of thing. And it changed them to something else. In large part because the way it was monetized, I don't know, like going to ads, focusing on attention. Like what's the trajectory of AI? I can't, you know, I'm taking guesses.
Starting point is 00:20:33 Of course, we're all taking guesses. Of course, yeah. I won't hold it to you. I won't hold it to you. Don't worry. I mean, I think that the reality is we were kind of mentioning before about the challenges of scale. And when you invest tens of billions of dollars in something you need scale and i think that's one of like the the way that ai is developed and specifically even the models that were types of models we're using the economic model of it which
Starting point is 00:20:54 is effectively the more compute you have the better models you can create the better models you create the more usage you get the more usage you get better so it has somewhat of a honest honestly like monopolistic tendency i I think, in the way that actually even like the architectures and the economy of it works. And so I think it's almost inevitable that whatever AI systems are produced by these large organizations will be pushed to scale as quickly as possible. And there's maybe, there's some pluses in that where like, you know, sure, they're building in feedback loops. People can give their input. It biases it.
Starting point is 00:21:26 But also at the same time, what does it mean when a single model is fit to a billion people, right? So, like, that's kind of what I meant about the converging effect where, like, what happens when we are pushed to produce something that fits to a billion people? There's a lot of diversity in there. And so, you know, we create these scaled systems that are fitting with the whole like trying to fit the whole planet like does that work and so i think what will you know we're going to go through this phase we're like yeah you're going to have a billion people interacting the same ai and i don't know what the effect of that will be um and i also don't know even like the monetization models now are kind of you pay to like pay to use these kinds of things
Starting point is 00:22:02 which are are maybe okay but certainly ads will probably enter the equation also what happens when like you want attention and you can do ai is much better than the algorithms you even have on youtube and instagram and you can start to capture that attention and so i certainly think it's going to be an interesting little bit here now as we see these huge organizations spending tens of billions of dollars and the choices that they make to then monetize that and what that means for like how ai proliferates which is not you know a lot of these i know a lot of the folks in the organizations and their their interests have never been in that domain but at the same time you're beholden you know stock market interest and whatever it is then what happens like it it shifts it right we're in this we're in a capitalist world and that's
Starting point is 00:22:43 kind of like you know what what ultimately will change the incentives. So, yeah, it's interesting. Yeah, I mean, I am interested in coming from your background. You have a very different stance on it. But all this AI stuff is interesting. But when you think, it's almost your first question, what makes us human? And as people, just technology in general, and specifically with AI, like, where can people find like the meaning in their life, the values that they, you know, find true?
Starting point is 00:23:13 And how will that change? Do you think, I guess, with like the advent of these new technologies, or how do you even have it? How have you seen it change with the technologies you've already seen come to life? Yeah, I don't, you know, this is going to sound like a funny answer. I think people are too worked up about technology, personally. I mean, you know, we had this conversation. I think, you know, people have been using technology since we've been human, right? So paper was a huge invention. Talked about this.
Starting point is 00:23:39 The printing press, huge invention. Computer, huge invention. Internet, huge invention. AI, great. Another huge invention. Computer, huge invention. Internet, huge invention. AI, great. Another huge invention. And through all of that, I think what you see in a lot of the biggest technologies is the desire to connect with other human beings. I think what fundamentally makes us human is our connection to other human beings, our ability to engage with other human beings. And consciousness and all these other things, I think, are necessary preconditions.
Starting point is 00:24:04 But really what makes us human is connections with other humans. And that is incredibly complex. And I don't think we're close in terms of technology of replicating that. I mean, even the way you described it, it's like you have this feeling of like, this isn't right, this is off. And even if you felt like it was right, it still would be off in ways you didn't quite get. I don't think we're close. Though because it's designed to pull our attention away
Starting point is 00:24:32 from other things, I think it impedes our ability to do what we all kind of wanna do, which is interact with each other. Yeah, yeah, yeah. Right? And it might change the way we interact with each other in a way that might feel less fulfilling. And I think you see some of that in social interactions now. Some of that, I mean, recently maybe COVID was an issue,
Starting point is 00:25:04 but people feeling less comfortable in face-to-face interactions, right? Like people dating, there's no serendipity in hanging out and you meet who you meet. It's like you're using an algorithm to try to present to you options. Like that's a very different world. So that's prior to AI. And I don't know how AI is going to further influence that. And I guess just even like the general point, like, how core do you think the need for connection is? In the sense that, you know, I've heard some parents say that like through COVID,
Starting point is 00:25:35 their kids went through like a major change. You know, these regressions in their, you know, their different habits and these kinds of things because they weren't connecting with people. And that it's taken years to overcome that. So I do also wonder, whether it's through technology or things like COVID or just circumstances, could we lose that need for connection? Or that even if we need it, we might lose the desire for it and feel emotional trauma as a result, but still not go for it. How core do you think it is and do you think it's do you think we're safe in that in that kind of need um so i'm gonna give you the most extreme answer which is i think the true one that you will cease to be human if you don't have a need for human connection like i think you you will be a physical person but you you will literally
Starting point is 00:26:21 break down as a human being and this is is why in part social isolation or solitary confinement is considered inhumane. Because people literally break down. You will start to have hallucinations. You will break down mentally and physically absent human connection. So I don't think there's any, there's no possibility in my mind of losing the need. Like you may get less than you need and that will have negative consequences for you. But I'm not worried about people not wanting to be around people. Are you worried that like things like social media or AI or any of these things could give
Starting point is 00:27:01 you that sense that you're fulfilling that need, but not actually fulfilling it? You know, you said like, it's totally true, right? Solitary confinement is a great example because we need it. We absolutely lose our sanity as well as our well-being. But maybe technology can manufacture the sense that we're fulfilling it. And then over time, we see these mental health crises evolve as a result. Yeah, that's a good question. I think it's a, I don't, I think it's unlikely, but I don't know. Honestly, I don't know. I think, I'll give you, I mean, I'll talk about meaning for a second. And I think of that as fundamentally tied to our need for connection to other people.
Starting point is 00:27:37 I think sometimes we confuse, for example, our need for meaning with a desire for personal achievement, that we chase personal achievement. And what we're trying to do is generate meaning. So I think we can be confused and we can have those needs displaced into less productive routes. But I don't think it's going away, but I don't know that it's the healthiest. Yeah. No, I'm totally aligned. Yeah. Well, thank you, Brian.
Starting point is 00:28:08 That was an awesome conversation. It was great. It was super fun. It was really fantastic and super informative. Thank you. Support for this show comes from Airbnb. If you know me, you know I love staying in Airbnbs when I travel. They make my family feel most at home when we're away from home. As we settled down at our Airbnb during a recent vacation to Palm Springs, I pictured my
Starting point is 00:28:30 own home sitting empty. Wouldn't it be smart and better put to use welcoming a family like mine by hosting it on Airbnb? It feels like the practical thing to do, and with the extra income, I could save up for renovations to make the space even more inviting for ourselves and for future guests. Your home might be worth more than you think. Find out how much at airbnb.ca slash host. You are listening to a conversation between AI technologist Kylan Gibbs and social psychologist Brian S. Lowery. If you're curious about TED's curation, find out more at TED.com slash curation guidelines.
Starting point is 00:29:11 And that's it for today. TED Talks Daily is part of the TED Audio Collective. This episode was produced and edited by our team, Martha Estefanos, Oliver Friedman, Brian Green, Autumn Thompson, and Alejandra Salazar. It was mixed by Christopher Fazey Bogan. Additional support from Emma Taubner and Daniela Balarezo. I'm Elise Hugh.
Starting point is 00:29:31 I'll be back tomorrow with a fresh idea for your feet. Thanks for listening. Looking for a fun challenge to share with your friends and family? Today, TED now has games designed to keep your mind sharp while having fun. Visit TED.com slash games to explore the joy and wonder of TED Games.
