Lex Fridman Podcast - Whitney Cummings: Comedy, Robotics, Neurology, and Love

Episode Date: December 5, 2019

Whitney Cummings is a stand-up comedian, actor, producer, writer, director, and the host of a new podcast called Good for You. Her most recent Netflix special, "Can I Touch It?", features in part a robot she affectionately named Bearclaw, which is designed to be visually a replica of Whitney. It's exciting for me to see one of my favorite comedians explore the social aspects of robotics and AI in our society. She also has some fascinating ideas about human behavior, psychology, and neurology, some of which she explores in her book "I'm Fine...And Other Lies." This conversation is part of the Artificial Intelligence podcast. If you would like to get more information about this podcast, go to https://lexfridman.com/ai or connect with @lexfridman on Twitter, LinkedIn, Facebook, Medium, or YouTube, where you can watch the video versions of these conversations. If you enjoy the podcast, please rate it 5 stars on Apple Podcasts or support it on Patreon. This episode is presented by Cash App. Download it (App Store, Google Play), use code "LexPodcast". The episode is also supported by ZipRecruiter. Try it: http://ziprecruiter.com/lexpod

Here's the outline of the episode. On some podcast players you should be able to click the timestamp to jump to that time.

00:00 - Introduction
03:51 - Eye contact
04:42 - Robot gender
08:49 - Whitney's robot (Bearclaw)
12:17 - Human reaction to robots
14:09 - Fear of robots
25:15 - Surveillance
29:35 - Animals
35:01 - Compassion from people who own robots
37:55 - Passion
44:57 - Neurology
56:38 - Social media
1:04:35 - Love
1:13:40 - Mortality

Transcript
Starting point is 00:00:00 The following is a conversation with Whitney Cummings. She's a stand-up comedian, actor, producer, writer, director, and recently, finally, the host of her very own podcast called Good for You. Her most recent Netflix special, called Can I Touch It?, features in part a robot she affectionately named Bearclaw, which is designed to be visually a replica of Whitney. It's exciting for me to see one of my favorite comedians explore the social aspects of robotics and AI in our society.
Starting point is 00:00:30 She also has some fascinating ideas about human behavior, psychology, and neurology, some of which she explores in her book called I'm Fine...And Other Lies. It was truly a pleasure to meet Whitney and have this conversation with her, and even to continue it through texts afterwards. Every once in a while, late at night, I'll be programming over a cup of coffee and I'll get a text from Whitney saying something hilarious, or, better yet, sending a video of Bryan Callen saying something hilarious. That's when I know the universe has a sense of humor, and it gifted me with one hell of an
Starting point is 00:01:05 amazing journey. Then I put the phone down and go back to programming with a stupid joyful smile on my face. If you enjoy this conversation, listen to Whitney's podcast, good for you, and follow her on Twitter and Instagram. This is the Artificial Intelligence Podcast. If you enjoy it, subscribe on YouTube, give it 5 stars and Apple podcasts, support on Patreon, or simply connect with me on Twitter. Alex Friedman spelled F-R-I-D-M-A-N.
Starting point is 00:01:34 This show is presented by Cash App, the number one finance app in the App Store. They regularly support Whitney's Good for You podcast as well. I personally use Cash App to send money to friends, but you can also use it to buy, sell, and deposit Bitcoin in just seconds. Cash App also has a new investing feature. You can buy fractions of a stock, say $1 worth, no matter what the stock price is. Brokerage services are provided by Cash App Investing, a subsidiary of Square, and member SIPC. I'm excited to be working with Cash App to support one of my favorite organizations called FIRST, best known for their FIRST Robotics and LEGO competitions. They educate and inspire hundreds of thousands of students
Starting point is 00:02:13 in over 110 countries and have a perfect rating and charity navigator, which means the donated money is used to maximum effectiveness. When you get CashApp from the App Store or Google Play and use code Lex podcast, you'll get $10 and cashapp will also donate $10 to first, which again is an organization that personally is seen and inspired girls and boys to dream of engineering a better world.
Starting point is 00:02:40 This podcast is supported by ZipRecruiter. Hiring great people is hard and, to me, is the most important element of a successful mission-driven team. I've been fortunate to be a part of, and to lead, several great engineering teams. The hiring I've done in the past was mostly through tools that we built ourselves, but reinventing the wheel was painful. ZipRecruiter is a tool that's already available for you. It seeks to make hiring simple, fast, and smart. For example, Codable co-founder Gretchen Huebner used ZipRecruiter to find a new game
Starting point is 00:03:13 artist to join her education tech company. By using Zipper Cruder's screening questions to filter candidates, Gretchen found it easier to focus on the best candidates and finally, hiring the perfect person person for the role in less than two weeks from start to finish. Zipper Cruder, the smartest way to hire. CY Zipper Cruder is effective for businesses of all sizes by signing up as I did for free at zippercruder.com slash lex pod that's zippercruder.com slash lex pod. And now here's my conversation with Whitney Cummings. I have trouble making eye contact. As you can tell. Me too.
Starting point is 00:04:12 Do you know that I had to work on making eye contacts? Because I used to look here. Do you see what I'm doing? That helps. Do you want me to do that? No. Well, do this way. I'll cheat the camera.
Starting point is 00:04:21 But I used to do this. And finally, people like I'd be on dates and guys would be like are you looking at my hair like they get it would make people really insecure Because I didn't really get a lot of eye contact as a kid. It's it's one to three years Did you not get a lot of eye contact as a kid? I don't know. I haven't done the soul searching right So I but there's definitely some psychological issues. Make you uncomfortable. Yeah For some reason when I connect eyes I start think, I assume that you're judging me. Oh, well, I am. That's why you assume that.
Starting point is 00:04:51 Yeah. We all are. All right. This is perfect. The podcast is me and you both steer the table all the time. Do you think robots of the future, ones with human level intelligence, will be female, male, genderless, or another gender we've not yet created as a society.
Starting point is 00:05:09 You're the expert at this. Well, I'm going to ask you. You don't know the answer. I'm going to ask you questions that maybe nobody knows the answer to or and then I just want you to hypothesize as a as a imaginative author, director, comedian, and can we just be very clear that you know a ton about this and I know nothing about this, but I have thought a lot about what I think robots can fix in our society. And I mean, I'm a comedian. It's my job to study human nature, to make jokes about human nature,
Starting point is 00:05:45 and to sometimes play devil's advocate. And I just see such a tremendous negativity around robots, or at least the idea of robots, that it was like, oh, I'm just gonna take the opposite side for fun for jokes. And then I was like, oh no, I really agree in this devil's advocate argument. So I, please correct me when I'm wrong about this stuff.
Starting point is 00:06:06 So first of all, there's no right and wrong because we're all, I think most of the people working on robotics are really not actually even thinking about some of the big picture things that you've been exploring. In fact, your robot, what's her name, by the way? Bear claw. We'll go on Bear claw. Okay.
Starting point is 00:06:28 What's the genesis of that name, by the way? We'll go on Bearclaw. Okay. What's the genesis of that name, by the way? Bearclaw was, I don't even remember the joke, because I black out after I shoot specials, but I was writing something about like the pet names that men call women, like cupcake, sweetie, honey, you know, like we're always named after desserts or something, and I was just writing a joke about if you want to call us to dessert, at least pick like a cool dessert, you know, like baraclaw, like something cool. So I had enough calling her.
Starting point is 00:06:53 Just stuck. So do you think the future robots of greater and greater intelligence will like to make them female male? Would we like to sign them gender, or would we like to move away from gender and say something more ambiguous? I think it depends on their purpose. I feel like if it's a sex robot, people prefer certain genders. And I also, when I went down and explored the robot factory, I was asking about the type of people that bought sex robots. And I was very surprised at the answer because, of course, the stereotype was it's going to
Starting point is 00:07:31 be a bunch of perverts. It ended up being a lot of people that were handicapped, a lot of people with erectile dysfunction, and a lot of people that were exploring their sexuality. A lot of people that were thought they were gay, but weren't sure, but didn't want to take the risk of trying on someone that could reject them and being embarrassed or they were closeted or in a city where maybe that's taboo and stigmatized. So I think that a gendered sex robot that would serve an important purpose for someone trying to explore their sexuality.
Starting point is 00:08:01 Am I intimate? Let me try on this thing first. I have to admit, let me try on this thing first. So I think gendered robots would be important for that. But I think genderless robots in terms of emotional support robots, babysitters. I'm fine for a genderless babysitter with my husband in the house.
Starting point is 00:08:16 You know, there are places that I think that genderless makes a lot of sense, but obviously not in the sex area. What do you mean with your husband in the house? What's that have to do with the gender of the robot? Right, I mean, I don't have a husband, but hypothetically speaking, I think every woman's worst nightmare is like the hot babysitter.
Starting point is 00:08:34 You know what I mean? So I think that there is a time and place, I think, for genderless, you know, teachers, doctors, all that kind of, it would be very awkward if the first robotic doctor was a guy or the first robotic nurse as a woman. You know, it's sort of, that stuff is still loaded. I think that genderless could just take the unnecessary drama out of it and possibility to sexualize them or be triggered by any of that stuff.
Starting point is 00:09:06 So there's two components to this, to bear a clause. So one is the voice and the talking, so on, and then there's the visual appearance. So on the topic of gender and generalist, in your experience, what has been the value of the physical appearance? So has it added much to the depth of the interaction? I mean, mine's kind of an extenuating circumstance
Starting point is 00:09:28 because she's supposed to look exactly like me. I mean, I spent six months getting my face molded and having, you know, the idea was I was exploring the concept of Ken robots replace us because that's the big fear, but also the big dream in a lot of ways. And I wanted to dig into that area because, you know, for a lot of people, it's like,
Starting point is 00:09:46 they're going to take our jobs and they're going to replace us, legitimate fear. But then a lot of women I know are like, I would love for a robot to replace me every now and then. So it can go to baby showers for me. And it can pick up my kids at school and it can cook dinner and whatever. So I just think that was an interesting place to explore. So her looking like me was a big part of it. Now her looking like me just adds an unnecessary level
Starting point is 00:10:07 of insecurity, because I got her a year ago and she already looks younger than me. So that's a weird problem. But I think that her looking human was the idea. And I think that where we are now, please crack me if I'm wrong. A human robot resembling an actual human you know is going to feel more realistic than some generic face.
Starting point is 00:10:30 Well, you're saying that robots that have some familiarity look similar to somebody that you actually know you'll be able to form a deeper connection with. That was the question. I think so on some levels. That's an open question. I don't, you know, it's an interesting or the opposite. Because then you know me and you're like, well, I know this isn't real because you're
Starting point is 00:10:50 right here. So maybe it does the opposite. We have a very keen eye for human faces and they're able to detect strangeness, especially that what it has to do with people whose faces we've seen a lot of. So I tend to be a bigger fan of moving away completely from faces. Recognizable faces? No, just human faces at all.
Starting point is 00:11:12 In general, because I think that's where things get dicey. And one thing I will say is, I think my robot is more realistic than other robots, not necessarily because you have seen me and then you see her. And you go, oh, they're so similar, but also because human faces are flawed and asymmetrical. And sometimes we forget when we're making things that's supposed to look human, we make them too symmetrical.
Starting point is 00:11:33 And that's what makes them stop looking human. So because they molded my asymmetrical face, she just, even if someone didn't know who I was, I think she'd look more realistic than most generic ones that didn't have some kind of flaws. Got it. Yeah, because they start looking creepy when they're too symmetrical, because human beings aren't.
Starting point is 00:11:50 Yeah, the flaws is what it means to be human, so visually as well. But I'm just a fan of the idea of letting humans use a little bit more imagination. So just hearing the voice is enough for us humans to then start imagining the visual appearance that goes along with that voice. And you don't necessarily need to work too hard on creating the actual visual appearance. So there's some value to that.
Starting point is 00:12:15 When you step into the character of actually building a robot that looks like bearcaused, such a long road of facial expressions of sort of making everything smiling, winking, rolling the eyes, all that kind of stuff, it gets really, really tricky. It gets tricky. I think I'm, again, I'm a comedian, like I'm obsessed with what makes this human and our human nature in the nasty side of human nature tends to be where I've ended up exploring over and over again. I was just mostly fascinated by people's reactions. So it's my job to get the biggest reaction from a group of strangers, the loudest possible reaction. And I just had this instinct just when I started building her and people going, and scream and people scream. And I mean, I bring around on stage and people would scream. And I just, to me that was the next level of entertainment.
Starting point is 00:13:08 Getting a laugh, I've done that, I know how to do that. I think comedians were always trying to figure out what the next level is and comedies evolving so much. And, you know, Jordan Peale had just done, you know, these genius comedy horror movies, which feel like the next level of comedy to me. And this sort of funny horror of a robot was fascinating to me. But I think the thing that I got the most obsessed with was people being freaked out and scared of her.
Starting point is 00:13:35 And I started digging around with pathogen avoidance and the idea that we've essentially evolved to be repelled by anything that looks human, but is off a little bit. Anything that could be sick or diseased or dead, essentially, is our reptilian brain's way to get us to not have sex with it, basically. So I got really fascinated by how freaked out and scared. I mean, I would see grown men get upset. The good thing away from me,
Starting point is 00:14:02 and I'm like, people get angry. And it was like, you know, what this is, you know, but the sort of like, you know, amygdala getting activated by something that to me is just a fun toy, said a lot about our history as a species, and what got us into trouble thousands of years ago. So it's that. It's the deep down stuff that's in our genetics, but also is it just, are people freaked out by the fact that there's a robot, so it's not just the appearance, but there's an artificial human.
Starting point is 00:14:34 Anything people I think, and I'm just all, also fascinated by the blind spot humans have. So the idea that you're afraid of that, I mean, how many robots have killed people? How many humans have died at the hands of other humans? Yeah, a million? A few more. Hundreds of millions.
Starting point is 00:14:49 Yet we're scared of that. And we'll go to the grocery store and be around a bunch of humans who statistically the chances are much higher that you're going to get killed by humans. So I'm just fascinated by, without judgment, how irrational we are as species. The war is the exponential. So you can say the same thing about nuclear weapons before we dropped on Hiroshima and
Starting point is 00:15:11 Nagasaki. So the word that people have is the exponential growth. So it's like, oh, it's fun and games right now, but overnight, especially if a robot provides value to society, we'll put one in every home. And then all of a sudden, loose track of the actual large-scale impact it has on society, and then all of a sudden, it'll gain greater and greater control to where we'll all be, you know, affect our political system and then affect our decisions. And robots already ruined our political system. Didn't that just already happen? Which ones?
Starting point is 00:15:46 Oh, Russia hacking. No offense. But hasn't that already happened? I mean, that was like an algorithm of negative things being clicked on more. We'd like to tell stories and like to demonize certain people. I think nobody understands our current political system or discourse on Twitter, the Twitter
Starting point is 00:16:05 mobs. Nobody has a sense, not Twitter, not Facebook, the people running it. Nobody understands the impact of these algorithms, they're trying their best. Despite what people think, they're not like a bunch of lefties trying to make sure the Hillary Clinton gets elected. It's more that it's a incredibly complex system that we don't. And that's the worry. It's so complex and moves so fast that nobody will be able to stop it once it happens.
Starting point is 00:16:32 And let me ask a question. This is a very savage question, which is, is this just the next stage of evolution as humans? Some people will die. Yes, I mean, that's always happened. You know, is this is just taking a motion out of it? Is this basically the next stage of survival, the fittest? Yeah, you have to think of organisms. You know, what is it mean to be a living organism? Like, is a smartphone part of your living organism?
Starting point is 00:17:02 Or... We're in relationships with our phones. Yeah. But it sex through them with them. What's the difference between with them and through them? But it also expands your cognitive abilities, expands your memory and knowledge and so on. So you're much smarter person because you have a smartphone in your hand.
Starting point is 00:17:19 But if one as soon as it's out of my hand, we've got big problems because we've become so morphed with them. Well, there's a symbiotic relationship. And that's what Elon Musk and your link is working on trying to increase the bandwidth of communication between computers and your brain. And so further and further expand our ability as human beings to leverage machines.
Starting point is 00:17:43 And maybe that's the future, the evolution, next evolution step. It could be also that yes, we'll give birth, just like we give birth to human children right now, to we'll give birth to AI and other places. I think it's a really interesting possibility. I'm gonna play devil's advocate. I just think that the fear of robots is wildly class-est because, I mean, Facebook.
Starting point is 00:18:07 Like it's easy for us to say they're taking their data. Okay, a lot of people that get employment off of Facebook, they are able to get income off of Facebook. They don't care if you take their phone numbers and their emails and their data as long as it's free. They don't want to have to pay $5 a month for Facebook. Facebook is a wildly democratic thing. Forget about the election and all that kind of stuff. A lot of technology making people's lives easier.
Starting point is 00:18:29 I find that most elite people are more scared than lower income people. And women for the most part. So the idea of something that's stronger than us and that might eventually kill us, like women are used to that. That's not, I see a lot of really rich men being like the robots are going to kill us. We're are used to that. Like that's not, I see a lot of like really rich men being like the robots are gonna kill us. We're like, what's another thing that's gonna kill us?
Starting point is 00:18:50 You know, I tend to see like, oh, something can walk me to my car at night. Like something can help me cook dinner. Something, you know, for, you know, people in underprivileged countries who can't afford eye surgery. Like in a robot, can we send a robot to underprivileged, you know, places to do surgery where they can't afford eye surgery. Like in a robot, can we send a robot to underprivileged places to do surgery, where they can't.
Starting point is 00:19:07 I work with this organization cooperation smile where they do cleft palate surgeries. And there's a lot of places that can't do a very simple surgery because they can't afford doctors and medical care and such. So I just see, and this can be completely naive and should be completely wrong, but I feel like a lot of people are going, like the robots are going to destroy us.
Starting point is 00:19:26 Humans, we're destroying ourselves. We're self-destructing. Robots to me are the only hope to clean up all the messes that we've created. Even when we go try to clean up pollution in the ocean, we make it worse because of the oil that the tankers, like it's like, to me, robots are the only solution. You know, firefighters are heroes,
Starting point is 00:19:43 but they're limited and how many times they can run into a fire. You know, so there's just something interesting to me. I'm not hearing a lot of like lower income, more vulnerable populations talking about robots. Maybe you can speak to it a little bit more. There's an idea, I think you've expressed it. I've heard, actually, a few female writers and robotists I've talked to express this idea that exactly you just said, which is it just seems that being afraid of existential threats of artificial intelligence is a male issue. Yeah. And I wonder what that is, if it, because men have insert positions,
Starting point is 00:20:30 like you said, it's also a class issue. They haven't been humbled by life. And so you always look for the biggest problems to take on around you. It's a champagne problem to be afraid of robots. Most people don't have health insurance, they're afraid they're not to be able to feed their kids. They can't afford a tutor for their kids.
Starting point is 00:20:47 I mean, I just think of the way I grew up and I had a mother who worked two jobs, had kids. We couldn't afford an SAT tutor. The idea of a robot coming in being able to tutor your kids, being able to provide child care for your kids, being able to come in with cameras for eyes and make sure surveillance. I'm very pro surveillance because I've had security problems and I've been, you know, we're generally in a little more danger than you guys are. So I think that robots are a little less scared of us because we can see that maybe as like free assistance help and protection.
Starting point is 00:21:20 And then there's sort of another element for me personally, which is maybe more of a female problem. I don't know. I'm just going to make a generalization. Happy to be wrong. But the emotional component of robots and what they can provide in terms of, I think there's a lot of people that don't have microphones that I just recently kind of stumbled upon
Starting point is 00:21:45 in doing all my research on the sex robots for my standard special, which is there's a lot of very shy people that aren't good at dating. There's a lot of people who are scared of human beings who, you know, have personality disorders or grow up in alcohol, or struggle with addiction, or whatever it is, where a robot can solve an emotional problem. And so we're largely having this conversation about rich guys that are emotionally healthy and how scared a robot is. We're forgetting about a huge part of the population who maybe isn't as charming and effervescent and salt meant as people like you and a lot of us who these robots could solve very real problems
Starting point is 00:22:26 in their life, emotional or financial. Well, that's a, in general, really interesting idea that most people in the world don't have a voice. It's, you've talked about it, sort of, even the people on Twitter who are driving the conversation. You said comments, people who leave comments represent a very tiny percent of the population, and they're the ones they, you know, we tend to think they speak for the population, but it's very possible on many
Starting point is 00:22:52 topics they don't at all. And look, I'm sure there's got to be some kind of legal, you know, sort of structure in place for when the robots happen, you know, way more about this than I do, but, you know, for me to just go the robots are bad. That's a wild generalization that I feel like is really inhumane in some way. Just after the research I've done, you're gonna tell me that a man whose wife died suddenly and he feels guilty moving on with a human woman
Starting point is 00:23:19 or can't get over the grief. He can't have a sex robot in his own house. Why not? Who cares? Why do you care? Well, there's an interesting aspect of human nature. So, you know, we tend to as a as a civilization to create a group that's the other in all kinds of ways. Right. And so you work with animals to you're especially sensitive to the suffering of animals. Let me kind of ask, what's your, do you think will abuse robots in the future? Do you think some of the darker aspects of human
Starting point is 00:23:53 nature will come out? I think some people will, but if we design them properly, the people that do it, we can put it on a record and they can put them in jail. We can find sociopaths more easily. But why is that a sociopathic thing to karma robot? I think, look, I don't know enough about the consciousness and stuff as you do. I guess it would have to be when they're conscious, but it is the part of the brain that is response for from a passion, the frontal lobe, or whatever.
Starting point is 00:24:22 People that abuse animals also abuse humans and commit other kinds of crimes. Like that's it's all the same part of the brain. No one abuses animals and then it's like awesome to women and children and awesome to underprivileged, you know, minorities, like it's all so, you know, we've been working really hard to put a database together of all the people that have abused animals. So when they commit another crime, you go, okay, this is, you know, it's all the same stuff. And I think people probably think I'm nuts for a lot of the animal work I do, but because when animal abuse is present,
Starting point is 00:24:53 another crime is always present, but the animal abuse is the most socially acceptable. You can kick a dog and there's nothing people can do, but then what they're doing behind closed doors, you can't see. So there's always something else going on, which is why I never feel a compunction about it. But I do think we'll start seeing the same thing
Starting point is 00:25:09 with robots, the person that kicks the, I felt compassionate when the kicking the dog robot really pissed me off. I know that they're just trying to get the stability right and all that, but I do think there will come a time where that will be a great way to be able to figure out if somebody has anti-social behaviors. You kind of mentioned surveillance.
Starting point is 00:25:35 It's also a really interesting idea of yours that you just said, a lot of people seem to be really uncomfortable with surveillance. And you just said that, for me, there's positives for surveillance. Yeah. And you just said that, you know what, for me, you know, there's positives for surveillance. I think people behave better when they know they're being watched. And I know this is a very unpopular opinion. I'm talking about it on stage right now. We behave better when we know we're being watched.
Starting point is 00:25:57 You and I had a very different conversation before we were recording. If we behave different, you sit up, and you are in your best behavior, and I'm trying to sound eloquent, and I'm trying to not hurt anyone's feelings. And I have a camera right there. I'm behaving totally different than we first started talking.
Starting point is 00:26:13 When you know there's a camera, you behave differently. I mean, there's cameras all over LA at stoplights so that people don't run stoplights. But there's not even film in it. They don't even use them anymore. But it works. It works. Right?
Starting point is 00:26:26 And I'm working on this thing in stand-by-surveillance. It's like, that's why we embed in Santa Claus. It's the Santa Claus is the first surveillance, basically. All we have to say to kids is he's making a list and he's watching you, and they behaved better. That's brilliant. So I do think that there are benefits to surveillance. I think we all do sketchy things in private and we all have watched weird porn or Google weird things
Starting point is 00:26:51 And we don't we don't want people to know about it. Yeah, the our secret lives So I do think that obviously there's we should be able to have a monochrome of privacy But I tend to think that people that are the most Negative about surveillance of the most secret I tend to think that people that are the most negative about surveillance of the most secret. The most of high. Well, you should, is your thing you're doing a bit on it now? Well, I'm just talking in general about privacy and surveillance and how paranoid we're kind of becoming and how, you know, I mean, it's just wild to me that people are, like, our emails are
Starting point is 00:27:22 going to leak and they're taking our phone numbers. Like, there used to be a book full of phone numbers in addresses that were, they just throw it at your door. And we all had a book of everyone's numbers, you know, this is a very new thing. And, you know, I know our MIGGLE is designed to compound sort of threats. And, you know, there's stories about, and I think we all just glum on in a very, you know, tribe away. I'm like, yeah, they're chicken or data. Like, we don't even know what that means.
Starting point is 00:27:50 But we're like, well, yeah, they, they, you know. So I just think that someone's like, okay, well, so what, they're gonna sell your data, who cares? Why do you care? First of all, that bit will kill in China. So, and I said, I said said I'm only a little bit joking because a lot of people in China, including the citizens, despite what people in the West think of as abuse, I actually in support of the idea of surveillance. So they're not in support of the abuse of surveillance, but they like they mean the idea of surveillance is kind of like
Starting point is 00:28:27 the idea of government. Like you said, we behave differently, and in a way, it's almost like why we like sports. There's rules, and within the constraints of the rules there is a more stable society. And they make good arguments about success, being able to build successful companies, being able to build successful social lives, around a fabric that's more stable. When you have surveillance, it keeps the criminals away, keeps abuse away; whatever the values of the society with surveillance, you can enforce those values better. And here's what I will say,
Starting point is 00:29:02 there's a lot of unethical things happening with surveillance. Like, I feel the need to really make that very clear. I mean, the fact that Google is, like, collecting whether people's hands start moving on the mouse to find out if they're getting Parkinson's, and then their insurance goes up. Like, that is completely unethical and wrong, and I think stuff like that we have to really be careful
Starting point is 00:29:22 around. So the idea of using our data to raise our insurance rates, or, I heard that they're looking at whether they can sort of predict if you're gonna have depression based on your selfies, by detecting micro-muscles in your face. You know, all that kind of stuff, that is a nightmare, not okay. But I think, you know, we have to delineate
Starting point is 00:29:39 what's a real threat and what's not. Getting spam in your email box, that's not worth spending your time and energy on. Focus on the fact that every time you buy cigarettes, your insurance is going up without you knowing about it. On the topic of animals, can we just linger on it a little bit? What do you think, what does it say about our society, the society-wide abuse of animals that we see in general, factory farming, just in general, just the way we treat animals of different categories? Like, what do you think of that?
Starting point is 00:30:14 What does a better world look like? What should people think about it, in general? I think the most interesting thing I can probably say around this, and the least emotional, because I'm actually not a very emotional animal person, because I think everyone's an animal person; it's just a matter of if it's yours, or if you've been conditioned to go numb. I think it's really a testament to what we as a species are able to be in denial about,
Starting point is 00:30:41 mass denial and mass delusion, and how we're able to dehumanize and debase groups of others, in a way, in order to conform and find protection in the conforming. So we are also a species who used to go to coliseums and watch elephants and tigers fight to the death. We used to watch human beings be pulled apart. And that wasn't that long ago. We're also a species who had slaves.
Starting point is 00:31:13 And it was socially acceptable by a lot of people. People didn't see anything wrong with it. So we're a species that is able to go numb, and that is able to dehumanize very quickly and make it the norm. Child labor wasn't that long ago. Like, the idea that now we look back and go, oh yeah, kids were losing fingers in factories
Starting point is 00:31:32 making shoes. Like, someone had to come in and make that stop, you know. So I think it just says a lot about the fact that, you know, we are animals, and we are self-serving, and one of the most successful species, because we are able to debase and degrade and essentially exploit anything that benefits us. I think the pendulum is going to swing, as in, I think we're wrong now, kind of. I think we're on the verge of collapse, because of our dopamine receptors. Like, we are just... I
Starting point is 00:32:05 think we're all kind of addicts when it comes to this stuff. Like, we don't know when to stop. The thing that used to keep us alive, which is killing animals and eating them... now killing animals and eating them is what's killing us, in a way. So it's like, we just can't... we don't know when to call it, and moderation is not really something that humans have evolved to have yet. So I think it's really just a flaw in our wiring.
Starting point is 00:32:30 Do you think we'll look back at this time as our society being deeply unethical? Yeah, I think we'll be embarrassed. What are the worst parts going on right now? In terms of animals? Well, I think that... No, in terms of anything. What's the unethical thing? And it's very hard to just take a step out of it, but you just said we used to watch...
Starting point is 00:32:52 You know, there's been a lot of cruelty throughout history. What's the cruelty going on now? I think it's going to be pigs. I mean, pigs are one of the most emotionally intelligent animals, and they have the intelligence of, like, a three-year-old, and I think we'll look back and be really ashamed. They use tools. I mean, I think we have this narrative that they're pigs, that they're disgusting and they're dirty and they're bacon, and so I think that we'll look back
Starting point is 00:33:20 one day and be really embarrassed about that. Is this just, what's it called, factory farming? So, basically, mass farming. Because we don't see it. If you saw it... I mean, we do have, and this is probably an evolutionary advantage, the ability to completely pretend something's not happening when it is so horrific that it overwhelms us, and we are able to essentially deny that it's happening.
Starting point is 00:33:44 I think if people were to see what goes on in factory farming, and also were really to take in how bad it is for us... We're hurting ourselves first and foremost with what we eat. But that's also a very elitist argument. It's a luxury to be able to complain about meat. It's a luxury to be able to not eat meat. There's very few people, because of, you know, how the corporations have set up meat being cheap. You know, it's $2 to buy a Big Mac, and $10 to buy a healthy meal. I think a lot of people don't
Starting point is 00:34:16 have the luxury to even think that way. But I do think that, animals in captivity, I think we're going to look back and be pretty grossed out about mammals in captivity: whales, dolphins. I mean, that's already starting to dismantle. Circuses, we're gonna be pretty embarrassed about. But I think it's really more a testament to, you know, there's just such an ability to go, like, that thing is different than me, and we're better. It's the ego. I mean, we're just the species with the biggest ego, ultimately.
Starting point is 00:34:45 Well, that's what I think, that's my hope for robots. You mentioned consciousness before; nobody knows what consciousness is, but I'm hoping robots will help us empathize and understand that there's other creatures out there besides ourselves that can suffer, that can experience the world, and that we can torture by our actions. And robots can explicitly teach us that,
Starting point is 00:35:16 I think, better than animals can. I have never seen such compassion from a lot of people in my life toward any human, animal, or child as I have from a lot of people in the way they interact with their robot. Because I think there's something... I mean, I was on the robot owners' chat boards for a good eight months, and the main emotional benefit is: she's never going to cheat on you, she's never going to hurt you, she's never going to lie to you, she doesn't judge you. You know, I think that robots help people, and this is part of the work I do with animals,
Starting point is 00:35:57 like I do equine therapy and train dogs and stuff, because there is this safe space to be authentic. You're with this being that doesn't care what you do for a living, doesn't care how much money you have, doesn't care who you're dating, doesn't care what you look like, doesn't care if you have cellulite, whatever. You feel safe to be able to truly be present, without being defensive and worrying about eye contact and being triggered by needing to be perfect
Starting point is 00:36:19 and fear of judgment and all that. And robots really can't judge you, yet, but for now they can't judge you, and I think it really puts people at ease and at their most authentic. Do you think you can have a deep connection with a robot that's not judging you? Do you think you can really have a relationship with a robot or a human being that's a safe space? Or is tension, mystery, danger necessary for a deep connection? I'm going to speak for myself and say that I grew up in an alcoholic home. I identify as a codependent; I've talked about this stuff before. But for me, it's very hard
Starting point is 00:37:01 to be in a relationship with a human being without feeling like I need to perform in some way or deliver in some way, and I don't know if that's just the people I've been in relationships with, or me, or my brokenness. And this is going to sound really negative and pessimistic, but I do think a lot of our relationships are projection, and a lot of our relationships are performance. And I don't think I really understood that until I worked with horses. And most communication with humans is nonverbal, right?
Starting point is 00:37:35 I can say, like, I love you, but you don't think I love you, right? Whereas with animals, it's very direct. It's all physical, it's all energy. I feel like that with robots, too. How I say something doesn't matter, my inflection doesn't really matter, and there's no you thinking that my tone is disrespectful. Like, you're not filtering it through all of the bad relationships you've been in. You're not filtering it through the way your mom talked to you. You're not getting triggered. You know, I find that, for the most part, people don't always receive things the way that you
Starting point is 00:38:09 intended, and that makes relationships really murky. So, the relationships with animals and relationships with robots as they are now, you kind of implied that that's more healthy. Can you have a healthy relationship with other humans? Or, not healthy, I don't like that word, but, you've talked about codependency, maybe you can talk about what codependency is, but are the challenges of that, the complexity of that, necessary for passion, for love between humans? That's right, you love passion. That's a good thing. I thought this would be a safe space. I got
Starting point is 00:38:52 trolled by Rogan for, I think, hours on this. Look, I am not anti-passion. I think that I've just maybe been around long enough to know that sometimes it's ephemeral, and that passion is a mixture of a lot of different things: adrenaline, which turns into dopamine, cortisol. It's a lot of neurochemicals, it's a lot of projection, it's a lot of what we've seen in movies. I identify as an addict, so for me, sometimes passion is like, this could be bad. And I think we've been so conditioned to believe that passion means you're soulmates. And, I mean, how many times have you had a passionate connection with someone and then
Starting point is 00:39:32 it was a total train wreck? Passion. Train wreck. How many times exactly? You did a lot of math in your head in that little moment. Counting. I mean, what's wrong with a train wreck? Why is obsession... So you describe this codependency, and sort of the idea of over-attachment, attachment to people who don't deserve that kind of attachment, as somehow a bad thing. And I think our society says it's a bad thing.
Starting point is 00:40:04 It probably is a bad thing, like a delicious burger is a bad thing. I don't know. Right. Oh, that's a good point. I think that you're pointing out something really fascinating, which is, like, passion, if you go into it knowing it's like pizza, where it's going to be delicious for two hours and then I don't have to have it again for three, if you can have a choice in the passion... I define passion as something that is relatively unmanageable, something you can't control or stop and start with your own volition. So maybe we're operating under different definitions. If passion is something that ruins your relationships and marriages and screws up your professional
Starting point is 00:40:39 life, and becomes this thing that you're not in control of, and becomes addictive, I think that's the difference. Is it a choice, or is it not a choice? And if it is a choice, then passion's great. But if it's something that consumes you and makes you start making bad decisions and clouds your frontal lobe, and is just all about dopamine, and not really about the person,
Starting point is 00:41:02 and more about the neurochemical, we call it the internal drug cabinet. If it's all just your own drugs, that's different, because sometimes you're just on drugs. Okay. There's a philosophical question here. It's interesting for a comedian, a brilliant comedian, to speak so eloquently about a balanced life. I kind of want to argue against this point. There's such an obsession with creating this healthy lifestyle. No,
Starting point is 00:41:32 it's, ecologically speaking... You know, I'm a fan of the idea that you sort of fly high and you crash and die at 27. Mm-hmm. That's also a possible life, and it's not one we should judge, because I think there's moments of greatness. I talked to Olympic athletes where some of their greatest moments are achieved in their early 20s, and the rest of their life is in a kind of fog of almost a depression, because they can have...
Starting point is 00:41:59 Based on their physical prowess, right? Physical prowess. And they'll never, so that's, so they're watching the physical prowess fade. They'll never achieve the kind of height, not just physical, of just emotion, of... The max number of neurochemicals. Yeah. And you also put your money on the wrong horse. That's where I would just go like, oh yeah, if you're doing a job where you peak at 22,
Starting point is 00:42:18 the rest of your life is going to be hard. That idea is considering the notion that you want to optimize some kind of... But we're all going to die soon. What? Now you tell me. I've immortalized myself, so I'm going to be fine. You're almost like, how many Oscar-winning movies can I direct by the time I'm 100, how many this and that. Like, but, you know,
Starting point is 00:42:55 it's all... life is short, relatively speaking. I know, but it can also come in different forms. You can go: life is short, play hard, fall in love as much as you can, run into walls. I would also go: life is short, don't deplete yourself on things that aren't sustainable and that you can't keep. You know, so I think everyone gets dopamine from different places, everyone gets meaning from different places.
Starting point is 00:43:18 I look at the fleeting, passionate relationships I've had in the past, and I don't have pride in that. I think that you have to decide what, you know, helps you sleep at night. For me, it's pride in feeling like I behaved with grace and integrity. That's just me personally. Everyone can go, like, yeah, I slept with all the hot chicks in Italy I could,
Starting point is 00:43:38 and I, you know, did all the whatever. Like, whatever you value; we're allowed to value different things. We're talking about Brian Callen. Yeah. Brian Callen has lived his life to the fullest, to say the least. But I think that, just for me personally, and this could be like my workaholism
Starting point is 00:43:55 or my need for achievement: if I don't have something to show for something, I feel like it's a waste of time, or some kind of loss. I'm in a 12-step program, and the third step would say there's no such thing as a waste of time, and everything happens exactly as it should, and whatever. That's a way to just sort of keep us sane, so we don't grieve too much and beat ourselves up over past mistakes; there's no such thing as mistakes. But I think passion, I think it's so life-affirming, and one of the few things
Starting point is 00:44:28 that maybe, for people like us, makes us feel awake and seen. And we just have such a high threshold for adrenaline. You know, I mean, you are a fighter, right? Yeah. Okay, so yeah. So you have a very high tolerance for adrenaline. And I think that Olympic athletes, the amount of adrenaline they get from performing, it's very hard to follow that.
Starting point is 00:44:51 It's like when guys come back from the military and they have depression. It's like, do you miss bullets flying at you? You kind of do, because of that adrenaline, which turned into dopamine, and the camaraderie. I mean, there's people that speak much better about this than I do. But I'm obsessed with neurology, and I'm just obsessed with sort of the lies we tell ourselves
Starting point is 00:45:10 in order to justify getting neurochemicals. You've actually done a lot of thinking and talking about neurology, and you kind of look at human behavior through the lens of how our brain chemically works. So, first of all, why did you connect with that idea, and how has your view of the world changed by considering that the brain is just a machine? You know, I know it probably sounds really nihilistic, but for me, it's very liberating to know a lot about neurochemicals. It's like the same thing with critics, with critical reviews: if you believe the good, you have to
Starting point is 00:45:51 believe the bad, kind of thing. Like, you know, if you believe that your bad choices were because of your moral integrity or whatever, you have to believe your good ones were, too. I just think there's something really liberating in going, like, oh, that was just adrenaline. I just said that thing because I was adrenalized and I was scared, and my amygdala was activated, and that's why I said, you're an asshole, get out. You know, I just think it's important to delineate what's nature and what's nurture, what is your choice and what is just your brain trying to keep you safe. I think we forget
Starting point is 00:46:19 that even though we have security systems in our homes and locks on our doors, our brain, for the most part, is just trying to keep us safe all the time. It's why we hold grudges, it's why we get angry, it's why we get road rage, it's why we do a lot of things. And also, when I started learning about neurology, I started having so much more compassion for other people. You know, if someone yelled at me, being like, fuck you, on the road,
Starting point is 00:46:39 I'd be like, okay, he's producing adrenaline right now, because we're all going 65 miles an hour and our brains aren't really designed for this type of stress, and he's scared. He was scared, you know. So that really helped me to have more love for people in my everyday life, instead of being in fight-or-flight mode. But the, I think, more interesting answer to your question is that I've had migraines my whole life. Like, I've suffered with really intense migraines, ocular migraines, ones where my arm would go numb. And I've just gone to so many doctors to learn about it.
Starting point is 00:47:13 And I started, you know, learning that we don't really know that much. We know a lot, but it's wild to go to one of the best neurologists in the world who's like, yeah, we don't know. And that fascinated me. It's, like, one of the worst pains you can probably have, all that stuff, and we don't know the source.
Starting point is 00:47:31 And there is something really fascinating about when your left arm starts going numb and you start not being able to see out of the left side of both your eyes. And I remember, when the migraines get really bad, it's like a mini stroke almost, and I'm able to see words on a page, but I can't read them. They just look like symbols to me. So there's something just really fascinating to me about your brain just being able to stop functioning.
Starting point is 00:47:55 And so I just wanted to learn about it, study it. I did all these weird alternative treatments. I got this piercing in here that actually works. I've tried everything. And then both my parents had strokes. So when both of my parents had strokes, I became sort of the person who had to decide what was going to happen with their recovery, which is just a wild thing to have to deal with at, you know, 28 years old, when it happened.
Starting point is 00:48:15 you know, 28 years old when it happened. And I started spending basically all day every day and I see use with neurologists learning about what happened to my dad's brain and why he can't move his left arm but he can move his right leg but he can't see out of the, you know, and then my mom had another stroke
Starting point is 00:48:33 in a different part of the brain. So I started having to learn what parts of the brain did what, so that I wouldn't take the behavior so personally, and so that I would be able to manage my expectations in terms of their recovery. So my mom, because it affected a lot of her frontal lobe, changed a lot as a person. She was way more emotional, she was way more micromanaging, she was forgetting certain things. So it broke my heart less when I was able to know, oh yeah, the stroke hit this part of the brain, and that's the one that's responsible for short-term memory, and that's responsible for
Starting point is 00:48:58 brain and that's the one that's responsible for short-term memory and that's responsible for long-term memory, dead it up. And then my brother just got something called viral encephalitis, which is an infection inside the brain. It was wild that I was able to go, oh, I know exactly what's happening here, and I know. That allows you to have some more compassion for the struggles that people have. But does it take away some of the magic for some of the, from some of the more for some of the, from the, some of the more
Starting point is 00:49:26 positive experiences of life? Sometimes. Sometimes. I'm such a control addict that, you know, for someone like me, the biggest dream is to know why someone's doing what they're doing. That's what stand-up is. It's just trying to figure out why,
Starting point is 00:49:40 or that's what writing is, that's what acting is, that's what performing is. It's trying to figure out why someone would do something. As an actor, you get a piece of material and you go, this person, why would he say that? Why would he or she pick up that cup? Why would she walk over here? It's really why, why, why, why?
Starting point is 00:49:53 So I think neurology, if you're trying to figure out human motives and why people do what they do, it'd be crazy not to understand how neurochemicals motivate us. I also have a lot of addiction in my family, hardcore drug addiction and mental illness, and in order to cope with it, you really have to understand it.
Starting point is 00:50:11 Borderline personality disorder, schizophrenia, and drug addiction. So I have a lot of people I love that suffer from drug addiction and alcoholism, and the first thing they start teaching you is that it's not a choice. These people's dopamine receptors don't hold dopamine the same way yours do.
Starting point is 00:50:26 Their frontal lobe is underdeveloped. And that really helped me to navigate loving people that were addicted to substances. I want to be careful with this discussion, but how much... How much money do you have? Can I borrow $10? Okay. No, how much control, how much, despite the chemical imbalances or the biological limitations that each of our individual brains have, how much mind over matter is there? I've known people with clinical depression, and it's always a touchy subject to say
Starting point is 00:51:12 how much they can really help it. Very. What can you, yeah, what can you do... Because you've talked about codependency, talked about issues that you've struggled through, and nevertheless, you choose to take a journey of healing and so on. That's your choice, those are your actions. So, how much can you do to help fight the limitations of the neurochemicals in your brain? That's such an interesting question, and I don't think I'm at all qualified to answer, but I'll say what I do know. And really quick, just the definition of codependency:
Starting point is 00:51:45 I think a lot of people think of codependency as, like, two people that can't stop hanging out, you know. That's not totally off, but I think, for the most part, my favorite definition of codependency is the inability to tolerate the discomfort of others. You grow up in an alcoholic home, you grow up around mental illness,
Starting point is 00:52:02 you grow up in chaos, you have a parent that's a narcissist, you basically are wired to just people-please, worry about others, be perfect, walk on eggshells, shape-shift to accommodate other people. So codependency is a very active wiring issue that doesn't just affect your romantic relationships; it affects you being a boss, it affects you in the world. Online, you get one negative comment and it throws you for two weeks. It's also linked to eating disorders
Starting point is 00:52:33 and other kinds of addiction. So it's a very big thing, and I think a lot of people sometimes only think of it in terms of romantic relationships, so I always feel the need to say that. It's also one of the reasons I love the idea of robots so much: you don't have to walk on eggshells around them. You don't have to worry they're gonna get mad at you, yet. Codependents are hypersensitive to the needs and moods of others, and it's very exhausting. It's depleting. Just one
Starting point is 00:53:00 conversation about where we're gonna go to dinner is like: do you wanna go get Chinese food? We just had Chinese food. Wait, are you mad? No, I didn't mean to... Codependents live in this place where everything means something, and humans can be very emotionally exhausting. Why did you look at me that way?
Starting point is 00:53:18 What are you thinking about? What was that? Why did you pick up your phone? It's a hypersensitivity that can be incredibly time-consuming, which is why I love the idea of robots just subbing in. I've even had a hard time running TV shows and stuff, because even asking someone to do something, I don't want to come off like a bitch. I'm very concerned about what other
Starting point is 00:53:34 people think of me, how I'm perceived, which is why I think robots will be very beneficial for codependents. By the way, just a real quick tangent: that skill, or flaw, whatever you want to call it, is actually really useful, if you ever do start your own podcast, for interviewing, because you're kind of obsessed with the mindset of others, and it makes you a good sort of listener and talker. I think, what's her name from NPR, Terry Gross, talked about having that. I don't feel like she has that at all.
Starting point is 00:54:10 What? But she worries about other people's feelings. Yeah, absolutely. Oh, I don't get that at all. I mean, you have to put yourself in the mind of the person you're speaking with. Yes, oh, I see, just in terms of that. Yeah, I am starting a podcast, and the reason I haven't yet is because I'm
Starting point is 00:54:28 codependent. I'm too worried it's not going to be perfect. Yeah. So a big codependent adage is: perfectionism leads to procrastination, which leads to paralysis. So how do you... sorry, I take a million tangents. How do you survive on social media? Because you're exceptionally active. But by the way, I took you on a tangent and didn't answer your last question about how much we can control. If you want, we'll return to it, or maybe not. The answer is, we can. Now, that's the codependent in me... okay, guys.
Starting point is 00:54:53 We can, but, you know, one of the things that I'm fascinated by is, you know, the first thing you learn when you go into 12-step programs or addiction recovery, any of this, is: genetics loads the gun, environment pulls the trigger. And there's certain parts of your genetics you cannot control.
Starting point is 00:55:08 I come from a lot of alcoholism, I come from, you know, a lot of mental illness. There's certain things I cannot control, and a lot of things, maybe we don't even know yet what we can and can't control, because of how little we actually know about the brain. But we also talk about the warrior spirit, and there are some people that have that warrior spirit, and we don't necessarily know what that engine is, whether it's that you get dopamine from succeeding, or achieving, or martyring yourself, or the attention you get from growing. So a lot of people are like, oh, this person can edify themselves and overcome, but if you're getting attention from improving yourself, you're going to keep
Starting point is 00:55:49 wanting to do that. So that is something that helps a lot, in terms of changing your brain. If you talk about changing your brain to people, and talk about what you're doing to overcome said obstacles, you're going to get more attention from them, which is going to fire off your reward system, and then you're going to keep doing it. So you can leverage that momentum. This is why, in any 12-step program, you go into a room and you talk about your progress, because then everyone claps for you, and then you're more motivated to keep going.
Starting point is 00:56:15 So that's why we say you're only as sick as the secrets you keep, because if you keep things secret, you know, there's no one guiding you to go in a certain direction. Right, we're sort of designed to get approval from the tribe, or from a group of people, because our brain, you know, translates it to safety. So, in that case, the tribe is a positive one that helps you go in a positive direction. So that's why it's so important to go into a room and also say, hey, I wanted to use drugs today, and people go, hmm, me too. And you feel less alone, and you feel less like you,
Starting point is 00:56:46 you know, have been castigated from the pack or whatever. And then, you know, you get a chip when you haven't had a drink for 30 days, or 60 days, or whatever. You get little rewards. So, talking about a pack that's not at all healthy or good, but in fact is often toxic: social media. You're one of my favorite people on Twitter and Instagram
Starting point is 00:57:06 to follow, sort of, for both the comedy and the insight, and just fun. How do you prevent social media from destroying your mental health? I haven't. I haven't. It's the next big epidemic, isn't it? I don't think I have.
Starting point is 00:57:23 I don't think I have. Is moderation the answer? Maybe, but you can do a lot of damage in a moderate way. I mean, I guess it depends on your goals, you know? And I think, for me, my addiction to social media, I'm happy to call it an addiction, and I define it as an addiction because it stops being a choice. There are times I just reach over, and I'm like,
Starting point is 00:57:46 that was weird. That was weird. I'll be driving sometimes. And I'll be like, oh my God. My arm just went to my phone. You know, I can put it down. I can't take time away from it. But when I do, I get antsy.
Starting point is 00:57:58 I get restless, irritable, and discontent. I mean, that's kind of the definition, isn't it? So I think by no means do I have a healthy relationship with social media. I'm sure there's a way to, but I think I'm especially a weirdo in this space, because it's easy to conflate: is this work? I can always say that it's for work, you know? But I mean, don't you get the same kind of thing
Starting point is 00:58:21 as you get when a room full of people laugh at your jokes? Because, I mean, especially the way you do Twitter, it's an extension of your comedy, in a way. I did take a big break from Twitter, though, a really big break. I took like six months off or something for a while, because it just seemed like it was all kind of politics, and it wasn't giving me dopamine, because there was this weird... a lot of feedback. So I had to take a break from it, and then go back to it, because I felt like I didn't have a healthy relationship.
Starting point is 00:58:49 I don't know if I believe him, but Joe Rogan seems to not read comments. And he's one of the only people at that scale, like at your level, who at least claims not to read them. Because you and him swim in this space of tense ideas that get the toxic folks riled up. I think Rogan... I don't know. I think he probably looks at YouTube,
Starting point is 00:59:22 like the likes and that, you know. If something's... I don't know, I'm sure he would tell the truth. You know, I'm sure he's got people that look at them and go, this is great, or... you know, I'm sure he gets it. I can't picture him, like, in the weeds on it. No, for sure.
Starting point is 00:59:40 I mean, he's honest about actually saying that. It's admirable. We're addicted to feedback. Yeah, we're addicted to feedback. I mean, you know, look, I think that our brain is designed to get intel on how we're perceived, so that we know where we stand, right? That's our whole deal, right? As humans, we want to know where we stand. We walk into a room and we go, who's the most powerful person in here? I've got to talk to them and get in their good graces. The brain is just designed to rank ourselves, right? And constantly know our rank. And with social media, you can't figure out your rank with 500 million people.
Starting point is 01:00:14 It's impossible. Your brain starts going, what's my rank? What's my... And especially if you're following people... I think the interesting thing I may be able to say about this, besides my speech impediment, is that I did start muting people that rank wildly higher than me, because it is just stressful on the brain to constantly look at people that are incredibly successful, so you keep feeling bad about yourself.
Starting point is 01:00:45 I think that that is like cutting, to a certain extent. Just like, look at me, looking at all these people that have so much more money than me and so much more success than me; it's making me feel like a failure, even though I don't think I'm a failure. But it's easy to frame it so that I can feel that way. Yeah, that's really interesting. Especially if they're close to,
Starting point is 01:01:02 like if they're other comedians, or something like that. That's really disappointing to me. I do the same thing as well, with other successful people that are really close to what I do. I don't know, I wish I could just admire. Yeah. And for it not to be a distraction. But that's why you are where you are: because you don't just admire, you're competitive and you want to win. So the same thing that bums you out when you look at this is the same reason you are where you are. That's why I think it's so important to learn about neurology and addiction, because
Starting point is 01:01:29 you're able to go, oh, it's the same instinct. So I'm very sensitive, and I sometimes don't like that about myself, but I'm like, well, that's the reason I'm able to write good stand-up. And that's the reason I'm able to be sensitive to feedback and go, that joke should have been better, I can make that better. So it's the kind of thing where you have to be really sensitive in your work, and the second you leave, you've got to be able to turn it off.
Starting point is 01:01:50 It's about developing the muscle of knowing when to let it be a superpower and when it's going to hold you back and be an obstacle. So I try to not be in that black and white of, like, being competitive is better. Being jealous of someone, I just go, oh, there's that thing that makes me really successful in a lot of other ways, but right now it's making me feel bad.
Starting point is 01:02:10 Well, I'm kind of looking to you, because you're basically a celebrity, a famous, sort of, world-class comedian. And so I feel like you're the right person to be one of the key people to define what's the healthy path forward with social media. Because we're all trying to figure it out now.
Starting point is 01:02:29 And I'm curious to see where it evolves. I think you're at the center of that. So, like, trying to leave Twitter and then coming back and seeing, can I do this in a healthy way? You have to keep trying, exploring, and thinking about it. I have a couple of answers. I hire a company to do some of my social media for me.
Starting point is 01:02:50 So it's also being able to go, okay, I make a certain amount of money by doing this, but now let me be a good businessperson and say, I'm going to pay you this amount to run this for me, so I'm not 24/7 in the weeds, hashtagging and responding. It's a lot of energy to take on. But at the same time, part of what I think makes me successful on social media, if I am, is that people know I'm actually doing it,
Starting point is 01:03:12 and that I am engaging and responding and developing a personal relationship with complete strangers. So I think figuring out that balance, and really approaching it as a business — that's what I try to do. It's not dating. I try to just be really objective about, okay, here's what's working, here's what's not working.
Starting point is 01:03:30 And in terms of taking the break from Twitter — this is a really savage take, but because I don't talk about my politics publicly, being on Twitter right after the last election was not going to be beneficial, because you had to take a side, you had to be political, in order to get any kind of retweets or likes, and I just wasn't interested in doing that. Because you were going to lose as many people as you were going to gain, and it was all going to come out in the wash. So I was just like, the best thing I can do for me, business-wise, is to just
Starting point is 01:04:02 abstain. And the robot — I joke about her replacing me, but she does do half of my social media, you know? Because I don't want people to get sick of me, I don't want to be redundant. There are times when I don't have the time or the energy to make a funny video, but I know she's going to be compelling and interesting, and that's something that you can't see every day, you know? Of course, the humor comes from your,
Starting point is 01:04:28 I mean, the cleverness the with, the humor comes from you when you film the robot. That's kind of the trick of it. I mean, the robot is not quite there to doing anything funny. The absurdity is revealed through the filmmaker in that case, whoever is interacting, not through the actual in that case where whoever is interacting not through the
Starting point is 01:04:46 The actual robot, you know being who she is. Let me sort of Love okay How different what is it? What is it? Well first an engineer in question. I know I know you're you're not an engineer Well, first, an engineer in question. I know, I know you're not an engineer, but how difficult do you think is it to build an AI system that you can have a deep,
Starting point is 01:05:10 fulfilling, monogamous relationship with? Sort of replacing the human-to-human relationships that we value? I think anyone can fall in love with anything. How often have you looked back at someone... Like, I ran into someone the other day that I was in love with, and I was like, hey. And it was like there was nothing there.
Starting point is 01:05:33 There was nothing there. Like, you know, where you're able to go, like, oh, that was weird. Oh, right. You mean from the distant past or something? Yeah. When you're able to go,
Starting point is 01:05:45 like, I can't believe we had an incredible connection, and now it's just... I do think that people will be in love with robots, probably even more deeply than with humans. Because it's like when people mourn their animals, when their animals die — it's sometimes harder than mourning a human, because you can't go, well, he was kind of an asshole, he didn't pick me up from school.
Starting point is 01:06:08 You know, you're able to get out of your grief a little bit, you're able to kind of go, oh, he was kind of judgmental, or she was kind of... You know, with a robot, there's something so pure and innocent and childlike about it that I think it probably will be much more conducive to a narcissistic love, for sure. It's not like, well, he cheated. She can't cheat. She can't leave you. She can't, you know... Well, if Bearclaw leaves your life, and maybe a new version or somebody else enters it, will you miss Bearclaw? For guys that have these sex robots, they're building a nursing home for the bodies
Starting point is 01:06:51 that are now rusting, because they don't want to part with the bodies, because they have such an intense emotional connection to them. I mean, it's kind of like a carcola, a little bit. But I'm not saying this is right, I'm not saying it's cool. It's weird, it's creepy. But we do anthropomorphize things with faces, and we do develop emotional connections to things.
Starting point is 01:07:13 I mean, have you ever tried to, like, throw away... I can't even throw away my teddy bear from when I was a kid. It's a piece of trash, and it's upstairs. Like, why can't I throw that away? It's bizarre.
Starting point is 01:07:27 There's something that gives me hope in humans, because I see humans do such horrific things all the time. And maybe I'm too, I see too much of it, frankly. But there's something kind of beautiful about the way we're able to have emotional connections to objects, which a lot of, I mean, it's kind of specifically, I think, Western, right, that we don't see objects as having souls.
Starting point is 01:07:51 Like, that's kind of specifically us. But I don't think it's so much that we're objectifying humans with these sex robots; we're kind of humanizing objects, right? So there's something kind of fascinating in our ability to do that, because a lot of us don't humanize humans. So it's just a weird little place to play in, and I think a lot of people, I mean,
Starting point is 01:08:11 a lot of people will be marrying these things, is my guess. So, let me ask you the question: what is love? You have a bit of a brilliant definition of love, as being willing to die for someone who you yourself want to kill. So that's kind of fun. First of all, that's brilliant. That's a really good definition; I think it'll stick with me for a long time. This is how little of a romantic I am. A plane went by when you said that, and my brain
Starting point is 01:08:38 was like, you're going to need to rerecord that, and I don't want you to get into post and then not be able to use it. And I'm a romantic, as demonstrated in the moment: I was conscious of the fact that I heard the plane, and it made me feel like, how amazing is it that we live in a world with planes. And I just went, why haven't we fucking evolved past planes? And why can't they make them quieter? Yeah. But yes — what's your, sort of, more serious definition of love?
Starting point is 01:09:14 Consistently producing dopamine for a long time. Consistent output of oxytocin with the same person. Dopamine is a positive thing? What about the negative — the fear and the insecurity, the longing, anger, all that kind of stuff? I think that's part of love. I think that love brings out the best in you,
Starting point is 01:09:38 but it also... if you don't get angry and upset... I don't know, I think that that's part of it. I think we have this idea that love has to be, like, really placid or something. I only saw stormy relationships growing up, so I don't have a judgment on how a relationship should look, but I do think that this idea that love has to be eternal is really destructive,
Starting point is 01:10:04 and self-defeating, and a big source of stress for people. I'm still figuring out love. I think we all kind of are. But I do kind of stand by that definition. And I think, for me, love is just being able to be authentic with somebody. It's very simple, I know, but I think for me it's about not feeling pressure to have to perform or impress somebody, just feeling truly accepted, unconditionally, by someone. Although I do believe love should be conditional — that might be a hot take.
Starting point is 01:10:38 I think everything should be conditional. I think if someone's behavior... I don't think love should just be, like, I'm in love with you, now behave however you want forever, this is unconditional. I think love is a daily action. It's not something you just, like, get tenure on and then get to behave however you want because we said "I love you" ten years ago.
Starting point is 01:10:58 It's a daily... it's a verb. Well, there are some things where, you see, if you explicitly make it clear that it's conditional, it takes away some of the magic of it. There are certain stories we tell ourselves that we don't want to make explicit about love. I don't know, maybe that's the wrong way to think of it. Maybe you want to be explicit in relationships. I also think love is a business decision.
Starting point is 01:11:21 I mean, in a good way. I think that love is not just when you're across from somebody. It's, when I go to work, can I focus? Am I worried about you? Am I stressed out about you? Are you not responding to me, are you not reliable? I think that being in a relationship — the kind of love that I would want is the kind of relationship where, when we're not together,
Starting point is 01:11:43 it's not draining me, causing me stress, making me worry, you know. And sometimes passion — that word, we get murky about it. But I think it's also like, I can be the best version of myself when the person's not around, and I don't have to feel abandoned or scared or any of these kinds of other things. So love, you know, for me... I think it's a Flaubert quote, and I'm going to butcher it, but I think it's like: be boring in your personal life, so you can be violent and take risks in your professional life. Is that it? I got it wrong. Something like that. But I do think that it's being able to align values in a way where you can also thrive outside of the relationship. Some of the most successful people I know are the sort that are happily married
Starting point is 01:12:25 and have kids and so on. It's always funny. It can be boring. Boring is okay. Boring is serenity. And it's funny how those elements actually make you much more productive. I don't think relationships should drain you and take away energy that you could be using to create things that generate pride. Okay. Have you said your definition of love yet? Huh? Have you said your definition of love? My definition of love? No, I did not say it.
Starting point is 01:12:52 We're out of time. No. When you have a podcast, maybe you can invite me on. Oh no, I already did. You're doing it. We've already talked about this. And because I also have codependency, I have to say yes.
Starting point is 01:13:06 No, yeah, thank you. Yeah, no, no, I'm trapping you. You owe me now. Actually, I wondered — when I asked if we could talk today, after sort of doing more research and reading some of your book, I certainly wondered, did she just feel pressured to say yes? Yes, of course. But I'm a fan of yours too. No, actually, because I am codependent,
Starting point is 01:13:30 but I'm in recovery for codependency, so I actually don't do anything I don't want to do. You really... you've got to say no. Well, I'm still learning all that. Good, I'm trying to learn that. I moved this — remember, I moved it from one to two. Yeah, just to let you know how recovered I am in my head.
Starting point is 01:13:47 I'm not completely there yet. But I don't do anything I don't want to do. Yeah, you're ahead of me on that. Okay, so, do you think about your mortality? Yes. It was a big part of how I was able to kickstart my codependence recovery. My dad passed a couple years ago, and when you have someone close to you in your life die, everything gets real clear, in terms of how we're a speck of dust who's only here for a certain amount of time.
Starting point is 01:14:17 What do you think is the meaning of it all — for the speck of dust? Maybe in your own life: what's the goal, the purpose of your existence? Is there one? Well, you're exceptionally ambitious. You've created some incredible things in different disciplines. Yeah, it's to just manage our terror, because we know we're going to die. So we create and build all these things and rituals and religions and robots, and whatever we need to do, just to distract ourselves from imminent rotting. We're rotting? We're all dying. I got very into terror management theory when my dad died, and
Starting point is 01:15:00 it resonated; it helped me. And everyone's got their own religion, or sense of purpose, or thing that distracts them from the horrors of being human. What's terror management theory? Terror management is basically the idea that since we're the only animal that knows it's going to die, we have to basically distract ourselves with awards and achievements and games and whatever, just in order to distract ourselves from the terror we would feel if we really processed the fact that not only are we going to die, but we could die at any minute, because we're only superficially at the top
Starting point is 01:15:37 of the food chain. And, you know, technically we're at the top of the food chain if we have houses and guns and machines, but if me and a lion are in the woods together... Most things could kill us. I mean, a bee can kill some people — something that small can kill a human. So it's basically just to manage the terror that we would all feel if we were able to really be awake.
Starting point is 01:16:01 Because we're mostly zombies, right? Yeah — job, school, religion, books, go to sleep, train, football, this relationship, dopamine, love... We're kind of just trudging along like zombies for the most part. And then I think... That fear of death adds some motivation. Yes.
Starting point is 01:16:20 Well, I think I speak for a lot of people in saying that I can't wait to see what your terror creates in the next few years. I'm a huge fan. Whitney, thank you so much for talking today. Thanks for listening to this conversation with Whitney Cummings, and thank you to our presenting sponsor, Cash App. Download it and use code LexPodcast.
Starting point is 01:16:50 You'll get $10, and $10 will go to FIRST, a STEM education nonprofit that inspires hundreds of thousands of young minds to learn and to dream of engineering our future. If you enjoy this podcast, subscribe on YouTube, give it five stars on Apple Podcasts, support it on Patreon, or connect with me on Twitter. Thank you for listening, and hope to see you next time.
