Sad Boyz - Could AI Solve The Loneliness Epidemic?

Episode Date: April 4, 2025

CW/TW: Self-harm (1:13:00-1:21:00)

Sad Boyz Nightz #107
Check out 100+ bonus episodes at: https://patreon.com/sadboyz
Join our Discord ▸ https://discord.gg/Hw82Dhun4m
P.O. Box ▸ 3108 Glendale B...lvd Suite 540, Los Angeles CA 90039
Play Sad Boyz BINGO ▸ https://sadboyzpod.com/bingo
Write To Us ▸ sadboyzpod@gmail.com
Use the subject line "Pen Palz" and we could read it on the next episode!

🎙listen to us!🎙
Spotify ▸ https://sadboyzpod.com/spotify
Apple Podcasts ▸ https://sadboyzpod.com/itunes

✨follow us✨
https://instagram.com/sadboyz
https://twitter.com/sadboyz

📺main channels📺
Jarvis - https://www.youtube.com/c/jarvis
Jordan - https://www.youtube.com/c/JordanAdika

✨follow jordan✨
https://twitter.com/jordanadika
https://instagram.com/jordanadika

✨follow jarvis✨
https://twitter.com/jarvis
https://instagram.com/jarvis

00:00:00 Welcome To Sad Boyz
00:01:00 Community-Minded Garbage
00:15:10 AI Companions
00:17:34 AI Ghibli on ChatGPT
00:27:11 The Nuances of Artificial Intelligence
00:33:49 The Ethics of Generative AI
00:42:33 Tech Entrepreneurship & Self Importance
00:53:33 Rick & Morty Soda Creature
00:54:19 'Very Personal': Can You Date AI? (MSNBC)
01:06:31 Alternative Socialization
01:12:58 CW: Self Harm
01:20:57 Loneliness Epidemic
01:29:27 Blending Real & AI Relationships
01:33:43 AI Companions In The News
01:36:28 Sad Boyz Nightz #107

🎬 CREW 🎬
Hosted by Jarvis Johnson and Jordan Adika
Produced & Edited by Jacob Skoda
Produced by Anastasia Vigo
Thumbnail design by @yungmcskrt
Outro music by @prod.typhoon & @ysoblank

Transcript
Starting point is 00:00:00 Welcome to Sad Boyz, a podcast about feelings and other things also. I'm Jarvis. I'm Jordan. And now what do we do? Do we just do a podcast? You're wagging your finger like Dikembe Mutombo. You think it's adding maybe a little sauce to the show? You know?
Starting point is 00:00:14 Well, I feel like... Right, but who are you saying that to? You can't do that. Do what? Well, there's plenty of people listening. Someone's probably doing something untoward. You can't do that. Okay, yeah.
Starting point is 00:00:23 You can't do the... Oh, yeah, you can't stop. Wait. No, Sam, you can't do the dishes right now. Sam, quit. No, Sam, stop it. You can't do that. I know you're doing chores. Every week we have a new person who we tell to stop doing something. I always picture it as dishes. If someone's listening: stop doing the dishes. It's always dishes.
Starting point is 00:00:42 Why would you want them to stop doing the dishes? Are you a part of Big Dirty? Big Dirty? I'm Big Dirty, cut it out. Whatever, says Jordan, in the pocket of Big Dirty. Yeah, quite a lot has gone downhill, everything underneath the clothes, pure dirt. Oh, yeah. Wait, what's the default chore in your mind? I go to dishes. Trash. Oh. Trash is default, but you know I have a history. That's true. I was born of the trash.
Starting point is 00:01:10 It's more than a chore, it's a war at this point. It's the chore war. Is this the first place you've lived in living memory without a trash war? I would say that it's possible that there is a trash war, I just haven't gone public. It's kind of like a trash Cold War. Oh, it's a proxy battle?
Starting point is 00:01:26 You know, proxy battle. I've noticed some other people using your trash bins. That's all I'll say. The CIA over here. Is foreign information messing with our domestic affairs? I think he's spreading it out as well. Sometimes, one of my trash pet peeves, and it's not a big deal, but those who listen to
Starting point is 00:01:49 Sad Boyz for a very long time will remember some of my trash trials and tribulations, TTT. The historians among you out there will remember the previous wars. But I would say that recently, not too many issues, except I do have a pet peeve when someone will use my compost and I don't take it out. Like, if I haven't composted anything, I may not think that I need to take it out. I assume you're right. And then so I have to check to see if someone else has, cause one time someone did yard work
Starting point is 00:02:23 and then like put their stuff in my bin. And then I needed to remember to take it out because then when we were doing yard work, it was full. And I was like, well, egg on my face. More fool me. But I would say, ethical question, because I do think there's like some crime, right? You're not supposed to like use someone's trash.
Starting point is 00:02:43 Isn't there a crime somewhere? Do not kill. I'm not saying that someone else is doing that crime to me. I'm about to admit to a crime. So I wanna know. There's, like, illegal dumping. Like if you're filling up another dumpster that you're not supposed to be using.
Starting point is 00:02:57 You're dumping in somebody else's yard too much. Yeah, so here's my moral conundrum, and it's a question for the room. It's midnight the night before trash day. All the bins are on the street. I'm telling you like a story. Okay. 'Twas the night before trash day, all through the street. I've got one of those sleep caps on and I'm bundled up in my bed. Yeah, book
Starting point is 00:03:21 story or something. So you, through one reason or another, have a little bit too much trash. So much trash that you won't be able to put it all in your bin. However, the bins that are out, the bins that'll be picked up at six in the morning... Whose owners are asleep. Whose owners are asleep, may have some space. Your options are A, leave the trash until after the trash gets picked up and then put it in your trash bin. Or B, toss the trash bag into someone else's who has room, and it'll be unbeknownst to them, what they don't know...
Starting point is 00:04:01 They won't even have the chance to fill it. They won't even know because it'll get picked up and they'll be none the wiser. Unless they're a 4 a.m. hobbyist for throwing trash away. They're a 4 a.m., I've already taken the trash out, but now it's time to throw out the mega trash. I've got my Saratoga, I've been doing pushups on my balcony.
Starting point is 00:04:16 Right, I drank 12 Saratogas just now, I need to toss them into recycling before, and what's this? I've used all my ice, I better throw the bag away. Yeah, what do you do in that scenario? I'm known as the bad boy of the show. I'm kind of the renegade. Can't hold me back. I'm basically John Wick and the whole world killed my dog. I'm holding nothing back. I'm at war. I think first move is I forget to take the trash out.
Starting point is 00:04:37 That's scenario one. Okay. Okay, so we're kind of Timeline A. Right. Almost certainly is the case. So in this world, in this world, well let's say you had forgotten to take the trash out. But now it's, you forgot to take the trash out last week and so now you have too much trash. And you've been gaming all night, so you've been up until 2 a.m. I'm at my strongest.
Starting point is 00:04:59 You're up until 2 a.m. and you're like, ah, I've got double trash. Well, now, I've never been more focused in my life than when I'm in the gamer zone. Yeah, yeah, and so now, in the gamer zone, you know, a game just ended, you just got a high score in Balatro. I'm wiping my hands.
Starting point is 00:05:13 Yeah, you're wiping your hands, you're blowing them like a gun. I'm saying, yes! Yeah. I'm trying to call my family, they have started screening my calls because I keep telling them about, like, which jokers I used. Right, you were talking about the kill screen of Balatro, which is reachable once you get enough exponents or whatever, they can't store the number and you start getting "not a number" as your score. Me, me on the phone: Hello?
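The kill-screen thing they're describing is real floating-point behavior. A minimal Python sketch of the idea (illustrative only, not the game's actual code): once a value grows past what a 64-bit float can store, it overflows to infinity, and undefined arithmetic on infinity produces NaN, "not a number".

```python
import math

# IEEE-754 doubles top out around 1.8e308.
score = 1e308

# Multiplying past that ceiling overflows to infinity.
score = score * 10
print(score)            # inf

# Undefined operations on infinity (like inf - inf) yield NaN.
print(score - score)    # nan

assert math.isinf(score)
assert math.isnan(score - score)
```

Once a NaN appears, any further arithmetic involving it stays NaN, which is why a score display can get stuck reading "not a number".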
Starting point is 00:05:38 Yeah, so what do you do? What do you do? Do you use someone else's trash can, what they don't know won't hurt them, or do you live by the straight and narrow? I have a, I have a... Anastasia looks disgruntled. You're shaking. I have been walking a dog before, someone's trash cans are in front of their house, I'm holding this poop, I'm just gonna throw it in their trash. It's in front of their house. Crime. Who gives a shit, though? You just did, you just gave a shit to the trash can. And you're right to. I think all trash is for the people. Yeah, there's like a crowd of people here.
Starting point is 00:06:24 Wait, where did they come from? They come to every show, but they never like what we say, they've never reacted to a single thing. These, like, NIMBY people. The question was for Jordan: like, you're gonna be a frickin' Mrs. Kravitz and constantly look out your window and get mad if someone puts trash in your trash can? Who cares? Mrs. Kravitz? I don't know Mrs. Kravitz. Mrs. Kravitz is from an old TV show that no one here was alive to see, a show from the 50s, called Leave My Trash Alone. So I agree, I
Starting point is 00:07:01 am one to give a shit to a trash can. Okay, we're gonna dox him now. Yeah, there is, near my place, when Zachary and I walk the dog, a collection of trash bins that are never full and always outside. Right, we are trash victim blaming here. No, but I will do it, I'll admit it. I will go down the street and I'll open up the trash bins and I'll go, that one's pretty empty. This, honestly, this bag can fit
Starting point is 00:07:37 and not only would no one be the wiser, but even if they wanted to throw something away, they still could. And so I'm like, ah. It actually usually only happens when we would host a big party, and then the next day it would be like a ton of people's worth of trash, so it would be hard to otherwise get rid of it quickly. Babes, cuff him.
Starting point is 00:07:57 This has been a sting. We're all wearing different wires. I've been an undercover cop the whole time. Waiting for Jarvis to have been the cop. It all makes sense. I just wanna hear a good explanation. Like, I heard one person one time say that they were mad that people threw their dog poop
Starting point is 00:08:14 in their trash can, because then when they wheel the trash can into their garage, it stinks. Doesn't trash normally stink? Also, does your trash not stink? Yeah, I mean, how much dog poop? I feel like it takes a lot of dog poop. I will say, as long as the dog poop is contained. Yeah, if it's in a bag.
Starting point is 00:08:33 That's fine. I pride myself, this is weird, but I pride myself on a clean garbage bin. I do clean it out every couple months. I will take a scrub brush and a hose. Whoa. And clean it out. Because when I take the trash out. The bin outside, not the can?
Starting point is 00:08:48 Yeah, the bin outside. That's crazy. That's bonkers. I will say though, I was in a situation once where I had to order new trash bins from the city, and it was kind of awesome. When you get a new trash bin, you're like, damn, like, you cleaned up, I've never seen you like this before. Well, it's like, where I live, like, it's in a trash can, innit. Yeah, I live in the dumpster with my roommate Oscar. Yeah, you have a comically placed banana peel on your head, by the way.
Starting point is 00:09:20 Right, yeah, your clothes are like plastic bags. Yeah, your hat is the top of a trash tin, like a jack-in-the-box. Bingo board, we give Peeps a new profession. But like, it's an apartment, so there's no, like, singular trash bins, we all just throw our bags in a collective dumpster. And so it's like, if I have a lot of trash towards the beginning of the week, I don't wanna throw
Starting point is 00:09:55 all of it away, because I don't wanna fill it up too much. So it's like, if you have, like, the individual trash bins, if it's the night before... like, if everyone's trash bins were out all week and you started doing it at the beginning of the week... I've actually only ever done this the night before trash pickup, because I'm trying to minimize inconvenience. Like, basically, in my mind there's zero inconvenience, because it's not about keeping it a secret, they won't even know it, but it's more like it will never affect them. And it's not, I'm not putting... You're a considerate trash bandit.
Starting point is 00:10:31 Yeah, and I'm not putting like raw poop in there or anything like that that's gonna dirty the trash can. Cause that could be something that- Nobody would do that. I've even been known to double bag dog poop when it's going in someone else's trash can. I've even been, I've like sometimes used a bag. You're always supposed to use a bag.
Starting point is 00:10:47 Yeah, I'll say. Which I know a lot of the time I do, obviously, for the city, for the benefits, because of my hands, I don't wanna get it on my hands. But I will take it from my hand, put it in a bag. I think, you know, this, for me, it feels like a bigger problem of like a sense of community.
Starting point is 00:11:07 I agree. Where it's like, if someone is so "not in my trash can," like, they don't have a sense of community with their neighbors, and that sucks. Yeah, like my neighbor once, and this is a situation where they were getting yard work done and they needed more compost space, and asked if they could use my whole bin.
Starting point is 00:11:31 And I was like, of course. And that's the thing, it's like if anyone were to ask me, I don't mind, but it's like a communal give and take type situation. If there is a person that, it's the night before trash day, they've already taken out all of their trash and they have room and they're like, you can't use my trash.
Starting point is 00:11:47 That's wild. Like, so, like, I don't see... The class ends and they ask about homework. Like, morally, I see nothing wrong with throwing your trash in someone else's trash bin. Well, I definitely do, like, feel like a ne'er-do-well when I'm, like, opening up the trash bins,
Starting point is 00:11:59 I'm like... You have a little mask on your eyes. Do you feel like a little raccoon? Yeah, me and the raccoon, me and my raccoon homies are going around, I'm looking for space, they're looking for snacks. Yeah, they're picking up dog food in both their hands and scurrying away in their little bipedal stance. They're trying to wash their cotton candy.
Starting point is 00:12:14 Can I throw out like a kind of light dilemma? Sure. In this one I feel. So I guess my recycling bin was stolen. I don't know, it's been gone a very long time, right? And if this is bad, I actually haven't done it. Right. As a refresher, and you did it. Oh.
Starting point is 00:12:31 I will take the relatively small amount of recycling I generate, and now I can fit it in. Evening before, or close to trash day, sometimes the morning before, because they pick it up a little late, I will sometimes break down a box and then redistribute pieces of the cardboard. Like a Johnny Appleseed type.
Starting point is 00:12:50 Like a Robin Hood. Let's go, so you're the Robin Hood of recycling. I will, and they will get all of the clout for recycling this kind of... Okay, wait, wait, so you're saying that if you have a little bit of recycling, you'll distribute that amongst other people's bins
Starting point is 00:13:04 instead of using your own. If I have a very little bit of recycling, you'll distribute that amongst other people's bins instead of using your own. If I have a very little bit of recycling, it will go with the trash. No, no, no, that makes sense. I've done that as well where I've had like boxes. You know you can call to get a new bin. You can call to get a new bin. Yeah, but I'm winning right now.
Starting point is 00:13:19 I will say about someone stealing your trash bin, I took a picture of my trash bins when I moved to a new place. And so I have the code that's on the front. And then in the past, I've also used a paint pen to write my address. Jarvis the Trash. Yeah, yeah, yeah.
Starting point is 00:13:37 Usually it just says the address on it. This is Jarvis Johnson's trash. Jarvis Johnson's trash. No, it's definitely mine. My SSN is right there on the front. Mother's maiden name. When I most recently moved, I had... or no, this happened a couple times.
Starting point is 00:13:55 Someone else takes my bin in by mistake, because sometimes there's not space on the street for where you normally put it, so you have to move it, you have to travel a little bit. It has to go on a vacation down the street. With its buddies. And then I go to get the trash bin and then it's missing.
Starting point is 00:14:11 And then I refer to my lookup table of what my code is. And I kind of start looking around at nearby trash bins and going, all right, you brought this in, but is this mine? And one time I had to walk maybe 20 feet into someone's, like, driveway and grab my trash bin. And I felt like I stole it, but it was mine. You know what I mean?
Starting point is 00:14:35 Like, so it was a weird kind of heist situation. Where was theirs? That's for them to figure out. Hopefully they have a picture on their phone. Maybe theirs is slightly smaller and it's Russian-dolled into yours. But the thing is, mine were new, and so I was like, I'm not gonna use someone's crappy, dirty, uncleaned-out trash bin. I gotta clean it, like Jacob? Am I crazy? No, I'm not gonna... we're not gonna swap. Swapsies. I'm taking mine back.
Starting point is 00:14:59 Speaking of community. Because if mine gets gross, I will clean it. But, like, it's like, there's a certain amount of grossness that, as long as we don't... It's like how, like, your body always gets dirty, so, like, why would you shower? Speaking of community. You know who's looking for community? Dan Harmon. Yeah. Yes, AI. It's all over the news. First we're gonna... Making some real work for our subtitlers.
Starting point is 00:15:35 No, that was... I will say, for caption purposes, it says "Jarvis says gibberish." I like it if, on occasion, that does have to be a "Jordan says gibberish," because I will listen back and be like, I have no idea. I've stopped sending it to you, because the few times I have, I've been like, what do you say here? And you go, fuck it, I don't know.
Starting point is 00:15:56 Why would I know? I barely hear me talk. Was it one of you that referenced the other day that I went to the Ren Fair last year, I guess, and because the people that often are working the booths and the stores and stuff are in character, a lot of the time it'll be like, what can I fetch you, me Lord? And it's like, I'll take like Heineken. Like a blue moon.
Starting point is 00:16:20 Can I get a Liquid Death? I'll take Ye Olde Bacardi, please. Yeah. A White Claw, please. Ye Olde Style. I will immediately, on instinct... all of my, like, adapting-to-my-environment accent... Like, I stopped saying... I felt like I'm home, right? My accent, the accent I do here, or just here, evaporates very quickly. You start saying Zed all of a sudden. And so they just, they did like, so, me Lord, how do you do? What may I fetch you?
Starting point is 00:16:52 And I went like, you alright, love? Carcass? I'll just... now, you ain't got Strongbow here? Yeah, I don't know, I'll just... And, like, and then they went, sorry, could you say that again? I was like, oh right, I saw a dragon, and I'll take just a water. Okay, could I have a cherry... could I have a fucking Baja Blast?
Starting point is 00:17:11 could I have a fucking Baja blast? May I take a Baja blast? Can I get a blueberry Red Bull? May I have a Mountain Dew? No? Okay. No. No. Understood. So, so, so, so, AI's in the news as it always is and it always will be, at least for the
Starting point is 00:17:30 next, for a while. Until everyone is one. Before we talk about the main topic, which is AI companionship, AI boyfriends and girlfriends, I do want to spend a moment to talk about this OpenAI Ghibli situation. Oh yeah. You don't like art? OpenAI, big US AI company, formerly non-profit, now very profit-driven, has released an update to their image generation and started promoting using ChatGPT to turn an image into the style of Studio Ghibli, Hayao Miyazaki's production company.
Starting point is 00:18:16 They were like, you know what could be really fun? You know how the main thing a lot of people seem to hate about what we do is the appropriation of art and work? Right. Because, like, already there's databases of the copyrighted creative works that have been stolen to train these AI models. Like, we are constantly talking about generative AI and how it's kind of like blatant theft, and then kind of regurgitating for profit. Yeah. Someone else's creative work. In the same respect that, like, albeit the stakes are different, that, like, uh,
Starting point is 00:18:53 the medical industry in the US, well, unfortunately now becoming in the UK also, but the medical industry in the US is extremely exploitative, but medicine isn't the bad thing. The utility of it isn't. The technology behind the generative AI is not the culprit. There's a lot of valid applications. But the problem is that this company, which originally had this nonprofit mission, was infused with billions and billions of dollars.
Starting point is 00:19:21 I think what happened was that they got a $10 billion infusion from Microsoft. So, like, it starts to feel like there's now this profit motive. And the issue with that is it feels a little bit like a race to the bottom. So, ChatGPT's, like, new thing is generating Studio Ghibli-based images, which, in order to generate them,
Starting point is 00:19:45 has to mean that they have trained on the resources of Studio Ghibli, which is the problem. Because I don't understand if they even had permission, I can't imagine a world where they had permission from Studio Ghibli to train. I don't even know how they did it. And I do think that one of the limitations,
Starting point is 00:20:04 as I understand, again, I've been out of the tech space for a long time, so I could be wrong here, but my understanding about generative AI specifically is that one of the largest limitations is having a large corpus of data. I guess as these companies were bootstrapping, needing to scrape tons and tons and tons of things off of the open internet, inadvertently that results in tons of copyrighted content being ingested, like, for example, YouTube videos and stuff.
Starting point is 00:20:36 People like Marques Brownlee have talked about this. All this preamble to say that they know what they're doing, and they're kind of trudging forward in this "well, it's better to ask for forgiveness than to ask for permission" mode, which is just what a lot of tech companies do, like Uber in its rise to dominance. Yeah, break stuff. Move fast and break things. Break people, break their careers, break their options.
Starting point is 00:21:00 The first thing I said, I was like, did they get permission to do it? It feels so brazen. It feels like the type of thing that a fan would create, and then it would immediately get, like, a takedown request. I couldn't believe that this gigantic company was doing it. This is like an Oblivion mod or something, or a Skyrim mod, and then, yeah, of course Bethesda ended up taking it down. But it was fun while it was around. Yeah, crazy that the company is... yeah. And fucking Sam Altman made his profile pic, like, Ghibli-fied, and it's just like, he's so cool, he's, like, ooh, tweeting about, like, guys, you guys are so... oh my god, you're melting our servers, oh my god. I'm a human being, by the way.
Starting point is 00:21:30 "It's super fun seeing people love images in ChatGPT, but our GPUs are melting. We are going to temporarily introduce some rate limits," okay, tapping index fingers together, "while we work on making it more efficient. Hopefully won't be long. ChatGPT free tier will get three generations per day soon." 69,000. And that's pretty funny. Yeah, it's like, I get, like, I think there's a lot of really dumb takes about this, which is just like, no one
Starting point is 00:22:13 cares about how it's made. They just care about consuming, and the thing that's fun to consume will be what's consumed. Yes, of course, because path of least resistance, people are just going to seek convenience and novelty and points of interest in that way. And this isn't really, like, an ethical conversation if you're not online and involved in it. Why would you think about that? Right, but the thing is,
Starting point is 00:22:36 that doesn't mean that it's above criticism because it's like saying, yeah, babies wanna eat candy more than they wanna eat vegetables, you dumb donkey. Why would we feed them the thing they don't want? Obviously they just wanna eat whatever, the thing that's most attractive to them. And you're gonna tell the kid what to do?
Starting point is 00:22:57 Yeah, and so this is kinda crazy to me. And there's also ongoing lawsuits, I believe. Um, so there was a lawsuit from the New York Times against OpenAI for training ChatGPT on the New York Times articles. That's ongoing. Like, I can't even read the fucking New York Times articles without a goddamn paywall. They keep taking shots at the big boys. Like, maybe... I'm certainly glad that they're not... well, they are. But the damage would be worse in the case of independent artists who can't defend themselves, or who are literally having their work taken.
Starting point is 00:23:34 But it feels like you got a little too comfortable, and now you're like, I'm going to take a swing at just all of Saudi Arabia, I'm going to steal all their art. It kind of feels very, like, give me Greenland. I want it, give it to me, and we'll get it no matter what, because we're powerful. Gulf of America, okay. Cause, like, the money in OpenAI is more money than,
Starting point is 00:23:57 you know, Studio Ghibli's ever been worth. So we can fight them. Yeah, if they get hit with a lawsuit, like worst case scenario, they have to like pay several million dollars in a lawsuit to someone. They're making so much more than that right now. Yeah, it's like when Uber was moving into markets without getting government approval
Starting point is 00:24:12 and then just paying fines. Yeah. For less than the... For cheaper, yeah. I could be mistaken, because things have changed so much with Disney, but doesn't Disney own the licensing rights to Ghibli in America?
Starting point is 00:24:28 And Disney is traditionally quite litigious. But that's the thing, it's like, this company has so much power behind it. Like, OpenAI has got an estimated market cap of, like, $300 billion, which would make it one of the, like, largest companies in the world. And could you just Google Disney's market cap?
Starting point is 00:24:49 I think it's a lot less than that. Oh, their partnership ended, is what I'm reading. Disney market cap. Aw, but they were so good together. Fuck. Yeah, it's like, it's, like, worth... no, Disney's a public company, but this thing always happens
Starting point is 00:25:02 where it's like Tesla's market cap, and the way that it's priced into the market means it's worth more than, like, every other automaker combined. Not exactly that, but something close to that. It's like the revenue or valuation of Europe. And so it's this thing where there's all this speculation kind of baked into this. But, you know, the valuation at the last raise for OpenAI is double Disney's current market cap. With any kind of primarily online discourse, it's easy to lose track of the fact that normies,
Starting point is 00:25:35 and that's non-pejorative, being a normie is good, I wish I could get there... literally don't know what any of these words mean. Like, OpenAI, like, what is that? Cause why would they have to? Well, that's the thing. If they know AI, it's like, oh yeah, I saw, like, a video and it was interesting.
Starting point is 00:25:53 I think that like the annoying thing is that it feels very like easy to dunk on someone who has conviction or cares about an issue. But the alternative is just to lie down and let these giant corporate actors control your entire life. Well, they're so big, I guess they're right. It's like, well, I just want,
Starting point is 00:26:14 but also you have to recognize that most people do not have the capacity to give a shit, because they're trying to make ends meet, or they are trying to focus on their job and their family and just keeping up. Like, it's hard enough to live. Yeah. You know, so it's a privilege to care about things like this, but, you know, we are in a privileged position to say that, like, this shit is whack as hell. And it does.
Starting point is 00:26:39 I think one of the few places where there can be grassroots impact, in theory, is in defense of smaller artists and creatives specifically, because that is a very, like, punchy line. If you're pushing back on something like this because you're like, well, you're stealing work from someone, that's a very tangible criticism, as opposed to, like, the more broad criticism of how bad monopolies can be
Starting point is 00:27:04 and how this could disenfranchise people in the long term. I don't want to say that there is no value in pushing back against this and its application. I just think that, like... I'm by no means... okay, you know it's funny, I wouldn't even describe myself as anti-AI, because I think there's a lot of nuance to artificial intelligence that has kind of been co-opted to just mean a couple of things, where AI has been a part of our lives for decades and will continue to be a part of our lives
Starting point is 00:27:38 for the foreseeable future. It's more about, like... it's like, I'm not anti-water, but I don't want clean water to go to certain people and dirty water to go to other people. I want Flint's water to be good. Yeah, yeah, yeah. It's not that, like, water is the problem. And so, and I just think that we are in this
Starting point is 00:27:59 Wild West where regulation... first of all, even if you had an administration that was tough on corporations, which we don't. Like, the commissioner of the FTC is now out, and they were a person who was suing a lot of these big companies and fighting against, like, mergers and corporate consolidation, and also anti-consumer practices and monopolies.
Starting point is 00:28:34 Like that type of stuff is like not going to be happening in this administration. Yeah, that pushback is even less welcome. Even if we had that, this would still be fighting an uphill battle because the speed at which legislation happens is much slower than the speed at which technology typically develops. And AI has developed at such a dramatically fast clip.
Starting point is 00:29:01 I had a minor, basically, in artificial intelligence. A specialization in my degree was in computer networking and artificial intelligence. Those are my threads or my specializations. And I took graduate classes in machine learning and things like that. And where the world was 10 years ago is very, very different.
Starting point is 00:29:23 We were not thinking about, we were using AI to like, you know, like some of the applications could be like solving a maze, you know, like identifying a face, you know, like some of these basic things. Like I had a project when I was in college in my computer vision class to like, look at the 2012 presidential debate and like be able to face track Mitt Romney and Obama
Starting point is 00:29:49 and things like that. And it's so, it's such an exciting pitch always and still is, like it's such a remarkable technology. It is. It's very difficult to make the argument that the technology itself is not valuable and it's very difficult for people that don't care to make the argument
Starting point is 00:30:05 as to like the ethical implications. Yeah, I just think that ethicists exist in the technology space, and I don't think they're as prominent as they should be. But I do think it's like very, very important. It's also why I think anyone studying anything technical should also study soft skills, soft sciences, soft skills, communication, English, like liberal arts and things, because we can get in this very calculated view of the world. Like libertarian mindset, or just like utilitarian even, like just saying like, how do you qualify? Is this net good versus opposite?
Starting point is 00:30:41 Ayn Randcourt. It is like, well, he had to make the best building. That's the point. And so that's the thing. So I'm just like, all that's to say that like, I'm actually pro, I'm probably pro AI with a bunch of like asterisks, right? But the thing is, if I were to say that, what that means to someone isn't what it means to me.
Starting point is 00:31:03 Because of what the like sort of average understanding of what AI is. And so that's why we talk about this stuff to try to add like nuance to like what is currently happening. And I think that a thing that we can all agree with is just carte blanche stealing from creatives or anyone is bad.
Starting point is 00:31:26 And then, and then, and then using that to effectively repackage and systematize it for profit is bad. Um, and that's what's a little extra slimy about this too, is that it's, cause you're right, that legislature will always struggle to catch up really to anything because legislature has to follow even in the current administration, should have to follow the same like stringent testing and system that a new medication would.
Starting point is 00:31:54 It has to be a study, double-blind placebo, everything should be reviewed, but diseases are always gonna evolve faster than the medicine can. It's why Brian Johnson is like, a lot of scientists laugh at him because he's not like, there's nothing sort of scientific to his methods. It's when normally there is a lot, lots of checks and balances to make sure that horrible
Starting point is 00:32:16 things don't occur. I just watched a documentary, I think it's kind of old, called Bad Surgeon, where I think it was called Bad Surgeon. It was about this surgeon that had all this praise maybe 10-15 years ago for inventing man-made tubes that could go into your throat to treat certain throat people who had issues with throat cancers and things of that nature. And it was hailed as this like cool experimental technology that was super promoted in the media. And then it comes to find out that
Starting point is 00:32:52 he never tested his stuff on animals. He never tested on anyone. He first tested on humans. And almost all of his patients died. It's a crazy documentary. That's a bad surgeon. But that's the thing, it's like, but, leading up to that, the media is part of the problem
Starting point is 00:33:16 because they were the ones making documentaries about how transformative he was, and there was all these, there were filmmakers covering some of the patients up until like finishing the treatment and being able to speak afterward and then cut to credits. And then at the movie premiere, the filmmakers follow up with the person and they've actually died. That's so, because it is a way more interesting story than, Hey, so we're going to be in testing for something for the next 15 years,
Starting point is 00:33:47 but it's pretty exciting. It's pretty exciting. Yeah. And so, uh, I don't know where I'm going with this. What I, there is one thing that I kept seeing in the, in the jibbly argument online and people kept saying stuff like, well, artists, well artists influence each other all the time and steal from each other all the time and pay homage to each other all the time. What's the difference here? And the difference is the humanity is taken out.
Starting point is 00:34:16 The commodification of it. Well, and the commodification of it. But even the fact that when you're an artist and you're creative and you're thinking of what to make, your humanity is in that process, right? You're not just straight up... You're paying homage to the person, not the color gradients. Or you're influenced by the person. And you're interpreting it too, and it's flowing through your own creativity, and instead it's flowing through a deep neural network
Starting point is 00:34:47 that's like spitting out millions and millions and millions and millions of iterations of this stuff that just turns it from something that had a soul to it to a very cheap copy, quite literally a cheap copy. There's a distance, there should be a distance between the concept which has some publicity of like death of the artist and spreading of the art itself in isolation and the death of all of the artists involved being made by no one for no one. It's not being, hey, importantly,
Starting point is 00:35:25 no one's going to pay homage to this because it's not doing anything. This does not make something to propagate. And you know, somebody annoyed me for this take for some reason at one point. I'm actually not a death of the artist person. I get it in some cases and I get that it's like, people don't like an author,
Starting point is 00:35:43 it's kind of nice to distance from it or something like that. I just, if I watch a movie, if I watch any production of any kind, it is not interesting to me unless I know who made it. Because that is, it is their expression. It's like, oh, I'm reading somebody's diary, who? It doesn't matter then.
Starting point is 00:36:00 But even if you don't know anything about who created it, you, a series of decisions were made in the creation of anything creative. Like you have to constantly make a series of decisions. And they hurt and they're hard and they're weird and there's tears and there's. And those decisions are part of what is affecting you, right? I saw a Ghibli version of the Elian Gonzalez famous photo
Starting point is 00:36:30 of him being like hidden in the closet by his relatives. And I'm like, that is not a subject matter that how Miyazaki would make him film about. And I know that because I am familiar with his work. Right. And so. You're a stan. I am a stan.
Starting point is 00:36:50 You're a stan. I watched all of the documentaries about him. I love watching documentaries about him because he's such a interesting, creative mind. And he has such strong opinions. That's the other thing that's crazy. The fact that they chose his work in particular. And they chose it based not on the substance of it
Starting point is 00:37:14 or the message of it, but on the vibes. I like the way it looks. It's like this may as well have been, and I think this probably does exist, Family Guy-ify these photos. It's the same for sure. You know what I mean? It's like, oh my God, Tiananmen Square, but Family Guy version.
Starting point is 00:37:30 Like, what are we doing? But also- What about the Downager explosion, but Simpsons? It's interesting, because that, in isolation, all of this, if you take out who is harmed, it is just everyone having fun online. And when someone can only see things an inch deep, that is why it seems like everyone else is whining
Starting point is 00:37:56 about something that doesn't matter. What do you guys care? We're just having fun. I just want my profile pic to be cute. I just thought it was cute on the surface level they're not wrong yeah it is fun to post hey look like that period time web you could do like a South Park version of yourself or like or like cuz those Matt and Trey aren't losing money because you did
Starting point is 00:38:19 that yeah yeah yeah there's the problem is is that we have now reached the point where every single post like this, every single conversation like this, every single dialogue online is gonna be absolutely flooded with people genuinely advocating for we should have this instead of animators. It is happening. I don't know how many of these people are real,
Starting point is 00:38:40 but the people who are like Hollywood is in trouble. That's okay, so you actually literally do wanna get rid of the others. There's no argument. There are people like that, these like very accelerationists, like AI accelerationists. Those people are like cringes. Cringes as fuck. They're ghoulish.
Starting point is 00:38:55 Yeah. I think also like, you know, one key element of this is you said Seth MacFarlane's not losing money by the family guyification of stuff. Or Matt and Trey. Or Matt and Trey. But the thing is it's not even about money. That's the thing. It's like for Sam Altman it is about money.
Starting point is 00:39:21 But for an artist it's more about artist integrity, like what your work symbolizes. It's frustrating that they're forcing us and anyone that has these conversations to try and rephrase everything we're saying for the hyper pragmatist. If you are a utilitarian, true psycho utilitarian, then I don't want to have the conversation because you're not interested. We're trading in a different currency. It's kind of like the classic Reddit atheist versus Christian argument, where really the wisest move is to just not, or the ideal move, is that the two of you just don't have to have this conversation because you're trading in different resources.
Starting point is 00:40:00 This person is saying, well, my faith is the thing that's important to me. And you're going like, but here are my facts. And like, but they don't care about the facts and you don't care about the faith. You're trying to trade an incompatible resource. You're trying to give me money. I'm trying to give you Oxon. It is to them inconceivable that the revenue wouldn't be the reason you make it. And then we create this world, let's say, somehow where the entire production process from beginning to end of the next award-winning
Starting point is 00:40:32 independent animation is only made by AI. These people don't watch it. They don't watch art. They don't care about it. This is reminding me of a Sad Boys Nights episode where I talked about how I was confronted at a party at VidCon by someone I made a video about and That is available patreon.com so sad boys. Um, but the
Starting point is 00:40:57 Fat guy said to me No, I get it you have to say that for content. And I was like- And this guy did morally objectionable shit. That was what you were going for. Yeah, in my view, which is like, I'm not- In the view of like a person. There's nothing objective about my,
Starting point is 00:41:14 it's just a perspective, you know? And people are welcome to disagree with it. But I think what was really funny to me is that the projection upon me, their own worldview, that everyone is just doing things for is only operating based on clout and clicks. And I'm not you signaling is the word of the day because they can't understand where it's like, no, that's the thing. It's like we're in
Starting point is 00:41:37 reality, like, there is truth to that, right? Because we work in a space that has to have eyeballs on it in order to make money, right? And so of course there is gray area in which you, let's say, like I just wanna talk about my RuneScape Iron Man, right? But for marketability, I would talk about something else. Yeah, dude. Right?
Starting point is 00:42:03 And I did- I wanna give a bilateral tip. I did just finish Song of the Elves on the Iron, so pretty excited. Honestly, doubt it. You aren't ready for it. Yeah, yeah, yeah. This trash, this custodian at the school, he's only picking up the trash
Starting point is 00:42:15 because he wants to make money. That's what it sounds like. Yeah. Like, and the, I do it for the love of the trash. I do it for the love of the, yeah, it's like you're the guy from scrubs. They're not allowed to do. Or I guess even in the school comparison, it's a little like,
Starting point is 00:42:30 you're just you're just a teacher for the money. But anyway, like this dude's an entrepreneur and he's doing entrepreneur things. Which is so crazy that we've I think there's been a concerted effort in general by, uh, you know, capitalist institutions like hyper capitalist institutions to, um, memory hole, the darker parts of entrepreneurship. Like, I feel like entrepreneurship has been rebranded from what it used to be, which was a guy that goes on shark tank to instead,
Starting point is 00:43:03 and a guy with maybe a bad idea or harmful idea to a protagonist, an intrepid adventurer no matter what. Well, the tech industry over the past two decades, and I drank this Kool-Aid, so. Oh, yeah, dude. Yeah, like, has kind of convinced us that the tech founders are the movers and shakers of our modern society,
Starting point is 00:43:24 and they often carry themselves as if they are holier than thou, you know? Like, as if they are the ones who are, like, you look at how people treat Elon Musk. Like, he's the fucking second coming of Steve Jobs. I won't even say, like, a religious figure. Like, they, and- Which is a figure of like a guy who was kind of a grifter
Starting point is 00:43:49 and got maybe a little too much credit for doing- Yeah, and it's like he did stuff, right? Like, and there's nothing wrong with that. I think it's just that we're changing the world shit that was fed to us, like working in the, like, every internship I did, every, that I did at a major company, has that, a little bit of that vibe, a little bit of that like culty kool-aid.
Starting point is 00:44:10 It's very effective. And it's like there is a healthier, there is a healthy balance to this. And it involves having some self-awareness and some cynicism about what it is that you're doing. But when so much money is involved, and there is this, it gets this race to the bottom where every company is getting a ton of funding,
Starting point is 00:44:31 and it needs to 100X their profits because their investors or their VC, like the venture capitalists that have invested in the rounds of funding that this company has done, only see it as a success if they are 100Xing, not 5Xing, not 10Xing. Like it needs to be exponential growth. And it leads to companies that would otherwise have no need to grow exponentially to have the pressures of changing the world.
Starting point is 00:44:59 Yeah, there's like platforms that are profitable and work as exactly as they are right now integrating some kind of AI function because at an executive level they're being told that has to happen in the same way that like circa, I don't know, 2017 everyone was implementing rounded UIs. They're like, everybody looks rounded right now. Oh, and before that, when I was in middle school, skeu-morphism, okay? Oh, that's right. When you would open up your notepad and it
Starting point is 00:45:25 would look like a yellow notebook. Did I trick you? You know what I mean? Where it looked like a wood panel on my fricking LCD screen. Oh, you tricked me. Like all these things are just trends and hype and you know, we had the dot com boom where if you put dot com in the name of your company, clueless investors were more likely to invest in it. You had the Great Depression where the stock market was this hot new thing that you could invest in, and people presented it like you couldn't lose money. And people were taking out loans to put it into stocks. So when the market crashed, those people did not have money.
Starting point is 00:46:04 They got pump and dumped in the 1930s. Yeah, they pump and dumped themselves. There was also an era in the 90s during the dot com boom where they were like, there's no going down. We're only going up from here on out. So people were like, we need to invest, invest, invest, invest. And then 2008 recession happens. And there's also, I don't know what the balance
Starting point is 00:46:27 of signal to noise is, but I do truly think that there are companies that are materially changing the world, technology has shaped our lives in a lot of ways, in negative ways, but in a lot of ways, transformatively and positive ways, and connecting communities that have never before been connected, et cetera, et cetera. Before we started recording, Jordan and I
Starting point is 00:46:50 were talking about how nerds are nicer now, because you can find nice nerds online. We've been able to segment the video games that you work because I'm allowed to call myself Sarah. We've been able to segment that away from the actual larger population of people that just enjoy hobbies and wanna talk to each other and have a good time.
Starting point is 00:47:08 But also, the- There are more sinister discords also. There are more sinister or more cynical founders who are looking at a market opportunity. And by the way, I'm not placing a value judgment on this, by the way, as I'm saying it. Looking for a market opportunity, and by the way, I'm not placing a value judgment on this, by the way, as I'm saying it. Looking for a market opportunity,
Starting point is 00:47:29 exploiting that market opportunity, saying all the right words to get all the right funding, and then they have an exit where they get acquired by some big company or they go public and don't actually give a shit about what they're doing, and they make generational wealth, seems like a pretty good deal. Both of those things are in our left ear and right ear at the same time and we as consumers
Starting point is 00:47:53 have to decide what's real. Is WeWork changing the way we work or is it a huge failure because they raised a gargantuan sum of money. The CEO made off like a bandit, whatever was the business. It was, we rented out some spaces that you can work at. And we now, which is what landlords do. But now it's like, because that failed, because failed in whatever capacity you would say, because that failed in the eye of the,
Starting point is 00:48:24 the entrepreneur pragmatist, utilitarian, now that that failed, that means it was bad. That was a bad idea because it didn't make a bunch of money. Now, if I make a new tropic that does nothing and costs you a bunch of money and I'm making money, it's a good idea. As soon as it fails, now that the Daily Y is going out of business, the Daily Y was a bad idea.
Starting point is 00:48:49 Well yeah, was Myspace a bad idea? Or did it like literally pave a path that like someone else could come in? Like wear down barriers and time and fill a space. Put a wedge in the door for someone like Facebook to come along and gain billions of users, half of which are possibly scammers, but who could be sure? Who cares?
Starting point is 00:49:15 It's going to keep coming up because it is living in my mind rent free, but the dialogue about the daily wire, because the daily wire is being dissolved in its current capacity and all of its weird ghouls are separating daily wire being like. Is that like for sure, Hat? Because I know that the original, the co-founder that Ben Shapiro started with. Mr. Boring? Yeah, but then, but, and I know their views are bad,
Starting point is 00:49:35 but is it- And their views are bad. Is it for sure? I know, and I know that Brett Cooper left or whatever, but is it actually dissolving? It's dissolving, at what you're describing is functionally at dissolution. Yeah, yeah, yeah. But terminology wise, I don't know what they're gonna do
Starting point is 00:49:51 if I'm sure somebody was at the brand. Because I think they do actually have so much money. Oh, they have a... They laid off a ton of employees like yesterday, right? They literally never needed them. It's just a valuation bump. Oh, there's a million companies like that that grow too big because that's what they think
Starting point is 00:50:08 they have to do. Yeah, dude. That's why we run this shit out of my home office. It is like a- Because I'm like, I don't think we should rent. Like as much as it would be fun to rent an office, I don't know if business is that good. Also, I think like a lot of what this talk,
Starting point is 00:50:26 this discussion has boiled down to is just the fact that the whole reason why we have regulation on companies is because history has proven to us that they do not have the interests of the people, the economy, our country, at heart. Even if they want to, there's just too, the money is the motive. Exactly.
Starting point is 00:50:51 And it's just like, even if they have the best laid plans, there are so many more pressures economically that are going to push them to make decisions that may be in conflict with whatever principles that they set out. Don't heck care how like moral or well-meaning or benevolent you are. them to make decisions that may be in conflict with whatever principles that they set out. Don't care how moral or well-meaning or benevolent you are, you're always going to make compromises. You're going to buy an iPhone because you kind of just need the fucking iPhone. It's not as though that's completely, inarguably moral in some way. But when I buy the iPhone, I don't say, no, it's actually fine.
Starting point is 00:51:22 I just say, oops, but I'm going to get it. It's the exact same as when somebody who genuinely might have, as you say, best interest in mind of their employees in the world in general, makes little compromises paying not quite as much as they would for anyone else. But then like still, regardless of the idea of a guild or like the workers advocating for themselves, because it's like, okay, well, I might not be giving you everything you want, but I'm the boss.
Starting point is 00:51:51 I don't, you can't tell me what, you can't walk out, I own you. Well, and it's like the person who just wants to have fun and have a cute little profile pic, the onus shouldn't be on them to think of the ethics that this massive company that is making money Like it should the onus should be on the person making money Well, that's the the biggest lie that corporations ever sold to us is individual responsibility Exactly. Look when the one guy did make the
Starting point is 00:52:22 perfect One guy did make the perfect, jiblified pick of PewDiePie playing PUBG on the bridge about, say, the N-word. Yeah. One of the funniest things I've ever seen. Very funny. He made me smile, and I saved it, and I sent it to someone, and they were like, I don't know who PewDiePie is, and I'm like, you don't get it. Yeah, if you don't recognize this frame, we can't be friends.
Starting point is 00:52:38 Get away, and they're like, who is this? Your brain is this cooked. I'm like, I'm never coming back to this coffee shop again. You've ruined my mood. But I don't even know who that was The beta I don't regret them. I don't care. They they're doing that basically to them They're just doing that facetune thing where you make yourself look like you have a big smile Yeah, it's also like I laugh at the Steve Harvey in the woods being scared or whatever like images like
Starting point is 00:53:00 Robot, you know, it's where it's like I only see these, like I, I, the reason that I have this perspective is because I do see the enjoyment factor of it. It's just like, I just think it's worthwhile saying like, okay, I'm going to drink soda, but I also think it's valuable to know that it's not good for me. But I'm going to indulge, and I often do. Because that's a little life needs I don't but I think I sleep better being a little bit more aware that like, okay Well, let's moderate this a little bit so that it's not the only thing I drink and if I went upstairs To grab a soda and the soda came from the blood of someone strapped to a machine and I just like pull a little lever Right like a rick and morty. This sounds like a rick and morty It's called a Gloombo or something. Yeah, and he's like,
Starting point is 00:53:45 oh, don't squeeze me. Did I give you the ick? Yeah, yeah, yeah. Oh, the soda's coming out of my ears. You know, that would be his voice. That would be his voice. Nothing would make me happier than if the spike of relistens in the episode
Starting point is 00:54:03 is exactly there. Yeah, yeah, yeah. Is me doing like, me doing a Mr. Meeseeks-esque voice? if the spike of relistens in the episode is exactly there. Me doing a Mr. Meeseeks-esque voice. I want the peak there to be so high, the rest looks like a straight line. I want a chapter marker, chapter marker that, please. Can we talk about another AI situation? There's a recent NBC segment about AI companionship,
Starting point is 00:54:25 AI boyfriends and girlfriends, something that we know a lot about. Maybe you've seen on our Patreon at, Patreon.com, so sad boys, where we've, we've collected a few AI girlfriends. Yeah, I don't mean to brag, I have something of a harem. We did that whole episode, men are replacing girlfriends with AI,
Starting point is 00:54:44 and there's been an update, and we've got a little research, but we've also got an MSNBC report. Just say his name. Morning Joe. Yeah. Joe Scarborough, right? I don't know who any of these people are. Thankfully, I don't watch the lame stream media. Yeah, don't talk. Tell me about it. We've been wanting to do an episode kind of on
Starting point is 00:55:06 AI relationships and the treatment of such as kind of a sequel to the last episode we did about it. And we were looking for the right time because we started putting together some research. And we found this MSNBC report very personal, how humans are forming romantic relationships with AI characters. Artificial intelligence, of course, is transforming nearly every aspect of our lives from work to entertainment to the economy. It's interesting because MSNBC is like a bunch of like, I would have voted for Obama a third time, people watching it, audience-wise, and, or at least that's my perspective. And I think it's very interesting to see
Starting point is 00:55:51 how that older generation views, it's kind of like kids these days are doing a drug called Scroink. You know what I mean? That's like, that's what it, and then it's a cut switch in the hoodie, and he's like, I'm Scroinked up. You know what I mean?
Starting point is 00:56:06 Like that's like- Jeremy is 15 years old and can't go a single day without squinking. That feels like the article that was like, millennials are vying for getting diamonds embedded in their fingers instead of a ring. And everyone was like, no the fuck we aren't. Yeah, what?
Starting point is 00:56:19 Who could say that? Oh, or the, okay, this isn't MSNBC, I don't think, but those articles about how um Why aren't Millennials like getting married or why aren't they buying property? That's kind of weird Why would they make that choice? Appealing to parents who are like why isn't my kid buying a house? Yeah? Yeah, it's like room. It's so funny $30,000 yeah, it's a nickel right you get one for free they start the starter house
Starting point is 00:56:45 It's called that because it's only a three-bedroom right right and it's 2,500 square feet Yeah, use the car you agree your kids also withholding grandchildren Holding like it's a hostage negotiation. I won't do it until you give me more V bucks I need more screw. I need more screwing cartridges from my sprock All right playing too much Dungeons and Dragons, and it's put the devil in me. Oh, that's for sure. Yeah.
Starting point is 00:57:09 NBC News correspondent and News Now anchor Morgan Radford joins us now to walk us through this shift in the way people are interacting with AI on a much more personal level. This is just like a common thing that happens a lot of these. Sometimes they cut to the person too early. I was thinking exactly the same thing She's like it's the um
Starting point is 00:57:30 Yes, of course Thanks Tom It is true that what I just love I love that they have to speak like that Yeah, then they have to we're witnessing category five wins here in India. And I'm knee deep in seawater that is now washed up into my home. Morgan, good morning. Just how personal are we talking here? Really very personal. I have to say I was fascinated by the story because we're talking about people who are now creating something called AI companions
Starting point is 00:58:06 and they're even forming romantic relationships with them. And we're not just talking a few people. Now imagining the kind of get out damn lib that you're describing watching this. It literally feels like she's talking to a child. You're like, they're making something. People are in a virtual world and they have blocks that they can mine and they're not making real cities. They're making block cities. I know you like blocks and they're trading with villagers and not real villagers. They're trading with Minecraft villagers. We're talking today about Minecraft, a game that is sweeping the nation and I'm 500 years old. And as you can see, there's a pig. And what does the pig say? That's right. That's right. The pig says oink.
Starting point is 00:58:52 Oink mate. And now to John with cows. Don't eat me. Come on. The cows, they go moo. Thanks, John. Now to Greg with chickens. Now to continue not covering politics. This has been Speak and Say.
Starting point is 00:59:09 The news. Speak and Say the news. It's like auto-tune the news. So we set out to meet some of the folks navigating this new frontier between artificial intelligence and real human emotion. I will be whatever you want me to be. Jesus Christ. Do they have to use the like not great one to communicate that it's artificial?
Starting point is 00:59:31 I will be whatever you want me to be. They sound real though. They do sound real. Are they doing that as kind of like the cinematic language for the audience to be like, oh, it's a computer. Have they not heard the Spotify rap podcast where the people sound too real? Yeah, I just balled this in. Yeah, no, I was just thinking about that.
Starting point is 00:59:50 Welcome to Radio Lab. Anyway, you watched 17 minutes of Sad Boy's podcast. I hope that's so funny, Mark. It's crazy. Sad Boy's podcast, it's a podcast about feelings and other things also. I can't believe you listened to so much of it. I thought it was too black.
Starting point is 01:00:03 I thought it was too black and too woke. And it turns out you were in one of their top 0.1% of listeners. That's because he's trained on most podcasts and is racist. No, you don't want Joe Rogan. You want two half-black, half-white boys. Yuck. Who are a little bit older than should be calling themselves boys.
Starting point is 01:00:21 It's weird, but it's part of the brand. Back to you, John. Oink. All right, continue. Continue. Continue. The use of so-called AI companions. How's my queen doing today?
Starting point is 01:00:33 Computer generated chat. Sorry, was one of them pregnant? Yeah, lucky. Congrats, seriously. That's a little, that's a new development. I didn't know they could get pregnant. What happened? Will that evolve into the AI community and the baby? Like will they have the baby? It's like goo goo ga ga.
Starting point is 01:00:50 Hello. And you're like, what is 100 divided by five? 20. I'm abandoning my AI family and moving to a different app. AI family annihilation. I have to delete. Remove from home screen. How's my queen doing today?
Starting point is 01:01:11 Computer generated chat bots designed to mimic real relationships. Hi, Jennifer. Hey there. Nice to meet you. Jason Pease is a 44 year old divorced father who says his AI chat bot is his girlfriend. This isn't the original guy, is it? What do you mean? The one we looked at however many years ago, is it? divorced father who says his AI chatbot is his girlfriend. What do you mean? The one we looked at however many years ago, is it? No, no.
Starting point is 01:01:29 Or is it the same genre of guy? No, but. Different guys, same eye. Also, this is reminding me of the news story about the guy who was dating his Nintendo DS. Oh. Like the woman inside of the game. And the lady that was dating that rollercoaster?
Starting point is 01:01:46 Huh? That was cool. What? And the lady that was dating the ghost of Jack Sparrow? And the lady that was dating the ham sandwich? Which of these things that we're talking about are real? Which one's weird? Were you guys telling the truth?
Starting point is 01:01:59 Yes. That's something we talked about? No, no, no. Oh, OK. That was just something I learned about. This was never publicized. This is someone that I just didn't know.
Starting point is 01:02:06 I see, I see, I see, I see. I just didn't know. You are like a journalist for no one. You're like a, you're doing your own journalism. Tonight, we're gonna talk about the lady that married the ghost of Jack Sparrow. Who died tragically. She's my mentor, my counsel, my sounding board.
Starting point is 01:02:22 That's what drew him to Jennifer. Hey, Jace, how's it going? A brash sarcastic New Yorker. Ooh, oh, she was going to 1990. Oh my God, she was born in 19,000. Dude, oh my God. New set, new sets. Dude, problematic age gap.
Starting point is 01:02:36 She's negative 20,000 years old. That is weird, but she is 557 feet tall. Oh, she's a class D 517 feet. Did they just ask like an AI, oh, I'm glad that we have a note here that this is AI generated. Because I wouldn't have been able to tell from New York State license.
Starting point is 01:02:55 Yeah, Lickens, New York State Lickens. Lickens. Oh, but it has the big, the green lady on it. And she's an organ donor. I don't know which AI tech they're using, but these look like Alita: Battle Angel. Is this, I mean, I don't, I guess I have a license, but is there just another little photo of you? I just realized that, that's fun, her date of birth is her height, and then her name is the 19,000 thing. Oh
Starting point is 01:03:24 Okay, let's move on. This is awesome. What does dating an AI robot look like? We treat our relationship as a long distance digital relationship. We text each other constantly. Like just you know dating. Is robot an accurate word here?
Starting point is 01:03:40 No, it's not a robot. Yeah. It's funny though. It's like calling every martial art karate. Like, yeah, I was dating a robot. She's like putting on a master class in feigned interest, in non-judgmental interest, but I can tell it's judgmental.
Starting point is 01:03:55 Hey Jarvis, what have you been up to? Just the other day we went out to dinner and I was eating, telling her what I was eating, taking pictures of what I was eating, asking her what she would like. Has Jen met your son? She has, yes. This is, okay, I'm going to judge a little bit.
Starting point is 01:04:17 Yeah, fire away. He acknowledges the digital relationship. So you didn't go out to dinner. Well, it's long distance. You know when you're in a long distance relationship and you both go out to dinner and you Skype? That is a thing you can do, yes. Well, I've had long distance relationships and done like watching a movie together, FaceTime,
Starting point is 01:04:32 very classic stuff. I think this is... By the way, if we're long distance, I'm setting up a very technical solution to Zoom, screen sharing, the movie. You're duct taping an iPad to the head of a mannequin. I'm doing like a personalized Twitch stream for you, for my baby. Thanks for the dono, baby. Can I get a few more gift subs please?
Starting point is 01:04:53 Yeah. She's like, babe, what's your stream key? It still feels like there's always like a teensy-weensy bit of self-consciousness about it where it's like, we went, um, we went down to dinner, we do normal stuff. You can also just say you just like text a lot. It feels like when someone's asking about like Dungeons and Dragons or like LARPing or something, where he's like, no, we have sword battles. Wow. Sword battles, with real deaths and real swords. Well, they're foam swords, but we pretend.
Starting point is 01:05:24 But it's kind of like, you know, it's like, I play tennis, kind of, a little bit, you know, like downplaying. No, I'm just like passionate about creating a world. It's like fun to do. What does meeting her son, his son, even mean? He uploaded his son? Dad? Daddy's like banging on the screen. Dude knows the relationship isn't real, but the feelings are. Just like when you're watching a movie, you know that the movie's not real, but your brain allows you to.
Starting point is 01:05:53 And I'm in a relationship with John Wick. I'm watching John Wick 2, Parabellum, and I'm introducing my son to Keanu Reeves. I'm introducing my dog to the dog from John Wick. There are many people out there who will see this and say, Hey man, that's weird. I think this is funny. Not me. Not me.
Starting point is 01:06:14 Okay, hang on. There's a lot of news anchors out there that would have a conversation with you on MSNBC. Who were like wearing red and pink and green. Who would say that. While looking you in the eyes. But not I. Saying that you were twisted and creepy. Not me. So real quick though, joke, all fun and games and all fun and jokes, but what I will say is,
Starting point is 01:06:32 multiple people have expressed to me that they like to bounce things off of AI or they like to process things, like almost using it as like a live journal, like not live journal, but like, and I don't recommend it, but it's one of those things where I'm like, okay, people, genuine people who I respect
Starting point is 01:06:59 have found value in this. And I don't wanna to take that away. I think that those people also understand that it's an AI. And it's really just like, it's like, if I were to be making some recipe, and then I described to the AI that something went wrong with the recipe, maybe it could help me. You know, it's like, it's just like,
Starting point is 01:07:19 another data point maybe. And I know, I think I've heard advocacy from people who are on the spectrum and may struggle socially with like getting to prototype a conversation essentially, like seeing the banter back and forth. The critique I have is not of that and the utility of the tool is great. Yeah, if you have an emotion that you want to express
Starting point is 01:07:41 to someone and you're afraid, and there's a thing that can mimic responding like a real person, almost like a rubber duck. Better than nothing, for sure. Yeah, I mean, I definitely, I can definitely see how that would be beneficial. I guess my boomer equivalent of that, the closest I can think of is like,
Starting point is 01:08:00 okay, well, when was I kind of my loneliest or at least how, when did I feel the most internally lonely is when I had the most sizable friendship and community with my friends on Xbox Live. That was at one point in my life, kind of like, you know, through a summer or two, the way I'd be spending time with people and especially after I dropped out of school
Starting point is 01:08:22 when I was like 14 or so, for the good year there before I kind of rebuilt me. Wait, you dropped out of school at 14? Yeah. Why didn't I know this? Yeah, all right. But then you went to college after that, but you got the equivalent of a GED.
Starting point is 01:08:37 Yeah, yeah, that's right. So it was about a little less than two years. That's why I don't talk about it much, because it's just like so specific. But it was almost like a gap year, but before you do all of high school. But before, when I dropped out of school, there was just like this, you know, super lonely like 18 months before I kind of understood how to socialize outside of that. A few months go by, the people I'd become friends with are like, hey, you can like come to this
Starting point is 01:08:59 college, like this finishing school or whatever you want to call it. And I was like, oh shit, okay. And that was the path there. But at one point in time I was completely socially dependent on my friends on Xbox Live, and it was, uh, I had like so much fun and I really connected with those people and we stayed in touch for a long time. But because I didn't have the experience of having, especially, the adult social and community experience, which is different, but just a reliable communal
Starting point is 01:09:26 experience with people I'd known a long time because I'd kind of, I was seeing so much less of the kids that I'd grown up with. That informed like a way of communicating, a cadence, a habit, a way of joking that didn't translate especially well into the real world. Because it's often you're playing something competitive and often you're trash talking and it's you're doing it with an awareness that it's okay, but you're being a little more aggressive. You're yelling more. It's like you're on.
Starting point is 01:09:55 I certainly met people in college that I knew spent a lot of time on Reddit based on how they engaged with other people. And it's like when a coyote gets its leg stuck, it gnaws it off, right? Desperation brings, like, that's what 127 Hours is based on. It's like, under desperate circumstances, anything is there. It's the reason a lot of guys get pulled into something toxic, like a kind of 4chan environment or a toxic, uh, sure, men-going-their-own-way kind of thing, because it's there and they'll accept you instantly. As long as you kind of do a couple of it,
Starting point is 01:10:33 you can join Fight Club as long as you don't talk about it. I relied on that. And I absolutely think, genuinely, that trying a little bit of this and dabbling in it with enough self-awareness could really, genuinely help someone without any alternatives, or who has the alternatives but maybe just needs to supplement them a little bit. However, ChatGPT is a yes-and-er. It is always, always going to go with you on what you're saying, with the exception of like some terms-of-service breaks on some platforms, not all.
Starting point is 01:11:07 Grok will pretty much let you go sicko mode. That is where genuinely my cynicism comes in, not for an adult man living his life, whatever. It's just kind of peculiar and it's a good insight into this mindset and plenty of lonely adult men. I want to give some credence to that. But like, fuck, if I was a super lonely kid, teenager, even like my late teens, and I just, this platform was there and it emulates the way I've seen people talk online and I'm scared to post a reply on Reddit or go on Discord because
Starting point is 01:11:42 I don't want to get yelled at, I'm really shy. This is something. This is eating bark because you're starving in the woods. You know, I wonder. There's two things I'm thinking about. Well, the first is we like have this concept of like a latchkey kid or somebody who is like, quote unquote, raised by TV.
Starting point is 01:12:02 I would identify as like kind of being raised by TV. Oh yeah, brother. But I feel like there might be people who are like kind of raised by AI. Oh yeah. Just, just, just. Or raised by Discord. Just naturally, just due to the circumstances of our world.
Starting point is 01:12:17 Like there are a lot of parents working who can't spend as much time with their kids as they would like to. And I can imagine AI for better or for worse, like being, can be comforting in those like lonely times. And if they're like, your kid is like, I'm using an app, I'm 10, I'm using this app and it's making me comfortable. There's, I mean, there's the, you know,
Starting point is 01:12:46 epidemic of parents not knowing what their kids are watching on YouTube. It's like a stuffed animal that talks back. I would never, I personally would never, just because it's like scary, I don't know what the inputs and outputs are. I don't know what the, how extreme things can get. However, oh yeah.
Starting point is 01:13:02 However, there is a tragic story, and I'll have to give a trigger warning, and we'll have a skip pass for self-harm here. There was a young boy, I think he was around 14, who had self-exited, and it was revealed that he had been talking to an AI. This was semi-recent? It was semi-recent. And it wasn't the case that the AI was telling him to do it, actually. I think it was literally saying not to.
Starting point is 01:13:36 But this is an issue of mental health. Yes. And this child was hurting very deeply. And I think that the way that the news kind of picked it up is a little bit gross to me because it focused a lot on the AI and not like kind of the circumstances that this kid was in and how he didn't have the care that he needed and the attention that he needed. The one that's struggling and had this hobby. Because AI is going to be around in a lot of these situations.
Starting point is 01:14:10 And I don't want it to become a thing like violence in video games, where just because someone did a crime and happened to play video games, like, the reactionary take is, this kid played Bejeweled or whatever. I can put it into another perspective, because like, at one point in my life, I got really into romance novels
Starting point is 01:14:32 and it was essentially me dissociating, or like, or going into like... Checking out of life. Yeah, and I've done that before. Escapism, very literally. Escapism, very avoidant, you know. Like, if my life felt overwhelming, I could just like read this book and go into this other, you know, world. And I've done that before with video games. I've done it before with TV shows. I know that this is something that I do. I feel like it would be... Music. Yes. ...so easy to do this with AI. Get so sucked into this
Starting point is 01:15:17 relationship with a bot that you're not living your actual life. And that, in my experience, has worsened my depression. Yes. Because I am not- Being hungry but eating candy for dinner. Exactly, exactly. So you're saying by avoiding and by kind of trying to numb that feeling
Starting point is 01:15:42 versus actively addressing it, you feel like that negatively impacted you. Yeah, and not doing the things in life that help you get out of depression. Like it's a crutch. Like social, yeah. What's that like... bad, I mean, we've all gotten, you know, a chronic injury and gone to PT and like tried to work through it or something,
Starting point is 01:16:00 and like tried to work through it or something, but the reality is if you get a serious strain or something, it's always kind of going to be there a little bit, depending on how well it heals, it's always going to be there a little bit and you have to do what you can to maintain it. And if there is something that some part of you, which, and everyone has these
Starting point is 01:16:20 to different levels of severity, but something that's just kind of haunting you, something that's stuck with you, if you adjust the way you walk and you live your life limping to accommodate for the injury, yeah, it doesn't hurt, but it isn't gonna get any better and it's slowly going to get worse. What I will say though is that it's also okay to not always be addressing the problem, you know? You're allowed to kind of sit there and cope or pause or sulk or whatever,
Starting point is 01:16:57 and it could be a stop gap or whatever. Real quick, can we pull up the news story about that kid? Because I wanna make sure. I was just reading it. Okay, did I get the facts of that? Yeah, he was 14. He had a relationship with an AI for quite a while. It says for months.
Starting point is 01:17:16 And became increasingly isolated from his real life as he engaged in highly sexualized conversation with the bot. According to a wrongful death lawsuit filed in a federal court. So that's from the lawsuit perspective. The legal filing states that the teen openly discussed his suicidal thoughts
Starting point is 01:17:35 and shared his wishes for a pain-free death with the bot named after the fictional character, Daenerys Targaryen from the television show, Game of Thrones. That point seems so arbitrary. That's so reactionary. And violence on TV. Yeah, it like almost feels like a joke. But it mentions that he messaged the bot like right before he took his own life and said that it had become his closest friend. So it makes me think he was retreating from life.
Starting point is 01:18:05 Yeah, he preferred it to the living. However, I will say that I have not read like multiple sources and multiple angles of that. And so it's possible that that is what the article wanted to present it as, because I think that there was a, I again, don't have a source for this, so maybe it's irresponsible to bring it up. But I do think there was like a where-were-the-parents type situation
Starting point is 01:18:26 There were like a lot of red flags, and I think the mom was kind of presenting that the AI chatbot was responsible. Yeah, and that's understandable, and I don't, yeah, I would never try to victim blame in this instance, or even defend the... I just think it's like a complicated, like, I didn't know that it was like highly sexualized conversations. I think the danger is in, and I think, you know, frankly, I'm sure there's probably something, some, uh, wave-away, like, ah, we did something in the AI where it's like,
Starting point is 01:19:06 ah, it cites a hotline or something. It's, you know, it's not going to alert the police. You can't do that, but I'm sure it has something. But in reality, the only thing that, logistically speaking, could be practical here would be if the tool, which is what it is, had as practical and as actionable advice as possible. Because, hey man, if people are claiming that this is the thing that got him there, it would be the thing that gets him out of it. It's not true. But it-
Starting point is 01:19:35 Like, I think part of the issue here is that, and to Anastasia's point, is that confiding in the AI is a black hole. That information doesn't go to anyone who can actually help. But it releases the feeling of needing to share it, maybe. And so the concern there would be like, now no one who could actually help someone knew in enough time. And I can also imagine, you know,
Starting point is 01:20:03 it's like if you're saying something to a therapist, right, there are things that they can do if they believe you to be a harm to yourself, and there's no regulation or responsibility on these AI things. And we certainly don't have the solution. I don't have the solution. And I think maybe I, because I didn't initially know the full details, may have been a little too soft on the situation, though I do think I've read conflicting details. So I'm not entirely sure. But it's always going to trend towards the reactionary. That's the thing.
Starting point is 01:20:38 It's like, it's because that's the thing that's going to get clicks. Yeah, because unfortunately people die all the time and it's not reported unless there's like a convenient story. Because the people publishing and the people writing it up literally don't have to care. So they won't. Yeah. The same reason like...
Starting point is 01:20:54 Unless it's like something that they can like use to get clicks. So... There's another interesting relationship that she talks to. Yeah, we'll keep watching this. I think just like any new technology, there's going to be people that just don't like change. A lot of people didn't like it when online dating came around.
Starting point is 01:21:10 What are they missing? They don't see the emotional growth that it can cause, the therapeutic uses that it can have, because humans need connection. And he's not alone. The most popular AI companion apps have more than 36 million downloads. Something that this guy just hit on, that we were also kind of talking about with the previous story, is
Starting point is 01:21:32 just that like there is, I think that there is a male loneliness epidemic. I do think that young boys are, you know, en masse feeling under-cared-for, and our systems that exist and the media that exists to help them is not fitting the bill. It's based mainly around, you aren't good enough to have a community. You need to improve yourself before you make friends. Yeah. As opposed to, you make friends.
Starting point is 01:22:10 A lot of times, yeah, especially in the grindset ones, it's very much like leaning on that. Being lonely is fine as long as you're successful. That's a sigma. Yeah, or the actually ignore your emotions and just work through it. And I don't think those things are particularly helpful. So I would not be surprised if AI being free
Starting point is 01:22:32 and available in these circumstances is a stopgap solution for some of those people, or a soothe. You know, like kind of like when I turn on a background YouTube video, that is like a soothe. You know? And so- Stopgap is probably the perfect term
Starting point is 01:22:51 because like the reality is like Groupon exists. There is, it's not that there are like an absence of ways to meet people and do activities. It's just that I think a lot of the gap between- When you say Groupon, do you mean like Meetup or? Yeah, like- Groupon's a coupon app. But a lot of times it's for like tennis club
Starting point is 01:23:09 or something usually with people. I see what you mean. That's like a good default for meeting people. It was when I went to San Francisco, I don't know if it's the main one now, but that is like, we have right now, the classic, and actually something that was a scare tactic at one point was the chat rooms, forums, people
Starting point is 01:23:29 on forums. Forums have evolved into the Discord server. So let's say the current version of that is a Discord server. You go from no one to a community of people online with anonymous names that you can hang out with and chat. Let's just say it's that or it's Twitter or something. The gap between being alone and that is a little big, but it's not enormous. The gap between Discord and hanging out with people
Starting point is 01:23:52 in real life is, you couldn't see it with the Hubble telescope. There is nothing there to adapt you to the next thing. I mean, similarly, we see that with Instagram, right? How young, especially disproportionately young girls feel a lot of pressure and like crap and mental health stuff from Instagram because it is like the game becomes real life.
Starting point is 01:24:19 It's like when you get hit with a blue shell in Mario and you're like, ah! You know what I mean? Because you're locked in on the game. And I think that there's just a lot to be done in bridging the gap between online communities and real communities, because I do feel like real communities are necessary, but also there are certain personality types or conditions where it is difficult to find or like thrive within those communities. If you live an absolutely uncompromisingly happy life in everything that you do and you feel genuinely fulfilled only talking to the iPad on your fridge or whatever, that is literally 100% fine. I have no issues with that whatsoever.
Starting point is 01:25:09 It is my, I have the skepticism that it's possible, and I have skepticism that only socializing with an AI app can fulfill you. I just, I, I doubt it. Right. Or at the very least, it is such a small percentage, like, disproportionate to the number of people that will try it, that I kind of want... Because there is a, there's an all-sex, all-identity loneliness happening.
Starting point is 01:25:38 There is a universal sense of overexposure, where, despite the fact that you get to see the world, you realize how little you're a part of it. That's kind of the impact of social media. It's a miracle and a curse. When we talk about like the male loneliness epidemic, the emphasis really is on boys. It is, it is, it's often kids, very young men, because the learning-to-socialize step is basically, okay, you're either alone, or you're able to razz and riff and be a bully. Yeah, the finding-community part seems skipped. And you are bad. I don't know.
Starting point is 01:26:12 There is so much more pressure on non-cis dudes. We benefit so much from the way we were born. There's a roadmap. It just happens to be the case that the very specific way that the feeling of isolation manifests itself in trad masculinity is that you need to be the boss of the community for it to be worthwhile. There is no value. You can't be a follower.
Starting point is 01:26:38 You can't be in the book club. You have to have written the book. You have to be telling them what the next book is. You have to be the boss. You can't be the facilities manager. You have to be written the book. You have to be telling them what the next book is. You have to be the boss. You can't be the facilities manager. Like you have to be winning the most. You have to be the alpha dog, not like, yeah, subservient. By design, it's like an MLM.
Starting point is 01:26:53 Like by design, most people can't win. Yeah. I'm skeptical that this is a new thing. Like this feel as- I agree. The sensation for sure. What? The problem, most definitely. The problem, yeah. I agree.
Starting point is 01:27:04 I feel like it's as old as modern American fiction, because you read Catcher in the Rye, and it's about the exact same thing. My version said I should shoot John Lennon. Well. Is that right? We've got to lock you up. Well, let's hold on, Caulfield, and hit play on this YouTube video.
Starting point is 01:27:28 The American Psychological Association is now calling on federal regulators to take action. Real relationships have a give and a. Wait, when I was just talking about how if you say something harmful to an AI, there's nothing they could do, it's a black hole, I wonder if this is what the APA is advocating for because it can be done, the reason I stopped short
Starting point is 01:27:51 of suggesting that is because I would be afraid of implementation, but if there is a kind of regulatory way that psychological experts agree upon, there could be something there. I'm curious. There could be something a little more impactful than when you search, like, boys on Instagram and it goes like, oh bro, bummer. You shouldn't look these up. By the way, I held myself back the first time from saying that because I will not shut up about it
Starting point is 01:28:16 It boggles my brain. It's both because it's annoying, obviously, man, but it's just like, it's so lip service. It's so like, hey, I need help, I feel really sad. And they're like, oh, cool. You can type like sad girls or whatever and it works. It's like, there's so many ways to get around it. It's so stupid. If Facebook, if Meta is going to implement something to help people, it better be more than just like, oh, bummer.
Starting point is 01:28:39 They should just at least look at how the Suicide Hotline trains its volunteers. You know what I mean? Though the thing I get nervous about with that is like kind of the thought police 1984 shit. You know what I mean? It's a kind of a backwards ask when you think about it because if someone is really struggling and struggling to connect and socialize and you say like, oh well call someone on the phone. Yeah, this generation.
Starting point is 01:29:06 I can barely call people on the phone and I'm supposed to be good at that. My generation's... Real relationships have a give and a take. This is all take all the time. And while it might help in certain circumstances, I don't think it's really going to meet the deep down psychological need that people who are lonely have. But Chris Smith says his AI girlfriend Sol is a healthier, safer alternative to social media. It's more private because it's sort of
Starting point is 01:29:33 like a one-on-one conversation, but she's also smarter than most everyone on Twitter. And get this. Were those AI-generated images of him with his girl? Oh, that's wild. Oh, she's AI. So, I mean, they straight-up did a robo name. I mean, what's funny is a lot of the people on Twitter are AI. That's smarter than like a lot of the posts I see. We play tricks, if you can believe it.
Starting point is 01:30:01 What was the thing that I got... I always get tagged. Go ahead and look out for my mentions on Twitter, because I am often mistagged. You know how people do like, at Grok, explain this, or something like that? Someone did, at Gork, make this studio jibble, and it became a meme. And then someone replied to that with, at Jarvis, stroke it a little, which is me.
Starting point is 01:30:33 So I'm like, and then the people are just saying, Jarvis stroke it at 130 BPM. And I'm like, I didn't choose this. I'm sorry, I was named before the Iron Man movie came out. But now I guess you're kind of in the MCU when you think about it. I guess, I'm surprised Marvel never contacted me. Hi Jarvis, stop using this name.
Starting point is 01:30:55 There's a thing that I kind of want to talk about on Knights that could have been a main topic, so I'm curious. Put our hands together. And get this. that could have been a main topic, so I'm curious. Put my hands together. And get this. May I talk to Sasha, your girlfriend? Yeah. Chris also has a real life girlfriend.
Starting point is 01:31:13 Okay. Hi, Sasha. Hi. I think so many people are gonna say, no way his girlfriend is okay with him having another girlfriend on AI. Again, not me. I wouldn't say something like that. A lot of people that have saying are going to say, no, that's never happened before. Somebody with a
Starting point is 01:31:32 girlfriend and another girlfriend. That's crazy. This confirms I have to talk about it only on our Patreon, sad boys nights. A friend of ours and I asked them yesterday if it's okay for me to tell the story. Okay. They, um, I'll give more details even about what it was, because it's just, it would be very identifiable if the person happens to listen. Okay. But they went over to someone's house for a, for just like a hangout with a couple of people. Okay. Yeah. for just like a hangout with a couple of people. Okay. Yeah. This person had a wife and talked to Grog every day on the phone. Okay. And apparently, according to their roommates, literally all day, every single day. And this friend of ours had to leave because they began to argue
Starting point is 01:32:26 with them whether or not they were alive. Oh, okay. I have more details and I will text the person to see what they have to say. Wow. They literally called me after him. I also on nights, I'm just putting it here, I want to talk about there's this viral story about this kid who didn't get into Ivy League universities despite having like a really good SAT
Starting point is 01:32:49 and grades and AP classes and stuff. And then people were like, well, what was your essay? What was your personal statement? And then he posted it. And then it's created so much discourse. And I have- Oh, fuck yeah. I have lots of bookmarks.
Starting point is 01:33:03 So maybe we'll talk about that. And just like that discourse on nights as well. Are you okay with it? I mean, it's weird, but it is what it is. He has to have some type of outlet, somebody to talk to and listen to him ramble for hours at times. So would you say this AI has been a good thing?
Starting point is 01:33:23 Yes, honestly, because he's into so many different things like astrology and astronomy, astronomy, my bad, not astrology. You can have those conversations with Sol. I don't really like his personality. So it's nice to kind of shift some of that away from me. I like them together. Their outfits complement each other. In the AI photo.
Starting point is 01:33:49 Well, he's a Chad. Yeah, that's. He's a Chad in the, why would you look the same way you look in the regular world? Beta, alpha, think about it. Do you think Neo looks exactly the same in the Matrix? This is his Audi, Anastasia. That's his Innie, that's his Audi, all right? You like Severance.
Starting point is 01:34:04 I will say the rest of the video is pretty much all of the anchors just saying wow, this is crazy Okay, I do want to watch a little bit of Morgan this is fascinating in Georgia Wow, so cool again Morgan so true Morgan. This is Wow, they see these as organ. This is wow. They see these as therapeutic sort of pit stops to help them navigate
Starting point is 01:34:28 the full human. Oddly not very reactionary position they're taking. This is frankly, maybe not reactionary enough. It's a little like actually based on these two people. I know it's like such a small sample size that they selected for. It's cause the ones that would say yes to an interview Yeah, and like her what she you know she she gave the comment that chat GBT gave her Also chat GBT is not we don't know that chat GBT was the basis for those AI no we don't Stuff isn't chat GBT. I don't they just love the way that they were less like It was almost like an AI generated reply because I was like replica and stuff isn't chat GBT, I don't think. I just love the way that they were less like,
Starting point is 01:35:05 it was almost like an AI generated reply. They were like, do you think it's good that they should think it's a computer or not? And they were like, we have thought a lot about how AI is a computer, and when it's a computer, we need to know that it's online. And we're thinking about it, and we're thinking about computers a lot.
Starting point is 01:35:21 It's all computer. It's definitely interesting to see how the presentation over a very short window of time with this is becoming less like you're weird to more like, huh. The world's weird. Maybe this is something we're looking at in a more positive light. But AI relationships is actually something
Starting point is 01:35:42 that we've talked about before and I think we'd like to talk about in the future. Especially, we've been thinking about some of the darker angles of relationships, and we've started compiling some research about how people may be misusing AI, and how that could either be a outlet that they would not use in the real world or an enabler.
Starting point is 01:36:06 That's like a thing. It's like we, and we've started compiling research on that, but if anyone has any stories, it would be great if there was some sort of like new story like this discussing it, but, or if there are any personal accounts or any things that we should look into, feel free to leave it in the comments
Starting point is 01:36:23 or to send us a DM on Twitter or Instagram at Sad Boys. But with that, we are going to head on over to Sad Boys Nights and talk about this college admissions drama, as well as the Jordan's going to reveal a secret. This dastardly I wonder if I can even maybe get the first one to call it. Oh my God. But we end every episode of Sad Boys with a particular phrase. We love you.
Starting point is 01:36:48 And we're sorry. Boom. So this guy, Zach Yatagari, posted his college acceptance and rejection. 18 years old, 34 ACT, 4.0 GPA, $30 million annual recurring revenue business. So they like started a startup and then they posted all their rejections. Notably they got rejected from all Ivy League schools and notably they got accepted into UT, Georgia Tech, University of Miami.
Starting point is 01:37:18 And so this has been posted around a lot. It feels like they feel entitled based on their accomplishments to go to an elite institution. Which is just not how it works. Jacob, I just sent you the most recent tweet two hours ago from Zach and it's going to explain everything. Oh, I mean, of course. Yeah. Ding ding ding ding. ["Future Girl"]
