Pivot - The Discord Leaks, San Francisco Safety, and Kate Crawford on AI

Episode Date: April 18, 2023

Scott’s back on Twitter, and Elon says the platform could be cash-flow positive this quarter - coincidence? Kara and Scott discuss growing calls for Senator Feinstein to resign, a delay in the Dominion v. Fox News trial, and impressive JPMorgan Chase earnings. Also, a tech consultant has been arrested for the murder of Cash App founder Bob Lee. And U.S. National Security is in disarray over Discord after an Air National Guardsman allegedly leaked classified documents on the platform. Then, we’re joined by Principal Researcher at Microsoft Research Lab and Professor at USC Annenberg, Kate Crawford to talk everything AI. You can find Kate on Twitter at @katecrawford, and can buy “Atlas of AI” here. We’re nominated for a Webby! Vote for us here. Send us your questions! Call 855-51-PIVOT or go to nymag.com/pivot. Learn more about your ad choices. Visit podcastchoices.com/adchoices

Transcript
Starting point is 00:00:00 Support for Pivot comes from Virgin Atlantic. Too many of us are so focused on getting to our destination that we forget to embrace the journey. Well, when you fly Virgin Atlantic, that memorable trip begins right from the moment you check in. On board, you'll find everything you need to relax, recharge, or carry on working. Lie-flat private suites, fast Wi-Fi, hours of entertainment, delicious dining, and warm, welcoming service that's designed around you. Check out virginatlantic.com for your next trip to London. Data, and a matching engine that helps you find quality candidates fast. Listeners of this show can get a $75 sponsored job credit to get your jobs more visibility at Indeed.com slash podcast.
Starting point is 00:01:00 Just go to Indeed.com slash podcast right now and say you heard about Indeed on this podcast. Indeed.com slash podcast. Terms and conditions apply. Need to hire? You need Indeed. Hi, everyone. This is Pivot from New York Magazine and the Vox Media Podcast Network. I'm Kara Swisher.
Starting point is 00:01:23 So I'm on the flight back from Tokyo to London, which is about a 73-hour flight. Yeah, 73. And I'm working on my book, as I heard you're finished, which really upsets me. Okay. And I decide, okay, I'm going to take a break. Weeks ago. Weeks ago. I'm going to take a break from my 15 minutes of working on the book, and I pull up the TV, and who do I see?
Starting point is 00:01:46 I'm literally, I thought I was hallucinating. I had taken a Lunesta, had a couple of Jackson, I'm like, oh my God, I'm now seeing, I'm hallucinating like Kara Swisher. You were on the BBC. The beep, yeah. Oh my God, it was hilarious. They would play 30 seconds of their interview with Elon Musk and then it'd be like, Kara, what do you think? It'd be like, no, he's a fucking idiot. This makes no sense. You were very good, by the way. I thought that was, you know, most of your work I find pretty mediocre,
Starting point is 00:02:10 but this was Mediocre Plus, the new streaming platform from Kara Swisher. Yes, thank you. You were very good. I was really impressed. Thank you. You know, I thought it was BBC Radio.
Starting point is 00:02:20 That's why I was wearing a sweatshirt. I just did. You were just so over it. You're like, whatever. He burnt his hand and called himself a hero. You had all of these like... Arsonist. He's an arsonist. He's telling us. You had all of these Brian Williams
Starting point is 00:02:33 outages. You're like, well, you know, a moose can't drive a convertible. And you're like, what? And you're like, well, it kind of makes sense. They ask me a lot. I like the BBC. They're smart. I did a Canadian one, too, which you're like, well, it kind of makes sense. They ask me a lot. I like the BBC. They're smart. They're very smart.
Starting point is 00:02:46 I did a Canadian one, too, which you may, on your next flight, find. Because I have to go up to Toronto. And I was with the CBC. And they're so good at interviewing. They're just really solid. They're not screamy. I'll tell you, you literally, that guy must hate you. Because anything he said, they'd stop and they'd go to you.
Starting point is 00:03:02 And I'm like, this is why that was a really fucking stupid thing to say. You literally, you tore him limb from limb. It was a terrible interview. Come on. I mean, I expect that from Tucker Carlson tonight. He stuck his chin out. I couldn't believe that guy didn't push back on him more. He didn't have any, like, he didn't just push back. He didn't have any, like when he said, give me an example. That's what teenage boys do when you're like, your room is a mess. Give me a specific example. When you were rude to me, give me a specific example. You know what I mean? Like, you're just an asshole. I wanted, I was screaming at the interview because he said, well, you've laid off all
Starting point is 00:03:36 these people. I was like, okay, you're four months from bankruptcy. What do you do? And I was like, boss, they weren't four months from bankruptcy until you took over. That's correct. They were financially solvent and on solid ground until you took over. He just made a crisis. He loves a crisis, that man. And alienated 70% of your advertisers and created a financial crisis. I agree.
Starting point is 00:03:55 I agree. Well. Anyways, well done. What did you do this weekend, Kara? I went to a funeral and a birthday party. My son turned 18, Alex. Well, tell me about the funeral. I will.
Starting point is 00:04:04 Well, it was interesting. It was my aunt died. It's the last of my brother's sister. I remember you talking about that. My son turned 18, Alex. Well, tell me about the funeral. I will. Well, it was interesting. It was my aunt died. It's the last of my brother's sister. I remember you talking about that. She was months ago. Yeah. And so they did a memorial service. And I drove Lucky out to Morgantown.
Starting point is 00:04:13 Me and Lucky took a road trip to Morgantown, West Virginia, where my dad is from. And I took her to the—my dad's buried there. And I took her to the cemetery. And then there was a memorial service at this. I don't know what her—she was very Christian. She was quite Christian. So, it was a lot of Jesus during the ceremony, which was—actually, it was lovely. The kids are—her kids, she has four kids.
Starting point is 00:04:36 One is more terrific than the other, but they're really great kids. But it was a lot of Jesus everywhere, I have to say. She was very religious. As long as they gave her comfort, right? Yes, exactly. Yeah, the preacher was like the guy who was, and he was sort of a preacher type character. Because they kept doing the Jesus thing, even when they were, everywhere you go, you move like a block. And they're like, now let's thank Jesus and our Savior and stuff like that.
Starting point is 00:05:03 Which I really, I do respect. And this guy goes, I guess you're not too religious. I said, actually, I'm Catholic, although I'm not a practicing Catholic. And he goes, well, you probably don't like all this praying. And I said, you know what? I love your praying, but it's getting a little much, I gotta say. It was nice. It was really nice to celebrate her life. She was an incredibly kind and good person. And her husband is still living, and he's terrific. And it was nice to celebrate her life. She was an incredibly kind and good person. And her husband is still living, and he's terrific. And it was nice to go back. I used to spend summers there when I was a kid.
Starting point is 00:05:34 That's nice. Well, you know, Jesus drove a Honda, but he never spoke of it. He used to say, for I speak not of my own accord. Oh, yeah. A little dad humor to lighten up the mood. Oh, you know a little bit of a biblical. And it was nice to be with my mom driving hours and hours to West Virginia. That was like a road trip. We almost lost another relative on the way down.
Starting point is 00:05:58 Yes, exactly. You sit down here, Mom, and I'll just leave you by the side of the road. No, it was my roots. My roots, as they say there. My roots. It's my roots. I'm sorry for your loss. Yeah, thank you.
Starting point is 00:06:11 Anyway, and Alex turned 18, which was nice. We took him for crabs. We took him to crabs. Well, that's funny. I got crabs on my 18th birthday. That was good. That was good. I see that you're back on the road.
Starting point is 00:06:22 In Tijuana, actually. You're not making crab jokes about my 18-year-old son. Anyway, I thank you for speaking to him about college. Scott was very kind to talk to Alex because he's gotten to a couple places and give him good advice. And he keeps saying the word optionality to me now because I feel like it was really – is that correct? Well, that was easy. He's saying optionality like all day yesterday. Optionality, mom.
Starting point is 00:06:46 And I'm like, where did you come up with that crazy phrase? Yeah, but literally, this is a conversation like, well, I don't know if I should go to this world-class university or this world-class university. And I'm like, hmm. Yeah. This is a tough one. Most of the kids I speak to are like, you know, at some point I want to go, got to be honest, boss, I think you're a little bit fucked in this economy. But your kid is definitely an overachieving switcher. Anyway, you had a great impact on him because he's talking like a stock gal.
Starting point is 00:07:09 He loves you. He listens to Prof G and not Pivot. He listens to Prof G and not Pivot. Anyway, he's a big fan of yours. And you had a great impact because he went with your selection. I think he's gone with your selection. Oh, really? I think so.
Starting point is 00:07:22 Oh, I'm excited. That's great. I think it's going to be a great move for him. I see that you're back on Twitter. What happened? Well, I got back on after two and a half weeks, and I don't know. I can't figure out if it was a conspiracy or just a conspiracy of dunces. I think it was a ladder.
Starting point is 00:07:35 I don't think it was anything. Yeah. Yeah, I'm back on. Yeah. I'm back on Twitter. And did you miss me? How were the two weeks? I listened.
Starting point is 00:07:43 Did you enjoy the shows while I was gone? Yes, I did. I had Alyssa Farah, who was great. She's the view, the conservative on the view, right? She's the conservative on the view. She was terrific. She was great, actually. And then Olivia Nuzzi, who is from New York Magazine.
Starting point is 00:07:56 She had written a big Stormy Daniels piece that week, so it was perfect. And very political-oriented shows. But there's a lot of politics because of the indictment and stuff like that. And I'm glad you're back on Twitter. We're going to do a live interview. By the way, I listen to all your shows, and I think you perfectly encapsulated what it means to be a progressive now. A bunch of white women yelling at each other about what should offend us versus what affects us. No, no.
Starting point is 00:08:18 I really enjoyed it, though. No, she didn't. Oh, my gosh. She likes Mike Pence. I let her go on about Mike Pence. You found the only woke Republican. I'm like, this woman's a Republican? She worked Mike Pence. I let her go on about Mike Pence. You found the only woke Republican. I'm like, this woman's a Republican? She worked for Trump.
Starting point is 00:08:29 She was at the Defense Department. Mike Pence, Mark Meadows. This woman's got her bona fides, but she's not part of the gang anymore. She had it. She felt like what I would refer to as a Rockefeller Republican, and that is like sane conservative values, but not mean. Yes, that's what I would say. Yeah, I think we would part ways on abortion, all kinds of stuff. But you could at least, you know, it wouldn't be this sort of vicious struggle in the dirt, in the mud, as Logan Roy, the now late Logan Roy says. Anyway, today we're going to do a lot of
Starting point is 00:08:58 things. We're going to do two notable arrests, shaking up the tech community, one over the discord leaks, the other over the murder of a San Francisco tech executive, Bob Lee. And we'll speak with researcher Kate Crawford about secret ingredients of AI. Spoiler alert, it's people. Elon Musk has big news following your Twitter return, Scott. In an interview with the BBC conducted on Twitter Spaces, Musk said most advertisers have returned and the platform is roughly breaking even and could be cash flow positive this quarter. He has shown no proof of this, but there you have it. In another interview, he said the government can see your DMs, another thing he has no proof of. And many people say this is untrue that I've spoken to. Three top executives from Twitter, former at very different times, wrote me, this is crap.
Starting point is 00:09:41 So talk about any and all these things. Well, I thought you summarized it perfectly. I mean this sincerely. I thought your critique or analysis of the BBC interview, which I do real time, which must have driven him crazy, was exactly right. First off, you can't trust anything he says. I don't believe that most advertisers have returned to the platform, I know big advertisers who left, and I have not heard their back. And what he basically has done is taken a company that he paid $44 billion for and turned it into a company that's probably worth four or seven. And, you know, again, and this is the Silicon Valley playbook, I'm going to start a fire in my backyard, pick up a hose, and then call myself a hero. playbook, I'm going to start a fire in my backyard, pick up a hose, and then call myself a hero.
Starting point is 00:10:31 And the thing that just I think is hilarious is him. I think Tristan Harris and the people who signed that AI letter were really earnest and genuine about their concerns. I think where they screwed up was agreeing to have him sign it. Because as he was signing this letter, urging for a pause, he's buying GPUs for Twitter to start ramping up AI at Twitter, and is also supposedly starting his own AI startup. So what it comes right down to is anyone calling, you have to be very, you have to ask one question, is someone who's calling for a pause in AI, are they genuine about the risk? Or it's like, we're a number 50- He wants to catch up. He wants to catch up. So Kara, we're the 52nd biggest podcast in the world.
Starting point is 00:11:06 I'd like the other 51 to pause on their pods for a while. That would be great. Right? Because it's just so ridiculous that Elon is calling for a pause as he is accelerating his own AI efforts. I mean, he's just monstrously full of shit is what it comes down to. And the government stuff, the government in your DMs. Literally, I got so many texts from him. Well, they can subpoena them, but they don't have open access to them.
Starting point is 00:11:29 Yes, that's correct. But he's just claiming this. It's just crazy. I usually get tons of emails from Twitter executives, and they're like, this is such shit, this idea that the government. Now, listen, I'm sure the government is spying everywhere as these new documents, which we'll talk about in a minute show, but they're not, they don't have access to your DMs unless you get subpoenaed. And if they spy, that's a different story. And he doesn't have any proof of it. And what he does, he says things without proof. And that was my issue is like proofless. Let's see the numbers then. Don't tell us it's breaking even. Let's see the actual numbers
Starting point is 00:12:03 and what's the advertising numbers? What's this? So, I don't take his word for anything. I don't know about you, but cutting people, fine. That seems fine. Probably saved a lot of money, right? A couple, whatever. Correct? But this libertarian notion that the deep state, this amorphous but evil sinister thing is coming for you. First off, the government is us. And anyone who spends any reasonable amount of time with people who work for our,
Starting point is 00:12:31 the most noble organization in the world, which is the US government, will see that generally speaking, they're really good people who make less money than they would in the private sector because they want to do something in the service of their country. And there's some incompetence, there's some waste,
Starting point is 00:12:44 there's some bad actors, but for the most part, this notion that's being fomented that it's the deep state are by people who want to be the deep state themselves or that don't want regulation. And if you look at the really bad shit that's gone down in the last few years, has it been a function of over-regulation or overstepping of the government boundary? Or is it because government hasn't done anything to stop crypto from vanishing or disappearing 1.2 million people's FTX accounts? Is it that there's overreach in social media and they're reading your DMs? Or they haven't moved in to stop anything resembling a guardrail such that teenagers don't get images of nooses and pills and razors. It has not been government overreach that is the problem. It's been government underreach.
Starting point is 00:13:29 Yeah, I just think, I just don't believe any coming out of his mouth unless I see the actual proof. And I wouldn't even believe the proof, honestly. It's just so, it's so, just he says whatever pops into his brain. And that's, you know, I think one of the things I think was factual about that interview with the BBC is that I shouldn't be tweeting after 3 a.m. That is a very good piece of advice to himself. Another thing, calls are growing for, speaking of government, Senator Feinstein of California to resign. My longtime senator from when I was living there. The 89-year-old has missed almost 60 Senate votes and left a seat on the Senate Judiciary Committee open while at home recovering from shingles.
Starting point is 00:14:05 The Judiciary Committee has only been able to confirm one judge in her absence. Last week, Feinstein released a statement saying she has asked Majority Leader Chuck Schumer to temporarily fill the seat until she's able to return, which is not the easiest thing in the world to do, because you've got to get the Republicans to agree to it. I don't know. I mean, you talk about this issue a lot. And she refuses to step down. She refuses to step down. But Strom Thurmond didn't step down either, as you recall. Yep.
Starting point is 00:14:31 And he was wrong not to. I think these people are public servants. And I think that at some point their service to the country should outweigh their narcissism and their understandable desire to stay in their jobs. And I think I got to give it to Representative Ro Khanna. He was the first to kind of take that leap, which was dangerous, and say, this is a dereliction of duty. He's a California representative in the House. He's getting a lot of prominence recently. Who's a Democrat. And by the way, that costs him a lot.
Starting point is 00:15:03 I bet of personal relationships and goodwill with people he knows. I bet people are giving him attaboys. Good for you. Well, you know what? It's never the wrong time to be right. And he's absolutely right. Because I had questions about Senator Fetterman's ability to serve. And you correctly pointed out, or I think there was a really good argument, that we're in such a sensitive time with such low margins for Democrats.
Starting point is 00:15:28 When you're talking about a woman's right to choice and protecting that, that a reliable blue vote is a lot of reason to vote for somebody. And he's back, by the way. He's back in the office. I get that. I think that's a really good argument. But now in the case of Senator Feinstein, another Democrat would be appointed. argument. But now in the case of Senator Feinstein, another Democrat would be appointed. And if you talk to people on her staff, this isn't something, quite frankly, that is probably going to get better. Well, there was a lot of articles about her cognitive abilities, too,
Starting point is 00:15:55 not just shingles. A year ago? Yeah, I don't even know. There was tons. I have people who are very close to her. And, you know, great mayor of San Francisco, great history as a senator. Some of the last years were, you know, up and down. But it's really, it's an interesting question of this age thing. I mean, look. We're going to have more of it. We're going to have more of it. A hundred percent. No one said anything about Ronald Reagan. They kept it quiet. And it was wrong. I'll be honest with you. I don't think President Biden should be running again. Well, as you said. You know, at some point, there's a general trend in our society of hyper-individualism,
Starting point is 00:16:31 where people believe their individualism and their rights supersede their fidelity to the commonwealth or the greater whole. And I think it takes a certain amount of leadership and dignity to say, okay, I can no longer – I've lost a step. I can no longer – being a senator, I mean, you've spent time with these people. They do really hard work. It's really hard. It's very exhausting. It's very mentally and physically taxing.
Starting point is 00:16:56 A hundred percent. And most normal 69-year-olds would have a difficult time doing this. Agreed. Interestingly, Alyssa Ferris said that about Biden. She said that would be the greatest thing he could do for his country, do at George Washington, which was interesting. And then today I interviewed Jen Psaki for the ON podcast, and she talked about it.
Starting point is 00:17:14 She said he's got to prove, they've got to get him out with the people and prove that he's cognitively able to do stuff. But it's not about whether he's cognitive now. The last time Marine One would leave the West Lawn if he's reelected, he'd be 86. Yeah, that's true. And at the end of the day, biology is not politically incorrect. And you need an individual, if need be, can get on an Air Force One to Singapore and be sharp in negotiating and start
Starting point is 00:17:38 talking to the commanders of the Pacific Fleet. This is a difficult taxing job. We do have a history of this with Reagan. I mean, Reagan, they had all kinds of people around him. And he was 10 years younger than that. I agree. But he had, well, he had Alzheimer's, I think. Is that correct? Or something in that area. I mean, I'll even, just to really piss off people, I think Ruth Bader Ginsburg should have resigned. I think she, I think her narcissism and popularity got in the way of her making the right decisions for what ultimately would be the right thing for the nation. Well, I have some news. I'll be quitting now because I'm old.
Starting point is 00:18:14 I can't. I have to have the energy. Stepping down. Stepping down. Well, thank God erectile dysfunction is not an issue in podcasting. I'll be here for a while. Speaking of erectile dysfunction, we have to move on from this topic. The Fox News Dominion trial was supposed to start on Monday as we record this. It went limp, but late Sunday night, the judge announced a delay until Tuesday. He has asked them to talk.
Starting point is 00:18:36 The parties are not looking to settle. Fox is looking to settle. 100%. The judge asked this to happen and avoid what was shaping up to be the libel trial this century. The wind is all at Dominion's back here, so they have to get exactly what they want. The judge asked this to happen and avoid what was shaping up to be the libel travel century. The wind is all at Dominion's back here. So they have to get exactly what they want.
Starting point is 00:18:51 And Fox has to really give up some stuff. Last week, the judge sanctioned Fox News for withholding evidence and mischaracterizing Rupert Murdoch's role in the company. He's king, everybody. Just FYI, having worked for him. We'll see if they settle. I think they will, unless they give up everything that they want and, you know, admit on the air and have, like, Tucker Carlson read it. That's the kind of thing I think Dominion would want. And Sean Hannity and especially Maria Bartiromo and Lou Dobbs.
Starting point is 00:19:16 All of them. I don't even think Lou Dobbs is still on there. But I don't think they're going to settle. And then there's another trial right after it, the Smartmatic trial. So, I don't know. And then shareholder lawsuits, et cetera, et cetera. You haven't seen anyone want to settle this badly as the News Corp board does since Elon Musk on the eve of the Delaware Chancery Court about to say, no, you've lied, and we're going to show how much you've lied in court. And what do you know?
Starting point is 00:19:42 He kept saying, let's settle, let's settle, let's settle. And on the eve, he said, okay, I'll give you exactly what you want. And Twitter just said all? He kept saying, let's settle, let's settle, let's settle. And on the eve, he said, okay, I'll give you exactly what you want. And Twitter just said all along, no, you're closing at the exact same price. And Dominion, their lawyers, their counter should be the following. We'll settle at 1.6 billion. And apologies. They want public apologies. Even though they can't have cameras in the courtroom, can you imagine what they're going to do to Laura Ingraham and Tucker Carlson and Lou Dobbs when they say, what are some of, do you consider yourself a journalist? We're going to have to say yes.
Starting point is 00:20:12 What are some of the responsibilities of the tenants around truth that a journalist has? Okay, now let me read you some of your emails. Right. They are going to. They're tough trials. The New York Times had one with Sarah Palin, even though they did prevail on that. They had a lot of stuff on their side, and not this much proof. This was crazy. This is a crazy
Starting point is 00:20:30 amount of proof that they did what Dominion's saying they did. I don't think Dominion wants to settle. That's my impression. I think the judge was just like, let's try, and Fox is probably screaming for it, and we'll see. They're not going to get what they want. They've misbehaved the whole time, including last week with all this stuff. I say they don't settle. What do you say?
Starting point is 00:21:08 that the firm was seeking financing at a $60 million valuation, but News Corp does not want this to go to trial. They are all wet. Every day, there's going to be a ton of media sketching, talking about the questions, and just how terrible this network and these anchors, their stars look. They might just say, they might bite the bullet and go, give them the 1.6 billion and let's call it a day. Yeah, they still need the apology. They're going to still demand the apologies. Well, they've already apologized. I don't think that... No, they're going to have to say it on the air. They're going to make them say it very explicitly for a matter of time, I think. I really, I mean, all of us are really, are praying this goes to trial. Anyway, we'll see. Speaking of a thing you got right, JPMorgan Chase reported first quarter earnings and they're beating expectations.
Starting point is 00:21:48 The bank's revenue rose 25%, almost $40 billion in profit, jumped 52%. The report also suggested net interest income will be slightly over $80 billion this year, beating previous forecasts by $7 billion. Shares of the bank were up almost 10% in the last five days. Scott, I seem to remember you making a prediction about this in March. Let's play it. JP Morgan's about to have their best quarter in history. The wave of capital that goes into JP Morgan and is going in. Just JP Morgan or Goldman or what?
Starting point is 00:22:15 Oh, all of them will benefit, but no one's going to benefit like JP Morgan. Well done. Take a bow. That was an easy one. That was an easy one, Cara. Very smart. Well, there you are. Let's take a bow for Scott Galloway. There you go. Jamie Dimon's really hitting on all cylinders. It'd be interesting to see if he next becomes like a Treasury Secretary or Fed Chairman. I would like to see him in government. I think
Starting point is 00:22:37 he's a great leader. Yeah, that seems to be the next step for this guy. Also, my limited experience with him, I think he's quite empathetic. And I think you want to talk about the difference between citizenship and these venture catastrophists. Four banks shored up and put deposits into First Republic because they see that there is a benefit to having a healthy banking system. Yeah. But he did bear hug Adam Neumann, as you know. Well, he was looking for fees. He was looking for fees.
Starting point is 00:23:03 Remember, I'm the guy who interviewed him on stage at the J.P. Morgan conference. Yeah, that's correct. Yeah. But they're bankers. That's their job. I know. I get it. But I would love to see him in government. I think he's a really decent man. He's also very spicy. I've spent a lot of time with him. He's very spicy. He likes to get into arguments, which is great. Anyway, he's very accessible in that regard. He doesn't mind mixing it up, as they say. Okay, let's get to our first big story. The man who allegedly murdered a Silicon Valley tech executive is in jail.
Starting point is 00:23:33 Earlier this month, the founder of Cash App, Bob Lee, was stabbed to death on the streets of San Francisco. As the mayor mourned his death, some prominent tech personalities, most especially Elon Musk, were quick to blame the killing of the city's leadership. It supposedly lacks approach to crime. Among them, Elon, as I said, Jason Calacanis, who posted an all caps tweet accusing city officials of, quote, enabling rampant violence. But there's one problem. On Thursday, police arrested a man they say killed Bob Lee.
Starting point is 00:24:00 He wasn't an unhoused person suffering a mental episode. He was another tech consultant who knew Lee personally. Reports say that Lee and his accused killer, Nima Momeni, argued about Momeni's sister prior to Lee's death. Our condolences to Bob's Lee's friends and family. Unfortunately, the information had a great story. He's been caught up in this fight, you know, over this thing. He became a symbol of violence in San Francisco immediately. Everything about this is terrible. It seems to be a personal beef, what happened. It's shocking, nonetheless. But you don't want these pet theories to show up. He did the same thing around the Pelosi attack, all this violence towards people. They jumped to this, it's a crime, it's the homelessness, not just waiting to find out. The tech guy's trying to make San Francisco into a hellscape. It's got enough problems, it is as it is, including homelessness. And the mayor just got, a federal court just said she couldn't clean up people off the streets, for example. So,
Starting point is 00:24:56 she's contending with legal issues, fentanyl crisis, all kinds of things. And this just added to the pile. I mainly went to the data. And the reality is the murder rate for a city of one million people is not especially high or low in San Francisco. Having said that, and your opinion matters more here than mine because you spend a lot of time there. I live there. I've lived in L.A., San Francisco, Miami, New York, now London. San Francisco for me is a distant fifth. That's the only place I would not live again. San Francisco, Miami, New York, now London. San Francisco for me is a distant fifth. I've just, that's the only place I would not live again. And whenever I go back there, and I only go back about once or twice a year now, I am shocked at how many severely mentally ill people are on
Starting point is 00:25:34 the street in front of a building where there are billionaires. And I find that it's somewhat dystopic. Downtown. Yes, downtown. But if you look at the data... Yep, let me read the data. Local police data recorded 56 homicides each in 2022 and 2021, and that's a rise of 36% over 2019. For comparison, Columbus, Ohio recorded 140 last year. That's quite a bit more. And Indianapolis recorded 226. Oh, St. Louis. I know. And Duval County, Florida is apparently quite a dangerous place to be.
Starting point is 00:26:05 But property crimes such as burglary and larceny are up according to the San Francisco PD, but still below 2017 levels. Certain areas of the city, I'll speak to this, like the Tenderloin, some parts of Soma, not all, downtown, because nobody's downtown. It's a really bad situation. You go to every city, storefronts are closed, people aren't in offices, etc. That's an issue. But my neighborhood has cleaned up a lot. Like, it's much better since the pandemic. And the area where he was killed, when that happened, if you remember me saying to you,
Starting point is 00:26:38 there's no way this wasn't, just that I know this neighborhood. It's not a dangerous neighborhood. It's a desolate neighborhood. But I always felt it was like somebody he knew, because it was such a, it couldn't possibly have been a psychotic homeless person, which is, and then he used it to attack the city because a lot of them want to run for office. They want to run the city, essentially. So, I find it incredibly sad that they continue to do this around attacks like the Paul Pelosi one. It's just heinous to do things like this, especially around someone who died or was hurt, either way. Yeah, the politicization of everything is obviously discouraging,
Starting point is 00:27:32 because at the end of the day, I mean, the thing that's rattling here is a young, you know, someone's son was wandering around bleeding to death. He wasn't. He wasn't wandering around. This guy took him where he murdered him. But go ahead. Sorry. Yeah, but wasn't he looking for help after he'd been stabbed? He was, yes. But this is a very empty part of the city. I don't know how to describe it. My first thought when I heard about this was I was just really sad for the guy and that his life or his death, I should say, is being politicized. That's kind of sad on its own. But my first thought was the tech community is much more dangerous than the San Francisco community.
Starting point is 00:28:09 These guys want to talk about that. The tech community and this belief that they're saviors and shouldn't be subject to regulation and their success at avoiding all regulation has resulted in a lot of unnecessary death of kids and children and tremendous damage to the Commonwealth. It's the tech community that's the menace, not San Francisco. And of course, they were immediately going, well, not in this, after doing that, and same thing with SVB, after doing very irresponsible things, they're like, see what we did made a difference. And here they're like, see what we pointed out made a difference. I'm like, all you did was take a guy's death and use it for your own selfish purposes and
Starting point is 00:28:44 just to scream. You know, Elon Musk is the worst of them. But, you know, there's definitely, there was a really interesting story at the same time in, I think it was the Times. Yeah, Erin Griffith did it, about alleged fraud in Silicon Valley. Charlie Javice of Frank was arrested last week, speaking of JPMorgan, on charges of falsifying data. The founder of Slync was arrested last month, charged with defrauding investors. Starting point is 00:29:10 There are upcoming fraud trials for the co-founder of HeadSpin. Carlos Watson of Ozy Media was arrested on fraud charges. Elizabeth Holmes is headed to prison. Of course, Sam Bankman-Fried. There's a lot of people, you know, this dishonesty going on, but it's also compounded by these people on the top just deciding to be trolls in a way that's really disturbing. Yeah, it's not. You get to this certain level of influence. And again, it's more, let's shitpost our local government. Let's shitpost our state government. Let's not offer any solutions.
Starting point is 00:29:47 Let's shitpost our government making up stories about what could or could not be true. And then in our notes, you have the actual text of this person's statements. And I apologize, I got to read it.
Starting point is 00:30:00 We don't know what happened yet, but I think we suspect, and I would bet dollars to dimes that this story is very similar to the case in L.A. recently where a young woman was basically stabbed for no reason by a psychotic homeless person who had been through a revolving door of jail and the criminal justice system, who should have been locked up, who was arrested multiple times but not kept locked up because of this push for decarceration. Push for decarceration? We have more people locked up in this nation than, I think, the other 10 biggest nations combined. Starting point is 00:30:25 We absolutely need a prison release program in this nation. And then to use terms like psychotic and homeless, and by the way, this was not a psychotic homeless person. This was an acquaintance. This is, yeah.
Starting point is 00:30:38 So it's- This was an acquaintance, yeah. It's again, it's just not- And then this is the last thing. So they are setting loose on us a predatory criminal or psychotic element that jeopardizes our safety and makes these cities unlivable. They, meaning the tech community? No, the government. I mean.
Starting point is 00:30:54 No, government. I know, but that's my point. The tech community is a much bigger threat to our society right now. But on the whole, if you look at crime in America, over any extended period of time, it's been going down. Yeah. There's no solutions.
Starting point is 00:31:12 That's what it is. It's never about solutions. It's all about accusations. It's about recall Democrats and put libertarian weirdos, i.e. me, in office. Yeah, and do nothing about it. And then be authoritarian. The fact that Musk is doing this and at the same time Russia just sentenced someone to 25 years for dissent. They don't care about that. They don't care about, they don't attack Russia for that.
Starting point is 00:31:48 All you bros who did this, we look forward to all your next monologues about cracking down on the culture of tech entrepreneurs who don't think the rules apply to them. That's what we need to crack down on. Anyway, Scott, let's go on a quick break. When we come back, a popular app is in the news for national security risk, but it's not the one you think. And we'll speak with a friend of Pivot, Kate Crawford, about the planetary cost of artificial intelligence. You'll enjoy that conversation. That's not what it is anymore. That's Ian Mitchell, a banker turned fraud fighter. These days, online scams look more like crime syndicates than individual con artists. And they're making bank. Last year, scammers made off with more than $10 billion. It's mind-blowing to see the kind of infrastructure that's been built to facilitate scamming at scale.
Starting point is 00:32:50 There are hundreds, if not thousands, of scam centers all around the world. These are very savvy business people. These are organized criminal rings. And so once we understand the magnitude of this problem, we can protect people better. One challenge that fraud fighters like Ian face is that scam victims sometimes feel too ashamed to discuss what happened to them. But Ian says one of our best defenses is simple. We need to talk to each other. We need to have those awkward conversations around what do you do if you have text messages you don't recognize? What do you do if you start getting asked to send information that's more sensitive? Even my own father fell victim to a,
Starting point is 00:33:29 thank goodness, a smaller dollar scam, but he fell victim and we have these conversations all the time. So we are all at risk and we all need to work together to protect each other. Learn more about how to protect yourself at vox.com slash zelle. And when using digital payment platforms, remember to only send money to people you know and trust. Thumbtack presents the ins and outs of caring for your home. Out. Uncertainty. Self-doubt. Stressing about not knowing where to start. In. Plans and guides that make it easy to get home projects done. Out, word art.
Starting point is 00:34:11 Sorry, live laugh lovers. In, knowing what to do, when to do it, and who to hire. Start caring for your home with confidence. Download Thumbtack today. Scott, we're back. The U.S. national security state is in disarray. They're not like sneaking around, in fact, over Discord. Last week, the FBI arrested 21-year-old Air National Guardsman Jack Teixeira on charges of unlawfully copying and possessing classified defense records. Authorities said Teixeira uploaded the documents to the app Discord, where he shared them in a private chat room that you can have there.
Starting point is 00:34:51 That included about 20 people, many of them young men. The online group shared an interest in guns and video games, and frequently posted racist and anti-Semitic memes. The documents provide some insight into the war in Ukraine, although they don't know which ones are real or not, including a bleak assessment of Ukraine's upcoming spring offensive. There are also some reveals around the U.S.'s ability to spy on Russia. Apparently, we're good at it. And evidence that the U.S. is spying on its allies, not a surprise, of course, but embarrassing. I'd love you to comment on these document leaks. Other leakers like Chelsea Manning, Edward Snowden, and Reality Winner leaked to journalists and wanted their leaks to have large social impact. This leaker seemed to have clout with teens. He wanted to
Starting point is 00:35:29 impress teens and school them. So talk to me about this, because there's a whole young men thing in here. There's a whole security thing in here. I found this really upsetting, and I think it calls on a number of issues. The first is obviously in the short term, they're going to have to review the protocols around who has access to classified documents. 1.2 million people in our government have access to classified documents. And it's actually been expanded since 9-11 because they found some of the red flags
Starting point is 00:35:53 that were ignored around 9-11 were because interagency shareability wasn't happening. So they actually lowered and loosened the requirements for who has access to classified information. But I'm sure they'll address that sooner rather than later. The issues that it really brings up, I think, that are more interesting is, one, again, if you look at the pharmaceutical industry worth a couple trillion dollars and has a big impact on our society, massive regulation, hundreds of millions of dollars
Starting point is 00:36:24 to get through FDA hurdles to get anything on the shelves. If you look at the restaurant industry, once you have 20 locations, you have to disclose the number of calories. If you want to put out a product, any sort of consumer product, it has to go through testing. If you want to launch a car, you have to have all sorts of emission standards, airbags. The industry that is worth more and has more influence than any of these industries has absolutely no regulation or guardrails. And the question I would put forward is similar to sex trafficking. If we had a carve-out from 230 on national security, I would bet, I would speculate that organizations and platforms, including Discord, could use an LLM,
Starting point is 00:37:09 a large language model to recognize this type of data and before it gets forwarded or out of control, move in and flag it for human review. But because there is no need, there is no incentive. It costs too much. The zero to land speed record of young dumb men getting in trouble and causing huge damage has never been greater. We have given every kid in this nation a car that no longer goes 60 miles an hour, but goes 600 and said, no airbags, have at it. And it also goes to a larger issue, and that is younger people have less opportunity to get together with people offline.
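Scott's hypothetical — a platform screening uploads for classified-looking material and holding it for human review before it spreads — can be sketched in miniature. This is purely illustrative: a crude keyword heuristic stands in for the large language model he describes, and none of it reflects Discord's actual systems or any real API.

```python
import re

# Classification markings that commonly appear on U.S. government documents.
# A real system would use a trained model; this regex is a toy stand-in.
CLASSIFICATION_MARKINGS = re.compile(
    r"\b(TOP SECRET|SECRET//|NOFORN|ORCON)\b", re.IGNORECASE
)

def flag_for_human_review(message: str) -> bool:
    """Return True if a message should be held for human review
    before it is forwarded or shared more widely."""
    return bool(CLASSIFICATION_MARKINGS.search(message))

print(flag_for_human_review("check out this TOP SECRET//NOFORN briefing"))  # True
print(flag_for_human_review("anyone up for a match tonight?"))              # False
```

Even a heuristic this crude illustrates the incentive point Scott is making: the screening step is technically cheap; what's missing is any legal or economic reason for platforms to run it.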
Starting point is 00:37:50 In general, the offline analog to anything online is much safer, much better, much more empathetic, much more loving. These kids were into Christianity and they were into guns. If they had been at a gun range and started yelling out anti-Semitic tropes, someone at the gun range would have said, cut that shit out, and this is why you don't say that shit. If they had been at church, in a church group, and laid out on the table,
Starting point is 00:38:13 hey, I have information about the Ukrainian war effort, someone in that church would have said, this is dangerous, come here, speak to me. And also illegal. But what you have online is absolutely no guardrails. So you take every terrible impulse, poor judgment that young men especially demonstrate in spades, and it goes from zero to really fucking awful faster than ever before. I have mixed feelings about some of the Edward Snowden stuff, the Reality Winner stuff, Starting point is 00:38:44 Chelsea Manning, some of it, you know, the government was saying they were not doing things that they were doing. I don't know what they could have done otherwise, in many ways, except become what is considered traitorous. At the same time, this seemed to be done for just... Social status. Social status. And you're just like, what? Like, and also what happened to you that you, this is how you are. Like, this is what you think about the government. It's like too much, too many online conspiracy tropes, but it's sort of like, wow, you were in a position of authority at some point,
Starting point is 00:39:15 and this is what you do with it. You know, this is, this is your thing just to impress 15 teenagers and have status among them. If you read some of the articles, the Washington Post had one of someone in that group, a young guy, and his quotes were heartbreaking. Like, he was cool, he was tough, he was our guy. And I was like, oh my God, your life is so small on your little online service. But it's getting smaller and smaller because, you know, lack of male role models, lack of offline socialization, lack of potential for romantic interests. All of these things are guardrails. All of these things, when you're in person with people, whether it's a Boy Scout troop,
Starting point is 00:39:52 whether it's an after-school program, whether it's a male, a man who takes an interest in your life, whether it's the prospect of a romantic relationship, all of these things serve as guardrails where you start behaving in a more societally accepted way. And when you go online and can begin plotting with other extremists from around the world, and there's no safeguards. There's no- I agree. Snowden and Assange, that was different. They saw themselves as people who were deeply disturbed by what the state was doing, took a risk. I think they knew they were taking a risk. This is a kid
Starting point is 00:40:24 who made a series of terrible decisions, and he's probably, Kara, going to be in a basement cell in a max, super max federal prison for the rest of his life. He may never see- No, not the rest of his life. Apparently, it's not. It's a long time. This is the end of his adulthood, really. His life is ruined. Yeah, his life is ruined. His life is ruined. Although he's become a right-wing, look, Marjorie Taylor Greene thought he was a hero for doing this.
Starting point is 00:40:46 The thing is, you don't know what's real and what's not mixed in there. That's the thing. It's like, who knows? He was copying some things down. It's just like, oh, God, this just makes it even more confusing. One interesting detail in the documents, there was a U.S. intelligence assessment that repeated an internal claim from the Russian government that says social media platforms catch less than 1% of fake Russian accounts, if the claim is true. And who knows? It comes from Russia. Who knows? They're probably bragging on things. It's a pretty damning indictment of Meta's efforts to combat disinformation. It also reflects poorly on Twitter's mass layoffs. Who's to say? Like, who's to say, right? In that area. And then in the other area is this, the names, this right-wing hate group kind of mentality. The group was called Thug Shaker Central. It's a racist anti-gay joke about a gay porn thing, I think, I believe. Starting point is 00:41:38 The channel where the documents were shared was called Bear versus Pig, for a Russian slur about Ukrainian people. You know, that was also disturbing, sort of, this guy is in the military. And I'm not surprised, necessarily. There's a survey that found more than one third of all active duty troops reported witnessing white nationalism or racism within the ranks. This is not a surprise either. But it's just sort of this attitude towards our country that's just so, I don't expect you to be rah-rah patriotic. I'm not. But it's this idea of Armageddon almost, right? Like anything goes. And again, it's linked to that stuff we were talking about earlier.
Starting point is 00:42:14 It's a lack of respect for institutions. It's a lack of appreciation for shared stories. And quite frankly, the liberal press is guilty of some of the reason why people don't feel as good about America as they should. press is guilty of some of the reason why people don't feel as good about America as they should. And these young people, they don't have the same economic prospects that my generation had. They don't have the same opportunities for socialization. And so they find each other, they engage in extremist content. And it just sort of the zero to 60 from bad impulses and poor judgment to something really damaging is just much more fluid online. And you make bad judgments on alcohol. So, look at the guardrails we put in there. You're not
Starting point is 00:42:51 supposed to drink under the age of 21. Cigarettes. Cigarettes, adult content. I mean, we age gate the military. We have all sorts of guardrails. I mean, I know this sounds stupid. You know what I think would have saved this? If this kid had had a girlfriend. Yeah, maybe. And where do 21-year-olds... I gotta say, both my sons are nicer with their girlfriends. I love them having girlfriends. You know what?
Starting point is 00:43:11 They get their shit together. The prospect for companionship and, quite frankly, sex makes men behave much better. And also, a woman who tends to be a little bit more thoughtful and more engaged in slow thinking says, should you really be copying confidential national security documents and putting them on Discord? You know, it's just, but these kids have no opportunities or fewer and fewer opportunities to build offline relationships with each other. Yep. And we're mammals.
Starting point is 00:43:37 We're meant to be in each other's presence. And, you know, you just, if you have sons and you think about, you see firsthand what fucking idiots they can be because of, you're literally, your role as a father and as a mother is to be their prefrontal cortex. And the problem is every piece of bad judgment can take, you know, can take a firecracker and turn it into a fucking nuclear bomb online. Right. Agreed. It can go to millions of people. It can do tremendous damage. And not only that, when they say these things,
Starting point is 00:44:08 if they said these things in a group of people, of normal people in the community, someone in the community would intervene and go, no, this is wrong, and this is why it's wrong, and this is why I'm going to get involved in your life. But online, it's like, well, let's find all the other crazies. Yeah, there are some great places online to find community. There really are.
Starting point is 00:44:27 But this is just so toxic. You know, it's going to get worse. So we need to talk to our friend of Pivot. Kate Crawford is a principal researcher at Microsoft Research Labs, a research professor at USC Annenberg, and author of Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence. Welcome, Kate Crawford. You and I have talked many times. Let's start with some basics. Everyone's familiar with generative AI, like ChatGPT, but help bring us back to the basics. How do you define artificial intelligence?
Starting point is 00:45:05 Yeah, I mean, I think there's a common misperception with AI that it's, you know, this set of mathematical or sometimes magical functions in the cloud. But actually, these are profoundly material systems that have very real impacts. They use lots of human labor, lots of natural resources like energy and water, and of course, a gigantic amount of data. So part of what I think is really useful with understanding AI is really breaking it down into its component elements, really demystifying it and moving away from the kind of superintelligence hype that's happening at the moment to really say, you know, these are socio-technical systems. People build them. They build them on all of these resources. Let's look at how they work.
Starting point is 00:46:20 And I think that's a really good way to start to unpack a lot of the kind of gloss that gets attached to this term AI. Mm-hmm. All right. So explain, there's, you can call it big data, machine learning, cloud computing, and now generative AI. Explain the differences for people. And now that we're finally talking only about something like ChatGPT, what does it mean for the field? Well, there's a really long history here. We could go back to the Dartmouth conference in the late 1950s.
Starting point is 00:47:29 We could look at the first phases of AI that were really focused on expert systems, and then this idea of really just probabilistic modeling. So effectively trying to guess the next word in a sentence, you know, doing this type of mode of brute-force AI, which is much closer to what we're seeing with things like GPT. So we could get super technical about how generative AI works, but the important thing to understand is that these are essentially gigantic probabilistic models. They don't understand language. They're much more interested in predicting things, predicting the next word in a sentence, predicting the type of image that you might be requesting. And it's not coming from a basis of understanding all language or understanding all art history. Right, or the intelligence of humans.
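Kate's "predicting the next word" point can be made concrete with a toy model. The sketch below is an illustration only — a bigram counter over a ten-word corpus, standing in for what systems like GPT do with neural networks over internet-scale data:

```python
from collections import Counter, defaultdict

# Toy corpus: the "training data" for our miniature language model.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows which — the simplest possible statistical model.
following = defaultdict(Counter)
for word, nxt in zip(corpus, corpus[1:]):
    following[word][nxt] += 1

def predict_next(word: str) -> str:
    """Return the word most often seen after `word` in the corpus."""
    return following[word].most_common(1)[0][0]

print(predict_next("the"))  # 'cat' — it follows "the" twice, more than "mat" or "fish"
```

There is no understanding anywhere in this code, only counting — which is exactly the point Kate is making, scaled down from billions of parameters to one dictionary.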
Starting point is 00:48:08 It's a very different form of interpretive ability. I think even the word intelligence can actually be a bit of a trap here. I mean, intelligence has its own gnarly history as a term, and I think it can become a real cul-de-sac where people try to anthropomorphize these systems when actually they are very large statistical analyses at scale. Yeah, I always say it's not real. They're not real people. I say, like, I don't know whether regular people are getting that idea. So is ChatGPT a radical technological advance from your perspective? You know, that's a really good question because the idea behind ChatGPT has
Starting point is 00:48:08 actually been around for decades. I mean, we could go back to Joseph Weisenbaum who created Eliza, you know, in the 1960s at MIT, which is essentially a chatbot that just used simple scripts and was designed as a therapist that would basically ask your questions back to you. And Weizenbaum wrote at the time that he was shocked how easily people were taken in by this. He said it was basically creating delusional thinking and quite normal people and like, why would people believe this? And what we have now is decades later, a huge shift in computational scale, a huge shift in available data, which means much better modeling of speech, much better modeling of ideas, information, things that
Starting point is 00:48:52 these systems are trained on. And they are trained on truly, I mean, we're talking about data sets the size of the internet itself. So think about putting that in a blender and you're really getting an idea of how much data these things are working with with the foundational model. So perhaps what's new here is not so much the idea, but the difference that scale can make. I'm curious what you think of this AI pause movement. So this is a very big deal in the AI world. So basically over a thousand scholars and business leaders signed the letter that said pause for six months.
Starting point is 00:49:31 I didn't sign it. And here's why. It's really framed in this kind of apocalyptic tone, essentially. It talks about this idea of, you know, incredibly powerful digital minds ending human civilization. And I think this is a massive distraction from what we really need right now, which is how do we actually regulate these systems? What impacts are they going to have right now on things like misinformation and security? So I think in some ways it's sort of drawing us away from the bigger issues that are underlying these questions. I also noticed that Elon Musk was one of the first people to sign, you know, asking for a pause.
Starting point is 00:50:11 And he just said in an interview, it's going to kill us all. It has the potential of civilizational destruction. And then two weeks later, he announces that he's training his own AI model called X. So I'm like, hmm. Surprising. That doesn't surprise Scott or I, but go ahead. No, it's not surprising, but I think it points to some mixed motivations about why people are signing that letter. But if there's something really useful here, it's that there are concerns. And I think you'll see several more letters start to circulate before the end of the year
Starting point is 00:50:45 from different types of communities. Just as a follow-up, it feels as if we'd be smart to get out, at least not ahead of it, but develop some sort of regulatory oversight and not make the same original sin of the internet and have no regulation. Is there a model we can look at in terms of an industry for, let's say, okay, we're charged with establishing some sort of regulatory oversight that lets a thousand acorns bloom and keeps our lead in this and creates economic growth? And I'm actually excited about this technology. I think it could be an unbelievable unlock across healthcare, medical research, you name it. What could we do to ensure that we don't make the
Starting point is 00:51:26 mistakes of the past with other emerging technologies? Right. Well, I mean, you're pointing to exactly this issue of balance. You know, how do we ensure that those benefits are possible with the types of guardrails that we need? And there's different theories that have been circulating around. But one of the core things, America still doesn't have basic comprehensive federal data protection laws, right? So we can think about the GDPR in Europe. You know, that would allow us to have much more control over how our data is being used to train these models. So there's a proposed American Data Privacy and Protection Act, the ADPA. I think that's definitely a step in the right direction.
Starting point is 00:52:06 But, you know, I wonder if we need something a little more urgent. And, for example, the White House has had a domestic policy council for decades that kind of coordinates policy across diverse and, you know, different sorts of areas, including criminal justice, healthcare, education. different sorts of areas, including criminal justice, healthcare, education, you could imagine something like an AI policy council that would go across these domains. It can't be called an information council, though, because look what happened there, right? Well, exactly, right? We know how that goes. But you could think about what's happening with AI in the Department of Education, in transportation, in labor, in the SEC, in the FTC. You've got all of these different agencies and groups seeing different parts of the elephant. So how do we quickly harmonize a sort of a policy vision
Starting point is 00:52:52 across all of those components? And I think, honestly, trying to do this quickly is one of the most important things right now. Now, the EU is streets ahead. They've got their AI Act, which should be becoming law essentially by the end of this year. But even there, they started framing that three years ago plus, and they're now racing to catch the generative AI framework. So, they've been at it. We tend not to do anything. Because let's talk about who benefits from the efforts. Because one of the issues I keep focused, I said, let's stop talking about student term
Starting point is 00:53:26 papers and it killing us. Because really, that's not a concern. Either of them is not the biggest concern we have. But who controls and owns it? Who benefits are these private, large private companies, which I do agree with Elon on. And although he wants to be one of them, I don't want to be in his camp because I think he's just mad he's behind and they didn't want to do what he wanted to do. How do you share them across society? Because this is really, again, our information,
Starting point is 00:53:54 right? It's ours that these private companies are using to benefit themselves and then vomit it back up at us in some fashion. And then if you could talk, you just mentioned environmental impacts, add that in because then we'll pay for that too if there are impacts, not them. Well, this is one of the questions that the research field has really been pointing to for years, which is concentration of power. So you could go back to the first dot-com boom and you could look about how many players were there and kind of, you know, making money from pets.com or whatever it might be. And then you start to see that number start to shrink. And you could think about sort of the big data moment, the deep learning moment, we really start to see fewer and fewer companies who are really at the forefront. And now we are looking at one of the
Starting point is 00:54:36 most concentrated industries in terms of who can do generative AI at scale. Now, the interesting counterpoint to this is the number of open source models that are starting to be released. And that, you know, we can raise questions there too around, okay, if it costs so much in terms of energy, in terms of human labor to try and make these things work, and to build in some guardrails, and that happens at multiple layers in these technical models. Are people going to be able to do that in the open source models? And what is that going to do to these broader public harms and risks that many of us have been flagging? So I think in the end, you're looking at the moment at a kind of an interesting turn, which is, are we going to see
Starting point is 00:55:23 the gains just go to the same big tech players? And of course, Microsoft, Google, Amazon, Adobe's just released a model. I mean, it really is the kind of, you know, all the big players are in this now. This is, everybody is playing this game to win. But then you've got this open source question. And so, again, we've seen this play out historically before. But I do think that if we start to see these models being able to be trained with less data and with less energy, which is key, then it opens up some more space. For everybody, for innovative young companies. That's right. What are you most excited about?
Starting point is 00:56:02 Just talking about trying to be a little bit glass half full here, where do you think the most disruptive change for good lies in terms of this new technology? Well, the things that, I mean, I'm going to speak personally here. The things that I use these sorts of large language and large multimodal models for is like summarization. Extremely good at, you know, give it a long document, get a really concise answer. Coding, you know, I haven't had to code for years. It takes me all the way back to, you know, when I was learning this stuff in my late teens, early 20s.
Starting point is 00:56:37 I can, you know, really, what is amazing is that the hot programming language in 2023 is English. You can literally just write the program that you would like to produce and you'll get that. That is exciting to me. I think that actually- Sort of like spreadsheets. It's like spreadsheets. Who uses an abacus anymore, except for Scott? Well, they're kind of cool, but yeah. I mean, in some ways, I think it's more transformative than a spreadsheet. I think you really have to think about some of the biggest technological changes that we've experienced. I mean, a lot of people compare this to, say, the iPhone or, you know, the internet.
Starting point is 00:57:16 I tend to go further back. I kind of think about this in relation to, say, the creation of artificial perspective in the 1400s. It literally changed how we represented the world. And that's why I think that these models are so powerful. And Scott, you know, there's some big upsides there, but there's also some very big social questions that we have to get ahead of, because if they just get released and they just go and do their thing, we know that a whole lot of problems can emerge.
Starting point is 00:57:47 So this is why research and guardrails are so important. 10 years from now, if you were to make a prediction for AI, where is it? Besides killing us all, apparently. Kara, you know, I don't know. It's been such a hectic year. I mean, I don't know how much sleep you guys are getting, but I've just never seen anything like this. You AI people must be crazy. I've never seen anything this fast.
Starting point is 00:58:07 I've been in this field for almost two decades. This is unlike anything else. I think it's really difficult to make a 10-year prediction, but I do think that the things that keep me up at night are not, you know, machines that are going to run the world. It's going to be people who haven't done enough of the work to actually think about the really complex social interactions that are already happening with these systems. Here's what I'm scared of. The people who run the machines keep me up at night. If you're somebody who wants to get, say, from A to,
Starting point is 00:58:46 you'll never get to Z, but wants to get some elemental or some basic foundational knowledge of AI, what LLMs would you suggest experimenting with? How do you get to not letter Z, but letter D or E quickly? Well, I'm a big fan of experimentation. So there's different types of models out there. As we know, there are ungrounded ones that will just give you answers and you won't know where the source comes from. There are so-called grounded models that are connected to search results that will give you a different kind of set of answers. But the bigger question is, how do we put this in this broader social and cultural context? And that means you need to read further afield. That's part of what we're doing with our Knowing Machines Research Project: really trying to show people, okay, let's think about how you might actually interact with a model yourself. How do you start to look at training data and understand how it maps the world? And then how might you start to think about how you'd use these systems differently? In some ways, I think we need this kind of rapid increase
Starting point is 00:59:49 in literacy, and it's not just going to be about focusing narrowly on the tech. It's going to be looking at what does this do to the public sphere? What does it do to the media? What does it do to democracy? That's the piece we need to be thinking about. Yeah, I try and use it every day,
Starting point is 01:00:04 and I just think it's fascinating. And I'm a big believer that AI is not going to take your job. Someone who understands AI is going to take your job. So I think my advice to any young person would be, as Kate is suggesting, to experiment with this and get good at it. It's a weapon. You want to be trained in it. 100%. It is like using the internet in the early days.
Starting point is 01:00:23 Anyway, The Atlas of AI is available everywhere, and Kate is on Twitter and Mastodon. Thank you, Kate Crawford. It's great to see you both. Thanks, Kate. That was really interesting. The one thing I thought
Starting point is 01:00:33 most interesting was coding. Like, you say, I want a program that does this, and it does it for you. Like, you don't need to fix, you know,
Starting point is 01:00:41 there's a lot of things you don't need to do anymore that you used to. You know, basic research, the internet helped, for example. That's really interesting, that you could code without learning to code. Why bother? Because it does it for you. Anyway, just thought that was interesting. Not sure if you want to code with me, Scott. But in any case. I'll melt with you, but I won't code with you. Okay. One more quick break. We'll be back for wins and fails.
Starting point is 01:01:04 As a Fizz member, you can look forward to free data, big savings on plans, and having your unused data roll over to the following month. Every month at Fizz, you always get more for your money. Terms and conditions for our different programs and policies apply. Details at fizz.ca. Okay, Scott, let's do some wins and fails. I'm going to start with a fail. Russian journalist Vladimir Kara-Murza was convicted and sentenced to 25 years on charges of treason after he criticized the war in Ukraine. He's in jail, obviously, for political views. He's very sick. He was poisoned, too. It's depressing, the whole thing. And of course, American journalist Evan Gershkovich remains in jail in Russia. The U.S. ambassador to Russia finally visited him, and he's also potentially
Starting point is 01:01:57 facing the same kind of sentence for treason. What kills me is all these people talking about free speech in this country, and then giving an easy time to Russia. It's sort of like the people, maybe on the left, who are censorious, and then the people who actually pass terrible laws around gays and trans or abortion or whatever it happens to be. We need to focus on Russia, on what they're doing here. And they are the global thugs of this world, I think, and getting closer and closer to China. I can't believe what these families are going through with their relatives just convicted in a kangaroo court. You know, at least we have courts that have problems, but they're not kangaroo courts. In any case, that is my fail.
Starting point is 01:02:42 All right. Any wins? Oh, happy birthday, Alex. That's nice. And then Louie's birthday is coming up soon. They're really wonderful kids, as I've always said. And so Louie's going to be 21 and Alex is 18.
Starting point is 01:02:56 So they're on their own now. They're on their own. They're boys. They'll be home. Good luck, boys. Enjoy. I've done all that I can for you. Anyway. My win and fail both fall under the same kind of umbrella. And that is,
Starting point is 01:03:11 if you have high blood pressure, you just become much more vulnerable to a series of bad health outcomes. If you're pre-diabetic, I mean, there's just certain things. Absolutely. If your body mass index is over a certain level, you're just prone to all sorts of bad health outcomes. And I think there are some things in our nation that are basically high blood pressure. And unless we address them, we're just going to have a series of bad outcomes. We can never be truly healthy. And I think that reinvesting in young people and giving them opportunities to be more economically and emotionally viable, especially young men, is a huge issue. I think some sort of common sense gun regulation,
Starting point is 01:03:49 but far and away the number one for me, and it's become a litmus test for who I'm going to vote for and who I'm going to give money to, is until we restore something resembling modern progressive rights for 51% of our nation, specifically a woman's right to choose, I just think we're going to have terrible outcomes everywhere. I just can't see how we move forward and claim to be a progressive, empathetic nation when we are rolling back women's rights. It is so counter to every modern, thoughtful, societal movement around the world, whether it's Ireland, whether it's Mexico. It's just, it's so incredibly discouraging. And we have always held onto this cold comfort that it couldn't happen here,
Starting point is 01:04:29 and it is happening. And so my fail is in Florida, my state, where I'm a resident, where the governor just signed an incredibly restrictive abortion law with very few exceptions. It's a bill banning abortions after six weeks. Both my children, we didn't know we were pregnant until after six weeks. That is correct. Me either. And not only that, I mean, as if to say, you know, a big fuck you to women, for the exception of rape or incest, you have to show a police report. Documents, yeah.
Starting point is 01:05:02 To prove incest. And in the case of the life of the mother, two doctors, two doctors must sign off on an abortion. And Florida is a wonderful, diverse state, huge state. And the fact that they've basically outlawed choice, it means to me that Florida is no longer part of a modern functioning democracy. And they did it under the dark of night. They did it at 11 o'clock at night. He didn't do it openly because he knows it's a losing issue in elections, right? So he's trying to do it and then get credit among the people he needs to please, and yet not let everybody else know.
Starting point is 01:05:37 But we're going to hang this around his neck. A hundred percent. And I believe it was a stupid move for him. And granted, I have some bias here. But the reality is, one of the best examples of minority rule is that the vast majority of Americans believe in a woman's right to choose, and the nation is headed the other way. It is the best example of minority rule in our nation, maybe with the exception of assault
Starting point is 01:05:58 weapons. And then my win is on the same subject. It is now a litmus test for me. Janet Protasiewicz, I think is her name, was elected to the Wisconsin Supreme Court and flipped it to a more progressive body. And it was basically Wisconsin voters, where Republicans outnumber Democrats, which is a tell for how America feels about this. It was really about choice and a little bit about gerrymandering. But America has said, how can we pretend to be economically viable?
Starting point is 01:06:25 I mean, this not only goes to just basic human rights, it goes to our economy. It goes to our ability to have functioning households. It goes to our ability to have kids who are raised in loving, economically secure households. And I'd like to think, and my win here is that Wisconsin, which is sort of a purple, slightly reddish hue, has said, okay, this is just out of control. She won by a lot. She won by a lot, too. It wasn't a little. It wasn't a squeaker. No, this really is. And I'm hoping- Young people. I'm hoping this represents a bottom on this, and that the demo in democracy is going to kick in here,
Starting point is 01:07:06 and then more people— I don't know, Scott. They're relentless. These people are relentless. Well, but we have justice and righteousness on our side, Kara, and I do think that America gets it right over the long term. These people are fucking relentless, and they'll stoop to every low possible. But go ahead. Sorry. I think they're relentless.
Starting point is 01:07:31 Yeah, but I have a lot of faith in our system, not over the short term, but over the medium and the long term. And I do think that this is, I think people are waking up to the fact that our fears are being realized. And again, we all sort of took this cold comfort that, oh, it couldn't happen. I even think a lot of Republicans weren't expecting it to go this far. And I think that I'd like to think Florida is a terrible example of where it can go. But Wisconsin hopefully represents a turning point. Voters there have decided to ideologically flip their Supreme Court because they are, in my opinion, concerned and worried about the constant, we're the only nation, I think we're the only nation in the world that hasn't been taken over by religious fanatics where
Starting point is 01:08:15 rights are being taken away. You just, that's never happened in the history of America. We've always granted rights to people, additional rights to people. So anyways, my fail is what is an incredibly restrictive, primitive, there's no way around it, misogynistic law that's been passed in Florida, the great state of Florida, which is feeling less great today. And I'm hopeful around the flipping, the election of someone, a Democrat to the Supreme Court in Wisconsin. Yeah, well, you know who the only person who's saying this is a real loser for us is?
Starting point is 01:08:52 But Donald Trump. He's the only one who gets it. Someone who's familiar with the topic. Allegedly, he's the only one. And the rest of them, it's like a suicide pact these people have. I don't even understand it. But they're relentless.
Starting point is 01:09:05 They are just relentless. And now they've got a lot more allies like Musk and others who are just lying their way right to the top, which is really depressing to me. Anyway, we'll see. That's a really good one, Scott. I really appreciate that. We want to hear from you.
Starting point is 01:09:20 Send us your questions about business, tech, or whatever's on your mind. Go to nymag.com slash pivot to submit a question for the show or call 855-51-PIVOT. And don't forget, we were nominated for a Webby Award, and we need you to vote for us at the link in the bio. We're losing. We're like in third right now. Okay. Well, what are you going to do? Do I got to make my jokes dirtier? Is that what's going on here? Make your jokes dirtier.
Starting point is 01:09:41 Jesus walks into a restaurant and says, I need a table for 26. And the waitress says, there's only 13 of you. And he said, yeah, but we all sit on the same side. Oh, my God. That's a terrible Jesus joke. You need to up your Jesus jokes. Do you know what car he drives? Honda, of course.
Starting point is 01:09:58 A Chrysler. Okay. Vote for us. Next time I'm in West Virginia, I'm going to do that one because they would laugh at that one. That's a good one. I like that. In any case, please vote for us. Okay, Scott, that's the show and Jesus Jokes. We'll move back to dad jokes very soon, hopefully.
Starting point is 01:10:16 We'll be back on Friday for more. Scott, read us out. Today's show was produced by Lara Naaman, Evan Engel, and Taylor Griffin. Ernie Indertot engineered this episode. Thanks also to Drew Burrows and Neil Saverio. Make sure you subscribe to the show wherever you listen to podcasts. Thank you for listening to Pivot from New York Magazine and Vox Media. We'll be back later this week for another breakdown of all things tech and business.
Starting point is 01:10:37 Optionality!
