Pivot - SPACs tumble, the education tech market and the future of automation

Episode Date: March 9, 2021

Kara and Scott talk about how SPACs have tumbled in recent weeks and what that means for the IPO landscape. They also discuss Coursera going public and the future of education tech post-pandemic. Then, New York Times tech reporter and author of the new book "Futureproof: 9 Rules for Humans in the Age of Automation," Kevin Roose, joins to discuss the future of automation in our day-to-day lives. MacKenzie Scott's marriage is our win! Learn more about your ad choices. Visit podcastchoices.com/adchoices

Transcript
Starting point is 00:00:00 Support for Pivot comes from Virgin Atlantic. Too many of us are so focused on getting to our destination that we forget to embrace the journey. Well, when you fly Virgin Atlantic, that memorable trip begins right from the moment you check in. On board, you'll find everything you need to relax, recharge, or carry on working. Lie-flat, private suites, fast Wi-Fi, hours of entertainment, delicious dining, and warm, welcoming service that's designed around you. Check out virginatlantic.com for your next trip to London and beyond and see for yourself how traveling for... takes forever to build a campaign. Well, that's why we built HubSpot. It's an AI-powered customer platform that builds campaigns for you,
Starting point is 00:00:50 tells you which leads are worth knowing, and makes writing blogs, creating videos, and posting on social a breeze. So now, it's easier than ever to be a marketer. Get started at HubSpot.com slash marketers. Hi, everyone. This is Pivot from New York Magazine. Harry calls her. I've been consuming all this Harry-Meghan content a lot. Did you watch that? I did. It was interesting because my son was downstairs watching the NBA All-Stars or whatever. They were doing all kinds of beeswax down there. And he was screaming from downstairs every time Steph Curry made some sort of three-pointer type of thing. And then I watched the entire thing with Amanda, who was not that interested in Meghan Markle and Harry.
Starting point is 00:01:43 But I got her to watch it, and she liked it a lot, I think. She didn't seem to be interested in it. But it was a big splash for Paramount Plus, which is CBS All Access. It was a big plus for CBS. It had, I think, 17 million views. Oprah can really still bring it when she needs to. Oh my gosh. She's incredible. I thought for a minute she wasn't going to bring it, but she brings it. I didn't watch it. I think they should have renamed the episode White Lives Don't Matter. I just think this is literally— I think these are some of the most boring people on the planet.
Starting point is 00:02:11 No, you're wrong. Oh, my gosh. It was a— It's porn. It's like, let's trade my celebrity and talk shit and trade secrets. I didn't even watch the thing, and I watched it. Why do we need to protect the crown? The crown is already, like, a successful TV show.
Starting point is 00:02:24 Oh, the crown is wonderful like a successful TV show. Oh, The Crown is wonderful. We love The Crown. We love The Crown. Well, season seven is going to be lit when you listen to this interview, let me just say. It doesn't sound very pretty there over at Buckingham Palace. It's trading in on celebrity they have not created. So what?
Starting point is 00:02:37 So what? Big deal. So what? God save The Queen. Come on. Why do you like The Crown and not this? Tell me why. Why do I like The Crown? I think that— Yeah, because it's really quite fictional according to many people who like The Crown and not this? Tell me why. Why do I like The Crown? Yeah, because it's really quite fictional, according to many people who live with The Crown.
Starting point is 00:02:48 I don't doubt that, but it's good fiction. I just don't like people trading off celebrity they inherit. I don't like the Trump kids who have done nothing themselves, who have accomplished nothing themselves. And I don't—it's unfair. I don't—maybe you can tell me about anything these two have accomplished on their own, but I feel like it's just porn. Well, she's been an actress. She was on a relatively successful TV show. Yes.
Starting point is 00:03:09 She was on Suits. She was fine. She was on Suits. No, but it was a successful show. Are you on a successful show? I'm sorry. I would draw everything I've said. No.
Starting point is 00:03:20 She was on Suits. Look, 17 million people disagree with you that this is interesting by the way the point she was making during the interview was how much the tabloids in britain have this deal with the with the crown with the with buckingham palace to like they have parties for them and they trade access for behavior like not attacking certain members of the crown like prince andrew for example let me just, the crown is already doing this. So I don't know why you're all hot and bothered about Harry. Well, you know Harry.
Starting point is 00:03:49 You know the story about Harry. When he was born, they were going to name him Up so they could call the family Up, Chuck, and Die. Oh, my God. That's good royalty humor. Or they were going to call him, actually, the second name they had for him was Shamu. So they would call him Shamu, Prince of Wales. Oh, my God. That's good royalty humor.
Starting point is 00:04:05 That is not good. Let me just say, with Diana, I did, because I think this, first of all, the crown is wrapped up in the tabloids already, and they have deals and everything else, the way that works in Britain. It's a whole little economy, so I don't think there's anything wrong with these two cashing in on it. That's Uno.
Starting point is 00:04:21 Dose, his mother died in a paparazzi accident. Like, come on. Like, she was being chased by paparazzi. That's Uno. Dose, his mother died in a paparazzi accident. Come on. She was being chased by paparazzi. That's not fair to pull that out. What does that have to do with Oprah interviewing him and then moving to LA and going to Soho House all the time? Because one of the topics discussed was how much she didn't play
Starting point is 00:04:37 the game and got attacked for her race. Go look at the things of Kate Middleton, who's perfectly nice, and her. Kate Middleton's pregnant, holds her belly. They're like, isn't this beautiful? She holds her belly. She's got a weird obsession with it.
Starting point is 00:04:50 You're conflating. No, I'm not conflating. Princess Diana and whether they're nice or not. And basically, these folks cashing in on, again, celebrity based on other people's discretion and also their commitment to their nation and their crown. You love that crown, don't you? You think the crown is all that? I think it's all been a scam the whole time. I was a dual citizen.
Starting point is 00:05:12 I was a British citizen for a short time. You have to like the queen. They were nice to the queen during this interview, just so you know. The queen didn't get touched. Well, they're not stupid. I mean, who's not going to be nice to the queen? I kind of believe the queen is one of the most, I think, respected or well-liked people in the world over the age of 105 or whatever she is now. Perhaps.
Starting point is 00:05:31 It's not that many. I thought it was, look, let me just say it was a huge, huge hit for Oprah. Enormous, I heard. Let's talk about Oprah. Oprah, once again, does a fantastic job. Unbelievable. Someone was just pointing, there were so many good tweets about the whole thing. That was my favorite part.
Starting point is 00:05:45 It was like Oprah just restarted the war of 1812. Everyone was waiting for black Twitter to go against UK Twitter because she was
Starting point is 00:05:52 talking about racial issues quite a bit including someone asking about the color of her baby skin and stuff. So I thought it was
Starting point is 00:05:58 just good entertainment. That's what I said. At the end of the day it's what I would call is it's anything about these two. It's a palate cleanser for anything of substance. So, okay, I'm sick of hearing about the pandemic.
Starting point is 00:06:12 I'm sick of hearing about what the Biden administration is doing. I know I'll watch Meghan and Harry and talk about their new home in the Hollywood Hills. Good for them. Entertainment. They live in Santa Barbara, in case you're interested. They live in Santa Barbara? That's right. They live in Santa Barbara.
Starting point is 00:06:24 I just want to throw it out there and call this the Super Bowl for millennial women. Like, that's what it was. Really? Oh, there's Rebecca. Thank you. I just need to throw it out there. Thank you. Rebecca answered, as usual, on my side.
Starting point is 00:06:35 Correct, Rebecca? That's a shocker. Almost always. That's a shocker. What do you mean? No, no. Get rid of the word almost and we're accurate. Yeah.
Starting point is 00:06:42 Yeah. That's right. Well, you know what? You missed the entertainment. Like, it was fun. I didn't watch it. I got a minute. I didn't watch it.
Starting point is 00:06:47 It was so good. It was so good. Go watch your NBA All-Stars bullshit and then get out of here. I'm sorry. The Millennial Super Bowl, does that mean it was very expectant and obnoxious and will decline in viewership 8% a year for the next 30 years? Oh, no, because it's Oprah. Thank you, Rebecca.
Starting point is 00:07:01 After the pod, I'm going to go watch it on Paramount+. Listen to me, Scott. Okay, go ahead. Oprah, it shows how good Oprah is, I have to say. I you, Rebecca. After the pod, I'm going to go watch it on Paramount+. Listen to me, Scott. Okay, go ahead. Oprah, it shows how good Oprah is, I have to say. I'm with you. Oprah's a gangster. She's fantastic. This is why she was so—I learned a lot as an interviewer.
Starting point is 00:07:12 So Megan would say, you know, someone was tweeting about this X, Y, Z, and then Oprah would go, X, Y, Z? So she'd feign like she was excited. She was in with them. Shocked. Then she sort of challenged them. She's a genius. She's a frigging genius. She's very good.
Starting point is 00:07:29 She's an inspiration. I love Oprah. I got to say, she's really good. She made some great entertainment. In any case. A billionaire who deserves to be a billionaire. But anyway, my son was watching the NBA All-Stars. But the NBA created a blockchain.
Starting point is 00:07:42 He does not care about those. He cares about Steph Curry and three-pointers. But NBA created a blockchain. This is, he does not care about this. He cares about Steph Curry and three-pointers. But NBA created a blockchain advisory committee. And a friend of Pivot, Mark Cuban's involved, Ted Leonsis, Joe Tsai from Alibaba. They'll determine how to leverage blockchain to benefit NBA's business. And now you can use Dogecoin to buy Mavericks merchandise. Jesus Christ, this is a slow news day. Why don't you and I create an Ethereum advisory committee for Vox?
Starting point is 00:08:05 I mean, who the fuck cares? An advisory committee? That's news? Breaking news. A bunch of old white men are trying to act younger. What are you interested in? What are you interested in? This is like the mother of all Botox.
Starting point is 00:08:19 I know. Let's try and pretend we're younger and hipper with the young people and announce an advisory committee on Vox. Well, they did pass stimulus. What are you interested in, Scott Gatlin? There's not a lot going on this week. I'm excited. Oh, my God. What are you talking about?
Starting point is 00:08:31 I think the SPAC meltdown is more interesting. We're going to go into that in our big story. We're going to go into our big story. But is there any other interest to you? Look, I'm a little bit forlorn. I'm a little bit— Why? Why are you forlorn?
Starting point is 00:08:43 Why would I have my heart sort of broken today? Oh, well, we'll get to that. Okay. Mackenzie Scott got married and it wasn't to you. I am, honestly, honestly, I am genuinely happy for her. I think science teacher. Look at that. He looks nice too.
Starting point is 00:08:59 I am. She is so fantastic. Everything she's doing is like. Literally, she's a role model. You think to yourself, that's how I would want my kids to grow up and behave. I don't know. She just, I'm just, I mean this sincerely. I read it and I'm like, I haven't been this happy for a famous person in a long, long time.
Starting point is 00:09:21 She is doing it right. She's living her best life. She's finding happiness. She's living her best life, giving away finding happiness. She's living her best life, giving away money, being her own person. That was a really nice story. I thought that was like,
Starting point is 00:09:30 oh my God, when's the last time I opened TMZ and was happy? Yeah. Like page six. Oh my God, I'm happy. When does that happen
Starting point is 00:09:37 after you reach page six? When does that happen? When does that happen? I don't know. Here's one good thing. CDC guidelines say people who are vaccinated can hang out with other people
Starting point is 00:09:44 who are vaccinated indoors and just breathe all over each other. They're not tremendously concerned about more infection. That's good. That is good. I think that's really nice. You can have restaurants of just vaccinated people like yourselves. I'm so sick of hearing about restaurants. Jesus Christ. By the way, 27 million, 27 million, the National Restaurant Foundation got for restaurants, but not all schools are open. Thank God. Johnny's Taqueria is going to stay open. I just don't get the whole restaurant thing. This is not the Hunger Games. We can be speaking of restaurants. I just don't
Starting point is 00:10:17 get the whole restaurant thing. In any case, there's no perfect answer in this situation. Anyway, we're going to big stories now, something you'll be happy about. are in this situation. Anyway, we're going to big stories now, something you'll be happy about. SPACs are falling. The SPAC index fell 20% since February peak. This is a technical definition of a bear market.
Starting point is 00:10:34 As a reminder, SPACs are special purpose acquisition companies. Essentially, but we like the word SPAC, essentially a shell company set up by investors with the purpose of raising money through an IPO and then they back a company into it. They've become the dominant way for a company to raise equity finance this year. Over 200 new SPACs have raised $70 billion in 2020. Meanwhile, one of the most prominent SPAC holders, he has other SPACs, by the way, Chamath
Starting point is 00:10:55 Palihapitiya, faced backlash last week after filing showed that he had sold his entire personal stake in Virgin Galactic, which he took public as a SPAC, one of the very earliest ones. So what do you think, Scott? What's going on? Tell me about SPACs if you don't want to talk about important things like Megan and Harry. Well, I think SPACs, we're going to look back on SPACs the same way we looked back on the.com or.bomb. And that is a lot of them are just overvalued.
Starting point is 00:11:17 But it's a new financing mechanism. It's an interesting way. It's an old one that's become new. It's an interesting way. It's an old one that's become new. Well, but in the first 17 days of 2021, there were more SPACs raised than in all of 2019. Right, but it had been something that had been around. But go ahead. And this is essentially, I think, of everything through the lens now of dispersion because I'm hoping to get into the Urban Dictionary as the guy who read in dispersion.
Starting point is 00:11:41 Is that your next book called Dispersion? My next book is called The Algebra of Wealth. Oh, right. Okay. Strategies on Personal and Economic Satisfaction. Anyway. You just churn these books out. I like to write.
Starting point is 00:11:52 You do? I like to write. So, anyways. Write my book for me. What are you talking about? Spacks. Oh, yeah. So, anyways, but it is the dispersion of investment banking,
Starting point is 00:12:01 and that is companies and operators, and there's some very talented operators doing these things, have said, okay, we don't need the investment committee of Morgan Stanley or Goldman Sachs to tell us who should be a public company. We'll go public and we'll add a lot of operating muscle and tensile strength to a good private company, and boom, just add water. It's public and reflects a few things. One, never underestimate the market's ability to come up with a product when people have cash in hand.
Starting point is 00:12:26 Sort of the analogy I use was when it starts raining, there's an umbrella for sale every six feet in Manhattan. And I'm like, where did this guy come from? Where are they hiding these things? Anyways, a lot are gonna go out of vogue, a lot are overvalued, but I do think for the first time these things are here to stay. Here to stay.
Starting point is 00:12:44 Chamath. So you're sort of SPAC positive. I mean, there's what I'd call the bearded lady A lot are overvalued, but I do think for the first time these things are here to stay. Here to stay. Chamath. So you're sort of SPAC positive. I mean, there's what I'd call the bearded lady and the world's strongest man inside the tent. And then there's the guy in front of the tent saying, come on in, come on in. And Chamath has done, you can call him a storyteller, a visionary, P.T. Barnum, but he has been the face for SPACs. And he was early. He got them done early.
Starting point is 00:13:04 He was visionary about it. But when you sell your entire stake, to be fair, it's not his entire stake because he has a company that owns other shares in. But when you sell $200 million in one day of your, there's just no getting around it. It's a negative forward-looking indicator that an individual who understands his company fairly well
Starting point is 00:13:22 and understands the market, he's a great investor. Maybe he's buying NFTs. Maybe he's doing that. But go ahead. Well, this was the ultimate jujitsu move, I thought, on his part. He said, well, I'm still committed to their mission despite selling a quarter of a billion dollars in stock, but I'm taking the money to reinvest in climate change. That's like saying, well, if he decides to bail on the shares of his other SPACs, is it going to be to cure cancer? I thought that was an interesting kind of acrobatics or gymnastics to try and turn chicken shit into chicken salad. But it ended up, just that one sale, the whole market went, well, if Chamath, if the king of SPACs has decided to leave the kingdom and go on Oprah and talk about how much the kingdom sucks, then something's wrong. And the whole space – and it's not his fault, but all you need is an excuse for a space that's overvalued to begin leaking value.
Starting point is 00:14:14 No question. So, look, certain SPACs have raised the money, and they have to either use it or lose it, right? Two years. Two and a half years, two years? Two years. So, the ones that exist are going to have to back themselves into a bunch of companies. Oh, I mean, you don't want to talk about it.
Starting point is 00:14:29 There's now something like 150. Right, looking for smash. And now I add up, it's 150 billion searching. It's really more like half a trillion because they typically do staple on financing or a pipe. So there just aren't that many
Starting point is 00:14:40 great private companies. So there's people wandering around. You're going to see SPAC start to go abroad. You're going to see them go into Latin America and Europe. But basically, there's too much capital looking for good private companies. And then there's all that venture capital. There's a lot of capital. There's a lot of dough. Especially a lot of spending going to be happening in the next few months, I think. So, what does it say about the IPO landscape in general?
Starting point is 00:15:04 That's a good question. I think that it's bifurcating, and that is, people have been saying, or the easy, the kind of easy narrative is, well, this hurts investment banks because now there's new investment bankers. No, it's not. Goldman and Morgan Stanley are taking SPACs public, taking their 7%, and there's definitely a bifurcation, and that is investment banks, Goldman and Morgan Stanley, everything distills down to one dynamic in our society, Kara. Which is? iOS and Android. And iOS is the premium kind of higher-end, more aspirational, cost-more positioning, luxury positioning.
Starting point is 00:15:38 And Android is for the masses, and it works really well and is almost just as good, 80% of the value of iOS for 0% of the price. And essentially what you have now is SPACs that go public, and then you have companies that go public through JP Morgan, Morgan Stanley, and Goldman, and quite frankly, they're just of a higher quality. And so iOS is the companies that choose to go public the old-fashioned way. And everybody says, everybody complains, VCs, oh, they screwed up, they got a huge pop, they left money on the table.
Starting point is 00:16:09 It's a branding event. If your company goes up 30%, 50%, 100%, you're not getting, you didn't lose 50% of your net worth. Usually, they only take 10% of the shares public. So, if you underpriced it, then you lost 5% for what is the branding event that you will never have again because you're on the cover of every business story the next day saying, you know, Coursera, which just filed to go public, up 40%. So, they purposely, it's an ecosystem where everyone sort of wins. And that is, okay, the initial shareholders complained that they left money on the table. Sort of, not really because of the branding event. And also the investment bank gets to give a pop to their clients. It's a big ecosystem.
Starting point is 00:16:48 So these SPACs will find homes and they will find companies to back into them. And then these IPO landscapes, what? There's just more public companies or some are questionable or they'll find shitty companies? We're catching up. There's been a dearth of IPOs and the number of companies publicly traded in the last 30 years has been cut in half. So you could argue that this is just a regression to the mean, and there'll be more publicly traded companies. I would say somewhere between a third and two-thirds of SPACs will trade below their offering price. We're already getting there. We're already starting to see the SPAC index come back to where they went public at. come back to where they went public at. But it's definitely, I think the mechanism as a form of fundraising is definitely here to stay. This is an innovation that's working.
Starting point is 00:17:30 It will not be quite this firmer. Well, SPAC, if you look at the economic history of SPACs, they typically underperform the market because the investment committees at Morgan and Goldman are smart and can pick from the best companies. Pick of the litter. There's a lot of litter going on here. That's right.
Starting point is 00:17:47 What do you think? What's your, you've seen a lot of this. I think there won't be as much fundraising and everyone and their mother won't have a SPAC like everything else, just like sub stacks or podcasts or anything. Everyone all rushes into things and then they rush right out.
Starting point is 00:18:00 Like, you know, audio space, anything, everything. And this is just the same thing. And they'll run out of really good things to do. And some of them will do well and some of them won't. But it's certainly an easier way for a lot of companies to go public, for sure. And we'll see where that goes. Whether they should be public is another question. You just said almost nothing.
Starting point is 00:18:16 I did not. You just said almost nothing. You conditioned everything. You literally conditioned everything. It would be impossible for you to be wrong with what you just said. No, I don't. Like anything else, it overvalues the under. It could rain.
Starting point is 00:18:26 It might not. It could rain. It might not. It's neither here nor there. There will not continue to raise as much capital. Some of these companies will do badly because they aren't as good at picking things like you said. I just got very self-conscious. You know how I'm self-conscious?
Starting point is 00:18:40 You did. You got mean there. That's okay. Of course, it's International Women's Day and you attack a woman, but that's okay. All right, Scott, let's go to a quick break. That is, oh, by the way, that's entirely fair. That's entirely fair. You know what? I look forward to a day. I look forward to a time, Kara, when we don't need an International Women's Day. I feel unsafe. I feel unsafe in this relationship. I feel unsafe. I look forward to an age
Starting point is 00:19:05 when we don't have to have an International Women's Day. But anyways. Oh my God. Well, that would be nice. I'm not only self-conscious now. We have equal rights under the law. Same thing for everybody else
Starting point is 00:19:14 except white guys. About you. You saying I'm mean and not nice to women, which will always stick when you say that about me. You know what I'm really self-conscious about?
Starting point is 00:19:21 We have to go to a break. Well, hold on. I tell you I'm self-conscious. I don't care. You don't care. I'm self-conscious about? We have to go to a break. Well, hold on, hold on. I have all this. I tell you I'm self-conscious. You don't care. I'm self-conscious. I have all this family. Oh, feign interest. Go ahead.
Starting point is 00:19:31 Tell me about it. I have all this family. No, do what Oprah does. What? You're insecure? You're what? You're self-conscious? Oprah would move it right along.
Starting point is 00:19:40 I have all this family. Oprah would move you along. I have all this family in London and Glasgow, and I'm worried. My comments were supportive of the crown, right? They're not going to be mad at me. Okay, good. I'm fine. I'm fine.
Starting point is 00:19:50 Good. I'm fine. There's not a queen you don't love. Anyway, we'll go to a quick break. When we get back, we'll talk about the future of education tech, and then we'll be joined by New York Times tech columnist, Kevin Roos, who has a new book called Future Proof. called Future Proof. Fox Creative. This is advertiser content from Zelle.
Starting point is 00:20:17 When you picture an online scammer, what do you see? For the longest time, we have these images of somebody sitting crouched over their computer with a hoodie on, just kind of typing away in the middle of the night. And honestly, that's not what it is anymore. That's Ian Mitchell, a banker turned fraud fighter. These days, online scams look more like crime syndicates than individual con artists. And they're making bank. Last year, scammers made off with more than $10 billion. It's mind-blowing to see the kind of infrastructure that's been built to facilitate scamming at scale. There are hundreds, if not thousands, of scam centers all around the world.
Starting point is 00:20:53 These are very savvy business people. These are organized criminal rings. And so once we understand the magnitude of this problem, we can protect people better. One challenge that fraud fighters like Ian face is that scam victims sometimes feel too ashamed to discuss what happened to them. But Ian says one of our best defenses is simple. We need to talk to each other. We need to have those awkward conversations around what do you do if you have text messages you don't recognize? What do you do
Starting point is 00:21:23 if you start getting asked to send information that's more sensitive? Even my own father fell victim to a, thank goodness, a smaller dollar scam, but he fell victim and we have these conversations all the time. So we are all at risk and we all need to work together to protect each other. Learn more about how to protect yourself at vox.com slash zelle. And when using digital payment platforms, remember to only send money to people you know and trust. Support for this podcast comes from Anthropic. You already know that AI is transforming the world around us, but lost in all the enthusiasm and excitement is a really important question.
Starting point is 00:22:02 How can AI actually work for you? And where should you even start? Claude from Anthropic may be the answer. Claude is a next-generation AI assistant built to help you work more efficiently without sacrificing safety or reliability. Anthropic's latest model, Claude 3.5 Sonnet, can help you organize thoughts, solve tricky problems, analyze data, and more. Whether you're brainstorming alone or working on a team with thousands of people, all at a price that works for just about any use case.
Starting point is 00:22:33 If you're trying to crack a problem involving advanced reasoning, need to distill the essence of complex images or graphs, or generate heaps of secure code, Claude is a great way to save time and money. Plus, you can rest assured knowing that Anthropic built Clawed with an emphasis on safety. The leadership team founded the company with a commitment to an ethical approach that puts humanity first. To learn more, visit anthropic.com slash clawed. That's anthropic.com slash clawed.
Starting point is 00:23:02 That's anthropic.com. Okay, Scott, we're back. Let's talk about your area of expertise, education tech. Coursera is among the education tech companies going public this week. And then your company, Section 4, has raised $30 million. Speaking of not enough rat holes to shove all the cash down, your company, Section 4, has raised $30 million. I appreciate your support, Kira. Full disclosure. We were just about to ask you to go on the advisory board of our rat hole.
Starting point is 00:23:31 No, thank you. I don't do any advisory boards. No, you're too pure. Explain to us, yes, I am, why you think education tech is the next big thing. Please illuminate us on what you're doing there. And what do you think about Coursera, too, and all these education tech? Tuition up 1,400%. The upward lubricant of the middle class in America has been accessed to cheap, great education.
Starting point is 00:23:52 It's now become the caste system where we enforce on people. We say you're either the children of rich people or freakishly remarkable or you don't get into a great school. And my cohort has become drunk on luxury and brag about turning away 95% of our applicants. And the result is a transfer of wealth of one and a half trillion dollars, middle-class households to the endowments and pockets of administrators. So there is the mother of all chins hanging out there.
Starting point is 00:24:19 My company, Section 4, is very straightforward. I think the scarcest product in the world. When I see scarcity, I mean value relative to the number of people who the product is available to, is the American MBA. It's a global brand. It is transformative. Kids who come into the Haas School of Business at Berkeley are making 70 grand. When they leave, they're making 140 grand 24 months later. It is transformative. And unfortunately, the total market for graduate MBAs full-time is about 8,000 people a year. If you are one of the 66% of America that doesn't get a college degree, not accessible. If you're one of the 99% of
Starting point is 00:24:56 America that can't take a half a million dollars in investment and lost income, you're shit out of luck. If you're, say, have a kid at home or you're a single mom, there is no opportunity for you or you've collected things like dogs and spouses. So what our vision is, we want to make it accessible. It's 10% of the price and 1% of the friction. Well, by the way, I want to ask, one, you're at NYU,
Starting point is 00:25:18 so you're kind of playing against your business. Biting the hand that feeds me? Biting the hand that feeds you. I've returned my compensation for the last decade, so they're no longer feeding me. All right, all right. So you just said I'd screw you. Enough virtue signaling.
Starting point is 00:25:27 Yeah, okay, all right. Go ahead. So when you think about this, education kind of has failed us during the pandemic. It's been a really bad experience, I think, for everybody. And this idea of analog education or analog people going to school, people seem to, that's the one thing. Telework, I see. Telehealth, I see. It's continuing.
Starting point is 00:25:45 But teleeducation just hasn't been as great as it could be. So, how do you justify that? What do you think has happened? It's not either or. The jig is up. A lot of people have realized that they're paying $59,000 for a streaming video platform called Harvard when it's online. Yeah. And also, it's where are we headed?
Starting point is 00:26:05 The same way we're headed with the opposite. It's not going to be all remote. But the dirty secret of every professor is the following. We have two or three good jokes. And what I mean by jokes is we have two or three real insights that our research has provided or produced. And we stretch it out over 36 podium hours in 12 weeks, such that our host institution can charge $7,000 or $8,000 for those
Starting point is 00:26:25 12 weeks. So, you have just two things to say, really? Well, I generally believe, and the model of our framework is the following. In two to three weeks with intensity, team groups, a bunch of TAs, live streams, videos, we can get 70% to 80% of the intellectual property across to you for, again, 10% of the price and 1% of the friction. Right. And we've talked about Google's certificates, too. Which I'm super excited about and I keep hearing nothing about. Which people tweeted back at us. No, actually, they tweeted back saying that this is when they're starting. No, no, no.
Starting point is 00:26:50 You aren't paying attention. They said stay tuned. Stay tuned, yes. We're tuned in. But it's coming. I'm ready. We're tuned in. Okay.
Starting point is 00:26:55 I'm ready. But anyways, you're right. But where I was headed with this is it's not going to all remote because you're right. Remote, it's not – there's something to the electricity, but the dirty secret of higher ed is the following. Half our classes, half our classes where there's not a lot of interaction could be done synchronously. I'm not saying videos, but synchronously online and lower costs. And what do you do? You take UCLA back to 60% admittance instead of where it is 12%. And by the way, their applications are up 28%, meaning the kids of single parents, not remarkable, are no longer getting the same access to opportunities that you and me got 30 years ago. So, big tech, small tech, online learning, a mix. It's a mix. It's not zero-one. It's a hybrid. It's some on campus. Tell me what you think of analog. Are you back in school? Are you back in teaching? I will be in the fall.
Starting point is 00:27:45 Okay. So, what what you think of analog. Are you back in school? Are you back in teaching? I will be in the fall. Okay. There's a magic and a mystery to being in a classroom, Greek-style theater, seeing people's emotions. It's fantastic. But you could have 95% of that with six of those classes in person and then six online, and then we can lower the cost and increase the admissions rates. So what's the 6% that's in line? What's the 6% that's in person? When you say the 6% I don't understand.
Starting point is 00:28:06 Oh, the six sessions. Yeah, yeah. What should be the real lives analog session? Where you have the key sessions that are important
Starting point is 00:28:13 and there are a lot of them is when you have interaction with the students and it's the bumping up against your students where you say, okay, should Dotson have become Nissan?
Starting point is 00:28:21 Why was it good? Why was it bad? Should Google have launched YouTube? I mean, that is, there's fantastic learning there, but there's also a lot of classes where you're basically just lecturing and they don't need to haul their ass to Soho
Starting point is 00:28:33 or to Palo Alto or to Ann Arbor. And also it scales really well. Online scale, I taught 160 kids, I teach 160 kids in person because guess what? That's the biggest classroom at Stern. But when I go online, I have 280 kids. So the question is how do we leverage, I'm not saying replace, but supplement online such that we can dramatically increase scale and stop this bullshit of thinking we're Hermes and realize we're public servants again, dramatically increase acceptance rates,ramatically decrease costs and provide younger people with the same opportunity my generation had. And then get certified.
Starting point is 00:29:10 I don't know why I'm so angry. Why am I so angry? You're very angry about this because you just raised $30 million. There you go. You're a rat hole. But let me ask, what do you think about K-12, though? Seemingly problematic from a mental health person. Look, I think the two are totally different.
Starting point is 00:29:27 I think a 19-year-old taking half his classes at home or even trapped at home is a nuisance, but it's not a tragedy. A fourth grader needs, in my view, and I have experience with this, this is pulse marketing, I feel like I have some research background in higher ed, but none in K through 12. But what I've noticed personally is when a fourth and a seventh grader are trapped at home and doing remote learning, your house can collapse. That it's incredible stress on the mother. And let's be honest, it's almost always the mother that picks up the slack. And kids need socialization. Kids need to make eye contact with their teacher. The college students don't. Not as much. Graduates.
Starting point is 00:30:03 Not as much. You know why? Because the socialization, they'll do regardless of the class. Well, essentially, so you're trying to put NYU out of business, I see. This is what's happening here. Is that correct? I'm going to choose my words carefully here. No, I'm going to give stock to NYU. And NYU, actually, to their credit, they kind of embrace this stuff. Let me put it this way.
Starting point is 00:30:23 I'll give you this. They're more generous and patient with me than I would be with me. Yeah. But they like this stuff. They've actually approved some professors teaching on it. They're trying to – they don't see it. I am with you on this. It's not competitive because I can't get you the certification that a top 20 MBA gets you,
Starting point is 00:30:41 and it's still incredibly powerful for all the shit we talk about. And that's what should happen. That's the next shoe to drop, correct? Certification, some sort of... Well, that's what Coursera is trying to do. Yeah. Coursera, the original gangster here, has filed to go public.
Starting point is 00:30:53 I think that's going to be... I don't know if it'll get at the pop. I think it's going to be a fantastic stock to own. I shouldn't say that in time of the valuation, but these guys are the original gangsters. They took a lot of arrows. They got a lot of mud on their face. They were the pioneers here.
Starting point is 00:31:08 And I think it's really exciting that they're growing public. Online education and capital really holds out the opportunity to move away from this bullshit artificial constraint of campus and self-aggrandizing arrogant administrators at universities and move it back to where it needs to be. And that is the greatest upward lubricant in the history of mankind. And that is higher ed that is affordable and lets in unremarkable kids and says, you're unremarkable, but we're going to give you remarkable opportunities.
Starting point is 00:31:34 We become totally drunk on the freakishly remarkable and rich people. Does that bother you that I'm freakishly remarkable and you're unremarkable? That's my question. I'm used to it. I'm used to it. I'm used to it. All right.
Starting point is 00:31:46 This gets us in a perfect situation for our friend of Pivot. Kevin Roos is a tech columnist for the New York Times. He has a new book out called Future Proof, Nine Rules for Humans in the Age of Automation. It focuses a lot more on automation, but Kevin, you've just been listening to us. Welcome to the show. Thank you for having me.
Starting point is 00:32:08 So talk a little bit about future. What do you think of what Scott's saying here? Does this fit into your future-proof situation where we have to proof ourselves? You talk about automation most of the time, not online school, essentially. Yeah, well, I think a lot of that is connected. I think what we're seeing now,
Starting point is 00:32:23 and the reason I wrote this book is because we've seen this huge influx of AI and automation into industry and higher education and journalism. And it's changing all of our jobs and requiring us all to adapt. And so I was freaked out because I was looking at my own future
Starting point is 00:32:41 and thinking, what can I actually do to prepare for this? And I think, as Scott said, we've realized now that some parts of what we do are likely to be automated or taken online or disaggregated in some way. And so it's up to us to figure out how to deal with that. Name some of the rules. Tell us what the rules are.
Starting point is 00:33:05 Well, there are nine rules in the book. The subtitle is Nine Rules for Humans in the Age of Automation. I want it to be more efficient than the Ten Commandments. So this is just nine here. But yeah, there are basically three buckets of rules, and I won't list them all because that'll take
Starting point is 00:33:19 forever, but basically there are three buckets. One is how to future-proof your life at home, your brain, your personal life, your family life, how to future-proof your career, and how to future-proof your community. All right, give us one from each. So the one that is for your own self is that I think you need to basically find things to do that are not going to be replaced by machines. I think we've been training people
Starting point is 00:33:48 for the future entirely wrong. We've been teaching them to become more machine-like, to major in STEM, to become super efficient, to optimize and life hack their way to success. And I think we really need to focus on the more human skills that machines can't replace. I have three categories in the book of work that I think is unlikely to be automated soon.
Starting point is 00:34:14 And it's surprising, social, and scarce. So those are the types of work. One thing that I think you both have done very well is what I call leave handprints, which is to make yourself less of a sort of cog in a machine to make it clear that you are a human creating human work. One of the things I got from interviewing AI experts and economists from this book is that in the future,
Starting point is 00:34:39 things that are done by machines will become very cheap and things that are done by humans will become more valuable. Artisanal. We leave our dirty handprints. You know that. That's our thing. I like that. There's a lot of literature showing that we actually value things more when we think that people had a real hand in creating them.
Starting point is 00:34:57 That's something that people can do for their careers and for your community. I think that what we teach children and what we teach ourselves as adults is really important here. I think that we need a kind of update to the curriculum that we've been teaching people now for 100 years, which is all about making people into effective workers in an economy that rewards things like output and productivity and efficiency rather than the more human traits
Starting point is 00:35:26 that I think people are going to end up moving to as a result of AI and automation. Scott? I love that, leave handprint. So first off, and this is the most important question, although it's a bit of a digression, do you meditate? I try. Yeah, I don't always succeed, but I try.
Starting point is 00:35:41 Because I can just hear in your voice, you have nice chi and center. We need to roll. We need to hang, because my testosterone therapy has totally made me jumpy. So we'll talk after the show. But anyways, kudos to you. I can tell you're a centered person. I love that hands-on.
Starting point is 00:35:56 Let me ask you the following. My sense is we need a new approach that all capital sees labor that doesn't have a double E from MIT as a cost and a negative, as opposed to looking at people and trying to figure out how automation enhances human capital instead of replaces it. Don't we need a different mindset around robotics? Isn't the problem us, not the robots? Absolutely, yeah. I mean, one of the key lessons of the book is that robots don't do anything on their own right now. I mean, people say robots are coming for the jobs,
Starting point is 00:36:31 but it's really the executives at the Fortune 500 companies who are saying, I want to shrink the accounting department by 200 heads, or I need to squeeze out some more margin in next quarter's numbers, so I'm going to automate these jobs away. It's a really simplistic, really substitutive kind of automation that we're seeing a lot now. As I put it in a story yesterday for The Times,
Starting point is 00:36:55 it's about replacing fill in accounting rather than becoming a market leader, doing new dynamic things, developing new products. The economists call this so-so automation. It's the kind that kind of sucks. market leader, doing new dynamic things, developing new products. Economists call this so-so automation. It's the kind that kind of sucks. It's like the automated customer service line that you're like, I just want to press zero and get to a human.
Starting point is 00:37:17 Please put me through to a human. That's the kind of automation we're seeing a lot in the corporate world right now. And that's the part that's really dangerous, because that's not actually adding to human capability. That's not empowering workers. That's not developing new tools that are going to move the economy forward. It's purely substituting a machine for a human. And what do you see as,
Starting point is 00:37:36 what do you think the most exciting technology, I mean, robotics is, my understanding, is an amalgam or an alchemy of different technologies. What technology specifically do you think is driving the boom in robotics? And the reason I'm asking is, what area, if people want to devote their human capital or financial capital, not only just to robotics, but what is, you know, it was the micro, it was Intel that drove the computing revolution, you could argue. And that was, Intel was a great stock to own. What are the technologies that are driving robotics right now? Well, machine learning is the big one.
Starting point is 00:38:14 I mean, that's the thing that is transforming automation from something that can do rote and repetitive work to something that can do more kinds of creative, sort of cognitive work. And that's what we're seeing now, is that people, I think, still conceive of AI and automation as this thing that's going to replace people in factories. And the thing is, the factories are largely automated already.
Starting point is 00:38:34 That happened a long time ago. Now the robots are doing more kind of managerial tasks. There's some great research out from the Brookings Institution in Stanford about the fact that actually the people at most risk of replacement from AI are white-collar professionals in big metro areas who are doing sales projections and data analytics and those kinds of cognitive tasks that are done by people with college degrees who make a lot of money. That's not safe the way that we thought it was. make a lot of money. That's not safe the way that we thought it was. So when you think about that, this debate right now is over the $15 minimum wage. What happens to those jobs? Do you feel that they're at risk?
Starting point is 00:39:19 Certainly. I mean, retail is a big target of companies doing automation. We've already started to see some of that happening with the self-serve kiosks at fast food restaurants. I know Amazon just came online somewhere, where was it? I just read a story. I think what worries me is less the threat of displacement for those workers, because I think some of that stuff is going to end up being surprisingly hard to automate. But it's using automation and AI to turn workers into human robots, essentially.
Starting point is 00:39:45 Such push button, like at Amazon warehouses. Exactly. So if you work at an Amazon warehouse, you are taking instructions from one algorithm, you're putting things into a box, you're wearing a bracelet that tracks your productivity, you can be fired if you miss your packing target. I mean, it's essentially,
Starting point is 00:40:02 these jobs are kind of human robots. And I think that that's one of the cautionary tales. And they have the arms to do it right now until they figure out arms that can do it better. Exactly. And so those people should be concerned. But it's also like, there are ways that this can improve workers' lives, that people can be freed up from mundane,
Starting point is 00:40:22 sort of bullshit tasks. And we're just not seeing enough of that right now. We're seeing, I want to replace 10 people in accounts payable, so I'm going to buy this off-the-rack AI solution that can do that for me. So should we work at all? There's all those movies where we just sit in chairs and eat, and then robots take care of it.
Starting point is 00:40:44 Do you predict that's what's going to happen with automation? Is there any need for humans? There's definitely a need for humans. I'm not full dystopian, but I think that our jobs are going to change a lot. I think that right now a lot of people are employed in jobs that involve making things. I think that the future of jobs are going to be jobs that involve making things. And I think that the future of jobs are
Starting point is 00:41:06 going to be jobs that involve making people feel things. It's going to be the kind of things that bring about human connection. It's not going to be enough to be a really good radiologist. You're going to have to have a good bedside manner too so that people have a reason to come to you rather than going to an AI. Scott? And what do you, we were just talking about education. Do you see any thoughts around robotics as it relates to education and also healthcare? Yeah. I mean, these are the areas that have been sort of resistant to automation. You know, we don't have robots teaching college classes, most of them. Yet.
Starting point is 00:41:46 Yet. We don't have robot doctors, many of them yet, but that's coming. And I think that we need to stop educating people and telling them that they need to take on the machines head on to sort of compete with them, like the old John Henry thing. And we need to start telling people what they can do that is not going like the old John Henry thing. And we need to start telling people what they can do that is not going to be disrupted by these forces. And that's a journey that I've been on too. I mean, I'm a journalist at a newspaper.
Starting point is 00:42:12 That's not like the most future-proof job in the world. Wait, wait, why not? How are you getting replaced? Well, for example, one of my first jobs in journalism was I covered finance and I wrote corporate earnings stories. Yeah, those were stupid. Alcoa made this much money last quarter in their smelting division, that kind of thing.
Starting point is 00:42:33 And that's largely been automated in just the last 10 years. Those jobs are few and far between now. And I think there's another way in which automation has changed our industry, which is something that we don't talk about as much, which is that we've replaced these people who we used to call editors and ad salespeople with algorithms run by Google and Facebook that now choose what to show people and sell the ads around it. So there's been a lot of automation happening in our industry. It just hasn't been happening at the news organizations. It's been happening in our industry. It just hasn't been happening at the news organizations. It's been happening in Silicon Valley. Your insight and your word, artisanal,
Starting point is 00:43:09 is really, I think, striking insight as I think about it. And I'm going to parrot it as my own. But the notion that any activity that's wrote, that's the opposite of artisanship, right? I mean, what you do, and I think most of the journalists other than Kara Swisher at the New York Times do, is it's artisanal. They actually come up with original ideas that no AI for decades is going to be able to do. And I think that's a very interesting way of looking at
Starting point is 00:43:37 saying, okay, artisanship. So, how do you prepare a new generation of people coming up and maybe the ones that don't have access to college? How do you encourage or how does our education system instill artisanship? Well, I think it has to start from instilling a sense that this stuff is valuable, that, you know, you're not going to be unemployable if you're a musician or a philosopher or a sociologist. I mean, there was this really strong thread coming out of Silicon Valley for many years about the fact that STEM education was all that mattered. I think Mark Andreessen said, English majors are going to end up working in shoe stores and stuff like that. And so we really defunded and devalued those humanities programs.
Starting point is 00:44:26 And I think we're starting to see the results of that. I had a tech CEO tell me recently, I can hire tons of engineers, but I'm having a lot of trouble hiring salespeople. Because no one in the Bay Area has the people skills to be able to go to a company and sell them software. And that is a real missing piece in the economy today are the people with those kind of empathetic human communication skills.
Starting point is 00:44:53 And I think we need to start reorienting the curriculum that we teach to kids around that. The jobs would be healthcare workers, they would be social workers, they would be what else? Well, I think the sort of categories that are sort of most automation-resistant right now are in things like health care. But I actually think that the job-based sort of taxonomy of like, this job is going to be totally safe from automation and this other job is not going to be safe, I think that's
Starting point is 00:45:22 the wrong way to look at it. I think it's about the way that you do that job. Are you doing it in a rote and repetitive way, or are you doing it in a creative and human way? And so that's what the book is trying to guide people to, is even if your job is not being an artist, even if you're an accountant, there are ways to do that job that are more human and less automatable.
Starting point is 00:45:44 And so pushing people toward those parts of their jobs is one of the goals here. But when do computers get the ability to do that? Are they on that road? You're already starting to see AI do some kinds of emotional work, like analyzing emotions. They've implemented bots that are therapy bots for people who need some assistance there. In education, you already see things that are dynamic curricula that are being used by AI or that is using AI to personalize learning for people. And I think those kinds of jobs are something to think about.
Starting point is 00:46:32 But I do think that this job-based framework is not the right one. I think we need to be thinking less about what you're supposed to be doing to prevent, to make yourself future-proof, and more about how you're supposed to be doing it. But there are certain jobs that really have a big red target on them, for sure. Sure, but that's not to say that those jobs are going to go extinct.
Starting point is 00:46:53 Even though the corporate earnings reports, journalists are not doing so well these days, people who are creative and flexible and very human are finding ways to make money off of it so i think that's where the value is going is to the to the things that are sort of deeply human and away from the things that are more mechanistic and and based on sort of productivity and output so whenever you write a book you tip you go in with a set of kind of predetermined theses around what's the book is going to be about.
Starting point is 00:47:25 There's some thoughts on, okay, I want to write about X, Y, and Z. And then you do the research, you write the book, and typically you discover a couple things that sort of change your view. What did you come out, after writing this book,
Starting point is 00:47:34 what changed in your view of automation and robotics? So I started off as basically an optimist about AI and automation. And I still am, although I've tempered it somewhat. I now call myself a sub-optimist because I think this technology can be amazing if we do it right and thoughtfully. I mean, it could free us from our worst tasks.
Starting point is 00:47:56 It could solve world hunger and climate change and cure diseases. The technology is not the problem. It's the people who are implementing the technology right now that's the problem. And so I think that's what's changed in my view of AI and automation, is that it's not a guarantee that all this is going to improve people's lives. We actually have to work hard to make that happen. And people need to prepare themselves. It's not enough to attend a coding boot camp anymore.
Starting point is 00:48:28 You really have to work on yourself so that you're ready and you're not as replaceable as you might be right now. I'm going to ask you one further question. Speaking of the people responsible for making these decisions about algorithms or automation or cutting costs, there are people
Starting point is 00:48:43 doing this at the top who just want to make more money and they want to cut costs. And the pandemic's been a way to hide a lot of these sins, essentially. Like, why not do this? I've heard from, like, three or four people who have gotten laid off recently
Starting point is 00:48:55 who had jobs like that, that they're cutting: we're saving costs now since the pandemic. But I think it's just a feint, because this is what they wanted to do in the first place. But one of the things you do write about, and I love your comment at the end of this, you write a lot about misinformation. And you've sort of looked at the past year or two,
Starting point is 00:49:11 and these hearings are coming up on Capitol Hill around the impact of social media networks on the attack on the Capitol. What is your assessment right now, given you're a sub-optimist, but I think in this case you may not be an optimist, but I don't know. Yeah, I think the misinformation conversation is related to the automation conversation. I think a lot of what we talk about when we talk about automation is kind of external automation, like robots coming into factories
Starting point is 00:49:41 and stuff like that, but there's a kind of internal automation that's happening to a lot of people that is really worrisome, and I include myself in that. Every day we wake up and we look at our phones and we are just fed algorithmic feeds and machine-generated recommendations, and our phones and our devices are telling us what to think about, and misinformation
Starting point is 00:50:00 has always been present, but now it's hyper-present because of these algorithms and the platforms that use them. And so I think it's not enough just to get the right job to future-proof your career; it's also about detaching yourself from some of the technologies that exist to sway your opinion, to persuade you of something that maybe you don't want to be persuaded of, and to ultimately confuse you about who you are. You have to know yourself better than the algorithms know you, or else you're going to be replaced. And how do you look at these companies now?
Starting point is 00:50:39 How they influence us? Well, I think they exert tremendous influence, and I think they don't want to admit it, but they do. That's another thing that I think I've become much more skeptical on: I think that AI is never going to make for
Starting point is 00:50:56 an effective editing function. What we've seen at Facebook, they've been promising for years we're going to automate content moderation. AI is going to take over, it's going to filter out all the hate speech and all the misinformation.
Starting point is 00:51:11 And I think what they've realized, belatedly, is that humans are just better at that. And so you have to bring in a bunch of people and train them and put them to work in content moderation divisions. And that's sort of a mistake of over-automation and under-investment in human potential. On that note, Kevin, your book is called Future Proof.
Starting point is 00:51:33 It's an excellent book. I've read it. Nine Rules for Humans in the Age of Automation. Thank you so much. Let's roll, Kevin. I need the G. Don't roll with him. Figure out a way I can put one of those Amazon watches on Scott and make sure his productivity is up.
Starting point is 00:51:47 And Kara, I should say, you are a big part of the reason that this book exists at all. Because I threatened you. Yeah, because a couple years ago we were talking and I said, I have this idea for this book called Future Proof about how people can protect themselves from AI and automation. And you said, that's a great idea and if you don't write it, I'm going to. And I said,
Starting point is 00:52:07 as Michael Jordan would say... That about sums up Kara Swisher. I was trying to get him to do it. I wasn't going to do it. And as Michael Jordan would say, I took that personally.
Starting point is 00:52:16 Yes, good. She left her handprints on you. I left my handprints. I was trying to scare you because you thought I could. You thought I could and I might. That's right, right?
Starting point is 00:52:24 Good for you. I opened you. That's what I did. So thank you for inspiring me I could. You thought I could and I might. That's right, right? Good for you. I opened you. So thank you for inspiring me to do this book. Yes, through fear and loathing is how I like to do it. And I'm glad it's here. It's a beautiful book. Congrats on the book. Thank you.
Starting point is 00:52:34 Do not hang with Scott. Let me just give you that piece of advice. Don't say that. We're going to double date with Megan and Harry. Okay. Don't insult them. And Oprah. Thank you so much.
Starting point is 00:52:42 Thanks, Kevin. Congrats on the book. We're going to get our dirty handprints on the rest of the show, but we really appreciate it. It's a great book. Thanks so much. And you should all buy it. Okay, Scott, we'll be back after one quick break for wins and fails.
Starting point is 00:53:07 The Capital Ideas Podcast now features a series hosted by Capital Group CEO Mike Gitlin. Through the words and experiences of investment professionals, you'll discover what differentiates their investment approach, what learnings have shifted their career trajectories, and how they find their next great idea. Invest 30 minutes in an episode today. Subscribe wherever you get your podcasts. Published by Capital Client Group, Inc. Do you feel like your leads never lead anywhere and you're making content that no one sees and it takes forever to build a campaign? Well, that's why we built HubSpot. It's an AI-powered customer platform that builds campaigns for you,
Starting point is 00:53:46 tells you which leads are worth knowing, and makes writing blogs, creating videos, and posting on social a breeze. So now, it's easier than ever to be a marketer. Get started at HubSpot.com slash marketers. Okay, Scott, wins and fails. Obviously, your failure to marry Mackenzie Scott was a grave error, despite the fact that you are already married. Well, I have a trivia question for you, Kara. Okay, go ahead.
Starting point is 00:54:12 And this is my win. Okay. Do you know which organization is responsible for three-quarters of all charity money given last year by philanthropists for pandemic relief? I don't know. The U.S. government?
Starting point is 00:54:30 Who is responsible for three quarters? I'll make it easy. What individual is responsible for three quarters of all pandemic-related philanthropic giving last year? Bill Gates. Mackenzie Scott. Oh, there you go. So my win: I think Mackenzie Scott is a wonderful person. I loved, and we talked about this, that rather than taking, and you'll like this, a male approach to giving, where you want your name etched in marble, or hiring a bunch of Yale grads in social justice and RFPs and pretending to understand how their viewpoint should influence education in Newark or Connecticut, she just said, people are hurting,
Starting point is 00:55:06 I have money, and I'm going to push it out. That was a bold gangster move. And she's marrying a science teacher. Good for her. You're in love with someone who's a science teacher. You're worth $30 billion. You're empathetic. You're changing the world. Mackenzie Scott, you deserve all the happiness that is coming to you. I like that. What's your fail? My fail is, so the stimulus package gets through. I think that's a great thing.
Starting point is 00:55:35 I think the stimulus is less bad than the rest, the relief package. But it just struck me that Joe Manchin, the senator from West Virginia, was the only senator. And that is, the Senate is supposed to be a deliberative body. They're supposed to argue, and then they're supposed to compromise with each other based on evidence and argument. And what we have now is essentially red and blue. And they all get into their caucuses and say, all right, Nancy and Chuck or Kevin and whoever else runs the Republican caucus and say, this is how we're voting, yes or no. And it's literally a party line vote. And for once, we had an individual. And I don't know.
Starting point is 00:56:17 Senator Manchin, that is. Well, I don't know much. I don't want to make an assessment on whether his arguments were the right ones. But he is doing it. He's the one senator. He is doing what they're all supposed to do, and that is negotiate and hammer out things and say, okay, maybe the other folks have a point. So let's get together. And it just struck me that right now, as we sit here today, we are one for 100 in terms of deliberation and what the Senate is actually supposed to be doing. That's my fail. I like that. I like that. I think that's excellent. I didn't so much,
Starting point is 00:56:49 I think a fail for me was Kyrsten Sinema doing the weird hands down thing. I thought that was thoughtless. Say more. She was voting against the $15 minimum wage, which is fine. Like she can make her choices if she wants. I do think the Democrats failed in pushing it out so much. They should have done it at 10 and 12. I know it was gradually. It just seemed like an opportunity lost in that one. But I didn't like how she voted. She came out. She put her thumbs down like John McCain did during the health care thing, and then she curtsied.
Starting point is 00:57:18 And I thought, well, you know what? Even if you lose, you're on a losing side. You don't want to be made fun of. I didn't like that very much. I didn't like that. Not at all. Here's what I did like. I did like the vaccinations are going.
Starting point is 00:57:33 I like how it's moving. It feels like there's some moving. And I think it is a failure that schools haven't reopened, and there's only so much time left in the year. So that means kids really won't go back to school until September. But I do think that the vaccinations
Starting point is 00:57:49 and the CDC is releasing important information. I think the Biden administration's done a nice job here. And so there is sort of a feeling of possible hopefulness in the news, which I think was going to happen in some fashion at some point. But you do feel like winter is over. I don't know why.
Starting point is 00:58:10 I just feel like it is. It actually physically is here in Washington, but that's what it feels like. So I feel hopeful. I feel hopeful. That's nice. That's good. That's what I feel like.
Starting point is 00:58:17 In any case, so Scott, thank you so much. This has been a delightful time. I'm very happy you got raised money. I'm serious. I am happy you raised money. You sound happy. For my rabbit hole? You sound happy.
Starting point is 00:58:32 Rat hole. Not a rat hole. My rat hole. That's the expression I used to use when there was so much money in Silicon Valley chasing stupid startups. I don't think these are stupid startups. I think I'm happy that there's going to be
Starting point is 00:58:41 serious startups done going forward. I feel like climate change, whatever you think of Chamath putting the money, it's climate change tech, education tech, health tech. I'm good with all those things. I'm good with all those things, and I think they're important for the planet. Okay, Scott, that's the show. We'll be back Friday for more. Go to nymag.com slash pivot to submit your questions for the Pivot podcast.
Starting point is 00:59:02 We love listener mail. The link is also in our show notes. Scott, read us out. Today's show was produced by Rebecca Sananes. Ernie Indradat engineered this episode. Thanks also to Hanna Rosin and Drew Burrows. Make sure you subscribe to the show on Apple Podcasts, or if you're an Android user,
Starting point is 00:59:17 check us out on Spotify or, frankly, wherever you listen. If you like the show, please recommend it to a friend. Thanks for listening to Pivot from New York Magazine and Vox Media. We'll be back later this week for another breakdown of all things tech and business. Three-quarters of all pandemic-related philanthropic giving last year.
Starting point is 00:59:34 Thank you so much, from the country. Congratulations, Mackenzie Scott. Support for this podcast comes from Anthropic. AI can seem like it's only for a select few. Well, Claude by Anthropic is AI for everyone. The latest model, Claude 3.5 Sonnet, offers groundbreaking intelligence at an everyday price. Claude Sonnet can generate code, help with writing, and reason through hard problems better than any model before. You can discover how Claude can transform your business at anthropic.com slash claude.
Starting point is 01:00:43 I just don't get it. Just wish someone could do the research on it. Can we figure this out? Hey, y'all. I'm Jonquilyn Hill, and I'm hosting a new podcast at Vox called Explain It To Me. Here's how it works. You call our hotline with questions you can't quite answer on your own. We'll investigate and call you back to tell you what we found. We'll bring you the answers you need every Wednesday starting September 18th. So follow Explain It To Me, presented by Klaviyo.
