Pivot - Sam Altman Goes to Microsoft, Elon Goes Thermonuclear, and Guest Dr. Joy Buolamwini
Episode Date: November 21, 2023. Kara shares her latest reporting on Sam Altman and his decision to go to Microsoft, then she and Scott discuss what's next for OpenAI. Plus, Elon Musk threatens a "thermonuclear lawsuit," and X CEO Linda Yaccarino resists calls to resign. Our Friend of Pivot is Dr. Joy Buolamwini, founder of the Algorithmic Justice League, and author of "Unmasking AI: My Mission to Protect What Is Human in a World of Machines." Dr. Joy gives her take on OpenAI and the Altman ouster, and also discusses her mission to root out bias in AI. Follow Joy at @jovialjoy. Follow us on Instagram and Threads at @pivotpodcastofficial. Follow us on TikTok at @pivotpodcast. Send us your questions by calling us at 855-51-PIVOT, or at nymag.com/pivot. Learn more about your ad choices. Visit podcastchoices.com/adchoices
Transcript
Support for Pivot comes from Virgin Atlantic.
Too many of us are so focused on getting to our destination that we forget to embrace the journey.
Well, when you fly Virgin Atlantic, that memorable trip begins right from the moment you check in.
On board, you'll find everything you need to relax, recharge, or carry on working.
Lie-flat private suites, fast Wi-Fi, hours of entertainment, delicious dining, and warm, welcoming service that's designed around you.
Check out virginatlantic.com for your next trip to London. Support also comes from Indeed, with data and a matching engine that helps you find quality candidates fast.
Listeners of this show can get a $75 sponsored job credit to get your jobs more visibility at Indeed.com slash podcast.
Just go to Indeed.com slash podcast right now and say you heard about Indeed on this podcast.
Indeed.com slash podcast.
Terms and conditions apply.
Need to hire?
You need Indeed.
Hi, everyone.
This is Pivot from New York Magazine and the Vox Media Podcast Network.
I'm Kara Swisher.
And I was right.
Scott?
And I'm Scott Galloway.
So let's talk about board governance.
So first off.
Yeah.
First off, when I saw in the script today that we were going to replay some back and forth between you and me.
Yeah.
It was like watching this crime drama I grew up on called Scooby-Doo,
where there was always just a shock at the end that the innkeeper was the one who was the ghost.
And I'm like, whatever this back and forth is between Kara and me, Kara is the hero and she
is right. I was though. Actually, can you just acknowledge that my reporting was stellar?
You were right.
But let me, first off, when I see people on the street, they don't say, oh, you bring the greatest insight around tech, young Kara.
They say, we love the chemistry between you and Kara.
So let me just go meta here and find out why you are so desperate to always be the hero.
Hold on.
I will tell you why, but go ahead.
Hold on.
I grew up on two hours a day.
We were an experiment in Laguna Niguel
on the shores for cable TV.
I grew up on four episodes a day of I Dream of Jeannie
and one episode a day of Scooby-Doo.
Me too.
I am a sexist who doesn't trust people close to me.
And this is what I think happens to you.
You'd walk into the garage and your mom would be
talking to Jeffrey. And like moms, you know, most moms who treat their boys with, Jeff would say,
mom, can you get me another six pack? And she'd go get it. And he'd be like, I'm thinking about
going to med school. And she'd say something along the lines of, well, whatever you want,
because mommy loves you and you are wonderful. And then you would walk into the garage and she'd say something like, you look fat in that.
Or what are you wearing?
You look like a lesbian.
And the need, I just want you to know.
Here's the need, Scott.
You matter.
We love you.
You matter.
We love you.
How about I was wrong?
How about that might come out of your mouth?
I get it wrong all the time.
I constantly say I get it wrong all the time.
Here's the deal.
I was listening to it, and you mansplained me on something.
And you were playing it over and over and over.
No, only because I had to deal with that online.
This is the closest thing to porn you watch.
You just listen to that clip.
Stop talking.
I had to deal with similar reactions from all men all weekend on accurate reporting, and it was always like, I'm mad, or I have to be right, or whatever. I just was a good reporter, and that is irritating, to be mansplained on a topic I know more about. That's all. I don't think it's my mom.
I wasn't mansplaining. I was giving my view.
Yes, but anyways, against someone who would actually spend a lot of time talking to the actual principals.
Well, okay, but here's...
You're right.
I agree with you.
Let's agree that you and Sam Altman are in the worst gay club in the Castro having sex
in the bathroom, and I don't even know where the club is.
So, Kara, fill us in.
I did not talk to Sam.
I did not talk to Sam.
We will talk about...
I think that's bullshit.
I think you guys have been texting, like, the two nice ones out of Mean Girls. I talk to everybody on all sides of this,
and that's why the reporting was so accurate. Yeah. So tell us what's going on. I want to
acknowledge I have no idea what's going on here. Accuracy. You don't respect, one,
you don't respect my qualities, and two, and yes, I would like you to respect my qualities.
I talk about your hair all the time, and Amanda.
You have great kids. No, you don't. I don't want to talk about my
hair. Anyway, we're going to talk about
Sam Altman going to Microsoft and what happens.
It's actually a really big story. You matter.
A big win for Microsoft. You matter, Kara.
You know what? Just I'm wrong would be nice. You matter.
Just a little. I made a mistake.
Anyway, plus Elon Musk threatens
to file a thermonuclear war lawsuit
against Scott Galloway, I hope.
No, but ad execs urge Linda Yaccarino to leave X.
That is certainly true.
Plus, we'll talk to someone I really like who's going to hand it to you, too.
Joy Buolamwini about the future of artificial intelligence and her new book, Unmasking AI: My Mission to Protect What Is Human in a World of Machines.
She's fantastic.
You're going to really like her.
Great.
I'm bringing in troops to talk to you.
All right.
Scott, let's waste no time and get straight to our big story.
Microsoft has a new employee, Sam Altman.
I think he's probably also the heir apparent to the CEO title at this moment in time.
After a chaotic weekend with reports that Altman might return to OpenAI, Microsoft CEO Satya Nadella, who I actually think is the real winner here,
announced overnight on Sunday that Altman would be joining Microsoft. He'll lead a new advanced
AI team along with former OpenAI president Greg Brockman. As Altman wrote on X, the mission
continues if he chooses to accept it. OpenAI, meanwhile, announced a new interim CEO, Emmett
Shear, the former CEO of Twitch, that poor guy. I was reporting all night on this and broke quite
a few stories, including that 505 of the 770 employees of OpenAI, a number that is probably higher now,
signed an open letter to the board calling them idiots. First, let's talk about the move for Sam,
Satya Nadella, and then what the employees did. So he will reportedly have the CEO title of this
new Microsoft team, which will be made up of the OpenAI people, because they've all been
guaranteed jobs according to their letter. Sam has been raising money for a new chip venture
in recent weeks. We talked about that last week. And he was looking for money for a hardware device that he was working on with Jony
Ive and others.
So first Sam, then Satya.
Satya really pulled this off.
I think he just got this company for free no matter what happens.
If this board resigns and Sam and Greg Brockman go back to OpenAI, Microsoft will have more
power, probably a board seat, very different company, and a big stake in it. If Sam stays there, they've hollowed out OpenAI and
will have just eliminated a competitor. I don't know, Scott, what do you think?
Well, again, as someone who's not talking to these people, this is just speculation.
Okay.
But it does feel as if, and we still don't know what went down here, really. What was
the event that triggered all this? And there's going to be a lot of long-form articles, I bet,
in the next couple of weeks around what the board was worried about, whether there's a there, there,
the AI, it becomes sentient, the risks, the dangers. There's going to be articles on what
was everyone freaked out about? What was the danger they were supposedly concerned about? But this is, you said five of
the 700 employees have basically- 500. 500.
Yeah. I mean, this feels like so far, this has been a really elegant way to take $90 billion
in value and a leadership position in the most seminal technology of the last decade
and destroy it. It's so strange because you can imagine over the weekend,
Sam talked to everyone from Tim Cook and Mark Zuckerberg to Marc Andreessen.
Doubt that one, but he definitely didn't talk to Elon Musk.
Well, big VCs. Several big VCs who called him and said, I can get you a check by close of business
for a billion dollars to start a new thing. Whatever valuation you want, we're here. We're
with you, Sam. We think you're a genius. And Mark Zuckerberg and Andy Jassy, or probably even Bezos
got involved saying, come over here. Because wherever he went, he was going to need to sit on top of massive amounts of processing power.
And then my guess is the way it played out was that Satya said, your optionality is better here.
You know us, we're the devil you know.
We've already integrated all your stuff into our products.
You can kind of hit the ground running.
We got the computing power.
That, I think, was very important.
We got the processing power.
There's also probably a bit of a – my guess is Satya said to him, look, we're a huge investor in OpenAI. And if you were to go somewhere else and start
a competitor, we would have a fiduciary obligation to try and get in the way of that with any means
possible. And I doubt he did that. He's a very soft touch guy, I think, in this way. I think
he just saw his opportunity, took it such an adult. You know him better than I do. But Microsoft
could not let, let me put it this way, Microsoft did not want to invest $10 billion and have all of this lead to Sam Altman going somewhere else and starting a competitor to them.
I was surprised because when I was talking to people last night, I noted during my reporting that Satya really wanted him there and it would be a big get for him.
And then, you know, I was talking to a lot of people and they were like, we think he's going to do a startup, you know, that he's going to do this.
And he doesn't like being in a big company, et cetera, et cetera, et cetera.
Several people suggested he got a promise of being the next CEO.
I have not been able to confirm that for sure.
But he sure looks like an heir apparent to me.
I think he, I'm surprised he did it.
I thought he might do the startup route, you know.
But I think it's like, why should I rebuild this again when I've got everything I want here? And also, I'm not
sure he thought it would precipitate such an employee uproar, right? And that gives him even
more power. A lot of really irritating tech bros last night were like, Sam overplayed his hand.
And I'm like, I don't think so. I don't think he did. You know what I mean? He did it. Seems like the board overplayed there.
Yeah. Yeah. We'll get to that in a second. But I think Satya's the bigger winner here of anything.
He managed to finesse something that was very difficult because this guy had a lot of choices.
So it shows, you know, he walked with his feet and so do all these employees. I think that was
a worry that the employees wouldn't want to work for Microsoft. He must have been promised great
independence, which Microsoft's actually been very good at when they buy things. So this is a victory
for Microsoft. They just bought a company for free. I don't know what else to say. You know
what I mean? Like it's, and they don't, by the way, people think they had to keep their investment in
OpenAI, but they haven't given all the money over. They've given some of it, and it's dependent on the partnership. I want to stress this. They don't have to keep giving them the investment money; it's not like OpenAI got a big old $10 billion check. It depends on delivery of things. And if there's no employees there to deliver it, they don't have to. So they didn't lose that investment, you know, and we'll see what happens now with this board. So let's talk
about this board. They, you know, these employees are joining, the employees voted with their feet
or voting with their feet. And then they had this whole thing on Twitter about people being the most
important things, which is true because the talent walks out the door. I mean, the asset walks out the door every day. So they want the current board to resign,
two independent directors hired, and possibly Will Hurd, a former presidential
candidate and congressman who was on the board previously. They had Reid Hoffman's name,
another former board member, but he has another company, so probably that wouldn't work.
They want Sam and Greg reinstated.
That could still happen. And Microsoft would still be in a catbird position because they would be,
they would probably get a board seat. The board has a lot of choices here. The letter says the
board undermined our mission and company. Your conduct has made it clear that you do not have
the competence to oversee OpenAI. Shockingly, Ilya Sutskever,
who I named as one of the people who led the board rebellion right at the beginning on Friday,
he signed the letter and posted on X that he deeply regrets his participation in the board's
action. So after saying they had good reason, but never were transparent about any of it,
he's sorry and wants to, and he wants to join Microsoft, I guess.
Mira Murati is now the former interim CEO.
They replaced her on Sunday
after dicking Altman around all weekend
with a guy who used to run Twitch.
Perfectly nice guy, but I was like, that guy?
Okay.
They replaced this woman who was a very well-regarded CTO
who was trying to get Sam and Greg back. Obviously, she wasn't down with what the board had done.
Let's talk about the board. I am going to play this clip because I want to talk about board
governance. I agree with you that board governance is important, except when they're
stupid boards. But let's play our Friday night emergency episode where we had a back and forth
about this. Maybe they're just a bad board.
Maybe they're an inexperienced board.
Maybe they made the wrong decision.
This press release feels stupid to me, unless... the way they phrased it left them open to all kinds of problems.
And we have bad referees in the Premier League.
And on the whole, it's the way to govern the game.
Of course.
But what if it's just a bad board and they just had a fight and one side beat the other?
You know, that doesn't make it good governance at all.
I'd have to push back on this for my reporting.
So let's talk about this decision
because we both agree that good governance is a good thing,
but this was a bad board.
Well, first off, it'll be interesting to know
if there really is nothing there
or if there's something tangible they were worried about.
Why would the guy leading it suddenly apologize?
One wonders.
Money?
And he realized.
No, he just got it.
I mean, if 500, the thing the board clearly vastly underestimated was you're supposed to, I mean, typically when you, the first thought I have as a CEO when we fire someone senior is, how is their team going to
respond? And what generally you find is after you let them go, that the team sees it as like
accretive to morale. They're like, yeah, we didn't know what took you so long.
In this instance, they clearly just didn't do the math because if 500 people the next day are
willing to walk out or support, and the next wrinkle here will be he was, I can't believe he didn't have a non-compete,
but I guess it doesn't apply if he was fired.
But the 500 people, I would doubt, can just walk over to Microsoft.
I don't think they can do that.
But anyways, that'll come out, I imagine, over the next few days or a few weeks.
They can do it.
Yes, they can.
Well, they don't have non-competes? Oh, they're not going to sue. These people
have absolutely no leg to stand on. There will be no good luck. They're going to sue them all now
to add insult to injury? Non-competes? Not in California, my friend. Nope.
Yeah, but 500 people just walking out the door. I'd be curious what their employment
contracts say. But anyways. Come on.
You're the one that's against those, but go ahead.
I am, but they exist.
You're right.
They're not as enforceable in California.
Anyways, the question here is, okay, so could Microsoft end up acquiring a severely diminished OpenAI?
Like what kind of what is the next thing here?
But the board, the biggest error was them not recognizing the possibility that the entire team might want to walk out the door.
But at the core, the core fissure here is not the board and Sam.
The core fissure here, the reason why I think this all kind of blew up, is trying to mix
business models. This whole, it's a nonprofit, it's a for-profit, we have this fucked up weird
Byzantine corporate structure where the investor's return is capped at 100x. I mean,
none of it, it was such an unnatural act. And it just doesn't work. It's like, okay,
are we in the business of for-profit or are we a not-for-profit? And I feel like those two
things just didn't, the collision here created a weird governance structure. But
you're right. The board, I have no, I was writing down yesterday, I've been on 16 boards,
seven public companies, seven private companies, two non-profit. I have never seen anything like
this. It's almost as if the board is the ultimate useful idiots for Microsoft in the sense that
they said, let's take $90 billion and transfer most of it to Microsoft. And you have a
series of employees. And here's the thing. If you've been with OpenAI longer than, say, 18 or
24 months, you are about to sell some shares in a private market transaction where you are going to
get, at probably a fairly young age, $3, $8, $15 million. And you can't help but start thinking, I'd really like to buy a house
in Noe Valley and then pay off my mom's mortgage and pay off my student loans. And now that this
has been snatched away from them in a move where they think the board just had their heads up their
ass, you got to think the employees there are really angry because I don't think that $90
billion transaction is going to go through now. Or maybe they just liked Sam Altman.
Like, you know what I mean?
Like this is, of course there's money involved,
of course everything else.
But in this case, the board did not read the room,
did not understand, including Ilya, who led this.
He absolutely led this.
Now he says, maybe there weren't problems.
They never, I called them and tried to get specifics
out of these people and the people close to them.
Well,
it was a misalignment. Okay, what? What did he do like that? And now this Ilya is saying, there was no malfeasance, essentially, right? I'm so sorry for what I did.
The guy who's the new CEO, the interim CEO, said it clearly: I need to look into this board fuck-up,
because this sounds like a fuck
up to me. He said he checked on the reasoning for getting rid of Sam, which he says was not
specific disagreement over safety. They won't say what the issue is, right? So everyone's waiting
for this shoe to drop, like, oh, it turns out he's like an alien from, I don't know, outer space or
something. But they don't, they're all backtracking on this.
And I'm sorry, it's not just because of money.
It's because this board is incompetent
and they had no thing.
By the way, they didn't hire, if you can believe this,
and you've been on boards, I have not.
They didn't hire outside counsel.
They didn't hire crisis PR.
They thought it was all being done.
They had no pulse of the employee base, which is backing
Sam. They just are. He has the goodwill of not just all of Silicon Valley, but the employees.
And this other guy who was, let me just say, this Ilya is a legendary technologist and this stuff is,
a lot of the technology is due to him. He did not have the backing of the employees, right? And so he had to flip. And it's not because of money. He just didn't have the room. And maybe they can hire
an independent investigator, but this board has got to go, right? I just-
This board, yeah. No, well, one thing's clear. This board doesn't survive this.
What could they do? I mean-
Well, I thought they, I read over the weekend, I kind of got sick of watching the drama over the weekend, but I heard in the interim, they hired
this other guy without telling anybody. You know, everybody was like, what? Like they had been
delaying him for no good reason. And then they replaced this woman who everybody likes of the
company, who was a very compelling and competent executive. Maybe not the one, you know, she just, it was like a battlefield promotion for her.
But, so they replaced the woman
who'd been there for years,
who was their CTO,
with some guy who ran Twitch.
I mean, can you imagine this company?
Like, and then she, of course,
was backing the return of Brockman and Altman.
But I can't believe the chairman didn't call.
The chairman was the one who left.
He was Greg Brockman.
So they acted without the chairman there.
Just to make the world,
put the world back in balance,
Greg Brockman should not go to Microsoft.
He should be the local anchor
for the local news station.
It's Greg Brockman and a good evening to you.
Isn't that a news anchor name?
Greg Brockman?
Anyways, but somebody should have,
some representative of the board, a senior person with some gravitas, had, in my opinion, I don't know if it's a fiduciary obligation, but an obligation just to call their largest investor and key
partner, Satya Nadella, and say, this is what we're thinking and what we've decided, and get
his input. And my guess is Satya would have said,
let's game theory this out, folks, before you actually make the decision. But my guess is,
I was thinking about it last night, and I even started drawing down the different decision trees and scenario planning. I'm like, I think this probably ends up with Microsoft buying OpenAI.
Yeah, maybe. Yeah. And then it becomes the division or something. I mean, it'll be,
someone who's really close, not on the Sam side, just a person who knows everything that's going on there, told me they wondered who was the ringleader of this thing. Absolutely. There's two members,
a woman named Helen Toner, who works at Georgetown and is an activist of some sort. You know,
she's an academic activist also around this alignment, a lot of effective altruism. There's Tasha McCauley, who's a techie who happens to be married to Joseph Gordon-Levitt, who played Travis Kalanick in that Super Pumped. They're very tight from what I understand.
And then the sort of the cipher is Adam D'Angelo, who's a former, one of the Facebook sort of early
people. He may be called the founder of Facebook. I'm not sure. Nice guy. I find him hard to talk
to. I don't know what, I found a lot of those people hard to talk to. I don't know where he is. He seems to have backed that too. And they were going to bring in to the board, in this deal that was being discussed over the weekend, Bret Taylor and Will Hurd, who had previously been on the board.
Many names: Sheryl Sandberg's was raised, Marissa Mayer's. They were going to bring in kind of an
all-star kind of board. And then that got rejected by this board. So, it's three people,
or four, I guess, and one just defected. This board is no more. But again, it just all kind of comes back to this notion of-
How do they get rid of the board?
I don't understand.
That I've not been able to understand.
Well, typically in the investment documents, people have certain rights to certain board
members.
But when 500 of your 700 people, when basically your whole firm is saying they're going to
walk out the door, somebody on the board is going to go, okay, we need to reconstitute the board. This is not played out. It just feels like,
you just get the sense one way or another, it feels like Sam is going to be running this
company again. Either there's a division of Microsoft or the board's going to be reconstituted with Sam and Greg back. The current state of affairs at OpenAI is unsustainable.
They can't just say, well, we'll let some people leave and hope it dies down.
If 500 people overnight, it sounds like basically the entire workforce.
Oh, that was just at 5.
I broke that story.
That was just at 5:55 a.m. California time. So I'm sure it's 700 at this
point or something. So I think I wouldn't be surprised if pretty much the entire board is
reconstituted or swept out. And if they try to go it alone out of ego or some weirdness,
and they actually have the control to hold onto these board seats, which they may,
then ultimately this thing becomes, I don't want to call it hobbled or weak, because they do have
some assets. They have 100 million people who signed up and are paying or whatever it is.
You know, it's a big brand.
It's got a lot of momentum.
But the board is going to—
They have no friends in Silicon Valley, I can tell you that.
Is that right?
Elon Musk is their only friend at this point, I would suspect.
I wouldn't be surprised if Elon's saying, go buy the company and come out.
People told me he's called in.
They've all called in.
Yeah, they all see opportunity here.
Sure.
But it just feels like one way or the other, Microsoft, it's kind of like, okay, who's on top right now? Who's eventually going to be in charge? The board has basically stuck a gun in their mouth and said, we are irresponsible. You can't trust us. We may pull the trigger, we may not, but you're going to take the gun away from us and just get us out of the building. Yeah, it's interesting how they're going to do that. I don't know, because from what I understand,
these two of them are very adamant.
Yeah, but at some point, they're going to go, all right, we're really going to just take this
thing down in a blaze of glory and have 50 people a day walk out the door. Someone's going to call
them and say, you need to declare
defeat and leave. Yeah, now that Ilya's gone and is doing sort of mea culpas, you know,
it was interesting because I got a lot of like, why aren't you listening to what Ilya says? Why
aren't you, you know, his concerns and this and that? I said, did you see what he just did?
Obviously, his concerns aren't enough to have him not apologize, which
then Sam Altman heart-emojied his apology, which is like, the whole thing is so freaking,
it's all on, you know, both threads and Twitter. It's really interesting. This is all being played
out in public, which is really fascinating. You wouldn't ever see this anywhere else.
And it has put the spotlight on the split
in the AI world between those who see it as a business opportunity and are probably too optimistic
and others worried about the dangers who have good points and are too pessimistic. There is a
middle ground of mitigation of dangers and also seeing the opportunities. I think it's a false
dichotomy that it's all disaster. And I think
it's saying it's all great, too. And you should have concerns. I do think, to be fair to Ilya,
I think he does have concerns. But at the same time, I think he's a little religious on the
topic. And if you look at some profiles of him, he's got a God complex, too, you know?
Well, look, I've said this at speaking gigs. I have one deck that I do for about two or three
months, and I switch it up. And right now, it's called the AI Optimist. I think all this
catastrophizing is just narcissism that, you know, I invented the ultimate weapon here. Look at me.
When I was writing this out over the weekend, a lot of people, a lot of stories said this was a
failed coup by the board. And the way I see it effectively is it was the successful accidental
coup of Sam Altman, because what happened was he was clearly doing something that upset the board or was behaving in a way that upset the board, upset them enough that they made what looks like an incredibly rash and, quite frankly, stupid decision.
Not planned at all.
But he clearly was running kind of without their permission or whatever you want to call it.
And then they attempted to take back power.
And the generals or the soldiers, it reminded me of the coup or the attempted coup in Turkey with Erdogan. And he basically, you know, they put him out of office.
They're about to kill him or exile him.
And all of a sudden, the military flipped their minds.
And now he was back even stronger.
This feels like, quite frankly, the accidental successful coup of Sam Altman,
because I think what's going to happen- You're comparing him to Erdogan. Okay, go ahead.
Is there a better one you could find? Okay, I don't know. He's coming back.
Who's an athlete who was more powerful than the coach?
That authoritarian brute. Go ahead.
I mean, here's the bottom line. When you're a Phil Jackson of the Chicago Bulls, your job isn't to be Michael Jordan's boss. Your job is to get along with Michael Jordan.
And that kind of, quite frankly, the board's job here was to get along with Sam Altman because clearly he controlled the company.
But he's effectively pulled off the accidental successful coup. Sam Altman is, it strikes me that if you play this out, all roads kind of lead to at some point,
I don't know if it's in six hours, six days or six months, Sam Altman's going to be in control
of this company again. Yeah. Oh, 100%. It's really interesting. You know what I think they missed?
And look, if they had real, as I said on our emergency pod when we taped on Friday, if they had real problems or scandal.
Let's play a clip again where Kara's the hero.
I'm not the hero.
I just want you to respect my reporting skills.
I am actually, I think, the finest tech reporter in Silicon Valley.
So you might respect me for who I am.
Listen to you.
I say you're a great professor.
I am. Kara, you matter. We care about you. I don't say we care about you, but you are ignoring
my reporting. You have value, Kara. That is not what I need. I want you not to mansplain to me
when I know better. That's all I'm saying. I defer to you on marketing things. Tell us what
happened in the bathroom. What drugs are you doing? No, no. I'm outside.
They won't even let me in the club.
I acknowledge I have no idea what's going on here.
No, you don't even try to get in the club.
Don't even start with me.
Did you pick up a phone?
You could have picked up a phone.
Oh, okay.
Hold on.
Let's play this out.
Hi, it's Scott Galloway for Sam Altman.
Yeah, that gets through.
That gets through.
Satya, I call him Satty.
Can you have him call me back from Bellevue or whatever he's doing, like watching Orcas or whatever substantive, thoughtful thing he does on weekends?
Yeah.
Yeah, that was, yeah, a lot of people calling the dog back.
Yeah, that's right.
They might.
If you tried a little bit, if you were a little more charming.
When these people call me, I don't want to meet with them.
I'm like, I'm going to like you, and then I will start shitposting you. I have no desire to meet
any of these people. I don't, the wonderful thing about not being a journalist is I don't need to
be balanced or fair. All right. I'd like you to now, okay. You've been on boards though. So,
so really what would you do if you had made this incredible error, not realizing the goodwill Sam
Altman had, not just with Silicon Valley power people,
but the employees, what would you do? Very easy. I'd get all the board in a room and go-
There's three of them now. They're down to three.
Okay. I'd get all three of them together and go, okay, we are fiduciaries. A fiduciary means you
represent other people's interests. Who are we fiduciaries for? We're fiduciaries for the world.
We're fiduciaries for our employees. We're fiduciaries for our shareholders. We're
fiduciaries for the local community. We have fucked up here. We're probably, if we become
a shadow of ourselves, we can't save the world nor help it. So that's not going to,
our employees are in open revolt and don't want to work here. Our shareholders are
fucking furious at us. So let's just be, let's just call this what it is. We screwed up. So how
do we take chicken shit and turn it into just rancid chicken? How do we make the best of a
bad situation? And there's only one thing here. We need to call Satya, maybe Sam, but Sam's probably pretty hot and clearly we don't
have a functioning communication channel with him. But we need to call Satya and say, we fucked up.
We want to do what's right for the company. These are our concerns. Let's reconstitute the board.
We are all going to resign. We'd like some input on the new board and for them to understand our
concerns. And then best of luck to you. Don't hit your ass on the way out the door.
That's what they should do right now
is they should try to hold on to as much value
as possible, try to minimize the disruption,
minimize the damage and be good soldiers,
and recognize they're fiduciaries and they fucked up.
They need to acknowledge it.
Call Microsoft, try and make the transition back or forward as least damaging as possible, admit they screwed up, express their
concerns, maybe even have some input on the new board members who will share their concerns.
They get no input. What's going to happen to that interim, the second interim CEO?
Oh, I trust he put into his contract a really generous severance. He's going to be there. I just
don't think he's going to be there very long. What do you think? Again, I'm heckling from the
cheap seats here. I don't know. I think either Sam will be back with Microsoft with a substantive
control position, or they'll just all go over to Microsoft if this board holds out any longer.
They need to stop.
They've done a bad job and they need to do something else.
They need to resign.
They need to resign.
They need to resign.
That's the next thing.
And then we'll see.
You didn't even ask me
how F1 in Vegas was.
You care so little about me.
I'm not going to ask it.
All right, Scott,
let's go on a quick break.
When we come back,
Elon is threatening
a thermonuclear lawsuit
and we'll talk about
the future of AI
with friend of Pivot, Dr. Joy Buolamwini.
Fox Creative.
This is advertiser content from Zelle.
When you picture an online scammer, what do you see?
For the longest time, we have these images of somebody sitting,
crouched over their computer with a hoodie on,
just kind of typing away in the middle of the night.
And honestly, that's not what it is anymore.
That's Ian Mitchell, a banker turned fraud fighter.
These days, online scams look more like crime syndicates than individual con artists.
And they're making bank.
Last year, scammers made off with more than $10 billion.
It's mind-blowing to see the kind of infrastructure that's been built to facilitate
scamming at scale. There are hundreds, if not thousands, of scam centers all around the world.
These are very savvy business people. These are organized criminal rings. And so once we
understand the magnitude of this problem,
we can protect people better. One challenge that fraud fighters like Ian face is that scam victims
sometimes feel too ashamed to discuss what happened to them. But Ian says one of our best
defenses is simple. We need to talk to each other. We need to have those awkward conversations
around what do you do if you have text messages you don't recognize?
What do you do if you start getting asked to send information that's more sensitive?
Even my own father fell victim to a, thank goodness, a smaller dollar scam,
but he fell victim and we have these conversations all the time.
So we are all at risk and we all need to work together to protect each other.
Learn more about how to protect yourself at vox.com slash zelle. And when using digital
payment platforms, remember to only send money to people you know and trust.
Thumbtack presents the ins and outs of caring for your home.
Out. Procrastination. Putting it off, kicking the can down the road. In. Plans and
guides that make it easy to get home projects done. Out. Carpet in the bathroom. Like, why?
In. Knowing what to do, when to do it, and who to hire. Start caring for your home
with confidence.
Download Thumbtack today.
Scott, we're back.
Elon Musk is threatening to file what he calls a thermonuclear lawsuit against Media Matters
and others for a, quote, fraudulent attack on our company.
Media Matters for America is a group that put out the report showing ads from mainstream
brands on X running alongside posts with pro-Nazi views.
Since then, we've seen the advertiser exodus on X continue, with the European Union halting advertising. Everyone's halting advertising.
As usual, it's hard to tell if Elon is just threatening or will actually do something.
Many people pointed out that he could e-file over the weekend.
He just likes to be a drama queen.
He pushed back on bogus stories of him being anti-Semitic.
Bill Ackman finally chimed in,
posting that Elon is not an anti-Semite
and he was not perfect,
but the world is a vastly better place because of him.
Because Bill Ackman cannot punch up ever,
but he's still beating up on college students.
So Ron DeSantis is also an Elon defender of the weekend,
refusing to condemn him and saying Elon believes in America.
Yeah, look, this is a guy accusing people who have a problem or say his comments
are anti-Semitic or pulling their advertising. He's threatening them with a thermonuclear lawsuit
and accusing them of fraud. This is the guy who, in an earnings call,
said there'd be a million autonomous Tesla taxis on the road by 2020.
But if somebody decides to pull their advertising or someone says, when you endorse replacement theory, where you had people marching in Virginia saying, Jews will not replace us,
or you call George Soros a Jewish supervillain, and people say, we don't want to advertise with you. You start threatening lawsuits and wrapping yourself.
The most ridiculous one is he had a call to action saying,
these people want to suppress your free speech.
If I would imagine Walmart or I don't know,
the National Association of Churches listening to our show
is going to find my humor offensive, profane, and decide
we probably shouldn't advertise there.
Are they suppressing our free speech?
They are.
Our free penis speech, for sure.
According to Elon Musk, speech is not only advertising, it's the decision not to advertise.
That's free speech as well.
So, like, none of this—
I think we should threaten thermonuclear lawsuits against people who don't like dick jokes.
Against anyone who doesn't advertise against us.
Against people who don't like dick jokes.
A lot of people advertise.
Stop advertising with us.
That doesn't mean they're suppressing free speech.
I find—I will say this.
I just want to give a shout-out to IBM, who put out a press release and said his statements were, quote, unacceptable.
Even the Biden administration has weighed in here and said we find these—
They like to do that.
They rushed to the typewriter for that one, you know.
But I'm encouraged.
I don't think the most powerful man in the world that has gone red pill
and has this absolutely enormous platform can start saying these, well, he can.
We're not denying his right to say them, but we're also saying people can criticize him for it,
call him out for it, and also decide they don't want to have the IBM logo next to tweets about replacement theory.
Well, you know, the thing is, Ben Shapiro explained to us this weekend that he was misunderstood.
He's more nuanced.
And then actually, Ackman backed that. I mean, you like Ackman. Can you call your boy and say,
hey, dude? Hold on, hold on. So unlike you and Sam, Bill and I have never shared the same bathroom.
I've never met the guy, Kara. All of a sudden, you've decided that he and I are good friends. Well, you like him. So I just want to comment on the Ben Shapiro thing.
I watch Ben Shapiro.
I think Ben Shapiro, I do not agree with his political views.
I think he's an enormous intellect.
I do think he's an incredibly impressive blue flame thinker around his issues.
And he has been a strong voice for Israel.
He has been unafraid.
He has.
He just went to fight with Candace Owens.
He called out Candace for what he called faux intellectualism around the issue.
I don't know if you've seen the YouTube of him.
It's a Cambridge Speaks where he goes and he argues against students.
He was, in my opinion, outstanding.
But here's the problem.
All of these guys, when it comes to someone who's promoting a far right, either a far right platform or has gone red pill or offers the
prospects to invest and make a ton of money with him, they find that his anti-Semitism is with a
small A and he's just deeply misunderstood. No, having a lot of money,
having a platform that aligns with your political values or being potentially a lucrative client for your investment banking
or money management fees does not make your anti-Semitism any less fucking abhorrent.
And these guys, what I wanted to say to Ben was, I think he's shown incredible leadership
and courage around Israel.
And he vastly diminishes it and goes, but this guy who's taking his platform hard right
gets a hall pass because I like what he's doing over here at Twitter. But this guy,
when Ackman says, look, he's one of us, so I cut him some slack. Well, you can't, boss. That's not
how this works. That's what it means to be principled is you're not afraid. You got to be somewhat even-handed here. And what Elon has done,
there's a higher bar for Elon. People, can you imagine, we had Meredith-
Levien.
Meredith Levien. Can you imagine her doing anything like this?
No, she wouldn't.
Can you imagine Barry Diller? Can you imagine-
No, they're adults.
The, anyone, Steve Swartz at Hearst, can you imagine anyone who said, I have a large platform here, a large, even Zuckerberg, he toyed with some of this stuff when he was in his 20s and he stepped back from this kind of behavior.
He did.
He did indeed.
But here we have the most, I would argue, probably the most powerful man in the world endorsing replacement theory.
He's misunderstood, Scott. Don't you understand? Honestly.
To their credit, a lot of brands haven't gone quiet. They haven't slithered away. They have said,
this is unacceptable.
Speaking of people who have also backed him, Linda Yaccarino continues to do so.
She's been contacted by a number of ad execs. I've heard from those ad execs who are talking to her,
questioning why she's risking her reputation,
urging her to step down.
I have heard, they've called me, telling me they told her,
and she said she's going down with the ship, I guess,
some version of that.
She's also brought in her son to X.
She's been tasked with outreach
to Republican digital advertising firms,
according to Semafor.
I don't know what to say.
She's not quitting.
Boys, you can call her all you want. NBC cut advertising, by the way, where she used to work. She's a big girl. She's got to make her own decisions. She's, you know, it's easy for us to be
generous with other people's careers. She's just going to do what she's going to do. I don't.
What would you advise her? She called you up and said, Scott.
Well, I said this last week. My test is the following: when my kids talk about me, what's the story? One story is that she reined in the bad behavior of the CEO and turned it into a profitable company. Okay, that's one story. That's a good story. Or the other story is she was brought in, she had incredible economic
opportunities. She was a baller. She was a female CEO in a world where there weren't a lot of female
CEOs in tech. And when the most powerful and wealthy man in the world, who was her boss,
started saying bigoted, bigoted anti-Semitic things that tore at the fabric of America.
She decided she couldn't handle that, and she left.
Like, what's a better story for your grandkids?
What's a better legacy?
Linda, listen to Scott.
Listen to Scott.
Another wrinkle here, as we said, the White House denounced Elon,
but he still works for the government.
They're dependent on him in space.
The government
agreed to $1.2 billion worth of SpaceX launches next year to put crucial Pentagon assets into
space. They can't go back on that. NASA is using SpaceX for several contracts. In September,
the Pentagon agreed to pay tens of millions of dollars to StarShield, a new military-specific
version of Starlink. This guy is benefiting from the U.S. government even as he trashes it,
which is always a delight.
There are two things that are incredibly obvious to me over the last couple of years.
One, if you look at how anti-Biden and anti-Israel people under the age of 25 are, and where they get all of their media, it is incredibly painfully obvious to me that TikTok and the algorithms have been weaponized by the CCP,
and all of us are so narcissistic and believe that America is so smart that we can't even imagine
what is clearly being done to us. And two, that Elon Musk will go down in history
as cementing the age-old proverb, piece of advice, adage, knowledge that power corrupts.
And when you let one person aggregate this much wealth
and this much power, where they can decide, I might just turn on communications in Gaza,
despite the fact that our military has decided that, no, Hamas does not need this tool to reorganize.
I'm going to turn off and on battlefield technologies. I'm going to, this guy, quite frankly,
people just shouldn't have this much power
in the private sector.
What will we do with this much power?
I don't know.
I want to be rich and anonymous.
Oh, no, you don't.
You always say that and then it's just not true.
And you're all over the globe.
I agree with you.
I think, boy, good luck with these lawsuits.
I mean, I feel like I'm probably on the receiving end.
You're not going to go to court.
He's just trying to bully people.
You know, one of the things with ADL, though, I'm very disappointed in John Greenblatt in that.
Elon made one comment that he wasn't for decolonization, which is like saying I'm not for taking oxygen from everybody.
And he said, good job, Elon.
I think he's heading him when he gets tiny victories out of him.
But Elon continues to attack ADL as the real problem.
And, you know, you cannot, you have to stand firm against this stuff.
How can you be?
He was threatened with a lawsuit.
He didn't want to get sued.
ADL would have gotten killed.
And even if it was a
nuisance lawsuit, they would have gotten killed. How can you be anti the Anti-Defamation League?
So you're for defamation? Yeah, that's Elon's joke. They should be called the Defamation League.
Look at their history. Look how reserved and thoughtful Jonathan Greenblatt is.
He is. I just think he's a little cheap. The ADL is a gift to humanity. Their heart is in the right place. They are trying to stop people.
They are trying to stop-
Not everybody likes them, just saying.
An action that ultimately leads to violence. And that's who he goes after?
Right.
That's who he's going to try and sue out of existence, the Anti-Defamation League?
Well, it worked. They definitely backed off because of fear.
So I don't think Media Matters is going to back off.
Power corrupts.
This guy has too much power.
I don't think Media Matters is backing off.
But we'll see.
We'll see.
Many people don't.
Anyway, let's bring in our friend of Pivot.
Dr. Joy Buolamwini is an MIT researcher, the founder of the Algorithmic Justice League,
and author of her latest book, Unmasking AI: My Mission to Protect What Is Human in a World of
Machines. I know her very well. Welcome, Joy. It's good to see you. Or should I call you Dr. Buolamwini?
I like Dr. Joy. It's a mix of authority and familiarity.
Then that's what you'll get.
Okay.
So we're going to get to this book shortly.
You couldn't be a more perfect guest to have today.
In my experience with you, you've been someone who is, you know, very interested in the good
parts of AI and also very clear and early, one of the earliest about the dangers of it.
You have a lot to say about mitigation and figuring out what to do.
And I would love to get your take.
I'm going to talk about your book on AI, but I'd love to get your take on what's happened at OpenAI.
You were on stage with Sam at an event a little over a week ago talking about the future of AI.
So talk about what you think about this thing. Now it's changed. All the employees have said
they want the board gone and him returned, or they'll go over to Microsoft to work, where he is taking a job.
The biggest lesson I take from what we saw happen at OpenAI
is what we've been saying for a while.
We cannot trust companies when it comes to AI governance.
And we have to get internal governance right, right,
before we're even thinking about global governance.
I'm also really concerned with the consolidation of power.
So it's one thing to say, OK, there's an off ramp for Sam over at Microsoft.
But also, what does it mean for Microsoft to concentrate this much talent within one company? I am hopeful that OpenAI can
now become open, which is what it set out to do in the first place. And so if this leads to a
pathway with more transparency with some of the models that they've created that have been quite influential. I do think that would be an
overall net positive for the ecosystem if, and there's so much speculation, I can only say this
is speculation, if it is in fact true that because of differences in views with AI safety, openness,
transparency, this is what led to the current situation. So I'm a bit concerned with
the consolidation of power and somewhat hopeful, right, with potential for OpenAI to actually be
open in the way it was set out to be. Well, here's the deal. I think this board did it badly. I think
there may have been other things at work here. They were never specific about anything. Although, as you know, I wrote that it was a misalignment.
I think they probably overstated that and then didn't give any reasons and then created
this opening for Microsoft to control this narrative completely, which was amazing.
The people that were so concerned about consolidation and dangers have handed the keys over to a
very big commercial company.
It seems like it to me.
I mean, I feel like I'm watching a movie sometimes, you know, you blink for a moment
and something else changes. What do you make of the employees wanting to leave? Because,
you know, he obviously has the room. Sam Altman has the room if they're going to do that. They
did not go with Ilya, who is the chief scientist, although he now is on the side of Sam
Altman after having led the charge saying there were problems. So what do you make of that?
It's hard to tell without being an insider. Just from my own experiences with different organizations,
leadership makes such a huge difference in terms of where employees want to work. And so if he held the vision that
they believed in, and that vision now no longer seems to sit within OpenAI, the mass exodus
would be anticipated. But again, I don't have enough of an insider view with the internal
dynamics of the company to really comment. So let me talk about the products that OpenAI
launched or was
developing, not just OpenAI, all of them. You told Rolling Stone this weekend that AI leaders,
quote, cannot ignore growing issues around consent, compensation, creative rights,
biometric rights, and civil rights. This is a concern I have also, as you know, especially
around consent and IP, essentially, and of course, biometric rights, which is one of your specialties.
Talk a little bit about that. What's your biggest concern right now?
I think we will continue to see the type of litigation that's come up against Stability AI,
also OpenAI, when it comes to the use of copyrighted works. I am a new member of the
Authors Guild, also new member of the National Association for Voice Actors. It's a thing. I recorded the audio book and I model from time to time and they also have associations, right? And when we see AI systems that aren't fine-tuned on what is the best of humanity, it tends
to have a bit of a plastic kind of appearance.
Or even when you read the text that's generated, it's really getting that cream of the crop
content from writers, you know, from artists that takes AI to that next level and gives
it these powerful capabilities.
And so this is why I support the four Cs really of creative rights, which is first consent,
because so much of the data is taken without any permission, but also compensation. We're going to need to figure out, is it data residuals?
What kind of model allows artists who choose to, right, to get some sort of compensation for their work
and other creatives.
I also think about what does control look like?
What does agency look like?
Because maybe you're Grimes and you say, use my voice,
we'll split it 50-50, we'll figure it out.
Maybe you're Dolly Parton, you say, when I die, it's done.
It's me, I gave what I had.
And so truly thinking through that agency.
And I do think that credit to artists when we talk about AI systems and AI capabilities could be
more pronounced. Nice to meet you, doctor. So if people agree that you should sort of own your
digital twin and be asked for consent and then receive
compensation: if your data is being crawled right now as we speak as a creator, what do you think
effectively happens to try and achieve that power to force some sort of consent or compensation?
How does that play out? I think it has to be litigation and regulation. And so,
for example, we saw with Meta, formerly Facebook, they reached a $650 million settlement for
allegations of violating BIPA, the Biometric Information Privacy Act of Illinois. And they
actually deleted over a billion face prints. So it is possible, but it has to come with
a concerted effort and a push there. So I don't think it's going to come from inside the companies,
but I do think it is possible. And do you worry that the fissure here appears,
at least on the face of it, that the sort of mixing nonprofit
and for-profit models created tension or unhealthy tension in the company. And I agree with you,
it would be nice to have more transparency, but isn't it likely if a lot of the power or the
momentum of the leading AI company goes to Microsoft? Microsoft is a for-profit company.
They are very good at hiding any number when they want to.
They don't even have to break out numbers by division should they not choose to.
Isn't it likely that if, in fact, Microsoft seizes more power, or not seizes, acquires,
usurps more power in the AI space, that it'll actually be less transparent? I'm not sure I would say that's the case because I look at the release of Lama 2 from Meta,
and some will argue it's different levels of openness.
But I do think other companies might respond with releasing more open models. And so I don't only see Microsoft as the sole player here,
which is why I don't,
I see their move to consolidate power,
but I think the ecosystem is going to respond as well.
And that's where I think there is an opening for more open models, but we'll see.
So let's talk about this book and your role
in the Algorithmic
Justice League, which is a great name. How did rooting out bias in AI become a mission for you?
Obviously, you were working on facial recognition. That's what you got well known for. Talk a little
bit about why you're focused on this. I guess it's just the latest landscape, correct?
Yes. So my experience actually came from being an artist. I was a student at MIT and I was working on an installation.
And in that process, I was using face tracking technology to do an interactive piece.
And long story short, the system didn't really track my face that well until I literally
put on a white mask.
And so it was that white mask exploration that led to the cover of the book,
Unmasking AI, but also led to deeper questions. Are machines neutral? And at that time,
especially with the deep learning revolution that was happening, I was reading about so many AI
breakthroughs. So I was curious why my personal experiences didn't seem to be adding up to the literature I was reading in the
computer vision space, machine learning, AI more broadly. And so then as I dug further, it went
beyond facial recognition. We start thinking about other areas in which we're using data-powered AI
tools. And so we start thinking about healthcare.
We start thinking about hiring, employment, housing.
And that is why I realized I had to continue this work
because AI is touching so many aspects
of our day-to-day lives.
Yeah, you use the phrase coded gaze in your research
into AI bias.
What does that mean?
Explain that.
The mask makes
sense. Like it can't see you if you don't have a white face, essentially.
Yeah. So I think about notions of the male gaze or the white gaze coming from various scholars,
which is to say, who has power? Who gets to decide what is viewed as worthy? What gets the spotlight?
And also, who has the power to shape the priorities of the technologies we see? So like men have had power and white people have had power,
hence the male gaze and the white gaze. Here, I'm putting those cousin concepts into the notion
of who has power when it comes to shaping AI. And who would that be? Let me guess.
Who do you think? It's the pale males I often talk about, you know.
Yeah. But one of the things that's interesting is a lot of women and women of color have been
very early, not just to the warnings, because I don't find them quite as doom-scrolling as others
in the space, have been very early to these warnings, at least, or the need for mitigation.
I think that's probably the best way to put it. Yes. I mean, some of my earlier
reflections, I saw the work of people like Dr. Safiya Noble, who wrote Algorithms of Oppression,
Virginia Eubanks with Automating Inequality. And I do think there's this notion of outsider within.
So when you're in a space where you're not necessarily centered, it can be easier to see
some of the cracks in the system. I think of the work of Dr. Latanya Sweeney, for example.
She was trying to prove to a reporter in a conversation that search engines couldn't be biased.
And so she put in her name.
And when she put in her name, ads suggesting that she had an arrest record came up.
And this was around 2012-ish.
And it was that personal experience that then led to the greater research exploration. And my own journey mirrors hers. So I do think sometimes when you're an outsider, I wasn't even attempting to create something like the Algorithmic Justice League. I was working on an artist installation.
At MIT. And so it's not a situation where I was actively seeking this, but my life experiences brought me into contact with some of these issues that those who are more often centered might not even see.
Okay, pale male, it's your turn.
So there's a lot of catastrophizing and there's a lot of people who are optimists.
And now that we're almost a year in with respect to the launch of some of the consumer applications, whether it's GPT or Claude.
And so we've seen kind of the commercial applications of some of these things.
And we do have some experience under our belt.
As an expert in the area, where have you come down on the continuum?
Based on what you see empirically so far, are you more of an optimist, or do you lean more with the catastrophists? Or are you somewhere else?
I am somewhere else, right? So between fear and fascination, between hype and doom, I look at this as an opportunity space.
And so my fear with all of the doomerism, right,
is that we don't actually get to experience the benefits
of AI if we shut things down too soon, right?
And then my fear with the hype is that that belief
in the hype leads to decisions that end up being harmful.
So here's an example.
We're excited about ChatGPT.
What can chatbots do?
You have a nonprofit called NEDA, the National Eating Disorders Association.
So I think it was May 25th. The headline is that the company has replaced their call center workers with a chatbot.
They wanted to unionize, management said no, right? All right, the chatbot is online. People with eating disorders start reaching out. Turns
out the chatbot was giving advice known to make eating disorders worse. And so I bring this up
because then they had to shut it down. And this was because of the belief in the hype
of what the systems could do, even though the capabilities weren't actually proving fit to
context. And that happens so often where we see context collapse. The demo looks sweet. All right,
let's adopt it without truly making sure it makes sense with what we're attempting to do.
And what do you like? Give me an example of what you're optimistic about.
Well, I truly believe the release, with AlphaFold, of 200 million protein folding structures is a
huge contribution to science. And I actually start the book as the daughter of an artist and a
scientist, feeding cancer cells in my dad's
lab. Those sorts of protein folding structures you see with AlphaFold, when I
was a little girl, I would see them in my dad's office, and he wanted me to get into chemistry
and computer-aided drug development. And I kind of went a different route. But I do think that offers exciting
potential. I think about companies like Bloomer Tech, where they've noticed this huge gap with
women's health. So, so many AI systems are being trained with other types of pale male data that
doesn't really reflect the rest of society.
And so what's that opportunity to close these data gaps
so we actually create more robust tools?
What AI do you use?
What AI do I use explicitly?
I don't know, I feel like any of these
would be an endorsement.
Yeah, there's nothing I could-
But you use them. You try them all.
I test them out. I mean, actually, I was thinking about when I did my PhD defense,
I actually ended with an illustration of GPT-2 because it was completing text with Islamophobic responses.
And so generally, if I'm using AI systems,
it's with an eye toward testing them for potential risks and harms.
So where is this going to end up?
And you started with the idea of consolidation,
which seems where we're headed.
And obviously, Meta is trying the open route, many others.
OpenAI might become closed AI very soon.
Where do you see it heading?
And what does government need to do to mitigate that?
Yeah, I think we're going to see different types of AI landscapes,
depending on where you are in the world.
I think one thing that was interesting at the recent UK AI Safety
Summit was this call for having only a certain type of company have the access or resources to
large language models. I think that would be a mistake. I think that's closing off these powerful
tools in a way that doesn't allow for the scrutiny and the
harms mitigation that we need. So I think without resistance, we will see more consolidation,
but we've seen that resistance can work. Right. And then when you think about that,
at the same time, with the safety rules, many people are worried that it can only be done by
big companies, correct? The rules they're asking for, the checking, et cetera. I think we can become more imaginative than that.
I think we would be cutting it off a little prematurely to say, we have created these tools
that are harmful. Now we are the ones who are going to provide the tools to mitigate our harms. I don't think we can have such a circular, insular way
of regulating AI.
So the checks have to come from outside.
Okay.
All right.
I love your work, Dr. Joy Buolamwini.
Again, the book is called Unmasking AI:
My Mission to Protect What Is Human
in a World of Machines.
And humans work out pretty good most of the time, some of the time.
Some of the time, for sure.
Some of the time.
Some of the time.
We really appreciate your being here.
Thank you so much for having me.
Thank you, Dr. Buolamwini.
Nice to meet you as well.
One of my favorite pale males.
Oh.
Go on.
Oh, I love you, Joy.
I love you.
Anyway, I'm sure I'll see you soon.
Yes, see you soon.
All right, Scott, isn't she fascinating?
I really like her.
She's one of the good ones.
Super interesting.
She's one of the good ones.
Highly trained.
She's a good friend, just for disclosure, of my ex-wife, Megan Smith, who's helped with the Algorithmic Justice League.
But I think what's good about people like her, and actually Megan too, is they do see the positive.
You know, they're not doom scrollers and they're not, wow, this is the best thing.
They're very thoughtful about it.
So that's what I appreciate.
I don't know.
I think you should have joined us pale males.
And when you refer to your ex-wife, refer to her as that bitch.
I can't do it.
I can't do it.
Although I have to say Megan had the best line of all time.
She was texting me about this whole thing.
And I said, God, they're being such a pain in the seat.
She goes, well, it's boys trying to make a baby, is what AGI is.
She's a big technologist.
And I actually was like, that's exactly what it is.
They're trying to create life.
Anyway, isn't that deep?
Well, she went to MIT too.
Could you get into MIT?
Neither of us could.
No way.
That'd be 100% no.
100% no.
Me, too.
I got rejected from Indiana.
I got rejected from Brown.
I got rejected from Stanford.
Oh, no.
Duke.
I got rejected from Duke, too.
I got waitlisted at UT.
I was waitlisted at Penn.
Wow. Yeah. Oh, my God. I know.
Anyway, speaking of which, we'll be back for some wins and fails.
as a Fizz member
you can look forward to free data
big savings on plans
and having your unused data
roll over to the following month.
Every month.
At Fizz, you always get more for your money.
Terms and conditions for our different programs and policies apply.
Details at Fizz.ca.
Okay, Scott, let's hear some wins and fails.
Can I go first?
Yeah, of course.
I still love The Crown.
It's so good.
I'm sorry I mentioned it the other day.
Again?
The Crown?
I do.
I love it.
It's so good.
Is it a new season or are you just watching old stuff?
No, it's a new season.
It was the first four episodes.
It's sort of the death of Diana.
And now then they'll come back in December for the rest of it.
God, Netflix just, I got to tell you, Netflix is really hitting on all cylinders.
You're doing a good job.
They really are.
I'm trying to figure out where I go most, and I'm going a little more to Apple Plus
because there's a bunch of interesting stuff on there.
Very seldom Hulu, but there's some good stuff on there.
Disney because Frozen.
By the way, speaking of the fail, there's going to be Frozen 3 and 4,
which I was in a store this weekend at Target,
and there was a whole Frozen
center that my daughter ran to, and then she was eaten by it.
Well, my win is the life of Rosalynn Carter. She passed away peacefully with her family by
her side in her home in Plains, Georgia. Classy lady.
This is what Jimmy Carter said about their 77 years together.
Rosalynn was my equal partner in everything I ever accomplished.
She gave me wise guidance and encouragement when I needed it.
As long as Rosalynn was in the world,
I always knew somebody loved and supported me.
I think these guys,
I think it's just such a nice role model.
Amazing people.
And the thing that, in my view,
is a real win about Rosalynn Carter
is that she was a mental health advocate.
Early.
Before it was cool.
When a lot of people didn't think mental health deserved treatment as a disease. And she was instrumental in the
White House. And her husband, the president, said that she was key to a mental health act
that was one of the first of its kind that recognized the importance of destigmatizing it.
So she not only, I think a lot of first ladies advocate for
very worthwhile causes, but she advocated for something that a lot of people had a gag reflex
against, at least initially. And those people played an especially important role. Anyways,
my win is Rosalynn Carter, just a wonderful life of service and love. The Carters just continue to be such a nice role model for
America. Best ex-presidents ever, best presidential couple ever. All right, my fail is this Argentinian
win of this guy, Javier Milei. And I don't, look, the voters voted for him. He's a far-right,
very Trump-like radical. Look, everyone just
needed a change from the Peronists, who have been failing them for many years. So
this guy walked right into that void of bad governments, speaking of bad governments.
But he's really, I think one of the quotes really disturbed me. It's like, well, we know he's crazy,
but he's not going to be able to do everything. And maybe he'll do the things that need doing
that the Peronists wouldn't do. So you're going to put crazy in charge.
My son is there, which is interesting.
He said it's fascinating to watch.
But people were so fed up with previous government,
they voted for a person,
even if a lot of them didn't like him.
So, interesting.
So you know how we talk about Britain,
England just has so many assets,
has so many things going for it.
Incredible culture,
incredible universities, Premier League football, a culture of wit. They've got kind of everything,
and yet they figure out a way to fuck it up, mostly with Brexit. Argentina's been doing that
for 70 years. At the end of World War II, Argentina was the third largest economy in the
world. It's got incredible natural beauty. It's
got resources. It's got an amazing culture of beef and tango. It really was the Paris of South
America and their ability to absolutely put in power the wrong people who have taken an
unbelievable culture and country and resources and just fucked it over and over and over.
The Argentinian people are just the most admirable people.
It's just such an incredible culture, and yet they've consistently managed to snatch
defeat from the jaws of victory with all this populism.
You can't even, the peso is basically a failed currency at this point.
Yeah, he's going to dollarize everything, which actually is a bad idea. It's probably a good idea, but it has its own risks.
I don't know how to position this as a fail. It's just an observation. I was watching Meet the Press
and they have that data download. And it's not only anti-Israel content, but anti-Biden:
younger people are so down on how Biden has handled the conflict in the Middle East.
And I'm just trying to figure out why this is, because when I look at President Biden in the
White House, and I've said I don't want the president to run again, if you want to talk
about how this becomes World War III, it's easy. It's that Hamas gets their way and inspires a multi-front war
from other Arab nations who also are anti-Israel. And he immediately deployed two carrier strike
groups that are sitting off the coast and are keeping the peace, or let me put it this way,
ring-fencing this or cauterizing it from becoming a regional conflict. He has been steadfast. He has been consistent. He has been strong. And I'm absolutely
flummoxed at how America, much less such a disproportionate number of young people,
exactly what do they expect the guy to do at this point? I think that so far,
the president has put on a masterclass in how to be there for an ally while also saying, look, the current status
quo will not survive on either side here, neither Hamas nor the way Israel has approached this
situation, but at the same time has been a great ally, realizes that Israel is the Western outpost.
And Western outpost means rule of law, democracy, women's rights, civil rights, jury
trials. My hope is they'll come around when they see Trump, who is even, who speaking-
God, I hope so, Kara, but I just can't- They can't possibly, they can't-
I just can't fucking, I just literally, and I realize some of it's, well, maybe you're the
one that doesn't get it, Scott, but- No, this guy won though, right? I don't
think they're going for Trump to solve that problem because he's even more, we're going to kick out
people who are for Hamas. That was one of his things. We're going to, like, Biden doesn't want
to do that. We're going to kick out anybody who's, you know, any immigrant who's for Hamas and who
protests for Hamas, or not Hamas, but for the Palestinians. Anyway, I think you're right.
It's really disturbing.
Anyways, I don't know what a fail is,
because some of it may be, Scott,
you're the one that's wrong here,
and it's young people who understand
what's really going on.
But I am absolutely flummoxed at how America,
specifically young Americans,
wouldn't look at how the White House has handled this
and come up with any better solutions
than how the White House has approached this.
Yeah, well, we'll have to see.
That was, yeah, it's a really interesting time.
We'll see.
We've got a year till the election, so things can change.
Anyway, we want to hear from you.
Send us your questions about business, tech, or whatever's on your mind.
Go to nymag.com slash pivot to submit a question for the show or call 855-51-PIVOT.
Scott, that's the show.
Today's show was produced by Lara Naiman, Zoe Marcus, and Taylor Griffin.
Ernie and Andrew Todd engineered this episode. Thanks also to Drew Brosniel,
Saverio, and Gaby McMahon. Make sure you subscribe to the show wherever you listen to podcasts.
Thank you for listening to Pivot from New York Magazine and Vox Media. We'll be back later this
week for another breakdown of all things tech and business. Oh my God, it's the innkeeper who's
pretending to be the ghost. And what do you know, Kara was right.
I'm Velma.