The Prof G Pod with Scott Galloway - Algorithms and the Threats to Democracy
Episode Date: September 17, 2020. Yaël Eisenstat, a visiting fellow at Cornell Tech's Digital Life Initiative and former Elections Integrity Head for Political Ads at Facebook, joins Scott to discuss the online threats to democracy and the damaging role Facebook plays in our elections. Yaël has also served as a CIA officer and White House advisor. Follow her on Twitter, @YaelEisenstat. Scott opens with his thoughts on TikTok's powerful algorithm and why he does not think the deal with Oracle will go through. Algebra of Happiness: There's no such thing as quality time. There's just time. Learn more about your ad choices. Visit podcastchoices.com/adchoices
Transcript
Episode 27, the Trinity of Trinities or 3Cube, the 27 Club, don't want admittance there.
Famous people who've died at 27 include Kurt Cobain, Amy Winehouse, and Jimi Hendrix.
Twenty-seven is also the number of letters in the Spanish alphabet. There will be no needle on my podcast, bitches.
Go, go, go! Welcome to the 27th episode of the Prof G Show.
In today's episode, we speak with Yael Eisenstat.
Yael is a visiting fellow at Cornell Tech's Digital Life Initiative.
True story, I taught at Cornell Tech for a semester when they had a floor on the Google campus,
and I was fairly underwhelmed.
Cornell had this enormous opportunity: incredible positioning, the third university, Roosevelt Island, support of the mayor, a lot of funding; a lot of Cornell alumni stepped up. What went wrong? They held on to the same model of kind of lame tenured professors, and I think the faculty there is fairly underwhelming. I'm sure I'm going to catch shit for that. And I find that their tech offering is somewhat anemic and that they are not commanding the space they occupy. Maybe that'll get their competitive juices flowing. I was actually very excited about Cornell Tech and think they have underwhelmed. Today's episode, we're talking to Yael. She works on technology's effects on civil discourse
and democracy. She previously served as the elections integrity head for political ads
at Facebook. Okay, get this: elections integrity, Facebook. Not words you find in the same sentence often. She's also a former CIA officer and White House advisor. We discussed the damaging role Facebook plays in our elections and the online threats to our democracy. She also has worked at Exxon. In sum, her Wikipedia profile should just say: total badass. She's also having a bit
of a moment. She gave a TED Talk that went viral. And it's just in general, a very thoughtful person.
And something I love about her is that while I don't know her exact political leaning,
she comes across as a raging moderate to me.
Anyone who works at Exxon is likely not going to the woke spot to get the progressive pedicure, if you will.
Anyways, the big news, the big news, ByteDance denied Microsoft's bid for TikTok, which leaves Oracle as the winner. Well, not quite. Oracle would serve as TikTok's trusted technology
provider, which means ByteDance is not actually selling TikTok to a US company,
and therefore holds the reins on the algorithm or continues to control the algorithm. Microsoft's
bid was rejected because it would have taken over this powerful algorithm had the deal gone
through. What was Microsoft's biggest mistake? Simple. The same mistake that almost everybody in this country has made, and that is they took the president at his word and
thought that ByteDance was going to actually have to sell. And Microsoft proposed actually taking
over the company, taking over the algorithm, putting their security in place, having their
engineers dictate the algorithm or control the algorithm. Microsoft has had some success with
the consumer companies. They have the cash flow, they have the security. They seem to me to be the likely acquirer.
But it appears that holding fundraisers for the president is in fact the deciding factor. And that
is that Larry Ellison and his CEO are two of the few that have come out of the closet as Trump supporters. My bet is there are a lot more Fortune 500 CEOs who are going to go into the voting booth and vote red, because I think they mostly vote with their pocketbook: whoever's going to put more money in my pocket. They don't have a lot of faith in government and are kind of closeted Trumpers, if you will.
Anyways, these two, to their credit, I guess, are fairly out and proud about their support of Trump.
And what do you know?
The Sequoia and General Atlantic-backed
ByteDance figures out a way to not sell. This is, what is this? This is another example of how China has usurped global leadership from the US. We've had 10 years pulled forward in 10 weeks,
and the new geopolitical leader is in fact China. We've been played. This is similar to the trade
war. The intention here was good. China can't expect to have free rein over our markets with their technology companies while meticulously and deliberately kicking all of our technology apps out of mainland China, and still expect free rein
over here. However, going about it as a series of one-offs based on the president's id or personal
biases or who is throwing fundraisers for him seems, I don't know,
seems like we've become fucking Russia. I mean, this is just totally out of control. What happens
when China turns around and says, you know, we'd like to crash your markets and we've decided all
supply chain facilities from Apple have to turn over to Huawei within 45 days? Could that not
spark a major sell-off in the NASDAQ and potentially spark,
I don't know, a market crash? What happens when India, Brazil, Canada, Indonesia say,
you know, Facebook, your second largest market is in Indonesia. We'd like your hosting to be
done with a local provider, or we are going to force a sale within 45 days. This isn't even a sale.
What happened here? What happened here? Error number one, Microsoft took the president's word.
Error two, this was legally unenforceable. You were going to have to get Google and Apple to pull the TikTok app off of their app stores, which would have caused a legal battle, because neither Apple nor Google wants to be forced into taking certain apps off in certain countries. This was never legally enforceable. So it's likely that the legal
advisors whispering in Trump's ear said, hey, boss, we recognize you think you're in a reality
show where you wake up and deploy this ridiculously bad business judgment thinking you're going to be
the hero at the end. But the reality is, legally, you're up shit creek without a paddle. So maybe if your buddy Ellison comes in and turns it into basically an investment and gets the cloud business, and they seem happy, you can recover from being way too far out over your skis, declare victory, and leave. ByteDance seems happy, although the bottom line is,
I don't think this deal closes. I think they're going to wait out the clock, beat the clock, and then there'll be a Biden administration, based on all the polls I'm looking at. And we're going to see if, in fact, ByteDance ends up closing this ridiculous transaction.
Another example of how we have passed the baton of global leadership to the Chinese.
Another example of how governing by id, it just doesn't work.
Yeah, it makes sense that China should not have free rein in our markets without any sort of
reciprocity, but it has to be policies. It has to be things that are enforceable. It has to be
certain standards and protocols that companies know the rules that they're playing by so they
can make appropriate investments. Can you imagine how pissed off Microsoft is, as evidenced by their press release,
basically saying, yeah,
we were gonna actually have security here.
We were gonna actually do what you wanted
or you said you wanted to happen.
So good luck with that over at Oracle.
But what's the insight here?
When the dog puts his nose in the air
and he smells something in the air
and he goes, something's up, something's up.
I smell a bear, right? I smell a bear. Or is it that great chicken dinner that mom's making? I
don't know where I got that great chicken dinner. Anyways, dogs are very intuitive. What is the
intuition here? What is the insight? I spent some time on TikTok, and a little bit of backstory: my son turned 13. And it literally feels as if yesterday I dropped him off at preschool. And
today I came home and he was surfing and was a seventh grader who rolls his eyes and won't kiss
me any longer. But that's another story. That's another story. So along the lines of time just
flying by, I decided to check out TikTok I'd never been on. And I went on it last Friday and I lifted
my head and it was Monday. This shit is unbelievably addictive. We're talking MDMA, heroin kind of addiction. And I was thinking, okay, it got me thinking. It got the dog thinking. And that is,
and my head's cocked. I'm thinking like when a dog walks into a room and doesn't know
why it's in that room. And I'm like, what is so powerful about TikTok?
And what does it mean for the rest?
What's the learning?
What it comes down to for me is signal liquidity.
Signal liquidity, trademark, hashtag, all rights registered to Prof G.
Signal liquidity.
And that is the example I always think of is Netflix.
And that is, if I'm watching season three, episode four of House of Cards, and I watch it all the way through, the AI on the back end of Netflix goes: we're so confident that Scott's going to like season three, episode five that we'll begin playing it in three, two, one, without asking him to find his remote, click yes, what have you. That, for me, is kind of how AI has changed my life, if you will. And the signal liquidity is a couple of things. One, I picked, I clicked, I found House of Cards and I watched it all the way through. And I'm sure there are several other signals there. But with TikTok, the signal liquidity is just exponential. For every signal that Netflix gets from me to inform their AI algorithms, TikTok gets many, many more: which videos I watch or swipe past, what I like, what I comment on.
And slowly but surely, I end up with a stream of videos that have
chiropractors adjusting people's necks for some reason.
I find that fascinating, and I do.
And I didn't even know I found that fascinating.
That is what is so fucking scary about TikTok and the algorithm here is it seems to know
what you want before you know what you want.
Anyway, this thing is so good and it calibrates, it takes that signal liquidity and it calibrates
in on content that you find fascinating or enjoyable and you go into a rabbit hole and
you look up and boom, it's two hours later. So I think it comes down to signal liquidity.
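A back-of-the-envelope way to see what Scott means by signal liquidity: the more feedback events a platform collects per session, the faster a recommender can lock onto your tastes. Here is a minimal, hypothetical sketch (the topics, numbers, and the estimator itself are all invented for illustration; this is not TikTok's or Netflix's actual system):

```python
import random

def estimate_preferences(true_pref, signals_per_session, sessions, seed=0):
    """Estimate per-topic preference from noisy like/skip signals.
    More signals per session (higher 'signal liquidity') means more
    observations, so the estimate converges faster."""
    rng = random.Random(seed)
    topics = list(true_pref)
    counts = {t: 0 for t in topics}
    likes = {t: 0 for t in topics}
    for _ in range(sessions * signals_per_session):
        t = rng.choice(topics)                # show a random topic
        liked = rng.random() < true_pref[t]   # noisy engagement signal
        counts[t] += 1
        likes[t] += liked
    return {t: likes[t] / counts[t] if counts[t] else 0.5 for t in topics}

# Hypothetical user: loves chiropractic videos, tolerates cooking, skips politics.
true_pref = {"chiropractic": 0.9, "cooking": 0.5, "politics": 0.2}

# Netflix-style: roughly one completed-watch signal per session.
low = estimate_preferences(true_pref, signals_per_session=1, sessions=30)
# TikTok-style: on the order of a hundred swipe signals per session.
high = estimate_preferences(true_pref, signals_per_session=100, sessions=30)

err = lambda est: sum(abs(est[t] - true_pref[t]) for t in true_pref)
print(f"estimation error, 1 signal/session:    {err(low):.2f}")
print(f"estimation error, 100 signals/session: {err(high):.2f}")
```

With one signal per session, thirty sessions give the model about ten noisy observations per topic; with a hundred signals per session, it gets about a thousand, and the preference estimate sharpens accordingly. That's the gap between a completed-episode signal once an hour and a swipe every few seconds.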
Let's compare and contrast that with another short-form video platform that launched, or was exposed to Americans, around the same time: Quibi. Quibi went for a star mentality and got these famous 60- and 70-year-olds, right? And by the way, a consumer doesn't care that you produced Shrek. The consumer just really doesn't care, or that you ran HP. The consumer doesn't care. Not to say they aren't fantastic executives, but please name a media tech firm that has been
successful where the founders are in their 60s. That is incredibly ageist, and guess what?
Business is ageist, and so is the human brain. Going back to those 27-year-olds who killed
themselves with heroin and were remarkably creative. What happens to the creative brain
after 30? Jesus Christ, U2 hasn't written anything in 15 years. Michael Jackson couldn't slip without spitting out a number one song until about the age of 27. And then he stopped doing anything and for the life of him couldn't get
a hit. But anyways, there's something unique about the young brain. There's something unique
about young entrepreneurs. Back to Quibi, it's not working. Why? One, founders that were too old. Two, not enough signal liquidity.
And three, this old notion of overproduced expensive content.
And what is TikTok?
More signal liquidity sitting on top of free content that is created by users.
And then that algorithm, the genius is that the algorithm begins zeroing in and calibrating on what type of seven-or-eight-out-of-10 production value content you absolutely love, whether it's Labradors on skateboards, versus hoping that you can spend three or five or $7 million on Cesar the Dog Whisperer in a series on dogs that embrace extreme sports. Instead: we're going to find a way to take all of that content and begin to slice it and dice it, with the use of the signal liquidity and this algorithm, to get you to seven-or-eight-out-of-10 content that you love, or that hits receptors for whatever reason, whatever those receptors might be, versus trying to find the 0.01% of content we're going to put money behind, try to get it to an eight or nine, and trust that it tickles those receptors.
Or put another way, the new forward-looking platforms are less about inspiring, high-cost content and more about figuring out a way to get dramatically more content, and then figuring out signal liquidity such that you can get to the seven-or-eight-out-of-10 content that is more relevant to you, versus the nine-or-10-quality content that may or may not appeal to you.
And this blows my fucking mind.
Why?
It's dangerous.
Because with this type of signal liquidity, with this type of algorithm, someone on the other end, there's always a human on the other end of the algorithm.
There's a human on the other end of the benign algorithms of Facebook saying, we don't give a shit about the health of the Commonwealth or teen depression.
We just want the algorithms to figure out a way to get more engagement.
And then the algorithms figure out that the ultimate way to get engagement is enragement.
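That design choice (a human picks the objective, the ranking system just optimizes it) can be sketched in a few lines. This is a toy illustration with invented post names and scores, not Facebook's actual ranking code; the only thing that changes between the two feeds is the objective function the designer hands in:

```python
# Hypothetical posts scored by a model on two axes.
# All names and numbers here are invented for illustration.
posts = [
    {"id": "outrage-take",    "pred_engagement": 0.92, "pred_harm": 0.80},
    {"id": "family-photo",    "pred_engagement": 0.40, "pred_harm": 0.02},
    {"id": "local-news",      "pred_engagement": 0.55, "pred_harm": 0.10},
    {"id": "conspiracy-bait", "pred_engagement": 0.88, "pred_harm": 0.95},
]

def rank(posts, objective):
    """Return post ids sorted best-first under the given objective."""
    return [p["id"] for p in sorted(posts, key=objective, reverse=True)]

# Objective 1: rank purely by predicted engagement,
# the choice made "on the front end of the design".
engagement_only = rank(posts, lambda p: p["pred_engagement"])

# Objective 2: identical predictions, but the designer charges
# a price for predicted harm to users.
harm_penalized = rank(posts, lambda p: p["pred_engagement"] - 1.5 * p["pred_harm"])

print(engagement_only)  # enraging content floats to the top
print(harm_penalized)   # benign content floats to the top
```

Note that both feeds use identical model predictions. "The algorithm" didn't change; the goal did, which is the point about there always being a human on the other end.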
And then when Facebook executives get shit because their enragement and
tearing of the fabric of society and depressing teens is bad for us, they decide to protect the
algorithms and come up with bullshit like, we don't want to be arbiters of truth, or we don't
want to be in the business of determining what's right and what's wrong, such that they can let
the algorithms become the fucking antichrist of technology. But what could happen here? What
could happen here? The same decision-maker on the front end of the design of these algorithms could say, all right, I want Biden to lose.
I see Trump as being more favorable for my interests because he is tearing apart America,
because the pandemic will continue to rage on, because he will likely turn America into a
shit show, virus ravaged, polarized, extremist society that will literally begin to
collapse under its own self-indulgence, weight, narcissism, lies, conspiracy theories. All right,
that's what I want. Now, how could I get the algorithms and TikTok to play a role in that?
Simple. I'm going to start sending you content that appeals to you, that undermines the credibility of the Biden campaign. Now,
it might be humorous videos. It might be videos of Trump rallies that are appealing to me.
It might be videos around the economy. It might be videos undermining or misogynist videos about Kamala Harris or videos that are racist or subtly racist, if there's such a thing as subtle racism,
but slowly but surely start calibrating in on what is the soft tissue around individuals'
biases, what receptors are most open to these signals that undermine the credibility.
Or maybe we don't even go there.
Maybe the signals start immediately telling the algorithm or informing the algorithm,
hey, it's hard to get these people off of Biden.
You're never going to get them to Trump.
I know.
Let's tell the algorithm when it recognizes that to immediately go to discouraging them.
Talking about extremist positions from both, conspiracy theories about both, misinformation
around voting, and let's just suppress the vote.
Let's just get people so fed up, so confused. Let's muddy the water such that there's zero visibility, such that come election day, in the areas that lean Biden, we're going to confuse them and discourage them and suppress the vote. That's
what these algorithms could do with their signal liquidity and with their massive amounts of
content such that they could begin zeroing in and slicing the cheese so finely, so finely that they get the perfect type of cheese because there's more flavors in their ability
to slice it and test it and test it over and over such that we get you to the exact cheese that you
cannot stop eating, my brother, you cheese-eating weirdo. Anyways, TikTok. TikTok, to summarize,
this was not a sale.
This was the Series G financing paid at an inflated valuation that includes a big cloud contract for Oracle.
The president got over his skis legally, was out on a limb here, and is going to declare
victory and leave and move on.
I think there's more likely than not this deal will not close.
And TikTok, TikTok and signal liquidity and algorithms are dangerous,
and we should be concerned. We'll be right back.

When you're starting your small business, while you're so focused on the day-to-day, the personnel, and the finances, marketing is the last thing on your mind. But if customers don't know about you, the rest of it doesn't really matter. Luckily, there's Constant Contact.
Constant Contact's award-winning marketing platform can help your business stand out,
stay top of mind, and see big results.
Sell more, raise more,
and build more genuine relationships with your audience
through a suite of digital marketing tools
made to fast track your growth.
With Constant Contact,
you can get email marketing that helps you create
and send the perfect email to every customer
and create, promote, and manage your events with ease,
all in one place. Get all the automation, integration, and reporting tools that get
your marketing running seamlessly, all backed by Constant Contact's expert live customer support.
Ready, set, grow. Go to constantcontact.ca and start your free trial today. Go to constantcontact.ca
for your free trial. Constantcontact.ca.
Welcome back. Here's our conversation with Yael Eisenstat, a visiting fellow at Cornell Tech's Digital Life Initiative, where she works on technology's effects on civil discourse and democracy. Yael also served as the elections integrity head for political ads at Facebook
back in 2018, and has a really impressive background in the national security sector, including stints
as an advisor to the White House and with the Central Intelligence Agency. Yael, where does
this podcast find you? I am sitting in my apartment in New York City. You have, sorry, I don't want to
say taken the world by storm after 30 years or 20 years of good work. You're sort of an overnight
success. I keep seeing your name everywhere. It was a coup to get you. So first off, let's just start with you were the global head of
elections integrity for political advertising at Facebook. Isn't that an oxymoron?
One could say it is a bit, yes. But when they reached out to me, they offered me that title. I said,
don't hire me if you don't mean it. And so, yeah, that's what they said they were hiring me for.
And did they mean it?
They did not. In my case, they did not mean it, just to be really blunt. I came in with this sort of mandate, according to the recruiters and everyone I spoke to, of creating, building a new team. It was very shortly after the Cambridge Analytica scandal really became public. And I mean, the reality is, I came in to do this, and on the second day, they changed my title and job description. So I guess they didn't really mean it.
Second day.
Yeah, before they can say I screwed up.
So I mean, yeah, second day.
And in a job like that, is your job to actually try and figure out how to make, how to ensure
that there is some integrity, that bad actors haven't weaponized the platform?
Is it really about integrity of the platform as it relates to elections?
Or is it to get more money from political advertisers? Or is it to create a veneer of security? What did you feel like you were,
what did success look like for you in their eyes at Facebook?
It's just, let's talk about political advertising for a second, because as I'm sure
you're aware, I actually don't think political advertising is the biggest problem on the platform. But for this particular role, there were some legitimate integrity efforts that they
were trying, such as let's make sure that Russians can't pay in rubles to buy ads on our platform.
I think for the foreign interference part, it's pretty clear what the mission should be there in
terms of cleaning that up and making sure it doesn't happen again. And it's a lot less politically
risky for a company like Facebook to try to figure out how to not let Russian actors exploit the
platform through political advertising. The trickier question, though, is that I have a much broader lens. I am looking at this in terms of: how are you affecting our democracy? And that includes domestic actors. That includes a whole bunch of things that get much more politically tricky for the company. And in my experience, anyway, there was no appetite
for me to go deeper than the reactionary moment: they were building that ad library, they were putting out new requirements for how to verify political advertisers. These were very sort of reactionary tech responses. They weren't the bigger questions of how are we affecting elections in general and what
can we do to protect against that? And you'd said that advertising wasn't the most dangerous thing about Facebook's role in elections. What is the most dangerous thing? So, I mean, it clearly played an important role,
especially in 2016. And I know that a lot of people, their talking point likes to be,
well, the Russians only spent this much money on ads and therefore it wasn't a big deal.
Let's be really clear, though. They might not have spent a lot of money on ads, but, and some of this stuff is a black box, some of it we'll never know, because there's just no transparency in a company like Facebook, to say they had no impact through ads is not true. I mean, they got to use their sophisticated targeting tools. I would not say it is not important. That said, at this point, you have a platform that is fundamentally
successful because it has succeeded in using our human behavioral data to then try to persuade us,
whether it's to buy Nikes instead of Adidas, whatever it is. Ultimately, it's a persuasion
machine to try to get us to do something, to be on their platform more, to engage more, to look at the ads, maybe not buy, but to click on the ads they want us to click on.
But what does that do for political speech, for how we think about political rhetoric, for how we think about truth versus fiction, how we think about how we even consume information?
These are the things I am much more
concerned about. I really don't care if you show me a Nike ad versus an Adidas ad. I do care about how you are affecting my ability to discern truth from fiction, to even understand the
information environment at all anymore. And yes, bad actors are going to exploit the hell out of that. I mean, I'll just say,
I know you've been asked this, but I'll just say the idea that they can continue to say nobody
could have seen this coming when it comes to what the Russians, for example, did in 2016.
Maybe nobody at Facebook could have, but I guarantee you, people who worked on Soviet Union information operations, propaganda, Cold War stuff could have seen it coming if they had understood how Facebook worked.
So, yes, is it Facebook's fault that there are bad actors out there? No. Is it Facebook's fault that they are more concerned with growing and dominating the entire world's information ecosystem than with figuring out how to not enable those bad actors and provide them tools to disrupt our democracy? Yeah, that's where I put the blame at their feet.
Isn't the danger or one of the dangers or a bigger danger than the advertising itself is that if the algorithms promote content that takes you one
way or the other or upsets you or muddies the water or just discourages you from turning out
and voting, that it's the actual content and the algorithms promotion of certain types of content,
this whole freedom of speech versus freedom of reach. Isn't that the real threat? In my opinion, absolutely. And it completely contradicts whether you want to give the freedom of speech argument or the mirror-to-society argument. I'm not going to give the whole
attention economy speech. You've heard that before, but at the end of the day, this is how
they make money. They make money by keeping us engaged. There are plenty of people who speak about what that means, but in terms of elections and
political speech, it means that the algorithms, they steer us, they steer us towards what content
we view. They steer advertisers towards what content they target us with. They recommend
groups to us. This is not just me going
on Facebook and seeing exactly what all of my friends posted on any given day. And the most
dangerous part in this is really when it comes to things like voter suppression, when it comes to
things like completely destroying the public's trust in our election system to begin with. But to be clear, this isn't
something that just suddenly happened. From day one, their product was built to steer us towards
certain kinds of conversations. And so, I mean, I first got into this five years ago when I was
looking at what was causing the breakdown in civil discourse here.
And that was well before we were talking about whether or not the Russians manipulated the platform for the election. So I just think that there is a fundamentally unhealthy way in the way
this platform has been built, the way it's monetized, the way... I mean, I'm a public
servant at heart. I spent most of my life in the national security world.
I think the idea that the world is a better place if we have frictionless virality, and everybody can immediately, within less than half a second, boost their message out to the entire world, no matter how damaging that message is, and the algorithms can do whatever they want and amplify it and spread it without anyone saying: just slow down for a second. Is this something that's worth being amplified?
I'm not saying take it down, but is it worth being boosted and spread and targeted? That's
the questions that I want to get at. Yeah, it feels like there's a difference
between frictionless and accelerants that tend to
be poured on things that might be damaging for the Commonwealth, right? I don't think people
are arguing that we should shut down anti-vaxxers, but if anti-vaxxers represent X percent of the
population and the algorithm recognizes that it creates a ton of controversy and more engagement
and more ads, should it get 10X the amount of oxygen that it would naturally get on its own, say at a representative cocktail
party, if you had a cocktail party that consisted of the representative population of America,
would you let the anti-vaxxers, the white supremacists stand on a table and just
dominate all the conversations? Isn't it that for some reason, these algorithms have decided to, you know, what's good for advertising or just happen to be the most inflammatory,
damaging thing? Right. So, you know, there's a lot of people who love to use these arguments
about algorithms are neutral or algorithms, you know, or we, some people will even say,
we don't even know what the algorithms are doing, but there's a critical step they're missing there. It's at the outset, you decided to set a goal for your algorithm. You decided to train your
algorithms to ensure maximum engagement on your platform. And the fact that the algorithms have
figured out the way to keep us engaged is to feed us the most, you know, I know some people don't
like the term clickbait, but it is the most clickbaity, you know, salacious. It is human nature to, if you are offered two
things to look at, and one is super exciting and salacious and, oh my gosh, just click here. You
won't believe what you see. That is going to engage more people than some super wonky, like,
according to these three sources, this is what I found about X, Y,
or Z today. And so that's what's happening. And now Facebook is in this, like, whack-a-mole
reactionary responsive stance of, but we're taking this down and we're taking that down,
but they never, ever, ever talk about, okay, maybe the problem here is actually how our algorithms
are delivering content, are connecting people, are recommending groups. I mean, I have lots of
examples that just remain black boxes because we'll never get the answers to them because
there's no transparency around how these things work at that company.
So there are few people who would be more qualified than you to discern whether I am being paranoid or whether I have insight around this issue. I'll outline a scenario; tell me where I land on neurotic, paranoid, or common sense.
If I were working for the GRU and I went to Putin and said, okay, you can spend $4 billion on a new
nuclear class aircraft carrier, or you can give me 500 million. And I'm going to identify the
50,000 most influential people in the US who tend to have what I'll call anti-Russia tendencies,
or a talk track or narrative that's anti-Russian. And I am going to deploy, using content farms and humans and crawlers, an army of people and technology to do a couple
things. One, undermine their credibility. Anytime they bring up Russia, anytime they talk about
Trump, who is perceived as pro-Russian versus an anti-Russian candidate, I am going to weigh in on
Twitter, on Facebook,
and in a thoughtful way say, hey, Scott, love your stuff, but every time you bring up Russia,
you get it wrong. Or whenever there's an opportunity, when you say something provocative,
I'm going to weigh in and try and pick a fight and turn your Twitter feed into a cesspool of anger,
such that people just turn off you, turn off your ideas. I think that would be a
really smart thing to do in terms of an allocation of capital for a foreign government versus
investing in traditional armaments. And as a result, I feel as if there are bad actors on my
platform every hour of every day. Am I paranoid or is that common sense?
Oh my God, I love this question so much. I mean, you're definitely a little paranoid, but it's also common sense.
Doesn't mean I'm wrong.
Right.
Listen, I'm going to move away from my elections hat here and just say I did spend my life in the
national security world. I recommend everybody go look at the video from the KGB defector,
Yuri Bezmenov in the 80s. He actually did an
entire interview. He was a KGB defector and he did an interview about the Soviet Union's
plan to demoralize America. And he completely lays this out, right? It's going to take a
generation or two to just inundate you with so much information that you don't know what to trust,
you don't know what to believe. He kind of lays out what their grand strategy is. What
you're saying, can I say if that's exactly happening or not? No, but think about it.
Russia gets to play a much larger role than they actually have the military capabilities or the
economy to do, because this is such a cheap operation. It actually just requires this true vision of Russia exporting its philosophy to the world, one that does not include tanks, that does not include drones. I mean, technologically, it is maybe not the lowest lift, it is sophisticated, but it's inexpensive
and it's completely consistent with what someone like Vladimir Putin's ultimate goals
have been for a very long time. So I mean, I can't confirm that your paranoia isn't somewhat
misplaced. But you're right, like this is exactly and this is in part what they were doing. And we
had this example. I hope I'm not going to get it wrong, I'm working from memory here, but just a week or two ago, Facebook talked about how Russians were at it again, and they were paying Americans who didn't even know who they were really working for. So it's not as if the Russians aren't going to try anymore.
And what really concerns me is, like, I do think that some cybersecurity experts are great at what they do.
And the FBI is also watching this now.
And Facebook doesn't want to be caught having the Russians overtly manipulate our elections again.
But at the end of the day, you do have
a platform that has tools that can be used in very dangerous ways. And those tools still exist.
And they've never been regulated. They are a free for all. More tech is going to solve it all.
And this is an age old... I mean, I don't want to overly focus on Russia. I think we have just as many bad actors in the US right now. But the Russia question, this is not new. This is an age-old ideological battle, and they are getting a much less expensive way to handle it now.
And what if they bring you back to Facebook and they say, okay, our number one priority, our number one stakeholder, is the health and wellbeing of the commonwealth, not shareholder value. What would you recommend, or what would you have them do?
First of all, despite everything, I have looked at this from every angle possible.
I do not think it should be up to Facebook to fix it.
I think they should fix certain things for sure.
But the idea that this industry, this many years on, is still 100 percent...
I mean, it's not 100% unregulated, that's an exaggeration, but it's largely unregulated. And if you ever talk about some of the ways that we should impose responsibility on these companies, all the free speechers yell that you're trying to curb free speech, and they go into these absolutist, completely binary arguments, which is frustrating. So first of all, I would absolutely define responsibility for this industry. But as for the company itself,
I mean, I would first and foremost say, you have to just change your entire business model.
And I would say, you have to figure out how to monetize your platform without, A, using my human behavioral data against me, and B, without doing anything with my data without asking me first.
I don't want to have to click through 27 things to figure out what my security profile is and what my data is.
No.
You should ask me before you do anything with my data.
So first and foremost, they have to change their business model.
And that's something that they'll never do. Why would they? This is your area more than mine. The markets keep rewarding them. No matter how much I might scream from a rooftop,
they're not breaking the laws and the market keeps rewarding them.
Yeah. It feels as if Netflix hasn't been weaponized because it's subscription. That if you think about it in terms of tobacco, social media is nicotine.
It's addictive, but in and among itself, nicotine doesn't give you cancer.
It's the tobacco.
It's a delivery mechanism.
And in this case, it's the ad-supported business model that is really the kind of the stuff
that gets you sick, right?
That I agree it has to come down to a change in the business model. But doesn't it also have to come down to what I refer to as the algebra of deterrence? And I'll use an extreme example. The Rosenbergs conspired with the Russians. We decided to take a mother and a father and execute them. And at what point,
when social media platforms make so much effort to delay and
obfuscate what is actually going on in their platforms, because it might reflect them in a
bad light, even if it means delaying or counterbalancing bad actors, at what point does
that negligence become criminal? And do you think outside of a total change in the business model,
this is going to get better
unless the deterrence becomes stronger? I'm not talking about a $5 billion fine. I'm talking about a $50 billion fine. I'm talking about a perp walk. Does this ever get better without
substantially increased downsides? So I actually don't think it does. And that's someone who's
somewhat optimistic. I wouldn't keep fighting this hard if I wasn't. I hate that I'm coming to the conclusion that I don't know if it is fixable.
Listen, first and foremost, I do not believe it is fixable under its current leadership.
I do not think it is fixable when no matter how much society, civil rights leaders, academics,
journalists, like I know that some people love to say that because I am
not a computer scientist and I am not a lawyer, what right do I have to think that I should be
part of this debate? Because I am a member of the public. I am someone who spent my whole life
fighting for democracy and I am a consumer of your product. And I fundamentally do not think
that Mark Zuckerberg will ever be persuaded. He made a very intentional
choice to grow at all costs, to scale at all costs, to dominate. He uses the word dominate,
to dominate the world's social media landscape. And I don't know that there's any changing him
because we've made it very clear there's no way to punish him. And part of the reason there's no way to punish him is because we've never actually created laws that apply to these companies.
Our internet laws, I mean, they were written in the 90s. And so I don't think it's fixable without
actual strong regulation. I do think there's a way to do it without having to destroy the whole company. I do think that if we could... I mean, can I give you just an example of a world that I would like to see? Let's use a real example. Let's use this Boogaloo
example from May. We had the situation where there were two men, they met in a Facebook group,
right? And they were part of the Boogaloo group. For anyone who doesn't know, it's the group that's basically advocating for a civil war. So they meet in a Facebook group, they sketch out their plans, according to court documents, I believe using Messenger. Then they go and meet in person for the first time, and they go and they kill the federal security guard in Oakland. They're
exploiting the Black Lives Matter protests and they end up killing a federal officer.
And so there's like three different layers here. The first question, right, is should Facebook bear responsibility for the fact that there are Boogaloo groups and content on their platform?
And maybe not. Like you could argue both sides of that, right? You could argue free speech. You
could... I'm not going to actually weigh in on an answer on that. But then you go to the next level, and it's, okay, well, should Facebook bear responsibility for the fact that these two men met in a Facebook Boogaloo group on their platform? And the only way we could answer that question is if we knew if those two
men actually went online and searched for that group,
Or what if you found out that Facebook's recommendation engine recommended
that group to those two guys? And they met because Facebook's platform actually steered
them towards that group that they weren't looking for to begin with. And you know why you and I will
never know the answer to that question? Because if the widow of the officer who was killed in Oakland decided to try to take Facebook to court over this, I suspect it would be thrown out based on Section 230. We would
never get to the discovery process. We would never be able to find out if those two men were connected
using Facebook tools, as opposed to if this was their intent to begin with. And because we'll
never know, nobody will ever be held accountable.
Why shouldn't we at least get to the point of even being able to find out that piece of information?
What about Twitter?
My sense of Jack Dorsey, despite the nose ring, despite the silent retreats, is, simply put, that he doesn't give a shit.
And I see so much just rage and hate on Twitter.
And I don't know if you listen to the Daily, the New York Times podcast, where they ask,
what are they doing to try and prevent this? And the best they could come up with was
this app they were testing in Canada or this feature that would prompt you to say,
do you really want to send this article? You haven't read it yet. I mean, that's kind of the sum of their efforts so far. And granted, they decided to stop taking
political advertising, but that was a pretty big, pretty easy gift for them because they were making
no money there. My sense is Jack Dorsey and Twitter are just as bad, or their complicity is just as irresponsible. It's just that their negligence, their delay and obfuscation, their lack of regard for the commonwealth doesn't sit on as big a platform, but it's just as toxic.
So it's interesting. I mean, I've heard you talk about Twitter before, and I'm going to start out
with, to be frank, I struggle with this one. I don't have direct, I mean, on the surface,
it definitely looks like Twitter is trying at least harder than Facebook is. And from my own
experiences of trying to push for some of the things we care about and working with people who do,
Twitter is at least more open to trying to figure out and grapple with what have we become and what do we need
to change? Now, still, a lot of it is still this sort of bandaid, whack-a-mole, reactionary fixes.
But it's funny, I listened to that New York Times Daily podcast, and I very, very rarely
tweet about this kind of stuff. but I did a whole Twitter thread
about, come on, Jack, like you couldn't even answer. I thought it was so frustrating.
You couldn't even answer these questions, right? But I do think he's more thoughtful. I think he
is more open to at least admitting that maybe some of his ideas are not perfect. I think some of the
people who work for him seem
pretty dedicated to, at least in the election space, for example, to trying to take stronger
stances, whereas Facebook is immovable. They are never going to change their ideology that more speech is better, that good speech counters bad speech. They're never going to change all of their
ideologies that drive me insane. That said, I don't know enough about Twitter to know if I completely agree with you. I struggle with this one, to be honest.
But let me ask you this. There's being thoughtful and responsive, and then there's speaking in slow,
hushed tones to give you the impression I'm thoughtful. And then when you listen to what I say, I'm not saying or doing anything.
And so let's say, all right, Facebook will, we can expect what we can expect there. Twitter,
less bad. Where do you put Google on the spectrum? So Google is another one where it's a little bit,
I mean, there are lots of things where I would be very strong about how I feel about YouTube especially. I mean, you can just listen to Rabbit Hole if you want to listen to a podcast that goes into it. But I am not as much in the space in terms of Google's dominance in terms of ads. Like, I'm not an antitrust expert. I do
think they have done some work to clean up some of the things like
auto-populating search results. I mean, the fact that you used to be able to start typing "is a Jew" and it would auto-populate all the most horrible things you could possibly imagine. They've
cleaned some of that up. YouTube is what I'm focused on more because it's that same model, right? It's the engagement model.
It's the recommending videos to you.
It's that sort of mirror to society.
And yet they're using information about me that they've tracked about me all over the
internet to try to persuade me to watch something.
And the only way that I am going to stay on and click the next thing is if it is a little more exciting than the thing I watched before it. Otherwise,
why would I click on the next thing? So that part of Google, the YouTube business model, again, that is more my expertise. In terms of the other things about Google, are they too big? I don't have the answers to that. But one more thing about
going back to Twitter, the answer that Jack gave that, you know, I try to give him the benefit of
the doubt because I do know he's made some changes that I think are interesting and they've taken a
stronger stance to be frank on speech, including from our current president than Facebook has. And so at least I
give them credit for that. But in that interview, at the end of the day, he still implied, no, not implied, said pretty frankly, that growth is the solution. More voices will lead to a better society. And if we've come this far, and that is still the unbelievable, end-all-be-all response from Silicon Valley, that it is more voices?
I mean, more voices at the table certainly makes for a more robust democracy.
I'm all for it. But more voices with no guardrails and no rules and not fixing any of the ways your platform has been weaponized to spread hatred and division and all of that is not the solution. And when you look forward, when you, okay, so 2016, the election
interference on Google, Twitter, Facebook versus 2020, is it the same, better or worse outside
interference right now? Outside in terms of foreign interference or any kind of interference?
Yeah, or bad actors, internal, whatever it might be, trying to use these platforms to,
I would say, well, it's interesting, right? If it's citizens trying to influence the outcome,
that's- It's two different things.
That's fine. Yeah, I would say bad actors.
Bad actors. All bad actors.
Okay. So I think we do have a better grasp in terms of the foreign interference angle.
I also am encouraged to see
that there seems to be collaboration between government and the platforms. The recent takedown
was because of an FBI tip to Facebook, according to everything I've seen in the news. So that's
good. I think we have a better handle. I think the platforms don't want to be caught again,
allowing foreign actors to severely intervene. It doesn't
mean it's perfect and there's still a long way to go. At the end of the day, still nobody's ever
been held accountable for any of that. So that's another conversation. But I think we've solved a
lot for the threat of 2016. I don't think we've solved for the threat of 2020. And while foreign
actors are still a very important threat, I think we don't have a grasp of the domestic actors, whether it is people who truly are doing it for profit.
Like some people who are spreading chaos and all of this are doing it for profit or they're doing it for all sorts of reasons.
We haven't really... I mean, look at the coordinated inauthentic behavior that happens on the domestic side on a
platform like Facebook. The fact that they've allowed that as long as they have, because it
was politically difficult. It is a politically difficult decision to tackle domestic bad actors
when you need to stay on the right side of the administration in power. And also think about
if you're a smart company, you know this, if you're a smart company, you want to play both
sides of the fence, because if you have a long game, you need both sides of the fence to not
regulate you, right? You don't know who's going to win next. So, tackling the domestic bad actors, that coordinated inauthentic behavior... And let's just be frank. I know I'll be accused of being a liberal for saying this. Maybe just consider that I'm
saying this because I'm looking at facts. There is a movement on the far right that Facebook is not
tackling strongly enough because it is politically complicated for them to do so.
Because they've cozied up to Trump and said,
we'll leave these folks alone if you leave us alone in terms of... It feels like there's,
from an outsider standpoint, it feels like there's this unholy alliance between Trump
and Facebook. So I cannot confirm if there's any sort of actual overt unholy alliance,
but there's also... Zuckerberg loves to play this both-side-ism thing, right?
Like we're going to put labels on all posts about voting now. That is the most politically
convenient decision because all that's doing is saying, I'm not going to make any judgment on,
on who's being bad here. And at the end of the day, do I think he actually has some sort of
deal with Trump? I'm going to sway the election in your direction. I certainly hope not. You know
what the bigger question is? Why does one man even have the power to be able to sway an entire
election? That's the bigger question. But to get back to your question, I'm rambling now.
Conservatives have been very, very good at mastering this talking
point that there's an anti-conservative bias at Facebook. I personally never saw it. In fact,
the only time I was very surprised by a decision we made about an appeal on certain content,
it definitely went in the conservative way, not the other way.
But they're very good at mastering their talking points.
So there's the public pressure of,
now Zuckerberg can say,
but anything I do, somebody will be unhappy.
But the way he handled, let's just be frank,
the way he handled Trump's posts
about the looting and shooting,
very blatant attempts to lie about voting
procedures.
That's the line that has been crossed that really makes me say, you really don't want
to be on the wrong side of this administration.
You must have your reasons because otherwise you would enforce your policies evenly as
opposed to saying, I enforce my policies against everybody
except for the president. So Yael, what are we missing as we tend to look at the dumpster fire
that is the elections? It's always the stuff you're not watching that tends to jump up and
bite you. What are you worried about? So I'm really worried about what happens after November 3rd.
And you are starting to see some people talk about that. Even Zuckerberg actually made mention of that in his post a week or two
ago. But my biggest concern now, in addition to everything leading up to the election, is just
picture this. Picture that November 3rd hits. And we've already been inundated with every reason not
to trust this election, right? All the chaos that's been spread by whether it's the president or whether it's bad actors,
whether it's foreign actors.
And then we come to a situation where, let's say, for example, more people from one side
voted in person because that is what their party has been really promoting.
And more people on the Biden camp voted by mail because that's a lot of what we've heard
on the left.
And on November 3rd, Trump declares victory.
And he starts declaring victory on Facebook,
on Twitter, on every social media platform.
And exit polls start to show that
and the media starts to talk about it.
And then as the votes start getting counted,
as they start coming in,
slowly, slowly, the numbers start to shift.
And what is he going to do then?
He's going to immediately claim that, see, I told you they would steal the election. And my biggest concern is not
even about the chaos that all of that is going to bring, but we're in a very volatile time with
COVID, with the pandemic, with social justice at the forefront of so much with fires in California. There's so much
volatility. We're so anxious. And, you know, Facebook definitely contributes to a lot of that
by allowing all of the salacious content to constantly be flooding our feeds and by not
ensuring that only true trusted content about elections is flooding our feeds. And he starts
dog whistling to his supporters to get out in the
streets. And I'm just really concerned about, because of all the disinformation that has been
allowed to spread on these platforms about the election, from November 3rd until we actually
have a verified result, that's the most volatile time. And if those platforms do not take very bold
steps, including possibly not allowing candidates
to talk at all about results, like bolder steps than they are used to, and no, they
won't scale globally.
This election is in crisis.
Let's talk about how to protect this election right now.
And if they don't take really big, bold steps after November 3rd, I'm very concerned about
how the platforms are going to be used to really spark
what is already a tinderbox of anxiety and what that might look like. And advice to your 25-year-old
self? Oh, so many things. I think if I were to narrow it down to one, I would advise myself to
seek out mentors, especially female mentors. I grew up in a very, very male dominant world
in the national security world. And I was so tough at the time. And I thought I was tougher if I could do it on my own and not ask for help. And I think seeking out mentors, not
like what's happening right now, where I get 500 LinkedIn requests a day saying, can I pick your brain?
But actually investing in finding someone who inspires you, but who also can help really
give you advice that is not necessarily helping me find a job, that is helping me think about
my goals and my way forward.
I think seeking out a true mentor, especially more women mentors, would be something I'd
tell myself.
A more just world.
Yael Eisenstat is a visiting fellow at Cornell Tech's Digital Life Initiative, where she works on technology's effects on civil discourse and democracy.
She previously served as the elections integrity head for political ads at Facebook, and is a former CIA officer and White House advisor. She joins us from New York. Yael, stay safe. Thank you. It was great chatting with you. We'll be right back after this break.
Hey, it's Scott Galloway. And on our podcast, Pivot, we are bringing you a special series
about the basics of artificial intelligence. We're answering all your questions. What should
you use it for? What tools are right for you? And what privacy issues should you ultimately watch out for? And to help us out,
we are joined by Kylie Robison, the senior AI reporter for The Verge, to give you a primer on
how to integrate AI into your life. So tune into AI Basics, How and When to Use AI, a special series
from Pivot sponsored by AWS, wherever you get your podcasts.
During the checkout process, there are a lot of really complicated things happening that have to go right in order for that sale to go through. Stripe handles the complexity of financial infrastructure,
offering a seamless experience for business owners and their customers. For example,
Stripe can make sure that your customers see their currency and preferred payment method when they
shop. So checking out never feels like a chore. Stripe is a payment and billing platform supporting millions of businesses around the world, including companies like Uber, BMW, and DoorDash.
Stripe has helped countless startups and established companies alike reach their growth targets, make progress on their missions, and reach more customers globally.
The platform offers a suite of specialized features and tools to power businesses of all sizes, like Stripe Billing, which makes it easy to handle subscription-based charges,
invoicing, and all recurring revenue management needs. Learn how Stripe helps companies of all sizes make progress at Stri
...lot about time, actually. That's sort of misleading. I'm always thinking about time. I'm fascinated by it. This notion that time is probably the most important metric in our life, because it has to be static, it has to be trusted, it has to be immovable. Because everything we do, whether it's launching missiles or showing up to have lunch with friends to when we're supposed to work, is largely dictated or predicated based on this immovable, steady, static, entirely credible, valid, trustworthy thing called time, right? Time waits for no man. And it's based on celestial objects,
specifically the sun. I was watching Game of Thrones last night, and Khaleesi says to her Khal, Khal Drogo, you are my sun, my moon, and my stars. To connote you are my everything
is to say you are my celestial object, because these are the anchors for the most trusted metrics in the world, and
that is time. But time is, I believe, malleable. And that is, time is a function of celestial
movement, a year, right? Takes us 365 days to get around that spherical 10 billion trillion ton item
of hot plasma called the sun. And at the same time, it takes, I think, the moon about 24 hours to get
around us. And these things are immovable. What's not immovable is our perception of time. Remember
when your mom told you that we weren't going to go to the movies tonight, we were going to go
tomorrow night, and it felt like that would be years and you are just outraged. You're outraged.
Who do I call? Let me speak to the manager. Mom has put off the movie for 24 hours. Anyways, your perception of time is entirely malleable. And this is advice for parents. Time becomes incredibly porous
and soft with kids. And that is literally yesterday, as I mentioned earlier in the show,
I was dropping my youngest son off at pre-K and feeling very emotional. I remember when I dropped my oldest off at pre-K nine years ago, it was after one of the many school shootings. I can't even remember which one. And when I came in to drop
him off, there was some semblance of an attempt at security. And they had this big kind of wood
metal plated door and they had to buzz you in.
And I remember thinking, God, just how sad, how fucking sad that you have to drop four-year-olds
off at a place where they are thinking about security. But that's not the story. The story is
later that afternoon, I picked up a 13-year-old and took him surfing. And one of the big themes of my work over the course of the
past four months has been that COVID-19 is not as much a change agent as it is an accelerant. And
we talk about that in the context of business and how you make money. I think healthcare is
going to change dramatically. I think 17% of the GDP in the form of healthcare is up for grabs
because telemedicine and remote medicine
and the way we consume and distribute medicine has accelerated 10 years. But what can we take
away personally? And what I would ask of all of us or what I'm trying to do, whether you do it or
not, is I'm trying to imagine that the next 10 years with my kids are going to go even faster.
And the advice I would give to dads is there's no such thing as quality time. There's just time. And if someone were to tell you, okay, you got 10 more years
with your kid, if your kid's eight or nine, and in 10 years, they're going to be at college or
have left the household, but that 10 years was going to be one year. If the rest of the time
you had with your kid, the rest of the time, the total sum of the amount of time you had to spend time with them, to enjoy yourself, to teach them, to love them, to be affectionate with them, to express your paternal and maternal emotions, but you only had 12 months to do it, how would you behave?
How would you prioritize your time?
What would you do?
And then make that your life.
Make that your relationship with your kid.
Because trust me on this.
Trust me on this.
It goes by so fast.
And in addition, for your own selfish measures, wanting to take advantage of that, wanting
to really embrace that relationship, I got to think that the next 30 years are going
to go even faster.
And selfishly, I want my kids near me toward the end.
And I want them to look back on their childhood and think, dad wasn't about quality time. Dad
was just about time. He was there. He was probably there even a little too much. And granted, a lot
of this comes from a privileged decision because I have the ability and the security and the
wherewithal to spend a
lot of time with my kids. And some people just don't have those options, but we all make trade
offs with a certain band of the amount of time we allocate to things, whether it's our friends,
whether it's our work, or whether a lot of the time is just we're in our own heads and not engaged.
Because I'm telling you, brothers and sisters, it is going to go so fast.
You want them at the end to look at you and think, yeah, dad was wrong.
Dad screwed up a lot, but dad was there.
There is no quality time.
There just is time, and it is rocking and rolling and going fast.
Our producers are Caroline Shagrin and Drew Burrows.
If you like what you heard, please follow, download, and subscribe.
Thank you for listening.
We'll catch you next week with another episode of The Prof G Show from Section 4 and the Westwood One Podcast Network.