Front Burner - Former Facebook insider's wake-up call to the "catastrophe" of big tech
Episode Date: April 9, 2019
The Canadian government is considering regulating social media giants like Facebook. This comes after the release of a report by Canada's electronic spy agency showing how Canadians are vulnerable to foreign interference in the upcoming election. Today on Front Burner, Roger McNamee, the author of "Zucked: Waking Up to the Facebook Catastrophe," explains how the business model of big tech is inseparable from its most negative effects.
Transcript
My name is Graham Isador.
I have a progressive eye disease called keratoconus.
Knowing I'm losing my vision has been hard,
but explaining it to other people has been harder.
Lately, I've been trying to talk about it.
Short Sighted is an attempt to explain what vision loss feels like
by exploring how it sounds.
By sharing my story, we get into all the things you don't see
about hidden disabilities.
Short Sighted, from CBC's Personally, available now.
This is a CBC Podcast.
For years, men were disappearing from Toronto's gay village.
I feel terrorized.
I'm Justin Ling, this season on Uncover.
If we see this is happening, how can you not see this?
They suspected a serial killer.
And they were right.
Police arrested 66-year-old Bruce McArthur.
But this wasn't the first time the village was targeted.
You don't start killing at 66.
You start killing when you're in your late teens or early 20s.
Uncover. The Village. Available now wherever you get your podcasts.
Hello, I'm Jamie Poisson.
On October 30th of 2016, Roger McNamee hit send on an email.
He says it was about two pages long, single-spaced, and its message was clear.
Facebook's business model and algorithm allows for bad actors to harm innocent people.
And do you know who he sent that email to?
He sent it to Mark Zuckerberg and Sheryl Sandberg, Facebook's founder and its chief operating officer.
Now, Roger wasn't just some guy lobbing emails to the most powerful corporate leaders on the planet.
He was a mentor to Zuckerberg.
He helped get Sheryl hired.
So he had clout.
And that day, that email, it kicked off a kind of crusade for Roger McNamee.
To twist the arms of big tech to stop what he sees
as very dangerous practices. Today, I'm talking to Roger, a longtime Silicon Valley investor
and the author of "Zucked: Waking Up to the Facebook Catastrophe." This is Front Burner.
When I met Mark Zuckerberg, I'd literally spent half my life in the tech world. He was 22. I was 50.
Facebook was only two years old.
It was still just a picture and your address and your relationship status,
just high school and college kids.
Move over, Google, there's another verb in town.
College students across the country are, quote, Facebooking.
Right now, almost 70% of our users come back every day,
which is kind of ridiculous.
And I don't necessarily expect to be able to keep that up permanently.
So it was super early in the company's life.
And he had a crisis he was dealing with,
and I was able to help him through it.
And that began a three-year period where I was a mentor.
I stopped being an insider at the company around 2009. So in 2016, when I first started seeing
things going wrong, I had been away from it for a while. I was still a huge fan. I had this
deep-seated pride that I'd had my fingertips on what I thought was the one company of its era
that was doing things the right way. And the part that makes me really frustrated is there were signs that things were going wrong way before 2016.
And for whatever reason, I missed them.
And I started to see things in the context of the Democratic primary in January of 2016.
Then I also saw things related to Black Lives Matter in March.
Geofeedia would use your location to flag information online,
something the ACLU said police agencies
used to monitor protests.
And then, of course, the Brexit referendum in the United Kingdom in June.
So how, in an age of unprecedented data, did most polls predict the wrong results?
And then the Department of Housing and Urban Development citing Facebook for civil rights
issues in its advertising. It alleges that the company's targeted advertising platform discriminated on the basis of race and color.
So I had two civil rights things and two electoral things that, in my mind, could only be explained by a flaw in Facebook's business model and its algorithms that allowed bad actors to harm innocent people.
And so I reached out on October 30th of 2016 to my friends to warn them, to say, guys, I've been asked to write an opinion piece about this for a tech blog.
I'm sending you the draft.
And I want you to understand that I think there's a really huge problem here and you've got to get right on top of it. And I was hopeful that my past history with them would
cause them to embrace the ideas and really dig in and do whatever was necessary to fix the flaws
that I saw. And what happened? So they treated it like a public relations problem. What I had hoped
was that we would have a serious conversation and get to the issues. What they did instead was to
pass me to a senior executive, somebody I really liked who was very senior, but whose job at the time was to put out fires.
And the fire that they were putting out was me.
So for Roger, to truly understand what happened in the U.S. or the Brexit vote, which were coordinated campaigns of inflammatory, false, divisive, hateful posts intended to influence political decisions.
With the stated goal of spreading distrust towards the candidates and the political system in general.
To really understand what Canadian intelligence officials just yesterday said is already happening in the run-up to our federal election. We have had several discussions with all of the platforms and we have not really seen that much progress.
You have to first understand the business models of companies like Facebook and Google.
We have to understand how this business model moves beyond a transactional relationship
where you and I get a service, like Facebook Messenger, and in exchange, they target ads at us.
The age of big data has a much longer-term goal.
Google, Facebook, Amazon, and Microsoft have figured out that they can use data to
modify human behavior.
Where industrial capitalism was about using technology to harness nature, these companies
want to use data to harness human behavior, in a sense, to create
behavioral manipulation.
And to illustrate this, he uses Google as an example.
The thing to understand is that these companies have a business plan that has a lot of components
that we cannot see and that we can't even detect when they're going on.
So it started with Google.
Around 2002, they had an amazing insight, which was that in doing what a marketer would always do,
which is collecting data from users in order to improve the product for those users,
they discovered they only needed a small percentage of what they collected in order to improve search.
And search, if you think about it, is a product designed to identify the intention to purchase something, right? You're going on vacation, you look for airplanes or hotels or whatever.
And what they discovered was that the other data, the stuff they didn't need to improve
the search experience, had a signal related to behavior. You could predict behavior from it.
The problem was that Google didn't know
who they were dealing with or where they were.
So they created Gmail.
And with Gmail, you can think about this.
The notion of tying identity to purchase intent
makes the ads for whatever you're purchasing
a lot more valuable
because you know who you're dealing with.
Right.
If I'm a company and I want to get women in Texas from 24 to 32 who drink
Jameson whiskey and drive a Chevy pickup truck, I can do that.
You can do that. And that is consistent with normal marketing experience. So I'm not
complaining about that part at all. Here's where it got creepy. Was Google realized that if they're
trying to do behavioral prediction, what would be the single best source of insight about people's behavior
you could have? Well, the answer is their emails. So they had to find a way to justify reading
people's emails. So they said, we're going to put ads in and we need to scan the emails in order to
target the ads. I think they knew that people were going to protest against the ads.
So when that happened, they took the ads out,
but they continued to scan the emails.
Now think about this for just a minute.
If you work for the Postal Service or you work for a delivery service like Federal Express,
you're not allowed to read the content of the stuff you're shipping.
If you're a telephone company, you're not allowed to listen.
And it's not at all obvious why it's legal for companies like Google to do that. But yet they did. And it created this
incredibly valuable information. And people were not aware that that was what was going on.
So that was phase one. Then phase two is they realized there was all this unclaimed data in
the world. So they start driving cars up and down the street with cameras on top. They called it
Street View. I'm here to tell you about the latest feature in Google Maps.
Street View allows you to rotate 360 degrees and zoom in.
Street View!
Find a place to park in Chicago.
They take a picture of your home with your children out front,
your dog and all that, and they didn't ask permission.
Then they wanted to get up close, so they do Google Glass.
Tap the touchpad to wake up Glass.
You should see the display above your line of sight. Adjust it to see everything. And everybody
hated the people wearing the glasses, so they go back into the lab. They repackage it as a video
game. They spin it out as a separate company, and they call it Pokemon Go.
Imagine discovering a Squirtle hiding along the waterfront in San Francisco or a Bulbasaur at Shinjuku Station.
And they have one billion people going around with their phones,
photographing other people, places,
and they get to run these behavioral modification experiments.
So they create these points to go to, like Starbucks coffee shops,
and they run experiments to see, if they alter the deal at different places, how far they can get you to go.
It's all about behavioral modification.
It's not really about the game Pokemon. It's about what they can get you to do.
Now, if you're the user, it's about the game Pokemon.
But as the user, you're not aware that that's not the goal of the people offering the game. And the critical thing
to understand is that increasingly we are essentially pawns in this large game. And then
the third thing they did was to go out and acquire our most intimate data from third parties. So they
would acquire banking data and credit card transaction data. They would acquire health and wellness data from people who have apps about that.
They would go to mobile carriers to acquire location data or Uber and Lyft.
And the next thing you know, Google and then Facebook and now Amazon and Microsoft are building data avatars of each and every one of us, a digital representation of our lives.
And they know so much about us, and they have all of these apps where they can nudge us in directions, or in the case of Pokemon Go, lure us over a fence, that they can run experiments.
And the reason that's valuable is that the closer they get to certainty, the more valuable the thing is that they're selling.
Right?
Life is full of uncertainty
and advertisers always had to deal with that. But these guys are getting to the point where with
behavioral modification, they can create certainty, which makes their stuff worth a lot more.
So this business model, it has the goal of collecting more and more of your data
so that your behavior can be predicted or even modified,
which allows companies to stop guessing when you're going to buy that new car.
They'll know and even nudge you towards a specific product.
And what helps them gather more data?
Keeping you online longer.
And what keeps you on these platforms longer?
Inflammatory content, fear-mongering, conspiracy theories,
misinformation, says Roger. So the algorithm puts that stuff right in your face. And it's in this
context that all the nasty stuff, in the lead-up to the U.S. election, in the lead-up to Brexit,
it flourished. Essentially, if you're in the behavioral prediction business, the most important thing is to get to the essence of each person.
And if you're outdoors or you're in public, you have your best sort of your most civilized self out there.
You know, you hide all the unpleasant aspects of your personality.
If your news feed is filled with things that are related to outrage or fear or disinformation or conspiracy theories, those will reveal the true essence of you. If you are, you know, shall we say
curious about anti-vax or you are, you know, curious about flat earth or maybe curious about
white supremacy, those things will come out. They're getting people to their least pleasant self.
And, in a political sense, that increases polarization because it makes it much harder
for people to communicate.
Essentially, Facebook is 2.5 billion Truman Shows where each person is being fed what
they like in a way that allows them to believe that whatever they
believe, those are the facts.
Right, because I think to myself, well, this is what I'm seeing, so this must
be correct.
This must be true.
This is what's happening on my newsfeed.
And what winds up happening is when somebody disagrees with you, it's no longer
political debate.
It's actually more like they're the enemy.
So what winds up happening is you get this polarization and essentially life on these
platforms becomes a competition for status. So the fellow in New Zealand, the terrorist in New
Zealand was looking to build status. So what did he do? He orchestrated this show on a selection
of social media platforms. He started on Reddit and 4chan and 8chan. He then
took it to Twitter. Facebook Live is where he executed the terrorism. They used YouTube to
spread it. But he built a group of roughly, I think, a thousand fellow travelers ahead of time
who knew he was going to do this, who then got the film and spread it and who spread his so-called manifesto.
And the reason that this was so dangerous is that it was an escalation in what people are doing on these platforms.
And the platforms showed they're completely unprepared to stop it.
They have no circuit breakers.
It took Facebook 17 minutes to realize that there was a terrorist act going on on Facebook Live.
We need to build our systems to be able to identify live stream terror events more quickly as it's happening, which is a terrible thing.
Would a delay help, any delay of live streaming?
It might in this case, but it would also fundamentally break what live streaming is for people.
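To make the circuit breaker idea concrete, here is a minimal sketch in Python of what a broadcast delay with a kill switch could look like. Everything in it is assumed for illustration: the delay length, the looks_harmful placeholder classifier, and the relay_stream structure are invented here and do not describe how Facebook Live actually works.

```python
from collections import deque
import time

BROADCAST_DELAY_SECONDS = 30  # assumed screening window, not a real platform number

def looks_harmful(chunk: bytes) -> bool:
    """Stand-in for a real screening step (ML classifier, human review queue)."""
    return False  # hypothetical: a real system would inspect the chunk here

def relay_stream(incoming_chunks, send_to_viewers):
    """Relay a live stream to viewers from behind a rolling delay buffer."""
    buffer = deque()  # (arrival_time, chunk) pairs waiting out the delay
    for chunk in incoming_chunks:
        if looks_harmful(chunk):
            # The circuit breaker: cut the broadcast and never release the buffer.
            return
        buffer.append((time.monotonic(), chunk))
        # Release only chunks that have aged past the delay window.
        while buffer and time.monotonic() - buffer[0][0] >= BROADCAST_DELAY_SECONDS:
            send_to_viewers(buffer.popleft()[1])
```

The tradeoff Zuckerberg points to is visible in the sketch: the same buffer that makes screening possible is exactly what stops the stream from being live.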
The challenge of all these things is that the exact same thing that gets you the service you like is enabling that precise thing.
And the question is, how do we get the stuff we like from these platforms without these incredibly destructive, dangerous, harmful things happening?
How do you prevent those?
We'll be back in a second.
At Franklin Templeton, we help you invest in companies that believe good enough is never good enough.
Reach for better. Franklin Templeton Investments.
So I asked Roger where he thinks this could all be headed.
I thought he might look 10 years into the future.
But instead, his mind went to this fall, to our upcoming Canadian election.
What's the worst outcome?
Well, what if the outcome is not decided democratically by the voters of Canada?
What if, you know, either external or internal forces distort the outcome using social media to do it? I mean, all across Europe, there was an impact of election manipulation.
Russian interference in Ukraine's presidential election is less a concern than an expectation,
according to the SBU State Security Service,
it says cyber attacks have already occurred. In Brazil, WhatsApp was used and there's a high
probability that it affected the outcome. Folha de Sao Paulo reported contracts between companies
arranging mass messaging on WhatsApp to attack Haddad's Workers' Party. It's going on in India
right now with WhatsApp. Essentially, the most recent example
is the country with the most recent election.
And there is a consistent pattern.
It's all been sort of far-right interference, right?
It's not equal across the spectrum.
And, you know, it's all fear-based.
And in my mind, it's really authoritarian models trying to use technology.
When it comes to that far-right interference Roger was just talking about, well, just yesterday,
Facebook banned a number of Canadian far-right anti-immigrant groups from the platform,
including Faith Goldy, a far-right personality who recently spoke at a rally
that included some yellow vest protesters outside Parliament.
No more Trudeau. No more open borders. Our borders will be protected.
So what do we do here?
These platforms play such huge roles in our lives.
They give us services that we love.
There are some smaller fixes, says Roger.
There are literally dozens of things we can do around elections.
You can end targeted advertising some period of time before the election.
You know, you can require real-time disclosure on advertising.
But we also need big changes, he says,
and that includes essentially flipping the business model of these companies on their heads,
even if that means they make less profit.
That means potentially banning the sale of our data,
not collecting data on children,
using competition laws to get new companies into markets currently
controlled by a few big players.
It also means making us consumers again, paying for products instead of being the products.
So what are we looking at here?
Like, I pay for Facebook?
No, no.
Let me give you an actual example.
And this is a U.S. example.
In Canada, it wouldn't work the same way.
But Google has a thing called CAPTCHA that many sites use to identify if you're a robot or not.
And you have to touch on pictures of automobiles or traffic lights.
And what's really going on is they're not using those pictures to figure out if you're a human.
They're using those pictures to train the artificial intelligence for Google self-driving cars. They know you're a human being because of your mouse movement.
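As a rough illustration of how mouse movement can separate people from scripts, here is a minimal sketch with invented thresholds: human traces tend to show irregular speed and heading, while a naive bot moves in a straight line at constant speed. This is a toy built on those assumptions, not a description of Google's actual signal.

```python
import math

def movement_features(points):
    """points: a mouse trace as (x, y, t) samples; returns speed and heading variance."""
    speeds, headings = [], []
    for (x0, y0, t0), (x1, y1, t1) in zip(points, points[1:]):
        dt = max(t1 - t0, 1e-6)
        speeds.append(math.hypot(x1 - x0, y1 - y0) / dt)
        headings.append(math.atan2(y1 - y0, x1 - x0))

    def variance(values):
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / len(values)

    # A scripted straight-line move at constant speed scores near zero on both.
    return variance(speeds), variance(headings)

def probably_human(points, speed_var_min=25.0, heading_var_min=0.05):
    """Thresholds are invented for illustration; a real system would learn them from data."""
    speed_var, heading_var = movement_features(points)
    return speed_var > speed_var_min and heading_var > heading_var_min
```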
Now these guys save everything. So they have, for me, they probably have a long history of
my mouse movement. Now imagine the following scenario. I get a little bit older and suddenly
for the first time, my mouse movement gets slower and more erratic. It's the first symptom,
the very first symptom of Parkinson's disease. Now, here's the problem. In their model, I'm the
fuel, right? I'm not the customer. I'm not even the product. They're not legally obligated to
tell me I've just shown a symptom of a neurological disease. Now, in my world view, that exact same AI product
can have its business model reversed
and it becomes an insurance product.
And it basically says,
we're going to monitor your mouse movement
for $1 a month to age 50,
maybe $2 a month to age 60,
$3 to 70 or whatever.
I'm going to tell you if you might have a neurological event.
I'm going to protect your privacy.
I'm going to warn you.
I'm going to give you a list of neurologists to go see.
Right?
I mean, that's a really valuable product.
And I think a lot of people would sign up for that.
And there is literally an equivalent of that for every single thing that's out there.
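To make that reversal concrete, here is a minimal sketch of the kind of monitor Roger describes, where the same telemetry is watched on the user's behalf. The baseline length, the z-score threshold, and the idea of a single per-session "jitter score" are all assumptions made for illustration; real symptom detection would need clinically validated models.

```python
import statistics

class MovementMonitor:
    """Toy version of the reversed business model: mouse telemetry watched for the user."""

    def __init__(self, baseline_sessions=30, drift_threshold=3.0):
        self.baseline = []                       # jitter scores from early sessions
        self.baseline_sessions = baseline_sessions
        self.drift_threshold = drift_threshold   # z-score that triggers a warning

    def record_session(self, jitter_score: float) -> bool:
        """Return True if this session looks anomalous against the user's own baseline."""
        if len(self.baseline) < self.baseline_sessions:
            self.baseline.append(jitter_score)   # still learning what normal looks like
            return False
        mean = statistics.mean(self.baseline)
        spread = statistics.stdev(self.baseline) or 1e-9
        z = (jitter_score - mean) / spread
        # In Roger's model the alert goes privately to the user, with a
        # suggestion to see a neurologist; it is never sold to an advertiser.
        return z > self.drift_threshold
```

The point of the sketch is who the output goes to: the user pays a small subscription and receives the warning, instead of the signal being sold on.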
Now, I do not think Facebook's going to go to a subscription model, but I think something else could do that.
But while we're waiting for that something else, if it ever comes, Roger has decided to change his own behavior, which is actually a lot harder than you would think.
Roger, I'm interested in the final question for today.
What's your current relationship like with these platforms,
with Google?
You mean, do I use them?
Yeah.
I decided I would need to get off of Google entirely,
which is really hard to do.
It's so hard.
It's so hard to do.
There's so many different places.
I never adopted Gmail, which made it a lot easier.
But, you know, getting off of Google Maps,
getting off of the search engine,
getting off of Google Docs, those were all very complicated.
I decided to make it like a video game.
Do you remember the video game Frogger?
Frogger is you got to get across the river and you hop across logs, right?
Okay, so I'm the frog.
Google's the river.
The logs are the other products, you know, so it's DuckDuckGo for search. And, you know, it's the Apple Safari browser, the Brave browser, and it's Ghostery for getting rid of trackers.
And it's 1Password for, you know, logging into sites. It's all these different things.
Right. These are all other options.
All other options, right. And they're basically all free other options except for 1Password. And so
basically, I adopt all these things. And then I try to avoid
Google. And it's really hard because people will send you, like, addresses or, you know,
their calendar invites or whatever. And my high score, if I don't include YouTube, is two months.
And I felt really good about that. But it's really hard because there are all these embedded links. And
Google is everywhere, right? And I'm only talking about overt Google use,
because it's embedded in the apps and services of everything I touch. So I'm really interacting
with Google no matter what. They're getting data all the time. Now with Facebook, I have
changed my relationship to it profoundly. I do not do any politics. I don't get any news on any of
these platforms. The thing I would say to you is that I have a book out, which we talked about a moment ago. And in order to sell the book,
because it's called Zucked and it's aimed at people who are on Facebook and Instagram,
those are the only platforms you can use to reach those people.
Right. And are you advertising for your book on these platforms?
I am. These things are so deep into our society that we have to recognize there's not a silver bullet.
I mean, unless you want to get rid of them,
and I don't see that that's realistic.
And so what I really am trying to do
is to get people on board and just say,
pick the aspect of the problem you like best.
The thing you're most worried about.
And get your friends on board.
And the thing I say to everybody is,
the fact that you didn't see it right away? Hang on.
I'm a professional analyst.
I cover tech.
I did that for 34 years.
I whiffed on this for years.
And when people say to me, you know, I'm just figuring this out.
I go, I know exactly how that feels.
Roger, thank you so much for this really fascinating conversation.
These are the issues that keep me up at night.
So I'm so happy to have the chance to talk to you about them.
Jamie, thank you for this opportunity.
So as I mentioned already, Facebook announced yesterday they were banning a bunch of far-right groups from the platform.
Also yesterday, Canada's Democratic Institutions Minister Karina Gould told the Toronto Star and BuzzFeed News that the Canadian government is, quote,
actively considering regulating social media companies.
Said Gould,
We recognize that self-regulation is not yielding the results that societies are expecting these companies to deliver.
That's all for today. I'm Jamie Poisson. Thanks for listening to Front Burner.
For more CBC Podcasts, go to cbc.ca slash podcasts.
It's 2011 and the Arab Spring is raging.
A lesbian activist in Syria starts a blog.
She names it Gay Girl in Damascus.
Am I crazy? Maybe.
As her profile grows, so does the danger. The subject of the email was, please read this while sitting down.
It's like a genie came out of the bottle and you can't put it back.
Gay Girl Gone. Available now.