Modern Wisdom - #093 - Roger McNamee - Is Facebook Zucked?
Episode Date: August 12, 2019. Roger McNamee is a businessman, investor, musician and founding partner of the venture capital firm Elevation Partners. Roger has had a front seat to the biggest developments in Silicon Valley for the last 30 years and this culminated with his time as an early investor and key advisor to Mark Zuckerberg at Facebook. Over the last few years he became concerned with the direction that Facebook and Big Tech was taking and now spends his time educating the general public on the dangers of Surveillance Capitalism, data privacy, technology addiction and much more. Extra Stuff: Buy Roger's Book - https://amzn.to/2ySnZcm Find Roger's Tour Dates - http://www.zuckedbook.com/ Follow Roger on Twitter - https://twitter.com/Moonalice The Age Of Surveillance Capitalism - https://amzn.to/2Tm1HsU Check out everything I recommend from books to products and help support the podcast at no extra cost to you by shopping through this link - https://www.amazon.co.uk/shop/modernwisdom - Get in touch. Join the discussion with me and other like minded listeners in the episode comments on the MW YouTube Channel or message me... Instagram: https://www.instagram.com/chriswillx Twitter: https://www.twitter.com/chriswillx YouTube: https://www.youtube.com/ModernWisdomPodcast Email: https://www.chriswillx.com/contact Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
Oh, hello friends. Welcome back to Modern Wisdom. My guest today is Roger McNamee. He was an early investor in Facebook, an advisor to Mark Zuckerberg for three years.
He was the person that brought Sheryl Sandberg in.
Essentially, he's had a front row seat to everything which has happened to Silicon Valley for the last 30 years, and that culminated
with his time at Facebook. Over recent years, Roger became increasingly concerned with the
direction that Facebook and other tech giants were moving in, and he's finally decided to
write a book about it, and obviously come on modern wisdom. So expect to learn an awful
lot about the inner workings of Silicon Valley, what tech companies are really trying to do
with your data, and his predictions for the future. If you enjoy this episode, you have to go
check out the one I did with Professor David Carroll about The Great Hack a couple of weeks
ago. And I'd love to hear what you think. It's an issue that appears to be polarizing
a lot of people, some of them saying it's all tinfoil hat stuff
and others saying that we are at the edge of the apocalypse.
So get at me at ChrisWillX on whatever social media you prefer.
And also, if you're enjoying these episodes,
please rate the show five stars on wherever you're listening.
iTunes, Stitcher, Spotify, wherever else you're tuning in.
It will only take you a couple
of seconds, but it massively helps the show move up the rankings, so thank you very much in advance.
But for now, please welcome Roger McNamee. My Facebook reach is declining by the second after I've had Professor David Carroll first
and now Roger McNamee joining me on Modern Wisdom.
Roger, welcome to the show.
It is such a pleasure to be here, Chris.
Really excited to have you on today. Very timely, with the release of The Great Hack on Netflix
and this resurgence of interest in Facebook and Cambridge Analytica. Can you tell the listeners
who don't know who you are a little bit about yourself, please? Sure. So I first came to Silicon
Valley in 1982 as an investment analyst right before the personal
computer industry began.
I was lucky enough to be part of the most important firms investing first in public tech stocks
in the 80s, then part of Kleiner Perkins, which was becoming the biggest venture capital
firm in Silicon Valley during the 90s, the heyday of the internet.
I was in their office the day that Marc Andreessen brought in the idea that became
Netscape, the day that Jeff Bezos brought in the idea that became Amazon, the day
that Larry and Sergei brought in Google. I started a firm with Bono in 2003 called
Elevation. And in 2006, a 22-year-old entrepreneur came into my office with a crisis.
His name was Mark Zuckerberg.
I helped him navigate the crisis, and for a period of three years thereafter was one of
his advisors.
And that is the context for my conversation with you here today, that from 2006 to approximately
2016, I was a huge fan of Facebook. And then
at the beginning of 2016, I started to see things going wrong on Facebook, which really
violated my sense of what the company was about. I thought Facebook was a force for good.
And it began in the US presidential primaries in the Democratic Party in 2016.
It then included issues related to Black Lives Matter, a protest group in the United States,
and then Brexit happened in June of 2016. And that was the point at which I realized
that the same algorithms, the same business model that makes Facebook so great for marketers,
those same tools could be manipulated and used to undermine democracy.
And at that point, I started to realize I needed to do something. I started looking for allies
and couldn't find any. I was finally able to persuade a tech blog to let me write an opinion piece about this in September
of 2016.
And while I was writing it, more news came out, civil rights violations on Facebook, and then
our intelligence agencies said the Russians were trying to interfere in the US election.
So I sent the draft of the opinion piece to Mark Zuckerberg and Sheryl Sandberg,
my friends, to warn them. I thought something really catastrophic was wrong with
the product.
And they were not particularly supportive.
Actually, I think they treated it like a public relations problem, and the public relations
problem was me.
So they tried to make it go away.
And in the process, they handed me off to one of their colleagues.
A guy I knew really well that I was very friendly with.
And we spent three months.
And after the US presidential election,
I go to them: guys, you have got to do what Johnson & Johnson,
the pharmaceutical firm, did when somebody put poison in bottles of
Tylenol, the analgesic, in drug stores all over the city of Chicago and killed a bunch
of people.
And the CEO of Johnson & Johnson didn't wait for the government, didn't wait for anybody
to tell him. They withdrew every bottle of Tylenol from every retail shelf in America and
didn't put it back until
they had invented tamper-proof packaging.
And I thought, that's the right way to handle a crisis like this.
If you have played a role in harming the civil rights of people, if you've played a role
potentially in undermining democracy in the United Kingdom or the United States, you
have got to throw yourself on the mercy of the court the same way that Johnson & Johnson
did.
And I spent three months pleading with them, please do this. And they would have none of it.
And finally in February of 2017, I realized it was hopeless. And I was faced with one of those choices,
a moral choice that hopefully you never have to come across. But in my case, I did.
Where I had
officially retired from the investment business. I could just walk away and say
it was somebody else's problem. But I didn't. For whatever reason, I realized my
fingerprints were on this thing. I'd played a role. I wasn't a huge part of it,
but I was part of it, and I'd done two things. I'd helped Zuck keep the company independent in 2006, and then I'd helped to bring Sheryl Sandberg into the company. And those
two things had made a big difference. And I thought to myself, if I'm not willing to
just stand up and do something about this, I'm not willing to stand up about anything.
And so I became an activist. And I didn't know what that meant at the
beginning. Honestly, the goal was really simple. Just make people aware of the problem. And I met a
young man named Tristan Harris who was coming out of Google. And Tristan had been on the U.S.
news program 60 Minutes. And he had given about a 10-minute presentation on something he called
brain hacking, which is the use of persuasive technology on smartphones, by internet platforms, and how you can manipulate people's attention.
And a light bulb went off in my head, and I just realized that's how the Brexit thing
and the US election things had happened.
And so the two of us joined forces and set out to try to make the world aware of this
thing.
And the first thing we did, we caught a miracle in a bottle.
I mean, it was just unbelievable.
Two weeks after we joined forces was the TED Conference in Vancouver, Canada.
TED is, you know, ground zero for TED talks, right?
I mean, there's 1,500 technology and entertainment and design people all in one room. We were thinking we'll
get Tristan to do his brain hacking thing there.
Everybody will know.
We'll get 1500 business cards.
We'll be all done.
Our job will be over.
So we go to TED.
Then a miracle occurs and somebody gets Tristan on the docket on 10 days' notice. And keep
in mind, normally for TED talks
people prepare for six months; Tristan's got 10 days. He goes, and he just shoots the lights
out. It was just amazing. He was fantastic, and we're going around to collect cards. We
got two business cards afterwards and no follow-up at all. It was completely hopeless. We were just despondent.
And then another miracle occurs, and I get connected to a guy in Washington
who works for Senator Mark Warner, who is the vice chair of the Senate Committee on Intelligence,
which was at that time the only committee in Congress where the two parties were working together.
And they were the natural place to look at Russian interference and
potentially to help us protect the 2018 and 2020 election. And that began our interaction
in Washington. And so we go there, we have very successful meetings with him and with
Senator Elizabeth Warren, who was very focused on the competition and antitrust issues.
And that got us started. We wind up playing a role in causing the hearings
to take place in the fall of 2017,
in preparing Congress for those hearings,
teaching everybody what they needed to know
about Facebook and Twitter and, to a lesser extent,
Google; they kinda understood Google.
And I've basically been doing this full time, 24 hours a day, ever since. And I've
written a book called Zucked: Waking Up to the Facebook Catastrophe, which has been a platform
for activism. I continue to go to Washington all the time. I continue to deal with folks
in press. And my job is really simple. There is a very, very scary thing going on and most of us have no
idea. I mean, we think we're giving up a little personal data for a service we love, right?
It could be YouTube, it could be Facebook, it could be Instagram, it could be, you know,
any one of them. We have no idea that that's not what's going on at all, that these guys are building versions of
The Matrix or of Minority Report. And they're not doing it because they're bad guys.
They're doing it because they're engineers, and it's a really interesting engineering thing:
can you use massive data gathering and algorithms to make the world more efficient?
That seems like a logical goal. The problem with it is that if you convert all human experience into data,
if you process it with machine learning, if you then use it to make the world more efficient,
something's got to give. In fact, two things have to give.
One of them is democracy,
and the other one is free will
and individual choice.
And my attitude on this is real simple.
Maybe people are okay with that.
But they don't know that's what's going on,
and we need to have a conversation about it.
In order for a conversation to happen,
somebody's got to tell them what's really going on.
And I'm one of the people who has decided to devote my life to doing that.
We are big fans of Tristan Harris on the show.
I first heard him on Sam Harris just over a year ago, a year and a half ago.
And we went down that rabbit hole big style after that. I spoke to Kaiwei, the CEO of the
Light Phone. Have you heard of that? Yeah, of course. Yep, I spoke to Kaiwei, very interesting guy.
So we've been, the listeners will be familiar, I think, with tech reduction and the race to the
bottom of the brainstem, a little.
Yeah, but it's really important, but the thing, Chris, to keep in mind is we keep learning
new things.
And Tristan and I spent last weekend, not this past weekend, but the one before, together.
And we were comparing notes about how far we had come, because when we got together,
we thought that the addiction hypothesis that Tristan had come up with,
the one that underlies the light phone
that underlies all these other things,
was the root cause of the problem.
But earlier this year, a professor
from Harvard Business School named Shoshana Zuboff
published a book called The Age of Surveillance Capitalism.
It actually came out in Europe in the fall of last year.
But it came out in the States in January, and it changed our lives because Shoshana has
been studying Google for more than a decade, and she spent roughly a decade writing this book. And it is the most important work of economics
and sociology in my lifetime.
And again, that's my opinion,
but I really feel strongly about this.
That this book is to the 21st century,
what Adam Smith's The Wealth of Nations was,
to the 18th and 19th centuries.
I mean, it describes an economic system that is dominating the world,
but about which people are completely unaware, and it names everything and describes how it works.
And the important thing to understand is the reason that addiction exists is because of the incentives
of this surveillance capitalism, which is about gathering all the possible data, converting
all human experience into data, and then using it to make the world more efficient.
And you sit there and say, again, is there anything wrong with that?
And I go, it's not for me to say, but most people do not realize the degree to which the
choices that they're making are being influenced by people that they're not even aware of
and how when you're going to the search results on Google or to your newsfeed on Instagram or Facebook
that they're not picking things that you like so much as they're picking things that they believe they can monetize.
And this is a really important distinction because what they're doing is they're creating
a data voodoo doll.
It's a great phrase Tristan came up with: data voodoo doll.
They're gathering data.
We think we give them a little data.
The thing you have to understand is that the data we're giving them is a vanishingly
small percentage of what they have. Because in real life,
we touch electronic systems all over the place. Every time we use a credit card, every time we go to
the bank, if we want to buy a home, there's a credit rating score. Any medical thing generates a
digital record. Our phone tracks where we are in real time. Anytime we have an affinity card from
a grocery store or any kind of airline, all that data is captured. Anytime we send an
email to a Gmail or Microsoft email account, when we use a shared online application like Google Docs or Office 365,
there's a digital trail from all of these.
And all of that data is available to Google, Microsoft, Facebook, Amazon.
Now, obviously Microsoft and Google do not sell their data.
They just keep theirs.
But they buy the stuff from everybody else. Facebook
does the same thing. And they create this data voodoo doll. And every time you do a search,
every time you look at your newsfeed, you have to in your head realize, oh my God, these
guys are using that data voodoo doll to manipulate the choices that are available to me. They
know my income, they know where I work, they know how long I commute, they know the composition of my family, right?
They know how I spend money, where I spend money.
And they're not using that to make my life better, they're using that to make money.
And again, we may be okay with that, but we ought to have a conversation about what's really
going on. Because here's the thought experiment that I want to give to everybody who's listening
right now. Just imagine this.
What if I'm right?
What will you wish you had done if it turns out I'm right?
Because the thing about Shoshana Zuboff's book is it is brilliantly researched.
It is like 500 pages of text and 200 pages of footnotes. I mean, the woman,
I mean, she deserves the Nobel Prize in economics. And I've never met anyone like her. As impressive
as Tristan is, as impressive as all these other people in Silicon Valley are, she's at that level or more. And that requires thinking. So Tristan and I have
spent a lot of time trying to amplify the signal from Shoshana to make sure more people are
aware of it. And, you know, the other people who circle around this thing are all looking
at Shoshana like, oh my God, how did you see this? Right? Because it's so complex, it's so nuanced. And how in the hell did we miss it?
Well, the answer is, these guys... There's a famous psychologist in America named
B.F. Skinner, no longer alive. But he had this notion of operant conditioning, this notion that
you could manipulate behavior of any species by changing
the conditions in which you held them.
But the critical element that he always said was you had to do the experiment in a way
that the subject was not aware of it.
And Google in particular has been really careful to conduct experiments in ways we're not
aware.
Can I give you a couple examples?
Fire away.
So think about Google Maps.
Now, look, it's good.
It's good.
I just did a 2,000 mile road trip across America
and used it to get me from Dallas to Norfolk, Virginia,
via a number of major cities.
So it was, in my experience there, at surface level,
a life saver.
Exactly.
Everybody loves Google Maps, right?
Yep.
Now, let's think about the person whose drive in the morning to work is an hour long or 45 minutes
and where there's more than one way to get there.
There's a standard way that is the, you know, all else being equal, it's the fastest way to go.
You take that route every day,
but you check every morning just to be sure.
And every once in a while,
it'll say, no, not today.
Today you're gonna take this really weird route.
And maybe they say there's traffic conditions or whatever.
Do you know what's really going on?
I'm scared to find out.
So there's a concept, there's a concept in engineering called load balancing, which
is when you have a complex system, in order to have maximum efficiency through the system,
you need to balance the load and you distribute the weight or the traffic in ways that keep
everything moving smoothly.
The same concept applies in airplanes.
That's why sometimes you have to put more people at the back of the plane to keep the balance right. So when it comes to traffic,
the reality of this thing is that on any given day, some people have to go on inferior routes
in order to maintain the flow of the system as a whole, which means that sometimes when they tell
you to take that other route, it's not because there's a traffic problem.
It's because, hey, it's your turn. Today, you're going to take the inferior route.
Let's say you're in a huge hurry. That could be a real problem for you, and you've had no choice in it,
because you trust Google, and they've not been honest with you about, hey, today is your day.
They're just saying to you, please take this other route. And you say to yourself, hey, what's the big deal on that? And I'm going, maybe there isn't one, right?
So then you think about how they monetize Maps, or especially how they monetize Waze, right?
So Waze is now the... it's the equivalent.
Is Waze owned by Google?
Of course, it's owned by Google. And so Waze is taking all the traffic conditions for everybody and putting them in there.
And again, the goal of the person using Maps or Waze is to get from one place to another
as quickly as possible.
But Google's goals are actually different than that, because they're load balancing in the
case of Maps.
And in the case of both Maps and Waze, they've got to get paid.
Now, how do they get paid?
They get paid for something that Shoshana describes as footfall.
And footfall is getting you to go by somewhere; basically, advertisers pay them for traffic
to go by their location.
So let's say they know that you like fried chicken and you're on Waze and it's roughly
meal time.
Well, they're going to drive you past the fried chicken people who've paid them for a
footfall.
And that may take you off the optimal route.
Now, does this happen to you every time?
No.
Does it happen to you some of the time?
Yes.
Can it happen at a time that's very unfortunate?
Definitely.
But here's what matters about it.
Google is playing God.
They're making choices for you.
And because you trust them as an honest broker, it doesn't occur to you that they don't
have your best interests at heart, that they're really doing this optimizing for their own
business.
And on those two, it doesn't seem like it's that big a deal.
Now let's go to Pokemon Go.
So most people don't realize where Pokemon Go came from.
In fact, I didn't know until Shoshana told me.
So Google had this thing.
Do you remember those glasses?
Google Glass?
Yeah.
So Google Glass was a thing where you
had a little screen in the corner
so you could keep track of whatever you were keeping track of
all the time and have a camera point for them.
And as far as I can tell, the reason people
wore Google Glass was so they could look like a dork, right?
I remember, do you remember what we called them?
We called them glass holes.
Nice.
Nice.
Yes.
And it completely failed.
But I want you to understand what Google is trying to do.
They had been capturing human behavior in a lot of ways.
And they've been capturing a lot of free data, right?
I mean, the reason they created Street View was that they realized that having a picture of every street and having a picture of
every home was really valuable and it was un... Why is it valuable? Well, you'll see, in a larger context,
they want to know where everybody lives, they want to know what their place looks like.
There are all kinds of ways, at some point, you can monetize that information.
For example, you're not likely to sell one of those little things that take old people up staircases, those chairs that go up.
A Stannah, that lift type thing.
Yeah, okay. You're not going to do that to a one-story home, right? So it's useful to have that
information. But the point is they didn't ask permission. They just start driving up and down the street.
They take a picture of your home and whatever's in front of it. Maybe your car, maybe your kids, maybe your dog.
And they don't ask permission.
But when you get to Google Glass, what you're changing is you're going from a static picture,
which is what street view or satellite view is.
Right?
Remember, whether you want it or not, your home is in those things.
Ready for a burglar to now be able to case your place without even coming to look at it.
And you have no choice in no way to withdraw.
So now they want it in real time.
With Google Glass, the camera on the front is capturing the behavior of the user and everyone
around them without their awareness.
And they convert all that into data.
So if I came up to you wearing Google Glass, the camera would capture your face, and
Google has all this incredible artificial intelligence, so they don't just get a picture of you and ID who you are.
They can look at the muscles in your face and make a really good estimate of your emotional state of mind.
You might say, well, how do they monetize that? The answer is,
when you have enough data in a data voodoo doll, you can monetize anything.
And so they captured that stuff, but the product failed.
They don't give up.
They're Google, right?
They put it back in the lab.
They reinvented it as a video game.
They realized it's probably a bad idea to publish it as Google.
So they call it something different: Niantic.
They spin it out as a separate company. They call the game Pokemon Go.
Was that not Nintendo? Is that not Nintendo or is that a...
No, Nintendo licensed Pokemon to them, but Niantic...
Oh, I see.
The same guy who created Street View created Niantic and he is like an apostle of BF Skinner. He's really big on operate
condition. So, Pokemon go, next thing you know, there's one billion people wandering around
with their phones, capturing video images of everything that they see. So, what happened?
The people think they're playing a game, but Google, or rather Niantic, says, ah, let's
run a few experiments.
In this post-9/11 world, can we put a Pokemon in somebody's backyard and get you to
knock on a stranger's front door?
It turned out you could.
I mean, people would literally descend on homes and knock on the doors.
The experiment was unbelievable.
I mean, in America, you can't get people
to knock on strangers' doors for anything
other than Pokemon Go.
So how about if we put it over a fence?
Can we get you to climb a fence?
People who would never otherwise climb a fence,
climb a fence to chase a Pokemon?
So then they go, well, let's commercialize it.
So they cut a deal with Starbucks, you know?
Can we?
Can we get you to go into a Starbucks and buy a coffee?
Yes, we can. Now, the key
thing to understand is this is behavioral manipulation. And it's a billion people they were doing
it on, simultaneously. People are terrified about what's going on in China. China has a program
called Social Credit, which is designed to use behavioral manipulation to control the
whole population. But get this. There are only 750 million people with internet access in China.
Pokemon Go was a wildly larger experiment than Social Credit. And you say to yourself,
well, it's harmless. And I'm going, right, but it's not the end game. Now they have a
project going on in Toronto, Ontario called Quayside, a smart city, and they're
taking everything they learned from Maps and Waze and Street View and Pokemon Go and applying
it at city scale.
They call it a smart city, but Google's going to gather data from everything, with sensors everywhere.
There's no privacy, right?
Everything is turned into data, all human experience in that area.
And the thing has run into a political problem, because a small number of people came up and said,
hang on.
How do we influence this?
Because Google has said to the city, no politics. You have to protect us.
There's no democracy in this space. We get to do what we're going to do, and you have to protect us from the people.
Google went to the city of Toronto and stipulated a condition that there could be no interference from the people of Toronto in what Google is doing.
In effect, they had to be protected from democracy, that they got to be in charge, they got
to control the data, do whatever they want with it.
Effectively they controlled the public utilities, they effectively controlled all the city services
inside this development that they were going to help fund.
And there would be no scrutiny, no control.
Effectively, they would be substituting
algorithms and artificial intelligence for democracy.
Now there's a one-year hold on that project, while the city actually finally does the deliberation,
while they have the democratic process to decide if this is good.
My simple observation here is that these guys have a plan, and it's, again, it's not evil,
it's just a different value system.
And I believe we're supposed to have serious conversations, we're
supposed to do the thought experiments to imagine what it would be like to live in something
that looked like The Matrix or like Minority Report that was run by a corporation, something
over which we have no control, that might not even be in the same country. I'm sure that for each of the elements of business that
you've put forward today, Google acting as whichever company it may be,
whether it be Waze or Pokemon Go, or Facebook acting as Instagram, or however
they piece it together, they will have a justification for why Street View
exists. Well, it allows you to see things before you get there
or it allows you to be more accurate with maps
or the reason why they redirect you.
Well, there actually really was traffic.
What is the evidence showing this other side?
Have there been whistleblowers inside
of the company who've come out?
To be clear, Chris, I mean, the points you're making
are demonstrably true.
There is massive value from products like Maps.
There's massive value from Waze.
I'm not saying you're not supposed to like the products.
My simple point here is that Google's goals are at odds with the goals of the people
using the products.
And the people using the products just aren't aware of that.
Mm-hmm.
My question is, how do we know that their primary directive,
their primary goal is collect data?
Well, hang on, to be clear,
it doesn't have to be murder in the first degree. It could be manslaughter, right?
Their goals can be perfectly honorable and still cause huge harm that requires some kind
of action by society. I believe that their goals are honorable. I just disagree with the
value system that they're applying. And my point to you here is, I mean, I've said this, I think three times already,
I believe these people are not bad people. But I do think they have a different philosophy.
And in the United States, they have grown up as executives in an environment where we
allowed the banking industry to destroy the economy with no negative consequences,
where we allowed a bank, Wells Fargo,
to take money out of millions of accounts of consumers
with no penalties,
where we essentially allow businesses
to do whatever they want with no constraints.
So against that backdrop,
smart people are gonna take any advantage they can get.
And these guys are operating in a new area where there are literally no rules.
And in the early days of the telephone system, operators listened into phone calls.
Eventually we decided, you know, that's not a good idea.
We don't want the postal service to be reading the mail.
We don't want the operators listening in.
We are at exactly that point within our platforms where we need to set the boundaries.
And my only goal here is not to convince you that there's something wrong with Google
or something wrong with Facebook or something wrong with anybody else, but rather
to say we need to decide where the boundaries are.
In the United States, we have a problem right now, which we've had for a long time and
it's gotten a lot worse, which is that we have white supremacists who are going on
murderous rampages all over the country with staggering frequency, more than one mass murder
a day nationwide.
And a huge percentage of these things are effectively incubated.
That is, the people are activated by things they read, things that they learn and interact
with on social media. Social media is playing an outsized role
in a whole series of problems related to public health
for children and adults, for democracy,
for privacy and our ability to make choices without fear
and for competition and innovation.
And in that sense, I would argue that Google, Facebook, Microsoft, and Amazon
are like the chemical industry in the period between 1950 and 1980, when that industry was
very high-growth, very high-profit. Why? Because they could pour their waste products, mercury, chromium,
mine tailings, wherever they felt like.
They could pour mercury into fresh water,
polluting rivers and streams and drinking water.
They could leave mine tailings on the side of the hill
to leach into fresh water in rural areas.
And there were no costs, no consequences.
And eventually, the society realized that was a terrible idea.
And we should make the people who created these toxic chemical
spills, pay the cost of cleaning them up.
That is the conversation we need to have.
If your system is used to undermine democracy,
if your system is used in ways that lead to increases
in teen suicide, massive increases in bullying,
if your system is used in ways that allow
disaffected people to find each other,
gather, and then perpetrate violent crime,
should there be some penalty?
Should there be some limit? Should there be responsibility?
And my goal here is to just trigger that conversation. We've looked at these things as though
they're all separate problems: that the public health issue, which would be Tristan's brain hacking
and all that, was unrelated to elections, and that that was unrelated to privacy, and that
was unrelated to the whole issue of innovation and competition.
But in fact, all of them are the result of the incentives of what Shoshana Zuboff calls
surveillance capitalism. And it is this business model
where our lives, our digital existence, are for sale to anybody, so that they can know the most
intimate things about us. And we have no control. I mean, why is it legal for Google or Microsoft
or anyone else to read our documents or to read our emails
for their economic benefit? Why is it legal for a bank or a credit card processing company
to sell our most intimate financial details to anybody who wants them? Why is it legitimate
for a mobile company to sell our location? Why is it legitimate for a healthcare services
company of any kind, or a healthcare data company, to sell our most intimate health data?
Why is anything about us for sale?
That is the conversation I want to have.
And again, you don't have to agree with me, but we should have the conversation.
And until we have the conversation in the United States, people are going to be killed
in shopping centers, bars, schools, and other environments. And, you know, in the United Kingdom,
you're going to be struggling with issues like Brexit. In Myanmar, they're going to be dealing
with a genocide. In the Philippines, they're going to be dealing with death squads. In New Zealand,
they're going to be dealing with terrorist attacks on mosques.
You know, I mean, these are not small issues.
And the critical thing is, you know, I talk to young people all the time, and they say, Roger,
I'm a digital native.
I grew up with this.
I'm not worried about this.
And I go, I'm not talking about how it affects you.
I'm talking about how the data they get from you is used to harm other people. Think about this.
Google has so much data. What they do is they identify life events and then look at the things
that preceded them. So they look at everybody who buys a car,
and they look at, like, the thousand or 10,000 steps
they took before they bought the car.
And they compare it to everybody else, right?
And they find the distinct patterns.
They do the same thing with women who announce
that they're pregnant.
So they look at like 10,000 steps in front of that.
And they find what's common.
So there are certain things you do before you buy a car,
things you may not even be conscious of.
They don't say you're buying a car,
but they're just indicative of increasing probability.
Because when you have that much data,
you're not selling targeting.
You're selling certainty, right?
You're selling, I have 90% confidence
this person is going to buy a car. Or get this:
Google knows, I believe, with 90% confidence
that a woman is pregnant before she knows.
Think about that for a minute.
I'd like all the women listening to this
to think about that for a moment.
You may not know if you're pregnant, but Google may.
And we ought to ask, is that okay?
Because what happens is that, once they know, everything in your feed, everything in your
search results reflects what they know that you do not know.
And suddenly, as Tristan says, you do not have agency over your life.
And I don't know that that's legitimate.
There's a good point there. I recently made a comment on a podcast about how much I love
the new schedule send email feature on Gmail. I said it now allows you to send an email tomorrow at
midday, as opposed to having to be online. So it saves it as a draft
and then sends it when you want. But you are right.
If I start writing to my business partner about flyers for
freshers events in September, which we need to get flyers for, in
the sidebar of that email, I have adverts for flyers.
Right, because they are allowed to read the content of my emails.
And the question of,
Well, actually, they have asserted the right to do that.
We've never actually had a conversation
whether that's legit or not, because remember,
they say they're a common carrier.
That's how they protect themselves
from legal liability.
What's a common carrier, please?
It's when they say they're a platform, not a media company, to avoid responsibility
for the content on their platforms. They say, we're a platform, which in the parlance
of the communications industry, it means that you control a pipe. And if you control a
pipe, you're not responsible for what goes through the pipe, unless you know what it is, right? Unless you're going through and looking at it.
Okay, yeah, so essentially this is what makes it illegal for the postal service to read your letters,
for FedEx or UPS to look inside your packages, and for the phone company to listen.
Right. And so you could say that
Google and Facebook's insistence
that they are platforms, not media companies, and therefore not responsible,
means that they should not be allowed to read the content either.
It's a precedent.
Oh yeah, there's a lot of precedent.
Point is, in the United States, our Congress is,
I think you could say, charitably, ineffective.
And at the moment, the government of the United Kingdom is distracted,
right? And as a consequence, getting thoughtful legislation to deal with problems like this
is harder than it should be given the scale of the problem, right? And my point to you is,
you might find it incredibly convenient to have a list of
people who will sell you flyers. But there are points in your life that should be private.
There are moments when sanctuary is absolutely essential to our psychological and human
well-being. And these guys are systematically stripping that away. Think about Alexa. You put
Alexa in your bedroom. You put Alexa in your office. And you're trusting the employees
of Amazon. You're trusting the hardware vendor. You're trusting hackers to never do anything
that hurts you. And maybe statistically you're going to be okay, but there are going to be a lot of people
who are harmed really badly from just the indiscretions and the things that should remain private
and are not under this model.
And Shoshana Zuboff makes this brilliant point that the imperative of surveillance capitalism is to invade every possible domain of human existence.
I mean, when you hear about Facebook doing experiments to try to create things that allow
you to think your queries or your posts instead of typing them, they're trying to invade your
head.
And you say to yourself, well, it'll never work.
And I go, well, I hope not.
But what if it does?
These are really smart people, right?
And the critical thing is, at a point in time when we're dealing with climate change,
at a point in time when we're dealing with a crisis of obesity and all this stuff, the best and brightest in our economy are devoting 100% of their effort to manipulating human
attention and human behavior.
Now that sounds very Chinese to me, right?
In China, the government is determined to manipulate everybody in order to keep them in
line.
And I'm going, well, I don't know how you
feel in the United Kingdom, but in the United States of America, I think that's a very
un-American thing.
And our companies are doing it, and we are not conscious of it.
And I just think there is something wrong with all this happening outside of our awareness.
I think you're right. One of the things that I want to touch on, I want to begin to go back a
little bit further to the beginning of your story and talk about this journey from 2006 to 2016.
And find out — you were with Facebook through, not its inception, but really its
adolescence, I suppose, and its establishing itself in the world. So what
did you see during that period? I'd also love to hear the first time that you
met Mark and what that situation was like as well. I'm sure that the
listeners would be fascinated. Ah, that's a funny story. So let me go there first and then you can ask the rest of the questions. Okay. So when I met Mark,
he had just had his 22nd birthday. Facebook was two years old. They had nine million users,
all of whom were university and high school students in the United States who were required to use
their school email account.
So effectively, they had authenticated identity for every user on Facebook.
I thought that was the game changer.
That that meant Facebook would finally crack the code of social media that had eluded
MySpace and Friendster and America Online and all the early efforts to bring people together online.
I was convinced that anonymity allowed bad actors to pollute any environment they could get into, and that all of these things had shown that from the beginning:
whenever you allowed anonymity, it always descended into an ugly mess.
And so I had not used Facebook when I met Mark
because I was 50 years old and I didn't have an authenticated
school email address.
And so I was convinced of this without actually having used
the product.
It was conceptual: I thought he'd solved the core problem.
And one of his colleagues sent me an email saying,
my boss has a crisis and he needs to talk to somebody who's
been around a long time, who really gets the Valley, and who can help him figure it out. Would you take a meeting?
And so Mark comes to my office. And you know, he looked just like Mark Zuckerberg.
He's got the sandals. He's got the skinny jeans, the gray T-shirt, the hoodie, the courier bag.
Short curly hair. He sits down, I sit down. Now, keep in
mind. I'm in business with a rock star, right? Bono. We're investing at the
intersection of technology and media and we have one conference room that's set
up like a living room, with video game consoles and stereos and sound equipment in it.
It looks like a living room, right? I mean,
big comfy chairs. We're three feet apart in these two comfy chairs.
And I say, Mark, before you say a word, would you mind if I gave you the context for why I'm
taking this meeting?
Because once you start talking, you'll assume anything I say is influenced by what you've
already told me.
So I need to say something before you do. He said, go ahead. I said, look, Mark,
you've got 9 million
users, but I think you have the most important company since Google, and before long, you will
be bigger than Google is today.
And because of that, if it hasn't already happened, either Microsoft or Yahoo, is going
to come up to you and offer you $1 billion to buy Facebook.
They're going to do it right now.
And everybody, you know, your mom and dad,
your board of directors, your management team,
your employees, your friends, everybody's saying,
Mark, take the money.
It's a billion dollars, dude, you're 22.
You can change the world, you'll have 650 million bucks.
Your venture capitalists will promise to back your next company,
and they'll tell you it'll be way better than the Facebook.
I said, Mark, I've been doing this for
24 years, and I know two things with extremely high confidence. One:
there is not one entrepreneur in Silicon Valley who has ever had the perfect idea at the perfect time twice. I am convinced
Facebook is the perfect idea at the perfect time. It didn't happen for Steve Jobs, not going
to happen for you. Second thing: I don't care who buys it. They're going to wreck it. This
is your baby. If you believe in the vision, you've got to carry it out. So, you have to
imagine: we're three feet apart, in these comfy chairs, in this conference room with
low lighting, right? It's not like a conference room,
it's really like a living room, and we're right next to each
other, face to face. I've just laid this really heavy trip on a guy I've never met. He's
22, right? I'm expecting some kind of reaction. I got nothing. I mean, he goes into this series of
thinker poses. He's trying to decide if he trusts me. And I don't know if you've ever sat with somebody in an absolutely dead
quiet room expecting a response and not gotten one, right? At 15 seconds, it starts to get
really, really uncomfortable, right? At the one-minute mark, I'm saying to myself,
well, it's okay because he's really showing me great respect. He's trying to decide if
he trusts me. At the two-minute mark,
I'm going, whoa, this is really creepy.
At three minutes, I'm digging
trenches in the upholstery of the chair I'm sitting in. At four minutes,
I'm ready to scream, but I realize I'm in the presence of a singularity, right? I mean
There's nothing like this has ever happened to me in my life and I'm thinking wow
This kid is like he's truly one of a kind.
I mean, I've never seen anybody go through a thought process like this.
Finally, he comes down and he looks at me and goes, you're not going to believe this.
I go, try me.
That story you just told? That's why I'm here.
That exact thing has just happened.
Every detail that you said was correct.
Yahoo, billion dollars.
Everybody told me, sell the company.
And I said, well, Mark, what do you want to do?
Do you want to sell?
He goes, no, but I can't run it by myself and I don't want to disappoint everybody.
And I said, well, hang on. You've got plenty of capital. You're growing like a
bat out of hell. They don't have any right to tell you the dance is over. So I helped him
understand how you convince people that, wait a minute, when you're part of a rocket ship, you don't abort the mission just because
you think you've gone high enough, right? You play it out. And the whole meeting maybe took half
an hour. He goes back, he kills the deal, convinces everybody to do the right thing which they did.
And from that point for three years, I was one of his advisors because you imagine this.
He's got a problem, right, that his board of directors wanted to sell the company.
Management wanted to sell the company.
He was going to have to do some upgrading, and he didn't know who to trust.
So I was one of the people he trusted, and so I helped with that.
And the key thing I did was to bring in Sheryl.
And I knew Sheryl back from when she was at the Treasury Department, because she and Bono
had worked together to forgive all this debt of emerging countries for the millennium.
And she introduced Bono to me.
And then I helped to introduce her to Google, which got her started in Silicon Valley.
So we were super close.
And I was always convinced she was the right fit for Mark.
And likewise, Mark for her.
So you can imagine I'm a huge believer in this company. But around 2009, Sheryl's
on board and in place, and the thing about really successful executives in Silicon Valley is
they rotate their mentors according to what they need.
And by 2009, they had the management team that they wanted.
So my role was kind of done.
I moved into the background.
And I'm in the bleachers cheering a lot, but I'm no longer an insider from 2009.
What was your role from 2009?
What did you do?
Well, we were an investor.
So it didn't happen right away.
But about a year after I met Mark, he gave us a chance to invest,
and then another one that Elevation took advantage of.
Did you unload every single penny of your net worth into that, to try and maximize?
No, no, no. I mean, it wasn't at all obvious at the beginning what the business model was.
I just thought with the rocket ship propelling itself towards the future as you suggested,
this would be it. Yeah, but that's not, I mean, elevation made a huge investment in it.
Okay, it made as big investments it could, and it worked great for elevation.
But to be clear, it doesn't work quite the way you described. That's just not how the world works.
I understand.
And again, I really liked it.
I liked Mark.
I loved Sheryl a lot.
I was a real big believer.
Now, the thing is, I'm a professional technology analyst, and there were signals along the
way that not everything was right.
But unlike Google, Facebook took quite a long time to get its business model right.
And the thing that's a problem today, it didn't really begin until about 2013.
So I've been out of the company for four years by the time it began.
But the things that led to it related to how he regarded the people who used the products:
he didn't respect privacy, and he wanted to push them, force them if necessary, into
a level of openness that would ultimately be really problematic.
Can you remember the first particular signal
of that kind that comes to mind,
or maybe a couple of those that come to mind?
The problem here was that I thought that he'd learned
from the mistakes.
The first one was a thing called Beacon.
And you may remember Beacon.
Beacon was their first effort at monetization at scale.
And the notion was, if you were out in the real world
and you bought something and you were a Facebook user, the person you bought it from could do a promotion on Facebook and publish it to your
newsfeed. You know, you went to a sporting goods store and bought
trainers, right? And they'd publish it in the newsfeed. But almost immediately a disaster happened. A young American man bought an engagement
ring from an online company called Overstock.com. And Overstock was one of those sites that
sold things at really deep discounts. So they published in the newsfeed that Mr. X has just bought
an engagement ring for 95% off,
and here's a picture of it.
So the first word his bride-to-be gets
that he's going to propose is that she sees it in her
newsfeed, and all of his friends see this thing
at the same time. Now, nobody
has any control. They have no idea this is going on. It's just started. It's a massive disaster.
And I mean, we're talking train wreck of the first order. And it's the kind of thing
where you would think a company would never go near invasions of privacy again. And I want to say that's sort of 2007 that that's going on. And so
Sheryl comes in in early 2008, and, you know, they repackage the technology as something called
Facebook Connect, which works the same way, but does something totally different.
Facebook Connect is a very convenient way to log on to websites.
You don't have to remember lots of passwords.
The problem is, of course, that Facebook used it to spy on people as they went around the
web.
The same way that when they put Like buttons around the web, it was all about tracking what
you were doing as you went around the web.
Now, Facebook resisted using that data in their advertising
for a long time. But when they went public, the ad tool they had at the time
was incredibly successful because of the growth in users.
Advertisers have to go where the audience is.
And so they were putting lots of ads on,
but Facebook repriced the ads.
And so all the advertisers were really angry.
Facebook didn't want to cut the prices.
They wanted to make the ads more valuable.
So they realized they had to take what they would call web data, from spying on you as you
go around the web, tracking you, and embed that into the ad system, which they did beginning
in 2013.
And that was the trigger for all the bad stuff that happened.
The problem was that it wasn't obvious that it would have the negative consequences for
elections or for public health or for privacy.
At the beginning, it just appeared as though you were getting ads that were better targeted, right?
They seemed way more relevant to you.
And that you can log into a new e-commerce website by saying sign up with Facebook as opposed
to having to put your credentials in.
And you could like something out on the web and it would show up on your Facebook feed.
You know what I'm saying?
All these things seem very innocent and convenient.
But what we didn't think about was that literally anybody could use them, which means that,
you know, in Tahrir Square, you've got a protest organized in Egypt over Facebook, and then
crushed over Facebook, because the forces of darkness, if you will, had way more money, way more power,
and every bit as easy access to Facebook as the protesters had.
And so they were able to use all these tools to identify who the protesters were, right?
I mean, because Facebook, basically in those days, you know, they had this thing: when they
were first trying to figure out monetization,
all they had was a large user count.
But people spent like a minute a day.
And then this company called Zynga came along
with a poker game and with FarmVille.
And people were spending an hour a day
playing FarmVille, playing poker.
And what Zynga did that was so clever
was they kind of hacked the Facebook system to get friends lists. Facebook was like, wow, if we give app developers access to friends lists,
they could build these social apps, and that'll make usage go way, way up.
So beginning in 2009 or so, they started systematically doing that. The US Federal Trade Commission goes, wait a minute,
you have to get permission from the users before you do that. Facebook
goes, okay, no problem, we'll sign a consent decree promising to do that, which they completely
ignore. So they continue to give access. And here's the math. They went public in 2012. They
had nine million apps in theory. All of them were eligible to get these friends list. But let's
just imagine only one percent did. I believe the number's
actually higher than that. But let's imagine only 1% did. That would be 90,000 apps. Let's think
about what some of those apps were. Well, how about Xbox Live? How about PlayStation?
Right? I mean, these are things with tens or hundreds of millions of users getting access to the
friends list of all of those users on Facebook.
That's everybody on Facebook multiple times over, right?
So everybody's data is out there. That's the thing that Cambridge Analytica was about.
That exact thing.
And that took place after the consent decree.
And that's why the Federal Trade Commission just put a $5 billion penalty out,
which by the way was a bargain.
The reality is, if you applied normal Federal Trade Commission pricing
to each violation, it was probably multiple trillion dollars'
worth of violations.
I mean, as digital toxic spills go, it's hard to imagine one bigger
than that, but the digital toxic spill associated with things like video game platforms was
almost certainly orders of magnitude larger.
And five billion is just a cost of doing business in a case like this.
It's way, way, way too low.
And it didn't mean anything; the investors thought it was literally less than a slap on the wrist.
I mean, the investors treated it like they'd just been handed a bouquet of roses by the
Federal Trade Commission.
Is that not terrible?
And the point here is, I'm an analyst.
I might have picked up signals earlier.
And the embarrassing thing was I retired at the end of 2015.
And I retired because the culture of Silicon Valley had changed.
If you think about it, I grew up in the era of Steve Jobs and Gordon Moore, this notion
of using technology to empower the people who use it.
It had the combination, the intersection, of the value system of the US space program with
the hippie value system
of empowerment.
And we all believe in that.
And that began to change after 2000.
And I didn't initially appreciate how profound the change was.
But it was really led by what is now known as the PayPal Mafia, which is Peter Thiel and
Elon Musk and Reid Hoffman and the rest who founded PayPal,
who were not just brilliant executives but brilliant investors, and had a set of insights
about how the technology world was evolving
that they saw before anybody else and took advantage of before anybody else. And they were also really
what created social media.
So they did LinkedIn and Facebook and a bunch of other stuff.
But they brought a different value system.
They brought this hyper-libertarian value system of, hey, move fast, break things, right?
You're not responsible.
You could disrupt without being responsible for the consequences of your actions.
And that, unfortunately, happened in kind of a vacuum that followed the bubble bursting in
2000.
The whole industry was in retreat, and it turned out that that was a critical moment,
because suddenly — you know, the limitations historically of technology were that we didn't
have enough processing power and memory or storage or bandwidth to do what you wanted
to do.
Those all basically evaporated around 2003, 2004.
So you could, for the first time,
do global consumer products like Facebook or Google
or Instagram.
And these guys saw it before anybody else.
So they got there and they set the rules
and they set the culture.
And at the beginning, it was like this hybrid culture, right?
Where these people were doing real business
and more or less
obeyed the rules. But the second-generation companies — so now we're talking about YouTube,
We're talking about Uber and Lyft and Spotify. We're talking about Airbnb. These guys basically
started off with the notion that laws just don't apply to us.
And that there are people out there who do not understand what's going on and we can take
advantage of them because we're not responsible for the harm.
So suddenly predatory business models become the rule in Silicon Valley.
You know, there's a company in the nicotine business called Juul that makes vape pens. And every one of these things I had a chance to invest in.
And I'm just sitting there going, wait a minute, this is the best that Silicon Valley
has to offer. I can't manage other people's money if I'm not willing to invest in these
things. But my personal value system didn't allow me to go there.
And so I realized I've got to retire. So I retired at the very end of 2015,
because Elevation wound up
very successfully. We distributed everything at the end of 2015, and I'm free, free at last.
And something happened because within a month, I'm on vacation with my wife and I'm looking at
Facebook. And I see posts coming from Facebook groups,
notionally associated with the Bernie Sanders for President campaign.
This is a very beginning of the Democratic primary.
And they're really deeply misogynistic.
And they're not the sort of thing any campaign would want
its fingerprints on.
You'd never put misogynistic things
on an official Bernie site.
And yet it was spreading like wildfire, like somebody was spending money to get my friends to
join these Facebook groups. I'm thinking, that's really weird. And really bad.
You know, because I have a rock and roll band called Moonalice, actually,
and a second one called the Doobie Decimal System. And we've used Facebook really successfully to build our fan base,
to sell tickets and tell fans about when we've got shows and events.
Yeah, and to communicate with them directly, and for them to communicate with us.
I mean, there's a Facebook group for the band, but there's also one called The Tribe. It's just the fans,
right? And
You know, so, I mean, I've been a true believer
in Facebook, an early adopter. I mean, we did the first Facebook Live of a rock and roll
concert. The day the product was turned on, we had a show that night, and we broadcast
the whole show on Facebook Live, the first time that had been done.
That's cool.
Yeah, no, and we did a bunch of stuff. We were the first one on whatever the one is that Twitter
owns.
I can no longer remember the name, but I know what you mean.
Yeah, what's it called... Periscope.
Periscope.
But all those things came out.
We missed Meerkat by like four days, but, you know,
we were really early adopters and we've been streaming our
shows for years, right?
So we really knew a lot about it.
And I have four patents on live streaming, which by the way are completely
worthless because of the market power of Facebook and Google. I can't enforce them. But
the point here is that in March of 2016, Facebook expelled a group that was using the ad tools to identify people who are interested
in Black Lives Matter, which is a protest group and a completely peaceful, honorable, really
in my opinion, important civil rights organization.
And they were gathering data on the identity of these people and selling it to police departments,
which is a massive civil rights violation. Now, Facebook did the right thing, right? They expelled them,
but by then the damage had been done. And that's the first inkling I got that, wait a minute.
These ad tools are not benign. But Brexit was the critical thing, because with Brexit, all of a sudden I realized, wait a minute, there is an asymmetry. Because of engagement — you know Tristan's hypothesis, right?
You want to go down the brainstem.
So the most basic human emotions, fear and outrage, which are part of fight or flight, those
are the lowest common denominator.
You can get engagement from the largest number of people by appealing to that.
It's asymmetric.
People with a naturally conservative personality will be on average more fearful.
From a political perspective, it's easier to use these tools to inflame people who think of themselves as being conservative,
which means you can't use these tools equally effectively across the political
spectrum.
There's a massive asymmetry.
That's interesting.
And that's something I've never heard or thought of before.
And that asymmetry didn't begin here.
That's been Rupert Murdoch's model from the beginning, right?
And so the Murdoch organization has played to that
on Fox and on Sky, wherever they've had control
of the news agenda.
And, you know, it's just, these are elements
of the human psychology that we can't avoid.
And I didn't know all the nuances, but I knew enough to know that there was something
really wrong here.
And that's when I got activated to try to do something about it.
But again, in this context, I'm viewing Facebook as the victim.
When I went to Mark and Sheryl, nine days before the US election in 2016, it was as a friend, as their mentor,
to try to alert them of something that was,
I thought, really horrible.
And the truth is, I think that they looked at me
and said, Roger, if you're so smart,
where are your two billion active users?
You know, where are your billions of dollars? Because for the prior five or six years, pretty much everything Facebook had done had worked out. And if
you're as successful as they were, for as long as they were, it's human psychology: you start to
think that all these things happened because you're a genius
and that therefore everything you think is right.
Right?
They've been proving critics wrong for a long time.
And so I think their visceral reaction was to just assume I was wrong.
And I totally understood that, which is why I spent three months trying to persuade them,
because, I mean, I've been around this a long time, and this is not the first group of people
I've seen have that kind of Midas psychology.
And I didn't think less of them because of it,
right?
I just thought it was an occupational hazard,
and I was worried that I was no longer the right messenger.
But I spent three months, I gave it my best shot, because I didn't want to say anything publicly. I thought the minute I said something public, it was going to change our relationship permanently. So I didn't say anything public for a long time.
I actually waited another six months after that before I said anything publicly.
And because I wanted to give them a chance to get it right, I was convinced that no one
could fix these problems
better than the people running the companies.
And keep in mind, at that point,
I think it's just Facebook.
I have no idea that this is also about Instagram. It's also about Google. It's also YouTube. It's about Microsoft, Amazon, all these other companies.
To interject there, what I think a lot of people at home might be thinking, and certainly one of the thoughts that's coming through in my mind, is that it looks to me like right now, legislating from the top down, restricting what platforms can do, or some sort of very stringent law enforcement with more transparency, etc., appears to be one of the few cards that we have to play now. But as you've alluded to there, a much more effective solution would have been for it to be bottom-up, would have been for the people within the business, behind the business, to have gotten there, nipped it in the bud early, and then redirected the business model appropriately.
Well, we continue to appeal to employees at all these companies.
And Tristan does this every week.
And he's still in communications with senior people at particularly Facebook and Twitter.
Now, the reason we do that: you may remember Uber had a massive situation with a woman named Susan Fowler, and Susan wrote a blog post a couple of years ago in which she talked about the toxic culture at Uber.
And here's the thing, the toxic culture of Uber wasn't a secret. Everybody knew about
it, but the management team, the board of directors, the employees tolerated it because the
company was on this rocket ship of growth. And Susan Fowler wrote this blog post that
for whatever reason, shifted the perception of the employees so dramatically
that over the next six months there was almost a hundred percent turnover of the management
team. It changed everything. And we looked at that and thought, that's all we need here. We just need to get the right person. And it was interesting, because it was a series of people: first Justin Rosenstein, who did the Like button, then Sean Parker, who'd been the first president, and Facebook completely ignored these guys. And then
Chamath Palihapitiya, who had been in charge of growth until 2011, does a thing at Stanford University in late November of 2017. And it gets publicized in early December, and we're going, this is it. Chamath had hired essentially all the people running the growth team, which is the brain hacking part of Facebook. And we're thinking, this is the guy.
Well, Facebook apparently recognized that.
They came down on him and somehow within 72 hours, he reverses his field completely and
starts becoming a spokesman for Facebook.
Wow.
And he goes on TV, he came to the UK, he was on Christiane Amanpour on CNN, and he's talking about how, you know, Zuck's the smartest guy on earth and only Zuck can fix everything in society. And I mean, not quite, but something like that.
And he went on the attack.
Wow.
I mean, Chamath Palihapitiya is a world-class poker player. I mean, seriously, he finished like 101st out of, I want to say, 3,500 players in the World Championship of Poker.
I mean, he is no shrinking violet. The notion that he's going to roll over because Sheryl calls up and yells at him? That's just not him.
What do you think happened?
I have no idea.
I mean, there's been a lot of speculation. I only met Chamath one time. Zuck asked me to help recruit him when he came into Facebook, because his office was literally directly upstairs from mine at Elevation. And so I spent an hour and a half with him, and I will tell you, Chamath's one of the smartest people I've ever met, one of the most ambitious people I've ever met. He's almost the prototypical top-of-the-pyramid Silicon Valley bro.
Exactly the kind of person you would want running the growth team at Facebook. I mean, Zuck totally picked the right guy. And, you know, so I don't know enough to know what really drove it. But, you know, you can speculate.
But the point is, it doesn't matter.
They turned him around.
And ever since, with all that news flow,
there hasn't been anybody else, other than Chris Hughes, the co-founder, who's really come out and said anything.
Now, Chris has been amazingly effective.
He was Mark's roommate in college, he was the fourth co-founder of Facebook, and he has been talking a lot about antitrust.
I have enormous respect for Chris.
But think about this.
According to Médecins Sans Frontières, there have been at least 9,000 people killed in what the UN calls a classic example of ethnic cleansing in Myanmar.
There are I think 42,000 missing and presumed dead.
So if they're all dead, let's assume that's 51,000 people.
Not one person at Facebook has spoken out saying,
this is a moral outrage and we need to do something about this.
Not one.
And I'm thinking myself, how can that be?
I mean, how can that be?
After Christchurch, I mean, Christchurch, in proportion to the population of New Zealand, is exactly the same percentage that 9/11 was in the US, right? I mean, this is a nationally traumatizing event. And nobody's got a problem with that?
I'm just going, hang on, what is going on here? You know, how about YouTube? Right? I mean, YouTube recruits and trains terrorists. It poisons the minds of little children, and it's caused an epidemic of measles in the United States. You know, climate change denial happens on YouTube,
flat earth happens on YouTube.
I mean, there's been one person, Guillaume Chaslot, who was an algorithms engineer, left the company and has become an activist.
But no current employees have got a problem with this.
Actually, they do. They spoke up quietly and the company ignored them, right? And that's the difference between Google and Facebook. At Google you've seen the employees go to management and tell them there was something wrong with YouTube. They fought
against Maven, which was a defense contract. They protested, at least a little bit, against creating Dragonfly, which was the version of the search engine for China with all the censoring. And then they had a, well, at least temporarily successful protest against required arbitration in cases of sexual harassment and sexual misconduct.
But Google's punished the people who organized the protest and they've gotten away
with that. But at least at Google, you see signs of the employees sticking their heads up and
doing something. And that makes me hopeful. But Google's a really smart company, really well-managed,
with a lot of tentacles in a lot of places. They're the ones who are, you know, trying to replace cities. Facebook's just
trying to replace currencies, which is a disaster, right? I mean, because you sit there and you go, hang on, what makes you a sovereign nation? The legitimate use of force with the military and the police, and control of a currency. And Facebook goes, hey, we're gonna replace you on the currency part.
And I'm like, WTF, dudes, I mean, come on.
That's nuts.
So to round up, one of the things that I'm thinking,
I'm sure a lot of people at home will be as well,
is that we are currently a part of this infrastructure.
A lot of these things are part of our lives.
Totally. Are there any bits of advice?
Is there anything which you can suggest to the people at home, which are the lesser of the evils? Well, actually, no, actually, there are positive things going on.
Okay. The most important thing to understand is that Apple has decided to make protecting consumer privacy the focus of their product development, the focus of their business.
And the things that they've done are huge
and they're getting more huge by the day.
So, you know, if you do facial recognition on Apple,
that image of you, which is incredibly detailed,
never leaves the phone.
That's a huge deal.
Siri, one of the reasons why Siri doesn't perform as well as Alexa is that they do a huge amount of the processing on the phone, to minimize the data that
goes into the system. Apple maps, Apple spends a billion dollars a year on this mapping product.
They do no monetization. Do you know why it exists? So that you don't have to use Google
maps. And here's what's really interesting. Instead of doing what Google does, which is to track every route you ever take forever, Apple, as soon as you're done with the route, snips off the place you left from and the place you're going to, breaks the rest into ten segments, and anonymizes them so they can't reconstruct your route.
Is that true?
Wow.
So then Apple creates Apple Pay, where you can use your phone to pay for stuff.
The merchant, all they get is money.
It's like you've just done a cash transaction.
But the data leakage occurs with your credit card, particularly if you're on Mastercard. There's a little bit less data leakage with Visa; the data leakage there occurs in a different place.
But Apple says, okay, so we're going to make a credit card.
And the deal on this is the bank they partnered with can't leak any of the data. So if you use an Apple Card, which comes out this month in the United States and soon thereafter everywhere else, if you get an Apple Card and use Apple Pay and Apple Card, it is like
paying cash. It is a great idea. Apple has a new thing coming out in the new operating
system called Sign In with Apple. It's the equivalent of Facebook Connect or Google Connect. But the difference is it uses a random email every time you log onto a website, so they can't track you. Now, this is really important
stuff because basically Apple is going after a portion of surveillance capitalism and they're
going after it hard and in a way anybody can use.
The other thing that's going on around Apple is that there are all these other products, like DuckDuckGo for search, but also now for browsing; Disconnect.me, which blocks trackers from being sent out by apps that are running inside your phone. I mean, the Apple ecosystem is becoming wildly more secure. Whereas Android: the whole purpose of Android is surveillance. So the first thing is, if you've got an Android, as soon as your contract is done, go Apple, okay?
Seriously, that is the best thing you can do. I don't think you have to
drop, you know, using the apps you love. But what I would recommend you do is don't embrace
ecosystems all the way in.
If you're going to use Facebook, don't use Messenger.
Don't use everything with Google and Google accounts.
Do things anonymously.
If you must use Google products, now I treat Google
like a game.
You remember Frogger?
Yep.
I treat Google like Frogger.
I'm the frog, Google's the river, and the logs are the other products, like the Apple products or DuckDuckGo or Disconnect.me.
And my goal is to avoid using Google. I'm a musician, and the way musicians share music is, if you've got to learn a song that was created by somebody else, you listen to it on YouTube, right? So I have a real problem with YouTube, but I haven't done a Google search in two years.
I haven't used Google Docs in over a year.
But there are some things, you know. This morning somebody called me and they wanted us to do a conference call on Google Hangouts. I go, no, I can't do that. You've gotta call me.
Yeah.
And my point here is, I made it into a game.
And I can't really win the game.
I can just stay alive a little bit longer than most people.
I pay cash for a lot of stuff.
Now, you say, Roger, you're paranoid.
I go, I readily admit that.
But my point is, I'm mostly doing it
to see if it can be done, right?
Yeah.
And again, it's a thought experiment. You're supposed to ask the question, what if I'm right?
And the most important thing is the evidence. I track every news story that goes by.
I get hundreds per month.
These guys haven't learned a thing. I mean, the ad protections that they were supposedly going to put in place in the context of elections do not work, right?
You know, you see in the UK there's this brand new story about Boris Johnson, and, you know, that consultancy that did basically propaganda on behalf of Boris Johnson. It was authentic because the people put their real names on it, but you couldn't connect it to Johnson, and it basically went through a loophole in the new rules.
I mean, there is no protection for democracy right now,
and obviously Google's still trying to push its smart cities,
and Amazon's going in there and taking over the healthcare industry,
and Microsoft's playing all these games in AI.
And we don't yet have enough pressure.
And so I'm just saying to people, look, when you meet a politician, get in their face
and go, when are you going to look out for my privacy?
When are you going to look out for my self interest?
Because there is no reason that you can't have all the services you love on the internet without surveillance capitalism. The surveillance is there only for the benefit of the vendors of the platforms. And the services we love, God bless. I mean, what I want is for the next big thing to be a return to technology that empowers us. And that's a way bigger opportunity.
Why do we want to only work for four companies in the whole world?
It'd be much better to have an environment where you can start a company in your living room
or in your basement and be an entrepreneur and do your own thing.
And right now that's getting harder and harder to do.
And we don't have to be that way.
We can go back to the value system that Steve Jobs called bicycles for the mind, technology that helps us be more fit as well as having more fun.
Roger, it's been fascinating, a lot to think about today, especially with David Carroll's episode. If you have not already checked that out, I suggest that you go and have a look.
Also, what would you suggest if people want to read a little bit more? Is there one blog?
Is there one Twitter account?
Is there one particular web page that you would direct them to?
I wish there were.
The most important thing, if you have the courage, is to read or listen to the audiobook
of The Age of Surveillance Capitalism by Shoshana Zuboff.
Failing that, the best journalist in the United States on this issue is Cara Swisher.
She does a monthly, actually now I guess a biweekly, column in the New York Times. So Cara Swisher's Twitter account is a really good one to follow. I think David Carroll's Twitter account is an excellent follow. Carole Cadwalladr in the UK, relative to anything that's, you know, privacy-related with Brexit and Cambridge Analytica, she's absolutely the best.
But at the moment, there are not enough of us in the advocacy world to create a central
hub for all this.
God knows, I mean, I just don't have enough time in the day.
And so what I would tell you here is it does take a little bit of energy, but that's okay, because, you know, there's only a million of them and the rest of us are on the same team. So you can join Team Human and go for it, okay?
I like it a lot, Roger. I really appreciate your time. Thank you so much. I'll link to Zucked on Amazon, and I advise you to go and check it out. It will be on Audible as well, right?
Oh yeah, it's on Audible.
And the great thing is I'm coming to the UK in September.
I'm going to do a bunch of events in London.
And you can go to zuckedbook.com if you want to see where I'm going to be.
And just the thing I would say to everybody is, again, it's your life.
You get to make your own choices. Well, I want to make sure that you get to make your own choice, okay?
And that's what this is about.
That's a great way to finish.
Roger, thank you so much for your time, guys and girls who are listening.
If you are in the UK and you want to meet Roger, don't forget to go and check out the
website.
It will be linked in the show notes below, along with everything else.
Please do not forget to like, share, subscribe, do all that good stuff. Roger, I hope you have a great time and I really hope
you enjoy your trip to the UK as well. I hope we'll get to see each other, Chris.