Behind the Bastards - Part Two: Mark Zuckerberg Should Be On Trial For Crimes Against Humanity
Episode Date: September 24, 2020. Robert is joined again by Jamie Loftus to continue discussing Mark Zuckerberg. Learn more about your ad choices at https://www.iheartpodcastnetwork.com. See omnystudio.com/listener for privacy information.
Transcript
Alphabet Boys is a new podcast series that goes inside undercover investigations.
In the first season, we're diving into an FBI investigation of the 2020 protests.
It involves a cigar-smoking mystery man who drives a silver hearse.
And inside his hearse was, like, a lot of guns.
But are federal agents catching bad guys or creating them?
He was just waiting for me to set the date, the time, and then for sure he was trying to get it to happen.
Listen to Alphabet Boys on the iHeart Radio app, Apple Podcast, or wherever you get your podcasts.
What if I told you that much of the forensic science you see on shows like CSI isn't based on actual science?
And the wrongly convicted pay a horrific price.
Two death sentences and a life without parole.
My youngest? I was incarcerated two days after her first birthday.
Listen to CSI on Trial on the iHeart Radio app, Apple Podcast, or wherever you get your podcasts.
And that was a little, a little bit of levity at the start of it before we get into depressing shit again.
Some abstract levity.
Some abstract levity. Yeah.
Pieces of levity that one can assemble into comedy.
Yeah.
That's very nice. Yeah.
I mean comedy, you know, it is just a series of things that you put in the correct order.
It's like a deconstruction of comedy.
Like when people take apart a sandwich and then serve it on a plate in a fancy restaurant.
I gotta be honest.
Sometimes someone says it's a deconstruction of comedy.
It's the least funny shit you'll ever hear in your entire life.
They're like, he's deconstructing the medium.
And it's usually just like some, some guy.
It's usually just some guy.
Yeah, it's never any good.
But you know what is good, Jamie?
What?
Facebook's.
I was like, this can't be a transition to Mark Zuckerberg.
No, because nothing about him or his company is good.
Robert just talked about this delicious lunch he had as I eat a pancake.
I had a great lunch and I'm eating a packet of peanut butter.
And you know what?
I'm content.
Sophie is eating peanut butter.
I had a delicious lunch.
There were fried eggs involved.
I love that for you.
I had a couple of chips and most important, most important,
we're all going to get back to my favorite thing to do with my good friend, Jamie Loftus,
which is talk about the extensive crimes of Mark Zuckerberg.
Oh, yes.
I changed shirts between episodes, Robert.
So now I have a little Marky with me.
Yeah, you do.
You've got your, your, your Marky Z shirt.
Yeah.
My favorite, my favorite Mark quote: that you can, you can be unethical and still be legal.
That's the way I live my life.
Ha, ha.
It is amazing.
And he really, I mean, there's been a lot said about him,
but the man sticks to his guns.
He lives by this credo to this very day.
Yeah. Yeah. Yeah.
You know who else sticks to their guns, Jamie?
Whom?
The death squads of the various dictatorial,
political candidates who use Facebook to crush opposition and incite race riots.
That was a transition.
That was a transition.
So Jamie.
All right.
I'm going to start.
It's time to start the episode.
I'm going to start with a little bit of a little bit of an essay here.
So once, once upon a decade or so ago,
I had the fortune to, to visit the ruins of a vast Mayan city in Guatemala called Tikal.
And the scale of the architecture there was astonishing.
If you ever get the chance to visit one of these cities,
you know, in Guatemala or in Mexico or wherever,
it's really worth the experience.
Just the, again, the size of everything you see,
the, the, the precision of the stonework.
It's just amazing.
And one of the things that was most kind of stirring about it was the fact that
everything that surrounded it was just hundreds and hundreds of miles of dense,
howling jungle.
So I spent like an afternoon there and I got to sit on top of one of these
giant temple pyramids,
drinking a one liter bottle of Gallo beer and staring out over the jungle canopy
and just kind of marveling at the weight of human ingenuity and dedication
necessary to build a place like this.
Sounds metaphorical.
And while I was there, Jamie,
I thought about what had killed this great city and the empire that built it.
Because a couple of years earlier, really not all that long before I visited,
theories had started to circulate within the academic community that the Mayans
had, in the words of a NASA article on the subject in 2009,
killed themselves through a combination of massive deforestation and human-induced
climate change.
A year after my visit, the first study on the matter was published in Proceedings
of the National Academy of Sciences.
I'm going to quote here from the Smithsonian magazine.
Researchers from Arizona State University analyzed archaeological data
from across the Yucatan to reach a better understanding of the environmental
conditions when the area was abandoned.
Around this time, they found severe reductions in rainfall were coupled with
a rapid rate of deforestation as the Mayans burned and chopped down more and
more forests to clear land for agriculture.
Interestingly, they also required massive amounts of wood to fuel the fires that
cooked the lime plaster for their elaborate constructions.
Experts estimate it would have taken 20 trees to produce a single square meter
of cityscape.
So in other words, the Mayans grew themselves to death,
turning the forests that fed them into deserts all in the pursuit of expansion.
It's a story that brings to mind a quote from the great historian Tacitus,
writing about Augustus Caesar and men like him.
Solitudinem faciunt, pacem appellant: they make a desert and call it peace.
That's what he's saying about Augustus Caesar and the emperors like him.
They make a desert and call it peace.
That sounds like one of those sundial phrases.
I think a more accurate summation of the 200 years of peace that Augustus Caesar
created than what Mark put out.
They make a desert and call it peace.
Now, I read that quote for the first time as a Latin student in high school,
and I saw it referenced in relation to Mark Zuckerberg in a Guardian article
covering that New Yorker piece we quoted from last episode.
And the title of that New Yorker article was,
Can Mark Zuckerberg Fix Facebook Before It Breaks Democracy?
So, democracy, a free and open society where numerous viewpoints are tolerated,
cultural experimentation is possible and evolution is encouraged.
These are the things that have made Facebook's success possible.
It could not have come about without them,
or outside of a culture that embodies those values.
And now that Facebook's member count is closing in on 3 billion,
the social network is doing what all empires do.
It's turning the fertile soil that birthed it into a desert.
And as it was with the Mayans, all of this is being done in the name of growth.
Katherine Losse was an early Facebook employee
and Mark Zuckerberg's speechwriter for a time.
In her memoir, The Boy Kings, which is what you call it.
I'm triggered.
That's a good title.
She lays out what she saw as the engineering ideology of Facebook.
Quote, Scaling and growth are everything.
Individuals and their experiences are secondary to what is necessary to maximize the system.
Mark Zuckerberg and thus Facebook have held a very consistent line since day one
of the company operating as an actual business.
And that line is that Facebook's goal is to connect people.
But this was and always has been a lie.
The goal is growth, growth at any cost.
In 2007, Facebook's growth leveled off at around 50 million users.
At the time, this was not unusual for social networks.
And it seemed to be something of a built in natural growth limit,
like maybe 50 million is about as much as a social network can get
unless you really start juking the results.
And 50 million users, that's a very successful business.
You can be a very rich person operating a business like that.
You can just call it a day.
You can call it a day.
That's a great thing to accomplish.
MySpace Tom was thrilled with that.
Yeah, God bless Tom.
MySpace Tom is now...
Tom who hasn't done a goddamn problematic thing.
Traveling the world now, taking photographs of the world
that Mark Zuckerberg is destroying.
Tom, he might be the only person worth hundreds of millions of dollars
that I'm fine with not taking the money back, right?
Like, let Tom, you're fine.
Like, go keep doing your thing.
Use the money to be boring.
It seems like he has just used his fortune to be boring.
What?
I remember it took like five months for me to get my MySpace deleted.
I don't remember anything about MySpace.
That's my favorite thing about MySpace
is I've forgotten everything about it, but the name MySpace.
Oh, for sure.
MySpace was not good,
but did I learn about a lot of goth music on it?
Yes.
Much middle school angst from not being in somebody's top eight.
But...
Oh, my God.
PC for PC, I did a lot of PC for PC.
As did I, my friend.
You are so pretty.
PC for PC.
And you know what no one used MySpace for?
Genocide.
Organizing militias to show up at the site of protests
and shoot Black Lives Matter activists.
That is very true.
Was not done with MySpace,
and I suspect Tom would have had an issue if it had been.
I think so.
Well, I don't know about Tom's politics, but...
I don't know the man,
but the fact that he's kept his fucking mouth shut
since getting rich and going off to do whatever he does
makes me suspect that he's a reasonable man.
Sure.
Facebook hits this growth limit,
and it kind of levels off a bit.
And again, it's a very successful business in 2007,
but it's not an empire.
And that's what Mark wanted.
That's the only thing Mark has ever wanted in his entire life.
And so he ordered the creation of what he called a growth team,
dedicated to putting Facebook back on the path to expansion,
no matter what it took.
So the growth team quickly came to include
the very best minds in the company,
who started applying their intellect and ingenuity to this task.
One solution they found was to expand Facebook to users
who spoke other languages.
And this is what began, we talked about last episode,
the company's heedless growth into foreign markets.
Obviously, at no point did anyone care or even consider
what impact Facebook might have on those places.
Neat.
Yeah, I'm going to quote from The New Yorker again.
Alex Schultz, a founding member of the growth team,
said that he and his colleagues were fanatical in their pursuit of expansion.
You will fight for that inch, Alex said.
You will die for that inch.
Facebook left no opportunity untapped.
In 2011, the company asked the Federal Election Commission
for an exemption to rules requiring the source of funding
for political ads to be disclosed.
In filings, a Facebook lawyer argued that the agency
should not stand in the way of innovation.
Oh, okay.
It doesn't seem like an innovation to me.
It's a real loose interpretation of the word innovation.
Well, you know, I get this to an extent.
So the other day, I was drunk driving my 4Runner
and I was shooting at some targets I'd set up in the trees.
And the people in the neighborhood I was doing this in said,
oh, for the love of God, please, you're endangering all of our lives.
And I said, you're standing in the way of innovation.
Because I was innovating what you can do drunk
in a 4Runner with a Kalashnikov.
Oh, absolutely.
Yeah.
I understand, Mark.
I like to really heat up a pan and put it on someone's face
just to innovate the art of what you do.
Yeah, you innovate their skin by burning it.
I've innovated your face.
Yeah.
And really people standing in the way of that innovation.
How am I supposed to make progress in hurting people's faces?
Yeah, I'm a fan of how Pol Pot innovated the capital city
of Cambodia by forcing everyone out of it
and then killing hundreds of thousands of them.
It's just...
It's just...
It's innovative.
The way...
All of this is just horrific and like...
I don't know.
I was talking about something else,
but it's like the language of Silicon Valley
applied to these genocidal situations is just so...
It's awesome.
It makes my fucking...
I've just peeled all my skin off.
Look, you have to agree that Hitler was an innovator.
He innovated so many things.
He really did change the narrative there.
He changed the game.
Yeah, he absolutely changed the narrative
from there not being a war in Europe
to there being a war in Europe.
That's called disrupting, honey.
He did disrupt it.
He disrupted the shit out of the Polish government.
Oh my God.
Fun stuff.
Why are you doing this to me, Robert?
So, Sandy Parakilas, who joined Facebook in 2011
as an operations manager,
paraphrased the message of the orientation session
he received as,
we believe in the religion of growth.
The religion of growth is what he was told
when he joined the company.
That's what it was called to him.
Not only horrifying, but like,
could you sound like more of a sniveling loser
than saying the phrase religion of growth?
Yeah, okay.
Yeah.
He said, quote,
the growth team was the coolest.
Other teams would even try to call subgroups
within their teams the growth X or the growth Y
to try and get people excited.
And in the end, Facebook's finest minds decided
that the best way they could...
I know.
I know.
I know.
I like it.
Yeah.
It's horrible.
I'm excited.
I'm horny.
I'm ready to go.
I mean, with that kind of narrative,
I love to hear it.
I love to...
You do love to hear it.
So in the end, Facebook's finest minds decided
the best way they could further the great God of growth
was for Facebook to become a platform
for outside developers.
But this was the way to really,
really get things going again.
And you all remember the start of this period
when Facebook made this change.
This is when like,
what had once been a pretty straightforward service
for keeping up with your friends from college
was suddenly flooded with games like Farmville
and a bunch of like personality tests and shit
that period of time.
The moms fucking drone-struck Facebook
by coming down with Farmville,
sending you five trillion invitations,
leaving your, like, high school choir concert
to go harvest strawberries.
Yeah, I'm familiar.
And making a bunch of fucking money for Facebook.
Whatever appearance these apps took,
their main purpose was the same,
which was to hoover up all of your personal data
and sell it for profit.
Yeah.
I might have given my social security number to Farmville,
and that's just a fact.
Yeah.
And the only person you should give your social security number
to is me.
I do encourage all of our listeners to find my email
and just email me your social.
Yeah, just to your tips line.
Yeah.
It's like, you know, that thing from that documentary
about Keith Raniere,
who we also did episodes on, The Vow.
It's your collateral.
Send me your social security number.
So I'll know that you really care.
Yeah.
So Facebook's employees kind of realized very quickly
after this change was made and these developers
start flooding the service with all of their shit
that the company's new partners were engaged
in some really shady behavior.
One worker who was put in charge of a team
ordered to make sure developers weren't abusing user data,
immediately found out that they were.
And I'm going to quote again from the New Yorker here.
Some games were siphoning off users' messages and photographs.
In one case, he said,
a developer was harvesting user information,
including that of children,
to create unauthorized profiles on its own website.
Facebook had given away data
before it had a system to check for abuse.
Parakilas suggested that there be an audit
to uncover the scale of the problem.
But according to Parakilas,
an executive rejected that idea,
telling him,
do you really want to see what you'll find?
No.
Which look, I can identify with that too.
I recently had an issue where I left a bag of potatoes
in the top cabinet of my kitchen
for, I don't know, somewhere between four and seven months.
And when I found them...
I didn't want to. I knew something was wrong.
I knew something was wrong up there
because of the flies and the strange smell.
But I didn't want to look into it
because I didn't want to see the extent of the problem.
And when I finally did,
I regretted learning what an issue I had made
for myself and my home.
Robert, you are really, you've,
since we last spoke,
you've become very prone to a metaphor.
I am, I am a living metaphor, Jay.
You are living out a metaphor at this time.
Yeah.
An innovative, I innovated those potatoes.
You disrupt, those potatoes were severely disrupted.
I disrupted them with a family of maggots.
Okay.
We don't need to,
we don't need to talk about what a problem my life has become.
Um,
Parakilas told me, the New Yorker reporter, quote,
it was very difficult to get the kind of resources
that you needed to do a good job of ensuring real compliance.
Meanwhile, you looked at the growth team
and they had engineers coming out of their ears.
All the smartest minds are focused on doing whatever they can do
to get those growth numbers up.
Now, Jamie.
Yeah.
Jamie Loftus.
Yeah.
I happened to read this quote.
while I was struggling to work in the midst of unprecedented
wildfires that devastated a huge amount of the state of Oregon
and made our air quality the worst in the world for a while.
On the very day I read that article,
four of my friends and journalistic colleagues
were held and threatened at gunpoint by militiamen
who had taken to the streets of a town very near Portland
in the middle of an evacuation because viral Facebook memes
convinced them that Antifa was starting the fires.
Around the same time that this was happening,
that my buddies were getting held at gunpoint
because they were not white people
and a militia thought that was suspicious.
Around that same time, a tweet went viral
from a Portland resident and a former Facebook employee
named Bo Ren.
She posted a picture of the city blotted out by thick,
acrid clouds of smoke and wrote,
My dad sent me this view from my childhood room in Portland.
It hit me that we have been wasting our collective intelligence
in tech optimizing for profits and ad clicks.
Huh.
Hmm.
Glad you got on the, on that page, Bo.
Well, glad we, I mean,
sometimes it just takes something to put it all in perspective,
wouldn't you say?
Like your home burning down.
Yeah, sometimes.
And the militias being 20 minutes from your door. Yes.
Unfucking believable.
The militias that organize on Facebook.
Yeah. I mean, the story went pretty viral.
Yeah.
They wrote articles about it.
They weren't harmed.
Yes.
Yeah.
The, the two people I knew best who were there were Sergio Olmos
and Justin Yau, who are both wonderful reporters.
But yeah, it was, it was not lost on me that I think of the four
people who were there, three of them were not white people.
Um, and that some of the white reporters had a much easier time.
Interesting things you learn about militias, anyway.
It makes you think.
It makes you think.
Now, uh, I thought that quote was interesting.
Um, anyway, opening...
Interesting. Disruptive.
Disruptive.
Thought provoking.
Like the fires and like militias.
The fires made me think when I could think.
Yeah.
I like how Facebook hurt too much.
Yeah.
I threw up in my N95 mask, walking down the street.
Awesome.
Yeah.
I've been jogging and doing pull-ups in a gas mask.
Um, just half naked in a gas mask on my front lawn.
Like a normal person.
You're the only person I know who, uh, would have seen this as,
as a, as a possible outcome.
And for that, um, for that, I thank you and I curse you.
Yeah.
So anyway, opening Facebook up to developers made a shitload of
money and membership grew.
And for Mark's point of view, everything was going great.
Uh, but Katherine, Katherine Losse, uh, his speechwriter, saw a
lot of the same problems Parakilas had seen.
And in her memoir, she writes.
The idea of providing developers with a massive platform for
application promotion didn't exactly accord, I thought, with the site's
stated mission of connecting people. To me, connection with another
person required intention. They have to personally signal that they
want to talk to me, and vice versa. Platform developers, though, went
at human connection from a more automated angle.
They churned out applications that promised to tell you who had a
crush on you.
If you would just send an invitation to the application to
all of your friends.
Oh, I know.
The idea was that after the application had a list of your
contacts, it would begin the automated work of inquiring about
people's interests and matching people who were interested in
each other.
Soon developers didn't even ask you if you wanted to send
invitations to your friends.
Simply adding the application would automatically notify all of
your Facebook friends that you had added it and invite them to
add it too, using each user as a vessel through which invitations
would flow virally without the user's consent.
In this way users' needs for friendship and connection became a
powerful engine of spam,
as it already was with email and on the internet long before
Facebook. The same "we'll tell you who has a crush on you
if you just send this email to your address book" ploys were
familiar to me from Hopkins, when spammers would blanket the
entire email server with emails in a matter of hours, spread
virally by students gullibly entering the names of their
crushes and their crushes' email addresses.
This was the start of Facebook making choices for its users.
Choices that were based on what would be best for the social
network which was keeping people on the site for as long as
possible.
The growth team saw that proactively connecting people to
each other worked out really well for Facebook's bottom line
even though sometimes, for example, people who had been
horribly abused and raped by their spouses were reconnected to
those spouses who they were hiding from and had their
personal data exposed to them.
A thing that happened repeatedly and still happens repeatedly.
But that's a small price to pay for growth.
In 2010, Facebook launched... for that inch.
You got to fight for that inch, and sometimes fighting for that
inch means connecting abused women to the men who horribly
injured them.
That's like Mark talking to Priscilla when they're trying to
conceive a child.
He's just like, you got to fight for my inch, honey.
You got to fight for it.
Oh, Mark Zuckerberg is incapable of talking during sex.
He lets out a high-pitched hum that is only audible to crickets.
Yeah, he's sort of got a Kendall situation going on,
where he just has a sex lump that gets really hot.
Yeah, she has to actually withdraw the semen from inside
his sex using a needle.
I think it needs to reach a certain temperature where she
has to actually put in the little, hold on, hold on, hold on.
Okay.
Holding my vomit.
Put in a syringe and then suck and then she just has it.
And then she just has it.
And if you want to have the emotional equivalent of Mark
Zuckerberg's semen, no, that's not.
Oh, that's a bad way to...
That's not fair to the products or services.
It's not.
Anyway, here they are.
Vomit.
During the summer of 2020,
some Americans suspected that the FBI had secretly infiltrated
the racial justice demonstrations.
And you know what?
They were right.
I'm Trevor Aronson, and I'm hosting a new podcast series,
Alphabet Boys.
At the FBI, sometimes you got to grab the little guy
to go after the big guy.
Each season will take you inside an undercover investigation.
In the first season of Alphabet Boys,
we're revealing how the FBI spied on protesters in Denver.
At the center of this story is a raspy voiced,
cigar-smoking man who drives a silver hearse.
And inside his hearse was, like, a lot of guns.
He's a shark.
And not in the good, badass way.
He's a nasty shark.
He was just waiting for me to set the date, the time,
and then for sure he was trying to get it to happen.
Listen to Alphabet Boys.
On the iHeart Radio App, Apple Podcast,
or wherever you get your podcasts.
I'm Lance Bass, and you may know me from a little band
called NSYNC.
What you may not know is that when I was 23,
I traveled to Moscow to train to become the youngest person
to go to space.
And when I was there, as you can imagine,
I heard some pretty wild stories.
But there was this one that really stuck with me.
About a Soviet astronaut who found himself stuck in space
with no country to bring him down.
It's 1991, and that man, Sergei Krikalev,
is floating in orbit when he gets a message
that down on Earth, his beloved country,
the Soviet Union, is falling apart.
And now he's left defending the Union's last outpost.
This is the crazy story of the 313 days he spent in space.
313 days that changed the world.
Listen to The Last Soviet on the iHeart Radio App,
Apple Podcast, or wherever you get your podcasts.
What if I told you that much of the forensic science
you see on shows like CSI isn't based on actual science?
The problem with forensic science in the criminal legal system
today is that it's an awful lot of forensic and not an awful lot of science.
And the wrongly convicted pay a horrific price.
Two death sentences and a life without parole.
My youngest, I was incarcerated two days after her first birthday.
I'm Molly Herman. Join me as we put forensic science on trial
to discover what happens when a match isn't a match
and when there's no science in CSI.
How many people have to be wrongly convicted before they realize
that this stuff's all bogus? It's all made up.
Listen to CSI on Trial on the iHeart Radio App,
Apple Podcast, or wherever you get your podcasts.
We're back.
Okay.
So in 2010, Facebook launched Facebook Groups,
which would allow just about anyone to create a private
walled off community to discuss just about anything,
including fascism, white genocide,
or the need to gather a militia together
and use it to kill their political enemies.
If you're a regular listener of my show,
you know the next part of this story.
From about 2010 to 2016,
the United States saw an astonishing leap
in the number of active hate groups.
For some perspective, just from 2015 to 2020,
the SPLC estimates there was a 30% increase
in the number of hate groups nationwide.
All of this growth was mostly spurred on by social media,
and Facebook was one of the main culprits.
And they knew they were, too.
They didn't admit it openly,
but internally they were talking about it from pretty early on.
And I'm going to quote now from a report in the Wall Street Journal.
A 2016 presentation that names as author
a Facebook researcher and sociologist, Monica Lee,
found extremist content thriving in more than one-third of large German
political groups on the platform.
Swamped with racist, conspiracy-minded, and pro-Russian content,
the groups were disproportionately influenced by a subset
of hyperactive users, the presentation notes.
Most of them were private or secret.
The high number of extremist groups was concerning,
the presentation says.
Worse was Facebook's realization that its algorithms
were responsible for their growth.
The 2016 presentation says that 64% of all extremist group joins
are due to our recommendation tools.
No.
Yeah.
And that most of the activity came from the platform's
"Groups You Should Join" and "Discover" algorithms.
Quote from the presentation,
our recommendation systems grow the problem.
Oh, okay.
Well, I mean, as long as the word grow is in the sentence,
I think that that's good enough.
Growth is in there, you're good.
Growth is in there.
So really where we're growing and what the consequences are,
not really worried about it.
Yeah, it's just like when I'm in my 4Runner,
drunk as shit on mezcal and firing a Kalashnikov,
all that matters is forward movement.
It doesn't matter if that forward movement is driving through
the trailer that a family lives in.
What matters is that I'm moving forward and shooting and drunk.
You're trashy.
Thank you.
Wow.
A judgmental statement.
I'm innovating home ownership.
I mean, this is another example of just, you know,
Facebook innovating people's interests.
Like, hey, do you enjoy this?
I'm trying to think of the old Facebook groups that you used
to be able to join like 10 years ago where it would be like,
science is my boyfriend.
And it's like, do you enjoy "science is my boyfriend"?
I want to fuck the Smithsonian Institute.
Or it'd be like school groups.
It'd be like class of 2012, stuff like that.
But like early, I mean, it's like, I mean,
obviously very much in the same line of algorithmic thinking
as YouTube where it's like, oh, did you enjoy this like
Kalash of Gerard Butler images?
How about a man sitting in his forerunner whispering
conspiracy theories for three hours on end?
That's just growth.
I love growth.
I love growth almost as much as I love everything that I do
with the Toyota 4Runner while hammered in a trailer park.
Yeah.
That's the real content is innovating the trailer parks
near my house with a Toyota and a rifle.
Just kind of changing the narrative around it.
Changing the narrative around it to screaming mainly.
So yeah, throughout, right.
So throughout 2016 and particularly in the wake of the election,
a lot of Facebook employees began to increasingly express
their concerns that the social network they were pouring
their lives into might be tearing the world apart.
Because again, most of these are very nice and intelligent
people who don't want to live on a planet dominated by
nightmarish dictatorships and a complete collapse in the
understanding of truth that allows, for example,
viral pandemics to spread long past where they should have
spread because people don't have any sort of common
conception of basic reality as a result of the influence
of social media.
Where's the example?
They don't like that.
Like the people who work at Facebook got kind of bummed out
about contributing to that.
One observer at the time reported to the Wall Street Journal,
there was this soul searching period after 2016
that seemed to me this period of really sincere,
oh man, what if we really did mess up the world?
In 2016?
Yeah.
Yeah, I love that we're going from, in the 40s,
like, the scientist who does this, who does the same thing,
going, now I am become Death, the destroyer of worlds.
An appropriate comment for the thing that he'd done.
And then something honestly equivalent in its destructive
potential, but the response this time,
because everything is tacky now.
Oh, what if we messed up the world?
We might have fucked this up.
God.
Like, yeah, LOL.
Yeah.
Starting to think we've severely fucked up the planet.
Never mind.
Like, Jesus Christ.
Yeah.
This is why Aaron Sorkin is still working is because people
are saying shitty stuff in shitty ways.
Yeah.
Don't cut that out.
History.
No, no, let's keep it.
We should never cut out criticizing history's real villain,
Aaron Sorkin.
I agree.
Who I call the Pol Pot of cable television.
He's like, yeah.
Yeah.
That was evil.
This soul searching did not extend to Mark Zuckerberg,
who after the election gave the order to pour even more
resources into Facebook groups,
marking that feature out as emblematic of what he saw
as the future of his site.
He wrote a 6,000 word manifesto in 2017,
which admitted to playing some role in the disinformation
and bigotry flooding the body politic.
So he's like, yeah, we did.
We had something to do with it.
He also claimed that Facebook was going to start fighting
against this by fostering safe and meaningful communities
from CNBC.
Quote, Zuckerberg noted that more than 100 million users
were members of very meaningful Facebook groups,
but he said that most people don't seek out groups on their
own.
There is a real opportunity to connect more of us with groups
that will be meaningful social infrastructure in our lives.
Zuckerberg wrote at the time, if we can improve our suggestions
and help connect 1 billion people with meaningful communities,
that can strengthen our social fabric.
Again, fascinating use of the word meaningful.
Meaningful.
Yeah.
Meaningful.
Meaningful.
What happened next was terrible and predictable and meaningful.
Jamie, very meaningful.
A flood of new users got introduced and even pushed into
extremist groups on Facebook.
The changes Mark insisted upon have been critical to the growth
of QAnon, which was able to break containment from the weird
parts of the internet and start infecting the minds of our
aunts and uncles, thanks mostly to Facebook, which took no action
against it until like a month or two ago.
Within two years, Facebook hosted thousands of QAnon pages
with tens of millions of collective members.
I'm going to quote now from an NBC News investigation on the
matter.
Facebook has been key to QAnon's growth in large part due to
the platform's groups feature, which has also seen a significant
uptick in use since the social network began emphasizing it in
2017.
"There are tens of millions of active groups,"
a Facebook spokesperson told NBC News in 2019, a number that has
probably grown since the company began serving up group posts
in users' main feeds.
While most groups are dedicated to innocuous content, extremists,
from QAnon conspiracy theorists to anti-vaccination advocates,
have also used the groups feature to grow their audiences and
spread misinformation.
Facebook aided that growth with its recommendations feature
powered by a secret algorithm that suggests groups to users
seemingly based on interests and existing group membership.
And growth.
And growth.
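To make that recommendation mechanic concrete, here is a minimal sketch of a co-membership recommender, assuming a toy "users who share a group with you also joined..." heuristic. The group names and data are hypothetical illustrations; Facebook's actual algorithm is secret.

```python
# A toy sketch of co-membership group recommendation, assuming a
# simple "users who share a group with you also joined..." heuristic.
# This is an illustration, NOT Facebook's actual (secret) algorithm.

from collections import Counter

# Hypothetical data: which groups each user belongs to.
memberships = {
    "alice": {"gardening", "parenting"},
    "bob": {"parenting", "militia_patriots"},
    "carol": {"parenting", "militia_patriots", "qanon_truth"},
}

def recommend(user: str, k: int = 2) -> list[str]:
    """Suggest the k groups most common among the user's co-members."""
    mine = memberships[user]
    counts = Counter()
    for other, groups in memberships.items():
        if other == user or not (mine & groups):
            continue  # only count users who share at least one group
        counts.update(groups - mine)  # tally groups we haven't joined yet
    return [group for group, _ in counts.most_common(k)]

# A parenting-group member gets nudged toward extremist groups simply
# because hyperactive extremists also sit in the parenting group.
print(recommend("alice"))  # ['militia_patriots', 'qanon_truth']
```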
Yeah, it's funny.
One of the things I like about this NBC report, which is partly
authored by Brandy Zadrozny, who's done a lot of great work on this
subject, is they kind of talk about how profitable, spreading
dangerous fascist content is for Facebook.
Quote, a small team working across several of Facebook's departments
found 185 ads that the company had accepted praising, supporting
or representing QAnon, according to an internal post shared among
more than 400 employees.
The ads generated about $12,000 for Facebook and 4 million
impressions in the last 30 days.
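For scale, those quoted figures work out to roughly a three-dollar CPM (revenue per thousand ad impressions). A quick back-of-the-envelope check, using only the numbers from the internal post:

```python
# Back-of-the-envelope check of the QAnon ad figures quoted above.
revenue = 12_000          # dollars, per the internal Facebook post
impressions = 4_000_000   # ad impressions over the last 30 days
cpm = revenue / impressions * 1_000  # dollars per thousand impressions
print(f"${cpm:.2f} CPM")  # -> $3.00 CPM
```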
Well, you have to imagine, like, if they're doing the
math on it, I mean, it has to be financially profitable, because
it has to offset the cost of the PR hits that they know that
they're going to eventually take for this shit.
So they're in the black.
Again, it's just assigning a price to lives and brains.
Yeah, which is a good thing to do.
$12,000 seems reasonable.
Yeah, seems fair to me.
So, yeah, outside Facebook, the only people who really noticed
what was happening initially were a handful of researchers that
studied extremist groups.
And I wasn't really one of them; it wasn't until, like, 2019 that I realized
Facebook groups specifically were a problem.
It was obvious that Facebook was the issue, but the...
It wasn't until a Facebook group kept threatening to kill me for two years.
Yeah, that did happen to you, huh?
That did happen to me, yeah.
And if you haven't listened to my year in Mensa podcast, what are you doing?
Thankfully, the people threatening to kill you were just members of Mensa,
who I trust are not competent enough to pull off an assassination.
I mean, don't challenge them, but let's hope so.
No, I'm throwing down the gauntlet...
You're like, no, no, no, I don't think they could do it.
Yeah, sorry.
Yeah, I didn't really grasp the scale of the problem with Facebook
groups in specific until 2019 when I started really looking into the
Boogaloo movement.
And it was kind of camouflaged because there was just so much
fascist content everywhere on Facebook that the fact that groups
in specific were driving a lot of the expansion of fascism in this country
kind of got lost in the noise.
But there were other researchers who started to realize this early on,
and the workers inside Facebook realized what was happening right away.
In 2018, they held a meeting for Mark and other senior leadership
members to reveal their troubling findings. From the Wall Street Journal:
A Facebook team had a blunt message for senior executives.
The company's algorithms weren't bringing people together.
They were driving people apart.
"Our algorithms exploit the human brain's attraction to divisiveness,"
read a slide from a 2018 presentation.
If left unchecked, it warned Facebook would feed users more and more
divisive content in an effort to gain user attention and increase time on the platform.
So that presentation went to the heart of a question dogging Facebook
almost since its founding.
Does its platform aggravate polarization and tribal behavior?
The answer it found in some cases was yes.
In some cases, I mean, I guess that's technically accurate in some cases.
Yeah.
Yeah.
So Facebook in response to this meeting starts like a massive internal effort
to try to figure out like how its platform might be harming people.
And Mark Zuckerberg in public and private around this time
started talking about his concern that sensationalism and polarization
were being enabled by Facebook. And to Mark's credit,
he made his employees do something about it.
That phrase.
Yeah, a little bit to his credit.
Yeah, it's okay.
We'll take away the credit in just a second.
So, quote, fixing the polarization problem would be difficult,
requiring Facebook to rethink some of its core products.
Most notably, the project forced Facebook to consider how it prioritized user
engagement, a metric involving time spent, likes, shares, and comments
that for years had been the lodestar of its system.
Championed by Chris Cox, Facebook's chief product officer at the time
and a top deputy to Mr. Zuckerberg.
The work was carried out over much of 2017 and 18 by engineers and researchers
assigned to a cross-jurisdictional task force dubbed Common Ground.
And employees in a newly created integrity teams embedded around the company.
Integrity teams.
Sounds good to me.
It sounds reliable.
It sounds like they made sure that integrity was accomplished via teamwork.
Yeah.
Yeah.
So, the Common Ground team proposed a number of solutions.
And to my ears, some of them were actually pretty good.
One proposal was basically to kind of try to take conversations that were
derailing groups, like conversations over hot button political issues,
and excise them from those groups.
So, basically, if a couple of members of a Facebook group started fighting
about vaccinations in, like, a group based around parenting,
the moderators would be able to make a temporary subgroup for the argument
to exist in so that other people would...
Like a Zoom breakout room.
Yeah.
So that other people wouldn't be...
Which I don't know if that's a great idea, but it was something.
Another idea that I do think was better was to tweak recommendation
algorithms to give people a wider range of Facebook group suggestions.
Yeah.
But it was kind of determined that doing these things would probably help
with polarization, but would come at the cost of lower user engagement
and less time spent on site, which the Common Ground team warned about
in a 2018 document.
They described some of their own proposals as anti-growth and requiring
Facebook to take a moral stance.
You can guess how that all went.
Yeah.
Mark Zuckerberg almost immediately lost interest.
Yeah.
Some of this, a lot of this was probably due to the fact that it would harm
Facebook's growth, but another culprit that like employees who talked
to the Wall Street Journal and other publications repeatedly mentioned
is the fact that he was all butthurt about how journalists were reporting
on Facebook because after the Cambridge Analytica scandal,
they kept writing mean things about him.
No.
Yeah.
Well, Mr. Mark always has to ask himself what would Bad Haircut Emperor do
and Bad Haircut Emperor wouldn't slow down in this shit.
Absolutely not.
One person familiar with the situation told the Wall Street Journal:
The internal pendulum swung really hard to the media hates us no matter what
we do, so let's just batten down the hatches.
By January of 2020, Mark's feelings had hardened enough that he announced he
would stand up, quote, against those who say that new types of communities
forming on social media are dividing us.
According to the Wall Street Journal, people who have heard him speak
privately say he argues social media bears little responsibility for
polarization.
Now, there may be an additional explanation for Mark's shifting
opinions on the matter that goes beyond being just greedy and angry about
bad press.
And that explanation is a fella named Joel Kaplan.
Do you know Joel Kaplan?
You ever heard of this dude?
I don't know this Joel Kaplan character.
Well, in short, he's the goddamn devil.
Oh, okay.
In long, he's the guy that Facebook hired to head up U.S.
Public Policy in 2011, and he became the VP of Global Public Policy
in 2014.
Oh, cool.
And Joel was picked for these jobs because unlike most Facebook
employees, he is a registered Republican with decades of experience
in government.
This made him the perfect person to help the social network deal with
allegations of anti-conservative bias.
With as little empathy as possible, I'm sure.
Yeah.
In 2016, there's all these rumors that Facebook is like censoring
conservative content that are proven to be untrue, but the rumors go
viral on the right.
And so everyone on the right forever assumes that they were true.
And basically, Joel becomes increasingly influential after this
point because he's Mark Zuckerberg's best way out of angering the right
wing, which you actually can't not do because they're always angry and
will just yell about everything until they get to kill everyone who
isn't them because that's what the right does.
Yeah, life finds a way.
Life always finds a way for them.
So Joel was a policy advisor for George W. Bush's 2000 campaign and a
participant in the Brooks Brothers riot, which is the thing that was
orchestrated by Roger Stone to help stop the counting of a bunch of ballots in Florida,
which swung the election for W.
He was a part of that.
What the fuck?
Yeah, that's the guy who's basically running Facebook's response to
partisanship right now.
I had a physical reaction to that.
That's awesome.
That's so upsetting.
He worked in the White House for basically the whole Bush
administration.
And in 2006, he took over Karl Rove's job.
So if you want to visualize Joel Kaplan, he's the guy you get when
you can't get Karl Rove anymore.
He's Mr. Karl-Rove-Wasn't-Available.
When the worst person in the world is like, I can't do this job anymore.
Joel Kaplan's like, I got you.
I got you famous monster.
Look, I'm trying to disrupt some shit over here.
Wow.
Infamous piece of shit, Karl Rove.
Don't worry.
I will continue your good work.
I am Joel Kaplan and now I basically run Facebook.
And if you Google him, Google has him listed as American advocate.
Yeah, he is an advocate of things.
I was like, I was like, again, I guess.
Some might say fascism.
It's more specific.
It's not untrue.
I don't enjoy his face.
Just putting it out there.
Joel is presently one of the most influential voices in Mark
Zuckerberg's world.
And he was one of the most influential voices in the entire
company when the common ground team came back with their suggestions
for reducing partisanship as policy chief.
He had a lot of power to approve these new changes.
And he argued against all of them.
His main point was that the proposed changes were, in his words,
paternalistic.
Which basically means babying people.
He also said that these changes were disproportionate.
Can't be a daddy story.
This can't be a daddy story.
I can't handle any more daddy stories that end in a genocide, Robert.
Oh, my God.
Well, if it makes you feel any better, all the genocides that this is
going to lead to haven't happened yet.
Oh, OK.
Well, there you go.
Yeah. So Joel also said that these changes would disproportionately
impact conservative content because it tends to be bigoted and divisive.
Since the Trump administration was at this point regularly tossing
threats at Facebook, this had some weight.
Quote from Wall Street Journal.
Mr. Kaplan said in a recent interview that he and other executives
had approved certain changes meant to improve civic discussion.
In other cases where proposals were blocked, he said he was trying
to instill some discipline, rigor and responsibility into the progress.
As he vetted the effectiveness and potential unintended consequences
of changes to how the platform operated.
Internally, the vetting process earned a nickname.
Eat your veggies.
No.
Which sounds paternalistic to me, actually.
It sounds like the beginning of a daddy story that ends in a genocide.
Wow.
OK.
Eat your veggies.
We'll get back to Joel Kaplan in a little bit.
For now, we're going to talk about how the problem of violent extremism on
Facebook groups got completely out of control.
So this summer, which was marked by constant militia rallies,
the explosive growth of the Boogaloo movement, and numerous deaths as a result
of violent far-right actors showing up at protests with guns,
Facebook finally took action in late September to ban militias
from using their service. But because they have to be balanced,
they also banned anarchists from Facebook at the same time,
even though anarchists have not been tied to any acts of fatal terrorism
in recent memory.
Because you got to placate the right wing because they're the only ones who matter.
So let's ban the anarchists who have been spending the last four years
trying to lay out the individual actors and groups who are members
of these militias that are doing stuff like taking over checkpoints
and holding my friends at gunpoint.
We wouldn't want the folks who are keeping track of them to be able to use Facebook.
That's the wrong kind of disruptive.
You see, that's the wrong kind of disruptive and advocating.
You know, that's very similar to what the dude in that trailer said
when I was driving my 4Runner through his trailer
and shooting towards his children.
Not at.
And I'll tell you what I told him.
What did you say?
I'm an innovator.
So is Mark.
I don't know.
That didn't really tie into things.
It worked for me.
I could see it. I could see it in kind of an Ozarky kind of way.
I could see.
Yeah.
Yeah.
So Mark, by the way, is on record declaring that Facebook is a guardian
of free speech, which is one of the things he cited when he
announced that he was refusing to fact-check political ads in 2020.
So anarchists who want to talk about operating a communal garden
or, you know, share details about dangerous militias
are the same as militiamen baying for the blood of protesters.
But political candidates spreading malicious lies about protesters
who are being assaulted and killed based on those lies.
That is fine.
That's fine.
Yeah.
Back to Facebook's integrity.
I mean, that doesn't lend to growth or anything like that.
No.
Yeah.
Let's get back to Facebook's integrity teams and their doomed
quest to stop their boss from destroying democracy.
So the engineers and data scientists on these teams, and chiefly,
like, mainly the guys who were working on the News Feed,
they, they, yeah, they, according to the Wall Street Journal,
arrived at the polarization problem indirectly.
Asked to combat fake news, spam, clickbait, and inauthentic users,
the employees looked for ways to diminish the reach of such ills.
One early discovery: bad behavior came disproportionately from a
small pool of hyperpartisan users.
Now, another finding was that the U.S.
saw a larger infrastructure of accounts and publishers that met
this definition on the far right than the far left.
And outside observers documented the same phenomenon.
The gap meant that seemingly apolitical actions such as reducing
the spread of clickbait headlines along the lines of
"you won't believe what happened next."
It meant that like doing this stuff affected conservative
speech more than liberal speech.
Now, yeah.
Yeah.
And obviously this pissed off conservatives.
The way that Facebook works means that users who post and engage
with the site more have more influence.
The algorithm sees if you're posting a thousand times a week
instead of 50.
It likes that engagement because engagement means money.
And so it prioritizes your content over the content of someone
who posts less often.
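As a rough sketch of that ranking logic, assuming a toy activity-times-reactions score (the names and numbers here are illustrative, not Facebook's actual News Feed code):

```python
# Toy illustration of engagement-weighted ranking, assuming a naive
# activity-times-reactions score; NOT Facebook's actual algorithm.

from dataclasses import dataclass

@dataclass
class Account:
    name: str
    posts_per_week: int   # hyperactive accounts post hundreds of times
    avg_reactions: float  # average likes/shares/comments per post

def engagement_score(account: Account) -> float:
    # More activity means more total engagement, which means more money,
    # so the feed prioritizes the most prolific posters.
    return account.posts_per_week * account.avg_reactions

feed = [
    Account("local_reporter", posts_per_week=50, avg_reactions=20.0),
    Account("troll_network", posts_per_week=1000, avg_reactions=5.0),
]

# The troll network outranks the reporter (5000 vs. 1000) even though
# each individual troll post gets far fewer reactions.
for account in sorted(feed, key=engagement_score, reverse=True):
    print(account.name, engagement_score(account))
```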
This means that a bunch of networks of Russian bots and
hyperactive users, like Ian Miles Cheong, who's a fascist troll who
lives in fucking Malaysia and tweets about how, like, everybody
needs to have a gun that they can use to shoot Democrats,
even though guns are illegal in his country... anyway, total piece of shit. These
pieces of shit, who are actively attempting to urge violence and
who have urged violence and caused deadly mobs in other countries...
It means that these people, because they're just shotgunning
out hundreds of posts per day, will always be more influential
than local journalists and reporters who are trying to bring
out factually based information because it's better for
Facebook for a stream of lies to spread on their platform than
a smaller amount of truth.
Yeah. And it also lends itself to, like, releasing
content so quickly that you couldn't possibly disprove or
fact-check things fast enough, because there's just a
bullshit machine.
Yeah.
And you know, Facebook's teams found that most of these
hyperactive accounts were way more partisan than normal
Facebook users and were more likely to engage in suspicious
behavior that suggested either a bunch of people were working
in shifts or they were bots.
So these teams, these integrity teams did like the thing that
has integrity, which was they suggested their company fix the
algorithm to not reward this kind of behavior.
Now, this would lose the company a significant amount of money.
And since most of these hyperactive accounts were right-wing
in nature, it would piss off conservatives.
So you can imagine how this idea went over with Joel Kaplan.
Since Mark was terrified of right-wing anger, he tended to
listen to Joel about these sort of things.
Joel's daddy, let's not forget. Yeah, Joel's daddy and the
Eat Your Veggies policy review process stymied and killed any
movement on halting this problem.
So how do we feel about that?
We feel great.
We feel good.
Glad dad's in charge.
Glad everyone's eating their veggies.
I mean, even just the dystopian nature of, like, mobilizing these
teams to be like, hey, I've ruined the world.
Do you think you could stop it before it blows up?
Because this is going to be a real PR issue.
Why would you do that?
Well, best of luck to the team.
It was another case where, like, basically the only way
to combat this stuff is to have another person Mark Zuckerberg
respects, or is at least scared of, yelling at him or, you know,
talking politely to him about...
The daddies of the world.
The opposite of whatever Joel Kaplan is saying.
And there thankfully was someone like that in Facebook.
They hired in 2017 Carlos Gomez Uribe,
who was the former head of Netflix's recommendation system,
which has obviously made a lot of money for Netflix.
So this guy Carlos Uribe is a big, important get for Facebook.
So he gets on staff and he immediately is like, oh,
this looks like we might be destroying the world.
And so he starts pushing to reduce the impact that hyperactive
users had on Facebook.
And one of the proposals that his team championed was called
Sparing Sharing, which would have reduced the spread of content
that was favored by these hyperactive users.
And this would obviously have had the most impact on content
favored by far right and far left users.
And number one, there's more far right users on Facebook
than far left.
So that was going to disproportionately impact them.
But the people who mainly would have gained influence
were political moderates.
Mr. Uribe called it the happy face.
That's what he called this plan.
And Facebook's data scientists thought that it might actually
like, it might actually help fight the kind of spam efforts
that Russia was doing in 2016.
But Joel Kaplan and other Facebook executives pushed back,
because, yeah, they didn't want to say...
because, you know, Mr. Uribe, you couldn't, like,
you had to be careful arguing with him.
So instead of saying this will be bad for money
or it'll make the right angry at us,
Joel Kaplan invented a hypothetical Girl Scout troop
and he asked what would happen if the girls became
Facebook super sharers as part of a cookie selling program.
Robert, that sounds like a metaphor you would do
at the beginning of an episode.
Yeah.
He was like, basically, like what if these Girl Scouts
made a super successful account to sell their cookies?
Like we would be unfairly hurting them if we stopped
these people who are paying for the deaths of their fellow
citizens and gathering militias to their banner.
They're like, okay, okay.
I hear you, but what about fictional Girl Scouts?
Fake Girl Scouts.
Yeah.
He thinks on his feet.
It's awesome.
So the debate between Mr. Uribe and Joel Kaplan
eventually did make it to Mark Zuckerberg.
He had to make a call on this one
because both of them were kind of big names in the company.
Mark listened to both sides and he took the coward's way out.
He approved Uribe's plan,
but he also said they had to cut the weighting by 80%,
which mitigated most of the positive benefits of the plan.
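To see why cutting the weighting by 80% guts the proposal, here is a minimal sketch assuming a flat reach multiplier; the threshold and figures are hypothetical, not Facebook's implementation:

```python
# Illustrative sketch of a "Sparing Sharing"-style down-weighting,
# assuming a flat reach multiplier; not Facebook's actual code.

HYPERACTIVE_THRESHOLD = 500  # posts per week; hypothetical cutoff

def reach_multiplier(posts_per_week: int, strength: float = 1.0) -> float:
    """Multiplier applied to a share's distribution.

    strength=1.0 is the full proposal; Zuckerberg's order to cut the
    weighting by 80% corresponds to strength=0.2.
    """
    if posts_per_week <= HYPERACTIVE_THRESHOLD:
        return 1.0  # ordinary users are unaffected
    penalty = 0.8   # full proposal: cut hyperactive reach by 80%
    return 1.0 - penalty * strength

print(reach_multiplier(1000))                # ~0.2: hyperactive reach slashed
print(reach_multiplier(1000, strength=0.2))  # ~0.84: nearly full reach again
```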
Yeah.
After this, Mark, according to the Wall Street Journal,
quote,
signaled he was losing interest in the effort to recalibrate
the platform in the name of social good,
asking that they not bring him something like that again.
Neat.
200 years of peace, Mark.
That has big 200 years of peace energy.
Yeah, big 200 years of peace energy.
Yeah.
In 2019,
Mark announced that Facebook would start taking down content
that violated specific standards,
but would take a hands-off approach to policing material
that didn't clearly violate its standards.
In a speech at Georgetown that October,
he said,
you can't impose tolerance top down.
It has to come from people opening up,
sharing experiences,
and developing a shared story for society
that we all feel we're a part of.
That's how we make progress together.
That is just such a wild way of saying,
I don't feel I am accountable for this,
and once again,
I'm going to delegate this to the users, the people whose brains
I'm actively ruining.
You know what makes progress harder?
In my opinion, Jamie.
Products and services?
No, when fascists are allowed to spread lies
about disadvantaged and endangered groups
to tens of millions of angry and armed people,
because your company decided sites like The Daily Caller
and Breitbart are equivalent to The Washington Post.
This is something Facebook did.
When, at Joel Kaplan's behest,
it made both companies Facebook news partners.
These are the folks that Facebook trusts
to help them determine what stories are true.
They get money from Facebook.
They get an elevated position in the news feed.
Yeah, on an unrelated note,
earlier this year, Breitbart News shared a video
that promoted bogus coronavirus treatments
and told people that masks couldn't prevent
the spread of the virus.
This video was watched 14 million times in six hours
before it was removed from Breitbart's page.
They removed it, presumably,
because it violated Facebook policy,
and Facebook has a two-strike policy
for its news partners sharing misinformation
within a 90-day period.
When Mark was asked why Breitbart got to be
a Facebook trusted partner while spreading misinformation
about an active plague that was killing
hundreds of thousands of Americans,
Mark held up the two-strike policy as a shield.
Quote,
this was certainly one strike against them
for misinformation,
but they don't have others in the last 90 days.
So by the policies we have,
which, by the way, I think are generally
pretty reasonable on this,
it doesn't make sense to remove them.
No!
That's pretty great, Jamie.
That's pretty awesome.
But you know what's even better about this?
Ethical and unethical, but still legal.
What's even better about this
is that Breitbart absolutely violated
Facebook policies more than two times in 90 days,
and it was covered up.
That's what's even better!
You have to imagine Breitbart is violating
Facebook policies multiple times a day.
Joel Kaplan helped them hide it, yeah.
That is such, I mean...
It's awesome, it's awesome.
I'm actually going to read about that in more detail,
citing an incredible report by BuzzFeed,
who, by the way, all credit to BuzzFeed.
I've cited a number of great articles,
including that one from the Wall Street Journal,
which is really important.
BuzzFeed has probably been,
of all of the different media companies,
the most dedicated in hounding Facebook,
like a fucking dog with a groin fetish.
I don't know how to...
I'm very proud of BuzzFeed's reporting on Facebook.
Thank you for keeping on this one, y'all.
Good work.
Now I have to remove that image from my head, but yes.
Yeah, I'm going to quote from this report
on the fact that Facebook fraudulently hid the fact
that one of their information partners
was violating their own policies
and spreading disinformation about an active plague.
And then you need to take an ad break just so you know.
Oh, I'll take an ad break now.
We'll get to this afterwards because...
Hot teaser.
If there's one thing that prepares me
to hear about how democracies both in the nation,
I live and around the world are being actively murdered
for the profit of a man who's already a billionaire.
If there's one thing that makes that easier to take,
it's products and services.
It's the sweet lullaby of a product or a service.
Nothing keeps me going,
gets me intellectually hard like a product or a service.
I want to be surrounded.
I want to die surrounded by my most beloved products and services.
I have a feeling that you will
because there's a good chance that a horrible wildfire
will sweep through the city you live in.
And sorry, that's getting too dark.
Mine too, maybe.
Yeah.
Yeah, that's what I was going to say.
I'm like, hey, as long as we're on the same page there,
that's great.
And it's okay.
If we make it out of that fire,
Facebook will ensure there's lots of armed
and misinformed militias waving guns wildly
in the areas we attempt to evacuate through.
Well, as long as my death will have been completely in vain.
Yes.
That's what Facebook promises for all of us.
And that's what products and services promise for all of us.
Here we go.
Each season will take you inside an undercover investigation.
In the first season of Alphabet Boys,
we're revealing how the FBI spied on protesters in Denver.
At the center of this story is a raspy-voiced,
cigar-smoking man who drives a silver hearse.
And inside his hearse were, like, a lot of guns.
He's a shark.
And not in the fun, badass way.
He's a nasty shark.
He was just waiting for me to set the date, the time,
and then for sure he was trying to get it to happen.
Listen to Alphabet Boys on the iHeart Radio app,
Apple Podcast, or wherever you get your podcasts.
I'm Lance Bass,
and you may know me from a little band called NSYNC.
What you may not know is that when I was 23,
I traveled to Moscow to train to become the youngest person
to go to space.
And when I was there, as you can imagine,
I heard some pretty wild stories.
But there was this one that really stuck with me
about a Soviet astronaut who found himself stuck in space
with no country to bring him down.
It's 1991, and that man, Sergei Krikalev,
is floating in orbit when he gets a message
that down on Earth, his beloved country,
the Soviet Union, is falling apart.
And now he's left defending the Union's last outpost.
This is the crazy story of the 313 days he spent in space,
313 days that changed the world.
Listen to The Last Soviet on the iHeart Radio app,
Apple Podcast, or wherever you get your podcasts.
What if I told you that much of the forensic science
you see on shows like CSI isn't based on actual science?
The problem with forensic science in the criminal legal system
today is that it's an awful lot of forensic
and not an awful lot of science.
And the wrongly convicted pay a horrific price.
Two death sentences and a life without parole.
My youngest, I was incarcerated two days after her first birthday.
I'm Molly Herman.
Join me as we put forensic science on trial
to discover what happens when a match isn't a match
and there's no science in CSI.
How many people have to be wrongly convicted
before they realize that this stuff's all bogus?
It's all made up.
Listen to CSI on trial on the iHeart Radio app,
Apple Podcast, or wherever you get your podcasts.
Alright, we're back.
So we're talking about how Facebook covered up
the fact that Breitbart was repeatedly spreading
disinformation that should have gotten them removed
as a trusted partner.
Some of Facebook's own employees gathered evidence
they say shows Breitbart, along with other right-wing outlets
and figures including Turning Point USA founder Charlie Kirk,
Trump supporters Diamond and Silk,
and conservative video production nonprofit Prager University
has received special treatment that helped it avoid
running afoul of company policy.
They see it as part of a pattern of preferential treatment
for right-wing publishers and pages,
many of which have alleged that the social network
is biased against conservatives.
On July 22, a Facebook employee posted a message
to the company's internal misinformation policy group,
noting that some misinformation strikes against Breitbart
had been cleared by someone at Facebook
seemingly acting on the publication's behalf.
A Breitbart escalation marked "urgent: end of day"
was resolved on the same day,
with all misinformation strikes against Breitbart's page
and against their domain cleared without explanation,
the employee wrote.
The same employee said,
a "partly false" rating applied to an Instagram post
from Charlie Kirk was flagged for a priority escalation
by Joel Kaplan, the company's vice president
of global public policy.
Now, the whole article itself details
just a ton of other instances in this,
and it's all incredibly shady.
I'm not going to go into all of it in tremendous detail
because we are running out of time,
but if you read the article, it's extremely clear
that Joel Kaplan is directing Facebook
to actively violate the company's own policies
in order to keep right-wing bullshit peddlers
spreading lies on the platform for profit.
Kaplan has faced no punishment for this,
although his behavior did provoke outrage from employees
on Facebook's internal chat system.
The rules aren't applied to daddy, that's how it goes.
Facebook employees have been getting angrier and angrier
at this sort of thing throughout the year.
Remember back in May when President Trump
posted this message to Twitter and Facebook?
Quote,
there is no way, zero,
that mail-in ballots will be anything less
than substantially fraudulent.
Mailboxes will be robbed,
ballots will be forged,
and even illegally printed out and fraudulently signed.
The governor of California
is sending ballots to millions of people,
anyone living in the state,
no matter who they are or how they got there,
will get one.
That will be followed up with professionals
telling all of these people,
many of whom have never even thought of voting before,
how and for whom to vote.
This will be a rigged election. No way!
I do remember that, Robert.
I do remember that.
Twitter, to give them credit, and again,
it's unbelievable that this is, like,
the mildest level of credit
I could possibly give someone:
Twitter fact-checked the president's tweet,
which was not nothing,
and that's all I'll say about it.
Unfortunately, that does not qualify as nothing.
Again, that qualifies as the most responsible action
that any major social media CEO took.
Mark, on the other hand,
refused to let his employees do anything similar,
allowing the president's flagrant misinformation
to circulate on his network.
This enraged employees,
and they got angrier when the "when the looting starts,
the shooting starts" post was left up.
They created a group on Workplace,
their internal chat app, called
Let's Fix Facebook, parentheses, the company.
It now has about 10,000 members.
One employee started a poll asking colleagues
whether they agreed,
quote, with our leadership's decisions this week
regarding voting misinformation
and posts that may be considered to be inciting violence.
A thousand respondents said the company
had made the wrong decision on both posts,
which is more than 20 times the number of respondents
who said otherwise.
After this, Facebook employees staged a digital walkout;
they changed their Workplace avatars
to a black-and-white fist and called out sick en masse,
hundreds of them.
I'm going to quote from BuzzFeed again here.
As Facebook grappled with yet another public relations crisis,
employee morale plunged.
Worker satisfaction metrics,
tracked by micro-pulse surveys
that are taken by hundreds of employees every week,
fell sharply after the ruling on Trump's looting post,
according to data obtained by BuzzFeed.
On June 1st, the day of the walkout,
about 45% of employees said they agreed with the statement
that Facebook was making the world better,
down 25 percentage points from the week before.
That same day,
Facebook's internal survey showed that
around 44% of employees were confident that
Facebook's leadership was leading the company
in the right direction,
a significant drop from May 25th.
Responses to that question have stayed around
that lower mark as of earlier this month.
So, pretty significant drop in faith in the company
from its employees.
And yeah, Zuckerberg, the ultimate decision maker,
according to Facebook's head of communications,
initially defended his decision to leave
Trump's looting post up without even hiding it
behind a warning like Twitter did.
Mark stated, quote, unlike Twitter,
we do not have a policy of putting a warning
in front of posts that may incite violence
because we believe that if a post incites violence,
it should be removed regardless of whether or not
it's newsworthy, even if it comes from a politician.
Oh, so you had to wait for there to be violence incited
and then be like, oh, it turns out that post
was actually very bad and we should take it down.
Again, the amount of bodies that he needs attached
to do a single thing is staggering.
Four days later, Mark backtracked. From BuzzFeed, quote:
in comments at a company-wide meeting on June 2nd
that were first reported by Recode,
Facebook's founder said the company was considering
adding labels to posts from world leaders
that incite violence.
He followed that up with a Facebook post three days later
in which he declared, black lives matter
and made promises that the company would review
policies on content discussing excessive use
of police or state force.
What material effect does any of this have?
one employee asked on Workplace,
openly challenging the CEO.
Commitments to review offer nothing material.
Has anything changed for you in a meaningful way?
Are you at all willing to be wrong here?
Mark didn't respond to this,
but on June 26th,
nearly a month later, he posted a clarification
to his remarks noting that any post
that is determined to be inciting violence
will be taken down.
Employee dissatisfaction has continued to swell
over the course of the summer.
One senior employee, Max Wang, even recorded
a 24-minute-long video for his colleagues,
and BuzzFeed, in another article,
has all the audio for this,
which I've been listening to.
In the video, Max outlines
why he can't morally justify working
for Facebook anymore.
He's a pretty early employee, I think.
His video quotes at length from books
on totalitarianism by Hannah Arendt,
who is one of the great scholars of the Holocaust.
He shared the video on Workplace
with a note that started,
I think Facebook is hurting people
at scale.
Yes.
Yes, it is.
Absolutely.
Fuck, all right.
Like Emperor Augustus,
who had members of his own family killed for disobedience,
Mark did not like being questioned
and, gasp, disapproved of
by his own employees.
On June 11th, he hosted a live Q&A
where he delivered a message to employees
who were angry at his enabling of hideously violent
fascist rhetoric.
A lot of this is in response to
the killings and such.
I've been very worried about the level of disrespect
and, in some cases, vitriol
that a lot of people in our internal community
are directing towards each other as part of these debates.
If you're bullying your fellow colleagues
into taking a position on something,
then we will fire you.
Well...
Good.
You know, the amount of consistency,
I mean, you got to appreciate it.
I'm really glad that that employee,
I mean, just spoke directly
about, because at what point, truly,
what do you have to lose?
I guess, except for your life,
depending on how Mark Zuckerberg wants to go about it.
I mean, it's...
I don't know.
It is so frustrating,
even though it's like, I don't know what else to do
other than, you know, whatever,
some shit in Minecraft.
But, yeah, just people are continually
waiting for this person
and this company to act in the best interest.
It's like, it's not...
When has it ever happened? Name a time.
Even in the face of, like, the most
brutal public disapproval.
There's too much power.
It's amazing. It's incredible.
As you're saying all this, and as I just
finish the thing that I'm saying,
a Bloomberg story just dropped.
Like, as we were recording this episode,
I'm just going to read you the...
I haven't read the story, I'm just going to read you the title.
Facebook accused of watching Instagram users
through cameras.
No!
Oh, man.
Oh, it fucking rules.
God, it's so good.
Have we talked about that before, though?
I've had that issue with Instagram before
where I'll close out Instagram and then
you'll see the little
section of your iPhone in the top left
where
it indicates that you're being
recorded.
It turns...
Like, when I...
Listeners, let me know if you've had a similar issue.
Sometimes when I close Instagram, it looks like
my phone just stopped recording, but it goes away
quickly. It's like a millisecond
that it's up. It happens all the time.
So that is not shocking at all.
Mm-hmm.
Yepery-do.
I don't know what I'm going to actually
title this...
I don't know what I'm actually going to title this episode.
The working title
that I started this under was
Mark Zuckerberg needs to be tried
in the Hague and hung in public until
dead. But I don't think
legal is going to let me go with that
title. Okay.
I think it's clickable. I think it's clickable
as fuck. I think you'd get great
engagement on that.
I mean, it's what he'd want.
Yeah, we may have to go with a different title.
I mean, I'm not urging illegal behavior.
I'm urging that he be tried in the international
criminal court and then once convicted
hung by the neck until
dead for his crimes.
Which is what you do when a world
leader commits genocide.
Right, right. It is true.
But I probably won't
title the
episode that. I don't know. I mean, I'm glad
you put it out there though. Let's not
take it out of the running. Yeah, there's a
number of other options.
Mark Zuckerberg continues
to disrupt
200 years of peace.
There's so many options. I can't wait
for the 200 years of peace that only
involve dozens of wars.
Yeah, I mean, let's say
if the 200 years of peace began
in 2004, imagine
how much peace we have to look forward
to. I think it's similar to the
amount of peace I brought to that trailer park.
They're
offered this.
You're a little
sicko. You're a little sicko.
I know it.
I know it.
Well,
we're all
gonna
be fine.
Fine. We're all gonna be great.
Jamie.
You want to plug some shit?
Yeah. Thank you for disrupting
my life and inner
sense of peace once again.
That's right, always be disrupting, baby.
You've always been
ABD. You've always been a huge
disruptor.
Yeah, you can follow me on Twitter
or Instagram, which is
watching me right now.
And then if you want
to contribute
to a candidate that I love,
Fatima Iqbal-Zubair,
we're doing a live
read of the Twilight script
this Friday evening,
5 p.m. Pacific.
That sounds very exciting.
It is something to do
to distract yourself from
the void.
Yeah.
See you there.
See you there.
I am going to be
You know,
doing the thing that I normally do
which is staring into the abyss
and going, hey,
hey, quit being an abyss.
You're really bumming us all out
abyss.
You and the abyss have great chemistry.
We do.
The abyss has made me a lot of money.
A lot of money, which is something that I feel
very, very conflicted about.
The abyss is rich.
That's the thing that Nietzsche missed:
sometimes when you stare into the abyss,
you get a six-figure salary
because it's incredibly profitable to talk about
the abyss on a podcast.
The abyss has facial recognition
software and it's pretending to be me
elsewhere.
Jamie "The Abyss" Loftus.
That's what I've been called.
It's been said.
You can follow us
on Twitter and Instagram
where you're probably being watched, at
BastardsPod. You can follow Robert on Twitter
at IWriteOK. You can buy stuff
from our TeePublic store
and also the Bechdel Cast TeePublic store
where
Jamie designs all the artwork
for that and it's amazing.
I think I've covered everything.
Wash your hands,
wear a mask,
yell into the abyss.
Oh wait, Robert,
did I show you my bedazzled bolt cutters?
I'll send you a picture of them,
my bolt cutters.
No, I would love to see your bedazzled bolt cutters.
I have a pair of bolt cutters that are
still usable but also mostly covered in rhinestones.
I'll send it to you.
Yay!
That's the episode.
Hell yeah.