Big Technology Podcast - A Look Into Facebook's Soul — With WSJ's Jeff Horwitz and Ex-FB Exec. Brian Boland
Episode Date: September 18, 2021. Jeff Horwitz is the Wall Street Journal reporter who unearthed a trove of internal Facebook documents that reveal a damning disconnect between what the company says in public and its actions inside. Brian Boland is a former Facebook executive who spent more than 11 years inside the company. The two come on to discuss Horwitz's bombshell series of reports, unpacking what they tell us about Facebook, dissecting the company's responses, and looking at potential solutions. Enjoy this bonus episode!
Transcript
Hello and welcome to the Big Technology Podcast, a show for cool-headed, nuanced conversation of the tech world and beyond.
And today we have an emergency podcast for you, because this week, in quick succession, the Wall Street Journal has published five fascinating, damning stories about Facebook that it calls the Facebook Files.
They have come every day of the week, and they paint a picture of Facebook that is fairly
concerning and something that's worth discussing right away.
And to do it, we have two amazing guests for you.
The first is Jeff Horwitz.
He is the journalist who obtained the documents and has been on all five of the stories,
and he's going to come on and take us through his reporting a little bit.
So, Jeff, welcome to the show.
Thanks.
And then we're going to take you inside the company because we have Brian Boland,
who was inside Facebook for 11 years.
As an executive there, Brian and I have gotten the chance to get to know each other over the years.
So it's really great to have a chance to have him on.
And not only are we going to talk about the problems, which is stuff Jeff can highlight,
but we can also talk about the mentality inside the company and then potential solutions with Brian.
Brian, welcome so much to the show.
Hey, Alex. Thanks for having me.
Thanks for being here.
Okay. So let's just quickly start by summing
up the five stories that the Journal has published. So the first is that Facebook has a multi-tiered
system for content moderation that lets, you know, close to six million VIP users basically
say whatever they want with little recourse, little chance their posts are going to get
taken down. The second is that Instagram knows that it harms girls' mental health and has done
little about it. The third is that Facebook made an algorithm tweak to keep the company
relevant. And it did so, and it worked, but it did so by making people angry on the
platform. And it passed it off at the time as something that would encourage meaningful
social interactions and keep people's well-being intact. The fourth is that cartels
and human traffickers are operating freely on the platform with little recourse. And then the
fifth, pertinent to this day, is that anti-vaccine comments are dominating the discussion
on Facebook, making up a big chunk of the discussion far out beyond what they would in real
life, and Facebook is struggling to deal with it.
So, Jeff, does that sound about right to you?
That does sound about right.
So these are, these stories are pretty fascinating.
They are built off of tons of documents inside Facebook.
How did you get them?
So I think I'm not the first guy to be provided with a whole bunch of information from inside
that company. And I think something worth noting is that the information tends to get given to
people like me from people who've been asked to handle societal grade problems that might be
related to a new and very powerful form of technology and communication that Facebook has
become the leading purveyor of. And I think the reason that this just seems to be happening
over and over again, is that Facebook has asked people who, you know, have really promising
tech careers to sort of go into a different part of the company, one that is not considered
to be a path to the C-suite or, you know, necessarily even good for your career.
The people in integrity are doing it because they really think it's important to actually try to fix
these things. And I think something, it's pretty hard if people feel like they've been asked to
fix these things and they aren't being given the leeway and latitude to do it. And so I think
that sort of that tension becomes something that kind of makes people feel like they have a moral
obligation to talk to people like me. And obviously there's a whole bunch of obligations, and whether
it's right or wrong to leak, et cetera, is something people have plenty of debates over. But I do think that
the way these documents make it to folks like me is people having a moral crisis.
Now, a big theme in your reporting is transparency.
So in the name of transparency, will you release the documents that you got?
I really like that idea.
I am not the sole voice that gets to determine that.
There are other entities inside the Wall Street Journal, and perhaps outside it, that might have a say on it.
I would say TBD on it.
I think that the question is a good one and a very fair one.
And nothing would make me happier than to let other entities have a go at this,
either to find things that maybe we didn't or to take issue with how we did our work.
Who would be the outside entity with a say in it?
I don't want to get into that right now.
You know, no, no, like, let's put it this way. Like, it's not like it's, you know... okay, I don't want to get into that right now.
Brian, you've been inside the company.
What happens when someone, you know, provides a publication with this much?
I mean, this is sort of unprecedented, but they've provided the journal with all this information.
Are they going to find the person that did this or the people that did this?
Hard to say, you know, probably would be my guess.
You know, the company prides itself on being open and transparent with the information internally.
And, you know, based on just reporting today, that may be changing, which would be in a lot of ways unfortunate.
But, you know, the company takes those kinds of things seriously.
So the steps this person took speaks to the level of conviction that they had around the importance of these topics.
Otherwise, that personal risk they took wouldn't be worth it.
Yeah.
So if I were to put these stories into categories, the first one, I would say, is Facebook
just being unable to manage its size.
I mean, I think, you know, Jeff, there was a line from a Facebook researcher in your story
about vaccines that was basically like, you know, we've created a monster and we don't know
how to control it.
And the three stories that sort of go to that are the fact that Facebook needed to exempt
millions of people from content moderation, the fact that it
couldn't effectively manage, or wouldn't effectively manage, the cartels and human trafficking
activity on the platform, and the fact that it couldn't contain the anti-vax stuff.
So is there a way to do this?
Or, I mean, you know, they could have the best intentions.
But if you're a platform of, I think, what is it, 2.7 billion people on any given day,
trying to manage a network that big might just be physically impossible, no matter whether
your heart's in it or not. So what do you think about that, Jeff? So the first thing
I'd say is I'm going to be interested in Brian's response to this question. But I don't think
the tenor from people inside the company and from the documents that I've seen is that this stuff
is hopeless. If it's hopeless, then, I mean, you know, that's almost kind of like,
well, you do what you can and that's that. I think, you know, just saying it's too big is perhaps
a little bit simplistic, right? Cross-check, this is a manageable program. Facebook is in the process
of getting it managed. Yeah, I'm sorry, that's the whitelisting and special
exemptions. Cross-check is what they renamed it because the word shielding didn't sound great.
It sounded arguably a little bit too much like what it is. And so I think, like, cross-check and shielding
and whitelisting and all of that stuff, this is manageable. Facebook, in fact, has been improving
the program. The fact that it took them this long and that they
weren't willing to rein it in earlier, in part because they were relying on it to get them through the 2020 election, I think raises some questions about the approach.
The fact that the review queues for VIP enforcement were so understaffed that in early 2020 they were only reviewing 10% of the material that had already been flagged, because Facebook's systems basically said they believed it likely violated,
you know, that's a choice. That's inattention. It's not like,
woe is me, that could never be addressed. I think the same thing goes for some of the international
stuff, right? I mean, the story about the global issues just demonstrated that
Facebook's flat-out spending on international work is a pittance relative to their spending on,
you know, the U.S. and Europe, even though Facebook is very, very aware that the consequences of
Facebook not getting things right are so much worse abroad, you know, that people literally die
or are, like, sold into, you know, sex slavery. I mean, this is, you know, some pretty rough stuff.
And, you know, you shouldn't need that, as evidenced by the way that situation played out, which is Facebook didn't really do anything for a long time,
and then when Apple threatened to kick the big blue app and IG off the App Store, suddenly
Facebook just went into overdrive and took care of the problem.
Again, that's solvable, right? And the fact that they then kind of went back to
being inattentive after that is, again, like, solvable. It's just, that's will. I do think that
there are some questions about how complex the system can get and whether Facebook, you know, I think
something I ran across repeatedly in the documents was people just being surprised, like, oh God,
we just inadvertently took action against a post from President Trump because
he was not cross-checked or shielded from enforcement in this one weird minor element,
and we have these 60 different systems that are all sort of patchwork overlapping and this one
slipped through. I mean, this is a company that literally couldn't prevent Mark Zuckerberg's
own Q&A on misinformation from being classified as misinformation, right? So I'm just saying
that like, I think it's one thing to talk about sort of responsibly monitoring the speech
of 2.7 or 8 billion people and saying that like, well, of course, that can never be done
perfectly. It's another thing to just be like screwing it up because it's way too complicated
and the attention isn't there. Right. I mean, Brian, what do you think? Because I imagine inside
Facebook, it's almost like being in this game of whack-a-mole where, you know, you might successfully
address 20 problems, but 30 more will come up just because you're managing such a big
network. So is this a true failure of will here, or is it a product of the network size,
or is it something that I'm not thinking about? No, I mean, Jeff's done a great job in the reporting
of highlighting how big of a problem this is, right, and how massive the scale of this issue is,
that it's global in nature, it's complex in the types of issues you're dealing with. So I'm
incredibly sensitive to the fact that these are huge, huge issues and frankly, like, brand new
issues to the world. No one's had to deal with this before. But that said, there's also a question
of focus and effort, right? If you think about two other really interesting times in Facebook's history
where the leadership, and particularly Mark, focused the company on something important and got
the will of the employees to move in that direction. One of them, you remember, he had that F8 talk where he
talked about moving away from the move fast and break things to move fast with stable
infra, right?
And that signaled to the company that infrastructure is important, that the company can't
scale without infra.
The second famous one is the shift to mobile, where instead of having, like, one small mobile team,
which is what Facebook originally had,
he reoriented the company to say, we're going to build mobile first.
And now you look at the company, and it has some of the largest mobile properties that exist.
So it's possible to shift the attention of the employees when you tell them it's important.
And I think that's one of the big missing elements: it's reactive safety as opposed to proactive, safety-first by design.
And I think that while these are huge problems, the company has hired some of the brightest minds in the world to work on growing the company.
It also has these same bright minds who could work on these incredibly complex problems and start to make progress on fixing them.
That signaling that safety is core, that understanding the impact of products in a harmful way, not just a growth way, is core, would be absolutely critical.
Yeah.
And so, I mean, what is the rationale for not making this a priority?
I mean, it's obviously exploded in some, you know, pretty dramatic ways for the company, at least this week, through Jeff's stories.
It seems to erode trust among users, yet Facebook hasn't made it a priority, obviously, in the way it did the others you speak of.
And I'll offer one possibility.
I mean, these scandals don't seem to hurt Facebook's bottom line.
Jeff, you pointed this out in your first story. Brian, you worked in ads
during some of Facebook's scandals.
And I know what happened there.
I mean, the money stayed, you know, on course.
It kept coming in from advertisers despite their fake boycotts and stuff.
So is it really, I mean, is it a money thing for Facebook?
Is it it just doesn't see the financial consequences here?
Or is it something else?
Is it that Mark Zuckerberg?
Maybe this is just sort of like, I guess the other side of it is maybe this is a product thing.
And if you start to rein in this stuff, you might not get
the same amount of usage and engagement.
And that would eventually lead to a money thing.
Maybe it's not money.
I don't know, Brian, what do you think?
You know, hard to say.
Like, I'm not in everyone's head.
But I do think that there's this ideological thing that you've seen come up time and
time again.
You saw it from Mosseri this week, in this notion that the good is outweighing the bad, right?
It's that with these massive transformative things, you
have both good and bad. And I felt like the leadership just firmly believes, and
mind you, believes without researching it or having deep evidence of the fact, that there is way, way,
way, way, way more good than harm. The challenge with that to me is that when you're dealing
with, was it three billion people you were talking about, when you start to take percentages,
even like one percent harm on a three-billion population is just freaking massive. Yeah. So I think
this ideological thing comes to play. And I think the challenge is that a lot of the things that you
would do to test improving systems lead to impacts in usage, engagement, and revenue. And I just
don't know whether the company would be willing to take those impacts. So there is some financial
risk there. Combined with this optimistic view of what the products are. Well, yeah, anytime that
you would make massive changes, and this is what kind of really popped for me in a lot of Jeff's reporting,
is this notion of being worried about sacrificing engagement
or sacrificing core metrics to test some of these things.
So if you were to say, look, we're going to make this a top company priority,
we're really going to investigate our products and we're really going to try some things
to clean this stuff up, there are going to be unintended consequences of those.
But some of the consequences will be on core metrics,
which I just don't know that the company would do.
I think there's a really interesting thing that sort of arises
here, which is, look, inside the company, one of the problems for people on the integrity team
has been that content quality and questions of sort of societal good are really hard to
sort of justify, right? Like, you know, it's really hard to benchmark, right? Because
Facebook can tell you exactly what even the most minute change to its platform is going to
do to user metrics, right? You, like, change the size of the comment box by two
pixels, and they can tell you whether or not that was a good or bad thing for those metrics. They can't
tell you nearly as well about quality. And obviously, there are some inherently subjective issues
here. And I think the really interesting thing about the current, like, we-do-more-good-than-bad
thing, I mean, it's not current, that's been the line historically, is that that claim,
and so a lot of people in integrity have sort of always been frustrated because they're like,
look, yes, it's subjective, but very clearly the things we are doing here
are really bad. And, you know, the response is, well, define bad and quantify it for us,
you know, and then come back, right? At which point they walk off, like, shaking their heads,
because that's impossible. The interesting thing, though, is that with the company's response on this,
and like Adam Mosseri's, you know, we-do-more-good-than-bad line on Instagram, the exact
same concerns apply, which is like, okay, how are you going to quantify that? They are
very convinced of it. There's no question that Facebook legitimately from the bottom of its heart at
the senior levels believes that like basically they are being beset by like a whole bunch of
chattering idiots who aren't appreciating how Facebook has, you know, improved society. That's, like,
absolutely how the company feels. Is that how this is going to be received inside
Facebook, Brian? Are people going to look at these stories and be like, you know, the press is out
to get us, and, you know, ignore the noise and charge on? I think it's going to be hard with these,
because a lot of this reporting has both internal documents and concerns from teams.
It also, man, like the content in this stuff, the topics is just so, so, so awful.
Right.
Like I understand, yeah, I understand how like getting, you know, believing you're doing
more good than bad, like, you know, it sort of prevents you from taking some serious action
to crack down on the platform.
But when you have these examples, like the ones I think you're referencing,
of human trafficking and cartels posting, you know, their murders on Instagram and, you know,
a country being pushed toward vaccine hesitancy.
It's, it's, I just don't see how that's like something that, you know, you can keep going on
and saying, all's good.
Yeah, I don't know how.
Like, when I got to the point where it was time for me to leave, it wasn't
because, you know, I didn't have a great job or, like, wasn't working on amazing things.
And frankly, it wasn't because I wasn't working with amazing, inspirational people.
Like, it absolutely was.
I got to the point where I was concerned about polarization in the platform, just racial
polarization around the stuff around George Floyd and last summer and then continuing
to see the lack of effort to really understand different categories of polarization.
That was enough for me to feel like I questioned whether we're doing more good than bad
and I have to leave.
I didn't know about all this other stuff.
Right.
You know, like this is the ugliest of the ugly.
So how can it get to?
So, I mean, you know, Jeff, I saw you were on CNBC talking with Jon Fortt about this, and he asked you, like, is it going to cause any change?
And, you know, Jeff, you politely said, listen, all I can do is put the information out there.
Brian, you've been inside the company.
So I'm curious if you have any ideas about, like, well, we talked about whether it's going to change.
Well, how do you solve this stuff?
Incredibly hard things to solve, right?
Like, I'm super sympathetic to the scale of these problems.
And they're global, right?
So you have, you know, endless languages and countries to deal with in the context of these issues.
And, you know, Jeff's just highlighted a handful that probably exist in a lot more places and probably are as ugly.
So, you know, I think there's two things.
One, you really need a signal from Mark.
Like the company follows Mark's direction.
If Mark said, you know what, this stuff is abhorrent and we are going to shift towards really understanding it and ending it, frankly, if they want to claim it's a narrative, an anti-Facebook narrative, fine.
Like, let's end it and prove once and for all how the platform does shape society and whether it's net good or
net harmful. You'll never be able to measure it with some, like, 80-20 kind of
metric, but you could start to do the research, and a lot of these internal things could be done with
external research. So one thing is, like, Mark's got to signal to the internal teams that this is
important. This is a priority. And it's a goal. And, like, sacrificing growth or sacrificing
revenue on behalf of safety is important. The second one is transparency. You know,
a lot of these issues, you could start to see if more of the public data was transparent
in the way that CrowdTangle brings engagement data to the public.
There's another report about troll farms out today from MIT that was super interesting.
And it was the kind of thing that you would find if you had reach data.
Again, that's the data that Facebook says is more important than engagement anyways.
And here's a thing showing that for Christians and for Black Americans,
the most popular pages on Facebook were troll farms, right?
And transparency brings the ability to see that.
And then look, it's time for regulation.
Like regulation is the kind of thing that happens when industries grow and they're successful.
And it's happened in the autos industry in the 60s.
It happened with chemicals before that.
It happened with building codes.
Like imagine buildings without building codes.
Time and time again, we've brought in regulation to help industries do a better job of being safe.
I think the regulatory approach right now is super important.
Right.
And I guess, like, the key here, right, just to bring it back to business, is that this stepped-up enforcement would be a labor cost.
And, you know, I mean, Jeff, your story highlighting the international stuff, right, if you start to put the same amount of content moderation in every country as you have in the U.S. and in Canada, where Facebook makes most of its money, it would cost a heck of a lot in labor costs.
Facebook trades at a 26, you know, multiple, its price-to-earnings ratio.
So if you hire, you know, more people, then the multiple goes down.
So yeah, go ahead.
What do you think about that?
Yeah, I actually think, look, the idea of having every country having the same level of
investment in content moderation that's occurred in the U.S.
is not possible under Facebook's current system.
Their current system, as we note in the global story, is very much language-based.
the U.S. has the best.
After that, hello romance languages.
After that, maybe Hindi.
After that, bye-bye.
I mean, it's not that bad.
But there are, like, a lot of the classifiers don't work.
The algorithmic systems for content detection do not work or do not exist.
And this is sort of like, every incremental country, or every incremental linguistic community,
that Facebook goes into is almost definitely going to be smaller and
poorer than the last, just because that's the order of operations for growth.
And so they've been kind of moving further and further down the scale in terms of societal
wealth and kind of ability to get attention, and with a system that is very much
language-specific.
So I don't think it's necessarily reasonable to expect that they would be able to pull this
off on the hundred and whatever languages
they currently offer services in on the platform, and that's not to mention additional languages
where they just don't offer services, but people make do.
I think that something that occurred to me a lot in the course of this reporting is how
much of these problems would exist with Facebook, circa 2008.
And now, obviously, this is very much a counterfactual.
And I'm not suggesting that it's time to go
back to Facebook 2008 or 2006.
But I am saying that many of the issues highlighted the groups in which sort of maids were being
sold, the concerns about virality and recommendation systems, these literally didn't exist
in the early iterations of Facebook.
And arguably, if the goal is to make it possible for people to connect with each other,
that goal was being carried out fairly effectively in the early days.
I mean, you know, any person in the world could for free log on and chat with any other person
in the world, you know, like, congratulations, mission accomplished.
But the product has sort of continued to get increasingly complicated.
And it does seem like a lot of the problems that Facebook has have come out of those
iterations and sort of new features. And again, where lines are drawn isn't clear, but I know that's
been a frustration to people inside the company, which is that, you know, they keep on building
and they expect integrity to clean it up, whereas the previous version was safer.
That says so much. Last question before we go to break. Jeff, three of these stories are
essentially on content moderation. There are going to be people who say that you're a whiny reporter
who wants Facebook to censor. What's your response? Oh, I think, I think one of the
interesting things about censorship is you have to censor things. Look, letting anyone say whatever they
want to say, it works great on the open internet. There are Nazis on the open internet. My name's
Horwitz, right? So I'm of a certain ethnic background. I don't mind Nazis being on the internet.
They can, you know, have their own websites and Nazi it up as much as they want. I think the problem is
when you start, like, basically providing tools that make small, potentially militant communities
more powerful. I think that that is something that perhaps raises different questions. So in some
respects, I think the fact that Facebook has to moderate its platform as much as it does is because
of how viral things are on the platform. So I guess I'd say, like, I think there's actually,
I think this was, like, a really interesting point when writing about Parler, right?
Like, everyone hated Parler because, well, not everyone.
But but.
Parler users liked it.
Yeah, okay.
It's true.
It's true.
Everyone, everyone in our general profession kind of looks askance at that thing.
I will just say that.
And I actually think they didn't do algorithmic content ranking.
You just saw the things from the people you followed.
And I think that that was a really interesting, in some ways limiting, factor
on how bad things could get.
I'm not saying that Parler was good
or that there wasn't a ton of super bad stuff on there.
But I am just saying that like,
even with a commitment to not having rules,
if you don't have recommendation systems,
it's kind of a less risky business.
Yeah.
Okay, I do want to get into the algorithm
and the recommendation system.
So why don't we do that right after the break?
We'll be back
here on the Big Technology Podcast right after this with Jeff Horwitz and Brian Boland.
Hey, everyone. Let me tell you about The Hustle Daily Show, a podcast filled with business,
tech news, and original stories to keep you in the loop on what's trending.
More than 2 million professionals read The Hustle's daily email for its irreverent and informative
takes on business and tech news. Now they have a daily podcast called The Hustle Daily Show,
where their team of writers break down the biggest business headlines in 15 minutes or less
and explain why you should care about them.
So, search for The Hustle Daily Show in your favorite podcast app,
like the one you're using right now.
And we're back here on the Big Technology Podcast talking about the Facebook Files, the
bombshell Wall Street Journal investigation into Facebook that features tons of original
documents and memos that we've been discussing.
Well, there was one part of the reporting that I found most interesting, and that was
this, what happened when Facebook optimized its algorithm towards meaningful social interactions.
Facebook at the time told us that this was all about improving people's well-being.
But the truth was that it came at a time in 2017 where, according to your reporting, Jeff,
comments, reshares, likes, and original posts were all declining.
And without that engagement, Facebook essentially becomes a shell;
people stop going to it.
And Jeff, from your reporting, what happened was the shift worked.
People were more engaged, but they were kind of just hanging out in flame wars, you know,
in the comments, going back and forth, yelling at each other about controversial stuff.
And publications and politicians took note of this and started putting more outrageous stuff on the platform.
So I guess, like, Jeff, maybe you could, first of all, you know, let me know if I got that right.
And then, because, look, it's been a week of spending my mornings reading your stories.
Trust me.
It's been an even longer week over here.
I believe you.
It's probably your whole summer.
But number two, I want to ask you, and then, you know, we'll toss it over to Brian after this.
But, like, does Facebook's existence and getting people angry, are they kind of linked in some way?
Because it is interesting that, like, when they made people madder, they ended up reversing some of the downtrends that they were seeing.
I don't know that it needs to be that tightly linked. And again, we are now entering dangerous territory for a reporter, because thinking about platform design, there's a reason that, like, no one's on, you know, Jeff Horwitz's book, right?
That said, there were two parts to the meaningful social interaction thing, and I think it was interesting. One of them was to focus kind of more on content from friends and family, and, you know, that was kind of trying to take Facebook a little bit more back to its roots, and research the company did found that that was, I think, from an integrity point of view, a content quality point of view, sort of unambiguously positive. And, you know, in fact, like, definitely something
that Facebook can do when they want to, you know, if they're worried about content quality
issues, is they can show people more stuff from their immediate friends. It tends to be safer.
And, you know, like, nobody starts flame wars on vacation photos, right?
So, I don't know. We might have a difference. Possibly not, but, you know, you know where I'm going. But the other part, though, was that they did explicitly want to goose engagement, right? And, you know, the way you goose engagement is you simply reward content that gets engagement. And they did that quite successfully. It's not like that was the first time they'd ever sort of, you know, done engagement-based ranking. I mean, this is a company that originally started out
with just, like, likes, you know; that was the cumulative metric, right?
But that said, they did switch more heavily to this, and I think the difficulty they had
first detecting it as an issue, it was kind of a hindsight-based thing of, like, uh-oh, you know,
oh God, we did this, is interesting. And
I don't know, though, that it has to be that way.
There are other ways to weight things.
You know, you don't have to, like, I mean, one thing that was of interest was like,
how much should the anger emoji be worth?
Because it turned out that was a very specific thing that you could optimize for as a
publisher.
I mean, the concept of hate bait, right?
You know, just riling people up.
And it was very successful.
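To make the weighting idea concrete, here is a minimal, hypothetical sketch of engagement-weighted ranking in Python. The reaction names, weights, and example posts are all invented for illustration, not Facebook's actual MSI formula, but they show how a heavily weighted signal like the anger reaction becomes something a publisher can optimize for.

```python
# Hypothetical sketch of engagement-weighted feed ranking.
# Reaction types, weights, and posts are invented for illustration;
# this is not Facebook's actual MSI formula.

REACTION_WEIGHTS = {
    "like": 1.0,
    "love": 1.0,
    "anger": 5.0,    # a heavily weighted reaction becomes a target for "hate bait"
    "comment": 15.0,
    "reshare": 30.0,
}

def engagement_score(counts):
    """Sum each interaction count multiplied by its assumed weight."""
    return sum(REACTION_WEIGHTS.get(kind, 0.0) * n for kind, n in counts.items())

def rank_feed(posts):
    """Order candidate posts by weighted engagement, highest first."""
    return sorted(posts, key=lambda p: engagement_score(p["counts"]), reverse=True)

if __name__ == "__main__":
    candidates = [
        {"id": "vacation-photos", "counts": {"like": 200, "comment": 10}},
        {"id": "outrage-bait", "counts": {"anger": 80, "comment": 60, "reshare": 20}},
    ]
    for post in rank_feed(candidates):
        print(post["id"], engagement_score(post["counts"]))
```

Under these made-up weights, the outrage post outranks the vacation photos even with fewer total interactions, which is the dynamic being described here.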
I think, you know, one of the things from MSI that I found fascinating, that
my colleague Keach reported out, was that it turns out that targeting political ads to your
opponent's supporters was a strategy that some political parties took.
Because they get to be angry.
Yeah.
And this is a new thing under the sun, right?
Like, yes, politics has always been negative.
But nobody ever was like, okay, sweet.
Like, I'm going to go find the TV spot.
Yeah, like, I, a liberal Democrat, am going
to, you know, I'm going to go basically buy up every slot on, like, Glenn Beck.
Like, that was not like a thing that anyone ever thought of doing.
Right.
As a means of getting distribution.
Yeah, you're definitely, you're in a troublesome spot when you start to say,
the way that I beat this system is to make lots of people who don't like me super angry.
I do wonder, sorry, go ahead.
No, no, I was just going to say, and I think something just very interesting about how people
engage with the platform.
is that I guess I'd say, like, we're sort of in the middle here.
I think most people use Facebook because they want to use Facebook and because they think
it's interesting.
We use social media because it's kind of relevant to our jobs.
Yes, we do compare, you know, like, look, no one ever tells me, like,
Jeff, you need to go get more clicks, right?
Like, but obviously getting, you know, getting attention is a good thing.
But then there are also, I think, entities that are just like purely dedicated
to getting attention on Facebook.
And there are definitely very effective ways.
And those entities are the ones that are going to be most gearing themselves
to trying to, one, understand how the system operates and what it rewards.
And two, trying to game the living hell out of it.
Because, I mean, everyone else is going to be like,
Facebook makes an algorithm change.
Great.
They're going to keep on posting vacation photos and, like, interesting news links.
However, like, like basically crap Macedonian publishers,
they are going to work hard.
And so you expect them to be the ones that adapt much harder than everyone else.
And so there's something that's like really adversarial about the kind of Facebook attention garnering community.
Yeah, I spoke about this with Nehiel a couple weeks back.
We had an interesting discussion about this.
Brian, you were inside.
So, and you know, you have a little bit more freedom,
I think, than Jeff does as a reporter to talk about this stuff.
And definitely an inside view.
So what's your perspective on how meaningful social interaction has gone wrong?
And is it that simple, that, you know, angrier stuff will make Facebook more successful, or is there something more nuanced that I'm missing here?
Yeah, no, I'm going to have to go with what the documents found because it was the folks who are researching it and looking at stuff.
But, you know, look, like a lot of what we keep talking about here is the role of algorithms and the role of artificial intelligence in this stuff, right?
Because the new thing for mankind is the algorithmic determination
of what you will see, right?
You hear Facebook talk about, hey, look, you're just joining the town square.
This is the public town square, right?
But it's really not that.
Like, it is an algorithmically, personally created town square where you're lumped into a set
of content based on what the algorithms are optimizing for.
And I think it's very, very hard to design algorithms where you understand what they're
driving beyond the quantitative things, the metrics, which Jeff was talking about earlier.
You know, like, optimistic builders are like the kind of people who work at Facebook.
And optimistic builders, you don't do your product pitch deck with a big section on what could go wrong or what the ill consequences of these things are.
Yeah.
Right. Like what's the worst? Like nobody imagined at Facebook, the Macedonian kind of stuff.
Right. Right. Yeah. Like reminds me of Amazon where like they write those six pages like saying what's going to happen with the product that they want to build once it goes out in the wild.
And, you know, they've told me that it's, oh, it's like writing science fiction,
except they always imagine the best case scenario, and there is no room for what could go wrong.
Yeah.
And those are the kinds of folks who have built some amazingly powerful products, right?
It's like, it's optimistic.
And I get it.
And I was in it.
And I was super optimistic.
But there's this need for the really critical worst-case-scenario thinking, right?
And I think, you know, in this case, the algorithms, you know, when you read it,
the algorithms rewarded that kind of angry content.
That wasn't the design. That wasn't, like, what the team was hoping for, like, hey, let's just get a whole bunch of people pissed off and riled up and, like, boost engagement. But it, you know, it does lead to just way more focus on the worst case scenarios from the beginning. And then when you do release it, don't just see what the growth metrics look like. Yeah. And, really important from Jeff's comment earlier, don't leave it to some other team to clean up, right?
Yeah. That's a problem, when you're saying, hey, look, we'll have an integrity team who deals with all the bad stuff and I don't even need to worry about it, I'll just build, right? It's making it a part of everyone's job and a part of everyone's plans to build safely. That's really important. Yeah. It's a good moment to talk about Facebook culture for a minute. I mean, it's so interesting. This is so interesting to me because Facebook has a very strong feedback culture where you're encouraged to bring ideas and suggestions and tell people what's wrong, to
anybody across the organization, but it requires a willingness to listen.
And one of the things that I wonder is if the message has gotten through,
and obviously the message didn't get through here, which is interesting.
But compensation, where, Brian, maybe you can enlighten us a little bit on this.
But from my understanding, people that build the products are compensated primarily based off
of growth.
And, you know, if that's the case, how does this ever change?
I mean, you're answering the question by asking it, right, which is compensation and goals need to change to think about and reflect a deep understanding of what the product can do from a harm standpoint and then mitigating those harms, right?
So, yeah, I was talking with Brian Stelter a month, two months ago now, and, you know, he kind of brought up this great quote of, you know, when you invent the plane, you invent the plane crash. When you invent the car, you also invent the car crash.
Yeah.
But it's really being thoughtful on what the car crashes and plane crashes are here at the beginning.
And you can goal teams on that, right?
So it's more of a softer metric.
And, you know, people who pay out goals hate soft metrics.
But really, you do end up compensating people on how they are as a cross-functional partner, how they are as a peer, how they are as a mentor; those get paid out.
So, you know, rating people on their
ability to really think through the implications of their products is absolutely something
you could goal on. And then making safety a part of every team's problem. So, like, if I was goaled,
pretend I was building a product, and if I was goaled on the integrity issues on my product as a
core part of my compensation and advancement, even if there was another team that was primarily
responsible for that, I would pay a lot more attention to it. I as a product group leader would be
having reviews on what's the latest on integrity in my platform. I as a leader in the partnerships
organization would be saying, hey, look, let's go through an analysis and let's go through
the latest understanding of integrity issues around my partners. But because that's not part of what
you're compensated on, it's somebody else's problem. It's an afterthought. Yeah. And engagement is
what keeps the company going. And it seems to me that it's much easier to take half measures
and then say we're always looking to do better than take aggressive measures like changing the way
that you compensate, even if it risks some of the growth. And social media is so
competitive. I mean, Facebook, you look at it, it's got Snapchat, YouTube, and TikTok all,
you know, trying to take its users away, and threatening to do so successfully. So it's a tough problem.
And I do wonder if the company even believes that it can from a competitive standpoint make any
changes here. Well, you know, I think the interesting thing is, is that those other companies
you mentioned have been super quiet in this conversation. And there's a reason why. And to Facebook's
credit, right, to Facebook's credit, they have looked. Yeah.
They've actually put some effort into trying to look at some of these issues and trying to
understand these issues. My fear is that the other companies aren't looking and they're not
researching these things and none of them are transparent, right? So, you know, Facebook keeps talking
about how we're doing more than everybody else. And that is factually true. They're doing more
than everybody else. But it's still woefully inadequate. What also terrifies me, though, is what's
happening on TikTok. You talk about algorithms and virality and amplification. That
is the most amplified and viral app that you can find.
And nobody knows what's going on on that platform.
So, yes, these are Facebook issues.
I give Facebook credit for allowing their teams to continue to do some of this work.
I hope, hope, hope that one of the reactions isn't to clamp down on research
because that will be a reaction.
Somebody internally will say, we need a review board to approve all future research,
guarantee it.
And like that, that would be awful.
but we also need to remember that these other platforms aren't looking at it and they have to.
Yeah.
I would actually second that.
I think, look, like I do not have details here on what the equivalent of the cross-check
slash special exemption slash whitelisting slash shielding situation looks like elsewhere,
but it has been darkly intimated to me that Facebook is not the only company that has
different sets of enforcement rules depending on who you are. And yes, I mean, I think like the idea
that Facebook needs to take more care when dealing with high-profile accounts is of course
obvious, right? Like there's no, you know, it's not like the, the alternative to what Facebook had,
which was whitelisting, was like just basically taking down Donald Trump whenever anyone felt like
it. So there are other ways to do it. I think some of the questions are what level of reach
things that are previously flagged
shouldn't get while they're awaiting adjudication,
how long the review queues are,
and also sort of like
what level of difference in outcome
one expects from the overall systems, right?
Like, obviously, if you just eliminate stupid errors,
it's still a favor, but it's less of a favor
than if, let's say,
senior leadership weighs in asking for
a penalty to be waived on one of the platform's top
influencers, right? So there are, I think what I would say are, clearly obvious
better and worse ways to do this, from a "we can explain that to the public and no one's going to get
super angry" point of view. But I, you know, I truly do not know, but, as I said, have heard some
dark mumblings regarding what that looks like elsewhere in the industry. Yeah, the TikTok
stuff in particular. We just had an episode about the risks of TikTok's rise. And when you start
to think about it, you do start to shiver and start rooting for Facebook because you don't
know what's going to happen with that. Do you guys have five minutes to talk a little bit about the
Instagram stuff? I know we're kind of out of time here. So we can end. I can do a couple more
minutes. You're going to be upsetting German public television. Okay. You know, I think we can make that
happen. Okay. All right. Sounds good. Sorry, Germany.
And we'll be back after this for five more minutes.
And we're back here for one final segment on the Big Technology Podcast with Jeff Horwitz and Brian Boland.
We talked a little bit in the second segment about how when you design the plane, you think about the plane crash, and the car,
you think about the car crash.
So, good segue, because Instagram head Adam Mosseri talked about how, you know, cars do a lot of good.
Instagram does a lot of good.
And you can't just throw out, you know, throw out the invention
just because of the fact that, you know, I guess the intimation was, it might kill people.
And then everybody went, you know, kind of wild dunking on the tweet, talking about how, you know, cars were regulated, you know, through the roof before, you know, the industry became safe for people.
And Adam, he said this in the podcast with Peter Kafka on Recode Media.
And he said, you know, they welcome regulation, although, you know, certain kinds of regulation.
So what do you guys think? Is that a fair defense from Adam?
I mean, he also said, I'll just throw all of his points out there, he also said that, you know, we never really did research on, like, how fashion magazines were making girls feel back in the day.
And, you know, Instagram has supportive communities as well as, like, this comparison culture to it.
So I'm going to defer to what you think about the defense.
Yeah.
I'm going to defer to Brian here because the automobile analogy is one that he actually raised with me well before the story started running, when I went and sought him out after
he first surfaced in a Kevin Roose story.
So I will leave this to him because I have already cribbed an analogy from him once.
Okay, cool. Brian, you're up.
Yeah, no, look, and like, I did find it ironic, and knew that people would jump all over the regulatory side of it, because the NHTSA didn't exist before there was public pressure in the 60s,
and Ralph Nader's book, Unsafe at Any Speed, came out and people really understood the harms.
And they importantly understood that the auto manufacturers weren't going to invest in safety.
And they weren't taking the steps that were required.
And that's happened every time there's been a regulatory regime.
Like chemicals do good.
Like we use chemicals every day, but that was an industry that wasn't regulated.
We live places.
We have cities.
We have buildings.
We have building codes and regulation that governs those.
Those didn't exist.
But now they do because of safety issues.
So there's clearly a need for regulation here.
And, you know, the
defensive analogies that make me cringe are those like, you know, we never studied magazines
for their body image issues because it dodges the question, right, and tries to get you to look
somewhere else. The question is, you have evidence that you have researched that there is
potential harm being done to significant, not an insignificant, a very significant portion of your
user base. And just, wouldn't you want to know? Like, if I
told you that the basement of your house, I was pretty sure, had some toxic chemicals in it,
like, would you let your kids still play in there and still give them ice cream in the basement?
Or would you be like, look, I just want to make sure that I have a good understanding of
whether my basement has toxic chemicals in it or not.
Like, hey, other houses have chemicals.
Right.
Yeah.
Like, hey, you know, like not all the old kids are made on somewhere else.
So that is, like, to me, the comparisons to other industries and other things in time, if you had
done the homework, if you had really,
if I felt like you had done everything in your power to understand the impacts,
and that you were working deeply with researchers in the academic sense
to understand the impacts, fine.
Like, we can do that analogy.
But that analogy doesn't hold, because, on its own,
you have a flag, and you should research, to your deepest ability
and in partnership, whether there are these ill impacts or not.
100%.
All right, Jeff, if you have one takeaway you want people to get from your reporting this week, what is it?
You know, I think there are some people who got me much smarter on the subjects that were in these stories.
Look, I started covering Facebook two and a half years ago, and I had not previously covered tech, and I didn't use social media very much.
So, like, basically, all glory goes to people who have bothered to walk me through this stuff.
I think that it just goes back to what you were saying at the beginning, which is just, like,
is it all hopeless? You know, that definitely isn't the impression of the folks that I've
spoken to, right? That there are, you know, yes, there are always going to be tradeoffs,
but that perhaps the tradeoffs, if understood, and if others were at the table
aside from just the company, the tradeoffs might be different and perhaps more acceptable
to a lot of people than they are. So, you know, that's obviously
getting past my capacity to speak or think or come up with ideas. But, you know, it seems like
there's a growing body, and this is one of the things that's most interesting, there's a growing
body of people who've been, like, talking about this stuff on Twitter, who, like, candidly understand
it a hell of a lot better than I do, and that's because they came straight out of the company.
And so I think it's, like, really encouraging that folks like Brian are out there. It's even more
encouraging that people who had, like, senior roles doing integrity stuff are, like, just popping up, and
I'm hoping that, like, there may be a, like, kind of new class of Facebook-trained people who are,
I believe, about as smart in this stuff as anybody in the world, who are going to be kind of out
and about and perhaps maybe offering some ideas beyond just, like, well, this seems kind of messed up,
which is kind of where we end up, right? And Brian is one of those people. Final thoughts?
There's hope. There's absolutely hope. You know, like, the platform has done some really great
things. Just, I think last year the platform helped register four million people in the U.S. to
vote. Like, that's a big, big deal, right, to be able to do these things. You know, I think
there's two things that I'd love to see. One, I wish, you know, I pled my case internally,
but I pled it quietly and I pled it to leaders at the company. I think it's time for people
internally to really speak up and rise up and use their force and their voice as employees to say
enough is enough. And retaining that exceptional talent is critical for Facebook. So people speaking up
and people, you know, really mandating these as priorities is important. And then it is time for
regulation. And it's time for the right regulation. And I think that's the thing that I'm most hopeful for
in the U.S. But note, a lot of the regulation you may get in the U.S. and in the U.K.
and in the EU is going to focus on those countries.
It's these, you know, South America, African countries that...
I believe the phrase is rest of world, Brian.
Let's just call them rest of world.
You know, countries, I do work in Kenya and Uganda and people rely on Facebook, right?
It's the thing they get everything from.
And so if we do regulation in the U.S., we do regulation in Europe, but we don't take care
of Kenya or Uganda, Ethiopia, then we're leading to undue suffering around the world.
And we can't let that happen.
But like there are paths to improve this stuff.
But it's time to take it seriously and do it.
Yes, definitely.
Facebook employees, you have your marching orders.
All right, Jeff, if people want to read the series, I guess they just go to the homepage of the Wall Street Journal.
Actually, I think, look, I don't entirely understand this, but the paywall has been lifted for
people visiting through social media. If that doesn't work, try it in a different browser window.
If that doesn't work, I'm sorry. I don't understand why. If you're a Facebook employee,
of course, you can personally reach out to me and I will send you the documents personally as well
with a request for you to talk to me in the future. No, seriously, I actually would love to hear
from people who work there, even if they just want to tell me why they think I'm an idiot with
the understanding that that will be treated as off the record. That's all fun too.
That's very generous. Very generous. And not very self-interested,
honestly. Well, yeah, it's good. Hey, look, if you're able to do more of this good work,
everybody's going to benefit from it. So we appreciate it. And Brian, if people want to reach out
to you or learn more about the Delta Fund, how could they do that? Brian, you're either muted or
you walked. Oh, sorry. I was, I muted myself. Man, I made it the whole time without doing that,
that very 2020, 2021 error. Yeah, so happy for people to reach out to me.
You know, I'm on Twitter, people can DM me there.
And then thedeltafund.org is where people can learn about some of the philanthropic work
and economic investment work that we're doing, which is in a lot of places all around the world
and trying to help people improve equity.
Amazing.
Well, with apologies to German television, I want to thank both of you guys.
Thank you, Jeff and Brian.
What a great discussion at the last minute here on a Friday.
It's been a pleasure getting a chance to speak with you both.
And I just love this, you know, reporter and, you know,
tech worker, tech executive dynamic, where we could all sort of talk about problems and solutions together.
It's what we try to do here on the big technology podcast.
Thanks so much for joining us.