Tomorrow - Episode 53: Blowing Up Facebook's Spot with Michael Nuñez
Episode Date: May 16, 2016. We recorded this incredible episode with Gizmodo's technology editor Michael Nuñez, who recently broke one of the year's biggest tech stories: namely, that Facebook has been not-so-honest about how it surfaces news stories to its users. We discuss the current issues happening at Zuckerberg's house, the complicated relationship between social media and news, and why reporters are struggling to hold corporations accountable. Check it out!
Transcript
Hey, and welcome to Tomorrow.
I'm your host, Joshua Topolsky.
Today on the podcast, we discuss Facebook, Facebook, and Facebook. I don't want to waste one minute, so let's get right into it.
My guest today is the technology editor at Gizmodo and has just this week or last week,
when you hear this, dropped a pretty huge scoop about Facebook.
I'm, of course, talking about Michael Nuñez.
Michael, thank you for being here.
Thanks for having me.
So I do, you know, obviously, one of the reasons I want to talk to you is you just dropped this huge scoop.
What was it, on Monday, last Monday?
Because this will be airing on Monday.
So it would
have been the previous, right? I think that's right, yeah. So we dropped this story on Monday,
May 9th, revealing more details about how Facebook's trending news module is operated.
And what we discovered was that a group of about 20 journalists are the ones that are choosing what is able to show up in the feed and, more importantly, what's not.
It's great to just see somebody other than, like, the traditional outlets doing, you know, the real legwork of journalism.
So I was excited.
I mean, this is like, it's always great to watch like how everybody reacts to it, you
know, how they scramble.
Yeah.
But just to back up.
So the story is essentially like, so Facebook has, I mean, I'm going to go really basic
for anybody who's listening. I doubt that anybody's listening who doesn't know this, but
I'm going to be stupidly basic just so everybody knows. So Facebook has this thing that they
introduced not that long ago. It's probably what, like maybe two years ago? Yeah. January 2014.
Okay. And this, and I'm going to actually look at it while I'm talking about it, 'cause I really want to, like, explain it if you don't know. And by the way, I think there's
probably a lot of people who look at Facebook and don't really think about it.
It's this trending bar.
It lives in the upper right-hand corner, the sort of mid-right side of Facebook.
And this trending bar is like one of the biggest drivers of traffic.
I mean, as a person who has obviously run several digital publications, this is like when you see the spike from that bar, it's like really intense, right?
So it really gets people clicking.
It's like a kind of persistent bar that shows trending news across a variety of topics.
And prior to this piece that you did, I mean, correct me if I'm wrong, but the assumption, and not just the assumption, the actual word from Facebook about how that thing functioned, was that these topics bubble up algorithmically.
Facebook has all of this data on what people are sharing and what people are talking about.
And so this stuff bubbles up algorithmically.
And then I don't know if they ever said now, then we have editors who kind of tweak the
headlines so they read better.
I know there was a thing a few months ago about how they were suppressing the word Twitter
in it, where they were saying social media instead of Twitter. So I don't know if they've ever been
explicit about the fact that they had physical editing being done to this. But the idea was that
this is a kind of like algorithmic view of what is happening in news that Facebook is delivering to
you. And yeah, that's exactly right.
So yeah, for the last two and a half years, let's say,
Facebook has maintained that some algorithm
is sorting through the news and determining
what is spiking in that moment.
So it's not the most popular topics on Facebook.
It's actually things that are showing a spike in engagement.
And they said that they had reviewers that were
writing these headlines and summaries, but ultimately had no impact on what people were
able to see in the trending news module. Our reporting discovered that that description was actually very misleading at best.
So we found that a small group working in the New York office, about 20 recent graduates from private East Coast schools, many of whom are from Ivy League schools, were actually the ones that were activating trends, which basically means deciding what can show up in your trending news module in the top right corner of Facebook.
So when you say, hold on, I want to pause because I want to dive into that.
When you say activating trends, what you're talking about is, just so I understand this,
and everybody else does, it would algorithmically find things.
And then there is some system within Facebook where, hey, here's stuff that's bubbling up.
These are trends that are bubbling up.
And then an editor has to decide, essentially, what trends are going into the bar.
Is that right?
Yeah, that's right.
That's, to a certain extent, what they had been saying.
But what we determined was that this algorithm is not actually that sophisticated.
So although it does show things that are bubbling up on Facebook, it is also populated with
something called external topics.
And again, we can get into more detail about this in a second.
But external topics are
basically sourced from RSS feeds. So in several ways, Facebook was artificially manufacturing
trends. And for the last two years, they haven't been very upfront about how they do that. And
I'm happy to get into the details about exactly how that happens. But basically, these news
curators on the back end are looking at a list of things that are actually bubbling up,
that are organic trends,
and also things that are called external topics
that are being injected based on an RSS feed
that Facebook news curators set up
to basically offset the lack of hard news
that was being shown on Facebook.
So, you know, through several discussions with former news curators, we determined that most people on Facebook aren't actually talking about hard news. They're typically talking about, you know, several news curators said Beyoncé is, like, constantly trending, and a lot of entertainers are, like, constantly entertaining. I'm sorry, constantly trending.
Right, and it's not surprising.
Exactly, yeah, it's not that surprising when you consider, you know, what is in the trending news module on a day-to-day basis, and also just, like, what you know about your friends and their response to Lemonade and that sort of thing, right?
But, you know, several of them, you know, there's often a lot of entertainment and sports. And, you know, some of the curators that were there in the early days were able to tell us that Facebook saw that as a problem. They wanted to give people the impression that their users were talking about hard news on a regular basis. And so these news curators were instructed to basically offset that type
of discussion that was happening organically. So they did that with external topics that were
sourced from RSS feeds. And they also did that with something called an injection tool,
which forced topics to trend that weren't natural.
So basically, what we would see most of the time, if they weren't touching it, is bullshit. Is that what you're saying? I mean, we'd see, like,
celebrity news and gossip and like the stuff that feels like probably, you know, I mean,
millions of web pages and millions of sites, but it's like, not really, it's not about the election.
It's not about policy. It's not about, you know, murders.
It's like very light by comparison. Yes, absolutely. And another reason for that,
and this is something that we weren't really able to get into in the article, but, you know,
after several conversations with these news curators, they all talked about an emphasis
on numbers. So their job is basically to go through this list on the back end, both of naturally trending topics and external topics which aren't naturally trending.
And they're summarizing those news events with a headline and with a three-sentence summary.
And these news curators described a really isolating experience.
They're forced to go through about 20 to 30 of these stories a day, and many of them were fired if they weren't able to get through
that many topics. And so in order to get through that many topics, they were activating news events
that were very one dimensional, such as Kim Kardashian posts a photo to Instagram. Okay.
That's really easy to summarize in, you know, less than a couple of minutes. Whereas something like the Ferguson protests, or, you know, the Darren Wilson shooting, or the San Bernardino shooting, you know, something that's very nuanced and multi-dimensional, probably unfolding in real time.
And in many cases, yes, exactly, it is much harder to get through. And so,
you know, in order to keep their job, they were often looking for sports events
and looking for news from celebrities
rather than dealing with,
there was no incentive to deal with
some of these more nuanced topics.
And so that was another way
in which the Trending News module
was sort of manipulated.
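To make the mechanics described here a bit more concrete: what's being sketched is essentially a review queue that merges organically spiking topics with RSS-sourced "external topics," followed by a human activate-or-blacklist decision. Below is a minimal, purely hypothetical Python sketch of that kind of pipeline, just to illustrate its shape; the names, fields, and data are invented, and nothing here reflects Facebook's actual implementation.

```python
# Minimal illustrative sketch of the curation flow described above.
# All names and data here are hypothetical -- this is not Facebook's code.
from dataclasses import dataclass

@dataclass
class Candidate:
    topic: str
    source: str              # "organic" (spiking on the platform) or "external" (pulled from an RSS feed)
    engagement_spike: float = 0.0

def build_review_queue(organic_trends, external_rss_items):
    """Merge naturally spiking topics with RSS-sourced 'external topics'
    into the single list a curator reviews."""
    queue = [Candidate(topic, "organic", spike) for topic, spike in organic_trends]
    queue += [Candidate(title, "external") for title in external_rss_items]
    return queue

def curate(queue, blacklist, approve):
    """A human decision sits at the end of the pipeline: each candidate is
    either activated (eligible to appear in the module) or kept off."""
    activated, suppressed = [], []
    for candidate in queue:
        if candidate.topic in blacklist or not approve(candidate):
            suppressed.append(candidate.topic)
        else:
            activated.append(candidate.topic)
    return activated, suppressed

# Example with made-up data: the external topic is "injected" even though
# nobody on the platform is actually sharing it.
organic = [("Beyonce", 9.1), ("NBA Finals", 4.2)]
external = ["Senate hearing on data privacy"]
shown, hidden = curate(build_review_queue(organic, external),
                       blacklist={"NBA Finals"},
                       approve=lambda c: True)
print("activated:", shown)
print("suppressed:", hidden)
```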
Right, and so this started with a report that you had,
which was essentially like, here's who's actually selecting this stuff.
It's really not this algorithmic sort of agnostic, you know, or neutral position. It's actually
being chosen. And it's like a specific group of people. And they have actually pretty,
I mean, their jobs sound not dissimilar. I mean, if you've been in a modern newsroom,
which you obviously have, their jobs are not dissimilar from like a news editor in
a modern newsroom that's told to like, just get some stuff up and like find anything that's going
to stick, right? Like it's not uncommon. But what's interesting is like a normal news organization
has, you know, a few million visitors or whatever.
If it's a smaller few million, maybe tens of millions.
But Facebook is dealing with literally billions of people.
I mean, at least hundreds of millions that are seeing a lot of this trending stuff.
So it's a very different position that they're in.
Their position needs to be much more rigorous in terms of neutrality.
Because Facebook is not a news
source. They're not a news provider. They don't have an opinion. At least that's what they tell
us, right? They don't have an opinion on the stories. If you're, if you live in Kentucky and
all of your friends are in Kentucky and there happens to be more of a conservative thread
amongst you and your friends, presumably you would see more conservative news stories. You would see, I don't know why I
chose Kentucky, but let's just assume that it's a, it's a red state. I'm pretty sure it is a red
state. You know, like you're going to see more conservative news shared. You're going to see
more conservative news trending. So this, but the first part of your story is essentially like your
first story is, which was not this week, but last week, or rather not last week, but the week
preceding it, unless I'm mistaken. The third is when you published it.
Yeah.
So it was the previous week.
It was like, here's who's doing some of this selection, right?
The follow-up story is the one that seemed to have been, well, didn't seem to be, definitely
was the story that moved the rest of...
The first story was like, okay, yeah.
Oh, well, it's not that surprising.
You know, people cared, but they weren't really paying attention to the implications, I felt like. The second story that
you followed up with, which was this past Monday, was this story about how basically these, as you
said, like these young, sort of very green editors who had just graduated from largely East Coast,
largely Ivy League, or in many instances, Ivy League schools,
were not trending up, were not allowing stories that would have been more conservative or right wing leaning to trend up in this bar. Is that right? Because I just want to say, like,
in your first story, you actually have an image of, you have a screen grab of the trending bar.
And it's, like, filled with conservative news. I mean, it is like, the top story is Ted Cruz, then it's John Boehner, the next story is Hillary but it's about Trump, then John McCain, then there's some other stuff. Megyn Kelly's in here, Trump is in here, Chris Christie is in here. Obviously, I would guess that's probably a day that's around one of the primaries, but
it's not that they weren't letting any through.
That's correct, yeah.
So there's really a lot to unpack here.
First, I would like to say that this was iterative reporting
that started in early February.
Someone leaked us a document, an internal memo from Mark Zuckerberg
about him asking employees to stop scratching out Black Lives
Matter slogans on their famous signature wall because they were scratching out Black Lives
Matter and replacing it with All Lives Matter.
Right.
I remember that story.
It was a great story.
So Mark commented on that in an internal memo.
That was leaked to us unsolicited.
And when we published that, it emboldened several Facebook workers to come forward with
more information.
So eventually we were leaked another document that showed a Facebook employee asking the question, quote,
what responsibility does Facebook have to prevent President Trump in 2017?
And it's important to note that the question doesn't ask, you know, what's my responsibility as a U.S. citizen to do this?
Because that would be very clear.
It would be go out and vote.
But this person is actually asking, you know, what responsibility does Facebook as a company have to prevent President Trump in 2017?
Right.
You know, we have no evidence of Mark Zuckerberg answering that question.
But that was another big story.
That was in, I believe, March.
And so, you know, again, once we published that story, more Facebook workers stepped forward with more revealing details and said, like, well, you know, that's not the entire story, there's more, I guess, shady business happening internally, and let me tell you. So we were able to get in touch with those people. We reported that there was a group of about 20 recent graduates controlling the trending news module. And in that story, I think what we were
trying to call into question is, or what we were trying to spotlight is basically that these 20
young recent graduates, these are not like editors, they're the ones that are choosing what's trending. That's a quote from several of the
curators. We choose what's trending, it was completely biased, etc.
So our follow-up story
came after that because one of the curators
stepped forward with internal documents that were able to show
about a six-month period in which right-wing or conservative-leaning news topics were regularly
kept off of the trending news module. It's not to say that Facebook told these employees they weren't able to activate these trends or weren't able to, you know, allow these things to trend. It was just that there was a selection bias in the editorial board that Facebook had created. And, you know, again,
an editorial board that they didn't want to own up to and had for a two year period avoided
talking about in any significant manner. You know, this person said that there were natural biases in a lot of these people. This person felt uncomfortable as a conservative and, you know, basically kept a running list of topics for about a six-month period. So yeah, so of course, you know, we did our best to corroborate that story with other news curators. We were able to, and from there, you know, we just went through the story with a fine-tooth comb,
right? You know, our goal was to publish the truth and to basically
shed light on something that Facebook had been misleading the public about. And, you know,
just to be clear, like, I am not this, the story doesn't align with my, I have no political agenda,
right? Like, I am the son of an immigrant, I'm the son of an illegal immigrant. So like,
you know, there's no reason that I would ever want someone like Trump to become
president.
But if anything, if anything, right, that would you would lean in the other direction
here.
But like, I mean, obviously, this story, this is and this is what's so interesting.
It's easy to look at this and say, well, yeah, OK, fine.
I don't want Trump to be president.
This is good.
But the implication is much larger, right?
I mean, the implication here is really not,
it's not about like if they just,
you know what, they've been keeping Beyonce stories
out of the feed, right?
Okay, like even that has bad implications, right?
Even that suggests something.
Because the reality is, like, look, news organizations have certain responsibilities. And the reality about that is that they also have certain biases, and sometimes it's fairly obvious, and sometimes the authors or editors make it clear what their biases are, right? The thing about, again, going back to this thing, is that the Facebook problem is twofold, right? One is, like, increasingly we're getting our news on Facebook. And the idea of Facebook is that it is neutral, like an observer. Like, it's just there. They're not controlling the flow of the news. They're just there letting it out, right? As a distributor of the news. And this suggests, like, they could do whatever the fuck they want, basically.
Like the reality is like you may have the greatest news organization in the world,
but if Facebook decides that they don't like you, then they just don't let your news get to the
people who read Facebook. And I think that's a really, that should be really scary, whether
you're at Breitbart or Mother Jones or whatever publication you work for.
If you're in the media and if you're a reader who cares about getting unbiased, uncolored
news, like news that is from the whole world of storytelling and news and information,
that's a pretty serious problem.
Yeah, absolutely.
The thing is, it's not wrong for Facebook to have an editorial board
and to choose the stories that they believe are most important to their users.
That's fine. They're a private company.
They're allowed to do that.
The problem is that for the last two years,
they've maintained that these
topics are algorithmically sorted. And, you know, what I think they're preying on by, you know,
by choosing that wording, I think they're preying on people's misunderstanding, like people's fear
of math, and basically their misunderstanding of what an algorithm is, you know, what, when we
started to pick this apart, and to dive a little bit deeper into it, what I discovered
was that this algorithm is not that sophisticated.
It is basically taking external topics from an RSS feed.
They're taking naturally trending topics from within Facebook.
They're kind of mashing them together and then putting that list in front of a recent
graduate who is underpaid and working as a contractor and doesn't receive
the same type of benefits that a Facebook employee does and is basically treated as a disposable
entity and having that person activate trends so that the public can see them or blacklist trends
so that the public can't see them. And especially in the early days of this trending news module, at least as it was described to me by several former curators, there were, you know, it was described as, quote,
the Wild West. People said that there were very few, if any, rules in place. You know, I mean,
basically there were, as it was described to me by one curator, there were no rules in place,
and they were able to sort of just do whatever they wanted when the tool first launched.
Eventually, they created, at least according to this source, they eventually created those
guidelines.
So the guidelines that were eventually released by the Guardian were not always available. And I would also add that we have evidence that that document has actually changed quite a bit since it was first created. So the version they have may not be the original document. I've seen three different versions of that document.
Right. Okay. So actually, let's, this is good, 'cause this
segues into the sort of the next part of the story. So I think we've, like, firmly established this sort of this world, right? You know, so following this, I mean, I only will get a little meta here, but following this, a lot of other publications scrambled. I want to actually focus on the Guardian for a second because, of all, and correct me if I'm wrong, I probably haven't seen every single story, but the Guardian actually followed up with some real material, right? Like,
they actually went and did some investigation and found, like, they had part two of the story,
right? Or at this point, let's say part three of the story. Yeah, and it was a huge relief. You
know, I also should have said this way sooner, but, you know, I've been working on this story
with two other editors who are not really sharing
the spotlight with me, but have been working incredibly hard on this story. That would be
Katie Drummond, who's our editor-in-chief, and Alex Dickinson, our deputy editor. And those two
have worked with me for the past two months to sort through all of this information and to continue pulling the thread. And it was a huge burden to take on.
Once the story really picked up, it felt like it was the three of us against the world.
I mean, a lot of people were trying to poke holes in our story.
Facebook obviously tried to obfuscate the truth and muddle the truth a little bit by releasing, you know, press releases and
documents. And so it was nice when the Guardian finally released the findings from their
investigation. It was such a relief because, you know, it showed me that there was interest in
this and there were people that were willing to do their job as a journalist.
They were willing to investigate.
They were willing to spend a lot of money and man hours looking into this.
And so it was nice to know that we weren't in this fight alone and that there were people that were willing to challenge Facebook and to basically look into some of their claims.
Because we do – and I do want to talk about this particular point pretty deeply.
We are at this really important moment in media.
I mean, and I could talk for hours and we could go way down a rabbit hole on this,
but I want to like, so first I want to start with The Guardian and just say,
by the way, does The Guardian, I'm looking in here, I'm just looking at it right now,
do they credit you guys in this story? Because I don't see a Gizmodo mention. Way down, they go way, how many paragraphs in is this?
I mean, yeah, it's probably lower, but you know.
I get it, I get it, but it's lame, you know? Like, I mean, I would have liked, paragraph two is kind of the place where you'd be like, after reports from Gizmodo. Anyhow, whatever, I'm not going to, they've got their way of doing things.
And this is actually, this work is actually very good reporting on their part.
So everybody's doing their job, but, you know, it's nice to credit, give credit where it's due.
But here's the thing, by the way, it's still fucking insane that in 2016, we're still like talking about attribution and credit in this realm.
I mean, it is still like a case where the New York Times will write on a story. In fact, if I'm not mistaken,
they wrote on this story following your scoop
and did not credit you guys, right?
Yeah, I mean, yeah,
I think a lot of publications
have been pretty timid about
and very reserved in the way
that they've credited Gizmodo.
But-
Listen, once your organization has wronged Hulk Hogan,
you find that like a New York Times tends to back away, okay?
They don't want to be seen as in league with anybody who's fucked with Hulk Hogan.
So anyhow, but getting back to this.
So this thing I think is so critical to what is happening right now in the media.
Like Facebook has become – and I wrote – I'm not going to go back.
I wrote this thing about the media a couple weeks ago that got a very strong reaction, a very strong positive reaction. And one
of the things that I was arguing in the piece was that we always think that there's this new trick
to make our business bigger or better or fix the problems that we have. And one of those tricks
has been for a long time, one of those things, those tools has been social media and particularly
Facebook, right? It's an incredible driver of traffic. And there are many newsrooms now that rely on Facebook
almost completely for their audience, right? Like without Facebook, they would have no audience.
I mean, I'm thinking, I'm not going to name names, but I'm thinking of people right now
that if Facebook went away tomorrow in terms of delivery of traffic, their business would go away pretty quickly. And by the way, like, fine. That's like, you make a choice.
There was a period where the same thing was true of Google where it's like if you didn't rank up really high in Google, in SEO – and by the way, that's still very important – that your business would have a hard time surviving.
That's still very true in a lot of different ways.
But the implication is like about the coverage of this.
So it's not just that Facebook might have preferences or they might have employee editors
that have preferences or whatever that is.
It's that to really cover Facebook, there's two pieces, right?
To cover Facebook and to say honestly and truthfully what's going on if there's a problem
becomes increasingly difficult if you've got like a variety of monetary deals with Facebook
that are
like making sure that you get your stuff on Facebook, right? You probably don't want to go
in and say, hey, we think Facebook is doing something that's wrong and is bad for our
industry and ultimately bad for us, right? You probably want to like play nice. But the flip
side of it is that, you know, we aren't, you know, it's like you're handing over so much, even if like you say
something, it's not like Facebook is going to hit you back, but you're handing over so
much of your business that, you know, you have to find a way to essentially work around
or work inside of these, you know, if it's a human based curation or if it's algorithmic.
And it's like, you're not really playing the game
of delivering what you need to do as a business.
You're playing a game of delivering
what Facebook needs you to do as a business.
And so it's an interesting point
because this really speaks to,
you see some of the coverage on the reaction to your story.
And I feel like a lot of people were kind of like,
oh, fuck you, it's not that important.
Or like, who cares what Facebook is doing?
Or like, yeah, obviously Facebook has curators and they don't show every story or whatever, as if like you can just brush this off. Like, were you surprised to see that reaction from
other people in the media?
I was blown away. I mean, I won't name names, but I was blown away by how willing some other journalists were to blow this off, or, you know, the fact that some of them pointed their magnifying glass in the direction of Gizmodo rather than the $350 billion
corporation that has a stranglehold on the media. You know, I think that those people are probably
uninformed about the fact that Facebook users are sharing less personal information than ever before.
People are no longer posting status updates as regularly as they used to.
They're uploading photos way less than they used to.
And so when you combine that with the fact that it is increasingly elbowing its way into
the digital news distribution industry, I think you can begin to understand the importance of this issue, right?
So, you know, Facebook isn't a startup anymore.
It is a well-established company.
And now that its users are no longer, you know, this was the darling of the Web 2.0
movement.
It depended on people, you know, willingly and regularly uploading information onto the
site.
Now that people are doing that less,
Facebook is increasingly interested in finding ways to get content. The way that they've started to do that is through partnerships with the media. So, you know, at least part of their
strategy has been to rely on the media more for content. And certainly the introduction of this trending news module was part of that
strategy. So for those that say that this is an unimportant issue, I mean, it is mind-blowing.
I think it's pathetic in a lot of ways.
I mean, what's really, I mean, look, and I get the struggle, you get traffic from Facebook at Gizmodo. You have to recognize it is actually an incredible fucking tool for news, for news people. Like, it's completely part of the ecosystem of news now. And there's no changing that in the near future. And I think that's not a negative. I'm saying, like,
it can be a place where, I mean, listen, you know, we started The Verge in 2011.
We were, we had some, you know, people who wrote, people knew who we were, but we didn't
have that big of a following.
And we started in November of 2011 with no audience, right?
I mean, we basically were starting from scratch.
And part of, you know, the way we build an audience was through social.
Twitter and Facebook were huge, huge levers that we could pull
to bring people into our world, which is really important. So you want to balance like this
amazing tool with the reality that like the tool has to be policed in some way that it has to be
fair, right? And if it isn't fair, then we have to reconsider our relationship with it in
the media. That's the way I look at it, is that if it isn't true that you can be judged on the
merit of your storytelling and the quality of your work, and that that can make an impact,
that it isn't about a Facebook editor deciding what goes or what doesn't like,
you know,
I think we need to reevaluate a little bit of our relationship with Facebook
and ask,
you know,
ask the questions like,
is this the best tool for us to use?
Right.
And I think that like the media should look hard at that relationship and
increasingly should look hard at that relationship.
You don't want to like,
you know,
it's like the printing press is owned by somebody else at this point.
Right.
Like maybe they change the language in your story.
Maybe they put a different headline on it, right?
That changes.
That's pretty significant shit.
Or maybe they just decide to leave your story off altogether.
That's pretty significant shit too.
Look at the trend right now.
Facebook trending is currently in the trending news module and Gizmodo is nowhere in sight.
Mark Zuckerberg is the first post that you see on that topic, and the second is Tom Stocky, their vice president of search. So, I mean, you know, that in and of itself is a form of suppression, in my opinion. I mean, we're the fucking news organization that broke the story to begin with. Excuse my language.
No, no, you can swear all you want. Give me a break.
You know, I just think that, like, if Mark is as serious about having an open
conversation or direct conversation about this, as he suggests, then he would return my phone calls.
He would return my emails. You know, I mean, we went way out of our way to work with Facebook to fact-check this entire story. I mean, I sent them a list of questions about every single fact in this story. And that message was sent three days
in advance of the story actually being published. And then, you know, I made several phone calls
in order to make sure that I was doing my due diligence. After the story was published, again,
you know, I reached out to them and asked for comment. And it's really frustrating when I see
the head of the organization
saying that he wants to have a direct conversation
about this issue when, you know,
none of his employees are willing,
or he himself is unwilling to answer any of my questions
or to basically, you know, face the music
and speak to Gizmodo.
So let me talk, I want to talk about that a little bit.
I want to talk about some of their responses.
But just to be clear,
I just want to be really crystal clear here.
Yep.
The document that The Guardian has says pretty clearly that there is human intervention in trending topics, correct?
Every step of the way.
Yes, correct.
Okay.
So Mark Zuckerberg's response, he says,
this week there were reports suggesting that Facebook contractors working on trending topics suppressed stories with conservative viewpoints.
We found no evidence that this report is true.
So what he's saying is we found no evidence that conservative stories
were repressed, but he's not, just to be clear,
not contradicting the report that you did and the report that the Guardian did, that there is some intervention, correct?
Yeah, that was my reading of it.
I mean, I think that his response was carefully worded and has sort of – I mean, really, they're tangled in their own responses.
When you look at the initial response, the statement that was given to TechCrunch and BuzzFeed and other news organizations at about 4 p.m. on Monday, May 9th, and you compare that to
the statement that was made by the vice president of Search, Tom Stocky, that was made, I believe,
at 1 a.m. on Tuesday morning.
Yeah.
Can I get into that for a second?
Because I'm looking at it right now.
Sure.
He actually says directly, he says, quote, we do not insert stories artificially into trending topics, and do not instruct our reviewers to do so. Now, correct me if I'm wrong, but these documents that the Guardian has directly contradict that, do they not?
I believe that they do.
I mean, it says here, I mean, in the Guardian story,
the guidelines show human intervention
and therefore editorial decisions
at almost every stage of Facebook's
trending news operation.
And then here's a couple of points.
A team of news editors working in shifts
around the clock was instructed
on how to inject stories
into the trending topics module,
how to blacklist topics for removal
for up to a day over reasons,
including, quote, doesn't represent a real world event left to the discretion of the editor.
So that right there, just that statement directly contradicts what Tom Stocky said.
That's correct.
Yes.
And so Zuckerberg really kind of danced around that in his response.
I mean, his response is, we found no evidence of it.
We're investigating, but we found no evidence that this report is true.
He doesn't say,
I mean,
he's basically addressing
the conservative versus,
he's saying like,
we're not tweaking this algorithm
or we're not tweaking our choice
to be more liberal
or less conservative.
That's what he's really addressing.
He's not addressing
that there's intervention
in the actual choosing of the topics.
Right, yeah, he's missing the point entirely.
Well, he's missing it on purpose, I think, because he's, I mean, and at the end of the
day, now he's saying he's going to talk to conservative leaders, right, directly.
Correct.
Is this, so the, I think.
But that also, I mean, it's, you know, we have to point out that that came after a GOP
Senate, you know, investigation was launched against Facebook.
So he's decided to meet with leaders of the GOP, and I don't know exactly how he phrased it,
but he's meeting with these politicians after an investigation was launched against Facebook by a GOP Senate
committee. Based on these reports? Correct. Yeah. I mean, just to say, just to be really,
really clear, I'm like, it's a real struggle to defend. In a way, it's like I kind of am like,
yeah, okay, great. I mean, because there's a lot of bad shit out there. There are a lot of lies.
There's a lot of misinformation. I mean, we're living in the age of Donald Trump. So what's coming from the GOP and the conservative viewpoint,
at least what is the popular, very loud things you're hearing, it's all shit. I mean, it's a
lot of shit, right? And a lot of bad shit. So on the one hand, I'm like, yeah, okay, please repress
that, get it out. But the reality is, it's just like, that is not, unfortunately for us, like if we think like adults and we think in a healthy way,
like that's really, really bad for human beings. And it's really, really bad for the industry of
news media. And so I think like, you know, it's easy to understand the justification
and even defend against it in some way. But it's really like, there's really no basis to say that this is something that we can allow.
And I think that the problem is when you've handed over so much power to one organization,
do they have an ethicist on staff?
Like this is my question.
Like does Facebook employ an ethicist?
Not that I know of.
And I've seen some media critics call for that type of job.
I think there's a real question to ask. It's like this thing, I mean, Facebook does this all the
time. They're like, we don't allow photos of breasts unless there's under these three
circumstances. If you look at the Instagram guidelines on nudity, they're completely insane.
They're weirdly puritanical. They're these strange American ideas of nudity. And I think the same is
true of Facebook. My wife, Laura, is a staff writer at The Cut at New York Magazine. She did
a story on Facebook. They took down this post of a woman who'd just given birth
that was in a private group, right?
And it's like, what's the criteria here?
You know, it's this strange attitude about the world
where it's like they're sort of policing
from a weird puritanical and very US-centric standpoint.
They don't really understand
all of the material they're dealing with.
I think this is a perfect example, and correct me if I'm wrong, of, like, this perfect combination of people who don't really understand the material that they have to police, and a heavy-handed style of
policing, right? Like, you got people who are pretty green to news, making decisions for literally
hundreds of millions of people about what news they see. Exactly right. Yeah.
And I think, yeah, you get to one of the most important parts of this story.
You know, another part that we weren't really able to include as much as I would have liked to in the story is that there are local news organizations that are basically being ignored.
So, you know, Facebook went ahead and released its list of RSS feeds and what is being populated in the external topics that are forced into the back end.
And in that list, there were very few, if any, local news stations.
And according to the curators that we spoke with, there were instances where things like, you know, a shootout in Chicago
would happen and WGN, which is, you know, for anyone that is from Chicago, is like a pretty
reputable and well known local news source. But in those instances, they would be forced to wait
until a larger national publication jumped on the story, or, you know, in a lot of cases, they weren't able to allow that to trend because WGN wasn't part of this list of a thousand, you know, news sites that Facebook sees as legitimate. So, you know, in a lot of cases, although they're pitching this tool as, like, a hyper-localized news module, in many ways it's playing a role in the destruction of local news entities such as WGN or, you know, any local newspaper that you might be familiar with.
Right. So what happens next in your opinion here? Like, what is the natural progression
for this story? Well, I have a very firm idea of that.
So, I mean, you know, I think what makes me most excited
is that we have more shocking detail.
We have been working on this story for a two-month period.
And, you know, obviously we have to be careful about what we report.
And, you know, it takes a long time to get through these stories, right?
We're not going to put out something that's inaccurate in any way.
It's really important to us to discover the truth and to keep pulling the thread.
But as I said earlier, this started with a very simple story.
It was a leaked document from someone that was working at Facebook.
And each time more people have come forward with more detail and more information about what's
going on inside of Facebook. And you better believe that, you know, the reporting doesn't stop now that I've published one story that has people talking. I mean,
there is more, this is a huge, huge, huge organization.
They have a lot of information.
They have just an inconceivable amount of influence
on people's everyday lives.
And so, of course, yeah, there's more going on internally.
Of course, I can't really talk about any of our follow-ups in any
specifics, but yeah, this is something that we're laser focused on and have been for the last two
months. And that's why when other news organizations, when our competitors have published
these spoon-fed comments from Facebook and just some of the really awful critiques of
the story, we've been able to brush that off because, you know, we have documentation of a
lot of interesting things happening at Facebook. And so our goal is to basically just continue to
report, continue to shed light on the types of decisions that are being made at
Facebook. And hopefully we can give people a better understanding of what algorithms are
and how humans play an important role in how those are created. And also just, you know, the
massive amount of information that Facebook has on people and how it has willingly, and actually publicly, you know, run secret experiments on its users in the past. And, you know, there is so much to unpack here. And, you know, again, there's three of us, it's me, my editor-in-chief, Katie Drummond, and my deputy editor Alex Dickinson,
who, you know, the three of us are working on this day and night.
And, I mean, in a lot of cases, it's, you know, it's been all-consuming.
I mean, it's really hard, this is really hard investigative work.
And, I mean, I can only say that, like, I just hope that people realize that I'm not going
to stop because, you know, one story gained some attention. There are definitely things that I'm
excited to continue reporting on.
So you feel, so just, so what you're saying is there's going to be more. I mean, just to be clear, like, you've got more on this.
You know, I think that my editor-in-chief, Katie Drummond, would probably be pissed if I went off the record and said.
Let me just say I know Katie, OK?
And I think she would be totally fine.
Yeah.
No, I mean, I can't say that.
I'm not going to say that.
More.
I'm not going to say that.
But, you know, what I can tell you is that I'm going to continue reporting.
I mean, that's my job.
My job is to be a reporter.
And again, every time that we publish a story like this, more people step forward.
And really, that's about as much as I can say.
Let me ask you this question.
This is a little bit more about, I mean, look, Gawker has been extremely controversial.
You've had, like, you know, Gawker's had a pretty crazy year or so, maybe a little bit more. Like, how much do you feel like you've gotten pushback because people are like, oh, well, it's, you know, it's Gizmodo, it's Gawker, like, you know, we can't trust them because of X, Y, or Z? Like, do you feel, both in our industry and outside of it, like, I'm curious to know how much you feel like it's been harder for you to push this forward with that shit sort of in the atmosphere?
You know, I think that, so, like, Katie, Alex,
and I have joined Gawker Media after all of that happened.
Yeah, on a full disclosure, I should say,
Katie has worked with me at both The Verge and Bloomberg,
and Alex Dickinson was actually one of our editors
at Bloomberg as well.
And there, I just want to say, it was very clear to me.
And you came from Popsci.
You were at Popular Science.
And you left when?
Pretty recently, right?
Yeah, January 15th, I started at Gizmodo.
And I knew when Katie joined Gizmodo as editor-in-chief, and we had talked about it, I had a pretty good feeling that this kind of reporting, we were going to start to see a lot more of it, right? Because I think there was a period where Gizmodo was a little bit at sea in terms of direction, but I knew, like, this is Katie's, like, this is right up her alley, you know? Like, she loves deep, serious, hardcore, real reporting.
Yeah.
And so this is not surprising. I'm just saying, like, if you know any of these people,
and I know a couple of them and now another one, it's not unexpected, right? But like it is a
little bit of a whole new chapter for Gizmodo. Yeah, absolutely. That's how we view it. I mean,
I don't, you know, I stand behind my reporting 100%. I know that I have published the truth. I've published it
accurately. I've talked to all of the sources since I've published these stories, and all of
them have been happy. They think that they've been represented accurately. And also, you know,
some of them have been a little confused by Facebook's response to this. But yeah, I mean,
in terms of like the Hogan scandal
and people's perception of Gawker Media,
to be frank, none of that matters to Alex, me, or Katie.
I mean, we, I think we let our reporting speak for itself.
I think that we will continue to do these types of stories
because that's what's interesting to us.
And we think that that is important.
And that's the role that a technology publication
should play, I think.
You know, it's not so that we can, again, be spoon-fed press releases from Facebook
or Apple or any of these companies.
You know, we want to take a critical approach to anything, whether it's a gadget review
or whether it is an investigative report.
And, you know, we want to be critical.
We want to be accurate.
And I think that we've done that in this case, we will continue to do that
and so people, if someone wants to say
we can't believe this because of a Hulk Hogan
video, I mean that's
what I realized once this story was published was like
I can't read the story for everyone, I got really frustrated
because some people were just completely missing the point
in a lot of cases.
And some people that I had respect for and no longer do.
But what I realized was just like,
I can't read the story to this individual.
I can't read for them.
The onus is on them.
If they can't, if this is over their head, then unfortunately they're just either not a very good editor
or maybe their opinion isn't as important as I once thought it was.
But for us, there's more to unravel and we're going to keep pulling the thread.
This might be hard for you to answer, but I'm curious.
Do you feel like some journalists reacted the way they did because of their relationship with Facebook?
I mean, do you feel like that's an actual motivator?
I mean, some of these stories are definitely like, why sugarcoat this or why brush it off, right?
Because it's bad for everybody if it's true.
Yeah, I mean, I think there was a case where, you know, I hate to throw Recode under the
bus here, but, you know, I think that a 2015 story by Recode, you know, portrayed the trending
module as one that was governed entirely by an algorithm.
And, you know, I don't have the exact quote, but I think it made me a little bit more skeptical
of the stories that I read on that particular site.
I mean, just to,
and I'm actually looking at that Recode article right now.
They sort of are like,
maybe the Gizmodo story is bullshit.
Yeah, of course.
I mean, we caught them during their, the middle of their relaunch, right?
They relaunched this week, and one of their more prominent stories of last year was called into question.
So, I mean, I don't blame them.
And they have, like, several updates on the story now.
I'm just looking at it.
But they sort of don't even seem to address the Guardian story, which is definitely, I mean, say what you will, but the Guardian story definitely proves that there's human involvement in choosing of these stories.
Unless somebody wrote that manual for no reason and nobody ever used it.
But you've got now what seems like several outlets with physical evidence that this was the order of the day. So I feel like the update should be, yeah, it looks like this is true, and this report where Facebook said, hey, it's all algorithmic, was kind of bullshit. I mean, at one point we can assume it might have been right, that what was trending, the trending topics, was purely algorithmic.
I mean, do we know, was there a point where it went from being totally algorithmic to having hands on it?
My understanding is that news curators have been involved every step of the way.
Can I just say something?
I want to just deviate from this,
like sort of the overarching topic to say something about like,
this doesn't surprise me
that there are human beings curating it
because the reality of choosing news
and putting the right things
in front of human beings
is actually that like human beings
do it better than an algorithm.
Like in my experience,
like I've watched algorithms do it, and I've watched people do it. And yes, you want to use data. And yes, you want to, like, use algorithmic pieces.
I mean, as hard as it may be to admit, like, it's probably a better selection of news if
human beings are doing it. The problem for Facebook is that it's not a newspaper.
Well, and they bill this as trending. It's in the name.
They call it trending and
publicly they've maintained that it is
algorithmically sorted.
What they have failed to disclose
is that, so it's basically
lying by omission,
the fact that the algorithm,
part of the equation is
pumping in stories from an RSS feed
that aren't naturally trending on
Facebook. Yeah, because if Facebook had just been like from the get go, they'd be like, well,
we use an algorithm, but we also have some human curators who help to choose stories from the
algorithm that are really best suited for the Facebook audience. Like this story would be less
shocking, right? Absolutely. Yeah, I think that's part of it. And also they should have disclosed
that part of their algorithm equation is pumping in stories
that aren't naturally trending on Facebook. So I mean, in that in that situation, we're talking
about like, there's a feed, right of news that news that these curators are seeing. And they're
just pulling stories out of that they may not even be getting shared on Facebook at that point.
Correct. Yeah, those are called external topics. And those are things that are not naturally
trending on Facebook.
Which is, like, what every newsroom does on the internet, like, looking at a feed of stories and going, hey, is there a story we should cover? Or is there something that, like, we want to direct people's attention to? I mean, that's a very common practice for a news organization.
Right. Which is why they hire journalists.
Right. Or even from their own pool of content, right? Like Gizmodo, there's an editor going like, all right, what's the top story? We got to put that up at the top of the page and whatever. You know, like that's a normal thing that happens. So it's interesting. I mean, do you think, how do you think Facebook responds to this? I mean, how do they fix this? I mean, is there a way to fix this?
It's not my job to fix Facebook. I think that the way that I sort of like to think about this is the question becomes, do you trust Facebook with your news? And also, can you trust Facebook now that it has been caught misleading the public? They've maintained, again, that this was algorithmically sorted, and we found that to be a little bit more nuanced than they've led us to believe.
I think this is an issue of trust. I don't know how they fix it, to be quite honest. It's really
not my job to look into that. Also, I'm just excited about, you know, working on some of the stories that,
I'd like to tell.
Right. Okay, which you can't comment on at this point. And I know, yeah, it may or may not be more Facebook stories, but we can't really talk about it. Well, listen, Michael, thank you so much for doing this, this is super interesting. Like, I mean, I feel like there's a lot to unpack here and we're just at the tip of the iceberg. Now, does Gizmodo have a deal with Facebook? Like, you guys are doing some of the live video stuff, right? Or Gawker does?
Right, yeah, so we're pretty new to that.
And the irony, by the way, the irony of this, I'm not casting you in any light, I'm just saying, this is how incestuous the whole thing is.
You can write a really hard-hitting story about Facebook,
but the reality is that everybody is kind of tied up in Facebook in a lot of ways.
So what is the deal?
What is Gawker committed to?
To be honest, I don't know the details of the agreement.
Are there disclosures in your article?
Hold on.
There's a disclosure.
Here we go.
There it is.
Yeah, yeah.
Disclosure.
Facebook has launched a program that pays publishers, including the New York Times and
BuzzFeed, to produce videos for its Facebook Live tool.
Gawker Media, Gizmodo's parent company, recently joined that program.
So I don't even think we had started doing those videos at the time that this – I think
it was brand spanking new.
It was news to me when that disclosure was added.
Is that deal still on?
Do you know?
Or has it been?
It is for now.
I mean,
well,
that would be,
this is the fucking problem,
isn't it?
Like if Facebook were like,
oh,
hey,
you know what?
You know,
maybe not Gawker because we're not loving,
we're not loving everything they're doing right now,
but that's easily,
they could easily do that.
Right.
I mean,
they wouldn't do it because it would be,
it would be like the most amazing gift to give you guys in the world.
Right. They can't be, they can't be that foolish, but you know, this is like,
it's sticky. Right. And there are plenty of publications. I mean, like, look, when I actually wrote about this, when Facebook announced Facebook Live, which it's paying people to use, paying publishers to use, to create content that they weren't creating, right? It was like, they're not like, oh, you know what? We should be doing more live videos. Like,
Facebook was like, we really want live video on our platform. We think that's going to be exciting.
Here's some money and go make some live video. And when that was announced, there were articles
from publications that were being paid that were like, man, Facebook's new live thing is so cool
and amazing. It's like, okay, is it? I mean, at what point are we like, you know, we're in the room with the sales guys and the biz dev guys and the editors, and everybody's like, yeah, Facebook Live, that's gonna be a thing we do. And I think there is, for an audience, you know, this is getting, I'm sorry, I'm gonna try not to ramble on this, but, like, this gets to the fucking heart of the Trump issue. Right.
Where it's like, no wonder people don't trust like the media.
Right.
We keep giving them reasons not to trust the media.
Like we keep giving them reasons to question where our alliances are.
Right.
And that creates this like perfect opportunity for somebody like Trump to come along who's a complete fucking liar.
I mean, like just a crazy, crazy, unabashed, like lying racist liar.
Yeah.
Racist nationalist fucking insane bullshit artist.
And like he comes along and he says whatever he wants and the media can't fact check him
because people are like, well, you know, come on the media.
You guys are all mobbed up.
Like we don't know what's going on with the media.
It's like, okay, so we've completely screwed ourselves into a corner.
And by the way, like, you know, Trump trends real hard on there. I mean, we can say they're suppressing conservative news, but I guarantee you, Trump has trended like a motherfucker.
Boy, I'm really swearing on this podcast.
Trump has trended like a motherfucker on Facebook over the last year, okay?
Well, I will mention that I never saw our – we had that leaked document of a Facebook employee asking the question, quote, what responsibility does Facebook have to prevent President Trump in 2017?
That story did very well for us, but I never saw it trending on Facebook. So, I mean, I would add that disclosure as well.
You know, Trump definitely gets a lot of, you know, I think he has been trending on Facebook a lot, but not every story does, is all I wanted to say there.
Right.
I mean, listen, you've got questions, and it's hard to know where the human curation is going to come in and not come in.
But I do think this whole thing speaks to how we're so at sea in our industry. And I mean the media industry. We're so at sea on who we are and what we are and who represents us and who doesn't
and where the loyalties are that we have given – why not?
Why wouldn't the audience question whether or not we know what we're doing or whether
or not we can be trusted?
The problem is it's given Donald Trump perfect ammunition.
If you don't like Trump but you also talk shit on the media, there's a real correlation
between those two things.
But by the way, the media sucks. Like, what many news organizations have done.
I mean, as an example, Cruz drops out of the race. Like, 10 minutes later, BuzzFeed tweets a gif of Cruz elbowing his wife in the face while he's hugging her, which is, by the way, very funny and horrendous and cringeworthy to watch. But it's like, Cruz drops out of the Republican race and BuzzFeed tweets a meme of him hitting his wife in the face. And it's like, ugh, Tuesdays or whatever.
And it's like, yeah, I mean, it feels like this is a slightly more important story than making the hilarious gif.
Right. You know?
And it's like, but that is what we've done. Like, we've just played into this idea that news is entertainment, that everything's up for grabs, and that, like, it's not really that important.
And who cares what the story is?
And, like, the who cares what the story is is exactly why Donald Trump is the Republican nominee.
Yeah, I think you're exactly right.
Yeah.
And so this all is part of, you know, and I of course have a grand unified theory about all this, but
like, it really is like, you know, we've given an audience that's intelligent and desirous of
knowledge and truly curious and, you know, able to be engaged. And we've given that audience
basically shit. We've been shoveling shit at them for years and years. And it's like,
why are we surprised when people don't trust us or people don't take this shit seriously? Why are we surprised when a guy like Trump rises to power? It's because
we've lost the most important thread where we can own this stuff and deliver it with some level of
seriousness and importance to an audience. And so the Facebook thing is perfectly in line with it.
That's not to say Facebook isn't an incredible opportunity for news organizations, because it is, and anybody who has a brain would utilize
Facebook in all the ways it can be utilized. But there is a fine line, and we have to be really
careful about how much we think of ourselves as just entertainment
providers and, you know, a partner to a platform that is not really built for news,
and how much we think of ourselves as journalists and newsmakers and real
storytellers.
Yeah.
So anyhow.
No, no.
But that is one of the reasons why I absolutely love Gizmodo and love working with Katie and
Alex.
It's because I feel like we are one of the few newsrooms, at least in digital media or, you know, these
digital media brands that keeps its editorial independence.
So, you know, a lot of companies have begun to see their role in technology journalism
as one that is basically, you know, it's a symbiotic relationship, I think, for a lot
of these companies.
You know, they want access to the Apple Watch, so they're going to buddy up with Apple as much as possible, or, you know, in this case, maybe it's Facebook. But really, there's a lack of
critical thought. And so at Gizmodo, you know, that was made clear. And so, you know, again, I used to
work at– You're not going to buddy up to Apple. You don't have to worry about
buddying up to Apple, do you? Oh, no, no, not at Gizmodo.
I used to work for Popular Science.
The mission statement there was basically to get people interested in science and technology.
It was a very optimistic brand.
It's a magazine, so in a lot of ways, we had to work with companies to get gadgets into the front of the book and that sort of thing.
But when I came to Gizmodo, I noticed the stark difference in the approach.
It was like, no, you should be critical.
We don't have to cut deals with Apple so that we can get into WWDC,
and we don't need a 3,000-word essay on the Apple Watch
and a video to accompany it.
We can call it shit if it looks like shit.
Right.
And so, yeah.
I think increasingly – it's funny to have been through this era of – at Engadget, if you go back, we weren't invited to the Apple events.
And then there was a period where we were invited. And, you know, everybody there came at it through blogging. This was really true gadget
blogging. You came at it through this door that was like, nobody's going to give a shit.
So you have to do everything on your own. Like, you're going to go
buy the phone. Like, I waited in line for the original iPhone at Engadget, as did seven other
people who worked there, so that we could get one, you know, and review it quickly or whatever we did.
I don't even remember what happened.
But like, it's funny to have seen this, having not been in that world – like, you see magazines, particularly these technology publications, and the relationships they had.
Like the relationships like Steve Jobs had with some of these journalists, you know, like Walt Mossberg, who I love.
It's a great example.
Like their relationship is really, it's very cozy.
I mean, they were friends, you know, and David Pogue and all these old school dudes.
And like we came at it from this totally different angle.
And it was kind of shocking to have those first experiences where, you know, Apple PR
calls you and they want you to talk about their angle.
I mean, I remember very specifically around the suicides at Foxconn.
I talked to some Apple PR people and we had long conversations.
And they weren't like, this is the story we want you to write.
But they were like, here's why we think this isn't a big deal. And it was a really long, lengthy sort of chat, like, you should think about it this way. And you could just feel: we want you to think about it this way, and we'd like it if you said it out loud. Now, that happens to a greater or lesser degree, depending on the person you're working with. But it can go way, way fucking deeper than that. Like, you know, people take junkets, right? They get paid, literally,
they get a first-class ticket and a four-star hotel and they go cover
the new Mercedes car. Like, this happens all the time. Routinely, car companies would
reach out and they'd be like, we want you to come to Dusseldorf to see our new X, but
we're not going to let you come unless we fly you there. We didn't take junkets. We didn't take meals. We didn't take any of that
stuff. And so it's really interesting to see how cozy it gets. And the Facebook coziness is a whole
different level, because Facebook actually comes to you as an agnostic, neutral partner. They're
like, we're not trying to sell anything. We just want you to reach a big audience.
And, you know, it gets very blurry. It gets very gray. And I think we're at a point now where we've got to remove the grayness
in every way possible, like for the sake of what we do as an industry, but for also the sake of
the audience and its intelligence and its understanding of what's going on in the world.
We really have to strive, like our industry has to strive to remove grayness. And I do think like this story speaks perfectly to that grayness and how subtly destructive it can be
to the things that we want to do as people in the news and what is important to the consumer
of that news. Yeah. I mean, I think the technology that we use on a day-to-day basis has matured in a lot
of ways, right?
It's been a long time since people were frantic over the iPhone, or even since just the iPhone
was launched, right?
It's been almost 10 years, right?
Right.
So I think-
Yeah, it'll be 10 years next year.
Yeah, which is crazy. It's really nuts, yeah.
So, you know, it's come a long way.
I think that the relationship has changed.
Before, this was seen as, you know, it was a gadget.
It was a toy.
It was something that...
It was like just an accessory to life.
It was like sort of...
I don't know.
It was fun.
I mean, to be totally honest, you know,
I read a lot of that stuff that you were writing
at Engadget at the time, and it was fun to read about. I mean, I was much younger. I was excited about this stuff.
But, you know, a lot's happened since then. Edward Snowden has come forward, right?
And, you know, the Foxconn stories – the working conditions of the people that are building these products have been revealed.
And the people that are keeping gore and violence and bestiality
off of our Facebook feed –
we've learned more about the fact that there are humans behind that.
And so there are so many, what I would call, I guess, milestone news events that have happened
since the launch of the iPhone and maybe even prior to that, since the, you know, people started using the Internet more on a day to day basis.
And so now I think, you know, the onus is on the technology journalists.
If you're going to call yourself a technology journalist, you can't just have an opinion about a gadget.
I mean, I'm just so tired of people playing that tune and basically saying, like, oh, well, I want to talk about the graphic design of the new Instagram app logo.
It's like give me a fucking break.
I mean I think those stories are fun because I'm a design nerd.
But I think that the reality is like this is actually very much like –
That makes you a design – I mean it makes you a design critic, not a technology journalist.
Right. But I think that what you're saying is so core to, you know, at least at the start of The Verge, when we talked about what it could be and what it would be. And, you
know, now it's like, I was actually joking – Max Read was on the podcast a couple of weeks ago.
And, you know, he's doing Select All at New York Magazine. It's like, oh, it's a sort of
intersection of culture and technology or whatever. And it's like, hey, that's what we used
to say at The Verge. And we used to get a lot of shit for it when we were trying to do these more ambitious
stories about what this really means. And very quickly, to me,
what was clear about the technology that everybody had in their pocket was we were going from
wow, this is such an amazing gadget, to what does it mean that it does this or that? Like, what
does it mean that now I can summon a car,
a black car, wherever I am?
Like, and how does that change?
It's a shame that The Verge doesn't publish articles
like that anymore because, you know,
I think that the world needs a little bit more analysis
like that critical thought.
And I, you know, to be frank,
I just haven't seen a lot of that on The Verge.
Well, I can't, I mean, you know, it's a different beast.
I mean, The Verge is much larger than it used to be, and I'm not
going to comment on that. But I'll say, just generally, in my
diet – and this is a lot of the stuff that I've been working on for the last few months, which
I can hopefully talk about very soon – I think there's a lack of that kind of critical,
deep work just across the industry. Not The Verge specifically, but a lack of critical, deep, interesting, curiosity-inducing, and important
forms of storytelling. And I think there's a huge opportunity for that. And I do think,
by the way, Gizmodo – you can say, oh, fuck Gawker and this Hulk Hogan thing,
and I don't like you guys and I didn't like this story or whatever. But the reality is, when you've got a real story, when you've got a real scoop, and when you're giving people something that they're not hearing anywhere else and that they need to know, it doesn't matter where it comes from.
I mean not that it doesn't matter, but it's like the story wins, right?
Yeah.
And this story very clearly won to me.
And that was one of the things I tweeted about the other day.
I was like, let's, you know, let's take a minute to just like recognize the importance of this.
That the fact that like, this is a great fucking story and it's real, it's really done well.
And it's really important that we all pay attention to it.
And so, you know, to me, that's like really encouraging because what it really says is
there's an audience that wants to know more.
They want to know better.
They want to know, they want to understand their world better.
And they want to know when shit is not the way it should be.
And so whether it's at Gizmodo or somewhere else,
there's going to always be an audience for that.
So that's encouraging to me.
Yeah, I was very encouraged by the end of the week to see that a particular reporter at BuzzFeed did a good story about how the NDAs are what's preventing an open conversation about this. Facebook has these non-disclosure agreements that it asks the curators to sign that have no expiration date.
When Mark Zuckerberg comes out in front of everyone and says, yeah, we want to have an open discussion, it's like, well, dude, release the fucking NDAs from your own employees, from your own workers.
But then, that would be very bad for them. That's the reason NDAs exist.
Yeah, exactly. Well, you know, I was happy to see that. That was a story that
Gizmodo could have written, but we were so deep in this foxhole that, you know, we weren't
able to shed light on that. So I was happy to
see that story told. And I'm sorry that I forgot the reporter's name, but, you know, that person
deserves credit. Well, and also, I think BuzzFeed's tech guys, Mat Honan and John
Paczkowski and a couple of other people there, have actually been doing some really good
tech journalism. And in fact – speaking to your thing about gadget opinions –
they did a review of, you know, I think it was either the watch or the new iPhone or something.
And it was just like, yeah, okay. It's a new iPhone. Like, here's what it does. Like, yeah, you're going to go buy it, 'cause you like an iPhone.
It was this very matter-of-fact, who-gives-a-shit-at-this-point review, you know? The reality is, like, yeah, who does give a shit about the screen?
I mean, to me, it's like, we've been with these things for 10 years now.
We had this explosion.
I've talked about this a ton, I'm sure, on other podcasts.
But I see our whole world has changed because of these things.
And there was this thing I used to talk about at The Verge. I used to say, technology is the lens and the mirror, right?
All of this technology is the lens and the mirror through which we see the world and we see ourselves.
And so forget about the technology for a second.
Yeah, there's going to be moments where you want to talk about VR and new cars and whatever that shit is.
And there's a reason to talk about it
and to get like granular.
But there's also, and I think an increasingly large reason
to talk about what we see through those lenses.
And so anyhow, so that's,
I think you and I are probably very aligned on that.
But it's like, it's an interesting moment because all – even Vanity Fair has to change the way it covers the news.
Like you can't just be Vanity – you have to talk about those lenses, right, and what you see through them.
And so it's interesting to see everybody adapting.
It's interesting to see how like different parts are going to move at different speeds. This Facebook piece of it, the speed of how it evolves is
probably going to change in the near future based on our relationship, based on the media's
relationship with an entity like that. So it'll be interesting to be both a part of that
and to be covering it, which is what you're doing.
I mean, to be frank, I am excited. You know, again, this was a great story. I'm happy that people
found it to be interesting. But, you know, there are things that I find to be more interesting
that I hope I can go public with at some point soon. I'm sure you will. Anyhow, Michael,
this is great. Thank you so much for doing it. I actually thought we
were ending, and then we went down a whole other rabbit hole, which is great, and which I
really enjoyed. You have to come back when you do your next scoop. And you
know what we should do? We should bring Katie, who I love, and Alex, who I love, and we should all get
in a room and just jam on this conversation. I'm sure it'd be a lot of fun.
Anyhow, thank you again. And we'll talk to you soon. Cool. Thanks so much.
Well, that is our show for this week.
We'll be back next week, of course.
And as always, I wish you and your family the very best.
Though your family has just become a trending topic on Facebook.
And it wasn't curated.