The Journal. - He Thought Instagram Was Safe. Then His Daughter Got an Account.
Episode Date: November 9, 2023
Former Meta engineer Arturo Bejar thought he could help make Instagram safer after his daughter experienced harassment on the platform. But Bejar said that his concerns were not sufficiently addressed by senior leadership at the company, and that teens are still at risk for harassment and bullying on Meta's platforms.
Further Listening: The Facebook Files
Further Reading: His Job Was to Make Instagram Safe for Teens. His 14-Year-Old Showed Him What the App Was Really Like.
Transcript
In 2021, Arturo Bejar decided to do something he never thought he'd do.
I decided to write the email that in my entire career as a security professional, I was taught not to write.
He sent an email to the top bosses at the company he was consulting for, putting
into writing his concerns.
The email is addressed to Mark Zuckerberg,
Sheryl Sandberg,
Chris Cox, and Adam Mosseri.
The company was
Meta Platforms, which owns Instagram
and Facebook.
Arturo is a computer scientist,
focusing on safety and security.
Actually, if I get the chance, I would really love to read out the email to you.
Sure, yeah, let's do that.
Dear Mark, I wanted to bring to your attention what I believe is a critical gap in how we as a company approach harm.
In his email, Arturo wrote that Meta wasn't doing enough to keep teens safe from harassment,
and that the company wasn't fully aware of how bad the problem was.
Today, we don't know what percentage of content people experience
as misinformation, harassment, or racism.
We have done great work driving down prevalence,
and there will always be more to do.
But what if policy-based solutions only cover a single digit
of what is harming people?
The letter ended by saying,
I am appealing to you because I believe that working this way will require a culture shift.
I know that everyone in M-Team deeply cares about the people we serve
and the communities we're trying to nurture,
and I believe that this work will be in the service of that.
How did you feel after you pressed send on that email?
I hoped, but I was afraid.
And I wrote this email in such a way that they knew, right?
They knew how bad it is.
But I was afraid that they wouldn't do anything.
Today, in an exclusive interview,
Arturo talks about his attempts to protect teenagers on Instagram
and Facebook, and how he felt his concerns were pushed aside.
Welcome to The Journal, our show about money, business, and power. I'm Ryan Knutson. It's
Thursday, November 9th.
Coming up on the show, the Facebook engineer who tried and failed to make Instagram safer for teens.
Arturo Bejar has been interested in computers
since he was just a kid growing up in Mexico.
As a teenager, he worked at IBM,
and he even got to meet Steve Wozniak, the co-founder of Apple. Wozniak later paid for Arturo to study abroad in London when
he was 19. He gave me the opportunity to go study abroad, which I couldn't have done when I was
growing up in Mexico; my family's circumstances wouldn't have made that possible.
When I asked him how I could pay him back for his generous support,
he said, well, just do good.
And that really has informed all of the work that I try to do in technology ever since.
Arturo worked in information security at Yahoo for more than a decade.
And in 2009, he became one of Facebook's first big external hires. His job was to focus
on safety and security, often the safety and security of kids and teens. When Arturo joined
Facebook, he was optimistic about the impact the platform would have on kids, including his own.
You have one daughter? How many kids do you have? I have a daughter and a son. Uh-huh.
And when they were born,
what did you think technology would mean for them in their lives?
It was a wonderful time.
I came to the Valley in the 90s,
and my kids were born around 2000 and change.
And I think at the time,
there were all of these wonderful possibilities
about what the internet would do for them, right?
The ability to have information
at the tip of their fingers
instead of needing to find it in a library
or different ways.
And I was also,
ever since I started working in technology,
really fascinated about the possibilities
about what you could do when you use technology to connect people to each other.
And so I was very excited for them to have technology in their lives.
One of the projects Arturo worked on was a way to help teens deal with bullying.
When Arturo first joined Facebook in 2009, the company was still in the early stages
of trying to figure out what should and shouldn't be allowed on the platform. The company didn't
allow bullying, but it wasn't easy to figure out what bullying actually was. So Arturo helped build
a tool that allowed users to report posts that they considered bullying. Users could flag a post and then explain to Facebook why they thought it violated the
rules.
But when the company did that, they realized that a lot of the posts people considered
to be harmful weren't very obvious.
And we found that only half of them would look to you and me as something that you would
act on.
I'm going to hit you after school.
I'm going to fight after school.
And then the other half were like, man, you just wore a really great sweater today.
Or man, wearing contacts, not glasses.
And those were sort of just like insults that could only be understood in the context of what had happened at school or something like that?
That's exactly it.
It's somebody who's been bullied or harassed or teased at school,
and then they get home, and then they see that somebody's posted for everybody to see the thing that they've been teased about.
Right. Love that sweater, bro, or something like that,
that actually is a big insult to them.
Exactly.
But an outsider, you know,
an automated reviewer or even an artificial intelligence tool
might look at that and say,
oh, that's a very lovely compliment.
Exactly, exactly.
So that tool, would you say that it worked?
Yeah, because it worked from the perspective of,
like, I believe that the responsibility
that social media companies have towards teens is to give them tools to navigate these issues and, where possible, communicate to the people that are creating the content about the impact that they had on others. What it did well 10 years ago, right, was it allowed a teen to name what was going on for them.
And they found it helpful.
So at the end of the tool, we would ask them, does this help you with what was going on?
And we got good results around that.
And that's how you should measure those tools.
It's like, did they help you at the time as you were dealing with the thing?
In 2015, Arturo left Facebook.
He was going through a divorce and wanted to spend more time with his kids.
So when you left Facebook in 2015, how were you feeling about how the platform was moderating bullying
and the kind of environment that it was creating for teens and young people?
I was feeling good about it because I felt that what had been built up to that point was a good start for the work that was needed.
And so while there were still many things to do, I knew that if somebody had a bad experience,
that there was something that they could turn to that would be able to help them deal with the bad experience that they were having.
And so I felt that I would be happy for my kids to have those tools once they were of age to be using these products, which they weren't at the time that I left Facebook.
Arturo left Facebook feeling good, but that feeling didn't last long.
It all started to change after his daughter got an Instagram account.
Well, there were aspects of that that are absolutely wonderful.
There are aspects of that that are absolutely harrowing,
and it really didn't need to be that way.
That's after the break.
Arturo's daughter got an Instagram account when she was 14.
And one of the things she liked to post about was her new hobby, working on vintage cars.
It was something she and her dad would do together.
One of their big projects was restoring an old Porsche.
Her account got pretty popular, and she developed a following.
But then came the harassment.
We're at home, and she came to me crying.
And I was like, what's going on?
And she goes, like, Dad, I'm like, what?
I shared this post of the cars that I'm working on,
and somebody told me to get back to the kitchen.
And what can I do?
And I'm sitting there, and I'm like, I feel powerless, right?
Because get back to the kitchen is not something that you would write a policy about that you would remove.
But how do you sit there and look at your daughter who's crying and upset and tell her that there's nothing she can do?
And so she said, can you help me, Dad? Can you help me deal with these things?
And there was nothing I could do.
But it was more than just that. She received tons of misogynistic comments and sexual advances in her direct messages. People talked about her breasts and sent her penis pictures.
She would report the comments to Instagram,
but oftentimes the company would reply by saying it didn't explicitly violate the platform's rules.
And while she could block users that sent her inappropriate DMs,
those users didn't seem to suffer any consequences.
How did that make you feel to see that this was happening to your daughter?
Well, I felt that if there was one person that could help her with this, it should be me.
Because after I left
and there were all of these things that had happened,
I spent many years defending Mark and Sheryl and the company
under the thesis that every time
that I ran across a significant issue
or they asked me to look into something,
I would go up to their desks and say, oh, we just found that there's this thing and it's affecting
this many people. And they would always tell me, okay, do whatever you need to do. Tell me how I
can support you. And so in my first stint, I had a very good experience. And so when I saw these
things happening, there was a part of me that deeply felt like they don't know.
I'm sure they don't know how bad it is.
They don't know that teens are experiencing this.
So I think what I need to do is I need to go back in.
So Arturo called up his old colleagues and shared his concerns.
The company responded by offering him a job.
After four years away, he came back to Meta in 2019
as an independent contractor
and worked on what was known as the well-being team.
But when he returned, he was surprised by what he found.
Most of the work he'd done during his first stint at Facebook had been scrapped.
It became clear that all of the lessons that we had learned building the flows for teens in 2013, the things that we've talked about, were all gone.
There was no memory of it. There was no record.
People didn't know that this had been done.
So initially, they brought me back in to share that experience again with the team that was working on these things.
Facebook disputed Arturo's characterization of what happened to his work.
The company said it made changes to try and educate users.
Another thing that surprised Arturo was that it looked to him
like the company wasn't really focused on the problems teen girls were facing.
I realized that the important question was,
like, why isn't there a goal or a set of people dedicated to the issue of teenagers who
receive unwanted sexual advances over direct messaging. Everything that's important to the
company has a number, a goal, a team assigned to it. And there wasn't a goal based on teens'
experience. And not only that, but I remember, as I was trying to figure out,
why are they not working on these things?
Why are they not working on the kind of harassment and misogyny
that I was seeing my daughter experiencing?
During the four years Arturo was gone,
the company shifted its focus to more of an automated review system.
It started relying more on artificial intelligence
to identify posts that violated the rules
rather than having humans review posts that users had flagged.
And according to Meta, this system was extremely effective.
For instance, the company says that if an average user looks at 10,000 posts,
only eight of them would contain content like bullying or harassment.
The company calls this metric prevalence. Arturo thought this number wasn't accurately capturing
the real experiences that teens were having. So he and his team created a survey. The questionnaire
was called BEEF, short for Bad Emotional Experience Feedback. It was a recurring survey of issues that 238,000 users had experienced over the prior seven days.
And the survey delivered very different results.
By Arturo's measurements, teens were saying they witnessed bullying
100 times more often than Meta's statistics would suggest.
According to the survey, in the past seven days,
26% of users under the age of 16 recalled having a bad experience due to witnessing hostility against someone based on their race,
religion, or identity. More than 20% felt worse about themselves after viewing others' posts,
and 13% had experienced unwanted sexual advances.
In other words, even though Meta was saying that only a small number of posts actually violated the company's rules,
users were telling Meta that they were experiencing it much more often.
And is the reason that prevalence may not be picking up
these other experiences of bullying,
is that because of what you were sort of talking about earlier,
how someone might comment "nice sweater, bro" on a post, which a user may perceive as bullying,
but the system that's counting prevalence wouldn't count that as a rules violation,
and it has to be a rules violation in order to be counted in the prevalence metric?
That's correct.
That sounds like a really big gap in
what Facebook is able to see about the experiences that its users are having. Correct. And in the
case of an unwanted sexual advance over DMs, that doesn't show up in prevalence anywhere. There's no
metric to track that. And not only that, but I mean, as I'm going through this, there's not a button that
anybody can use that my daughter or her friends or anybody else can use to say,
this is an unwanted sexual advance over DMs.
But why is the block button not sufficient there?
Because there are so many reasons why people block each other, right?
So you can block people when they're being annoying,
and then unblock them when they stop being annoying.
Another thing you could do is you could take the block button,
and if somebody blocks somebody, make the next step be like,
why did you block that person?
And then you separate out, well, because it was a sexual advance,
or because they were harassing me or just being annoying.
The thing is, in order to make these things better,
the systems need to have the information
that allow them to identify the people
who are initiating these unwanted sexual advances.
So if there was a button that a teen said,
oh, that's an unwanted message, why?
Because it's gross, right?
Or some language that works for them.
Then that will allow you to identify
the people who are initiating these messages.
And then maybe after, like, I don't know,
three messages where that feedback is given,
you tap them on the shoulder and you go like,
hey, Pat, I just, you should know
that this is not an environment in which,
like, you kind of send those unwanted advances.
It's not the place for that.
And the thing about building that is if that person continues the behavior,
then you can escalate and feature block,
and you know that somebody actually might be a predator.
There's a lot more information that comes from somebody continuing that behavior
after you give them the feedback.
And if you just have block, you don't get any of that and the system doesn't get better.
Why do you think it's Facebook or Instagram's responsibility to do that, to tap the user on
the shoulder and say, this is unwelcome? I think some people might say that that
is sort of overly paternalistic for a social network to sort of set those kinds of guidelines and
boundaries. And that's just something that people ought to work out or the perpetrators' parents or
whomever should be the one to intervene and not Facebook or Instagram. So you can look around and
see people that you find attractive and then try to send them a message. I think with that comes a responsibility to give those people tools to be safe.
In a statement, Meta spokesman Andy Stone said
the company's been working on tools like these
both before and after Arturo left.
For instance, he said the company warns users
before posting comments that the platform's automated systems
would likely ban.
And when someone wants to send a direct message
to a prominent content creator,
they're given a reminder to be kind.
At the time, Arturo took his concerns
to the top brass at Meta,
including Chris Cox, the chief product officer.
I asked him,
do you know what percentage of 13 to 15-year-olds
said that they were the target of bullying
in the last seven days?
And he knew the number off the top of his head,
the approximation of the number.
And I asked him a couple more, and then he knew the numbers off the top of his head.
And I remember being, like, so taken aback by that
because, like, if you know these numbers,
you'd think that would be the thing you get up and go to work and deal with every day.
Because you have a responsibility. I mean,
you've got most of the active population on the internet on your
products, including hundreds of millions of teenagers. You have a responsibility to provide
them with a safe environment. And so we talked with him about the numbers,
and I said some of the things that they could do.
And he was like, oh, those are interesting.
Thank you for bringing this to my attention.
I will talk to the person that's now in charge of integrity,
and thank you for bringing this to my attention.
How did you feel after leaving that meeting?
I was heartbroken.
I mean, at the same time as this was happening,
Frances Haugen, a Facebook whistleblower, came out.
All of her documentation came out.
And I saw all of these things being talked about,
about, like, teenagers and body image issues on Instagram or the product.
And I saw that the company was minimizing their own research
while I'm sitting in the team and we're talking about these issues.
The company said Cox did not brush off Arturo's concerns.
It also said that Frances Haugen lacked expertise on teen mental health issues
and that the research she cited didn't prove the company's products were a net negative for users.
After his meeting with Cox, Arturo shared his findings with the rest of Meta's higher-ups
in that email we talked about at the beginning of this episode.
And how did Mark Zuckerberg, Sheryl Sandberg, and Adam Mosseri respond to your email?
So Sheryl wrote back a very kind email saying,
you know, I've been told to get back to the kitchen,
and I know how awful that feels.
I really feel for your daughter experiencing these things.
Thank you for writing such a thoughtful note.
It really shows that you care about the company.
But no meeting, no follow-up, no action.
Did you get a meeting or have any other follow-up with any other executives?
Yeah. So a few weeks later, around two or three weeks later, I got a meeting with Adam Mosseri,
who is the head of Instagram. And Adam understood the issues. He thought these were good ideas, thanked me for them. I believe he said they would work on them.
But Wall Street Journal reporting shows that two years later, the problems Arturo identified
remain unresolved. Meta disputed that the company had rejected Arturo and his colleagues' ideas.
In a statement, Meta spokesman Andy Stone said, quote,
It's absurd to suggest that we only
started user perception surveys in 2019, or that there's some sort of conflict between that work
and prevalence metrics. He added that the company found value in each approach and that it continues
to work on the issues. Arturo hasn't stopped sounding the alarm. Earlier this week, he testified before Congress.
Thank you for the opportunity
to appear before you
and for your interest
in addressing
one of the most urgent threats
to our children
today, to American children
and children everywhere.
Arturo testified before Congress
earlier this week.
He said Meta could do more to make kids safer on its platforms.
Congressional leaders expressed outrage at his findings.
And what you have brought to this committee today is something that every parent in
America needs to hear. The numbers are really stunning.
If you could go back in time to when your daughter first asked you if she could get an Instagram account,
do you think that you would respond differently now,
given everything that you've been through?
Yeah, I wish it wasn't part of her life.
I can see the good and the bad,
but I think that the bad is just awful.
So I wish she wasn't on Instagram.
I wish that she was working on cars
and had a good, safe community
that she could be a part of,
where she could connect with other people
about what she was doing.
And I wish she wasn't learning
what she's needing to learn
from being with these tools the way they are today.
Well, Arturo, thank you so much for your time.
I really appreciate this conversation.
Thank you.
That's all for today, Thursday, November 9th.
Special thanks to Jeff Horwitz
for his reporting in this episode.
The Journal is a co-production of Spotify
and The Wall Street Journal.
The show is made by Annie Baxter,
Kylan Burtz,
Catherine Brewer,
Maria Byrne,
Victoria Dominguez,
Pia Gadkari,
Rachel Humphries,
Matt Kwong,
Kate Leinbaugh, Jessica Mendoza, Annie Minoff, Laura Morris, and Enrique Perez de la Rosa.
Our theme music is by So Wylie.
We've got a new episode coming tomorrow in our series, The Trial of Crypto's Golden Boy.
Thanks for listening.
See you tomorrow.