Today, Explained - Zuck your feelings

Episode Date: January 16, 2025

Meta is going MAGA. New York magazine’s John Herrman explains Mark Zuckerberg’s makeover. Writer Ben Wofford introduces Meta’s policy puppet master. This episode was produced by Amanda Lewellyn ...with help from Travis Larchuck, edited by Amina Al-Sadi, fact-checked by Laura Bullard, engineered by Andrea Kristinsdottir and Rob Byers, and hosted by Sean Rameswaram. Transcript at vox.com/today-explained-podcast Support Today, Explained by becoming a Vox Member today: http://www.vox.com/members Mark Zuckerberg, CEO of Meta Platforms Inc. during an event. Photo by David Paul Morris/Bloomberg via Getty Images. Learn more about your ad choices. Visit podcastchoices.com/adchoices

Transcript
Starting point is 00:00:00 Mark Zuckerberg is in his cool era. He's letting his hair grow out, he's wearing black t-shirts with a gold chain, he covered Get Low with T-Pain. That's him singing. Mark Zuckerberg is also in his MAGA era. He's throwing a party at Trump's inauguration next week. He went on Joe Rogan to say companies need more masculine energy. He's ending Meta's DEI initiatives. He's taking tampons out of the men's bathrooms at his offices.
Starting point is 00:00:40 He's getting rid of the non-binary and transgender themes on Meta's messenger app. But perhaps most important of all, he's changing Meta's content moderation and fact-checking policies. We are going to poke around the new Zuck Your Feelings metaverse on Today Explained. This week on The Gray Area, how are digital devices changing us? We've become more machine-like and I think the exhibit A for that is how young people, for example, talk about their sex lives in machine-like terms, performative terms, in ways that actually have shaped their understanding of what an intimate sexual relationship even should be, what it should look like, what it should feel like.
Starting point is 00:01:26 Listen to The Gray Area with me, Sean Elling. New episodes every Monday, available everywhere. Hey, it's Andy Roddick, and I'm not just a former tennis player. I am a tennis fan, a tennis nerd. I just can't stop watching it. I can't stop analyzing it. I can't stop analyzing it. I can't stop talking about it to anyone that will listen,
Starting point is 00:01:47 which is why I started my podcast, Serve with Andy Roddick, now a part of the Vox Media Podcast Network. On the show, we talk about everything from new up and coming players to the champions dominating the narrative to whatever's on my mind. This January is the Australian Open
Starting point is 00:02:02 and you know I've got some thoughts. So tune in for our Australian Open coverage, fine served wherever you get your podcast or on our YouTube channel. Content moderation and fact checking on Facebook and Instagram is kind of like oxygen. You can't see it, but it's out there and it's essential to your user experience. It's getting rid of all the illegal material, the hateful material, and the spam. John Herman has been writing about the changes Metta's making to content moderation for New York Magazine. And if you're like, content moderation is boring, a reminder that without it we have
Starting point is 00:02:48 seen real world political violence. Exactly. And the fact checking piece was intended to sort of close a little bit of a loophole that existed with news content where if you know false or inflammatory stories about say an ethnic minority in a country going through political strife were going viral again and again and again they could feed into real political violence and and have. While the persecution of the Muslim minority continued for years the picture changed drastically once Facebook entered the fray in 2012.
Starting point is 00:03:25 Anti-Muslim and anti-Rohingya memes and propaganda have spread through Facebook, eroding support for the Rohingya's plight. And you know, in 2016 there was a lot of domestic pressure on Facebook to address similar issues. During the last three months of the presidential campaign, fake or false news headlines actually generated more engagement on Facebook than true ones. People actually believe a conspiracy theory that Hillary Clinton and her former campaign manager John Podesta ran a child sex ring at a pizzeria in DC. This is a lie.
Starting point is 00:04:02 To borrow Facebook's language, it was creating a less authentic environment, which is an incredible euphemism for a place that was just full of garbage. So for a while, the critics of Meta and Facebook and Facebook and Meta were sort of aligned that is no longer true. That is very pointedly not true. Now Zuckerberg is killing this program. Zuckerberg posted a video explaining his reasoning. What did you make of the video?
Starting point is 00:04:34 He looked so good with his hair and his t-shirt. Hey, everyone. I want to talk about something important today because it's time to get back to our roots around free exposure. You know, you see this video, and truly, if you haven't been watching this closely, it is crazy. It's like, okay, Mark Zuckerberg, you know, hoodie guy, plain shirt guy, Caesar haircut
Starting point is 00:04:53 guy. He's got curly hair. He's got a gold chain. He's big now. He's got a little bit of a tan. Like, okay, he's something's going on here. But a lot has happened over the last several years. There's been widespread. And you know, it's funny, but it's also a signal.
Starting point is 00:05:07 It's sort of slightly right-wing coded. It's more of an obvious performance of masculinity, circa 2025. Before these platform changes, I think there was a tendency to treat this as just like a personal rebrand, maybe like an early midlife crisis type thing. But now in hindsight, you know,
Starting point is 00:05:28 we can sort of understand this as perhaps part of a, you know, more personal and authentic political transformation, or at least a sense of personal freedom, catharsis of, you know, getting ready to sort of tell everyone to just deal with it because we're going to do what we want now. So we're going to get back to our roots and focus on reducing mistakes, simplifying our policies and restoring free expression on our platforms.
Starting point is 00:05:56 And then he starts using terminology that again is slightly right wing coded. He's sort of complaining about the quote unquote legacy media. After Trump first got elected in 2016, the legacy media wrote nonstop about how misinformation was a threat to democracy. He's talking about how he was being sort of pushed around and bullied by the Biden administration. And that's why it's been so difficult over the past four years when even the U.S. government has pushed for censorship.
Starting point is 00:06:23 In this announcement and then elsewhere on posts on threads and on the Joe Rogan experience, talking about how, you know, maybe if people are going to leave over these changes, they're just virtue signaling. Society has become very like, I don't know, I don't even know the right word for it, but it's like, kind of like neutered or like emasculated. And so, you know, probably the most striking thing about this video is how, on one hand, it's really familiar. This is Mark Zuckerberg after an election sort of laying things out and saying, you know, we are listening, we are working on this, we're trying to fix things.
Starting point is 00:07:04 His audience is just different now. It's a different group of people. It's not a critical press or potential regulators that he thought were important in 2016. He is now sort of, you know, looking in the imminent future and saying, all right, like, how can we work with you? I'm looking forward to this next chapter. Stay good out there and more to come soon.
Starting point is 00:07:27 And tell us exactly what the new policy is. Is it just we're going to let you guys hash it out in the comments? Kind of. So the two lanes for this are, one is that the fact checking program is being discontinued. This got sort of like top billing from Zuckerberg, but the bigger changes are to Facebook's basic and much broader moderation systems. So there are a few new carve outs. You are allowed to use more dehumanizing speech about transgender people, immigrants, you
Starting point is 00:08:05 are allowed to more broadly use harsher language in your interactions on the platform. We do allow content arguing for gender-based limitations of military, law enforcement, and teaching jobs. We also allow the same content based on sexual orientation. Mark Zuckerberg says that they will be rolling out a community notes style program. If you've been on X through Elon Musk's sort of takeover and remaking of the site, you'll know that they have a system that sort of allows users to weigh in on posts and say, you know, this is true, this is misrepresented, this is not true, the posts then carry this tag, things like that. It's an interesting and frankly kind of useful feature
Starting point is 00:08:47 on X, but it is not nearly up to the task of broad platform moderation. It tends to be slow, and I think the circumstances on Facebook, for example, are much less conducive to a good community notes program. We'll see what they build, but it is, I think, a partial replacement at best for the fact-checking program that existed before, which was already not doing a whole lot.
Starting point is 00:09:19 You brought up the transition Twitter made when Elon took over. My experience as a user of that regrettable platform is my feed started getting more confusing, quite frankly. There was more spam coming in to my DMs. There were more verified users who were just, you know, random people who wanted to amplify their voices. It got harder to tell misinformation or even disinformation from reliable information. I started seeing porn in my feed more often. Like the whole thing just got messier. Is that what people should expect from their experience on Facebook or
Starting point is 00:10:09 Instagram right now? In some ways, I think yes. And what's funny is I've had the same or similar feelings about the transformation of X. One thing that kept coming to mind is that this, you know, platform, which was certainly always flawed and full of all kinds of stuff that you didn't necessarily want to see or whatever, you know, it was always a complicated product, it felt kind of familiar.
Starting point is 00:10:37 It kind of felt like pre-2016 Facebook, where you're just scrolling around, things are kind of out of order Like literally not chronological. You don't know where things are coming from why you're seeing them. It's just sort of like an unstable But in some ways very engaging environment. So in rolling those back there's there's a return Potentially to this version of Facebook that the company left behind nearly ten years ago And yeah, the most useful current comparison is certainly X, which in some ways is probably doing very well in the eyes of its owner,
Starting point is 00:11:15 but is used by far fewer people, is now a sort of fairly hostile political environment for a lot of its previous users. It is far less useful in, for example, a disaster, like the fires in LA County or recent hurricanes. It is just full of untrustworthy information from untrustworthy people who are often there with malign ends to misinform, to make money, to spam.
Starting point is 00:11:46 It's a different kind of place. Euphemistically, it's rougher on the edges, it's rowdier. Functionally, it just doesn't work as well. And if you take seriously Elon Musk's commitment to free speech, if you take seriously Mark Zuckerberg's sudden commitment to free expression, maybe you can conceptualize this as just a trade-off. But the reality of these platforms
Starting point is 00:12:10 is much more complicated than that. They are not built with enabling free speech in mind. These are commercial advertising and subscription platforms with tons of restrictions on what you can do and what you can say, and that fundamental fact hasn't changed. The sort of flavor of censorship is what's changing. ["Sensorship is What's Changing"]
Starting point is 00:12:35 John Herman is a tech columnist and intelligenser from New York Magazine. You can read and subscribe at nymag.com. There's one guy over at Meta who's in charge of getting the flavor of censorship just right. And his name's not Mark, it's Joel. We need to talk about Joel next on Today Explained comes from Noom. Many a weight loss plan takes a one size fits all approach without taking into account your
Starting point is 00:13:20 individual needs. Things like dietary restrictions, medical issues, any number of factors might influence the best way for you to lose weight. Noom says they do things differently. According to Noom, their approach is personalized around your psychology and biology to meet you where you are without restricting what you eat. They've even published more than 30 peer-reviewed scientific articles describing their methods and effectiveness. Our colleague, Phoebe Rios, here at Vox, got to try out Noom and let us know how it went.
Starting point is 00:13:50 I feel like the plan Noom created was catered to my individual needs. It was very thorough. I felt like the questions they asked, I hadn't even asked myself, like what time I get out of bed in the morning, and if I eat with my phone in my hand. It was very helpful and very, very educating of how I spend my day.
Starting point is 00:14:11 You can stay focused on what's important to you with Noom's psychology and biology based approach. You can sign up for your trial today at Noom.com. Support for the show today comes from Vanta. Trust isn't just earned, Vanta says. It's demanded. Have you demanded someone's trust lately? Whether you're a startup founder navigating your first audit or a seasoned security professional scaling your governance, risk, and compliance program, proving your commitment to security is critical and complex.
Starting point is 00:14:48 And that's where Vanta comes in. You know the deal. Vanta says they can help businesses establish trust by automating compliance needs across 35 frameworks like SOC 2 and ISO 27001. They say they can also centralize security workflows, complete questionnaires up to five times faster, and proactively manage vendor risk.
Starting point is 00:15:13 You can join over 9,000 global companies like Atlassian, like Cora, and Factory who use Vanta to manage risk and prove security in real time for a limited time. Our audience can get $1,000 off vanta at vanta.com slash explained. That's v-a-n-t-a.com slash explained for $1,000 off. Support for the show day comes from Indeed.
Starting point is 00:15:41 It says that we're halfway through January, and that means it's way too late for anyone to be telling you Happy New Year. Rude. I like to say Happy New Year into May. But you know what else is too late? Hiring the right person for that open position from 2024. Luckily, there's Indeed.
Starting point is 00:16:01 Indeed, you can stop struggling to get your job posting. Indeed says their sponsored jobs help you stand out and hire fast. With sponsored jobs, your post jumps to the top of the page, which can help reach the people you want faster. There's no need to wait any longer. You can speed up your hiring right now with Indeed. And listeners of this show will get a $75 sponsored job credit
Starting point is 00:16:29 to get your jobs more visibility at indeed.com slash today explained. You can go to indeed.com slash today explained right now and support our show by saying you heard about Indeed on this show. Indeed.com slash today explained. Terms and conditions apply. Hiring?
Starting point is 00:16:46 Indeed is all you need. You're listening to Today Explained. Sean Rommes from here with Ben Wofford, who considers himself a Kaplanologist, which is to say he's written a lot about a guy named Joel Kaplan for places like is to say he's written a lot about a guy named Joel Kaplan for places like Wired and Business Insider. Honey Fareed, who's a professor at UC Berkeley calls Joel Kaplan the most influential person
Starting point is 00:17:14 at Facebook that most people haven't heard of. There's no question that the things that happened at Metta are coming from Mark, but there's also no question that there has been a change over the last- So Kaplan for the last 15 years or so has had this extremely important role at Facebook. And formally, his role has been to forecast and manage policy risk. Functionally, his role in the last 10 years has grown to be as sprawling basically as Facebook's reach itself. And it involves overseeing a prolific lobby in Washington DC, which is managing relations with the federal government and state capitals.
Starting point is 00:17:53 And he leads Kaplan a team of about a thousand policy staff worldwide in Facebook, shaping and massaging and sometimes thwarting the international laws and regulatory bodies and policies that graze any part of Facebook's enormous business. But it's this third role that has made Kaplan so controversial, and that is helping design and arbitrate Facebook's policies on political speech, which have changed so much and so dramatically over the last 10 years. Ben says Joel Kaplan is a Forrest Gump type figure. He went to Harvard, he was a good progressive
Starting point is 00:18:34 college student, but then the Gulf War starts and he finds himself feeling more conservative. He graduates, enlists, goes to law school and comes out a proper Republican. Clerks for Antonin Scalia at the Supreme Court, becomes best buds with Brett Kavanaugh. And then he joins up with George W. Bush, serves all eight years in the Bush administration, and then he gets out and he's like, what's next? And that's just when his old pal from Harvard, Sheryl Sandberg, calls him up and offers him
Starting point is 00:19:03 a job. Kaplan's role for the first three years, he's one of a number of elder statesman types surrounding a younger Zuckerberg who is increasingly realized that the reach of his company is going to be entangled in policy matters in Washington. Senator we run ads. I see. It's during this period, you call it, sort of from 2011 to 2016, that Kaplan, if not a mentor, is sort of described by colleagues
Starting point is 00:19:36 as sort of an older brother figure to a younger Zuckerberg. He's accompanying Zuckerberg to tech summits in the Obama Oval Office. My name is Barack Obama and I'm the guy who got Mark to wear a jacket and tie. By the time Kaplan comes out of those eight years in the Bush White House, he's got a reputation as a real bipartisan impresario. So Kaplan is a certain breed of Bush conservative that is open-handed and warm and interested in bipartisan compromise and it's part of why he's so prolific and such a valuable asset
Starting point is 00:20:14 to any lobbying operation or company, but especially to Facebook. There are lots of these moments where Facebook is growing, it stumbles on some kind of tripwire of conservative politics it didn't know was there. And the company sort of frantically looks around and says, who do we have who's like a singular Republican operative who can help us with this problem? And over and over and over again, the answer is just Joel Kaplan, Joel Kaplan, Joel Kaplan. But the real hinge moment comes in 2016. I mean, this is the first real crisis that Kaplan solves. And it's sort of a foreshadow of events. But it's a famous episode in May 2016, still known inside Facebook as sort of the Gizmodo
Starting point is 00:20:59 affair. Gizmodo publishes an article alleging that Facebook's trending topics widget is biased against conservative media publishers. The CPAC conference, for instance, you know, as that was going on, that was not allowed to trend in Facebook's trending news feed. And conservatives are outraged. Forget leaning in, does Facebook lean left? Republican Senator John Thune.
Starting point is 00:21:23 In comes Joel Kaplan for the rescue. Kaplan calls an old friend who's working on the Trump campaign and he designs this summit at Menlo Park where he's going to bring in these conservative media heavyweights, you know, more than a dozen of these big-name guests. They include Tucker Carlson and Glenn Beck and Dana Perino. And they get sort of this VIP treatment. They include Tucker Carlson and Glenn Beck and Dana Perino. And they get sort of this VIP treatment.
Starting point is 00:21:47 Zuckerberg gives a seminar where he explains to them the problem, what they're doing, how they're going to solve this and sort of finesse and massage and charms them. And Kaplan, of course, is preparing the summit, briefing Zuckerberg, walking him through the talking points. And it works. When the conservatives kind of come back from the summit, the consensus is that Kaplan sort of put out this four alarm fire. The Trending Topics widget controversy showed three things.
Starting point is 00:22:19 One that there were these political landmines that Zuckerberg and Facebook might not realize exist. Two, that Kaplan was the person that could navigate Zuckerberg and the company around them, and three, just as often as not, those types of landmines were about content and speech and the speech product. And so if you thought of this as a unified problem, right, you would want one person to be in charge of a unified solution. And that point person more or less becomes Kaplan.
Starting point is 00:22:47 How unusual is it for a tech company to have a, you know, individual go from essentially top lobbyists to, you know, top policy advisor, top policy programmer for the platform? Smart people and scholars who think about the architecture of the internet and social media really encourage people to step back and look at Facebook and think about how unusual it is and how not obvious or self-explanatory it is that the person who would be in charge of your political lobbying and policy operation is also largely in charge of crafting and designing the policies around content and speech. I think the one inside story that really summarizes Kaplan's role and influence happens in 2017,
Starting point is 00:23:39 and that's with a really radical proposal called Common Ground. So after 2016, there's this shock about the election and how ugly it was. And Common Ground has these big ambitious goals, all about reducing polarization with a cocktail of what they call, quote, aggressive interventions. They're going to downrank ugly incivility and optimize for quote good conversations and upregulate that kind of discussion. And it's all about the algorithm. So the new algorithm was going to recommend users join more politically diverse groups, for example.
Starting point is 00:24:17 It was going to reduce the viral reach of hyper active hyper partisan users. And the common ground team is really juiced. They're excited. They've hung posters around the office in Menlo Park that have their motto on it and say things like, reduce polarization or reduce hate. And then Common Ground runs into Joel Kaplan. And Kaplan's policy team grills these programmers
Starting point is 00:24:43 and project managers with questions. Questions not just about how it's going to be perceived by users, but how the changes will be experienced and perceived by political stakeholders. And with Trump in office, Facebook is much more sensitive to how any changes, even neutral nonpartisan changes like common ground might be perceived by politicians or media persona who have a big megaphone and can generate a political crisis and headache for Facebook. So in the end a few of the tweaks of common ground got through but in the end almost all of common ground
Starting point is 00:25:19 was scrapped and put on the shelf and never saw the light of day. Okay Ben you've helped us get to know this shadowy figure at Facebook at Metta and put on the shelf and never saw the light of day. Okay, Ben, you've helped us get to know this shadowy figure at Facebook, at Metta. He's been lurking around our government and our platforms for decades. But what does all of this mean for the next four years of Metta, Mark and Donald? So to me, Kaplan's professional life and his corporate values at Facebook suggest to me that there's almost no limit to the necessities and prerogatives of survival that Kaplan can't find a way to accommodate. I guess a different way of putting this would be,
Starting point is 00:26:12 Zuckerberg's donating a million dollars to the inaugural committee or going to Mar-a-Lago or bringing an MMA executive onto the corporate board. Those are really obvious, jarring ways that we can see Zuckerberg more than almost any other, not only tech company, but really any other major corporation in the United States. Facebook has managed to stand out in subjecting itself to the coming Trump wave. And Kaplan's appointment to lead global policy is to me actually the example of all of those things. Kaplan's singular achievement, I think of the last eight years, is finding a way to accommodate the brash ugliness of MAGA Washington and MAGA conservatism with the elite, burnish, and professionalized corporate values of Facebook and the corporate world.
Starting point is 00:27:13 The next four years of Trump is going to be what Kaplan does best, which is just an era of serious and profound accommodation of Facebook or by Facebook of Trump. You know, if you can think of all the unsavory ways that an empowered Trump might want to use Facebook for illegitimate ends, Kaplan is going to be the person in charge of figuring out a way to accommodate Trump and MAGA conservatism as far as it can go and pushing the breaking point further and further before it becomes untenable for Facebook. Ben Wofford, he writes for whomever he pleases. Most recently it was Business Insider. The piece was titled Magga's Man Inside Meta.
Starting point is 00:28:13 Businessinsider.com. Amanda Lou Ellen produces for today Explained, Amina Alsadi edits, Laura Bullard is our senior researcher, Andrea Christensdorcher and Rob Byers mix it up. Goodbye for now. I think I read it wrong. All right, hang on. We allow targeted cursing, defined as terms or phrases calling for engagement in sexual activity, or contact with genitalia, anus, feces or urine. We allow targeted cursing, defined as terms or phrases calling for engagement in sexual activity, or contact with genitalia, anus, feces, or urine, including but not limited
Starting point is 00:29:07 to, suck my dick, kiss my ass, eat shit. I think I got it. We allow targeted cursing, defined as terms or phrases calling for engagement in sexual activity. We allow targeted cursing, defined as terms or phrases calling for engagement in sexual activity or contact with genitalia.
