Your Undivided Attention - Why the Meta Verdicts Are a Big Deal (And What It Was Like to Testify)

Episode Date: March 26, 2026

In two landmark cases, juries in California and New Mexico found Meta and Google liable for creating addictive, harmful products and failing to protect children from exploitation and abuse. These verdicts signal that the era of tech impunity may finally be closing. State attorneys general are finding ways around the broad immunity of Section 230, seeking not just fines but changes to the design of these products. Our very own Aza Raskin testified at the New Mexico trial as a fact witness, drawing on his firsthand experience as the inventor of infinite scroll, one of the core mechanics of addictive design. In this episode, Tristan and Aza discuss what it was like to take the stand for tech justice, what the companies knew and when, and why the real significance of these cases lies not in the dollar amounts but in the injunctive relief still to come. In the 1990s, a series of landmark cases held Big Tobacco accountable for the harms of their toxic products. This could be that moment for social media.

RECOMMENDED MEDIA
Further reading on the New Mexico trial
Further reading on the California trial
Arturo Béjar's "Broken Promises" Report

RECOMMENDED YUA EPISODES
What if we had fixed social media?
Jonathan Haidt On How to Solve the Teen Mental Health Crisis
Social Media Victims Lawyer Up with Laura Marquez-Garrett
Real Social Media Solutions, Now with Frances Haugen

Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.

Transcript
Starting point is 00:00:00 Hey everyone, this is Tristan Harris, and welcome to Your Undivided Attention. Today, we're going to be talking about two verdicts that were handed down against big tech companies in two major lawsuits. One was in California, where Meta and Google were found to have been negligent and to have failed to warn users about the addictive design of their products. That trial involved a young woman who alleged these products contributed to her deteriorating mental health and body dysmorphia. And in the other case, in New Mexico, Meta was found liable for failing to protect children from exploitation and abuse on their apps. And actually, our very own Aza Raskin of this podcast testified at the trial.
Starting point is 00:00:42 Hey, Aza. Hey, Tristan. So we all know that in the 1990s, there was a moment when a series of big cases held the big tobacco companies accountable for the harms of their toxic products. And we really feel like this might be the big tobacco moment for social media, and it represents a real opportunity not just to hold these companies accountable with fines, but to win actual, you know, injunctive relief and design changes that would create a better future. So today I wanted to take a moment to talk about the New Mexico trial, because you were involved in it
Starting point is 00:01:10 and because it involved a lot of real details about how the companies operated under the hood, how they thought about user safety, you know, beyond the verdict and the damages. So, Aza, you went to New Mexico. Tell us what happened when Mr. Raskin went to Santa Fe. When Mr. Raskin went to Santa Fe, I had to buy a suit. But this case was brought by New Mexico Attorney General Raúl Torrez back in December 2023. And what's significant is that instead of trying to tackle this via Section 230, which, if listeners
Starting point is 00:01:46 remember, is the rule that says that platforms are not responsible for the content that users post, the Attorney General instead went after Meta for violating New Mexico's Unfair Trade Practices Act, for failing to protect children on their apps, Facebook and Instagram, from abuse and exploitation. So essentially, what the New Mexico Attorney General did was a kind of undercover operation, where they created fake profiles of underage users and then saw what experience they had on the platform
Starting point is 00:02:15 and what they found was that these underage users were immediately flooded with really horrific, inappropriate exploitation and abuse content, sexual grooming, that kind of thing. So this is basically the New Mexico AG. They create a fake profile. They say, I'm 12 years old, and they just simulate
Starting point is 00:02:32 that. And then immediately, they watch those accounts get inundated with all these messages from predators, basically. Yeah, predators, and being shown body dysmorphia content, like thinspiration kinds of content. Essentially the worst kinds of stuff, and even though the kids are underage, Facebook just shows it to them. And what's important to hear is that this shows that Facebook knew what they were doing and did it anyway. And they did it in search of engagement, of user numbers. So this is willful. And so what was the verdict of this trial? Yeah.
Starting point is 00:03:08 So, you know, the jury actually didn't have to deliberate for very long. They deliberated for just two days, and they found Meta maximally liable. So, you know, this sort of shows you the limits of the law as it stands, because while the jury fined them the maximum amount they're allowed to, that only amounted to $375 million in civil damages. And that's just not a big deal to these big companies. And so what's much more interesting is that they are going for injunctive relief. And that means the court can now go back and tell Meta that they have to change their product in specific ways.
Starting point is 00:03:46 And they can force META to change their product in specific ways that might really hurt engagement. And so this can really matter. It's not just a cost of doing business. This can be something more existential. If you could explain for listeners, why is it that $375 million is the maximum amount that they can do? I'm not exactly an expert on this, but essentially for every person in the lawsuit, the maximum amount of damages that the jury can ask for is $5,000. And that's what they asked for, and that's what they got. Maybe it's important to back up and just notice that for listeners, $375 million is just a cost of doing business.
Starting point is 00:04:21 Like, if I'm meta, I just am knowingly printing money in the meantime for many, many, many years, billions of dollars, billions and billions and billions of dollars, knowing that a fine like this is coming. and when it comes in and it's only 375 million, I don't have to treat this as a fine. I can treat this as just a fee. And I think why the injunctive relief is so important is because if you actually have forced design changes, like, for example, maybe you can't auto-play videos anymore, or maybe youth accounts just simply can't receive messages
Starting point is 00:04:50 from people across the network. I don't know. There's a lot of changes that could be made here that would lead to a different outcome, and so that seems like the next most significant part of the trial. Yeah, that's right. And just to put $375 million, into perspective for what that means for a company like META,
Starting point is 00:05:04 which is to say not very much. Meta is offering new employees to their superintelligence lab, something like $300 million. So you can see this really doesn't matter in a dollar amount, and that's why the injunctive relief changing the product. One example, we've talked about on this podcast, Tristan, the idea of a latency sanction or latency tax, that we know that latency, or rather, how fast a page loads is directly correlated to retention and engagement.
Starting point is 00:05:37 So if courts decide, say that an appropriate remedy is adding a very small amount of time delay to page loads, we're talking like 100 milliseconds, 200 milliseconds, this is less than around human reaction time. It's really subperceptual, but it gives users a little bit of that feeling of sitting on an airplane with bad Wi-Fi. and you go to Twitter or Facebook or Instagram and it loads a little slowly you decide to do something else. It's that. It's just adding a little bit of friction
Starting point is 00:06:05 at the point of use, which drops the number of overall users by some really actually significant amount. This is Amazon finding for every 100 milliseconds their page load slower. They lose 1% of revenue. And so this gives court a fine-grained tool for saying, depending on how bad Facebook has acted,
Starting point is 00:06:25 they can dial up the frills. a little bit, just the amount of time that it takes for a page to load. And because Facebook's core incentive is number of users is engagement, this gives courts a tool to directly sort of like punch back at the business model engagement, like punch them where it hurts. And then as Facebook starts to do better as their numbers go up, as the harms go down, you can lower over time the amount of sort of friction that's being added. And this is really, really exciting.
Starting point is 00:06:55 This is not going through the legislative system, which is slow and probably is not going to give us anything. This is going through the court system, which can move quickly and ongoingly. And that's the opportunity that sits in front of it. That's why we think it can actually be the big tobacco moment and not just a little wrist slap fine. I just want to reinforce for listeners not to sort of to horn here, but all of this was predicted by the incentives that A's and I laid out in 2013. And I want you to really, really hear that, because if you know the incentives you can understand and predict, all of these behaviors, this totally avoidable societal catastrophe. And sadly, we had to wait until this lawsuit for some of those changes to happen.
Starting point is 00:07:33 And my deep hope is that lawsuits like this help further the kind of immune system of society and culture to get more ahead of AI rather than wait until the lawsuit happened a decade later. Let's just talk a little bit more about the trial. And one of the critical things about it is the discovery process. They looked at Meta's internal documents. So I remember actually going on the television show, CBS this morning. morning, which is like one of the biggest American television shows. And I talked about, they asked me a question about Facebook making this change to make communication between people end-to-end encrypted. And they said,
Starting point is 00:08:09 is this a good thing or a bad thing? And I said, well, one of the reasons that I think Facebook is doing this is because if they encrypt messages, then they don't actually know what's being sent between people and they're not liable if they can't look. And I suspected that a lot of that had to do with trying to avoid liability. We heard the Facebook CEO, Mark Zuckerberg, said that Facebook will become a privacy-focused social network. Does that make sense to you? How do you interpret that? Well, the issue is that they're trying to avoid liability. I'm sure there's many things that there are good reasons for doing what they're doing. But when they move all this, you know, the Russian hacking, the pedophilia stuff, all this stuff that's going on in these different groups, when suddenly they're inside of private groups, it's not their responsibility.
Starting point is 00:08:56 Wow, I never thought of it like that. So once it's encrypted, they don't have to be responsible for telling, you know, the FBI or whatever, we knew this was happening because they can't know. And then this actually did come out in some of the documents as part of this discovery process. Can you talk about this, either? Yeah, that's indeed exactly what Facebook was going for. In 2019, Meta's head of policy, Monica Breckert said, you know, when she was talking about it internally, that, quote, we are about to do a bad thing as a company, that this is so irresponsible.
Starting point is 00:09:26 and their head of global safety in an email had said that Facebook allows pedophiles to find each other in kids. So it's very clear that Facebook was making sort of a cynical decision to encrypt to avoid liability versus taking a moral stand to do something right. And they just sort of like encryption washed it. So we get a lot of this behavior of if we don't look, we can't see it. We can't be doing it. and even when we do look, we're not going to do anything. So a couple of the ones that really hit me was 2018, the VP of Integrity Guy Razen at that point sent an email internally that said,
Starting point is 00:10:04 we know of the scale of the problem that there are a whole bunch of direct messages being sent to our underage kids for grooming and solicitation, but we're not doing anything. Just a year later in 2019, another VP emailed Zuckerberg personally saying, I just need 24 additional staff members to study this kind of problematic. use and build like tools. And the answer came back, nope, we're not going to change anything. From the CFO, Susan Lee. From the CFO specifically, yeah. Reporting on behalf of Zuckerberg. And as I read in one of the documents, in a 2020 chat between meta employees, one meta employee asked the
Starting point is 00:10:41 other, quote, what specifically are we doing for child grooming? And the other meta employee responded, quote, somewhere between zero and negligible. Child safety is an explicit. it non-goal this half of the year. And then in 2021, our now friend Outuro Behar, who led product safety at Facebook, he sent an email to Zuckerberg directly saying, hey, the company was deeply undercounting unwanted sexual advances towards minors.
Starting point is 00:11:09 And Zuckerberg didn't even respond. And this all just shows. So you're like, okay, so Facebook new, Facebook new, and then in 2022, what did Facebook do? They completely slashed their integrity and responsibility team and eliminated 100 positions. So Facebook was taking the viewpoint of the more we know, the more reliable. Let's just get rid of the problem by getting rid of the people pointing at the problem.
Starting point is 00:11:33 So, Is it maybe just to take people inside the courtroom for a moment, like what was it actually like? There you are, you know, with the jury, with Metas Lawyers sitting across from you. I'm sure with their, you know, their eyebrows furrowed and angry at you. What was it like to be in that room? Yeah. Well, first to say, like a lot of going to court is hurry up and wait. like get down there, get to court, and then you're put, or I was put into a little side room with no windows and just flickering overhead fluorescent light.
Starting point is 00:12:03 And then you just have to wait until you're called up onto the stand. And actually one of Facebook's tactics was to drag out the cross-examination of the people in front of me. And what the lawyers from our side was said is that Facebook was intentionally trying to make it so that I couldn't testify by like running. down the clock and make it so that it just would have to come back day after day after day, which was interesting. I didn't know that that was a tactic, but it is. The other really interesting thing, like, so you get in there, you sit down in the stand
Starting point is 00:12:38 and you are talking to the jury, but I was called as technically what's known as a fact witness and not an expert witness. So I was testifying from my own experience about my invention of infinite scroll. And so that means any time that I would talk about everything that we know about Tristan, like the effects of incentives, the Facebook lawyer would say, like, objection, the lawyers would approach the bench, and they would turn on a white noise machine, which the jurors hated, so that we couldn't even hear what the... They turned on a white noise machine?
Starting point is 00:13:13 Yeah, they turn on a white noise machine so that the lawyers can talk with the judge about whatever it is they're talking about, and I can't hear. and the jury can't hear. And it's very annoying. And it would happen every, you know, like three to four minutes during my testifying. In fact, any time that I'd start to get on a roll, Meadow would call for this kind of thing and they'd go up and they'd try to break the flow. So there's just a lot of like tactics and counter tactics happening at the object level,
Starting point is 00:13:45 even before we get into the content. You know, just on one of the other interesting moments is it turns out that Facebook, has been tracking Center of Humane Technology for a very long time. And actually, yeah, what was discovered in Discovery is they had a 2018 funding deck of ours that like you and wrote and I wrote and Randy wrote. And Facebook tried really hard to keep that funding deck from getting admitted as evidence, but our side prevailed. And there's a great moment where they had me read out our original funding deck
Starting point is 00:14:21 where it could say things that I couldn't say on the stand about what the effects against teens were against democracy was, because I was just reading a document. And Facebook tried to use their white noise generator many times during that. But in the end, you could just see, as I described what Infant Scroll was,
Starting point is 00:14:41 that even I as the inventor who know exactly how it works and how it sort of removed stopping cues to get you to use the product more and I could really see that land in the jury. And you just see that in this case, the jury was very skeptical of Facebook. And Facebook sort of had to fall back in their cross-examination of me.
Starting point is 00:15:00 There's sort of a funny moment when their lawyer was trying to pin me down. They said, like, so you're the inventor of Infinite Scroll, right? I'm like, yes. And they're like, do you know that just a couple months before you invented it, somebody else had published a blog post about inventing Infinite Scroll? So you're not the real inventor of Infinite Scroll, right?
Starting point is 00:15:18 And I was like, oh, that's a good. Well, that's sort of great, actually. Like, I get to absolve a little bit of my guilt. But, and you can see it sort of deflating out of them as their line of attack didn't work. But that's the level that Facebook was resorting to because they didn't really have an argument. They don't have an argument. Yeah. And honestly, my feeling sitting up there, and it's intense being cross-examined like that.
Starting point is 00:15:38 You have to just, like, breathe and remember, like, why you're there. So, Aza, just curious, what was your personal reaction hearing this verdict? The feeling was twofold. One was a kind of relief and an excitement because, one, it's just so obvious. But to have New Mexico, and now the case in L.A., both just hand out the obvious of Facebook being guilty and having known that they're making addictive products, that's awesome. That's a moment to be celebrated. And of course, the other side of it is being like, well, if it just stays at the $375 billion fine, then all of this doesn't really matter that much. We have to get the next step,
Starting point is 00:16:27 the injunctive changes to product. That kind of brings us to the last question, which is where is going from here? What's the next step? Well, of course, META is going to appeal and the case in California ends well. So we're going to have to wait and see what happens there. But the real significant moment is what kind of injunctive reliefs, what kinds of of penalties the court gives to META. And that has the chance to be incredibly significant.
Starting point is 00:16:53 And it's very important that this gives precedent for Facebook being found accountable. It gives precedent for ways that the courts can route around Section 230. And it is incredibly important because for the first time, we might have product level changes that can actually affect engagement. And for those who are interested, we have covered all of this before for the last decade. If you go back to our early interviews with Francis Hogan, if you go back to our interview with Arturo Bihar, we've been talking about these issues for such a long time. What gives me hope is that this verdict is finally happening. This lawsuit is coming due, and real accountability is happening.
Starting point is 00:17:31 And we just hope that the next, you know, move the court case with these injunctive relief actually makes design changes. So the material reality that our kids are living in in their psychological environment is not dictated by this kind of cynical behavior. Yeah, the phrase that came up in court again and again and again was too little too late. Facebook could say, we're adding stopping cues. You're scrolling a lot and you'll say, like, take a break, trying to capture people in hot versus cold states. Again and again, expert after expert, get up and just be like, too little, too late. Too little, too late. And this lawsuit might end up being too little too late except for this injunctive relief,
Starting point is 00:18:11 which means that it could be like enough too late. Yeah, I hope that they include in that no auto playing videos across the board that would just make auto playing videos opt in. Suddenly, all the brain rot economy just goes down by at least 50% overnight if there's no auto playing videos across, not just one app, but all of them. That's exactly it. Yeah. It just takes a little bit of latency. Just like a little bit of friction, like it gives you your agency back. Well, I also just want to thank you for testifying.
Starting point is 00:18:41 Thank you for doing this. It's so important. I really hope that this moves forward and onward and upward to the next things. It's a true honor to be there. Your undivided attention is produced by the Center for Humane Technology. We're a nonprofit working to catalyze a humane future. Our senior producer is Julius Scott. Josh Lash is our researcher and producer.
Starting point is 00:19:06 And our executive producer is Sasha Fegan. Mixing on this episode by Jeff Sudakin, an original music by Ryan and Hayes Holiday. And a special thanks to the whole Center for Humane Technology team for making this show possible. You can find transcripts from our interviews and bonus content on our substack and much more at humanetech.com.
Starting point is 00:19:25 And if you liked this episode, we'd be truly grateful if you could rate us on Apple Podcasts or Spotify. It really does make a difference in helping others join this movement for a more humane future. And if you made it all the way here, let me give one more thank you to you
Starting point is 00:19:38 for giving us your undivided attention.
