Offline with Jon Favreau - 224: The Big Tech Critic Trump Is Trying To Deport
Episode Date: February 28, 2026. Imran Ahmed, CEO of the Center for Countering Digital Hate, joins Offline to talk about the horrifying trends his team has unearthed across social media platforms…and how it's put him in the crosshairs of the Trump Administration. To date, Imran has weathered multiple lawsuits, stood up to Elon Musk, and won. But now, the State Department is trying to get him deported back to the UK—just for publicizing how platforms are hotbeds of bigotry and self-harm content. He and Jon talk about how Section 230 of the Communications Decency Act is a cancer on our democracy, why tech oligarchs view the rest of us as NPCs, and how the “things” Silicon Valley is moving fast and breaking are actually our own children. For a closed-captioned version of this episode, click here. For a transcript of this episode, please email transcripts@crooked.com and include the name of the podcast.
Transcript
Offline is brought to you by Quince.
A well-built wardrobe is about pieces that work together and hold up over time.
That's what Quince does best.
Premium materials, thoughtful design, and everyday staples that feel easy to wear and easy to rely on, even as the weather shifts.
Quince has the everyday essentials I love with quality that lasts.
You've got organic cotton sweaters, polos for every occasion, lighter jackets that keep you warm in the changing season.
The list goes on.
Quince works directly with top factories and cuts out the middlemen, so you're not paying for brand markup, just quality clothing.
Everything's built to hold up to daily wear and still look good season after season.
Plus, they only partner with factories that meet rigorous standards for craftsmanship and ethical production.
Love Quince.
Got some of those Mongolian cashmere sweaters over the winter.
But now it's getting warm, so probably could use some polos, maybe a lighter jacket here and there.
Love quince.
Going to go on the website soon.
Check out what they got for the spring.
Refresh your wardrobe with quince.
Go to quince.com slash offline for free shipping on your order and 365-day returns.
Now available in Canada, too.
That's Q-U-I-N-C-E.com slash offline.
Free shipping and 365-day returns.
Quince.com slash offline.
Well, that's it.
Late-night talk shows are dead.
Not.
The Adam Friedland Show is breathing new life into the talk show format
with weekly episodes featuring some of the biggest stars on planet Earth.
Utilizing his razor-sharp wit and non-traditional good looks,
Adam Friedland goes beyond the surface-level routine of the average interview program
and digs into the core.
of who people really are.
People like,
hmm, let me think,
ever heard of Zohran Mamdani?
How about Sarah Jessica Parker?
Alec Baldwin ring a bell?
Geez, Adam.
See, you didn't mention me,
Jon Favreau.
Were you on the show?
Also on the show.
Oh, you were on the show.
Me and clavicular.
Uh-huh.
Just on the same program.
Mm-hmm.
Anyway, new episodes every Tuesday,
subscribe to the Adam Friedland show
on YouTube or anywhere you get your podcasts.
And I do think about,
like Mark Zuckerberg
sitting in the back of a blacked-out limo, being driven, you know, around San Francisco,
driving past homeless encampments and just thinking, these are a subspecies to me.
Now he's in a bunker. Exactly, in a nuclear bunker.
Or watching January the 6th happen and thinking, well, you know, oh gosh, I guess the plebs are
rebelling. Maybe I should dial it down a little bit. That's where the sort of the sinister and
sociopathic element of move fast and break things become so clear to me. These guys do not
care about us as human beings. They see us as NPCs in a game in which they are the lead character.
I'm Jon Favreau, and you just heard from today's guest, Imran Ahmed. Imran runs the Center for
Countering Digital Hate, an organization that does lots of excellent research on social media
platforms and lots of work to hold tech companies accountable for the harm they're causing.
He's the kind of guest we've talked to on the show many times before. But what makes Imran unique
is that Donald Trump is trying to deport him.
Imran is a legal permanent resident of the United States.
He's married to an American citizen.
He's the father of an American child.
And the only reason the State Department announced in December
that he's no longer welcome in this country,
his country,
is because he published research about X,
the hellscape formerly known as Twitter,
that made Elon Musk angry.
That's it.
I talked to him about his fight to stay in America,
as well as the very important work he's doing at the Center for Countering Digital Hate.
We also talked about reforming Section 230,
the legal statute that protects social media companies from being held accountable
for the things posted on their platforms,
how big tech has destroyed our trust in democracy,
and where he actually sees potential to change Big Tech's influence on our politics.
It was a great conversation, and one that made me very thankful
that Imran is out in the world doing this work.
Here's Imran Ahmed.
Imran, welcome to Offline.
Nice to be here, John.
So you run a tech accountability organization called the Center for Countering Digital Hate,
which has been doing fantastic work since 2019, I believe.
Before that, you worked as a strategist in British politics for the Labour Party.
But a lot of people heard about you for the first time a few months ago
when the Trump administration suddenly tried to deport you,
even though you are a legal permanent resident of the United States
who's married to an American citizen with an American child
and another on the way.
What happened?
So it's not really just a story about an attempt to deport me.
For the last six years, I've been running an organization that uses data science, communications, public advocacy, and lobbying to try and persuade the world that we have a serious problem with social media.
That is creating a range of harms.
We were one of the first organizations to sort of approach this, not from a tech perspective,
but from a, look, these are these systemic, wide harms and try to catalog impact on our kids,
impact on individuals, impact on communities, impact on our democracy, and then show them
in really vivid ways.
And we've been really successful in doing that.
I mean, some people will have heard of us through our work, like the Disinformation Dozen,
which showed that 12 people produced 65% of the disinformation that was spreading during the pandemic.
Some people will have seen our study Deadly by Design, which showed that 2.6 minutes after opening an account as a 13-year-old girl on TikTok, our researchers were being served self-harm content. After eight minutes, eating disorder content. Every 39 seconds on average in the first 30 minutes of opening an account. So, you know, those sorts of research studies get huge amounts of attention. And that pisses off some really powerful people. So Elon Musk sued us a few years ago, and we beat him. He's a lot. He's a lot.
He sued us for $10 million saying that we'd cost him $100 million.
We made all of his business relationships fall apart, which is something I'm very proud of.
Also something that, I mean, you can take some credit for, but he did a good job making those
business relationships fall apart himself.
It's so funny.
The way I described it was our job is to hold up a mirror to these platforms.
And, you know, like you and I, when we see ourselves in the mirror and we don't like what we
see, we brush our teeth, you know, brush our hair, whatever it is.
you know, go and get some Botox
what he did was
sue the mirror and say, no, I don't
look like that, I'm much better looking.
And then, so we've had this sort of,
you know, there's been a systematic
opposition to CCDH
and our work from big tech.
But then what changed
on December the 23rd was
that there was a new escalation
in that. So we see this
as really about big tech and the big
money that they've been spending in Washington.
That image of them
lined up on the dais behind the president after spending millions of dollars, hundreds of millions
of dollars on trying to influence politics. And I got a text message linking to a tweet. I clicked on it
and it said, we are banning five Europeans from entering the United States because of their work
on tech accountability. And only one of those five lived in the United States. And as you say,
I'm what's called a green card holder. I'm married to an American.
She's from Oklahoma, so she's, you know, American American, fast cars and guns.
That's kind of her two hobbies.
And my children are American.
So that has been, that was my Christmas.
You found out via tweet from the State Department?
It was a tweet.
You know, that's how things work, right?
What was their argument for why you and the four others were banned from the United States?
So, I mean, a junior political appointee in the State Department has been out talking about this quite extensively, and her rationale for it is you wouldn't allow someone to enter the country if they said that they were trying to hurt an American company.
And so why would we let someone stay here?
Now, let's leave aside that that's nonsense.
And this is the same political appointee who, when Grok went crazy over Christmas, was in Britain explaining that a little bit of discomfort for women in being forcibly nudified is the acceptable price of freedom. Those are taken from her words on a show.
This is a State Department official?
It's a State Department official.
So, I mean, you know, you've got this extraordinary situation where they're confidently
asserting we're going to deport someone because we don't like their advocacy.
And I mean, what was lucky for me was that two weeks earlier, they'd already trailed this
because whoever's behind this is relatively undisciplined, they'd briefed Zeteo,
you know, Mehdi Hasan's online left-wing news site, and said, we're going to expel this guy because he criticized X specifically.
And the five people that they targeted had all had entanglements with Mr. Musk.
And that's where we can see his fingerprints all over this.
But they said, we're going to deport him.
So I'd already assembled a team of lawyers.
So I have, you know, Roberta Kaplan, who won United States versus Windsor, leading to the defense of, you know, gay marriage, thanks to Roberta Kaplan.
Chris Clark, who's a really, really well-known litigator in New York; the ACLU, so Anthony Romero
sort of brought his team on board; and Norm Eisen and Democracy Defenders.
Love Norm.
Yeah, so we have a great team together for this.
And so when it came in, we just kind of went, okay, let's fight.
What was going through your mind in the hours and days after you found out?
And what was your family's reaction?
You will know, like having worked, you know, both of us have worked in politics, and we both work for people in senior leadership positions, you in the ultimate leadership position.
And there's no drama, really, when crisis hits, and certainly not for me; you don't get to those positions by being someone who falls apart in a moment of crisis.
So I'm personally wildly compartmentalized. I'm somewhat dissociative, which means that I just, I was very calm. I went downstairs to my office.
started calling people. I made arrangements. I told my wife what was happening. I took her aside.
I think the only moment when I really felt a deep sense of responsibility and fear was when I asked
her: if you want, if you think it's best for the family, my first job is to you. And so if you want me to
resign today and beg for forgiveness from Elon Musk, I will.
And she said, I don't think so, but let me think about it.
And I went downstairs, carried on making calls, and then she came down at maybe an hour later.
She'd left a post-it note on my desk, which I kind of vaguely spied.
And then I finished a meeting with some lawyers, and I looked at it.
And it said, I love you.
Fuck these guys.
That's awesome.
So, you know, I married very, very well.
Yes, you did.
Yes, you did.
And that gave me my permission to do what I had to do,
which is to put together a team and to get working on it.
So that night I stayed at home.
The next day we filed in the Southern District of New York
for a restraining order, a temporary restraining order,
and for an injunction to stop the government from detaining me.
See, our fear was, John,
that their plan was to actually break down my door
to physically detain me, arrest me, detain me,
and transport me to Louisiana,
or another favorable jurisdiction where they've got good judges.
And they have in the past detained people for months.
Still are.
Away from their friends, their family, their support networks.
Legal residents, green card holders.
Exactly.
Citizens in some cases.
And law-abiding green card holders, physically detaining them for months on end
before they even bring a notice to appear before an immigration court.
So we knew that was a possibility.
And what was most important, and has always been important to me, is that we must never let
the bullying get in the way of doing the job. I am genuinely, you know, I'm extremely motivated.
Just a few weeks ago, I was on the Hill lobbying on the sunsetting Section 230 bill with
Joseph Gordon-Levitt and senators from both parties, but also parents who've lost their kids to social
media. And every time I meet them, a little chip comes out of my soul. And now I am
close to broken by how many times I have met people who have suffered immeasurably because of the
business decisions taken by greedy plutocrats who run social media companies, and their
indifference, their sociopathic indifference, to the harm that they cause in their zeal to
move fast and break things. You know, the things they break are our kids, our democracy.
And so, you know, I'm very, very motivated. And we filed at 7 p.m. on the 24th of
December. At 12:49 a.m. on Christmas Day, I'd forgotten to put my phone onto silent, I got a text
message from our lawyers saying, we got it. So a judge, in an act of unbelievable kindness, spent Christmas
Eve staying up, working to protect the constitutional rights of someone he'd never met, would never
meet again, but was a law-abiding legal permanent resident of the United States. So, you know,
people ask me, like, do you think, oh, America is a terrible place, look at what's happening to you. And
I'm like, no, look at what's happened to me. It's a great place.
I mean, that is a kind thing to say, and an optimistic thing to say. And look, I'm biased
because my father-in-law is a federal judge. But I do think, in the last year, the number of federal judges, especially at
the district level, who have stood up, and in some of their opinions just stood up for the rule of law,
and not just in a quiet way, but in a very committed way, and been very verbal about how they feel
about this. And even, like, you know, judges appointed by Trump, by Bush, by all of them.
It has been one of the few inspiring parts of the reaction of the last year.
It's really interesting to me that, like, so, you know, people like Elon Musk and some of their
advocates, keep asking, like, what motivates this guy? Like,
who's behind him? They keep thinking there's a conspiracy. Like, there's one that's
been flying around that I'm a Secret Intelligence Service agent, that I'm an MI6 agent. And yes,
I am quite good at wearing suits. But I'm also incredibly talkative and really indiscreet.
So, like, I would make a terrible spy. I know I would. But, you know, they keep wondering.
And I think it's because they don't understand the nobility and the immense pull of public service.
They don't.
Of doing the right thing, of serving your community and your country, putting yourself at the service of other people.
And so when judges do that, every single judge has signed up to be in public service.
It is the impunity and arrogance of social media giants, some of the biggest companies on this planet, some of the richest human beings to have ever existed.
And I think that that's infected the rest of our economy.
It's infected our politics.
It's infected our perceptions of what's normal and admirable in behaviour.
And it's so sad because actually the nobility is found in the magistrate,
working in a small community whose job it is to make sure that the right decisions are taken,
to protect that community, to protect the individual, to protect victims,
to balance these different rights.
That is the absolute core of what I think is good about, you know, democracy, America, people.
Yeah, yeah.
Offline is brought to you by Sundays.
We all love the idea of feeding our dogs real fresh food.
But the reality is that fresh dog food usually means taking up freezer space, time to thaw and prep,
then a lot of mess when you serve it.
Get the good without the hassle with Sundays.
Sundays was founded by a veterinarian and mom, Dr. Tori Waxman,
who got tired of seeing so-called premium dog food full of fillers and synthetics.
So she designed Sundays.
Air-dried, real food made in a human-grade kitchen using the same ingredients in care you'd
use to cook for yourself and your family.
Every bite of Sundays is clean and made from real meat, fruits, and veggies with no kibble. That means no weird ingredients you can't pronounce
and no fillers. Compared to kibble or other brands out there, Sundays invests 50 times more
in its ingredients to ensure premium quality because your dog deserves food made with care,
not in the interest of cost cutting. And the best part, you just scoop and serve. No freezer,
no thawing or prep, no mess, just nutrient-rich, clean food that fuels their happiest,
healthiest days. So you get more of them to share together. Leo loves Sundays. We've had the other
dog food where you have to, it's sort of messy.
You put it in the fridge and the freezer and you thaw it out and it's just a real pain in the ass.
And Sundays is really easy to serve for your dog. They also love it. And it's healthy too.
So make the switch to Sundays. Go right now to SundaysforDogs.com slash offline 50 and get 50% off your first order or you can use code offline 50 at checkout.
That's 50% off of your first order at SundaysForDogs.com slash offline 50.
SundaysForDogs.com slash offline 50 or use code offline 50 at checkout.
Offline is brought to you by Delete Me. Delete Me makes it easy, quick, and safe to remove your personal data online at a time when surveillance and data breaches are common enough to make everyone vulnerable.
Delete Me does all the hard work of wiping you and your family's personal information from data broker websites.
Delete me knows your privacy is worth protecting. Sign up and provide delete me with exactly what information you want deleted, and their experts take it from there.
Delete Me sends you regular personalized privacy reports
showing what info they found, where they found it, and what they removed.
Delete Me isn't just a one-time service.
Delete Me is always working for you, constantly monitoring and removing the personal information
you don't want on the internet.
The New York Times Wirecutter has named Delete Me their top pick for data removal services.
As someone with an active online presence, privacy is really important to me because, I don't know,
you don't want all your shit on there.
A lot of your shit's going to be on there anyway, so you might as well try to delete as much as
possible.
Have you ever been a victim of identity theft, harassment, doxing?
If you haven't, you probably know
someone who has. Delete Me can help.
Take control of your data and keep your private life private by signing up for DeleteMe.
Now at a special discount for our listeners, get 20% off your DeleteMe plan when you go to
join DeleteMe.com slash offline and use promo code offline at checkout.
The only way to get 20% off
is to go to joindeleteme.com slash offline and enter code offline at checkout.
That's joindeleteme.com slash offline, code offline.
So the restraining order expires in March?
No, it expires once the case is over.
What happens next?
So we're currently exchanging papers, you know, like they're filing, we're filing, and we've got a judge in New York who will hear the case, and we're pretty confident of prevailing.
There is a fundamental constitutional issue here. In fact, some of the most sacred, protected speech is advocacy, is public policy advocacy. So we're pretty confident that they will find that our First Amendment rights are being breached.
And then, you know, it's possible that they will still bring a case against me to deport me,
and then we'll have to fight that separately.
So this is just about preventing detention, you know, an attempt to put me into a fetid immigration detention center for months on end.
Then there is a separate process of protecting my ability to stay in the country.
And I love America.
I've been here for six years.
My wife's American.
My children are American. Well, my child is American, and in a few days, my children will be American.
And so I want to stay and I want to continue to contribute to society. I think I do so
in a really positive way. We employ dozens of people at CCDH. And I think that we're working on behalf of
the public. And I know that from the reaction that we get from the people that we seek to serve,
from parents, from vulnerable communities who are subject to hate speech on social media, from
people who are victims of scams. We do an enormous amount of work on how scams are being
accelerated on platforms. You know, you would have heard recently that 10.1% of Meta's revenues,
according to their internal estimates, are actually from scams and illegal content. So,
you know, that kind of stuff. So I want to stay and I'm going to fight to make sure that I can
and eventually I hope to become a citizen because I worked in politics. I think in these
sorts of ways. I come from a country with kings and queens. I've always really admired
a country that believes that we can run ourselves with checks and balances.
And I think, you know, we can come back to this, but I think that that's where things
have gone wrong.
Like we allowed checks and balances to wither.
Yes.
On social media.
And I think it's actually social media that has, and, you know, this is an abstruse sort of point
of law,
but I think Section 230 is actually the cancer that has infected the rest of our society.
I think it's metastasized.
And the impunity that that creates, the sense that you don't have to be responsible for
the negative externalities of the
harm that you cause. I think that's infecting all of society now, because if the wealthiest among us
enjoy this special protection under the law, then everyone else is thinking, well, why don't I?
Why can't I behave like them? And I think that that is what's caused that. So I'm really looking
forward to working to reestablish those checks and balances in this country that I deeply admire at
a basic philosophical, constitutional level. I do want to get to 230 and some of the work you've been
doing. Before that, though, like, how did you get on Elon
Musk's radar? Tell me about that lawsuit, how that all came about. So the structure of
CCDH is we have a research team. They are a team of data scientists, investigative researchers,
and then we have a comms team that goes and turns that research into stuff that we can
communicate to the public through the media, through directly, through, you know, we have
like celebrity influencers that we work with. And then we have a public affairs division that
goes and educates Congress, governments around the world about what we're finding.
and what sort of solutions might help to avert those problems in the first instance.
So how we can reform the system, create transparency and checks and balances, accountability, etc.
And our research division is, it does a lot of long-form projects, these really big projects.
We've got one coming out next week, which is a study of chatbots,
and whether or not chatbots will help you plan a violent attack.
Spoiler alert.
Yeah, I was going to say.
It's not good news for you guys. It's not good news for society. But then about three years ago,
when Musk took over X, I told the team, look, let's have a look and see if these changes that he's made.
So he said, I'm going to make this a free speech zone. He also told advertisers, don't worry,
I'm not going to get rid of all the rules. He said, I don't want to turn Twitter into a hellscape,
a free-for-all where anyone can say anything horrible to anyone else, because that would actually,
he recognized that would impede the rights of, you know, if you're, for example, if you're Muslim
or an African-American, and if Twitter was full of hate speech against Muslims, hate speech
against black people, you wouldn't feel comfortable going there, which is what we found,
actually, through the hellscape that he created. So we wanted to quantify, has there been an
increase in hate speech on the platform? So we took a tool called, I think it's
Brandwatch, that we used. We put in a number of racialized hate speech
terms, the N-word as an example. And then we said, has the prevalence of those, has the number of
times they're used on a daily basis increased or decreased since he took over the platform?
And we found that there was a quadrupling in use of the N-word on his platform.
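The method described here, counting how often a fixed list of terms appears per day and comparing the average daily frequency before and after a takeover date, can be sketched in a few lines. This is a hypothetical illustration only, not CCDH's actual pipeline or Brandwatch's API; the term list and posts below are placeholders, and only the October 2022 takeover date is a matter of public record.

```python
from collections import Counter
from datetime import date

# Hypothetical sketch of the before/after prevalence comparison described above.
CUTOVER = date(2022, 10, 27)          # date the takeover closed (public record)
TRACKED_TERMS = {"slur_a", "slur_b"}  # stand-ins for the real term list

def daily_counts(posts):
    """posts: iterable of (date, text). Returns {date: number of tracked-term hits}."""
    counts = Counter()
    for day, text in posts:
        tokens = text.lower().split()
        counts[day] += sum(tok in TRACKED_TERMS for tok in tokens)
    return counts

def prevalence_change(posts):
    """Ratio of mean daily hits after the cutover to mean daily hits before."""
    counts = daily_counts(posts)
    before = [n for d, n in counts.items() if d < CUTOVER]
    after = [n for d, n in counts.items() if d >= CUTOVER]
    if not before or not after:
        return None
    return (sum(after) / len(after)) / (sum(before) / len(before))

posts = [
    (date(2022, 10, 25), "ordinary post"),
    (date(2022, 10, 26), "slur_a appears once"),
    (date(2022, 10, 28), "slur_a slur_b slur_a"),
    (date(2022, 10, 29), "slur_b again"),
]
print(prevalence_change(posts))  # 4.0 with this toy data: a "quadrupling"
```

A real study would pull daily volumes from a social listening tool rather than tokenize raw posts, but the comparison itself reduces to this ratio of averages.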
Yeah, I think I remember this. This got a lot of publicity.
Yes, it was on the front page of the New York Times. So like, you know, this research study was
really, really well received. And it was just really stark because it showed this big
increase in anti-women speech, anti-LGBTQ+ speech, anti-African-American speech, anti-Jewish speech,
and hate terms and the use of those hate terms. And, you know, we're quite good at branding
these things. We called it the Musk Bump. It went on the front page of the New York Times,
and it led to him losing $100 million in advertising because advertisers were like,
well, Elon, you told two different stories. Now we know which one to believe.
So he got mad. He got real mad. And his trust and safety council resigned. So he
got real mad. And then he started tweeting about me. He said, you know, this guy's a rat. He's
evil. And I was like, all right. And he kept asking, who's funding him? Who's behind this guy?
And so I was in L.A. I was like, I thought, I think I'm just going to screenshot this. I'm just
going to tell people that it's you, the public, who fund us. And I got a few celebrity friends
to sort of amplify it. And it got a lot of views, raises us a lot of money. And the next day,
he called the, this is really funny.
I'm 47 years old.
He called the chair of my board, and he was like, I need to speak to you about Imran Ahmed.
The chair of my board is like, I'm not his dad.
I'm the head of governance for his 501-C3, for his charity.
Cancel culture? Yeah.
So he's like, I need to fix this.
Like, can you maybe restrain your child?
And he was like, no, talk to my, talk to him.
He's a grown-up man.
And so I was, I don't know, I was like, hey, Elon, why don't you just talk to me directly
instead of, like, calling the chair of my board?
And I was having a coffee with a journalist, and they amplified it, and he saw it, and the next day he then threatened to sue us.
He said, I'm going to sue you under the Lanham Act.
What is that?
The Lanham Act is the federal trademark act, and he said, I think you're secretly funded by Meta, by Mark Zuckerberg.
My best pal Mark Zuckerberg absolutely despises me.
And so Alex Spiro sent this letter and that was the first time I'd ever had to call a litigation lawyer.
So I called Robbie Kaplan, who I'd never met before, and I said, I hear that you're quite good at suing billionaires. And she was like, yep. I'm like, would you like to help me defend myself against one? She went, which one? I went, the biggest one. She said, yeah. She's my spirit animal. She's basically sort of, you know, just absolutely brutal and up for any fight. And she wrote back to them and said, this is absolutely ludicrous. And we press released that and we sort of, we let the world know that the free speech absolutist, Elon Musk, is trying to sue a small nonprofit for their speech.
And then he actually filed a different lawsuit against us, which was saying that we had
violated the terms of service of X by essentially using data we'd found there for research
purposes and that the terms of service say that you can't take small amounts of data for research
because that's scraping. Like, you know how AI platforms, like, download your entire
thing? He was basically accusing us of the same thing. He said, it cost us millions of dollars
to deal with you downloading this small amount of data to find out what had happened on my
platform. And I mean, that took us a year. I mean, it cost us over a million dollars to fight it.
And that's the thing with these lawsuits. They're designed to do that. They're designed to
cripple you. They're designed to terrify people. They're designed to make people not want to talk to
you because then people start thinking, well, maybe you've done something wrong. And the truth is that, you know,
this is maybe a little bit too much of how you make the pie.
But your funders, your staff, your partners, all think, well, crumbs,
I'm not sure if I should be visibly doing business with Imran,
because what if we get drawn into this?
Right.
Is it really worth the hassle and the money?
He's the world's richest man, John.
I mean, like, he can really screw you up.
But, I mean, it didn't work because we beat him in court.
We got an anti-SLAPP ruling in California, where he sued us.
An anti-SLAPP ruling is?
So SLAPP stands for strategic lawsuit against public participation,
which is a lyrically named rule.
It's this brilliant aspect of American law
that a lot of states have got anti-SLAPP statutes,
which say that when powerful people sue little people like me
to try and silence them,
there are actually rules in place so that the little person
can get their costs back,
because that's the most crippling aspect of it.
Oh, that's good.
In the rest of the world, if I sue you and I lose,
I have to pay your costs.
But you don't have what's called cost shifting in the US.
We love a frivolous lawsuit.
Encourages the little guy.
I mean, it's an interesting aspect of being in America.
Like, it's a, you know, when you move here,
because you've watched movies,
like I was like, it's basically going to be ET.
Like, I'm going to find a little alien in my neighborhood.
It's going to be so great.
We're going to be best friends.
And it's not. Like, it's a radically different culture. And one of the differences is, you know, how much litigation is seen as a normal tool used by powerful people to terrorize smaller people and nonprofits and others.
And so you won that lawsuit, but, like, then you were on his radar. And I think I've heard you say that then, like, Republicans in Congress started sending subpoenas your way.
And so it wasn't just like a one-and-done with Elon and then the deportation
attempt. You've now been dealing with this over several years. Well, curiously, a few days later,
Jim Jordan, who was chair of the House Judiciary Committee's Weaponization of the Federal Government
Subcommittee, trying to stop the government being used for illegitimate means, decided to use
the government to demand all of our internal emails. Any email that we'd ever sent to a social
media company or the executive branch of the US government. He had a conspiracy theory that we were
secretly funded by the federal government or by social media companies. That doesn't make much sense.
We've heard that conspiracy theory before, though, a few weeks earlier. And so we were like,
okay, here you go. And we gave him everything. And there was nothing there. But every six months,
we'd get a new subpoena. Then he deposed me for several hours. And I had to sit opposite him and
explain painstakingly, I just want transparency and accountability. Is that really so bad? And he was nodding.
I remember like, I used it like a focus group of like, can I get Republicans to agree with my policy
platform? So I was like, so this is our STAR framework: safety through transparency, accountability and
responsibility. Section 230 reform, transparency on, you know, algorithms, on content enforcement
decisions, on advertising. And he was nodding. And I was thinking, I think he might actually agree with me,
even though he's spent the last two years harassing me.
Of course. And then, you know, we've had other things happen as well.
Stephen Miller wrote to the Department of Justice demanding we be prosecuted for FARA violations,
the Foreign Agents Registration Act, saying that we were secretly part of the British government.
Again, like, you know, I'm sure he thinks I'm a secret intelligence service spy as well,
which I am not.
But that wasn't true, and, you know, the DOJ cleared us.
And then the FTC are currently investigating a bunch of nonprofits,
including CCDH, for whether or not we are at the heart of a sort of criminal antitrust conspiracy.
So basically he thinks that we control the world's biggest advertisers and that we've been telling
them not to give X money.
And I'm like, I'm pretty sure Bob Iger doesn't know who I am.
And I'm pretty sure they made their decision because Elon's kind of a dick.
And because, you know, it's not great advertising to have, you know, Mickey Mouse next to a dude
doing a Nazi salute by the Brandenburg Gate.
Right.
It's not kind of, you know, it's not on brand.
I mean, Elon's argument that he has floated over the last several years that, like,
the law should be able to compel advertisers to spend their money on certain platforms.
Like, it's insane.
I mean, it's literally a First Amendment violation.
I know.
That's the freedom of association.
It's what, you know, it's what's so odd is that they're really upset about the First Amendment
existing.
They hate it.
They really hate that we can speak freely.
They really hate that advertisers can decide.
where to spend their money. They really hate that advertisers might listen to the market, their customers,
normal people who are horrified by the way that these social media platforms behave.
And that they could be criticized. And that they can be criticized
in a way that persuades other people to take action with their wallets or with like what platforms
they use. That is the, that is the worst thing for them. So, you know, I do think about this thin skin
thing a lot. And, you know, I'm personally not a religious person. My family is Muslim.
We're Pashtun Afghans from generations ago, but still like, you know, that's where my family
comes from. And I get all this racial abuse, this religious abuse. Like, he's a jihadist who's
come here to destroy our country. And I'm like, dude, like, I'm an atheist. And I'm super
British. Like, I'm wearing a suit in California. I'm clearly extremely British.
In fact, the only thing I did was not wear suspenders, a shirt, and tie today.
I was like, I'm going to be very, very Californian.
I will wear a T-shirt under my suit.
So I find it very odd.
But we've taken all of this battering.
We continue to just get on with our jobs.
We always say that when someone attacks us, just deal with the attack at a technical level, manage it.
But our job is to make sure that we are focused on achieving the change.
that we want. And so be focused, be driven. But these guys, they get distracted by every insult
and then go full nuclear. And it is this kind of thin-skinned, incredibly, painfully, childlike,
vulnerable, kind of pathetic behavior from an adult that you see from them, where they can't
tolerate any criticism. And I will do this again in this recording, but like, I will go back to
230. I think that the sense that they got of legal impunity where they weren't able to be held
accountable under the law for their business decisions and for the harm that they cause,
they have translated into moral immunity too. And then they've said also, you can't criticize us.
And it is this sense of both entitlement, but also immunity from the laws, the mores, the norms that
govern normal human behavior. They are something beyond us. At the most fundamental
level, it is anti-democratic. It is why they have found an ally in Donald Trump. It is what
Donald Trump is all about. It is the idea that, like, I have the power. I am rich. I have worked
really hard to, you know, they say that in their own mind, worked really hard to get where I am,
and I want to make the rules. And I get to decide. And you don't get to decide. And also, you know,
the whole democracy thing, everyone having a voice. It's very messy.
They don't run their companies that way, right?
So that's all very messy to them, and they are smarter, and they know what they're doing,
so they get to make the rules, and you have to shut up.
I don't mean to be combative here, but I think it's more than just one president.
I think that this is a problem with...
Oh, for sure.
And I think it affected the Obama presidency, too, that cozy relationship between big money
and, you know, big tech. I think the first
time big tech had a cozy relationship with government was under the Obama administration.
For sure. We also didn't know the harms at that point. No, you didn't until the end. And then,
you know, and I remember the encounter between the president and Mark Zuckerberg,
where Zuckerberg walked away and he was like, you know, screw this guy like. But I think that
that cozy relationship, and this is this is the impact of a politics that to me, again,
like as a Brit, is really extraordinary. The sheer amount of money that flows through
that city that I live in, Washington, D.C., is terrifying. It's terrible. It is corruption. There's no other word for it.
And it is fundamentally vitiating the clarity of the voice of the people being heard in Washington.
If money and the enormous sort of both magnetic power that it has, but also the megaphone that it has in Washington,
is drowning out the voices of normal people. I think about things like the AI preemption rules that they tried to put through.
All of our polling shows again and again and again that Republican voters are incredibly
concerned about AI.
Yeah.
And so I do think that it is about big tech, big money and the corrupting influence it's
having in Washington.
And I think it's something that we really, really need to fix.
And my thesis, you know, as someone that runs a non-profit, is the best way that we can
fix that is actually that there is one number that's more important than dollars in Washington,
and that's the number of votes.
And so what we have to do is make sure
that we elevate this issue, and a clear understanding of what causes it and how we could fix it,
to as many people as possible. I need to make this electorally salient to counteract the corrupting
effect of money in Washington. Yeah. The corrupting effect of money is absolutely at the heart of this.
This episode is sponsored by BetterHelp. Between caring for others and managing daily life,
your own needs can easily be overlooked.
This International Women's Day, we invite you to prioritize yourself.
Therapy is more than a resource.
It's a space designed for your growth and healing.
There are women everywhere.
There are women everywhere.
And maybe you're a woman or maybe you're not.
Either way, you still need therapy.
Here's the thing.
There are women that need therapy and there are women that need therapy because there are men
that need therapy, honestly.
A lot of the women need therapy because the men need therapy.
So everybody needs therapy when you think about it.
Everyone needs therapy and everyone knows a woman.
That's right.
That's right. Close your eyes.
Close your eyes. Can you picture a woman?
BetterHelp's quality therapists work according to a strict code of conduct and are fully licensed in the U.S.
BetterHelp does the initial matching work for you so you can focus on your therapy goals.
A short questionnaire helps identify your needs and preferences, and their 12-plus years of experience
and industry-leading match fulfillment rate mean they typically get it right the first time.
If you aren't happy with your match, switch to a different therapist at any time from their tailored recs.
With over 30,000 therapists, BetterHelp is the world's largest online
therapy platform, having served over 6 million people globally. And it works, with an average rating
of 4.9 out of 5 for a live session based on over 1.7 million client reviews. Your emotional
well-being matters. Find support and feel lighter in therapy. Sign up and get 10% off at
BetterHelp.com slash offline. That's Better-H-E-L-P dot com slash offline.
Even if you took the money out of it, it is the personalities of some of these people who have the
money and who have this power that they want to make the rules. You know, and look, and I completely
I agree with you. It's our system too, I think. Because when you're in government, and when you're in the U.S. government, like, there is this, you want to get stuff done. You want to get stuff done that you promised that you would get done for people. And you try to do it and you get blocked at every turn. You get blocked by Congress and you get blocked by the lawyers. You get blocked by this. And you're like, boy, it would be easy if there weren't all these checks and balances and rules, wouldn't it be, right? And like, look what they're doing in China. They're building railroads all over the place. They don't have to worry about that, right? So, like, there is this push and pull with our system here. And I think, like,
I watched Obama be frustrated by that, you know, and, like, didn't get to do all the things he wanted
to do. But then you have to balance that frustration with the sort of slow, frustrating nature of
democracy, but it's important to have checks and balances and give people a voice. And I think that
the reason that we have not just Trump here, but authoritarians rising all
around the world, is they and their allies in big tech and big business are like, we're going to
take advantage of people's frustration with the slow, frustrating nature of democracy to swoop in
and say, we get to make all the rules now. Yeah, I mean, talking to the character of these people,
when I started CCDH, I looked for advice from everywhere, and I found this guy called Lord David Young.
Now, he'd been a cabinet secretary under Margaret Thatcher. In fact, he was the guy who brought privatization
to the UK. So, like, ideologically, he was very, very on the other side to where I'd been
in the Labour Party as a special advisor, but he was an incredibly kind man, loved talking to people
who disagree with him, as do I. And so he and I would talk for hours and hours. And he was
very generous with his time, too, because he was in his 80s by then. And he was a very successful
businessman in his own right. And I remember asking him once, David, explain to me what you
think the difference is between someone who was incredibly successful in business like you and someone
like Mark Zuckerberg. And he said, you know, it was so characteristically honest of him. He said,
the truth is, I did well in business because I wanted to go down to the country club. And I wanted
people to look at me and my wife and say, there goes David. He's a fine chap. He's doing incredibly
well. His wife must be so proud. And for the wives to be kind of like, oh, she's doing great. She's
with David. He's doing incredibly well. And he said, it was about my standing in community. And that's why, as well,
I felt that I had to behave in a certain way.
I had to think about people because otherwise I would damage that brand equity that I had.
He said, but the problem is where these guys are, they're members of a country club with a membership of one.
They don't have any connection to the lives of us.
100%.
And I do think about like Mark Zuckerberg sitting in the back of a blacked out limo being driven, you know, around San Francisco,
driving past homeless encampments and just thinking, these are a subspecies to me.
Now he's in a bunker.
Exactly, in a nuclear bunker.
or watching January the 6th happen and thinking, well, you know, oh gosh, I guess the plebs are rebelling.
Maybe I should dial it down a little bit.
And, you know, this complete emotional and spiritual disconnect from humanity, from the rest of us.
Which their platforms contribute to.
Exactly.
And I think that's where the sort of the sinister and sociopathic element of move fast and break things becomes so clear to me.
These guys do not care about us as human beings.
they see us as NPCs in a game in which they are the lead character.
You know, and I think it's the character of the people.
But then you're right, and democracy is incredibly messy.
One of my great frustrations is hearing people go to somewhere like Dubai and tell me,
oh, if only we had, like, Sheikh Zayed, we could get things done.
We could build buildings and everything else.
And I'm like, I thought you guys didn't like kings.
I thought you didn't like absolute rulers.
And actually, you know.
Talk to the people who were building those buildings.
They tend to look like me as well, so, you know, from the subcontinent.
But it's messy.
And, you know, you said like the messy, you know, the way that democracy works is glorious, though.
I wrote my dissertation at Cambridge on Jimmy Carter, on his energy policies and how he basically
failed to get them through.
When he was writing them, he predicted things like wars in Iraq if we didn't start to build
energy independence, if we didn't start to build alternatives to fossil fuels.
But he couldn't get it through Congress because he just didn't want to play the
politics. And I do think that politics is messy. I think sometimes good policies don't get
implemented because of the messy, partisan silly nature of how we've constructed the checks and balances
in our democracy. But that's the game. Yeah. You know, that's where it takes talent. That's where
great presidents are made and poor presidents are revealed to us. Yes, I agree. Well, let's talk about
your work and how you see potential for change in all of this. Like, I'm curious how your approach
to digital hate and disinformation has sort of changed over the years.
I've been thinking back recently on the period of time when the strategy was centered around
content, pressuring the platforms to moderate content, pressuring governments to do
something about hate speech and disinformation.
I now feel like that's, I don't want to say a lost cause, but harder, both from a practical
standpoint, because it's very difficult to moderate everything.
And because fundamentally judging whether a specific piece of content is acceptable,
when it's contested speech is always going to be inherently subjective.
But also, you know, I think, and we're seeing this now,
you can talk about the recent lawsuit against Meta and Google
that Zuckerberg had to testify on.
We're seeing so much of dishonest and hateful content
because of the way these platforms and algorithms are designed.
And so maybe the design itself, the algorithmic amplification,
should be the focus of the strategy.
And it seems like it's moved that way in this space,
but you know more about this than me.
So what do you think?
You know, I want to be as honest as I can be about this. This is an emerging area of policy.
Our understanding of the harms has really been emerging over the last 10 years.
That's the age of, like, real understanding that these negative externalities are down to social media
design choices, the way that these platforms operate.
And our understanding has slowly grown over time.
Initially, it was a sense that, look, there are bad people on social media.
So it was a bad actor analysis.
Increasingly, it's about bad platforms.
And the way that they are fundamentally constructed.
There's been a growing understanding, thanks to whistleblowers and other things, that there are actual
active decisions taken at a platform level where they know about harm and they choose not to do
anything about it.
So we've evolved as well. As our understanding's evolved, as we've done more and more research,
we've been able to more clearly articulate and evidence specific aspects of platform design:
that they are designed to create harm, that it's not abuse of the platforms, it's use of the
platforms as they were intended.
In one respect, we've become more and more aware of just how malignant the decisions being taken are.
I think that where we've moved to, and CCDH has been in this place for quite some time now,
is that we developed what we call our STAR framework. It's very, very simple. It says that we can have
better platforms by having transparency. And, you know, CCDH works with neurologists, with social
psychologists, who work on inoculation theory, who work on a wide array. And we've tried to sort of
build a minimum viable framework for how we can reapply democratic values to something that has now
been taken out of democratic control. And also looking at what's worked in the past in America,
where we have a long history of industries that have caused real harm to people,
but we've eventually managed to renegotiate the relationship that we have with them,
through, and in accordance with, the Constitution, which has quite extraordinary limits on what's possible.
So it wouldn't be possible to, for example, have a regulator as easily in the US as you would in Europe.
But then, Europe regulates; America litigates.
So if you want to have a system in which litigation could start to produce the framework of what's tolerable,
what doesn't create an unnecessary risk and cost for social media platforms,
so to change the cost calculus, the risk calculus. Transparency, first of all, so we know what's happening.
And that has this dual effect of helping people to understand how we're being manipulated.
I think that as it becomes clearer how we're being manipulated, people will resist that.
They get some inoculation, some pathway inoculation.
You know, what are the means by which we're being lied to?
What things are platforms deliberately amplifying to distort the way that we see the world?
And that helps us to sort of adjust our understanding for that distortion.
So there's transparency, but that also allows for us to understand how systemic harm is being created.
And then, when there is knowing indifference to that harm, to allow platforms to be held liable
under the normal negligence laws that everyone else is, the normal product design laws that
everyone else is subject to.
So when someone is harming other people at scale, and I think in particular of things like
scams, eating disorder and self-harm content, and of when they may lead to violence.
So terroristic content — you know, there's a number of lawsuits which have been dismissed
because the current laws in the US do not allow us to hold social media companies accountable.
This is a sort of, you know, a well-worn cliche, but there is more regulation of a deli
that serves you sandwiches than there is of platforms that our kids spend 4.7 hours a day on, on
average, and that fundamentally reshape their frontal cortex in a moment of extraordinary neuroplasticity
in their transition to adulthood. Which is bananas to me.
But the way that we've done that, and we've taken inspiration from people like Nader and others who've built the consumer rights revolution, is expose the harm and make it justiciable. So make it something that you can take someone to court for. And by creating transparency and accountability — and accountability is in part through the courts, but also through educating lawmakers so they can ask intelligent questions. In the UK, it would be a regulator that would ask tough questions; in America, it's mainly select committees — the Commerce committees, the Judiciary committees. So giving them the knowledge and equipment that
they need to ask good questions and then allowing them to be held accountable through the courts.
And that would mean reforming, not repealing, Section 230 of the Communications Decency Act of 1996,
which is a 30-year-old piece of legislation. It basically says platforms are not liable for any of the
user-generated content that is on the platform. It essentially posits that they are completely neutral spaces
and therefore are not in any sense a publisher
or distributor, which is bonkers, because we know how algorithms work.
Right. Just the existence of the algorithms disproves that.
Yeah. So, I mean, right now there's this big debate happening in Congress over,
should it be design features that make someone liable? Should it be algorithmic recommendations?
So pushing or promoting scams or eating disorder content or self-harm content.
There's a debate of a publisher versus distributor, which is a very technical debate, but still a very interesting one from a legal perspective.
So there is actually an active debate happening right now on whether or not 230 is fit for purpose.
And there's a bill, a bipartisan bill, that was introduced in the Senate.
There was one introduced in the House in the last Congress by Representatives Pallone and McMorris Rodgers,
so the former and, at the time, current chairs of the Energy and Commerce Committee in the House.
And now there's a bipartisan bill introduced by Senators Durbin and Graham with support from Senators Hawley and others.
And we've been meeting with them over the past few months and listening to what they have to say.
But they are introducing a bipartisan bill that would sunset Section 230.
So essentially say, you've got two years,
and then the special protection under the law is over.
And in that time, you have to renegotiate a new framework.
We want you to have a liability shield.
but it can't be absolute anymore.
And I think that would make a lot of sense.
What sort of reforms to the liability shield would you favor?
I think the algorithmic recommendation is one where everyone can align.
I think knowing indifference is a fantastic test.
So under negligence law, you know, normally it's a reasonable actor test.
Like would a reasonable person have been able to foresee the harm?
And if so, you become liable for it.
That, on platforms that are this complex, may be too low
a threshold. So knowing indifference is when you know that you're creating systemic harm, when
human beings are being hurt by it, and you fail to take action or exacerbate it, and then
you become liable at that point. We do focus groups and polling. I was in this focus group in —
I know that you're a fan of focus groups too, and you've sat behind those one-way screens so many
times, as have I — but like, we were in Denver, Colorado. It was white male libertarian dads, and they
were having Section 230 explained to them, and they were just explosively angry about it.
Wow.
We asked them to vote at the end, would you get rid of 230 or keep it?
And every single one is get rid of it.
And I'm like, I'm texting the moderator.
I'm like, ask them why, ask them why?
And one guy says, accountability.
And I'm like, just snap that.
Just like, put that in an ad for me.
And the second guy says, well, wait a minute.
So I can sue you.
I can sue him if they cause me harm.
But I can't sue a social media company if they make my kid kill themselves.
That's un-American. I was like, put that in the ad as well.
That's just incredible. But that's how normal people see it. They're like, wait a minute,
why does Mark Zuckerberg have special protection? What makes him different to the rest of us?
And why does he need special protection? And the answer is always, well, we have to help this young
industry to grow. Come on. It's been 30 years. They've grown.
Well, and it's like, you hear them say, well, I mean, the big tech companies can afford all the
lawyers, but if it's a small platform or small companies, it's going to be hard
for them to, you know, deal with lawsuits they may get or moderate all their content.
You know, the one you always hear is like, what if, you know, you're going to start holding Reddit
liable for every single comment in the Reddit thread? And, you know, if someone has a Yelp review
about a restaurant and the restaurant sues for defamation, they're going to
be able to sue Yelp for someone's review. But, like, it sounds like if you're reforming 230 and you're basing
it sort of on design and algorithms, like, you don't have to worry about that kind of stuff.
You don't. And also, like, you have to recognize that there is an established corpus of law
and judgments on when you are negligent. And so they keep saying, well, you'd be able to sue for
anything. Yes, of course, I've discovered you can sue for anything. But, like, it doesn't mean
you're going to win. And so they're like, there'll be a flood of lawsuits. I'm like, well,
there's a flood of lawsuits right now. But they're losing because of 230. And they're like,
it would stop free speech. I'm like, it wouldn't get rid of the First Amendment. Right. Right? The First Amendment still determines what's lawful speech or not. So
actually, hate speech in America is lawful. Right. Unfortunately. Go figure. So the First Amendment
still protects that. You wouldn't be able to bring an unconstitutional lawsuit and win because
it's unconstitutional. Right. Yeah, like if you, yeah, you can't stop people from filing suits that
are frivolous and get thrown out because they'll get thrown out. I mean, like, it is your God-given
American right under the Constitution to be an asshole. I'm like, thank God. To say hateful things, to say awful
That's your right.
To be, you know, Mr. Meanypants, so to speak.
But you can't systematically go up to 13-year-old girls and tell them you're fat and disgusting
and you should go on a 300-calorie-a-day diet and do that again and again until they starve themselves to death.
And know it, make the decision to just fill their feed with that.
On my board is a guy called Ian Russell, whose daughter Molly took her own life when she was 14.
Ian's British. And in Britain, the coroner's court decided to open an investigation into the role that platforms had played. It was the first time this had ever happened. So they subpoenaed from Meta and Pinterest what Instagram and Pinterest had been feeding her. The conclusion of the investigators was that they had fed her so much self-harm content that it would have been rational for her to conclude that it is normal to hurt yourself outside if you hurt inside,
and that if you really hurt inside, you kill yourself.
And Ian is this wonderfully dignified and kind human being.
He sits on my board, as I say, but there's a documentary about Molly coming out shortly,
Molly Versus the Machines.
He told me the story and he was telling me about how when the paramedics arrived,
they had to physically pry his arms off her because he wouldn't put her down
so they could declare her dead.
And I think to myself that when a platform has done that, maybe you should be able to hold them accountable.
And it is fundamentally corrupting to our society, to our soul as a nation, when we allow anyone to get away with that.
And I think that at the core, I think that 30 years of that abstruse legal protection has created a culture of arrogance and indifference, a sociopathic indifference to the harm that these platforms are causing.
And I think that it is infecting our politics, both through money, but also I think through spiritual, you know, osmosis.
Yeah.
You know, the whole problem with a system that's based on checks and balances that stop any one locus of power becoming too powerful is that when you take one of those loci of power, one of those places of power, away from that system of checks and balances, it will become unbelievably powerful and tyrannical in its power very quickly.
And that's precisely what we've done.
and it is madness and it needs to be reversed.
I agree. Imran, thank you so much for joining us, and more importantly, for the work you're doing.
This country needs you. I really hope you stay.
Thank you. So does my wife, I think.
I feel confident that you'll win the case because it's so crazy.
But beyond that, I'm really glad that you and the Center for Countering Digital
Hate are out there doing this work, because it's really important. Thank you, Jon. Thanks.
One quick note, check out the brand new episode of Pod Save America Only Friends, exclusive to
Friends of the Pod. If you haven't listened yet, you're missing out. This new episode is
helmed by Lovett and Tommy. In this episode, they discuss who they would prefer to run against
in a 2028 election: Marco Rubio or J.D. Vance. Come on, guys. They also answered questions
from subscribers like, which MAGA influencer do they dislike watching the most
for podcast research. That's fun. So subscribe to Friends of the Pod to get access to Pod Save
America only friends, which is our new extra episode of Pod Save America. There's also a lot more
subscriber-only content, new shows, shows that you love like Polarcoaster with Dan Pfeiffer.
We have a growing list of Substack newsletters like Pod Save America Only Tabs. And you get ad-free
episodes of all your favorite Crooked pods. And you get to know that you are supporting a
pro-democracy independent media company, which is great.
So if you're listening, hit pause and subscribe to Friends of the Pod at crooked.com
slash friends.
As always, if you have comments, questions, or guest ideas, email us at offline at crooked.com.
And if you're as opinionated as we are, please rate and review the show on your
favorite podcast platform.
For ad-free episodes of Offline and Pod Save America, exclusive content and more,
go to crooked.com slash friends to subscribe on Supercast, Substack, YouTube, or Apple Podcasts.
If you like watching your podcast, subscribe to the Offline with Jon Favreau YouTube channel.
Don't forget to follow Crooked Media on Instagram, TikTok, and the other ones for original content, community events, and more.
Offline is a Crooked Media production.
It's written and hosted by me, Jon Favreau.
It's produced by Emma Ilich Frank.
Austin Fisher is our senior producer.
Adrian Hill is our head of news and politics.
Jerich Centeno is our sound editor and engineer.
Audio support from Kyle Segglin.
Jordan Katz and Kenny Siegel take care of our music.
Thanks to Dilan Villanueva and our digital team who film and share our episodes as videos every week.
Our production staff is proudly unionized with the Writers Guild of America East.
