On with Kara Swisher - Imran Ahmed Researches Online Hate. Trump Wants to Deport Him
Episode Date: January 12, 2026. As the founder and CEO of the Center for Countering Digital Hate, Imran Ahmed specializes in researching how hate and disinformation spread online. His work has made him a target of Elon Musk and the Trump administration — he’s one of five European tech regulators and researchers the White House wants to bar from the U.S. over claims of “foreign censorship.” Ahmed, who’s a British national based in the U.S., has sued to block his removal, and he’s not backing down from a fight with the administration over his goal to hold social media and AI companies accountable. Kara and Imran talk about the work his organization does to combat the spread of hate speech; why he thinks the Trump administration is targeting him at the behest of Elon Musk; and the stakes of his case when it comes to protections around free speech and immigration. They also talk about why so many tech CEOs are threatened by efforts to rein in the spread of disinformation on their platforms. Questions? Comments? Email us at on@voxmedia.com or find us on YouTube, Instagram, TikTok, Threads, and Bluesky @onwithkaraswisher. Learn more about your ad choices. Visit podcastchoices.com/adchoices
Transcript
Hi, everyone, from New York Magazine and the Vox Media Podcast Network, this is On with Kara Swisher,
and I'm Kara Swisher. Last year, we talked a lot about the ways the tech industry cozied up to the Trump
administration and the ways that the Trump administration returned the favor by trying to defang
any attempt to regulate them, even by foreign governments. The latest example of this backscratching
came right before Christmas. The Trump administration imposed travel bans on four European
tech researchers and a former regulator. It accused them of trying to censor
American viewpoints online, calling them, quote, agents of the global censorship industrial
complex. What incredible nonsense. Imran Ahmed is one of the people targeted by the White House.
He's not a politician or a regulator. He's a British citizen who runs an organization called
the Center for Countering Digital Hate. It researches the ways hate speech and disinformation
spread on social media and AI platforms and advocates for policy changes to combat it.
Of the five, Imran is the only one who lives in the United States, meaning that he could be deported.
A federal judge has blocked his removal for now.
I think the administration is going after him at the behest of Elon Musk, and of course, I think this is censorship, which is the very thing they decry.
Being a hypocrite is not new to the Trump administration, but this is one of the most egregious examples of it,
because these people have talked about being against censorship when in fact that's what they're calling for.
All right, let's get into my conversation with Imran.
Our expert question comes from Nicole Wong.
She served as Deputy U.S. chief technology officer during the Obama administration and was a top lawyer for Google and Twitter and someone I have huge respect for who has been thoughtful about these very difficult issues of policing online speech and what to do about it.
So stick around.
Every year, hundreds of thousands of people from all over the world flock to Las Vegas for the Consumer Electronics Show.
And they spend a week trying to sell each other on the weirdest gadgets you've ever seen in your entire life.
This week on The Vergecast, we're talking all about everything happening at CES,
from the TVs to the AI gadgets to the humanoid robots that everybody is hoping
might someday do your laundry and wash your dishes.
All that and much more on the Vergecast wherever you get podcasts.
Imran, thanks for coming on On.
Oh, thank you. It's good to be here.
Yeah, we have not met, although we work in similar areas,
or I've been talking about these topics that you work in for a long time,
but you jumped into the news, and so let's get started there.
It's been a crazy few weeks.
Let me just get people up to speed.
On December 23rd, two days before Christmas,
the State Department said it was barring you and four other Europeans
who do similar work in the U.S., but you're the only one who lives here.
Walk us through that day, and how did you find out about the government's decision?
What went through your head and what was the first thing you did?
So, yeah, I mean, as you say, on the 23rd of December,
I got a text message from someone saying that
the State Department had issued a press release saying it was going to deport or ban five people
from the United States. And this had kind of been trailed for a couple of weeks beforehand.
There'd been a few articles in odd outlets like the British Telegraph with a State Department
source and Zateo, which is a very left-wing online site, saying that they were going to take
revenge on some people, including me, for the European Commission's decision to fine X,
and also to stop people who are targeting X-Corp in particular.
And, I mean, at the time, I just thought, well, that's got to be nonsense
because everything that you've just cited is First Amendment protected advocacy activity,
and the State Department can't take action for that.
But then on the 23rd, they announced it,
and this sort of minor political appointee in the State Department
specifically named me and said, I'm one of the five,
and the only one of the five who's here in America.
I'm married to an American citizen.
I'm, you know, my daughter is American.
And, yeah, the first thing that comes into your head is why.
I think there's been a lot of people who've said that it's because of X or Y or Z,
but to me it was really obvious.
This is about Elon and the wider problem of big tech, big money,
and the incandescent anger of thin-skinned plutocrats like Musk and Zuckerberg
that we've been effective at holding them accountable.
They're angry that we've helped to pass legislation, that we're listened to by their advertisers, that we've driven action that impacts their bottom line.
So let's get back to that day.
You heard about it through a press release.
Yeah.
The State Department didn't call you or anything else.
Well, I saw this tweet from, or I was sent, I don't, I'm not on X.
I was sent a tweet from a lady, an undersecretary in the State Department, who said they were targeting me.
And because they'd already trailed this in the news, we'd already organized ourselves, is the truth, Kara.
We, you know, I'd spoken to Robbie Kaplan, who's my long-time attorney, great lawyer.
And to Chris Clark, who's a phenomenal litigator and has represented, he's actually represented Elon before.
And he'd agreed to come on board if the government, you know, we just thought, when we read the articles, we thought this is never going to happen.
Anthony Romero from the ACLU called me and said,
look, I've seen this. What do you need?
And a guy called Norm Eisen, who runs something called Democracy Defenders, called.
And so Norm called as well and said, like, what do you need?
So you had a sense of something that was coming,
although you found out about it on X, essentially.
Yeah.
Talk about the personal stakes for you here.
You mentioned your wife and child are American.
Have you thought about what you and your family will do
if you're forcibly removed from the United States?
In one respect, the stakes are relatively low for me because they're talking about deporting me to London. You know, I get to live in Marylebone again and, you know, go to my favorite restaurants in a place I've spent 20 years living in. But the real stakes are that, well, first of all, I mean, it's a complete disruption of our life. I'd moved to America six years ago on an O-1 visa, so an extraordinary ability visa given by the Trump administration for the work I was doing with the Trump administration
on digital anti-Semitism, ironically.
And I met my wife here.
I love this country.
I've been making my life here.
You know, we bought a house.
We've had our kid.
And we want to have more kids.
And this is the place that I saw my future in.
So it's shattering, obviously, all of those.
But the other thing is, it's, there's an enormous sense of this is not right.
And given what I do, it is unsurprising that my first reaction, when someone does something
which I believe to be fundamentally immoral and wrong, was to fight.
Right, do it. So just so people are clear, the Secretary of State, Marco Rubio,
put out a statement announcing the action. He accused you and others of trying to, quote,
coerce American platforms to censor, demonetize, and suppress American viewpoints.
Yeah.
I want you to explain these claims and also why they're false.
But I want you to explain it from their point of view and then explain your point of view.
Well, I mean, I can explain it at a fundamental philosophical level.
The truth is, Kara, and you and I have been circulating in circles where these debates have been happening for a long time, the censorship industrial complex.
I call it the aggrievement industrial complex for the most part, but go ahead.
Fundamentally, private individuals can't censor anyone.
Private individuals hold opinions.
They may think that you're a scumbag, but that's not censoring you.
That's a First Amendment protected opinion that they have on you.
Private publishing platforms can't censor anyone.
They have editorial lines and standards,
and algorithms may make the decision,
but it is an act of mathematical editing of timelines for billions of people
that is also their First Amendment right.
So whenever someone sued a platform, for example,
saying, I need to be let back on this platform,
they got rid of me, that breaches my First Amendment rights,
the judges have had to remind them, no, darling, actually it's the platform that has a First Amendment right to decide not to have you on there.
Advertisers can't censor people. They can reward content with money. So you produce great content and advertisers come to you and you earn money off the back of that.
Forcing an advertiser to reward content they don't want to reward is also contrary to the First Amendment and the right of free association.
Only governments can censor. In the United States, right.
Right, by using the threat of overwhelming force.
And that is to say, if you say something I don't like, I will punish you using the monopoly of violence that I hold.
And so that's the fundamental truth of how censorship works.
And accusing a non-profit or even a platform of censorship is frankly, legally meaningless.
Even if you were saying things that were false, correct?
I mean, you know, one of the things the First Amendment fundamentally protects is not just your right to be false, but your right to lie.
You have the right actually under, you know, apart from in certain circumstances,
fraudulent activity, et cetera, et cetera, et cetera. You have the right to lie to people. And of course,
what the First Amendment doesn't say is that you can't be ostracized for it, that you can't be punished
for it socially, economically, in other ways. People might decide they don't want to do business
with a liar. You have the right to hold an opinion about someone. So I think the debate on
censorship's become incredibly diffuse and dumb, because really what it's been used for is some
very thin-skinned people to say, please, please, please don't say mean things about me.
And you will know this with people like Musk as well,
is that the reason why, you know,
I am always bemused by his claims of censorship.
The truth is that he's kind of scared.
He's a scared little boy that people might hold opinions about him,
that he is an asshole.
And, you know, and as a result, he bought the playground
and still can't make people love him.
And he then manipulated the rules of the playground
to benefit him.
And even then he couldn't get people to love him.
He's done everything he could and yet can't.
And what CCDH does is essentially hold up a mirror to bad behavior.
All right, let me get in that.
I'm going to play devil's advocate because I've listened to them, too, for a long time, all right?
This is what their principal argument is.
A lot of people believe that tech companies censored conservatives
and their views at the urging of the government and the organizations like yours.
For example, in March 2021, as the COVID vaccines were starting to roll out, your Center for Countering Digital Hate put out a report called "The Disinformation Dozen" about 12 leading online anti-vaxxers. The list included Robert F. Kennedy Jr., and your organization called for all of them to be deplatformed. Every person named in your report was removed and heavily suppressed by more than one major platform at the time. Talk about what this set up. Explain why that is or is not a problem. Because you did have effect with these platforms before, right?
Now, of course, they could give a fuck.
You know, I set up CCDH in September 2019,
and initially it was primarily about looking at anti-Semitism
and other forms of hate,
the kind of hate that was taking lives in Britain.
And it's only six months later the pandemic started in March 2020.
And I had to make a decision as to what we did with our tiny team of people.
And we said, well, actually, we can probably do most good
by studying the spread of disinformation lies about vaccines,
and about COVID, because COVID was, it was clear by then going to take more lives than any
hate actor on the planet in the following couple of years. Now, at that time, in March 2020,
President Trump was the president. One of the things that was really interesting is that when we
were talking to the Trump administration at the time, they were very worried about anti-vax rhetoric
coming from Biden and Harris. And here's the truth. With some reason, like they're actually kind of
right to be pissed with Biden and Harris for saying things like,
Harris was asked if she would take the vaccine. She said, I think that's going to be an issue
for all of us. I will say I would not trust Donald Trump, which is an incredibly irresponsible
thing to say. And I don't care that she's a Democrat. Like, it's incredibly irresponsible.
And, you know, Joe Biden has said things like, you know, who's going to take this shot?
Are you going to be the first one to sign up once they now say it's okay? And Trump was pretty
pissed about that. Correctly so, I think. In March 2021, we put out this report, the Disinformation
Dozen, which was an analysis of the hundreds of thousands of bits of disinformation
circulating on social media platforms where they were linking to outside sites. We found that
12 people were producing 65% of the disinformation being linked to on social media platforms.
But how do you respond to critics who see that report and the result and say that's
harmful censorship at your organization's urging? Why isn't it censorship? I think platforms have
the right to enforce their rules. I mean, like, we all sign up to the rules of our platforms when we
join them, right? We all agree to them. If you have rules, it's your responsibility to abide by the
rules, but you have a corollary right to expect others to abide by them too, and for there to be someone
to enforce those rules. And again, the evolution of CCDH is, it does include some of us kind of
realizing, oh gosh, just putting moral pressure on these companies doesn't work. At the time, we still thought
by arguing the point, by showing the evidence that platforms would change their behavior.
Which Maria Ressa tried to do in the Philippines and others have tried to do.
Yeah, lots of people. And, you know, Maria and I are friends. And, you know, we've talked about this.
I work with people like Ian Russell, whose daughter Molly took her own life because of content
that sort of flooded her timelines on Instagram. And, you know, her father tried to put pressure on these platforms,
but you know, and I know that they don't give a shit.
Right. So why are they claiming harmful censorship? From your point of view, what is their strongest argument?
I think it's politically expedient for them to sort of to now say that, oh gosh, that was terribly harmful.
And at the time, they were telling the Biden administration, look, what a great job we're doing on cleaning up disinformation.
I think right now we're in a really odd moment in America where what has been bipartisan consensus, and I still think there is bipartisan consensus,
if you look at polling, that vaccines are one of the safest, most effective, most consequential inventions of the past 200 years in medicine,
is breaking down and has become infected, in part because of the way that social media algorithms work.
And you and I probably agree on this emphatically that they forced us to the fringes of argument on every issue.
And so we have this perception that there is a significant wing of the population who believe that vaccines are the work of Satan.
Right. I think their argument is that they weren't allowed to say that, that they weren't allowed to say they were Satan,
and they should have been allowed on these platforms,
as you were saying, because they can lie.
I had a similar argument with Mark Zuckerberg about anti-Semitism,
if you remember, that 2018 interview I did,
where he got into big trouble when he said
Holocaust deniers don't mean to lie.
And I said, oh, they do.
Just to be specific, we were very careful not to talk about things like lab leak.
We were very careful not to talk about whether or not they're satanic.
We actually looked specifically at,
is this information likely to harm someone's life or put them at risk of death?
We'll be back in a minute.
Support for this show comes from Quo.
If you're a business owner and someone can't get through to you,
it's more than just annoying. It's bad for business.
Quo, spelled QU-O, is a smarter way to run your business communications.
With Quo, you and your team can stay on top of every customer conversation
so you can reply faster and never miss an opportunity to connect with your customers.
Quo is designed to work for you wherever you are.
You can use it right from an app on your phone or computer and you can add new numbers or teammates in minutes.
Sync your CRM and use seamless routing and call flows as your business grows.
And Quo isn't just a phone system, it's a smart system.
Their AI agent automatically logs calls, generates summaries, and highlights next steps, so nothing gets lost.
It can even qualify leads or respond after hours, ensuring your business stays responsive even when you're offline.
You can make this the year where no opportunity and no customer slips away. Try Quo for free plus get 20% off your first six months when you go to
Quo.com slash Kara. That's QUO.com slash Kara. Quo. No missed calls, no missed customers.
Support for the show comes from Framer. You need a website, then you need a website builder. In other words,
you need Framer. Framer is an enterprise grade, no code website builder used by teams at
companies like Perplexity and Miro to move faster.
With real-time collaboration, a robust CMS with everything you need for great SEO,
advanced analytics, and integrated A/B testing, your designers and marketers are empowered to
build and maximize your dot-com from day one.
Changes to your Framer site go live to the web in seconds with one-click without help from
engineering.
So whether you want to launch a new site, test a few landing pages, or migrate your full dot-com,
Framer has programs for startups, scale-ups, and large enterprises to make going from idea to live site as easy and fast as possible.
Learn how you can get more out of your dot com from a Framer specialist and get started building
for free today at Framer.com slash Kara for 30% off a Framer pro annual plan.
That's framer.com slash Kara for 30% off. Framer.com slash Kara. Rules and restrictions apply.
Let's get back to the government's case against you, because the State Department acted,
as you noted, a few weeks after the European Union
fined X $140 million for violating the Digital Services Act.
The European Union has been much more strict on tech companies than the U.S. has been.
I don't know what the opposite of strict is, but whatever that is,
that's what the U.S. government has done, both the Democrats and the Republicans.
It was the first penalty imposed under the law, which sets rules for how platforms operate.
The Trump administration hasn't explicitly linked their action against you to this case,
though they did cite your organization's support for the law, which you can do.
The actual fine by the European Commission was because X breached its rules on data access,
and I think that that's the most fundamental, important thing.
We need transparency of these platforms.
We actually have a right as a public to know how we're being misled,
how they are distorting the information that we receive.
Like what's in the food?
Exactly.
It's like a label, right, because we're not eating nutritious food anymore.
We're eating ultra-processed food.
Slop.
Yeah, right.
And I think just a few days ago,
the Secretary for Health and Human Services said,
we want to reduce the amount of ultra-processed food.
I agree.
I also agree we should try to reduce the amount of ultra-processed information we're receiving.
That's a very good point.
Now, X broke those rules,
and they were fined $140 million, $120 million.
And apparently, that censorship,
breaching rules and transparency,
that is now censorship, which is odd.
Well, everything's censorship to Elon Musk.
Yeah.
He owns X, as you said.
He celebrated the news of the visa sanction,
and the two of you, as you noted, have a past.
In 2023, Musk sued your organization over a report
documenting a spike in hate speech on X after he took over.
A federal judge dismissed it.
Do you see this, he's a very litigious person, he'll do it,
even if he doesn't have a good case,
and he'll continue to do it and harass people.
Do you see this as the federal government retaliating against you
on behalf of Musk, and why now, rather than when he was sort of at his strength,
although he's back, I guess?
I mean, why now? It's been going on for years, is the truth, Kara. So the story of his lawsuit was extraordinary. It started off with him calling me a rat and calling my organization evil online. We did the study that was in the New York Times that showed that when he took over, there was a tripling in the global usage of the N-word on his platform. There was a massive spike in anti-women, anti-LGBTQ+, anti-Hispanic, anti-Semitic hate on his platform. And that led to him losing a lot of advertisers. His trust and safety council resigned because of the content of our research.
And he sued us. And initially he just was like, you guys are rats, you guys are evil.
You're to blame for my fall off in business, not himself. Yeah.
So, I mean, and he kept asking like, who's funding you, who's behind you? Because he's a conspiracy
theorist. And so he's asking who's behind you. And I took a screenshot of his tweet.
And I said, I'll tell you who funds us: the public, donate here. And I asked a few friends to retweet me
and, you know, a few celebrities retweeted. And it got a lot of views. We actually brought in a lot of money
off the back of it. And that really annoyed him. And so the next day, he called up the chair of my board
and said, I want to speak to you about Imran and the chair of my board said, nope, I'm not his dad.
You speak to him. And then the next day, we got a letter from Alex Spiro, his lawyer, saying,
we want to...
Quinn Emanuel, just for people who don't know. Go ahead.
Yeah, we want to sue. We're going to sue you under the Lanham Act. So he said, you are secretly
funded by Mark Zuckerberg and by Google and by foreign governments, and therefore we're going to sue you.
And we thought, well, we're not funded by any government or any tech company. So that's nonsense.
And that's the first time I met Robbie Kaplan. And I called her up and said, I hear that you're very good at suing billionaires.
And she said, I am. And I said, I've got another one for you. And she said, I'll do it.
Yeah. So that was the peak over this. Why now, though? What was? So he lost the case.
So he lost the case, but as soon as he sued us, we immediately got subpoenas from the House Weaponization of the Federal Government Subcommittee, so Jim Jordan's committee. And he asked specifically. As night follows day. As night follows day, he asked us, of course, they have the subpoena power. He asked us for all correspondence and contracts between us and any social media company and any branch of government. Which is a lot. And we wrote back going, here's all of it. So we did. We gave him every single bit.
Right, but it's expensive for a small organization like yours to do this kind of stuff.
It costs hundreds of thousands of dollars.
In total, millions of dollars, this whole thing.
But of course, that evidence would have been very pertinent to Musk's initial claims
and belief that we're secretly funded by Mark Zuckerberg.
So, but there was nothing there, so he didn't really go any further with it.
Then we were targeted by Stephen Miller, who said, through his America First Legal nonprofit, that we're in breach of the Foreign Agents Registration Act,
and referred us to DOJ, which is not true as well.
We're not in breach of the Foreign Agents Registration Act.
Then we had the FTC start investigating us for, this is in the last year, being part of a criminal enterprise in which we sit at the center.
Essentially, we control Disney, is the argument from the FTC.
Yeah, that's well known.
And then we had the State Department take this action.
So you say, like, this is an odd time for it to happen.
The truth is it's never stopped happening for three years now.
They kept trying.
They were probably in a room going, now we'll do this.
Like, I can see that.
And it's death by a million cuts, right?
If you are an organisation with revenues of a few million dollars a year,
I have 36 staff, data scientists, people who are studying things, advocates, communicators.
I mean, all of this means I can't hire people.
We can't do things.
And so we've had to endure a lot.
The truth is, though, that you will know that when you get attention,
a lot of people hear our message and go, actually, I quite like the cut of that kid's jib.
And so when Elon sued us, our revenues were about one and a half million
globally. Last year, they're about 7 million globally. And that's all the Elon
effect. It's the Musk bump. Thank you, Elon. But why now did they do this? Because they got a lot of
shitty stuff on their plate, right? Like, was there a resurgence of his power? He didn't do it in the
middle of DOGE, right? He didn't, like, when he had more power than ever. I mean, I'm not a
speculator. I don't know what his current relationship is with anyone in Washington. I do know that they are
escalating and escalating and escalating because, of course, they're feeling the cage closing
around them, Kara, like whether it is the UK passing the Online Safety Act, the European
Union, the Digital Services Act, Canada, Mark Carney came to power saying he was going to pass
an online harms act.
Australia.
Australia, but also domestically, Florida's passed bans.
You know, Texas has got stuff in the mix.
California.
You've got AI legislation all over the country.
Because actually, there is a fever pitch.
And for change organizations like mine, organizations trying to change a system, especially one that's relatively new, this is decades-long work.
Nader, it wasn't immediate.
It takes years to build up awareness.
Mothers Against Drunk Driving, etc.
And cigarettes. Exactly.
So on the AI front, right now,
speaking of the cresting of the situation,
I have been focused a lot on kids and chatbots, for example,
like since 2023, and I kept saying this is going to be a problem.
One of the big stories right now is how people have been using
the chatbot Grok from Musk's AI company, xAI,
to generate sexual images of women and minors. Musk said anyone making illegal content will suffer the same consequences if they
upload illegal content. But as of this recording, X does not seem to have done anything with these
posts. There's now one of a woman who was killed in Minnesota, in a bikini, apparently.
We've seen governments pass laws to criminalize deepfakes and non-consensual images like this.
In the U.S. we have the Take It Down Act, right? And it was a very Republican-backed law.
What is the effect of what's happening right now? Because these CSAM images are disturbing. And most companies react. But so far, Musk has not removed them as people are following it. And neither of the platforms that are in the center of the app universe, Apple or Google, have moved in any way. And though Musk is at the center of it, they're certainly complicit in that.
I just think it shows how weak we've been at passing legislation to do common sense things like, um,
you know, not allowing AI platforms to produce CSAM.
You would have thought that would be the first thing that we'd want to fix,
that we want to make sure it doesn't happen.
And of course, somehow we've got to the situation where Elon was at one point
boasting about the fact that his platform will put a bikini on anyone.
And our argument's always been that if you have ungoverned spaces,
so, I mean, in my past, I worked on foreign policy for the United Kingdom.
So, you know, we know about ungoverned spaces, like spaces that are ungoverned,
when I worked on countering terrorism and Islamic State, for example,
places like Syria, Somalia, they become breeding grounds for al-Qaeda,
for al-Shabaab, for ISIS.
X became the breeding ground for anti-Semitism when Musk took over,
and now Grok has become the breeding ground for pedophiles,
because essentially it's an ungoverned space where you can do whatever you want.
And without rules, bad shit happens.
So the question is, who's going to put the rules there?
Do you see this being the crest of it?
because people have certainly, the Europeans have certainly reacted, the U.S. less quickly.
The Europeans have kind of reacted. The British have moved slightly faster.
The European, look, I've got an office in Brussels. We have people there. I have our office in London.
We have people there who are talking to, you know, lawmakers and everyone else.
All right. The U.S. is glacial. This is turtle.
The problem in Brussels is no one's back until next week because they're Europeans and so therefore
they have their six-week Christmas vacation. And the Brits have said something, but they have no powers.
So actually the Prime Minister of the United Kingdom came out and said,
it's unacceptable, this cannot happen.
And you're like, so what are you going to do about it, bro?
What are you going to do about it?
And there are very few powers available.
What is going to happen here?
Do you think this is one of these moments, along with chatbots and everything else?
Yeah.
Sexualized children.
Because we do have this act.
Well, under the Take It Down Act, the FTC has the power to fine them $52,000 per instance.
And this is Melania's act, and President Trump signed it into law.
It's a good act.
You know, it's weak, but it's the first substantive piece of liability reform since FOSTA-SESTA.
And that was the first piece since Section 230 in 1996.
So, and I think that it may push some people to call for action, and you may see some tightly written bills,
which deal specifically with this problem.
But we have a more general problem as well.
Like, we just did a research report, Fake Friend, which we did in two modes.
First of all, users simulating being a 13-year-old with mental health problems, so suicidal ideation, eating disorders, and drink and drugs problems.
And we wanted to see how quickly could we get it to tell us how to safely cut ourselves.
Chat GPT40, two minutes.
How quickly could we get it to list what drugs we could take at home to kill ourselves, 40 minutes?
How quickly could we get it to write a suicide note for us, 65 minutes?
And then we bombarded the back end with prompts asking, try to work out quantifying how many times in the probabilistic model.
So chat GPT doesn't always give the same answer.
So how many times does it give a bad answer?
Over half the time it was giving a dangerous answer.
A few weeks later, they launched ChatGPT-5.
And Sam Altman came out specifically and said, because Adam Raine's case came out two weeks after our Fake Friend report.
Right. This is the ChatGPT case.
I interviewed their parents. Yeah, and I meet too many parents who've lost their kids. I'm sure. Both of us are parents. It never leaves your soul. And it chips away; it's chipped away at mine a lot over the last seven years.
I had one tech person say, when are you going to stop interviewing these parents? I said, when you stop killing them. Stop killing their fucking kids. That's what I said.
And so he said, I didn't kill them. I'm like, ah, that's a debatable situation. You created it. You handed them the gun, is my feeling. You handed the gun, and often you said, please shoot.
Yeah. What will it take to enforce it? What do you imagine? You know, Sam Altman actually got it
right a couple of years ago, when he was still pretending to care about safety. He went to Congress
and he said, I remember him saying, we need accountability and we need, you know,
legal responsibility. And that's literally the STAR framework that we'd written five years earlier.
And I was going...
And Character.AI just settled with Google.
Right.
What do you make of that?
That was Megan Garcia's son.
Yeah, I mean...
They all claim that they want to fix this,
but the truth is that they aren't actually doing it.
And now we know that in the last year,
they have spent a ton of money,
a ton of money on lobbying against the precise laws
that they once claimed to want.
Again, I don't know why this has become political.
Having chatbots that don't persuade your kids to kill themselves or generate CSAM should not be a partisan issue.
And somehow in America, we've managed to make it partisan.
Well, look who is standing next to Trump in the inauguration.
I'll take that.
I always show that picture whenever I can.
If you want to ask the question of why I'm currently being targeted by the State Department,
why they want to force me to move, to take my American family with me and leave the country,
or to fight it. And I'm stuck in this country now.
I cannot leave this country because I wouldn't be allowed to be readmitted.
So if my parents get ill and die, I can't go to their funerals in the UK.
But to come back to the point:
I feel I have to fight this.
But the reason why they did that is that one image,
that image of them standing there at the inauguration
because they are desperate to protect their franchises
from the inevitable, the growing crescendo of clamor begging for change.
Let me just finish up with your case.
The federal judge blocked your removal from the U.S. for now.
The Supreme Court, though, has been largely deferential to Trump on issues of immigration,
although this is also a free speech case.
It's a First Amendment case.
That's right.
But I'm saying they all make it an immigration case.
How confident are you of where this goes?
And what are the broader stakes then?
What message will it send if the government is able to revoke your green card over your work?
I think this is the core of where we are right now.
I have faith in the courts.
I've actually, as I said before, gone up against Elon, and a court found what he was doing. He said, look at this censor, blah, blah, blah, I'm going to sue him for $10 million. And the court said, no, no, you're trying to censor him. You're trying to use lawfare to silence a non-profit. And they gave us a SLAPP ruling, a strategic litigation against public participation ruling. We got costs awarded, dismissed completely. So I have faith in the courts. But in this case, they're going to position it as: we have the right to throw out whoever we want. And I'm going to say, well, no, you don't. You don't have the right to throw someone out. They'll say, we have the right to throw out troublesome immigrants; that's basically it. I'm a legal permanent
resident. I abide by the law. I pay my taxes. I create wealth here as well through the work that
we do. And so, what they're pissed off about is my speech. And look, I don't want to fight this. I don't want to have a fight with the
government. I have to, though, because governments acting this way are one thing. And governments do act
this way. Governments are often quite naughty.
Sure do. But if we accept or we lose the will to fight, that's when all is lost. It's not
when governments behave badly. It's when we just don't care anymore or we lose the will to fight.
So that's why these cases matter, because if we win, it doesn't matter if you're Democrat or
Republican, all Americans will have stronger free speech protections from big companies who seek
to silence them or the government seeking to punish them. This isn't specifically related to
tech platforms, but it kind of is because it moves to it.
But what we're seeing happen in Minnesota right now, after the ICE officer shot and killed a young woman,
and I would call it murder, given the videos.
Even when we have videos, it shows what happened.
And most media companies are now showing this: she was veering away from him.
But the Trump administration is actively lying from the get-go about it,
even though video shows what's happening.
This is a situation where the government itself is lying, using its own ability to have free speech.
And then it's jumping online with all these, of course, once again, let's take apart the videotape all along.
So talk about what it says about information ecosystems right now, that they're not even doing the conspiracy theory thing.
They've jumped right to, we're going to lie until people are too confused in some fashion.
And then it gets iterated around the internet ecosystem or the digital ecosystem.
And I think it's one of the things that makes high-profile law enforcement cases so difficult.
I mean, I've worked with people in law enforcement.
I've sat on the Commission for Countering Extremism.
I've worked with, you know, I've talked to and understood the perspectives of people from FBI and DHS and other agencies in the United States.
We've never worked with them, but we understand where they come from.
One of the things that makes it really difficult is that instead of the investigative process,
which is slow, meticulous, careful, that they're trying to understand exactly what happened and where liability may lie and criminal liability may lie.
We instead get a screaming match on social media.
And look, I think in a moment like what happened in Minnesota, my heart absolutely, of course, goes out to the family of that young woman. I'm not a young dad, I'm in my 40s, but as a dad, it upsets me enormously to see a six-year-old who won't have their mom come home. That's the worst thing that can happen. But, you know, you've got both sides screaming based on a single video on social media about how one's a criminal, abolish this, abolish that, blah, blah. And actually, in a moment when we need sober politicians, our information ecosystem is designed in a way that advantages the dumbest politicians possible, the liars, the histrionic overreactors,
the ones who spin and mislead as much as possible. And I think that's one of the problems
with social media, one of its broader problems, that it's encouraged
democratic and political discourse that's increasingly toxic.
And then it destroys the values that underpin democracy.
And that's my real fear, Kara: that we have a window to try and get to some sort of
renegotiation with tech about the way that their engagement-based algorithms work,
or else the values that underpin democracy, and the hard-earned trust that we built in our society,
collapse. And my family's from Afghanistan. When shit goes bad?
It happens fast.
Right.
In the 70s, we had women in miniskirts in Kabul,
as my grandfather keeps reminding me because he likes miniskirts.
And we had women in government, and it happens real fast.
And so those tipping points of the breakdown in trust,
the breakdown in truth.
And look at Afghanistan now.
We'll be back in a minute.
Support for On with Kara Swisher comes from Gruns.
If you're looking for a health goal that you can actually stick to,
you might want to check out Gruns.
Gruns is a simple daily habit that delivers real benefits with minimal effort.
It's a convenient, comprehensive formula packed into a snack pack of gummies a day.
This isn't a multivitamin, a greens gummy or a prebiotic.
It's all of those things and then some at a fraction of the price.
And bonus, it tastes great.
Gruns' ingredients are backed by over 35,000 research publications.
While generic multivitamins contain only seven to nine vitamins,
Gruns have more than 20 vitamins and minerals and 60 ingredients, which include nutrient-dense
whole foods. That includes six grams of prebiotic fiber, which is three times the
amount of dietary fiber compared to the leading greens powders and more than two cups of broccoli.
It's a daily snack pack because you can't fit the amount of nutrients Grunes does into just one
gummy. Plus, that makes it a fun treat to look forward to every day. Kick off the new year
right and save up to 52% off with the code Kara at Grooens.
That's code Kara, K-A-R-R-U-N-S dot CO.
So every episode we get a question from an outside expert.
Here's yours.
Hi, everyone.
My name is Nicole Wong.
As Kara knows, the topic of online speech means a lot to me.
So I appreciate your work and being included in this conversation.
Over the last 30 years, I've worked with a lot of tech companies and a lot of governments
about how to effectively manage online speech platforms.
It's always been difficult, like playing five-dimensional chess. There has always been harmful, malicious speech,
but the difficulty is in the speech that falls in that category that's offensive to some but meaningful to
others, that gray zone. And for that category of speech, someone always has to be the decider. So I have a
multi-part question for you. First, in your ideal regulatory framework, who makes that difficult
decision about what contested speech is permissible and what's not, and how do we assess success
or failure? And second, what is the role of the government in enforcing such speech? And does it
change when we're dealing with weak democracies or authoritarian governments who don't like what
critics have to say? Thanks, and happy New Year. So that's Nicole Wong who worked for Google.
I've known her for years, one of the smartest thinkers on this, back when tech companies were smart
and had decent people working on these very difficult and thorny questions.
She also worked at Twitter. So let's start with the first one. Who gets to decide what contested
speech is okay and what isn't? And how do you gauge success? So I think the First Amendment decides
what speech is. And if you remove Section 230, if you could sue a platform, and if you were trying
to sue them for protected speech, the courts would say, sorry, that's protected speech. You cannot
sue them for it. They've won every time on that. They've won
because of 230. And I think that what we have to do is start looking at where that speech may
actually become negligence or knowing indifference and where they might be held liable.
Which might be an AI, by the way, which isn't clearly protected by Section 230, but go ahead.
So platforms have always said, like, we're being defended by 230 and the First Amendment. Take
230 away because 230 is like, it's just a blanket ban on being held liable. The problem with not
being held liable is, as you will have witnessed yourself, it starts to become a shibboleth.
It starts to become an idea that they should never be held liable, neither legally nor morally.
That's correct.
It creates a culture of indifference to the harm that they cause.
Very well said.
It's what's driven the sociopathic indifference and greed that these platforms have towards the harm that they cause.
And now, when I come along and say, well, that's mad.
They go censorship.
Which is nonsense.
I mean, utter nonsense.
Right.
So, I mean, that's how I'd answer that question: let the courts decide. Do not change the speech laws; that would be madness. Go to court. Go to court. Let the courts decide. That's what I said. The second part: what's the role of government in enforcing speech? What does that change when you're dealing with a Donald Trump, where it's whatever he likes? Like he's been saying this week, whatever his moral compass is, is what the rest of us should do, which means that's a parody.
The government doesn't have a role in enforcing speech. It has a role in protecting speech. And that's what makes what they're doing to me so outrageous. The government's role is actually to protect those rights. To protect you. You and I are both, in one respect, minorities, right? I mean, no one listening to me ever knows that I'm brown, but I am actually brown. And, you know, you're gay. These voices were suppressed. Our voices
were suppressed. Our ability to, my ability to be anywhere near the public sphere and your ability to be
open about who you love and the type of family that you want to have was suppressed. We've benefited
from the government actually protecting our right to speech. That's where the government's role is.
But what the government shouldn't do is give special protections to one kind of company because
they're their friends. That's wrong. That's not the way that governments should work. That's, you know,
that's cronyism. That's basically oligarchs who are friends of Putin being given special protections.
So your organization has published multiple studies showing the harms of various tech platforms. Your research
showed how, as you noted, ChatGPT produced instructions on self-harm and disordered eating within minutes,
how TikTok bombards young users with similar content, and how X and Meta enable the spread of
anti-Semitism. I've had round after round with Zuckerberg about this, and one
of the more famous encounters we've had, the last one we've had actually.
What are you hoping these platforms will do with that information?
Because one of the things I was trying to show in that famous interview I did was he is fundamentally unable to make these decisions and shouldn't, right?
So what should happen here with this information?
Because that's who gets to decide, is Zuckerberg?
I mean, one of the most interesting things Zuckerberg did was put out a chart a few years ago,
in which he was trying to explain why they have rules on their platform and why they enforce their rules.
Which they don't.
And on the x-axis, so, like, you know, it had how violative the content is,
so how close it is to violating their rules.
And on the y-axis, it had engagement levels.
And what it showed is that engagement is really low, really low, really low,
until you get the point of being violative and then it just shoots up.
And it keeps going up.
You know, I always say enragement equals engagement, but go ahead.
Right.
And we know this from study after study after study that shows that emotionally affective content
actually gets the most engagement.
And so essentially what it means is hate gets more engagement,
often people being pissed off at it,
and actually tolerance gets less engagement.
And if the fundamental way that your platform works
is that it doesn't reward how smart you are
or how hard you've thought about it,
but just raw engagement numbers,
he said we have to have rules
and we need to enforce those rules
because then once you go past a certain point of being violative,
it drops to zero.
What are the incentives, though, to reorder?
because one of the things Nicole and I, which I always appreciated back then, was she sort of said,
Google is fast, relevant, and gets you the answer you want. It wasn't viral and incented for engagement.
And she said, when that changed, you couldn't reorder it back to just being useful. You know,
speed and virality and engagement really fucked everything up. This was 15 years ago, she talked about this.
How do you reorder the way we, because we seem to like it, right?
Of course, humans seem to like this.
So how does it get reordered?
Is there a better way to regulate the platforms?
It isn't funnily about deciding which speech is good on a case-by-case basis
because these companies understandably don't want to be the speech police,
and they're not good at it.
So what could happen here?
So at a fundamental level, and this is one of the things that I really believe in as an organization,
we were designed to look at the spread of the stuff that hurts people in the real world.
We were never a tech organization.
We're always a consumer rights organization
that was looking at tech platforms.
I'm a great believer in the serenity prayer.
Like, you know,
grant me the serenity to accept the things I cannot change,
the courage to change the things I can,
the wisdom to know the difference.
I don't do search platforms.
We look at social media and AI platforms specifically.
You know, Cory Doctorow and others
have written about enshittification
and how essentially once Google won the franchise for search,
they then filled you up with ads.
We do look at YouTube and YouTube has got a problem.
We just did a study called Anorexia Algorithm,
looking at how YouTube was recommending anorexia content to children.
And YouTube have promised again and again and again they won't do it.
And you will know this: because they had the image of being really cool,
and they're very good at managing their image,
they got away with it.
And people kind of figure that YouTube probably wouldn't be as bad.
But YouTube is systemically mendacious
when it comes to covering up the way that their algorithms work in practice
and their inability to deal with the flood of dangerous content on there.
I had a long argument with Susan about this
because I was like whether you mean it or not, it's happening.
Like my son was looking at Ben Shapiro and suddenly got to anti-Semitism.
I'm like, well, that's a leap.
Like, this guy is the opposite.
Why is he down in anti-Semitic content?
That's a publishing decision by the platform.
When you make a recommendation to someone,
you are essentially making an editorial decision. You are suggesting to them that that's something that you think they should
watch next. And that's what I find so pernicious about it. So what then? We are in this
situation; how do we get out of it? Because tech companies, as you know, have long been opposed
to government regulation of platforms. What's new is the Trump administration is using its power on
their behalf. And I would argue that Obama was a friend of theirs also. And Biden wasn't; he got his
head handed to him by them. It's trying to suppress critics like you. It's weaponizing claims of
censorship to undermine foreign attempts to regulate content like the EU's Digital Services Act.
So what worries you most as the Trump administration goes after some of the few guardrails
that exist? And again, let me just note other countries, Australia passed a law to ban social
media for young people, and other countries are considering similar legislation.
I don't think that will happen here in the U.S., largely because I think they'll
have teens suing the government. But go ahead.
A ton of Republican states are currently considering, and some have already passed, bans. So, yes,
we may not get one at a federal level, but actually, I think you'd be surprised. I think
sentiment is moving so fast. Legislation always lags sentiment: sentiment grows, you know, linearly,
but legislation's binary. So at some point it flips from zero to one, right? So the
sentiment grows. And I think that we may see something, and it will require talented advocates,
will require people who go out there and make the arguments who can actually overcome the millions
being spent by big tech. I think that's why, you know,
CCDH has been targeted: we're actually quite good at what we do. We understand how to create
the precise fact patterns that drive action. We're quite clever in picking areas, in making
these arguments in emotive and clever ways. And also speaking to legislators in a way that they
understand, on a bipartisan basis. On the board of CCDH, we've got conservatives. You know, we've got
everyone from every side of the debate. So, but where does meaningful change come from? Because right now it feels like
a "rid me of this turbulent priest" kind of situation, right? How can meaningful regulation of
social media happen without buy-in from the US? When does that occur, this flip?
Well, this is what worries me about what's happening right now. You know, interestingly,
there have been no sanctions on Australia for having a ban on social media. What they are resisting most:
so Elon, when he sued us, the actual tort in the lawsuit was that we broke the
rules of his platform, which ban people from taking data from the platform for research.
There is a through line through all of this. We're now being punished apparently because the European
Commission said you need to have transparency. CCDH is a research organization that uses research
to hold up a mirror to these platforms. Again and again and again, the thing they're most
scared of, Cara, is transparency. Yes, they are. And watchdogs who are able to get data and create
understanding for people. Well, you know, you're manipulating that data just so you know,
according to them. But that's right. Fine. Sue us for defamation. Yeah. They never have.
Not once. Not even Musk sued us for defamation. Because, you know, we are good at what we do.
And we know that we might get sued for defamation by the world's biggest companies.
Of course, we're triple-checking everything. But look, on speech itself, I've always said the First
Amendment is very important, that you keep it to where the law already is. And some kinds of speech are
dangerous and tortious, and you may end up being sued for them. And that creates a hard barrier
on that. But the other stuff: let's tell us how we're being manipulated, and then see if advertisers
and the public still want to use your platform. Transparency is the key. If tomorrow there was
universal data access for all those guys at, you know, Stanford and Harvard and at CCDH
and everywhere else, so we could actually understand these platforms better, I think
the game's over. Yes, agreed.
Because I think once we realize that they're not just putting red dye into our
information ecosystem, they are pumping uranium into it.
Let's just be able to sue them. Let's be able to sue the bastards. And if we lose, we lose.
If we win, we win. That's just the way America is.
Here's my thesis. If everyone in America knew about the existence of Section 230 tomorrow,
the day after Section 230 would no longer exist. Because overwhelmingly, when people realize
that special protections are given to one industry, they say, that's not fair.
And Americans believe in fair, it's why I love being in America.
I am competitive.
I like competing.
I like being surrounded by people who are hyper-competitive because it makes me better at what I do.
But there have to be rules.
There has to be fair competition.
And social media platforms being given a special dispensation.
Screw that.
That is fundamentally un-American.
Yeah.
But the internet without Section 230? It's a legal minefield, having
to vet every single piece of content a user uploads.
That's a real problem with just removing it.
It's like removing your liver at this point.
So there's different ways that you could do this.
So I would argue that the status quo is so bad.
It's so bad and moving in such a bad direction
that actually radical action might be okay.
And fundamentally, why are they given any protections at all?
There are people like Mary Anne Franks and Danielle Citron,
brilliant legal theorists, who've talked about,
instead of saying that you're liable immediately, that actually it should be where there's
knowing indifference to harm. So when you're killing kids, you know about it and you just kind of go,
eh. And you know that those guys behave that way; I have been in those rooms.
Then the liability kicks in. So let's bring it back to you for the last question. You've made it clear you're not backing down from the fight with the administration or planning to change your work. What's next for this case and your organization? Talk about where it goes from here.
I mean, look, we're back in court in two months' time. The government didn't oppose extending the restraining order for a couple of months, and they gave us a fairly leisurely briefing schedule. So we're back in court in a couple of months. I look forward to being back in court. I have faith in the system. And in the meantime, you know, what they are trying to do is bleed us to death. I'll be going out there and doing what most of my job is now. I used to have a fun job; I used to actually do research and do all this stuff myself. Nowadays, it's mainly fundraising and begging people.
So I'm going to be going out there and raising money and also spreading the message.
Because as I said to you, if I can use this targeting to help more people hear about
Section 230, that gets me closer to the day when everyone in America knows that it exists.
And I think the day after everyone in America knows that it exists, it stops existing.
Well, we need to rename it, like K-pop Demon Hunters or something.
We've got to give it a new name.
So it just trips off everyone's tongue,
because I'm fundamentally a dork and I'm not very good at...
I named my organization the Center for Countering Digital Hate
because I was like, I can't be bothered explaining what it does,
so I'm just going to put it in the name.
Right, that's right. I'm just saying it's got...
You're absolutely right.
Anyway, so it goes to court and then you believe...
It could either be TACO, which is Trump Always Chickens Out, or they'll lose.
But it could bleed you.
I want to have this fight.
I think it's an important fight to have.
I think it is a fight about what the true censorship is.
And I think it's time that we puncture this nonsensical debate
that's been going on in America for the last five years
about whether or not criticizing Elon Musk
is de facto censorship. No, Elon.
He said, this is great.
When this was all announced, he was like, this is great.
The great free speech absolutist said, this is great.
No, this is censorship.
When you're told that you're going to be split from your family
because of the things that you say and advocate,
that is censorship.
And we've all got to see that it doesn't matter if you're a Republican or Democrat,
you can't let any president have that power.
I've got to fight this and I've got to win it.
All right. Thank you, Imran, for your time, and I appreciate it.
Thank you.
Today's show is produced by Cristian Castro Rossel, Michell Eloy, Megan Burney, and
Kaitlin Lynch. Nishat Kurwa is Vox Media's executive producer of podcasts.
Special thanks to Rosemary Ho and Bradley Sylvester. Our engineers are Fernando Arruda and Rick
Kwan, and our theme music is by Trackademicks. If you're already following this show,
you are fighting the good fight. If not, you're a scared little boy, and that only matters
if you are actually a little boy, which Elon Musk is not.
Go wherever you listen to podcasts, search for On with Kara Swisher and hit follow.
Thanks for listening to On with Kara Swisher from New York Magazine,
the Vox Media Podcast Network, and us.
We'll be back on Thursday with more.
