On with Kara Swisher - Sam Harris on Silicon Valley’s Slide into Techno-Authoritarianism
Episode Date: December 15, 2025
One of 2025's most memorable images was of Silicon Valley's billionaire CEOs lined up in the front rows at President Donald Trump's inauguration. It was visual proof of the tech industry's embrace of MAGA's authoritarian style of politics — one it has benefited from considerably over the last year. Author and neuroscientist Sam Harris has been using his podcast, "Making Sense," to talk about the ways tech moguls are corroding our politics, and although he used to be close with some of them, he's become a vocal critic of their support for Trump. Kara and Sam talk about why he thinks the left is to blame for the tech billionaires' shift to the right, why all of us are bad at sorting through the glut of information we find online, and the potential risks that come with the Trump administration's hands-off approach to A.I. They also talk about what possible tech regulation could look like, and whether everyday people stand a chance against tech oligarchs and their platforms. (Please note: This interview was recorded before President Trump signed an executive order to block states from passing A.I. regulation.) Questions? Comments? Email us at on@voxmedia.com or find us on YouTube, Instagram, TikTok, Threads, and Bluesky @onwithkaraswisher. Learn more about your ad choices. Visit podcastchoices.com/adchoices
Transcript
In Elon's world, there is no need to explain this change of heart, right?
So he is now going back to a man who he told us all is, if not a pedophile himself,
culpable for the rape and molestation of underage girls, right?
And somehow that's fine now.
Just kidding, folks.
Hi, everyone, from New York Magazine and the Vox Media Podcast Network.
This is on with Kara Swisher, and I'm Kara Swisher.
As we look back on 2025, one of the year's most memorable images was of all the tech billionaires at President Donald Trump's inauguration.
They were there, Mark Zuckerberg, Jeff Bezos, Sundar Pichai, Elon Musk, Sam Altman, and Tim Cook,
mingling with the Trump family, his future cabinet members, and other stars of the MAGA universe.
It was visual proof of the political shift to the far right that has been taking place at the highest levels of Silicon Valley and in tech more broadly.
Over the last year, these same men
have benefited a lot from Trump's authoritarian-style politics
insofar as he's let them do pretty much
whatever they want, which is exactly how they like it.
But amid rising fears that we're now in an AI bubble,
there's a growing realization that those benefits
may come at a huge cost to Americans sooner than people thought.
I wanted to talk to Sam Harris about Silicon Valley's shift
towards authoritarian politics.
Sam is a neuroscientist and the host of the Making Sense podcast, and he's been talking a lot lately about the power these tech billionaires have and the ways they're hollowing out democracy.
I was really struck by how out there Sam was on this.
He was very close to most of the tech moguls.
We disagreed on a lot of things, including his thoughts on Islam.
And we've had some difficult conversations over the years.
But I was really struck by, I would say, the courage on his part, because he was saying things that no one else was saying besides myself and some other journalists.
And it came at a cost to him that was probably pretty high.
And he is very articulately expressing what a lot of people in tech feel and are too cowardly to say, and he is not. So let's get into my conversation with Sam.
Our expert question today comes from Kim Scott. She's the author of the book Radical Candor
and a former executive at Google and Apple, so don't go anywhere.
Sam, thanks for coming on On.
Happy to be here. Great to see you, Kara.
Good to see you after a while.
I don't remember the last time we talked. It was many years ago.
But things have changed, as they say.
And when it comes to Silicon Valley's shift towards authoritarian politics, the word we're hearing a lot these days is techno-fascism.
I want you to talk a little bit about your journey, because at the beginning of your career, you were like the toast of these people.
Like, they loved talking to you and pointing you out to me and stuff like that.
And much like Yuval Harari, that has changed, right?
I think Yuval wrote a pretty critical book about where tech was going, and you yourself have changed in the way you think about it.
Maybe that's not quite right.
But I'd love to know how you define the word techno-fascism and if you think it's the right word to describe this moment.
Yeah, so to start with the personal, I don't think I've changed.
My views are more or less exactly as they were, at least on the relevant topics that I think we're going to touch.
No, I just noticed a bunch of people change around me.
And several of these key people were friends.
So, you know, for me, I don't think this is so much the case for Yuval.
But for me, it really is a story of finding friends with immense public platforms, you know, doing a lot of harm and finding it impossible not to comment on that harm.
And really, that's just spelling the end of the friendship.
Right.
But I think there's a problem.
I mean, I occasionally use the word fascist myself, but I think there's a problem in, you know, using the term. One is it's just so triggering for anyone right of center who thinks that the analogy to what was happening in Weimar Germany is just false on many levels. And on certain
points, it is false. So fascism is just a historically contingent movement, which is, I mean,
I think it's much easier to say something like authoritarianism, you know, or tyranny or something
more generic. So I think a lot of these guys have, for whatever reason, unmasked their inner authoritarian, and they're not so fond of real democratic institutions at the moment,
and they're fond of oligarchy and kleptocracy and authoritarian puncture of most of what has kept our system as healthy as it's been for as long as we can remember.
And international alliances are unwinding as a result of this, et cetera.
You haven't changed.
That is absolutely true.
But one of the things, and I feel like I had a very similar purge, although I did see elements of it, right?
They were irritated by the press.
They often structure their companies in ways that they had full control, whether it's Mark Zuckerberg at Facebook, but they always wanted complete control of their businesses, which I suppose I didn't mind, right?
It made a lot of sense back then because a lot of people got run over by VCs or whatever, the stock market.
And so I understood why they were that way.
But what would be the word you'd use? Is it techno-authoritarianism, or is there a word?
Well, it sort of depends on which person and phenomenon we're talking about
because I think there are certain flavors of the unraveling here.
I mean, on one level, I mean, if you're going to talk about somebody like Elon,
so much of it is just trolling and a kind of solipsistic nihilism, right?
He's just out of touch ethically with the harm he's causing.
I think for me, that's the only way I can explain it.
Not new, because he used to, remember, he used to sort of attack anyone who had any normal criticism of Teslas, for example, when it was just a product.
But go ahead.
Yeah, I mean, so I missed that.
Like, I was, you know, I was really just his friend.
I really never saw him, you know, in his mode as being a boss.
So for me, in speaking about, you know, specific people where the relationships are now no longer friendly, I don't know if it's a story of I didn't know who they were in the first place or, you know, they changed, right?
So, you know, either accounts for the data at the moment.
But just take, you know, Elon in his brief career as Doge master of the United States.
You know, the person I knew seemed to have been incapable of behaving the way I saw this person behave.
I mean, if you're the richest person on Earth tasked with making decisions that are suddenly going to harm some of the poorest people on Earth, if you're cutting 20,000 health care workers in Uganda overnight, and you're immiserating hundreds of thousands, if not millions of people who are relying on medical aid for tuberculosis or malaria treatment or HIV, and many of whom are kids under the age of five, right? If you're doing all that work, and you have all these other jobs, and you're the busiest man on earth, are you going to spend your days and nights shitposting on X, celebrating your destruction of the greatest source of aid to the developing world, USAID, as feeding it into the woodchipper, and calling all the people whose lives and careers you're upending over there criminals?
I just say, I mean, he behaved like a total maniac.
And, you know, then discovers that, you know,
one of the engineers he's hired to do this onerous work
is a committed racist, and rather than just fire that person,
he does a poll on X to see whether he should fire him,
and then discovers that 80% of his fans actually
have a soft spot for racism, and then jumps out in front of a crowd and does what looks like
a Nazi salute not once but twice. And then when there's the predictable controversy in association
with that, rather than apologize like a sane human being and prove that his heart is actually
in the right place, he makes Nazi jokes on X and just trolls everyone. And so there's this
spirit of a kind of adolescent amorality. I mean, just forget about immorality. I mean, I think some people
are genuinely sinister and happily create harms, but many people are just detached from the ethical
implications of what they do. And in a person like Elon, when what you do is literally affecting
the lives of countries at war and the lives of millions, I mean, this is a monstrous evil
for which he and his grifting friends are culpable. So, and everyone who is maintaining
cordial relationships with him and them in that orbit are,
in my view, participating in this evil, right?
I mean, there should have been, I mean, right of center,
it is such a fever swamp of conspiracism and delusion.
Elon's reputation is well intact,
but Bill Gates is thought of as a dangerous pedophile
who was probably putting chips in us during, you know,
within the COVID vaccines to somehow track us.
Meanwhile, you know, I mean, Gates, I don't know Bill, you know, he could be an odd person.
Leaving that aside, there's no question that, in his philanthropic work through the Gates Foundation, you know, he and his former wife have saved more lives than anyone in living memory. And it's the opposite of what Elon and his friends did in sub-Saharan Africa under Doge. And yet, right of center, you know, up is down and down is up, ethically.
No one cares about those distinctions. Right, absolutely, because they're so removed from that.
But let's talk about how we got here.
Silicon Valley's always had a libertarian streak.
Or I call it libertarian light, but especially with founders and executives.
And the core is the idolatry of innovators, really,
and individual entrepreneurs who take risks and get rewarded.
Let's talk about that shift, because this new embrace of authoritarianism and right-wing politics,
is it so different from the past?
Or what are the roots of the new shift from your perspective?
Because you've watched this.
You've been closer to them than I have.
I've covered them.
I'm not their friends.
I wouldn't call any of them friends.
You've explained the shift towards authoritarianism
as a backlash of left-wing moral panic or wokeness,
and that's certainly present when they talk about it.
So draw a line for us as you see it.
Well, I do think the backlash against the far left is sincere.
I mean, because I feel it myself, right?
I mean, there's what happened on the left in Democratic politics
created a kind of super stimulus politically and ethically,
Where it just got so crazy, or was perceived to be so crazy, by everyone. I mean, even people like me. I consider myself left of center.
I don't even consider myself in the center, but on every other relevant issue.
But when you have someone like, you know, Kamala Harris running for president against the most odious opponent we've seen in living memory, I mean, somebody who has, you know, done his best to shatter our society with lies and tried to steal an election all the while claiming it was being stolen from him, and engineered, you know, something like an insurrection, if not an attempted coup.
The fact that we had a Democratic candidate, Kamala Harris, who couldn't disavow her
former support for taxpayer-funded gender reassignment surgeries for incarcerated illegal
immigrants at taxpayer expense, right?
I mean, that was so, that kind of thing is so crazy-making.
It's such an unnecessary own goal on the left.
I mean, of course, I think people with gender dysphoria should be, you know, have political equality, et cetera.
But, I mean, there was just something, the capture of the, of democratic politics by the far left activist class where everything was a series of blasphemy tests around identity politics, whether it was the trans issue or Black Lives Matter.
So, someone like Elon, I mean, when Elon said, you know, he's going to go to war against the woke mind virus, and he just confessed that he was totally radicalized by the experience of, you know, in his mind, losing a son to a transgender cult, I think that is actually sincere. I don't think there's an ulterior motive behind that. I think that was...
Perhaps not. Perhaps not. But, I mean, I think COVID had a bit to do with it. But, you know, Trump couldn't admit, for example, he lost the election, he couldn't disavow his racist supporters.
But each of these tech leaders, I get that it's irritating, but how do these tech leaders go from feeling frustrated about the MeToo movement, cancel culture, the Gaza campus protests, and the accusations of racial bias to supporting someone who tried to overturn a democratic election?
It seems a bit of a reaction.
Well, so what I can't explain is why someone in the center, as many of these people were, or claimed to be, couldn't keep both grotesque objects in view simultaneously on the far left and on the far right
and in Trumpistan and keep some sense of proportion, right?
I mean, so I feel everything that Elon presumably felt about, you know, the far left,
certainly virtually everything, and yet I always saw the danger of Trump and Trumpism
as being far more grave to our, you know, the functioning of our society.
Right. So I think you have to respond to both. But it is, I just, I did notice this pendulum swing that, you know, that everyone who was super sensitized to what was happening on the far left grew more and more acquiescent toward what seemed like the only viable corrective coming from Trump and Trumpism, right?
And it also, you know, they were talking about censorship and then they ban books. It goes the extra step, right?
talking about issues with trans people
or possibilities of having a disagreement
over sports, for example,
and then going to banning them completely
and removing rights.
Like, one does not equal the other in any way.
No, I mean, there's no making sense
of the failures of coherence
among these new, freshly minted fans
of Trump and Trumpism.
So, I mean, if you pretended to care about
what was on Hunter Biden's laptop,
because of all the corruption it might have indicated coming from the Biden family,
how can you not care about what the Trump family is doing this very hour that outshines
the worst of what could have possibly happened under Biden's watch by a factor of a thousand,
right?
I mean, it's just the Trump family's adventures in cryptocurrency alone, you know, the meme coin.
I mean, there's absolutely nothing ever alleged against Biden that holds a candle to that.
And yet, you're not seeing the same people who were tearing their hair out over Hunter Biden's laptop
and its suppression by the mainstream media or discussion of its suppression by the mainstream media.
It's just these people are not coherent, right?
They're tribal.
I have an explanation.
Let's have a hypothetical.
Let's say the Biden administration and Democrats had championed all the same woke causes like racial equality, more open immigration policy, trans rights, but they cut taxes and took a hands-off approach to regulating tech, crypto, and AI. Do you think we'd still see that shift to the right? Because, yes, there are true believers in the MAGA cause, no question. But it's shocking to see someone like Sam Altman or Jensen Huang or Tim Cook embrace it. And it seems like pure shareholder expediency to me. That's it.
Yeah. Well, once you draw the circle wide enough to include a visibly nervous Tim Cook
delivering some crappy solid gold Apple merch to the president, I think then you're just talking about political expediency and, you know, the thought that they have
some fiduciary responsibility running a public company to do whatever it takes to raise their share
price in the presence of a very vindictive, very kind of patronage pushing president, you know,
and that's, I think that's despicable. I mean, I think it's just an obvious failure of courage
on all of their parts.
I mean, if having, you know, many billions of dollars personally isn't sufficient to cause
you to grow a spine, I don't know, you know, I don't know what it would take.
So, you know, every billionaire who bent the knee to this guy, I think, really has something
to apologize for in the end.
But it's, I think it's explained in those cases by not some newfound respect for Trump as
some kind of business genius, but just recognizing that, okay, the country's being run by a personality cult,
and at the center of that cult is a man
whose oxygen really is flattery, right?
And the only way to get anything done now
for at least three plus more years
is to pay obeisance to the guy
and make him feel like a genius in your presence
because you're falling in all over him.
Right. Now, there's also a spiritual component.
I mean, you've written about issues around religion
quite a bit to this embrace of authoritarian politics.
Elon Musk used to question the idea of God.
Now he claims he's a cultural Christian.
There's also Peter Thiel.
He gave a series of lectures where he warned
that critics of technology and AI
were ushering in an arrival of the Antichrist
and the destruction of the U.S.
What do you hear when the tech leaders invoke religion and spirituality, especially as it relates to tech, like Thiel does?
Now, Thiel, he had religious feelings, but the rest of them, not so much.
Yeah, well, so everyone is sort of working out their existential crisis in various ways.
So, you know, a lot of these guys have gone to Burning Man for years and done psychedelics for years.
And, you know, the frame you put around all that conceptually is somewhat dependent on just, you know,
the books you've read and the conversations you've had or haven't had and the ideas you've found persuasive.
the place where it touches politics in a way that is at least intelligible to me and some of these concerns I share, as you know, is that I do think there is a zero-sum contest between open societies generally. I mean, even beyond the West, but I mean, to talk about, it's usually framed as Western culture, but it's beyond that. And Islamic fundamentalism, the world over, you know, so it's, I mean, we really do have a tension there, which I've spent a lot of time, you know, whinging about, and,
I think the last time we had a podcast, I whinged about that for the full hour.
Yeah.
But, and to some degree, you know, various characters in this space can probably share that same hobby horse.
They look at the fact that in the immediate aftermath of October 7th, you know, you can literally get a crowd of, you know, several hundred thousand people out onto the streets of London seemingly in support of Hamas or effectively in support of Hamas, or at least not disposed to view Hamas as a terrorist organization, et cetera.
And then somehow that, that moral confusion spreads to the finest campuses in our own country, right?
So we have Columbia, Harvard, et cetera.
And then you have Harvard, the presidents of three of those finest colleges brought before Congress,
and they can't seem to figure out how to say that, you know, support for genocide is against their terms of service.
And, yes, there's a lot of fine print on all of those moments.
And, you know, I'm happy to go down any of those rabbit holes if you want to.
But the general picture here is one of people left of center being ruled by identity politics and white guilt and a criticism of the West that runs so deep, sort of in the vein of Noam Chomsky, such that America's blamed for everything.
The West is blamed for everything.
It's all colonialism and racism and slavery and awfulness.
And we have nothing to be proud of.
And, you know, there's a lot that makes sense of that.
There's also many half-truths and lies, you know, weaving through all of that discourse.
And at the end of the day, it's just not a healthy way to run our politics.
Sure, but we talked about the excesses of the left, which were obvious, and that is certainly a rabbit hole.
My point is, to say tech regulation could usher in the Antichrist and the destruction of a country, that is as extreme as it gets.
On its face seems crazy to me.
But what I'm just trying to do is draw some point of contact between statements of that kind that seem extreme and cultic and weird and proto-fascist to some set of concerns that I consider rational, right?
So Peter, if I were sitting across from him, I don't know Peter very well, and I know him, but I think we would both agree that Western Europe has a problem of assimilation with its Muslim community that we're happy to not have here in America, and we don't want in America, right? So, in some sense,
we would both agree with the project of let's not let America resemble Belgium with respect to
this variable too much. Sure, but how does it get to anyone who wants tech regulation is the
Antichrist? That's a step way down. I don't even understand it. I just, I think part of me
thinks it's cynical. Well, I think so, I mean, if you imagine jumping from the critique of DEI, right?
So what DEI was, in their view,
is just a, the nullification of anything like meritocracy.
We're not going to care about getting the best man or woman for the job.
We're just going to care about checking quotas
because we're so guilty about, you know, our racist past.
And so that, from their kind of pure entrepreneurial frame, looks like
just a race toward mediocrity and failure.
And they don't, you know, so they don't want to be forced to do that.
Yes, I get it.
I get it.
It's the extreme nature.
And also, by the way, it's a mirrortocracy, not a meritocracy.
Often it's friends of friends and friends of Peter.
And now it's friends of Trump.
Oh, yeah.
Because they've gotten, you know, triggered by an obnoxious college student who's kind of allowed to do that on some level when they're younger.
But, you know, it's sort of a bad faith definition of DEI, because excesses are different than, wow, this seems funny that only white guys can win.
Like, hmm, this seems off, you know, unless they're better, right?
But what...
Well, I mean, what you have in the Trump administration is DEI for conspiracists and loons and
grifters and sycophants, right?
I mean, this is not...
If you care about meritocracy, then you should care that we have the least qualified, you
know, high-level state servants we've ever had.
I mean, it's just incredible.
I mean, there's a few exceptions there, but it's incredible to see who's making these
decisions.
We've got people who are used to doing condo deals trying to figure out the fate of nations.
Right, right, exactly.
So that would be Mr. Witkoff.
The venture capitalist Mark Andreessen did an interview with the New York Times back in January
where he explained his shift from Democratic voter to Trump supporter in 2024,
although he wasn't much of a Democrat.
I didn't even know his politics, I'll be honest with you.
And one thing that comes through in the interview is his core belief that tech companies
are run by good people doing good things that make the world a better place.
Now there's plenty of evidence it's not that way.
So the spread of election disinformation, deep fakes, loss of privacy, lack of safety, increasing polarization,
loneliness epidemic.
I've been speaking to parents who allege their children died by suicide with the encouragement of chatbots.
Very compelling and depressing situation.
So this shift to MAGA, does it have to do with pure ego?
I feel like I was a little too mean to Andreessen, and suddenly he's like, he's the good guy,
and he couldn't even imagine that some of these things are problematic.
It seems like their feelings are hurt a lot by the general public seeing them as bad guys,
and so they blame Democrats, who mostly were hand in glove with them for a very long time, by the way.
Well, there is an asymmetry here, which I've noticed among people who are public figures to one or another degree,
who got crosswise with the left and got vilified by people whose politics, they really shared across the board.
I mean, people who support gay rights and, you know, gender equality and everything, just everything on the list.
legalization of drugs, et cetera.
But, I mean, what you found is just the kind of the narcissism of the small difference on
the left to a degree that no one had ever really experienced before.
And you would just be defenestrated the moment you got slightly out of step with the,
with, you know, whatever, you know, Shibboleth was then being uttered.
And then they got, you know, simultaneously love bombed by everyone right of center, you know.
So you have the, I mean, it's not just MAGA, but it's just everyone right of center will accept your Pilgrim's Progress rightward without any judgment.
I mean, you're absolved of all your sins,
and no matter how much you criticized these guys in the past,
the moments you see the light about how awful wokeness is,
they embrace you with open arms.
And so to some degree, I feel like we've witnessed this very weird sociological experiment
where lots of people with fairly prominent platforms
just got inducted into a kind of cult
because their families were treating them terribly.
And it just felt better, right of center,
complaining about the left and how crazy it was.
And when you think about that, it is.
A lot of it's hurt feelings.
It honestly feels like it in a lot of ways.
I'll make the argument that they may have gotten defenestrated, but there was a net right there.
They didn't get that hurt.
They got to come back.
they're richer than ever.
I mean, many of these guys were out of the closet hating Trump the first time around, and for all the plausible reasons. I mean, even J.D. Vance said that Trump is either, you know, an asshole of the first
order, or America's Hitler, right? And yet, no one ever has to give a rational accounting
of how their views have changed. I mean, after January 6th, some of the guys on the All-In
podcast, I think it was Chamath most vocally, you know, in the aftermath of January 6th,
I think he said that, you know, Trump should be in jail for the rest of his life, right?
It's like, so how these guys went from there to where they are now has never been explained.
And they feel no burden to explain it because they have cultivated audiences that simply don't care about these kinds of ethical incoherency.
Now we're doing this.
Yeah, it's really quite interesting.
We'll be back in a minute.
Support for On with Kara Swisher comes from Indeed.
Right now, there's a talented person out in the world who could help take your business to the next level.
But finding that person doesn't need to be a grind.
Just use Indeed sponsored jobs.
It boosts your job posting to reach quality candidates
so you can connect with the exact people you want faster.
And it makes a big difference.
According to Indeed data,
sponsored jobs posted directly on Indeed are 90% more likely to report a hire
than non-sponsored jobs because you reach a bigger pool of quality candidates.
Join the 1.6 million companies that sponsor their jobs with Indeed.
So you can spend more time interviewing candidates who check all your boxes. Less stress, less time, and more results with Indeed
sponsored jobs. And listeners to this show will get a $75 sponsored job credit to help get your job
the premium status it deserves at Indeed.com slash on. Go to indeed.com slash on right now and
support our show by saying you heard about Indeed on this podcast. Indeed.com slash on terms and
conditions apply. Hiring? Do it the right way with Indeed.
Support for this show comes from Quince.
Every now and then, you find a gift so good
you almost always want to keep it for yourself.
This holiday season, Quince has cold weather essentials
that might tempt you to do exactly that.
Quince is offering $50 Mongolian cashmere sweaters
that look and feel like designer pieces,
silk tops, and skirts for any season,
and must-have down outerwear built to take on the cold. Not to mention beautifully tailored Italian wool coats that are soft to the touch and crafted to last for years. As always, Quince is able to offer quality
that rivals high-end brands without the high-end markup. I've gotten a lot of quince stuff
myself over the years, including a really comfortable down cape, which I look fantastic in, obviously, soft pants, and a wide variety of things that are really warm that I can layer on, vests, things like that. I really enjoy the stuff I wear from Quince and I wear it every day and it's
really good because it's not very costly. You can find gifts so good you want to keep them with
quince. You can go to quince.com slash Kara for free shipping on your orders and 365 day returns,
now available in Canada, too. That's quince.com slash Kara to get free shipping and 365 day returns.
Quince.com slash Kara.
Support for this show comes from Delete Me. Delete Me makes it easy, quick, and safe to remove your personal data online at a time when surveillance and data breaches are common enough to make everyone vulnerable. Delete Me does all the hard work of wiping you and your family's personal information from data broker websites. How does it work? You sign up and provide Delete Me with exactly what information you want deleted, and their experts take it from there. And it's not just a one-time thing. Delete Me can constantly monitor and remove personal information you don't want on the internet throughout the year.
I've gotten to use Delete Me, and I find it really easy to use.
They have a simple dashboard that's easy to navigate and understand.
I'm a privacy nut, and I am always so surprised about how much of my information is out there.
Much of it inaccurate, which is also just as bad.
I get updates all the time, and I act on them, and I really like having the ability to use it.
I've used a lot of these services, and most of them I find confounding.
This is super easy and very simple to use, and that's what's important.
You can take control of your data and keep your private life private by signing up for Delete Me
now at a special discount for our listeners.
Get 20% off your DeleteMe plan
when you go to join DeleteMe.com slash Kara
and use the promo code Kara at checkout.
The only way to get 20% off
is to go to join deleteme.com slash Kara
and enter code Kara at checkout.
That's join deleteme.com slash Kara code Kara.
Let's move on to where the shift
towards authoritarianism in tech leaves everyday people.
In an interview with the Atlantic's David Frum, you said we are in a Tower of Babel moment.
That informational guardrails are gone and in their place is a kind of, quote, new religion of anti-establishment conspiracy theorizing.
Expand on that as a neuroscientist, why do we struggle to sort through the glut of information online and distinguish the good from the bad?
Well, I think what we're finding, I mean, we sort of always knew this, but in the presence of the Internet and social media, it's just become excruciatingly obvious, is that some percentage of us really have a taste for conspiracy thinking.
And, you know, so it's not that we just believe one of them.
Not a new thing.
No, it's not new at all.
I mean, it's, you know, obviously it was with, you know, JFK and before.
But you find that it's just this kind of characterological trait that, you know, you love
one of these anti-establishment narratives and you seem to love all of them.
And someone like Joe Rogan is one of the poster boys for this style of thinking, right?
It's just like the contrarian take on some world event is endlessly interesting.
And it doesn't have to, and the search for anomalies in the standard interpretation is endlessly interesting.
And what you get there, I mean, certainly when it gets weaponized on social media, is you get just the utter derangement of our politics.
And it happens on both the left and the right, but it certainly happens more on the right.
And traditionally it's been more on the right.
So you get, you know, Candace Owens, who, you know, within minutes of Charlie Kirk's assassination was cooking up some alternate theory about what happened there.
And it has, you know, she has an audience of millions who want to hear about this, right?
And, you know, and so I don't know if you've ever followed how crazy she's gotten, but, I mean, you know, she and Alex Jones are, you know, one-upping each other into some sort of, you know, mystical state of confusion. So everyone is captured by the algorithm, and it's a very perverse and weird algorithm
as you go right of center. And it's one that's being consciously gamed by, you know, Russian bots
and troll farms. I mean, it's just like we've built this digital infrastructure and invited
our enemies to weaponize it against us, and we've monetized their efforts to drive us crazy
in some kind of psy-op. Tell me, you know, as I said, as a neuroscientist, why does that work? The inability to distinguish the good from the bad?
Is it the glut of information?
Is it how it's delivered?
What is different here?
Because there was always a JFK, alligators in the toilet, whatever causes cancer.
There's a range.
I mean, a lot of people don't fall into this bin, but many of us are very uncomfortable
without cognitive closure.
I mean, just admitting that we don't know why something happened.
Or we just have some probabilistic sense of, you know, I feel like maybe it's a 50% chance it was this or a 50% chance it was that and I'm okay with that. There's this desperation in many people to feel certainty on whatever point they're entertaining. And what conspiracy thinking offers you, I mean, it offers many things, but it offers this kind of race to cognitive closure, which is very entertaining. It's also very empowering.
for the disempowered.
I mean, if you're somebody who doesn't feel like you're at the center of the action
or your, you know, the world is the way you want it,
it's very tempting to become part of a conversation that puts you, this, you know, this person on the fringe who no one's listening to, at the center of the truth, right?
You have the true epistemology, right?
And so it's a kind of, I mean, it's a kind of pornography of doubt, in my view, and a kind of religion that gets spun up around achieving cognitive closure
based on, you know, not especially rational thinking. Right. So one of the most striking examples that supports the idea that we're in a Tower of Babel moment is the ascension of Robert F. Kennedy Jr., a long-time vaccine skeptic. I just spent the day at University of
Pennsylvania, and they're all going to flee the United States, all these amazing scientists, you know,
in mRNA and other things, by the way. Twenty years ago, the U.S. was on the verge of halting the spread of measles. Now, thanks to the spread of vaccine misinformation,
outbreaks are getting worse. There's a big one in the wake of Thanksgiving. Two children
died of measles this year. I think there were zero for many, many years. And when the Biden administration pushed platforms to deal with COVID and election misinformation, they got accused
of censoring conservative voices. So what's the solution? Are we just going to continue to
go into this pornography of information and being misinformed?
Yeah, well, you know, barring some real consequence that gets everyone's attention, I think we are.
I mean, you know, some kind of economic or epidemiological or environmental catastrophe that is just unignorable.
I mean, when you look at COVID, COVID was almost the worst case scenario, because it was bad enough that our kind of seemingly rational response to it at every stage along the way was seriously disruptive to society, but it wasn't so bad that you couldn't have a wide, you know, diversity of opinion about what was rational to do. And you had a lot of people who could just
say, well, this is just a cold, and I'm not, I don't care about it. And that was, you know,
while that wasn't strictly rational, lots of people got away with living that way, and they
can now say, see, this was all much ado about nothing. And you had a lot of liberals
trying to, in their own authoritarian way,
trying to control my life,
and I'm going to have none of it.
And so I feel like we probably learned
many of the wrong lessons from COVID.
It wasn't killer enough.
It didn't kill us enough.
Yeah, and so, yeah, you could imagine,
and it also killed old people preferentially
and not kids preferentially.
And so, I mean, next time around,
we might be in the presence of a virus
that is killing kids, right?
And that will provoke a very different response, I think,
or one would imagine.
Right, but that's what measles are, by the way.
What would you say, it's just not widespread,
it's just a small group of evil,
but what about January 6th?
Why wasn't that enough?
It was one of the most witnessed live events,
live stream on TV.
It was appalling to watch.
It didn't work.
But you could, I mean, then you're Tucker Carlson,
and you can show the weird footage of cops just,
you know, obviously scared cops,
just letting in a mob because they knew they were not in a position
to resist the mob.
And so it looked like just kind of a friendly opening of the door, you know, some kind of collaboration with the crowd.
And so, was this a false flag operation?
And then you get this discourse of this counter-narrative, which is obviously crazy.
Right.
But again, what is so corrosive here is that now we have a culture, or at least parts of culture,
that doesn't care about hypocrisy and lies on any level, right?
So, like, so Tucker suffered no rebuke from his own audience.
when it was revealed that for years
he had been pretending to like Donald Trump
and yet behind closed doors he was texting
that he thought Trump was a demonic force
and he couldn't wait to be rid of him.
Yeah, same thing with Dan Bongino just recently.
I was paid to...
Right, so, I mean, I just, you know,
perhaps you feel the same way.
I cannot imagine having an audience
that would stay with me
should it be revealed that I was that duplicitous, right?
I mean, like, just perpetrating a fraud
on them for years, just pretending to think one thing
and thinking the other thing.
You don't have a mother who loves Fox News, then.
It works. Cognitively, it works.
Yeah. I mean, that's a culture that doesn't care
about the truth, even, you know,
and doesn't care about being lied to,
even by the people they are putting their trust in, you know.
In Trump, we have somebody who lies with a velocity
that we have never witnessed in public life,
and yet his fans, it's not like
his fans don't think he lies. They know he lies. They just simply don't care about that.
Right, right. And speaking of which, it's the reason dictators like Russia's Vladimir Putin love
the idea of politicizing truth. No one knows what to believe, so people often end up going along with the narrative pushed by those in power. In this case, the power to influence opinion is concentrated in the hands of a few billionaires outside the government, in tech. In this case, people like
Musk, they're mainstreaming fringe ideas like the Great Replacement Theory and vaccine skepticism or
whatever. I just spoke to journalist Jacob
Silverman, who wrote the book about Silicon Valley's
radicalization called Gilded Rage.
Get it? Gilded Rage. He said it's disturbing that we're still not really having a conversation about the lack of accountability that these tech leaders face for their behavior.
But that said, the polls
are showing the public, the wider public,
is deeply distrustful of these people,
both on the left and the right.
So is that conversation gaining
traction from your perspective? It
feels like it.
But I feel like, I mean, with AI, if anything like the dreams of AI are realized, I mean, leaving aside, you know, the existential concerns around, you know, the misalignment of AI with our well-being.
But just, just if AI succeeds and, you know, basically cancels the need for most human labor, we're going to have a reckoning with wealth inequality that we just simply can't avoid any longer, right?
I mean, so it's been coming for as long as we've been alive, really, but it's AI, I think, is going to force the issue in a way that will be undeniable.
Yes, which people feel in their bones, right?
You can feel, people feel it, even if it is overwrought.
And also, I mean, just if jobs, especially white collar, you know, higher cognition, higher status jobs evaporate preferentially in the face of this tech, I just think, yeah, you know, the pitchforks are coming, whether they're carried by people who, you know, used to be plumbers or people who used to be software developers.
I just think in success, we're going to have to figure out how to have a far more equitable society than we have.
Yeah, I've always told them that.
I'm like, you're just pushing off the day where you have to armor plate everything.
Yeah.
We'll be back in a minute.
If you're looking for the perfect holiday gift and you want to give something more thoughtful than another gadget or pair of socks, here's my suggestion, a subscription to New York Magazine.
I've been part of New York Magazine for a while now, and I can tell you it's some of the best journalism out there.
From A.I. in classrooms to the future of media, New York Magazine digs into the stories, ideas, and people shaping culture today.
And right now, when you subscribe or give an annual subscription, you'll get a free New York Magazine Beanie.
New York Magazine is the gift that informs, entertains, and keeps on giving all year long.
Head to NYMag.com slash gift to get started.
Support for the show comes from New York Magazine's The Strategist.
The Strategist helps people who want to shop the Internet smartly.
Its editors are reporters, testers, and obsessives.
You can think of them as your shopaholic friends who care equally about function, value, innovation, and good taste.
And their new feature, the Gift Scout, takes the best of their reporting and recommendations and uses it to surface gifts for the most hard-to-shop-for people on your list.
All you have to do is type in a description of that person,
like your parent who swears they don't want anything,
or your brother-in-law who's a tech junkie, or your niece with a sweet tooth.
And the Gift Scout will scan through all of the products they've written about and come up with some relevant suggestions.
The more specific you make your requests, the better.
Even down to the age range. Every single product you'll see is something they've written about, so you can be confident that your gift has the Strategist's seal of approval. Visit the strategist.com slash gift scout to try it out yourself.
Some people would tell you the iPhone 4 was the best iPhone ever. But there's almost no denying
that it was the most dramatic iPhone ever. It was leaked in a bar in California before it ever came
out. And then after it came out, there was such a huge scandal that Steve Jobs himself had to hold an
event and address it. This week on version history, a new chat show about old technology, we're
talking about the whole history of the iPhone 4, what it meant, why it was so chaotic, and whether
it really is the best iPhone ever. All that on version history, wherever you get podcasts.
In every episode, we get a question from an outside expert. Here's yours. Hi, Sam. I'm Kim Scott. I'm the author of the book Radical Candor. From where I sit here in Silicon Valley, it seems like affluence and courage
have become negatively correlated here. The wealthy believe they have too much to lose if they speak
out. What is so alarming to me now isn't that an authoritarian president and a small cadre of
right-wing tech execs want to take over. What is truly surprising to me is that the same people who
once believed in the power of technology to strengthen our society and our democracy have
gone ominously quiet. Fear is a tyrant's best weapon. The people in Silicon Valley who have
made fortunes still have a platform and a voice. Now is the time to use them and to stand
together in solidarity. Our silence will not protect us. Why do you think more people are not
speaking out? And how should we be speaking out? That's a great question. Yeah. Well, so, I mean, we're watching
the, the ruination of so much that we care about, right? I mean, if you look at the stature of
our country on the world stage, right? And what our country represents to our, you know, allies and
former allies and to countries that aspire, can only aspire to be democracies, right? But we used to
stand for quite a lot. I mean, it's not that we don't have things to apologize for.
We have a checkered past. I mean, it's been 250 years. But on our best days, we really, you know, could
be credibly said to care about, you know, whether you, you know, killed or jailed your journalists
or, you know, rigged your elections or, or, um, imprisoned your political dissidents, right?
And now, as a country, we seem to only care about whether you give our president and his rapacious family a hotel deal in your capital city, right?
I mean, or whether you buy their, whether you enrich them directly by buying millions of dollars worth of, or billions of dollars worth of their cryptocurrency, right?
I mean, that's where it's an extortion racket that we're running on the world stage at the moment.
So why aren't people saying things? I mean, both of us are smaller than all those people from a financial point of view. And I have an independent company. You have your own independent company. We've, you know,
we sort of carved out these podcasts little islands or whatever that seem immune at this point.
But why haven't others spoken up? What is your sense? They're terrified. They call me. I get a lot of attagirl, good job, good for you. I was like, yeah, and you? You know, please step up.
Well, I mean, so, I mean, it's often a story of incentives. I mean, so I think there is something perverse around the incentives of running a public company and imagining that your fiduciary duty
to shareholders is the kind of a master value, right, that covers every other conceivable sin, right?
So you can literally, you know, shake hands with the next Hitler because that is what the market,
you know, your future in the market will require. I'm not saying Trump is Hitler, but, I mean,
you get my point. It's just a failure mode of capitalism that we have to correct for.
And every one of these guys who we know thinks Trump is an imposter, really.
I mean, even as a businessman, an imposter, right?
None of these guys think Trump was even a major real estate developer, you know, when he was a real estate developer.
You know, "what a moron" is what I hear from a lot of them.
What a moron.
Yeah.
So they're pretending to lionize him.
And, again, if, I mean, some of these guys have $100 billion or more, right?
You know, or, you know, I mean, 10 billion should be enough to inoculate you against these fears of being personally ruined by pissing off the President of the United States in our country, right?
I mean, so this is what we needed.
We needed someone like Jeff Bezos or Marc Benioff or, you know, any of these guys to say, listen, this is America.
I'm not bending the knee to this maniac.
I run a very important public company.
If he seeks to damage it, everyone in the press is going to notice that, and we're going to talk about it, and we're going to ruin the political party that put him in power, because this is not the kind of country we live in, right?
And where was that guy or gal who's running a public company, who's inured against personal harm because, you know, this is still America and they're as wealthy as they could ever need to be?
I mean, it's crazy-making that we don't have dozens of examples of that.
I would agree.
So let's get some solutions to finish up.
Right now, some of the biggest proponents of tech regulation are coming from within MAGA.
Interestingly, Florida Governor Ron DeSantis unveiled the Citizens Bill of Rights for AI.
Steve Bannon has said AI is likely the most dangerous technology in the history of mankind.
Georgia Congresswoman Marjorie Taylor Green has also expressed fears about the expansion of AI with no guardrails.
I don't agree with them on most things.
they always end up showing their actual ingredients very quickly.
But is that a growing civil war between the populists and tech CEOs?
Do you see that playing out, and can you imagine a scenario where populists actually win and regulate the tech companies?
Yeah, well, I can imagine if it became politically dominant, then it would be dominant.
I mean, I don't know what the limit is, the limit of influence is from the oligarchs at this moment.
I mean, they don't get everything they want because in some sense they're, you know, they're not aligned.
I mean, like, you know, Jeff Bezos and Elon Musk can't get precisely what they want if they're competing in the race to space, right?
Someone is going to get Trump's favor or not, right?
So same thing with these media mergers, yeah.
They're not, yeah, exactly, the media merger with Netflix now.
So it's not, everyone's not aligned, but yeah, I just, I can imagine that it will take something like a minor catastrophe to get everyone's attention. You know, if we do something with
AI that is sufficiently destabilizing to our economy, right, you know, if someone unleashes something
that, you know, makes the NASDAQ go dark for a week, right, and no one can figure out,
you know, how much money they have. Yeah, right. I just think that's, something has to wake us up.
Yeah, I was just called a non-patriot by one of them, and I was like, oh, but you're selling chips to China?
Okay, got it.
Like, I'm not sure what to say.
The Trump administration is sticking by these tech oligarchs, with the help of AI czar David Sacks.
I'm not a fan.
Trump has promised to sign an executive order soon to bar states from passing their own AI regulation, something the tech leaders really want, but a lot of people really don't.
The do whatever you want to approach to the tech sector is what got us here in the first place, as you just noted.
So, on a point like that, I would just say that there is a version of that that makes sense to me.
I mean, I don't think we want 50 different sets of regulations
around this kind of technology.
I agree.
But what it really demands
is a sane set of regulations coming from the federal government.
Which hasn't ever happened, you know.
By the way, the tech leaders hated the Biden administration's push to regulate them. Andreessen called Biden's efforts to regulate crypto a, quote, terror campaign. A little much, Mark.
But what would that entail?
Because I mean, I write the annual story,
once again, we don't have tech regulation of any sort, even the most minor.
The previous administration didn't get it right.
This administration is letting all hell break loose.
A federal system would obviously be best in AI, for example.
But it doesn't happen because of the power of the people holding the purse strings.
Well, it also doesn't happen because of the perceived race that is existential against China in particular.
Again, it might take some terrible outcome that doesn't kill us all to get us to collaborate with our enemies on this particular point and regulate in a way that's sane.
What would that sound policy be, would you say? For example, the first one?
Well, so, I mean, there are really two levels of the problem.
There's the malicious use of this technology by bad actors.
And perhaps, you know, the inadvertent bad outcomes of just the way people can use it.
But there's the deeper problem that many people worry about, which is just the existential risk posed by the so-called unaligned AI, right?
That we could build something that is super intelligent and becomes actually self-improving and that this technology can fundamentally get away from us, right?
And what's truly alarming is that all the people, at least in the States who are building this tech, I mean, people like Sam Altman, it's not like they think that those existential concerns are irrational.
No, in fact, when you ask them, they put the probability of, you know, existential collapse at something like 20%.
You know, they're not saying it's, it's, you know, one in a billion, so don't worry about it.
They're saying, no, no, this is a completely sane set of concerns.
I'm not quite sure how we're going to avoid it.
We hope as we walk these final yards into the end zone, we'll figure it out.
But none of them think that, I mean, they all acknowledge that they're playing a game of Russian roulette
and that there's at least one bullet in the cylinder.
And we're just, you know, we're pulling the trigger because we have to, right?
We economically have to.
We're in a race against China.
I called it the "Xi or me" argument with Mark Zuckerberg when I interviewed him, and I was like, I don't like you either.
But does it require some kind of true populist revolt that's centered on class-based politics, which Scott talks about, even if it has an anti-capitalist undercurrent in many ways?
Although I would argue this Trump administration is socialist at this point, given all the things they're doing.
What's needed for American democracy to thrive? If not populism, how do you envision citizens wresting power back from authoritarians in tech?
Well, I don't know. I haven't ruled out the possibility that cultural change could surprise us and be sweeping.
I mean, for instance, I think that as AI, you know, as the AI sloppification of everything continues,
we might get to a point where many people, maybe even most people, declare some kind of, you know, epistemological bankruptcy with respect to the Internet or with respect to social media.
It's like if you're looking at a video and your first thought is now always, is this even real?
Or this sucks.
Right.
And that just gets locked in, right?
It's just like, okay, this looks like Vladimir Putin declaring World War III, but who knows if it's even real, right?
Strangely, this could force a return, a new return to sort of traditional gatekeeping, right?
But we might wake up in a world where all of a sudden you don't believe the video of Putin unless you see it coming from the New York Times or Getty Images or, like, some official
gatekeeper. And this could make social media far less captivating for people. I mean, one could
hope that would happen. Yeah, there was an idea to mark the real stuff, not demark, not mark the fake stuff, which is kind of interesting to think about. But yeah, but I mean, now there's kind of an
arms race between, you know, detecting fakes and producing fakes. And maybe there's an asymmetry there
where it's always going to be easier to produce them than to detect them, you know, I don't know.
But I just think, you know, to some degree, we all have to just kind of personally work this out with respect to social media.
As you know, I quit Twitter three years ago.
I didn't delete the account, because I thought he'd do something shitty with it.
I just left it there.
Yeah.
So, I mean, you know, it's always humbling to confess that that's, like, the greatest life hack I've found in the last decade or more.
I mean, just in terms of improving my life, it's absolutely enormous.
And I think, you know, it's possible that many more people will have a similar epiphany
because there is something toxic about our engagement with these tools.
I mean, leaving AI aside and its capture of our economy,
just having our lives fragmented in this way by these, you know, these digital tools.
I mean, so much of your sense of what the world is and what you are in the world has been uploaded into this space. And honestly, again, there are exceptions here, but it's easy to say Twitter isn't real life if you're me, because, you know, I've taken steps to immunize myself against the pain of it all. But it really isn't real life if you're me. I mean, it's amazing how much it doesn't matter what happens on X, you know, even if it's Elon Musk attacking you by name and you're trending that day,
yeah, you know, it really doesn't matter. And so, you know, many people may begin to yearn for a life
in the real world, reading real books and having real conversations face to face. And that
might be a kind of spiritual awakening for our whole culture. I mean, we might use these tools
differently in the future.
You do see younger people. Both my older sons are just, "Don't. What? What are you talking about?" You know, they don't care that much. They use YouTube to watch TV, that's how I would put it. I don't think they use it to be influenced; for them it's just their version of television, which I watch quite a bit myself. So this has been a relatively gloomy conversation, but really interesting. But you say you're hopeful, that we're in a, quote, Emperor's New Clothes situation where the truth is obvious but just going unacknowledged by most people. So what needs to happen for the kid to point to the emperor and say, ah, you're naked?
Yeah, well, the kid is certainly pointing.
I mean, he's always pointing,
but, I don't know, I mean,
maybe as simple as just it no longer being Trump
at the center of Trumpism, right?
Like, when you ask yourself,
can J.D. Vance or any other
plausible candidate, you know,
inherit the mantle of the cult?
And it certainly seems plausible
that the answer to that is no.
That's just for a variety of reasons, you know, the kayfabe of, you know, the professional wrestling vibe is not going to be sustainable.
And we might, you know, suddenly be confronted by a whole culture right of center that pretends never to have been as morally confused as they now are, right?
Lots of people who come shuffling back into normalcy, pretending that they never got caught up in the denialism and the cultishness
and the weirdness that is Trumpism.
That's totally possible.
They now suddenly care about, you know,
alliances with our democratic partners in Europe,
and they're no longer fans of Putin,
and they don't remember that anyone ever was a fan of Putin
in the Republican Party.
I mean, it's just all that's possible, you know.
Yeah, I don't know what happened.
What did I drink last night?
No, no, I didn't, I wasn't at that conference.
Yeah, yeah, yeah.
And what are you going to say to that?
I mean, oddly enough, Rachel Maddow said the same thing.
We're going to have to let them. This is their last chance, and they need to start to shift; they have an opportunity.
We might need something like a Truth and Reconciliation Commission at some level.
I mean, you know, for someone, again, for someone like Elon, who's acted like such a maniac,
I don't know how you get back into normalcy without some kind of mea culpa.
I mean, it's like, what were you thinking during DOGE?
Yeah, that's not happening.
I mean, you know, what were you thinking when you were bringing back neo-Nazis and conspiracists with great fanfare onto X at a moment when we were watching a global eruption of anti-Semitism, the likes of which we hadn't seen since the Holocaust?
And meanwhile, you know, all in the name of free speech absolutism, but also then just kicking off journalists who you didn't happen to like and complying with the demands of authoritarians the world over, suppressing the speech of political dissidents in, you know, countries like Turkey, right?
I mean, who the hell did you become, and how can you pretend to now be normal?
Look, you're asking for a Scrooge moment. He wakes up Christmas morning and says, boy, go buy me the turkey.
There'll have to be many Scrooge moments.
Right, but what would you
say if he called you and said, I am so
sorry, I can't believe I attacked you, the
And he does attack you. He texts me much less than you.
I wouldn't even know if he's been attacking me, honestly.
But I was the face of evil, I think.
Yeah, but are you pure evil?
The last thing he said about me on Twitter was that I am pure evil.
That's his favorite.
So I have that over you, I think.
I believe I have "heart of evil." I don't know. Whatever.
You have a heart filled with hate.
Heart filled with hate. That's right. I'm sorry. That's correct.
What would you say if he called you and said, okay, I'm really sorry, I'm going to do whatever it takes to have reconciliation and truth?
I mean, so that's an interesting question about essentially the physics of apology, right?
Like, what constitutes an acceptable apology?
And for me, it really, it requires that the person sketch some rational path from where they were to where they say they now are, right?
So it's like, how is it that you were?
Yeah, like, how is it that you were?
How is it that you think better of the things you did?
And how are you now the person who views those things the way I view them?
And so I'm willing to, I mean, apology and forgiveness.
I mean, this is some of the best, this is the best software we've got, right?
I mean, like the capacity to forgive someone, I mean, it's absolutely indispensable,
not just psychologically, but civilizationally.
And we celebrate, you know, people having changes of heart and being forgiven at the deepest level.
I mean, you know, like, but even just like, you know, a rapist or a murderer.
I mean, someone, literally, you can get someone coming out of prison who really did commit
the murders for which they were sent to prison, and now, you know, whether in a Christian context
or just a secular context, you know, we acknowledge that people can apologize and be different
people, but there has to be some intelligible path from there to here.
And in the absence of that, it's just, it just seems like, you know, it just seems like a bad faith maneuver.
And so, you know, I have no idea how I would respond unless I was in the situation.
Yeah, I probably wouldn't let him back.
But it's not going to happen.
I mean, that's just.
I'm good for ban.
I know it's not going to happen.
I'm certainly not waiting for it.
Certain people banish it, I say.
Banish it without a knee.
Banished.
You are banished.
I don't think that's the worst thing society's ever done, for certain people.
But we'll see.
Life is too short, Kara.
I understand that, but it's not that short.
Let me just say.
Anyway, I really appreciate it.
It's just long enough for it to matter.
Just long enough.
Just long enough.
Anyway, I really appreciate it, Sam.
What an interesting conversation.
We can whinge about things we disagree with later, but this was really smart, and I appreciate your thoughts on it.
Always happy to do it.
Today's show was produced by Cristian Castro Rossel, Michell Eloy, Megan Burney, and Kaylin Lynch.
Nishat Kurwa is Vox Media's executive producer of podcasts.
Special thanks to Bradley Sylvester.
Our engineers are Fernando Arruda and Rick Kwan,
and our theme music is by Trackademicks.
If you're already following this show,
you know the Emperor has no clothes
and yuck when I think about it.
If not, you're drowning in conspiracy porn.
Go wherever you listen to podcasts,
search for On with Kara Swisher and hit follow.
Thanks for listening to On with Kara Swisher
from Podium Media, New York Magazine,
the Vox Media Podcast Network, and us.
We'll be back on Thursday with more.