Offline with Jon Favreau - The U.S. v. Google, Elon’s Secrets Revealed, and Why Trolls Got Nastier
Episode Date: September 17, 2023. Kaitlyn Tiffany, Atlantic reporter and author of Everything I Need I Get from You, joins Offline to break down internet trolls. She and Jon unpack who these people are, and examine why the online trend of celebrating the misfortunes of strangers – including their deaths – is still very much alive. They talk about how trolls from across the political spectrum see their victims not as nuanced individuals with feelings, but as representatives of an enemy ideology, and thus fair game for online bullying and even doxxing. Then, it’s time for a tech roundup with Max on Walter Isaacson’s new Elon Musk biography, Congress’s AI hearings, and why President Biden’s DOJ is suing the internet’s largest search engine. For a closed-captioned version of this episode, click here. For a transcript of this episode, please email transcripts@crooked.com and include the name of the podcast.
Transcript
These weren't like the, you know, the 2016 trolls that people picture where it's like a Pepe the Frog avatar who's just saying kind of nonsensical, incomprehensible statements just to disrupt conversation.
These were real people who kind of didn't care that anybody could see they were posting these things.
So when I reached out to them, honestly, a lot of them were sort of at the attitude of like, kind of not really even understanding what I was curious about.
Because they were like, yeah, that's what I posted.
That made sense to say, you know.
And some of them were willing to get into it a little bit more and kind of examine why they would find themselves being part of the pile on, you know, hashing out what caused a stranger's death.
And I think some of them were
able to be somewhat introspective about it. But on the whole, the reaction was, I don't understand
what you even find interesting about this. This was like a very normal thing to say.
I'm Jon Favreau. Welcome to Offline.
Hey, everyone. My guest this week is The Atlantic's Kaitlyn Tiffany.
If you spent a lot of time online during the pandemic, you may have heard of the Herman Cain Award,
named after the late presidential candidate who died of COVID after attending a Trump rally.
If you're not familiar, the Herman Cain Award is a Facebook page and Twitter trend where people basically mock anti-vaxxers who've been hospitalized or killed by COVID.
Lovely stuff. And even though the Herman Cain Awards are now mostly behind us,
the online trend of celebrating the misfortunes of strangers, including their deaths, is still
very much alive. Which is what Kaitlyn wrote about a few weeks ago in an Atlantic piece titled
How Telling People to Die Became Normal. Kaitlyn decided to investigate what motivated some posters to tell people
they don't like to die. And she actually got internet trolls on the phone to ask them why
they thought it was okay to mock or threaten people who've lost loved ones just because
they disagree with their politics. So we talked about her piece, what she learned talking to trolls,
and what she thinks about the most extreme form of trolling,
doxing,
and why the definition of these terms has evolved over time.
As always, if you have comments, questions, or episode ideas,
please email us at offline at crooked.com
and stick around after my interview.
Max and I are back in studio to talk about
Walter Isaacson's new Elon Musk biography, Congress's AI summit, and why President Biden's DOJ is suing the Internet's largest search engine.
Here's Kaitlyn Tiffany.
Kaitlyn Tiffany, welcome to Offline.
Hi, thanks for having me.
We've been looking for an excuse to have you on for a while now because you write about so much of what we discuss on this show.
And I thought your recent piece in The Atlantic about trolling was fantastic and also a great way into the topic.
For those who haven't read it, it's called How Telling People to Die Became Normal. And you talk to two kinds of social media trolls, trolls who have mocked
the deaths of the unvaccinated and anti-vaccine activists who tell people their loved one died
because they got the shot. What made you want to focus on these two groups of people?
Yeah. Well, I guess since the beginning of the pandemic, it's been an interest of mine to see how seemingly normal people have reacted to the politicization of COVID and of the vaccines. And we saw a lot of this at the beginning during the sort of not exactly lockdowns,
but closures of businesses and schools.
There were Facebook groups that were warring with each other constantly and using this really jarring language to talk about other people.
So you maybe saw like COVIDiots.
I did, unfortunately.
Yeah.
Or there was a Facebook group in North Carolina that I was following where they referred to, you know, people who were doubting the facts of the science as rodents, which is obviously quite an insulting thing to say about someone, but also just sort of, you know, dehumanizing and bizarre.
And I think what was interesting to me about it was that people who were having these conversations, like they didn't think of it that way. They thought of it as kind of either letting off steam
or as a kind of logical expression of this extreme frustration they were feeling that seemingly so
many people in the country just couldn't get on the same page as them about this very real and
very dangerous thing that was happening. So that was a sort of a spirit that
I saw on both, I don't want to say both sides, because it's not really a debate, obviously, but
on both sides of the COVID discourse, there was this rage that led to almost cynicism and detachment, and this ability to kind of imagine everybody on the other side as, like, a meme or a joke or just another example of this fad or, you know, trend that you despise, and to kind of engage with them that way rather than as a person.
And like with trolling, it's actually kind of complicated because it's kind of hard to tell,
right, in some of these conversations whether people are trolling or whether they're totally
sincere and just like really detached from reality. So that's obviously an interesting space for me to live as a reporter.
Yeah, I mean, you use the word dehumanization,
which is, I think, an apt description of what happens here
because the examples you use, right,
there's all these intolerable debates online,
especially on Twitter about COVID
and have been since the beginning of the pandemic
But, um, getting to the point where someone dies, and then you celebrate that death, or mock it, or use it as a reason to say I told you so, when that person leaves behind other people who they love very much, family members, it really seems to me like it's getting to the extremes, and it's something that I can't really understand.
And it's not just that people do that; then they defend it. There were whole debates about whether it was the right idea to have... you talk about the Herman Cain Award page on Facebook, right? There was a Herman Cain Award because Herman Cain, of course, didn't believe in vaccination and then died of COVID. And anyone else who died without the vaccine was, you know, a target for people on this page. And then, you know, you write about people's loved ones having to see that.
And then on the other side, I mean, a colleague of yours at The Atlantic lost his son, and people said that he lost his son because he somehow vaccinated him, right? Because these were the anti-vaccine crazies. And what that did to him was just beyond awful. It was just hard to even read that piece.
So you tracked these people down, and you found their information, and you called them up.
Were most people willing to talk to you?
More people than I thought would be.
I think that, you know, this question gets at something that's important to understand about the dynamics of online.
I guess trolling is kind of a blanket term.
Like we can get into maybe some of the semantics of what trolling actually is.
But like a lot of these people were posting under their real full names, you know, on Twitter and especially on Facebook.
And their names were easily connected to like their businesses, their email addresses, their phone numbers. These weren't like the 2016 trolls that
people picture where it's like a Pepe the Frog avatar who's just saying kind of nonsensical,
incomprehensible statements just to disrupt conversation. These were real people who
kind of didn't care that anybody could see they were posting these things. So
when I reached out to them, honestly, a lot of them were sort of at the attitude of like,
kind of not really even understanding what I was curious about because they were like, yeah,
that's what I posted. That made sense to say, you know? And some of them were willing to get into it a little bit more and kind of examine why they would find themselves being part of the pile on, you know, hashing out what caused a stranger's death.
And I think some of them were able to be somewhat introspective about it.
But on the whole, the reaction was really more, I don't understand what you even find interesting about this. This was like a very normal thing to say.
Wow. Did any of them, in the conversations you had, through the course of the conversation, express any kind of regret or remorse for their comments? Or was everyone just like, no, that's totally normal, these are my political beliefs, those people are wrong, and that's that?
Yeah.
I spoke to one woman who, you know, she was older.
And I think maybe she didn't seem like super well-versed
in kind of how social media platforms work.
So she had seen the gentleman who wrote the story
for The Atlantic about his son dying.
She had seen his tweet on her feed because, you know, it had been retweeted by someone.
And she kind of instinctively replied to it saying, like, the powers that be that are pushing
the vaccine should be held accountable or something. So she thought of herself as, like,
not participating in the pile on because she wasn't blaming the father.
She was blaming these larger power structures.
And that was interesting to me because, you know, as a reporter looking at the tweets,
I wasn't differentiating her response from the other ones.
I was thinking this is part of the pile on.
And I'm sure, you know, being on the receiving end of that, you're not thinking of that as, like, a comment that's coming in to defend you.
It looks like another one of the crazy comments.
So I think there are, there is space for miscommunication in this too.
Like she, that woman was very remorseful in saying, you know, this is horrible.
It's awful to say something like these people are saying to a grieving parent. But she was definitely a rare case, I would say.
Even some of the more thoughtful people I spoke to were kind of able to justify things in a
roundabout way. Like, I talk about in the piece speaking to a woman who had reposted this, like, collage of the father's tweets, you know, talking about getting his kid the vaccine and giving the kid candy because he did such a good job sitting for the vaccine. She reposted all of those tweets on top of a photo that appeared in his obituary.
And she was able to, you know, really split hairs and be like,
well, I didn't say he killed his son.
I just presented a chronology.
I just like offered the information to people to draw their own conclusions. And maybe you think that's inappropriate, but I think it's fair game, basically.
So, yeah.
It does seem like a lot of what a lot of us would think of as normal human reactions to someone else's suffering and death are subsumed by the larger political context here.
Yeah.
And some of the polarization.
And I know you looked at some of the research into Schadenfreude for this piece.
What did you learn about the desire to watch bad things happen to people that you dislike?
Yeah, I didn't end up talking about this in the piece because it was getting a little lengthy.
But there was some really interesting research done last year about, actually specifically about, you know, contemporary U.S. politics and people's desires to say, you know, if somebody who denies climate change lives in Florida and suffers the effects of climate change, like they have that coming to them. And even in the context of COVID, there were questions asked of more Republican
respondents who would say, if a Democrat suffers because of their business closure
during the pandemic, they had that coming to them. That's what they asked for.
And they did even ask, you know, if somebody denies COVID,
and then they become ill with it, do they deserve that? And the answer, especially on the left,
was overwhelmingly yes, even though the survey didn't specifically ask about dying. I think it's kind of implied in the question. So that wasn't totally surprising based on what I'd seen
online, but it was interesting to see it reflected so starkly. And then with the research that I mentioned in the
piece, that was more about online behavior specifically. And a researcher at BYU, Pamela
Brubaker, had looked at what motivates trolls on Reddit. And like I said, I think the dynamic is pretty different between somebody
who's trolling just to get a reaction and somebody who's sincere in their beliefs but just delusional.
But I still thought her paper was relevant because she was talking about this kind of
selfishness that you have to have in order to interject yourself into these conversations in a way that completely reduces someone to, like, whatever trope you can fit them into and robs them of their humanity. Um, it's just prioritizing this, like, almost desperate need to be right and to disrupt the other side's conversation over, like, any consideration of what collateral damage that might have. And they found, like, that motivation, that kind of, like, narcissism or need to interject, was a really powerful motivator for
some people. Does the research say anything about whether that's how our brains are wired, whether
social media and being on the internet too much has made that worse, whether that's just a product
of increased political polarization over the last couple decades? Like what's the core of why so
many people want to do this? Yeah, I think schadenfreude is, like, an in-group/out-group phenomenon just historically that would bear out in, like, decades and decades of social science research. So to me, personally, I think that it's more compellingly a story of political polarization in the U.S. than it is specifically a social media story. I think social media is just a really useful tool for people who are already susceptible to having those kinds of attitudes. It just gives
you a venue to express them. You know, I personally, I mean, I've heard plenty of people
in casual conversation in my day-to-day life, like make jokes about climate change deniers
in Florida and how they deserve to, you know, sink into the sea, which they probably don't mean, but, like, it's something that you hear a lot. It's just that social media allows you to see it more. And I think it looks a lot more stark and jarring, maybe, when you come across comments like that on the internet that, you know, aren't contextualized by tone or by knowing that the person doesn't really mean it. And it's just like, oh, my God, what a horrible thing to say, you know?
Yeah, I do. I do wonder, because, you know, as you pointed out, sort of the usual argument is,
you know, anonymity on social media makes it easier to do this kind of stuff. These people,
of course, all posted under their real names. But I've often wondered whether the way we use social media, you don't see the person,
you're not, you don't necessarily look at their face, you don't see their emotions as you're
lobbing these insults at them. And it's harder for me to believe that if someone was, say,
in the hospital for something else, and they saw a family crying over a loved one who just died
because they weren't vaccinated, that they would go up and say, oh, your loved one had that coming.
You know, and so I do wonder if just like the way that these platforms are constructed,
even if it's not the anonymity, makes it a little bit easier to do this from your keyboard,
from your phone, from your own house than it is to like do it in person. Yeah. I mean, I think distance is definitely a factor. I don't know if it's totally unique
to social media platforms. I think that would have been true probably, I don't know, in the
age of just email or forums or whatever. But I do think social media plays a role in the sense that
we touch on this a little bit in the piece that these platforms are spaces where people are really, you know, primed and incentivized to
present this pretty easily digestible image of themselves. So if you went onto my social media
profiles, you would pretty quickly get a sense of like, you know, where I live, my socioeconomic
status, my political leanings, my cultural interests, my background. Those things, those
signifiers are really salient on social media and they do fall into tropes pretty, pretty simply,
right? Like I mentioned in the piece that like, you know, people on Twitter who have like a Ukraine
flag emoji in their username, like that has become really strongly associated with, you know,
liberals and Democrats. That's like, it's a tell. Or someone who talks about like freedom in their
Twitter bio. I think anyone who's spent enough time online sees that and they draw the opposite assumption. They say this is
like a Republican presentation choice. So in that sense, like I think social media,
first of all, helps people make these really snap decisions about, okay, what kind of person
am I dealing with here? Who are they? And then it also provides ammunition, evidence, like with the example of the father who wrote the essay for The Atlantic whose son died tragically. I think part of the
reason that story spread so well in anti-vaccine factions was because he had provided all these
personal details that they were able to then weaponize. Like he had posted about taking his kids to get vaccinated. He had posted about, you know, buying them candy as a
reward. And it was easy for them to twist that into like, you know, you bribed your kids and
now one of them has died. So I do think social media plays a major role in that sense. I think like it was just important for me in this story not to remove like the aspect of human agency because I think sometimes sometimes coverage of social media platforms and the ills of them I think gives a little bit too much power to the platforms and the algorithms and kind of excuses people for
doing things that they ultimately did choose to do you know yeah no it's a it's it's much more
complicated and layered i think than a lot of the debate it's a little bit of a chicken or egg thing
too because i sometimes wonder like is it just the most politically opinionated people who end up going to social media to have these fights and people who just like to get in fights?
Or if you're on social media for other reasons, is it just easier for you to get sucked into these political fights because you are on social media?
And I do think it's a little bit of both.
Yeah, I would agree with that. I mean,
I think in my early 20s, when I was first on Twitter, I would get in fights because it was
kind of hard to resist. You know, someone says something you think is really dumb or wrong,
you want to reply and make them feel stupid. And you do kind of have to sometimes learn the hard
way, you know, put your hand on the stove and realize,
oh, like this doesn't feel good when I do this.
This is a waste of my time
and it's making me like a worse person
and you have to definitely like choose
to step away from that
because I do think it's a compelling,
I get why people have the impulse and it's right there and it's so easy
to engage.
But yeah. I also wanted to ask you about doxxing, which you wrote about last year, partly because, as you pointed out, the definition of the word has changed to mean basically
whatever people want it to mean. Can you talk a little bit about the origins of the term
and then how its meaning has evolved over time? Yeah, I think this is relevant to the trolling
conversation too. Both of these words are, you know, words that had pretty specific meanings
when they were coined, you know, decades ago in the golden age of forum culture. So
trolling originally would have meant, you know,
deliberately posting something that you didn't even necessarily think was true. You just wanted
to get a rise out of people and disrupt conversation. So that's why I think it's a
little bit of an awkward fit in the story we were just talking about, because obviously some of
those people really did believe what they were saying. With doxing, it was originally meant to mean that you were taking someone's personal private information out of, like, you were in a space where the norm was one of anonymity.
And then to, like, intimidate someone or to, like, mess up their life in some way, you would release their personal information.
And that could mean
just their name, but I think it's more understood to mean like other information as well, like a
home address or something that could put them at risk if they were, you know, being threatened in
some way. And that was the way the term was popularly understood, you know, just before the 2016 election, when Gamergate was a huge news story and people were being doxxed, and you could send a SWAT team there or, you know, just deliver empty pizza boxes in, like, a threatening, I-know-where-you-live kind of way. So I think for most
news consumers, that would be what you would think of when you said doxing. But there's been this
cultural shift in the last few years, I think largely because of crypto culture, which is a culture where there is this norm of
anonymity where you can have a profile picture that's just like a cartoon, like an NFT, and go
by a nickname and never reveal your real name. So some of the powerful figures in that world
refer to doxing as just like, you've told someone my name, you've showed someone
a photo of my face, really just like any personal details whatsoever. And so they're simultaneously
very serious about it, like very rigid about what constitutes doxing. It's the smallest little thing.
They're also kind of not serious about it, I think, because to say that revealing
someone's name or who is in charge of a billion dollar NFT brand is endangering their safety is
obviously false. And they will be kind of winky about it, like, oh, LOL just got doxxed or
whatever, which is obviously not something you would say if a SWAT team came to your house.
Yeah, I guess the challenge is to sort of separate out the real harms and dangers that can come from certain forms of doxxing that we're still dealing with today.
People are still getting doxxed.
So I guess the real debate is like at a time when so much of our information is already on the Internet, who deserves anonymity?
How much and how does that even work?
Yeah.
Like, what do you think about that?
Yeah, I guess, with anything, I think it's very context-specific. Um, and if you want to use, like, the broadest possible definition of doxing, I think it would also include just taking some information or images from one location and moving them to another where the audience is going to be more hostile. So I think that's something that the super popular
like libs of TikTok account was really known for doing, like taking videos of, you know,
people performing drag or just like queer people talking about their lives
to what they would have thought of as a receptive audience and then reposting it for this like very
aggressive right-wing audience that, you know, hates these people and doesn't think they should
be full citizens of the country. And you could call that doxing in a sort of spiritual sense, because it is causing this, like, dangerous exposure. It speaks to the ways in which, even if your personal information is already out there technically, and most people's addresses are findable now unless you pay to have them removed from databases, it's just, like, the act of moving it somewhere else is what can be overtly threatening. So I think it's good for people to be aware of that. I think it also allows people to use the term in bad faith sometimes. Like when the Washington Post revealed the name of the woman behind Libs of TikTok, she claimed to be doxxed, which maybe is literally true or felt true to her. But I would make the argument that her name was in the public
interest and newsworthy because she had a huge platform, which she was using to terrorize people.
But yeah, I think I'm kind of meandering away from your question a little bit. But, you know, it always comes down for me to, like, this isn't a term that means one thing and should only be used one way. It's very case by case to me, and it has a lot to do with intent and with, like, consequences and motivation as well. Like, you know, there's simply a different motivation between a police officer tweeting a picture of a Black Lives Matter protester's full driver's license versus, like, a BuzzFeed reporter saying, hey, I looked at a public business record and I found out who the guy behind Bored Apes is. Like, those are just obviously two different things.
Well, I was going to say, like, once upon a time, long, long ago, we were all doxxed by something called the Yellow Pages.
Yeah, yeah.
No, totally.
They had our names and phone numbers in there.
But I do think there's so much of a focus now on sort of the meaning of the term and claiming that you were doxxed and using it sometimes in bad faith ways versus sort of the actions that some people take when they have other people's personal information, which seems like it should be the focus, right? Because you could have someone's personal information,
reach out to them and have a civil conversation
or be annoying and then they ignore you.
Or you could use it to harass them,
to show up at their house, to cause them harm, right?
Like, it's... so, definitely, I think it seems like the debate needs to be focused more on the actions that come from having people's personal information and them not being anonymous than on whether or not something counts as doxing, right?
Yeah. And I think it should be galvanizing, too, for people who maybe haven't previously been, like, super interested in conversations about online privacy, because they think, like, well, I don't use the internet to do anything bad, so I'm not worried about that.
But like, as we've seen, as we were just talking about with the trolling story, pieces of your personal life are scattered all over the internet. And all it takes is, like, somebody to be motivated to use them in a way that's harmful to you. And that's not fair. Like, I had somebody who was, you know, sending, like, 100 emails a night to my work email address. So The Atlantic was like, we should get your address removed from the internet. You have to pay money to do that. And you have to renew that subscription every year. Like, I think it's useful to just have people thinking more often of, like, how would I react if my personal information was used against me? Like, what is out there that I've left up for people to find? And also, like, in what ways is, you know, the government currently failing to protect my, like, privacy and rights as a citizen? It's an important topic, even if you're not, you know, running around yelling about being doxxed.
Yeah. A couple final questions for you,
just as someone who covers internet culture
and social media so much.
Do you think we're close to a tipping point
where social media has become so awful
that it's getting less popular and less important?
Or is that just wishful thinking on my part?
Yeah.
I mean, I think we're probably in, um, an awkward phase in between, you know, the heyday of Facebook and Twitter, which was very, I don't know, 2010s social media, and now, where there's a lot of power and attention consolidating in TikTok or even Instagram. I think the platforms that my entire journalism career has been based on are definitely getting worse and probably going to, like, erode to the point of uselessness at some point. But I don't think we'll ever be done with social media.
Do you have any advice for
how to have a relatively pleasant online experience on these platforms that's not
ruined by online trolling and all this other awful shit we talked about? Yeah, I mean, who knows?
They change Twitter every single day. And this feature might not be available tomorrow. But
I always tell everyone my number one
advice for using Twitter is to mute all of your notifications. There's no reason to have Twitter
notifications. You can toggle it so that you still get them from people you follow. That way you still
see when your friend or your boss likes your stuff, but you don't have to see all the crazy people in your
mentions who are accusing you of being a child trafficker, or whatever it may be that day.
That honestly was the single best step that I took, to only view mentions and replies and everything else from people that I follow or that follow me.
Yeah, that was huge. It truly changed my life. I'm not even kidding. Because when I started, when I was like 22 in journalism,
I wanted to see
what everyone was saying
about every single thing
I wrote.
Yeah,
I was there.
Yeah,
then when I turned that off,
I was like,
why did I ever care?
Didn't need it,
don't miss it.
Yeah.
Life is better without it.
Kaitlyn Tiffany, thank you so much for joining Offline. Really appreciate the conversation.
Yeah, thanks so much for having me.
All right, we're back.
Hey, Max.
I'm sorry.
I was thinking about
the Roman Empire.
I'm sure you are.
Just classic Max.
Just going through favorite empires, favorite.
Are you a Republican era guy or more of an Imperial era guy?
Are you a kingdom guy?
Again, I just know what the Wikipedia page told me when I looked at it.
It's a great resource.
For the first time.
I love browsing Wikipedia.
After seeing this trend.
As an internet skeptic, I will always stand up for Wikipedia. All right, a lot of news to cover
today. The Biden versus Google case kicked off this week, which one former DOJ antitrust lawyer
called the most significant U.S. monopoly case in a generation. Government says Google controls
about 90% of the search engine market, not because they have the best search engine, but because they
have the most money and they use it to make deals with companies like Apple to be the default search engine on
their products. Google says not so fast. The default setting can easily be changed. And the
reason they're dominant is because their search engine is so awesome. What do you think, Max?
Whose argument do you find most persuasive here? So I think it's important to
keep in mind here that what is at issue is not just does Google have a monopoly, but are they
using their market power to exploit that position in a way that's anti-competitive? And I think it's
easy to look at this and to be like, look, I love Google search. Google search is great. I don't
want to use AltaVista, whatever. Like, why should I be so upset about this? But there is a school of thought that is really prevalent in the Biden administration,
but that goes back like a century that says that monopolies are really, really bad. Because when a
company has a monopoly, even if that company like Google does offer a good service, and it seems
like I'm not obviously being ripped off by having like Google preloaded on my phone or whatever,
that that company will necessarily exploit that position to extract more and more resources from consumers and to deliver a worse and worse product because they have no incentive to
deliver a better product and because they can use that power to basically just like get more
out of consumers. And like the big example that people cite in Google is that like, okay, they don't charge for the service,
but they can squash competitors.
So like, sure, there's not a better alternative
to Google now, but maybe there would have been
if Google wasn't using its market position.
Maybe they're extracting more of your personal data
and use it against you
because there's no place else that you can go.
And I think it's also important to keep in mind
that like the Biden administration,
the people at like the top of DOJ antitrust division, see this as part of a larger effort to break the monopolies of big tech that actually goes back to like before even Biden came
into office that they explicitly talk about as not just like, oh, it's so you'll have a better
search engine on your phone because they see these big monopolies as a threat to democracy itself.
Well, one specific example that I saw reading this was, so there's a search engine called
DuckDuckGo, which I only knew about because I've seen Tommy use it.
And I was like, what is he doing?
Why isn't he using Google?
But for those who are focused on OPSEC, it's like a privacy-focused search engine.
So it doesn't vacuum up all your data.
And the vice president of public affairs there says it takes 15 steps to choose DuckDuckGo as the default option on a smartphone running Google's Android operating system.
And so you can see why, like, say you don't want your data collected and you want to use it. I wouldn't have even heard of DuckDuckGo.
So not only is it hard to get it, you know, so you can see why the lack of competition could actually ultimately hurt consumers. And people have been saying for a few years, Google search seems like it's getting worse and worse.
It's more dominated by ads.
It's harder to get the actual information you want.
And that's classic monopolistic behavior of you're no longer engineered toward delivering a better product.
You're engineered towards extracting more value from the consumers that you have held hostage, basically.
And Google has, you know, given this guidance to employees about certain terms and words not to use, because there was this presumption that antitrust investigators were going to come through: not to say things like dominance, not to talk about market position, never to externally use internal Google data on how much of the search market they controlled. Because they were just like, the antitrust people are inevitably coming for us. The document doesn't say we have a monopoly,
but it kind of implicitly concedes that. And again, I think it's important to think of this
as not just a like, is Google search too dominant, which it is, but also think of it as a first step
towards this larger project of dismantling the big tech monopolies, including, you know, Facebook.
So those are potentially some of the longer term implications. In the short term on this trial,
for this trial, DOJ has been sort of quiet about possible remedies that they would ask for if they get a favorable decision here. Though it sounds like people think that breaking up Google is
unlikely to happen from this one case, though that is a
possible remedy. When the European Union had a problem with this, Google basically came up with
a choice screen where you can pick your search engine at the beginning and then they hope that
people select Google, but you at least have the choice. And that was to appease the Europeans.
So you could see something like that. Yeah. It's a hard problem to solve because it's hard when your product is free to come up with
a remedy that will introduce more choice in the marketplace. And like the case that people cite
as a precedent a lot to this is the big Microsoft antitrust case in the 90s. But that was a case
where it was easy for DOJ to recommend breaking up the company.
That made sense because what Microsoft was doing
was it was using its dominance of the operating system market with Windows
to say all of our Windows machines have to be preloaded
with the software applications that we also make.
So it was easy, and a judge did ultimately rule this way, though it then got thrown out for weird reasons, to say that we have to break up the operating system and the software division into different companies. I don't know what solution they're
going to have through this. I'm sure they have something in mind though. And ironically,
that Microsoft case is what allowed Google to take off and become a dominant search engine.
It's funny, that's actually part of the standard lore in Silicon Valley, which I think goes to the deep fear of antitrust cases,
that even if you're just investigated for a few years, you have to be so cautious about what you
do when you're under investigation that it can allow a competitor to come in and eat your lunch.
There's a really, really deep fear of antitrust investigations in the tech industry. And at the
same time, there's this school of thought that says that monopolies are good.
That's been really dominant
in Silicon Valley for a long time.
Peter Thiel wrote a whole book about it,
Zero to One.
Well, I saw in one of these pieces
that a Google lawyer was arguing
that one reason they should be allowed
to remain as they are
is because of AI,
because of artificial intelligence,
and because Google is arguing
that their size and scale
allows them to do this research into AI that other companies can't. And I was like, wait, that's an argument for... do I want one company having all the power to do research into AI because they're telling us that they are, you know, benevolent?
Especially because, again, like, the incentives that you have are so different when you're a monopoly.
If you're Google and you're saying, what can I do with AI and you're a monopoly, you're not saying, what are the AI tools that are going to attract new consumers that people are really going to like?
You're going to say, what are ways I can use AI to just drill even more money out of the people who have to use my search engine because they don't have any other choice.
So we have to flip those incentives.
I think that's really important.
Speaking of AI, flip phone user Chuck Schumer organized a...
Is he a flip phone guy? Really?
Wow.
Famously.
Chuck Schumer taking the offline challenge.
Good for him.
Yeah, I don't think he ever...
I don't think he was ever online.
He organized a three-hour meeting of the minds in Washington this week, attended by Elon Musk, Mark Zuckerberg, Bill Gates, the CEOs of Google, OpenAI, Microsoft, various Democratic and Republican politicians.
All around AI, most of the executives agreed on the need for regulating AI, with Elon warning the group of civilizational risks.
But according to the New York Times,
there was some disagreement among the titans.
Here's a quote.
Mr. Zuckerberg highlighted open source research
and development of AI,
which means that the source code
of the underlying AI systems are available to the public.
And he said, quote,
open source democratizes access to these tools
and that helps level the playing field
and foster innovation for people and businesses. Apparently Bill Gates and others raised concerns that open-source AI could lead to security risks.
No shit. I just want to say that, like, Zuckerberg's quote reminded me of his argument for Facebook in the first place. Like, all this is about is connecting the world, and the more you connect people all around the world, it democratizes communication.
And everyone has a voice and everything's going to be wonderful.
And it's the same shit.
I could be proven wrong.
I don't know enough about this.
But it does seem to me that giving everyone in the world the tools to do whatever they want with AI was not going to lead to a good place. It really underscores for me that if I wanted to know the best way to regulate a new emerging
technology, the people who I would consult on it are not the heads of the major companies
dominating the tech sector. It's like it's a real contrast that as the executive branch of
our government is questioning and seeking to reduce the power of these companies,
the legislative branch is consulting those same companies for how they should like guide the tech industry going forward.
It's a real like the frog holding hearings, asking the scorpion to testify on the best way to cross the river.
I guess there were some other experts there, too, that were not affiliated with the company.
I wouldn't want to be too cynical, but, like, they're closed-door hearings, so we don't know exactly what was said.
Yeah, I also think there's a little, and this is probably left over from the Obama era, there's still a little bit of, like, oh, these guys are so smart and they can...
Yeah, the smartest people in the room.
And there's definitely more skepticism now, and I'm sure Schumer and a lot of the other Republican and Democratic politicians are skeptical of them.
But still, it's still sort of a little bit of a hangover from, you know, Silicon Valley knows all.
I want to believe that Congress has learned its lesson because they did finally come around on the social media companies, but it kind of feels like deja vu where it took them so long to even understand
how the social media platforms work. I'm sure you remember those infamous 2018 hearings with
Mark Zuckerberg where Orrin Hatch is asking, how do you make your money? Why do I have so many
chocolate ads on my feed? And at that point, social media had been like a national emergency for two or three
years, and they still didn't know how it worked. And a lot of people have come around since then.
And by 2020, you did see a lot of people in Congress who had gotten very smart on it,
and their staff had finally gotten smart on it. But it does feel like we haven't fully learned
our lesson about how you have to be ahead of the curve on the tech companies instead of just asking them, well, how does your technology work?
And, you know, on that note about timing, like, I really worry that a lot of this regulatory talk about artificial intelligence is coming a bit late already, even though it's just been a huge topic that's sort of broken out of the tech world and into the political world in maybe the last
year or two. But we have an election coming up. And I think that there are a lot of bad actors,
many of them with authoritarian bents in the world, who would like to see Donald Trump reelected.
And, you know, I don't want to overstate the effects of propaganda because low tech propaganda works pretty well on people as well.
But you have a lot of AI-generated propaganda from a lot of bad actors involved in an election in 2024, whether domestic or foreign, and it doesn't seem like it's a recipe for anything good. And the thing that does make me a
little bit sympathetic to the difficulty of regulating it is that technology moves so fast.
And it's hard for even the people who... I think it's really important to understand about this generation of AI that not even the people who design it and build it are fully aware of, or able to fully understand, what it's capable of, because so much of it is self-guided. That doesn't mean it's Skynet and it's going to, like, break free and take over. But it does mean that what it's able to do, you just, like, run it and you kind of test it and see, like, what can it do, what can it not do. And so if even the people who are making it are, like, learning about it after the fact, there's going to be no way for regulators to keep ahead of the curve.
Yeah, it feels like with every technological development,
there are unintended and unforeseen consequences.
And it seems like for AI, that is going to be on steroids.
Right, yeah.
Which is, and look, I think the White House
has been on top of this.
I think they know.
I mean, when Dan interviewed White House chief of staff Jeff Zients on Pod Save America a couple months ago, he was like, oh, this is one of the top three issues the president's concerned about, which sort of surprised me.
Yeah, wow. Yeah, surprising.
Yeah. And so they're gonna come up with some regulations. I mean, even, I think, like, and, you know, I don't know if it's a Band-Aid solution or not, but for the 2024 election, even requiring watermarks on AI-generated content.
I don't know.
That seems like it could help a little bit.
Yeah.
I mean, there's been a lot of technologies that we have developed in our country's history where because that technology is considered to be dangerous, the government regulators are involved in monitoring it.
And you just like you have to have just like a grown up in the room just kind of keeping track of where it's going and what you're working on.
I mean, that's been the history with weapons development in the country.
And that doesn't mean that you're stopping progress.
It doesn't mean the government is controlling it or spying on it.
But I think we do have to.
There's, I think, broad acknowledgement that we don't fully understand how powerful this could be. And
as skeptical as I am of the government's ability to get ahead of this, I do think just close
involvement with these companies is going to be helpful. So more meetings with Elon and Zuck,
who I guess were quite frosty to each other, is what I heard. I was waiting for a cage match to break out.
Yeah, and they're both still willing, I guess.
But we'll see.
Yeah, I don't buy it.
Speaking of Elon,
we got to talk about Walter Isaacson's new biography
about the billionaire.
Lots of excerpts already making news.
I have to say,
I thought the story about his radicalization was quite interesting and insightful. It seems like, well, it seems like two big things happened. One, he was mad that he had to shut his Tesla factory down because of the pandemic, wanted people to keep working, didn't think the pandemic was a big deal. So he's a real, like, you know, freedom, Fauci-stealing-my-freedom kind of guy.
And it seems like he's even angrier that his eldest child not only transitioned, but became a leftist.
And he blames what he calls the progressive woke indoctrination at the private school right here in L.A., Crossroads, for the fact that she no longer wants to spend any time with him.
And here's a quote he gave to Walter Isaacson.
Unless the woke mind virus,
which is fundamentally anti-science,
anti-merit, and anti-human in general,
is stopped,
civilization will never become multi-planetary.
I just loved where that sentence ended.
I know. It's definitely,
it takes a little Elon Musk twist at the end.
Oh no,
we're not going to become multi-planetary
because of the woke mind virus.
It's funny.
He's in so many ways,
he just sounds like you're kind of like
standard 50 something Fox News dad,
but with just with like,
there's a little sci-fi guy,
a little like flavor note at the end.
Well, this is,
I mean,
I want to hear what you think is interesting about this book, but for me, that sort of explains it all. And, you know, I think liberals sometimes tend to think that people on the far right, like, if you're radicalized, if you're a Fox News-watching person, you've got to be dumb, you know. And there's been a little, like, oh, Elon's not really a genius, Elon's dumb kind of thing. I think Elon is clearly a very smart guy about a lot of topics. He thinks he's smarter on a lot more topics than he is, but, like, he's building cars, building rockets. Some kind of smart guy.
Sure.
And yet he clearly fell down the rabbit hole of internet radicalization. And some of it was personal. Some of it was tendencies that he clearly already had there in his life, right? But it's fairly clear what happened here is that this guy was brain-poisoned by the internet.
Yeah. I think you're right that the kind of big question this book is trying to answer is, like, what happened to Elon Musk? And the project of this book is to find, like, a grand unified theory, like, what's the Rosebud sled of Elon Musk? And I think that he has assembled some, like, meaningful stories from a long Elon Musk life that, like, helped to chart his progress. I think he overstates how much there's, like, the aha, like, radicalization, this-is-where-he-turned moments.
Which you do to tell a story.
It's true.
And it's,
it's over exaggerated.
Yeah.
And I think that like,
look,
when you are a full-time book writer,
it's like you and I know a lot of full-time book writers.
Like you have to churn out a lot of books to make ends meet.
Even if you're a big writer,
like Walter Isaacson, like I think this book was kind of a paycheck job. Like, I'm sorry,
like no shade to my guy, but like there are passion projects and there's one where you like
churn it out off of 10 interviews and it's like hope to make a lot of money. I think that's what
this was. And the like case for what happened to Elon, he was radicalized by these two events,
I think is way overstated. I think that, you know,
you see a lot of his tendencies from much earlier in his life,
like in the PayPal era, you see a lot of this.
And I think in a lot of ways,
he is like kind of just your standard Obama Trump voter.
Like, I think that that really tracks like the timeline.
I think it tracks the things that he talks about,
that he cares about. The fact that, like, Walter Isaacson makes a big deal about, like, he used to be an Obama donor and now, like, look at how fucking crazy he is. And I think that there are probably particulars from his life that track along and, like, parallel the general Obama-Trump voter journey, which is this, like, small but really well-documented phenomenon. And I think it's just basically, like, the same thing that we know happened generally with Obama-Trump voters, which was racial resentment and, like, a backlash to social change that felt like it was going faster, moving at a pace that he didn't like and wasn't comfortable with. And now he's just, like, taken this hard right turn, as so many people in not just the United States but a lot of Western countries have.
And I think that for him, and for a lot of the Silicon Valley online crowd that used to be center, center-right, it has since gone even further reactionary. There's, like, an added level to it, because I think a lot of Obama-Trump voters are just, by demographics, non-college educated.
Sure, right. Like, Midwestern, middle of the country, that's sort of the stereotype, right? But then you've got, like, the Elons, the All-In pod folks, the Blake Masterses, and, on the extreme end of this, right, Peter Thiel. And for them, you know, having gone to a number of these fundraisers with Obama way back in the day, you just... it's physically impossible to not roll your eyes at these fundraisers when the tech people talked. For all their brilliance, they're like, we believe in social progress, and they're culturally liberal and blah, blah, blah. There's just this libertarian streak that is just... so they view the government and they view politics as dirty and stupid and corrupt.
And they are the geniuses that can fix everything. So when they have to face any kind of rules, regulations, any kind of, you know, entreaty to think about others and the populace and stuff like that, like, and to come together as a community, they resist.
And I think that was clearly going on around the pandemic.
Right. And then I think some of the more cultural, social stuff is in line with what you're saying, which is, you know, he had a kid who transitioned, and suddenly he thinks that's the reason she doesn't want to spend time with him.
Right, and his turn to the right is both very distinct and, like, now society is changing.
Very Roman Empire.
Yeah, right, you're always thinking about the Roman Empire, Jon. Um, and, like, you know, at some point in the, like, 2010s, a lot of non-college white men started to learn, like, oh my God, I have to make space in society, like, I don't get to be the top of the social hierarchy. Not everything is going to be
for me. I have to make room for people of color, LGBT people, people who are different from me.
And I hate that. And I hate that change. And I want to reclaim my place at the top of the hierarchy,
much like tech guys came up, especially tech guys of the Gen X Elon Musk generation came up in the
90s thinking that like, we are the heroes of the universe.
We're the masters.
We're the smartest people in the world, because they were riding off of all this free VC money.
And then it started to crumble in the 2010s because there was a big backlash to Silicon Valley after Trump was elected.
There was like interest rates are going up.
Regulations are coming down.
And they're saying, well, people are turning against us. They're no longer saying we're the smartest
people in the history of the world. So that must mean there's something wrong with the world and
we have to do this big backlash to reclaim our place. And there's this like mission creep where
it's like, I'm smart about this. So therefore I am smart about everything, right? Which is the whole,
the whole Twitter story with Elon, right? Like I could build rockets and cars, so I must be able to run a social network.
How hard can people be?
Right, right, right.
Well, dude.
And this is, like, again, something that I think kind of gets under my skin about the Walter Isaacson component is, like, he's trying to trace an arc, and I get that.
But, like, I really think this is who he has always been.
Also in terms of, like, fucking up his businesses. Like, you read the stories about him getting pushed out of PayPal, and it's because he tried to rename the company X for no good reason, and they were like, what the hell are you doing, and he tried to branch it off into all these businesses that it shouldn't have been in, which is, like, the same thing that he's doing now. I think, in his personal life and his personality, he's always been a puckish asshole.
Sure.
Yeah.
I think that is for sure.
And I'm saying that not like.
As a puckish asshole yourself.
As a puckish asshole.
Right.
No.
Knowing people who know him, knowing people who've dealt with him from way back when,
like this is all, it all fits.
And then I think, but I think there is a story about political radicalization that actually
now his political beliefs are more in line with his personality.
Yeah, that's true.
That's sort of how I see it.
That's true.
Yeah, right.
I guess the question with political radicalization is always, how much is it driven by particular events in some way?
And this is true of anyone.
This is true of the randos who join QAnon, particular events in their life, particular shocks, and how much of it is because of broader societal trends that they're just a part of.
And usually it's a mix of both. Anything else from the biography that you're interested in?
So this episode about Elon Musk shutting down or refusing to open up access to this Starlink
remote internet service to the Ukrainian military when it was trying to launch this naval drone attack on Russian forces in the occupied Ukrainian port of Sevastopol, I think is pretty interesting.
And it's like, it's not a story that was first reported in the Walter Isaacson book, but we got some new details on it.
Now, some of those details turned out to be wrong.
Whoops.
Big whoops. But I think this is like a thing,
a real thing to keep an eye on is like Elon Musk's relationship to the Russian government. I'm not
trying to like push a big conspiracy theory. I don't think he's in Vladimir Putin's pocket, but
like he has implied to a lot of people that he talks on the phone to Vladimir Putin. He has said
himself that he did this to block a Ukrainian attack on Russian forces because
of a conversation he had with the Russian ambassador to the United States, in which the
Russian ambassador- Who's always on the level.
That's right. Right. Who turned out to have lied to Elon Musk. He said that if Ukraine launches an attack on Crimea, where this port is, it will lead to nuclear war. That was never
credible. Everybody who knows anything about this war knows that that was not true. But he believed that this was true. And so he is like still talking very
proudly about, like, yes, we stopped the Ukrainians from launching this attack because we have the Starlink service that they use, by, like, shutting off or refusing to turn on the service for them. So all these drones, like, floated up onto the beach. And, like, people in the U.S. government are worried about this.
Yeah. You don't have to believe a conspiracy that, like, Elon Musk is a Russian asset who wants Putin to win the war.
Sure.
But what's obvious is that he's a guy who's had his brain poisoned by the internet, who's been radicalized by a lot of the right-wing bullshit that you see on the internet.
Which is very pro-Russian.
And therefore, his view of the war is in line with what a lot of the folks on the right think.
Yeah.
And so, but the only difference is
he's in control of this incredibly important
communication system and billions of dollars
and has this relationship with the U.S. government.
The U.S. government has to depend on him for shit.
I mean, it's...
Yeah, this worries me a lot. As bad as the Twitter stuff is, the fact that he is an integral node in not just American but international, like, military defense infrastructure is worrying, because he's made clear with this episode that he
feels that it is not just okay, but actually really good for him to intervene and to have
like a foreign policy of
his own and to say like, I don't think this attack is a good idea, so I'm not going to let it happen.
And it's why, you know, the free speech champion that Elon Musk is just sort of,
that goes away when it comes to China, because he's got business interests,
but particularly around Tesla and China. And there is a part in the book where,
I guess, Isaacson reports on a conversation that Elon had with Bari Weiss, the journalist Bari Weiss, about, he said, oh, yeah, China, we are going to have to be careful what is
said on Twitter about China because of my business interests. Now, you're a little more skeptical of that. So I hate to say it, but I don't think that what we learned about this particular China
conversation is that damning or that bad. So what he is reported to have said is like,
we have to be careful about what we say about China because we have business interests there,
which is, I don't like that that's true, but like every company has business in China
and they will all say the exact same thing.
That's not like a good thing,
but it doesn't mean that he's doing like pro-China censorship.
It doesn't mean that he's like steering policy towards China.
I don't think we have any evidence yet
that he is like manipulating Twitter
to appease Chinese authorities.
And I think we would see that pretty quickly
because everything he does there is so fucking clumsy. It's very obvious. I'm going to turn my entire account into an anti-China
propaganda screed and see what happens. Do you think I'll get shadow banned?
I think give it a shot. I think it'll be interesting to see what your engagement numbers
do. I think that you should become a hardcore Taiwanese nationalist and be just like,
Chiang Kai-shek is your bio photo now. You're all about the Taiwanese National Party. You're reclaiming all of greater China for Taiwan. I think that
would be a fun bit for you. I guess I should really get TikTok off my phone.
All right. That's all the time we have for today. I think we solved that problem.
We'll see you next Sunday. Bye, everyone. Cheers. Andrew Chadwick is our sound editor. Kyle Seglin, Charlotte Landis, and Vassilis Fotopoulos sound engineered the show.
Jordan Katz and Kenny Siegel take care of our music.
Thanks to Michael Martinez, Ari Schwartz, Amelia Montooth, and Sandy Gerard for production support.
And to our digital team, Elijah Cohn and Rachel Gajewski, who film and share our episodes as videos every week. If you're looking for more Crooked content, check out the latest episode of our YouTube original series, Liberal Tears.
Host Tommy Vietor and political commentator Brian Tyler Cohen team up to rank the most what-the-fuck moments from political press conferences.
Look, if you want to watch Tommy Vietor get hazed by Brian Tyler Cohen, this is the show for you.
He's had to eat hot chips.
He got tramp stamps of Mike Pence.
The whole thing.
Search Liberal Tears on YouTube and subscribe to make sure you don't miss future episode drops.
Also, it's been one month since Crooked Media Reads published our first book, Mobility, by Lydia Kiesling.
And we got some updates.
Not only did Mobility make national bestseller lists,
it has also received amazing reviews,
such as this one from the LA Times,
which called it an emotionally
and geopolitically savvy coming-of-age story.
Don't let another month pass you by.
Grab your copy of Mobility wherever books are sold.