Hard Fork - The Interview: How Wikipedia Is Responding to the Culture Wars
Episode Date: November 25, 2025

Last month our colleague Lulu Garcia-Navarro had a conversation with Wikipedia's co-founder Jimmy Wales about the challenges the site is facing — including by right-wing influencers who claim it is biased and by A.I. chatbots that compete with its content. We found the conversation interesting, and think you might too. So to tide you over until our special holiday episode on Friday, we're bringing you that conversation from the New York Times podcast "The Interview."

Guests: Jimmy Wales, co-founder of Wikipedia and author of "The Seven Rules of Trust: A Blueprint for Building Things That Last"

Additional Reading:
The Culture Wars Came for Wikipedia. Jimmy Wales Is Staying the Course.
Elon Musk Challenges Wikipedia With His Own A.I. Encyclopedia
Elon Musk Groks Wikipedia

We want to hear from you. Email us at hardfork@nytimes.com. Find "Hard Fork" on YouTube and TikTok. Subscribe today at nytimes.com/podcasts or on Apple Podcasts and Spotify. You can also subscribe via your favorite podcast app here https://www.nytimes.com/activate-access/audio?source=podcatcher. For more podcasts and narrated articles, download The New York Times app at nytimes.com/app.
Transcript
Well, Casey, it's Thanksgiving week.
Happy Thanksgiving week to you and your family, Kevin.
It's also a Tuesday.
It is a Tuesday, but we had a couple things that we wanted to tell you about.
The first of them is that we want to bring you a recent episode of The Interview that we really thought you might like.
Yes. Also, we wanted to take a week off and go have Thanksgiving dinner with our families.
Oh, yeah. You're not going to catch us in the studio this time.
So last month, our colleague Lulu Garcia-Navarro interviewed Jimmy Wales, the co-founder of Wikipedia.
they had a really interesting conversation about all of the culture wars surrounding Wikipedia
right now, how it operates, the political bias accusations that it has faced, the challenges
they're now facing from AI and from Grokipedia, Elon Musk's version of Wikipedia.
And we should note that this interview happened just before Grokipedia was released to the public,
but we thought this was a conversation well worth hearing, and we wanted to bring it to you
on this Thanksgiving week.
Absolutely.
And the second thing we wanted to tell you is that, believe it or not, Kevin, it has been an entire year since we released our episode on the 100 most iconic technologies of all time.
And in the wake of that episode, listeners said, we would like to hear about more technological icons.
And so we are proud to announce today that this Friday, in this very feed, we will be releasing the long-awaited follow-up to that list, the 50 most iconic technologies of 2025.
Yes.
And if you're wondering, why only 50 this year?
Casey, why are there only 50 this year?
It's because there were not that many iconic technologies this year.
And we have a very high bar for what counts as an icon.
And, you know, Kevin, just to give people a little teaser of what they can expect,
we have a clip from this episode that's coming up on Friday.
Rare Earth Metals!
So that is coming up on Friday.
If you have a long drive somewhere to see your family. On Black Friday, because you're going to the mall. If you're camped out in line outside of a Walmart waiting for the Labubus to open up.
This is a great thing you can listen to.
And by the way, did Labubus make the list?
There's only one way to find out: Hard Fork, this Friday.
And we hope you have a great holiday.
We're thankful for you.
Very thankful.
From the New York Times, this is The Interview. I'm Lulu Garcia-Navarro.
As one of the most popular websites in the world, Wikipedia helps define
our common understanding of just about everything.
But recently, the site has gone from public utility
to a favorite target of Elon Musk,
congressional Republicans, and MAGA influencers,
who all claim that Wikipedia is biased.
In many ways, those debates over Wikipedia
are a microcosm of bigger discussions we're having right now
about consensus, civil disagreement,
shared reality, truth, facts,
all those little easy topics.
A bit of history. Wikipedia was founded back in the Paleolithic era of the Internet in 2001 by Larry Sanger and Jimmy Wales.
It was always operated as a non-profit and it employs a decentralized system of editing by volunteers, most of whom do so anonymously.
There are rules over how people should engage on the site, cordially, and how changes are made, transparently.
And it's led to a culture of civil disagreement that has made Wikipedia what some have called the last best place on the internet.
Now, with that culture under threat, Jimmy Wales has written a book called The Seven Rules
of Trust, trying to take the lessons of Wikipedia's success and apply them to our increasingly
partisan trust-depleted world.
And I have to say, I did come in skeptical of his prescriptions, but I left hoping he's right.
Here's my conversation with Wikipedia co-founder Jimmy Wales.
I wanted to talk to you because I think this is a very tenuous moment for trust, and your new book is all about that.
In it, you sort of lay out what you call the seven rules of trust based on your work at Wikipedia.
And we'll talk about all those, as well as some of the threats and challenges to Wikipedia.
But big picture, how would you describe our current trust deficit?
I think I draw a distinction between what's going on maybe with politics and journalism,
the culture wars and all of that, and day-to-day life. Because I think in day-to-day life,
people still do trust each other. People generally think most people are basically nice. And we're
all human beings bumping along on the planet, trying to do our best. And obviously, there are
definitely people who aren't trustworthy.
But the crisis we see in politics, trust in politicians, trust in journalism, trust in business, that is coming from other places and is something that we can fix.
One of the reasons why you can be an authority on this is because you created something that scores very high on trust.
You have built something that people sort of want to engage with.
Yeah. I mean, I do think Wikipedia isn't as good as I want it to be. And so I think that's part of why people do have a certain amount of trust for us because we try to be really transparent. You know, you see the notice at the top of the page sometimes that says the neutrality of this page has been disputed or the following section doesn't cite any sources. People like that. Not many places these days will tell you, hey, we're not so sure here.
It shows that the public does have a real desire for unbiased, sort of neutral information.
They want to trust.
They want the sources.
They want you to prove what you're saying and so forth.
How does Wikipedia define a fact?
Basically, we're very old-fashioned about this sort of thing.
What we look for is good quality sources.
So we like peer-reviewed scientific research, for example, as opposed to populist tabloid reports. You know, we look for quality magazines,
newspapers, et cetera, so we don't typically treat a random tweet as a fact. And so we're pretty
boring in that regard. Yeah, it's sort of like the publication that you cite gets cited by other
reputable sources, that it issues corrections when it gets things wrong. It's all the old-fashioned
sort of good stuff. And I think it's important to say when we look at different sources,
they will often come to things from a different perspective or a different political point
of view. That doesn't diminish the quality of the source. So, for example, I live here in London,
in the UK. We have the Telegraph, which is a generally right-leaning but quality newspaper.
We have the Guardian, generally left-leaning but a quality newspaper. Hopefully, as you read the articles and glean through them, the facts should be reliable and solid. But you have to be very careful as an editor to tease out, okay, what are the
facts that are agreed upon here? And what are the things that are opinions on those facts?
And that's, you know, that's an editorial job. It's never perfect and it's never easy.
And Wikipedia also is famously open source. It's decentralized and essentially it's run by
thousands of volunteer editors. You don't run Wikipedia, we should say.
It runs me.
How do those editors fix disputes when they don't agree on what facts to be included or on how something is written?
How do you negotiate those differences?
Well, in the best cases, what happens and what should happen always is take a controversial issue like abortion.
Obviously, if you think about a kind and thoughtful Catholic priest and a kind and thoughtful Planned Parenthood activist, they're never going to agree about abortion.
but probably they can come together, because I said they're kind and thoughtful, and say,
okay, but we can report on the dispute. So rather than trying to say abortion is a sin or abortion
is a human right, you could say, Catholic Church position is this, and the critics have responded
thusly. You'll start to hear a little of the Wikipedia style, because I believe that that's
what a reader really wants. They don't want to come and get one side of the story. They want to come and
say, okay, wait, hold on. I actually want to understand what people are arguing about. I want to
understand both sides. What are the best arguments here? Yeah, and basically every page has what's
called a talk tab where you can see the history of the discussions and the disputes, which relates
to another principle of the site, which is transparency. You can look at everything and see who
did what and what their reasoning was. Yeah, exactly. So, you know, oftentimes if you see something
surprising, you think, huh, okay, well, why does it say that? Often, you'll be able to go on the talk page and read sort of what the debate was and how it went. And you can weigh in there
and you can join in and say, oh, actually, I still think you've got it wrong. Here's some more
sources. Here's some more information. Maybe propose a compromise, that sort of thing. And in my
experience, it turns out that a lot of pretty ideological people on either side are actually
more comfortable doing that because they feel confident in their beliefs. I think it's the people
who, and you'll find lots of them on Twitter, for example, they're not that confident in their
own values and their own belief system. And they feel fear or panic or anger if someone's
disagreeing with them rather than saying, huh, okay, look, that's different from what I think.
Let me explain my position, which is where you're more intellectually grounded person will
come from. What you're saying is supported actually by a study about Wikipedia that came out
in the science journal Nature in 2019; it's called "The Wisdom of Polarized Crowds."
Perhaps counterintuitively,
it says that politically contentious Wikipedia pages
end up being of higher quality,
meaning that they're more evidence-based,
they have more consensus around them.
But I do want to ask about the times
when consensus building isn't necessarily easy
as it relates to specific topics on Wikipedia.
Some pages, they have actually restricted editing privileges.
So the Arab-Israeli conflict,
climate change, abortion, unsurprising topics there. Why are those restricted, and why doesn't the wisdom of polarized crowds work for those subjects? Well, with the subjects that are restricted, we typically try to keep that list as short as we can. The most common type of case is if something's really
big in the news or if some big online influencer says, ah, Wikipedia's wrong, go and do something
about it. And, you know, we get a rush of people who don't understand our culture, don't
understand the rules, and they're just vandalizing or they're just sort of being rude and
so on and so forth. And we just, as a calming down, just like, okay, hold on, just slow down.
We're going to protect the page. And then there are pages where, you know, the most common type
of protection, we call semi-protection, which just means you have to have had an account for,
I forget the exact numbers, it's something like four days and you have to have made 10 edits without getting banned.
Now, typically, and this is what's surprising to a lot of people about Wikipedia, like 99% of the pages, maybe more than 99%, you can edit without even logging in, and it goes live instantly.
That's like mind-boggling, but it kind of points to the fact that most people are basically nice.
Most people are trustworthy. People don't just come by and vandalize Wikipedia, and often if they do, it's because they're just experimenting, or they didn't believe that it would; they're like, oh my God, it actually went live.
I didn't know it was going to do that.
It's like, yeah, please don't do that again.
This brings me to some of the challenges.
Wikipedia, while it has created this very trustworthy system,
it is under attack from a lot of different places.
And one of Wikipedia's sort of superpowers can also be seen as a vulnerability, right?
The fact that it is created by human editors.
And human editors can be threatened, even though they're supposed to be anonymous.
You've had editors doxxed, pressured by governments to doctor information.
Some have had to flee their home countries.
I'm thinking of what's happened in Russia, in India,
where those governments have really taken aim at Wikipedia.
Would you say this is an expanding problem?
Yeah, I would.
I think that we are seeing all around the world
a rise of authoritarian impulses
towards censorship, towards controlling information.
And very often these come, you know, as a wolf in sheep's clothing, because it's all about protecting the children or whatever it might be that, you know, you move forward in these kinds of control ways.
But at the same time, you know, the Wikipedians are very resilient and they're very brave.
And one of the things that we believe is that in many, many cases, what's happened is a real lack of understanding by politicians and leaders of how Wikipedia works.
A lot of people really have a very odd assumption
that it's somehow controlled by the Wikimedia Foundation,
which is the charity that I set up
that owns and operates the website.
Therefore, they think it's possible to pressure us
in ways that it's actually not possible to pressure us.
The community has real intellectual independence.
But yeah, I do worry about it.
I mean, it's always something that weighs very heavily on us
is volunteers who are in dangerous circumstances
and how do they remain safe is, like, critically important.
I want to bring up something that just happened here in the U.S.
In August, James Comer and Nancy Mace, two Republican representatives from the House Oversight Committee,
wrote a letter to Wikimedia,
requesting records, communication, analysis on specific editors
and also any reviews on bias regarding the state of Israel,
in particular.
The reason, and I'm going to quote here, is because they are investigating the efforts of foreign operations and individuals at academic institutions subsidized by U.S. taxpayer dollars to influence U.S. public opinion.
So can you tell me your reaction to that query?
Yeah.
I mean, you know, we've given a response to the parts of that that were reasonable.
I mean, what we feel like is there's a deep misunderstanding or lack of understanding about how Wikipedia works.
You know, ultimately, the idea that something being biased is a proper and fit subject for a congressional investigation is frankly absurd.
And so, you know, in terms of asking questions about cloak and dagger, whatever, we're not going to have anything useful to tell them.
I'm like, I know the Wikipedians, they're like a bunch of nice geeks.
Yeah, I mean, the Heritage Foundation here in the United States, which was the architect of Project 2025, has said that they want to dox your editors.
I mean, how do you protect people from that?
I mean, it's embarrassing for the Heritage Foundation.
I remember when they were intellectually respectable, and that's a shame. If that's what they think is the right way forward, they're just badly mistaken.
But it does seem that there is this movement on the right to target Wikipedia over these types of concerns. And I'm wondering why you think
that's happening. I mean, it's hard to say. There's a lot of different motivations, a lot of
different people. Some of it would be, you know, genuine concern if they see that maybe Wikipedia's
biased or, you know, I have seen, for example, Elon Musk has said, Wikipedia is biased because they have
these really strong rules about only citing mainstream media, and the mainstream media is
biased. Okay. I mean, that's an interesting question, interesting criticism. Certainly, I think,
worthy of some reflection by everyone, the media, and so on and so forth. But it's hardly,
you know, it's hardly news to anybody and not actually that interesting. Then other people,
you know, in various places around the world, not speaking just of the U.S., but, you know,
facts are threatening. And if you and your policies are at odds with the facts, then you may find
it very uncomfortable for people to simply explain the facts. And I don't know, that's always
going to be a difficulty. But we're not about to say, gee, you know, maybe science isn't valid
after all. Maybe the COVID vaccine killed half the population. No, it didn't. Like, that's crazy
and we're not going to print that. And so they're going to have to get over it. I want to talk
about a recent example of a controversy surrounding Wikipedia, and that's the assassination
of Charlie Kirk.
You know, Senator Mike Lee called Wikipedia wicked because of the way it had described Kirk on its page as a far-right conspiracy theorist, among other complaints that he had about the page.
And I went to look, and at the time that we're speaking, that description is now gone from Wikipedia.
Those on the left would say that that description was accurate.
Those on the right would say that that description was biased all along. How do you see that tension? Well, I mean, I think the correct answer is you have to address all of that.
You have to say, look, this was a person. I think the least controversial thing you could say about
Charlie Kirk is that he was controversial. I don't think anybody would dispute that. And to say,
like, okay, this was a figure who was a great hero to many people and treated as a demon by others. He had these views, many of which are out of step with, say, mainstream
scientific thinking, many of which are very much in step with religious thinking and so on
and so forth. And those are the kinds of things that if we do our job well, which I think we have
in this case, we're going to describe all of that. Like maybe you don't know anything about
Charlie Kirk. You just heard, oh my God, this man was assassinated. Who was it? What's this all
about? Well, you should come and learn all that. You should learn, like, who his supporters were
and why they supported him and what are the arguments he put forward and what are the things
he said that upset people. That's just part of learning what the world is about.
So those words that were there, far right and conspiracy theorist, those were, in your view, the wrong words, and the critics of Wikipedia had a point?
Well, it depends on the specific criticism. So if the criticism is, this word appeared on this page for minutes, I'm like, you know what, you've got to understand how Wikipedia works. It's a process.
It's a discourse. It's a dialogue. But to the extent that he was called a conspiracy theorist by
prominent people, that's part of his history. That's part of what's there. And Wikipedia
shouldn't necessarily call him that. But we should definitely document all of that.
You mentioned Elon Musk, who's come after Wikipedia. He calls it Wokipedia. He's now trying to start his own version of Wikipedia called Grokipedia, and he says it's going to strip out ideological
bias. I wonder what you think attacks like his do for people's trust in your platform writ large.
Because as we've seen in the journalism space, if enough outside actors are telling people not to
trust something, they won't. Well, you know, it's very hard to say. I mean, I think for many people
their level of trust in Elon Musk is extremely low because he says wild things all the time.
So to that extent, you know, when he attacks us, people donate more money.
So, you know, that's not my favorite way of raising money.
But the truth is, a lot of people are responding very negatively to that behavior.
One of the things I do say in the book, and I've said to Elon Musk, is that that type of attack is counterproductive, even if you agree with Elon Musk, because to the extent that he has convinced people
falsely that Wikipedia has been taken over by woke activists, then two things happen.
Your kind and thoughtful conservatives, who we very much welcome, and we want more people
who are thoughtful and intellectual and maybe disagree about various aspects of the spirit of
our times. Come and join us and let's make Wikipedia better. But if those people think, oh no,
it's just going to be a bunch of crazy woke activists, they're going to go away.
And then on the other side, the crazy woke activists are going to be like, great, I found my home.
I don't have to worry about whatever.
I can come and write rants against the things I hate in the world.
We don't really want them either.
You said you talked to Elon Musk about this.
When did you talk to him about it and what was that conversation like?
I mean, we've had various conversations over the years.
You know, he texts me sometimes.
I text him sometimes.
he's much more respectful and quiet in private.
But that you would expect; he's got a big public persona.
When was the last time you had that exchange?
That's a good question.
I don't know.
I think the morning after the last election, he texted me that morning. I congratulated him.
Obviously, the debate that happened more recently was because of
the hand gesture that he made that was interpreted in different ways,
and he was upset at the way that it had been characterized on Wikipedia.
Yeah, I heard from him after that.
I mean, in that case, I pushed back because I went to check, like, oh, what does Wikipedia say?
And it was very matter of fact.
It said he made this gesture, it got a lot of news coverage, many interpreted it as this,
and he denied that it was a Nazi salute.
That's the whole story.
It's part of history.
I don't see how you could be upset about it being presented in that way.
If Wikipedia said, you know, Elon Musk is a Nazi, that would be really, really wrong.
But to say, look, he did this gesture and it created a lot of attention and some people said it looked like a Nazi.
Yeah, that's great.
That's what Wikipedia is.
That's what it should do.
Do you think Elon Musk is acting in good faith?
You're saying that in private he's nice and cordial, but his public persona is very different.
You know, I think it's a fool's errand to try and figure out what's going on in Elon Musk's mind.
So I'm not going to try.
I don't mean to press you on this.
I'm just trying to refer to something that you said, which is people human to human are nice, right, that we're good, that we should assume good faith.
And so you're saying that Elon one-on-one is lovely.
But he is attacking your institution and potentially draining support for Wikipedia.
Well, I mean, I don't think he has the power he thinks he has
or that a lot of people think he has to damage Wikipedia.
I mean, we'll be here in 100 years and he won't.
So I think as long as we stay Wikipedia, people will still love us.
They will say, you know what, Wikipedia is great.
And all the noise in the world and all these people ranting, that's not really the real thing.
The real thing is genuine human knowledge, genuine discourse, genuinely grappling with the difficult issues of our day, that's actually super valuable.
So there's a lot of noise in the world.
I hope Elon will take another look and change his mind.
That would be great.
But I would say that of anybody.
And, you know, in the meantime, I don't think we need to obsess over it or worry that much about it.
you know, we don't depend on him for funding. And yeah, there we are. I hear you saying that part of
your strategy here is just to stay the course, do what Wikipedia does. Are there changes that
you do think Wikipedia needs to make to stay accurate and relevant? Well, I think what we have to
focus on when we think about the long run, we also have to keep up with technology. You know,
we've got this rise of the large language models, which are an amazing but deeply flawed
technology. And so the way I think about that is to say, okay, look, I know for a fact,
like, no AI today is competent to write a Wikipedia entry. It can do a passable job on a very big, famous topic, but on anything slightly obscure, the hallucination problem is disastrous.
But at the same time, I'm very interested in how we can use this technology to support our community. One idea is, you know, take a short entry and feed in
the sources. Maybe it's only got five sources and it's short and just ask the AI, is there anything
in the entry that's not supported by the sources or is there anything in the sources that could be
in Wikipedia but isn't? And give me a couple suggestions if you can find anything. As I've played
with that, it's pretty okay. It needs work and it's not perfect. But if we react with just like,
oh my God, we hate AI, then we'll miss the opportunity to do that. And if we go crazy, like,
oh, we love AI and we start using it for everything, well, we're going to lose trust because we're
going to include a lot of AI hallucinated errors and so on. I mean, that's interesting because
Wikimedia writes this yearly global trends report on what might impact Wikipedia's work. And for
2025, it wrote, quote, we are seeing that low-quality AI content is being turned out not just to spread
false information, but as a get-rich-quick scheme, and it is overwhelming the internet. High-quality information that is reliably human-produced has become a dwindling and precious commodity, end quote.
I read that crawlers from large language models have basically crashed your servers
because they use so much of Wikipedia's content.
And it did make me wonder, will people be using these large language models to answer their
questions and not going to the source, which is you? Well, you know, this has been a question
since they began. We haven't seen any real evidence of that. And I use AI personally quite a lot.
And I use it in a different way, though, different use cases. And how? So, I like to cook. It's my
hobby. I fancy myself as being quite a good cook. And I will often ask ChatGPT for a recipe.
I also ask it for links to websites with recipes.
It sometimes makes them up, so that's a bit hilarious.
And I'd also suggest being careful using ChatGPT for cooking, unless you actually already know how to cook, because when it's wrong, it's really wrong.
But Wikipedia would be useless for that.
Wikipedia doesn't have recipes.
It's a completely different realm of knowledge than encyclopedic knowledge.
So, yeah, I'm not that worried about it.
I do worry about, you know, in this time when journalism has been under incredible financial pressure,
and there's a new competitor for journalism, which is sort of low-quality, churned-out content,
produced for search engine optimization to compete with real human-written content,
to the extent that that further undermines the business model of, particularly, local journalism, that is something that I'm very worried about. That's a big problem. And it's not
directly about Wikipedia, but it is about, you know, it's very cheap to generate very plausible
text, and, yeah, hmm, that doesn't seem good to me.
It definitely doesn't seem good to me either as a journalist. I just recall that there was this
hope that as the internet got flooded with garbage, and this is even before AI, when it was just, you know, kind of troll farms and clickbait,
that it would benefit trustworthy sources of information.
And instead we've seen that the opposite has happened.
Wikipedia, news organizations, academic institutions.
They're all struggling with the same thing.
Why do you think that they are struggling in an era
where they should be flourishing if what you say is true
that people ultimately do want to trust the information
that they're getting?
Well, I mean, I think that a big piece of it is that the news media has not done a very good job
of sticking to the facts and avoiding bias.
I think a lot of news media, not all of it, but a lot of news media has become more partisan
and there are reasons for it, and it's short-termism.
And you'll even see, there's some stuff in the book about this, some arguments by some people in journalism that objectivity is something we should give up on, that we should be partisan and so on and so forth. I think that's a huge mistake. And I think there's lots of
evidence for that. You know, Wikipedia is incredibly popular and that's one of the things people
say about Wikipedia that they really value and they're really disappointed if they feel like
we're biased and so on and so forth. So, I mean, I gave the example earlier because I live in the
UK, I read these papers. But, you know, if you look at the Telegraph and you look at the Guardian
on an issue related to climate change,
I can already tell you before we start reading
which attitude is going to come from which paper.
Neither of them is doing a very good job of saying,
actually it's not our job to be on one side of that issue or the other.
It's our job to describe the facts
and to understand the other side and so on and so forth.
Returning to that value system is hugely important
because otherwise, how are you going to get the trust of the public?
I guess I'm surprised at you saying this because Wikipedia has been faced with similar attacks on its own credibility.
And you say that you are neutral and credible and that the system that you employ is fair.
And yet there are people who completely dispute that.
And so I think, in response to what is a very common broadside against journalists and journalism in this era, that they have taken a side, those of us on the inside would say it is part of a larger project of discrediting facts. And we've seen those
attacks on Wikipedia, we've seen them on academic institutions, and we've seen them on the
media. They are all part of the same thing. So I'd love you to tease out why it's unfair when it
happens to Wikipedia, but it's fair when it happens to journalistic institutions. Well, it's either
fair or unfair for both, depending on what the actual situation is. So, you know, there have been
cases in the history of Wikipedia when somebody said to me, wow, Wikipedia is really biased on this
topic. And I say the response should be, wow, let's check it out. Let's see. Let's try and do better.
If we find that we have been biased in some area, then we need to do a better job.
And I think that for many outlets, that isn't happening.
Instead, what's happening is pandering to an existing audience.
And I understand why.
The reason why that's happened, it has to do with the business model, has to do with
getting clicks online and so on and so forth.
It has to do with the fact that without sufficient financial resources, you have to kind of
scrap for every penny you can get in the short run, rather than saying, no, we're going to take
the long view, even though a few people may cancel. And, you know, I'm encouraging us all to say,
you know what, let's double down on that. Let's really, really take very, very seriously the need
to be trustworthy. After the break, Jimmy and I speak again about how he thinks part of Wikipedia's
success is the fact that profit isn't even on the table.
The most successful tweet I ever had, I think it was a New York Post journalist, tweeted to Elon, you should just buy Wikipedia when he was complaining something about Wikipedia.
And I just wrote, Not for Sale.
Hi.
Hello.
So I was thinking about our first conversation. And I was thinking about the moment that Wikipedia was
created in, a time before social media, before sort of the dramatic polarization that we've
seen, before the political weaponization of the internet that we've seen. I'm still, after talking
to you, sort of not sure that the lessons of how Wikipedia was created apply to today.
And so I wanted to ask you, do you think Wikipedia could be created now and exist in the same way that it does?
Yeah, I do.
Yeah, I think it could.
And I actually think that the lessons are pretty timeless.
At the same time, yeah, it's absolutely valid to acknowledge the Internet is different now.
And there's new problems, new problems that come from social media and all the rest, and the aggressively politicized culture wars that we live in. That is different. But I don't think
that's a permanent change to humanity. I think we're just going through a really crazy era.
Here we are. Why do you think the Internet didn't go the way of Wikipedia? You know,
collegial, working for the greater good, fun, nerdy, all the words that you use to describe that
moment of creation? Well, I, you know, the thing is, I'm old enough that I sort of grew up on
the internet in the age of Usenet, which was this massive message board, kind of like Reddit today,
except for not controlled by anyone, because it was by design, distributed and uncontrollable,
unmoderatable for the most part. And it was notoriously toxic. There was some skepticism then. And that was when, you know, it was first recognized, I suppose, that anonymity can be problematic, that people, you know, behind an alias, behind their keyboards, with no accountability, can be just really bad and really vicious.
And, you know, that's when we first started seeing spam.
I remember some of the early spam and everyone was like, oh, my God, what's this?
Spam, you know, it's terrible.
So I think some of these things are just human issues.
But now, you know, that's true to a larger degree than then.
We live online.
It's in our pocket all the time.
It's in our pocket all the time, yeah.
So obviously the impact is much more.
Hmm. I mean, I think I was thinking about Wikipedia in particular and maybe why it went a different way, in that you chose at a certain point to make it a not-for-profit. You chose not to sort of capitalize on the success of Wikipedia. And it made me wonder about, you know, OpenAI started as an open-source, for-the-greater-good project, kind of like Wikipedia. And they've now shifted into being a multi-billion-dollar business. I'd love to know your thoughts on that shift for OpenAI. But more broadly, do you think that the money part of it also changed the equation?
Yeah. I mean, I do think it made a difference in lots of ways. And I'm not against for-profit. You know, there's nothing wrong with for-profit companies. But even as a nonprofit, you do have to have a business
model, so to speak. You've got to figure out how you're going to pay the bills. And for Wikipedia,
that's not too bad. The truth is, we don't require billions and billions and billions of dollars
in order to operate Wikipedia. You know, we need servers and databases and we need to
support the community and all these kinds of things.
I would say in terms of the development of Wikipedia and how we're so community first and community
driven, you wouldn't really necessarily have that if the board were made up largely of investors
who were worried about the profitability and things like that. Also, I think it's important
today for our intellectual independence. We're under attack in various ways, as we've talked about.
And, you know, what's interesting is, you know, one of the things that isn't going to happen, actually... The most successful tweet I ever had, I think it was a New York Post journalist tweeted to Elon, you should just buy Wikipedia, when he was complaining something about Wikipedia. And I just wrote, not for sale. That was very popular. But it isn't for sale. And, you know, I just thought, you know what, I would like to imagine myself as a person who would say to Elon, no thank you, to a 30-billion-dollar offer if I owned the whole thing. But would I, you know, actually? 30 million? Yeah, I'm not interested. 30 billion? You know. And so that's not going to happen because
we're a charity and I don't get paid and the board doesn't get paid and all of that. And I do think
that's important for that independence that we're not, we don't think in those terms. We're not
even interested in that. Since we last spoke, the co-founder of Wikipedia, Larry Sanger,
has given an interview to Tucker Carlson that's getting a lot of attention here in the United States, on the right.
And he has had a lot to say about Wikipedia
and not a lot of it is good.
In the past, he's called it
one of the most effective organs
of establishment propaganda in history.
And we should say
that he believes Wikipedia has a liberal bias.
And in this interview
and on his X feed,
he's advocating for what he's calling
reforms to the site,
which include "reveal who Wikipedia's leaders are" and "abolish source blacklists."
And I just wonder what you make of it.
Yeah, I haven't watched it.
I can't bear Tucker Carlson, so I'm going to have to just suck it up and watch, I suppose.
So I'm not, I can't speak to the specifics in that sense.
But, you know, the idea that everything is an equally valid source, and that it's somehow wrong that Wikipedia tries to prioritize the mainstream media and quality newspapers, magazines, and make judgments about that, is not something I can in any way apologize for.
But, you know, there's no question.
Like, one of my sort of fundamental beliefs is that Wikipedia should always stand ready
to accept criticism and change.
And so to the extent that a criticism says,
oh, Wikipedia is biased in a certain way and that these are the flaws in the system,
well, we should take that seriously.
We should say, okay, like, is there a way to improve Wikipedia? Is our mix of editors right? At the same time, I also think, you know what,
we're going to be here in 100 years and we're designing everything for the long haul. And
the only way I think we can last that long is not by pandering to this sort of raging mob of the
moment, but by maintaining our values, maintaining our trustworthiness, being serious about trying to make
things better if we've got legitimate criticism.
And so, you know, other than the fact that, okay, we're just going to do our thing and we're
going to do it as well as we can, I don't know what else we can do.
I think you and Larry did build something beautiful that has endured.
I do wonder if it's going to be part of our future, because I feel some despair about where
we're all headed and some fear. And I guess you'll just say that I have to trust that it's all
going to end up okay. But I do worry that it might not. Yeah. I mean, there's so much right now
to worry about. And, you know, I can't dismiss all that. I can try and cheer you up a little
bit. But, you know, we just saw Donald Trump talking about the enemy within and suggesting the military should be in cities doing what, shooting people he doesn't like? It's unbelievable. On the other hand, I sort of think he's just blustering and being Donald Trump and all that. But you have to worry.
That didn't cheer me up, I've got to tell you, right there.
That's fair enough. Fair enough.
I've got to tell you, like, as a pep talk? Pretty low. Pretty low.
I think we're going to be all right, but it's, yeah, it's a rough time.
Okay.
What was the last page you read on Wikipedia and what were you trying to find out?
Oh, that's a good question.
Can I take a second to look?
Sure.
Show full history.
Search Wikipedia.
Now, I'm going to skip over list of Wikipedians by number of edits.
That's just me doing work.
I'm going to look.
Oh, I know.
This is fun.
Admiral Sir Hugo Pearson, who died in 1912, used to own my house in the countryside.
And I found this out, and there's a picture of him, which I found on eBay and ordered.
And I was trying to remember something.
There's nothing about my house in the article because he was there and then he moved away.
But I love it.
And I'm thinking of making a replacement for the AI voice assistant. I use Alexa; you know, people use OK Google or whatever.
I want to make my own and I want to have it be a character
and the character will be the ghost of Hugo Pearson.
Anyway, that's what I was researching.
I'm not sure I'll ever get around to it.
I'm like really busy promoting my book and things like that.
But when I get spare time, I dream of sort of being a geek
and I'm going to go home in February maybe and just work on playing in my house.
You are a geek in the best possible way.
Thank you so much for your time.
really appreciate it. Oh, thank you. Yeah, it's been great.
That's Jimmy Wales. His new book, "The Seven Rules of Trust: A Blueprint for Building Things That Last," comes out October 28th. To watch this interview and many others, you can subscribe to our YouTube channel at youtube.com/@theinterviewpodcast. This conversation was produced by
Wyatt Orme. It was edited by Annabelle Bacon, mixing by Efim Shapiro, original music by Dan Powell, Rowan Niemisto, and Marion Lozano. Photography by Philip Montgomery. Our senior booker is Priya Matthew, and Seth Kelly is our senior producer. Our executive producer is Allison Benedikt.
Video of this interview was produced by Paola Newdorf,
cinematography by Zebediah Smith and Daniel Bateman.
Audio by Sonia Herrero.
It was edited by Amy Marino.
Brooke Minters is the executive producer of podcast video.
Special thanks to Molly White, Rory Walsh, Renan Borelli,
Jeffrey Miranda, Nick Pittman, Maddie Masiello,
Jake Silverstein, Paula Szuchman, and Sam Dolnick.
I'm Lulu Garcia-Navarro, and this is The Interview from the New York Times.
