It Could Happen Here - Wikipedia and Climate Misinformation
Episode Date: December 22, 2021

We talk with Alex Stinson (@sadads), Wikimedia Foundation senior programme strategist, about how Wikipedia works and what we can do to increase climate awareness online.

WikiProject Climate Change: https://en.wikipedia.org/wiki/Wikipedia:WikiProject_Climate_change

Learn more about your ad choices at https://www.iheartpodcastnetwork.com. See omnystudio.com/listener for privacy information.
Transcript
You should probably keep your lights on for Nocturnal: Tales from the Shadows.
Join me, Danny Trejo, and step into the flames of fright.
An anthology podcast of modern-day horror stories inspired by the most terrifying legends and lore of Latin America.
Listen to Nocturnal on the iHeartRadio app, Apple Podcasts, or wherever
you get your podcasts.
Welcome everybody to It Could Happen Here, a podcast about, I don't know, how things
are kind of crumbling and how we can maybe put stuff back together.
And today I am excited to talk with a senior, let's see, what is the actual term?
Senior programme strategist.
A senior programme strategist at the Wikimedia Foundation, Alex Stinson.
Hello, greetings.
Hi, it's so good to be here.
I'm very excited about our talk today because, I mean, this should surprise nobody, but I used to be a Wikipedia editor back in the day. Not shocking at all, if you know me. But yeah, we're going to be talking about what Wikipedia itself is, and then also climate misinformation and disinformation, and how we can maybe create a better understanding of climate change and its effects across the world, and how digital information works.
Those are all kinds of topics we talk about often enough,
but never within the actual context of like Wikipedia as an entity.
So I guess let's, let's just start there with Wikipedia.
And for those who don't, maybe people use the website,
but they're not quite sure what it is,
how do you actually describe what Wikipedia is?
Because it is like an interesting kind of amorphous entity.
It's so many things.
I think most people are used to thinking about Wikipedia
as like the fact checking device.
Like, I have a bar argument with my friends, and I pull out my phone and throw this website at them, right? It's a lot of things. It's 300 language Wikipedias, actually. It's not just one.
Each of these has its own editorial community. Last I checked, it's like 60 million articles
across the languages. It's really a lot of different content. And a topic can be on each of those Wikipedias, right? And this is important as we start talking about disinformation: each Wikipedia is edited by people in that language and written by that language community, so each article is different and has different perspectives.
280,000 volunteers editing every month.
So this is a lot of people, right?
But the bulk of that's happening on English Wikipedia and some of the larger languages that are spoken across multiple cultural contexts.
And then there's also a lot of other content sitting behind Wikipedia.
So there's a media repository called Wikimedia Commons,
and there's a database called Wikidata,
which kind of powers those little knowledge graphs on the right side of Google
and a whole bunch of other parts of the internet.
Wikidata shows up in Amazon, Alexa, and all kinds of other places, right?
And so we're not just like one website.
It's many websites, lots of knowledge, lots of platforms, lots of context.
And we'll come back to that a bit more as we talk.
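(Editor's aside for readers: a Wikidata item is just structured data, multilingual labels plus "sitelinks" pointing to each language's Wikipedia article, which is how it can feed Google's knowledge panels and voice assistants. The sketch below parses a heavily simplified, made-up record; real items returned by the Wikidata API carry far more structure, and the field values shown here are illustrative only.)

```python
import json

# A heavily simplified, hypothetical Wikidata-style entity record.
# Real items (e.g. Q7942, commonly cited as the climate change item)
# have many more fields; this only mimics labels + sitelinks.
SAMPLE_ITEM = json.loads("""
{
  "id": "Q7942",
  "labels": {
    "en": {"language": "en", "value": "climate change"},
    "es": {"language": "es", "value": "cambio climático"}
  },
  "sitelinks": {
    "enwiki": {"site": "enwiki", "title": "Climate change"},
    "eswiki": {"site": "eswiki", "title": "Cambio climático"}
  }
}
""")

def label_in(item, lang, fallback="en"):
    """Return the item's label in `lang`, falling back to English."""
    labels = item.get("labels", {})
    chosen = labels.get(lang) or labels.get(fallback)
    return chosen["value"] if chosen else None

def language_coverage(item):
    """List which Wikipedias have an article linked to this item."""
    return sorted(item.get("sitelinks", {}).keys())

if __name__ == "__main__":
    print(label_in(SAMPLE_ITEM, "es"))     # cambio climático
    print(language_coverage(SAMPLE_ITEM))  # ['enwiki', 'eswiki']
```

The `sitelinks` map is also why coverage gaps are measurable: an item with only a couple of sitelinks is a topic most language communities haven't written up yet.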
Yeah, one really interesting part of it is, like, I don't know, my personal kind of social leanings, I generally like things that are more decentralized in general. Other hosts on the podcast are generally on the progressive left-libertarian spectrum. And one thing I do really appreciate about Wikipedia is, I don't think it's, like, open source exactly, but the way it has decentralized editing and all that kind of stuff is just a really interesting model of, like, what if a lot more stuff worked this way? And I'm not sure how much of a decentralization focus is consciously there among people at the foundation and the people who actually run it
behind the scenes and stuff.
Yeah, so Wikipedia grows out of the open-source movement and kind of the early days of the internet, right? This idea that, like, knowledge wants to be free, technology wants to be free, software wants to be free. Let's use the legal infrastructure to create freedom, right, in that sense. And then there's also the free as in anyone can edit, and then the free to do whatever you want out there in the world. People say free as in beer and free as in speech, right? And those things are always in tension, and they're kind of working. And as you can imagine, especially when you get outside of kind of multicultural internet spaces like English Wikipedia, it can get challenging. Like, if you're in Croatia and everyone is speaking Croatian, there's a very small bubble in which to create that Wikipedia, right? And so it's interesting in that sense.
I think there's also another part of Wikipedia that a lot of people don't see,
which is the movement behind it.
So there's the editorial communities,
people show up and make edits.
But because there's this ideology
that you're talking about,
this like decentralized,
like we need to share our knowledge
or culture or language on the internet.
There's also a whole social movement sitting behind the scenes. And there's a podcast recently, The Wikipedia Story, that kind of captured the essence of that. And it's a lot of people like myself. So I started editing in high school.
Yeah, me too.
Those, like, oh, I know how to click the edit button and figure out how to use the internet types. But for a lot of people, the intuitiveness of clicking an edit button on a piece of open-source software to create content is just not clear, right? And so you have to organize and invite people in. And so we have a whole movement that does that too. There are about 140 to 150 organizations around the world that organize events and work with libraries and museums and educational institutions. And so there's always this kind of interesting dynamic where our values, which is this open software platform stuff, are also lived in our practice and our outreach,
like creating change through society by sharing knowledge and education.
And so I think, yeah, it's an interesting dynamic.
Yeah, I think that does create a really oftentimes beautiful reflection.
It can have some dark sides every once in a while,
but it is really nice to have the ideology
driving it, being reflected in
the actions of operating it and spreading it
and that kind of thing.
This is something we briefly touched on already, but I'd like to move on to how climate change and broader social issues are covered on Wikipedia. Because, as you already mentioned, there's not one Wikipedia, there are many, based on different languages and places, it feels to me like whenever social issues get covered on Wikipedia, it's going to be in some part a local reflection of whatever is in that area. A white liberal writing articles in New York is going to be different from someone halfway across the world writing them in a much smaller country, say Belarus, under what I would call a dictatorship. So that's going to change the nature of what people are making because of that divide. So how does that kind of crop up?
And are there any solutions to that? Or, because of the decentralized thing, how much can we even impose? Like, I'm not in Belarus.
How much can I impose what I want their Wikipedia to look like?
Yeah.
There's kind of two or three dynamics you're touching on here. The first is, because there's kind of an attention bias, like, something comes up in the news and our Wikipedia community, people are usually editing the page within minutes of breaking news stories, working to improve it, right? So if things show up in, you know, the European or American press, it's very likely that something like English Wikipedia will pick up on it and immediately cover it. And because there are multiple perspectives in that press, usually the ideological, multi-sided thing kind of works itself out, because there are a lot of eyes and a lot of people who know how to edit there. Yeah, right.
But in a cultural, linguistic, geographic context where there's, like, one set of stories and there's not a lot of diversity, this happens. And I'm going to refer to Croatian Wikipedia, because we actually had an external researcher look at Croatian Wikipedia, because part of it had been captured by folks with kind of very ideological leanings, in a way that's excluding others.
And this is not good, right? It creates a very one-sided information environment, and it really reflects the kind of news dynamic going on there. So when breaking news happens, or when a topic, like a social issue (and not that climate change is a social issue, right? It's a global, life-threatening issue), when something becomes politicized, it's very easy, especially in smaller-language Wikipedias, for a few people to kind of swing the whole perspective on that.
So, yeah, there's this breaking news issue.
And this is where our kind of organized communities are really important.
So the example I want to point out of this working well is in medicine.
So our medical community, during the Ebola outbreaks a few years back in West Africa, was able to organize high-quality coverage of the medical content, both in English and in languages that were accessible for local communities, because it has an impact on people's lives. They recruited translators. They thought about, like, what's a simple way to communicate the story in that context, and what knowledge the workers or the advocates, whoever on the ground is working with that crisis, actually need, right? And you see other open technology movements do stuff like this. Humanitarian OpenStreetMap has a similar kind of way of organizing. They're like, hey, there's a crisis happening, let's pull people together from different parts of the world who have the right knowledge or skills and address the knowledge gap. So you can solve it, it's just complicated. And, you know, we've been trying
to address as a movement what we call the gender gap.
So there are both fewer women editors and less women's content on many of the wikis.
And like it's taken years and it's very hard to organize.
And even when there is investment in it, it's challenging to make substantial progress, because there might be contextual issues around it too.
And so you can't just drop in on a Central Asian language
with a Western perspective
and expect to change the culture of the wiki overnight.
You have to engage with it consistently
and be persistent and work on it over and over and over again.
We are going to take a short break to hear a message from our lovely,
lovely advertisers, unless it's ExxonMobil again, but we will be back shortly.
Welcome. I'm Danny Trejo. Won't you join me at the fire and dare enter
Nocturnal: Tales from the Shadows, presented by iHeart and Sonoro.
An anthology of modern day horror stories inspired by the legends of Latin America.
From ghastly encounters with shapeshifters
to bone-chilling brushes with supernatural creatures.
I know you.
Take a trip and experience the horrors that have haunted Latin America since the beginning of time.
Listen to Nocturnal Tales
from the Shadows
as part of my Cultura podcast
network, available
on the iHeartRadio app,
Apple Podcasts,
or wherever you get your podcasts.
Okay, and we're back.
One thing that we cover decently often, part of my job and Robert Evans's job, is disinformation and misinformation and how this type of stuff spreads online, usually kind of linked to political extremism or conspiracy theories, or, you know, in that general kind of bubble. And so, what type of climate misinformation has really been festering on various Wikipedias across the world? Because we've just been talking about these topics and how and why it happens. What are the main types of misinformation or disinformation that are most prevalent?
Yeah, so the first is just kind of neglect
of content that's happening
across the various things related to climate.
But we've identified on English Wikipedia
over 3,700 articles
that are directly related to climate change. We don't have a very
big editorial community in English on that topic that's like fluent in the science and fluent in
the other stuff. And then you go out to the other languages and like some of the languages have like
3,000 of them. Some of them have like 200, right? And so there is both, and some of that content
was like translated several years ago, right? Or five or 10 years ago. And like the climate
rhetoric has really changed. It's changed a lot. And like numbers and statistics, all that stuff
gets updated every year. And it's, yeah, that is, there's a lot to keep up with. And like reading the IPCC report
or looking at any of the consensus science,
there's a lot of change. You have to be fluent in, like, science communication.
You have to understand like where to look for the information.
And it's interesting.
My partner is a Spanish language speaker
and she was in a kind of workshop for journalists in Argentina
for climate communication. And the workshop was like, oh, you should cite The Guardian, right?
So even just to kind of understand this climate stuff. So in a lot of these local-language contexts, there aren't even good sources. And the sources they do have are often citing the dominant narrative that's going on in the Anglophone news cycle, right? Because there's not a lot of climate communication going on. And so there's just a lot of complexity involved in updating that much content all the time. And so the bulk of the stuff that kind of creeps in is this neglect, right?
It's like some dominant idea in the narrative just hasn't been updated.
And like, we need someone to update it.
And that's like an organizing problem, right?
That's like, we need more people who are science literate, who speak the local language,
who understand how to edit Wikipedia.
And that's trainable.
Like we can do that.
Yeah.
The reason that matters, the neglect
matters is it stops people from making decisions about climate change because they don't have like
an accurate sense of what we need to do. Right. Which is cut the fossil fuels, increase resilience,
do adaptation, like actual political change. Yeah. Right. And so that's just, it's a problem.
The other stuff's a bit more,
it's a little bit more complicated.
One of the things that happens is,
as you know,
there's quite a manipulation of narrative
that has happened around climate change.
There's this really great podcast by
Amy Westervelt about how the fossil fuel industry got its message into schools in the last 30 years
in the US. And that narrative is just so prevalent. And so one of the things about Wikipedia is that
we try to do a balance of positions. If there are reputable sources
kind of describing or analyzing a topic, and this is back to your polarization question too,
if there are reputable sources describing a topic, we try to give them equal weight
and balance across the article. The problem with climate is that some of the narratives that look like
reputable sources are just pumped out of fossil fuel industry funded think tanks, right? And these
things are not truthful narratives, right? And so the BBC ran an article two weeks ago on kind of climate denial in some of these smaller-language Wikipedias. And what they found was a lot of these narratives being given equal weight with the climate science. And I took a look. After that BBC article came out, our community started looking across all the language Wikipedias, at just the main climate change page, and they found 31 Wikipedias that had some of that equal weighting of bad climate science.
Interesting, yeah.
And, you know, the BBC article only found like five or ten, right? We found a lot more.
Yeah. And so these narratives really just seep in. And, you know, again, I'm going to go back to the Croatian example. If your media environment has been locked down by a certain political rhetoric, those narratives might have traveled from, like, the Anglosphere into these other spaces and then gotten stuck, right? And it just keeps getting recycled. And so that causes delay.
And I was listening to your podcast recently about soft climate denial. Like this is what's
happening in other language environments, right? People are rehearsing this misinformation. It seems like a valid position because it's been rehearsed so many times by folks. Some people who are championing that position are doing so unknowingly. And in the process, we're kind of disconnecting entirely
from the source of the information.
And that is just, it's really bad.
One interesting thing that I thought of
when you were bringing up sourcing,
how sourcing itself can be an issue.
In the States, there's kind of a joke about when people use just Wikipedia as, like, a source, where they just link the article. But that is the default for so many people when they begin a research project: okay, what does Wikipedia have on it, what are the sources Wikipedia uses, and then kind of branch off from there. It's a very common thing. So I'm not sure how different internet culture will be in other countries,
but if they don't have the base sourcing necessary to create a decent homepage article, then just sourcing from Wikipedia in the first place becomes so much harder. Because you were saying 'just use The Guardian' is one of the things. That's not horrible advice, but if it's only from one thing, then that's going to change the entire nature of coverage and information on specific topics. That's just a really interesting thing that I never thought of before: how different countries' or languages' Wikipedias will have different sources, so getting information from the page is just going to be so different, and, like, the whole tiering of sourcing is just completely changed.
Yeah. And I think, like, you know, in medicine, most medical practitioners expect most of the medical
literature to be in a handful of
languages like English and Chinese and that kind of stuff, right? And like part of your professional
work and part of like saving people's lives is being able to use those sources. And so if a
medical Wikipedia article has a translation from like an English article into another language,
and you're distributing that to medical practitioners
and they find the citation and it's in English and they can go follow the source.
Like that's not such a big deal.
But in a topic like climate, where the vast majority of the people that have to make decisions
on this information do not have access to other languages, maybe their access to English
is through like machine translation.
Yeah.
Google or something like that.
Like, not having sources in your local language, or just having the sources that were translated from an English Wikipedia article, which happens a lot on these smaller-language Wikipedias, is kind of not helpful for climate decision-making.
And this is where it's easy, for example, in a lot of these Eastern European or Central Asian languages, for, like, a politically spun news site opinion about something to kind of creep in at the same level of validity as scientific research, as the, you know, consensus understanding of the climate crisis.
So, I know we talked about trainings for journalists and people to start editing Wikipedias in their language, but how do we kind of improve climate communication overall, with open access to information and, you know, creating more linguistic diversity and stuff?
Yeah, well, I think there are a couple of opportunities in this, and then there's some other misinformation I also want to talk about too. But I think the sourcing one is a particularly challenging one.
We need more basic, science-based climate communication in more languages. And I'm not saying just the big UN languages, or the ones that are kind of colonial cross-cultural languages, like Spanish or French or Arabic, you know, all these languages that have been used across cultures. We also need it in local languages, and we need it to be evidence-based, and we need it to be audience-based, right? So if someone is searching online in Swahili about how, like, drought is happening in Kenya, right, or Tanzania, or, you know, they're suddenly flooding, or 'I need to deal with X, Y, and Z adaptation to the climate crisis,' which is, by the way, what all of the Global South is doing right now.
Right. Like the global south is having to adapt to this crisis that polluting
countries have made.
Yeah. And we're not actually giving them the resources for the problem that we've caused.
Yeah, well, it's not even just giving the research. The people who are like, 'we want to adapt our society,' we're not resourcing the folks on the ground who have the agency, who have the understanding, who know how to do the research in their context, who know how to do the communication in their context, right? We're not even bolstering their request for help. Like, the UN climate conference kind of failed on this adaptation funding, right? Yeah.
And this is, you know, this is where a platform like Wikipedia comes in, and kind of approaching this from a knowledge activist perspective, where you're like, there are people who need this knowledge to understand what's happening around them so they can make decisions. We need these kinds of information. We need open-source knowledge, not just Wikipedia, but it's one of the platforms. And, you know, you all do open-source investigation, and you're used to, like, open-source software communities, and I listened to a couple of your podcasts, and you're kind of constantly speaking back to those open communities that come out of, like, Anglophone software spaces.
Yeah.
And, like, we need to acknowledge that we figured out how to do open knowledge, but we haven't given everyone those tools. We haven't transferred the knowledge on how to do it. We haven't adapted those tools to other parts of the world and other languages. And so just starting to look for these other communities, asking, like, who's ready to organize, giving them money to go do it, right? These things are really practical. And I think we're often not listening, or we're not looking for that solution. And a reminder: most of the people having to adapt are in the Global South and speak other languages. We need to be there, in that language, if we want the climate crisis to, like, resolve without, you know, destroying people's lives.
Yeah, absolutely.
Yeah, that's the thing we try to bring up: the people who are going to be worst affected initially are the people who are already kind of not in the greatest situation in the first place. Like, the areas that are going to experience the most flooding, the most extreme weather events, all this kind of stuff, it's not starting with somewhere like New York City. It's starting with areas that are already dealing with a lot of local issues, and now this is just something else on top. And fixing all of that is, I mean, fixing all of it is impossible. We can only take small adaptive steps to mitigate some of the worst effects. And yeah, that's stuff that comes up a bunch. But you mentioned you wanted to at least briefly mention some other forms of disinformation.
Yeah. So we've also witnessed a couple of times where something will hit breaking news or become a political position in a context, and then we will see bad actors show up on Wikipedia and try to manipulate it. I have two examples of this. The first is, about a year ago, we found a group of accounts editing about some of the inter-Amazonian highways that the Bolsonaro presidency is building through the Amazon, where they were trying to remove the environmental and indigenous peoples' impact assessments from the Wikipedia articles. And so, like, basic human rights stuff, basic, you know, healthy-environment things that the government is expected to follow through on, were being manipulated out of the articles for a more pro-economic-growth narrative. And so, you know, the shift towards this very extreme-right, economic-growth-only version of reality does play out on the wiki. Now, we were lucky that this was fairly easy to see once we found it, but we had to coordinate across English, Spanish, and Portuguese to address the problem. So we need multilingual communities who are kind of coordinating and talking to each other
to address that. The other thing we've seen is, so, I don't know how well you follow the climate movement, but did you see when Disha Ravi got arrested in India, by chance?
I don't think so.
So she's a youth climate activist who was part of Fridays for Future India, which is kind of a sister group of the group that formed in Europe around Greta Thunberg, right? And her Gmail account got attached to a Google Doc, she was just seen active on a Google Doc, that was about sharing social media about the farmers' protests in India, which have been a real political sticking-point issue. And I had written, so I'm both a volunteer and a professional who organizes the community, and in my volunteer time I'd written the biography of Disha Ravi, like, months before the Indian government kind of identified her with this social media toolkit. And when she got
arrested for something that's, like, just a basic social organizing tactic with social media, the kind of Hindu nationalist social media environment zoomed in on her Wikipedia article and on all these other social media presences she had. And they tried to silence it, be like, okay, we need to delete this article. And fortunately, a group of us were watching the page, and we caught it and were able to stop that. But there's the kind of flash-mob situation that happens a lot now in social media, where it's like, oh, this thing has been polarized, now we need to go attack it. And so you can imagine, like, English Wikipedia has a healthy immune system for this kind of stuff. It sees it. It has enough people that it can do that. Yeah.
Yeah.
But you can imagine on a smaller wiki that the narrative
could shift and stay permanently shifted quite quickly. Yeah. If that happened. And so that's
another concern, right? So there's like the subtle, like a few accounts just like quietly
removing things and then like the active political kind of intervention that happens.
In terms of, like, disinformation, do you see Wikipedia as being susceptible to intentional disinformation campaigns, of people slowly editing the ideology of articles to push some agenda, whether individually, or more of a crowd operation, or even run by people with political power? Like, how much of a risk do you see, with this open-source idea, of intentional slow dissemination of disinformation on important articles and stuff?
Well, so I think I might reframe your question a little bit. Like, all open-source knowledge spaces are susceptible to that, right? The question is to what degree, and how harmful is it going to be, right? Yeah. Like, is it very open to this, and will it cause a lot of problems?
The bigger language Wikipedias have healthy immune systems.
We have a combination of kind of AI-driven bots that flag bad edits.
And then we have a lot of community
patrolling happening.
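(Editor's aside: the anti-vandalism bots mentioned here are, in reality, trained machine-learning classifiers such as Wikimedia's ORES service. The sketch below is only a made-up, rule-based stand-in, showing the kind of signals, large deletions, shouting, brand-new accounts, that patrol tooling tends to score.)

```python
# Illustrative only: a hypothetical rule-based edit-patrol heuristic,
# not Wikipedia's actual tooling (that uses trained ML models).

def score_edit(old_text: str, new_text: str, editor_is_new: bool) -> int:
    """Return a rough suspicion score; higher means more worth reviewing."""
    score = 0
    removed = len(old_text) - len(new_text)
    if old_text and removed > 0.5 * len(old_text):
        score += 2  # large unexplained removal of content
    if new_text and new_text.isupper():
        score += 2  # ALL-CAPS replacement text
    if editor_is_new:
        score += 1  # edits by brand-new accounts get extra scrutiny
    return score

def needs_review(old_text: str, new_text: str, editor_is_new: bool,
                 threshold: int = 2) -> bool:
    """Flag the edit for a human patroller once the score crosses a threshold."""
    return score_edit(old_text, new_text, editor_is_new) >= threshold

if __name__ == "__main__":
    article = "Climate change is driven primarily by greenhouse gas emissions."
    print(needs_review(article, "", True))                              # True
    print(needs_review(article, article + " Seas are rising.", False))  # False
```

The real systems flag edits the same basic way, score then queue for human review, which is why the "immune system" still depends on having enough active patrollers in each language.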
And even in some of the smaller communities
that have like medium-sized editor communities
like Swedish Wikipedia,
it doesn't take a lot
for that local language community
to patrol the pages
and be like, oh, okay, these changes are kind of weird. I can roll it back. This doesn't seem like
it fits our culture of Wikipedia. The problem is when a language Wikipedia has very few editors and they're not active all the time. And so this is where we need kind of more
eyes on the content, right? Because it's very easy for like a really small language community
to kind of have a little bit of content, but never see it maintained. And this is where our communities are forming around these languages, a lot of the West African languages, for example, that our communities are kind of organizing in.
And we invest in those communities existing and figuring out the governance and training people how to edit and getting access to the kind of technical skills to do this.
And we have systems that we're hoping, over the next few years, will invest in that resilience, right? Like building a code of conduct, making it easier
for communities to see this kind of stuff. But it is 300 languages, right? Yeah. And it is a
volunteer built system. And you do need a healthy editorial community in order to keep
a wiki from, like, drifting too much. Yeah. So a good example of this, and I keep referencing Croatian because it's the one we've done research on: it was possible for a few people to push people who are more in consensus with the global position on various topics out of the wiki. And that's just, we have to find a balance, and this is my personal opinion, right, between kind of local-language sovereignty on this stuff and also not, like, radicalizing a topical environment.
And we see this particularly on impactful topics, right?
Like ones that directly affect like politics or in the case of climate crisis, like people's livelihoods and ability to function in society, right?
And we just, we need to be cautious about that. But, you know, Wikipedia is a common resource, and I think this is really important. The way Wikipedia works is, you know, the Wikimedia Foundation provides the servers, we fund our communities, we support them, we help them work through governance issues, but we need editorial communities to maintain it. That's what those 280,000 people are doing as volunteers: they're building an editorial practice that makes the content work. And we need that. And so we need, you know, like-minded communities, like the people listening to your podcast, who are like, oh, we need the internet to be reliable and have accurate information on it, to show up. Because if we don't do that, it's really, it's the common resource.
Welcome. I'm Danny Trejo.
Won't you join me at the fire and dare enter
Nocturnal Tales from the Shadows, presented by iHeart and Sonoro.
An anthology of modern day horror stories inspired by the legends of Latin America.
From ghastly encounters with shapeshifters
to bone-chilling brushes with supernatural creatures.
I know you.
Take a trip and experience the horrors
that have haunted Latin America since the beginning of time.
Listen to Nocturnal Tales from the Shadows
as part of my Cultura podcast network,
available on the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts.
We have a decent international listener base as well.
And I'm thinking, like, would you recommend people, you know, in different countries,
or even people inside, like, the States, Canada, the UK,
who are multilingual, would you at least encourage them to browse other-language Wikipedias and maybe start making edits when they see this type of misinformation
popping up?
Yeah.
So two kind of perspectives on this.
One, look for a local organized community.
So we have what's called Wikimedia affiliates.
These are about 130 to 150 organizations around the world.
They regularly run events, especially now that we're coming out of COVID, increasingly in-person
events. They train folks. Look for them in your context. And if you need help finding them,
you know, find me on Twitter and I can connect you with those communities. And the other part
is small edits. So I think a lot of people look
at Wikipedia and they think about like a traditional publishing platform, right? Like,
oh, you know, I have to write the whole article. Yeah, I have to be a master. And the secret sauce
to all of this is like most people start with one citation, one comma, one typo fix. And they do
a handful of those a month.
And then they keep coming back.
And as you do those small edits,
you start reading the content more carefully
and fixing the things you can fix.
And so I recommend going in to, like, add one citation.
Like if you go and add one citation today,
that makes life better,
or you fix the phrasing of a sentence.
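As a concrete illustration (not from the episode), here's roughly what "adding one citation" looks like in wiki markup: a `<ref>` tag wrapping a `{{cite web}}` template. This is a sketch in Python; the article sentence, URL, and source details are made up.

```python
# Hypothetical example of a "small edit": appending a wikitext citation
# to an unsourced sentence. Wikipedia citations are <ref> tags wrapping
# a citation template such as {{cite web}}.

def make_citation(url: str, title: str, publisher: str, date: str) -> str:
    """Build a wikitext <ref> citation using the {{cite web}} template."""
    return (
        "<ref>{{cite web"
        f" |url={url}"
        f" |title={title}"
        f" |publisher={publisher}"
        f" |date={date}"
        "}}</ref>"
    )

# An unsourced sentence, and the same sentence with a citation appended:
sentence = "The park came under scrutiny for a renovation that removed old trees."
cited = sentence + make_citation(
    url="https://example.org/park-renovation",  # placeholder source
    title="Park renovation draws protests",
    publisher="Example News",
    date="2021-10-01",
)
print(cited)
```

In practice you'd paste the `<ref>…</ref>` part directly after the sentence in the article's edit window; the wiki software renders it as a numbered footnote automatically.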
The other part of it is, you know, I said there's these organized groups. For the climate in particular,
I run this campaign called Wiki for Human Rights, which is focused on a theme that we identified with UN Human Rights: the right to a healthy environment,
which is this new human right that has been acknowledged by the Human Rights Council. And we're organizing writing contests and
edit-a-thons and trainings for communities to go and look for the human dimension of the
climate crisis. So I think when we think about climate communication, a lot of people are like,
science, right? They're like, oh, this is, you know, about how weather systems work and how the atmosphere forms, that kind of stuff. And the
content that's more impactful is this human-inflected stuff. Like how does the climate crisis
affect you as an individual, and agriculture, and the cities you live in, in the clothing you buy,
in the manufactured goods, in the mine around the corner that's producing water pollution
that's going to harm your children for the next 30 years, right? And that is the kind of stuff
that we're encouraging communities to pay attention to. It is more the justice and human rights oriented perspective
on these topics.
And your cat is very cute.
Yeah, every once in a while they love to take the camera. So, yeah, if you follow me on Twitter, I can hook you up with that campaign as well.
Yeah. Where can people find you online to learn more information about, you know, the various
kind of topics we've discussed today?
So if you're interested in climate change stuff on Wikipedia, English Wikipedia
has a wonderful WikiProject Climate Change. If you search "WikiProject Climate Change" on Google, you'll find there's a tab at the
top that says "get started with easy edits," and that can get you oriented to
where you can affect English Wikipedia on this.
And, you know, once you find a gap on English, it's easy to find it on other languages.
For learning about Wiki for Human Rights, you can
search for that, and/or follow me on Twitter: S-A-D-A-D-S, SADADS. We also have a
group called Wikimedians for Sustainable Development, who are communicating on
Twitter; that's the group that's really focused on sustainability topics more generally. And, you know, the other way to look is: find something you've been reading about
in the news, about the climate crisis or sustainability issues, look it up on Wikipedia,
see if it's missing. If it's not there, click the edit button, add a sentence, right?
A good example of this: I learned about a park in the center of Nairobi, Uhuru Park, that's being protested over by environmental activists because some of the big trees were
being cut down. This came by on my Twitter feed; like, I'm not connected to this
at the moment, right? But because I had news sources,
I had three or four news sources, I could say really simply: in 2021, the park came under
scrutiny for a renovation that included removing old trees. That's a climate action. Yeah. Right.
And I think, you know, I am constantly overwhelmed by the climate crisis.
As are a lot of people. Yeah.
And just being able to tell that little story, like, hey, the decisions people are making are not productive here.
Just gathering that story is important.
And what's important is Wikipedia plays institutional memory on this, right? You know, a lot of activist work is very temporal. It's very
in-that-moment, right? And if it doesn't get documented on Wikipedia, the local news sources
are going to get lost in the winds of time. Yeah, totally. Right. And so I think, you know, to
do your little activist motion, like, a sentence describing what happened in a moment where resistance was happening is a huge step forward.
Right. Because it connects the environmental crisis, climate crisis, human rights issues to like daily lives.
Like people look up this park probably on Google because they want to go there. Right. Or they read about it because people are like, when was it created?
What was that protest that happened there the other day?
And if the source isn't there, then it doesn't really exist in their minds.
Yeah, it doesn't exist in their minds. And I think that's one of the big issues with the climate crisis, and, you know, it's amplified even worse in other languages,
right? People aren't making that connection. They aren't seeing it around them, and they're not, you
know, connecting action to how we address it.
That is a really good point. And yeah, I mean, I will encourage everybody to start making small edits. That's what I did for a long time before I moved into open source journalism and reporting. It's a great way to get started, and it's a great way to start disseminating small bits of information, because the only thing that we can really do as people
is take small steps. We can have, like, an adaptive goal in mind, but you need to take small steps to get there. And that is a really great way to start influencing the way people think about climate
and our situation. Yeah. And I think too, you know, your podcast kind of appeals to folks who
are interested in, like, finding the truth and reality, right? And that investigation is what a Wikipedia article is.
It is, like, one, ten, a hundred editors out there in the world trying to go, what
the heck is this topic about?
Right.
How do I compile my notes in a way that helps other people?
And I think in the face of the climate crisis, Dr.
Ayana Elizabeth Johnson says: find the thing you're good at, find the thing you're passionate about,
find the thing that makes you feel good and is rewarding, and find the thing that actually
helps affect the climate crisis, right? And a small edit on Wikipedia meets your knowledge needs. It's very satisfying
because people will read it.
And it is
incremental change in the right direction, right?
People will make decisions on it.
Yeah.
I mean, I guess
I think that probably closes
this up today, unless you have anything else to
add. I guess one
more plug for your Twitter, so
we can get more eyeballs on you and the work that you're doing.
Yeah. So, at S-A-D-A-D-S. It's my long-term handle on the internet, and you can find me all over the place,
and I tweet about Wikipedia and the climate crisis.
Well, and we'll link the WikiProject Climate Change page in the description for people to find.
Thank you so much for taking the time to talk to us all about these topics. I'm really,
really grateful to have this type of knowledge readily accessible to more people. Also, you know, in the spirit of Wikipedia.
Thank you.
Thank you so much.
You can follow us by subscribing to the feed
and on Twitter and Instagram at HappenHerePod and CoolZoneMedia.
See you on the other side, everybody.
It Could Happen Here is a production of Cool Zone Media.
For more podcasts from Cool Zone Media, visit our website, coolzonemedia.com,
or check us out on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts.
You can find sources for It Could Happen Here updated monthly at coolzonemedia.com.
Thanks for listening.
