Fresh Air - Best Of: 'Origin' Dir. Ava DuVernay / How Algorithms 'Flatten' Culture
Episode Date: January 20, 2024

Ava DuVernay's new film Origin explores a new way to consider the historical subjugation of Black people in America: as the adverse result of a caste system. The film is inspired by Isabel Wilkerson's... book Caste: The Origins of Our Discontents. In the movie, Wilkerson embarks on a journey to learn about caste, traveling to Germany and India to get to the root of the Black experience in America.

Also, we'll talk about how algorithms flatten culture with journalist Kyle Chayka. He says algorithms affect every aspect of our lives, from what we watch on Netflix, to what songs are at the top of the charts, to what our local coffee shop looks like. His book is Filterworld.

Learn more about sponsor message choices: podcastchoices.com/adchoices
NPR Privacy Policy
Transcript
From WHYY in Philadelphia, I'm Tonya Mosley with Fresh Air Weekend.
Today, director Ava DuVernay. Her new film Origin explores a new way to consider the historical subjugation of Black people in America: as the adverse result of a caste system.
The film is inspired by Isabel Wilkerson's book Caste: The Origins of Our Discontents. In the movie, Wilkerson embarks on a journey to learn
about caste, traveling to Germany and India to get to the root of the Black experience in America.
Also, we'll talk about how algorithms flatten culture with journalist Kyle Chayka. He says
algorithms affect every aspect of our lives, from what we choose to watch on Netflix, to the songs
that are at the top of the charts, to what our local coffee shop looks like. I could reliably land in any city in the world
and easily find my way to a coffee shop with cappuccinos with nice latte art and
minimalist reclaimed wood furniture. That's coming up on Fresh Air Weekend.
This is Fresh Air Weekend. I'm Tonya Mosley. When my guest Ava DuVernay first read
Isabel Wilkerson's book, Caste: The Origins of Our Discontents, she was so stunned she re-read it two
more times. The best-selling book draws a line between India's caste system, the hierarchies of
Nazi Germany, and the historic subjugation of Black people in the United States.
The book is academic in nature, 496 pages filled with facts and historical notes.
People told DuVernay, an acclaimed filmmaker, that it was too complex a story to adapt into a film.
But she did it anyway, writing and directing Origin.
In the film, which is opening in theaters this week, DuVernay makes Wilkerson, played by
Aunjanue Ellis-Taylor, the center of her own story as she explores how understanding the caste system
can deepen our understanding of what Black people experience in America. In this scene from the
movie I'm about to play, Blair Underwood plays a persistent editor who asks Ellis-Taylor to write
about the recent death of Trayvon Martin,
a tragedy that is impacting the nation at the time. The editor had recently given her the 911
calls of the shooting and now asks her if she's listened to them yet.
Listen.
Yeah. Yeah. It's a lot.
Yeah?
It's a lot. There's a lot there.
But longer form stuff, questions that I don't have the answer to.
So ask them in a piece.
I don't write questions. I write answers.
Questions like what?
Like, why does a Latino man deputize himself to stalk a black boy to protect an all-white community?
What is that?
The racist bias I want you to explore. Excavate for the readers.
We call everything racism. What does it even mean anymore? It's the default.
When did that happen?
Brett, where are you going, man?
So wait, so you're saying that he isn't a racist?
No, I'm not saying that he's not a racist.
I'm questioning why is everything racist?
That was a scene from the new movie Origin,
directed by today's guest, Ava DuVernay.
The question, what does racism even mean,
sets Wilkerson on a path of global investigation and discovery.
Ava DuVernay is an Academy Award nominee
and winner of several awards, including
an Emmy, BAFTA, Sundance, and Peabody Award. Her feature film directorial work includes
the historical drama Selma about the life of Martin Luther King Jr., the criminal justice
documentary 13th, and Disney's A Wrinkle in Time. She also directed the Emmy-nominated Netflix drama series When They See Us, based on the 1989 Central Park jogger case.
Her 2016 documentary 13th explores the prison industrial complex and won a Peabody and was nominated for Best Documentary Feature at the Oscars.
Ava DuVernay, welcome back to Fresh Air.
I'm happy to be here with you. Thank you for having me.
Yeah, I'm happy to have you.
Okay, so I want to put myself in the place where you were when you read Caste three times.
So were you trying to understand how to adapt it for film, or were you trying to wrap your head around this idea of caste?
I think I was just trying to survive the pandemic. The book came out about two months after the murder of George Floyd, in the midst of a pandemic where I had recently lost someone I loved. And I
was in my house. I wasn't working. I was not filming. I wasn't doing the things that I usually do. My company was shuttered, as was every other place.
And the book had sat on my nightstand for a while.
And, you know, one of those pandemic days, you pick it up and you start to explore.
And it drew me in, but I didn't really understand it.
It's a dense book.
And when I finally got through it, I wasn't satisfied with my retention of the theory, my real
integration of it into my understanding. I felt that I had a surface understanding. And so I read
it again, just to satisfy myself, because it was being talked about a lot in social circles,
you know, online and amongst friends. And so when I finally read it a couple
more times, I started to feel a story emerge. And the story really centered on the journey to tell
the story. And it became a film about a woman in pursuit of an idea. What was it about dramatizing it in film to approximate truth, versus a documentary? I think I've heard you say that
people sometimes, some people need dramatization to hold on to the humanity of a person or a
storyline. Maybe news doesn't do it or maybe journalism or maybe even documentary doesn't
always do it. Well, I mean, that's just me being a little biased because I'm in love with film.
You know what I mean? I'm in love with film.
And so for me, that's the kind of top format.
You know, the art form that gets inside my bloodstream and really, really helps me orient myself to the world and organize my thoughts is cinema.
And so when I read something or I hear a story or there's something that I want to say, I go to that method, that way, that sharing.
It's the image.
Wilkerson found the word racism insufficient to capture the rigid social hierarchy of the Jim Crow South in particular.
And she found how Nazis were influenced and inspired by American racism. Had that been an idea that you had sat with or you knew before you read the book?
No, did not know it.
So I'm an African-American studies major, English major, UCLA.
That's crazy.
Read quite a bit.
Right.
Had not come across that bit of information that Nazis had been influenced by the blueprint of American
South segregation policies, that actually they had sent scholars and people to study it,
to bring it back. So when I read it in her book, it was fascinating to me, but I had to go
look at that stuff myself and read it myself. It's not widely known.
And so there's certainly scholarship out there other than Isabel Wilkerson's that shares that information, but none that I'd ever heard of.
And so when I'm sitting there and I'm reading the actual notes, the actual transcriptions, the actual letters, it's astounding.
It's very matter-of-fact. And in some spaces, the Germans are shocked and surprised and appalled by some of the things that were done in America and said, that's taking it a little too far.
See, now that's crazy. Let's do it this way.
Yeah, let's do it this way instead.
I don't know if we can get away with that here, but we can do this, this, and this.
It's really shocking.
But certainly that's a part of the book.
And this is basically what I did: all of the parts in the book where my jaw dropped, I put in the movie. Yeah.
I think the challenge that many Americans have in particular about this notion of caste as it
relates to Black Americans, unlike the Dalits in India, is that with Dalits, they can never surpass their lot in life.
But Black Americans, some would argue, you and I sitting here right now having a talk shows that
we can actually move past that. The other side of it is that we might be the exception
and not the rule. Well, it's challenging for me, but having read the book many times,
studied the book, made the film about the book, my understanding of it is this. While you and I
may be sitting here and we might be successful in our careers, what it has taken for us to be
in these spaces is a different trajectory than what our white male counterparts have gone through to be in their spaces.
In addition to that, outside of this space, when we're walking down the street,
when we're in the department store, when we're in various spaces where our scholarship or careers or intellect is unknown
and we are seen only by our outward facing traits, it doesn't matter.
And we are not on the same footing.
And that's the way the society functions.
And so that's part of what her book, I believe, asked me as a reader to think about is to really drill down into it and not allow ideas
about it to kind of sit inside of sound bites and easy questions. But this is really insidious stuff
that affects us all. And it's an invitation to address it, explore it, think about it.
Side note, you know, I think this is the first film in which I've ever witnessed a protagonist who is a Black woman intellectual.
I know.
It's one of the few.
There's a film by an incredible filmmaker who's no longer with us named Kathleen Collins.
She was a filmmaker who came to her height in the 70s, I think maybe early 80s.
She did a film called Losing Ground,
and it's about a woman academic. This is a film that sadly very few people have seen,
but they exist. But I mean, when I can count them on less than one hand, we're talking
about a real subgenre of films that we see in the Hollywood industrial complex, right?
Man thinking, taking on the big subject,
tackling an intellectual concept,
traveling the world to figure it out.
No one believes him, but he knows.
He is an intellectual warrior.
We can name 10 of those.
There are a couple of big ones this year even
that follow that trajectory.
But put a woman in that place and tell me how many you think of where the main action is a woman thinking, grappling with big ideas.
That is what it's about.
And now add a black woman to that.
The list gets—
Smaller and smaller.
Sadly, much smaller, yeah.
Our guest today is award-winning filmmaker Ava DuVernay.
We're talking about her new film, Origin. We'll be back after a break.
I'm Tonya Mosley, and this is Fresh Air Weekend.
Let's get back to my interview with Ava DuVernay
about her new film, Origin, based on the best-selling book Caste: The Origins of Our Discontents by Pulitzer Prize-winning author Isabel Wilkerson. DuVernay has written and directed several films and documentaries
including Middle of Nowhere, I Will Follow, Selma, Disney's A Wrinkle in Time, and the Netflix drama
limited series When They See Us. Her new film is Origin. So this movie starts with an opening scene of a boy depicted as Trayvon Martin.
Trayvon was shot and killed by George Zimmerman while walking home from a convenience store in Florida in 2012.
How did you make the decision to open the movie that way? Well, Isabel Wilkerson told me that the verdict around Trayvon Martin's murder was
the impetus for her to start thinking about some of these ideas in a concrete way.
I remember when she was sharing that with me, I thought, oh, wow, could it open on that? Could
the spark that sparked her spark the film? And really trying to stay close to and honor her process, her life,
her genius.
You know, I wanted to start where she started.
Also, what Trayvon represents in the greater story. Because what Isabel Wilkerson, the character in the movie, represents, what she's saying is: this is another Black boy who was killed, and what does it mean? Like, we're just going to tack the word racism to describe what has happened.
I need to figure out what this means at the root cause of it.
Yeah.
Right. There are experiences that we apply labels to that are outdated, that are not robust enough to hold
their meaning and their import. And I think that's one of the reasons why I was attracted to
the book. There are spaces in the book that I don't, parts in the book that I don't necessarily
agree with, but I love thinking about it. You know what I mean? All kinds of things. Like, you know, it took me a really long time to wrap my mind around the idea that there's something
underneath racism that's called caste. Or, you know, I know a lot of people kind of grapple with
the book because they think that the premise is that's not race, it's caste. But that's not what she's saying in my interpretation. She says, that's not race only.
It is also caste.
And unless you can dig down and understand the multiple levels of what this experience is made up of,
you can't solve for it.
Caste is underneath all of the isms, all of the ways in which we disregard one another,
we organize ourselves, the hierarchies in our societies and in our cultures, it's underneath it.
So it doesn't mean racism doesn't exist.
It means the foundation, the root, the origin underneath is the very simple premise.
Someone has to be better than someone else.
Now let's organize why.
Pick a reason.
He's taller.
She's white. This person's a man. This person comes from this part of the world. This person's body works this way. Whatever it is, someone is better than another, and we organize ourselves as a society in terms of power and justice and fairness around that random set of traits.
Were there parts of your everyday life that you looked at differently
after you understood this concept?
In the day-to-day hierarchies, you know, that are just part of our everyday lives.
I did. I did.
I think I would often, just in my own life, think, you know, that was racist, right?
What has happened or what was said or what.
And to be able to dig a little further, it helped me make it less personal because race is really personal to me.
You know, I mean, I've grown up feeling like, you know, I walk through the world
with a lens.
My primary lens is race.
And I would say my primary lens
is race even more than gender.
You know, certainly
when I think about caste with that,
it animates my thoughts
about the way that I am moving through the world, being treated,
being regarded in a different way, because beyond just seeing me as black, there's something else
at work there in regards to where Blackness lies in its value system, in its feeling of safety, and its feeling of worthiness in any particular situation, that connects me to someone else who may be dealt with as lesser than or dangerous.
I'm doing air quotes through all of this, right?
Beyond my skin color, right?
That there are commonalities, that there are connections, that it helped me feel more in solidarity.
You know, it helped me feel more connected to other people, to other plights, to other, you know, other folks that are outside of what is considered normal and valid and worthy.
And in doing that, it expanded my worldview.
But I still struggle with, you know, an event happens and I call what I think it is.
And the awareness of caste invites me to dig a little deeper.
I read that you maybe changed the hierarchies within your work, like on set.
Was that true? Like you thought
about this after understanding it? Well, I tried to apply it. I tried to just apply it and really
look in my own space. So it's easy for me to say, oh, as a black woman, I'm lower on the hierarchy
of America. And that's just what it is. But I'm not a quote unquote lower caste in my company.
I said, you know, how can I balance this out a little bit?
Yes, I'm the head of the company. Yes, I run it, and I run my sets in a circular leadership fashion
where it's not so much of a pyramid but more of, you know, me in the middle of a circle of people
equally around me who have a voice. That's the way that I run things. But within that,
I can say that everyone is in a circle, but if certain voices are
prioritized or if certain people are made to feel more comfortable to even speak, what are the
things that I can do to loosen that up and promote it? You can have everyone around a table in a
circle, right? But am I doing the extra thing that makes that person, who otherwise would be uncomfortable, feel free to speak?
I might have to do a little extra for them to tell them you are here at this table.
Speak when you want to speak. Right.
And so with that idea, my colleagues and I started to look at the ways in which we organize ourselves.
And the biggest example, and one of the most beautiful, is with my cinematographer, Matt Lloyd.
We started to talk about how we could deal with caste on our sets.
And he said, you know, when I look at a film set and a crew, there's a hierarchy embedded in the very names in which we call each other by our titles, by our position titles.
And we have an A-cam and we have a B-cam.
We have, you know, basically junior people and they're all called these things.
So as they come to the table, they're already defined and they're already told at that circular table.
Who's important.
Who's important.
And so we tried to break those down and he did an incredible job in his department of renaming everything.
There was no first camera and second camera.
There was an east camera and a west camera.
And there were lots of little ways that we just tried to address and play with and push against this idea of caste, simply the idea of how we organize ourselves.
Love is such a strong through line.
I'm glad that you point that out.
In this film.
Oh, yeah.
Like, I caught it because I watched it twice.
And, of course, it was there and obvious the first time.
But the second time, I was able to see that through line of love, not to spoil the movie.
But Isabel Wilkerson's husband dies early on, but his presence is so there throughout the entire film.
And what he represents is a breaking of caste. They're a
biracial couple. And he makes that decision to be with her. And she makes the decision to be with
him. And we see that throughout the movie. That's one real simplistic way of breaking caste.
Yeah. In Isabel Wilkerson's book, she speaks in a lot of detail about the idea of endogamy, the idea of
who we are allowed to love. And that's a big part of her thesis, her premise, as she's
sharing her ideas about caste with us and teaching us about it, the idea of endogamy being just a key pillar, you know, the foundational element
of organizing folks in a certain way is regulating who can love who and who can live life together.
And once you start to regulate that, everything else starts to kind of follow that downward
slide once you say, you two have to be together and you two cannot be.
And that's a big part of the modern conversation about caste.
Ava DuVernay, thank you so much for this conversation and this film.
I love sitting with you. Thank you for having me.
Ava DuVernay wrote and directed the new film Origin. It's playing in theaters now.
Depending on what corners of social media you're on,
chances are good that you've heard this earworm of a song by the group Ocean Alley.
The song is called Confidence, and the Australian indie band released it five years ago.
But thanks to going viral, it's having a moment right now.
Or maybe you got caught up in the popularity of Running Up That Hill by Kate Bush back in 2022.
"...deal with God, and I get him to swap places. Be running up that road, be running up that hill, be running up that building..."
Running Up That Hill topped the charts and was streamed over a billion times on Spotify in 2022, even though it first came out in 1985. The algorithm helped get it
there after the Netflix series Stranger Things featured it in its
fourth season. This sounds cool, right, that music can be discovered this way. But have you ever thought about how something like this happens? Writer Kyle Chayka has been thinking
about this for several years. In his new book, Filterworld: How Algorithms Flatten Culture,
he explores how we are fed algorithmic recommendations
that dictate the music we like, how we interpret the news, what movies we consume, even what foods
we eat, clothes we wear, the language we use, and the places we go. And Chayka argues that all of
this machine-guided curation has made us docile consumers and flattened our likes and tastes.
Kyle Chayka is a staff writer for The New Yorker, covering technology and culture on the internet.
His work has also appeared in The New Republic, The New York Times Magazine, and Harper's, among other publications.
Chayka's first nonfiction book, The Longing for Less, A History of Minimalism, was published in 2020.
Kyle Chayka, welcome to Fresh Air.
Thanks so much for having me here.
This is a conversation I've wanted to have for the longest time, so I'm really excited that you're here. So almost about a decade ago, I guess, we could basically go on Facebook
or Instagram or Twitter and scroll through the posts of everyone we followed, almost chronologically,
especially in those early days. Now, most of what we engage in, as you write in this book,
is content flowing through the algorithm, optimized for engagement, and pretty much
devoid of the human touch. What changed about eight or nine years ago? I guess that was around 2015, 2016.
Yeah. In the earlier era of social media, most of the feeds that we were interacting with were linear. So that just meant they were chronological. They ordered all the posts that you saw from most
recent to oldest. And that was just how everything was filtered. You could see it on Facebook or
Instagram or whatever. And over the past decade, most of those feeds have switched to being more algorithmic or more driven by algorithmic recommendations.
So these are equations that measure what you're doing, surveil the data of all the users on these platforms, and then try to predict what each person is most likely to engage with. So rather than having this neat ordered feed,
you have this feed that's constantly trying to guess what you're going to click on,
what you're going to read, what you're going to watch or listen to.
And it feels like a kind of intrusive mind reading sometimes.
I could see how all of this can make us passive consumers,
but what do you mean when you say the algorithms are flattening culture? I think algorithmic
recommendations are kind of influencing us in two different directions. For us consumers,
they are making us more passive just by like feeding us so much stuff, by constantly recommending
things that we are unlikely to click away from, that we're going to tolerate, not find too surprising or challenging.
And then I think those algorithmic feeds are also pressuring the creators of culture, like
visual artists or musicians or writers or designers, to kind of shape their work in ways
that fits with how these feeds work and fits with how the algorithmic recommendations promote content.
Yeah, that's why I thought that bringing up music is a really good example of how Filterworld can feel like it's both expanding and contracting culture. Because, you know, I never
would have learned about a group like Ocean Alley otherwise, but there are these other elements that
you're talking about,
about then tailoring the work based on the algorithm and trying to go viral.
Yeah, yeah. I mean, because we consumers like really consume so much culture through these feeds,
in order to reach audiences, creators also have to work through these feeds. Like a musician has to work through Spotify or TikTok and kind of
mold their work in a way that fits with TikTok. So that might mean like a really catchy hook that
occurs right at the beginning of a song or packing every sound possible into the like
10 seconds that you have for a viral TikTok sound. One other thing I was thinking about is what I also see, though,
is that the digital space has lessened the potency and power of gatekeepers.
So we're no longer relying on a handful of media that dictate what is good art,
what is good fashion and culture and music.
Couldn't it be argued that algorithms in the digital space more broadly
have opened up the
world, though, in ways that we've never had access to before? I think they really have. Like, there's
this huge power of the internet to let anyone publish the art that they make or the songs that
they write. And I think that's really powerful and unique. Like in the ecosystem, the cultural ecosystem that we had before,
there were these gatekeepers like magazine editors or record executives
or even radio station DJs who you did have to work through
to get your art heard or seen or bought.
And so these were human beings who had their own biases and preferences and social
networks. And they tended to block people who didn't fit with their own vision. And now in the
algorithmic era, let's say, rather than seeking to please those human gatekeepers or figure out
their tastes, the metric is just how much engagement you can get on these digital
platforms. So the measure of your success is how many likes did you get? How many saves did you get
on TikTok or bookmarks? How many streams did you get on Spotify? So I think there are advantages
and disadvantages to both of these kinds of regimes. Like on the internet, anyone can put out their work and anyone
can get heard. But that means to succeed, you also have to placate or adapt to these algorithmic
ecosystems that I think don't always let the most interesting work get heard or seen.
I was especially fascinated by your chapter on personal taste and the ways we think about it.
I think we all have taste.
We all have things we like and don't like,
and we all think about what we like and don't like,
and that's what our taste is.
I think what we like is also what we identify with,
and it's how we connect with other people
and how we build communities around culture.
So I think taste is really important,
and it's something that algorithmic feeds and these big digital platforms kind of allow us to ignore
or encourage us to ignore
just so they can keep us listening and clicking and watching.
We're listening to my conversation with journalist Kyle Chayka.
He's a staff writer for The New Yorker
and has written a new book called Filterworld: How Algorithms Flatten Culture, which explores the impact of algorithmic technology on culture.
We'll continue our conversation after a short break.
I'm Tonya Mosley, and this is Fresh Air Weekend.
Today I'm talking to Kyle Chayka.
He's a staff writer for The New Yorker, covering technology and culture on the internet.
We're talking about his new book, Filterworld: How Algorithms Flatten Culture, which explores the impact of algorithmic technology on culture.
Well, as part of your exploration of taste, you wanted to see if a digital space could actually identify your taste. So in 2017, Amazon created something called the Amazon Echo Look, which tried to
approximate taste by making fashion decisions for the user. And you tried it out. How did it go?
Well, this was a little camera that stood on your shelf, and it could take these full body
selfies for you that would show you your full outfit. And you could have the
app, the Echo Look app, send out the images, kind of algorithmically analyze them with some human
help as well. And the machine would tell you how stylish you were being or not. Like it would
purport to give you an analysis of how good your outfit was. And I found that it didn't really work
for me. I mean, it was a big fan of popped collars, which I think were last fashionable when I was in middle school. It really didn't like my choice of monochrome outfits, like an all-gray outfit, which, you know, maybe that's true. Maybe that's not cool. But it's part of my personal choice, my style. To me, the biggest problem
with the Echo Look was that it just kind of gave you this grade of your outfit. Like it told you,
oh, this is 75% stylish, but it couldn't really tell you why. It didn't give you the logic behind its analysis; it just kind of told you whether you were going in the right direction or the wrong direction. And that's just so antithetical to what we think of as personal style, or even what we want to communicate via fashion. Like, how is this algorithm to know what you are trying to communicate with your clothes that day, or how you're trying to feel out in the world?
So I found it kind of useless as a style analysis and also just almost actively misleading or distorting the purpose of fashion, which is actually to communicate something about yourself, not to conform to some data-driven standard.
And that was in 2017.
I mean, several years later, now the big conversation is around generative AI
and its ability to predict what we like, to offer more specificity.
How does that play into this conversation?
Yeah, I feel like AI is like the looming question for all of this,
all of this technology. My feeling is that algorithmic feeds and recommendations have kind
of guided us into conforming to each other and kind of having this homogenization of culture
where we all accept the average of what everyone's doing.
We all kind of fit into these preset molds. And now AI is kind of promising to just spit out that average immediately. It'll digest all of the data in the world. It'll take in every song,
every image, every photograph, and produce whatever you command it to. But that output will just be a complete banal
average of what already exists. That almost signals to me like a death of art or a death
of innovation. Okay, I want to talk about some of the other platforms where we're guided by
the algorithm. In the case of streaming services,
Netflix pioneered the filtering of culture
through recommendation engines.
What does the Netflix algorithm factor in?
It factors a lot of different things,
including what movies or shows you've already watched,
what other users are watching and clicking on,
and also just what Netflix chooses to prioritize in a given moment. So the Netflix homepage now
is really driven by algorithmic recommendations. It's always personalized to try to present you
things that you are likely to watch. And that's always measuring the kinds of genres that you're watching
or the actors you like or, you know, other favorites that you've shown to the system.
The problem, as you write in the book and one scholar wrote, is that it's taking away the
process of cultural meaning through decision making. We make meaning through making our own decisions about what we want
to see and what we like. Yeah, I think so. I mean, the act of choosing a piece of culture to consume
is a really powerful one. Like it is an internal decision that means we're giving our attention to
a specific thing, means we're interested in a specific category of culture.
And I think it can be really empowering. But I think in the context of a Netflix homepage,
it can also be completely manipulative. On the Netflix homepage, there's this problem called
corrupt personalization, which is the appearance of personalization without the reality of it.
And that happens with Netflix because Netflix is always changing the thumbnails of the shows
and movies that you are watching in order to make them seem more appealing to you.
So if you're someone-
Oh, give me an example. Yeah.
Yeah. An academic ran an experiment with test accounts that each watched one kind of show, and then one test version watched everything at random times. But what this academic found was that Netflix would change the thumbnails of the shows to conform to the category that the user watched. It would even take a romantic comedy and put like the one
sports scene as the thumbnail to kind of encourage you to watch it. Or, you know, in a thriller,
maybe if you're a romantic comedy watcher, it would take the one frame where like two characters
are going to kiss or something in order to make it look like this is the kind of culture you want to consume, even though it's actually not. So it's the algorithm in that way is kind of manipulative
and using your tastes against you. You know, I'm just wondering about something, and you as someone
who follows art and culture, this is what you do for a living is write about it. If everything is
recommended for us or tailored to the kinds of movies we like
or the news that I like to consume or the music I like to listen to, how do I really know what's
happening culturally in the world? So how do I know what's happening around me to understand
if my tastes and sensibilities are running in parallel or up against what's happening?
I think that's really hard to do right now.
These digital platforms and feeds,
they kind of promise a great communal experience,
like we're connecting with all the other TikTok users
or all the other Instagram users.
But I think they're actually kind of atomizing our experiences
because we can never tell what other people are seeing
in their own feeds. We
don't have a sense of how many other people are fans of the same thing that we're fans of, or even
if they're seeing the same piece of culture that we're seeing or experiencing an album or a TV
show in the same way. So I think there's this lack of connection, like, as you're saying, this sense that we're alone in our consumption habits and we can't come together over art in the same way, which I think is kind of deadening the experience of art and making it harder to have that kind of collective enthusiasm for specific things. On the other hand, I'm someone, for instance, who's a plant lover. I'm a plant mom.
I'm obsessed with plant life. And so through the algorithm, it feeds me lots of content around
caring for plants and facts about plants. And so there is also another community, though,
I'm tapping into through that. I think algorithms are essentially an ambivalent force. Like I think
you can use them to great effect. You can use them to find the people or pieces of culture that you
like. But I think when we rely on them too much, that's when it becomes so overwhelming and
flattening of our experiences. So in the plant department, I think it's been really cool to see communities
develop around these different trends like plants. But then you kind of see the same plants being
popular everywhere you go. It's so true. The unavoidable fiddle leaf fig or, you know, pothos plant.
And I think, I don't know, it's hard to sustain both that community building and a sense of diversity.
And like a sense that everyone can pursue their own paths within it.
It's like within these feeds, I feel like there's always one correct way to approach a thing or one correct mode of consumption.
And so in plants, that might be,
oh, I have to go get the fiddle leaf fig. Or, you know, in films, I have to go see the Barbie movie
or something like that. You write about this in the book, about how the flattening of culture has
impacted, quote unquote, the real world, right? Every coffee shop has a fiddle leaf plant. And so you give the example of the hipster coffee shop. You noticed something when you were traveling around the world. Yeah, as a writer, I would always find a coffee shop to work in,
in whatever city I landed in. So whether that was Tokyo or Los Angeles or Copenhagen or Berlin,
I would kind of land, go to my Airbnb, open Yelp or Google Maps and search hipster coffee shop
and see where the nearest place was that I could go. And it struck me that
all of these places looked increasingly similar to each other. So I could reliably land in any
city in the world, open one of these apps and easily find my way to a coffee shop with a
fiddle leaf fig and cappuccinos with nice latte art and white subway tiles on the walls and minimalist reclaimed wood
furniture. And it just struck me as so strange, because no one was forcing these cafes to
design themselves this way. There was no corporate parent, like a Starbucks, mandating a
particular style. Instead, it was just that all of these independent cafe owners around the world
had kind of gravitated toward
the same style and the same set of symbols, like the fiddle leaf fig. So Kyle, you're recommending
an algorithmic cleanse, which you actually did for a few months. How did you do that? And how did it
go? Yes, I mean, I think regulation can help these situations, but in the end,
users are kind of responsible for themselves for the time being. And one thing you can do is just
opt out of these systems entirely. Like you can log off Instagram, you can log off Twitter,
you can not go on TikTok, even though it might feel impossible. And that's what I wanted to do, I think, in September of 2022.
I just felt like totally overwhelmed.
I was nearing the end of writing this book.
I needed to just cut myself off completely from the influence of algorithmic feeds.
And I just decided one Friday that I was going to completely log off all of these things,
reroute my consumption habits away from digital platforms,
and almost figure out a different way of existing in the world
than what I had been used to the past decade or more.
That had to be hard because this is what you do.
You cover the internet.
Oh, yeah.
It was very difficult.
I had to really push myself to do it.
There were weeks and weeks where I said,
okay, I'm definitely going to do it this week.
I'll do it next week.
I'll do it the following week.
So to cut myself off,
I mean, it felt like breaking an addiction.
And when I did cut myself off
and suddenly not have all of that
algorithmically recommended stimulus,
I did feel my brain kind of like gasping for
oxygen and grasping for more stimulus that I was missing. How did you fill it?
With some difficulty. The first attempt I made was actually putting these fidget apps on
my phone. Like I found that my thumb was twitchy and like uncomfortable because I wasn't scrolling
through things on my phone. Like that familiar scrolling motion where you flip your thumb upward,
that was totally missing. And so what I did was download these apps where you can like
flip a light switch to turn a light on or like sweep a digital floor or spin a dial,
like accomplish these totally meaningless tasks just in order to like
soothe your brain with these motions. And it worked for a little while. It was a good interim solution.
What are you afraid of with the flattening of culture? Like what is the future that you see
that is really concerning when we think about all of this?
Because this sounds great for us as an audience and for those who will read your book, but for the vast number of those who are online, they are passively consuming.
I mean, I think passive consumption certainly has its role.
We are not always actively consuming culture and thinking deeply about the genius of a painting or a symphony or something. Like, it's not something we can do all the time. But I think what I worry about is just the, I suppose, the passivity of consumption that we've been pushed into, the ways that we're encouraged not to think about the culture we're consuming, to not go
deeper and not follow our own inclinations. And I worry that that passivity, along with the ways
that algorithmic feeds pressure creators to conform to, it kind of leads to this vision of
all culture that's like the generic coffee shop. It's like, it looks good. It might be nice. You might be
comfortable in it. But ultimately, there's like nothing deeply interesting about it. It doesn't
lead anywhere. It's just this kind of like, ever perfecting, perfect boredom of a place,
like a perfectly ambient culture of everything. And I suppose that when I really think about it
is the kind of horror at the end of all this,
at least for me,
is that we'll never have anything but that.
We'll never have the Fellini film
that's so challenging you think about it
for the rest of your life
or see the painting that's so like strange
and discomforting that it really sticks with you.
Like I don't wanna leave those masterpieces of art behind just because they don't immediately
engage people. You know what I'm scared about is the younger generations who know nothing else.
You know, I was not born into the era of algorithmic feeds. The internet that I grew up on as an early teenager was more about self-directed creation of stuff and kind of
exploring. But now I feel like so much of internet experience is driven through an algorithmic feed,
or you're forced to fit into a mold, like the ideal of an Instagram influencer or a creator on TikTok, that younger people
maybe don't have the freedom to just kind of figure out what they're into without being
pushed in one direction or another.
So what are you hopeful about in regards to this topic?
We've seen that there have been hearings around regulation, but none of them have really
pushed us far enough where we're going to see it in the way that we see the changes in the UK.
What are some things that bring you hope?
I think what makes me the most hopeful is that people are starting to get bored of this whole situation. Like, we as users of the internet have spent a solid decade or more, you know, experiencing these things, existing with algorithmic feeds, and it feels increasingly decrepit and kind of bad.
Like, we're realizing that these feeds aren't serving us as well as they could, and who they really benefit are the tech companies themselves. So I think as users start to realize that they're not getting
the best experience, I think people will start to seek out other modes of consumption and just
build better ecosystems for themselves. Kyle Chayka, thank you so much for this
conversation and your book. It was a great discussion. Thank you.
Kyle Chayka is a staff writer for The New
Yorker covering technology and culture on the internet. His new book is Filterworld:
How Algorithms Flattened Culture. Fresh Air Weekend is produced by Teresa Madden.
Fresh Air's executive producer is Danny Miller. Our technical director and engineer is Audrey
Bentham.
For Terry Gross, I'm Tanya Mosley.