Fresh Air - How Algorithms 'Flatten' Culture

Episode Date: January 17, 2024

Filterworld author Kyle Chayka examines the algorithms that dictate what we watch, read, and listen to. He argues that machine-guided curation makes us docile consumers. Also, Maureen Corrigan reviews You Only Call When You're in Trouble, a new novel from Stephen McCauley.

Transcript
[00:00:00] This is Fresh Air. I'm Tanya Mosley. Depending on what corners of social media you're on, chances are good you've heard this earworm of a song by the group Ocean Alley. It's all about confidence, baby. She was a confident lady. The song is called Confidence, and the Australian indie band released it five years ago. But thanks to going viral, it's having a moment right now. But whether it's having a moment on your feed, well, that's all up to the algorithm.
[00:00:35] Writer Kyle Chayka has been thinking about this for several years. In his new book, Filterworld: How Algorithms Flattened Culture, he writes about how we are fed algorithmic recommendations that dictate what music we like, how we interpret the news, what movies we consume, even what foods we eat, the clothes we wear, the language we use, and the places we go. And Chayka argues that all of this machine-guided curation has made us docile consumers and flattened our likes and tastes. Kyle Chayka is a staff writer for The New Yorker, covering technology and culture on the internet.
[00:01:11] His work has also appeared in The New Republic, The New York Times Magazine, and Harper's, among other publications. Chayka's first nonfiction book, The Longing for Less: A History of Minimalism, was published in 2020. Kyle Chayka, welcome to Fresh Air. Thanks so much for having me here. This is a conversation I've wanted to have for the longest time, so I'm really excited that you're here. So about a decade ago, we could basically go on Facebook or Instagram or Twitter and scroll through the posts of everyone we followed,
[00:01:45] almost chronologically, especially in those early days. Now, most of what we engage with, as you write in this book, is content flowing through the algorithm, optimized for engagement, and pretty much devoid of the human touch. What changed about eight or nine years ago? I guess that was around 2015, 2016. Yeah, in the earlier era of social media, most of the feeds that we were interacting with were linear. So that just meant they were chronological. They ordered all the posts that you saw from most recent to oldest. And that was just how everything was filtered. You could see it on Facebook or Instagram or whatever. And over the past decade, most of those feeds have switched to being more algorithmic, or more driven by algorithmic recommendations.
[00:02:33] So these are equations that measure what you're doing, surveil the data of all the users on these platforms, and then try to predict what each person is most likely to engage with. So rather than having this neat, ordered feed, you have this feed that's constantly trying to guess what you're going to click on, what you're going to read, what you're going to watch or listen to. And it feels like a kind of intrusive mind reading sometimes.
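To make the shift Chayka describes concrete, here is a minimal sketch of the two kinds of feeds in Python. The engagement score is a toy stand-in (topic frequency in the user's history) for the learned prediction models real platforms use; every name and number in it is hypothetical, not any platform's actual system.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    timestamp: int  # higher = more recent

def chronological_feed(posts):
    """The older, linear feed: newest first, no prediction involved."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def predicted_engagement(post, history):
    """Toy stand-in for an engagement model: how often has this user
    engaged with this post's topic before?"""
    if not history:
        return 0.0
    return sum(1 for topic in history if topic == post.topic) / len(history)

def algorithmic_feed(posts, history):
    """The newer feed: ranked by predicted engagement, not recency."""
    return sorted(posts, key=lambda p: predicted_engagement(p, history), reverse=True)

posts = [
    Post("indie_band", "music", timestamp=3),
    Post("news_desk", "politics", timestamp=2),
    Post("plant_mom", "plants", timestamp=1),
]
# A user who mostly engages with plant content sees plants first,
# regardless of how old the post is:
ranked = algorithmic_feed(posts, history=["plants", "plants", "music"])
print([p.topic for p in ranked])  # -> ['plants', 'music', 'politics']
```

The same three posts come out in different orders under the two functions; that reordering, repeated at platform scale, is the mechanism under discussion.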
Starting point is 00:03:06 I think algorithmic recommendations are kind of influencing us in two different directions. For us consumers, they are making us more passive just by like feeding us so much stuff, by constantly recommending things that we are unlikely to click away from, that we're going to tolerate, not find too surprising or challenging. And then I think those algorithmic feeds are also pressuring the creators of culture, like visual artists or musicians or writers or designers, to kind of shape their work in ways that fits with how these feeds work and fits with how the algorithmic recommendations promote content. Yeah, that's why I thought that bringing up music is a really good way, a good example of how Filterworld can feel like it's both expanding and contracting culture.
Starting point is 00:03:58 Because, you know, I never would have learned about a group like Ocean Alley otherwise, but there are these other elements that you're talking about, about then tailoring the work based on the algorithm and trying to go viral. Yeah, yeah. I mean, because we consumers like really consume so much culture through these feeds, in order to reach audiences, creators also have to work through these feeds. Like a musician has to work through Spotify or TikTok and kind of mold their work in a way that fits with TikTok. So that might mean like a really catchy hook that occurs right at the beginning of a song or packing every sound possible into the like 10 seconds that you have for a viral TikTok sound. One other thing I was thinking about is what I also see, though, is that the digital space has lessened the potency and power of gatekeepers. So we're no longer relying on a handful of media that dictate what is good art, what is good fashion
Starting point is 00:05:01 and culture and music. Couldn't it be argued that algorithms in the digital space more broadly have opened up the world, though, in ways that we've never had access to before? I think they really have. Like, there's this huge power of the internet to let anyone publish the art that they make or the songs that they write. And I think that's really powerful and unique. Like in the ecosystem,
Starting point is 00:05:27 the cultural ecosystem that we had before, there were these gatekeepers like magazine editors or record executives or even radio station DJs who you did have to work through to get your art heard or seen or bought. And so these were human beings who had their own biases and preferences and social networks. And they tended to block people who didn't fit with their own vision. And now in the algorithmic era, let's say, rather than seeking to please those human gatekeepers or figure out their tastes. The metric is just how much engagement you can get on these digital platforms. So the measure of your success is how many likes did you get? How many saves did you get on TikTok or bookmarks? How many streams did
Starting point is 00:06:17 you get on Spotify? So I think there are advantages and disadvantages to both of these kinds of regimes. Like on the Internet, anyone can put out their work and anyone can get heard. But that means to succeed, you also have to placate or adapt to these algorithmic ecosystems that I think don't always let the most interesting work get heard or seen. I was especially fascinated by your chapter on personal taste and the ways that algorithms have disrupted our taste. You explored this by first asking the question, what is taste? It is a very human thing. Yeah, I think, I mean, taste gets a bad rap sometimes as something that can be pretentious or elitist. But I think we all have taste. We all have things we like and don't like.
Starting point is 00:07:10 And we all think about what we like and don't like. And that's what our taste is. I think what we like is also what we identify with. And it's how we connect with other people and how we build communities around culture. So I think taste is really important. And it's something that algorithmic feeds and these big digital platforms kind of allow us to ignore or encourage us to ignore just so they can keep us listening and clicking and watching. Well, as part of your exploration of taste, you wanted to see if a digital space could actually identify your taste.
Starting point is 00:07:46 So in 2017, Amazon created something called the Amazon Echo Look, which tried to approximate taste by making fashion decisions for the user. And you tried full outfit. And you could have the app, the Echo Look app, send out the images, kind of algorithmically analyze them with some human help as well. And the machine would tell you how stylish you were being or not. Like it would purport to give you were being or not. Like it would purport to give you an analysis of how good your outfit was. And I found that it didn't really work for me. I mean, this really, this pushed on me, I think popped collars. It was a big fan of, which I think were less fashionable when I was in middle school. It really didn't like my choice of
Starting point is 00:08:46 monochrome outfits, like an all gray outfit, which, you know, maybe that's true. Maybe that's not cool, but it's part of my personal choice, my style. To me, the biggest problem with the Echo look was that it just kind of gave you this grade of your outfit. Like it told you, oh, this is 75% stylish, but it couldn't really tell you why, or it didn't give you the logic behind its analysis. It just kind of like told you whether you were going in the right direction or the wrong direction. And that's just so antithetical to what we think of as personal style or even what we want to communicate via fashion. Like, how is this algorithm to know what you are trying to communicate with your clothes that day or how you're trying to feel out in the world? So just, I found it kind of useless as a style
Starting point is 00:09:38 analysis and also just almost actively misleading or distorting the purpose of fashion, which is actually to communicate something about yourself, not to conform to some data-driven standard. And that was in 2017. I mean, several years later, now the big conversation is around generative AI and its ability to predict what we like, to offer more specificity. How does that play into this conversation? Yeah, I feel like AI is like the looming question for all of this technology. My feeling is that algorithmic feeds and recommendations have kind of guided us into conforming to each other and kind of having this homogenization of culture where we all accept the average of what everyone's doing we all kind of fit into these preset molds and now ai is kind of promising to just spit out that average immediately like to it'll digest all of the data in the world it'll take take in every song, every image, every photograph and produce whatever you command it to. But that output will just be a complete banal average of what already exists. That almost signals to me like a death of art or a death of innovation. Okay, I want to talk about some of the other platforms
Starting point is 00:11:05 where we're guided by the algorithm. In the case of streaming services, Netflix pioneered the filtering of culture through recommendation engines. What does the Netflix algorithm factor? It factors a lot of different things, including what movies or shows you've already watched, what other users are watching and clicking on, and also just what Netflix chooses to prioritize in a given moment. driven by algorithmic recommendations. It's always personalized to try to present you things that you are likely to watch. And that's always measuring the kinds of genres that you're watching or the
Starting point is 00:11:53 actors you like or, you know, other favorites that you've shown to the system. The problem is, as you write in the book, and one scholar wrote, is that it's taking away the process of cultural meaning through decision-making. We make meaning through making our own decisions about what we want to see and what we like. Yeah, I think so. I mean, the act of choosing a piece of culture to consume is a really powerful one. It is an internal decision that means we're giving our attention to a specific thing, means we're interested in a specific category of culture,
Starting point is 00:12:31 and I think it can be really empowering. But I think in the context of a Netflix homepage, it can also be completely manipulative. On the Netflix homepage, there's this problem called corrupt personalization, which is the appearance of personalization without the reality of it. And that happens with Netflix because Netflix is always changing the thumbnails of the shows and movies that you are watching in order to make them seem more appealing to you.
Starting point is 00:13:07 Oh, give me an example. Yeah. Yeah. An academic did a long-term study of this by creating a bunch of new accounts and then kind of giving them their own personalities. Like one character, let's say, only watched romantic comedies. One character only watched sports documentaries one only watched thrillers and then one like test version watched everything at random times uh but what this academic found was that netflix would change the thumbnails of the shows to conform to that category that the user watched even if the show was not of that category like it would say you're the sports viewer netflix would take a romantic comedy and put like the one sports scene as the thumbnail to kind of encourage you to watch it uh or you know in a thriller maybe if you're a romantic comedy watcher
Starting point is 00:14:02 it would take the one frame where like two characters are going to kiss or something in order to make it look like this is the kind of culture you want to consume, even though it's actually not. So it's the algorithm in that way is kind of manipulative and using your tastes against you. You know, I'm just wondering about something. And you as someone who follows art and culture, this is what you do for a living is right about it. If everything is recommended for us or tailored to the kinds of movies we like or the news that I like to consume or the music I like to listen to, how do I really know what's happening culturally in the world? So how do I know what's happening around me to understand if my tastes and sensibilities are running in parallel or up against what's happening? I think that's really hard to do right now. Like these digital platforms and feeds, they kind of promise a great communal experience. Like we're connecting with all the other TikTok users or all the other Instagram users. But I think they're actually kind of atomizing our experiences because we can never tell what other people are seeing in their own feeds. We don't have a sense of how many other people are fans of the same thing that
Starting point is 00:15:17 we're fans of, or even if they're seeing the same piece of culture that we're seeing or experiencing an album or a TV show in the same way. So I think there's this lack of connection, like, as you're saying, this sense that we're alone in our consumption habits, and we can't come together over art in the same way, which I think is kind of deadening the experience of art and making it harder to have that kind of collective enthusiasm for specific things. On the other hand, I'm someone, for instance, who I'm a plant lover. I'm a plant mom. I'm obsessed with plant life. And so through the algorithm, it feeds me lots of content around caring for plants and facts about plants. And so there is also another community, though, I'm tapping into through that. Yeah, I think there's always, I think algorithms are essentially
Starting point is 00:16:13 an ambivalent force. Like, I think you can use them to great effect. You can use them to find the people or pieces of culture that you like. But I think when we rely on them too much, that's when it becomes so overwhelming and flattening of our experiences. So in the plant department, I think it's been really cool to see communities develop around these different trends like plants. But then you kind of see the same plants
Starting point is 00:16:40 being popular everywhere you go. It's so true. Like the unavoidable fiddly fig or you know apotheos plant and i think i don't know it's it's hard to sustain both that community building and a sense of diversity and like a sense that everyone can pursue their own paths within it. It's like, within these feeds, I feel like there's always one correct way to approach a thing or one correct mode of consumption. And so in plants, that might be, oh, I have to go get the fiddly fig. Or, you know, in films, I have to go see the Barbie movie or something like that.
Starting point is 00:17:22 I mean, you write about this in the book about how the flattening of culture has impacted, quote unquote, the real world, right? Every coffee shop has a fiddle leaf plant. And so like, you give the example of the hipster coffee shop. You noticed something
Starting point is 00:17:38 when you were traveling around the world about how they're influenced by the algorithm. Yes, I think this was in the mid 2010s or so when I was traveling around as a freelance magazine writer, I would always find a coffee shop to work in, in whatever city I landed in. So whether that was Tokyo or Los Angeles or Copenhagen or Berlin, I would kind of land, go to my Airbnb, open Yelp or Google Maps and search hipster coffee shop and see where the nearest place was that I could go.
Starting point is 00:18:15 And it struck me that all of these places looked increasingly similar to each other. So I could reliably land in any city in the world, open one of these apps and easily find my way to a coffee shop with a fiddly fig and cappuccinos with nice latte art and white subway tiles on the walls and minimalist reclaimed wood furniture. And it just struck me as so strange because no one was forcing these cafes to design themselves this way. There was no like corporate parent, like a Starbucks mandating a particular style. Instead, it was just that all of these independent cafe owners around the world had kind of gravitated toward the same style and the same set of symbols like the fiddly fig. Our guest today is journalist Kyle Chayka. He's a staff writer for The New Yorker and has written a new book called Filter World, How Algorithms Flattened Culture, which explores the impact of algorithmic technology on the ways we live.
Starting point is 00:19:13 We'll continue our conversation after a short break. I'm Tanya Mosley, and this is Fresh Air. Hi, it's Terry Gross here with a promo for a special conversation I had with my co-host, Tanya Mosley, only available for our Fresh Air Plus supporters. When I'm going through a really hard time, I sometimes just think about that, like all the people who I've met through interviews who've come out the other end intact. Terry, I can only imagine the lessons you've learned over time. I mean, it's more than a self-help book, because just in the year that I've been doing this show, I learn so much with every single interview I do.
Starting point is 00:19:52 Tanya and I select our favorite interviews of 2023 and talk to each other about talking in a new bonus episode only available on Fresh Air Plus. Find out more and join for yourself at plus.npr.org. Today, I'm talking to Kyle Chayka. He's a staff writer for The New Yorker covering technology and culture on the internet. His work has also appeared in The New Republic, The New York Times Magazine, and Harper's, among other publications. Chayka's first nonfiction book, The Longing for Less, published in 2020, is a history of minimalism. We're talking about his new book, Filter World, How Algorithms Flatten Culture, which explores the impact of algorithmic technology on the ways we live.
Starting point is 00:20:37 Meta, the parent company for Facebook, announced this month that it will begin removing some sensitive and age-inappropriate content from teenagers' feeds, even if the content was posted by people they follow. Influencers have been very vocal over the years about how the algorithm is against them, working against them in many instances, how some have had great success and then all of a sudden their likes have gone down, their views have gone down. It does seem like a vicious cycle in the way that you're talking about in the quote-unquote real world with retail owners who say they're constantly chasing something. How and why is this happening where people are feeling like they may have a career actually as an influencer online and then all of a sudden they've lost it it's so capricious in a way like i i mean i've felt it myself as a journalist or someone who's
Starting point is 00:21:32 been on twitter for a decade or so like sometimes you feel like you're in tune with the algorithm you know exactly what to do and and how to tune your content so that it gets popular and gets a lot of likes and then at other, it's just not hitting. Like the feed is suddenly prioritizing other things or other formats. So I think there's a seduction of algorithmic platforms where they can really deliver your content to a huge audience and suddenly get you a big fan base, like as a TikTok creator or a proto-influencer.
Starting point is 00:22:04 And then all of a sudden, just as you're gaining that audience and starting to maybe become addicted to the attention, then the algorithm kind of goes away. Your solution to the equation stops working quite as well as you thought it was. And that's, I mean, tastes fluctuate and maybe your content can stay the same and people just are not as interested in it anymore. But also the algorithm of the feed changes and the forms of content that digital platforms emphasize change over time. So you kind of do have to play this constant catch up game of figuring out what the feed likes this day or this week and what it might prioritize in the future.
Starting point is 00:22:48 Well, you write quite a bit about your own experience, how early in your career you learned to judge your success on the internet almost entirely in terms of numbers. So many of us do this. How many thumbs up on Facebook or how many likes on Twitter? How does this impact how you see yourself as a writer, how you measure what is good? Oh, man. I mean, it's tough to separate your success from attention online, I think. I mean, this is just a real fact of existing in the past decade or more. So much of how we consume things is routed through these feeds that if your work is doing badly in the feeds, then you kind of feel bad about it.
Starting point is 00:23:28 Like, as a journalist, particularly in the early 2010s, I would feel bad that my article, my brand new article, didn't get as many likes as the last one. Or my pithy observation on Twitter didn't have enough response, and that would make me feel embarrassed or that I had done the wrong thing. So it kind of puts you in this trap of self-consciousness, I think, where it's like you're always being graded on everything you put out into public. And in a way, you are, because I'm thinking about how hard it is for maybe filmmakers and writers and visual artists, creators of all kind to find footing in filter world. Book publishers, for instance, want to know how many platforms you're on, how many followers you have as part of your proposal for a book deal. How much of that assessment actually changes the kinds of books and movies and music that we have access to? Oh, I think it really does change it. I mean, every publisher will ask a new author,
Starting point is 00:24:32 what is your platform? Like, how big of a platform do you have? Which is almost a euphemism for how many followers do you have online, whether that's Twitter or Instagram or an email newsletter. They want to know that you already have an audience going into this process, that you have a built-in fan base for what you're doing. And culture doesn't always work that way. I don't think every idea should have to be so iterative that you need fans already for something to succeed, that you have to kind of engage audiences at every point in the process of something to succeed, that you have to kind of engage audiences at every point in the process of something to have it be successful. So for a musician, you know, maybe you'll get a big record deal only if you go viral on TikTok. Or if you have a hit YouTube
Starting point is 00:25:18 series, maybe you'll get more gigs as an actor. There's this kind of gatekeeping effect here too, I think, where in order to get more success on algorithmic platforms, you have to start with seeding some kind of success on there already. Have television shows or movies used algorithms to help them green light or shape content? I think so. I mean, I think you can see how tv shows and movies have adapted to algorithmic feeds by the kind of like one-liner gif ready scenes that you see in so many tv shows and movies now you can kind of see how a moment in a film is made to be shared on twitter or how a certain reaction in a reality TV show, for example, is made to become a meme. And I think a lot of production choices have been influenced by that need for your piece of
Starting point is 00:26:14 content to drive more pieces of content. How would you say journalism has been impacted and shaped by these algorithmic forces? Well, algorithmic feeds, I think, took on the responsibility that a lot of news publications once had. So say in decades past, we would see the news stories that we consumed on a daily basis from the New York Times front page on the print paper, or then the New York Times homepage on the internet. Now, instead of the publication choosing which stories are most important, which things you should see right away, the Twitter or X algorithmic feed is sorting out what kinds of stories you're consuming and what narratives are being built up. Or, you know, TikTok, we now have kind of TikTok talking heads and explainers, rather than news anchors on cable TV. So the kind of responsibility for choosing what's important,
Starting point is 00:27:17 I think, has been ported over to algorithmic recommendations rather than human editors or producers. This feels like we're now venturing into what is dangerous territory because we've been talking for many years about misinformation. You write about how algorithms can speed up ideological radicalization because it feeds users more extreme content in a single category. How would regulation, federal regulation of these tech companies, impact maybe the power that the algorithm has? I mean, I think part of the problem with FilterWorld is that it feels inescapable. Like we've been talking about so many negative qualities of this environment, I think there are ways out of it and ways that we can like break down the grip of filter world. And that is a lot of that is through
Starting point is 00:28:12 regulation. I mean, social media is a very unregulated space right now, especially compared to traditional media. And so I think monopolization is one thing that we can be very wary of. So I think if Meta, Facebook's parent company, was forced to spin off some of its properties like Instagram or WhatsApp, and those properties were made to compete against each other, then maybe users would have more agency and more choices for what they're consuming. There are also real consequences that go beyond what we focused our conversation on. You tell the story of a 14-year-old from London named Molly Russell who died by suicide. An audit of her online consumption found that she had been fed content on suicide. Can you explain how something like that happens? When Molly Russell was using Twitter and using Pinterest and Instagram to consume content that was about depression and about self-harm. But then it wasn't stopping there.
Starting point is 00:29:21 The platforms were then recommending her more and more of that content. So at one point, Pinterest actually sent her an email of suggestions of self-harm images to add to a collection that she had made. And I think, I mean, it's just so blatant why that's dangerous and bad. It's an algorithmic acceleration of content, but the content is harmful, and I don't think we want it to be pushed at people. What could tech companies do to filter or slow down or block access to harmful material like this? In the case of Molly Russell, I think there are ways that could pretty easily have prevented it. I mean, better moderation would make platforms more wary of promoting such negative content like about depression or other things like anorexia or mental health problems. And there's another strategy that just excludes certain subjects
Starting point is 00:30:21 and content matter from algorithmic promotion. So if a tech company's filter is detected that something was about self-harm or depression, then maybe that content would not be algorithmically promoted, and that would kind of slow down the acceleration that might have caused problems for Molly Russell. We need to take a short break here, but before we do, I just want to say that if you are in a state of despair and are having thoughts of self-harm, you can get help by calling or texting the Suicide and Crisis Lifeline at 988 at any time. The number again to text or call is 988. If you're just joining us, my guest is journalist Kyle Chayka. He's a staff writer for The New Yorker and has written a new book called Filter World, How Algorithms Flattened Culture,? You're recommending an algorithmic cleanse, which you actually did for a few months. How did you do that? And how did it go?
Starting point is 00:31:33 Yes. I mean, I think regulation can help these situations, but in the end, users are kind of responsible for themselves for the time being. And one thing you can do is just opt out of these systems entirely. Like you can, you can log off Instagram, you can log off Twitter, you can not go on TikTok, even though it might feel impossible. And that's what I wanted to do. I think in September of 2022, I just felt like totally overwhelmed. I was nearing the end of writing this book, I needed to just cut myself off completelyaring the end of writing this book. I needed to just cut myself off completely from the influence of algorithmic feeds. Not coincidentally,
Starting point is 00:32:11 this was around the time that Elon Musk was acquiring Twitter. And that was kind of damaging my experience with one of my favorite platforms. So I was feeling some dread anyway. And I just decided one Friday that I was going to completely log off all of these things, reroute my consumption habits away from digital platforms and almost figure out a different way of existing in the world than what I had been used to the past decade or more. That had to be hard because this is what you do. You cover the internet. Oh, yeah. It was very difficult. I had to really push myself to do it. There were weeks and weeks where I said, okay, I'm definitely going to do it this week. I'll do it next week. I'll do it the following week. It was compulsive in a way because I had spent years and years, you know, every day being on Twitter, looking at Instagram a dozen times a day, being on TikTok, especially during quarantine.
Starting point is 00:33:10 So to cut myself off, I mean, it felt like breaking an addiction. And when I did cut myself off and suddenly not have all of that algorithmically recommended stimulus, I did feel my brain kind of like gasping for oxygen and grasping for more stimulus that I was missing. How did you fill it? With some difficulty. My first, the first attempt I made was actually putting these fidget apps on my phone. Like I found that my thumb was twitchy and like uncomfortable because I wasn't scrolling through things on my phone like that familiar scrolling motion where you flip your thumb upward that was totally missing and so what I did was download these apps where you can like flip a light switch to turn a light on or like sweep a digital floor or spin a dial like accomplish these totally
Starting point is 00:34:04 meaningless tasks just in order to like soothe your brain with these motions. And it worked for a little while. It was a good interim solution. What are you afraid of with the flattening of culture? Like what is the future that you see that is really concerning when we think about all of this? Because this sounds great for us as an audience and for those who will read your book, but for the vast number of those who are online, they are passively consuming. I mean, I think passive consumption certainly has its role. Like we are not always actively consuming culture and like thinking deeply about the genius of a painting or a symphony or something.
Starting point is 00:34:48 Like, it's not something we can do all the time. But I think what I worry about is this, just the, I suppose, the passivity of consumption that we've been pushed into. The ways that we're encouraged not to think about the culture we're consuming, to not go deeper and not follow our own inclinations. And I worry that that passivity, along with the ways that algorithmic feeds pressure creators to conform to, it kind of leads to this vision of all culture that's like the generic coffee shop. It's like, it looks good. It might be nice. You might be comfortable in it. But ultimately, there's like nothing deeply interesting about it. It doesn't lead anywhere. It's just this kind of like, ever perfecting, perfect boredom
Starting point is 00:35:39 of a place, like a perfectly ambient culture of everything. And I suppose that when I really think about it is the kind of horror at the end of all this, at least for me, is that we'll never have anything but that. We'll never have the Fellini film that's so challenging you think about it for the rest of your life, or see the painting that's so like strange and discomforting that it really sticks with you. Like I don't want to leave those masterpieces of art behind just because they don't immediately engage people. You know what I'm scared about is the younger generations who know nothing else. Mm-hmm. I, you know, I was not born into the era of algorithmic feeds, like the internet of an Instagram influencer or a creator on TikTok, that younger people maybe don't have the freedom to just kind of figure out what they're into without being pushed in one direction or another.
Starting point is 00:36:56 So what are you hopeful about in regards to this topic? We've seen that there have been hearings around regulation, but none of them have really pushed us far enough where we're going to see it in the way that we see the changes in the UK. What are some things that bring you hope? I think what makes me the most hopeful is that people are starting to get bored of this whole situation. Like, we as users of the internet have spent a solid decade or more, you know, experiencing these things, existing within algorithmic feeds,
Starting point is 00:37:32 and it feels increasingly decrepit and kind of bad. Like, we're realizing that these feeds aren't serving us as well as they could, and who they really benefit are the tech companies themselves. So I think as users start to realize that they're not getting the best experience, I think people will start to seek out other modes of consumption and just build better ecosystems for themselves.
Starting point is 00:37:59 Kyle Chayka, thank you so much for this conversation in your book. It was a great discussion. Thank you. Kyle Chayka is a staff writer for The New Yorker covering technology and culture on the internet. His new book is Filter World, How Algorithms Flatten Culture. Coming up, book critic Maureen Corrigan reviews Stephen Macaulay's new novel, You Only Call When You're in Trouble. This is Fresh Air. Our book critic Maureen Corrigan is a longtime admirer of Stephen Macaulay's comic novels. She says his latest one, You Only Call When You're in Trouble, couldn't have been published at a better time of year. Here's her review.
Starting point is 00:38:39 Champagne bubbles pop and vanish, but a good comic novel is a steady mood lifter, especially during these flat post-holiday weeks of the new year. Enter Stephen Macaulay, whose novels have been brightening spirits since 1987. That's when his debut novel, The Object of My Affection, was published and later made into a movie starring Jennifer Aniston. Macaulay's new novel is entitled You Only Call When You're in Trouble, and like its seven predecessors, it offers readers not only the expansive gift of laughter, but also a more expansive image of what family can be. The trio at the center of this story consists of a flighty single mother named Dorothy, who's now in her 60s, her daughter Cecily,
Starting point is 00:39:34 a 30-something college professor who's currently under investigation for alleged misconduct with a female student, and Tom, Dorothy's brother and Cecily's uncle, as well as her de facto father. An architect in his 60s, Tom has just been double-dumped. A client has canceled a lucrative building project, and Tom's longtime boyfriend has moved out. Though he fiercely loves his sister and niece, Tom is also realizing how very weary he is of being their emotional and financial pillar. He's always put himself second to their needs. On the very first page of this highly designed story, we learn that Dorothy, ever the cash-strapped free spirit, is embarking on a risky new business venture, a massive retreat center in artsy Woodstock, New York. Convinced that time is running out to make her mark, Dorothy has partnered up with the quintessential bully of a self-help guru,
Starting point is 00:40:47 a woman named Fiona Snow, whose book, The Nature of Success in Successful Natures, was a flickering bestseller back when Oprah still had her talk show. Dorothy summons Tom and Cecily to the gala opening of the center, where she also plans to finally disclose the long-hidden identity of Cecily's father. As in a Shakespearean comedy, mayhem and a flurry of unmaskings ensue, during which characters' true natures are exposed. As a comic writer and novelist of manners, Macaulay has not only been likened to the Bard, but to Jane Austen, Edith Wharton, and to contemporaries like Tom Parada and Maria Semple. Let's throw in Oscar Wilde as well, because there's an economy to Macaulay's style that's reminiscent of Wilde's quick wit. For example, late in the novel, Tom finds himself applying for jobs in other architectural firms.
Starting point is 00:41:54 We're told that, so far, the response was what he'd expected, polite rejection with promises to keep him in mind. Tom sensed embarrassment in the people he'd spoke with. Gray hair and CVs make for an inherently embarrassing combination, like condoms and senior discounts. Even in Macaulay's earliest novels, however, his characters and their predicaments were never simply setups for clever one-liners. There's always been a psychological acuity to his work, and here a deepened sense of looming mortality. Cecily, who rightly disapproves of her mother's partnership with Fiona Snow, sadly reflects that Dorothy had had her issues with drugs, men, and money over
Starting point is 00:42:48 the years. They had, in many ways, defined her life. Now it looked as if she was pulling into the end station, a shallow form of self-help, the politically acceptable cousin of religion. Altruism that didn't involve real sacrifice, self-indulgence made to seem a moral imperative. In its own sparkling way, you only call when you're in trouble is concerned with the question of endings, of what we leave behind, whether it be our work, our worst mistakes, our most loving if flawed relationships. Personally, I didn't want to leave Macaulay's voice and sensibility behind when I finished this novel. So I did something I can't remember ever doing before. I searched online for an earlier novel of his called The Easy Way Out that I'd never read, bought it, and dove in. In January, any safe measures to keep one's mood and bubbles aloft
Starting point is 00:43:56 are justified. Maureen Corrigan is a professor of literature at Georgetown University. She reviewed Stephen Macaulay's new novel, You Only Call When You're in Trouble. Tomorrow on Fresh Air, Washington Post reporter Peter Jamison will tell us why homeschooling is America's fastest-growing form of education. While homeschoolers are an increasingly diverse group with a variety of motivations, some advocates say poor regulation of homeschooling may be a detriment to kids academically and leave them at greater risk of abuse. I hope you can join us. To keep up with what's on the show and get highlights of our interviews, follow us on Instagram at NPR Fresh Air. Fresh Air's executive producer is Dani Miller. Thank you. Anne-Marie Baldonado, Thea Chaloner, Seth Kelly, and Susan Nyakundi.
Starting point is 00:45:07 Our digital media producer is Molly C.V. Nesper. Rita Shorrock directs the show. For Terry Gross, I'm Tanya Mosley.
