Big Technology Podcast - Would a World Without Algorithms Be Better? — With Kyle Chayka
Episode Date: January 17, 2024. Kyle Chayka is a New Yorker staff writer and author of FILTERWORLD: How Algorithms Flattened Culture. Chayka joins Big Technology Podcast to talk about his book, which debuts this week, and dig into whether algorithms really do make our culture flatter. In this spirited conversation, we interrogate the role of algorithms in society, ask how they reflect in the real world, and ponder how products could be redesigned to take back control. Stay tuned for the second half, where we discuss Chayka's story on people falling in love with AI bots. --- Enjoying Big Technology Podcast? Please rate us five stars ⭐⭐⭐⭐⭐ in your podcast app of choice. For weekly updates on the show, sign up for the pod newsletter on LinkedIn: https://www.linkedin.com/newsletters/6901970121829801984/ Questions? Feedback? Write to: bigtechnologypodcast@gmail.com
Transcript
Let's talk about how algorithms are flattening culture and whether there's a better way to solve the problem with a New Yorker writer whose new book on the subject comes out this week.
All that and more coming up right after this.
LinkedIn Presents
Welcome to Big Technology Podcast, a show for cool-headed, nuanced conversation of the tech world and beyond.
This week, we're joined by Kyle Chayka.
He is a staff writer at The New Yorker who covers technology and culture on the internet.
He's also the author of a new book called Filterworld: How Algorithms Flattened Culture, coming out this week.
Kyle, welcome to the show.
Thanks for having me.
Thanks for being here.
I've been a reader of your work for, I can't even remember.
I mean, it must be more than a decade at this point.
I've always found you to be an extremely astute observer of what's going on online.
And more and more about how that reflects offline.
So we're going to talk about all of that this week.
And why don't we begin with, I mean, you're writing a story or a book about algorithms and how they filter our world.
I mean, obviously at this point, you're kind of throwing your hands up and being like, we don't like what the algorithms are doing.
What's your criticism of algorithms?
Well, I think, I mean, to begin with, I do think algorithms have been necessary for the internet as it exists at this point.
Like, since the advent of social media, social networks have gotten bigger and bigger.
We produce more and more content.
Like, users generate content at an extremely fast pace, extremely high volume.
So you need some way of sorting it.
You need a mechanism to figure out what you might actually want to see and sort that out from what you just don't care about or what you could easily miss.
But I think at this point, we're facing this inflection point.
in social media and on the internet in general, where it's harder to be outside of the algorithmic feed than it is to be within it. These feeds and recommendations have just taken over so much of our experience online that I think that's choking off new experiences and making it harder to find new or original or creative stuff.
Okay. So let's just get into it
because I have some counterpoints I want to bring up to you.
First of all, like, it's definitely true that we're moving more towards algorithm and less
toward human signal, right?
We're going from a place where, like, we are not in the follow model anymore, as much as we
are the algorithm recommending.
And this is specifically for content, right?
So it's less about who you decided to follow on Facebook or Twitter and more about what
TikTok shows you.
But that being said, these algorithms do, like, they are designed to find new stuff that you
might be interested in. The TikTok algorithm, for instance, looks for things that, you know,
of course, that you have watched and you're more likely to watch and stay with. But also, every now and again, it will feed in a video of some interest that it's not sure whether you will want, and see how you respond. And if you do like it, then it will send more. And that's how you end up in different rabbit holes. So there's an argument to be made here that algorithms actually do expose you to lots of new stuff you might not have found otherwise. I'm sure you thought about this when you wrote the book. What's your take on that? I think they
do highlight new stuff, otherwise they wouldn't work at all, right? Like, you don't just want the same thing over and over and over again. There's a variable in there, whether it's Spotify or X or TikTok, that's maybe marked surprise or difference. I'm not sure that's actually how it's labeled: surprise, as in, yes, let's twist the surprise knob a little more. And that would be like an experience of randomness, right? The TikTok For You feed doesn't know if you're going to like
this new video of a dog doing something funny, but you probably will. So it feeds that to you,
sees if you engage, then delivers more if you do engage with it, delivers less if you don't.
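The loop being described here, mostly serve what the model thinks you'll watch, occasionally inject a wildcard, then nudge its estimates based on engagement, is the classic explore/exploit pattern from recommender systems. A minimal sketch, with every name, weight, and the epsilon-style "surprise" parameter hypothetical (this is not TikTok's actual system):

```python
import random

def pick_next_video(user_scores, candidates, surprise=0.1):
    """Serve mostly known-good content, but occasionally explore.

    user_scores: dict mapping topic -> estimated interest (0..1)
    candidates:  dict mapping topic -> list of video ids
    surprise:    the "surprise knob" -- probability of exploring
    """
    if random.random() < surprise:
        # Explore: try a topic regardless of the current interest estimate.
        topic = random.choice(list(candidates))
    else:
        # Exploit: pick the topic with the highest estimated interest.
        topic = max(user_scores, key=user_scores.get)
    return topic, random.choice(candidates[topic])

def update_score(user_scores, topic, watched_fraction, lr=0.2):
    # Engagement (e.g. how much of the video you watched) nudges the
    # interest estimate up or down -- this is why hovering or rewatching
    # "sends a signal" and brings more of the same.
    prev = user_scores.get(topic, 0.5)
    user_scores[topic] = prev + lr * (watched_fraction - prev)
    return user_scores[topic]
```

In this framing, the complaint later in the conversation is that `surprise` is fixed by the platform: users can feed signals into `update_score` but can't turn the knob themselves.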
So I think it's nice as like a discovery engine in some ways, but I think the overall effect,
I mean, on one level, I think there's not enough surprise. Like the surprise knob is not turned
up enough, and we have no way of turning it up ourselves. I can't tell TikTok to give me
weirder stuff directly. I can't shake Instagram's idea that I only want to see cheap European
property. And so I'm kind of stuck in these, like, reflections of my own desires that I actually
am bored of. That's your Instagram feed? Cheap European property. Yeah, like every Italian stone village.
That's the view into my subconscious.
So I suppose, like, algorithms absolutely deliver us new things.
I think they desensitize us from, like, following those leads once we do find something
new because we just expect that there's another new thing around the corner, another new
thing.
And I wish there was a way to talk back a bit more.
But we do talk back, though, in many ways.
Like, if we hover on a video on TikTok, we're sending that signal.
And in fact, like you...
But that's such a passive signal.
I would disagree.
I think it could be active, right?
Like, I think the experience of using TikTok right now is like, you almost are aware of how the algorithm is interpreting you.
So, like, if you don't want to be recommended stuff as opposed to, first of all, I think
you can say, show me less.
But also, if you want to see more, you know at this point to hover and maybe I'm giving
it too much credit, but you know to hover on that video and so you'll get more of it.
But again, maybe I'm exaggerating what's actually happening.
The extremely online power users are more aware of how the algorithm works
and how to, like, tune the algorithmic feed to what they want,
so to talk back in that way of sending signals.
Yeah.
But I worry that kind of the mainstream user or the person who is not so aware of
the influence of an algorithmic recommendation just kind of, like, stays within that feed that's still recommending more content that may be engaging, but it's not ultimately leading you to more interesting
places. So are you basically saying that the algorithms are not working the way that they are designed
to? Because let's just talk about it, right? Like, it shows you that surprise to see if you're
interested. Like, for instance, if I ended up with a TikTok video of property in Italy, that would
be a surprise for me. But if I, like, sent TikTok the signals, which I don't even have to think about, right? If I watch that video through, then it would send me more. So it sort of
automatically follows those leads for you. Yeah, it's the sense of like surveillance. Like you are
communicating through your actions what you're interested in or not. I mean, if the algorithms
work the right way, yeah, for sure. Yeah. And like, I mean, I use TikTok a lot. I feel like I have
that sense of how it reads your subconscious almost and directs you down these various rabbit
holes. But I think, I mean, that model of surveillance and the passivity of those signals,
I don't know. I don't think that ultimately is the only way we need to interact with culture.
Like, I would like there to be something more beyond the passive
signals to have people seek things out more, to have people even be able to follow artists
more closely. I mean, I think another consequence of these feeds is that you might watch a musician's
TikTok video for a while because you're interested in it and you might get more, but you kind of expect it to come back to you without you having to try at all. So it builds these kind of passive
consumption habits. And that's like, I think it's kind of unhealthy for us, even though it's
entertaining and hypnotizing. But it's really unhealthy for the artists who are not making as much money,
who are not getting as much attention as they might if they had a more direct audience that
wasn't so mediated. That is absolutely true. So we do now, instead of following a single artist that we love, end up following a genre. Like, actually, yesterday in my surprise box on Instagram, I got this guy, you know, banging these pipes and making beautiful music out of it. And I actually had a moment where I was like, should I follow this person? And then I was like, no. But I think that, you know, Instagram probably knows I played it like 60 times, because I'm writing and I have this, like, bing, bing, bing, you know, just like music going on in the background.
I'm like, they're going to bring it back.
But I don't have any incentive to follow the artist.
And that's true that we now end up in a much more, you know, amorphous area of interest as
opposed to much more intentional.
And that does change things.
There's like an ambient consumption of stuff that you're interested in rather than like
super engaged.
Like in Spotify, particularly, I think they talk a lot about lean-in versus lean-out consumption, like lean-in consumption being when you're truly paying attention, when you're actually
like listening to the song that's in your ears and thinking about it. And that's leaning in,
being attentive. And then there's lean-out consumption, which is just letting stuff play,
experiencing it ambiently, putting it in the background essentially. And I worry that all of
these algorithmic feeds kind of lead to more lean-out consumption than they do lean-in consumption.
So you have this pipe banging guy, and you're like, oh, cool, I like a little bit of this.
I like to hear it.
So if I listen to it, it's going to come back to me, you know, like a piece of driftwood on the tide or something.
I'm not going to take that next step to follow the guy.
You're not buying his album of pipe banging.
I did share him on my WhatsApp channel.
His name is Jake Chapman.
His handle is Chappie Mike Chappie Milkshake.
So if anybody's interested in what I'm hearing, it's actually terrific music.
So maybe I will follow after this.
But you're right, we're not in the moment where we are following this stuff intentionally.
And that leads to like a sort of a dissociation between our interests and the stuff that we follow.
But, okay, I think that this is kind of clear, like how this conversation was going to go.
But for the sake of argument, I'll stand on the side of the algorithms and we'll let you, you know, stand on the side of algorithms are maybe not all they're cracked up to be.
From that vantage point, I would say that maybe they really do work. What they basically do, and I think you've mentioned this, is they kind of strip out a lot of the middle class of content creation, and they point us to a much smaller amount of stuff that is really well liked.
And the argument that I would make on behalf of the algorithms is that
a lot of that middle class was actually really dull and uninteresting.
And I'm not talking about professional content creators, but I'm just saying, like, let's take the Facebook newsfeed, right?
So the Facebook news feed, most of the stuff posted within it is pretty boring and not engaging.
And what it does is it takes all these posts from these thousands of different inputs and shows you, you know, maybe here are the, these are the three or four, like really interesting things that you will engage with.
You will learn from.
You will find interesting.
Now, of course, there have been problems with that, because it's pointed you to the things that will most outrage you and most make you want to, you know, overthrow the government or whatever. But when it works the way it's intended, it actually
sifts through like so much of the boring, dull stuff and actually gets you to interesting
things. And as a consumer, that's maybe something to celebrate. For sure. I mean, in the
Facebook case, Facebook made this decision to collect all kinds of content into its feed, right?
It's not just friends and family.
It's articles, it's videos, it's random meme group page content.
So they piled everything into one platform.
And because you have all that different kinds of content, because you have such a high
volume of it, you suddenly need to sort it in a much more aggressive way.
So in that context, the algorithmic feed really works because it's giving you the stuff
that other people engaged with, that other people commented on.
But I think it's also possible that maybe we just didn't need that fire hose of everything at once to begin with.
Like, I think you really need that filtering and that sorting when you have an undifferentiated mass of all kinds of stuff.
And you need to, like, pick through it because it's not all very high quality.
It's a lot of mediocrity.
If you rewind to like a bygone era of, say, Facebook in 2011 or Instagram in 2013,
or something, you didn't need so much sorting because it really was just the people who you
were in direct contact with. You were following a much smaller number of people. The feeds felt
quieter in a way. So I think the algorithmic, the rise of algorithmic recommendations has
followed this expansion of digital platforms into all types of content. And I kind of have a desire
to go back to that earlier stage where maybe we didn't need so much sorting because things
were more built for a specific purpose and stuck to that purpose rather than like every platform
doing everything at once. Yeah, this was a debate that I had with Francis Hogan, the whistleblower
at Facebook who like didn't really like the interpretation of one of the papers that I had
I guess found in her document dump and didn't like my interpretation, which was that if you
took the Facebook did this study and found that if you took the newsfeed algorithm out and
just did reverse chronological stuff, you had much more spam and content that people didn't like
and it's just a disaster and therefore we need the algorithm. And what she was like, and I think
this is like a pretty good point in retrospect, which is that if you take the algorithm out,
you're not just left with the experience of the platform absent the algorithm. You're left with
the platform that needs design changes to make it so that you could use it and enjoy it without
this recommendation algorithm. And what my argument did was leave out the fact that like Facebook
would then make changes to its platform, potentially good changes,
that leave it less reliant on the algorithm
and more encouraging maybe, like you said, smaller communities
as opposed to this big fire hose.
Yeah, I mean, I think the idea that every platform should do everything has led us in a not-great direction as far as user experience. And I think, I mean, if you think about some non-algorithmic spaces online right now,
like a Discord server, or maybe a linear Reddit forum, just like the most recent posts.
Those work because they're not huge, because they're more intimate, because they're based on a
specific context. And that's not how Facebook or Instagram or X or TikTok work at this point.
So I do think you kind of have to rethink the entire mode of how we consume things online in order
to build things that are not so algorithmic
and encourage more intentional consumption.
And that sort of goes to the second main thing
in your book that I wanted to press you on,
which was that like,
if you don't have the algorithms, what do you have?
So you can potentially redesign, you know, certain products
to end up with smaller communities
and smaller experiences and potentially that solves
some of the problem of filtering the signal from the noise.
But you're still going to have this,
big society with all this stuff out there. And so, like, what do you do with it? Like, who then determines, you know, what you might find interesting?
Now, of course, part of that is going to be you. You'll have the agency. But of course, like,
in the vacuum, we're going to have tastemakers. And I kind of hate tastemakers. First of all, tastemakers are so elite, um, and often, like, follow elite tastes. But the real thing, and, you know, maybe this is just me, because I have about as much basic
taste as possible. So I'm kind of a sucker for filter world, both the book, I guess, and
the algorithms. But what, what tastemakers often do is tell us, like, how things don't measure up
to the bar, how things are bad, and why that music is bad, that art is bad. And that movie
that got mass appeal is bad. And, like, I think they just sometimes try a little bit too hard
to disparage things as opposed to appreciate the good for when it's good for so many people.
What do you think?
I mean, the link between taste and elitism, I find so annoying, like, problematic.
I think it's like, I don't know what to trace it back to. Like, was it the 80s, was it the 90s?
But there's this idea that when you have taste, it is pretentious or to say that you have taste is
pretentious and is elitist. I think to me, this is just talking about taste philosophically,
I suppose. Everyone has taste in stuff. You have things you like. Other people have things
that they like lesser more than you do. And it's not a problem to think about that. It's not
like inherently problematic to be like, I really enjoy this thing, whether it's a sports team
or noise music or like, you know, minimalist classical composition from 1973.
Like, everyone can like all of those things.
So I suppose rather than encouraging the return of elitist gatekeepers and tastemakers,
I would rather encourage, like, everyone to become their own tastemaker.
I think in a way I see that happening on Letterboxd, the film review social network, where it kind of encourages all sorts of people, everyone on the
site, to post their own little reviews, to think about what they like about a film, to rate it.
I suppose that's the kind of tastemaking that I want to see more of. And that's also an example
of a non-algorithmic platform that's still fun and cool. But is it chicken-and-egg a little bit?
Because then, like, how do you find the stuff that you're interested in?
No, it's very true.
But how did people find it before the internet?
Like, they read magazines, they read newspapers, they talked to their friends about what records they bought in the record shop, which was probably helped along by a human being.
I think the human being as a cultivator of culture and as like an accelerator of that connection of one recommendation passing to another person is good.
Like, I'm a big fan of recommendations.
Yeah, word of mouth.
Like, you should tell your friends what you like and why you like it.
And that's not an inherently elitist act.
Yeah, totally.
No, I think that you're right.
More of the grassroots tastemaking is like a great way to sort of replace some of those
or hedge against some of the elitist tastemaking.
Like, when I travel and we're going to get into travel in the second half.
But, like, I'll, of course, read what's on TripAdvisor and stuff like that, and Google Maps. But my favorite way to find out, and I don't think this makes me special, again, going back to the basic tastes, but my favorite way to find out what's interesting around a place is just to go and ask the people that live there and are familiar with the place themselves, because they'll know better than the internet, you know, 10 times
out of 10. Right. I mean, everyone, like, you have that friend who goes to Mexico City fairly often
and has, like, the Google Doc of their own personal recommendations of where to go.
And that's an example of human taste making.
And that doesn't feel pretentious or browbeating or something.
It's just like, here's what this one person liked in this place.
And maybe if you follow some of these suggestions, you'll find some cool stuff.
You'll find more stuff.
You'll figure out your own opinion of these things.
And I think, I mean, you see different modes of this happening online now, too.
I mean, Substack and email newsletters are a great bottom-up, grassroots kind of way of spreading
ideas and talking about stuff.
And I mean, at least so far, email is relatively unsorted.
Like, I still get a lot of, like, a fairly good linear feed of the stuff I want to see
in my Gmail inbox.
Yeah.
Although one thing that you wrote about really resonated with me when it comes to sending
email, you write about this thing called, maybe we can go deeper into it, algorithmic anxiety, where, like, anytime you try to communicate with someone or put a piece of content out, you sort of feel some anxiety about how the algorithm is interpreting it and whether you did everything the algorithm asked for. And then I was just
like, well, you know what? Like, I'm not subject to the algorithms because I do an email newsletter.
And then I'm like, no, actually I have the same exact thing. Have I been filtered into spam and
my updates versus the primary inbox? I do have algorithmic anxiety. So can you talk a little bit
about what that is as a thing?
Yeah, it was this really interesting term
that was only coined in 2018, surprisingly,
by this academic, Shagun Jhaver.
And he was kind of embedded at Airbnb as a sociologist,
and he took on this project of interviewing Airbnb hosts
about how they felt about Airbnb's search engine,
the rankings of their listings.
And what he found was that the hosts basically had no idea
how the platform was working. They had all of these kind of folktale methods of gaming the search engine
and then trying to provoke the algorithm into paying attention to them. Like they would change their
calendar a bunch of times. They would log on to the website more times a day. Like they felt like
these little tricks were giving them an edge when in reality it probably was not those variables.
So I think the origin of that algorithmic anxiety is like the mismatch between how the algorithmic feed or recommendations affects us and how little we understand or can control them.
Like in the context of a newsletter, you are wondering where your email is getting sorted into and if you're actually reaching your readers.
I mean, when I put out a tweet, or an X post, or whatever we're calling them,
I worry, like I, I think through, okay, how can I get the algorithmic feeds attention?
What can I do to sustain it?
Am I doing the right thing?
And that's like a source of anxiety, in part because I measure the success of the tweet based on likes, which is my own problem.
I think everybody on Twitter does that.
It's the scorekeeping.
Yeah.
Exactly.
Okay.
So let me ask you this before we go to break. When people write a book like this, they have a goal in mind sometimes.
Like you want to write a book and you want it to do something.
I mean, obviously you want it to sell.
But beyond that, like maybe it could have an impact.
So is your intention here, is it to get the builders of tech to like remove algorithms,
to get the users of tech, to think about their relationship with algorithms in a different way?
Like, what are you trying to do here?
I was really trying to provoke this conversation.
Okay.
Here we are. Which is, like, I think a lot of users are not aware of just how much
algorithmic recommendations shape what they experience online. And we all underestimate how
much they influence all of the culture that's around us. So I wanted to kind of shake,
like grab you by the shoulders and shake you and be like, look at how these are influencing
everything at once. And I wanted to, like, not to be so hopeless as all that and like offer some
optimism, both in the form of like showing how regulation is happening in the EU that is
mandating algorithmic transparency and other things and how new platforms are working online and
there are like digital alternatives to these consumption habits that we've developed.
What would you say to builders of products that, I mean, we have a lot of builders listening
here. They build, they use algorithms. Should they hit the brakes, or...
I think, I mean, for me, as a user of this technology myself, like I've experienced all of
these feeds. I've been as online as anyone over the past decade. I think we've reached a tipping
point. People are tired of just how algorithmic these feeds have become, the kind of proportion of
recommendations to things that you've chosen to follow. And I think users are thirsty for some sense of human-to-human connection that has gone missing in some of
these larger platforms. And I mean, that used to happen on Facebook and Instagram. It can happen
again. On TikTok, there are kind of dregs of human interaction when you like a comment or something.
But ultimately, I think I want to see more apps and more experiences that are about cultivating
one person communicating with one person, users' ability to follow a single voice in a sustainable way. I'm just, like, I think I want tech
builders to have more of a sense of culture that's not just about popularity. That's not just about
how many faves or how much engagement something gets. Like I think obscurity is an important value
just as much as popularity is. Wow. Yeah. I guess like the question is how would you code that into an
algorithm? That is a challenge. That is a challenge. Yes, it is.
But maybe the way, maybe there's two, I mean, two sides of the same coin.
Yeah, maybe you do turn up the surprise knob or the other thing that you could do is potentially
allow for that more human input for, you know, broader signal.
Like maybe, I guess the follow signal has gone away for a reason, right?
It just wasn't strong enough.
But maybe finding ways to have people tune their experience, their experiences could be good.
All right, we're here with Kyle Chayka.
He's the author of Filterworld: How Algorithms Flattened Culture.
I've gotten a chance to read it, enjoy it.
So we've talked a little bit about online. In the second half, we're going to talk about how algorithms shape offline experiences.
So plenty more coming up right after this.
Hey, everyone.
Let me tell you about The Hustle Daily Show, a podcast filled with business, tech news,
and original stories to keep you in the loop on what's trending.
More than 2 million professionals read The Hustle's daily email for its irreverent and informative takes on business and tech news.
Now, they have a daily podcast called The Hustle Daily Show,
where their team of writers break down the biggest business headlines in 15 minutes or less
and explain why you should care about them.
So, search for The Hustle Daily Show in your favorite podcast app, like the one you're using right now.
And we're back here on Big Technology Podcast with Kyle Chayka.
He's the author of Filterworld, How Algorithms Flattened Culture.
He also writes about tech for The New Yorker, covers AI, a lot of social media.
really good stuff. Okay, so another thing that I read in the book that I found to be pretty
interesting is that algorithms are shaping the physical world, right? That I don't fully know
exactly how it works, but maybe things that do really well on Google Maps, like have a, you know,
similar type of feel and vibe to them, and therefore they beget other physical places that
look exactly the same. Can you talk a little bit more about that? That's really interesting to me.
For sure. I mean, I think we all feel how algorithmic recommendations shape the physical world. One example in the book is Google Maps. When
passing by my wife's hometown of Westport, Connecticut, if the highway is too busy leaving New York, Google Maps will route all of the traffic through downtown Westport, which is a tiny town. It cannot handle many cars. There are many traffic lights. But it completely warps the quality of this downtown from being something that's for locals, relatively slow, into this massive line of cars that's just circumventing I-95 or whatever. So that's an example of
a recommendation literally reshaping how people move through the world. I write a lot in Filterworld about coffee shops and restaurants and how they have to kind of mold themselves
to digital platforms in order to succeed.
And I mean, you see that, you see symbols of that as in the case of restaurants that name
themselves Thai food near me.
Yeah, I love that.
Thai food near me does not make sense without the internet.
Like, that as a name is illogical unless you consider that when people search for Thai food
near me on Google Maps, this restaurant will likely come up, because they've kind of gamed the system. So I find that same effect, in a way, with restaurants and cafes,
adopting certain design symbols, making sure they have really optimized internet presences,
worrying about their Google Maps and Yelp ratings. I mean, lately, I've been traveling a bunch
in the past month. And when I use Google Maps, it's really very blatant which restaurants
have either paid for promotion or just really do well on Google Maps, and they show up as these massive stars in the midst of neighborhoods or cities. And my attention is drawn to them inevitably. Google Maps very much makes it easier for me to go to those places versus other places
that might not be so online.
So I think I just want to call attention to that fact that even our choice of a restaurant
is guided by how something has adapted to the internet.
Yeah.
Yeah. And look, I think that, first of all, the Thai food near me thing is just kind of hilarious. Maybe it doesn't really change anything. I mean, maybe you end up at a Thai food place because you searched for it. But you also talk a little bit about how coffee shops have all started to look the same. Let me see if I can capture it: it was, like, kind of minimalist, they got the white subway tiles on the wall, everybody kind of knows exactly. Like the clear glass case with some pastries in it.
Succulents and ceramic vases, nice little chunky mugs, latte art.
I don't doubt that those things, like, there's a certain taste that people will like
and they will go to these things and they'll rate them well.
But I also sort of feel like if you had a kick-ass coffee shop that was different,
it would still break out and rate well.
So can you talk a little bit about, like, you know,
are we just destined because of the algorithms to have all these different,
you know, all these similar looking coffee shops, like, why won't difference also rate well
on Google?
I think it's kind of going with the flow versus charting your own path in a way.
Like, there's a very...
It's too risky to open something that's different or...
Well, I think so. If you are an entrepreneur who is opening a coffee shop somewhere,
like, I reported this out for the book.
I interviewed like a dozen coffee shop founders all over the world and asked them
why they designed their coffee shop in the way that they did. And they all talked the most
about Instagram, needing to cultivate the attention of the Instagram demographic, the tourist who
is planning their trip based on what they saw online. And they also talked about the kind of network
effect of being able to connect with so many other coffee shop entrepreneurs and baristas so that
the barista in Beijing could see what the barista in Brooklyn was doing all the time.
So I think there's almost this peer pressure, both from the consumers and from the shop owners
themselves, to all kind of move toward the same point. And if you want to attract that
Instagram audience, which includes probably people like you and me, you want to look like Instagram.
You want to follow the aesthetic that's already popular on Instagram. And I feel very guilty of
this, like, I'm part of the system, because I do, you know, when I'm in a new city, I look on Google
Maps, I look through the coffee shop listings, and I usually go to one that has all these symbols,
because I kind of know that it will be something that I like. I think it just creates this kind
of self-reinforcing cycle.
Same here. And like, even though you go to these independent
coffee shops, what you really end up in is, I don't know, I guess I might get in trouble for saying this,
but what it is is, it's like Starbucks, but just a different form of Starbucks, which is
that it's uniform, but just shaped differently.
It's decentralized Starbucks.
Exactly.
That's what we've invented here: a completely homogenized,
flattened experience that you can get anywhere,
but it's not dictated by some corporate monolith.
But it's interesting also because it does provide some comfort being able to be in a
familiar place in an unfamiliar place.
Like there's this, so there's this German rapper, his name's Alligatoah.
He's a big, he's very, very popular there.
I've been listening to some of his music.
one of his songs is about going to Starbucks in Phuket and, like, how, you know,
it's so exotic, but you feel so at home. Sorry?
Yeah, yeah, how that's a good thing.
Actually, I think he's just kind of making fun of the whole thing.
Like, look at me, I'm in Phuket, Thailand, but I'm in Starbucks.
But there is, I don't know, it is interesting.
It's, there's a comfort in it.
Yeah, and I think, I mean, in the book, I'm kind of observing this as a phenomenon.
I think it's a kind of, like, pressure or force
that's acting on a lot of things we experience now.
I think in the context of a cafe or a restaurant even,
this can be like a very nice experience to find something that you know is going to be good.
Like I can reasonably be assured that if there's subway tiles on the coffee shop walls,
then I'll get a good cappuccino.
And that's usually true.
Yeah, they're going to have cold brew.
Yeah, exactly.
But what I want to argue is that
that kind of familiarity and comfort might be what we want from a cafe, but I don't think
it should be what we want from music or art or design that's supposed to, like, move you
and make you become a better person. Like, I don't think these coffee shops are making me a more
moral or purposeful or, like, enlightened person. They're just giving me a good space in which
to basically, like, numb myself. Yeah, it totally dulls the experience, right? And my wife and I
were just talking about this, about how travel, like, you know, a bunch of years ago, was way more exciting, because you had no idea what you were walking into. You just sort of stumbled into amazing places, or heard word of mouth and ended up at these great sites and wonders. And now it's like, okay, you go to Google Maps, you know exactly how to do it. You know exactly what it looks like, you know exactly the angle to take the Instagram photo. You're grimacing here, and I think it seems like this is resonating with you.
It hurts in a way, like that idea that everything is totally pre-visualized and predicted,
and it's a kind of like easy, frictionless path.
And I feel like, I mean, travel used to be more challenging in many ways.
It was harder to plan things out.
You didn't have Google Translate.
You didn't have the camera that could translate signs for you.
But in a way, I think that challenge made things more satisfying or at least gave a sense of purpose.
Yeah, and some of your best stories are about how you were so totally screwed on the side of a road,
not having any idea where you were and actually having to say hello to somebody.
It pushes you into a scenario you wouldn't actually be in.
Whereas if you're just, like, landing in a place and calling an Uber and going to your
minimalist hotel and then going to your minimalist cafe, like, that's actually not
an experience of a place, or maybe an experience at all.
So totally true.
I mean, like, as I said, I've been traveling a fair amount recently, and even the kind of Apple Pay or contactless pay that you can do with your phone has erased a bunch of experiences, like having to go to a ticket machine to buy a subway ticket in Amsterdam or in Lyon, where I am right now.
Like, you can just carry your phone, swipe everything, and walk right in. And that's super easy and cool, and it really helps.
I almost miss the trouble of, like, figuring out how to get a ticket or get a new card for a different city.
That is one step too far for me, Kyle.
I'm not going to blaspheme Apple Pay on this podcast for sure.
But I hear your point.
Okay, so we're in this era right now, you know, generative AI, and it's almost like algorithms are going to start mediating the other algorithms that we're using, and it'll just be algorithms all the way down.
So obviously, like, as you wrote, this whole era of LLMs and generative AI was unfolding,
how do you think that changes or accentuates some of these issues that you're talking about?
I mean, my theory for a little while has been that algorithmic feeds,
I mean, algorithmic feeds are a form of AI.
There's machine learning there.
There's like adaptability there.
The algorithmic feeds have encouraged people to behave in kind of generic ways and culture to fit in specific molds,
like the generic Spotify song or generic TikTok choreography.
And I think generative AI now has this ability to like spit out the generic culture instantly based on a user's prompts.
Like rather than just guiding us toward a kind of generic culture, it's literally creating the generic culture from the get-go.
And you kind of see that in the cliché Midjourney aesthetic, the kind of, like,
average style of all of these AI images that kind of look like sci-fi Lord of the Rings or
whatever. Like there's a certain vision that just looks kind of good and looks kind of like AI and
smooth and, like, reasonably coherent without being completely coherent. So I feel like as users
start adopting these new tools, not just to, like, consume culture, but to create it,
we'll be in the zone of, like, even more generic stuff at a higher and higher volume.
I'm with you. I mean, I do think that generative AI is starting to homogenize creativity in many
ways. And I guess creativity has also been homogenized in a big way, which is a whole different conversation.
I feel like we talk a lot about creativity right now. And it's like we have such a desire for
creativity. Creativity is, like, such a good thing. No one says that creativity is bad. And
yet we want creativity to be easy in a way. Like, we want creativity to mean describing a thing
to an AI model and then pushing the button, as if that's fundamentally creative. And I don't
think it necessarily is. Yeah, I mean, it is homogenizing, but the other side of this is that
it does allow for creative people to become even more creative by experimenting with so many
different potential ways as opposed to just one and sticking with it. So I guess it's
yet to be seen. I guess, like, one other thing I wanted to talk to you about before we
leave. I'm not 100% sure if it's related, but I think it might be, and you did write about it,
and I feel like it's fair game: you did recently do this long piece about how people
are starting to, like, really fall in love and build companionship with AI models. And I'm just
kind of curious if you could share a little bit about that, and whether the type of person
that we fall in love with might also kind of be flattened
by AI, or might it be AI, you know, completely.
Doing that story on chatbots made me believe in AI more than anything else that I've
experienced.
Like, more than Midjourney, more than ChatGPT turning, you know, refrigerator instructions
into Shakespeare sonnets.
Like the chatbots truly feel like something new and different, I think, because people really
relate to them as sentient creatures. And they, they relate to them as personalities, as characters
with a depth and history and interactivity that I don't think much technology has at this point.
Like, we don't think that our iPhones have personalities, right? It's just, it's one device for
everyone. And it might have like quirks or flaws, but you don't say, oh, my iPhone is annoyed
at me or, oh, my iPhone said something really funny.
Whereas these chatbot users really developed these deep relationships and senses of coherence and kind of three-dimensionality with these technological products.
And I found that really fascinating.
I think in part it works that way because people interact with them for so long.
The chatbots do build up memories and kinds of shared frames of reference.
And it becomes this human interaction in a way, or it
draws more human interaction out of the human in the equation.
It feels inspiring, I suppose, in a way that few technological products have to me.
So there's hope.
Yes.
I mean, we just saw that new little red cube AI personal device.
Oh, the rabbit.
Yeah, Ranjan and I just talked about it on the Friday show.
He's trying to pre-order it already.
To me, what I want is, like, the Tamagotchi version of
that, you know, the AI Tamagotchi with its own personality. Maybe, okay, here's maybe what it is.
The depth of learning and training and the kind of experience and memories of a specific
AI character is specific. Like, your chatbot is not like other people's chatbots.
Right. And that's not a homogenous thing. That can maybe cultivate more specificity and more
unique experience. Well, just wait till we all end up in Vision Pro. And we're
starting to not only have these as Tamagotchis, but talking to them.
And, you know, those are all going to be algorithmically generated as well, but maybe not
sorting algorithms.
But if you think you're living in a filter world now, just wait until you put those
things on and then you're really going to be there.
Right.
I think this book was really about the past decade.
It was like about the 2010s and what happened to us during the 2010s.
And now we're just on the cusp of like some other thing.
And it'll take a decade to figure out what that's all about, too.
The book is Filterworld: How Algorithms Flattened Culture by Kyle Chayka, who's here with us today.
You can find it this week in any bookstore you'd like, Amazon or any of the local indies.
I have a feeling of which one Kyle's going to want you to visit, maybe one that doesn't have an algorithm in the front door.
Call your bookstore. Call up your local bookstore and request it.
Kyle, thanks so much for joining.
Yeah, this is really fun. Thank you.
Great stuff.
Thanks, everybody for listening.
Thank you, Nate Gwattany for handling the audio, LinkedIn,
for having me as part of your podcast network.
And all of you, the listeners,
we will be back on Friday to break down the week's news.
Until next time, we will see you then on Big Technology Podcast.
