On with Kara Swisher - TikTok Is Changing How We Talk & How We Vote

Episode Date: July 21, 2025

Social media algorithms are leading to the creation of new words, new accents, and even new identities. And while using the apps may seem like a fun, trivial way to waste time, they’re actually having a profound impact on how we communicate — and on our democracy. To find out more, Kara talks to Adam Aleksic, a 24-year-old Harvard-educated linguist and social media influencer, and the author of Algospeak: How Social Media Is Transforming the Future of Language. They discuss the way new words, communities, and identities develop on social media apps; the financial motives and incentive structures underlying the algorithms; the mechanisms through which they shape user behavior; and how they end up impacting our culture and politics.  Questions? Comments? Email us at on@voxmedia.com or find us on YouTube, Instagram, TikTok, and Bluesky @onwithkaraswisher. Learn more about your ad choices. Visit podcastchoices.com/adchoices

Transcript
Starting point is 00:00:00 All your base are belong to us. Greatest one. Yeah, that's a 90s kind of reference. That's Kara Swisher's time period when she first came on. Usually I come on here and I make people feel old. You're making me feel really like a child. Hi everyone from New York Magazine and the Vox Media Podcast Network. This is On with Kara Swisher and I'm Kara Swisher. Today I'm talking to Adam Aleksic, a 24-year-old Harvard educated linguist who is the author
Starting point is 00:00:36 of AlgoSpeak, How Social Media is Transforming the Future of Language. And he's also a social media influencer himself. Adam says the social media algorithms are leading to the creation of more new words than ever before, and recognizing the incentives behind the algorithms can help us understand how our language, culture, and even our politics are being influenced by opaque systems owned by a tiny number of billionaires or beholden to the Chinese Communist Party. I'm excited to talk to him because I've been following how language changed on the internet since before he was born.
Starting point is 00:01:10 Let's be clear. And there's all kinds of words that do come out and people use and they come and they go. It started with memes. All kinds of things happen in the early internet. You started to see this in people talking short language, meme-ified language, essentially. And it also creates new words and new ways people talk to each other.
Starting point is 00:01:27 The meme, I think, is one of the more important social media cues of the era. And if you know them correctly, you're on the in. If you don't, you are cringe, often like myself, as my kids say. Our expert question today comes from Brooke Hamerling, the founder of The New New Thing, and a writer and podcast host at pop culture mondays.com
Starting point is 00:01:47 She's a good friend of mine, but I gotta say she knows more about internet memes than anyone else. I know so stick around Support for this show comes from service now who are enabling people to do more fulfilling work, the work they actually want to do. You know what people don't want to do? Boring, busy work. But now with AI agents built into the ServiceNow platform, you can automate millions of repetitive tasks in every corner of your business, IT, HR, customer service, and more. And that means your people can focus on the work that they want to do. That's putting AI agents to work for people.
Starting point is 00:02:27 It's your turn. Get started at servicenow.com slash AI dash agents. Blockchain is reshaping every aspect of society, starting with finance. It's happening across industries, across sectors, and across the world. And it's happening with Ripple. With more than a decade of blockchain experience, over 60 licenses, and strong institutional trust, Ripple provides financial institutions with blockchain and crypto-powered solutions
Starting point is 00:03:00 across payment and digital custody applications. This means secure 24-7 transactions, moving value across the world faster. Find out more at ripple.com. Let's dive right into the book. How did the word unalive develop, and what does it say about the way our language is changing? Right, so my book starts with that example, unalive develop and what does this say about the way our language is changing? Right, so my book starts with that example, unalive. There's kids in middle schools talking about Hamlet contemplating unaliving himself in their essays and having classroom discussions
Starting point is 00:03:34 on the unaliving that happens to Dr. Jekyll and Mr. Hyde. It's the synonym for killer commits suicide and people are using it first on TikTok because you can't say kill. It's not necessarily censored, but it's suppressed. Your videos will get pushed to fewer people. And so creators try to circumvent that with coming up with, quote-unquote, out-go-speak language meant to circumvent kind of online censorship. Circumvent. So you're circumventing whatever the content moderation system, if there is one at all.
Starting point is 00:04:02 Right. But it's also taken on a new life offline as a euphemism. And it's sort of this perfect example, I think, of this new algorithmic infrastructure of the internet bleeding into our everyday lives. But I think this is the new kind of defining feature of mass communication online right now and how we relate to each other online. And that's affecting actual language. That is fascinating. So using a watermelon emoji as a way to signal support for Palestine is another example.
Starting point is 00:04:28 Talk about the many layers of communication that happen when someone uses a word or emoji to show support for a cause or social media. Talk about that a little bit. I'm so glad you brought up the watermelon. It's a fascinating example. We have a few layers of kind of performance happening. You're first performing for the person There's like a literal meaning and that the path that's the watermelon stands in for Palestine
Starting point is 00:04:50 So it has a literal meaning. It has a performative meaning the performative meaning is I'm in your shared group We understand the social context of what this means. Mm-hmm. You're also performing for the algorithm I call that algorithmic performativity You're performing in a way that the algorithm will allow this video to be pushed further. And again, the Palestinian flag is not actually censored. Sometimes there's just an imagination of what the algorithm does and people over-crack. But it probably does like suppress some posts relating to the conflict. They have community guidelines around, let's not exaggerate, conflicts and stuff like that. So they probably do suppress videos with a Palestinian flag. And so creators turn to the watermelon emoji as a way of getting their videos seen by more people. And so you're performing for that. But at the same time, there's a meta-linguistic indicator, like a fourth level. You're also signaling to your
Starting point is 00:05:32 audience, hey, by the way, we're being watched by this platform. And I'm using this watermelon emoji not just to mean the literal thing, not just for the algorithm, but also to tell you we are in a surveillance state. And we know this. So we're going to be tricky, although it's in plain sight in a lot of ways, right? I mean, I assume the companies know what they're doing there, too, and allow it to happen. Obviously. And the companies have caught on to unalive, too.
Starting point is 00:05:56 If you search up the word unalive right now on TikTok, it'll redirect you to a page for seeking mental health guidance. And some creators started putting an at sign instead of the letter A or an exclamation point instead of the letter I, and that's to circumvent the second layer. And the example I use in the book is we're playing linguistic whack-a-mole. The algorithm keeps coming down, the hammer comes down, and then new mole pops up. And in linguistics, we call that a productive force, something that produces more language. So, some listeners, they might think this is interesting, but why should I care about
Starting point is 00:06:24 teen slang or emojis used by social media activists? What do you say to them? What do you, why does studying how algorithms shape language evolution tell us about power and communication and the society writ large? Yeah, I think we have to start with this stuff we're talking about, the quintessential examples of algo-speak language meant to circumvent the algorithm. That's the tip of the iceberg. These are the known knowns, the examples we can point to. We can say, oh, this is just a clear, obvious example of algorithms rerouting our speech. I think there's also less clear examples,
Starting point is 00:06:55 maybe the known unknowns. And we're not even getting into, I mean, we can't possibly know what the unknown unknowns are. But the more I began to look into this, this was the example that first drew me as a creator, as a linguist, because I can't turn off linguist brain, I can't turn off content creator brain. I can't turn off content creator brain, I'm constantly looking at my own language and thinking, wow, what am I saying here?
Starting point is 00:07:09 Why am I saying it? So I was drawn to that kind of algorithm speak. And the more I looked into it, the more I think algorithms are shaping every aspect of online communication right now, at least on these platforms, which is the predominant way that we're communicating. So you have the influencer accent. These are sort of inflections and ways of stressing words that are meant to grab your attention better. Because the underlying logic of these platforms is let's
Starting point is 00:07:28 get people's attention for as long as possible so we can commodify it, so we can sell more ads and sell their data. That means that language is really, really revolving around what grabs people's attention. This has been true in the past. You've always needed to grab people's attention and communicate, but I think it's compounded on the algorithm. I think it's amplified. And this is a pattern you see time and time again. It's a normal human process, but it's exaggerated by the algorithm. So you have attention-grabbing mechanisms. Humans also naturally create in-groups and out-groups. That's an innate tendency that we do. But
Starting point is 00:07:55 algorithms have a tendency to separate us into filter bubbles. And so they emergently compound this human tendency with their behaviors. And then we have this completely new thing. And these algorithmically created echo chambers are now incubators for new language change. You have communities forming online that have a shared need to invent new vocabulary. One example I use in the book is like a K-pop community, which maybe didn't have a way to coalesce before when they were geographically disparate. Now they have a fandom and it starts on Tumblr and Twitter and moves to TikTok. And now they have all this like in-group vocabulary, which is like completely unique to them. But
Starting point is 00:08:28 sometimes the words escape to the mainstream, like the word delulu, which started as a way to describe a fan's delusion toward their idol. So you're also an influencer. Also, it's sort of a new word for a job, at least. You have 1.5 million followers on Instagram, 750,000 on TikTok and over 600,000 on YouTube. Last year you went viral for the term boomer ellipsis. Explain that and why it worked and the strategy for going viral there. The boomer ellipsis identified the kind of dot dot dot that boomers do in their text messages, you know, sort of an unfamiliarity with the conventions of internet speak. That's, you know, that's more of the early internet era kind of explaining through that
Starting point is 00:09:06 lens that Gretchen McCulloch set up. So I was explaining this phenomenon that I don't think anybody else had talked about before, or there were some big studies on this. I particularly coined the phrase boomer ellipsis as an example of trend bait. This is something that I talk about in the book that influencers try to identify what will the next trend be. They go out of their way to coin new phrases, coin new words, because it's some kind of sociologically compelling thing to us that, oh, there's this phrase, I don't know what this means. And now I want to be in on the group that knows what this phrase means.
Starting point is 00:09:36 So it satisfies that in-group curiosity. At the same time, it speaks to our fascination with intergenerational differences. I also dedicate a chapter to that right now, for example, the Gen Z stare is trending on TikTok. Before that, we had the Gen Z finger heart, the Gen Z shake, we had millennial paws. We have all these sort of generational terms and also generations are completely made up. There's no such thing. There's a lot of academics who are really frustrated. Pew Research Center is scaling back on what they're calling generations right now.
Starting point is 00:10:02 Sure, we have like familial generations, but the social idea of a generation is newly constructed since like World War I. Absolutely. It was often around music. Also, it's sort of Western bias too. Anyway, there's a lot of like reasons generations are made up. I feel as an older Gen Z person, I feel much closer to a young millennial than a young Gen Z person.
Starting point is 00:10:21 But you know, we still get lumped into these broad labels that may be constrained or we now we start trying to identify with these labels. Maybe now that I'm Gen Z, I want to use the Gen Z finger heart. I want to use like, I relate more to the Gen Z stare or something. So when I coined that phrase, boomer ellipses, I did that knowingly playing into intergenerational tribalism. Which still which still persists today. It really is kind of you're right. You're absolutely right. It hasn't been used in it. And I hadn't thought about it on a global basis. Of course, nobody in Syria my age is the same as me,
Starting point is 00:10:50 for example, or have similar things. So tell me, what do some new words develop within our social media subclusters, but others don't? Like, it's really interesting what happens. And of course people, speaking of a phrase, they try to make fetch happen and they don't from a famous movie of another era.
Starting point is 00:11:06 It's still a great expression. Is there something about them? And how does it, what determines whether it jumps from social media usage, which can be here and gone, to the mainstream use in conversation? Right. You can't force language change. That's why the woke academics couldn't make Latinx happen or all the other kind of like ivory tower
Starting point is 00:11:25 academic intellectual. Let me just tell you liberals didn't like that either. But go ahead. And that's because it felt forced. It felt like the word fetch fetch worked for a while there. Adam fetch worked for a while there. So you know language will follow the conduits of what is seen as cool or funny. That's always what it's been.
Starting point is 00:11:40 And that's what it is now. And there's some groups that have more social prestige. So they're seen as cooler. There's some groups that are good that have more social prestige, so they're seen as cooler. There's some groups that are good at coming up with memes, so they're seen as funnier. There's another factor of which groups are actually coming up with new language, and there are some groups that are simply producing more, and the more you'd produce, the more chances it has to go viral. So a lot of internet memes come from 4chan. 4chan has this need to demonstrate a shared performativity in this slang, because there's anonymous user accounts.
Starting point is 00:12:05 So to show that you're not a quote unquote normie, you have to play by their slang. And then they come up with new words as it's part of their culture. There's a lot with both platform design and user culture that kind of works circularly to either create new words or not. Like why is so much of our slang, Gen Z slang,
Starting point is 00:12:23 that's also made up, but why is so much Gen Z slang coming from African American English? It comes from the ballroom scene in the 1980s, which is this very culturally rich space that was trying to come up with new language to differentiate themselves from the straight white norms of the English language. They had a shared need to invent slang. And that's when slang gets invented, when there is a shared need, when these communities are created, when they feel a desire to come up with new words.
Starting point is 00:12:45 And some are better than others, right? Some, like as you were saying, if they're culturally cool or funny, then we capitalize on it broadly. And how do meme formats fit into this engagement rubric? Because they are. A lot of the early internet ones were really cloudy memes, but they worked really well. And some were phrases that were attached to memes because they always had a picture with them in some way.
Starting point is 00:13:04 Phrasal templates are very important, meme templates. There's carriers for ideas, like make X, Y again, if we're already dabbling in politics here. That's a phrasal template. My LinkedIn bio says making linguistics cool again. Earlier today, I was talking to someone and they said like, make teaching engaging again. And we say that without thinking that that happened
Starting point is 00:13:26 because Trump like popularized that phrase. It was spread as a viral meme and now it's just this, it's a carrier for other, because it's so easily re-adaptable, remixed with other new ideas. So it keeps taking on new lives every time it gets reused. Keep calm and carry on. Right, right.
Starting point is 00:13:42 So these grammatical skeletons, X is the new Y, like like you know, that kind of stuff, they underlie our language. They've always been like that, right? But internet memes make it easier to point out and see these happen. But basic memes have pictures or videos with them or something that you could put up a picture and people know, like the crying person, Britney Spears, the guy crying. Does that fit in here in this engagement rubric? Like a picture? Like, when you put up a picture, people do that to respond right away. You know, the classic like the distracted boyfriend meme, you could overlay different images onto him. I did a video about this recently. Every single sub meme of that died out faster than the distracted boyfriend as a whole. The distracted boyfriend kept surviving
Starting point is 00:14:23 year after year, even though individual means just come and go. Because here's the thing, memes are fads. They have lifespans. They have every single word also is a meme. It has a lifespan. Some memes have shorter lifespans. I think, you know, the word Yeet or the words on fleek had a shorter lifespan. Tell the people what it is. Sorry to do this to you, but... Oh, yeah. Yeet is an interjection popularized by Vine for when you throw something on fleek just meant looks good or cool. Also those are both Vine phrases and I use Vine as an analogy. It's sort of similar, it's the first time we have like sort of video based, it's not
Starting point is 00:14:56 personalized recommendation algorithms like we see today. So we have actually, the current algorithms allow you to incubate a lot more different things at once because it's not like everybody's getting shown the same feed like they were on Vine. But the point is these video platforms are very good at spreading memes. What I was going with that is that some memes have shorter lifespans, some memes have longer lifespans. So the words like selfie and cancel for like cancel someone online. Those were popularized around the same time as Yeet and Onfleek
Starting point is 00:15:21 and yet they still stick around. And another thing we hear is like how much do we perceive this as being a meme? Because the more we perceive it as sticking out, when your grandma starts using, you know, yeet, it's no longer cool. But selfie also, it fits a lexical gap. Like if there's a need for that word in our language. And so we adopted for that reason as well. I remember my kids saying yeet now that you said it. I was like, oh, still around in the sense of like, we use it as a callback to that era of time. And I think that's where skivvy is going to go. Right. I think skivvy might be on its way out. That's just a nonsense interjection.
Starting point is 00:15:51 It doesn't mean I do. You could say what the skivvy that should stay. I personally am a huge fan of people asking for my favorite brain rot word. I'll say skivvy. Why do some memes or words stick around and others die? So the questions of what makes a word stick, it's really multifaceted. Re-adaptability is a big one. If the meme can be used in a lot of different contexts, if it's easily applied to new situations,
Starting point is 00:16:13 that gives it an easy chance to jump from one use to another and keep surviving. If the meme fits a cultural need, if like I said, a lexical gap, it's easy to survive. And let me go back one step and redefine meme, because you said it's like pictures and videos, I said a lexical gap it's easy to survive and let me go back one step and redefine mean because You said it's like pictures and videos. I said it's a word. It's not very well defined at all It was first brought up the modern concept in the 1976 Richard Dawkins book the selfish gene where he describes it as a self Replicating unit of culture and a lot of his ideas are you know, it's sketched out an interesting concept
Starting point is 00:16:42 I don't think most people really agree with his sort of evolutionary scheme of how words and ideas spread. But there's definitely something to a unit of culture that people adapt across moments in time and how these stick around and they are constantly remixed as well. I think that's another thing. Mm hmm. Not unlike clothing and things like that, which is interesting to think about. Absolutely.
Starting point is 00:17:05 We'll be back in a minute. Support for On with Gera Swisher comes from Quince. Summer can be a great time to update your wardrobe. With Quince, you can find warm weather outfits that are timeless, feel luxurious, look elevated, all with a quality that goes beyond what you'd expect at the price point. Right now, Quince is offering 100% European linen tops starting at $30, washable silk dresses and skirts,
Starting point is 00:17:36 soft cotton sweaters, versatile warm weather pieces that you'll want to pick up season after season. The best part, everything with Quince is half the cost of similar brands. By working directly with top artisans and cutting out the middlemen, Quince says they can give you luxury without the markup. I've tried a lot of Quince stuff myself and I actually love it. I'm wearing their sports bra right now. I love it. I'm also wearing their pants, which are very loose and light and cool, given it's been so hot on the East Coast these past
Starting point is 00:18:02 few weeks. The material is really great. They feel really great. Plus, Quince says they only work factories that use safe, ethical and responsible manufacturing practices. You can give your summer closet an upgrade with Quince. Go to quince.com slash Kara for free shipping on your order and 365 day returns. That's q u i n c e dot com slash Kara to get free shipping and 365 day returns. Quince.com slash Kara. Hey, this is Peter Kafka, the host of channels, a show about media and tech and what happens
Starting point is 00:18:37 when they collide. And this may be hard to remember, but not very long ago, magazines were a really big deal. And the most important magazines were owned by Conde Nast, the glitzy publishing empire that's the focus of a new book by New York Times reporter Michael Grinbaum. The way Conde Nast elevated its editors, the way they paid for their mortgages
Starting point is 00:18:57 so they could live in beautiful homes, there was a logic to it, which was that Conde Nast itself became seen as this kind of enchanted land. You can hear the rest of our chat on channels, wherever you listen to your favorite media podcast. Blockchain is reshaping every aspect of society, starting with finance.
Starting point is 00:19:21 It's happening across industries, across sectors, and across the world. And it's happening with Ripple. With more than a decade of blockchain experience, over 60 licenses, and strong institutional trust, Ripple provides financial institutions with blockchain and crypto-powered solutions across payment and digital custody applications. This means secure 24-7 transactions, moving value across the world faster. Find out more at Ripple.com. So, let's talk about the business of language. After all, the social media algorithms are engineered to create engagement or addiction, however you want to look at it, in order to
Starting point is 00:20:01 sell advertising. It's a very clear way this works. So we'll start with our expert question. Every episode we get an expert descendants question for our guests. So let's hear yours. My name is Brooke Hammerling. I'm a communications advisor and a writer, podcaster of a weekly newsletter and podcasts called Pop Culture Mondays. And it is to help the olds understand what the youngs are talking about what's breaking and unraveling and the social worlds like TikTok. My big question for Adam is the ownership of a word. We live in a world now where people seem to take ownership of things that they have in their minds coined.
Starting point is 00:20:39 A great example is Pat Riley was able to trademark the term three Pete. What is the value of a word and can somebody actually own it? So for example, the terms that are really popular today like Riz, cap, no cap, delulu. There is an actual woman who made the term delulu really famous on TikTok. Does she have a chance of trademarking that and making sure that that is her word that she owns? Okay. I'm really glad that question was asked. I actually talk about Delulu in my book. I talk
Starting point is 00:21:14 about word ownership. So let's go back to the word on fleek. That was coined by Kayla Newman, a user who went by Peaches Monroe, and she was in a car and she called her eyebrows on fleek. The video went super viral. It got used by a lot of news outlets. Ariana Grande and Nicki Minaj used the word. Nicki Minaj got into a fight with another rapper over that rapper made t-shirts with the phrase pretty on fleek and Nicki Minaj claimed that was hers and she should get royalties. And it was so out of her hands. Two years later, Kayla Newman trademarks the word fleek. And by then it was dead. The meme was gone. So unfortunately, intellectual property laws can't catch up. And also, there's a difference
Starting point is 00:21:54 between a copyright trademark, like, you need to have like a business associated with it, you need to have like proof. So intellectual property wise, it's very, very difficult to own a word, right? So I think since the Enfleek era, there's been more of an attitude of let's give credit to creators. So Jules Lebron, this creator who came up with the word demure, mindful, cutesy, she had this viral moment last summer. She was able to fund her transition. She was able to afford a lot of new stuff because people gave her more recognition. I think that's a, it was people are still figuring out the norms of the internet back in the day.
Starting point is 00:22:30 But I think there's more of a cultural attitude now toward giving credit. So you can't actually own the word. It's out of your hands. It's out of your hands. Is there any word you think that could be owned? Someone really did. In the, in the trademark sense, if it's like a business, but if it's like a viral internet sensation, you can't get, it's just out of your hands.
Starting point is 00:22:48 There's more in place now for creators to capitalize on that because there's been growth in the creator economy. There's like lawyers who immediately reached out to help Jules LeBron with the word demure. So like there's more of a movement toward that. That being said, you can't ever own a word. And words will change, too. There's a sense of, like, group ownership as well over some words.
Starting point is 00:23:10 So a lot of the ballroom words, the slay, serve, queen, these are just, you know, slang words that middle schoolers now are using, but they came from the ballroom scene in the 1980s, this queer, black, Latino space. And many people feel like the middle schoolers shouldn't be using those words. But I'm sorry, it's out of their hands. Like at this point, you can't stop the words from changing. So you write that social media algorithms use language to create new identities for users, which could then be commoditized by the platforms. I'll pull up a quote from the book and have you read it. Language plays a circular role in identity formation. If you choose to use a certain word, you are accepting that you belong to the group
Starting point is 00:23:45 using that word. In the social media era, the algorithm will recognize that, push you deeper into that group, and give you more access to more niche language. So why do social media platforms have the financial incentive in creating new identities? Yeah, that's a really good question. So they've run on natural human behaviors.
Starting point is 00:24:01 We naturally want to pay attention to things, and they naturally create the incentive structures for influencers to try to grab your attention because they reward retention rate how long viewers watch the video. So attention is just an example there of what they're rewarding and I think our language is evolving around what gets attention. Humans also have a natural tendency to want to belong to groups. That's why people on the early internet sought out other people with similar interests. That's right, very first. And algorithms really play into that
Starting point is 00:24:28 by making you feel like you're part of a group. They'll push you further into the K-pop community and you'll start using words like delulu and they have so many like Korean loanwords in that community. I'm not in this filter bubble at all. We're all separated into different consumptive kind of niches.
Starting point is 00:24:43 What's really interesting here is that these are now demographics. In the past, a demographic statistic could be something like race, age, gender. Now it's whether you're Pastel Goth or whether you're a Cottagecore, whether you're a K-pop fan. These are all now labels that are used to represent you. Because what the algorithm is doing is they build a very nuanced picture of who you are. This is never actually who you are. It's a shadow representation using limited information, but it's fairly good.
Starting point is 00:25:05 And they'll take all this information, which videos you've liked, how your thumb rests on your screen, what wifi network you're connected to, what other phones are connected on that wifi network, all the usual stuff about cross app tracking and demographic information that they can. They turn that into a numerical representation
Starting point is 00:25:20 and embedding of who you are as a person. Each video as it's uploaded undergoes a computer vision algorithm, a natural language processing algorithm. That gets turned into a numerical representation of what the video's about. And then these numerical representations get paired with each other.
Starting point is 00:25:32 And that's how they know to send certain videos to certain users. These algorithms are predictive. They try to guess which videos are going to get the most attention by users. Because again, the whole logic is attention. How do we get people's attention so we can commodify it, so we can sell more of your data?
Starting point is 00:25:48 So because in-group behavior is something that is good for getting attention. You want to feel like you're part of a group, and the algorithm is set up to reward that behavior. It creates communities. It creates micro communities. Right, or it creates someone who goes across communities, right, for example.
Starting point is 00:26:02 Because when you're saying there's demographics, you know, sex, age, et cetera, there's also demographics of people who like to watch hardware hacks like myself, you know what I mean, or something like that. And because everyone's individually moving, they can then group them together in some way. Yeah, it's predictive because you're still in this cluster
Starting point is 00:26:19 of people who like hardware hacks, and it knows from other users' similar behavior that you might like this video if it contains the numerical representation of a hardware hack. But where we get into identity formation is very interesting because now that the algorithm incentivizes these labels to be coined and the same way I coined the phrase boomer ellipses, creators go out of their way to either coin or popularize phrases as trend bait because they want to tap into this perceived algorithmic space. They want to find ways to communicate to that imaginary representation
Starting point is 00:26:49 of what a cluster of people is. And so they find words like pastel goth, cottagecore. There is a latent kind of desire for that word. The word is popularized by creators. Algorithm pushes it further, it becomes more of a thing. Now as a pastel goth. In the past, let's say in the 80s, 70s, if you were a goth, that was counterculture, right?
Starting point is 00:27:06 It was a broad label. You could be a lot of different things if you were a goth. Now you have to be a cyber goth or a trad goth or a retro goth or a pastel goth. And now that you're in this smaller, perhaps category, circularly forming your identity around this, because every time you get a pastel goth video, you're like, oh, the algorithm really knows me, forgetting that the algorithm gave you that identity. Now you circularly identify with a smaller category of what you can be, which potentially limits your true self-expression because if you were just a broad goth, that contains
Starting point is 00:27:32 way of a wider semantic range. A myriad of goth, multitude of goths. Yeah, no, it's interesting. All groups do this. I went to a neo-Nazi rally in Germany. I was covering once. And the amount of different Nazis was really interesting and I was fascinated by,
Starting point is 00:27:49 they had all had different costumes, but I was thinking, oh, they're not, they were together so they got to interact with each other. But now on TikTok, of course, you can sell, they can sell you pastel goth clothes, by the way, so that they do see an opportunity. It's one click away on the TikTok shop, conveniently. Sure, which is the point of kids.
Starting point is 00:28:06 Kids, in case you're interested, they wanna sell you shit. It always works for the house, just remember that. Algorithms haven't just led to the creation of new words and identities, it led to new accents. You described three of them in your book. You mentioned them earlier. The entertainment influencer accent, lifestyle influencer accent,
Starting point is 00:28:21 and educational influencer accent. So explain the characteristics of these accents, how they developed, and say a sentence or two in each one, so we can hear the differences. Let's start with the entertainment influencer. If you want to add one in, please do. Right. Influencers always communicate for their perceived audience, for the algorithm as well. Again, there's a few layers of performativity happening here. And identify like different
Starting point is 00:28:43 types of influencer accents based on this audience they're accommodating for. The entertainment influencer accent is sort of downstream of Mr. Beast, and there's this term beastification that's been going around, and there's a lot of influencers who are trying to mimic this, but it's basically just making every word really pop.
Starting point is 00:28:56 I just bought this private island, I'm giving away a million dollars! But if you look at any real interview of Mr. Beast talk, he doesn't talk like that, right? And it's very intentional, it's very deliberate. Last year, an employee of MrBeast leaked a 36-page onboarding memo elaborating his exact strategies for going viral. He's extremely deliberate with it.
Starting point is 00:29:14 He talks about retention every single page of that memo. He's very methodical with it. He's extremely analytical. He's good at gaming the algorithm. He didn't get there by coincidence. So MrBeast figured out this accent that really, really works. I'm talking to a different audience. I'm not talking to brain-rotted 14-, really works. I'm talking to a different audience.
Starting point is 00:29:25 I'm not talking to brain-rotted 14-year-olds. I'm talking to somewhat brain-rotted nerdy people. You know, you got to be a little bit brain-rotted. And I'm using this not sort of in a joking way, you know, but I will talk really quickly. I'll stress more words to grab your attention because that's what works for my audience. It does. And you see a lot of influencers kind of also talk like that. Also you got to keep in mind that successful strategies self-replicate, sometimes people just start speaking
Starting point is 00:29:46 a certain way because they assume that's the correct way to speak online. There's also a huge survivorship bias in what gets shown on your For You page. The videos that end up on your For You page are ones that are predisposed to go more viral. Now we get back into like, what makes something culturally click.
Starting point is 00:30:00 So I want lifestyle influencer accent. I know this one, but go ahead. Hey guys, welcome to this podcast. We're talking about accents. You'll notice the rising tones that kind of like keeps the viewer hooked because it sounds like something's always coming next. It feels dead air.
Starting point is 00:30:14 They elongate their vowels. It feels dead air. Dead air is really bad, especially when they're working on an extemporaneous capacity. They need to fill that dead air. And the sort of lifestyle influencer accent has evolved kind of out of all these evolved kind of out of all these,
Starting point is 00:30:25 sort of out of previous accents. Maybe not entertainment, but like my accent is based on early founders like the Green Brothers and Vsauce and stuff like that. I don't think I was consciously imitating them, but I sort of, I started out speaking slowly, and there's a subconscious cue taking as well. And I was interviewing a lot of creators about
Starting point is 00:30:42 how'd you end up with this accent? Some say, you know, I did this consciously looking at retention. Other people said, I did this subconsciously just looking at what other people did. And a lot of us just, there's some level of taking our cues from other people. There's some level of maybe we just get
Starting point is 00:30:52 behaviorally conditioned by the algorithm as well. So there's a few layers of that. Do a sports one or is there one that's for sports? Is there one for? What I like to compare this to is really not that different from the broadcast voice. This just in. We've always been talking like this because you're accommodating for a certain audience for a certain medium. I'm a strong believer that the medium is the message. Each new medium
Starting point is 00:31:13 and I think algorithms are that new medium will affect how we communicate online. Yeah. Social media algorithms don't just affect new words and how accents. They also shape which ideas get attention because in order to go viral, videos generally need to create the most extreme and reductive version of a concept in the most confident and emotionally engaging way possible. Talk about the downstream effects, because they're always confident. The expression I always use is frequently wrong but never in doubt. A lot of the maha ones are like that, like drive me, I'm like wrong, wrong, wrong, wrong. But they sound, I almost believe them even though I know better. So talk a little bit about, and
Starting point is 00:31:50 it's very dangerous in those cases, many of the times, you know, please chug, you know, apple cider vinegar, please don't. Right. That kind of thing. So talk about the downstream effects on our culture. So, right, algorithms reward extreme behavior. And also the chug vinegar thing, for example, that will generate a lot of comments from people saying, hey, don't chug vinegar. At the same time, those comments are engagement.
Starting point is 00:32:11 Engagement pushes videos further in the algorithm. There's comments of, like, things that drive confusion, things that are on the boundary of irony and authenticity. We tend to see those ideas spread really easily because they get that extra boost of engagement, which is, like like really paradoxical. Rage bait, unfortunately, incredibly good for grabbing people's attention.
Starting point is 00:32:29 Something that infuriates you, you keep hate watching out of spite, or you comment out of anger, and that pushes this stuff further. So unfortunately, when we say that platforms are rewarding things that grab your attention, that's not necessarily things you want to see. The videos you want to see and the videos you actually get have a disconnect and people
Starting point is 00:32:48 constantly feel this and there's like a lot of studies on how people try to go out of their way, spend so much time training their algorithm to show them videos that they'd rather see. Because unfortunately, your base instinctual reaction to a video is not what your like higher order self is actually wanting to. No, right, right. Ladies and gentlemen, Donald Trump. I always said he was the greatest internet troll in history.
Starting point is 00:33:08 I, I have a lot of thoughts on how Donald Trump is uniquely suited to the algorithmic medium. If we're talking medium is the message. Well, please go ahead. Tell me. Well, you know, it's historically been thought that different mediums affect candidates' electability. Like, it's believable that Kennedy outperformed Nixon in the 1960 election simply because Kennedy was more photogenic on TV and this was when TV was introduced and Nixon was more of a radio candidate or whatever. It seems probably true that more attractive candidates have a better chance of getting elected because of television.
Starting point is 00:33:37 In the same way, more memeable candidates probably have a greater chance of getting elected in the algorithmic era. The fact that Donald Trump's phrases make X, Y again, this has been the worst X in the history of Y perhaps ever. The fact that these phrases can be so easily remixed and so easily adaptable, he talks differently in a way that is algorithmically compelling because it's one, it's extreme, two, it's like readaptable, all of these memetic qualities that make something stick. Perhaps normalize his ideas, perhaps like cause him to dominate. In the same way he dominates the news cycle fairly well, he also dominates the internet cycle through his memeable character. And I think that if it wasn't for the algorithms, I don't think like he maybe would have been
Starting point is 00:34:17 reelected. Yeah, I would agree. One of the things that I said this on a show the other day, I go, he's so good at it and he's so appealing and this and that. I got so much pushback. I'm like, it's factual. Like whether you like it or not, I go, he's so good at it, and he's so appealing, and this and that. I got so much pushback. I'm like, it's factual. Whether you like it or not, I didn't say I liked him. I said, he's excellent at it,
Starting point is 00:34:31 and people don't wanna give him that credit. Intentionally or not, there could still be a survivorship bias, but I think he probably knows what he's doing. No, I think he does it naturally. I think some people are intuitively good. I think Kennedy was intuitively good at TV. I think Roosevelt was intuitively good at radio. You could go back. Hitler was excellent
Starting point is 00:34:47 at radio, by the way, FYI, and speaking in groups. And Trump is very good in public settings with rallies, and he's very good in this. And I'm not giving him credit. I'm just saying he – I did a column once where I compared him and AOC, and I said they both have the same qualities. And I was comparing their qualities and people lost their minds. I'm like, but just look at it. You need to be this to be in politics going forward. There's a bi-modal representation
Starting point is 00:35:13 of what political views we're getting right now, which I find highly concerning, right? AOC and Marjorie Taylor Greene have more extreme beliefs. So they're more likely to get pushed by the algorithm. The congressman I grew up with in Albany's 20th congressional district, Paul Tonko, he's boring. No, but like if he says anything, it's just the mainline democratic kind of idea. He's not interesting. His ideas are never going to go viral because they're not algorithmically catchy.
Starting point is 00:35:35 We'll be back in a minute. On August 1st, may I speak freely? I prefer English. The Naked Gun is the most fun you can have in theaters. Yeah, let's go. Without getting arrested. Is he serious? Is he serious? No.
Starting point is 00:35:58 The Naked Gun, only in theaters August 1st. No frills, delivers. Get groceries delivered to your door from No Frills with PC Express. Shop online and get $15 in PC Optimum Points on your first five orders. Shop now at NoFriels.ca. Hey, we know you probably hit play to escape your business banking, not think about it. But what if we told you there was a way to skip over the pressures of banking? By matching with a TD Small Business Account Manager,
Starting point is 00:36:27 you can get the proactive business banking advice and support your business needs. Ready to press play? Get up to $2,700 when you open select small business banking products. Yep, that's $2,700 to turn up your business. Visit td.com slash small business match to learn more. Conditions apply.
Starting point is 00:36:46 So one of the things that you write quote, we've been conditioned to consume information only if it's somehow funny or relatable. Talk about Zoran Mandami, incredibly relatable, incredibly funny, incredibly, I would say substantive. Also, he manages to really get some very substantive messages out there. Talk about him in particular and how it's changed
Starting point is 00:37:06 the idea of being funny, relatable, or just you break through and then you compare it to what Cuomo put out the other day and I just wanted, I was like, no stop. I almost, I don't even know his number, I know his brother, I was like, please get him to stop. What does he even do to, yeah. Stop, like stop making videos at least at the very, you can run all you want, but videos are not your forte.
Starting point is 00:37:26 As someone who lives in New York right now, you can really feel the actual energy. So there's a few things going on here. One, there's that meme as carrier idea that these videos are carriers for his actual message. And he was actually very good at being intentional with both playing into this format of the algorithm and at the same time holding something within it, this actual message about affordable housing. He was very consistent with that. He was very good with that. The other thing is authenticity.
Starting point is 00:37:49 And this is just a classic buzzword in the creator economy that people want authenticity. Mondani seems like a real guy. Every time like some new clip surfaces of him, it's just like, this guy's like down to earth relatable. Like he seems like a good dude. And I think we're so tired of, we all know that the politicians are performing
Starting point is 00:38:08 for algorithms and how do you give off that vibe of authenticity? I don't know, that's sort of like also maybe a mimetic thing just like what naturally feels right. Or they're authentic in a way that's unattractive, like Joni Ernst and the dead thing with the thing. I was like, oh my God. I'm certainly not attracted to Joni Ernst.
Starting point is 00:38:26 Well, I know, but she actually stressed it to me. I was like, okay, you really are a terrible person. You managed to authentically get through that you're a jackass. It was really interesting how she did that. Mondani is a particularly gifted communicator, absolutely, in the online space. And you watch his, they're not tricks, I don't want to call them tricks, because Trump has them too, in the way he uses all caps
Starting point is 00:38:51 versus Mondani who is, does, I think his smile is part of it. Yeah. Even the visual kind of language, like walking out of the beach with a suit on, like all that, like, he knows that that will generate comments probably, or at least his media team knows, I think he probably knows, he seems very savvy. He knows that that will generate comments probably or at least his media team knows I think he probably knows he seems very savvy
Starting point is 00:39:06 He knows that people are now gonna be commenting about the suit dripping with water and that's gonna push it further in the algorithm Right. So a lot of Americans see progressives as language police in part because the right has done a good job framing content Moderation is censorship. I this is I think this is much more complex as I'm sure you do The voters seem to have punished them for them. As Brock Collier reported in New York Magazine, former Bernie voter told them that Trump inauguration party that he, quote, wanted the freedom to say faggot and retarded. I'm sorry to say those, but I just did. Why do taboo words hold so much appeal to some people and how have social media platforms
Starting point is 00:39:40 reacted especially with Donald Trump and the cultural change he's brought with him. Right, well, I want to start with breaking down progressive versus woke. So Mumdani's the new kind of progressive, maybe that the left should be modeling off of, where doesn't feel like he's policing you, right? He's just, he's raiding out good energy. And I think the traditional woke ivory tower academia
Starting point is 00:40:04 radiates out bad energy. They're saying, if woke ivory tower academia radiates out bad energy. They're saying, if you're not with us, you're racist. And that feels really bad to hear. And so you start thinking, I don't like these guys. They're making me feel bad about myself. Versus if you just send out positive vibes. So it's sending out negative vibes or sending out positive vibes for the left.
Starting point is 00:40:19 And that's how I think the left should be communicating more by sending out positive vibes. The right, funnily enough, like you think, oh wow, they're intolerant or something, their language is more inclusive. They are more willing to use all kinds of language. They're not telling you, oh, you can or cannot say this. The left will tell you that.
Starting point is 00:40:36 The left will be exclusive with language. And it's that attitude of inclusivity that allows them to Trojan horse through ideas, use those memes as carriers. That said, you then get permission to just be an asshole. I had someone say a similar thing to me, like now we can say these words. Now we get to say this, this, and this,
Starting point is 00:40:54 and I said, you know what, you're right, but you're still an asshole. And then he like shut up, and I thought, well fine, go ahead and say them, but it means I think you're an asshole. So here we are. As I noted, I do think President Trump is a genius at creating or co-opting language that not only builds identity, it's a recruitment tool, it communicates an entire worldview, and he uses terms like globalist, deep state, swamp, fake news, liberal elites, America
Starting point is 00:41:19 first. Sounds like SEO language, honestly. Especially fake news and deep state. I think you really push those around, but he also absorbs other subcultures, crypto bros, incels, QAnon followers. And there's a whole like dog whistling kind of like babahoya. Why haven't Democrats been able to do the same successfully? And if you were advising them, what would you tell them to do differently?
Starting point is 00:41:38 I know they have groups together to discuss how to talk to young men, which makes me cringe, if I want to use a term, it's cringe. What would you tell them to do differently? Start thinking about how to Trojan horse through ideas. There's always going to be a group that doesn't want to hear your ideas. You need to push it through by packaging it inside a carrier meme that is more funny or more compelling. Look at the Manosphere.
Starting point is 00:42:01 Look at their language like Sigma, which is now sort of a viral phrase middle schoolers will use it. Explain what it is. Yeah, Sigma refers to like, it's complicated in the Manosphere. It refers to an idealized man outside the socio-sexual hierarchy of alphas and betas, but practically it can be like a synonym for like dominant man. And that's how it's spread. It's spread through Sigma Wolf memes, which were just funny. It spread through carrier sentences like what the sigma, which just sounds absurd in a way that middle schoolers are ready to adopt.
Starting point is 00:42:30 Not just middle schoolers, but it sort of became a brainwrap word. And they packaged these words through funny concepts. But with it, they carry their hierarchy of looking at the world, their lens of perceiving all dynamics between people as power structures. And that's, I think some degree of it is lost when you package it, of course. Like, I don't think middle schoolers are really thinking about incels when they say what the
Starting point is 00:42:55 sigma, I think they're just using that word to relate with each other. But a lot of people find incels really repulsive and wouldn't want to use their language. So how did their language hit the mainstream? They're extremely good at weaponizing memes. Some of it was taken inadvertently. Some people use that language to make fun of incels and there is that boundary between irony and authenticity that generates more comments that I was talking about.
Starting point is 00:43:15 But in many cases, there's 4chan trolls and radical incels intentionally repackaging their ideas as means to spread them further. And a lot of meme templates, there's this gigachad or crying Wojak memes that kind of push their ideology. The Chad stride versus virgin walk, you don't have to really know what that is, but it's like a sort of a categorization of people that's pushed and with it, their way of thinking is pushed, but people see it as a funny meme, a way of labeling the world, but now you're
Starting point is 00:43:47 also... Right. And so they don't realize they may suddenly be recycling Nazi ideas, right, for example. And so my advice for progressives is, well, one, don't do exclusive feeling stuff, package things through good vibes. Maybe listen to some more Stevie Wonder. I've been listening to a lot of Stevie Wonder recently, and I'm going somewhere with this. This man, you listen to his album Songs and the Key of Life. It's amazing. He spreads such joy in that song, not only in his words,
Starting point is 00:44:16 but in the melodies. He talks about things that are very important to him. He talks about hardship and pain and loss and kind of poverty and discrimination, but he always turns into this funky, upbeat 70s groove and it's just catchy, it's vibey, you want to dance to it, and then maybe you connect with the song more and you actually take heed of his message that he's trying to spread out in this. But the songs always feel positive, they always feel like anybody could dance to this song. I don't care if you're Republican, I don't care if you're a Democrat, let's vibe to some Stevie Wonder. So going back to the previous question, now how social media platforms reacted to the election of Donald Trump and cultural change he's brought with him, which is astonishing
Starting point is 00:44:51 given how old he is, have they changed how they weight their algorithms? Yes, this is very important and such a wonderful question that you asked. Immediately in the wake of the 2025 election, I mean, we saw that picture of all the CEOs lined up behind Trump and we saw that meta loosen their content guardrails. They now allow for a lot more AI content. They now allow for a lot more like, uh, they took away all their kind of woke stuff. So there's some really racist AI slop on Instagram right now. I did a piece recently about there's a reel with 30 million views about a swarm of shirtless black men running towards a KFC and eating
Starting point is 00:45:25 fried chicken. And the underlying audio was the N-word repeatedly. That's not something you would have seen under Joe Biden. Right? And so with the election of Donald Trump, the platforms are, you know, they're going to mold themselves to the political regime as well. I'm sure if there was a huge left-wing backlash, they'd go back to doing the DEI stuff or whatever. Yes, I used to joke that if Kamala had won, Mark Zuckerberg would be using the terms they/them, but go ahead. Literally, I think that's it.
Starting point is 00:45:49 But the platforms reflect the current political situation. At the end of the day, they're just trying to make money. And always. Those racist AI slop videos do make money for them. In fact, they make more money. The more they push the AI slop, the less money they have to give out to real creators. And I interviewed some of those racist AI creators, and I tried to be impartial and ask them, hey, why are you making these videos?
Starting point is 00:46:08 What's the underlying motivation? And you know what all of them said? They said, I'm doing it for views. I'm doing it for likes. I'm doing it for followers. They didn't say I'm racist. And they are on some level, but I don't think they're doing this out of genuine malice.
Starting point is 00:46:20 They're doing this because, and a lot of these are AI hustle people as well, they're trying to get people to do their AI stuff. There's a banality of evil here, a banality of the algorithm: you create an institution and people will fit themselves into that institution looking for reward, just complacently perpetuating the bullshit that these platforms set up. And there are changes in weighting, definitely. Like, we see that. Why did Grok go racist on Twitter? And to some degree, we don't actually know what happens with algorithms.
Starting point is 00:46:51 I do want to caveat that even engineers don't know. They call it a black box, because once you program it, it has so many parameters you don't know what's happening. You put in an input, it spits out an output. You're like, I don't know how that got there. You have a general idea, though. They clearly tweaked something, because Elon Musk asked them to tweak something and then Grok goes racist, goes pro-South African farmer or whatever. Hmm, I wonder who's like that. They were training on his stuff. That's what I heard.
Starting point is 00:47:16 That's what they were doing. Okay, so you have something going on with the inputs, you have something going on with the reinforcement learning, the training data, and then you have this crazy output. Well, let me just say, this is why I have dubbed Mark Zuckerberg the most dangerous person in the world, because he doesn't care. Any way he can make money, he does so. He's here to increase net worth. Exactly.
Starting point is 00:47:33 So we know people are changing the way they write to avoid the perception that their work is actually created by AI. At the same time, a lot of people are going to use AI to do their writing for them, including the influencers. Generative video will become increasingly good, and it is becoming.
Starting point is 00:47:47 All the memes about horses jumping off of diving boards or zebras, fantastic. I've seen those, yeah. And there was one the other day that was so good. I was like, this is fantastic. There's full text, there's five fingers. Like, we're in the era where we can't really tell what's real, what's not real. You know, what was interesting, when they were bad, someone was like, oh, see, they're bad.
Starting point is 00:48:04 I'm like, they're not going to stay bad. They're like, just go look at the early internet. So I mean, if you look at the early internet, you wouldn't recognize it, it was so bad. And now it's not, you know. So do we get to a place where AI-generated influencers become popular, and if yes, what does that do to our language? So okay, we know that they want this. We know that Meta is experimenting with AI-generated comments, AI-generated profiles. They're actively incentivizing people to create AI-generated accounts. I've
Starting point is 00:48:28 come across a lot of AI-generated accounts on Instagram that at first glance really do look like real people. I think there will always be a need on the internet for just raw humanity. And there's this aesthetic called internet ugly: anytime you try to impose a world of smooth gradients, we're going to come up with something messy. Look at the Italian brain rot memes earlier this year, which were a series of AI-generated animals with absurd kind of torsos. And I think that was a serious cultural reaction against AI. It was like we had this new, like, smooth kind of software, AI's finally getting good,
Starting point is 00:49:01 let's make the most ridiculous thing possible. And that was an absurd human reaction. We're going to continue doing kind of ugly, gritty, authentic feeling things with the internet because that's what we crave, real humanity. AI representations of reality will always be necessarily a flatter version of reality. It's a map. The map can never be the territory. They'll never be fully caught up to the way we use language.
Starting point is 00:49:21 We use slang, particularly. Ask ChatGPT to talk to you in slang. It's gonna sound clunky. It doesn't sound real, because they don't actually understand pragmatics. They don't understand how language is used in context. So I strongly believe that humans will always find a way to creatively be one step ahead of AI. And yes, yes, AI is here. There's a real reality that we won't know what exactly is certain. There's sort of this epistemic kind of confusion going on.
Starting point is 00:49:49 At the same time, I feel optimistic, and another thing is we might see a cultural backlash against algorithms. We're definitely seeing a cultural backlash against AI, but there will be more people seeking out these decentralized platforms: Bluesky, Substack, individual groups, Discord. I think the algorithms will remain the underlying infrastructure of mass communication on the internet. And it's very important we talk about this. And I think this book will continue to be relevant
Starting point is 00:50:11 because that medium is going to continue affecting us. The bulk of it. It's the bulk of it. Right, right. But individually, I think we're going to be seeking out more of that anti stuff. And the stuff that does go popular on the internet will always have a messy humanity underlying it.
Starting point is 00:50:23 Mostly, we'll see. I think they'll get very good at it. Well, even with the AI-generated stuff that feels like, you know, that is AI-generated, it's still prompted by somebody who knows how to tap into the zeitgeist. Absolutely. Maybe there's still something human about that. So when we're talking about this, though, the people in charge: TikTok is owned by ByteDance, which ultimately answers to the Chinese Communist Party. Instagram is owned by Meta, which is controlled completely by Mark Zuckerberg.
Starting point is 00:50:48 YouTube is owned by Alphabet, which is still controlled by Larry Page and Sergey Brin. And X, which is much smaller but still influential, is owned by Elon Musk. Does this mean that this small group holds some of the greatest influence over how the English language evolves? Language and culture, the algo-garchy, I guess. All these platforms are going to continue monetizing our attention, and they've found the most addictive way to do that: short-form vertical video run through personalized recommendations.
Starting point is 00:51:17 That will continue, like, even as, however many people try to go offline, until they find a more addictive medium, that's going to continue dominating our culture. And yeah, they kind of have these baked-in platform incentives that, in the sort of banality-of-the-algorithm sense, creators will replicate. I unfortunately do think that we will continue seeing language evolve under their kind of auspices. However, I end the book on a positive note.
Starting point is 00:51:42 I do think language is a reflection of how humans relate to each other. And we will continue being human and continue using language in a way that makes us human. And we might spend less time on these platforms, but at the end of the day, we are human, and they can't take that away from us. I'm going to push back on you, because you won't even know, you don't even know they're doing it. That's the thing. That's where we'll get to.
Starting point is 00:52:05 I think we need radical awareness of what they're doing. Yeah, that's kind of one reason I really care about this book, because it sort of exposes this stuff, and I'm hoping, I'm working on more stuff with this, with media theory, going down the McLuhan route and stuff. But we need to be very, very aware of what's happening, and then we can make our own choices. In the same way you compare this to, like, cigarettes, where people just didn't know, or, like, sugar or whatever, there have been a lot of times throughout history where there's this really addictive product and people just weren't aware of how bad it was, and once we become more aware, we can make our own decisions.
Starting point is 00:52:36 Except in terms of sugar, we've never been fatter, we've never been more unhealthy, at least in the United States. So we know, but we don't care. And that's the problem. So you say social media is neither good nor bad, it's messy. I have a different opinion. Well, I think it's a tool. I think it is. Yes, yes, the thing itself, yes.
Starting point is 00:52:52 So, you wrote this: it can be tough to tell who wins and who loses in the algorithmic era of language change. And that's fair, but it's a bit of a cop-out. I want to push you: who wins and who loses when algorithms designed to increase engagement and addiction have an outsized influence on the words we use and the way we talk to each other?
Starting point is 00:53:12 I think we can make a very easy argument that it's a cruder culture because of this. I want to say that culture and language are similar but different. Language is a proxy for culture here, and I explore a lot how culture bleeds in and how language influences culture. I don't think there's anything ever bad with language itself. If I'm just talking from a linguistic perspective, not a cultural theory perspective, with language there's no such thing as, like, brain rot, for example. No word is neurologically worse for your brain than any other word.
Starting point is 00:53:39 At the end of the day, language is a way that humans have to identify what's happening in the world and talk to other humans about it. So the language is fine. Culturally, I do agree we have a lot of problems we've got to sort through. So who wins and who loses? Well, I think humans do sometimes win, when we have memes and this sort of stuff I was talking about with Italian brain rot. I think there is a positive way that we reclaim our own agency, and when we do move to other platforms, I think we should be mixing our media as much as possible.
Starting point is 00:54:10 I don't think algorithms are completely bad, because they elevate some voices that haven't had voices before. I think we should be mixing our media. Totally, enjoyable. Yeah, sometimes I'm like, I love some of this, and I realize some of it is really dangerous. So you talked about Stevie Wonder's Songs in the Key of Life, which makes me love you now. That's my era, that's my era. But which words are on their way out and which are super popular?
Starting point is 00:54:34 Skibidi's got another year left, that's my call. It's gonna die out in the way yeet died out. Which words are growing in popularity? We have words that are more under the radar that are going to stick around in
Starting point is 00:54:50 the same way selfie stuck around, right? Like in the same way cancel stuck around. So like low-key, for example, side-eye. I talk about these in my book as examples of words that don't stick out as, like, quote-unquote brain rot, but have recently been popularized by algorithms and are maybe going to remain in place in our language. And I don't think those words are bad at all. And in fact, I don't think... No. Can riz go? Can riz please go? Riz may have more of a chance to survive than skibidi, honestly.
Starting point is 00:55:09 But yeah, I don't know. I love the word skibidi. Oh, God. What is wrong with people? Riz, stop saying it. You're individually defining culture from a subjective perspective right now. I am, and that is correct. And hence, that's why I'm so famous.
Starting point is 00:55:20 Adam, this is a fascinating book, and you're an incredibly erudite and smart guy in thinking through these things. And it's great that people are looking at this, and I really appreciate it. Well, I really appreciate talking to you. Thank you for pushing back a little bit. On with Kara Swisher is produced by Christian Castor-Russell, Kateri Yocum, Megan Burde, Allison Rogers, and Kailin Lynch.
Starting point is 00:55:44 Nishant Kurwa is Vox Media's executive producer of podcasts. Special thanks to Kate Peterson. Our engineers are Rick Kwan and Fernando Arruda. And our theme music is by Trackademics. If you're already following the show, all your base are belong to us. If not, you're unalive. Go wherever you listen to podcasts,
Starting point is 00:56:04 search for On with Kara Swisher and hit follow. And don't forget to follow us on Instagram, TikTok and YouTube at On with Kara Swisher. Thanks for listening to On with Kara Swisher from New York Magazine, the Vox Media Podcast Network and us. We'll be back on Thursday with more.
