CoRecursive: Coding Stories - From Hacker News to TikTok - How Algorithms Learned to Hook Us

Episode Date: March 2, 2026

Corey told me about his AI cat reel problem. He found these AI-generated cat videos hilarious. Who makes these? He kept sending them to his wife. Then he tried to stop watching and he couldn't. So I went down the rabbit hole of how social media algorithms actually work. It starts simple. Upvote, downvote, sort by time. But by 2017 Facebook has a metric that quietly reshapes what two billion people see. Then a leaked playbook lands, and a CEO takes the stand in Los Angeles. Today is an investigation into what happens when the algorithm knows you better than you know yourself.

Transcript
Starting point is 00:00:00 Hi, this is CoRecursive. I'm Adam Gordon Bell. And today I have Liam with me here. Hello, Liam. Good to be here. So, Liam, I got this message from Corey, a developer I used to work with, and he ran into this weird problem on Instagram, and he wanted me to look into it. We're cat people, so finding, you know, funny cats on Instagram Reels feels like a supernatural thing. But once, was it Sora? When that showed up, there were so many AI-generated cat videos. Cat videos were already like crack on, you know, these social media algorithms. And my feed just turned into nothing but AI slop cats. So the thing that happens, Liam, is he thought they were hilarious. He started sending them to his wife. And then eventually his wife told him to stop, right? But he couldn't stop, because now the algorithm had learned he liked AI cats. Yeah, it knows. Now he can't get away from them. That's the problem.
Starting point is 00:00:59 Couldn't he just not watch them, or stop going on that social media platform? What's he trying to solve here? Yeah, I asked him that myself. It does affect me sometimes in daily life where it's time to put the kids to bed and you can't do much when you're putting your kids to bed. You've got to keep them on track, but you can't also, like, do it for them. So you're just kind of standing there in this awkward, you know, traffic control situation. And sometimes that goes on for like an hour and you've been scrolling the whole time. And you've spent an hour just on, you know, five-second videos, watching who knows how many of them.
Starting point is 00:01:35 And you come out of that and you don't feel great. So he's a developer and he can't beat the algorithm. And if he can't beat the algorithm, then who's going to be able to beat the algorithm? And what are you trying to solve here? What are you trying to figure out for him? Yeah, so that's a good question. We both have a developer background. And I was kind of interested just in how these social media algorithms work.
Starting point is 00:02:07 So this was a great prompt. But I thought this would be very quick, and it was anything but. So I started Googling around, right? Like, how does the Instagram algorithm work? But that's not really the type of answer I was looking for. That's like influencer tips on how to make your various posts spread. So I had to go further back. Before any of these platforms existed, you mostly chose yourself what you saw online.
Starting point is 00:02:37 There were blogs you subscribed to. You bookmarked things. But then came Slashdot, and then Digg, and then sites like Reddit and Hacker News changed the model, from editors deciding what mattered to crowds voting. People post links and you get upvotes or downvotes, and popular stuff rises to the top. And the algorithms behind this were remarkably simple.
Starting point is 00:02:59 The original Hacker News ranking algorithm, it's public, it's written in Arc, a Lisp dialect. And honestly, the best way to understand it and all the similar algorithms is Flappy Bird. What do you mean by Flappy Bird? Do you remember Flappy Bird, that game? Do you remember it at all? I think I only sort of saw it on the internet.
Starting point is 00:03:19 I don't think I really got into it. So in Flappy Bird, you're a bird in like a Mario-type world that's a side-scroller. And all you can do is press the space bar. And when you press the space bar, the bird goes up, and then it slowly starts to fall. And so there are various obstacles. And you're just pressing up to, like, give it enough lift to get through the various things as the landscape scrolls through. So it became this sensation briefly, like Wordle.
Starting point is 00:03:46 But it's very simple. You just have one thing that goes up, and then gravity slowly pulls it down. And so this is exactly how all of these early ranking sites worked. Hacker News, in fact, calls theirs gravity. So you go to a website, and there's a bunch of links listed, and when you hit upvote, that's like pressing the space bar on that item. So the ranking of the first page is
Starting point is 00:04:10 like a whole bunch of parallel Flappy Birds. And all the people on the internet are, like, pressing the space bar, raising them up or down. And gravity is pulling them down over time, but the space bar of the people is pushing them up. The way that these algorithms can be tuned, you have like a time decay. So basically, you can imagine each story gets heavier over time. So when an article first appears, it only takes a couple of upvotes to push it to the top. But the longer it's been up, the heavier and larger that bird gets, and the more people it takes to raise it up.
Starting point is 00:04:42 It's the simplest algorithm you could describe. Like, it's kind of neat to see, but really it's just simulating gravity. And with that decay over time, they can tune how often the stories turn over, right? How often do you have new things? How long do things stay up so that people can comment and react? So that's it.
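For the curious, the formula itself fits in a few lines. Here's a minimal Python sketch of the widely cited published version (the real code is in Arc, and the constants are tunable):

```python
def hacker_news_rank(votes: int, age_hours: float, gravity: float = 1.8) -> float:
    """A sketch of the published Hacker News ranking formula.

    Upvotes push a story up; the (age + 2)^gravity denominator is the
    'weight' that drags it back down as the story gets older.
    """
    return (votes - 1) / ((age_hours + 2) ** gravity)

# A day-old story with 100 points ranks below a fresh one with 10:
print(hacker_news_rank(100, 24))  # ~0.28
print(hacker_news_rank(10, 1))    # ~1.25
```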
Starting point is 00:05:02 That's like the simplest version of what eventually becomes Instagram and TikTok and so on. All right, so every post is like a bird in Flappy Bird. It gets boosted, goes up, and then eventually gravity makes it come back down. Yeah, but like the crazy thing to me, Liam, is I spend a lot of time on Hacker News, sometimes on Reddit. There was a while I was really into custom keyboards, like, for my computer. And I spent a lot of time on the mechanical keyboard subreddit. Whatever the size of the community, there were always interesting things to find. But it's so small, right?
Starting point is 00:05:38 and so simple. There's no machine learning, there's no personalization, no tracking. It's just votes and time. It came about in 2005. It's nothing fancy, but it creates this unified front page.
Starting point is 00:05:51 It creates this experience where all the people in a certain group, you know, they have their own front page of things that are interesting. It gives the community something common to talk about by bringing everybody together
Starting point is 00:06:04 and voting on something. So it is a little bit like a shared home page, which is different to a typical feed, where it's entirely bespoke and personalized to you. Yeah, this is like the early versions, right? And yeah, it has that advantage where on Reddit, if you're into woodworking, the woodworking subreddit surfaces interesting things, right?
Starting point is 00:06:23 If you're into politics, you get the most talked-about debates. And what it's doing is it's searching through the interests of the people paying attention. If you're voting, you can think of it as this simple algorithm putting forth a bunch of interesting woodworking ideas, and then you pick one, and that's probably interesting to somebody else in the community. So it's kind of searching through everybody's interests
Starting point is 00:06:45 to find the things that are most interesting in common with all the people who are there. But here's where it gets interesting, right? On Reddit, there's another button, and it's called Sort by Controversial. So instead of showing the things that get upvoted the most, Sort by Controversial shows the things that get the most upvotes and downvotes at the same time. The interesting thing about that is a lot of times the conversations around something that's controversial are more in depth, right?
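Reddit's ranking code is open source, and the controversy sort is roughly this: reward posts that get a lot of votes with a near-even split. A lightly adapted Python sketch of the open-source version:

```python
def controversy(ups: int, downs: int) -> float:
    """Sketch of Reddit's open-source controversy sort.

    magnitude rewards posts that attract lots of votes overall;
    balance rewards posts where the vote is close to a 50/50 split.
    """
    if ups <= 0 or downs <= 0:
        return 0.0
    magnitude = ups + downs
    balance = downs / ups if ups > downs else ups / downs
    return magnitude ** balance

# 500 up / 480 down far outranks 1000 up / 10 down:
print(controversy(500, 480))  # ~744
print(controversy(1000, 10))  # ~1.07
```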
Starting point is 00:07:13 There's more to talk about. If there's a controversial woodworking technique and some people don't even want it on there, it's likely to be interesting to talk about. Although people may get into flame wars and it may cause aggression. And so that sent me down another rabbit hole. Part of this, Liam, is just that I started looking into this and I ended up going in a lot of different directions. But there's this short story by Scott Alexander, and it's called Sort by Controversial. And in it, a startup of some sort wants to find things that do well on Reddit. And so they train like a machine learning model on these controversial posts.
Starting point is 00:07:49 Right? They get this Sort by Controversial information. They try to build something that will recreate it. And so what they end up doing is building a machine for generating very divisive statements. A statement that, you know, if you show it to two people, they may have very different takes on it. This Sort by Controversial is actually just finding things that divide us, that cause us to battle it out. And the crazy thing, the reason I think this is interesting, is, you know, you don't need a fancy model or a bad actor, like some crazy machine learning, to get this result, to stir
Starting point is 00:08:24 things up. Because it's still just this simple gravity, but instead of measuring what people like the most, you're measuring what causes them to interact the most. All you're doing is measuring that. It's not complicated, crazy AI math, but you can find these stories. You can find these things that just cause such visceral reactions that people can't help but interact with them. That's just how we're wired. Like certain things, if you care about it a lot, you can't not respond. You go, I can't go to bed, someone's wrong on the internet. You're like, I've got to tell people that that's incorrect, because everybody has an opinion. And this can happen to any community, right?
Starting point is 00:09:03 Whether it's on Reddit or Facebook or YouTube, it can happen anywhere, where there's these divisive comments that are made with the intention of splitting the readers. It's not even an intention, right? It's almost just innate to who we are, that there will be points of discussion that are very divisive and they grab people's attention. Luckily though, Reddit and Hacker News and Digg, these early sites, weren't made to surface these. They did get surfaced, and on Reddit you could go and see, oh, what are the things that are dividing the woodworking community? But they didn't build the system to stir up this type of drama.
Starting point is 00:09:49 Yeah, so how does this explain Corey's cats? Yeah, exactly, right? It doesn't, right? Like, not yet. But I had to start somewhere to understand how these social algorithms work. Reddit explains how communities find what matters to them. You know, people are upvoting, stories slowly fall off the bottom, new ones pop up. But Corey is on his own, right?
Starting point is 00:10:14 There's no group. He's just scrolling, lost in his feed. So I had to keep digging, right? And next up was Facebook, and that's where things start to get a little unsettling. I was up late and I was reading, up on my phone in bed, and I think, like, oh, this is what Corey's talking about. I'm stuck.
Starting point is 00:10:32 But before I get into what went wrong with Facebook, I want to talk about just how Facebook worked in the early days. So you'd go into Facebook, you would post a photo, and other people would see, like, oh, there's an update from Liam, he posted a new photo. This is the news feed, and it was an innovation. Like, everywhere has this now, but this was a big improvement at the time
Starting point is 00:10:56 because it hit on something deep. We're social animals, right? And we're wired to keep tabs on people around us. It's kind of like the office lunchroom or cafeteria. You know, you're not there for the coffee or for lunch. You're there to hear what went down in that meeting you missed. Somebody maybe has a comment about somebody else under their breath. That whole thing.
Starting point is 00:11:18 The news feed automated that. It's like, here's all the people you're following and here's the things that they're saying. So the news feed is just gossip. That's it. People always talk about gossip like it's bad. But we like gossip because it's about something important: social standing. It's important for us to know where we stand in the tribe.
Starting point is 00:11:38 And we will want things because we see other people want them. We compare ourselves constantly to other people in our group. If your neighbor posts that they just did a new kitchen renovation, suddenly your kitchen feels inadequate. Like nothing has changed in your world. It's just your comparison group has changed. There's actually research around this. It's not celebrities who get under your skin.
Starting point is 00:12:03 If Kim Kardashian has an amazing and crazy vacation, like, you can roll your eyes at that. It probably doesn't make you as jealous as when your college roommate, the one you didn't think was that smart, gets a promotion. The social standing of those that you perceive as close to you is just something in us that we deeply care about. What I'm curious about is, what's the algorithm actually doing here?
Starting point is 00:12:28 Mm-hmm. The algorithm was called EdgeRank, and it's super simple. It just looked at how close someone was to you, right? Like, is it your friend or is it your friend's friend? How often do you interact with them? It looked at what type of post it was. You know, if you posted a picture, that is more important than just a text update. And how recent it is.
Starting point is 00:12:50 So it's kind of like just old-school Twitter, right? It is a list of the recent things from the people that you have friended. And it can rank them a little bit. Like, if you comment on somebody's posts a lot, it's more likely to show theirs. So I'm sure that's hard to do at Facebook scale. But again, it's so simple. The thing that made it so powerful isn't some great insight that the algorithm has. It's actually in us.
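The publicly described version of EdgeRank really is just those three factors multiplied together, summed over your friends' posts. A minimal sketch; Facebook never published the real weights or decay curve, so the values here are illustrative:

```python
# Illustrative weights by post type; the real values were never published.
TYPE_WEIGHT = {"photo": 3.0, "video": 3.0, "link": 2.0, "status": 1.0}

def edgerank(affinity: float, post_type: str, age_hours: float) -> float:
    """Sketch of EdgeRank as publicly described:
    score = affinity (how close you are to the poster)
          * weight   (what kind of post it is)
          * decay    (how recent the post is)
    """
    decay = 1.0 / (1.0 + age_hours)  # assumed decay curve
    return affinity * TYPE_WEIGHT.get(post_type, 1.0) * decay

# A close friend's fresh photo outranks an acquaintance's old status update:
print(edgerank(affinity=0.9, post_type="photo", age_hours=1))    # 1.35
print(edgerank(affinity=0.2, post_type="status", age_hours=12))  # ~0.015
```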
Starting point is 00:13:17 Reddit was finding consensus in a group. What does everyone think is interesting in woodworking? Facebook taps into something deeper. It's our social standing. Envy, jealousy, gossip. The code isn't complex. What's interesting about it is that your friends from high school are on there. The thing that makes it powerful isn't an algorithm, actually.
Starting point is 00:13:36 It is human nature. But how does this relate to Corey? Because he's not jealous of a cat. I guess what happened is I wanted to understand what these algorithms do. And honestly, I spent a lot of time on this. And this was one of my early light bulbs. Like, yes, it doesn't totally connect to Corey's thing, but it connects to this idea that, oh, the Facebook news feed wasn't this great all-seeing eye that understood Adam. It's
Starting point is 00:14:04 just working because I care about these people that are my friends. And I spent some time on this, but I couldn't figure out how exactly it connected to Corey's thing, right? He's not in a community. He's not envious. He's not even angry or jealous. He's watching what sounds very wholesome to me, just, like, cats doing silly things. So I almost called him back to say that I didn't have a good answer. But then I found these Facebook data leaks. They're called the Haugen documents. So what are these leaks?
Starting point is 00:14:35 Yeah, so this data scientist, Frances Haugen, walked out of Facebook with tens of thousands of files, and you can find a lot of these files online. She shared them with journalists. They became part of the public record. And they told this whole crazy story that around 2017, Facebook broke its algorithm.
Starting point is 00:14:53 So this kind of does relate to what happened to Corey. I don't know why this topic is so interesting to me, Liam, but I've gone pretty deep on it. It's 2017, and Facebook's engagement numbers are dropping. And it's a big deal because there's this feedback loop on Facebook. You post a photo, you know, of your birthday party, and then your friend gets a notification because they're in it, and they log in and they leave a comment, and then you like that, right?
Starting point is 00:15:17 And then they post something and they tag you and you go in, and it's this circle, right? It's like the people in your community interacting with each other. Activity causes more activity. Comments feel good, likes feel good. Everybody keeps contributing and the cycle repeats. But then something happened. The loop was slowing down. People were getting less out of it, or so it seemed.
Starting point is 00:15:40 And so they were like, we've got to find a fix to address this, what they perceived as, like, a real threat to the heart of their business. They rolled out this new metric. They called it meaningful social interactions, or MSI. And instead of, you know, measuring how long people stayed on the site, they wanted to boost engagement. They wanted to boost comments and shares and reactions. And the thinking was simple, right? If people are talking to each other, people are interacting, then the platform is working as intended. All right. So to clarify, the fix was optimizing for these meaningful social interactions, which was comments, shares, and reactions. Exactly.
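Press coverage of the leaked documents later reported rough point values behind MSI: a like worth about 1 point, an emoji reaction 5, and comments and reshares around 30. A sketch of that kind of weighted scoring, treating the exact numbers as approximate:

```python
# Point values as reported in coverage of the leaked documents; approximate.
MSI_WEIGHTS = {"like": 1, "reaction": 5, "comment": 30, "reshare": 30}

def msi_score(counts: dict[str, int]) -> int:
    """Score a post by its weighted 'meaningful social interactions'."""
    return sum(MSI_WEIGHTS.get(kind, 0) * n for kind, n in counts.items())

# A post that makes 10 people angry enough to comment outranks
# one that 200 people quietly like:
print(msi_score({"comment": 10}))  # 300
print(msi_score({"like": 200}))    # 200
```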
Starting point is 00:16:20 I mean, it sounds reasonable. Yeah, right? Like, I mean, we work at tech companies. It sounds like a perfect strategy. You have all these metrics. You're like, oh, this one's falling. Let's put together something that optimizes for raising this up because this is the most important thing.
Starting point is 00:16:34 They didn't have a way to tell what a reaction meant. They see the numbers moving, but they don't see the posts behind them. They're watching these aggregate numbers, and things are looking better. All right. So they're watching the forest, not the trees.
Starting point is 00:16:47 Yeah, exactly. From a distance, everything looked great. Engagement was up. People were commenting, and the metric was doing its job. But if you zoomed in post by post, you saw something else. The posts getting the most reactions were the ones making people angry. You don't rush to comment on your friend's vacation photo. You comment when you can't believe what someone just said or what they just shared.
Starting point is 00:17:06 That's when you feel the need to jump in. You set a metric and you optimize it and the number goes up, but there are side effects. And you can't see the side effects on the dashboard. But they built this system, Liam, where you would get rewarded for posting something that would anger people. You would get more distribution on Facebook, more likes, more comments, whatever, for saying something that was divisive. That was how it ended up working. Shit. So what they've actually done is they've actually built Sort by Controversial here.
Starting point is 00:17:36 Yeah. Yeah, looking back, it's wild, right? Like, they didn't read, obviously, the Scott Alexander story. By optimizing for engagement, they had built this system that just finds the most divisive and angry things. And it finds it across the two billion people who use Facebook. They took it from a woodworking Reddit, where people might fight about glue
Starting point is 00:17:57 or whether or not you can use it, to a worldwide thing where the messages that get the most distribution on Facebook are the ones that anger people, that divide them. They had this small group in Facebook that was supposed to be stopping misinformation from spreading. And after this fix rolled out, they sounded the alarm, right?
Starting point is 00:18:17 That this is just making people fight. Like, we built a system that makes people fight. And they brought it to the newsfeed people, and the newsfeed people said, like, yeah, this is risky. But, you know, we were worried. Our site was slowing down. Fewer people were using Facebook. And we found this thing to flip, and all of a sudden, everybody's on Facebook.
Starting point is 00:18:34 The newsfeed people said to the integrity people, like, hey, we'll look into it. We've got the metrics going in the right direction. There must be a way to kind of tune this where it doesn't cause quite so much anger. And so they actually did hire some people. They got a guy from Netflix who worked on Netflix's recommendation system. They asked him, like, hey, you guys recommend movies to people. What can we do here? We need to keep the growth up, but we don't want to cause the world to split into wars or whatever, right?
Starting point is 00:19:02 And he spent a lot of time on it. And he said, it's actually tricky because you've actually found the best growth metric. Like, this is really working. I don't know how we turn it down without hurting our numbers. Facebook was kind of stuck. What was happening, Liam, is the turn towards engagement brought growth back up. It brought growth to exactly where they needed, maybe beyond. But it was causing all these problems and they could see it.
Starting point is 00:19:29 Like even all over the world, some really horrible things were happening that some would say were related to this change. Like, wars breaking out. But turning it down would turn the growth down. And turning the growth down would hurt their revenue, hurt their stock price, hurt the stock options that everybody there gets paid in. It was really hard for them to address this. So the fix existed, but it just cost too much for everyone in the company to go and implement it. Yeah, and like, in fairness to them, I get it, right? Like, companies are often predicated on growth. There's other social networks. They want people to use the platform. They're not forcing people to
Starting point is 00:20:15 respond in angry ways, but yet they sort of are. Yeah, I mean, this has been going on for years, like newspapers publishing front-page articles about a war or about a murder or something like that, and this whole thing of, if it bleeds, it leads. So you're right. Like, you could say, hey, they're just replicating in their distribution how editors choose stories for newspapers, like the most salacious thing goes first. But there are editors, and the things generally are true.
Starting point is 00:20:48 This isn't true of Facebook, right? There's no editors. There's no limits. There's instant feedback. There's angry people talking to each other. So honestly, this is super interesting. Like, I spent a lot of time reading into all this because it's such like a Sophie's choice for a company to decide what to do here.
Starting point is 00:21:08 And Corey's question, like, I kind of wish that it was, hey, why is my dad always fighting with other people on Facebook or Twitter or even in YouTube comments? Because I think that now I actually have a really strong understanding to answer that question. It's all about metric optimization and how engagement causes divisiveness. But actually, that wasn't Corey's question. I mean, Instagram is part of Meta, part of Facebook, now. But Reels is not a copy of the news feed. It doesn't sort your friends' posts by who's got the most angry reactions, right? It's more like lining up an endless stream of videos from strangers, really, and trying to find the ones you can't stop watching.
Starting point is 00:21:49 It's kind of a whole different thing. Yeah, so we're still not at Corey's answer yet. I know. It's actually frustrating for me. Every time I found an answer, like these leaks I thought would explain it. And they did explain why Facebook got toxic, but they didn't explain why Corey spends all day watching cats. That's when I landed on YouTube.
Starting point is 00:22:09 Actually, because YouTube cracked a different code, and it is video-based. It doesn't care about reactions or anger or even a sense of community. They're just trying to find things to keep you watching. Early on, YouTube cared about clicks, right? You'd upload a video onto YouTube, and YouTube counted how many people clicked. And that was already, like, way more data than Facebook had. Because people click on things way more than they comment. So it's just like, hey, show the videos that people click on the most.
Starting point is 00:22:39 But that led to a problem. It just led to clickbait and junk, where the videos didn't actually deliver. So YouTube had to solve this. Instead of just counting clicks, they started caring about how long you would actually stick around. And so they built their whole system, instead of the engagement system of Facebook, around: how do we keep you on the site? How long are you watching this video? That way we can move past flashy thumbnails and we can focus more on, if people are on our site, they must like it. So they pulled it off by taking an idea from Amazon and others, which is collaborative filtering. You go on Amazon,
Starting point is 00:23:18 you're like, people who bought this product also bought this one. YouTube was able to really unlock this algorithm and say, if you watched this video, then you might like this other video that somebody else watched together with it. And then every second you watch a video is a signal. Every time you skip a video, that's a signal that it's not a fit. And it's not just about you, Liam, right? It's about the people who watch the type of stuff that you do. So when you stop watching a video at the 30-second mark, YouTube doesn't just learn that you didn't like it, right? It learns that people with viewing habits like yours didn't like it. And so I wanted to see how YouTube actually does this under the hood, and they actually published a paper on it. The paper is by
Starting point is 00:23:59 Covington, Adams, and Sargin. And it's about how they trained YouTube to find what videos to show you. And you might think this is really complex, and it's got a lot of crazy math in there, and it does, because YouTube is massive and there's so many videos and so many people visiting. But the core of it is actually pretty clear. Here's what it comes down to, right?
Starting point is 00:24:19 They take the last 50 videos that you watched for more than a few minutes, and then they use like a vector space. So they come up with a representation of those 50 videos, and they combine them all together, and they put it in some sort of space.
Starting point is 00:24:35 Like, you can imagine a graph, and they're like, Liam is the last 50 videos he watched, and that puts him here in that space. All they're doing for everybody is finding that spot that they're at, and then they find other people near that spot, and they're like, well, what did they like? And they show them the next one. That's the recommendation. So they're building a model of you specifically, based on your behavior on their platform. But all of these videos on YouTube back in the day were all long videos, right?
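Before the conversation turns to short videos, that candidate-generation idea is worth pinning down in code. A toy version, assuming we already have an embedding vector per video (the real system learns these with a deep network):

```python
import numpy as np

EMBED_DIM = 64
rng = np.random.default_rng(0)
video_embeddings = rng.normal(size=(10_000, EMBED_DIM))  # toy catalog

def user_vector(watch_history: list[int]) -> np.ndarray:
    """Represent a user as the average of their last ~50 watched videos."""
    return video_embeddings[watch_history[-50:]].mean(axis=0)

def recommend(watch_history: list[int], k: int = 10) -> np.ndarray:
    """Nearest-neighbor lookup: videos closest to the user's spot in the space."""
    u = user_vector(watch_history)
    scores = video_embeddings @ u      # dot-product similarity
    scores[watch_history] = -np.inf    # don't re-recommend what's been seen
    return np.argsort(scores)[-k:][::-1]
```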
Starting point is 00:25:04 I'm curious, because what I'm hearing from you regarding Corey is that he's watching five-second clips, not long-form videos. Yeah. And so is everybody. Like, so many people are addicted to these short videos. What makes short videos different? And that question kind of changed how I saw all this. I used to think, oh, there's this algorithm.
Starting point is 00:25:26 It's this one trick. You know, whether it's Facebook or, you know, even the simple one on Reddit or whatever is on YouTube. But it's a little different than that. Every platform has built their own little algorithm, and they're each kind of optimizing for a specific thing. Reddit is great at finding consensus for a small community. Facebook is good, you know, originally at stirring up envy, and then they found a way to, oh, stir up rage and divisiveness.
Starting point is 00:25:56 And YouTube, right, they found this momentum loop, like, oh, let's keep you watching. Right? One video after another, let's focus on the time. Let's learn and predict. But five second videos, it's a whole new game. That's TikTok. They're the people who innovated here. And once I looked into what TikTok does,
Starting point is 00:26:16 things started to really click. So this is the algorithm that we're talking about with Corey, right? Yeah, man, I'm getting there. There is like an arc here, I promise. But yes, after reading the YouTube paper, I wanted to know how TikTok pulls this off. The reason that TikTok's important here, is like TikTok is the pioneer of this addictive short form video.
Starting point is 00:26:38 is like TikTok is the pioneer of this addictive short-form video. But TikTok, they keep a lot of things under wraps. But then in 2021, an internal document from ByteDance, their parent company, was leaked to the New York Times. It came straight from their engineering team. And TikTok actually confirmed it was a real document. But it was kind of a cheat sheet to explain to non-technical staff how everything worked. It broke down the algorithm, explained what it did.
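The core formula reported from that leak is a weighted sum of predicted engagement, roughly Plike x Vlike + Pcomment x Vcomment + Eplaytime x Vplaytime + Pplay x Vplay. A sketch of that scoring; the weight values here are made up, since the leak didn't include real numbers:

```python
# Sketch of the scoring formula reported from the leaked ByteDance document:
# predicted engagement probabilities times tuned weights. Weights are assumed.
V_LIKE, V_COMMENT, V_PLAYTIME, V_PLAY = 1.0, 2.0, 4.0, 0.5

def tiktok_score(p_like: float, p_comment: float,
                 expected_playtime: float, p_play: float) -> float:
    """Rank a candidate video by how much engagement the model predicts."""
    return (p_like * V_LIKE + p_comment * V_COMMENT
            + expected_playtime * V_PLAYTIME + p_play * V_PLAY)

# A video you'll probably loop beats one you'd merely like:
print(tiktok_score(0.01, 0.001, 1.8, 0.9))  # ~7.66, playtime dominates
print(tiktok_score(0.30, 0.050, 0.4, 0.6))  # ~2.30
```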
Starting point is 00:27:04 So YouTube looks at your last 50 videos and makes a model of you, so that when you go to the YouTube homepage it will recommend things, and then you'll click on them to watch them. And it does this kind of offline. You can imagine it nightly updating its model of Liam based on what you've seen. But watching 50 videos, that takes a lot of time. If your interests change, it takes a while to show up, because there's just a lot to watch. But with short videos, the pace is totally different, right? In the time that you could watch one 10-minute YouTube video, you might watch, or just flip through and skip, like 30 or 40 TikToks. And every swipe that you do, every loop, is a signal.
Starting point is 00:27:44 If you watch a video twice, that's like a really strong signal. If you swipe past in half a second, that's a really strong signal. If you pause mid-video or go back a little bit, you're interested, right? You tap the creator's profile? You're interested. Very quickly, you're generating all these signals that they learn about you. And like YouTube, right, TikTok has to narrow down all the videos on the platform to pick the ones just for you. But once they do that, the system runs the numbers on the fly, not waiting till, you know, the night to build a model of you. They are figuring out constantly, how likely are you to watch the whole thing? How likely are you to just abandon TikTok at this point? They weigh each of those. And the strongest signals, unlike Facebook and its interaction
Starting point is 00:28:31 system, have nothing to do with likes or comments. It's like YouTube. It's just watch time and rewatch time. Did you watch the whole thing? Did you watch the thing after it? Did you watch it twice? So they're reading your unconscious behavior, and then they're adapting the videos they show you on the fly. It's actually quite a technical feat: where YouTube did this huge thing at night to determine what's best, TikTok is doing it on the fly. As you're going through there, they're learning about you. They're constantly updating that. Their first breakthrough was signal density. That's what they call just the fact that they're getting like 40 data points from you in 10 minutes,
Starting point is 00:29:06 where YouTube just gets one, that you watched something for 10 minutes, right? The second breakthrough is that training that I mentioned, right? YouTube does this batch at night, but TikTok ditched that approach. They published a paper about their system, Monolith, and it's a streaming pipeline.
Starting point is 00:29:23 Every time you're interacting, they're updating that model of you, of what you might want to watch next, in real time. Every swipe or pause goes into a Kafka queue, and then they have this Flink job that grabs those events and adds context. You know, how you watch one video can affect another, like, two later. It's learning very quickly. And then what they tried to do was figure out, like, oh, what period of information should I learn over? Should I look at what Liam has watched in the past year, in the past week, et cetera? And actually, do you want to guess what they chose?
Starting point is 00:29:53 What did they choose? 30 minutes. So the most important information about Liam is in the last 30 minutes. It's very keyed in to what you're currently doing. The reason they can do this 30 minutes is because they have way more data per minute. The model is learning all the time. So in the last 30 minutes, they may actually have a lot of information.
Starting point is 00:30:15 But then there's the real engineering hurdle: using the last 30 minutes of your time on TikTok means they need to be able to constantly update this, right? So they came up with a complicated system to do that. It's really a great feat of engineering. Basically, what all this means is they have a lot more data per video and a lot more video per period of time.
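Skipping the Kafka and Flink plumbing, the online-learning loop at the heart of the Monolith idea can be sketched in a few lines. This is a toy illustration, not their actual update rule; the learning rate, event fields, and update math are all assumptions:

```python
import numpy as np

EMBED_DIM = 64
user_vec = np.zeros(EMBED_DIM)  # the running model of one user

def on_event(video_vec: np.ndarray, watch_fraction: float, lr: float = 0.1) -> None:
    """Nudge the user model immediately after each swipe, one event at a time,
    instead of waiting for a nightly batch job.

    watch_fraction > 1.0 means a rewatch (strong positive signal);
    a quick skip (say 0.05) pushes the user vector away from the video.
    """
    global user_vec
    signal = watch_fraction - 0.5  # positive if they watched most of it
    user_vec += lr * signal * (video_vec - user_vec)
```

In a real pipeline, these events arrive on a stream and the updated parameters are pushed back to serving within seconds, which is what makes a 30-minute window workable at all.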
Starting point is 00:30:38 I've got this image in my head of a massive burger and chips meal versus a meal at a sushi train. At the restaurant that serves the massive burger and chips, maybe the chef is out back, maybe they're even watching you to see whether you like it. And they're looking to learn how you behave, or how you think about the massive burger and chips that you've eaten. But they have to sit through the entire meal.
Starting point is 00:31:09 And maybe at the end of that meal, maybe you hated it and you only ate half of it. They were learning once, from one big meal. Whereas at a sushi train, someone sits down and there's small bite-sized meals going around and around and around. And the sushi chef might be there watching every single bite that you take, and they can see that you really like the chicken teriyaki sushi, and maybe you don't get any of the raw fish. So all of a sudden they start adapting their sushi train, and they take off the raw fish sushi and start piling on more of the other things. Yeah, exactly. The Wall Street Journal did this test to see how fast TikTok could figure out your interests. So they built these bots, basically, and they gave them each a personality, so that they would only like
Starting point is 00:31:57 certain videos. They had this one account called Kentucky 96. It was set up to be a 24-year-old interested in sadness. In just three minutes, 15 videos in, it hit a sad clip, and it watched it several times, right? And that's all it took. By the 36-minute mark, Kentucky 96 had seen 224 videos, and 93 of those were about people who were depressed or thinking of suicide or self-harm. What happened is every time Kentucky 96 lingered on something sad, the next one was likely to be even sadder.
Starting point is 00:32:34 And the algorithm doesn't know what it's doing, right? It's still very simple. Yeah, so this algorithm can send someone from sort of the mainstream into the fringe, from a relatively okay state with a couple of depressive thoughts to an entirely depressed feed, in only 36 minutes.
Starting point is 00:32:58 That's all it takes. Yeah, and this Wall Street Journal test, it tested a lot of different user profiles. One was interested in politics, and it quickly got pulled towards, you know, QAnon and conspiracies. I don't think there was a cat owner one; maybe there should have been.
Starting point is 00:33:12 But the pattern was always the same. As long as you interact with something, the algorithm can find more adjacent things that'll grab you and pull you deeper. A system like YouTube takes weeks to learn your habits, and so it sees kind of a larger range of what you're into. TikTok locks onto your narrow, in-the-moment interest and keeps feeding it. So the rabbit hole isn't really intentional.
Starting point is 00:33:32 It's what happens when a system is learning your preferences faster than you realize it. And this connects back to a problem that Corey is actually really worried about. And it's kind of a bigger one than AI cats. It's about how people, and especially young people, are able to cope with the world we've built. Basically, he's worried about his son. He's old enough that he sits down at the computer, he cracks open Roblox, and he has it fill the right three quarters, the right four fifths, of the screen, and then he puts a browser in the other 20% and scrolls YouTube Shorts while he's playing Roblox.
Starting point is 00:34:15 And that's not behavior that I want. And I'm worried that I'm raising kids who don't understand how to actually focus on something for longer than four or five seconds, because their attention is just hopping everywhere. So I think that's the real problem, right? He's actually worried, I mean, for himself, but also for his kid. Is it a technical problem here? Or is it a bad habit that is enabled by tech?
Starting point is 00:34:45 Yeah. That's the hard question. So I will answer that because I think that's important. And I called Corey, I walked him through the whole story, right? There's like Reddit and Facebook and YouTube and TikTok and how each found its way to like get more people interested in it. Is there ever a point where this becomes unwillingly like self-reinforcing? Oh, you know, we showed him a bunch of these. And so we're just going to keep flooding his algorithm.
Starting point is 00:35:14 But then because we're flooding his algorithm, it's just reinforcing the signal that we've already decided. Did I just DDoS my algorithm to the point that it just thinks this is mostly what I want to watch? Yeah, I mean, possibly. Yeah, so Meta has, what is this called? It says, we want to make sure everyone on Instagram, especially teens, has positive and age-appropriate experiences. We've built in protections, but we also want to give you a way to shape your own experience. That's why we're testing the ability for everyone on Instagram to reset their recommendations. In just a few taps, you can clear
Starting point is 00:35:49 your recommended content across Explore, Reels, and Feed. So that's your key. It's just under Content Preferences, Reset Suggested Content. Oh, really? I was going to ask, do you have any tips to dig myself out of the hole I've created for myself here? But the interesting thing is,
Starting point is 00:36:06 this part you probably won't like as much, right? Because, like, the thing that makes TikTok and Reels and whatever work is not necessarily some magic in the algorithm. It's your interests. Social media is like cheesecake, right? Like, we like things that taste fatty. We like things that are sweet, because that's just our evolutionary background, right? We wanted fat and sugar.
Starting point is 00:36:29 And cheesecake is just like a purified form of that. So, like, the thing about cheesecake isn't that it's magically addictive because of the ingredients. It's addictive because it's made of things that we like and enjoy. So, I mean, I think that you like these cats. I guess. Yeah, I love annoying my wife by sending her tons of them.
Starting point is 00:36:52 Yeah, right? Maybe it's just the absurdity of sharing it with her, but, like, if the novelty of them wears off and the Reels algorithm is working properly, I mean, it should be able to start showing you other stuff. Although, as you said, maybe it's over-indexed on this. Thanks for giving me the benefit of the doubt there. Yeah, I don't know, but I don't know if this is helpful.
Starting point is 00:37:13 The thing I found is that actually learning about the algorithms doesn't necessarily make them not work. Now that I know how they work, it's not going to make me less likely to scroll on my phone. Yeah, I'm hoping I can scroll less, but we'll see. I feel like, much like cheesecake, the key is just to not have it in the house or on the kitchen counter. That's all I got. I didn't solve your problem necessarily. You did. I'm going to go give this reset preferences thing a shot.
Starting point is 00:37:42 We'll see if I can get back to normal. I feel like cheesecake now. But I mean, so is the answer here that we just don't have social media in the house? Well, the point isn't that you should never eat cheesecake, right? Cheesecake is delicious. Everybody has their weak spots. The craving's always going to be there for us as humans. But the concentration of the thing that we're craving, that's what matters.
Starting point is 00:38:09 Yeah. Like the craving is us. Like our desires and the things that we want in the world, whether food or weird videos, it's part of who we are. The algorithms didn't build it. They just got very good at finding it and concentrating it. Yeah.
Starting point is 00:38:25 So how about his kid? Yeah. I mean, I think in a way that is both simpler and more profound. The one thing that I found that was disturbing, I guess, is that because the TikTok window is so small, it can exacerbate mental health things. Like, if you're feeling sad, it will learn, oh, you're interacting with sad videos.
Starting point is 00:38:49 I mean, that's concerning for teens and children, I think, more so than anything else. Yeah, I didn't even think about that. That's tough. Are they doing anything guardrail-wise around that? Or is it just, the algorithm's going to algorithm? I think it's, algorithm's going to algorithm.
Starting point is 00:39:06 Like, they don't know what a cat video is, right? The algorithm's not that smart. It's just pattern matching. And if it's pattern matching and showing you things that you think are funny to share with your wife, that's cool. But if it's, like, a depressed teenage girl, and it finds that she likes depressing things and just keeps feeding her depressing stuff, like, yeah, man, you go to a bad place. But I just think people shouldn't panic about kids' use of social media as much as just try to have constraints, right? Yeah.
Starting point is 00:39:46 That's kind of where we've landed with my older daughter and my son. We just opt for, you know, reasonable monitoring without getting too much into their business. But enough that we're aware. You know, I have a link to every Short my son's watched. And have I watched all of them? No, but I go randomly pick some. And it's like, okay, it's 12-year-old boy stuff. It's what I would have expected. We're good. Yeah, so that's the show, or at least that's where I thought it would end:
Starting point is 00:40:15 keep the cheesecake out of the house. Simple enough. But while I was putting this episode together, Mark Zuckerberg took the stand in Los Angeles. It was the first time he'd ever testified before a jury, because a young woman said Instagram was designed to hook her, and she was suing. And there are 1,700 similar cases behind her. And those same Haugen documents that I spent weeks reading through,
Starting point is 00:40:34 they're now evidence in this trial. Meta's own researchers found, as was in the documents, that turning off middle-of-the-night notifications helped kids sleep better, but it hurt growth. So they didn't do it. And it's easy to hear things like that, Mark Zuckerberg keeping kids up at night, and think, hey, there's your villain. That's the answer here.
Starting point is 00:40:58 But I don't think it's the right lesson, right? We spent this whole episode talking about how every platform is kind of tapping into something innate in us. That cheesecake is always going to get made because we want it, right? That's why maybe regulation matters. You know, we do it for alcohol, we do it for cigarettes. You know, we don't regulate cheesecake, but maybe we should. So until next time, thank you, Liam.
Starting point is 00:41:21 Thank you, Corey. Thank you, everybody. And thank you so much for listening.
