Tech Won't Save Us - How Brainrot AI is Upending the Internet w/ Jason Koebler

Episode Date: May 1, 2025

Paris Marx is joined by Jason Koebler to discuss the economy behind AI slop generation, how people are building businesses on AI-generated images, and the wider consequences of their proliferation on social media.

Jason Koebler is a cofounder of 404 Media and cohost of the 404 Media Podcast.

Tech Won't Save Us offers a critical perspective on tech, its worldview, and wider society with the goal of inspiring people to demand better tech and a better world. Support the show on Patreon.

The podcast is made in partnership with The Nation. Production is by Eric Wickham.

Also mentioned in this episode:

Jason wrote about how brainrot AI is monetized on Instagram and the wider effects of AI slop on how we perceive reality.

He also wrote, with Emanuel Maiberg, about whether the tech industry's bet on Donald Trump is working out.

Support the show

Transcript
Starting point is 00:00:00 What I think is happening is the people creating this type of content are creating outrage bait or engagement bait that is more designed to get people to signal to the algorithm that this is something that should be boosted. And it's getting a lot of views based on that. Hello and welcome to Tech Won't Save Us, made in partnership with The Nation magazine. I'm your host, Paris Marx, and first of all, a big thank you to everyone who supported the show during our five-year membership drive. It's still hard to believe that the show is now five years old, but I think it's heading into this year in a great place. And part of that is thanks to those of you who are supporting the work that goes into making it every single week. We set two goals last month to get 100 new supporters to make a new series on defense tech and the relationship between Silicon Valley and the Pentagon. We hit that goal, and I will start working on that soon
Starting point is 00:01:05 with the goal of getting it out into the world in the fall so you can all hear it then, and hopefully we can have a broader discussion about this important topic at that moment. Our second goal, even higher goal, more ambitious, was to make a zine. We did not hit that goal, but who knows? We'll see if maybe that project will come back in another form. But for now, I'll be focused on that series in the fall. And on top of that, I wanted to give you an early indication of something else, because I'm also working on another special series that I think will be rather important in telling a story that, you know, maybe people have a certain idea of, but probably don't know the specifics of, and that is of the
Starting point is 00:01:45 privatization of the internet. One thing people might not realize is that this year is the 30th anniversary of, you know, what we often consider to be the privatization of the internet. And it happened right around this time, if you consider the decommissioning of the public backbone of the internet in the United States to be, you know, kind of the key moment that defines that event. I was hoping to have that series ready for around now, but things have been a little busy, so I've had to delay it a little bit. So stay tuned for more information on that. But I just wanted to give you a heads up that it is coming, and hopefully it will help you to see that really key event that has shaped so much of what we experience today online in a new light. So stay tuned for
Starting point is 00:02:25 that. But this week, there is a great conversation that I think you're really going to enjoy. My guest is Jason Kebler. Jason is the co-founder of 404 Media and a co-host of the 404 Media podcast, where him and his fellow journalists at 404 go through some of the biggest stories that they have reported through that week to give you greater context into their thinking behind it and what has been going on in the tech industry over the past week. Jason was on the show last year to discuss the increasing proliferation of AI-generated images and content spreading across social media platforms and what he was observing with that, how common it had become, how people were reacting to it. But he has continued to pay
Starting point is 00:03:05 attention to what is going on in that space, what is happening on social media. And he is seeing a much more negative turn on the impact that all of that AI generated content is having. So I wanted to have him back on the show so we could discuss that. We dig into this concept of brain rot AI and how all of this AI slop is not just degrading these platforms, but further degrading the information environment that we have been talking about with social media for so long, right? Where it becomes so easy to spread conspiracy theories, false information, and other things like that through these platforms. And it becomes harder and harder for people to know if what they are encountering is fake or
Starting point is 00:03:46 real or who is behind it or what the motivation is for spreading this particular idea. And so I think this is a really important conversation. And we also end that off with, you know, a discussion of what we're seeing with the tech companies more broadly in this moment, particularly with regard to their relationship with Donald Trump, based on a story that Jason and his colleague Emmanuel Mayberg wrote for 404 Media. Obviously, we saw so many of these tech billionaires get behind Trump and his campaign, hoping to, you know, get benefits out of that. But I think the question is beginning to emerge as to whether that is really paying off for them. So I wanted to touch on that with Jason as well. Needless to say, this is a great conversation. I always enjoy talking to Jason and hopefully you always enjoy
Starting point is 00:04:29 listening to us chat. If you do enjoy the show, make sure to leave a five-star review on your podcast platform of choice. You can share the show on social media or with any friends or colleagues who you think would learn from it. And if you do want to support the work that goes into making the show every single week, you know, if you forgot to do so during April and you had meant to, that's okay. You can join supporters like Jordan from Portland, Oregon, Lucas in Vigo, Spain, Chad from Los Angeles, Jessica in Austin, Kiara from Hoboken, Gary from Durham, North Carolina, Vito from Berlin, Seb from Biel, Switzerland, Ethan in Boston, Louis from Scotland, Jaco from Amsterdam, and Barbara from Germany by going to patreon.com slash tech won't save us where you can become a supporter as well. Thanks
Starting point is 00:05:11 so much and enjoy this week's conversation. Jason, welcome back to tech won't save us. Hey, I love the show. Glad to be back. Love everything that you're doing over at 404. So mutual respect there. Sometime last year, you came on the show because you were looking into like the AI images that were increasingly proliferating, I think was on Facebook in particular at that time. And then there were things to be worried about and to be concerned about with what was happening here with the weird kind of stuff that was circulating there. But we're now kind of about a year on from what you were looking at then. And you have been
Starting point is 00:05:44 continuing to follow this space on different social media platforms, you know, as these models have continued to evolve and the tools and what the people who are making this stuff know how to do with them, basically. I guess I would start by asking what kind of AI slop are you seeing on these networks today? And how have you seen that shift over the past year? Yeah. So I think last time we talked, we were probably talking about Shrimp Jesus, which was images of Jesus with shrimp for arms. And then there were also... Bring him back. Yeah. Yeah. Bring him back. Just missed that guy. There were also a lot of images of
Starting point is 00:06:18 stereotypically poor African children who were like building sculptures out of bottles. Like that was a really popular thing that was happening. A lot of like wood carving art and sand sculptures and things like that. And those were going really viral on Facebook. Some like Gaza stuff as well, I believe, and Palestinians and stuff. So that was kind of the beginning of what I would call like a shift to more topical AI slop and AI spam. So there was the all eyes on Rafa Instagram, I guess you'd call it a meme, but it was basically an AI generate image that went extremely viral on Instagram. And that sort of begat a lot of AI generated images that were related to things
Starting point is 00:06:58 that were happening in the news. And what I've seen is that a lot of the spam is more closely tied to world events because that's what usually does well on social media. Like anytime there's a disaster, there's a lot of AI generated content around it. So Hurricane Helene, there was tons of sad children in canoes like with their house destroyed near them. I mean, very horrible stuff. And then like the LA wildfires, there was a lot of AI generated content there. What has changed a lot is the sophistication of the AI, we're seeing a lot more videos at this point, like it was almost all still images for a while. And that's because the AI video generators have gotten a lot better. And people have also gotten a lot better at circumventing the content moderation. A lot of these Chinese image generators don't have any restrictions in terms of generating celebrities. So I'm seeing a lot of like LeBron James, Steph Curry, Ronaldo, Leo Messi, AI slop. And then I'm also seeing a lot of really grotesque
Starting point is 00:08:01 stuff by that. I mean, I'm looking at one of my articles. It's a manatee human hybrid AI slop video. There is a guy made of pizza. There's a man with his head in the toilet, like a human toilet hybrid. There's a lot of like body horror stuff more generally. And there's a lot of like creepy pasta type stuff like jump scare type AI. And I'm sure we'll talk about it, but that's become like really popular on all platforms. Yeah, it's almost like you see this like melding, you'll often see like one thing and it will meld into something else. And it feels like it'll just get like weirder and weirder as the video keeps going on. When people see this stuff, as you've
Starting point is 00:08:40 been following this, you've been looking at the comments on these types of images. Are they believing this is something real? Do they know it's immediately AI? Like, what do people make of seeing these things? I mean, I don't think people think that they're real anymore, at least the most popular types of AI slop that I've seen. So there's really two categories, and I probably talked about this last time, but there's like the realistic AI generated content that is designed to fool people. I've
Starting point is 00:09:05 been seeing a lot of what I'd call Elon Musk inspiration porn, where it's like Elon Musk, and then there'll be a spaceship behind him. And there'll be a caption that says like Elon Musk created this flying saucer. And then a lot of the comments on that will be wow, this guy is like, so innovative, stuff like that. There was also a lot of viral stuff on Facebook with Elon Musk making like tiny houses to solve the housing crisis, which is nothing he's ever shown any sort of interest in, to my knowledge, and it's definitely not real. It wouldn't surprise me if he'd promised it at some point and not delivered anything on it, though. I mean, that's entirely possible. And then a lot of the natural disaster stuff, people
Starting point is 00:09:41 do think that that is real, like based on the comments, based on how it's spreading and being shared. And that's really, really concerning. But then with the really grotesque stuff, I don't think that anyone thinks that it's real based on the comments. A lot of the comments are mad about the content saying like, Oh, this is disgusting, or I don't want to see this or this is so fucked up, like a lot of that sort of thing. But I think what is happening here is this is content that's created for the purposes of gaming the Instagram algorithm or the YouTube algorithm or the Facebook algorithm. And what I mean by that is if you stay on the video for any length of time, that's a signal to the algorithm that you are interested in that sort of content. Like it's really hard to tell what's going on in the videos because the people or the
Starting point is 00:10:30 images, whatever is happening on screen is morphing and changing very rapidly. Like usually there's kind of crazy music. Usually it's something where you stop and say, oh my God, like what is this? And that split second where you're trying to like figure out what's going on might be a signal to the algorithm that you want to see more of it. A lot of the comments were saying this is disgusting. That's engagement. And that's a signal to the algorithm that you want to see more of it. A lot of these videos also have tons and tons of shares like Instagram. You can see shares where people are copy pasting the link and sending in a group text or sending in a message or posting it somewhere.
Starting point is 00:11:08 And I would imagine that a lot of these are saying like, oh, my God, did you see this like really weird video? That's a really strong signal, actually, to the algorithm that it should show you more of this. And so what I think is happening is the people creating this type of content are creating outrage bait or engagement bait that is more designed to get people to signal to the algorithm that this is something that should be boosted. And it's getting a lot of views based on that. When you see those, like all of those shares on a platform like Instagram or Facebook, obviously some of that
Starting point is 00:11:41 is going to be real people. Do you have any idea of how much of it is just like bots and kind of like fake engagement or is that kind of hard to tell it's pretty hard to tell but I did an article about a year ago where I there's this thing called the dead internet theory that I'm sure you've talked about on the show before I think we might have we may have talked about it yeah yeah where it's like oh everyone on the internet is a bot but I think what's happening is it's like a mix of there's definitely sophisticated spam farms that do things like buy likes and buy shares and use bots to give any piece of content an initial boost in the algorithm. And so I think it's a mix of a lot of like bots doing that initial boost that initial seeding of any piece of content and then human beings doing the
Starting point is 00:12:24 rest of it. And it's really hard to tell because when you're talking of any piece of content, and then human beings doing the rest of it. And it's really hard to tell because when you're talking about a piece of content that has 1.2 million likes, there's no way for a human being to scroll through that and look at like which accounts liked it and determine sort of what's going on. Totally. And then like you were saying, you know, you get these metrics that based on the way the platform is designed and based on what say Facebook or Instagram are looking to get out of these platforms, engagement, people looking at ads, you know, stuff like that. Then seeing people are watching it for a while,
Starting point is 00:12:55 even if they're watching it just because like it's moving really fast and it's kind of freaking them out and they're like, what the hell is this? Or they're sharing it to say, look at this weird thing that came up on my feed. Like, why is this coming up? That still counts to a Facebook as like, look, people are engaging with this content. It doesn't mean it's good, but it works for their goals either way. Yeah, I mean, to my knowledge, there's no like sentiment analysis happening in the comments. I've like never heard of Facebook, Meta, YouTube trying to do any sort of sentiment analysis. I know that YouTube has like a thumbs down button for videos, but the extent to which that matters in the algorithm, like I have no idea. And that's one of the problems. These algorithms are black boxes, which your listeners know at this point. But something that the people creating this stuff seem to understand is that shares, comments are really important for the algorithm. A really important point. And so in what you've been writing recently, you called this a form of brain rot AI. Do you want to talk specifically about what you mean by that concept?
Starting point is 00:13:57 I mean, brain rot, I guess, is like a TikTok term. I don't know exactly where it initially came from, but it's sort of the like skibbity toilet, Rizzler, like Gen Alpha, really esoteric memes. I don't blame anyone if they've never heard of any of that stuff. There's weird stuff going on online in like the young people circles. Yeah, I mean, it's it's just like it's a it's a new type of meme where it's like this is really quite low brow toilet humor very often. It's a lot of like body horror, bodily fluids, goo dripping from the walls, things like that's sort of the aesthetic of it, I guess. And this is not a term that I invented by any stretch. Like people have been using this term for a few years now. The reason I called it brain rot
Starting point is 00:14:45 AI is because that's what the people making it call it. I've been very, very interested in the economics of creating AI spam, like how people get paid, how much they make, why they make it, what their strategies for making it are, where it comes from, that sort of thing. Basically, the way that it works is that there are people who do a lot of spamming. And then there are people who figure out how to create the strategies for making this type of content. And the people who make the strategies, the prompts, the tools, that sort of thing, they make more money by selling classes about how to do it than they do actually spamming the platforms. That brings me back to like the days of early self-publishing and all that kind of stuff where like you have people who are like, I don't know, writing their own books or even like,
Starting point is 00:15:33 you know, making YouTube videos, like early kind of creator economy and stuff like that. And then you had these people who maybe weren't even that successful at doing that at all, but like made a ton of money selling courses on like explaining how it was done. And so many people like wanted to get in on it that they would spend so much money like buying these different courses to try to learn what was happening. Obviously, you don't have everybody making their own brain rot AI, but it feels like it's a growing industry. I'm not calling these a pyramid scheme, but like it reminds me of a pyramid scheme of mid-level marketing, because basically the strategy that a lot of people have at this point is to try to grow a big AI generated Instagram account and then to use that big AI generated Instagram account to sell access to
Starting point is 00:16:17 your courses about how to make an AI generated Instagram account. Would this be like the new version of like the meme Instagram account or something then? Yeah, no, exactly. So there's this website called WAP, like W-H-O-P dot com. It's basically a website that lets you sell ebooks. I mean, we're talking about sort of like early internet stuff here where it's like, oh, please buy my ebook, exactly like you were saying. But usually the way it works is you'll buy access to an ebook and series of like videos about how to make this AI. And with that, you'll get access to a private discord channel where people share strategies. And so I've now bought like five of these courses over the course of the last year. They're usually like 20 bucks a month, actually, but get everything and then cancel. So in this case, like I bought this course, I got into this discord, and then there's an entire guide to making brain rot AI. What brain rot AI, at least in this community is, is videos of beloved childhood characters. Think like SpongeBob, Dora the Explorer, Peppa Pig, which is for like very young children, Mickey Mouse, et cetera.
Starting point is 00:17:27 And they basically have a cut and paste template for what you need to write to get an AI image generator to create this. And so there's like, I've seen hundreds of videos like this by this point, but basically imagine like 10,000 SpongeBobs all storming like a McDonald's and like attacking the people in the McDonald's. And there's just like blood everywhere. And then they start eating Big Macs and, you know, in a really gross fashion and all the burger like flies everywhere and stuff like that.
Starting point is 00:17:58 And the people making this call it brain rot AI. It's like a whole strategy, just like take beloved childhood character, do something really gross with them, use my prompts to do it and then spam it to Instagram. I'm actually really surprised that like Viacom or Disney hasn't come after these AI image generators at this point. And the way that they're doing it, not to give everyone a bunch of instructions, but they're like using chat GPT to generate the prompts. So like you're putting a prompt into chat GPT to output another prompt that you're going to use for this other kind of generator or whatever. They're doing exactly that. They're like chat GPT, write me an AI prompt for an image generator that will give me 100 Spongebob storming McDonald's. So then chat GPT them that. They copy paste it to an image generator
Starting point is 00:18:45 to get an image of SpongeBob and McDonald's. Then they take that and they put the image into a video generator to animate the image. And then they'll do some video editing on the back end and publish it. Is there reason to use ChatGPT for that? Because if you just went to one of these video generators and said, make me a video with all these SpongeBob's, like it wouldn't do that because of copyright and IP things? Or what's the reason to use ChatGPT over just putting a prompt into the video generator? The reason is because they're not that good at writing. Okay.
Starting point is 00:19:19 The human being spammers are not that good at writing prompts that will give them good outputs. And so basically, like each AI tool is good at a specific thing. So they have found that ChatGPT is good at generating prompts for other image generators, but that ChatGPT itself won't generate an image of SpongeBob or of LeBron James, but it will generate a prompt to make an image of SpongeBob on another image generator. And then that image generator is actually not that good at making video. And the video generator is not that good at making images. And so they're like taking each tool, finding what it's good at, and then using that. And this might sound like a lot of work, but the way that they do it at this point, it's like a two minute long process. It like there's various steps, but each step is like 30 seconds long. And so the effect
Starting point is 00:20:10 is they're able to make a lot of these videos very quickly. And I'm sure like actually figuring that out took a bit of work. Right. But then once you have found like the process that seems to work, at least for now, then, as you say, you can really just start like really churning this stuff out. Right. And then I guess you don't even just need one kind of Instagram or TikTok account or for now, then as you say, you can really just start like really churning this stuff out, right? And then I guess you don't even just need one kind of Instagram or TikTok account or Facebook page that is kind of publishing these things. You can run a whole network of them pretty quickly without a lot of real work or effort. Is that the goal? Yeah. So a lot of people have 5, 10, 15 different accounts. And usually each of them will have a different niche is what they call them. So there'll be like a SpongeBob brain rot Instagram account, then there will be a basketball one like
Starting point is 00:20:51 Steph Curry and LeBron James are really popular right now because they found that these image and video generators are very good at making LeBron James and Steph Curry. I'm sure that will change at some point, you know, it will be a different celebrity or something else. So there's that. And then like, as you said, the person who figured that out is the one who made the ebook, and he's making $20 a month from each of these people in the discord in any given discord that I found, there's between like 100 and 1000 people all sort of making this and a lot of those people might only be making a few dollars based on sort of like the engagement goals that Instagram or TikTok or YouTube has. So the way that they're doing it is they're spamming these social media platforms, hoping to get invited to these bonus programs where Instagram will pay you a tiny fraction
Starting point is 00:21:40 of the ad revenue that Instagram makes if you are a consistent enough poster and you're able to go viral often enough. And so they're basically making like business accounts on each of these platforms. And on YouTube, it's like they're collecting a fraction of the ad revenue, the same as any other YouTuber. And that's sort of like where the economy comes into play. And so then the people who would be making a lot of money at this, are they actually making money from actually spreading this stuff on the networks? Or are they making money off of selling the courses? Or can both of those be lucrative endeavors? I get the sense that the people who are selling the courses are probably making more money.
Starting point is 00:22:19 But all of these communities, they like to brag when they have success. And so I saw a video of like LeBron James, Steph Curry and P Diddy in jail. I mean, it's pretty graphic video. I guess you can maybe imagine what it was considering it was like P Diddy in jail. And that had been seen like 20 million times on Instagram or something. And the person who made that took a screenshot of their business panel, because Instagram will show you how much money you make from any given video. And I think they made something like $100 from that. And so we're not talking like huge, huge, huge amounts of money. Like that was a very viral post and they made about $100. But if you have 15
Starting point is 00:22:56 accounts and you're doing that somewhat regularly, I've seen people post screenshots of how much money they're making and they're making $10,000 a month. How long that lasts for? I don't know. But I've talked to people who make this their entire job. It's like with any of these things, right? The algorithm can shift and all of a sudden what you were doing before doesn't work very well and you need to find the new thing, right? You know, we saw that with people publishing books on Kindle, direct publishing or whatever it was called, same as people publishing YouTube videos. So you're talking about these people making this kind of stuff. You know, some of them will be
Starting point is 00:23:29 making up to $10,000 a month, some of them far less. These tools, are they free? Because, you know, obviously we hear about the cost of the computation behind things like image generators and video generators. Do they have to subscribe to like certain tools to get access to this? Is it pretty cheap to do? What do you see on that side of it? That's actually a really interesting question, because initially what I saw people doing a year ago when it was just images was they were using free trials for things like Bing image generator, which uses OpenAI's image generation tools.
Starting point is 00:24:01 They would just like use it until they hit a wall and then they would make a new account and do that over and over again. That's one way to drive up the user numbers. It really is. Yeah. But the way that a lot of the best video tools work is that they use far more compute and therefore the free trials are either very limited or don't exist at all. And so this is one of the first times I've seen where people actually are subscribing and paying money for these tools. And so usually from what I've seen, it actually are subscribing and paying money for these tools. And so usually from what I've seen, it's like you can pay 20 or $30 a month and get something like unlimited video generation. But there's a few people who are like really power users and saying get the premium plan so that I can generate just like tons and tons of this stuff. So I would imagine just based on my read of the community, a lot of the people who are
Starting point is 00:24:45 doing this are pretty young or they are in developing countries. Like a lot of them say, I'm in India, I'm in the Philippines, I'm in Vietnam. And so this is sometimes not a small expense for these people. And so there is a lot of risk for the people doing it if they don't know what they're doing. I think like any other gold rush, so many people trying to make money in crypto, for example, will lose it all because they don't know what they're doing or because they get the rug pulled out from under them, of course. I guess then it also makes it feel kind of more professional, right? Or like if I am putting in this money, I need to really commit to like
Starting point is 00:25:16 trying to make this work to get something out of it. I'm actually pursuing something real. I'm not just like seeing some other people on a Discord server trying this out. And so I want to see if I can get somewhere with it too. Like there needs to be, it feels like a certain degree of commitment there. Yeah, that's absolutely right. The other thing though, is that there's a new video generator or new model like almost every other week. And so they're constantly looking for new tools and they definitely are biasing towards free and cheap tools. Recently, a lot of the biggest advances in this space have come from Chinese companies. And often when these things are first released, there is a grace period where it's like, okay, we're going to release it for free to get a bunch of users and then we'll
Starting point is 00:25:57 add our paywall later. So I have seen a lot of people trying to find free tools or migrating to tools that have just launched that might still be VC backed that are able to run like a huge loss just to try to lock in users. One of the really fascinating things about that, and you were kind of touching on this before, right, is the degree to which this is not even really trying to target people, right? It's not like what is this specific person or group of people going to be interested in, but kind of like how do we ride the algorithms? How do we try to figure out how the algorithm works to get these types of things to circulate,
Starting point is 00:26:29 to be seen, to be engaged with, to learn the kind of affordances of the platform and to try to take advantage of that for their own success, right? That's the biggest thing is that I've now been following this world for, I don't know, like 18 months or so. And the strategies shift constantly
Starting point is 00:26:44 because the algorithm is changing constantly because what happens to work changes constantly. Like I don't really see the traditional shrimp Jesus stuff anymore. Like that stuff is very quaint by this point. And that was, I mean, mind blowing for me personally, when I first saw it a year ago, we've moved so far since then. Only the people around in the early days of generative AI remember this. Yeah, the strategies are constantly shifting, basically. Obviously, we have seen these efforts by people to take advantage of platforms in the past, and that has often involved user-generated content. That's what so many of these platforms are actually built on, whether it's our posts, but beyond that, videos and the other things that people create in order to develop an audience, in order to get traction on these platforms. Some people are
Starting point is 00:27:29 building businesses on the platform. Some people are really trying to take off. And of course, we've recently had more of a pivot back to video, if you want to put it that way, where it feels like YouTube is changing. There's this effort to make Instagram reels and TikToks. Social media feels more video heavy than it has in a long time. But as you're going into these groups, and as you're seeing the kind of stuff that they are generating, the traction that that AI generated stuff is receiving, and how quick it is to put this together, what are the implications of that for the social media platforms that we use? But also like, I don't know, this whole notion of like a creator economy or whatever
Starting point is 00:28:05 you want to call that? That's a great question. And that's something that I've tried to think through myself and grapple with a little bit because initially when I first started writing about this stuff, I thought surely the social media platforms don't want all of this AI spam on their platform. Then Facebook wouldn't talk to me about it. I sort of expected in the course of reporting this stuff for Facebook to say, we don't want this on our platform, we're going to delete it, or we're going to minimize it in some way. And then I started listening to Mark Zuckerberg's earnings reports for meta. And they talk about how engagement is up. They talk about how time on site is up. They talk about how they're going to be releasing and already have released their own AI tools, their own image generators, their own AI generated profiles.
Starting point is 00:28:51 And they also talk about a future where their platform is made up of a mix of real content from people's friends and family and AI generated content, whether that comes from users who are making things or whether it comes from meta's own AI bots, more or less. And so now, what I believe the strategy to be is one, I don't know if the social media companies care whether the content is AI generated or whether it is made by human beings. I suspect they don't really care all that much as long as people remain on the website. But I also know that when we talk about the business models of these companies, they're focused on delivering the most specific types of ads to people, learning as much about you and your interests as is possible. And something that
Starting point is 00:29:42 generative AI allows for is hyper, hyper specific content. And so I used to think of social media as sort of like this infinite space where just there, you can never finish Instagram, you can never finish TikTok, you can scroll forever. But let's say my dog is behind me, he's a rat terrier. I like looking at videos of rat terriers. And if TikTok or Instagram were to learn that I really like rat terriers, who's to say that they're not going to just generate infinite videos of rat terriers doing cute things. And you can get super specific about someone's interests and then try to keep them on the site because you can essentially generate a never ending fountain of content
Starting point is 00:30:22 and hyper specific ads that go along with it. They're allowing advertisers to make really, really specific ads at this point using generative AI as well. It's like almost every social media platform has created generative AI ad tools where instead of making, let's say, five different types of ads and then going in the back end and trying to say like, OK, whichever one is performing the best, that's the one that I'm going to put money behind. You can now use generative AI on Facebook to have thousands of different versions of an ad. And those thousand ads will be delivered to a thousand different types of people, according to all of the demographic and interest information that Facebook has about you.
Starting point is 00:31:04 I'm not surprised that they're so craven. just to say like, whatever it is that people are looking at, as long as we're getting more ad revenue out of it, then it's fine, basically, especially in this moment when, you know, they're pulling back on moderation and like, you know, attempts to make their platforms like, I don't know, better for people or like, you know, not have all this kind of disgusting or extremist or hate filled stuff on there. They're like, you know, any of this is fine now, as long as line go up. And as long as we don't make Trump and the conservatives like mad at us, basically, because Zuckerberg's basically one of them anyway. Now, you know, that's like the advertising piece. But if we think about
Starting point is 00:31:38 all of these people who, you know, increasingly like make a living on these platforms or, you know, even just kind of engage with them and make videos to like, I don't know, impress their friends or try to see if they can get views just for fun or whatever. What does it mean for that whole ecosystem of things if actually sitting in front of your camera to make a video takes time and takes effort? And on the other hand, you can just go into these generative AI tools and basically churn out hundreds or thousands of things in the same time it would take to make a single video. It means that you are not just competing with the billions of people on these platforms who are creating things. You're also competing with generative AI. And generative AI is, I called it a brute force attack on the
Starting point is 00:32:21 algorithms because when I write an article, I might spend a few hours, a few weeks, a month doing it depending on how complicated the article is. And then I'll post it on social media and I'll hope that it goes viral or I'll just hope that people see it. You know, I'm playing the algorithmic lottery every time I post anything on the internet. Sometimes it works and sometimes it doesn't. If something works really well, I might consider like, oh, that type of article did really well. Let me try to write more about this topic and then I'll go work on it again. And it will take me another few days or another few weeks to do another type of article that's similar to that. And then I'll do it again. When I write an article and publish it
Starting point is 00:33:01 and no one reads it or performs really poorly, that means I spent a lot of time doing something that no one is going to see that no one's going to care about. That doesn't mean it wasn't important, but it means it didn't work on these platforms. But if I'm using generative AI, I can have like a really quick feedback loop where I'm making thousands of articles or thousands of images or thousands of vertical videos, seeing which ones work. And then I can iterate on that very, very quickly. This type of content performs really well in the algorithm because there are thousands of videos that no one saw, but that didn't take any effort to make.
Starting point is 00:33:38 And so it's not really that big of a deal that they failed, that they sort of like fell by the wayside. And so I think it contributes to a world where everything is sort of converging, where everything starts being really samey for a while, because people are able to find out what works in the algorithm, then everyone in this world starts making very similar stuff, and then it changes, and then they sort of all rush to the other part. And for individual like human creators, it means that your stuff might not get seen and you're competing with other content at a scale like you've never competed with before.
Starting point is 00:34:11 It feels like it like continues this perverse incentive that we often see on social media or that like the internet creates to a certain degree, right? Where we talked about in the past, like making television shows or movies or even like journalism, right? And traditionally kind of the expense that went into that, you know, you had these big institutions that were doing it. And then you kind of move more and more as, you know, these platforms proliferate. And as the tools get more accessible, you have bloggers, which eventually become more like independent journalists. And obviously, you have the YouTube creators, which you're sure some of them do like ascend into, you know, the Hollywood machine, but a lot of them are kind of churning away, kind
Starting point is 00:34:47 of doing their own thing. And obviously there are a lot of positive things that have come out of that, but there have also been the concerns about what it means for entertainment, what it means for journalism and continual funding of those sorts of areas that we think are socially valuable. And it feels like now looking at what generative AI does to that is it allows just those incentives or those pressures to be put on steroids. And you wonder like what the ultimate outcome ends up being and what gets destroyed
Starting point is 00:35:18 along the way. Yeah, I mean, this is definitely an evolution of an incentive structure that we've seen for a long time. You can even go back to things like aggregation, which is where, you know, like a blog will take someone else's reporting, quote it heavily, usually link back to it. But maybe they'll put on like a sexier headline. And so you have the journalists who did a really difficult investigation. And then you have like the Daily Mail that rewrote it in five seconds. That version goes viral. And then the original journalists may not get credit, may not get the clicks, may not get the financial success that comes with breaking big news. And that means that it becomes harder and harder to do that type of work because investigations take a long time and they're hard to do.
Starting point is 00:36:00 This is that on steroids, sort of, as you said, I think that the flip side of that is as more and more things become AI generated on social media, the are in good shape to weather this sort of storm because you like I have my favorite YouTubers that I'll go actively seek out. I have my favorite news publications that I'll go actively seek out. I follow those publications and those people because I like the people behind it. And I know that they're putting in a lot of effort to make something that's very different in this world of sameness. But I think that's going to get a lot harder to do. It'll be a lot of effort to make something that's very different in this world of sameness. But I think that's going to get a lot harder to do. It'll be a lot harder to build an audience from scratch for people who are just coming up now. And so I worry about that a lot. It makes you wonder if it like further cements or exacerbates this notion of like the superstar effect, right? Where there's obviously all this talk about how there's so much opportunity online, you know, anyone can be a blogger or a YouTuber or whatever, and, you know, potentially make it be your living.
Starting point is 00:37:10 But actually, what we see is, you know, there's often a small number of people who consume, have most of the success and take most of the income from that. And there's a much smaller number of people who can actually kind of make the living and whatnot. So yeah, I think that'll be interesting to see how it shakes out. But I wonder, picking up on those particular topics, as all of this stuff further disseminates then, you know, as you've been watching this and thinking about the implications of it, what does this mean for the information ecosystem that we already see struggling, you know, facing challenges at the moment, not to mention the broader ideas about culture and what the future of that is? How does all this kind of easily generated content that is now proliferating in so many of the spaces
Starting point is 00:37:50 where we engage and communicate, how does that shift things in those realms? I started really worrying about this during Hurricane Helene, when there were a lot of AI generated images of disaster that were going really viral. And people were saying, hey, these are AI generated, these are fake. And you had like the Republican National Committee, the chair of that saying, I don't care, because it feels real to me. And I think that there is a lot of that now, where you see something from the LA wildfires, that's AI generated, and you say, well, that's not real. But then you have a bunch of people saying, well, like something similar that probably happened. I think that we are already so siloed that I really do worry about sort of the base understanding of reality that's out there. But
Starting point is 00:38:35 at the same time, I also think our information ecosystem has been really, really broken for a long time that I don't know how much of a difference it actually is going to make. I think that social media, like broadly defined, has been a pretty bad place to get news for a long time. And this certainly makes it worse, in my opinion. You can only break it so much, you know, like I don't know how much more broken it can possibly get. And so I think that it's like where I look for hope is I think that people are starting to realize that social media is really, really, really broken for news. And so there certainly are many people who just log on to TikTok or log on to Instagram or Facebook and say, oh, I'm going to get my news here. But I think a lot more people are starting to specifically seek out information from people and publications that they
Starting point is 00:39:26 trust. And like our publication, for example, is getting a lot of traction in group texts where we publish a story and then someone shares it in their group chat with their friends. And then that person might come and subscribe to us. And the reason that I know that that's happening is because dozens of people have emailed me and said, I had never heard of 404 Media, but then my friend sent it to me in a group text and I group texts that I'm in from people who I know are like serious news consumers who are paying attention to what's going on, who have different levels of expertise about different types of topics. And then they'll say, I just saw this, this is really good, you should read it, or this is really important, or this is really scary, you should check it out. So I think that things like newsletters, group texts, RSS feeds, like some old technologies, like human
Starting point is 00:40:26 to human recommendation systems, hopefully will become a bit more powerful as algorithmic recommendation systems break down. Yeah, I certainly hope so too. And I feel like that fits into this conversation that I feel like has been growing for the past couple of years about this kind of shift away from these broader platforms into these smaller groups that are also not so public and where people can just kind of exchange these things. Obviously, sometimes very good things. Other times, it's, you know, kind of the extremists using that, the kind of Trump supporters and Bolsonaro supporters and all those types of people. But that's it. You know, nothing can be wholly always good and wonderful. Just picking up on what you were saying there. One of the things that I remember from our conversation last year was I believe that
Starting point is 00:41:08 you said that people were seeing these AI generated images on places like Facebook and kind of feeling a bit nervous that they might like fall for AI and people might realize that like you thought this AI image was real, you know, you're stupid, whatever. Do you still think that that is there? Or do you feel like that has shifted? You know, thinking about what you were just saying that time where people are like, oh, well, this natural disaster image might not be real, but something like that might happen anyway. I guess what do you see in that space? Like, are people still nervous to share and fall for AI stuff? Or has it become increasingly normalized?
Starting point is 00:41:45 I think people still worry about it. And I do think that being able to decipher what is AI and what is not is getting harder when the person creating the AI or, you know, posting the AI is trying to trick people like these tools are getting a lot better. You used to be able to just look at people's hands and see like, oh, they have six fingers or they have two fingers and be able to tell that it was AI generated. That is not the case anymore, really. Like sometimes it is, but these tools are getting a lot more realistic if your goal is to create really realistic AI that fools people. I still have people in my life who seem very worried about being tricked by AI. And so they do have their guard up all the time saying
Starting point is 00:42:25 like, Oh, I don't want to share something that's fake. I think it's still a problem. But then you also have a lot of people who say, I don't really care whether this is real or fake, as long as it reinforces my worldview. I mean, they're not saying that out loud necessarily. But they'll say like, this is my feeling like this picture captures the emotion or the vibe that I'm trying to portray. I don't care whether it's real or not. And I see a lot of that in these guides about how to make AI. A lot of them used to say don't label it as AI because you want to try to trick people. And now they basically say it doesn't matter either way because it might go viral either way. It doesn't super matter. I feel like it's not as much as before, but I feel like I definitely still see people who I feel like should know better than to do it. Just sharing
Starting point is 00:43:09 around AI generated images to like make a point, even if I think they have like, you know, generally good and agreeable politics. For me, it's like, why are you engaging in this? Why are you doing this? I guess it can be easy or what have you. You know, you know, the visual can help to make a point and it's a lot easier to do it through generative AI than other means, I guess. It is interesting, because there was a while where like a lot of people on substack, other places to would decide to use generative AI to illustrate their journalism or to illustrate their blogs. And for a while, it's like, Oh, cool, we don't have money for a Getty Images, like a stock image library. I can't afford to create bespoke art for each individual article that I'm writing. And people were doing that
Starting point is 00:43:50 for a while. And then there was a big backlash to generative AI. And people stopped doing that. Some people still do. Yeah. But it's interesting that the sort of aesthetic of generative AI became a shorthand for like, this is not trustworthy, or this is lazy. At least that's what I perceive. And a lot of people have like a real knee jerk reaction to not engaging with anything that is generative AI. And I understand a lot of that has to do with the fact that it's trained on copyrighted content, and the fact that it's bad for the environment, and all of that sort of thing. Like I fully understand that. But I do wonder sort of like how long that will last because my personal feeling is like, it is tricky, but I do not think that we are going
Starting point is 00:44:33 to be able to like wish this technology away at this point. Like I've seen too much to now say, like, this is not going to be a thing. I think it's going to remain in existence for some purposes and people are going to use it. And it's going to become and already has become this big like culture war. Like, do you use it or do you not? And if you do use it, what do you use it for? Like, where is the line? Like you say, especially when you have people who are finding ways to make money off of it, right? And to influence discourses and all these other goals that they might have. I think that this is going to continue to be a really important thing to watch. And I'll be looking forward to what you continue to find as you follow it. And I'm sure you'll be on the show in the
Starting point is 00:45:12 future. And we'll talk about that again. But before we close off this episode, I wanted to ask you about another article that you wrote with your colleague, Emanuel Mayberg, who I think is the only 404, you know, kind of at least main staff member that hasn't been on the show. And I really need to, or, you know, co-founder or whatever. I really need to change that sometime soon. But it was a really interesting article for me because it really aligned with some things I have been thinking lately. And essentially, you know, this article was arguing that big tech went for Trump and what they got doesn't seem to have really been what they were hoping to get. So do you want to kind of expand on that a little bit and talk a bit about what you were thinking as you were writing that and what Trump has turned out to be versus what some of these tech companies and CEOs that were
Starting point is 00:45:53 backing them seem to hope he would be? So this article, like the idea for it came from Emmanuel. He did a lot of the heavy lifting and then I added some points to it. So he thinks a lot in terms of acceleration, deceleration. And I thought it was a very smart angle that he took. But basically, there was this idea during the Biden administration that Biden and Kamala Harris and Lena Kahn and the Democrats more broadly were stifling innovation with regulation, like Lena Kahn was investigating a lot of big tech companies, there was a bunch of lawsuits, you know, some of them continuing on to right now, there was this idea that China was going to pull ahead, especially in AI, because Washington broadly defined was putting up all
Starting point is 00:46:37 of these artificial regulations that were preventing them from going like ham on this technology, like basically like deploying it to its full effect and building this utopian future of, you know, AI generated luxury for mostly for them, but for everyone. And then the other thing that was happening was there are a lot of Silicon Valley startups that have not IPO'd yet that are waiting to IPO. And so like the number of IPOs over the last few years has
Starting point is 00:47:05 gone way down. There's been only a couple major like tech company IPOs and a lot of them have kind of flopped. And so there was this idea that they would elect Trump. Trump would get rid of a bunch of regulations, cut taxes, let big tech do whatever they want. And then the stock market would go up. People would IPO. We would beat China on AI, and we would like the tech oligarchy would fully take place. And what has happened instead is Trump has done the tariffs of uncertainty that has made it impossible for any tech company to IPO at this point because the market's been falling and it's been really volatile. And the worst thing that you could possibly do as a tech company is IPO and have it flopped immediately. And so there were all these companies that were kind of like edging in IPO and now they're still kind of stuck. Beyond that, the tariffs have added such uncertainty that
Starting point is 00:48:06 it's almost worse than any sort of regulation that Biden or Kamala Harris could have put on big tech because the tariff situation is so complicated and so ever changing that these tech companies can't make big investments until they have a sense of what the rules are going to be. And Trump has now like sort of exempted some big tech products from the from the tariffs, although who knows how long that lasts for who knows what's going to be covered, so on and so forth. But it's still not the switch to like, you know, so basically, it's like, you know, the brakes are on like, instead of like, going to the moon, we're just kind of sitting here waiting to figure out what's going to happen with the tariffs. And Silicon Valley is very upset about that.
Starting point is 00:48:49 It was really interesting to see them get so behind Trump. And then as this kind of trade war continued to accelerate in recent weeks, seeing Bill Ackman and Elon Musk and some of these other folks start to speak out publicly about some of their issues with the trade element in particular because of how it hits Silicon Valley. But I think that point about the IPOs is so important, right? Because this is how so many of those investors make their money. It's not often so much from the profits of the actual companies, but from having their early investment turn into something bigger when the company IPOs. And so if you're not having the IPOs, the kind of cycle and the model that the industry is built on is not working properly. And if the
Starting point is 00:49:28 market is going down, then obviously that can't happen, right? It was really telling. I didn't even realize this. Emmanuel is the one who realized this. But Marc Andreessen was tweeting quite a lot about Trump and boosting Trump. And he hadn't tweeted in like a week after the tariffs were initially announced. And I don't know if like a week after the tariffs were initially announced. And I don't know if he's ended up saying something about the tariffs, but he was completely silent on them. And at least publicly, of course, surely he was trying to do stuff behind the scenes. But I think that's very interesting because these people were like really, really constantly talking about MAGA and Trump is, Trump is gonna take us to the
Starting point is 00:50:05 moon, etc. And then the tariffs happen, and they all kind of shut up for a while. It took Musk like a few days to say anything about the tariffs. And so I would imagine that they were trying to put pressure on Trump behind the scenes. And then a lot of them did end up saying like, we don't think the tariffs are good for business, more or less. But I think a lot of them were kind of scared of upsetting Trump by coming out publicly against them from the get go. Yeah, you know, you saw Tim Cook kind of working in the background to try to get his exemptions. But even beyond that, you see the potential restrictions on important inputs for the chip production supply chain, if that's going to be moved back to the United States. And of
Starting point is 00:50:43 course, the general attack on the Chips Act itself, which, you know, was the Biden era law to provide a bunch of subsidies, basically, to try to bring chip manufacturing back to the United States in a much more concerted way. But then on top of that, too, you know, we still have these aspects of the Biden agenda, as you were saying, that are moving forward, the antitrust cases, of course, you know, as we talk, I believe it was only a couple days ago, we got the ruling that Google has been found to be a monopoly for the second time. Of course, Mark Zuckerberg is on trial in the case against Meta. But it does kind of feel like, yeah, these things are moving forward. But like, what does the Trump administration actually want out of them? Is it like more competitive markets and stuff as you know,
Starting point is 00:51:20 we would hear from Lena Kahn and the Biden administration? Or is it to use whatever power that is gained out of these cases just to try to force the tech industry to more align with them? It's going to be really interesting, I think, to see how this relationship between the tech industry and Trump, after the tech industry went so hard to align with him, continue to develop, because it doesn't seem that Trump is just going to do everything that they want, right? Yeah, I mean, they still definitely seem to be backing him. But I'm very curious how long that lasts, especially if we do start to see increased unemployment, increased prices, things like that, because these companies really rely
Starting point is 00:51:56 on having a robust economy to finance what they're doing. And then the other thing is, if we start seeing other countries retaliate against American companies, all of these companies are extremely reliant on America being dominant globally and being able to sell their products internationally. But more importantly, for like these social media companies, making sure their websites aren't blocked, things like that. And I can't tell if the temperature has gone down slightly on the tariffs or if just the immigration and ICE situation is so dire at the moment that they've taken a backseat for right now. But it's going to be very interesting to see sort of like how this all shakes out. I completely agree, right? Obviously, we're all watching it
Starting point is 00:52:35 very closely, but it's not going to become any less consequential in the weeks and months to come. Jason, it's always great to speak with you. I always learn so much from reading the work that you and your colleagues at 404 are doing. Thanks so much for taking the time to come on the show again. I really appreciate it. Thanks so much for having me. It's always really fun. Jason Kebler is a co-founder of 404 Media and co-host of the 404 Media podcast. Tech Won't Save Us is made in partnership with The Nation magazine and is hosted by me, Paris Marks. Production is by Eric Wickham. Tech Won't Save Us relies on the support of listeners like you to keep providing critical perspectives on the tech industry. You can join hundreds of other supporters by going to patreon.com
Starting point is 00:53:12 slash tech won't save us and making a pledge of your own. Thanks for listening and make sure to come back next week. Thank you.
