No Priors: Artificial Intelligence | Technology | Startups - The Shifting Value of Content in the AI Age with Cloudflare CEO Matthew Prince

Episode Date: August 7, 2025

Cloudflare has spent nearly fifteen years making the Internet faster, more reliable, and more secure. So now that AI systems are changing the way we interact with the Internet, Cloudflare wants to help level the playing field for content creators. Sarah Guo and Elad Gil sit down with Matthew Prince, co-founder and CEO of Cloudflare, to discuss the evolution of the internet from search to AI, including Cloudflare’s role in facilitating that shift. Matthew talks about how AI assistants are changing the shape of the Internet, the problems Google created by making traffic the arbiter of content value, and how he sees Cloudflare’s part in facilitating the new content marketplace for the mutual benefit of creators and AI companies. Plus, a look towards how agentic infrastructure may unfold in the near future.

Sign up for new podcasts every week. Email feedback to show@no-priors.com

Follow us on Twitter: @NoPriorsPod | @Saranormous | @EladGil | @eastdakota | @Cloudflare

Chapters:
00:00 – Matthew Prince Introduction
00:37 – Cloudflare’s Role in Securing the Internet
02:08 – The Road to Cloudflare’s Dominance
03:20 – The Internet’s Shift from Search to AI
06:34 – Role of Agents and Content on the New Web
09:44 – Reshaping the Content Market Online
13:05 – De-emphasizing Traffic as a Proxy for Value
18:04 – Will We Run Out of Quality Human-Generated Content?
20:01 – Scaling the Value of Content in the AI Age
22:32 – Cloudflare’s Approach to Inference
24:55 – How Cloudflare Responds to Market Demand
26:04 – Open vs. Closed Models
27:21 – Path to the New Marketplace for Content
30:58 – Advice for Content Creators
32:47 – Exploring the Timeline for Running Models Locally
40:07 – The Future of Agentic Infrastructure
44:52 – Conclusion

Transcript
Starting point is 00:00:00 Matthew, thanks so much for being here. Thanks for having me. So I want to get right into the juicy topics, but make sure our listeners understand, like, the scale and the role of Cloudflare first. So correct me if I'm wrong on any of this: $66 billion market cap company today, about $1.8 billion in trailing revenue, and then, like, the biggest CDN by far, with a bunch of different products now, in security in particular? What else should our audience understand about the role Cloudflare plays? Not to nitpick on one thing, but we've never really thought of ourselves as a CDN. We started out very much as a security company. The whole thesis was, could you put a firewall in the cloud? We saw that servers were going to the cloud. We saw that software
Starting point is 00:00:49 was going to the cloud. It seemed inevitable to us that the networking equipment would go to the cloud. And the big objection that everyone had was that you were going to slow things down. And so we worked very hard to figure out how we could not slow anything down. And the goal was just to get back to parity. It turned out we were a little too good at our jobs and everything got a lot faster. And so, yes, we've ended up competing in the CDN space. But really what Cloudflare is, is what the network should have been, what the Internet should have been, had we known in the '60s and '70s how important it was going to be. So how can we make it faster, more reliable, more secure, more efficient, more private? And that fundamentally is what we're working on every day at Cloudflare.
Starting point is 00:01:27 How long has it been since you guys started the company? We launched in September of 2010, so we'll be coming up on our 15th year in September of this year. And I don't think there's a way to ask this question without somewhat trivializing the journey, but how did you become so dominant? I don't know. I mean, I think we just focused on, how do we do the right thing for our customers? How do we solve the problems that were there? And, you know, at some level the story of Cloudflare is that we have been customer zero along the entire journey. So everything started from, could we take a firewall, put it in the cloud, and how would we get the data to populate that firewall?
Starting point is 00:02:09 We had to have a free service. Once we had a free service, all of a sudden we had to figure out how to scale enormously across millions of customers in an efficient way. That meant that you had a whole bunch of weird stuff that was using us. We got attacked from every which direction. We had to build a public policy team in order to deal with those issues. We had to build our own security. Someone almost hacked or stole our domain at some point as a way of hacking into us.
Starting point is 00:02:36 So the next thing you know, we built our own registrar. And so to some extent, Cloudflare has been about, you know, start with a relatively simple idea, make it as broadly available as possible, and then solve all the problems that become sort of inherent once you've done that. Now you're in the position that I believe you've described as, like, the Internet's traffic cop. I think a lot of people feel, with the sucking sound of attention toward AI assistants, that the shape of the Internet is changing. Before we get to your point of view on what to do, what is your prediction for what's happening? So no matter what, the dominant kind of value creation model of the last 30 years of the web has been search. Search drove everything and drove all of what you did online. Entire industries grew up around that. And there were really three ways that you could drive value on the web in the past: you either sold a thing, whether that was a subscription or a product or something else; you sold ads against some content; or, and I didn't say the business model, I said the value creation model, because the third part is really important, a lot of people just created content for the ego of knowing
Starting point is 00:03:50 that other people were reading it. The old adage is there are two reasons why people create content: to get rich or to get famous. A lot of people are just doing it to get famous. And that's a lot of what drove it. I mean, that's what drives Wikipedia. That's what drives a lot of the content creation that is on the web.
Starting point is 00:04:06 I think that the web is shifting now to a new interface. It's shifting away from search and it's shifting to AI. And we can see that through the trends in terms of Google usage. We can see that in terms of, like, our own usage, where more and more people are turning to these AI systems where they used to turn to Google. Even Google itself is sort of morphing into an AI company, kind of in their interface, before our eyes. And as we do that, the natural thing that's going to happen is we're going to consume derivatives rather than consuming the original content itself. A study that just came out from Pew Research says that if Google puts an AI overview on the top of search,
Starting point is 00:04:49 it is much less likely that people click on the links. And that seems sort of like a, duh, but the data that we have also substantiates that and shows that, compared with 10 years ago, it's become 10 times harder for the same piece of content to get a click from Google than it was before. And that's because of the answer box, that's because of AI overviews, it's because the search interface has gone there. And that's the good news for content creators. In the case of someone like OpenAI, it's 750 times harder than it was with the Google of old.
Starting point is 00:05:27 In the case of Anthropic, it's 30,000 times harder than with the Google of old. And so what I worry about is that if the value creation model of the web has been all about how do I get traffic, the new interface of the web isn't going to send you traffic. And if that's the case, if content creators can't get value from selling a thing or a subscription, selling ads, or just the ego that they get knowing someone is reading their stuff, I worry that people aren't going to create content. And that's going to not only starve the web, but it's actually going to starve even the AI companies that are using that content as effectively the fuel for their engines. How do you think that evolves? Because if you look forward, the other thing that people are talking about a lot right now is agents and the fact that you're not only getting information through an AI, it'll actually take actions on your behalf. So the time you actually spend on the web is going to go down effectively, or at least you're going to be dealing with one interface, which is this agent that goes off and does things in the background for you. So do you think that ultimately the AI companies will start paying for content?
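The multiples quoted above can be reproduced from crawl-to-referral ratios, i.e., how many pages a bot crawls per visitor it refers back. The absolute ratios below are hypothetical placeholders, not Cloudflare's published figures; only the relative multiples match what is claimed in the episode:

```python
# Hypothetical crawl-to-referral ratios: pages crawled per visitor referred.
# Only the relative multiples (10x, 750x, 30,000x) track the episode's claims.
baseline = 2  # assumed ratio for the Google of ten years ago
current = {"Google": 20, "OpenAI": 1_500, "Anthropic": 60_000}

# How much harder it is for the same piece of content to earn a click today
multiples = {name: ratio / baseline for name, ratio in current.items()}
print(multiples)  # {'Google': 10.0, 'OpenAI': 750.0, 'Anthropic': 30000.0}
```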
Starting point is 00:06:25 Do you think there will be other ways to monetize it? Do you think a completely different model starts to emerge in terms of how the web works? There are going to be different solutions for different pieces of the equation. At some level, agent commerce is going to be one of the easiest of these to solve, where there are going to be certain companies that say, listen, we'd love your agent to come and, you know, buy a widget from us. There can be others that right now are aggregators of information, where the agents can actually disintermediate or disaggregate that content. And they'll be actually quite threatened by that.
Starting point is 00:06:58 But ultimately, I think commerce plus agents and AI is probably pretty good. So separate commerce from content, and content is what you're worried about. Content is a different piece, where the problem right now is that the default assumption has been that you get content for free. And it's actually interesting: a lot of the content creators are looking to the law as the solution to this. And generally, I'm a recovering law professor, so, you know, pardon me for going down this weird tangent.
Starting point is 00:07:25 But it's actually, I think, really interesting, which is, in copyright law, the more that you are a derivative as opposed to a direct copy, actually the safer you are, the more likely you are to fall under fair use. And we've actually seen a number of court cases, two of which happened here in California within a week of each other, one of which basically said AI use of content is fair use, and the other of which said it wasn't. There's going to be a whole bunch of things around that. But probably the more sensible one is the more that you're creating derivative content, the less likely it is to be a copyright violation. But, kind of the opposite of that, the more that you're creating derivative content, the more likely it is that someone isn't going
Starting point is 00:08:11 to go back to that original source. And so I actually worry that a lot of the content creators are focused very much on what the law says today and on copyright law, which may not come out in their favor because it is actually protecting those derivative uses. I think what we have to figure out is probably a different business model where content creators get compensated. And I think the good news is, as you talk to the big AI companies, and 80% of the major AI companies are Cloudflare customers.
Starting point is 00:08:38 So we have a good relationship with them. We talk to them about it all the time. What they all say is, you're absolutely right, we should be paying for content. The devil is in the details, though, because what they are all desperately worried about is how do you make sure that it's a level playing field? They all believe their technology is the best. They all believe that on a level playing field they're going to win.
Starting point is 00:08:56 But they're really worried: well, if Google still gets content for free, but we have to pay for it, that doesn't seem fair. Or if I'm paying for it, but somebody else gets it for free, that doesn't seem fair. So what we've been really working on is how we can create that really level playing field. And we think if that's the case, AI companies will actually be quite willing to pay for that content. What are the approaches you've been taking to try and level the playing field here? So in order to have an economy, you have to have a market. In order to have a market, you have to have scarcity. Like, no markets exist without some level of scarcity. And the problem right now with content is that there is no scarcity. They're giving it away for free. And so we spent the last year working not only across Cloudflare's existing customers, but going across the entire publisher ecosystem, writ large, not just print publishers, but video, audio, music,
Starting point is 00:09:41 you know, film across the entire spectrum, and saying, you know, we think that there's a problem with AI, that it's starting to actually take value and not give you anything back. And across the board for every publisher from the Associated Press to Zip Davis and everything in between, we've seen just incredible resonance with that message where they're all saying, you're absolutely right. Our business is getting astronomically harder over just the last six months. and we're seeing less and less of our existing business model working. So we need to do something about that.
Starting point is 00:10:11 And so what we did on July 1st was we announced what we called the content independence day, where you could actually have independence from these AI companies. And we, for free, across all of our customers, where they paid us or they didn't pay us, we started blocking by default any training that was being done by any AI companies that was there. And it was really important that we focused on that because that meant that we could treat Google the same as everyone else.
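A minimal sketch of the kind of default-deny crawl policy described here. All function names, purpose categories, and behaviors below are hypothetical illustrations, not Cloudflare's actual implementation:

```python
# Hypothetical per-purpose crawl policy: block AI training by default for
# every operator (Google included), gate AI inference behind payment terms,
# and let ordinary browsing and search indexing through.
def crawl_decision(declared_purpose: str, training_allowed: bool = False) -> str:
    """Return 'allow', 'block', or 'payment-required' for a crawl request."""
    if declared_purpose == "ai-training":
        return "allow" if training_allowed else "block"
    if declared_purpose == "ai-inference":
        return "payment-required"  # could be signaled with an HTTP 402 response
    return "allow"  # e.g. "search-indexing" or a human's browser

print(crawl_decision("ai-training"))      # block (the default described above)
print(crawl_decision("ai-inference"))     # payment-required
print(crawl_decision("search-indexing"))  # allow
```

A site owner opting in would flip `training_allowed`, presumably in exchange for compensation negotiated through the marketplace discussed later in the episode.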
Starting point is 00:10:34 Now what we're doing is we're working with the IETF and other standards organizations to say, let's define how you have to announce what your crawler is doing as it behaves online. And we're really encouraged by the early work that's there. As that happens, we think we'll be able to set in place really fine-grained permissions for content creators, or anyone else, to say, you know, humans can get my content for free, but robots have to pay for it, and then figure that out. That first step of creating scarcity is what you have to do in order to figure out what the market is. And then after that,
Starting point is 00:11:09 I think figuring out the market, that's going to be what takes some time, and I think we're still experimenting with different approaches. That's super interesting, because if you look at the sort of search precedent, we had a robots.txt file,
Starting point is 00:11:19 and that's where you kind of specify whether a search engine could come and crawl the content. And it sounds like you're really extending some of those concepts on through to the AI layer. Yeah, that's right. And I think, you know,
Starting point is 00:11:29 robots.txt was a relatively simplistic and blunt tool, where today, you know, it basically says you can either allow something or disallow something. And you can basically do it either, you know, on a directory on your site or to the entire site. But there's not that sort of fine-grained control. And so we think robots.txt is sort of like the street signs that are on the road; a lot of people don't necessarily follow the speed limit, though. And we actually see plenty of examples.
Starting point is 00:12:03 In fact, some really prominent companies do some very, very, very shady things, where they basically say, absolutely, we follow the rules of the road. But when push comes to shove, if it turns out they're blocked, then all of a sudden they're doing a bunch of things that look not dissimilar to what we see
Starting point is 00:12:17 Russian hackers or Iranian hackers do in order to try and get around those blocks. At Cloudflare, we're really good at identifying that and stopping it. And we're also really good at embarrassing those companies that do that. So, you know, watch our blog, and I have a feeling that some prominent AI companies that are misbehaving are going to get called out pretty soon.
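The binary allow/disallow semantics of robots.txt described above can be seen directly with Python's standard-library parser. The bot names and paths below are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# A typical robots.txt: each rule is a blunt allow/disallow over a path
# prefix, with no way to express "humans free, robots pay" or per-purpose terms.
robots_txt = """\
User-agent: ExampleAIBot
Disallow: /

User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("ExampleAIBot", "/articles/1"))  # False: blocked site-wide
print(rp.can_fetch("OtherCrawler", "/articles/1"))  # True: allowed by default
print(rp.can_fetch("OtherCrawler", "/private/x"))   # False: directory-level block
```

Note that this is purely advisory: nothing in the protocol verifies the caller's identity or enforces the rules, which is the "street signs without a speed trap" problem Prince describes.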
Starting point is 00:12:41 to training work out, what do you think that does to the landscape of, like, the types of content companies that win or lose? Like, I can't imagine it's going to look like it does today because there's some notion of, like, incrementality. I mean, this is going to take us down a little bit of a tangent. But I think a lot of the things that are wrong with the world today are ultimately Google's fault. They're not the worst actor. I'm glad we started with we're all friends here, and then it's all Google's fault. Yeah, it's all Google's fault. I think Google has been a net force for good in the world.
Starting point is 00:13:12 I think that they actually believe in ecosystems. I think they're trying to do the right thing. But they taught everyone, if they're content creators, to worship a sort of deity, which is traffic. And that was the proxy for value. It was, how do you generate the most traffic? And that led to Facebook as the next iteration. It led to TikTok. It led to folks like the Huffington Post, which would literally write a piece of content and then A/B test headlines, trying to figure out which one generated
Starting point is 00:13:40 the largest cortisol response to get the most clicks. Or, if you guys remember, Demand Media. Demand Media. I mean, BuzzFeed. I mean, there's a whole bunch of folks that were just trying to figure out, how do we actually stimulate rage and get people stirred up so they'll click on the thing, so that I can either sell them a subscription or sell ads against a piece of content. And again, I think that that led to a lot of me-tooism.
Starting point is 00:14:03 that led to a lot of people writing the same story with sort of a slightly different bent. I don't think it led to a lot of us actually figuring out how to advance human knowledge. And so what I think is interesting is, if you think about the AI companies en masse, they're a relatively good approximation for the sum of human knowledge. Not perfect, but probably the best we've ever had, right, where they come together. And the reality is that, in aggregate, they're like a giant block of Swiss cheese, where, yeah, there's a whole bunch of cheese there,
Starting point is 00:14:38 but there are holes in the cheese as well and they're very algorithms as they come across a piece of content they prune off that content which is already kind of part of the meaty part of the cheese whereas the parts that are in holes are actually super valuable to them and so I actually think
Starting point is 00:14:59 that if we could create a market where you're rewarding content creators not for who stimulates the most cortisol, but who fills in the holes in the cheese. And you actually pay people for that. That is a better outcome. And that's actually advancing human knowledge. And that's really amazing if we can do that.
Starting point is 00:15:18 Isn't that kind of, arguably, what companies like Mercor or Surge or Scale do, as they do data labeling and hire human experts to basically fill out content areas for AI companies? So do you basically view this as, like, a distributed model of that, or sort of a web-based one? I've spent the last year talking to a lot of people. One of the more interesting conversations that I had was with Daniel Ek. I flew to Stockholm and saw Daniel. And I think there's really nobody who has compensated content
Starting point is 00:15:45 creators at scale like Daniel has. And it's amazing. The day before iTunes launched, the music industry was about an eight to nine billion dollar industry. Spotify on its own today pays out over $10 billion a year to the music industry. And so done right, these can be very much pie expanding. You know, there's plenty of cheese to go around if we do this, if we do this correctly. And the thing, I remember he was telling me a story, which I thought was sort of in the same vein, which was Spotify actually looks at queries that people have run that they don't have good answers for, where there's, you know, somebody searches for, I don't know, I want a disco song about like how fun it is to dance with your dog, right? I don't know. And if somebody searches for that and it doesn't get it, they actually publish that list
Starting point is 00:16:36 back to content creators and musicians. And there are several musicians that are making literally tens of millions of dollars a year just writing songs for what people care about listening to, where there is unmet demand. And so I think that it's not exactly the same as data labeling. I think it's actually saying, like, for the first time in human history, we can very accurately identify where there are holes, through the very nature of the pruning algorithms that these LLM models are trained on. And if we could then basically resurface that and say, hey, this is something we don't have enough articles about, you know, the
Starting point is 00:17:12 wing-toed ferret that people will actually go out and do that. And that if we can then compensate people on that, that actually is much better than yet another article about what's happening in Washington, D.C., yet another article about, you know, how much San Francisco is on decline or on the rise. I mean, again, that that's not actually adding to human knowledge. That's just, you know, rage bait effectively. Yeah. How do you think that plays out as, so if you look at some of these labeling companies that also then hire experts in to sort of provide some of the at least expert content that you mentioned, if you look at some of the models like Med Palm 2 from Google,
Starting point is 00:17:46 which is a couple years old now, it outperformed the average human physician in terms of output, if you rated its output against people's. At what point do you think we've run out of good content from people? In other words, there is some limit. I don't think that's true. I mean, there's always going to be people running new experiments and new tests and finding new things and new discoveries. And yeah, maybe we can imagine some distant future where it's all robots that are doing this in the labs. But that's a long way off. And so in the meantime, you know, I think we can do that.
Starting point is 00:18:21 My Black Mirror version of the future, though, is actually one where we're not going to get rid of journalists. We're not going to get rid of scientists. We're not going to get rid of researchers. You're going to still need that work. What I worry about is that, if we don't figure out how to broadly compensate content creators who are independent, we actually go back almost to the time of the Medicis. The web had historically been this incredible sort of distributor of value creation and knowledge creation.
Starting point is 00:18:50 You could imagine a world in which all of a sudden you have five big AI companies. You have the conservative one and the liberal one, the European one and the Chinese one. And they all actually hire and run their own teams of journalists, researchers, academics: the experts that fill in the sort of holes in their cheese. And again, it's not too hard to imagine that in some not-so-distant future that becomes a thing. What I hope is that we figure out a way to compensate independent content creators and share that knowledge across all of them, as opposed to creating these silos of knowledge behind each, you know, variation of an LLM.
Starting point is 00:19:31 Do you feel like the large labs agree with you on how much can be paid out to creators to fill those holes? Because, you know, you look at the scale of ad revenue, even ignoring things like commerce and whatever from the open web. But, you know, in aggregate, what's been paid to labeling companies is, like, $10 billion or less? We're, like, really far off if people are starting with a very large free base today. Well, again, I'm not sure labeling companies are the right model. Or is it, you know, GPU spend, or is it employee spend?
Starting point is 00:20:09 You know, I actually think, first of all, the amount that's paid to labeling companies will go up. The amount that's paid to employees and to GPUs is continuing to go up. And so the question is, how much value is content actually giving you? And the answer is, you know, somewhere between zero and 100 percent, right? And is it more or less than another unit of GPU time? I mean, there's a market that can figure that out. From that first-principles view, I see it. Yeah.
Starting point is 00:20:39 And so there is some value, which is there. I think the mistake that a lot of content creators made was they actually did deals that don't scale as the business models of the AI companies scale with them. So if you do a deal that's, like, $20 million and you get all my content, I mean, that's an incredibly naive deal, right? It might seem like a great deal to the content provider day one, but it's exactly the opposite of what you want to do. What you really want to do is say, okay, if you imagine that there were a way, for all of the content that is available, to say here's how much value that is creating, that's going to be some percentage of whatever the subscription fee is for your AI model, or if you're an ad-supported AI in the future, it's going to be some percentage of that. And then as the AI companies grow, which will inherently mean the ad revenue shrinks, you share in that upside as your downside gets diluted. And I think, you know, there's still going to be advertising out there. There's still
Starting point is 00:21:41 going to be subscriptions. There's still going to be tentpole content that people just have to consume, even if they're AIs. But what you also want to do is allow that content to get into the AI systems, and the content creators should get compensated for that. And again, if there's scarcity, a market's going to determine how valuable that actually is. One of the things that Cloudflare is known for, to your point, is really speeding up web pages on the internet. And as we shift from serving pages to models being run, you're kind of shifting from a world of, like, caching and serving pages to, like, inference. How do you think about that in the context of Cloudflare, or some of the directions that you all are going?
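The flat-fee versus revenue-share point above can be made concrete with some entirely hypothetical numbers: a one-time payment is fixed, while a percentage deal scales with the AI company's business:

```python
# Hypothetical comparison of a one-time flat-fee content deal vs. a revenue
# share, as the AI company's subscription revenue grows. All numbers invented.
flat_fee = 20_000_000   # the "incredibly naive" $20M all-you-can-eat deal
share_rate = 0.02       # assumed 2% of subscription revenue

# Assumed subscription revenue for the AI company over three years
revenues = [500_000_000, 2_000_000_000, 10_000_000_000]

flat_total = flat_fee                            # paid once; no upside as usage grows
share_total = sum(r * share_rate for r in revenues)  # 10M + 40M + 200M

print(flat_total, share_total)  # 20000000 250000000.0
```

Under these assumptions the revenue-share creator ends up with over 10x the flat-fee creator, which is the "share in that upside" argument in a nutshell.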
Starting point is 00:22:20 Well, I mean, I think we leaned in heavily. I mean, nobody remembers this, but back in 2020, we partnered with this, you know, graphics chip company in order to put GPUs at the edge of our network, in order to allow people to do inference. And by the way, it was crickets. Like, we launched this product and no one responded. There wasn't a single, like, sales inquiry to use it. And so we apologized to the partner, who happened to be Nvidia, and we kind of went on our way.
Starting point is 00:22:49 Now, four years later, the market was ready for it. We basically just took the same press release and issued it again, and then, you know, it's taken off like gangbusters. I think that we've leaned in heavily: we believe that a lot of inference is going to happen on your end device, but there will always be some model which is too big or too resource-intensive. And in that case, the next best place to run it is going to be inside the network, at the edge.
Starting point is 00:23:13 And that's what we're delivering. More importantly, I think that if you look at whether it's MCP or whatever the next protocol is that connects agents to, you know, services and allows these things to connect, inherently, because of how much of the internet we sit in front of, they have to pass through us. And so we're investing heavily behind those protocols, making sure that they have all of the security, the underlying rails and payments infrastructure, and everything else that you need. And my hunch is that what we solve in the content space,
Starting point is 00:23:52 and the rails that we create for the payments there, very naturally then become one of the models for agent-to-agent payment infrastructure, over MCP or whatever the final protocol becomes, to be able to handle that as well. So Cloudflare fundamentally is a network. And I remember when cryptocurrency and blockchains and everything were getting big, people were like, aren't you worried about this? And I'm like, they still need a network. As AI gets big, it still needs a network. And so I think we sit in the center of this.
Starting point is 00:24:19 And as you especially have more agent-to-agent communication, I think the network actually becomes more and more important. Is there a bet you're making on like what changes in terms of models or, you know, compound systems that like drive more model traffic to the architecture you described where it's in network versus, you know, in large data center today? I wish I could say we were that strategic. I mean, I think we go to wherever the market demands that we go. Including build a neocloud.
Starting point is 00:24:50 I don't even know what a neocloud is, but sure. So I think we're fundamentally always just trying to say, how do we respond to whatever either our own team needs as customer zero or what our customers need? And the fact that 80% of the AI companies are using us, they are constantly pushing us to can you do this, can you do that? And I think our team has been uniquely good at being able to execute and innovate and stay at least up with whatever the trends are. And that's, again, I think I'm proud of the fact that we have ended up in a lot of these conversations and that so much of the Internet does flow through us that one way or another, I think that we end up being in the center of a lot of these transactions. Does that imply any particular belief around like open or closed models as people continue to develop capabilities? You know, we have closed models that run on us. We have a lot more open models that run on us.
Starting point is 00:25:47 You know, we have closed models that run on us. We have a lot more open models that run on us. We have historically been a company that believes very much in open source, and most of the things that we build internally, as long as we can, we try to open source all of that technology. And so I tend to be in the pro-open-models camp. You know, we work very closely with the Meta team on Llama and everything that they're doing. But again, I think there's going to be different flavors of this. And, you know, again, we're happy to have customers on either end of that spectrum. I'm a little bit, I'm actually quite skeptical of the, if we allow open source models,
Starting point is 00:26:26 the world is going to end arguments. That seems histrionic to me. There are things we should worry about. Like it is, you know, I think some of the, you know, synthetic pathogens and other things that can be created. But it seems to me like the place to regulate that and control that. is in the machine that can actually print the pathogens, not in the AI model that can come up with what it is. That seems like a pretty flimsy argument for why we shouldn't have open source.
Starting point is 00:26:51 What needs to happen for your view, sorry, I'm still going back to like shape of the web, what needs to happen for your view of like a marketplace for content to emerge, right? Like what are the next signs that this is actually like happening? I think the very tactical next step is we've got to get Google to not be a special snowflake. Because Google has had such a dominant position in search, they almost believe that it is their right to have access to content without having to pay for it. And so the conversation that we have with them is we get it.
Starting point is 00:27:29 The deal that you made with content creators in the past was they give you their content and you send them a certain amount of traffic. You over time have taken just as much of their content, but you've sent them one-tenth of the traffic that they have. And if we just plot those trends out going forward, it's going to become, you know, a smaller and smaller player. At some point, the content creators will say, we're just going to block Google, right? Now, that seems on, that was unthinkable 10 years ago.
Starting point is 00:27:57 It seems radical six months ago. It is what people are talking about today. Why Google is so important is when you talk to all of the other AI companies, Google is the one company that they're the most afraid of. And that the reason that they're most afraid of them is because they think that they have privileged access to content in a way that is much more difficult for them to do. And so what I think we have to be able to do, first of all, is to say to Google, listen, you can still do search indexing. But if your bot is taking content and then transforming it in some way, making into the answer box, making into AI overviews, turning it into Gemini, that's a different action. That's a different deal.
Starting point is 00:28:35 And you have to be in the same bucket as everyone else. And Google is going to resist that. Now, I think the good news is they really do believe in the ecosystem. I think they are trying to do the right thing that's out there. And, you know, maybe not this is the good news for Google, but it's the good news, I think, for the web, which is that they have a ton of both regulatory and legal and legislative pressure, which is coming down on them. So one way or another, I think we will flatten that out. Once that happens, I think that's when we can actually start to say, we're going to shut off access to content unless you pay for it.
Starting point is 00:29:08 In the beginning, most of the deals that are done, the actual money being changed, will be between large content producers and large AI companies. That's happening right now where Condi Nast or dot dash Meredith or the New York Times or Reddit is doing a direct deal with a large model company. That'll happen a bunch. Where I think we can play a role is when you have either a large content provider trying to make a deal with all of the AI startups that are out there, which they really do want to do. want to do it in a way that scales, but they can't do one-off deals in those cases. Or you have the long tail of content with all of the different AI companies. In both of those cases, I think Cloudflare can play a role in helping set what are sort of basic rates that are there. And how that model looks, you know, I'm not sure. It might be that we negotiate basically on behalf of a number
Starting point is 00:29:59 of the content providers with all of the different AI companies, basically a pool of capital, much like how Spotify does and then distribute that out. It might be micropay. Every time you access a piece of content, it might be that training is actually a different payment rate than search. That's, I think, something that we'll have to figure out. But step one is we've got to get Google to play by the same rules that every startup, every new, every, you know, other company is playing by. And the minute we do that, I think the rest of the marketplace will actually happen a lot faster than you think. For any content company or individual providers, you know, that used to be a big part of the web, that cannot predict today. Like, there's no business model for them today to make money off of content going into these
Starting point is 00:30:42 AI experiences. Yep. And they can't, it's not easy to predict what is incremental to models. What advice would you have for them? So I think the first thing is you've got to, you've got to get back to controlling your content. So you have to create scarcity from the beginning. So how do you make sure that you're not just giving your content away for free?
Starting point is 00:31:02 And again, we've made that easy. There are other companies that are working to try and make that easy as well. And so one way or another creates scarcity and then start to have conversations. You can see which AI companies are the most likely to deal with it. So just today there was news that Google is starting a pilot project to start to pay news providers, something they swore they would never do. But again, I think that they can see, because they do believe in the ecosystem, they can see that this has to happen if the incentives for creating content go away, if you can no longer
Starting point is 00:31:33 sell something, if you can no longer sell ads against something, if you can no longer even get the ego hit because if people aren't going to the original source, you don't even know if you write some incredibly influential piece that ends up in millions of AI responses, you don't actually ever even know that happened. You're yelling into the void. We've got to figure something out around that piece. So I think the first step for content creators is recognize that you have that the business model of the web is changing. Second, recognize there is something you can do about it, right? You can actually create this scarcity. And then third, actually participate, start to go out and say, based on the data, hey, you keep trying to
Starting point is 00:32:11 crawl my stuff. Like, let's figure out a way that we can have some fair exchange of value for that. One other thing that you mentioned sort of a little bit as a side note when we were talking earlier was around how you felt that a lot of the models would actually be running on device and running locally. And then obviously there'd be things on the edge or in the cloud that would be the bigger models perhaps doing more complex tasks. When do you think that'll happen?
Starting point is 00:32:34 Do you think that's based on when the devices advanced in certain ways? model size, is it something else? Well, I think a lot of it's happening today. You know, on your phone, there's a lot that your phone is doing locally without it having to go out. And there are certain places, certain applications where it has to be local. If you have a driverless car and there's a red ball bouncing through a yard with a little, you know, girl running after it, whether to hit the brakes or not can't be dependent
Starting point is 00:33:00 on network conditions, right? So that has to run locally. I think that the big place where, you know, where I think there's going to be exciting innovation that it does not feel like it will be today is really in just how do you take, especially on the inference side, making it significantly more power efficient. That ends up being the biggest limiting factor. Apple has shown that it's possible and that you can actually have relatively power efficient GPUs and TPUs that are out there. When we talk with the folks at Nvidia, it feels a little bit like talking to Intel back, you know, in our case in 2010 or Apple's case in 2005, where they were like, you're doing it wrong if you care about power efficiency. I remember sitting in Intel's research lab outside of Portland in 2011, and we were a tiny
Starting point is 00:33:50 little startup, but we were doing interesting, innovative things, and so we were using their chips, and so they brought me in, and I was like, the only thing we care about is cores per watt. And we just need as many cores per watt as you can possibly deliver. And they just kept saying you're doing it wrong. You should be water cooling your systems. And we kept trying to explain. Like we don't have that luxury. We have to go into what are oftentimes the oldest, most legacy data centers in the world where there is a relatively limited power envelope. And we've got to fit within that. I think the same thing is going to happen in the AI space. And I'm very hopeful. Invidia has been a terrific partner to us. But it's been sometimes frustrating
Starting point is 00:34:30 to see how, you know, more, more GPU capacity comes along with, you know, having to stand up your own mothballed nuclear power facility. Like, that can't be the solution. And there's no physics reason why it needs to be. And so I think that we're actively looking around the ecosystem, trying to figure out who can deliver the most, you know, tensor units per watt or whatever the sort of GPU equivalent is. And that, I think, is going to be the big unlock that allows you to have more running on your device, whether that's your phone or your driverless car, or frankly, at the edge of the network. Because, again, we also have to live within a power envelope, which is not the same as if we were, you know, standing up a, you know, 100 megawatt data center.
Starting point is 00:35:20 Yeah, I was kind of thinking of it, I guess, from two or three perspectives. I mean, to your point, there's the actual chips. And in the context of mobile, obviously, there was ARM and then Qualcomm as other approaches to basically get to some of the things that you're mentioning for devices. separate from that, there's actual model size and inference time and a few other things that are kind of overlapping but different. And so to some extent, it's how large of a model that's how perform it can you actually load on a device and when does that happen and how well can it run? And so I was just a little bit curious how you thought about all those different pieces because there's also the model component or side of that that seems to matter quite a bit. And it's all coming. I mean, it's all inevitable.
Starting point is 00:35:52 I'm just sort of wondering about time frames. I mean, I think that we are still, most of the AI companies are still relatively inefficient in terms of their utilization from everything that we can see. And so when we, you know, it is in our interest based on just how our business model works and everything else to make inference, to make anything we do as efficient as possible. Because we only charge customers based on the actual work that we do. We're not, we're different, the hyperscalers. The hyperscalers, you go out and you rent a GPU, they don't care if you use it or you don't use it.
Starting point is 00:36:29 There's no incentive actually for them to make. In fact, if tomorrow someone announced that they had made in France 100 times more efficient, that's great news for us. It's terrible news for the hypers because our business models are very different. I think that the great lesson of deep seek, I just wish that it had been a group of students out of Hungary that had designed it, not out of China. Because there were some really significant, innovative steps that they took to make essentially training and inference significantly more efficient. We all got distracted in the U.S. by the fact that it was China and, you know, was it real or not real or anything else.
Starting point is 00:37:06 It was real. There was really great science that was done there to be more efficient. I think we've just barely scratched the surface on what we can do around model compression, what we can do around really just, you know, much more efficient pruning. There's a ton in these models that are branches of the tree that just literally there's the probabilities of going down it are so incredibly low that you can prune those branches off fairly efficiently and still get incredible performance. So I think we have an enormous amount of the kind of hardcore computer science, which is different than kind of the hardcore AI science to do to just say, how can we now take these things and make them massively more efficient? And my hunch is that, you know, it is not particularly long before you're running something, which is the equivalent of kind of the current generation of chat GPT on your iPhone or at your Android device.
Starting point is 00:38:00 Does that advancement happen at the model providers in open source, at an infrastructure provider? I mean, it's hard to predict where it is. I mean, I know that that's like we're not investing in how we build a frontier model. that's not that's not our job we are investing on how we take any models that that we're running and and run them significantly more efficiently so that's that's how we think about it i think i think the question is who's going to be the vmware of AI right who's going to create kind of that ability to just like because we haven't even done that basic work like today for the most part if you want to spin up a GPU, you're taking an entire VM, there's not even a container because you've
Starting point is 00:38:50 got to get that low level or you're taking an entire machine that is running. And that's extremely expensive. And in most cases, with most of the hyperscalers, you actually have to get anything close to attractive pricing. You have to commit to a year of that. And now it's up to you as a customer to figure out all that efficiency. Someone will come along and figure out a way to say, here's how we can slice these things up, here's how we can make them more efficient. And we're just going to speed run essentially the CPU efficiency gains, including a lot of the security things, like the specter and other attacks, the speculative attacks that you had in CPUs will come to GPUs. All that stuff is coming. We're going to speed run the last 30 years
Starting point is 00:39:31 of CPU efficiency gains in the next five to 10 in GPUs. I think fundamentally there's a lot really interesting next-gen models around physics and materials and a few things that may actually be interesting. Obviously, a few really odd future-looking things on the infrastructure side I think will be important. You've probably been following a lot of the sort of agentic-related infrastructure that actually is necessary for multi-step agents. So I think there'll be a few big companies there. And then there's all the vertical app things. We will probably compete in that agentic infrastructure space. It will probably not be one provider. It will be something where you're going to have to have a whole bunch that actually work together in some way.
Starting point is 00:40:09 And so figuring out the standards behind that, I think, is going to be important. And it's, you know, I think that whether it's MCP or, you know, Google did their own flavor of it, which was sort of just like it felt very Microsoft-y, kind of embrace and extend. But all of that space is going to be. Sort of like temporal and graph, all those things are kind of early indicators of like new agentic infrastructure that's kind of. Yeah, that's right. Yeah. I think there are like an entire, very large domains where a lot of the architecture should probably apply.
Starting point is 00:40:40 And like the data collection, like efficient data collection is the question because we're like we're not going to get robots from Common Crawl. But we are like probably going to get them. We just have to figure out how to pay for the data. And so I think figuring out if there are interesting models to get to generalization in these other domains is like something I'm looking at. Yeah. The other thing that's going to be really interesting is I was actually kind of pretty much a, a skeptic around like the blockchain cryptocurrency. It may be that this shift is the thing because now we're looking at this and we're thinking, okay, okay, let's say we got to a place where it was
Starting point is 00:41:15 actually micropayments for every page view. I mean, we do something 15 trillion requests every day. It's an obvious use case. But how you then scale these things to be able to work that way. I mean, you can't do that with Bitcoin, right? And you can't even do that with Salon or other things. I think it's going to be interesting how all of these things that sort of, that have developed over the last 10 years, how they kind of come back together in interesting ways to invent whatever that next future is going to look like. Yeah, super interesting. Yeah, I think a lot of people also talk about agentic permissioning and identity. That's right. Yeah. How do you actually embed identity on the blockchain? Well, and also, I mean, the whole
Starting point is 00:42:00 question of identity is going to be really, really interesting because there are times where I might want you, the human, to be there and be okay with that. There might be times where I'm willing to be, you are the agent connected with a human to be there. And then there might be times I'm willing to let some sort of agent that is self-directed be there. And setting up kind of the differences in those permissions is going to be really interesting. Browsers, another place. I mean, everybody is building a browser right now. And that's an interesting question for us. Like how much do we lean into supporting that versus how much do we say, okay, that's actually just a way of leaking data that's back out. And so I think getting to some sort of way of saying, here's who I am
Starting point is 00:42:45 as an agent, a bot, a browser. Here is what I have agreed I will do with whatever it is that I'm taking from you and and you know cryptographically signed like I have I agree to these things. I think that that's it's it'll probably be the same fundamental infrastructure that regulates how bots access the web that ends up regulating how browsers access the web and and a browser that takes data and then immediately feeds it back to an LLM might have more restricted access than one that doesn't and that's going to be an interesting you know market. interesting. That's super interesting. Yeah. And I guess to the identity side, ZK is a very natural way to actually do a lot of really interesting. Prove you have the right to a credential, but not show
Starting point is 00:43:35 the credential to your other things that are complicated. And so the blockchain's perfect for that. It feels like all the building blocks for this had been coming for quite some time. And it wasn't until fairly recently where it felt like, oh, that starts to be the shape of how these blocks come together. But it feels like both from kind of the need of the change of the business model and through the fact that the technologies have mature to the point that they're starting to be able to handle these volumes and have the broad adoption to actually take off. Again, all the work of the last 10 years that, you know, if you'd asked me six months ago, how is this all going to come together? In the last few months, it's felt like, oh, now you can start to see
Starting point is 00:44:19 the shape of what this future might look like. I think that's what we have time for. Thanks, Matthew. Thank you so much for joining us. Thanks for having me. Find us on Twitter at No Pryors Pod. Subscribe to our YouTube channel if you want to see our faces. Follow the show on Apple Podcasts, Spotify, or wherever you listen.
Starting point is 00:44:37 That way you get a new episode every week. And sign up for emails or find transcripts for every episode at no-dashpriars.com.
