a16z Podcast - Balaji and Taylor Lorenz on AI and Media

Episode Date: May 1, 2026

Theo Jaffee speaks with Balaji Srinivasan and Taylor Lorenz about how AI is reshaping media, trust, and online communication. Building on prior public disagreements between the two, the conversation revisits core tensions around media, technology, and power in a rapidly changing information environment. They discuss the breakdown of traditional information systems, the rise of AI-generated content, and why new models for verifying identity and truth may be necessary. The conversation lays out competing visions for the future of media, from decentralized "webs of trust" and cryptographic verification to the role of journalism, privacy, and public accountability.

Resources:
Follow Balaji on X: https://x.com/balajis
Follow Taylor Lorenz on X: https://x.com/TaylorLorenz
Follow Theo Jaffee on X: https://x.com/theojaffee
Follow our host: https://twitter.com/eriktorenberg

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

Transcript
Starting point is 00:00:00 I think the media guys think the tech guys started it, and the tech guys think the media guys started it. I think the media guys think the tech guys started it by economically disrupting them. I think this is why we're seeing such a resurgence in live streaming and interest in these sort of communal experiences, because live is something that is so hard to fake, it is such a human thing. We actually need to have decentralized cryptographic truth
Starting point is 00:00:20 that's not behind a paywall that anybody can verify, no matter how poor they are, no matter what, I think just like you should not be subject non-consensually to government surveillance, you shouldn't be subject non-consensually to corporate surveillance. Okay, but what about an independent media reporter? Is that okay?
Starting point is 00:00:38 What happens when anyone or anything can generate information at scale? AI is making it easier than ever to create content, but much harder to verify it. As agents generate text, images, and even identities, the systems we've relied on for trust, from media institutions to social networks,
Starting point is 00:00:56 start to break down. In response, new ideas are emerging, cryptographic verification, decentralized identity, and new forms of social coordination that aim to prove what's real rather than simply asserted. But these shifts also raise deeper questions
Starting point is 00:01:12 about privacy, accountability, and the role of journalism in an AI-driven world. To understand and debate what comes next, Theo Jaffe speaks with Bologi Srinivasin and Taylor Lorenz. And so I think as much as I like AI, within the digital tribe, it accelerates coding,
Starting point is 00:01:32 it's great for search, all this kind of stuff. Between digital tribes, it's often bad because it's just, you know, AI agents spamming 50 different people with a resume or a sales email
Starting point is 00:01:44 or something like that and just breaks the commons and so we're going to need to have, I think, a whole new generation of human-only social networks. I don't know how verifiable that would be. Like, you can assume
Starting point is 00:01:55 some kind of biometric method of proving that you are a human, and so you have your account that says, I'm Theo, I'm a human. But then on my account, I can just post stuff that I generated with chat GPT, maybe with some savvy prompts
Starting point is 00:02:07 to get around Pangram. But you're saying, yeah, yeah, well, here's Web 3 of Trust, right? So you would have the way, so there's a whole, just cold cat and mouse here, but just to give you a sense. Web of Trust is A asserts a B is trustworthy, who assert C is trustworthy,
Starting point is 00:02:23 who asserts D is trustworthy, and then the trust drops off, right? You trust your friend, and maybe you trust your friend's friend, but probably not your friend's friends, friends, friends, friends, friend. And so there's a way of modeling that mathematically. And you can have not just one proof point, not just A trust B, but also, you know, X trust B and Y trust B and Z trust B. And they trust them for a bunch of reasons, all of which is expressed in the metadata on the edges and nodes. And then you can do a calculation
Starting point is 00:02:50 and inference of what is a probability that somebody continues to be human. And then you also have not just automatic pangram style reporting, but you have manual flagging of something. It's, there's a certain, there's a set of signals people can look at. And I think if you establish the culture as being human only, and you also take away some of the payoff for pacing just reams of AI text,
Starting point is 00:03:17 I think it's possible to do. It's a little bit like Snapchat where, you know, Snapchat is disappearing messages. And yes, of course, in theory, you can take a photograph of the thing. But in practice, it did deter people from doing it. So you can set the culture in such a way that I think you can deter it. I think instead of, like, I agree with all of that.
Starting point is 00:03:34 I think that people will probably just gravitate towards formats that we're, I think this is why we're seeing such a resurgence in live streaming and interest in these sort of like communal experiences because like live is something that is so hard to it is such a like a human thing. And so I think people are just gravitating towards those formats that prior, that sort of prioritize these like human. communal experiences. So that's why we're doing network school for the human part of things and having people meet up in person. Is that the thing in Singapore or something? Yes. Yeah. If you're in Singapore,
Starting point is 00:04:06 let me know. I saw a TikTok from one of those guys that was in it. Yeah. So, yeah, I think that kind of thing, in person communities is going to be important because, you know, for example, you know, focusing is best done in my view with pen and paper, pencil and paper, offline. And if you, you, like to be extremely offline focused with your own thoughts, now you can actually do something. And then you come back and you're extremely online later. But without that offline kind of thing, you don't actually have a sense of your own head.
Starting point is 00:04:40 You know, you don't have your sense of your own direction. And so I do want to have much more of that. And I think also what's happening is we have the digital divide, but in reverse, where in the 90s people were scared that only the wealthy would have digital everything. But actually, digital everything is super cheap. Like, you basically have the same phone as Larry Page, the same Wikipedia as Sergey Brin or whatever, right?
Starting point is 00:05:02 Same Grogapedia, I should say. Grockapia is far superior to Wikipedia, by the way. And so you have the same digital experience. That's being hyper deflated. And now it's physical that's a premium product. And so the digital divide is kind of in reverse. So I agree with you, Taylor. I think you're going to see much more in the way of in person and that kind of stuff.
Starting point is 00:05:20 Taylor, I am interested in your take on Wikipedia versus Grocoppedia, because one of my favorite use cases for Grogapedia is to look at articles for people who are like totally unfairly maligned on Wikipedia. Got it, yeah. I mean, I like, I do get annoyed at Wikipedia because there's a lot of misinformation about me on Wikipedia. Yeah, let's pull it up.
Starting point is 00:05:43 So I'm sympathetic. Can you get this stuff on stream? Yeah, let's look at my Grogoppedia. I actually haven't looked at it before. It's quite long. Is this true? Look at how long this thing is. Jesus Christ.
Starting point is 00:05:53 Who has time? to read all this. I think I, you know, like I said, Wikipedia is not perfect. There's misinformation on, on, you know, Wikipedia. Overwhelmingly, though, what I really like about Wikipedia and I think is good and noble is the effort, is a decentralized, community-driven effort to get information right and compile information. And I think it's great to have competitors to that. So I'm not against the concept of like any competitor to Wikipedia. I think it's a little silly when it's like framed as like anti-woke or whatever. I think Yon was trying to say it was like when it initially can't think Wikipedia is like, well, it's more than that.
Starting point is 00:06:33 So, yes, people have said, oh, you know, Wikipedia is Wikipedia's Wikipedia. But there's actually a critique from the left as well, which is people have said that. But there's a critique from the left, which is it's also Whitapedia and Westapedia. Yeah, it's not, it has a lot of flaws. And to be clear, nothing against white people or Western people and so on. But there's a critique from the right and a critique from the left that says that the kind of people who edit Wikipedia are simply not representative of the world as a whole, right? Partly because lots of people who got online in the 2010s from India, from Asia, Africa, Latin America, they may be Anglophones. They may speak about our language, but they didn't build the political capital at the beginning to be part of Wikipedia, ARBCOM and all this kind of stuff, right?
Starting point is 00:07:19 And so that got all ossified in a certain way. And even if you look at like the perennial sources list or something like that, it's extremely anglophone, western, and so on, its orientation. And it locks out new media sources, international media sources, and so and so forth, at a structural level. And so that's why I think, you know, that's something where the economy is shifted to Asia. It's the economy shifting out of, really out of America in many ways, out of the West. And so there's an inclusion argument as well to have voices outside. And I think it is definitely time for truly open source competitors. Another aspect is, you know, Wikipedia actually licenses a lot of its content to AI, to AI companies, but it doesn't pay its creators.
Starting point is 00:08:06 Well, the contributor, but anybody can be a contributor to Wikipedia, right? In theory, but in practice, you have to have like an editor history and all this kind of stuff. It's like all animals are equal, but some are more equal than others as a Wikipedia. editor. And, you know, the thing about that is, look, it exists, and I think it did do a lot of good in the world in the early 2000s. And in fact, at that time, the insistence on only using mainstream media sources or whatever, probably at the time, I think it was less of an issue. But now today you have a situation where there's like something's being posted on the internet from a primary source like Jeff Bezos tweet something, but it has to get recycled.
Starting point is 00:08:43 Right, which I think is that they need to fix that. Yes. The primary source thing is ridiculous. And I actually had that when I was trying to correct misinformation on my own Wikipedia at one point. And I had to get like a person to write an article, you know, mentioning whatever. I can't even remember what I was trying to scratch. Jimmy Wells, if you're hearing this. I'm a huge fan of Jimmy Wales.
Starting point is 00:09:03 But yeah. He should fix this. He should allow primary sources. Absolutely allow primary sources, right? Yeah. And especially if it's like the person's own statements or like if Elon or Trump or somebody, you know, whatever. as long as their authenticated account, it should just be directly attributable to them.
Starting point is 00:09:19 It shouldn't have to go through a paywalled link or something like that. Absolutely. Yeah. I'm curious, Bologi, like, you know, just like in light of sort of like your criticism of the mainstream media, I feel like you, where did this come from? Like where, because I feel like you have this like animosity towards, and I understand I'm sympathetic, you know, from a lot of,
Starting point is 00:09:39 like I'm a big critic of the mainstream media myself, but I'm just kind of curious like where, what's the origin story of all that? Oh, great question. Well, so, I mean, look, I was actually an academic for many years. I was the most offline math person you could possibly meet, right? I gave my first public talk at like age 33, right? So most of my life, I was basically just essentially completely private person and just doing math and DNA sequencing and stuff like that. And I didn't really care about the media or anything. anything like that? I mean, whatever. It just, it was like, I don't know, as much as people care about, I don't know, architecture or something like that.
Starting point is 00:10:25 It was just like an interest area, right? So, but in the 2010s, I did, you know, like I had gotten into tech and we had gotten into tech. And we noticed that, especially post, exactly post-2013, after the second Obama term, non-the-first, you know, David Rosadoes actually published something showing that a bunch of words related to, for example, like white privilege and phrases like this all went vertical in the New York Times exactly in 2013.
Starting point is 00:10:54 And it wasn't like 2012. It was 2013 is after the beginning of the second Obama term. And then people in tech, if they pointed out that there was some issue with like, you know, drug addiction in San Francisco, they would just get hammered online and they'd get articles written on them and they get canceled and so forth. And many people lost their companies. They lost their funds. you know, Traus Clanick famously got attacked. Thomas, Preston Warner lost GitHub.
Starting point is 00:11:23 You know, like many, many heads of funds were canceled during this period. And they were just mystified as to what's going on because they're just trying to build rocket ships or electric cars or even just some file storage product or something like that. And suddenly they're like Public Enemy Number One, which is this bizarre kind of thing. It was very disorienting for many people. And eventually they put it together.
Starting point is 00:11:46 realized, oh, some of our colleagues on the social media and search side have completely disrupted the news business model. And Google and Facebook went vertical and took all to ad revenue. And like, you know, for example, Rolex is running an ad on NYT.com and, you know, Facebook.com. And so it's like they're like a direct commercial competitor. And it's also Craigslist going after classified as it wasn't just ads as a whole thing. So you realize, oh, okay, well, I guess we're like Coke and they're like Pepsi. And with Coke, listen to, all the things Pepsi was saying about, you know, how bad it was. And, you know, well, Pepsi has a blog and it's bigger than our blog, so we need to set up our own blogs, right?
Starting point is 00:12:24 And ultimately, by the way, at a structural level, right, there's a deep similarity between tech and media because we're both involved in the collection, presentation, and dissemination of information, right? We collect it, right? We figure out how to format it, aesthetically, you know, copywise, fonts, whatever. And then we disseminate it. We post it out there, right? And that's because, actually, if you look, many people in tech, like I was an academic, Paul Graham was an academic, Peter Thiel was Stanford Law, he was a jurist, Larry Page was an academic, Sergey Brin was an academic, and so and so forth. You know, Mike Moritz was a journalist. If you look, many of us were basically, for lack of a term, central left academics, you know, journalists, in Peter's case, a lawyer, and so on and so forth.
Starting point is 00:13:11 And we're very, we're actually of that world, right? So it's a little bit like, you know how Yale split off from Harvard or like America split off from Britain? We actually come from the same route. That's actually the interesting part of it, right? So I've always believed in democracy, equality of opportunity, media, academia, research, pure science, open source. I actually do unironically believe in those things. I felt that many of those words were getting corrupted in 2010s. Go ahead.
Starting point is 00:13:39 Go. Do you unironically believe in democracy? It's okay if you don't. Well, was you asking me or Taylor? Go ahead. You. Yeah, well, but here's why. Yeah, I'll say why, because the market for votes is like the market.
Starting point is 00:13:57 Like, you know, capitalism is all about consent. You know, I consented to this transaction. I consented, you know, that I'm buying this and they're selling that. And the same way, you want a government that you've consented to. And if there's insufficient consent for a government, there's insufficient legitimacy for that government, then it's just using pure force to try to enforce. And so I think, like, you know, one way of putting it is, you know, Marx was very insightful about all the ways that markets could fail and all the idiosyncrasies and so on. But then often that just means, okay, you have to figure out a clever way of getting into this market. Yes, Microsoft has such a hammerlock on it, but Google figured out a way, right?
Starting point is 00:14:40 And there's people on the right who are very insightful about all the ways that elections will fail and, you know, oh, it's all just ringers and, you know, people brought in. But, okay, fine, nevertheless, still figure out a way to win, right? Figure out a way to build legitimacy, to build political support because you need that as well, right? And in fact, actually, I think a lot of people in tech have over-rotated, I mean, competency is important, but it's a necessary and not sufficient. you also need legitimacy. In a sense, you need to accumulate votes, just like you need to accumulate dollars. You need both because, and those votes,
Starting point is 00:15:13 it's not just accumulating votes. You need to have, let's call it, limited government. You need to have some constraints of what an individual can do. You need to have rules that people can trust because otherwise, if you just have a pure individual, what do you have? You have a very, very unpredictable environment
Starting point is 00:15:28 that's also actually bad for business. So there's a center-right argument for the center-left, which is that you want to actually have political ability, you want to have the consent of the governed. Now, the exact mechanism of that, whether it's, you know, I do think that gerrymandered elections and so on in America in both blue states and red states have suppressed democratic forces in the same way that you can rig markets and so and so forth. And so to restore that kind of democracy, I think we're going to have, you know,
Starting point is 00:15:57 many both blue and red states in America have become de facto one-party states where elections are held, but the party always wins, just like China. Remember how I said the blues and reds have actually copied China? So you have this emerging thing where many states are becoming one-party states. So how do you have consent of the government? Well, the answer is they still have physical exit. They have digital exit. And so then you have a thousand communities around the world that you can apply to and choose from, just like you'd apply to college at age 18. You apply to your country at age 18. And that's how we restore a democratic choice where you have consented to the government that governs you because you have a choice,
Starting point is 00:16:33 a practical choice of a thousand of them. And so that's why I think we need to take some of these words and concepts that actually really had a lot of positivity to them and figure out how to rethink them in an internet-based age, but keeping the substance and spirit of it. Let me pause there, more I can say.
Starting point is 00:16:50 I think that you have a lot of interesting concepts there. I mean, yeah, I certainly, I do believe in democracy. I don't know. Yeah. I mean, I think that a lot of that stuff is interesting to think about theoretically, like the systems that you're describing. I, you know, I'm not as like sort of up with all the network state stuff,
Starting point is 00:17:12 but I think I'd kind of want to like look into it more and like understand it a little bit more. I guess like I asked you the question about the media just to sort of like go back to previously what we're saying. Because like I do you feel like just that not that it isn't interesting to hear your sort of thoughts. I am glad to hear that you're perfect. democracy. I kind of clocked it where Theo asked you because I feel like you're known as somebody that has like a different version, like sort of has put forth this different version of government.
Starting point is 00:17:37 But it's interesting to sort of hear it articulate it out. Actually, I have a whole talk on techno democracy. And actually you can see it. We're going to release an open source implementation. Okay. And the idea is like consider for example how politicians violate their campaign promises, right? How would you actually solve that? Like for example, Trump promised not to go to war in Iran and he did, right? How would you, the issue is a politician when they're elected, there's no power above them. That's being the issue where, you know, it's not been binding.
Starting point is 00:18:07 So the social smart contract, the idea is basically that, and this is a whole other talk, and you can go and look at that or what have you. But we've put capitalism on chain with cryptocurrency. Actually, much politics actually exists online, but the only thing that doesn't exist online is a binding vote. So you can have a mutually binding vote where anybody can vote for anybody
Starting point is 00:18:29 and they're effectively signing a social smart contract. And now you've expanded the electorate. It's basically the opposite of gerrymandering. When anybody can vote for anybody, then accumulating a bunch of people voting for you is a signal of, okay, well, these thousand people chose that person as a political leader. And moreover, that political leader
Starting point is 00:18:48 is themselves bound by a smart contract where there's limits on what they can do. So we restore rule of law with the rule of code. Anyway, I've got a whole thing on this, I'm going to put this out there. But I think, I guess like I, yeah, I'm not up with all the sort of the theories around politics, although it's interesting to kind of hear your thoughts. And I would be interested to sort of hear you talk to more people in D.C.
Starting point is 00:19:11 And sort of I'd love to hear a debate between like you and somebody in D.C. about that that maybe like has a different view on government. But I will say one thing, which is D.C. is just like economically in decline. I know. I have problems with D.C. socks. And so I just was there last week. trust me, we probably agree on a lot of issues with that. I guess, like, one thing that I'm curious, like, still kind of, like, it seems like you have this, like, animosity, or not animosity, but, like, resentment. Like, their feel, and it's not just you.
Starting point is 00:19:39 Like, I mean, it seems like a lot of people in tech do feel this kind of, like, resentment towards especially legacy media, and particularly the New York Times. Like, Bology, you know me from working at the New York Times. That's when you were, like, very mad at me. Even though I was very unfairly maligned, please go back and read my clubhouse article and see how I actually use the word, unfettered. You will see that it is not how it was described. We did actually talk about this when both of you came on separately. Bologi, you mentioned unfettered
Starting point is 00:20:04 conversation. I don't want to litigate the past, but let's say. There's a lot of, just indulge me. A lot of stuff that was thrown in both directions. Let's put it like that. Listen, 100% and listen, I love a Twitter fight. I'm not trying to, like, I'm not trying to like crucify anyone for fighting on Twitter. I do that all day.
Starting point is 00:20:21 But I do think that like there is this, like, correct me if I'm wrong, but there is this like, vibe among a lot of people in tech where I do feel like they feel like almost like hurt or upset at like legacy media. And I'm kind of curious like what that origin story is. Maybe it was the tech flash. Maybe it was like whatever. Some people got canceled in the 2010s, but it feels, you know, deep. Well, so I mean, the thing is I think it's take, it's different today. And the reason it's different today is basically tech and media were on the same side. I mean like, you know, Steve Jobs, like remember Occupy Wall Street, like everybody, he was he was covered, like his passing was something where even Occupy Wall Street was lighting up their iPhones for him and so and so forth. Tech was considered to be a wing of the Democrat Party.
Starting point is 00:21:08 And, you know, Obama's first election, tech helped, you know, elect Obama and all the kind of stuff. You remember all that, right? So there was, that was something where I think, in retrospect, I think the, you know, media saw, wait a second, these tech guys are going to. going from gadget manufacturers to suddenly becoming wealthier than legacy, you know, the legacy establishment. Also, another aspect, there's like at least three big differences between, let's say, New York media and Silicon Valley. And now Silicon Valley is decentralized of internet. But the three are, well, first obviously is, you know, paper versus digital, like the medium. But the second is actually, obviously, geographic the 3,000 miles away. But the third
Starting point is 00:21:47 is perhaps the least obvious, which is demographic, right? Silicon Valley has lots of Asians, lots of Indians, lots of people from overseas. Whereas the New York Times and U.S. media is based in Brooklyn, and they're mostly folks who have generations in the country. And so that nationalist versus internationalist aspect is also a very important question as to who has the right to speak. And implicitly, it's like Sellsboro is a right to speak. Newhouse has the right to speak, right?
Starting point is 00:22:16 They have the right to speak. And you, a recent immigrant, don't, right? Okay, wait. Can we zoom in on that a little bit? Because it seems like very different from how like the New York Times might see themselves, right? Of course. They don't really see themselves. That's right. They don't see themselves that way.
Starting point is 00:22:31 I know. I know. They don't see themselves that way. But you have to look at what they do rather than what they say, right? Selkber himself has this giant mansion, but he attacks you for being wealthy, right? Solesbury himself has dual class stock, but they have articles. Okay. But that's like the bot like he, like let me tell you, no one at the New York Times, like the actual journalist, right?
Starting point is 00:22:50 writing these articles that everybody's seeking issue with. Like, they're not, they don't, I understand that like, whatever, there's that famous, like, Chomsky interview, I think, where he's interviewing a BBC reporter and he's like the only reason you're in. You wouldn't be in the position you're in if you didn't believe the thing you did. Yes, I'm very sympathetic to that. But at the same time, at what point do we let it go? Because it has a long time.
Starting point is 00:23:14 I think at this point, at this point, like, yes, I just like, it's like, it's like, it's Like, it's, the content creator industry is ascendant. It's only going to become more digital. The Times is a great business is going to continue on. But like, look at where legacy media is. You're kicking a dead rabbit on the ground. You know what I mean? Like, it's so.
Starting point is 00:23:34 So, actually, so let me say something on that. So first is you just asked me how the history happened because I wasn't going to, you know, so I just want to give you the history since you asked. No, I know. I just like, I do have to go on. But where's the go forward? Because I just still, it just, I just think that there's like, like, and also like maybe let's engage with actual journalists, too.
Starting point is 00:23:50 Like instead of like Salzberger, sure, the people, again, there's tons of idiot people that run these corporations. But the actual journalists, like, I don't think have necessarily like not all, do you know what I mean necessarily? Some of them, yes, they're chosen to be in that position for a reason. But a lot of them are just doing their job and they're doing a good job and they're trying to engage in good faith. And I think that there's a lot of animosity kind of coming. So, yeah. Well, so okay. The TLDR, just to answer your question, then let's talk about it.
Starting point is 00:24:15 The TLDR is, I think the media guys think the tech guy started. tech guys think the media guys started it. I think the media guys think the tech guys started by economically disrupting them. I think the tech guys started, I think the media guys started by socially attacking them in the 2010s. But NetNet, where did it land up? Basically, tech realized it needs to build its own media. Hence, monitoring the situation, a new live street, right? Hence, hence X, hence, you know, all-in podcast, hence pirate wires. Hence, you know, like a bunch of shows, Gary Tan's thing and so and so forth. We just have to build our own thing. You don't want to say TBPN.
Starting point is 00:24:51 TBPN, that's right. Exactly, you know, exactly, right? And, you know, it's basically like, you know, in a video game, it's like, get good. And, you know, on their side, even though I disagree with a lot of what they've done, New York Times did actually hire a bunch of programmers and built an app. And so they, some media was like, you know what, rather than yelling at these tech guys, we have to build our own technology, technology had to build its own media. All right, what's a go forward?
Starting point is 00:25:11 And in my view. In my view. There's a technology company now because, like, most of what they do, like most of the revenue comes from the games. Yeah, that's right. So now I think what happened is they actually did well economically and they gained money in, but they lost power because they lost a bunch of folks who, you know, allocate capital and a bunch of folks who built things and so and so,
Starting point is 00:25:33 but they gained a bunch of, I don't know, political partisans, fine, whatever. Point is that I actually don't think it's like totally over the cause of one thing, which is one very important thing. we actually need to have decentralized cryptographic truth that's not behind a paywall that anybody can verify no matter how poor they are, no matter what, you know, like phone or whatever they have, they can verify to anywhere,
Starting point is 00:25:58 and they're not just taking some corporation or government's assertion for it. Now, we actually do have that with, you know, there's actually a book called The Truth Machine, actually by former WHA reporters Vigna and Casey, but they left the WHA, at least Casey did, which says, I mean, that's what Bitcoin is. What Bitcoin is is decentralized cryptographic truth.
Starting point is 00:26:19 Anybody anywhere, whether they're Japanese or Chinese, Democrat or Republican, they could be European or Latin American, whatever, they agree on the state of the Bitcoin blockchain, like who has what amount of money, up to trillions of dollars, and that's been true for 10-plus years. And the same algorithms that get us to consensus on financial facts, that, you know, built this trillion-dollar economy, we can use with some generalization to get to consensus on other kinds of facts, other social facts. That's a big theme for me for the next several years: establishing consensus, provable truth. It should be free. It should be open source. It should be globally verifiable.
Starting point is 00:26:55 It should not be paywalled. And it's something where I think that's actually achievable. To give you a very concrete example of what I'm talking about: so, you know, I've got a talk on the ledger of record, but now it's actually feasible. If you go, for example, to, you know, Claude or Grok or ChatGPT, and you ask it to describe the FTX hack that happened in late 2022 and give links, it will give some links, for example, to Etherscan, which show exactly what amount was hacked.
Starting point is 00:27:25 So you can click that, and you can see this happened at that time. And you can check the timestamp. You can check the references. Another example of this was, you know, the thing about the Brazilian fires, you know, a few years ago, when there was a fake photo that Emmanuel Macron actually tweeted out, and The Atlantic actually ran an article on this, basically saying it's a good justification to invade Brazil because of these fires. It turned out that that was a fake photo, or rather a photo
Starting point is 00:27:48 for which a timestamp showed that the guy who'd taken it had died years ago, so it couldn't be a contemporary photo, and yet you had this whole thing built on this photo. The point being, that's a concrete example. There are other examples, like a Chinese case where a timestamp showed that something was patented before, or a patent was filed after, a claim was made publicly, or Elon was able to disprove a story by showing the instrumental record,
Starting point is 00:28:10 the timestamps of his car versus John Broder's article, like, in 2013 or something like that. The point being that this instrumental record, these timestamps, are a history, like the history of the machine. All the timestamps are the raw data. And now you can actually use that to build a story on top of it
Starting point is 00:28:26 which cites the raw data. It cites the raw facts, cites the primary sources. And I think that's where we want to go over the next five years. And the reason I don't think the battle is over... you know the Kobe thing, like, job done? Job's not done, right?
Starting point is 00:28:40 Job's not done. Because legacy media has a little bit of a comeback. Why? Because of AI, people are going to legacy media URLs, because there's some small degree of cryptographic verification. You see the HTTPS lock symbol. You know it's coming at least from that author. You know that, okay, well, they may have faked the video, but you feel you know where their biases are. And so we have to respond to that, not simply by going direct, but by proving correct. So that's a major theme for the next five years, 10 years. So that's why I say the job's not done. And to be clear, that's not an oppositional thing,
Starting point is 00:29:12 that's something that you'd want. Even people at legacy media companies would want that because they also want to verify stories to be correct. And sometimes those facts will be in favor of some group or against some group, but let the truth, you know, come out and let the heavens fall. Let me pause there. Yeah, I mean, I think those are all, like this is a bit,
Starting point is 00:29:30 like you have a lot of theories and I think it'll be interesting to see kind of like the way that information ecosystems evolve and how that works out. I'm all for the sort of decentralization of information and people learning. And I think it's exciting to, kind of see the way that people are even using AI to gather their own information,
Starting point is 00:29:47 do their own reporting, do their own kind of investigations and stuff. I just think, yeah, I just think that also, like, most reporters are trying to kind of communicate. Like, most people, they just kind of want a story, or they just want the gist of something. Like, they're just trying to, like, learn about something during the day. And I do think that, like, reporters do a good job of packaging information and delivering it to consumers in an easily digestible, largely accurate way. Of course, there's always people that are bad actors. Of course, there's always, you know, sort of wrong information.
Starting point is 00:30:16 But, yeah, but it is interesting to think about how it evolves. I don't know if you saw that group that was accused of setting up this, like, fake AI-generated news website. And then they had agents, actually. It was, like, an agent that was emailing people for comment. It was the Leading the Future PAC, right? Yeah. Although they say that it was a vendor; they say that it was not them. But it was just interesting to see, I don't know,
Starting point is 00:30:43 what interested me is that this agent had gone and emailed all these people for comment, and people were saying, like, oh, I got an email from that agent or whatever. And I was like, that's interesting to think about. I wonder if we could get agents to do FOIA document requests, you know, or emails; sometimes you have to reach out for comment to, like, 100 people if you're sourcing stuff or whatever. You know, so I just think that, yeah, it'll be interesting to kind of see how it all evolves. Yeah, you know, one thing also I should say is I always try to figure out a win-win relationship with some industry that's being disrupted. I actually do have sympathy, you know, because in the 2000s, it was a pretty good gig to be,
Starting point is 00:31:21 you know, a Time writer or something like that. You would have an expense account. You'd write six articles a year. You'd fly around the world. It was pretty chill before the financial crisis. And I get why a lot of people in Brooklyn, for example, have, like, a sense of loss and so on around that. Maybe that sounds funny. No, I think you're totally right.
Starting point is 00:31:41 I think people in L.A. also have a loss. I mean, I live in L.A., but, like, people... I mean, I have friends that are screenwriters, right? They miss the golden age of screenwriting. That's not coming back. So, you know, yeah, I'm sure it would have been amazing to be a Vanity Fair writer in 1995, but it doesn't exist anymore, and it is what it is, you know?
Starting point is 00:32:00 That's right. You can be a live streamer. Good. Now we can all be live streamers. Well, I'm kind of serious about that. Like, I think that sort of content creation has surpassed and supplanted that. And you can have a pretty great lifestyle
Starting point is 00:32:14 as an online content creator, just because the institutions don't make it so easy to get that anymore. I just think, yeah... You know, one thing you mentioned is that I have millions and I should do something. So one thing I actually do want to do is fund, and I have been doing this quietly actually for a while,
Starting point is 00:32:32 but I'm going to do more of it more publicly, is funding media, academia, democracy, equality, that kind of stuff. Where, you know, like, basically with crypto in particular, I can put up a prize or a task, and anybody from anywhere can do that, and we can pay them pseudonymously or publicly or whatever. And we can have, you know, like, it'll be different than, let's say, the blockbuster picture of the, you know,
Starting point is 00:33:03 20th century, just like that was different from Broadway and plays. But in some ways it has more upside, because now somebody can be the full stack themselves, right? So taking Hollywood, you can be the screenwriter, you can be the director, you can cast the actors, you can translate into 50 different languages. You know, the auteur can actually now do it without requiring a deal with Paramount or something like that. So I think there's a role. Basically, I'm sympathetic, especially, you know, to people who've lost jobs and so on, but I think, just like Substack kind of stepped in and was able to build a new model for writers... It doesn't support reporting, is the problem, right? That economy,
Starting point is 00:33:46 this content creator economy, doesn't support, like, in-depth investigative reporting, and I think that's what we're losing out on. So on that topic, right, I will say something, which is, um... So, you know, there's a saying, which is: he doxxes, she leaks, but the New York Times investigates, right? And basically the issue is that for somebody who's not elected to just go and spy on somebody else and dig through their garbage and so on and so forth... Like, there was a journalist... We want that. ...who went to, like, Benioff's mansion and was literally digging through his garbage in Hawaii, you know, or something like that, right? Are you talking about Pui-Wing, the most iconic, the most iconic journalist?
Starting point is 00:34:27 I don't know, was it Ellison or Benioff? I don't remember, but... I think it was Benioff. Okay. The point is that basically, if that was not a quote-unquote journalist, that would be seen as stalking, harassment, you know, cyberstalking. Like, it'd be weird, right? For person A to go and dig through person B's garbage.
Starting point is 00:34:46 But that's very silly. You could... But the context of that behavior matters, right? Calling someone up and asking their associates questions would be weird, but in the context of a job, checking job references, it's normal, right? Like, the context matters. So the thing is, if, like, if I just see that person is... It's not consent if you call up and say, should I hire this person?
Starting point is 00:35:07 I mean, in that case, you're actually looking into hiring this person. But think about it: if that's somebody who is just, in my view, Sulzberger's employee, Bezos's employee, whose ultimate bottom line is the number of page views that they're making for this inherited media corporation... Well, that's not... They don't care about page views anymore. They care about subscriptions.
Starting point is 00:35:24 They haven't cared about page views in a long time. Okay, fine. Okay, fine. Subscriptions, conversions. The point being, like, ultimately, it's not like they're some elected official, right? It's not like I consented. You know, look, if the FBI... Hold on a second, but you don't want... but we... Okay, so there's value, right? I talk to a lot of investors, you know, people in finance, right?
Starting point is 00:35:43 There's value in the journalism, the reporting, that someone like Pui-Wing was doing for the Wall Street Journal or the New York Times, right? Like, they are uncovering things about companies that are valuable to people, to the public, to either public knowledge or investors or, you know, shareholders, et cetera. So I think there's value in uncovering a lot of this information. Well, but think about what you're saying. There's value to News Corp. Right? But there's also public... there's also value to the public. Ah, that's where I think we may disagree, because I would say News Corp is a media corporation that makes money from attacking other corporations, getting their private information, reselling it for money, but then they paywall all their own stuff. News Corp doesn't want reporting on News Corp. They're a business.
Starting point is 00:36:25 But wait, sorry, Balaji, you just said you want information to be totally free. So you certainly do recognize that there is value to uncovering non-public information. You think that... wait, sorry, hold on. I just want to understand. Like, when you say consent, do you mean that if somebody hasn't consented to share information, it shouldn't be public? I think just like you should not be subject non-consensually to government surveillance,
Starting point is 00:36:51 you shouldn't be subject non-consensually to corporate surveillance. Well, but okay, we're not talking about... So what about an independent media reporter? Is that okay? Well, what if it's, like, a stalker who's gone to Substack and wants to sell subscriptions to their Substack? Okay, so what about... Right. So, okay, so anybody doing journalism... What if it's a nonprofit newsroom? Well, actually, I want to...
Starting point is 00:37:13 Let's take the word journalism for a second, okay? Okay. So if, for example, and just hear me out for a second... Okay, I'm very confused by this. ...say a company in China is doing, like, some robotics thing, right? Okay. Zuck would never say they're not doing technology. Technology is assumed to be a human universal, right?
Starting point is 00:37:31 But for many years, and maybe still today, if you asked somebody who works in, you know, like, a legacy media outlet, and you said, is Glenn Greenwald doing journalism? Is Tucker Carlson doing journalism? Is CGTN, like, a Chinese media outlet, are they doing journalism? Yeah, yeah, yeah. They'd say no. They'd say they're not. That's not real journalism.
Starting point is 00:37:55 Even if they may have much larger audiences. And everybody has different opinions on what's quote-unquote real journalism, and a lot of it's driven by ideology. But you can understand that there's value in uncovering non-public information, right? There's value to the public in uncovering non-public information, right? The public... well, so two things. I disagree with that, because the public is... But didn't you say you want this, like, open decentralized information ecosystem?
Starting point is 00:38:18 Do you think things like Polymarket are bad also? In a sec. In a sec. Yeah, yeah. So let me say something. The public, what is the public? The public is 8 billion people, right? That's, like, everybody.
Starting point is 00:38:27 Okay. So the person who's at the company that's being attacked by News Corp, like, the News Corp employee, is making money at the expense of the tech company that they're spying on. And that doesn't necessarily benefit 8 billion people. It might entertain them. The News Corp employee would be extremely perturbed if someone was spying on them. They'd lose their minds. They'd get super mad about that, right?
Starting point is 00:38:47 So nobody... No, but people report on me all the time. I'm a public figure. People report on me all the time. There's a difference between your public statements and then going and surveilling you, spying on you and stalking you. Like, no one... like, you're not concerned... But if it's relevant to the public, right? Like, people aren't just randomly spying on someone for no reason. They're trying to get information that is inherently valuable to the public. Okay, wait, when you say... what do you mean by
Starting point is 00:39:07 the public? Do you mean the Kazakhstanis? Do you mean the entire world? No, like, actually, you know, a portion of the public, I guess. I guess a portion of the public. Yes, a portion of the public. That's right. So it is those people who are paying News Corp. I don't think it's always the people that... Ultimately, they are paying. Why? Because it's paywalled. It's paywalled content, right? Some of it's paywalled.
Starting point is 00:39:27 So the only people who are seeing... hear me out for a second, right? News Corp goes and steals or surveils, you know, some company... What about a nonprofit newsroom that is not paywalled? Like, something like ProPublica. Okay. But at least you're with me so far. Let's go to the nonprofit in a second. But it's not the public.
Starting point is 00:39:41 It's a News Corp subscriber who can actually see the article where News Corp, you know, stole information or surveilled somebody to publish it. So Murdoch makes money. You know, the Wall Street Journal reporter gets, you know, page views, or has subscriptions, whatever. And the subscriber to News Corp who is paying Murdoch, yes, yeah, maybe they get some entertainment value out of it or whatever.
Starting point is 00:40:06 They get actual... Sometimes they'll get actual business value. Sometimes they'll get, like... I mean, especially if you're a Wall Street Journal or New York Times subscriber, you're often subscribing because you want information, information that is going to help you run your business. Sometimes. Sometimes. But here's the thing.
Starting point is 00:40:20 A lot of that stuff is actually very... Entertainment. Sure. A lot of this stuff is very... A lot of this stuff is very negative. A lot of it's actually very negative sum. For example, like, look, a true public good is like a bridge, a piece of infrastructure and so and so forth. Like, electrification.
Starting point is 00:40:34 Yeah. Like, I do believe in the public good. And what if that bridge was... And what if the guy who made the bridge secretly paid someone off to, you know, not file any of the safety reports? And it's actually a huge threat. Let me give an example. Somebody wants to investigate whether the bridge is a threat, because actually, once a car drives over it, somebody could die. Of course, but let me give you a different example. So, for, you know,
Starting point is 00:40:54 when Obama was running, like, in his first Senate election, there was a candidate and his wife, I think Jeri Ryan and Jack Ryan. And there were, like, some sealed divorce papers or something like that between themselves. That was just their own, you know... They had agreed that that was just sealed or whatever. And then some outlet went and got that and printed it. Why? Because they wanted to elect Obama. I mean, that's the real answer. And it harmed that couple and that family. Yeah, I don't... I'm not going to... I don't know the context of that, but I'm not going to argue that people haven't crossed the line or published the sealed thing, obviously. I agree with that. Well, Taylor, what would you say the line is? Sorry, what?
Starting point is 00:41:36 What would you say the line is where investigative reporting completely becomes, like, surveillance? Oh, well, it's all surveillance... If you guys are worried about surveillance, like... Look, I am very against mass surveillance. So, like, trust me, you have a lot of sympathy from me. And I'm actually very... Where it becomes stalking, let's say. Well, right, but I don't think it's stalking for somebody to be reporting critically on a company. And I think sometimes I...
Starting point is 00:42:00 But I agree with you that... Just let me finish. I hear you. I understand. Trust me. You are heard... Like, I'm sympathetic. I have had people...
Starting point is 00:42:07 I have had right-wing media reporters calling up my family members, harassing my family members, showing up places. I'm so sympathetic to this argument. I think there should be a very high bar. I famously wrote about Libs of TikTok. I've talked extensively about the, like, many, many bars that that met, primarily that she was directly shaping laws in Florida. And I think that we have a right to know if an anonymous Twitter account is directly informing legislation that is going to be used to regulate us.
Starting point is 00:42:35 We should know who's informing that, because for all we know, it could be somebody abroad. It could be somebody... you know, we should know kind of who's behind that. So I'm not saying that, like... Like, I do believe in the value of uncovering non-public information. That said, I know there's bad actors out there. I think there's bad actors out there in the content creator ecosystem as well. I don't think it's just a corporate media problem. I think that there's a lot of kind of crowdsourced investigations,
Starting point is 00:42:59 I write about this a lot, that are happening now on YouTube and Instagram and TikTok that go far beyond what you're talking about. Journalists, I think, at least have to abide by some sort of ethical standards. Balaji, I don't know when the last time is you've been on some of these TikTok tea accounts, but they don't have those boundaries, right? Yeah. So here's the thing. I think the line... I'm glad to hear you say that. And by the way, you know, it's too bad, you know, about your family and so on and so forth.
Starting point is 00:43:24 I support privacy. And in general, if you haven't made something public, then, okay... Like, there's a democratic process. There's rule of law. There's the concept of search warrants. But do you think it's valuable to know, for instance, that OpenAI hasn't hit, like, revenue targets, or is struggling to reach a billion users? Um, I mean, like... So, okay, let me... What I think is valuable is privacy as a bedrock kind of thing. So let me explain what I mean. But don't you think it's valuable to know that?
Starting point is 00:43:57 Don't you think it's valuable... Like, you don't think that that information has value? It certainly does. The reason I'm very skeptical about that kind of thing is I have seen so many incorrect reports of that kind of thing. Like, for example, go back and look in the early 2000s. Let me give you a concrete example: Google's Toughest Search Is for a Business Model, I think that was an NYT article, in 2002 or 2003. And it basically said Google isn't making money.
Starting point is 00:44:23 And that wasn't information. It was actually just wrong. Totally. There's wrong information that comes out. I'm not arguing that. But if the information is correct, would you say that that's valuable? Not necessarily because it's like, look, I mean,
Starting point is 00:44:35 it's like taking... okay, here, if you had a surveillance photo... We have to have a debate 2.0. Yeah. Hear me out, Taylor. If you had a surveillance photo of somebody, and they were, I don't know, in their towel or something like that,
Starting point is 00:44:47 and you published it, and lots of people clicked on it... Yeah, it's a real photo. It's true. But there would have to be news value to it; it would have to be relevant for a reason. Who's judging that? Yeah, and I think it's totally good to question
Starting point is 00:45:02 who's making these judgments, who decides what's news, what's valuable, who decides, like, what the news value of certain information is. I am all for questioning that. And that's why it goes back to the publishers, right? That's...
Starting point is 00:45:11 and so, essentially, look, my view is, with the rise of the internet, there are going to be a lot more people from outside the U.S., outside the West, basically to both the right and the left of the current ecosystem, who are going to have a voice and are not simply going to be downstream and accept that Sulzberger and Newhouse and Murdoch and these kinds of guys are the ones who ultimately judge what is newsworthy. I don't think that's acceptable. They're not elected. They can't be fired. Sure. Totally.
Starting point is 00:45:38 There's no accountability. I totally agree with you on that. But I also think that we should strive to... like, there is value in non-public information. And I think that we see this coming out more and more with citizen journalism, right? Which I think you support. I do support citizen journalism, but I want to be very clear about what I support when I say that. I don't mean surveillance. I don't mean stalking.
Starting point is 00:45:58 I mean, like, hey, here's a pothole on the street that hasn't been repaired, I'm going to write it up. And what if it's corporate wrongdoing? And what if it's corporate wrongdoing? Well, if it's corporate wrongdoing, there's, like, a police report that you file, right? But what if it's not necessarily, like, a police report? I mean, this is what I'm talking about: I do think that there are people... For instance, I write a
Starting point is 00:46:19 lot of stories where, like, people within a company or within a program want their story out. They know that there are huge consequences for them actually speaking publicly, going out and trying to do that. There would be retaliation. So they go to a journalist, right? And they say, hey, can you help me get this information out? And I do think that there's value in that, especially with the government, when the retaliation comes from the government.
Starting point is 00:46:36 So I generally think that that is, uh, not really pure-hearted. I think often people will do that because they can't get their way internally at the company. Yeah, everybody has motivations. Everybody has motivations. Yeah. That's right. And also often, let's say you've got 500 people at a company, and one of them goes and defects and gives some non-public information. The other 499, their equity value goes down. They're basically, like, traitors that betrayed the tribe. It's not about the equity value. I know. I just think, but sometimes that information is valuable. I hear you. I think there's a lot more nuance to this
Starting point is 00:47:12 than we can probably agree on here. Well, okay, I'll put it like this. We may not agree on this, but at least you can hear my point of view, which is just that: A, privacy is very important. B, we should consent to having things put in public. And C, there is actually a process for non-consensual, you know, privacy, but that's a search warrant, right?
Starting point is 00:47:34 Like a privacy invasion, whatever you want to call it. Like, there is a mechanism for when somebody's trying to keep something private but it actually should be made public. It's called a search warrant. There's probable cause. I don't think we should have a search...
Starting point is 00:47:46 Especially with the government, Balaji, at least we can agree we should criticize that. Like, I guess I have a different view. And I'm all for privacy. I'm all for privacy. But when I see what these lawmakers are doing, for instance, don't you think it's valuable to know, like, who's behind groups like the Digital Childhood Alliance that's advocating for all these identity verification laws
Starting point is 00:48:03 across the United States? These groups going around saying, we're for child safety, we're just a bunch of grassroots moms. No, you're not. You're funded by, you know, evil groups, or you're funded by Meta. You're funded by... I think there's value in knowing that. I think that...
Starting point is 00:48:14 This is public record, right? No, it's not. It's not. The way that that actually manifests is, it's just some, like, titillating revelation: get people mad, get a lot of views. No, it can directly affect policies. It can directly affect whether people listen to those groups and whether they find those groups credible,
Starting point is 00:48:33 if I know that somebody's just a front group. That means policy is just set by leak after leak after leak after leak. Ideally not, but certainly I think there is value to these leaks, and I think there's value to some of this information. And I do think that we should strive to uncover valuable information. Yeah, and I guess my alternative view is... And isn't that what, like, Polymarket is? All this... isn't that sort of also the goal of all of that stuff, to sort of uncover information collectively? You know, I actually was an angel investor in Polymarket, but I did it because I actually don't really believe in prediction markets. I believe in verification markets,
Starting point is 00:49:07 that is to say, like, to actually resolve whether a bet happened, you actually have to have the historical record of what happened. And, you know, for example, the kinds of things I'm interested in for prediction markets would be, like, an internal prediction market within a company for when is this feature going to ship, or is this bug going to be fixed, and so on and so forth. And there are things like that that you can do, which I think have positive applications. But we've got to have a 2.0, because I do have to bounce. But we have to have a 2B.
Starting point is 00:49:37 I don't agree with the disclosure of non-public information for prediction markets or anything else like that. Even if somebody... But actually, that's a good example. That might be some private individual who discloses non-public information in order to make money. I think it's almost exactly the same as a journalist who does it in order to get subscribers. The journalist is not profiting directly in the same way, but I hear you. I do. We might disagree on this, but I appreciate your commitment to privacy, and I appreciate that you are against these age verification, identity verification
Starting point is 00:50:07 laws, and I think we can at least agree on that. But I do have to go, so sorry. I'm so late for my story. This is great. This has been great. Yeah, thank you both so much for coming on the show. It was heartwarming, really, to see how much you agree on some pretty fundamental things. I wonder how many more segments we can do like this. Well, this might be a one-off, but this was fun. All right. I will see you later. See you. Okay. Bye, guys. Bye. Thank you both. Okay. All right, bye-bye. Thanks for listening to this episode of the a16z podcast. If you like this episode, be sure to like, comment,
Starting point is 00:50:43 subscribe, leave us a rating or review, and share it with your friends and family. For more episodes, go to YouTube, Apple Podcasts, and Spotify. Follow us on X at a16z and subscribe to our Substack at a16z.com. Thanks again for listening, and I'll see you in the next episode. This information is for educational purposes only and is not a recommendation to buy, hold, or sell any investment or financial product. This podcast has been produced by a third party and may include paid promotional advertisements,
Starting point is 00:51:14 other company references, and individuals unaffiliated with A16Z. Such advertisements, companies, and individuals are not endorsed by AH Capital Management LLC, A16Z, or any of its affiliates. Information is from sources deemed reliable on the date of publication, but A16Z does not guarantee its accuracy.