No Priors: Artificial Intelligence | Technology | Startups - How AI is opening up new markets and impacting the startup status quo with Sarah Guo and Elad Gil

Episode Date: July 18, 2024

This week on No Priors, we have a host-only episode. Sarah and Elad catch up to discuss how tech history may be repeating itself. Much like in the early days of the internet, every company is clamoring to incorporate AI into their products or operations while some legacy players are skeptical that investment in AI will pay off. They also get into new opportunities and capabilities that AI is opening up, whether or not incubators are actually effective, and what companies are poised to stand the test of time in the changing tech landscape. Sign up for new podcasts every week. Email feedback to show@no-priors.com Follow us on Twitter: @NoPriorsPod | @Saranormous | @EladGil

Show Notes:
(0:00) Introduction
(0:16) Old school operators AI misunderstandings
(5:10) Tech history is repeating itself with slow AI adoption
(6:09) New AI Markets
(8:48) AI-backed buyouts
(13:03) AI incubation
(17:18) Exciting incubating applications
(18:26) AI and the public markets
(22:20) Staffing AI companies
(25:14) Competition and shrinking head count

Transcript
Starting point is 00:00:00 Okay, hi, listeners. Today you just have me and Elad shooting the, what's the appropriate term here? Shooting the breeze. Shooting the breeze. Yeah. But I want to start this shooting-the-breeze session by talking about this Goldman Sachs report that everyone's reading, which essentially says, I'm just going to get on my soapbox for a second here, that the title is something like calling the top on AI. And so for obvious reasons, I don't like it, but I do think it's worth decomposing for a second. I do encourage everybody to go skim this thing. So there's a bunch of interviews in it. And two of the core ones are from Daron Acemoglu and Jim Covello. They're respectively, like, an MIT professor and the GS head of global equity research. And Daron is arguing essentially that AI is going to impact less than 5% of all tasks. And the, like, trillion dollars of CapEx that people are spending on training models is a waste because
Starting point is 00:01:07 AI will be unable to solve the complex problems. It's not built to do that. And Jim argues, you know, he argues that in contrast with the Internet, where you were disrupting something expensive cheaply from the beginning, even early on, versus having a very expensive solution that then becomes democratized, you know, AI is very expensive from the very beginning. And then the other argument he makes is that any efficiency gains from AI will be competed away anyway. And so, like, you know, none of the companies are going to gain from this. And so if we just, like, talk about Daron first, Daron's arguing about something he doesn't understand. Like, his claim is, you know, how do we know scale works?
Starting point is 00:01:51 More data won't make customer support reps better. I think, like, that's just a fundamental, like, misunderstanding of the technology, and also objectively of what has happened over the last several years of scale and data improving the capability and quality of model outputs. I think a lot of these folks, too, by the way, are just kind of stuck in the old AI world. Like, I haven't read the report,
Starting point is 00:02:13 so I'm not talking specifically about these authors. But a lot of people are treating this like old-school ML, and they don't seem to realize that there's been sort of a breakthrough in terms of these transformer-based models or other architectures that effectively are both highly scale-dependent, but also provide different types of functionality and features than, you know, you're sitting there and you're munging some data and effectively doing fancy regressions in some sense. So I think that's the other issue here in terms of a lot of what I hear.
Starting point is 00:02:41 This happens a lot in healthcare. In healthcare, they always talk about how data is the new oil and you're like, data is not the new oil. You know, sometimes data is useful and well-labeled data can be extremely useful, but, you know, a lot of it is also about the model and the application and everything else. And so I think there's just this broader misconception in terms of how this stuff works and what it means and all the rest of it. Yeah, I agree with that. I think this is actually a case of, like, this time it's different, and also people lacking, you know, even awareness of the market state of what's happening on the ground. Like there are absolutely things that are cost-effective to do today. That's why you actually get a series of companies that are democratizing
Starting point is 00:03:18 capabilities, really more on the prosumer side, but, you know, beginning to see things even in, for example, healthcare, traditionally a really slow industry, where you go from 0 to 5 or 10 million of run rate in your first year. I think, like, as you said, one sort of problem with this framework of thinking is, you know, you assume AI is like what it has been in the traditional ML world. The other is this assumption that the tech won't get much better fast and it won't get cheaper fast. And I think the willingness to predict insignificant improvement 10 years into the future is ludicrous when literally all of the people working on this tech are unwilling to, like, you and I are probably unwilling to predict two years into the
Starting point is 00:03:59 future, much less 10 years. Well, I mean, the thing I'd predict two years into the future is that there are going to be even more widespread applications of it. So I think it's almost the opposite thesis, which is this is early days. And if you look at enterprise adoption of AI, most large enterprises are very early on. They think of it as three things. What sort of vendors can I buy AI-related tools from? What are my internal tools and how do I adapt them to AI?
Starting point is 00:04:22 And then third is, how do I think about it from the perspective of external customer-centric products? Almost everything that has happened up until now has been, like, vendor buys, Decagon, Harvey, et cetera. And then the product side is very early. And it's really the AI-centric consumer companies, ChatGPT, Perplexity, et cetera, that have provided functionality at scale so far. And so the big wave of enterprise adoption hasn't even happened yet, right? It's very early days, so all the impact is in the future. Yeah, I think the predictions I'm willing to make with strong confidence are that models will get better across a bunch of domains.
Starting point is 00:04:59 And, like, we're very early in exploitation. And as you said, like, there's a sequence of enterprise adoption where we, like, haven't really gotten there yet because the planning cycle takes so long and, like, you know, all these natural frictions. But I do think one of the, like, flawed assumptions in Jim Covello's picture of the future is that the way you apply these advantages is standard across companies. That's like saying, oh, everyone will use the Internet, and therefore there are no economic gains to be had by companies when the Internet happens. But, like, you clearly get Amazon, you get Borders, and you want to be Amazon, or you want to be Klarna or, you know, whoever is actually changing their cost structure dramatically here. Yeah. By the way, this is not a new story from the Internet era. There were people who said that the Internet was meaningless, and the CEO of IBM back in the day had this quote, that the Internet companies were fireflies before the storm. And the storm was companies like IBM adopting the Internet, et cetera, et cetera.
Starting point is 00:05:55 And then, of course, now the trillion-dollar market cap companies are Amazon and Google and the like. So this is an old story. It's old wine in new bottles. Time to break the bottle. What else is, like, inspiring you or bothering you from a markets perspective? You know, I think the really interesting thing about AI is that it's opening up markets in unexpected ways. And I think there are three sort of drivers of that. One is we have new capabilities.
Starting point is 00:06:20 And in particular, we have new capabilities across multiple fields. You know, biology and robotics and obviously language and image gen and video and things like that, with a lot of emphasis right now on language and image gen. So these things are very generalizable, and you train one model to do lots of stuff versus doing bespoke mini models, which was sort of more of the norm before. Second, you can access it through an API anywhere in the world, which means you don't have to build all the MLOps, you don't have to do your own model training. Like, you can use it out of the box with a simple API call,
Starting point is 00:06:48 so suddenly anybody can use it. And third, a lot of companies and organizations have a mandate to do AI. And that means the markets are suddenly open in ways that they haven't been in two decades; the last time that happened was with the Internet, where they said, oh, we have to do something on the Internet. We have to be, like, an e-company, I think is what people said.
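[Editor's note] To make the "simple API call" point concrete, here is a minimal sketch of what out-of-the-box access to a hosted model can look like, assuming the OpenAI Python client; the specific model name, prompt, and customer-support framing are illustrative assumptions, not anything specified in the episode.

    # A minimal sketch of "use it out of the box with a simple API call":
    # no MLOps stack, no model training, just a hosted model behind an API.
    # Model name and prompt below are illustrative assumptions.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical choice; any hosted model works
        messages=[
            {"role": "system", "content": "You are a customer support assistant."},
            {"role": "user", "content": "Can I change the shipping address on my order?"},
        ],
    )
    print(response.choices[0].message.content)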
Starting point is 00:07:08 You know, those three things are driving unexpected behavior in terms of how you can address markets. And what I and, you know, the small team that works with me have been looking at is, market by market, say you look at services and AI transforming services, and I think I talked about this before, where, you know, software spend in the U.S. is something like a half trillion dollars a year. There's probably $5 trillion in headcount costs for services industries that could be transformed by AI, you know, things like legal or accounting or, you know, you name it, sales, et cetera. And as we've gone through market by market, we've basically been looking at, you know, are there companies that exist? If a company doesn't exist, you'd incubate something. Or there are other situations where you actually want to do a buyout that's AI-driven, because it's better to have somebody take over the asset or the company and then dramatically change the cost structure or the leverage per employee using AI.
Starting point is 00:08:05 And so I think for the first time, two things really make sense. One is incubation, which usually is a terrible idea, you know. But there's a few things that, you know, I and my team have started actively working on. And the second is AI-driven buyouts. And I've now backed one or two of those with the idea that you can suddenly do things with AI that you couldn't before, and you can kind of change the game in a market. And that's beyond, you know, obviously all the really exciting stuff happening in terms of model architectures and infrastructure and all the rest. But I just think from a markets perspective, which is sort of our topic today, there are really odd things you can suddenly do. And part of it is just that the buying behavior has shifted because everybody has an AI mandate,
Starting point is 00:08:40 and they're willing to consider products that they wouldn't be able to consider without AI. So it's a very exciting time from that perspective. Yeah, I want to come back to incubations in a second, having done a few of these. With buyouts, just to articulate, like, what I think is the premise here, because we've also looked at these,
Starting point is 00:08:58 is you are shortcutting the change management process for industries that can be, like, dramatically automated with these new capabilities, by just controlling them, right? You're actually shortcutting three things. You're shortcutting adoption of technology, which in some industries may just not happen very fast. And in some cases, there's an incentive not to adopt it. You're shortcutting change management, to your point. But third, you're actually taking over an asset and you're able to entirely rework the way that a subset of their organization functions relative to AI.
Starting point is 00:09:23 And Klarna kind of did that to itself with its customer support team, where they reduced head count there by 700 people, by effectively using OpenAI and some custom workflows to do better customer support.
Starting point is 00:09:46 And suddenly it was 24/7. It was, I think, almost 20 languages. It was higher NPS than the customer support reps. It was faster response time. It was lower repeat incidents. And there are lots and lots of industries where a lot of the cost is associated with, or there's leverage on, time. In other words, you could keep the same employee base. You could just make them 10 times more effective in terms of the set of customers they can serve,
Starting point is 00:10:09 where a lot of the work is basically what people call email jobs, right? You're copying and pasting data from one spreadsheet to another. You're responding to emails. You're in a CRM, whatever it is. And some aspects of that are now suddenly accessible or automatable via this new type of GenAI. And just to articulate, like, the case against it, too. Like, I think it is, you know, managing services companies, identifying assets, and the operational intensity of, you know, doing acquisitions as well is a different skill set than most, let's say, software-engineering-heavy startup teams have. And so I do think that is the question. Like, you know, can you get the right people to run these things? Yeah, you kind of need both, because the traditional playbook on what used to be called, like, tech-enabled buyouts, which largely didn't work, is you'd have a PE person come in. They'd buy a bunch of companies. They'd roll them up. And they'd put this really thin veneer of software on top
Starting point is 00:11:09 of it so that they could claim to be a technology company, and effectively what you're doing is you're arbitraging a tech valuation in order to buy other sort of EBITDA-rich assets. You're looking for cash flow, right? But you're buying it for cheap at a normal private equity multiple, and then you're raising money on the back of that revenue or cash flow using a tech multiple, which often is, you know, a few X higher. And so effectively, a lot of those early things were arbitrage. And the early investors and the founders of those things tended to do well.
Starting point is 00:11:39 And the late investors and late employees tended to do poorly, because as the thing increasingly became recognized as a private equity play versus a software company or a technology company, the market cap and multiple would come down and normalize, right? It would lose value at the later stages because the investors became more savvy about what was really happening. That's different here, where the leverage on the technology is dramatically higher than just adding a software veneer. But, you know, one of the things I backed is actually driven by a software founder who's hiring in a P.E. team, versus the other way around. And obviously there are very good P.E. people who are hiring, and, you know, or working or co-founding with ML people or AI people or software teams. But you need to make sure that you have the right mix and that you actually know what you're really doing. Because otherwise you're just doing
Starting point is 00:12:26 a roll-up, but you're calling it something else. Which, again, is fine, right? Yeah, but then you just need to have some roll-up DNA. Like, you need financial engineering DNA and investor DNA as well. I think one of the things that makes this, like, idea resonate with me is I have some friends running, let's say, the current generation of, you know these people too, of, like, sort of tech-eating-services companies in traditionally very fragmented industries. And, like, in a small room, they would say, man, like, Sarah, Elad, the automation works and the distribution is the problem, going and acquiring the customer relationships. Yeah. And so you're buying the distribution. Yeah, exactly. You're just buying the whole thing. Okay. Let's talk about incubation.
Starting point is 00:13:05 So my, you know, thinking on this historically has been like, okay, when does this actually make sense at all? It's when, like, you really, like, have visibility into an opportunity set or a customer set that others do not. And the, like, the expertise set required to make a company work, like the DNA, is unlikely to get together organically. And so, like, that's the alpha. Like, how do you see it? I think incubations are usually a terrible idea. And most firms or groups that incubate companies tend not to work very well. And there are kind of examples of that, right? There's a handful of firms that have actually done very
Starting point is 00:13:45 good incubations. And often, but not always, they're kind of vertically specialized. And so they really understand the dynamics of DTC, or they really understand the dynamics of healthcare, or they really understand the dynamics of X thing. And they have proprietary access to customers or product ideas that they know will quickly take off and resonate, or they have a captive customer base they can sell to where they know how to roll together assets, right? And so I think the way that incubation has tended to work historically is if you have some form of deep expertise and relationships that you can leverage, and most people just didn't have that. They would just go and try and start a company. And, you know, the reason to some extent the startup
Starting point is 00:14:23 ecosystem works is you have 5,000 different founders simultaneously doing a parallel exploration of multiple markets and multiple technologies until a small subset of them actually hit something that works, right? So there's probably like a handful, you know, three to ten companies a year that actually matter out of the thousands of founders, right? And so, you know, a lot of these approaches traditionally
Starting point is 00:14:48 have not worked very well in those kind of examples. You know, Snowflake was famously an incubation by Sutter Hill Ventures, et cetera, but, you know, most of the time these things haven't done great. And there's other positive examples, right? Right now, because of AI, the odd thing that's happening is that certain subsets of the market are incredibly crowded. You know, you have a dozen companies all doing the exact same thing, and then you have these
Starting point is 00:15:11 areas where it's really obvious to build stuff, and they're just wide open and nobody's doing anything. And then relatedly, there's a lot of very large enterprise interest in building things or adopting AI technology. And so there's all sorts of forms of incubation that I think are possible now. And so there's a handful of things that I'm working on right now from an incubation perspective, which, again, normally I wouldn't do very much. The last incubation I did was about a year ago, where I worked with Ankur Goyal, who had started a company that Figma acquired.
Starting point is 00:15:39 I worked with him on a company called Braintrust, which is sort of evals and a prompt playground and a few other products all kind of rolled together. And that was just driven by the fact that, you know, he and I had been sort of riffing on, like, what does the enterprise need to use in order to adopt AI? But, you know, it's rare for me to do those sorts
Starting point is 00:15:59 of things, but right now there's just a lot to do because there's so many just clear market opportunities or customers to work with or, you know, so there's that customer pull or market pull that normally doesn't exist. I think it's a very exciting time. One of the things that I think makes this make sense right now is the set of people who understand the technology are not commonly the set of people who understand the domain, right? This is why you have this like mismatch that you describe of like some really obvious open markets. And so I think if you can get like great engineers and technologies who understand what's going on in AI and the capabilities that are possible to actually apply them to where the enterprise problems are, that's exciting. And,
Starting point is 00:16:41 you know, the first incubation I ever worked on was a company called Awake, where it was actually, you know, applying last-generation machine learning techniques to network data for security use cases. But I think, be it that or other companies, it tended to be like, hey, a technologist from a different domain is looking for the use cases that best match, which can be very dangerous, but if you know what it is, then it becomes a unique match. And so I, you know, I think that is exciting. What do you pay attention to just in terms of, like, new applications you think should exist, stuff you want to incubate, whatever it is?
Starting point is 00:17:22 Yeah, there's a ton, you know, and if people are interested in working with me on something, I think they should obviously feel free to reach out through the network kind of thing. But, you know, there's one thing that I've been looking at very seriously on the healthcare side. One thing I'm working on is sort of incubating against these large enterprise assets that I mentioned. And that's very exciting because then you have instant customers. And, you know, we're, you know, in the process of pulling together a founding team for that, if anybody, again, wants to ping me or apply for it. And then there are one or two areas that I'd really like to work on,
Starting point is 00:17:56 but we just don't have bandwidth. You know, one is in the services world, and one is more like a new type of model or, you know, a large-scale model for a specific application area. Are you working on anything in the area? Yeah, well, we just did a healthcare thing. So I'm very excited about it after, you know, many, many months of casting about. So I guess we talked about the really early stuff, you know, which is incubation and all that kind of stuff. The flip side of it is public markets, right?
Starting point is 00:18:25 Like, one question I've been increasingly getting is, like, what do you buy in public markets given AI? Or would you change how you think about portfolio construction given AI? Or how do you think about, you know, existing companies and, you know, the risk of AI to those companies, and things like that? So do you have any sort of thoughts or picks in terms of things that you think are exciting that are, you know, much later companies relative to AI, or how to think about that? I mean, I obviously focus on the early stage, but I'd be curious for your point of view on this. I do think there's a, like, if I could short something, I think that there is a set of mid-to-late-stage private companies and small public companies that do not have the speed and the institutional and founder leadership will to go change the business when they see the writing on the wall. And I think that that's certainly not every company. I mean, we've talked to a number of, like, amazing founders who have attacked the opportunity really aggressively because they see the way to grow revenue or make the business higher quality. But I think that isn't going to be most companies. That's sort of the mid-to-late-stage private set.
Starting point is 00:19:38 So I think that is, that's something I would really think about in terms of portfolio rationalization. What about you? What in the public markets are you paying attention to? You know, I think of it as less just public market specific, but more like what are things that, you know, are durable in the coming world. And there's almost two versions of this question. It's like, what do you want your kids to be able to own someday or whatever, if you can afford it?
Starting point is 00:20:03 And then there's what portfolio you would construct today in tech. And, you know, the crazy thing is, if you go back 10 years ago, there was kind of this era where people were talking about FANG, right? Those were the four companies you were supposed to just buy and hold. And that was, like, Facebook, Alphabet,
Starting point is 00:20:31 Netflix, and Amazon, right? That kind of morphed now, and Nvidia is part of it. And they, you know, they keep kind of changing the rubric. But to some extent, you know, one question is, what is that next set of companies that have multi-hundred-billion or trillion-dollar potential, right? And so there's some basket of, like, whatever you want to call it, new thing or new, yeah, whatever they call it, the Great Eight or the Something Seven, or, I can't remember what it's called, Elad. Yeah, it's the Magnificent Seven, right? The Magnificent Seven, right? So what are the, what is the next wave of that?
Starting point is 00:20:56 And maybe that's SpaceX and Stripe. And, you know, you can kind of think through, what do you think are the companies that may just keep compounding for the next 10 years? So that's one segment. Second segment is potentially some of these pre-existing big tech companies that will just keep going. Like, I wouldn't be surprised, for example, if Apple ended up with, like, another iPhone replacement cycle just for AI. Oh, we have a new chip that runs these, you know, they mentioned that they're going to be working with smaller models, running these smaller models locally.
Starting point is 00:21:25 You need to upgrade your phone, et cetera. So at least for some period of time, Apple may have a bump from that. So there may be some big tech. Third is what I call like AI durable companies. Like what are the companies where AI just doesn't matter for, you know, is it railroads? Is it certain SaaS companies, right, where it just doesn't matter. You add AI, who cares? That actually shows durability.
Starting point is 00:21:51 Like, if AI doesn't improve the product that much, it's not really a competitive factor. So you can imagine a basket of stuff there. And then obviously there's, like, what's the new AI index besides Nvidia? Nvidia's a core piece of just a way to index the market, but there may be other companies. Maybe that's Scale, maybe it's something else, right?
Starting point is 00:22:11 But I think I'd kind of think of it across those four segments. You know, a big tech subset, what's the new Magnificent Seven, what's durable in the face of AI, and then what's your AI index? You know, as we build out this whole AI wave, how do you think about staffing AI companies and post-AI teams, and what does all that mean?
Starting point is 00:22:35 you think the founders right now, both, you know, existing and new, they've been, like, high-quality, motivated, ambitious in a way you haven't seen in a while, just really, really good. I think one specific trait, or specific belief, that keeps coming up is, like, especially with second-time founders, but both types, like, I've seen founders be really ambitious to have, like, a great, pure metric of revenue per employee, right?
Starting point is 00:23:06 They're ambitious to have really small teams. And I think that's interesting. Like, people talk about, like, the one-person billion-dollar company or whatever. But I do think this is kind of the purest version of, like, actually doing startups, right? Because you never quite had enough money to begin with. So you're always trying to solve the problem with the people in the building. And so, you know, solving it with fewer people in the building and more technical capability feels like just an extension. Yeah, it feels like we're a couple years away from that, though, right?
Starting point is 00:23:36 Like, there are some things that we can become dramatically more cost-efficient on, but, you know, at least the existing coding copilots and other things give you some efficiency, but not, you know, 5x. And maybe as some of these, you know, AI coding agents really start to work at scale, that's when you have this shift in terms of cost structure. Obviously, there are cost structure shifts that we talked about in terms of buyouts
Starting point is 00:24:01 and customer support or BPA or other things. But, you know, I think there's a, one of the questions in my mind is, like, when do we feel this really hit? And what is the GPT-generation equivalent at which it happens for enough of the economy or enough of a tech company, right? Like, what portions of a tech company, for a startup, can be automated today? Because if you think about it, many startups are six to eight engineers and a designer or product engineer, and then, you know, a founder or two, and then sometimes, like, an early business person, right?
Starting point is 00:24:39 So there just isn't a lot to get rid of in some sense for very early companies. I don't think it's at that stage. And I would agree. Like, the models in this generation, they're just not smart or coherent enough; you're still fighting this uphill battle to get them to do tasks that are valuable. And, like, I don't think you're going to eliminate headcount in the first 10 people. You're, like, you know, doing what we're doing, like looking for the pockets of workflow or services, whatever else, where there's that minimum viable quality already today. But when I ask, like, I'd be curious what you think. When I ask founders what functions they think shrink head count first and, like, how soon, was your question?
Starting point is 00:25:15 I do ask it all the time, and I kind of get, like, we're two or three years from it. And then, for, like, actual headcount impact first, going back to an example you brought up, I get SDRs and I get support. You know, there is real volume there as you get to hundreds of people. I think the other observation on just, like, what's happening with teams is the cadence of competition and change in the environment are, like, what matters in the environment for a startup. It is, you know, like, here's what your supply chain is doing, what capabilities are out there, what your competitors are doing, and how quickly your customers adopt, right? Maybe I'm missing something, but, like, those are kind of the external things. And, you know, customers will be customers unless you control them and buy them out. But the pace of change of the
Starting point is 00:26:03 underlying technology has accelerated such that, you know, you have this, like, increasingly aggressive race. And so it seems quite important to be velocity-oriented. And maybe this is just axiomatic in startups. But I do think that there is this tradeoff where the dynamics have changed. And the tradeoff on one side is you still need to hire experienced leaders to, for example, help build certain go-to-market functions well and effectively. And on the other hand, like, hiring these leaders tends to, you know, they tend to come with more experience and they have a natural pace that is often quite different from, like, the initial early-stage startup team. But especially for the prosumer companies that, you know, we've both been involved
Starting point is 00:26:54 in, having that go-to-market leadership is not quite as in the driver's seat, quite as early, but you do need to existentially maintain this product velocity. And so on the margin, I'd say, you know, for these companies that have this pull from the market in AI, you can choose that velocity instead of bending to experienced leaders, or at least be more selective about waiting for and finding and hunting the experienced leaders that, like, match or enhance the natural velocity of the company, and/or can step up to it. I think it goes back to, like, big companies suck, and you can kind of, like, punt and avoid big-company-ness for longer.
Starting point is 00:27:40 And I think that's one of, like, the core dreams structurally of, like, what does AI mean for teams and companies? I think another thing that has been happening structurally, from how companies are oriented or what the teams look like, is that there is a rare set of companies today that are not initially AI companies where the products, the whole company, are really manifestations of the founder's product taste: these really design-oriented companies like Ivan Zhao and Notion, or Slack and Stewart Butterfield. But in AI companies today, models are such a big piece of the product experience, and thus the, like, model design, and I do mean that in the aesthetic sense of the term, is quite important and often comes from the founder. And I think a bunch of examples of that could be Midjourney, HeyGen, Ideogram, Suno, Udio, Pika.
Starting point is 00:28:40 Like, these products, especially in the creative field, they really reflect, like, the type of output that the founders want, the type of data they're choosing. And I think some of these companies end up being more founder-driven in terms of product and model sense than maybe a traditional software company. Yeah, makes sense. Cool. I think we covered a lot of really great stuff on the market side.
Starting point is 00:29:06 So thanks, everybody, for listening today. Thank you.
