a16z Podcast - 2023 Big Ideas in Technology (Part 1)

Episode Date: January 24, 2023

At the end of 2022, our team at a16z asked dozens of partners across the firm to spotlight one big idea that startups in their fields could tackle in 2023. Emerging from this exercise came 40+ builder-worthy pursuits for the year, ranging from entertainment franchise games to precision delivery of medicine to small modular reactors, and of course loads of AI applications. In our 2-part series, we'll be covering 12 of these big ideas with the partners that shared them. Here in part 1, we'll cover Consumer, Games, and Enterprise, with a little Fintech sprinkled in. Listen in as we chat with Connie Chan, Anne Lee Skates, Jack Soslow, Doug McCracken, Sarah Wang, and Sumeet Singh. And look out for part 2 dropping soon, covering Fintech, American Dynamism, and Bio & Health!

For the full list of 40+ ideas, check out the full article: https://a16z.com/2022/12/15/big-ideas-in-tech-2023/

Topics Covered:
(1:30) Breakthroughs in Buying (Finally!) - Connie Chan
(16:40) Unlocking the “Third Place” - Anne Lee Skates
(30:14) Games as a Neverending Turing Test - Jack Soslow
(46:09) The Metaverse Goes Fashion Forward - Doug McCracken
(59:24) Generative AI Advances Beyond “Text to Image” to Complex Workflows - Sarah Wang
(1:12:23) Embracing Large Language Models and Maintaining Trust - Sumeet Singh

Stay Updated:
Find us on Twitter: https://twitter.com/a16z
Find us on LinkedIn: https://www.linkedin.com/company/a16z
Subscribe on your favorite podcast app: https://a16z.simplecast.com/
Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. For more details please see a16z.com/disclosures.

Transcript
Starting point is 00:00:00 At the end of 2022, our team at a16z asked dozens of partners across the firm to spotlight one big idea that startups in their fields could tackle in 2023. Emerging from this exercise came literally 40-plus builder-worthy pursuits for the year. And that ranged from entertainment franchise games to precision delivery of medicines, small modular reactors, and of course, loads of AI applications. And in this two-part series, we'll be covering 12 of those big ideas. In today's part one, we'll cover ideas spanning consumer, AI, games, and enterprise, featuring the voices and big ideas of Connie Chan, Anne Lee Skates, Jack Soslow, Doug McCracken, Sarah Wang, and Sumeet Singh. In these conversations, you'll hear several important themes, including how physical transfers to digital, what the next wave of AI might look like, and specifically what role fault tolerance might play in this equation, what we can learn about social commerce from the East, and what role the growing number of bots in our lives might play. All right, let's get to it. As a reminder, the content here is for informational purposes only.
Starting point is 00:01:03 Should not be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security and is not directed at any investors or potential investors in any a16z fund. For more details, please see a16z.com slash disclosures. Let's start with Connie. The big idea is around social commerce. So social commerce, discovery commerce, video commerce, these trends are inevitable, and 2023 will be the year it all becomes obvious.
Starting point is 00:01:43 Social platforms will become a natural place for product discovery, and platforms will seek to ease the friction between inspiration to purchase intent, and finally, the completed purchase transaction. Whether it's short form, long form, or live stream, video is going to be a fantastic, way to sell things and teach consumers about a product's value proposition. This will birth an entirely new ecosystem where anyone can become a seller, and new companies can help everyday creators curate and sell products and services. All right, Connie, so I like your prediction, but I think some people could argue that some
Starting point is 00:02:16 of these aspects of social, video, and discovery commerce that you're describing are already in certain applications. I'm just going to use Instagram as an example here. So what gaps do you see in terms of how social implement? discovery or changes its involvement with e-commerce and how perhaps does this may be compared to other parts of the world that have a different state? So I think you're right in the sense that we already all shop from Instagram. Most of the time we get lots of product inspiration from the ads. But whether or not we actually complete the transaction right there and then
Starting point is 00:02:48 that's still a TBD. And I think it's because a lot of the platforms today, the Western platforms still have way too much friction between inspiration to discovering more, learning about the product, to finally doing the actual checkout, the actual payment completion. Today, you might find inspiration on Instagram. You might find a video of an influencer talking about a product. But you might not click and complete the purchase right there. And then you might instead open a rouser and go to Sephora.com and go search for that item or go to Nordstrom.com and go search for that outfit.
Starting point is 00:03:21 The transaction right now is still kind of being broken up. And that's because there's a lot of friction and the actual implementation of it's not great. But I think the other thing about video that's still very underdeveloped is the idea of something like live video. And live video is a fantastic way to sell things. And it is something that I'd say we haven't really explored out in the West yet outside of these other platforms like whatnot that does live shopping for collectibles and other categories. And the reason why live is so powerful is a bunch of things. One, there's community. The people who are all buying at the same time are talking to each other.
Starting point is 00:03:53 comments in the chat. They're talking to the seller. You can ask the seller a question directly. Say you are buying a used luxury purse. You can say, hey, turn it inside out. Can you flip it over? I see a little stain. Can you put it up closer to the screen? I want to look at it more closely. So basically as a consumer, you get more confidence to buy. And shopping is all about giving consumers confidence that this is the thing that's going to change their life and make their lives better, right? So live video just gives you more ways to answer questions and get that information to the consumer. But also, live video allows you to do these kind of limited time discounts or these window duration promos in the same sense that you have Black Friday sales. Why not have a sale for a few hours where the price is just different and then drive people to do volume purchasing, to do group purchasing.
Starting point is 00:04:44 There's just a lot more stuff that gets unlocked when it's live. So whether it's, again, short video, pre-recorded video, long video, I still think there's more to explore in the same sense that you and I, when we grew up, we had these, like, as seen on TV, infomercials. And they worked. They worked for so many of us. And for a lot of it, it was because it had entertainment value. Right now, the video that we see on Instagram today, most of them are still ads, or maybe an influencer kind of campaign sharing a product with you, sharing their endorsement, but they're not entertainment yet. And there's still much ways to go. before video shopping in the West looks like entertainment. Yeah, so I think you're painting a pretty compelling picture, but I still am trying to understand why we haven't seen some of these buying behaviors. I mean, to your point, the infomercial has existed in our Western culture for a long time,
Starting point is 00:05:33 and you said it worked. And so what is stopping us from having these platforms or having the habits, really, of a consumer to go seek out that kind of shopping experience? Does it just really not exist yet? Has the right company not sprung up? Or why do you think we maybe see some of these behaviors in other parts of the world, but not really in the West? We see them more exemplified in a place like China, where live shopping is very big. I think it's because the platforms implemented it, quite frankly, years ago.
Starting point is 00:06:04 And there's also a huge ecosystem built out where you have people who are just curating products, handling logistics, customer service, and so forth. But there is no reason that that user behavior, that customer behavior, is only going to going to work in Asia. In fact, in the U.S. with whatnot, you already see lots of people loving the community of buyers and sellers and finding things that they're buying that they didn't think to go buy. There's already an element of discovery that happens with life. But you start that question asking about infomercials. And let me ask you, when you saw those infomercials as a kid, it wasn't just that the person was demonstrating the product. It was because if you buy it
Starting point is 00:06:44 now in that like 10 minute window, you had all these freebies. Or if you bought it right then in that 10 minute window, it was discounted. But once that infomercial ended, it might not be that price. That's why infomercials work. And that kind of time element is very possible with something like live video, but not what we're seeing today with kind of these pre-recorded Instagram videos with celebrities because quite frankly the platforms haven't built out the tools to allow them to do something like that yet. I love that you brought that up because I think through digital scarcity and how in an infomercial you're like if I don't buy this now, I'm not going to have access to this discount. I won't know where to get it either. Yeah. Exactly. And I feel like
Starting point is 00:07:30 there's almost like the opposite dynamic that has become a habit at least for me on Instagram using that example again where I will literally see an ad for something. And if I don't click it right then and there. I'm like, I know I'm going to be retargeted. I know I'm going to get this ad again. And so I don't feel this urgency, this scarcity. But another aspect of my shopping experience from when I was younger that I don't see in the digital experience is I used to think shopping was fun. It was a social activity. I'd go with my friends. We'd talk about, you know, what we wanted to get. I'd look at what they're buying. And I think what you're pointing at is also some of these other habits in other parts of the world in the East. It's a social experience. It's
Starting point is 00:08:10 entertainment to actually shop, which I don't think really exists in at least my current experience. For years now, in China, when you were shopping, you could put four or five dresses in a shopping cart, send that link to several friends, and they could comment on every single one, and then you could see under every dress the comments from each of your friends. And that feature's been there for years, right? So the idea of unlocking social commerce or product discovery excites me a lot. Like when I walk into Target or Costco, I might have like one thing. thing I need to get. And I always end up with a lot more stuff in my shopping cart. Totally inevitable,
Starting point is 00:08:46 right? I've carted from Costco less than $200, even when I went over one thing. But does that happen when you shop on Amazon? It doesn't. It doesn't. Because the West shopping is still search based. And we are not focused on pushing kind of this discovery-based full 100% recommendation engine shopping experience to the end user. I say Shian is doing it. Temu, the new player in e-commerce, pushed by Ping Duodua, they're pushing that too. But we have still yet to see more commerce companies
Starting point is 00:09:21 take on this product discovery focus as opposed to search and filter. And let me just help that person navigate all the skews I have. The reality is consumers, when you went shopping with your friends, you might have picked out coats or dresses that you wouldn't have gone to search for if you were just searching by, you know, neck type or sleeve length or bestseller, right? It's not going to surface the same things to you as
Starting point is 00:09:46 if you were just walking in a store. That's a great analogy. Something else I want to ask you about is I've heard you talk about how social apps in particular, which you could say are becoming commerce focused apps if they aren't already. A lot of them have kind of devolved into copying one another. Maybe the most prominent example is when Instagram copied Snap Stories. We also saw a version of this last year when Clubhouse was copied by just about every social app out there in terms of having live audio. And so why does it at least seem like social platforms today are just copying the same features from one another and not testing some of these more creative features, especially when we've seen some of these creative features work in other parts of the world?
Starting point is 00:10:31 Yeah. I think there's both logic to it, if you think about it, but also opportunity, right, to change the status quo. The logical part is that, hey, this is already a proven idea. I already have distribution on my app. Why don't I just copy this feature? It's going to work on my app too. And that's actually a very, I call it super app, the phrase super app, almost a super app mentality, which is that I have distribution. My users will want the same thing. I should stick that same feature or that same. product in my app too and it's going to work. So there's logic for why they're doing it. But to your point, there is missed opportunity because there are all these other business models that they're not exploring yet. It's almost like they have to wait until some startup gets traction with it or some other company gets traction with it before they try and implement it. And if they don't do it correctly, it doesn't always work. But some of these Eastern to Western translations they take time. This has been part of my investment thesis for the last 10 years. I take ideas that work in Asia and other parts of the world, try and figure out what can be copied in the U.S.
Starting point is 00:11:36 and then I try and teach it to as many people as possible so we can learn from companies outside of just the U.S. My learning is that there's usually a time lag of at least a year and generally three to five years before some of those concepts that are already mainstream in Asia become very popular here. And is that just a reality that there's just friction, there's inertia in the way we do things, which sometimes is different to other parts of the world, Or is there some sort of misnomer? There's some sort of misunderstanding where people think, oh, there's like a different culture and therefore it won't apply here.
Starting point is 00:12:10 I'm just trying to understand why it takes so long for us to absorb these learnings. It requires a very open mind because you have to go at it or think about product with the assumption that, you know, just like Ratatouille, anyone can be a cook. Like anyone can come up with a great product idea. And if you come up with that mentality, you become much. more open-minded and you actually want to see as many examples of success or failure as possible. And I think another big reason, though, why some of this stuff takes a while to implement is twofold. One, it requires huge organizational changes. This is really a top-down
Starting point is 00:12:46 decision. It's very difficult to change this type of mentality bottoms up. And the reality is, if you are changing your business model of, say, like an Instagram newsfeed and you start pushing all these live shoppable videos, that's taking up space that otherwise would have gone to an ad. that generates money, right? So your P&L takes a near-term hit. You are using a pixel space that could have gone to something else. And so this is a very high-level decision or high-level framework that needs to really come from the top and have really strong buy-in from the top because there are near-term impacts when you try and make these big shifts, right? The second reason I'd say is as much as we use our phones and are addicted to our phones in
Starting point is 00:13:25 the U.S., I think we are still mobile but also PC-oriented. We are not mobile only. Maybe we're mobile first, maybe younger generations, but we're not mobile only, right? Whereas I'd say in China and a lot of developing countries, they are mobile only. There are people who literally do not have a computer and everything is done on their phone.
Starting point is 00:13:47 And therefore, they're experimenting much, much faster on mobile product experiences and mobile business models. And in the U.S., because a lot of platforms try and be backwards compatible, they have to make sure that whatever they have on their app it's also supported on the computer, it limits their creativity or it limits the rate at which they can innovate. I'll give you an example to kind of solidify, like, how mobile only China is versus the U.S. In the U.S., if I'm buying a plane ticket to another country, to another state, whatever, I might research that plane ticket on my phone, but I might still buy it on my computer.
Starting point is 00:14:24 And that's the data that actually comes back from the travel companies, that those transactions, the high-ticket transactions are still happening on your computer. And some might even phone in, right? Yes. Some people might even phone in to have that assurance, right? And whereas something like that, it would all be on the phone. 100% of that kind of transaction would all happen on the phone in these developing countries. And so this kind of disconnect between are you mobile first?
Starting point is 00:14:52 Are you mobile only is another big reason why the U.S. moves slower. I like that frame. and I saw the same kind of leapfrogging from living in Indonesia for a while where truly, as you're saying, it's not mobile first, it's mobile only. I like that frame a lot. And something that you articulated in your idea for the year is that these trends are, in your own words, inevitable. So what do you see specifically changing in 2023 this upcoming year as we look to some of these perhaps long-term shifts? But what do you see happening very soon? I see platforms taking, I don't know if more risk is the right word. But platforms are taking more action. They are going to experiment more,
Starting point is 00:15:34 specifically with things related to commerce or things related to creators. There's just going to be more experimentation happening on that front. And because I'm so bullish on the efficacy of it, because I know I have like an abroller and oxyclean and like all this random stuff that I bought from infomercials, they just work. And I think it's going to work on the phone too. There's no reason why I wouldn't. And because platforms are waking up to this. And also brands are waking up to this. I think that combination is the important catalyst. Now that we're still missing ingredients, we still need products to sell. We still need creators who are really good at selling. Now, the creators who are big on Instagram might not be the best sellers. Just go back
Starting point is 00:16:17 to your infomercial days. Like, there are some people really good at selling. They're not necessarily going to win Academy Awards because they're not necessarily great actresses, but they're really good at selling. And in the same sense, I also think even if we see an increase in commerce for video, you might also see a new category of creators of merch. Next up, we have Anne. Hi, I'm Anne Lee Skades, and I'm a consumer partner at A16Z. My big idea for 2023 is unlocking the third space. Ray Oldenberg coined the term third space, places that host regular, voluntary, informal, and happily anticipated gatherings outside of home or work. These used to be, in real life, community gathering spots, such as bars, coffee shops,
Starting point is 00:17:06 churches, gyms, and clubs. But remote and hybrid work, as well as this generation's natively digital habits, have given way to an online-first era. So many interactions and first encounters occur online, whether it's over Zoom or meeting a like-minded friend with similar interests over Twitter or TikTok, or in a Discord group or a game? What are the next generation of tools and platforms that are built to serve consumers, community builders,
Starting point is 00:17:33 and creators in the post-COVID hybrid world? And how can new technology such as generative AI create user experiences that facilitate deeper discussions and relationship building between real humans and not thoughts? All right, Anne, I'm so excited to talk about this topic because we've been seeing this theme across several of our big ideas, this transition from physical to digital. And before we get into that, I just want to hear from you. I mean, we heard Ray Oldenberg's definition,
Starting point is 00:17:59 but how would you define this idea of a third space? I think the third spaces is physical locations that are not your home and not your work, where you can explore different but important parts of your identity. To me, there are two dimensions. One is obviously the actual physical location, but the other is a person's mental and psychological state. Third spaces are places that enable freedom and safety outside of our workplaces that are homes. I think in terms of the physical location dimension, there are places you congregate socially and you're able to explore different parts of your identity really and safely.
Starting point is 00:18:37 Psychologically, to me, these are spaces where you don't have a defined role of responsibility. For instance, at home, you might be a parent, right, or you might be a breadwinner or a head of household or both. But at work, you have a specific job duty and associate responsibility. neither of these places offer true freedom. Colleges to me are really good earlier life examples of good third spaces. So there's lots of communities and spaces where you're free to explore and free to interact with whoever you want. For adults, this could be religion and churches. It could be games and gaming bars, cafes, arcades. For fitness enthusiasts, this could be
Starting point is 00:19:19 gyms and for art lovers, you know, this could be a museum or a particular museum or even space within a museum. But there's been definitely a dwindling of these types of public spaces. And especially with COVID, actually, there's an interesting stat from the Atlantic that in 2019, two-thirds of Americans said they had a favorite local place. They went to regularly. But that two-thirds has actually dropped to just a little more than half. I like the frame of having a place in society or feeling like you can almost shape or mold how you show up because as you said there's kind of like predefined notions in the way that you show up to work or at home but these third spaces are really important you mentioned college like that was such a formative time of so many people's lives because they could kind of reshape themselves and say hey I actually want to show up differently here but we're also starting to see in addition to the dwindling of these physical spaces maybe some of these digital spaces arise which is really interesting and something that I talk about a lot on the podcast is how we tend to copy and paste what exists in the physical world straight to the digital. So you gave examples of like
Starting point is 00:20:24 bars or churches. During the pandemic, we saw the equivalent of those, again, kind of being copied and paste it, like the same routines, the same ways of interacting just, you know, on a web page or in a Zoom application. But I think when things get really interesting, it's when we start to rethink these spaces with the tools that we have online. So have you seen any examples of where people have kind of, as people say, from first principles thought about how do I use this to actually facilitate some of these human connections that we previously formed in physical third spaces. Yeah. Before I go into an example, I think there's two particularly interesting things here about what digital natives unlocks for these third spaces. So the first is you can connect to more
Starting point is 00:21:10 specialized people and have a much wider selection of people because literally now online, you have a global pool to choose from. That's like everybody, billions and billions of people in the world. So, like, let's say you're an anime fan. Instead of connecting with 800K people in San Francisco because you live in San Francisco, who may or may not share your interests in anime, you can now actually have a selection set of 100 million people globally who are true fans of anime. And, you know, more concretely, like taking whatnot, which is one of our portfolio companies, that's a live shopping platform focused on collectibles and communities as an example. If you think about whatnot, it might either look like just a marketplace or it might just look like a digital games toy collectible store, but it's actually not a copy and pace of a physical game store.
Starting point is 00:22:01 So they've got people who are hosts and guests and they're literally coming from all over the world from lots of different countries. There's gamification built into the product via just a live auction mechanism. So it's kind of like you're in this real-time auction with all these people. And there are digital goods and services that are transacted live, and you can jump from one string to another in real time and actually have tens of thousands of choices of different rooms or even different streams, different events that you want to be a part of that you can literally immediately go to without physically changing your relocation. This relates to my second thought, which is in a physical event, your experience is really limited by presence. you can physically only be present in one place at one time. Your physical belongings and items can also only be in one place at one time. But in a digital world, this concept of presence is completely different.
Starting point is 00:22:56 You can actually reimagine what this means. You can be in a digital environment but also know what's happening or be somewhat immersed in a completely different environment without having to physically go there. And you can immediately change the experience of presence by changing. the environment in a digital world. You can, you know, change the space, change the mood, change the furniture, change the background, what you're wearing, anything, and everything. Games in here, it's probably the most advanced and digitally native. And I think in general, it's always a good place to look for the future of consumer technology. Dungeons and Dragons is
Starting point is 00:23:30 a good example. So it's an original OG-R-PG game. You know, there was sort of like a physical tabletop version. You had to physically gather around with other players and have enough people to fill in various roles, but you're limited by the specific people you pick, their real life imagination, and actually having these people available. But today, in RPGs, you're able to explore completely different worlds, play with AI or humans, but these humans can be from all over the world, and you can immerse in any visual and audio environment of your choosing. That's literally not possible when you're playing in real life. It's exciting. I think about chess is something I played when I was younger, and I only had access to the chess coaches in my immediate area
Starting point is 00:24:15 or the chess clubs in my immediate area and the people within those clubs and the skill sets that they were at. And so, yes, it does completely change who you have access to. But I also like this idea of different modalities of fun or engagement that you mentioned. And something that comes to mind there is I've been talking a lot to my friends recently about how all we ever do is go and eat or drink. And that's what you do. You just meet up with friends and talk. But I remember when I was younger, you know, college comes to mind. We would go play tennis. We would go join a dance class. And you're going to more fun activities. Right. Reble comes to mind here where digitally we've turned coding into a game. You also mentioned, you know, the gaming space overall. Fortnite is something
Starting point is 00:24:56 that we see many kids and older folks playing together. And so I think that's encouraging to me. I think a lot of people look to gaming in a negative way. But to me, we're seeing, again, these new modalities of how people are engaging with one another instead of just talking to one another. And so are there any other examples that come to mind there in terms of like how the internet or how the digital tools we have really change our ability to engage and connect? Yeah. There's quite a view that come to mind. And I definitely think gaming is the best example, but actually for the purpose of those questions, since we're looking at all sorts of activities, I'm hopeful that there will be a lot more digital activities that you can do with
Starting point is 00:25:37 your friends outside of gaming. I think if you look at new relationships, you can build through tech, whatnot, TikTok, Reddit, Discord. These are all really great at facilitating interest-based and sometimes even stranger-first relationships and actually deepening those and giving people a place to outlet and nerd out about very specific niche hobbies and niche interests. For existing relationships, I think that there can be different forms of gaming ranging from really casual types of games like raising a virtual pet all the way through to more serious games like Fortnite. And I also think there's new forms of media creation, Be Real, TikTok, being the obvious most recent example. One interesting one is Lenza AI, which is, you know, the sort of GPT,
Starting point is 00:26:26 generative AI created visual of yourself based on a few photos of a person. They actually had this creative thing where you can now make Gen. AI art for your friends and family and even for your pets. So not just for yourself. And it's an interesting new way to use technology to engage with your friends. Yeah, I love how, as you're speaking about, people are creating together. I can't think of other examples like this in the physical world other than going to maybe a random pottery class once a year with friends. Like, I think it's really exciting that you actually you see friends and family from far away connecting over the creation of these digital products. I'm curious, though, I could see some pushback from people saying, you know, this is great.
Starting point is 00:27:09 Like you can foster your online pet, but we have to exist in the physical world. And of course, there's merit to that. We eat in the physical world. We sleep in the physical world. And connection in that world is important, too. But I'm curious to know if you've seen any trends in terms of some of these habits that are inevitably being formed by the time, the sheer time we're spending online, if those habits or those new modalities that we've talked about, are they reshaping any ways of interacting in the physical world? Have you seen any of the transfer in reverse? Because so far we've
Starting point is 00:27:39 talked about how physical transfers to digital. Yeah. And I also think it's interesting to see just the interplay. What interesting example, I think from a technological lens is VR. It is obviously by nature a completely immersive experience. And so for instance, we have this education portfolio company called Prisms VR that's teaching kids math in a spatial world. And what we see is actually physically in classrooms, kids are using these headsets and learning math and VR. The more advanced kids are actually more eager to help out kids who are struggling with a certain concept after being in this new novel digital environment, even though they were solving problems alone in the digital environments. So I find that really fascinating. I would
Starting point is 00:28:28 say another under-export area here is thanks to Zoom COVID and remote work, we're now really comfortable interacting with people online, whether it's strangers or colleagues or even, you know, your loved ones. But I think there's an opportunity to digital products and features that prompt or create in-person interactions. So Pokemon GoToMe was last such mass market product. And now we're seeing some innovations around the creator economy of fandoms and creators and celebrities gathering online and translating those online interactions to the physical space. Imagine, like, you showed up for a physical concert and you're prompted to meet other fans
Starting point is 00:29:06 that you've already interacted with digitally before who maybe share the same favorite songs or share the same favorite experience with a celebrity or collect the same digital good. I've been thinking a lot about how we have, in many cases, so much data from the digital world, but then we don't, as you're saying, translate that back to when they're physical gatherings.
Starting point is 00:29:25 Like I see so many podcasts or communities thrive in the digital world and then they organize these in-person meetups and they're just like, hey, everyone, stand together, like figure it out, meet people. And they have so much information about the types of topics people like talking about even potentially demographic data where they can connect people who have the same problems or who are at the same phase of life. I think you can facilitate such richer discussions and richer interactions once you use all those pieces of digital data that you're talking about. Exactly. I love this idea of blending the two because there's positives to both the physical and digital. And again, I think when we go physical to digital, we often just copy and paste in strange ways. And then when we go the reverse way, we almost like forget all of the benefit that we got from the digital world that could be implemented in the physical. Next step, we have Jack. Hi, I'm Jack Soslow. I'm a partner on the games team focused on ARVR, games infrastructure, and AI.
Starting point is 00:30:21 And this is my big idea. Games as a never-ending train test. It's not hard for a computer to deceive a human. From Eliza to chat GBT, computers have successfully masqueraded their biology. This phenomena also occurs in games to the mass prevalence of bots. Bots have historically been scripted procedures, but are increasingly becoming true neural network-based AIs. The bots are all around you, you just don't realize. Look to the rumored concurrency bots in words with friends,
Starting point is 00:30:53 the close-win bots and call-of-duty mobiles onboarding, the prevalence of cheater's bots and chess or the never-ending war of bots and platforms in games like Rintag. The next generation of these bots will take human-like to a whole new level. Startups like in-world AI, Comve AI, or charisma.a.i are making in-game agents that understand game state and have conversations, emotions, real objectives. Imagine walking through the wilderness not knowing whether your clansmage is a bot, building a town in a township tale with strangers without knowing the humanity of the local farmer,
Starting point is 00:31:27 playing a game of diplomacy, but not knowing if Turkey is an AI gutting for European domination. In the year ahead, you may not know who's who anymore, and you won't mind. Games are good alone, but better together, or so you think. As you've said, it's not that hard for a computer to trick a human. It turns out the Turing test was probably too low of the bar, and we're seeing this play out everywhere. I think maybe a recent example of this is chat GPT, which is throwing out a lot of correct answers, but also a lot of incorrect answers,
Starting point is 00:31:56 and humans are just kind of taking it at face value. So maybe we don't even need human-like bots to trick humans. It seems like less evolved bots are already doing that. So what do you think about that? Have we already kind of crossed that chasm? Oh, absolutely. And I think the question here is, at what point do you cross the chasm in whatever domain to deceive humans?
Starting point is 00:32:19 So, for example, you used to read the words, don't believe everything you read on the internet. Now I think this applies to AI. Don't believe everything you read from AI. The Turing test, as you say, is a flawed metric. And the reason for this is in part because it is too low of a bar. It is a binary bar. So it's either intelligent or not intelligent. And it tests for deception, not real understanding.
Starting point is 00:32:42 What these AIs do is they try to understand the underlying rules that generate the data set that they're trained on. and then generate an approximation of what they believe is the next token. But it doesn't mean that it's right. In the case of ChatGBT, BT, it communicates with a lot of confidence. A great example of this might be, say, asking ChatGBT to multiply 153 times 257. It might get an answer that's close to correct, and it communicates it like it is correct, but it's not correct. And this is a deterministic domain where you know what the right answer is.
Starting point is 00:33:14 there are a lot of domains where there isn't actually a right answer. The right answer is much more amorphous, say, like, you ask, what should China's AI strategy be? This is a very complex question, and it is dictated by your bias underlying what data you've seen in the past, what you believe about China at large, and then also what you believe the importance of AI should be at large, too. This is a complicated question with a lot of different answers. Open AI and chat GPT might have one answer and then say maybe a Chinese AI, might have another. Because these things are information sources, how much can you really trust them? In the case of Google, you ascribe some trust to the outputs, but you don't really know
Starting point is 00:33:56 the underlying model that contributed to generating the sources that you see. You might trust your mother, but what went into her reasoning process? You probably don't trust the onion, but I'm sure some people's grandmothers do. The question of should you trust chat GPT, the answer is probably for most things, but it's not always that clear. We don't know exactly what went into their training data, which is the underlying rule set that they're generating their model upon. And then additionally, there's a second layer of filtering. With these two layers of filtering, it's hard to understand exactly what it is
Starting point is 00:34:30 or what isn't going on in the model. But for the most part, when people have been testing the outputs, they've been fairly trustworthy. So in my opinion, my faith is I trust the AIs for the most part and then we'll validate their results on Google or other pieces of information. And what makes me positive and optimistic that trusting AIs is going to work out well is because there's been a large transition more towards open source and openness. So you understand the data that's going into the models.
Starting point is 00:34:59 Then you also understand the weights and biases. So you know exactly what you're getting. And then additionally, these models get better, better over time. So where 153 times 257 may have been incorrect in the past, it might be right in the future with GPT4. And then because it's been trained on the entirety of the internet, it actually does understand certain things much better than a human would. So I do trust it as an information source for the most part.
Starting point is 00:35:26 I like that you mentioned trust is really a sliding scale, and that's not just with bots, but that's with humans as well. And maybe one differentiating aspect of bots, or in the case of chat GPT as an example, is that when there are falsehoods, they can be corrected at scale. To use your example, if an answer is incorrect, and you can retrain the AI to no longer do that, it will no longer do that, not just in the single instance, but globally. So that's interesting.
Starting point is 00:35:50 But I think another interesting aspect of your big idea for the year is that we are really experiencing this exponential age as it relates to software, and that includes bots, includes AI. And by nature of that exponential technology, I think we can expect to see a lot more bots. And these bots becoming a lot more convincing, which is exciting, maybe. scary. There's all sorts of implications of this where we're not just talking math, true or false. It's like, do I know if this person's real or is it really their video, right? Is it a deep fake? And so I think there's going to be all sorts of implications, but focusing it back on games specifically, how do you think this is going to impact the gaming world? Is this going to make
Starting point is 00:36:31 certain games not even worth playing because you're just really encountering a bunch of bots instead of humans? A game is a structured form of play, and the key word there is structure. There is structure. are rules involved in the game. When I go and play a game, I enter, say, like, call of duty under the auspices that everybody is going to be a human and they're playing by the rules. When there's a bot involved, say someone has AIMBot, then I feel like I've been treated unjustly, and then the game becomes less strong. The question then becomes, should you design rules for bots? If you're going through the runescape, never-ending war of banning bots, then the bots come up with a different strategy to get around the ban, and then you have to ban those
Starting point is 00:37:07 and new strategy, ban those. It's like, is that worth the effort for just maybe designing the rules for bots in the first place? Maybe you lower the resource constraints or make it particularly horrors of bots to configure in this world or make it so that the humans have to interact with another human in order to get the root or to smith whatever sword that you want. These are the types of questions that you have to ask. So if it's a structured form of play, perhaps the structure should be built around bots
Starting point is 00:37:34 in the first place if you can't really defend against them. that additionally, there are sometimes in these games, you're interacting with thoughts and you just don't know it, especially when they're non-complex games. Maybe a good analogy to think about this is, imagine a bot versus a human who played tic-tac-toe. Tic-tac-toe is a solid game. It would be pretty much impossible to distinguish between the two.
Starting point is 00:37:56 Now think about chess. Now think about chat bots. There's a sliding scale to the complexity of tasks and what you can distinguish to be a bod and not. In an exponential age, we're looking at, say, chatbots that passed the turning test. But what does this mean for SkyRum? Say, the local guard, they might eventually get to the point where they are purely human-like. Maybe the types of interactions that you have with them aren't scripted chatbot lines, but really the types of interactions
Starting point is 00:38:21 that you might have with somebody that you build a relationship with. Perhaps the first time you went to a town, the guard is pretty guarded. You know, they give you a little sass and they ask for money. But after you beat the werewolf that has been haranguing the town for years and you put back in their cave, the guard treats you with the love and respect that you deserve, and you go to the local bar and people celebrate you. These are the types of things that over time will get more and more believable worlds with more and more believable entities. The reason why games like Minecraft have added pets or villagers is because these
Starting point is 00:38:51 single-player games like Survival Minecraft is a pretty lonely experience. You want to have real companionship, or at least the feeling of pseudo-companionship throughout your journey. So if you have really believable villagers in Minecraft, that would be a nice ad. Or say, for example, as Navi in Link or like or a creative time. There's literally someone who is with you, the entirety of the game, who kind of tutorialize the experience and gives you the sense of companionship throughout. What happens when that's an AI that can understand all the different mistakes that you
Starting point is 00:39:21 might be making ahead of time or is actually building a relationship with you that's personalized for your experience? There are a million different ways where this can kind of bubble up. The ones that I'm most excited about are with single-player games and then also with multiplayer games where the multiplayer games become personalized where you actually don't play with other humans anymore. You're playing with perfectly catered AIs for the experience that you want.
Starting point is 00:39:45 A good example of this is Call of Duty Mobile where when you onboard, you're playing against really bad thoughts and then they give you an initial win that helps you learn the actual skills like going to being good at this game. And then slowly but surely you introduce more and more humans into the equation to level up the skill cap. This is the type of experience that could be done at scale with games like Fortnite, say, for example,
Starting point is 00:40:07 groups of different AIs that give you a close shootout. These types of close wins are really good factors for predicting the likelihood that you come back in the future. And there's this chart that game makers try to bring you on, which is this low state, which is over time, the dopamine spikes hit, and then they go down, and then they go up a little bit more, and then they go down a little bit more,
Starting point is 00:40:27 and then so on, so forth, all the way through. and they try to cater the game experience for this exact feeling. But in the case of multiplayer games, you can't actually control what other people do. But if you had AI bots that were believable, you could. So those are the kinds of things that I'm looking forward to. Yeah, I'm hearing a couple important things there. One is it does seem unrealistic to try to void games of all bots.
Starting point is 00:40:50 It doesn't seem like that's possible. And so, again, recurating the way that we think about how games are designed is important. And the second part is how do we make. make these bots additive instead of reductive, right? There are the kinds of bots where actually you're describing they can make the game experience better. And then we all know of bots where it just completely destroys the game. It makes it seem like it's not even worth playing because I can never achieve the skill level
Starting point is 00:41:13 of a particular bot as an example. The important distinction there is a bot is something that is made by the game creator and then a hack is made by a game player to bypass the rules. A bot is supposed to improve the game and then a hack is detracting from the game. from everyone else. At the same time, we can't prevent the hats, most likely, either, right? That's going to have to be part of game design as well. Yes. Yes. Well, I mean, to that point, my limited game experience is in the realm of chess, and it reminds me of some of the stuff we've been seeing in chess recently, where we've actually known that bots are better than humans at chess
Starting point is 00:41:48 for decades now since 1997, the famous Deep Blue versus Kasparov game. But for decades since then, people have still willingly and enjoyingly played chess because there was still a sentiment of I can improve a chess relative to other humans. I'm still not facing bots in this sense. More recently, there's obviously been some controversy because there is now this gray area of can we control bots kind of coming into the human realm of chess. But even then, that's kind of skyrocketed chess into the zeitgeist and it's become more interesting or important than ever.
Starting point is 00:42:21 And then you almost see this new spinoff. It's actually existed for a while. but it seemed to gain relevance, which is the Fisher random chess, which basically has the same fundamentals as chess, but the pieces are swapped around. And so it's almost like humans are trying to extend the game so that's still relevant so that they can still kind of test their humanness, their human skill, while avoiding the influence of bots. So I'm interested to see how that might play out in other types of games. As you said, designing games so that they are not so much bot-proof because the bots will play a role,
Starting point is 00:42:55 but that they are still interesting and exciting with the influence of bots. 100%. And what I really appreciate about games like Fisherrandom chess is that it's about understanding chess versus memorizing, which a computer can do really well, and was essentially what Deep Blue was doing at a greater extent. There's just a really glorified surf function.
Starting point is 00:43:14 I'm excited for other games that kind of consider bots as first-class citizens within them, and then additionally, new game mode. that are built specifically for humans, where instead of saying, hey, no box, you can play on bots on these servers, but also these servers and so for humans. And we'll let Darwin isn't decide what is played more often. That's a good way to put it. We'll let the consumers decide. Let's close us out by talking about 2023 in particular, because we're going to see games and the influence of bots change over the coming years, the coming decades. That's going to be really hard to predict. But what specifically in 2023 are you excited about in this space?
Starting point is 00:43:50 I'm excited for foundation models to come out in more domains. So we saw image pretty much be solved last year and then text continually get better and better. This year I'm excited about 3D foundation models, audio foundation models, video foundation models, those sorts of things. And then additionally, I'm excited about how they all come together under one roof. So how do the text models inform the 3D models inform the audio models? In the case of games, every game asset is not really just one 3D asset. It is an asset with sound attached to it, with code and scripts. And each of these informs the other aspects.
Starting point is 00:44:26 Say, for example, changing an old wooden door. It might have a rickety, slow-moving script as well as a sound attached to it. But if I were to change the door to be a metal door, how does that actually change the script? And then also the sound. And then if I were to change the code, how does that actually change the 3D asset and so on and so forth? So all of these things are connected. And right now they've been primarily point solutions. And then additionally, there's this kind of AI unity that is born that really helps creators and game creators kind of have shocked away a lot of the things that are mostly wrote or at least have been solved by experts in their respective fields.
Starting point is 00:45:00 Yeah, I think you co-wrote an article with James Quartzman from our team. And I thought the way you put it was really, really strong in the sense that if we look at how easy it is to publish online for so many creators to tweet or write up with blog post, that has happened in the last couple of things. decades. And if we look back to the 1990s or earlier 2000s, that was not the case. It was not that easy to publish online. And I'm excited for kind of the same renaissance within games. It's kind of crazy actually to imagine that anyone like myself who has very little gaming experience could actually spin up a game potentially in the coming years. I'm sure you've had games in your head that you've wanted to create. But you could it because it's too hard. In the same way that maybe like you wanted to make an image in the past is on or me. And it wasn't really possible
Starting point is 00:45:46 until you saw Mid-Journey or Dolly. This type of capability is right around the corner, and it's a bet on that derivative. AI is moving so quickly that you have to kind of ride the wave and see where it's going to go. So excited about all of this. I think games is coming for a new renaissance. There's going to be tons and tons of games that are made,
Starting point is 00:46:03 but the real skill is in understanding what a good game looks like. Next up, we have Doug. Hey, I'm Doug McCracken from the games team. Here's my big idea for the year. The metaverse goes fashion forward. gamers know that character skins and games like League of Legends and Fortnite are an important form of self-expression as they become part of the player's identity. That's why character skins are big business despite having no gameplay benefits. Digital natives, Gen Z, and Gen Alpha demand that brands enable self-expression into Metaverse.
Starting point is 00:46:35 Of that group, two and five already believed that self-expression be a fashion is more important in the digital world than the physical. And three and four say they will spend money on digital fashion. Brands that lead in, like Gucci, will be rewarded by the hearts and wallets, both physical and digital of consumers. And as physical brands go digital, more digital brands will go physical, creating even stronger competition and broader adoption. Brands that don't go all in will be left behind. Consumers will demand interoperability across experiences in the metaverse. So over time, brands will favor platforms that enable them to wear their Nike shoes in different games and virtual worlds. Gen Z and Gen Alpha move seamlessly between the physical and digital worlds.
Starting point is 00:47:19 The fashion brands that embrace this will win. All right, Doug, so I feel like maybe as someone who's not a super active gamer, Fortnite comes to mind as a great example of where game players are spending money on digital assets that, as you said, don't offer specific gameplay benefits. can you give some other examples for people who may not be familiar of this same dynamic? So first of all, for games, 90% of Gen Z play games, right? And Gen Alpha, it's even higher. I think it's 94% for Gen Alpha. So the idea, the definition of games is very broad at this point. And games are becoming increasingly social. I'm not sure if you've read the book, Status and Culture,
Starting point is 00:47:58 but it's a really interesting look into what drives identity and culture. And really a lot of it is about status. And so, you know, games that have a strong sense of identity tend to be social. Multiplayer games in particular have a very strong sense of identity. So Fortnite has really builds an amazing game that essentially built for partnerships. Like what other game can have DC superheroes, Ariana Grande, and Rick and Morty all in the same world, right? So they've just done an incredible job. But another example is League of Legends. The skins really started as kind of fully imagined alternate versions of the characters in alternate universes, but they've also done partnerships.
Starting point is 00:48:36 They partnered with Louis Vuitton, for example, for their world finals when it was in Paris, and they launched a set of skins. There are other examples as well. So, Minecraft, GTL-9, both have really strong skin ecosystems, especially with the UGC component. So people that are players actually creating skins for those game experiences themselves. And then two other examples are Roblox and Animal Crossing that book is more on virtual items.
Starting point is 00:49:02 I love that you went through so many examples, but I do feel like we have a limited number of examples just by the nature of this being relatively nascent. Based on those examples, are there any trends, any playbooks that you've seen work particularly well? Is it just about partnering with these large brands like Louis Vuitton or artists like Ariana Grande? Is it about introducing some level of scarcity?
Starting point is 00:49:23 Is it the UGC component? Have you seen any kind of indicators of what people have been starting to figure out in the space when it comes to what really works here? Yeah, I mean, to stay more focused on fashion for the moment, I think we're in this experimental phase where, you know, the brands seem to be thinking about their partnerships with the digital worlds more as marketing. And over time, I think that will build into a bigger business for them.
Starting point is 00:49:50 Back on that book, Status and Culture, one of the other hypotheses is that, you know, high-status individuals, and I'll apply this to brands too, tend to deviate from the norm. So it's interesting to see the luxury brands seem to have embraced this trend first with the virtual worlds and the games and really digging into these partnerships. I'll give you an example. So Kim Kardashian wore a Balenciaga dress at New York Fashion Week. And then Fortnite players noticed that there was actually a skin that looked very similar to that dress that was locked. They couldn't see the full skin, but it started a conversation.
Starting point is 00:50:24 And then, of course, Fortnite comes out and reveals that they've done a partnership with Balenciaga and created skins in the game related to this New York Fashion Week event. So I think that's just an interesting way that the brands have been really building awareness with the player bases of these games and excitement with those player bases. And that's one example, but creating scarcity is another example, right? And I think over time, as we look at web3 and other types of technologies that enable ownership and resale, that actually amplifies this idea of scarcity and the value of scarcity. Another example is feature differentiation.
Starting point is 00:50:59 So the digital world enables a lot more feature differentiation than the physical world. I'll give an example from League of Legends. There are different tiers of skins, and this is kind of OG for all the League of Legends players out there, but DJ Sona was the third ultimate skin, which is the highest tier, to be released. And, you know, they charged, I think it was roughly $40 to $50 at the time for that skin. And it was a complete rework of the character. She had three different versions of herself. She was a DJ.
Starting point is 00:51:27 So there was music that was custom, and you could share that music with your team, and then all of the effects were redone. There are many different opportunities to really try new things and add really cool benefits into the experience with fashion or with skins or cosmetics. And so I think there's a lot of experimentation happening. And overall, it's really that we've proven the business in the core games like League of Legends and Fortnite. And what's going to happen for the brands? How are they going to participate in that business over the long term? That remains to be seen. I love that you mentioned that because I think it's very natural for us as humans to just kind of replicate what we have in the physical world in the digital world.
Starting point is 00:52:04 So to give an example that's parallel to your Kim Kardashian example, people have been doing this all the time when they see a celebrity wear a particular outfit. They'll go and figure out what designer created that outfit. But then they'll also find a bunch of different brands who have created similar clothing. And they'll say, hey, if you want to be like Meghan Markle, you know, wear these jeans or wear this dress. And so it looks like we're, again, kind of copying and pasting some of the behaviors from the physical world into digital. But as you said, there's this opportunity to rethink it. It's like, what does fashion look like natively within games? As you're saying, there's all types of signaling, from music to different tiering of specific assets that people are selling.
Starting point is 00:52:45 It just reminds me of Snow Crash from 30 years ago, where when people went to the metaverse in that book, they could show up as a better version of themselves, as in what they physically look like in real life, or they could show up as, like, a duck or something that doesn't even exist in the quote-unquote real world. So I think that's really exciting. But I want to ask you about something that I've been noticing, which is that we spend so much time in this digital world, many of us. I can speak for myself. I probably spend more time realistically in the online world than the offline world. But my share of spend, you could call it, does not represent that. People are still much more likely today to buy a suit in the physical world than buy maybe some sort of equivalent in the digital world. Are you seeing any indicators, any data, any trends that maybe show that this might be changing? Yeah. So it's interesting. The global apparel and footwear business, I've seen stats that say it's roughly a two trillion dollar business, right? And so that's massive. But I think it's hard to compare revenue. The cost structure is completely different. So I don't think digital will fully disrupt physical. We still need to wear
Starting point is 00:53:54 outfits and dress for daily life. But to give a specific example, Fortnite earned over $2 billion for multiple years, right? More than half of their revenue comes from cosmetics. And I also think the UGC component for digital is very different, because if you're able to immediately download something that a creator created, maybe for free, maybe for a small amount of money, you then have brands competing in that environment. And so the pricing structure may be different. And I do see that the features that we were talking about earlier
Starting point is 00:54:27 could play a big role in that, improving the value in the digital world. I would have loved when I was a kid to have those Nike shoes that Marty McFly put on in Back to the Future Part II, and they're self-lacing. That's a basic feature in the metaverse, right? What other cool features can we add?
Starting point is 00:54:45 But maybe just one last point is that with so much content, people will likely change their outfits more, depending on their experiences. And so the consumption of different types of content could be much, much broader than in the physical world. I like that you also mentioned the pricing structure because it reminds me of apps in the early days. For whatever reason, people felt like it was too much to ask for people to spend 99 cents on an app, even though often it provided much more value than maybe its parallel in the physical world. And so it will be interesting to see all of these experiments come to be and see where the pricing really ends up, because this is, as we've
Starting point is 00:55:24 been talking about, a nascent space. And perhaps it is also a generational thing, where as more digitally native people start interacting with these games, maybe they'll have less friction than people who are older, who question this idea of spending on a digital asset within a game without the gameplay benefits. Something I want to get to that you spoke about in your big idea is the need for interoperability. So you said in your big idea that consumers will demand the ability to interoperate their assets between platforms. But I want to push you on this because we don't see that many examples of this today. Most games do kind of operate in their own silos. And I want to ask you, is this due to maybe the technology we've had
Starting point is 00:56:06 up until now? Or is it due to consumers truly just enjoying the quote-unquote best games? And by nature of that, these are the games where the creators of these games have decided to allocate engineering resources to just their games and not necessarily interoperating with other games. Do you have any thoughts on that and how we might progress there? I do. I have two main points here. I think that the first is that there are standards and technical hurdles for implementing interoperability, of course. And then certain games have created their own moats. And a lot of the most popular games now have existed for 10-plus years. And they continue to update and innovate. But what I'm interested in is what are the next generation of companies doing? And there's
Starting point is 00:56:50 a company, for example, Ready Player Me, which is a cross-game avatar company for the metaverse. And they have 6,000 developers that have signed up to leverage their avatar technology. And so what happens when one of those developers becomes the next-generation Roblox or League of Legends? So I think it's the builders of the future who are leveraging this tech. It actually solves an engineering problem for these startups, because they don't have to develop their own avatar tech, right? So it actually helps them solve that and then adds interoperability. And so I think there is tech that's being developed to solve this problem. And there is adoption happening pretty rapidly. And I think the second point really is on the consumer side. If you think back to the Gen Z, Gen Alpha
Starting point is 00:57:38 demographics, kids are growing up today playing games like Roblox, where there's an expectation that you can customize your avatar and you can go into different experiences with that avatar. You come out of the experience, you customize it again, and you go into a different experience. And so I think that that is really creating this environment where kids are going to expect that with the next generation of games and virtual worlds as well.
Starting point is 00:58:01 So we'll see how this all plays out. And I don't think it's black or white, like there's an interoperable future or there's not an interoperable future. I think both can coexist, but I think interoperability will grow over time. Yeah, and we see parallels of that on the web. It's not binary where everything is interoperable or not,
Starting point is 00:58:19 but there are layers of interoperability that do exist. So excited to see how that progresses. But a lot of the trends we've spoken to so far are maybe macro trends, or trends that at least will play out over the next several years, if not decades. What specifically are you excited to see in this space in 2023? With the user-generated content environment, I'm very excited to see more digital-focused or digital-only creators and businesses forming around digital fashion. We have examples like RTFKT, which was sold to Nike.
Starting point is 00:58:51 There's a company called DressX, you know, and there are designers that are creating amazing content. I want to see this grow even more. I'm also interested in how digital content and trends will translate more and more back into the physical world as well. I think that that's going to continue to happen. And like I mentioned, kids move seamlessly between worlds. They're both the real world to them, right? And so how will these influence each other? I think it's going to be really interesting to track that.
Starting point is 00:59:17 And then finally, just continued growth of avatar interoperability across platforms. Next up, we have Sarah. Hi, everyone. I'm Sarah Wang, and I'm a general partner on the A16Z Growth Fund focused on enterprise software. And this is my big idea for 2023. Generative AI advances beyond text to image to complex workflows. We're starting to see the next momentous platform shift in technology. AI is eating software. In 2022, a lot of this took the form of AI-generated images of dogs flying in outer space or
Starting point is 00:59:50 AI avatars reflecting the best version of our selfies. But in 2023 and beyond, the enterprise productivity implications will really start to shine. AI will drive 10 to 100 times performance improvements, showing companies that there is a new way to work, advancing from text to image to more complex workflows, such as text to SQL queries, or eventually text to Excel modeling and more. We're tremendously excited for the next generation of AI-native infrastructure and application companies to emerge, along with the enduring existing companies that incorporate AI in their products. As AI becomes increasingly democratized and underlying models potentially commoditized, applications will need to differentiate on the basis of winning mission-critical workloads,
Starting point is 01:00:33 as they did in the last great platform shift in moving from on-prem to cloud. I'm so glad your prediction touched on this idea of practical applications of some of the generative AI tools that we've seen arise in the last year. I feel like 2022 was so exciting. We got to create avatar versions of ourselves, write up these, like, half-baked screenplays. It was really fun, but I'm excited to see the next step there in terms of how this is applied, maybe on a more fundamental level, in applications that we use every day. And I think one of the reasons that we saw some of these more creative spheres take on AI first is because there's not really a correct answer, right? If you get an avatar, sure, I want it to look like me,
Starting point is 01:01:12 but there's no perfect avatar that I'm looking for. In fact, I love the fact that I don't know what I'm going to get back. But in some other cases, there is perhaps a more correct solution to a problem, and there's a little less of what you might call fault tolerance in the system. And so as we think about maybe workflow applications or enterprise applications, are there any that come to mind with similar potential, maybe a little more tolerance in terms of getting wrong answers, or there not being a specifically correct solution? Yeah, first of all, I love how you put that, fault tolerant. You know, I think that's notably different from other areas where the bar is incredibly high. Autonomous vehicles would be a good example of that. The stakes are, you know, the highest,
Starting point is 01:01:55 right? People die if the product isn't right. And as a result, the process of getting there may take a bit longer. On the other hand, I believe that potential workflow applications of AI are enormous and incredibly near term. And a lot of that is due to this, again, more tolerance for fault. And I think a lot of these workflows are actually oriented around AI assist. So in other words, where AI is making the human more productive, but humans still stay in the loop. And as a result, they can be deployed immediately. I think coding has been pretty salient recently. I've even heard developers say that they can't work without it. That's not to say that developers are using AI and saying, hey, you know, create this app soup to nuts, right? That may be in the future and the very
Starting point is 01:02:40 near future, but today what developers are doing is using it to code complete, right? And figuring out how they can make their process, especially the repetitive parts of their job, just way more efficient and faster. And we always talk about finding new solutions that are 10x-plus better. And that's arrived in that arena. There are so many use cases we could touch on, but ones that we've seen lately have been mind-blowingly good. Video editing with companies like Runway really comes to mind. This has the potential to replace software that's been around for 30 years. Pitch editing and pitch creation. The list really goes on, but I think the key theme here, to your point on immediate value, really centers around how AI is assisting the human and making them more productive and
Starting point is 01:03:25 efficient. And then we'll slowly get to the idea that, hey, we can actually make a perfect pitch deck that reflects exactly what we want. We can code out exactly what we want, and we can edit a video soup to nuts without humans in the loop. But until then, I think the intermediate step has already been pretty groundbreaking. Yeah, I also like the idea of an AI pushing us to be more creative. So one concept that Mark spoke about in an interview we did with VJ recently is Mad Men was a show that was known for always having these really, really creative plots. And one of the reasons behind that is they would sit in a room and they would talk about what are all the different plot lines that people are going to expect. And then let's avoid all of those.
Starting point is 01:04:06 Let's make sure we don't give people what they think is naturally coming next. You could do that same thing with AI. You could ask the AI to come up with 10 different plot lines and then do the same thing. And so again, it's pushing you to think more creatively. But another interesting part of AI that I think is maybe underrated is its ability to transition from not just text to text, but we've talked about text to image. We talked about text to code. And I think something you talked about in your big idea was us getting to those more complex models, like text to SQL. As we do that, how do you see maybe the foundational layers changing, if at all? Because right now we are relying on a few models like GPT-3, which influence Copilot or some
Starting point is 01:04:50 of these other layers. Do you see that changing, where we then have many, many more fine-tuned models for the use cases that people need? Or do you see it all kind of layering back to the few foundational models that we have today, like GPT? Yeah, this is a really fascinating question and one that we've been discussing internally a lot, you know, especially as, to your point, a lot of the focus more recently is on these horizontal large language models. And the sheer capabilities of these models have really been astounding. But to your other point, fine-tuning, or the process of using additional data to further train a pre-trained language model, has a really big impact on how accurate and useful these models are, especially if you think about B2B use cases where a lot of
Starting point is 01:05:35 the truly useful data is actually proprietary enterprise data and not available widely on the internet to train on. So if you put this together, what does it actually look like in the future? Honestly, the real answer is likely that it's use case dependent. I think one of the exciting aspects of the LLMs is that they've really democratized access to large and expensive models. Via their APIs, you don't need tens or even hundreds of millions of dollars to deploy them in your product. And if companies are then further able to fine-tune these models to their particular industry,
Starting point is 01:06:06 there's an argument that specific use cases don't need expensive models, and we're seeing some companies start to make the decision to own the model, which affords them more control, but may also be more cost effective in the long run. And so it's still an open question in my mind. And again, it will likely be use case dependent. Given the capital demands of developing large language models, there is a likelihood that points to a potential, you know, more oligopolistic structure, the way that you have with the big clouds. And not everyone will go down the route of attempting to build a model of that size. But the real question is, which use case needs what, and how much will it cost? And how do you trade that off between control, having in-house talent, and not relying on an independent vendor, versus just being able to move really fast?
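To make the text-to-SQL workflow Sarah describes a bit more concrete, here is a minimal sketch of how an application layer might wrap a hosted language model behind its own workflow. The llm_complete helper, the schema, and the prompt are illustrative assumptions rather than any particular vendor's API or product.

```python
# Sketch: turning a natural-language question into a SQL query with a hosted LLM.
# `llm_complete` is a hypothetical stand-in for whichever completion API the
# product uses; the schema and canned response below are illustrative only.

SCHEMA = """
orders(order_id INT, customer_id INT, total_usd NUMERIC, created_at DATE)
customers(customer_id INT, region TEXT, signup_date DATE)
"""

def llm_complete(prompt: str) -> str:
    # Placeholder: swap in a real call to your LLM provider of choice here.
    return (
        "SELECT region, SUM(total_usd) AS revenue "
        "FROM orders JOIN customers USING (customer_id) "
        "GROUP BY region;"
    )

def question_to_sql(question: str) -> str:
    prompt = (
        "You are a careful analytics assistant. Given this schema:\n"
        f"{SCHEMA}\n"
        "Write a single read-only SQL query answering the question. "
        "Return only SQL.\n"
        f"Question: {question}"
    )
    # Keep a human in the loop: surface the generated query for review before
    # running it, matching the "AI assist" framing discussed above.
    return llm_complete(prompt).strip()

if __name__ == "__main__":
    print(question_to_sql("What was revenue by region last quarter?"))
```

The point of the sketch is that the differentiated work sits in the workflow around the model call, the schema context, the review step, and the feedback loop, not in the call itself, which ties into the differentiation discussion that follows.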
Starting point is 01:06:50 In addition to that, I also think about competition, right? If we think about GPT-3 as an example, there have been several products built off of that. Some that come to mind: Jasper, Lex, these writing tools that, again, are all built on the same foundation. And the question has come up repeatedly, from my experience, of how are they going to differentiate? And of course, there's differentiation through distribution, through product, through UX, but do you have any thoughts there on, if there really are only a few models that so many products are built off of, how do you really differentiate in that environment?
Starting point is 01:07:31 You know, it goes back to, I touched on this a little bit in the big idea, but it really goes back to owning more of the workflows. And, you know, if we weave in the idea of fine-tuning your model, right, how do you collect the data that enables you to do so and makes your product better? I think it goes back to product improvement cycles, if you will. This isn't a perfect parallel, but if you look at the last great platform shift, which is the shift to cloud, and who the big application winners from that shift were, you can look at a Salesforce or a Workday, a ServiceNow, and the competitors that they faced,
Starting point is 01:08:03 right, and all of their direct competitors, not sort of last gen, had also moved to cloud, right? And so the benefit of this platform shift that they were touting was not at all unique to them, but it came down to the ability for them to build out workflows for their end customer that were useful and sticky, and that has really enabled them to dominate their continued mind share in front of customers in the decade-plus since then. And so I think in this next gen of applications built on OpenAI and OpenAI's competitors, the way to really differentiate yourself is to further build and develop workflows that, you know, are not just sort of a one-click button of add AI to XYZ, which, you know, is really neat. It has that novelty wow
Starting point is 01:08:47 factor, but over time it's frankly not sticky and not something that captures attention and differentiation in front of your customers long term, but to really say, hey, we're actually going to develop the entire video editing workflow, or whatever workflow you've decided to tackle. And to that point, Steph, I actually think there's an interesting opportunity for companies that already exist and own these workflows to incorporate AI into their current products. A lot of our own portfolio companies are doing this. And in fact, I would say this is so important that if a company doesn't have an AI strategy, it should really have been thinking about one yesterday. Because owning the workflow, and maybe the data associated with it, where the user is already living in your product
Starting point is 01:09:30 all day long, and then further enhancing that with AI features, just makes a very powerful loop that I think contributes to the point that you mentioned on, hey, how do you really differentiate if you're built on the same infrastructure? Absolutely. I've been thinking about that a lot. This idea where, again, 2022, I feel like, was, maybe this sounds too negative, but a little bit of a gimmicky period as it related to AI, where it's like, look what we can do, and it's in your face. And when a company is using it, they need everyone to know. But upon reflection, there have been so many companies that have incorporated AI or machine learning into their products for years. And it hasn't been front and center. And one company that comes to mind there is Descript, which we use
Starting point is 01:10:07 for editing. And their Overdub feature is using machine learning to train on your voice. And that has existed for quite some time. It's called Overdub. It's not called, like, AI-assisted voice learning system, right? And so it is going to be interesting to see how companies differentiate in the technology they have, but also maybe the marketing, the branding, the positioning of it, whether they choose to put it out in front, or whether it's just, as you said, existing companies almost improving, solidifying their existing features. But to that end, it feels almost like every white-collar professional is going to be reshaped in some way by AI, to differing degrees. But I'm curious to hear from you about 2023 specifically,
Starting point is 01:10:52 because that's what we're talking about with these big ideas. Are there any kind of short-term, near-term companies, products, industries where you think the opportunity lies the most? Yeah, I think there's a lot of talk or attention on some of the things that we mentioned, right? Copy editing, coding, a lot of the consumer use cases. So maybe I'll throw out one that's a little bit less in the news, but that I'm really excited for, and I think it will be a 2023 change. And that's just the potential for AI in the data workflow. So I'm no data scientist, but as an Excel junkie, you know, I know there's a lot of repetitive workflow involved in getting to the insight or the answer that AI can help you wade through. And from the demos that we've
Starting point is 01:11:33 seen, frankly, I think the 10x potential that you see in a lot of these other arenas can be applied to data workflows. And that's incredibly exciting because data-driven decision-making has become the norm. It's the backbone of our data infrastructure investing thesis. But for AI to tackle that as an arena next, I think, is incredibly exciting in a very, very large end market. And again, you know, I think a lot of the companies that own the workflows already will come out with some of these breakthroughs in AI features. They really do change the use case from novelty to something that's actually useful, that will save you hours of your day, that will take you to an outcome that is greater than you would have gotten to had you not used the AI. And I think that's what I'm
Starting point is 01:12:16 most excited for. Next up, we have Sumeet. Hey everyone, this is Sumeet Singh. I'm on the A16Z FinTech team, and this is my big idea for 2023. In 2023, FinTech companies will need to strike a delicate balance between pushing the envelope by building with new technology rails, such as large language models, or LLMs, and maintaining customer trust. While potential use cases within FinTech are still emerging, LLMs like GPT-3 and the upcoming GPT-4 may help businesses train on datasets much more quickly and cheaply.
Starting point is 01:12:48 In addition, they may finally be able to automate data-heavy and manual tasks, such as insurance claims processing or loan origination that have really only been semi-automated in the past. But while LLMs can address some low-hanging fruit, more complex use cases will require reserves of user trust. When dealing with, say, fully automated investment decisions or automatic financial reporting for businesses with complex money flows, companies will need to balance these new services and experiences with potential skepticism from consumers. So something I want to call out here is it seems like robustness to error really
Starting point is 01:13:23 matters here. So to give an example, AI today being applied to finding potential drug candidates sounds like maybe a better option than actually prescribing a drug to a patient definitively. And I say for now, because this may change, but what applications stand out to you, again, today as near-term opportunities that maybe have some of this fault tolerance? I think what we're going to see is the initial tools that have this sort of level of fault tolerance being around customer engagement tools, for example, in fintech, where both banks and fintechs use these sorts of models to basically automate what might have been a CX agent in the past, but today you see a sort of virtual assistant when you log into your bank. And I think there's a level of fault tolerance in there
Starting point is 01:14:08 in that, you know, while it's obviously not great if the customer doesn't get what they want right away, there still are going to be CX agents in the back, right? I don't think these people are going away anytime soon. And so, you know, they might be able to be plugged in should there be an issue, versus something that was just more sort of higher value, right, like automatic underwriting or automated accounting, where things need to be 100% correct. So for automated accounting, for example, if things are incorrect, there could be huge potential implications, like IRS fines.
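As a rough illustration of that fault-tolerance point, here is a small sketch of the human-in-the-loop routing Sumeet describes: the assistant handles low-stakes questions and hands off to a CX agent when the topic is high stakes or confidence is low. The topic list, threshold, and helper functions are hypothetical assumptions, not any bank's or vendor's implementation.

```python
# Sketch of the handoff pattern: answer low-stakes questions with a model,
# route high-stakes or low-confidence cases to a human CX agent.
# All names and thresholds below are illustrative assumptions.

HIGH_STAKES_TOPICS = {"underwriting", "accounting", "tax"}
CONFIDENCE_THRESHOLD = 0.8

def classify(message: str) -> tuple[str, float]:
    """Placeholder intent classifier; a real system might use an LLM here."""
    if "loan" in message.lower():
        return "underwriting", 0.9
    return "account_question", 0.95

def answer_with_model(message: str) -> str:
    """Placeholder for a model-generated reply."""
    return "Your current balance is available under Accounts > Checking."

def route(message: str) -> str:
    topic, confidence = classify(message)
    if topic in HIGH_STAKES_TOPICS or confidence < CONFIDENCE_THRESHOLD:
        # Fall back to the human agents who, as noted above, aren't going away.
        return "Routing you to a support agent."
    return answer_with_model(message)

if __name__ == "__main__":
    print(route("What's my checking balance?"))
    print(route("Can you approve my loan application?"))
```

The design choice here mirrors the conversation: the system tolerates an imperfect first answer precisely because a human backstop stays in the loop for anything that must be 100% correct.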
Starting point is 01:14:47 I like the example of customer service, because something I wanted to ask you about is the transparency that you think is required here, or how founders should maybe be thinking about that. And just to give a couple examples here, there are several applications today that use machine learning within their feature set. And some of them choose to be very, very transparent about it. And some of them don't. An example that I can think of is Descript has had this awesome feature within their application for years, I think. And they call it Overdub. And they're basically using machine learning to train on your voice so that you can replicate it as you need. That's obviously using machine learning, but they're not branding it as such. But I could also see how different circumstances would be the opposite, where you'd want to tout it and say, hey, look, look at this awesome new technology that we're using. And maybe a silly example is
Starting point is 01:15:25 just when you use ChatGPT, you know you're using an LLM. You know that front and center. And so how do you think about that? How transparent should founders be when they're thinking about implementing this technology? I think it's a bit of pontification at this point because it's so early. But I think in some cases, there will be interfaces for consumers to, for example, do their taxes via a ChatGPT-like interface, or for earnings call analysis to be automated, where an equity research analyst can basically use a ChatGPT-like interface to, you know, search based on specifically tuned enterprise data for that company. But I think on sort of more consumer fintech products, or other products that we haven't even thought of, these might just be sort of back-end products or back-end platforms where the consumer is never really seeing the interface, if you will. They're just beneficiaries of that technology.
Starting point is 01:16:16 I'm also curious to know if it's in the back end and maybe they're aware, maybe they're not, do you think that there will be a requirement, for example, to disclose that something is being run through an LLM or how do you see regulation maybe coming into play in this arena? Yeah, no, I think it's a bit too early to tell today, but I'm sure there will be cases where,
Starting point is 01:16:38 or things like tenant screening are being fully automated by LLMs, and say there are consumer issues with that, or it turns out that the models that were being trained upon were using sort of discriminatory data sets. There could be huge implications from folks like the CFPB and other regulators, who could crack down on these types of companies. But again, I think it's just something that founders are going to have to keep in mind, given no one is doing that just yet. Right. So it's kind of a warning, if you are going to use these things, to be aware of the potential regulatory implications.
Starting point is 01:17:13 Yeah, I think it's going to be a fascinating thing to watch because maybe less so in fintech, but in certain areas, knowing that something is being run through a computer instead of a human just changes your relationship with it. I think I saw a tweet recently where someone was using LLMs for therapy. And it kind of, as you'd imagine, got a bunch of blowback because just the sentiment of a human not being there in the process, people were not a fan of, even though in that example, I think actually humans were part of that process. But what else do you think founders should be thinking about here? We've kind of touched on regulation. We've touched on the transparency that might be required.
Starting point is 01:17:48 Is there anything else that you think founders should be really paying attention to as they're thinking about integrating this technology? Yeah, for sure. I think on the product side, these product leaps will likely come from tuning existing large models for specific sorts of commercial use cases with specific data. However, I think the one thing that founders can't forget is that no matter the sort of platform shift, whether it's been internet, cloud, mobile, distribution is always key. And so even with this new sort of product leap, where maybe, you know, you have consumers flocking to the product because you are advertising, say, hey, this automated tax product is
Starting point is 01:18:24 powered by LLMs, and you're gaining a bunch of customers through that, in the event that this sort of product or technology becomes table stakes, well, then we're sort of back at square one, which is distribution is still key. We often say that it's almost more important than product, but hopefully that's not the case, given sort of a lot of the last generation of financial services and fintech products that were more financial engineering than software engineering. And so it'll be interesting to see if these new product leaps can account for distribution moats that folks have had in the past. Yeah. So what I'm hearing is, one, it'll be important for people to train their own data sets to a degree, or kind of fine-tune
Starting point is 01:19:00 these models, because if they don't, they're using essentially the same models that everyone else has. But then even if that's the case, as things advance, there will be kind of this baseline layer that most applications are using. And so from there, as you're saying, distribution really matters, how can you reach the customers that matter the most? Right. And that's a bit of pattern matching from previous sort of technology shifts, right? And so, you know, there's a chance this time is different. However, people wiser than me have always said it's never different, just, you know, maybe just a different flavor. But yeah, I mean, we always think it's different. But no, I think it's something that I've been thinking about a lot:
Starting point is 01:19:34 how do you build a moat, especially if you're using some of the baseline LLMs? Like, if everyone is using GPT-4 when it comes out, then it's all being built on the same foundation. So how do you stand out? How do you build a real moat around that? It'll be interesting to see. You mentioned insurance claims processing and loan origination as two potential avenues of this being applied. Are there other developments that you see specifically coming in 2023 or the near-term future? I think in the near term, it probably won't even be loan origination, for example. I think where we'll see the low-hanging fruit is likely around kind of back-office automation of workflows, if you will. And so, you know, we've been very excited on the fintech team around vertical SaaS businesses that help, for example, SMBs across industries manage their back-office finances.
Starting point is 01:20:21 And right now, the software products that we've seen are really augmenting people, making up for the fact that there are basically fewer back-office people to do this work, and so software can help make each employee even more powerful. I think what we might start seeing is this ability to fully replace that person. And the reality is, in a lot of cases, you're not even replacing anyone. It's just that these people have left the workforce, this sort of typical back-office employee at an SMB. And so, you know, perhaps we'll see models being used by companies of all sorts to automate workflows fully. And I think something that we've, you know, said in the past on our team is
Starting point is 01:21:01 it's better to automate 100% of 5% rather than trying to automate a little bit of the 100% of a workflow. And I think the big question is now like, okay, we may have the potential to automate everything. And what are the implications of that? I think that's going to be really exciting to see. Yeah, I think that'll be really interesting to watch. I mean, I think the canonical example is the ATM and how that replaced one aspect of what bank tellers did. In the arc of history, it seems like people have always managed to reapply their skills to a new, exciting, perhaps more interesting, more fulfilling area. So yeah, that'll be fun to watch. Right. That's the hope. All right, that's all for part one, but look out for part two of our
Starting point is 01:21:39 big ideas for 2023, where we cover FinTech, American Dynamism, and Bio & Health. And if you're looking for the full list of 40-plus ideas, you can head over to a16z.com, and you'll find it on the home page. See you soon. Thanks for listening to the A16Z podcast. If you like this episode, don't forget to subscribe, leave a review, or tell a friend.
Starting point is 01:22:02 We also recently launched on YouTube at YouTube.com slash A16Z underscore video, where you'll find exclusive video content. We'll see you next time. Thank you.
