No Priors: Artificial Intelligence | Technology | Startups - Introducing 4D Creation Open Beta: NPCs, 4D Worlds, and the Future of Gaming with Roblox CEO Dave Baszucki

Episode Date: February 5, 2026

From “virtual doppelgängers” to “real-time dreaming,” online gaming platform Roblox is using AI technology to build the “Holodeck” envisioned in science fiction decades ago. Sarah Guo and Elad Gil sit down with Roblox CEO Dave Baszucki at Roblox headquarters to explore the intersection of AI, physics simulation, and the future of human connection. Dave discusses the evolution of the 4D creation tool in Roblox, a high-fidelity simulation that enables thousands of people to interact in real-time with photo-realistic graphics and acoustic physics. Dave reveals how Roblox is leveraging 13 billion hours of monthly user data to train native AI models that go beyond simple LLMs, enabling NPCs that can navigate and play games with human-like intuition. He also talks about how immersive communication will change video conferencing, how Roblox searches for unlikely talent outside of traditional elite universities, and how he balances rapid weekly iterations with keeping a “long view” on Roblox’s vision.  Sign up for new podcasts every week. Email feedback to show@no-priors.com Follow us on Twitter: @NoPriorsPod | @Saranormous | @EladGil | @DavidBaszucki | @Roblox  Chapters: 00:00 – Cold Open 00:36 – Dave Baszucki Introduction 01:16 – Realizing Roblox’s 20-Year Vision 05:29 – Using 4D Immersive Simulations in Virtual Interactions 08:22 – Physics Engine vs. Photorealism  11:50 – Storing Roblox History as Vector Data 14:00 – Training NPCs - Moving Beyond LLMs 18:05 – The Future of the Game Designer 19:54 – Video Latent World Models 23:53 – Social Simulation - AI Companions and Virtual Relationships 27:26 – Why Asset Costs Haven’t Changed the Gaming Industry 29:52 – AI Coding in Roblox Studio 31:36 – The Roblox Creator Economy 33:57 – Long-Term Conviction vs. Weekly Iteration 37:50 – Dave’s Hiring Philosophy for Roblox 43:44 – Conclusion

Transcript
Starting point is 00:00:00 What proportion of you think your children will have at least one serious relationship with an AI throughout their lifetime? I think zero right now. I don't think we're close to crossing the human AI barrier, but it is worth thinking. Things are changing every day. We have to incorporate and do that. And then there's also some weird universal truths of things that stick around for 40 years, and you kind of have to blend them together. We feel if we build Roblox right, it might be the kind of thing that's kicking around in 40 years. Hi listeners, welcome back to No Priors. Today, Elad and I are here with Dave Baszucki, the founder and CEO of Roblox, the 3D immersive world where more than 150 million users come every day to hang out. We talk with Dave about the future of AI in gaming, their investments in AI, how NPCs are
Starting point is 00:00:56 going to change and really transform games, Roblox Studio, one of the largest coding platforms out there, and their 20-year mission to build the Holodeck. Welcome, Dave. Dave, thanks so much for doing this with us. It is so great to be here, and thank you for doing this in Roblox headquarters. Yeah, exciting to be here. Yeah, great to have you guys. Thanks. So I think we wanted to start with a big picture question.
Starting point is 00:01:18 I think you all have been really visionary in terms of where all this is heading, and AI is obviously having a big wave and effect on gaming and what's going to happen in the future there, and that's across things like world models, how you think about NPCs and their revolution, how you think about assets within games, how you think about world creation. So we'd just love to hear your views of, like, big picture, 10 years from now. Where are we going? Like, what's coming? Yeah, so I'll two-step on this one. I would say first, I know you both dabble in investing a little. So there is a Roblox business plan PowerPoint deck from almost 20 years ago that we pull out
Starting point is 00:01:54 sometimes, and we're amazed at the fidelity of it. It portends a new category. One could call it the category of human co-experience. It's got the sizes of a bunch of companies from 20 years ago. So you can see social networking companies. You can see YouTube video. You can see toys, and you can see all of that. And what it imagined is really a new category.
Starting point is 00:02:17 Some have called it the Metaverse. Some have called it the Holodeck, which is really the ultimate high-fidelity simulation where people can come together and do stuff. It's like Ready Player One, the movie. Exactly. And so we do have that business plan slide. And what we had always imagined is if this high-fidelity space was backed by physics simulation and reality simulation, you'd be able to do stuff.
Starting point is 00:02:45 You'd be able to build a car, put wheels on it, drive it around. You'd be able to go to a birthday party and blow out the candles. You'd be able to chop trees down and make a house of it. So that was literally the genesis of Roblox. And here we are 20 years later, there's so much more to do. But one vision of AI is how do you superpower the creation of that holodeck? And how do you get it photorealistic as soon as possible? How do you get 10,000 people in it instead of 100?
Starting point is 00:03:18 How do you do acoustic simulation that sounds realistic with 10,000 people? So in that sense, AI is super interesting for just trying to get to a vision that's been around for 20 years and has arguably been in sci-fi, once again, Snow Crash, the Holodeck, forever. There is another interesting vision, and I think it's useful comparing these product things. There's much more of: what's the future of just a single person by themselves, dreaming what they want to dream? And we sometimes call that the real-time dreaming category. It's a category that we can see a bit of the trappings of with short-form video.
Starting point is 00:03:59 There's a little bit of that in short form video. You know, someone at 2 a.m. in their bed, doom scrolling. It's reacting to dwell time. It's reacting to what you favorite. And you're kind of getting a little bit of a pre-built dream. That, I think, we sometimes imagine could go all the way to the famous Tom Cruise movie, Vanilla Sky, where he literally real-time dreamed for several years in an imaginary universe where everyone around him was primarily an NPC.
Starting point is 00:04:29 and he didn't even know he was doing it. So that's more world-building as content. That is exactly right. And so if you put those at two extremes, one is communication platform, high fidelity with people you know, the other is real-time dreaming where everyone's an NPC. Everything in between is possible. And I think we see everything in between ultimately as being possible. and we may see weird product categories we never imagine.
Starting point is 00:05:02 I think people are probably thinking, does the text prompt turn into a voice prompt, and does it automatically turn into a video prompt if there's like, I want to learn French, video, how you're in a French cafe, start talking to an NPC. So I think there's a wide range of products that start to get into this space we're interested in, which is kind of that 3D reality space.
Starting point is 00:05:29 Do you think that use case exists for both consumer and business applications? Because people also talk about this in the context of, instead of doing a Zoom, why don't you go into a shared space with someone and meet there? I think, you know, what degree of your life becomes virtual that way? Yeah, we have seen, I think on the communication side, people just want more and more fidelity. We can go back to paintings on the wall of the cave. People want more fidelity.
Starting point is 00:05:52 We have smoke signals. We have the mail system. We have the Pony Express. We have the telegraph system. We have the phone system. We have arguably the COVID-initiated video. We also did have the COVID-initiated 3D experience on Roblox, actually. And we found side-by-side video.
Starting point is 00:06:12 A lot of young people use Roblox as an early test case of how to co-experience. I think as a technology for multiplayer 4D simulation gets better and more. photorealistic, it's almost going to be like video is the downsampling. And it's like if we were having this 4D immersive communication, we'd say go to legacy analog mode and make it look like Zoom, basically. And we could always do that. But we'd be able to do things we could never do on Zoom. You know, you'd say, hey, let's pop up and go walk around my office. And I want to show you something, come and follow me kind of thing. So I do think the 4D sims, as it gets photorealistic will be a super set of video.
Starting point is 00:06:59 The other thing for business is there's a few things at 40, and I use 40 because it's not just 3D shape, but it's function in a way. It will provide that video does not provide is acoustical things. And there's some real tricky physics problems when you're having a company meeting of 1,000 people. If we try to do it on Zoom, we have like a bunch of squares,
Starting point is 00:07:22 who's live, who's not live, how do we mix the sound, which is very tricky. If we do it in a three dimensional simulation, I can hear everyone with attenuation based on how far they are. And it can actually be a much more natural experience. Like I walk closer to you and I hear you. And that may actually be a better human interface paradigm. There's one final technical thing that's going to be very, very difficult is when people try to sing happy birthday to you together. And because no matter whether you're on Zoom or whether you're in a 3D simulation, you're hearing them like 30 milliseconds later. So what I think we're going to find is people are actually allowing each one of us to sing happy birthday with a time forward extrapolation of each other and then remixing it. So I do,
Starting point is 00:08:16 for the business side, I do think ultimately we're going to see some interesting conferencing solutions. Roblox has this amazing wild story with physics simulation or physics engine as part of the birth
Starting point is 00:08:30 of Roblox. But then today, or for the last decade, I think of this as an immersive world where people have their first coding experiences. They build amazing games. They interact with each other. I don't think of that world
Starting point is 00:08:43 as being super focused on realism. That's right. So is that something that's changing for you guys? Is it enabled now? How should we think about that? The way I would think about it is we're pushing really hard on the physics simulation capability. And at the same time, we have this incredible free market that we can't control on content generation. I would say we just started talking about full acoustic simulation.
Starting point is 00:09:09 I just tweeted about it. So actually, you know, approximate attenuation and all of this. this kind of things and echoes, that's going to start happening. On the physics simulator, it's behind the scenes gotten better and better and better. It'll keep getting better. But what we have found, I would say, is the diversity of developers are beyond our expectation. And I would say some of the things that have gone viral, dressed to impress, for example, grow a garden, are less physical. But they have leveraged cloud capabilities that we've built side by side, you know,
Starting point is 00:09:46 grow a garden was essentially able to build an experience where things keep growing, even when you're not there, by leveraging a lot of the cloud persistence we provided. Dressed to impress was able to do some very interesting things with how they run contests and layer clothing on. So I think we're generally, you know, the specification for our product could be, we just want to simulate the real world with 10,000 people. And the more we go in that direction, the more will deceive creators using it in different ways. How much attention have you been paying to world models and what's been happening in the AI world
Starting point is 00:10:23 in terms of how people are shifting, how they do physics simulation, right? I think we're really watching it. I think that the biggest thing we have focused on, and I think we'll continue to focus on, is how do we get that 10,000 player multiplayer type experience? And how do we fall back to 100 or 10 or 1? we feel it's super robust to build a communication platform. And that's part of our goal of getting to 10% of gaming. The technology for this, I think we're going to start unraveling it over the next five to 10 years,
Starting point is 00:10:59 is what's the most efficient way to synchronize the state of 10,000 people? It's a deep infrastructure problem versus necessarily a physics simulation. How do we synchronize the state? and the memory over the last five hours of 10,000 people. Where have they been? Where have they been in 3D space? What have they been doing? The world model thing is very exciting
Starting point is 00:11:24 because they're storing memory, literally video latency. You know, a minute of frames. It's showing super early promise. I think the technical revolution is going to be, what's the ultimate synchronized state for 10,000 people? Is it in video latency? Is it in much more native 3D format? is it a hybrid new discovery of some 3D video latent space to be determined?
Starting point is 00:11:49 I will share that the format we're using, part of the vision for us, is to ultimately store the history of everything on Roblox. And to store it, not raster like video, but store it vector, which means the ability to play back anything that's ever happened on Roblox. That's 13 billion hours a month. And so the ability... What do you think that does for you, the history of your world? So there's a sci-fi book by Arthur C. Clark called The Light of Other Days,
Starting point is 00:12:23 which talks about the fantasy of what would happen if we had infinite playback in our world? It's like a real social problem. Because if everything we've ever done can be played back, actually in that sci-fi book, it's not just everything could be played back, but you could look at any playback. And so everything is completely transparent. There's no private conversations. It actually delves into that.
Starting point is 00:12:49 I think for us, we would then use it very thoughtfully and privacy compliant and judiciously. But for very simple things, you can imagine, oh, there was a safety incident. Let's go back and put five cameras and listen to the audio and see what happened and make a good judgment call. That's kind of interesting.
Starting point is 00:13:11 There's a really special thing that maybe I did with someone in my family that's really special. Given I have access to my history, go play that back. And possibly because it's 3D vector data, not raster data, reshoot it from it. Make a cool video of the best moment. And I think where it ultimately ends up for us is there's a huge push now for people to find kind of hybrid video. ASDW human interaction data. People are starting to train on video with keyboard stuff. And like, where do I find 200 million hours of this data?
Starting point is 00:13:52 The data we have, which is 13 billion hours a month, can be reproduced from any camera angle and can interact with the 3D space. So it's very powerful data. It leads to a really interesting computer science program idea around how to create great NPCs that are more than just an LLL. Okay, tell us more about that. Actually, that sounds really interesting. Yeah, go check out on XRX feed.
Starting point is 00:14:16 I think the last day or two, we released a bunch of native video demos and maybe we can even play them back here. But what we are starting to train is NPCs based on a combination of all of the data we have in a privacy-compliant way that are starting to navigate and starting to play games.
Starting point is 00:14:36 The challenge, I think, would be challenge level one, everyone has access to an NPC that can get pretty good at playing any Roblox game. So cool. Challenge level two is by watching, if I so opt in in a privacy compliant way, my behavior is the way I act, literally my gestures, the way I look at things, the way I talk. Would we allow you to have your own virtual doppelganger if you wanted? That's fascinating. And then number three would be is if I have a good virtual.
Starting point is 00:15:11 doppelganger, just as agentic is all the buzz in other areas, is there a simple user interface for agentic virtual doppelgangers? And that starts to get interesting because one could imagine sending their virtual self out to go do something. It could be, you know, my son or my daughter wants to play with me, having to do some work. Could my virtual doppelganger fit in for 15 minutes? Sure, That's great. That could be really interesting. And there's a lot of other things, just like we're going to see agentic space in the 2D and the workspace
Starting point is 00:15:49 and the productivity area. I think it's going to be interesting to imagine. So you're basically using these 18 million hours a month to train... 13 billion. 13 billion. Yeah. To train a native NPC model. That will be a Roblox privacy compliant model,
Starting point is 00:16:03 never share the data. And then over time, no ship date, make that customizable. as your own virtual doppelganger, or for creators with the addition of a prompt, you know, upload Benjamin Franklin's history, or it's already embedded in an LLM, and to say, hey, I want Ben Franklin. Yeah. And I want a happy Ben Franklin.
Starting point is 00:16:27 Are there experiences that you imagine creators making with these NPCs that are really different from the games of them to me? I think there's a lot of standard game NPCs. that I think have done such a good job. I wouldn't claim when these types of NPCs would be good enough. But like Grand Theft Auto, right? They've been working on this forever. That is a very rich world with a lot of very high performance,
Starting point is 00:16:54 you know, NPC behavior. Imagining someday many people can create them. I think to make that happen, these platforms have to be very cloud-connected platforms rather than running local to spin up inference on all of these things. But you could imagine that. You could imagine a very young creator saying, I'd like to build an American history thing
Starting point is 00:17:23 where you go and talk to Benjamin Franklin. So I think we would want to enable both of those. It's really interesting because basically the transition, it was going to be a poor analog. But there's a transition that happened in self-driving. We went from mapping out heuristics and sub-behaviors in sort of fine detail to capture edge cases, to just doing end-to-end models for deep learning. And it sounds like you're basically considering doing the same thing for NPCs.
Starting point is 00:17:47 I think that's the exciting computer science capability is rather than all of these wonderful techniques people have used, you know, decision trees and this and this, getting to a purely more general modeling solution would be really fascinating. Yeah, super interesting. What do you think the role of like a row of? like a Roblox game designer or any game designer is five years out from now. You think it's different? I think I'm optimistic that typically industrial revolution or whatever revolution, I think we've chatted about stuff like this before, there's a thought that there won't be any
Starting point is 00:18:27 jobs after this. I think I'm a little bit more optimistic that a new class of things will show up. And I think there's a huge opportunity for more creators. I think, you know, I'm not to the point where I believe, you know, what we call AI slop today is going to take over. Like I think the human touch has got a long runway on it. So you can imagine more diversity or quality of experiences supplemented by AI. But I do think the role will be more leveraged. and the quality coming out of five people will be astounding,
Starting point is 00:19:09 you know, but we'll come to expect that quality. So quality will go up. I think iteration on testing will go up. I think where we see platforms like Roblox going is just as people are starting to spin up agents to develop code. I think people, hopefully on the Roblox platform against the Roblox cloud, you'll be able to go away for 24 hours and your agents are going to be like tweaking and testing,
Starting point is 00:19:40 spinning up a Roblox experience, sending 20 NPCs into them on various client emulators. There's an MPC on a phone. There's an MPC here, tuning the game. So hopefully you'll feel more power in what you're building. I'm also curious. There is a set of research around world models and I'm sure you've seen that is just trying to directly
Starting point is 00:20:01 generate game experiences with like there's no physics engine underneath. Like you don't control the mechanisms. It's just video. What's your view on that? I think it's like a super exciting, interesting thing to think about. I think like I said before, I think one thing is going to be the higher the fidelity is memory stored in video latency or does there need to be another research breakthrough where memory is ultimately stored in some new latent space that's 3D or for dimension. But the fidelity is amazing. I think that's really beautiful. And I think that the way we would see the ultimate architectures of these platforms is probably for a while, not an all in one thing. We would probably see some hyper efficient, thousand person synchronization state
Starting point is 00:20:51 engine. We might also see some of the photorealism not being done server side, but being done in intermediate spaces or client side as well. And we may see multi-step pipelines where there's the raw framework. There's 3D up sampling. Then there's local 2D up sampling. We're actually going, oh my gosh, like people are putting together a bunch of pieces, you know, dedicated NPC capability, dedicated synchronization capability, 3D upsampling, 2D upsampling, and potentially world model stuff.
Starting point is 00:21:25 You know, world model, you know, you can imagine that. that being used, even in its early state today, rather than maybe a gameplay state, where rather than making a video, you talk about the video while you walk around. And you kind of say, put me in a Western world. I want to go left. I want to go right. Add a few horses. And that video actually goes to a multiplayer experience. One could imagine that. I think what a lot of other people are wondering is the short form video, does someone hit it just like they did with short form video into a highly retentive, like, local dreaming product that can just react to what you're doing.
Starting point is 00:22:05 I think it's still early for both of those, but it's a big possibility. I would say that we are doubled down on a hybrid multi-AI tech stack that supports multiplayer natively. Yeah. I think one of the ideas that's, like, internationally very interesting has been the idea that you can generate, like, serial microdramas as a form that is. is not yet that popular in the West, but seems like a good format for the technology.
Starting point is 00:22:32 Microdramas, I've watched a few. I mean, it's really interesting. You know, it's fun to see, like, some entrepreneurs tried to do that. I won't mention who, like five or ten years ago and then they naturally emerged kind of thing. It's interesting to think if those microdramas are easier to produce.
Starting point is 00:22:49 They don't need live actors. But, you know, what is the plot line that really resonates? You know, we'll see with that. And you can mass test that at scale if you're using AI, right? You can actually have, you know, a very large number of iterations. This is kind of one of the arguments people make around the Fermi paradox of like, why haven't we seen alien life? And the idea is it's just, there's alien life.
Starting point is 00:23:09 They're just stuck in these virtual worlds now where, you know, it's funner than the real world. Like, what's the famous quote? Where is everyone? Yeah, exactly. So one option is there in some virtual landscape that's more compelling than the real world. And so therefore, there's no reason to travel the universe. Or it can be, you know, We could really start conjecturing what's the future evolution.
Starting point is 00:23:31 Is it human or machine life and all of that? Yeah. And once it gets to maybe human machine life, what happens? Yeah, that's a... Yeah. Interesting. Yeah. That's the Vanellist guy.
Starting point is 00:23:43 Yeah. Exactly. Or the optimistic thing is more the holodeck thing. Like everyone's hanging out with other people rather than a simulation of themselves. Yeah. Are there experiences that you imagine, from a like a communication or a social perspective that you think are going to be just different or new or that you're going to have real insights into from these really great MPCs, right?
Starting point is 00:24:08 Because I'm sure you've seen what has happened with the rise of AI companions and how engaging they are given they're just text into your faces. There's some fun things that, like there's a black mirror dating episode. I don't know if you've seen it where they have a lot of virtual doppelganger power. And obviously, I'm not, Roblox is not doing dating. We're just talking about this right now, you know, blah, blah, blah, 18 plus ID verified, all of that. So let's not go crazy.
Starting point is 00:24:36 But let's think about a future in the Black Mirror episode where you literally can spawn the life of a thousand virtual doppelgangers. That thousand virtual doppelgangers, you're on your new dating app in three minutes. You know, someone else's virtual doppelgangers live a thousand virtual doppelgangers live a thousand virtual lives with your virtual doppelganger? And then they say, what's the success rate? And when the like the thousand virtual doppelganger say, 98% of the time we had a good life, then you say, oh, we should go on a date or something.
Starting point is 00:25:09 So there's, I think. Social simulation. There's a lot of things we, I think, are going to be really hard to project. I think one of the things we would focus on is the more we can just build raw, high-performance infrastructure, have AI as a service, have all of these things capable in a cloud, have the same thing run on a phone as a computer, have it auto-translate. Hopefully, we would discover some of these things. Yeah, I'm really optimistic about that, given the engagement that people see with these companions as just a text interface with like the beginning of memory,
Starting point is 00:25:45 right? Versus if your Roblox, you have embodiment, you have, as you said, cloud persistence. And I think it's just going to make the characters much more interesting very quickly. I'm thinking through how we would embody memory for NPCs, which is more than just a prompt. One could imagine any NPC on Roblox because we have recording of the whole thing, they actually have access to everything they've ever said or done in a very lean format. So they can go back through that, retrain on that. So that hopefully would be a native part of the platform. But I do know, yeah, there's a lot of interesting engagement with these types of companions
Starting point is 00:26:24 and imagining fully 3D versions will be like pretty fascinating. Yeah, friend of mine basically asks at dinner parties, he runs a well-known AI company and it's sort of controversial question for the group as often, what proportion of you think your children will have at least one serious relationship with an AI, like throughout their lifetime? I think zero right now. I do. I don't think we're close to crossing the,
Starting point is 00:26:49 human AI barrier. But it is worth thinking... He meant it even in the context of like, even if it's superficial and there isn't true intelligence behind it, just given the behavior with text, once you start having video, virtual worlds. I can imagine a real-time coach therapist, like in your earbud, just hanging out all the time.
Starting point is 00:27:07 Yeah. So I think for that kind of less evasive or over-your-shoulder coaching, I think that seems like... And that's already a major use case, it seems, for some of the... Like JATGPT and other platforms, where mental health advice or coaching or other things like that are actually one of the major use cases. Yeah, yeah. I think that will be arguably very valuable.
Starting point is 00:27:27 Can I ask an industry perspective question just given you've worked in gaming for a very long time now? Why do you think there's a lot of discussion about how assets of different types are cheaper to create with generative tools? I don't think that's really changed the dynamics of the gaming industry yet. I don't know if you would agree with that claim or explain it. One could say, no matter how cheap it is to create assets, the expectation of quality from consumers goes up at exactly the same velocity. Okay. And so, you know, what was not possible a while ago, the expectation of consumers will go up. I think there's a general notion that the technology for the gaming industry has to move to cloud vertically integrated.
Starting point is 00:28:18 And asset management can't just ship on a DVD or be a giant download. I think we're really seeing we've launched dynamic LOD for textures, meshes, is on the way. So you can imagine every single asset in the game, sound, 3D object, image can exist in various formats. Traditional format is it's like a texture or a mesh or an audio file. Future format could be it's an AI prompt and it's generated procedurally on demand. So there's these two things. They're both very AI enabled. One can then imagine that you have to be cloud connected because for all of these assets,
Starting point is 00:29:04 you know, a full range of LOD, just like with video streaming, upsampling. On low-end devices, you need low LOD, high-end devices, you need 4K LOD. and ultimately on-demand AI generation, that then gets into, once again, one of the demos is probably on 3D upsampling. The notion that if someone makes a very primitive experience on Roblox, they would augment it with a prompt, and they'd just say,
Starting point is 00:29:33 I know it's kind of primitive, but make it kind of look medieval and more realistic, and have all of those assets auto-upsample to in 3D in the class, for free on demand. That kind of just in time thing is some of the stuff we're thinking about how AI can accelerate if you have a cloud platform. Rovlox is also from the studio perspective,
Starting point is 00:29:55 one of the biggest development environments in the world. That's right. What are you seeing from the perspective of AI coding there? Yeah, so this is great. So once again, industry standard ways of using a different model with Studio. So if you're in cloud code, you can control Studio. is super exciting. And so a lot of our users have started gluing these things together with the
Starting point is 00:30:20 hope ultimately, you know, they can build stuff from there. Roblox Studio itself is running a native assistant as well and a native code engine. And I think the pattern for that is Roblox Studio will more and more for code follow many of those industry best practices, but in parallel needs to have kind of an environmental generation kind of component that's very unique to that. And I think the future of environmental generation is, once again, any prompt you can imagine, text, image, video, possibly walking around in a world model to define a world, iterate on that, iterate on that to a primitive 3D skeleton, do I like it, and then iterating on that to a fully functional game.
Starting point is 00:31:08 So as that, I think that's, our vision is to run those side by side. So first time studio user gets it out of the box. At the same time, we like plug into all of your AI workflow. I think the final thing is because it's so cloud connected, that thing we talked about earlier, spinning up jobs, spinning up agents, running test plans, kind of keep grinding away, trying something different, is like you have all of that cloud capability. Just had a curiosity.
Starting point is 00:31:38 the cycle of development for a successful Roblox game? What we have seen is over the years bigger teams so that the top Roblox creators now are well into the tens and tens and 30s plus million a year. It's really serious stuff. We have seen in the Roblox creator community, I think, a healthy sign that creator number 1,000 and what is their average yearly revenue. It's going up faster than creator number one. So that's a bit of a healthy long tail. And I would say all the way out to creator 1000, there's such a huge community of people making a living. What I think our creators have found is it's gotten both more healthy but more competitive. And I think we've seen the pattern now
Starting point is 00:32:30 of much more live ops in that because this is cloud, you can update your experience just like good websites every day or every week. And that is just kind of a constant practice. So I say update frequency much more rapid. Which is great for users because they get updated content at a faster rate. That's right. Users love updates. They love new things.
Starting point is 00:32:54 It keeps games fresh. So I think the other thing we have seen, on the discovery side, we've tried to and more and more taken the approach we want discovery to be completely transparent. These are all the things that are going into the discovery algorithm. We check it. There's a benefit to that transparency and discovery and that it actually keeps a lot of pressure on us. You know, oh my gosh, someone's going to game discovery. Well, then we better make it really good, right?
Starting point is 00:33:29 And transparent. I actually think transparent discovery and recommendations is an interesting trend for the industry worth talking about. So with that, we have seen also the kind of the content distribution get richer and a little bit less spiky. There's now like the top 20 experiences on Roblox are all really good and all kind of vying for the number one slot, which maybe wasn't so true three or four years ago. Zooming all the way out for a second, I think it was true five or ten years ago when we met. It was true when I was talking to one of the members of your leadership team today, they will say you're
Starting point is 00:34:08 a extremely high conviction, very long-term oriented leader, right? So the 20-year plan for Roblox and we're going to do things the way we have a vision toward. You're obviously also all in on AI, right? Very excited about the technologies and investing in them. How do you square these things where you have a very long-term view when so much is happening in the environment all the time? I think we always want to pair, take the long view, which is one of our four big values, with get stuff done. And so it's almost like in those old gardener charts, you know, take the long view, not take the long view. Yeah. Get stuff done,
Starting point is 00:34:45 not take the stuff. Like the magic quadrant is take the long view, get stuff done. We typically mean inside the company rapid iteration. And so if someone has a view of some, exciting new product and we're going to ship it in six months, that can be a very scary thing unless it can be broken down. We're going to ship this thing every week and iterate towards this, but we have a good target of where we're going. And so I think inside the company, we, I'd say our AI team, our facial age estimation team, our safety team. These are literally things working on a weekly basis, which allows a fairly fast reaction time. AI, I think, is arguably as far as speed right now, just every day. But you mean, you make it sound pretty simple in that,
Starting point is 00:35:39 like all the things that you're trying to get done on a week or a six-month time scale, they're all aligned toward a vision that isn't really effective. If you have your six to 12-month vision, hopefully that's moving not too fast. But at the same time, the week's, weekly iterations are moving very quickly in that direction. And you could say the Roblox vision is ultimately to build the holodeck. And it's 10,000 people, it's multiplayer, real-time modification of the environment. It's NPCs, photorealistic. That's a pretty stable spec for 20 years.
Starting point is 00:36:17 And so then... I actually feel like that's the biggest takeaway for me here. I spent a lot of time with software CEOs, right? And I think there's a lot of actual concern about, like, is the vision we had for the company in terms of its durability or its value still as valuable as we thought five years ago? And for Roblox, perhaps because you were, you know, pointing at something so far away to begin with, it's like pretty clear, you know. I think we have crisped that up over the years. I would say two years ago, we were a little less crisp. We would say, we're going for a billion DAUs.
Starting point is 00:36:52 That's a good spec with this spec I just mentioned. It's a little harder to operationalize around product management teams and that. So we took a stepping stone and said we want to get to 10% of global gaming content on the way to that. It's actually a remarkably good step. It's about 300 million DAUs. It's about 20 billion. It's about 3x where we are. It can be forensically torn apart.
Starting point is 00:37:19 We can see what part is the USA market. We actually have a higher target for U.S. than 10%. And so that has been very clarifying. I think for a software CEO, always knowing both 3x and 10x is very helpful. Because if you don't know 10x, then it's hard to sleep at night. If you don't know 3X, it's hard to plan forensically how we're going to operationalize that, how quickly we're going to get there. One of the thing that I think that Robox has done uniquely well is how you've thought about hiring
Starting point is 00:37:55 and how you've rethought certain aspects of that. And I think that's really led to people who have more bias to action who are moving quickly, who are iterating rapidly. Can you talk a little about how you've rethought some aspects of that if you're willing to share that as part of it? Yeah. I mean, one is we've been very public that we really want values-aligned people that are good problem-solvers that are very creative. And it's hard to do signal to noise
Starting point is 00:38:24 and that kind of a hiring thing. So we actually bought a company about five years ago that was doing their best scientifically to build 3D assessment tooling for some big companies to really find people who are problem solvers and creative.
Starting point is 00:38:41 That's part of what I was referring to. I was fascinated at that acquisition. So we bought Embelis. We got a great team of scientists people that had been involved with the SAT test and all of that. And we've tuned that to our new college grad intern program so far. We've essentially, I believe, in a very fair meritocracy kind of way, taking the whatever it is, 10, 50, 60,000 people run them all through what we've,
Starting point is 00:39:11 I think, shown to be somewhat of a very fair, non-weighted by any other social factor assessment. and the people that come out of there, we have an incredible confidence that they're problem solvers. They do a bunch of other things really well. So we kind of have this building wave inside the company of three years ago, two years ago, one year ago of these 100, 200, 300, 400 or 400 people that hopefully we're like mentoring some of the best future leaders in the industry. Yeah, it's super interesting because I have a number of friends who always talk about how you discover untapped sources of talent or how do you find the best. people in the world and are there ways to test for that or sort of screen for that?
Starting point is 00:39:51 We've done a lot of testing on this fairness testing, looking backwards on it. I'm very confident in where it's going to go for us. Super interesting. Yeah, it's like a very systems oriented approach. The one thing we have found, which is a little edgy, might be the correlation between traditional universities and our own testing. And we hire... Sorry, what's edgy about that?
Starting point is 00:40:18 Well, I think people typically think, like, I went to Stanford. That's where all the top talent is we have found that not necessarily to be true. And we have found community college, you know, the small Midwestern engineering school. Like, because we're assessing our own way, we kind of, we basically ignore the signal of where you went to university. I've heard that anecdotally from other CEOs now, particularly for certain cohorts like the cohort that came during COVID for a lot of the top universities is perceived as like a weaker cohort. And I don't know the reason or the interpretation behind it.
Starting point is 00:40:52 But it's interesting to see this shift in terms of how people even think about where are some of the best talents. So it's interesting that you're kind of seeing that. So we get to be to have self-determination. We get to say colleges, you do what you want. All of that. We're just going to do it in a fair way that we think is our way. And it's worked out.
Starting point is 00:41:10 Can you give just, because it's so interesting, like any intuition for, some way you test for problem solving capability? I think we make public some of the tests that we use. And the cool thing is they're all built on Roblox. They're all tested by lots of people. And they're arguably really interesting three-dimensional problems like programming a factory or like you're using some new geometric language to program a robot. You know, it's almost like Turtle graphics. And you can play with that and see how people are. We're going to try to get jobs. Yeah. I would be, I'm actually might be a little afraid to take the test myself, given how good some of our candidates are. You know, as a long-term predictor of the future,
Starting point is 00:42:03 is there something happening in AI that you believe or hold with conviction like five years out? Do you think not everybody believes yet? That I believe that not. I think typically it's the other way. More people believe more stuff than I believe. You think of yourself as a skeptic? Well, I think of myself as... He's building the holodeck, but he's a skeptic. I think I'm more like the power of time and the power of compounding rather than like an amazing invention and just rapid iteration.
Starting point is 00:42:32 One thing I constantly am amazed by is either the graphical user interface or Microsoft Excel. I think I was playing with Microsoft Excel in like the, 80s and here we are 40 years later. It's not that different. Yeah. So it's kind of like this interesting complement in AI world. Things are changing every day. We have to incorporate and do that. And then there's also some weird universal truths of things that stick around for 40 years and you kind of have to blend them together. So I think that's how we, I think the good news is we feel if we build Roblox right, it might be the kind of thing that's kicking around in four years. Awesome. Thanks so much, Dave. Thank you for joining us. Okay. Cool. That was really good.
Starting point is 00:43:23 That was really good. Yeah. Find us on Twitter at No Pryor's Pod. Subscribe to our YouTube channel if you want to see our faces. Follow the show on Apple Podcasts, Spotify, or wherever you listen. That way you get a new episode every week. And sign up for emails or find transcripts for every episode at no dash priors.com. Thank you.
