No Priors: Artificial Intelligence | Technology | Startups - Rivian’s Roadmap to AI Architecture and Autonomy with Founder and CEO RJ Scaringe

Episode Date: February 12, 2026

Autonomous vehicle technology has moved past human-coded rules and into an era of neural networks and custom computer chips. To solve the most difficult driving scenarios, electric vehicle company Rivian abandoned its original technology platform to build a vertically integrated data stack. Sarah Guo sits down with Rivian Founder and CEO RJ Scaringe to explore the seismic shift in the automotive industry toward AI-driven, software-defined vehicles. RJ discusses the move away from function- or domain-based architecture for vehicle electronic systems to software-defined architecture, which allows for dynamic, monthly updates to features in Rivian's vehicles. RJ also talks about the upcoming launch of Rivian's R2 model, which aims to be a distinct, affordable, mass-market alternative to the Tesla Model Y. Plus, RJ shares his vision for a future where vehicles don't just drive us, but inspire personal freedom and exploration.

Sign up for new podcasts every week. Email feedback to show@no-priors.com

Follow us on Twitter: @NoPriorsPod | @Saranormous | @EladGil | @RJScaringe | @Rivian

Chapters:
00:00 – Cold Open
00:35 – RJ Scaringe Introduction
00:58 – Rivian's Autonomy Evolution
05:19 – Why Rivian's Tech is Vertically Integrated
10:06 – Levels of Autonomous Driving Technologies
14:00 – Importance of a Software-Defined Architecture
19:28 – Differentiating Autonomous Vehicle Models
23:20 – R2: The First Mass Market Autonomous Vehicle
25:02 – Do Americans Want EVs?
29:05 – How Our Relationship to Vehicles is Evolving
30:45 – Conclusion

Transcript
Starting point is 00:00:00 By 2030, it'll be inconceivable to buy a car and not expect it to drive itself. Every single one of our cars, we want to have the ability for it to operate at very high levels of autonomy. Radars are extremely cheap. LIDARs are very cheap. But the really expensive part of the system is actually the onboard inference. An order of magnitude more expensive than any of the perception stack. My view is EV adoption in the United States is a reflection of the lack of choice. As consumers, we need lots of choices. We need to have variety.
Starting point is 00:00:25 We self-identify with the thing we drive. The world doesn't need another Model Y. It needs another choice. Hi, listeners. Welcome back to No Priors. Today, I'm here with RJ Scaringe, the founder and CEO of Rivian. We're here to talk about their autonomy strategy, proprietary chips, their upcoming R2 model, whether Americans want EVs, and what our relationship to cars is going to be in the age
Starting point is 00:00:54 of AI. Let's get into it. RJ, thanks so much for doing this. Thank you for having me. So Rivian's already an incredibly cool company. How did you decide was going to become an autonomy company when that is. happen. I mean, from the beginning, we thought of it as a transportation and mobility company. And in fact, even before Rivian became Rivian, when I was thinking about what's the first
Starting point is 00:01:12 products, it was unclear what kind of car would be, but even if it was a car, but it was always clearly wanted to be at the front edge of helping to redefine what does it mean to have access to personal transportation. And so autonomy has always been part of the strategy, but it's not fully coming to life with the technology that we're building. And you think about the function of Rivian, there's transportation, there's almost So the experience, like when, how long did you guys start investing in the autonomy strategy here? Yes, we launched our one in very end of 2021. And we used what I'll broadly characterize like a one dot O approach to autonomy.
Starting point is 00:01:50 So we had a perception platform. We used a third party of front pacing camera that was essentially a third party solution that been plugged into an overall framework that we built, but it was all rules based. So the camera is fed a rules based planner. The planner would then make a bunch of decisions. around the feeds from the perception. And it was, you know, the moment we launched, we knew it was the wrong approach,
Starting point is 00:02:10 but it was the thing we'd started working on well before the launch. And so at the end of 2021, beginning of 2022, we made the decision to completely reset the platform. Was that hard as a decision? No, because it was so clear. We made that, you know, when you're building something like this,
Starting point is 00:02:26 you recognize you're going to spend many, many billions of dollars creating it. So we knew this, like at the core of transportation, is driving. At the core of that is a shift to having the vehicle we kept on driving itself. And so we made the decision to redo it, like clean sheet, no legacy of what we had built in the Gen 1. And that first launched from a hardware point of view in the middle of 2024. So I was with our Gen 2 vehicles. You know, not a single line of shared code, not a single piece of common hardware on the perception or on the compute side. And then we had to build
Starting point is 00:03:02 like the actual data flywheel. So we had to grow the car park to build enough of the data to then start to train the model. And what we showed in our autonomy day late last year, late in 2025, was the beginnings of a series of really like super exciting steps of how this is going to grow and expand. I say this whole time, I think of not just for Rivian, but I'd say for the auto industry in general, the last three years compared to the next three years are going to look very different. So the rate of progress that we saw in autonomy between, let's say, 2020 and 2025 or 2021 and 2025. And what we're going to see between today and let's say 2029, 2030 are there are completely
Starting point is 00:03:40 different slopes. And that really comes back to, you know, entirely new architectures and not being used to develop self-driving actually truly AI architectures where it's before. These were not AI architectures in the, in the true sense. They were, they were using machine vision, but really rules-based environments that we defined as humans, you know, we codified them, which is very different. than Apple today. You might actually have perfect timing here in that I got to be part of investing in sort of the first wave of independent autonomy bets that were working with the OEMs at my last
Starting point is 00:04:11 investing firm. But this is, let's say, eight, 10 years ago. And as you mentioned, there's several architectural revolutions since then. And so for companies to make that shift from, you know, we're going to have these separate perception and planning systems to more end-to-end neural networks. I asked because I felt it was actually quite a hard decision for people of choosing their partners and internally from a technical perspective. Well, I think it, I mean, you can see it. So if you go back to the very beginning of the idea of self-driving, a lot of effort, a lot of spend happened for companies to build these rules-based environments and to build
Starting point is 00:04:50 these more classic systems. And when transformer-based encoding came along, just a couple of years ago, And it shifted very rapidly to, it was clear that the future state was going to be neural net base. It was hard because if you're a company that's built all these systems, it's like, do I keep investing what I had? What do I do with all this work that was built before? And the reality is, a lot of it is, the vast majority of it's going to be pure throwaway because it wasn't like a gradual shift. It was a complete rethink of how things are architected. How did you decide that this was going to be an in-house effort versus a partner?
Starting point is 00:05:26 our effort. That given most people who made cars, said, we're going to go partner or buy something here. I guess the emotional slash philosophical is on things that are really important, we've taken the approach of vertically and engineering them. So electronics, our software, all the high voltage systems in the vehicle, so things like motors, inverters, all the power electronics. These are all things we develop and build and house. And in a few cases, you know, we had to start with something that was either off the shelf or partially off the shelf, but today, all that's completely in-house. And in the case of self-driving,
Starting point is 00:05:58 we knew that long-term it needed to be something that was developed internally. We started, as I said, with a mobilized-centric solution, which a lot of folks did, particularly in that 2015 to 2021 timeframe. But when you really look at what's necessary to be successful in a neural net-based approach, there's a core set of ingredients
Starting point is 00:06:19 that very few people have, and I think we uniquely have them. So first and foremost, you need to have complete control of a perception platform, You have all the, everything that the system is capable of observing, whether that's cameras, radars or LIDARs or some combination of all three, you need to control that, meaning there's no intermediary company that's like processing some of the information. And so that's powerful because you can then feed raw signals into your system.
Starting point is 00:06:45 The system needs to be capable of triggering unique or interesting or noteworthy events that you can then use to train that triggered, you know, those triggered moments need to then be. captured, saved on the vehicle, and then when the time arises where you have Wi-Fi, ideally, send it up. And the reason I say Wi-Fi, these are, this is a lot of data. So you could, of course, do it over LTE, but it's expensive. As you have to have a really robust data architecture in the vehicle, then you need to be able to send it off, off-board, and use that with a lot of training, so a lot of GPUs to train a model. Companies that are either developing independent solutions that are not a car company, they typically don't have access to the type of mileage that we do.
Starting point is 00:07:25 So the huge amount of data that our vehicles generated. If you're developing this from a sensor set point of view, you typically don't have the vehicle architecture and the vehicle car park. So we just came to the view that we have all these ingredients to do it really well. It's like not an optional thing. It's the companies that do this well will exist. The companies that don't do this well, like I feel really strongly. They will not exist.
Starting point is 00:07:49 They will shrink to nothing. The lastentatically approach zero. You think it can only be delivered and really a vertical. vertically integrated. I think there's more than one less than five companies outside of China that have the necessary ingredients to do this. The capital, the GPUs, the car park with enough vehicles generating enough data. I say more than one less than five. And the control of that whole training loop you're just doing.
Starting point is 00:08:14 It's probably like more than one less than three, maybe four. Like there's a very small number of companies that can do this. I think the unique spot we are in time right now is the one. If I asked explicitly then, it's you? It's Tesla. It's Waymo. Is that the three? I would include all three of those, yeah. And there's maybe one or two others in the mix. But I think the challenge is you have to look at not just the moment in time for performance where we are today.
Starting point is 00:08:37 Do you have the ingredients to continue making progress at a very high rate over the next four or five years? And so a lot of the solutions that are more 1.0 based and are sort of stuck in that framework, I think have a truly a 0% chance of progressing to be competitive with an neural net-based approach. And the neural-not-based approach does take a lot of times. You have to build a ton of inference. You have to either buy it or build it. A lot of inference we decided to build it. So we built an in-house ship to do this.
Starting point is 00:09:09 You need to have a car park this large. You just mean enough onboard compute to actually run the models in the car. Yeah. Yeah. In the vehicle. And so you could buy that. Of course, the video makes those. But you need to be able to do that.
Starting point is 00:09:21 at scale and have it in every car. And so we took the decision to make our chip in-house. Is that more a capability decision or a cost decision? It's a cost. We want to have it on everything. So every single one of our cars, we want to have the ability for it to operate a very high-levels of autonomy. And so we design and spec and build the cameras.
Starting point is 00:09:42 Radars are extremely cheap. Lidars are now very, very cheap. But the really expensive part of the system is actually the onboard inference. And so that's like an order a magnet. and more expensive than any of the perception stack. I think people focus on the perception because it's the things we can visualize. Right. But the brain is actually the most expensive part.
Starting point is 00:10:00 And so we brought that in-house as a way to remove cost from the system so that we can easily deploy this on every car. You are taking like a sort of step-by-step approach to levels of autonomy. Yeah. And Rivient, how do you think about how quickly you approach like level four or, you know, the safety case around each of these things, how fast your team goes? was against this. Yeah. I mean, this is, even this question is unique. It's just a few years ago, 20, 2019, 2021, even.
Starting point is 00:10:29 There was, like, very, like, very clearly delineated ways to approach autonomy. There was a level two approach, which was camera heavy, maybe, with a few radars. And then there was a level four approach, which was, of course, had cameras, but had a lot of lighters. It was sort of inconceivable to think of the level two system becoming a level four. And similarly, the level four. force system was way overbuilt to even like conceivably think about putting that on every consumer vehicle.
Starting point is 00:10:56 Well, you didn't want the big word light arm. Yeah. You didn't want all these parts. Yeah. Tens of thousands of dollars of perception. So what's happened is those two worlds just, I think, have just started to very clearly merge where the delineation between a level two, a level three, and a level four in terms of perception and in terms of compute has started to fade.
Starting point is 00:11:16 And it's now essentially just remove, like how capable the system is. at addressing all these corner cases. And, you know, this is what's hard for a consumer to recognize. If you're driving a level two system or a level three system or a level four system, for 99.999, like, they're identical. Right. The difference is like the fifth or sixth or seventh nine on that is these like extreme corner cases. And so I think it's actually led to a lot of confusion where you'll be in a level two system,
Starting point is 00:11:51 be like the car could drive itself and you're like, yes, it can under most of the roads, millions of months, except these very unique corner cases. And so to your point on safety cases, the question then becomes is like how confident are we in the system capability in covering these really obscure, unlikely rare events, which of course, if they're not covered well, it can lead to really, you know, terrible outcome, you know, the vehicle and bad collision. And so that's where the neural net-based approach has just changed things a lot. So the, the, the, the, buildings are so much stronger and the ability now I think for us to deploy on a lot more vehicles have a car park that's very large. So we went from, you know, a few years ago, state of the art
Starting point is 00:12:34 was you'd have a test development fleet of maybe, maybe a few hundred vehicles, maybe, maybe like high hundreds of vehicles to now like thousands and thousands of every single car on the road is part of your data fleet that's identifying these unique core cases and then running them against them to test. And now, of course, we're simulating those unique cases and we can do a lot there. So just the whole nature, it's changed so dramatically that, I mean, I think by 2030, it'll be inconceivable to buy a car and not expect it to drive itself. You know, maybe this sooner, maybe like we hope it's sooner. We're like we're targeting a little sooner than that, but certainly in like a very, very near future, like that will become a must have in a car.
Starting point is 00:13:14 Sort of like it's hard to imagine buying a car today without airbags or buying a car today without air conditioning. These things at a moment of time were optional. I think in not too much time, a couple of years, it'll be hard to concede buying a car that can't drop you at the airport or pick up your kids from school. I would argue that right now, most of the biggest car makers do not have the ingredients that you described to make this a reality. So do you think that that's going to play out in the market where, like, autonomy will be so important as a driving feature, core feature of the car, that there's just going to be big market share shift to those who can figure it out. I know you're biased here, but that was the assumption.
Starting point is 00:13:56 No, no, I think it's a hard question answer. So I think it's, I always characterize like this. I think it's inconceivable for a car company to continue to operate at scale, like mass market. I think very niche, enthusiast realms, sure, but like at scale, without, a software-defined architecture, which is even before you get to autonomy, just like, can you do OTAs? Do you have control of a- Sorry, can you define software-defined architecture? Yeah, that's like, before we get to autonomy, it's like, these are like basics.
Starting point is 00:14:26 So the way car- Like core thesis of Rovier. Yeah, yeah. So the way car electronic systems have been designed and built and have evolved, with the exception of Tesla and Rivian, every car on the road has what is called a domain-based architecture. So you could also call a function-based architecture. So all the functions across the vehicle, let's say chassis control or door system control or A-FAC, your air conditioning system, all have little computers associated with them. Right. What we call ECUs, electronic control units.
Starting point is 00:14:56 And in a modern car, you might have 100 to 150 of these. And each of these run their own little island of software. And that little island of software is written by a supplier, more likely a supplier to the supplier. So you go to a tier one and they hire a tier two who writes the code base to run your HV. This is why it's impossible to debug like a software system. It's also why it's really hard to do an update. So imagine you have 100 different islands of software written by 100 different teams that all have to coordinate. And so if you want a feature, you know, something that manifests as a feature often involves combining functions from different domains.
Starting point is 00:15:31 So a simple one to visualize is when you walk up to your car to get into it, you want it to automatically unlock. You want the HVAC to go to your preset. You want your seats to adjust. You want it to make an audible noise in the outside. You want the lights to do something. You probably want the audio system to do something. Those are all different little ECUs in a traditional car. And the coordination costs in it is really high.
Starting point is 00:15:50 It's very unlikely that a car company will make a change to that sequence because it involves coordinating amongst maybe 10 different players. In contrast, on a approach where you build a zonal architecture, where you have a very small number of computers, ideally one, two, maybe three, depending on the size of the car, that are running one operating system that control everything. It's very easy. That sequence, you could make updates to, you know, in a matter of minutes, maybe an hour,
Starting point is 00:16:18 you can change the whole sequence of what happens. You walk up to the car, issuing over-the-year update, and it's very straightforward. How often does Revean update? We do about one a month, and it's typically, you know, we add a couple of new features. We had refinements to existing features. We're listening to, like, what customers are seeing and asking for. But, you know, every month the car gets, like, notably better. And it's created this really amazing dynamic where customers are excited for the update.
Starting point is 00:16:45 Like, when's the next OTA going to drop? The irony of all this is these domain-based architectures goes back to like, how do we arrive at this? It actually goes back to fuel injection systems. So up until early 1960s, like every car in the road was completely analog. So there's no computers at all in the car. It's 100% analog. And the first computers were there to drive the fuel injection systems. And car companies said this isn't a core competency.
Starting point is 00:17:10 let's push that little computer to run the fuel injection system to a supplier, and the supplier will make that. And this is where you saw things like the Bosch fuel injection systems. Never planned. It's sort of like a field of weeds. Then over the next 60, 70 years, everything that became computer controlled to any degree suddenly starting to have a little ECU, a little computer associated with it. And it just grew into this absolute disastrous mess that is, you know, today the network
Starting point is 00:17:40 architecture that's in truly every car on the road with the exception of two companies. That what I just described is what underpins. We did a large software licensing deal, a $5.8 billion deal with Volkswagen Group, is the second largest car company in the world to essentially leverage our network architecture and ECU topology for all their various brands. And so it's an interesting final point there on your first question, which is what happens to market share. So I think it's inconceivable that car, if to be, to be a business,
Starting point is 00:18:10 be at scale that you don't have a software-defined architecture that allows your features to become better and better, and particularly thinking about how AI starts to integrate into the features. That's number one. Secondly, it's inconceivable to think about a car company existing at scale without the vehicles having very high levels of autonomy. And so car companies have a choice on both of those. They can either accept that they're going to shrink. That's choice one.
Starting point is 00:18:34 Choice two is go build with themselves, which is really hard because they don't typically have these skill sets. they're not software electronics companies in terms of like their organizational DNA or they can find a third party to source it from. And in both cases, there's not great third parties to go to. And in the case of autonomy,
Starting point is 00:18:52 most of the third parties that did emerge over the last 10 to 15 years tend to be very much like classic rules-based, like 80 or autonomous vehicle 1.0 solutions. And those work pretty well for the business construct of selling like a sensor and a function. But that structure is really flawed when you want to have a large data flywheel and it's constantly learning and evolving and you're issuing updates constantly.
Starting point is 00:19:19 It's just, it's really hard to imagine that with an arm's length transaction. And so I think the vertically integrated stacks are going to naturally have some big advantages. So this might be an irrelevant question, but I'm curious. Do you think that the autonomy, like the models at maybe the three, maybe the one, maybe the five companies that come up with this, develop are fundamentally different over time. Because it's been a lot of time in the AI ecosystem and the, let's say, the language-oriented foundation models, like, feel like they're converging at this moment in time. I look at a rivian and I'm like, I don't know, people adventure in that thing.
Starting point is 00:19:56 Do you actually want it to do different things, have different styles or capabilities? Or is it really just, like, as much autonomy as possible safety case? Well, first, this is a great question. I want my car to drive for me also. Like in the LLM world, a lot of it has converged because the training data sets nearly the same. Yeah. So we're taking the breadth of knowledge is contained on the internet, and we're training models off of that. In the case of driving a vehicle, there is no internet of driving data.
Starting point is 00:20:25 And so you need both a robust sensor set to be able to capture the data and you need a car park that has enough vehicles in it. And so, of course, Tesla has the largest car park of vehicles by far. Our approach to this is we have a higher level of capability on our perception stacks. We have better cameras. We have radar. And of course, with R2, we'll have a LIDAR as well. A huge part of that strategy is not only those copper corner cases better, so the cameras have incredible low light and bright light performance.
Starting point is 00:20:55 So the dynamic range of the cameras is stronger. We have more cameras, a lot more megapixels. We have radar, which is great for object detection. and the LIDAR, which is, it's a very powerful tool for training the models. And so imagine 800 feet in front of us, there's a little spec into a camera. It's hard to figure out what that is. And historically what we would do to train that is she would have a LIDAR sitting on the vehicle, like a ground truth fleet to help train your cameras.
Starting point is 00:21:23 Putting that on every single one of our cars turns our entire fleet into this amazing training platform, this data acquisition machine. That was a core part of how we thought about our strategy is we're going to go, you know, not as heavy as, let's say, a Waymo on perception, but heavier than, let's say, Tesla to build a really robust data platform on a vehicle by vehicle basis and then with a car park that's going to grow significantly with the expansion with R2. Yeah, so I think first and foremost is there is no common Internet data. So the data sets that we're going to be picking up, though, are going to be very similar. But you have to go acquire. But there's still different decisions about what data you care about acquiring. Well, I think this is what to like, how does a car feel?
Starting point is 00:22:07 Ultimately, it needs to be safe. And the differences in the way it drives or fuels are going to be more about like what's the UI, the user interface of it. You know, like even we just updated some of our features. We have three settings for how the vehicle drives. Mild, medium, and spicy. Spice is the highest one. Yeah.
Starting point is 00:22:24 And so it's just like a little bit more aggressive. Over time, and we've spent time thinking about this, I think this will start to become part of key decisions. How does the vehicle behave? And there's work we're doing to think about how the vehicle can behave in a way that against a set of heuristics drives like you. So the overall model is trained on how to perform in a safe way, but it actually learns some of your driving preferences and creates a model around you. Of course, in a world where you never drive the car because it's always driving for you.
Starting point is 00:22:57 There's a way for you to set. I'd like it to aggressively change lanes. I'd like it to reside in the right-hand lane. Like those kinds of decisions. And those are less around the tech more on what's the product or the UI, if you're like. Right. The ability to collect those preferences. Yeah, it's preference-based.
Starting point is 00:23:14 And I think we will see that. And that'll be a decision like a Tesla makes that may be different than how a review makes it. It's hard to say today. Can we talk about what the R2 means for like the company and some of the key design decisions here? I was just talking to Jonathan, one of your lead designers, about the constraints and, you know, aiming for more mass market and more volume here. I mean, yeah, you said it. It's a flagship product. So it's average selling prices around $90,000.
Starting point is 00:23:44 It's the best selling. The R1S is the best selling premium electric SUV in the country. So that's electric SUV is over $70,000. And we're the best selling premium S should be electric or not electric in the state of California. So it sells really well. It outsells everything in his class, like a model, Tesla Model X, it also sells like two to one. But because of the price, it's just limiting in terms of how much wine we can achieve
Starting point is 00:24:07 with that platform. And so our two is our first truly mass market product with pricing that's, as we've said, going to start at 45 and allows people that are in that, you know, the average price of new car in their States is $50,000 in that like 45,000 to $55,000 price range, I think to have a really great choice. And to date, there haven't been a lot of great choices there. You know, there's, I'd say there's like sort of singular set of great choices with a model, Model 3, Model Y. And of course, that's shown through extreme market share capture, 50% roughly market share goes up or down. But around that, call it half the EV market is Model 3, Model
Starting point is 00:24:49 Y. So there's just such an untapped opportunity to pull customers out of ICE vehicles out of internal combustion vehicles with a choice that's, you know, has characteristics that are different and unique relative to a Tesla. These are like two substance terms to be rapid fire questions, but they're important for me to ask you. Do Americans want EVs? Like, why haven't they adopted them faster? Yeah, I think to the last question, I think causality is always a hard thing to, you know, really understand. But let's zoom out here. The overall adoption in the United States of EV is around 8%. The vast majority of vehicle buyers are buying vehicles that are under $70,000 with the average sale price of about 50.
Starting point is 00:25:29 And so if you look at the number of vehicle choices you have at a price point that's under $70,000, depending on the year, this, of course, changes year to year. There's well in excess of 300 different vehicle model line choices, putting aside trims and performance packages, but just in terms of like overall vehicle types. And she can buy hatchbacks, minivans, SUVs, you know, two-seaters, convertibles. I mean, there's a whole array of different things you can buy. And in the EV space, I think, and this is, I think there's more than one, less than three great choices. And I'd say Tesla with the Model 3 Model Y is absolutely one of those.
Starting point is 00:26:10 But there's so few choices that if you are looking for a form factor that's not a Tesla. So you think it's just missing product set that people are going to want. Yeah. An extreme lack of choice is how I put it. Like a shocking lack of. choice. And this is what gets into interesting, like corporate psychology. But because of the success of the Model Y in particular, the EB choices that do exist that are outside of Tesla are often very similar to a Model Y. Sure. So if you were to draw like an outline, if you looked at
Starting point is 00:26:42 the side view profile of a lot of its alternatives, and draw a profile and then put it next to model Y. It's almost identical. There's a design sketch over here of basically the Model Y and all its competitors that are copycats. It's like if you want a model Y, it's like if you want a Model Y versus getting... You want something different. Yes, you have all these companies who are trying to create their own version of Model Y. And it's like it's unfortunate because they didn't say, well, what can we do that's unique and different? And so for us, we think the model Y is a great car. I've owned one. Many folks in our team have owned one, but the world doesn't need another Model Y. The world needs
Starting point is 00:27:14 another choice. And so I think this is a reframing of just how we look at transportation. It's such a big space. It's such an area of personal expression that, as consumers, we need lots of choices. We need to have variety. We self-identify with the thing we drive. We just haven't had it. So I think my view is EV adoption in the United States is a reflection of the lack of choice. There's one set of really great choices with the Model 3. I think there need to be many more. And so even looking at our partnership with the Volkswagen Group, a big motivator for that, which ties to our mission, was: can we take our technology platform and allow that to be expressed through a variety of really
Starting point is 00:27:58 interesting and very storied brands in different form factors, different price points, of course, different segments. And I think the more choices we have, the more it's going to lead to broader-based adoption of electric vehicles, which creates, I think, a very positive level of momentum around the space. It's worth noting on that point: when we look at how we develop a car, take R2, we don't think of it as, this is someone who's going to buy an EV, let's make it good. We think of it as, let's make the best possible vehicle
Starting point is 00:28:33 we can imagine. So incredible performance and, you know, great range, great dynamics, tons of storage. And the person buying it will be drawn into electrification because the car is just the best choice they have. And we took that same view with R1. And on R1, for the vast majority of our customers, their first time ever owning an EV is a Rivian, which is really good. If all we were doing is moving customers between one or two brands, it wouldn't be accomplishing much. We have to create new EV customers with products that are so compelling that it just draws people in.
Starting point is 00:29:06 So that leads into my very last question here. I grew up thinking, like, a car is a huge part of my identity. Loved cars, drew them, still think they're pretty cool. And, you know, as they become more like utilitarian services, with the rise of robo-taxis as a concept, you know, serving some of the function your car did before, how do you think our relationship with cars, or vehicles, changes over time? I do think we're going to see a shift.
Starting point is 00:29:35 It's an interesting, like, philosophical question: why are cars such a part of our society, and why do we have this affinity for them in a way that we don't have that feeling for other things in our life that are really important? Like, I don't look at my refrigerator and think I really love that in the same way that I do with a car. And I think part of it is a car enables this personal freedom. It allows you to explore. It's something that you not only ride in, but it becomes part of an expression
Starting point is 00:30:05 itself. And I think that's probably going to continue to some degree, but it is going to evolve. And the way we look at it with our products, and even how we've laid out and contemplated the purpose of the brand, we really look at it through the lens of: the vehicles and the products we make need to enable people to go do the kinds of things, you know, that they would hope to have memories of for years to come. So we often say, the kinds of things you'd want to take photographs of. But more than just enabling it, which is a functional requirement, like, can it drive there, you know, can it fit your stuff, your pets, your gear, your friends,
Starting point is 00:30:40 all of your stuff. More than just enabling it, can it inspire it? And so can the brand, and the way we present what we're building, and the way we make design decisions, inspire you to go do the things you want to remember for years to come? And so there are little, like, design decisions we take that link to that. So a flashlight in the door is an invitation to explore. It's an invitation to go look at things at night. Or the treehouse. Yeah. Exactly. So there are all these little decisions we made throughout the whole car that are just designed to, like, engage that element of inspiring people to go, like, imagine the life they want to have.
Starting point is 00:31:21 Awesome. Thank you so much, RJ. Congrats on the R2 and on the autonomy program. Thank you. Find us on Twitter at NoPriorsPod. Subscribe to our YouTube channel if you want to see our faces. Follow the show on Apple Podcasts, Spotify, or wherever you listen.
Starting point is 00:31:37 That way you get a new episode every week. And sign up for emails or find transcripts for every episode at no-priors.com.
