In The Arena by TechArena - MIPS Balances Performance with Flexibility in a Data-Driven World

Episode Date: October 14, 2024

MIPS CTO Durgesh Srivastava shares insights on AI, data centers, automotive edge, and how MIPS is leveraging RISC-V to drive efficient, flexible computing solutions.

Transcript
[00:00:00] Welcome to the Tech Arena, featuring authentic discussions between tech's leading innovators and our host, Alison Klein. Now, let's step into the arena. Welcome to the arena. My name is Alison Klein, and today I'm really excited because we have Durgesh Srivastava, CTO of MIPS, with us. Welcome back to the program, Durgesh. Thank you so much, Alison. Really glad to be here. Durgesh, this is your second time on the Tech Arena podcast, but the first time in this exciting new role. So tell me about MIPS and what you're driving at the company. Thank you, Alison, again. Yeah, at MIPS,
[00:00:50] we are an accelerated computing company. We understand that workloads have changed and are still changing significantly. What we do is try to strike a balance between fixed-function and programmable, because fixed function runs fast but is hard to program, while fully programmable can do anything but is not that efficient in power and performance. Our solution is flexible, and it can handle diverse workloads, starting from the data center and data plane processing. In the end, our focus remains data movement
Starting point is 00:01:21 and data processing, and we are handling that, and that's my reason for being here, because I was always focused on data plane processing, data center scale-out and scale-up problems, and in automotive, which is becoming a data center of the wheels. So I joined MIPS late part of the March of this year, and it has been very exciting as we are getting unprecedented attention and advancements in these markets. We are also working on various innovations in data center and automotive, as I
Starting point is 00:01:52 mentioned, and given just the technology, the way it is, is creating immense opportunities for innovation and growth. And we are part of it. We have also created a very talented team of starting from CPU architects, performance team, tools team, and then followed by some of the systems architect. They are part of the CTO office. We have a very group of good engineers working and trying to solve these problems that are growing day by day. It's interesting. When you showed up on my radar as being a CTO of MIPS, it took me back because MIPS has been a silicon presence in the industry for decades. But I think that some in the audience might want to understand a little bit more on MIPS.
Starting point is 00:02:35 What sets MIPS apart and where do we find your processors today? You're actually right. At MIPS, we love to call us as a restartup. It has a long history, and we are restarting with the recent launch under the new management. And we just went back to our roots, which is the data processing and data movement. And we call it MIPSiness. And it's a funny word, but it makes perfect sense when I explain to you is that, yes, we do well on the scalar processing, but we optimized our architecture and microarchitecture for anywhere where there is a data plane requirements. integration because it's much easier. And those kinds of things are helping us significantly, a lot of efficiency like performance per watt, performance per dollar, those kinds of things. So we will continue in that area and I'll keep you updated as we go further
Starting point is 00:03:35 in our journey and which we believe that we are doing innovation to our best. Now enter AI. We've talked before about how AI is creating demands on silicon at a pace that's like nothing we've ever seen before. What challenge does this represent to the industry to deliver what is needed from the edge to the cloud and in all of those environments that you described earlier? You are absolutely right. And it's a very good point. AI is driving unprecedented demands for processing power, efficiency, and specialized capabilities. I'm sure you have seen the research work in thermal related to cooling has gone up from liquid cooling, and there's a lot of demand for even immersion cooling now.
Starting point is 00:04:19 So industry is changing. And compute, which we provide as computer architects keeps growing year over year but because the motion as you can see is slowing down whereas the requirements is going 100x and the compute we are providing is 2x. So our approach at MIPS is focus on performance per watt instead of absolute performance that's. And the second is also look at how we can do the programming model for any of the software developers easier. What it means is we could have accelerators, but if we do using the existing architecture for inline programming for tightly coupled and loosely coupled accelerators, that's the one
Starting point is 00:05:02 which we use. And we find that the performance efficiency has gone up significantly. So just to summarize, what we are doing is we take the existing architecture, we look at various usages, use cases, and try to optimize for goals rather than providing a general purpose solution. And that is paying very well that we see our performance efficiency has gone up significantly. We are incorporating AI like in data paint processing. We are also working on some of the accelerators for AI, especially inference, which our customers are asking. So we have a mixture of solutions and then we can definitely walk you through some of the solutions and pipeline.
Starting point is 00:05:50 Now, when you think about the continuum for data center to edge, where do you see along that continuum the opportunity for MIPS to play a role? That's a good point. So a couple of things. First, we have a very strong presence in automotive. We do see automotive as an edge just because of latency and the compute requirements. Most of the compute happens in the car. As industry, we have this term automotive is data center on the wheel. So we definitely have good and very strong solutions.
Starting point is 00:06:18 And on the data center, we are looking at both. We are looking at the data center, which is the scale-out problem where the compute requirement is so much that you have to have multiple racks and even multiple data centers working together. And then leading into the edge. So we are going from data center into the edge. And primarily edge, what we are doing is doing the inference level for the data center AI inference, which again, we make it easier, more focused based on the workloads
Starting point is 00:06:48 rather than a general purpose solution. So we are going from data center to the edge as well, which include the data plate processing and some of the custom instructions which we are adding. Now, you made a very interesting decision to align with RISC-V. What does this mean for the architecture moving ahead? That's a very interesting question, and thank you for asking that.
Starting point is 00:07:12 So MIPS and RISC-V from the instruction set and architecture perspective are very similar. Given the RISC-V ecosystem is growing, so we make a strategic decision to move the ISA to RISC-V. So we are fully RISC-V compatible, and we run all the compliance tests and provide all the reserves as needed by our customers. But what we did was we bring the best of the two worst. So MIPS historically has been doing very well in the data processing like routers, switches, anywhere the data requires to be moved or processed.
Starting point is 00:07:48 So we kept the microarchitecture and we kept building on it. And then ISA, we transitioned to RISC-V. Transition was not that hard just because they're saying that they are closely coupled. So that helps us bring the MIPSiness into the RISC-V world with our microarchitecture, which continues from the historical past as well as going forward with the RISC-V. So we are helping to grow the RISC-V ecosystem. We are taking advantage of RISC-V STEM instructions, again, to do the best performance and performance efficiency. So all those things which are working well for us,
Starting point is 00:08:25 for the industry, so we see it as a win. Now, there's been a lot of talk about RISC-V in the industry. It's hard to assess its advancement in the market overall. I know that it has made deeper inroads in some segments of the market than others. How do you see this today, and where do you think the market is going? That's a good question.
Starting point is 00:08:45 And you're right. It is hard to assess as you can see the news all over. So what we see is RISC-V is experiencing rapid adoption across a wide range of applications, from IoT device to data centers, primarily because it's very versatile and it has the growing appeal, like I mentioned as a customization which is extremely important and useful. So what we see is that the adoption in the microcontroller sensors IoT space is very high and that is the starting point and all the adoption for the higher segments like in the compute data center in the starting point and all the adoption for the higher segments, like in the compute data center, in the client segments and so on and so forth, we are seeing slowly it is going up.
Starting point is 00:09:31 You will start seeing more and more adoption in the wearable segment, and then it will move into the handheld. And then you'll see in parallel, you're seeing a lot of solutions related to AI. So most of the leading AI inference solutions, which you see in the market, are based on this FILE processing. Overall, things are growing. China has a strong momentum. U.S. has a strong momentum. Of course, North America has a strong momentum. So you'll see a lot of solutions which are coming up and things are growing very fast and very rapidly. Now, what can we expect as we look forward? You know, we're entering Q4 of 2024, heading very quickly into 2025.
Starting point is 00:10:19 What does the next runway look like for MIPS in terms of engaging the market? Yeah, thank you for asking that. As settling back on whatever we have talked, we have some interesting announcements coming very soon. Within a month, you will hear the product. So we have a product which is already launched or will be launched soon. And then we have a few other products in Python. So you'll keep hearing a lot of announcements from MIPS on the processors, which are RISC-V
Starting point is 00:10:43 based processors. You will hear things related to the DPU space, which is SmartNIC. And you will also hear some of the things in near future on some of the scale of what we are trying to do for Data Center. Primarily to connect a lot of these AI ML engines, which require a lot of memory. But they require connection each other so that they can share the memory and do the large-language model processing. Here are some of those things we are doing. We will also talk about the announcements
Starting point is 00:11:14 and the partnerships in both the data center and automotive space. Also, we are in the process of publishing some blog on hardware software core design, which is looking very well on the performance efficiency. So there are a lot of things, exciting things coming. And we will be continuing to work and starting this year. And for the next year, you'll see a lot of announcements at various places.
Starting point is 00:11:37 I'm sure that we piqued folks' interest about MIPS. And I'm sure a lot of people in the audience are thinking, oh my gosh, they're entering a new chapter of evolution. I need to find out more. So where can folks find out more about the solutions that you've talked about and engage with your team? That's a good point. And thanks again. Yes, we have our MIPS.com website, which definitely we add all the information, all the announcements. We are also very active on LinkedIn and Twitter. So we encourage people to join, follow MIPS for sure. They can connect with me also on the LinkedIn.
Starting point is 00:12:16 And we are very actively participating in data center and edge-related conferences. So we were present in hardships and we were at the hardware summit. We are also going to be in the RISPY summit, OCP with various presentations and talking things related to data plane and in-memory compute and all those things. So all in all, there are a lot of conferences we are present.
Starting point is 00:12:40 We have our website and all the social media, like LinkedIn, Twitter, we are available. So please feel free to follow us, ask questions, collaborate with us. We'll be really happy to work with anyone in the industry. Durgash, thanks so much for the time. I always learn stuff when I talk to you, and this interview was no different. Thank you.
Starting point is 00:13:01 Thank you, Alison. Thanks again for the time. Thanks for joining The Tech Arena. Subscribe and engage at our website, thetecharena.net. All content is copyright by The Tech Arena.
