In The Arena by TechArena - Clearing AI’s Costly Bottlenecks with Cornelis Networks
Episode Date: October 8, 2025
CEO Lisa Spelman explains how tackling hidden inefficiencies in AI infrastructure can drive enterprise adoption, boost performance, and spark a new wave of innovation....
Transcript
Welcome to Tech Arena, featuring authentic discussions between tech's leading innovators and our host, Allison Klein.
Now, let's step into the arena.
Welcome to In the Arena. My name is Allison Klein. I am so excited for today's interview. We have Lisa Spelman, CEO of Cornelis Networks, back in the studio with us. Welcome, Lisa.
Hi, Allison. It's great to be here.
So you just celebrated your one-year anniversary at Cornelis, have been in the CEO seat, and won incredible accolades for the company as a rising star in the semiconductor space. Tell me a little bit about Cornelis and what the company has accomplished over the last year.
Thank you. I'd love to. Yeah, it's so funny because
a year can feel like it passes in the blink of an eye. And also it feels like we've been doing this,
and accomplished five or 10 years of work within one year.
So, Cornelis Networks is a company that is laser-focused, driven, committed, maniacal about solving this efficiency problem that is plaguing AI and HPC mega systems, through the network.
And so we design silicon, we build systems, we do amazing software, and we deliver that
end-to-end back-end network, so the SuperNIC card and the switch, that has the unique and differentiated features our customers need in order to improve their GPU utilization, improve their compute utilization, and fundamentally get more out of the hardware they're buying and the compute power they're investing in so heavily, and get more out of the power budget that they have in their rack and at the wall. And so really, we're just trying to help them get more.
Now, we're heading into AI Infra Summit next week, where you are going to have a keynote
about AI infrastructure, and I want to get into that with you. But before we do, how do you see
AI advancements shaping up in the next two years? And what do enterprises need to do now
to prepare for that broad integration that everyone is expecting into workloads?
Yeah, it's interesting. There was a study that came out recently that said that
some insanely high amount, maybe it was over 90%, of enterprise AI efforts and projects are struggling to achieve ROI. And I think we are on the cusp of that
being a complete flip, truly a 180. So admittedly, there definitely is work to prepare to get
ready to ensure that your organization is set to benefit from advancement in AI. But the time is
now. And I think that enterprises are really sorting themselves into kind of two separate
buckets: those that are either building themselves as AI-native or committed to making that pivot to AI-native, and then there's some that are still fighting it a little bit. And I think that those that are down the path of embracing that AI-native mindset are going to be the ones that have the opportunity to unlock more and just solve bigger problems more quickly
with more engaged and excited employees and customers. There is real work to do there, though.
And AI is such an interesting technical discussion. Obviously, I spend all of my day focused on how packets move around the network. So we call it the guts, like really getting into the guts of the system to make sure that there are no bottlenecks, no congestion, and latency is minimized to the lowest extent, all of those fundamentals. But enterprise AI is as much an infrastructure endeavor as it is a cultural endeavor. And so I encourage anyone that's building an AI-based company or transforming a company to pay as much attention to culture as they pay to the technical aspects. And I'm living and breathing, doing the same at Cornelis. We are growing ourselves as an AI-native company. We're
incredibly heavy users. We have a lot of areas where we've used it to accelerate our progress,
but we're spending a lot of time making sure that our team members, our people, are prepared for
this journey and feel really comfortable and engaged in the progress we're making.
Now, your heritage in the industry gives you a really good, unique insight into how these
macro transitions occur and how we pass through that chasm of disillusionment into broad adoption.
I think that we both saw the same thing with computing in the early days, folks saying maybe it's not delivering the value that was promised, and now we understand that it's really transformed every aspect of business. Why are you confident that this will follow suit? And what are you seeing from those AI-native customers that gives you hope?
I just think there's so many similarities in every wave of a transformation like this.
And the cloud is a great example of this.
And yes, I've had the opportunity to sit deeply in the compute aspect of the overall system infrastructure
and now in the network space.
And obviously, those two are so interrelated and have so much reliance on each other.
But when we faced cloud, as you mentioned, and saw this wave of transformation coming,
there was, again, a group of early adopters that committed to cloud-first, not just infrastructure, Allison, but business models.
And that was the rise of not just the infrastructure-as-a-service CSPs, but everything as a service.
So whether that was SaaS with a B-to-B software type of offering or whether that was
the ways in which so many consumer experiences have been completely transformed in ways
that didn't even seem possible.
And funny enough, I was actually in an IT department running infrastructure and operations
and engineering at the time of the beginning of the cloud.
And there was so much talk inside of that large IT organization around whether or not this
would become real and what would be the killer app that would drive you to invest in this space.
And what we ended up doing was we said, there's no singular killer app.
We're just pulling together all of our use cases that could benefit from the improved
economics of a cloud-based infrastructure.
And we're putting them together as one, I guess, mega use case in order to drive the investment.
Then as soon as you build it, more will come.
And this AI thing, I think, just shares a lot of the similarities. So obviously a lot of companies have started their foray in the space of software and software coding. It helps tremendously to have very experienced software developers that can
guide and shape, especially early versions of AI. And that's improved tremendously. So now you can
vibe code pretty much anything you want. But when you think of really, truly enterprise-class software, you still want that experience there to help drive it and put it forward.
Once you're building businesses off of software that's enabling AI,
there's no reason that you don't start to extend that and see potential in other areas.
It's actually impossible to become a user of AI in the space and not have your mind opened up to what other
opportunities exist there for your business.
And again, when I look inside Cornelis, it's not
just about us solving this efficiency challenge for the industry and for our customers.
It's about how we are a better, more customer-oriented company ourselves, how we put out
products faster, how we ensure that we are delivering the most important software to our
customers as quickly as possible. I'll give a small example. We recently launched a generation
of our product. It's called the CN5000. It's the 400-gig end-to-end network. And in bringing it to market, we have done work to meet all of our customers' OS support
obligations. So we have a standard set of operating systems that we support. But it's not uncommon
for customers to come back and say, hey, for this particular deal, I need RHEL 9.6, or I need 9.4,
whatever it is. I think back to prior generations. That would be an entire process. And there would be
months in between the requests coming in, our analysis of whether we can do it, getting a team
together, making sure it can all happen. Now with, again, some of our AI infrastructure and the
tools that we built and created, we're able to assess the deltas between OSs in an incredibly
quick and efficient manner. And we're able to target any of our upgrade work super quickly.
So it's like you have example after example of the ways you can increase the pace of your
entire business by the fact that you're utilizing tools and people together.
I think it's incredibly powerful.
And it's a really small example, but we're replicating that across pretty much everything
that we do, whether it's marketing, whether it's design, whether it's in the software domain or validation, a huge area for us.
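For a concrete picture of what that kind of OS delta assessment can look like, here is a minimal sketch. It is illustrative only, not Cornelis's actual tooling, and it assumes you have plain-text package manifests exported from two RHEL releases (for example with rpm -qa):

# Minimal sketch: compare package manifests from two OS releases to scope
# validation work. Assumes files produced with:
#   rpm -qa --qf '%{NAME} %{VERSION}-%{RELEASE}\n' > rhel-9.x-packages.txt
# Illustrative only; not any vendor's actual tooling.
from pathlib import Path

def load_manifest(path: str) -> dict[str, str]:
    """Read 'name version' lines into a {package: version} map."""
    packages = {}
    for line in Path(path).read_text().splitlines():
        if line.strip():
            name, _, version = line.partition(" ")
            packages[name] = version.strip()
    return packages

def diff_manifests(old_path: str, new_path: str) -> None:
    """Report packages added, removed, or changed between two releases."""
    old, new = load_manifest(old_path), load_manifest(new_path)
    added = sorted(set(new) - set(old))
    removed = sorted(set(old) - set(new))
    changed = sorted(p for p in set(old) & set(new) if old[p] != new[p])
    print(f"added: {len(added)}, removed: {len(removed)}, changed: {len(changed)}")
    for pkg in changed:
        print(f"  {pkg}: {old[pkg]} -> {new[pkg]}")

if __name__ == "__main__":
    # Hypothetical file names for two RHEL point releases.
    diff_manifests("rhel-9.4-packages.txt", "rhel-9.6-packages.txt")

A real validation effort would layer kernel, driver, and test-matrix checks on top of a diff like this, but the mechanical comparison is the part that tooling can take off a team's plate.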
Now, I want to go deeper into that with you.
You know, we're seeing hefty capital commitments in this space from the hyperscalers and neoclouds, nation states getting involved, but also the broad proliferation into the enterprise. From your position, and I know that you're deep into not just
how the network advances, but also how it interplays with compute and storage and everything
that comes into an AI system, how mature is AI infrastructure really? And where are the biggest
gaps that need to be overcome today? So when I step back and look at the whole thing, I think we're
in the phase that I would call brute force, but it works. And we've got to move to elegance. And to put a few extra words around that, when I say brute force, but it works, it's what we are doing
right now. We are throwing more compute at the problem. We are putting more scale around the problem.
We are putting more concrete, more power, all these things around the problem and saying we just have
to brute force through these models. And the hyperscalers are obviously like really driving and
setting a lot of the pace there and pushing forward with significant massive investments.
But there are challenges there.
The network is one. It's an important one. Actually, there's data that suggests that GPUs are
spending anywhere between 15 and 30 percent of their time in non-math mode. So 15 to 30 percent of
time is being spent purely on communications. So that means they're not doing math for you.
They're not improving your time to solution. They're just waiting for data. There are other bottlenecks. It's not just the network; storage will continue to have advancements to be made in that space as well.
So there's other system contributors.
The network alone is a pretty big one.
And that's actually something I will talk about a bit at the AI Infra conference.
So when we can make improvements on those types of metrics in the system, we can do what I said, moving towards elegance.
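As a back-of-the-envelope illustration of why that communication share matters, the short sketch below converts a 15 to 30 percent non-compute fraction into idle GPU-hours for a hypothetical cluster; the cluster size and time window are made-up assumptions, not Cornelis or customer data:

# Back-of-the-envelope: how much GPU time a cluster loses to communication.
# All numbers below are illustrative assumptions, not measured data.

def lost_gpu_hours(num_gpus: int, hours: float, comm_fraction: float) -> float:
    """GPU-hours spent waiting on the network instead of doing math."""
    return num_gpus * hours * comm_fraction

cluster_gpus = 1024          # hypothetical cluster size
monthly_hours = 24 * 30      # a month of wall-clock time
for frac in (0.15, 0.30):    # the 15-30% range cited above
    idle = lost_gpu_hours(cluster_gpus, monthly_hours, frac)
    print(f"{frac:.0%} communication time -> {idle:,.0f} GPU-hours/month not doing math")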
By the way, huge believer in the hardware-software dynamic as well.
So, yes, I'm on the infrastructure side.
I'm speaking in the infrastructure track.
Like, I get that.
But when you start to see the work that's being done on the model side,
and you see the way that the large model providers are shaping and advancing their generations,
you do see they're moving towards, again, more thoughtful deployment of tokens, which is the best way I can think to describe it,
in that they're feeding tokens off of different models at different phases based on the types
of questions or prompts that they're getting. And they're definitely trying to move towards
greater efficiency while still improving quality of response. I think it's a great time for
enterprises because they don't have to invest in that level of model development. Like a lot of
it's been done on their behalf. And now there's enough capability where you can pull in tools to
your environment, train it or fine-tune it essentially off of all of your data and just get really,
really custom and bespoke AI interfaces that speak only to what your employees need to keep your
customers happier. So I think enterprise AI is on the brink of a phenomenal takeoff that is extremely exciting to be a part of.
Now, you've been at the company for a year, and I'm almost terrified
to ask this question, given the pace that things are changing in the industry. But if you look ahead
three years from now, if you succeeded with the team beyond your collective wildest expectations,
how will the AI landscape look different because Cornelis Networks was driving innovation at the heart
of it and what change will you have driven that wouldn't have happened otherwise? I fundamentally
believe that we will play an important and big role in delivering a more economical and
sustainable AI infrastructure. So the genesis of everything that we've built into our architecture
and everything that we're building is around performance first. This is a high-performance product
that is set to challenge old paradigms for how you deliver data throughout the network.
And so very excited about that.
But it's not just performance for performance sake.
It delivers an improved utilization and efficiency of the entire system.
So it's not like you're just delivering improved network micro benchmarks where you say,
yeah, my network's a bit faster,
and then the whole system still needs the same power.
It still needs the same concrete put around it.
It still needs the same massive amounts of compute.
It's network innovation that allows you to do more with that same investment of your
power, your compute, the rest of the infrastructure, your software.
So I believe that as a company, we are on a path to deliver that more economical, more
sustainable AI infrastructure to the world. And that is not a cloud-only statement. I have so many reasons to believe that the cloud providers will continue to drive the frontier edge of models and capabilities, but that the economics, the privacy, the security, and the use cases
will drive infrastructure for enterprise AI into an on-prem world.
There might be some colo aspects to that as well.
But the nature of the constant use, the massive amounts of data being generated,
are going to make that a more economical and for enterprises,
I think, faster-moving type of world and environment.
So when I look out at the market, and we've done a bunch of TAM analysis looking at how this shakes out, I do see that the hyperscalers have 40 to 50% of that AI market, but I see that the enterprises and some of those next-wave neoclouds and the sovereign clouds that you mentioned represent the other 50 to 60% of the market as well.
I can't wait to see how it shapes up. And I've been doing
a lot of interviews of practitioners leading up to this conference. It's fascinating to just see the
progress that they've made in the last year. And what you're talking about in terms of on-prem
implementations, I think we're going to hear a lot at the conference. And the next question
that I have for you is about your talk. Obviously, you've given us a couple of little peeks during
this conversation, but can you just give us some insight into what you're going to be talking about
and why this is such an important topic for you right now?
So at Cornelis, we've built and proven technology that rivals and outperforms entrenched competitors. We've shown customers and partners that our
approach isn't just different. It's better. And I want to expose
that to more people. So I'm looking forward to, you know,
getting in front of them. And it goes back to not just showing off
of what we've built, but more importantly than that,
giving enterprises and the cloud providers that are there in
attendance, a chance to see how we can help solve their problems of AI efficiency.
So right now, infrastructure is holding back the next great discovery. It's holding back the
next human achievement. It's holding back what the next business evolution is for many of the
folks that are going to be in attendance at the event. And we want to give them a path to
unlocking that. And that's what I'm going to be talking about.
So final question for you, Lisa: where can folks find you at AI Infra Summit, and where can they find out more about Cornelis Networks and the solutions we talked about today?
Yes. So for those of you that are attending the event, I very much look forward to seeing you all there. We have a booth out on the show floor, and then, like I said, I'm giving a talk on Tuesday afternoon, around three o'clock, so please come by and see that in the infrastructure track. And then secondly, for those of you that can't make it, we'll be continuing to share content online. You can go to cornelisnetworks.com, of course.
Follow us on LinkedIn.
Check out our YouTube channel if that's
not where you found us first. And we
look forward to staying connected with you.
Lisa, always a pleasure.
Thank you.
Thanks for joining Tech Arena.
Subscribe and engage at our website. All content is copyright by TechArena.
Thank you.
