In The Arena by TechArena - Dell on Storage Innovation in the AI-Driven Enterprise
Episode Date: September 30, 2025
Dell and Solidigm leaders explore how modern storage—flash, SSDs, and flexible architectures—enables AI, accelerates performance, and helps enterprises manage data across edge to cloud....
Transcript
Welcome to Tech Arena, featuring authentic discussions between tech's leading innovators and our host, Allyson Klein.
Now, let's step into the arena.
Welcome to Tech Arena Data Insights. I'm Allyson Klein, and today is another episode with our great friends from Solidigm.
Welcome back to the program, Scott, as well as Dell. We've got Saif Ali here
with us from Dell.
I'm just going to do quick introductions
and then have you introduce yourself.
Saif is a senior product marketing manager at Dell,
and Scott is a leadership marketing director at Solidigm.
Welcome to both of you.
Saif, why don't you go ahead and start introducing yourself
and what your role scope is at Dell?
Sure, thank you, Allyson.
What I do on a daily basis at Dell
is identify data challenges that businesses face
and work with them to develop an efficient
and effective data strategy that delivers tangible business outcomes.
But if you want a simpler way to put that, you could think of me as a conduit for businesses, trying to bridge the gap between IT innovation and data outcomes at the end.
And Scott, I know that you've been on the program before, but why don't you go ahead and reintroduce yourself to the audience?
Yeah, so I'm Scott Shadley.
I've been in the semiconductor space for going on almost 30 years now.
Built these things from scratch and have been enjoying the ride of doing the education and implementation of SSDs and storage products throughout the entire ecosystem, from when it was a cache-only to now the flash-only type of architecture and
things like that. So very similar to Saif, also spending a lot of time with customers making
sure that we understand needs and product fits. Now, the topic for today is around enterprise
storage requirements and how they're changing. You two are so well positioned to give us
some great education on this topic. I'm just going to start with a question to both of you.
how have you seen these requirements within enterprise evolve in the past few years,
especially when we look at the context of AI and data-driven workloads?
Yeah, so from a Solidigm point of view, or even just from a personal perspective,
we're seeing how the enterprise is evolving to this mode of: I need my data today, and I need to store a lot of my data today.
AI is the latest and greatest of the buzzwords or the technology ecosystem changes that have driven this continuous
growth. But at the end of the day, we're seeing that enterprises are gathering more data in more
unique places, and you need to be able to execute information gathering and results from that as
quickly as possible. And that's one of the places where we're seeing a lot of the emphasis on
the transition to a lot more flash-based architectures, SSD-based architectures. And partnering with
companies like Dell and in the ecosystem as we talk to our end customers is very important because
we're one cog in a bigger machine. And we need to make sure that all those
cogs align very well as we move these technologies forward.
So that's definitely one perspective.
And storage, obviously, at the heart of it, is data storage.
But from an AI point of view, I believe that AI workloads demand, one, sustained throughput;
two, low latency; and three, massive scale.
And as you all know, AI is only as good as the data that you feed it.
And that means that you need the high-speed and uninterrupted data access that you get from your
storage layer.
So to that point, I think storage isn't just about reliability and cost per terabyte in flash SSDs anymore.
It's about data activation.
It's about feeding your GPUs efficiently to maximize utilization and effectively stream huge data sets in parallel to enable that real-time analysis across environments.
Yes, you need storage.
You need to make sure that you've gathered data to feed AI, but you need performance.
You need to feed it at scale.
You need that flexibility.
You need efficiency.
And of course, arguably most importantly, you need your storage to be secure, now more so than ever before.
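To make the "feeding your GPUs" idea concrete, here is a minimal Python sketch of a prefetching data loader. It is purely illustrative and not Dell or Solidigm code: background threads keep fetching shards ahead of the consumer so a GPU pipeline is never left waiting on a single serial read.

```python
import concurrent.futures
import queue
import threading

def read_shard(path):
    """Read one shard of a data set from storage (stand-in for real I/O)."""
    with open(path, "rb") as f:
        return f.read()

def prefetching_loader(paths, workers=8, depth=4):
    """Yield shards while a thread pool fetches ahead; the bounded queue
    provides backpressure so memory use stays capped at `depth` shards."""
    buf = queue.Queue(maxsize=depth)

    def producer():
        with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
            for shard in pool.map(read_shard, paths):
                buf.put(shard)  # blocks when the queue is full
        buf.put(None)  # sentinel: no more data

    threading.Thread(target=producer, daemon=True).start()
    while (shard := buf.get()) is not None:
        yield shard
```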
I can remember, a number of years ago, storage was the afterthought of the data
center in terms of what was seen as strategic. With AI, it's become the center of the story.
And what we have is some enterprises not necessarily having the right infrastructure in place
to take advantage of these opportunities.
What do you think are the biggest challenges that the enterprises you talk to face when
trying to modernize their storage infrastructure?
When it comes to modernizing a storage strategy,
I think there's two aspects we can look at it.
The first one, from purely an AI perspective,
to your point, underutilized GPUs
is a huge and major challenge that enterprises face,
especially in this, as you said, AI context.
So you have to make sure your storage architecture
can meet the demands of high-speed data access,
as I said before.
The other aspect, on the other side of it,
is data fragmentation.
That's another key challenge that we hear customers have.
The data is siloed across different environments, be it edge, core, or cloud.
It can be anywhere and everywhere.
Another one, on that point, is complexity: the sheer operational complexity that comes with
these massive amounts of data, and these rigid architectures that lead to vendor lock-in
and other operational silos. That's a major concern that we hear our customers have.
Now, I know that Dell has established itself as a trusted advisor with a lot of enterprises.
How do you see your position in leading them through this evolution with your customers,
and what, in your mind, differentiates Dell's approach to innovation in the storage space?
So I think over here I want to plug the Dell AI Data Platform.
That's where the Dell AIDP comes in.
It's Dell's approach for an open, flexible, and secure infrastructure that supports the entire data lifecycle.
Because we recognize that it's not only about storing data anymore, as I said before.
It's about managing data, and most importantly, it's about using data in your AI workloads.
So the Dell AI Data Platform is built on the principles of place, process, and protect,
where our scale-out NAS platform, being the world's most flexible, efficient, and secure
scale-out storage is at the core of that data foundation.
So when I look at it in terms of innovations and how we stand out and how we differentiate in the market,
PowerScale's performance and scale is completely unmatched.
We're continuously innovating in this space, and in just a year or so, we've
delivered about 220% faster data ingestion and 99% faster data retrieval compared to previous
generations. And in today's age of data, metadata is hugely important. We recently also
introduced a new software feature within our software suite called MetadataIQ, which essentially
accelerates search and querying by organizing and tagging unstructured data
for better discoverability, and that eventually does go into AI workloads. Those are two huge
facets that we've been continuously innovating on, making our data more discoverable and making
sure that we're matching the pace of the demand of performance that storage infrastructure requires
at this point. And I think lastly, I want to touch on one more point where we stand out, and perhaps
this is most important. I think what Dell does is it takes a validated ecosystem approach.
It integrates with reference designs, AI frameworks, and numerous ISV partners, like
NVIDIA, for example, to help customers operationalize AI faster with less risk and less
complexity. And that end-to-end approach is, I think, what sets Dell apart.
Yeah, Scott, as Saif is talking about this opportunity, underpinning that is somewhat of a golden
age for solid-state drives coming to fruition. How are you seeing customer requirements
shift, particularly when it comes to high-capacity flash and AI-driven workloads?
Yeah, it's an interesting thing, because it's a combination of both the AI-driven workloads and the location of the data.
And one of the biggest things that we're seeing from customers is this concept of time to first data, even.
So it's not even so much how much you can store and how much you can manipulate, but how quickly can you get that first access.
So we start to see latency playing a large part in a lot of this work.
When you talk about these AI workloads and where data resides, to Saif's point, in edge, core, or cloud,
the overall infrastructure has a key aspect and input to it.
So designing the storage infrastructure behind it
is important to make sure you're addressing some of those other bottlenecks
so that the GPU is kept fully saturated
and you're not creating any kind of stranded data islands or data locations.
So being able to talk to customers and understand that I need a big amount of data
in this location, but I also need to access it from that location
drives this need to truly understand a larger scope than just what that one product is.
And that's why these conversations with customers never revolve around just what SSD do you want and how many can I ship you?
But it's really about what are you doing with your data?
How are you managing your data?
And the AI workloads, we're starting to see the focus now on the token economy and how tokens work.
And the unique thing about tokens in the AI economy is they're not one size fits all.
It's like a bank.
You go in and you can get a quarter, a nickel, a dime, a dollar bill or whatever.
All those tokens are different sizes and shapes.
and they need some kind of locality about them
that makes that time to first data very real.
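Scott's "time to first data" is a measurable quantity. Here is a rough, hypothetical benchmark sketch in Python (illustrative only, not a Solidigm tool) that reports first-chunk latency separately from total transfer time:

```python
import time

def stream_with_ttfd(path, chunk_size=1 << 20):
    """Read a file in 1 MiB chunks, reporting time-to-first-data (TTFD)
    separately from total time; TTFD is what interactive AI pipelines feel."""
    start = time.perf_counter()
    ttfd, total_bytes = None, 0
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            if ttfd is None:
                ttfd = time.perf_counter() - start  # latency to first chunk
            total_bytes += len(chunk)
    elapsed = time.perf_counter() - start
    ttfd = ttfd if ttfd is not None else elapsed  # empty-file fallback
    print(f"TTFD: {ttfd * 1e3:.2f} ms | total: {elapsed:.3f} s | "
          f"{total_bytes / max(elapsed, 1e-9) / 1e6:.1f} MB/s")
```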
Now, Saif, I know that we at Tech Arena have written a ton
about the disruptive force of AI,
and Scott has done a really nice job of talking about
some of the core capabilities that they're seeing
from a media perspective.
How do you see your customers thinking about performance,
latency, and capacity in storage,
and how does that differ in the AI era,
from your perspective?
I'm going to piggyback off of what Scott said.
So to his point, it's not just about storage.
It's about the entire AI stack, and that includes networking.
When it comes to GPU saturation, as he said,
storage becomes even more important.
That's why storage has taken up even more of a mindshare,
because you can't afford for your GPUs to be idle.
So I think efficient feeding of these data-hungry compute servers
is what makes storage even more important.
And in today's age, where performance is obviously integral,
model training, for instance, needs sustained throughput.
If your storage can't keep up, as I said, you're wasting expensive GPU cycles.
You're wasting those tokens then.
And when it comes to inferencing and retrieval augmented generation, latency is critical in the
AI pipeline and directly impacts decision-making and business outcomes, right?
And of course, when it comes to storage, you have to admit the fact that capacity, of course,
matters too.
And because these workloads scale fast and at an exponential rate, your storage has to keep
up; it has to scale alongside the data that's being created by these workloads.
These are all the things that organizations now need to be mindful of in this new age,
and that's where performance, latency, and capacity take the forefront.
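The idle-GPU point reduces to simple arithmetic: aggregate storage throughput has to meet or exceed what the GPUs consume. A toy back-of-envelope in Python, with every figure a made-up placeholder rather than a vendor spec:

```python
# Can the storage tier keep a GPU cluster fed? All numbers are assumptions.
gpus = 64
per_gpu_ingest_gbs = 2.0       # sustained read each GPU needs (GB/s, assumed)
storage_bandwidth_gbs = 100.0  # aggregate storage throughput (GB/s, assumed)

required_gbs = gpus * per_gpu_ingest_gbs
io_bound_utilization = min(1.0, storage_bandwidth_gbs / required_gbs)

print(f"GPUs need {required_gbs:.0f} GB/s; storage supplies {storage_bandwidth_gbs:.0f} GB/s")
print(f"Best-case GPU utilization from I/O alone: {io_bound_utilization:.0%}")
# With these placeholders: 128 GB/s needed vs 100 GB/s available, so the
# GPUs run at 78% at best; the rest is paid-for silicon idling on storage.
```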
Now, Scott, for the last couple of years, we've been talking a lot about large language model
training, but as enterprises start proliferating these in applications, we're seeing inference on
the rise. How do you see storage supporting inferencing and other real-time AI
use cases?
We've spent a significant amount of time understanding the different aspects of the AI flow,
and you've got your ingest, your training, then we get into the big guy, the inference.
And one thing that's unique about that, and as Saif mentioned, the idea of the RAG architectures:
they starve if you don't feed them large amounts of data, because the idea is I've got a
whole bunch of information that's sitting somewhere that I need to go back and reference
in order to inference on it. And being able to transition that data from an
archive-type location to a more real and available footprint is valuable.
And that's where something like a large-capacity drive in a nice network box,
like a PowerScale-type architecture, comes in.
And then you still have that closer data, which you have to be able to manage,
which is the real-time generation and effort that sits in that more direct-attached
location next to the GPU.
So being able to understand the math and the economics behind what needs to be sitting
right next to the GPU and what needs to be sitting out a little further where you can
do some of that longer-time pull-in for your inference workloads is something that we're
starting to see people optimize and spend a lot of effort and focus on. Which way do I manage
the data set so that I'm pulling the large brick in without impacting the real-time work on
the data that I'm using today? Now, Saif, I know that we've been talking a lot about Flash,
and some people would say that it's essential for modern workloads, and I'm sure Scott would
like that to be true for all workloads. But where does Dell see Flash excelling?
And where do you think HDDs still play a role?
So I won't deny that, as you said, Flash is essential for these high-performance workloads.
Be it AI training, rendering, or analytics, really anything that's latency-sensitive needs high-performance Flash SSDs.
And as I said, you can't afford the data to be a bottleneck.
You can't afford your GPUs to be idle.
And that's where Flash will remain king.
But to your point, that being said, HDDs do still deliver great economics at scale.
And they have a role to play, especially for cold and warm data sets.
The way that I see it is that not all data is equal.
So not all data needs to be treated equally.
No one-size-fits-all works for any one customer.
So PowerScale customers can mix Flash and HDD in the same cluster,
with Flash used where it matters most and HDD where it makes the most sense, to balance performance and cost,
because that's a big consideration for a lot of our businesses as well.
And being able to do that within the same unified namespace, to fit your data needs and your workload needs, is the name of the game, I feel.
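As a purely conceptual illustration of "Flash where it matters most, HDD where it makes sense": PowerScale's real tiering is configured in the product, so this tiny Python rule and its one-week threshold are invented for the example.

```python
import time

FLASH, HDD = "flash", "hdd"
HOT_WINDOW = 7 * 24 * 3600  # assumption: touched within a week means "hot"

def choose_tier(last_access_ts, latency_sensitive):
    """Toy placement rule for 'not all data is equal': latency-sensitive or
    recently accessed data lands on flash, the rest on HDD for cost per TB."""
    if latency_sensitive or (time.time() - last_access_ts) < HOT_WINDOW:
        return FLASH
    return HDD

# A month-old archive goes to HDD; a live training shard goes to flash.
print(choose_tier(time.time() - 30 * 24 * 3600, latency_sensitive=False))  # hdd
print(choose_tier(time.time(), latency_sensitive=True))                    # flash
```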
Now, as workloads evolve,
we're also seeing the diversity
of where data is being generated, expanding.
And if you look at this vast landscape
of cloud, edge, and everywhere in between,
how is Dell adapting your storage portfolios
to support this decentralized model for customers?
So simply put,
PowerScale is the world's most flexible solution.
And it delivers a completely consistent user experience
and seamless mobility no matter where,
be it edge, cloud, or anywhere in between, right?
So we have solutions like PowerScale for Microsoft Azure
and PowerScale for AWS,
in which we ensure business can extend their
on-premises experiences into the cloud
completely and effortlessly.
Plus, besides those two features,
we also have a new SmartSync functionality
that we recently announced.
And what this does is not only support
data mobility across locations,
be it edge, core, or cloud,
but also support data mobility
across data types. Again, it enables seamless movement from, let's say, any kind of data
to, let's say, a lower-cost S3-compatible storage, whether it be with our partner Wasabi or
whether it be our other products within our integrated storage portfolio, like ObjectScale
with native S3 compatibility. So I think what sets Dell apart within this data-fragmented ecosystem
or age that we're living in is our ability to unify data management across different
environments, ensuring that these businesses can access, can move and can protect that data
seamlessly wherever it may live.
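Part of why "S3-compatible" matters for that mobility story: any standard S3 client can target a compatible store just by changing the endpoint. A hedged sketch with boto3, where the endpoint, bucket, and credentials are placeholders, and which is not how SmartSync works internally:

```python
import boto3

# A generic S3 client pointed at a hypothetical S3-compatible endpoint.
s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.example-object-store.com",  # placeholder target
    aws_access_key_id="PLACEHOLDER_KEY",
    aws_secret_access_key="PLACEHOLDER_SECRET",
)

# Move a cold data set to the lower-cost tier: same call, any compatible store.
s3.upload_file("/data/cold/dataset.tar", "cold-tier-bucket", "dataset.tar")
```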
Now, Scott, Saif just described a really awesome distributed environment
and a platform that's going into all of that.
How is Solidigm approaching optimizing for various environments within your portfolio?
Yeah, it's an interesting thing, because when we first brought flash-based storage to market,
it was all about being really fast.
It had to be the fastest thing, and this one drive is going to speed up an entire infrastructure.
As we've seen that become now, to Saif's point, an essential part
of the platforms, it's not so much about being just the fastest. It's not about being just
tailored for that speed aspect of it, because there's a network involved in a lot of this,
especially when you're going from edge to core to cloud. So being able to articulate and build
a product portfolio that supports the right bandwidth, the right density, and the right
time to data that customers are looking for is very important. And so always looking at the tip
of the spear is fun, and, you know, being able to say I have the first this, the fastest that.
But at the end of the day, the customer and what they really want is not always the fastest.
It's the most efficient by way of power, cost, performance, latency that satisfies the need
they're looking for.
And so Solidigm has been doing a lot of work.
We spent a lot of time investigating the entire AI stack, the entire data ecosystem, from the edge,
with partners there, like Hedgehog, which I mentioned previously on one of these episodes,
all the way to the cloud, where we're partnering up with those guys and delivering solutions.
And now thinking outside that box,
even, what can the SSD-type architectures actually provide customers that helps with both
the cost performance and the layered capability of creating that next level between cold and warm,
whatever you want to call it, or just fitting in the stack so that it solves
the customer problem, rather than just being the next big thing or big news.
Now, I am old enough to remember when the first SSDs hit the market, and the capacities that are coming out on Flash at this point are crazy compared to where we started.
Seth, when you look at these high-capacity drives, what is the opportunity to balance performance and cost efficiency for customers?
That's a huge opportunity.
These high-capacity flash drives that are coming out, they're a game changer right now.
I mean, in the last year alone, we've gone from 30 TB SSDs to 60, and now to 122 TB SSDs.
And when I think about it, I remember when I first saw Solidigm's 122 TB drive in the palm of my hand at Dell Tech World this year.
And I was completely mind-blown at the amount of data that can fit into this small little thing, right?
The rack space density, that's the opportunity.
The density is key over here.
The benefit that gives enterprise customers is nothing short of amazing.
These high-capacity flash drives effectively help customers reduce power.
They help reduce cooling costs.
They help maximize floor space,
which may be a limitation for certain entities,
while still delivering the throughput,
flexibility, and efficiency that AI workloads
and other high-performance workloads require.
To your point, the rate of change is unbelievable.
Going from smaller SSDs to such higher-capacity SSDs now is huge.
I don't think it'll be long before we see 245 TB drives out in the market and so forth.
The data explosion is not going to stop, and that's where the opportunity is.
AI needs data, and it, again, essentially creates data as well in turn.
So managing this data growth while keeping within the data center costs and limitations
that enterprises have, that's where the opportunity for higher-capacity flash, I feel, is.
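The density opportunity is easy to sanity-check with round numbers. A toy Python comparison, where every figure is an assumption for illustration, not a vendor spec:

```python
# How many drives to hit 10 PB usable at each capacity point?
target_tb = 10_000               # capacity goal: 10 PB (assumed)
drive_sizes_tb = [30, 60, 122]   # the generations mentioned above
watts_per_drive = 20             # assumed average draw per high-capacity SSD

for size in drive_sizes_tb:
    drives = -(-target_tb // size)  # ceiling division
    kw = drives * watts_per_drive / 1000
    print(f"{size:>3} TB drives: {drives:>3} needed, ~{kw:.1f} kW for the drive pool")
# Fewer, denser drives means fewer slots, fewer watts, and less floor space
# for the same usable capacity.
```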
Now, Scott, I think that one of the things that comes to mind here is density,
and I know that Solidigm has said a lot about media density and drive density.
Can you talk a little bit about that in terms of the importance of density and performance in your product roadmap and how that aligns with Dell's product portfolio?
Yeah, absolutely. To Saif's point, with these 122 terabytes today going to 245 here in the near future, when you have that larger density per device and you're putting multiple devices in a system, it doesn't have to be at the fastest possible rate you can imagine, because you're just not going to be able to ingest it into the local data set, or even over the network that's attached to it.
So optimization of the infrastructure, both inside the drive and around the drive,
like with the Dell Power Scale systems, is important to make sure that you're giving
the customer the performance and TCO benefit that they're looking for.
And to your point, we were selling a 73-gigabyte drive, one of the first
enterprise drives ever put out, at $30,000, which they're not even anywhere close to
anymore, right?
I used to sell those guys to EMC before Dell bought them.
But now, with this 122 at a cost-effective price, we're seeing at the edge the infrastructures are smaller, but they have more data than you do in some other places.
So being able to put that kind of capacity, with, to Saif's point, the performance and the security that's associated with that technology and that product, in that footprint, and then scale it all the way across the ecosystems so that you get that unified look, is very valuable and important, and
a lot of the focus for us.
So if you need that really fast drive,
I'm still going to sell you one.
I've got those; they're going to be at a lower capacity,
because you're not using them for density storage,
you're using them for a cache or some kind of fast layer.
And when you really do need bulk storage,
but easily accessible, timely data in bulk storage,
you're going to go for these 60, 122 and soon 200 plus terabyte drives.
Now, this conversation has really been a fantastic walk through storage innovation,
but I want to ground us on the fact that what this
technology is doing is really helping customers unlock better business outcomes.
Saif, you know your customers really well.
Can you give us a few examples of instances where you've made an impact for your customers?
Sure, you're right.
It's absolutely about business outcomes at the end of the day.
So I think one story I love telling all the time is with KMM, Kennedy Miller Mitchell.
That's the studio behind the recent Mad Max movies.
If you've seen them, what PowerScale did was enable them to pre-visualize entire scenes even before filming.
Now, imagine being able to give directors a cut of the movie before filming even starts.
And what this gives directors is the ability to make creative decisions to realize their full potential or their vision for the movie.
And if they needed to incorporate any changes that they felt necessary in real time, they could. It's completely transformed the production process for them.
And it saved them countless hours and resources.
Mad Max is a story that I love telling for sure.
And another example that I can give you would be Subaru.
They turned to PowerScale in the wake of this exponential data growth that they were
experiencing. And what we did was help them consolidate and scale their unstructured data,
allowing them to manage over 1,000 times more files than before and streamline operations in the
same namespace, as I said, within the PowerScale storage layer. The outcome,
besides the scale and consolidation of their data, was that they were able to accelerate the development
of their latest AI initiative and improve the accuracy of their driver-assist technology.
Again, the better the data, the better the AI output, the more data they were able to manage and
consolidate, the better the outcome of their driver-assist system technology. And in both these use
cases, I feel what PowerScale delivered is the performance, which we talked about already,
the scalability that we talked about, simplicity, which is also really important, that both
of these enterprises needed to help thrive in today's data-intensive environment.
That's awesome. Thank you so much for sharing those examples. I know that you've got more on your
website, and that's where I'm going next. First, I want to say thank you to both of you for
participating today. It's been a fantastic journey, but I know that I have more questions, and
I'm sure that our audience does too. So where can our listeners go for more information and to
connect with you and your teams? Scott, why don't you go first? Yeah, no problem. So for Solidigm,
it's solidigm.com/AI. We've got a great landing page that gives you lots of insights on where
we see the world of the AI ecosystem and the impacts and value that storage plays there. For me personally,
it's SMShadley on the socials, or scott.shadley@solidigm.com.
If you'd like to reach out to me directly, I'd be happy to have those follow-up
conversations. To Allyson's point, there are lots of questions.
And fortunately, we've done a lot of research and have a lot of good answers for you.
So hopefully I will hear from you all soon.
So for PowerScale, I'd say, obviously the website, as you said, is a good place to start off.
That'll be dell.com/powerscale.
And our team of professional services is always ready to help businesses get up and going.
For me personally, my address would be saif.ali@dell.com. As Scott mentioned, I'm happy to have any conversations one-on-one if anybody needs,
and we can get started on that. Awesome. Thank you so much to both of you for being here today.
It was such a pleasure. Thank you for having me. Appreciate it.
Thanks for joining Tech Arena. Subscribe and engage at our website, TechArena.com. All content is
copyright by Tech Arena.