In The Arena by TechArena - Immersion, Not Hype: Midas on Liquid Cooling at OCP
Episode Date: January 10, 2026. Midas Immersion Cooling CEO Scott Sickmiller joins a Data Insights episode at OCP 2025 to demystify single-phase immersion, natural vs. forced convection, and what it takes to do liquid cooling at AI scale.
Transcript
Welcome to Tech Arena, featuring authentic discussions between tech's leading innovators and our host, Allison Klein.
Now, let's step into the arena.
Welcome to the arena. My name is Allison Klein, and we are coming to you from the OCP Conference in San Jose, California.
This is another Data Insights episode, which means Jenny Serowski is with me. How are you doing, Janice?
I'm doing well, Allison.
Day two of recording. How are you doing? How's the energy level?
Day two, energy's great.
I've just wrapped up lunch, so I'm super pumped.
So tell us what topic we're covering in this episode and who you've brought with you.
We have a cool topic.
I have to say that because we're going to talk about all things liquid cooling.
Today we have with us Midas.
And sitting here to my left is the CEO of Midas.
Wow, you really brought the heavy hitter, didn't you?
Yes, we did.
Welcome to the program, Scott.
Thank you, Janice.
Happy to be here.
And to keep the puns going, it is pretty cool to be up here talking with you.
Awesome. Thank you. So why don't we just start with an introduction of Midas and how the company fits within the data center arena.
Yeah, Midas provides immersion cooling solutions. How we got to provide that solution was actually by being a user of the technology.
In 2011, we were a small data center in Austin, Texas, and during the growth of our data center, we quickly became the go-to company to handle the hard-to-cool IT.
That progressed to the need for a different cooling solution than our traditional air cooling, and we moved into immersion.
Wow.
So from 2011 to 2012, quite a few iterations of the immersion solution, which ultimately resulted in our own built solution, which we patented, consumed, and utilized until 2016, when we strategically elected to leave the data center space and just become an IT infrastructure provider.
And the rest, as they say, is history: 4,000 tanks. But it was a good start.
Now, immersion cooling is a big topic here, as is liquid cooling in general.
But a lot of people don't really understand immersion.
Can you explain in a little more detail what single-phase immersion cooling is?
Absolutely.
And how does it maybe even differ from, say, two-phase?
So at the core, immersion is easy.
The first thing is that we get the advantage of a liquid in heat dissipation versus air.
1,200 times better is just a good, easy number to run with.
So by moving from air cooling, which is just using a fan and surface area to cool the IT, and putting it into liquid, we automatically jump to that 1,200 times better.
So immersion is easy.
Doing immersion well gets a little more difficult.
We'll talk about that a little bit later.
But with single-phase immersion, there are two versions. There's natural convection, where we just let the temperature difference between the IT and a cold plate or heat exchanger (think of an ice cube) create natural convection.
And then there is forced convection, which is where most of the industry is at today, where we're actually using a pump to move the fluid past faster.
So that is single-phase immersion.
The other immersion out there is two-phase, where the chemistry itself, the dielectric, changes state.
It boils at a low temperature, and that phase change provides the cooling.
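The liquid-versus-air gap Scott describes can be sketched with Newton's law of cooling, Q = h * A * dT. This is a minimal illustration; the heat-transfer coefficients are assumed order-of-magnitude values for forced air and a single-phase dielectric, not figures from the episode:

```python
# Newton's law of cooling: Q = h * A * dT.
# The h values below are assumed order-of-magnitude figures:
# forced air over a heat sink ~50 W/(m^2*K),
# single-phase dielectric in forced convection ~1000 W/(m^2*K).
def heat_removed_watts(h, area_m2, delta_t_k):
    """Heat carried away (W) for coefficient h, surface area, and temperature delta."""
    return h * area_m2 * delta_t_k

air = heat_removed_watts(50, 0.05, 30)       # fan-driven air: 75 W
liquid = heat_removed_watts(1000, 0.05, 30)  # forced single-phase immersion: 1500 W
print(f"liquid removes {liquid / air:.0f}x more heat")  # 20x
```

The 1,200x number quoted in the episode is usually stated in terms of volumetric heat capacity of the fluid versus air; the convective coefficients here are a separate, rougher proxy for the same advantage.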
Now, you've deployed thousands of these immersion tanks with your XCI systems.
Tell us about what it was like as you matured the technology.
What did you learn along the way?
So a very early lesson was that computational fluid dynamics is our friend.
I'll say CFD from now on.
But like air, how the fluid moves around the heat-generating devices is key.
So you need to have a uniform flow.
You need to be able to get to the heat-generating devices, engage them with the dielectric, and move the heat away constantly to keep them cool.
And unfortunately, the form factor always changes, whether it's 19-inch servers, ASIC shoeboxes, 21-inch OCP servers, power supplies in one spot, GPUs in the other.
So it's ensuring that the technology handshakes well with the form factor.
That's been the most challenging part.
On the bright side, at the end of the day, it's only physics.
So the physics can support the workload.
We just have to fit the form factor into the physics box.
So, Scott, one of your key differentiators
is actually reuse of the energy.
Can you tell us a little bit more about how this works, and how the work you're doing with your systems is more sustainable than other types?
Yeah, thermal recovery, heat reuse, is near and dear to my heart.
And there's not a better technology for that than immersion.
So at the end of the day, when we turn energy into compute,
we generate heat.
It's just a natural function of that activity.
In an air-cooled environment, we have to try to move that heat with air, a very poor thermal conductor, and it moves and dissipates very quickly.
In an immersion environment, we grab all that heat in a fluid, which, as mentioned, is already 1,200 times better at grabbing it, and another benefit is that it holds onto it longer.
So we're able to transfer that heat efficiently and then ultimately move it through the system and handshake with other elements that need thermal energy.
I just left a meeting here at OCP about a German district heating facility that a large data center is getting next year. And that's a prime use case.
In Europe especially, district heating is how a lot of homes are heated, so they already have a boiler and move hot water around. If we're able to bring them water at 50 degrees C basically for free, it helps all of the energy consumption farther down the line. We've already paid for the energy once when we created the IT output, the bits and the bytes. So at that point, why not use it again?
And that's where thermal recovery is really useful.
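As a rough sketch of the district-heating handshake, the water flow needed to carry a tank's waste heat follows from Q = m_dot * cp * dT. The tank size, capture fraction, and return temperature below are assumptions for illustration; only the 50 degrees C supply temperature comes from the episode:

```python
# Q = m_dot * cp * dT: the mass flow needed to absorb Q watts of waste heat.
CP_WATER = 4186.0  # J/(kg*K), specific heat of water

def water_flow_kg_per_s(heat_w, t_in_c, t_out_c):
    """Water mass flow (kg/s) needed to absorb heat_w across the temperature rise."""
    return heat_w / (CP_WATER * (t_out_c - t_in_c))

captured_w = 100_000 * 0.95  # assumed: a 100 kW tank with 95% of heat captured in the fluid
flow = water_flow_kg_per_s(captured_w, 35.0, 50.0)  # assumed 35 C return, 50 C supply
print(f"{flow:.2f} kg/s of 50 C water")  # ~1.51 kg/s
```

About a liter and a half of water per second carries away a full tank's heat, which is why a single hot-water loop can serve as the handoff to a district heating network.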
Now, we're at OCP Summit.
Folks are talking a lot about direct-to-chip or cold-plate cooling technology, but compute densities are getting bigger and tighter all the time.
What would you say is the tipping point where folks start looking at immersion,
and how do you see this trend evolving over the next few years?
So it's right around 100 kilowatts per rack.
We use a tank, but most people are more familiar with the rack, so call it 100 kilowatts per tank. On an ROI and total-cost-of-ownership model, anything above 40 kilowatts is very, very strong.
But for the necessity of immersion cooling to cool the IT, once we reach that 100 kilowatts, the advantages are extreme.
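The density thresholds Scott cites (roughly 40 kW per tank for a strong TCO case, 100 kW where immersion becomes a necessity) can be summarized in a small decision sketch; the cutoffs come from his remarks, and the labels are my own shorthand:

```python
# Cutoffs taken from the discussion above; labels are illustrative shorthand,
# not a vendor sizing tool.
def cooling_recommendation(kw_per_tank: float) -> str:
    """Map a rack/tank power density to the cooling approach discussed."""
    if kw_per_tank >= 100:
        return "immersion (necessity)"
    if kw_per_tank >= 40:
        return "immersion (strong TCO case)"
    return "air or direct-to-chip"

for kw in (30, 60, 150):
    print(kw, "->", cooling_recommendation(kw))
```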
With the rapid advancement of GPUs today, it will not take long to go from 100 to 200 to 300 kilowatts.
While the physics say that DLC can still keep up cooling the GPUs themselves, the peripherals of the server are going to create a great deal more heat, and that still has to be resolved with airflow.
So the power usage effectiveness, the use of the CRACs, the use of the air conditioning in the data center, will all elevate.
And at the end of the day, whether it's for thermal recovery or the density of the IT itself, as we're all hearing a lot more today, the true constraint is the power. Period.
And we're going to have to get smarter in how we consume that power to keep up with
demand.
So, being at this conference two years ago, we'd see a lot of immersion cooling on the floor and automatically think science projects, like, who's actually using this? But fast forward to today, and it seems so much more realistic for folks to have these systems. Tell us a little bit more about what the prerequisites are to really deploy some of these things, and is it really turnkey ready?
We'll start with some of the prerequisites. The largest barrier that we've faced for 15 years now is: I'm not going to put water on my data center floor.
The data center operator of 2015 was adamant about that.
Then we started seeing the deployments of rear-door heat exchangers.
Now we're seeing a rapid deployment of direct-to-chip liquid cooling.
So, to the competing technologies: thank you for bringing water to the data center floor, because that challenge has now been resolved.
Today we are deploying many systems in DLC, direct liquid-to-chip, halls, tying into that existing water loop.
Sure.
So many of the data centers, especially ones that are focusing on machine learning and AI,
they are building water loops in the facility.
So that prerequisite is done.
Then we need to start looking at the IT.
At the end of the day, it's just quite a bit different.
The main difference is we don't need fans. So we need to have the fans removed and the BIOS acknowledge that there are no fans. Thermal paste and the fluid don't get along, so we replace that with another thermal medium between the chip and the heat sink. Beyond that, we are able to dunk.
As for the advantage of getting the fans off the board especially, we automatically cut down power usage. If we use the use case of a one-kilowatt server, 150 to 200 watts of that rated power usage is on the fans. So we are able to complete the same amount of compute with that one-kilowatt server for 800 watts. That's just a quick example of some of the benefits. And now, as we start moving towards the future,
we have companies like Solidigm that are actively embracing immersion, which is just outstanding.
So as the world sees where the direction is going and comes in to support, these peripheral challenges will luckily be a footnote in the history books.
I love it. There's one thing
I will say on that. We'll be at Supercompute in about four weeks, and we're going to show off one of these systems. It'll be your 12U, correct?
Correct. So at the Solidigm booth at Supercompute, we're going to have a 12U that is actively running with some of their solid state drives.
We might as well have a 25U that has some Solidigm technology in it and some GPUs.
We'll also be showing off some of our maintenance suite, which is not really exciting. But at the same time, during these two years of evolution, as Janice mentioned, we've gone from "what is that, and can I get electrocuted?" to "how do I operate my data center?" And part of operating an immersion data center is providing the tools. We're no longer rolling IT out of a vertical rack; we're lifting it from a horizontal rack. So we've developed a server-lifting mechanism that's completely dripless and allows server service to be done on-site, as well as some other maintenance tools. All of that will be available at Supercompute, coming up in St. Louis.
I guess I know where I'm going to go first in St. Louis. That's really cool. Scott, you have given us a very
compelling reason to believe in immersion and the broad proliferation of it.
But one thing that I haven't heard from you yet is: why Midas?
A rising tide lifts all boats, and the other folks in the industry have really stepped up and are doing a great job.
But when you look at why Midas, I come back to our six years of utilizing the technology.
There are a lot of elements in the technology that can become more user-friendly when you have to use it yourself.
And that's the Midas system.
We recently deployed a system at a university in the UK, yeah, the CEO with a pipe wrench, and literally had that system up and going in 40 minutes.
That's an advantage of a Midas.
We had to maintain it ourselves, so we built it that way.
At Midas, we have a truly concurrently maintainable and fault-tolerant system.
Everybody knows when I say cooling distribution unit, so a CDU; for redundancy, the Midas system comes with two.
A year ago today, I had just hired my global sales manager, John Griffith. I did not bring him to OCP; I put him on a training video that we built in-house.
And with an hour of education, and this was from behind the camera, he was able to change out, hot swap, a CDU in seven minutes.
Wow.
None of the other players in the arena can get that done. So with the Midas system, in short: fault tolerant, 2N and N+1, and we never have to work on the immersion system in the white space.
Four quick connects, one Cat 5, one power connection; we're able to hot swap, get the CDU that has a fault out to the gray space, and we're back up to 100% redundancy in seven minutes.
Fantastic.
That's really the difference.
Yeah, that's fantastic.
Thank you so much for being on the show today, Scott.
One final question for you.
I'm sure folks who are listening online want to talk to you further.
Where can they go to find out more information about this cool technology and engage with you?
So, www.midasimmersion.com is the website.
It's easy to get connected through there.
Obviously, you can look myself or John Griffith up on LinkedIn; we're massively active on LinkedIn.
The phone number and contact information is there.
Just jump on board.
That's all we can say.
I think my key takeaway from this is: ready to dunk.
That was the key message, and I love that.
Thank you so much to both of you.
And Janice, this wraps another episode of Data Insights.
Thanks so much for being here.
Thank you, Allison.
Thank you, Allison.
It's great to be here.
Thanks for joining Tech Arena.
Subscribe and engage at our website, TechArena.
All content is copyright by TechArena.
