Not Your Father’s Data Center - Evolving Data Center Cooling for AI
Episode Date: February 25, 2025

Today we welcome Dave Rubcich, Global Vice President of Key Accounts at Vertiv, to discuss the innovative cooling partnership between Vertiv and Compass Datacenters. Raymond and Dave explore the evolution of data center cooling technologies from the 1980s to the present, including the transition from traditional air cooling to modern liquid cooling methods while addressing the increasing thermal demands of AI-powered data centers. They highlight the innovative CoolPhase Flex solution (formerly referred to as the DH400), a hybrid unit offering unmatched flexibility in accommodating both air and liquid cooling without altering data center designs. Throughout the episode, they offer insights into data center energy efficiency, the technological journey from Liebert to Vertiv, and the significance of adaptable cooling solutions, providing listeners with a comprehensive understanding of current data center cooling trends.

Timestamped Overview
00:00 Intro
03:38 Pivot from Engineering to Sales
06:54 Liebert's Early Talent Development Program
11:06 Tech Advances in Room Temperature Efficiency
14:42 Optimizing HVAC Efficiency and Control
17:19 Emerson's Network Power Divestment
23:28 Evolution of Cooling in Computing
26:54 Pumped Refrigerant Breakthrough
29:33 Outdoor DSE Pump Solution
31:01 Flexible Chilled Water Unit Development
Transcript
Nobody today can tell you definitively,
well, 20% of my load's gonna be air-cooled,
80% is gonna be water-cooled.
And on day one, it may be something,
it might be 90% air, day four, it might be 50-50.
The reality is nobody knows.
Nobody knows.
As we started talking more and more about the idea
of the CoolPhase Flex, of a hybrid unit
that can operate in either mode of operation,
everybody realized the true value of having this solution
and the benefit of the flexibility.
Well, welcome again to another edition
of Not Your Father's Data Center.
I am Compass's Chief Customer Officer,
Raymond Hawkins, and your host.
We are always excited to be joined by folks
from the data center business and help us understand
a little bit more about what is driving
the digital infrastructure in our world today.
Today, our friends from Vertiv have loaned us
the global vice president of key accounts,
multi-tenant data centers, a longtime veteran at Vertiv, Dave Rubcich out of Ohio.
Aren't you in Westerville?
Did I get that right, Dave?
Yes, I am.
Thank you for having me on today, Raymond.
Well, awesome for you coming and joining us.
Westerville, just north of Columbus.
If I know my Ohio geography right,
tell us a little bit about you in the early days.
Where's home?
Is Ohio home?
Where did you grow up?
I doubt you went to college and said, all I ever want to do is cool data centers.
So tell us about your hopes and dreams and how you got into this business.
Sure. So, born and raised in Ohio. Actually grew up in eastern Ohio, down in the Ohio Valley near Wheeling, West Virginia.
Ended up going to college in Cincinnati, so I'm a proud Bearcat.
All right.
Grew up as a Buckeye fan, went to college in
Cincinnati. So my allegiance has totally changed.
I was just gonna say, that's gonna be a little bit of a split
there. A lifetime Buckeye fan and going to the Bearcats.
It is, and it's a little tough living in Columbus.
Oh, yeah, that's right.
Even, yeah.
You're flying the Cincy flag there, but you were in the heart of Buckeye country.
Absolutely.
Yeah, absolutely.
Down there in southeastern Ohio, sorry for those folks that aren't as fascinated by
Ohio as Dave and I might be, Ohio University, is that down in the southeast?
That is southern Ohio, yeah, down in Athens.
Athens, yeah.
Solich, boy, the guy from Nebraska went there and did good things.
And the Bengals quarterback, Joe Burrow, he-
Yeah, yeah, yeah.
Well, he's from the area.
Of course, he went to Ohio State and then transferred to LSU, but he grew up in Athens. Just shows what a great job Ohio State does preparing quarterbacks for success.
Absolutely. Then he went to LSU. So I'm an SEC guy. I'm a diehard Auburn guy. I'm actually recording
today from the Compass office annex in Auburn, Alabama. And I will say having been at the Auburn LSU game in 2019, that Joe
Burrow performance, not only in that game, but that season might be the
best college quarterback season I personally witnessed.
There may be better, but man, he was unbelievably
accurate, unbelievably resilient, unbelievably tough, just an incredible
gamer. That's as impressed with a college quarterback
as I've ever been, right there in your backyard.
Yeah, he's really good.
And he's proven that in the NFL.
OK, so lifetime in Ohio, you went to the Bearcats.
So you didn't study data center cooling in Cincinnati.
What did you study?
No, electrical engineering.
Didn't really, you know, honestly, when I went to college,
I didn't know what I wanted to do.
But I've always been good in the math and sciences and there was a lot of demand for
engineering. So I knew I could get a job.
But as I, as I went through my college career, did some co-oping, the idea of being,
you know, glued to a desk every day, didn't quite appeal to me.
And I had some friends who started talking about technical sales. And I like the idea of going out and seeing
customers and seeing new applications being in factories
different things. And I thought, well, let me give that a try. If
I can't sell anything, I can always default back and be,
you know, a quote-unquote engineer. Yeah. And almost 40
years later, I'm still trying to figure out if I can be a sales
guy.
Yeah, I hear you. You and me both.
I stumbled out of college into the technology business in the late 80s.
And I joke with my kids all the time.
They're like, you know, Dad, we've always had computers.
I'm like, no, baby, we haven't always had computers.
We used to have posters on the walls in the offices:
someday there'll be a computer on every desk. Yeah.
So by early computer days, I mean, we'd go to offices
and literally there wasn't a computer in the building. Not one.
When I started, our admin had the lone computer for all of our quotes and everything. We'd
fill out forms, give to that person and they would generate everything.
Dave, did you ever get a green bar report?
No, I'm not sure I know what a green bar report is.
So the big computer paper, the reels on the dot matrix, and one line was green and
one line was white: green, white, green, white, green, white.
Yeah.
So we call those green bar reports.
Sorry.
Yeah.
I say that my kids are like, what are you talking about?
I'm like, yeah, we program computers on cards.
Yeah.
They're like, what?
Punch cards.
Yeah.
Fortran cards.
Yeah. Yeah. And you run your cards through and you see if your program
worked. You'd slide them in this wooden slot and the guy'd hand them back to you and tell you if
your program worked and give you your green bar report. Those were the days. Yeah. We would grab
other guys' cards and put them out of order. If you changed the order, you messed them up.
That's right. That's actually right. They had to be in order. That's exactly right. Boy, man. So the computer business, you and I have both been lucky to stumble
into that business in the 80s. And what a blessing that has been. For me, I left selling
technology about 11 years ago now and got into the data center business. So tell me, did
you leave electrical engineering and go out to
be a sales guy and get right into it? I mean, was it cooling? What were you all selling?
When I was coming out of college, I was interviewing with numerous companies. Liebert at the time
was one of the companies I interviewed with, and I had several job offers. Liebert was
the one that was really the technical sales job versus engineering jobs.
And interestingly enough, I won't make this a long story, but my dad was an HVAC mechanic,
and he was familiar with Liebert Equipment.
And he talked highly of the company, knew more about it than I did at that time.
He said, this is a really good company.
You should really look into it more.
But long story short, I ended up hiring in with the company
out of college.
In those days, when Liebert was founded in 1965,
it was all computer room cooling.
And in the late 70s, early 80s, Liebert
was expanding into power conditioning and uninterruptible
power supply systems.
And so at that time, because we were growing dramatically, and we primarily went to market through manufacturers' reps, Liebert decided that they would recruit and train engineering
students coming out of college. And after they trained them through a four to five month program,
they would make graduates of the program available to the reps to hire.
And that's how we partly took on the ownership of building out the sales force and training them to sell power equipment.
So I came into the company through that program.
I was fortunate that after I came out of the program, I was placed in our Columbus office, which was a factory-owned office versus an independent manufacturer's rep.
That kept me within the company, so to speak.
It started my career where I've had numerous career advancements,
but always in some type of a sales or sales management role.
Well, I'm not trying to date you,
so you can punt on this if you want,
but I'm going to guess this is the late 80s, when you're coming out and going to work for Liebert?
1986.
'86, okay. And so they started out cooling data centers. I can't tell you how many computer rooms. And I tell people, you know, we called them DP rooms, the data processing room. Yeah. And there'd be a big lever with a black handle on it that you would turn.
Things were huge, right?
Size of six refrigerators,
and it would sit in the corner and make god-awful noise,
and we'd cool the rooms to 60 degrees.
I mean, it just sounded like you had a hurricane
sitting over there.
But we made them just crazy cold.
And I'd love it if you would,
because this is something I think,
I mean, you
were cooling data centers, you were cooling data processing rooms, computer rooms, whatever they
would be called, you were cooling those things in the 80s. We used to think you had to keep the room
freezing. Will you take the folks that listen to us, I mean, with your experience and your exposure,
how have we changed the idea of heat rejection in the data center? What were we
doing then versus how we do it now? And will you talk us just a little bit
through that journey? Sure. And I think part of the change there is the
technology itself on the data center side, right? So you mentioned the green
bar reports. Back in those days, we didn't call it a data center, we called it a
computer room. And we were keeping everything,
generally the normal conditions were 72 degrees
and 50% humidity, give or take, right?
But it felt like sometimes it would be 60 degrees in there.
Well, if you walked by the Liebert,
it was colder than that.
Yeah, absolutely.
Because you were getting hit with all the air, yeah.
And I can remember being in many, many data centers
where a lot of the operators worked
in the data center.
And you'd have ladies that would be wearing sweaters and they would change the set points
to try to keep themselves warm.
Just like today, right?
We just let them go in there and change the set points.
Yeah.
But you think about, like I mentioned, humidity control.
So back when we had all those dot matrix printers and everything,
if the humidity was too low, you'd get static.
If the humidity was too high, the paper would jam.
So we really had to keep pretty tight tolerances on humidity.
And then you had spinning tape drives and, you know, a lot of other
rotary type devices in the data center.
Much more analog style in that direction
Yeah, yeah, analog equipment. And then back, you know,
I started back in the mainframe days, like the 3081s, the 3090s, they were provided water cooling.
Which, it's interesting how now the industry is coming back around.
Which I'm sure we'll probably hit on that when we talk about liquid cooling.
Yeah, yeah. We'll spend a minute on that.
Yeah.
But you're right.
I mean, you think about where we're at today where we're trying to be more green and sustainable,
and now we're running much higher temperatures and much higher delta Ts.
So we're trying to drive supply air temperatures as high as we can to make the equipment more energy efficient.
So, Dave, you alluded to the fact that the technology in the room has changed, right?
We're not doing dot matrix printers or we're not hanging tapes anymore. So,
thankfully, those days are behind us. But isn't part of why we're running the rooms at warmer
temperatures not just that it eats up less energy and it's more efficient, better for the environment,
costs less money, but aren't we comfortable
that the mean time to failure wasn't materially changed?
The difference between running a room at 68 degrees
and 76 degrees, it didn't have a material impact
on mean time to failure, and we as an industry,
over years, got comfortable that we didn't have to be down that low.
That's a dumb guy's version of it, is that right?
You are 100% correct, yes.
And we follow a lot of the industry recommendations and standards, from ASHRAE in the IT industry,
NEBS in communications, and a lot of those standards have changed over the years.
And they're a lot more tolerant now of the operating windows, so we can operate at
wider ranges of temperature and humidity, understanding that it's not going to have a negative
impact on the reliability or failure rate of the equipment.
Yeah. So I have a degree in basket weaving from an SEC school.
You have a degree in electrical engineering
from a distinguished school like Cincinnati.
You brought up Delta T,
so we're gonna let you take a crack at explaining Delta T
because we have had this conversation
on the podcast a couple of times.
I explain to people it's where you put two tailbacks
on either side of the quarterback,
and then that way they can fake how he hands it off
and you don't know which way it's going.
And that has not resonated with most of my audience.
So I'm gonna let you handle Delta T
and I'll just stick with the wing-T.
Okay, so I'll describe.
I love your analogy,
but maybe I should run a single wing offense instead.
Yes, yeah, probably better.
Yeah.
So I guess the way I would think about it,
here's the way I can describe Delta T.
If you're, in terms of cooling, you've got the supply air,
what we're pumping out into the room,
which is the temperature we wanna maintain the space at.
And as that air is passing through all of the IT load,
it's absorbing that heat
and returning it back to be conditioned.
Just as an example,
if I'm delivering 75 degree
air into the space and I'm returning 95 degree air, the delta T is 20 degrees. It's the difference
between the return air temperature and supply air temperature. I'm guessing, and again I will
demonstrate that I went to an SEC school and studied basket weaving, you can manage that
spread on either end, I'm guessing. You could push the cooling temperature lower.
You could push more air in, I'm guessing, increase the CFM, and that would help your
intake air.
There's multiple dials you can play with there, right?
You're absolutely correct.
And unfortunately, one of the things that we find today is many data centers will be
designed at a certain delta T. It might be 20 degrees,
it might be 22 degrees.
And the higher the delta T and the higher we drive the return air temperature, the more
efficient the data center is.
So if I'm comfortable with a 30, and I'm going to say this in non-electrical engineering
terms, if I'm comfortable with a 30 degree delta T, I'm allowing my air I'm pushing into the room to absorb 30 degrees
of heat before it exits the room.
That's a lot.
If I squeeze that down to 15 degrees, I'm saying I don't want that air to absorb near
as much heat before it gets back in the cycle.
And so that's a harder workload, if there's a better way to say that.
Yeah.
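To put rough numbers on that delta T point, here is a back-of-the-envelope sketch (not Vertiv sizing guidance; the 1 MW load is made up, and the 1.08 factor is the standard sensible-heat constant for air in BTU/hr per CFM per degree F):

```python
# Back-of-the-envelope sensible-heat math for the delta T discussion.
# Q (BTU/hr) ~ 1.08 * CFM * delta_T(F) for air near standard conditions,
# and 1 kW ~ 3412 BTU/hr. All numbers are illustrative, not design values.

BTU_PER_KW = 3412.0
SENSIBLE_CONSTANT = 1.08  # BTU/hr per CFM per degree F

def airflow_cfm(load_kw: float, delta_t_f: float) -> float:
    """Airflow needed to carry a given IT load at a given delta T."""
    return load_kw * BTU_PER_KW / (SENSIBLE_CONSTANT * delta_t_f)

load_kw = 1000.0  # hypothetical 1 MW data hall
for dt in (15, 20, 30):
    print(f"delta T {dt:>2} F -> {airflow_cfm(load_kw, dt):,.0f} CFM")

# delta T 15 F -> ~210,600 CFM
# delta T 20 F -> ~158,000 CFM
# delta T 30 F -> ~105,300 CFM
# Widening the delta T moves the same heat with far less air, which is why
# driving return temperatures up saves fan energy.
```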
When you think about the rooms that are designed and then how they're actually operated, if you're not at full load and
if you're over provisioning the air, meaning I'm delivering more air than I
need, then that air is going to return back to the unit, instead of at 20 degrees, at maybe 15 degrees.
So you're providing more energy in moving air, you know, with fans than
is really needed.
So with controls today, we want to separate
the control of the fans that deliver just the right amount of air so that we're getting that
Delta T, and we want to separately control the capacity of the unit so we're not overcooling the
space. We want to maintain a constant supply air temperature and regulate the fan speed to maintain
a constant Delta T. And we can dial in those fans pretty meticulously today, can't we?
Yeah, because they're variable speed fans.
Yeah, right, right, right.
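A minimal sketch of the control split Dave describes, with fan speed chasing a delta T target while cooling capacity chases supply air temperature. The setpoints, gains, and function names here are hypothetical, not how any Liebert or Vertiv controller is actually tuned:

```python
# Toy illustration of decoupled controls: variable-speed fans hold delta T,
# cooling capacity holds supply air temperature. Setpoints and gains are invented.

SUPPLY_SETPOINT_F = 75.0   # constant supply air temperature target
DELTA_T_SETPOINT_F = 20.0  # target return-minus-supply temperature

def adjust_fan_speed(fan_pct: float, supply_f: float, return_f: float) -> float:
    """Slow the fans when delta T is below target (air is over-provisioned), speed up when above."""
    error = (return_f - supply_f) - DELTA_T_SETPOINT_F
    return min(100.0, max(20.0, fan_pct + 2.0 * error))

def adjust_capacity(capacity_pct: float, supply_f: float) -> float:
    """Add cooling when supply air drifts warm, trim it when supply air drifts cold."""
    error = supply_f - SUPPLY_SETPOINT_F
    return min(100.0, max(0.0, capacity_pct + 5.0 * error))

# One control step with example readings (illustrative numbers only):
fan = adjust_fan_speed(60.0, 75.5, 90.0)   # delta T only 14.5 F, so fans slow down
cap = adjust_capacity(70.0, 75.5)          # supply slightly warm, so capacity nudges up
print(fan, cap)
```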
A lot different than the Lieberts I walked by in the 80s versus what we are doing today.
Walk us through, how did it go from Liebert to Vertiv?
What happened?
Who got bought?
Who got merged?
What changed? Do you mind us walking through the history of that? Because I know there are some people in our
business that are as old as you and I that would love to hear the connection because I don't think
I could describe it. Yeah, I'd be happy to walk you through that. So Liebert was founded by a
gentleman by the name of Ralph Liebert in 1965. Just again, I'll kind of make the story quick,
but Ralph was a mechanical contractor.
He was very innovative.
He was involved in helping patent the freeze-drying process, very wise HVAC mentality.
And back in the 60s, if you recall, was when the first mainframe computers were coming
to market.
And he was getting involved in these applications where he was doing built-up systems to do temperature and humidity control. And so he got the idea of building a packaged unit that could
do both the temperature and humidity in a single unit. He took that idea and basically started
Liebert Corporation. So the Liebert name came from the founder of the company and that was in 1965. Then I joined the
company in 1986 and about six months after I joined the company is when it
was announced that we were going to be acquired by Emerson Electric and we
operated under Emerson for roughly 30 years. And Emerson had five major
platforms at the time, a conglomerate of very different businesses.
But one of their platforms, they called it Network Power.
There was a lot of synergy between those companies.
So ASCO, which made transfer switches, which is a big part of the data center industry.
You put that on your generator to switch power.
ASCO and Liebert were the two big building blocks of the network power platform.
And they bought a bunch of other bolt-on companies like DC power companies, transient
voltage surge suppression, racks, just other companies like that, that were all
synergistic.
So we had a whole platform that really served the data center and the
telecommunications market.
So as I said, we operated under Emerson for about 30 years.
In 2016, Emerson decided to divest of network power.
I think part of their thinking at that time was, you know, Emerson is very good at being
a best cost producer and, you know, they make things like compressors and garbage disposals and
different things like that.
So they're very good at making a few variations of something and making many, many, many units
over time.
Like think about different models of garbage disposals and InSinkErator, things like that.
Liebert didn't fit that business so well, because every job can be a snowflake.
There's variance.
And also the pace of change was coming on
about the time they sold us, where
we needed to turn over our product platform, our portfolio,
in three years, not 20 years.
So I think maybe there were other reasons that led to it also.
But Emerson decided to divest of the whole
platform.
And so, we were bought by Platinum Equity, a private equity firm, and that's when we
changed our name to Vertiv.
What year was that, Dave?
That was 2016.
Okay, okay.
So, not quite 10 years ago.
Yeah, and we went public then, I think it was either 2019, 2020, right in that timeframe
as well when we went public.
Okay.
That's a super helpful understanding of the journey.
Am I losing my mind or did the Liebert line operate with that name inside of Emerson?
It did.
Okay.
Okay.
I thought so.
Because I was like, I know I've seen Liebert gear long past 1988.
So it was still, the company was just an Emerson company,
but the product line was still Liebert. Correct. And we still have different products that
we'll call it a Liebert something or other. But when you look at the product, it's going
to have the Vertiv badge on it. But it was just in that thermal management,
I think is what you called it, division of Emerson. Yeah, that's correct. Yeah. Got it.
Okay, cool. That's helpful because I did
not know when people would say to me, Vertiv, I'm like, hey, these guys are great. And they're like,
oh yeah, that came from Liebert. I'm like, wait a minute, did what? I did not know the history there.
Okay, so you took the Vertiv name in 16, went public in 19. So you have been in the Columbus
office for four decades despite the changes around you.
So you are an institution, I've got to think,
inside Vertiv slash Liebert land.
As I say it now, I'm in the 1% club.
So one of the amazing things about this company
is the number of people that have worked their lives here.
It's not uncommon to have people retiring at Liebert that
have been here for 45, close
to 50 years.
It truly is a family-
That's pretty awesome.
Yeah, it is.
And it's rare to see, but-
Was Mr. Liebert a Columbus guy?
You said that's where the corporate office is.
Was he from Ohio?
Okay.
I didn't know that either.
Okay.
I think he might've originally been from Cincinnati, but he moved up to Columbus, and he had another business called Capital Refrigeration before Liebert.
Over the years, I always felt like a rookie.
Even when I'd been here 25 or 30 years, you know, I'd look at the guys above me that have been here 40 plus.
Yeah.
Yeah.
But now that I've been here 38, you know, I look around and see who's been here as long as I have or even longer. I'm in that,
I'll say the 1% club now. Those young bucks that are having their 25th anniversary.
Yeah. You're like, welcome newcomer. Well, I actually have a guy that reports to me that just
turned, had his 25 year anniversary last week. And I was kind of joking too about that.
Yeah. I remember when I had my 25th anniversary back in the 90s. It was great.
Yeah.
Pretty awesome. That's good stuff. All right. Well, so appreciate hearing your Ohio history.
Appreciate it. I had no idea. 38 years. That's pretty awesome. Thank you for telling us the
Liebert to Vertiv journey and the Emerson way station in the middle. That's pretty cool. Let's talk a little bit
about a couple more subjects. You mentioned, and I have a lot of fun with this with the
younger people on the team. They go, hey, you know, we're going to do liquid cooling
because, you know, AI eats up so much power. And I'm like, hey guys, we have done liquid
cooling from the beginning. We were liquid cooling computers before anybody did air.
I said, you know, and I loosely explained it to them.
I said, hey, we got away from liquid cooling because, one, we didn't need that much thermal
capacity.
Two, computers and water don't like each other.
And three, it's expensive.
And so we found a methodology that was more
economically efficient, less risky,
and one where you didn't need
the same thermodynamic properties of water,
because you could do it with air.
I said, we started with water, guys.
That's how the world started cooling computers.
We moved away from it.
It's just now that we are producing so much heat
that we can't reject
that heat effectively with just air anymore.
And so all things old become new again.
So you alluded to water.
Talk to folks that listen to us, talk them a little bit through when you started, we
were doing some liquid cooling, still was in the marketplace, and then it went away,
and then we did air, and now it's back.
So talk us through that journey.
Yeah. So as I said, when I started, it was the mainframe days. And for those who are listening
who remember the mainframes, it was primarily a water-cooled unit that the IBMs of the world,
or the Amdahls, or the NECs, they provided that unit that sat next to their mainframe device.
And we would provide a chiller that was in the room.
It looked just like the Liebert CRAC units.
And it was sized to support that mainframe.
And we'd provide the chilled water that ran out into their
CDU, where much like today with the CDUs that we provide,
it's basically a heat exchanger where we're isolating the water, and I think they used
dielectric water that went internal to their mainframe.
And then, we started in probably what,
maybe the early 90s,
getting into more distributed computing.
And I specifically remember around the turn of the century,
right around 2000 is when the introduction
of the 1U pizza box servers came into the market.
Right.
And everybody was talking about 20 kW a rack, right?
I've got these 1U pizza box servers.
Let's go stack them in there.
Right.
I can put 42 of them in there.
That's right.
Exactly.
Right.
They're like, how are we going to cool this?
And we'll rack 42 of them and you can slide a sheet of paper between each one.
It'll be fine.
Correct.
We actually, at that time, right around,
I can't remember exactly, it was around 2000, 2001,
we bought a small company in Mountain View, California
called Cooligy.
And Cooligy was developing cold plate technology
and liquid cooling.
And so we developed a
product that we called Liebert XD for extreme density. 20 kilowatts a rack.
20 kilowatts a rack. And we were pumping a refrigerant around and we
had different types of modules, ceiling mount, in the row, and sitting on top of the rack to cool these, call it, 20, 25 kW, 30 kW
racks. It was very efficient. We ended up probably selling more of those systems due to the efficiency
than because of the need for the high heat load for two reasons. One, even though everybody said,
oh, we're, you know, racks are going to go to 20 KW per rack.
What do you think the average is today in the data center? The average density in a rack today?
It's still not 20. I bet it's 14.
I would say it's maybe around eight.
So we never saw the actual densities get to where the market said it was going.
And then we also-
Can I stick a pin in that for a second?
Absolutely. I think that's true today too. It is.
People are telling us we're going to get, I'm going to have 300 kilowatts of rack.
Okay. We'll come back to that when we talk about what's happening now.
Yeah. Okay. I hear you. Yeah. Yeah. Yeah. Let's come back. We'll talk about that. But the other
thing that, in addition to the fact that the loads weren't there, we also as an industry figured out
we can cool 20kW or 25kW of rack with air.
In fact, we're doing it probably much higher today, maybe 30, 40, 50kW.
I was just saying, it's fairly commonplace if you see a guy tells you he needs you to
cool 25 kilowatts of rack, none of us break out in a sweat about that.
Right.
But back in 2000, nobody thought that was possible, right? So everybody was kind
of pushing the panic button.
We also thought all the computers were going to shut down at midnight too. So that's exactly
right. That's another podcast we'll have to get to, but I'm certain we were
both sitting in data centers on New Year's Eve of 1999.
Well, I wasn't. I don't know what your social life was, but I was.
I was in a data center.
Yeah, Y2K was going to be Armageddon for all of us.
Yeah, well, that's another podcast. We'll get back to Cooligy. Sorry.
But the interesting thing about that acquisition of Cooligy, we literally were 20 years before
our time, but that acquisition allowed us to develop, because we were doing
pumped refrigerant, if we fast-forward to what Compass is deploying today, it's also a pumped
refrigerant technology. It's a little bit different, but it's because of what we learned to do with that
acquisition of Cooligy that later on helped us to develop our, we call it the Liebert DSE, but it's a pumped refrigerant economizer.
And so while we were, like I said, 20 years ahead of
where the market is now going with liquid cooling,
that acquisition really led us to being able to innovate and do some other things with pumped refrigerant.
I gotcha.
Interesting stuff. All right, so that leads us a little bit into,
you said let's talk about what's going on today.
Yeah.
We're all things, you know, are new again,
or there's nothing new under the sun.
I'm not sure which is a better way to say it,
but we're back to thinking about,
hey, I've got to get liquid cooling to the data center.
Yeah.
Enter in the Compass and Vertiv partnership and
us going, okay, my customers are saying, hey, I'm not sure how much
cooling I'm gonna need, but I know I'm gonna need it, so help me plan for that.
And that leads us into our conversation and partnership with you guys so
will you talk us through a little bit of what you saw there? And, I'm not
giving anybody a hard time
marketing-wise, but I'm not crazy about the name CoolPhase Flex because it's hard for me to
say. Let's talk about how that product came about.
Sure. Love to talk about that. And I also have a hard time with that. That name just
doesn't roll off.
It doesn't roll off the tongue. Yeah, exactly.
CoolPhase Flex. So if I call it a DH400, please excuse me, because it's CoolPhase Flex.
It will mean something to me if you say DH400. I'll know what you're talking about. But
that to me is so much easier, but we digress.
So for the audience, let me just back up for a minute before we get into the CoolPhase Flex.
When we started our relationship with Compass, I guess I'm going to say it's six, seven,
eight years ago. Compass at that time was deploying rooftop air units in some of your data center designs, and then you
pivoted to a more efficient technology, which was a heat wheel.
That's right. The Kyoto units. That's right.
Yep. Yep. The Kyoto heat wheel. And Compass was having some operational challenges with
the Kyoto heat wheel, as were some other
compadres. Yeah, so we had this pumped refrigerant technology that we
called the DSE. We had that in different form factors in an indoor split
unit, and as we were having companies come to us trying to figure out
what solutions we could offer to help them with their heat wheels, or replacing
the heat wheels,
we were looking at maybe doing evaporative cooling and different things.
And we realized we've got the technology with the DSE pumped refrigerant.
We just need to make it in an outdoor packaged solution.
So we designed it. We intentionally designed it so it would be an almost a drop-in replacement for the Kyoto
heat wheel.
Same footprint, et cetera.
So again, for your listeners, we've been deploying that technology, Compass has been deploying
it for the last six, seven years from Vertiv.
I was going to say seven years now.
Yeah.
Yeah.
And so it's an air-cooled system, but with the pumped refrigerant economizer,
when it's wintertime and cooler temperatures,
it's not using mechanical cooling,
it's just pumping a refrigerant.
So very efficient.
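As a rough illustration of the economizer idea only (the thresholds and mode names below are invented for the sketch; real units modulate continuously rather than switching at hard cut-offs):

```python
# Simplified picture of how a pumped-refrigerant economizer spends more of the
# year avoiding compressors in cooler climates. Thresholds are illustrative.

def cooling_mode(outdoor_f: float, supply_setpoint_f: float = 75.0) -> str:
    if outdoor_f <= supply_setpoint_f - 20:
        return "economizer"   # pump the refrigerant only, compressors off
    if outdoor_f <= supply_setpoint_f:
        return "partial"      # economizer plus some compressor assist
    return "mechanical"       # full compressor (DX) cooling

for t in (30, 60, 95):
    print(f"{t} F outdoors -> {cooling_mode(t)}")
# 30 F outdoors -> economizer
# 60 F outdoors -> partial
# 95 F outdoors -> mechanical
```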
So now to answer your question about
how the CoolPhase Flex came into play,
back last spring, your team approached us about,
as you're thinking about going into, you know, how do you
convert your data centers to adapt and support a liquid-cooled design? There's a, also, I think
it's more of a perception versus reality, but there's a wide perception in the industry that
if you're going to do liquid cooling, you just have to go chilled water. Right. And I think the
initial request that Compass asked of
Vertiv was, hey, we don't want to have to change our building
design. So could you make a chilled water unit that
basically fits in the same footprint or format of the
building where we deploy the DP-400? And so that kind of
started this conversation
and started the wheel spinning.
In that conversation, the idea came up about,
well, wait a minute, why don't we take the DP-400
and add the components to it to make it a chiller
so that it can either support an air-cooled mode
or make it support a liquid load.
As we started talking through the idea,
your team loved the idea of having that flexibility.
One of the things that is really unique about it
and what we're all seeing in the industry
is this way to adapt to AI,
nobody today can tell you definitively,
well, 20% of my load's gonna be air-cooled,
80% is going to be water-cooled. And on day one, it may be something, it might be 90% air, 10% liquid.
Day four, it might be 50-50. Day 20, it might be 10-90.
The reality is nobody knows.
Nobody knows. And so, as we started talking more and more about the idea of the CoolPhase Flex,
of a hybrid unit that can operate in either mode of operation, everybody realized the true value
of having this solution and the benefit of the flexibility. And so we literally took that idea
back in, like I said, the March, April timeframe, and now your team was
down in Monterrey, Mexico at our manufacturing plant just a few weeks ago,
witness testing the first unit coming off the production line. So we're
starting to ship that product. So from idea to final product was maybe about a
nine-month production cycle, which without our partnership with Compass,
we wouldn't have been able to do.
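One way to picture why the hybrid approach matters, as a toy model only (the 400 kW nameplate and the split percentages below are illustrative assumptions, not CoolPhase Flex ratings): the same unit and the same building can be re-pointed at whatever air/liquid mix the hall ends up needing.

```python
# Toy view of one hybrid cooling unit splitting a fixed capacity budget between
# air-cooled and liquid-cooled load as the mix shifts over time. Numbers are
# illustrative only, not product ratings.

UNIT_CAPACITY_KW = 400.0  # hypothetical nameplate

def split(air_fraction: float) -> tuple[float, float]:
    """Return (air kW, liquid kW) served for a given air-cooled fraction of the load."""
    air_kw = UNIT_CAPACITY_KW * air_fraction
    return air_kw, UNIT_CAPACITY_KW - air_kw

for label, frac in (("day one", 0.90), ("later", 0.50), ("eventually", 0.10)):
    air_kw, liquid_kw = split(frac)
    print(f"{label}: {air_kw:.0f} kW air / {liquid_kw:.0f} kW liquid")
# The unit, and the building around it, stays the same; only the mix changes.
```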
It's a really cool, innovative, collaborative,
listen to the marketplace, pay attention solution, right?
Somebody didn't sit in the lab somewhere and say,
I think I'd like to do this.
Both of us listened to what was happening
in the marketplace, listened to our customers and said, hey, what do you think about this? I think it was a beautiful
combination of market intelligence and collaboration to where, to your point about the flexibility,
I mean, I am out in front of customers routinely telling the story of, hey, I know you want
liquid cooling and I know you can't tell me how much. So don't worry about it. I'm going to do this. I'm going to sell you a DP-400. And
then somebody in marketing elbows me and goes, no, you're going to sell them a CoolPhase
Flex. That's what I meant. It is the answer. So yeah, it's what I meant. CoolPhase Flex.
Yeah, I 100% agree with you. It was true collaboration of us listening to what Compass's needs were and Vertiv coming
up with a concept and then us co-developing it together.
And, you know, we put a UPS in it to make sure that we can back up the pumps for that
liquid cooling load.
I mean, it's got a lot of innovation into it.
And again, probably the best thing, you didn't have to change any of your building design.
Well, and not only did you not have to,
it's interesting, you mentioned the UPS.
It is a well thought out solution.
Because when I sit down in the room with our engineers
and my customers' engineers,
they start rattling off questions
and the team has anticipated so much
about what was gonna be asked.
Can I switch?
How long does it switch?
How am I gonna keep the pumps up when I switch over if I'm on liquid? So to your point about the UPS, there's lots of industry knowledge built into that unit. It is a very, very well thought out and innovative solution.
Yeah, well, man, Dave, this has been awesome. I really, really have enjoyed getting to hear your personal story, getting to hear the Liebert to Vertiv story. I think it would be remiss for two guys who both went to school in Ohio to not end the podcast
without a Woody Hayes story. I mean, so you got to have at least one. I mean, you've been in Ohio
your whole life. You got to have one good Woody Hayes story. The one that I can think of, I mean,
and it's a well-known one, but it's probably always my favorite Woody quote, was
they were beating the hell out of Michigan, I think.
He went for two.
And they asked him at the end of the game why he went for two, and he said, because
they wouldn't allow me to go for three.
And I always thought that summed up Woody, you know?
Yeah, yeah, that's a good one.
So I know I mentioned this one when we chatted beforehand, but I'm going to tell you, so
Woody Hayes, I was in Ohio as my father was stationed at Wright-Patterson Air Force Base.
So I was in the Dayton-Fairborn area.
And this is in my early days.
So I would have been middle school through high school.
I ended up moving before my senior year, but middle school through high school.
So I'd have been, I don't know whatever that is, 11 through 15 or
something like that.
And we would go to the state fair.
And I want to say the state fair was in Columbus every year.
I'm not sure that's right, but I think that's right.
And my father had bought me a t-shirt.
So this would have been in the, as much as I hate to admit it in the seventies.
And it said Woody Hayes, 278 wins, 42 losses, seven ties, and one knockout.
One knockout.
I wore that to the state fair.
I thought it was cute and funny and I followed football, but you wouldn't be amazed by how
many grown-ups stopped me and asked to take a picture with me in that shirt at the state
fair. Oh, is that right? Yeah, that just stood out, you know,
one of those memories as a kid that sticks with you.
I remembered what a big hit that t-shirt was at the State Fair.
One knockout.
And I still remember the play.
Oh my gosh.
Watching that Clemson bowl game, yeah.
Amen, yeah, so for those of you who listen to our podcast
who aren't crazy sports guys,
YouTube, Woody Hayes punches a player
and you'll know what Dave and I
are talking about. It is worth the watch. We'll just say, it's a nice way to say it,
is that Woody was intense. He was intense. Yeah, he was the best way to...
Competitor. A competitive guy. Well, Dave, thank you so much for the partnership that
you have. Thank you for such a creative and smart solution. Super
impressive and love our friends at Vertiv.
Thank you for hanging out with me a little bit and talking with the folks who want to
listen to learn a little bit about the data center.
This has been great.
I appreciate you.
I enjoyed talking to you as well.
So thank you so much.
Awesome stuff.
Thank you.