Not Your Father’s Data Center - Keep Cool with Data Center Heat Recovery Strategies

Episode Date: March 24, 2020

Pedro Matser started KyotoCooling some 15 years ago when he and his colleagues were asked by a data and telecommunications center in the Netherlands to find a more efficient process. “We sat down with a group of people to come up with an energy-efficient solution,” Matser said on this episode of Not Your Father’s Data Center. He and his colleagues ran through their options, including traditional heat recovery, a popular strategy in Europe that saves energy in winter by recovering and storing heat. “When I looked at these techniques, I found you could use these techniques for a data center,” he said. “You don’t want to bring the air from the data center outside and exchange it for fresh air.” Instead, two loops are created, one outside-air loop and one inside loop, to transfer free cooling. “We found the results stunning – in [the Netherlands],” Matser said, “we could save 90% of the energy required to cool the data center.”

In this episode, Matser and Jamie Nickerson, head of electrical and mechanical engineering at HED, joined host Raymond Hawkins to talk about the Kyoto Wheel by KyotoCooling. Nickerson explained how the Kyoto Wheel works. “When you think about a traditional office building, most often, there is a direct air-side economizer to save resources when the outside has cooler conditions than inside,” Nickerson said. As an example, he noted that, when you place hot soup in the refrigerator, not only is the environment making the soup cooler, but the soup is making the air around it warmer. “When you have a data center, you have a lot of equipment generating a lot of heat,” Nickerson said. “We push cooler air into the space, absorbing the heat, then the air stream needs to reject the heat to continue the cycle.”

Transcript
Starting point is 00:00:00 Welcome to Not Your Father's Data Center podcast, brought to you by Compass Data Centers. We build for what's next. Now here's your host, Raymond Hawkins. Welcome again, everybody, to Not Your Father's Data Center. We are going to talk today with Pedro Matzer and Jamie Nickerson. Before they join us, just want to remind you what we're trying to do here at Compass Data Centers and with our podcast is just an opportunity. If we went to dinner, we'd sit down and we'd talk about politics and we'd talk about religion and we'd probably talk sports. And eventually we'd end
Starting point is 00:00:40 up talking about work. And we just hope that you get to join us at dinner and sit down and listen to the work conversation. The things that all of us get to do as a way to enjoy our work interaction, enjoy the people we work with and provide for our families. And we're just grateful to have you listen in with us and engage in the conversation. We are joined by one of the founders of Kyoto and the creator of the Kyoto wheel, Pedro Matzer from the Netherlands. Pedro, how are you today, sir? I'm fine, thank you. Glad to have you. And we also have with us today Jamie Nickerson from HED,
Starting point is 00:01:15 running their MEP engineering side of the business. Jamie, how are you today? Doing great. Happy to be here. All right, guys, thank you both for joining us as we talk about some things and we'll try to explain to our audience today a few of the concepts behind cooling, where water fits in the cooling part of a data center, how the Kyoto wheel works,
Starting point is 00:01:39 just some of the things that I think that smart people like the both of you understand inherently and me and folks that might listen to us might need a little more explanation so if you guys don't mind we'll just jump right in and Pedro we'll start with you if you don't mind giving us a little bit of a history your history and the history of Kyoto and then we'll transition into talking a little bit about the wheel itself yes Yes, thanks. It started about 15 years ago now when we were asked by the Dutch telecom provider and a data center provider in the Netherlands to find a system for them that would be a little bit more efficient than the one they currently had.
Starting point is 00:02:24 They were at that point running at a POE of 2.3. would be a little bit more efficient than the one they currently had. They were at that point running at a PUE of 2.3. And we sat down with a group of people and tried to come up with an energy efficient solution. And by going to all the different possibilities, one the the things we looked at was the traditional heat recovery something that's very popular in europe where normally you save heat energy in winter time by recovering the heat from your exhaust air. And when I looked at those different techniques, I found out that you could use those for a data center where you don't want to bring the air from
Starting point is 00:03:18 the data center outside and exchange it for 100% for full fresh air, but by creating two loops, basically, an outside air loop and an inside loop, and using those heat transfer means for basically transferring free cooling. And the results we had when we did our first calculation were really, we found them really stunning. In Holland, we could save 90% of the energy required to cool the data center. And basically, we went down in those first designs from a PoE of 2.3 to a PoE of 1.3.
Starting point is 00:04:02 And well, that's what people call significant, I believe. That was quite a jump. Yeah, yeah. And, I mean, the first time I looked at the results, I doubted myself and thought I made a mistake in the calculation or something like that. But over and over, those numbers are are running at 90 to 95% free cooling and just requiring mechanical cooling 5% of the year in the northern part of Europe.
Starting point is 00:04:39 And well, that's quite a change to the traditional free cooling systems that were operating at that time, which normally were with dry coolers. And those could only operate during freezing conditions outside. And typically, you would get to 25% of free cooling. So that was quite a jump. So, Pedro, that's a little bit of where the request, Telecom asking you to help with their 2.3. Was that common back in the early 2000s that that would be a PUE of a data center? Yes, I mean the efficiency of the data centers was in general quite low.
Starting point is 00:05:21 Typically you would see, and that was the case both in Europe and in the US, I know from studies on this, that there was 2.5 times more cooling capacity installed in the data center than that was actually used. And all those fans were running, using lots of power and being very ineffective. And that also had to do with the lack of physical separation from the traditional data centers. There was a lot of bypass and the delta T's that were achieved
Starting point is 00:06:03 were quite low. And that's also something we improved, and that way created the maximum efficient data center, basically. Gotcha. So if you don't mind, Pedro, as we go through this, we're going to use terms that, for me as a sales guy, I don't always firmly understand. And if you don't mind, I'll stop in the middle and just ask both of your opinions or
Starting point is 00:06:28 both of your explanations so Jamie Pedro mentioned Delta T do you mind in as simple of terms as smart engineers can do you mind explaining Delta T to folks who may not fully understand what we're talking about when we take Delta T sure and you know with the different systems you're using, sometimes you're talking about delta Ts and different scenarios, you know, with across the rack or at the unit, you know, outside. But what we're talking about here primarily is the temperature rise across the IT equipment. So if you're delivering air at 70 degrees and you're returning it at 90 degrees, you're looking at a 20 degree delta T. And, you know, that has a linear relationship with the amount of air that you need to move.
Starting point is 00:07:11 The higher that delta T is, then the lower the amount of airflow is to, you know, absorb the same amount of heat from the IT equipment. Okay, so if I'm thinking about this as a non-engineer sales guy, if the air getting sucked in the front of my equipment is 70 degrees and it blows out the back of the equipment at 90 degrees, if I want to change that delta T, you are able to do that with a smaller amount of airflow. When you lower that delta T. Okay, increase the delta T, meaning if I want the front end to be 70 and I'm okay for the back end to be 95. So if I have a 25 degree, so an increase in delta T means I'm allowing more heat to come out and not get rejected, for lack of a better term, right? Exactly, yeah. So the temperature rise is maybe a better way to say that, from the air entering the equipment to the air leaving the equipment.
Starting point is 00:08:16 So if I say this as best I can try to understand as a sales guy, is the larger my delta T is, meaning the less hard my heat rejection equipment needs to work. Is that the right way to say it? Is a simple way to think about it? It depends on the system. One of the big things is that you are able to drop your fan airflow down, which then drops the fan speed and can save a significant amount of energy. So CFM, I'm reducing the cubic feet per minute of airflow. Is that what you're telling me?
Starting point is 00:08:48 Gotcha, gotcha. So if I'm on the other side of the curve, so let's say that my delta T is too big and I want to make it go down, is there, again, a sales guy words, I'm sure I'm not using engineering terms, can I make the air colder or make more air flow? Can I do either one? It really depends on the system. Yeah, there are, you know, the temperature of the air does come into play. When you look at purely the amount of airflow needed, that really is a temperature difference or delta T driven
Starting point is 00:09:19 parameter. But the supply air and the return air have much more of an impact on the efficiency of the mechanical cooling system and or the amount of hours that you're able to economize. The higher you let those temperatures go, the more you can economize with outside air because there are more times of the year that the outside air conditions are cooler than the operating conditions in the data hall. So I think that's back to Pedro's comment about being in northern Europe. He's got lots of time where he can use outside air, and so I'm getting very efficient. So I think I got a simple understanding of the more I can use cold air that I didn't have to make cold, the more efficient my cooling equipment can run.
Starting point is 00:10:10 But I'm just trying to get my arms around the notion of delta T. If I'm trying to move cold air over the top of my IT equipment so that I keep it from breaking, I can either push more air over the equipment or I can get colder air. And however I'm doing that, whether I'm using outside air or whatever, those are my two options, yes? Yes. Okay. All right. I don't know if that helped anybody. It helped me. It made me understand better.
Starting point is 00:10:35 All right. Well, Pedro, do you mind going back to the beginning? You walked us through how the Dutch telecom asked you, hey, we've got issues with the efficiency of our cooling. You took them from 2.3 to 1.3. Can you give us a little bit of insight into the early days of Kyoto and how the wheel came about a little bit? We'd love to hear how that started. Yeah, when we came up with the idea using heat recovery equipment, which basically the wheel in its origin is. I looked at all the different means of see what would give the best results,
Starting point is 00:11:26 what is most scalable and also what is most controllable. And the wheel turned out to be our favorite. It requires, especially for the larger volumes and capacities, it requires very little space and since you can control the speed of the wheel and you can control the air volume you to use the wheel as our solution for for the data center equipment we we suggested that to to our Dutch customer Dutch KPN and and they said well build this one and let's see how it works. We did their first data center and immediately it showed the result that we
Starting point is 00:12:34 calculated for so we we did three more data centers for them and then Holland is only a small country so we run out of data centers at that time in Holland. And we were, as the guys who invented it, we had a sort of farewell dinner to the project. And we were sitting together and had the idea. Well, basically, it would be a pity not to do anything with this idea. I mean, if it works so great in Holland, air is the same all over the world.
Starting point is 00:13:10 So let's try to get out there and see if we can sell this idea to more people. So that's when we reached out, found somebody who was really interested in Canada, Chris Fulton. And he helped us starting to get feet on the ground in Canada and in the US. And the first data center we had a PO for was the state of Montana data center for the government. Pedro, what year was that?
Starting point is 00:13:49 That's in 2008, late 2008, just after we started the company. And then we had right after that Rogers in Toronto, a data center for them. And that's when things really get started. And from there, it slowly spread all over the world. We founded a company. The Kyoto Cooling name was something that wasn't there when we did those first projects for the Dutch KPN. But we came up with the name, founded the company, and specialized on cooling data centers.
Starting point is 00:14:36 And that's about 300 megawatts ago, I think. 300 megawatts. All right. Any history behind the name Kyoto, what it means or how you came up with Kyoto? Yeah, the Kyoto name was really based on the Kyoto Treaty in the late 80s, early 90s, the first environmental treaty that was done in Kyoto and the treaty that got out of there to reduce energy consumption, CO2 production was called the Kyoto Treaty. I think at that time it was a much more important piece of guideline for European governments than it was in some other places. And people associated Kyoto and the treaty with efficiency and with energy reduction and CO2 reduction. So that's why we came up with the name.
Starting point is 00:15:54 The difficult thing is that the Japanese think it's a Japanese firm. So you have to explain that it's not the case, but that's how it came. Very good. I appreciate that history. So, Pedro, as you've described it, you're, of course, there in Holland. You talked about moving the product and the company's sales into Canada and then the first deal happening in Montana. All northern climates is based on the concept of what the Kyoto Wheel does. Certainly, the outside air helps with that efficiency.
Starting point is 00:16:27 Is that where Kyoto started because of the strength? And I know you have installs all over the world now, but is that why you started in the north? Because it was the easiest fit for the solution at the beginning? Yeah, yeah. I mean, obviously the projects in Holland were started there because we were living there and that's where the customer was. Chris Fulton, being a Canadian, started out in his own region.
Starting point is 00:16:55 So that's the reason those first projects were also in the northern part of the US and in Canada. The nice thing about the other cooling is that once I started to do calculations across the world, you see the efficiency is getting less when the climate is getting hot but you still have enormous savings due to the fact that people associate the certain places like for instance we we did some stuff in saudi arabia and everybody understands saudi arabia is a very hot climate, a lot of sunshine and warm temperatures, but that's during the day. So in your mind, it's a region where you say free cooling isn't possible. If you do the real calculation, then you find out that there's more than 50% of free cooling capacity still inside the Kyoto wheel in Saudi Arabia. And that's also how we did our projects in South Africa, in Australia, in South America, where we achieve a little bit less than what you typically see in the Northern hemisphere, but still enormous savings compared to the traditional equipment.
Starting point is 00:18:31 Jamie, as we talk more and learn more about what the Kyoto wheel does and how it works, can you take a few minutes and walk me through, when I think of this air, using air instead of compressors, I think of directly blowing cold air into a space, and this I think you told me is the right way to think about this, is this is an indirect heat exchange. Can you talk with me about what you mean by that and what's going on and why it's better? Sure. So when you think about a traditional, say, office building,
Starting point is 00:19:04 which is probably what most people are familiar with from kind of an HVAC system, and it has a very basic compressor cooling system. or times of the day that the outside air conditions are cooler or have less enthalpy, which is the actual dry bulb temperature and humidity conditions, than the inside conditions. So during those times, you will open a damper and use outside fresh air directly into the space to cool it and to turn off the compressors. Typically, as long as there's not a fire or some other condition outside that would cause that to be a concern, that is very acceptable and an easy way to save energy in this kind of environment. However, in more critical applications, when you get into really anything that's mission critical, whether it's chip fabrication or healthcare or data centers where air quality is of a little bit of a higher concern, that becomes a challenge, both from particulate in the air, whether it is dust or smoke or pollen, or even the fact that
Starting point is 00:20:18 the humidity and temperature varies more quickly outside. And so even temperature control becomes a little bit more of a challenge where your systems are trying to keep up. The indirect methods of economization provide a little bit of a buffer for that. They provide a separation between the quality of the outside air and the inside of the data center. And they also let you handle the fluctuations in temperature and humidity much better. Pedro, I'm going to come back to you. As I look at, and again in a non-engineer sales guy view of the world, I go look at the Kyoto wheels that we install in our customers' data centers and I will say the wheel looks to me like a radiator, just a simple sales guy view of
Starting point is 00:21:01 the world. And when I think of a radiator, I think of air or fluid passing through the radiator and changing temperatures just in a simple understanding. Is that right? And if it is, great. If not, tell me how I'm wrong. And if you don't mind taking me through a couple minutes of how that heat exchange actually happens. Yeah, it basically is a big round radiator. And typically in a radiator, you normally have inside the radiator you have the hot water or cold water wheel that transfers the heat from one side of the data center heats up the the aluminum honeycomb
Starting point is 00:22:09 while passing through it and the wheel slowly rotates from the inside air so from the data center air to the outside air section where cold outside air flows to the radiator and takes away the heat that was brought there by the service, cools down the L-minimum honeycomb while it rotates back into the airstream of the data center air, at that point heating up from the warm air returning from the data center, and by that cooling down the air temperature. And that's a continuous process that we'll slowly moving. And it's when I talk to engineers, they correct me and they go, no, Raymond, we're doing heat rejection. And you used that term when you were describing the wheel. So Jamie, can you tell me the right way to think about that? I think of
Starting point is 00:23:34 the notion of let's just make it colder. And I know I'm not saying it right. So what is heat rejection? Why do we say it that way? And what's the right way to think about it? Sure. I think a good way to think about it, when I go back to probably when I was in high school and going through physics, and the first time I really thought about heat transfer type stuff, is when you put a hot item in the refrigerator, it's almost better to think about not the refrigerator making that item colder, but that item, that soup or whatever it is, is actually making the air around it in the refrigerator warmer, and then the compressor is working to cool that down, right? So that soup in that case is projecting its heat to the cooler, lower potential space. So when you
Starting point is 00:24:18 think about a data center like this, you have your IT equipment in the space that needs to be cooled, and it is generating a lot of heat. We the space that needs to be cooled, and it is generating a lot of heat. We are pushing cooler air into the space that is getting sucked in through the cabinets. Sometimes it's pressure-driven. A lot of times they have their own fans in there. They're pulling that air across. It is absorbing the heat first from that equipment, and then now that that airstream has absorbed the heat that same airstream needs to reject that heat to be able to continue that cycle and to absorb again so you know it's um it's a process of absorption and rejection so once that air or water whatever your
Starting point is 00:24:56 medium is has been heated up it is it is sent back to the equipment that you're using to do the cooling and you're using some method to basically extract the heat or energy from that medium and reject it to the outside air. Okay, so when I hear the word heat rejection, we're talking about pushing that hot medium out of the space. That's what we're talking about when we say heat rejection. And then, and I never, they never let me in physics class, so I'm going to say it more simply. So if I've got a hot processor, that processor is cooling down on its own, right? The heat is coming off of that processor. And what we're doing is we're moving that hot air.
Starting point is 00:25:38 So if I had 70-degree air next to a processor that's running at 120, as that processor throws off its 120 degree air it's heating up my 70 degree air. Correct. That's what's happening. Yeah it's rejecting its own heat. It's rejecting its own heat. It's making my air hotter and now I've got to get my air out of the room to reject it out of the room. Exactly and that's when I went to the refrigerator thing where it's really that hot item that's making the surrounding air hotter and that other system is just working to recool its own air back down. Right. And in a closed loop in a refrigerator, it's just forcing to make
Starting point is 00:26:12 all of its air cooler. In a data center, I'm pushing the air out of the room. That's the term. I'm rejecting the heat. I got you. All right. Because I hear the word heat rejection and I don't quite get my arms around what we're saying there. That helps a bunch. This might be better as a vlog, you know, in the future so we can draw pictures of wheels. But Pedro, I'm sorry, I'm going to come back to you on the honeycomb and on the wheel. So first question, were you so smart that the first time you did this, you go, okay, guys, what we need is a big slow moving wheel in a honeycomb made out of aluminum? Or was there lots of trial and error to get to that? No, basically after doing the calculation, I mean, I worked for the last 30 years using wheels
Starting point is 00:26:55 for HVAC applications here in Europe. And so I was very familiar with the wheel and what you could achieve with it. Just never imagined using it for a data center in a recirculation loop. That was basically the only new thought about it. we had a lot of try and error situations by trying to make it more efficient and what would be the best way to the best speed to drive it the the the the the best layout for sizing the wheel size compared to the capacity and the airflow you want to get out of it there was there was quite a bit of trial and error in it the system all the systems we built worked basically without any problems but through the years you learn how to get the maximum efficiency out of your investment
Starting point is 00:28:06 and how to scale things in the best way and also how to control things in the best way. I mean, certainly in the early years, the controls were quite basic, and that's something we really developed into a state-of-the-art way to make sure you have the highest possible availability also from your controls and from the whole Kyoto community. So if I understand you right, Pedro, you were using the wheel in other applications, and what was unique about Kyoto was to use it in the data center space. Is that the right understanding of how you're describing the genesis of this solution? Yeah, yeah. Typically, the HVAC applications bring in outside air, bring that into the building
Starting point is 00:29:10 and do heat recovery with the wheel. And what we did is the outside air goes through the wheel and goes outside again, and the data center air goes to the wheel and goes outside again, and the data center air goes through the wheel and goes into the data center again. So in the large picture, there is two separate airstreams that don't mix. And I think, Jamie, this is what you were talking about, about direct air and air-to-air. So the air in the data center, which we care about particulates and humidity, stays inside that, for lack of a better term, a closed loop. And the outside air is doing its job, but it's not getting blown into the data center. Is that right?
Starting point is 00:29:59 That is correct. Okay. I want to just make sure I understand that right. So next question, and I'm going to go down a little bit of a different path because when we think about Kyoto and we think about cooling, you know, one of the things I wanted to talk about because at Compass we're big fans of Kyoto and not having compressors and also not having water. As I hear you describe it, Pedro, in the early days, this air-to-air heat exchange, you know, named, I did not know, named after the Kyoto Accord, I think was the name of that treaty. So very much environmentally aware from the very beginning, green. And when I think about, when we talk with customers about why we think this is the best answer, we talk about the reliability of the system because of the lack of
Starting point is 00:30:42 complexity. It doesn't have 7,000 moving parts. It doesn't have things that compressors and other things that can break. We like it from that regard, but we also love the sustainability part. So I'd love to switch the conversation a little bit, guys, and talk about the fact that we don't use water, why we don't use water, some of the problems that get introduced with systems that have water, and just the challenges of cooling, whether it's evaporative cooling or chilled water loops. So just some of the ways we've cooled data centers before on the water side. Jamie, if you'd take a few minutes and give us what are some of the water solutions
Starting point is 00:31:14 and why we think this air-to-air heat exchange might be superior. Sure. And I think first I'll kind of talk about at a high level some of the considerations when you do look at using water. You have availability, you have reliability, and then I think you also talked about sustainability. So they're all interrelated, but they are a little bit different in the way that you think about them. Obviously, water is not a widely available or cheap resource in all parts of the world. And that is just a reality and also the reality that it is becoming harder to come by or rely on, which kind of brings us to the reliability aspect that with most of these data centers, the primary reliability aspect really goes back to the electrical power. It's your utility power, your batteries, your UPS, you have your generators. And as long as you're providing power to all the mechanical cooling equipment and all the electrical equipment, all of that is staying online and operating even through failure scenarios, loss of utility power. When you bring water in, now you have another reliability study or consideration
Starting point is 00:32:29 because that is an entirely separate utility that is not really interlocked with the electrical system. So now you need to look at are there any times of the year that that local jurisdiction is going to cut the water off? What are the water quality issues? Has this area had or is expected to have any incoming drought scenarios, which is also becoming a lot more common, even in areas that haven't typically had that concern? And then from a sustainability standpoint, that really goes back to water usage, even if it's available. Are there project concerns or local concerns or just, you know, company goals in terms of reducing water consumption? I think if you look at like a cooling tower scenario, which is one that uses about the most water for any of the
Starting point is 00:33:19 different systems out there, that's like typically 7 million gallons of water per year per megawatt and so when you start looking at these campuses in a hyperscale level of you know two three hundred megawatts that is a lot of water so I think that that that is where you know that the conversations become critical in our industry for water usage and then in terms of how we use it, there's really a lot of different systems. And it starts the most water, like I said, it's kind of going up to the evaporative cooling tower that is used to cool a water cooled chilled water system that's then used to provide chilled water to the crawl units as opposed to using something like the Kyoto wheel or compressors in that in that sense and
Starting point is 00:34:08 That's the most water there are Scenarios in between where we use, you know direct evaporative cooling where we're using outside air and we're misting essentially You know water under that for adiabatic cooling. We also have another method where people use various indirect evaporative cooling methods, again, where there's air quality concerns, and that kind of separates that. But in all cases, you're using water. And even if you're reducing it by 80% by going with a direct evaporative method as opposed to a cooling tower, that's still 80% of 7 million gallons per year per megawatt. So it's still a very large number.
Starting point is 00:34:48 So I want to try to restate it just to make sure that I understand it. So the first point I think I heard you made is we've got reliability, we've got sustainability, and the reliability and availability issue, we already recognize that from a power perspective, right? We've got generators, and we've got dual dual power feeds and we've got substations. We all get that part in the data center business. But by adding water to your data center, it's a whole separate subsystem that we've got to depend on the utility to provide us from an availability perspective. That's what I think was the first point, right? Absolutely. And so, yeah, you're looking at the, like I said,
Starting point is 00:35:24 the reliability of that water source. You're looking at the, like I said, the reliability of that water source. You're looking at the water quality. What additional water quality measures are you going to have to have chemical additives or different things to that? And then you also have to look at potential for water storage. If water is utilized in your primary cooling and heat rejection for the site, then that realistically cannot go down or you have lost your my reliabilities you're yeah you're one of you're not one of your one of your nines right you've lost that or maybe a couple at that point so the um the reality is at that point you're needing to store water you're needing to treat that water you're having to pay to to um to have
Starting point is 00:36:01 a tanker and space for that to store it and, and figuring out how long that's needed for. Usually that's a minimum of 12 hours, 24 hours, whatever that's decided for uptime or whatever your current agreement is with a particular client or tenant. So that becomes a cost and space issue. Yeah, land utilization, storing water, it's not small and easy to store. Just like I have belly tanks that make my generators run, and people want to know how much fuel can you put in there
Starting point is 00:36:29 Just like I have belly tanks that make my generators run, and people want to know how much fuel you can put in there and how long that thing is going to run, I've got the same consideration, but now I have it with this heavy element of water: where am I going to put it? It's going to take lots of space. It's going to be super heavy. It's going to eat up a lot of the land that I use, as well as now just availability of water. Can I get the kind of water I need? Am I in an area where I've got water? So from an availability and sustainability standpoint, that makes a lot of sense. I appreciate that.
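The storage problem described above (riding through a utility water outage for 12 to 24 hours) can be sketched the same way generator fuel is sized. The 12- and 24-hour windows come from the conversation; the peak consumption rate, the function name, and the example site are assumed for illustration only.

```python
# Rough on-site water storage sizing for evaporative heat-rejection ride-through.
# The assumed consumption rate is illustrative; real rates depend on climate,
# load profile, and the specific cooling design.

GAL_PER_CUBIC_FOOT = 7.48  # US gallons in one cubic foot

def storage_gallons(mw: float, gal_per_mw_hr: float, ride_through_hr: float) -> float:
    """Gallons needed to ride through a water outage at peak load."""
    return mw * gal_per_mw_hr * ride_through_hr

# Example: 10 MW site, assumed peak of 1,500 gal/hr per MW, 24-hour requirement.
gallons = storage_gallons(10, 1_500, 24)
cubic_feet = gallons / GAL_PER_CUBIC_FOOT
print(f"{gallons:,.0f} gal, roughly {cubic_feet:,.0f} cubic feet of tankage")
```

Even under these illustrative assumptions the tankage runs to hundreds of thousands of gallons, which is why the hosts frame it as a land-use and weight problem, not just a cost problem.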
Starting point is 00:36:56 I'm going to ask for one more clarifying point. So the first one was, it's a whole nother layer that challenges my availability and my reliability. The second point, I think, Jamie, is it just uses tons of water. At the high end, I think you said 7,000? About 7 million per one megawatt. It really depends on the system. 7 million gallons. Oh, my goodness. And then evaporative cooling, just want to make sure I've got my arms around that.
Starting point is 00:37:18 In a simple concept, I've got hot air coming in on one side of my airflow, and I mist water over the top of it. That hot air causes the water to evaporate. The heat exchange of evaporation drops the temperature of my air and cools my air. Is that what I'm doing? Correct, but you have to have the availability for the air to absorb water. So again, that limits you in certain locations because, you know, in many locations, the times that it's hot, it's also humid. And so the wet bulb depression you get, the availability you have to absorb that water, is not always there. Certainly in certain extreme cases you can look at it. But then when you look at it on a bin level, around the whole annualized hours, that
Starting point is 00:38:06 becomes a challenge. And your peak efficiency can look very high, but there are a lot of times of the year that you're not able to get a lot out of it. So for those of you who aren't listening in Texas, Houston would not be a great place to run evaporative cooling, right? It is not traditionally a great place for that. And Arizona could be great, if I think about it. Is that a fair example? That is correct. But the interesting thing that goes along with that: a lot of places use evaporative coolers, or you may hear them by the Home Depot term, the swamp cooler, right? Those work on kind of the same thermodynamic principles. But the places where those work the best, the water availability or quality tends to be the worst, because it is naturally dry in those locations.
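The wet bulb depression Jamie mentions can be sketched with the standard direct-evaporative-cooler relation: leaving air temperature approaches the wet-bulb temperature, scaled by an effectiveness factor. The 0.85 effectiveness and both climate examples below are illustrative assumptions, not figures from the episode.

```python
# Direct evaporative cooling sketch. "Wet bulb depression" is
# (dry_bulb - wet_bulb); a humid climate has a small depression,
# so little cooling is available, which is the Houston-vs-Arizona point.

def evap_leaving_temp(dry_bulb_f: float, wet_bulb_f: float,
                      effectiveness: float = 0.85) -> float:
    """Leaving air temperature (deg F) for a direct evaporative cooler.

    `effectiveness` is the fraction of the wet bulb depression the cooler
    captures; 0.85 is an assumed typical value, not from the episode.
    """
    return dry_bulb_f - effectiveness * (dry_bulb_f - wet_bulb_f)

# Dry climate (think Arizona): large depression, lots of cooling.
print(f"{evap_leaving_temp(105, 70):.2f}")  # prints 75.25
# Humid climate (think Houston): small depression, little cooling.
print(f"{evap_leaving_temp(95, 80):.2f}")   # prints 82.25
```

Note the irony the speakers land on: the dry-climate case cools 105 °F air by almost 30 °F, but those are exactly the places where water is scarcest.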
Starting point is 00:38:52 Right. Yeah. So ironically, where it would work the greatest, I'm probably in a desert, and I have a hard time getting water there anyway. Right. It's scarce or expensive. I understand. I got it. Pedro, so tell me if this is a fair way to characterize it. As you came up with the idea of using the wheel in the data center, and even with the name, choosing Kyoto, is it fair for me to say that you were green before green was cool? Yeah. I mean, when we started, we wanted to be more efficient, and being good for the environment usually costs money. Here we came up with an idea that saves us money, and we're green at the same time. So that really was the driver for this.
Starting point is 00:39:59 A nice win-win. I often find that if you do the right thing for the right reasons, it usually ends up being the right thing on more than one level, so I love that that worked out with Kyoto. Love that that was the motivation behind even the name, and that it's helped lead our industry in a direction where it needed to go. Because I think at times, guys, our industry gets a little bit of a knock for being big users of electricity and also a big knock for being big users of water. And I personally think, if you look at the measure of the amount of compute we're delivering versus the amount of energy we're burning, we're actually dropping that number. And our industry is headed in the right direction. Yes.
Starting point is 00:40:38 And the other thing I think is that the world looks at our industry and says, oh, those bad data center people. And they don't connect the dots that everything you do on your phone is happening in our data centers. They're making that comment on an article, on their phone, and that phone is running in our data centers. There's some irony in our business. Well, guys, I'm super grateful that both of you joined us. Pedro, the fact that you joined us all the way from the Netherlands is awesome. At the end of each one of our podcasts, we've got a bit of a tradition started
Starting point is 00:41:06 where we give away fabulous prizes to our guests. And here we are on the 12th of March. What we've decided is that both of you can attend any Major League Baseball spring training game you want in the next two weeks, or any NCAA tournament basketball game you want in the next two weeks, all expenses paid by Compass Data Centers. So, yeah, we're happy to host you guys at either one of those. I appreciate that. Very generous.
Starting point is 00:41:34 And I apologize, Pedro, if you don't get the joke that all of those events have been canceled here in the U.S. due to the virus. As they are in Europe. And I'm not allowed to enter the U.S. anymore, so I would have a hard time collecting this prize. Thank you very much. Thank you so much for joining us. Even though
Starting point is 00:41:55 you can't join us for spring training baseball, we'll all go home and watch Netflix and hopefully survive this. Thank you guys for joining us, and look forward to seeing you out there in the marketplace. Thank you guys. Thank you. My pleasure.
Starting point is 00:42:06 Thank you.
