Liquid cooling is a good idea for data centres. We already knew it was environmentally better, and thermodynamically better – but it seems it’s also got some other benefits that aren’t quite so obvious.
Cooling a server with liquid is better than using air, because liquids are denser and soak up far more heat per unit volume. With air, you have to blow it forcefully across the hot electronics just to remove enough heat, which is a bit like running a fan and an electric heater in the same room at the same time.
Liquids can flow gently with a minimal amount of pumping. They also carry the heat away in a more concentrated, usable form, so it can actually be put to work elsewhere, warming offices or heating washing water.
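Iceotope’s coolant isn’t plain water, but water makes a handy yardstick for why liquid wins. Here is a rough back-of-the-envelope sketch, using standard textbook figures rather than anything from Iceotope, comparing how much heat a cubic metre of water can carry per degree of temperature rise against a cubic metre of air:

```python
# Back-of-the-envelope comparison: heat carried per cubic metre, per kelvin,
# by water versus air. Figures are approximate textbook values at roughly
# room temperature, used purely for illustration.

water_density = 997.0         # kg/m^3
water_specific_heat = 4184.0  # J/(kg*K)

air_density = 1.2             # kg/m^3
air_specific_heat = 1005.0    # J/(kg*K)

# Volumetric heat capacity: joules absorbed by one cubic metre per kelvin
water_volumetric = water_density * water_specific_heat   # ~4.2 MJ/(m^3*K)
air_volumetric = air_density * air_specific_heat          # ~1.2 kJ/(m^3*K)

print(f"Water: {water_volumetric / 1e6:.1f} MJ per m^3 per K")
print(f"Air:   {air_volumetric / 1e3:.1f} kJ per m^3 per K")
print(f"Ratio: roughly {water_volumetric / air_volumetric:.0f}x")
# The ratio comes out in the region of 3,500x, which is why air has to be
# blown hard across hot electronics while a liquid loop can trickle past.
```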
One of the objections to liquid cooling has been that it necessitates more engineering: an extra circulation system in your racks, and containment to keep it in contact with the electronics. It also means you can’t buy bog-standard blades.
But there are ways round this, and when I met one of the leaders in the field, Iceotope, in London recently, I got some good answers. I also heard about the benefits that offset the extra engineering.
Firstly, the liquid circulation system can be made pretty easy to connect with bayonet fixings and so forth. I saw an Iceotope blade unplugged and removed, and it looked like only a minor overhead compared with a regular blade.
Secondly, while it is true that blades for a system like this will be a proprietary design, they are simply casings for standard Intel motherboards, so replacement parts and upgrades are easy enough. The circulating liquid runs in a closed loop outside the sealed unit, which itself holds a special non-conductive coolant around the electronics. Drain that out, and you can replace parts without trouble.
But here’s the extra benefit. Sure, if you are running this system, you have to add a separate set of tubes to your rack, but you get rid of some engineering that is probably far more niggling: you no longer have to worry about airflow.
As things stand, the backs of your servers need constant airflow, and all the cables you have are potentially blocking it. So you need to arrange them neatly and set up blanking plates and curtains to keep the air moving where you want it.
Switch to liquid cooling and now you can run as many wires as you like, and you don’t have to seal up the air gaps. You can build your racks at a higher density.
You also get a quiet rack, and one which can be run in a normal office air-conditioned environment.
eBay’s data centre expert Dean Nelson described it nicely, when he told me why liquid cooling is inevitable, and why it will unleash more of the potential of processors. Right now, you have to rein a processor in so it doesn’t overheat, and the amount of heat you can extract is limited by the blade, the box and the rack that it is in.
“These layers have held back the optimisation of our infrastructure,” Nelson told me. “Liquid cooling will remove the locks on chips, and take away constraints.”
Liquid cooling may take a while to emerge properly, but it looks more and more inevitable as data centre people search for higher densities and greater efficiencies.
A version of this article appeared in Green Data Center News