HP Discover: HP Launches Apollo Water-Cooled HPC Servers
HP uses warm water to deliver efficient HPC systems
HP has launched an energy-efficient supercomputer that uses warm water for cooling and boasts an enviable efficiency rating, along with a lower-cost, modular, air-cooled alternative.
With the Apollo 8000, users of high-performance computing (HPC) can save up to $1 million in energy costs over five years, because the system needs less cooling and produces usable heat for warming office space. The Apollo 6000, featuring a dense chassis-based architecture and advanced power management, also got a space-age launch at HP Discover in Las Vegas.
Reach for the moon?
The Apollo 8000 was developed for the US National Renewable Energy Laboratory (NREL) and has been turned into a product which HP claims is the first to be 100 percent cooled by water.
For primary cooling of the electronics, the system uses standard “heat pipes” – sealed metal tubes containing alcohol under pressure. These devices, sometimes used inside laptops, carry heat away very quickly by phase change – evaporating at the hot end and condensing at the cold end.
The heat pipes give up their heat to a “heat bus” (a metal bar) along the side of the modules. This transfers heat to a main water cooling circuit in the rack.
This “dry disconnect” design removes the danger of water touching the electronics and causing short circuits, said Antonio Neri, vice president of servers and networking at HP. It also allows the chassis to be dismantled easily: modules can simply be slid out, because heat is transferred across the junction between two metal plates.
Other vendors have used warm water to cool supercomputers, including IBM, which used warm-water cooling for the Swiss Federal Institute of Technology in Zurich. IBM then developed this as a product, the IBM System x iDataPlex Direct Water Cooled dx360 M4 server, which has been installed as Germany’s SuperMUC computer (UPDATED: many thanks to IBM’s Chris Sciacca for reminding us of this in the comments below).
“This system has a PUE of 1.06 [i.e. only 6 percent extra power is required for cooling, beyond that used by the computer kit],” said Bobi Garrett from NREL. “The system captures 90 percent of the waste heat – and we use it for office heating.”
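For readers unfamiliar with the metric, PUE (power usage effectiveness) is total facility power divided by the power drawn by the IT equipment itself. The short sketch below – using made-up power figures purely for illustration, not NREL’s actual loads – shows how a PUE of 1.06 translates into a 6 percent overhead.

```python
# Illustrative PUE calculation (the kW figures are hypothetical, not NREL's actual loads).
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

it_load_kw = 1000.0    # assumed compute (IT) load
overhead_kw = 60.0     # assumed cooling, power distribution and other overhead
total_kw = it_load_kw + overhead_kw

print(pue(total_kw, it_load_kw))                      # 1.06
print(round((pue(total_kw, it_load_kw) - 1) * 100))   # 6 -> percent extra power beyond the IT load
```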
The machine is helping NREL get new insights into areas such as biofuels, through heavy-duty simulation and calculation: “We are more and more dependent on HPC.”
The office heating is not incidental. The heat pipes take heat from the hardware at about 60°C and give it up to the water. HP guaranteed a constant supply of water at 40°C, so NREL could avoid buying other heating equipment.
Bizarrely, this means that in the unlikely event the Apollo system is ever idle, NREL would keep it running for the heat it provides.
The Apollo 8000’s little brother, the Apollo 6000, is intended for enterprise use. While it lacks the water-cooling pipes, it packs 160 servers into a rack and relies on standardised HP kit, such as advanced power management and pooled power, so it can use modular chassis.
That sounds impressive compared with the 8000 system’s 144 servers per rack, but the 6000 holds single-socket servers while the 8000 has two-socket servers, so the 8000 offers roughly double the processor density.
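The arithmetic behind that “roughly double” claim is straightforward; the snippet below simply multiplies out the per-rack figures quoted above (server counts and sockets per server come from the article, and nothing else about the two systems is compared).

```python
# Rough per-rack socket density comparison, using only the figures quoted above.
apollo_8000_sockets = 144 * 2   # 144 two-socket servers per rack
apollo_6000_sockets = 160 * 1   # 160 single-socket servers per rack

print(apollo_8000_sockets, apollo_6000_sockets)    # 288 160
print(apollo_8000_sockets / apollo_6000_sockets)   # 1.8 -> "roughly double"
```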
Updated 10 June to clarify the cooling system details.