Silly Season: Electricity From Waste Server Heat?
It’s not perpetual motion, but generating electricity from waste server heat is still a very wasteful way to save energy, says Peter Judge
The slow, hot days in the middle of August are traditionally the time for stories about perpetual motion machines and impossible patents.
This year, it’s a bit different. Perhaps the silly season is less silly because there’s less of that August heat-haze (at least in London).
So this year’s patent story turns out to be a real and important one, as Oracle chases Google for money over Java. And this year, instead of a perpetual motion machine, we’ve got a lesser thermodynamic error … electric power from waste heat.
Let’s be clear about this. Making electric power from heat is not impossible. In many cases, it’s also not foolish. But using thermoelectrics to get power back from waste heat, which is generated as a byproduct from an electrical system, is pretty well guaranteed to be a wasted effort.
Power from waste heat
So when Applied Methodologies pops up, promising to turn waste data centre heat into electricity that can be used to offset the cost of running the data centre, or to power some of the data centre equipment, it is talking about something that can be done. The question is whether it is worth doing.
Thermoelectric generation is already in use. Researchers have proposed it as a way of harvesting energy from the body, and it appeared in the form of phone-charging “power wellies” in Orange’s annual spoof green press release for the festival season.
In the car industry, “BMW has found that reusing the otherwise wasted exhaust heat to power a thermoelectric generator could reduce fuel consumption by as much as 5 percent,” according to Applied Methodologies’ Jeffrey Sicuranza, in a white paper on the site.
The company has indeed built a prototype which produces a 4V supply from the waste heat of a 1U server (seen here powering fairy lights). Applied Methodologies suggests this could be fed into a power management system and used for tasks such as charging UPS batteries. eWEEK ran a story based on Applied Methodologies’ press announcement in 2007.
Since then, little seems to have happened, beyond a few demonstrations, and a YouTube video channel which seems to have prompted a story at Data Centre Knowledge, where writer Rich Miller asks “What if your web server could generate its own power?”
“We believe that the time has come for the IT industry to recognize, utilize and integrate Thermoelectrics to capitalize on the increase[d] demand of computing resources and waste heat generated for power generation in most of its devices,” says the white paper. “Today’s TEGs [thermoelectric generators] are efficient enough to produce usable energy when scaled and integrated properly.”
But what are we talking about here? The waste heat is ultimately generated electrically, and has to be removed to keep the computers at their operating temperature. More energy has to be put in to remove that heat – in old-fashioned data centres, it sometimes takes as many watts of power to remove the heat as it does to run the servers in the first place.
There is a temperature difference between the server and the coolant – air or, as in the IBM supercomputer in Zurich, water. Data centre designers have to remove the heat as effectively and simply as possible. It is possible to use some of that heat: the Zurich computer heats homes, for instance, as does Telehouse West in London.
But there’s a difference between using the heat to warm something else up, and converting heat energy into electricity. The latter is a fundamentally inefficient process. It also has the effect of slowing the rate at which heat is removed from the hot side of the thermoelectric generator – and thereby raising the temperature of the server.
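To put a rough number on that inefficiency, here is a back-of-the-envelope sketch in Python. The temperatures are assumptions chosen purely for illustration, not figures from Applied Methodologies: the Carnot limit caps how much of the heat could ever come back as electricity, and for server-grade temperature differences that cap is small – real thermoelectric modules recover only a fraction of even that.

```python
# Back-of-the-envelope Carnot limit for a server-heat thermoelectric generator.
# Temperatures are illustrative assumptions, not measured figures.

def carnot_limit(t_hot_c, t_cold_c):
    """Maximum fraction of heat that any heat engine could turn into work."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1 - t_cold_k / t_hot_k

hot_side_c = 70.0    # assumed temperature at the server's hot side, deg C
cold_side_c = 25.0   # assumed temperature of the cooling air, deg C

print(f"Carnot limit: {carnot_limit(hot_side_c, cold_side_c):.1%}")  # ~13%
# Real thermoelectric modules recover only a fraction of that limit.
```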
A wasteful way to save
“If you put a heat engine between the heatsink and the CPU, you will generate energy but increase the change in temperature (aka delta-t),” explains Peter Hopton, chief executive of the UK’s green computer maker, VeryPC. “The increased delta-t means that either the heatsink must be much much bigger to compensate (read diminishing returns as it gets bigger) or the air flowing past it must be cooler.”
In other words, if you make electricity from your waste heat, you have to put more energy into your coolers to keep the same cooling effect. “And you’re not going to get all that energy back,” says Hopton, “or you could have perpetual motion.”
Alternatively, you could decide to live with a hotter server. In that case, you could save more electricity, with the same effect, simply by turning your cooling system down.
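Hopton’s point can be sketched with a simple thermal-resistance model – the numbers below are assumptions for illustration, not figures from VeryPC or Applied Methodologies. The temperature drop across anything in the heat path is power times thermal resistance, so adding a TEG module between chip and heatsink means the cooling air has to be that much colder to hold the same chip temperature.

```python
# A minimal thermal-resistance sketch of Hopton's point: put a thermoelectric
# module between CPU and heatsink and the cooling air must be colder (or the
# heatsink much bigger) to hold the same chip temperature. All numbers are
# assumed for illustration only.

chip_power_w = 100.0        # assumed heat flowing out of the CPU, watts
r_heatsink = 0.3            # assumed heatsink-to-air thermal resistance, K/W
r_teg = 0.2                 # assumed added resistance of a TEG module, K/W
target_cpu_temp_c = 70.0    # chip temperature we want to hold

# Required air temperature = chip temperature minus the temperature rise
# across everything between chip and air (delta-T = power x resistance).
air_without_teg = target_cpu_temp_c - chip_power_w * r_heatsink
air_with_teg = target_cpu_temp_c - chip_power_w * (r_heatsink + r_teg)

print(f"Air needed without TEG: {air_without_teg:.0f} C")  # ~40 C
print(f"Air needed with TEG:    {air_with_teg:.0f} C")     # ~20 C
# Every extra degree of cooling has to be paid for by the fans and chillers.
```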
Applied Methodologies provides a few figures. It gets a 4V supply from each server it equips with a TEG, but the current is very low (its demonstrations power LEDs and PC fans). The output is clearly far less than a watt, and Hopton has a good rule of thumb: 1W costs about £1 a year – if you reckon a year is about 10,000 hours and 1kWh (a unit) costs about ten pence.
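Applying that rule of thumb to the prototype gives a sense of scale. The 4V figure is Applied Methodologies’ own; the current below is an assumption (the demonstrations only drive LEDs and small fans), so treat the result as an order of magnitude rather than a measurement.

```python
# Hopton's rule of thumb applied to the prototype's output. The 4 V figure is
# from Applied Methodologies; the current is a hypothetical assumption, so the
# result is an order-of-magnitude estimate only.

volts = 4.0
assumed_current_a = 0.05        # hypothetical: tens of milliamps
hours_per_year = 10_000         # Hopton's round figure (a year is ~8,760 h)
pence_per_kwh = 10

power_w = volts * assumed_current_a                  # 0.2 W per server
kwh_per_year = power_w * hours_per_year / 1000       # 2 kWh per server
value_pence = kwh_per_year * pence_per_kwh           # ~20p per server per year

print(f"Harvested power: {power_w:.2f} W")
print(f"Annual energy:   {kwh_per_year:.1f} kWh")
print(f"Annual value:    {value_pence:.0f}p per server")
```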
The electricity you get back from the system is like the small change you save from your pockets in a penny jar. It is never going to amount to enough to pay for your food or bills – unless you take the very wasteful approach of making purchases purely to fill the jar.
“It’s like paying your bills from a penny jar, when you go out and buy Mars bars intentionally to get change to put into the jar,” says Hopton.