Is Coal Still King Of Data Centre Power?
Do we need coal to power the Internet? Peter Judge thinks that is old-fashioned thinking
How much power does the cloud use? There is a new estimate in circulation – and it’s higher than previous ones – that IT uses ten percent of the world’s electricity.
Green IT people seize on evidence that tech is energy-hungry, but this report came from a source they might not be so keen on. It’s called “The Cloud Begins With Coal”, and the study emerged from the Digital Power Group, sponsored by the National Mining Association and the American Coalition for Clean Coal Electricity.
But don’t let that put you off. It’s a very interesting read.
Energy-guzzling Internet
In fact, it follows a prescient paper issued back in 1999 by the same author – Digital Power Group’s Mark Mills. That paper was called “The Internet Begins With Coal”.
What follows is not a substitute for a critical reading of the latest paper.
The report lines up some facts – including the startling one that the cloud now carries as much data every hour as the entire Internet carried in a year, back in 2000.
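To put a number on that claim: if hourly traffic now matches the yearly traffic of 2000, total volume has grown by a factor of the number of hours in a year. Here is a quick back-of-the-envelope sketch of the implied growth rate, assuming the report’s 2013 publication date as the endpoint (my assumption, not a figure from the report):

```python
# Back-of-the-envelope: implied growth if the cloud now moves as much
# data per hour as the whole Internet moved per year back in 2000.
HOURS_PER_YEAR = 365 * 24        # 8,760
YEARS_ELAPSED = 13               # 2000 to 2013, the report's publication year

growth_factor = HOURS_PER_YEAR   # hourly traffic now == yearly traffic then
annual_growth = growth_factor ** (1 / YEARS_ELAPSED)

print(f"Total growth factor: {growth_factor:,}x")               # 8,760x
print(f"Implied compound annual growth: {annual_growth:.2f}x")  # ~2.01x
```

In other words, the claim implies traffic roughly doubling every year for over a decade.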
And this unbelievable growth rate is not something we can expect to slow down. Every aspect of IT systems – screens, storage, networks and processing – is becoming rapidly cheaper, and demand is growing.
That growth is spectacular for one important reason. Plenty of other things have become cheaper – lighting in particular has fallen in price, and food and transport have at times dropped in cost.
But digital is different. As the report says, “people can only use so many lumens per square foot, eat so much or spend so much time in a car”, so expansion is limited.
IT systems are interconnected, and there is literally no limit to how much you can use. A mouse click might once have shifted some bits on a local machine, and displayed something on a monochrome screen. Now it moves data across continents and switches on streams of information in the form of HD video and sound.
Turning on a light doesn’t require dozens of other lights to come on. Using a smartphone, by contrast, does demand processing elsewhere, because IT can be endlessly interconnected.
Like other recent authors, Mills discusses end-to-end power use, including the networks as well as the end user devices and the data centres which feed them. The energy used in this whole system is larger than some previous estimates, and the report addresses some other areas where power has been – it says – underestimated, such as TVs and set-top boxes.
The report is marred in places, however. For instance, its claim that an iPhone uses more power than a fridge rests on a scenario in which users watch hours of video on their iPhones, over energy-guzzling cellular networks.
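For a sense of scale, here is a rough sketch of that comparison. Every figure below is an illustrative assumption of mine – typical published ballpark numbers – not a value taken from Mills’ report:

```python
# Rough comparison: annual energy of a fridge vs heavy phone video use.
# All figures are illustrative assumptions, not numbers from Mills' report.
FRIDGE_KWH_PER_YEAR = 350.0     # typical modern refrigerator (assumed)

HOURS_VIDEO_PER_WEEK = 10       # heavy-use streaming scenario (assumed)
GB_PER_HOUR_VIDEO = 0.3         # mobile-quality video stream (assumed)
NETWORK_KWH_PER_GB = 2.0        # cellular network energy cost (assumed)

annual_gb = HOURS_VIDEO_PER_WEEK * 52 * GB_PER_HOUR_VIDEO
phone_network_kwh = annual_gb * NETWORK_KWH_PER_GB

print(f"Fridge: {FRIDGE_KWH_PER_YEAR:.0f} kWh/year")
print(f"Phone video, network side: {phone_network_kwh:.0f} kWh/year")
# With these assumptions the network energy (~312 kWh) approaches the
# fridge's; raise the per-GB figure or viewing hours and it overtakes it.
```

The comparison only works if you pick aggressive values for viewing time and network energy per gigabyte – which is exactly the objection critics raised.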
Fossil fuel dinosaurs?
And the report’s big conclusion had me scratching my head. Coal is necessary, says Mills, because it is reliable and can respond to increases and decreases in demand. “Conventional power plants operate 24-7 with 80 to 90 percent reliability,” he says, contrasting this with the variable nature of renewable energy. “In most of the world, the power networks are anchored by power plants fueled by coal, uranium or natural gas.”
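In power-industry terms, the contrast Mills draws is about capacity factor: the fraction of the year a plant actually delivers its rated output. A minimal sketch comparing equal nameplate capacities, using typical published ranges assumed for illustration (only the coal figure echoes the report):

```python
# Capacity factor: fraction of the year a plant delivers its rated output.
# Figures are typical published ranges, assumed here for illustration.
CAPACITY_FACTORS = {
    "coal (baseload)": 0.85,    # Mills' "80 to 90 percent" figure
    "nuclear": 0.90,
    "onshore wind": 0.30,
    "solar PV": 0.15,
}

RATED_MW = 100  # compare plants of equal nameplate capacity
for source, cf in CAPACITY_FACTORS.items():
    mwh_per_year = RATED_MW * cf * 8760
    print(f"{source:18s} {mwh_per_year:>9,.0f} MWh/year")
# A solar farm needs several times the nameplate capacity (plus storage
# or backup) to deliver the same annual energy as a baseload plant.
```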
That’s true enough. However, in some places renewables are doing just fine. Germany now generates around a quarter of its electricity from renewable sources, and Spain is in a similar position.
And one of Mills’ main proof points for the necessity of coal-fired power is somewhat self-defeating. He quotes a Greenpeace paper which ranks data centre owners by how much they rely on coal.
Apple is top of that list. But the list dates from 2012. Since then, Apple has proudly shifted all its data centres to renewable energy. It seems strange that Mills missed that point.