With London’s annual Green IT conference starting this week, we discussed the main issues with Andy Lawrence, research director for eco-efficient IT at analyst firm The 451 Group.
Last week, we found that Green IT is still viable in the recession, even though compliance regulations have no teeth, and the emphasis has shifted towards reducing costs.
This time, we look in more depth at how those costs can be reduced by cutting electricity usage in IT equipment. In particular, how much energy can you save in your data centre, and is the data centre the first place to look for those savings?
And if the IT department really can deliver all this within its own domain, should it start taking control of power use in the rest of the company?
How much energy can we save?
“It is quite clear that most companies are finding it very easy to save energy,” said Lawrence. “They find they are very inefficient, and can save a lot of energy by using new technologies.” The US government has proposed that users should aim for 40 to 50 percent savings, based on the same workload, and trials by Accenture have backed this up, said Lawrence.
“You should get 30 to 50 percent savings if you employ a combination of a few leading practices: virtualisation, consolidation, using energy-efficient servers, and free air cooling, which uses outside air to cool the servers. This last is important: data centres with high efficiency figures are all using free air cooling,” said Lawrence. “Google is building a data centre with no chillers at all.”
But should we be sceptical? If companies can save money, energy, and space in their data centres, won’t they spend those savings on boosting their processing power to gain competitive advantage? Intel may sell processors that can do nine times the work of its older systems, but it clearly doesn’t plan to reduce its shipments by eight-ninths.
“All the quoted savings are relative to workload,” agreed Lawrence. “There are very few cases where overall energy has been reduced and stayed down. People cut, then grow back with new apps. They will in almost all cases increase the workload to take up that energy, but at least they aren’t building a new data centre.”
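Lawrence’s distinction between per-workload savings and total energy can be made concrete with a small back-of-the-envelope calculation. The figures below are hypothetical, chosen only to illustrate the rebound effect he describes, not numbers from the interview:

```python
# Hypothetical illustration: a 50 percent per-workload energy saving
# can be largely absorbed by workload growth.
baseline_workload = 100.0   # arbitrary units of work
baseline_energy = 1000.0    # kWh used for that workload

# Energy intensity (kWh per unit of work) before and after efficiency measures
old_intensity = baseline_energy / baseline_workload   # 10 kWh per unit
new_intensity = old_intensity * 0.5                   # 50% saving per unit

# Workload grows as new apps take up the freed capacity
new_workload = baseline_workload * 1.8

new_energy = new_intensity * new_workload
print(new_energy)  # 900.0 kWh: energy per unit halved, total barely reduced
```

Per-workload efficiency has doubled, yet total consumption falls only 10 percent, which matches Lawrence’s point: the real win is avoiding the need to build a new data centre, not a lasting drop in the electricity bill.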
Which uses most energy – servers or desktops?
But does this focus on data centres obscure a greater saving which could be got in desktop systems? To start with, which uses most energy? And which is easiest to reduce? “Which uses the most energy? It’s debatable, and you hear different figures from different people,” says Lawrence. “Google clearly uses more energy in its data centres, but a company with large call centres, or an educational establishment, might use more on the desktop.”