You have to hand it to Schneider Electric’s marketing department. The power supply company has started an initiative to gather information about data centre energy use. Now that doesn’t sound all that exciting – so they’ve called it the Data Centre Genome Project.
That’s a much more interesting way to put it. According to Soeren Brogaard Jensen, Schneider’s VP of data centre software, the company wants the data centre community to “donate the DNA of their data centres” to enable everyone to operate more efficiently.
The reality is a bit more prosaic, but potentially useful. Jensen reckons that data centres worldwide are wasting $10 billion in energy costs because power is over-provisioned and servers sit idle.
When building a facility, people want to know the maximum power their kit will draw, and that figure is printed on the nameplate of the equipment they are installing. But most IT vendors build in a huge safety margin, so the nameplate limit is never reached in practice.
Capacity decisions based on those inflated figures are therefore dodgy, and the wrong assumptions spread through the data centre community, with money lost because actions are taken on false numbers. That's the theory, anyway.
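To make that arithmetic concrete, here is a minimal sketch of how a nameplate-based plan diverges from measured draw. The figures are invented purely for illustration, not taken from Schneider's data:

```python
# Illustrative sketch: how nameplate ratings inflate a capacity plan.
# All figures are hypothetical, chosen only to show the arithmetic.

NAMEPLATE_WATTS = 750      # rating printed on the server's label
MEASURED_PEAK_WATTS = 420  # what the server actually draws under load
SERVERS = 1000

planned_kw = NAMEPLATE_WATTS * SERVERS / 1000
actual_kw = MEASURED_PEAK_WATTS * SERVERS / 1000
stranded_kw = planned_kw - actual_kw

print(f"Provisioned: {planned_kw:.0f} kW, actually used: {actual_kw:.0f} kW")
print(f"Stranded capacity: {stranded_kw:.0f} kW "
      f"({stranded_kw / planned_kw:.0%} of the plan)")
```

On those made-up numbers, 44 percent of the provisioned power is stranded: paid for, but never drawn.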
Schneider wants to base its own database on real operational data as well as users’ measured results. Showing the idea off to TechWeekEurope at the Data Centre Dynamics event in London, Jensen illustrated how good monitoring with a DCIM (data centre infrastructure management) system can do useful things such as exposing when a rack is full of servers which are not actually doing anything.
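As a rough illustration of that kind of check, here is a minimal sketch, assuming per-rack power telemetry of the sort a DCIM system exports. The rack names, readings and idle threshold are assumptions for the example, not Schneider's API:

```python
# Hypothetical sketch of the kind of check a DCIM system can run:
# flag racks whose servers draw near-idle power around the clock.
# Rack names, readings and the threshold are invented for illustration.

from statistics import mean

# rack -> hourly power readings in watts (in reality, DCIM telemetry)
rack_power = {
    "rack-01": [4200, 4150, 4300, 4250],  # busy rack
    "rack-07": [900, 880, 910, 895],      # populated, but doing nothing
}

IDLE_WATTS = 1000  # assumed near-idle draw for a fully populated rack

for rack, readings in rack_power.items():
    avg = mean(readings)
    if avg < IDLE_WATTS:
        print(f"{rack}: averaging {avg:.0f} W - powered but apparently idle")
```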
This is more than a pitch for Schneider's DCIM software, though. Jensen assures me that anyone can put information into the project, and use what is there.
But there is more that can be done with data centre information. Some data centres are gathering so much data that they cannot understand what they have, and there are bigger problems than the waste of over-specification.
Elsewhere at the DCD event, IT management firm Romonet showed a system which takes the data that is gathered by things like DCIM, and makes sense of it.
For instance, one data centre had 26 monitoring points on each of its 17 CRAC (computer room air conditioning) units, yet it was unable to spot that the units had been set up wrongly and were operating inefficiently, said Liam Newcombe of Romonet.
Spotting that required a system that was based around an actual model of the data centre and its surroundings, including the local climate – something Romonet says its software does.
Data centres are increasingly moving to contracts where different providers are paid according to their contribution to overall efficiency. So if the PUE (power usage effectiveness) is too high, the data centre owner needs to be able to prove to the CRAC supplier that it was at fault.
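For reference, PUE is the Green Grid's standard efficiency metric: total facility power divided by the power delivered to the IT equipment, so a value near 1.0 is ideal. A minimal sketch of the calculation, with invented numbers:

```python
# PUE (power usage effectiveness) = total facility power / IT equipment power.
# The numbers below are invented: a PUE of 1.5 means that for every watt
# reaching the IT kit, another half-watt goes on cooling and distribution.

def pue(total_facility_kw: float, it_load_kw: float) -> float:
    return total_facility_kw / it_load_kw

print(pue(1500, 1000))  # 1.5
```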
Some organisations take the approach that more data is better – flooding the data centre with energy and temperature meters. But both Schneider and Romonet believe usable information is more important than just data.
“You don’t need flood metering,” says Newcombe. “You need intelligence.”