Physical Infrastructure Still A Power Play In Virtualised Worlds

While virtualisation and cloud are seen as forging ahead, with innovation appearing across the market, physical data centre design is often perceived as lagging behind. This perception, however, understates the effort and investment that a successful data centre development project requires.

Physical data centre infrastructure must deliver the required level of computing power, while keeping power consumption and cooling demands as low as possible. It’s a balancing act, but it still needs to be reliable – having a data centre that only meets requirements at specific peaks is not a successful design.

A measure of improvement

The Power Usage Effectiveness (PUE) ratio, developed by the Green Grid consortium, is one of the prime ways that data centre managers can judge the success of their own installations.
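The ratio itself is straightforward: total facility power divided by the power drawn by the IT equipment alone, so a value of 1.0 is the theoretical ideal. A minimal sketch, using hypothetical meter readings purely for illustration:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power over IT load.

    1.0 is the theoretical ideal; anything above it is overhead
    (cooling, power distribution losses, lighting and so on).
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical readings: 1,500 kW at the utility meter, 1,000 kW at the racks.
print(pue(1500.0, 1000.0))  # 1.5 - every watt of IT load costs 0.5 W of overhead
```

Tracked over time, a falling value is what the Green Grid intends the metric to show: progress, not a league table.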

While PUE offers a measure of reliability and efficiency, it is not intended as a verdict that your data centre is bad and other data centres are good – though that appears to be how the market perceives it. Instead, it should be treated as a measure of progress towards efficient use of resources.

To start with, virtualisation makes workloads more mobile within the data centre. Whereas previously a data centre would cater for mission-critical workloads on large physical servers with dedicated additional power and cooling resources, these days virtual machines can move all around the data centre independently.

From a design standpoint, this makes traditional approaches less efficient. The fixed data centre ‘hot spot’ that you could correct by adding cooling does not exist in the virtual world: hot spots become mobile, depending on load and on which resource a virtual machine is moved to.

Instead, there are a couple of routes that can be taken to keep the power and cooling efficiencies in place while ensuring that the benefits of virtualisation are delivered. The first is the move to hot aisle/cold aisle layouts throughout the data centre. This involves capturing airflow of a particular kind – either hot or cold – and confining it to a particular location, so that any cooling applied provides as much benefit as possible.

The second approach uses containment to provide passive thermal management. This involves the design of the racks, cabling and venting in which the servers, switches and storage are hosted. It may seem like a small detail, but overlooking this physical design stage can add massively to the ongoing overhead for the data centre.

Passive thermal management designs air flow so that heat is taken away from the devices more efficiently. It uses the physics of the hot air itself and channels it away from the IT assets faster, rather than pumping more cold air at the devices themselves, which requires additional power.

Up and down: planning for the future

Employing this approach to energy-efficient deployment, including passive thermal management techniques, has delivered average reductions of 15 percent in power consumption and 38 percent in cooling costs. Over the life of a data centre, this can be a massive saving, based on getting the basics right.

This understanding of physical infrastructure and passive air cooling is essential in the building of new data centres, alongside an awareness of the differing demands that virtualisation and cloud place on the data centre.

While it is much easier to deploy new servers as virtual machines, virtualisation deployments tend to be more static in the number of VMs that are in place. The biggest consideration around implementations is thus the mobility of virtual machines.

Similarly, areas of heat generation may move around the data centre as bigger virtual machines are migrated to hosts with the right amount of resources. On the physical IT side, this means designing the data centre in a uniform way, with power, cooling and heat management considered across the whole design.
