Dell is counting on the development of private cloud computing systems within corporate and academic data centres to help drive sales of its latest line of servers and software.
On 24 March Dell introduced enhanced servers, software and services designed to help organisations build their own cloud computing systems. The centrepiece of this package is the new Dell PowerEdge C-series server line, designed to support high-performance computing with efficient energy consumption and an affordable total cost of ownership.
Many corporations have experience working with public cloud customer relationship management applications, and some have worked with cloud computing services such as Amazon EC2 or Microsoft's new Azure service, both of which are Dell partner services, noted Forrest Norrod, Dell's vice president and general manager of server platforms.
But Dell also believes that IT managers are ready to start building private cloud systems inside their data centres, he said. “I think it is starting now. There was a lot of tire kicking last year,” as IT managers considered the feasibility of building private cloud systems as opposed to using existing public cloud services, Norrod said.
A year ago they were more likely to find that they could provision a public cloud application with the necessary storage capacity on Amazon EC2 or Azure for a lot less than they could do it inside their own data centres, he said.
But that conclusion was not reached without “a lot of angst and introspection,” Norrod said, because of a lack of service level agreements and nagging concerns about security and data compliance issues. Those issues have led to “an extreme interest” in deploying private cloud systems by exploring “how do I get the economies of the public cloud inside my firewall?” Norrod said.
Private cloud systems offer a number of benefits for enterprises, including reducing the cost of computing and storage resources, which would enable the deployment of applications that might be economically unfeasible on a company's established IT infrastructure. Dell officials also say private clouds will enable enterprises to migrate old but mission-critical legacy applications that can run in virtualized environments onto a private cloud platform.
Furthermore, cloud systems have the potential to bring greater agility to an enterprise by providing the computing power to let it respond rapidly to business opportunities or a sudden increase in customer demand, Norrod said. “An increase in agility means an increased experimentation,” with ways to solve business problems or discover new business opportunities, he said.
For example, Praveen Asthana, Dell's vice president of enterprise storage and networking strategy, cited the experience that he and many of his fellow Texans have whenever a hailstorm hits somewhere on the plains of the Southwest. Hailstones the size of golf balls and larger smash car windows, ding paint jobs and damage building roofs.
When this happens there is always a spike in insurance claims that brings visits from claims adjusters bearing handheld computers of various kinds. Asthana has noticed that these devices are great for collecting claim information and even taking photos. But he said his claims adjuster expressed frustration when trying to transmit and store that information on the insurance company's servers, mainly because many adjusters around the state are filing claims at the same time.
A modernized data centre with a private cloud system would be able to handle the increased traffic, reducing user frustration, increasing claims adjuster productivity and improving customer satisfaction, Asthana said.
Dell officials also cited their role in helping the University of Colorado at Boulder build a new high-performance computing (HPC) centre based on Dell PowerEdge C6100 four-node cloud and cluster servers equipped with Intel Westmere processors.
The university wanted to provide new HPC capacity while consolidating some of the highly distributed computing resources on the Boulder campus, said Henry M. Tufo III, an associate professor in the Department of Computer Science who led the design and deployment effort for the new data centre.
Tufo said that a major goal of the project was to build a data centre that would be highly power- and cost-efficient while avoiding the need for an expensive and time-consuming construction project. “Another major issue was minimizing the impact on campus and on its infrastructure,” he said. That is why the university opted to work with Dell to design a modular data centre that could be deployed in a matter of months in an existing campus facility. The design and deployment of the new HPC data centre spanned about six months, Tufo said.
The new facility provides capacity for data-intensive applications that require high performance, such as weather and climate modeling, gene sequencing and seismic data analysis.
The project is not intended to consolidate all of the campus's HPC capacity in one location, since many university departments want to maintain their own computing resources, Tufo said. But it is likely that the university will be able to consolidate more of its computing resources into the new data centre over the next year or two, he said.