High Performance Computing is entering the mainstream, according to Intel, and the UK is at the forefront of that development, leading the world in both bioscience and weather forecasting.
At an event in London, Intel highlighted how HPC is now used in everything from measuring blood flow in the human body and mapping the evolution of the ocean in light of climate change, to carrying out data analysis in the financial sector.
One of the top 500 supercomputers in the world – an SGI Altix ICE 8200 EX, known as cx2 – is situated at Imperial College London. A single rack houses up to 512 Intel Xeon processor cores and delivers 6 teraflops of performance. Students are able to enhance their experiments through simulation, creating models of situations that are either impossible in real life or too expensive to carry out.
Simon Burbridge, head of information and communication technologies at the university, explained that HPC is a “fundamental part” of the research that goes on there, and gives students the flexibility to think outside the box.
Cancer Research is another organisation to have made great strides as a result of its access to HPC facilities. The charity’s SGI Altix UV supercomputer is based on Intel Xeon dual quad-core processors and has 512 cores. This is similar to the supercomputer selected by Professor Stephen Hawking and his team at the UK Computational Cosmology Consortium in Cambridge to support their research into the origins of space.
“Cancer genomics has become a collaboration really between the clinical groups in the cancer research labs and the computational groups who are turning the data into something we can understand,” said James Hadfield, the charity’s head of genomics.
“From the outset we’ve had to invest in our infrastructure year on year, and deal with the constraints of getting all the processing and data management done within a fixed environment,” he said.
Some have reportedly questioned the need for Cancer Research to own a supercomputer, citing the added expense and suggesting that data analysis could be pushed out to the cloud. However, MacCallum claims that this is not yet a viable option, as the long-term cost of storing such large amounts of data in the cloud would be astronomical.
The news follows the launch last week of “the world’s most powerful supercomputer” – the Tianhe-1A – at SC10 in New Orleans. Located at the National Supercomputing Center in Tianjin, China, Tianhe-1A uses 7,168 Nvidia Tesla M2050 GPUs and 14,336 CPUs, and has reportedly demonstrated performance of 2.57 petaflops.
“Thanks to the use of GPUs in a heterogeneous computing environment, Tianhe-1A consumes only 4.04 megawatts, making it three times more power-efficient; the difference in power consumption is enough to provide electricity to over 5,000 homes for a year,” Nvidia said in a statement.
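Taking the figures quoted above at face value, the machine’s energy efficiency works out to roughly 636 megaflops per watt – a rough sanity check, sketched below in Python, that uses only the 2.57-petaflop and 4.04-megawatt numbers from the article (the “three times” comparison cannot be reproduced here, as it depends on an unnamed baseline system):

```python
# Back-of-the-envelope efficiency check using the article's figures.
performance_flops = 2.57e15  # reported Linpack performance: 2.57 petaflops
power_watts = 4.04e6         # reported power draw: 4.04 megawatts

# Megaflops delivered per watt consumed.
efficiency_mflops_per_watt = performance_flops / power_watts / 1e6
print(f"{efficiency_mflops_per_watt:.0f} MFLOPS per watt")  # prints "636 MFLOPS per watt"
```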
Intel also used the event to show off its Knights Ferry software development platform and demonstrate its Knights Corner microprocessor, which is made on Intel’s 22nm manufacturing process and exploits Moore’s Law scaling to fit more than 50 Intel processing cores on a single chip.