AWS re:Invent: Amazon, Nvidia Expand Collaboration

At its AWS re:Invent conference in Las Vegas, Amazon Web Services (AWS) has made a series of announcements alongside its existing partner, Nvidia.

Amazon Web Services and Nvidia announced “an expansion of their strategic collaboration to deliver the most advanced infrastructure, software, and services to power customers’ generative artificial intelligence (AI) innovations.”

Nvidia, of course, is benefiting from the surging demand for AI. Earlier this week, Nvidia informed customers in China that it was delaying the launch of a new artificial intelligence chip, in light of toughened US export restrictions.


AWS re:Invent

AWS and Nvidia said the latest announcements expand their long-standing relationship. The new developments include:

  • AWS will be the first cloud provider to bring Nvidia GH200 Grace Hopper Superchips with new multi-node NVLink technology to the cloud. The platform will be available on Amazon Elastic Compute Cloud (Amazon EC2) instances.
  • Nvidia and AWS will collaborate to host Nvidia DGX Cloud, which is Nvidia’s AI-training-as-a-service, on AWS. DGX Cloud on AWS will accelerate training of generative AI and large language models that can reach beyond 1 trillion parameters.
  • Nvidia and AWS are collaborating on Project Ceiba to design the world’s fastest GPU-powered AI supercomputer – an at-scale system with GH200 NVL32 and Amazon EFA interconnect, hosted by AWS for Nvidia’s own research and development team. This first-of-its-kind supercomputer – featuring 16,384 Nvidia GH200 Superchips and capable of 65 exaflops of AI processing – will be used by Nvidia to propel its next wave of generative AI innovation.
  • AWS will introduce three additional Amazon EC2 instances: P5e instances, powered by Nvidia H200 Tensor Core GPUs, for large-scale and cutting-edge generative AI and HPC workloads; and G6 and G6e instances, powered by Nvidia L4 GPUs and Nvidia L40S GPUs, respectively, for a wide set of applications such as AI fine-tuning, inference, graphics, and video workloads.

Long partnership

“AWS and Nvidia have collaborated for more than 13 years, beginning with the world’s first GPU cloud instance,” said Adam Selipsky, CEO at AWS. “Today, we offer the widest range of Nvidia GPU solutions for workloads including graphics, gaming, high performance computing, machine learning, and now, generative AI.”

“We continue to innovate with Nvidia to make AWS the best place to run GPUs, combining next-gen Nvidia Grace Hopper Superchips with AWS’s powerful EFA networking, EC2 UltraClusters’ hyper-scale clustering, and Nitro’s advanced virtualisation capabilities,” said Selipsky.

“Generative AI is transforming cloud workloads and putting accelerated computing at the foundation of diverse content generation,” added Jensen Huang, founder and CEO of Nvidia.

Nvidia chief executive Jensen Huang. Image credit: Nvidia

“Driven by a common mission to deliver cost-effective, state-of-the-art generative AI to every customer, Nvidia and AWS are collaborating across the entire computing stack, spanning AI infrastructure, acceleration libraries, foundation models, and generative AI services.”

Tom Jowitt

Tom Jowitt is a leading British tech freelancer and long-standing contributor to Silicon UK. He is also a bit of a Lord of the Rings nut...
