Spotify, Paramount Sign Up To Use Google Cloud ARM Chips

Google Cloud said Spotify and Paramount Global are among the first major customers to use a new data centre processor it designed with ARM technology.

The company announced the Axion chip, its first ARM-based central processing unit, in April of this year, adding to the custom Tensor Processing Units (TPUs) it has been using for AI workloads since 2015.

Axion competes with Intel and AMD CPUs, but uses far less power – a key concern as companies build out data centres for electricity-hungry AI applications.

Amazon and Microsoft have also developed ARM-based CPUs that they offer to developers via their cloud infrastructure, while start-up Ampere Computing is selling such chips to Oracle’s cloud unit.

Image credit: Arm

Power efficiency

Google said the chip is about 60 percent more energy-efficient than comparable conventional CPUs.

The power saved can be used to drive resource-intensive tasks such as AI computing, the company said.

Before offering Axion to outside customers, Google used it internally to power applications such as YouTube ads, Google Earth Engine and other services.

Google, Microsoft and Amazon have all reached deals aiming to use nuclear power to secure energy for their data centre needs.

When it introduced Axion, Google also announced a new generation of its TPU chips, which it uses to drive AI tasks.

Intel and AMD have both introduced AI-focused accelerators that aim to compete with those from market leader Nvidia.

AI chips

Nvidia itself announced the next-generation “Blackwell” GPU platform earlier this year.

While ARM-based chips are currently installed alongside Nvidia chips, SoftBank, the majority owner of ARM, has plans for the company’s technology to drive AI chips that compete directly with those from Nvidia, the Financial Times reported this week.

SoftBank head Masayoshi Son wants to put ARM technology at the centre of a new network of data centres purpose-built to train and run AI systems, the report said.

ARM chief executive Rene Haas has not commented on the plans, but told a Bloomberg event last week: “All those AI workloads are going to run on Arm somewhere, somehow. That’s the reason we spend a lot of time talking to SoftBank about the future.”

Matthew Broersma

Matt Broersma is a long-standing freelance technology journalist who has worked for Ziff-Davis, ZDNet and other leading publications.
