OpenAI, Broadcom In Talks Over Development Of AI Chip – Report

OpenAI logo displayed on a smartphone.

Rebelling against Nvidia? OpenAI is again reportedly exploring the possibility of developing its own AI processor

OpenAI is reportedly looking to develop an AI chip, as the artificial intelligence pioneer seeks to secure the capacity required to run increasingly powerful AI models, and lessen its dependence on Nvidia.

The Financial Times reported that San Francisco-based OpenAI has been in talks with semiconductor designers including Broadcom about developing a new chip.

This is not the first time it has been reported that OpenAI has been exploring chip possibilities. CEO Sam Altman has previously publicly complained about the scarcity of GPUs (graphics processing units) – a market dominated by Nvidia, which controls more than 80 percent of the global market for the chips best suited to run AI applications.

Nvidia’s PCIe A100 GPU. Image credit: Nvidia

AI chips

It should be remembered that in 2018, Altman had personally invested in an AI chip startup called Rain Neuromorphics, and in 2019, OpenAI signed a letter of intent to spend $51 million on Rain’s chips.

Then in October 2023 it was reported that OpenAI was exploring making its own artificial intelligence chips and had gone as far as evaluating a potential acquisition target.

Also in late 2023, multiple reports stated that Sam Altman had been seeking to raise billions of dollars from some of the world’s biggest investors for an AI chip venture.

Altman has also reportedly been looking to raise funds for an AI-focused hardware device he has been developing with former Apple designer Jony Ive.

In January 2024 it was reported that Sam Altman had been courting investors in the Middle East and elsewhere to set up a network of fabrication plants for advanced AI chips.

OpenAI chief executive Sam Altman. Image credit: OpenAI

The talks with Abu Dhabi-based AI investment group G42 and others were motivated by Altman’s belief that it is necessary to act now to ensure a sufficient supply of AI processing chips through the end of the decade.

Then in February it was reported that the OpenAI head had been pursuing investors including the United Arab Emirates (UAE) government for a project possibly requiring as much as $7 trillion.

This project reportedly required such a huge investment in order to reshape the global semiconductor industry, to overcome current limitations with AI chips.

Broadcom talks

Now the Financial Times has reported that OpenAI has been holding talks with chip designers including Broadcom.

“The limiting factor of AI is capacity: chip capacity, energy capacity, compute capacity. [OpenAI] is not just going to sit back and let others build [that] when they are on the front line,” the FT quoted one person with knowledge of OpenAI’s plans.

The talks between OpenAI and Broadcom, which were first reported by The Information, have centred on the role Broadcom could play in developing a new chip for OpenAI.

The talks were at an early stage and OpenAI had “engaged across the industry”, according to a person with knowledge of the discussions.

“OpenAI is having ongoing conversations with industry and government stakeholders about increasing access to the infrastructure needed to ensure AI’s benefits are widely accessible,” OpenAI is quoted as saying in a statement. “This includes working in partnership with the premier chip designers, fabricators and the brick and mortar developers of data centres.”

Broadcom did not respond to a request for comment.

Costly business

Running complex AI platforms is a very expensive business, and Nvidia is the main supplier in this space with its H100 and A100 chips, which serve as generalised, all-purpose AI processors for many customers.

And these AI chips are not cheap.

While Nvidia does not disclose H100 prices, each chip can reportedly sell from $16,000 to $100,000 depending on the volume purchased and other factors.

It has previously been reported that Meta Platforms plans to bring its total AI chip stock up to 350,000 H100 chips in 2024, demonstrating the hefty financial investment required to compete in this sector, and the value of the organisations supplying these semiconductors.