OpenAI Mulls Development Of Own AI Chip – Report
ChatGPT developer OpenAI is said to be exploring making its own AI chips, and is reportedly eyeing a potential acquisition target
ChatGPT owner and developer OpenAI continues to explore opportunities to expand beyond its artificial intelligence (AI) chatbot expertise.
Reuters reported, citing people familiar with OpenAI’s plans, that the San Francisco-based firm is exploring making its own artificial intelligence chips and has gone as far as evaluating a potential acquisition target – the identity of which has not been revealed.
The firm has already begun expanding outside its home country: in July OpenAI confirmed that its first international office would be located in London, where it will focus on “research and engineering.”
AI chips
According to the Reuters report, OpenAI has not yet decided to move ahead with its AI chip development plans.
However, since at least last year the company has reportedly discussed various options to address the shortage of the expensive AI chips it relies on, people familiar with the matter told Reuters.
These options have included building its own AI chip, working more closely with chipmakers including Nvidia, and diversifying its suppliers beyond Nvidia.
OpenAI declined to comment on the report.
The report that OpenAI is mulling an AI chip move makes some sense, given that CEO Sam Altman has made the acquisition of more AI chips a top priority for the company.
Indeed, Altman has publicly complained about the scarcity of GPUs (graphics processing units) – a market dominated by Nvidia, which controls more than 80 percent of the global market for the chips best suited to run AI applications.
Reuters reported that the effort to get more AI chips is tied to two major concerns Altman has identified: a shortage of the advanced processors that power OpenAI’s software, and the extremely high costs of running the hardware needed to power its efforts and products.
Since 2020, OpenAI has developed its generative artificial intelligence technologies on a massive supercomputer constructed by Microsoft, one of its largest backers, which uses 10,000 Nvidia GPUs.
Microsoft, it should be remembered, provided OpenAI with a $1 billion investment in 2019 and a further $10 billion investment in early 2023.
And running ChatGPT is said to be very expensive for OpenAI. Reuters cited an analysis from Bernstein analyst Stacy Rasgon, which estimated that each query costs roughly 4 cents.
If ChatGPT queries were to grow to a tenth the scale of Google search, it would require roughly $48.1 billion worth of GPUs initially, and about $16 billion worth of chips a year to keep the service running.
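To put those figures in context, the sketch below shows how a roughly 4-cent query cost compounds at the scale described in the report. It is an illustrative back-of-envelope calculation only, not the Bernstein model: the daily Google search volume is an assumed round figure, and the output is an order-of-magnitude serving-cost estimate rather than the chip-spend figures quoted above.

```python
# Back-of-envelope sketch of the scale-up arithmetic reported above.
# The ~4 cents per query comes from the Bernstein estimate cited by Reuters;
# the daily Google search volume is an assumption used purely for illustration.

COST_PER_QUERY_USD = 0.04          # Bernstein estimate cited by Reuters
GOOGLE_SEARCHES_PER_DAY = 8.5e9    # assumption: rough, commonly cited figure

# Scenario from the report: ChatGPT handling a tenth of Google-search scale.
chatgpt_queries_per_day = GOOGLE_SEARCHES_PER_DAY / 10

daily_cost = chatgpt_queries_per_day * COST_PER_QUERY_USD
annual_cost = daily_cost * 365

print(f"Queries per day: {chatgpt_queries_per_day:,.0f}")
print(f"Estimated daily serving cost: ${daily_cost:,.0f}")
print(f"Estimated annual serving cost: ${annual_cost / 1e9:.1f} billion")
```

Under these assumptions the annual serving cost alone lands in the low tens of billions of dollars, which helps explain why chip supply and running costs are framed as Altman's two major concerns.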
Other firms
A number of well-known tech giants have invested significant sums over the years to develop their own AI silicon.
OpenAI’s main backer for example, Microsoft, is developing a custom AI chip that OpenAI is reportedly testing.
Alphabet’s Google, meanwhile, has developed its own artificial intelligence chips (its Tensor Processing Units), as well as a chip for quantum computing.
Facebook owner Meta Platforms is developing its own AI silicon, as well as its own AI supercomputer, the AI Research SuperCluster (RSC).
Amazon Web Services has already designed a second-generation data centre processor that is said to deliver at least a 20 percent performance increase over the first-generation chip.
Amazon was helped in this regard by its 2015 purchase of Israel-based chip designer Annapurna Labs for between $350m and $370m.