AI and graphics chip giant Nvidia is reportedly seeking to build bespoke chips for certain customers.
Reuters, citing nine sources familiar with the plans, reported that Nvidia is building a new business unit focused on designing bespoke chips, including advanced artificial intelligence processors, for cloud computing firms and others.
Nvidia is, of course, already the dominant designer and supplier of AI chips. It aims to capture a portion of an exploding market for custom AI chips and to protect itself from the growing number of companies interested in finding alternatives to its products.
Worldwide demand for AI infrastructure, including Nvidia GPUs, more than tripled Nvidia's share price in 2023 and has led a number of countries to prioritise the technology as critical national infrastructure.
Last October the United States introduced new export restrictions aimed at preventing China from obtaining high-end AI processors, and from acquiring the manufacturing technology to develop such chips itself.
Then in December, US commerce secretary Gina Raimondo criticised Nvidia in a speech for making GPUs that could be sold to China in compliance with US restrictions but were still capable of performing AI training tasks.
Last month it emerged that Nvidia was planning to begin mass production of its latest AI chip aimed at the China market in the second quarter, but was already facing reduced demand from the country’s biggest cloud providers over regulatory uncertainties and the product’s reduced performance.
But a week later it was reported that high-end Nvidia AI chips had been purchased by Chinese military organisations, state-run AI research institutes and universities over the past year, in spite of US export controls.
Into this comes the report that the Santa Clara, California-based company, which currently controls about 80 percent of the market for high-end AI chips, is building a bespoke chip division.
Nvidia’s customers include the likes of ChatGPT creator OpenAI, Microsoft, Alphabet and Meta Platforms – all of whom have reportedly raced to snap up the dwindling supply of Nvidia chips to compete in the rapidly emerging generative AI sector.
Nvidia’s H100 and A100 chips serve as generalised, all-purpose AI processors for many of those major customers, Reuters reported.
Nvidia does not disclose H100 prices, which are higher than those of the prior-generation A100, but each chip can sell for between $16,000 and $100,000 depending on the volume purchased and other factors.
Meta has reportedly said it plans to bring its total stock to 350,000 H100s this year.
But tech companies have started to develop their own internal chips for specific needs. Doing so helps reduce energy consumption and can potentially shrink the cost and time needed to design them.
Nvidia is now attempting to play a role in helping these companies develop custom AI chips, capturing work that has so far flowed to rival firms such as Broadcom and Marvell Technology, the sources, who declined to be identified because they were not authorised to speak publicly, told Reuters.
According to Reuters, Nvidia officials have met with representatives from Amazon, Meta, Microsoft, Google and OpenAI to discuss making custom chips for them.
Beyond data centre chips, the company has also pursued telecom, automotive and video game customers, and plans to target the automotive and video game markets with custom designs, according to sources and public social media postings, Reuters reported.