AI and graphics chip giant Nvidia is reportedly seeking to build bespoke semiconductors for certain customers.
Reuters, citing nine sources familiar with the plans, reported that Nvidia is building a new business unit focused on designing bespoke chips, including advanced artificial intelligence processors, for cloud computing firms and other customers.
Nvidia, of course, is already the dominant designer and supplier of AI chips, and it aims to capture a portion of the exploding market for custom AI chips while protecting itself from the growing number of companies seeking alternatives to its products.
Worldwide demand for AI infrastructure, including Nvidia GPUs, more than tripled Nvidia’s share price in 2023, and has led a number of countries to prioritise the technology as critical national infrastructure.
Last October the United States introduced new export restrictions aimed at preventing China from obtaining high-end AI processors and the manufacturing technology needed to develop such chips itself.
Then in December, US commerce secretary Gina Raimondo criticised Nvidia in a speech for making GPUs that could be sold to China in compliance with US restrictions but were still capable of performing AI training tasks.
Last month it emerged that Nvidia was planning to begin mass production of its latest AI chip aimed at the China market in the second quarter, but was already facing reduced demand from the country’s biggest cloud providers over regulatory uncertainties and the product’s reduced performance.
But a week later it was reported that high-end Nvidia AI chips had been purchased by Chinese military organisations, state-run AI research institutes and universities over the past year, in spite of US export controls.
Into this comes the report that the Santa Clara, California-based company, which currently controls about 80 percent of the market for high-end AI chips, is building a bespoke chip division.
Nvidia’s customers include the likes of ChatGPT creator OpenAI, Microsoft, Alphabet and Meta Platforms – all of whom have reportedly raced to snap up the dwindling supply of Nvidia chips to compete in the rapidly emerging generative AI sector.
Nvidia’s H100 and A100 chips serve as generalised, all-purpose AI processors for many of those major customers, Reuters reported.
Nvidia does not disclose H100 prices, which are higher than those of the prior-generation A100, but each chip can sell for between $16,000 and $100,000, depending on the volume purchased and other factors.
Meta has reportedly said it plans to bring its total stock to 350,000 H100s this year.
But tech companies have started to develop their own internal chips for specific needs. Doing so helps reduce energy consumption, and can potentially shrink design costs and timescales.
Nvidia is now attempting to capture a share of the custom AI chip work that has flowed to rival firms such as Broadcom and Marvell Technology, the sources, who declined to be identified because they were not authorised to speak publicly, told Reuters.
According to Reuters, Nvidia officials have met with representatives from Amazon, Meta, Microsoft, Google and OpenAI to discuss making custom chips for them.
Beyond data centre chips, the company also plans to target telecom, automotive and video game customers, according to sources and public social media postings, Reuters reported.