Two giants at the head of the pack racing towards artificial intelligence dominance are Google and Facebook. Google, with its voice recognition and search optimisation, may just have the edge, but Facebook is close behind with tools that can recognise faces in photos and the development of its ‘M’ virtual assistant.
But both companies have recently taken a strategic turn, open-sourcing their AI and machine learning efforts in order to crowdsource expertise and accelerate research.
Google open-sourced its TensorFlow artificial intelligence engine in November, and today Facebook is open-sourcing its own AI hardware design.
“At Facebook, we’ve made great progress with off-the-shelf infrastructure components and design thus far. We’ve developed software that can read stories, answer questions about scenes, play games, and even learn unspecified tasks through observing some examples,” said Facebook.
“But we realised that truly tackling these problems at scale required us to design our own systems. Today, we’re unveiling our next-generation GPU-based systems for training neural networks, which we’ve code-named Big Sur.”
Facebook said that, as part of its ongoing commitment to open source, it is contributing the design, which boosts the GPU horsepower available for AI workloads, to the Open Compute Project so that “others can benefit”. GPUs, or graphics processing units, power the deep learning capabilities of AI, and adding more of them lets larger neural networks be trained faster. Facebook said it has built Big Sur to incorporate eight high-performance GPUs of up to 300 watts each.
Facebook is the first company to adopt NVIDIA’s Tesla M40 GPU accelerators, introduced last month, to train deep neural networks; they play a key role in Big Sur.
“Deep learning has started a new era in computing,” said Ian Buck, vice president of accelerated computing at NVIDIA. “Enabled by big data and powerful GPUs, deep learning algorithms can solve problems never possible before. Huge industries from web services and retail to healthcare and cars will be revolutionised. We are thrilled that NVIDIA GPUs have been adopted as the engine of deep learning. Our goal is to provide researchers and companies with the most productive platform to advance this exciting work.”
Big Sur is not only bigger and faster than Facebook’s previous platform, but also more versatile and efficient. The new servers are optimised for thermal and power efficiency, meaning they can be operated in standard air-cooled data centres.
“We plan to open-source Big Sur and have the design materials submitted to the Open Compute Project (OCP). Facebook has a culture of support for open source software and hardware,” said the social network. “We’re very excited to now add hardware designed for AI research and production to our list of contributions to the community.”