Like Google, Facebook Is Open Sourcing Its AI Hardware
Facebook AI Research (FAIR) introduces ‘Big Sur’ artificial intelligence hardware, open sources it to the Open Compute Project
Two giants at the head of the pack racing towards artificial intelligence dominance are Google and Facebook. Google, with its voice recognition and search optimisation, may just have the edge, but Facebook is close behind with tools that can recognise faces in photos and the development of its ‘M’ virtual assistant.
But both companies have taken a strategic turn recently, open sourcing their AI and Machine Learning efforts in order to crowdsource expertise and accelerate research.
Google open sourced its TensorFlow artificial intelligence engine in November, and today Facebook is open sourcing its own AI hardware design.
“At Facebook, we’ve made great progress with off-the-shelf infrastructure components and design thus far. We’ve developed software that can read stories, answer questions about scenes, play games, and even learn unspecified tasks through observing some examples,” said Facebook.
Scale
“But we realised that truly tackling these problems at scale required us to design our own systems. Today, we’re unveiling our next-generation GPU-based systems for training neural networks, which we’ve code-named Big Sur.”
Facebook said that, as part of its ongoing commitment to open source, it is contributing the design to the Open Compute Project so that “others can benefit”. GPUs, or graphics processing units, power the deep learning capabilities of AI; the more of them a system has, the more powerful its AI can be. Facebook said it has built Big Sur to incorporate eight high-performance GPUs of up to 300 watts each.
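The only hard figures quoted are eight GPUs at up to 300 watts each, which implies the rough per-server GPU power envelope sketched below. This is back-of-the-envelope arithmetic from the article's numbers, not a Facebook specification.

```python
# Power-envelope arithmetic for a Big Sur-style server, using only the
# figures quoted in the article: eight GPUs at up to 300 W each.
GPU_COUNT = 8
GPU_MAX_WATTS = 300

# Worst-case GPU power draw per server.
gpu_power_watts = GPU_COUNT * GPU_MAX_WATTS
print(gpu_power_watts)  # 2400 W of GPU power alone, before CPUs, memory, etc.
```

That 2.4 kW GPU budget per server helps explain Facebook's emphasis on thermal and power efficiency in the design.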
Big Sur is effectively an engine behind artificial intelligence and machine learning computation. It comes in an Open Rack-compatible form, designed specifically for AI computing at large scale. It also uses NVIDIA’s Tesla Accelerated Computing Platform, which means, according to Facebook, it can “train twice as fast and explore networks twice as large”.
Facebook is the first company to adopt NVIDIA’s Tesla M40 GPU accelerators, introduced last month, to train deep neural networks. They play a key role in Big Sur.
“Deep learning has started a new era in computing,” said Ian Buck, vice president of accelerated computing at NVIDIA. “Enabled by big data and powerful GPUs, deep learning algorithms can solve problems never possible before. Huge industries from web services and retail to healthcare and cars will be revolutionised. We are thrilled that NVIDIA GPUs have been adopted as the engine of deep learning. Our goal is to provide researchers and companies with the most productive platform to advance this exciting work.”
Big Sur, whilst bigger and faster than Facebook’s previous platform, also boasts greater versatility and efficiency. The new servers are optimised for thermal and power efficiency, meaning that they can be operated in regular air-cooled data centres.
“We plan to open-source Big Sur and have the design materials submitted to the Open Compute Project (OCP). Facebook has a culture of support for open source software and hardware,” said the social network. “We’re very excited to now add hardware designed for AI research and production to our list of contributions to the community.”