Intel’s Movidius Neural Compute Stick Aims To Usher In Plug-and-Play AI

Intel has revealed the Movidius Neural Compute Stick, a USB-based kit designed to help ease the development and deployment of artificial intelligence (AI) systems for developers and researchers.

In a claimed world first, Intel has combined a deep learning inference kit (inference being the means by which an AI puts what it has learnt during training into action) with a self-contained AI accelerator, providing dedicated deep learning neural network processing capabilities to a whole host of devices with a USB connection.

Deep Learning on the edge

The idea behind the Movidius Neural Compute Stick is to bring deep learning neural network capabilities, a technique for training machine learning algorithms that loosely replicates how a human brain discerns and processes data, to the edge of networks, rather than relying on connections back to central or cloud-based systems.

In effect, the Movidius Neural Compute Stick would run AI algorithms that have been trained on a central system with massive data sets, and use them to interpret real-world data at the edge of networks or out in the field of operations or research.

This would bypass the need to constantly send data back and forth between a device on the edge of a network and a central system, thereby cutting out latency and allowing an AI system to make faster decisions and act on them sooner.
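The round-trip savings described above can be illustrated with a toy sketch. Note this is purely hypothetical code for illustration, not the actual Movidius SDK API: the function names and the trivial dot-product "model" are invented here to show the idea of running a centrally trained model locally, with no network call in the inference path.

```python
# Illustrative only: a toy "edge inference" loop. The model and function
# names are hypothetical, not the real Movidius NCSDK interface.

def load_pretrained_weights():
    # Stand-in for weights trained centrally on a large data set,
    # then copied once onto the edge device.
    return [0.2, -0.5, 0.9]

def infer_on_edge(weights, sensor_reading):
    # Runs entirely on the local device (e.g. via a USB accelerator):
    # no request to a central server, so no network latency.
    score = sum(w * x for w, x in zip(weights, sensor_reading))
    return "anomaly" if score > 0 else "normal"

weights = load_pretrained_weights()
print(infer_on_edge(weights, [1.0, 0.0, 1.0]))  # prints "anomaly"
```

The key point is that the model weights cross the network once, at deployment time, while every subsequent inference is answered locally.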

“The Myriad 2 VPU housed inside the Movidius Neural Compute Stick provides powerful, yet efficient performance – more than 100 gigaflops of performance within a 1W power envelope – to run real-time deep neural networks (DNN) directly from the device,” explained Remi El-Ouazzane, vice president and general manager of Movidius, an Intel company.

“This enables a wide range of AI applications to be deployed offline,” he added, noting the benefit of being able to run systems on the edge independent of a central server or cloud.

While the Movidius Neural Compute Stick will still need to take AI algorithms from a central server or cloud, its compact form and universal connection mean AI-based smart software could be integrated into all manner of devices, from heavy machinery to household white goods, thereby paving the way for greater degrees of smart automation.

Given Intel is struggling somewhat to retain its dominant position in the processor market and has just shut down its wearables division, it is no surprise the chipmaker is focusing more on AI-orientated technology.


Roland Moore-Colyer

As News Editor of Silicon UK, Roland keeps a keen eye on the daily tech news coverage for the site, while also focusing on stories around cyber security, public sector IT, innovation, AI, and gadgets.
