Google Is Bringing Local Machine Learning To Android Smartphones

Google has released research demonstrating how the artificial neural networks used to train artificial intelligence (AI) and machine learning systems can be run on a device, rather than relying on a central machine or powerful cloud-hosted servers.

Dubbed Federated Learning, the new technique is being used to improve how the Gboard virtual keyboard on Android phones is updated, while keeping users’ data private at the same time.

“When Gboard shows a suggested query, your phone locally stores information about the current context and whether you clicked the suggestion. Federated Learning processes that history on-device to suggest improvements to the next iteration of Gboard’s query suggestion model,” explained Brendan McMahan and Daniel Ramage, research scientists at Google.

Mobile machine learning

Federated Learning works by taking centrally trained machine learning models and running them locally on an Android smartphone, tapping into the power of modern mobile processors and working on data stored on the device rather than shuttling information back and forth with a cloud system.

The algorithms then learn from the data on the device and change accordingly. These changes are summarised as a focussed update and sent back to the cloud-based system, which ingests them along with the updates from other smartphones, averages the combined changes and subsequently evolves the overall shared machine learning model.
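To illustrate that round trip, here is a minimal sketch in Python of one such round, assuming model weights are plain NumPy vectors and each device holds a handful of training examples; the names (local_update, fed_avg_round) are illustrative stand-ins, not Google’s actual API.

```python
# Minimal sketch of one federated round: each device trains locally,
# sends back only a focused weight update, and the server averages the
# updates (weighted by how much data each device holds).
import numpy as np

def local_update(global_weights, local_data, lr=0.1, epochs=5):
    """Train locally and return the weight delta plus the local sample count."""
    w = global_weights.copy()
    for _ in range(epochs):
        for x, y in local_data:
            grad = 2 * x * (np.dot(w, x) - y)   # gradient of a squared-error loss
            w -= lr * grad
    return w - global_weights, len(local_data)

def fed_avg_round(global_weights, devices):
    """Combine the focused updates from each device into the shared model."""
    deltas, counts = zip(*(local_update(global_weights, d) for d in devices))
    total = sum(counts)
    avg_delta = sum(n / total * delta for delta, n in zip(deltas, counts))
    return global_weights + avg_delta   # evolved shared model

# Toy usage: two devices, each with (feature vector, label) pairs.
rng = np.random.default_rng(0)
devices = [[(rng.normal(size=3), rng.normal()) for _ in range(20)] for _ in range(2)]
weights = np.zeros(3)
for _ in range(10):
    weights = fed_avg_round(weights, devices)
```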

In effect, Federated Learning is a distributed and aggregated approach to training machine learning models and running artificial neural networks, while at the same time protecting the privacy and data of individual Android smartphone users.

“Federated Learning allows for smarter models, lower latency, and less power consumption, all while ensuring privacy. And this approach has another immediate benefit: in addition to providing an update to the shared model, the improved model on your phone can also be used immediately, powering experiences personalised by the way you use your phone,” the researchers said.

While traditional machine learning systems work on large amounts of data, evolving the model after cycles of major data crunching, the Federated Learning technique works on providing higher quality updates rather than relying on sheer volume of data throughput.

This has the benefit of reducing the amount of communication between a device and a central system: some 10 to 100 times less, according to Google’s research, than a more typical machine learning system using an optimisation algorithm such as Stochastic Gradient Descent.

So while traditional machine learning systems can count on the high bandwidth and low latency of a data centre, Federated Learning sidesteps the higher latency and lower throughput of mobile connections by cutting out the need for large numbers of iterative changes to the shared model, instead providing a smaller number of higher quality updates.
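As a rough sketch of why this works (following the Federated Averaging formulation described in Google’s research, with notation of our own): each participating device $k$ starts from the current shared model $w_t$, runs several epochs of local training on its own $n_k$ examples to produce $w_{t+1}^{k}$, and the server then forms the weighted average

$$w_{t+1} = \sum_{k=1}^{K} \frac{n_k}{n}\, w_{t+1}^{k}, \qquad n = \sum_{k=1}^{K} n_k.$$

Because each communicated update already reflects many local training steps, far fewer rounds of communication are needed than if every individual gradient had to be sent to the server.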

Tapping into TensorFlow

To carry out machine learning on Android smartphones, Google has made use of a miniaturised version of its TensorFlow machine learning software library, combined with careful scheduling to ensure training of a local model only happens when the smartphone is idle, plugged in and connected to a free Wi-Fi connection.

This approach ensures that the machine learning process does not disrupt a user’s daily smartphone activities.
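A hypothetical sketch of that scheduling gate might look like the following, with stand-in status flags rather than any real Android or TensorFlow API:

```python
# Hypothetical scheduling check: only kick off on-device training when the
# phone is idle, charging and on Wi-Fi, so normal use is never disrupted.
def should_train(device_is_idle, on_charger, on_wifi):
    """Return True only when training cannot interfere with the user."""
    return device_is_idle and on_charger and on_wifi

def maybe_run_training_round(status, train_fn):
    # status is a dict of stand-in flags reported by the device.
    if should_train(status["idle"], status["charging"], status["wifi"]):
        train_fn()   # run one local Federated Learning round
    # otherwise defer until the next scheduling check
```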

At the same time, updates based on a user’s data, but not containing that information, are encrypted and securely sent to a centralised server, which bypasses any issues with storing user data in the cloud. Cryptographic techniques also ensure that the coordinating server only decrypts the average of hundreds or thousands of updates rather than inspecting individual ones, thereby further protecting people’s privacy.
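To illustrate the core idea behind that cryptographic protection (a simplified sketch, not the actual protocol): every pair of devices can share a random mask that one adds to its update and the other subtracts, so each individual upload looks like noise, yet the masks cancel out when the server sums the contributions.

```python
# Simplified sketch of the masking trick behind secure aggregation.
# The real protocol also handles dropouts and uses cryptographic key
# agreement; here a shared random generator stands in for that.
import numpy as np

def mask_updates(updates, seed=0):
    rng = np.random.default_rng(seed)
    masked = [u.copy() for u in updates]
    n = len(updates)
    for i in range(n):
        for j in range(i + 1, n):
            pair_mask = rng.normal(size=updates[0].shape)
            masked[i] += pair_mask   # device i adds the shared mask
            masked[j] -= pair_mask   # device j subtracts the same mask
    return masked

updates = [np.array([0.1, -0.2]), np.array([0.3, 0.0]), np.array([-0.1, 0.4])]
masked = mask_updates(updates)
# The server only ever sees the masked uploads; their sum equals the true sum.
assert np.allclose(sum(masked), sum(updates))
server_average = sum(masked) / len(masked)
```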

Google has ambitions to further the use of Federated Learning, while also pushing its other machine learning research to tackle the training of systems whose data is already stored in the cloud and which therefore do not benefit from Federated Learning.

“Beyond Gboard query suggestions, for example, we hope to improve the language models that power your keyboard based on what you actually type on your phone (which can have a style all its own) and photo rankings based on what kinds of photos people look at, share, or delete,” the researchers said.

Google is not only working on the software side of machine learning and AI development, but also has its fingers in the hardware pie, having developed a chip that it claims can beat CPUs and GPUs when it comes to powering AI.


Roland Moore-Colyer

As News Editor of Silicon UK, Roland keeps a keen eye on the daily tech news coverage for the site, while also focusing on stories around cyber security, public sector IT, innovation, AI, and gadgets.
