This article is translated from "On-Device AI — What I know so far".


We are at the dawn of an accelerating wave of artificial intelligence applications. The AI algorithms in these applications run either on cloud-based infrastructure or on the device itself. Alongside cloud-based approaches, on-device approaches are becoming increasingly popular today thanks to better privacy, lower latency, improved reliability, and many other advantages.

By 2022, 80% of smartphones shipped will have on-device AI capabilities, up from 10% in 2017. — Gartner

What is on-device inference?

On-device inference means using a pre-trained model to make predictions directly on the device. Compared with the cloud-based paradigm, on-device inference is becoming increasingly popular because of its lower latency and better privacy. However, running such computationally intensive tasks on small devices can be difficult due to limited computing power and energy constraints.

What is on-device training?

The answer is in the question! 😂 On-device training means training the model on the device itself. As with on-device inference, the main challenges are the limited computing power and energy budgets of such devices. However, on-device training brings additional advantages. The model can learn from user data: since it runs on the device, the user's behavior is readily available, and the model can be personalized for that user. Because training happens on the device, there is no need to upload data to the cloud, which protects the user's privacy. Moreover, there is no need to host a server to train the model, so it also saves money.
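To make the personalization idea concrete, here is a toy sketch (not any vendor's API; the helper name and training setup are made up for illustration): a tiny logistic-regression "model" whose pre-trained weights are fine-tuned with a few gradient steps on user data that never leaves the device.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fine_tune_on_device(weights, bias, x_local, y_local, lr=0.5, steps=100):
    """Hypothetical helper: a few gradient steps on local user data."""
    w, b = weights.copy(), bias
    for _ in range(steps):
        p = sigmoid(x_local @ w + b)                   # predictions on local data
        grad_w = x_local.T @ (p - y_local) / len(y_local)
        grad_b = np.mean(p - y_local)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# "Pre-trained" weights shipped with the app (here: a blank slate).
w0, b0 = np.zeros(2), 0.0

# Local user data that stays on the device: label is 1 iff feature 0 is positive.
rng = np.random.default_rng(0)
x = rng.normal(size=(64, 2))
y = (x[:, 0] > 0).astype(float)

w, b = fine_tune_on_device(w0, b0, x, y)
acc = np.mean((sigmoid(x @ w + b) > 0.5) == (y == 1))
print(f"local accuracy after fine-tuning: {acc:.2f}")
```

The point of the sketch is the data flow, not the model: the gradients are computed from `x_local`/`y_local` in place, so nothing is uploaded to a server.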

In this article, I point out some useful and popular resources I found online for newcomers to on-device AI. I will keep this article updated. If you notice anything missing, please leave a comment so I can update the list. 😁

Mobile apps with on-device training/inference

The following are some popular mobile apps and features that perform inference and training on the device.

01) The "Hey Siri" feature — you can find out how "Hey Siri" works in this article.
02) The "Now Playing" feature on Pixel phones, which identifies music playing nearby — you can find the related research paper here.
03) Face ID technology on the iPhone — to understand how it works, see this article.
04) The Photos app on Apple devices, which processes images and recognizes faces and locations on the device — see this reference.


All product names, logos and brands are the property of their respective owners

TensorFlow Lite — TensorFlow Lite is an open-source deep learning framework that supports on-device inference. TensorFlow Lite currently does not support on-device training. To use TensorFlow Lite in an application, you first convert your TensorFlow model into a compressed FlatBuffer with the TensorFlow Lite Converter, which produces a .tflite file. You can then load that file onto a mobile or embedded device and run the model with the TensorFlow Lite interpreter.

PyTorch Mobile — PyTorch Mobile is currently in beta. Unlike TensorFlow Lite, it does not require converting existing machine learning models into an intermediate file format.

Google's ML Kit — ML Kit is a mobile SDK, currently in beta. You can join ML Kit's Early Access program using this link.

Apple's Core ML — A cool feature of Core ML is that you can use the Core ML converters to bring models from other machine learning libraries, such as TensorFlow and PyTorch, into the Core ML format.

Microsoft's Embedded Learning Library (ELL) — ELL is mainly intended for resource-constrained platforms and small single-board computers, such as the Raspberry Pi.

Huawei ML Kit — Huawei's ML Kit provides both on-device APIs and cloud APIs.

Samsung Neural SDK — The Samsung Neural SDK provides APIs that let developers easily deploy pre-trained or custom neural networks on the device. It is designed to run only on Samsung devices.
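The TensorFlow Lite workflow described above (convert with the converter, then run with the interpreter) can be sketched as follows. This is a minimal sketch assuming TensorFlow is installed; the tiny Keras model is just a placeholder standing in for a real trained model.

```python
import numpy as np
import tensorflow as tf

# Placeholder model: a single dense layer standing in for a trained model.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(2, activation="softmax"),
])

# Step 1: convert the TensorFlow model into a compressed FlatBuffer.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()  # bytes; normally written to a .tflite file

# Step 2: on the device, load the FlatBuffer into the TFLite interpreter.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Step 3: run inference on a sample input.
interpreter.set_tensor(inp["index"], np.zeros((1, 4), dtype=np.float32))
interpreter.invoke()
probs = interpreter.get_tensor(out["index"])
print(probs.shape)
```

In a real app the `.tflite` bytes would be bundled with the app and loaded from disk; the interpreter API is the same either way.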

Processors that support on-device AI

Samsung Exynos
Qualcomm Snapdragon

So far, this article has given a short introduction to on-device inference and training, along with some popular examples. I will come back and update it as I learn more. 🤗