Google debuts TensorFlow Lite to enable machine learning on mobile devices
Google LLC is launching a lightweight version of its open-source TensorFlow machine learning library for mobile platforms. Announced at Google’s I/O developer conference in May, TensorFlow Lite is now available in developer preview for both Android and iOS.
TensorFlow is an open-source software library that Google released in 2015 to make it easier for developers to design, build and train deep learning models. It works by passing data through the layers of a neural network as part of the training process. TensorFlow can be thought of as a kind of artificial brain through which complex data structures, or “tensors,” flow. Google says this process is a central aspect of deep learning that can be used to enhance many technology products.
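To make that metaphor concrete, here is a minimal sketch of a tensor flowing through a small computation graph, written in the graph-and-session style TensorFlow 1.x used at the time; the shapes and operations are illustrative only, not taken from Google’s announcement.

```python
import tensorflow as tf  # TensorFlow 1.x style, as used when this was written

# Build a tiny graph: a 1x3 input tensor flows through a matrix
# multiplication and a ReLU activation to produce a 1x2 output tensor.
x = tf.placeholder(tf.float32, shape=[1, 3], name="input")
w = tf.Variable(tf.random_normal([3, 2]), name="weights")
y = tf.nn.relu(tf.matmul(x, w), name="output")

# Run the graph: data "flows" from the input placeholder to the output.
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(y, feed_dict={x: [[1.0, 2.0, 3.0]]}))
```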
In a blog post today, Google’s TensorFlow team said this new Lite version can be seen as an evolution of its TensorFlow Mobile application programming interface. It’s now the company’s recommended solution for deploying machine learning models on mobile and embedded devices.
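In practice, that deployment path amounts to converting a trained TensorFlow model into the compact TensorFlow Lite format and bundling the result with a mobile app. The sketch below uses the later tf.lite.TFLiteConverter API rather than the standalone converter tool that shipped with the original preview, and the file paths are placeholders.

```python
import tensorflow as tf

# Convert a trained model (a SavedModel at a placeholder path) into the
# TensorFlow Lite flatbuffer format used on mobile and embedded devices.
converter = tf.lite.TFLiteConverter.from_saved_model("path/to/saved_model")
tflite_model = converter.convert()

# The resulting .tflite file is what gets bundled into the mobile app.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```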
TensorFlow Lite is a “crucial step toward enabling hardware-accelerated neural network processing across Android’s diverse silicon ecosystem,” said Android Engineering Vice President Dave Burke.
Because it’s still under development, TensorFlow Lite supports only a limited set of machine learning models for now, including MobileNet and Inception v3 for object identification with computer vision, plus Smart Reply for natural language processing, which provides one-touch replies to incoming chat messages. Developers can also deploy custom models trained on their own datasets, as sketched below. The company said more models and features will be added in the future based on users’ needs.
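For custom models, the same conversion path applies. As a rough illustration, and again using the later converter API rather than the preview-era tooling, a small Keras model trained on a developer’s own data can be exported to the Lite format as follows; the model architecture and training data here are purely hypothetical.

```python
import numpy as np
import tensorflow as tf

# A hypothetical custom model trained on the developer's own dataset.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(3, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(np.random.rand(100, 4), np.random.randint(0, 3, 100),
          epochs=1, verbose=0)

# Convert the trained Keras model to the TensorFlow Lite format.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
with open("custom_model.tflite", "wb") as f:
    f.write(converter.convert())
```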
Here’s a quick look at TensorFlow Lite’s architecture:
The Internet giant added that TensorFlow Lite was rebuilt from scratch to make it as lightweight as possible, enabling inference of on-device machine learning models with a small binary size. It has also been optimized for speed on mobile devices, with improved model loading times and support for hardware acceleration. Lastly, TensorFlow Lite takes advantage of “purpose-built custom hardware to process ML workloads more efficiently,” Google said.
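On the device itself, a converted model runs through TensorFlow Lite’s small interpreter. The sketch below shows the general shape of that inference step using the Python tf.lite.Interpreter, which mirrors the mobile bindings; the model file and input are placeholders, and on an actual phone the equivalent calls go through the Android or iOS APIs instead.

```python
import numpy as np
import tensorflow as tf

# Load a converted .tflite model (placeholder path) into the interpreter.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy input with the shape the model expects and run inference.
input_data = np.random.random_sample(input_details[0]["shape"]).astype(np.float32)
interpreter.set_tensor(input_details[0]["index"], input_data)
interpreter.invoke()

predictions = interpreter.get_tensor(output_details[0]["index"])
print(predictions)
```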
The developer preview of TensorFlow Lite can be downloaded from GitHub now.
Images: Google