Image Classification on Android with TensorFlow Lite and CameraX

Leverage the GPU delegate for machine learning on the edge

Nov 18, 2019 · 6 min read

TensorFlow Lite is TensorFlow’s lightweight solution for mobile and embedded devices, and the successor to TensorFlow Mobile. It’s here to bring machine learning to your smartphone while keeping the model binary small and inference latency low. It also supports hardware acceleration through the Android Neural Networks API, and models can run up to 4x faster with GPU support.

CameraX is the latest camera API, released as part of the Jetpack support libraries. It’s here to make camera development much easier, and with Google’s automated lab testing it strives to behave consistently across the huge range of Android devices out there. CameraX is a big improvement over the Camera2 API in terms of ease of use and simplicity.

The goal of this article is to merge the camera and ML worlds by running image classification on CameraX frames with a TensorFlow Lite model. We’ll build an Android application in Kotlin that leverages the smartphone’s GPU.

CameraX: A Brief Overview

CameraX is lifecycle-aware, so it removes the need to handle camera state manually in the onResume and onPause methods.

The API is use case-based. The three main use cases currently supported are:

- Preview: displays the camera feed on screen.
- Image analysis: provides CPU-accessible buffers of camera frames for processing (this is the use case we’ll feed to the classifier).
- Image capture: takes and saves high-quality photos.

Additionally, CameraX provides Extensions to easily access features such as HDR, Portrait, and Night Mode on supported devices.
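
To make this concrete, here’s a minimal sketch of binding use cases to a lifecycle, assuming the early alpha API used in this article (PreviewConfig, ImageAnalysisConfig, and CameraX.bindToLifecycle were reworked in later releases):

import androidx.appcompat.app.AppCompatActivity
import androidx.camera.core.CameraX
import androidx.camera.core.ImageAnalysis
import androidx.camera.core.ImageAnalysisConfig
import androidx.camera.core.Preview
import androidx.camera.core.PreviewConfig

class MainActivity : AppCompatActivity() {

    private fun bindCameraUseCases() {
        // Declare only the use cases you need: here, a preview and an analyzer.
        val preview = Preview(PreviewConfig.Builder().build())
        val analysis = ImageAnalysis(ImageAnalysisConfig.Builder().build())

        // A single call ties the camera to this Activity's lifecycle,
        // so there's nothing to manage in onResume or onPause.
        CameraX.bindToLifecycle(this, preview, analysis)
    }
}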

TensorFlow Lite Converter

The TensorFlow Lite converter takes a TensorFlow model and generates a TensorFlow Lite FlatBuffer file. The .tflite model can then be deployed on mobile or embedded devices and run locally using the TensorFlow Lite interpreter.

The following code snippet shows one way of converting a Keras model to a mobile-compatible .tflite file:

from tensorflow import lite

converter = lite.TFLiteConverter.from_keras_model_file('model.h5')
tfmodel = converter.convert()
open("model.tflite", "wb").write(tfmodel)

In the following sections, we’ll be demonstrating a hands-on implementation of CameraX with a MobileNet TensorFlow Lite model using Kotlin. You can create your own custom trained models or choose among the hosted, pre-trained ones.

Implementation

Under the Hood

The flow is really simple. We pass bitmap images from CameraX’s image analysis use case to the TensorFlow Lite interpreter, which runs inference on each frame using the MobileNet model and the label classes. In short, CameraX hands frames to TensorFlow Lite, and TensorFlow Lite hands back predictions.
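
As a rough sketch of this hand-off (not the article’s exact code), the analyzer could look like the following. It assumes the early alpha Analyzer signature, an already-initialized interpreter, a labels list, a TextView called resultTextView, and two placeholder helpers, yuvToBitmap() and bitmapToByteBuffer(), for the usual image conversions:

// Classify every frame delivered by the image analysis use case.
val analyzer = ImageAnalysis.Analyzer { image: ImageProxy, rotationDegrees: Int ->
    val bitmap = yuvToBitmap(image, rotationDegrees)   // placeholder helper
    val input = bitmapToByteBuffer(bitmap, 224, 224)   // placeholder helper

    // A float MobileNet outputs one probability per label.
    val output = Array(1) { FloatArray(labels.size) }
    interpreter.run(input, output)

    // Show the most likely label on screen.
    val best = output[0].indices.maxByOrNull { output[0][it] } ?: return@Analyzer
    runOnUiThread { resultTextView.text = labels[best] }
}
analysis.setAnalyzer(analyzer)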

Setup

Launch a new Android Studio Kotlin project and add the following dependencies to your app-level build.gradle file:

//CameraX
implementation 'androidx.camera:camera-core:1.0.0-alpha02'
implementation 'androidx.camera:camera-camera2:1.0.0-alpha02'
// Task API
implementation "com.google.android.gms:play-services-tasks:17.0.0"

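// TensorFlow Lite (nightly builds, for the GPU delegate)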
implementation 'org.tensorflow:tensorflow-lite:0.0.0-nightly'
implementation 'org.tensorflow:tensorflow-lite-gpu:0.0.0-nightly'

The nightly TensorFlow Lite build provides experimental support for GPUs. The Google Play Services Task API is used for handling asynchronous method calls.
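
Here’s a minimal sketch of that wiring, assuming a placeholder loadModelFile() helper for the usual asset-to-MappedByteBuffer routine and pre-built input/output buffers:

import com.google.android.gms.tasks.Tasks
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.gpu.GpuDelegate
import java.util.concurrent.Callable
import java.util.concurrent.Executors

val executor = Executors.newSingleThreadExecutor()

// Attach the experimental GPU delegate through the interpreter options.
val gpuDelegate = GpuDelegate()
val options = Interpreter.Options().addDelegate(gpuDelegate)
val interpreter = Interpreter(loadModelFile("mobilenet_v1.tflite"), options)  // placeholder loader

// Run inference off the main thread and react when the result is ready.
Tasks.call(executor, Callable { interpreter.run(input, output); output })
    .addOnSuccessListener { result -> /* update the TextView with the top label */ }

// When finished, release native resources: interpreter.close(); gpuDelegate.close()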

Next, add the MVP files, the labels, and the .tflite model file under your assets directory. Also, you need to ensure that the model isn’t compressed by setting the following aaptOptions in the build.gradle file:

android {
    aaptOptions {
        noCompress "tflite"
        noCompress "lite"
    }
}

Add the necessary permissions for the camera in your AndroidManifest.xml file:
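
<uses-permission android:name="android.permission.CAMERA" />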

Now that the setup is complete, it’s time to establish the layout!

Layout

The layout is defined inside the activity_main.xml file, and it consists of a TextureView for displaying the Camera Preview and a TextView that shows the predicted output from your image classification model.
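
As a sketch of how these views get wired up inside the Activity (the ids view_finder and result_text are assumptions, and the preview-output listener belongs to the early alpha API), the preview frames are routed into the TextureView like this:

val viewFinder = findViewById<TextureView>(R.id.view_finder)
val resultTextView = findViewById<TextView>(R.id.result_text)  // updated by the analyzer shown earlier

preview.setOnPreviewOutputUpdateListener { output ->
    // Re-parenting the TextureView is the usual trick to make it pick up
    // the new SurfaceTexture produced by CameraX.
    val parent = viewFinder.parent as ViewGroup
    parent.removeView(viewFinder)
    parent.addView(viewFinder, 0)
    viewFinder.surfaceTexture = output.surfaceTexture
}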

Conclusion

So that sums up this article. We used TensorFlow Lite and CameraX to build an image classification Android application with MobileNet, leveraging the GPU delegate, and we got pretty accurate results, pretty quickly. From here, you can try training your own custom TFLite models and see how they fare with CameraX. CameraX is still in alpha, but there’s already a lot you can do with it.

The full source code of this guide is available here.

That’s it for this one. I hope you enjoyed reading.