TensorFlow is a multipurpose machine learning framework. It can be used for anything from training huge models across clusters in the cloud to running models locally on an embedded device such as your phone.
This codelab uses TensorFlow Lite to run an image recognition model on an Android device.
What you'll build
A simple camera app that runs a TensorFlow image recognition program to identify flowers.
This codelab will be using Colaboratory and Android Studio.
Open the Colab, which shows how to train a classifier with Keras to recognize flowers using transfer learning, convert the classifier to TensorFlow Lite, and download the converted model for use in the mobile app.
The following command will clone the Git repository containing the files required for this codelab:
git clone https://github.com/tensorflow/examples.git
Next, go to the directory where you just cloned the repository. This is where you will be working for the rest of this codelab:
cd examples
If you don't have it installed already, install Android Studio 3.0+.
Open the project with Android Studio: once Android Studio has loaded, select "Open an existing Android Studio project" and choose
examples/lite/codelabs/flower_classification/start
from your working directory.
Copy the TensorFlow Lite model model.tflite and the label file labels.txt that you trained earlier into the assets folder at lite/codelabs/flower_classification/start/app/src/main/assets/.
Open the app's build.gradle file and find the TODO in the dependencies block:
dependencies {
// TODO: Add TFLite dependencies
}
Replace the TODO with the TensorFlow Lite dependencies:
implementation 'org.tensorflow:tensorflow-lite:0.0.0-nightly'
implementation 'org.tensorflow:tensorflow-lite-gpu:0.0.0-nightly'
implementation 'org.tensorflow:tensorflow-lite-support:0.0.0-nightly'
Still in build.gradle, find the TODO in the android block:
android {
...
// TODO: Add an option to avoid compressing TF Lite model file
...
}
Replace it with an aaptOptions block so the .tflite model file is not compressed in the APK (TensorFlow Lite memory-maps the model, which requires it to be stored uncompressed):
aaptOptions {
    noCompress "tflite"
}
Next, open ClassifierFloatMobileNet.java and find the TODO:
public class ClassifierFloatMobileNet extends Classifier {
...
// TODO: Specify model.tflite as the model file and labels.txt as the label file
...
}
@Override
protected String getModelPath() {
return "model.tflite";
}
@Override
protected String getLabelPath() {
return "labels.txt";
}
Now open the base class, Classifier.java, and find the TODO:
public abstract class Classifier {
...
// TODO: Declare a TFLite interpreter
...
}
protected Interpreter tflite;
protected Classifier(Activity activity, Device device, int numThreads) throws IOException {
...
// TODO: Create a TFLite interpreter instance
...
}
tflite = new Interpreter(tfliteModel, tfliteOptions);
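For context, here is a minimal sketch of how the model buffer and options around this line might be set up, assuming the model is loaded with FileUtil from the TFLite Support Library (the starter code may already handle this part for you):
// Sketch only. Requires: import java.nio.MappedByteBuffer;
// import org.tensorflow.lite.Interpreter; and import org.tensorflow.lite.support.common.FileUtil;
MappedByteBuffer tfliteModel = FileUtil.loadMappedFile(activity, getModelPath());
Interpreter.Options tfliteOptions = new Interpreter.Options();
tfliteOptions.setNumThreads(numThreads);
tflite = new Interpreter(tfliteModel, tfliteOptions);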
public void close() {
...
// TODO: Close the interpreter
...
}
tflite.close();
tflite = null;
Our TensorFlow Lite interpreter is set up, so let's write code to recognize some flowers in the input image. Instead of writing many lines of code to handle images with ByteBuffers, we can use the TensorFlow Lite Support Library to simplify image pre-processing. It also helps process the output of TensorFlow Lite models and makes the interpreter easier to use. At a high level, we want to crop and resize the input image to the model's input size, rotate it to match the camera sensor orientation, and normalize its pixel values:
private TensorImage loadImage(final Bitmap bitmap, int sensorOrientation) {
...
// TODO: Define an ImageProcessor from TFLite Support Library to do preprocessing
...
}
ImageProcessor imageProcessor =
new ImageProcessor.Builder()
.add(new ResizeWithCropOrPadOp(cropSize, cropSize))
.add(new ResizeOp(imageSizeX, imageSizeY, ResizeMethod.NEAREST_NEIGHBOR))
.add(new Rot90Op(numRoration))
.add(getPreprocessNormalizeOp())
.build();
return imageProcessor.process(inputImageBuffer);
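Note that the camera bitmap still has to be wrapped in a TensorImage before the processor can run on it; a sketch of that step, assuming inputImageBuffer is the TensorImage field used in this class:
// Sketch: load the camera frame into the TensorImage, then run the pipeline above on it.
inputImageBuffer.load(bitmap);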
public List<Recognition> recognizeImage(final Bitmap bitmap, int sensorOrientation) {
...
// TODO: Run TFLite inference
...
}
tflite.run(inputImageBuffer.getBuffer(), outputProbabilityBuffer.getBuffer().rewind());
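If you are wondering where outputProbabilityBuffer comes from, here is a sketch of how such a buffer could be created to match the model's output tensor, using TensorBuffer from the Support Library (the starter code may already create it, for example in the constructor):
// Sketch only. Requires: import org.tensorflow.lite.DataType;
// and import org.tensorflow.lite.support.tensorbuffer.TensorBuffer;
int probabilityTensorIndex = 0;
int[] probabilityShape = tflite.getOutputTensor(probabilityTensorIndex).shape(); // e.g. {1, 5}
DataType probabilityDataType = tflite.getOutputTensor(probabilityTensorIndex).dataType();
TensorBuffer outputProbabilityBuffer = TensorBuffer.createFixedSize(probabilityShape, probabilityDataType);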
public List<Recognition> recognizeImage(final Bitmap bitmap, int sensorOrientation) {
...
// TODO: Use TensorLabel from TFLite Support Library to associate the probabilities with category labels
...
}
Map<String, Float> labeledProbability =
new TensorLabel(labels, probabilityProcessor.process(outputProbabilityBuffer))
.getMapWithFloatValue();
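The UI ultimately needs the labels with the highest probability. The helper below is a sketch of how the labeledProbability map could be reduced to the top results; the getTopKProbability name and the Recognition constructor arguments (id, title, confidence, location) are assumptions about the starter code, not something this codelab defines:
// Sketch only. Requires: import java.util.*;
private static List<Recognition> getTopKProbability(Map<String, Float> labelProb, int maxResults) {
  // Keep entries ordered by descending confidence.
  PriorityQueue<Recognition> pq =
      new PriorityQueue<>(maxResults, (a, b) -> Float.compare(b.getConfidence(), a.getConfidence()));
  for (Map.Entry<String, Float> entry : labelProb.entrySet()) {
    pq.add(new Recognition(entry.getKey(), entry.getKey(), entry.getValue(), null));
  }
  List<Recognition> recognitions = new ArrayList<>();
  int size = Math.min(pq.size(), maxResults);
  for (int i = 0; i < size; i++) {
    recognitions.add(pq.poll());
  }
  return recognitions;
}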
The app can run on either a real Android device or in the Android Studio Emulator.
You can't load the app from Android Studio onto your phone unless you activate "developer mode" and "USB debugging". This is a one-time setup process.
Follow these instructions.
Android Studio makes setting up an emulator easy. Since this app uses the camera, you may want to set up the emulator's camera to use your computer's camera instead of the default test pattern.
To do this you need to create a new device in the Android Virtual Device (AVD) Manager.
From the main AVD Manager page, select "Create Virtual Device":
Then on the "Verify Configuration" page, the last page of the virtual device setup, select "Show Advanced Settings":
With the advanced settings shown, you can set both camera sources to use the host computer's webcam:
Before making any changes to the app let's run the version that ships with the repository.
Run a Gradle sync, then hit the Run button in Android Studio to start the build and install process.
Next you will need to select your device from this popup:
Now allow the TensorFlow Demo to access your camera and files:
Now that the app is installed, click the app icon to launch it. It should look something like this when you point your camera at a sunflower image:
You can hold the power and volume-down buttons together to take a screenshot.
Now try a web search for flowers, point the camera at the computer screen, and see if those pictures are correctly classified.
Or have a friend take a picture of you and find out what kind of TensorFlower you are!
TensorFlow Lite supports several hardware accelerators to speed up inference on your mobile device. The GPU is one of the accelerators that TensorFlow Lite can leverage through its delegate mechanism, and it is fairly easy to use.
Add a GPU option to the device list in examples/lite/codelabs/flower_classification/android/app/src/main/res/values/strings.xml:
<string-array name="tfe_ic_devices" translatable="false">
<item>CPU</item>
<!-- TODO: Add GPU -->
</string-array>
<item>GPU</item>
public abstract class Classifier {
...
/** Optional GPU delegate for acceleration. */
// TODO: Declare a GPU delegate
...
}
private GpuDelegate gpuDelegate = null;
protected Classifier(Activity activity, Device device, int numThreads) throws IOException {
...
switch (device) {
case GPU:
// TODO: Create a GPU delegate instance and add it to the interpreter options
...
}
}
gpuDelegate = new GpuDelegate();
tfliteOptions.addDelegate(gpuDelegate);
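In context, the surrounding switch might end up looking roughly like the sketch below; the CPU case and the numThreads handling are assumptions about the starter code rather than part of this step:
// Sketch: pick a delegate based on the selected device, then set the thread count.
switch (device) {
  case GPU:
    gpuDelegate = new GpuDelegate();
    tfliteOptions.addDelegate(gpuDelegate);
    break;
  case CPU:
    // No delegate; run on the CPU.
    break;
}
tfliteOptions.setNumThreads(numThreads);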
public void close() {
...
// TODO: Close the GPU delegate
...
}
if (gpuDelegate != null) {
gpuDelegate.close();
gpuDelegate = null;
}
That's it. In Android Studio, click Run to start the build and install process as before.
Now in the UI if you swipe up the bottom sheet and choose GPU instead of CPU, you should see a much faster inference speed.
Here are some links for more information: