maobon / Object-Detector-TFLite

Android object detector with TensorFlow Lite. Sample for a university paper.

TensorFlow Lite Object Detection Android Demo

Overview

This is a camera app that continuously detects the objects (bounding boxes and classes) in the frames seen by your device's back camera, using a quantized MobileNet SSD model trained on the COCO dataset. These instructions walk you through building and running the demo on an Android device.

The model files are downloaded via Gradle scripts when you build and run the app. You do not need to take any extra steps to download TFLite models into the project explicitly.

The application can run either on a device or on an emulator.

Build the demo using Android Studio

Prerequisites

  • If you don't have it already, install Android Studio, following the instructions on the website.

  • You need an Android device and an Android development environment with a minimum of API 21.

  • Android Studio 3.2 or later.

Building

  • Open Android Studio, and from the Welcome screen, select Open an existing Android Studio project.

  • From the Open File or Project window that appears, navigate to and select the tensorflow-lite/examples/object_detection/android directory from wherever you cloned the TensorFlow Lite sample GitHub repo. Click OK.

  • If it asks you to do a Gradle Sync, click OK.

  • You may also need to install various platforms and tools if you get errors like "Failed to find target with hash string 'android-21'" and similar.

  • Click the Run button (the green arrow) or select Run > Run 'android' from the top menu. You may need to rebuild the project using Build > Rebuild Project.

  • If it asks you to use Instant Run, click Proceed Without Instant Run.

  • Also, you need to have an Android device plugged in with developer options enabled at this point. See here for more details on setting up developer devices.

Switch between inference solutions (Task library vs TFLite Interpreter)

This object detection Android reference app demonstrates two implementation solutions:

(1) lib_task_api that leverages the out-of-box API from the TensorFlow Lite Task Library;

(2) lib_interpreter that creates a custom inference pipeline using the TensorFlow Lite Interpreter Java API (a minimal sketch follows).
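For orientation, here is a minimal sketch of what a lib_interpreter-style detection step looks like with the TFLite Interpreter Java API. The buffer shapes follow the four SSD postprocess outputs (boxes, classes, scores, count); the class name, the fixed detection count, and the assumption that the caller prepares the model buffer and image bytes are illustrative, not the app's exact code.

    import java.nio.ByteBuffer;
    import java.nio.MappedByteBuffer;
    import java.util.HashMap;
    import java.util.Map;
    import org.tensorflow.lite.Interpreter;

    /** Minimal sketch of a lib_interpreter-style detection step. */
    final class InterpreterSketch {
        private static final int NUM_DETECTIONS = 10; // illustrative cap on results

        /** modelBuffer and imgData are assumed to be prepared by the caller. */
        static float[][] detect(MappedByteBuffer modelBuffer, ByteBuffer imgData) {
            Interpreter interpreter = new Interpreter(modelBuffer);

            // The SSD postprocess op emits four outputs: boxes, classes, scores, count.
            float[][][] outputLocations = new float[1][NUM_DETECTIONS][4];
            float[][] outputClasses = new float[1][NUM_DETECTIONS];
            float[][] outputScores = new float[1][NUM_DETECTIONS];
            float[] numFound = new float[1];

            Object[] inputs = {imgData}; // preprocessed 300x300x3 image bytes
            Map<Integer, Object> outputs = new HashMap<>();
            outputs.put(0, outputLocations);
            outputs.put(1, outputClasses);
            outputs.put(2, outputScores);
            outputs.put(3, numFound);

            interpreter.runForMultipleInputsOutputs(inputs, outputs);
            interpreter.close();
            return outputScores;
        }
    }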

The build.gradle inside the app folder shows how to change flavorDimensions "tfliteInference" to switch between the two solutions.
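As a rough sketch of what that configuration looks like (the flavor names here are illustrative assumptions; check the actual build.gradle in the repo):

    // Sketch of app/build.gradle flavor configuration; names are illustrative.
    flavorDimensions "tfliteInference"
    productFlavors {
        // Inference via the out-of-box TFLite Task Library (lib_task_api).
        taskApi {
            dimension "tfliteInference"
        }
        // Custom pipeline via the TFLite Interpreter Java API (lib_interpreter).
        interpreterApi {
            dimension "tfliteInference"
        }
    }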

Inside Android Studio, you can change the build variant to whichever one you want to build and run—just go to Build > Select Build Variant and select one from the drop-down menu. See configure product flavors in Android Studio for more details.

For the Gradle CLI, running ./gradlew build creates APKs for both solutions under app/build/outputs/apk.

Note: If you simply want an out-of-box API to run the app, we recommend lib_task_api for inference. If you want to customize your own models and control the details of inputs and outputs, it might be easier to use lib_interpreter.
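To illustrate why lib_task_api needs so little code, here is a minimal sketch using the Task Library's ObjectDetector. The class name, model filename, and threshold values are assumptions for illustration; the library calls themselves (createFromFileAndOptions, detect) are the Task Library's public API.

    import android.content.Context;
    import android.graphics.Bitmap;
    import java.io.IOException;
    import java.util.List;
    import org.tensorflow.lite.support.image.TensorImage;
    import org.tensorflow.lite.task.vision.detector.Detection;
    import org.tensorflow.lite.task.vision.detector.ObjectDetector;
    import org.tensorflow.lite.task.vision.detector.ObjectDetector.ObjectDetectorOptions;

    /** Minimal sketch of lib_task_api-style detection with the Task Library. */
    final class TaskApiSketch {
        static List<Detection> detect(Context context, Bitmap frame) throws IOException {
            ObjectDetectorOptions options = ObjectDetectorOptions.builder()
                    .setMaxResults(10)        // illustrative values, not the demo's exact ones
                    .setScoreThreshold(0.5f)
                    .build();
            ObjectDetector detector = ObjectDetector.createFromFileAndOptions(
                    context, "detect.tflite", options);
            // The Task Library handles preprocessing, inference, and postprocessing.
            return detector.detect(TensorImage.fromBitmap(frame));
        }
    }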

Model used

Downloading the model, extracting it, and placing it in the assets folder are managed automatically by download.gradle.
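As a hedged sketch of what such a script typically contains (the task name, URL, and the choice of the de.undercouch download plugin are assumptions, not a copy of the repo's actual download.gradle):

    // Sketch of a model-download task; the real download.gradle may differ.
    task downloadModelFile(type: de.undercouch.gradle.tasks.download.Download) {
        src 'https://example.com/path/to/detect.zip'   // placeholder URL
        dest "${project.projectDir}/src/main/assets/detect.zip"
        overwrite false
    }
    preBuild.dependsOn downloadModelFile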

If you want to download the model explicitly, you can download it from here. Extract the zip to get the .tflite model and the label file.

Custom model used

This example shows you how to perform TensorFlow Lite object detection using a custom model.

  • Clone the TensorFlow models GitHub repository to your computer.

        git clone https://github.com/tensorflow/models/

  • Build and install this repository.

        cd models/research
        python3 setup.py build && python3 setup.py install

  • Download the MobileNet SSD trained on Open Images v4 here. Extract the pretrained TensorFlow model files.

  • Go to the models/research directory and execute this code to get the frozen TensorFlow Lite graph.

        python3 object_detection/export_tflite_ssd_graph.py \
          --pipeline_config_path object_detection/samples/configs/ssd_mobilenet_v2_oid_v4.config \
          --trained_checkpoint_prefix <directory with ssd_mobilenet_v2_oid_v4_2018_12_12>/model.ckpt \
          --output_directory exported_model

  • Convert the frozen graph to the TFLite model.

        tflite_convert \
          --input_shape=1,300,300,3 \
          --input_arrays=normalized_input_image_tensor \
          --output_arrays=TFLite_Detection_PostProcess,TFLite_Detection_PostProcess:1,TFLite_Detection_PostProcess:2,TFLite_Detection_PostProcess:3 \
          --allow_custom_ops \
          --graph_def_file=exported_model/tflite_graph.pb \
          --output_file=<directory with the TensorFlow examples repository>/lite/examples/object_detection/android/app/src/main/assets/detect.tflite

input_shape=1,300,300,3 because the pretrained model works only with that input shape.

allow_custom_ops is necessary to allow the TFLite_Detection_PostProcess operation.

input_arrays and output_arrays can be drawn from the visualized graph of the example detection model.

    bazel run //tensorflow/lite/tools:visualize \
      "<directory with the TensorFlow examples repository>/lite/examples/object_detection/android/app/src/main/assets/detect.tflite" \
      detect.html

  • Get labelmap.txt from the second column of class-descriptions-boxable.
  • In DetectorActivity.java, set TF_OD_API_IS_QUANTIZED to false (see the sketch after this list).
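For reference, TF_OD_API_IS_QUANTIZED sits among the model configuration constants near the top of DetectorActivity.java. The surrounding constants below are a sketch based on the upstream demo and may differ slightly from this repo:

    // Model configuration constants in DetectorActivity.java (sketch).
    private static final int TF_OD_API_INPUT_SIZE = 300;
    private static final boolean TF_OD_API_IS_QUANTIZED = false; // false for the custom float model
    private static final String TF_OD_API_MODEL_FILE = "detect.tflite";
    private static final String TF_OD_API_LABELS_FILE = "labelmap.txt";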

Additional Note

Please do not delete the contents of the assets folder. If you delete the files, choose Build > Rebuild Project from the menu to re-download the model files into the assets folder.
