googlecodelabs / tensorflow-for-poets-2


Failed Building customized app 'E/AndroidRuntime: FATAL EXCEPTION: CameraBackground'

mmortazavi opened this issue · comments

It is really frustrating to see that the official documentation does not work. I followed it step by step to deploy a TensorFlow Lite model created with AutoML Vision Edge. As usual, it works with the provided graph.lite and labels.txt files, but it fails when a custom object-detection model exported from AutoML Vision Edge is used, even though I made the recommended changes. Here is the metadata for the model exported from AutoML:

{
    "inferenceType": "QUANTIZED_UINT8", 
    "inputShape": [
        1, 
        512, 
        512, 
        3
    ], 
    "inputTensor": "normalized_input_image_tensor", 
    "maxDetections": 20, 
    "outputTensorRepresentation": [
        "bounding_boxes", 
        "class_labels", 
        "class_confidences", 
        "num_of_boxes"
    ], 
    "outputTensors": [
        "TFLite_Detection_PostProcess", 
        "TFLite_Detection_PostProcess:1", 
        "TFLite_Detection_PostProcess:2", 
        "TFLite_Detection_PostProcess:3"
    ]
}

Quantization is enabled, so the INT-to-float change should be carried out since the app uses a float model, and I have made those changes as instructed. Gradle synced and the app seems to build, but it crashes moments after launching with the following error:

11/29 12:46:12: Launching 'app' on samsung SM-A530F.
$ adb shell am start -n "android.example.com.tflitecamerademo/com.example.android.tflitecamerademo.CameraActivity" -a android.intent.action.MAIN -c android.intent.category.LAUNCHER
Waiting for process to come online...
Connected to process 22758 on device 'samsung-sm_a530f-5200837ad427566d'.
Capturing and displaying logcat messages from application. This behavior can be disabled in the "Logcat output" section of the "Debugger" settings page.
E/Zygote: isWhitelistProcess - Process is Whitelisted
E/Zygote: accessInfo : 1
I/flitecameradem: Late-enabling -Xcheck:jni
I/flitecameradem: report jit thread pid = 22772
I/flitecameradem: The ClassLoaderContext is a special shared library.
I/DecorView: createDecorCaptionView >> DecorView@7843c96[], isFloating: false, isApplication: true, hasWindowDecorCaption: false, hasWindowControllerCallback: true
I/tflite: Initialized TensorFlow Lite runtime.
D/TfLiteCameraDemo: Created a Tensorflow Lite Image Classifier.
D/OpenGLRenderer: Skia GL Pipeline
D/EmergencyMode: [EmergencyManager] android createPackageContext successful
D/InputTransport: Input channel constructed: fd=65
D/ViewRootImpl@73a7d7a[CameraActivity]: setView = DecorView@7843c96[CameraActivity] TM=true MM=false
D/ViewRootImpl@73a7d7a[CameraActivity]: Relayout returned: old=[0,0][1080,2220] new=[0,0][1080,2220] result=0x7 surface={true 516628328448} changed=true
I/ConfigStore: android::hardware::configstore::V1_0::ISurfaceFlingerConfigs::hasWideColorDisplay retrieved: 0
    android::hardware::configstore::V1_0::ISurfaceFlingerConfigs::hasHDRDisplay retrieved: 0
I/OpenGLRenderer: Initialized EGL, version 1.4
D/OpenGLRenderer: Swap behavior 2
D/mali_winsys: EGLint new_window_surface(egl_winsys_display *, void *, EGLSurface, EGLConfig, egl_winsys_surface **, EGLBoolean) returns 0x3000
D/OpenGLRenderer: eglCreateWindowSurface = 0x7841875700, 0x784972b010
D/ViewRootImpl@73a7d7a[CameraActivity]: MSG_WINDOW_FOCUS_CHANGED 1 1
D/InputMethodManager: prepareNavigationBarInfo() DecorView@7843c96[CameraActivity]
D/InputMethodManager: getNavigationBarColor() -855310
D/InputMethodManager: prepareNavigationBarInfo() DecorView@7843c96[CameraActivity]
    getNavigationBarColor() -855310
V/InputMethodManager: Starting input: tba=android.example.com.tflitecamerademo ic=null mNaviBarColor -855310 mIsGetNaviBarColorSuccess true , NavVisible : true , NavTrans : false
D/InputMethodManager: startInputInner - Id : 0
I/InputMethodManager: startInputInner - mService.startInputOrWindowGainedFocus
D/InputTransport: Input channel constructed: fd=74
I/CameraManagerGlobal: Connecting to camera service
D/VendorTagDescriptor: addVendorDescriptor: vendor tag id 3854507339 added
D/vndksupport: Loading /vendor/lib64/hw/android.hardware.graphics.mapper@2.0-impl.so from current namespace instead of sphal namespace.
D/InputMethodManager: prepareNavigationBarInfo() DecorView@7843c96[CameraActivity]
    getNavigationBarColor() -855310
V/InputMethodManager: Starting input: tba=android.example.com.tflitecamerademo ic=null mNaviBarColor -855310 mIsGetNaviBarColorSuccess true , NavVisible : true , NavTrans : false
D/InputMethodManager: startInputInner - Id : 0
D/ViewRootImpl@73a7d7a[CameraActivity]: MSG_RESIZED: frame=[0,0][1080,2220] ci=[0,0][0,144] vi=[0,0][0,144] or=1
E/AndroidRuntime: FATAL EXCEPTION: CameraBackground
    Process: android.example.com.tflitecamerademo, PID: 22758
    java.nio.BufferOverflowException
        at java.nio.Buffer.nextPutIndex(Buffer.java:542)
        at java.nio.DirectByteBuffer.putFloat(DirectByteBuffer.java:802)
        at com.example.android.tflitecamerademo.ImageClassifier.convertBitmapToByteBuffer(ImageClassifier.java:196)
        at com.example.android.tflitecamerademo.ImageClassifier.classifyFrame(ImageClassifier.java:114)
        at com.example.android.tflitecamerademo.Camera2BasicFragment.classifyFrame(Camera2BasicFragment.java:663)
        at com.example.android.tflitecamerademo.Camera2BasicFragment.access$900(Camera2BasicFragment.java:69)
        at com.example.android.tflitecamerademo.Camera2BasicFragment$5.run(Camera2BasicFragment.java:558)
        at android.os.Handler.handleCallback(Handler.java:873)
        at android.os.Handler.dispatchMessage(Handler.java:99)
        at android.os.Looper.loop(Looper.java:214)
        at android.os.HandlerThread.run(HandlerThread.java:65)
W/System: A resource failed to call close. 
I/Process: Sending signal. PID: 22758 SIG: 9
Process 22758 terminated.

I am not an Android developer, so I am having a hard time figuring out what causes the crash. I suspect, though, that the camera image is not the right input for the model.

Hi! It's a bit late, but I have just run into the same problem while following the instructions.
I am not a Java programmer, but I found that the buffer allocated for imgData has been made smaller, while convertBitmapToByteBuffer() is still packing floats into it.
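
To put numbers on it, here is a rough sketch (the constant names are the ones from the codelab's ImageClassifier.java and the sizes come from the AutoML metadata above, so treat the exact figures as my assumption):

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Rough sketch: input shape [1, 512, 512, 3], QUANTIZED_UINT8 (from the metadata above).
class InputBufferSizes {
  static final int DIM_BATCH_SIZE = 1;
  static final int DIM_IMG_SIZE_X = 512;
  static final int DIM_IMG_SIZE_Y = 512;
  static final int DIM_PIXEL_SIZE = 3;

  public static void main(String[] args) {
    // Quantized model: one byte per channel, which is how the buffer is now allocated.
    int allocatedBytes = DIM_BATCH_SIZE * DIM_IMG_SIZE_X * DIM_IMG_SIZE_Y * DIM_PIXEL_SIZE;
    ByteBuffer imgData = ByteBuffer.allocateDirect(allocatedBytes).order(ByteOrder.nativeOrder());
    // Float code path: putFloat() writes 4 bytes per channel instead.
    int writtenBytes = 4 * allocatedBytes;
    System.out.println(allocatedBytes + " bytes allocated, " + writtenBytes + " bytes written");
    // Prints "786432 bytes allocated, 3145728 bytes written": putFloat() runs past the end of
    // imgData and throws java.nio.BufferOverflowException, matching the stack trace above.
  }
}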

I have changed the function as shown below and it works now. I am not yet sure whether simply replacing putFloat with put (which writes single bytes) and casting the channel values to bytes is entirely correct, but it solves the main problem.

private void convertBitmapToByteBuffer(Bitmap bitmap) {
    if (imgData == null) {
      return;
    }
    imgData.rewind();
    bitmap.getPixels(intValues, 0, bitmap.getWidth(), 0, 0, bitmap.getWidth(), bitmap.getHeight());
    // Write one unsigned byte per channel (R, G, B) instead of the floats the original code used.
    int pixel = 0;
    long startTime = SystemClock.uptimeMillis();
    for (int i = 0; i < DIM_IMG_SIZE_X; ++i) {
      for (int j = 0; j < DIM_IMG_SIZE_Y; ++j) {
        final int val = intValues[pixel++];
        imgData.put((byte) ((val >> 16) & 0xFF)); // R
        imgData.put((byte) ((val >> 8) & 0xFF));  // G
        imgData.put((byte) (val & 0xFF));         // B
      }
    }
    long endTime = SystemClock.uptimeMillis();
    Log.d(TAG, "Timecost to put values into ByteBuffer: " + Long.toString(endTime - startTime));
}

I have a feeling there is a way to just copy the buffers (or even reuse the Bitmap's pixel data) without looping over x and y.
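
For example, something along these lines might work for an ARGB_8888 bitmap (untested sketch, reusing the same imgData and DIM_IMG_SIZE_X/Y fields as above; Bitmap.copyPixelsToBuffer() copies the raw pixel bytes, which for ARGB_8888 come out in R, G, B, A order, so the alpha byte still has to be skipped per pixel):

private void convertBitmapToByteBuffer(Bitmap bitmap) {
    if (imgData == null) {
      return;
    }
    imgData.rewind();
    // Copy the whole bitmap in one call (4 bytes per pixel: R, G, B, A).
    // In practice this scratch buffer should be allocated once and reused per frame.
    ByteBuffer rgba = ByteBuffer.allocateDirect(bitmap.getByteCount());
    bitmap.copyPixelsToBuffer(rgba);
    rgba.rewind();
    // Keep R, G, B and drop A for every pixel.
    for (int i = 0; i < DIM_IMG_SIZE_X * DIM_IMG_SIZE_Y; ++i) {
      imgData.put(rgba.get()); // R
      imgData.put(rgba.get()); // G
      imgData.put(rgba.get()); // B
      rgba.get();              // skip A
    }
}

It still loops once per pixel, but it avoids unpacking the packed ints returned by getPixels().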