tensorflow / flutter-tflite


[Feature-Request] Incorporation of `tflite_flutter_helper` for Image Classification Tasks

saurabhkumar8112 opened this issue · comments

As many of you might already be aware, tflite_flutter_helper is a popular helper library, specifically for image processing when working with TFLite. It was previously developed by the TensorFlow team as well.

As of now it has been discontinued, but it still has good helper functions for tensor operations, which are quite efficient.

For example, in my experiments I found that using tflite_flutter_helper to run MobileNet inference with the Android GPU delegate is about 8× faster than simply decoding an Image and converting it into a list matrix (as suggested in the MobileNet example).

If that's the case, it's important to use tflite_flutter_helper with tflite_flutter. It would be nice if this package could either be incorporated here or taken up by the TensorFlow team as well (given that they have the bandwidth).

How to verify the inference speed

  1. Use an XNNPackDelegate on Android.
  2. Run inference using the TensorImage wrapper class.
  3. Repeat step 2, but with vanilla Image inference (as given in the MobileNet examples).

You will see that inference in step 2 is around 8× faster than in step 3. I did this evaluation on a Pixel 3.
PS: This doesn't include any pre-processing time benchmarks, just inference.
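For illustration, here is a rough sketch of the two paths being compared. This assumes the tflite_flutter_helper TensorImage/ImageProcessor API and the `image` package; the interpreter, the 224×224 input shape, the 1001-class output, and the normalization values are placeholder assumptions, not taken from the actual benchmark:

```dart
import 'package:image/image.dart' as img;
import 'package:tflite_flutter/tflite_flutter.dart';
import 'package:tflite_flutter_helper/tflite_flutter_helper.dart';

// Path A (fast): wrap the frame in a TensorImage and let the helper
// library resize/normalize it with optimized tensor operations.
TensorBuffer runWithHelper(Interpreter interpreter, img.Image frame) {
  final processor = ImageProcessorBuilder()
      .add(ResizeOp(224, 224, ResizeMethod.BILINEAR))
      .add(NormalizeOp(127.5, 127.5))
      .build();
  final input = processor.process(TensorImage.fromImage(frame));
  final output =
      TensorBuffer.createFixedSize(<int>[1, 1001], TfLiteType.float32);
  interpreter.run(input.buffer, output.buffer);
  return output;
}

// Path B (slow): manually convert the decoded Image into nested Lists
// of floats, as in the MobileNet example.
List<dynamic> runVanilla(Interpreter interpreter, img.Image frame) {
  final resized = img.copyResize(frame, width: 224, height: 224);
  final input = List.generate(
      1,
      (_) => List.generate(
          224,
          (y) => List.generate(224, (x) {
                final p = resized.getPixel(x, y);
                return [
                  (img.getRed(p) - 127.5) / 127.5,
                  (img.getGreen(p) - 127.5) / 127.5,
                  (img.getBlue(p) - 127.5) / 127.5,
                ];
              })));
  final output = List.filled(1 * 1001, 0.0).reshape([1, 1001]);
  interpreter.run(input, output);
  return output;
}
```

Timing each path over a batch of frames (excluding model load) is what produced the roughly 8× gap described above.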

So it actually wasn't written by the TFLite team - it was written by a Summer of Code participant a few years ago :) Unfortunately bandwidth is very limited (no one other than myself from the TFLite team is actually working on this repo, so it's very community driven and just maintained/reviewed). Rather than the helper library though, the Flutter team is actively working on the MediaPipe plugin https://github.com/google/flutter-mediapipe/pulls which should address the speed/ease issues when it's ready.

@saurabhkumar8112 Could you put an example where you use that library?

The original library doesn't work for me on some phones that are supported by TensorFlow, and I think it comes from the image-processing part (it doesn't transform some formats correctly, or something similar), as it is detecting things with very low confidence...

Oh wow, thank you for your issue :P
It made me look into the image conversion in the live object detection example, and I saw that the problem was coming from there. (I changed to another way of converting, and now it works on my Oppo Reno 7Z.)

@JobiJoba were you able to resolve the issue or still want help?

Hey @PaulTR, thanks for replying. I look forward to MediaPipe; meanwhile, I am working on updating tflite_flutter_helper for our image classification case. If it works fine, I will try making a PR; is that okay?

Kinda solved, but it's slow :) At least I detect something :)

Yes @JobiJoba, object detection is likely to be slow. Make sure you're using the required GPU delegate.

It seems so, unfortunately :(

static Future<Interpreter> _loadModel() async {
  final interpreterOptions = InterpreterOptions();

  // Use the XNNPACK delegate on Android
  if (Platform.isAndroid) {
    interpreterOptions.addDelegate(XNNPackDelegate());
  }

  return Interpreter.fromAsset(
    _modelPath,
    options: interpreterOptions..threads = 4,
  );
}
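For context, a hypothetical usage sketch of the loader above; the input/output shapes here are placeholders for a typical 224×224 classification model, not values from the example itself:

```dart
// Hypothetical usage; shapes are placeholders for a 224x224 classifier.
final interpreter = await _loadModel();
final input = List.filled(1 * 224 * 224 * 3, 0.0).reshape([1, 224, 224, 3]);
final output = List.filled(1 * 1001, 0.0).reshape([1, 1001]);
interpreter.run(input, output);
```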

I'm trying to get the "example" project for live object detection working to see what kind of FPS I can achieve, and then continue with something more complex.

On Android, only XNNPackDelegate is working; GpuDelegate and GpuDelegateV2 are not.

Did I do something wrong, or are they supposed to work?

EDIT: My bad, GpuDelegateV2 works, but it gives the same performance as XNNPack.
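For reference, a minimal sketch of swapping in GpuDelegateV2 instead of XNNPack, assuming the tflite_flutter delegate API (`modelPath` is a placeholder):

```dart
import 'dart:io';
import 'package:tflite_flutter/tflite_flutter.dart';

// Sketch: choosing a delegate on Android. GpuDelegateV2 runs the model
// on the GPU; XNNPackDelegate uses optimized CPU kernels instead.
Future<Interpreter> loadWithGpu(String modelPath) async {
  final options = InterpreterOptions();
  if (Platform.isAndroid) {
    options.addDelegate(GpuDelegateV2());
  }
  return Interpreter.fromAsset(modelPath, options: options);
}
```

Whether the GPU delegate actually beats XNNPACK depends on the model and device, which matches the observation above that both gave similar performance here.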

Try reducing/increasing the number of threads. Depending on the device, that might increase performance. There's always a tradeoff between spawning threads, managing them, and the extra processing power they can provide.
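A rough way to test this is a small benchmark loop over thread counts. This is just a sketch; `modelPath`, `input`, and `output` are placeholders for your model asset and pre-allocated buffers:

```dart
import 'package:tflite_flutter/tflite_flutter.dart';

// Sketch: measure average CPU inference latency per thread count.
Future<void> benchmarkThreads(
    String modelPath, Object input, Object output) async {
  for (final threads in [1, 2, 4, 8]) {
    final interpreter = await Interpreter.fromAsset(
      modelPath,
      options: InterpreterOptions()..threads = threads,
    );
    final sw = Stopwatch()..start();
    for (var i = 0; i < 50; i++) {
      interpreter.run(input, output);
    }
    sw.stop();
    print('$threads threads: ${sw.elapsedMilliseconds / 50} ms/inference');
    interpreter.close();
  }
}
```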

Good advice; unfortunately, in my case 4 threads seems to give the best result with XNNPackDelegate. (I suppose it doesn't matter with GpuDelegateV2.)

The best FPS I can get with that setup here is 7 fps.

I know we can do better, because this app: https://play.google.com/store/apps/details?id=com.ultralytics.ultralytics_app

can do 25 fps on my phone.

Obviously I don't know how, why, or how (again) they do it :P but it's possible! (Of course they use another model, but if I understand correctly it's heavier than SSD MobileNet.)

Depends on the model as well: SSD produces ~8k bounding boxes, and RetinaNet produces ~100k.

So it actually wasn't written by the TFLite team - it was written by a Summer of Code participant a few years ago :)

You did advertise the Flutter package's availability with bells and whistles. Remember the following post? It's from August 2023, not so long ago:

https://blog.tensorflow.org/2023/08/the-tensorflow-lite-plugin-for-flutter-officially-available.html

Please understand that, when developers see such a blog post, they understand that Google is taking some form of ownership of TFLite for Flutter, through this package. Which isn't true, to say the least.

Unfortunately bandwidth is very limited (no one other than myself from the TFLite team is actually working on this repo, so it's very community driven and just maintained/reviewed). Rather than the helper library though, the Flutter team is actively working on the MediaPipe plugin https://github.com/google/flutter-mediapipe/pulls which should address the speed/ease issues when it's ready.

Based on your various answers in this repository, I understand that developers are strongly encouraged to give up high hopes on tflite_flutter and keep an eye on MediaPipe.

Remember, people are working on these things because they think they're valuable and are trying to make things easier for others. Please try and keep feedback constructive.

Please understand that, when developers see such a blog post, they understand that Google is taking some form of ownership of TFLite for Flutter, through this package. Which isn't true, to say the least.

The post says it was migrated to a place where it can be maintained (which it is, there are actively PRs being merged and responses). Before this the TFLite plugin hadn't been updated in three years.

Based on your various answers in this repository, I understand that developers are strongly encouraged to give up high hopes on tflite_flutter and keep an eye on MediaPipe.

The plugins are for fairly different things. TensorFlow Lite, on all platforms, is very much for custom down-in-the-weeds solutions, and other tools have been made for common tasks like image classification. We're moving in the same direction for Flutter, so right now it's just an in-between time: I've drummed up people to work on the MediaPipe plugin, compared to having nothing at all.