cpetig / tflite_micro_compiler

generate tflite micro code which bypasses the interpreter (directly calls into kernels)

[IMPROVEMENT]

OttoES opened this issue · comments

What should be improved?
Instructions on how to use the code in the initial README.md.

Describe the solution you'd like
The current instructions for using this repository are not very clear. (I am an experienced C programmer, but I might lack some basic understanding of a complex build system like this, so please be patient with me.)
Pulling the repository is fine. The next part basically instructs you to cd into build and run cmake -DGET_TF_SRC=ON ..
This is fine, but please add a warning that this can take very long...
Some variations are discussed after that, which I did not execute.
The next step to actually compile a model is where I think things are unclear.

The instructions say: cd ../tensorflow
No such directory is present after the cmake run above. After some snooping I found the folder in tflite_micro_compiler/cmake/_deps/tf-src/
This is an awkward place for the folder, and I can only assume it is the default path of some script somewhere?

The next step also fails: ./compiler hello_world.tflite hello_compiled.cpp hello_
I assume this implies you are still in the tensorflow folder. Unfortunately, there is no executable called ./compiler, presumably because the previous step failed. (I also suggest changing the name of the executable compiler, because the model compiler can be confused with the C compiler.)

Next, it would be nice to know how to C-compile the examples: which files are compiled, and is there a makefile, a file list, or a directory with all the files somewhere? (A folder of files would be very nice if you want to copy and integrate this into a different project.)
Last but not least, if you have your own tflite model, how do you build and compile it?

I am also happy to test your suggestions and even update the document (but I will need just a bit of direction, as requested above, please).

Having said all that, thanks for the effort, this looks like an awesome idea.

Thank you for your interest in our project and your feedback.

The best reference to get it running would be the CI, because it is continuously tested: https://github.com/cpetig/tflite_micro_compiler/blob/r2.3/.github/workflows/c-cpp.yml

Yes, you figured out correctly that with GET_TF_SRC, the TF directory will be hidden in some internal folder. However, as far as I remember, you do not need to find it in the CMake deployment, because "make hello_world_bin" is not required there. I agree that the README is confusing here. It is meant to show a hierarchy where you choose either "CMake" or "Make" to build the project. And when using CMake, there are some options for whether you want to use an existing copy of TF or automatically download a specific version.

If using CMake, an important step is not mentioned in the README. You actually have to build the project with cmake --build . in the cmake build directory.
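For concreteness, a minimal sketch of the whole CMake route described above, assuming an out-of-source build/ directory and the GET_TF_SRC=ON option (the automatic TF download is the slow part):

# Configure from a build/ subdirectory; GET_TF_SRC=ON lets CMake fetch the
# TensorFlow sources, which can take very long
mkdir -p build && cd build
cmake -DGET_TF_SRC=ON ..
# Build the project (this is the step missing from the README)
cmake --build .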

In both cases, you should get a "compiler" ELF file that sits either in the CMake build dir (build/ if following the instructions) or in the main directory (for Make). I agree that it could more accurately be named something like tflm_compiler.
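As a usage sketch (the path to hello_world.tflite is illustrative and depends on where the example model lives in your checkout), invoking the compiler from the directory containing the binary looks like:

# Arguments: input .tflite model, output C++ file, prefix for the generated symbols
./compiler hello_world.tflite hello_compiled.cpp hello_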

Let us know if anything else is unclear. It would be helpful if you could propose these changes; thank you very much for the offer.

Thanks a lot for the reply, it cleared up a lot. I will try it over the weekend and let you know.

Hi, I tried to follow the CI yml file on Ubuntu and Windows machines.
The "cmake --build ." step fails.
I tried using the specific TensorFlow version 2.3.0, but still no success.
Any help will be appreciated.

[screenshot of the build error]

Looks like a compiler portability issue. The offending code is machine-generated (Google's flatbuffer bindings in tflite itself), so it is not readily fixed by hand. I recall we hit similar issues integrating tflmc into an internal AI/ML toolchain project when embedded systems colleagues attempted to add support for the IAR compiler.

The forthcoming Infineon MTB-ML 2.0 Toolchain may include local patches for this. Feeding these back as a PR/fix for tflite upstream is under consideration, but Google's contribution process is arduous and they are, unsurprisingly, chary of accepting stuff they can't test/support in their CI/CD.

Hello, I am trying to execute the command cmake --build . but it throws the following error. Here is a snapshot of the terminal.

[screenshot of the terminal error]

What is the probable cause for this issue?

Looks like you are using an old C++ standard for compiling the tflm lib.

Please try to add TARGET_COMPILE_FEATURES(${LIB_NAME} PUBLIC cxx_std_14) where the tflm library target (${LIB_NAME}) is defined in the CMakeLists.
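After editing the CMakeLists, re-running the configure and build from the existing build directory should pick up the change; a quick sketch:

# Re-run from the build directory after the CMakeLists edit
cmake ..
cmake --build .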

Okay, I added that line. It fixed some errors, but the following are still present.

[screenshot of the remaining errors]

It seems like CMake pulls the latest version of flatbuffers, which might be incompatible with the schema generated for TFLM. Could you try to add this code instead (replacing the original statement, of course):

# Pin flatbuffers to a fixed tag instead of the latest master
FETCHCONTENT_DECLARE(
        flatbuffers
        GIT_REPOSITORY https://github.com/google/flatbuffers.git
        GIT_PROGRESS FALSE
        GIT_TAG v1.12.1
        QUIET
)

It did at least work with TF 2.4.
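If the previously fetched flatbuffers checkout still gets picked up after changing the declaration above, one blunt way to force a fresh fetch (assuming the default FetchContent layout, where downloads land in the build directory's _deps/ folder) is to clear it and reconfigure:

# From the build directory: drop cached FetchContent sources, then reconfigure and rebuild
rm -rf _deps
cmake ..
cmake --build .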