ARM-software / ML-examples

Arm Machine Learning tutorials and examples

Home page: https://developer.arm.com/technologies/machine-learning-on-arm

Retraining of the KWS model

mesut92 opened this issue

Hi
I am trying to create a new KWS model. I trained a model using this notebook:
https://github.com/tensorflow/tflite-micro/blob/main/tensorflow/lite/micro/examples/micro_speech/train/train_micro_speech_model.ipynb

I moved the weights into the nn_model variable in kws_micronet_m.tflite.cpp, and I changed Labels.cpp.
I am using Keil Studio for deployment. The pretrained model works for me.
My retrained model did not work: I deployed it to the MCU, but the KWS application did not start. How can I train with new keywords?

MCU model: F46

See here for training code for a compatible model.

It is possible that the micro speech model needs different input features from those expected by the existing kws_micronet_m model (and the ds_cnn model linked above). So if the input sizes do not match, there will be issues running the application.
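
If it helps, a quick way to compare the two models is to load each .tflite file with the TensorFlow Lite interpreter in Python and print the input and output tensor details (the file names below are placeholders for your actual files):

import tensorflow as tf

# Compare the input/output tensor shapes of the pretrained model and the
# retrained one; a mismatch here would explain failures in the application.
for path in ["kws_micronet_m.tflite", "ds_cnn_quantized.tflite"]:
    interpreter = tf.lite.Interpreter(model_path=path)
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    print(path, "input:", inp["shape"], inp["dtype"], "output:", out["shape"], out["dtype"])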

I used these scripts to train the model (medium version). It did not work. I will check the feature size.

Hi Richard
I trained the ds_cnn model with this configuration (dct_coefficient_count=10):
python train.py --model_architecture ds_cnn --model_size_info 5 172 10 4 2 1 172 3 3 2 2 172 3 3 1 1 172 3 3 1 1 172 3 3 1 1 --dct_coefficient_count 10 --window_size_ms 40 --window_stride_ms 20 --learning_rate 0.0005,0.0001,0.00002 --how_many_training_steps 10000,10000,10000 --summaries_dir work/DS_CNN/DS_CNN_M/retrain_logs --train_dir work/DS_CNN/DS_CNN_M/training
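
As a rough sanity check of the input size those flags imply (this is only back-of-the-envelope arithmetic, and it assumes the 1000 ms clips at 16 kHz that the speech_commands-style training scripts use by default):

# Assumed defaults: 1000 ms clips sampled at 16 kHz, as in the speech_commands setup.
sample_rate = 16000
clip_duration_ms = 1000
window_size_ms = 40
window_stride_ms = 20
dct_coefficient_count = 10

desired_samples = sample_rate * clip_duration_ms // 1000        # 16000
window_size_samples = sample_rate * window_size_ms // 1000      # 640
window_stride_samples = sample_rate * window_stride_ms // 1000  # 320
frames = 1 + (desired_samples - window_size_samples) // window_stride_samples  # 49
fingerprint_size = frames * dct_coefficient_count                # 490

print(frames, fingerprint_size)  # 49 490

That figure should line up with the input shape the interpreter reports for the model you actually deploy.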

And I moved the weights into this file:

ARM-software/ML-examples/tree/main/cmsis-pack-examples/kws/src/kws_micronet_m.tflite.cpp

But it did not work. Is it possible to share how you produced the pretrained model? I can use the pretrained model, but the others do not work.

When you say it doesn't work, do you mean that the results of the model, when used in the application, are not what you expect? If that is the case, then it might be caused by the labels vector here.

The pretrained MicroNet model's label order is different from that of the training scripts. Try changing your labels in the application to this order and see if that helps.

Nope. I trained the ds_cnn model from the ML Zoo with the same labels and the same output size. I generated the .tflite file and converted it with this:
https://github.com/thodoxuan99/KWS_MCU/blob/main/kws_cortex_m/tflite_to_tflu.py
I moved the weights to this place:
ARM-software/ML-examples/tree/main/cmsis-pack-examples/kws/src/kws_micronet_m.tflite.cpp

It did not generate anything: empty screen, no output shown. There was no error when building the project. I did not check the label order, because the application did not produce any output.

Would you be able to share the tflite file, and I can try replicating the issue?

Okay, I converted your model with

xxd -i ds_cnn_quantized.tflite > model_data.cc and copied the contents of the array into ARM-software/ML-examples/tree/main/cmsis-pack-examples/kws/src/kws_micronet_m.tflite.cpp, overwriting the existing model data that is there.
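
(As an aside, if xxd is not available, a rough Python stand-in for that step could look like the sketch below; the printed variable name is just a placeholder, and only the bytes between the braces should be pasted into the existing nn_model array.)

# Reads the .tflite flatbuffer and prints a C-style byte array,
# roughly what "xxd -i" produces.
with open("ds_cnn_quantized.tflite", "rb") as f:
    data = f.read()

rows = []
for i in range(0, len(data), 12):
    rows.append("  " + ", ".join(f"0x{b:02x}" for b in data[i:i + 12]) + ",")

print(f"const unsigned char model_data[{len(data)}] = {{")
print("\n".join(rows))
print("};")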

It builds but I get the following output when running on Keil Studio Cloud.

INFO - Added support to op resolver
INFO - Creating allocator using tensor arena at 0x31000000
INFO - Allocating tensors
ERROR - tensor allocation failed!
ERROR - Failed to initialise model

Running it again locally, I believe this is caused by the Softmax operator in your model, which isn't present in the pretrained MicroNet one. TFLite Micro needs to be told which operators are present in the model for it to work, otherwise it will throw an error.
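
If you want to check which operators a .tflite file actually contains (and therefore which ones the application has to register), recent TensorFlow releases include a model analyzer; this sketch assumes a version that ships tf.lite.experimental.Analyzer:

import tensorflow as tf

# Lists the operators used by each model; any extra op in the retrained
# model (e.g. SOFTMAX) is one the application's op resolver must also add.
tf.lite.experimental.Analyzer.analyze(model_path="kws_micronet_m.tflite")
tf.lite.experimental.Analyzer.analyze(model_path="ds_cnn_quantized.tflite")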

I manually enlisted this operator by editing the local cmsis pack at ~/.cache/arm/packs/ARM/ml-embedded-eval-kit-uc-api/22.8.0-Beta/source/application/api/use_case/kws/src/MicroNetKwsModel.cc, and also edited these 2 lines in the main (as those numbers don't align with your model's input shape). I also had to edit ~/.cache/arm/packs/ARM/ml-embedded-eval-kit-uc-api/22.8.0-Beta/source/application/api/use_case/kws/src/KwsProcessing.cc to change the useSoftmax parameter within the DoPostProcess() function to false.

Inference now runs, albeit with the wrong result, which may just be down to the new model itself.

Ideally we should have some way for a user to manually enlist new operators via the API, in case they have changed the model like you have done. We have a task to do this, but I am not sure when it will be completed.

You can make local edits to the cmsis-packs yourself as well to get things working, but this is probably not a sustainable solution. Instead, I think you have two ways to go forward:

  1. You can adjust the retrained model so it doesn't have a softmax at the end when you produce your tflite file; this way you don't need to edit the cmsis packs (see the sketch after this list).
  2. Switch to using the ML-embedded-evaluation-kit, which is what this cmsis-pack example is based on (and where the cmsis packs come from). Swapping to it will allow you to more easily modify the KWS use case, change models, etc., or even generate new cmsis-packs if you wish.
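
For option 1, a minimal sketch of what that could look like, assuming a Keras model whose final layer is a standalone softmax activation (the file names, and the assumption that the softmax is the last layer, are mine; the real quantization settings from your training flow would still need to be applied):

import tensorflow as tf

# Assumes the trained Keras model ends in a separate softmax layer;
# rebuild a model that stops at the logits and convert that instead.
model = tf.keras.models.load_model("ds_cnn_trained.h5")            # placeholder path
logits_model = tf.keras.Model(inputs=model.input,
                              outputs=model.layers[-2].output)     # drop the final softmax

converter = tf.lite.TFLiteConverter.from_keras_model(logits_model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]               # keep your existing quantization setup
tflite_model = converter.convert()

with open("ds_cnn_no_softmax.tflite", "wb") as f:
    f.write(tflite_model)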