dnth / yolov5-deepsparse-blogpost

By the end of this post, you will learn how to:
- Train a SOTA YOLOv5 model on your own data.
- Sparsify the model using SparseML quantization-aware training, sparse transfer learning, and one-shot quantization.
- Export the sparsified model and run it using the DeepSparse engine at insane speeds.
P/S: The end result - YOLOv5 on CPU at 180+ FPS using on

Home Page: https://dicksonneoh.com/portfolio/supercharging_yolov5_180_fps_cpu/


export.py error

dani3l125 opened this issue · comments

Hello,
Thank you for the useful repository!
I am trying to export a model that was quantized and pretrained with DeepSparse to ONNX, but I am getting an error.
The model: yolov5n6

The command: python yolov5-train/export.py --weights yolov5n_sparse.pt --include onnx tflite --imgsz 480 640 --simplify --dynamic

The error:
Traceback (most recent call last):
  File "yolov5-train/export.py", line 715, in <module>
    main(opt)
  File "yolov5-train/export.py", line 704, in main
    run(**vars(opt))
  File "/home/daniel/miniconda3/envs/torch2onnx/lib/python3.8/site-packages/torch/autograd/grad_mode.py", line 28, in decorate_context
    return func(*args, **kwargs)
  File "yolov5-train/export.py", line 593, in run
    model, extras = load_checkpoint(type_='ensemble', weights=weights, device=device)  # load FP32 model
  File "yolov5-train/export.py", line 529, in load_checkpoint
    state_dict = load_state_dict(model, state_dict, run_mode=not ensemble_type, exclude_anchors=exclude_anchors)
  File "yolov5-train/export.py", line 553, in load_state_dict
    model.load_state_dict(state_dict, strict=not run_mode)  # load
  File "/home/daniel/miniconda3/envs/torch2onnx/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1406, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for Model:
	Missing key(s) in state_dict: "model.0.conv.weight", "model.0.bn.weight", "model.0.bn.bias", "model.0.bn.running_mean", "model.0.bn.running_var", "model.1.conv.weight", "model.1.bn.weight", "model.1.bn.bias", "model.1.bn.running_mean", "model.1.bn.running_var", "model.2.cv1.conv.weight", ...
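The "Missing key(s)" failure above can be reproduced in miniature. This is a minimal sketch (not the actual YOLOv5 code): quantization wrappers typically rename parameters in the saved state dict, so a plain model's expected keys are absent and a strict load raises. `strict=False` is PyTorch's built-in way to tolerate the mismatch, at the cost of silently leaving some weights uninitialized.

```python
import torch
import torch.nn as nn

# A plain conv+bn module, standing in for the unwrapped YOLOv5 model.
plain = nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8))

# Simulate a QAT-style checkpoint: wrapper modules change the key names,
# so the plain model's expected keys ("0.weight", "1.weight", ...) are absent.
qat_style_state = {"0.module.weight": torch.zeros(8, 3, 3, 3)}

try:
    plain.load_state_dict(qat_style_state, strict=True)
except RuntimeError as e:
    print("strict load failed:", "Missing key(s)" in str(e))  # True

# strict=False collects the mismatches instead of raising.
result = plain.load_state_dict(qat_style_state, strict=False)
print(result.missing_keys)     # the plain model's keys, never loaded
print(result.unexpected_keys)  # the wrapper-renamed keys
```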

Any support would be much appreciated (:

What about the older yolov5n version? Can you export that?

Exporting the original yolov5n.pt and yolov5n6.pt succeeds.
The sparsified version of the older model - yolov5n - returns a similar error.

It seems the export script expects a state dict that matches the original model's layer names, so it is not compatible with checkpoints whose layers were renamed by quantization.
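To confirm that diagnosis, one way is to diff the checkpoint's keys against the keys the model actually expects. The helper below is hypothetical (not part of this repo); the demo uses a toy model and a fabricated wrapper-renamed checkpoint in place of the real yolov5n_sparse.pt.

```python
import torch.nn as nn

def diff_keys(model: nn.Module, state_dict: dict):
    """Return (keys the model expects but the checkpoint lacks,
    keys the checkpoint has but the model does not expect)."""
    model_keys = set(model.state_dict().keys())
    ckpt_keys = set(state_dict.keys())
    return sorted(model_keys - ckpt_keys), sorted(ckpt_keys - model_keys)

# Demo with a toy model and a wrapper-renamed checkpoint.
toy = nn.Sequential(nn.Linear(4, 2))
renamed_ckpt = {"0.module.weight": None, "0.module.bias": None}
missing, unexpected = diff_keys(toy, renamed_ckpt)
print(missing)     # ['0.bias', '0.weight']
print(unexpected)  # ['0.module.bias', '0.module.weight']
```

If both lists are non-empty and the "unexpected" names are the "missing" names with extra prefixes or suffixes, the checkpoint was saved from a wrapped model and needs to be loaded through the same wrapping code path that saved it.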