sophgo / tpu-mlir

Machine learning compiler based on MLIR for Sophgo TPU.

Error converting MLIR to F16

SweatLin opened this issue

Traceback (most recent call last):
  File "/workspace/tpu-mlir/python/tools/model_deploy.py", line 329, in <module>
    tool.lowering()
  File "/workspace/tpu-mlir/python/tools/model_deploy.py", line 122, in lowering
    mlir_lowering(self.mlir_file, self.tpu_mlir, self.quantize, self.chip, self.cali_table,
  File "/workspace/tpu-mlir/python/utils/mlir_shell.py", line 103, in mlir_lowering
    _os_system(cmd)
  File "/workspace/tpu-mlir/python/utils/mlir_shell.py", line 50, in _os_system
    raise RuntimeError("[!Error]: {}".format(cmd_str))
RuntimeError: [!Error]: tpuc-opt yolov5s_231201.mlir --processor-assign="chip=bm1684" --processor-top-optimize --convert-top-to-tpu="mode=F16 asymmetric=False doWinograd=False ignore_f16_overflow=False q_group_size=0" --canonicalize --weight-fold -o yolov5s_bm1684_f16_tpu.mlir

I ran into the same error and can't resolve it.

I ran into the same error and can't resolve it.

You can lower to F32 instead; F16 is only supported on the BM1684X.
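For reference, a minimal redeploy sketch following that advice. The flags below (--mlir, --quantize, --chip, --model) are standard model_deploy.py options, the input file name is taken from the traceback above, and the output .bmodel names are illustrative; exact options may vary across tpu-mlir versions.

# Option A: stay on BM1684 and lower to F32 (F16 is not supported there)
model_deploy.py \
    --mlir yolov5s_231201.mlir \
    --quantize F32 \
    --chip bm1684 \
    --model yolov5s_bm1684_f32.bmodel

# Option B: keep F16 but target BM1684X, which supports it
model_deploy.py \
    --mlir yolov5s_231201.mlir \
    --quantize F16 \
    --chip bm1684x \
    --model yolov5s_bm1684x_f16.bmodel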

I ran into the same error and can't resolve it.
You can lower to F32 instead; F16 is only supported on the BM1684X.

raise RuntimeError("[!Error]: {}".format(cmd_str))
RuntimeError: [!Error]: tpuc-opt mobile_sam_bm1684x_f16_final.mlir --codegen="model_file=mobile_sam_fp16.bmodel embed_debug_info=false model_version=latest" -o /dev/null

This is the error I'm seeing. Converting to FP32 fails the same way, so both modes fail on the BM1684X. I'm not sure whether the problem is in tpu-mlir or in the ONNX model.
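Before pinning this on tpu-mlir, one cheap check is to validate the ONNX export itself with the onnx package's checker. A minimal sketch, assuming the exported file is named mobile_sam.onnx (the actual file name isn't shown in this thread):

# Sanity-check the ONNX export; raises if the graph is structurally malformed
python -c "import onnx; m = onnx.load('mobile_sam.onnx'); onnx.checker.check_model(m); print('ONNX graph passes the checker')"

If the checker passes, the failure is more likely on the tpu-mlir side, which makes sharing the model (as offered below) the natural next step.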

If it's convenient, you could send me the model and I'll take a look.