ppogg / ncnn-yolov4-int8

NCNN + Int8 + YOLOv4: model quantization and real-time inference

Int8 quantization and inference for YOLOv4-tiny are performed with ncnn as follows; benchmark results are listed below, followed by a minimal inference sketch.

  • Inference

| Equipment | Computing backend | System | Framework | Input size | Run time |
| --- | --- | --- | --- | --- | --- |
| Intel | Core i5-4210 | Windows 10 (x64) | ncnn@fp16 | 320 | 36 ms |
| Intel | Core i5-4210 | Windows 10 (x64) | ncnn@int8 | 320 | 57 ms |
| Raspberry Pi 3B | 4×Cortex-A53 | Linux (arm64) | ncnn@fp16 | 320 | 313 ms |
| Raspberry Pi 3B | 4×Cortex-A53 | Linux (arm64) | ncnn@int8 | 320 | 217 ms |
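
In ncnn, the int8 model is typically produced offline with the ncnn2table (calibration) and ncnn2int8 (weight quantization) tools, and the resulting .param/.bin pair is then loaded like any other ncnn model. The code below is only a minimal sketch of int8 inference with a simple timer around the forward pass; the file names (yolov4-tiny-int8.param/.bin) and the blob names (data, output) are assumptions based on common ncnn YOLOv4 examples, not taken from this repository.

```cpp
// Minimal sketch: load an int8-quantized YOLOv4-tiny model with ncnn and time
// one 320x320 forward pass. File names and blob names are assumptions; adjust
// them to the actual files produced by ncnn2table + ncnn2int8.
#include <chrono>
#include <cstdio>
#include <vector>

#include "net.h"  // ncnn

int main()
{
    ncnn::Net yolov4;
    yolov4.opt.num_threads = 4;
    yolov4.opt.use_int8_inference = true;  // int8 kernels (on by default, set for clarity)

    if (yolov4.load_param("yolov4-tiny-int8.param") != 0 ||
        yolov4.load_model("yolov4-tiny-int8.bin") != 0)
    {
        fprintf(stderr, "failed to load model files\n");
        return -1;
    }

    // Dummy 320x320 BGR image; in a real application this comes from camera/cv::Mat data.
    const int target_size = 320;
    std::vector<unsigned char> bgr(target_size * target_size * 3, 114);

    ncnn::Mat in = ncnn::Mat::from_pixels(bgr.data(), ncnn::Mat::PIXEL_BGR2RGB,
                                          target_size, target_size);
    const float norm_vals[3] = {1 / 255.f, 1 / 255.f, 1 / 255.f};
    in.substract_mean_normalize(0, norm_vals);  // scale pixels to [0, 1]

    auto t0 = std::chrono::steady_clock::now();

    ncnn::Extractor ex = yolov4.create_extractor();
    ex.input("data", in);       // assumed input blob name
    ncnn::Mat out;
    ex.extract("output", out);  // assumed output blob name

    auto t1 = std::chrono::steady_clock::now();
    double ms = std::chrono::duration<double, std::milli>(t1 - t0).count();
    printf("inference: %.1f ms, output %d x %d\n", ms, out.w, out.h);
    return 0;
}
```

The "Run time" column above corresponds to this kind of single forward pass at input size 320.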

Chinese tutorial: https://zhuanlan.zhihu.com/p/368653551

Chinese tutorial: https://zhuanlan.zhihu.com/p/372278785

  • Note

Please pay attention to the following code changes.
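
The actual changes are shown in the repository sources and in the tutorials linked above; they are not reproduced here. As a rough, hedged illustration only, switching an ncnn example between the fp16 and int8 rows of the table usually comes down to loading the quantized files produced by ncnn2table + ncnn2int8 and toggling the int8 option; all file names below are placeholders.

```cpp
// Hedged illustration only - not the repository's actual code changes.
// Loads either the fp16 or the int8-quantized YOLOv4-tiny model; file names
// are placeholders for whatever ncnn2int8 produced.
#include "net.h"  // ncnn

static int load_yolov4_tiny(ncnn::Net& net, bool use_int8)
{
    net.opt.use_int8_inference = use_int8;  // enable/disable int8 kernels
    const char* param = use_int8 ? "yolov4-tiny-int8.param" : "yolov4-tiny-fp16.param";
    const char* bin   = use_int8 ? "yolov4-tiny-int8.bin"   : "yolov4-tiny-fp16.bin";

    if (net.load_param(param) != 0 || net.load_model(bin) != 0)
        return -1;  // loading failed
    return 0;
}
```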

Languages

Language: C++ 100.0%