YOLO-Nano

A new version of YOLO-Nano, inspired by NanoDet.

In this project, you can enjoy:

  • a different version of YOLO-Nano

Network

This is a different version of YOLO-Nano built with PyTorch (a minimal sketch of the architecture is given after the list):

  • Backbone: ShuffleNet-v2
  • Neck: a very lightweight FPN+PAN
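
Below is a minimal sketch, not the repo's exact code, of how such a network can be assembled in PyTorch. It assumes torchvision's shufflenet_v2_x1_0 as the backbone and plain 1x1/3x3 convolutions for the lightweight FPN+PAN neck; the output width (96 channels) and layer choices are illustrative assumptions.

# A minimal sketch: ShuffleNet-v2 backbone + a very light FPN+PAN neck (assumed, not the repo's code).
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import shufflenet_v2_x1_0

class Backbone(nn.Module):
    """ShuffleNet-v2 backbone returning C3/C4/C5 feature maps (strides 8/16/32)."""
    def __init__(self, pretrained=True):
        super().__init__()
        net = shufflenet_v2_x1_0(pretrained=pretrained)
        self.stem = nn.Sequential(net.conv1, net.maxpool)
        self.stage2, self.stage3, self.stage4 = net.stage2, net.stage3, net.stage4

    def forward(self, x):
        x = self.stem(x)
        c3 = self.stage2(x)   # stride 8,  116 channels for the 1.0x model
        c4 = self.stage3(c3)  # stride 16, 232 channels
        c5 = self.stage4(c4)  # stride 32, 464 channels
        return c3, c4, c5

class LightFPNPAN(nn.Module):
    """Very lightweight FPN (top-down) + PAN (bottom-up) built from 1x1/3x3 convs."""
    def __init__(self, in_chs=(116, 232, 464), out_ch=96):
        super().__init__()
        self.lat = nn.ModuleList([nn.Conv2d(c, out_ch, 1) for c in in_chs])
        self.down = nn.ModuleList([nn.Conv2d(out_ch, out_ch, 3, stride=2, padding=1) for _ in range(2)])

    def forward(self, c3, c4, c5):
        # top-down path (FPN)
        p5 = self.lat[2](c5)
        p4 = self.lat[1](c4) + F.interpolate(p5, scale_factor=2.0, mode='nearest')
        p3 = self.lat[0](c3) + F.interpolate(p4, scale_factor=2.0, mode='nearest')
        # bottom-up path (PAN)
        n3 = p3
        n4 = p4 + self.down[0](n3)
        n5 = p5 + self.down[1](n4)
        return n3, n4, n5  # multi-scale features fed to the detection heads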

Train

  • Batch size: 32
  • Base LR: 1e-3
  • Max epoch: 120
  • LR steps: 60, 90
  • Optimizer: SGD (see the sketch after this list)
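
A minimal sketch of the optimizer and LR schedule implied by these settings; the momentum and weight-decay values are assumptions, not taken from the repo.

# Sketch of the training schedule above (momentum/weight decay are assumed values).
import torch
import torch.nn as nn

model = nn.Conv2d(3, 16, 3)  # stand-in for the YOLO-Nano model
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3,
                            momentum=0.9, weight_decay=5e-4)
scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[60, 90], gamma=0.1)

for epoch in range(120):
    # ... run one training epoch here (batch size 32) ...
    scheduler.step()  # LR drops by 10x after epochs 60 and 90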

The overview of my YOLO-Nano (architecture figure).

Experiment

Environment:

  • Python 3.6, opencv-python, PyTorch 1.1.0, CUDA 10.0, cuDNN 7.5
  • For training: Intel i9-9940k, RTX-2080ti

VOC:

YOLO-Nano-1.0x:

Dataset      Input size  mAP (%)
VOC07 test   320         65.0
VOC07 test   416         69.1
VOC07 test   608         70.8

COCO:

Dataset    Input size  AP    AP50  AP75  AP_S  AP_M  AP_L
COCO eval  320         17.2  33.1  16.2  2.6   16.0  31.7
COCO eval  416         19.6  36.9  18.6  4.6   19.1  33.3
COCO eval  608         20.6  38.6  19.5  7.0   22.5  30.7

YOLO-Nano-0.5x:

Coming soon ...

Visualization

On COCO-val

The overview of my YOLO-Nano (example detection images).

Installation

  • PyTorch (GPU) 1.1.0/1.2.0/1.3.0
  • TensorBoard 1.14
  • opencv-python, Python 3.6/3.7 (a quick environment check is sketched below)
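
A quick environment check, just a sketch and not part of the repo:

import sys
import torch
import cv2

print("Python :", sys.version.split()[0])
print("PyTorch:", torch.__version__)
print("OpenCV :", cv2.__version__)
print("CUDA available:", torch.cuda.is_available())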

Dataset

VOC Dataset

I copied the download scripts from the following excellent project: https://github.com/amdegroot/ssd.pytorch

I have uploaded VOC2007 and VOC2012 to BaiDuYunDisk, so researchers in China can download them from there:

Link:https://pan.baidu.com/s/1tYPGCYGyC0wjpC97H-zzMQ

Password:4la9

You will get VOCdevkit.zip; just unzip it and put it into data/. After that, the full paths to the VOC datasets are data/VOCdevkit/VOC2007 and data/VOCdevkit/VOC2012.
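
A small sketch to confirm that layout before training:

import os

for year in ("VOC2007", "VOC2012"):
    path = os.path.join("data", "VOCdevkit", year)
    print(path, "->", "found" if os.path.isdir(path) else "MISSING")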

Download VOC2007 trainval & test

# specify a directory for dataset to be downloaded into, else default is ~/data/
sh data/scripts/VOC2007.sh # <directory>

Download VOC2012 trainval

# specify a directory for dataset to be downloaded into, else default is ~/data/
sh data/scripts/VOC2012.sh # <directory>

MSCOCO Dataset

I copied the download scripts from the following excellent project: https://github.com/DeNA/PyTorch_YOLOv3

Download MSCOCO 2017 dataset

Just run sh data/scripts/COCO2017.sh. You will get COCO train2017, val2017, test2017.
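
A similar sanity check for COCO; the actual target directory is set inside data/scripts/COCO2017.sh, and data/COCO/ below is only an assumption, so adjust the path if needed:

import os

for split in ("train2017", "val2017", "test2017"):
    path = os.path.join("data", "COCO", split)
    n = len(os.listdir(path)) if os.path.isdir(path) else 0
    print(path, "->", "%d files" % n if n else "MISSING")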

Train

VOC

python train.py -d voc --cuda -v [select a model] -ms

You can run python train.py -h to check all optional arguments.

COCO

python train.py -d coco --cuda -v [select a model] -ms

Test

VOC

python test.py -d voc --cuda -v [select a model] --trained_model [ Please input the path to model dir. ]

COCO

python test.py -d coco-val --cuda -v [select a model] --trained_model [ Please input the path to model dir. ]

Evaluation

VOC

python eval.py -d voc --cuda -v [select a model] --train_model [ Please input the path to model dir. ]

COCO

To run on COCO_val:

python eval.py -d coco-val --cuda -v [select a model] --train_model [ Please input the path to model dir. ]

To run on COCO test-dev (make sure you have downloaded test2017):

python eval.py -d coco-test --cuda -v [select a model] --train_model [ Please input the path to model dir. ]

You will get a .json file which can be evaluated on the COCO test server.
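
The test-dev .json has to be submitted to the server, but a detection .json produced on val2017 can also be scored locally with pycocotools. This is a generic sketch with hypothetical file names, not the repo's eval code:

from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

coco_gt = COCO("instances_val2017.json")             # ground-truth annotations (hypothetical path)
coco_dt = coco_gt.loadRes("yolo_nano_results.json")  # detections written by eval.py (hypothetical name)
coco_eval = COCOeval(coco_gt, coco_dt, "bbox")
coco_eval.evaluate()
coco_eval.accumulate()
coco_eval.summarize()  # prints AP, AP50, AP75, AP_S/M/L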
