bareblackfoot / repulsion_loss

Tensorflow implementation of Repulsion Loss.


Repulsion Loss: Detecting Pedestrians in a Crowd

A Tensorflow implementation of Repulsion Loss by Nuri Kim. This repository is based on the Faster R-CNN implementation available here.
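In the paper, the repulsion loss augments the standard attraction (regression) term with terms that push a predicted box away from ground-truth boxes other than its own target. The RepGT term applies a smoothed log penalty to the intersection-over-ground-truth (IoG) between each positive proposal's predicted box and its most-overlapping non-target ground truth. Below is a minimal TensorFlow 1.x sketch of that term; the tensor names, shapes, and the sigma default are illustrative and do not reflect this repository's actual interface.

```python
import tensorflow as tf

def smooth_ln(x, sigma=0.5):
    # Smooth_ln from the paper: -ln(1 - x) for x <= sigma, linear beyond sigma.
    return tf.where(
        x <= sigma,
        -tf.log(1.0 - x),
        (x - sigma) / (1.0 - sigma) - tf.log(1.0 - sigma))

def iog(pred_boxes, gt_boxes):
    # Intersection over ground-truth area for matched (N, 4) box pairs in (x1, y1, x2, y2).
    x1 = tf.maximum(pred_boxes[:, 0], gt_boxes[:, 0])
    y1 = tf.maximum(pred_boxes[:, 1], gt_boxes[:, 1])
    x2 = tf.minimum(pred_boxes[:, 2], gt_boxes[:, 2])
    y2 = tf.minimum(pred_boxes[:, 3], gt_boxes[:, 3])
    inter = tf.maximum(x2 - x1, 0.0) * tf.maximum(y2 - y1, 0.0)
    gt_area = (gt_boxes[:, 2] - gt_boxes[:, 0]) * (gt_boxes[:, 3] - gt_boxes[:, 1])
    return inter / tf.maximum(gt_area, 1e-10)

def repgt_loss(pred_boxes, rep_gt_boxes, sigma=0.5):
    # rep_gt_boxes[i] is assumed to be the ground truth with the largest IoU
    # against proposal i, excluding that proposal's designated target; the
    # selection itself happens outside this sketch. Averaging over the positive
    # proposals approximates the 1/|P+| normalization in the paper.
    return tf.reduce_mean(smooth_ln(iog(pred_boxes, rep_gt_boxes), sigma))
```

The full objective in the paper also includes a RepBox term that repels predicted boxes assigned to different targets from one another; the results below report the RepGT variant.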

Performance

Trained on the VOC 0712 trainval set and tested on the VOC 2007 test set. As in the paper, ResNet-101 is used as the backbone network. The crowd subsets consist of images containing at least one object that overlaps another object of the same category above the given threshold.
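The crowd-subset criterion can be made concrete with a short sketch. The following NumPy snippet is illustrative only (it is not this repository's evaluation code) and assumes the overlap measure is pairwise IoU between same-class ground-truth boxes:

```python
import numpy as np

def pairwise_iou(boxes):
    # IoU matrix for an (N, 4) array of boxes in (x1, y1, x2, y2) format.
    x1 = np.maximum(boxes[:, None, 0], boxes[None, :, 0])
    y1 = np.maximum(boxes[:, None, 1], boxes[None, :, 1])
    x2 = np.minimum(boxes[:, None, 2], boxes[None, :, 2])
    y2 = np.minimum(boxes[:, None, 3], boxes[None, :, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    areas = (boxes[:, 2] - boxes[:, 0]) * (boxes[:, 3] - boxes[:, 1])
    union = areas[:, None] + areas[None, :] - inter
    return inter / np.maximum(union, 1e-10)

def is_crowd_image(gt_boxes, gt_classes, thresh):
    # An image belongs to the crowd subset for a threshold if any two
    # ground-truth boxes of the same class overlap with IoU above it.
    for c in np.unique(gt_classes):
        cls_boxes = gt_boxes[gt_classes == c]
        if len(cls_boxes) < 2:
            continue
        iou = pairwise_iou(cls_boxes)
        np.fill_diagonal(iou, 0.0)  # ignore self-overlap
        if iou.max() > thresh:
            return True
    return False
```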

| Method | mAP | mAP on Crowd (>0.0) | mAP on Crowd (>0.1) | mAP on Crowd (>0.2) | mAP on Crowd (>0.3) | mAP on Crowd (>0.4) |
|---|---|---|---|---|---|---|
| Faster R-CNN | 79.7% | 72.8% | 68.9% | 66.2% | 63.9% | 62.0% |
| Faster R-CNN + RepGT | 79.9% | 72.8% | 72.8% | 71.4% | 67.4% | 61.6% |

Prerequisites

  • A basic Tensorflow installation. I used tensorflow 1.7.
  • Python packages you might not have: cython, opencv-python, easydict (similar to py-faster-rcnn).

Installation

  1. Clone the repository
git clone https://github.com/bareblackfoot/repulsion_loss.git
  2. Build the Cython modules
make clean
make
cd ..
  3. Install the Python COCO API. The code requires the API to access the COCO dataset.
cd data
git clone https://github.com/pdollar/coco.git
cd coco/PythonAPI
make
cd ../../..

Setup data

Please follow the instructions of py-faster-rcnn here to set up the VOC and COCO datasets (part of the COCO setup is already done above). The steps involve downloading the data and optionally creating soft links in the data folder. Since Faster R-CNN does not rely on pre-computed proposals, it is safe to ignore the steps that set up proposals.

Test with pre-trained models

  1. Download the pre-trained model
  • Google drive here.
  2. Create a folder and a soft link to use the pre-trained model
NET=res101
TRAIN_IMDB=voc_2007_trainval+voc_2012_trainval
mkdir -p output/${NET}/${TRAIN_IMDB}
cd output/${NET}/${TRAIN_IMDB}
ln -s ../../../data/voc_2007_trainval+voc_2012_trainval ./default
cd ../../..
  3. Test with the pre-trained ResNet-101 model
GPU_ID=0
./experiments/scripts/test_repulsionloss.sh $GPU_ID pascal_voc_0712 res101

Train your own model

  1. Download pre-trained models and weights. The current code supports VGG16 and ResNet V1 models. Pre-trained models are provided by slim; you can get them here and place them in the data/imagenet_weights folder. For example, for the VGG16 model:

    mkdir -p data/imagenet_weights
    cd data/imagenet_weights
    wget -v http://download.tensorflow.org/models/vgg_16_2016_08_28.tar.gz
    tar -xzvf vgg_16_2016_08_28.tar.gz
    mv vgg_16.ckpt vgg16.ckpt
    cd ../..

    For ResNet-101, set it up like this:

    mkdir -p data/imagenet_weights
    cd data/imagenet_weights
    wget -v http://download.tensorflow.org/models/resnet_v1_101_2016_08_28.tar.gz
    tar -xzvf resnet_v1_101_2016_08_28.tar.gz
    mv resnet_v1_101.ckpt res101.ckpt
    cd ../..
  2. Train (and test/evaluate)

./experiments/scripts/train_repulsionloss.sh [GPU_ID] [DATASET] [NET]
# GPU_ID is the GPU you want to test on
# NET in {vgg16, res50, res101, res152} is the network arch to use
# DATASET {pascal_voc, pascal_voc_0712, coco} is defined in train_repulsionloss.sh
# Examples:
./experiments/scripts/train_repulsionloss.sh 0 pascal_voc vgg16
./experiments/scripts/train_repulsionloss.sh 1 coco res101
  3. Visualization with TensorBoard
tensorboard --logdir=tensorboard/vgg16/voc_2007_trainval/ --port=7001 &
tensorboard --logdir=tensorboard/vgg16/coco_2014_train+coco_2014_valminusminival/ --port=7002 &
  4. Test and evaluate
./experiments/scripts/test_repulsionloss.sh [GPU_ID] [DATASET] [NET]
# GPU_ID is the GPU you want to test on
# NET in {vgg16, res50, res101, res152} is the network arch to use
# DATASET {pascal_voc, pascal_voc_0712, coco} is defined in test_repulsionloss.sh
# Examples:
./experiments/scripts/test_repulsionloss.sh 0 pascal_voc vgg16
./experiments/scripts/test_repulsionloss.sh 1 coco res101
  5. You can use tools/reval.sh for re-evaluation

By default, trained networks are saved under:

output/[NET]/[DATASET]/default/

Test outputs are saved under:

output/[NET]/[DATASET]/default/[SNAPSHOT]/

TensorBoard logs for training and validation are saved under:

tensorboard/[NET]/[DATASET]/default/
tensorboard/[NET]/[DATASET]/default_val/

License

MIT License

