RAPiD: Rotation-Aware People Detection in Overhead Fisheye Images (CVPR 2020 Workshops)

Home Page: http://vip.bu.edu/rapid/

RAPiD

This repository is the official PyTorch implementation of the following paper. Our code can reproduce the training and testing results reported in the paper.

RAPiD: Rotation-Aware People Detection in Overhead Fisheye Images
[arXiv paper] [Project page]

Installation

Requirements: the code should work with any recent versions of the following packages: PyTorch, torchvision, pycocotools, tqdm, and OpenCV.

An example of installation on Linux with CUDA 10.1 and Conda:

conda create --name RAPiD_env python=3.7
conda activate RAPiD_env

conda install pytorch torchvision cudatoolkit=10.1 -c pytorch
conda install -c conda-forge pycocotools
conda install tqdm opencv

# cd the_folder_to_install
git clone https://github.com/duanzhiihao/RAPiD.git

A minimal guide to testing on a single image

  1. Clone the repository
  2. Download the pre-trained network weights and place them under the RAPiD/weights folder.
  3. Run python example.py directly. Alternatively, demo.ipynb walks through the same example in a Jupyter notebook.
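RAPiD predicts rotation-aware bounding boxes, i.e. boxes parameterized by center, size, and angle rather than axis-aligned corners. As an illustration of what that representation means (this helper is not part of the repository's API; the function name and parameterization here are our own), a minimal sketch converting a (cx, cy, w, h, angle) box to its four corner points:

```python
import math

def rotated_box_corners(cx, cy, w, h, angle_deg):
    """Return the four (x, y) corner points of a rotated bounding box.

    The box is given as (center_x, center_y, width, height, angle),
    the kind of representation used by rotation-aware detectors.
    Illustrative only; not the RAPiD API.
    """
    a = math.radians(angle_deg)
    cos_a, sin_a = math.cos(a), math.sin(a)
    # Corner offsets in the box's local (unrotated) frame
    offsets = [(-w / 2, -h / 2), (w / 2, -h / 2),
               (w / 2,  h / 2), (-w / 2,  h / 2)]
    # Rotate each offset by the box angle, then translate to the center
    return [(cx + dx * cos_a - dy * sin_a,
             cy + dx * sin_a + dy * cos_a)
            for dx, dy in offsets]

corners = rotated_box_corners(100, 100, 40, 20, 90)
```

For people standing at different positions under an overhead fisheye camera, the same person appears at different orientations, which is why an angle term is part of the box parameterization.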

Evaluation on CEPDOF

TBD

Training on COCO

TBD

Fine-tuning on fisheye image datasets

TBD

TODO

  • Update README

Citation

The RAPiD source code is available for non-commercial use. If you find our code and dataset useful, or publish any work reporting results obtained with this source code, please consider citing our paper:

Z. Duan, M.O. Tezcan, H. Nakamura, P. Ishwar and J. Konrad, 
“RAPiD: Rotation-Aware People Detection in Overhead Fisheye Images”, 
in IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 
Omnidirectional Computer Vision in Research and Industry (OmniCV) Workshop, June 2020.
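For convenience, the citation above can be written as a BibTeX entry (the citation key and field layout below are our own formatting, not taken from the repository):

```bibtex
@inproceedings{duan2020rapid,
  author    = {Duan, Z. and Tezcan, M. O. and Nakamura, H. and Ishwar, P. and Konrad, J.},
  title     = {{RAPiD}: Rotation-Aware People Detection in Overhead Fisheye Images},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR),
               Omnidirectional Computer Vision in Research and Industry (OmniCV) Workshop},
  month     = {June},
  year      = {2020}
}
```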
