Johnqczhang's repositories
densepose_installation
How to install DensePose with PyTorch (including caffe2) from source code or binaries via conda
awesome-self-supervised-learning
A curated list of awesome self-supervised methods
AdelaiDet
AdelaiDet is an open source toolbox for multiple instance-level detection and recognition tasks.
ByteTrack
ByteTrack: Multi-Object Tracking by Associating Every Detection Box
CenterNet
Object detection, 3D detection, and pose estimation using center point detection.
mmtracking
OpenMMLab Video Perception Toolbox. It supports Single Object Tracking (SOT), Multiple Object Tracking (MOT), Video Object Detection (VID) with a unified framework.
CenterNet-better
An easy-to-understand, better-performing version of CenterNet
cocoapi
COCO API - Dataset @ http://cocodataset.org/
DCNv2
Deformable Convolutional Networks v2 with PyTorch
deep-high-resolution-net.pytorch
The project is an official implementation of our CVPR2019 paper "Deep High-Resolution Representation Learning for Human Pose Estimation"
detectron2
Detectron2 is FAIR's next-generation platform for object detection, segmentation and other visual recognition tasks.
faster_rcnn
Faster R-CNN
HeadHunter
Code for the head detector (HeadHunter) proposed in our CVPR 2021 paper "Tracking Pedestrian Heads in Dense Crowd"
maskrcnn-benchmark
Fast, modular reference implementation of Instance Segmentation and Object Detection algorithms in PyTorch.
mxnet
Lightweight, portable, flexible distributed/mobile deep learning with a dynamic, mutation-aware dataflow dependency scheduler; for Python, R, Julia, Scala, Go, JavaScript and more
py-faster-rcnn
Faster R-CNN (Python implementation) -- see https://github.com/ShaoqingRen/faster_rcnn for the official MATLAB version
resume
My resume, generated with moderncv
SparseR-CNN
End-to-End Object Detection with Learnable Proposals
stanford_dl_ex
Programming exercises for the Stanford Unsupervised Feature Learning and Deep Learning Tutorial
TrackEval
HOTA (and other) evaluation metrics for Multi-Object Tracking (MOT).