DavidHSTai's repositories
leetcode-4
LeetCode Solutions: A Record of My Problem-Solving Journey.
ACM
Templates I compiled from two years of competitive-programming practice.
admm-pruning
Prune DNNs using the Alternating Direction Method of Multipliers (ADMM)
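As a rough illustration of the ADMM pruning scheme (a sketch, not the repo's actual code; the function and variable names here are mine), the weights are trained with a quadratic penalty tying them to an auxiliary copy that is projected onto the sparsity constraint each iteration, while a dual variable accumulates the residual:

```python
import torch

def project_sparse(w, keep_ratio):
    """Keep the largest-magnitude entries of w, zero the rest
    (Euclidean projection onto the sparsity constraint set)."""
    k = max(1, int(w.numel() * keep_ratio))
    threshold = w.abs().flatten().kthvalue(w.numel() - k + 1).values
    return torch.where(w.abs() >= threshold, w, torch.zeros_like(w))

def admm_step(w, z, u, rho, keep_ratio, loss_grad, lr=1e-2):
    """One ADMM iteration for: minimize loss(w) s.t. w is sparse.
    w-update: gradient step on loss + (rho/2)||w - z + u||^2
    z-update: projection of w + u onto the sparse set
    u-update: dual ascent on the constraint w = z.
    loss_grad is a hypothetical callable returning dloss/dw."""
    w = w - lr * (loss_grad(w) + rho * (w - z + u))
    z = project_sparse(w + u, keep_ratio)
    u = u + w - z
    return w, z, u
```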
Attention-Augmented-Conv2d
Implementation of Attention Augmented Convolutional Networks in PyTorch
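A stripped-down, single-head sketch of the idea (the paper and this repo use multi-head attention plus relative position embeddings, both omitted here): ordinary convolution features and global self-attention features are computed in parallel and concatenated along the channel axis.

```python
import torch
import torch.nn as nn

class TinyAAConv2d(nn.Module):
    """Simplified attention-augmented convolution: conv output with
    (out_ch - dv) channels concatenated with dv channels of
    self-attention computed over all spatial positions."""
    def __init__(self, in_ch, out_ch, dk=8, dv=8, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch - dv, kernel_size,
                              padding=kernel_size // 2)
        self.qkv = nn.Conv2d(in_ch, 2 * dk + dv, 1)  # 1x1 conv -> q, k, v
        self.dk, self.dv = dk, dv

    def forward(self, x):
        b, _, h, w = x.shape
        q, k, v = self.qkv(x).split([self.dk, self.dk, self.dv], dim=1)
        q = q.flatten(2).transpose(1, 2)                      # (b, hw, dk)
        k = k.flatten(2)                                      # (b, dk, hw)
        v = v.flatten(2).transpose(1, 2)                      # (b, hw, dv)
        attn = torch.softmax(q @ k / self.dk ** 0.5, dim=-1)  # (b, hw, hw)
        out_attn = (attn @ v).transpose(1, 2).reshape(b, self.dv, h, w)
        return torch.cat([self.conv(x), out_attn], dim=1)     # (b, out_ch, h, w)
```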
Augmentor
Image augmentation library in Python for machine learning.
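Augmentor builds augmentation pipelines declaratively; typical usage looks roughly like this (the "images/" path and the sample count are placeholders):

```python
import Augmentor

# Build an augmentation pipeline over a directory of source images.
p = Augmentor.Pipeline("images/")
p.rotate(probability=0.7, max_left_rotation=10, max_right_rotation=10)
p.zoom(probability=0.5, min_factor=1.1, max_factor=1.5)
p.flip_left_right(probability=0.5)

# Generate 100 augmented samples into the pipeline's output directory.
p.sample(100)
```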
Auto-PyTorch
Automatic architecture search and hyperparameter optimization for PyTorch
DeepLearning-500-questions
500 Questions on Deep Learning: a Q&A-style treatment of frequently asked questions in probability, linear algebra, machine learning, deep learning, computer vision, and other hot topics, written to help the author and any interested readers. The book spans 18 chapters and over 500,000 characters. Given the author's limited expertise, readers are kindly asked to point out any errors. Work in progress... For collaboration, contact scutjy2015@163.com. All rights reserved; violations will be pursued. Tan 2018.06
eng-practices
Google's Engineering Practices documentation
KL-Loss
Bounding Box Regression with Uncertainty for Accurate Object Detection (CVPR'19)
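The paper's regression head predicts a log-variance alpha = log(sigma^2) alongside each box coordinate; a per-coordinate sketch of the resulting smooth-L1-style KL loss, following the paper's formulation (variable names are mine):

```python
import torch

def kl_loss(x_e, x_g, alpha):
    """Per-coordinate KL regression loss. x_e: predicted coordinate,
    x_g: ground-truth coordinate, alpha = log(sigma^2), the predicted
    log-variance of the localization uncertainty."""
    diff = torch.abs(x_g - x_e)
    # Smooth-L1-style switch at |diff| = 1, scaled by exp(-alpha),
    # plus a log-variance regularizer alpha / 2.
    loss = torch.where(diff <= 1.0,
                       torch.exp(-alpha) * 0.5 * diff ** 2,
                       torch.exp(-alpha) * (diff - 0.5))
    return loss + alpha / 2
```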
lihang-code
Code implementations of the algorithms in Li Hang's Statistical Learning Methods (《统计学习方法》)
mmdetection
Open MMLab Detection Toolbox and Benchmark
nn-compression
A PyTorch implementation of neural network compression (pruning, deep compression, channel pruning)
PolarMask
Code for 'PolarMask: Single Shot Instance Segmentation with Polar Representation'
Python
All Algorithms implemented in Python
PytorchInsight
A PyTorch library with state-of-the-art architectures, pretrained models, and continuously updated benchmark results
rotated_maskrcnn
Rotated Mask R-CNN: From Bounding Boxes to Rotated Bounding Boxes
scalabel
Quantify computer vision performance in human terms
sgan
Code for "Social GAN: Socially Acceptable Trajectories with Generative Adversarial Networks", Gupta et al, CVPR 2018
social-lstm
Social LSTM implementation in PyTorch
SparseConvNet
Submanifold sparse convolutional networks
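The library itself is heavily optimized GPU code; conceptually, a submanifold sparse convolution computes outputs only at active input sites and lets only active neighbors contribute, so the sparsity pattern never dilates from layer to layer. A toy sketch under those assumptions (dense Python loops for clarity only):

```python
import torch

def submanifold_conv2d(coords, feats, weight):
    """Toy submanifold convolution on a sparse 2-D grid.
    coords: list of active (y, x) sites, feats: (N, Cin) tensor,
    weight: (K, K, Cin, Cout) kernel with odd K."""
    K = weight.shape[0]
    r = K // 2
    index = {c: i for i, c in enumerate(coords)}
    out = feats.new_zeros(len(coords), weight.shape[3])
    for i, (y, x) in enumerate(coords):      # outputs only at active sites
        for dy in range(-r, r + 1):
            for dx in range(-r, r + 1):
                j = index.get((y + dy, x + dx))
                if j is not None:            # skip inactive (empty) neighbors
                    out[i] += feats[j] @ weight[dy + r, dx + r]
    return out
```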
spconv
Spatial Sparse Convolution in PyTorch
Statistical-Learning-Method_Code
Hand-coded implementations of every algorithm in Li Hang's Statistical Learning Methods (《统计学习方法》)
Stronger-One-stage-detector-with-much-Tricks
Stronger SSD & YOLO v3
TorchSeg
Fast, modular reference implementation and easy training of Semantic Segmentation algorithms in PyTorch.
UPSNet
UPSNet: A Unified Panoptic Segmentation Network
utils.pytorch
Utilities for PyTorch
vim-plugin
The Kite plugin for Vim.