Yury Nahshan's repositories
mamba
Mamba SSM architecture
mobilenetv2.pytorch
72.8% top-1 accuracy MobileNetV2 1.0 model on ImageNet and a spectrum of pre-trained MobileNetV2 models
pytorch-qcr
PyTorch library for quantization, calibration, and inference of models
torch-graph
Execution graph library for PyTorch
pytorch-ssd
MobileNetV1, MobileNetV2, and VGG-based SSD/SSD-Lite implementation in PyTorch 1.0 / PyTorch 0.4. Out-of-the-box support for retraining on the Open Images dataset. ONNX and Caffe2 support. Experimental ideas like CoordConv.
training
Reference implementations of training benchmarks
cnn-quantization
Quantization of convolutional neural networks
distiller
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://nervanasystems.github.io/distiller
kaggle_planet_competition
Keras implementation for the Planet: Understanding the Amazon from Space Kaggle competition