Dave Turner's starred repositories
django-cacheops
A slick ORM cache with automatic granular event-driven invalidation.
django-cachalot
No effort, no worry, maximum performance.
graphql-markdown
Flexible GraphQL Documentation Generator (Markdown)
carbon-lang
Carbon Language's main repository: documents, design, implementation, and related tools. (NOTE: Carbon Language is experimental; see README)
AI_for_Scheduling
This is the code for "AI for Scheduling" by Siraj Raval on YouTube
From-0-to-Research-Scientist-resources-guide
A detailed, tailored guide for undergraduate students or anybody who wants to dig deep into the field of AI with a solid foundation.
Awesome-pytorch-list
A comprehensive list of PyTorch-related content on GitHub, such as different models, implementations, helper libraries, tutorials, etc.
awesome-nlp
:book: A curated list of resources dedicated to Natural Language Processing (NLP)
NYU-DLSP20
NYU Deep Learning Spring 2020
Early-Bird-Tickets
[ICLR 2020] Drawing Early-Bird Tickets: Toward More Efficient Training of Deep Networks
torchprofile
A general and accurate MACs / FLOPs profiler for PyTorch models
neural-network-pruning-and-sparsification
TensorFlow implementation of weight and unit pruning and sparsification
A-_Guide_-to_Data_Sciecne_from_mathematics
A blueprint for data science, from the mathematics to the algorithms. It is a work in progress.
pytorch-OpCounter
Count the MACs / FLOPs of your PyTorch model.
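Both torchprofile and pytorch-OpCounter report MACs (multiply-accumulate operations) per layer. Not taken from either repo's code, here is a back-of-the-envelope sketch of what such a count means for two common layer types (assumptions: bias terms and activations are ignored, as profilers typically count only the multiply-accumulates):

```python
# Hypothetical helpers illustrating per-layer MAC counts (not the actual
# API of torchprofile or pytorch-OpCounter).

def linear_macs(in_features: int, out_features: int) -> int:
    """Each output unit performs one multiply-accumulate per input unit."""
    return in_features * out_features

def conv2d_macs(c_in: int, c_out: int, k: int, h_out: int, w_out: int) -> int:
    """A k x k kernel over c_in channels, applied at every output position,
    for each of the c_out filters (stride/padding already folded into
    h_out and w_out)."""
    return k * k * c_in * c_out * h_out * w_out

# e.g. a 512 -> 1000 classifier head, and a 3x3 conv from 3 to 64 channels
# on a 224x224 output map:
head = linear_macs(512, 1000)
stem = conv2d_macs(3, 64, 3, 224, 224)
```

Real profilers hook into the model's forward pass to read the actual tensor shapes, so they handle strides, padding, and dynamic inputs that a hand count like this would miss.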
Efficient-Computing
Efficient computing methods developed by Huawei Noah's Ark Lab
knowledge-distillation-papers
A collection of knowledge distillation papers
ghostnet.pytorch
[CVPR2020] GhostNet: More Features from Cheap Operations
awesome-AutoML-and-Lightweight-Models
A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.
knowledge-distillation-pytorch
A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility
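The distillation repos above all build on the same core idea: the student matches the teacher's temperature-softened output distribution. As a minimal NumPy sketch (not code from any of these repos; the T² scaling follows Hinton et al., 2015):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax; higher T flattens the distribution."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """Cross-entropy between the teacher's soft targets and the student's
    softened predictions, scaled by T^2 so gradients keep a comparable
    magnitude as T varies."""
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student's softened distribution
    return -float(T * T * np.sum(p * np.log(q + 1e-12)))
```

In practice this term is combined with the ordinary hard-label cross-entropy via a weighting factor, which is the main knob these KD implementations expose.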