Jeff Rasley's repositories
Megatron-LM
Ongoing research on training transformer language models at scale, including BERT and GPT-2
se3-transformer-public
Code for the SE(3)-Transformers paper: https://arxiv.org/abs/2006.10503
accelerate
🚀 A simple way to train and use PyTorch models with multi-GPU, TPU, and mixed-precision support
ColossalAI
Making large AI models cheaper, faster and more accessible
DeepLearningExamples
Deep Learning Examples
DeepSpeedExamples
Example models using DeepSpeed
fairseq
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
ghpages-test
Playing around with GitHub Pages.
InnerEye-DeepLearning
Medical Imaging Deep Learning library to train and deploy models on Azure Machine Learning
onnxruntime
ONNX Runtime: cross-platform, high-performance ML inferencing and training accelerator
pytorch-cifar-models
3.41% and 17.11% error rates on CIFAR-10 and CIFAR-100, respectively
QANet-Pytorch
QANet-based PyTorch implementation for SQuAD 1.0
tensorflow
An Open Source Machine Learning Framework for Everyone
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
triton
Development repository for the Triton language and compiler
website
Source for https://fullstackdeeplearning.com