Stas Bekman's repositories
ipyexperiments
jupyter/ipython experiment containers for GPU and general RAM re-use
transformers
🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0.
accelerate
A simple way to train and use NLP models with multi-GPU, TPU, mixed-precision
DeepSpeedExamples
Example models using DeepSpeed
gpt-neo-fine-tuning-example
Fine-tune EleutherAI GPT-Neo and GPT-J-6B to generate Netflix movie descriptions using Hugging Face and DeepSpeed
machine-learning-note
My machine learning notes
Megatron-DeepSpeed
Ongoing research training transformer language models at scale, including: BERT & GPT-2
Megatron-LM
Ongoing research training transformer models at scale
nvidia-system-monitor-qt
Task manager for Linux for NVIDIA graphics cards
pytest-monitor
Pytest plugin for analyzing resource usage during test sessions
pytorch-tvmisc
Totally Versatile Miscellanea for Pytorch
pytorch_memlab
Profiling and inspecting memory in pytorch
tokenizers
💥 Fast state-of-the-art tokenizers optimized for research and production