Logan Adams's repositories
apex
A PyTorch extension: tools for easy mixed precision and distributed training in PyTorch
Language: Python · License: BSD-3-Clause
DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Language: Python · License: Apache-2.0
deepspeed-feedstock
A conda-smithy repository for deepspeed.
Language: Shell · License: BSD-3-Clause
DeepSpeed-MII
MII makes low-latency and high-throughput inference possible, powered by DeepSpeed.
Language: Python · License: Apache-2.0
DeepSpeedExamples
Example models using DeepSpeed
Language: Python · License: Apache-2.0
fish-shell
The user-friendly command line shell.
Language: Rust · License: NOASSERTION
transformers
🤗 Transformers: state-of-the-art machine learning for PyTorch, TensorFlow, and JAX.
Language: Python · License: Apache-2.0