Hao Zhang's repositories
ray-scalable-ml-design
Some microbenchmarks and design docs before commencement
Megatron-LM
Ongoing research on training transformer language models at scale, including BERT and GPT-2
DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Language: Python · License: MIT
ludwig
Ludwig is a toolbox for training and evaluating deep learning models without writing code.
Language: Python · License: Apache-2.0
metaseq
Repo for external large-scale work
Language: Python · License: MIT
models
Models and examples built with TensorFlow
Language: Python · License: Apache-2.0
RFC
Community Documents
tensorflow
Computation using data flow graphs for scalable machine learning