🚀 A simple way to train and use PyTorch models with multi-GPU, TPU, and mixed-precision support.
Large-scale self-supervised pre-training across tasks, languages, and modalities.
🤗 Fast, efficient, open-access datasets and evaluation metrics for Natural Language Processing and beyond, in PyTorch, TensorFlow, NumPy, and Pandas.
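A core idea behind such a datasets library is lazy, streaming access: records are yielded one at a time and transformed on the fly, so nothing needs to fit in memory. The sketch below illustrates that pattern in pure Python; the function names are illustrative, not the library's actual API.

```python
# Illustrative sketch of a streaming dataset pipeline: records flow
# through lazily, and map-style transforms are applied per record.
# These helpers are hypothetical, chosen to mirror the general pattern.

def stream(records):
    """Yield records one at a time (stand-in for reading from disk)."""
    for record in records:
        yield record

def map_stream(fn, iterator):
    """Apply a transform lazily to each record."""
    for record in iterator:
        yield fn(record)

rows = [{"text": "good movie"}, {"text": "a very bad movie"}]
with_counts = map_stream(
    lambda r: {**r, "n_words": len(r["text"].split())},
    stream(rows),
)
results = list(with_counts)
```

Because every stage is a generator, the transform runs only when a record is consumed, which is what makes streaming over corpora larger than RAM possible.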
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
Flax is a neural network ecosystem for JAX that is designed for flexibility.
All the open-source tooling related to the Hugging Face Hub.
Ongoing research on training transformer language models at scale, including BERT and GPT-2.
Model-parallel transformers in JAX and Haiku.
Optax is a gradient processing and optimization library for JAX.
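Optax's central abstraction is the gradient transformation: an optimizer is a pair of functions, one that initializes state from the parameters and one that turns raw gradients plus state into parameter updates. The following is a minimal pure-Python sketch of that pattern under illustrative names; it mirrors the design, not Optax's real API surface.

```python
# Sketch of the init/update "gradient transformation" pattern
# (here: SGD with momentum over a dict of scalar parameters).
# All names are illustrative assumptions, not Optax's API.

def sgd_with_momentum(learning_rate, decay=0.9):
    def init(params):
        # One momentum buffer per parameter, initialized to zero.
        return {k: 0.0 for k in params}

    def update(grads, state):
        # Accumulate an exponential moving average of gradients...
        new_state = {k: decay * state[k] + grads[k] for k in grads}
        # ...and emit the values to *add* to the parameters.
        updates = {k: -learning_rate * new_state[k] for k in grads}
        return updates, new_state

    return init, update

def apply_updates(params, updates):
    return {k: params[k] + updates[k] for k in params}

init, update = sgd_with_momentum(learning_rate=0.1)
params = {"w": 1.0}
state = init(params)
updates, state = update({"w": 2.0}, state)
params = apply_updates(params, updates)
# w = 1.0 - 0.1 * (0.9 * 0.0 + 2.0) = 0.8
```

Keeping the optimizer as a pure pair of functions is what lets such transformations be composed (e.g. clipping, then momentum, then scaling) without any hidden mutable state.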
A fast and lightweight Python-based CTC beam search decoder for speech recognition.
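Every CTC decoder, beam search included, rests on the same collapse rule: merge consecutive repeated labels, then drop the blank symbol. The sketch below shows the greedy variant of that rule in pure Python; pyctcdecode itself performs prefix beam search with optional language-model scoring, so this is a simplified illustration, not its implementation.

```python
# Greedy CTC collapse: merge repeats, then drop blanks.
# The blank symbol and the example label sequence are illustrative.

BLANK = "_"

def ctc_greedy_decode(frame_labels):
    """Collapse a per-frame label sequence into an output string."""
    out = []
    prev = None
    for label in frame_labels:
        # Keep a label only if it differs from the previous frame's
        # label (merging repeats) and is not the blank symbol.
        if label != prev and label != BLANK:
            out.append(label)
        prev = label
    return "".join(out)

decoded = ctc_greedy_decode(list("__hhe_ll__llo_"))
# → "hello": the blank between the two "ll" runs is what lets a
# genuine double letter survive the repeat-merging step.
```

A beam search decoder applies this same collapse rule while tracking multiple candidate prefixes and their probabilities instead of committing to the single best label per frame.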
Pre-training BART in Flax on The Pile dataset.
Pure Rust multimedia format demuxing, tag reading, and audio decoding library.