Hugging Face's repositories
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
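A minimal sketch of getting started with the library via its `pipeline` API, assuming the `transformers` package is installed (the first call downloads a default checkpoint from the Hub):

```python
# Hedged sketch: pipeline() with no model argument falls back to a
# default sentiment-analysis checkpoint, downloaded on first use.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
result = classifier("Hugging Face libraries are easy to use.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': ...}]
```

The same one-line entry point covers many tasks (`"text-generation"`, `"translation"`, etc.) by swapping the task string.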
tokenizers
💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
text-generation-inference
Large Language Model Text Generation Inference
accelerate
🚀 A simple way to train and use PyTorch models with multi-GPU, TPU, and mixed-precision support
alignment-handbook
Robust recipes to align language models with human and AI preferences
distil-whisper
Distilled variant of Whisper for speech recognition. 6x faster, 50% smaller, within 1% word error rate.
autotrain-advanced
🤗 AutoTrain Advanced
huggingface_hub
The official Python client for the Hugging Face Hub.
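A minimal sketch using the client, assuming the `huggingface_hub` package is installed. `hf_hub_url` builds the download URL for a file in a Hub repo without touching the network (`"gpt2"` is just an example repo id):

```python
# Hedged sketch: resolve a Hub file path to its download URL offline.
from huggingface_hub import hf_hub_url

url = hf_hub_url(repo_id="gpt2", filename="config.json")
print(url)
```

The companion `hf_hub_download` fetches and caches the file itself.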
huggingface.js
Utilities to use the Hugging Face Hub API
datasets-server
Lightweight web API for visualizing and exploring all types of datasets - computer vision, speech, text, and tabular - stored on the Hugging Face Hub
swift-transformers
Swift Package to implement a transformers-like API in Swift
optimum-intel
🤗 Optimum Intel: Accelerate inference with Intel optimization tools
optimum-benchmark
A unified multi-backend utility for benchmarking Transformers and Diffusers with full support of Optimum's hardware optimizations & quantization schemes.
optimum-neuron
Easy, fast and very cheap training and inference on AWS Trainium and Inferentia chips.
optimum-habana
Easy and lightning-fast training of 🤗 Transformers on the Habana Gaudi processor (HPU)
optimum-amd
AMD-related optimizations for transformer models