M Saiful Bari's starred repositories
accelerate
🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, with automatic mixed precision (including fp8) and easy-to-configure FSDP and DeepSpeed support
PromptPapers
Must-read papers on prompt-based tuning for pre-trained language models.
promptsource
Toolkit for creating, sharing, and using natural language prompts.
cpu_features
A cross-platform C99 library to get CPU features at runtime.
TransCoder
Public release of the TransCoder research project https://arxiv.org/pdf/2006.03511.pdf
Megatron-DeepSpeed
Ongoing research on training transformer language models at scale, including BERT and GPT-2
NL-Augmenter
NL-Augmenter 🦎 → 🐍 A Collaborative Repository of Natural Language Transformations
ACL-anthology-corpus
This repository provides details and links to the ACL Anthology corpus/collection, including .bib files, PDFs, and GROBID extractions of the PDFs
deep-learning-models
Natural language processing & computer vision models optimized for AWS
LaplacianShot
Laplacian Regularized Few Shot Learning
multilingual-modeling
BLOOM+1: Adapting BLOOM model to support a new unseen language
ml_nlp_paper_data
Dataset of ML and NLP papers
carbon-footprint
A repository for `codecarbon` logs.
transformers
🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch, TensorFlow, and JAX.
multilingual-t0
Multilingual extension of the T0 model
eval_t0_deepspeed
Evaluate T0 with DeepSpeed