Hailey Schoelkopf's repositories
triton-index
See https://github.com/cuda-mode/triton-index/ instead!
adapter-transformers
Huggingface Transformers + Adapters = ❤️
amazon-sagemaker-examples
Example 📓 Jupyter notebooks that demonstrate how to build, train, and deploy machine learning models using 🧠 Amazon SageMaker.
lm-evaluation-harness
A framework for few-shot evaluation of autoregressive language models.
bigcode-evaluation-harness
A framework for the evaluation of autoregressive code generation language models.
FLAN
Provides a minimal implementation to extract FLAN datasets for further processing
gpt-neox
An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.
gpt-fast
Simple and efficient PyTorch-native transformer text generation in <1000 LOC of Python.
haileyschoelkopf.github.io
Personal blog. Under construction for now.
human-eval-infilling
Code for the paper "Efficient Training of Language Models to Fill in the Middle"
Megatron-DeepSpeed-mT0
Ongoing research on training transformer language models at scale, including BERT & GPT-2
minimal-mistakes
Personal website for interim use. Under construction.
old-haileyschoelkopf.github.io
Blog + personal site. Under construction.
promptsource
Toolkit for creating, sharing and using natural language prompts.
reinforce-lms
Project for the NERH RLHF hackathon.
roots-search-tool
Scripts supporting the development and serving of the Roots Search Tool - https://hf.co/spaces/bigscience-data/roots-search
text-to-text-transfer-transformer
Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer"
to-read-list
Collecting links to NLP / CV papers so that I don't forget to read them (as of 6/7/22).
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
wip-new-website
New personal site / blog. Under construction.