Niklas Muennighoff's repositories
promptsource
Toolkit for creating, sharing, and using natural language prompts.
matrixshapes
A language-modelling task for inferring the shapes of matrices, one of the most difficult tasks for models like GPT-3 and GPT-J.
alpaca_eval
An automatic evaluator for instruction-following language models. Human-validated, high-quality, cheap, and fast.
bigcode-evaluation-harness
A framework for the evaluation of autoregressive code generation language models.
FlagEmbedding
Open-source embeddings.
gritlm
Generative Representational Instruction Tuning
licensed-pile
Repo to hold code and track issues for the collection of permissively licensed data
lm-evaluation-harness
A framework for few-shot evaluation of autoregressive language models.
Megatron-DeepSpeed
Ongoing research on training transformer language models at scale, including BERT and GPT-2.
open_lm
A repository for research on medium-sized language models.
OpenDevin
🐚 OpenDevin: Code Less, Make More
prompt_semantics
This repository accompanies our paper “Do Prompt-Based Models Really Understand the Meaning of Their Prompts?”
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
udacity-dl
Udacity Deep Learning Nanodegree 2020.