Charles Foster's repositories
simple-diffusion-model
Pedagogical codebase for a simplified score-based generative model design, with a training loop
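The core of a score-based model is training a network to match the score of a noise-perturbed data distribution. As a minimal, hypothetical sketch (not the repo's actual code), here is one denoising score matching step at a single noise level, with a linear map standing in for the network:

```python
import numpy as np

rng = np.random.default_rng(0)
W = np.zeros((2, 2))          # toy "score model" parameters (stand-in for a network)
x = rng.normal(size=(16, 2))  # a batch of clean data
sigma = 0.5                   # single noise level, for simplicity

eps = rng.normal(size=x.shape)
x_noisy = x + sigma * eps
score_pred = x_noisy @ W.T    # model's score estimate at the noisy points
target = -eps / sigma         # DSM target: score of the Gaussian perturbation kernel
loss = np.mean((score_pred - target) ** 2)
```

A real training loop would sample `sigma` from a schedule and update the parameters by gradient descent on `loss`.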
simple-parallel-transformer
As it says on the tin, this repo has a simple implementation of a transformer model, with some borrowed efficiency improvements. The purpose is mainly pedagogical.
humongous-rs
A Rust pipeline for building HUMONGOUS, a multilingual language-modeling dataset of web text extracted from Common Crawl.
simple-vq-vae
Pedagogical codebase for a simplified VQ-VAE based on attention and linear interpolation
uspto_patent_data_parser
A Python tool for reading, parsing, and searching patents using the United States Patent and Trademark Office (USPTO) Bulk Data Storage System.
all-normalization-transformer
A simple Transformer where the softmax has been replaced with normalization
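To illustrate the idea, here is a hypothetical sketch of attention where the softmax over the scores is swapped for a layernorm-style row normalization (zero mean, unit variance). This is one plausible variant, not necessarily the exact normalization the repo uses:

```python
import numpy as np

def norm_attention(q, k, v, eps=1e-5):
    # Scaled dot-product scores, as in standard attention...
    scores = q @ k.T / np.sqrt(q.shape[-1])
    # ...but instead of softmax, normalize each row to zero mean and
    # unit variance (a layernorm-style normalization over the keys).
    mu = scores.mean(axis=-1, keepdims=True)
    sd = scores.std(axis=-1, keepdims=True)
    weights = (scores - mu) / (sd + eps)
    return weights @ v

rng = np.random.default_rng(0)
q, k, v = (rng.normal(size=(4, 8)) for _ in range(3))
out = norm_attention(q, k, v)  # shape (4, 8)
```

Unlike softmax, the resulting weights are not a probability distribution: they can be negative and do not sum to one.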
awesome-NeRF
A curated list of awesome neural radiance fields papers
DynamicGrids.jl
A framework for gridded simulations in Julia
GPT-Neo-visual-grounding
Visually ground GPT-Neo 1.3B and 2.7B
new-website
New website for EleutherAI based on Hugo static site generator
pile-explorer
For exploring the Pile dataset and documenting its limitations
pile_united_nations
A script for collecting the United Nations Digital Library dataset in a language-modeling-friendly format.
self-attention-experiments-vision
A project on replicating, evaluating, and scaling up self-attention-based models in vision.
tolman-eichenbaum-formers
Code inspired by the Tolman-Eichenbaum Machine: https://openreview.net/forum?id=B8DVo9B1YE0
transformer-memorization
Experiments quantifying the memorization properties of transformers
vector-quantize-pytorch
Vector quantization, in PyTorch
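The essential operation in a vector-quantization layer is snapping each input vector to its nearest codebook entry. A minimal sketch of that lookup (numpy, not the repo's PyTorch implementation):

```python
import numpy as np

def vector_quantize(x, codebook):
    # Squared L2 distance from every input vector to every code: (N, K)
    d = ((x[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    idx = d.argmin(axis=1)           # index of the nearest code per input
    return codebook[idx], idx        # quantized vectors and their indices

codebook = np.array([[0.0, 0.0], [1.0, 1.0]])
x = np.array([[0.1, -0.1], [0.9, 1.2]])
quantized, idx = vector_quantize(x, codebook)
# idx -> [0, 1]: each vector snaps to its nearest code
```

In training, the non-differentiable `argmin` is typically bypassed with a straight-through gradient estimator, plus commitment and codebook losses.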
x-transformers
A simple but complete full-attention transformer with a set of promising experimental features from various papers