Gordon Erlebacher's repositories
annotated_encoder_decoder
The Annotated Encoder-Decoder with Attention
CAP5771-datasets
Reduced datasets for testing
G6_hover_MWE
A minimal, self-contained working example of edge hover with AntV/G6
gordon_assignment_1
Gordon's assignment 1 for ISC-5771, testing with Gradescope
gordon_assignment_1a
My assignment 1
GPT4ALL-Python-API
A simple API for the Python bindings of gpt4all, using the application's default models. Compatible with the OpenAI client library.
Julia_benchmarks
Collection of benchmarks for discussion in Julia Discourse
jupyterlab-vim
:neckbeard: Vim notebook cell bindings for JupyterLab
karparthy_llm.c
LLM training in simple, raw C/CUDA
Latent_class_Analysis
Exploration of inference using processes.
llamafile
Distribute and run LLMs with a single file.
llm_multiagent_debate
Code for "Improving Factuality and Reasoning in Language Models through Multiagent Debate"
LocalAI
:robot: The free, open-source OpenAI alternative. Self-hosted, community-driven, and local-first. A drop-in replacement for OpenAI that runs on consumer-grade hardware; no GPU required. Runs ggml, gguf, GPTQ, ONNX, and TF-compatible models: llama, llama2, rwkv, whisper, vicuna, koala, cerebras, falcon, dolly, starcoder, and many others.
minbpe_forked
Minimal, clean code for the Byte Pair Encoding (BPE) algorithm commonly used in LLM tokenization.
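As a rough illustration of the byte-pair-encoding idea the description refers to, here is a minimal sketch of a byte-level BPE training loop: repeatedly find the most frequent adjacent token pair and merge it into a new token id. Function names and structure here are my own, not taken from the minbpe code itself.

```python
from collections import Counter

def get_stats(ids):
    """Count frequencies of adjacent token-id pairs."""
    return Counter(zip(ids, ids[1:]))

def merge(ids, pair, new_id):
    """Replace every occurrence of `pair` in `ids` with `new_id`."""
    out, i = [], 0
    while i < len(ids):
        if i < len(ids) - 1 and (ids[i], ids[i + 1]) == pair:
            out.append(new_id)
            i += 2  # skip both members of the merged pair
        else:
            out.append(ids[i])
            i += 1
    return out

def train_bpe(text, num_merges):
    """Learn `num_merges` merge rules starting from raw UTF-8 bytes."""
    ids = list(text.encode("utf-8"))  # initial vocab: byte values 0..255
    merges = {}
    for step in range(num_merges):
        stats = get_stats(ids)
        if not stats:
            break
        pair = max(stats, key=stats.get)   # most frequent adjacent pair
        new_id = 256 + step                # new token ids start above bytes
        merges[pair] = new_id
        ids = merge(ids, pair, new_id)
    return ids, merges
```

For example, one merge step on `"aaabdaaabac"` fuses the most frequent pair `(97, 97)` (i.e. `"aa"`) into token 256.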
mlx-examples
Examples in the MLX framework
nanoGPT_GE
The simplest, fastest repository for training/finetuning medium-sized GPTs. GE: experiments with low-rank matrices
s4_pytorch
Structured state-space sequence (S4) models
sai_DataMining_Programming_Assignment_1
Programming Homework 1, Data Mining, Spring 2024
torch-neural-ssm
Neural State-Space Models and Latent Dynamics Functions in PyTorch for High-Dimensional Forecasting
toy-models-of-superposition
Notebooks accompanying Anthropic's "Toy Models of Superposition" paper
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
universal_differential_equations
Repository for the paper "Universal Differential Equations for Scientific Machine Learning," describing a computational basis for high-performance SciML