evelynmitchell's repositories
AQLM
Official PyTorch repository for Extreme Compression of Large Language Models via Additive Quantization https://arxiv.org/pdf/2401.06118.pdf
BiLLM
BiLLM: Pushing the Limit of Post-Training Quantization for LLMs
BlackMamba
Code repository for Black Mamba
Bonjourr
Minimalist & lightweight startpage inspired by iOS
CAME
The official implementation of "CAME: Confidence-guided Adaptive Memory Optimization"
dolma
Data and tools for generating and inspecting OLMo pre-training data.
heaptrack
A heap memory profiler for Linux
iTransformer
Unofficial implementation of iTransformer - SOTA Time Series Forecasting using Attention networks, out of Tsinghua / Ant group
LeanQC
Lean Algorithmic Trading Engine by QuantConnect (Python, C#)
MAGDi
The code implementation of MAGDi: Structured Distillation of Multi-Agent Interaction Graphs Improves Reasoning in Smaller Language Models. Paper: https://arxiv.org/abs/2402.01620
marvin
✨ Build AI interfaces that spark joy
neural_networks_solomonoff_induction
Learning Universal Predictors - Google DeepMind, arXiv:2401.14953
OpenMoE
A family of open-source Mixture-of-Experts (MoE) Large Language Models
pge-outages
Tracking PG&E power outages
profila
A profiler for Numba
PufferLib
Simplifying reinforcement learning for complex game environments
Python-Type-Challenges
Master Python typing (type hints) with interactive online exercises!
pytorch
Tensors and Dynamic neural networks in Python with strong GPU acceleration
self-discover
Implementation of Google's SELF-DISCOVER
teldrive
Telegram Drive Storage
torchfix
TorchFix - a linter for PyTorch-using code with autofix support
Trailblazer
TrailBlazer: Trajectory Control for Diffusion-Based Video Generation
unitxt
🦄 Unitxt: a Python library for getting data fired up and set for training and evaluation
vaex
Out-of-Core hybrid Apache Arrow/NumPy DataFrame for Python, ML, visualization and exploration of big tabular data at a billion rows per second 🚀
xmc.dspy
In-Context Learning for eXtreme Multi-Label Classification (XMC) using only a handful of examples.