Korbinian Pöppel's starred repositories
vision-lstm
xLSTM as Generic Vision Backbone
flash-linear-attention
Efficient implementations of state-of-the-art linear attention models in PyTorch and Triton
symbolicai
Compositional Differentiable Programming Library
unlimiformer
Public repo for the NeurIPS 2023 paper "Unlimiformer: Long-Range Transformers with Unlimited Length Input"
alpaca-lora
Instruct-tune LLaMA on consumer hardware
RedPajama-Data
The RedPajama-Data repository contains code for preparing large datasets for training large language models.
all-contributors
✨ Recognize all contributors, not just the ones who push code ✨
RWKV-LM
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference, low VRAM usage, fast training, "infinite" ctx_len, and free sentence embeddings.
OpenBugger
Code to create buggy Python scripts for OpenAssistant training, maintained by https://twitter.com/Cyndesama
Open-Assistant
OpenAssistant is a chat-based assistant that understands tasks, can interact with third-party systems, and retrieve information dynamically to do so.
klimaschutztag-rohrbach-2023
Homepage for the Klimaschutztag (climate protection day) in Rohrbach a.d. Ilm on 26 March 2023
dvc-objects
dvc objects - contains filesystem and object-db level abstractions to use in dvc and dvc-data
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
pytorch-lightning
Pretrain, finetune, and deploy AI models on multiple GPUs and TPUs with zero code changes.
django-tailwind
Django + Tailwind CSS = 💚