Marko Vukovic's repositories
replicant_os
Open source components and issue tracker for the Replicant Network
axolotl
A tool that streamlines LLM fine-tuning ("go ahead and axolotl questions")
bayesian_machine_learning
Notebooks about Bayesian methods for machine learning
dev-rel
All of Aztec's workshops, resources, tutorials, ideas, and useful tools
DoRA
[ICML2024] Official PyTorch implementation of DoRA: Weight-Decomposed Low-Rank Adaptation
dspy
DSPy: The framework for programming—not prompting—foundation models
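A minimal sketch of the programming-not-prompting idea, assuming DSPy 2.5+ and an OpenAI key in the environment; the model name and question are placeholders:

    import dspy

    # Assumes DSPy >= 2.5 and OPENAI_API_KEY set in the environment.
    dspy.configure(lm=dspy.LM("openai/gpt-4o-mini"))

    # Declare what the module should do via a signature instead of hand-writing a prompt.
    qa = dspy.ChainOfThought("question -> answer")
    print(qa(question="What does JIT compilation mean?").answer)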
elixir
Elixir is a dynamic, functional language designed for building scalable and maintainable applications
equinox
Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/
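A minimal sketch of the style the description points to, with models as callable PyTrees and filtered JAX transforms; the layer sizes and data below are placeholders:

    import jax
    import jax.numpy as jnp
    import equinox as eqx

    key = jax.random.PRNGKey(0)
    model = eqx.nn.MLP(in_size=2, out_size=1, width_size=32, depth=2, key=key)

    @eqx.filter_jit
    def loss(model, x, y):
        pred = jax.vmap(model)(x)          # an Equinox module is a callable PyTree
        return jnp.mean((pred - y) ** 2)

    grads = eqx.filter_grad(loss)(model, jnp.ones((8, 2)), jnp.zeros((8, 1)))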
instructor
Structured outputs for LLMs
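A minimal sketch of the idea, assuming the OpenAI integration and an API key in the environment; the Pydantic model, prompt, and model name are placeholders:

    import instructor
    from openai import OpenAI
    from pydantic import BaseModel

    class User(BaseModel):
        name: str
        age: int

    # Responses are validated against the Pydantic model and returned as instances of it.
    client = instructor.from_openai(OpenAI())
    user = client.chat.completions.create(
        model="gpt-4o-mini",
        response_model=User,
        messages=[{"role": "user", "content": "Ada is 36 years old."}],
    )
    print(user)   # e.g. User(name='Ada', age=36)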
jax
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
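A minimal sketch of the composable transformations named in the description; the toy loss and shapes are placeholders:

    import jax
    import jax.numpy as jnp

    def loss(w, x):
        return jnp.sum((x @ w) ** 2)

    grad_fn = jax.jit(jax.grad(loss))               # differentiate, then JIT-compile
    batched = jax.vmap(grad_fn, in_axes=(None, 0))  # vectorize over a batch of x

    w = jnp.ones(3)
    xs = jnp.ones((8, 3))
    print(batched(w, xs).shape)   # (8, 3): one gradient per batch element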
llama-recipes
Scripts for fine-tuning Meta Llama 3 with composable FSDP & PEFT methods, covering single- and multi-node GPUs. Supports default & custom datasets for applications such as summarization and Q&A, plus a number of inference solutions such as HF TGI and vLLM for local or cloud deployment. Demo apps showcase Meta Llama 3 for WhatsApp & Messenger.
llama3
The official Meta Llama 3 GitHub site
maxtext
A simple, performant and scalable Jax LLM!
Megatron-LM
Ongoing research training transformer models at scale
nano_gpt
The simplest, fastest repository for training/finetuning medium-sized GPTs.
nerves
Craft and deploy bulletproof embedded software in Elixir
otp
Erlang/OTP
paxml
Pax is a JAX-based machine learning framework for training large-scale models. Pax allows for advanced and fully configurable experimentation and parallelization, and has demonstrated industry-leading model FLOP utilization rates.
peft
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
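A minimal sketch of the usual workflow of wrapping a base model with a LoRA adapter; gpt2 and the hyperparameters are placeholders, not a recommendation:

    from transformers import AutoModelForCausalLM
    from peft import LoraConfig, get_peft_model

    base = AutoModelForCausalLM.from_pretrained("gpt2")   # placeholder base model
    config = LoraConfig(
        r=8,
        lora_alpha=16,
        target_modules=["c_attn"],   # GPT-2's fused attention projection
        task_type="CAUSAL_LM",
    )
    model = get_peft_model(base, config)
    model.print_trainable_parameters()   # only the adapter weights remain trainable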
phoenix
Peace of mind from prototype to production
pytorch
Tensors and Dynamic neural networks in Python with strong GPU acceleration
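A minimal sketch of the tensors-plus-autograd core the description refers to; the shapes are placeholders:

    import torch

    x = torch.randn(4, 3, requires_grad=True)
    w = torch.randn(3, 2, requires_grad=True)

    loss = (x @ w).pow(2).mean()
    loss.backward()               # autograd populates .grad on the leaf tensors
    print(w.grad.shape)           # torch.Size([3, 2])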
rust-magic-patterns
Magical Rust patterns laid out and simplified
RWKV-LM
RWKV is an RNN with transformer-level LLM performance that can be trained directly like a GPT (parallelizable). It combines the best of RNNs and transformers: great performance, fast inference, low VRAM use, fast training, "infinite" ctx_len, and free sentence embeddings.
site
My personal website.
sqlite-vec
Work-in-progress vector search SQLite extension that runs anywhere.
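A rough sketch assuming the project's Python bindings (pip install sqlite-vec); the table name, dimensionality, and vectors are placeholders, and the query uses the project's vec0 virtual table:

    import sqlite3
    import sqlite_vec

    db = sqlite3.connect(":memory:")
    db.enable_load_extension(True)
    sqlite_vec.load(db)
    db.enable_load_extension(False)

    # A virtual table holding 4-dimensional float vectors.
    db.execute("CREATE VIRTUAL TABLE vec_items USING vec0(embedding float[4])")
    db.execute("INSERT INTO vec_items(rowid, embedding) VALUES (1, ?)",
               ("[0.10, 0.20, 0.30, 0.40]",))

    # Nearest-neighbour query against the stored vectors.
    rows = db.execute(
        "SELECT rowid, distance FROM vec_items "
        "WHERE embedding MATCH ? ORDER BY distance LIMIT 1",
        ("[0.10, 0.20, 0.30, 0.40]",),
    ).fetchall()
    print(rows)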
tinygrad
You like pytorch? You like micrograd? You love tinygrad! ❤️
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
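A minimal sketch of the high-level pipeline API; the task and input text are placeholders, and the first call downloads a default checkpoint:

    from transformers import pipeline

    classifier = pipeline("sentiment-analysis")
    print(classifier("tinygrad is delightful")[0])   # {'label': ..., 'score': ...}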
trl
Train transformer language models with reinforcement learning.