Piyushi Manupriya's starred repositories
speculative-decoding
Explorations into some recent techniques surrounding speculative decoding
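For orientation, the core idea all speculative decoding work builds on is a draft-and-verify loop: a small model proposes tokens, the large model accepts or corrects them. A toy sketch of the verification step (all names, shapes, and signatures here are illustrative assumptions, not this repo's API):

```python
import torch

def verify_draft(target_probs, draft_probs, draft_tokens):
    """target_probs: (K+1, V) target-model distributions (one extra position),
    draft_probs: (K, V) draft-model distributions, draft_tokens: (K,) proposals.
    Returns accepted tokens; always ends with one token sampled from the target."""
    out = []
    for k, tok in enumerate(draft_tokens.tolist()):
        p, q = target_probs[k, tok], draft_probs[k, tok]
        if torch.rand(()) < torch.clamp(p / q, max=1.0):
            out.append(tok)                        # draft token accepted
        else:
            # Rejected: resample from the residual max(0, p - q), renormalized.
            res = torch.clamp(target_probs[k] - draft_probs[k], min=0.0)
            out.append(torch.multinomial(res / res.sum(), 1).item())
            return out                             # stop at the first rejection
    # Every draft accepted: take a bonus token from the target's extra position.
    out.append(torch.multinomial(target_probs[-1], 1).item())
    return out
```

This accept/resample rule leaves the target model's output distribution unchanged, which is why the speedup comes for free in quality terms.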
unscented-autoencoder
Accompanying code for the ICML'23 paper "Unscented Autoencoder", authored by Faris Janjos, Lars Rosenbaum, Maxim Dolgov, and J. Marius Zoellner.
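The "unscented" part refers to the unscented transform: rather than pushing one reparameterized sample through the decoder, a small set of deterministic sigma points is propagated. A generic sketch for a diagonal-Gaussian latent (the paper's exact scaling and weights may differ; this is not the repo's code):

```python
import torch

def sigma_points(mu, sigma, lam=1.0):
    """mu, sigma: (n,) mean and per-dimension std of a diagonal Gaussian.
    Returns (2n+1, n) sigma points and their (2n+1,) mean weights."""
    n = mu.shape[0]
    scale = (n + lam) ** 0.5
    offsets = scale * torch.diag(sigma)            # one scaled axis per row
    points = torch.cat([mu[None, :], mu + offsets, mu - offsets], dim=0)
    w0 = lam / (n + lam)
    wi = 1.0 / (2.0 * (n + lam))
    weights = torch.cat([torch.tensor([w0]), torch.full((2 * n,), wi)])
    return points, weights
```

Averaging the decoder outputs at these points with the returned weights approximates the expectation over the latent to higher order than a single Monte Carlo sample.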
gpu-optimization-workshop
Slides, notes, and materials for the GPU optimization workshop.
the-book-of-secret-knowledge
A collection of inspiring lists, manuals, cheatsheets, blogs, hacks, one-liners, CLI/web tools, and more.
LLMs-from-scratch
Implementing a ChatGPT-like LLM in PyTorch from scratch, step by step
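A taste of the kind of building block such a from-scratch implementation walks through is causal self-attention; a minimal generic sketch (names and signatures are not the book's):

```python
import torch
import torch.nn.functional as F

def causal_self_attention(x, Wq, Wk, Wv):
    """x: (T, d) token embeddings; Wq/Wk/Wv: (d, d) projection matrices."""
    T, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = (q @ k.T) / d ** 0.5                     # (T, T) attention logits
    mask = torch.triu(torch.ones(T, T, dtype=torch.bool), diagonal=1)
    scores = scores.masked_fill(mask, float("-inf"))  # hide future positions
    return F.softmax(scores, dim=-1) @ v              # (T, d) mixed values
```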
diffprog.github.io
Website for the book "The Elements of Differentiable Programming".
sampling_diffusion
Improved sampling via learned diffusions (ICLR 2024) and an optimal control perspective on diffusion-based generative modeling (SBM @ NeurIPS 2022)
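For context, the baseline these works improve on is fixed-schedule diffusion sampling; a minimal sketch of the standard DDPM-style ancestral step (not the repo's learned-diffusion sampler; all names here are assumptions):

```python
import torch

def ddpm_step(x_t, t, eps_model, alphas, alpha_bars):
    """One reverse step. x_t: current noisy sample; eps_model(x, t) predicts
    the noise; alphas/alpha_bars: (T,) per-step and cumulative schedules."""
    a_t, ab_t = alphas[t], alpha_bars[t]
    eps = eps_model(x_t, t)
    mean = (x_t - (1 - a_t) / torch.sqrt(1 - ab_t) * eps) / torch.sqrt(a_t)
    if t == 0:
        return mean                       # final step is noise-free
    sigma_t = torch.sqrt(1 - a_t)         # one common choice of step noise
    return mean + sigma_t * torch.randn_like(x_t)
```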
LLM-Uncertainty-Bench
Benchmarking LLMs via Uncertainty Quantification
Diffusion-Models-Papers-Survey-Taxonomy
Diffusion model papers, survey, and taxonomy
llm-course
A course for getting into Large Language Models (LLMs), with roadmaps and Colab notebooks.
mixture-of-experts
Training two separate expert neural networks and a gating network that learns to switch between them.
soft-mixture-of-experts
PyTorch implementation of Soft MoE, proposed by Google Brain in "From Sparse to Soft Mixtures of Experts" (https://arxiv.org/pdf/2308.00951.pdf)
soft-moe-pytorch
Implementation of Soft MoE, proposed by Google Brain's Vision team, in PyTorch
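Since the last two repos implement the same paper, here is a minimal sketch of its dispatch/combine math (shapes and names are mine, not either repo's API):

```python
import torch
import torch.nn.functional as F

def soft_moe(x, phi, experts, slots_per_expert):
    """x: (T, d) tokens; phi: (d, E*S) slot parameters; experts: E modules,
    each mapping (S, d) -> (S, d)."""
    E, S = len(experts), slots_per_expert
    logits = x @ phi                                  # (T, E*S)
    dispatch = F.softmax(logits, dim=0)               # per slot: weights over tokens
    combine = F.softmax(logits, dim=1)                # per token: weights over slots
    slots = dispatch.T @ x                            # (E*S, d) soft token mixtures
    slot_out = torch.cat(
        [experts[e](slots[e * S:(e + 1) * S]) for e in range(E)], dim=0
    )                                                 # each expert handles its S slots
    return combine @ slot_out                         # (T, d) per-token outputs
```

Because every token contributes to every slot with nonzero weight, the layer stays fully differentiable and sidesteps the token dropping and load-balancing losses of sparse routing.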