Kashif Rasul's repositories
pytorch-transformer-ts
Repository of Transformer based PyTorch Time Series Models
iTransformer
Official implementation for "iTransformer: Inverted Transformers Are Effective for Time Series Forecasting".
accelerate
🚀 A simple way to train and use PyTorch models with multi-GPU, TPU, and mixed-precision support
alignment-handbook
Robust recipes to align language models with human and AI preferences
chronos-forecasting
Chronos: Pretrained (Language) Models for Probabilistic Time Series Forecasting
HoMM
High-order Moment Models
hopfield-layers
Hopfield Networks is All You Need
k-diffusion
Karras et al. (2022) diffusion models for PyTorch
lit-gpt
Hackable implementation of state-of-the-art open-source LLMs based on nanoGPT. Supports flash attention, 4-bit and 8-bit quantization, LoRA and LLaMA-Adapter fine-tuning, pre-training. Apache 2.0-licensed.
mlx-examples
Examples in the MLX framework
pytorch-lightning
Pretrain, finetune and deploy AI models on multiple GPUs, TPUs with zero code changes.
RWKV-LM
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference, low VRAM usage, fast training, "infinite" ctx_len, and free sentence embeddings.
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
vector-quantize-pytorch
Vector quantization, in PyTorch
Wuerstchen
Official implementation of Würstchen: Efficient Pretraining of Text-to-Image Models