Kashif Rasul (kashif)



Location: Berlin, Germany

Twitter: @krasul


Kashif Rasul's repositories

pytorch-transformer-ts

Repository of Transformer-based PyTorch Time Series Models

Language: Jupyter Notebook · License: MIT · Stargazers: 263 · Issues: 15 · Issues: 17

vq-tr

VQ-TR repository

Language: Jupyter Notebook · License: MIT · Stargazers: 6 · Issues: 5 · Issues: 1

gluon-ts

GluonTS - Probabilistic Time Series Modeling in Python

Language: Python · License: Apache-2.0 · Stargazers: 5 · Issues: 3 · Issues: 0
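
A minimal end-to-end sketch of the GluonTS API with the PyTorch backend (installed via `pip install "gluonts[torch]"`); the dataset name and hyperparameters below are illustrative, not taken from this page:

```python
from gluonts.dataset.repository.datasets import get_dataset
from gluonts.torch import DeepAREstimator

dataset = get_dataset("m4_hourly")  # small benchmark dataset bundled with GluonTS

estimator = DeepAREstimator(
    freq=dataset.metadata.freq,
    prediction_length=dataset.metadata.prediction_length,
    trainer_kwargs={"max_epochs": 5},
)
predictor = estimator.train(dataset.train)

forecasts = list(predictor.predict(dataset.test))
print(forecasts[0].mean)  # point summary of the probabilistic forecast for the first series
```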

iTransformer

Official implementation for "iTransformer: Inverted Transformers Are Effective for Time Series Forecasting".

Language: Python · License: MIT · Stargazers: 3 · Issues: 1 · Issues: 0

trl

Train transformer language models with reinforcement learning.

Language: Python · License: Apache-2.0 · Stargazers: 3 · Issues: 1 · Issues: 0
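
A minimal supervised fine-tuning sketch with TRL's SFTTrainer, assuming a recent TRL release that provides SFTConfig; the model id and dataset name are placeholders for illustration:

```python
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# Any instruction/chat-style dataset works; this one is just a commonly used example.
dataset = load_dataset("trl-lib/Capybara", split="train")

trainer = SFTTrainer(
    model="Qwen/Qwen2.5-0.5B",               # model id is resolved and loaded internally
    train_dataset=dataset,
    args=SFTConfig(output_dir="sft-output"),
)
trainer.train()
```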

accelerate

🚀 A simple way to train and use PyTorch models with multi-GPU, TPU, mixed-precision

Language: Python · License: Apache-2.0 · Stargazers: 1 · Issues: 1 · Issues: 0
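
A minimal sketch of the accelerate training-loop pattern; the model, optimizer and data below are stand-ins. The same script runs on a single GPU, multiple GPUs, or TPU depending on how `accelerate config` / `accelerate launch` is set up:

```python
import torch
from accelerate import Accelerator

accelerator = Accelerator()  # picks up device placement and mixed-precision settings

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
dataloader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(torch.randn(64, 10), torch.randint(0, 2, (64,))),
    batch_size=8,
)

# prepare() wraps everything for the current device / distributed setup
model, optimizer, dataloader = accelerator.prepare(model, optimizer, dataloader)

for inputs, targets in dataloader:
    optimizer.zero_grad()
    loss = torch.nn.functional.cross_entropy(model(inputs), targets)
    accelerator.backward(loss)  # replaces loss.backward()
    optimizer.step()
```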

alignment-handbook

Robust recipes to align language models with human and AI preferences

Language: Python · License: Apache-2.0 · Stargazers: 1 · Issues: 1 · Issues: 0

blog

Public repo for HF blog posts

Language: Jupyter Notebook · Stargazers: 1 · Issues: 1 · Issues: 0

datasets

🤗 The largest hub of ready-to-use datasets for ML models with fast, easy-to-use and efficient data manipulation tools

Language: Python · License: Apache-2.0 · Stargazers: 1 · Issues: 1 · Issues: 0
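
A minimal sketch of the datasets API; the dataset name is just a common example, not one referenced on this page:

```python
from datasets import load_dataset

ds = load_dataset("imdb", split="train")
print(ds[0]["label"], ds[0]["text"][:80])

# map() applies a function over the dataset with caching (and optional batching)
ds = ds.map(lambda example: {"n_chars": len(example["text"])})
print(ds.features)
```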

vector-quantize-pytorch

Vector Quantization, in Pytorch

Language: Python · License: MIT · Stargazers: 1 · Issues: 1 · Issues: 0
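
A minimal sketch following the README-style API of vector-quantize-pytorch; the dimensions and codebook size are arbitrary:

```python
import torch
from vector_quantize_pytorch import VectorQuantize

vq = VectorQuantize(dim=256, codebook_size=512)

x = torch.randn(1, 1024, 256)            # (batch, sequence, feature dim)
quantized, indices, commit_loss = vq(x)  # quantized vectors, codebook indices, commitment loss
print(quantized.shape, indices.shape)
```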

chronos-forecasting

Chronos: Pretrained (Language) Models for Probabilistic Time Series Forecasting

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0 · Issues: 0
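
A minimal zero-shot forecasting sketch assuming the Chronos README-style API; the model id and context series are illustrative:

```python
import torch
from chronos import ChronosPipeline

pipeline = ChronosPipeline.from_pretrained("amazon/chronos-t5-small")

context = torch.arange(1.0, 33.0)                           # any 1-D past time series
forecast = pipeline.predict(context, prediction_length=12)  # sampled future trajectories
print(forecast.shape)  # (num_series, num_samples, prediction_length)
```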

evaluate

A library for easily evaluating machine learning models and datasets.

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1 · Issues: 0
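
A minimal sketch of the evaluate API; the predictions and references are toy values:

```python
import evaluate

accuracy = evaluate.load("accuracy")
result = accuracy.compute(predictions=[0, 1, 1, 0], references=[0, 1, 0, 0])
print(result)  # {'accuracy': 0.75}
```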

gangsrc

GANG Software suite

Language: C · License: GPL-2.0 · Stargazers: 0 · Issues: 2 · Issues: 0

HoMM

High order Moment Models

Language: Python · Stargazers: 0 · Issues: 1 · Issues: 0

hopfield-layers

Hopfield Networks is All You Need

Language: Python · License: NOASSERTION · Stargazers: 0 · Issues: 1 · Issues: 0

lit-gpt

Hackable implementation of state-of-the-art open-source LLMs based on nanoGPT. Supports flash attention, 4-bit and 8-bit quantization, LoRA and LLaMA-Adapter fine-tuning, pre-training. Apache 2.0-licensed.

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1 · Issues: 0

mlx-examples

Examples in the MLX framework

Language: Python · License: MIT · Stargazers: 0 · Issues: 1 · Issues: 0

notebooks

Notebooks using the Hugging Face libraries 🤗

Language: Jupyter Notebook · License: Apache-2.0 · Stargazers: 0 · Issues: 1 · Issues: 0

pytorch-lightning

Pretrain, finetune and deploy AI models on multiple GPUs, TPUs with zero code changes.

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1 · Issues: 0
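
A minimal LightningModule + Trainer sketch; the model and data are stand-ins:

```python
import torch
import pytorch_lightning as pl

class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(10, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = torch.nn.functional.cross_entropy(self.layer(x), y)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=1e-3)

dataset = torch.utils.data.TensorDataset(torch.randn(64, 10), torch.randint(0, 2, (64,)))
loader = torch.utils.data.DataLoader(dataset, batch_size=8)

trainer = pl.Trainer(max_epochs=1, accelerator="auto")  # same code on CPU, GPU or TPU
trainer.fit(LitModel(), loader)
```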

RWKV-LM

RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference, low VRAM use, fast training, "infinite" ctx_len, and free sentence embeddings.

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1 · Issues: 0

transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1 · Issues: 0
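
A minimal sketch of the transformers pipeline API; the task uses its default checkpoint:

```python
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
print(classifier("Probabilistic forecasting is underrated."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```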

Wuerstchen

Official implementation of Würstchen: Efficient Pretraining of Text-to-Image Models

Language: Jupyter Notebook · License: MIT · Stargazers: 0 · Issues: 1 · Issues: 0

xformers

Hackable and optimized Transformers building blocks, supporting a composable construction.

Language: Python · License: NOASSERTION · Stargazers: 0 · Issues: 1 · Issues: 0
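
A minimal sketch of xformers' memory-efficient attention (requires a CUDA device); the shapes are illustrative:

```python
import torch
from xformers.ops import memory_efficient_attention

# (batch, sequence length, heads, head dim), half precision on GPU
q = torch.randn(2, 128, 8, 64, device="cuda", dtype=torch.float16)
k = torch.randn(2, 128, 8, 64, device="cuda", dtype=torch.float16)
v = torch.randn(2, 128, 8, 64, device="cuda", dtype=torch.float16)

# softmax(QK^T / sqrt(d)) V, computed without materializing the full attention matrix
out = memory_efficient_attention(q, k, v)
print(out.shape)  # torch.Size([2, 128, 8, 64])
```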