Ther's repositories
Awesome-Transformer-Accleration
Paper list for acceleration of transformers
awesome-green-group-repeating
The brothers in the green group are all talented, and pleasant to talk to
bitsandbytes
8-bit CUDA functions for PyTorch
ceres-solver
A large scale non-linear optimization library
ChatGLM-6B
ChatGLM-6B: An open bilingual dialogue language model
chisel-template
Template for Chisel projects
cutlass
CUDA Templates for Linear Algebra Subroutines
diffusers
🤗 Diffusers: State-of-the-art diffusion models for image and audio generation in PyTorch
gem5-stable
Lab Platform of Modern Computer Architecture
I-BERT
[ICML'21 Oral] I-BERT: Integer-only BERT Quantization
LongLoRA
Code and documents of LongLoRA and LongAlpaca (ICLR 2024 Oral)
LoRA
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
peft
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
PowerInfer
High-speed Large Language Model Serving on PCs with Consumer-grade GPUs
qlora
QLoRA: Efficient Finetuning of Quantized LLMs
SPT
[ICCV 2023 Oral] This is the official repository for our paper: "Sensitivity-Aware Visual Parameter-Efficient Fine-Tuning".
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
triton
Development repository for the Triton language and compiler
VINS-Mono
A Robust and Versatile Monocular Visual-Inertial State Estimator
vit-finetune
Fine-tuning Vision Transformers on various classification datasets