wenhuach21's repositories
gptq
Code for the ICLR 2023 paper "GPTQ: Accurate Post-training Quantization of Generative Pretrained Transformers".
Language: Python · License: Apache-2.0
haystack
🔍 Haystack is an open-source NLP framework that leverages pre-trained Transformer models. It enables developers to quickly implement production-ready semantic search, question answering, summarization, and document ranking for a wide range of NLP applications.
Language: Python · License: Apache-2.0
LoRA
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
Language: Python · License: MIT
stable-diffusion
A latent text-to-image diffusion model
Language: Jupyter Notebook · License: NOASSERTION
stanford_alpaca
Code and documentation to train Stanford's Alpaca models and generate the data.
Language: Python · License: Apache-2.0
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Language: Python · License: Apache-2.0