Xiaoye Qu's repositories
ATSEN
AAAI 2023
awesome-Large-MultiModal-Hallucination
A list of papers on hallucination in large multimodal models (LMMs)
Awesome-Transformer-Attention
A comprehensive paper list on Vision Transformers/attention, including papers, code, and related websites
CALM-pytorch
Implementation of CALM from the paper "LLM Augmented LLMs: Expanding Capabilities through Composition", from Google DeepMind
ChatTTS
ChatTTS is a generative speech model for daily dialogue.
FuseLLM
ICLR 2024: Knowledge Fusion of Large Language Models
Generalizable-Mixture-of-Experts
GMoE could be the next backbone model for many kinds of generalization tasks.
gpt_academic
A practical interactive interface for ChatGPT/GLM, specially optimized for paper reading, polishing, and writing. Modular design with support for custom shortcut buttons and function plugins; analysis and self-translation of Python, C++, and other projects; PDF/LaTeX paper translation and summarization; parallel queries to multiple LLMs; and local models such as chatglm2. Compatible with ERNIE Bot (文心一言), MOSS, LLaMA 2, RWKV, Claude 2, Tongyi Qianwen (通义千问), InternLM, iFlytek Spark, and more.
GPTQ-for-LLaMa
4-bit quantization of LLMs using GPTQ
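For orientation, 4-bit weight quantization can be illustrated with simple grouped round-to-nearest (RTN). GPTQ itself goes further, applying second-order (Hessian-based) error compensation during rounding, so this is only a sketch of the baseline idea; `quantize_rtn_4bit` and its `group_size` parameter are names invented here, not this repo's API.

```python
import numpy as np

def quantize_rtn_4bit(w, group_size=64):
    """Grouped round-to-nearest 4-bit quantization of a weight vector.

    Illustrative baseline only: GPTQ improves on plain RTN by correcting
    rounding error with second-order information, which is omitted here.
    """
    w = np.asarray(w, dtype=np.float64)
    out = np.empty_like(w)
    for start in range(0, len(w), group_size):
        g = w[start:start + group_size]
        # One scale per group; 4-bit signed integers cover [-8, 7].
        scale = np.abs(g).max() / 7 or 1.0
        q = np.clip(np.round(g / scale), -8, 7)  # the stored 4-bit codes
        out[start:start + group_size] = q * scale  # dequantized values
    return out
```

With one scale per group, the reconstruction error of each unclipped weight is bounded by half the group's quantization step.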
langchain-ChatGLM
langchain-ChatGLM: local-knowledge-based ChatGLM question answering built with LangChain
llama-moe
⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training
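Mixture-of-Experts layers like those in LLaMA-MoE route each token to a small subset of expert networks via a learned gate. A minimal top-2 gating sketch in NumPy (illustrative only; `top2_gate` is a hypothetical name, not this repo's implementation):

```python
import numpy as np

def top2_gate(logits):
    """Top-2 expert gating over per-token router logits.

    logits: array of shape (tokens, num_experts). Returns the indices of
    the two highest-scoring experts per token and their softmax weights,
    renormalized over the chosen pair. A minimal sketch of MoE routing.
    """
    # Numerically stable softmax over experts.
    probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)
    # Indices of the two largest probabilities, best expert first.
    top2 = np.argsort(probs, axis=-1)[..., -2:][..., ::-1]
    gates = np.take_along_axis(probs, top2, axis=-1)
    gates /= gates.sum(axis=-1, keepdims=True)  # renormalize over the pair
    return top2, gates
```

Each token's output would then be the gate-weighted sum of its two selected experts' outputs; real implementations add capacity limits and load-balancing losses on top.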
LLM-Shearing
Preprint: Sheared LLaMA: Accelerating Language Model Pre-training via Structured Pruning
Megatron-DeepSpeed-BLOOM
Ongoing research training transformer language models at scale, including BERT and GPT-2
Megatron-LM
Ongoing research training transformer models at scale
mergekit
Tools for merging pretrained large language models.
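One of the simplest strategies such tools support is a weighted linear average of parameters across models that share an architecture. A minimal sketch (the `linear_merge` name and dict-of-arrays state format are illustrative assumptions, not mergekit's actual API):

```python
import numpy as np

def linear_merge(state_dicts, weights):
    """Weighted average of model parameters ("linear" merging).

    state_dicts: list of dicts mapping parameter name -> numpy array,
    all from models with identical shapes. weights: one mixing
    coefficient per model; normalized to sum to 1 before averaging.
    """
    weights = np.asarray(weights, dtype=np.float64)
    weights = weights / weights.sum()  # normalize mixing coefficients
    merged = {}
    for name in state_dicts[0]:
        merged[name] = sum(w * sd[name] for w, sd in zip(weights, state_dicts))
    return merged
```

More elaborate methods (task arithmetic, TIES, SLERP) differ mainly in how the per-parameter combination is computed, but follow the same shape-aligned structure.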
MOSS
An open-source tool-augmented conversational language model from Fudan University
Open-Llama
Complete training code for an open-source, high-performance Llama model, covering the full pipeline from pre-training to RLHF.
PaLM-rlhf-pytorch
Implementation of RLHF (Reinforcement Learning from Human Feedback) on top of the PaLM architecture. Basically ChatGPT, but with PaLM
PromptPapers
Must-read papers on prompt-based tuning for pre-trained language models.
smallcap
SmallCap: Lightweight Image Captioning Prompted with Retrieval Augmentation
st-moe-pytorch
Implementation of ST-MoE, the latest incarnation of MoE after years of research at Google Brain, in PyTorch
transformers_tasks
⭐️ NLP algorithms built on the transformers library. Supports text classification, text generation, information extraction, text matching, RLHF, etc.
Xwin-LM
Xwin-LM: Powerful, Stable, and Reproducible LLM Alignment