Yiqing Huang's starred repositories
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
awesome-chatgpt-prompts-zh
A Chinese guide to prompting ChatGPT: usage guides for various scenarios, and how to get it to follow your instructions.
ChatGLM-6B
ChatGLM-6B: An Open Bilingual Dialogue Language Model
ColossalAI
Making large AI models cheaper, faster and more accessible
Langchain-Chatchat
Langchain-Chatchat (formerly Langchain-ChatGLM): a local-knowledge-based RAG and Agent application built with LangChain and language models such as ChatGLM, Qwen, and Llama.
nlp_chinese_corpus
Large Scale Chinese Corpus for NLP
llm-numbers
Numbers every LLM developer should know
Fengshenbang-LM
Fengshenbang-LM is an open-source large-model ecosystem led by the Cognitive Computing and Natural Language Research Center at IDEA, serving as infrastructure for Chinese AIGC and cognitive intelligence.
Chinese-LangChain
A Chinese LangChain project | 小必应, Q.Talk, 强聊 (QiangTalk)
transformers_tasks
⭐️ NLP algorithms built on the transformers library, supporting text classification, text generation, information extraction, text matching, RLHF, SFT, etc.
P-tuning-v2
An optimized deep prompt tuning strategy comparable to fine-tuning across scales and tasks
Chain-of-ThoughtsPapers
A trend started by "Chain of Thought Prompting Elicits Reasoning in Large Language Models".
Megatron-DeepSpeed
Ongoing research training transformer language models at scale, including: BERT & GPT-2
GPT2-NewsTitle
A Chinese news-title generation project based on GPT-2, with extensively annotated code.
bigscience
Central place for the engineering/scaling WG: documentation, SLURM scripts and logs, compute environment and data.
LLM-Tuning
Tuning LLMs with no tears 💦; Sample Design Engineering (SDE) for more efficient downstream tuning.