Lin Yupian's starred repositories
ChatGLM-6B
ChatGLM-6B: An Open Bilingual Dialogue Language Model
ColossalAI
Making large AI models cheaper, faster and more accessible
stanford_alpaca
Code and documentation to train Stanford's Alpaca models, and generate the data.
gpt4-pdf-chatbot-langchain
GPT4 & LangChain Chatbot for large PDF docs
sentencepiece
Unsupervised text tokenizer for Neural Network-based text generation.
ChatGLM-Tuning
A finetuning scheme based on ChatGLM-6B + LoRA
finetune-transformer-lm
Code and model for the paper "Improving Language Understanding by Generative Pre-Training"
Chain-of-ThoughtsPapers
A trend starts from "Chain of Thought Prompting Elicits Reasoning in Large Language Models".
natural-instructions
Expanding natural instructions
Event-Extraction
A summary of recent event extraction methods, covering Chinese event extraction, open-domain event extraction, event data generation, cross-lingual event extraction, few-shot event extraction, zero-shot event extraction, and more, including methods such as DMCNN, FramNet, DLRNN, DBRNN, GCN, DAG-GRU, JMEE, and PLMEE
prompt-tuning
Original implementation of Prompt Tuning from Lester et al., 2021
Styleformer
A neural language style transfer framework to transfer natural language text smoothly between fine-grained language styles such as formal/casual and active/passive. Created by Prithiviraj Damodaran. Open to pull requests and other forms of collaboration.
ACL2022_KnowledgeNLP_Tutorial
Materials for ACL-2022 tutorial: Knowledge-Augmented Methods for Natural Language Processing
Prompt-Tuning
Implementation of "The Power of Scale for Parameter-Efficient Prompt Tuning"
data2text-seq-plan-py
Code for TACL 2022 paper on Data-to-text Generation with Variational Sequential Planning
ACL2022_tutorial_multilingual_dialogue
Materials for "Natural Language Processing for Multilingual Task-Oriented Dialogue" Tutorial at ACL 2022
A-Survey-on-Neural-Data-to-Text-Generation
A Survey on Neural Data-to-Text Generation