Lin Yupian's repositories
Proposal_System
A proposal system.
A-Survey-on-Neural-Data-to-Text-Generation
A Survey on Neural Data-to-Text Generation
autoprompt
AutoPrompt: Automatic Prompt Construction for Masked Language Models.
BELLE
BELLE: Bloom-Enhanced Large Language model Engine (an open-source Chinese dialogue LLM with 7 billion parameters)
bert4keras
Keras implementation of transformers for humans
CLGE
Chinese Language Generation Evaluation: benchmarks for Chinese text generation tasks
ControlPrefixes
Repository accompanying an Imperial MSc Computing (Machine Learning & A.I.) thesis
CoT-Planner
CoT-Planner: Chain-of-Thoughts as the Content Planner for Few-shot Table-to-Text Generation Reduces the Hallucinations from LLMs
d2l-zh
Dive into Deep Learning (《动手学深度学习》): written for Chinese readers, with runnable code and open discussion. The Chinese and English editions are used for teaching at 300 universities across 55 countries.
data2text-bioleaflets
Biomedical Data-to-Text Generation via Fine-Tuning Transformers
data2text-seq-plan-py
Code for the TACL 2022 paper on data-to-text generation with variational sequential planning
DualEnc
Codebase for DualEnc (ACL 2020)
Flat-Lattice-Transformer
Code for the ACL 2020 paper "FLAT: Chinese NER Using Flat-Lattice Transformer"
handson-ml2
A series of Jupyter notebooks that walk you through the fundamentals of machine learning and deep learning in Python using Scikit-Learn, Keras, and TensorFlow 2.
incubator-echarts
A powerful, interactive charting and visualization library for the browser
ir_datasets
Provides a common interface to many IR ranking datasets.
MedLLMsPracticalGuide
A curated list of practical guide resources for medical LLMs (medical LLM trees, tables, and papers)
plms-graph2text
Investigating Pretrained Language Models for Graph-to-Text Generation
PrefixTuning
Prefix-Tuning: Optimizing Continuous Prompts for Generation
Python
All Algorithms implemented in Python
relogic
Code for the Findings of ACL 2021 paper "Logic-Consistency Text Generation from Semantic Parses"
stanford_alpaca
Code and documentation to train Stanford's Alpaca models and generate the data.
t5_in_bert4keras
A summary of the key points for using the T5 model in Keras
text-to-text-transfer-transformer
Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer"
transformers
🤗 Transformers: State-of-the-art natural language processing for PyTorch and TensorFlow 2.0.