THUDM's repositories
Multilingual-GLM
The multilingual variant of GLM, a general language model trained with an autoregressive blank-infilling objective
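The blank-infilling objective can be illustrated with a minimal sketch. This is an assumption-laden simplification (token names like `[MASK]`, `[START]`, `[END]` and single-span sampling are illustrative, not GLM's exact implementation): a contiguous span is removed from the input, and the model regenerates it autoregressively.

```python
# Hedged sketch of an autoregressive blank-infilling training example.
# Span sampling and special-token names are simplified assumptions,
# not GLM's exact preprocessing.
import random

def make_blank_infilling_example(tokens, span_len=2, seed=0):
    """Mask one contiguous span; the model must regenerate it autoregressively."""
    rng = random.Random(seed)
    start = rng.randrange(0, len(tokens) - span_len + 1)
    span = tokens[start:start + span_len]
    # Part A: corrupted input with the span replaced by a mask placeholder.
    part_a = tokens[:start] + ["[MASK]"] + tokens[start + span_len:]
    # Part B: autoregressive target -- start token, masked span, end token.
    part_b = ["[START]"] + span + ["[END]"]
    return part_a, part_b

part_a, part_b = make_blank_infilling_example(
    ["GLM", "is", "a", "general", "language", "model"]
)
```

The model conditions on `part_a` (bidirectionally in GLM) and predicts `part_b` token by token, which unifies masked-LM-style corruption with autoregressive generation.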
FasterTransformer
Transformer-related optimization, including BERT and GPT
GRAND-plus
Code and dataset for paper "GRAND+: Scalable Graph Random Neural Networks"
GLM-iprompt
Applies Iprompt to GLM with new methods. Currently supports Chinese QA, English QA, and Chinese poem generation.
Efficient-Head-Finetuning
Source code for the EMNLP 2022 long paper "Parameter-Efficient Tuning Makes a Good Classification Head"