There is 1 repository under the roberta-wwm topic.
Pre-Training with Whole Word Masking for Chinese BERT (Chinese BERT-wwm series models)