sylershao's starred repositories
SUPERVISED-CONTRASTIVE-LEARNING-FOR-PRE-TRAINED-LANGUAGE-MODEL-FINE-TUNING
In this project, I've implemented the Facebook paper on fine-tuning RoBERTa with a supervised contrastive loss.
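The supervised contrastive objective the description refers to (Khosla et al., 2020) pulls together representations of examples that share a label and pushes apart the rest. Below is a minimal NumPy sketch of that loss, not code from the repo: it assumes L2-normalizable feature vectors and averages the per-anchor log-probability of each same-label positive against all other examples in the batch.

```python
import numpy as np

def supcon_loss(features, labels, temperature=0.07):
    """Supervised contrastive loss over a batch (minimal sketch).

    features: (N, D) array of representations; labels: length-N list.
    Each anchor i is contrasted against all other examples, with its
    same-label examples as positives.
    """
    # L2-normalize so dot products are cosine similarities
    z = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = z @ z.T / temperature

    n = len(labels)
    loss, anchors = 0.0, 0
    for i in range(n):
        positives = [p for p in range(n) if p != i and labels[p] == labels[i]]
        if not positives:
            continue  # anchors with no positive are skipped
        others = [a for a in range(n) if a != i]
        log_denom = np.log(np.sum(np.exp(sim[i, others])))
        # mean negative log-likelihood of the positives for this anchor
        loss += -np.mean([sim[i, p] - log_denom for p in positives])
        anchors += 1
    return loss / anchors
```

As a sanity check, a batch whose same-label examples already coincide in feature space yields a lower loss than one where labels and features are mismatched.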
Chinese-BERT-wwm
Pre-training with whole word masking for Chinese BERT (the Chinese BERT-wwm model series).
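Whole word masking changes BERT's masking step so that when one WordPiece of a word is chosen, every piece of that word is masked together. A small illustrative sketch (not from the repo), assuming the standard WordPiece convention that continuation pieces start with "##":

```python
import random

def whole_word_mask(tokens, mask_prob=0.15, mask_token="[MASK]", seed=0):
    """Mask whole words at once over a WordPiece token sequence.

    Pieces beginning with '##' are grouped with the preceding word,
    then each whole word is masked with probability mask_prob.
    """
    rng = random.Random(seed)

    # Group token indices into whole words
    words = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and words:
            words[-1].append(i)  # continuation piece joins previous word
        else:
            words.append([i])    # new word starts here

    out = list(tokens)
    for word in words:
        if rng.random() < mask_prob:
            for i in word:       # mask every piece of the chosen word
                out[i] = mask_token
    return out
```

For Chinese text (where BERT tokenizes to single characters), the wwm models instead group characters by word-segmentation boundaries, but the masking principle is the same: the unit of masking is the whole word, never a lone sub-piece.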
BERT-BiLSTM-CRF-NER
TensorFlow solution for the NER task using a BiLSTM-CRF model with Google BERT fine-tuning, plus private server deployment services.
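In a BiLSTM-CRF tagger, the CRF layer scores whole tag sequences (per-token emission scores plus tag-to-tag transition scores) and decodes the best sequence with the Viterbi algorithm. A minimal NumPy sketch of that decode step, independent of the repo's TensorFlow implementation:

```python
import numpy as np

def viterbi_decode(emissions, transitions):
    """Highest-scoring tag sequence for a linear-chain CRF.

    emissions: (T, K) per-token tag scores from the BiLSTM.
    transitions: (K, K) score of moving from tag i to tag j.
    Returns the best tag index sequence of length T.
    """
    T, K = emissions.shape
    score = emissions[0].copy()            # best score ending in each tag
    backptr = np.zeros((T, K), dtype=int)  # best previous tag per step

    for t in range(1, T):
        # total[i, j]: best path ending in tag i, then moving to tag j
        total = score[:, None] + transitions + emissions[t][None, :]
        backptr[t] = np.argmax(total, axis=0)
        score = np.max(total, axis=0)

    # Backtrack from the best final tag
    best = [int(np.argmax(score))]
    for t in range(T - 1, 0, -1):
        best.append(int(backptr[t, best[-1]]))
    return best[::-1]
```

The transition matrix is what lets the CRF enforce label constraints a plain softmax cannot, e.g. penalizing an I-PER tag that directly follows B-ORG.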
CopyTranslator
Foreign-language reading and translation assistant based on copy-and-translate.