wharstr9027's repositories
bert
TensorFlow code and pre-trained models for BERT
Apache-2.0
NLP-progress
Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the most common NLP tasks.
MIT
xlnet
XLNet: Generalized Autoregressive Pretraining for Language Understanding
Apache-2.0
TextClassifier_Transformer
A personal text classifier built on Google's open-source BERT (fine-tuning approach). It can freely load well-known pre-trained NLP language models: BERT, BERT-wwm, RoBERTa, ALBERT, and ERNIE 1.0.
Multi_Label_Classifier_finetune
Fine-tunes pre-trained language models for multi-label classification tasks (can load BERT, RoBERTa, BERT-wwm, ALBERT, and other well-known open-source TF-format models).
Focal-Loss-implement-on-Tensorflow
An implementation of the focal loss proposed in "Focal Loss for Dense Object Detection" by Kaiming He et al., with support for multi-label datasets.
Apache-2.0
elasticsearch-definitive-guide
Welcome to join QQ group 109764489 and contribute!
Language:HTML
NOASSERTION
cws_evaluation
cws_evaluation: an open-source Java project for evaluating and comparing the segmentation quality of Chinese word segmenters.
Language:Lex
Apache-2.0
doc.elasticsearch.cn
Chinese translation of the Elasticsearch documentation.
Language:HTML