yanzhitech / bert

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding


bert

Using BERT for the Chinese NER (named entity recognition) task.

BERT is described in "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding": https://arxiv.org/pdf/1810.04805.pdf
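A BERT-based NER model typically predicts a per-token BIO label (e.g. B-PER, I-LOC, O), which must then be decoded into entity spans. A minimal sketch of that decoding step, assuming a BIO tagging scheme and character-level tokens (the function and example below are illustrative, not taken from this repo):

```python
def decode_bio(tokens, labels):
    """Collect (entity_type, text) spans from per-token BIO labels."""
    entities, cur_type, cur_toks = [], None, []
    for tok, lab in zip(tokens, labels):
        if lab.startswith("B-"):
            # A B- tag starts a new entity; flush any open one first.
            if cur_type:
                entities.append((cur_type, "".join(cur_toks)))
            cur_type, cur_toks = lab[2:], [tok]
        elif lab.startswith("I-") and cur_type == lab[2:]:
            # I- tag of the same type continues the current entity.
            cur_toks.append(tok)
        else:
            # O tag (or inconsistent I-) closes the current entity.
            if cur_type:
                entities.append((cur_type, "".join(cur_toks)))
            cur_type, cur_toks = None, []
    if cur_type:
        entities.append((cur_type, "".join(cur_toks)))
    return entities

# Chinese text is usually tokenized character by character for BERT,
# so joining tokens with "" reconstructs each entity string.
tokens = list("李明在北京工作")
labels = ["B-PER", "I-PER", "O", "B-LOC", "I-LOC", "O", "O"]
print(decode_bio(tokens, labels))  # [('PER', '李明'), ('LOC', '北京')]
```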



Languages

Python 50.2%, HTML 49.8%