ohho-zb's repositories
Chinese-BERT-wwm
Pre-Training with Whole Word Masking for Chinese BERT
License: Apache-2.0
deep_learning-1
Basic Deep Learning
Language: Python · License: MIT
Entity-Relation-As-Multi-Turn-QA
Code for ACL 2019: Entity-Relation Extraction as Multi-Turn Question Answering
Language: Python · License: Apache-2.0
fanqiang
Bypassing the Great Firewall (censorship circumvention)
Language: JavaScript
github
test
NeuRec
Next RecSys Library
Language: Python
pytorch-pretrained-BERT
📖The Big-&-Extending-Repository-of-Transformers: Pretrained PyTorch models for Google's BERT, OpenAI GPT & GPT-2, Google/CMU Transformer-XL.
Language: Python · License: Apache-2.0
xlnet
XLNet: Generalized Autoregressive Pretraining for Language Understanding
Language: Python · License: Apache-2.0