数据娃掘's repositories
bi-attention-flow
Bidirectional Attention Flow
DistilBERT-SQuAD
Distilled BERT question-answering model trained on SQuAD
ChatBotCourse
A hands-on tutorial on building your own chatbot
transformers
A collection of resources to study Transformers in depth.
AB_distillation
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
ai-bestpapers
:trophy: AI Best Paper Awards
AwesomeMRC
This repo is our research summary and playground for MRC. More features are coming.
BERT-for-RRC-ABSA
code for our NAACL 2019 paper: "BERT Post-Training for Review Reading Comprehension and Aspect-based Sentiment Analysis"
Electra_with_tensorflow
An implementation of ELECTRA, following the paper "ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators"
LSTM_Stock
Explains the LSTM computation step by step, then uses Keras to predict stock price movements with an LSTM
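For reference, the per-timestep LSTM computation the repo walks through can be sketched in plain NumPy. This is an illustrative single-cell step, not code from the repository; the function name and weight layout (four gates stacked in one matrix) are assumptions.

```python
import numpy as np

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step (illustrative sketch).
    W: (4*hidden, input_dim), U: (4*hidden, hidden), b: (4*hidden,)
    Gate order in the stacked matrices: input, forget, output, candidate."""
    hidden = h_prev.shape[0]
    z = W @ x + U @ h_prev + b                    # all four gate pre-activations at once
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    i = sigmoid(z[:hidden])                       # input gate
    f = sigmoid(z[hidden:2 * hidden])             # forget gate
    o = sigmoid(z[2 * hidden:3 * hidden])         # output gate
    g = np.tanh(z[3 * hidden:])                   # candidate cell state
    c = f * c_prev + i * g                        # new cell state
    h = o * np.tanh(c)                            # new hidden state
    return h, c
```

Stacking the four gate projections into one matrix multiply mirrors how Keras implements its LSTM layer internally.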
nlp_chinese_corpus
Large-scale Chinese corpus for NLP
Shift-AI-models-to-real-world-products
Useful guides and references on shipping AI models into real-world products and projects
siamese-BERT-fake-news-detection-LIAR
Triple Branch BERT Siamese Network for fake news classification on LIAR-PLUS dataset in PyTorch
simple-Linear-Regression
A simple linear-regression implementation in Python
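The closed-form ordinary-least-squares fit that a simple linear-regression repo like this implements can be written in a few lines of pure Python. A minimal sketch (function name is an assumption, not taken from the repository):

```python
def fit_simple_linear_regression(xs, ys):
    """Ordinary least squares for y = a + b*x, closed-form solution.
    Returns (intercept a, slope b)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance of x and y divided by variance of x.
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x               # intercept from the point (mean_x, mean_y)
    return a, b
```

For example, fitting points drawn from y = 2x + 1 recovers intercept 1 and slope 2 exactly.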
tflite-android-transformers
DistilBERT / GPT-2 for on-device inference thanks to TensorFlow Lite with Android demo apps
transformers-1
🤗 Transformers: State-of-the-art Natural Language Processing for TensorFlow 2.0 and PyTorch.
universal-resume
Minimal and formal résumé (CV) website template for print, mobile, and desktop. https://bit.ly/2kEzgt8
UnsupervisedQA
Unsupervised Question Answering via Cloze Translation
wiki_zh_word2vec
Experiments building word2vec word-vector models from the Chinese Wikipedia corpus with Python
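The core of word2vec training on any corpus, Wikipedia included, is generating (center, context) pairs within a sliding window over the tokenized text. A minimal sketch of that step (the helper name and window default are assumptions; the repository itself likely uses gensim rather than hand-rolled code):

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs as in word2vec's skip-gram model.
    Each token is paired with every other token within `window` positions."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:                    # skip the center word itself
                pairs.append((center, tokens[j]))
    return pairs
```

Each pair becomes one training example: the model learns embeddings that predict a context word from its center word.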