arixlin / Text-Classification

Implementation of papers for text classification task on DBpedia

Text-Classification

Implement some state-of-the-art text classification models with TensorFlow.

Attention Is All You Need

Paper: Attention Is All You Need

See all_attention.py

Uses self-attention, where Query = Key = Value = the embedded sentence

The multi-head attention module is implemented by Kyubyong
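For intuition, here is a minimal NumPy sketch of single-head scaled dot-product self-attention with Q = K = V = the embedded sentence (the actual model uses Kyubyong's multi-head TensorFlow implementation; the function name below is illustrative):

```python
import numpy as np

def self_attention(x):
    """Single-head scaled dot-product self-attention, Q = K = V = x.

    x: (seq_len, d_model) word-embedded sentence.
    Returns (context, weights): context vectors and the attention matrix.
    """
    d_model = x.shape[-1]
    scores = x @ x.T / np.sqrt(d_model)           # (seq_len, seq_len) similarities
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability for softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x, weights                   # each output is a weighted mix of inputs
```

Each output position is a convex combination of all input positions, which is what lets the model relate any pair of words in one step.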

IndRNN for Text Classification

Paper: Independently Recurrent Neural Network (IndRNN): Building A Longer and Deeper RNN

IndRNNCell is implemented by batzener
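The key idea in IndRNN is that each neuron has its own scalar recurrent weight, so the hidden state enters via an elementwise product instead of a full matrix multiply. A minimal NumPy sketch of one step (the repo uses batzener's TensorFlow IndRNNCell; the function name here is illustrative):

```python
import numpy as np

def indrnn_step(x_t, h_prev, W, u, b):
    """One IndRNN step: h_t = ReLU(W x_t + u * h_{t-1} + b).

    x_t:    (batch, input_dim) input at time t
    h_prev: (batch, hidden)    previous hidden state
    W:      (input_dim, hidden) input weights
    u:      (hidden,)          per-neuron recurrent weights (Hadamard, not matmul)
    b:      (hidden,)          bias
    """
    return np.maximum(0.0, x_t @ W + u * h_prev + b)
```

Because neurons don't mix across time steps, the recurrent gradient factorizes per neuron, which is what allows the longer and deeper networks the paper describes.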

Attention-Based Bidirectional LSTM for Text Classification

Paper: Attention-Based Bidirectional Long Short-Term Memory Networks for Relation Classification

See attn_bi_lstm.py
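The paper pools the Bi-LSTM outputs with a learned attention vector: α = softmax(wᵀ tanh(H)), r = Hᵀα. A NumPy sketch of that pooling step, assuming a single attention vector `w` as in the paper (names are illustrative, not the repo's API):

```python
import numpy as np

def attention_pool(H, w):
    """Attention pooling over Bi-LSTM outputs.

    H: (seq_len, 2*hidden) concatenated forward/backward LSTM outputs.
    w: (2*hidden,) learned attention vector.
    Returns (r, alpha): sentence representation and attention weights.
    """
    scores = np.tanh(H) @ w          # (seq_len,) unnormalized scores
    scores -= scores.max()           # numerical stability
    alpha = np.exp(scores)
    alpha /= alpha.sum()             # softmax over time steps
    return alpha @ H, alpha          # r = sum_t alpha_t * H_t
```

The resulting vector `r` (after a tanh in the paper) feeds the softmax classifier.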

Hierarchical Attention Networks for Text Classification

Paper: Hierarchical Attention Networks for Document Classification

See attn_lstm_hierarchical.py

The attention module is implemented by ilivans/tf-rnn-attention.
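HAN applies attention twice: word-level attention pools each sentence into a sentence vector, then sentence-level attention pools those into a document vector. A simplified NumPy sketch of the hierarchy (the real model also runs GRU encoders and a learned projection before each attention; function names are illustrative):

```python
import numpy as np

def _attend(H, w):
    """Attention pooling: softmax(tanh(H) @ w) weighted sum of rows of H."""
    scores = np.tanh(H) @ w
    scores -= scores.max()
    a = np.exp(scores)
    a /= a.sum()
    return a @ H

def han_document_vector(doc, w_word, w_sent):
    """doc: list of (n_words_i, d) arrays, one per sentence (encoder outputs).

    Word-level attention pools each sentence into a (d,) vector; sentence-level
    attention then pools the stacked sentence vectors into one (d,) document vector.
    """
    sent_vecs = np.stack([_attend(s, w_word) for s in doc])  # (n_sents, d)
    return _attend(sent_vecs, w_sent)                        # (d,)
```

Sentences may have different lengths; only the embedding width must match across levels.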

Adversarial Training Methods For Supervised Text Classification

Paper: Adversarial Training Methods For Semi-Supervised Text Classification

See adversarial_abblstm.py
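The adversarial-training idea is to perturb the word embeddings in the direction of the loss gradient, r_adv = ε · g / ‖g‖₂, and train on the perturbed input as well. A minimal NumPy sketch of the perturbation step (ε = 5.0 follows the paper's text experiments; the function name is illustrative):

```python
import numpy as np

def adversarial_perturbation(grad, epsilon=5.0):
    """FGSM-style perturbation on embeddings: r_adv = epsilon * g / ||g||_2.

    grad: gradient of the loss w.r.t. the (normalized) word embeddings.
    The perturbed embeddings feed a second forward pass whose loss is
    combined with the clean loss during training.
    """
    norm = np.sqrt((grad ** 2).sum()) + 1e-12  # small constant avoids division by zero
    return epsilon * grad / norm
```

Unlike image models, the perturbation is applied in embedding space, since discrete tokens can't be perturbed directly.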

Dataset

You can load the dataset with:

```python
dbpedia = tf.contrib.learn.datasets.load_dataset(
    'dbpedia', test_with_fake_data=FLAGS.test_with_fake_data)
```

Note that `tf.contrib` exists only in TensorFlow 1.x; it was removed in TensorFlow 2.

Performance

| Model | Test Accuracy | Notes |
| --- | --- | --- |
| Attention-based Bi-LSTM | 98.23% | |
| HAN | 89.15% | 1080Ti, 10 epochs, 12 min |
| Adversarial Attention-based Bi-LSTM | 98.5% | AWS p2, 2 hours |
| IndRNN | 98.39% | 1080Ti, 10 epochs, 10 min |
| Attention Is All You Need | 97.81% | 1080Ti, 15 epochs, 8 min |

TO DO

  • Code refactoring

About

Implementation of papers for text classification task on DBpedia

License: Apache License 2.0


Languages

Language: Python 100.0%