prakruti-joshi / Gated-Attention-Network

PyTorch implementation of the Gated Attention Network for text classification. The model is compared with BiLSTM and soft attention models on the IMDb and TREC datasets.


Gated Attention Network

Implementation of the paper "Not All Attention Is Needed: Gated Attention Network for Sequence Data" (GA-Net).

The GA-Net model is implemented in Python using the PyTorch framework and is compared with soft attention and BiLSTM models on the text classification task using the TREC and IMDb datasets.

Flow Diagram for the network:

There are two networks in the model (see the sketch after this list):

  1. Backbone Network
  2. Auxiliary Network
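
The sketch below illustrates how these two pieces could fit together in PyTorch: an auxiliary encoder proposes a binary gate for each token (kept differentiable with Gumbel-Softmax, as described in the GA-Net paper), and the backbone BiLSTM attends only over the tokens whose gates are open. This is a minimal illustration with made-up layer sizes and names, not the repository's exact code.

```python
# Minimal GA-Net-style sketch (illustrative only, not the repository's implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F

class GANetSketch(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128, num_classes=2, tau=0.5):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Backbone network: encodes the sequence and attends over the gated tokens.
        self.backbone = nn.LSTM(embed_dim, hidden_dim, bidirectional=True, batch_first=True)
        self.attn_score = nn.Linear(2 * hidden_dim, 1)
        # Auxiliary network: a lightweight encoder that decides which tokens get attention.
        self.auxiliary = nn.LSTM(embed_dim, hidden_dim // 2, bidirectional=True, batch_first=True)
        self.gate_logits = nn.Linear(hidden_dim, 2)  # logits for [gate closed, gate open]
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)
        self.tau = tau  # Gumbel-Softmax temperature

    def forward(self, tokens):
        emb = self.embedding(tokens)                              # (B, T, E)
        # Auxiliary network samples a binary gate per token; hard=True gives 0/1 gates
        # in the forward pass while keeping gradients via the straight-through trick.
        aux_out, _ = self.auxiliary(emb)                          # (B, T, hidden_dim)
        gates = F.gumbel_softmax(self.gate_logits(aux_out), tau=self.tau, hard=True)[..., 1]  # (B, T)
        # Backbone network computes attention scores only over open-gate tokens.
        enc, _ = self.backbone(emb)                               # (B, T, 2*hidden_dim)
        scores = self.attn_score(enc).squeeze(-1)                 # (B, T)
        scores = scores.masked_fill(gates == 0, -1e9)             # closed gates get ~zero weight
        attn = torch.softmax(scores, dim=-1)                      # gated attention weights
        context = torch.bmm(attn.unsqueeze(1), enc).squeeze(1)    # (B, 2*hidden_dim)
        return self.classifier(context), gates, attn
```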

Comparison with soft attention network:

Soft attention assigns some weight (low or high) to every input token, whereas the gated attention network attends only to the most important tokens.
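
The small example below (illustrative, not repository code) makes the contrast concrete: the soft attention distribution gives every token a nonzero weight, while the gated version drives the weights of closed-gate tokens to zero.

```python
# Soft attention vs. gated attention on made-up scores and gates.
import torch

scores = torch.tensor([2.0, 0.1, -1.0, 1.5])   # hypothetical attention scores for 4 tokens
gates  = torch.tensor([1.0, 0.0, 0.0, 1.0])    # hypothetical binary gates from the auxiliary network

soft_attn  = torch.softmax(scores, dim=-1)                                 # every token gets some weight
gated_attn = torch.softmax(scores.masked_fill(gates == 0, -1e9), dim=-1)   # only open-gate tokens

print(soft_attn)   # approx. tensor([0.55, 0.08, 0.03, 0.34]) -- all nonzero
print(gated_attn)  # approx. tensor([0.62, 0.00, 0.00, 0.38]) -- closed gates get ~zero weight
```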

Gate Probability and gated attention:

Visualization of the probability of the gate being open for each input token, alongside the actual gated attention weights.
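
The snippet below is a hedged sketch of how such a plot could be produced with matplotlib; the token list, gate probabilities, and attention weights are placeholder values for illustration, not outputs of the trained model.

```python
# Illustrative bar chart of per-token gate probability vs. gated attention weight.
import matplotlib.pyplot as plt

tokens       = ["the", "movie", "was", "surprisingly", "good"]
gate_probs   = [0.12, 0.85, 0.10, 0.95, 0.90]   # P(gate open) per token (made-up values)
attn_weights = [0.00, 0.30, 0.00, 0.45, 0.25]   # gated attention weights (made-up values)

x = range(len(tokens))
plt.figure(figsize=(6, 3))
plt.bar([i - 0.2 for i in x], gate_probs, width=0.4, label="P(gate open)")
plt.bar([i + 0.2 for i in x], attn_weights, width=0.4, label="gated attention")
plt.xticks(list(x), tokens)
plt.legend()
plt.tight_layout()
plt.show()
```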

