
Hierarchical-Attention-Network

PyTorch implementation of Hierarchical Attention Network.

Modules

WordGRU: models.WordGRU

A GRU layer that encodes batches of words into hidden states.

WordAttn: models.WordAttn

An attention layer that pools the encoded words of a sentence into a sentence vector.

SentenceGRU: models.SentenceGRU

A GRU layer that takes the sentence vectors of a document and encodes them.

SentenceAttn: models.SentenceAttn

An attention layer that pools the encoded sentences into a document vector.

HAN: models.HAN

The top-level class that takes in documents and trains the network (see the sketch after this list).
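Wired together, these modules form the standard HAN pipeline: word encodings are pooled by attention into sentence vectors, and sentence encodings are pooled by attention into a document vector used for classification. The following is a minimal, self-contained sketch of that composition; the class names, dimensions, and hyperparameters are illustrative and not taken from models.py:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class Attention(nn.Module):
    """Additive attention that pools a sequence of hidden states into one vector."""

    def __init__(self, hidden_dim):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, hidden_dim)
        self.context = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, h):                            # h: (batch, seq_len, hidden_dim)
        u = torch.tanh(self.proj(h))                 # (batch, seq_len, hidden_dim)
        alpha = F.softmax(self.context(u), dim=1)    # attention weights over the sequence
        return (alpha * h).sum(dim=1)                # (batch, hidden_dim)


class HANSketch(nn.Module):
    """Word GRU + attention -> sentence vectors; sentence GRU + attention -> document vector."""

    def __init__(self, vocab_size, embed_dim=100, hidden_dim=50, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.word_gru = nn.GRU(embed_dim, hidden_dim, bidirectional=True, batch_first=True)
        self.word_attn = Attention(2 * hidden_dim)
        self.sent_gru = nn.GRU(2 * hidden_dim, hidden_dim, bidirectional=True, batch_first=True)
        self.sent_attn = Attention(2 * hidden_dim)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, docs):                          # docs: (batch, num_sents, num_words) token ids
        batch, num_sents, num_words = docs.shape
        words = docs.reshape(batch * num_sents, num_words)
        word_h, _ = self.word_gru(self.embedding(words))   # encode words
        sent_vecs = self.word_attn(word_h)                  # pool words -> sentence vectors
        sent_vecs = sent_vecs.view(batch, num_sents, -1)
        sent_h, _ = self.sent_gru(sent_vecs)                # encode sentences
        doc_vec = self.sent_attn(sent_h)                    # pool sentences -> document vector
        return self.classifier(doc_vec)                     # class logits


# Example: a batch of 2 documents, each with 3 sentences of 6 token ids.
model = HANSketch(vocab_size=1000)
logits = model(torch.randint(1, 1000, (2, 3, 6)))
print(logits.shape)  # torch.Size([2, 2])
```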

Download and Prepare Data

Download the data from https://www.kaggle.com/utathya/imdb-review-dataset

Prepare the data for the network using: python imdb_data.py imdb_master.csv
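For reference, the preprocessing has to turn each review into a nested structure of sentences of word tokens. The following is a rough sketch of that step, not the actual imdb_data.py; the column names ("review", "label", the "unsup" rows) and the latin-1 encoding are assumptions about imdb_master.csv:

```python
# Sketch only: split each review into at most MAX_SENTS sentences of at most
# MAX_WORDS words, ready to be mapped to ids and padded.
import pandas as pd
from nltk.tokenize import sent_tokenize, word_tokenize  # nltk.download("punkt") may be required

MAX_SENTS, MAX_WORDS = 15, 40

def to_nested_tokens(text):
    """Split a review into sentences, each truncated to MAX_WORDS word tokens."""
    sents = sent_tokenize(text.lower())[:MAX_SENTS]
    return [word_tokenize(s)[:MAX_WORDS] for s in sents]

df = pd.read_csv("imdb_master.csv", encoding="latin-1")  # encoding assumed
df = df[df["label"] != "unsup"]                          # drop unlabeled reviews
docs = [to_nested_tokens(review) for review in df["review"]]
labels = (df["label"] == "pos").astype(int).tolist()     # pos -> 1, neg -> 0
```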

Training the Network

python train.py
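train.py's internals and arguments are not shown here; a generic PyTorch training loop over a model like the HANSketch above might look like this (the toy tensors, batch size, and learning rate are placeholders):

```python
# Generic training loop sketch; replace the toy tensors with the padded
# documents and labels produced by imdb_data.py.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

docs = torch.randint(1, 1000, (64, 15, 40))   # (num_docs, sents, words)
labels = torch.randint(0, 2, (64,))
loader = DataLoader(TensorDataset(docs, labels), batch_size=16, shuffle=True)

model = HANSketch(vocab_size=1000)            # module from the sketch above
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

for epoch in range(3):
    for batch_docs, batch_labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(batch_docs), batch_labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```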
