wangwang110 / birnn-language-model-tf

TensorFlow implementation of a bi-directional RNN language model

Bi-directional RNN Language Model in TensorFlow

TensorFlow implementation of a bi-directional RNN language model, following the paper [Contextual Bidirectional Long Short-Term Memory Recurrent Neural Network Language Models: A Generative Approach to Sentiment Analysis].
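
As a rough sketch of the idea (not the code in this repository, which may be written against a different TensorFlow API), the snippet below wires up a bi-directional LSTM language model in tf.keras: the word at position t is predicted from the forward state at t-1 and the backward state at t+1, so neither direction ever sees the target word. All sizes (vocab_size, embedding_size, num_hidden, keep_prob) are illustrative placeholders named after the flags listed under Usage.

import tensorflow as tf

# Illustrative sizes only; they mirror the train.py flag names, not its defaults.
vocab_size = 10000
embedding_size = 128    # --embedding_size
num_hidden = 256        # --num_hidden
keep_prob = 0.5         # --keep_prob

inputs = tf.keras.Input(shape=(None,), dtype="int32")                  # token ids, (batch, time)
emb = tf.keras.layers.Embedding(vocab_size, embedding_size)(inputs)

# Forward LSTM reads left-to-right, backward LSTM reads right-to-left.
fwd = tf.keras.layers.LSTM(num_hidden, return_sequences=True)(emb)
bwd = tf.keras.layers.LSTM(num_hidden, return_sequences=True, go_backwards=True)(emb)
bwd = tf.keras.layers.Lambda(lambda t: tf.reverse(t, axis=[1]))(bwd)   # restore time order

# Shift each direction by one step so position t only sees its left and right context.
fwd_shifted = tf.keras.layers.Lambda(
    lambda t: tf.pad(t, [[0, 0], [1, 0], [0, 0]])[:, :-1, :])(fwd)     # state after word t-1
bwd_shifted = tf.keras.layers.Lambda(
    lambda t: tf.pad(t, [[0, 0], [0, 1], [0, 0]])[:, 1:, :])(bwd)      # state after word t+1

merged = tf.keras.layers.Concatenate()([fwd_shifted, bwd_shifted])
merged = tf.keras.layers.Dropout(1.0 - keep_prob)(merged)
logits = tf.keras.layers.Dense(vocab_size)(merged)

model = tf.keras.Model(inputs, logits)
model.compile(optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))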

Requirements

  • Python 3
  • TensorFlow

Usage

The Penn Treebank (PTB) dataset is used for training and testing. ptb_data is copied from the data/ directory of the PTB dataset on Tomas Mikolov's webpage.
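
The sketch below shows one way the files in ptb_data could be read and mapped to word ids; the file names (ptb.train.txt and friends) follow Mikolov's distribution and are an assumption about this repository's layout, not taken from its code.

import collections
import os

def read_words(path):
    # Whitespace-tokenized PTB text; mark sentence boundaries with <eos>.
    with open(path, "r", encoding="utf-8") as f:
        return f.read().replace("\n", " <eos> ").split()

def build_vocab(words):
    # Map words to integer ids, most frequent word first.
    counter = collections.Counter(words)
    return {w: i for i, (w, _) in enumerate(counter.most_common())}

data_dir = "ptb_data"
train_words = read_words(os.path.join(data_dir, "ptb.train.txt"))
word_to_id = build_vocab(train_words)
train_ids = [word_to_id[w] for w in train_words]
print(f"vocab size: {len(word_to_id)}, training tokens: {len(train_ids)}")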

Train

$ python train.py

Hyperparameters

$ python train.py -h
usage: train.py [-h] [--model MODEL] [--embedding_size EMBEDDING_SIZE]
                [--num_layers NUM_LAYERS] [--num_hidden NUM_HIDDEN]
                [--keep_prob KEEP_PROB] [--learning_rate LEARNING_RATE]
                [--batch_size BATCH_SIZE] [--num_epochs NUM_EPOCHS]

optional arguments:
  -h, --help            show this help message and exit
  --model MODEL         rnn | birnn
  --embedding_size EMBEDDING_SIZE
                        embedding size.
  --num_layers NUM_LAYERS
                        RNN network depth.
  --num_hidden NUM_HIDDEN
                        RNN network size.
  --keep_prob KEEP_PROB
                        dropout keep prob.
  --learning_rate LEARNING_RATE
                        learning rate.
  --batch_size BATCH_SIZE
                        batch size.
  --num_epochs NUM_EPOCHS
                        number of epochs.

Experimental Results

  • Orange Line: LSTM language model
  • Blue Line: Bi-directional LSTM language model

Training Loss

Loss for Test Data

References

  • Contextual Bidirectional Long Short-Term Memory Recurrent Neural Network Language Models: A Generative Approach to Sentiment Analysis
