

GORU-tensorflow

A Gated Orthogonal Recurrent Unit (GORU) implementation in TensorFlow.

If you find this work useful, please cite the paper "Gated Orthogonal Recurrent Units: On Learning to Forget" (arXiv:1706.02761).

Installation

Requires TensorFlow 1.2.0.
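
For example, that release can be installed with pip (assuming a Python environment compatible with TensorFlow 1.2):

pip install tensorflow==1.2.0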

Usage

To use GORU in your model, simply copy GORU.py into your project.

Then you can use GORU the same way you would use the built-in LSTM cell (a fuller sketch follows the argument list below):

from GORU import GORUCell
cell = GORUCell(n_hidden, fft=True)

Args:

  • n_hidden: Integer. Size of the hidden state.
  • capacity: Optional. Integer. Only used in the tunable style (fft=False).
  • fft: Optional. Bool. If True, the cell uses the FFT-style orthogonal parameterization; if False, the tunable style. Default is True.
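
As an illustration, here is a minimal sketch of dropping the cell into a standard TensorFlow 1.2 RNN pipeline. The dimensions and placeholder below are hypothetical and not part of the repository, and the cell is assumed to follow the standard RNNCell interface:

import tensorflow as tf
from GORU import GORUCell

# Hypothetical dimensions (not taken from the repository)
n_input, n_steps, n_hidden, n_classes = 10, 200, 128, 9

# Inputs: [batch, time, feature]
x = tf.placeholder(tf.float32, [None, n_steps, n_input])

# GORU cell with the FFT-style orthogonal transform
cell = GORUCell(n_hidden, fft=True)
outputs, _ = tf.nn.dynamic_rnn(cell, x, dtype=tf.float32)

# Project the last hidden state onto the output classes
logits = tf.layers.dense(outputs[:, -1, :], n_classes)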

Example tasks for GORU

Three RNN tasks from the paper are included. Run any script with -h for more information.

Copying Memory Task

python copying_task.py --model GORU

Denoise Task

python denoise_task.py --model GORU

Parenthesis Task

python paren_task.py --model GORU


