Vasishta / Linear-Attention-Recurrent-Neural-Network

A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can easily be used inside a loop on the cell state, just like any other RNN.
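Below is a minimal sketch of the idea: an LSTM cell whose input is augmented with a multi-head attention read-out over a window of its own past cell states, driven by a plain loop. This is an illustrative simplification, not the repository's implementation; the class name `SimpleLARNNCell`, the window size, and the way the attention output is merged into the input are assumptions.

```python
# Sketch of a LARNN-style cell (assumed structure, not the repo's exact code):
# an LSTM cell that attends over a window of its own past cell states.
import torch
import torch.nn as nn


class SimpleLARNNCell(nn.Module):
    def __init__(self, input_size, hidden_size, window=8, num_heads=4):
        super().__init__()
        self.window = window
        # The LSTM cell sees the raw input concatenated with the attention
        # read-out over past cell states.
        self.lstm = nn.LSTMCell(input_size + hidden_size, hidden_size)
        self.attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)

    def forward(self, x, state, past_cells):
        h, c = state
        if past_cells:
            # Query with the current hidden state; keys/values are the last
            # `window` cell states, stacked along the time dimension.
            memory = torch.stack(past_cells[-self.window:], dim=1)
            read, _ = self.attn(h.unsqueeze(1), memory, memory)
            read = read.squeeze(1)
        else:
            read = torch.zeros_like(h)
        h, c = self.lstm(torch.cat([x, read], dim=-1), (h, c))
        past_cells.append(c)
        return h, (h, c), past_cells


# Usage: loop the cell over a sequence, just like any other RNN cell.
batch, seq_len, input_size, hidden_size = 2, 10, 16, 32
cell = SimpleLARNNCell(input_size, hidden_size)
xs = torch.randn(batch, seq_len, input_size)
h = torch.zeros(batch, hidden_size)
c = torch.zeros(batch, hidden_size)
past_cells = []
for t in range(seq_len):
    out, (h, c), past_cells = cell(xs[:, t], (h, c), past_cells)
print(out.shape)  # torch.Size([2, 32])
```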

Home Page: http://www.neuraxio.com/en/

