CyberZHG / keras-self-attention

Attention mechanism for processing sequential data that considers the context for each timestamp.
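For context, a minimal usage sketch in the spirit of the project's README: a `SeqSelfAttention` layer placed after a recurrent layer so each timestep's output is weighted by its context. Layer sizes, the embedding dimensions, and the sigmoid attention activation below are illustrative assumptions, not fixed parts of the API.

```python
import keras
from keras_self_attention import SeqSelfAttention

model = keras.models.Sequential()
# Embed token ids; mask_zero lets the attention layer ignore padding
model.add(keras.layers.Embedding(input_dim=10000, output_dim=300, mask_zero=True))
# Recurrent layer must return the full sequence for attention to work over it
model.add(keras.layers.Bidirectional(keras.layers.LSTM(units=128, return_sequences=True)))
# Self-attention over the LSTM outputs, one context vector per timestep
model.add(SeqSelfAttention(attention_activation='sigmoid'))
# Per-timestep classification head (e.g. sequence labeling)
model.add(keras.layers.Dense(units=5, activation='softmax'))
model.compile(optimizer='adam', loss='categorical_crossentropy',
              metrics=['categorical_accuracy'])
```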

Home Page: https://pypi.org/project/keras-self-attention/

Can this be used in a seq2seq task?

cristianmtr opened this issue · comments

Can this be used in a seq2seq task, with an encoder LSTM and a decoder LSTM? The examples don't seem to cover this case.
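For reference, the kind of model the question describes, an encoder LSTM passing its final states to a decoder LSTM, typically looks like the sketch below (plain Keras, no attention yet; names and sizes are illustrative assumptions):

```python
from keras.layers import Input, LSTM, Dense
from keras.models import Model

latent_dim, num_tokens = 256, 10000

# Encoder: keep only the final hidden and cell states
encoder_inputs = Input(shape=(None, num_tokens))
_, state_h, state_c = LSTM(latent_dim, return_state=True)(encoder_inputs)

# Decoder: initialized from the encoder states, emits one token per step
decoder_inputs = Input(shape=(None, num_tokens))
decoder_outputs = LSTM(latent_dim, return_sequences=True)(
    decoder_inputs, initial_state=[state_h, state_c])
decoder_outputs = Dense(num_tokens, activation='softmax')(decoder_outputs)

model = Model([encoder_inputs, decoder_inputs], decoder_outputs)
model.compile(optimizer='adam', loss='categorical_crossentropy')
```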

If you are using TensorFlow, you can use AttentionWrapper instead of this library; it is built-in functionality and supports both Bahdanau and Luong attention.
https://www.tensorflow.org/api_docs/python/tf/contrib/seq2seq/AttentionWrapper
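A rough TF 1.x sketch of that suggestion: an LSTM encoder whose outputs feed an `AttentionWrapper`-wrapped LSTM decoder. The placeholders, sizes, and the choice of Luong attention are illustrative assumptions (swap in `BahdanauAttention` for additive attention); note `tf.contrib` was removed in TensorFlow 2.x.

```python
import tensorflow as tf  # TensorFlow 1.x only (tf.contrib is gone in 2.x)

num_units, vocab_size = 128, 10000

# Embedded source/target sequences and their lengths (placeholders for illustration)
source_emb = tf.placeholder(tf.float32, [None, None, num_units])
target_emb = tf.placeholder(tf.float32, [None, None, num_units])
source_len = tf.placeholder(tf.int32, [None])
target_len = tf.placeholder(tf.int32, [None])

# Encoder LSTM over the source sequence
encoder_cell = tf.nn.rnn_cell.LSTMCell(num_units)
encoder_outputs, encoder_state = tf.nn.dynamic_rnn(
    encoder_cell, source_emb, sequence_length=source_len, dtype=tf.float32)

# Luong (multiplicative) attention over the encoder outputs
attention = tf.contrib.seq2seq.LuongAttention(
    num_units, memory=encoder_outputs, memory_sequence_length=source_len)

# Decoder LSTM wrapped so every step attends over the encoder outputs
decoder_cell = tf.contrib.seq2seq.AttentionWrapper(
    tf.nn.rnn_cell.LSTMCell(num_units), attention,
    attention_layer_size=num_units)

# Start decoding from the encoder's final state
initial_state = decoder_cell.zero_state(
    tf.shape(source_emb)[0], tf.float32).clone(cell_state=encoder_state)

helper = tf.contrib.seq2seq.TrainingHelper(target_emb, target_len)
decoder = tf.contrib.seq2seq.BasicDecoder(
    decoder_cell, helper, initial_state,
    output_layer=tf.layers.Dense(vocab_size))
outputs, _, _ = tf.contrib.seq2seq.dynamic_decode(decoder)
logits = outputs.rnn_output  # [batch, target_len, vocab_size]
```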

commented

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.