CyberZHG / keras-self-attention

Attention mechanism for processing sequential data that considers the context for each timestamp.

Home Page: https://pypi.org/project/keras-self-attention/


Self-attention before BiLSTM

katekats opened this issue · comments

Hi,

Is it possible to place the self-attention layer from this library directly after the input (word embeddings) and before the BiLSTM layer? How would the equations of the self-attention layer need to be rewritten in that case?
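
Not an official answer, but a minimal sketch of what is being asked: `SeqSelfAttention` operates on any input of shape `(batch, steps, features)` and returns a tensor of the same shape, so it can in principle sit between the embedding layer and the BiLSTM. The vocabulary size, embedding dimension, unit counts and output size below are arbitrary placeholders, not values from the issue.

```python
import keras
from keras_self_attention import SeqSelfAttention

model = keras.models.Sequential()
# Word embeddings: output shape (batch, steps, 128) -- placeholder sizes
model.add(keras.layers.Embedding(input_dim=10000,
                                 output_dim=128,
                                 mask_zero=True))
# Self-attention applied to the embedding vectors instead of BiLSTM states;
# the output keeps the same (batch, steps, features) shape
model.add(SeqSelfAttention(attention_activation='sigmoid'))
# BiLSTM then consumes the attended sequence
model.add(keras.layers.Bidirectional(keras.layers.LSTM(units=64)))
model.add(keras.layers.Dense(units=2, activation='softmax'))  # placeholder task head
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
model.summary()
```

If this reading is correct, the layer's equations would not need to be rewritten: they are defined over a generic input sequence of hidden vectors, so the embedding vectors simply take the place of the BiLSTM hidden states as that input.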