CyberZHG / keras-self-attention

Attention mechanism for processing sequential data that considers the context for each timestamp.
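To make the idea concrete, here is a minimal, self-contained sketch of *local* self-attention in plain Python: each timestep attends only to neighbors within a fixed window, with dot-product scores turned into weights by a softmax. This is purely illustrative and is not the library's actual implementation (which is a Keras layer, `SeqSelfAttention`).

```python
import math

def local_self_attention(states, width=1):
    """Toy local self-attention over a sequence of vectors.

    For each timestep t, attention is restricted to positions t'
    with |t - t'| <= width; dot-product scores are normalized by a
    softmax, and the output is the weighted sum (context vector).
    Illustrative sketch only, not the library's implementation.
    """
    outputs = []
    n = len(states)
    for t in range(n):
        lo, hi = max(0, t - width), min(n, t + width + 1)
        # Dot-product scores against states inside the local window.
        scores = [sum(a * b for a, b in zip(states[t], states[s]))
                  for s in range(lo, hi)]
        m = max(scores)
        exps = [math.exp(x - m) for x in scores]  # stable softmax
        total = sum(exps)
        weights = [e / total for e in exps]
        # Context vector: convex combination of the windowed states.
        context = [sum(w * states[lo + i][d] for i, w in enumerate(weights))
                   for d in range(len(states[t]))]
        outputs.append(context)
    return outputs
```

With `width=1`, each position mixes information only from itself and its immediate neighbors, which is the "local" restriction the issue below asks about.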

Home Page: https://pypi.org/project/keras-self-attention/

Which paper does Local Attention refer to?

Fatigerrr opened this issue · comments

Hello, I just want to know which paper Local Attention refers to. Thank you!

Could you please share the paper describing `SeqSelfAttention`?