CyberZHG / keras-self-attention

Attention mechanism for processing sequential data that considers the context for each timestamp.

Home Page: https://pypi.org/project/keras-self-attention/


what's the difference between SeqWeightedAttention and SeqSelfAttention?

qianwang102 opened this issue · comments

Is SeqWeightedAttention another implementation of self-attention, or is it not an implementation of self-attention at all?
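For reference, a minimal sketch of how the two layers differ in practice (assuming a recent keras-self-attention release used on top of tensorflow.keras; the shapes in the comments reflect how the package documents the layers and are not an authoritative answer to this issue): SeqSelfAttention keeps the time dimension and outputs one context-aware vector per timestep, while SeqWeightedAttention learns a scalar weight per timestep and collapses the sequence into a single vector.

```python
import numpy as np
from tensorflow import keras
from keras_self_attention import SeqSelfAttention, SeqWeightedAttention

# Input: a batch of sequences with 32 features per timestep.
inputs = keras.layers.Input(shape=(None, 32))

# SeqSelfAttention: attends over the sequence for every timestep,
# so the output keeps the time dimension.
self_att = SeqSelfAttention(attention_activation='sigmoid')(inputs)
# self_att shape: (batch, timesteps, 32)

# SeqWeightedAttention: learns one scalar weight per timestep and
# returns the weighted sum, collapsing the sequence to one vector.
weighted = SeqWeightedAttention()(inputs)
# weighted shape: (batch, 32)

model = keras.models.Model(inputs, [self_att, weighted])
model.summary()

# Quick shape check with dummy data (batch of 2, 10 timesteps).
out_seq, out_vec = model.predict(np.random.rand(2, 10, 32))
print(out_seq.shape, out_vec.shape)  # (2, 10, 32) (2, 32)
```

In other words, SeqSelfAttention is typically followed by further sequence layers, whereas SeqWeightedAttention is usually used as a pooling step before a classifier head.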

commented

Is this still relevant? If so, what is blocking it? Is there anything you can do to help move it forward?

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.