CyberZHG / keras-self-attention

Attention mechanism for processing sequential data that considers the context for each timestamp.

Home Page: https://pypi.org/project/keras-self-attention/

Error (with multiplication?)

GadL opened this issue · comments

commented

I keep getting value errors like this when working with your attention mechanism:

ValueError: Dimensions must be equal, but are 128 and 32 for 'Attention/MatMul' (op: 'MatMul') with input shapes: [?,128], [32,32].
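A shape mismatch like `[?,128]` vs. `[32,32]` usually means the attention layer received a 2-D tensor `(batch, features)` instead of the 3-D tensor `(batch, timesteps, features)` it expects, e.g. from an upstream `LSTM` built without `return_sequences=True`. The sketch below (a plain-NumPy illustration, not the library's actual code; all names are mine) shows additive self-attention over a 3-D sequence and why a 2-D input cannot work:

```python
import numpy as np

def additive_self_attention(x, units=32, seed=0):
    """Illustrative additive (Bahdanau-style) self-attention.

    x must be 3-D: (batch, timesteps, features). A 2-D input, such as
    the output of an LSTM without return_sequences=True, is the typical
    cause of shape errors like the one in this issue.
    """
    if x.ndim != 3:
        raise ValueError(
            f"expected 3-D input (batch, time, features), got shape {x.shape}"
        )
    rng = np.random.default_rng(seed)
    batch, time, feat = x.shape
    Wq = rng.normal(size=(feat, units))  # projects each timestep as a "query"
    Wk = rng.normal(size=(feat, units))  # projects each timestep as a "key"
    v = rng.normal(size=(units, 1))      # scores each query/key pair

    q = x @ Wq                                        # (batch, time, units)
    k = x @ Wk                                        # (batch, time, units)
    # pairwise additive scores over all timestep pairs (i, j)
    h = np.tanh(q[:, :, None, :] + k[:, None, :, :])  # (batch, time, time, units)
    e = (h @ v).squeeze(-1)                           # (batch, time, time)
    # softmax over the last axis: each timestep attends over the sequence
    a = np.exp(e - e.max(axis=-1, keepdims=True))
    a /= a.sum(axis=-1, keepdims=True)
    return a @ x                                      # (batch, time, features)

out = additive_self_attention(np.ones((2, 5, 16)))
print(out.shape)  # (2, 5, 16)
```

In Keras terms, the fix is usually to pass `return_sequences=True` to the recurrent layer that feeds the attention layer, so it emits the full `(batch, timesteps, units)` sequence rather than only the last hidden state.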

commented

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.