CyberZHG / keras-self-attention

Attention mechanism for processing sequential data that considers the context for each timestamp.

Home Page: https://pypi.org/project/keras-self-attention/

ValueError: Shapes (None, 3) and (None, 50, 3) are incompatible

Keramatfar opened this issue

Hi,
I am getting this error at the last layer of my simple LSTM network when I add self-attention following your examples.
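
For context, a minimal sketch of the kind of model that triggers this error. The 50 timesteps and 3 classes are read off the shapes in the traceback; the vocabulary size, layer widths, and layer choices are illustrative assumptions, not the reporter's actual code:

```python
import numpy as np
from tensorflow import keras
from keras_self_attention import SeqSelfAttention

# Illustrative sizes inferred from the error message:
# 50 timesteps, 3 target classes.
model = keras.models.Sequential([
    keras.layers.Embedding(input_dim=10000, output_dim=64, input_length=50),
    keras.layers.LSTM(64, return_sequences=True),  # 3D output, as SeqSelfAttention expects
    SeqSelfAttention(attention_activation='sigmoid'),
    keras.layers.Dense(3, activation='softmax'),
])
model.compile(loss='categorical_crossentropy', optimizer='adam')

# SeqSelfAttention keeps the time axis, so the model output has shape
# (None, 50, 3) while the labels have shape (None, 3) -> ValueError on fit().
x = np.random.randint(0, 10000, size=(8, 50))
y = keras.utils.to_categorical(np.random.randint(0, 3, size=(8,)), num_classes=3)
model.fit(x, y)  # raises: Shapes (None, 3) and (None, 50, 3) are incompatible
```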

commented

Is this still relevant? If so, what is blocking it? Is there anything you can do to help move it forward?

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.

commented

Hi there, I resolved this issue by adding a Lambda layer that reduces the dimension:
Lambda(lambda x: x[:, -1, :])(attn_layer)
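
In context, the fix could look like the sketch below (the name `attn_layer` follows the comment above; the sizes are the same illustrative ones as in the reproduction sketch). The slice `x[:, -1, :]` keeps only the last timestep, collapsing the attention output from `(None, 50, 64)` to `(None, 64)`, so the final `Dense(3)` produces `(None, 3)` and matches the labels:

```python
from tensorflow import keras
from keras_self_attention import SeqSelfAttention

inputs = keras.layers.Input(shape=(50,))
x = keras.layers.Embedding(input_dim=10000, output_dim=64)(inputs)
x = keras.layers.LSTM(64, return_sequences=True)(x)
attn_layer = SeqSelfAttention(attention_activation='sigmoid')(x)

# Keep only the last timestep: (None, 50, 64) -> (None, 64)
last_step = keras.layers.Lambda(lambda t: t[:, -1, :])(attn_layer)

outputs = keras.layers.Dense(3, activation='softmax')(last_step)
model = keras.Model(inputs, outputs)
model.compile(loss='categorical_crossentropy', optimizer='adam')
model.summary()  # final output shape is now (None, 3), matching the labels
```

A pooling layer such as `keras.layers.GlobalAveragePooling1D()` would also work here; it averages over the time axis instead of taking the last step, which can be preferable when the whole attended sequence matters.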