CyberZHG / keras-self-attention

Attention mechanism for processing sequential data that considers the context for each timestamp.

Home Page: https://pypi.org/project/keras-self-attention/

Scaled Dot Product attention error

AliOsm opened this issue · comments

When applying scaled dot product attention, it gives the following error:

TypeError: Tensor objects are only iterable when eager execution is enabled. To iterate over this tensor use tf.map_fn.

Any idea?

A list of three tensors is needed as input. I've made an update, and a single input is now supported.
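For context, the three tensors are the query, key, and value; the single-input mode simply reuses one tensor for all three (self-attention). A minimal NumPy sketch of scaled dot-product attention, not the library's actual implementation, assuming 3-D inputs of shape (batch, seq_len, features):

```python
import numpy as np

def scaled_dot_product_attention(query, key, value):
    # Scores: similarity of each query position to each key position,
    # scaled by sqrt(d_k) to keep the softmax in a stable range.
    d_k = query.shape[-1]
    scores = query @ key.transpose(0, 2, 1) / np.sqrt(d_k)
    # Softmax over the key axis gives attention weights per query position.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Output: attention-weighted sum of the value vectors.
    return weights @ value

# Self-attention: the single-input case passes one tensor as Q, K, and V.
x = np.random.rand(2, 5, 8)  # (batch, seq_len, features)
out = scaled_dot_product_attention(x, x, x)
print(out.shape)
```

Passing a list `[query, key, value]` corresponds to the general three-tensor case, e.g. attending from a decoder state over encoder outputs.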

It works now, but unfortunately none of ScaledDotProduct, SeqSelfAttention, or SeqWeightedAttention made any improvement on text classification or sequence labeling tasks. Any idea?

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.