CyberZHG / keras-self-attention

Attention mechanism for processing sequential data that considers the context for each timestamp.

Home Page: https://pypi.org/project/keras-self-attention/


__init__() missing 3 required positional arguments: 'node_def', 'op', and 'message'

dingtine opened this issue

commented

When I use the SeqSelfAttention layer, the code raises this error: __init__() missing 3 required positional arguments: 'node_def', 'op', and 'message'. How can I fix this?
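For context, here is a minimal sketch of how SeqSelfAttention is typically wired into a tf.keras Sequential model, following the library's documented usage. The vocabulary size, embedding dimension, LSTM units, and output size below are illustrative placeholders, and this is a usage sketch rather than a confirmed fix for the error reported above:

```python
from tensorflow import keras
from keras_self_attention import SeqSelfAttention

# Illustrative sequence model; all sizes are placeholder values.
model = keras.models.Sequential()
model.add(keras.layers.Embedding(input_dim=10000,
                                 output_dim=128,
                                 mask_zero=True))
model.add(keras.layers.Bidirectional(
    keras.layers.LSTM(units=64, return_sequences=True)))
# SeqSelfAttention expects 3D input (batch, time, features)
# and returns output of the same shape.
model.add(SeqSelfAttention(attention_activation='sigmoid'))
model.add(keras.layers.Dense(units=5, activation='softmax'))
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['categorical_accuracy'])
```

The issue thread does not record which Keras/TensorFlow versions or model code triggered the error, so the snippet only shows the documented call pattern for the layer.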

commented

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.