CyberZHG / keras-self-attention

Attention mechanism for processing sequential data that considers the context for each timestamp.

Home Page: https://pypi.org/project/keras-self-attention/


Issue with tensorflow-gpu

kerighan opened this issue · comments

The layer works fine on the CPU, but I get the following error with tensorflow-gpu:

Blas xGEMV launch failed : a.shape=[1,2000000,4], b.shape=[1,4,1], m=2000000, n=1, k=4
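Blas launch failures like this one often come from the GPU running out of memory when cuBLAS kicks in, which seems plausible here given the very long sequence (a.shape=[1,2000000,4]). A minimal sketch of enabling on-demand GPU memory allocation, assuming TensorFlow 2.x and that memory pressure is in fact the cause:

```python
import tensorflow as tf

# Allocate GPU memory on demand instead of reserving it all up front.
# This sometimes avoids Blas/cuBLAS launch failures caused by memory
# pressure. (Assumption: TensorFlow 2.x; this is a sketch, not a confirmed fix.)
for gpu in tf.config.list_physical_devices('GPU'):
    tf.config.experimental.set_memory_growth(gpu, True)
```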

Hey! I am having the same issue. I tried it on Windows, Linux, and Google Colab, but nothing worked without forcing TensorFlow onto the CPU, which is really slow. How did you solve it?
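For reference, one way to force TensorFlow onto the CPU (the slow workaround mentioned above) is to hide the GPUs before any model or tensor is created. This is only a sketch of the workaround, not a fix for the underlying Blas error:

```python
import tensorflow as tf

# Hide all GPUs so every op runs on the CPU. Must be called before any
# ops or models are built; slow, but it sidesteps the Blas launch failure.
tf.config.set_visible_devices([], 'GPU')
```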