tsterbak / keras_attention

Contains an implementation of the attention mechanism and a Keras text-classifier wrapper.

Error when drawing attention map

xiaotongshi opened this issue · comments

Training fails in `sparse_categorical_crossentropy`:

```
in sparse_categorical_crossentropy
    logits = tf.reshape(output, [-1, int(output_shape[-1])])
TypeError: __int__ returned non-int (type NoneType)
```

If I print the outputs of the attention layer, I get:

```
[<tf.Tensor 'attention_weighted_average_2/Tanh_1:0' shape=(?, 256) dtype=float32>, <tf.Tensor 'attention_weighted_average_2/truediv:0' shape=(?, ?) dtype=float32>]
```

The second output (the attention weights) does not have the expected shape (None, input_length); its last dimension is undefined.
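The undefined last dimension is the likely cause of the traceback: `sparse_categorical_crossentropy` calls `int()` on the tensor's static last dimension, and in TF1 an unknown dimension's `__int__` returns `None`, producing exactly this `TypeError`. A minimal sketch of that mechanism, using a hypothetical `Dimension` stand-in rather than TensorFlow's real class:

```python
class Dimension:
    """Stand-in mimicking TF1's tf.Dimension: for an unknown
    dimension (the '?' in shape=(?, ?)), __int__ returns None."""
    def __init__(self, value):
        self._value = value

    def __int__(self):
        return self._value  # None when the dimension is unknown


unknown = Dimension(None)  # models output_shape[-1] for shape (?, ?)
try:
    int(unknown)           # what int(output_shape[-1]) effectively does
    message = None
except TypeError as err:
    message = str(err)     # "__int__ returned non-int (type NoneType)"

print(message)
```

If this is indeed the cause, a common workaround is to give the model a fixed sequence length (e.g. `Input(shape=(maxlen,))` or `input_length=maxlen` on the Embedding layer) so the attention layer can report a static shape `(None, maxlen)` instead of `(?, ?)`.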