CyberZHG / keras-self-attention

Attention mechanism for processing sequential data that considers the context for each timestamp.

Home Page: https://pypi.org/project/keras-self-attention/
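A minimal sketch of what the package is for, attaching a self-attention layer to a recurrent model (the layer sizes here are illustrative assumptions, not taken from this page):

import os
os.environ['TF_KERAS'] = '1'  # use tf.keras as the backend (see the issue below)
from keras_self_attention import SeqSelfAttention
import tensorflow as tf

model = tf.keras.Sequential()
model.add(tf.keras.layers.Embedding(input_dim=10000, output_dim=64))
model.add(tf.keras.layers.LSTM(64, return_sequences=True))  # attention needs the full sequence
model.add(SeqSelfAttention(attention_activation='sigmoid'))
model.add(tf.keras.layers.GlobalAveragePooling1D())
model.add(tf.keras.layers.Dense(1, activation='sigmoid'))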

Compatibility with Tensorflow 2.0

mohamedScikitLearn opened this issue

I'm trying to create a model using keras-self-attention on Google Colab, and since the default TensorFlow version is now 2.0, this error is raised:

# imports implied by the snippet (likely from an earlier Colab cell)
from keras import models
from keras.layers import Embedding, LSTM, Bidirectional
from keras_self_attention import SeqWeightedAttention

model = models.Sequential()
model.add(Embedding(max_features, 32))  # max_features is defined earlier in the notebook
model.add(Bidirectional(LSTM(32, return_sequences=True)))
# adding an attention layer
model.add(SeqWeightedAttention())
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
/usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py in _get_default_graph()
     65     try:
---> 66         return tf.get_default_graph()
     67     except AttributeError:

AttributeError: module 'tensorflow' has no attribute 'get_default_graph'

During handling of the above exception, another exception occurred:

RuntimeError                              Traceback (most recent call last)
<ipython-input-7-9c4e625938a2> in <module>()
      3 model.add(Bidirectional( LSTM(32, return_sequences=True)))
      4 # adding an attention layer
----> 5 model.add(SeqWeightedAttention())

/usr/local/lib/python3.6/dist-packages/keras_self_attention/seq_weighted_attention.py in __init__(self, use_bias, return_attention, **kwargs)
     10 
     11     def __init__(self, use_bias=True, return_attention=False, **kwargs):
---> 12         super(SeqWeightedAttention, self).__init__(**kwargs)
     13         self.supports_masking = True
     14         self.use_bias = use_bias

/usr/local/lib/python3.6/dist-packages/keras/engine/base_layer.py in __init__(self, **kwargs)
    130         if not name:
    131             prefix = self.__class__.__name__
--> 132             name = _to_snake_case(prefix) + '_' + str(K.get_uid(prefix))
    133         self.name = name
    134 

/usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py in get_uid(prefix)
     84     """
     85     global _GRAPH_UID_DICTS
---> 86     graph = _get_default_graph()
     87     if graph not in _GRAPH_UID_DICTS:
     88         _GRAPH_UID_DICTS[graph] = defaultdict(int)

/usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py in _get_default_graph()
     67     except AttributeError:
     68         raise RuntimeError(
---> 69             'It looks like you are trying to use '
     70             'a version of multi-backend Keras that '
     71             'does not support TensorFlow 2.0. We recommend '

**RuntimeError: It looks like you are trying to use a version of multi-backend Keras that does not support TensorFlow 2.0. We recommend using `tf.keras`, or alternatively, downgrading to TensorFlow 1.14.**
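The immediate cause is that TF 2.x removed module-level graph functions such as tf.get_default_graph (they live on only under tf.compat.v1), so multi-backend Keras fails as soon as it needs a graph. A quick check, assuming TF 2.x:

import tensorflow as tf

print(tf.__version__)                    # 2.x
print(hasattr(tf, 'get_default_graph'))  # False: removed in TF 2.x
tf.compat.v1.get_default_graph()         # the TF1 API survives under compat.v1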

Is this still relevant? If so, what is blocking it? Is there anything you can do to help move it forward?

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.

It does not work with TensorFlow 2.2.

@grkhr @Juhong-Namgung

The code has always been tested against the latest version of TensorFlow:
https://travis-ci.org/github/CyberZHG/keras-self-attention/jobs/707084673#L2342

What was your error log?

I found that it was my fault: I used the attention layer without tensorflow.keras.layers.Layer().
Thank you very much @CyberZHG.

@Juhong-Namgung can you show your code example here? (How did you add the attention layer to your TF2 model?)

@CyberZHG
I tried this and this error appeared:

import tensorflow as tf
from tensorflow.keras.layers import *
from keras_self_attention import SeqSelfAttention

model_q2 = tf.keras.Sequential()
model_q2.add(Embedding(input_dim=100000,
                       output_dim=200,
                       trainable=True,
                       input_length=200))
model_q2.add(LSTM(128, return_sequences=True))
model_q2.add(SeqSelfAttention(attention_activation='sigmoid'))
model_q2.add(Dense(60, activation='tanh'))
model_q2.add(Dense(2, activation='sigmoid'))

**Error:**

TypeError: The added layer must be an instance of class Layer. Found: <keras_self_attention.seq_self_attention.SeqSelfAttention object at 0x7f7590ecfd30>

@mohamedScikitLearn

One way to solve this is to upgrade Keras to at least 2.4; the other is to set the environment variable before importing this package:

import os
os.environ['TF_KERAS'] = '1'  # must be set before the import below
from keras_self_attention import SeqSelfAttention
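If it helps, a quick sketch to verify the switch took effect (assuming TF 2.x, and that TF_KERAS makes the package build its layers on tf.keras):

import os
os.environ['TF_KERAS'] = '1'
from keras_self_attention import SeqSelfAttention
import tensorflow as tf

# With TF_KERAS set, the layer class should subclass tf.keras's Layer,
# which is exactly what Sequential.add checks for:
print(issubclass(SeqSelfAttention, tf.keras.layers.Layer))  # expect True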

@mohamedScikitLearn
My working code is as follows:

import os
os.environ['TF_KERAS'] = '1'  # set before importing keras_self_attention

import tensorflow as tf
from tensorflow.keras.layers import *
from keras_self_attention import SeqSelfAttention

# max_vocab_len, emb_dim, max_len, and W_reg are defined elsewhere
model = tf.keras.Sequential()
model.add(Embedding(input_dim=max_vocab_len, output_dim=emb_dim, input_length=max_len, embeddings_regularizer=W_reg))
model.add(Dropout(0.2))
model.add(LSTM(units=128, return_sequences=True))
model.add(Dropout(0.5))
model.add(SeqSelfAttention(attention_activation='relu'))
model.add(Flatten())
model.add(Dense(9472, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(21, activation='softmax'))

I think the layer needs to be an instance of tensorflow.keras.layers.Layer, which is what setting TF_KERAS achieves.
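For completeness, a minimal compile step for the model above (the optimizer and loss are illustrative assumptions, not from the thread):

model.compile(optimizer='adam',
              loss='categorical_crossentropy',  # matches the 21-way softmax output
              metrics=['accuracy'])
model.summary()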