CyberZHG / keras-self-attention

Attention mechanism for processing sequential data that considers the context for each timestamp.

Home Page: https://pypi.org/project/keras-self-attention/


AttributeError: module 'tensorflow' has no attribute 'get_default_graph' while using 'SeqSelfAttention'

octolis opened this issue

Hey CyberZHG,
thank you for your cool packages. I've been using keras-self-attention, and at first it worked fine, but the other day the error "AttributeError: module 'tensorflow' has no attribute 'get_default_graph'" started appearing every time I try to use SeqSelfAttention. Without your code, the error disappears.
I couldn't figure out what the problem was. I tried upgrading/downgrading and reinstalling TensorFlow and Keras (following posts from StackOverflow), but it didn't help.
So maybe you can explain to me what's wrong? The problem seems to be connected with keras-self-attention somehow. I'm new to neural networks and to programming in general, so if this question is stupid, I hope you'll be patient enough to answer it in detail (several days of googling did not help). Thank you in advance!
Here is my code:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout, Flatten, Activation
from tensorflow.keras.layers import LSTM
from tensorflow.keras.layers import GRU
from keras_self_attention import SeqSelfAttention

max_features = 4 #number of words in the dictionary
num_classes = 2
model = Sequential()
model.add(GRU(128, input_shape=(70, max_features), return_sequences=True, activation='tanh'))
model.add(SeqSelfAttention(attention_activation='sigmoid')) 
model.add(Flatten())
model.add(Dense(num_classes, activation='sigmoid'))

model.compile(loss='binary_crossentropy',
              optimizer='rmsprop',
              metrics=['accuracy'])
model.summary()

Here is my error:

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-15-1807f7e55fc9> in <module>
     11 model.add(GRU(128, input_shape=(70, max_features), return_sequences=True, activation='tanh'))
     12 # model.add(LSTM(128, input_shape=(70, max_features), return_sequences=True)) #return_sequences: output for att.layer
---> 13 model.add(SeqSelfAttention(attention_activation='sigmoid'))
     14 # model.add(Dropout(0.5))
     15 model.add(Flatten())

~/anaconda3/lib/python3.7/site-packages/keras_self_attention/seq_self_attention.py in __init__(self, units, attention_width, attention_type, return_attention, history_only, kernel_initializer, bias_initializer, kernel_regularizer, bias_regularizer, kernel_constraint, bias_constraint, use_additive_bias, use_attention_bias, attention_activation, attention_regularizer_weight, **kwargs)
     47         :param kwargs: Parameters for parent class.
     48         """
---> 49         super(SeqSelfAttention, self).__init__(**kwargs)
     50         self.supports_masking = True
     51         self.units = units

~/anaconda3/lib/python3.7/site-packages/keras/engine/base_layer.py in __init__(self, **kwargs)
    130         if not name:
    131             prefix = self.__class__.__name__
--> 132             name = _to_snake_case(prefix) + '_' + str(K.get_uid(prefix))
    133         self.name = name
    134 

~/anaconda3/lib/python3.7/site-packages/keras/backend/tensorflow_backend.py in get_uid(prefix)
     72     """
     73     global _GRAPH_UID_DICTS
---> 74     graph = tf.get_default_graph()
     75     if graph not in _GRAPH_UID_DICTS:
     76         _GRAPH_UID_DICTS[graph] = defaultdict(int)

AttributeError: module 'tensorflow' has no attribute 'get_default_graph'
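The last two frames show standalone multi-backend Keras (the keras/backend/tensorflow_backend.py frame) calling tf.get_default_graph(), which was removed from the top-level tensorflow namespace in TensorFlow 2.x (it now lives under tf.compat.v1). A quick way to confirm the mismatch in your own environment is sketched below; this snippet is illustrative and not part of the original report:

# Hypothetical diagnostic: check whether the installed TensorFlow still
# exposes the TF1-style default-graph API that standalone Keras relies on.
import tensorflow as tf
import keras  # standalone multi-backend Keras, which the traceback shows keras_self_attention importing

print('tensorflow:', tf.__version__)
print('keras:', keras.__version__)
print('has tf.get_default_graph:', hasattr(tf, 'get_default_graph'))  # False on TF 2.x
print('has tf.compat.v1.get_default_graph:', hasattr(tf.compat.v1, 'get_default_graph'))  # True on TF 2.x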

Is this still relevant? If so, what is blocking it? Is there anything you can do to help move it forward?

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.

@octolis @CyberZHG I got the exact same error.

Some more details:

RuntimeError: It looks like you are trying to use a version of multi-backend Keras that does not support TensorFlow 2.0. We recommend using `tf.keras`, or alternatively, downgrading to TensorFlow 1.14.

In other words, the installed keras-self-attention goes through multi-backend Keras, which is written for TensorFlow 1.x.
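
A workaround often suggested for CyberZHG's packages is to set the TF_KERAS environment variable before importing, so the layer is built on tf.keras rather than on standalone Keras. The sketch below assumes the installed keras-self-attention honours the TF_KERAS switch used across this family of packages; check the version you have installed before relying on it:

# Sketch of the commonly suggested workaround: tell keras_self_attention to
# build on tf.keras by setting TF_KERAS *before* the import. This assumes the
# installed version of the package supports the TF_KERAS switch.
import os
os.environ['TF_KERAS'] = '1'

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Flatten, GRU
from keras_self_attention import SeqSelfAttention

max_features = 4  # number of words in the dictionary
num_classes = 2

model = Sequential()
model.add(GRU(128, input_shape=(70, max_features), return_sequences=True, activation='tanh'))
model.add(SeqSelfAttention(attention_activation='sigmoid'))
model.add(Flatten())
model.add(Dense(num_classes, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
model.summary()

The alternative, matching the RuntimeError above, is to stay on standalone Keras throughout (import from keras.* instead of tensorflow.keras.*) and downgrade to TensorFlow 1.14.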

@ulf1 thank you!!!