fchollet / deep-learning-with-python-notebooks

Jupyter notebooks for the code samples of the book "Deep Learning with Python"

Chapter 12 Text generation: how do I save and load models?

srivassid opened this issue

I am trying to save the model used in Chapter 12 (text generation), but cannot do so. How would I be able to save it?

And load it back?

Thanks

You can save a Keras model after training by calling its save() method:
model.save('model_name.h5')

Alternatively, you could save just the model's weights:
model.save_weights('model_weights_name.h5')

The above would save the model to your current directory.

To load the model:
model = keras.models.load_model('model_name.h5')
To load the weights (after rebuilding the same architecture):
model.load_weights('model_weights_name.h5')
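
Putting that together, here is a minimal round-trip sketch. The toy model and file names are just placeholders, not the chapter-12 model:

from tensorflow import keras

# Build a small stand-in model.
model = keras.Sequential([
    keras.Input(shape=(8,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(optimizer="rmsprop", loss="mse")

# Save the whole model (architecture + weights + optimizer state) ...
model.save("model_name.h5")
# ... or only the weights.
model.save_weights("model_weights_name.h5")

# Reload the whole model ...
restored = keras.models.load_model("model_name.h5")
# ... or rebuild the same architecture yourself and load just the weights.
model.load_weights("model_weights_name.h5")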

There are, of course, other ways to save your model, for instance by using a callback during training such as ModelCheckpoint, as sketched below.
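
For example, a sketch of saving the best model automatically during training, assuming model and the training/validation arrays are already defined as in your training code; the file name and monitored metric are illustrative:

from tensorflow import keras

callbacks = [
    keras.callbacks.ModelCheckpoint(
        filepath="checkpoint_model.h5",
        monitor="val_loss",
        save_best_only=True,  # keep only the best epoch seen so far on disk
    )
]
model.fit(train_inputs, train_targets,
          validation_data=(val_inputs, val_targets),
          epochs=10,
          callbacks=callbacks)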

I hope this is helpful.

I tried that. I was able to save the model, but while loading it I got the error below. I think the PositionalEmbedding layer has not been taken into account:

ValueError: Unknown layer: 'PositionalEmbedding'. Please ensure you are using a keras.utils.custom_object_scope and that this object is included in the scope. See https://www.tensorflow.org/guide/keras/save_and_serialize#registering_the_custom_object for details.

If you are using a custom layer, then you need to specify it when calling load_model():

model = load_model('model_name.h5', custom_objects={'PositionalEmbedding': PositionalEmbedding})
or

with custom_object_scope({'PositionalEmbedding': PositionalEmbedding}):
    model = load_model('model_name.h5')

This assumes that you have defined a custom layer in your model.
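
For reference, the custom layer also needs a get_config() method so that load_model() can rebuild it with the right constructor arguments. Below is an abbreviated sketch of the book's PositionalEmbedding (mask handling and other details omitted):

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

class PositionalEmbedding(layers.Layer):
    def __init__(self, sequence_length, input_dim, output_dim, **kwargs):
        super().__init__(**kwargs)
        # One embedding for token ids, one for positions.
        self.token_embeddings = layers.Embedding(input_dim, output_dim)
        self.position_embeddings = layers.Embedding(sequence_length, output_dim)
        self.sequence_length = sequence_length
        self.input_dim = input_dim
        self.output_dim = output_dim

    def call(self, inputs):
        positions = tf.range(start=0, limit=tf.shape(inputs)[-1], delta=1)
        return self.token_embeddings(inputs) + self.position_embeddings(positions)

    def get_config(self):
        # Returning the constructor arguments is what lets
        # load_model(..., custom_objects=...) re-instantiate the layer.
        config = super().get_config()
        config.update({
            "sequence_length": self.sequence_length,
            "input_dim": self.input_dim,
            "output_dim": self.output_dim,
        })
        return config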

I loaded the model. Now I get this error:

Call arguments received by layer 'string_lookup' (type StringLookup):
  • inputs=<tf.RaggedTensor [[b'this', b'movie']]>

Hello,

I have the same issue.

The Chapter 11 Transformer translation from English to Spanish has the same problem.

class PositionalEmbedding(keras.layers.Layer):
    ...

class TransformerEncoder(layers.Layer):
    ...

class TransformerDecoder(layers.Layer):
    ...

...

transformer_model.save('model/eng_spa_transformer')

new_model = keras.models.load_model(
    'model/eng_spa_transformer',
    custom_objects={'TransformerDecoder': TransformerDecoder,
                    'TransformerEncoder': TransformerEncoder,
                    'PositionalEmbedding': PositionalEmbedding})

Loading new_model is OK, but...

predicts = new_model.predict([eng_seq_pad, spa_seq_pad])

new_model can't predict well.

The problem is that the eng_spa_transformer/assets directory is empty...
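
One possible cause, stated as an assumption rather than a confirmed diagnosis: in the chapter-11 notebook the TextVectorization layers (source_vectorization and target_vectorization) are not part of transformer_model, so nothing gets written to assets/ and their vocabularies have to be saved and restored separately. A rough sketch, with illustrative paths and constructor arguments that would need to match the training-time setup:

import json
from tensorflow.keras import layers

# Save the learned vocabularies alongside the model.
with open("model/source_vocab.json", "w") as f:
    json.dump(source_vectorization.get_vocabulary(), f)
with open("model/target_vocab.json", "w") as f:
    json.dump(target_vectorization.get_vocabulary(), f)

# Later: rebuild the vectorizer with the same settings plus the saved vocabulary.
with open("model/source_vocab.json") as f:
    source_vocab = json.load(f)
source_vectorization = layers.TextVectorization(
    max_tokens=15000,
    output_mode="int",
    output_sequence_length=20,
    vocabulary=source_vocab,
)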

Any help?