Hvass-Labs / TensorFlow-Tutorials

TensorFlow Tutorials with YouTube Videos


Convert Fine-Tuned Pre-Trained Keras Model to TF Estimator and Use on AWS SageMaker

agyemanha opened this issue · comments

Thank you so much Hvass-Labs for your contributions. I am your biggest fan, and your work means a lot to me.
Please could you make a tutorial on fine-tuning a pre-trained Keras model, converting it to a TF Estimator, and then using it on AWS SageMaker?
I believe this suggestion will be of great help to the community.

I have researched and attempted fine-tuning a pre-trained Keras model on AWS SageMaker. However, I am stuck on using the custom Estimator (which has frozen weights) on AWS SageMaker.

My goal: instead of reinventing the wheel, I intended to:

  1. Fine-tune a pre-trained Keras model such as VGG16 by:
  2. Freezing the base layers, which have already learned general features.
  3. Adding/customizing the last layers for the specific problem to be solved. These customized last layers will be retrained on TFRecord data.
  4. Compiling the model with the necessary optimizer, loss, and metrics.
  5. Converting the custom model into a TF Estimator, because:
     - Estimator-based models can run on a local host or in a distributed multi-server environment.
     - They can run on CPUs, GPUs, or TPUs.
     - Estimator models can be shared with other developers.
     - They can be used on AWS SageMaker (this is where I am stuck).

In the code below, I create my custom TF Estimator from the Keras pre-trained model VGG16. I freeze all layers except the last 4, customize the last layers to suit my intended problem, compile with the necessary optimizer, loss, and metrics, and finally convert the model to a TF Estimator.

Imported libraries:

```python
from tensorflow.python.keras.applications.vgg16 import VGG16
from tensorflow.python.keras import models
from tensorflow.python.keras import layers
from tensorflow.python.keras import optimizers
from tensorflow.python.keras.preprocessing import image
from tensorflow.python import keras
import tensorflow as tf
import numpy as np
import sys
from PIL import Image
import os
import shutil

print(tf.__version__)
```


Loading the VGG16 model:

```python
# Input shape for the model.
img_size = (150, 150, 3)
conv_base = VGG16(weights='imagenet', include_top=False, input_shape=img_size)

# Freeze all layers except the last 4.
for layer in conv_base.layers[:-4]:
    layer.trainable = False

# Check the trainable status of the individual layers.
for layer in conv_base.layers:
    print(layer, layer.trainable)
```

Creating the model by adding the conv_base and the custom last layers:

```python
model = models.Sequential()

# Add the pre-trained convolutional base.
model.add(conv_base)

# Add the custom classification layers.
model.add(layers.Flatten())
model.add(layers.Dense(1024, activation='relu'))
model.add(layers.Dropout(0.5))
model.add(layers.Dense(5, activation='softmax'))

# Show a summary of the new model and check the trainable parameters.
model.summary()
```

Compile with the optimizer, loss, and metrics to train with:

```python
model.compile(optimizer=optimizers.Adam(lr=0.0001),
              loss='categorical_crossentropy',
              metrics=['accuracy'])
```

Converting this custom model to a TF Estimator for use on AWS SageMaker. The Estimator (est_model) is saved to a directory called model_dir:

```python
model_dir = os.path.join(os.getcwd(), "models", "catvsdog1")
os.makedirs(model_dir, exist_ok=True)
print("model_dir:", model_dir)
est_model = tf.keras.estimator.model_to_estimator(keras_model=model,
                                                  model_dir=model_dir)
```
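For context on where I am stuck: after the conversion, the Estimator has to be driven by an `input_fn` that feeds it a `tf.data` pipeline whose feature-dict key matches the Keras model's input layer name. The sketch below is only my assumption of how that would look — the layer name `'vgg16_input'` (check `model.input_names` on the real model), the shapes, and the dummy arrays are all placeholders, not working SageMaker code.

```python
import numpy as np
import tensorflow as tf

def train_input_fn(images, labels, batch_size=32):
    """Build a tf.data pipeline for Estimator.train().

    The feature-dict key ('vgg16_input' here, an assumption) must match
    the Keras model's input layer name reported by model.input_names.
    """
    dataset = tf.data.Dataset.from_tensor_slices(
        ({'vgg16_input': images}, labels))
    # Shuffle, repeat indefinitely, and batch for training.
    dataset = dataset.shuffle(buffer_size=1000).repeat().batch(batch_size)
    return dataset

# Dummy data just to show the expected shapes:
# 150x150 RGB images, one-hot labels for 5 classes.
images = np.zeros((8, 150, 150, 3), dtype=np.float32)
labels = np.zeros((8, 5), dtype=np.float32)
dataset = train_input_fn(images, labels, batch_size=4)

# The converted Estimator would then (presumably) be trained like:
# est_model.train(input_fn=lambda: train_input_fn(images, labels),
#                 steps=1000)
```

Whether SageMaker accepts the Estimator in this form, and how the exported SavedModel has to be packaged for it, is exactly the part I cannot figure out.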

Your assistance will be hugely appreciated.

Thanks for the compliment, I'm glad you like my work!

Unfortunately I can't help you with this. The best thing I can suggest is that you ask on StackOverflow. And you should use the Markdown feature for writing code in your post so it shows correctly.