Option to include or exclude top layer
ashavish opened this issue · comments
It would be a nice feature to have an `include_top=True/False` option, similar to the InceptionV3 implementation in Keras.
Hi, I would also love this feature. For example, in InceptionV3 I do:
```python
InceptionV3_notop = InceptionV3(include_top=False, weights='imagenet',
                                input_tensor=None, input_shape=(299, 299, 3))
output = InceptionV3_notop.get_layer(index=-1).output  # Shape: (8, 8, 2048)
output = AveragePooling2D((8, 8), strides=(8, 8), name='avg_pool')(output)
output = Flatten(name='flatten')(output)
output = Dense(8, activation='softmax', name='predictions')(output)
InceptionV3_model = Model(InceptionV3_notop.input, output)
```
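For reference, the shapes in the snippet above check out. Here is a quick sanity check of that arithmetic, assuming Keras's default `'valid'` pooling (`avg_pool_shape` is just an illustrative helper, not a Keras function):

```python
def avg_pool_shape(h, w, c, pool, stride):
    """Output shape of a 2D average pool with 'valid' padding:
    floor((dim - pool) / stride) + 1 along each spatial axis."""
    return ((h - pool) // stride + 1, (w - pool) // stride + 1, c)

# The (8, 8, 2048) InceptionV3 feature map, pooled with an 8x8 window
# at stride 8, collapses to a single 2048-channel vector.
pooled = avg_pool_shape(8, 8, 2048, pool=8, stride=8)
print(pooled)                              # (1, 1, 2048)
print(pooled[0] * pooled[1] * pooled[2])   # 2048 features after Flatten
```

So the `Flatten` feeds 2048 features into the new 8-way `Dense` head.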
I would like to use InceptionV4 to predict 8 classes on new data. Is there any way around it at the moment? I'm new to all of this.
Can it be achieved simply by doing:
```python
InceptionV4 = InceptionV4.create_model(num_classes=8, weights='inception-v4_weights_tf_dim_ordering_tf_kernels.h5')
```
I'll be adding this sometime tomorrow, however, there is a fairly straightforward way to do it:
- Simply change the number of features in the final dense layer of the model definition (in your case, to 8) and give that layer a new name, so the pretrained weights are not loaded into it.

In summary: if you can't wait a day or so, just change the last layer of the model to the following (rename it to whatever you like) and you should be good to go:
```python
predictions = Dense(output_dim=num_classes, activation='softmax', name="newDense")(net)
```
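The reason renaming works: weights are matched to layers by name, so a layer with a fresh name simply receives no pretrained weights and keeps its random initialization. A minimal pure-Python sketch of that matching logic (the function and layer names here are illustrative, not actual Keras internals):

```python
def match_weights_by_name(model_layers, pretrained_layers):
    """Simulate by-name weight loading: a layer receives pretrained
    weights only if its name appears in the saved checkpoint."""
    loaded = [n for n in model_layers if n in pretrained_layers]
    skipped = [n for n in model_layers if n not in pretrained_layers]
    return loaded, skipped

# The checkpoint was saved with the original 'predictions' head;
# our model renamed its final layer to 'newDense'.
saved = {'conv1', 'conv2', 'predictions'}
model = ['conv1', 'conv2', 'newDense']
loaded, skipped = match_weights_by_name(model, saved)
print(loaded)   # ['conv1', 'conv2']
print(skipped)  # ['newDense'] -> stays randomly initialized, ready for 8 classes
```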
Then, when you load the model simply specify your new number of classes.
```python
# Create model and load pre-trained weights
model = inception_v4.create_model(num_classes=8, weights='imagenet')
```
Should be good to go! Looks correct to me. Again, I'll be adding the include_top functionality tomorrow.
Cheers @kentsommer! I will probably pull your update sometime tomorrow then.
Sorry about the delay @GlastonburyC, this change will be included in the Keras 2.0 release.
Closing this out, please track release in #6