Saved weights dimension mismatch
fparkins opened this issue
ValueError: Layer #1 (named "conv2d_1"), weight <tf.Variable 'conv2d_1/kernel:0' shape=(3, 3, 32, 32) dtype=float32, numpy=
array[...] has shape (3, 3, 32, 32), but the saved weight has shape (32, 3, 3, 3)
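The shapes in that traceback suggest a kernel-ordering mismatch: the saved weight `(32, 3, 3, 3)` looks like Theano-style ordering `(out_channels, in_channels, rows, cols)`, while a TF-ordering model expects `(rows, cols, in_channels, out_channels)`. A minimal sketch of re-ordering one such kernel with NumPy (the zero-filled array just stands in for a real weight loaded from the h5 file; a full Theano-to-TF conversion may also require flipping the spatial axes, cf. Keras's `convert_kernel`):

```python
import numpy as np

# Stand-in for a conv kernel saved in Theano ordering:
# (out_channels, in_channels, rows, cols)
theano_kernel = np.zeros((32, 3, 3, 3), dtype=np.float32)

# Reorder axes to TensorFlow's layout:
# (rows, cols, in_channels, out_channels)
tf_kernel = np.transpose(theano_kernel, (2, 3, 1, 0))

print(tf_kernel.shape)  # (3, 3, 3, 32)
```

This only illustrates the axis reordering; it does not by itself explain why the model reports an expected shape of `(3, 3, 32, 32)` for layer #1, which may additionally point to a layer-indexing difference between Keras versions.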
@GeorgianaLoba's solution for @Arminkhayati only partially works. The 2.0 weights work for the full model, but the headless (no-top) file returns a 404 error. The 2.1 weights have a dimension mismatch between the saved weights and the model for both include_top=True and include_top=False.
Workaround: in inception_v4.py, replace lines 39 and 40 with the 2.0 release URLs:
WEIGHTS_PATH = 'https://github.com/kentsommer/keras-inceptionV4/releases/download/2.0/inception-v4_weights_tf_dim_ordering_tf_kernels.h5'
WEIGHTS_PATH_NO_TOP = 'https://github.com/kentsommer/keras-inceptionV4/releases/download/2.0/inception-v4_weights_tf_dim_ordering_tf_kernels_notop.h5'
Closing this out after a long hiatus from GitHub. There have been a lot of updates across many of the frameworks this project uses since it was published. Unfortunately I did not, and do not, have the time to update it, which is most likely why these issues are cropping up.
I will happily merge in any pull requests if folks have a desire to keep this working. Apologies for such a late response.