Why do you change d.trainable twice after d.compile during train?
haozhi1817 opened this issue
It seems a model's trainable flag can only be changed before model.compile; once the model is compiled, later changes to trainable have no effect. So why do you toggle model.trainable twice during training:
X = np.concatenate((image_batch, generated_images))
y = [1] * BATCH_SIZE + [0] * BATCH_SIZE
d_loss = d.train_on_batch(X, y)
print("batch %d d_loss : %f" % (index, d_loss))
noise = np.random.uniform(-1, 1, (BATCH_SIZE, 100))
d.trainable = False #'here'
g_loss = d_on_g.train_on_batch(noise, [1] * BATCH_SIZE)
d.trainable = True #'here'
print("batch %d g_loss : %f" % (index, g_loss))
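This is actually the standard GAN pattern: trainable is captured when a model is compiled, so d (compiled while trainable) keeps updating under d.train_on_batch, while d_on_g (compiled after d.trainable = False) leaves d's weights alone. A minimal sketch with tf.keras, using tiny stand-in models (the Dense layers and sizes are hypothetical, not the repo's actual generator/discriminator):

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

# Tiny stand-ins for the repo's generator g and discriminator d
g = keras.Sequential([keras.layers.Dense(4, input_shape=(2,))])
d = keras.Sequential([keras.layers.Dense(1, activation="sigmoid", input_shape=(4,))])

# d is compiled while trainable, so d.train_on_batch updates its weights
d.compile(optimizer="sgd", loss="binary_crossentropy")

# Freeze d BEFORE compiling the combined model: the flag is read at compile time
d.trainable = False
d_on_g = keras.Sequential([g, d])
d_on_g.compile(optimizer="sgd", loss="binary_crossentropy")

noise = np.random.uniform(-1, 1, (8, 2))
w_before = [w.copy() for w in d.get_weights()]
d_on_g.train_on_batch(noise, np.ones((8, 1)))  # trains g only
w_after = d.get_weights()

# d's weights are untouched by the combined update
print(all(np.allclose(a, b) for a, b in zip(w_before, w_after)))  # True
```

Under this reading, the two d.trainable toggles inside the training loop are indeed redundant: neither compiled model re-reads the flag.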
I ran into the same question: those lines toggling trainable look unnecessary, because it seems you must recompile after setting trainable to True or False for the layer freezing to actually take effect.
example:
https://keras.io/applications/
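The fine-tuning recipe on that page follows the same rule: set the flag first, then compile, and the freeze is honored. A minimal sketch (the two-layer model here is hypothetical, just to show the ordering):

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

model = keras.Sequential([
    keras.layers.Dense(4, input_shape=(2,)),
    keras.layers.Dense(1),
])
model.layers[0].trainable = False            # freeze first layer...
model.compile(optimizer="sgd", loss="mse")   # ...THEN compile

x, y = np.ones((4, 2)), np.ones((4, 1))
frozen_before = model.layers[0].get_weights()[0].copy()
model.train_on_batch(x, y)
frozen_after = model.layers[0].get_weights()[0]

# The frozen layer's kernel did not move
print(np.allclose(frozen_before, frozen_after))  # True
```

If you flipped trainable after compile instead, you would need another compile call before the change took effect.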