Using a different checkpoint
aditya1709 opened this issue
aditya1709 commented
I have trained the model for a while and want to resume training from that checkpoint. I set LOAD_PRETRAINED_MODEL = False and called saver.restore(sess, <appropriate checkpoint path>) in train.py, but training starts from a high loss, not from where the checkpoint was saved.
Is the only way to do this to convert the ckpt file to a .pkl file and set LOAD_PRETRAINED_MODEL = True?
aditya1709 commented
It was a super silly mistake on my part: you just have to make sure you don't initialise the variables after restoring the checkpoint, since re-initialising overwrites the restored weights. Also make sure you understand what 'ckpt.model_checkpoint_path' returns: it is the path prefix of the latest checkpoint file, not the checkpoint directory itself.
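For anyone hitting the same thing, here is a minimal sketch of the ordering that matters. It uses the TF 1.x graph/session API (through `tf.compat.v1` so it also runs under TF 2); the variable, checkpoint directory, and values are illustrative stand-ins, not code from this repo:

```python
import os
import tempfile

import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

ckpt_dir = tempfile.mkdtemp()  # illustrative checkpoint directory

w = tf.get_variable("w", initializer=tf.constant([1.0, 2.0]))
saver = tf.train.Saver()

# First run: initialize, "train" (here: just assign new values), save.
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(w.assign([5.0, 6.0]))  # stand-in for training updates
    saver.save(sess, os.path.join(ckpt_dir, "model"), global_step=100)

# Resume: restore FIRST, and do NOT run the initializer afterwards.
# Running tf.global_variables_initializer() after restore would overwrite
# the restored weights, which is why the loss appeared to start high.
with tf.Session() as sess:
    ckpt = tf.train.get_checkpoint_state(ckpt_dir)
    # model_checkpoint_path is the path *prefix* of the latest checkpoint,
    # e.g. ".../model-100", not the directory itself.
    saver.restore(sess, ckpt.model_checkpoint_path)
    restored = sess.run(w)
    print(restored)  # the saved values, not the initializer's zeros
```

If no checkpoint exists yet, fall back to running the initializer instead of restoring; the key point is that it's one or the other, never initialize-after-restore.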