torch / nngraph

Graph Computation for nn

Copying weights from one model to another

arthitag opened this issue

Hi,

I have a trained nngraph model (model0), which has quite an elaborate structure and no annotations/names for its layers.
I created another, very similar nngraph model from a file (model1), which has a few extra nn.Identity() and nn.CAddTable() layers.
I want to copy the weights of model0 into model1. What is the best way to do this?

I tried wt0, gp0 = model0:parameters() and copied the tensors sequentially into model1 with :copy(). The weights do get copied this way, but how can I ensure they follow the correct ordering?
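Something like this minimal sketch (assuming model0 and model1 are already-loaded nn.gModule instances):

```lua
-- Minimal sketch of the sequential copy described above; assumes model0 and
-- model1 are loaded nn.gModule instances whose parameterized layers appear
-- in the same order (the extra nn.Identity()/nn.CAddTable() nodes carry no
-- parameters, so they do not show up in :parameters()).
local wt0, gp0 = model0:parameters()
local wt1, gp1 = model1:parameters()
assert(#wt0 == #wt1, 'parameter lists differ in length')

for i = 1, #wt0 do
  wt1[i]:copy(wt0[i])  -- correctness depends entirely on matching order
end
```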

Thank you for your replies.

Did you find a solution?

Yes, you can get a list of all parameterized nodes from first_model.forwardnodes:
[node.data.module.weight, node.data.module.bias]
This gives you the weights and biases. If you have added extra layers in your second model, you can annotate them and skip initializing those layers from the first model; the rest get initialized in the proper order.
Also, if your model contains batch-normalization layers, you need to copy their running_mean and running_var tensors in addition to the weights and biases: [node.data.module.running_mean, node.data.module.running_var] (see the sketch below).
In that case, check this out:
torch/nn#1256
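A rough sketch of the forwardnodes approach described above; the annotation name 'extra' is just an assumed label for the added layers, so adapt it to however model1 was annotated:

```lua
-- Rough sketch of the forwardnodes walk described above. Assumes the
-- parameterized nodes of model0 and model1 appear in the same order, and
-- that the extra layers in model1 were annotated with the (hypothetical)
-- name 'extra' so they can be skipped.
local function parameterizedModules(model, skipName)
  local mods = {}
  for _, node in ipairs(model.forwardnodes) do
    local m = node.data.module
    local name = node.data.annotations and node.data.annotations.name
    -- keep only nodes that actually carry parameters or running statistics
    if m and (m.weight or m.running_mean) and name ~= skipName then
      table.insert(mods, m)
    end
  end
  return mods
end

local src = parameterizedModules(model0)
local dst = parameterizedModules(model1, 'extra')
assert(#src == #dst, 'parameterized node lists do not line up')

for i = 1, #src do
  if dst[i].weight then dst[i].weight:copy(src[i].weight) end
  if dst[i].bias then dst[i].bias:copy(src[i].bias) end
  -- batch-normalization layers also carry running statistics
  if dst[i].running_mean then dst[i].running_mean:copy(src[i].running_mean) end
  if dst[i].running_var then dst[i].running_var:copy(src[i].running_var) end
end
```

If the two graphs were not built in the same order, it is safer to match nodes by annotation name rather than by position.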