torch / nngraph

Graph Computation for nn

Any neat way to debug?

blackyang opened this issue · comments

Is there any neat way to debug nngraph? Thanks!

Currently I write a Debug layer, which is almost the same as the Identity layer except that I can print whatever I want inside updateOutput() and updateGradInput().
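A minimal sketch of such a layer (the class name `nn.Debug` and the optional `label` argument are assumptions, not part of any library):

```lua
require 'nn'

-- Hypothetical Debug layer: behaves like nn.Identity but prints tensor
-- sizes as data flows through it in both directions.
local Debug, parent = torch.class('nn.Debug', 'nn.Identity')

function Debug:__init(label)
   parent.__init(self)
   self.label = label or 'debug'
end

function Debug:updateOutput(input)
   print(self.label .. ' forward size:', input:size())
   self.output = input
   return self.output
end

function Debug:updateGradInput(input, gradOutput)
   print(self.label .. ' backward size:', gradOutput:size())
   self.gradInput = gradOutput
   return self.gradInput
end
```

Dropping it between two nodes in the graph, e.g. `nn.Debug('after-L1')(L1)`, leaves the computation unchanged while printing sizes on every forward and backward pass.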

It seems there is a similar implementation in dpnn, the PrintSize layer :)

You can also add more outputs. Like:

input = nn.Identity()()
L1 = nn.Tanh()(nn.Linear(10, 20)(input))
L2 = nn.Tanh()(nn.Linear(30, 60)(nn.JoinTable(1)({input, L1})))
L3 = nn.Tanh()(nn.Linear(80, 160)(nn.JoinTable(1)({L1, L2})))

g = nn.gModule({input}, {L1,L2,L3})

@JoostvDoorn Thanks! But in that way we need to provide some zero tensors as gradOutput for the extra outputs during backprop.
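Concretely, a sketch of what that backward call looks like for the three-output graph above (assuming only L3's gradient actually matters and L1/L2 were added just for inspection):

```lua
require 'nn'
require 'nngraph'

local input = nn.Identity()()
local L1 = nn.Tanh()(nn.Linear(10, 20)(input))
local L2 = nn.Tanh()(nn.Linear(30, 60)(nn.JoinTable(1)({input, L1})))
local L3 = nn.Tanh()(nn.Linear(80, 160)(nn.JoinTable(1)({L1, L2})))
local g = nn.gModule({input}, {L1, L2, L3})

local x = torch.randn(10)
local out = g:forward(x)  -- returns {L1, L2, L3} outputs

-- One gradOutput per graph output: zeros for the debug-only outputs,
-- a real gradient for the output we actually train on.
g:backward(x, {torch.zeros(out[1]:size()),
               torch.zeros(out[2]:size()),
               torch.randn(out[3]:size())})
```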

FYI, PyTorch provides similar functionality: register_forward_hook and register_backward_hook.
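For comparison, a minimal sketch of the PyTorch hook approach (the model here is arbitrary; `register_forward_hook` passes each module its input and output, so you can print or record shapes without touching the model definition):

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 20), nn.Tanh())

# Record (module name, output shape) for every submodule on each forward pass.
shapes = []

def hook(module, inputs, output):
    shapes.append((type(module).__name__, tuple(output.shape)))

handles = [m.register_forward_hook(hook) for m in model]

x = torch.randn(4, 10)
model(x)
print(shapes)  # [('Linear', (4, 20)), ('Tanh', (4, 20))]

# Hooks return handles so they can be removed when debugging is done.
for h in handles:
    h.remove()
```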