Beware of torch.FloatTensor(1). Should use torch.zeros(1)
huangzhii opened this issue · comments
Zhi Huang commented
Still, in the orthogonal regularization piece of code:
orth_loss = Variable(torch.FloatTensor(1), requires_grad=True)
Using torch.FloatTensor here is dangerous, as soumith points out (pytorch/tutorials#41). torch.FloatTensor creates a tensor with uninitialized memory rather than zeros, so it can contain arbitrary garbage values. This is intentional. If you want a zeroed tensor, use torch.zeros(1) instead.
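A minimal sketch of the corrected pattern, assuming an orthogonal regularization term over a model's weight matrices (the model, the regularization coefficient, and the loss formula here are illustrative, not the repository's actual code):

```python
import torch
import torch.nn as nn

# Illustrative model; any module with weight matrices would do.
model = nn.Sequential(nn.Linear(4, 4), nn.ReLU(), nn.Linear(4, 4))

reg_strength = 1e-4  # illustrative coefficient

# Start the accumulator from zeroed memory. torch.FloatTensor(1) would
# instead hand back whatever garbage happens to be in that memory,
# silently offsetting the loss.
orth_loss = torch.zeros(1)
for name, param in model.named_parameters():
    if 'weight' in name:
        w = param.view(param.size(0), -1)
        # Penalize deviation of W W^T from the identity.
        sym = w @ w.t() - torch.eye(w.size(0))
        orth_loss = orth_loss + reg_strength * sym.abs().sum()
```

The accumulated orth_loss can then be added to the main training loss before calling backward().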
Kevin Zakka commented
I've finally updated the regularizations to use tensors rather than Variable, and fixed this as well. Thanks @huangzhii!