kevinzakka / pytorch-goodies

PyTorch Boilerplate For Research


Beware of torch.FloatTensor(1). Should use torch.zeros(1)

huangzhii opened this issue

Still regarding the orthogonal regularization piece of code:
orth_loss = Variable(torch.FloatTensor(1), requires_grad=True)
Using torch.FloatTensor here is dangerous, as noted by soumith (pytorch/tutorials#41).

Basically, torch.FloatTensor will create a Tensor with uninitialized memory instead of zeros. The memory can contain any garbage, since it is uninitialized; this is intentional. If you want a zeroed Tensor, use torch.zeros(1).
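A minimal standalone sketch (not taken from the repo) illustrating the difference:

```python
import torch

# torch.FloatTensor(1) allocates a 1-element tensor with UNINITIALIZED memory:
# its contents are whatever happened to be at that address.
uninit = torch.FloatTensor(1)
print(uninit)  # arbitrary value; may even look like zero by coincidence

# torch.zeros(1) explicitly allocates and zero-fills the tensor.
zeroed = torch.zeros(1)
print(zeroed)  # tensor([0.])
```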

I've finally updated the regularizations to use tensors rather than Variables and fixed this as well. Thanks @huangzhii!
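A sketch of how an orthogonal regularizer can accumulate into a zeroed tensor instead of an uninitialized Variable; the function and parameter names here are illustrative and not necessarily those used in the repo:

```python
import torch

def orth_regularization(model, reg=1e-6):
    """Illustrative orthogonal regularizer: accumulates the penalty into a
    zero-initialized tensor rather than a Variable wrapping torch.FloatTensor(1)."""
    orth_loss = torch.zeros(1)
    for name, param in model.named_parameters():
        if 'bias' not in name:
            # Flatten each weight to 2D and penalize deviation of W W^T from the identity.
            param_flat = param.view(param.shape[0], -1)
            sym = torch.mm(param_flat, torch.t(param_flat))
            sym -= torch.eye(param_flat.shape[0])
            orth_loss = orth_loss + reg * sym.abs().sum()
    return orth_loss
```

The accumulated loss can simply be added to the task loss before calling backward(); since it is built from model parameters, autograd tracks the gradients without needing a Variable wrapper.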