The second-order analytical gradients are not all zero as described in the article.
ckrsls opened this issue
ckrsls commented
I printed the value of the second derivative and found that it is not zero. I understand that, in theory, the second derivative of trilinear interpolation should be zero, so why is the output of the code implementation inconsistent with this?
gradient = torch.autograd.grad(sdf.sum(), x, create_graph=True)[0]
# note: differentiating gradient.sum() yields the row sums of the Hessian
# (including mixed partials), not only the pure second derivatives
hessian = torch.autograd.grad(gradient.sum(), x, create_graph=True)[0]
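One possible source of confusion, shown on a hypothetical explicit trilinear function rather than the original SDF model: for a trilinear function the *pure* second derivatives (∂²f/∂x², etc.) vanish, but the *mixed* partials (∂²f/∂x∂y, etc.) do not, and differentiating `gradient.sum()` sums whole Hessian rows, so the mixed terms leak in. A minimal sketch:

```python
# Hypothetical check (not the original repo's code): second derivatives of
# an explicit trilinear function f(x, y, z) = x*y*z + x + 1, which has
# degree <= 1 in each variable, just like trilinear interpolation.
import torch

p = torch.tensor([0.3, 0.6, 0.9], requires_grad=True)

def f(p):
    x, y, z = p
    return x * y * z + x + 1.0  # trilinear in (x, y, z)

grad = torch.autograd.grad(f(p), p, create_graph=True)[0]
# grad = (y*z + 1, x*z, x*y)

# Full Hessian: the diagonal (pure second derivatives) is zero,
# but the off-diagonal mixed partials are not.
hess = torch.stack(
    [torch.autograd.grad(g, p, retain_graph=True)[0] for g in grad]
)
# hess = [[0, z, y],
#         [z, 0, x],
#         [y, x, 0]]

# The snippet in the issue instead differentiates grad.sum(), which
# returns the row sums of the Hessian and therefore picks up the
# nonzero mixed terms:
row_sums = torch.autograd.grad(grad.sum(), p, create_graph=True)[0]
# row_sums = (z + y, z + x, y + x), all nonzero here
```

So even a mathematically trilinear function produces a nonzero result from this pattern; a grid-interpolated SDF can additionally pick up nonzero values when query points straddle voxel boundaries, where the interpolant is only piecewise trilinear.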