NVlabs / neuralangelo

Official implementation of "Neuralangelo: High-Fidelity Neural Surface Reconstruction" (CVPR 2023)

Home Page: https://research.nvidia.com/labs/dir/neuralangelo/


The second-order analytical gradients are not all zero as described in the article.

ckrsls opened this issue · comments

I printed the value of the second derivative and found that it is not zero. I understand that, theoretically, the second derivative of trilinear interpolation should be 0, so why is the result of the code implementation inconsistent with this?
```python
gradient = torch.autograd.grad(sdf.sum(), x, create_graph=True)[0]
hessian = torch.autograd.grad(gradient.sum(), x, create_graph=True)[0]
```
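One plausible explanation (my reading, not a maintainer's answer): trilinear interpolation is only linear *per axis*. Each pure second derivative d²f/dx² is indeed zero, but the mixed partials (d²f/dxdy, etc.) are not, and calling `autograd.grad(gradient.sum(), x)` sums all gradient components first, so each returned entry is a row sum of the Hessian and picks up those mixed terms. (The MLP applied after the hash encoding is nonlinear anyway, which adds further nonzero curvature.) A minimal sketch with a hand-written multilinear term, f(x, y, z) = x·y·z, which is the kind of term trilinear interpolation contains:

```python
# f(x, y, z) = x*y*z is multilinear, like the terms in trilinear interpolation.
def f(x, y, z):
    return x * y * z

def grad_f(x, y, z):
    # Analytic first derivatives: (df/dx, df/dy, df/dz).
    return (y * z, x * z, x * y)

def hessian_row_sums(x, y, z):
    # d/dx, d/dy, d/dz of (df/dx + df/dy + df/dz) -- the analytic analogue of
    # torch.autograd.grad(gradient.sum(), x). Pure second derivatives vanish,
    # but the mixed partials remain:
    #   d/dx (y*z + x*z + x*y) = z + y, and so on.
    return (z + y, z + x, y + x)

print(hessian_row_sums(1.0, 2.0, 3.0))  # (5.0, 4.0, 3.0) -- nonzero
```

So a nonzero printout does not by itself contradict the paper's claim that the pure (diagonal) second derivatives of trilinear interpolation are zero.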