luost26 / score-denoise

:snowflake: Score-Based Point Cloud Denoising (ICCV 2021)

Supervised loss function

ddsediri opened this issue · comments

Hi @luost26,

Thank you for sharing your implementation! I just have a question about the supervised (and self-supervised) loss function. In https://github.com/luost26/score-denoise/blob/main/models/denoise.py line 77, what is the purpose of self.dsm_sigma? I was not able to find this in the paper.

Furthermore, in equation 3 of the main paper, you take the expectation with respect to the distribution N(x_i). But in the code this is a straightforward average, so is N(x_i) a uniform distribution?
[Screenshot attachment: score-based-denoising-question]

Thank you!
D.

Hi,

You can think of self.dsm_sigma as a factor that scales the loss function to improve training.
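
For reference, here is a minimal sketch of how such a scaling term typically enters a denoising-score-matching style loss. The names `pred_score`, `target_score`, and the exact weighting are assumptions for illustration, not a copy of `models/denoise.py`:

```python
import torch

def dsm_style_loss(pred_score, target_score, dsm_sigma=0.01):
    """Hypothetical scaled score-matching loss (illustrative only).

    pred_score, target_score: (B, N, 3) tensors of predicted and target
    gradients (scores). Dividing by dsm_sigma rescales the otherwise tiny
    squared error so the loss and its gradients stay at a trainable magnitude.
    """
    sq_err = ((pred_score - target_score) ** 2).sum(dim=-1)  # (B, N)
    return (sq_err / dsm_sigma).mean()
```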

Please refer to Eq. 41 and Eq. 42 in http://personal.psu.edu/drh20/genetics/lectures/11.pdf; they explain why we can take a straightforward average.
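
In other words, the expectation over N(x_i) in Eq. 3 is replaced by its Monte Carlo estimate: draw samples from the neighborhood distribution and average, which is justified for any sampling distribution, not just a uniform one. A toy check (all names here are illustrative, not from the repo):

```python
import torch

# The sample mean of f(x) over draws from a distribution approximates
# E[f(x)] by the law of large numbers, regardless of the distribution,
# so no uniformity assumption is needed.
def f(x):
    return (x ** 2).sum(dim=-1)

samples = 0.02 * torch.randn(10000, 3)   # x ~ N(0, 0.02^2 I)
mc_estimate = f(samples).mean()          # straightforward average
analytic = 3 * 0.02 ** 2                 # E[||x||^2] for this Gaussian
print(mc_estimate.item(), analytic)      # the two values should be close
```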

Thanks!

Hey @luost26,

Ah, ok, fair enough. I wasn't exactly sure why the parameter was used, but it makes more sense as a means to scale/standardize the loss.

For the second point, great, you use the sample mean as an estimator for the expected value.

Thank you for clearing my doubts and cheers again for the implementation!
D.

The dsm_sigma is chosen according to the standard deviation of the noise during training.

In the implementation, the standard deviation of the noise added to the point cloud ranges from 0.01 to 0.03. Therefore, dsm_sigma is set to 0.01 so that the loss and its gradient are not too small.
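
As a back-of-the-envelope check, assuming the loss divides the squared error by dsm_sigma as sketched above (the exact weighting in the repo may differ): with noise std around 0.01 to 0.03, per-point squared residuals are on the order of 1e-4, and dividing by 0.01 lifts them into a more comfortable range.

```python
# Rough magnitudes only, under the assumed weighting above.
sigma_noise = 0.02               # typical noise std during training
raw_sq_err = sigma_noise ** 2    # ~4e-4, order of the squared residual
scaled = raw_sq_err / 0.01       # dsm_sigma = 0.01 -> ~4e-2
print(raw_sq_err, scaled)
```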

Thank you for the further clarification on how the value was chosen. That's very helpful!