aRI0U / RandLA-Net-pytorch

PyTorch implementation of RandLA-Net

Why is torch.distributions.utils.probs_to_logits needed?

tangyipeng100 opened this issue

I have noticed that torch.distributions.utils.probs_to_logits is applied right after the forward pass:

train.py, lines 115–117:

scores = model(points)  # [b, 40960, 6] -> [b, 14, 40960]

logp = torch.distributions.utils.probs_to_logits(scores, is_binary=False)

Why is torch.distributions.utils.probs_to_logits needed here? Can I remove it and use scores directly to calculate the loss?
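
For reference, a minimal sketch of what this call does for is_binary=False: it clamps the probabilities away from 0 and 1 and takes their elementwise log, i.e. it converts probabilities into log-probabilities. The tensor shapes below are made up for illustration and are not taken from the repo.

```python
import torch
from torch.distributions.utils import probs_to_logits

# Illustrative probabilities with the same layout as the model output:
# [batch, num_classes, num_points] (shapes chosen only for this example).
probs = torch.softmax(torch.randn(2, 14, 8), dim=1)

# For is_binary=False, probs_to_logits is equivalent to clamping the
# probabilities away from 0 and 1 and taking their elementwise log.
eps = torch.finfo(probs.dtype).eps
manual_logp = torch.log(probs.clamp(min=eps, max=1 - eps))

print(torch.allclose(probs_to_logits(probs, is_binary=False), manual_logp))  # True
```

So whether the call can be dropped depends on the loss that consumes logp. Assuming scores are probabilities and the loss is nn.NLLLoss (which expects log-probabilities), removing the conversion would make the training objective minimize -p instead of -log p, which is no longer cross-entropy; nn.CrossEntropyLoss, on the other hand, expects unnormalized logits rather than softmax outputs.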