Question on Soft argmax implementations.
widiba03304 opened this issue · comments
AFAIK, soft argmax is an expectation over the heatmap, which should be computable without control flow like `if` or `for`.
However, both the TensorFlow and PyTorch implementations contain some control flow:
- Line 220 in 8b2b116
- metrabs/metrabs_pytorch/ptu.py, line 58 in 8b2b116
Can you check if my understanding is correct?
- Let's assume an input with a shape [B, D, J, H, W].
- The softmax over the activation reduces along the axes [D, H, W], so the values sum to 1 over [D, H, W].
- However, these implementations multiply the indices (linspace) against [D, H], [D, W], and [H, W], not against the full [D, H, W].
Why is the softmax performed over [D, H, W] if the indices for [D, H], [D, W], and [H, W] are what get multiplied by it?
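For reference, here is a minimal NumPy sketch of the expectation-based soft argmax I have in mind (this is my own illustration, not the repo's code; the [0, 1] coordinate normalization and axis order are my assumptions):

```python
import numpy as np

def soft_argmax(logits):
    """Soft argmax as an expectation of coordinates.

    logits: array of shape [B, D, J, H, W].
    Softmax is taken jointly over the spatial axes D, H, W (axes 1, 3, 4);
    coordinates are normalized to [0, 1] (an assumption for this sketch).
    Returns expected coordinates of shape [B, J, 3] ordered (x, y, z).
    """
    b, d, j, h, w = logits.shape
    # numerically stable softmax over axes (1, 3, 4)
    x = logits - logits.max(axis=(1, 3, 4), keepdims=True)
    p = np.exp(x)
    p /= p.sum(axis=(1, 3, 4), keepdims=True)

    coords_d = np.linspace(0.0, 1.0, d)
    coords_h = np.linspace(0.0, 1.0, h)
    coords_w = np.linspace(0.0, 1.0, w)

    # Summing out two axes yields the marginal over the remaining axis;
    # multiplying that marginal by a 1-D linspace and summing gives the
    # same expectation as multiplying a full [D, H, W] coordinate grid.
    marg_d = p.sum(axis=(3, 4))                                # [B, D, J]
    marg_h = p.sum(axis=(1, 4))                                # [B, J, H]
    marg_w = p.sum(axis=(1, 3))                                # [B, J, W]
    ez = (marg_d * coords_d[None, :, None]).sum(axis=1)        # [B, J]
    ey = (marg_h * coords_h).sum(axis=-1)                      # [B, J]
    ex = (marg_w * coords_w).sum(axis=-1)                      # [B, J]
    return np.stack([ex, ey, ez], axis=-1)                     # [B, J, 3]
```

With a sharply peaked heatmap the result approaches the (normalized) coordinates of the peak, which is the behavior I expect from soft argmax.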
Thanks in advance.