Why different Softmax?
LukeAI opened this issue · comments
Thanks for this demo!
I was trying to understand the implementation, and I see that there is a modified softmax function.
The original head uses the standard softmax implementation from PyTorch:
https://github.com/Turoad/CLRNet/blob/7269e9d1c1c650343b6c7febb8e764be538b1aed/clrnet/models/heads/clr_head.py#L450
but I see that your implementation does something slightly different: you first subtract the max from each element of x before exponentiating. Why is that?
CLRNet-onnxruntime-and-tensorrt-demo/demo_trt.py
Lines 153 to 156 in 07a47e6
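For reference, this is the max-subtraction pattern I mean. A minimal NumPy sketch of my understanding, not the demo's actual code; the function name and axis handling are my own:

```python
import numpy as np

def softmax_stable(x, axis=-1):
    # Subtracting the per-row max before exponentiating keeps np.exp from
    # overflowing on large logits. The result is mathematically identical,
    # since exp(x - m) / sum(exp(x - m)) == exp(x) / sum(exp(x)).
    x_max = np.max(x, axis=axis, keepdims=True)
    e = np.exp(x - x_max)
    return e / np.sum(e, axis=axis, keepdims=True)

# A naive exp(x) / sum(exp(x)) overflows to inf/nan for logits like these;
# the shifted version stays finite.
logits = np.array([1000.0, 1001.0, 1002.0])
print(softmax_stable(logits))
```

If that is the motivation, I assume it matters here because the raw TensorRT outputs are unbounded logits, unlike torch's softmax, which applies the same shift internally.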