Radius computation when normalizing ray directions
MrMois opened this issue · comments
Hello, I got a question regarding the ball radius.
In camera.py the ray directions are normalized, hence the t values (and distances) are in Euclidean space. But the radii are computed on the image plane w.r.t. the unnormalized distances. Shouldn't the radii also be scaled by 1 / ||directions|| before the directions are normalized, so you get the radii on the unit sphere?
Greets!
I think that is done in compute_ball_radii inside trimipRF.py. The radii are scaled depending on the angle to the principal camera ray.
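To see why scaling by the angle to the principal ray is equivalent to the 1 / ||directions|| factor from the question: for a direction d = (x, y, 1) through the plane at z = 1, the cosine of its angle to the principal axis (0, 0, 1) is exactly 1 / ||d||. A minimal sketch with NumPy, using assumed intrinsics (fx, fy, cx, cy are illustrative values, not the repo's):

```python
import numpy as np

# Assumed pinhole intrinsics for illustration only.
fx = fy = 500.0
cx, cy = 320.0, 240.0

# An example pixel; its unnormalized direction pierces the z = 1 plane.
u, v = 400.0, 300.0
d = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])

# Cosine of the angle between d and the principal axis (0, 0, 1).
axis = np.array([0.0, 0.0, 1.0])
cos_theta = (d @ axis) / np.linalg.norm(d)

# Because d_z = 1, this cosine equals 1 / ||d||, so multiplying the
# plane-space radii by cos_theta is the same as dividing by ||d||.
assert np.isclose(cos_theta, 1.0 / np.linalg.norm(d))
```

So the rescaling the question asks for is present, just expressed through the angle rather than through the direction norm.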
I have the same question, and I'm looking forward to a reply!
The radii are computed on a virtual image plane at unit depth (coordinate 1 along the z-axis) w.r.t. the camera frame. Whether or not the rays are normalized, this unit depth fixes the coordinates of the pixels on that plane, so the radii are well defined there.
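A sketch of that construction, in the style of mip-NeRF's base radii (pixel spacing on the z = 1 plane times 2/sqrt(12)); the intrinsics and the 2/sqrt(12) factor are assumptions for illustration, not necessarily what this repo uses:

```python
import numpy as np

# Assumed intrinsics and resolution for illustration.
fx = fy = 500.0
cx, cy = 320.0, 240.0
W, H = 640, 480

# Pixel-centre directions through the virtual plane at z = 1.
u, v = np.meshgrid(np.arange(W) + 0.5, np.arange(H) + 0.5)
dirs = np.stack([(u - cx) / fx, (v - cy) / fy, np.ones_like(u)], axis=-1)

# Base radius: spacing between neighbouring pixel centres on the plane,
# scaled by 2/sqrt(12) (mip-NeRF's uniform-footprint convention).
dx = np.linalg.norm(dirs[:, 1:, :] - dirs[:, :-1, :], axis=-1)
dx = np.concatenate([dx, dx[:, -1:]], axis=1)  # pad last column
radii = dx * 2.0 / np.sqrt(12.0)

# The pixel grid on the z = 1 plane is fixed by the unit depth, so
# these radii do not change if `dirs` is normalized afterwards.
```

For a uniform pixel grid the spacing on the plane is constant (1/fx), so every base radius here comes out the same; the per-sample variation only enters later when the radii are scaled along each ray.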