ClementPinard / SfmLearner-Pytorch

Pytorch version of SfmLearner from Tinghui Zhou et al.


Hello, how can I visualize the true values of the depth map?

yangbinchao opened this issue

I want to evaluate my prediction performance by visualizing the true depth values of the image. How can I display the ground truth of the depth map?

Hi,

you can easily log the ground truth with the tensor2array function.
Here, in a very similar project, I made a little function that logs the best and worst results, which you can build on: https://github.com/ClementPinard/unsupervised-depthnet/blob/master/test_depth.py#L50

Note that the depth here is only a bunch of lidar rays, so it is not continuous.
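
For reference, here is a minimal sketch of what such logging could look like with TensorBoard, assuming the repo's utils.tensor2array helper keeps a tensor2array(tensor, max_value=None, ...) signature (check your checkout, and whether it returns a CHW or HWC array):

```python
# Hedged sketch: log ground-truth vs. predicted depth to TensorBoard.
# Assumes utils.tensor2array colorizes a 2D depth tensor into an image array;
# depending on the repo version the result is CHW or HWC, so adjust the
# `dataformats` argument of add_image accordingly.
import torch
from torch.utils.tensorboard import SummaryWriter
from utils import tensor2array  # helper from this repository

writer = SummaryWriter('logs/depth_viz')

def log_depth_pair(gt_depth, pred_depth, step):
    # gt_depth, pred_depth: 2D torch tensors (H x W); invalid GT pixels are typically 0
    max_d = gt_depth.max().item()  # share one scale so both maps are comparable
    writer.add_image('depth/ground_truth', tensor2array(gt_depth, max_value=max_d), step)
    writer.add_image('depth/prediction', tensor2array(pred_depth, max_value=max_d), step)
```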


Thank you very much, that solved my problem.


Thank you for your reply. How can this kind of GT be generated?

[Image 1: example of a dense ground-truth depth visualization]


Hi, have you managed to implement this?

Hi, doing chores to clean up my issue list: the visualization above can be obtained with scipy's ND interpolator. See https://docs.scipy.org/doc/scipy/reference/generated/scipy.interpolate.LinearNDInterpolator.html#scipy.interpolate.LinearNDInterpolator

So you just need to get the 3D points from the lidar, as done here: https://github.com/ClementPinard/SfmLearner-Pytorch/blob/master/data/kitti_raw_loader.py#L165 and then use scipy's function to construct the interpolated depth map.

Note that it won't be accurate, so it is only for visualization purposes.
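
A minimal sketch of that interpolation, assuming the sparse ground truth is stored as a 2D array with zeros where there is no lidar return (the densify_depth helper name is illustrative, not from the repo):

```python
# Hedged sketch: densify a sparse lidar depth map, for visualization only.
import numpy as np
from scipy.interpolate import LinearNDInterpolator

def densify_depth(sparse_depth):
    # Pixels with a lidar return (assumed to be the non-zero entries) become
    # the interpolation nodes.
    ys, xs = np.nonzero(sparse_depth)
    values = sparse_depth[ys, xs]
    interpolator = LinearNDInterpolator(np.stack([ys, xs], axis=1), values, fill_value=0)
    # Evaluate the interpolant on the full pixel grid.
    grid_y, grid_x = np.mgrid[0:sparse_depth.shape[0], 0:sparse_depth.shape[1]]
    return interpolator(grid_y, grid_x)
```

The densified map can then be colorized (e.g. with tensor2array or any matplotlib colormap) to produce an image like the one above.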