ros-industrial / yak

A library for integrating depth images into Truncated Signed Distance Fields.

Scaling Factor Between Depth Image and Real Distance

kokokoharu opened this issue

Hi,

Sorry if this is a trivial question, I'm new to this. Does Yak expect a scaling factor between the depth image pixel values and real depth? It seems not. I can't see how it fuses depth images into the TSDF volume, which is in world units, without knowing the conversion factor, since every point along a camera ray projects to the same pixel. Is there anything I've missed?

Thanks

The fusion algorithm does make some assumptions about how distances are represented in depth images, based on the data type of the pixel values. Many RGB-D cameras represent the depth at each pixel as a 16-bit unsigned integer in units of millimeters, so if you provide uint16 images Yak assumes the distances are in millimeters. Some other types of sensors, like industrial structured-light scanners, represent depth as 32-bit floating-point numbers in units of meters, since that makes it possible to represent almost arbitrarily small measurements, so if you provide float32 images Yak assumes units of meters. We can get away with this because there are conventions defining how depth measurements should be represented in sensor data: for example, ROS REP-103 says that ROS systems should represent distances in units of meters.
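
To make that concrete, here's a minimal sketch of the convention described above. This is not yak's actual API; it just assumes OpenCV-style `cv::Mat` depth images and shows how the pixel data type alone determines the conversion to meters:

```cpp
// Minimal sketch (hypothetical helper, not part of yak) of the common
// ROS depth-image encoding conventions:
//   CV_16UC1 -> unsigned 16-bit integer, units of millimeters
//   CV_32FC1 -> 32-bit float, units of meters
#include <opencv2/core.hpp>
#include <cstdint>
#include <stdexcept>

// Return the depth at pixel (u, v) in meters, inferring the units
// from the image's element type.
double depthAtPixelMeters(const cv::Mat& depth, int u, int v)
{
  switch (depth.type())
  {
    case CV_16UC1:
      // Integer millimeters -> meters
      return static_cast<double>(depth.at<uint16_t>(v, u)) / 1000.0;
    case CV_32FC1:
      // Already in meters
      return static_cast<double>(depth.at<float>(v, u));
    default:
      throw std::runtime_error("Unsupported depth image encoding");
  }
}
```

The key point is that no explicit scaling factor is passed in: the element type of the image carries that information by convention.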

For an example of a place where this can cause problems, check out issue #24 in yak_ros.

@schornakj Thank you very much!