NVIDIA-AI-IOT / CUDA-PointPillars

A project demonstrating how to use CUDA-PointPillars to process point cloud data from a lidar.

Changing the grid size resulted in incorrect inference results

Forgeaheadye opened this issue

As soon as I change the ratio of the point cloud range to the pillar size (i.e., the grid size), the inference results become completely inaccurate.

const int grid_x_size = (max_x_range - min_x_range) / pillar_x_size;  // 432 with the default parameters below
const int grid_y_size = (max_y_range - min_y_range) / pillar_y_size;  // 496
const int grid_z_size = (max_z_range - min_z_range) / pillar_z_size;  // 1
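
One pitfall worth noting (my own suspicion, not confirmed by the authors): these divisions use float operands and truncate to int, so a range that is a whole multiple of the pillar size on paper, e.g. 79.36 / 0.16 = 496, can evaluate to something like 495.99997 and truncate to 495. A minimal sketch of a rounding-robust version, with a hypothetical helper computeGridSize that is not part of the CUDA-PointPillars API:

#include <cmath>
#include <cstdio>

// Round to the nearest integer instead of truncating, and warn when
// the range is not a whole multiple of the pillar size.
static int computeGridSize(float min_range, float max_range, float pillar_size) {
    const float exact = (max_range - min_range) / pillar_size;
    const int rounded = static_cast<int>(std::lround(exact));
    if (std::fabs(exact - static_cast<float>(rounded)) > 1e-3f) {
        std::fprintf(stderr, "range [%.2f, %.2f] is not a whole multiple of pillar size %.2f\n",
                     min_range, max_range, pillar_size);
    }
    return rounded;
}

// Example: const int grid_y_size = computeGridSize(min_y_range, max_y_range, pillar_y_size);  // 496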

If my grid size does not match the grid size provided by the CUDA-PointPillars authors, inference is incorrect.
This prevents me from freely setting the point cloud range and pillar size.

const float min_x_range = 0.0;
const float max_x_range = 69.12;
const float min_y_range = -39.68;
const float max_y_range = 39.68;
const float min_z_range = -3.0;
const float max_z_range = 1.0;

const float pillar_x_size = 0.16;
const float pillar_y_size = 0.16;
const float pillar_z_size = 4.0;

These are the point cloud range and pillar size provided by the author.
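
For reference, with these defaults the computation above yields a 432 × 496 × 1 grid. As far as I understand (an assumption on my part), the pretrained ONNX model shipped with the repo was exported for exactly this grid, so the backbone feature-map sizes and anchor layout are baked into the network; changing the range or pillar size changes the grid and silently breaks that match unless you retrain and re-export the model with the new parameters. A quick fail-fast check, using hypothetical constants for the model's expected grid:

#include <cassert>

// Grid dimensions the pretrained model was exported with (hypothetical
// constant names; the real values come from your training config).
constexpr int kModelGridX = 432;  // (69.12 -   0.00) / 0.16
constexpr int kModelGridY = 496;  // (39.68 - -39.68) / 0.16
constexpr int kModelGridZ = 1;    // ( 1.00 -  -3.00) / 4.00

// Abort early if the preprocessing grid no longer matches the network.
void checkGridMatchesModel(int grid_x, int grid_y, int grid_z) {
    assert(grid_x == kModelGridX && "grid_x_size must match the exported model");
    assert(grid_y == kModelGridY && "grid_y_size must match the exported model");
    assert(grid_z == kModelGridZ && "grid_z_size must match the exported model");
}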

Thank you very much for your answer.

I have the same problem. Did you ever resolve it? If so, could you please let me know what the cause was?