NVIDIA-AI-IOT / CUDA-PointPillars

A project demonstrating how to use CUDA-PointPillars to process LiDAR point cloud data.

Different results when running inference on the ONNX file with the TRT engine vs. inference with the .pth file

Allamrahul opened this issue

I am able to train and validate using a custom dataset (a set of .npy files with corresponding annotations) with the PointPillars model. The results look accurate when visualized on the point cloud with demo.py (i.e., inference with the .pth checkpoint).

For my eval example 000000.npy,
I get the following results using demo.py:
pred_dicts[0]['pred_scores'] tensor([0.8475, 0.4954, 0.4725, 0.4603], device='cuda:0')
pred_dicts[0]['pred_boxes'] tensor([[ 9.6669, 1.1732, 2.1426, 0.2856, 0.5018, 3.1349, 6.2874],
[ 9.8581, -10.6740, 2.0632, 0.4447, 0.4504, 2.5857, 6.2749],
[ 24.9824, -10.4977, 3.1227, 0.2673, 0.4696, 3.1857, 6.2983],
[ 24.8274, 1.3483, 2.7095, 0.2326, 0.4953, 3.1487, 6.3119]],
device='cuda:0')

However, when I run inference on the generated ONNX file using the TRT engine, I get the following:
50.8117 -6.56379 1.9568 0.256044 0.501621 2.79036 6.28261 0 0.860124
48.4368 -14.8604 2.06768 0.442814 0.450666 2.58379 6.27543 0 0.499151
42.9857 -14.6852 3.1227 0.267265 0.469636 3.18565 6.29831 0 0.472531
45.4026 -6.38253 2.70948 0.232621 0.495276 3.14874 6.31186 0 0.460299

After analyzing the numbers, I see that the confidence scores are almost the same, the box dimensions match across both, and even the z coordinates match. However, there is a huge disparity in the x and y coordinates. Can someone please help me out?
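
For reference, here is a small sketch (not part of the project) that puts the two result sets quoted above side by side and prints the per-box offsets. It assumes the rows are already aligned, since the scores and box dimensions match row for row, and that the TRT output columns are x, y, z, dx, dy, dz, heading, class id, score, which is what the matching values suggest.

# Sketch (not from the repo): compare the two result sets quoted above.
# Assumption: rows are already aligned -- scores and box dimensions match
# row for row -- and the TRT columns are x y z dx dy dz heading class_id score.
import numpy as np

pth_boxes = np.array([
    [ 9.6669,   1.1732, 2.1426, 0.2856, 0.5018, 3.1349, 6.2874],
    [ 9.8581, -10.6740, 2.0632, 0.4447, 0.4504, 2.5857, 6.2749],
    [24.9824, -10.4977, 3.1227, 0.2673, 0.4696, 3.1857, 6.2983],
    [24.8274,   1.3483, 2.7095, 0.2326, 0.4953, 3.1487, 6.3119],
])

trt_boxes = np.array([
    [50.8117,  -6.56379, 1.9568,  0.256044, 0.501621, 2.79036, 6.28261],
    [48.4368, -14.8604,  2.06768, 0.442814, 0.450666, 2.58379, 6.27543],
    [42.9857, -14.6852,  3.1227,  0.267265, 0.469636, 3.18565, 6.29831],
    [45.4026,  -6.38253, 2.70948, 0.232621, 0.495276, 3.14874, 6.31186],
])

# Per-box x/y offsets. A roughly constant offset would suggest a shifted
# coordinate origin (e.g. a point-cloud-range mismatch between the two
# pipelines); a varying offset suggests the input points themselves differ.
print(trt_boxes[:, :2] - pth_boxes[:, :2])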

I have attached another comparison as well (attached image: CompResPP).

Hey, can someone please help me out with this? I am using .npy files for TRT inference instead of .bin files; I am not sure whether that is causing the issue. When I compared the .npy data loaded in Python against the data loaded in main.cpp, I saw 32 additional bytes when main.cpp loads the same .npy file. I accounted for this, but it did not fix my issue, unfortunately!
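
One quick way to take the .npy-vs-.bin question out of the picture is to convert the .npy file to .bin first and feed that to the TRT pipeline. This is a sketch under the assumption that main.cpp expects the usual headerless raw float32 x, y, z, intensity layout of KITTI .bin files; np.load strips the .npy header on the Python side, which a raw binary read in C++ does not, so leftover header bytes would be interpreted as point data.

# Sketch: convert an .npy point cloud to the headerless float32 .bin layout
# assumed above (x, y, z, intensity per point). File names are placeholders.
import numpy as np

points = np.load("000000.npy")                    # header handled by np.load
assert points.ndim == 2 and points.shape[1] == 4  # expected shape (N, 4)
points.astype(np.float32).tofile("000000.bin")    # raw little-endian floats, no header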

@byte-deve, I would really appreciate your input here.