lordzth666 / WACV23_PIDS-Joint-Point-Interaction-Dimension-Search-for-3D-Point-Cloud

Open-source code for the paper "PIDS: Joint Point Interaction-Dimension Search for 3D Point Cloud".

Point Cloud Size Mismatch

ldkong1205 opened this issue · comments

Hi @lordzth666, thanks for open-sourcing your work and providing the checkpoints!

I am testing PIDS on a cross-sensor dataset that has fewer points than SemanticKITTI. I used the checkpoints trained on SemanticKITTI directly during evaluation, and the following error appeared:

Traceback (most recent call last):
  File "/PIDS/val_models.py", line 281, in <module>
    tester.slam_segmentation_test(net, test_loader, config, model_root=model_root)
  File "/PIDS/pids_core/utils/tester.py", line 548, in slam_segmentation_test
    for i, batch in enumerate(test_loader):
  File "/anaconda3/envs/torch1101/lib/python3.9/site-packages/torch/utils/data/dataloader.py", line 530, in __next__
    data = self._next_data()
  File "/anaconda3/envs/torch1101/lib/python3.9/site-packages/torch/utils/data/dataloader.py", line 1192, in _next_data
    return self._process_data(data)
  File "/anaconda3/envs/torch1101/lib/python3.9/site-packages/torch/utils/data/dataloader.py", line 1238, in _process_data
    data.reraise()
  File "/anaconda3/envs/torch1101/lib/python3.9/site-packages/torch/_utils.py", line 434, in reraise
    raise exception
IndexError: Caught IndexError in DataLoader worker process 2.
Original Traceback (most recent call last):
  File "/anaconda3/envs/torch1101/lib/python3.9/site-packages/torch/utils/data/_utils/worker.py", line 287, in _worker_loop
    data = fetcher.fetch(index)
  File "/anaconda3/envs/torch1101/lib/python3.9/site-packages/torch/utils/data/_utils/fetch.py", line 49, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/anaconda3/envs/torch1101/lib/python3.9/site-packages/torch/utils/data/_utils/fetch.py", line 49, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/PIDS/pids_core/datasets/SemanticKitti.py", line 230, in __getitem__
    ind = int(self.epoch_inds[self.epoch_i])
IndexError: index 2200 is out of bounds for dimension 0 with size 2200

Could you have a look at this problem? Thanks!

Hi, @ldkong1205 ,

I think the issue with using a different dataset is the "max_in_points" and "max_val_points" constants (see https://github.com/lordzth666/WACV23_PIDS-Joint-Point-Interaction-Dimension-Search-for-3D-Point-Cloud/blob/master/cfgs/semantic_kitti.py#L59 as an example). Since the new dataset contains fewer points, the expected values are smaller, which causes the checkpoints to fail with the "out-of-bounds" IndexError above.
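To illustrate the failure mode, here is a minimal sketch (hypothetical names; the actual logic lives in SemanticKitti.py's __getitem__): a frame-index buffer precomputed for one dataset's size overflows once the running cursor advances past its end on a dataset with different statistics.

```python
import numpy as np

def make_epoch_inds(num_frames: int) -> np.ndarray:
    # Precomputed buffer of frame indices for one epoch; its size is
    # derived from the dataset's statistics (hypothetical helper).
    return np.arange(num_frames)

# Buffer sized for the original dataset (2200 entries, as in the traceback).
epoch_inds = make_epoch_inds(2200)

# On a different dataset, the cursor epoch_i can advance past the end of
# the precomputed buffer, reproducing the IndexError from the traceback.
epoch_i = 2200
raised = False
try:
    ind = int(epoch_inds[epoch_i])
except IndexError:
    raised = True
print("IndexError reproduced:", raised)
```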

I would suggest one of the following approaches:
(1) Run the training code of the architecture on the new dataset (recommended). The training code determines "max_val_points" and "max_in_points" automatically, so you can proceed with the correct numbers.
(2) (Not advised) Plug the "max_val_points" and "max_in_points" obtained in (1) directly into the config file and rerun the validation. This is less reliable for judging the real strength of the checkpoints, but it may help when hacking together a run on a new dataset.
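For route (2), the manual override would look roughly like the following. This is a sketch only: the attribute names follow the constants discussed above, but the actual structure of cfgs/semantic_kitti.py may differ, and 1500 is a placeholder rather than a measured value.

```python
# Hypothetical stand-in for the config object loaded from cfgs/semantic_kitti.py.
class Config:
    max_in_points = 2200   # illustrative SemanticKITTI-sized value
    max_val_points = 2200  # illustrative SemanticKITTI-sized value

config = Config()

# Plug in the (smaller) numbers that a training run on the new dataset
# reported in step (1); 1500 is a placeholder.
config.max_in_points = 1500
config.max_val_points = 1500

print(config.max_in_points, config.max_val_points)
```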

Thanks for your reply and suggestion!