Pointcept / Pointcept

Pointcept: a codebase for point cloud perception research. Latest works: PTv3 (CVPR'24 Oral), PPT (CVPR'24), OA-CNNs (CVPR'24), MSC (CVPR'23)

Understanding the point_collate_fn.

BBangHoYoo opened this issue · comments

Hi,

I am currently training on custom data and have run into some confusion about batch processing. For instance, with a batch size of 6, the batch indices sometimes come out as 0, 0, ..., 0, 1, 1, ..., 1, 2, 2, ..., 2, with segment sizes like [204800, 204800, 204800], even though each original sample has 102400 points. This suggests some kind of batch merging, but I am unsure why it occurs. While debugging, it looks like this is handled in point_collate_fn, but I do not fully understand why this operation is needed. Could you please clarify why it is required? Thank you for your assistance.

import random
from collections.abc import Mapping

import torch

# collate_fn (the plain recursive collation helper) is defined in the
# same Pointcept module as point_collate_fn.

def point_collate_fn(batch, mix_prob=0):
    assert isinstance(
        batch[0], Mapping
    )  # currently, only input_dict is supported, not input_list
    batch = collate_fn(batch)
    if "offset" in batch.keys():
        # Mix3D augmentation (https://arxiv.org/pdf/2110.02210.pdf):
        # with probability mix_prob, merge adjacent pairs of point clouds
        # by dropping every other offset boundary, so two samples become
        # one larger "mixed" scene.
        if random.random() < mix_prob:
            batch["offset"] = torch.cat(
                [batch["offset"][1:-1:2], batch["offset"][-1].unsqueeze(0)], dim=0
            )
    return batch
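For intuition, here is a minimal sketch of what the offset slicing does, using a plain Python list in place of the offset tensor (assuming, as in Pointcept, that offset[i] is the cumulative point count marking where sample i ends in the concatenated batch). With 6 samples of 102400 points each, dropping every other boundary leaves 3 segments of 204800 points, matching the sizes observed above:

```python
# Cumulative offsets for a batch of 6 point clouds, 102400 points each.
offset = [102400, 204800, 307200, 409600, 512000, 614400]

# Mirrors torch.cat([offset[1:-1:2], offset[-1].unsqueeze(0)]):
# keep every second interior boundary plus the final one, so each
# adjacent pair of samples is fused into a single segment.
mixed = offset[1:-1:2] + [offset[-1]]
print(mixed)  # [204800, 409600, 614400]
```

Each surviving boundary now spans two original samples, which is why the per-segment size doubles from 102400 to 204800 while the batch-index pattern still runs 0, 0, ..., 1, 1, ..., 2, 2, ...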