HuguesTHOMAS / KPConv

Kernel Point Convolutions

Bug in Scannet.py

ZhengdiYu opened this issue

Hey,

You seem to be using the wrong aggregation file.

KPConv/datasets/Scannet.py

Lines 206 to 233 in 07861d8

vertex_data, faces = read_ply(join(path, scene, scene + '_vh_clean_2.ply'), triangular_mesh=True)
vertices = np.vstack((vertex_data['x'], vertex_data['y'], vertex_data['z'])).T
vertices_colors = np.vstack((vertex_data['red'], vertex_data['green'], vertex_data['blue'])).T

vertices_labels = np.zeros(vertices.shape[0], dtype=np.int32)
if new_path == self.train_path:

    # Load alignment matrix to realign points
    align_mat = None
    with open(join(path, scene, scene + '.txt'), 'r') as txtfile:
        lines = txtfile.readlines()
    for line in lines:
        line = line.split()
        if line[0] == 'axisAlignment':
            align_mat = np.array([float(x) for x in line[2:]]).reshape([4, 4]).astype(np.float32)
    R = align_mat[:3, :3]
    T = align_mat[:3, 3]
    vertices = vertices.dot(R.T) + T

    # Get objects segmentations
    with open(join(path, scene, scene + '_vh_clean_2.0.010000.segs.json'), 'r') as f:
        segmentations = json.load(f)
    segIndices = np.array(segmentations['segIndices'])

    # Get objects classes
    with open(join(path, scene, scene + '_vh_clean.aggregation.json'), 'r') as f:
        aggregation = json.load(f)

However, in the ScanNet repo https://github.com/ScanNet/ScanNet, it says:
|-- _vh_clean_2.ply
        Cleaned and decimated mesh for semantic annotations
|-- .aggregation.json, _vh_clean.aggregation.json
        Aggregated instance-level semantic annotations on lo-res, hi-res meshes, respectively
|-- _vh_clean_2.0.010000.segs.json, _vh_clean.segs.json
        Over-segmentation of lo-res, hi-res meshes, respectively (referenced by aggregated semantic annotations)
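
Since the snippet loads the lo-res mesh (_vh_clean_2.ply) and the lo-res over-segmentation (_vh_clean_2.0.010000.segs.json), the matching aggregation file should presumably be the lo-res .aggregation.json. A sketch of the assumed one-line fix (untested):

    # Get objects classes (lo-res aggregation, matching the lo-res mesh and segs)
    # NOTE: assumed fix based on the ScanNet layout above, not a tested patch
    with open(join(path, scene, scene + '.aggregation.json'), 'r') as f:
        aggregation = json.load(f)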

Please take a look!

Best

You seem to be right, but I have not used the ScanNet dataset for a very long time now. I will make a correction, and if there are any other problems, don't hesitate to post another issue.

Thanks for spotting this.

Best,
Hugues

Thank you for your quick reply! By the way, could I ask the reason for creating a rasterized version of the mesh via rasterize_mesh (a finer point cloud with a very large number of points) for ScanNet training? Why don't we just use the raw point cloud for training?

Best,
Zhengdi
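
For context, rasterizing a mesh into a finer cloud generally means sampling additional points on the triangle faces instead of keeping only the annotated vertices, so input points cover the whole surface rather than just where the scan happened to place vertices. Below is a minimal sketch of that general idea using area-weighted uniform sampling; the function sample_points_on_mesh is hypothetical and is not KPConv's actual rasterize_mesh, whose interface may differ:

    import numpy as np

    def sample_points_on_mesh(vertices, faces, n_points):
        # vertices: (N, 3) float array, faces: (M, 3) int array of vertex indices.
        # Returns (n_points, 3) points sampled uniformly on the mesh surface.
        tris = vertices[faces]                       # (M, 3, 3) triangle corners
        cross = np.cross(tris[:, 1] - tris[:, 0], tris[:, 2] - tris[:, 0])
        areas = 0.5 * np.linalg.norm(cross, axis=1)  # per-triangle surface area
        # Pick triangles with probability proportional to their area
        face_ids = np.random.choice(len(faces), size=n_points, p=areas / areas.sum())
        # Uniform barycentric coordinates inside each chosen triangle
        u = np.random.rand(n_points)
        v = np.random.rand(n_points)
        flip = (u + v) > 1.0                         # fold samples back into the triangle
        u[flip], v[flip] = 1.0 - u[flip], 1.0 - v[flip]
        w = 1.0 - u - v
        t = tris[face_ids]
        return w[:, None] * t[:, 0] + u[:, None] * t[:, 1] + v[:, None] * t[:, 2]

One plausible motivation for such a step is that a denser, more uniform cloud decouples the training point density from the original mesh tessellation, but this sketch is only an illustration of the technique, not a statement of the author's reasoning.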