loicland / superpoint_graph

Large-scale Point Cloud Semantic Segmentation with Superpoint Graphs

How to calculate mIoU and OA on Semantic3D?

LeanderWayne opened this issue

Hi, I followed the semantic3d.md instructions, and when I run learning/main.py with the --resume argument, stdout shows avgIoU: nan, Test mAcc: 0.0.

I wonder if you can help me.

Hi,
that is because the ground truth for the test sets is withheld. Either use the train/val split, or submit your .labels files to semantic3d.net.
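For the train/val option, the invocation looks roughly like this (a sketch based on the flags in semantic3d.md; training options omitted and paths to be adapted to your setup):

python learning/main.py --dataset sema3d --SEMA3D_PATH $SEMA3D_DIR --db_train_name train --db_test_name val --odir "results/sema3d/best"

With val as the test database, the ground truth is available, so mIoU and oAcc are computed properly.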

@loicland Thank you for replying. But how do I use the train/val split?
Is model.pth.tar still placed in the directory 'results/sema3d/trainval_best'?

@loicland Hi, I viewed semantic3d.net and found that the submission needs label files, but when I run:
"python partition/visualize.py --dataset sema3d --ROOT_PATH /home/leon/SEMA3D --res_file 'results/sema3d/trainval_best/predictions_testred' --file_path 'test_full/stgallencathedral_station6' --output_type ifprs"
I only got some .ply files. Then I tried to run this code with --output_type g, but I got an error:
Traceback (most recent call last):
  File "partition/visualize.py", line 94, in <module>
    prediction2ply(ply_file + "GT.ply", xyz, labels, n_labels, args.dataset)
  File "/home/leon/superpoint_graph/partition/provider.py", line 59, in prediction2ply
    if len(prediction.shape) > 1 and prediction.shape[1] > 1:
AttributeError: 'list' object has no attribute 'shape'

You are trying to write a .ply file of the ground truth on the test dataset, which is impossible. As per the readme, the .labels files are obtained with:

python partition/write_Semantic3d.py --SEMA3D_PATH $SEMA3D_DIR --odir "results/sema3d/trainval_best" --db_test_name testred

Hi, I have used write_Semantic3d.py to produce the label files; however, I found that every labels file is missing the label for one point. And the result I got from semantic3d.net is surprisingly low (IoU less than 10%). I know that voxel_width may be a factor, but can it be this low?

Due to my memory and GPU limits, I have to use a voxel_width of 0.2~0.3 when I run partition.py. The prediction result .ply file looks fine; at least the buildings and roads are correctly classified. I don't know why every class's IoU is less than 10%. @loicland

I remember a bug I thought I had fixed, in which all the labels were shifted down by one (0 instead of 1, 1 instead of 2). Could you check whether label_up is within 0-7 or 1-8 in write_Semantic3d.py? If the former,

np.savetxt(label_file, labels_ups + 1, delimiter=' ', fmt='%d')  # shift the labels from 0-7 to 1-8

should fix it.
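A quick way to check the range (the file path here is just an example; point it at one of your generated .labels files):

import numpy as np

# inspect the label range of a generated .labels file
labels = np.loadtxt("results/sema3d/trainval_best/marketsquarefeldkirch4-reduced.labels", dtype=int)
print(labels.min(), labels.max())  # 0 and 7 -> shifted, apply the fix above; 1 and 8 -> correct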

Well, I checked the labels file: the values are within 0-7.

Oops. If it works (it should), let me know, and I'll fix it in the repo.

I modified the code as you said. The values are within 1-8 now, but there is still one point missing. For example, marketsquarefeldkirch4-reduced.labels needs 10538633 lines, but the labels file I got has 10538632 values.

I actually found that in the read_semantic3d_format() function in provider.py, you use
pd.read_csv(data_file, delimiter=' ', chunksize=ver_batch)
I added the parameter header=None to avoid the first line becoming the header; however, it makes no difference to the labels file I finally get.
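That off-by-one is what pandas' default header inference does: unless header=None is passed, the first data row is consumed as column names. A minimal standalone demo:

import io
import pandas as pd

txt = "1\n2\n3\n"  # three label values, one per line
print(len(pd.read_csv(io.StringIO(txt), sep=' ').values))               # 2 -- first line eaten as header
print(len(pd.read_csv(io.StringIO(txt), sep=' ', header=None).values))  # 3 -- all lines kept as data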

Try applying your header change to interpolate_labels_batch at line 624 of provider.py, in particular lines 643 and 649. Alternatively, comment out the pandas code and uncomment the genfromtxt code; that is the one I used at the time of my submission (but it is very slow).

Let me know if it fixes things.

Hi, it took me some time to resubmit the results. After the modification, the results seem fine now.
In interpolate_labels_batch in provider.py:

if ver_batch > 0:
    vertices = pd.read_csv(data_file, sep=' ', nrows=ver_batch, header=None if i_rows is None else i_rows - 1).values

In the else part, I just use header=None.

These are my changes. Thanks for your help.
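For anyone applying the same fix, here is a sketch of how both branches end up (variable names as in the thread; the else branch is inferred from the description above, and this assumes i_rows tracks the number of rows already read):

# inside interpolate_labels_batch in provider.py (sketch, not the exact repo code)
if ver_batch > 0:
    # batched read: use row i_rows-1 as the 'header' so the data resumes at row i_rows
    vertices = pd.read_csv(data_file, sep=' ', nrows=ver_batch,
                           header=None if i_rows is None else i_rows - 1).values
else:
    # unbatched read: header=None keeps the first point from being swallowed
    vertices = pd.read_csv(data_file, sep=' ', header=None).values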

Hi, I have a similar problem: the result is avgIoU: nan and Test oAcc: 0.0 after running main.py. Following your discussion with the author, I ran write_Semantic3d.py and then ran main.py, but the problem is still there. I don't know if I am running the files in the wrong order. Could you please tell me how to solve the problem? I would appreciate it.