chenzhaiyu / points2poly

Reconstructing compact building models from point clouds using deep implicit fields [ISPRS 2022]

Home Page: https://github.com/chenzhaiyu/points2poly


GPU and dataset

Hxinyue opened this issue · comments

Thank you for the great work!
What is the minimum graphics-card requirement for your experiments? And how should I prepare my own building point cloud dataset for training?

Hi @Hxinyue, I can't indicate a minimal GPU requirement - we were using RTX 2080 Ti but you should be able to train your model given limited resources with reduced batch size. For training with your own data please see here.
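The reduced-batch-size advice can be sketched as a simple retry loop. This is an illustration, not the project's code: `train_step` is a hypothetical stand-in for one forward/backward pass, and `MemoryError` stands in for the CUDA out-of-memory error (`torch.cuda.OutOfMemoryError` in recent PyTorch versions).

```python
def find_feasible_batch_size(train_step, start=64, minimum=1):
    """Halve the batch size until one training step fits in GPU memory.

    `train_step` is a hypothetical callable running a single training step;
    in practice you would catch torch.cuda.OutOfMemoryError instead of
    MemoryError.
    """
    batch_size = start
    while batch_size >= minimum:
        try:
            train_step(batch_size)
            return batch_size
        except MemoryError:
            batch_size //= 2
    raise RuntimeError("even the minimum batch size does not fit in memory")
```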

When I tried to run 'python reconstruct.py dataset_name='helsinki_mini' model_name='helsinki_fullview'', I got the error shown in the screenshot below.
[screenshot: error message]

I tried 'pip install sage', but the installed sage package seemed to be empty. Does this need to be installed separately on a Linux system, or is there another solution?
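For reference: the `sage` package on PyPI is not SageMath, which is why it appears empty. SageMath is distributed through conda-forge (among other channels), so something along these lines should work instead; check the SageMath installation guide for the exact channel and version for your platform.

```shell
# `pip install sage` pulls an unrelated placeholder package, not SageMath.
# Install SageMath from conda-forge instead:
conda install -c conda-forge sage
```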

Thank you for your answer!
Can you provide the complete synthetic dataset and the real-world point cloud datasets for training and evaluation?

https://github.com/chenzhaiyu/points2poly#reconstruction-from-custom-point-clouds
Is "1.ply" in the folder "00_base_pc" a point cloud or a mesh? My own .ply file contains points; I then rebuilt following the instructions. There was no error and I got an "eval" folder, but there is no "reconstructed" folder.

We're currently preparing city-wide synthetic data, so stay tuned! The real-world point clouds are courtesy of the first author of https://www.mdpi.com/2072-4292/13/6/1107; please get in touch directly for the data.

Data in 00_base_pc should be point clouds, as indicated by the folder name. You're likely running into the same issue as #7.
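A quick way to verify that a file in 00_base_pc is a point cloud rather than a mesh is to look for a face element in the PLY header. A minimal sketch using only the standard library (not part of the project):

```python
def ply_is_mesh(path):
    """Return True if a .ply file declares non-empty face data, i.e. a mesh.

    Only the ASCII header is inspected, so this also works for binary PLY
    files, whose headers are still plain text.
    """
    with open(path, "rb") as f:
        for raw in f:
            line = raw.decode("ascii", errors="ignore").strip()
            if line == "end_header":
                break
            if line.startswith("element face"):
                return int(line.split()[-1]) > 0
    return False
```

Files under 00_base_pc should make this return False.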

Excuse me, how can I retrain on my own dataset? Thank you for your reply.

Have you prepared your training data?

Prepare meshes and place them under datasets/{dataset_name}, mimicking the structure of the provided data. Refer to these instructions for creating training data through BlenSor simulation.

Then you can follow the README of Points2Surf for training the SDF.

I prepared the grid data and put it in the corresponding folder, but there is an error when running the reconstruction:
[screenshot: error message]

I don't know whether it is related to my own dataset; my building point clouds have many windows. Another question: should "00_base_pc" contain my points (below), or do I need to convert them to meshes with Poisson reconstruction before putting them in this folder?
[screenshot: point cloud]

@Hxinyue Hello, could you please share your dataset related to buildings? Thank you very much.

Sure. What's your email address?

One option would be to reduce the minimum number of support points. You can also play around with the other parameters while visualizing the corresponding planar segments, to make sure they look okay before they are passed to the reconstruction.
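To illustrate the effect of that threshold (the names here are hypothetical, not the project's API): each candidate plane survives only if enough points support it, so lowering the minimum keeps smaller segments at the risk of admitting noise.

```python
def filter_segments(segments, min_support):
    """Keep planar segments whose inlier count meets the minimum support."""
    return [s for s in segments if s["num_inliers"] >= min_support]

# Toy example: a large roof plane and a small dormer plane.
segments = [
    {"name": "roof", "num_inliers": 5000},
    {"name": "dormer", "num_inliers": 80},
]
# min_support=100 drops the small dormer plane; min_support=50 keeps it.
```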

Thank you very much for your reply! I'll try again.

Same error here. I have trained a points2surf model on my data, tried different parameters for primitive extraction, set append_bottom to true, and reduced the coefficient, but I still get "no unreachable cells. aborting". Can you suggest any way to obtain a correct reconstruction?

Hi, thanks for providing the info. Several factors could be the reason:

  • Are your point clouds heavily incomplete? In that case the points2surf occupancy indicator may not be an optimal one;
  • Did the points2surf training converge? This can be related to the previous factor;
  • Have you visualized (at least some of) the primitives to make sure they are okay? For example, if a principal plane is missed, the reconstruction might fail.
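The last point can be made concrete with a rough, hypothetical sanity check (an assumption on my part, not part of the pipeline): if the extracted planes jointly cover only a small fraction of the input points, a principal plane was probably missed.

```python
def plane_coverage(total_points, inliers_per_plane):
    """Fraction of input points assigned to some extracted plane.

    Assumes each point belongs to at most one segment. The 0.8 threshold
    below is an illustrative rule of thumb, not a value from the paper.
    """
    return sum(inliers_per_plane) / total_points

coverage = plane_coverage(10000, [4000, 3000, 500])  # -> 0.75
if coverage < 0.8:
    print("low coverage: consider revisiting the extraction parameters")
```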

The output scene_dense.ply from OpenMVS after DensifyPointCloud can have primitives extracted:
[screenshot: dense point cloud]
It has 1,104,503 vertices in total; after primitive extraction:
[screenshot: extracted primitives]
However, the output scene_dense_mesh_refine.ply from OpenMVS after RefineMesh is a mesh, and primitives cannot be extracted from it:
[screenshot: refined mesh]
My intention is to compare a point cloud reconstruction by your work, with some plane prior information, against the OpenMVS reconstruction. Does my point cloud meet the expected standard? I will also check the points2surf training and visualization.

Hi @Gotta-C, I guess the point cloud per se is okayish as input in terms of the extracted planes. How about the other data in your dataset? It is still possible that the missing points would hinder the points2surf prediction because points2surf to some extent relies on local patches (though it worked on buildings without bottoms).

Thanks for the reply. I have tried different scenes (a hall, a small room) and get the same problem. Compared to mine, the *.vg files in your dataset seem to have fewer than 40,000 vertices and about 15 planes, and the points look very clean. Is it possible the problem is caused by the scene or point cloud being too large?

That is possible when you use the pre-trained model, but as I understand it you trained your own model on your own dataset. Again, you may check the points2surf training to see if it did converge. As a sanity check, also try to overfit to one example, i.e., train on a single example and then reconstruct that same one.
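One way to make the convergence check concrete is a crude test over a recorded loss history (a sketch of my own, not part of points2surf): compare the mean loss of the last few epochs against the window before it.

```python
def has_converged(losses, window=5, tol=1e-3):
    """True if the mean loss stopped improving across two adjacent windows."""
    if len(losses) < 2 * window:
        return False
    prev = sum(losses[-2 * window:-window]) / window
    last = sum(losses[-window:]) / window
    return prev - last < tol
```

When overfitting to a single example, the final loss should also be close to zero; converging to a plateau at a high value points to a training problem rather than a reconstruction one.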

@Hxinyue Hello, how did you model point clouds of buildings without bottoms? Can you share your trained model? I can't find the corresponding training data when training my own model :( . Thank you very much.

I'm sorry, I also encountered this problem of completing the reconstruction on point cloud data of buildings without bottoms. If I make progress, I would be happy to share it with you.

The Helsinki dataset was updated with f194e87.