BlueBrain / Ultraliser

Reconstruction of watertight meshes, annotated volumes and center line skeletons of neuroscience spatial structures from non-watertight inputs, segmented masks, skeletons of NGV morphologies and volumes.

Home Page: https://portal.bluebrain.epfl.ch

Implement edge collapse-based mesh simplification.

marwan-abdellah opened this issue

To reduce the mesh size while preserving its topology, we should implement the edge-collapse algorithm, driven by a cost function, in Ultraliser.

An existing implementation can be found here: https://github.com/andandandand/progressive-mesh-reduction-with-edge-collapse.
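As a starting point, here is a minimal, self-contained sketch of the idea (not Ultraliser's or the linked repository's actual code): edges are repeatedly collapsed in order of a cost function, with plain edge length standing in for the quadric error metric a real implementation would use.

```cpp
#include <algorithm>
#include <array>
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <limits>
#include <vector>

struct Vec3 { float x, y, z; };

struct Mesh
{
    std::vector<Vec3> vertices;
    std::vector<std::array<uint32_t, 3>> triangles;
};

static float edgeLength(const Vec3& a, const Vec3& b)
{
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Collapses the cheapest edge (here: the shortest one): both endpoints are
// merged at the edge midpoint, every reference to the discarded endpoint is
// remapped, and triangles that became degenerate are dropped. The discarded
// vertex simply becomes unreferenced. Returns false when nothing can be collapsed.
static bool collapseCheapestEdge(Mesh& mesh)
{
    if (mesh.triangles.empty())
        return false;

    uint32_t keep = 0, drop = 0;
    float bestCost = std::numeric_limits<float>::max();
    for (const auto& t : mesh.triangles)
    {
        for (int i = 0; i < 3; ++i)
        {
            const uint32_t a = t[i], b = t[(i + 1) % 3];
            const float cost = edgeLength(mesh.vertices[a], mesh.vertices[b]);
            if (cost < bestCost) { bestCost = cost; keep = a; drop = b; }
        }
    }

    Vec3& va = mesh.vertices[keep];
    const Vec3& vb = mesh.vertices[drop];
    va = { 0.5f * (va.x + vb.x), 0.5f * (va.y + vb.y), 0.5f * (va.z + vb.z) };

    for (auto& t : mesh.triangles)
        for (auto& index : t)
            if (index == drop) index = keep;

    // Remove triangles that now reference the same vertex twice.
    mesh.triangles.erase(
        std::remove_if(mesh.triangles.begin(), mesh.triangles.end(),
                       [](const std::array<uint32_t, 3>& t)
                       { return t[0] == t[1] || t[1] == t[2] || t[0] == t[2]; }),
        mesh.triangles.end());
    return true;
}

int main()
{
    // A tiny square made of two triangles, just to exercise the loop.
    Mesh mesh;
    mesh.vertices  = { {0, 0, 0}, {1, 0, 0}, {1, 1, 0}, {0, 1, 0} };
    mesh.triangles = { {0, 1, 2}, {0, 2, 3} };

    while (mesh.triangles.size() > 1 && collapseCheapestEdge(mesh))
        ;

    std::printf("triangles left: %zu\n", mesh.triangles.size());
    return 0;
}
```

The linear scan for the cheapest edge is only for brevity; a real implementation would keep the candidate edges in a priority queue keyed by collapse cost and update it incrementally after each collapse.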

@NadirRoGue I have checked the implementation. It works very well and preserves the structure better than the current decimation algorithm. The main issue I can see is the number of iterations: in certain cases, when this number is set to 1000 for example, the resulting mesh ends up with more edges. So I still do not understand what exactly this parameter means.

As you can see in the image below, the input mesh is 1.8 MB, but the mesh decimated with 1000 iterations is 2.3 MB and the one decimated with 5000 iterations is 2.0 MB. I therefore still do not understand how new edges can be created if the algorithm is supposed to collapse vertices (and consequently edges) in every iteration.

(screenshot: file sizes of the input mesh and the decimated meshes)

P.S. Let's have a look when you are back in the office. It is not urgent.

Regarding the iterations parameter, the name might not be very clear: it is the maximum number of edges that will be removed (the algorithm may not be able to collapse any further before reaching that quantity).
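In other words (a small sketch reusing the hypothetical `collapseCheapestEdge` helper from the snippet above), the parameter would behave as an upper bound rather than a guaranteed count:

```cpp
// "iterations" caps the number of collapses; the loop stops early as soon as
// no edge can be collapsed any more, so fewer collapses may actually happen.
size_t simplify(Mesh& mesh, size_t maxCollapses)
{
    size_t performed = 0;
    while (performed < maxCollapses && collapseCheapestEdge(mesh))
        ++performed;
    return performed;  // may be smaller than maxCollapses
}
```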

As for why the results are bigger in size, I will check the code to see if I'm saving data that should not be there (indeed, on every iteration it will remove 2 triangles, 1 edge and 1 vertex).
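For reference, a minimal back-of-the-envelope check, assuming the standard bookkeeping for an interior edge collapse on a closed manifold triangle mesh (1 vertex, 3 edges and 2 triangles removed per collapse, which differs slightly from the counts stated above); the element counts below are made up. Under that assumption every count strictly decreases, so a larger output file would indeed point to extra data being written out rather than to new geometry.

```cpp
#include <cstdio>

int main()
{
    // Example counts only, not taken from the meshes in this thread.
    long vertices = 100000, edges = 300000, triangles = 200000;
    const long collapses = 5000;  // the "iterations" parameter

    // Each interior collapse: -1 vertex, -3 edges, -2 triangles,
    // so V - E + F stays constant while the mesh shrinks.
    vertices  -= collapses;
    edges     -= 3 * collapses;
    triangles -= 2 * collapses;

    std::printf("expected after %ld collapses: V=%ld E=%ld F=%ld\n",
                collapses, vertices, edges, triangles);
    return 0;
}
```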