QiujieDong / Laplacian2Mesh

Laplacian2Mesh: Laplacian-Based Mesh Understanding

Home Page: https://qiujiedong.github.io/publications/Laplacian2Mesh/

This repository is the official PyTorch implementation of our paper, Laplacian2Mesh: Laplacian-Based Mesh Understanding.

News

  • 🔥 Our paper was accepted by IEEE TVCG.
  • ⭐ We gave a talk on Laplacian2Mesh at CVM 2023.

Requirements

  • Python 3.7
  • CUDA 11.3
  • PyTorch 1.10.0

To install the other Python requirements:

pip install -r requirements.txt

Installation

Clone this repo:

git clone https://github.com/QiujieDong/Laplacian2Mesh.git
cd Laplacian2Mesh

Fetch Data

This repo provides training scripts for classification and segmentation on the following datasets:

  • SHREC-11
  • manifold40
  • humanbody
  • coseg_aliens
  • coseg_chairs
  • coseg_vases

To download the preprocessed data, run:

sh ./scripts/<DATASET_NAME>/get_data.sh

The coseg_aliens, coseg_chairs, and coseg_vases datasets are all downloaded via the coseg_aliens script. This repo uses the original Manifold40 dataset, without re-meshing via Loop subdivision.

Preprocessing

To generate the input features via preprocessing, run:

sh ./scripts/<DATASET_NAME>/prepaer_data.sh

Preprocessing only needs to be run once per dataset.
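For intuition, the core idea behind the preprocessing is to project mesh signals onto a low-frequency eigenbasis of the mesh Laplacian. Below is a minimal, illustrative sketch using a uniform graph Laplacian on a tetrahedron; this is an assumption-laden simplification (the paper uses a cotangent Laplacian with geometric weights, and the function name here is hypothetical):

```python
import numpy as np

def mesh_graph_laplacian(faces, n_vertices):
    """Uniform (combinatorial) graph Laplacian L = D - W built from
    triangle edges. Illustrative only: the actual preprocessing uses a
    cotangent Laplacian, not this unweighted version."""
    W = np.zeros((n_vertices, n_vertices))
    for a, b, c in faces:
        for i, j in ((a, b), (b, c), (c, a)):
            W[i, j] = W[j, i] = 1.0
    return np.diag(W.sum(axis=1)) - W

# Tetrahedron: 4 vertices, 4 triangular faces (complete graph K4).
faces = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
L = mesh_graph_laplacian(faces, 4)
vals, vecs = np.linalg.eigh(L)    # eigenvalues in ascending order
spectral_coords = vecs[:, 1:3]    # low-frequency spectral embedding
# vals ≈ [0, 4, 4, 4]; the single zero eigenvalue reflects a connected mesh.
```

Vertex features projected onto the first few columns of `vecs` give a compact, multi-resolution spectral representation of the shape.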

Training

To train the model on the datasets used in this paper, run:

sh ./scripts/<DATASET_NAME>/train.sh

The training process is time-consuming; you can refer to DiffusionNet for ideas on optimizing the code to speed up training.
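One concrete speed-up in that spirit is to precompute the per-mesh spectral operators once and cache them on disk, so every epoch reuses them instead of recomputing. A hedged sketch (the helper name and cache layout are hypothetical, not this repo's API):

```python
import os
import tempfile
import numpy as np

def cached_eigenbasis(cache_dir, mesh_id, compute_fn):
    """Return (vals, vecs) for a mesh, computing them at most once and
    reusing the on-disk .npz copy on subsequent calls."""
    os.makedirs(cache_dir, exist_ok=True)
    path = os.path.join(cache_dir, f"{mesh_id}.npz")
    if os.path.exists(path):
        data = np.load(path)
        return data["vals"], data["vecs"]
    vals, vecs = compute_fn()  # the expensive eigendecomposition
    np.savez(path, vals=vals, vecs=vecs)
    return vals, vecs

# Usage: the second call hits the cache and skips the decomposition.
calls = []
def compute():
    calls.append(1)
    L = np.array([[1.0, -1.0], [-1.0, 1.0]])  # toy 2-vertex Laplacian
    return np.linalg.eigh(L)

with tempfile.TemporaryDirectory() as d:
    v1, _ = cached_eigenbasis(d, "mesh0", compute)
    v2, _ = cached_eigenbasis(d, "mesh0", compute)
```

Since the eigendecomposition depends only on the mesh, not on the network weights, this cache stays valid across training runs.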

Evaluation

To evaluate the model on a dataset, run:

sh ./scripts/<DATASET_NAME>/test.sh

Visualize

After testing the segmentation network, the colored shapes are written to the visualization_result directory.
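As an illustration of how per-face segmentation labels become a colored shape, here is a minimal ASCII PLY writer. This is a hypothetical helper for the sketch; the repo's actual output format and file naming may differ:

```python
def write_colored_ply(path, verts, faces, face_labels, palette):
    """Write an ASCII PLY with one RGB color per face, so segmentation
    labels are visible in any standard mesh viewer (e.g. MeshLab)."""
    with open(path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {len(verts)}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write(f"element face {len(faces)}\n")
        f.write("property list uchar int vertex_indices\n")
        f.write("property uchar red\nproperty uchar green\nproperty uchar blue\n")
        f.write("end_header\n")
        for x, y, z in verts:
            f.write(f"{x} {y} {z}\n")
        for (a, b, c), lab in zip(faces, face_labels):
            r, g, b_ = palette[lab]
            f.write(f"3 {a} {b} {c} {r} {g} {b_}\n")

# Two-triangle quad with two segment labels (red and blue).
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
faces = [(0, 1, 2), (1, 3, 2)]
palette = {0: (255, 0, 0), 1: (0, 0, 255)}
write_colored_ply("seg_demo.ply", verts, faces, [0, 1], palette)
```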

Cite

If you find our work useful for your research, please consider citing the following papers :)

@article{dong2023laplacian2mesh,
  title={Laplacian2mesh: Laplacian-based mesh understanding},
  author={Dong, Qiujie and Wang, Zixiong and Li, Manyi and Gao, Junjie and Chen, Shuangmin and Shu, Zhenyu and Xin, Shiqing and Tu, Changhe and Wang, Wenping},
  journal={IEEE Transactions on Visualization and Computer Graphics},
  year={2023},
  publisher={IEEE}
}

Acknowledgments

Our code is inspired by MeshCNN and SubdivNet.

License: MIT