
Code for "NeuMesh: Learning Disentangled Neural Mesh-based Implicit Field for Geometry and Texture Editing", ECCV 2022 Oral

Home Page: https://zju3dv.github.io/neumesh/


NeuMesh: Learning Disentangled Neural Mesh-based Implicit Field for Geometry and Texture Editing


Bangbang Yang, Chong Bao (co-first authors), Junyi Zeng, Hujun Bao, Yinda Zhang, Zhaopeng Cui, Guofeng Zhang.

ECCV 2022 Oral

⚠️ Note: This is only a preview version of the code. Full code (with training scripts) will be released soon.

Installation

We have tested the code with Python 3.8.0 and PyTorch 1.8.1; a newer version of PyTorch should also work. The installation steps are as follows:

  • create the virtual environment: conda env create --file environment.yml
  • install pytorch 1.8.1: pip install torch==1.8.1+cu111 torchvision==0.9.1+cu111 -f https://download.pytorch.org/whl/torch_stable.html
  • install the Open3D development version: pip install [open3d development package url]
  • install FRNN, a fixed-radius nearest-neighbor search implemented in CUDA.
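For reference, FRNN answers fixed-radius nearest-neighbor queries on the GPU. The query it accelerates can be sketched in a few lines of brute-force NumPy (this is only an illustration of the operation, not the CUDA implementation, and the function name is ours):

```python
import numpy as np

def fixed_radius_neighbors(queries, points, radius, k):
    """For each query point, return indices of up to k points within radius.

    Brute-force NumPy sketch of the query FRNN accelerates on CUDA with a
    spatial grid; rows are padded with -1 when fewer than k neighbors exist.
    """
    # Pairwise distances between queries (Q, 3) and points (P, 3) -> (Q, P).
    d = np.linalg.norm(queries[:, None, :] - points[None, :, :], axis=-1)
    neighbors = np.full((len(queries), k), -1, dtype=np.int64)
    for i, row in enumerate(d):
        idx = np.where(row <= radius)[0]      # candidates inside the radius
        idx = idx[np.argsort(row[idx])][:k]   # keep the k closest of them
        neighbors[i, :len(idx)] = idx
    return neighbors
```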

Data

We use the DTU data in its NeuS version. Please specify data_dir in the configs/.yaml before evaluation.
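The relevant part of a config might look like the following sketch; only data_dir is mentioned in this README, so the surrounding keys and the path are purely illustrative:

```yaml
# Hypothetical config fragment -- only `data_dir` is documented here.
data:
  data_dir: /path/to/DTU/dtu_scan63   # point this at your NeuS-format DTU scan
```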

Evaluation

Here we provide a pre-trained model of DTU scan 63.

Novel view synthesis

You can render novel views with the provided pre-trained model.

python -m render --config configs/neumesh_dtu_scan63.yaml   --load_pt ./checkpoints/dtu_scan63/latest.pt --camera_path spiral --num_views 90 --background 1 --dataset_split entire --test_frame 24 --spiral_rad 1.2
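The --camera_path spiral option renders --num_views views along a spiral of radius --spiral_rad around the object. A hypothetical sketch of how such camera centers can be generated (the repository's actual trajectory code may differ):

```python
import numpy as np

def spiral_camera_centers(num_views=90, rad=1.2, height_amp=0.3, turns=2.0):
    """Generate camera centers on a spiral around the z-axis.

    Illustrative stand-in for a `spiral` camera path controlled by
    `--num_views` and `--spiral_rad`; not the repository's exact code.
    """
    t = np.linspace(0.0, 2.0 * np.pi * turns, num_views)
    x = rad * np.cos(t)                  # circle of radius `rad` in xy
    y = rad * np.sin(t)
    z = height_amp * np.sin(t / turns)   # gentle vertical sway
    return np.stack([x, y, z], axis=-1)  # (num_views, 3)
```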

Texture Swapping

You can perform texture swapping with the provided config to swap the textures of the red and gold apples in DTU scan 63.

python -m render_texture_swapping --config configs/texture_swapping_dtu_scan63.json --camera_path spiral --rayschunk 1024 --downscale 4 --num_views 90 --edit_method code --dataset_split entire --outdirectory texture_swapping --test_frame 24 --spiral_rad 1.2
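NeuMesh disentangles per-vertex geometry and texture codes, so swapping a region's appearance amounts to exchanging texture codes while leaving geometry codes untouched (the --edit_method code flag). A hypothetical sketch of that idea; the function, mask representation, and equal-size assumption are ours, not the repository's API:

```python
import numpy as np

def swap_texture_codes(tex_codes_a, tex_codes_b, mask_a, mask_b):
    """Swap texture latent codes between two selected vertex regions.

    Illustrative code-based texture swap between two models (e.g. a red
    and a gold apple): only texture codes inside the boolean masks are
    exchanged. Assumes both masks select the same number of vertices.
    """
    a, b = tex_codes_a.copy(), tex_codes_b.copy()
    # RHS is taken from the originals, so the exchange is symmetric.
    a[mask_a], b[mask_b] = tex_codes_b[mask_b], tex_codes_a[mask_a]
    return a, b
```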

Citing

@inproceedings{neumesh,
    title={NeuMesh: Learning Disentangled Neural Mesh-based Implicit Field for Geometry and Texture Editing},
    author={{Chong Bao and Bangbang Yang} and Zeng Junyi and Bao Hujun and Zhang Yinda and Cui Zhaopeng and Zhang Guofeng},
    booktitle={European Conference on Computer Vision (ECCV)},
    year={2022}
}

Note: joint first-authorship is not really supported in BibTeX; you may need to modify the above if you are not using CVPR's format. For the SIGGRAPH (or ACM) format you can try the following:

@inproceedings{neumesh,
    title={NeuMesh: Learning Disentangled Neural Mesh-based Implicit Field for Geometry and Texture Editing},
    author={{Bao and Yang} and Zeng Junyi and Bao Hujun and Zhang Yinda and Cui Zhaopeng and Zhang Guofeng},
    booktitle={European Conference on Computer Vision (ECCV)},
    year={2022}
}

Acknowledgement

In this project we use parts of the implementations of the following works:

We thank the respective authors for open sourcing their methods.


License: MIT License

