aipixel / GPS-Gaussian

[CVPR 2024 Highlight] The official repo for “GPS-Gaussian: Generalizable Pixel-wise 3D Gaussian Splatting for Real-time Human Novel View Synthesis”

Home Page: https://shunyuanzheng.github.io/GPS-Gaussian



Render a ply mesh with texture using your render code

jinnan-chen opened this issue

Hi, thanks for your great work!
I want to render a mesh.ply whose texture is embedded in the .ply file. How should I modify render_data.py to add the model to the scene? Thanks!

Hi, thanks for your interest.

I think GPS-Gaussian is not suitable for joint rendering with the scene, since the unprojected Gaussian points form only a partial representation of the human. However, you can try saving the Gaussian points in the 3DGS format and viewing them in SIBR. Note that the scale needs a log function and the opacity requires an inverse sigmoid function, as here.
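A minimal sketch of those two mappings, assuming PyTorch tensors holding the activated values (the function and tensor names are hypothetical, not from the repo):

    import torch

    def inverse_sigmoid(x, eps=1e-6):
        # invert the sigmoid that 3DGS applies to stored opacities at load time
        x = torch.clamp(x, eps, 1.0 - eps)
        return torch.log(x / (1.0 - x))

    def to_3dgs_storage(scale, opacity):
        # 3DGS stores log-scale (exp() is applied on load) and
        # pre-sigmoid opacity (sigmoid() is applied on load)
        return torch.log(scale), inverse_sigmoid(opacity)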

Hi, thanks for replying!
In fact, my question is not about GPS-Gaussian itself; it is about the data preparation for rendering multi-view images from the original mesh.

Sorry for misunderstanding the question.

You can load the .ply file with trimesh and export a .obj file with a texture map. Then you can render them like the human scans in render_data.py. If you want to add both the human scan and the scene, modify these lines as follows:

    if len(renderer.scene.models) >= 2:
        # both models already registered: update them in place
        renderer.modify_model(0, obj, texture)
        renderer.modify_model(1, scene_obj, scene_texture)
    else:
        # first call: register the human scan and the scene
        renderer.add_model(obj, texture)
        renderer.add_model(scene_obj, scene_texture)

Thanks, but how can I load the .ply files with trimesh and export .obj files with texture maps in a script, for a large number of mesh.ply files?

If the material is included in the .ply file, just load and save it with trimesh. However, if it is not included, I think you need some other tools to process the data. You can use MeshLab to check whether the material is included.
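A minimal batch-conversion sketch, assuming the texture is embedded in each .ply (check in MeshLab first); the directory paths are hypothetical, and trimesh writes a companion .mtl and texture image next to each exported .obj:

    import glob
    import os

    import trimesh

    src_dir = "path/to/ply_scans"  # hypothetical folder of textured .ply scans
    dst_dir = "path/to/obj_scans"  # hypothetical output folder
    os.makedirs(dst_dir, exist_ok=True)

    for ply_path in sorted(glob.glob(os.path.join(src_dir, "*.ply"))):
        mesh = trimesh.load(ply_path, process=False)  # keep vertices/UVs as-is
        name = os.path.splitext(os.path.basename(ply_path))[0]
        mesh.export(os.path.join(dst_dir, name + ".obj"))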

However, meshes from the 2K2K dataset cannot be processed in this way, yet it seems these meshes were rendered and tested in the paper. I wonder how they were rendered with the provided code?

Sorry, I have not actually used the 2K2K data for training, so I did not notice that it was not feasible to render it with the provided code. I think you can use the THuman2.1 data as an alternative, since it has been extended from 500 to 2500 models.

It seems that it has not been released yet.