predmach / DeepHandMesh

Official PyTorch implementation of "DeepHandMesh: A Weakly-Supervised Deep Encoder-Decoder Framework for High-Fidelity Hand Mesh Modeling," ECCV 2020

DeepHandMesh: A Weakly-Supervised Deep Encoder-Decoder Framework for High-Fidelity Hand Mesh Modeling

Introduction

This repo is the official PyTorch implementation of DeepHandMesh: A Weakly-Supervised Deep Encoder-Decoder Framework for High-Fidelity Hand Mesh Modeling (ECCV 2020, oral).

Demo

  • Download the pre-trained DeepHandMesh from here and place it in the demo folder; the filename is snapshot_${EPOCH}.pth.tar.
  • Download the hand model from here and place it in the data folder.
  • Set the hand joint Euler angles here.
  • Run python demo.py --gpu 0 --test_epoch ${EPOCH}.
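The ${EPOCH} placeholder above maps directly to a checkpoint filename. A minimal sketch of that convention (the helper name is illustrative, not part of the repo):

```python
import os

def demo_snapshot_path(epoch, demo_dir="demo"):
    """Build the checkpoint path snapshot_${EPOCH}.pth.tar inside the demo folder."""
    return os.path.join(demo_dir, f"snapshot_{epoch}.pth.tar")
```

For example, running the demo with --test_epoch 4 expects the file at demo_snapshot_path(4), i.e. demo/snapshot_4.pth.tar on POSIX systems.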

DeepHandMesh dataset (RGB images are not prepared yet)

  • For the DeepHandMesh dataset download and instructions, go to [HOMEPAGE].
  • Below are instructions for using DeepHandMesh for weakly-supervised, high-fidelity 3D hand mesh modeling.

Directory

Root

The ${ROOT} directory is organized as below.

${ROOT}
|-- data
|-- common
|-- main
|-- output
|-- demo
  • data contains data loading codes and soft links to images and annotations directories.
  • common contains kernel codes.
  • main contains high-level codes for training or testing the network.
  • output contains logs, trained models, visualized outputs, and test results.
  • demo contains demo codes.

Data

You need to follow the directory structure of the data folder as below.

${ROOT}
|-- data
|   |-- hand_model
  • Download datasets and hand model from [HOMEPAGE].

Output

You need to follow the directory structure of the output folder as below.

${ROOT}
|-- output
|   |-- log
|   |-- model_dump
|   |-- result
|   |-- vis
  • log folder contains the training log files.
  • model_dump folder contains saved checkpoints for each epoch.
  • result folder contains final estimation files generated in the testing stage.
  • vis folder contains visualized results.
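The output tree can be created up front. A small sketch using only the standard library (ROOT is set to the current directory as an example; adjust it to your checkout location):

```python
import os

ROOT = "."  # example value; set this to your ${ROOT}
for sub in ("log", "model_dump", "result", "vis"):
    # exist_ok avoids errors when re-running after a partial setup
    os.makedirs(os.path.join(ROOT, "output", sub), exist_ok=True)
```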

Running DeepHandMesh

Prerequisites

  • For training, install the neural renderer from here.
  • After installing it, uncomment line 12 of main/model.py (from nets.DiffableRenderer.DiffableRenderer import RenderLayer) and line 40 of main/model.py (self.renderer = RenderLayer()).
  • If you only want to run testing, you do not have to install it.
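Once uncommented, the two lines in main/model.py read as follows (quoted from the bullets above; this fragment only runs inside the repo, since the nets.DiffableRenderer module ships with it):

```python
# main/model.py, line 12 (per the repo's line numbering):
from nets.DiffableRenderer.DiffableRenderer import RenderLayer

# main/model.py, line 40, inside the model class (attribute set on self):
self.renderer = RenderLayer()
```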

Start

  • You can change the model settings in main/config.py.

Train

In the main folder, run

python train.py --gpu 0-3

to train the network on GPUs 0, 1, 2, and 3. --gpu 0,1,2,3 can be used instead of --gpu 0-3. Use --continue to resume training.
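Both --gpu forms describe the same device set. A hypothetical helper (not the repo's actual argument parser) showing how a spec like 0-3 or 0,1,2,3 expands:

```python
def expand_gpu_spec(spec):
    """Expand a GPU spec like '0-3' or '0,1,2,3' into a list of GPU ids."""
    if "-" in spec:
        lo, hi = spec.split("-")
        return list(range(int(lo), int(hi) + 1))
    return [int(g) for g in spec.split(",")]

# Both forms yield the same ids:
# expand_gpu_spec("0-3") and expand_gpu_spec("0,1,2,3") -> [0, 1, 2, 3]
```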

Test

Place the trained model at output/model_dump/.

In the main folder, run

python test.py --gpu 0-3 --test_epoch 4

to test the network on GPUs 0, 1, 2, and 3 with snapshot_4.pth.tar. --gpu 0,1,2,3 can be used instead of --gpu 0-3.

Results

Here I report the results of DeepHandMesh and the pre-trained DeepHandMesh.

Pre-trained DeepHandMesh

Effect of Identity- and Pose-Dependent Correctives

Comparison with MANO

Reference

@InProceedings{Moon_2020_ECCV_DeepHandMesh,  
author = {Moon, Gyeongsik and Shiratori, Takaaki and Lee, Kyoung Mu},  
title = {DeepHandMesh: A Weakly-supervised Deep Encoder-Decoder Framework for High-fidelity Hand Mesh Modeling},  
booktitle = {European Conference on Computer Vision (ECCV)},  
year = {2020}  
}  

License

DeepHandMesh is CC-BY-NC 4.0 licensed, as found in the LICENSE file.
