wangxiao5791509 / Point-Transformers

Point Transformers

PyTorch Implementation of Various Point Transformers

Recently, various methods have applied transformers to point clouds: PCT: Point Cloud Transformer (Meng-Hao Guo et al.), Point Transformer (Nico Engel et al.), and Point Transformer (Hengshuang Zhao et al.). This repo is a PyTorch implementation of these methods and aims to compare them under a fair setting. All three methods are currently implemented; their hyperparameters are still being tuned.

Classification

Environment

```
conda create -n pointtransformer python=3.7
conda activate pointtransformer
pip install omegaconf tqdm -i https://pypi.tuna.tsinghua.edu.cn/simple
pip install hydra-core --upgrade --pre
conda install pytorch=1.2.0 torchvision cudatoolkit=10.0 -c pytorch
```

Data Preparation

Download the aligned ModelNet dataset here and save it in `modelnet40_normal_resampled/`.

Run

Select which method to use in `config/config.yaml` and run:

```
python train.py
```
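As a rough illustration, the relevant part of `config/config.yaml` might look like the sketch below. The key name `model` and the value format are assumptions, not verified against the repo; check the actual file for the exact schema.

```yaml
# Hypothetical sketch of config/config.yaml — key names may differ
# from the actual file in the repo.
model: Hengshuang   # one of: Hengshuang, Menghao, Nico
```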

Results

Training uses Adam for 200 epochs, with the learning rate decayed by a factor of 0.3 every 50 epochs; data augmentation follows this repo. For Hengshuang and Nico, the initial LR is 1e-3 (may be fine-tuned later); for Menghao, it is 1e-4, as suggested by the author. ModelNet40 classification results (instance-average accuracy, %) are listed below:

| Model      | Accuracy (%) |
|------------|--------------|
| Hengshuang | 89.6         |
| Menghao    | 92.6         |
| Nico       | 85.5         |
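The step-decay schedule described above (LR multiplied by 0.3 every 50 epochs) can be sketched as a small pure-Python function. This mirrors the behavior of `torch.optim.lr_scheduler.StepLR(step_size=50, gamma=0.3)`; the function name `lr_at_epoch` is ours for illustration, not from the repo.

```python
def lr_at_epoch(initial_lr: float, epoch: int,
                step_size: int = 50, gamma: float = 0.3) -> float:
    """Learning rate at a given epoch under step decay:
    multiplied by `gamma` once every `step_size` epochs."""
    return initial_lr * gamma ** (epoch // step_size)

# Hengshuang / Nico start at 1e-3; Menghao starts at 1e-4.
for epoch in (0, 50, 100, 199):
    print(epoch, lr_at_epoch(1e-3, epoch))
```

Over the 200-epoch run this gives three decay steps, so the final LR is the initial LR times 0.3³.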

Miscellaneous

Some code and training settings are borrowed from https://github.com/yanx27/Pointnet_Pointnet2_pytorch. Code for PCT: Point Cloud Transformer (Meng-Hao Guo et al.) is adapted from the author's Jittor implementation https://github.com/MenghaoGuo/PCT.
