aRI0U / RandLA-Net-pytorch

PyTorch implementation of RandLA-Net


About the parameter of batch size

shnhrtkyk opened this issue · comments

commented

First of all, thanks for the implementation.
I found one point of concern.
There are two definitions of the batch size: a command-line option for train.py, and a config file (`from utils.tools import Config as cfg`) used in data.py.

Hi, thanks for your interest!

Yes, the fact that there are both command-line arguments and a config file is not very clean... Basically, our code is supposed to work with command-line arguments, but we also copy-pasted some portions of the original implementation of RandLA-Net, in particular their Config file. We progressively removed the parameters from this config file (which is why half of it is commented out) but did not take the time to remove it properly (shame on us). This explains the redundant, and probably unused, parameters in the config file.

I would say the best practice would be to use the command-line arguments instead of the ones in the config file whenever possible (and thus keep args.batch_size rather than cfg.batch_size), but feel free to do as you want.

I just modified the code so that the batch size is unambiguously determined by the command line arguments.
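For readers hitting the same question, the precedence described above could look something like the sketch below. This is not the repo's actual code: the `Config` class and its default value are illustrative stand-ins for `utils.tools.Config`, and only the batch-size field is modeled.

```python
import argparse

# Hypothetical stand-in for the repo's `utils.tools.Config`;
# only batch_size is modeled, and the value is illustrative.
class Config:
    batch_size = 6

cfg = Config()

parser = argparse.ArgumentParser()
# The config file provides only the default; any command-line value wins.
parser.add_argument("--batch_size", type=int, default=cfg.batch_size)

# Simulate `python train.py --batch_size 4`
args = parser.parse_args(["--batch_size", "4"])
print(args.batch_size)  # 4, not the config default of 6

# Downstream code (e.g. the DataLoader setup) should then read only
# args.batch_size, never cfg.batch_size.
```

With this pattern the config file can stay around during cleanup without ever silently overriding what the user passed on the command line.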

commented

Thanks‼️