Hyperparameter optimization with MPI

mpi_opt

Bayesian optimization on supercomputer with MPI

Alpha instructions

Install the software together with mpi_learn:

git clone git@github.com:thongonary/mpi_opt.git
git clone git@github.com:svalleco/mpi_learn.git
cd mpi_opt
ln -s ../mpi_learn/mpi_learn

To run the mnist example (you first need to get the MNIST data file using get_mnist in mpi_learn) with 4 blocks of 5 processes, for 10 epochs and 10 Bayesian optimization cycles, i.e. (1 opt master + 4 blocks x (1 master + 4 workers)) = 21 processes:

mpirun -tag-output -n 21  python3 hyperparameter_search_option3.py --block-size 5 --example mnist --epochs 10 --num-iterations 10

To run with 5-fold cross-validation, i.e. (1 opt master + 5 folds x (4 blocks x (1 master + 4 workers))) = 101 processes:

mpirun -tag-output -n 101 python3 hyperparameter_search_option3.py --block-size 5 --example mnist --epochs 10 --num-iterations 10 --n-fold 5
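The -n value passed to mpirun follows the same arithmetic in both cases. A minimal sketch of the count (the helper name n_processes is ours for illustration, not part of mpi_opt):

```python
def n_processes(num_blocks, block_size, n_fold=1):
    """Total MPI ranks: 1 optimization master plus, for each fold,
    num_blocks training blocks of block_size ranks each
    (1 block master + block_size - 1 workers)."""
    return 1 + n_fold * num_blocks * block_size

print(n_processes(4, 5))            # 21, matching -n 21 above
print(n_processes(4, 5, n_fold=5))  # 101, matching -n 101 above
```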

To run with a genetic algorithm instead of Bayesian optimization, add the --hyper-opt genetic and --ga-population options:

mpirun -tag-output -n 101 python3 hyperparameter_search_option3.py --block-size 5 --example mnist --epochs 10 --num-iterations 10 --n-fold 5 --hyper-opt genetic --ga-population 20

Note that compared to Bayesian optimization, the runtime increases by a factor of the population size. However, for the same iterations x population size, the genetic algorithm generally achieves results similar to Bayesian optimization.
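As a back-of-the-envelope check (assuming each candidate evaluation costs one training run, which is our simplification, not a statement about mpi_opt internals), the two methods evaluate comparable numbers of models when iterations x population size is held fixed:

```python
def total_evaluations(num_iterations, population=1):
    # Bayesian optimization evaluates one candidate per iteration;
    # a genetic algorithm evaluates a whole population per iteration.
    return num_iterations * population

bayesian = total_evaluations(200)               # 200 trainings
genetic = total_evaluations(10, population=20)  # 10 x 20 = 200 trainings
print(bayesian == genetic)  # True
```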
