Easy distributed hyperopt
The code in this repository lets you run distributed hyperopt with nothing more than a shared folder: every trial is stored as a file in that folder and thereby shared between nodes.
No head node, no distributed frameworks. Just a shared folder.
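How this can work, in a rough sketch (the filename scheme and JSON layout below are assumptions for illustration, not necessarily this repo's actual format): each node evaluates a trial and drops the result into the shared folder under a unique filename, so concurrent nodes never contend for the same file and no locking is needed.

```python
# Illustrative sketch only: one file per trial, named uniquely so that
# concurrent nodes never write to the same path.
import json
import os
import uuid

def record_trial(shared_folder, params, score):
    path = os.path.join(shared_folder, "trial_" + uuid.uuid4().hex + ".json")
    with open(path, "w") as f:
        json.dump({"params": params, "score": score}, f)
```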
To use it:
- Create an empty folder to store all the trial scores. The folder should be accessible from all your nodes.
- Fill in example.py with the folder name, your objective function, and the hyperparameters to optimize over (a sketch of what this might look like follows the list).
- Then, run example.py on all your nodes.
- Kill the script when you want to stop.
- Set the folder name in print_best_model.py and run it to print the best trial found so far (see the second sketch below).
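For reference, here is a hedged sketch of what a filled-in example.py might look like. The names (`SHARED_FOLDER`, `objective`, `SEARCH_SPACE`) and the random-search loop are illustrative assumptions; the repo's actual API and sampling strategy may differ. Lower scores are treated as better in this sketch.

```python
# Hypothetical filled-in example.py (illustrative names, random search).
import json
import os
import random
import uuid

SHARED_FOLDER = "/mnt/shared/hyperopt_trials"  # must be visible to every node

def objective(params):
    # Stand-in for your real training/evaluation code; lower is better.
    return (params["lr"] - 0.01) ** 2

SEARCH_SPACE = {
    "lr": lambda: 10 ** random.uniform(-4, -1),
    "batch_size": lambda: random.choice([16, 32, 64, 128]),
}

while True:  # kill the process when you have enough trials
    params = {name: sample() for name, sample in SEARCH_SPACE.items()}
    score = objective(params)
    # Unique filename per trial: no coordination needed between nodes.
    path = os.path.join(SHARED_FOLDER, "trial_" + uuid.uuid4().hex + ".json")
    with open(path, "w") as f:
        json.dump({"params": params, "score": score}, f)
```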
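And a matching sketch of print_best_model.py, again assuming the one-JSON-file-per-trial layout from the sketches above:

```python
# Hypothetical print_best_model.py: scan all trial files, report the best.
import glob
import json
import os

SHARED_FOLDER = "/mnt/shared/hyperopt_trials"  # same folder as in example.py

best = None
for path in glob.glob(os.path.join(SHARED_FOLDER, "trial_*.json")):
    with open(path) as f:
        trial = json.load(f)
    if best is None or trial["score"] < best["score"]:
        best = trial

print("Best trial so far:", best)
```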