brodderickrodriguez / hypertune

Hyperparameter tuning using Particle Swarm Optimization and parallel computation, aiming to outperform current approaches. V0.1 Beta


hypertune

A package to tune ML hyperparameters efficiently using Particle Swarm Optimization.

Please see ./examples for examples of how to use this package with your existing implementation.
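For context, the core idea behind Particle Swarm Optimization is that a swarm of candidate hyperparameter settings moves through the search space, each particle pulled toward its own best-seen point and the swarm's global best. The sketch below is a generic, self-contained illustration of that technique using only numpy; it is not the hypertune API (see ./examples for that), and all names in it are illustrative.

```python
import numpy as np

# Generic PSO sketch (not the hypertune API): each particle is a point in
# hyperparameter space; velocities blend inertia, a pull toward the particle's
# personal best, and a pull toward the swarm's global best.
def pso(objective, bounds, n_particles=20, n_iters=50,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T  # bounds: list of (low, high)
    dim = len(lo)
    pos = rng.uniform(lo, hi, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()                          # per-particle best positions
    pbest_val = np.array([objective(p) for p in pos])
    g = pbest[pbest_val.argmin()].copy()        # global best position

    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # velocity update: inertia + cognitive pull + social pull
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = np.clip(pos + vel, lo, hi)        # keep particles in bounds
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# Example: minimize a toy "validation loss" over two hyperparameters.
best, best_val = pso(lambda p: (p[0] - 0.3) ** 2 + (p[1] - 2.0) ** 2,
                     bounds=[(0.0, 1.0), (0.0, 5.0)])
```

In practice the objective would be a model's cross-validation score rather than a toy quadratic, and evaluations of the swarm can run in parallel, which is where the parallel-computation aspect of this package comes in.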

Documentation about this repository can be found here.

Requires:

  • Python>=3 (built using v3.7.4)
  • numpy (built using v1.17.3)

Installation:

  • from PyPI via PIP:
    • TBD
  • from source via PIP:
    • pip install git+https://github.com/brodderickrodriguez/hypertune.git

Acknowledgements:

  • Travis E. Oliphant. A Guide to NumPy, USA: Trelgol Publishing, (2006).

Contributors:

  • Brodderick Rodriguez (web)

About

License: Apache License 2.0


Languages

Language: Python 100.0%