balavenkatesh3322 / hyperparameter_tuning

A collection of hyperparameter tuning libraries.


Hyper-parameter Tuning Library


What is Hyper-parameter Tuning?

Parameters that define the model architecture are referred to as hyperparameters, and the process of searching for the ideal model architecture is therefore referred to as hyperparameter tuning.

For instance: how many trees should I include in my random forest? How many neurons should I have in my neural network layer? How many layers should I have in my neural network?
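The search itself can be as simple as an exhaustive sweep over a grid of candidate values. A minimal sketch of that idea, using a made-up scoring function in place of a real train-and-validate step (the parameter names and score surface here are purely illustrative):

```python
import itertools

# Toy stand-in for "train a model with these hyperparameters and
# return its validation score". Peaks at n_trees=200, max_depth=8.
def evaluate(n_trees, max_depth):
    return 1.0 - abs(n_trees - 200) / 1000 - abs(max_depth - 8) / 100

# Candidate values for each hyperparameter (hypothetical choices).
grid = {
    "n_trees": [50, 100, 200, 400],
    "max_depth": [4, 8, 16],
}

# Try every combination and keep the best-scoring one.
best_score, best_params = float("-inf"), None
for values in itertools.product(*grid.values()):
    params = dict(zip(grid.keys(), values))
    score = evaluate(**params)
    if score > best_score:
        best_score, best_params = score, params

print(best_params)  # -> {'n_trees': 200, 'max_depth': 8}
```

The libraries below replace this brute-force loop with smarter strategies (random search, Bayesian optimization, evolutionary algorithms, early-stopping schedulers), which matters when each evaluation is an expensive training run.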

| Library Name | Description | Framework |
| --- | --- | --- |
| Keras Tuner | A hyperparameter tuner for Keras, specifically for tf.keras with TensorFlow 2.0. | Keras |
| talos | Talos radically changes the ordinary Keras workflow by fully automating hyperparameter tuning and model evaluation. | Keras |
| hyperas | A very simple convenience wrapper around hyperopt for fast prototyping with Keras models. Hyperas lets you use the power of hyperopt without having to learn its syntax. | Keras |
| Library Name | Description | Framework |
| --- | --- | --- |
| Auto-PyTorch | This is a very early pre-alpha version of the upcoming Auto-PyTorch. So far, Auto-PyTorch supports featurized data (classification, regression) and image data (classification). | PyTorch |
| hypersearch | Tune the hyperparameters of your PyTorch models with HyperSearch. | PyTorch |
| botorch | BoTorch is a library for Bayesian optimization built on PyTorch. | PyTorch |
| Library Name | Description | Framework |
| --- | --- | --- |
| tpot | TPOT stands for Tree-based Pipeline Optimization Tool. Consider TPOT your data science assistant. TPOT is a Python automated machine learning tool that optimizes machine learning pipelines using genetic programming. | General |
| nni | NNI (Neural Network Intelligence) is a lightweight but powerful toolkit to help users automate feature engineering, neural architecture search, hyperparameter tuning, and model compression. | General |
| xcessiv | Xcessiv holds your hand through all the implementation details of creating and optimizing stacked ensembles so you're free to fully define only the things you care about. | General |
| ray | Ray is a fast and simple framework for building and running distributed applications. | General |
| tune-sklearn | Tune-sklearn is a package that integrates Ray Tune's hyperparameter tuning with scikit-learn's models, allowing users to optimize hyperparameter searches for scikit-learn using Tune's schedulers. | General |
| optuna | Optuna is an automatic hyperparameter optimization software framework, particularly designed for machine learning. | General |
| Hyperopt | Hyperopt is a Python library for serial and parallel optimization over awkward search spaces, which may include real-valued, discrete, and conditional dimensions. | General |
| scikit-optimize | Scikit-Optimize, or skopt, is a simple and efficient library to minimize (very) expensive and noisy black-box functions. | General |
| Ax | Ax is an accessible, general-purpose platform for understanding, managing, deploying, and automating adaptive experiments. | General |
| Spearmint | Spearmint is a software package to perform Bayesian optimization. | General |
| hyperparameter_hunter | Automatically saves and learns from experiment results, leading to long-term, persistent optimization that remembers all your tests. | General |
| sherpa | A Python hyperparameter optimization library. | General |
| auptimizer | Auptimizer is an optimization tool for machine learning (ML) that automates many of the tedious parts of the model building process. | General |
| advisor | Advisor is a hyperparameter tuning system for black-box optimization. | General |
| test-tube | Test Tube is a Python library to track and parallelize hyperparameter search for deep learning and ML experiments. | General |
| Determined | Determined helps deep learning teams train models more quickly, easily share GPU resources, and effectively collaborate. | General |

Contributions

Your contributions are always welcome! Please have a look at contributing.md.

License

MIT License