There are 28 repositories under the blackbox-optimization topic.
Python-based research interface for blackbox and hyperparameter optimization, based on the internal Google Vizier Service.
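For a sense of what the client workflow looks like, here is a minimal sketch of a suggest/evaluate/complete loop with the OSS Vizier Python client, based on the project's published README; module paths and method names may differ between versions and should be treated as assumptions.

```python
# Sketch of a suggest/evaluate/complete loop with the OSS Vizier client.
# API names follow the project's README and may vary across versions.
from vizier.service import clients
from vizier.service import pyvizier as vz

def evaluate(w: float, x: int) -> float:
    # Toy objective to maximize; replace with your real blackbox.
    return -(w - 2.5) ** 2 - (x - 1) ** 2

study_config = vz.StudyConfig(algorithm='DEFAULT')
study_config.search_space.root.add_float_param('w', 0.0, 5.0)
study_config.search_space.root.add_int_param('x', -2, 2)
study_config.metric_information.append(
    vz.MetricInformation('objective', goal=vz.ObjectiveMetricGoal.MAXIMIZE))

study = clients.Study.from_study_config(
    study_config, owner='demo', study_id='example_study')

for _ in range(10):
    for suggestion in study.suggest(count=1):
        params = suggestion.parameters
        value = evaluate(params['w'], params['x'])
        suggestion.complete(vz.Measurement({'objective': value}))
```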
Simple and reliable optimization with local, global, population-based and sequential techniques in numerical discrete search spaces.
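If this entry is the Gradient-Free-Optimizers package (an assumption based on the wording of the description), a typical usage sketch looks like the following: the search space is a dict of discrete numeric arrays and the optimizer maximizes the returned score.

```python
# Hedged sketch assuming the Gradient-Free-Optimizers package; the package
# name, API, and result attributes are assumptions based on the description.
import numpy as np
from gradient_free_optimizers import RandomSearchOptimizer

def objective(para):
    # The score is maximized, so negate a loss-like quantity.
    return -(para["x"] ** 2 + para["y"] ** 2)

search_space = {
    "x": np.arange(-10, 10, 0.1),   # discrete numerical grid
    "y": np.arange(-10, 10, 0.1),
}

opt = RandomSearchOptimizer(search_space)
opt.search(objective, n_iter=200)
print(opt.best_para, opt.best_score)   # attribute names assumed
```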
Experimental Global Optimization Algorithm
Towards Generalized and Efficient Blackbox Optimization System/Package (KDD 2021 & JMLR 2024)
Parallel Hyperparameter Tuning in Python
PRIMA is a package for solving general nonlinear optimization problems without using derivatives. It provides the reference implementation of Powell's derivative-free optimization methods, namely COBYLA, UOBYQA, NEWUOA, BOBYQA, and LINCOA. PRIMA stands for Reference Implementation for Powell's Methods with Modernization and Amelioration, with P for Powell.
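The flavor of Powell's derivative-free methods can be illustrated with SciPy's long-standing COBYLA implementation (not PRIMA's modernized code): the sketch below minimizes a smooth function under a nonlinear inequality constraint using function values only.

```python
# Powell-style derivative-free optimization via SciPy's classic COBYLA
# implementation; shown for illustration, this is not the PRIMA code itself.
import numpy as np
from scipy.optimize import minimize

def objective(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 0.5) ** 2

# COBYLA handles inequality constraints of the form c(x) >= 0.
constraints = [{"type": "ineq", "fun": lambda x: 1.0 - x[0] ** 2 - x[1] ** 2}]

result = minimize(objective, x0=np.zeros(2), method="COBYLA",
                  constraints=constraints)
print(result.x, result.fun)
```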
Elo ratings for global black box derivative-free optimizers
PyXAB - A Python Library for X-Armed Bandit and Online Blackbox Optimization Algorithms
Distributed asynchronous hyperparameter optimization library, stronger than HyperOpt.
Distribution-transparent machine learning experiments on Apache Spark
Python module for CEC 2017 single objective optimization test function suite.
Generalized and Efficient Blackbox Optimization System.
Heuristic Optimization for Python
An efficient open-source AutoML system for automating machine learning lifecycle, including feature engineering, neural architecture search, and hyper-parameter tuning.
Tuning the Parameters of Heuristic Optimizers (Meta-Optimization / Hyper-Parameter Optimization)
[ICLR'24] "DeepZero: Scaling up Zeroth-Order Optimization for Deep Model Training" by Aochuan Chen*, Yimeng Zhang*, Jinghan Jia, James Diffenderfer, Jiancheng Liu, Konstantinos Parasyris, Yihua Zhang, Zheng Zhang, Bhavya Kailkhura, Sijia Liu
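The core ingredient of zeroth-order training is a randomized finite-difference gradient estimator built only from function queries. The NumPy sketch below is a generic two-point estimator for illustration, not the DeepZero code; the smoothing parameter, query count, and step size are arbitrary choices.

```python
# Generic randomized zeroth-order gradient estimator (two-point form).
# Illustrative sketch only; this is not the DeepZero implementation.
import numpy as np

def zo_gradient(f, x, mu=1e-3, num_queries=20, rng=None):
    """Estimate grad f(x) using only function evaluations."""
    rng = rng or np.random.default_rng(0)
    d = x.size
    fx = f(x)
    grad = np.zeros_like(x)
    for _ in range(num_queries):
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)                    # random unit direction
        grad += (f(x + mu * u) - fx) / mu * u     # directional difference
    return (d / num_queries) * grad

# Zeroth-order gradient descent on a toy quadratic.
f = lambda x: np.sum((x - 1.0) ** 2)
rng = np.random.default_rng(0)
x = np.zeros(5)
for _ in range(200):
    x -= 0.05 * zo_gradient(f, x, rng=rng)
print(x)   # approaches the all-ones minimizer
```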
Rust implementation of the Simple(x) Global Optimization algorithm
A Julia implementation of the CMA Evolution Strategy (CMA-ES) for derivative-free optimization of potentially nonlinear, non-convex, or noisy functions over continuous domains.
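For consistency with the other Python snippets here, the ask/tell workflow that CMA-ES implementations expose can be illustrated with the separate Python `cma` package (pycma); the loop below follows that package's documented pattern and is not the Julia API.

```python
# Ask/tell loop with the Python `cma` package (pycma), shown only to
# illustrate the CMA-ES workflow; the Julia package has its own API.
import cma

es = cma.CMAEvolutionStrategy(8 * [0.0], 0.5)     # start point and step size
while not es.stop():
    solutions = es.ask()                          # sample a population
    fitnesses = [cma.ff.rosen(x) for x in solutions]
    es.tell(solutions, fitnesses)                 # update the search distribution
es.result_pretty()
```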
Fast Bayesian optimization, quadrature, and inference over arbitrary domains, with GPU parallel acceleration
Blackbox optimization algorithms with a common interface, along with useful helpers like parallel optimization loops, analysis and visualization scripts.
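A "common interface" for blackbox optimizers usually amounts to an ask/tell protocol plus a driver loop. The sketch below is hypothetical (the class and method names are not taken from the repository) and shows how such an interface supports parallel evaluation with the standard library.

```python
# Hypothetical ask/tell interface with a parallel evaluation loop; names
# are illustrative, not taken from the repository above.
import random
from concurrent.futures import ThreadPoolExecutor
from typing import Protocol, Sequence

class AskTellOptimizer(Protocol):
    def ask(self, count: int) -> Sequence[Sequence[float]]: ...
    def tell(self, points: Sequence[Sequence[float]],
             values: Sequence[float]) -> None: ...

class RandomSearch:
    """Toy optimizer implementing the ask/tell protocol (minimization)."""
    def __init__(self, bounds):
        self.bounds = bounds
        self.best_point, self.best_value = None, float("inf")

    def ask(self, count):
        return [[random.uniform(lo, hi) for lo, hi in self.bounds]
                for _ in range(count)]

    def tell(self, points, values):
        for p, v in zip(points, values):
            if v < self.best_value:
                self.best_point, self.best_value = p, v

def optimize(opt, objective, budget, batch=4):
    """Drive any ask/tell optimizer, evaluating each batch in parallel."""
    evaluated = 0
    with ThreadPoolExecutor() as pool:
        while evaluated < budget:
            points = opt.ask(batch)
            values = list(pool.map(objective, points))
            opt.tell(points, values)
            evaluated += len(points)
    return opt

opt = optimize(RandomSearch([(-5.0, 5.0)] * 2),
               lambda x: sum(v * v for v in x), budget=64)
print(opt.best_point, opt.best_value)
```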
Robustify Black-Box Models (ICLR'22 - Spotlight)
📰 Must-Read Papers on Offline Model-Based Optimization 🔥
A library for the hyperparameter optimization of deep neural networks
Nonlinear programming application examples solved with Artelys Knitro
Black-box optimizer submitted to the BBO challenge at NeurIPS 2020
🎯📈 Sequential and model-based optimization
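Sequential model-based optimization fits a surrogate (commonly a Gaussian process) to past evaluations and picks the next point by maximizing an acquisition function. A compact way to illustrate the pattern is scikit-optimize's `gp_minimize`, used here purely as an example and not necessarily the library this entry refers to.

```python
# Sequential model-based optimization illustrated with scikit-optimize's
# gp_minimize; this may not be the library the entry above refers to.
from skopt import gp_minimize

def objective(params):
    x, y = params
    return (x - 0.3) ** 2 + (y + 0.7) ** 2       # toy function to minimize

result = gp_minimize(
    objective,
    dimensions=[(-2.0, 2.0), (-2.0, 2.0)],       # search bounds per dimension
    n_calls=25,                                  # total blackbox evaluations
    random_state=0,
)
print(result.x, result.fun)
```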
A simple black-box optimization framework for training your PyTorch models on non-differentiable objectives
SPGD: Search Party Gradient Descent algorithm, a Simple Gradient-Based Parallel Algorithm for Bound-Constrained Optimization. Link: https://www.mdpi.com/2227-7390/10/5/800