huawei-noah / HEBO

Bayesian optimisation & Reinforcement Learning library developed by Huawei Noah's Ark Lab

Support for conditional/hierarchical design spaces

bbudescu opened this issue

Can one specify a conditional/hierarchical search space in HEBO, i.e., something similar to what SMAC3 offers?

For example, sample the number of convolutional filters in the second layer of a neural net only if we decide (via another parameter) to have a network with at least two layers.
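For reference, here is a minimal sketch of the kind of conditional space I mean, written with SMAC3's ConfigSpace package rather than HEBO; the parameter names and bounds are just illustrative:

```python
from ConfigSpace import ConfigurationSpace
from ConfigSpace.hyperparameters import UniformIntegerHyperparameter
from ConfigSpace.conditions import GreaterThanCondition

cs = ConfigurationSpace()
n_layers  = UniformIntegerHyperparameter('n_layers', lower=1, upper=4)
filters_2 = UniformIntegerHyperparameter('filters_layer2', lower=8, upper=256)
cs.add_hyperparameters([n_layers, filters_2])

# filters_layer2 is only active (and only sampled) when n_layers > 1
cs.add_condition(GreaterThanCondition(filters_2, n_layers, 1))

print(cs.sample_configuration())
```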

I'm thinking this can be somewhat circumvented by returning a high cost value for infeasible combinations, but I imagine that is a suboptimal approach that might hurt optimization performance: some optima might lie on the edge of the feasible region, and a cost estimator with a smoothness prior (which they usually have) has a chance of assigning unfaithfully high cost values near the infeasible configurations, at least initially, since the a priori probability of a kink in the error surface is generally low.
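For concreteness, this is roughly the workaround I have in mind, using the DesignSpace/HEBO API from the README; the parameter names, the penalty constant and the dummy training routine are made up:

```python
import numpy as np
import pandas as pd
from hebo.design_space.design_space import DesignSpace
from hebo.optimizers.hebo import HEBO

# Flat space: the conditional parameter is always present; infeasible
# combinations are handled inside the objective via a penalty instead.
space = DesignSpace().parse([
    {'name': 'n_layers',       'type': 'int', 'lb': 1, 'ub': 4},
    {'name': 'filters_layer2', 'type': 'int', 'lb': 8, 'ub': 256},
])

PENALTY = 1e6  # arbitrary large cost assigned to infeasible combinations


def train_and_evaluate(row: pd.Series) -> float:
    # Placeholder for the real training/validation routine.
    return float((row['filters_layer2'] - 64) ** 2 / 1e3 + row['n_layers'])


def obj(params: pd.DataFrame) -> np.ndarray:
    out = np.zeros((len(params), 1))
    for i, (_, row) in enumerate(params.iterrows()):
        if row['n_layers'] < 2:
            # 'filters_layer2' has no meaning here; penalise the combination
            # instead of sampling it conditionally.
            out[i, 0] = PENALTY
        else:
            out[i, 0] = train_and_evaluate(row)
    return out


opt = HEBO(space)
for it in range(10):
    rec = opt.suggest(n_suggestions=4)
    opt.observe(rec, obj(rec))
    print('iter %d, best cost %.3f' % (it, opt.y.min()))
```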

Some optimizers are able to address this by training a separate model, a feasibility predictor, which has the added advantage of working with unknown feasibility constraints.
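Just to illustrate the idea (this is not a HEBO feature), a rough sketch of such a feasibility predictor built on a scikit-learn classifier; the class and its interface are hypothetical:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier


class FeasibilityModel:
    """Screens candidate configurations using past feasible/infeasible labels."""

    def __init__(self):
        self.X, self.y = [], []
        self.clf = RandomForestClassifier(n_estimators=100)

    def observe(self, x: np.ndarray, feasible: bool) -> None:
        # Record an evaluated configuration (as a numeric vector) and its label.
        self.X.append(x)
        self.y.append(feasible)
        if len(set(self.y)) > 1:  # need both classes before fitting
            self.clf.fit(np.vstack(self.X), np.array(self.y))

    def p_feasible(self, x: np.ndarray) -> float:
        if len(set(self.y)) < 2:
            return 1.0  # no information yet: optimistically assume feasible
        # classes_ is sorted [False, True], so column 1 is P(feasible)
        return float(self.clf.predict_proba(x.reshape(1, -1))[0, 1])


fm = FeasibilityModel()
fm.observe(np.array([1.0, 64.0]), False)
fm.observe(np.array([3.0, 128.0]), True)
print(fm.p_feasible(np.array([2.0, 96.0])))
```

Candidates suggested by the optimizer with a low predicted feasibility could then be rejected or re-drawn before spending evaluation budget on them.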

So how should one deal with this in HEBO?

Has anyone come up with a solution for hierarchical searches using HEBO? I'd hate to switch to a different optimizer just because I can't do something similar to how SMAC3 performs hierarchical searches.