BADS
Note
The BADS class implements the Bayesian Adaptive Direct Search (BADS) algorithm.
BADS attempts to solve an unbounded, bounded, or nonlinearly constrained optimization (minimization) problem, and is compatible with both noiseless and noisy target functions.
To perform the optimization, first initialize a BADS object and then call bads.optimize() on the instance.
See below for more details on the BADS class methods and interface. The primary entry points for users are the BADS class, which initializes the algorithm, and the OptimizeResult class, which represents the returned optimization solution. The Basic options may also be useful.
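As a quickstart, the sketch below follows this workflow on a toy two-dimensional quadratic; the target function, bounds, and starting point are illustrative choices, not part of the interface.

```python
import numpy as np
from pybads.bads import BADS

# Toy target to minimize (illustrative only): a smooth 2-D quadratic.
def sphere(x):
    x = np.atleast_2d(x)
    return np.sum(x**2, axis=1)

bads = BADS(
    sphere,
    x0=np.array([2.0, 2.0]),                        # starting point
    lower_bounds=np.array([-20.0, -20.0]),          # hard bounds
    upper_bounds=np.array([20.0, 20.0]),
    plausible_lower_bounds=np.array([-5.0, -5.0]),  # plausible box
    plausible_upper_bounds=np.array([5.0, 5.0]),
)
optimize_result = bads.optimize()                   # run the optimization
```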
- class pybads.bads.BADS(fun: callable, x0: ndarray = None, lower_bounds: ndarray = None, upper_bounds: ndarray = None, plausible_lower_bounds: ndarray = None, plausible_upper_bounds: ndarray = None, non_box_cons: callable = None, gamma_uncertain_interval=None, options: dict = None)
BADS: Constrained optimization using Bayesian Adaptive Direct Search.
- BADS attempts to solve problems of the form:
\(\mathtt{argmin}_x f(x)\) subject to: lower_bounds \(\le x \le\) upper_bounds, and optionally \(C(x) \le 0\)
Initialize a PyBADS object to set up the optimization problem, then run optimize(). See the examples directory for more details.
- Parameters:
- fun : callable
A given target function fun. fun accepts an input x and returns a scalar value of the target evaluated at x (and the noise at x, if provided; see options below). If the target function fun requires additional data/parameters, they can be handled using an anonymous function. For example: fun_for_pybads = lambda x: fun(x, data, extra_params), where fun is the function to optimize, and data and extra_params are given in the outer scope.
- x0 : np.ndarray, optional
Starting point for the optimization. If not specified or None, the starting point x0 is uniformly randomly drawn inside the plausible box between plausible_lower_bounds and plausible_upper_bounds (see below).
- lower_bounds, upper_bounds : np.ndarray, optional
lower_bounds (lb) and upper_bounds (ub) define a set of strict lower and upper bounds for the coordinate vector x, such that the unknown function has support on lb <= x <= ub. If scalars, the bound is replicated in each dimension. Use None for lb and ub if no bounds exist. Set lb[i] = -inf and ub[i] = inf if the i-th coordinate is unbounded (while other coordinates may be bounded). Note that if lb and ub contain unbounded variables, the respective values of plb and pub need to be specified (see below). By default None.
- plausible_lower_bounds, plausible_upper_bounds : np.ndarray, optional
Specifies a set of plausible_lower_bounds (plb) and plausible_upper_bounds (pub) such that lb <= plb < pub <= ub. Both plb and pub need to be finite. plb and pub represent a plausible range, which should denote a region where the global minimum is expected to be found. As a rule of thumb, set plausible_lower_bounds and plausible_upper_bounds such that there is > 90% probability that the minimum is found within the box (when in doubt, just set plb = lb and pub = ub).
- non_box_cons : callable, optional
A given non-box constraint function that specifies constraint violations, e.g. lambda x: np.sum(x**2, axis=1) > 1, which flags points outside the unit ball as infeasible.
- options : dict, optional
Additional options can be passed as a dict. Please refer to the BADS options page for the default options. If no options are passed, the default options are used. To run BADS on a noisy (stochastic) objective function, set options['uncertainty_handling'] = True. You can help BADS by providing an estimate of the noise: options['noise_size'] = sigma provides a global estimate of the SD of the noise in your problem in a good region of the parameter space (if not specified, the default is sigma = 1.0). Alternatively, you can specify the target noise at each location with options['specify_target_noise'] = True. In this case, fun is expected to return two values, the estimate of the target at x and an estimate of the SD of the noise at x (see the examples, and the sketch after this parameter list). If options['uncertainty_handling'] is not specified, BADS will determine at runtime if the objective function is noisy. To obtain reproducible results of the optimization, set options['random_seed'] to a fixed integer value.
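To illustrate the noisy-target options referenced above, here is a minimal sketch; the toy target, noise level, and plausible bounds are illustrative assumptions, not fixed parts of the interface.

```python
import numpy as np
from pybads.bads import BADS

# Illustrative noisy target: returns both a noisy estimate of f(x) and the
# noise SD at x, as expected when options['specify_target_noise'] is True.
def noisy_sphere(x):
    x = np.atleast_2d(x)
    noise_sd = 0.5                                   # assumed known noise level
    f = np.sum(x**2, axis=1)
    return f + noise_sd * np.random.normal(size=f.shape), np.full(f.shape, noise_sd)

options = {
    "uncertainty_handling": True,    # the target is stochastic
    "specify_target_noise": True,    # fun returns (value, noise SD)
    "random_seed": 0,                # fixed seed for reproducible results
}

bads = BADS(
    noisy_sphere,
    x0=np.array([3.0, 3.0]),
    plausible_lower_bounds=np.array([-5.0, -5.0]),
    plausible_upper_bounds=np.array([5.0, 5.0]),
    options=options,
)
optimize_result = bads.optimize()
```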
- Raises:
- ValueError
When neither x0 nor (plausible_lower_bounds and plausible_upper_bounds) is specified.
- ValueError
When various checks for the bounds (lower_bounds, upper_bounds, plausible_lower_bounds, plausible_upper_bounds) of BADS fail.
References
[1] Singh, S. G. & Acerbi, L. (2024). “PyBADS: Fast and robust black-box optimization in Python”. Journal of Open Source Software, 9(94), 5694, https://doi.org/10.21105/joss.05694.
[2] Acerbi, L. & Ma, W. J. (2017). “Practical Bayesian Optimization for Model Fitting with Bayesian Adaptive Direct Search”. In Advances in Neural Information Processing Systems 30, pages 1834-1844. (arXiv preprint: https://arxiv.org/abs/1705.04405).
Examples
For BADS usage examples, please look up the Jupyter notebook tutorials in the PyBADS documentation: https://acerbilab.github.io/pybads/examples.html
- optimize()
Run the optimization on an initialized PyBADS object. BADS starts at x0 and finds a local minimum x of the target function fun.
A history of the optimization can be found in the self.iteration_history attribute of the PyBADS object.
- Returns:
- optimize_result: OptimizeResult
Dictionary containing the result of the optimization. See the documentation of the OptimizeResult class for more details. For example, retrieve the final solution with the attributes optimize_result.x and optimize_result.fval.
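For illustration, a short sketch of retrieving the solution after a run, assuming a BADS instance named bads as in the quickstart above; attribute access follows the usage documented here, and dictionary-style indexing should behave equivalently since OptimizeResult is dictionary-like.

```python
optimize_result = bads.optimize()

x_min = optimize_result.x        # location of the best solution found
f_min = optimize_result.fval     # target value at x_min
print(f"BADS minimum at x = {x_min}, fval = {f_min}")

# The per-iteration record is kept on the BADS instance itself.
history = bads.iteration_history
```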