PyBADS Example 5: Extended usage

In this example, we will show PyBADS at work on a target with multiple local minima (also referred to as “multimodal” in statistics), and showcase some additional useful features of the package.

This notebook is Part 5 of a series of notebooks in which we present various example usages for BADS with the PyBADS package. The code used in this example is available as a script here.

import numpy as np
from pybads import BADS

1. Problem setup

In this example, we are going to optimize the six-hump camelback function, which has six local minima, two of which are global minima.

Note that, in most realistic scenarios, you would not know whether your problem has only a single local minimum (which would then also be the global minimum). In practice, many optimization problems exhibit multiple local minima and, in the absence of additional knowledge, you should assume that yours does too.

def camelback6(x):
    """Six-hump camelback function."""
    x_2d = np.atleast_2d(x)
    x1 = x_2d[:,0]
    x2 = x_2d[:,1]
    f = (4 - 2.1*(x1*x1) + (x1*x1*x1*x1)/3.0)*(x1*x1) + x1*x2 + (-4 + 4*(x2*x2))*(x2*x2)
    return f
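
As a quick sanity check of the implementation, we can evaluate the function at the two known global minimizers of the six-hump camelback function (the rounded reference values also reported in Section 3 below); both evaluations should return approximately -1.0316:

x_star = np.array([[0.0898, -0.7126], [-0.0898, 0.7126]])  # known global minimizers
print(camelback6(x_star))  # both values should be approximately -1.0316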

lower_bounds = np.array([-3, -2])
upper_bounds = np.array([3, 2])
plausible_lower_bounds = np.array([-2.9, -1.9])
plausible_upper_bounds = np.array([2.9, 1.9])

options = {
    "display" : 'off',             # We switch off the printing
    "uncertainty_handling": False, # Good to specify that this is a deterministic function
}

2. Run the optimization

PyBADS is not a global optimization algorithm, in the sense that there is no guarantee that a single run will return the global optimum (in practice, this is true of all algorithms under a finite budget of evaluations). The golden rule of optimization, regardless of the chosen algorithm, is to always rerun the optimization multiple times from different starting points (a multi-start strategy), to explore the landscape of the target and gain some confidence about the results.

Below, we rerun PyBADS num_opts times from different starting points and store the results of each run, which we will examine later. A few observations:

  • Each optimization uses a different BADS object (the general rule is: one BADS instance per optimization).

  • We specified x0 = None. With this choice, the starting point x0 is drawn uniformly at random inside the provided plausible box, delimited by plausible_lower_bounds and plausible_upper_bounds (see the sketch after this list).

  • For each run, we set a different, predetermined random seed via options['random_seed'], which can be helpful for reproducibility of the results.

  • We switched off PyBADS's default printing.
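
For illustration, drawing a starting point from the plausible box by hand would look like the minimal sketch below; PyBADS performs the equivalent draw internally when x0 = None (the seed is arbitrary and the variable name x0_manual is just for illustration):

rng = np.random.default_rng(0)  # arbitrary seed, for illustration only
x0_manual = rng.uniform(plausible_lower_bounds, plausible_upper_bounds)
print(x0_manual)  # a random point inside the plausible box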

num_opts = 10
optimize_results = []
x_vec = np.zeros((num_opts,lower_bounds.shape[0]))
fval_vec = np.zeros(num_opts)

for opt_count in range(num_opts):
    print('Running optimization ' + str(opt_count) + '...')
    options['random_seed'] = opt_count
    bads = BADS(
        camelback6, None, lower_bounds, upper_bounds, plausible_lower_bounds, plausible_upper_bounds, options=options
    )
    optimize_results.append(bads.optimize())
    x_vec[opt_count] = optimize_results[opt_count].x
    fval_vec[opt_count] = optimize_results[opt_count].fval
Running optimization 0...
Running optimization 1...
Running optimization 2...
Running optimization 3...
Running optimization 4...
Running optimization 5...
Running optimization 6...
Running optimization 7...
Running optimization 8...
Running optimization 9...

3. Results and conclusions

First, we inspect the results. In this example, the target function (the six-hump camelback function) has two equally-good solutions,

$$x^\star = \left\{ (0.0898, -0.7126), (-0.0898, 0.7126) \right\}, \qquad f(x^\star) = -1.0316,$$

which should be represented in the set of results.

Importantly, we should find below that (almost) all solutions are very close in function value, suggesting that we found the minimizers of the target.

print('Found solutions:')
print(x_vec)

print('Function values at solutions:')
print(fval_vec)
Found solutions:
[[ 0.08939411 -0.712172  ]
 [-0.09008138  0.71347182]
 [ 0.09082897 -0.71192109]
 [-0.09414637  0.71077771]
 [-0.09105368  0.71271563]
 [ 0.08997784 -0.71272106]
 [-0.08990161  0.71246268]
 [ 0.09031594 -0.71231801]
 [ 0.0895162  -0.71246247]
 [-0.09016521  0.71251721]]
Function values at solutions:
[-1.03162597 -1.03162297 -1.03161951 -1.03151937 -1.03162277 -1.03162836
 -1.03162812 -1.03162648 -1.03162779 -1.03162784]
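
As a quick quantitative check that (almost) all runs converged to the same function value, we can count how many solutions lie within a small tolerance of the best one (the tolerance below is an arbitrary choice for illustration):

tol = 1e-3  # arbitrary tolerance for "same function value"
n_good = np.sum(fval_vec < np.min(fval_vec) + tol)
print(f"{n_good} out of {num_opts} runs are within {tol} of the best value.")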

We now take the best result of the optimization:

idx_best = np.argmin(fval_vec)
result_best = optimize_results[idx_best]

x_min = result_best['x']
fval = result_best['fval']

print(f"BADS minimum at x_min = {x_min.flatten()}")
print(f"Function value at minimum fval = {fval}")
BADS minimum at x_min = [ 0.08997784 -0.71272106]
Function value at minimum fval = -1.0316283561123913

The best result indeed matches \(f^\star = -1.0316\).
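
We can also verify that the returned point lies close to one of the two known global minimizers, using the rounded reference coordinates from above (a simple nearest-point check, not part of the PyBADS API):

x_star = np.array([[0.0898, -0.7126], [-0.0898, 0.7126]])  # reference minimizers
dists = np.linalg.norm(x_star - x_min, axis=1)
print(f"Distance to the nearest global minimizer: {np.min(dists):.4f}")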

The OptimizeResult object returned by PyBADS contains further information about the run:

result_best
{'fun': <function __main__.camelback6(x)>,
 'non_box_cons': None,
 'target_type': 'deterministic',
 'problem_type': 'bound constraints',
 'iterations': 8,
 'func_count': 77,
 'mesh_size': 0.0009765625,
 'overhead': 1307.5565116015111,
 'algorithm': 'Bayesian adaptive direct search',
 'yval_vec': None,
 'ysd_vec': None,
 'x0': array([[-1.61243961,  1.40878276]]),
 'x': array([ 0.08997784, -0.71272106]),
 'fval': -1.0316283561123913,
 'fsd': 0,
 'total_time': 1.3023214350209962,
 'random_seed': 5,
 'version': '0.8.3.dev21+g2aaa7af',
 'success': True,
 'message': "Optimization terminated: change in the function value less than options['tol_fun']."}

In particular:

  • total_time is the total runtime (in seconds), including both the time spent evaluating the target and the algorithmic cost of BADS.

  • overhead represents the fractional overhead of BADS compared to the time spent evaluating the target. In this example, overhead is astronomical because the target function we are using is analytical and extremely cheap to evaluate, which is not what BADS is designed for. In a realistic scenario, the objective function will be moderately costly (e.g., more than 0.1 s per function evaluation), and the fractional overhead should be less than 1 (see the sketch after this list).

  • random_seed is the random seed used for this run (None if not specified).

  • version is the version of PyBADS used (e.g., 0.8.1); developer versions have an additional dev suffix with more versioning information.
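
To make the overhead figure concrete, the sketch below re-estimates it by timing the target ourselves. It assumes that the fractional overhead equals the algorithmic time divided by the time spent evaluating the target, which matches the description above but is a back-of-the-envelope reconstruction rather than the exact internal computation:

import time

# Estimate the average cost of one evaluation of the target
n_reps = 1000
t0 = time.perf_counter()
for _ in range(n_reps):
    camelback6(np.zeros(2))
t_eval = (time.perf_counter() - t0) / n_reps

func_time = result_best['func_count'] * t_eval      # estimated time spent in the target
algo_time = result_best['total_time'] - func_time   # estimated time spent inside BADS
print(f"Estimated fractional overhead: {algo_time / func_time:.1f}")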