L_BFGS_B_Optimization
Purpose
The purpose of the driver is to identify a parameter vector x* that minimizes the value of an objective function f(x). The search domain X is bounded by box constraints x_i^min <= x_i <= x_i^max for i = 1, ..., d and may be subject to several constraints c_j(x) such that a point x is admissible only if c_j(x) <= 0 (see jcmwave_optimizer_create_study()).
The driver uses the L-BFGS-B algorithm to perform a gradient-based minimization. We recommend using this driver if exact convergence towards a local or global minimum is required. If no derivative information is available, convergence is often achieved more efficiently with the derivative-free downhill-simplex minimization.
The driver is based on the open-source SciPy implementation (see https://docs.scipy.org/doc/scipy/reference/optimize.minimize-lbfgsb.html). It is extended to support constraints and parallel optimization by starting several independent minimizers at different positions.
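The multi-start behaviour is controlled by the study parameters listed in the Parameters section below. The following lines are an illustrative sketch only, assuming a study object created as in the Usage Example below:
% Illustrative sketch: configure the parallel multi-start behaviour
% (assumes a study object created as in the Usage Example below).
study.set_parameters('num_initial',3, ...      % start 3 independent minimizers
    'max_num_minimizers',10, ...               % stop after 10 minimizers have converged
    'sobol_sequence',true);                    % draw initial points from a Sobol sequence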
Usage Example
addpath(fullfile(getenv('JCMROOT'), 'ThirdPartySupport', 'Matlab'));
client = jcmwave_optimizer_client();

% Definition of the search domain
domain = {...
    struct('name','x1', 'type','continuous', 'domain', [-1.5,1.5]),...
    struct('name','x2', 'type','continuous', 'domain', [-1.5,1.5]),...
    struct('name','radius', 'type','fixed', 'domain', 2)...
};

% Definition of a constraint on the search domain
constraints = [...
    struct('name', 'circle', 'constraint','sqrt(x1^2 + x2^2) - radius')...
];

% Creation of the study object with study_id 'L_BFGS_B_Optimization_example'
study = client.create_study('domain',domain, 'constraints',constraints, ...
    'driver','L_BFGS_B_Optimization',...
    'name','L_BFGS_B_Optimization example', ...
    'study_id','L_BFGS_B_Optimization_example');

% Set study parameters
study.set_parameters('max_iter',25, 'num_initial',3,...
    'jac',true, 'initial_samples', [[0.5,0.5];[-0.5,-0.5]]);

% Run the minimization
while(not(study.is_done))
    sug = study.get_suggestion();
    obs = objective(study, sug.sample);
    study.add_observation(obs, sug.id);
end
info = study.info();
fprintf('\nMinimum %0.3e found at (x1=%0.3e, x2=%0.3e)',...
    info.min_objective, info.min_params.x1, info.min_params.x2);

% Definition of a simple analytic objective function (the 2D Rastrigin
% function) together with its partial derivatives. Typically, the objective
% value is derived from a FEM simulation using jcmwave_solve(...).
% Note: in a plain script, local functions must be placed at the end of the
% file, and the study object is passed in explicitly.
function observation = objective(study, sample)
    pause(2.0); % makes the objective expensive
    observation = study.new_observation();
    x1 = sample.x1;
    x2 = sample.x2;
    % objective value
    observation.add(10*2 + (x1.^2-10*cos(2*pi*x1)) + (x2.^2-10*cos(2*pi*x2)));
    % derivative w.r.t. x1
    observation.add(2*x1 + 20*pi*sin(2*pi*x1), 'x1');
    % derivative w.r.t. x2
    observation.add(2*x2 + 20*pi*sin(2*pi*x2), 'x2');
end
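If no analytic derivatives are available, the same study can also be run derivative-free: leave 'jac' at its default of false, and the driver approximates the gradient numerically using 'step_size'. The following sketch is illustrative only; it reuses the study and the Rastrigin objective of the example above and adds only the objective value to each observation:
% Illustrative sketch: derivative-free run of the same study. With 'jac'
% set to false, the gradient is approximated numerically with 'step_size'.
study.set_parameters('jac',false, 'step_size',1e-6, 'max_iter',25);
while(not(study.is_done))
    sug = study.get_suggestion();
    x1 = sug.sample.x1;
    x2 = sug.sample.x2;
    obs = study.new_observation();
    % only the objective value is added; no derivative entries
    obs.add(10*2 + (x1.^2-10*cos(2*pi*x1)) + (x2.^2-10*cos(2*pi*x2)));
    study.add_observation(obs, sug.id);
end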
Parameters
The following parameters can be set by calling, e.g.
study.set_parameters('example_parameter1',[1,2,3], 'example_parameter2',true);
max_iter (int): Maximum number of evaluations of the objective function (default: inf)
max_time (int): Maximum run time in seconds (default: inf)
num_parallel (int): Number of parallel observations of the objective function (default: 1)
eps (float): Stopping criterion. Minimum distance in the parameter space to the currently known minimum (default: 0.0)
min_val (float): Stopping criterion. Minimum value of the objective function (default: -inf)
num_initial (int): Number of independent initial minimizers (default: 1)
max_num_minimizers (int): If a minimizer has converged, it is restarted at another position. If max_num_minimizers minimizers have converged, the optimization is stopped (default: inf)
sobol_sequence (bool): If true, all initial samples are taken from a Sobol sequence. This typically improves the coverage of the parameter space (default: True)
jac (bool): If true, derivative information provided with the observations is used for the minimization (default: False)
step_size (float): Step size used for the numerical approximation of the gradient (default: 1e-06)
f_tol (float): The iteration stops when (f^k - f^{k+1}) / max{|f^k|, |f^{k+1}|, 1} <= f_tol (default: 2.2e-09)
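Several of these parameters can be combined. The following sketch is illustrative only; it assumes the study and objective function of the Usage Example, and that the driver hands out several open suggestions at a time when num_parallel is larger than one. The suggestions are evaluated sequentially here for simplicity; in practice they would be evaluated concurrently, e.g. on a cluster:
% Illustrative sketch: combine stopping criteria with parallel suggestions.
% Assumes the study and objective function of the Usage Example, and that
% several open suggestions are available when num_parallel > 1.
study.set_parameters('num_parallel',3, ...  % up to 3 open suggestions at a time
    'max_time',3600, ...                    % stop after one hour at the latest
    'f_tol',1e-8);                          % relative decrease tolerance
while(not(study.is_done))
    % request a batch of open suggestions
    sugs = cell(1,3);
    for k = 1:3
        sugs{k} = study.get_suggestion();
    end
    % evaluate the batch and report the observations
    for k = 1:3
        obs = objective(study, sugs{k}.sample);
        study.add_observation(obs, sugs{k}.id);
    end
end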