Evolutionary & Swarm Optimization

Evolutionary & Swarm · intermediate · ~8 min

Use derivative-free search (fminsearch) to optimise noisy or non-smooth objective functions.

Step 1 — Define a multi-variable objective

Anonymous functions are a clean way to define an objective. The two-variable Rosenbrock function is a classic benchmark: its global minimum is at (1, 1) with f = 0.

f = @(x) (1 - x(1))^2 + 100*(x(2) - x(1)^2)^2;
disp(f([0, 0]))
disp(f([1, 1]))
▶ Run in SimLab

Expected output: f([0,0]) = 1, f([1,1]) = 0
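A useful property here is that anonymous functions capture workspace variables at the moment they are defined. The sketch below illustrates this with the general Rosenbrock form f(x) = (a - x1)^2 + b*(x2 - x1^2)^2; the parameter names `a` and `b` are illustrative, not part of the tutorial's code.

```matlab
% Anonymous functions snapshot workspace variables at definition time.
a = 1; b = 100;
f = @(x) (a - x(1))^2 + b*(x(2) - x(1)^2)^2;
disp(f([0, 1]))   % 1 + 100*1 = 101

% Reassigning b afterwards does not change f; b = 100 was captured.
b = 1;
disp(f([0, 1]))   % still 101
```

This is why the tutorial can define `f` once and pass it around: the captured constants travel with the handle.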

Step 2 — Find the minimum with fminsearch

fminsearch implements the Nelder–Mead simplex method, a derivative-free strategy that repeatedly reflects, expands, and contracts a simplex of candidate points, loosely mirroring the population-based exploration of evolutionary algorithms.

f = @(x) (1 - x(1))^2 + 100*(x(2) - x(1)^2)^2;
[x_opt, fval] = fminsearch(f, [0; 0]);
fprintf('Optimum: x1=%.4f, x2=%.4f, fval=%.6f\n', x_opt(1), x_opt(2), fval)
▶ Run in SimLab

Expected output: Optimum near x1=1.0000, x2=1.0000, fval≈0
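When the defaults stall short of the minimum, you can tighten the stopping tolerances and raise the evaluation budget. The sketch below uses the standard `optimset` options accepted by `fminsearch` in both MATLAB and Octave; the specific tolerance values are illustrative.

```matlab
% Same objective, but with tighter tolerances and a larger budget.
f = @(x) (1 - x(1))^2 + 100*(x(2) - x(1)^2)^2;
opts = optimset('TolX', 1e-8, 'TolFun', 1e-8, 'MaxFunEvals', 2000);
[x_opt, fval, exitflag] = fminsearch(f, [0; 0], opts);
fprintf('x = (%.6f, %.6f), fval = %.2e, exitflag = %d\n', ...
        x_opt(1), x_opt(2), fval, exitflag)
```

An `exitflag` of 1 indicates convergence to within the tolerances; 0 means the iteration or evaluation budget ran out first.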

Step 3 — Visualise the search landscape

Plot the objective surface to see why this function is hard: the narrow curved valley misleads gradient-based methods.

x1 = linspace(-1.5, 2, 60);
x2 = linspace(-1, 3, 60);
[X1, X2] = meshgrid(x1, x2);
Z = (1 - X1).^2 + 100*(X2 - X1.^2).^2;
contourf(X1, X2, log(1 + Z), 20);
colorbar; xlabel('x1'); ylabel('x2');
title('Rosenbrock Landscape (log scale)')
▶ Run in SimLab

Expected output: Contour plot with a narrow banana-shaped valley leading to (1,1)
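To connect the landscape back to the search, you can mark the start point and the point `fminsearch` returns on the same contour plot. This is a sketch combining the two earlier snippets; the marker styles are arbitrary choices.

```matlab
% Run the search, then overlay start and result on the landscape.
f = @(x) (1 - x(1))^2 + 100*(x(2) - x(1)^2)^2;
[x_opt, ~] = fminsearch(f, [0; 0]);

x1 = linspace(-1.5, 2, 60);
x2 = linspace(-1, 3, 60);
[X1, X2] = meshgrid(x1, x2);
Z = (1 - X1).^2 + 100*(X2 - X1.^2).^2;
contourf(X1, X2, log(1 + Z), 20);
hold on;
plot(0, 0, 'wo', 'MarkerSize', 8);                 % start point (0, 0)
plot(x_opt(1), x_opt(2), 'r+', 'MarkerSize', 10);  % found optimum
hold off;
colorbar; xlabel('x1'); ylabel('x2');
title('Rosenbrock landscape with start (circle) and optimum (cross)')
```

The cross should land at the far end of the curved valley, near (1, 1).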
