From 694ac426d4245719eabdef0b48e325c553e036b2 Mon Sep 17 00:00:00 2001
From: Francis Gagnon
Date: Mon, 8 Sep 2025 11:08:21 -0400
Subject: [PATCH] Update the multistart optimization tutorial in `ensemble.md`

---
 docs/src/tutorials/ensemble.md | 8 +++++++-
 1 file changed, 7 insertions(+), 1 deletion(-)

diff --git a/docs/src/tutorials/ensemble.md b/docs/src/tutorials/ensemble.md
index 5c926a653..42b4215c9 100644
--- a/docs/src/tutorials/ensemble.md
+++ b/docs/src/tutorials/ensemble.md
@@ -5,6 +5,8 @@
 of optimization, this is useful for performing multistart optimization. This
 can be useful for complex, low dimensional problems. We demonstrate this, again,
 on the rosenbrock function.
 
+We first execute a single local optimization with `OptimizationOptimJL.BFGS` and `maxiters=5`:
+
 ```@example ensemble
 using Optimization, OptimizationOptimJL, Random
@@ -18,10 +20,14 @@
 prob = OptimizationProblem(optf, x0, [1.0, 100.0])
 @time sol1 = Optimization.solve(prob, OptimizationOptimJL.BFGS(), maxiters = 5)
 @show sol1.objective
+```
+This result is compared to a multistart approach with 4 random initial points:
+
+```@example ensemble
 x0s = [x0, x0 .+ rand(2), x0 .+ rand(2), x0 .+ rand(2)]
 
 function prob_func(prob, i, repeat)
-    remake(prob, u0 = x0s[1])
+    remake(prob, u0 = x0s[i])
 end
 
 ensembleprob = Optimization.EnsembleProblem(prob; prob_func)