Fix NLopt crash with gradient-based algorithms when no AD backend specified #1068
Fixes https://discourse.julialang.org/t/error-when-using-multistart-optimization/133174
Problem
When using NLopt's gradient-based algorithms (such as `LD_LBFGS`) without specifying an AD backend in `OptimizationFunction`, the code would crash. This happened because the NLopt wrapper tried to call `cache.f.grad(G, θ)` at line 181, but `cache.f.grad` was `nothing` when no AD backend was specified.
Solution

Added a check in the `__solve` method to verify that, if the algorithm requires gradients, `cache.f.grad` is not `nothing`. If it is `nothing`, we now throw a helpful `IncompatibleOptimizerError` (see the sketch after this list) that guides users to either:

- use `OptimizationFunction` with an AD backend (e.g., `AutoForwardDiff()`), or
- provide an analytical gradient via the `grad` kwarg.
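Roughly, the guard looks like the following sketch; the `requires_gradient` helper and the exact error wording are illustrative, not copied from the PR:

```julia
# Illustrative guard; in the real code this check lives inside the NLopt
# __solve method, and gradient-requiring algorithms may be detected differently.
function check_grad(cache, opt)
    if requires_gradient(opt) && isnothing(cache.f.grad)   # hypothetical helper
        throw(Optimization.IncompatibleOptimizerError(
            "$(typeof(opt)) requires gradients. Construct the OptimizationFunction " *
            "with an AD backend (e.g. AutoForwardDiff()) or pass an analytical " *
            "gradient via the `grad` kwarg."))
    end
end
```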
Changes

The new check (and the accompanying tests) covers both the `NLopt.LD_LBFGS()` and the `NLopt.Opt(:LD_LBFGS, 2)` interfaces.
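Both calling conventions hit the same check; continuing the (illustrative) reproducer above:

```julia
solve(prob, NLopt.LD_LBFGS())        # algorithm-object interface
solve(prob, NLopt.Opt(:LD_LBFGS, 2)) # raw NLopt.Opt interface
# With this PR, both throw IncompatibleOptimizerError because `prob`
# carries no gradient information.
```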
Example Error Message

Before this PR:
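The pre-fix failure is Julia's generic error for calling `nothing` as a function, along the lines of:

```
ERROR: MethodError: objects of type Nothing are not callable
```

(the exact stack trace from the PR is not reproduced here).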
After this PR: the solver throws an `IncompatibleOptimizerError` whose message points users to an AD backend (e.g., `AutoForwardDiff()`) or an analytical gradient via the `grad` kwarg.
Test Results
All 42 tests pass, including the new test that reproduces the discourse issue.
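The PR does not show the test here; a plausible shape for the new regression test, under the assumption that it simply asserts the error type, is:

```julia
using Test, Optimization, OptimizationNLopt

# No AD backend and no analytical gradient supplied.
f = OptimizationFunction((u, p) -> sum(abs2, u))
prob = OptimizationProblem(f, zeros(2))

# A gradient-based algorithm must now fail loudly instead of crashing.
@test_throws Optimization.IncompatibleOptimizerError solve(prob, NLopt.LD_LBFGS())
```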
🤖 Generated with Claude Code
Co-Authored-By: Claude <noreply@anthropic.com>