Conversation

ChrisRackauckas-Claude

Fixes https://discourse.julialang.org/t/error-when-using-multistart-optimization/133174

Problem

When using one of NLopt's gradient-based algorithms (such as `LD_LBFGS`) without specifying an AD backend in `OptimizationFunction`, the solve crashed with:

MethodError: objects of type Nothing are not callable

This happened because the NLopt wrapper called `cache.f.grad(G, θ)` at line 181, but `cache.f.grad` was `nothing` when no AD backend was specified.
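
For context, a call pattern along these lines reproduces the crash. This is a minimal sketch; the Rosenbrock objective, starting point, and parameters are illustrative, not taken from the discourse thread:

```julia
using Optimization, OptimizationNLopt

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

# No AD backend is passed, so the OptimizationFunction's `grad` field stays `nothing`
optf = OptimizationFunction(rosenbrock)
prob = OptimizationProblem(optf, [0.0, 0.0], [1.0, 100.0])

# LD_LBFGS needs gradients; before this PR the wrapper called
# cache.f.grad(G, θ) with grad === nothing, raising the MethodError
sol = solve(prob, NLopt.LD_LBFGS())
```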

Solution

Added a check in the `__solve` method: if the algorithm requires gradients, `cache.f.grad` must not be `nothing`. When it is `nothing`, we now throw a helpful `IncompatibleOptimizerError` that guides users to either:

  1. Use `OptimizationFunction` with an AD backend (e.g., `AutoForwardDiff()`), or
  2. Provide gradients manually via the `grad` kwarg (both options are sketched below).
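
Reusing the toy objective from the reproducer above, the two fixes might look roughly like this. The analytic gradient is only there to illustrate the `grad` kwarg, and the in-place `grad(G, x, p)` signature is assumed from the `OptimizationFunction` documentation:

```julia
using Optimization, OptimizationNLopt, ForwardDiff

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

# Option 1: let Optimization.jl construct the gradient with an AD backend
optf = OptimizationFunction(rosenbrock, AutoForwardDiff())

# Option 2: supply an in-place gradient yourself via the `grad` kwarg
function rosenbrock_grad!(G, x, p)
    G[1] = -2 * (p[1] - x[1]) - 4 * p[2] * x[1] * (x[2] - x[1]^2)
    G[2] = 2 * p[2] * (x[2] - x[1]^2)
end
optf = OptimizationFunction(rosenbrock; grad = rosenbrock_grad!)

prob = OptimizationProblem(optf, [0.0, 0.0], [1.0, 100.0])
sol = solve(prob, NLopt.LD_LBFGS())
```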

Changes

  1. lib/OptimizationNLopt/src/OptimizationNLopt.jl: Added a gradient-availability check (lines 170-177) before gradients are used, giving users a clear error message (a rough sketch of the guard follows this list)
  2. lib/OptimizationNLopt/test/runtests.jl: Added a comprehensive test (lines 178-207) verifying that:
    • the error is thrown when gradient-based algorithms are used without AD
    • the error is thrown with both the `NLopt.LD_LBFGS()` and `NLopt.Opt(:LD_LBFGS, 2)` interfaces
    • gradient-free algorithms still work without an AD backend
    • gradient-based algorithms work correctly when AD is provided
  3. docs/src/optimization_packages/multistartoptimization.md: Fixed the documentation example to include an AD backend specification
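
The guard in change 1 has roughly this shape. This is a paraphrase of what the description above says the check does, not a copy of the diff: `alg_requires_grad` is a hypothetical stand-in for however the wrapper detects gradient-based algorithms, and the `Optimization.IncompatibleOptimizerError` qualification is assumed:

```julia
# Sketch only: `alg_requires_grad` is a hypothetical helper, and the error's
# module path and exact wording are assumed; see the actual diff for the real code.
if alg_requires_grad(cache.opt) && isnothing(cache.f.grad)
    throw(Optimization.IncompatibleOptimizerError(
        "The NLopt algorithm $(cache.opt) requires gradients, but no gradient " *
        "function is available. Use `OptimizationFunction` with an AD backend, " *
        "e.g. `OptimizationFunction(f, AutoForwardDiff())`, or provide gradients " *
        "manually via the `grad` kwarg."))
end
```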

Example Error Message

Before this PR:

ERROR: MethodError: objects of type Nothing are not callable

After this PR:

ERROR: The NLopt algorithm LD_LBFGS requires gradients, but no gradient function is available. 
Please use `OptimizationFunction` with an automatic differentiation backend, 
e.g., `OptimizationFunction(f, AutoForwardDiff())`, or provide gradients manually via the `grad` kwarg.

Test Results

All 42 tests pass, including the new test that reproduces the discourse issue.
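
For reference, a sketch of what the new checks cover, mirroring the bullets under Changes. The actual test file may differ; the `Optimization.IncompatibleOptimizerError` qualification, tolerances, and `maxiters` values here are assumptions:

```julia
using Test, Optimization, OptimizationNLopt, ForwardDiff

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

optf_noad = OptimizationFunction(rosenbrock)  # no AD backend
prob_noad = OptimizationProblem(optf_noad, [0.0, 0.0], [1.0, 100.0])

# Gradient-based algorithms now throw the descriptive error, via both interfaces
@test_throws Optimization.IncompatibleOptimizerError solve(prob_noad, NLopt.LD_LBFGS())
@test_throws Optimization.IncompatibleOptimizerError solve(prob_noad, NLopt.Opt(:LD_LBFGS, 2))

# Gradient-free algorithms still run without an AD backend
sol = solve(prob_noad, NLopt.LN_NELDERMEAD(); maxiters = 10_000)
@test sol.u ≈ [1.0, 1.0] atol = 1e-2

# Gradient-based algorithms work once an AD backend is supplied
optf_ad = OptimizationFunction(rosenbrock, AutoForwardDiff())
prob_ad = OptimizationProblem(optf_ad, [0.0, 0.0], [1.0, 100.0])
sol = solve(prob_ad, NLopt.LD_LBFGS(); maxiters = 10_000)
@test sol.u ≈ [1.0, 1.0] atol = 1e-2
```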

🤖 Generated with Claude Code

Co-Authored-By: Claude <noreply@anthropic.com>

ChrisRackauckas merged commit ba726da into SciML:master on Oct 16, 2025
40 of 77 checks passed