
Success when none of the convergence measures is satisfied #806

Closed · adamglos92 opened this issue Apr 17, 2020 · 12 comments

@adamglos92

Hi,
I'm optimizing my function, and I got the following output:

 * Status: success

 * Candidate solution
    Minimizer: [4.09e-01, 5.23e+00, 6.28e+00,  ...]
    Minimum:   8.416426e-01

 * Found with
    Algorithm:     Fminbox with L-BFGS
    Initial Point: [4.54e-01, 6.00e+00, 5.54e+00,  ...]

 * Convergence measures
    |x - x'|               = 0.00e+00 ≰ -1.0e+00
    |x - x'|/|x'|          = 0.00e+00 ≰ -1.0e+00
    |f(x) - f(x')|         = 0.00e+00 ≰ -1.0e+00
    |f(x) - f(x')|/|f(x')| = 0.00e+00 ≰ -1.0e+00
    |g(x)|                 = 1.53e-03 ≰ 1.0e-06

 * Work counters
    Seconds run:   21  (vs limit Inf)
    Iterations:    6
    f(x) calls:    11450
    ∇f(x) calls:   11450

Why is it a successful run when none of the convergence measures is satisfied? I used the following options:

optimizer = Fminbox(LBFGS(alphaguess=InitialStatic(alpha=.001)))
opt = Optim.Options(g_tol=1e-6, x_abstol=-1., x_reltol=-1., f_abstol=-1., f_reltol=-1.)
sol = optimize(fg!, lower_bounds, upper_bounds, init_times, optimizer, opt)

Unfortunately, my function is too long to include here.

@antoine-levitt
Contributor

Sometimes allow_f_increases is the problem, try setting that to true?
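
(For reference, a minimal sketch of that suggestion, reusing the optimizer, bounds, and fg! from the report above; allow_f_increases is an Optim.Options keyword:)

opt = Optim.Options(g_tol=1e-6, x_abstol=-1., x_reltol=-1., f_abstol=-1., f_reltol=-1.,
                    allow_f_increases=true)
sol = optimize(fg!, lower_bounds, upper_bounds, init_times, optimizer, opt)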

@adamglos92
Author

It seems to work! Is it documented anywhere? And does the "bad case" have any meaning?

@antoine-levitt
Contributor

I'm not sure... @pkofod?

@antoine-levitt
Contributor

And we should really make that the default...

@pkofod
Member

pkofod commented Apr 30, 2020

> It seems to work! Is it documented anywhere? And does the "bad case" have any meaning?

It is probably mentioned under options, but I'm not sure it's mentioned specifically anywhere. It should definitely not report success in that case, so if you can provide the original problem, that would be awesome. Also, what version of Optim were you on? There was a bug where, if the line search failed (no decrease could be found), it would print success. You might not have had that bug fix when running this.

> And we should really make that the default...

We should discuss this elsewhere; I'm not sure it's wise. I did change it so that BFGS is only updated if dot(y, s) is positive, but at the very least we don't want to update BFGS on such a step, because an increasing step will definitely not fulfill the Wolfe conditions.
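
(For context, a minimal sketch of such a curvature guard; the names are illustrative, not Optim's internals. The inverse-Hessian BFGS update is skipped whenever the curvature condition dot(s, y) > 0 fails, since applying it would destroy positive definiteness:)

using LinearAlgebra

# Guarded inverse-Hessian BFGS update (illustrative sketch, not Optim.jl code).
# s = x_{k+1} - x_k, y = ∇f(x_{k+1}) - ∇f(x_k).
function bfgs_update!(Hinv, s, y)
    sy = dot(s, y)
    sy > 0 || return Hinv            # curvature condition fails: skip the update
    rho = 1 / sy
    V = I - rho * y * s'             # standard recursion: H⁺ = VᵀHV + ρssᵀ
    Hinv .= V' * Hinv * V + rho * (s * s')
    return Hinv
end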

@antoine-levitt
Contributor

An increasing objective function is not the fault of the user; it's a defect in the algorithm, so we definitely should not stop. Maybe print a warning? With LBFGS I never had any problems with allow_f_increases=true, and I definitely had issues with the default.

@adamglos92
Author

I am pretty sure I was using the current version of Optim. I will try to create a simplified MWE; however, it may take some time.

@adamglos92
Author

Hi, here is the MWE:

using Optim 
using LineSearches: InitialStatic
using Random
## preparation

function obj_fun!(F, G, x, weights::Vector{T}, alpha::Vector{T}) where T<:Real
    @assert length(x) == length(weights) == length(alpha)

    if G !== nothing
        # ∇f: d/dxᵢ [wᵢ cos(αᵢxᵢ)] = -wᵢαᵢ sin(αᵢxᵢ)
        copyto!(G, -weights .* alpha .* sin.(x .* alpha))
    end
    if F !== nothing
        # objective: f(x) = Σᵢ wᵢ cos(αᵢxᵢ)
        return sum(weights .* cos.(x .* alpha))
    end
    nothing
end

Random.seed!(1)
n = 20
w = rand(n)
alpha = collect(0.:(n-1))

fg! = Optim.only_fg!((F, G, x) -> obj_fun!(F, G, x, w, alpha))

lower_bounds = fill(0., n)
upper_bounds = fill(Float64(pi), n)
init_times = rand(n) .* upper_bounds

##

optimizer = Fminbox(LBFGS(alphaguess=InitialStatic(alpha=.001)))
opt = Optim.Options(g_tol=1e-6, x_abstol=-1., x_reltol=-1., f_abstol=-1., f_reltol=-1.)
sol = optimize(fg!, lower_bounds, upper_bounds, init_times, optimizer, opt)

@pkofod
Member

pkofod commented Apr 30, 2020

> An increasing objective function is not the fault of the user; it's a defect in the algorithm, so we definitely should not stop. Maybe print a warning? With LBFGS I never had any problems with allow_f_increases=true, and I definitely had issues with the default.

I'm not sure why we should only stop if it's the fault of the user. Sometimes numerical algorithms fail. But we can try it 🤷‍♂️; some sort of benchmarking would be nice, though.

> Hi, here is the MWE:

Thanks, so an outer iteration "fails" to decrease the objective here. That might be ok.

@antoine-levitt
Contributor

> I'm not sure why we should only stop if it's the fault of the user. Sometimes numerical algorithms fail.

Not smooth optimization algorithms. There's really no excuse to fail if, e.g., fixed-step gradient descent works. Of course that's tricky to get right, but that's the aim.
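
(To illustrate the baseline invoked here, a hypothetical fixed-step gradient descent sketch, not an Optim API; for a smooth f with a Lipschitz gradient, a small enough fixed step decreases f monotonically:)

# Illustrative fixed-step gradient descent; g! writes ∇f(x) into its first argument.
function fixed_step_gd(g!, x0; step=1e-3, iters=10_000)
    x = copy(x0)
    g = similar(x)
    for _ in 1:iters
        g!(g, x)              # evaluate the gradient at the current point
        x .-= step .* g       # fixed-step descent update
    end
    return x
end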

@pkofod
Member

pkofod commented May 4, 2020

> Not smooth optimization algorithms. There's really no excuse to fail if, e.g., fixed-step gradient descent works. Of course that's tricky to get right, but that's the aim.

Yeah I get what you're saying, and I agree.

@pkofod
Member

pkofod commented Sep 5, 2020

I switched to allowing increases.

@pkofod pkofod closed this as completed Sep 5, 2020