patch 1.7.4 -> 1.7.5 breaks Accelerated/MomentumGradientDescent() #1038

Closed · NilsNiggemann opened this issue Apr 27, 2023 · 3 comments · Fixed by #1043

NilsNiggemann commented Apr 27, 2023

MWE: this works on 1.7.4 but errors on 1.7.5. The same error also occurs for MomentumGradientDescent(), but not for the other solvers.

julia> using Optim

julia> g(x) = -exp(-(x - pi)^2)

julia> optimize(x -> g(x[1]), [0.0], method = AcceleratedGradientDescent())
ERROR: BoundsError: attempt to access Bool at index [2]
Stacktrace:
 [1] getindex(x::Bool, i::Int64)
   @ Base ./number.jl:98
 [2] add_default_opts!(opts::Dict{Symbol, Any}, method::AcceleratedGradientDescent{LineSearches.InitialPrevious{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}})
   @ Optim ~/.julia/packages/Optim/29per/src/multivariate/optimize/interface.jl:24
 [3] optimize(f::Function, initial_x::Vector{Float64}; inplace::Bool, autodiff::Symbol, kwargs::Base.Pairs{Symbol, AcceleratedGradientDescent{LineSearches.InitialPrevious{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}}, Tuple{Symbol}, NamedTuple{(:method,), Tuple{AcceleratedGradientDescent{LineSearches.InitialPrevious{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}}}}})
   @ Optim ~/.julia/packages/Optim/29per/src/multivariate/optimize/interface.jl:88
 [4] top-level scope
   @ REPL[6]:1

Edit: also reproduced on Julia 1.8.5 and 1.9.0-rc1.
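
For context on the odd-looking error (my reading of the stack trace, not verified against Optim's source): Julia treats numbers, including Bool, as one-element collections, so index 1 is legal but anything past it throws a BoundsError. Any code in add_default_opts! that expects an array-valued field but receives a Bool and indexes it at [2] fails in exactly this way:

julia> true[1]    # a scalar behaves like a one-element collection
true

julia> true[2]    # indexing past 1 raises the BoundsError from the report
ERROR: BoundsError: attempt to access Bool at index [2]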

pkofod (Member) commented Jun 12, 2023

julia> optimize(x->g(x[1]),[0.], AcceleratedGradientDescent())
 * Status: success

 * Candidate solution
    Final objective value:     -1.000000e+00

 * Found with
    Algorithm:     Accelerated Gradient Descent

 * Convergence measures
    |x - x'|               = 1.13e-07 ≰ 0.0e+00
    |x - x'|/|x'|          = 3.59e-08 ≰ 0.0e+00
    |f(x) - f(x')|         = 1.28e-14 ≰ 0.0e+00
    |f(x) - f(x')|/|f(x')| = 1.28e-14 ≰ 0.0e+00
    |g(x)|                 = 0.00e+00 ≤ 1.0e-08

 * Work counters
    Seconds run:   0  (vs limit Inf)
    Iterations:    5
    f(x) calls:    36
    ∇f(x) calls:   36

Strange, it only happens when you pass the method via the method keyword.

pkofod (Member) commented Jun 12, 2023

Of course, I get it now. The function at fault, add_default_opts!, is only called in the keyword-argument version of optimize.
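
A rough sketch of why only the keyword path fails (hypothetical and heavily simplified; only add_default_opts! and the two optimize entry points are taken from the stack trace, every other name is illustrative):

# Stand-ins for the real internals, just to make the sketch runnable.
struct SketchMethod end
solve(f, x0, method) = "solver runs here"
add_default_opts!(opts, method) = (opts[:example] = true; opts)

# Positional path: the method goes straight to the solver,
# so the faulty option handling never runs.
optimize_sketch(f, x0, method) = solve(f, x0, method)

# Keyword path: default options are injected into a Dict first;
# in Optim 1.7.5 the real add_default_opts! indexed a Bool at [2] here.
function optimize_sketch(f, x0; method)
    opts = Dict{Symbol,Any}()
    add_default_opts!(opts, method)
    return optimize_sketch(f, x0, method)
end

That is consistent with the successful run above, where the method was passed positionally.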

pkofod (Member) commented Jun 12, 2023

I'll post a patch release.
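
Until that release is out, passing the method positionally (as in the run above) sidesteps the bug. Alternatively, one can hold Optim at the last unaffected version with standard Pkg commands (my suggestion, not from the thread):

(@v1.9) pkg> pin Optim@1.7.4    # stay on the release before the regression

(@v1.9) pkg> free Optim         # unpin once the patch release ships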
