Remove deprecations for breaking release #715
Conversation
Force-pushed from fd9cda7 to 3d72670
@Abhishek-1Bhatt can you take this one from here? All that's left is to update the tests in the same way the tutorials are being updated (many of the tests and examples should be the same).
Sure 👍
Issues related to
Force-pushed from 1bc7702 to 30e09f3
src/hnn.jl (outdated)

    end
    return new{typeof(model), typeof(re), typeof(p)}(model, re, p)
This is missing Lux compatibility dispatches.
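To make the review comment concrete, here is one possible shape such a dispatch could take. This is an illustrative sketch only, not the PR's actual code: the constructor name, the keyword, and storing `nothing` in place of `re` are assumptions; `Lux.AbstractExplicitLayer` is Lux's layer supertype at the time of this PR.

```julia
# Illustrative only: a Lux constructor dispatch alongside the Flux `re`-based path.
# Lux models carry no implicit parameters, so there is nothing to destructure and
# `nothing` is stored where the Flux path stores the restructure closure `re`.
function HamiltonianNN(model::Lux.AbstractExplicitLayer; p = nothing)
    return new{typeof(model), Nothing, typeof(p)}(model, nothing, p)
end
```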
@@ -11,19 +11,16 @@ derivatives of the loss backwards in time.

```julia
NeuralODE(model,tspan,alg=nothing,args...;kwargs...)
```
Need to mention Lux
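For context, a sketch of what a Lux-based usage example in that docstring might look like. This assumes the explicit-parameter workflow this PR migrates to (`Lux.setup` returning parameters and state, and the layer called as `(input, ps, st)`); the layer sizes, `tspan`, and `saveat` values are made up for illustration.

```julia
# Sketch only: assumes the Lux explicit-parameter API targeted by this PR.
using Lux, DiffEqFlux, OrdinaryDiffEq, Random

model = Lux.Chain(Lux.Dense(2, 16, tanh), Lux.Dense(16, 2))
ps, st = Lux.setup(Random.default_rng(), model)  # explicit parameters and state

node = NeuralODE(model, (0.0f0, 1.0f0), Tsit5(); saveat = 0.1f0)
u0 = Float32[1.0, 0.0]
sol, st = node(u0, ps, st)  # Lux-style call: (input, parameters, state)
```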
Did a lot of issue cleanup. Most were solved already.
Still doesn't run the full CI. Also, there are some conflicts with master; I can't see what the conflicts are because of access issues. We may also need to merge master into this branch, since it says it's 10 commits behind.
Co-authored-by: Abhishek Bhatt <46929125+Abhishek-1Bhatt@users.noreply.github.com>
This line errors in the Newton NeuralODE test (DiffEqFlux.jl/test/newton_neural_ode.jl, line 51 at c48a0e1):
Is this a compatibility issue with Lux?

```
ERROR: MethodError: no method matching initial_state(::Optim.KrylovTrustRegion{Float64}, ::Optim.Options{Float64, OptimizationOptimJL.var"#_cb#11"{var"#5#6", Optim.KrylovTrustRegion{Float64}, Base.Iterators.Cycle{Tuple{Optimization.NullData}}}}, ::NLSolversBase.TwiceDifferentiableHV{Float32, ComponentArrays.ComponentVector{Float32, Vector{Float32}, Tuple{ComponentArrays.Axis{(layer_1 = ViewAxis(1:10, Axis(weight = ViewAxis(1:5, ShapedAxis((5, 1), NamedTuple())), bias = ViewAxis(6:10, ShapedAxis((5, 1), NamedTuple())))), layer_2 = ViewAxis(11:16, Axis(weight = ViewAxis(1:5, ShapedAxis((1, 5), NamedTuple())), bias = ViewAxis(6:6, ShapedAxis((1, 1), NamedTuple())))))}}}, ComponentArrays.ComponentVector{Float32, Vector{Float32}, Tuple{ComponentArrays.Axis{(layer_1 = ViewAxis(1:10, Axis(weight = ViewAxis(1:5, ShapedAxis((5, 1), NamedTuple())), bias = ViewAxis(6:10, ShapedAxis((5, 1), NamedTuple())))), layer_2 = ViewAxis(11:16, Axis(weight = ViewAxis(1:5, ShapedAxis((1, 5), NamedTuple())), bias = ViewAxis(6:6, ShapedAxis((1, 1), NamedTuple())))))}}}, ComponentArrays.ComponentVector{Float32, Vector{Float32}, Tuple{ComponentArrays.Axis{(layer_1 = ViewAxis(1:10, Axis(weight = ViewAxis(1:5, ShapedAxis((5, 1), NamedTuple())), bias = ViewAxis(6:10, ShapedAxis((5, 1), NamedTuple())))), layer_2 = ViewAxis(11:16, Axis(weight = ViewAxis(1:5, ShapedAxis((1, 5), NamedTuple())), bias = ViewAxis(6:6, ShapedAxis((1, 1), NamedTuple())))))}}}}, ::ComponentArrays.ComponentVector{Float32, Vector{Float32}, Tuple{ComponentArrays.Axis{(layer_1 = ViewAxis(1:10, Axis(weight = ViewAxis(1:5, ShapedAxis((5, 1), NamedTuple())), bias = ViewAxis(6:10, ShapedAxis((5, 1), NamedTuple())))), layer_2 = ViewAxis(11:16, Axis(weight = ViewAxis(1:5, ShapedAxis((1, 5), NamedTuple())), bias = ViewAxis(6:6, ShapedAxis((1, 1), NamedTuple())))))}}})
Closest candidates are:
  initial_state(::AcceleratedGradientDescent, ::Any, ::Any, ::AbstractArray{T}) where T at C:\Users\user\.julia\packages\Optim\6Lpjy\src\multivariate\solvers\first_order\accelerated_gradient_descent.jl:35
  initial_state(::Optim.KrylovTrustRegion, ::Any, ::Any, ::Array{T}) where T at C:\Users\user\.julia\packages\Optim\6Lpjy\src\multivariate\solvers\second_order\krylov_trust_region.jl:39
  initial_state(::SimulatedAnnealing, ::Any, ::Any, ::AbstractArray{T}) where T at C:\Users\user\.julia\packages\Optim\6Lpjy\src\multivariate\solvers\zeroth_order\simulated_annealing.jl:62
```
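The "Closest candidates" in that trace point at the likely cause: Optim's `KrylovTrustRegion` method is restricted to `::Array{T}`, while the Lux path passes parameters as a `ComponentArrays.ComponentVector`, which subtypes `AbstractArray` but not `Array`, so no method matches. A minimal sketch of that dispatch mismatch (the function `f` here is illustrative, not Optim's actual code):

```julia
# Illustrative only: mirrors the ::Array{T} restriction from the error above.
using ComponentArrays

f(x::Array{T}) where {T} = :matched            # like KrylovTrustRegion's method
f(x::AbstractArray{T}) where {T} = :fallback   # the other solvers accept this

cv = ComponentVector(a = Float32[1, 2])
cv isa AbstractArray  # true
cv isa Array          # false, so the ::Array{T} method cannot apply
f(cv)                 # only matches the AbstractArray method; Optim's
                      # KrylovTrustRegion has no such fallback, hence the MethodError
```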
See if you can isolate it to
What would be the best way to allow
Just use Flux.jl for that for now. The real answer is an upstream fix to Optim.jl, but that shouldn't block this.
Just a heads up here. I will be deprecating quite a few functionalities (mostly these were undocumented, but they ended up in user code) in v0.4.8, for removal in v0.5 (see https://github.com/avik-pal/Lux.jl/blob/ap/tests/src/deprecated.jl). It might be worthwhile to avoid using these (most notably
Continued in #794
Fixes #707