SciML/SciMLSensitivity.jl #408 (Closed)

Description
Training via DiffEqFlux fails when the save_idxs keyword is used inside the loss function. Differentiation appears to be the problem; evaluating the loss function itself works fine.
The MWE below is taken from the docs.
A related discussion can be found on the Julia Discourse.
using DifferentialEquations, Flux, Optim, DiffEqFlux
function lotka_volterra!(du, u, p, t)
    x, y = u
    α, β, δ, γ = p
    du[1] = dx = α*x - β*x*y
    du[2] = dy = -δ*y + γ*x*y
end
# Initial condition
u0 = [1.0, 1.0]
# Simulation interval and intermediary points
tspan = (0.0, 10.0)
tsteps = 0.0:0.1:10.0
# LV equation parameter. p = [α, β, δ, γ]
p = [1.5, 1.0, 3.0, 1.0]
# Setup the ODE problem, then solve
prob = ODEProblem(lotka_volterra!, u0, tspan, p)
function loss(p)
    sol = solve(prob, Tsit5(), p = p, save_idxs = [2], saveat = tsteps)
    loss = sum(abs2, sol .- 1)
    return loss, sol
end
result_ode = DiffEqFlux.sciml_train(loss, p, ADAM(0.1), maxiters = 100)
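Evaluating loss(p) on its own works; the error only shows up once the loss is differentiated. A minimal sketch of how I would reproduce the failure without sciml_train, assuming Zygote is the AD backend used here:

using Zygote

loss(p)                                   # evaluation works: returns (loss value, solution)
Zygote.gradient(p -> first(loss(p)), p)   # differentiation fails when save_idxs is passed to solve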
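As a possible workaround until this is fixed, here is a sketch that drops save_idxs from the solve call and selects the second state inside the loss instead; loss_noidxs is just a name I made up, and I am assuming that indexing the full solution array differentiates fine:

# Hypothetical workaround: save all states, then pick out the second state in the loss.
function loss_noidxs(p)
    sol = solve(prob, Tsit5(), p = p, saveat = tsteps)
    l = sum(abs2, Array(sol)[2, :] .- 1)   # same quantity as save_idxs = [2]
    return l, sol
end

result_ode = DiffEqFlux.sciml_train(loss_noidxs, p, ADAM(0.1), maxiters = 100)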