
Link to DifferentialEquations.jl? #12

Open
ChrisRackauckas opened this issue Nov 5, 2020 · 12 comments
@ChrisRackauckas

Cool project! I think it would be great if simulation-based methods could be supported here by hooking into DifferentialEquations.jl and its DiffEqFlux fast adjoint schemes (https://diffeqflux.sciml.ai/dev/). This usually ends up a bit faster than discretize-then-optimize techniques (when the adjoints are mixed with automatic differentiation), so it would be a good technique to mix in.
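
For concreteness, a minimal sketch of what that hookup could look like (the toy two-state model and all names here are illustrative, not taken from either package's examples): differentiate through the ODE solve with a continuous adjoint, using AD for the vector-Jacobian products.

using OrdinaryDiffEq, DiffEqSensitivity, Zygote

function rhs!(du, u, p, t)
    du[1] = -p[1]*u[1] + p[2]*u[2]
    du[2] =  p[1]*u[1] - p[2]*u[2]
end

# generate mock data with the true parameters
prob = ODEProblem(rhs!, [1.0, 0.0], (0.0, 10.0), [0.8, 0.6])
data = Array(solve(prob, Tsit5(), saveat=0.1))

# L2 loss; the adjoint sensealg keeps the gradient cost scaling well in the
# number of parameters
function loss(p)
    sol = solve(prob, Tsit5(), p=p, saveat=0.1,
                sensealg=InterpolatingAdjoint(autojacvec=ZygoteVJP()))
    sum(abs2, Array(sol) .- data)
end

grad = Zygote.gradient(loss, [1.0, 1.0])[1]  # feed to any gradient-based optimizer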

@sshin23 (Collaborator) commented Nov 6, 2020

Thanks for the great suggestion, @ChrisRackauckas!

It would be nice to have an option to formulate the problem as a simulation-based problem using the adjoint evaluation capability, and DifferentialEquations.jl seems like the most convenient package for doing this 🙂 We can't promise anything at this point, but this looks like a great long-term plan.

Do you have example code for hooking up DifferentialEquations.jl with NLP solvers via JuMP?

@ChrisRackauckas (Author)

https://diffeq.sciml.ai/stable/analysis/parameter_estimation/#Using-JuMP-with-DiffEqParamEstim is an example, but it uses forward-mode AD, so it won't scale as well to huge problems. Mixing it with https://diffeq.sciml.ai/stable/analysis/sensitivity/#solve-Differentiation-Examples for the gradient definition is what you'd want.
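
A hedged sketch of that combination (simulate_loss is a hypothetical stand-in for an adjoint-backed loss like the one in the sensitivity docs; prob and mock_data are assumed defined as elsewhere in this thread): register the objective with JuMP together with a user-supplied gradient, instead of autodiff=true.

using OrdinaryDiffEq, DiffEqSensitivity, JuMP, NLopt, Zygote

# hypothetical adjoint-backed loss; `prob` and `mock_data` assumed defined
function simulate_loss(k1, k2)
    sol = solve(prob, Tsit5(), p=[k1, k2], saveat=0.1,
                sensealg=InterpolatingAdjoint(autojacvec=ZygoteVJP()))
    sum(abs2, Array(sol) .- mock_data)
end

# JuMP's multivariate gradient callback fills `g` in place
function simulate_loss_grad!(g, k1, k2)
    gk1, gk2 = Zygote.gradient(simulate_loss, k1, k2)
    g[1] = gk1
    g[2] = gk2
end

jumodel = Model(NLopt.Optimizer)
set_optimizer_attribute(jumodel, "algorithm", :LD_MMA)
JuMP.register(jumodel, :obj, 2, simulate_loss, simulate_loss_grad!)
@variable(jumodel, k1v, start=1.0)
@variable(jumodel, k2v, start=1.0)
@NLobjective(jumodel, Min, obj(k1v, k2v))
optimize!(jumodel)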

@paulflang (Owner)

Thanks a lot for your interest, @ChrisRackauckas. I think your suggestion on Twitter to use a two-step approach could make SBML2Julia much more powerful.

The current Julia code output of SBML2Julia looks like the tests/fixtures/jl_code_gold.jl file. Do you think you could contribute fixtures such as tests/fixtures/mtk_code_gold.jl (or tests/fixtures/catalyst_code_gold.jl), tests/fixtures/JuMP_code_gold.jl, tests/fixtures/optim_code_gold.jl, etc. to the feature_mtk branch, each solving the optimization problem specified in tests/fixtures/jl_code_gold.jl? You will probably see from the code structure that I know very little Julia, so please feel free to make improvements. I could then take it from there and modify the Python part to auto-generate the MTK, JuMP, Optim, etc. models.

@paulflang (Owner)

I am just trying to optimize a simple ODE system specified in MTK with JuMP:

using ModelingToolkit
using OrdinaryDiffEq, DiffEqParamEstim, JuMP, NLopt

t_max = 10
dt = 0.1

t_out = 0:dt:t_max
tspan = (0,t_max)

@ModelingToolkit.parameters t, k1, k2
@ModelingToolkit.variables A(t) B(t)
@ModelingToolkit.derivatives D'~t

parameters = [k1 => 0.8, k2 => 0.6]
initial_conditions = [A => 1,
                      B => 0]
eqs = [D(A) ~ -k1*A + k2*B,
       D(B) ~ k1*A - k2*B]
sys = ODESystem(eqs)

model_ode(p_) = OrdinaryDiffEq.ODEProblem(sys,initial_conditions,tspan,p_,jac=true)
solve_model(mp_) = OrdinaryDiffEq.solve(model_ode(mp_), Tsit5(),saveat=dt)
mock_data = Array(solve_model([0.8, 0.6]))
loss_objective(mp_, dat) = build_loss_objective(model_ode(mp_), Tsit5(), L2Loss(t_out,dat))

juobj(args...) = loss_objective(args, mock_data)(args)
jumodel = Model()
JuMP.register(jumodel, :juobj, 2, juobj, autodiff=true)
@JuMP.variable(jumodel, k1, start=1)
@JuMP.variable(jumodel, k2, start=1)
@JuMP.NLobjective(jumodel, Min, juobj(k1, k2))
set_optimizer(jumodel, NLopt.Optimizer)
set_optimizer_attribute(jumodel, "algorithm", :LD_MMA)
JuMP.optimize!(jumodel)

Unfortunately, I get the following error:
ERROR: MethodError: no method matching lower_mapnames(::Tuple{Float64,Float64})
I am using Julia v1.5.3 and added the packages just yesterday. Do you have any ideas how to resolve this?

@ChrisRackauckas (Author)

JuMP turns parameters into tuples, so you probably don't want to use JuMP if the number of parameters gets large. That said, here's a working version of that code:

using ModelingToolkit
using OrdinaryDiffEq, DiffEqParamEstim, JuMP, NLopt

t_max = 10.0
dt = 0.1

t_out = 0:dt:t_max
tspan = (0,t_max)

@ModelingToolkit.parameters t, k1, k2
@ModelingToolkit.variables A(t) B(t)
@ModelingToolkit.derivatives D'~t

parameters = [k1 => 0.8, k2 => 0.6]
initial_conditions = [A => 1,
                      B => 0]
eqs = [D(A) ~ -k1*A + k2*B,
       D(B) ~ k1*A - k2*B]
sys = ODESystem(eqs)

model_ode(p_) = OrdinaryDiffEq.ODEProblem(sys,initial_conditions,tspan,p_,jac=true)
solve_model(mp_) = OrdinaryDiffEq.solve(model_ode(mp_), Tsit5(),saveat=dt)
mock_data = Array(solve_model([0.8, 0.6]))
# `collect` turns the tuple JuMP passes into the Vector the problem constructor expects
loss_objective(mp_, dat) = build_loss_objective(model_ode(collect(mp_)), Tsit5(), L2Loss(t_out,dat))

juobj(args...) = loss_objective(args, mock_data)(args)
jumodel = Model()
JuMP.register(jumodel, :juobj, 2, juobj, autodiff=true)
@JuMP.variable(jumodel, k1, start=1)
@JuMP.variable(jumodel, k2, start=1)
@JuMP.NLobjective(jumodel, Min, juobj(k1, k2))
set_optimizer(jumodel, NLopt.Optimizer)
set_optimizer_attribute(jumodel, "algorithm", :LD_MMA)
JuMP.optimize!(jumodel)

The other change was to make time floating-point valued, since integer-valued time usually isn't a great idea. Hopefully that helps!
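
The collect change is the key one: JuMP splats the arguments of a registered function as a tuple, while the problem construction expects a parameter vector. A quick REPL check of the difference:

julia> args = (0.8, 0.6)   # what JuMP passes to a registered function
(0.8, 0.6)

julia> collect(args)       # the Vector that the ODEProblem constructor handles
2-element Array{Float64,1}:
 0.8
 0.6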

@paulflang (Owner)

Thanks! That helped a lot.

@paulflang (Owner)

I cannot find a way to get multiple experimental conditions to work. For instance, what I tried is:

mock_data_c1 = Array(solve_model([0.8, 0.6]))
mock_data_c2 = Array(solve_model([0.8, 0.5]))
loss_objective(mp_, dat) = build_loss_objective(model_ode(collect(mp_)), Tsit5(), L2Loss(t_out,dat))

juobj(args1, args2) = loss_objective(args1, mock_data_c1)(args1) + loss_objective(args2, mock_data_c2)(args2)

jumodel = Model()
JuMP.register(jumodel, :juobj, 2, juobj, autodiff=true)
@JuMP.variable(jumodel, k1_1, start=1)
@JuMP.variable(jumodel, k2_1, start=1)
@JuMP.variable(jumodel, k1_2, start=1)
@JuMP.variable(jumodel, k2_2, start=1)
@JuMP.NLobjective(jumodel, Min, juobj((k1_1, k2_1), (k1_2, k2_2)))

But I get the following error:

ERROR: Unexpected object (k1, k2) (of type Tuple{VariableRef,VariableRef} in nonlinear expression.
Stacktrace:
 [1] error(::String) at ./error.jl:33
 [2] _parse_NL_expr_runtime(::Model, ::Tuple{VariableRef,VariableRef}, ::Array{JuMP._Derivatives.NodeData,1}, ::Int64, ::Array{Float64,1}) at /home/paul/.julia/packages/JuMP/qhoVb/src/parse_nlp.jl:223
 [3] top-level scope at /home/paul/.julia/packages/JuMP/qhoVb/src/parse_nlp.jl:115
 [4] top-level scope at /home/paul/.julia/packages/JuMP/qhoVb/src/macros.jl:1368
 [5] top-level scope at REPL[23]:1

If anyone knows how to solve this, please let me know.

@ChrisRackauckas (Author)

I would ask on #jump-dev, as it's more of a JuMP question.

@paulflang (Owner) commented Jan 14, 2021

Oh, I see. Thanks. I actually wanted to write an objective function that includes multiple experimental conditions and can be fed to any optimizer, not only JuMP. Is there really no way to do this with build_loss_objective?

@ChrisRackauckas (Author)

build_loss_objective isn't the issue here (and you could, and probably should, use DiffEqFlux.jl's more expressive form anyway). The issue is that this is an ill-formed JuMP expression, so you couldn't write it that way even with one experimental condition. As to why it's ill-formed, that's a JuMP question, and you might want to ask a JuMP dev.
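
For what it's worth, JuMP's nonlinear interface only accepts scalar arguments in registered functions, which is what the "Unexpected object (k1, k2)" error is complaining about. An untested sketch of the two-condition objective written in that form, reusing the definitions from the earlier comment (jumodel, the four variables, mock_data_c1, and mock_data_c2 as above):

# register a single function of four scalar arguments instead of passing tuples
juobj(k1_1, k2_1, k1_2, k2_2) =
    loss_objective((k1_1, k2_1), mock_data_c1)((k1_1, k2_1)) +
    loss_objective((k1_2, k2_2), mock_data_c2)((k1_2, k2_2))

JuMP.register(jumodel, :juobj, 4, juobj, autodiff=true)
@JuMP.NLobjective(jumodel, Min, juobj(k1_1, k2_1, k1_2, k2_2))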

@paulflang (Owner)

A little update from my side: I tried to do the first step of the two-step approach and wrote SbmlInterface.jl, which converts SBML models to MTK and simulates them. Thanks a lot for your help on Slack, @ChrisRackauckas. The next plan is to interface with optimizers.

@ChrisRackauckas (Author)

That is fantastic!
