Description
I am trying to get Turing to work for Bayesian parameter estimation in combination with DiffEq. Unfortunately, I cannot really get the two to mesh properly. In this case I can go through the full process, however the output does not look as expected (i.e. the posterior distributions are very far from the true values).
I am not super familiar with what is going on (and have little experience with Bayesian methods), so I might be making a simple mistake. Either way, a hint on whether something like this is as bad as I presume would be useful!
I start by declaring the model and generating some synthetic data for which I know the parameters (two for the model and one for the noise distribution).
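For reference, the sir function below implements dS/dt = -γ*S*I, dI/dt = γ*S*I - ν*I, dR/dt = ν*I, and each synthetic measurement of I is drawn as Normal(I, σI*I), so the measurement noise scales with I.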
# Fetch packages
using Distributions
using Turing
using OrdinaryDiffEq
using StatsPlots
using Random
Random.seed!(14);
# Define SIR model.
function sir(du, u, p, t)
    S, I, R = u
    γ, ν = p
    infection = γ * S * I
    recovery = ν * I
    du[1] = -infection
    du[2] = infection - recovery
    du[3] = recovery
    return nothing
end
# Create ODEProblem for true model parameters.
u0_known = [999.0, 1.0, 0.0]
tend_known = 100.0
γ_true = 0.0003; ν_true = 0.1; σI_true = 0.2;
p_true = [γ_true, ν_true]
oprob_true = ODEProblem(sir, u0_known, tend_known, p_true)
# Generate synthetic data (and plot it).
sol_true = solve(oprob_true)
measured_t = 5:5:100
measured_I = [rand(Normal(I, σI_true*I)) for I in sol_true(measured_t, idxs = 2).u]
plot(sol_true; label = "Vals (true)")
plot!(measured_t, measured_I; label = "Vals (measured)", seriestype = :scatter)
Next, I create and run a Turing model.
# Creates Turing model.
@model function fit_sir(data, prob)
    # Sets prior distributions.
    γ ~ LogUniform(0.00001, 0.001)
    ν ~ LogUniform(0.01, 0.9)
    σI ~ LogUniform(0.1, 1)
    # Simulate the model.
    prob = remake(prob; p = [γ, ν])
    predicted = solve(prob; saveat = measured_t, verbose = false, maxiters = 10000)
    # If simulation was unsuccessful, the likelihood is -Inf.
    if !SciMLBase.successful_retcode(predicted)
        Turing.@addlogprob! -Inf
        return nothing
    end
    # Computes the likelihood of the observations.
    for i in eachindex(predicted)
        I = max(predicted[i][2], 0.0)
        data[i] ~ Normal(I, σI * I)
    end
    return nothing
end
# Fit model to data.
model = fit_sir(sol_true[2,:], oprob_true)
chain = sample(model, NUTS(), MCMCSerial(), 100000, 3; progress = false)
Finally, I plot the posterior distributions, including the true values:
# Plots the result of the fits (doesn't look correct).
plt1 = begin
    plt = plot(plot(chain)[1,2])
    plot!([γ_true, γ_true], collect(ylims(plt)); color = :black, lw = 5, la = 0.5)
end
plt2 = begin
    plt = plot(plot(chain)[2,2])
    plot!([ν_true, ν_true], collect(ylims(plt)); color = :black, lw = 5, la = 0.5)
end
plt3 = begin
    plt = plot(plot(chain)[3,2])
    plot!([σI_true, σI_true], collect(ylims(plt)); color = :black, lw = 5, la = 0.5)
end
plot(plt1, plt2, plt3; layout = (1,3), size = (1200, 400), title = ["γ" "ν" "σI"], yguide = "")
I also tried seeding the sampler with initial values close to the correct ones, but this did not help:
model = fit_sir(sol_true[2,:], oprob_true)
initial_params = fill([0.0005, 0.05, 0.35], 3)
chain = sample(model, NUTS(), MCMCSerial(), 100000, 3; progress = false, initial_params)
plot(chain)
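For reference, here is the same attempt but with every chain started exactly at the declared true values (a minimal sketch, assuming initial_params takes one [γ, ν, σI] vector per chain, as in the snippet above):
# Variant: seed every chain exactly at the true parameter values defined earlier.
model = fit_sir(sol_true[2,:], oprob_true)
initial_params = fill([γ_true, ν_true, σI_true], 3)
chain = sample(model, NUTS(), MCMCSerial(), 100000, 3; progress = false, initial_params)
plot(chain)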

