
Commit

Fix package extensions; Clean up tests; Remove use of threadid; Move figures into docs folder; Add a better, shorter, PDE example; remove GeneralLazyBufferCache (#87)
DanielVandH committed Jun 15, 2023
1 parent 285a34f commit 919661e
Showing 69 changed files with 4,126 additions and 4,733 deletions.
5 changes: 0 additions & 5 deletions .github/workflows/CI.yml
@@ -21,11 +21,6 @@ jobs:
matrix:
version:
- '1'
- '1.6'
- '1.7'
- '1.8'
- '1.9'
- 'nightly'
os:
- ubuntu-latest
arch:
6 changes: 4 additions & 2 deletions Project.toml
@@ -23,8 +23,8 @@ DelaunayTriangulation = "927a84f5-c5f4-47a5-9785-b46e178433df"
Makie = "ee78f7c6-11fb-53f2-987a-cfe4a2b5a57a"

[extensions]
ProfileLikelihoodMakieExt = "Makie"
ProfileLikelihoodDelaunayTriangulationExt = "DelaunayTriangulation"
ProfileLikelihoodMakieExt = "Makie"

[compat]
ChunkSplitters = "1.0"
@@ -42,7 +42,9 @@ StatsFuns = "1.1, 1.3"
julia = "1"

[extras]
DelaunayTriangulation = "927a84f5-c5f4-47a5-9785-b46e178433df"
Makie = "ee78f7c6-11fb-53f2-987a-cfe4a2b5a57a"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"

[targets]
test = ["Test"]
test = ["Test", "DelaunayTriangulation", "Makie"]
6 changes: 4 additions & 2 deletions README.md
@@ -2,8 +2,10 @@


[![DOI](https://zenodo.org/badge/508701126.svg)](https://zenodo.org/badge/latestdoi/508701126)
[![](https://img.shields.io/badge/docs-dev-blue.svg)](https://DanielVandH.github.io/ProfileLikelihood.jl/dev)
[![](https://img.shields.io/badge/docs-stable-blue.svg)](https://DanielVandH.github.io/ProfileLikelihood.jl/stable)
[![Dev](https://img.shields.io/badge/docs-dev-blue.svg)](https://DanielVandH.github.io/ProfileLikelihood.jl/dev)
[![Stable](https://img.shields.io/badge/docs-stable-blue.svg)](https://DanielVandH.github.io/ProfileLikelihood.jl/stable)
[![Build Status](https://github.com/DanielVandH/ProfileLikelihood.jl/actions/workflows/CI.yml/badge.svg?branch=main)](https://github.com/DanielVandH/ProfileLikelihood.jl/actions/workflows/CI.yml?query=branch%3Amain)
[![Coverage](https://codecov.io/gh/DanielVandH/ProfileLikelihood.jl/branch/main/graph/badge.svg)](https://codecov.io/gh/DanielVandH/ProfileLikelihood.jl)

This package defines the routines required for computing maximum likelihood estimates and profile likelihoods. The optimisation routines are built around the [Optimization.jl](https://github.com/SciML/Optimization.jl) interface, allowing us to easily switch between algorithms or between finite differences and automatic differentiation, and to define constraints with ease. Below we list the definitions we are using for likelihoods and profile likelihoods. This code works for univariate and bivariate profiles.
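
For a sense of the workflow, here is a minimal sketch: a toy Gaussian likelihood is defined, an MLE is computed, and the parameters are then profiled. The toy data, bounds, and keyword choices below are illustrative only and follow the pattern of the documented examples rather than reproducing any of them.

```julia
using ProfileLikelihood, Optimization, OptimizationOptimJL, OptimizationNLopt

data = 2.0 .+ 0.5 .* randn(200)          # toy observations from Normal(μ = 2, σ = 0.5)
loglik = (θ, p) -> begin                 # ℓ(θ, p), where p carries the data
    μ, σ = θ
    return -0.5 * sum(abs2, p .- μ) / σ^2 - length(p) * log(σ)
end
prob = LikelihoodProblem(loglik, [0.0, 1.0]; syms=[:μ, :σ], data=data,
    f_kwargs=(adtype=Optimization.AutoFiniteDiff(),),
    prob_kwargs=(lb=[-10.0, 1e-6], ub=[10.0, 10.0]))
sol = mle(prob, Optim.LBFGS())                       # maximum likelihood estimates
prof = profile(prob, sol; alg=NLopt.LN_NELDERMEAD)   # univariate profile likelihoods
get_confidence_intervals(prof, :μ)                   # ≈95% confidence interval for μ
```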

4 changes: 2 additions & 2 deletions docs/make.jl
@@ -21,8 +21,8 @@ makedocs(
"Example I: Multiple linear regression" => "regression.md"
"Example II: Logistic ordinary differential equation" => "logistic.md"
"Example III: Linear exponential ODE and grid searching" => "exponential.md"
"Example IV: Diffusion equation on a square plate" => "heat.md"
"Example V: Lotka-Volterra ODE, GeneralLazyBufferCache, and computing bivarate profile likelihoods" => "lotka.md"
"Example IV: Lotka-Volterra ODE and computing bivarate profile likelihoods" => "lotka.md"
"Example V: Fisher-Stefan PDE" => "stefan.md"
"Mathematical and Implementation Details" => "math.md"
])

123 changes: 23 additions & 100 deletions docs/src/exponential.md
@@ -16,6 +16,7 @@ using LatinHypercubeSampling
using OptimizationOptimJL
using OptimizationNLopt
using Test
using StableRNGs
```

## Setting up the problem
@@ -24,7 +25,7 @@ Let us start by defining the data and the likelihood problem:

```julia
## Step 1: Generate some data for the problem and define the likelihood
Random.seed!(2992999)
rng = StableRNG(2992999)
λ = -0.5
y₀ = 15.0
σ = 0.5
@@ -33,16 +34,14 @@ n = 450
Δt = T / n
t = [j * Δt for j in 0:n]
y = y₀ * exp.(λ * t)
yᵒ = y .+ [0.0, rand(Normal(0, σ), n)...]
yᵒ = y .+ [0.0, rand(rng, Normal(0, σ), n)...]
@inline function ode_fnc(u, p, t)
local λ
λ = p
du = λ * u
return du
end
using LoopVectorization, MuladdMacro
@inline function _loglik_fnc(θ::AbstractVector{T}, data, integrator) where {T}
local yᵒ, n, λ, σ, u0
function _loglik_fnc(θ::AbstractVector{T}, data, integrator) where {T}
yᵒ, n = data
λ, σ, u0 = θ
integrator.p = λ
@@ -90,7 +89,7 @@ We can now use this grid to evaluate the likelihood function at each point, and
```julia
gs = grid_search(prob, regular_grid)
LikelihoodSolution. retcode: Success
Maximum likelihood: -547.9579886200935
Maximum likelihood: -548.3068396174556
Maximum likelihood estimates: 3-element Vector{Float64}
λ: -0.612244897959183
σ: 0.816327448979592
@@ -103,7 +102,7 @@ You could also use an irregular grid, defining some grid as a matrix where each
using LatinHypercubeSampling
d = 3
gens = 1000
plan, _ = LHCoptim(500, d, gens)
plan, _ = LHCoptim(500, d, gens; rng)
new_lb = [-2.0, 0.05, 10.0]
new_ub = [2.0, 0.2, 20.0]
bnds = [(new_lb[i], new_ub[i]) for i in 1:d]
@@ -112,20 +111,21 @@ irregular_grid = IrregularGrid(lb, ub, parameter_vals)
gs_ir, loglik_vals_ir = grid_search(prob, irregular_grid; save_vals=Val(true), parallel = Val(true))
```
```julia
julia> gs_ir
LikelihoodSolution. retcode: Success
Maximum likelihood: -1729.7407123603484
Maximum likelihood: -2611.078183576969
Maximum likelihood estimates: 3-element Vector{Float64}
λ: -0.5090180360721444
σ: 0.19368737474949904
y₀: 15.791583166332664
λ: -0.5170340681362726
σ: 0.18256513026052107
y₀: 14.348697394789578
```
```julia
max_lik, max_idx = findmax(loglik_vals_ir)
@test max_lik == PL.get_maximum(gs_ir)
@test parameter_vals[:, max_idx] ≈ PL.get_mle(gs_ir)
```

(If you just want to try many points for starting your optimiser, see the optimiser in MultistartOptimization.jl.)
(If you just want to try many points for starting your optimiser, see e.g. the optimiser in MultistartOptimization.jl.)

## Parameter estimation

@@ -143,10 +143,10 @@ prof = profile(prob, sol; alg=NLopt.LN_NELDERMEAD, parallel = true)
```
```julia
ProfileLikelihoodSolution. MLE retcode: Success
Confidence intervals:
95.0% CI for λ: (-0.51091362373969, -0.49491369219060505)
95.0% CI for σ: (0.49607205632240814, 0.5652591835193789)
95.0% CI for y₀: (14.98587355568687, 15.305179849533756)
Confidence intervals:
95.0% CI for λ: (-0.5092192953535792, -0.49323747169071175)
95.0% CI for σ: (0.4925813447124647, 0.5612815283609663)
95.0% CI for y₀: (14.856528827532468, 15.173375766524025)
```
```julia
@test λ ∈ get_confidence_intervals(prof, :λ)
@@ -162,90 +162,13 @@ Finally, we can visualise the profiles:
fig = plot_profiles(prof; nrow=1, ncol=3,
latex_names=[L"\lambda", L"\sigma", L"y_0"],
true_vals=[λ, σ, y₀],
fig_kwargs=(fontsize=30, resolution=(2109.644f0, 444.242f0)),
fig_kwargs=(fontsize=41,),
axis_kwargs=(width=600, height=300))
resize_to_layout!(fig)
```

![Linear exponential profiles](https://github.com/DanielVandH/ProfileLikelihood.jl/blob/main/test/figures/linear_exponential_example.png?raw=true)

## Just the code

Here is all the code used for obtaining the results in this example, should you want a version that you can directly copy and paste.

```julia
## Step 1: Generate some data for the problem and define the likelihood
using OrdinaryDiffEq, Random, Distributions, LoopVectorization, MuladdMacro
Random.seed!(2992999)
λ = -0.5
y₀ = 15.0
σ = 0.5
T = 5.0
n = 450
Δt = T / n
t = [j * Δt for j in 0:n]
y = y₀ * exp.(λ * t)
yᵒ = y .+ [0.0, rand(Normal(0, σ), n)...]
@inline function ode_fnc(u, p, t)
local λ
λ = p
du = λ * u
return du
end
@inline function _loglik_fnc(θ::AbstractVector{T}, data, integrator) where {T}
local yᵒ, n, λ, σ, u0
yᵒ, n = data
λ, σ, u0 = θ
integrator.p = λ
## Now solve the problem
reinit!(integrator, u0)
solve!(integrator)
if !SciMLBase.successful_retcode(integrator.sol)
return typemin(T)
end
ℓ = -0.5(n + 1) * log(2π * σ^2)
s = zero(T)
@turbo @muladd for i in eachindex(yᵒ, integrator.sol.u)
s = s + (yᵒ[i] - integrator.sol.u[i]) * (yᵒ[i] - integrator.sol.u[i])
end
ℓ = ℓ - 0.5s / σ^2
end

## Step 2: Define the problem
using Optimization
θ₀ = [-1.0, 0.5, 19.73] # will be replaced anyway
lb = [-10.0, 1e-6, 0.5]
ub = [10.0, 10.0, 25.0]
syms = [:λ, :σ, :y₀]
prob = LikelihoodProblem(
_loglik_fnc, θ₀, ode_fnc, y₀, (0.0, T);
syms=syms,
data=(yᵒ, n),
ode_parameters=1.0, # temp value for λ
ode_kwargs=(verbose=false, saveat=t),
f_kwargs=(adtype=Optimization.AutoFiniteDiff(),),
prob_kwargs=(lb=lb, ub=ub),
ode_alg=Tsit5()
)

## Step 3: Grid search
regular_grid = RegularGrid(lb, ub, 50) # resolution can also be given as a vector for each parameter
gs = grid_search(prob, regular_grid)

## Step 4: Compute the MLE, starting at the grid search solution
using OptimizationOptimJL
prob = ProfileLikelihood.update_initial_estimate(prob, gs)
sol = mle(prob, Optim.LBFGS())

## Step 5: Profile
using OptimizationNLopt
prof = profile(prob, sol; alg=NLopt.LN_NELDERMEAD, parallel=true)


## Step 6: Visualise
using CairoMakie, LaTeXStrings
fig = plot_profiles(prof; nrow=1, ncol=3,
latex_names=[L"\lambda", L"\sigma", L"y_0"],
true_vals=[λ, σ, y₀],
fig_kwargs=(fontsize=30, resolution=(2109.644f0, 444.242f0)),
axis_kwargs=(width=600, height=300))
```
```@raw html
<figure>
<img src='../figures/linear_exponential_example.png' alt='Linear exponential profiles'><br>
</figure>
```
Binary file added docs/src/figures/linear_exponential_example.png
Binary file added docs/src/figures/logistic_example.png
Binary file added docs/src/figures/logistic_example_prediction.png
Binary file added docs/src/figures/lokta_example_profiles.png
Binary file added docs/src/figures/noisy_pde_data.png
Binary file added docs/src/figures/regression_profiles.png
2 changes: 1 addition & 1 deletion docs/src/index.md
@@ -12,4 +12,4 @@ From Wilk's theorem, we know that $2\hat{\ell}\_p(\boldsymbol\theta \mid \boldsy

We compute the profile log-likelihood in this package by starting at the MLE, and stepping left/right until we reach a given threshold. The code is iterative so that we do not waste time exploring unnecessary regions of the parameter space. In the bivariate case, we start at the MLE and expand outwards in layers. This implementation is described in the documentation.
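
In symbols, with $\psi$ an interest parameter and $\boldsymbol\omega$ the remaining nuisance parameters (a sketch of the quantities referred to above; the notation is assumed rather than quoted from elsewhere),

```math
\ell_p(\psi) = \max_{\boldsymbol\omega} \ell(\psi, \boldsymbol\omega \mid \boldsymbol y), \qquad
\hat{\ell}_p(\psi) = \ell_p(\psi) - \ell(\hat{\boldsymbol\theta} \mid \boldsymbol y),
```

and the stepping stops once the normalised profile falls below the asymptotic 95% threshold implied by the result above,

```math
\hat{\ell}_p(\psi) < -\tfrac{1}{2}\chi^2_{1,\,0.95} \approx -1.92.
```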

More detail about the methods we use in this package is given in the sections in the sidebar, with extra detail in the tests.
More detail about the methods we use in this package is given in the sections in the sidebar, with extra detail in the tests. All of the examples in the sidebar use a Gaussian likelihood, but of course the tools here work for any likelihood problem (e.g. the Poisson problems [here](https://www.slac.stanford.edu/econf/C030908/papers/THAT001.pdf) would also make good examples).
9 changes: 3 additions & 6 deletions docs/src/interface.md
@@ -26,8 +26,6 @@ LikelihoodProblem(loglik::Function, θ₀,

Importantly, `loglik` in this case is now a function of the form `ℓ(θ, p, integrator)`, where `integrator` is the same integrator as in the integrator interface from DifferentialEquations.jl; see the documentation at DifferentialEquations.jl for more detail on using the integrator. Furthermore, `ode_function` is the function for the ODE, `u₀` its initial condition, and `tspan` its time span. Additionally, the parameters for the `ode_function` (e.g. the `p` in `ode_function(du, u, p, t)` or `ode_function(u, p, t)`) can be passed using the keyword argument `ode_parameters`. The algorithm used to solve the differential equation is passed with `ode_alg`, and lastly any additional keyword arguments for solving the problem are to be passed through `ode_kwargs`.
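
For instance, here is a condensed sketch of this constructor, adapted from the linear exponential example elsewhere in these docs; the synthetic data, bounds, and keyword values are illustrative only.

```julia
using ProfileLikelihood, Optimization, OrdinaryDiffEq

t = LinRange(0.0, 5.0, 451)
yᵒ = 15.0 * exp.(-0.5 .* t) .+ 0.1 .* randn(length(t))  # synthetic noisy observations
n = length(t) - 1

ode_fnc(u, p, t) = p * u  # du/dt = λu, with the ODE parameter p playing the role of λ

function loglik(θ, data, integrator)  # the ℓ(θ, p, integrator) form described above
    yᵒ, n = data
    λ, σ, u0 = θ
    integrator.p = λ          # update the ODE parameter
    reinit!(integrator, u0)   # re-solve from the new initial condition
    solve!(integrator)
    return -0.5 * (n + 1) * log(2π * σ^2) -
           0.5 * sum(abs2, yᵒ .- integrator.sol.u) / σ^2
end

prob = LikelihoodProblem(
    loglik, [-1.0, 0.5, 19.73],     # loglik, θ₀
    ode_fnc, 15.0, (0.0, 5.0);      # ode_function, u₀, tspan
    syms=[:λ, :σ, :y₀],
    data=(yᵒ, n),
    ode_parameters=1.0,             # initial value for the ODE's p
    ode_kwargs=(verbose=false, saveat=t),
    ode_alg=Tsit5(),
    f_kwargs=(adtype=Optimization.AutoFiniteDiff(),),
    prob_kwargs=(lb=[-10.0, 1e-6, 0.5], ub=[10.0, 10.0, 25.0]))
```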

There is also a method that makes it easier to use automatic differentiation when considering ODE problems by using a `GeneralLazyBufferCache` from PreallocationTools.jl - see Example V.

The full docstrings for the methods available are given in the sidebar.

## LikelihoodSolution: Obtaining an MLE
@@ -103,11 +101,10 @@ The full docstring for `get_prediction_intervals` is given in the sidebar.
We provide a function `plot_profiles` that can be useful for plotting profile likelihoods. It requires that you have done

```julia
using CairoMakie
using LaTeXString
using CairoMakie # (or any other Makie backend)
```

(else the function does not exist, thanks to Requires.jl). A full description of this function is given in the corresponding docstring in the sidebar.
(else the function does not exist, thanks to Requires.jl and package extensions from Julia v1.9). A full description of this function is given in the corresponding docstring in the sidebar.
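
As a rough usage sketch, where `prof` is assumed to be a three-parameter `ProfileLikelihoodSolution` and the keyword values are illustrative only:

```julia
using CairoMakie, LaTeXStrings
fig = plot_profiles(prof; nrow=1, ncol=3,
    latex_names=[L"\lambda", L"\sigma", L"y_0"],  # labels for each parameter's axis
    true_vals=[-0.5, 0.5, 15.0],                  # optional known values to overlay
    fig_kwargs=(fontsize=41,),
    axis_kwargs=(width=600, height=300))
resize_to_layout!(fig)
```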

## GridSearch

Expand All @@ -123,4 +120,4 @@ grid_search(prob::LikelihoodProblem, grid::AbstractGrid; save_vals=Val(false), p

Here, `grid` can be either a `RegularGrid` or an `IrregularGrid`. Set `save_vals=Val(true)` if you want an array with all the likelihood function values, and `Val(false)` otherwise. To enable multithreading, evaluating the function at different points across multiple threads, set `parallel=Val(true)`; otherwise leave it as `Val(false)`. If `save_vals=Val(true)`, the result of the grid search will be `(sol, f_vals)`, where `sol` is a likelihood solution giving the parameters that gave the highest likelihood and `f_vals` is the array of likelihoods at the corresponding parameters. If `save_vals=Val(false)`, only `sol` is returned.
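
For example, assuming a three-parameter `LikelihoodProblem` `prob` with bound vectors `lb` and `ub` (a sketch following the grid-searching example):

```julia
# Regular grid: 50 points per parameter between the bounds lb and ub
regular_grid = RegularGrid(lb, ub, 50)
gs = grid_search(prob, regular_grid)

# Irregular grid: each column of parameter_vals is one parameter vector to evaluate
parameter_vals = lb .+ (ub .- lb) .* rand(3, 500)
irregular_grid = IrregularGrid(lb, ub, parameter_vals)
gs_ir, loglik_vals = grid_search(prob, irregular_grid;
    save_vals=Val(true), parallel=Val(true))
```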

More example is given in the examples, and complete docstrings are provided in the sidebar.
More detail is given in the examples, and complete docstrings are provided in the sidebar.