
# Compiler 3.0 (#965)

**Merged**: 67 commits, merged Dec 9, 2019.

## Commits
- `7d5451c` VarName refactor (mohamed82008, Nov 17, 2019)
- `e4365ef` minor cleanup (mohamed82008, Nov 17, 2019)
- `f848eed` new compiler proof of concept (mohamed82008, Nov 18, 2019)
- `f596e29` update model macro docstring (mohamed82008, Nov 18, 2019)
- `e9d2a56` remove compiler2.jl and make it compiler.jl (mohamed82008, Nov 18, 2019)
- `f46d530` replace @vi() with the varinfo input to the model (mohamed82008, Nov 18, 2019)
- `f162905` user-input random variable names (mohamed82008, Nov 19, 2019)
- `0b73b74` minor cleanup (mohamed82008, Nov 19, 2019)
- `a73ed67` introduce the concept of a context (mohamed82008, Nov 19, 2019)
- `c1b6803` minor cleanup (mohamed82008, Nov 19, 2019)
- `59b8e9a` BatchContext to scale the log likelihood (mohamed82008, Nov 19, 2019)
- `db597cd` minor cleanup (mohamed82008, Nov 19, 2019)
- `9e7ad76` skipping logpdf when in LikelihoodContext (mohamed82008, Nov 19, 2019)
- `6594ea4` partial missing data bug fix (mohamed82008, Nov 19, 2019)
- `8a00ed3` fix autodiff for partial missing data (mohamed82008, Nov 19, 2019)
- `84583b3` fix Kai's and Tor's comments (mohamed82008, Nov 20, 2019)
- `48158ab` NoDist of a NamedDist is a NamedDist (mohamed82008, Nov 20, 2019)
- `277e11c` rename assume_or_observe to tilde & add dot_tilde (mohamed82008, Nov 20, 2019)
- `645dce2` inference tilde cleanup (mohamed82008, Nov 20, 2019)
- `d8a1101` remove vector _tilde (mohamed82008, Nov 20, 2019)
- `fd25d2b` some more cleanup (mohamed82008, Nov 20, 2019)
- `2d8b860` remove repeated comment (mohamed82008, Nov 20, 2019)
- `daff9c6` add FillArrays to deps (mohamed82008, Nov 20, 2019)
- `873ecf1` change vec ~ tests to .~ (mohamed82008, Nov 20, 2019)
- `6e99ab0` fix compiler tests (mohamed82008, Nov 20, 2019)
- `488b991` add logpdf macro for use in the model macro (mohamed82008, Nov 20, 2019)
- `8cb9240` fix bugs (mohamed82008, Nov 21, 2019)
- `4b8b50c` avoid evaluating the LHS or RHS twice (mohamed82008, Nov 21, 2019)
- `4fb20c5` compiler and varinfo docs (mohamed82008, Nov 21, 2019)
- `f82398c` remove spaces in varname indexing (mohamed82008, Nov 21, 2019)
- `5e21a13` type stability fix (mohamed82008, Nov 22, 2019)
- `6238adf` minor fix and docs update (mohamed82008, Nov 22, 2019)
- `40c1e8d` remove unused arg in dot_tilde observe method (mohamed82008, Nov 22, 2019)
- `6e8958e` shorten the docstring of the `@model` macro (mohamed82008, Nov 22, 2019)
- `9a5572b` fix tests (mohamed82008, Nov 22, 2019)
- `3ce0956` try dropping support for .~ on Julia 1.0 only (mohamed82008, Nov 22, 2019)
- `e19d414` fix #760 (mohamed82008, Nov 22, 2019)
- `d57c594` use @. in compiler docs (mohamed82008, Nov 22, 2019)
- `eadcef0` support @. and conditionally support .~ when valid (mohamed82008, Nov 22, 2019)
- `0909b3b` minor cleanup (mohamed82008, Nov 22, 2019)
- `09f6b1f` make ambiguity error say @. or .~ (mohamed82008, Nov 22, 2019)
- `ef2a05a` test @. by default and conditionally test .~ (mohamed82008, Nov 22, 2019)
- `7135b8f` support broadcasting ~ with mismatched array sizes (mohamed82008, Nov 23, 2019)
- `f1f87b1` add .~ test for mismatched array sizes (mohamed82008, Nov 23, 2019)
- `78338e1` test for throwing when input is missing (mohamed82008, Nov 23, 2019)
- `0ed0d8b` add more tests (mohamed82008, Nov 23, 2019)
- `13a83a9` Merge branch 'master' into mt/compiler3.0 (mohamed82008, Nov 27, 2019)
- `1342e04` fix merge (mohamed82008, Nov 27, 2019)
- `40332e7` workaround string(:) == "Colon" in Julia 1.2 (mohamed82008, Nov 27, 2019)
- `ea5a5cc` fix the colon thing for real (mohamed82008, Nov 27, 2019)
- `fc5f558` minor test fix (mohamed82008, Nov 27, 2019)
- `6f5206f` add `@sampler()` to access the sampler in model (mohamed82008, Nov 27, 2019)
- `f6721fd` increase sample size and lower atol in mh test (mohamed82008, Nov 28, 2019)
- `22dab76` remove FillArrays dep and reorganize a bit (mohamed82008, Dec 2, 2019)
- `ad72d1e` Merge branch 'master' into mt/compiler3.0 (cpfiffer, Dec 2, 2019)
- `1f0daca` Remove docstring spacing. (cpfiffer, Dec 2, 2019)
- `2212df1` Interface -> AbstractMCMC in dynamichmc (mohamed82008, Dec 2, 2019)
- `b1984cf` BatchContext -> MiniBatchContext & ctx docstrings (mohamed82008, Dec 2, 2019)
- `f7ed50b` Core.tilde vs Inference.tilde in compiler docs (mohamed82008, Dec 2, 2019)
- `61c7bb5` ismissing to === missing (mohamed82008, Dec 7, 2019)
- `8829f52` style issues and cleanup (mohamed82008, Dec 7, 2019)
- `313f5ff` is_number_or_array_type -> isa FloatOrArrayType (mohamed82008, Dec 7, 2019)
- `a9d03b0` get_matching_type changes (mohamed82008, Dec 8, 2019)
- `0086db8` Merge branch 'master' into mt/compiler3.0 (mohamed82008, Dec 8, 2019)
- `ce92abb` remove SpecialFunctions compat block (mohamed82008, Dec 8, 2019)
- `490ecb3` update the model internals docs (mohamed82008, Dec 8, 2019)
- `fe46fc3` update the guide.md file (mohamed82008, Dec 8, 2019)
## docs/src/using-turing/advanced.md (146 changes: 119 additions & 27 deletions)
```diff
@@ -85,7 +85,7 @@ using Turing
     m ~ Normal(0, sqrt(s))

     # Observe each value of x.
-    [x ~ Normal(m, sqrt(s))]
+    @. x ~ Normal(m, sqrt(s))
 end

 sample(gdemo([1.5, 2.0]), HMC(0.1, 5), 1000)
```
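Per the commit history ("support @. and conditionally support .~ when valid"), the same broadcasted observation can also be written with the `.~` operator on Julia versions where it parses. A minimal sketch, reusing the `gdemo` model from the docs:

```julia
using Turing

@model gdemo(x) = begin
    s ~ InverseGamma(2, 3)
    m ~ Normal(0, sqrt(s))
    # Broadcasted tilde: each entry of x is observed from the same Normal.
    x .~ Normal(m, sqrt(s))
end

sample(gdemo([1.5, 2.0]), HMC(0.1, 5), 1000)
```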
```diff
@@ -99,58 +99,50 @@ using Turing
 data = (x = [1.5, 2.0],)

 # Create the model function.
-mf(vi, sampler, model) = begin
+mf(vi, sampler, ctx, model) = begin
     # Set the accumulated logp to zero.
     vi.logp = 0

-    # If x is provided, use the provided values.
-    # Otherwise, treat x as an empty vector with
-    # two entries.
-    if isdefined(model.data, :x)
-        x = model.data.x
-    else # x is a parameter
-        x = model.defaults.x
-    end
+    x = model.args.x

     # Assume s has an InverseGamma distribution.
     s, lp = Turing.assume(sampler,
                           InverseGamma(2, 3),
-                          Turing.VarName([:c_s, :s], ""), vi)
+                          Turing.@varname(s), vi)

     # Add the lp to the accumulated logp.
     vi.logp += lp

     # Assume m has a Normal distribution.
-    m, lp = Turing.assume(sampler,
-                          Normal(0, sqrt(s)),
-                          Turing.VarName([:c_m, :m], ""), vi)
+    m, lp = Turing.Inference.tilde(
+        ctx,
+        sampler,
+        Normal(0, sqrt(s)),
+        Turing.@varname(m),
+        vi,
+    )

     # Add the lp to the accumulated logp.
     vi.logp += lp

     # Observe each value of x[i], according to a
     # Normal distribution.
-    for i = 1:length(x)
-        vi.logp += Turing.observe(sampler,
-                                  Normal(m, sqrt(s)),
-                                  x[i], vi)
-    end
+    vi.logp += Turing.Inference.dot_tilde(
+        ctx,
+        sampler,
+        Normal(m, sqrt(s)),
+        x,
+        vi,
+    )
 end

-# Define the default value for x when missing
-defaults = (x = Vector{Real}(undef, 2),)
-
 # Instantiate a Model object.
-model = Turing.Model{Tuple{:s, :m}, Tuple{:x}}(mf, data, defaults)
+model = Turing.Model(mf, data)

 # Sample the model.
 chain = sample(model, HMC(0.1, 5), 1000)
```

This hunk also deletes the paragraph that read: "Note that the `Turing.Model{Tuple{:s, :m}, Tuple{:x, :y}}` accepts two parameter tuples. The first set, `Tuple{:s, :m}`, represents parameter variables that will be generated by the model, while the second (`Tuple{:x}`) contains the variables to be observed."


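The `Turing.@varname` macro used in the rewritten model function replaces the old `Turing.VarName([:c_s, :s], "")` construction. A minimal sketch of its use (hedged: the internals of the resulting `VarName` are not shown in this diff, and indexing support is assumed from the "remove spaces in varname indexing" commit):

```julia
using Turing

# Variable name for a plain symbol, as passed to the tilde calls above.
vn_s = Turing.@varname(s)

# Indexed variables record their indexing as well (assumed behavior).
vn_x1 = Turing.@varname(x[1])
```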
## Task Copying


The remaining hunk (`@@ -207,6 +199,106 @@`) adds the following two sections after the existing `result = optimize(nlogp, lb, ub, sm_0, Fminbox())` example:
## Maximum a Posteriori Estimation


Turing does not currently have built-in methods for calculating the [maximum a posteriori](https://en.wikipedia.org/wiki/Maximum_a_posteriori_estimation) (MAP) estimate of a model. Built-in support is planned (see [this issue](https://github.com/TuringLang/Turing.jl/issues/605)); for the moment, we present here a method for estimating the MAP using [Optim.jl](https://github.com/JuliaNLSolvers/Optim.jl).


```julia
using Turing

# Define the simple gdemo model.
@model gdemo(x, y) = begin
s ~ InverseGamma(2, 3)
m ~ Normal(0, sqrt(s))
x ~ Normal(m, sqrt(s))
y ~ Normal(m, sqrt(s))
return s, m
end

function get_nlogp(model)
# Construct a trace struct
vi = Turing.VarInfo(model)

# Define a function to optimize.
function nlogp(sm)
spl = Turing.SampleFromPrior()
new_vi = Turing.VarInfo(vi, spl, sm)
model(new_vi, spl)
-new_vi.logp
end

return nlogp
end

# Define our data points.
x = 1.5
y = 2.0
model = gdemo(x, y)
nlogp = get_nlogp(model)

# Import Optim.jl.
using Optim

# Create a starting point, call the optimizer.
sm_0 = [1.0, 1.0]
lb = [0.0, -Inf]
ub = [Inf, Inf]
result = optimize(nlogp, lb, ub, sm_0, Fminbox())
```
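Once `optimize` returns, the MAP estimate itself can be read off the result. A short sketch, assuming Optim.jl's standard `Optim.minimizer` accessor (not shown in the diff):

```julia
# `result` is the return value of `optimize` above; the optimum is the
# parameter vector [s, m] that minimizes the negative log joint.
s_map, m_map = Optim.minimizer(result)
```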


## Maximum Likelihood Estimation


Much like the MAP example above, one can use Turing's model syntax and evaluate the model in a `LikelihoodContext`, which skips the prior terms, to perform [maximum likelihood estimation](https://en.wikipedia.org/wiki/Maximum_likelihood_estimation) (MLE).

```julia
using Turing

# Define the simple gdemo model.
@model gdemo(x, y) = begin
s ~ InverseGamma(2, 3)
m ~ Normal(0, sqrt(s))
x ~ Normal(m, sqrt(s))
y ~ Normal(m, sqrt(s))
return s, m
end

function get_nloglike(model)
# Construct a trace struct
vi = Turing.VarInfo(model)

# Define a function to optimize.
function nloglike(sm)
ctx = Turing.LikelihoodContext()
spl = Turing.SampleFromPrior()
new_vi = Turing.VarInfo(vi, spl, sm)
model(new_vi, spl, ctx)
-new_vi.logp
end

return nloglike
end

# Define our data points.
x = 1.5
y = 2.0
model = gdemo(x, y)
nloglike = get_nloglike(model)

# Import Optim.jl.
using Optim

# Create a starting point, call the optimizer.
sm_0 = [1.0, 1.0]
lb = [0.0, -Inf]
ub = [Inf, Inf]
result = optimize(nloglike, lb, ub, sm_0, Fminbox())
```


## Parallel Sampling

