57 changes: 57 additions & 0 deletions .github/workflows/ci.yml
@@ -0,0 +1,57 @@
name: CI
on:
  pull_request:
  push:
  schedule:
    - cron: '44 9 16 * *' # run the cron job one time per month
jobs:
  test:
    name: Julia ${{ matrix.version }} - ${{ matrix.os }} - ${{ matrix.arch }} - ${{ github.event_name }}
    runs-on: ${{ matrix.os }}
    strategy:
      fail-fast: false
      matrix:
        version:
          - '1.5'
          - '1.6'
          - '1.7'
        os:
          - ubuntu-latest
        arch:
          - x64
    steps:
      - uses: actions/checkout@v2
      - uses: julia-actions/setup-julia@v1
        env:
          PYTHON: ""
        with:
          version: ${{ matrix.version }}
          arch: ${{ matrix.arch }}
      - uses: actions/cache@v1
        env:
          cache-name: cache-artifacts
        with:
          path: ~/.julia/artifacts
          key: ${{ runner.os }}-test-${{ env.cache-name }}-${{ hashFiles('**/Project.toml') }}
          restore-keys: |
            ${{ runner.os }}-test-${{ env.cache-name }}-
            ${{ runner.os }}-test-
            ${{ runner.os }}-
      - uses: julia-actions/julia-buildpkg@latest
      - uses: julia-actions/julia-runtest@latest
  docs:
    name: Documentation
    runs-on: ubuntu-latest
    needs: test
    steps:
      - uses: actions/checkout@v2
      - uses: julia-actions/setup-julia@latest
        with:
          version: '1.7'
      - name: Install dependencies
        run: julia --project=docs/ -e 'using Pkg; Pkg.develop(PackageSpec(path=pwd())); Pkg.instantiate()'
      - name: Build and deploy
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # If authenticating with GitHub Actions token
          DOCUMENTER_KEY: ${{ secrets.DOCUMENTER_KEY }} # If authenticating with SSH deploy key
        run: julia --project=docs/ docs/make.jl
105 changes: 13 additions & 92 deletions README.md
@@ -1,100 +1,21 @@
# GraphPPL

GraphPPL.jl is a probabilistic programming language focused on probabilistic graphical models.
| **Documentation** | **Build Status** |
|:-------------------------------------------------------------------------:|:--------------------------------:|
| [![][docs-stable-img]][docs-stable-url] [![][docs-dev-img]][docs-dev-url] | [![CI][ci-img]][ci-url] |

# Inference Backend

GraphPPL.jl does not export any Bayesian inference backend. It provides a simple DSL parser and model generation helpers. To run inference on
generated models user needs to have a Bayesian inference backend with GraphPPL.jl support (e.g. [ReactiveMP.jl](https://github.com/biaslab/ReactiveMP.jl)).

# Examples

## Coin flip
[docs-dev-img]: https://img.shields.io/badge/docs-dev-blue.svg
[docs-dev-url]: https://biaslab.github.io/GraphPPL.jl/dev

```julia
@model function coin_model()
    a = datavar(Float64)
    b = datavar(Float64)
    y = datavar(Float64)

    θ ~ Beta(a, b)
    y ~ Bernoulli(θ)

    return y, a, b, θ
end
```
[docs-stable-img]: https://img.shields.io/badge/docs-stable-blue.svg
[docs-stable-url]: https://biaslab.github.io/GraphPPL.jl/stable

## State Space Model
[ci-img]: https://github.com/biaslab/GraphPPL.jl/actions/workflows/ci.yml/badge.svg?branch=master
[ci-url]: https://github.com/biaslab/GraphPPL.jl/actions

```julia
@model function ssm(n, θ, x0, Q::ConstVariable, P::ConstVariable)

    x = randomvar(n)
    y = datavar(Vector{Float64}, n)

    x_prior ~ MvNormalMeanCovariance(mean(x0), cov(x0))

    x_prev = x_prior

    A = constvar([ cos(θ) -sin(θ); sin(θ) cos(θ) ])

    for i in 1:n
        x[i] ~ MvNormalMeanCovariance(A * x_prev, Q)
        y[i] ~ MvNormalMeanCovariance(x[i], P)

        x_prev = x[i]
    end

    return x, y
end
```
GraphPPL.jl is a probabilistic programming language focused on probabilistic graphical models. This repository is aimed at advanced users; please refer to the [ReactiveMP.jl](https://github.com/biaslab/ReactiveMP.jl) repository for more comprehensive and self-contained documentation and usage examples.

## Hidden Markov Model

```julia
@model [ default_factorisation = MeanField() ] function transition_model(n)

    A ~ MatrixDirichlet(ones(3, 3))
    B ~ MatrixDirichlet([ 10.0 1.0 1.0; 1.0 10.0 1.0; 1.0 1.0 10.0 ])

    s_0 ~ Categorical(fill(1.0 / 3.0, 3))

    s = randomvar(n)
    x = datavar(Vector{Float64}, n)

    s_prev = s_0

    for t in 1:n
        s[t] ~ Transition(s_prev, A) where { q = q(out, in)q(a) }
        x[t] ~ Transition(s[t], B)
        s_prev = s[t]
    end

    return s, x, A, B
end
```

## Gaussian Mixture Model
# Inference Backend

```julia
@model [ default_factorisation = MeanField() ] function gaussian_mixture_model(n)

    s ~ Beta(1.0, 1.0)

    m1 ~ NormalMeanVariance(-2.0, 1e3)
    w1 ~ GammaShapeRate(0.01, 0.01)

    m2 ~ NormalMeanVariance(2.0, 1e3)
    w2 ~ GammaShapeRate(0.01, 0.01)

    z = randomvar(n)
    y = datavar(Float64, n)

    for i in 1:n
        z[i] ~ Bernoulli(s)
        y[i] ~ NormalMixture(z[i], (m1, m2), (w1, w2))
    end

    return s, m1, w1, m2, w2, z, y
end
```
GraphPPL.jl does not export any Bayesian inference backend. It provides a simple DSL parser and helpers for model generation, constraints specification and meta specification. To run inference on generated models, the user needs a Bayesian inference backend with GraphPPL.jl support (e.g. [ReactiveMP.jl](https://github.com/biaslab/ReactiveMP.jl)).
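
For reference, here is the minimal coin-flip model specification that previously appeared in this README. GraphPPL.jl only parses and builds such model descriptions; running inference on them is delegated to a backend such as ReactiveMP.jl.

```julia
@model function coin_model()
    # data placeholders that the inference backend fills in later
    a = datavar(Float64)
    b = datavar(Float64)
    y = datavar(Float64)

    # prior over the coin bias and the likelihood of a single toss
    θ ~ Beta(a, b)
    y ~ Bernoulli(θ)

    return y, a, b, θ
end
```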
3 changes: 3 additions & 0 deletions docs/Project.toml
@@ -1,3 +1,6 @@
[deps]
Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
GraphPPL = "b3f8163a-e979-4e85-b43e-1f63d8c8b42c"

[compat]
Documenter = "0.27.7"
12 changes: 11 additions & 1 deletion docs/src/index.md
@@ -2,11 +2,21 @@

Welcome to the documentation for GraphPPL.jl.

Useful links:

- [`ReactiveMP.jl` documentation](https://biaslab.github.io/ReactiveMP.jl/stable/)
- [User guide: Model specification](@ref user-guide-model-specification)
- [User guide: Constraints specification](@ref user-guide-constraints-specification)
- [User guide: Meta specification](@ref user-guide-meta-specification)



## Table of Contents

```@contents
Pages = [
    "user-guide.md",
    "transformation-steps.md"
]
Depth = 2
```
105 changes: 83 additions & 22 deletions docs/src/transformation-steps.md
@@ -1,4 +1,4 @@
# Transformation steps
# Model specification transformation steps for the ReactiveMP.jl backend

## Step 1: Normalization of `~` operator node arguments

@@ -14,11 +14,7 @@

is translated to

```julia
lhs ~ Node(..., var"#anonymous" ~ f(...), ...)
```

The only one exception is reference expression of the form `x[f(i)]` which are left untouched.

This step is recursive from top to bottom.

This step forces model to create an anonymous node for any inner function call within `~` operator expression. In some cases backend can (and will) optimize this inner anonymous nodes into just function calls. E.g. following example won't create any additional nodes in the model
The only exception is reference expressions of the form `x[f(i)]`, which are left untouched. This step forces the model to create an anonymous node for any inner function call within a `~` operator expression. In some cases the ReactiveMP.jl backend can (and will) optimize these inner anonymous nodes into plain function calls. E.g. the following example won't create any additional nodes in the model

```julia
precision = 1.0
noise ~ NormalMeanVariance(noise_mean, 1.0 / precision) # Since 1.0 and precision are constants the inner division will be just a simple Julia call
```

@@ -32,56 +28,121 @@

### `datavar()` transformation

Any expression of the form

```
datavar(args...)
```

```julia
y = datavar(args...) # empty options here
# or
y = datavar(args...) where { options... }
```

is translated to

```
datavar(var"#model", ensure_type(args[1]), args[2:end]...)
```

```julia
y = datavar(var"#model", options, :y, ensure_type(args[1]), args[2:end]...)
```

where `var"#model"` references to an anonymous model variable, `ensure_type` function ensures that the first argument is a valid type object, rest of the arguments are left untouched.
where `var"#model"` refers to a hidden model variable, the `ensure_type` function ensures that the first argument is a valid type object, and the rest of the arguments are left untouched.

This step is recursive from top to bottom.
The list of possible options:
- `subject`: specifies a subject that will be used to pass data-variable-related information; see the `Rocket.jl` documentation for more info.
- `allow_missing`: a boolean flag that controls whether it is possible to pass `missing` data or not.

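For illustration, a `datavar` declaration carrying one of these options could look as follows (a sketch based on the option names listed above; the exact behaviour is defined by the backend):

```julia
# `n` data entries of type Float64 that are allowed to be `missing`
y = datavar(Float64, n) where { allow_missing = true }
```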
### `randomvar()` transformation

Any expression of the form

```
randomvar(args...)
```

```julia
x = randomvar(args...) # empty options here
# or
x = randomvar(args...) where { options... }
```

is translated to

```
randomvar(var"#model", args...)
```

```julia
x = randomvar(var"#model", options, :x, args...)
```

where `var"#model"` refers to an anonymous model variable; the arguments are left untouched.

This step is recursive from top to bottom.
The list of possible options (see the ReactiveMP.jl documentation for more info about these options):
- `pipeline`
- `prod_constraint`
- `prod_strategy`
- `marginal_form_constraint`
- `marginal_form_check_strategy`
- `messages_form_constraint`
- `messages_form_check_strategy`

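As an illustrative sketch, a `randomvar` declaration carrying a pipeline option might look like this (assuming the backend provides a logging pipeline stage, e.g. `LoggerPipelineStage` from ReactiveMP.jl):

```julia
# `n` latent variables whose messages are logged during inference
x = randomvar(n) where { pipeline = LoggerPipelineStage() }
```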
### `constvar()` transformation

Any expression of the form

```
constvar(args...)
```

```julia
c = constvar(args...) # constvar's do not support any extra options flags
```

is translated to

```
constvar(var"#model", args...)
```

```julia
c = constvar(var"#model", :c, args...)
```

where `var"#model"` refers to an anonymous model variable; the arguments are left untouched.

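For instance, the rotation-matrix constant from the state-space model example formerly shown in this repository's README would be rewritten roughly as follows (a sketch of the generated form):

```julia
# user code
A = constvar([ cos(θ) -sin(θ); sin(θ) cos(θ) ])

# after the transformation: the hidden model reference and the name symbol are injected
A = constvar(var"#model", :A, [ cos(θ) -sin(θ); sin(θ) cos(θ) ])
```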
This step is recursive from top to bottom.
## Step 3: Tilde pass

### 3.0 Node reference pass

All expressions of the form

```julia
variable ~ Node(args...)
```

are translated to

```julia
node, variable ~ Node(args...)
```

### 3.1 Node options pass

All expressions of the form

```julia
node, variable ~ Node(args...) where { options... }
```

are translated to

```julia
node, variable ~ Node(args...; options...)
```

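A concrete instance, borrowed from the Hidden Markov Model example formerly shown in this repository's README (shown after the node reference pass):

```julia
# before the options pass
node, s[t] ~ Transition(s_prev, A) where { q = q(out, in)q(a) }

# after the options pass: the `where` options become keyword arguments
node, s[t] ~ Transition(s_prev, A; q = q(out, in)q(a))
```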
### 3.2 Functional relations pass

All expressions of the form

```julia
node, variable ~ Node(args...; options...)
```

represent a valid functional dependency between `variable` and `args...`. There are two options for the further transformation of this expression:

1. If `variable` has been created before with the help of the `datavar()` or `randomvar()` functions, the previous expression is translated to:

```julia
node = make_node(var"#model", options, variable, args...)
```

2. If `variable` has not been created before, the expression is translated to:

```julia
node = make_node(var"#model", options, AutoVar(:variable), args...)
```

that internally creates a new variable in the model.

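As an illustration of both cases (a sketch of the generated calls, not literal backend output), assume `y` was declared with `datavar()` while `x` was not declared at all:

```julia
# y ~ NormalMeanVariance(x, 1.0)   -- `y` already exists in the model
node = make_node(var"#model", options, y, x, 1.0)

# x ~ NormalMeanVariance(0.0, 1.0) -- `x` is created automatically via AutoVar
node = make_node(var"#model", options, AutoVar(:x), 0.0, 1.0)
```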
### `~` operator transformation
## Step 4: Final pass

WIP
During the final pass `GraphPPL.jl` injects an `activate!` call on `var"#model"` before any `return ...` statement (and also at the very end of the model function).
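
Conceptually (a sketch, not the literal generated code), a model body ending with a `return` statement is rewritten as:

```julia
# before the final pass
return x, y

# after the final pass
activate!(var"#model")
return x, y
```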