---
layout: proof
mathjax: true
author: "Joram Soch"
affiliation: "BCCN Berlin"
e_mail: "joram.soch@bccn-berlin.de"
date: 2022-10-20 03:11:00 -0700
title: "Derivation of the model evidence"
chapter: "Model Selection"
section: "Bayesian model selection"
topic: "Model evidence"
theorem: "Derivation"
sources:
proof_id: "P367"
shortcut: "me-der"
username: "JoramSoch"
---

Theorem: Let $p(y \vert \theta,m)$ be a likelihood function of a generative model $m$ for making inferences on model parameters $\theta$ given measured data $y$. Moreover, let $p(\theta \vert m)$ be a prior distribution on model parameters $\theta$ in the parameter space $\Theta$. Then, the model evidence (ME) can be expressed in terms of likelihood and prior as

$$ \label{eq:ME-marg} \mathrm{ME}(m) = \int_{\Theta} p(y|\theta,m) \, p(\theta|m) \, \mathrm{d}\theta \; . $$

Proof: This is a consequence of the law of marginal probability for continuous variables

$$ \label{eq:prob-marg} p(y|m) = \int_{\Theta} p(y,\theta|m) \, \mathrm{d}\theta $$

and the law of conditional probability according to which

$$ \label{eq:prob-cond} p(y,\theta|m) = p(y|\theta,m) \, p(\theta|m) \; . $$

Plugging \eqref{eq:prob-cond} into \eqref{eq:prob-marg}, we obtain:

$$ \label{eq:ME-marg-qed} \mathrm{ME}(m) = p(y|m) = \int_{\Theta} p(y|\theta,m) \, p(\theta|m) \, \mathrm{d}\theta \; . $$
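As a minimal numerical sketch of \eqref{eq:ME-marg-qed} (this example is not part of the proof; the beta-binomial model, the data $n, y$, and the prior parameters $a, b$ are illustrative assumptions): for a binomial likelihood with a Beta prior, the marginal integral can be approximated on a grid and checked against the closed-form evidence $\binom{n}{y} \mathrm{B}(y+a, n-y+b)/\mathrm{B}(a,b)$.

```python
from math import comb, exp, lgamma

# Hypothetical beta-binomial example (not from the source):
# data y successes in n Bernoulli trials, prior Beta(a, b) on Theta = [0, 1].
n, y = 10, 7
a, b = 2.0, 2.0

def likelihood(theta):
    # p(y | theta, m): binomial probability mass
    return comb(n, y) * theta**y * (1 - theta)**(n - y)

def prior(theta):
    # p(theta | m): Beta(a, b) density, normalized via log-gamma
    log_norm = lgamma(a + b) - lgamma(a) - lgamma(b)
    return exp(log_norm) * theta**(a - 1) * (1 - theta)**(b - 1)

# ME(m) = integral over Theta of p(y|theta,m) p(theta|m) dtheta,
# approximated by the midpoint rule on [0, 1]
N = 100_000
h = 1.0 / N
me_numeric = sum(
    likelihood((i + 0.5) * h) * prior((i + 0.5) * h) for i in range(N)
) * h

# Closed-form check: ME(m) = C(n, y) * B(y+a, n-y+b) / B(a, b)
def log_beta(p, q):
    return lgamma(p) + lgamma(q) - lgamma(p + q)

me_exact = comb(n, y) * exp(log_beta(y + a, n - y + b) - log_beta(a, b))

print(me_numeric, me_exact)
```

The two values agree to high precision, illustrating that the marginal integral over the parameter space reproduces the model evidence.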