
reorganize inference docs api
dustinvtran committed Dec 16, 2016
1 parent 121f108 commit 04a313c
Showing 5 changed files with 62 additions and 107 deletions.
9 changes: 0 additions & 9 deletions docs/autogen.py
@@ -64,7 +64,6 @@
'inference-compositionality.tex',
'inference-data-subsampling.tex',
'inference-development.tex',
'inference-api.tex',
],
},
{
@@ -99,14 +98,6 @@
],
'child_pages': [],
},
{
'page': 'inference-api.tex',
'title': 'API',
'parent_pages': [
'inference.tex'
],
'child_pages': [],
},
{
'page': 'criticism.tex',
'title': 'Criticism',
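The entry removed above suggests that `docs/autogen.py` keeps its page tree as a list of dicts, each naming a generated page, its title, and its parent and child pages. A minimal sketch of that convention and a helper for walking it (`children_of` is a hypothetical name, not from the Edward source; the entries shown are reduced examples):

```python
# Hypothetical sketch of the page-tree convention visible in the diff:
# each entry names a generated page, its title, and its parents/children.
PAGES = [
    {
        'page': 'inference.tex',
        'title': 'Inference',
        'parent_pages': [],
        'child_pages': [
            'inference-classes.tex',
            'inference-development.tex',
        ],
    },
    {
        'page': 'inference-classes.tex',
        'title': 'Classes',
        'parent_pages': ['inference.tex'],
        'child_pages': [],
    },
]


def children_of(pages, parent):
    """Return titles of pages that list `parent` among their parents."""
    return [p['title'] for p in pages if parent in p['parent_pages']]
```

With this structure, removing the `inference-api.tex` entry and its mention in the parent's `child_pages` list (as the commit does) is enough to drop the page from the generated navigation.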
2 changes: 2 additions & 0 deletions docs/tex/api/criticism.tex
@@ -35,6 +35,8 @@ \subsubsection{Criticism}
% Developing new criticism techniques is easy. They can be derived from
% the current techniques or built as a standalone function.

\begin{center}\rule{3in}{0.4pt}\end{center}

{{sphinx

.. automodule:: edward.criticisms
57 changes: 0 additions & 57 deletions docs/tex/api/inference-api.tex

This file was deleted.

39 changes: 39 additions & 0 deletions docs/tex/api/inference-classes.tex
@@ -103,4 +103,43 @@ \subsubsection{Exact Inference}
classical Gibbs and mean-field updates \citep{bishop2006pattern} without
tedious algebraic manipulation.

\begin{center}\rule{3in}{0.4pt}\end{center}

{{sphinx

.. autoclass:: edward.inferences.KLqp
:members:

.. autoclass:: edward.inferences.ReparameterizationKLqp

.. autoclass:: edward.inferences.ReparameterizationKLKLqp

.. autoclass:: edward.inferences.ReparameterizationEntropyKLqp

.. autoclass:: edward.inferences.ScoreKLqp

.. autoclass:: edward.inferences.ScoreKLKLqp

.. autoclass:: edward.inferences.ScoreEntropyKLqp

.. autoclass:: edward.inferences.KLpq
:members:

.. autoclass:: edward.inferences.MAP
:members:

.. autoclass:: edward.inferences.Laplace
:members:

.. autoclass:: edward.inferences.MetropolisHastings
:members:

.. autoclass:: edward.inferences.HMC
:members:

.. autoclass:: edward.inferences.SGLD
:members:

}}

\subsubsection{References}\label{references}
62 changes: 21 additions & 41 deletions docs/tex/api/inference-development.tex
@@ -8,62 +8,42 @@ \subsubsection{Developing Inference Algorithms}
methods. This enables fast experimentation on top of existing
algorithms, whether it be developing new black box algorithms or
new model-specific algorithms.
For examples of algorithms developed in Edward, see the inference
\href{/tutorials/}{tutorials}.

\includegraphics[width=700px]{/images/inference_structure.png}
{\small\textit{Dependency graph of inference methods.
Nodes are classes in Edward and arrows represent class inheritance.}}

There is a base class \texttt{Inference}, from which all inference
methods are derived.

\begin{lstlisting}[language=Python]
class Inference(object):
"""Base class for Edward inference methods.
"""
def __init__(self, latent_vars=None, data=None, model_wrapper=None):
...
\end{lstlisting}

It takes as input the set of latent variables to infer and a dataset. Optionally, if the model is specified in an external language, it also takes as input a model wrapper \texttt{model_wrapper}.

Note that \texttt{Inference} says nothing about the class of models that an
algorithm must work with. One can build inference algorithms which are
tailored to a restricted class of models available in Edward (such as
differentiable models or conditionally conjugate models), or even
tailor it to a single model. The algorithm can raise an error if the
model is outside this class.
methods are derived. Note that \texttt{Inference} says nothing
about the class of models that an algorithm must work with. One can
build inference algorithms which are tailored to a restricted class of
models available in Edward (such as differentiable models or
conditionally conjugate models), or even tailor it to a single model.
The algorithm can raise an error if the model is outside this class.

We organize inference under two paradigms:
\texttt{VariationalInference} and \texttt{MonteCarlo} (or more plainly,
optimization and sampling). These inherit from \texttt{Inference} and each
have their own default methods.

\begin{lstlisting}[language=Python]
class MonteCarlo(Inference):
"""Base class for Monte Carlo inference methods.
"""
def __init__(self, latent_vars, data=None, model_wrapper=None):
super(MonteCarlo, self).__init__(latent_vars, data, model_wrapper)
For example, developing a new variational inference algorithm is as simple as
inheriting from \texttt{VariationalInference} or one of its derived
classes. \texttt{VariationalInference} implements many default methods such
as \texttt{initialize()} with options for an optimizer.

...
\begin{center}\rule{3in}{0.4pt}\end{center}

{{sphinx

class VariationalInference(Inference):
"""Base class for variational inference methods.
"""
def __init__(self, latent_vars=None, data=None, model_wrapper=None):
super(VariationalInference, self).__init__(latent_vars, data, model_wrapper)
.. autoclass:: edward.inferences.Inference
:members:

...
\end{lstlisting}
.. autoclass:: edward.inferences.VariationalInference
:members:

For example, developing a new variational inference algorithm is as simple as
inheriting from \texttt{VariationalInference} or one of its derived
classes. \texttt{VariationalInference} implements many default methods such
as \texttt{initialize()} with options for an optimizer.
.. autoclass:: edward.inferences.MonteCarlo
:members:

For examples of inference algorithms developed in Edward, see the inference
\href{/tutorials/}{tutorials}. It can also be useful to look at
the
\href{https://github.com/blei-lab/edward/tree/master/edward/inferences}
{source code}.
}}
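The class hierarchy the rewritten section describes, with \texttt{Inference} at the root and \texttt{VariationalInference} and \texttt{MonteCarlo} as the two paradigms, can be sketched in plain Python. This is an illustrative mimic under stated assumptions, not Edward's actual implementation: the \texttt{build_loss} method name and the string placeholders standing in for TensorFlow tensors are inventions for the sketch; only \texttt{initialize()} and the constructor arguments come from the text above.

```python
class Inference(object):
    """Base class: holds latent variables and data.

    Says nothing about the class of models an algorithm works with;
    a subclass may raise an error for models outside its class.
    """
    def __init__(self, latent_vars=None, data=None):
        self.latent_vars = latent_vars or {}
        self.data = data or {}


class VariationalInference(Inference):
    """Optimization paradigm: supplies defaults such as initialize()."""
    def initialize(self, optimizer=None):
        # Default setup with an option for an optimizer, as in the docs.
        self.optimizer = optimizer or 'default-optimizer'
        self.loss = self.build_loss()

    def build_loss(self):
        raise NotImplementedError("subclasses define their objective")


class MonteCarlo(Inference):
    """Sampling paradigm: would supply sampling defaults instead."""


class MyKLqp(VariationalInference):
    """A new variational algorithm only needs to supply its objective."""
    def build_loss(self):
        return 'negative-ELBO'  # placeholder for a real loss tensor
```

The point of the pattern is that `MyKLqp` inherits `initialize()` and everything else for free; developing a new algorithm reduces to overriding the objective, which is what the section means by "as simple as inheriting from \texttt{VariationalInference}".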
