Our algorithms perform inference on a `Model` class and require only one thing from the class: the method `log_prob(self, zs)`, which takes an `n_minibatch x d` matrix of latent variables `z` and outputs a vector `[log p(x, z_{1,:}), ..., log p(x, z_{n_minibatch,:})]^T`. (Any data `x` is fixed and stored inside the class.)
In cases where we use the reparameterization gradient, we also require the gradient of the `log_prob()` function with respect to `z`. This is done automatically if `log_prob()` is implemented in TensorFlow.
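For instance, a model whose `log_prob()` is written with TensorFlow ops needs no explicit gradient, since autodiff can supply it. The class below is only a toy sketch (a standard normal prior with no likelihood term), not any particular Edward model:

```python
import numpy as np
import tensorflow as tf

class StandardNormalModel:
    """Toy model with log p(x, z) = sum_j log Normal(z_j | 0, 1).
    Because log_prob() is written with TensorFlow ops, no explicit
    gradient is needed; it can be obtained by autodiff."""

    def log_prob(self, zs):
        # zs: n_minibatch x d matrix of latent variables.
        # Returns a length-n_minibatch vector of joint log densities.
        return tf.reduce_sum(-0.5 * zs ** 2 - 0.5 * np.log(2.0 * np.pi), 1)
```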
We want to extend this to a manual specification of the gradient. The motivation is twofold:
1. a user may not be familiar with TensorFlow and may prefer to implement the model in vanilla numpy/scipy, in which case they will hand-derive the gradient;
2. we want to be able to specify the model with a Stan program and use its `log_prob()` and `grad_log_prob()` functions.
All of this should go behind the scenes: the algorithms will simply check whether the `grad_log_prob()` method exists. If it exists, they use it; if it doesn't, they try to autodiff `log_prob()` using TensorFlow. This solves 1. To solve 2, we also require a wrapper that wraps a Stan program and data into this class with the two methods `log_prob()` and `grad_log_prob()`.
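A minimal sketch of how this could look, assuming graph-mode TensorFlow for the autodiff fallback and PyStan for the Stan wrapper; the names `log_prob_gradient` and `StanModelWrapper` are illustrative, not Edward's actual API:

```python
import numpy as np
import tensorflow as tf

def log_prob_gradient(model, zs):
    """Sketch of the behind-the-scenes dispatch: prefer a manually
    specified gradient, otherwise fall back to TensorFlow autodiff."""
    if hasattr(model, 'grad_log_prob'):
        return model.grad_log_prob(zs)              # hand-derived or Stan-supplied
    return tf.gradients(model.log_prob(zs), zs)[0]  # requires a TF-based log_prob()

class StanModelWrapper:
    """Sketch of the wrapper: stores a compiled Stan program and data,
    and exposes log_prob() / grad_log_prob() via a PyStan fit object
    (which evaluates them on the unconstrained parameter space)."""

    def __init__(self, stan_program, data):
        # One way to obtain a fit object whose log_prob / grad_log_prob
        # can be called, without doing any real sampling.
        self.fit = stan_program.sampling(data=data, iter=1, chains=1)

    def log_prob(self, zs):
        return np.array([self.fit.log_prob(z) for z in zs])

    def grad_log_prob(self, zs):
        return np.array([self.fit.grad_log_prob(z) for z in zs])
```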