Feature request: Jacobian calculation for gradients #36

Open
aterenin opened this Issue Jan 3, 2019 · 3 comments


aterenin commented Jan 3, 2019

I'd like to use TransformVariables as part of an HMC sampler, but evaluating my target distribution's probability density involves solving a partial differential equation, which is done outside Julia and thus is not compatible with AutoDiff. There are special techniques for obtaining gradients of such models, which involve solving an adjoint equation, but that's a digression.

For people with use cases like mine, it would be nice if there were a function which, for a transformation y = t(x), transforms grad(log f(x)) into grad(log f(y)) with the appropriate Jacobian correction added to each individual coordinate - similar to transform_logdensity, but for gradients of log densities.
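
Roughly, I am imagining something like the sketch below (the name `pullback_gradient` and its signature are made up here, not an existing TransformVariables API; AD is only applied to the transformation `t` itself, never to the external log density):

```julia
using ForwardDiff, LinearAlgebra

# Hypothetical sketch: given the gradient of log f evaluated at y = t(x),
# return the gradient of the pulled-back log density
#   ℓ(x) = log f(t(x)) + log |det J_t(x)|
# in the unconstrained coordinates x, which is what HMC needs.
function pullback_gradient(t, x, ∇logf_y)
    J = ForwardDiff.jacobian(t, x)                          # Jacobian of t at x
    logjac(z) = first(logabsdet(ForwardDiff.jacobian(t, z)))
    return J' * ∇logf_y + ForwardDiff.gradient(logjac, x)   # chain rule + Jacobian term
end
```

In practice the log-Jacobian gradient would presumably come from the transformation's own analytic formulas rather than nested AD; the sketch is only meant to pin down which quantity I am asking for.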


tpapp commented Jan 3, 2019

I am not entirely sure what you are asking for here. You have a transformation, you can also calculate its log Jacobian determinant, and you would like to use it for Bayesian inference?

with the appropriate Jacobian correction added to each individual coordinate

Sorry, I am not getting this --- the Jacobian is for the whole transformation, not by coordinate.

Perhaps a mock example would help.


aterenin commented Jan 3, 2019

Sure.

Suppose we have two random variables, X and Y with X ~ Beta(2,2) and Y = X/2. Suppose that we do not know the density of Y.

Let f_X(.) and f_Y(.) be their densities. We are interested in evaluating d/dy ln f_Y(y) for a given y, so we substitute x = 2y and obtain f_Y(y) = f_X(2y) |J|, where J = dx/dy is the Jacobian of the transformation, here just the constant 2. Taking logarithms gives ln f_Y(y) = ln f_X(2y) + ln(2), which is what transform_logdensity gives you.

I am interested in d/dy ln f_Y(y), computed in terms of the x coordinates. In other words, my target distribution is constrained, so I transform it to be unconstrained. transform_logdensity lets me evaluate its density, accounting for the transformation. How do I evaluate the gradient of its log density, also accounting for the transformation?

Right now, the examples in DynamicHMC do this by applying automatic differentiation to ln f_Y(y), but in my use case this isn't possible, hence the desire to obtain it directly.
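
In this toy example the chain rule gives d/dy ln f_Y(y) = 2 (d/dx ln f_X)(2y), since the ln(2) term is constant. A quick check, writing the Beta(2,2) density out by hand (it is 6x(1-x)) so that AD is possible here:

```julia
using ForwardDiff

logfX(x) = log(6) + log(x) + log(1 - x)   # Beta(2,2) density is 6x(1-x)
logfY(y) = logfX(2y) + log(2)             # ln f_Y(y) = ln f_X(2y) + ln 2

y = 0.3
g_ad    = ForwardDiff.derivative(logfY, y)       # AD on the whole log density
g_chain = 2 * ForwardDiff.derivative(logfX, 2y)  # chain rule; the ln 2 term drops out

g_ad ≈ g_chain                                   # true
```

My real model is like `logfX` here, except that its gradient comes from an adjoint PDE solve rather than from ForwardDiff.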

Does this example help?


tpapp commented Jan 4, 2019

Yes, this is very helpful, and reminds me of something I encountered myself when I was coding a log density and its gradient without AD.

I will think about it and get back to you in this issue.
