How can I add custom gradients? #77

Open
kai-qu opened this issue May 2, 2019 · 11 comments

@kai-qu

kai-qu commented May 2, 2019

If I want to give a custom or known gradient for a function, how can I do that in this library? (I don't want to autodifferentiate through this function.) I am using the grad function.

If the library doesn't provide this feature, is there some way I can easily implement this functionality myself, perhaps by changing the definitions of leaf nodes or by editing the dual numbers that presumably carry the numerical gradients?

Here's a concrete example of what I mean:

Say I have some function I want to take the gradient of, say f(x, y) = x^2 + 3 * g(x, y)^2. Then say that g(x, y) is a function whose definition is complicated and involves lots of Haskell code, but whose gradient I've already calculated analytically and which is quite simple. Thus, when I take grad f and evaluate it at a point (x, y), I'd like to just plug in my custom gradient for g instead of autodiffing through it: something like my_nice_grad_of_g (x, y).
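
Concretely, the hand-assembled gradient follows from the chain rule. Here is a plain-Haskell sketch of what I mean, with placeholder definitions standing in for the real g and its hand-derived gradient:

    -- Chain rule for f(x, y) = x^2 + 3 * g(x, y)^2:
    --   df/dx = 2*x + 6 * g(x, y) * dg/dx
    --   df/dy =       6 * g(x, y) * dg/dy
    -- g and myNiceGradOfG are placeholders for the real, complicated
    -- function and its analytically derived gradient.
    g :: Double -> Double -> Double
    g x y = sin x * cos y  -- placeholder

    myNiceGradOfG :: Double -> Double -> (Double, Double)
    myNiceGradOfG x y = (cos x * cos y, negate (sin x * sin y))  -- placeholder

    gradF :: Double -> Double -> (Double, Double)
    gradF x y = (2 * x + 6 * gxy * dgdx, 6 * gxy * dgdy)
      where
        gxy          = g x y
        (dgdx, dgdy) = myNiceGradOfG x y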

I see other autodiff libraries do provide this feature, for example Stan and Tensorflow both allow users to define gradients of a function.

Thanks!

@cartazio
Collaborator

cartazio commented May 2, 2019

#15 may help

the code in that version was:

erf1 = lift1 erf $ \x -> (fromInteger1 2 /! sqrt1 pi1) *! exp1 (negate1 x *! x)

https://github.com/ekmett/ad/blob/master/src/Numeric/AD/Jacobian.hs
has the class which is used to facilitate that
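
For reference, the relevant methods of that class, paraphrased from Numeric.AD.Jacobian (check the module itself for the exact, current definitions):

    -- Paraphrased sketch; the linked module is authoritative.
    -- lift1 f f' builds a unary function whose value is f and whose
    -- derivative at x is f' x, bypassing autodiff through f's body;
    -- lift2 does the same for binary functions, returning both
    -- partial derivatives.
    class (Mode t, Mode (D t)) => Jacobian t where
      type D t :: *
      lift1 :: (Scalar t -> Scalar t) -> (D t -> D t) -> t -> t
      lift2 :: (Scalar t -> Scalar t -> Scalar t)
            -> (D t -> D t -> (D t, D t)) -> t -> t -> t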

@cartazio
Collaborator

cartazio commented May 2, 2019

Point being, I believe this answers your question, though please chime in if you want more examples or links, or if you hit a new problem.

but closing for now :)

@cartazio cartazio closed this as completed May 2, 2019
@kai-qu
Author

kai-qu commented May 3, 2019

Thanks! I have a nice minimal example working like this:

λ> import Numeric.AD
λ> import Numeric.AD.Jacobian
λ> import Numeric.AD.Internal.Forward
λ> (lift1 (\x -> x^2) (\x -> 2 * x)) (Forward 3 1)
Forward 9 6

However, the Jacobian module only seems to provide primitives for defining derivatives of one- or two-argument functions. I'd like to define a custom gradient for a function f : R^n -> R, for example something like

liftN (\[x, y, z] -> x * y * z) (\[x, y, z] -> [y * z, x * z, x * y])

Any thoughts on how I could do this? It doesn't seem efficient to try to hack this functionality together in terms of lift1.
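
In the meantime, one way to get partway there seems to be hand-rolling the n-ary lift for forward mode only. A sketch (liftN here is my own name, not a library export), assuming primal and tangent are exported by Numeric.AD.Internal.Forward:

    import Numeric.AD.Internal.Forward (Forward (..), primal, tangent)

    -- Forward-mode-only sketch of an n-ary lift: evaluate f at the
    -- primal values, then propagate the tangents through the supplied
    -- gradient via the chain rule (a directional derivative).
    liftN :: Num a => ([a] -> a) -> ([a] -> [a]) -> [Forward a] -> Forward a
    liftN f gradf xs = Forward (f ps) (sum (zipWith (*) (gradf ps) ts))
      where
        ps = map primal xs
        ts = map tangent xs

Seeding the tangent of x then gives the partial derivative in x:

    λ> liftN (\[x, y, z] -> x * y * z) (\[x, y, z] -> [y * z, x * z, x * y]) [Forward 1 1, Forward 2 0, Forward 3 0]
    Forward 6 6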

@cartazio
Collaborator

cartazio commented May 6, 2019 via email

@cartazio cartazio reopened this Jan 20, 2020
@cartazio
Collaborator

@hypotext it is a gap, and it's currently not quite possible to do gradients etc. with "custom rules", which is a problem I think we'd both agree on

@kai-qu
Author

kai-qu commented Jan 20, 2020 via email

@cartazio
Collaborator

ok, let's start with several sub-questions:

  1. what should/would this api look like?
  2. how should/would this interact with calculating higher derivatives etc?

what types would you expect/hope the interfaces to have?
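
For instance, one purely hypothetical shape such an API could take, extending the existing lift1/lift2 pattern to n inputs gathered in a Traversable (nothing like this exists in ad today):

    -- Purely hypothetical liftN in the spirit of lift1/lift2, taking
    -- the primal function together with a user-supplied gradient.
    -- Higher derivatives are the open question: as written the
    -- gradient is only first-order, so grad-of-grad would need the
    -- user to supply (or autodiff) the gradient's own derivatives.
    liftN :: (Jacobian t, Traversable f)
          => (f (Scalar t) -> Scalar t)  -- the function itself
          -> (f (D t) -> f (D t))        -- its gradient, in the tangent type
          -> f t -> t
    liftN = error "hypothetical sketch; not implemented in ad"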

@kai-qu
Author

kai-qu commented Jan 21, 2020 via email

@cartazio
Collaborator

ok, :)

@cartazio
Collaborator

I'm honestly thinking about experimenting with writing a GHC Core plugin to support better optimization of autodiff computations at the Core level sometime this Jan/Feb

@dschrempf

Hi! May I ask about the status of this issue? I am in a situation similar to the author's. Thank you!
