
Conversation

aseyboldt
Member

We were missing an example of a custom Theano op with a gradient. This adds a section to the advanced Theano part of the docs, with an example based on an implicit function.
The implementation here isn't particularly fast; it could be improved by writing it in C, but I think that would be another section :-)
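The math behind the gradient of such an op can be illustrated without Theano at all. As a minimal stand-alone sketch (the residual `F` and the numbers below are made up for illustration, not the function from the docs example): if `mu(theta)` is defined implicitly by `F(mu, theta) = 0`, the implicit function theorem gives `dmu/dtheta = -(dF/dtheta) / (dF/dmu)`, which is exactly what an `Op.grad` for such an op has to return.

```python
import math

# Hypothetical residual: F(mu, theta) = mu + theta * mu**2 - 1 = 0
# implicitly defines mu(theta). (Illustration only; not the residual
# used in the docs example.)
def F(mu, theta):
    return mu + theta * mu ** 2 - 1.0

def solve_mu(theta, mu=0.5, tol=1e-12):
    # Newton's method on F(., theta); here dF/dmu = 1 + 2*theta*mu.
    for _ in range(100):
        step = F(mu, theta) / (1.0 + 2.0 * theta * mu)
        mu -= step
        if abs(step) < tol:
            break
    return mu

def dmu_dtheta(theta):
    # Implicit function theorem: dmu/dtheta = -F_theta / F_mu
    mu = solve_mu(theta)
    return -(mu ** 2) / (1.0 + 2.0 * theta * mu)

theta = 0.7
analytic = dmu_dtheta(theta)
eps = 1e-6
numeric = (solve_mu(theta + eps) - solve_mu(theta - eps)) / (2.0 * eps)
print(abs(analytic - numeric) < 1e-6)
```

The same pattern carries over to the Theano op: `perform` runs the solver, and `grad` multiplies the incoming gradient by the implicit derivative.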

@springcoil
Contributor

springcoil commented Sep 20, 2017 via email

```python
thetamu = theta * mu
return [-g[0] * mu ** 2 / (1 + thetamu + tt.exp(-thetamu))]
```

If you value your sanity, always check that the gradient is ok::
Member


;)
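The advice in the diff is worth taking literally; Theano ships a helper for exactly this (`theano.gradient.verify_grad`). The idea can be sketched without Theano as a plain central-difference check (the function names and tolerances below are illustrative, not part of the docs):

```python
import math

def check_grad(f, grad, x, eps=1e-6, tol=1e-4):
    """Compare an analytic derivative of a scalar function against a
    central finite difference at the point x."""
    numeric = (f(x + eps) - f(x - eps)) / (2.0 * eps)
    return abs(grad(x) - numeric) <= tol * max(1.0, abs(numeric))

# Sanity check with a function whose derivative is known exactly:
ok = check_grad(math.sin, math.cos, 0.3)
print(ok)
```

For a real custom op, `verify_grad` does the analogous comparison against the op's `grad` implementation at random test points.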

@twiecki
Member

twiecki commented Sep 20, 2017

This is great!

@twiecki twiecki merged commit 2f2d961 into master Sep 20, 2017
@junpenglao junpenglao deleted the add-custom-op-doc branch September 21, 2017 05:00
ColCarroll pushed a commit that referenced this pull request Nov 9, 2017
