

Add abs to derivative rules #70

Closed · wants to merge 1 commit

Conversation

joehuchette
Contributor

No description provided.

@mlubin
Collaborator

mlubin commented Jul 11, 2015

The only potential issue is that this might be an unexpected and surprising symbolic differentiation rule to use. This definition is the right one for JuMP, but mathematically it's a bit iffy to return a value for the derivative of abs at zero. Wolfram Alpha uses the rule x/abs(x), which is properly undefined at zero.
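The two candidate rules agree everywhere except the origin. A minimal sketch in Python (function names here are illustrative, not part of Calculus.jl or JuMP):

```python
import math

def d_abs_subgradient(x):
    # The rule this PR proposes in spirit: sign(x), with sign(0) == 0.
    # Zero is a valid subgradient of abs at the origin, which is the
    # behavior JuMP wants from its derivative evaluation.
    return 0.0 if x == 0 else math.copysign(1.0, x)

def d_abs_quotient(x):
    # The Wolfram Alpha rule x/abs(x). At x == 0 this is 0/0:
    # NaN under IEEE semantics (as in Julia), and a ZeroDivisionError
    # in pure Python.
    return x / abs(x)

print(d_abs_subgradient(-3.0))  # -1.0
print(d_abs_subgradient(0.0))   # 0.0
print(d_abs_quotient(2.0))      # 1.0
```

So the disagreement is only about what happens at zero: return a subgradient, or refuse to produce a value.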

@johnmyleswhite
Collaborator

I'm not sure. There's clearly something to be said for this, but I wonder if this is the first time we're providing a subgradient rather than a gradient. Maybe we should split the differentiation rules into two sets and allow users to specify whether differentiate should error out in the presence of non-differentiable functions?

@joehuchette
Contributor Author

@johnmyleswhite would the x/abs(x) option work for you? At least then it's undefined at the origin.

@mlubin
Collaborator

mlubin commented Jul 11, 2015

That's not the right rule for JuMP though.

@johnmyleswhite
Collaborator

I would think you'd either want:

  • The rule that JuMP wants.
  • An error to be raised during the symbolic differentiation process about non-differentiability.
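The second option amounts to keeping a strict rule table that refuses nonsmooth functions outright. A hypothetical sketch (the names `NonDifferentiableError` and `derivative_rule` are invented for illustration, not Calculus.jl API):

```python
# Strict symbolic differentiation table: smooth functions get a rule,
# nonsmooth ones raise instead of silently returning a subgradient.
class NonDifferentiableError(ValueError):
    pass

DERIVATIVE_RULES = {
    "sin": lambda arg: f"cos({arg})",
    "exp": lambda arg: f"exp({arg})",
}

NONSMOOTH = {"abs"}

def derivative_rule(fname, arg):
    if fname in NONSMOOTH:
        raise NonDifferentiableError(
            f"{fname} is not differentiable everywhere; refusing a symbolic rule"
        )
    return DERIVATIVE_RULES[fname](arg)

print(derivative_rule("sin", "x"))  # cos(x)
```

A downstream consumer like JuMP could then install its own subgradient rule for abs on top of the strict table, which is roughly what the eventual ReverseDiffSparse special case does.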

@joehuchette
Contributor Author

You guys can make the call on whether this is appropriate here. One option would be to use the x/abs(x) rule here and then override it in ReverseDiffSparse here, which as best I can tell should be sufficient to get the right behavior for JuMP (we may also need overloads in DualNumbers).

@mlubin
Collaborator

mlubin commented Aug 17, 2015

This was implemented in ReverseDiffSparse as a special case: mlubin/ReverseDiffSparse.jl@2530b75
I'm in favor of not defining the derivative of abs in Calculus.

@joehuchette
Contributor Author

Me too
