`norm` is not differentiable at 0, so at best you can return a subgradient. It appears that the subgradient currently returned is 1.0 at 0.0 (and -1.0 at -0.0):
```julia
julia> ForwardDiff.gradient(norm, [0.0, 0.0])
2-element Array{Float64,1}:
 0.0
 1.0

julia> ForwardDiff.gradient(norm, [0.0, -0.0])
2-element Array{Float64,1}:
 -0.0
 -1.0
```

I'm wondering if it would be worth it to define `Base.norm` on `ForwardDiff.Dual` and return a subgradient of 0.0 at both 0.0 and -0.0.
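For concreteness, here is a minimal sketch of what such a method could look like, written as a standalone helper rather than an actual method on `Base.norm` (the name `subgradient_norm` and the dispatch details are my own assumptions, not ForwardDiff's API):

```julia
using ForwardDiff
using ForwardDiff: Dual, Partials, value, partials
using LinearAlgebra: norm  # in current Julia, norm lives in LinearAlgebra

# Hypothetical sketch: a Euclidean norm for vectors of Dual numbers that
# returns the subgradient 0.0 at the origin instead of the current ±1.0.
function subgradient_norm(v::AbstractVector{Dual{T,V,N}}) where {T,V,N}
    x = value.(v)   # primal values
    n = norm(x)     # primal norm
    if iszero(n)
        # At the origin, choose the subgradient 0: all partials vanish.
        return Dual{T}(zero(V), zero(Partials{N,V}))
    end
    # Away from 0 the norm is smooth: d‖x‖/dx_i = x_i / ‖x‖, propagated
    # through the input partials by the chain rule.
    return Dual{T}(n, sum(i -> (x[i] / n) * partials(v[i]), eachindex(v)))
end
```

With a definition along these lines, `ForwardDiff.gradient(subgradient_norm, [0.0, 0.0])` would return `[0.0, 0.0]`, and likewise at `[0.0, -0.0]`.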
Also, perhaps I missed this, but I think it would be nice to mention somewhere that in generic auto-differentiable code `sqrt(sum(v.^2))` should be replaced with `norm`, since the derivative of `sqrt` is singular at 0 and produces a NaN when composed with a function whose gradient is 0 there (0*Inf = NaN).
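For illustration, a minimal reproduction of that failure mode (the helper `f` is hypothetical, just to name the composition):

```julia
using ForwardDiff
using LinearAlgebra: norm

# sum(v .^ 2) has zero gradient at the origin, but sqrt has an infinite
# derivative at 0, so the chain rule produces 0 * Inf = NaN.
f(v) = sqrt(sum(v .^ 2))

ForwardDiff.gradient(f, [0.0, 0.0])     # [NaN, NaN]
ForwardDiff.gradient(norm, [0.0, 0.0])  # finite subgradient: [0.0, 1.0]
```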