This repository has been archived by the owner on Oct 31, 2024. It is now read-only.

Use DifferentiationInterface #148

Merged
merged 4 commits into from
May 26, 2024

Conversation

avik-pal
Member

@avik-pal avik-pal commented May 25, 2024

  1. Removes AD backend-specific code (except rrules)
  2. Merge Improve Code Standards #147 first
  3. SimpleHalley now supports in place problems
  4. More backends are now supported by default

TODOs

  • Update Halley tests to add the in place versions
  • Add Halley to 23 test problems


codecov bot commented May 25, 2024

Codecov Report

Attention: Patch coverage is 95.04950%, with 5 lines in your changes missing coverage. Please review.

Project coverage is 90.34%. Comparing base (0d300d3) to head (29ae939).

Files                    Patch %   Lines
src/ad.jl                88.46%    3 Missing ⚠️
src/nlsolve/dfsane.jl    83.33%    1 Missing ⚠️
src/utils.jl             98.11%    1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #148      +/-   ##
==========================================
- Coverage   90.60%   90.34%   -0.26%     
==========================================
  Files          23       21       -2     
  Lines        1299     1202      -97     
==========================================
- Hits         1177     1086      -91     
+ Misses        122      116       -6     

☔ View full report in Codecov by Sentry.

Two outdated review threads on src/ad.jl, resolved.
@avik-pal avik-pal force-pushed the ap/di branch 7 times, most recently from f111bd3 to 5ef7667 on May 26, 2024 02:24
@avik-pal avik-pal merged commit 6b682af into main May 26, 2024
17 of 19 checks passed
@avik-pal avik-pal deleted the ap/di branch May 26, 2024 05:50
@gdalle
Contributor

gdalle commented May 27, 2024

Any issues with DI integration @avik-pal ?

Contributor

@gdalle gdalle left a comment

So cool that you managed to merge this! Here are a few remarks on DI use; feel free to take them into account or not.

mul!(reshape(du, 1, :), vec(DiffResults.value(res))',
DiffResults.jacobian(res), 2, false)
resid = __similar(du, length(sol.resid))
v, J = DI.value_and_jacobian(_f, resid, AutoForwardDiff(), u)
Contributor

you may want to use preparation
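For context, a minimal sketch of what preparation looks like with DifferentiationInterface. This uses the argument order of recent DI releases (which differs from the snippet above); `f!`, `u`, and `resid` are illustrative stand-ins for the PR's `_f`, `u`, and `resid`:

```julia
using DifferentiationInterface
import ForwardDiff

f!(y, x) = (y .= x .^ 2)          # stand-in for the in-place residual _f
u = [1.0, 2.0, 3.0]
resid = similar(u)
backend = AutoForwardDiff()

# Done once, e.g. when the solver cache is built:
prep = prepare_jacobian(f!, resid, backend, u)
# Reused on every iteration, amortizing tape/chunk setup:
v, J = value_and_jacobian(f!, resid, prep, backend, u)
```

The point of preparation is that the backend-specific setup (chunk selection for ForwardDiff, coloring for sparse backends, etc.) happens once instead of on every Jacobian call.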

2 .* vec(DiffResults.value(res))' * DiffResults.jacobian(res),
size(u))
_f = Base.Fix2(prob.f, p)
v, J = DI.value_and_jacobian(_f, AutoForwardDiff(), u)
Contributor

preparation?

__f = (du, u) -> f(du, u, p)
ForwardDiff.jacobian(__f, du, u)
__f = @closure (du, u) -> f(du, u, p)
return ForwardDiff.jacobian(__f, __similar(u), u)
Contributor

why not use DI?
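A hedged sketch of the DI equivalent of the `ForwardDiff.jacobian` call above, keeping the same closure pattern (`f` and the values here are illustrative; argument order follows recent DI releases):

```julia
using DifferentiationInterface
import ForwardDiff

f(du, u, p) = (du .= p .* u)      # stand-in for the problem's in-place f
p = 3.0
g! = (du, u) -> f(du, u, p)       # same closure as in the snippet
u = [1.0, 2.0]

# DI differentiates in-place functions g!(du, u) directly:
J = jacobian(g!, similar(u), AutoForwardDiff(), u)
```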

return ForwardDiff.jacobian(__f, u)
end
u isa Number && return ForwardDiff.derivative(__f, u)
return ForwardDiff.jacobian(__f, u)
Contributor

DI?

value_and_jacobian(ad, f, y, x, p, cache; J = nothing)
function value_and_jacobian(
ad, prob::AbstractNonlinearProblem, f::F, y, x, cache; J = nothing) where {F}
x isa Number && return DI.value_and_derivative(f, ad, x, cache)
Contributor

was the cache obtained with DI.prepare_derivative(f, ad, x)?
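For the scalar path, the cache the question refers to would come from `prepare_derivative`. A minimal sketch under recent DI argument order (`f` and `x` are illustrative):

```julia
using DifferentiationInterface
import ForwardDiff

f(x) = sin(x)
x = 0.5
backend = AutoForwardDiff()

prep = prepare_derivative(f, backend, x)        # build the cache once
v, d = value_and_derivative(f, prep, backend, x)  # reuse it per call
```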

Comment on lines +65 to +66
H = DI.second_derivative(f, ad, x)
v, J = DI.value_and_derivative(f, ad, x)
Contributor

preparation?
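For the second-order path above, DI also offers a combined prepared operator that returns value, derivative, and second derivative in one call instead of two. A sketch, assuming the API of recent DI releases (`f` and `x` are illustrative):

```julia
using DifferentiationInterface
import ForwardDiff

f(x) = x^3
x = 2.0
backend = AutoForwardDiff()   # DI nests ForwardDiff over itself for second order

prep = prepare_second_derivative(f, backend, x)
v, d, h = value_derivative_and_second_derivative(f, prep, backend, x)
```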

return fx, dfx, d2fx
df = @closure x -> begin
res = __similar(y, promote_type(eltype(y), eltype(x)))
return DI.jacobian(f, res, ad, x)
Contributor

preparation?

res = __similar(y, promote_type(eltype(y), eltype(x)))
return DI.jacobian(f, res, ad, x)
end
J, H = DI.value_and_jacobian(df, ad, x)
Contributor

preparation?

dfx = J_fn(x)
d2fx = ForwardDiff.jacobian(J_fn, x)
return fx, dfx, d2fx
df = @closure x -> begin
Contributor

you can use DI with functions f!(y, x) without the closure
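On the closure point: later DI releases (v0.6+) go one step further and pass non-differentiated parameters as explicit contexts, so neither the closure nor an out-of-place wrapper is needed. A hedged sketch (note this `Constant` API postdates the DI version used in this PR; names are illustrative):

```julia
using DifferentiationInterface
import ForwardDiff

f!(du, u, p) = (du .= p .* u .^ 2)   # three-argument in-place function
u = [1.0, 2.0]
du = similar(u)
p = 2.0

# `Constant(p)` marks p as held fixed during differentiation:
J = jacobian(f!, du, AutoForwardDiff(), u, Constant(p))
```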

Comment on lines +220 to 232
@inline function __get_concrete_autodiff(prob, ad::AutoForwardDiff{nothing}; kwargs...)
return AutoForwardDiff(; chunksize = ForwardDiff.pickchunksize(length(prob.u0)), ad.tag)
end
@inline function __get_concrete_autodiff(
prob, ad::AutoPolyesterForwardDiff{nothing}; kwargs...)
return AutoPolyesterForwardDiff(;
chunksize = ForwardDiff.pickchunksize(length(prob.u0)), ad.tag)
end
@inline function __get_concrete_autodiff(prob, ::Nothing; kwargs...)
return ifelse(
ForwardDiff.can_dual(eltype(prob.u0)), AutoForwardDiff(), AutoFiniteDiff())
return ifelse(ForwardDiff.can_dual(eltype(prob.u0)),
AutoForwardDiff(; chunksize = ForwardDiff.pickchunksize(length(prob.u0))),
AutoFiniteDiff())
end
Contributor

is this necessary with DI?
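The question here is whether manual `pickchunksize` plumbing is still needed: with DI, an `AutoForwardDiff()` backend without an explicit chunk size lets ForwardDiff choose one during preparation. A minimal sketch (function and sizes are illustrative):

```julia
using DifferentiationInterface
import ForwardDiff

f(x) = x .^ 2
x = collect(1.0:10.0)
backend = AutoForwardDiff()              # no explicit chunksize

prep = prepare_jacobian(f, backend, x)   # chunk size is picked here
J = jacobian(f, prep, backend, x)
```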

3 participants