Add SparseDiffTools jacobian #105
Conversation
Codecov Report

```
@@            Coverage Diff             @@
##             main     #105      +/-   ##
==========================================
- Coverage   98.50%   98.44%   -0.07%
==========================================
  Files           8        8
  Lines         938      964      +26
==========================================
+ Hits          924      949      +25
- Misses         14       15       +1
```
@amontoison I am finally ready with this! I think it will not be hard to do the same for the Hessian later.
Awesome!!! Would you mind adding a couple of unit tests? They serve as documentation too.
Tangi, why do you call this backend `SparseForwardADJacobian`?
We can decide if we want to determine the Jacobian by rows or columns: `matrix_colors(A, alg; partition_by_rows)`.
Because it uses forward-mode automatic differentiation to compute the values. That's right, I can add these options in the backend constructor, thanks for the suggestion.
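For context, the coloring-based sparse Jacobian that this backend wraps works roughly as follows. This is a minimal sketch against SparseDiffTools' public API; the residual function `f!` and its sparsity pattern below are made up for illustration and are not taken from this PR:

```julia
using SparseArrays, SparseDiffTools

# Hypothetical residual f: R^3 -> R^3 (illustration only).
function f!(y, x)
  y[1] = x[1]^2
  y[2] = x[2] * x[3]
  y[3] = x[1] + x[3]
end

# Sparsity pattern of the Jacobian of f!.
sparsity = sparse([1.0 0.0 0.0; 0.0 1.0 1.0; 1.0 0.0 1.0])

# Structurally orthogonal columns share a color, so the full Jacobian
# costs only maximum(colors) forward-mode directional derivatives
# instead of one per column.
colors = matrix_colors(sparsity)

J = similar(sparsity)
forwarddiff_color_jacobian!(J, f!, ones(3); colorvec = colors, sparsity = sparsity)
```

With `partition_by_rows = true`, `matrix_colors` colors rows instead of columns, which is the natural partition when pairing the coloring with reverse-mode differentiation.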
Hey @amontoison! I added the coloring algorithm as a keyword argument and added tests. However, I have no clue how to make the
Thanks Tangi!
I have a few comments but it looks good to me.
I will open a PR for sparse Hessian after this one.
```julia
(ADNLPModels.SparseForwardADJacobian, Dict(:alg => SparseDiffTools.GreedyD1Color())),
(ADNLPModels.SparseForwardADJacobian, Dict(:alg => SparseDiffTools.AcyclicColoring())),
```
We also have other coloring algorithms if we want: `GreedyD1Color`, `BacktrackingColor`, `ContractionColor`, `GreedyStar1Color`, `GreedyStar2Color`, and `AcyclicColoring`.
I think we can easily use the `color` function that I interfaced last week in CUDA.jl. I will open an issue in SparseDiffTools once it is merged.
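A sketch of how the coloring algorithms could be compared on a given sparsity pattern (the symmetric pattern below is hypothetical; note that the star and acyclic colorings are designed for symmetric matrices, which may be why not every algorithm works for the Jacobian backend):

```julia
using SparseArrays, SparseDiffTools

# Hypothetical symmetric sparsity pattern (illustration only).
A = sparse([1.0 1.0 0.0; 1.0 1.0 1.0; 0.0 1.0 1.0])

# Fewer colors means fewer directional derivatives per Jacobian.
for alg in (SparseDiffTools.GreedyD1Color(), SparseDiffTools.AcyclicColoring())
  colors = matrix_colors(A, alg)
  println(typeof(alg), ": ", maximum(colors), " colors")
end
```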
Not all of them work, though ^^. Maybe we could add this when documenting the feature?
Ok, let's do that!
Co-authored-by: Alexis Montoison <35051714+amontoison@users.noreply.github.com>
Thanks @amontoison for the improvements!