[FINAL] Integrate batch Hessian in ExaPF #185

Merged: 5 commits from fp/review_final into develop on Jul 6, 2021

Conversation

@frapac (Member) commented on Jul 2, 2021

Cleaned & rebased

* autodiff: add ConstantHessian structure
  Avoids computing Hessian-vector products with AutoDiff when the Hessian is constant
  (e.g. for voltage_magnitude_constraints); see the first sketch after this list.
* Remove hard-coded parts in computation of objective's gradient.
* Add a new function cost_production, with associated kernels.
* Specify ramping constraints with penalties in the objective
* Compute gradient and Hessian of ProxAL's objective with AutoDiff.
* ReducedSpaceEvaluator now uses batch Hessian by default
* add support for batch Hessian (see the second sketch after this list)
* add batch_jacobian function
* generate Hessian code with metaprogramming (see the third sketch after this list)
* change signature of hessian_lagrangian_penalty_prod!
* Allow offloading the computation to the GPU while using
  a CPU-compatible solver.
* add CUSOLVERRF to the dependencies
* fix transpose in kernels
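
First, a minimal sketch of the ConstantHessian idea, assuming a hypothetical wrapper type: when a constraint's Hessian does not depend on the operating point, it can be assembled once, and every subsequent Hessian-vector product reduces to a plain mat-vec. The names and signature below are illustrative assumptions, not ExaPF's actual API.

```julia
using LinearAlgebra

# Hypothetical ConstantHessian wrapper (illustrative, not ExaPF's actual type):
# stores a Hessian assembled once, e.g. at construction.
struct ConstantHessian{MT<:AbstractMatrix}
    H::MT
end

# The Hessian-vector product y .= H * v needs no AutoDiff pass,
# since the stored matrix never changes with the point x.
hessian_prod!(hess::ConstantHessian, y, x, v) = mul!(y, hess.H, v)
```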
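
Second, the batch Hessian idea can be sketched with plain ForwardDiff rather than ExaPF's internals: a Hessian-vector product is the directional derivative of the gradient, and a batch evaluates one product per column of a tangent matrix. `hvp` and `batch_hessian_prod` are hypothetical helper names for illustration.

```julia
using ForwardDiff

# Hessian-vector product: H(x) * v = d/dt ∇f(x + t*v) at t = 0
# (forward-over-forward differentiation).
hvp(f, x, v) = ForwardDiff.derivative(t -> ForwardDiff.gradient(f, x .+ t .* v), 0.0)

# Batch version (hypothetical helper, not ExaPF's API): one product per column of V.
batch_hessian_prod(f, x, V::AbstractMatrix) =
    reduce(hcat, (hvp(f, x, view(V, :, j)) for j in axes(V, 2)))

# Quick check against the dense Hessian:
f(x) = 0.5 * sum(abs2, x) + x[1] * x[2]^2
x, V = randn(3), randn(3, 2)
@assert batch_hessian_prod(f, x, V) ≈ ForwardDiff.hessian(f, x) * V
```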
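
Third, for the metaprogramming bullet, a standard Julia pattern is an `@eval` loop that stamps out one Hessian-product method per constraint. The toy constraints and generated names below are assumptions for the sake of the example, not the code generated in this PR.

```julia
using ForwardDiff

# Toy stand-ins for constraint residual functions (illustrative only).
pb(x) = sum(abs2, x)     # pretend power-balance residual
rl(x) = sum(x .^ 3)      # pretend reactive-limits residual
toy_constraints = (power_balance = pb, reactive_limits = rl)

# Stamp out one hessian_<name>_prod method per constraint with @eval.
for (name, f) in pairs(toy_constraints)
    fname = Symbol(:hessian_, name, :_prod)
    @eval $fname(x, v) =
        ForwardDiff.derivative(t -> ForwardDiff.gradient($f, x .+ t .* v), 0.0)
end

# The Hessian of sum(abs2, x) is 2I, so the generated method returns 2v:
x, v = randn(3), randn(3)
@assert hessian_power_balance_prod(x, v) ≈ 2 .* v
```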
@frapac requested a review from michel2323 on July 2, 2021
@frapac merged commit f136813 into develop on Jul 6, 2021
@frapac deleted the fp/review_final branch on July 23, 2021