
Dual eval #17
Merged: 7 commits into master, Feb 25, 2021

Conversation

@VMBoehm (Contributor) commented Feb 17, 2021

I added a ChiSquareProblem_dual test which is called during the LBFGS tests. It passes (and I made sure it's being called). Should I look at anything else in the tests?

abopt/base.py Outdated
self.complete_g(state)

if self.problem.problem_dual_eval:
    self.complete_y_g(state)
@rainwoodman (Member):

I see what you are concerned about: accessing a _member from outside of the class.

The branching can be done inside Problem.f_g():

if self._objective_gradient:
    return ...
else:
    return self.f(x), self.g(x)

@VMBoehm (Contributor, Author):

Done. The branching is now in Problem.fg(x). In Proposal I removed complete_y and complete_g and replaced them with complete_y_g; this required replacing complete_y in the linesearch with complete_y_g as well.
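For reference, a minimal sketch of the kind of branching described here. The attribute names (_objective, _gradient, _objective_gradient) follow the discussion above; the actual abopt implementation may differ:

class Problem:
    def __init__(self, objective=None, gradient=None, objective_gradient=None):
        self._objective = objective
        self._gradient = gradient
        self._objective_gradient = objective_gradient

    def f(self, x):
        return self._objective(x)

    def g(self, x):
        return self._gradient(x)

    def fg(self, x):
        # Prefer the combined (dual) evaluation when one was provided;
        # otherwise fall back to separate objective and gradient calls.
        if self._objective_gradient is not None:
            return self._objective_gradient(x)
        return self.f(x), self.g(x)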

@VMBoehm (Contributor, Author):

Is that what you meant?

abopt/base.py Outdated
@@ -323,6 +358,11 @@ def g(self, x):
    g = self._gradient(x)
    return g

def f_g(self, x):
@rainwoodman (Member):

Perhaps just call it fg? f_g may imply $f_g$. No strong preference here.

@VMBoehm (Contributor, Author):

done

def vjp(x, v): return v.dot(J) * phiprime(x)
def jvp(x, v): return J.dot(v * phiprime(x))

def objective(x):
@rainwoodman (Member):

Shall we do a test for each of the following cases:

  1. only objective_gradient
  2. objective and objective_gradient
  3. objective and gradient
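For concreteness, the three configurations might look roughly like this, reusing the hypothetical Problem sketch above; the actual ChiSquareProblem test setup (with vjp/jvp and a real chi-square objective) may differ:

# Illustrative only: a trivial quadratic stands in for the chi-square objective.
def objective(x): return x * x
def gradient(x): return 2 * x
def objective_gradient(x): return objective(x), gradient(x)

# 1. only objective_gradient
p1 = Problem(objective_gradient=objective_gradient)
# 2. objective and objective_gradient
p2 = Problem(objective=objective, objective_gradient=objective_gradient)
# 3. objective and gradient
p3 = Problem(objective=objective, gradient=gradient)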

@VMBoehm (Contributor, Author):

Will do. I have also added a check that objective and gradient are not None if objective_gradient is None.
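Such a check might look roughly like the following inside the hypothetical Problem.__init__ sketched above; the exact form of the assertion in abopt may differ:

def __init__(self, objective=None, gradient=None, objective_gradient=None):
    if objective_gradient is None:
        # Without a combined evaluation, both pieces must be supplied separately.
        assert objective is not None and gradient is not None
    self._objective = objective
    self._gradient = gradient
    self._objective_gradient = objective_gradient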

@rainwoodman merged commit 9e42d00 into master on Feb 25, 2021.
@rainwoodman (Member):
Thanks! I'll rename the tests (dual is less explicit than WithObjectGradient)

@rainwoodman deleted the dual_eval branch on February 25, 2021 at 16:45.
@rainwoodman mentioned this pull request on Feb 25, 2021.