Dual eval #17
Conversation
abopt/base.py
Outdated
self.complete_g(state)

if self.problem.problem_dual_eval:
    self.complete_y_g(state)
I see what you are concerned about: accessing a _member from outside of the class.
The branching can be done inside Problem.f_g():
if self._objective_gradient:
    return ...
else:
    return self.f(x), self.g(x)
Done. The branching is now in Problem.fg(x). In Proposal I removed complete_y and complete_g and replaced them with complete_y_g; this required replacing complete_y with complete_y_g in the linesearch as well.
Is that what you meant?
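For reference, a minimal self-contained sketch of the behaviour discussed above (the constructor signature and attribute names are assumptions based on this thread, not abopt's actual code):

```python
class Problem:
    """Sketch of a problem with an optional fused objective/gradient."""

    def __init__(self, objective=None, gradient=None, objective_gradient=None):
        # If no fused objective_gradient is given, both separate callables
        # must be present (the check discussed later in this thread).
        if objective_gradient is None and (objective is None or gradient is None):
            raise ValueError("need objective_gradient, or objective and gradient")
        self._objective = objective
        self._gradient = gradient
        self._objective_gradient = objective_gradient

    def fg(self, x):
        # The branching lives here, so callers never touch private members.
        if self._objective_gradient is not None:
            return self._objective_gradient(x)
        return self._objective(x), self._gradient(x)
```

With this sketch, `Problem(objective=lambda x: x * x, gradient=lambda x: 2 * x).fg(3.0)` returns `(9.0, 6.0)`, and a fused `objective_gradient` would take the first branch instead.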
abopt/base.py
Outdated
@@ -323,6 +358,11 @@ def g(self, x):
        g = self._gradient(x)
        return g

    def f_g(self, x):
Perhaps just call it fg? f_g may imply $f_g$. No strong preference here.
Done.
abopt/testing/__init__.py
Outdated
def vjp(x, v): return v.dot(J) * phiprime(x)
def jvp(x, v): return J.dot(v * phiprime(x))

def objective(x):
Shall we do a test for each of the following cases:
- only objective_gradient
- objective and objective_gradient
- objective and gradient
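A sketch of how those three construction cases could each be exercised (the Problem class below is a local stand-in for illustration; the real tests would use abopt's Problem):

```python
class Problem:
    """Stand-in for abopt's Problem; illustration only."""

    def __init__(self, objective=None, gradient=None, objective_gradient=None):
        if objective_gradient is None and (objective is None or gradient is None):
            raise ValueError("need objective_gradient, or objective and gradient")
        self._objective, self._gradient = objective, gradient
        self._objective_gradient = objective_gradient

    def fg(self, x):
        if self._objective_gradient is not None:
            return self._objective_gradient(x)
        return self._objective(x), self._gradient(x)


f = lambda x: 0.5 * x * x
g = lambda x: x
fused = lambda x: (0.5 * x * x, x)

# One case per bullet above; all three constructions must agree on fg(x).
cases = {
    "only objective_gradient": Problem(objective_gradient=fused),
    "objective and objective_gradient": Problem(objective=f, objective_gradient=fused),
    "objective and gradient": Problem(objective=f, gradient=g),
}
for name, problem in cases.items():
    assert problem.fg(4.0) == (8.0, 4.0), name
```

Running all three through the same assertion checks that the fused and separate code paths produce identical results.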
Will do. I have also added a statement that checks that objective and gradient are not None if objective_gradient is.
Thanks! I'll rename the tests (dual is less explicit than WithObjectGradient).
I added a ChiSquareProblem_dual test which is called during the LBFGS tests. It passes (and I made sure it's being called). Should I look at anything else in the tests?
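As an aside on why the dual (fused) path matters for a chi-square objective: the value and the gradient share the residual, so one fused evaluation can compute it once. A hypothetical pure-Python sketch (make_chisq_fg is invented here for illustration; it is not the ChiSquareProblem from abopt.testing):

```python
def make_chisq_fg(J, y):
    # Fused value/gradient for f(x) = |J x - y|^2.
    # The residual r = J x - y is computed once and reused:
    # value = r . r, gradient = 2 J^T r.
    def objective_gradient(x):
        r = [sum(Jij * xj for Jij, xj in zip(Ji, x)) - yi
             for Ji, yi in zip(J, y)]
        value = sum(ri * ri for ri in r)
        grad = [2 * sum(J[i][j] * r[i] for i in range(len(r)))
                for j in range(len(x))]
        return value, grad
    return objective_gradient


fg = make_chisq_fg([[1.0, 0.0], [0.0, 1.0]], [0.0, 0.0])
```

With J the identity and y = 0, fg([1.0, 2.0]) returns (5.0, [2.0, 4.0]); a separate f and g would each recompute the residual.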