Linesearchers should check function value before asking for gradient #127
Backtracking returns only FuncEvaluation, but at present Bisection needs both at the same time. Splitting it will require some changes and perhaps a bit of thought. If the step satisfies the Armijo condition, it is easy: just return GradEvaluation. If not, then it is less clear. If maxStep has not been found yet, then the right thing is probably to halve the step and try again with another FuncEvaluation. If the step has already been bounded, then ... ? Anyway, once this is done, I will be this close to suggesting that we drop the OR-ability of evaluation operations. Just saying (although I am afraid it was me who came up with the idea).
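The halve-and-retry protocol described above can be sketched roughly as follows. This is an illustrative, simplified mirror of the idea, not the actual gonum/optimize API: the Operation flags and nextOp are hypothetical names.

```go
package main

import "fmt"

// Hypothetical, simplified stand-in for the operation flags the
// thread discusses (the real type lives in gonum's optimize package).
type Operation uint

const (
	FuncEvaluation Operation = 1 << iota
	GradEvaluation
)

// nextOp sketches the split protocol: after receiving a fresh function
// value, either request the gradient (the Armijo sufficient-decrease
// condition holds) or halve the step and re-probe with another cheap
// FuncEvaluation.
func nextOp(fNew, f0, g0 float64, step *float64, c float64) Operation {
	if fNew <= f0+c*(*step)*g0 { // Armijo condition
		return GradEvaluation // only now pay for the gradient
	}
	*step /= 2 // step not yet acceptable: halve and try f again
	return FuncEvaluation
}

func main() {
	step := 1.0
	// f0=1, g0=-1, c=1e-4: a new value of 2 fails Armijo, 0.5 passes.
	fmt.Println(nextOp(2.0, 1.0, -1.0, &step, 1e-4), step) // 1 0.5 (FuncEvaluation, halved)
	fmt.Println(nextOp(0.5, 1.0, -1.0, &step, 1e-4))       // 2 (GradEvaluation)
}
```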
I tried to come up with something for Bisection that would actually save some work when a step does not satisfy the Armijo condition. But needing both values simultaneously is essential to the algorithm, so either we don't save any work or we end up with a different algorithm. Do you see another way, @btracey ? If we just want to issue FuncEvaluation and GradEvaluation separately, that can be done easily, though.
That's exactly what I have in mind. Request the function value. If it passes, request the gradient. If not, update.
But the update needs the gradient too, so we don't save anything.
Continuing the previous discussion from closed #137:
... One question was whether gradient computations can be saved if we know that a point does not even satisfy the Armijo condition. It is obvious to me now that they cannot: the algorithm simply needs the derivative at the midpoint to decide which side to subdivide. We could modify the algorithm, but then it would not be the same method, and it is not clear how to modify it or whether it would be worth it. If we modified the bracketing phase not to use g, only f, then we could save gradient evaluations during that phase, because the action there never depends on g: it is always to go forward. ... but Backtracking does not ask for the gradient, and Bisection cannot do without it.
The very first check in Bisection is g < 0, so as it stands we always need the derivative. We could remove it, though, and make the bracketing phase derivative-free (slide forward until the midpoint has a lower value than the two endpoints). Perhaps we should. Also, shouldn't we fail earlier than when b.step == step? Just asking.
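The derivative-free bracketing proposed here could look roughly like the sketch below: slide the interval forward until the midpoint value is lower than at both endpoints, so a minimum is bracketed with function evaluations only. The function name and the iteration cap are illustrative, not part of the actual code.

```go
package main

import "fmt"

// bracket slides a fixed-width interval forward until the midpoint has
// a lower function value than both endpoints, bracketing a minimum
// without any gradient evaluations. A hedged sketch, not gonum code.
func bracket(f func(float64) float64, step float64) (a, b float64, ok bool) {
	a, b = 0, 2*step
	for i := 0; i < 100; i++ { // safeguard against unbounded slides
		mid := (a + b) / 2
		if f(mid) < f(a) && f(mid) < f(b) {
			return a, b, true // minimum bracketed in [a, b]
		}
		a, b = b, b+2*step // the action is always: go forward
	}
	return a, b, false
}

func main() {
	f := func(x float64) float64 { return (x - 3) * (x - 3) }
	a, b, ok := bracket(f, 1)
	fmt.Println(a, b, ok) // 2 4 true: the minimum at x=3 is bracketed
}
```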
First, consider the initial bracketing phase. There are four cases.
Both cases 3 and 4 mean that a minimum has been bracketed, since the function value has increased, so there we do not need to check the gradient. Once a minimum is bracketed and we look at a new point, again, if the function value is higher we always cut toward the best minimum. It is only when the function value is lower that the gradient value is needed.
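The cutting rule described above can be sketched as a single decision function. This is my reading of the argument, under illustrative names: a gradient is consulted only when the midpoint value undercuts both bracket endpoints.

```go
package main

import "fmt"

// chooseSide sketches the observation in the thread: when a new
// midpoint value is not lower than both endpoints, we cut toward the
// lower-valued endpoint and no gradient is required; only when fMid
// undercuts both does the sign of the derivative gMid pick the side.
func chooseSide(a, b, mid, fA, fB, fMid, gMid float64) (newA, newB float64, usedGrad bool) {
	if fMid >= fA || fMid >= fB {
		if fA < fB {
			return a, mid, false // keep the half holding the better endpoint
		}
		return mid, b, false
	}
	// fMid is the new best value: the derivative decides the side.
	if gMid < 0 {
		return mid, b, true // still descending: minimum lies to the right
	}
	return a, mid, true
}

func main() {
	// Midpoint worse than the left endpoint: gradient never consulted.
	a, b, used := chooseSide(0, 4, 2, 1.0, 5.0, 3.0, 0)
	fmt.Println(a, b, used) // 0 2 false
	// Midpoint is the new best and the slope is negative: go right.
	a, b, used = chooseSide(0, 4, 2, 1.0, 5.0, 0.5, -1)
	fmt.Println(a, b, used) // 2 4 true
}
```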
Yes. But currently the first test in Bisection during bracketing is the g < 0 check mentioned above.
Oh, I have just realized what did not fit and was thus confusing me: the current code stores the derivatives in b.minGrad and b.maxGrad, which makes it seem as if those derivatives were needed. But in fact they are not! They are just stored and never used, so yes, we can save gradient evaluations in some cases. Phew. Do you want to put this together, or should I take a stab at it?
Fixed by #143
Many problems have a gradient that's expensive to compute. The line searchers we implement should check that the function value has decreased before asking for the gradient.
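A toy demonstration of why the order of requests matters when the gradient is expensive: count the evaluations under the two policies. All names here (counter, search) are illustrative, not the gonum/optimize API.

```go
package main

import "fmt"

// counter tallies how often the objective and its gradient are called.
type counter struct{ fCalls, gCalls int }

func (c *counter) f(x float64) float64 { c.fCalls++; return (x - 1) * (x - 1) }
func (c *counter) g(x float64) float64 { c.gCalls++; return 2 * (x - 1) }

// search halves the step until the Armijo condition holds. With
// checkFirst it requests the gradient only once a step is accepted;
// otherwise it evaluates the gradient at every trial step.
func search(c *counter, checkFirst bool) float64 {
	const x0, f0, g0, cArmijo = 0.0, 1.0, -2.0, 1e-4
	step := 16.0
	for {
		fNew := c.f(x0 + step)
		if !checkFirst {
			c.g(x0 + step) // eagerly evaluated, possibly wasted
		}
		if fNew <= f0+cArmijo*step*g0 {
			if checkFirst {
				c.g(x0 + step) // pay for the gradient only now
			}
			return step
		}
		step /= 2
	}
}

func main() {
	eager, lazy := &counter{}, &counter{}
	search(eager, false)
	search(lazy, true)
	// Same trial steps, but the lazy policy evaluates far fewer gradients.
	fmt.Println(eager.gCalls, lazy.gCalls) // 5 1
}
```

Under this setup both policies try the same five steps, but checking the function value first cuts the gradient evaluations from five to one.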