Remove extra check for conclusion of Bisection.
The original code for Bisection included a secondary check, in addition to the strong Wolfe conditions, to decide whether the optimization was finished. The idea was to help mitigate floating point noise and allow for stronger convergence on the gradient. Unfortunately, all this does is add complexity: the parameters are ad hoc, and they trade off floating point noise against actual function modulation. For more complicated functions (especially concurrent ones) the noise will be higher, while other functions may have modulations that are very small, so it is impossible to design a tradeoff that is good for all functions. Instead, keep the code simple. This also fixes issues with Bisection and the outer OptLoc disagreeing on the optimum location.
btracey committed Jan 26, 2015
1 parent 0be7928 commit ef0cd18
Showing 1 changed file with 1 addition and 16 deletions.
17 changes: 1 addition & 16 deletions bisection.go
@@ -4,11 +4,7 @@

package optimize

-import (
-	"math"
-
-	"github.com/gonum/floats"
-)
+import "math"

// Bisection is a LinesearchMethod that uses a bisection to find a point that
// satisfies the strong Wolfe conditions with the given gradient constant and
@@ -60,18 +56,7 @@ func (b *Bisection) Init(initLoc LinesearchLocation, initStepSize float64, f *Fu
return FunctionAndGradientEval
}

-const (
-	funcSmallEqual = 1e-14
-	gradSmallEqual = 1e-10
-)
-
func (b *Bisection) Finished(l LinesearchLocation) bool {
-	if floats.EqualWithinRel(l.F, b.initF, funcSmallEqual) && math.Abs(l.Derivative) < gradSmallEqual && math.Abs(b.initGrad) < gradSmallEqual {
-		// The two numbers are so close that we should just check on the gradient
-		// TODO: Should iterate be updated? Maybe find a function where it needs it.
-		return math.Abs(l.Derivative) < b.GradConst*math.Abs(b.initGrad)
-	}
-
return StrongWolfeConditionsMet(l.F, l.Derivative, b.initF, b.initGrad, b.currStep, 0, b.GradConst)
}
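
For context, the strong Wolfe conditions that the simplified Finished now relies on exclusively combine a sufficient-decrease (Armijo) check with a strong curvature check. Below is a minimal sketch of such a check, assuming the textbook form of the conditions and the argument order visible in the call above; the helper name strongWolfeMet and the example values are illustrative, not the package's actual StrongWolfeConditionsMet implementation.

package main

import (
	"fmt"
	"math"
)

// strongWolfeMet reports whether a step satisfies the strong Wolfe
// conditions. The arguments mirror the call in Finished above: f and
// deriv are the function value and directional derivative at the
// candidate step, initF and initGrad are the values at step zero, and
// funcConst and gradConst are the sufficient-decrease and curvature
// constants (Finished passes funcConst = 0).
func strongWolfeMet(f, deriv, initF, initGrad, step, funcConst, gradConst float64) bool {
	// Sufficient decrease: f(step) <= f(0) + funcConst*step*f'(0).
	if f > initF+funcConst*step*initGrad {
		return false
	}
	// Strong curvature: |f'(step)| <= gradConst*|f'(0)|.
	return math.Abs(deriv) <= gradConst*math.Abs(initGrad)
}

func main() {
	// Hypothetical example: phi(t) = (t-1)^2 from t = 0, so phi(0) = 1
	// and phi'(0) = -2. At step t = 0.9: phi = 0.01, phi' = -0.2.
	fmt.Println(strongWolfeMet(0.01, -0.2, 1, -2, 0.9, 0, 0.9)) // true
}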
