Description
The current Newton's method implementation has no option to force the Hessian to be positive definite, which is a common technique for improving convergence.
In the current implementation the algorithm will not converge to the global minimum if the following conditions are met:
- The Hessian is indefinite
- The directional derivative is negative
- The initial guess is close to both a local maximum and the desired global minimum (admittedly too vague to be a very useful condition, but there are cases where the first two hold and the algorithm still converges, so I mention it anyway; see the sketch after this list)
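For context, here is my understanding of why the first two conditions matter (I read the second condition as the directional derivative of the objective along the Newton step). With gradient $g$ and Hessian $H$, the Newton step $p$ solves $Hp = -g$, so the directional derivative along the step is

$$g^\top p = -g^\top H^{-1} g.$$

If $H$ is positive definite, this is negative for every nonzero $g$ and the step is guaranteed to be a descent direction. If $H$ is indefinite, the step may still be a descent direction, but it points at the stationary point of the local quadratic model, which is then a saddle rather than a minimum, so the iteration can be pulled toward the wrong stationary point.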
This may not be a complete description of the cases where forcing positive definiteness helps, but I ran into such a case myself and can illustrate it with the Six-Hump Camel function.
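To make the failure mode concrete, here is a small self-contained reproduction in Python (a plain Newton iteration, not the library's NewtonMinimizer; the starting point (0.1, 0.1) is one I picked for illustration):

```python
import numpy as np

def f(x, y):
    # Six-hump camel function
    return (4 - 2.1 * x**2 + x**4 / 3) * x**2 + x * y + (-4 + 4 * y**2) * y**2

def grad(x, y):
    return np.array([8 * x - 8.4 * x**3 + 2 * x**5 + y,
                     x - 8 * y + 16 * y**3])

def hess(x, y):
    return np.array([[8 - 25.2 * x**2 + 10 * x**4, 1.0],
                     [1.0, -8 + 48 * y**2]])

p = np.array([0.1, 0.1])  # initial guess near the saddle at the origin
for _ in range(20):
    g, H = grad(*p), hess(*p)
    step = np.linalg.solve(H, -g)  # plain Newton step, no modification
    p = p + step

print(p, f(*p))  # lands on the saddle (0, 0) with f = 0, not the global
                 # minimum f ≈ -1.0316 at (±0.0898, ∓0.7126)
```

At the starting point the Hessian has eigenvalues of both signs, yet the dot product of the gradient with the Newton step is negative, so a simple descent check would pass; the iteration nevertheless converges to the saddle at the origin.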
Below is a contour plot of this function with the areas that satisfy the first two conditions above highlighted in dark blue. I picked some points from those areas and confirmed that the NewtonMinimizer implementation does not converge when they are used as initial guesses.
It would be interesting to write a test that iterates over a grid of initial guesses in the domain above to check whether these two conditions are necessary and/or sufficient for non-convergence, but I do not currently have the time and it is outside the scope of this issue.
To restate: an option should be added to NewtonMinimizer that forces the Hessian to be positive definite.
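For reference, here is a minimal sketch of one standard way to do this (spectral modification of the Hessian); the pull request may well take a different approach, such as adding a multiple of the identity until a Cholesky factorization succeeds:

```python
import numpy as np

def make_positive_definite(H, delta=1e-3):
    """Clamp the spectrum of a symmetric matrix H from below.

    delta is a relative floor: eigenvalues smaller than
    delta * max(|eigenvalue|) are raised to that floor.
    """
    w, V = np.linalg.eigh(H)                  # symmetric eigendecomposition
    floor = delta * max(np.abs(w).max(), 1.0)
    w_clamped = np.maximum(w, floor)
    return V @ np.diag(w_clamped) @ V.T

# The modified Newton step -solve(make_positive_definite(H), g) is then
# guaranteed to be a descent direction; in practice it is paired with a
# line search because the clamped direction can be long.
```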
I will shortly create a pull request with my proposed solution to this issue.
Regards,
Mikhail