
Modify NewtonMinimizer so that it forces Hessian to be positive definite #1067

Open
@strMikhailPotapenko

Description

The current Newton's method implementation has no option to force the Hessian to be positive definite, which is a common technique for improving convergence (a minimal sketch of such a modification follows the list of conditions below).

In the current implementation, the algorithm will not converge to the global minimum when the following conditions are met:

  • The Hessian is indefinite
  • The directional derivative is negative
  • The initial guess is close to both a local maximum and the desired global minimum (too vague to be a very useful condition, but there are cases where the first two conditions hold and the algorithm still converges, so I mention it anyway)
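For concreteness, here is a minimal sketch of the kind of modification I mean, written in Python/NumPy purely for illustration (the function names and the `delta` eigenvalue floor are mine, not part of the existing NewtonMinimizer API): clamp the Hessian's eigenvalues to a small positive floor before solving for the step.

```python
import numpy as np

def make_positive_definite(hessian, delta=1e-8):
    """Return a positive definite modification of a symmetric Hessian.

    Eigenvalues below `delta` are raised to `delta` (spectral modification).
    """
    eigvals, eigvecs = np.linalg.eigh(hessian)
    clamped = np.maximum(eigvals, delta)
    return eigvecs @ np.diag(clamped) @ eigvecs.T

def modified_newton_step(gradient, hessian, delta=1e-8):
    """Solve H_mod p = -g with the modified, positive definite Hessian.

    Because H_mod is positive definite, p is a descent direction
    whenever the gradient is nonzero.
    """
    h_mod = make_positive_definite(hessian, delta)
    return np.linalg.solve(h_mod, -gradient)
```

An equivalent family of approaches adds a multiple of the identity to the Hessian until a Cholesky factorization succeeds; either way, the resulting step is guaranteed to be a descent direction.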

This may not be a complete description of the cases where forcing positive definiteness is helpful, but I ran into such a case and can illustrate it with the six-hump camel function.

Below is a contour plot of this function with the areas that satisfy the first two conditions above highlighted in dark blue. I have picked some points from those areas and confirmed that the NewtonMinimizer implementation does not converge when using them as initial guesses.

[Contour plot of the six-hump camel function, with the regions satisfying the first two conditions highlighted in dark blue]

It would be interesting to write a test that iterates over a grid in the domain above to see whether those two conditions are necessary and/or sufficient for non-convergence, but I don't currently have the time and it is outside the scope of this issue.
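For anyone who wants to pick this up, below is a rough Python/NumPy sketch of such a grid check for the six-hump camel function (again written for illustration, not in the library's own code). The domain bounds, the grid resolution, and my reading of "directional derivative" as the derivative of f along the raw Newton step are all assumptions on my part.

```python
import numpy as np

def camel_grad(x, y):
    """Gradient of the six-hump camel function
    f(x, y) = (4 - 2.1*x**2 + x**4/3)*x**2 + x*y + (-4 + 4*y**2)*y**2."""
    return np.array([8*x - 8.4*x**3 + 2*x**5 + y,
                     x - 8*y + 16*y**3])

def camel_hess(x, y):
    """Analytic Hessian of the six-hump camel function."""
    return np.array([[8 - 25.2*x**2 + 10*x**4, 1.0],
                     [1.0, -8 + 48*y**2]])

def conditions_hold(x, y):
    """True when the Hessian is indefinite and the directional
    derivative of f along the raw Newton step is negative."""
    g = camel_grad(x, y)
    h = camel_hess(x, y)
    if abs(np.linalg.det(h)) < 1e-12:
        return False  # skip (near-)singular Hessians
    eigvals = np.linalg.eigvalsh(h)
    indefinite = eigvals.min() < 0 < eigvals.max()
    step = np.linalg.solve(h, -g)          # raw Newton step p = -H^-1 g
    negative_dd = float(g @ step) < 0.0    # directional derivative g . p
    return indefinite and negative_dd

# Sweep a grid over the usual domain x in [-3, 3], y in [-2, 2]
flagged = [(x, y)
           for x in np.linspace(-3, 3, 61)
           for y in np.linspace(-2, 2, 41)
           if conditions_hold(x, y)]
print(f"{len(flagged)} of {61 * 41} grid points satisfy both conditions")
```

Comparing the flagged points against the points where NewtonMinimizer actually fails to converge would show how close the two conditions come to being necessary and/or sufficient.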

To restate: an option should be added to NewtonMinimizer that forces the Hessian to be positive definite.

I will shortly create a pull request with my proposed solution to this issue.

Regards,
Mikhail
