
[line-search] implement Armijo line search algorithm #925

Merged · 12 commits into pymor:master · Jun 9, 2020

Conversation

HenKlei
Member

@HenKlei HenKlei commented May 18, 2020

When solving nonlinear problems for several different parameters, finding a value for the relaxation parameter of the Newton solver that works for every parameter is quite difficult. Thus, I implemented a (very) simple line search algorithm that, at least in my application, led to convergence of the Newton method for every parameter.

To check the original Armijo-Goldstein condition, one needs the gradient of the objective function. Thus, if no gradient is provided, a weaker version of this condition is checked, namely that the function value decreases, no matter to what extent.
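The backtracking scheme described above can be sketched in plain Python. Names, defaults, and the interface here are illustrative only, not pyMOR's actual `armijo` signature:

```python
import numpy as np

def armijo(f, x, direction, grad=None, initial_alpha=1.0, tau=0.5, beta=1e-4, max_iter=50):
    """Backtracking line search: shrink alpha until a decrease condition holds.

    If a gradient is given, the Armijo condition
        f(x + alpha*d) <= f(x) + beta * alpha * <grad, d>
    is checked; otherwise only plain decrease f(x + alpha*d) < f(x),
    mirroring the weaker gradient-free variant described above.
    """
    alpha = initial_alpha
    fx = f(x)
    slope = np.dot(grad, direction) if grad is not None else None
    for _ in range(max_iter):
        fnew = f(x + alpha * direction)
        if slope is not None:
            if fnew <= fx + beta * alpha * slope:
                return alpha
        elif fnew < fx:
            return alpha
        alpha *= tau  # condition failed: shrink the step
    return alpha

# For f(x) = x^2 at x = 1 with descent direction -2 (the negative gradient),
# the full step alpha = 1 overshoots; one halving suffices.
step = armijo(lambda v: float(v[0] ** 2), np.array([1.0]), np.array([-2.0]), grad=np.array([2.0]))
```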

To be able to compute the gradient of the error function in the Newton method, we switched from error_norm to error_product in the Newton code, so we can derive the gradient for any error_product. Of course, this restricts us to the Hilbert space case; non-Hilbert norms are no longer supported.

Furthermore, we had to make some adjustments in the tests of the Newton method; the Jacobian of the MonomOperators was not implemented properly. We added a test case for the line search algorithm, in which the Newton without line search starts oscillating between two values.

@codecov

codecov bot commented May 18, 2020

Codecov Report

Merging #925 into master will increase coverage by 0.16%.
The diff coverage is 95.45%.

Impacted Files Coverage Δ
src/pymor/algorithms/newton.py 90.90% <94.44%> (+4.37%) ⬆️
src/pymor/algorithms/line_search.py 96.15% <96.15%> (ø)
src/pymor/operators/interface.py 85.91% <0.00%> (+0.70%) ⬆️
src/pymor/operators/constructions.py 84.28% <0.00%> (+0.95%) ⬆️
src/pymor/vectorarrays/numpy.py 84.55% <0.00%> (+0.98%) ⬆️
src/pymor/tools/formatrepr.py 87.80% <0.00%> (+1.21%) ⬆️

@TiKeil

TiKeil commented May 19, 2020

Hi Hendrick,

this already looks very good. Could you add a reference for the Armijo-Goldstein line search?

Also, I would be very happy if you could rename the arguments of the armijo rule to more general names. Please have a look at the requested changes.

Tim

@HenKlei
Member Author

HenKlei commented May 19, 2020

Hi Tim, thank you for your suggestions! I will have a look.

Member

@sdrave sdrave left a comment

Hi @HenKlei, thanks a lot for this PR. This is looking very good already. You should also add an entry to AUTHORS.md.

Member

@sdrave sdrave left a comment

Apart from minor issues, I think this can be merged soon. @HenKlei, you should still add an entry to AUTHORS.md ...

@sdrave sdrave added this to the 2020.1 milestone May 26, 2020
Member

@sdrave sdrave left a comment

This is all looking very nice now. Only one thing remains: there should be some form of test for the new code. There is already something very basic in pymortests.algorithms.stuff. Could you add something there?

@HenKlei
Member Author

HenKlei commented May 27, 2020

From my point of view, there isn't anything missing and this is ready to get merged.

@TiKeil, @sdrave Any objections or further comments?

@HenKlei
Member Author

HenKlei commented May 27, 2020

This is all looking very nice now. Only one thing remains: there should be some form of test for the new code. There is already something very basic in pymortests.algorithms.stuff. Could you add something there?

I have an idea for a test, I will work on that.

@HenKlei
Member Author

HenKlei commented May 28, 2020

I added a small test in de2747b, but actually I am a bit surprised by the behavior of the Newton algorithm in this case. Using 0.5 as initial value does not work without the line search method. Yet, as far as I can see, the algorithm should converge even with 0.5 as initial guess. Maybe there is something wrong with the monomials, or I did not get the idea of the tests in stuff.py at all?

@sdrave
Member

sdrave commented May 29, 2020

@HenKlei, turns out the implementation of MonomOperator.jacobian is wrong. The right implementation would be:

    def jacobian(self, U, mu=None):
        return NumpyMatrixOperator(self.derivative(U.to_numpy()).reshape((1, 1)))

Can you fix it in this PR?

@HenKlei
Member Author

HenKlei commented May 29, 2020

Alright, I'm going to fix this and have another look at the tests. Thanks for the hint.

@HenKlei
Member Author

HenKlei commented May 29, 2020

@sdrave After applying the fix you suggested, the test does not pass. This makes complete sense when looking at the problem: for instance, the monomial of degree 2, i.e. x^2, reaches the desired absolute tolerance of 1e-15 already for values like x = 1e-8. Hence, comparing this value of x to 0 leads to a failing test, although the correct root is approximated. I think this test is maybe not the best for the Newton algorithm, at least in the way it is currently performed. Do you have any suggestions on how to proceed?

@sdrave
Member

sdrave commented Jun 2, 2020

Hence, comparing this value of x to 0 leads to a failing test, although the correct root is approximated. I think this test is maybe not the best for the Newton algorithm, at least in the way it is currently performed.

Not at all ...

Do you have any suggestions on how to proceed?

Maybe the easiest way would be to check whether the solution actually has a residual smaller than atol?
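A check along these lines can be sketched with NumPy. The function name and signature here are illustrative only, not pyMOR's API:

```python
import numpy as np

def check_newton_result(f, x_approx, atol=1e-15):
    """Accept an approximate root if the residual norm is below atol,
    instead of comparing x_approx to the exact root."""
    return float(np.linalg.norm(f(x_approx))) < atol

# x = 1e-8 is accepted as a root of x**2 (residual 1e-16 < 1e-15),
# even though comparing x itself against 0 would fail at this tolerance
accepted = check_newton_result(lambda x: x ** 2, np.array([1e-8]))
```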

@sdrave
Member

sdrave commented Jun 3, 2020

@HenKlei, I see you have used float_cmp, which is also fine. Shall we merge this now?

@HenKlei
Member Author

HenKlei commented Jun 4, 2020

Now, the test should work as expected.

@sdrave Maybe you could have another look. From my point of view, this can be merged.

@HenKlei
Member Author

HenKlei commented Jun 4, 2020

The test_lincomb_op test is broken due to the changes to MonomOperator. I will have another look.

@HenKlei
Member Author

HenKlei commented Jun 5, 2020

I had another look at test_lincomb_op in pymortests/operators.py. This is the first part of the method:

def test_lincomb_op():
    p1 = MonomOperator(1)
    p2 = MonomOperator(2)
    p12 = p1 + p2
    p0 = p1 - p1
    x = np.linspace(-1., 1., num=3)
    vx = p1.source.make_array((x[:, np.newaxis]))
    assert np.allclose(p0.apply(vx).to_numpy(), [0.])
    assert np.allclose(p12.apply(vx).to_numpy(), (x * x + x)[:, np.newaxis])
    assert np.allclose((p1 * 2.).apply(vx).to_numpy(), (x * 2.)[:, np.newaxis])
    assert almost_equal(p2.jacobian(vx).apply(vx), p1.apply(vx) * 2.).all()

To me, the last line does not really make sense. I think p2.jacobian(vx).apply(vx) should be some kind of list of directional derivatives at the points given by vx in the directions given by vx. On the other hand, p1.apply(vx) * 2. should be a list of directional derivatives at the points given by vx in direction 1 in each component. I think p2.jacobian(vx).apply(vx) should be replaced by something like p2.jacobian(vx).apply(ones), where ones is a vector containing 1 in each entry.
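For p2(x) = x^2, whose derivative is 2x, the mismatch can be checked with plain NumPy arithmetic (this only illustrates the argument; it is not pyMOR code):

```python
import numpy as np

x = np.linspace(-1., 1., num=3)   # the points used in the test: [-1, 0, 1]
deriv = 2 * x                     # derivative of x**2 at each point: [-2, 0, 2]

# directional derivative at each point x in direction x: (2x) * x
in_direction_x = deriv * x
# what the test's right-hand side effectively compares against: (2x) * 1
in_direction_one = deriv * np.ones_like(x)

# the two disagree at x = -1 (2.0 vs. -2.0), so the assertion cannot hold
assert not np.allclose(in_direction_x, in_direction_one)
```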

Nevertheless, I did not get the test to work again so far. The dimensions of the vectors and operators do not seem to fit. @sdrave Are you sure that the reshape in return NumpyMatrixOperator(self.derivative(U.to_numpy()).reshape((1,1))) is correct, especially when the Jacobian should be used in test_lincomb_op in the way it is at the moment?

@sdrave
Member

sdrave commented Jun 5, 2020

Nevertheless, I did not get the test to work again so far. The dimensions of the vectors and operators do not seem to fit. @sdrave Are you sure that the reshape in return NumpyMatrixOperator(self.derivative(U.to_numpy()).reshape((1,1))) is correct, especially when the Jacobian should be used in test_lincomb_op in the way it is at the moment?

jacobian should only get a VectorArray of length 1. You cannot get more than one Jacobian with a single call. So the test is wrong, and an assertion in MonomOperator.jacobian might be helpful as well.
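One way such a guard might look, sketched on a NumPy-only toy stand-in rather than pyMOR's actual MonomOperator and VectorArray classes:

```python
import numpy as np

class MonomOperator:
    """Toy stand-in (not pyMOR's class) mapping x -> x**p, illustrating
    how jacobian can reject inputs of length != 1."""

    def __init__(self, p):
        self.p = p

    def apply(self, U):
        return U ** self.p

    def jacobian(self, U):
        U = np.atleast_2d(U)
        # a single call can only produce one Jacobian, so insist on length 1
        assert len(U) == 1, 'jacobian expects a VectorArray of length 1'
        # 1x1 Jacobian matrix of x -> x**p at the single given point
        return (self.p * U ** (self.p - 1)).reshape((1, 1))

op = MonomOperator(2)
J = op.jacobian(np.array([[3.0]]))  # derivative of x**2 at x = 3 is 6
```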

@HenKlei
Member Author

HenKlei commented Jun 5, 2020

jacobian should only get a VectorArray of length 1. You cannot get more than one Jacobian with a single call. So the test is wrong, and an assertion in MonomOperator.jacobian might be helpful as well.

Alright, then I should be able to fix that. Thank you!

@HenKlei
Member Author

HenKlei commented Jun 6, 2020

@sdrave The tests pass. Just two small questions:
Should we rename stuff.py to newton.py? Do you have any improvements or simplifications for test_lincomb_op?

@sdrave
Member

sdrave commented Jun 8, 2020

@HenKlei, thanks for all the additional work!

Should we rename stuff.py to newton.py?

Probably. Although I would do it in a separate PR.

Do you have any improvements or simplifications for test_lincomb_op?

I think this is not our nicest test, but for now I would leave it as is. @renefritze is currently changing a lot in the test suite, and I would wait a bit until things have settled down.

I would say, let's merge this PR.

@HenKlei
Member Author

HenKlei commented Jun 9, 2020

Alright, sounds good!

@sdrave sdrave merged commit 9ad6fef into pymor:master Jun 9, 2020
@HenKlei HenKlei deleted the line-search branch August 31, 2020 09:58