
Add more options to control NLP scaling #649

Merged: 4 commits into develop on Aug 7, 2023

Conversation


@nychiang (Collaborator) commented Aug 3, 2023

In this PR, the following user parameters are introduced (a sketch of how they combine is given after the list):

scaling_max_obj_grad: If a positive value is given, the objective of the user's NLP is scaled so that the inf-norm of its gradient equals the given value. This value overrides the one given by scaling_max_grad. Default value is 0.

scaling_max_con_grad: If a positive value is given, the constraints of the user's NLP are scaled so that the inf-norm of their gradients equals the given value. This value overrides the one given by scaling_max_grad. Default value is 0.

scaling_min_grad: If a positive value is given, it is used as a lower bound for the scaling factors. This option takes priority: the final scaling factor must be greater than or equal to this value, even though that may violate the targets given by scaling_max_grad, scaling_max_obj_grad, and scaling_max_con_grad. Default value is 1e-8.
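Roughly, the three options interact as in the following minimal sketch. This is an illustration only, not HiOp's actual implementation; `compute_scale_factor` is a hypothetical helper, and the shrink-only behavior (scaling is applied only when the gradient norm exceeds the target) is an assumption:

```cpp
#include <algorithm>

// Illustration only, not HiOp code: combine the options into one scaling
// factor for the objective (constraints would use scaling_max_con_grad).
// grad_inf_norm is the inf-norm of the objective gradient at the initial point.
double compute_scale_factor(double grad_inf_norm,
                            double scaling_max_grad,     // generic target
                            double scaling_max_obj_grad, // objective target; 0 = unset
                            double scaling_min_grad)     // lower bound on the factor
{
  // A positive scaling_max_obj_grad overrides scaling_max_grad.
  const double target =
      (scaling_max_obj_grad > 0.0) ? scaling_max_obj_grad : scaling_max_grad;

  double scale = 1.0;
  if (target > 0.0 && grad_inf_norm > target) {
    // Shrink so the inf-norm of the scaled gradient equals the target
    // (assuming, as is common, that scaling only ever shrinks).
    scale = target / grad_inf_norm;
  }
  // scaling_min_grad takes priority: the factor never drops below it.
  return std::max(scale, scaling_min_grad);
}
```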

Closes #648

@nychiang requested a review from @cnpetra on August 3, 2023 at 22:49
@cnpetra (Collaborator) commented Aug 4, 2023

We should test this manually to ensure that it does not mess up the duals (and, of course, that it brings the derivative norms to the targets).

@nychiang (Collaborator, Author) commented Aug 4, 2023

I did some tests with NlpSparseEx1, where one can specify a scaling factor on the command line. I experimented with that scaling factor and the three options introduced in this PR: every variant of the problem converges, albeit with different iteration counts. In the worst case I got some warnings about a large residual from the compressed linear system. A sketch of how the options might be set in such a driver follows.
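For reference, here is roughly how the new options could be exercised from a driver, following the pattern of HiOp's sparse example drivers. This is a hedged sketch, not the actual NlpSparseEx1 driver; `solve_with_scaling_options` is a hypothetical function, the option values are arbitrary examples, and the headers and class names should be checked against the example code:

```cpp
#include "hiopInterface.hpp"
#include "hiopNlpFormulation.hpp"
#include "hiopAlgFilterIPM.hpp"

// `problem` is the user's NLP, e.g. the problem class from the sparse
// example drivers, which implements hiop::hiopInterfaceSparse.
hiop::hiopSolveStatus solve_with_scaling_options(hiop::hiopInterfaceSparse& problem)
{
  hiop::hiopNlpSparse nlp(problem);

  // Options introduced in this PR (values are arbitrary examples).
  nlp.options->SetNumericValue("scaling_max_obj_grad", 100.0); // objective gradient target
  nlp.options->SetNumericValue("scaling_max_con_grad", 100.0); // constraint gradient target
  nlp.options->SetNumericValue("scaling_min_grad", 1e-8);      // floor on the scaling factors

  hiop::hiopAlgFilterIPMNewton solver(&nlp);
  return solver.run();
}
```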

@nychiang (Collaborator, Author) commented Aug 4, 2023

@cnpetra the new commit updates the option descriptions to match the .tex file.

@cnpetra merged commit 980844c into develop on Aug 7, 2023
6 of 7 checks passed