
Add optional kwargs to optimize! for optimize_hook #1987

Merged
merged 4 commits into jump-dev:master on Jun 22, 2019

Conversation

gsoleilhac (Contributor)

A simple change to pass along kwargs to the optimize_hook.
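For context, a minimal sketch of what this enables, assuming the JuMP 0.19-era API referenced in the docstring later in this thread (with_optimizer, a hook stored in the model's optimize_hook field). The hook name my_solve_hook, its silent keyword, and the use of GLPK are illustrative only, not part of this PR:

```julia
using JuMP, GLPK

# Illustrative hook: when a model has an optimize_hook set, JuMP calls it
# instead of running the default solve. With this PR, keyword arguments
# given to optimize! are forwarded to the hook.
function my_solve_hook(model; silent = true)
    silent || println("running custom pre-solve step")
    # Hand control back to the regular solve, skipping the hook this time.
    return optimize!(model, ignore_optimize_hook = true)
end

model = Model(with_optimizer(GLPK.Optimizer))
@variable(model, x >= 0)
@objective(model, Min, x)

model.optimize_hook = my_solve_hook   # assumption: hook attached via the model field
optimize!(model, silent = false)      # silent = false now reaches my_solve_hook
```

Before this change, keyword arguments given to optimize! had no way of reaching the hook.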


codecov bot commented Jun 17, 2019

Codecov Report

Merging #1987 into master will increase coverage by <.01%.
The diff coverage is 100%.


@@            Coverage Diff             @@
##           master    #1987      +/-   ##
==========================================
+ Coverage   88.86%   88.86%   +<.01%     
==========================================
  Files          33       33              
  Lines        4282     4283       +1     
==========================================
+ Hits         3805     3806       +1     
  Misses        477      477
Impacted Files Coverage Δ
src/optimizer_interface.jl 77.5% <100%> (+0.57%) ⬆️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 36dc086...1a03049.

test/model.jl (outdated review thread, resolved)
gsoleilhac (Contributor, Author)

Done. Should the error be thrown sooner to avoid side effects if optimizer_factory !== nothing or model.nlp_data !== nothing?

mlubin (Member) left a comment

> Should the error be thrown sooner to avoid side effects if optimizer_factory !== nothing or model.nlp_data !== nothing?

No, I don't think that's a big issue.


Optimize the model. If `optimizer_factory` is not `nothing`, it first sets the
optimizer to a new one created using the optimizer factory. The factory can be
created using the [`with_optimizer`](@ref) function. If `optimizer_factory` is
`nothing` and no optimizer was set to `model` before calling this function, a
[`NoOptimizer`](@ref) error is thrown.

Keyword arguments `kwargs` are passed to the `optimize_hook`. An error is
thrown if `optimize_hook` is `nothing` and keyword arguments are provided

Member

Nit: Please end the sentence with a period.
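
To make the error case in the docstring above concrete, a small hedged sketch; the exact error type and message are not stated in this thread, and GLPK is just a placeholder solver:

```julia
using JuMP, GLPK

model = Model(with_optimizer(GLPK.Optimizer))
@variable(model, x >= 0)

# No optimize_hook is set on this model, so per the docstring above,
# supplying a keyword argument to optimize! should throw an error.
try
    optimize!(model, not_a_real_option = 1)   # made-up keyword argument
catch err
    @show err   # expected: an error about the unrecognized keyword argument
end
```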

@mlubin mlubin merged commit cf5657a into jump-dev:master Jun 22, 2019
dourouc05 pushed a commit to dourouc05/JuMP.jl that referenced this pull request Jul 18, 2019
* Add optional kwargs to optimize! for optimize_hook

* Add test for unexpected kwarg error

* Add documentation on kwargs to optimize!

* Formatting tweak