Switch gdpopt preprocessing to use FBBT #2264
Codecov Report

```
@@           Coverage Diff           @@
##             main    #2264   +/-  ##
=======================================
- Coverage   84.56%   84.56%   -0.01%
=======================================
  Files         607      607
  Lines       76199    76207       +8
=======================================
+ Hits        64439    64443       +4
- Misses      11760    11764       +4
```
bernalde left a comment:
This sounds good and removes duplication. Are those transformations being used elsewhere in Pyomo? Does it make sense to keep them there? I'm also not entirely sure that all the transformations are covered by FBBT; in particular, 'contrib.propagate_zero_sum' was very particular to GDPOpt. I could not find the example where this transformation made a lot of sense (if I remember correctly, it appeared when Qi was doing modular design), but it might make a big difference when solving the problems.
Have you noticed whether there is a big performance difference in using this transformation instead? Is this transformation going to use Michael's newest version of FBBT in C++? I would suggest at least running a test on GDPLib instances to see that this still works; unfortunately, we do not have tests to see if this can break what was working before.
```python
xfrm('contrib.detect_fixed_vars').apply_to(
    m, tolerance=config.variable_tolerance)
xfrm('contrib.propagate_zero_sum').apply_to(m)
# Last, check if any constraints are now trivial and deactivate them
```
See the main review comment.
@bernalde FBBT does handle the propagate_zero_sum transformation:
```
In [1]: import pyomo.environ as pe

In [2]: m = pe.ConcreteModel()

In [3]: m.a = pe.Set(initialize=list(range(5)))

In [4]: m.x = pe.Var(m.a, bounds=(0, None))

In [5]: m.c1 = pe.Constraint(expr=sum(m.x.values()) == 0)

In [6]: from pyomo.contrib.fbbt.fbbt import fbbt

In [7]: fbbt(m)
Out[7]: <pyomo.common.collections.component_map.ComponentMap at 0x7fc4e10e40a0>

In [8]: m.x.pprint()
x : Size=5, Index=a
    Key : Lower : Value : Upper : Fixed : Stale : Domain
      0 :     0 :  None :   0.0 : False :  True :  Reals
      1 :     0 :  None :   0.0 : False :  True :  Reals
      2 :     0 :  None :   0.0 : False :  True :  Reals
      3 :     0 :  None :   0.0 : False :  True :  Reals
      4 :     0 :  None :   0.0 : False :  True :  Reals
```
These variables will then get fixed in the detect_fixed_vars transformation. FBBT will work on more general constraints as well:
```
In [9]: m.x.setub(None)

In [10]: m.x.pprint()
x : Size=5, Index=a
    Key : Lower : Value : Upper : Fixed : Stale : Domain
      0 :     0 :  None :  None : False :  True :  Reals
      1 :     0 :  None :  None : False :  True :  Reals
      2 :     0 :  None :  None : False :  True :  Reals
      3 :     0 :  None :  None : False :  True :  Reals
      4 :     0 :  None :  None : False :  True :  Reals

In [11]: del m.c1

In [12]: m.c1 = pe.Constraint(expr=sum(i*m.x[i] for i in m.a) == 0)

In [13]: fbbt(m)
Out[13]: <pyomo.common.collections.component_map.ComponentMap at 0x7fc4e1cad730>

In [14]: m.x.pprint()
x : Size=5, Index=a
    Key : Lower : Value : Upper : Fixed : Stale : Domain
      0 :     0 :  None :  None : False :  True :  Reals
      1 :     0 :  None :   0.0 : False :  True :  Reals
      2 :     0 :  None :   0.0 : False :  True :  Reals
      3 :     0 :  None :   0.0 : False :  True :  Reals
      4 :     0 :  None :   0.0 : False :  True :  Reals
```
@emma58 Potential FBBT performance issues were raised by @bernalde. While one iteration of FBBT should not be problematic, more than a few can be, and I think the default value for
```python
xfrm('contrib.deactivate_trivial_constraints').apply_to(
    m, tolerance=config.constraint_tolerance)
```
If the deactivate_trivial_constraints transformation does what I think, then it should also be covered by FBBT by using the deactivate_satisfied_constraints option in the call to fbbt. It is worth double checking that these do the same thing though.
Ah, good point. They do indeed do the same thing. However, I guess you could get more trivially satisfied constraints after the detect_fixed_vars transformation, so maybe this call is still worth it?
pyomo/contrib/gdpopt/nlp_solve.py (outdated diff)
```python
fbbt(m, integer_tol=config.integer_tolerance,
     feasibility_tol=config.constraint_tolerance)
```
FBBT may have some undesirable side effects. Tightened variable bounds can hurt NLP convergence in some cases. I recommend storing the variable bounds before calling fbbt and restoring the bounds at the end of preprocess_subproblem for any variables that do not get fixed. There should at least be an option to control that behavior, I think.
So restoring bounds in combination with setting deactivate_satisfied_constraints to True can have problems. For example:
```python
from pyomo.environ import ConcreteModel, Var, Constraint, Objective
from pyomo.contrib.fbbt.fbbt import fbbt

m = ConcreteModel()
m.x = Var(bounds=(-2, 10))
m.c1 = Constraint(expr=m.x >= 8)
m.obj = Objective(expr=m.x)
fbbt(m, deactivate_satisfied_constraints=True)
# This becomes an issue when we restore bounds: we've actually removed m.c1
# from the model and are relying on just the variable bounds to enforce
# the constraint
print(m.c1.active)  # False
```
So I think we either need to wait until after fbbt and use the (less smart) deactivate_trivial_constraints transformation or we can't restore the bounds to the originals.
Good catch! I would be in favor of using the deactivate_trivial_constraints transformation.
OK, sounds good! I think that makes sense.
After thinking about this a bit more, the variable bounds should be restored right after fbbt rather than at the end of preprocess_subproblem. In general the variable bounds may impact later transformations.
Oh, I think you're right that they should be restored before deactivate_trivial_constraints because we risk the same issue as above otherwise. It's not actually a problem now because deactivate_trivial_constraints isn't as smart as fbbt: It only checks individual constraints where the body is constant. So, with that design, if it's trivial with tight bounds it will be trivial with looser bounds. But I agree that in principle it should be called after restoring the bounds. However, the detect_fixed_vars transformation fixes variables whose bounds have closed, so it needs the tightened bounds. And I think remove_zero_terms is agnostic to the bounds. I'll switch to restoring them after detect_fixed_vars.
Sounds good.
```python
xfrm('contrib.propagate_fixed_vars').apply_to(m)
xfrm('contrib.detect_fixed_vars').apply_to(
    m, tolerance=config.variable_tolerance)
# Now, if something got fixed to 0, we might have 0*var terms to remove
xfrm('contrib.remove_zero_terms').apply_to(m)
```
This is not important for this PR, but I thought generate_standard_repn took care of removing zero terms.
It does. I think that this transformation is called because deactivate_trivial_constraints decides something might be trivial by checking if the polynomial degree of the constraint body is 0. And if you have
```python
from pyomo.environ import ConcreteModel, Var, Expression

m = ConcreteModel()
m.x = Var()
m.y = Var()
m.y.fix(0)
m.e = Expression(expr=m.x*m.y)
```
then m.e.polynomial_degree() is 1 apparently.
So I think that the long-term solution is a rewrite of deactivate_trivial_constraints (I'll submit an issue so I don't forget), but for now I guess this makes sense.
Got it!
```python
default=3,
description="Maximum number of feasibility-based bounds tightening "
            "iterations to do during NLP subproblem preprocessing.",
domain=PositiveInt
```
Should this be NonNegativeInt so a user can effectively disable preprocessing?
Never mind. There is another option for this.
Fixes # NA
Summary/Motivation:
The gdpopt solver previously called a series of transformations from contrib.preprocessing before solving NLP subproblems. Since much of this was duplicated functionality with fbbt, this PR switches to using fbbt to do most of the preprocessing, leaving a couple of steps afterwards to the preprocessing transformations. This relies on #2263 for the GLOA tests to pass (because the baron writer actually complains about active LogicalConstraints).
Changes proposed in this PR:
Legal Acknowledgement
By contributing to this software project, I have read the contribution guide and agree to the following terms and conditions for my contribution: