
Conversation

@emma58 emma58 (Contributor) commented Jan 19, 2022

Fixes # NA

Summary/Motivation:

The gdpopt solver previously called a series of transformations from contrib.preprocessing before solving NLP subproblems. Since much of this duplicated fbbt's functionality, this PR switches to using fbbt for most of the preprocessing, leaving a couple of follow-up steps to the preprocessing transformations.

This relies on #2263 for the GLOA tests to pass (because the baron writer actually complains about active LogicalConstraints).

Changes proposed in this PR:

  • Rewrites the NLP subproblem preprocessing function to first call fbbt and then further process the result: fix variables whose bounds have closed, remove zero terms, and deactivate any trivial constraints created along the way.
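For illustration, the "fix variables whose bounds have closed" step can be sketched Pyomo-free (the helper name and default tolerance below are hypothetical, not GDPopt's actual API):

```python
def fix_closed_vars(bounds, tolerance=1e-8):
    """Given a {name: (lb, ub)} map of bounds after tightening, return
    the variables whose bounds have closed to within the tolerance,
    mapped to the value to fix them at (the bound midpoint)."""
    fixed = {}
    for name, (lb, ub) in bounds.items():
        if lb is not None and ub is not None and abs(ub - lb) <= tolerance:
            fixed[name] = 0.5 * (lb + ub)
    return fixed
```

Anything returned here would then be fixed on the model, after which zero terms can be removed and newly trivial constraints deactivated.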

Legal Acknowledgement

By contributing to this software project, I have read the contribution guide and agree to the following terms and conditions for my contribution:

  1. I agree my contributions are submitted under the BSD license.
  2. I represent I am authorized to make the contributions and grant the license. If my employer has rights to intellectual property that includes these contributions, I represent that I have received permission to make contributions and grant the required license on behalf of that employer.

codecov bot commented Jan 19, 2022

Codecov Report

Merging #2264 (98f4164) into main (d0e9a12) will decrease coverage by 0.00%.
The diff coverage is 100.00%.


@@            Coverage Diff             @@
##             main    #2264      +/-   ##
==========================================
- Coverage   84.56%   84.56%   -0.01%     
==========================================
  Files         607      607              
  Lines       76199    76207       +8     
==========================================
+ Hits        64439    64443       +4     
- Misses      11760    11764       +4     
Flag Coverage Δ
linux 81.97% <100.00%> (-0.02%) ⬇️
osx 72.40% <100.00%> (-0.03%) ⬇️
other 81.93% <100.00%> (-0.02%) ⬇️
win 79.06% <100.00%> (-0.01%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
pyomo/contrib/gdpopt/config_options.py 100.00% <100.00%> (ø)
pyomo/contrib/gdpopt/nlp_solve.py 75.39% <100.00%> (+0.15%) ⬆️
...ntrib/preprocessing/plugins/zero_sum_propagator.py 95.83% <0.00%> (-4.17%) ⬇️
...mo/contrib/preprocessing/plugins/bounds_to_vars.py 79.41% <0.00%> (-2.95%) ⬇️

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update d0e9a12...98f4164.

@bernalde bernalde (Contributor) left a comment:

This sounds good and removes duplication. Are those transformations used elsewhere in Pyomo? Does it make sense to keep them there? I'm also not entirely sure that all the transformations are covered by FBBT; in particular, 'contrib.propagate_zero_sum' was very particular to GDPOpt. I could not find the example where this transformation made a lot of sense (if I remember correctly, it appeared when Qi was doing modular design), but it might make a big difference when solving the problems.
Have you noticed a big performance difference from using this transformation instead? Is this transformation going to use Michael's newest C++ version of FBBT? I would suggest at least running a test on GDPLib instances to see that this still works; unfortunately, we do not have tests to catch whether this breaks what was working before.

xfrm('contrib.detect_fixed_vars').apply_to(
    m, tolerance=config.variable_tolerance)
xfrm('contrib.propagate_zero_sum').apply_to(m)
# Last, check if any constraints are now trivial and deactivate them
Contributor:

See main review's comment.

Contributor:

@bernalde FBBT does handle the propagate_zero_sum transformation:

In [1]: import pyomo.environ as pe

In [2]: m = pe.ConcreteModel()

In [3]: m.a = pe.Set(initialize=list(range(5)))

In [4]: m.x = pe.Var(m.a, bounds=(0, None))

In [5]: m.c1 = pe.Constraint(expr=sum(m.x.values()) == 0)

In [6]: from pyomo.contrib.fbbt.fbbt import fbbt

In [7]: fbbt(m)
Out[7]: <pyomo.common.collections.component_map.ComponentMap at 0x7fc4e10e40a0>

In [8]: m.x.pprint()
x : Size=5, Index=a
    Key : Lower : Value : Upper : Fixed : Stale : Domain
      0 :     0 :  None :   0.0 : False :  True :  Reals
      1 :     0 :  None :   0.0 : False :  True :  Reals
      2 :     0 :  None :   0.0 : False :  True :  Reals
      3 :     0 :  None :   0.0 : False :  True :  Reals
      4 :     0 :  None :   0.0 : False :  True :  Reals

These variables will then get fixed in the detect_fixed_vars transformation. FBBT will work on more general constraints as well:

In [9]: m.x.setub(None)

In [10]: m.x.pprint()
x : Size=5, Index=a
    Key : Lower : Value : Upper : Fixed : Stale : Domain
      0 :     0 :  None :  None : False :  True :  Reals
      1 :     0 :  None :  None : False :  True :  Reals
      2 :     0 :  None :  None : False :  True :  Reals
      3 :     0 :  None :  None : False :  True :  Reals
      4 :     0 :  None :  None : False :  True :  Reals

In [11]: del m.c1

In [12]: m.c1 = pe.Constraint(expr=sum(i*m.x[i] for i in m.a) == 0)

In [13]: fbbt(m)
Out[13]: <pyomo.common.collections.component_map.ComponentMap at 0x7fc4e1cad730>

In [14]: m.x.pprint()
x : Size=5, Index=a
    Key : Lower : Value : Upper : Fixed : Stale : Domain
      0 :     0 :  None :  None : False :  True :  Reals
      1 :     0 :  None :   0.0 : False :  True :  Reals
      2 :     0 :  None :   0.0 : False :  True :  Reals
      3 :     0 :  None :   0.0 : False :  True :  Reals
      4 :     0 :  None :   0.0 : False :  True :  Reals

@michaelbynum (Contributor):

@emma58 Potential FBBT performance issues were raised by @bernalde. While one iteration of FBBT should not be problematic, more than a few can be, and I think the default value for max_iter is 10. For these purposes, I would recommend adding an option to control the maximum number of iterations, and maybe decreasing the default for GDPopt to 2 or 3.
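To make the cost concrete, here is a Pyomo-free sketch (a hypothetical helper, not fbbt itself) of why the iteration count matters: each pass propagates bounds one constraint "hop", so a chain of coupled constraints can need roughly one pass per link, and max_iter caps that work.

```python
def fbbt_sketch(ub, max_iter):
    """One bounds-tightening pass per iteration over chained constraints
    x[i] <= x[i+1] (so ub[i] = min(ub[i], ub[i+1])).  Information moves
    one constraint 'hop' per pass, so long chains need many passes.
    Returns the bound list and the number of passes performed."""
    passes = 0
    for passes in range(1, max_iter + 1):
        changed = False
        for i in range(len(ub) - 1):
            if ub[i + 1] < ub[i]:
                ub[i] = ub[i + 1]
                changed = True
        if not changed:
            break
    return ub, passes
```

On a four-variable chain with one tight bound at the end, convergence takes several passes, while a small max_iter trades the last bits of tightening for speed.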

Comment on lines 326 to 327
xfrm('contrib.deactivate_trivial_constraints').apply_to(
    m, tolerance=config.constraint_tolerance)
Contributor:

If the deactivate_trivial_constraints transformation does what I think, then it should also be covered by FBBT by using the deactivate_satisfied_constraints option in the call to fbbt. It is worth double checking that these do the same thing though.

Contributor (Author):

Ah, good point. They do indeed do the same thing. However, I guess you could get more trivially satisfied constraints after the detect_fixed_vars transformation, so maybe this call is still worth it?

Comment on lines 316 to 317
fbbt(m, integer_tol=config.integer_tolerance,
     feasibility_tol=config.constraint_tolerance)
Contributor:

FBBT may have some undesirable side effects. Tightened variable bounds can hurt NLP convergence in some cases. I recommend storing the variable bounds before calling fbbt and restoring the bounds at the end of preprocess_subproblem for any variables that do not get fixed. There should at least be an option to control that behavior, I think.
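A minimal sketch of that suggestion (hypothetical helpers over a {name: (lb, ub)} map, not the actual Pyomo API): snapshot the bounds before calling fbbt, then restore them for every variable that did not end up fixed.

```python
def snapshot_bounds(bounds):
    """Copy the current {name: (lb, ub)} bound map before tightening."""
    return dict(bounds)

def restore_unfixed(bounds, original, fixed_names):
    """Restore original bounds for every variable that was not fixed,
    keeping the tightened (closed) bounds only on fixed variables."""
    for name, lb_ub in original.items():
        if name not in fixed_names:
            bounds[name] = lb_ub
    return bounds
```

The tightened bounds are kept just long enough to decide which variables to fix; everything else goes back to its original feasible region so NLP convergence is unaffected.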

Contributor (Author):

So restoring bounds in combination with setting deactivate_satisfied_constraints to True can cause problems. For example:

from pyomo.environ import ConcreteModel, Var, Constraint, Objective
from pyomo.contrib.fbbt.fbbt import fbbt

m = ConcreteModel()
m.x = Var(bounds=(-2, 10))
m.c1 = Constraint(expr=m.x >= 8)
m.obj = Objective(expr=m.x)

fbbt(m, deactivate_satisfied_constraints=True)

# This becomes an issue when we restore bounds: we've actually deactivated
# m.c1 on the model, and are relying on just the variable bounds to enforce
# the constraint.
print(m.c1.active)  # False

So I think we either need to wait until after fbbt and use the (less smart) deactivate_trivial_constraints transformation, or we can't restore the bounds to the originals.

Contributor:

Good catch! I would be in favor of using the deactivate_trivial_constraints transformation.

Contributor (Author):

OK, sounds good! I think that makes sense.

Contributor:

After thinking about this a bit more, the variable bounds should be restored right after fbbt rather than at the end of preprocess_subproblem. In general, the variable bounds may affect later transformations.

Contributor (Author):

Oh, I think you're right that they should be restored before deactivate_trivial_constraints, because otherwise we risk the same issue as above. It's not actually a problem now, because deactivate_trivial_constraints isn't as smart as fbbt: it only checks individual constraints whose body is constant, so with that design, anything trivial with tight bounds will still be trivial with looser bounds. But I agree that in principle it should be called after restoring the bounds. However, the detect_fixed_vars transformation fixes variables whose bounds have closed, so it needs the tightened bounds, and I think remove_zero_terms is agnostic to the bounds. I'll switch to restoring them after detect_fixed_vars.
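The ordering agreed on in this thread can be summarized as a sketch (the step names are descriptive placeholders, not the actual GDPopt function names):

```python
def subproblem_preprocessing_order():
    """The preprocessing order discussed above: detect_fixed_vars needs
    the fbbt-tightened bounds, while deactivate_trivial_constraints
    should see the restored (original) bounds."""
    return [
        "fbbt",                            # tighten variable bounds
        "detect_fixed_vars",               # fix vars whose bounds closed
        "restore_original_bounds",         # for variables left unfixed
        "remove_zero_terms",               # agnostic to bounds
        "deactivate_trivial_constraints",  # runs on restored bounds
    ]
```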

Contributor:

Sounds good.

xfrm('contrib.detect_fixed_vars').apply_to(
    m, tolerance=config.variable_tolerance)
xfrm('contrib.propagate_fixed_vars').apply_to(m)
# Now, if something got fixed to 0, we might have 0*var terms to remove
xfrm('contrib.remove_zero_terms').apply_to(m)
Contributor:

This is not important for this PR, but I thought generate_standard_repn took care of removing zero terms.

Contributor (Author):

It does. I think this transformation is called because deactivate_trivial_constraints decides something might be trivial by checking whether the polynomial degree of the constraint body is 0. And if you have

from pyomo.environ import ConcreteModel, Var, Expression

m = ConcreteModel()
m.x = Var()
m.y = Var()
m.y.fix(0)
m.e = Expression(expr=m.x*m.y)

then m.e.polynomial_degree() is 1, apparently.

So I think the long-term solution is a rewrite of deactivate_trivial_constraints (I'll submit an issue so I don't forget), but for now I guess this makes sense.

Contributor:

Got it!

default=3,
description="Maximum number of feasibility-based bounds tightening "
"iterations to do during NLP subproblem preprocessing.",
domain=PositiveInt
Contributor:

Should this be NonNegativeInt so a user can effectively disable preprocessing?

Contributor:

Never mind. There is another option for this.

@michaelbynum michaelbynum merged commit 57143c0 into Pyomo:main Feb 1, 2022
@emma58 emma58 deleted the gdpopt-preprocessing-to-fbbt branch September 28, 2022 22:06