
suboptimal solution returned by scipy.optimize.linprog #7044

Closed
mdhaber opened this issue Feb 14, 2017 · 2 comments
Labels
defect (A clear bug or issue that prevents SciPy from being installed or used as expected), scipy.optimize

Comments

@mdhaber (Contributor) commented Feb 14, 2017

I have been working on an interior-point algorithm for scipy.optimize.linprog, and I noticed that it found a solution with a lower objective value than the one returned by the simplex algorithm. I have double-checked, and my code's solution satisfies the constraints. This suggests that the simplex algorithm is returning a suboptimal solution. Since the simplex algorithm is supposed to find a global optimum, there must be a bug in the implementation.

Here is the relevant code. First, solve the problem with scipy.optimize.linprog, and confirm that the objective value reported by linprog equals c.dot(x) and that the equality and non-negativity constraints are satisfied.

import json
import numpy as np
from scipy.optimize import linprog

d = json.loads('{"c": [0.5488135039273248, 0.7151893663724195, 0.6027633760716439, 0.5448831829968969, 0.4236547993389047, 0.6458941130666561, 0.4375872112626925, 0.8917730007820798, 0.9636627605010293, 0.3834415188257777, 0.7917250380826646, 0.5288949197529045, 0.5680445610939323, 0.925596638292661, 0.07103605819788694, 0.08712929970154071, 0.02021839744032572, 0.832619845547938, 0.7781567509498505, 0.8700121482468192, 0.978618342232764, 0.7991585642167236, 0.46147936225293185, 0.7805291762864555, 0.11827442586893322, 0.6399210213275238, 0.1433532874090464, 0.9446689170495839, 0.5218483217500717, 0.4146619399905236, 0.26455561210462697, 0.7742336894342167, 0.45615033221654855, 0.5684339488686485, 0.018789800436355142, 0.6176354970758771, 0.6120957227224214, 0.6169339968747569, 0.9437480785146242, 0.6818202991034834, 0.359507900573786, 0.43703195379934145, 0.6976311959272649, 0.06022547162926983, 0.6667667154456677, 0.6706378696181594, 0.2103825610738409, 0.1289262976548533, 0.31542835092418386, 0.3637107709426226, 0.5701967704178796, 0.43860151346232035, 0.9883738380592262, 0.10204481074802807, 0.2088767560948347, 0.16130951788499626, 0.6531083254653984, 0.2532916025397821, 0.4663107728563063, 0.24442559200160274, 0.15896958364551972, 0.11037514116430513, 0.6563295894652734, 0.1381829513486138, 0.1965823616800535, 0.3687251706609641, 0.8209932298479351, 0.09710127579306127, 0.8379449074988039, 0.09609840789396307, 0.9764594650133958, 0.4686512016477016, 0.9767610881903371, 0.604845519745046, 0.7392635793983017, 0.039187792254320675, 0.2828069625764096, 0.1201965612131689, 0.29614019752214493, 0.11872771895424405, 0.317983179393976], "A_eq": [[1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0], [1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0], [1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.0, 2.0, 2.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3.0, 3.0, 3.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 4.0, 4.0, 4.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 5.0, 5.0, 5.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 6.0, 6.0, 6.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 7.0, 7.0, 7.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 8.0, 8.0, 8.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 9.0, 9.0, 9.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.0, 2.0, 2.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3.0, 3.0, 3.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 4.0, 4.0, 4.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 5.0, 5.0, 5.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 6.0, 6.0, 6.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 7.0, 7.0, 7.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 8.0, 8.0, 8.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 9.0, 9.0, 9.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.0, 2.0, 2.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3.0, 3.0, 3.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 4.0, 4.0, 4.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 5.0, 5.0, 5.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 6.0, 6.0, 6.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 7.0, 7.0, 7.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 8.0, 8.0, 8.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 9.0, 9.0, 9.0], [1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 2.0, 0.0, 0.0, 2.0, 0.0, 0.0, 2.0, 0.0, 0.0, 3.0, 0.0, 0.0, 3.0, 0.0, 0.0, 3.0, 0.0, 0.0, 4.0, 0.0, 0.0, 4.0, 0.0, 0.0, 4.0, 0.0, 0.0, 5.0, 0.0, 0.0, 5.0, 0.0, 0.0, 5.0, 0.0, 0.0, 6.0, 0.0, 0.0, 6.0, 0.0, 0.0, 6.0, 0.0, 0.0, 7.0, 0.0, 0.0, 7.0, 0.0, 0.0, 7.0, 0.0, 0.0, 8.0, 0.0, 0.0, 8.0, 0.0, 0.0, 8.0, 0.0, 0.0, 9.0, 0.0, 0.0, 9.0, 0.0, 0.0, 9.0, 0.0, 0.0], [0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 
1.0, 0.0, 0.0, 2.0, 0.0, 0.0, 2.0, 0.0, 0.0, 2.0, 0.0, 0.0, 3.0, 0.0, 0.0, 3.0, 0.0, 0.0, 3.0, 0.0, 0.0, 4.0, 0.0, 0.0, 4.0, 0.0, 0.0, 4.0, 0.0, 0.0, 5.0, 0.0, 0.0, 5.0, 0.0, 0.0, 5.0, 0.0, 0.0, 6.0, 0.0, 0.0, 6.0, 0.0, 0.0, 6.0, 0.0, 0.0, 7.0, 0.0, 0.0, 7.0, 0.0, 0.0, 7.0, 0.0, 0.0, 8.0, 0.0, 0.0, 8.0, 0.0, 0.0, 8.0, 0.0, 0.0, 9.0, 0.0, 0.0, 9.0, 0.0, 0.0, 9.0, 0.0], [0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 2.0, 0.0, 0.0, 2.0, 0.0, 0.0, 2.0, 0.0, 0.0, 3.0, 0.0, 0.0, 3.0, 0.0, 0.0, 3.0, 0.0, 0.0, 4.0, 0.0, 0.0, 4.0, 0.0, 0.0, 4.0, 0.0, 0.0, 5.0, 0.0, 0.0, 5.0, 0.0, 0.0, 5.0, 0.0, 0.0, 6.0, 0.0, 0.0, 6.0, 0.0, 0.0, 6.0, 0.0, 0.0, 7.0, 0.0, 0.0, 7.0, 0.0, 0.0, 7.0, 0.0, 0.0, 8.0, 0.0, 0.0, 8.0, 0.0, 0.0, 8.0, 0.0, 0.0, 9.0, 0.0, 0.0, 9.0, 0.0, 0.0, 9.0], [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 2.0, 0.0, 0.0, 0.0, 2.0, 0.0, 0.0, 0.0, 2.0, 3.0, 0.0, 0.0, 0.0, 3.0, 0.0, 0.0, 0.0, 3.0, 4.0, 0.0, 0.0, 0.0, 4.0, 0.0, 0.0, 0.0, 4.0, 5.0, 0.0, 0.0, 0.0, 5.0, 0.0, 0.0, 0.0, 5.0, 6.0, 0.0, 0.0, 0.0, 6.0, 0.0, 0.0, 0.0, 6.0, 7.0, 0.0, 0.0, 0.0, 7.0, 0.0, 0.0, 0.0, 7.0, 8.0, 0.0, 0.0, 0.0, 8.0, 0.0, 0.0, 0.0, 8.0, 9.0, 0.0, 0.0, 0.0, 9.0, 0.0, 0.0, 0.0, 9.0], [0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 2.0, 0.0, 2.0, 0.0, 2.0, 0.0, 0.0, 0.0, 0.0, 3.0, 0.0, 3.0, 0.0, 3.0, 0.0, 0.0, 0.0, 0.0, 4.0, 0.0, 4.0, 0.0, 4.0, 0.0, 0.0, 0.0, 0.0, 5.0, 0.0, 5.0, 0.0, 5.0, 0.0, 0.0, 0.0, 0.0, 6.0, 0.0, 6.0, 0.0, 6.0, 0.0, 0.0, 0.0, 0.0, 7.0, 0.0, 7.0, 0.0, 7.0, 0.0, 0.0, 0.0, 0.0, 8.0, 0.0, 8.0, 0.0, 8.0, 0.0, 0.0, 0.0, 0.0, 9.0, 0.0, 9.0, 0.0, 9.0, 0.0, 0.0]], "b_eq": [1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 15.0, 15.0, 15.0, 15.0, 15.0, 15.0, 15.0, 15.0]}')
A_eq = np.array(d["A_eq"])
b_eq = np.array(d["b_eq"])
c = np.array(d["c"])
sol = linprog(c, A_eq=A_eq, b_eq=b_eq)
print("Solution returned by linprog: ", sol.x)
print("Optimal Objective, according to linprog: ", sol.fun)
print("Optimal Objective is c.dot(x): ", np.allclose(c.dot(sol.x), sol.fun))
print("Constraints satisfied: ", np.allclose(A_eq.dot(sol.x), b_eq))
print("All non-negative: ", np.all(sol.x >= 0))

This prints:

Solution returned by linprog:  [ 0.51712026  0.29028941  0.07162669  0.          0.12096363  0.          0.
  0.          0.          0.          0.          0.          0.          0.
  0.82386203  0.17613797  0.          0.          0.          0.          0.
  0.          0.47850304  0.          0.52149696  0.          0.          0.
  0.11499792  0.          0.44659316  0.          0.          0.
  0.43840892  0.          0.          0.          0.          0.          0.
  0.          0.          0.56159108  0.43840892  0.          0.
  0.91715025  0.          0.          0.          0.          0.
  0.08284975  0.48287974  0.06957349  0.          0.          0.          0.
  0.          0.          0.44754676  0.          0.52513917  0.01122306
  0.          0.16127271  0.          0.30236507  0.          0.          0.
  0.          0.          0.55340684  0.23926062  0.17613797  0.          0.
  0.03119457]
Optimal Objective, according to linprog:  2.25979372588
Optimal Objective is c.dot(x):  True
Constraints satisfied:  True
All non-negative:  True

So linprog is finding a feasible solution to the problem, and it is calculating the objective value for that solution correctly.

Now consider the alternative solution I found: confirm that it is feasible and compute its objective.

x = json.loads('{"x": [0.3331522727509847, 0.3067173628301121, 3.702522086402331e-10, 1.8091875158058772e-10, 0.3601303635118847, 8.484450085557408e-11, 1.7237957088133708e-10, 4.989385018290918e-11, 4.8729665124229865e-11, 0.16603295277017202, 1.4126059349605686e-10, 2.1887976906163802e-10, 1.0296773459973263e-10, 5.983336445187816e-11, 0.7300380223176367, 0.10392902371561037, 6.219463981524648e-10, 5.1693061341455744e-11, 9.658776511037417e-11, 9.735668142197327e-11, 6.305542666826776e-11, 6.549420576885266e-11, 2.05928715931002e-10, 5.868921825027981e-11, 0.5703422050369672, 5.7885975547811996e-11, 0.4296577943180349, 7.358224024388978e-11, 7.743564892401284e-10, 0.2623574136497042, 0.5779467671406471, 8.683202925213137e-11, 1.5592320940080054e-10, 1.0649002074562453e-10, 0.15969581790177767, 1.106871516331739e-10, 2.2344718075301884e-10, 2.504337325185133e-10, 8.067582085152109e-11, 1.0655018996599815e-10, 0.15969581782722844, 2.531395466365125e-10, 9.113978854103399e-11, 0.8403041810358536, 1.315316824280523e-10, 6.747401650025393e-11, 2.953548339158474e-10, 0.7376425851949395, 1.2093688679988512e-10, 1.1556729777743684e-10, 6.731196879086867e-11, 8.239116010567021e-11, 3.115265603789146e-11, 0.26235741402487167, 0.3067173622975413, 0.6932826352748306, 1.0142550676150176e-10, 3.315171242265853e-10, 1.2626602517885402e-10, 4.6265633761269013e-10, 1.1348917738091376e-09, 1.602656107311836e-10, 1.1060574636708291e-10, 0.19409741167724903, 2.768529384047373e-10, 2.5357809396758446e-10, 5.354947336919287e-11, 0.48017381790820896, 5.6308899544650045e-11, 0.3257287694909369, 3.366139597875523e-11, 2.4965433730521696e-10, 4.2961741843388147e-11, 5.944224356070173e-11, 6.748939571747158e-11, 0.4220532318974184, 1.5825057780147723e-10, 0.2699619765434897, 1.6919311912128475e-10, 1.0756294867647948e-10, 0.30798479095419196]}')['x']
print "Alternative solution: ", x
print "Objective of alternative solution : ", np.array(c).dot(x)
print "Constraints satisfied: ", np.allclose(A_eq.dot(x), b_eq)
print "All non-negative: ", np.all(x>=0)

This prints:

Alternative solution:  [0.3331522727509847, 0.3067173628301121, 3.702522086402331e-10, 1.8091875158058772e-10, 0.3601303635118847, 8.484450085557408e-11, 1.7237957088133708e-10, 4.989385018290918e-11, 4.8729665124229865e-11, 0.16603295277017202, 1.4126059349605686e-10, 2.1887976906163802e-10, 1.0296773459973263e-10, 5.983336445187816e-11, 0.7300380223176367, 0.10392902371561037, 6.219463981524648e-10, 5.1693061341455744e-11, 9.658776511037417e-11, 9.735668142197327e-11, 6.305542666826776e-11, 6.549420576885266e-11, 2.05928715931002e-10, 5.868921825027981e-11, 0.5703422050369672, 5.7885975547811996e-11, 0.4296577943180349, 7.358224024388978e-11, 7.743564892401284e-10, 0.2623574136497042, 0.5779467671406471, 8.683202925213137e-11, 1.5592320940080054e-10, 1.0649002074562453e-10, 0.15969581790177767, 1.106871516331739e-10, 2.2344718075301884e-10, 2.504337325185133e-10, 8.067582085152109e-11, 1.0655018996599815e-10, 0.15969581782722844, 2.531395466365125e-10, 9.113978854103399e-11, 0.8403041810358536, 1.315316824280523e-10, 6.747401650025393e-11, 2.953548339158474e-10, 0.7376425851949395, 1.2093688679988512e-10, 1.1556729777743684e-10, 6.731196879086867e-11, 8.239116010567021e-11, 3.115265603789146e-11, 0.26235741402487167, 0.3067173622975413, 0.6932826352748306, 1.0142550676150176e-10, 3.315171242265853e-10, 1.2626602517885402e-10, 4.6265633761269013e-10, 1.1348917738091376e-09, 1.602656107311836e-10, 1.1060574636708291e-10, 0.19409741167724903, 2.768529384047373e-10, 2.5357809396758446e-10, 5.354947336919287e-11, 0.48017381790820896, 5.6308899544650045e-11, 0.3257287694909369, 3.366139597875523e-11, 2.4965433730521696e-10, 4.2961741843388147e-11, 5.944224356070173e-11, 6.748939571747158e-11, 0.4220532318974184, 1.5825057780147723e-10, 0.2699619765434897, 1.6919311912128475e-10, 1.0756294867647948e-10, 0.30798479095419196]
Objective of alternative solution :  1.730550597
Constraints satisfied:  True
All non-negative:  True

So we have another feasible solution that yields an objective lower than that returned by scipy.optimize.linprog.

Note that another version of this problem (with a zero objective function and bounds) exposed another bug: scipy.optimize.linprog did not respect the constraints. See #6690.

@mdhaber (Contributor, Author) commented Feb 14, 2017

It turns out that A_eq in this problem is rank-deficient. After finding and removing the rows that are linear combinations of the others, linprog's solution agrees with the alternative solution above.

So linprog is producing a suboptimal solution when the equality constraint matrix is rank-deficient. Perhaps the algorithm does not need to be modified, but I think linprog should at least warn the user about rank deficiency and/or mention in the documentation that the equality constraints need to be linearly independent.
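
[Editor's note: the following is not part of the original report. It is a minimal sketch of one way to detect the rank deficiency and keep only a linearly independent subset of the equality constraints, using a pivoted QR factorization; it assumes the system A_eq x = b_eq is consistent, and the helper name drop_redundant_rows is made up for illustration.]

import numpy as np
from scipy.linalg import qr

def drop_redundant_rows(A_eq, b_eq, tol=1e-12):
    """Return (A, b) with linearly dependent rows of A_eq removed.

    Uses a rank-revealing QR factorization of A_eq.T with column pivoting:
    the first `rank` pivot indices correspond to a linearly independent
    subset of the rows of A_eq.
    """
    _, r, p = qr(A_eq.T, pivoting=True)
    # Estimate the numerical rank from the diagonal of R (non-increasing
    # in magnitude because of the pivoting).
    diag = np.abs(np.diag(r))
    rank = int(np.sum(diag > tol * diag[0]))
    keep = np.sort(p[:rank])
    return A_eq[keep], b_eq[keep]

# For the problem above, np.linalg.matrix_rank(A_eq) is smaller than
# A_eq.shape[0], so some equality constraints are redundant:
# A_red, b_red = drop_redundant_rows(A_eq, b_eq)
# sol = linprog(c, A_eq=A_red, b_eq=b_red)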

@rgommers added the defect and scipy.optimize labels on Feb 14, 2017
mdhaber added a commit to mdhaber/scipy that referenced this issue Mar 3, 2017
The new file scipy/optimize/linprog_ip.py contains a new function,
linprog, intended to replace the one in scipy/optimize/_linprog.py.
The new linprog function works exactly the same way as the old one,
except that now the 'method' argument can be either "simplex" or
"interior-point".

scipy/optimize/_removeRedundancy.py contains a function
_removeRedundancy that is used in the presolve procedure of
the interior point method. It's in a separate file as
scipy/optimize/_linprog/_linprog_simplex would benefit
from calling it to eliminate redundancies in equality
constraints. This would help fix some of its issues, e.g. scipy#7044.

scipy/optimize/tests/test_linprog_ip contains the unit tests for
the new linprog function. It contains all the tests for the old
simplex method and several new tests, some of which would be relevant
to the old simplex method.

Please forgive the mess. I have no idea what I'm doing.
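
[Editor's note: for context, here is a minimal usage sketch of the option this commit describes, reusing c, A_eq, and b_eq from the snippet above. The "interior-point" method shipped in SciPy 1.0, so this assumes a SciPy version that includes it.]

from scipy.optimize import linprog

# Solve the same problem with the interior-point method, whose presolve
# removes redundant equality constraints before solving.
sol_ip = linprog(c, A_eq=A_eq, b_eq=b_eq, method='interior-point')
print("interior-point objective:", sol_ip.fun)

# For comparison, the original simplex method:
sol_sx = linprog(c, A_eq=A_eq, b_eq=b_eq, method='simplex')
print("simplex objective:", sol_sx.fun)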
mdhaber added a commit to mdhaber/scipy that referenced this issue Jul 2, 2017 (same commit message as above)
mdhaber added a commit to mdhaber/scipy that referenced this issue Jul 3, 2017 (same commit message as above)
pv pushed a commit to pv/scipy-work that referenced this issue Jul 3, 2017 (same commit message as above)
pv pushed a commit to pv/scipy-work that referenced this issue Jul 30, 2017 (same commit message as above)
@mdhaber (Contributor, Author) commented Jun 21, 2018

Closed via #8909

mdhaber closed this as completed Jun 21, 2018
mdhaber added a commit to mdhaber/scipy that referenced this issue Sep 14, 2018
Also fixed minor bug checking for positive u, which should enable
correct solution of scipy#5400 and scipy#7044
mdhaber added a commit to mdhaber/scipy that referenced this issue Nov 25, 2018 (same commit message as above)
mdhaber added a commit to mdhaber/scipy that referenced this issue Dec 21, 2018 (same commit message as above)
pvanmulbregt pushed a commit to pvanmulbregt/scipy that referenced this issue Jan 26, 2019 (same commit message as above)