Here is a small example [1]:
Min -x2 - x3

x0 + 0.33*x2 + 0.67*x3 = 0.5
x1 + 0.67*x2 + 0.33*x3 = 0.5
x0 + x1 + x2 + x3 = 1.0
We can encode this problem as follows:
from scipy.optimize import linprog

a_eq = [[1.0, 0.0, 0.33, 0.67],
        [0.0, 1.0, 0.67, 0.33],
        [1, 1, 1, 1]]
b_eq = [0.5, 0.5, 1.0]
c = [0, 0, -1.0, -1.0]
res1 = linprog(c=c, A_eq=a_eq, b_eq=b_eq, method='simplex')
print(res1)
The simplex solver has trouble with this problem. It returns a suboptimal solution with objective function value 0:
     fun: -0.0
 message: 'Optimization terminated successfully.'
     nit: 4
   slack: array([], dtype=float64)
  status: 0
 success: True
       x: array([ 0.5,  0.5,  0. ,  0. ])
     con: array([ -9.83213511e-13,  -9.83102488e-13,  -1.96620498e-12])
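A quick sanity check (a sketch using numpy, with the data from the model above) confirms that the returned point is feasible, but that another feasible point has a strictly better objective:

```python
import numpy as np

# Data from the model above
A = np.array([[1.0, 0.0, 0.33, 0.67],
              [0.0, 1.0, 0.67, 0.33],
              [1.0, 1.0, 1.0, 1.0]])
b = np.array([0.5, 0.5, 1.0])
c = np.array([0.0, 0.0, -1.0, -1.0])

x_simplex = np.array([0.5, 0.5, 0.0, 0.0])  # point returned by method='simplex'
x_better = np.array([0.0, 0.0, 0.5, 0.5])   # another feasible point

for x in (x_simplex, x_better):
    print("feasible:", np.allclose(A @ x, b), " objective:", c @ x)
```

Both points satisfy all three equations, but the second one achieves objective -1 versus 0, so the simplex answer cannot be optimal.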
SciPy 1.0 has a new interior-point solver [3], which seems a bit more robust:
res2 = linprog(c=c, A_eq=a_eq, b_eq=b_eq, method='interior-point')
print(res2)

     fun: -1.000000000000884
 message: 'Optimization terminated successfully.'
     nit: 4
   slack: array([], dtype=float64)
  status: 0
 success: True
       x: array([  5.41163802e-13,   5.41163802e-13,   5.00000000e-01,   5.00000000e-01])
We see an optimal objective of -1. It also gives a warning:
D:\Anaconda3\lib\site-packages\scipy\optimize\_linprog_ip.py:623: OptimizeWarning: A_eq does not appear to be of full row rank. To improve performance, check the problem formulation for redundant equality constraints.
  warn(redundancy_warning, OptimizeWarning)
There is no good reason a simplex solver should give incorrect results when the A matrix is not of full rank. Usually solvers add a slack variable to each constraint (often only implicitly), so A will never be rank deficient. The basis matrix can always be chosen such that it is both optimal and non-singular. Here is what an optimal basis (with objective -1) looks like for this problem, as reported by a different simplex solver:
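The rank deficiency itself is easy to see: the third row is the sum of the first two. A short numpy sketch shows that appending one slack column per row restores full row rank:

```python
import numpy as np

A = np.array([[1.0, 0.0, 0.33, 0.67],
              [0.0, 1.0, 0.67, 0.33],
              [1.0, 1.0, 1.0, 1.0]])

# Row e3 equals row e1 + row e2, so A has rank 2
print(np.linalg.matrix_rank(A))

# With a slack per constraint, [A | I] always has full row rank
A_slack = np.hstack([A, np.eye(3)])
print(np.linalg.matrix_rank(A_slack))
```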
                 LOWER     LEVEL     UPPER    MARGINAL

---- EQU e1      0.5000    0.5000    0.5000      EPS     'Non-basic'
---- EQU e2      0.5000    0.5000    0.5000       .      'Basic'
---- EQU e3      1.0000    1.0000    1.0000    -1.0000   'Non-basic'

                 LOWER     LEVEL     UPPER    MARGINAL

---- VAR x0        .         .        +INF     1.0000    'Non-basic'
---- VAR x1        .         .        +INF     1.0000    'Non-basic'
---- VAR x2        .       0.5000     +INF       .       'Basic'
---- VAR x3        .       0.5000     +INF       .       'Basic'
We see that equation 2 (or rather its slack) is made basic although the value of the slack is zero: this solution is both primal and dual degenerate [4]. A different solver gives:
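The primal degeneracy is easy to verify numerically: solving \(B x_B = b\) for this basis (columns x2, x3 and the slack of e2) gives a basic variable at its bound. A sketch in numpy, assuming the slack columns form an identity appended to A:

```python
import numpy as np

# Basis columns x2, x3, s2 (slack of e2), using exact fractions 1/3 and 2/3
B = np.array([[1/3, 2/3, 0.0],
              [2/3, 1/3, 1.0],
              [1.0, 1.0, 0.0]])
b = np.array([0.5, 0.5, 1.0])

xB = np.linalg.solve(B, b)  # values of the basic variables [x2, x3, s2]
print(xB)
```

The slack s2 is basic but has value zero, which is exactly the primal degeneracy; the EPS marginal on the non-basic row e1 signals the dual degeneracy.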
                 LOWER     LEVEL     UPPER    MARGINAL

---- EQU e1      0.5000    0.5000    0.5000    -1.0000   'Non-basic'
---- EQU e2      0.5000    0.5000    0.5000    -1.0000   'Non-basic'
---- EQU e3      1.0000    1.0000    1.0000       .      'Basic'

                 LOWER     LEVEL     UPPER    MARGINAL

---- VAR x0        .         .        +INF     1.0000    'Non-basic'
---- VAR x1        .         .        +INF     1.0000    'Non-basic'
---- VAR x2        .       0.5000     +INF       .       'Basic'
---- VAR x3        .       0.5000     +INF       .       'Basic'
Same primal solution, but a different basis. There is no hint of dual degeneracy here, however. Bonus question: which of these two simplex solutions is more correct?
Actually, both are correct. From Linear Programming 101 we know that \(y = c_B^T B^{-1}\), where \(y\) are the duals, \(c_B\) is the cost vector of the basic variables, and \(B\) is the basis matrix. We can calculate this quickly in R. Suppose we have:
A
##    x0 x1        x2        x3 s1 s2 s3
## e1  1  0 0.3333333 0.6666667  1  0  0
## e2  0  1 0.6666667 0.3333333  0  1  0
## e3  1  1 1.0000000 1.0000000  0  0  1
c
## x0 x1 x2 x3 s1 s2 s3
##  0  0 -1 -1  0  0  0
Then we can do:
bs <- c('x2','x3','s2')
t(c[bs]) %*% solve(A[,bs])
##      e1 e2 e3
## [1,]  0  0 -1

bs <- c('x2','x3','s3')
t(c[bs]) %*% solve(A[,bs])
##      e1 e2 e3
## [1,] -1 -1  0
There is even a third optimal basis, with the following duals:
bs <- c('x2','x3','s1')
t(c[bs]) %*% solve(A[,bs])
##      e1 e2 e3
## [1,]  0  0 -1
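The same calculation can be repeated in Python (a sketch with numpy; the column layout x0, x1, x2, x3, s1, s2, s3 matches the R matrix above):

```python
import numpy as np

# Columns: x0, x1, x2, x3, s1, s2, s3 (one slack per constraint)
names = ['x0', 'x1', 'x2', 'x3', 's1', 's2', 's3']
A = np.array([[1.0, 0.0, 1/3, 2/3, 1.0, 0.0, 0.0],
              [0.0, 1.0, 2/3, 1/3, 0.0, 1.0, 0.0],
              [1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0]])
c = np.array([0.0, 0.0, -1.0, -1.0, 0.0, 0.0, 0.0])
col = {n: i for i, n in enumerate(names)}

def duals(basis):
    """y = c_B^T B^{-1}, computed by solving B^T y = c_B."""
    idx = [col[n] for n in basis]
    return np.linalg.solve(A[:, idx].T, c[idx])

for basis in (['x2', 'x3', 's2'], ['x2', 'x3', 's3'], ['x2', 'x3', 's1']):
    # prints approximately [0, 0, -1], [-1, -1, 0] and [0, 0, -1]
    print(basis, duals(basis).round(6))
```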
Conclusion

In a sense we have three optimal bases, all yielding the same optimal primal solution, and SciPy's simplex solver is not able to find a single one of them.
References
1. Linear programming using scipy.optimize.linprog returns suboptimal solution, https://stackoverflow.com/questions/46702953/linear-programming-using-scipy-optimize-linprog-returns-suboptimal-solution
2. linprog(method='simplex'), https://docs.scipy.org/doc/scipy/reference/optimize.linprog-simplex.html
3. linprog(method='interior-point'), https://docs.scipy.org/doc/scipy/reference/optimize.linprog-interior-point.html
4. What is this EPS in a GAMS solution?, http://yetanothermathprogrammingconsultant.blogspot.com/2017/09/what-is-this-eps-in-gams-solution.html
5. Finding all optimal LP solutions, http://yetanothermathprogrammingconsultant.blogspot.com/2016/01/finding-all-optimal-lp-solutions.html
Wow. This is not great to read... Do you know what the status is nowadays? The post is ~1.5 years old by now.
There may be just bugs, but also the underlying algorithm (a full-tableau simplex method) is not very numerically stable. I doubt this will ever be a reliable solver for anything but the smallest problems.