Friday, March 4, 2016

Reformulate MINLP to MIP

In this post the question came up of what type of solver can handle the following problem:

\[\begin{align} \max\> & (0.9 x_0 + 0.8 x_1 + 0.85 x_2) \cdot(0.95 x_3 + 0.8 x_4 + 0.7 x_5)\cdot (0.98 x_6 + 0.94 x_7) \\ & x_0 + x_1 + x_2 = 1\\ & x_3 + x_4 + x_5 = 1 \\ & x_6 + x_7 = 1\\ & 3 x_0 + x_1 + 2 x_2 + 3 x_3 + 2 x_4 + x_5 + 3 x_6 + 2 x_7 \le 6 \\ & x_i \in \{0,1\} \end{align}\]

This is a nonlinear model with integer restrictions: the objective is nonlinear while the constraints are all linear. The first inclination would be to use a MINLP solver. However, because all products involve only binary variables, this model is easily linearized. The idea is that \(y=x_1\cdot x_2\cdot x_3\) with \(x_i\in\{0,1\}\) is equivalent to the following set of linear inequalities:

\[ \boxed{\begin{align}&y=x_1\cdot x_2\cdot x_3\\ &x_i\in\{0,1\} \end{align}}\] \[\>\>\Longleftrightarrow\>\>\] \[\boxed{\begin{align} &y\le x_1\\ &y\le x_2\\ &y\le x_3\\ &y\ge x_1+x_2+x_3-2 \\ &y\in\{0,1\},x_i\in\{0,1\} \end{align}}\]
This is so tight we can even relax the integer restriction on \(y\) to \(y \in [0,1]\).
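To see why the two formulations match, check both cases. If \(x_1=x_2=x_3=1\), the last inequality forces \(y\ge 1+1+1-2=1\) while the first three give \(y\le 1\), so \(y=1\). If any \(x_i=0\), the corresponding inequality \(y\le x_i\) forces \(y\le 0\), and the lower bound \(x_1+x_2+x_3-2\le 0\) does not interfere, so \(y=0\). In both cases \(y\) equals the product \(x_1\cdot x_2\cdot x_3\).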

This means we can reformulate the MINLP model as a linear MIP model. Applying the linearization to every product term in the objective, the full model becomes:

\[\begin{align} \max\>\>& 0.9 \cdot 0.95 \cdot (0.98 y_{036} + 0.94 y_{037}) +\\ & 0.9 \cdot 0.8 \cdot (0.98 y_{046} + 0.94 y_{047}) +\\ & 0.9 \cdot 0.7 \cdot (0.98 y_{056} + 0.94 y_{057}) +\\ & 0.8 \cdot 0.95 \cdot (0.98 y_{136} + 0.94 y_{137}) +\\ & 0.8 \cdot 0.8 \cdot (0.98 y_{146} + 0.94 y_{147}) +\\ & 0.8 \cdot 0.7 \cdot (0.98 y_{156} + 0.94 y_{157}) +\\ & 0.85 \cdot 0.95\cdot (0.98 y_{236} + 0.94 y_{237}) +\\ & 0.85 \cdot 0.8 \cdot (0.98 y_{246} + 0.94 y_{247}) +\\ & 0.85 \cdot 0.7 \cdot (0.98 y_{256} + 0.94 y_{257}) \\ \text{s.t.}\>\> & y_{036} \le x_0, \> y_{036} \le x_3, \> y_{036} \le x_6 \\ &y_{037} \le x_0, \> y_{037} \le x_3, \> y_{037} \le x_7 \\ &y_{046} \le x_0, \> y_{046} \le x_4, \> y_{046} \le x_6 \\ &y_{047} \le x_0, \> y_{047} \le x_4, \> y_{047} \le x_7 \\ &y_{056} \le x_0, \> y_{056} \le x_5, \> y_{056} \le x_6 \\ &y_{057} \le x_0, \> y_{057} \le x_5, \> y_{057} \le x_7 \\ &y_{136} \le x_1, \> y_{136} \le x_3, \> y_{136} \le x_6 \\ &y_{137} \le x_1, \> y_{137} \le x_3, \> y_{137} \le x_7 \\ &y_{146} \le x_1, \> y_{146} \le x_4, \> y_{146} \le x_6 \\ &y_{147} \le x_1, \> y_{147} \le x_4, \> y_{147} \le x_7 \\ &y_{156} \le x_1, \> y_{156} \le x_5, \> y_{156} \le x_6 \\ &y_{157} \le x_1, \> y_{157} \le x_5, \> y_{157} \le x_7 \\ &y_{236} \le x_2, \> y_{236} \le x_3, \> y_{236} \le x_6 \\ &y_{237} \le x_2, \> y_{237} \le x_3, \> y_{237} \le x_7 \\ &y_{246} \le x_2, \> y_{246} \le x_4, \> y_{246} \le x_6 \\ &y_{247} \le x_2, \> y_{247} \le x_4, \> y_{247} \le x_7 \\ &y_{256} \le x_2, \> y_{256} \le x_5, \> y_{256} \le x_6 \\ &y_{257} \le x_2, \> y_{257} \le x_5, \> y_{257} \le x_7 \\ & x_0 + x_1 + x_2 = 1\\ & x_3 + x_4 + x_5 = 1 \\ & x_6 + x_7 = 1\\ & 3 x_0 + x_1 + 2 x_2 + 3 x_3 + 2 x_4 + x_5 + 3 x_6 + 2 x_7 \le 6 \\ & x_i \in \{0,1\} \\ & y_{i,j,k} \in [0,1] \\ \end{align}\]

Notice that we dropped the \(\ge\) inequality of the linearization. That is not by accident: since we are maximizing and every \(y_{ijk}\) has a positive objective coefficient, the optimization itself pushes each \(y_{ijk}\) up to \(\min(x_i,x_j,x_k)\), which for binary variables is exactly the value of the product, so the lower-bound constraint is redundant here.

Conclusion: MINLP models can save you a lot of typing.

Could a modeling system like GAMS help with this expansion? Well yes. We can let GAMS do the heavy lifting as follows:

set i /i0*i7/;
alias (i,j,k);

set cp(i,j,k) 'cross-products' /
  (i0,i1,i2).(i3,i4,i5).(i6,i7)
/;
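The single compound record (i0,i1,i2).(i3,i4,i5).(i6,i7) expands to the Cartesian product of the three groups, i.e. the \(3\times 3\times 2=18\) triples \((i0,i3,i6),(i0,i3,i7),\dots,(i2,i5,i7)\): exactly the combinations that appear in the expanded model above.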

This would allow us to write:

objective.. z =e= sum(cp(i,j,k), c(i)*c(j)*c(k)*y(i,j,k));
linearize1(cp(i,j,k)).. y(i,j,k) =l= x(i);
linearize2(cp(i,j,k)).. y(i,j,k) =l= x(j);
linearize3(cp(i,j,k)).. y(i,j,k) =l= x(k);
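To make this concrete, below is a rough, self-contained sketch of the complete generated MIP in GAMS. The data (the coefficients 0.9, 0.8, ..., the weights 3,1,2,..., and the right-hand side 6) comes straight from the model above; the identifier names (w, group1, knapsack, m) and the way the remaining constraints are written out are my own choices, not necessarily how the original modeler would do it.

set i /i0*i7/;
alias (i,j,k);

set cp(i,j,k) 'cross-products' /
  (i0,i1,i2).(i3,i4,i5).(i6,i7)
/;

* data taken from the model in the post
parameter c(i) 'objective coefficients' /
  i0 0.90, i1 0.80, i2 0.85
  i3 0.95, i4 0.80, i5 0.70
  i6 0.98, i7 0.94
/;

parameter w(i) 'knapsack weights' /
  i0 3, i1 1, i2 2
  i3 3, i4 2, i5 1
  i6 3, i7 2
/;

binary variable x(i);
positive variable y(i,j,k) 'linearization of x(i)*x(j)*x(k)';
variable z 'objective value';

* y in [0,1]; the upper bound is in fact already implied by the linearize constraints
y.up(i,j,k)$cp(i,j,k) = 1;

equations
  objective
  linearize1(i,j,k), linearize2(i,j,k), linearize3(i,j,k)
  group1, group2, group3
  knapsack;

objective.. z =e= sum(cp(i,j,k), c(i)*c(j)*c(k)*y(i,j,k));
linearize1(cp(i,j,k)).. y(i,j,k) =l= x(i);
linearize2(cp(i,j,k)).. y(i,j,k) =l= x(j);
linearize3(cp(i,j,k)).. y(i,j,k) =l= x(k);

* pick exactly one item from each group
group1..   x('i0') + x('i1') + x('i2') =e= 1;
group2..   x('i3') + x('i4') + x('i5') =e= 1;
group3..   x('i6') + x('i7') =e= 1;

* shared resource constraint
knapsack.. sum(i, w(i)*x(i)) =l= 6;

model m /all/;
solve m using mip maximizing z;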
