## Saturday, May 9, 2009

### Modeling posts today

Hi,
For a project I'm working on, I need to solve the following problem. Suppose
we have a square grid (size n) with n^2 cells and a 4-connectivity
neighborhood. Each cell must contain exactly one building. We have several
building colors (blue, red and green) with increasing point values.
The game rules are the following:
* No constraints on blue buildings.
* Red buildings must have at least one blue building in their neighborhood.
* Green buildings must have at least one red building and at least one blue
building in their neighborhood.
The goal is to maximize the number of points by placing the most valuable
buildings (green > red > blue). I have started to write the LP, but I have
some difficulty expressing the constraint on red buildings because it directly
depends on the values of the variables. So I would like to transform the
following constraint

param n, integer, > 0, default 4;
var x{1..n, 1..n}, integer, >=0, <=2; /*blue=0, red=1 and green=2 */

s.t. r{i in 1..n, j in 1..n:x[i,j]=1}: sum{a in i-1..i+1, b in j-1..j+1:a>=1
and a<=n and b>=1 and b<=n and i!=a and b!=j and x[a,b]=0} x[i,j] >= 1;

into a valid one. I think there's a way to express it using binary
variables, but I don't see how. Can anybody help me? Thanks in advance.

Indeed, you cannot use a variable to drive sets: the sets are handled by the modeling system and the variables by the solver, so there is a phase difference. Another way to look at it is that the solver expects a system of purely linear inequalities. It is better to use a binary variable with an extra index for the color; that makes implementing the logical conditions much easier. The model is simple once you have decided how the variables are organized. Here is the model:

set C; # colors
param n;
set I := 1..n;
param points{C};

set neighbor{i in I, j in I} := setof{i1 in max(i-1,1)..min(i+1,n),j1 in max(j-1,1)..min(j+1,n)}(i1,j1);

var x{I,I,C} binary;

maximize obj:
sum{i in I, j in I, c in C} points[c]*x[i,j,c];
OneColor{i in I, j in I}:
sum{c in C} x[i,j,c] = 1;

RedRequirement{i in I, j in I}:
sum{(i1,j1) in neighbor[i,j]} x[i1,j1,'blue'] >= x[i,j,'red'];
GreenRequirement1{i in I, j in I}:
sum{(i1,j1) in neighbor[i,j]} x[i1,j1,'red'] >= x[i,j,'green'];
GreenRequirement2{i in I, j in I}:
sum{(i1,j1) in neighbor[i,j]} x[i1,j1,'blue'] >= x[i,j,'green'];
solve;

for{i in I}
{
for{j in I}
{
printf if x[i,j,'red']>0.5 then 'R '
else if x[i,j,'green']>0.5 then 'G '
else if x[i,j,'blue']>0.5 then 'B ';
}
printf "\n";
}

data;
set C := red green blue;
param n := 5;
param points := green 10, red 5, blue 2;
end;

This gives:

Generating obj...
Generating OneColor...
Generating RedRequirement...
Generating GreenRequirement1...
Generating GreenRequirement2...
Model has been successfully generated
ipp_basic_tech:  1 row(s) and 0 column(s) removed
ipp_reduce_bnds: 1 pass(es) made, 0 bound(s) reduced
ipp_basic_tech:  0 row(s) and 0 column(s) removed
ipp_reduce_coef: 1 pass(es) made, 0 coefficient(s) reduced
glp_intopt: presolved MIP has 100 rows, 75 columns, 657 non-zeros
glp_intopt: 75 integer columns, all of which are binary
Scaling...
A: min|aij| = 1.000e+000  max|aij| = 1.000e+000  ratio = 1.000e+000
Problem data seem to be well scaled
Crashing...
Size of triangular part = 100
Solving LP relaxation...
0: obj =  2.500000000e+002  infeas = 5.000e+001 (0)
*    54: obj =  2.060000000e+002  infeas = 0.000e+000 (0)
*    78: obj =  2.110000000e+002  infeas = 2.029e-015 (0)
OPTIMAL SOLUTION FOUND
Integer optimization begins...
Gomory's cuts enabled
MIR cuts enabled
Cover cuts enabled
Clique cuts enabled
Creating the conflict graph...
The conflict graph has 2*75 vertices and 150 edges
+   352: >>>>>  1.930000000e+002 <=  2.070000000e+002   7.3% (13; 0)
+   909: >>>>>  1.980000000e+002 <=  2.060000000e+002   4.0% (30; 8)
+  6635: mip =  1.980000000e+002 <=     tree is empty   0.0% (0; 211)
INTEGER OPTIMAL SOLUTION FOUND
Time used:   1.0 secs
Memory used: 0.8 Mb (855246 bytes)
G G G G G
B R G B R
G G G G G
G B G G G
R G G R B
Model has been successfully processed
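As a sanity check on a printed grid, the three coloring rules can be verified directly. The following Python sketch (a hypothetical helper, not part of the model above) checks a grid under the plain 4-connectivity neighborhood the poster described; note that the `neighbor` set in the model above uses the full 3x3 block instead, so a solution of that model need not pass this stricter check:

```python
def check_grid(grid):
    """Check the coloring rules on a square grid of 'B'/'R'/'G' cells,
    using the 4-connectivity neighborhood from the problem statement."""
    n = len(grid)

    def neighbors(i, j):
        # Orthogonal neighbors only, clipped at the border.
        for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
            if 0 <= a < n and 0 <= b < n:
                yield grid[a][b]

    for i in range(n):
        for j in range(n):
            nb = set(neighbors(i, j))
            if grid[i][j] == 'R' and 'B' not in nb:
                return False  # red needs a blue neighbor
            if grid[i][j] == 'G' and not {'R', 'B'} <= nb:
                return False  # green needs a red and a blue neighbor
    return True

check_grid([['G', 'B'],
            ['R', 'B']])   # True: the green cell sees a red and a blue
check_grid([['R', 'R'],
            ['R', 'R']])   # False: no red has a blue neighbor
```

Such a checker is also a cheap way to catch a mismatch between the neighborhood the modeler implemented and the one the poster intended.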

Someone else suggested using:

# binaries indicating color;
var is_blue{i in N, j in N}, binary;
var is_green{i in N, j in N}, binary;
var is_red{i in N, j in N}, binary;

# color counts
var blues;
var greens;
var reds;

I don't think this results in a very clean model (the linked thread has since been removed). If you have similar variables that behave the same way, it is generally better to add an index position so the variables can be folded into one larger structure. We can even take this approach one step further: instead of three similar constraints we can use a single constraint block driven by an additional set describing the implications. Note that the model suggested in the (now removed) thread gives a different solution than mine (even after changing the objective to make the points identical) because the border is handled differently from what the poster proposed.
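To illustrate the idea of driving a single constraint block with an implication set, here is a hypothetical Python sketch: the table `NEEDS` plays the role of that extra set, and one generic rule replaces the three separate requirement constraints. A brute-force search over a tiny 2x2 grid (4-connectivity, point values as in the data section) shows the rule table in action; this is an illustration of the data-driven formulation, not a replacement for the MIP:

```python
from itertools import product

POINTS = {'B': 2, 'R': 5, 'G': 10}
# Implication set as data: color -> colors that must appear in its neighborhood.
NEEDS = {'B': set(), 'R': {'B'}, 'G': {'R', 'B'}}

def feasible(grid, n):
    # A single generic rule driven by NEEDS replaces three separate constraints.
    for i in range(n):
        for j in range(n):
            nb = {grid[a][b] for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                  if 0 <= a < n and 0 <= b < n}
            if not NEEDS[grid[i][j]] <= nb:
                return False
    return True

def best(n):
    # Brute force: only sensible for very small grids (3**(n*n) candidates).
    return max(sum(POINTS[c] for c in cells)
               for cells in product('BRG', repeat=n * n)
               if feasible([cells[i * n:(i + 1) * n] for i in range(n)], n))

print(best(2))
```

Adding a fourth color with its own requirements would then mean adding one entry to `NEEDS`, rather than writing another constraint block.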

Update: the set neighbor should be rewritten as:

set neighbor{i in I, j in I} := setof{i1 in max(i-1,1)..min(i+1,n),j1 in max(j-1,1)..min(j+1,n): i1!=i and j1!=j}(i1,j1);
display neighbor;

to reflect what the poster indicated (see the comments).

Hi Everyone,

I am looking for some help with the following problem:

Let's say I have a number of print jobs (tax statements, bills etc.)

Each print job has 1 basestock and up to 6 brochures.

e.g. A tax statement prints on blue basestock and has 2 brochures (My Tax and Tax Breaks for Pensioners)

The machines that the print jobs run on have 6 hoppers for brochures and 2 trays for basestock.

I can combine the print jobs into sets, thus minimising the set up time for the machine.

e.g. If I combine 3 jobs into 1 batch, then when I run this batch on the machine I only need to do one setup.

The problem, then, seems to be to find the minimum number of batches that the jobs can be grouped into.

e.g. I have the following print jobs:
Job 1:
Basestock A, Brochures 1, 2, 3, 4

Job 2:
Basestock B, Brochures 2,3

Job 3:
Basestock A, Brochures 5,6,7

I can combine Jobs 2 and 3, because this gives me a batch with 2 basestocks and 6 brochures which is within the capabilities of the machine.

So I am left with two batches:
Batch 1: Job 1

Batch 2: Jobs 2 and 3

I am looking for an algorithm that will provide the smallest number of
batches.

Any assistance would be much appreciated.

I don't think the description, or my understanding of it, is completely correct. E.g. I think job 2 + job 3 leads to a batch of 5 brochures instead of 6. If we combine jobs 1 and 2 we use brochures 2 and 3 for both jobs; I assume this is OK, and that it actually saves hopper spots for brochures. The advice was given to use a MIP model (this is actually not a bad idea at all), but the suggested model is not that good:
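Under that interpretation, batch feasibility is just a capacity check on the unions of brochures and basestocks. A small Python sketch (hypothetical helper name; machine limits taken from the problem statement):

```python
def batch_feasible(jobs, num_hoppers=6, num_trays=2):
    """jobs: list of (basestock, brochure_set) pairs.
    Shared brochures occupy a single hopper, per the interpretation above."""
    brochures = set().union(*(b for _, b in jobs))
    basestocks = {s for s, _ in jobs}
    return len(brochures) <= num_hoppers and len(basestocks) <= num_trays

job1 = ('A', {1, 2, 3, 4})
job2 = ('B', {2, 3})
job3 = ('A', {5, 6, 7})

batch_feasible([job2, job3])        # True: 5 distinct brochures, 2 basestocks
batch_feasible([job1, job2, job3])  # False: 7 distinct brochures > 6 hoppers
```

This also shows why combining jobs 2 and 3 needs only 5 hoppers: the brochure sets are disjoint but still counted as a union, not a sum.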

Perhaps you can make an ILP model to solve it. Suppose you have n jobs; you can define a binary variable x[i,j] (1 <= i <= n, 1 <= j <= n) to show if job i and job j are in the same batch. As the jobs are given, you can derive a parameter c[i,j] to show if job i and job j can be in the same batch (for your example, c[2,3]=1, c[1,2]=0). Then you set up these constraints:

x[i,j]=1 (i!=j) means that job i and job j are in the same batch
x[i,j]=1 (i==j) means that job i is put in a batch by itself

for any i: sum(x[i,j]) = 1, 1 <= j <= n
for any i,j: x[i,j] = x[j,i]
for any i: if c[i,j]==0 then x[i,j]=0, 1 <= j <= n

Finally, you set up the objective:

min sum(x[i,j]), 1 <= i <= n, i <= j <= n

This does not look right. First, you probably need something like a binary variable x[i,k] indicating whether job i goes into batch k. Then the rest of the model should follow. Assuming I interpret the problem correctly, the model could look like:

set BROCHURES;
set BASESTOCK;
set JOBS;
set BATCHES;

set Brochures{JOBS};
set BaseStock{JOBS};

param numHoppers;
param numTrays;

var x{JOBS,BATCHES} binary;
var batchUse{BATCHES} >= 0, <= 1;
var batchBrochure{BATCHES,BROCHURES} >= 0, <= 1;
var batchBaseStock{BATCHES,BASESTOCK} >= 0, <= 1;

minimize obj: sum{k in BATCHES} batchUse[k];
batchUsage{i in JOBS, k in BATCHES}: x[i,k] <= batchUse[k];
allJobs{i in JOBS}: sum{k in BATCHES} x[i,k] = 1;
calcBatchBrochures{k in BATCHES,b in BROCHURES,i in JOBS:b in Brochures[i]}:
batchBrochure[k,b] >= x[i,k];
brochureCapacity{k in BATCHES}: sum{b in BROCHURES} batchBrochure[k,b] <= numHoppers;
calcBatchBaseStock{k in BATCHES,b in BASESTOCK,i in JOBS:b in BaseStock[i]}:
batchBaseStock[k,b] >= x[i,k];
baseStockCapacity{k in BATCHES}: sum{b in BASESTOCK} batchBaseStock[k,b] <= numTrays;

solve;
display x;

data;
set BROCHURES := 1 2 3 4 5 6 7;
set BASESTOCK := A B;
set JOBS := job1 job2 job3;
set BATCHES := batch1 batch2 batch3;
set Brochures['job1'] := 1 2 3 4;
set Brochures['job2'] := 2 3;
set Brochures['job3'] := 5 6 7;
set BaseStock['job1'] := A;
set BaseStock['job2'] := B;
set BaseStock['job3'] := A;
param numHoppers := 6;
param numTrays := 2;
end;

I am carefully exploiting here that some variables are automatically integer when their constraints become binding, so we only need to keep x binary. The relaxed variables (batchUse, batchBrochure, batchBaseStock) can be considered as bounds, and they are free to move when not binding. In the case of batchUse the objective will drive the variable down to zero. In the case of batchBrochure and batchBaseStock we don't really care about their precise values, as long as they fulfill their role in making sure the capacity is not exceeded. Put differently, the calcBatchBrochures and calcBatchBaseStock inequalities implement the logical conditions:

if x[i,k]=1 then batchBrochure[k,b]=1 else leave batchBrochure[k,b] unrestricted
if x[i,k]=1 then batchBaseStock[k,b]=1 else leave batchBaseStock[k,b] unrestricted

When batchBrochure[k,b] and batchBaseStock[k,b] are left floating, the capacity constraints brochureCapacity and baseStockCapacity may or may not force them to zero. So at the end of the solve, some values of batchBrochure[k,b] and batchBaseStock[k,b] may be nonzero even though there is no corresponding usage. The only thing you can rely on is that if a brochure or base stock is actually used, the corresponding value will be one (the reverse does not hold in general).

The required size of the BATCHES set is not known in advance. In this case we could have used just 2 elements. An upper bound on the number of elements needed can be established by allowing as many batches as there are jobs. The result from this model looks like:

Generating obj...
Generating batchUsage...
Generating allJobs...
Generating calcBatchBrochures...
Generating brochureCapacity...
Generating calcBatchBaseStock...
Generating baseStockCapacity...
Model has been successfully generated
ipp_basic_tech:  4 row(s) and 0 column(s) removed
ipp_reduce_bnds: 1 pass(es) made, 0 bound(s) reduced
ipp_basic_tech:  0 row(s) and 0 column(s) removed
ipp_reduce_coef: 1 pass(es) made, 0 coefficient(s) reduced
glp_intopt: presolved MIP has 51 rows, 39 columns, 120 non-zeros
glp_intopt: 9 integer columns, all of which are binary
Scaling...
A: min|aij| = 1.000e+000  max|aij| = 1.000e+000  ratio = 1.000e+000
Problem data seem to be well scaled
Crashing...
Size of triangular part = 51
Solving LP relaxation...
0: obj =  0.000000000e+000  infeas = 1.400e+001 (0)
*    25: obj =  1.000000000e+000  infeas = 0.000e+000 (0)
*    30: obj =  1.000000000e+000  infeas = 0.000e+000 (0)
OPTIMAL SOLUTION FOUND
Integer optimization begins...
Gomory's cuts enabled
MIR cuts enabled
Cover cuts enabled
Clique cuts enabled
Creating the conflict graph...
The conflict graph has 2*9 vertices and 18 edges
+   105: >>>>>  2.000000000e+000 >=  1.333333333e+000  33.3% (4; 0)
+   120: mip =  2.000000000e+000 >=     tree is empty   0.0% (0; 7)
INTEGER OPTIMAL SOLUTION FOUND
Time used:   0.0 secs
Memory used: 0.2 Mb (239814 bytes)
Display statement at line 29
x[job1,batch1] = 1
x[job1,batch2] = 0
x[job1,batch3] = 0
x[job2,batch1] = 0
x[job2,batch2] = 1
x[job2,batch3] = 0
x[job3,batch1] = 0
x[job3,batch2] = 1
x[job3,batch3] = 0
Model has been successfully processed
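For comparison, a simple first-fit heuristic reaches the same batch count on this data. This is a sketch under the same capacity interpretation as the model (shared brochures counted once); the helper names are mine, and first-fit carries no optimality guarantee, unlike the MIP:

```python
def first_fit(jobs, num_hoppers=6, num_trays=2):
    """Greedy first-fit: put each job into the first batch it fits into.
    jobs: dict mapping job name -> (basestock, brochure_set)."""
    batches = []  # each batch: [basestock_set, brochure_set, job_names]
    for name, (stock, brochures) in jobs.items():
        for stocks, brocs, names in batches:
            # The job fits if the unions stay within the machine limits.
            if (len(stocks | {stock}) <= num_trays
                    and len(brocs | brochures) <= num_hoppers):
                stocks.add(stock)
                brocs.update(brochures)
                names.append(name)
                break
        else:
            batches.append([{stock}, set(brochures), [name]])
    return batches

jobs = {'job1': ('A', {1, 2, 3, 4}),
        'job2': ('B', {2, 3}),
        'job3': ('A', {5, 6, 7})}
batches = first_fit(jobs)
len(batches)  # 2 batches, matching the MIP's objective value
```

Note that first-fit groups job1 with job2 (their brochure sets overlap), while the MIP solution above grouped job2 with job3; both use two batches, illustrating that the optimum need not be unique.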

These posts illustrate that good modeling is difficult. It requires paying attention to details as well as keeping an eye on the model as a whole. In that sense it is often more difficult than programming, where one can usually break a difficult task down into more manageable pieces.