Sunday, May 24, 2009

NLP model reformulation

With MIP models we know that it is important to try out different formulations. A good formulation can make the difference between being able to solve a model in reasonable time or not. For NLP models this is also the case. A client model (a non-convex NLP) was solved with GAMS/CONOPT with the following results:

equations : 35,347
variables : 35,721
nonzeroes : 141,753 (of which 106,032 nonlinear)

objective : 8021923.97 (local optimum, minimization)
time : 2366.7 seconds

After reformulating the model the results were:

equations : 3
variables : 377
nonzeroes : 753 (of which 376 nonlinear)

objective : 7714519.00 (local optimum, minimization)
time : 0.4 seconds

The main effort was in substituting out some variables, which made the problem linearly constrained and much smaller. The objective became not just a little bit but awfully nonlinear, but the trade-off was clearly worth it.
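The client model itself cannot be shown, but the substitution idea can be sketched on a hypothetical toy problem (all names and numbers below are illustrative, not from the actual model). Intermediate variables defined by nonlinear equations are eliminated by plugging their defining expressions into the objective: the nonlinear constraints disappear, only linear constraints remain, and the model shrinks, at the price of a more nonlinear objective.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical toy NLP (not the client's model):
#   minimize  sum_i (y_i - d_i)^2
#   s.t.      y_i = exp(x_i)        (nonlinear defining equations)
#             sum_i x_i = 1         (linear constraint)
rng = np.random.default_rng(0)
n = 20
d = rng.uniform(1.0, 2.0, n)

# Formulation A: keep y as explicit variables, z = [x, y],
# with n nonlinear equality constraints y_i - exp(x_i) = 0.
def obj_a(z):
    return np.sum((z[n:] - d) ** 2)

cons_a = [
    {"type": "eq", "fun": lambda z: z[n:] - np.exp(z[:n])},  # y_i = exp(x_i)
    {"type": "eq", "fun": lambda z: np.sum(z[:n]) - 1.0},    # sum x_i = 1
]
res_a = minimize(obj_a, np.zeros(2 * n), constraints=cons_a, method="SLSQP")

# Formulation B: substitute y_i = exp(x_i) into the objective.
# Only n variables and a single linear constraint remain; the objective
# absorbed all the nonlinearity.
def obj_b(x):
    return np.sum((np.exp(x) - d) ** 2)

cons_b = [{"type": "eq", "fun": lambda x: np.sum(x) - 1.0}]
res_b = minimize(obj_b, np.zeros(n), constraints=cons_b, method="SLSQP")

print(res_a.fun, res_b.fun)  # the two objective values should agree
```

Both formulations describe the same optimization problem, so a local solver should find the same optimum; formulation B just presents it with half the variables and no nonlinear constraints, which is the essence of the reformulation described above.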


  1. This is amazing! I haven't seen such a reduction in model size and complexity. Could you please provide more details about the models to see how the reduction was achieved? Thanks.

  2. Carefully substituting things out. Here is another example: in both cases a smaller but highly nonlinear model follows. So this is not always a good thing to do.

  3. Thank you sir, I like this information.