With MIP models we know that it is important to try out different formulations. A good formulation can make the difference between being able to solve a model in reasonable time or not. For NLP models this is also the case. A client model (a non-convex NLP) was solved with GAMS/CONOPT with the following results:
equations : 35,347
variables : 35,721
nonzeroes : 141,753 (of which 106,032 nonlinear)
objective : 8021923.97 (local optimum, minimization)
time : 2366.7 seconds
After reformulating the model the results were:
equations : 3
variables : 377
nonzeroes : 753 (of which 376 nonlinear)
objective : 7714519.00 (local optimum, minimization)
time : 0.4 seconds
The main effort was in substituting out a number of variables, which made the problem linearly constrained and much smaller. The objective became not just a little but heavily nonlinear, but the trade-off clearly paid off.
This is amazing! I haven't seen such a reduction in model size and complexity. Could you please provide more details about the model to see how it is reduced? Thanks.
Carefully substituting things out. Here is another example: http://yetanothermathprogrammingconsultant.blogspot.com/2008/06/ampl-defined-variables.html. In both cases a smaller but highly nonlinear model follows. So this is not always a good thing to do.
Thank you sir, I like this information.