Wednesday, February 24, 2010

CONOPT: Convergence problem

Occasionally the NLP solver CONOPT will terminate with:

  Iter Phase Ninf     Objective     RGmax    NSB   Step InItr MX OK
   823   3        1.6227533982E+01 3.0E-02   407 0.0E+00      F  F
   824   3        1.6227533982E+01 3.0E-02   401 0.0E+00      F  F
   825   3        1.6227533982E+01 3.0E-02   405 0.0E+00      F  F
   826   3        1.6227533982E+01 3.0E-02   407 0.0E+00      F  F
   827   3        1.6227533982E+01 3.0E-02   407

** Feasible solution. Convergence too slow. The change in objective
   has been less than 4.8683E-11 for 20 consecutive iterations

In some cases we can reach a (local) optimum just by solving again:

solve ramsey maximizing AggW using nlp;
solve ramsey maximizing AggW using nlp;

Indeed, this trick works here: the second solve finishes quickly and terminates with:

  Iter Phase Ninf     Objective     RGmax    NSB   Step InItr MX OK
   193   4        1.6948156578E+01 2.7E+00   342 2.0E+00    1 F  T
   194   4        1.6948221678E+01 1.4E-03   441 1.0E+00    1 F  T
   195   4        1.6948492083E+01 1.9E-03   441 1.0E+00    6 T  T
   196   4        1.6949366978E+01 1.3E-03   440 1.0E+00   24 F  T
   197   4        1.6950307538E+01 3.1E-03   440 1.0E+00  113 F  T
   198   4        1.6950345143E+01 1.4E-04   440 1.0E+00  171 F  T
   199   4        1.6950345450E+01 1.6E-05   440 1.0E+00  136 F  T
   200   4        1.6950345453E+01 1.6E-06   440 1.0E+00  101 F  T
   201   4        1.6950345453E+01 1.0E-07   440 1.0E+00    7 F  T
   202   4        1.6950345453E+01 5.6E-08   440

** Optimal solution. Reduced gradient less than tolerance.
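
A possible refinement is to make the second solve conditional on the model status returned by the first one, so we only re-solve when CONOPT did not report a (locally) optimal solution. A rough sketch (GAMS model status 1 = optimal, 2 = locally optimal; the slow-convergence termination typically comes back with a non-optimal feasible status):

solve ramsey maximizing AggW using nlp;

* re-solve from the point just found only if the first attempt did not
* end with model status 1 (optimal) or 2 (locally optimal)
if (ramsey.modelstat > 2,
   solve ramsey maximizing AggW using nlp;
);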