In this post, a somewhat complex class of functions (the Bohachevsky functions) was incorrectly assumed to be convex:
(From: http://www.sfu.ca/~ssurjano/boha.html, the plot is \(f_1(x)\)).
Indeed, the picture gives the impression that this is a simple bowl-shaped function that should be easy to minimize. However, most functions with cosine terms are not convex, and we can see more detail in the following plot:
(From: http://al-roomi.org/benchmarks/unconstrained/2-dimensions/11-bohachevsky-s-function-no-2; this plot is \(f_2(x)\)).
All the functions have a global optimal solution \(x^*_1=0\), \(x^*_2=0\), \(f^*=0\).
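For reference, the first two Bohachevsky functions, as given on the pages linked above, are

\[
\begin{aligned}
f_1(x) &= x_1^2 + 2x_2^2 - 0.3\cos(3\pi x_1) - 0.4\cos(4\pi x_2) + 0.7,\\
f_2(x) &= x_1^2 + 2x_2^2 - 0.3\cos(3\pi x_1)\cos(4\pi x_2) + 0.3,
\end{aligned}
\]

from which \(f_1(0,0)=f_2(0,0)=0\) is easy to verify: the quadratic part vanishes at the origin and the cosine terms cancel the constant.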
Solving with multi-start local solver
It is not so difficult to solve this with a good local solver by using a simple multi-start strategy. E.g., using the NLP solver CONOPT and 20 trials with random starting points \(x^0_1, x^0_2 \in [-100,100]\) gives:
----     27 PARAMETER xsol

                 x.1          x.2        f.obj
trial1        0.6186  1.35041E-22       0.4129
...
(A blank entry means zero in this GAMS output.) There was no problem finding the global optimum \(x^*_1=0\), \(x^*_2=0\), \(f^*=0\) here.
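Such a multi-start loop is easy to express in GAMS. The sketch below is a reconstruction, not the exact model behind the output above: it assumes the \(f_1\) variant, the default random seed, and the column labels x1, x2 and fobj.

```gams
* Multi-start sketch: minimize Bohachevsky f1 from 20 random starting points.
set t 'trials' /trial1*trial20/;

variables x1, x2, f;
equation obj 'Bohachevsky f1';
obj.. f =e= sqr(x1) + 2*sqr(x2) - 0.3*cos(3*pi*x1) - 0.4*cos(4*pi*x2) + 0.7;

model m /obj/;
option nlp = conopt;

parameter xsol(t,*) 'solution found in each trial';
loop(t,
   x1.l = uniform(-100,100);
   x2.l = uniform(-100,100);
   solve m using nlp minimizing f;
   xsol(t,'x1')   = x1.l;
   xsol(t,'x2')   = x2.l;
   xsol(t,'fobj') = f.l;
);
display xsol;
```

The best row of xsol then gives the presumed global optimum; for a highly multi-modal function more than 20 trials may be needed to be reasonably confident.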
Solving with a global solver
Not all global solvers support the \(\cos(x)\) function:
- Baron: Cannot handle function 'cos'
- Couenne: found global optimum
- Antigone: ERROR: Unsupported function cos in objective defObj_f
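For completeness, switching to one of these global solvers in GAMS is just a matter of changing the solver option. A minimal sketch, reusing the model m from the multi-start sketch above (whether a solver accepts the model depends on its support for \(\cos\) and on your GAMS installation):

```gams
* Sketch: hand the same model to a global solver (here Couenne).
option nlp = couenne;
solve m using nlp minimizing f;
display x1.l, x2.l, f.l;
```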
Tools to verify convexity
It would be nice if we had a tool that could detect convexity. Modeling systems like AMPL and GAMS have knowledge about the analytical form of the optimization problem, so in theory they could try to determine whether the problem is convex. This is not a trivial task (see (1)). A global solver like Baron does essentially this before solving the problem. A more heuristic approach is used in (2).
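As a rough illustration of the sampling idea of (2), and not Chinneck's actual algorithm, the GAMS sketch below draws random pairs of points \(a,b\) and checks the midpoint condition \(f\left(\frac{a+b}{2}\right) \le \frac{f(a)+f(b)}{2}\). A single violation proves non-convexity; finding none is only circumstantial evidence of convexity, which is the inherent limitation of sampling heuristics.

```gams
* Sampling-based non-convexity check (sketch, in the spirit of (2)).
* A single midpoint-convexity violation f((a+b)/2) > (f(a)+f(b))/2 proves
* that f1 is not convex. Nearby point pairs are used: for far-apart points
* the convex quadratic part dominates the midpoint test and hides the
* cosine ripples.
$macro f1(y1,y2) (sqr(y1) + 2*sqr(y2) - 0.3*cos(3*pi*(y1)) - 0.4*cos(4*pi*(y2)) + 0.7)

set s 'sampled point pairs' /s1*s1000/;
scalars a1, a2, b1, b2, viol 'midpoint violations found' /0/;
loop(s,
   a1 = uniform(-100,100);   a2 = uniform(-100,100);
   b1 = a1 + uniform(-1,1);  b2 = a2 + uniform(-1,1);
   viol = viol + 1$(f1((a1+b1)/2,(a2+b2)/2) > (f1(a1,a2) + f1(b1,b2))/2 + 1e-9);
);
display viol;
```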
References
1. R. Fourer, C. Maheshwari, A. Neumaier, D. Orban and H. Schichl, "Convexity and Concavity Detection in Computational Graphs: Tree Walks for Convexity Assessment," INFORMS Journal on Computing 22 (2010), pp. 26-43.
2. John W. Chinneck, "Discovering the Characteristics of Mathematical Programs via Sampling," Optimization Methods and Software, vol. 17, no. 2 (2002), pp. 319-352.