Many statistical procedures are built on an underlying optimization problem; least squares regression and maximum likelihood estimation are two obvious examples. In some cases that optimization problem is a linear program. Examples include:
- Least absolute deviation (LAD) regression [1]
- Chebyshev regression [2]
- Quantile regression [3]
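To make the connection to linear programming concrete, here is a minimal sketch of the first example, LAD regression, as an LP. The standard trick is to introduce auxiliary variables t_i bounding the absolute residuals, |y_i − x_iᵀβ| ≤ t_i, and minimize their sum. The sketch below uses `scipy.optimize.linprog`; the function name `lad_regression` is my own, not from any of the cited references.

```python
import numpy as np
from scipy.optimize import linprog

def lad_regression(X, y):
    """Least absolute deviation regression via linear programming.

    Minimize sum_i |y_i - x_i' beta| by introducing auxiliary
    variables t_i >= |residual_i| and solving the LP:
        min  sum(t)
        s.t.  X beta - t <=  y
             -X beta - t <= -y
    """
    n, p = X.shape
    # Decision vector is [beta (p entries), t (n entries)].
    c = np.concatenate([np.zeros(p), np.ones(n)])
    A_ub = np.block([[ X, -np.eye(n)],
                     [-X, -np.eye(n)]])
    b_ub = np.concatenate([y, -y])
    # beta is unconstrained; the t_i are nonnegative.
    bounds = [(None, None)] * p + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:p]
```

Chebyshev regression follows the same pattern with a single auxiliary variable t bounding the maximum absolute residual, and quantile regression weights the positive and negative residual parts asymmetrically.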
Here is another regression example that uses linear programming.
We want to estimate a sparse coefficient vector β in the linear model y = Xβ + e, where the number of observations n (rows of X) is much smaller than the number of coefficients p to estimate (columns of X): p ≫ n [4]. This is an alternative to the well-known Lasso method [5].
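A standard way to cast this sparse-recovery problem as a linear program is basis pursuit: minimize ‖β‖₁ subject to Xβ = y, using the split β = u − v with u, v ≥ 0 so that the ℓ₁ norm becomes a linear objective. This is an illustrative sketch of that formulation (it may differ in detail from the exact estimator of [4], which for example might allow a noise tolerance on the equality constraint):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(X, y):
    """Recover a sparse beta from y = X beta (p >> n) by minimizing
    ||beta||_1 subject to X beta = y.

    LP formulation: write beta = u - v with u, v >= 0, so that
    ||beta||_1 = sum(u) + sum(v) at the optimum.
    """
    n, p = X.shape
    c = np.ones(2 * p)          # objective: sum(u) + sum(v)
    A_eq = np.hstack([X, -X])   # equality constraint: X(u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    return res.x[:p] - res.x[p:]
```

With X a generic (e.g. Gaussian) n×p matrix and β sufficiently sparse, this LP recovers β exactly from far fewer observations than coefficients, which is what makes it a competitor to the Lasso in the p ≫ n regime.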