## Monday, April 15, 2024

### LP in statistics: The Dantzig Selector

Many statistical procedures are based on an underlying optimization problem: least squares regression and maximum likelihood estimation are two obvious examples. In some cases, that optimization problem is a linear program. Examples include:

- Least absolute deviation (LAD) regression [1]
- Chebyshev regression [2]
- Quantile regression [3]

Here is another regression example that uses linear programming.

We want to estimate a sparse vector $$\color{darkred}\beta$$ from the linear model $$\color{darkblue}y=\color{darkblue}X\color{darkred}\beta+\color{darkred}e$$, where the number of observations $$n$$ (rows in $$\color{darkblue}X$$) is (much) smaller than the number of coefficients $$p$$ to estimate (columns in $$\color{darkblue}X$$) [4]: $$p \gg n$$. This is an alternative to the well-known Lasso method [5].
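The Dantzig selector estimates $$\color{darkred}\beta$$ by solving

\begin{align}\min\>&||\color{darkred}\beta||_1\\ & ||\color{darkblue}X^T(\color{darkblue}y-\color{darkblue}X\color{darkred}\beta)||_\infty \le \color{darkblue}\delta\end{align}

where $$\color{darkblue}\delta$$ is a tuning parameter. Both the 1-norm objective and the $$\infty$$-norm constraint linearize by splitting absolute values, so this is an LP. A minimal sketch with SciPy's `linprog` (the random data and the choice $$\color{darkblue}\delta=0.5$$ are made up for illustration, not a recommendation):

```python
import numpy as np
from scipy.optimize import linprog

def dantzig_selector(X, y, delta):
    """Solve min ||beta||_1 s.t. ||X'(y - X beta)||_inf <= delta as an LP.

    Variables are (beta, t) with -t <= beta <= t; we minimize sum(t).
    """
    n, p = X.shape
    G = X.T @ X                # p x p
    b = X.T @ y                # p
    I = np.eye(p)
    Z = np.zeros((p, p))
    # Inequalities A_ub @ [beta, t] <= b_ub
    A_ub = np.block([
        [ I, -I],              #  beta - t <= 0
        [-I, -I],              # -beta - t <= 0
        [ G,  Z],              #  X'X beta <= X'y + delta
        [-G,  Z],              # -X'X beta <= -X'y + delta
    ])
    b_ub = np.concatenate([np.zeros(2 * p), b + delta, -b + delta])
    c = np.concatenate([np.zeros(p), np.ones(p)])   # minimize sum(t)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * p + [(0, None)] * p)
    return res.x[:p]

# Illustrative sparse-recovery instance with p >> n
rng = np.random.default_rng(0)
n, p = 30, 60
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.01 * rng.standard_normal(n)

beta_hat = dantzig_selector(X, y, delta=0.5)
print(np.round(beta_hat[:5], 2))
```

The LP has $$2p$$ variables and $$4p$$ constraints; for large instances a sparse modeling tool would be preferable to these dense `numpy` blocks.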

## Friday, April 12, 2024

### Instead of integers use binaries

In [1], a small model fragment is proposed:

**High-Level Model**
\begin{align} \min\> & \sum_i | \color{darkblue}a_i\cdot \color{darkred}x_i| \\ & \max_i |\color{darkred}x_i| = 1 \\ & \color{darkred}x_i \in \{-1,0,1\} \end{align}

Can we formulate this as a straight MIP?
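One possible answer, sketched here as an assumption rather than the reformulation intended in [1]: write each $$\color{darkred}x_i$$ as the difference of two binary variables $$\color{darkred}x^+_i,\color{darkred}x^-_i$$. Then $$|\color{darkblue}a_i\cdot\color{darkred}x_i| = |\color{darkblue}a_i|\cdot(\color{darkred}x^+_i+\color{darkred}x^-_i)$$, and the constraint $$\max_i |\color{darkred}x_i|=1$$ just says that at least one $$\color{darkred}x_i$$ is nonzero:

\begin{align}\min\> & \sum_i |\color{darkblue}a_i|\cdot (\color{darkred}x^+_i + \color{darkred}x^-_i) \\ & \color{darkred}x_i = \color{darkred}x^+_i - \color{darkred}x^-_i \\ & \color{darkred}x^+_i + \color{darkred}x^-_i \le 1 \\ & \sum_i \left(\color{darkred}x^+_i + \color{darkred}x^-_i\right) \ge 1 \\ & \color{darkred}x^+_i, \color{darkred}x^-_i \in \{0,1\} \end{align}

The constraint $$\color{darkred}x^+_i + \color{darkred}x^-_i \le 1$$ forbids $$+1$$ and $$-1$$ simultaneously, and the objective is linear because $$|\color{darkblue}a_i|$$ is a constant.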