Lecture 6: Objective function involving absolute values

 

This blog post continues the discussion of how to reformulate an optimization problem whose objective function involves absolute values as a linear programming problem.

Problems involving absolute values

Consider an optimization problem of the form

$$ \begin{equation} \begin{aligned} &\text{minimize} && \sum_{i=1}^n c_i \vert x_i \vert \\ &\text{subject to} && A\bar{x} \leq \bar{b} \end{aligned} \end{equation} $$
Here each ${ c_i }$ is nonnegative, so that ${ f(\bar{x}) = \sum_{i=1}^n c_i \vert x_i \vert }$ is convex. Proof. For all ${ \bar{x}, \bar{y} \in \mathbb{R}^n }$ and ${ \lambda \in [0,1] }$, we have
$$ \begin{equation} \begin{aligned} f((1-\lambda)\bar{x} + \lambda \bar{y}) &= \sum_{i=1}^n c_i \vert (1-\lambda)x_i + \lambda y_i \vert \\ & = \sum_{i=1}^n c_i \max \{ (1-\lambda)x_i + \lambda y_i, - (1-\lambda)x_i - \lambda y_i \} \\ & \leq \sum_{i=1}^n c_i \max \{ (1-\lambda)x_i, - (1-\lambda)x_i \} + c_i \max \{ \lambda y_i, - \lambda y_i \} \\ & = \sum_{i=1}^n c_i \vert (1-\lambda)x_i \vert +\sum_{i=1}^n c_i \vert \lambda y_i \vert \\ & = (1-\lambda) f(\bar{x}) + \lambda f(\bar{y}) \end{aligned} \end{equation} $$
Proof Done! ${ \square }$
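Before moving on, here is a small Python snippet (not part of the original lecture; the weights and test points are made up) that checks this convexity inequality numerically on random data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
c = rng.uniform(0.0, 1.0, n)        # nonnegative weights c_i

def f(x):
    return np.sum(c * np.abs(x))    # f(x) = sum_i c_i |x_i|

for _ in range(1000):
    x, y = rng.normal(size=n), rng.normal(size=n)
    lam = rng.uniform()
    # convexity: f((1-lam) x + lam y) <= (1-lam) f(x) + lam f(y)
    assert f((1 - lam) * x + lam * y) <= (1 - lam) * f(x) + lam * f(y) + 1e-12
```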
  • Claim: ${ f(\bar{x}) = \sum_{i=1}^n c_i \vert x_i \vert }$ is a piecewise linear convex function.

Proof. …

We would like to reformulate the optimization problem above as a linear optimization problem. However, it is not so easy to express ${ f(\bar{x}) }$ in the form ${ \max_i \{\bar{\alpha}_i^T \bar{x}+ \beta_i\} }$: doing so directly would take ${ 2^n }$ affine pieces, one for each sign pattern of ${ \bar{x} }$.

So, here are two ways to accomplish it.

Way one

Since ${ \vert x_i \vert }$ is the smallest number ${ z_i }$ satisfying ${ x_i \leq z_i }$ and ${ -x_i \leq z_i }$, we introduce new decision variables ${ z_i }$ and add these constraints. Because each ${ c_i }$ is nonnegative, minimizing ${ \sum_{i=1}^n c_i z_i }$ forces ${ z_i = \vert x_i \vert }$ at an optimal solution, so the two problems are equivalent:

$$ \begin{equation} \begin{aligned} &\text{minimize} && \sum_{i=1}^n c_i z_i \\ &\text{subject to} && A\bar{x} \leq \bar{b} \\ & && x_i \leq z_i \\ & && -x_i \leq z_i \end{aligned} \end{equation} $$
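Here is a minimal sketch of this reformulation using `scipy.optimize.linprog`; the matrix ${ A }$, vector ${ \bar{b} }$, and weights ${ c }$ below are made-up example data, not from the lecture:

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative data (assumed): minimize sum_i c_i |x_i| subject to A x <= b
A = np.array([[1.0, -2.0],
              [-3.0, 1.0]])
b = np.array([-1.0, -2.0])
c = np.array([2.0, 1.0])           # nonnegative weights c_i
n = c.size

# Decision vector is [x; z] in R^{2n}; the objective only charges the z-part.
obj = np.concatenate([np.zeros(n), c])

# Stack the constraints  A x <= b,  x_i - z_i <= 0,  -x_i - z_i <= 0.
I = np.eye(n)
A_ub = np.vstack([
    np.hstack([A, np.zeros((A.shape[0], n))]),
    np.hstack([I, -I]),
    np.hstack([-I, -I]),
])
b_ub = np.concatenate([b, np.zeros(2 * n)])

res = linprog(obj, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * (2 * n))
x_opt = res.x[:n]
print(x_opt, res.fun)              # optimal x and the value of sum_i c_i |x_i|
```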

Way two

Here is another way to reformulate this kind of optimization problem as a linear problem.

Start from the 1-dimensional case

Consider the following optimization problem:

$$ \begin{equation} \begin{aligned} &\text{minimize} && \vert x \vert \\ &\text{subject to} && ax \geq b\\ \end{aligned} \end{equation} $$
  • Claim: ${ \forall x \in \mathbb{R} }$, we have
$$ \begin{aligned} \vert x \vert &= \max \{0,x\} + \max \{0,-x\} \\ x &= \max \{0,x\} - \max \{0,-x\} \end{aligned} $$
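For example, with ${ x = -3 }$:

$$ \max\{0,-3\} + \max\{0,3\} = 0 + 3 = 3 = \vert -3 \vert, \qquad \max\{0,-3\} - \max\{0,3\} = 0 - 3 = -3 = x $$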

This suggests writing the decision variable ${ x }$ as a difference ${ x = x^+ - x^- }$ with ${ x^+, x^- \geq 0 }$.

For instance, we can take ${ x^+ = \max\{0,x\} }$ and ${ x^- = \max\{0,-x\} }$.

But there are infinitely many other possible choices: for any ${ M \geq 0 }$ we can take

$$ x^+ = \max\{0,x\}+M, \quad x^- = \max\{0,-x\} +M, \qquad M\geq 0 $$

And we can easily convert between ${ (x, M) }$ and ${ (x^+, x^-) }$. (Hint: ${ x^+ + x^- = \vert x\vert +2M }$.)

$$ \begin{equation} \begin{cases} x^+ = \frac{\vert x \vert + x}{2} +M \\ x^- = \frac{\vert x \vert - x}{2} +M \end{cases} \text{ and } \begin{cases} x = x^+ - x^- \\ M = \frac{x^+ +x^- - \vert x^+-x^- \vert }{2} \end{cases} \end{equation} $$
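A quick Python check (illustrative only; the sampled values of ${ x }$ and ${ M }$ are arbitrary) that these two changes of variables are inverse to each other:

```python
import numpy as np

rng = np.random.default_rng(1)
for _ in range(1000):
    x = rng.normal()
    M = rng.uniform(0.0, 5.0)
    x_plus  = (abs(x) + x) / 2 + M      # = max{0, x} + M
    x_minus = (abs(x) - x) / 2 + M      # = max{0, -x} + M
    assert np.isclose(x, x_plus - x_minus)
    assert np.isclose(M, (x_plus + x_minus - abs(x_plus - x_minus)) / 2)
```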

So, now we can reformulate our original optimization problem

$$ \begin{equation} \begin{aligned} &\text{minimize} && \vert x \vert \\ &\text{subject to} && ax \geq b\\ \end{aligned} \end{equation} $$

We can add ${ 2M }$, with ${ M \geq 0 }$, to the objective function: since we are minimizing, ${ M }$ will be ${ 0 }$ at an optimal solution, so the optimal value is unchanged. So, we have

$$ \begin{equation} \begin{aligned} &\text{minimize} && \vert x \vert + 2M \\ &\text{subject to} && ax \geq b\\ & && M \geq 0 \end{aligned} \end{equation} $$

Then, replacing ${ x, M }$ with ${ x^+, x^- }$ (using ${ \vert x \vert + 2M = x^+ + x^- }$ and ${ x = x^+ - x^- }$), we finally reformulate the original problem as a linear programming problem.

$$ \begin{equation} \begin{aligned} &\text{minimize} && x^+ + x^- \\ &\text{subject to} && a(x^+ - x^-) \geq b\\ & && x^+ \geq 0\\ & && x^- \geq 0 \end{aligned} \end{equation} $$

Because ${ x^+, x^- \geq 0 }$, the identity ${ M = \frac{x^+ +x^- - \vert x^+-x^- \vert }{2} }$ already guarantees ${ M \geq 0 }$, so we do not need to add ${ \frac{x^+ +x^- - \vert x^+-x^- \vert }{2} \geq 0 }$ to the constraints again.
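Here is a minimal sketch of this 1-dimensional reformulation with `scipy.optimize.linprog`; the values of ${ a }$ and ${ b }$ are assumed example data:

```python
import numpy as np
from scipy.optimize import linprog

# 1-D example (data assumed for illustration): minimize |x| subject to a*x >= b.
a, b = 2.0, 3.0                    # feasible set is x >= 1.5, so the optimum is x = 1.5

# LP in the variables [x_plus, x_minus]: minimize x_plus + x_minus
# subject to a*(x_plus - x_minus) >= b and x_plus, x_minus >= 0.
obj = np.array([1.0, 1.0])
A_ub = np.array([[-a, a]])         # rewrite a*(x+ - x-) >= b as -a*x+ + a*x- <= -b
b_ub = np.array([-b])

res = linprog(obj, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
x = res.x[0] - res.x[1]            # recover x = x+ - x-
print(x, res.fun)                  # expect x = 1.5 and |x| = 1.5
```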

Accomplishment

In general, provided the numbers ${ c_1, c_2, \cdots, c_n }$ are all nonnegative, the objective function ${ \bar{x} \mapsto \sum_{i=1}^n c_i \vert x_i \vert }$ is piecewise linear convex, and the optimization problem

$$ \begin{equation} \begin{aligned} &\text{minimize} && \sum_{i=1}^n c_i \vert x_i \vert \\ &\text{subject to} && A\bar{x} \leq \bar{b} \end{aligned} \end{equation} $$

can be reformulated as the linear optimization problem

$$ \begin{equation} \begin{aligned} &\text{minimize} && \sum_{i=1}^n c_i ( x_i^+ + x_i^- )\\ &\text{subject to} && A\bar{x}^+ - A\bar{x}^- \leq \bar{b} \\ & && \bar{x}^+ \geq 0 \\ & && \bar{x}^- \geq 0 \end{aligned} \end{equation} $$
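As a closing illustration, here is a sketch of the general reformulation as a small Python helper; the function name `min_weighted_abs` and the example data are assumptions made for this post, not part of the lecture:

```python
import numpy as np
from scipy.optimize import linprog

def min_weighted_abs(c, A, b):
    """Minimize sum_i c_i |x_i| subject to A x <= b via the split x = x+ - x-.

    Illustrative sketch; requires all c_i >= 0.
    """
    c = np.asarray(c, dtype=float)
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    n = c.size

    obj = np.concatenate([c, c])                 # sum_i c_i (x_i^+ + x_i^-)
    A_ub = np.hstack([A, -A])                    # A x^+ - A x^- <= b
    res = linprog(obj, A_ub=A_ub, b_ub=b,
                  bounds=[(0, None)] * (2 * n))  # x^+, x^- >= 0
    x = res.x[:n] - res.x[n:]                    # recover x = x^+ - x^-
    return x, res.fun

# Same illustrative data as in Way one; both reformulations give the same optimal value.
A = np.array([[1.0, -2.0], [-3.0, 1.0]])
b = np.array([-1.0, -2.0])
c = np.array([2.0, 1.0])
print(min_weighted_abs(c, A, b))
```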