This post continues the discussion of how to reformulate an optimization problem whose objective function involves absolute values as a linear programming problem.
Problems involving absolute values
Consider the following kind of optimization problem.
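For concreteness, suppose the remaining constraints are linear, say ${ A\bar{x} \geq \bar{b} }$ (the matrix ${ A }$ and the vector ${ \bar{b} }$ here are just placeholders for whatever linear constraints the problem carries); then the problem looks like

${ \min_{\bar{x} \in \mathbb{R}^n} \ \sum_{i=1}^n c_i \vert x_i \vert \quad \text{s.t.}\quad A\bar{x} \geq \bar{b}. }$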
Here, the ${ c_i }$ are nonnegative, so that ${ f(\bar{x}) = \sum_{i=1}^n c_i \vert x_i \vert }$ is convex.
Proof. ${ \forall \bar{x}, \bar{y} \in \mathbb{R}^n, \lambda \in [0,1] }$, we have ${ f(\lambda\bar{x}+(1-\lambda)\bar{y}) = \sum_{i=1}^n c_i \vert \lambda x_i + (1-\lambda) y_i \vert \leq \lambda \sum_{i=1}^n c_i \vert x_i \vert + (1-\lambda) \sum_{i=1}^n c_i \vert y_i \vert = \lambda f(\bar{x}) + (1-\lambda) f(\bar{y}) }$.
- Claim: ${ f(\bar{x}) = \sum_{i=1}^n c_i \vert x_i \vert }$ is a piecewise linear convex function.
Proof. …
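A sketch of one way to see this claim: since each ${ c_i \geq 0 }$, we can write ${ c_i \vert x_i \vert = \max\{c_i x_i, -c_i x_i\} }$, and therefore

${ f(\bar{x}) = \sum_{i=1}^n \max\{c_i x_i,\, -c_i x_i\} = \max_{\sigma \in \{-1,1\}^n} \sum_{i=1}^n \sigma_i c_i x_i, }$

so ${ f }$ is a maximum of ${ 2^n }$ linear functions, hence piecewise linear convex. The catch is that this representation has exponentially many pieces, which motivates the more economical reformulations below.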
We would like to reformulate the optimization problem above as a linear optimization problem. However, it is not so easy to express ${ f(\bar{x}) }$ in the form ${ \max_i \{\bar{\alpha}_i^T \bar{x} + \beta_i\} }$ with only a small number of pieces.
So, here we have two ways to accomplish it.
Way one
Since ${ \vert x_i \vert }$ is the smallest number ${ z_i }$ satisfying ${ x_i \leq z_i, -x_i \leq z_i }$, we introduce new decision variables ${ z_i }$, replace each ${ \vert x_i \vert }$ in the objective by ${ z_i }$, and add the additional constraints ${ x_i \leq z_i }$ and ${ -x_i \leq z_i }$ for every ${ i }$.
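With the same assumed linear constraints ${ A\bar{x} \geq \bar{b} }$ as above, the resulting linear program is

${ \min_{\bar{x},\,\bar{z}} \ \sum_{i=1}^n c_i z_i \quad \text{s.t.}\quad A\bar{x} \geq \bar{b},\quad x_i \leq z_i,\quad -x_i \leq z_i,\quad i = 1,\dots,n. }$

Since for fixed ${ \bar{x} }$ a cheapest feasible choice is ${ z_i = \max\{x_i, -x_i\} = \vert x_i \vert }$, the two problems have the same optimal value.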
Way two
Here is another way to reformulate this form of optimization problem as a linear problem.
Think from the 1-dimensional case
Consider the following optimization problem.
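As a concrete sketch (the data here is made up purely for illustration, and the particular constraints are not essential to the argument), take something like

${ \min_{x \in \mathbb{R}} \ \vert x \vert \quad \text{s.t.}\quad a x \geq b. }$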
- Claim: ${ \forall x \in \mathbb{R} }$, we have ${ \vert x \vert = \min\{\, x^+ + x^- \;:\; x = x^+ - x^-,\ x^+ \geq 0,\ x^- \geq 0 \,\} }$.
This suggests writing the decision variable ${ x }$ as a difference ${ x = x^+ - x^- }$ with ${ x^+ , x^- \geq 0 }$.
For instance, we can take ${ x^+ = \max\{0,x\},\ x^- = \max\{0,-x\} }$.
But there are infinitely many other possible choices: for any ${ M \geq 0 }$ we can take ${ x^+ = \max\{0,x\} + M,\ x^- = \max\{0,-x\} + M }$.
And we can easily transfer between ${ x, M }$ and ${ x^+, x^- }$. (Hint: ${ x^+ + x^- = \vert x\vert +2M }$.)
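A quick check of this family of choices, and of the hint:

${ x^+ - x^- = \max\{0,x\} - \max\{0,-x\} = x, \qquad x^+ + x^- = \max\{0,x\} + \max\{0,-x\} + 2M = \vert x \vert + 2M. }$

So every member of the family represents the same ${ x }$, and ${ M }$ measures how far ${ x^+ + x^- }$ exceeds ${ \vert x \vert }$.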
So now we can reformulate our original optimization problem using the variables ${ x }$ and ${ M }$.
We can add the term ${ 2M }$, with ${ M \geq 0 }$, to the objective function: because we take the minimum, ${ M }$ will be ${ 0 }$ in the solution anyway. So we obtain the equivalent problem with objective ${ \vert x \vert + 2M }$ and the extra constraint ${ M \geq 0 }$.
Then we use ${ x^+, x^- }$ to replace ${ x, M }$, and we finally reformulate the original problem as a linear programming problem, as in the sketch below.
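Putting the steps together for the 1-dimensional sketch above (objective ${ \vert x \vert }$, assumed constraint ${ ax \geq b }$):

${ \min_{x}\ \vert x\vert \ \ \text{s.t. } ax \geq b \;\iff\; \min_{x,\,M \geq 0}\ \vert x\vert + 2M \ \ \text{s.t. } ax \geq b \;\iff\; \min_{x^+,\,x^- \geq 0}\ x^+ + x^- \ \ \text{s.t. } a(x^+ - x^-) \geq b, }$

where the last problem is a linear program, and an optimal ${ x }$ is recovered as ${ x = x^+ - x^- }$.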
Since ${ x^+, x^- \geq 0 }$, the identity ${ M = \frac{x^+ +x^- - \vert x^+-x^- \vert }{2} }$ implicitly guarantees ${ M \geq 0 }$. So we do not need to write ${ \frac{x^+ +x^- - \vert x^+-x^- \vert }{2} \geq 0 }$ in the constraints again.
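Indeed, for ${ x^+, x^- \geq 0 }$ we have ${ \vert x^+ - x^- \vert \leq x^+ + x^- }$, so

${ M = \frac{x^+ + x^- - \vert x^+ - x^- \vert}{2} = \min\{x^+,\, x^-\} \geq 0. }$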
Accomplishment
In general, provided the numbers ${ c_1, c_2, \cdots, c_n }$ are all nonnegative, the objective function ${ \bar{x} \mapsto \sum_{i=1}^n c_i \vert x_i \vert }$ is piecewise linear convex, and the optimization problem can be reformulated as a linear optimization problem, as sketched below.
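With the assumed linear constraints ${ A\bar{x} \geq \bar{b} }$ from the beginning of the post, the statement is

${ \min\ \sum_{i=1}^n c_i \vert x_i\vert \ \ \text{s.t. } A\bar{x} \geq \bar{b} \qquad\iff\qquad \min\ \sum_{i=1}^n c_i z_i \ \ \text{s.t. } A\bar{x} \geq \bar{b},\ \ x_i \leq z_i,\ \ -x_i \leq z_i,\ \ i=1,\dots,n, }$

using way one; way two gives the equally valid form ${ \min \sum_{i=1}^n c_i (x_i^+ + x_i^-) }$ subject to ${ A(\bar{x}^+ - \bar{x}^-) \geq \bar{b},\ \bar{x}^+, \bar{x}^- \geq 0 }$.

Finally, here is a minimal numerical sketch of way one using scipy; the instance data below is made up purely for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Made-up instance: minimize 2|x1| + |x2| subject to x1 + x2 >= 1.
c = np.array([2.0, 1.0])      # nonnegative weights c_i
A = np.array([[1.0, 1.0]])    # assumed linear constraints A x >= b
b = np.array([1.0])
n = c.size

# Way one: variables v = (x, z); minimize sum_i c_i z_i
# subject to A x >= b, x_i <= z_i, -x_i <= z_i.
obj = np.concatenate([np.zeros(n), c])
A_ub = np.vstack([
    np.hstack([-A, np.zeros_like(A)]),    # -A x <= -b   (i.e. A x >= b)
    np.hstack([np.eye(n), -np.eye(n)]),   #  x - z <= 0  (x_i <= z_i)
    np.hstack([-np.eye(n), -np.eye(n)]),  # -x - z <= 0  (-x_i <= z_i)
])
b_ub = np.concatenate([-b, np.zeros(2 * n)])
bounds = [(None, None)] * (2 * n)         # x and z are otherwise free

res = linprog(obj, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print("optimal x:", res.x[:n])            # expect approximately [0, 1]
print("optimal value:", res.fun)          # expect 1.0 = 2*|0| + |1|
```

The reported optimum matches what we get by minimizing ${ 2\vert x_1\vert + \vert x_2\vert }$ over ${ x_1 + x_2 \geq 1 }$ by hand.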