Polynomial Reduction

 

In the following blog posts, we will talk about the hardness of problems: P, NP, and NP-complete problems. This post focuses on reductions and how to construct them.

Catalog of Problems

Here we only give intuitive definitions; rigorous ones will follow in later posts.

  • ${ P = }$ Problems with a polynomial-time algorithm.

  • ${ NP = }$ Problems whose answers are easy to verify (in polynomial time).

  • Reduction: ${ X }$ reduces to ${ Y }$ (denoted ${ X \leq_p Y }$) means that ${ Y }$ is at least as hard as ${ X }$.

  • ${ NP}$-${ Complete }$: ${ Y }$ is ${ NPC }$ if

    1. ${ Y \in NP }$
    2. ${ \forall X \in NP, X \leq_p Y }$
  • ${ EXP= }$ Problems having algorithms with exponential run time.

Polynomial Reduction (Cook, Karp)

Now we will formally define “reduction”. In fact, there are several ways to define it; here we introduce two definitions.

Cook Reduction: Problem ${ X }$ polynomially reduces to problem ${ Y }$ (${ X \leq_p Y }$) if an arbitrary instance of ${ X }$ can be solved using

  1. a polynomial number of standard computational steps, plus

  2. a polynomial number of calls to the algorithm/oracle that solves problem ${ Y }$

Note: the word “oracle” has a more rigorous definition, but here we can understand it as a “black box” that solves ${ Y }$. For some problems we do not yet have any algorithm to solve them, yet we still want to describe their relationship (with respect to hardness).
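
To make the shape of a Cook reduction concrete, here is a toy Python sketch (not a hardness result): ${ X }$ asks how many of the given integers are prime, and the oracle for ${ Y }$ answers primality queries. The only point is the usage pattern: polynomially many ordinary steps plus polynomially many oracle calls.

```python
# Toy illustration of the *shape* of a Cook reduction (not a hardness proof).
# X = "how many of the given integers are prime?", Y = "is n prime?".

def oracle_Y(n: int) -> bool:
    """Stand-in black box for Y (primality); any correct oracle would do."""
    return n >= 2 and all(n % d for d in range(2, int(n ** 0.5) + 1))

def solve_X_using_Y(numbers: list[int]) -> int:
    """One oracle call per input number plus O(len(numbers)) extra steps."""
    return sum(1 for n in numbers if oracle_Y(n))

print(solve_X_using_Y([2, 3, 4, 9, 11]))  # -> 3
```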

Karp Reduction: Problem ${ X }$ polynomially reduces to problem ${ Y }$ if, given any instance ${ x }$ of ${ X }$, we can convert it to an instance ${ y }$ of ${ Y }$ such that

  1. ${ \vert y \vert }$ is polynomial in ${ \vert x \vert }$ (that is, ${ y }$ can be constructed from ${ x }$ in polynomial time)

  2. ${ x }$ is a ${ yes }$ instance of ${ X }$ if and only if ${ y }$ is a ${ yes }$ instance of ${ Y }$.
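
A Karp reduction, in code form, is just a polynomial-time transformation followed by a single oracle call whose answer is forwarded unchanged. The sketch below is generic; `transform` and `oracle_Y` are hypothetical placeholders that a concrete reduction would fill in.

```python
from typing import Any, Callable

def karp_reduce(x: Any,
                transform: Callable[[Any], Any],
                oracle_Y: Callable[[Any], bool]) -> bool:
    """Decide instance x of X: transform it in polynomial time, then ask the Y-oracle once."""
    y = transform(x)    # |y| must be polynomial in |x|
    return oracle_Y(y)  # exactly one oracle call; the yes/no answer is forwarded unchanged
```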

First, Karp reductions apply only to decision problems.

Next, notice that a Karp reduction calls the oracle for ${ Y }$ only once, while a Cook reduction permits polynomially many oracle calls. So Karp reductions are more restrictive than Cook reductions.

Note that every Karp reduction is automatically a Cook reduction; whether the two notions are equivalent is a question we will not address here.

In the following discussion, we will only use Cook reductions.

Hence, we immediately get the following three claims from the definition of reduction.

Claim 1: If ${ X \leq_p Y }$ and ${ Y }$ has a polynomial-time algorithm, then ${ X }$ can be solved in polynomial time.

Claim 2: If ${ X \leq_p Y }$ and ${ X }$ cannot be solved in polynomial time, then ${ Y }$ cannot be solved in polynomial time.

Claim 3: If ${ X \leq_p Y }$ and ${ Y \leq_p X }$, then ${ X }$ can be solved in polynomial time iff ${ Y }$ can be solved in polynomial time.

Decision Problems

In earlier posts, we saw many search/optimization problems, where we look for the best solution among a set of possible solutions, e.g., Min Cut, Max Flow, Chromatic Number.

What is the Chromatic Number problem? (Given a graph ${ G }$, the chromatic number is the minimum number of colors required to color the vertices of ${ G }$ such that no two adjacent vertices get the same color.)

But we can also state the above problems in their decision versions. Here we introduce decision problems.

Definition: A decision problem is a computational problem that can be posed as a yes–no question about the input values.

For example, the decision version of the Max Flow problem is ${ ((G,s,t,C),k) }$: for the max flow instance ${ (G,s,t,C) }$, does there exist a flow of value at least ${ k }$?

Another example: the decision version of Chromatic Number is ${ (G,k) }$: can we properly color the vertices of the graph with ${ k }$ colors?

Reduction from Search Problems to Decision Problems

It is obvious that if we solve the search problem, we immediately get the answer to the decision problem. (For Chromatic Number, if we already know the chromatic number ${ \chi }$ of graph ${ G }$, we can easily answer ${ (G,k) }$: ${ yes }$ iff ${ k \geq \chi }$.)

But if we can already solve the decision problem, can we get the answer to the search problem in polynomial time?

Yes, we can. We can solve the decision versions of the Chromatic Number problem ${ (G,k) }$ for ${ k\in \{1,2,\cdots,n\} }$, where ${ n = \vert V \vert }$. This calls the decision algorithm only polynomially many times, ${ O(n) }$. Since colorability is monotone in ${ k }$, binary search does even better, using only ${ O(\lg n) }$ calls.
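
As a concrete sketch of this Cook reduction, here is the binary-search version in Python. Since we have no polynomial-time decision algorithm for colorability, the code uses a brute-force checker as a stand-in for the decision oracle; only the calling pattern matters.

```python
from itertools import product

def colorable(adj: dict[int, set[int]], k: int) -> bool:
    """Brute-force stand-in for the decision oracle: does G admit a proper k-coloring?"""
    nodes = list(adj)
    for coloring in product(range(k), repeat=len(nodes)):
        color = dict(zip(nodes, coloring))
        if all(color[u] != color[v] for u in adj for v in adj[u]):
            return True
    return False

def chromatic_number(adj: dict[int, set[int]]) -> int:
    """Search version via O(lg n) calls to the decision oracle (binary search over k)."""
    lo, hi = 1, len(adj)
    while lo < hi:
        mid = (lo + hi) // 2
        if colorable(adj, mid):   # yes  =>  chi(G) <= mid
            hi = mid
        else:                     # no   =>  chi(G) >  mid
            lo = mid + 1
    return lo

triangle = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
print(chromatic_number(triangle))  # -> 3
```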

Reduction Techniques

In the following, we give some common techniques for proving reductions, each illustrated by an example.

  • Reduction by Equivalence

  • Reduction from special cases to general case

  • Reduction via gadgets

Reduction by Equivalence

Vertex Cover Problem

Given an undirected graph ${ G=(V,E) }$, a vertex cover of ${ G }$ is a subset of nodes ${ V’ \subseteq V }$ such that ${ \forall e = (u,v) \in E }$, at least one of ${ u,v }$ is in ${ V’ }$. The Vertex Cover problem is to find a vertex cover of minimum size in the graph ${ G }$.

Decision Version: ${ <G;k> = \{\text{all graphs that have a vertex cover of size } k\} }$. ${ G’ \in <G;k> }$ iff ${ (G’,k) }$ is a ${ yes }$ instance.

Independent Set Problem

Given an undirected graph ${ G=(V,E) }$, an independent set of ${ G }$ is a subset of vertices ${ V’ \subseteq V }$ such that no two vertices in ${ V’ }$ are adjacent in ${ G }$ (there is no edge between any two vertices of ${ V’ }$). The Independent Set problem is to find an independent set of maximum size in the graph ${ G }$.

Decision Version: ${ <G;k> = \{\text{all graphs that have an independent set of size } k\} }$. ${ G’ \in <G;k> }$ iff ${ (G’,k) }$ is a ${ yes }$ instance.

Vertex Cover ${ =_p }$ Independent Set

We will show that they reduce to each other by proving the equivalence of vertex covers and independent sets. We have the following claim.

Claim: ${ S }$ is an independent set (IS) iff ${ V \setminus S }$ is a vertex cover (VC).

Proof. We will prove the necessity and sufficiency separately.

${ \Rightarrow }$: Let ${ S }$ be an IS. For every edge ${ e =(u,v) \in E }$, at least one of ${ u, v }$ is not in ${ S }$ (otherwise two adjacent vertices would lie in ${ S }$). So at least one of ${ u, v }$ is in ${ V\setminus S }$. Hence ${ V\setminus S }$ covers all the edges, i.e., it is a vertex cover of ${ G }$.

${ \Leftarrow }$: Let ${ C }$ be a VC. Let us check that ${ V \setminus C }$ is an IS. Assume ${ (u,v) \in E }$ with ${ u,v \in V \setminus C }$. That means ${ u,v \notin C }$, which contradicts the fact that ${ C }$ is a vertex cover. ${ \square }$
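
As a quick sanity check (not part of the proof), here is a small Python sketch that exhaustively verifies the claim on one example graph, with the graph given as a dict of neighbor sets:

```python
from itertools import combinations

def is_independent_set(adj: dict[int, set[int]], s: set[int]) -> bool:
    """No two vertices of s are adjacent."""
    return all(v not in adj[u] for u, v in combinations(s, 2))

def is_vertex_cover(adj: dict[int, set[int]], c: set[int]) -> bool:
    """Every edge has at least one endpoint in c."""
    return all(u in c or v in c for u in adj for v in adj[u])

# Check "S is an IS  iff  V \ S is a VC" for every subset S of a small graph.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
V = set(adj)
for r in range(len(V) + 1):
    for s in map(set, combinations(V, r)):
        assert is_independent_set(adj, s) == is_vertex_cover(adj, V - s)
print("claim verified on this example")
```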

Sum-up

In fact, “reduction by equivalence” amounts to proving a mathematical relation between the two problems. Vertex cover and independent set are like two sides of the same coin. Another example is Max Flow and Min Cut.

Reduction from special cases to general case

Set Cover Problem

Given a set of elements ${ U }$ (the universe) and a collection of subsets ${ s_1,s_2,\cdots,s_m \subseteq U }$, a set cover of ${ U }$ is a sub-collection of these subsets whose union is ${ U }$. The Set Cover problem is to find the minimum number of subsets that cover ${ U }$.

Decision Version: Does there exist a collection of at most ${ k }$ of these sets whose union is equal to ${ U }$?

Vertex Cover ${ \leq_p }$ Set Cover

We will prove that the Vertex Cover problem ${ \leq_p }$ the Set Cover problem by transforming a VC instance into an SC instance.

Proof. Take an instance ${ (G,k) }$, ${ G=(V,E) }$, of the decision version of the Vertex Cover problem. We construct a Set Cover instance ${ (U, s_1, s_2, \cdots, s_m, k’) }$ as follows:

  1. ${ U = E }$

  2. ${ \forall v \in V }$, define ${ s_v = \{\{v,u\}\in E | u \in V \} }$ (${ s_v }$ contains all the edges incident to ${ v }$)

  3. ${ k’ = k }$

Transforming an arbitrary Vertex Cover instance into the corresponding Set Cover instance takes a polynomial number of computational steps (linear in the size of ${ G }$).
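
The construction is short enough to write down directly. Here is a minimal Python sketch, assuming the graph is given as a dict of neighbor sets and each undirected edge is represented as a frozenset:

```python
def vc_to_sc(adj: dict[int, set[int]], k: int):
    """Transform a Vertex Cover instance (G, k) into a Set Cover instance (U, {s_v}, k')."""
    # Universe U = E: each undirected edge {u, v} becomes one element of U.
    U = {frozenset((u, v)) for u in adj for v in adj[u]}
    # One subset s_v per vertex: all edges incident to v.
    subsets = {v: {e for e in U if v in e} for v in adj}
    return U, subsets, k  # k' = k

# Example: path 0 - 1 - 2. The vertex cover {1} corresponds to the set cover {s_1}.
adj = {0: {1}, 1: {0, 2}, 2: {1}}
U, subsets, k = vc_to_sc(adj, 1)
assert subsets[1] == U  # s_1 alone covers the whole universe
print(len(U), sorted(len(s) for s in subsets.values()))  # 2 edges, subset sizes [1, 1, 2]
```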

Now we will prove the following claim to finish the reduction.

Claim: ${ G }$ has a vertex cover of size ${ k }$ iff ${ U }$ has a set cover of size ${ k }$.

We will prove the necessity and sufficiency separately.

${ \Rightarrow }$: Denote the vertex cover by ${ S }$. We claim the sub-collection ${ C= \{s_v | v\in S\} }$ is a set cover of ${ U }$. For each element ${ u \in U }$, the corresponding edge ${ e_u = (t,v) }$ must be covered by ${ S }$, meaning ${ t \in S }$ or ${ v \in S }$, so ${ s_t \in C }$ or ${ s_v \in C }$. Since ${ u \in s_t }$ and ${ u \in s_v }$, ${ C }$ covers ${ u }$.

${ \Leftarrow }$: Denote the set cover by ${ C }$. Take the vertex cover ${ S = \{v \in V | s_v \in C \} }$. For each ${ e \in E }$, the corresponding element ${ u \in U }$ must be contained in some subset ${ s_v \in C }$, hence ${ v \in S }$ and ${ v }$ is incident to ${ e }$. So ${ S }$ is a vertex cover of ${ G }$. ${ \square }$

Sum-up

  1. Notice that in the above reduction, Set Cover is the more general problem: in the constructed instances each pair of subsets ${ s_v, s_u \subseteq U }$ overlaps in at most one element (${ \vert s_v \cap s_u \vert \leq 1 }$), so these instances form a special case of Set Cover. Conversely, it is hard to convert a Set Cover instance into a Vertex Cover instance directly.

  2. However, we will later prove that the Vertex Cover problem is NP-complete, which means the decision version of Set Cover does reduce to the Vertex Cover problem.

  3. The technique “reduction from a special case to the general case” consists of the following steps for reducing ${ X }$ to ${ Y }$:

  • For an arbitrary instance ${ x }$ of ${ X }$, construct an instance ${ y }$ of ${ Y }$ in polynomial time.

  • Show that ${ x }$ is a ${ yes }$ instance of ${ X }$ iff ${ y }$ is a ${ yes }$ instance of ${ Y }$.

Reduction via gadgets

Finally, we introduce the last way to do reductions. The “gadgets” here are built from a more abstract problem, formulated in Boolean notation, which we can then reduce to other problems.

Boolean Formulae

  • Boolean variables: variables that can take the value ${ 0 }$ or ${ 1 }$ (equivalently, “false” or “true”).

  • Boolean operators: ${ \neg, \land, \lor }$

  • Boolean formulae: expressions built from Boolean variables and Boolean operators.

Attributes of a Boolean formula

  1. Literal/term: a Boolean variable ${ x_i \in \{0,1\}}$ or its negation ${ \overline{x_i} }$. (Here ${ \overline{x_i} }$ denotes ${ \lnot x_i }$.)

  2. Clause: a disjunction of distinct terms/literals, e.g.

$$ t_1 \lor t_2 \lor \cdots \lor t_m, t_i \in \{x_1, \cdots, x_n, \overline{x_1}, \cdots, \overline{x_n} \} $$

We say the length of the clause is ${ m }$ if it contains ${ m }$ terms/literals.

  3. Conjunctive Normal Form (CNF): a conjunction of several clauses, e.g.
$$ C_1 \land C_2 \land \cdots \land C_k $$

Here each ${ C_i }$ is a clause as defined above.

e.g. ${ (x_1 \lor \overline{x_2}) \land (x_2 \lor x_3 ) \land (x_1 \lor \overline{x_2} \lor \overline{x_3}) }$

If we give each variable a value ${ 0 }$ or ${ 1 }$, this is called an assignment.

  • We say an assignment of ${ X = \{x_1,x_2, \cdots, x_n\} }$ satisfies the conjunction ${ C_1 \land C_2 \land \cdots \land C_k }$ if the conjunction evaluates to true (${ 1 }$) under this assignment; such an assignment is called a satisfying (truth) assignment.

  • If there exists a satisfying assignment, we say the conjunction ${ C_1 \land C_2 \land \cdots \land C_k }$ is satisfiable.

SAT and ${ 3 }$-SAT Problems

  • Satisfiability (SAT) Problem: Given a set of clauses ${ C_1, C_2,\cdots, C_k }$ over a set of variables ${ X = \{x_1,x_2, \cdots, x_n\} }$, does there exist a satisfying truth assignment? Equivalently, is the conjunction ${ C_1 \land C_2 \land \cdots \land C_k }$ satisfiable?

  • ${ 3 }$-Satisfiability (${ 3 }$-SAT) Problem: Given a set of clauses ${ C_1, C_2,\cdots, C_k }$, each of length ${ 3 }$, over a set of variables ${ X = \{x_1,x_2, \cdots, x_n\} }$, does there exist a satisfying truth assignment? Equivalently, is the conjunction ${ C_1 \land C_2 \land \cdots \land C_k }$ satisfiable?
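
To make the definitions concrete, here is a brute-force satisfiability check (exponential time, purely illustrative) applied to the example CNF above. Literals are encoded as signed integers (${ +i }$ for ${ x_i }$, ${ -i }$ for ${ \overline{x_i} }$), a common but arbitrary convention.

```python
from itertools import product

def satisfiable(clauses: list[list[int]], n: int) -> bool:
    """Try all 2^n assignments; a clause is satisfied if at least one of its literals is true."""
    for bits in product([False, True], repeat=n):
        assignment = {i + 1: bits[i] for i in range(n)}
        if all(any(assignment[abs(lit)] == (lit > 0) for lit in clause) for clause in clauses):
            return True
    return False

# (x1 v ~x2) ^ (x2 v x3) ^ (x1 v ~x2 v ~x3), the example CNF from above
clauses = [[1, -2], [2, 3], [1, -2, -3]]
print(satisfiable(clauses, 3))  # -> True, e.g. x1 = 1, x2 = 0, x3 = 1
```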

3-SAT ${ \leq_p }$ Independent Set

Given an instance of the ${ 3 }$-SAT problem, we will construct an Independent Set instance based on it.

Proof. Given a CNF ${ \phi }$ from an arbitrary instance of the ${ 3 }$-SAT problem, we construct an IS instance ${ G }$ as follows:

  1. For each clause ${ C \in \phi }$, we use ${ 3 }$ nodes, one for each literal/term, and connect each pair of nodes within the clause (forming a triangle).

  2. Connect each literal/term with its negation, i.e., add an edge ${ \{x_i,\overline{x_i}\} }$ between the triangles.
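
The construction itself is mechanical. Below is a minimal Python sketch, using the signed-integer literal encoding from the SAT example above and labeling each vertex by (clause index, position in clause):

```python
def three_sat_to_is(clauses: list[list[int]]):
    """Build the gadget graph; the independent-set target size is k = number of clauses."""
    nodes = [(ci, pos) for ci, clause in enumerate(clauses) for pos in range(len(clause))]
    edges = set()
    # 1. A triangle inside every clause.
    for ci, clause in enumerate(clauses):
        for a in range(len(clause)):
            for b in range(a + 1, len(clause)):
                edges.add(((ci, a), (ci, b)))
    # 2. An edge between every literal and its negation in other clauses.
    for ci, clause in enumerate(clauses):
        for cj, other in enumerate(clauses):
            for a, la in enumerate(clause):
                for b, lb in enumerate(other):
                    if cj > ci and la == -lb:
                        edges.add(((ci, a), (cj, b)))
    return nodes, edges

clauses = [[1, -2, 3], [-1, 2, 3], [1, 2, -3]]
nodes, edges = three_sat_to_is(clauses)
print(len(nodes), len(edges))  # 9 vertices; look for an independent set of size k = 3
```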

Reduction Time

Suppose we have ${ k }$ clauses and ${ \ell }$ terms/literals.

Building the ${ k }$ triangles takes ${ O(k) }$ time, and adding the edges between complementary literals takes at most ${ O(\ell^2) }$ time, so the whole construction runs in polynomial time.

Correctness

We will show the following claim.

Claim: ${ \phi }$ is satisfiable iff ${ G }$ has an independent set of size ${ k }$.

We will prove the necessity and sufficiency separately.

${ \Rightarrow }$: Given a satisfying assignment of ${ \phi }$, construct an independent set ${ S }$ as follows: from each clause, add one vertex whose corresponding literal is ${ 1 }$ (if there are multiple choices, pick an arbitrary one).

Because ${ \phi }$ is satisfied, every clause evaluates to ${ 1 }$, so each clause has at least one literal set to ${ 1 }$. Thus we can pick one vertex from each triangle, which means ${ \vert S \vert = k }$.

Moreover, since ${ S }$ contains exactly one vertex from each triangle, no two vertices of ${ S }$ are joined by a triangle edge.

As for the edges between triangles, their endpoints correspond to a literal and its negation, and ${ S }$ cannot contain both endpoints, because we cannot have ${ x_i = \overline{x_i} = 1 }$.

Thus no pair of vertices in ${ S }$ is adjacent, so ${ S }$ is an independent set of size ${ k }$.

${ \Leftarrow }$: Let ${ S }$ be an independent set of size ${ k }$. We construct an assignment as follows: set every literal whose corresponding vertex lies in ${ S }$ to ${ 1 }$ (the remaining variables may be set arbitrarily).

Since ${ S }$ is independent, it never contains both a literal and its negation (such vertices are joined by an edge), so there is no conflict between the chosen literals. Moreover, ${ \vert S \vert = k }$ implies that ${ S }$ contains exactly one vertex from each triangle; otherwise ${ S }$ would contain both endpoints of some triangle edge.

Hence we have a consistent assignment that makes each clause true, i.e., a satisfying assignment, so ${ \phi }$ is satisfiable. ${ \square }$
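
For completeness, here is a small Python sketch of the ${ \Leftarrow }$ direction, reusing the (clause index, position) vertex labels from the construction sketch above; the clauses and the chosen independent set here are purely illustrative:

```python
def assignment_from_is(clauses: list[list[int]], indep_set) -> dict[int, bool]:
    """Turn an independent set with one vertex per clause into a truth assignment."""
    assignment: dict[int, bool] = {}
    for ci, pos in indep_set:
        lit = clauses[ci][pos]
        # Independence guarantees we never pick both x_i and ~x_i, so no conflict arises.
        assignment[abs(lit)] = lit > 0
    return assignment  # variables not mentioned can be set arbitrarily

clauses = [[1, -2, 3], [-1, 2, 3], [1, 2, -3]]
print(assignment_from_is(clauses, [(0, 0), (1, 1), (2, 1)]))  # -> {1: True, 2: True}
```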

Sum-up

The SAT problem is an abstract problem over Boolean formulae, so we can use it as a gadget and reduce it to other problems.

Transitivity

If ${ X \leq_p Y }$ and ${ Y \leq_p Z }$, then ${ X \leq_p Z }$.

It’s trivial to check by definition.