Polynomially bounded
Is the function ${ \lceil \lg n \rceil ! }$ polynomially bounded? Is the function ${ \lceil \lg \lg n \rceil !}$ polynomially bounded?
Proving that a function ${ f(n) }$ is polynomially bounded is equivalent to proving that ${ \lg (f(n)) = O(\lg n)}$.
That’s because:
${\Rightarrow}$ If ${ f(n) }$ is polynomially bounded, there exist constants ${ c,k,n_0 }$ such that ${ f(n)\leq cn^k }$ for ${ n>n_0 }$. Hence, for ${ n \geq \max(n_0, c) }$, ${ \lg (f(n)) \leq k \lg n + \lg c \leq (k+1) \lg n }$, which means ${ \lg (f(n)) = O(\lg n) }$.
${\Leftarrow}$ If ${ \lg (f(n)) = O(\lg n) }$, then there exist constants ${ k,n_0 }$ such that ${ \lg (f(n)) \leq k \lg n }$ for ${ n>n_0 }$, and therefore ${ f(n) \leq n^k }$, i.e. ${ f(n) }$ is polynomially bounded. So the two statements are equivalent.
In the following, we will use two facts:
#1. ${\lg (n!) = \Theta(n\lg n)}$ (We can check it by Stirling’s approximation).
#2. ${\lceil \lg n \rceil = \Theta(\lg n)}$ (Because ${ \lg n \leq \lceil \lg n \rceil \leq \lg n +1 }$).
Now we can check ${ \lg (\lceil \lg n \rceil ! )}$ and ${ \lg (\lceil \lg \lg n \rceil ! )}$ using these two facts:
${ \lg (\lceil \lg n \rceil !) = \Theta(\lceil \lg n \rceil \lg \lceil \lg n \rceil) = \Theta(\lg n \cdot \lg \lg n) \neq O(\lg n), }$
so ${ \lceil \lg n \rceil ! }$ is not polynomially bounded, while
${ \lg (\lceil \lg \lg n \rceil !) = \Theta(\lceil \lg \lg n \rceil \lg \lceil \lg \lg n \rceil) = \Theta(\lg \lg n \cdot \lg \lg \lg n) = O(\lg n), }$
so ${ \lceil \lg \lg n \rceil ! }$ is polynomially bounded.
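As a quick numerical illustration (ours, not a proof and not part of the original argument), the snippet below compares ${ \lg(f(n)) }$ against ${ \lg n }$ for both functions; the helper name `lg_factorial` is made up here.

```python
import math

# Numerical illustration (not a proof): f(n) is polynomially bounded iff
# lg(f(n)) = O(lg n), so we look at the ratio lg(f(n)) / lg(n).
def lg_factorial(m):
    # lg(m!) = lg 1 + lg 2 + ... + lg m
    return sum(math.log2(i) for i in range(1, m + 1))

for n in [2**10, 2**20, 2**40, 2**80]:
    lg_n = math.log2(n)
    r1 = lg_factorial(math.ceil(lg_n)) / lg_n              # lg(ceil(lg n)!) / lg n
    r2 = lg_factorial(math.ceil(math.log2(lg_n))) / lg_n   # lg(ceil(lg lg n)!) / lg n
    print(f"n = 2^{int(lg_n)}:  {r1:.2f}  {r2:.4f}")
# The first ratio keeps growing (it behaves like lg lg n), so ceil(lg n)! is not
# polynomially bounded; the second ratio shrinks toward 0, so ceil(lg lg n)! is.
```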
Asymptotic Notation Ex.
(a) ${ f(n)=O((f(n)^2)) }$
Sometimes true. For ${ f(n)=n }$ it holds, since ${ n = O(n^2) }$. But for ${ f(n) = 1/n }$ it fails, because ${ 1/n \leq c \cdot 1/n^2 }$ would require ${ n \leq c }$ for all sufficiently large ${ n }$.
(b) ${ f(n) + g(n) = \Theta\left(\max\left(f(n),g(n)\right)\right) }$
Always true (assuming ${ f }$ and ${ g }$ are asymptotically nonnegative). That's because
${ \max(f(n),g(n)) \leq f(n)+g(n) \leq 2\max(f(n),g(n)) }$
for all sufficiently large ${ n }$, so ${ f(n)+g(n) = \Theta(\max(f(n),g(n))) }$ with constants ${ 1 }$ and ${ 2 }$.
(c) ${ f(n)+O(f(n)) =\Theta(f(n)) }$
Always true (assuming ${ f }$ and ${ g }$ are asymptotically nonnegative). For any ${ g(n) \in O(f(n)) }$ there exist constants ${ c,n_0 }$ such that ${ 0 \leq g(n) \leq c f(n) }$ for ${ n>n_0 }$. Therefore
${ f(n) \leq f(n) + g(n) \leq (1+c) f(n), \quad n > n_0. }$
So for every ${ g(n) \in O(f(n)) }$ we have ${ f(n) + g(n) = \Theta (f(n)) }$, which means ${ f(n)+O(f(n))=\Theta(f(n)) }$.
(d) ${ f(n) = \Omega(g(n)) }$ and ${ f(n) = o(g(n)) }$
Never true. ${ f(n) = \Omega(g(n)) }$ means there exist constants ${ \varepsilon > 0, n_0 }$ such that ${ g(n) \leq \varepsilon \cdot f(n) }$ for ${ n>n_0 }$. And ${ f(n) = o(g(n)) }$ means that for every ${ c >0 }$ there exists ${ n_c }$ such that ${ f(n) \leq c\, g(n) }$ for ${ n>n_c }$.
Therefore, taking ${ c=\frac{1}{2\varepsilon} }$, for ${ n > \max (n_c, n_0) }$ we have
${ f(n) \leq \frac{1}{2\varepsilon}\, g(n) \leq \frac{1}{2\varepsilon} \cdot \varepsilon f(n) = \frac{1}{2} f(n), }$
which is impossible for (eventually) positive ${ f(n) }$, so the two statements can never hold simultaneously.
(e) ${ f(n)\neq O(g(n)) }$ and ${ g(n)\neq O(f(n)) }$
Sometimes true. It fails for ${ f(n) = g(n) = n }$, but it holds for an oscillating pair such as ${ f(n) = n }$ and ${ g(n) = n^{1+\sin n} }$: the ratio ${ g(n)/f(n) = n^{\sin n} }$ is at least ${ \sqrt n }$ for infinitely many ${ n }$ and at most ${ 1/\sqrt n }$ for infinitely many ${ n }$. Note that ${ f(n) \neq O( g(n)) }$ means that for every choice of ${ c,n_0 }$ there is some ${ n>n_0 }$ with ${ f(n) > c\, g(n) }$, and symmetrically ${ g(n) \neq O(f(n)) }$ means that for every ${ c,n_0 }$ there is some ${ n>n_0 }$ with ${ g(n) > c\, f(n) }$.
Solve Recurrences
Give asymptotic upper and lower bounds for ${ T(n) }$ in each of the following recurrences. Assume that ${ T(n) }$ is constant for ${ n\leq 3 }$. Make your bounds as tight as possible, and justify your answers.
(a) ${ T(n) = 2T(n/3)+ n\lg n}$
We will use the Master Method. In this case ${ a=2,b=3,f(n)=n \lg n }$, so we compare ${ n^{\log_3 2} }$ with ${ f(n)=n \lg n }$.
Let ${ \varepsilon = \log_3 \frac{3}{2} >0 }$; since ${ \log_3 2 + \varepsilon = 1 }$, we have
${ f(n) = n \lg n = \Omega(n) = \Omega\left(n^{\log_3 2 + \varepsilon}\right). }$
Besides, the regularity condition holds: ${ af(n/b)= \frac{2n}{3} \lg \frac{n}{3} \leq \frac{2n}{3} \lg n= \frac{2}{3} f(n)}$. So we are in Case #3 of the Master Theorem, that is ${ T(n)= \Theta(f(n))=\Theta(n \lg n)}$.
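As a sanity check (ours, not part of the solution), we can evaluate the recurrence numerically, taking ${ T(n)=1 }$ for ${ n \leq 3 }$ and rounding ${ n/3 }$ down:

```python
import math
from functools import lru_cache

# Rough numerical check (not a proof) that T(n) = 2T(n/3) + n lg n is Theta(n lg n).
@lru_cache(maxsize=None)
def T(n):
    if n <= 3:
        return 1.0                      # T(n) is constant for n <= 3
    return 2 * T(n // 3) + n * math.log2(n)

for n in [10**3, 10**5, 10**7, 10**9]:
    print(n, round(T(n) / (n * math.log2(n)), 3))
# The ratio stays bounded by a small constant, consistent with T(n) = Theta(n lg n).
```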
(b) ${ T(n)=3T(n/5) + \lg^2 n }$
In this case, ${ a=3,b=5,f(n)=\lg^2 n }$. We will use the following fact:
${\lg n = O\left(n^{\frac{1}{3}}\right)}$
Let ${ g(n)= \frac{3}{\ln 2} \cdot n^{\frac{1}{3}} - \lg n }$. Then ${ g(1) = \frac{3}{\ln 2} > 0 }$ and ${ g'(n) = \frac{1}{\ln 2}\left( \frac{1}{n^{2/3}} - \frac{1}{n} \right) \geq 0 }$ for ${ n \geq 1 }$, so ${ \lg n \leq \frac{3}{\ln 2} n^{1/3} }$ and hence ${ \lg^2 n = O\left(n^{\frac{2}{3}}\right) }$. Now we can compare ${ f(n) }$ with ${ n^{\log_5 3} }$: let ${ \varepsilon = \log_5 3 - \frac{2}{3}=\log_5 \frac{3}{\sqrt[3] {5^2}} = \log_5 \frac{\sqrt[3]{27}}{\sqrt[3] {25}} >0}$, so that
${ f(n) = \lg^2 n = O\left(n^{\frac{2}{3}}\right) = O\left(n^{\log_5 3 - \varepsilon}\right). }$
According to Master Theorem Case #1, ${ T(n)=\Theta(n^{\log_5 3}) }$.
(c) ${ T(n) = T(n/2) + 2^n }$
Here ${ a=1,b=2, f(n)=2^n }$. It's trivial to check ${ f(n)=2^n = \Omega(n^{\log_2 1 + \varepsilon})=\Omega(n^{ \varepsilon}) }$; for example, ${ \varepsilon = 1 }$ works.
Then, for ${ n \geq 2 }$, the regularity condition is easy to check:
${ a f(n/b) = 2^{n/2} \leq 2^{n-1} = \frac{1}{2} f(n), }$
since ${ n/2 \leq n-1 }$ for ${ n \geq 2 }$.
So, we are in the Case #3 of Master Theorem. And we can get ${ T(n)= \Theta(f(n))=\Theta(2^n) }$.
(d) ${ T(n)=T(\sqrt n)+\Theta (\lg \lg n) }$
Let ${ m = \lg n }$; then ${ T(2^m)= T(2^{m/2})+\Theta(\lg m) }$. Let ${ S(m)=T(2^m) }$, so we get ${ S(m)=S(m/2)+\Theta (\lg m) }$.
In this case, ${ a=1,b=2,f(m)=\Theta (\lg m) }$. It's easy to check ${ f(m)=\Theta\left(m^{\log_2 1}\lg^1 m\right) }$. By the (extended) Case #2 of the Master Theorem, ${ S(m)=\Theta((\lg m)^2) }$, that is ${ T(2^{\lg n})=T(n)= \Theta ((\lg \lg n)^2) }$.
(e) ${ T(n)=10T(n/3)+17n^{1.2} }$
Because ${ \log_3 10 > \log_3 9 = 2 > 1.2 }$, we have ${ f(n) = 17n^{1.2} = O\left(n^{\log_3 10 - \varepsilon}\right) }$ for, say, ${ \varepsilon = 0.8 }$. We are in Case #1, so ${ T(n)=\Theta(n^{\log_3 10}) }$.
(f) ${ T(n)=7T(n/2)+n^3 }$
It's easy to check ${ 3 = \log_2 8 > \log_2 7}$, so ${ f(n) = n^3 = \Omega\left(n^{\log_2 7 + \varepsilon}\right) }$ with ${ \varepsilon = 3 - \log_2 7 > 0 }$. The regularity condition also holds: ${ 7f(n/2)=\frac{7}{8}\cdot n^3= \frac{7}{8}f(n)}$. We are in Case #3, so ${ T(n) = \Theta(n^3) }$.
(g) ${ T(n)=T(n/2+\sqrt {n}) + \sqrt {6046}}$
Guess ${ T(n) = \Theta (\lg n) }$.
We observe that at each stage we incur the constant cost ${ \sqrt {6046} }$ but shrink the problem size to between ${ n/2 }$ and ${ 3n/4 }$ (for ${ n \geq 16 }$ we have ${ \sqrt n \leq n/4 }$, so ${ n/2 \leq n/2 + \sqrt n \leq 3n/4 }$). Hence the total cost is about ${ \Theta(\lg n) \cdot \sqrt {6046} }$, and we guess ${ T(n) = \Theta (\lg n) }$. Now assume that for ${ n<k }$ we have ${ T(n)=\Theta (\lg n) }$, that is, there exist constants ${ c_1, c_2 }$ such that ${ c_1 \lg n \leq T(n) \leq c_2 \lg n }$. Then for ${ n \geq k }$:
${ T(n) = T(n/2 + \sqrt n) + \sqrt{6046} \leq c_2 \lg\left(\tfrac{3n}{4}\right) + \sqrt{6046} = c_2 \lg n - c_2 \lg\tfrac{4}{3} + \sqrt{6046} \leq c_2 \lg n }$
as long as ${ c_2 \geq \sqrt{6046}/\lg\tfrac{4}{3} }$, and similarly
${ T(n) \geq c_1 \lg\left(\tfrac{n}{2}\right) + \sqrt{6046} = c_1 \lg n - c_1 + \sqrt{6046} \geq c_1 \lg n }$
as long as ${ c_1 \leq \sqrt{6046} }$.
By induction, ${ T(n) = \Theta (\lg n) }$.
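As a numerical sanity check of the guess (ours, not part of the argument), we can iterate the recurrence with integer rounding and compare the accumulated cost to ${ \lg n }$:

```python
import math

# Numerical sanity check (not a proof) of T(n) = T(n/2 + sqrt(n)) + sqrt(6046).
def T(n):
    cost = 0.0
    while n > 4:                           # with integer rounding, n = 4 is a fixed point
        cost += math.sqrt(6046)
        n = n // 2 + math.isqrt(n)         # problem size shrinks to between n/2 and 3n/4
    return cost

for n in [10**3, 10**6, 10**12, 10**24]:
    print(n, round(T(n) / math.log2(n), 1))
# The ratio stays within a constant band, consistent with T(n) = Theta(lg n).
```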
(h) ${ T(n) = T(n-2)+\lg n }$
Unrolling the recurrence (say for even ${ n }$), we find ${ T(n) = \Theta(1) + \sum_{i=2}^{n/2} \lg (2i) = \Theta\left(\sum_{i=1}^{n/2} \lg (2i)\right) }$.
According to Stirling's approximation (fact #1 above), we have
${ \sum_{i=1}^{n/2} \lg (2i) = \frac{n}{2} + \lg\left(\left(\tfrac{n}{2}\right)!\right) = \frac{n}{2} + \Theta\left(\tfrac{n}{2} \lg \tfrac{n}{2}\right) = \Theta(n \lg n). }$
So, ${ T(n) = \Theta (n\lg n) }$.
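As a quick numerical check (ours, not a proof), the partial sums indeed grow like ${ \tfrac{1}{2} n \lg n }$:

```python
import math

# Numerical check (not a proof) of part (h): compare sum_{i=1}^{n/2} lg(2i)
# against n lg n for a few even values of n.
for n in [10**3, 10**4, 10**5, 10**6]:
    s = sum(math.log2(2 * i) for i in range(1, n // 2 + 1))
    print(n, round(s / (n * math.log2(n)), 4))
# The ratio tends to 1/2, so the sum (and hence T(n)) is Theta(n lg n).
```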
(i) ${T(n) = T(n/5)+T(4n/5)+\Theta(n) }$
Drawing the recursion tree for the recurrence, we see that each (full) level sums to ${ \Theta(n) }$, and the height ${ h }$ of the tree is ${ \log_{5/4} n = \Theta(\lg n) }$. (The slowest-shrinking branch multiplies the size by ${ 4/5 }$ at each step, so the height satisfies ${ \left(\frac{4}{5}\right)^h n = \Theta(1) }$, i.e. ${ \left(\frac{5}{4}\right)^h = \Theta(n) }$.) Hence we guess ${ T(n) = \Theta(n \lg n) }$.
Assume that for ${ n<k }$ we have ${ T(n) = \Theta (n \lg n) }$, that is, there exist ${ c_0, c_1 >0 }$ such that ${ c_0 \cdot n\lg n \leq T(n) \leq c_1 \cdot n\lg n}$. For the last term, write ${ f(n) }$ for the ${ \Theta(n) }$ part, so there exist ${ d_0,d_1>0 }$ and ${ n_0 }$ such that ${ d_0 n \leq f(n) \leq d_1 n }$ for ${ n>n_0 }$.
Let's check ${ T(n) }$ for ${ n\geq k }$. For the upper bound,
${ T(n) \leq c_1 \cdot \tfrac{n}{5}\lg\tfrac{n}{5} + c_1 \cdot \tfrac{4n}{5}\lg\tfrac{4n}{5} + d_1 n = c_1 \cdot n \lg n - \tfrac{5\lg 5 - 8}{5}\, c_1 n + d_1 n; }$
as long as we pick ${ c_1 > \frac{5d_1}{5\lg 5 - 8} }$, ${ T(n) < c_1 \cdot n \lg n, n\geq k }$. By induction, ${ T(n) = O(n\lg n) }$.
Similarly, for the lower bound,
${ T(n) \geq c_0 \cdot \tfrac{n}{5}\lg\tfrac{n}{5} + c_0 \cdot \tfrac{4n}{5}\lg\tfrac{4n}{5} + d_0 n = c_0 \cdot n \lg n - \tfrac{5\lg 5 - 8}{5}\, c_0 n + d_0 n; }$
as long as we pick ${ c_0 < \frac{5d_0}{5\lg 5 - 8} }$, ${ T(n) > c_0 \cdot n \lg n }$ for ${ n\geq k }$. By induction, ${ T(n) = \Omega(n\lg n) }$.
Therefore, ${ T(n) = \Theta(n \lg n) }$.
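To back up the recursion-tree guess numerically (this check is ours, and it takes the ${ \Theta(n) }$ term to be exactly ${ n }$ with ${ T(n)=1 }$ for ${ n \leq 3 }$), we can evaluate the recurrence with memoization:

```python
import math
from functools import lru_cache

# Numerical sanity check (not a proof) of T(n) = T(n/5) + T(4n/5) + Theta(n),
# with the Theta(n) term taken to be exactly n.
@lru_cache(maxsize=None)
def T(n):
    if n <= 3:
        return 1
    return T(n // 5) + T(4 * n // 5) + n

for n in [10**3, 10**4, 10**5, 10**6]:
    print(n, round(T(n) / (n * math.log2(n)), 3))
# The ratio stays bounded, consistent with T(n) = Theta(n lg n).
```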
(j)* ${ T(n) = \sqrt {n} T(\sqrt {n}) + 100n}$
We can change the form of the above recurrence by dividing both sides by ${ n }$:
${ \frac{T(n)}{n} = \frac{T(\sqrt n)}{\sqrt n} + 100. }$
Let ${ S(n) = \frac{T(n)}{n} }$, so the recurrence becomes ${ S(n) = S(\sqrt{n}) + 100 }$. Let ${ m = \lg n }$; then ${ S(2^m) = S(2^{m/2}) +100 }$. Denote ${ L(m) = S(2^m) }$, so ${ L(m) = L(m/2) +100 }$. In this case ${ a=1,b=2,f(m) = 100 }$, and ${ f(m) = \Theta(1) = \Theta (m^{\log_2 1}) }$. According to Case #2 of the Master Theorem, ${ L(m) = \Theta(\lg m)}$. Hence ${ S(n) = S(2^m) = L(m)=\Theta(\lg m)= \Theta(\lg \lg n) }$. Therefore ${ \frac{T(n)}{n} = S(n) = \Theta(\lg \lg n) \Rightarrow T(n) = \Theta(n \lg \lg n) }$.
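Again as a numerical sanity check (ours, with ${ T(n)=1 }$ for ${ n \leq 3 }$):

```python
import math
from functools import lru_cache

# Numerical sanity check (not a proof) of T(n) = sqrt(n) T(sqrt(n)) + 100 n,
# compared against the claimed Theta(n lg lg n) growth.
@lru_cache(maxsize=None)
def T(n):
    if n <= 3:
        return 1.0
    r = math.sqrt(n)
    return r * T(r) + 100 * n

for n in [10**2, 10**4, 10**8, 10**16]:
    print(n, round(T(n) / (n * math.log2(math.log2(n))), 1))
# The ratio levels off near a constant, consistent with T(n) = Theta(n lg lg n).
```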
Unimodal Search
An array ${ A[1 \dots n] }$ is unimodal if it consists of an increasing sequence followed by a decreasing sequence, or more precisely, if there is an index ${ m\in \{1,2,\cdots,n\} }$ such that
- ${ A[i] < A[i+1] }$ for all ${ 1\leq i <m }$, and
- ${ A[i] > A[i+1] }$ for all ${ m \leq i < n }$.
In particular, ${ A[m] }$ is the maximum element, and it is the unique "locally maximum" element surrounded by smaller elements (${ A[m-1] }$ and ${ A[m+1] }$).
(a) Give an algorithm to compute the maximum element of a unimodal input array ${ A[1 \dots n] }$ in ${ O(\lg n) }$ time. Prove the correctness of your algorithm, and prove the bound on its running time.
We design an "Iterative" and a "Recursive" version of the unimodal-search algorithm as follows.
def Iterative_Unimodal_Search(A, n):
    # A[1...n] is a unimodal array (index 0 is unused); returns its maximum element.
    l = 1
    r = n
    m = (l + r) // 2
    # Loop until A[m] is not smaller than any neighbour that exists inside A[1..n].
    while (m > 1 and A[m] < A[m - 1]) or (m < n and A[m] < A[m + 1]):
        if m > 1 and A[m] < A[m - 1]:
            r = m - 1      # already in the decreasing part, so the peak is to the left of m
        else:
            l = m + 1      # still in the increasing part, so the peak is to the right of m
        m = (l + r) // 2
    return A[m]
def Recursive_Unimodal_Search(A, l, r):
    # A[1...n] is a unimodal array (index 0 is unused); the initial call uses l = 1, r = n.
    m = (l + r) // 2
    if m > l and A[m] < A[m - 1]:
        return Recursive_Unimodal_Search(A, l, m - 1)    # the peak lies to the left of m
    elif m < r and A[m] < A[m + 1]:
        return Recursive_Unimodal_Search(A, m + 1, r)    # the peak lies to the right of m
    else:
        return A[m]    # A[m] is not smaller than any neighbour inside A[l..r]
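As a quick correctness check of both routines (this test harness is ours and is not part of the required solution), we can compare them against the true maximum on random strictly unimodal arrays; index 0 is padded so the arrays are effectively 1-indexed:

```python
import random

# Build a random strictly unimodal array of n distinct values, 1-indexed
# (position 0 holds a None pad that the search never touches).
def make_unimodal(n):
    values = random.sample(range(10 * n), n)
    top = max(values)
    rest = [v for v in values if v != top]
    random.shuffle(rest)
    cut = random.randint(0, len(rest))
    left = sorted(rest[:cut])                   # strictly increasing prefix
    right = sorted(rest[cut:], reverse=True)    # strictly decreasing suffix
    return [None] + left + [top] + right

for _ in range(1000):
    A = make_unimodal(random.randint(1, 50))
    n = len(A) - 1
    expected = max(A[1:])
    assert Iterative_Unimodal_Search(A, n) == expected
    assert Recursive_Unimodal_Search(A, 1, n) == expected
print("all tests passed")
```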
Proof of Correctness: Let ${ a }$ be the index of the maximum element. For the loop we propose the invariant: the maximum element of ${ A[1 \dots n] }$ lies in ${ A[l \dots r] }$.
Initially, it’s easy to check the invariant is true.
In each iteration, the bounds ${ l,r }$ satisfy the invariant at entry. If ${ A[m] < A[m-1] }$, we set the new bounds ${ l' = l, r' =m-1 }$; because ${ A }$ is unimodal (the decreasing part has already started by index ${ m }$), we get ${ a < m }$, so the maximum element still lies in ${ A[l \dots m-1] }$. The case ${ A[m] < A[m+1] }$ is symmetric: then ${ a > m }$, and the maximum lies in ${ A[m+1 \dots r] }$. So the invariant is maintained.
When the loop terminates, ${ A[m] }$ is not smaller than either of its neighbours inside the array (the boundary cases ${ m=1 }$ and ${ m=n }$ are handled by the guards). For a unimodal array this forces ${ m = a }$, so ${ A[m] }$ is the maximum of ${ A }$. The loop does terminate, since the range ${ [l,r] }$ shrinks (roughly halves) in every iteration.
Running time: each iteration halves the search range and does ${ \Theta(1) }$ work, so ${ T(n)=T(n/2)+\Theta(1) }$, which gives ${ T(n) = \Theta(\lg n) }$.
Convex polygon
A polygon is convex if all of its internal angles are less than ${ 180^{\circ} }$ (and none of the edges cross each other). The following figure shows an example. We represent a convex polygon as an array ${ V[1\dots n] }$ where each element of the array represents a vertex of the polygon in the form of a coordinate pair ${ (x,y) }$. We are told that ${ V[1] }$ is the vertex with the minimum ${ x }$ coordinate and that the vertices ${ V[1\dots n] }$ are ordered counterclockwise, as in the figure. You may also assume that the ${ x }$ coordinates of the vertices are all distinct, as are the ${ y }$ coordinates of the vertices.
An example of a convex polygon represented by the array ${ V[1 \dots 9] }$. ${ V[1] }$ is the vertex with the minimum ${ x }$-coordinate, and ${ V[1 \dots 9] }$ are ordered counterclockwise.
(a) Give an algorithm to find the vertex with the maximum ${ x }$ coordinate in ${ O(\lg n) }$ time.
Going counterclockwise from ${ V[1] }$ (the vertex with minimum ${ x }$), the ${ x }$ coordinates first increase along the lower chain up to the rightmost vertex and then decrease along the upper chain back to ${ V[1] }$, so the sequence of ${ x }$ coordinates of ${ V[1 \dots n] }$ is unimodal. We can therefore run the unimodal search from the previous problem on the ${ x }$ coordinates, which takes ${ O(\lg n) }$ time (see the code sketch after part (b)).
(b) Give an algorithm to find the vertex with the maximum ${ y }$ coordinate in ${ O(\lg n) }$ time.
First find the vertex with the maximum ${ x }$ coordinate as in part (a); suppose its index is ${ a }$. Going counterclockwise from ${ V[a] }$, the ${ y }$ coordinates increase up to the topmost vertex and then decrease until we return to ${ V[1] }$, so the ${ y }$ coordinates of ${ V[a], V[a+1],\cdots,V[n] , V[1] }$ form a unimodal sequence. Running unimodal search on this (logical) array gives the vertex with the maximum ${ y }$ coordinate in ${ O(\lg n) }$ time, for ${ O(\lg n) }$ in total.
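Below is a sketch of both searches (ours, under the assumptions that ${ V }$ is 1-indexed with a pad at position 0 and that each ${ V[i] }$ is an ${ (x,y) }$ pair; the function names are made up). Each search halves its range at every step, so each runs in ${ O(\lg n) }$ time:

```python
def max_x_vertex(V, n):
    # Part (a): the x-coordinates of V[1..n] form a unimodal sequence, so we
    # binary-search for its peak exactly as in the unimodal-search problem.
    l, r = 1, n
    while l < r:
        m = (l + r) // 2
        if V[m][0] < V[m + 1][0]:
            l = m + 1            # still on the increasing (lower) chain
        else:
            r = m                # already on the decreasing (upper) chain
    return l                     # index of the vertex with maximum x

def max_y_vertex(V, n):
    # Part (b): counterclockwise from the max-x vertex V[a] through V[n] and then
    # V[1], the y-coordinates are unimodal; binary-search that wrapped range.
    a = max_x_vertex(V, n)

    def y(k):                    # y-coordinate at logical position k; n + 1 wraps to V[1]
        return V[1][1] if k == n + 1 else V[k][1]

    l, r = a, n + 1
    while l < r:
        m = (l + r) // 2
        if y(m) < y(m + 1):
            l = m + 1
        else:
            r = m
    return 1 if l == n + 1 else l  # index in V of the vertex with maximum y
```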