Applied Algorithms
Unit III : Approximation Algorithms : Syllabus
• Introduction
• Absolute Approximation
• Epsilon Approximation
• Polynomial Time Approximation Schemes
• Probabilistically Good Algorithms
Unit III : Approximation Algorithms
Introduction :
• There is strong evidence that no NP-Hard problem can be
solved in polynomial time.
• Yet many NP-Hard problems have great practical
importance and it is desirable to solve large instances of
these problems in a reasonable amount of time.
• The best known algorithms for NP-Hard problems have a
worst case complexity that is exponential.
• The examples of NP-Hard problems are :
 0/1 Knapsack Problem
 Sum of subsets Problem
 Maximum Clique Problem
 Minimum Node Cover Problem
 Maximum independent set Problem
• Backtracking and branch-and-bound strategies (which use a
heuristic approach) make it possible to quickly solve a large
instance of a problem, provided the heuristic works on that
instance.
• However, a heuristic does not work equally well on all problem
instances. NP-Hard problems, even coupled with heuristics, still
show exponential behavior on some sets of inputs (instances).
• The discovery of a sub-exponential (approximate) algorithm
for an NP-Hard problem increases the maximum problem size
that can be solved.
• However, for large problem instances, we need an algorithm
of low polynomial time complexity (say O(n) or O(n²)).
• To produce an algorithm of low polynomial time
complexity to solve an NP-Hard optimization problem, it
is necessary to relax the meaning of “solve”.
These relaxations are :
We remove the requirement that the algorithm that
solves the optimization problem P must always
generate an optimal solution; instead, the algorithm
for problem P must always generate a feasible
solution.
A feasible solution with value close to the optimal
solution is called Approximate Solution and the
algorithm that generates approximate solution for
the problem P, is called as an Approximation
Algorithm.
In the case of NP-Hard problems, approximate
solutions have added importance, as exact solutions
(i.e. optimal solutions) may not be obtainable in a
feasible amount of computing time. An approximation
algorithm can produce an approximate solution for an
NP-Hard problem in a reasonable amount of computing
time.
• Probabilistically Good Algorithms :
In the second relaxation, we look for an algorithm for the
problem P that almost always generates an optimal solution.
Algorithms with this property are called
“Probabilistically Good Algorithms”.
• Terminology :
P : Represents an NP-Hard problem such as 0/1
Knapsack or Traveling Salesperson Problem.
I : Represents an instance of problem P
F*(I) : Represents an optimal solution to I
F^(I) : Represents feasible solution produced for I by
an approximation algorithm
A : Represents an algorithm that generates a
feasible solution to every instance I of a
problem P
F*(I) ≥ F^(I) if P is a maximization problem
F^(I) ≥ F*(I) if P is a minimization problem
• Approximation Schemes :
 Absolute Approximation : A is absolute approximation
algorithm for problem P iff for every instance I of P,
|F*(I) - F^(I) | ≤ k for some constant k.
 f(n)-Approximation : A is an f(n)-approximation
algorithm of problem P iff for every instance I of size n,
|F*(I) - F^(I) | / F*(I) ≤ f(n) for F*(I) > 0.
ε-Approximation : A is an epsilon (ε) approximation
algorithm of problem P iff A is f(n)-approximation
algorithm for which f(n) ≤ ε, where ε is some constant.
Absolute Approximations :
There are very few NP-Hard optimization problems for
which polynomial time absolute approximation algorithms
are known. Planar graph coloring is one such example.
Planar Graph Coloring :
The problem is to determine the minimum number of colors needed
to color a planar graph G = (V, E). One can easily determine
whether a graph is zero, one or two colorable: it is zero
colorable iff V = ∅, one colorable iff E = ∅, and two colorable
iff it is bipartite. Determining whether a planar graph is
3-colorable is NP-Hard. However, all planar graphs are
4-colorable, so an absolute approximation algorithm with
|F*(I) – F^(I)| ≤ 1 is easy to obtain.
Algorithm Acolor (V, E)
// Determines an approximation to the minimum number of
// colors needed to color a planar graph G = (V, E).
{
  if V = ∅ then return 0;
  else if E = ∅ then return 1;
  else if G is bipartite then return 2;
  else return 4;
}
Worst case time complexity = O(|V|+|E|), the time needed to check
whether a graph is bipartite.
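A runnable sketch of Acolor, assuming the graph is given as a vertex list plus an adjacency dict; the bipartiteness test is the standard BFS 2-coloring:

```python
from collections import deque

def acolor(vertices, adj):
    """Absolute approximation (error <= 1) of the chromatic number of a
    planar graph. vertices: list; adj: dict vertex -> neighbour list."""
    if not vertices:
        return 0                          # V is empty
    if all(not adj.get(v) for v in vertices):
        return 1                          # E is empty
    color = {}                            # BFS 2-coloring: bipartite test
    for s in vertices:
        if s in color:
            continue
        color[s] = 0
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for w in adj.get(u, []):
                if w not in color:
                    color[w] = 1 - color[u]
                    queue.append(w)
                elif color[w] == color[u]:
                    return 4              # odd cycle: 3 or 4 colors needed
    return 2
```

For a triangle the function answers 4 while the true value is 3, which is exactly the |F*(I) – F^(I)| ≤ 1 guarantee.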
Maximum Programs Stored Problem :
Let there be n programs and two storage devices, two disks
with storage capacity of L each. Let li be the amount of
storage needed to store the ith program. We need to determine
the maximum number of these n programs that can be stored
on the two disks without splitting a program over the disks.
This problem is NP-hard, as the following reduction shows.
Theorem : Partition ∝ Maximum Programs Stored Problem.
Proof : Let {a1, a2, … , an} define an instance of the partition
problem, with Σ ai = 2T, 1 ≤ i ≤ n. This is equivalent to the
instance of the Maximum Programs Stored Problem with L = T and
li = ai for 1 ≤ i ≤ n. Clearly {a1, a2, … , an} has a partition iff
all n programs can be stored on the two disks.
Approximate Solution :
• By considering programs in order of non-decreasing
storage requirement li, we can obtain a polynomial time
absolute approximation algorithm.
• The function Pstore below assumes l1 ≤ l2 ≤ … ≤ ln and assigns
programs to disk 1 as long as enough space remains on disk 1;
it then begins assigning programs to disk 2.
• The time complexity of this approximation algorithm is O(n), in
addition to the O(n log n) time for sorting the programs into
non-decreasing order of li.
• Thus we can get the approximate solution in polynomial time,
i.e. O(n log n), as against the exponential time needed for an
optimal solution to the problem.
Algorithm Pstore (l, n, L)
// Assumes li ≤ li+1, 1 ≤ i < n.
{
  i = 1;
  for j = 1 to 2 do
  {
    sum = 0; // amount of disk j already assigned
    while (sum + l[i]) ≤ L do
    {
      write (“store program”, i, “on disk”, j);
      sum = sum + l[i]; i = i + 1;
      if i > n then return;
    }
  }
}
Theorem : Let I be any instance of the Maximum Programs
Stored Problem. Let F*(I) be the maximum number of programs
that can be stored on two disks each of length L. Let F^(I) be the
number of programs stored using the function Pstore. Then
| F*(I) - F^(I) | ≤ 1.
Proof :
• Assume that k programs are stored when Pstore is used,
then F^(I) = k.
• Consider the program storage problem when only one disk
of capacity 2L is available. In this case considering
programs in order of non-decreasing storage requirement
maximizes the number of programs stored. Assume that ρ
programs get stored when this strategy is used on a single
disk of length 2L
• Therefore ρ ≥ F*(I) and Σ li ≤ 2L, 1 ≤ i ≤ ρ .….. (1)
• Let j be the largest index such that Σ li ≤ L, 1 ≤ i ≤ j,
therefore j ≤ ρ and Pstore assigns the first j programs to
disk 1.
• Also note that
Σ li, j+1 ≤ i ≤ ρ – 1, ≤ Σ li, j+2 ≤ i ≤ ρ, ≤ L
Thus Pstore assigns at least programs j+1, j+2, … , ρ – 1 to disk 2,
and therefore F^(I) ≥ ρ – 1, i.e. –F^(I) ≤ –ρ + 1 ...….. (2)
• From equations (1) and (2) above we conclude :
|F*(I) – F^(I)| ≤ 1 Absolute Approximation
Example : Consider the following instance for Pstore :
{a1, a2, a3, a4, a5, a6} = {1, 2, 5, 8, 10, 14} and L = 20.
ρ = 6 ≥ F*(I) when one disk of size 2L is considered.
F*(I) = 6, via the partition {1, 5, 14} & {2, 8, 10}.
F^(I) = 5 using Pstore (two disks of size 20 each):
{1, 2, 5, 8} & {10}
|F*(I) – F^(I)| ≤ 1
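A short Python transcription of Pstore reproduces this example (programs and disks are reported 1-based):

```python
def pstore(lengths, L):
    """Greedy storage of programs on two disks of capacity L.
    Assumes lengths is sorted in non-decreasing order.
    Returns (program, disk) pairs, both 1-based."""
    assignment = []
    i, n = 0, len(lengths)
    for disk in (1, 2):
        used = 0                          # space on this disk already assigned
        while i < n and used + lengths[i] <= L:
            assignment.append((i + 1, disk))
            used += lengths[i]
            i += 1
    return assignment

# The instance above: disk 1 gets programs 1-4, disk 2 gets program 5.
print(pstore([1, 2, 5, 8, 10, 14], 20))
```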
ε-Approximations :
Scheduling Independent Tasks :
• Obtaining minimum finish time schedules on m identical
processors, m ≥2, is NP-hard. There exists a very simple
scheduling rule that generates schedules with a finish time
very close to that of an optimal schedule.
• An instance of the scheduling problem is defined by a set of
n tasks with times ti, 1 ≤ i ≤n, and m the number of identical
processors.
• The scheduling rule is known as the Largest Processing
Time (LPT) rule. An LPT schedule is a schedule that
results from this rule.
• LPT Schedule : An LPT schedule is one produced by an algorithm
that, whenever a processor becomes free, assigns to that
processor a task whose time is the largest among the tasks not
yet assigned. Ties are broken in an arbitrary manner.
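The LPT rule can be sketched with a min-heap of processor loads; this reproduces the finish times of the examples that follow (the sketch returns only the makespan, not the full schedule):

```python
import heapq

def lpt_finish_time(times, m):
    """Makespan of the LPT schedule: whenever a processor becomes free,
    give it the longest task not yet assigned."""
    heap = [(0, p) for p in range(m)]     # (current load, processor id)
    heapq.heapify(heap)
    for t in sorted(times, reverse=True):
        load, p = heapq.heappop(heap)     # processor that frees up first
        heapq.heappush(heap, (load + t, p))
    return max(load for load, _ in heap)
```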
Example -1: Let m = 3, n = 6 and
(t1, t2, t3, t4, t5, t6) = (8, 7, 6, 5, 4, 3)
Following is the LPT schedule; its finish time is 11:
P1 : t1 (0–8), t6 (8–11)
P2 : t2 (0–7), t5 (7–11)
P3 : t3 (0–6), t4 (6–11)
The schedule is optimal, since (Σ ti) / 3 = 33/3 = 11.
Example -2: Let m = 3, n = 7 and
(t1, t2, t3, t4, t5 , t6 , t7) = (5, 5, 4, 4, 3, 3, 3)
Following is the LPT schedule; its finish time is 11:
P1 : t1 (0–5), t6 (5–8), t7 (8–11)
P2 : t2 (0–5), t5 (5–8)
P3 : t3 (0–4), t4 (4–8)
The schedule is not optimal, since (Σ ti) / 3 = 27/3 = 9.
Optimal Schedule for Example -2: Let m = 3, n = 7 and
(t1, t2, t3, t4, t5 , t6 , t7) = (5, 5, 4, 4, 3, 3, 3)
Following is the optimal schedule; its finish time is 9:
P1 : t1 (0–5), t3 (5–9)
P2 : t2 (0–5), t4 (5–9)
P3 : t5 (0–3), t6 (3–6), t7 (6–9)
The schedule is optimal, since (Σ ti) / 3 = 27/3 = 9.
Analysis :
The LPT rule can be implemented to generate an LPT schedule for n
tasks on m identical processors in O(n log n) time. As the examples
above show, the LPT rule may generate optimal schedules for some
problem instances, but it does not do so for all instances. How bad
can LPT schedules be relative to optimal schedules? This question is
answered by the following theorem.
Graham’s Theorem :
Let F*(I) be the finish time of an optimal m-processor schedule
for instance I of task scheduling problem. Let F^(I) be the
finish time of an LPT schedule for the same instance, then,
|F*(I) – F^(I)| / |F*(I)| ≤ (1/3) – (1/(3m))
Graham’s Theorem : Proof by contradiction
Part-1 :
• The theorem is clearly true for m = 1. For m = 1, F*(I) = F^(I).
Therefore LHS = RHS = 0.
• Let us assume that for some m > 1, there exists a set of tasks for
which theorem is not true.
• Then let (t1, t2, …, tn) define an instance I with the fewest
number of tasks for which the theorem is violated.
• We assume that t1 ≥ t2 ≥ … ≥ tn and that an LPT schedule is
obtained by assigning tasks in the order 1,2,3, …, n. Let the
schedule be S and F^(I) be its finish time.
• Let k be the index of a task with the latest completion time.
Suppose k < n. Then the finish time f^ of the LPT schedule for
tasks 1, 2, …, k is also F^(I), and the finish time f* of an
optimal schedule for these k tasks is ≤ F*(I).
• Hence f^ = F^(I) and f* ≤ F*(I) i.e. 1/f* ≥ 1/F*(I)
Therefore f^ / f* ≥ F^(I) / F*(I)
|f* - f^| / f* ≥ | F*(I) – F^(I)| / F*(I)
> (1/3) – (1/(3m))
The latter inequality follows from the assumption on I.
Then |f* - f^| / f* > (1/3) – (1/(3m)) contradicts the
assumption that I is the smallest m-processor instance for
which the theorem does not hold.
Hence k = n.
Part-2 : Now we show that in no optimal schedule for I, can
more than two tasks be assigned to any processor.
• Therefore n ≤ 2m.
• Since task n has the latest completion time in the LPT schedule
for I (as shown in Part-1), it follows that this task is started
at time F^(I) - tn in this schedule. Further no processor can
have any idle time until this time. Hence we obtain,
F^(I) – tn ≤ (1/m) Σ ti, 1 ≤ i ≤ n – 1, and so
F^(I) ≤ (1/m) Σ ti + ((m – 1)/m) tn, 1 ≤ i ≤ n
• Now since F*(I) ≥ (1/m) Σ ti … for 1 ≤ i ≤ n, we conclude.
F^(I) – F*(I) ≤ ((m-1)/m) tn OR
| F*(I) – F^(I)| / F*(I) ≤ ((m-1)/m) tn / F*(I)
• But from assumption | F*(I) – F^(I)| / F*(I) > (1/3) – (1/(3m))
we get (1/3) – (1/(3m)) < ((m-1)/m) tn / F*(I) OR
(m – 1) < (3(m-1)) tn / F*(I) OR
F*(I) < 3 tn
• Hence in an optimal schedule for I, no more than two tasks can
be assigned to any processor.
• But when an optimal schedule contains at most two tasks on any
processor, it can be shown that the LPT schedule is also optimal.
Then |F*(I) – F^(I)| / F*(I) = 0, which contradicts the assumption
on I that |F*(I) – F^(I)| / |F*(I)| > (1/3) – (1/(3m)). So there
can be no I that violates the theorem.
Polynomial Time Approximations :
The Vertex Cover Problem :
• A vertex cover of an undirected graph G = (V, E) is a subset
V’ of set V such that if (u, v) is an edge of G, then either u Є
V’ or v Є V’ (or both). The size of a vertex cover is the
number of vertices in it.
• The vertex cover problem is to find a vertex cover of
minimum size in a given undirected graph. We call such a
vertex cover an optimal vertex cover. This problem is the
optimization version of an NP-complete decision problem.
• Even though we do not know how to find an optimal vertex
cover in a graph G in polynomial time, we can efficiently find
a vertex cover that is near optimal. The following algorithm
(Approx-vertex-cover) takes as input an undirected graph G
and returns a vertex cover whose size is guaranteed to be
no more than twice the size of an optimal vertex cover.
Algorithm Approx-Vertex-Cover (G)
{
  C = ∅;
  E’ = G.E; // O(e)
  while E’ ≠ ∅ do
  {
    let (u, v) be an arbitrary edge of E’;
    C = C U {u, v};
    remove from E’ every edge incident on either u or v;
  }
  return C;
}
The running time of this algorithm is O(n + e), i.e. O(|V|+|E|),
when the graph G is represented by adjacency lists.
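A direct Python transcription of the pseudocode above; the slide's figures are not recoverable, so any edge list used with it is a hypothetical example:

```python
def approx_vertex_cover(edges):
    """2-approximation: repeatedly take an arbitrary remaining edge,
    add both endpoints to the cover, and drop every edge they touch."""
    cover = set()
    remaining = list(edges)
    while remaining:
        u, v = remaining[0]               # an arbitrary edge of E'
        cover.update((u, v))
        remaining = [e for e in remaining
                     if e[0] not in cover and e[1] not in cover]
    return cover
```

Because vertices are always added in pairs taken from vertex-disjoint edges, the cover size is always even and at most twice optimal.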
Example : Consider a graph G on the vertices {a, b, c, d, e, f, g}.
Figs (a)–(c) trace the successive edge selections made by
Approx-Vertex-Cover on G.
Fig (d) shows the final result:
F^(I) = C = {b, c, d, e, f, g} is the vertex cover produced by
algorithm Approx-Vertex-Cover.
F*(I) = optimal cover = {b, d, e}
Theorem : Approx-Vertex-Cover is a polynomial time 2-
approximation algorithm.
Proof :
• We have already shown that Approx-Vertex-Cover runs in
polynomial time.
• Let A be the set of edges chosen by the algorithm. Any vertex
cover, and in particular an optimal one, must include at least one
endpoint of each edge in A; since these edges share no endpoints,
those endpoints are distinct. Therefore |F*(I)| ≥ |A|. …. (1)
• No two edges in A share an endpoint, since once an edge is
selected, all edges incident on its endpoints are deleted from E’.
Therefore |F^(I)| = 2|A|. …. (2)
• From inequalities (1) and (2) above, we get
|F^(I)| / |F*(I)| ≤ 2 therefore
|F^(I)| ≤ 2|F*(I)|
• Hence the theorem.
Scheduling Independent Tasks :
• We have seen that the LPT rule leads to a
(1/3 – 1/(3m))-approximation algorithm for the problem of
obtaining an m-processor schedule for n independent tasks.
A polynomial time approximation scheme is also known for this
problem. This scheme relies upon the following scheduling rule.
• Let k be some specified and fixed integer. Obtain an
optimal schedule for the k longest tasks. Schedule the
remaining n – k tasks using the LPT rule.
Scheduling Independent Tasks : Example
• Let m = 2, n = 6, (t1, t2, t3, t4, t5, t6) = (8, 6, 5, 4, 4, 1) and
k = 4.
• The four longest tasks have task times 8, 6, 5 and 4
respectively. The optimal schedule for these tasks has finish
time 12
m1 processes t1 and t4 (finish at 12) and
m2 processes t2 and t3 (finish at 11)
• When the remaining two tasks are scheduled using the LPT
rule, the schedule has finish time 15
m1 processes t1, t4 and t6 (finish at 13)
m2 processes t2, t3 and t5 (finish at 15)
• The optimal finish time is 14:
m1 processes t1 and t2 (finish at 14) and
m2 processes t3, t4, t5 and t6 (finish at 14)
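The rule can be sketched directly: brute-force an optimal assignment of the k longest tasks (all m^k choices), then place the rest by LPT. This reproduces the finish time 15 of the example above:

```python
import heapq
from itertools import product

def modified_lpt_finish(times, m, k):
    """Optimal schedule for the k longest tasks (brute force over all
    m^k machine assignments), then LPT for the remaining n - k tasks."""
    times = sorted(times, reverse=True)
    head, tail = times[:k], times[k:]
    best = None
    for assign in product(range(m), repeat=len(head)):
        loads = [0] * m
        for t, p in zip(head, assign):
            loads[p] += t
        if best is None or max(loads) < max(best):
            best = loads                  # machine loads of the best prefix
    heap = [(load, p) for p, load in enumerate(best)]
    heapq.heapify(heap)
    for t in tail:                        # already in non-increasing order
        load, p = heapq.heappop(heap)
        heapq.heappush(heap, (load + t, p))
    return max(load for load, _ in heap)
```

With k = n the brute force alone yields the true optimum (here 14), at exponential cost.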
Theorem [Graham] :
Let I be an m-processor instance of the scheduling problem.
Let F*(I) be the finish time of an optimal schedule for instance
I of task scheduling problem. Let F^(I) be the length of the
schedule generated by the above scheduling rule, then
|F*(I) – F^(I)| / |F*(I)| ≤ (1 – 1/m) / (1 + ⌊k/m⌋)
Proof :
• Let r be the finish time of an optimal schedule for the k longest
tasks. If F^(I) = r, then F*(I) = F^(I) and the theorem is proved.
So assume F^(I) > r. Also if n ≤ m then F*(I) = F^(I) & theorem
is proved.
• Let ti, 1 ≤ i ≤ n, be the task times of I. Without loss of
generality, we can assume ti ≥ ti+1 for 1 ≤ i < n, and n > k.
Also assume that n > m.
• Let j, j > k, be such that task j finishes at time F^(I). Then no
processor is idle in the interval [0, F^(I) – tj]. Since tk+1 ≥ tj,
it follows that no processor is idle in the interval
[0, F^(I) – tk+1]. Hence,
Σ ti, 1 ≤ i ≤ n, ≥ m (F^(I) – tk+1) + tk+1, and so
F*(I) ≥ (1/m) Σ ti, 1 ≤ i ≤ n, ≥ F^(I) – ((m–1)/m) tk+1, OR
|F*(I) – F^(I)| ≤ ((m–1)/m) tk+1 ………………… (1)
Since ti ≥ ti+1 for 1 ≤ i ≤ k, each of the k + 1 longest tasks needs
at least tk+1 time, and at least one processor must execute at least
1 + ⌊k/m⌋ of these k + 1 tasks; it follows that
F*(I) ≥ (1 + ⌊k/m⌋) tk+1 ………………....(2)
Combining (1) and (2), we get
|F*(I) – F^(I)| / F*(I) ≤ (((m–1)/m) tk+1) / ((1 + ⌊k/m⌋) tk+1)
= (1 – (1/m)) / (1 + ⌊k/m⌋)
Analysis :
• Using the above result, we can construct a polynomial time
ε–approximation scheme for the scheduling problem. This scheme
takes ε as an input variable.
• For any input ε, it computes a value of k such that
(1 – (1/m)) / (1 + ⌊k/m⌋) ≤ ε. This defines the k to be used in
the scheduling rule described above.
• Solving for k, we find that any integer k with
⌊k/m⌋ ≥ ((m – 1)/(mε)) – 1 (roughly k ≥ ((m – 1)/ε) – m)
guarantees ε-approximate schedules.
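The choice of k can be sketched in Python. Rounding k up to a multiple of m makes the floor in the bound come out exactly; ε is passed as a Fraction to avoid float rounding (both choices are assumptions of this sketch, not part of the slides):

```python
import math
from fractions import Fraction

def k_for_epsilon(m, eps):
    """Smallest multiple of m, used as k, with
    (1 - 1/m) / (1 + floor(k/m)) <= eps. eps should be a Fraction."""
    # need floor(k/m) >= (m-1)/(m*eps) - 1; take k = m * ceil(of that)
    c = math.ceil(Fraction(m - 1, m) / Fraction(eps) - 1)
    return max(0, m * c)
```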
Analysis contd… :
• The time required to obtain such schedules mainly depends on the
time needed to obtain an optimal schedule for k tasks on m
machines. Using a branch-and-bound algorithm, this time is O(m^k).
The time required to arrange the tasks in non-increasing order of
processing time and to obtain the LPT schedule for the remaining
(n – k) tasks is O(n log n).
• The total time needed is O(n log n + m^k)
= O(n log n + m^(((m–1)/ε) – m)).
• Since this time is not polynomial in 1/ε (it is exponential in
1/ε), this approximation scheme is not a fully polynomial time
approximation scheme. It is a polynomial time approximation scheme
(for any fixed m and ε), as the computing time is polynomial in the
number of tasks n.
Heuristic algorithm for 0/1 knapsack problem :
Let p[ ] and w[ ] be the arrays of profits and weights
respectively, and assume pi/wi ≥ pi+1/wi+1 for 1 ≤ i < n. Let m be
the knapsack capacity and k ≤ n a positive integer. The algorithm
considers all combinations of objects taken 0 at a time, then 1 at
a time, and so on up to k at a time. For each combination that fits
in the knapsack, the remaining capacity, if any, is filled greedily
with the remaining objects in the original order, and the profit of
the completed solution is computed. At the end, the algorithm
returns the maximum profit found.
Algorithm Epsilon-Approx (p, w, n, k)
// The size of a combination is the number of objects in it; the
// weight of a combination is the sum of the weights of the objects
// in it. k is the non-negative integer that defines the order of
// the algorithm.
{
  Pmax = 0;
  for all combinations I of size ≤ k and weight ≤ m do
  {
    PI = Σ pi for all i Є I;
    Pmax = max(Pmax, PI + LBound(I, p, w, m, n));
  }
  return Pmax;
}
Algorithm LBound (I, p, w, m, n)
{
  s = 0;
  t = m – Σ wi for all i Є I;
  for i = 1 to n do
  {
    if (i ∉ I) and (w[i] ≤ t) then
    {
      s = s + p[i];
      t = t – w[i];
    }
  }
  return s;
}
Analysis :
Time required by algorithm Epsilon-Approx
= Σ C(n, i), 0 ≤ i ≤ k,
≤ Σ n^i, 0 ≤ i ≤ k, = (n^(k+1) – 1) / (n – 1) = O(n^k)
Time required by algorithm LBound for each combination = O(n)
Total time complexity = O(n^(k+1))
Theorem : Let J be an instance of the 0/1 knapsack problem.
Let n, m, p, w be as defined for function EpsilonApprox. Let
p* be the value of an optimal solution for J. Let Pmax be as
defined by function EpsilonApprox on termination. Then,
|p* - Pmax | / p* < 1/(k+1)
Example :
Consider the 0/1 knapsack problem instance with n = 8
objects, size of knapsack = m = 110,
profits = {11, 21, 31, 33, 43, 53, 55, 65}
weights = { 1, 11, 21, 23, 33, 43, 45, 55}
The optimal solution picks objects 1, 2, 3, 5 and 6, giving an
optimal profit p* = 159 and weight w = 109. We get the following
approximations for different values of k :

k | Pmax | x | w | |p* – Pmax| / p*
0 | 139 | {1, 1, 1, 1, 1, 0, 0, 0} | 89 | 20/159 = 0.126
1 | 151 | {1, 1, 1, 1, 0, 0, 1, 0} | 101 | 8/159 = 0.05
2 | 159 | {1, 1, 1, 0, 1, 1, 0, 0} | 109 | 0/159 = 0.00
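The table can be reproduced with a direct Python transcription of Epsilon-Approx and LBound (objects are assumed pre-sorted by pi/wi, as in the instance above):

```python
from itertools import combinations

def lbound(chosen, p, w, m):
    """Greedily complete a partial solution in index (p/w) order."""
    s = 0
    t = m - sum(w[i] for i in chosen)     # remaining capacity
    for i in range(len(p)):
        if i not in chosen and w[i] <= t:
            s += p[i]
            t -= w[i]
    return s

def epsilon_approx(p, w, m, k):
    """Best profit over all combinations of at most k objects, each
    completed greedily by lbound (an O(n^(k+1)) heuristic)."""
    pmax = 0
    for size in range(k + 1):
        for combo in combinations(range(len(p)), size):
            chosen = set(combo)
            if sum(w[i] for i in chosen) > m:
                continue
            pi = sum(p[i] for i in chosen)
            pmax = max(pmax, pi + lbound(chosen, p, w, m))
    return pmax
```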
Fully Polynomial Time Approximations :
• The approximation algorithms and schemes we have studied so far
are particular to the problem considered. There is no set of well
defined techniques that we can use to obtain such algorithms. The
heuristics used depend very much on the particular problem being
solved.
• There are three techniques for fully polynomial time approximation
schemes. These are,
1. Rounding
2. Interval Partitioning
3. Separation
• We shall discuss these techniques in terms of maximization
problems of the form (problem instance I):
maximize Σ pi xi, 1 ≤ i ≤ n
subject to Σ aij xi ≤ bj, 1 ≤ i ≤ n, for each j, 1 ≤ j ≤ m
xi Є {0, 1}, 1 ≤ i ≤ n
pi, aij ≥ 0
Without loss of generality, we assume that
aij ≤ bj for 1 ≤ i ≤ n and 1 ≤ j ≤ m.
Rounding :
• In this technique, we start from the problem instance I as
stated above and transform it to another problem instance
I’ that is easier to solve.
• This transformation is carried out in such a way that the
optimal solution value of I’ is close to the optimal solution
value of I.
• In particular, given a bound ε on the allowed fractional
difference between the optimal and approximate solution values
relative to the optimal value, the transformation guarantees
|F*(I) – F*(I’)| / F*(I) ≤ ε, where F*(I) and F*(I’) represent
the optimal solution values of I and I’ respectively.
Rounding : Example
Consider a 0/1 knapsack instance I with n = 4, m =2007.1,
(p1, p2, p3, p4) = (1.1, 2.1, 1001.6, 1002.3) and
(w1, w2, w3, w4) = (1.1, 2.1, 1001.6, 1002.3)
Solution :
The feasible assignments in S(i) would have the following
distinct profit values for instance I :
S(0) = {0}
S(1) = {0, 1.1}
S(2) = {0, 1.1, 2.1, 3.2}
S(3) = {0, 1.1, 2.1, 3.2, 1001.6, 1002.7, 1003.7, 1004.8}
S(4) = {0, 1.1, 2.1, 3.2, 1001.6, 1002.7, 1003.7, 1004.8,
1002.3, 1003.4, 1004.4, 1005.5, 2003.9, 2005.0,
2006.0, 2007.1}
Solution contd…:
Optimal solution for the instance I = F*(I) = 2007.1
i.e. solution is (x1, x2, x3, x4) = (1, 1, 1, 1)
Now we construct I’ with following values:
(p1, p2, p3, p4) = (0, 0, 1000, 1000) and
(w1, w2, w3, w4) = (1.1, 2.1, 1001.6, 1002.3), then
The feasible assignments in S(i) would have the following
distinct profit values for instance I’ :
S(0) = {0}
S(1) = {0}
S(2) = {0}
S(3) = {0, 1000}
S(4) = {0, 1000, 2000}
Optimal solution for the instance I’ = F*(I’) = 2000
i.e. solution is (x1, x2, x3, x4) = (0, 0, 1, 1)
Analysis :
Total values computed for an optimal solution to I
= Σ |S(i)|, 0 ≤ i ≤ n, = 31
Total values computed for an optimal solution to I’
= Σ |S(i)|, 0 ≤ i ≤ n, = 8
Thus I’ can be solved in roughly one quarter of the time needed for
I, while |F*(I) – F*(I’)| / F*(I) = (2007.1 – 2000) / 2007.1 ≈ 0.0035.
Given the pi’s and ε, what should the values of the qi’s be such that
|F*(I) – F*(I’)| / F*(I) ≤ ε and
Σ |S(i)|, 0 ≤ i ≤ n, ≤ u(n, 1/ε), where u is a polynomial in n
and 1/ε?
Once we achieve this, we have a fully polynomial time approximation
scheme for our problem, since it is possible to go from S(i–1) to
S(i) in time proportional to O(|S(i–1)|).
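The counts 31 and 8 can be verified by enumerating the distinct profit values stage by stage (a small tolerance absorbs floating-point error in the weight sums; the (profit, weight) state set is an assumption of this sketch):

```python
def stage_value_counts(profits, weights, capacity, tol=1e-9):
    """Sizes of S(0), ..., S(n): the number of distinct achievable
    profit values using the first i objects within the capacity."""
    states = {(0.0, 0.0)}                 # (profit, weight) pairs
    counts = [1]
    for p, w in zip(profits, weights):
        states |= {(sp + p, sw + w) for sp, sw in states
                   if sw + w <= capacity + tol}
        counts.append(len({sp for sp, _ in states}))
    return counts
```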
Analysis contd …:
Let LB be an estimate for F*(I) such that F*(I) ≥ LB. ….. (1)
We can assume that LB ≥ max {pi}. ….. (2)
For I’, define qi = pi – rem(pi, LB·ε/n), ….. (3)
where rem(a, b) is the remainder of a/b, i.e. a – ⌊a/b⌋·b.
Therefore pi – qi = rem(pi, LB·ε/n). ….. (4)
Since rem(pi, LB·ε/n) < LB·ε/n, from (4) we get pi – qi < LB·ε/n,
and summing over i = 1 to n,
Σ (pi – qi) ≤ LB·ε ….. (5)
From (1) and (5), Σ (pi – qi) ≤ LB·ε ≤ F*(I)·ε ….. (6)
Since I and I’ have the same feasible solutions (the weights are
unchanged) and qi ≤ pi, we conclude
|F*(I) – F*(I’)| ≤ Σ (pi – qi) ≤ F*(I)·ε ….. (7)
Therefore |F*(I) – F*(I’)| / F*(I) ≤ ε ….. (8)
Hence if an optimal solution to I’ is used as a solution for I, the
fractional error is at most ε.
Analysis contd… Time Complexity :
To determine the time required to solve I’ exactly, it is useful
to introduce another problem I” with si, 1 ≤ i ≤ n, as its
objective function coefficients.
Define si = ⌊pi·n / (LB·ε)⌋, 1 ≤ i ≤ n. It is easy to see that
si = qi·n / (LB·ε).
Clearly, the S(i)’s corresponding to the solutions of I’ and I”
have the same number of tuples: if (r, t) is a tuple for I’, then
(r·n / (LB·ε), t) is a tuple in S(i) for I”.
Hence the time needed to solve I’ is the same as that needed to
solve I”.
Analysis contd… Worst case Time Complexity :
Since pi ≤ LB, …..(1)
si = ⌊pi·n / (LB·ε)⌋ ≤ ⌊n/ε⌋. ….. (2)
Hence |S(i)| ≤ 1 + Σ sj, 1 ≤ j ≤ i,
≤ 1 + i·⌊n/ε⌋ ….. (3)
Therefore Σ |S(i)|, 0 ≤ i ≤ n–1, ≤ n + Σ i·⌊n/ε⌋, 0 ≤ i ≤ n–1,
≤ n + (n–1)·n·⌊n/ε⌋ / 2
= O(n³/ε) ….. (4)
Interval Partitioning :
• Unlike rounding, interval partitioning does not transform the
original problem instance into one that is easier to solve.
• Instead an attempt is made to solve the problem instance I by
generating a restricted class of feasible assignments for S(0) , S(1) ,
…,S(n) .
• Let Pi be the maximum Σ pj xj, 1 ≤ j ≤ i, among all feasible
assignments generated for S(i).
• Then the profit interval [0, Pi] is divided into subintervals,
each of size Pi·ε/(n–1).
• All feasible assignments in S(i) whose Σ pj xj, 1 ≤ j ≤ i, falls
in the same subinterval are regarded as having the same profit,
and the dominance rules are used to discard all but one of them.
• The S(i)’s resulting from this elimination are used in the
generation of S(i+1).
Interval Partitioning : Analysis
• The number of subintervals for each S(i) is at most ⌈n/ε⌉ + 1.
• Therefore |S(i)| ≤ ⌈n/ε⌉ + 1, and
Σ |S(i)|, 0 ≤ i ≤ n, = O(n²/ε)
• The error introduced in each feasible assignment due to this
elimination in S(i) is less than the subinterval length. This
error propagates from S(1) to S(n), but it is only additive.
It then follows that
F*(I) – F^(I) ≤ Σ Pi·ε/(n–1), 1 ≤ i ≤ n–1, and since Pi ≤ F*(I),
|F*(I) – F^(I)| ≤ Σ F*(I)·ε/(n–1), 1 ≤ i ≤ n–1, and hence
|F*(I) – F^(I)| / F*(I) ≤ (ε/(n–1)) Σ 1, 1 ≤ i ≤ n–1, = ε
Interval Partitioning : Example
n = 5, m = 1112, (p1, p2, p3 , p4 , p5) = (1, 2, 10, 100, 1000)
(w1,w2,w3 , w4 ,w5) = (1, 2, 10, 100, 1000)
Since pi = wi, we shall retain only one value in each tuple.
Solution : Optimal solution i.e. F*(I)
S(0) = {0}
S(1) = {0, 1}
S(2) = {0, 1, 2, 3}
S(3) = {0, 1, 2, 3, 10, 11, 12, 13}
S(4) = {0, 1, 2, 3, 10, 11, 12, 13, 100, 101, 102, 103, 110,
111, 112, 113}
S(5) = {0, 1, 2, 3, 10, 11, 12, 13, 100, 101, 102, 103, 110,
111, 112, 113, 1000, 1001, 1002, 1003, 1010, 1011,
1012, 1013, 1100, 1101, 1102, 1103, 1110, 1111, 1112}
Optimal Solution = F*(I) = 1112 i.e. (0,1,1,1,1)
Interval Partitioning : Example contd…
n = 5, m = 1112, (p1, p2, p3 , p4 , p5) = (1, 2, 10, 100, 1000)
(w1,w2,w3 , w4 ,w5) = (1, 2, 10, 100, 1000)
Solution : Rounding Method
LB = max pi = 1000 & ε = 1/10, qi = pi – rem(pi, (LB. ε)/n),
I’ = (q1, q2, q3 , q4 , q5) = (0, 0, 0, 100, 1000)
(w1,w2,w3 , w4 ,w5) = (1, 2, 10, 100, 1000)
S(0) = {(0, 0)}
S(1) = {(0, 0)}
S(2) = {(0, 0)}
S(3) = {(0, 0)}
S(4) = {(0, 0), (100, 100)}
S(5) = {(0, 0), (100, 100), (1000, 1000),(1100, 1100)}
Optimal Solution = F*(I’) = 1100 i.e. (0,0,0,1,1)
|F*(I) – F*(I’)| / F*(I) = 12 / 1112 < 0.011 < 0.1
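The rounding step itself is mechanical. A sketch using exact rational arithmetic (floating point would garble the remainders; passing ε as a Fraction is an assumption of this sketch) reproduces the q-values above:

```python
from fractions import Fraction

def round_profits(profits, lb, eps):
    """Round each p_i down to a multiple of step = LB*eps/n, so the
    total profit lost is below LB*eps <= F*(I)*eps."""
    step = Fraction(lb) * Fraction(eps) / len(profits)
    return [p - Fraction(p) % step for p in profits]
```

Here step = 1000·(1/10)/5 = 20, so profits below 20 collapse to 0 while 100 and 1000 are already multiples of 20 and survive unchanged.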
Interval Partitioning : Example contd…
n = 5, m = 1112, (p1, p2, p3 , p4 , p5) = (1, 2, 10, 100, 1000)
(w1,w2,w3 , w4 ,w5) = (1, 2, 10, 100, 1000)
Solution : Interval partitioning
LB = max pi = 1000 & ε = 1/10,
We can start with a subinterval size of LB·ε/(n–1) = 1000·0.1/4 = 25.
The subintervals are [0, 25), [25, 50), [50, 75), [75, 100),
[100, 125), and so on.
Using interval partitioning,
S(0) =S(1) = S(2) = S(3) = {0}
S(4) = {0, 100}
S(5) = {0,100, 1000, 1100}
Optimal Solution = F^(I) = 1100 i.e. (0,0,0,1,1)
|F*(I) – F^(I)| / F*(I) = 12 / 1112 < 0.011 < 0.1
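The thinning step can be sketched as below (hypothetical Python; for brevity it thins only the final S(5), whereas the algorithm applies the rule while building each S(i)):

```python
def interval_partition(values, width):
    """Keep one representative (the smallest value) per subinterval
    [k*width, (k+1)*width)."""
    kept, seen = [], set()
    for v in sorted(values):
        k = int(v // width)           # index of the subinterval containing v
        if k not in seen:
            seen.add(k)
            kept.append(v)
    return kept

# S(5) from the exact computation, thinned with subintervals of width 25:
s5 = [0, 1, 2, 3, 10, 11, 12, 13, 100, 101, 102, 103, 110, 111, 112, 113,
      1000, 1001, 1002, 1003, 1010, 1011, 1012, 1013,
      1100, 1101, 1102, 1103, 1110, 1111, 1112]
print(interval_partition(s5, 25))     # [0, 100, 1000, 1100]
```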
Unit III : Approximation Algorithms
Fully Polynomial Time Approximations :
Separation :
• Assume that in solving a problem instance I, we have
obtained an S(i) = {0, 3.9, 4.1, 7.8, 8.2, 11.9, 12.1}.
Further assume that the interval size Pi·ε/(n-1) is 2.
• Then the subintervals are [0, 2), [2, 4), [4, 6) , [6, 8),
[8, 10) , [10, 12) and [12, 14). Each feasible solution value
falls in different subinterval and so no feasible
assignments are eliminated.
• However, there are three pairs of assignments whose values are
within Pi·ε/(n-1) = 2 of each other. If the dominance rules are
used on each pair, only four assignments remain, and the
error introduced is at most the interval size Pi·ε/(n-1),
i.e. 2.
Unit III : Approximation Algorithms
Fully Polynomial Time Approximations :
Separation : Algorithm
• Let a0, a1, a2, …, ar be the distinct values in S(i). Let us assume that
a0 < a1 < a2 < … < ar.
• Construct a new set J from S(i) by making a left-to-right scan and
retaining a tuple only if its value exceeds the value of the last tuple in
J by more than Pi·ε/(n-1).
Algorithm Separation(a, J, r)
{
    J = assignment corresponding to a0; XP = a0;
    for j = 1 to r do
    {
        if aj > (XP + Pi·ε/(n-1)) then
        {
            J = J U {assignment corresponding to aj};
            XP = aj;
        }
    }
}
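A direct Python transcription of the pseudocode above (hypothetical names; `gap` plays the role of Pi·ε/(n-1)):

```python
def separation(values, gap):
    """Left-to-right scan: keep a value only if it exceeds the last
    kept value by more than gap; values must be sorted ascending."""
    kept = [values[0]]                # always keep a0
    xp = values[0]
    for a in values[1:]:
        if a > xp + gap:
            kept.append(a)
            xp = a
    return kept

# The S(i) from the Separation discussion, with gap = 2:
print(separation([0, 3.9, 4.1, 7.8, 8.2, 11.9, 12.1], 2))
# [0, 3.9, 7.8, 11.9] -- four values remain
```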
Unit III : Approximation Algorithms
Probabilistically Good Algorithms :
• The approximation algorithms discussed so far have the
nice property that their worst-case performance can be
bounded by some constant (k in the case of absolute
approximation and ε in the case of an ε-approximation).
• The requirement of bounded performance tends to
categorize other algorithms with unbounded performance,
which usually work well, as being bad.
• Some algorithms with unbounded performance may in
fact almost always solve the problem exactly or generate a
solution that is exceedingly close in value to an optimal
solution.
• Such algorithms are good in a probabilistic sense. If we pick
a problem instance I at random, there is a very high
probability that the algorithm will generate a very good
approximate solution.
Unit III : Approximation Algorithms
Probabilistically Good Algorithms :
Example : To find a Hamiltonian cycle in an undirected graph.
Algorithm :
• First an arbitrary vertex (say vertex 1) is chosen as starting
vertex.
• The algorithm maintains a simple path P starting at vertex 1
and ending at vertex k. Initially P is the trivial path with k = 1,
i.e. there are no edges in P.
• At each iteration an attempt is made to increase the length
of P by arbitrarily selecting an edge (k, j) incident on the
endpoint k. The following possibilities are considered :
Unit III : Approximation Algorithms
Probabilistically Good Algorithms :
Example contd …:
Algorithm contd…. :
1. j = 1 & path P includes all the vertices of the graph : In this
case Hamiltonian Cycle has been found and the algorithm
terminates.
2. j is not on the path P : In this case the length of P is
increased by adding (k, j) to it, and j becomes the new endpoint
of P.
3. j is already on path P : In this case there is a unique edge e = (j, m) in
P such that deleting e from P and adding (k, j) to
P results in a simple path. Edge e is deleted and (k, j) is
added to P; P is now a simple path with endpoint m.
• This algorithm does not always find a Hamiltonian Cycle in a graph
that contains such a cycle.
• Time Complexity : O(n2)
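The three cases above can be sketched as a rotation-based heuristic (hypothetical Python, assuming the graph is given as an adjacency-list dict over all vertices; like the algorithm described, it may fail even when a cycle exists, and it caps the number of iterations rather than reproducing the O(n2) bookkeeping of the original analysis):

```python
import random

def hamiltonian_heuristic(adj, start=1, max_iter=10000):
    """Grow a simple path from `start`; extend, close, or rotate each step.
    Returns a Hamiltonian cycle as a vertex list, or None on failure."""
    path = [start]
    on_path = {start}
    for _ in range(max_iter):
        k = path[-1]
        j = random.choice(adj[k])                  # pick an edge (k, j) arbitrarily
        if j == start and len(path) == len(adj):   # case 1: cycle found
            return path + [start]
        if j not in on_path:                       # case 2: extend P with (k, j)
            path.append(j)
            on_path.add(j)
        else:                                      # case 3: rotate at j
            i = path.index(j)
            # delete edge (j, m) from P, add (k, j); new endpoint is m
            path[i + 1:] = reversed(path[i + 1:])
    return None

# Demo on the complete graph K5, where a Hamiltonian cycle certainly exists:
adj = {v: [u for u in range(1, 6) if u != v] for v in range(1, 6)}
cycle = hamiltonian_heuristic(adj)
```

On dense graphs such as K5 the heuristic almost always succeeds quickly; the slides' O(n2) bound comes from charging each rotation O(n) work.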
Unit III : Approximation Algorithms
Probabilistically Good Algorithms :
Example contd …:
Let us try the above algorithm on the following five vertex graph.
Edges Selected : 1-4 (path : 1-4), 4-5 (path : 1-4-5), 1-5 (path : 1-5-4),
4-3 (path : 1-5-4-3), 3-2 (path : 1-5-4-3-2),
2-1 (path : 1-5-4-3-2-1)
(Figure: a five-vertex graph on vertices 1, 2, 3, 4, 5.)
More Related Content

What's hot

What's hot (20)

Topological Sorting
Topological SortingTopological Sorting
Topological Sorting
 
5 csp
5 csp5 csp
5 csp
 
Vertex cover Problem
Vertex cover ProblemVertex cover Problem
Vertex cover Problem
 
Randomized Algorithm
Randomized AlgorithmRandomized Algorithm
Randomized Algorithm
 
Graph coloring using backtracking
Graph coloring using backtrackingGraph coloring using backtracking
Graph coloring using backtracking
 
Loops in flow
Loops in flowLoops in flow
Loops in flow
 
Randomized algorithms ver 1.0
Randomized algorithms ver 1.0Randomized algorithms ver 1.0
Randomized algorithms ver 1.0
 
Amortized Analysis of Algorithms
Amortized Analysis of Algorithms Amortized Analysis of Algorithms
Amortized Analysis of Algorithms
 
Greedy algorithm
Greedy algorithmGreedy algorithm
Greedy algorithm
 
Machine Learning with Decision trees
Machine Learning with Decision treesMachine Learning with Decision trees
Machine Learning with Decision trees
 
P, NP, NP-Complete, and NP-Hard
P, NP, NP-Complete, and NP-HardP, NP, NP-Complete, and NP-Hard
P, NP, NP-Complete, and NP-Hard
 
Greedy Algorithm - Knapsack Problem
Greedy Algorithm - Knapsack ProblemGreedy Algorithm - Knapsack Problem
Greedy Algorithm - Knapsack Problem
 
P vs NP
P vs NP P vs NP
P vs NP
 
Fuzzy arithmetic
Fuzzy arithmeticFuzzy arithmetic
Fuzzy arithmetic
 
Apriori algorithm
Apriori algorithmApriori algorithm
Apriori algorithm
 
NP Complete Problems
NP Complete ProblemsNP Complete Problems
NP Complete Problems
 
Unit 2 in daa
Unit 2 in daaUnit 2 in daa
Unit 2 in daa
 
And or graph problem reduction using predicate logic
And or graph problem reduction using predicate logicAnd or graph problem reduction using predicate logic
And or graph problem reduction using predicate logic
 
Asymptotic notations
Asymptotic notationsAsymptotic notations
Asymptotic notations
 
Learning sets of rules, Sequential Learning Algorithm,FOIL
Learning sets of rules, Sequential Learning Algorithm,FOILLearning sets of rules, Sequential Learning Algorithm,FOIL
Learning sets of rules, Sequential Learning Algorithm,FOIL
 

Similar to Approximation algorithms

Ch-2 final exam documet compler design elements
Ch-2 final exam documet compler design elementsCh-2 final exam documet compler design elements
Ch-2 final exam documet compler design elementsMAHERMOHAMED27
 
Analysis of algorithms
Analysis of algorithmsAnalysis of algorithms
Analysis of algorithmsGanesh Solanke
 
Data Structures- Part2 analysis tools
Data Structures- Part2 analysis toolsData Structures- Part2 analysis tools
Data Structures- Part2 analysis toolsAbdullah Al-hazmy
 
TIME EXECUTION OF DIFFERENT SORTED ALGORITHMS
TIME EXECUTION   OF  DIFFERENT SORTED ALGORITHMSTIME EXECUTION   OF  DIFFERENT SORTED ALGORITHMS
TIME EXECUTION OF DIFFERENT SORTED ALGORITHMSTanya Makkar
 
DSA Complexity.pptx What is Complexity Analysis? What is the need for Compl...
DSA Complexity.pptx   What is Complexity Analysis? What is the need for Compl...DSA Complexity.pptx   What is Complexity Analysis? What is the need for Compl...
DSA Complexity.pptx What is Complexity Analysis? What is the need for Compl...2022cspaawan12556
 
Data Structure & Algorithms - Mathematical
Data Structure & Algorithms - MathematicalData Structure & Algorithms - Mathematical
Data Structure & Algorithms - Mathematicalbabuk110
 
Design and Analysis of Algorithms Exam Help
Design and Analysis of Algorithms Exam HelpDesign and Analysis of Algorithms Exam Help
Design and Analysis of Algorithms Exam HelpProgramming Exam Help
 
2010 3-24 cryptography stamatiou
2010 3-24 cryptography stamatiou2010 3-24 cryptography stamatiou
2010 3-24 cryptography stamatiouvafopoulos
 
Ch3(1).pptxbbbbbbbbbbbbbbbbbbbhhhhhhhhhh
Ch3(1).pptxbbbbbbbbbbbbbbbbbbbhhhhhhhhhhCh3(1).pptxbbbbbbbbbbbbbbbbbbbhhhhhhhhhh
Ch3(1).pptxbbbbbbbbbbbbbbbbbbbhhhhhhhhhhdanielgetachew0922
 
Aad introduction
Aad introductionAad introduction
Aad introductionMr SMAK
 
complexity analysis.pdf
complexity analysis.pdfcomplexity analysis.pdf
complexity analysis.pdfpasinduneshan
 
computer operating system:Greedy algorithm
computer operating system:Greedy algorithmcomputer operating system:Greedy algorithm
computer operating system:Greedy algorithmRitaThakkar1
 
Partial compute function
Partial compute functionPartial compute function
Partial compute functionRajendran
 
AN APPROXIMATION ALGORITHM FOR THE GENERALIZED ASSIGNMENT PROBLEM
AN APPROXIMATION ALGORITHM FOR THE GENERALIZED ASSIGNMENT PROBLEMAN APPROXIMATION ALGORITHM FOR THE GENERALIZED ASSIGNMENT PROBLEM
AN APPROXIMATION ALGORITHM FOR THE GENERALIZED ASSIGNMENT PROBLEMMonica Gero
 
2-Algorithms and Complexit data structurey.pdf
2-Algorithms and Complexit data structurey.pdf2-Algorithms and Complexit data structurey.pdf
2-Algorithms and Complexit data structurey.pdfishan743441
 
An Exact Exponential Branch-And-Merge Algorithm For The Single Machine Total ...
An Exact Exponential Branch-And-Merge Algorithm For The Single Machine Total ...An Exact Exponential Branch-And-Merge Algorithm For The Single Machine Total ...
An Exact Exponential Branch-And-Merge Algorithm For The Single Machine Total ...Joe Andelija
 

Similar to Approximation algorithms (20)

Ch-2 final exam documet compler design elements
Ch-2 final exam documet compler design elementsCh-2 final exam documet compler design elements
Ch-2 final exam documet compler design elements
 
Analysis of algorithms
Analysis of algorithmsAnalysis of algorithms
Analysis of algorithms
 
Data Structures- Part2 analysis tools
Data Structures- Part2 analysis toolsData Structures- Part2 analysis tools
Data Structures- Part2 analysis tools
 
TIME EXECUTION OF DIFFERENT SORTED ALGORITHMS
TIME EXECUTION   OF  DIFFERENT SORTED ALGORITHMSTIME EXECUTION   OF  DIFFERENT SORTED ALGORITHMS
TIME EXECUTION OF DIFFERENT SORTED ALGORITHMS
 
DSA Complexity.pptx What is Complexity Analysis? What is the need for Compl...
DSA Complexity.pptx   What is Complexity Analysis? What is the need for Compl...DSA Complexity.pptx   What is Complexity Analysis? What is the need for Compl...
DSA Complexity.pptx What is Complexity Analysis? What is the need for Compl...
 
Data Structure & Algorithms - Mathematical
Data Structure & Algorithms - MathematicalData Structure & Algorithms - Mathematical
Data Structure & Algorithms - Mathematical
 
Design and Analysis of Algorithms Exam Help
Design and Analysis of Algorithms Exam HelpDesign and Analysis of Algorithms Exam Help
Design and Analysis of Algorithms Exam Help
 
2010 3-24 cryptography stamatiou
2010 3-24 cryptography stamatiou2010 3-24 cryptography stamatiou
2010 3-24 cryptography stamatiou
 
Ch3(1).pptxbbbbbbbbbbbbbbbbbbbhhhhhhhhhh
Ch3(1).pptxbbbbbbbbbbbbbbbbbbbhhhhhhhhhhCh3(1).pptxbbbbbbbbbbbbbbbbbbbhhhhhhhhhh
Ch3(1).pptxbbbbbbbbbbbbbbbbbbbhhhhhhhhhh
 
Greedy Algorithms
Greedy AlgorithmsGreedy Algorithms
Greedy Algorithms
 
Aad introduction
Aad introductionAad introduction
Aad introduction
 
Unit ii algorithm
Unit   ii algorithmUnit   ii algorithm
Unit ii algorithm
 
complexity analysis.pdf
complexity analysis.pdfcomplexity analysis.pdf
complexity analysis.pdf
 
Analysis.ppt
Analysis.pptAnalysis.ppt
Analysis.ppt
 
computer operating system:Greedy algorithm
computer operating system:Greedy algorithmcomputer operating system:Greedy algorithm
computer operating system:Greedy algorithm
 
Partial compute function
Partial compute functionPartial compute function
Partial compute function
 
AN APPROXIMATION ALGORITHM FOR THE GENERALIZED ASSIGNMENT PROBLEM
AN APPROXIMATION ALGORITHM FOR THE GENERALIZED ASSIGNMENT PROBLEMAN APPROXIMATION ALGORITHM FOR THE GENERALIZED ASSIGNMENT PROBLEM
AN APPROXIMATION ALGORITHM FOR THE GENERALIZED ASSIGNMENT PROBLEM
 
2-Algorithms and Complexit data structurey.pdf
2-Algorithms and Complexit data structurey.pdf2-Algorithms and Complexit data structurey.pdf
2-Algorithms and Complexit data structurey.pdf
 
An Exact Exponential Branch-And-Merge Algorithm For The Single Machine Total ...
An Exact Exponential Branch-And-Merge Algorithm For The Single Machine Total ...An Exact Exponential Branch-And-Merge Algorithm For The Single Machine Total ...
An Exact Exponential Branch-And-Merge Algorithm For The Single Machine Total ...
 
Computer Network Assignment Help
Computer Network Assignment HelpComputer Network Assignment Help
Computer Network Assignment Help
 

Recently uploaded

VIP Call Girls Palanpur 7001035870 Whatsapp Number, 24/07 Booking
VIP Call Girls Palanpur 7001035870 Whatsapp Number, 24/07 BookingVIP Call Girls Palanpur 7001035870 Whatsapp Number, 24/07 Booking
VIP Call Girls Palanpur 7001035870 Whatsapp Number, 24/07 Bookingdharasingh5698
 
Booking open Available Pune Call Girls Pargaon 6297143586 Call Hot Indian Gi...
Booking open Available Pune Call Girls Pargaon  6297143586 Call Hot Indian Gi...Booking open Available Pune Call Girls Pargaon  6297143586 Call Hot Indian Gi...
Booking open Available Pune Call Girls Pargaon 6297143586 Call Hot Indian Gi...Call Girls in Nagpur High Profile
 
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...ranjana rawat
 
Call Girls Pimpri Chinchwad Call Me 7737669865 Budget Friendly No Advance Boo...
Call Girls Pimpri Chinchwad Call Me 7737669865 Budget Friendly No Advance Boo...Call Girls Pimpri Chinchwad Call Me 7737669865 Budget Friendly No Advance Boo...
Call Girls Pimpri Chinchwad Call Me 7737669865 Budget Friendly No Advance Boo...roncy bisnoi
 
FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377877756
FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377877756FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377877756
FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377877756dollysharma2066
 
Top Rated Pune Call Girls Budhwar Peth ⟟ 6297143586 ⟟ Call Me For Genuine Se...
Top Rated  Pune Call Girls Budhwar Peth ⟟ 6297143586 ⟟ Call Me For Genuine Se...Top Rated  Pune Call Girls Budhwar Peth ⟟ 6297143586 ⟟ Call Me For Genuine Se...
Top Rated Pune Call Girls Budhwar Peth ⟟ 6297143586 ⟟ Call Me For Genuine Se...Call Girls in Nagpur High Profile
 
Java Programming :Event Handling(Types of Events)
Java Programming :Event Handling(Types of Events)Java Programming :Event Handling(Types of Events)
Java Programming :Event Handling(Types of Events)simmis5
 
Double rodded leveling 1 pdf activity 01
Double rodded leveling 1 pdf activity 01Double rodded leveling 1 pdf activity 01
Double rodded leveling 1 pdf activity 01KreezheaRecto
 
Online banking management system project.pdf
Online banking management system project.pdfOnline banking management system project.pdf
Online banking management system project.pdfKamal Acharya
 
Call Girls Walvekar Nagar Call Me 7737669865 Budget Friendly No Advance Booking
Call Girls Walvekar Nagar Call Me 7737669865 Budget Friendly No Advance BookingCall Girls Walvekar Nagar Call Me 7737669865 Budget Friendly No Advance Booking
Call Girls Walvekar Nagar Call Me 7737669865 Budget Friendly No Advance Bookingroncy bisnoi
 
University management System project report..pdf
University management System project report..pdfUniversity management System project report..pdf
University management System project report..pdfKamal Acharya
 
Thermal Engineering -unit - III & IV.ppt
Thermal Engineering -unit - III & IV.pptThermal Engineering -unit - III & IV.ppt
Thermal Engineering -unit - III & IV.pptDineshKumar4165
 
Thermal Engineering Unit - I & II . ppt
Thermal Engineering  Unit - I & II . pptThermal Engineering  Unit - I & II . ppt
Thermal Engineering Unit - I & II . pptDineshKumar4165
 
Booking open Available Pune Call Girls Koregaon Park 6297143586 Call Hot Ind...
Booking open Available Pune Call Girls Koregaon Park  6297143586 Call Hot Ind...Booking open Available Pune Call Girls Koregaon Park  6297143586 Call Hot Ind...
Booking open Available Pune Call Girls Koregaon Park 6297143586 Call Hot Ind...Call Girls in Nagpur High Profile
 
VIP Call Girls Ankleshwar 7001035870 Whatsapp Number, 24/07 Booking
VIP Call Girls Ankleshwar 7001035870 Whatsapp Number, 24/07 BookingVIP Call Girls Ankleshwar 7001035870 Whatsapp Number, 24/07 Booking
VIP Call Girls Ankleshwar 7001035870 Whatsapp Number, 24/07 Bookingdharasingh5698
 
Generative AI or GenAI technology based PPT
Generative AI or GenAI technology based PPTGenerative AI or GenAI technology based PPT
Generative AI or GenAI technology based PPTbhaskargani46
 
BSides Seattle 2024 - Stopping Ethan Hunt From Taking Your Data.pptx
BSides Seattle 2024 - Stopping Ethan Hunt From Taking Your Data.pptxBSides Seattle 2024 - Stopping Ethan Hunt From Taking Your Data.pptx
BSides Seattle 2024 - Stopping Ethan Hunt From Taking Your Data.pptxfenichawla
 
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...Christo Ananth
 

Recently uploaded (20)

VIP Call Girls Palanpur 7001035870 Whatsapp Number, 24/07 Booking
VIP Call Girls Palanpur 7001035870 Whatsapp Number, 24/07 BookingVIP Call Girls Palanpur 7001035870 Whatsapp Number, 24/07 Booking
VIP Call Girls Palanpur 7001035870 Whatsapp Number, 24/07 Booking
 
Booking open Available Pune Call Girls Pargaon 6297143586 Call Hot Indian Gi...
Booking open Available Pune Call Girls Pargaon  6297143586 Call Hot Indian Gi...Booking open Available Pune Call Girls Pargaon  6297143586 Call Hot Indian Gi...
Booking open Available Pune Call Girls Pargaon 6297143586 Call Hot Indian Gi...
 
NFPA 5000 2024 standard .
NFPA 5000 2024 standard                                  .NFPA 5000 2024 standard                                  .
NFPA 5000 2024 standard .
 
Roadmap to Membership of RICS - Pathways and Routes
Roadmap to Membership of RICS - Pathways and RoutesRoadmap to Membership of RICS - Pathways and Routes
Roadmap to Membership of RICS - Pathways and Routes
 
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
 
Call Girls Pimpri Chinchwad Call Me 7737669865 Budget Friendly No Advance Boo...
Call Girls Pimpri Chinchwad Call Me 7737669865 Budget Friendly No Advance Boo...Call Girls Pimpri Chinchwad Call Me 7737669865 Budget Friendly No Advance Boo...
Call Girls Pimpri Chinchwad Call Me 7737669865 Budget Friendly No Advance Boo...
 
FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377877756
FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377877756FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377877756
FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377877756
 
Top Rated Pune Call Girls Budhwar Peth ⟟ 6297143586 ⟟ Call Me For Genuine Se...
Top Rated  Pune Call Girls Budhwar Peth ⟟ 6297143586 ⟟ Call Me For Genuine Se...Top Rated  Pune Call Girls Budhwar Peth ⟟ 6297143586 ⟟ Call Me For Genuine Se...
Top Rated Pune Call Girls Budhwar Peth ⟟ 6297143586 ⟟ Call Me For Genuine Se...
 
Java Programming :Event Handling(Types of Events)
Java Programming :Event Handling(Types of Events)Java Programming :Event Handling(Types of Events)
Java Programming :Event Handling(Types of Events)
 
Double rodded leveling 1 pdf activity 01
Double rodded leveling 1 pdf activity 01Double rodded leveling 1 pdf activity 01
Double rodded leveling 1 pdf activity 01
 
Online banking management system project.pdf
Online banking management system project.pdfOnline banking management system project.pdf
Online banking management system project.pdf
 
Call Girls Walvekar Nagar Call Me 7737669865 Budget Friendly No Advance Booking
Call Girls Walvekar Nagar Call Me 7737669865 Budget Friendly No Advance BookingCall Girls Walvekar Nagar Call Me 7737669865 Budget Friendly No Advance Booking
Call Girls Walvekar Nagar Call Me 7737669865 Budget Friendly No Advance Booking
 
University management System project report..pdf
University management System project report..pdfUniversity management System project report..pdf
University management System project report..pdf
 
Thermal Engineering -unit - III & IV.ppt
Thermal Engineering -unit - III & IV.pptThermal Engineering -unit - III & IV.ppt
Thermal Engineering -unit - III & IV.ppt
 
Thermal Engineering Unit - I & II . ppt
Thermal Engineering  Unit - I & II . pptThermal Engineering  Unit - I & II . ppt
Thermal Engineering Unit - I & II . ppt
 
Booking open Available Pune Call Girls Koregaon Park 6297143586 Call Hot Ind...
Booking open Available Pune Call Girls Koregaon Park  6297143586 Call Hot Ind...Booking open Available Pune Call Girls Koregaon Park  6297143586 Call Hot Ind...
Booking open Available Pune Call Girls Koregaon Park 6297143586 Call Hot Ind...
 
VIP Call Girls Ankleshwar 7001035870 Whatsapp Number, 24/07 Booking
VIP Call Girls Ankleshwar 7001035870 Whatsapp Number, 24/07 BookingVIP Call Girls Ankleshwar 7001035870 Whatsapp Number, 24/07 Booking
VIP Call Girls Ankleshwar 7001035870 Whatsapp Number, 24/07 Booking
 
Generative AI or GenAI technology based PPT
Generative AI or GenAI technology based PPTGenerative AI or GenAI technology based PPT
Generative AI or GenAI technology based PPT
 
BSides Seattle 2024 - Stopping Ethan Hunt From Taking Your Data.pptx
BSides Seattle 2024 - Stopping Ethan Hunt From Taking Your Data.pptxBSides Seattle 2024 - Stopping Ethan Hunt From Taking Your Data.pptx
BSides Seattle 2024 - Stopping Ethan Hunt From Taking Your Data.pptx
 
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
 

Approximation algorithms

  • 1. Applied Algorithms Unit III : Approximation Algorithms : Syllabus • Introduction • Absolute Approximation • Epsilon Approximation • Polynomial Time Approximation Schemes • Probabilistically Good Algorithms
  • 2. Unit III : Approximation Algorithms Introduction : • There is a strong evidence to support that no NP-Hard problem can be solved in polynomial time. • Yet many NP-Hard problems have great practical importance and it is desirable to solve large instances of these problems in a reasonable amount of time. • The best known algorithms for NP-Hard problems have a worst case complexity that is exponential. • The examples of NP-Hard problems are :  0/1 Knapsack Problem  Sum of subsets Problem  Maximum Clique Problem  Minimum Node Cover Problem  Maximum independent set Problem
  • 3. Unit III : Approximation Algorithms Introduction contd… : • Backtracking & Branch & Bound algorithmic strategies (use of heuristic approach) enable to quickly solve a large instance of a problem provided heuristic works on that instance. • However this heuristic approach does not work equally effective on all problem instances. NP-Hard problems, even coupled with heuristics, still show exponential behavior on some set of inputs (instances). • The discovery of a sub-exponential algorithm (Approximate) for NP-Hard problem increases the maximum problem size that can be solved. • However for large problem instances, we need an algorithm of low polynomial time complexity (say O(n) or O(n2)).
  • 4. Unit III : Approximation Algorithms Introduction contd… : • To produce an algorithm of low polynomial time complexity to solve an NP-Hard optimization problem, it is necessary to relax the meaning of “solve”. These relaxations are : We remove the requirement that the algorithm that solves the optimization problem P must always generate an optimal solution i.e. the algorithm for the problem P must always generate a feasible solution A feasible solution with value close to the optimal solution is called Approximate Solution and the algorithm that generates approximate solution for the problem P, is called as an Approximation Algorithm.
  • 5. Unit III : Approximation Algorithms Introduction contd… : In the case of NP-Hard problems, approximate solutions have added importance as exact solutions (i,.e. optimal solutions) may not be obtainable in a feasible amount of computing time. One can get approximate solution using Approximation Algorithm for an NP-Hard problem in reasonable amount of computing time. • Probabilistically Good Algorithms : In second relaxation, we look for an algorithm for the problem P that almost always generates optimal solution. Algorithms with this property are called as “Probabilistically Good Algorithms”.
  • 6. Unit III : Approximation Algorithms Introduction contd… : • Terminology : P : Represents an NP-Hard problem such as 0/1 Knapsack or Traveling Salesperson Problem. I : Represents an instance of problem P F*(I) : Represents an optimal solution to I F^(I) : Represents feasible solution produced for I by an approximation algorithm A : Represents an algorithm that generates a feasible solution to every instance I of a problem P F*(I) > F^(I) if P is a maximization problem F^(I) > F*(I) if P is a minimization problem
  • 7. Unit III : Approximation Algorithms Introduction contd… : • Approximation Schemes :  Absolute Approximation : A is absolute approximation algorithm for problem P iff for every instance I of P, |F*(I) - F^(I) | ≤ k for some constant k.  f(n)-Approximation : A is an f(n)-approximation algorithm of problem P iff for every instance I of size n, |F*(I) - F^(I) | / F*(I) ≤ f(n) for F*(I) > 0. ε-Approximation : A is an epsilon (ε) approximation algorithm of problem P iff A is f(n)-approximation algorithm for which f(n) ≤ ε, where ε is some constant.
  • 8. Unit III : Approximation Algorithms Absolute Approximations : There are very few NP-Hard optimization problems for which polynomial time absolute approximation algorithms are known. Planar Graph Coloring is the example of absolute approximation algorithm. Planar Graph Coloring : To determine the minimum number of colors needed to color a planar graph G = (V, E). It is known that every planar graph is 4-colorable. One can easily determine whether a graph is zero, one or two colorable. It is zero colorable iff V = Ф, it is one colorable iff E = Ф and it is two colorable iff it is bipartite. Determining whether a graph is 3-colorable is NP- Hard. However all planar graphs are 4-colorable. An absolute approximation algorithm with |F*(I) - F^(I) | ≤ 1 is easy to obtain.
  • 9. Unit III : Approximation Algorithms Absolute Approximations : Algorithm Acolor (V, E) // Determines an approximation to the minimum number of // colors needed to color a plan graph. { if V = Ф return 0; else if E = Ф return 1 else if (G is bipartite) return 2; else return 4. } Worst case Time Complexity = O(|V|+|E|) to find whether a graph is bipartite.
  • 10. Unit III : Approximation Algorithms Absolute Approximations : Maximum Programs Stored Problem : Let there be n programs and two storage devices, two disks with storage capacity of L each. Let li be the amount of storage needed to store the ith program. We need to determine the maximum number of these n programs that can be stored on the two disks without splitting a program over the disks. This problem is NP-hard. Proof follows : Theorem : Partition α Maximum Programs Stored Problem. Proof : Let {a1, a2, … ,an} define an instance of the partition problem. We can assume Σai = 2T for 1≤ i ≤ n. This is equivalent to the instance of Maximum Programs Stored Problem for which L = T and ai = li for 1≤ i ≤ n. Clearly {a1, a2, … ,an} has a partition iff all n programs can be stored on the two disks,
  • 11. Unit III : Approximation Algorithms Absolute Approximations : Maximum Programs Stored Problem : Approximate Solution : • By considering programs in order of non-decreasing storage requirement li, we can obtain a polynomial time absolute approximation algorithm. • Following function Pstore assumes l1≤ l2≤ … ln and assigns programs to disk1 so long as enough space remains on disk1. Then it begins assigning programs to disk2. • Time complexity of this approximate algorithm is O(n) in addition to O(nlogn) time for sorting the programs into non-decreasing order of li. • Thus we can get the approximate solution in polynomial time i.e. O(nlogn) against exponential time optimal solution to the problem .
  • 12. Unit III : Approximation Algorithms Absolute Approximations : Maximum Programs Stored Problem : Algorithm Pstore(l, n, L) // assumes that li ≤ li+1, 1 ≤ i ≤n. { i = 1; for j = 1 to 2 do { sum = 0; // amount (part) of disk j already // assigned while (sum + l[i]) ≤ L do { write (“store program”, i, “on disk”, j); sum = sum + l[i]; i= i + 1; if i> n then return; } } }
  • 13. Unit III : Approximation Algorithms Absolute Approximations : Maximum Programs Stored Problem : Theorem : Let I be any instance of the Maximum Programs Stored Problem. Let F*(I) be the maximum number of programs that can be stored on two disks each of length L. Let F^(I) be the number of programs stored using the function Pstore. Then | F*(I) - F^(I) | ≤ 1. Proof : • Assume that k programs are stored when Pstore is used, then F^(I) = k. • Consider the program storage problem when only one disk of capacity 2L is available. In this case considering programs in order of non-decreasing storage requirement maximizes the number of programs stored. Assume that ρ programs get stored when this strategy is used on a single disk of length 2L
  • 14. Unit III : Approximation Algorithms Absolute Approximations : Maximum Programs Stored Problem : Proof contd… : • Therefore ρ ≥ F*(I) and Σ li ≤ 2L, 1 ≤ i ≤ ρ .….. (1) • Let j be the largest index such that Σ li ≤ L, 1 ≤ i ≤ j, therefore j ≤ ρ and Pstore assigns the first j programs to disk 1. • Also note that : Σ li ≤ Σ li, ≤ L j+1<= i<= ρ -1 j+2<= i<= ρ Thus Pstore assigns at least j+1, j+2,, … , ρ -1 to disk 2 and therefore F^(I) ≥ ρ -1 i.e. -F^(I) ≤ -ρ +1 ...….. (2) • From equations (1) and (2) above we conclude : |F*(I) – F^(I)| ≤ 1 Absolute Approximation
  • 15. Unit III : Approximation Algorithms Absolute Approximations : Maximum Programs Stored Problem : Example : Consider the following instance for Pstore : {a1, a2, a3, a4, a5, a6} = {1, 2, 5, 8, 10, 14} and L = 20. ρ = 6 ≥ F*(I) when one disk of size 2L is considered. F*(I) ≤ 6 Partitions {1, 5. 14} & {2, 8, 10} F^(I) = 5 using Pstore (two disks of size 20 each) {1, 2, 5, 8} & {10} |F*(I) – F^(I)| ≤ 1
  • 16. Unit III : Approximation Algorithms ε-Approximations : Scheduling Independent Tasks : • Obtaining minimum finish time schedules on m identical processors, m ≥2, is NP-hard. There exists a very simple scheduling rule that generates schedules with a finish time very close to that of an optimal schedule. • An instance of the scheduling problem is defined by a set of n tasks with times ti, 1 ≤ i ≤n, and m the number of identical processors. • The scheduling rule is known as the Largest Processing Time (LPT) rule. An LPT schedule is a schedule that results from this rule. • LPT Schedule : An LPT schedule is one that is result of an algorithm that, whenever a processor becomes free, assign to that processor a task whose time is the largest of those tasks not yet assigned. Ties are broken in an arbitrary manner.
  • 17. Unit III : Approximation Algorithms ε-Approximations : Scheduling Independent Tasks : Example -1: Let m = 3, n = 6 and (t1, t2, t3, t4, t5, t6) = (8, 7, 6, 5, 4, 3) Following is the LPT Schedule. Finish Time = 11 and Schedule is optimal since (Σ ti) / 3= 33/3 = 11. Time 1 2 3 4 5 6 7 8 9 10 11 p1 t1 t6 p2 t2 t5 p3 t3 t4
  • 18. Unit III : Approximation Algorithms ε-Approximations : Scheduling Independent Tasks : Example -2: Let m = 3, n = 7 and (t1, t2, t3, t4, t5 , t6 , t7) = (5, 5, 4, 4, 3, 3, 3) Following is the LPT Schedule. Finish Time = 11 and Schedule is not optimal since (Σ ti) / 3 = 27/3 = 9. Time 1 2 3 4 5 6 7 8 9 10 11 p1 t1 t6 t7 p2 t2 t5 p3 t3 t4
• 19. Unit III : Approximation Algorithms ε-Approximations : Scheduling Independent Tasks : Optimal Schedule for Example -2: Let m = 3, n = 7 and (t1, t2, t3, t4, t5, t6, t7) = (5, 5, 4, 4, 3, 3, 3) Following is the Optimal Schedule : p1 : t1 (0–5), t3 (5–9); p2 : t2 (0–5), t4 (5–9); p3 : t5 (0–3), t6 (3–6), t7 (6–9). Finish Time = 9 and the schedule is optimal since (Σ ti) / 3 = 27/3 = 9.
• 20. Unit III : Approximation Algorithms ε-Approximations : Scheduling Independent Tasks : Analysis : It is possible to implement the LPT rule so that at most O(n log n) time is needed to generate an LPT schedule for n tasks on m identical processors. As the above examples show, the LPT rule may generate optimal schedules for some problem instances, but it does not do so for all instances. How bad can LPT schedules be relative to optimal schedules? This question is answered by the following theorem. Graham’s Theorem : Let F*(I) be the finish time of an optimal m-processor schedule for instance I of the task scheduling problem, and let F^(I) be the finish time of an LPT schedule for the same instance. Then, |F*(I) – F^(I)| / |F*(I)| ≤ (1/3) – (1/(3m))
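The LPT rule can be sketched with a min-heap of processor finish times (a sketch of mine, not from the slides); it reproduces both examples above, and Example 2 in fact attains Graham's bound exactly: (11 − 9)/9 = 2/9 = 1/3 − 1/(3·3).

```python
import heapq

def lpt_finish_time(times, m):
    """Assign tasks in nonincreasing order of time, always to the
    processor that becomes free earliest; return the finish time."""
    loads = [0] * m  # current finish time of each processor
    heapq.heapify(loads)
    for t in sorted(times, reverse=True):
        heapq.heappush(loads, heapq.heappop(loads) + t)
    return max(loads)

f1 = lpt_finish_time([8, 7, 6, 5, 4, 3], 3)    # Example 1
f2 = lpt_finish_time([5, 5, 4, 4, 3, 3, 3], 3) # Example 2
```

Ties here are broken by heap order, which is one legal instance of the "arbitrary" tie-breaking in the rule.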
• 21. Unit III : Approximation Algorithms ε-Approximations : Scheduling Independent Tasks : Graham’s Theorem : Proof by contradiction Part-1 : • The theorem is clearly true for m = 1, since then F*(I) = F^(I) and LHS = RHS = 0. • Assume that for some m > 1 there exists a set of tasks for which the theorem is not true, and let (t1, t2, …, tn) define an instance I with the fewest tasks for which the theorem is violated. • We assume that t1 ≥ t2 ≥ … ≥ tn and that an LPT schedule is obtained by assigning tasks in the order 1, 2, 3, …, n. Let the schedule be S and F^(I) its finish time. • Let k be the index of a task with the latest completion time. Suppose k < n. Then the finish time f^ of the LPT schedule for tasks 1, 2, …, k is also F^(I), and the finish time f* of an optimal schedule for these k tasks is ≤ F*(I).
• 22. Unit III : Approximation Algorithms ε-Approximations : Scheduling Independent Tasks : Graham’s Theorem : Proof by contradiction Part-1 contd …: • Hence f^ = F^(I) and f* ≤ F*(I), i.e. 1/f* ≥ 1/F*(I). Therefore, recalling that an LPT finish time is never less than the optimal finish time, |f* – f^| / f* = f^/f* – 1 ≥ F^(I)/F*(I) – 1 = |F*(I) – F^(I)| / F*(I) > (1/3) – (1/(3m)), where the last inequality follows from the assumption on I. • But then the k-task instance also violates the theorem, contradicting the assumption that I is the smallest m-processor instance for which the theorem does not hold. Hence k = n.
• 23. Unit III : Approximation Algorithms ε-Approximations : Scheduling Independent Tasks : Graham’s Theorem : Proof by contradiction Part-2 : Now we show that in no optimal schedule for I can more than two tasks be assigned to any processor, so that n ≤ 2m. • Since task n has the latest completion time in the LPT schedule for I (as shown in Part-1), it is started at time F^(I) – tn in this schedule. Further, no processor can have any idle time before this start time. Hence we obtain, F^(I) – tn ≤ (1/m) Σ_{1 ≤ i ≤ n-1} ti, i.e. F^(I) ≤ (1/m) Σ_{1 ≤ i ≤ n} ti + ((m – 1)/m) tn • Now since F*(I) ≥ (1/m) Σ_{1 ≤ i ≤ n} ti, we conclude F^(I) – F*(I) ≤ ((m – 1)/m) tn, OR |F*(I) – F^(I)| / F*(I) ≤ ((m – 1)/m) tn / F*(I)
• 24. Unit III : Approximation Algorithms ε-Approximations : Scheduling Independent Tasks : Graham’s Theorem : Proof by contradiction Part-2 contd…: • From the assumption |F*(I) – F^(I)| / F*(I) > (1/3) – (1/(3m)) we get (1/3) – (1/(3m)) < ((m – 1)/m) tn / F*(I), i.e. (m – 1)/(3m) < ((m – 1)/m) tn / F*(I), i.e. F*(I) < 3 tn • Since every task time is at least tn, a processor executing three or more tasks would finish no earlier than 3 tn > F*(I). Hence in an optimal schedule for I, no more than two tasks can be assigned to any processor. • But when an optimal schedule contains at most two tasks on any processor, it can be shown that the LPT schedule is also optimal, i.e. |F*(I) – F^(I)| / F*(I) = 0. This contradicts the assumption |F*(I) – F^(I)| / |F*(I)| > (1/3) – (1/(3m)), so there can be no I that violates the theorem.
• 25. Unit III : Approximation Algorithms Polynomial Time Approximations : The Vertex Cover Problem : • A vertex cover of an undirected graph G = (V, E) is a subset V’ of set V such that if (u, v) is an edge of G, then either u ∈ V’ or v ∈ V’ (or both). The size of a vertex cover is the number of vertices in it. • The vertex cover problem is to find a vertex cover of minimum size in a given undirected graph. We call such a vertex cover an optimal vertex cover. This problem is the optimization version of an NP-complete decision problem. • Even though we do not know how to find an optimal vertex cover in a graph G in polynomial time, we can efficiently find a vertex cover that is near optimal. The following algorithm (Approx-Vertex-Cover) takes as input an undirected graph G and returns a vertex cover whose size is guaranteed to be no more than twice the size of an optimal vertex cover.
• 26. Unit III : Approximation Algorithms Polynomial Time Approximations : The Vertex Cover Problem : Algorithm Approx-Vertex-Cover (G) { C = ∅; // O(n) E’ = G.E; // O(e) while E’ ≠ ∅ { let (u, v) be an arbitrary edge of E’; // O(e) C = C U {u, v}; // O(n) remove from E’ every edge incident on either u or v; // O(e) } return C; } The running time of this algorithm is O(n + e), i.e. O(|V| + |E|), when the graph G is represented as an adjacency list.
• 27. Unit III : Approximation Algorithms Polynomial Time Approximations : The Vertex Cover Problem : Example : Consider the following Graph G Fig (a) [figure: the input graph on vertices a–g]
• 28. Unit III : Approximation Algorithms Polynomial Time Approximations : The Vertex Cover Problem : Example : Consider the following Graph G Fig (b) [figure: the graph after the first iteration of Approx-Vertex-Cover]
• 29. Unit III : Approximation Algorithms Polynomial Time Approximations : The Vertex Cover Problem : Example : Consider the following Graph G Fig (c) [figure: the graph after the second iteration of Approx-Vertex-Cover]
• 30. Unit III : Approximation Algorithms Polynomial Time Approximations : The Vertex Cover Problem : Example : Consider the following Graph G Fig (d) [figure: final state of the graph] F^(I) = C = {b, c, d, e, f, g} is the vertex cover produced by algorithm Approx-Vertex-Cover; F*(I) = Optimal Cover = {b, d, e}
• 31. Unit III : Approximation Algorithms Polynomial Time Approximations : The Vertex Cover Problem : Theorem : Approx-Vertex-Cover is a polynomial time 2-approximation algorithm. Proof : • We have already shown that Approx-Vertex-Cover runs in polynomial time. • Let A be the set of edges chosen (arbitrarily) by the algorithm. No two edges in A share an endpoint, since once an edge is selected all edges incident on its endpoints are deleted from E’. Therefore |F^(I)| = 2|A| …. (1) • Any cover must include at least one endpoint of each edge in A, and these endpoints are distinct since the edges of A share no endpoints. Therefore |F*(I)| ≥ |A| …. (2) • From (1) and (2) we get |F^(I)| = 2|A| ≤ 2|F*(I)|, i.e. |F^(I)| / |F*(I)| ≤ 2. • Hence the theorem.
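Approx-Vertex-Cover can be transcribed directly from the pseudocode. The edge list below is an assumed reconstruction of the example graph, ordered so that the "arbitrary" edge choices (b, c), (e, f), (d, g) reproduce the cover shown in Fig (d):

```python
def approx_vertex_cover(edges):
    """Pick an arbitrary remaining edge (here: first in list order),
    add both endpoints to the cover, delete all edges they touch."""
    cover = set()
    remaining = list(edges)
    while remaining:
        u, v = remaining[0]
        cover |= {u, v}
        remaining = [(a, b) for (a, b) in remaining
                     if a not in cover and b not in cover]
    return cover

# Assumed edge set for the seven-vertex example graph, ordered to
# match the figure's arbitrary choices.
edges = [('b', 'c'), ('e', 'f'), ('d', 'g'), ('a', 'b'),
         ('c', 'd'), ('c', 'e'), ('d', 'e'), ('d', 'f')]
C = approx_vertex_cover(edges)
```

The result covers every edge and has size 6, which is exactly twice the size of the optimal cover {b, d, e}.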
• 32. Unit III : Approximation Algorithms Polynomial Time Approximations : Scheduling Independent Tasks : • We have seen that the LPT rule leads to a (1/3 – 1/(3m))-approximate algorithm for the problem of obtaining an m-processor schedule for n independent tasks. A polynomial time approximation scheme is also known for this problem. This scheme relies upon the following scheduling rule : • Let k be some specified and fixed integer. Obtain an optimal schedule for the k longest tasks, then schedule the remaining n – k tasks using the LPT rule.
• 33. Unit III : Approximation Algorithms Polynomial Time Approximations : Scheduling Independent Tasks : Example • Let m = 2, n = 6, (t1, t2, t3, t4, t5, t6) = (8, 6, 5, 4, 4, 1) and k = 4. • The four longest tasks have task times 8, 6, 5 and 4 respectively. The optimal schedule for these tasks has finish time 12 : m1 processes t1 and t4 (finish at 12), m2 processes t2 and t3 (finish at 11). • When the remaining two tasks are scheduled using the LPT rule, the schedule has finish time 15 : m1 processes t1, t4 and t6 (finish at 13), m2 processes t2, t3 and t5 (finish at 15). • The optimal finish time is 14 : m1 processes t1 and t2 (finish at 14), m2 processes t3, t4, t5 and t6 (finish at 14).
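The rule above can be sketched as follows. Exhaustive search stands in for "obtain an optimal schedule for the k longest tasks" (any exact method would do); helper names are my own. Setting k = n makes the same routine an exact solver, which lets us check both the 15 and the 14 quoted above:

```python
import heapq
from itertools import product

def k_optimal_then_lpt(times, m, k):
    """Optimal schedule for the k longest tasks (by brute force),
    then LPT for the remaining n - k tasks; returns the finish time."""
    times = sorted(times, reverse=True)
    head, tail = times[:k], times[k:]
    # Exhaustive search over all m^k assignments of the k longest tasks;
    # keep the load vector with the smallest makespan.
    best = min((tuple(sorted(
        sum(t for t, proc in zip(head, assign) if proc == j)
        for j in range(m)))
        for assign in product(range(m), repeat=k)), key=max)
    loads = list(best)
    heapq.heapify(loads)
    for t in tail:  # LPT: each task goes to the earliest-free processor
        heapq.heappush(loads, heapq.heappop(loads) + t)
    return max(loads)

tasks = [8, 6, 5, 4, 4, 1]
finish = k_optimal_then_lpt(tasks, m=2, k=4)
optimal = k_optimal_then_lpt(tasks, m=2, k=6)  # k = n: exact schedule
```

The brute force is exponential in k, which is consistent with the O(m^k) exact-scheduling cost discussed in the analysis that follows.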
• 34. Unit III : Approximation Algorithms Polynomial Time Approximations : Scheduling Independent Tasks : Theorem [Graham] : Let I be an m-processor instance of the scheduling problem. Let F*(I) be the finish time of an optimal schedule for I, and let F^(I) be the length of the schedule generated by the above scheduling rule. Then, |F*(I) – F^(I)| / |F*(I)| ≤ (1 – 1/m) / (1 + ⌊k/m⌋) Proof : • Let r be the finish time of an optimal schedule for the k longest tasks. If F^(I) = r, then F*(I) = F^(I) and the theorem is proved; so assume F^(I) > r. Also, if n ≤ m then F*(I) = F^(I) and the theorem is proved. • Let ti, 1 ≤ i ≤ n, be the task times of I. Without loss of generality, we can assume ti ≥ ti+1, 1 ≤ i < n, and n > k. Also assume that n > m.
• 35. Unit III : Approximation Algorithms Polynomial Time Approximations : Scheduling Independent Tasks : Proof contd…: • Let j, j > k, be such that task j has finish time F^(I). Then no processor is idle in the interval [0, F^(I) – tj]. Since tk+1 ≥ tj, it follows that no processor is idle in the interval [0, F^(I) – tk+1]. Hence, Σ_{1 ≤ i ≤ n} ti ≥ m (F^(I) – tk+1) + tk+1, and so F*(I) ≥ (1/m) Σ_{1 ≤ i ≤ n} ti ≥ F^(I) – ((m–1)/m) tk+1, OR |F*(I) – F^(I)| ≤ ((m–1)/m) tk+1 ………………… (1) Since ti ≥ ti+1, 1 ≤ i ≤ k, at least one processor must execute at least 1 + ⌊k/m⌋ of the k+1 longest tasks, each of time at least tk+1; it follows that, F*(I) ≥ (1 + ⌊k/m⌋) tk+1 ………………....(2) Combining (1) and (2) we get, |F*(I) – F^(I)| / F*(I) ≤ ((m–1)/m) / (1 + ⌊k/m⌋) = (1 – 1/m) / (1 + ⌊k/m⌋)
• 36. Unit III : Approximation Algorithms Polynomial Time Approximations : Scheduling Independent Tasks : Analysis : • Using the above result, we can construct a polynomial time ε–approximation scheme for the scheduling problem. This scheme has ε as an input variable. • For any input ε, it computes a value of k such that (1 – 1/m) / (1 + ⌊k/m⌋) ≤ ε. This defines the k to be used in the scheduling rule described above. • Solving for k, we obtain that any integer k ≥ ((m – 1)/ε) – m guarantees ε-approximate schedules.
• 37. Unit III : Approximation Algorithms Polynomial Time Approximations : Scheduling Independent Tasks : Analysis contd… : • The time required to obtain such schedules mainly depends on the time needed to obtain an optimal schedule for k tasks on m machines. Using a branch and bound algorithm, this time is O(m^k). The time required to arrange the tasks in nonincreasing order of processing time and to obtain the LPT schedule for the remaining (n – k) tasks is O(n log n). • Total time needed is O(n log n + m^k) = O(n log n + m^(((m–1)/ε) – m)) • Since this time is exponential in 1/ε rather than polynomial in 1/ε, this approximation scheme is not a fully polynomial time approximation scheme. It is a polynomial time approximation scheme (for any fixed m), as the computing time is polynomial in the number of tasks n.
• 38. Unit III : Approximation Algorithms Polynomial Time Approximations : Heuristic algorithm for 0/1 knapsack problem : Let p[ ] and w[ ] be the arrays of profits and weights respectively. Assume that pi/wi ≥ pi+1/wi+1, 1 ≤ i < n. Let m be the knapsack capacity and k ≤ n a nonnegative integer. The algorithm considers all combinations of objects taken 0 at a time, then 1 at a time, and so on up to k at a time. For each combination that fits in the knapsack, any remaining capacity is filled greedily with the remaining objects in the original (density) order, and the profit of the resulting solution is computed. At the end the algorithm returns the maximum profit found.
• 39. Unit III : Approximation Algorithms Polynomial Time Approximations : Heuristic algorithm for 0/1 knapsack problem : Algorithm Epsilon-Approx (p, w, n, k) // The size of a combination is the number of objects in it; the // weight of a combination is the sum of the weights of the // objects in it. k is the nonnegative integer that defines the // order of the algorithm. { Pmax = 0; for all combinations I of size ≤ k and weight ≤ m do { PI = Σ_{i ∈ I} pi; Pmax = max(Pmax, PI + LBound(I, p, w, m, n)); } return Pmax; }
• 40. Unit III : Approximation Algorithms Polynomial Time Approximations : Heuristic algorithm for 0/1 knapsack problem : Algorithm LBound (I, p, w, m, n) { s = 0; t = m – Σ_{i ∈ I} wi; for i = 1 to n do { if ((i ∉ I) and (w[i] ≤ t)) then { s = s + p[i]; t = t – w[i]; } } return s; }
• 41. Unit III : Approximation Algorithms Polynomial Time Approximations : Heuristic algorithm for 0/1 knapsack problem : Analysis : Number of combinations tried by Algorithm Epsilon-Approx = Σ_{0 ≤ i ≤ k} C(n, i) ≤ Σ_{0 ≤ i ≤ k} n^i = (n^(k+1) – 1) / (n – 1) = O(n^k) Time required by Algorithm LBound = O(n) Time complexity = O(n^(k+1)) Theorem : Let J be an instance of the 0/1 knapsack problem. Let n, m, p, w be as defined for function Epsilon-Approx. Let p* be the value of an optimal solution for J, and let Pmax be as defined by function Epsilon-Approx on termination. Then, |p* – Pmax| / p* < 1/(k+1)
• 42. Unit III : Approximation Algorithms Polynomial Time Approximations : Heuristic algorithm for 0/1 knapsack problem : Example : Consider the 0/1 knapsack problem instance with n = 8 objects, size of knapsack m = 110, profits = {11, 21, 31, 33, 43, 53, 55, 65} and weights = {1, 11, 21, 23, 33, 43, 45, 55}. The optimal solution takes objects 1, 2, 3, 5 and 6, giving an optimal profit p* = 159 at weight w = 109. We get the following approximations for different values of k:
k  Pmax  x  w  |p* – Pmax| / p*
0  139  {1, 1, 1, 1, 1, 0, 0, 0}  89  20/159 = 0.126
1  151  {1, 1, 1, 1, 0, 0, 1, 0}  101  8/159 = 0.05
2  159  {1, 1, 1, 0, 1, 1, 0, 0}  109  0/159 = 0.00
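Epsilon-Approx and LBound translate almost line-for-line into Python (a sketch; 0-based indices replace the pseudocode's 1-based ones), and the transcription reproduces the table above:

```python
from itertools import combinations

def lbound(I, p, w, m):
    """Greedily complete combination I with the remaining objects,
    taken in the original (density) order."""
    s, t = 0, m - sum(w[i] for i in I)
    for i in range(len(p)):
        if i not in I and w[i] <= t:
            s += p[i]
            t -= w[i]
    return s

def epsilon_approx(p, w, m, k):
    """Try every combination of at most k objects that fits,
    complete it greedily, and return the best profit seen."""
    pmax = 0
    for size in range(k + 1):
        for I in combinations(range(len(p)), size):
            if sum(w[i] for i in I) <= m:
                pmax = max(pmax, sum(p[i] for i in I) + lbound(I, p, w, m))
    return pmax

p = [11, 21, 31, 33, 43, 53, 55, 65]
w = [1, 11, 21, 23, 33, 43, 45, 55]
results = [epsilon_approx(p, w, 110, k) for k in (0, 1, 2)]
```

Each result also respects the theorem's bound |p* − Pmax|/p* < 1/(k+1): 20/159 < 1, 8/159 < 1/2, 0 < 1/3.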
• 43. Unit III : Approximation Algorithms Fully Polynomial Time Approximations : • The approximation algorithms and schemes we have studied so far are particular to the problem considered. There is no set of well defined techniques that we can use to obtain such algorithms; the heuristics used depend very much on the particular problem being solved. • There are three techniques for fully polynomial time approximation schemes : 1. Rounding 2. Interval Partitioning 3. Separation • We shall discuss these techniques in terms of maximization problems of the form (problem instance I) : max Σ_{1 ≤ i ≤ n} pi xi subject to Σ_{1 ≤ i ≤ n} aij xi ≤ bj, 1 ≤ j ≤ m, with xi ∈ {0, 1}, 1 ≤ i ≤ n, and pi, aij ≥ 0. Without loss of generality, we assume that aij ≤ bj, 1 ≤ i ≤ n and 1 ≤ j ≤ m.
• 44. Unit III : Approximation Algorithms Fully Polynomial Time Approximations : Rounding : • In this technique, we start from the problem instance I as stated above and transform it to another problem instance I’ that is easier to solve. • This transformation is carried out in such a way that the optimal solution value of I’ is close to the optimal solution value of I. • In particular, given a bound ε on the fractional difference between the optimal and approximate solution values relative to the optimal value, the transformation guarantees |F*(I) – F*(I’)| / F*(I) ≤ ε, where F*(I) and F*(I’) represent the optimal solution values of I and I’ respectively.
• 45. Unit III : Approximation Algorithms Fully Polynomial Time Approximations : Rounding : Example Consider a 0/1 knapsack instance I with n = 4, m = 2007.1, (p1, p2, p3, p4) = (1.1, 2.1, 1001.6, 1002.3) and (w1, w2, w3, w4) = (1.1, 2.1, 1001.6, 1002.3) Solution : Let S(i) denote the set of distinct profit values of the feasible assignments to the first i objects, as built up by the dynamic programming solution. For instance I : S(0) = {0} S(1) = {0, 1.1} S(2) = {0, 1.1, 2.1, 3.2} S(3) = {0, 1.1, 2.1, 3.2, 1001.6, 1002.7, 1003.7, 1004.8} S(4) = {0, 1.1, 2.1, 3.2, 1001.6, 1002.7, 1003.7, 1004.8, 1002.3, 1003.4, 1004.4, 1005.5, 2003.9, 2005.0, 2006.0, 2007.1}
• 46. Unit III : Approximation Algorithms Fully Polynomial Time Approximations : Rounding : Example Solution contd…: Optimal solution for the instance I = F*(I) = 2007.1, i.e. the solution is (x1, x2, x3, x4) = (1, 1, 1, 1). Now we construct I’ with the following values : (p1, p2, p3, p4) = (0, 0, 1000, 1000) and (w1, w2, w3, w4) = (1.1, 2.1, 1001.6, 1002.3). Then the feasible assignments in S(i) have the following distinct profit values for instance I’ : S(0) = {0} S(1) = {0} S(2) = {0} S(3) = {0, 1000} S(4) = {0, 1000, 2000} Optimal solution for the instance I’ = F*(I’) = 2000, e.g. the solution (x1, x2, x3, x4) = (0, 0, 1, 1).
• 47. Unit III : Approximation Algorithms Fully Polynomial Time Approximations : Rounding : Analysis …: Total values computed for an optimal solution to I = Σ_{0 ≤ i ≤ n} |S(i)| = 31 Total values computed for an optimal solution to I’ = Σ_{0 ≤ i ≤ n} |S(i)| = 8 Thus I’ can be solved in roughly one fourth of the time needed for I. |F*(I) – F*(I’)| / F*(I) = (2007.1 – 2000) / 2007.1 ≈ 0.0035 Given the pi’s and ε, what should the values of the qi’s be such that |F*(I) – F*(I’)| / F*(I) ≤ ε and Σ_{0 ≤ i ≤ n} |S(i)| ≤ u(n, 1/ε), where u is a polynomial in n and 1/ε? Once we achieve this we have a fully polynomial time approximation scheme for our problem, since it is possible to go from S(i-1) to S(i) in time proportional to O(|S(i-1)|).
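The state counts quoted above can be checked with a small dominance-based DP over (profit, weight) tuples. To avoid floating-point noise, profits and weights are scaled to integer tenths (an implementation choice of mine, not from the slides):

```python
def knapsack_states(p, w, m):
    """Return S(0..n): at stage i, the nondominated (profit, weight)
    pairs over objects 1..i with weight <= m."""
    S = [[(0, 0)]]
    for pi, wi in zip(p, w):
        candidates = set(S[-1]) | {(pr + pi, wt + wi)
                                   for pr, wt in S[-1] if wt + wi <= m}
        purged, best = [], -1
        # dominance: scanning by weight, keep a pair only if it strictly
        # improves the best profit seen so far
        for pr, wt in sorted(candidates, key=lambda t: (t[1], -t[0])):
            if pr > best:
                purged.append((pr, wt))
                best = pr
        S.append(purged)
    return S

# Profits = weights, scaled to integer tenths.
p_I = [11, 21, 10016, 10023]   # 1.1, 2.1, 1001.6, 1002.3
p_Ip = [0, 0, 10000, 10000]    # rounded profits for I'
S_I = knapsack_states(p_I, p_I, 20071)
S_Ip = knapsack_states(p_Ip, p_I, 20071)
```

The run confirms Σ|S(i)| = 31 for I versus 8 for I', with optimal values 2007.1 and 2000 (in tenths: 20071 and 20000).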
• 48. Unit III : Approximation Algorithms Fully Polynomial Time Approximations : Rounding : Analysis contd …: Let LB be an estimate for F*(I) such that F*(I) ≥ LB ….. (1) We can assume that LB ≥ max {pi} ….. (2) For I’ let us define qi = pi – rem(pi, LB·ε/n) ….. (3), where rem(a, b) is the remainder of a/b, i.e. a – ⌊a/b⌋·b. Therefore pi – qi = rem(pi, LB·ε/n) ….. (4) Now since rem(pi, LB·ε/n) < LB·ε/n, from (4) pi – qi < LB·ε/n, and therefore Σ_{1 ≤ i ≤ n} (pi – qi) < LB·ε ….. (5) From (1) and (5), Σ_{1 ≤ i ≤ n} (pi – qi) < LB·ε ≤ F*(I)·ε ….. (6) Now an optimal solution x* for I is also feasible for I’, so F*(I’) ≥ Σ qi xi* ≥ Σ pi xi* – Σ (pi – qi) = F*(I) – Σ (pi – qi). Therefore |F*(I) – F*(I’)| ≤ Σ_{1 ≤ i ≤ n} (pi – qi) ≤ F*(I)·ε ….. (7) and |F*(I) – F*(I’)| / F*(I) ≤ ε ….. (8) Hence if an optimal solution to I’ is used as a solution for I, the fractional error is less than ε.
• 49. Unit III : Approximation Algorithms Fully Polynomial Time Approximations : Rounding : Analysis contd… Time Complexity : To determine the time required to solve I’ exactly, it is useful to introduce another problem I” with si, 1 ≤ i ≤ n, as its objective function coefficients, where si = ⌊pi·n / (LB·ε)⌋, 1 ≤ i ≤ n. Since qi is a multiple of LB·ε/n, we have si = qi·n / (LB·ε) exactly. Clearly, the S(i)’s corresponding to the solutions of I’ and I” have the same number of tuples : if (r, t) is a tuple in S(i) for I’, then (r·n / (LB·ε), t) is a tuple in S(i) for I”. Hence the time needed to solve I’ is the same as that needed to solve I”.
• 50. Unit III : Approximation Algorithms Fully Polynomial Time Approximations : Rounding : Analysis contd… Worst case Time Complexity : Since pi ≤ LB …..(1), si = ⌊pi·n / (LB·ε)⌋ ≤ ⌊n/ε⌋ ….. (2) Hence |S(i)| ≤ 1 + Σ_{1 ≤ j ≤ i} sj ≤ 1 + i·⌊n/ε⌋ ….. (3) Therefore Σ_{0 ≤ i ≤ n-1} |S(i)| ≤ n + Σ_{0 ≤ i ≤ n-1} i·⌊n/ε⌋ ≤ n + (n–1)·n·⌊n/ε⌋ / 2 = O(n³/ε) ….. (4)
• 51. Unit III : Approximation Algorithms Fully Polynomial Time Approximations : Interval Partitioning : • Unlike rounding, interval partitioning does not transform the original problem instance into one that is easier to solve. • Instead, an attempt is made to solve the problem instance I by generating a restricted class of the feasible assignments for S(0), S(1), …, S(n). • Let Pi be the maximum Σ_{1 ≤ j ≤ i} pj xj among all feasible assignments generated for S(i). • Then the profit interval [0, Pi] is divided into subintervals, each of size Pi·ε/(n–1). • All feasible assignments in S(i) whose Σ_{1 ≤ j ≤ i} pj xj falls in the same subinterval are regarded as having the same value, and the dominance rules are used to discard all but one of them. • The S(i)’s resulting from this elimination are used in the generation of S(i+1).
• 52. Unit III : Approximation Algorithms Fully Polynomial Time Approximations : Interval Partitioning : Analysis • The number of subintervals for each S(i) is at most ⌈n/ε⌉ + 1. • Therefore |S(i)| ≤ ⌈n/ε⌉ + 1, and Σ_{0 ≤ i ≤ n} |S(i)| = O(n²/ε) • The error introduced in each feasible assignment due to this elimination in S(i) is less than the subinterval length. This error propagates from S(1) to S(n), but it is only additive. It then follows that : • F*(I) – F^(I) ≤ Σ_{1 ≤ i ≤ n-1} Pi·ε/(n–1), and since Pi ≤ F*(I), |F*(I) – F^(I)| ≤ Σ_{1 ≤ i ≤ n-1} F*(I)·ε/(n–1), and hence |F*(I) – F^(I)| / F*(I) ≤ (ε/(n–1)) Σ_{1 ≤ i ≤ n-1} 1 = ε
• 53. Unit III : Approximation Algorithms Fully Polynomial Time Approximations : Interval Partitioning : Example n = 5, m = 1112, (p1, p2, p3, p4, p5) = (1, 2, 10, 100, 1000) (w1, w2, w3, w4, w5) = (1, 2, 10, 100, 1000) Since pi = wi, we shall retain only one value per tuple. Solution : Optimal solution i.e. F*(I) S(0) = {0} S(1) = {0, 1} S(2) = {0, 1, 2, 3} S(3) = {0, 1, 2, 3, 10, 11, 12, 13} S(4) = {0, 1, 2, 3, 10, 11, 12, 13, 100, 101, 102, 103, 110, 111, 112, 113} S(5) = {0, 1, 2, 3, 10, 11, 12, 13, 100, 101, 102, 103, 110, 111, 112, 113, 1000, 1001, 1002, 1003, 1010, 1011, 1012, 1013, 1100, 1101, 1102, 1103, 1110, 1111, 1112} Optimal Solution = F*(I) = 1112 i.e. (0, 1, 1, 1, 1)
  • 54. Unit III : Approximation Algorithms Fully Polynomial Time Approximations : Interval Partitioning : Example contd… n = 5, m = 1112, (p1, p2, p3 , p4 , p5) = (1, 2, 10, 100, 1000) (w1,w2,w3 , w4 ,w5) = (1, 2, 10, 100, 1000) Solution : Rounding Method LB = max pi = 1000 & ε = 1/10, qi = pi – rem(pi, (LB. ε)/n), I’ = (q1, q2, q3 , q4 , q5) = (0, 0, 0, 100, 1000) (w1,w2,w3 , w4 ,w5) = (1, 2, 10, 100, 1000) S(0) = {(0, 0)} S(1) = {(0, 0)} S(2) = {(0, 0)} S(3) = {(0, 0)} S(4) = {(0, 0), (100, 100)} S(5) = {(0, 0), (100, 100), (1000, 1000),(1100, 1100)} Optimal Solution = F*(I’) = 1100 i.e. (0,0,0,1,1) |F*(I) – F*(I’)| / F*(I) = 12 / 1112 < 0.011 < 0.1
• 55. Unit III : Approximation Algorithms Fully Polynomial Time Approximations : Interval Partitioning : Example contd… n = 5, m = 1112, (p1, p2, p3, p4, p5) = (1, 2, 10, 100, 1000) (w1, w2, w3, w4, w5) = (1, 2, 10, 100, 1000) Solution : Interval partitioning LB = max pi = 1000 & ε = 1/10. We can start with a subinterval size of LB·ε/(n–1) = 1000·0.1/4 = 25. The subintervals are [0, 25), [25, 50), [50, 75), [75, 100), [100, 125), … and so on. Using interval partitioning, S(0) = S(1) = S(2) = S(3) = {0} S(4) = {0, 100} S(5) = {0, 100, 1000, 1100} Approximate Solution = F^(I) = 1100 i.e. (0, 0, 0, 1, 1) |F*(I) – F^(I)| / F*(I) = 12 / 1112 < 0.011 < 0.1
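Interval partitioning on this instance can be sketched as follows, using the fixed subinterval width LB·ε/(n−1) = 25 from the slide and keeping the smallest value in each subinterval as its representative; since pi = wi a single number per tuple suffices:

```python
def interval_partition_knapsack(p, m, width):
    """p[i] = w[i]; after each stage keep one representative (the
    smallest value) per profit subinterval of the given width."""
    S = [{0}]
    for pi in p:
        merged = S[-1] | {v + pi for v in S[-1] if v + pi <= m}
        buckets = {}
        for v in sorted(merged):
            buckets.setdefault(v // width, v)  # first (smallest) value wins
        S.append(set(buckets.values()))
    return S

S = interval_partition_knapsack([1, 2, 10, 100, 1000], m=1112, width=25)
best = max(S[-1])
```

The run reproduces S(4) = {0, 100}, S(5) = {0, 100, 1000, 1100} and the approximate value 1100, within ε = 0.1 of the optimum 1112.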
• 56. Unit III : Approximation Algorithms Fully Polynomial Time Approximations : Separation : • Assume that in solving a problem instance I, we have obtained an S(i) = {0, 3.9, 4.1, 7.8, 8.2, 11.9, 12.1}. Further assume that the interval size Pi·ε/(n–1) is 2. • Then the subintervals are [0, 2), [2, 4), [4, 6), [6, 8), [8, 10), [10, 12) and [12, 14). Each feasible solution value falls in a different subinterval, and so no feasible assignments are eliminated. • However, there are three pairs of assignments whose values are within Pi·ε/(n–1) = 2 of each other. If the dominance rules are used for each such pair, only four assignments remain, and the error introduced is at most the interval size Pi·ε/(n–1) = 2.
• 57. Unit III : Approximation Algorithms Fully Polynomial Time Approximations : Separation : Algorithm • Let a0, a1, a2, … , ar be the distinct values in S(i), with a0 < a1 < a2 < … < ar. • Construct a new set J from S(i) by making a left-to-right scan and retaining a tuple only if its value exceeds the value of the last tuple in J by more than Pi·ε/(n–1). Algorithm Separation (a, J, r) { J = assignment corresponding to a0; XP = a0; for j = 1 to r do { if (aj > XP + Pi·ε/(n–1)) then { J = J U {assignment corresponding to aj}; XP = aj; } } }
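The separation scan is a one-pass filter; a minimal sketch (tracking only the values, not full assignments), applied to the S(i) of the worked example with interval size 2:

```python
def separation(values, delta):
    """Left-to-right scan over sorted distinct values: keep a value
    only if it exceeds the last kept value by more than delta."""
    kept = [values[0]]
    for a in values[1:]:
        if a > kept[-1] + delta:
            kept.append(a)
    return kept

kept = separation([0, 3.9, 4.1, 7.8, 8.2, 11.9, 12.1], 2)
```

Exactly one member of each close pair survives, leaving the four assignments mentioned above.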
• 58. Unit III : Approximation Algorithms Probabilistically Good Algorithms : • The approximation algorithms discussed so far have the nice property that their worst case performance can be bounded by some constant (k in the case of absolute approximation and ε in the case of an ε-approximation). • The requirement of bounded performance tends to categorize other algorithms, whose worst-case performance is unbounded but which usually work well, as being bad. • Some algorithms with unbounded performance may in fact almost always solve the problem exactly, or generate a solution that is exceedingly close in value to an optimal solution. • Such algorithms are good in a probabilistic sense : if we pick a problem instance I at random, then there is a very high probability that the algorithm will generate a very good approximate solution.
• 59. Unit III : Approximation Algorithms Probabilistically Good Algorithms : Example : To find a Hamiltonian cycle in an undirected graph. Algorithm : • First an arbitrary vertex (say vertex 1) is chosen as the starting vertex. • The algorithm maintains a simple path P starting at vertex 1 and ending at vertex k. Initially P is the trivial path with k = 1, i.e. there are no edges in P. • At each iteration an attempt is made to increase the length of P by selecting an edge (k, j), incident on the endpoint k, arbitrarily. The following possibilities are considered :
• 60. Unit III : Approximation Algorithms Probabilistically Good Algorithms : Example contd …: Algorithm contd…. : 1. j = 1 and path P includes all the vertices of the graph : a Hamiltonian cycle has been found and the algorithm terminates. 2. j is not on the path P : the length of P is increased by adding (k, j) to it, and j becomes the new endpoint of P. 3. j is already on the path P : there is a unique edge e = (j, m) in P such that deleting e from P and adding (k, j) to P results in a simple path. Edge e is deleted and (k, j) is added; P is now a simple path with endpoint m. • This algorithm does not always find a Hamiltonian cycle in a graph that contains such a cycle. • Time Complexity : O(n²)
• 61. Unit III : Approximation Algorithms Probabilistically Good Algorithms : Example contd …: Let us try the above algorithm on the following five-vertex graph. [figure: five-vertex graph on vertices 1–5] Edges Selected : 1-4 (path : 1-4), 4-5 (path : 1-4-5), 1-5 (path : 1-5-4), 4-3 (path : 1-5-4-3), 3-2 (path : 1-5-4-3-2), 2-1 (path : 1-5-4-3-2-1, a Hamiltonian cycle)
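The rotation procedure can be sketched as below. The edge set is reconstructed from the trace above, and the neighbour ordering is my assumption, chosen so that the "arbitrary" edge selections reproduce the slide's sequence of paths:

```python
def find_hamiltonian_cycle(adj, start, max_steps=100):
    """Rotation heuristic: grow a simple path from `start`; when the
    chosen neighbour j is already on the path, rotate (delete the
    unique edge (j, m) on P, add (k, j)) so m becomes the endpoint.
    Heuristic only: may fail even when a Hamiltonian cycle exists."""
    path = [start]
    n = len(adj)
    for _ in range(max_steps):
        k = path[-1]
        prev = path[-2] if len(path) > 1 else None
        # "arbitrary" choice: first neighbour whose edge (k, j) is not on P
        j = next(v for v in adj[k] if v != prev)
        if j == start and len(path) == n:
            return path                          # case 1: Hamiltonian cycle
        if j not in path:
            path.append(j)                       # case 2: extend the path
        else:
            i = path.index(j)                    # case 3: rotate at (j, path[i+1])
            path = path[:i + 1] + path[i + 1:][::-1]
    return None

# Edge set and neighbour order assumed so the run matches the slide's trace.
adj = {1: [4, 2, 5], 2: [1, 3], 3: [2, 4], 4: [5, 3, 1], 5: [1, 4]}
cycle = find_hamiltonian_cycle(adj, start=1)
```

With this adjacency ordering the run visits exactly the paths 1-4, 1-4-5, 1-5-4, 1-5-4-3, 1-5-4-3-2 and then closes the cycle back to 1.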