DESIGN AND ANALYSIS
OF ALGORITHMS
Comparing time efficiency
We measure time efficiency only up to an order of
magnitude
Constant factors are ignored
How do we compare functions with respect to
orders of magnitude?
Upper bounds, “big O”
t(n) is said to be O(g(n)) if we can find suitable
constants c and n0 so that cg(n) is an upper bound
for t(n) beyond n0:
t(n) ≤ cg(n) for every n ≥ n0
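
A minimal Python sketch of how one might sanity-check a candidate witness pair (c, n0) numerically; the name check_big_o and the cut-off n_max are illustrative only, since the definition quantifies over all n ≥ n0 and a finite check can never prove the bound, only expose a wrong choice of constants.

# Sketch: numerically sanity-check a Big O witness (c, n0) on a finite range.
# A finite check cannot prove t(n) is O(g(n)); it can only catch a bad witness.
def check_big_o(t, g, c, n0, n_max=10_000):
    """Return True if t(n) <= c*g(n) for every n with n0 <= n <= n_max."""
    return all(t(n) <= c * g(n) for n in range(n0, n_max + 1))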
Examples: Big O
100n + 5 is O(n²)
100n + 5
≤ 100n + n, for n ≥ 5
= 101n ≤ 101n², so n0 = 5, c = 101
Alternatively
100n + 5
≤ 100n + 5n, for n ≥ 1
= 105n ≤ 105n², so n0 = 1, c = 105
n0 and c are not unique!
Of course, by the same argument, 100n+5 is also O(n)
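
Using the illustrative check_big_o sketch above, both witness pairs from this example, as well as the O(n) claim, pass the finite check:

t = lambda n: 100 * n + 5

print(check_big_o(t, lambda n: n ** 2, c=101, n0=5))  # True: 100n + 5 <= 101n^2 for n >= 5
print(check_big_o(t, lambda n: n ** 2, c=105, n0=1))  # True: 100n + 5 <= 105n^2 for n >= 1
print(check_big_o(t, lambda n: n,      c=105, n0=1))  # True: 100n + 5 is also O(n)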
Examples: Big O
100n² + 20n + 5 is O(n²)
100n² + 20n + 5
≤ 100n² + 20n² + 5n², for n ≥ 1
= 125n²
n0 = 1, c = 125
What matters is the highest-order term:
20n + 5 is dominated by 100n²
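
The same finite check (again using the illustrative check_big_o from above) is consistent with the witness c = 125, n0 = 1:

t = lambda n: 100 * n ** 2 + 20 * n + 5

print(check_big_o(t, lambda n: n ** 2, c=125, n0=1))  # True: bounded by 125n^2 for n >= 1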
Examples: Big O
n³ is not O(n²)
No matter what c we choose, cn² is exceeded by n³
for every n > c, so no choice of n0 can work
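
A tiny numerical illustration of the argument, with c = 1000 chosen arbitrarily; it exhibits one failing n for that c, it is not a proof in itself:

c = 1000          # any fixed constant would do; 1000 is only for illustration
n = c + 1         # for n > c we have n^3 = n * n^2 > c * n^2
print(n ** 3 > c * n ** 2)   # True: the candidate bound c*n^2 already fails at n = c + 1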
Useful properties
If
f1(n) is O(g1(n))
f2(n) is O(g2(n))
then f1(n) + f2(n) is O(max(g1(n),g2(n)))
Proof
f1(n) ≤ c1g1(n) for all n > n1
f2(n) ≤ c2g2(n) for all n > n2
So f1(n) + f2(n) ≤ c1g1(n) + c2g2(n)
≤ (c1 + c2) max(g1(n), g2(n)) for all n > max(n1, n2)
Why is this important?
Algorithm has two phases
Phase A takes time O(gA(n))
Phase B takes time O(gB(n))
Algorithm as a whole takes time
O(max(gA(n), gB(n)))
For an algorithm with many phases, the least efficient
phase gives an upper bound for the whole algorithm
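
A minimal Python sketch of the two-phase idea, using a hypothetical task (detect duplicates by sorting and then scanning once) purely as an illustration: phase A costs O(n log n), phase B costs O(n), so the whole function is O(max(n log n, n)) = O(n log n).

def has_duplicates(values):
    """Illustrative two-phase algorithm: O(n log n) sort + O(n) scan = O(n log n) overall."""
    # Phase A: sorting, O(n log n) comparisons.
    ordered = sorted(values)
    # Phase B: one linear scan for equal neighbours, O(n) time.
    return any(a == b for a, b in zip(ordered, ordered[1:]))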
Lower bounds, Ω (omega)
t(n) is said to be Ω(g(n)) if we can find suitable
constants c and n0 so that cg(n) is a lower bound
for t(n) beyond n0:
t(n) ≥ cg(n) for every n ≥ n0
Lower bounds
n³ is Ω(n²)
n³ ≥ n² for all n ≥ 0
n0 = 0 and c = 1
Typically we establish lower bounds for problems
as a whole, not for individual algorithms
Sorting requires Ω(n log n) comparisons, no
matter how clever the algorithm is
Tight bounds, Θ (theta)
t(n) is Θ(g(n)) if it is both O(g(n)) and Ω(g(n))
Find suitable constants
c1, c2, and n0 so that
c2g(n) ≤ t(n) ≤ c1g(n)
for every n ≥ n0
Tight bounds
n(n-1)/2 is Θ(n²)
Upper bound
n(n-1)/2 = n²/2 - n/2 ≤ n²/2, for n ≥ 0
Lower bound
n(n-1)/2 = n²/2 - n/2 ≥ n²/2 - (n/2 × n/2) ≥ n²/4,
for n ≥ 2
Choose n0 = max(0,2) = 2, c1 = 1/2 and c2 = 1/4
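
As a finite numerical sanity check of both bounds (with names and the range chosen only for illustration), the constants c1 = 1/2, c2 = 1/4 and n0 = 2 can be tested directly:

t     = lambda n: n * (n - 1) // 2   # the quantity being bounded
lower = lambda n: n * n / 4          # c2 * g(n) with c2 = 1/4
upper = lambda n: n * n / 2          # c1 * g(n) with c1 = 1/2

print(all(lower(n) <= t(n) <= upper(n) for n in range(2, 10_001)))  # True from n0 = 2 on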
Summary
f(n) = O(g(n)) means g(n) is an upper bound for f(n)
Useful to describe an upper limit on the worst-case
running time of an algorithm
f(n) = Ω(g(n)) means g(n) is a lower bound for f(n)
Typically used for classes of problems, not
individual algorithms
f(n) = Θ(g(n)): matching upper and lower bounds
When an algorithm's upper bound matches the problem's lower
bound, the best possible algorithm (up to constant factors) has been found