DATA STRUCTURES
AND ALGORITHMS
Complexity Analysis/Algorithm Analysis
(Continue…)
What will be discussed
Asymptotic Analysis
Big-O notation
Omega notation
Theta notation
Asymptotic Analysis
The concept of asymptotic notation as a way to
classify the running time of algorithms into some
generic or broad classes or sets
(Continue…)
In the previous class, we saw how to calculate the
running-time expression for an algorithm using the
concept of a hypothetical model machine
So, why do we really want to classify the running time
of algorithms into these broad classes?
(Continue…)
Let’s say we have two algorithms
Algo1: T(n) = 5n^2 + 7
Algo2: T(n) = 17n^2 + 6n + 8
Suppose we deduced these expressions on a model machine
*We have already assumed this in the previous lecture
Now, these functions correspond to that model machine, but
we want some function or representation
which is true irrespective of the machine and still gives us
an idea about the rate of growth of the running time
(Continue…)
Now couples of things about time complexity
analysis, when we analyze time complexity we
analyze it for really very large input sizes
So want to analyze time taken when n tends to
infinite n then 17n +6n+8 will be insignificant in
8
first and second function respectively and the constant
multiplier corresponding n2 also be insignificant
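To make this concrete, here is a minimal C++ sketch (my illustration, not part of the lecture) that evaluates the two assumed running-time expressions for growing n and prints their ratio to n^2; the ratio settles near the constant multipliers 5 and 17, confirming that for large n both expressions grow essentially like n^2.

// Minimal sketch (illustration only): evaluate the two assumed running-time
// expressions from the lecture and compare each with the pure n^2 term.
#include <iostream>

int main() {
    const long long ns[] = {10, 1000, 100000};
    for (long long n : ns) {
        long long t1 = 5 * n * n + 7;           // Algo1: T(n) = 5n^2 + 7
        long long t2 = 17 * n * n + 6 * n + 8;  // Algo2: T(n) = 17n^2 + 6n + 8
        std::cout << "n = " << n
                  << "  T1/n^2 = " << static_cast<double>(t1) / (n * n)
                  << "  T2/n^2 = " << static_cast<double>(t2) / (n * n) << "\n";
    }
    return 0;
}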
(Continue…)
Now these two functions will have pretty much the
same rate of growth for very large values of n
So, irrespective of the machine used, we can say that
these two algorithms have something like a quadratic
rate of growth, which brings the two functions
into the same class
But we have a very formal way to classify functions
into classes in the form of asymptotic notation, where
we do not have constants in the expression, only the
variable terms, no constant terms
Asymptotic Notation
Let us first define these asymptotic notations
O- “big-oh” Notation
If we have a non-negative function g(n) that takes a non-negative
argument n, then O(g(n)) is defined as the set of
all functions f(n) for which there exist constants c
and n0 such that f(n) is less than or equal to c times g(n)
for all n >= n0
O(g(n)) = { f(n): there exist constants c and n0 such that
f(n) <= c*g(n) for all n >= n0 }
Example
Let
f(n) = 5n^2 + 2n + 1
g(n) = n^2
For n >= 1, 2n is never greater than 2n^2 and 1 is never greater
than n^2
So if we use c = 5 + 2 + 1 = 8, then f(n) <= 8n^2 for all n >= 1,
with n0 = 1
We can say that f(n) is an element of the set O(n^2)
f(n) ∈ O(n^2), or f(n) = O(n^2)
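As a quick numeric sanity check (a small sketch I am adding; the algebraic argument above is the actual proof), the chosen witness c = 8, n0 = 1 can be tested in C++ for a range of n:

// Minimal sketch (illustration only): numerically check the Big-O witness
// c = 8, n0 = 1 for f(n) = 5n^2 + 2n + 1 and g(n) = n^2.
#include <cassert>
#include <iostream>

int main() {
    for (long long n = 1; n <= 1000000; ++n) {
        long long f  = 5 * n * n + 2 * n + 1;
        long long cg = 8 * n * n;   // c * g(n) with c = 8
        assert(f <= cg);            // f(n) <= c*g(n) for every tested n >= n0 = 1
    }
    std::cout << "f(n) <= 8*n^2 held for all tested n in [1, 10^6]\n";
    return 0;
}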
(Continue..)
Now plot the two functions on a graph
After n = 1, which is our n0, c*g(n) is always greater than the function
f(n)
So this assures that f(n) never grows at a rate faster than c*g(n)
after n0
So, in this way, the Big-oh notation gives us an
upper bound on the rate of growth of a function, and in time
complexity analysis this is useful
Another important thing: c and n0 can be chosen differently
for the same functions f(n) and g(n) and the condition may still be
valid; for example, c = 6 and n0 = 3 also work, since 5n^2 + 2n + 1 <= 6n^2 for all n >= 3
Ω-Omega Notation
Ω(g(n)) = { f(n): there exist constants c and n0 such that
f(n) >= c*g(n) for all n >= n0 }
Let’s take the same example:
f(n) = 5n^2 + 2n + 1
g(n) = n^2
If we take c = 5 and n0 = 0, then clearly
5n^2 <= f(n) for all n >= 0, so we can say f(n) is Ω(n^2)
The graph shows that c*g(n) never exceeds f(n) for all n >= n0
So, the Omega notation gives us a lower bound on the rate of growth of a function
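The lower bound can be checked numerically as well (again just an illustrative sketch, mirroring the Big-O check with the inequality reversed):

// Minimal sketch (illustration only): check the Omega witness c = 5, n0 = 0,
// i.e. 5*n^2 <= f(n), which holds because the dropped terms 2n + 1 are non-negative.
#include <cassert>
#include <iostream>

int main() {
    for (long long n = 0; n <= 1000000; ++n) {
        long long f = 5 * n * n + 2 * n + 1;
        assert(5 * n * n <= f);     // c*g(n) <= f(n) for every tested n >= n0 = 0
    }
    std::cout << "5*n^2 <= f(n) held for all tested n in [0, 10^6]\n";
    return 0;
}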
Θ-Theta Notation
Θ(g(n)) = { f(n): there exist constants c1, c2 and n0 such that
c1*g(n) <= f(n) <= c2*g(n) for all n >= n0 }
Let’s take the same example:
f(n) = 5n^2 + 2n + 1
g(n) = n^2
If we take c1 = 5, c2 = 8 and n0 = 1, then our inequality holds (c1 = 5 is the Omega constant, c2 = 8 is the Big-O constant, and n0 = 1 is the larger of the two thresholds) and we can say f(n) is Θ(n^2)
So, the Theta notation gives us a tight bound on the rate of growth of a function
Conclusion
In this Lecture we discussed…
What is an asymptotic analysis?
What are the Big-O, Omega and Theta notations?
ANY QUERIES?
Practice Questions
Question #8
Question #9
Question #10
Book: “Data Structures and Algorithms in C++”, Adam Drozdek, 4th Edition