CS424 – Design and
Analysis of Algorithms
Week #3
Saima Gul
GROWTH OF FUNCTIONS
05/19/25 18:14 2
Growth of Functions
The order of growth of running time (or of any resource usage)
provides
a simple characterization of an algorithm’s efficiency
a tool to compare the relative asymptotic performances of
algorithms
The exact running time can sometimes be derived; however, its
usefulness is limited and it is rarely worth the effort of computing,
because when the inputs get large, the lower-order terms and the
constant coefficients are all dominated by the highest-order term.
Usually, an algorithm that is asymptotically more efficient will be
a better choice for all but very small input sizes.
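The dominance of the highest-order term can be seen numerically. A small sketch, using a hypothetical running time f(n) = 3n² + 100n + 500 (the coefficients are made up for illustration):

```python
# Hypothetical running time; coefficients chosen only for illustration.
def f(n):
    return 3 * n**2 + 100 * n + 500

# The leading term 3n^2 accounts for nearly all of f(n) as n grows.
for n in [10, 100, 1000, 10000]:
    print(n, round(3 * n**2 / f(n), 4))
```

The printed ratio climbs toward 1, showing the lower-order terms becoming negligible.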
We will introduce the asymptotic notations formally…
Asymptotic Notation
We will define the asymptotic running time of
algorithms using a function T(n) whose domain is
the set of natural numbers N={0,1,2,…}.
Although the notation will be introduced only for
natural-number input sizes, it is sometimes
abused to apply to rational input sizes.
Other abuses of the notation will also occur.
It is therefore important to understand the
formal definitions, so that such abuses do not
turn into misuses.
O-notation (upper bounds)
f(n)=O(g(n)) if there exist positive constants c
and n0 such that, for all n ≥ n0, 0 ≤ f(n) ≤ cg(n)
For example, 2n² = O(n³) [take c=1, n0=2]
g(n) bounds f(n) from above.
The rate of growth of f(n) is at most the same
as the rate of growth of g(n).
Note that, for asymptotic notations,
= is a funny, one-way equality.
Think of O(g(n)) as a set of functions.
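The witness pair from the example above can be checked mechanically; this is only a finite spot-check over a range, not a proof:

```python
# Spot-check the witnesses (c=1, n0=2) for 2n^2 = O(n^3):
# 0 <= 2n^2 <= c*n^3 should hold for every n >= n0.
c, n0 = 1, 2
assert all(0 <= 2 * n**2 <= c * n**3 for n in range(n0, 1000))
```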
Formal Definition of O-notation
O(g(n)) = { f(n) : ∃ c, n0 > 0 such that ∀ n ≥ n0, 0 ≤ f(n) ≤ c·g(n) }
Therefore, O(g(n)) actually denotes a set of
functions
When we write
2n² = O(n³)
we actually mean
2n² ∈ O(n³)
Some examples on O-notation
Example 1: Let f(n) = 5n+24,
then f(n) = O(n) [take c=6, n0=24]
[ 0 ≤ 5n+24 ≤ 6n for all n ≥ 24 ]
Example 2: Let f(n) = 5n+24,
then f(n) = O(n²) [take c=1, n0=8]
[ 0 ≤ 5n+24 ≤ n² for all n ≥ 8 ]
In general, for any polynomial f(n) = Σ_{i=0}^{d} aᵢ·nⁱ,
where the aᵢ's are constants and a_d > 0,
f(n) = O(n^d)
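For a concrete instance of the polynomial rule, take the hypothetical f(n) = 2n³ + 5n² + 7; summing the coefficients gives a valid witness c = 14 with n0 = 1, since each term aᵢ·nⁱ ≤ aᵢ·n³ once n ≥ 1. A finite spot-check, not a proof:

```python
# Hypothetical polynomial; c = 2 + 5 + 7 = 14 works because each
# term a_i * n^i <= a_i * n^3 once n >= 1.
def f(n):
    return 2 * n**3 + 5 * n**2 + 7

c, n0 = 14, 1
assert all(0 <= f(n) <= c * n**3 for n in range(n0, 2000))
```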
O-notation provides an upper bound
If the worst case running time of an algorithm
is O(g(n)), then any running time of the
algorithm is also O(g(n)).
For example, consider Insertion sort:
Worst case: T(n) = d1·n² + d2·n + d3 = O(n²)
Average case: T(n) = b1·n² + b2·n + b3 = O(n²)
Best case: T(n) = (a1+a2)·n + a3 - a2 = O(n²)
O-notation provides an easy way to find the worst-case running time of algorithms
Insertion-Sort(A) {
  for (j=2; j≤n; j=j+1) {        // outer loop iterates n-1 times
    num = A[j];
    i = j-1;
    // find the correct place for num
    while (i>0 and A[i]>num) {   // iterates at most n times per pass, so O(n²) overall
      A[i+1] = A[i];
      i = i-1;
    }
    A[i+1] = num;
  }
}
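The pseudocode above can be transcribed into runnable Python (0-indexed, so the outer loop starts at index 1 instead of 2):

```python
def insertion_sort(a):
    """Sort list a in place and return it."""
    for j in range(1, len(a)):        # outer loop: n-1 passes
        num = a[j]
        i = j - 1
        # shift larger elements right to find the correct place for num
        while i >= 0 and a[i] > num:  # at most j shifts per pass -> O(n^2) worst case
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = num
    return a
```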
o-notation (upper bounds that are not tight)
Bounds given by O-notation may or may not
be tight:
5n² = O(n²) (a tight upper bound)
3n = O(n²) (a loose upper bound)
o-notation provides upper bounds that are not
tight.
When f(n)=o(g(n)), g(n) bounds f(n) from
above.
The rate of growth of f(n) is strictly less than
the rate of growth of g(n).
Formal Definition of o-notation
Recall: O(g(n)) = { f(n) : ∃ c, n0 > 0 such that ∀ n ≥ n0, 0 ≤ f(n) ≤ c·g(n) }
o(g(n)) = { f(n) : ∀ c > 0, ∃ n0 > 0 s.t. ∀ n ≥ n0, 0 ≤ f(n) < c·g(n) }
Equivalently, o(g(n)) = { f(n) : ∀ c > 0, ∃ n0 > 0 s.t. ∀ n ≥ n0, 0 ≤ c·f(n) < g(n) }
Intuitively, f(n)=o(g(n)) if as n gets bigger and
bigger, f(n) becomes insignificantly small as
compared to g(n).
Examples for o-notation
3n = o(n²)
3n < cn²
[need to show how to pick n0 for any given c]
5n² ≠ o(n²)
since for c < 5, we cannot find n0 such that for
all n > n0: 5n² < cn²
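For 3n = o(n²), the inequality 3n < cn² holds exactly when n > 3/c, so n0 = ⌊3/c⌋ + 1 works for any given c. A spot-check over a finite range (the c values are arbitrary samples):

```python
import math

# For each sample c, n0 = floor(3/c) + 1 guarantees 3n < c*n^2
# for all n >= n0, since 3n < c*n^2 iff n > 3/c.
for c in [10, 1, 0.5, 0.25]:
    n0 = math.floor(3 / c) + 1
    assert all(3 * n < c * n**2 for n in range(n0, n0 + 1000))
```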
Ω-notation (lower bounds)
Ω-notation provides asymptotic lower bounds.
f(n)= Ω(g(n)) if there exist positive constants c and
n0 such that, for all n ≥ n0, 0 ≤ cg(n) ≤ f(n)
Ω(g(n)) = { f(n) : ∃ c, n0 > 0 s.t. ∀ n ≥ n0, 0 ≤ c·g(n) ≤ f(n) }
For example, 2n³ = Ω(n²) [take c=1, n0=1]
g(n) bounds f(n) from below.
The rate of growth of f(n) is at least the same as the
rate of growth of g(n).
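The Ω witnesses from the example can also be spot-checked over a finite range (not a proof):

```python
# Spot-check the witnesses (c=1, n0=1) for 2n^3 = Ω(n^2):
# 0 <= c*n^2 <= 2n^3 should hold for every n >= n0.
c, n0 = 1, 1
assert all(0 <= c * n**2 <= 2 * n**3 for n in range(n0, 1000))
```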
Ω-notation provides a lower bound
If the best case running time of an algorithm
is Ω(g(n)), then any running time of the
algorithm is also Ω(g(n)).
For example, consider Insertion sort:
Best case: T(n) = (a1+a2)·n + a3 - a2 = Ω(n)
Average case: T(n) = b1·n² + b2·n + b3 = Ω(n)
Worst case: T(n) = d1·n² + d2·n + d3 = Ω(n)
ω-notation (lower bounds that are not tight)
Bounds given by Ω-notation may or may not
be tight:
5n² = Ω(n²) (a tight lower bound)
5n² = Ω(n) (a loose lower bound)
ω-notation provides lower bounds that are
not tight.
When f(n)= ω(g(n)), g(n) bounds f(n) from
below.
The rate of growth of f(n) is strictly more than
the rate of growth of g(n).
Formal definition of ω-notation
One way of defining ω is to say
f(n) = ω(g(n)) iff g(n) = o(f(n))
However, by using our set notation:
ω(g(n)) = { f(n) : ∀ c > 0, ∃ n0 > 0 s.t. ∀ n ≥ n0, 0 ≤ c·g(n) < f(n) }
Θ-notation (tight bounds)
Θ(g(n)) = { f(n) : ∃ c1, c2, n0 > 0 s.t. ∀ n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }
If f(n)=Θ(g(n)), then g(n) is said to be an asymptotic
tight bound for f(n).
If f(n)=Θ(g(n)), the rates of growth of f(n) and g(n) are
the same.
Example:
5n² = Θ(n²)
5n² ≠ Θ(n)
5n² ≠ Θ(n³)
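For 5n² = Θ(n²), the witnesses c1=4, c2=6, n0=1 (one valid choice among many) sandwich f(n) between two multiples of g(n). A finite spot-check:

```python
# Spot-check (c1=4, c2=6, n0=1) for 5n^2 = Θ(n^2).
c1, c2, n0 = 4, 6, 1
assert all(c1 * n**2 <= 5 * n**2 <= c2 * n**2 for n in range(n0, 1000))

# By contrast, no c1 > 0 keeps c1*n^3 <= 5n^2 for all large n,
# so 5n^2 != Θ(n^3); e.g. with c1 = 1 the sandwich already fails at n = 6.
assert not (1 * 6**3 <= 5 * 6**2)
```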
Some properties of asymptotic notations
f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n))
f(n) = Θ(g(n)) iff g(n) = Θ(f(n))
f(n) = O(g(n)) iff g(n) = Ω(f(n))
f(n) = o(g(n)) iff g(n) = ω(f(n))
f(n) = Θ(f(n)), f(n) = O(f(n)), f(n) = Ω(f(n))
Some more …
f(n) = Θ(g(n)) and g(n) = Θ(h(n)) implies f(n) = Θ(h(n))
and this still holds when Θ above is replaced
by any other asymptotic notation we have
introduced (O,Ω,o,ω).
Some Complexity Classes
Θ(1): constant time
Θ(lg n): logarithmic time
Θ(n): linear time
Θ(n lg n): loglinear time
Θ(n²): quadratic time
Θ(n³): cubic time
Θ(2ⁿ): exponential time
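Tabulating these classes at a few sample input sizes shows how quickly they separate; a small sketch:

```python
import math

# Value of each class's defining function at n = 10, 20, 40.
print("n    lg n   n lg n   n^2    n^3      2^n")
for n in [10, 20, 40]:
    print(n, round(math.log2(n), 2), round(n * math.log2(n)),
          n**2, n**3, 2**n)
```

At n = 40 the exponential entry already exceeds a trillion while the quadratic is only 1600.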