Asymptotic Notations
MD. MUKTAR HOSSAIN
LECTURER
DEPT. OF CSE
VARENDRA UNIVERSITY
Introduction
Asymptotic Notations are mathematical tools used to analyze the performance
of algorithms by understanding how their efficiency changes as the input size
grows.
There are mainly three asymptotic notations:
◦ Big-O Notation (O-notation) ---- Worst Case
◦ Omega Notation (Ω-notation) ---- Best Case
◦ Theta Notation (Θ-notation) ---- Average Case
Big-O Notation (O-notation)
Big-O gives an upper bound on the running time of an algorithm. Therefore, it describes the worst-case complexity of an algorithm.
O(g(n)) = { f(n): there exist positive constants c and n0 such that 0 ≤ f(n) ≤ cg(n)
for all n ≥ n0 }
Example: Let us consider the function f(n) = 4n³ + 10n² + 5n + 1.
Considering g(n) = n³,
f(n) ≤ 5g(n) for all values of n > 10.
Hence, the complexity of f(n) can be represented as O(g(n)), i.e. O(n³).
Omega Notation (Ω-Notation)
Omega gives a lower bound on the running time of an algorithm. Thus, it describes the best-case complexity of an algorithm.
Ω(g(n)) = { f(n): there exist positive constants c and n0 such that 0 ≤ cg(n) ≤ f(n)
for all n ≥ n0 }
Example: Let us consider the function f(n) = 4n³ + 10n² + 5n + 1.
Considering g(n) = n³,
f(n) ≥ 4g(n) for all values of n ≥ 0.
Hence, the complexity of f(n) can be represented as Ω(g(n)), i.e. Ω(n³).
Theta Notation (Θ-Notation)
Theta gives both an upper and a lower bound on the running time of an algorithm, i.e. a tight bound; it is often used when analyzing the average-case complexity of an algorithm.
Θ (g(n)) = {f(n): there exist positive constants c1, c2 and n0 such that 0 ≤ c1 * g(n)
≤ f(n) ≤ c2 * g(n) for all n ≥ n0}
Example: Let us consider the function f(n) = 4n³ + 10n² + 5n + 1.
Considering g(n) = n³,
4g(n) ≤ f(n) ≤ 5g(n) for all large values of n.
Hence, the complexity of f(n) can be represented as Θ(g(n)), i.e. Θ(n³).
Insertion Sort Algorithm
Insertion-Sort(A)
  for j = 2 to A.length do
    key = A[j]
    // insert A[j] into the sorted sequence A[1..j-1]
    i = j - 1
    while i > 0 and A[i] > key do
      A[i+1] = A[i]
      i = i - 1
    end
    A[i+1] = key
  end
Analysis of The Insertion Sort Algorithm
Insertion-Sort(A)                                cost   times
1. for j = 2 to A.length do                      c1     n
2.   key = A[j]                                  c2     n - 1
3.   i = j - 1                                   c3     n - 1
4.   while i > 0 and A[i] > key do               c4     Σ_{j=2..n} t_j
5.     A[i+1] = A[i]                             c5     Σ_{j=2..n} (t_j - 1)
6.     i = i - 1                                 c6     Σ_{j=2..n} (t_j - 1)
7.   end
8.   A[i+1] = key                                c7     n - 1
9. end
Here t_j denotes the number of times the while-loop test in line 4 is executed for that value of j.
Analysis of The Insertion Sort Algorithm
(Best Case)
Now if we represent the running time of insertion sort as a function T(n) of the input size n, then the definition of T(n) becomes,
T(n) = c1·n + c2(n-1) + c3(n-1) + c4·Σ_{j=2..n} t_j + c5·Σ_{j=2..n} (t_j - 1) + c6·Σ_{j=2..n} (t_j - 1) + c7(n-1)
If the input sequence is already sorted, t_j = 1 for j = 2, ..., n, so the (t_j - 1) sums vanish and
T(n) = c1·n + (c2 + c3 + c4 + c7)(n - 1) = a·n + b
which is a linear function of the input size n.
Analysis of The Insertion Sort Algorithm
(Worst Case)
If the array is reverse sorted, then t_j = j for j = 2, ..., n.
We know Σ_{j=2..n} j = n(n+1)/2 - 1,
and Σ_{j=2..n} (j - 1) = n(n-1)/2.
Then T(n) becomes,
T(n) = c1·n + c2(n-1) + c3(n-1) + c4(n(n+1)/2 - 1) + c5·n(n-1)/2 + c6·n(n-1)/2 + c7(n-1)
We notice that this is of the form a·n² + b·n + c. Thus T(n) is a quadratic function of the input size n.
Properties of Asymptotic Notations
General properties
◦ If f(n) is O(g(n)), then a*f(n) is O(g(n)) for any constant a > 0.
◦ Likewise, if f(n) is Ω(g(n)), then a*f(n) is Ω(g(n)).
Reflexive
◦ If f(n) is given, then f(n) is O(f(n)).
Transitive
◦ If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) = O(h(n)).
◦ Example: f(n) = n, g(n) = n², h(n) = n³; then n = O(n²) and n² = O(n³), so n = O(n³).
Symmetric
◦ If f(n) is Θ(g(n)), then g(n) is Θ(f(n)).
◦ Example: suppose f(n) = n² and g(n) = n²; then f(n) = Θ(n²) and g(n) = Θ(n²).
Transpose symmetric
◦ If f(n) = O(g(n)), then g(n) is Ω(f(n)).
◦ Example: f(n) = n and g(n) = n²; then n is O(n²) and n² is Ω(n).
Common Asymptotic Notations
Name Notation
Constant O(1)
Logarithmic O(log n)
Linear O(n)
Linearithmic, Log-linear O(n log n)
Quadratic O(n²)
Polynomial O(nᶜ)
Exponential O(cⁿ); where c > 1
Factorial O(n!)
Comparison of Complexities
If f1(n) = 3n log n, f2(n) = n², f3(n) = 2ⁿ, then what is their ordering by complexity, in ascending order?
Answer: f1 < f2 < f3.
If f1(n) = 3n log n, f2(n) = n², f3(n) = 2ⁿ, f4 = 2¹⁰, then what is their ordering by complexity, in descending order?
Answer: f3 > f2 > f1 > f4.
Suppose f1(n) = O(n) and f2(n) = O(log n). Is the statement “f2 always runs faster than f1” true or false?
Answer: Home Work.
Which one is better between f1(n) = 10⁵ and f2(n) = 5¹⁰ in terms of algorithmic complexity?
Answer: Equal; both are constants, O(1).
Time Complexity Analysis
int a = 0, b = 0;
for (i = 0; i < N; i++) {
    a = a + rand();
}
for (j = 0; j < M; j++) {
    b = b + rand();
}
i. O(N * M) time, O(1) space
ii. O(N + M) time, O(N + M) space
iii. O(N + M) time, O(1) space
iv. O(N * M) time, O(N + M) space
int a = 0;
for (i = 0; i < N; i++) {
    for (j = N; j > i; j--) {
        a = a + i + j;
    }
}
i. O(N)
ii. O(N*log(N))
iii. O(N * Sqrt(N))
iv. O(N*N)
Time Complexity Analysis Cont’d
int i, j, k = 0;
for (i = n / 2; i <= n; i++) {
    for (j = 2; j <= n; j = j * 2) {
        k = k + n / 2;
    }
}
i. O(n)
ii. O(n log n)
iii. O(n²)
iv. O(n² log n)
int a = 0, i = N;
while (i > 0) {
    a += i;
    i /= 2;
}
i. O(N)
ii. O(Sqrt(N))
iii. O(N / 2)
iv. O(log N)
Time Complexity Analysis Cont’d
for (int i = 1; i < n; i++) {
    i *= k;
}
i. O(n)
ii. O(k)
iii. O(logₖ n)
iv. O(logₙ k)
int value = 0;
for (int i = 0; i < n; i++)
    for (int j = 0; j < i; j++)
        value += 1;
i. n
ii. (n+1)
iii. n(n-1)/2
iv. n(n+1)
Thank You