Complexity analysis
Asymptotic Notations in Complexity Analysis:
1. Big O Notation
Big-O notation represents an upper bound on the running time
of an algorithm, so it describes the worst-case complexity.
Writing f(n) = O(g(n)) means the running time grows no faster
than g(n) up to a constant factor: there exist positive constants
c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0. It is the
standard tool for quantifying algorithm performance.
• 2. Omega Notation
• Omega notation represents the lower bound of the
running time of an algorithm. Thus, it provides the best-
case complexity of an algorithm: no input can make the
algorithm finish faster than this bound, up to a constant
factor.
• Graphical Representation
• Mathematical representation of Omega notation:
• Ω(g(n)) = { f(n) : there exist positive constants c and n0
such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }
Note: Ω(g) is a set of functions.
• Little o Asymptotic Notation
• Big-O gives an upper bound on the growth of an
algorithm's running time f(n); that bound may happen to
be tight, but nothing in the definition requires it.
Little-o (ο()) notation describes an upper bound that
cannot be tight: f(n) = o(g(n)) means f grows strictly
slower than g, i.e. f(n)/g(n) → 0 as n → ∞.
• Graphical Representation
• for (int i = 0; i < n; i++) {
      cout << "GeeksForGeeks" << endl;
  }
• In the above example, the loop executes n times, printing
"GeeksForGeeks" n times, so the time taken to run this
program is:
• T(n) = n * t(cout statement)
       = n * O(1)
       = O(n), linear complexity.
• for (int i = 0; i < n; i++) {
      for (int j = 0; j < m; j++) {
          cout << "GeeksForGeeks" << endl;
      }
  }
• In the above example, the cout statement executes n*m times, printing
"GeeksForGeeks" n*m times, so the time taken to run this program is:
• T(n) = n * m * t(cout statement)
       = n * m * O(1)
       = O(n*m), which is quadratic when n = m.
Worst-case time complexities of common data structures:

Data structure       Access     Search     Insertion  Deletion
Array                O(1)       O(N)       O(N)       O(N)
Stack                O(N)       O(N)       O(1)       O(1)
Queue                O(N)       O(N)       O(1)       O(1)
Singly Linked List   O(N)       O(N)       O(N)       O(N)
Doubly Linked List   O(N)       O(N)       O(1)       O(1)
Hash Table           O(N)       O(N)       O(N)       O(N)
Binary Search Tree   O(N)       O(N)       O(N)       O(N)
AVL Tree             O(log N)   O(log N)   O(log N)   O(log N)
Binary Tree          O(N)       O(N)       O(N)       O(N)
Red-Black Tree       O(log N)   O(log N)   O(log N)   O(log N)
Algorithm                       Complexity
1. Linear Search                O(N)
2. Binary Search                O(log N)
3. Bubble Sort                  O(N^2)
4. Insertion Sort               O(N^2)
5. Selection Sort               O(N^2)
6. QuickSort                    O(N^2) worst case
7. Merge Sort                   O(N log N)
8. Counting Sort                O(N)
9. Radix Sort                   O((n+b) * log_b(k))
10. Sieve of Eratosthenes       O(n log(log n))
11. KMP Algorithm               O(N)
12. Z Algorithm                 O(M+N)
13. Rabin-Karp Algorithm        O(N*M)
14. Johnson's Algorithm         O(V^2 log V + V*E)
15. Prim's Algorithm            O(V^2)
16. Kruskal's Algorithm         O(E log V)
17. 0/1 Knapsack                O(N*W)
18. Floyd-Warshall Algorithm    O(V^3)
19. Breadth First Search        O(V+E)
20. Depth First Search          O(V+E)