Time Complexity Quick Guide
short trick to calculate time complexity of any code
Here’s a simple and short trick to quickly estimate the time complexity of most code blocks:
⚡️ SHORT TRICK TO CALCULATE TIME COMPLEXITY
🔁 1. Loops
A single loop from 1 to n = O(n)
A loop inside another (nested) = O(n²) (if both run up to n)
A loop like i = i * 2 (doubling each time) = O(log n)
A loop like i = i / 2 (halving each time) = O(log n)
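For example, here's a quick Python sketch of that doubling loop (the helper name is just for illustration) that counts its iterations:

```python
# Count how many times a doubling loop runs: roughly log2(n) iterations.
def doubling_loop_steps(n):
    steps = 0
    i = 1
    while i < n:       # i doubles each pass: 1, 2, 4, 8, ...
        i *= 2
        steps += 1
    return steps

print(doubling_loop_steps(1024))  # 10, because 2^10 = 1024 -> O(log n)
```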
🔄 2. Loop with Function Calls
If a loop runs n times and calls a function of complexity O(log n), total = O(n log n)
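A rough sketch of that pattern, assuming Python's standard `bisect` module for the O(log n) lookup:

```python
import bisect

# n iterations, each doing an O(log n) binary search -> O(n log n) overall.
def count_lookups(sorted_data, queries):
    hits = 0
    for q in queries:                              # runs n times
        idx = bisect.bisect_left(sorted_data, q)   # O(log n) per call
        if idx < len(sorted_data) and sorted_data[idx] == q:
            hits += 1
    return hits

print(count_lookups([1, 3, 5, 7, 9], [3, 4, 9]))  # 2
```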
📦 3. Recursive Functions
Basic recursion (e.g., Fibonacci): Exponential = O(2ⁿ)
Divide and conquer (e.g., Merge Sort):
Recurrence = T(n) = 2T(n/2) + O(n) → O(n log n)
If T(n) = T(n/2) + O(1) → O(log n)
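To make the exponential case concrete, here's a minimal sketch of naive recursive Fibonacci, where each call branches into two more:

```python
# Naive recursion: each call spawns two more -> roughly O(2^n) calls.
def fib(n):
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(10))  # 55, but fib(50) written this way would take ages
```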
💡 4. Common Patterns
| Code Pattern | Time Complexity |
|---|---|
| `for i in range(n):` | O(n) |
| `for i in range(n): for j in range(n):` | O(n²) |
| `while i < n: i *= 2` | O(log n) |
| Merge Sort | O(n log n) |
| Binary Search | O(log n) |
| Traversing an array/list | O(n) |
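To ground two rows of that table, here's a small Merge Sort sketch (plain Python, the helper name is mine) that follows the T(n) = 2T(n/2) + O(n) recurrence from above:

```python
# Merge Sort follows T(n) = 2T(n/2) + O(n) -> O(n log n).
def merge_sort(items):
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])     # T(n/2)
    right = merge_sort(items[mid:])    # T(n/2)
    merged = []                        # O(n) merge of the two sorted halves
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```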
🧠 BONUS MEMORY TIP (Easy Mnemonic):
"Loop = Linear, Nested = Square, Divide = Log"
✅ How to Approach:
1. Look at how many times the code repeats.
2. Check if it's nested, dividing, or calling itself.
3. Combine complexities of parts:
If multiple blocks run one after another → keep only the highest-order term.
Example:
```python
for i in range(n):       # O(n)
    for j in range(n):   # O(n)
        print(i, j)      # O(1)
```
✅ Total = O(n²)
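And a small sketch of the "take the highest-order term" rule from step 3, assuming two back-to-back blocks (the function name is just for illustration):

```python
# Two sequential blocks: O(n^2) + O(n) = O(n^2) -- keep only the highest-order term.
def pair_then_scan(values):
    pairs = 0
    for a in values:          # O(n^2) block: all ordered pairs
        for b in values:
            if a < b:
                pairs += 1
    total = 0
    for v in values:          # O(n) block: single pass
        total += v
    return pairs, total

print(pair_then_scan([3, 1, 2]))  # (3, 6)
```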
Want me to try a few examples with you?
for (......){}
for (......){}
Great! You're asking about time complexity when we have two for loops. Let's break it down into two
common scenarios using your pattern:
✅ Case 1: Nested Loops
```cpp
for (int i = 0; i < n; i++) {
    for (int j = 0; j < n; j++) {
        // some constant-time operation
    }
}
```
🔍 Analysis:
Outer loop runs n times.
For each i, inner loop runs n times.
Total operations = n × n = O(n²)
✅ Case 2: Sequential Loops
```cpp
for (int i = 0; i < n; i++) {
    // some operation
}

for (int j = 0; j < n; j++) {
    // some operation
}
```
🔍 Analysis:
First loop = O(n)
Second loop = O(n)
Total = O(n + n) = O(n) (drop constants)
🧠 Trick to Remember:
Loops inside each other? → Multiply → O(n × n) = O(n²)
Loops one after another? → Add → O(n + n) = O(n)
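If it helps, here's a tiny sketch that just counts the constant-time operations in each case (using n = 10 for illustration):

```python
n = 10

nested = 0
for i in range(n):          # loops inside each other: multiply
    for j in range(n):
        nested += 1

sequential = 0
for i in range(n):          # loops one after another: add
    sequential += 1
for j in range(n):
    sequential += 1

print(nested, sequential)   # 100 20 -> n*n vs n+n
```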
If you want to test with a real code example, send it and I’ll explain!
From all of these (O(n), n log n, log n, 2^n, and so on),
provide me with a chart from biggest to smallest
Sure! Here's a time complexity hierarchy chart from worst (slowest) to best (fastest) — assuming n is
large.
📊 Time Complexities from Biggest to Smallest
| Rank | Time Complexity | Name / Type | Example Problem |
|---|---|---|---|
| 🔴 1 | O(n!) | Factorial Time | Traveling Salesman, Permutations |
| 🔴 2 | O(2^n) | Exponential Time | Recursive Fibonacci, Subset Problems |
| 🟠 3 | O(n^k), k > 2 | Polynomial Time | Naive Matrix Multiplication |
| 🟠 4 | O(n²) | Quadratic Time | Bubble Sort, Nested Loops |
| 🟡 5 | O(n log n) | Linearithmic Time | Merge Sort, Quick Sort (average case) |
| 🟢 6 | O(n) | Linear Time | Single loop, Array traversal |
| 🟢 7 | O(log n) | Logarithmic Time | Binary Search |
| 🟢 8 | O(1) | Constant Time | Accessing an array element by index |
🔁 Mnemonic to Remember:
"Factorial → Exponential → Polynomial → Quadratic → n log n → Linear → Log →
Constant"
🧠 Visualization (Simplified Growth Example):
Let’s say n = 10:
O(1) → 1
O(log n) → ~3
O(n) → 10
O(n log n) → ~30
O(n²) → 100
O(2^n) → 1024
O(n!) → 3,628,800
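A quick sketch to reproduce those numbers with Python's math module (log values rounded, so they differ slightly from the ~ figures above):

```python
import math

n = 10
print(1)                        # O(1)       -> 1
print(round(math.log2(n)))      # O(log n)   -> 3 (approx)
print(n)                        # O(n)       -> 10
print(round(n * math.log2(n)))  # O(n log n) -> 33 (~30)
print(n ** 2)                   # O(n^2)     -> 100
print(2 ** n)                   # O(2^n)     -> 1024
print(math.factorial(n))        # O(n!)      -> 3628800
```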
Let me know if you want a graph of this too, or if you're comparing two algorithms!