Algorithm for Selection Sort
Algorithm Selection_Sort(A)
// A is an array of size n
for i ← 1 to n - 1 do
    min ← i                      // min stores the index of the minimum element
    for j ← i + 1 to n do
        if (A[j] < A[min]) then
            min ← j
        end if
    end for
    swap(A[i], A[min])           // swap the found minimum with position i
end for
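The pseudocode above translates directly to Python (a 0-based, in-place sketch; the function name is an illustrative choice):

```python
def selection_sort(a):
    """Sort list a in place by repeatedly selecting the minimum element."""
    n = len(a)
    for i in range(n - 1):
        min_idx = i  # index of the smallest element seen so far
        for j in range(i + 1, n):
            if a[j] < a[min_idx]:
                min_idx = j
        a[i], a[min_idx] = a[min_idx], a[i]  # at most one swap per pass
    return a

print(selection_sort([29, 10, 14, 37, 13]))  # [10, 13, 14, 29, 37]
```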
Time Complexity Analysis
Case           Explanation                                                  Time Complexity
Best Case      Even if the array is already sorted, the algorithm           O(n²)
               still compares all elements.
Average Case   On average, it performs n(n−1)/2 comparisons.                O(n²)
Worst Case     When the array is sorted in reverse order, the number        O(n²)
               of comparisons is still the same.
✅ Number of comparisons: n(n-1)/2
✅ Number of swaps: ≤ n-1 (Selection sort makes fewer swaps than Bubble sort)
💾 Space Complexity
O(1) (In-place sorting — no extra array used)
Algorithm for Insertion Sort
Algorithm Insertion_Sort(A)
// A is an array of size n
for j ← 2 to n do
    key ← A[j]
    i ← j - 1
    while (i > 0 and A[i] > key) do
        A[i + 1] ← A[i]
        i ← i - 1
    end while
    A[i + 1] ← key
end for
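The same algorithm in Python (0-based indices, so the loop starts at position 1 instead of 2):

```python
def insertion_sort(a):
    """Sort list a in place by inserting each element into the sorted prefix."""
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]  # shift larger elements one position right
            i -= 1
        a[i + 1] = key       # drop key into its correct slot
    return a
```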
Time Complexity Analysis
Case           Explanation                                            Time Complexity
Best Case      Array already sorted → only 1 comparison per element   O(n)
Average Case   Elements in random order                               O(n²)
Worst Case     Array sorted in reverse order                          O(n²)
✅ Comparisons: Between O(n) and O(n²)
✅ Swaps (shifts): Up to O(n²)
✅ Space Complexity: O(1) (in-place sorting)
Algorithm for Binary Search
Algorithm BINARY_SEARCH(A, key)
// Description: Perform binary search on sorted array A
// Input: Sorted array A[1..n], and key to be searched
// Output: Index of key if found, otherwise 0 (failure)
low ← 1
high ← n
while (low ≤ high) do
    mid ← ⌊(low + high) / 2⌋
    if A[mid] == key then
        return mid         // element found
    else if A[mid] < key then
        low ← mid + 1      // search the right half
    else
        high ← mid - 1     // search the left half
    end if
end while
return 0                   // element not found
end
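An iterative Python sketch; since Python lists are 0-based and 0 is a valid index, the failure sentinel becomes -1 instead of 0:

```python
def binary_search(a, key):
    """Return the index of key in sorted list a, or -1 if absent."""
    low, high = 0, len(a) - 1
    while low <= high:
        mid = (low + high) // 2  # floor division avoids fractional indices
        if a[mid] == key:
            return mid           # element found
        elif a[mid] < key:
            low = mid + 1        # search the right half
        else:
            high = mid - 1       # search the left half
    return -1                    # element not found
```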
Time Complexity Analysis
Case Explanation Time Complexity
Best Case When the middle element is O(1)
the key
Average Case Repeatedly divides search O(log n)
space in half
Worst Case Element not found after all O(log n)
divisions
✅ Space Complexity: O(1) (Iterative version)
✅ Requires: Array must be sorted beforehand
Algorithm for Merge Sort
Algorithm MERGE_SORT(A, low, high)
// Recursively sort A[low..high]
if low < high then
    mid ← ⌊(low + high) / 2⌋
    MERGE_SORT(A, low, mid)
    MERGE_SORT(A, mid + 1, high)
    MERGE(A, low, mid, high)
end if
end
Algorithm MERGE(A, low, mid, high)
// Combine two sorted subarrays into one sorted array
i ← low
j ← mid + 1
k ← low
Create temporary array B
while (i ≤ mid and j ≤ high) do
    if A[i] ≤ A[j] then
        B[k] ← A[i]
        i ← i + 1
    else
        B[k] ← A[j]
        j ← j + 1
    end if
    k ← k + 1
end while
// Copy remaining elements from left subarray (if any)
while (i ≤ mid) do
    B[k] ← A[i]
    i ← i + 1
    k ← k + 1
end while
// Copy remaining elements from right subarray (if any)
while (j ≤ high) do
    B[k] ← A[j]
    j ← j + 1
    k ← k + 1
end while
// Copy sorted elements back to original array
for k ← low to high do
    A[k] ← B[k]
end for
end
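A Python sketch pairing the merge routine with the standard recursive divide-and-conquer driver:

```python
def merge(a, low, mid, high):
    """Merge the sorted runs a[low..mid] and a[mid+1..high] in place."""
    b = []
    i, j = low, mid + 1
    while i <= mid and j <= high:
        if a[i] <= a[j]:            # <= keeps the sort stable
            b.append(a[i]); i += 1
        else:
            b.append(a[j]); j += 1
    b.extend(a[i:mid + 1])          # leftover left-run elements, if any
    b.extend(a[j:high + 1])         # leftover right-run elements, if any
    a[low:high + 1] = b             # copy the merged run back

def merge_sort(a, low=0, high=None):
    """Recursively sort a[low..high]."""
    if high is None:
        high = len(a) - 1
    if low < high:
        mid = (low + high) // 2
        merge_sort(a, low, mid)
        merge_sort(a, mid + 1, high)
        merge(a, low, mid, high)
    return a
```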
Time Complexity Analysis
Case           Explanation                                       Time Complexity
Best Case      Array already sorted — still splits and merges    O(n log n)
Average Case   Always divides and merges halves                  O(n log n)
Worst Case     Reverse or random order — same work               O(n log n)
✅ Space Complexity: O(n) (uses temporary arrays during merge)
✅ Stable Sorting Algorithm: Yes
✅ Divide and Conquer technique
Algorithm: Fractional Knapsack Problem
Given:
N → Total number of objects
M → Capacity of the knapsack
P → Total profit (initially P = 0)
Pi → Profit of the iᵗʰ object
Wi → Weight of the iᵗʰ object
Step 1:
For i = 1 to N
Calculate Profit/Weight Ratio → Ri = Pi / Wi
End For
Step 2:
Sort all objects in decreasing order of Profit/Weight Ratio (Ri)
Step 3:
For i = 1 to N (in sorted order)
    if Wi ≤ M then
        M = M - Wi
        P = P + Pi // take the whole object
    else
        P = P + Pi × (M / Wi) // take a fraction of object i to fill the remaining capacity
        M = 0
        break
    end if
end For
Step 4:
Display Total Profit = P
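The greedy steps above can be sketched in Python (function name and list-of-pairs representation are illustrative choices):

```python
def fractional_knapsack(profits, weights, capacity):
    """Greedy fractional knapsack: returns the maximum total profit."""
    # Step 1 + 2: sort objects in decreasing order of profit/weight ratio
    items = sorted(zip(profits, weights),
                   key=lambda pw: pw[0] / pw[1], reverse=True)
    total = 0.0
    for p, w in items:
        if w <= capacity:
            capacity -= w
            total += p                   # take the whole object
        else:
            total += p * (capacity / w)  # take a fraction to fill what remains
            break
    return total

# Classic example: P = (60, 100, 120), W = (10, 20, 30), M = 50 → profit 240
print(fractional_knapsack([60, 100, 120], [10, 20, 30], 50))  # 240.0
```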
Time Complexity:
O(n) + O(n log n) + O(n)
Explanation:
The first O(n) is for calculating the profit/weight ratio of all items.
The second term, O(n log n), is for sorting the items in decreasing order of that ratio.
The third term, O(n), is for selecting items and computing the total profit.
Taking the maximum order among them:
Final Time Complexity = O(n log n)
Algorithm: Job Sequencing with Deadlines
Step 1:
Sort all jobs in decreasing order of profit.
🕒 Time Complexity: O(n log n)
Step 2:
Iterate over all jobs (in decreasing order of profit).
For each job i = 1 to N
    Find a time slot j such that
    if (slot j is empty and j ≤ deadline[i] and j is the greatest such slot) then
        Put job i in slot j
        Mark slot j as filled
        Add profit[i] to the total profit
    else
        Reject the job
    end if
🕒 Time Complexity:
Finding a slot for each job → O(n × m)
When m = n, it becomes → O(n²)
Step 3:
Display the total profit.
⏱️Overall Time Complexity:
Final Time Complexity = O(n²)
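The steps above can be sketched in Python, representing each job as a (profit, deadline) pair (an illustrative encoding):

```python
def job_sequencing(jobs):
    """jobs: list of (profit, deadline) pairs. Returns the maximum total profit.

    Greedy: consider jobs in decreasing order of profit; each job takes the
    latest free unit-time slot not after its deadline, if one exists."""
    jobs = sorted(jobs, reverse=True)       # decreasing order of profit
    max_deadline = max(d for _, d in jobs)
    free = [True] * (max_deadline + 1)      # slots 1..max_deadline
    total = 0
    for profit, deadline in jobs:
        for j in range(deadline, 0, -1):    # greatest feasible slot first
            if free[j]:
                free[j] = False
                total += profit
                break
    return total

# Example: profits (100, 19, 27, 25, 15) with deadlines (2, 1, 2, 1, 3)
print(job_sequencing([(100, 2), (19, 1), (27, 2), (25, 1), (15, 3)]))  # 142
```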
Algorithm for Floyd–Warshall
FloydWarshall(w, n)
// w: weight matrix
// n: number of vertices
for i = 1 to n do
    for j = 1 to n do
        D[i, j] = w[i, j] // Initialize distance matrix
    end for
end for
for k = 1 to n do // Consider all intermediate vertices
    for i = 1 to n do
        for j = 1 to n do
            if (D[i, k] + D[k, j] < D[i, j]) then
                D[i, j] = D[i, k] + D[k, j]
            end if
        end for
    end for
end for
return D[1..n, 1..n] // Final shortest-path matrix
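A Python sketch using `float('inf')` for missing edges (a common encoding choice):

```python
INF = float("inf")  # stands for "no direct edge"

def floyd_warshall(w):
    """All-pairs shortest paths; w is an n×n weight matrix with INF for missing edges."""
    n = len(w)
    d = [row[:] for row in w]   # copy so the input matrix is left untouched
    for k in range(n):          # allow vertex k as an intermediate vertex
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d
```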
Time Complexity
O(n³) — due to three nested loops.
Space Complexity
O(n²) — for storing the distance matrix.
Algorithm for N-Queen
NQueens(n)
// n: number of queens (and size of the chessboard)
PlaceQueen(1) // Start placing from the first row
end

Procedure PlaceQueen(k)
// Try placing a Queen in row k
for i = 1 to n do
    if isSafe(k, i) then // Check if placing a Queen at (k, i) is safe
        x[k] = i // Place Queen at column i in row k
        if k == n then
            printSolution(x) // All queens placed successfully
        else
            PlaceQueen(k + 1) // Place next Queen
        end if
    end if
end for
end

Function isSafe(k, i)
// Check for conflict with previously placed queens
for j = 1 to k - 1 do
    if (x[j] == i) OR (abs(x[j] - i) == abs(j - k)) then
        return false // Same column or diagonal conflict
    end if
end for
return true
end
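The backtracking scheme above in Python, collecting every solution rather than printing (0-based rows and columns):

```python
def n_queens(n):
    """Return all solutions; each solution x gives the column of the queen in each row."""
    solutions = []
    x = [0] * n

    def is_safe(k, col):
        for j in range(k):
            # Conflict if same column, or same diagonal (|Δcol| == |Δrow|)
            if x[j] == col or abs(x[j] - col) == k - j:
                return False
        return True

    def place_queen(k):
        for col in range(n):
            if is_safe(k, col):
                x[k] = col
                if k == n - 1:
                    solutions.append(x[:])   # all queens placed
                else:
                    place_queen(k + 1)       # place the next queen

    place_queen(0)
    return solutions
```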
Time Complexity
T(N)=O(N!)
Explanation:
For the first row → N column choices
For the next row → roughly (N−1) safe positions remain
Total combinations = N × (N−1) × (N−2) × ... = O(N!)
Space Complexity:
O(N²)
(for storing the chessboard and recursion stack)
Algorithm: Sum of Subset using Backtracking
SumOfSubset(i, sum, target, set[], subset[])
1. If sum == target
   → Print subset[] // Valid subset found
   → Return
2. If i == n OR sum > target
   → Return // No valid subset possible from here
3. Else:
   Choice 1: Include set[i] in the subset
   → subset.push(set[i])
   → SumOfSubset(i + 1, sum + set[i], target, set, subset)
   → subset.pop() // Backtrack
   Choice 2: Exclude set[i] from the subset
   → SumOfSubset(i + 1, sum, target, set, subset)
⚙️Explanation
The algorithm uses recursion and backtracking to explore all possible subsets of the
given set.
At each step, it decides whether to include or exclude the current element.
Whenever the current subset sum equals the target, it prints that subset.
If the sum exceeds the target, the algorithm prunes (cuts off) that path early; this
pruning is valid when all elements are non-negative.
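The include/exclude recursion can be sketched in Python; note the pruning step assumes all elements are non-negative:

```python
def sum_of_subset(nums, target):
    """Collect every subset of nums whose elements sum to target.

    Assumes non-negative numbers, so a partial sum above target can be pruned."""
    results = []

    def backtrack(i, total, subset):
        if total == target:
            results.append(subset[:])        # valid subset found
            return
        if i == len(nums) or total > target:
            return                           # prune: no valid subset along this path
        subset.append(nums[i])               # choice 1: include nums[i]
        backtrack(i + 1, total + nums[i], subset)
        subset.pop()                         # backtrack
        backtrack(i + 1, total, subset)      # choice 2: exclude nums[i]

    backtrack(0, 0, [])
    return results
```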
Time Complexity Analysis
Best Case: O(n)
Happens when early pruning occurs (i.e., unnecessary recursive calls are avoided).
Example:
o When elements are sorted and the partial sum quickly exceeds the target.
o Or the first few branches already yield the required subset.
So, only a few recursive calls are made.
Best-case complexity ≈ O(n) (linear).
Worst Case: O(2ⁿ)
In the worst case, the algorithm explores all subsets of the given set.
Each element has two choices: include or exclude, leading to 2ⁿ recursive calls.
Worst-case complexity = O(2ⁿ)
Space Complexity:
O(n)
Due to the recursion stack and subset array.
Algorithm: NAÏVE_STRING_MATCHING(T, P)
Input:
T → Text string of length n
P → Pattern string of length m
Steps:
1. For i ← 0 to n - m do
       If P[1 ... m] == T[i + 1 ... i + m] then
           Print "Match found at shift i"
       End if
   End for
Explanation:
The algorithm checks for the pattern P in the text T by aligning P at every possible
position in T.
At each position i, it compares all characters of P with the corresponding substring of
T.
If all characters match, a pattern occurrence is reported.
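A Python sketch that returns every matching shift (0-based), using a slice comparison to stand in for the character-by-character loop:

```python
def naive_string_match(text, pattern):
    """Return every shift at which pattern occurs in text (0-based)."""
    n, m = len(text), len(pattern)
    matches = []
    for i in range(n - m + 1):
        if text[i:i + m] == pattern:   # compare all m characters at shift i
            matches.append(i)
    return matches
```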
Time Complexity:
Best Case: O(n) → When mismatches occur early.
Worst Case: O(n × m) → When all characters need to be compared.
✅ Overall Time Complexity: O(n × m)
✅ Space Complexity: O(1)