Algorithm Analysis - Cheat Sheet

Key Concepts
Definition: Theoretical study of algorithm performance and resource usage
Purpose: Predict algorithm behavior without implementation, compare algorithms objectively
Applications: Performance optimization, system design, scalability planning

Core Analysis Types


Time Complexity
Focus: How execution time grows with input size
Measure: Number of basic operations (not actual time)
Goal: Understand scalability patterns

Space Complexity
Focus: Memory requirements beyond input/output
Includes: Auxiliary space, recursive call stack
Goal: Optimize memory usage
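
The auxiliary-space distinction can be made concrete with a small sketch (illustrative, not from the original): reversing a list in place uses O(1) extra memory, while reversing into a fresh list uses O(n).

```python
def reverse_in_place(a):
    """Reverse a list using O(1) auxiliary space (two moving indices)."""
    i, j = 0, len(a) - 1
    while i < j:
        a[i], a[j] = a[j], a[i]   # swap ends, move inward
        i, j = i + 1, j - 1
    return a

def reverse_copy(a):
    """Reverse by building a new list: O(n) auxiliary space."""
    return [a[k] for k in range(len(a) - 1, -1, -1)]
```

Both are O(n) time; only their space complexity differs, which is exactly what auxiliary-space analysis captures.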

Non-Recursive Algorithm Analysis (5-Step Method)


Step 1: Input Size Parameter

Examples:
- Array size: n
- Matrix: n×n or n×m
- Graph: |V| vertices, |E| edges
- String: length n

Step 2: Basic Operation

Common Operations:
- Comparisons: A[i] == A[j]
- Assignments: x = y
- Arithmetic: +, -, *, /
- Array access: A[i]

Step 3: Case Analysis


Best Case: Minimum operations (optimistic scenario)
Worst Case: Maximum operations (pessimistic scenario)
Average Case: Expected operations across all inputs

Step 4: Summation Setup

Single Loop: Σ(i=1 to n) f(i)
Nested Loops: Σ(i=1 to n) Σ(j=1 to m) f(i,j)
Conditional: Consider different execution paths

Step 5: Solve for Growth Order

Common Summation Formulas:
Σ(i=1 to n) 1 = n
Σ(i=1 to n) i = n(n+1)/2 ≈ n²/2
Σ(i=1 to n) i² = n(n+1)(2n+1)/6 ≈ n³/3

Recursive Algorithm Analysis


Recurrence Relations

General Form: T(n) = aT(n/b) + f(n)


- a: number of recursive calls
- n/b: size of each subproblem
- f(n): work done at current level

Common Patterns

T(n) = T(n-1) + c → O(n) [decrease by constant]
T(n) = T(n/2) + c → O(log n) [decrease by factor]
T(n) = 2T(n/2) + c → O(n) [divide and conquer]
T(n) = 2T(n/2) + n → O(n log n) [merge sort pattern]
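
These patterns can be checked numerically by evaluating the recurrences directly (a sketch, assuming T(1) = 1 and n a power of 2):

```python
def T_halving(n):
    """T(n) = T(n//2) + 1, T(1) = 1  ->  grows like log2(n)."""
    count = 1
    while n > 1:
        n //= 2
        count += 1
    return count

def T_mergesort(n):
    """T(n) = 2*T(n//2) + n, T(1) = 1  ->  n*log2(n) + n for powers of 2."""
    if n == 1:
        return 1
    return 2 * T_mergesort(n // 2) + n
```

For example, T_halving(1024) returns 11 ≈ log₂(1024) + 1, and T_mergesort(8) returns 32 = 8·log₂(8) + 8, matching the O(log n) and O(n log n) patterns above.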

Solution Methods
1. Substitution: Expand recurrence step by step
2. Master Theorem: For T(n) = aT(n/b) + f(n)
3. Recursion Tree: Visualize recursive calls
Algorithm Complexity Examples
Linear Search

for i = 0 to n-1:
    if A[i] == x: return i

Analysis:
- Basic op: comparison A[i] == x
- Best: O(1) - found at first position
- Worst: O(n) - not found or at end
- Average: ≈ n/2 comparisons (element present at a uniformly random position) → O(n)
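
A runnable version of this search that also counts the basic operation (a sketch; the comparison counter is added for illustration):

```python
def linear_search(a, x):
    """Return (index, comparisons); index is -1 if x is absent."""
    comparisons = 0
    for i in range(len(a)):
        comparisons += 1          # basic operation: A[i] == x
        if a[i] == x:
            return i, comparisons
    return -1, comparisons
```

Searching for the first element costs 1 comparison (best case); a missing element costs n (worst case).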

Nested Loop (Unique Elements)

for i = 0 to n-2:
    for j = i+1 to n-1:
        if A[i] == A[j]: return false

Analysis:
- Basic op: comparison A[i] == A[j]
- Summation: Σ(i=0 to n-2) Σ(j=i+1 to n-1) 1
- Solution: n(n-1)/2 = O(n²)
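
The n(n-1)/2 count can be verified by instrumenting the nested loop (counter added for illustration):

```python
def all_unique(a):
    """Return (all_distinct, comparisons). Worst case: n(n-1)/2 comparisons."""
    n = len(a)
    comparisons = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            comparisons += 1      # basic operation: A[i] == A[j]
            if a[i] == a[j]:
                return False, comparisons
    return True, comparisons
```

For n = 4 distinct elements, the worst case performs 4·3/2 = 6 comparisons, matching the summation.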

Binary Search Recurrence

T(n) = T(n/2) + 1, T(1) = 1

Solution by substitution:
T(n) = T(n/2) + 1
= T(n/4) + 1 + 1
= T(n/8) + 1 + 1 + 1
= T(n/2^k) + k

When n/2^k = 1: k = log₂(n)


Therefore: T(n) = O(log n)
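
An iterative binary search that counts probes makes the log₂(n) bound observable (probe counter added for illustration; input must be sorted):

```python
def binary_search(a, x):
    """Return (index or -1, probes). Probes grow as O(log n) on sorted input."""
    lo, hi, probes = 0, len(a) - 1, 0
    while lo <= hi:
        probes += 1               # one halving step per loop iteration
        mid = (lo + hi) // 2
        if a[mid] == x:
            return mid, probes
        if a[mid] < x:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, probes
```

On n = 1024 elements, no search takes more than log₂(1024) + 1 = 11 probes, matching the recurrence solution.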

Important Formulas
Summation Formulas

Σ(i=1 to n) 1 = n
Σ(i=1 to n) i = n(n+1)/2
Σ(i=1 to n) i² = n(n+1)(2n+1)/6
Σ(i=0 to n) 2^i = 2^(n+1) - 1
Σ(i=1 to n) 1/i ≈ ln(n)
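
These identities are easy to sanity-check numerically (the harmonic sum is only approximately ln(n); the gap tends to Euler's constant ≈ 0.577):

```python
import math

n = 50
assert sum(1 for i in range(1, n + 1)) == n
assert sum(range(1, n + 1)) == n * (n + 1) // 2
assert sum(i * i for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6
assert sum(2 ** i for i in range(0, n + 1)) == 2 ** (n + 1) - 1
harmonic = sum(1 / i for i in range(1, n + 1))
assert abs(harmonic - math.log(n)) < 1    # H_n = ln(n) + γ + o(1)
```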

Logarithm Properties

log(ab) = log(a) + log(b)
log(a/b) = log(a) - log(b)
log(a^b) = b·log(a)
log_b(a) = log(a)/log(b)
log₂(n) ≈ 3.32·log₁₀(n)
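
Each property can be confirmed with Python's math module (arbitrary sample values a = 7, b = 3 chosen for illustration):

```python
import math

a, b = 7.0, 3.0
assert math.isclose(math.log(a * b), math.log(a) + math.log(b))
assert math.isclose(math.log(a / b), math.log(a) - math.log(b))
assert math.isclose(math.log(a ** b), b * math.log(a))
assert math.isclose(math.log(a, b), math.log(a) / math.log(b))   # change of base
assert math.isclose(math.log2(1000), 3.32 * math.log10(1000), rel_tol=0.01)
```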

Common Examples
Example 1: Matrix Multiplication

for i = 0 to n-1:
    for j = 0 to n-1:
        for k = 0 to n-1:
            C[i][j] += A[i][k] * B[k][j]

Analysis: Triple nested loop → O(n³)
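
A runnable version that counts the basic operation (the multiplication in the innermost loop; counter added for illustration) confirms exactly n³ multiplications:

```python
def matmul(A, B):
    """Multiply two n x n matrices; returns (C, multiplications)."""
    n = len(A)
    C = [[0] * n for _ in range(n)]
    mults = 0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]   # basic operation
                mults += 1
    return C, mults
```

For n = 2 the count is 2³ = 8, and in general n³, i.e. O(n³).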

Example 2: Recursive Fibonacci

fib(n):
    if n ≤ 1: return n
    return fib(n-1) + fib(n-2)

Recurrence: T(n) = T(n-1) + T(n-2) + 1
Solution: T(n) = O(φⁿ), where φ = (1+√5)/2 ≈ 1.618 (the golden ratio)
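
The exponential blow-up can be observed by counting calls alongside the value (call counter added for illustration):

```python
def fib_calls(n):
    """Naive Fibonacci; returns (fib(n), total recursive calls)."""
    if n <= 1:
        return n, 1
    a, ca = fib_calls(n - 1)
    b, cb = fib_calls(n - 2)
    return a + b, ca + cb + 1     # calls follow the same recurrence as T(n)
```

fib_calls(10) makes 177 calls for fib(10) = 55; the call count itself satisfies C(n) = C(n-1) + C(n-2) + 1 and therefore grows like φⁿ.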

Example 3: Merge Sort Analysis

mergeSort(A, left, right):
    if left < right:
        mid = (left + right)/2      // integer division
        mergeSort(A, left, mid)     // T(n/2)
        mergeSort(A, mid+1, right)  // T(n/2)
        merge(A, left, mid, right)  // O(n)

Recurrence: T(n) = 2T(n/2) + n
Solution: T(n) = O(n log n)
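
A self-contained Python version (returning a new sorted list rather than sorting in place, for brevity) that follows the same divide-and-conquer shape:

```python
def merge_sort(a):
    """Sort a list: two half-size recursive calls plus an O(n) merge."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])      # T(n/2)
    right = merge_sort(a[mid:])     # T(n/2)
    # merge the two sorted halves: O(n) work at this level
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```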

Implementation Tips
Choose appropriate basic operation (usually innermost, most frequent)
Consider all cases (best, average, worst) when relevant
Set up summations carefully - match loop bounds exactly
Use standard formulas for common summation patterns
Verify with small examples to check analysis correctness

Common Pitfalls
Confusing theoretical analysis with actual runtime
Choosing wrong basic operation
Incorrect summation bounds
Forgetting to consider different cases
Mixing up decrease-and-conquer vs divide-and-conquer patterns

Quick Reference
Complexity Hierarchy

O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ) < O(n!)

Master Theorem Quick Check

T(n) = aT(n/b) + f(n)

Case 1: f(n) = O(n^(log_b(a) - ε)) → T(n) = Θ(n^log_b(a))
Case 2: f(n) = Θ(n^log_b(a)) → T(n) = Θ(n^log_b(a) · log n)
Case 3: f(n) = Ω(n^(log_b(a) + ε)), with the regularity condition a·f(n/b) ≤ c·f(n) for some c < 1 → T(n) = Θ(f(n))
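
For the common special case f(n) = Θ(n^d) (where the regularity condition holds automatically), the three cases reduce to comparing d against log_b(a). A sketch of that simplified classifier:

```python
import math

def master_theorem(a, b, d):
    """Classify T(n) = a*T(n/b) + Θ(n^d); simplified polynomial-f form.
    Returns the growth order as a string."""
    crit = math.log(a, b)                 # critical exponent log_b(a)
    if math.isclose(d, crit):             # Case 2: work balanced across levels
        return f"Θ(n^{d:g} log n)"
    if d < crit:                          # Case 1: leaves dominate
        return f"Θ(n^{crit:g})"
    return f"Θ(n^{d:g})"                  # Case 3: root dominates
```

For example, merge sort (a=2, b=2, d=1) lands in Case 2, giving Θ(n^1 log n), and T(n) = 2T(n/2) + n² (a=2, b=2, d=2) lands in Case 3, giving Θ(n^2).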

Analysis Checklist
Input size parameter identified
Basic operation chosen appropriately
Cases considered (best/average/worst)
Summation set up correctly
Solution method applied properly
Result verified with examples
