Topic: Algorithm and Efficiency of an Algorithm
Introduction to Algorithms
An algorithm is a step-by-step procedure or formula for solving a problem. It consists of well-defined instructions that
take an input, process it, and produce an output.
Characteristics of an Algorithm:
1. Input: One or more inputs are provided.
2. Output: It must produce at least one output.
3. Definiteness: Each step must be precisely defined.
4. Finiteness: It must have a finite number of steps.
5. Effectiveness: Each step should be simple and feasible.
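The characteristics above can be illustrated with a small, self-contained example (a sketch added here for illustration, not part of the original notes): finding the largest element of a list.

```python
def find_max(numbers):
    """Return the largest element of a non-empty list.

    Input: a non-empty list of numbers.  Output: their maximum.
    Every step is precisely defined (definiteness), the loop runs
    a finite number of times (finiteness), and each comparison is
    simple and feasible (effectiveness).
    """
    largest = numbers[0]          # start with the first element
    for x in numbers[1:]:         # examine each remaining element
        if x > largest:
            largest = x
    return largest

print(find_max([3, 7, 2, 9, 4]))  # prints 9
```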
Algorithm Design Techniques
1. Divide and Conquer: Break a problem into independent subproblems, solve them recursively, and combine the results (e.g., Merge Sort, Quick Sort).
2. Dynamic Programming: Solve overlapping subproblems once and store their results, via memoization or tabulation (e.g., Fibonacci, Knapsack problem).
3. Greedy Algorithms: Make the locally optimal choice at each step (e.g., Kruskal’s Algorithm, Dijkstra’s Algorithm).
4. Backtracking: Build a solution incrementally and abandon partial solutions that cannot succeed (e.g., N-Queens problem, Sudoku Solver).
5. Brute Force: Try all possible solutions (e.g., naive String Matching, Traveling Salesman Problem).
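As a concrete illustration of divide and conquer, the following is a minimal sketch of Merge Sort (added for illustration; any standard formulation works): the list is divided in half, each half is sorted recursively, and the sorted halves are merged.

```python
def merge_sort(arr):
    # Divide: a list of 0 or 1 elements is already sorted
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])    # conquer left half
    right = merge_sort(arr[mid:])   # conquer right half

    # Combine: merge the two sorted halves into one sorted list
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])   # append any leftovers
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 4, 1, 3]))  # prints [1, 2, 3, 4, 5]
```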
Efficiency of an Algorithm
The efficiency of an algorithm is measured by the resources it requires for execution, primarily time complexity and
space complexity.
1. Time Complexity
Time complexity measures the number of operations an algorithm performs as a function of input size n.
Asymptotic Notations:
Big-O (O): Upper bound on the growth rate; most often used to describe worst-case running time.
Omega (Ω): Lower bound on the growth rate; often used to describe best-case running time.
Theta (Θ): Tight bound, when the upper and lower bounds coincide.
Note: the notations describe bounds on a function, not cases themselves; any of them can be applied to worst-, best-, or average-case analysis.
Common Time Complexities:
Complexity    | Notation   | Example
--------------|------------|-----------------------------------
Constant      | O(1)       | Accessing an array element
Logarithmic   | O(log n)   | Binary Search
Linear        | O(n)       | Linear Search
Linearithmic  | O(n log n) | Merge Sort, Quick Sort (average case)
Quadratic     | O(n²)      | Bubble Sort, Insertion Sort
Cubic         | O(n³)      | Naive Matrix Multiplication
Exponential   | O(2ⁿ)      | Recursive Fibonacci
Factorial     | O(n!)      | Traveling Salesman (brute force)
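The O(log n) entry can be made concrete with a short Binary Search sketch (illustrative code, not from the original notes): because each comparison halves the remaining search range, at most about log₂ n comparisons are needed.

```python
def binary_search(sorted_list, target):
    """Return the index of target in sorted_list, or -1 if absent.

    The search range [lo, hi] is halved on every iteration,
    giving O(log n) time and O(1) auxiliary space.
    """
    lo, hi = 0, len(sorted_list) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_list[mid] == target:
            return mid
        elif sorted_list[mid] < target:
            lo = mid + 1   # target lies in the upper half
        else:
            hi = mid - 1   # target lies in the lower half
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # prints 3
```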
2. Space Complexity
Space complexity refers to the amount of memory required by an algorithm, including input storage and auxiliary storage.
O(1): Constant space (e.g., swapping two numbers).
O(n): Linear space (e.g., storing an array).
O(n²): Quadratic space (e.g., adjacency matrix for a graph).
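The difference between O(1) and O(n) space can be seen in a small sketch (added for illustration): swapping two elements in place needs only a constant amount of extra memory, while producing a reversed copy allocates a whole new list.

```python
def swap(a, i, j):
    # O(1) auxiliary space: no extra structure proportional to len(a)
    a[i], a[j] = a[j], a[i]

def reversed_copy(a):
    # O(n) auxiliary space: allocates a new list of the same size
    return a[::-1]

data = [1, 2, 3]
swap(data, 0, 2)
print(data)                        # prints [3, 2, 1]
print(reversed_copy([1, 2, 3]))    # prints [3, 2, 1]
```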
Trade-offs in Algorithm Efficiency
1. Time vs. Space Trade-off: Reducing time complexity may increase space usage and vice versa.
2. Exact vs. Approximate Solutions: Some problems are too expensive to solve exactly in reasonable time, so approximate or heuristic algorithms are used instead (e.g., heuristic search in AI).
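The time vs. space trade-off can be demonstrated with Fibonacci, which the notes already use as an example (illustrative sketch): the naive recursion takes exponential O(2ⁿ) time with little extra memory, while memoization spends O(n) space on a cache to bring the time down to O(n).

```python
from functools import lru_cache

def fib_naive(n):
    # O(2^n) time, O(n) call-stack space: recomputes subproblems
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)       # trades O(n) cache space for speed
def fib_memo(n):
    # O(n) time: each subproblem is computed once and cached
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

print(fib_naive(10))  # prints 55
print(fib_memo(10))   # prints 55, but with far fewer recursive calls
```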