UNIT I: Introduction to Algorithms
Detailed Notes
1. Time Complexity and Space Complexity: Time complexity measures the amount of time an algorithm takes
as a function of input size (n). Space complexity measures the amount of memory required by the algorithm. Both
are expressed using asymptotic notations like Big-O, Omega, and Theta.
2. Asymptotic Analysis & Growth Rates: Asymptotic analysis studies the behavior of algorithms as input size
grows. Common growth rates include constant O(1), logarithmic O(log n), linear O(n), polynomial O(n^k), and
exponential O(2^n).
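The growth rates above can be made concrete with small sketches (illustrative examples, not part of the original notes):

```python
def constant_op(arr):
    """O(1): a single access, independent of input size."""
    return arr[0]

def linear_sum(arr):
    """O(n): touches every element exactly once."""
    total = 0
    for x in arr:
        total += x
    return total

def halving_steps(n):
    """O(log n): halves the problem size each step, as binary search does."""
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps
```

For example, `halving_steps(1024)` returns 10, since 1024 can be halved ten times before reaching 1 — the logarithmic pattern behind binary search.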
3. Common Complexity Analysis Techniques:
- Master Theorem: provides direct closed-form bounds for divide-and-conquer recurrence relations.
- Substitution Method: guess the bound, then prove it by induction.
- Iteration Method: expands the recurrence step by step until a pattern emerges.
- Recursive Algorithms: complexity depends on the number of recursive calls and the work done per call.
4. Algorithm Design Principles: Problem-solving involves breaking problems into smaller parts, analyzing
constraints, and designing efficient solutions. The role of data structures is crucial as they define how information
is stored and accessed.
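As a hypothetical illustration of why the choice of data structure matters, compare membership testing in a Python list (linear scan, O(n)) against a hash set (average O(1)) — same data, very different access cost:

```python
data_list = list(range(100_000))
data_set = set(data_list)

def in_list(x):
    """O(n): scans the list element by element."""
    return x in data_list

def in_set(x):
    """Average O(1): a single hash lookup."""
    return x in data_set
```

Both return the same answers, but on large inputs the set lookup is dramatically faster — the algorithm (membership test) is unchanged; only the underlying data structure differs.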
5. Sorting Algorithms:
- Selection Sort: repeatedly selects the smallest remaining element. Time: O(n^2).
- Bubble Sort: repeatedly swaps adjacent out-of-order elements. Time: O(n^2).
- Insertion Sort: inserts each element into the sorted prefix. Time: O(n^2).
- Counting Sort: works for integer keys in a known range. Time: O(n + k).
- Linear-Time Sorting: possible with non-comparison sorts (e.g., counting sort, radix sort).
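Two of the sorts above, sketched in Python (minimal illustrative versions, not from the notes): insertion sort as a comparison sort, and counting sort as a linear-time non-comparison sort.

```python
def insertion_sort(a):
    """Inserts each element into the sorted prefix.
    O(n^2) worst case, close to O(n) on nearly sorted input."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:   # shift larger elements right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key                 # drop key into its slot
    return a

def counting_sort(a, k):
    """Sorts non-negative integer keys < k in O(n + k) time.
    No comparisons: it tallies occurrences, then replays them in order."""
    count = [0] * k
    for x in a:
        count[x] += 1
    out = []
    for v in range(k):
        out.extend([v] * count[v])
    return out
```

Note counting sort's cost depends on the key range k as well as n, which is why it is linear only when k is O(n).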
6. Searching Algorithms:
- Linear Search: sequentially checks each element. Time: O(n).
- Binary Search: works on sorted arrays, halving the search space each step. Time: O(log n).
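Both searches sketched side by side (illustrative versions, assuming the array holds comparable elements):

```python
def linear_search(a, target):
    """O(n): checks each element in turn; works on unsorted data."""
    for i, x in enumerate(a):
        if x == target:
            return i
    return -1

def binary_search(a, target):
    """O(log n): requires a sorted array; halves the search space each iteration."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```

The trade-off: binary search pays an upfront cost (the data must already be sorted) in exchange for logarithmic lookups.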
Important Questions with Answers
Q1. Define Time Complexity and Space Complexity with examples.
Time Complexity refers to the amount of time an algorithm takes to run, expressed as a function of input size. Example: Linear Search runs in O(n) time. Space Complexity refers to the amount of memory an algorithm uses. Example: Merge Sort requires O(n) auxiliary space.
Q2. Explain Asymptotic Notations with diagrams.
- Big-O (O): asymptotic upper bound on growth.
- Omega (Ω): asymptotic lower bound on growth.
- Theta (Θ): tight bound (both upper and lower).
Note: these notations bound growth rates and are independent of worst/best/average case, though in practice Big-O is most often quoted for the worst case.
Graph: Time vs Input Size (showing O(1), O(log n), O(n), O(n log n), O(n^2)).
Q3. Differentiate between Bubble Sort, Selection Sort, and Insertion Sort.
- Bubble Sort: repeatedly swaps adjacent elements; O(n^2).
- Selection Sort: finds the minimum and swaps it into place; O(n^2).
- Insertion Sort: inserts each element into the sorted portion; O(n^2), but efficient for small or partially sorted arrays.
Q4. What is the Master Theorem? Give an example.
The Master Theorem provides direct solutions for recurrence relations of the form T(n) = aT(n/b) + f(n) arising from divide-and-conquer algorithms. Example: T(n) = 2T(n/2) + O(n). Here a = 2, b = 2, and f(n) = Θ(n) = Θ(n^(log_b a)), so the balanced case applies, giving T(n) = O(n log n).
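The bound in Q4 can also be checked numerically: taking T(1) = 1 and expanding the recurrence for n a power of two gives exactly T(n) = n·log2(n) + n, which is Θ(n log n). A small sketch (cost model only, not a sorting implementation):

```python
def T(n):
    """Evaluates the recurrence T(n) = 2*T(n//2) + n with T(1) = 1
    (the cost model of a merge-sort-style divide and conquer)."""
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n
```

For instance, T(8) = 2·T(4) + 8 = 2·12 + 8 = 32, matching 8·log2(8) + 8 = 32.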
Q5. Write differences between Linear Search and Binary Search.
Linear Search: Works on unsorted data, O(n). Binary Search: Works only on sorted data, O(log n).
Quick Summary
- Time & Space Complexity: measure efficiency of algorithms.
- Asymptotic Analysis: Big-O, Omega, Theta notations.
- Growth Rates: constant, log, linear, polynomial, exponential.
- Analysis Techniques: Master theorem, substitution, iteration.
- Sorting: Bubble, Selection, Insertion, Counting, linear-time.
- Searching: Linear search O(n), Binary search O(log n).