An algorithm is a finite set of steps designed to perform a task or solve a problem.
Big-O Notation: Describes the performance or complexity of an algorithm; a short illustration follows the list below.
- O(1): Constant time
- O(n): Linear time
- O(log n): Logarithmic time
- O(n^2): Quadratic time
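As a rough illustration (not part of the original notes), here is a minimal Python sketch contrasting O(1), O(n), and O(n^2) operations; the function names are invented for this example.

    def constant_time_lookup(items, i):
        # O(1): indexing a Python list takes the same time regardless of its length.
        return items[i]

    def linear_time_sum(items):
        # O(n): touches every element exactly once.
        total = 0
        for x in items:
            total += x
        return total

    def quadratic_time_pairs(items):
        # O(n^2): nested loops combine every pair of elements.
        return [(a, b) for a in items for b in items]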
Sorting Algorithms:
- Bubble Sort: Repeatedly swaps adjacent elements if they are in the wrong order. Time complexity: O(n^2)
- Merge Sort: Divides the array into halves, sorts them, and merges the results. Time complexity: O(n log n); see the sketch after this list.
- Quick Sort: Uses a pivot to partition the array. Average time complexity: O(n log n)
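The following is a minimal sketch of Bubble Sort and Merge Sort as described above; sorting a copy of the input and the helper names are choices made for this illustration.

    def bubble_sort(arr):
        # Repeatedly swap adjacent out-of-order elements; O(n^2) time.
        arr = list(arr)  # sort a copy, leave the input untouched
        for i in range(len(arr)):
            for j in range(len(arr) - 1 - i):
                if arr[j] > arr[j + 1]:
                    arr[j], arr[j + 1] = arr[j + 1], arr[j]
        return arr

    def merge_sort(arr):
        # Divide into halves, sort each recursively, then merge; O(n log n) time.
        if len(arr) <= 1:
            return list(arr)
        mid = len(arr) // 2
        left = merge_sort(arr[:mid])
        right = merge_sort(arr[mid:])
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged

    print(bubble_sort([5, 2, 4, 1]))  # [1, 2, 4, 5]
    print(merge_sort([5, 2, 4, 1]))   # [1, 2, 4, 5]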
Searching Algorithms:
- Linear Search: Checks each element one by one. Time complexity: O(n)
- Binary Search: Works on sorted arrays by repeatedly dividing the search interval in half. Time complexity: O(log n); see the sketch after this list.
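A minimal sketch of both searches on a Python list; returning -1 when the target is missing is an assumption made for this example.

    def linear_search(items, target):
        # O(n): check each element one by one.
        for index, value in enumerate(items):
            if value == target:
                return index
        return -1

    def binary_search(sorted_items, target):
        # O(log n): repeatedly halve the search interval of a sorted list.
        lo, hi = 0, len(sorted_items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if sorted_items[mid] == target:
                return mid
            elif sorted_items[mid] < target:
                lo = mid + 1   # discard the left half
            else:
                hi = mid - 1   # discard the right half
        return -1

    print(binary_search([1, 3, 5, 7, 9], 7))  # 3
    print(binary_search([1, 3, 5, 7, 9], 4))  # -1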
Recursion: A function that calls itself, shrinking the problem until a base case stops it. Example:

    def factorial(n):
        if n == 0:
            return 1          # base case
        return n * factorial(n - 1)

Greedy vs Dynamic Programming:
- Greedy: Makes the locally optimal choice at each step.
- Dynamic Programming (DP): Solves subproblems and combines their solutions (e.g., Fibonacci numbers with memoization; see the sketch below).
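A minimal sketch of the memoized Fibonacci example mentioned above; using functools.lru_cache is just one way to memoize, chosen here for brevity.

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def fib(n):
        # DP via memoization: each subproblem fib(k) is computed once and cached.
        if n < 2:
            return n  # base cases: fib(0) = 0, fib(1) = 1
        return fib(n - 1) + fib(n - 2)

    print(fib(10))  # 55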
Common Mistakes:
- Off-by-one errors
- Infinite recursion
- Forgetting base cases