Algorithms
An algorithm is a step-by-step procedure or formula for solving a problem. It involves a finite
number of well-defined instructions to achieve a particular goal. Algorithms can be
represented using pseudocode, flowcharts, or programming languages.
Time Complexity
Time complexity measures the amount of computational time that an algorithm takes to run
as a function of the size of the input. It helps in understanding the efficiency of an algorithm.
Common time complexities include:
O(1): Constant time — the running time does not depend on n (e.g., indexing into an array).
O(log n): Logarithmic time (e.g., binary search).
O(n): Linear time (e.g., a single scan of the input).
O(n log n): Linearithmic time (e.g., merge sort).
O(n^2): Quadratic time (e.g., comparing all pairs of elements).
O(2^n): Exponential time (e.g., enumerating all subsets).
O(n!): Factorial time (e.g., enumerating all permutations).
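A few of these growth rates can be sketched in Python; the function names below are illustrative, not standard library calls:

```python
def constant_lookup(items):
    # O(1): one step regardless of how many items there are.
    return items[0]

def linear_sum(items):
    # O(n): touches each element exactly once.
    total = 0
    for x in items:
        total += x
    return total

def quadratic_pairs(items):
    # O(n^2): a nested loop visits every pair of elements.
    pairs = []
    for a in items:
        for b in items:
            pairs.append((a, b))
    return pairs
```

Doubling the input size leaves constant_lookup unchanged, doubles the work in linear_sum, and quadruples it in quadratic_pairs.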
Recurrence
Recurrence relations are equations or inequalities that describe a function in terms of its value
on smaller inputs. They are often used to characterize the time complexity of recursive
algorithms. Solving recurrence relations provides insights into the overall complexity of the
algorithm.
Example: The recurrence T(n) = 2T(n/2) + n, which describes an algorithm that makes two recursive calls on halves of the input and does linear work to combine them, solves to T(n) = O(n log n) (by the master theorem).
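Merge sort is the classic algorithm described by this recurrence; a minimal Python sketch makes the correspondence visible:

```python
def merge_sort(a):
    # Two recursive calls on halves (the 2T(n/2) term) plus a linear
    # merge (the + n term), so T(n) = 2T(n/2) + n => O(n log n).
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    # Merge the two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```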
Probabilistic Analysis
Probabilistic analysis evaluates the performance of an algorithm when the input is subjected
to some random distribution. It helps in understanding the average-case behavior of
algorithms, especially when worst-case analysis is too pessimistic.
Example: QuickSort's average-case time complexity is O(n log n) when pivots are chosen at random, even though its worst case is O(n^2); randomizing the pivot makes the expected running time independent of the input ordering.
Amortized Analysis
Amortized analysis examines the average performance of an algorithm over a sequence of
operations, smoothing out the cost of expensive operations over time. This is particularly
useful for analyzing algorithms where occasional operations are expensive but infrequent.
Example: In a dynamic array, the cost of resizing (doubling the array) is high, but amortized
over many insertions, the average cost per insertion remains constant, O(1).
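The doubling strategy can be sketched as a small Python class (the class name and fields are illustrative):

```python
class DynamicArray:
    # Minimal sketch of a dynamic array that doubles its capacity
    # when full; real implementations add deletion, bounds checks, etc.
    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.data = [None] * self.capacity

    def append(self, value):
        if self.size == self.capacity:
            # Occasional O(n) resize: copy everything into a buffer
            # of twice the capacity.
            self.capacity *= 2
            new_data = [None] * self.capacity
            for i in range(self.size):
                new_data[i] = self.data[i]
            self.data = new_data
        # The common case: a single O(1) write.
        self.data[self.size] = value
        self.size += 1
```

Over n appends the total copying cost is at most 1 + 2 + 4 + ... + n < 2n, so the amortized cost per append is O(1).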