has chapters on combinatorics in algorithmic problems.
- Online Courses: Coursera – Introduction to Discrete Mathematics for Computer Science (covers
standard combinatorial settings and how to recognize them in algorithmic problems) [10][11]. This
specialization by UC San Diego includes practical exercises in counting paths, passwords, etc.
Example Applications: Combinatorics pops up in many interview problems. A classic example is “Unique
Paths” (LeetCode) – counting paths in a grid can be solved with the combination formula $C(r+c, r)$ (choosing
which $r$ of the $r+c$ total moves go in one direction). “Subsets” and “Permutations” problems explicitly generate combinations or
permutations of input sets. Another example is calculating the number of structurally unique BSTs for $n$
nodes (LeetCode “Unique Binary Search Trees”), which uses the Catalan number formula mentioned above
[7]. In backtracking problems like “Combinations” or “Combination Sum”, understanding $C(n, k)$
helps you reason about complexity and verify results. Essentially, whenever you need to count arrangements,
selections, or sequences, combinatorics is the tool to apply.
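To make these formulas concrete, here is a minimal Python sketch applying both counting results mentioned above (the function names are illustrative, not from LeetCode; it assumes Python 3.8+ for `math.comb`):

```python
from math import comb

def unique_paths(rows: int, cols: int) -> int:
    """Paths from top-left to bottom-right of a grid, moving only down/right.

    A path consists of (rows - 1) down-moves and (cols - 1) right-moves;
    choosing which of the (rows + cols - 2) moves go down fixes the path,
    so the count is C(rows + cols - 2, rows - 1).
    """
    return comb(rows + cols - 2, rows - 1)

def unique_bsts(n: int) -> int:
    """Structurally unique BSTs on n nodes: the nth Catalan number,
    C(2n, n) / (n + 1)."""
    return comb(2 * n, n) // (n + 1)

print(unique_paths(3, 7))  # 28 -- matches the LeetCode "Unique Paths" example
print(unique_bsts(3))      # 5  -- matches "Unique Binary Search Trees"
```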
Probability & Basic Statistics
Why it matters in DSA: Probability theory helps when dealing with randomized algorithms, probabilistic
analysis of algorithms, and certain brainteaser problems. In coding interviews, probability concepts
often appear in problems involving random choices (e.g. shuffling an array, sampling data) or analyzing
expected outcomes. For instance, understanding probability ensures that a shuffle algorithm produces every
permutation with equal likelihood [12], or that randomized QuickSort avoids worst-case $O(n^2)$
behavior on average [13][14]. Basic statistics (expectation, variance) help in reasoning about the average-case
performance and fairness of algorithms. In short, probability “keeps you flexible” in approaching problems
involving randomness [2].
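As an illustration of the equal-likelihood guarantee, here is a minimal sketch of the Fisher–Yates shuffle (the cited sources do not name the algorithm; this is the standard way to achieve a uniform shuffle):

```python
import random

def fisher_yates_shuffle(items: list) -> None:
    """Shuffle in place so that every permutation is equally likely.

    At each step, the element placed at index i is chosen uniformly from
    the i + 1 remaining candidates; multiplying these probabilities gives
    each of the n! permutations a probability of exactly 1 / n!.
    """
    for i in range(len(items) - 1, 0, -1):
        j = random.randint(0, i)           # uniform over 0..i inclusive
        items[i], items[j] = items[j], items[i]

deck = list(range(10))
fisher_yates_shuffle(deck)
print(deck)
```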
Must-Know Concepts: Basic probability rules (an event’s probability lies between 0 and 1, the
probabilities of all outcomes sum to 1, etc.) [15]. Key definitions like independent events (with the rule
$P(A \text{ and } B) = P(A) \cdot P(B)$) and mutually exclusive events (with the rule
$P(A \text{ or } B) = P(A) + P(B)$) [16]. Conditional probability
($P(A|B)$) and understanding how conditions affect likelihood. Bayes’ Theorem (for updating probabilities
given new information) is a nice conceptual tool – not common in coding questions, but good to know in
case a probability puzzle or a simple inference problem appears. Also learn about expected value (the
average outcome over many trials). Expectation is used to analyze randomized algorithms (e.g. expected
runtime or expected number of loop iterations) [17]. For example, reservoir sampling’s correctness is proven
by showing that each element has an equal probability of ending up in the sample (see the sketch below).
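A minimal Python sketch of reservoir sampling (this is the classic Algorithm R; the function name is illustrative), with the key probability noted in comments:

```python
import random

def reservoir_sample(stream, k: int) -> list:
    """Pick k items uniformly from a stream of unknown length.

    Item i (1-indexed) enters the reservoir with probability k / i and,
    by induction, survives later replacements often enough that every
    item ends up in the final sample with probability exactly k / n.
    """
    reservoir = []
    for i, item in enumerate(stream, start=1):
        if i <= k:
            reservoir.append(item)         # fill the reservoir first
        else:
            j = random.randint(1, i)       # uniform in 1..i
            if j <= k:                     # happens with probability k / i
                reservoir[j - 1] = item
    return reservoir

print(reservoir_sample(range(1, 101), 5))  # 5 items, each chosen with prob 5/100
```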
Nice-to-Know: Variance and standard deviation (less likely to be asked directly, but understanding variance
helps in reasoning about the stability of results). Common distributions such as the binomial distribution (e.g.
the probability of exactly $k$ successes in $n$ independent trials, which combines combinatorics and
probability; see the sketch below) are occasionally useful. The uniform distribution (all outcomes equally likely) is
implicit in many random selection problems. These topics are more relevant in data science or ML, but can
occasionally appear in algorithm puzzles (e.g. analyzing a randomized algorithm’s success probability). If
you’re preparing for specialized roles, you might also look at randomized data structures or hashing
(which involve the probability of collisions).
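For reference, a short Python sketch of the binomial probability just described (the function name is mine; the formula $C(n, k)\, p^k (1-p)^{n-k}$ is the standard binomial PMF):

```python
from math import comb

def binomial_pmf(n: int, k: int, p: float) -> float:
    """P(exactly k successes in n independent trials, each succeeding
    with probability p): C(n, k) * p**k * (1 - p)**(n - k)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# e.g. probability of exactly 5 heads in 10 fair coin flips
print(binomial_pmf(10, 5, 0.5))  # ~0.2461
```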
Resources: - Video Lessons: Make School – Probability (YouTube series) [9] is a beginner-friendly
introduction covering basic probability and examples (Make School’s videos on Probability Part 1 and 2).
Khan Academy’s “Probability Explained” playlist [8] is excellent for short, concept-focused videos (41 short