Advanced Probability Notes

The document provides an overview of advanced probability theory, covering key concepts such as random variables, probability distributions, expectation, variance, and joint distributions. It also discusses important theorems like the Law of Large Numbers and the Central Limit Theorem, as well as common probability distributions and Bayes' theorem. These concepts are foundational for applications in statistics, machine learning, and other fields.
1. Probability Theory Overview


Probability theory provides the mathematical foundation for quantifying uncertainty. It is essential in
fields such as statistics, machine learning, economics, and engineering.

2. Random Variables
A random variable (RV) is a function that maps outcomes of a random experiment to numerical
values. It can be either discrete or continuous.

• Discrete RV: Takes countable values (e.g., Binomial, Poisson).

• Continuous RV: Takes values in an interval (e.g., Normal, Exponential).
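
The distinction above can be illustrated with a minimal sketch using only the standard library (the parameters 10 and 0.5 are arbitrary choices for illustration):

```python
import random

# Discrete RV: number of successes in 10 Bernoulli(0.5) trials,
# i.e. a Binomial(10, 0.5) draw -- its support is the countable set {0, ..., 10}
def binomial_sample(n=10, p=0.5):
    return sum(1 for _ in range(n) if random.random() < p)

# Continuous RV: a Normal(0, 1) draw -- it can take any real value
normal_draw = random.gauss(0.0, 1.0)

heads = binomial_sample()
assert 0 <= heads <= 10            # discrete: countable support
assert isinstance(normal_draw, float)  # continuous: a real number
```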

3. Probability Distributions
Probability distributions describe how probabilities are distributed over possible outcomes of a
random variable.

• Probability Mass Function (PMF): For discrete RVs.

• Probability Density Function (PDF): For continuous RVs.

• Cumulative Distribution Function (CDF): Gives P(X ≤ x).
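
As a sketch of the PMF/CDF relationship for a discrete RV, here is the Binomial case written from first principles (the parameters n = 4, p = 0.3 are arbitrary):

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def binomial_cdf(x, n, p):
    """P(X <= x): the CDF of a discrete RV sums the PMF up to x."""
    return sum(binomial_pmf(k, n, p) for k in range(0, x + 1))

# Sanity checks: the PMF sums to 1 over the support, so the CDF reaches 1 at n
total = sum(binomial_pmf(k, 4, 0.3) for k in range(5))
assert abs(total - 1.0) < 1e-9
assert abs(binomial_cdf(4, 4, 0.3) - 1.0) < 1e-9
```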

4. Expectation and Variance


Expectation (mean) represents the average value of a random variable, and variance measures its
spread.

• E[X] = ∑ x·P(X=x) for discrete, ∫ x·f(x) dx for continuous.

• Var(X) = E[X²] – (E[X])².
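
The two formulas above can be checked directly on a small discrete distribution (the PMF below is a hypothetical example):

```python
# A discrete RV given as {value: probability} -- hypothetical numbers
pmf = {1: 0.2, 2: 0.5, 3: 0.3}

E_X   = sum(x * p for x, p in pmf.items())      # E[X] = sum of x * P(X=x)
E_X2  = sum(x**2 * p for x, p in pmf.items())   # E[X^2]
Var_X = E_X2 - E_X**2                           # Var(X) = E[X^2] - (E[X])^2

print(round(E_X, 4))    # 2.1
print(round(Var_X, 4))  # 0.49
```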

5. Joint, Marginal, and Conditional Distributions


For multiple random variables:

• Joint distribution: P(X, Y) or f(x, y).

• Marginal distribution: Obtained by summing/integrating out other variables.

• Conditional distribution: P(X|Y) = P(X, Y) / P(Y).
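
These three objects can be computed from a small joint table (the joint probabilities below are hypothetical):

```python
# Joint PMF of (X, Y) as a dict: (x, y) -> P(X=x, Y=y) -- hypothetical numbers
joint = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

# Marginals: sum out the other variable
P_X = {x: sum(p for (xi, _), p in joint.items() if xi == x) for x in (0, 1)}
P_Y = {y: sum(p for (_, yi), p in joint.items() if yi == y) for y in (0, 1)}

# Conditional: P(X=x | Y=y) = P(X=x, Y=y) / P(Y=y)
def cond_X_given_Y(x, y):
    return joint[(x, y)] / P_Y[y]

print(P_X)                      # marginal of X: {0: 0.4, 1: 0.6}
print(round(cond_X_given_Y(1, 1), 4))  # 0.4 / 0.7
```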

6. Covariance and Correlation


Covariance measures joint variability; correlation normalizes covariance to [-1, 1].

• Cov(X,Y) = E[(X−E[X])(Y−E[Y])].

• ρ(X,Y) = Cov(X,Y) / (σ_X σ_Y).
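
A minimal sketch of both definitions applied to sample data (the arrays below are made up; ys is an exact linear function of xs, so the correlation should be 1):

```python
from math import sqrt

def mean(xs):
    return sum(xs) / len(xs)

def covariance(xs, ys):
    """Sample analogue of Cov(X,Y) = E[(X - E[X])(Y - E[Y])]."""
    mx, my = mean(xs), mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

def correlation(xs, ys):
    """rho = Cov(X,Y) / (sigma_X * sigma_Y); always lies in [-1, 1]."""
    return covariance(xs, ys) / sqrt(covariance(xs, xs) * covariance(ys, ys))

xs = [1, 2, 3, 4]
ys = [2, 4, 6, 8]               # perfectly linear in xs
print(correlation(xs, ys))      # 1.0
```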

7. Law of Large Numbers (LLN) & Central Limit Theorem (CLT)


These theorems form the basis of inferential statistics:

• LLN: Sample mean converges to expected value as n → ∞.


• CLT: The standardized sum (or mean) of many i.i.d. RVs with finite variance is
approximately Normal, regardless of the original distribution.
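
The LLN can be seen in a quick simulation: the sample mean of fair-die rolls drifts toward E[X] = 3.5 as n grows (the seed below is an arbitrary choice for reproducibility):

```python
import random

random.seed(0)  # arbitrary seed, for reproducibility

# LLN: the mean of n fair-die rolls approaches E[X] = 3.5 as n grows
for n in (100, 10_000, 1_000_000):
    sample_mean = sum(random.randint(1, 6) for _ in range(n)) / n
    # the gap to 3.5 typically shrinks as n increases
    print(n, round(sample_mean, 3))
```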

8. Moment Generating Functions (MGFs)


MGFs help find moments and identify distributions.

• M_X(t) = E[e^{tX}].

• n-th moment: E[Xⁿ] = M_X⁽ⁿ⁾(0), the n-th derivative of M_X evaluated at t = 0.
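
As a sketch, the MGF of a Bernoulli(p) RV is M(t) = (1 − p) + p·eᵗ, and its derivative at t = 0 should recover the first moment E[X] = p; here the derivative is approximated numerically (p = 0.3 is an arbitrary choice):

```python
from math import exp

p = 0.3
M = lambda t: (1 - p) + p * exp(t)   # MGF of Bernoulli(p): E[e^{tX}]

# First moment via a central finite-difference derivative at t = 0
h = 1e-6
first_moment = (M(h) - M(-h)) / (2 * h)
print(round(first_moment, 6))  # 0.3, i.e. E[X] = p
```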

9. Common Distributions
• Bernoulli: P(X=1)=p.

• Binomial(n,p): Sum of n Bernoulli trials.

• Poisson(λ): Counts events in fixed interval.

• Exponential(λ): Time between Poisson events.

• Normal(µ,σ²): Continuous symmetric distribution.

• Uniform(a,b): Equal probability over [a,b].
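
Most of these can be sampled with the standard library alone (all parameter values below are arbitrary illustrations; Poisson has no stdlib sampler):

```python
import random

random.seed(1)  # arbitrary seed, for reproducibility

bern  = 1 if random.random() < 0.4 else 0                    # Bernoulli(p=0.4)
binom = sum(1 for _ in range(10) if random.random() < 0.4)   # Binomial(10, 0.4)
expo  = random.expovariate(2.0)                              # Exponential(lambda=2)
norm  = random.gauss(0.0, 1.0)                               # Normal(mu=0, sigma^2=1)
unif  = random.uniform(-1.0, 1.0)                            # Uniform(-1, 1)
# Poisson is not in the standard library; one way to simulate it is to count
# how many Exponential(lambda) inter-arrival times fit in a unit interval.
```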

10. Conditional Expectation & Bayes’ Theorem


Conditional expectation generalizes expectation under given information. Bayes’ theorem connects
conditional probabilities:

• E[X|Y] = ∑ x·P(X=x|Y).

• Bayes: P(A|B) = [P(B|A)P(A)] / P(B).
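
Bayes' theorem is easiest to see on a worked example. The numbers below describe a hypothetical diagnostic test with a rare condition (1% prevalence, 99% sensitivity, 5% false-positive rate):

```python
# Hypothetical diagnostic test:
# P(D) = 0.01, sensitivity P(+|D) = 0.99, false-positive rate P(+|not D) = 0.05
p_d, p_pos_d, p_pos_nd = 0.01, 0.99, 0.05

# Law of total probability: P(+) = P(+|D)P(D) + P(+|not D)P(not D)
p_pos = p_pos_d * p_d + p_pos_nd * (1 - p_d)

# Bayes: P(D|+) = P(+|D)P(D) / P(+)
p_d_pos = p_pos_d * p_d / p_pos
print(round(p_d_pos, 3))  # 0.167 -- a positive test still leaves D unlikely
```

Note how a highly sensitive test can still yield a low posterior when the prior P(D) is small; this is the standard base-rate effect.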
