Probability Theory and Random Processes (MA225)
Lecture Slides

Lecture 19: Bivariate normal

Def: A two-dimensional random vector X = (X, Y) is said to have a bivariate normal distribution if aX + bY is univariate normal for all (a, b) ∈ R² \ {(0, 0)}.
Theorem: If µ = E(X) and Σ is the variance-covariance matrix of X, then for any fixed u = (a, b) ∈ R² \ {(0, 0)}, u′X ∼ N(u′µ, u′Σu).
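As a quick numerical illustration of this theorem (a sketch only; the particular µ, Σ, u and the use of NumPy are choices made here, not part of the slides), one can sample X and compare the empirical mean and variance of u′X with u′µ and u′Σu:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (arbitrary) parameters
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])
u = np.array([3.0, -1.0])            # any (a, b) != (0, 0)

# Draw samples of X = (X, Y) and form the linear combination u'X
X = rng.multivariate_normal(mu, Sigma, size=200_000)
lin = X @ u

print("mean of u'X:", lin.mean(), "  theory u'mu:", u @ mu)
print("var  of u'X:", lin.var(),  "  theory u'Sigma u:", u @ Sigma @ u)
```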
Theorem: Let X be a bivariate normal random vector. Then
\[
M_X(t) = e^{t'\mu + \frac{1}{2} t'\Sigma t} \quad \text{for all } t \in \mathbb{R}^2.
\]
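A rough Monte Carlo check of the MGF formula (again only a sketch, with arbitrarily chosen µ, Σ and a small t): estimate E[exp(t′X)] from samples and compare with exp(t′µ + ½ t′Σt).

```python
import numpy as np

rng = np.random.default_rng(1)

mu = np.array([0.5, -0.3])
Sigma = np.array([[1.0, 0.4],
                  [0.4, 0.8]])
t = np.array([0.2, -0.1])            # a small t keeps the Monte Carlo average stable

X = rng.multivariate_normal(mu, Sigma, size=500_000)
mgf_mc = np.exp(X @ t).mean()                        # E[exp(t'X)] estimated from samples
mgf_formula = np.exp(t @ mu + 0.5 * t @ Sigma @ t)   # exp(t'mu + t'Sigma t / 2)

print(mgf_mc, mgf_formula)           # the two values should be close
```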
Remark: Thus the bivariate normal distribution is completely specified by the mean vector µ and the variance-covariance matrix Σ. We may therefore denote a bivariate normal distribution by N₂(µ, Σ).
Def: A two-dimensional random vector X is said to have a bivariate normal distribution if it can be expressed in the form X = µ + AY, where A is a 2 × 2 matrix of real numbers, Y = (Y₁, Y₂), and Y₁ and Y₂ are i.i.d. N(0, 1). In this case E(X) = µ and Σ = AA′.
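This constructive definition is easy to simulate. The sketch below (with an arbitrary µ and A chosen for illustration) builds X = µ + AY from i.i.d. N(0, 1) variables and checks that the sample mean and covariance are close to µ and AA′.

```python
import numpy as np

rng = np.random.default_rng(2)

mu = np.array([1.0, 2.0])
A = np.array([[1.0, 0.0],
              [0.5, 1.2]])            # any real 2x2 matrix

# Each row of Y is an i.i.d. pair (Y1, Y2) with Y1, Y2 ~ N(0, 1)
Y = rng.standard_normal((200_000, 2))
X = mu + Y @ A.T                      # row-wise version of X = mu + A Y

print("E(X)   ~", X.mean(axis=0), "   theory:", mu)
print("Cov(X) ~\n", np.cov(X, rowvar=False), "\n theory AA':\n", A @ A.T)
```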
Theorem: If X ∼ N₂(µ, Σ), then X ∼ N(µ₁, σ₁₁) and Y ∼ N(µ₂, σ₂₂).
Remark: The converse of the above theorem is not true: both marginals can be univariate normal without (X, Y) being bivariate normal (a standard counterexample is sketched below).
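The standard counterexample (added here for illustration) is X ∼ N(0, 1), S = ±1 with probability ½ independent of X, and Y = SX. Both marginals are N(0, 1), yet X + Y equals 0 half the time, so X + Y is not normal and (X, Y) cannot be bivariate normal by the first definition.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

X = rng.standard_normal(n)
S = rng.choice([-1.0, 1.0], size=n)   # random sign, independent of X
Y = S * X                             # Y is also N(0, 1) by symmetry

# If (X, Y) were bivariate normal, X + Y would be normal; instead it has an
# atom at 0 of probability 1/2 (whenever S = -1).
print("P(X + Y = 0) ~", np.mean(X + Y == 0.0))   # about 0.5
```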
Theorem: Let X ∼ N₂(µ, Σ) be such that Σ is invertible. Then, for all x ∈ R², X has a joint PDF given by
\[
f(x) = \frac{1}{2\pi|\Sigma|^{1/2}} \exp\left\{-\frac{1}{2}(x-\mu)'\Sigma^{-1}(x-\mu)\right\}
     = \frac{1}{2\pi\sigma_x\sigma_y\sqrt{1-\rho^2}}\, e^{A(x,\,y,\,\mu_x,\,\mu_y,\,\sigma_x,\,\sigma_y,\,\rho)},
\]
where
\[
A = -\frac{1}{2(1-\rho^2)}\left[\left(\frac{x-\mu_x}{\sigma_x}\right)^2
    - 2\rho\left(\frac{x-\mu_x}{\sigma_x}\right)\left(\frac{y-\mu_y}{\sigma_y}\right)
    + \left(\frac{y-\mu_y}{\sigma_y}\right)^2\right].
\]
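To see that the two expressions agree, one can evaluate both at a point and compare with scipy.stats.multivariate_normal; all parameter values below are arbitrary illustrative choices.

```python
import numpy as np
from scipy.stats import multivariate_normal

mu_x, mu_y = 1.0, -1.0
sig_x, sig_y, rho = 1.5, 0.8, 0.6

mu = np.array([mu_x, mu_y])
Sigma = np.array([[sig_x**2,            rho * sig_x * sig_y],
                  [rho * sig_x * sig_y, sig_y**2           ]])

x, y = 0.3, -0.5
v = np.array([x, y]) - mu

# Matrix form of the PDF
pdf_matrix = np.exp(-0.5 * v @ np.linalg.inv(Sigma) @ v) / (2 * np.pi * np.sqrt(np.linalg.det(Sigma)))

# (sigma_x, sigma_y, rho) form with the exponent A
zx, zy = (x - mu_x) / sig_x, (y - mu_y) / sig_y
A = -(zx**2 - 2 * rho * zx * zy + zy**2) / (2 * (1 - rho**2))
pdf_rho = np.exp(A) / (2 * np.pi * sig_x * sig_y * np.sqrt(1 - rho**2))

print(pdf_matrix, pdf_rho, multivariate_normal(mu, Sigma).pdf([x, y]))  # all three should match
```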

Remark: If Cov(X, Y) = 0 (equivalently ρ = 0), then X and Y are independent, since the joint PDF above then factors into the product of the two marginal PDFs.


Theorem: Let X ∼ N₂(µ, Σ) be such that Σ is invertible. Then
1. For all y ∈ R, the conditional PDF of X given Y = y is given by
\[
f_{X\mid Y}(x\mid y) = \frac{1}{\sigma_{x\mid y}\sqrt{2\pi}}
\exp\left[-\frac{1}{2}\left(\frac{x-\mu_{x\mid y}}{\sigma_{x\mid y}}\right)^2\right]
\quad \text{for } x \in \mathbb{R},
\]
where µ_{x|y} = µ_x + ρ (σ_x/σ_y)(y − µ_y) and σ²_{x|y} = σ²_x (1 − ρ²).
2. E(X | Y = y) = µ_{x|y} = µ_x + ρ (σ_x/σ_y)(y − µ_y) for all y ∈ R.
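A simulation sketch of part 1 (all parameters chosen arbitrarily for the illustration): among samples whose Y falls in a narrow window around y, the X values should have mean close to µ_{x|y} and variance close to σ²_x(1 − ρ²).

```python
import numpy as np

rng = np.random.default_rng(4)

mu_x, mu_y = 0.0, 1.0
sig_x, sig_y, rho = 2.0, 1.0, 0.7
Sigma = np.array([[sig_x**2,            rho * sig_x * sig_y],
                  [rho * sig_x * sig_y, sig_y**2           ]])

samples = rng.multivariate_normal([mu_x, mu_y], Sigma, size=1_000_000)

y0 = 2.0
near = samples[np.abs(samples[:, 1] - y0) < 0.02, 0]   # X values with Y close to y0

print("cond. mean ~", near.mean(), "  theory:", mu_x + rho * (sig_x / sig_y) * (y0 - mu_y))
print("cond. var  ~", near.var(),  "  theory:", sig_x**2 * (1 - rho**2))
```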
Theorem: Let X₁, X₂, …, Xₙ be i.i.d. N(0, 1) random variables. Then
\[
\sum_{i=1}^{n} X_i^2 \sim \mathrm{Gamma}(n/2,\, 1/2) \equiv \chi^2_n.
\]
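This identity can be checked with SciPy: the χ²ₙ density coincides with the Gamma(n/2, 1/2) density (SciPy parameterises the gamma by shape and scale, so rate 1/2 corresponds to scale 2), and both match sums of squared N(0, 1) samples. A small sketch with n = 5 (an arbitrary choice):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n = 5

# Sums of n squared independent N(0, 1) variables
sums = (rng.standard_normal((100_000, n)) ** 2).sum(axis=1)

x = np.linspace(0.5, 15.0, 5)
print(stats.chi2.pdf(x, df=n))                    # chi-square(n) density
print(stats.gamma.pdf(x, a=n / 2, scale=2.0))     # Gamma(n/2, rate 1/2) density
print("sample mean:", sums.mean(), "  theory E[chi^2_n] = n =", n)
```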

Theorem: Let X₁, X₂, …, Xₙ be i.i.d. N(µ, σ²) random variables. Then X̄ ∼ N(µ, σ²/n), (n − 1)S²/σ² ∼ χ²ₙ₋₁, and X̄ and S² are independently distributed. Here
\[
\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i \quad \text{and} \quad
S^2 = \frac{1}{n-1}\sum_{i=1}^{n}\left(X_i - \bar{X}\right)^2.
\]
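A simulation sketch of this result (sample size, µ and σ are arbitrary illustrative choices): repeat the experiment many times, compute X̄ and S² for each repetition, and compare the empirical behaviour with N(µ, σ²/n), χ²ₙ₋₁, and the (weaker) consequence that X̄ and S² are uncorrelated.

```python
import numpy as np

rng = np.random.default_rng(6)
mu, sigma, n, reps = 3.0, 2.0, 10, 200_000

data = rng.normal(mu, sigma, size=(reps, n))
xbar = data.mean(axis=1)              # sample mean of each row
s2 = data.var(axis=1, ddof=1)         # unbiased sample variance S^2 of each row

print("Var(Xbar) ~", xbar.var(), "  theory sigma^2/n:", sigma**2 / n)
print("E[(n-1)S^2/sigma^2] ~", ((n - 1) * s2 / sigma**2).mean(), "  theory:", n - 1)
print("corr(Xbar, S^2) ~", np.corrcoef(xbar, s2)[0, 1], "  (should be near 0)")
```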
