
NPTEL- Probability and Distributions

MODULE 6
RANDOM VECTOR AND ITS JOINT DISTRIBUTION
LECTURE 32

Topics
6.10 DISTRIBUTION OF FUNCTIONS OF RANDOM VECTORS
6.10.1 Distribution Function Technique
6.10.1.1 Marginal Distribution of Order Statistics of a Random Sample of Absolutely Continuous Type Random Variables

6.10 DISTRIBUTION OF FUNCTIONS OF RANDOM VECTORS
Let $X = (X_1, \ldots, X_p)$ be a random vector of either discrete type or of absolutely continuous type and let $f_X(\cdot)$ denote the p.m.f./p.d.f. of $X$. Let $g: \mathbb{R}^p \to \mathbb{R}$ be a Borel function. As the following example illustrates, in many situations it may be of interest to find the probability distribution of $g(X)$.

Example 10.1

Consider a company that manufactures electric bulbs. The lifetimes of electric bulbs manufactured by the company are random. Past experience with testing of electric bulbs manufactured by the company suggests that the lifetime of a randomly chosen electric bulb can be described by a random variable $X$ having the p.d.f.
$$f_X(x \mid \theta) = \begin{cases} \dfrac{1}{\theta}\, e^{-\frac{x}{\theta}}, & \text{if } x > 0 \\ 0, & \text{otherwise,} \end{cases} \qquad \theta > 0.$$
However, the value of $\theta > 0$ is not evident from the past experience and therefore $\theta$ is unknown. One way to obtain information about the unknown $\theta$ is to test, independently and under identical conditions, a number (say $n$) of electric bulbs manufactured by the company. Let $X_i$ denote the lifetime of the $i$-th bulb, $i = 1, \ldots, n$. We call $X_1, \ldots, X_n$ (which are independent and identically distributed random variables from the distribution $f_X(\cdot \mid \theta)$, $\theta > 0$) the random sample from the distribution $f_X(\cdot \mid \theta)$, $\theta > 0$. Clearly the joint p.d.f. of $X = (X_1, \ldots, X_n)$ is given by
$$f_X(x \mid \theta) = \prod_{i=1}^{n} f_{X_i}(x_i \mid \theta) = \begin{cases} \dfrac{1}{\theta^n}\, e^{-\frac{\sum_{i=1}^{n} x_i}{\theta}}, & \text{if } x_i > 0,\ i = 1, \ldots, n \\ 0, & \text{otherwise.} \end{cases}$$
Since $E(X_i) = \theta$, a natural estimator of $\theta$ is the sample mean $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$. To study theoretical properties of the estimator $\bar{X}$ we need the probability distribution of $\bar{X}$. ▄
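
As a quick numerical illustration of this point, the following Python sketch (NumPy assumed available; the values $\theta = 2$, $n = 25$ are illustrative, not part of the example) draws repeated random samples from the above exponential p.d.f. and records the sample mean, showing how $\bar{X}$ fluctuates around $\theta$; quantifying that fluctuation is exactly why the distribution of $\bar{X}$ is needed.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
theta, n, reps = 2.0, 25, 10_000          # illustrative values only

# Each row is one random sample X_1, ..., X_n from the p.d.f.
# f(x | theta) = (1/theta) * exp(-x/theta), x > 0.
samples = rng.exponential(scale=theta, size=(reps, n))
xbar = samples.mean(axis=1)               # the estimator Xbar for each replication

print("mean of Xbar :", xbar.mean())      # close to theta
print("var of Xbar  :", xbar.var())       # close to theta**2 / n
```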

Definition 10.1

(i) A function of one or more random variables that does not depend on any unknown parameter is called a statistic.
(ii) Let $X_1, \ldots, X_n$ be a collection of independent random variables, each having the same p.m.f./p.d.f. $f$ (or distribution function $F$). We then call $X_1, \ldots, X_n$ a random sample (of size $n$) from a distribution having p.m.f./p.d.f. $f$ (or distribution function $F$). In other words, a random sample is a collection of independent and identically distributed random variables.

Remark 10.1

(i) Let $X = (X_1, X_2) \sim N_2(\mu_1, \mu_2, \sigma_1^2, \sigma_2^2, \rho)$, $-\infty < \mu_i < \infty$, $\sigma_i > 0$, $i = 1, 2$, $-1 < \rho < 1$. Then the random variable $Y_1 = X_1 + X_2$ is a statistic but the random variable $Y_2 = \frac{X_1 - \mu_1}{\sigma_1}$ is not a statistic unless $\mu_1$ and $\sigma_1$ are known parameters.
(ii) Although a statistic does not depend upon any unknown parameters, the distribution of a statistic may very well depend upon unknown parameters. For example, in (i) above, $Y_1 \sim N(\mu_1 + \mu_2, \sigma_1^2 + \sigma_2^2 + 2\rho\sigma_1\sigma_2)$.
(iii) If $X_1, \ldots, X_n$ is a random sample from a distribution having p.m.f./p.d.f. $f(\cdot)$, then the joint p.m.f./p.d.f. of $X = (X_1, \ldots, X_n)$ is
$$f_X(x_1, \ldots, x_n) = \prod_{i=1}^{n} f_{X_i}(x_i) = \prod_{i=1}^{n} f(x_i), \quad x = (x_1, \ldots, x_n) \in \mathbb{R}^n.$$
(iv) Let $X_1, \ldots, X_n$ be a random sample from a distribution. Some of the commonly used statistics are the following (a short computational sketch follows the list):

(a) Sample Mean: $\bar{X} = \dfrac{1}{n}\displaystyle\sum_{i=1}^{n} X_i$;

(b) Sample Variance: $S^2 = \dfrac{1}{n-1}\displaystyle\sum_{i=1}^{n} (X_i - \bar{X})^2 = \dfrac{1}{n-1}\left(\displaystyle\sum_{i=1}^{n} X_i^2 - n\bar{X}^2\right)$, $n \geq 2$;

(c) $r$-th Order Statistic: $X_{r:n} = r$-th smallest of $X_1, \ldots, X_n$, $r = 1, 2, \ldots, n$;

(d) Sample Range: $R = X_{n:n} - X_{1:n}$;

(e) Sample Median: $M = \begin{cases} X_{\frac{n+1}{2}:n}, & \text{if } n \text{ is odd} \\ \dfrac{X_{\frac{n}{2}:n} + X_{\frac{n}{2}+1:n}}{2}, & \text{if } n \text{ is even.} \end{cases}$
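
For a concrete sample these statistics are straightforward to compute; the following sketch (NumPy assumed; the sample size and parent distribution are purely illustrative) evaluates (a)-(e) for one simulated sample.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=11)                   # one observed sample x_1, ..., x_n (n = 11)
n = x.size

xbar = x.mean()                           # (a) sample mean
s2 = x.var(ddof=1)                        # (b) sample variance, divisor n - 1
order_stats = np.sort(x)                  # (c) X_{1:n} <= X_{2:n} <= ... <= X_{n:n}
sample_range = order_stats[-1] - order_stats[0]   # (d) X_{n:n} - X_{1:n}
median = np.median(x)                     # (e) middle order statistic (n odd here)

print(xbar, s2, sample_range, median)
```
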
Theorem 10.1

Let $X_1, \ldots, X_n$ be a random sample from a distribution having p.m.f./p.d.f. $f(\cdot)$. Then, for any permutation $(\beta_1, \ldots, \beta_n)$ of $(1, \ldots, n)$,
$$(X_1, \ldots, X_n) \overset{d}{=} (X_{\beta_1}, \ldots, X_{\beta_n}).$$

Proof. Let $(\beta_1, \ldots, \beta_n)$ be a permutation of $(1, \ldots, n)$ and let $(\gamma_1, \ldots, \gamma_n)$ be the inverse permutation of $(\beta_1, \ldots, \beta_n)$. Then, for $x = (x_1, \ldots, x_n) \in \mathbb{R}^n$,
$$f_{X_{\beta_1}, \ldots, X_{\beta_n}}(x_1, \ldots, x_n) = f_{X_1, \ldots, X_n}(x_{\gamma_1}, \ldots, x_{\gamma_n}) = \prod_{i=1}^{n} f_{X_i}(x_{\gamma_i}) = \prod_{i=1}^{n} f(x_i) = f_{X_1, \ldots, X_n}(x_1, \ldots, x_n).$$
It follows that
$$f_{X_{\beta_1}, \ldots, X_{\beta_n}}(x) = f_{X_1, \ldots, X_n}(x), \ \forall x \in \mathbb{R}^n$$
$$\Rightarrow (X_{\beta_1}, \ldots, X_{\beta_n}) \overset{d}{=} (X_1, \ldots, X_n). \ ▄$$

Example 10.2

Let $X_1, \ldots, X_n$ be a random sample from a given distribution.

(i) If $X_1$ is of absolutely continuous type then show that $P(X_1 < X_2 < \cdots < X_n) = P(X_{\beta_1} < X_{\beta_2} < \cdots < X_{\beta_n}) = \dfrac{1}{n!}$, for any permutation $(\beta_1, \ldots, \beta_n)$ of $(1, \ldots, n)$;

(ii) If $X_1$ is of absolutely continuous type then show that $P(X_i = X_{r:n}) = \dfrac{1}{n}$, $i = 1, \ldots, n$, where, for $r \in \{1, \ldots, n\}$, $X_{r:n} = r$-th smallest of $X_1, \ldots, X_n$;

(iii) Show that
$$E\left[\frac{X_i}{X_1 + X_2 + \cdots + X_n}\right] = \frac{1}{n}, \quad i = 1, 2, \ldots, n,$$
provided the expectations are finite;

(iv) Show that
$$E\left[X_i \,\middle|\, \sum_{i=1}^{n} X_i = t\right] = \frac{t}{n}, \quad i = 1, \ldots, n.$$

Solution. Let $S_n$ denote the set of all permutations of $(1, \ldots, n)$. Using Theorem 10.1 we have
$$(X_1, \ldots, X_n) \overset{d}{=} (X_{\beta_1}, \ldots, X_{\beta_n}), \ \forall \beta = (\beta_1, \ldots, \beta_n) \in S_n$$
$$\Rightarrow E\left[\Psi(X_1, \ldots, X_n)\right] = E\left[\Psi(X_{\beta_1}, \ldots, X_{\beta_n})\right], \ \forall \beta \in S_n. \tag{10.1}$$

(i) On taking
$$\Psi(x_1, \ldots, x_n) = \begin{cases} 1, & \text{if } x_1 < x_2 < \cdots < x_n \\ 0, & \text{otherwise,} \end{cases}$$
we conclude that
$$P(X_1 < X_2 < \cdots < X_n) = P(X_{\beta_1} < X_{\beta_2} < \cdots < X_{\beta_n}), \ \forall \beta \in S_n. \tag{10.2}$$
Since $P(X_i = X_j) = 0$ for $i \neq j$ (as $(X_i, X_j)$ is of absolutely continuous type; see Remark 2.1 (x)), we have
$$\sum_{\beta \in S_n} P(X_{\beta_1} < X_{\beta_2} < \cdots < X_{\beta_n}) = 1$$
$$\Rightarrow P(X_1 < X_2 < \cdots < X_n) = P(X_{\beta_1} < X_{\beta_2} < \cdots < X_{\beta_n}) = \frac{1}{n!} \quad \text{(using (10.2))}.$$

(ii) Fix $i \in \{1, 2, \ldots, n\}$. On taking
$$\Psi(x_1, \ldots, x_n) = \begin{cases} 1, & \text{if } x_i = r\text{-th smallest of } x_1, \ldots, x_n \\ 0, & \text{otherwise,} \end{cases}$$
in (10.1) and noting that, for any permutation $\beta = (\beta_1, \ldots, \beta_n) \in S_n$, the $r$-th smallest of $X_{\beta_1}, \ldots, X_{\beta_n}$ equals the $r$-th smallest of $X_1, \ldots, X_n$, which is $X_{r:n}$, we have
$$P(X_i = X_{r:n}) = P(X_{\beta_i} = X_{r:n}), \ \forall \beta \in S_n$$
$$\Rightarrow P(X_i = X_{r:n}) = P(X_1 = X_{r:n}), \quad i = 1, \ldots, n.$$
But
$$\sum_{i=1}^{n} P(X_i = X_{r:n}) = 1,$$
and therefore
$$P(X_i = X_{r:n}) = P(X_1 = X_{r:n}) = \frac{1}{n}.$$
(iii) On taking
$$\Psi(x_1, \ldots, x_n) = \frac{x_1}{x_1 + \cdots + x_n}, \quad x \in \mathbb{R}^n,$$
in (10.1) we get, for all $\beta = (\beta_1, \ldots, \beta_n) \in S_n$,
$$E\left[\frac{X_1}{X_1 + \cdots + X_n}\right] = E\left[\frac{X_{\beta_1}}{X_{\beta_1} + \cdots + X_{\beta_n}}\right] = E\left[\frac{X_{\beta_1}}{X_1 + \cdots + X_n}\right] \quad \left(\text{since } \sum_{i=1}^{n} X_i = \sum_{i=1}^{n} X_{\beta_i}\right)$$
$$\Rightarrow E\left[\frac{X_1}{X_1 + \cdots + X_n}\right] = E\left[\frac{X_i}{X_1 + \cdots + X_n}\right], \quad i = 1, \ldots, n. \tag{10.3}$$
But
$$\sum_{i=1}^{n} E\left[\frac{X_i}{X_1 + \cdots + X_n}\right] = E\left[\sum_{i=1}^{n} \frac{X_i}{X_1 + \cdots + X_n}\right] = E\left[\frac{X_1 + \cdots + X_n}{X_1 + \cdots + X_n}\right] = 1.$$
Therefore, from (10.3), we get
$$E\left[\frac{X_1}{X_1 + \cdots + X_n}\right] = E\left[\frac{X_i}{X_1 + \cdots + X_n}\right] = \frac{1}{n}, \quad i = 1, \ldots, n.$$

(iv) For fixed $t$,
$$\left(X_1 \,\middle|\, \sum_{j=1}^{n} X_j = t\right) \overset{d}{=} \left(X_{\beta_1} \,\middle|\, \sum_{j=1}^{n} X_{\beta_j} = t\right), \ \forall \beta \in S_n$$
$$\Rightarrow \left(X_1 \,\middle|\, \sum_{j=1}^{n} X_j = t\right) \overset{d}{=} \left(X_{\beta_1} \,\middle|\, \sum_{j=1}^{n} X_j = t\right), \ \forall \beta \in S_n \quad \left(\text{since } \sum_{j=1}^{n} X_j = \sum_{j=1}^{n} X_{\beta_j}\right)$$
$$\Rightarrow E\left[X_1 \,\middle|\, \sum_{j=1}^{n} X_j = t\right] = E\left[X_{\beta_1} \,\middle|\, \sum_{j=1}^{n} X_j = t\right], \ \forall \beta \in S_n$$
$$\Rightarrow E\left[X_1 \,\middle|\, \sum_{j=1}^{n} X_j = t\right] = E\left[X_i \,\middle|\, \sum_{j=1}^{n} X_j = t\right], \quad i = 1, \ldots, n. \tag{10.4}$$
But
$$\sum_{i=1}^{n} E\left[X_i \,\middle|\, \sum_{j=1}^{n} X_j = t\right] = E\left[\sum_{i=1}^{n} X_i \,\middle|\, \sum_{j=1}^{n} X_j = t\right] = t.$$
Now using (10.4) we get
$$E\left[X_1 \,\middle|\, \sum_{j=1}^{n} X_j = t\right] = E\left[X_i \,\middle|\, \sum_{j=1}^{n} X_j = t\right] = \frac{t}{n}, \quad i = 1, \ldots, n. \ ▄$$
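
The symmetry results above lend themselves to a quick simulation check. The sketch below (NumPy assumed; $n$, $r$ and the exponential parent distribution are illustrative choices) estimates $P(X_1 = X_{r:n})$ and $E[X_1/(X_1 + \cdots + X_n)]$; both estimates should be close to $1/n$.

```python
import numpy as np

rng = np.random.default_rng(2)
n, r, reps = 5, 3, 200_000                 # illustrative choices

# Absolutely continuous parent distribution, so ties occur with probability 0.
x = rng.exponential(size=(reps, n))

# Part (ii): how often is X_1 the r-th smallest observation?
rank_of_x1 = (x < x[:, [0]]).sum(axis=1) + 1
print("P(X_1 = X_{r:n}) ~", np.mean(rank_of_x1 == r), " vs 1/n =", 1 / n)

# Part (iii): E[ X_1 / (X_1 + ... + X_n) ]
print("E[X_1/sum] ~", np.mean(x[:, 0] / x.sum(axis=1)), " vs 1/n =", 1 / n)
```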

In the following subsections we will discuss various techniques to find the distribution of
functions of random variables.

6.10.1 Distribution Function Technique

Let $X = (X_1, \ldots, X_p)$ be a random vector and let $g: \mathbb{R}^p \to \mathbb{R}$ be a Borel function. The distribution of $Y = g(X_1, \ldots, X_p)$ can be determined by computing the distribution function
$$F_Y(y) = P(g(X_1, \ldots, X_p) \leq y), \quad -\infty < y < \infty.$$

6.10.1.1 Marginal Distribution of Order Statistics of a Random Sample of Absolutely Continuous Type Random Variables

Example 10.1.1

Let $X_1, \ldots, X_n$ be a random sample of absolutely continuous type random variables, each having the distribution function $F(\cdot)$ and p.d.f. $f(\cdot)$. Suppose that $F$ is differentiable everywhere except (possibly) on a countable set $C$, so that $f(x) = \frac{d}{dx} F(x)$, $\forall x \notin C$, and
$$\int_{-\infty}^{\infty} f(x)\, I_{C^c}(x)\, dx = 1.$$
Then the joint distribution function of $X = (X_1, \ldots, X_n)$ is
$$F_X(x_1, \ldots, x_n) = \prod_{i=1}^{n} F(x_i), \quad x \in \mathbb{R}^n.$$
We have
$$F_X(x) = \prod_{i=1}^{n} F(x_i) = \prod_{i=1}^{n} \int_{-\infty}^{x_i} f(t_i)\, dt_i = \int_{-\infty}^{x_1} \cdots \int_{-\infty}^{x_n} \prod_{i=1}^{n} f(t_i)\, dt_n \cdots dt_1 = \int_{-\infty}^{x_1} \cdots \int_{-\infty}^{x_n} f_X(t)\, dt_n \cdots dt_1.$$
It follows that $X$ is of absolutely continuous type with joint p.d.f. $f_X(\cdot)$. Therefore, for $i \neq j$,
$$P(X_i = X_j) = 0.$$

Define

$$X_{r:n} = r\text{-th smallest of } X_1, \ldots, X_n, \quad r = 1, \ldots, n,$$
so that
$$P(X_{1:n} < X_{2:n} < \cdots < X_{n:n}) = 1.$$
First let us derive the distribution of $X_{r:n}$, $r = 1, \ldots, n$. Note that, for $x \in \mathbb{R}$,
$$X_{r:n} \leq x \Leftrightarrow \text{at least } r \text{ of } X_1, \ldots, X_n \text{ are } \leq x.$$
Therefore
$$F_{X_{r:n}}(x) = P(X_{r:n} \leq x) = P(\text{at least } r \text{ of } X_1, \ldots, X_n \text{ are } \leq x) = \sum_{i=r}^{n} P(i \text{ of } X_1, \ldots, X_n \text{ are } \leq x), \quad x \in \mathbb{R}.$$
Fix $x \in \mathbb{R}$, and consider a sequence of $n$ trials where at the $i$-th trial we observe $X_i$, calling the trial a success if $X_i \leq x$ and a failure if $X_i > x$, $i = 1, \ldots, n$. Since $X_1, \ldots, X_n$ are independent and the probability of success in the $i$-th trial is $P(X_i \leq x) = F(x)$ (the same for all trials), the above sequence of trials may be considered as a sequence of independent Bernoulli trials with probability of success $F(x)$ in each trial. Therefore
$$P(i \text{ of } X_1, \ldots, X_n \text{ are } \leq x) = P(i \text{ successes in } n \text{ trials}) = \binom{n}{i} (F(x))^i (1 - F(x))^{n-i},$$
and consequently
$$F_{X_{r:n}}(x) = \sum_{i=r}^{n} \binom{n}{i} (F(x))^i (1 - F(x))^{n-i}, \quad x \in \mathbb{R}.$$

Recall that for $s \in \{1, \ldots, n\}$ and $p \in (0, 1)$ (see Theorem 3.1, Module 5)
$$\sum_{j=s}^{n} \binom{n}{j} p^j (1-p)^{n-j} = \frac{1}{B(s, n-s+1)} \int_0^p t^{s-1} (1-t)^{n-s}\, dt.$$
Therefore,
$$F_{X_{r:n}}(x) = \frac{1}{B(r, n-r+1)} \int_0^{F(x)} t^{r-1} (1-t)^{n-r}\, dt, \quad x \in \mathbb{R}.$$

Let
$$f_{X_{r:n}}(x) = \frac{1}{B(r, n-r+1)}\, (F(x))^{r-1} (1 - F(x))^{n-r} f(x), \quad x \in \mathbb{R}, \tag{10.1.1}$$
so that
$$\frac{d}{dx} F_{X_{r:n}}(x) = f_{X_{r:n}}(x), \quad \forall x \notin C,$$
and
$$\int_{-\infty}^{\infty} f_{X_{r:n}}(x)\, I_{C^c}(x)\, dx = \int_{-\infty}^{\infty} \frac{1}{B(r, n-r+1)}\, (F(x))^{r-1} (1 - F(x))^{n-r} f(x)\, I_{C^c}(x)\, dx = \frac{1}{B(r, n-r+1)} \int_0^1 t^{r-1} (1-t)^{n-r}\, dt = 1.$$

Using Remark 4.2 (vii), Module 2, it follows that the random variable $X_{r:n}$ is of absolutely continuous type with p.d.f. given by (10.1.1). A simple heuristic argument for expression (10.1.1) is as follows. Interpret $f_{X_{r:n}}(x)\Delta x$ as the probability that $X_{r:n}$ lies in an infinitesimal interval $(x, x + \Delta x]$. Since the probability of more than one of the $X_i$'s falling in the infinitesimal interval $(x, x + \Delta x]$ is negligible, $f_{X_{r:n}}(x)\Delta x$ may be interpreted as the probability that one of the $X_i$'s falls in the infinitesimal interval $(x, x + \Delta x]$, $r - 1$ of the $X_i$'s fall in the interval $(-\infty, x]$ and $n - r$ of the $X_i$'s fall in the interval $(x + \Delta x, \infty) \approx (x, \infty)$. Since $X_1, \ldots, X_n$ are independent and the probabilities of an observation falling in the intervals $(x, x + \Delta x]$, $(-\infty, x]$ and $(x, \infty)$ are $f(x)\Delta x$, $F(x)$ and $1 - F(x)$ respectively, $f_{X_{r:n}}(x)\Delta x$ is given by the multinomial probability
$$f_{X_{r:n}}(x)\Delta x \approx \frac{n!}{1!\,(r-1)!\,(n-r)!}\, \big(f(x)\Delta x\big)\, (F(x))^{r-1} (1 - F(x))^{n-r},$$
i.e.,
$$f_{X_{r:n}}(x) = \frac{n!}{(r-1)!\,(n-r)!}\, (F(x))^{r-1} (1 - F(x))^{n-r} f(x), \quad -\infty < x < \infty.$$
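
For a $U(0,1)$ sample, $F(x) = x$ and $f(x) = 1$ on $(0,1)$, so formula (10.1.1) reduces to the $Beta(r, n-r+1)$ p.d.f.; this gives a convenient numerical check. The sketch below (NumPy and SciPy assumed available; the values of $n$ and $r$ are illustrative) compares a histogram of simulated values of $X_{r:n}$ with that density.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n, r, reps = 7, 3, 100_000                 # illustrative values

u = rng.uniform(size=(reps, n))
x_rn = np.sort(u, axis=1)[:, r - 1]        # r-th order statistic of each sample

# For F(x) = x on (0, 1), (10.1.1) is the Beta(r, n - r + 1) density.
hist, edges = np.histogram(x_rn, bins=30, density=True)
mids = 0.5 * (edges[:-1] + edges[1:])
max_err = np.max(np.abs(hist - stats.beta.pdf(mids, r, n - r + 1)))
print("max |histogram - Beta(r, n-r+1) pdf| ~", max_err)   # small for large reps
```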

Now we will derive the joint distribution of $(X_{r:n}, X_{s:n})$, where $r$ and $s$ are fixed positive integers satisfying $1 \leq r < s \leq n$. For $-\infty < x < y < \infty$,
$$F_{X_{r:n}, X_{s:n}}(x, y) = P(X_{r:n} \leq x, X_{s:n} \leq y) = P(\text{at least } r \text{ of } X_1, \ldots, X_n \text{ are } \leq x \text{ and at least } s \text{ of } X_1, \ldots, X_n \text{ are } \leq y)$$
$$= \sum_{\substack{0 \leq i,\, j \leq n \\ r \leq i \leq n \\ s \leq i + j \leq n}} P\big(i \text{ of } X_1, \ldots, X_n \text{ are in } (-\infty, x] \text{ and } j \text{ of } X_1, \ldots, X_n \text{ are in } (x, y]\big).$$

Since $X_1, \ldots, X_n$ are independent and the probabilities of an observation falling in the intervals $(-\infty, x]$, $(x, y]$ and $(y, \infty)$ are $F(x)$, $F(y) - F(x)$ and $1 - F(y)$ respectively, using the multinomial distribution we have, for $r \leq i \leq n$, $s \leq i + j \leq n$ and $-\infty < x < y < \infty$,
$$P\big(i \text{ of } X_1, \ldots, X_n \text{ are in } (-\infty, x] \text{ and } j \text{ of } X_1, \ldots, X_n \text{ are in } (x, y]\big) = \frac{n!}{i!\, j!\, (n-i-j)!}\, (F(x))^i (F(y) - F(x))^j (1 - F(y))^{n-i-j}.$$

Therefore, for $-\infty < x < y < \infty$,
$$F_{X_{r:n}, X_{s:n}}(x, y) = \sum_{\substack{r \leq i \leq n \\ s \leq i + j \leq n}} \frac{n!}{i!\, j!\, (n-i-j)!}\, (F(x))^i (F(y) - F(x))^j (1 - F(y))^{n-i-j}$$
$$= \sum_{i=r}^{n} \sum_{j=\max(0,\, s-i)}^{n-i} \frac{n!}{i!\, j!\, (n-i-j)!}\, (F(x))^i (F(y) - F(x))^j (1 - F(y))^{n-i-j}$$
$$= \sum_{i=r}^{s-1} \sum_{j=s-i}^{n-i} \frac{n!}{i!\, j!\, (n-i-j)!}\, (F(x))^i (F(y) - F(x))^j (1 - F(y))^{n-i-j} + \sum_{i=s}^{n} \sum_{j=0}^{n-i} \frac{n!}{i!\, j!\, (n-i-j)!}\, (F(x))^i (F(y) - F(x))^j (1 - F(y))^{n-i-j}$$
$$= \sum_{i=r}^{s-1} \sum_{j=s-i}^{n-i} \binom{n}{i} (F(x))^i \binom{n-i}{j} (F(y) - F(x))^j (1 - F(y))^{n-i-j} + \sum_{i=s}^{n} \sum_{j=0}^{n-i} \binom{n}{i} (F(x))^i \binom{n-i}{j} (F(y) - F(x))^j (1 - F(y))^{n-i-j}$$
$$= \sum_{i=r}^{s-1} \binom{n}{i} (F(x))^i (1 - F(x))^{n-i} \sum_{j=s-i}^{n-i} \binom{n-i}{j} \left(\frac{F(y) - F(x)}{1 - F(x)}\right)^j \left(1 - \frac{F(y) - F(x)}{1 - F(x)}\right)^{n-i-j} + \sum_{i=s}^{n} \binom{n}{i} (F(x))^i (1 - F(x))^{n-i}$$
$$= \sum_{i=r}^{s-1} \binom{n}{i} (F(x))^i (1 - F(x))^{n-i}\, \frac{1}{B(s-i, n-s+1)} \int_0^{\frac{F(y) - F(x)}{1 - F(x)}} t^{s-i-1} (1-t)^{n-s}\, dt + \frac{1}{B(s, n-s+1)} \int_0^{F(x)} t^{s-1} (1-t)^{n-s}\, dt \quad \text{(using Theorem 3.1, Module 5)}.$$

Thus, for $-\infty < x < y < \infty$, $x \notin C$, $y \notin C$,
$$\frac{\partial}{\partial y} F_{X_{r:n}, X_{s:n}}(x, y) = \sum_{i=r}^{s-1} \binom{n}{i} (F(x))^i (1 - F(x))^{n-i}\, \frac{(n-i)!}{(s-i-1)!\,(n-s)!} \left(\frac{F(y) - F(x)}{1 - F(x)}\right)^{s-i-1} \left(1 - \frac{F(y) - F(x)}{1 - F(x)}\right)^{n-s} \frac{f(y)}{1 - F(x)}$$
$$= \frac{n!\, (1 - F(y))^{n-s}}{(s-1)!\,(n-s)!}\, f(y) \sum_{i=r}^{s-1} \binom{s-1}{i} (F(x))^i (F(y) - F(x))^{s-i-1}$$
$$= \frac{n!}{(s-1)!\,(n-s)!}\, (F(y))^{s-1} (1 - F(y))^{n-s} f(y) \sum_{i=r}^{s-1} \binom{s-1}{i} \left(\frac{F(x)}{F(y)}\right)^i \left(1 - \frac{F(x)}{F(y)}\right)^{s-i-1}$$
$$= \frac{n!}{(s-1)!\,(n-s)!}\, (F(y))^{s-1} (1 - F(y))^{n-s} f(y)\, \frac{1}{B(r, s-r)} \int_0^{\frac{F(x)}{F(y)}} t^{r-1} (1-t)^{s-r-1}\, dt$$
$$\Rightarrow f_{X_{r:n}, X_{s:n}}(x, y) = \frac{\partial^2}{\partial x\, \partial y} F_{X_{r:n}, X_{s:n}}(x, y) = \frac{n!}{(r-1)!\,(s-r-1)!\,(n-s)!}\, (F(y))^{s-1} (1 - F(y))^{n-s} f(y) \left(\frac{F(x)}{F(y)}\right)^{r-1} \left(1 - \frac{F(x)}{F(y)}\right)^{s-r-1} \frac{f(x)}{F(y)}$$
$$= \frac{n!}{(r-1)!\,(s-r-1)!\,(n-s)!}\, (F(x))^{r-1} (F(y) - F(x))^{s-r-1} (1 - F(y))^{n-s} f(x) f(y), \quad -\infty < x < y < \infty,\ x, y \in C^c.$$

Also, for $-\infty < y < x < \infty$, $x, y \in C^c$, and $1 \leq r < s \leq n$,
$$\{X_{s:n} \leq y\} \subseteq \{X_{r:n} \leq x\},$$
and therefore
$$F_{X_{r:n}, X_{s:n}}(x, y) = P(X_{r:n} \leq x, X_{s:n} \leq y) = P(X_{s:n} \leq y) = F_{X_{s:n}}(y)$$
$$\Rightarrow \frac{\partial^2}{\partial x\, \partial y} F_{X_{r:n}, X_{s:n}}(x, y) = 0, \quad -\infty < y < x < \infty,\ x, y \in C^c.$$

Let
$$f_{r,s}(x, y) = \begin{cases} \dfrac{n!}{(r-1)!\,(s-r-1)!\,(n-s)!}\, (F(x))^{r-1} (F(y) - F(x))^{s-r-1} (1 - F(y))^{n-s} f(x) f(y), & \text{if } -\infty < x < y < \infty \\ 0, & \text{otherwise,} \end{cases} \tag{10.1.2}$$
so that
$$\frac{\partial^2}{\partial x\, \partial y} F_{X_{r:n}, X_{s:n}}(x, y) = f_{r,s}(x, y), \quad \forall (x, y) \in \mathbb{R}^2 - (C \times C) = D \text{ (say)}.$$

Clearly $D^c$ is countable. It is easy to verify that
$$\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{r,s}(x, y)\, dx\, dy = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{r,s}(x, y)\, I_D(x, y)\, dx\, dy = 1.$$
Using Remark 2.1 (xiii), it follows that the random vector $(X_{r:n}, X_{s:n})$ is of absolutely continuous type with joint p.d.f. given by (10.1.2). One can give the following heuristic argument for the expression (10.1.2). For $-\infty < x < y < \infty$, $f_{r,s}(x, y)\Delta x \Delta y$ is the probability that $r - 1$ of the $X_i$'s fall in $(-\infty, x]$, one $X_i$ falls in $(x, x + \Delta x]$, $s - r - 1$ of the $X_i$'s fall in $(x + \Delta x, y] \approx (x, y]$, one $X_i$ falls in $(y, y + \Delta y]$ and $n - s$ of the $X_i$'s fall in $(y + \Delta y, \infty) \approx (y, \infty)$. Using the multinomial distribution, we have
$$f_{r,s}(x, y)\Delta x \Delta y \approx \frac{n!}{(r-1)!\, 1!\, (s-r-1)!\, 1!\, (n-s)!}\, (F(x))^{r-1} \big(f(x)\Delta x\big) (F(y) - F(x))^{s-r-1} \big(f(y)\Delta y\big) (1 - F(y))^{n-s},$$
i.e.,
$$f_{r,s}(x, y) = \frac{n!}{(r-1)!\,(s-r-1)!\,(n-s)!}\, (F(x))^{r-1} (F(y) - F(x))^{s-r-1} (1 - F(y))^{n-s} f(x) f(y), \quad -\infty < x < y < \infty. \ ▄$$
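
Formula (10.1.2) can also be checked numerically. For a $U(0,1)$ sample ($F(x) = x$, $f(x) = 1$ on $(0,1)$) the sketch below (NumPy and SciPy assumed available; $n$, $r$, $s$ and the cut-offs $0.4$, $0.7$ are illustrative) integrates $f_{r,s}$ over $\{x < y\}$ and compares a probability computed from (10.1.2) with a Monte Carlo estimate.

```python
import numpy as np
from scipy.integrate import dblquad
from math import factorial

n, r, s = 6, 2, 5                              # illustrative values, 1 <= r < s <= n
c = factorial(n) / (factorial(r - 1) * factorial(s - r - 1) * factorial(n - s))

def f_rs(x, y):
    # Joint p.d.f. (10.1.2) for F(x) = x, f(x) = 1 on (0, 1)
    if 0 < x < y < 1:
        return c * x**(r - 1) * (y - x)**(s - r - 1) * (1 - y)**(n - s)
    return 0.0

# dblquad(func, a, b, gfun, hfun): the outer variable is the second argument of func
total = dblquad(f_rs, 0, 1, lambda y: 0.0, lambda y: y)[0]              # should be ~ 1
prob = dblquad(f_rs, 0, 0.7, lambda y: 0.0, lambda y: min(y, 0.4))[0]   # P(X_{r:n}<=0.4, X_{s:n}<=0.7)

rng = np.random.default_rng(4)
u = np.sort(rng.uniform(size=(200_000, n)), axis=1)
mc = np.mean((u[:, r - 1] <= 0.4) & (u[:, s - 1] <= 0.7))
print(total, prob, mc)                          # prob and mc should be close
```
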
Example 10.1.2

Let $X_1, \ldots, X_n$ be a random sample from a discrete distribution having support $S$, distribution function $F(\cdot)$ and p.m.f. $f(\cdot)$. Define $X_{1:n} = \min\{X_1, \ldots, X_n\}$ and $X_{n:n} = \max\{X_1, \ldots, X_n\}$. Find the p.m.f.s of $X_{1:n}$ and $X_{n:n}$.

Solution. For $x \in \mathbb{R}$, the distribution function of $X_{1:n}$ is
$$F_{X_{1:n}}(x) = P(X_{1:n} \leq x) = 1 - P(X_{1:n} > x) = 1 - P(X_i > x,\ i = 1, \ldots, n) = 1 - \prod_{i=1}^{n} P(X_i > x) = 1 - (1 - F(x))^n.$$
Note that
$$D_{X_{1:n}} = \{x \in \mathbb{R}: F_{X_{1:n}}(\cdot) \text{ is discontinuous at } x\} = \{x \in \mathbb{R}: F(\cdot) \text{ is discontinuous at } x\} = S.$$
Thus $X_{1:n}$ is a discrete type random variable with support $S$ and p.m.f.
$$f_{X_{1:n}}(x) = \begin{cases} F_{X_{1:n}}(x) - F_{X_{1:n}}(x-), & \text{if } x \in S \\ 0, & \text{otherwise} \end{cases} = \begin{cases} (1 - F(x-))^n - (1 - F(x))^n, & \text{if } x \in S \\ 0, & \text{otherwise.} \end{cases}$$

Also the distribution function of $X_{n:n}$ is given by
$$F_{X_{n:n}}(x) = P(X_{n:n} \leq x) = P(X_i \leq x,\ i = 1, \ldots, n) = \prod_{i=1}^{n} P(X_i \leq x) = (F(x))^n, \quad x \in \mathbb{R}.$$
Since $F_{X_{n:n}}(\cdot)$ is continuous at $x$ if, and only if, $F(\cdot)$ is continuous at $x$, the random variable $X_{n:n}$ is of discrete type with support $S$ and p.m.f.
$$f_{X_{n:n}}(x) = \begin{cases} F_{X_{n:n}}(x) - F_{X_{n:n}}(x-), & \text{if } x \in S \\ 0, & \text{otherwise} \end{cases} = \begin{cases} (F(x))^n - (F(x-))^n, & \text{if } x \in S \\ 0, & \text{otherwise.} \end{cases} ▄$$
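
As a concrete discrete illustration (not part of the original example), take a fair die, so that $S = \{1, \ldots, 6\}$ and $F(x) = x/6$ on $S$. The sketch below (NumPy assumed) evaluates the p.m.f. formulas for $X_{1:n}$ and $X_{n:n}$ and compares the latter with a simulation.

```python
import numpy as np

rng = np.random.default_rng(5)
n, reps = 4, 200_000                       # illustrative values
support = np.arange(1, 7)                  # fair die: f(x) = 1/6, F(x) = x/6

F = support / 6.0
F_minus = (support - 1) / 6.0              # F(x-) on the support
pmf_min = (1 - F_minus)**n - (1 - F)**n    # p.m.f. of X_{1:n}
pmf_max = F**n - F_minus**n                # p.m.f. of X_{n:n}

rolls = rng.integers(1, 7, size=(reps, n))
sim_max = np.array([(rolls.max(axis=1) == k).mean() for k in support])
print(np.round(pmf_max, 4))
print(np.round(sim_max, 4))                # should agree with pmf_max
```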

Example 10.1.3

Let $X_1, X_2$ be a random sample from the $U(0,1)$ distribution. Find the distribution function of $Y = X_1 + X_2$. Hence find the p.d.f. of $Y$.

Solution. The joint p.d.f. of $(X_1, X_2)$ is given by
$$f_{X_1, X_2}(x_1, x_2) = f_{X_1}(x_1) f_{X_2}(x_2) = \begin{cases} 1, & \text{if } 0 < x_1 < 1,\ 0 < x_2 < 1 \\ 0, & \text{otherwise.} \end{cases}$$
Therefore the distribution function of $Y$ is given by
$$F_Y(x) = P(X_1 + X_2 \leq x) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{X_1, X_2}(x_1, x_2)\, I_{(-\infty, x]}(x_1 + x_2)\, dx_1\, dx_2 = \iint\limits_{\substack{0 < x_1 < 1,\ 0 < x_2 < 1 \\ x_1 + x_2 \leq x}} dx_1\, dx_2$$
$$= \begin{cases} 0, & \text{if } x < 0 \\ \frac{1}{2}\, x \cdot x, & \text{if } 0 \leq x < 1 \\ 1 - \frac{1}{2}\, (2 - x)(2 - x), & \text{if } 1 \leq x < 2 \\ 1, & \text{if } x \geq 2 \end{cases}$$
$$\Rightarrow F_Y(x) = \begin{cases} 0, & \text{if } x < 0 \\ \dfrac{x^2}{2}, & \text{if } 0 \leq x < 1 \\ \dfrac{4x - x^2 - 2}{2}, & \text{if } 1 \leq x < 2 \\ 1, & \text{if } x \geq 2. \end{cases}$$

Clearly $F_Y(\cdot)$ is differentiable everywhere except on a finite set $C \subseteq \{0, 1, 2\}$. Let
$$g(x) = \begin{cases} x, & \text{if } 0 < x < 1 \\ 2 - x, & \text{if } 1 < x < 2 \\ 0, & \text{otherwise,} \end{cases}$$
so that
$$\frac{d}{dx} F_Y(x) = g(x), \quad \forall x \in \mathbb{R} - C, \qquad \text{and} \qquad \int_{-\infty}^{\infty} g(x)\, dx = 1.$$
It follows that $Y$ is of absolutely continuous type with a p.d.f.
$$f_Y(x) = g(x) = \begin{cases} x, & \text{if } 0 < x < 1 \\ 2 - x, & \text{if } 1 < x < 2 \\ 0, & \text{otherwise.} \end{cases} ▄$$
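
A quick numerical check of the triangular distribution function obtained above (NumPy assumed; the evaluation points are illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)
y = rng.uniform(size=200_000) + rng.uniform(size=200_000)   # Y = X_1 + X_2, X_i ~ U(0,1)

# Compare the empirical c.d.f. of Y with F_Y at a few points
for x in (0.5, 1.0, 1.5):
    F_exact = x**2 / 2 if x < 1 else (4 * x - x**2 - 2) / 2
    print(x, np.mean(y <= x), F_exact)
```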

Example 10.1.4

Let $X_1, X_2$ be a random sample from a distribution having p.d.f.
$$f(x) = \begin{cases} 2x, & \text{if } 0 < x < 1 \\ 0, & \text{otherwise.} \end{cases}$$
Find the distribution function of $Y = X_1 + X_2$. Hence find the p.d.f. of $Y$.

Solution. The joint p.d.f. of $(X_1, X_2)$ is given by
$$f_{X_1, X_2}(x_1, x_2) = f_{X_1}(x_1) f_{X_2}(x_2) = \begin{cases} 4 x_1 x_2, & \text{if } 0 < x_1 < 1,\ 0 < x_2 < 1 \\ 0, & \text{otherwise.} \end{cases}$$
The distribution function of $Y$ is given by
$$F_Y(x) = P(X_1 + X_2 \leq x) = \iint\limits_{\substack{0 < x_1 < 1,\ 0 < x_2 < 1 \\ x_1 + x_2 \leq x}} 4 x_1 x_2\, dx_1\, dx_2.$$
Clearly, for $x < 0$, $F_Y(x) = 0$ and, for $x \geq 2$, $F_Y(x) = 1$.

For $0 \leq x < 1$,
$$F_Y(x) = \int_0^x \left(\int_0^{x - x_1} 4 x_1 x_2\, dx_2\right) dx_1 = \frac{x^4}{6}.$$
For $1 \leq x < 2$,
$$F_Y(x) = \int_0^{x-1} \left(\int_0^1 4 x_1 x_2\, dx_2\right) dx_1 + \int_{x-1}^1 \left(\int_0^{x - x_1} 4 x_1 x_2\, dx_2\right) dx_1 = (x-1)^2 + \frac{4x - 3 - (x+3)(x-1)^3}{6}.$$
Therefore,
$$F_Y(x) = \begin{cases} 0, & \text{if } x < 0 \\ \dfrac{x^4}{6}, & \text{if } 0 \leq x < 1 \\ (x-1)^2 + \dfrac{4x - 3 - (x+3)(x-1)^3}{6}, & \text{if } 1 \leq x < 2 \\ 1, & \text{if } x \geq 2. \end{cases}$$

Clearly $F_Y(\cdot)$ is differentiable everywhere except on a finite set $C \subseteq \{0, 1, 2\}$. Let
$$g(x) = \begin{cases} \dfrac{2}{3}\, x^3, & \text{if } 0 < x < 1 \\ 2(x-1) + \dfrac{2}{3}\left[1 - (x+2)(x-1)^2\right], & \text{if } 1 < x < 2 \\ 0, & \text{otherwise,} \end{cases}$$
so that
$$\frac{d}{dx} F_Y(x) = g(x), \quad \forall x \in \mathbb{R} - C, \qquad \text{and} \qquad \int_{-\infty}^{\infty} g(x)\, dx = 1.$$
It follows that $Y$ is of absolutely continuous type with a p.d.f.
$$f_Y(x) = g(x) = \begin{cases} \dfrac{2}{3}\, x^3, & \text{if } 0 < x < 1 \\ 2(x-1) + \dfrac{2}{3}\left[1 - (x+2)(x-1)^2\right], & \text{if } 1 < x < 2 \\ 0, & \text{otherwise.} \end{cases} ▄$$
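
As in the previous example, the result can be verified numerically. If $U \sim U(0,1)$, then $\sqrt{U}$ has p.d.f. $2x$ on $(0,1)$, so $Y$ is easy to simulate (sketch only; NumPy and SciPy assumed available):

```python
import numpy as np
from scipy.integrate import quad

rng = np.random.default_rng(7)
x1, x2 = np.sqrt(rng.uniform(size=(2, 200_000)))   # each sqrt(U) has p.d.f. 2x on (0, 1)
y = x1 + x2

def g(x):
    # p.d.f. of Y = X_1 + X_2 derived above
    if 0 < x < 1:
        return 2 * x**3 / 3
    if 1 < x < 2:
        return 2 * (x - 1) + (2 / 3) * (1 - (x + 2) * (x - 1)**2)
    return 0.0

print("integral of g :", quad(g, 0, 1)[0] + quad(g, 1, 2)[0])            # ~ 1
print("P(Y <= 1.5)   :", np.mean(y <= 1.5),
      "vs", quad(g, 0, 1)[0] + quad(g, 1, 1.5)[0])                       # ~ F_Y(1.5)
```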

Example 10.1.5

Let $X_1, X_2, X_3$ be a random sample and let $X_1 \sim N(0, 1)$. Find the distribution function of $Y = X_1^2 + X_2^2 + X_3^2$. Hence find the p.d.f. of $Y$.

Solution. The joint p.d.f. of $X = (X_1, X_2, X_3)$ is
$$f_X(x_1, x_2, x_3) = \prod_{i=1}^{3} f_{X_i}(x_i) = \prod_{i=1}^{3} \frac{1}{\sqrt{2\pi}}\, e^{-\frac{x_i^2}{2}} = \frac{1}{(2\pi)^{3/2}}\, e^{-\frac{1}{2}(x_1^2 + x_2^2 + x_3^2)}, \quad -\infty < x_i < \infty,\ i = 1, 2, 3.$$
Therefore the distribution function of $Y = X_1^2 + X_2^2 + X_3^2$ is
$$F_Y(y) = \iiint\limits_{x_1^2 + x_2^2 + x_3^2 \leq y} \frac{1}{(2\pi)^{3/2}}\, e^{-\frac{1}{2}(x_1^2 + x_2^2 + x_3^2)}\, dx_1\, dx_2\, dx_3.$$

On making the spherical coordinate transformation
$$x_1 = r \sin\theta_1 \sin\theta_2, \qquad x_2 = r \sin\theta_1 \cos\theta_2, \qquad x_3 = r \cos\theta_1,$$
so that $r > 0$, $0 < \theta_1 \leq \pi$, $0 < \theta_2 \leq 2\pi$ and the Jacobian of the transformation is $J = r^2 \sin\theta_1$, we get, for $y > 0$,
$$F_Y(y) = \int_0^{\sqrt{y}} \int_0^{\pi} \int_0^{2\pi} \frac{1}{(2\pi)^{3/2}}\, e^{-\frac{r^2}{2}}\, r^2 \sin\theta_1\, d\theta_2\, d\theta_1\, dr = \sqrt{\frac{2}{\pi}} \int_0^{\sqrt{y}} e^{-\frac{r^2}{2}}\, r^2\, dr = \frac{1}{2^{3/2}\, \Gamma(3/2)} \int_0^{y} e^{-\frac{t}{2}}\, t^{\frac{3}{2} - 1}\, dt.$$
Therefore
$$F_Y(y) = \begin{cases} 0, & \text{if } y \leq 0 \\ \dfrac{1}{2^{3/2}\, \Gamma(3/2)} \displaystyle\int_0^{y} e^{-\frac{t}{2}}\, t^{\frac{3}{2} - 1}\, dt, & \text{if } y > 0. \end{cases}$$
Clearly $F_Y(\cdot)$ is the distribution function of the $\chi_3^2$ distribution having the p.d.f.
$$f_Y(y) = \begin{cases} \dfrac{e^{-\frac{y}{2}}\, y^{\frac{3}{2} - 1}}{2^{3/2}\, \Gamma(3/2)}, & \text{if } y > 0 \\ 0, & \text{otherwise.} \end{cases}$$
Thus $Y \sim \chi_3^2$ (also see Example 7.6 (ii)). ▄
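
The identification $Y \sim \chi_3^2$ is easy to confirm numerically (NumPy and SciPy assumed available; sketch only):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
z = rng.standard_normal(size=(200_000, 3))
y = (z**2).sum(axis=1)                      # Y = X_1^2 + X_2^2 + X_3^2

# Kolmogorov-Smirnov comparison with the chi-square distribution on 3 d.f.
print(stats.kstest(y, "chi2", args=(3,)))
print("P(Y <= 2):", np.mean(y <= 2), "vs", stats.chi2.cdf(2, df=3))
```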

In many situations finding the distribution function
$$F_Y(y) = P(g(X_1, \ldots, X_p) \leq y), \quad -\infty < y < \infty,$$
of the random variable $Y = g(X_1, \ldots, X_p)$ may be difficult or quite tedious. For example, consider a random sample $X_1, \ldots, X_n$ $(n \geq 4)$ from the $N(0, 1)$ distribution and suppose that the distribution function of $Y = \sum_{i=1}^{n} X_i^2$ is desired. Clearly, for $y > 0$,
$$F_Y(y) = \idotsint\limits_{x_1^2 + \cdots + x_n^2 \leq y} \frac{1}{(2\pi)^{n/2}}\, e^{-\frac{1}{2}\sum_{i=1}^{n} x_i^2}\, dx_1\, dx_2 \cdots dx_n.$$

On making the spherical coordinate transformation
$$\begin{aligned}
x_1 &= r \sin\theta_1 \sin\theta_2 \cdots \sin\theta_{n-2} \sin\theta_{n-1}, \\
x_2 &= r \sin\theta_1 \sin\theta_2 \cdots \sin\theta_{n-2} \cos\theta_{n-1}, \\
x_3 &= r \sin\theta_1 \sin\theta_2 \cdots \sin\theta_{n-3} \cos\theta_{n-2}, \\
&\ \,\vdots \\
x_{n-1} &= r \sin\theta_1 \cos\theta_2, \\
x_n &= r \cos\theta_1,
\end{aligned}$$
so that $r > 0$, $\sum_{i=1}^{n} x_i^2 = r^2$, $0 < \theta_i \leq \pi$, $i = 1, \ldots, n-2$, $0 < \theta_{n-1} \leq 2\pi$ and the Jacobian of the transformation is $J = r^{n-1} \sin^{n-2}\theta_1 \sin^{n-3}\theta_2 \cdots \sin\theta_{n-2}$, we get, for $y > 0$,
$$F_Y(y) = \int_0^{\sqrt{y}} \int_0^{\pi} \cdots \int_0^{\pi} \int_0^{2\pi} \frac{1}{(2\pi)^{n/2}}\, e^{-\frac{r^2}{2}}\, r^{n-1} \sin^{n-2}\theta_1 \sin^{n-3}\theta_2 \cdots \sin\theta_{n-2}\, d\theta_{n-1}\, d\theta_{n-2} \cdots d\theta_1\, dr.$$
Clearly evaluating the above integral may be tedious. This points towards the desirability, wherever possible, of other methods for determining the distributions of functions of random variables. We will see that other techniques are available and that, in a given situation, one technique is often more elegant than the others.
