Tutorial 3

The document contains a tutorial on Probability and Statistics from the Indian Institute of Technology Mandi, featuring various problems related to probability distributions, including binomial, Poisson, and normal distributions. It also includes questions on Markov's Inequality, Chebyshev's Theorem, and the Weak Law of Large Numbers, along with detailed solutions for each problem. The tutorial aims to enhance understanding of statistical concepts through practical applications.

Uploaded by Kapish

Indian Institute of Technology Mandi

भारतीय प्रौद्योगिकी संस्थान मंडी


MA-524 Probability and Statistics
Tutorial - 03

Questions
1. Because not all airline passengers show up for their reserved seat, an airline sells 125 tickets for a flight that holds only
120 passengers. The probability that a passenger does not show up is 0.10, and the passengers behave independently.

(a) What is the probability that every passenger who shows up can take the flight?
(b) What is the probability that the flight departs with empty seats?

2. Flaws occur at random along the length of a thin copper wire. Suppose that the number of flaws follows a Poisson
distribution with a mean of 2.3 flaws per millimeter.
(a) Determine the probability of exactly 2 flaws in 1 millimeter of wire.
(b) Determine the probability of 10 flaws in 5 millimeters of wire.
(c) Determine the probability of at least 1 flaw in 2 millimeters of wire.
3. A typesetter, on the average, makes one error in every 500 words typeset. A typical page contains 300 words. What is
the probability that there will be no more than two errors in five pages?
4. Let X denote the time between detections of a particle with a Geiger counter and assume that X has an exponential
distribution with a mean of 1.4 minutes.
(a) What is the probability that we detect a particle within 30 seconds of starting the counter?
(b) Suppose we turn on the Geiger counter and wait 3 minutes without detecting a particle. What is the probability that
a particle is detected in the next 30 seconds?
5. Assume X is normally distributed with a mean of 10 and a standard deviation of 2. Determine the following:
(a) P (X < 13)
(b) P (X > 9)
(c) P (6 < X < 14)
(d) P (2 < X < 4)
(e) P (−2 < X < 8)
6. Assume that in the detection of a digital signal the background noise follows a normal distribution with a mean of 0 volt
and standard deviation of 0.45 volt.
(a) The system assumes a digital 1 has been transmitted when the voltage exceeds 0.9. What is the probability of
detecting a digital 1 when none was sent?
(b) Determine symmetric bounds about 0 that include 99% of all noise readings.
7. Let X1, X2, . . ., Xn be a sequence of i.i.d. random variables with mean µ and variance σ². Define the standardized sample mean as

Zn = (X̄n − µ) / (σ/√n),

where X̄n = (1/n) Σ_{i=1}^{n} Xi is the sample mean. Show that

Zn →d N(0, 1) as n → ∞.
8. Statement of Markov’s Inequality:
Let X be a non-negative random variable (i.e., X ≥ 0 almost surely), and let a > 0. Then Markov’s inequality states that

P(X ≥ a) ≤ E[X]/a.

This inequality provides an upper bound on the probability that a non-negative random variable exceeds a given positive
threshold. Prove this inequality.

9. Statement of Chebyshev’s Theorem:
Let X be a random variable with finite mean µ = E[X] and finite variance σ² = Var(X). Then, for any k > 0, we have

P(|X − µ| ≥ kσ) ≤ 1/k².

That is, the probability that X deviates from its mean by at least k standard deviations is at most 1/k². Prove this theorem.

10. Let X1, X2, . . ., Xn be i.i.d. random variables with finite mean µ = E[Xi] and finite variance. Prove that for any ϵ > 0:

P(|X̄n − µ| ≥ ϵ) → 0 as n → ∞.

Solutions
1. Let X denote the number of ticketed passengers who do not show up for the flight. Then X is binomial with n = 125 and
p = 0.1.

(a) Every passenger who shows up can take the flight when at most 120 of the 125 ticket holders show up, i.e., when at least 5 do not show up:

P(X ≥ 5) = 1 − P(X ≤ 4) = 1 − Σ_{k=0}^{4} C(125, k) (0.1)^k (0.9)^{125−k} = 0.9961

(b) The flight departs with empty seats when more than 5 ticket holders fail to show up:

P(X > 5) = 1 − P(X ≤ 5) = 0.9886
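The two binomial tail sums above can be checked numerically; a minimal sketch using only the Python standard library (`math.comb` is the binomial coefficient):

```python
from math import comb

n, p = 125, 0.10  # tickets sold, per-passenger no-show probability

def binom_pmf(k: int) -> float:
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# (a) P(X >= 5) = 1 - P(X <= 4): at least 5 no-shows, so everyone who shows up fits.
p_a = 1 - sum(binom_pmf(k) for k in range(5))
# (b) P(X > 5) = 1 - P(X <= 5): more than 5 no-shows leaves empty seats.
p_b = 1 - sum(binom_pmf(k) for k in range(6))
print(f"{p_a:.4f} {p_b:.4f}")  # 0.9961 0.9886
```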

2. (a) Let X denote the number of flaws in 1 millimeter of wire. Then E(X) = 2.3 flaws and

P(X = 2) = e^{−2.3} (2.3)² / 2! = 0.265

(b) Let X denote the number of flaws in 5 millimeters of wire. Then X has a Poisson distribution with

E(X) = 5 mm × 2.3 flaws/mm = 11.5 flaws

Therefore,

P(X = 10) = e^{−11.5} (11.5)^{10} / 10! = 0.113

(c) Let X denote the number of flaws in 2 millimeters of wire. Then X has a Poisson distribution with

E(X) = 2 mm × 2.3 flaws/mm = 4.6 flaws

Therefore,

P(X ≥ 1) = 1 − P(X = 0) = 1 − e^{−4.6} = 0.9899
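All three Poisson probabilities follow from the same pmf formula; a quick check in Python (standard library only):

```python
from math import exp, factorial

def poisson_pmf(k: int, lam: float) -> float:
    """P(X = k) for X ~ Poisson(lam)."""
    return exp(-lam) * lam**k / factorial(k)

p_a = poisson_pmf(2, 2.3)          # (a) exactly 2 flaws in 1 mm
p_b = poisson_pmf(10, 5 * 2.3)     # (b) 10 flaws in 5 mm, lam = 11.5
p_c = 1 - poisson_pmf(0, 2 * 2.3)  # (c) at least 1 flaw in 2 mm, lam = 4.6
print(f"{p_a:.3f} {p_b:.3f} {p_c:.4f}")  # 0.265 0.113 0.9899
```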
3. If we assume that setting a word is a Bernoulli trial with success probability p = 1/500 (notice that we are labeling an error
as a “success”) and that the trials are independent, then X = number of errors in five pages (1500 words) is binomial(1500, 1/500). Thus

P(no more than two errors) = P(X ≤ 2) = Σ_{x=0}^{2} C(1500, x) (1/500)^x (499/500)^{1500−x} = 0.4230,

which is a fairly cumbersome calculation. If we use the Poisson approximation with λ = 1500 × (1/500) = 3, we have

P(X ≤ 2) ≈ e^{−3} (1 + 3 + 3²/2) = 0.4232.
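The closeness of the exact binomial value and its Poisson approximation can be verified directly; a short sketch:

```python
from math import comb, exp

n, p = 1500, 1 / 500  # words in five pages, per-word error probability

# Exact binomial: P(X <= 2) as a three-term sum.
exact = sum(comb(n, x) * p**x * (1 - p)**(n - x) for x in range(3))

# Poisson approximation with lam = n * p = 3.
lam = n * p
approx = exp(-lam) * (1 + lam + lam**2 / 2)

print(f"{exact:.4f} {approx:.4f}")  # 0.4230 0.4232
```

The approximation is accurate to three decimal places because n is large and p is small, which is exactly the regime where the Poisson limit of the binomial applies.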

4. (a) The probability that we detect a particle within 30 seconds of starting the counter is

P(X < 0.5 minute) = F(0.5) = 1 − e^{−0.5/1.4} = 0.30

(b) The requested probability can be expressed as the conditional probability

P(X < 3.5 | X > 3) = P(3 < X < 3.5) / P(X > 3)

where

P(3 < X < 3.5) = F(3.5) − F(3) = [1 − e^{−3.5/1.4}] − [1 − e^{−3/1.4}] = 0.035

and

P(X > 3) = 1 − F(3) = e^{−3/1.4} = 0.117

Therefore,

P(X < 3.5 | X > 3) = 0.035/0.117 = 0.30

Alternatively, we can use the lack-of-memory property of the exponential distribution:

P(X < s + t | X > s) = P(X < t)

Equivalently,

P(X < 3 + 0.5 | X > 3) = P(X < 0.5) = 0.30

After waiting for 3 minutes without a detection, the probability of a detection in the next 30 seconds is the same as
the probability of a detection in the 30 seconds immediately after starting the counter.
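The lack-of-memory property can be confirmed numerically: the conditional probability computed from the CDF agrees exactly with the unconditional one. A minimal sketch:

```python
from math import exp

mean = 1.4  # mean time between detections, in minutes

def F(x: float) -> float:
    """CDF of the exponential distribution with mean 1.4."""
    return 1 - exp(-x / mean)

p_a = F(0.5)                               # detect within 30 s of starting
p_cond = (F(3.5) - F(3.0)) / (1 - F(3.0))  # detect in next 30 s after a 3 min wait
print(f"{p_a:.2f} {p_cond:.2f}")  # 0.30 0.30 -- memorylessness
```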
5. (a) P(X < 13) = P(Z < (13 − 10)/2) = P(Z < 1.5) = 0.93319

(b) P(X > 9) = 1 − P(X < 9) = 1 − P(Z < (9 − 10)/2) = 1 − P(Z < −0.5) = 1 − [1 − P(Z < 0.5)] = P(Z < 0.5) = 0.69146

(c) P(6 < X < 14) = P((6 − 10)/2 < Z < (14 − 10)/2) = P(Z < 2) − P(Z < −2) = 0.9545

(d) P(2 < X < 4) = P((2 − 10)/2 < Z < (4 − 10)/2) = P(−4 < Z < −3) = P(Z < −3) − P(Z < −4) = 0.00132

(e) P(−2 < X < 8) = P(X < 8) − P(X < −2) = P(Z < (8 − 10)/2) − P(Z < (−2 − 10)/2) = P(Z < −1) − P(Z < −6) = 0.15866
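All five values can be reproduced without tables, since the standard normal CDF Φ(z) = (1 + erf(z/√2))/2; a sketch using only `math.erf`:

```python
from math import erf, sqrt

def phi(z: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

mu, sigma = 10, 2

def cdf(x: float) -> float:
    """CDF of X ~ N(10, 2^2), via standardization."""
    return phi((x - mu) / sigma)

print(f"{cdf(13):.5f}")           # (a) 0.93319
print(f"{1 - cdf(9):.5f}")        # (b) 0.69146
print(f"{cdf(14) - cdf(6):.4f}")  # (c) 0.9545
print(f"{cdf(4) - cdf(2):.5f}")   # (d) 0.00132
print(f"{cdf(8) - cdf(-2):.5f}")  # (e) 0.15866
```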

6. (a) Let the random variable N denote the noise voltage. The requested probability is

P(N > 0.9) = P(N/0.45 > 0.9/0.45) = P(Z > 2) = 1 − 0.97725 = 0.02275

This probability can be described as the probability of a false detection.

(b) The question requires us to find x such that P(−x < N < x) = 0.99. Now,

P(−x < N < x) = P(−x/0.45 < Z < x/0.45) = 0.99

We know that

P(−2.58 < Z < 2.58) = 0.99

Therefore,

x/0.45 = 2.58

and

x = 2.58 × 0.45 = 1.16
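Both parts can be checked with `statistics.NormalDist`, which supplies the normal CDF and its inverse (so the 99% quantile need not be read from a table):

```python
from statistics import NormalDist

noise = NormalDist(mu=0, sigma=0.45)  # background noise, volts

# (a) false-detection probability: noise alone exceeds the 0.9 V threshold
p_false = 1 - noise.cdf(0.9)
# (b) symmetric bound containing 99% of readings: x = 0.45 * z_{0.995}
x = noise.inv_cdf(0.995)
print(f"{p_false:.5f} {x:.2f}")  # 0.02275 1.16
```

Note that `inv_cdf(0.995)` uses the more precise quantile 2.576 rather than the tabulated 2.58, so x comes out as 1.159, which still rounds to 1.16.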

7. By the definition of the sample mean,

E[X̄n] = E[(1/n) Σ_{i=1}^{n} Xi] = (1/n) Σ_{i=1}^{n} E[Xi] = µ.

Similarly, by independence of the Xi, the variance of X̄n is

Var(X̄n) = Var((1/n) Σ_{i=1}^{n} Xi) = (1/n²) Σ_{i=1}^{n} Var(Xi) = σ²/n.

Thus the standardized form of X̄n is

Zn = (X̄n − µ)/(σ/√n) = Σ_{i=1}^{n} (Xi − µ) / (σ√n).

Since the Xi are i.i.d. with finite mean µ and finite variance σ², the Central Limit Theorem applies to this normalized sum, giving

Zn →d N(0, 1) as n → ∞.

(A complete proof shows that the characteristic function of Zn converges pointwise to e^{−t²/2}, the characteristic function of N(0, 1), and then invokes Lévy’s continuity theorem.)
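The convergence can be illustrated by simulation; a sketch (the choice of Uniform(0,1) summands, the sample size n = 500, and 2000 replications are illustrative assumptions) that checks whether about 95% of standardized sample means fall within ±1.96:

```python
import random
from math import sqrt

random.seed(0)

# Standardized means of Uniform(0,1) samples (mu = 1/2, sigma^2 = 1/12)
# should be approximately N(0, 1) for large n.
mu, sigma = 0.5, sqrt(1 / 12)
n, reps = 500, 2000

def z_n() -> float:
    """One realization of Zn = (Xbar_n - mu) / (sigma / sqrt(n))."""
    xbar = sum(random.random() for _ in range(n)) / n
    return (xbar - mu) / (sigma / sqrt(n))

zs = [z_n() for _ in range(reps)]
inside = sum(abs(z) < 1.96 for z in zs) / reps
print(f"fraction within +/-1.96: {inside:.3f}")  # close to 0.95
```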

8. To prove Markov’s inequality (written here for a continuous random variable X ≥ 0 with density fX), start from the expectation of X:

E[X] = ∫_0^∞ x fX(x) dx.

Splitting the integral at a, we write

E[X] = ∫_0^a x fX(x) dx + ∫_a^∞ x fX(x) dx ≥ ∫_a^∞ x fX(x) dx,

since the first integral is non-negative. Because x ≥ a throughout the second integral, we bound it as follows:

∫_a^∞ x fX(x) dx ≥ a ∫_a^∞ fX(x) dx = a P(X ≥ a).

Thus we obtain

E[X] ≥ a P(X ≥ a).

Dividing both sides by a, we arrive at Markov’s inequality:

P(X ≥ a) ≤ E[X]/a.
9. We start with Markov’s inequality, which states that for any non-negative random variable Y and any a > 0,

P(Y ≥ a) ≤ E[Y]/a.

Now, define Y = (X − µ)², which is always non-negative. Applying Markov’s inequality with a = k²σ², we obtain

P((X − µ)² ≥ k²σ²) ≤ E[(X − µ)²] / (k²σ²).

Since E[(X − µ)²] = σ², we simplify:

P((X − µ)² ≥ k²σ²) ≤ σ²/(k²σ²) = 1/k².

Since (X − µ)² ≥ k²σ² is equivalent to |X − µ| ≥ kσ, we conclude:

P(|X − µ| ≥ kσ) ≤ 1/k².
10. By Chebyshev’s inequality, we have:

P(|X̄n − µ| ≥ ϵ) ≤ Var(X̄n)/ϵ².

Since the Xi are i.i.d. with variance σ², we compute:

Var(X̄n) = σ²/n.

Thus, we obtain:

P(|X̄n − µ| ≥ ϵ) ≤ σ²/(nϵ²).

Since σ²/(nϵ²) → 0 as n → ∞, it follows that:

P(|X̄n − µ| ≥ ϵ) → 0.

This proves the Weak Law of Large Numbers.
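The shrinking deviation probability is easy to see empirically; a simulation sketch (the Uniform(0,1) summands, ϵ = 0.05, and the sample sizes are illustrative assumptions):

```python
import random

random.seed(1)

# Empirical check of the WLLN for Uniform(0,1) (mu = 0.5): the frequency of
# sample means deviating from mu by at least eps shrinks as n grows.
mu, eps, reps = 0.5, 0.05, 1000

def deviation_freq(n: int) -> float:
    """Fraction of reps sample means with |Xbar_n - mu| >= eps."""
    count = 0
    for _ in range(reps):
        xbar = sum(random.random() for _ in range(n)) / n
        count += abs(xbar - mu) >= eps
    return count / reps

freqs = [deviation_freq(n) for n in (10, 50, 200)]
print(freqs)  # roughly decreasing toward 0
```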
