
Chapter 3:

Discrete Random Variables and


Probability Distributions
3 Definitions
3-1 Probability Distributions, Discrete Random Variable
3-2 Probability Distributions and Probability Mass Functions (PMF)
3-3 Cumulative Distribution Functions
3-4 Mean Value, Standard Deviation: Variance,
Expected value of a discrete random variable
Standard deviation of a discrete random variable
3-5 Discrete Uniform Distribution
Tutorial 2
3-6 Binomial Distribution
3-7 Geometric Distribution
3-9 Poisson Distribution
1
Learning Objectives

1. Discrete Random Variables


2. Probability Distributions and Probability Mass Functions
3. Cumulative Distribution Functions
4. Mean and Variance of a Discrete Random Variable
5. Discrete Uniform Distribution
6. Binomial Distribution
7. Geometric and Negative Binomial Distributions
8. Poisson Distribution

2
Definitions: Random Variables (RV)
▪ Random Variable (RV): A numeric outcome that results
from an experiment
▪ For each element of an experiment’s sample space, the
random variable can take on exactly one value
▪ Discrete Random Variable: An RV that can take on only a
finite or countably infinite set of outcomes
▪ Continuous Random Variable: An RV that can take on
any value along a continuum (but may be reported
“discretely”)
▪ Random Variables are denoted by upper case letters (𝒀)
▪ Individual outcomes for an RV are denoted by lower case
letters (𝒚)
Definitions
Random variable- A random variable associates a numerical value
with each outcome of an experiment. It is defined mathematically as a
real-valued function defined on a sample space.

Continuous random variables - variables that can assume any value in some range, with uncountably many possible values within that range
Example:
▪ Motor shaft may have diameter between 0.197m and 0.203m
▪ Steel box could have any weight between 0.345kg and 0.352 kg
▪ Electrical current, length, pressure, time, voltage, weight, etc.

Discrete random variables - variables that take only a finite (or countably infinite) set of values within a certain range.
Example:
▪ Number of transmitted bit received in error
▪ Number of scratches on a surface, proportion of defective parts among 1000

4
3-1 Probability Distributions,
Discrete Random Variable
A probability distribution consists of the values of a random
variable and their corresponding probabilities.

Example:
The probability distribution of a discrete random variable X is given by:

x        2      2.5    3      3.5    4      4.5
P(X=x)   0.07   0.36   0.21   0.19   0.10   0.07

a) Plot the distribution
b) State P(X = 3.5)
c) Calculate P(X ≥ 3.0)
d) Calculate P(X < 4.0)
e) Calculate P(X > 3.5)
f) Calculate P(X ≤ 3.9)
g) The variable X is sampled 50,000 times. How many times would you expect it to take the value 2.5?
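These answers can be checked with a short script. A minimal Python sketch (not part of the original slides) that sums the tabulated probabilities:

```python
# pmf from the table above
pmf = {2: 0.07, 2.5: 0.36, 3: 0.21, 3.5: 0.19, 4: 0.10, 4.5: 0.07}

print(sum(pmf.values()))                            # sanity check: probabilities sum to 1
print(pmf[3.5])                                     # (b) P(X = 3.5) = 0.19
print(sum(p for x, p in pmf.items() if x >= 3.0))   # (c) P(X >= 3.0) = 0.57
print(sum(p for x, p in pmf.items() if x < 4.0))    # (d) P(X < 4.0)  = 0.83
print(sum(p for x, p in pmf.items() if x > 3.5))    # (e) P(X > 3.5)  = 0.17
print(sum(p for x, p in pmf.items() if x <= 3.9))   # (f) P(X <= 3.9) = 0.83
print(50000 * pmf[2.5])                             # (g) about 18,000 samples
```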
5
Introductory Example
▪ Suppose that a day’s production of 850 manufacturing parts contains 50 parts
that do not conform to customer requirements.
▪ Two parts are selected at random, without replacement, from the batch.
▪ Let the random variable X equal the number of nonconforming parts in the
sample.

1. Determine the sample space


2. What are all the possible values of X, and what are the associated probabilities?

This case is generalized in the following slides


6
3-1 Discrete Random Variables
▪ Many physical systems can be modeled by the same or
similar random experiments and random variables.

▪ The distribution of the random variables involved in each


of these common systems can be analyzed, and the results
of the analysis can be used in different applications and
examples.

▪ So instead of using the sample space of the random


experiment we describe the distribution of a particular
random variable.

7
3-1 Probability Distributions and
Probability Mass Functions
Example: Bits in error
▪ There is a chance that a bit transmitted in a digital transmission channel
is received in error.
▪ Let X equal the number of bits in error in the next 4 bits transmitted.
▪ The possible values of X are (0,1,2,3,4).

▪ The probability distribution of X can be specified by the possible values


along with the probability of each:
P(X = 0) = 0.656
P(X = 1) = 0.292
P(X = 2) = 0.049
P(X = 3) = 0.004
P(X = 4) = 0.0001

8
3-1 Probability Distributions and
Probability Mass Functions (cont.)
[Figure: probability distribution P(x) for the number of bits in error.]


9
Example 3-5: wafer contamination
▪ X: a random variable denotes the number of semiconductor wafers
that need to be analyzed to detect a large particle of contamination.
▪ Probability that a wafer contains a large particle is 1%.
▪ Determine the probability distribution of X.
Answer:
• Let p denote a wafer on which a large particle is present
• Let a denote a wafer on which a large particle is absent
• The sample space is infinite and consists of all sequences that start with a's and end with p:
S = {p, ap, aap, aaap, aaaap, … and so forth}

Examples: P(X=1) = P(p) = 0.01


P(X=2) = P(ap) = 0.99*0.01 = 0.0099

A general formula is:


P(X = x) = P(aa…ap) = (0.99)^(x−1) (0.01)   for x = 1, 2, 3, …
(x − 1 a's followed by one p)
Describing the probabilities associated with X in terms of this formula is the simplest way to define the distribution of X in this example.
10
3-2 Probability Distributions and
Probability Mass Functions
Definition
For a discrete random variable X with possible values x₁, x₂, …, xₙ, the probability mass function is
f(xᵢ) = P(X = xᵢ),   which satisfies f(xᵢ) ≥ 0 and Σᵢ f(xᵢ) = 1.

11
Example: Flip three coins

▪ Flip three coins; let X = the number of tails,
X : S → {0, 1, 2, 3}
▪ Possible values of X:
✓ X = 0, outcome HHH
✓ X = 1, outcomes THH, HTH, HHT
✓ X = 2, outcomes TTH, THT, HTT
✓ X = 3, outcome TTT
▪ The probability distribution of X is given by the possible values along with their probabilities:
P(X=0) = 1/8
P(X=1) = 3/8
P(X=2) = 3/8
P(X=3) = 1/8
[Figure: bar chart of the probability distribution f(x) for x = 0, 1, 2, 3]
12
Example: Flip three coins

▪ Probability Mass Function


The range of the variable is {0, 1, 2, 3}
f(0) = P(X=0) = 1/8
f(1) = P(X=1) = 3/8
f(2) = P(X=2) = 3/8
f(3) = P(X=3) = 1/8

Notice! The definition of a probability function is fulfilled:


✓ 1. f(x) ≥ 0 for all x
✓ 2. Σᵢ₌₁ⁿ f(xᵢ) = 1
✓ 3. P(X = xᵢ) = f(xᵢ)
13
3-3 Cumulative Distribution Functions
▪ For the example of "Bits in error" (slide # 9), if we would like to find P(X ≤ 3), we know that it is the union of the mutually exclusive events X=0, X=1, X=2, and X=3.

▪ Hence P(X ≤ 3) is the sum (accumulation) of the probabilities of these events:

P(X ≤ 3) = P(X = 0) + P(X = 1) + P(X = 2) + P(X = 3) = 0.9999

▪ Using the cumulative probabilities is another way to represent the


probability distribution of a random variable.
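The 0.9999 above comes from accumulating the probabilities of the four events. A short sketch, assuming each bit is in error independently with probability 0.1 (the model used later in Section 3-6):

```python
from math import comb

# Binomial pmf for X = number of bits in error among n = 4 bits, p = 0.1.
p, n = 0.1, 4
pmf = {x: comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)}

# Accumulate the pmf to obtain the cumulative probability P(X <= 3).
print(sum(pmf[x] for x in range(4)))   # 0.9999
```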

14
3-3 Cumulative Distribution Functions

Definition
The cumulative distribution function (CDF) of a discrete random variable X with pmf f(x) is
F(x) = P(X ≤ x) = Σ_{xᵢ ≤ x} f(xᵢ),
which satisfies 0 ≤ F(x) ≤ 1 and is non-decreasing in x.

15
3-3 Cumulative Distribution Functions
Example:
Consider a random variable X that equals 1, 2, or 3.
We know that P(X = 1) = 1/2 and P(X = 2) = 1/3.
1) What is P(X = 3)?
P(X = 1) + P(X = 2) + P(X = 3) = 1  ⇒  P(X = 3) = 1/6
2) Draw the cumulative distribution function F(x).
F(x) = 0     for x < 1
       1/2   for 1 ≤ x < 2
       5/6   for 2 ≤ x < 3
       1     for 3 ≤ x

16
Example 3-8
▪ Suppose that a day's production of 850 manufacturing parts contains
50 parts that do not conform to customer requirements.
▪ Two parts are selected at random, without replacement, from the
batch.
▪ Let the random variable X equal the number of nonconforming parts in
the sample.
▪ What is the cumulative distribution function of X?
Answer:

17
Example 3-8
F(0) = P(X ≤ 0) = P(X = 0) = 0.886
F(1) = P(X ≤ 1) = P(X = 0) + P(X = 1) = 0.997
F(2) = P(X ≤ 2) = P(X = 0) + P(X = 1) + P(X = 2) = 1

CDF of X:
F(x) = 0       for x < 0
       0.886   for 0 ≤ x < 1
       0.997   for 1 ≤ x < 2
       1       for 2 ≤ x

The cumulative distribution function is defined for −∞ < x < ∞.
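The three probabilities can be reproduced by counting samples drawn without replacement. A minimal Python sketch (not from the slides) of that counting argument:

```python
from math import comb

# 850 parts, 50 nonconforming, 2 parts drawn without replacement.
N, K, n = 850, 50, 2
pmf = {x: comb(K, x) * comb(N - K, n - x) / comb(N, n) for x in range(n + 1)}

F = 0.0
for x in range(n + 1):
    F += pmf[x]
    print(x, round(F, 3))   # F(0) = 0.886, F(1) = 0.997, F(2) = 1.0
```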


18
Cumulative Distribution Function (CDF)
Example 2
CDF of X:
F(x) = 0     for x < 1
       0.7   for 1 ≤ x < 4
       0.9   for 4 ≤ x < 7
       1     for 7 ≤ x
1. Determine each of the following probabilities:
𝑃 𝑋≤4 =
𝑃 𝑋>5 =
𝑃 𝑋<2 =
2. Determine the probability mass function of X:

19
Cumulative Distribution Function (CDF)
Example 2
CDF of X:
F(x) = 0     for x < 1
       0.7   for 1 ≤ x < 4
       0.9   for 4 ≤ x < 7
       1     for 7 ≤ x
Determine each of the following probabilities:
P(X ≤ 4) = F(4) = 0.9
P(X > 5) = 1 − P(X ≤ 5) = 1 − F(5) = 1 − 0.9 = 0.1
P(X < 2) = P(X ≤ 1) = F(1) = 0.7
P(X < 4) = P(X ≤ 3) = F(3) = 0.7
P(2 ≤ X ≤ 5) = F(5) − F(2) = 0.9 − 0.7 = 0.2

20
Cumulative Distribution Function (CDF)
Example 2
CDF of X:
F(x) = 0     for x < 1
       0.7   for 1 ≤ x < 4
       0.9   for 4 ≤ x < 7
       1     for 7 ≤ x
Determine the probability mass function of X:
X ∈ {1, 4, 7}

f(1) = P(X = 1) = F(1) − F(0) = 0.7
f(4) = P(X = 4) = F(4) − F(1) = 0.9 − 0.7 = 0.2
f(7) = P(X = 7) = F(7) − F(4) = 1 − 0.9 = 0.1

21
Cumulative Distribution Function (CDF)
In General (for an integer-valued random variable X)

• P(X ≤ a) = F(a)

• P(X < a) = P(X ≤ a − 1) = F(a − 1)

• P(X > a) = 1 − P(X ≤ a) = 1 − F(a)

• P(X ≥ a) = 1 − P(X < a) = 1 − P(X ≤ a − 1) = 1 − F(a − 1)

• P(a ≤ X ≤ b) = f(a) + f(a + 1) + ⋯ + f(b) = F(b) − F(a − 1)
  since F(b) = f(1) + f(2) + ⋯ + f(b) and F(a − 1) = f(1) + f(2) + ⋯ + f(a − 1),
  so F(b) − F(a − 1) = f(a) + f(a + 1) + ⋯ + f(b) = P(a ≤ X ≤ b)

• P(X = a) = F(a) − F(a − 1)
  since F(a) = f(1) + f(2) + ⋯ + f(a) and F(a − 1) = f(1) + f(2) + ⋯ + f(a − 1),
  so F(a) − F(a − 1) = f(a)
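These identities are easy to check numerically against Example 2. A small sketch, using the pmf f(1) = 0.7, f(4) = 0.2, f(7) = 0.1 derived above:

```python
# pmf of Example 2: f(1) = 0.7, f(4) = 0.2, f(7) = 0.1
pmf = {1: 0.7, 4: 0.2, 7: 0.1}

def F(a):
    """CDF: F(a) = P(X <= a)."""
    return sum(p for x, p in pmf.items() if x <= a)

print(F(4))            # P(X <= 4)      ≈ 0.9
print(1 - F(5))        # P(X > 5)       ≈ 0.1
print(F(1))            # P(X < 2)       ≈ 0.7
print(F(5) - F(1))     # P(2 <= X <= 5) = F(5) - F(2 - 1) ≈ 0.2
```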
22
Probability Distributions and
Probability Mass Functions

▪ What is the meaning of Probability Distribution ?


Probability distribution of a random variable X is a
description of the probabilities associated with possible
values of X.

▪ For a discrete random variable, the distribution is often


specified by just a list of the possible values along with
the probabilities of each

▪ In some cases, it is convenient to express the probability in


terms of a formula.

23
Probability Distributions
▪ Probability Distribution: Table, Graph, or Formula that
describes values a random variable can take on, and its
corresponding probability (discrete RV) or density
(continuous RV)
▪ Discrete Probability Distribution: Assigns probabilities
(masses) to the individual outcomes
▪ Continuous Probability Distribution: Assigns density at
individual points, probability of ranges can be obtained by
integrating density function

▪ Discrete Probabilities denoted by: p(y) = P(Y=y)


▪ Continuous Densities denoted by: f(y)
▪ Cumulative Distribution Function: F(y) = P(Y≤y)
Discrete Probability Distributions
Probability (Mass) Function:
p(y) = P(Y = y)
p(y) ≥ 0 for all y
Σ_{all y} p(y) = 1

Cumulative Distribution Function (CDF):
F(y) = P(Y ≤ y)
F(b) = P(Y ≤ b) = Σ_{y ≤ b} p(y)
F(−∞) = 0,  F(+∞) = 1
F(y) is monotonically non-decreasing in y
Example – Rolling 2 Dice (Red/Green)
Y = sum of the up faces of the two dice.
Table gives value of y for all elements in S
Red\ Green 1 2 3 4 5 6

1 2 3 4 5 6 7
2 3 4 5 6 7 8
3 4 5 6 7 8 9
4 5 6 7 8 9 10
5 6 7 8 9 10 11
6 7 8 9 10 11 12
Rolling 2 Dice – Probability Mass Function &
CDF

y     p(y)    F(y)
2     1/36    1/36
3     2/36    3/36
4     3/36    6/36
5     4/36    10/36
6     5/36    15/36
7     6/36    21/36
8     5/36    26/36
9     4/36    30/36
10    3/36    33/36
11    2/36    35/36
12    1/36    36/36

p(y) = (# of ways the 2 dice can sum to y) / (# of ways the 2 dice can result) = (# of ways to sum to y) / 36
F(y) = Σ_{t=2}^{y} p(t)
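A brief sketch that rebuilds this table by enumerating the 36 equally likely outcomes (not from the slides):

```python
from collections import Counter
from fractions import Fraction

# Count how many of the 36 (red, green) outcomes give each sum y.
counts = Counter(r + g for r in range(1, 7) for g in range(1, 7))

F = Fraction(0)
for y in range(2, 13):
    p = Fraction(counts[y], 36)   # p(y)
    F += p                        # F(y) accumulates p(2) + ... + p(y)
    print(y, p, F)                # e.g. y = 7 -> p = 1/6, F = 7/12
```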
Rolling 2 Dice – Probability Mass Function
[Figure: Dice rolling probability function, p(y) plotted against y = 2, …, 12.]
Rolling 2 Dice – Cumulative Distribution Function

[Figure: Dice rolling CDF, F(y) plotted against y.]
3-4 Mean Value
If {x₁, x₂, …, xₙ} is a set of n numbers, then the mean value of these numbers is denoted by x̄:

x̄ = (1/n) Σᵢ₌₁ⁿ xᵢ

Example: Find the mean value of −1, 0, 1

If the values x₁, x₂, …, xₙ occur with frequencies f₁, f₂, …, fₙ respectively, then

x̄ = (Σᵢ₌₁ⁿ xᵢ fᵢ) / (Σᵢ₌₁ⁿ fᵢ)

Example: Find the mean value of
xᵢ   2   3   4   5   6
fᵢ   6   9   3   7   4
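A quick check of the frequency-weighted mean (a sketch, not from the slides):

```python
x = [2, 3, 4, 5, 6]
f = [6, 9, 3, 7, 4]           # frequencies

mean = sum(xi * fi for xi, fi in zip(x, f)) / sum(f)
print(mean)                   # 110/29 ≈ 3.79
```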
30
3-4 Standard Deviation: Variance
The variance is the mean of the squared deviations:

variance = var = (Σᵢ₌₁ⁿ (xᵢ − x̄)²) / n

and the standard deviation (SD) is

SD = √variance

Example: Calculate the standard deviation of −1, 0, 1

Calculating xᵢ − x̄ for i = 1, 2, …, n is tedious for large n, hence we use the formula

var = (Σᵢ₌₁ⁿ xᵢ² − n x̄²) / n

and SD = √((Σᵢ₌₁ⁿ xᵢ² − n x̄²) / n)

Example: Find the standard deviation of −1, 0, 1


31
3-4 Expected Value of a Random variable
Expected value of a discrete random variable
If a discrete random variable can take values x₁, x₂, …, xₙ with probabilities P(x₁), P(x₂), …, P(xₙ) respectively, then

μ = Σᵢ₌₁ⁿ xᵢ P(xᵢ)

Example: Consider the probability distribution
xᵢ      0     1     2     3      4
P(xᵢ)   0.1   0.2   0.4   0.15   0.15

32
3-4 Standard Deviation of Discrete RV
If a discrete random variable can take values x₁, x₂, …, xₙ with respective probabilities P(x₁), P(x₂), …, P(xₙ), then the variance of the random variable, σ², is

σ² = Σᵢ₌₁ⁿ P(xᵢ)(xᵢ − μ)²

The standard deviation σ of the discrete random variable is

σ = √(Σᵢ₌₁ⁿ P(xᵢ)(xᵢ − μ)²)

Example: A discrete random variable has the probability distribution

xᵢ      1      2      3      4     5
P(xᵢ)   0.12   0.15   0.23   0.3   0.2

Calculate (a) the expected value (b) the standard deviation
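A short sketch computing both quantities for this distribution (values as given above):

```python
from math import sqrt

x = [1, 2, 3, 4, 5]
p = [0.12, 0.15, 0.23, 0.30, 0.20]

mu  = sum(xi * pi for xi, pi in zip(x, p))                 # expected value
var = sum(pi * (xi - mu) ** 2 for xi, pi in zip(x, p))     # variance
print(mu, sqrt(var))                                       # ≈ 3.31 and ≈ 1.28
```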


33
3-4 Mean and Variance of a Discrete
Random Variable

Definition
For a discrete random variable X with probability mass function f(x):
μ = E(X) = Σₓ x f(x)
σ² = V(X) = Σₓ (x − μ)² f(x) = Σₓ x² f(x) − μ²
σ = √σ²

34
3-4 Mean and Variance of a Discrete
Random Variable

Parts (a) and (b) illustrate equal means, but Part (a) illustrates a larger
variance.

A probability distribution can be viewed as a loading with the mean


equal to the balance point.

35
3-4 Mean and Variance of a Discrete
Random Variable

The probability distribution illustrated in Parts (a) and (b) differ even
though they have equal means and equal variances.

What does the length of each line represent in the above graphs?

36
Example 3-11
The number of messages sent per hour over a computer network has
the following distribution:

Determine the mean and standard deviation of the number of


messages sent per hour.

37
3-4 Mean and Variance of a Discrete
Random Variable

Expected Value of a Function h(X) of a Discrete Random Variable

E[h(X)] = Σₓ h(x) f(x),

where h(X) is any function of the random variable X.

Example: the expected value of the function (X-)2 is the variance of the
random variable X.

38
3-4 Mean and Variance of a Discrete
Random Variable
Example: Bits in error (see slide #6)
▪ Let X equal the number of bits in error in the next 4 bits transmitted.
P(X=0) = 0.656; P(X=1) = 0.292; P(X=2) = 0.049;
P(X=3) = 0.004; P(X=4) = 0.0001

▪ What is the expected value of X2?


h(X) = X²

E[h(X)] = 0² × 0.656 + 1² × 0.292 + 2² × 0.049 + 3² × 0.004 + 4² × 0.0001 ≈ 0.52

Is E[X²] = (E[X])²?
If in doubt try it at home!!!
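A quick way to try it: compute both sides from the binomial pmf used above (a sketch, assuming n = 4 bits and p = 0.1):

```python
from math import comb

n, p = 4, 0.1
pmf = {x: comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)}

E_X  = sum(x * q for x, q in pmf.items())        # E[X]   = 0.4
E_X2 = sum(x**2 * q for x, q in pmf.items())     # E[X^2] = 0.52
print(E_X2, E_X**2)                              # 0.52 vs 0.16 -> not equal in general
```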
39
3-5 Discrete Uniform Distribution
Definition
A random variable X has a discrete uniform distribution if each of the n values in its range, say x₁, x₂, …, xₙ, has equal probability: f(xᵢ) = 1/n.

Example:

Probability mass function for a discrete uniform random variable.


40
3-5 Discrete Uniform Distribution

Example of product serial number:
Estimating the mean and variance of the first digit of a serial number, where each digit 0-9 is equally likely.

First digit (B)   Probability (C)   B×C   (B − 4.5)² (E)   C×E
0                 0.1               0     20.25            2.025
1                 0.1               0.1   12.25            1.225
2                 0.1               0.2   6.25             0.625
3                 0.1               0.3   2.25             0.225
4                 0.1               0.4   0.25             0.025
5                 0.1               0.5   0.25             0.025
6                 0.1               0.6   2.25             0.225
7                 0.1               0.7   6.25             0.625
8                 0.1               0.8   12.25            1.225
9                 0.1               0.9   20.25            2.025
Sum                                 4.5                    8.25
                                    (Mean)                 (Variance)
41
3-5 Discrete Uniform Distribution

Mean and Variance
For X uniformly distributed on the consecutive integers a, a + 1, …, b:
μ = E(X) = (b + a)/2
σ² = V(X) = ((b − a + 1)² − 1)/12

42
3-5 Discrete Uniform Distribution
Example of work sampling:
• A voice communication system for a business contains 48 external lines.
• At a particular time, the system is observed and some of the lines are in use.
• X denotes the number of lines in use (0 ≤ X ≤ 48).
• Assume X is uniformly distributed over the range 0-48.
Solution:
E[X] = (48 + 0)/2 = 24
σ = √(((48 − 0 + 1)² − 1)/12) = 14.14  (standard deviation)

Example:
Let Y denote the proportion of the 48 lines that are in use at a particular time, so Y = X/48. What is E[Y]?
Solution:
E[Y] = E[X]/48 = 24/48 = 0.5 = 50%
V(Y) = V(X)/48² = 0.087
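A brief sketch of the same arithmetic (not from the slides):

```python
from math import sqrt

a, b = 0, 48                            # X uniform on 0, 1, ..., 48
mean_X = (a + b) / 2
var_X  = ((b - a + 1) ** 2 - 1) / 12

print(mean_X, sqrt(var_X))              # 24.0 and ≈ 14.14
print(mean_X / 48, var_X / 48 ** 2)     # E[Y] = 0.5, V(Y) ≈ 0.087
```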
43
End of Discrete Uniform Distribution

44
Any example of a random experiment with (only) two possible outcomes?
Examples:
• Tossing a coin
• The gender of an expected baby
• The result of a basketball match
• Whether a produced part is good or defective
• Etc.
Two options: success, with probability p, or failure, with probability 1 − p.
Bernoulli Distribution
45
Bernoulli Distribution

Jacob Bernoulli (1655-1705)


A random experiment with two possible outcomes:
─ success with probability p
─ failure with probability (1 − p)
is called a Bernoulli trial, and the distribution of its outcome is the Bernoulli distribution.

What if a Bernoulli trial is repeated n times?


3-6 Binomial Distribution
Random experiments and random variables
1. Flip a coin 10 times. Let X = number of heads obtained.
2. A worn machine tool produces 1% defective parts. Let X =
number of defective parts in the next 25 parts produced.
3. Each sample of air has a 10% chance of containing a particular
rare molecule. Let X = the number of air samples that contain the
rare molecule in the next 18 samples analyzed.
4. Of all bits transmitted through a digital transmission channel,
10% are received in error. Let X = number of bits in error in the
next five bits transmitted.

47
3-6 Binomial Distribution (cont.)
Random experiments and random variables
5. A multiple choice test contains 10 questions. Each with 4
choices, and you guess at each question. Let X = number of
questions answered correctly.
6. In the next 20 births at a hospital, let X= the number of female
births.
7. Of all patients suffering a particular illness, 35% experience
improvement from a particular medication. In the next 100
patients administered the medication, Let X = the number of
patients who experience improvement.

48
3-6 Binomial Distribution (cont.)

How many successes and how many failures do we have within the n trials?

Given n Bernoulli trials:
• The number of trials is known: n.
• The number of successes (or failures) is unknown: it is a random variable.

[Diagram: a sequence of n independent trials, each a success with probability P or a failure with probability 1 − P]
49


3-6 Binomial Distribution (cont.)
A binomial distribution is obtained from a probability experiment called a
binomial experiment. The experiment must satisfy these conditions:
1. Each trial can have only two outcomes or outcomes that can be reduced
to two outcomes. The outcomes are usually considered as a success or a
failure.
2. There is a fixed number of trials.
3. The outcomes of each trial are independent of each other.
4. The probability of a success must remain the same for each trial.

In a single trial a particular result may or may not be obtained. For example, in an examination a student may pass or fail; when testing a component, it may work or not work. The important point is that the two outcomes are complementary (mutually exclusive).

Example: A machine makes components. The probability that a


component is acceptable is 0.9. If three components are sampled
what is the probability that exactly two are acceptable !?
50
3-6 Binomial Distribution (cont.)

Definition

51
3-6 Binomial Distribution (cont.)
The following formula gives the probability of exactly k successes in n independent trials of a binomial experiment:

P(X = k) = C(n, k) pᵏ (1 − p)ⁿ⁻ᵏ ;   k = 0, 1, ⋯, n

where:
n: the total number of trials
k: the number of successes (0, 1, 2, ⋯, n)
p: the probability of a success

The formula has three parts:
➢ C(n, k) counts the number of ways the k successes can occur among the n trials
➢ pᵏ is the probability of getting k successes
➢ (1 − p)ⁿ⁻ᵏ is the probability of getting n − k failures.
52
3-6 Binomial Distribution (cont.)
Example: A coin is tossed 3 times. Find the probability of getting
two heads and a tail in any given order.
Answer:
P(H) = P(T) = 1/2 = p
P(X = 2) = P(HHT) + P(HTH) + P(THH)
         = 3 × P(H) × P(H) × P(T)
P(X = 2) = C(3, 2) × p² × (1 − p)^(3−2)

Example: The probability that a component is acceptable is 0.93.


Ten components are picked at random. What is the probability that
(a) at least nine are acceptable
(b) At most three are acceptable
Answer:
a) P(9 acceptable or 10 acceptable) = P(X = 9) + P(X = 10)
b) P(X = 0) + P(X = 1) + P(X = 2) + P(X = 3)
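Evaluating these sums numerically, as a quick sketch (n = 10, p = 0.93):

```python
from math import comb

n, p = 10, 0.93
binom = lambda k: comb(n, k) * p**k * (1 - p)**(n - k)   # P(X = k)

print(sum(binom(k) for k in (9, 10)))     # (a) P(X >= 9) ≈ 0.85
print(sum(binom(k) for k in range(4)))    # (b) P(X <= 3) ≈ 8e-07
```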
53
3-6 Binomial Distribution (cont.)
Example:
▪ The chance that a bit transmitted through a digital transmission
channel is received in error is 0.1.
▪ Assume that the transmission trials are independent.
▪ Let X = the number of bits in error in the next 4 bits transmitted.

➢ Determine P(X = 2) ?

Answer:
▪ Let E denote a bit received in error, and O denote a bit received correctly.
▪ The outcomes can be represented as follows (next slide):

54
3-6 Binomial Distribution (cont.)
▪ Let E denote a bit received in error, and O denote a bit received correctly.
▪ The outcomes can be represented as follows:

55
3-6 Binomial Distribution (cont.)
▪ Let E denote a bit received in error, and O denote a bit received correctly.
▪ The outcomes can be represented as follows:

✓ Event X = 2 consists of 6 outcomes:


{EEOO, EOEO, EOOE, OEEO, OEOE, OOEE}
✓ Under the assumption that the trials are independent:
P(EEOO) = P(E)P(E)P(O)P(O) = 0.1² × 0.9² = 0.0081
✓ Also, each of the 6 mutually exclusive outcomes for which X = 2 has the same probability, so:
P(X = 2) = C(4, 2) × p² × (1 − p)^(4−2) = 6 × 0.1² × 0.9² = 0.0486
56
3-6 Binomial Distribution (cont.)

Binomial distributions for selected values of n and p.


✓ For a fixed n, the distribution becomes more symmetric as p increases
from 0 to 0.5 or decreases from 1 to 0.5.
✓ For a fixed p, the distribution becomes more symmetric as n increases.

57
3-6 Binomial Distribution (cont.)
Example:
Each sample of water has a 10% chance of containing a particular organic pollutant. Assume that the samples are independent with regard to the presence of the pollutant. Find the probability that, in the next 18 samples, exactly 2 contain the pollutant.
Solution:
➢ Let X = the number of samples that contain the pollutant in the next 18 samples analyzed.
➢ Then, X is a binomial random variable with p = 0.1 and n = 18.
➢ Number of ways the success (2 samples containing the pollutant) can occur:
C(18, 2) = 18! / (2!(18 − 2)!) = 153
➢ Finally:
P(X = 2) = C(18, 2) (0.1)² (0.9)¹⁶ = 0.284
58
3-6 Binomial Distribution (cont.)
Example (cont.):
Determine the probability that at least 4 samples
contain the pollutant.
Answer:
P(X ≥ 4) = Σ_{x=4}^{18} C(18, x) (0.1)ˣ (0.9)^(18−x)

However, it is easier to use the complementary event:

P(X ≥ 4) = 1 − P(X < 4) = 1 − Σ_{x=0}^{3} C(18, x) (0.1)ˣ (0.9)^(18−x)
         = 1 − (0.150 + 0.300 + 0.284 + 0.168) = 0.098
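The same complement can be computed directly, as a quick sketch (n = 18, p = 0.1):

```python
from math import comb

n, p = 18, 0.1
binom = lambda k: comb(n, k) * p**k * (1 - p)**(n - k)   # P(X = k)

print(binom(2))                               # P(X = 2)  ≈ 0.284
print(1 - sum(binom(k) for k in range(4)))    # P(X >= 4) ≈ 0.098
```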
59
3-6 Binomial Distribution
Example (cont.):
Determine the probability that 3 ≤ 𝑋 < 7.

Answer:

60
3-6 Binomial Distribution (cont.)

Mean and Variance
For a binomial random variable X with parameters n and p:
μ = E(X) = np   and   σ² = V(X) = np(1 − p)

Example: For the number of transmitted bits


received in error, determine the mean and variance.
n = 4, p = 0.1
Answer:
So 𝐸[𝑋] = 4(0.1) = 0.4
𝑉[𝑋] = 4(0.1)(0.9) = 0.36

61
End Binomial Distribution
3-7 Geometric Distribution

• The number of successes is known: 1.
• The number of trials is unknown: it is the random variable.

How many trials (including the success) are needed to get a success for the first time?

[Diagram: a run of failures, each with probability 1 − P, followed by the first success with probability P]
63
3-7 Geometric Distributions (cont.)

Definition
In a series of independent Bernoulli trials, each with success probability p, let X denote the number of trials until the first success. Then X is a geometric random variable with
f(x) = P(X = x) = (1 − p)^(x−1) p ;   x = 1, 2, 3, …

64
3-7 Geometric Distributions (cont.)
Example:
The Probability that a bit transmitted through a digital transmission
channel is received in error is 0.1. Assume the transmissions are
independent events, and let the random variable X denote the number of
bits transmitted until the first error. What is the probability that the 5th bit
is in error?
Answer:

P(X = 5) = P(OOOOE) = (1 − p)^(5−1) p = 0.9⁴ × 0.1 = 0.066

65
3-7 Geometric Distributions (cont.)

The probabilities decrease as a geometric series; that is why the distribution is called geometric.

Geometric distributions for selected values of the parameter p.


66
3-7 Geometric Distributions (cont.)
Example:
▪ X: a random variable denotes the number of semiconductor wafers
that need to be analyzed to detect a large particle of contamination
▪ Probability that a wafer contains a large particle is 1%.
➢ What is the probability that exactly 125 wafers will be analyzed
before a large particle is detected?

Answer:
▪ Let X denote the number of samples analyzed until a large particle
is detected.
▪ Then X is a geometric random variable with p = 0.01.
▪ The requested probability is:
P(X = 125) = (0.99)¹²⁴ (0.01) ≈ 0.0029

67
3-7 Geometric Distributions
Definition

68
3-7 Geometric Distribution (cont.)
Example-2:
▪ X: a random variable that denotes the number of times a die needs to be thrown to get face number 2 on top.
▪ What is the probability that exactly 3 trials will be needed to get face number 2 on top for the first time?
Answer:
P(X = 3) = f(3) = (5/6)² × (1/6) = 25/216

➢ What is the probability of getting face number 2 for the first time on the 5th trial, knowing that the first two trials are unsuccessful?
Answer:
P(X = 5 | X > 2) = P(X = 5 ∩ X > 2) / P(X > 2) = P(X = 5) / P(X > 2)

P(X = 5 | X > 2) = P(X = 5) / (1 − P(X ≤ 2)) = P(X = 5) / (1 − P(X = 1) − P(X = 2)) = 25/216

→ P(X = 3) = P(X = 5 | X > 2)
69
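A small sketch verifying this lack-of-memory equality with exact fractions (p = 1/6):

```python
from fractions import Fraction

p = Fraction(1, 6)
geom = lambda x: (1 - p) ** (x - 1) * p        # P(X = x)

P_gt_2 = 1 - geom(1) - geom(2)                 # P(X > 2)
print(geom(3))                                 # 25/216
print(geom(5) / P_gt_2)                        # also 25/216: P(X = 5 | X > 2) = P(X = 3)
```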
3-7 Geometric Distribution

Lack of Memory Property

P(X = 106 | X > 100) = P(X = 6): if the first 100 bits are transmitted without an error, the number of additional bits until the first error has the same distribution as X.

70
3-7 Geometric Distribution

Lack of Memory Property

71
Name Probability Distribution Mean Variance

Discrete
Uniform: f(x) = 1/n for each of the n equally likely values a ≤ x ≤ b (a ≤ b);  mean = (b + a)/2;  variance = ((b − a + 1)² − 1)/12

Binomial: f(x) = C(n, x) pˣ (1 − p)ⁿ⁻ˣ, x = 0, 1, …, n, 0 ≤ p ≤ 1;  mean = np;  variance = np(1 − p)

Geometric: f(x) = (1 − p)^(x−1) p, x = 1, 2, …, 0 < p ≤ 1;  mean = 1/p;  variance = (1 − p)/p²

Poisson: f(x) = e^(−λ) λˣ / x!, x = 0, 1, 2, …, 0 < λ;  mean = λ;  variance = λ

Continuous

Uniform: f(x) = 1/(b − a), a ≤ x ≤ b;  mean = (b + a)/2;  variance = (b − a)²/12

Normal: f(x) = (1/(σ√(2π))) e^(−(1/2)((x−μ)/σ)²), −∞ < x < +∞, −∞ < μ < +∞, 0 < σ;  mean = μ;  variance = σ²

Exponential: f(x) = λ e^(−λx), 0 ≤ x, 0 < λ;  mean = 1/λ;  variance = 1/λ²
72
Poisson Distribution

73
3-8 Poisson Distribution
This distribution is used when the random variable counts occurrences over an interval of time, a volume, an area, etc.
➢ Poisson distribution models the number of occurrences of an event in a
given interval.
For example: used to describe
✓ the arrivals of airplanes at an airport,
✓ the number of phone calls per hour for a 911 operator,
✓ the density of a certain species of plants over a geographic region,
✓ the number of white blood cells on a fixed area.
▪ The number of independent occurrences of an event in a given
time period is a discrete random variable X with values 0, 1, 2,
… Let μ be the expected value of X (𝜇 = 𝐸[𝑋]).
➢ The probability that X = r is
P(X = r) = e^(−λ) λʳ / r! ;   r = 0, 1, 2, ⋯
▪ The expected value μ and the variance σ² of the Poisson distribution are both equal to λ.
74
Poisson Distribution (cont.)

Definition

75
Poisson Distribution (cont.)

Example: Records show that on average three emergency calls per


day are received by a service engineer. What is the probability that on
a particular day (a) three (b) two (c) four calls will be received.

Answer:
On average three emergency calls per day, so λ = 3.
a) P(X = 3) = e^(−λ) λʳ / r! = e^(−3) 3³ / 3!
b) P(X = 2) = e^(−3) 3² / 2!
c) P(X = 4) = e^(−3) 3⁴ / 4!

76
Poisson Distribution (cont.)
Example:
▪ Consider the transmission of n bits over a digital
communication channel.
▪ Let the RV equal the number of bits in error.
▪ When the probability that a bit is in error is constant and the
transmissions are independent, X has a binomial distribution.
▪ Let p denote the probability that a bit is in error.
▪ Let λ = pn. Then E(X) = pn = λ.

MEAN and VARIANCE

77
Poisson Distribution (cont.)
Example: optical storage contamination
▪ Contamination is a problem in the manufacture of optical storage disks.
▪ The number of particles of contamination that occur on an optical disk
has a Poisson distribution, and the average number of particles per
centimeter squared of media surface is 0.1.
▪ The area of a disk under study is 100 squared centimeters.
▪ Find the probability that 12 particles occur in the area of a disk under
study.

Answer:
▪ Let X denote the number of particles in the area of a disk under study.
▪ Because the mean number of particles is 0.1 particles per cm², the mean over the 100 cm² disk is
E(X) = 100 cm² × 0.1 particles/cm² = 10 particles
▪ Therefore
P(X = 12) = e^(−10) 10¹² / 12! ≈ 0.095

78
Poisson Distribution (cont.)
Example: optical storage contamination
▪ The probability that zero particles occur in the area of the disk is:
P(X = 0) = e^(−10) ≈ 4.54 × 10⁻⁵

▪ Determine the probability that 12 or fewer particles occur in the area of the disk under study.

Answer:
▪ The probability is:
P(X ≤ 12) = Σ_{x=0}^{12} e^(−10) 10ˣ / x!

▪ You need a computer to evaluate it!


79
Poisson Distribution (cont.)

80
Poisson Distribution (cont.)
Example: The copper wire
• Suppose that the number of flaws follows a Poisson distribution with a mean of 2.3 flaws per millimeter.
• Determine the probability of exactly 2 flaws in 1 millimeter of wire.
Answer:
P(X = 2) = e^(−2.3) (2.3)² / 2! = 0.265

• Determine the probability of 10 flaws in 5 millimeters of wire.
Answer:
E(X) = 5 mm × 2.3 flaws/mm = 11.5 flaws
P(X = 10) = e^(−11.5) (11.5)¹⁰ / 10! = 0.113

• Determine the probability of at least 1 flaw in 2 millimeters of wire.
Answer:
E(X) = 2 mm × 2.3 flaws/mm = 4.6 flaws
P(X ≥ 1) = 1 − P(X = 0) = 1 − e^(−4.6) = 0.9899

81
Poisson approximation to the Binomial
Distribution
▪ When n is large and p is small, such that np is moderate, the binomial distribution can be approximated by the Poisson distribution with λ = np:

Example: A workforce comprises of 250 people. The


probability a person is absent on any one day is 0.02. Find the
probability that on a day (a) three (b) seven people are absent.

Answer:
n = 250 people, p = 0.02, q = 0.98
Binomial distribution: P(X = k) = C(n, k) pᵏ (1 − p)ⁿ⁻ᵏ ;  k = 0, 1, ⋯, n
Poisson: P(X = k) = e^(−λ) λᵏ / k!
μ = λ = np = 250 × 0.02 = 5
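A short sketch comparing the exact binomial answers with the Poisson approximation (λ = 5):

```python
from math import comb, exp, factorial

n, p = 250, 0.02
lam = n * p                                                # 5

binom   = lambda k: comb(n, k) * p**k * (1 - p)**(n - k)   # exact
poisson = lambda k: exp(-lam) * lam**k / factorial(k)      # approximation

for k in (3, 7):
    print(k, binom(k), poisson(k))   # k=3: ≈0.140 vs ≈0.140; k=7: ≈0.105 vs ≈0.104
```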

82
Tutorial
✓ The probability of the random variable y is given as
y -3 -2 -1 0 1 2 3
P(y) 0.63 0.2 0.09 0.04 0.02 0.01 0.01

(a) Calculate P(y ≥ 0)  (b) P(y ≤ 1)  (c) P(|y| ≤ 1)  (d) P(y² > 3)  (e) P(y² ≤ 6)

✓ In communication network, packets of information travel along lines. The number


of lines used by each packet varies according to the following table. Calculate the
mean number of lines used per packet.
Number of lines used    1    2    3    4   5
Number of packets       17   54   32   6   1

✓ Calculate the mean and standard deviation of −1; 2;−3; 4;−5; 6

✓ A set of measurements {x₁, x₂, ⋯, xₙ} has a mean of x̄ and a standard deviation of s. What are the mean and standard deviation of the set {kx₁, kx₂, ⋯, kxₙ}, where k is a constant?

83
