STAT 333
Probability Generating Functions
Definition: Let $X$ have range $\{0, 1, 2, \ldots\} \cup \{\infty\}$ and let $p_n = P(X = n)$ for $n = 0, 1, 2, \ldots$. We define the probability generating function (pgf) of $X$ to be
$$G_X(s) = \sum_{n=0}^{\infty} p_n s^n.$$
Note that
- the pgf of $X$ is just the generating function of the sequence of probabilities $p_0, p_1, p_2, \ldots$.
- $G_X(1) = \sum_{n=0}^{\infty} p_n \le 1$. Thus the radius $R$ of convergence of $G_X(s)$ is at least 1.
- $X$ is proper if $G_X(1) = 1$, and $X$ is improper if $G_X(1) < 1$.
- if $X$ is proper then $G_X(s) = E(s^X) = \sum_{n=0}^{\infty} s^n P(X = n)$ for all $-R < s < R$.
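For example, if $P(X = 0) = \tfrac{1}{2}$ and $P(X = \infty) = \tfrac{1}{2}$, so that $p_n = 0$ for all $n \ge 1$, then $G_X(s) = \tfrac{1}{2}$ for every $s$ and $G_X(1) = \tfrac{1}{2} < 1$: the pgf only "sees" the mass on $\{0, 1, 2, \ldots\}$, so $X$ is improper.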
Examples:
1. Let $X \sim \text{binom}(n, p)$. Then
$$\begin{aligned}
G_X(s) = E(s^X) &= \sum_{k=0}^{n} s^k P(X = k) = \sum_{k=0}^{n} \binom{n}{k} s^k p^k (1-p)^{n-k} \\
&= \sum_{k=0}^{n} \binom{n}{k} (ps)^k (1-p)^{n-k} \\
&= (1 - p + ps)^n \quad \text{applying the binomial theorem}
\end{aligned}$$
Here the radius of convergence is $R = \infty$.
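As a quick check, $G_X(1) = (1 - p + p)^n = 1$, so $X$ is proper.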
2. Let $X \sim \text{geometric}(p)$. Then
$$\begin{aligned}
G_X(s) = E(s^X) &= \sum_{k=1}^{\infty} s^k (1-p)^{k-1} p = \frac{p}{1-p} \sum_{k=1}^{\infty} [s(1-p)]^k \\
&= \frac{ps}{1 - s(1-p)} \quad \text{for } |s(1-p)| < 1, \text{ i.e. } |s| < \frac{1}{1-p}
\end{aligned}$$
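Again $G_X(1) = \frac{p}{1 - (1-p)} = 1$, so $X$ is proper, and here the radius of convergence is $R = \frac{1}{1-p} > 1$.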
Applications of Probability Generating Functions:
Suppose we are able to find the functional form $G_X(s)$ by some method. Then:
1. we can determine whether or not $X$ is proper: $X$ is proper iff $G_X(1) = 1$.
2. if $X$ is proper, then $E(X) = G'_X(1)$.
proof:
$$G'_X(s) = \frac{d}{ds} \sum_{n=0}^{\infty} p_n s^n = \sum_{n=0}^{\infty} \frac{d}{ds}\, p_n s^n = \sum_{n=1}^{\infty} n p_n s^{n-1} \quad \text{for } |s| < R$$
If $R > 1$ we can directly substitute $G'_X(1) = \sum_{n=1}^{\infty} n p_n = E(X)$.
If $R = 1$ the derivative $G'_X(s)$ is not defined at $s = 1$. But note that we have
$$\lim_{s \uparrow 1} G'_X(s) = \lim_{s \uparrow 1} \sum_{n=1}^{\infty} n p_n s^{n-1} = \sum_{n=1}^{\infty} n p_n \lim_{s \uparrow 1} s^{n-1} = E(X).$$
For convenience we will also denote the above left hand side limit by $G'_X(1)$.
3. if $X$ is proper, then $G''_X(1) = E(X(X-1))$. Thus $\text{Var}(X) = G''_X(1) + G'_X(1) - [G'_X(1)]^2$.
proof:
$$G''_X(s) = \frac{d}{ds} G'_X(s) = \frac{d}{ds} \sum_{n=1}^{\infty} n p_n s^{n-1} = \sum_{n=2}^{\infty} n(n-1) p_n s^{n-2} \quad \text{for } |s| < R$$
If $R > 1$ directly substitute $s = 1$ to obtain $G''_X(1) = E(X(X-1))$. If $R = 1$ we again take limits to obtain $E(X(X-1)) = \lim_{s \uparrow 1} G''_X(s)$ and we denote the right hand side limit by $G''_X(1)$.
Then since $\text{Var}(X) = E(X^2) - E(X)^2 = E(X(X-1)) + E(X) - E(X)^2$, the result follows. (These formulas are applied to the binomial pgf in the example following this list.)
4. we can obtain the probabilities $p_n = P(X = n)$ by expanding $G_X(s)$ in a power series. This is a powerful tool for finding the probability mass function of $X$ through its pgf.
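For instance, applying 2.–4. to the binomial pgf $G_X(s) = (1 - p + ps)^n$ from the first example:
$$G'_X(s) = np(1 - p + ps)^{n-1}, \qquad G''_X(s) = n(n-1)p^2(1 - p + ps)^{n-2},$$
so $E(X) = G'_X(1) = np$ and $\text{Var}(X) = n(n-1)p^2 + np - (np)^2 = np(1-p)$, while expanding $(1 - p + ps)^n$ by the binomial theorem recovers the coefficients $p_k = \binom{n}{k} p^k (1-p)^{n-k}$.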
Multiplicative Properties of Probability Generating Functions
Suppose $X_1, X_2, \ldots, X_n$ are independent proper nonnegative-valued random variables. Then
$$G_{X_1 + X_2 + \cdots + X_n}(s) = G_{X_1}(s)\, G_{X_2}(s) \cdots G_{X_n}(s) = \prod_{k=1}^{n} G_{X_k}(s)$$
proof:
$$\begin{aligned}
G_{X_1 + X_2 + \cdots + X_n}(s) &= E\!\left(s^{X_1 + X_2 + \cdots + X_n}\right) = E\!\left(s^{X_1} s^{X_2} \cdots s^{X_n}\right) \\
&= E(s^{X_1})\, E(s^{X_2}) \cdots E(s^{X_n}) \quad \text{by independence} \\
&= G_{X_1}(s)\, G_{X_2}(s) \cdots G_{X_n}(s) \quad \text{as claimed.}
\end{aligned}$$
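For example, if $X \sim \text{binom}(n, p)$ and $Y \sim \text{binom}(m, p)$ are independent, then $G_{X+Y}(s) = (1 - p + ps)^n (1 - p + ps)^m = (1 - p + ps)^{n+m}$, which is the pgf of a binom$(n+m, p)$ random variable, so $X + Y \sim \text{binom}(n+m, p)$.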
Examples:
1. Let $I_A$ = indicator of an event $A$, and suppose $p = P(A)$. Then
$$G_{I_A}(s) = E(s^{I_A}) = s^0 (1-p) + s \cdot p = 1 - p + ps.$$
Now if $X \sim \text{binom}(n, p)$ we can write $X = I_1 + I_2 + \cdots + I_n$ where $I_1, I_2, \ldots, I_n$ are independent identically-distributed indicator variables. It follows that
$$G_X(s) = G_{I_1 + I_2 + \cdots + I_n}(s) = (G_{I_1}(s))^n = (1 - p + ps)^n$$
2. Let $X \sim \text{geometric}(p)$. We have
$$G'_X(s) = \frac{d}{ds}\, \frac{ps}{1 - s(1-p)} = \frac{[1 - s(1-p)](p) - ps[-(1-p)]}{[1 - s(1-p)]^2} = \frac{p}{[1 - s(1-p)]^2} \quad \text{for } |s| < 1/(1-p)$$
(you need not bother simplifying like this before substituting $s = 1$).
Then $E(X) = G'_X(1) = 1/p$ as we already know.
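Differentiating once more gives
$$G''_X(s) = \frac{2p(1-p)}{[1 - s(1-p)]^3}, \qquad G''_X(1) = \frac{2(1-p)}{p^2},$$
so $\text{Var}(X) = G''_X(1) + G'_X(1) - [G'_X(1)]^2 = \frac{2(1-p)}{p^2} + \frac{1}{p} - \frac{1}{p^2} = \frac{1-p}{p^2}$.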
3. Suppose we are given a function of the form
$$G(s) = \frac{s^4}{4} \cdot \frac{1}{1 - \frac{3s}{4}} \quad \text{which makes sense for } |3s/4| < 1$$
Expand in a power series to obtain the coefficients. Use this to verify that $G(s)$ is in fact a probability generating function and obtain the probability mass function of $X$. Show that $X$ is proper and find $E(X)$.
$$G(s) = \frac{s^4}{4} \cdot \frac{1}{1 - \frac{3s}{4}} = \frac{s^4}{4} \sum_{n=0}^{\infty} \left(\frac{3s}{4}\right)^n = \frac{1}{4} \sum_{n=0}^{\infty} \frac{3^n}{4^n}\, s^{n+4} = \sum_{n=4}^{\infty} \frac{3^{n-4}}{4^{n-3}}\, s^n$$
Thus $p_0 = p_1 = p_2 = p_3 = 0$, $0 \le p_n = \frac{3^{n-4}}{4^{n-3}} \le 1$ for each $n \ge 4$, and $G(s)$ is potentially a probability generating function. (Note $G(s)$ will actually be a probability generating function iff $G(1) \le 1$.) Now since $G(1) = 1$ (just plug in $s = 1$) it follows that we have a probability generating function and $X$ is proper. (Alternatively, check that $\sum_n p_n = 1$.)
Now
$$G'(s) = \frac{1}{4} \cdot \frac{(1 - 3s/4)(4s^3) - s^4(-3/4)}{(1 - 3s/4)^2}$$
Then $E(X) = G'(1) = \frac{1}{4} \cdot \frac{(1/4)(4) + 3/4}{(1/4)^2} = \frac{1}{4} \cdot \frac{7/4}{1/16} = 7$.
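As a consistency check, $p_n = \left(\frac{3}{4}\right)^{n-4} \cdot \frac{1}{4}$ for $n \ge 4$, so $X = Y + 3$ where $Y \sim \text{geometric}(1/4)$; hence $E(X) = E(Y) + 3 = 4 + 3 = 7$, agreeing with the value obtained from $G'(1)$.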
For probability generating functions, all the action happens at $s = 1$.