Extinction Probability in Queues

This document discusses extinction probabilities for queues and martingales. It begins by defining busy periods in queues and noting they must terminate eventually. The extinction probability π0, the probability the queue becomes empty, is analyzed for different traffic intensities. Martingales are introduced as stochastic processes where the expected future value given the present and past is equal to the present value. Examples are provided of calculating extinction probabilities for various queue models like M/M/1 and M/D/1 queues. Martingales are related to Markov chains where the transition probabilities satisfy a martingale property.


20. Extinction Probability for Queues and Martingales

(Refer to Section 15-6 in the text (Branching Processes) for a discussion of the extinction probability.)
20.1 Extinction Probability for Queues:

A customer arriving at an empty server immediately goes into service, initiating a busy period. During that service period other customers may arrive, and if so they wait for service. The server remains busy until the last waiting customer completes service, which marks the end of the busy period. An interesting question is whether busy periods are bound to terminate at some point.

PILLAI
Do busy periods continue forever, or do such queues come to an end sooner or later? If so, how?

Steady state solutions exist if the traffic rate ρ < 1. Thus

    p_k = lim_{n→∞} P{X(nT) = k}   exists if ρ < 1.

What if too many customers rush in, and/or the service rate is slow (ρ > 1)? How can such queues be characterized?

Slow Traffic (ρ ≤ 1): Steady state solutions exist, and the probability of extinction equals 1. (Busy periods are bound to terminate with probability 1; this follows from Section 15-6, Theorem 15-9.)

Heavy Traffic (ρ > 1): Steady state solutions do not exist, and such queues can be characterized by their probability of extinction.
Extinction Probability for Population Models

[Fig 20.1: A realization of a branching process, starting from X₀ = 1, with X₁ = 3, X₂ = 9 and X₃ = 26; Y_i^(n) denotes the number of offspring of the i-th member of the n-th generation.]
Queues and Population Models

Population models:
X_n : size of the n-th generation;
Y_i^(n) : number of offspring of the i-th member of the n-th generation.

From Eq. (15-287), Text,

    X_{n+1} = Σ_{i=1}^{X_n} Y_i^(n).                                  (20-1)

Let

    a_k = P{Y_i^(n) = k}

denote the offspring distribution, with offspring moment generating function

    P(z) = Σ_{k=0}^∞ a_k z^k.

[Fig 20.2: The offspring moment generating function P(z), a convex increasing curve on [0, 1] with P(0) = a₀ and P(1) = 1.]
Let P_n(z) = Σ_k P{X_n = k} z^k = E{z^{X_n}}. What is P{X_n = k}, and what is the extinction probability

    π₀ = lim_{n→∞} P{X_n = 0} ?

We have

    P_{n+1}(z) = Σ_k P{X_{n+1} = k} z^k = E{z^{X_{n+1}}}
               = E{ E{ z^{Y_1^(n) + ⋯ + Y_{X_n}^(n)} | X_n = j } }
               = E{ [P(z)]^{X_n} } = Σ_j [P(z)]^j P{X_n = j} = P_n(P(z)),   (20-2)

so that

    P_{n+1}(z) = P_n(P(z)) = P(P_n(z)).                               (20-3)

The extinction probability π₀ satisfies the equation

    P(z) = z,                                                         (20-4)

which can be solved iteratively as follows:

    z_k = P(z_{k-1}),  k = 1, 2, …,  with  z₀ = P(0) = a₀.            (20-5)
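The iteration (20-5) is straightforward to carry out numerically. A minimal sketch in Python (the geometric offspring law used below is an illustrative assumption, not from the slides):

```python
def extinction_probability(P, tol=1e-12, max_iter=100_000):
    """Solve P(z) = z by the iteration z_k = P(z_{k-1}), z_0 = P(0) = a_0
    (Eq. (20-5)); converges to the smallest non-negative root of P(z) = z."""
    z = P(0.0)                 # z_0 = a_0
    for _ in range(max_iter):
        z_next = P(z)
        if abs(z_next - z) < tol:
            break
        z = z_next
    return z

# Illustration: geometric offspring a_k = (1 - p)p^k, so P(z) = (1-p)/(1 - p*z).
# For p = 2/3 the mean offspring count is p/(1-p) = 2 > 1, so pi_0 < 1; here
# P(z) = z reduces to (2z - 1)(z - 1) = 0, giving pi_0 = 1/2.
pi0 = extinction_probability(lambda z: (1 - 2/3) / (1 - (2/3) * z))
print(pi0)   # ~ 0.5
```

The same routine applies verbatim to the queueing examples later in the section, with P(z) taken as the inter-departure generating function.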

Let

    E{Y_i} = P′(1) = Σ_{k=0}^∞ k a_k > 0

denote the mean number of offspring. Review Theorem 15-9 (Text):

    P′(1) ≤ 1  ⟹  π₀ = 1;                                             (20-6)

    P′(1) > 1  ⟹  π₀ is the unique solution of P(z) = z with a₀ < π₀ < 1.   (20-7)

Left to themselves, in the long run, populations either die out completely with probability π₀, or explode with probability 1 − π₀ (both unpleasant conclusions).

[Fig 20.3: The roots of P(z) = z: (a) P′(1) ≤ 1, the only root in [0, 1] is z = 1; (b) P′(1) > 1, a second root π₀ < 1 appears.]
Queues:

For queues, let

    a_k = P{Y_i = k} = P{k arrivals between two successive departures} ≥ 0.

Note that the statistics {a_k} depend both on the arrival and on the service phenomena. As in (20-1),

    X_{n+1} = Σ_{i=1}^{X_n} Y_i.

[Fig 20.4: A busy period. The first customer arrives at t₀ and is served during s₁; X₁ counts the customers that arrive during the first service time, X₂ those arriving during their service times, and so on; the busy period ends when no waiting customers remain.]
P(z) = Σ_{k=0}^∞ a_k z^k : inter-departure statistics generated by arrivals.

ρ = P′(1) = Σ_{k=1}^∞ k a_k : traffic intensity. ρ < 1: steady state; ρ > 1: heavy traffic.

Termination of busy periods corresponds to extinction of queues. From the analogy with population models, the extinction probability π₀ is the unique root of the equation

    P(z) = z.

Slow Traffic (ρ ≤ 1):  π₀ = 1.
Heavy Traffic (ρ > 1): 0 < π₀ < 1.

That is, unstable queues either terminate their busy periods with probability π₀, or continue to be busy with probability 1 − π₀. Interestingly, there is a finite probability of busy-period termination even for unstable queues (ρ > 1).

π₀ : a measure of stability for unstable queues.
Example 20.1: M/M/1 queue. From (15-221), Text, we have

    a_k = (λ/(λ+μ))^k (μ/(λ+μ)),  k = 0, 1, 2, …,                     (20-8)

i.e., the number of arrivals between any two successive departures follows a geometric random variable. Hence

    P(z) = (μ/(λ+μ)) · 1/(1 − λz/(λ+μ)) = 1/(1 + ρ(1 − z)),  ρ = λ/μ,   (20-9)

and P(z) = z gives

    ρz² − (1 + ρ)z + 1 = (z − 1)(ρz − 1) = 0,

so that

    π₀ = 1,     ρ ≤ 1;
    π₀ = 1/ρ,   ρ > 1.                                                (20-10)

[Fig 20.5: Poisson arrivals P(λ) and exponential departures P(μ) along the time axis for the M/M/1 queue.]
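A quick numerical cross-check of (20-10), iterating (20-5) directly on the M/M/1 generating function (a sketch; the ρ values are chosen arbitrarily):

```python
def mm1_extinction(rho, tol=1e-12, max_iter=100_000):
    """Busy-period termination probability of an M/M/1 queue: iterate
    z <- P(z) with P(z) = 1/(1 + rho*(1 - z)) (Eq. (20-9)).
    The limit is the smallest root of P(z) = z, i.e. min(1, 1/rho)."""
    z = 1.0 / (1.0 + rho)              # z_0 = P(0) = a_0
    for _ in range(max_iter):
        z_next = 1.0 / (1.0 + rho * (1.0 - z))
        if abs(z_next - z) < tol:
            break
        z = z_next
    return z

for rho in (0.5, 2.0, 4.0):
    print(rho, mm1_extinction(rho))    # approximately 1, 0.5, 0.25, as in (20-10)
```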
Example 20.2: Bulk Arrivals, M^[x]/M/1 queue

Compound Poisson arrivals: departures are exponential random variables, as in a Poisson process with parameter μ. Similarly, arrivals are Poisson with parameter λ; however, each arrival can contain multiple jobs. Let A_i denote the number of items arriving at arrival instant t_i, with

    P{A_i = k} = c_k,  k = 0, 1, 2, …,

and let

    C(z) = E{z^{A_i}} = Σ_{k=0}^∞ c_k z^k

represent the bulk arrival statistics.

[Fig 20.6: Bulk arrivals A₁, A₂, … at Poisson instants t₁, t₂, … (rate λ), with exponential departures (rate μ).]
Inter-departure Statistics of Arrivals

With Y the number of arrivals between two successive departures and s the (exponential) service time,

    P(z) = Σ_{k=0}^∞ P{Y = k} z^k = E{z^Y}
         = ∫₀^∞ E{ z^{A₁ + A₂ + ⋯ + A_n} | n arrivals in (0, t) } P{n arrivals in (0, t)} f_s(t) dt
         = ∫₀^∞ Σ_{n=0}^∞ [C(z)]^n e^{−λt} ((λt)^n / n!) μ e^{−μt} dt
         = ∫₀^∞ μ e^{−(μ + λ − λC(z)) t} dt
         = 1 / (1 + ρ[1 − C(z)]).                                     (20-11)

Let c_k = (1 − α) α^k, k = 0, 1, 2, … (geometric batch sizes). Then

    C(z) = (1 − α)/(1 − αz),   P(z) = (1 − αz)/(1 − αz + ρα(1 − z)),  (20-12)

with traffic rate P′(1) = ρα/(1 − α), and P(z) = z reduces to

    (z − 1)[α(1 + ρ)z − 1] = 0.
Bulk Arrivals (contd.)

For compound Poisson arrivals with geometric batch sizes, the extinction probability is therefore

    π₀ = 1/(α(1 + ρ)),   ρ > (1 − α)/α.                               (20-13)

For α = 2/3 we obtain

    π₀ = 3/(2(1 + ρ)),   ρ > 1/2.                                     (20-14)

Doubly Poisson arrivals (Poisson-distributed batch sizes, with mean m, say) give C(z) = e^{−m(1−z)}, so that π₀ is the solution of

    P(z) = 1/(1 + ρ[1 − C(z)]) = z.                                   (20-15)

[Fig 20.7: π₀ versus ρ for (a) the M/M/1 queue (π₀ = 1/ρ) and (b) the bulk-arrival M^[x]/M/1 queue (π₀ = 1/(α(1 + ρ))), both decreasing from 1 as ρ grows past the stability boundary.]
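The root 1/(α(1 + ρ)) in (20-13) can be confirmed by iterating on (20-11)-(20-12); a sketch:

```python
def bulk_mm1_extinction(rho, alpha, tol=1e-12, max_iter=100_000):
    """Extinction probability for M^[x]/M/1 with geometric batch sizes:
    C(z) = (1 - alpha)/(1 - alpha*z), P(z) = 1/(1 + rho*(1 - C(z)))
    (Eqs. (20-11)-(20-12)); iterate z <- P(z) from z = P(0)."""
    def P(z):
        C = (1 - alpha) / (1 - alpha * z)
        return 1.0 / (1.0 + rho * (1.0 - C))
    z = P(0.0)
    for _ in range(max_iter):
        z_next = P(z)
        if abs(z_next - z) < tol:
            break
        z = z_next
    return z

# For rho = 2, alpha = 2/3 the effective traffic rate is
# rho*alpha/(1 - alpha) = 4 > 1, and (20-13) predicts 1/(alpha*(1+rho)) = 1/2.
print(bulk_mm1_extinction(2.0, 2/3))   # ~ 0.5
```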
Example 20.3: M/E_n/1 queue (n-phase exponential service). From (16-213), Text,

    P(z) = [1 + (ρ/n)(1 − z)]^{−n},                                   (20-16)

and π₀ is the smallest root of P(z) = z. For n = 1 this is the M/M/1 queue, with π₀ = 1/ρ; for n = 2 (M/E₂/1),

    π₀ = (ρ + 4 − √(ρ² + 8ρ)) / (2ρ),   ρ > 1.

Thus at ρ = 2: π₀ = 0.5 for M/M/1, while π₀ = (3 − √5)/2 ≈ 0.38 for M/E₂/1.

Example 20.4: M/D/1 queue. Letting n → ∞ in (20-16) (constant service time), and using

    (1 + x/n)^n ≈ e^x,   n ≫ 1,                                       (20-17)

we obtain

    P(z) = e^{−ρ(1−z)},                                               (20-18)

so that π₀ is the solution of

    π₀ = e^{−ρ(1−π₀)},   ρ > 1.

PILLAI
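Both examples reduce to one-line fixed-point iterations; the sketch below reproduces π₀ = 0.5 (M/M/1), π₀ ≈ 0.38 (M/E₂/1), and the M/D/1 root of π₀ = e^{−ρ(1−π₀)} for ρ = 2:

```python
import math

def men1_extinction(rho, n, tol=1e-12, max_iter=100_000):
    """M/E_n/1: iterate z <- P(z), P(z) = [1 + (rho/n)(1 - z)]^{-n} (20-16)."""
    P = lambda z: (1.0 + (rho / n) * (1.0 - z)) ** (-n)
    z = P(0.0)
    for _ in range(max_iter):
        z_next = P(z)
        if abs(z_next - z) < tol:
            break
        z = z_next
    return z

def md1_extinction(rho, tol=1e-12, max_iter=100_000):
    """M/D/1 limit: smallest root of z = exp(-rho*(1 - z)) (20-18)."""
    z = math.exp(-rho)
    for _ in range(max_iter):
        z_next = math.exp(-rho * (1.0 - z))
        if abs(z_next - z) < tol:
            break
        z = z_next
    return z

rho = 2.0
print(men1_extinction(rho, 1))   # M/M/1:   ~ 0.5
print(men1_extinction(rho, 2))   # M/E_2/1: ~ (3 - sqrt(5))/2 = 0.382...
print(md1_extinction(rho))       # M/D/1:   ~ 0.203
```

Note how π₀ drops as the service-time distribution becomes less variable (exponential → Erlang → deterministic), for the same ρ.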
20.2 Martingales

Martingales refer to a specific class of stochastic processes that maintain a form of stability in an overall sense. Let {X_i, i ≥ 0} refer to a discrete-time stochastic process. If n refers to the present instant, then in any realization the random variables X₀, X₁, …, X_n are known, and the future values X_{n+1}, X_{n+2}, … are unknown. The process is stable in the sense that, conditioned on the available information (past and present), no change is expected on the average for the future values; hence the conditional expectation of the immediate future value is the same as the present value. Thus, if

    E{X_{n+1} | X_n, X_{n−1}, …, X₁, X₀} = X_n                        (20-19)
for all n, then the sequence {X_n} represents a martingale.

Historically, martingales refer to the doubling-the-stake strategy in gambling, where the gambler doubles the bet on every loss until the almost-sure win eventually occurs, at which point the entire loss is recovered by the wager together with a modest profit. Problems 15-6 and 15-7, Chapter 15, Text, refer to examples of martingales. [Also refer to Section 15-5, Text.]

If {X_n} refers to a Markov chain, then, as we have seen, with

    p_ij = P{X_{n+1} = j | X_n = i},

Eq. (20-19) reduces to the simpler expression [Eq. (15-224), Text]

    Σ_j j p_ij = i.                                                   (20-20)
For finite chains of size N, interestingly, Eq. (20-20) reads

    P x = x,   x = [1, 2, 3, …, N]^T,   P = (p_ij),                   (20-21)

implying that x is a right-eigenvector of the transition probability matrix P associated with the eigenvalue 1. However, the all-one vector x₁ = [1, 1, 1, …, 1]^T is always an eigenvector for any P corresponding to the unit eigenvalue [see Eq. (15-179), Text], and from Perron's theorem and the discussion there [Theorem 15-8, Text] it follows that, for finite Markov chains that are also martingales, P cannot be a primitive matrix, and the corresponding chains are in fact not irreducible. Hence every finite-state martingale has at least two closed sets embedded in it. (The closed sets in the two martingales in Example 15-13, Text, correspond to two absorbing states. The same is true for the branching processes discussed in the next example; refer to the remarks following Eq. (20-7).)
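The eigenvector relation (20-21) and the reducibility it forces are easy to exhibit on a small chain; a sketch using a 6-state symmetric gambler's-ruin chain (an illustrative choice, not from the slides):

```python
import numpy as np

# States 1..N; the endpoints are absorbing and interior states step +/-1
# with probability 1/2 each (symmetric gambler's ruin). Each row then
# satisfies sum_j j*p_ij = i, which is exactly Eq. (20-20).
N = 6
P = np.zeros((N, N))
P[0, 0] = P[N - 1, N - 1] = 1.0        # two closed (absorbing) sets
for i in range(1, N - 1):
    P[i, i - 1] = P[i, i + 1] = 0.5

x = np.arange(1, N + 1, dtype=float)   # x = [1, 2, ..., N]^T
print(np.allclose(P @ x, x))           # True: Px = x, unit eigenvalue (20-21)
```

As the slide argues, a martingale chain like this cannot be irreducible: here states 1 and N form the two embedded closed sets.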
Example 20.5: As another example, let {X_n} represent the branching process discussed in Section 15-6, Eq. (15-287), Text, so that X_{n+1} = Σ_{i=1}^{X_n} Y_i. Then Z_n given by

    Z_n = π₀^{X_n}                                                    (20-22)

is a martingale, where the Y_i are independent, identically distributed random variables and π₀ refers to the extinction probability for that process [see Theorem 15-9, Text]. To see this, note that

    E{Z_{n+1} | Z_n, …, Z₀} = E{ π₀^{X_{n+1}} | X_n, …, X₀ }          (since {X_n} is a Markov chain)
                            = E{ π₀^{Σ_{i=1}^{X_n} Y_i} | X_n }       (since the Y_i are independent of X_n)
                            = [E{π₀^{Y_i}}]^{X_n} = [P(π₀)]^{X_n} = π₀^{X_n} = Z_n,   (20-23)

where we have used the Markov property of the chain,
the common moment generating function P(z) of the Y_i's, and Theorem 15-9, Text (which gives P(π₀) = π₀).

Example 20.6 (De Moivre's martingale): The gambler's ruin problem (see Example 3-15, Text) also gives rise to various martingales (see Problem 15-7 for an example). From there, if S_n refers to player A's cumulative capital at stage n (note that S₀ = $a), then, as De Moivre observed,

    Y_n = (q/p)^{S_n}                                                 (20-24)

generates a martingale. This follows since

    S_{n+1} = S_n + Z_{n+1},                                          (20-25)

where the instantaneous gain or loss Z_{n+1} obeys

    P{Z_{n+1} = 1} = p,   P{Z_{n+1} = −1} = q,                        (20-26)

and hence

    E{Y_{n+1} | Y_n, …, Y₀} = E{ (q/p)^{S_{n+1}} | S_n, …, S₀ }
                            = E{ (q/p)^{S_n + Z_{n+1}} | S_n },
since {S_n} generates a Markov chain. Thus

    E{Y_{n+1} | Y_n, …, Y₀} = (q/p)^{S_n} [ p(q/p) + q(p/q) ] = (q/p)^{S_n} (q + p) = Y_n,   (20-27)

i.e., Y_n in (20-24) defines a martingale!

Martingales have excellent convergence properties in the long run. To start with, from (20-19), for any given n, taking expectations on both sides we get

    E{X_{n+1}} = E{X_n} = ⋯ = E{X₀}.                                  (20-28)

Observe that, as stated, (20-28) is true only when n is known, i.e., when n is a given number. As the following result shows, martingales do not fluctuate wildly: there is in fact only a small probability that a large deviation of a martingale from its initial value will occur.
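De Moivre's martingale can be checked exactly by propagating the distribution of S_n; a sketch (the values p = 2/3, a = 5 are illustrative, and boundary absorption is ignored since it does not affect the martingale property of the unrestricted walk):

```python
from fractions import Fraction

p = Fraction(2, 3)
q = 1 - p
a = 5                                   # S_0 = a
dist = {a: Fraction(1)}                 # exact distribution of S_n
means = []
for n in range(6):
    # E{Y_n} with Y_n = (q/p)^{S_n}, Eq. (20-24)
    means.append(sum(prob * (q / p) ** s for s, prob in dist.items()))
    nxt = {}
    for s, prob in dist.items():        # S_{n+1} = S_n + Z_{n+1}, Eq. (20-25)
        nxt[s + 1] = nxt.get(s + 1, Fraction(0)) + prob * p
        nxt[s - 1] = nxt.get(s - 1, Fraction(0)) + prob * q
    dist = nxt

print(means)    # every entry equals (q/p)^a = (1/2)^5, as (20-28) demands
```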
Hoeffding's inequality: Let {X_n} represent a martingale and σ₁, σ₂, … be a sequence of real numbers such that the random variables

    Y_i = (X_i − X_{i−1}) / σ_i   satisfy   |Y_i| ≤ 1 with probability one.   (20-29)

Then

    P{ |X_n − X₀| ≥ x } ≤ 2 e^{−x² / (2 Σ_{i=1}^n σ_i²)}.             (20-30)

Proof: Eqs. (20-29)-(20-30) state that, so long as the martingale increments remain bounded almost surely, there is only a very small chance that a large deviation occurs between X_n and X₀. We shall prove (20-30) in three steps.

(i) For any convex function f(x) and 0 < α < 1, we have (Fig 20.8)

    α f(x₁) + (1 − α) f(x₂) ≥ f(α x₁ + (1 − α) x₂),                   (20-31)
which for α = (1 + a)/2, 1 − α = (1 − a)/2 with |a| ≤ 1, x₁ = 1, x₂ = −1, and f(x) = e^{θx}, θ > 0, gives

    e^{θa} ≤ ½(1 + a) e^{θ} + ½(1 − a) e^{−θ},   |a| ≤ 1.             (20-32)

Replacing a in (20-32) with any zero-mean random variable Y that is bounded by unity almost everywhere, and taking expected values on both sides, we get

    E{e^{θY}} ≤ ½(e^{θ} + e^{−θ}) ≤ e^{θ²/2}.                         (20-33)

Note that the right side of (20-33) is independent of Y. On the other hand, from (20-29) and the martingale property,

    E{Y_i | X₀, X₁, …, X_{i−1}} = (1/σ_i) E{X_i − X_{i−1} | X₀, X₁, …, X_{i−1}} = 0,   (20-34)

and since the Y_i are bounded by unity, from (20-32) we get (as in (20-33))

[Fig 20.8: A convex function f(x): the chord value α f(x₁) + (1 − α) f(x₂) lies above the curve value f(α x₁ + (1 − α) x₂).]
    E{ e^{θY_i} | X₀, X₁, …, X_{i−1} } ≤ e^{θ²/2}.                    (20-35)

(ii) To make use of (20-35), refer back to the Markov inequality in (5-89), Text; it can be rewritten as

    P{X ≥ a} ≤ e^{−θa} E{e^{θX}},   θ > 0,                            (20-36)

and with X = X_n − X₀ and a = x we get

    P{X_n − X₀ ≥ x} ≤ e^{−θx} E{ e^{θ(X_n − X₀)} }.                   (20-37)

But, writing X_n − X₀ = (X_{n−1} − X₀) + σ_n Y_n from (20-29) and using (20-35),

    E{ e^{θ(X_n − X₀)} } = E{ e^{θ(X_{n−1} − X₀)} e^{θ σ_n Y_n} }
        = E{ e^{θ(X_{n−1} − X₀)} E{ e^{θ σ_n Y_n} | X₀, X₁, …, X_{n−1} } }
        ≤ E{ e^{θ(X_{n−1} − X₀)} } e^{θ² σ_n² / 2}
        ≤ ⋯ ≤ e^{θ² Σ_{i=1}^n σ_i² / 2}.                              (20-38)
Substituting (20-38) into (20-37) we get

    P{X_n − X₀ ≥ x} ≤ e^{−θx + θ² Σ_{i=1}^n σ_i² / 2}.                (20-39)

(iii) Observe that the exponent on the right side of (20-39) is minimized for

    θ = x / Σ_{i=1}^n σ_i²,                                           (20-40)

and hence (20-39) reduces to

    P{X_n − X₀ ≥ x} ≤ e^{−x² / (2 Σ_{i=1}^n σ_i²)},   x > 0.

The same result holds when X_n − X₀ is replaced by X₀ − X_n, and adding the two bounds we get (20-30), Hoeffding's inequality.

From (20-28), for any fixed n, the mean value E{X_n} equals E{X₀}. Under what conditions is this result true if we replace n by a random time T? That is, if T is a random variable, then when is

    E{X_T} = E{X₀} ?                                                  (20-41)
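A Monte Carlo sanity check of (20-30) on the simplest case, a fair ±1 random walk with σ_i = 1 (all parameters below are illustrative):

```python
import math
import random

random.seed(7)

# For a fair +/-1 walk X_n (X_0 = 0) the increments satisfy (20-29) with
# sigma_i = 1, so Hoeffding's bound (20-30) reads
#     P{|X_n - X_0| >= x} <= 2*exp(-x^2 / (2n)).
n, trials, x = 100, 20_000, 25
count = 0
for _ in range(trials):
    s = sum(random.choice((-1, 1)) for _ in range(n))
    if abs(s) >= x:
        count += 1
empirical = count / trials
bound = 2 * math.exp(-x * x / (2 * n))
print(f"empirical {empirical:.4f} <= bound {bound:.4f}")
```

The empirical frequency (about 0.01 here) sits well below the bound (about 0.09): the inequality is loose but, as the slide says, it rules out wild fluctuations.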
The answer turns out to be that T has to be a stopping time.

What is a stopping time? A stochastic process may be known to assume a particular value, but the time at which this happens is in general unpredictable, i.e., random. In other words, the nature of the outcome is fixed, but its timing is random. When that outcome actually occurs, the time instant corresponds to a stopping time. Consider a gambler starting with $a, and let T refer to the time instant at which his capital becomes $1. The random variable T represents a stopping time. When the capital becomes zero, it corresponds to the gambler's ruin, and that instant represents another stopping time (time to go home for the gambler!).
Recall that in a Poisson process the occurrences of the first, second, … arrivals correspond to stopping times T₁, T₂, …. Stopping times refer to those random instants at which there is sufficient information to decide whether or not a specific condition has been satisfied.

Stopping Time: The random variable T is a stopping time for the process X(t) if, for all t ≥ 0, the event {T ≤ t} is a function of the values {X(τ), 0 ≤ τ ≤ t} of the process up to t; i.e., it should be possible to decide whether T has occurred or not by time t, knowing only the values of the process up to that time t. Thus the Poisson arrival times T₁ and T₂ referred to above are stopping times; however, T₂ − T₁ is not a stopping time.

A key result in martingales states that, so long as
T is a stopping time (under some additional mild restrictions),

    E{X_T} = E{X₀}.                                                   (20-42)

Notice that (20-42) generalizes (20-28) to certain random time instants (stopping times) as well.

Eq. (20-42) is an extremely useful tool in analyzing martingales. We shall illustrate its usefulness by rederiving the gambler's ruin probability of Example 3-15, Eq. (3-47), Text. From Example 20.6, Y_n in (20-24) refers to a martingale in the gambler's ruin problem. Let T refer to the random instant at which the game ends: the instant at which either player A loses all his wealth, with associated ruin probability P_a, or player A gains all the wealth $(a + b), with probability (1 − P_a). In that case T is a stopping time, and hence from (20-42) we get
    E{Y_T} = E{Y₀} = (q/p)^a,                                         (20-43)

since player A starts with $a in Example 3-15. But

    E{Y_T} = (q/p)⁰ · P_a + (q/p)^{a+b} (1 − P_a)
           = P_a + (q/p)^{a+b} (1 − P_a).                             (20-44)

Equating (20-43)-(20-44) and simplifying, we get

    P_a = [ (p/q)^b − 1 ] / [ (p/q)^{a+b} − 1 ],                      (20-45)

which agrees with (3-47), Text. Eq. (20-45) can be used to derive other useful probabilities and advantageous plays as well [see Examples 3-16 and 3-17, Text].

Whatever the advantage, it is worth quoting the master Gerolamo Cardano (1501-1576) on this: "The greatest advantage in gambling comes from not playing at all."

PILLAI
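Eq. (20-45) can be cross-checked against a direct first-step (difference-equation) solution of the ruin problem; a sketch in exact rational arithmetic (the values a = b = 5, p = 2/3 are illustrative):

```python
from fractions import Fraction

def ruin_probability(a, b, p):
    """Player A's ruin probability from Eq. (20-45):
    P_a = ((p/q)^b - 1) / ((p/q)^(a+b) - 1), valid for p != q."""
    q = 1 - p
    r = p / q
    return (r ** b - 1) / (r ** (a + b) - 1)

def ruin_by_first_step(a, b, p):
    """Independent check: solve u_k = p*u_{k+1} + q*u_{k-1} with
    u_0 = 1 (ruin) and u_{a+b} = 0 (A wins all), writing
    u_k = alpha_k + beta_k*u_1 and enforcing the far boundary."""
    p = Fraction(p)
    q = 1 - p
    N = a + b
    alpha = [Fraction(1), Fraction(0)]    # u_0 = 1
    beta = [Fraction(0), Fraction(1)]     # u_1 = unknown
    for k in range(1, N):
        alpha.append((alpha[k] - q * alpha[k - 1]) / p)
        beta.append((beta[k] - q * beta[k - 1]) / p)
    u1 = -alpha[N] / beta[N]              # enforce u_N = 0
    return alpha[a] + beta[a] * u1

p = Fraction(2, 3)
print(ruin_probability(5, 5, p), ruin_by_first_step(5, 5, p))  # both 31/1023
```

The two routes agree exactly, as the optional-stopping argument in the text guarantees.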
