Class Notes
Moshe Zukerman
EE Department
City University of Hong Kong
email: moshezu@[Link]
Preface
The aim of this textbook is to provide students with basic knowledge of stochastic models with
a special focus on queueing models that may apply to telecommunications topics, such as traffic
modelling, performance evaluation, resource provisioning and traffic management. These topics
are included in a field called teletraffic. This book assumes prior knowledge of a programming
language and mathematics normally taught in an electrical engineering bachelor program.
The book aims to enhance intuitive and physical understanding of the theoretical concepts it
introduces. The famous mathematician Pierre-Simon Laplace is quoted as saying that “Probability
is common sense reduced to calculation” [18]; as the content of this book falls under the field
of applied probability, Laplace’s quote very much applies. Accordingly, the book aims to link
intuition and common sense to the mathematical models and techniques it uses. It mainly
focuses on steady-state analyses and avoids discussions of time-dependent analyses.
A unique feature of this book is the considerable attention given to guided homework assign-
ments involving computer simulations and analyses. By successfully completing these assign-
ments, students learn to simulate and analyze stochastic models, such as queueing systems
and networks, and by interpreting the results, they gain insight into the queueing performance
effects and principles of telecommunications systems modelling. Although the book, at times,
provides intuitive explanations, it still presents the important concepts and ideas required for
the understanding of teletraffic, queueing theory fundamentals and related queueing behavior
of telecommunications networks and systems. These concepts and ideas form a strong base
for the more mathematically inclined students who can follow up with the extensive literature
on probability models and queueing theory. A small sample of this literature is listed at the
end of this book.
The first two chapters provide background on probability and stochastic processes topics rele-
vant to the queueing and teletraffic models of this book. These two chapters provide a summary
of the key topics with relevant homework assignments that are especially tailored for under-
standing the queueing and teletraffic models discussed in later chapters. The content of these
chapters is mainly based on [18, 38, 95, 102, 100, 101, 103]. Students are encouraged to also
study the original textbooks for more explanations, illustrations, discussions, examples and
homework assignments.
Chapter 3 discusses general queueing notation and concepts. Chapter 4 aims to assist the
student to perform simulations of queueing systems. Simulations are useful and important
in the many cases where exact analytical results are not available. An important learning
objective of this book is to train students to perform queueing simulations. Chapter 5 provides
analyses of deterministic queues. Many queueing theory books tend to exclude deterministic
queues; however, the study of such queues is useful for beginners in that it helps them better
understand non-deterministic queueing models. Chapters 6 – 14 provide analyses of a wide
range of queueing and teletraffic models most of which fall under the category of continuous-
time Markov-chain processes. Chapter 15 provides an example of a discrete-time queue that
is modelled as a discrete-time Markov chain. In Chapter 16, various aspects of a single server
queue with Poisson arrivals and general service times are studied, mainly focusing on mean
value results as in [17]. Then, in Chapter 17, some selected results of a single server queue
with a general arrival process and general service times are provided. Chapter 18 focuses on
multi-access applications, and in Chapter 19, we extend our discussion to queueing networks.
Finally, in Chapter 20, stochastic processes that have been used as traffic models are discussed
with special focus on their characteristics that affect queueing performance.
Throughout the book there is an emphasis on linking the theory with telecommunications ap-
plications as demonstrated by the following examples. Section 1.19 describes how properties
of the Gaussian distribution can be applied to link dimensioning. Section 6.11 shows, in the
context of an M/M/1 queueing model, how to optimally set a link service rate such that delay
requirements are met and how the level of multiplexing affects the spare capacity required to
meet such delay requirements. An application of the M/M/∞ queueing model to a multiple-access
performance problem [17] is discussed in Section 7.5. Then later in Chapter 18 more multi-
access models are presented. In Sections 8.8 and 9.5, discussions on dimensioning and related
utilization issues of a multi-channel system are presented. Especially important is the empha-
sis on the insensitivity property of models such as M/M/∞, M/M/k/k, processor sharing and
multi-service, which leads to practical and robust approximations, as described in Chapters 7, 8,
13, and 14. Section 19.3 guides the reader to simulate a mobile cellular network. Section 20.6
describes a traffic model applicable to the Internet.
Last but not least, the author wishes to thank all the students and colleagues who provided
comments and questions that helped develop and edit the manuscript over the years.
Contents
1 Background on Relevant Probability Topics
1.1 Events, Sample Space, and Random Variables
1.2 Probability, Conditional Probability and Independence
1.3 Probability and Distribution Functions
1.4 Joint Distribution Functions
1.5 Conditional Probability for Random Variables
1.6 Independence between Random Variables
1.7 Convolution
1.8 Selected Discrete Random Variables
1.8.1 Non-parametric
1.8.2 Bernoulli
1.8.3 Geometric
1.8.4 Binomial
1.8.5 Poisson
1.8.6 Pascal
1.8.7 Discrete Uniform
1.9 Continuous Random Variables and their Distributions
1.10 Selected Continuous Random Variables
1.10.1 Uniform
1.10.2 Exponential
1.10.3 Relationship between Exponential and Geometric Random Variables
1.10.4 Hyper-Exponential
1.10.5 Erlang
1.10.6 Hypo-Exponential
1.10.7 Gaussian
1.10.8 Pareto
1.11 Moments and Variance
1.11.1 Mean (or Expectation)
1.11.2 Moments
1.11.3 Variance
1.11.4 Conditional Mean and the Law of Iterated Expectation
1.11.5 Conditional Variance and the Law of Total Variance
1.12 Mean and Variance of Specific Random Variables
1.13 Sample Mean and Sample Variance
1.14 Covariance and Correlation
1.15 Transforms
1.15.1 Z-transform
1.15.2 Laplace Transform
1.16 Multivariate Random Variables and Transform
1.17 Probability Inequalities and Their Dimensioning Applications
1.18 Limit Theorems
1.19 Link Dimensioning
1.19.1 Case 1: Homogeneous Individual Sources
1.19.2 Case 2: Non-homogeneous Individual Sources
1.19.3 Case 3: Capacity Dimensioning for a Community
4 Simulations
4.1 Confidence Intervals
4.2 Simulation of a G/G/1 Queue
6 M/M/1
6.1 Steady-State Queue Size Probabilities
6.2 State Transition Diagram of M/M/1
6.3 Queue Performance Measures: E[Q], E[N_Q], E[D], and E[W_Q]
6.4 Using Z-Transform
6.5 Mean Delay of Delayed Customers
6.6 Delay Distribution
6.7 The Departure Process
6.8 Mean Busy Period and First Passage Time
6.9 Dimensioning Based on Meeting Required Mean Delay
6.10 Effect of Rising Internet Bit-rate on Link Efficiency and QoS
6.11 Multiplexing
6.12 Dimensioning Based on Delay Distribution
6.13 A Markov-chain Simulation of M/M/1
7 M/M/∞
7.1 Steady-State Equations
7.2 Solving the Steady-State Equations
7.3 State Transition Diagram of M/M/∞
7.4 Insensitivity
7.5 Applications
7.5.1 A Multi-access Model
7.5.2 Birth Rate Evaluation
9 M/M/k
9.1 Steady-State Equations and Their Solution
9.2 Erlang C Formula
9.3 Mean Queue Size, Delay, Waiting Time and Delay Factor
9.4 Mean Delay of Delayed Customers
9.5 Dimensioning
9.6 Utilization
Homework 1.1
Consider the experiment to be tossing a coin. What is the Sample Space? What are the events
associated with this Sample Space?
Guide
Notice that although the sample space includes only the outcomes of the experiment, which are
Head (H) and Tail (T), the events associated with this sample space also include the empty
set, which in this case is the event H ∩ T, and the entire sample space, which in this case is the
event H ∪ T.
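To make this concrete, here is a minimal Python sketch (an illustration added to these notes, not part of the original guide) that enumerates all the events associated with this sample space, i.e., all subsets of {H, T}, including the empty set and the entire sample space.

```python
from itertools import chain, combinations

def all_events(sample_space):
    """Return every event (i.e., every subset) of the given sample space."""
    outcomes = sorted(sample_space)
    subsets = chain.from_iterable(
        combinations(outcomes, r) for r in range(len(outcomes) + 1))
    return [set(s) for s in subsets]

# Sample space of a single coin toss
for event in all_events({'H', 'T'}):
    print(event if event else 'empty set')
# Prints the four events: empty set, {'H'}, {'T'}, {'H', 'T'}
```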
The conditional probability of event A given event B is defined by

P(A | B) = \frac{P(A ∩ B)}{P(B)}.        (1)

Notice that

P(A ∩ B) = P(B ∩ A) = P(B | A)P(A),

so by (1) we obtain

P(A | B) = \frac{P(B | A)P(A)}{P(B)}.        (2)
Eq. (2) is useful for obtaining the conditional probability of one event (A) given another (B)
when P(B | A) is known, or is easier to obtain than P(A | B).
Remark: The intersection of A and B is also denoted by A, B or AB in addition to A∩B.
If events A and B are independent, which means that the occurrence of one of them does not
affect the probability that the other occurs, then

P(A | B) = P(A)        (3)
and since the B_i's are mutually exclusive, the events A ∩ B_i are also mutually exclusive.
Hence,

P(A) = \sum_{i=1}^{n} P(A ∩ B_i).        (6)
The latter is a very useful formula for deriving the probability of a given event by conditioning
and unconditioning on a set of mutually exclusive and exhaustive events. It is called the Law of
Total Probability. Therefore, by Eqs. (7) and (1) (again), we obtain the following formula for
the conditional probability between two events:
P(B_1 | A) = \frac{P(A | B_1)P(B_1)}{\sum_{i=1}^{n} P(A | B_i)P(B_i)}.        (8)
The latter is known as Bayes’ theorem (or Bayes’ law or Bayes’ rule).
Note that Eq. (2) is also referred to as Bayes’ theorem.
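As a numerical illustration of the Law of Total Probability and Bayes' theorem, the following minimal Python sketch (added to these notes) uses hypothetical values of P(B_i) and P(A | B_i) for three mutually exclusive and exhaustive events; the numbers are assumptions chosen only to demonstrate Eqs. (6) and (8).

```python
# Hypothetical priors P(B_i) for mutually exclusive and exhaustive events B_1, B_2, B_3
P_B = [0.5, 0.3, 0.2]          # must sum to 1
# Hypothetical conditional probabilities P(A | B_i)
P_A_given_B = [0.1, 0.4, 0.8]

# Law of Total Probability: P(A) = sum_i P(A | B_i) P(B_i)
P_A = sum(pa * pb for pa, pb in zip(P_A_given_B, P_B))

# Bayes' theorem, Eq. (8): P(B_1 | A) = P(A | B_1) P(B_1) / P(A)
P_B1_given_A = P_A_given_B[0] * P_B[0] / P_A

print(P_A)           # approximately 0.33
print(P_B1_given_A)  # approximately 0.1515
```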
Set A is said to be a subset of set B, namely A ⊂ B, if A ≠ B and every element of A is
an element of B. If we do not require A ≠ B, but still require that every element of A is an
element of B, then we say that A is a subset of or equal to set B, which is denoted by A ⊆ B.
If the occurrence of event A implies the occurrence of event B, then we have that A ⊆ B. In such
a case (and of course under the stronger condition A ⊂ B), A = A ∩ B, so P(A) = P(A ∩ B).
It is important that the student is familiar with these alternative terms, so we will use these
terms interchangeably in this book.
The cumulative distribution function of a random variable X is defined for all x ∈ R (R being
the set of all real numbers) as

F(x) = P(X ≤ x),

and the complementary distribution function of X is defined by

F̄(x) = P(X > x).

Consequently, for any random variable, for every x ∈ R, F(x) + F̄(x) = 1. As the complementary
and the cumulative distribution functions, as well as the probability function, can be obtained
from each other, we will use the term distribution function when we refer to any of these
functions without being specific.
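As a simple illustration (added here, using a hypothetical fair six-sided die), the following Python sketch computes the cumulative and complementary distribution functions and checks that F(x) + F̄(x) = 1 for every x.

```python
# Probability function of a fair six-sided die (illustrative example)
P = {x: 1.0 / 6.0 for x in range(1, 7)}

def F(x):
    """Cumulative distribution function: F(x) = P(X <= x)."""
    return sum(p for value, p in P.items() if value <= x)

def F_bar(x):
    """Complementary distribution function: F_bar(x) = P(X > x)."""
    return sum(p for value, p in P.items() if value > x)

for x in range(0, 8):
    assert abs(F(x) + F_bar(x) - 1.0) < 1e-12
    print(x, round(F(x), 4), round(F_bar(x), 4))
```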
We have already mentioned that if X is a random variable, then Y = g(X) is also a random
variable. In this case, if P_X(x) is the probability function of X, then the probability function
of Y is

P_Y(y) = \sum_{x: g(x)=y} P_X(x).        (12)
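The following minimal Python sketch (added for illustration) applies Eq. (12) to an assumed probability function of X and the hypothetical mapping g(x) = x², grouping the probabilities P_X(x) by the value y = g(x).

```python
from collections import defaultdict

# Assumed probability function of X (illustrative values only)
P_X = {-1: 0.25, 0: 0.25, 1: 0.5}

def g(x):
    """The hypothetical mapping defining Y = g(X) = X^2."""
    return x * x

# Eq. (12): P_Y(y) is the sum of P_X(x) over all x such that g(x) = y
P_Y = defaultdict(float)
for x, p in P_X.items():
    P_Y[g(x)] += p

print(dict(P_Y))  # {1: 0.75, 0: 0.25}
```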
Having the joint distribution function, we can obtain the distribution function of a single
random variable, say X_1, as

F_{X_1}(x_1) = \lim_{x_2, \ldots, x_n \to ∞} F_{X_1, \ldots, X_n}(x_1, x_2, \ldots, x_n).
A random variable is called discrete if it takes at most a countable number of possible values.
On the other hand, a continuous random variable takes an uncountable number of possible
values.
When the random variables X_1, X_2, . . . , X_n are discrete, we can use their joint probability
function, which is defined by

P_{X_1, \ldots, X_n}(x_1, x_2, \ldots, x_n) = P(X_1 = x_1, X_2 = x_2, \ldots, X_n = x_n).
In this section and in Sections 1.5, 1.6, and 1.7, when we mention random variables or their distri-
bution functions, we consider them all to be discrete. Then, in Section 1.9, we will introduce
the analogous definitions and notation relevant to their continuous counterparts.
We have already mentioned the terms probability function, distribution, probability distribution
function and probability distribution. These terms apply to discrete as well as to continuous
random variables. There are, however, additional terms that are used to describe the probability
function only for discrete random variables: probability mass function and probability mass.
There are equivalent terms used only for continuous random variables: probability density
function, density function, and simply density.
For discrete random variables X and Y, the conditional probability that X = x given that
Y = y is, by (1),

P(X = x | Y = y) = \frac{P(X = x, Y = y)}{P(Y = y)},        (17)

which in probability function notation is written as

P_{X|Y}(x | y) = \frac{P_{X,Y}(x, y)}{P_Y(y)}.        (18)
Noticing that

P_Y(y) = \sum_{x} P_{X,Y}(x, y),        (19)

we obtain by (18)

\sum_{x} P_{X|Y}(x | y) = 1.        (20)
This means that if we condition on the event {Y = y} for a specific y, then the probability
function of X given {Y = y} is a legitimate probability function. This is consistent with
our discussion above. The event {Y = y} is the new sample space and X has a legitimate
distribution function there. By (18),

P_{X,Y}(x, y) = P_{X|Y}(x | y)P_Y(y),        (21)

and by symmetry

P_{X,Y}(x, y) = P_{Y|X}(y | x)P_X(x),        (22)
so the latter and (19) give

P_Y(y) = \sum_{x} P_{X,Y}(x, y) = \sum_{x} P_{Y|X}(y | x)P_X(x).        (23)
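The following Python sketch (an added illustration based on a hypothetical joint probability function P_{X,Y}) numerically checks Eqs. (19), (20) and (23).

```python
# Hypothetical joint probability function P_{X,Y}(x, y); the values sum to 1.
P_XY = {(0, 0): 0.10, (0, 1): 0.20,
        (1, 0): 0.30, (1, 1): 0.40}

xs = {x for x, _ in P_XY}
ys = {y for _, y in P_XY}

# Eq. (19): marginal P_Y(y) = sum over x of P_{X,Y}(x, y)
P_Y = {y: sum(P_XY[(x, y)] for x in xs) for y in ys}
# Marginal P_X(x), obtained the same way by summing over y
P_X = {x: sum(P_XY[(x, y)] for y in ys) for x in xs}

for y in ys:
    # Eqs. (18) and (20): the conditional probabilities P_{X|Y}(x | y) sum to 1
    assert abs(sum(P_XY[(x, y)] / P_Y[y] for x in xs) - 1.0) < 1e-12
    # Eq. (23): P_Y(y) recovered from P_{Y|X}(y | x) and P_X(x)
    recovered = sum((P_XY[(x, y)] / P_X[x]) * P_X[x] for x in xs)
    assert abs(recovered - P_Y[y]) < 1e-12

print(P_X, P_Y)
```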
1.7 Convolution
Consider independent random variables V_1 and V_2 that have probability functions P_{V_1}(v_1) and
P_{V_2}(v_2), respectively, and their sum, which is another random variable V = V_1 + V_2. Let us
now derive the probability function P_V(v) of V:

P_V(v) = P(V_1 + V_2 = v)
       = \sum_{v_1} P(V_1 = v_1, V_2 = v - v_1)
       = \sum_{v_1} P_{V_1}(v_1)P_{V_2}(v - v_1),

where the last equality follows from the independence of V_1 and V_2. The latter is called the
convolution of the probability functions P_{V_1}(v_1) and P_{V_2}(v_2).
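A minimal Python sketch of this convolution formula is given below (an illustration added to these notes; the two probability functions are assumed to be those of fair six-sided dice). Applying the same function repeatedly gives the probability function of a sum of k independent random variables, as discussed next.

```python
from collections import defaultdict
from functools import reduce

def convolve(P1, P2):
    """Convolution of two discrete probability functions given as dicts mapping
    each possible value to its probability: P_V(v) = sum_{v1} P1(v1) P2(v - v1)."""
    P = defaultdict(float)
    for v1, p1 in P1.items():
        for v2, p2 in P2.items():
            P[v1 + v2] += p1 * p2
    return dict(P)

# Assumed probability functions of two independent fair dice (illustration only)
die = {x: 1.0 / 6.0 for x in range(1, 7)}
print(convolve(die, die))             # probability function of the sum of two dice

# Sum of k = 3 independent dice by repeated convolution
print(reduce(convolve, [die, die, die]))
```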
Let us now extend the result from two to k random variables. Consider k independent random
variables X_i, i = 1, 2, 3, . . . , k. Let P_{X_i}(x_i) be the probability function of X_i, for
i = 1, 2, 3, . . . , k, and let

Y = \sum_{i=1}^{k} X_i.

If k = 3, then we first compute the convolution of X_1 and X_2 to obtain the probability
function of V = X_1 + X_2 using the above convolution formula, and then we use the formula
again to obtain the probability function of Y = V + X_3 = X_1 + X_2 + X_3. Therefore, for an
arbitrary k, we obtain