Module 3: Markov Chains
5.2 Stochastic Process

We have already stated that in a random experiment, if a real variable X is associated with every outcome, then it is called a random variable or stochastic variable. This is equivalent to having a function on the sample space S, and this function is called a random function or stochastic function. In this article we discuss a stochastic process called the Markov process, in which the generation of the probability distributions depends only on the present state. Before we take up the actual discussion of the Markov process, we present some basic definitions and concepts relating to stochastic processes.
Classification of Stochastic Processes

Let S be the sample space of a random experiment and R be the set of all real numbers. A random variable X is a function f from S to R, i.e., X = f(s), s ∈ S. We define an index set T ⊆ R indexed by a parameter t, such as time. Suppose that the value of a random variable defined on S depends on s ∈ S and t ∈ T. In this context a stochastic process is a set of random variables { X(t), t ∈ T } defined on S with parameter t. Here X0 = X(0) is called the initial state of the system.

The values assumed by the random variable X(t) are called states, and the set of all possible values forms the state space of the process. If the state space of a stochastic process is discrete, then it is called a discrete state process, also called a chain. On the other hand, if the state space is continuous, then the stochastic process is called a continuous state process.

Similarly, if the index set T is discrete, then we have a discrete parameter process; otherwise (i.e., when T is a continuous set) we have a continuous parameter process. A discrete parameter process is also called a stochastic sequence, denoted by { Xn }, n ∈ T.

The classification of the four different types of stochastic processes is presented in the form of a table.
                          Discrete Index Set T          Continuous Index Set T

  Discrete State Space    Discrete parameter            Continuous parameter
                          stochastic process (chain)    stochastic process (chain)

  Continuous State Space  Discrete parameter            Continuous parameter
                          continuous state              continuous state
                          stochastic process            stochastic process

Definitions
Probability Vector: By a vector we simply mean an n-tuple of numbers v = (v1, v2, ..., vn), where the quantities v1, v2, ..., vn are called the components of the vector.

A vector v = (v1, v2, ..., vn) is called a probability vector if each of its components is non-negative and their sum is equal to unity.

Examples: u = (1, 0), v = (1/2, 1/2), w = (1/4, 1/4, 1/2) are all probability vectors.

Note: If v is not a probability vector but each of the vi (i = 1 to n) is non-negative, then λv is a probability vector, where λ = 1 / Σ (i = 1 to n) vi.

For example, if v = (1, 2, 3), then λ = 1/6 and (1/6, 2/6, 3/6) is a probability vector.
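The note above can be checked numerically. The following is a minimal Python sketch (using NumPy, which is our assumption rather than part of the text):

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])   # non-negative, but not a probability vector
lam = 1.0 / v.sum()             # lambda = 1 / (v1 + v2 + ... + vn)
p = lam * v                     # (1/6, 2/6, 3/6)

assert np.all(p >= 0) and np.isclose(p.sum(), 1.0)
print(p)                        # [0.16666667 0.33333333 0.5]
```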
Stochastic Matrix: A square matrix P = (p_ij) having every row in the form of a probability vector is called a stochastic matrix.

Examples: (i) the identity matrix I of any order, e.g.

I = [ 1  0  0 ]
    [ 0  1  0 ]
    [ 0  0  1 ]

(ii) A = [  0    1  ]
         [ 1/2  1/2 ]

47. If v = (v1, v2) is a probability vector and A = [ a1  a2 ; b1  b2 ] is a stochastic matrix of order 2, prove that vA is also a probability vector.

>> By data, a1 + a2 = 1, b1 + b2 = 1 and v1 + v2 = 1.

vA = [v1, v2] [ a1  a2 ]  =  [ v1 a1 + v2 b1 ,  v1 a2 + v2 b2 ]
              [ b1  b2 ]

We have to prove that (v1 a1 + v2 b1) + (v1 a2 + v2 b2) = 1.

LHS = v1 (a1 + a2) + v2 (b1 + b2) = v1 · 1 + v2 · 1 = v1 + v2 = 1

Thus vA is also a probability vector.
48. Prove with reference to two second order stochastic matrices that their product is also a stochastic matrix.

>> Let A = [ a_11  a_12 ]  and  B = [ b_11  b_12 ]  be two stochastic matrices. Hence we have,
           [ a_21  a_22 ]           [ b_21  b_22 ]

a_11 + a_12 = 1 ;  b_11 + b_12 = 1
a_21 + a_22 = 1 ;  b_21 + b_22 = 1     ... (i)

AB = [ a_11 b_11 + a_12 b_21    a_11 b_12 + a_12 b_22 ]
     [ a_21 b_11 + a_22 b_21    a_21 b_12 + a_22 b_22 ]

We have to show that,

a_11 b_11 + a_12 b_21 + a_11 b_12 + a_12 b_22 = 1     ... (ii)
a_21 b_11 + a_22 b_21 + a_21 b_12 + a_22 b_22 = 1     ... (iii)

LHS of (ii) can be written as,
a_11 (b_11 + b_12) + a_12 (b_21 + b_22) = a_11 · 1 + a_12 · 1 = a_11 + a_12 = 1, by using (i).

LHS of (iii) can be written as,
a_21 (b_11 + b_12) + a_22 (b_21 + b_22) = a_21 · 1 + a_22 · 1 = 1.

Thus AB is a stochastic matrix.

Remark: In particular we can say that Aⁿ (n = 1, 2, 3, ...) are all stochastic matrices.
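Both Problem-48 and the remark lend themselves to a quick numerical check; a sketch in Python/NumPy (the particular matrices chosen here are merely illustrative):

```python
import numpy as np

def is_stochastic(M, tol=1e-12):
    """True if every entry is non-negative and every row sums to 1."""
    return np.all(M >= -tol) and np.allclose(M.sum(axis=1), 1.0)

A = np.array([[0.75, 0.25], [0.5, 0.5]])
B = np.array([[0.0, 1.0], [0.5, 0.5]])

print(is_stochastic(A @ B))   # True: the product is again stochastic
print(all(is_stochastic(np.linalg.matrix_power(A, n)) for n in range(1, 6)))  # True
```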
49. If A is a square matrix of order n whose rows are each the same vector a = (a1, a2, ..., an), and if v = (v1, v2, ..., vn) is a probability vector, prove that vA = a.

>> By data we have,

A = [ a1  a2  ...  an ]
    [ a1  a2  ...  an ]
    [ ...             ]
    [ a1  a2  ...  an ]

vA = [ v1 a1 + v2 a1 + ... + vn a1 ,  v1 a2 + v2 a2 + ... + vn a2 ,  ... ,  v1 an + v2 an + ... + vn an ]

   = [ a1 (v1 + v2 + ... + vn) ,  a2 (v1 + v2 + ... + vn) ,  ... ,  an (v1 + v2 + ... + vn) ]

   = [ a1, a2, ..., an ]   since v1 + v2 + ... + vn = 1

Thus vA = a as required.
50. Find the unique fixed probability vector of the regular stochastic matrix

A = [ 3/4  1/4 ]
    [ 1/2  1/2 ]

>> We have to find v = (x, y), where x + y = 1, such that vA = v.

[x, y] [ 3/4  1/4 ] = [x, y]
       [ 1/2  1/2 ]

i.e., [ (3/4)x + (1/2)y ,  (1/4)x + (1/2)y ] = [x, y]

=> (3/4)x + (1/2)y = x     ... (i)
   (1/4)x + (1/2)y = y     ... (ii)

We can solve either of the two equations by using y = 1 − x.

Using y = 1 − x in (i) we have (3/4)x + (1/2)(1 − x) = x, or 3x + 2 − 2x = 4x, so x = 2/3.

Hence y = 1 − x = 1/3 and v = (x, y) = (2/3, 1/3).

Thus (2/3, 1/3) is the unique fixed probability vector.
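Numerically, the fixed probability vector can be found by solving the linear system vA = v together with x + y = 1. A brief sketch (NumPy assumed):

```python
import numpy as np

A = np.array([[3/4, 1/4], [1/2, 1/2]])

# vA = v  <=>  (A^T - I) v^T = 0; replace one redundant equation by x + y = 1.
M = A.T - np.eye(2)
M[1] = [1.0, 1.0]                  # impose the normalisation x + y = 1
v = np.linalg.solve(M, np.array([0.0, 1.0]))
print(v)                           # [0.66666667 0.33333333]  i.e. (2/3, 1/3)
```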
51. Find the unique fixed probability vector for the regular stochastic matrix

A = [  0    1    0  ]
    [ 1/6  1/2  1/3 ]
    [  0   2/3  1/3 ]

>> We have to find v = (x, y, z), where x + y + z = 1, such that vA = v.

[x, y, z] [  0    1    0  ] = [x, y, z]
          [ 1/6  1/2  1/3 ]
          [  0   2/3  1/3 ]

i.e., y/6 = x ;  x + y/2 + 2z/3 = y ;  y/3 + z/3 = z

i.e., y = 6x ;  6x − 3y + 4z = 0 ;  y − 2z = 0

Using y = 6x and z = 1 − x − y = 1 − 7x in 6x − 3y + 4z = 0 we have,

6x − 18x + 4 − 28x = 0, so x = 1/10.

Hence y = 6/10 and z = 3/10.

Thus the required unique fixed probability vector is v = (1/10, 6/10, 3/10).
52. With reference to the stochastic matrix A in Problem-50, verify the property that the sequence A², A³, A⁴ approaches the matrix whose rows are each the fixed probability vector.

>> We have A = (1/4) [ 3  1 ; 2  2 ], and as we have obtained in Problem-50, the fixed probability vector is v = (2/3, 1/3).

Let B be the matrix whose rows are each v:

B = [ 2/3  1/3 ]
    [ 2/3  1/3 ]

Now A² = (1/16) [ 3  1 ] [ 3  1 ] = (1/16) [ 11  5 ] = [ 0.6875  0.3125 ]
                [ 2  2 ] [ 2  2 ]          [ 10  6 ]   [ 0.625   0.375  ]

A³ = (1/64) [ 3  1 ] [ 11  5 ] = (1/64) [ 43  21 ] = [ 0.671875  0.328125 ]
            [ 2  2 ] [ 10  6 ]          [ 42  22 ]   [ 0.65625   0.34375  ]

A⁴ = (1/256) [ 3  1 ] [ 43  21 ] = (1/256) [ 171  85 ] ≈ [ 0.67  0.33 ]
             [ 2  2 ] [ 42  22 ]           [ 170  86 ]   [ 0.66  0.34 ]

Each row of A⁴ is approaching v = (2/3, 1/3) ≈ (0.67, 0.33).
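A sketch verifying this convergence numerically (NumPy assumed; the exponent 10 is just a convenient stopping point):

```python
import numpy as np

A = np.array([[3/4, 1/4], [1/2, 1/2]])
for n in (2, 3, 4, 10):
    print(n, np.linalg.matrix_power(A, n))
# A^10 is already very close to B = [[2/3, 1/3], [2/3, 1/3]]
```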
53. Find the unique fixed probability vector of the regular stochastic matrix

P = [  0   1/2  1/4  1/4 ]
    [ 1/2   0   1/4  1/4 ]
    [ 1/2  1/2   0    0  ]
    [ 1/2  1/2   0    0  ]

>> We have to find v = (a, b, c, d), where a + b + c + d = 1, such that vP = v.

[a, b, c, d] P = [a, b, c, d]

=> (1/2)(b + c + d) = a   or   b + c + d = 2a     ... (i)
   (1/2)(a + c + d) = b   or   a + c + d = 2b     ... (ii)
   (1/4)(a + b) = c       or   a + b = 4c         ... (iii)
   (1/4)(a + b) = d       or   a + b = 4d         ... (iv)

By using b + c + d = 1 − a and a + c + d = 1 − b, (i) and (ii) respectively become 1 − a = 2a and 1 − b = 2b,

so a = 1/3 and b = 1/3.

Hence we have from (iii) and (iv), 4c = 2/3 and 4d = 2/3, so c = 1/6 and d = 1/6.

Thus v = (1/3, 1/3, 1/6, 1/6) is the required unique fixed probability vector.
54. Show that (a, b) is a fixed point of the stochastic matrix

P = [ 1−b   b  ]
    [  a   1−a ]

What is the associated fixed probability vector? Hence write down the fixed probability vector of each of the following matrices.

P1 = [ 1/3  2/3 ]    P2 = [ 1/2  1/2 ]    P3 = [ 7/10  3/10 ]
     [  1    0  ]         [ 2/3  1/3 ]         [ 8/10  2/10 ]

>> Let x = (a, b) and consider the matrix product

xP = [a, b] [ 1−b   b  ] = [ a(1−b) + ba ,  ab + b(1−a) ] = [ a − ab + ab ,  ab + b − ab ] = [a, b]
            [  a   1−a ]

Thus xP = x, so (a, b) is a fixed point of P.

Also v = ( a/(a+b), b/(a+b) ) is the required fixed probability vector of P.

Comparing P1, P2, P3 with P we have respectively

a = 1, b = 2/3 ;   a = 2/3, b = 1/2 ;   a = 8/10, b = 3/10
a + b = 5/3    ;   a + b = 7/6       ;   a + b = 11/10

The corresponding fixed probability vectors of P1, P2, P3 are respectively denoted by v1, v2, v3, where in general v = ( a/(a+b), b/(a+b) ).

Thus v1 = (3/5, 2/5) ;  v2 = (4/7, 3/7) ;  v3 = (8/11, 3/11)

are the required fixed probability vectors of P1, P2, P3 in the respective order.
55. If P1 = [ 1−a   a  ]  and  P2 = [ 1−b   b  ] ,
            [  b   1−b ]           [  a   1−a ]

show that P1, P2 and P1 P2 are stochastic matrices.

>> In P1 we have (1−a) + a = 1 and b + (1−b) = 1.
In P2 we have (1−b) + b = 1 and a + (1−a) = 1.
Thus P1 and P2 are stochastic matrices.

Now P1 P2 = [ 1−a   a  ] [ 1−b   b  ]
            [  b   1−b ] [  a   1−a ]

 = [ (1−a)(1−b) + a·a      (1−a)b + a(1−a)  ]  =  [ a1  b1 ]     ... (i)
   [ b(1−b) + (1−b)a       b·b + (1−b)(1−a) ]     [ a2  b2 ]

We shall show that a1 + b1 = 1 and a2 + b2 = 1.

Now a1 + b1 = (1−a)(1−b) + (1−a)b + a² + a(1−a)
            = (1−a)[(1−b) + b] + a[a + 1 − a]
            = 1 − a + a = 1.   Thus a1 + b1 = 1.

Also a2 + b2 = b(1−b) + b² + a(1−b) + (1−b)(1−a)
            = b[(1−b) + b] + (1−b)[a + (1−a)]
            = b + 1 − b = 1.   Thus a2 + b2 = 1.

Thus P1 P2 is a stochastic matrix.
56. Show that P = [ 0  1  0 ; 0  0  1 ; 1/2  1/2  0 ] is a regular stochastic matrix. Also find the associated unique fixed probability vector.

>> Consider

P² = [  0    1    0  ] [  0    1    0  ]   [  0    0    1  ]
     [  0    0    1  ] [  0    0    1  ] = [ 1/2  1/2   0  ]
     [ 1/2  1/2   0  ] [ 1/2  1/2   0  ]   [  0   1/2  1/2 ]

P³ = P · P² = [ 1/2  1/2   0  ]
              [  0   1/2  1/2 ]
              [ 1/4  1/4  1/2 ]

P⁴ = P · P³ = [  0   1/2  1/2 ]
              [ 1/4  1/4  1/2 ]
              [ 1/4  1/2  1/4 ]

P⁵ = P · P⁴ = [ 1/4  1/4  1/2 ]
              [ 1/4  1/2  1/4 ]
              [ 1/8  3/8  1/2 ]

We observe that in P⁵ all the entries are positive. Thus P is a regular stochastic matrix.

Next we have to find v = (a, b, c), where a + b + c = 1, such that vP = v.

[a, b, c] [  0    1    0  ] = [a, b, c]
          [  0    0    1  ]
          [ 1/2  1/2   0  ]

i.e., [ c/2 ,  a + c/2 ,  b ] = [a, b, c]

=> c = 2a ;  a + c/2 = b ;  b = c

Using c = 2a and b = c = 2a in a + b + c = 1 we get 5a = 1, or a = 1/5. Hence b = c = 2/5.

Thus (1/5, 2/5, 2/5) is the required unique fixed probability vector of P.
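The regularity test, i.e. searching for a power of P with all entries positive, is easy to automate. A sketch (NumPy assumed):

```python
import numpy as np

P = np.array([[0, 1, 0], [0, 0, 1], [0.5, 0.5, 0]])

# P is regular if some power of P has all entries strictly positive.
Q = P.copy()
for n in range(2, 8):
    Q = Q @ P
    if np.all(Q > 0):
        print("all entries of P^%d are positive" % n)   # first true for n = 5
        break
```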
5.22 Markov Chains

A stochastic process which is such that the generation of the probability distribution depends only on the present state is called a Markov process. If the state space is discrete (finite or countably infinite), the process is a discrete state process or chain, and the Markov process is then known as a Markov chain. Further, if the state space is continuous, the process is called a continuous state process.

We explicitly define a Markov chain as follows. Let the outcomes X1, X2, ... of a sequence of trials satisfy the following properties.

(i) Each outcome belongs to the finite set (state space) of outcomes { a1, a2, ..., am }.

(ii) The outcome of any trial depends at most upon the outcome of the immediately preceding trial.

A probability p_ij is associated with every pair of states (a_i, a_j): the probability that a_j occurs immediately after a_i has occurred. Such a stochastic process is called a (finite) Markov chain. The numbers p_ij, which are non-negative real numbers, are called the transition probabilities, and they form a square matrix of order m called the transition probability matrix (t.p.m.), denoted by P.
P = (p_ij) = [ p_11  p_12  ...  p_1m ]
             [ p_21  p_22  ...  p_2m ]
             [  ...                  ]
             [ p_m1  p_m2  ...  p_mm ]

With each state a_i there corresponds the i-th row (p_i1, p_i2, ..., p_im) of transition probabilities. It is evident that the elements of P have the following properties:

(i) 0 ≤ p_ij ≤ 1     (ii) Σ (j = 1 to m) p_ij = 1,   for i = 1, 2, 3, ..., m.

The above two properties satisfy the requirement of a stochastic matrix, and hence we conclude that the transition matrix of a Markov chain is a stochastic matrix.
Illustrative Examples for writing the t.p.m. of a Markov chain

1. A person commutes the distance to his office every day either by train or by bus. Suppose he never goes by train for two consecutive days; but if he goes by bus, the next day he is just as likely to go by bus again as he is to travel by train.

The state space of the system is { train (t), bus (b) }. This process is a Markov chain since the outcome of any day depends only on the happening of the previous day. The t.p.m. is as follows.

         t     b
P = t [  0     1  ]
    b [ 1/2   1/2 ]

The first row of the matrix is related to the fact that the person does not commute two consecutive days by train and is sure to go by bus if he had travelled by train. The second row is related to the fact that if the person had commuted by bus on a particular day, he is equally likely to go by bus again or by train; thus both probabilities are equal to 1/2.

2. Three boys A, B, C are throwing a ball to each other. A always throws the ball to B, and B always throws the ball to C. C is just as likely to throw the ball to B as to A.

State space = { A, B, C } and the t.p.m. P is as follows.

         A    B    C
P = A [  0    1    0 ]
    B [  0    0    1 ]
    C [ 1/2  1/2   0 ]
Higher transition probabilities

The entry p_ij in the transition probability matrix P of the Markov chain is the probability that the system changes from the state a_i to a_j in a single step:

a_i → a_j

The probability that the system changes from the state a_i to the state a_j in exactly n steps is denoted by p_ij^(n):

a_i → a_k → a_l → ... → a_j   (n steps)

The matrix formed by the probabilities p_ij^(n) is called the n-step transition matrix, denoted by P^(n). Clearly P^(1) = [p_ij^(1)] = P, and each P^(n) is obviously a stochastic matrix.

It can be proved that the n-step transition matrix is equal to the n-th power of P; that is, P^(n) = Pⁿ.

Let P be the t.p.m. of the Markov chain and let p = (p_i) = (p_1, p_2, ..., p_m) be the probability distribution at some arbitrary time. Then pP, pP², ..., pPⁿ are respectively the probability distributions of the system after one step, two steps, ..., n steps.

Let p^(0) = [ p_1^(0), p_2^(0), ..., p_m^(0) ] denote the initial probability distribution at the start of the process, and let p^(n) = [ p_1^(n), p_2^(n), ..., p_m^(n) ] denote the n-step probability distribution at the end of n steps. Thus we have

p^(1) = p^(0) P ,  p^(2) = p^(1) P = p^(0) P² ,  ... ,  p^(n) = p^(n−1) P = p^(0) Pⁿ.
Illustrations

1. Let us consider the t.p.m. of the earlier Illustrative Example-1:

         t     b
P = t [  0     1  ]  =  [ p_tt  p_tb ]
    b [ 1/2   1/2 ]     [ p_bt  p_bb ]

We shall find P² and P³.

P² = [  0    1  ] [  0    1  ] = [ 1/2  1/2 ] = [ p_tt^(2)  p_tb^(2) ]
     [ 1/2  1/2 ] [ 1/2  1/2 ]   [ 1/4  3/4 ]   [ p_bt^(2)  p_bb^(2) ]

P³ = P · P² = [  0    1  ] [ 1/2  1/2 ] = [ 1/4  3/4 ] = [ p_tt^(3)  p_tb^(3) ]
              [ 1/2  1/2 ] [ 1/4  3/4 ]   [ 3/8  5/8 ]   [ p_bt^(3)  p_bb^(3) ]

p_tb^(2) = 1/2 means that the probability that the system changes from the state t to b in exactly two steps is 1/2.

p_bt^(3) = 3/8 means that the probability that the system changes from the state b to t in exactly 3 steps is 3/8.

Next let us create an initial probability distribution for the start of the process. Suppose the person rolled a die and decided that he will go by bus if the number appearing on the face is divisible by 3.

∴ p(b) = 2/6 = 1/3 and p(t) = 2/3.

That is, p^(0) = (2/3, 1/3) is the initial probability distribution.

Now p^(2) = p^(0) P² = [2/3, 1/3] [ 1/2  1/2 ] = [5/12, 7/12]
                                  [ 1/4  3/4 ]

p^(3) = p^(0) P³ = [2/3, 1/3] [ 1/4  3/4 ] = [7/24, 17/24]
                              [ 3/8  5/8 ]

This is the probability distribution after 3 days. That is,

probability of travelling by train after 3 days = 7/24,
probability of travelling by bus after 3 days = 17/24.
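The computations of this illustration can be reproduced with a few lines of Python (NumPy assumed); note how p^(n) is obtained either by repeated multiplication or directly as p^(0) Pⁿ:

```python
import numpy as np

P = np.array([[0.0, 1.0], [0.5, 0.5]])   # t.p.m. of Illustrative Example-1
p0 = np.array([2/3, 1/3])                # initial distribution (train, bus)

pn = p0
for n in range(1, 4):
    pn = pn @ P                          # p^(n) = p^(n-1) P
    print(n, pn)                         # n = 3 gives (7/24, 17/24)

# equivalently, in one step:
print(p0 @ np.linalg.matrix_power(P, 3))
```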
2. Let us consider the t.p.m. of the earlier Illustrative Example-2:

         A    B    C
P = A [  0    1    0 ]
    B [  0    0    1 ]
    C [ 1/2  1/2   0 ]

Referring to Problem-56, we have

P⁵ = [ 1/4  1/4  1/2 ]
     [ 1/4  1/2  1/4 ]
     [ 1/8  3/8  1/2 ]

Supposing that C was the person having the ball first, then p^(0) = (0, 0, 1).

Consider p^(5) = p^(0) P⁵ = [0, 0, 1] P⁵ = [1/8, 3/8, 1/2] = [ p_A^(5), p_B^(5), p_C^(5) ]

This implies that after 5 throws the probability that the ball is with A is 1/8, with B is 3/8, and with C is 1/2.
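A short numerical check of this result (NumPy assumed):

```python
import numpy as np

P = np.array([[0, 1, 0], [0, 0, 1], [0.5, 0.5, 0]])
p0 = np.array([0.0, 0.0, 1.0])            # C has the ball first

p5 = p0 @ np.linalg.matrix_power(P, 5)
print(p5)                                 # [0.125 0.375 0.5] = (1/8, 3/8, 1/2)
```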
Stationary distribution of regular Markov chains

A Markov chain is said to be regular if the associated transition probability matrix P is regular.

If P is a regular stochastic matrix of the Markov chain, then the sequence of n-step transition matrices P², P³, ..., Pⁿ approaches the matrix V whose rows are each the unique fixed probability vector v of P.

We have p^(n) = p^(0) Pⁿ, where p^(n) = [ p_1^(n), p_2^(n), ..., p_m^(n) ]. Further, as n → ∞, p_i^(n) → v_i, where i = 1, 2, 3, ..., m.

This v is called the stationary distribution of the Markov chain, and v = (v_1, v_2, ..., v_m) is called the stationary (fixed) probability vector of the Markov chain.

Referring to the Illustrative Example-2, the t.p.m. of the Markov chain is

P = [  0    1    0 ]
    [  0    0    1 ]
    [ 1/2  1/2   0 ]

and by Worked Problem-56 the unique fixed probability vector of P is (1/5, 2/5, 2/5).

Hence we conclude that in the long run (n → ∞) A will have thrown the ball 20% of the time, while B and C will each have thrown the ball 40% of the time.

Note: A Markov chain is said to be irreducible if every state can be reached from every other state in a finite number of steps; that is to say, p_ij^(n) > 0 for some n ≥ 1. This is equivalent to saying that a Markov chain is irreducible if the associated transition probability matrix is regular.
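Instead of solving vP = v by hand, the stationary vector may be extracted as the left eigenvector of P belonging to the eigenvalue 1. A sketch (NumPy assumed):

```python
import numpy as np

P = np.array([[0, 1, 0], [0, 0, 1], [0.5, 0.5, 0]])

# The stationary vector is the left eigenvector of P for eigenvalue 1,
# normalised so that its components sum to 1.
w, V = np.linalg.eig(P.T)
v = np.real(V[:, np.argmin(np.abs(w - 1))])
v = v / v.sum()
print(v)                                  # [0.2 0.4 0.4] = (1/5, 2/5, 2/5)
```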
Absorbing state of a Markov chain

If in a Markov chain the process reaches a certain state after which it continues to remain in that state, such a state is called an absorbing state of the Markov chain. For an absorbing state the transition probabilities p_ij are such that p_ij = 1 for j = i and p_ij = 0 otherwise.

Thus a state a_i of the Markov chain is absorbing if the i-th row of the t.p.m. has 1 on the principal diagonal and zeros elsewhere.

Examples:

P = [  1    0  ]        The state a_1 is absorbing.
    [ 1/2  1/2 ]

P = [  1    0    0  ]   The states a_1 and a_3 are absorbing.
    [ 1/4  1/2  1/4 ]
    [  0    0    1  ]
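Because each row of a t.p.m. sums to 1 with non-negative entries, a state is absorbing exactly when its diagonal entry is 1; this makes absorbing states easy to detect programmatically. A sketch (NumPy assumed; the matrix is illustrative):

```python
import numpy as np

def absorbing_states(P, tol=1e-12):
    """Indices i with p_ii = 1; row sums of 1 then force the rest of row i to 0."""
    return [i for i in range(len(P)) if abs(P[i, i] - 1.0) < tol]

P = np.array([[1.0,  0.0, 0.0],
              [0.25, 0.5, 0.25],
              [0.0,  0.0, 1.0]])
print(absorbing_states(P))               # [0, 2]: states a_1 and a_3 are absorbing
```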
WORKED EXAMPLES

57. The transition matrix P of a Markov chain is given by

P = [ 1/2  1/2 ]
    [ 3/4  1/4 ]

with the initial probability distribution p^(0) = (1/4, 3/4). Define and find the following:

(i) p_21^(2)   (ii) p_12^(2)   (iii) p^(2)   (iv) p_1^(2)
(v) the vector p^(0) Pⁿ approaches   (vi) the matrix Pⁿ approaches.

>> (i) p_21^(2) is the probability of moving from state a_2 to state a_1 in 2 steps. This can be obtained from the 2-step transition matrix P².

P² = [ 1/2  1/2 ] [ 1/2  1/2 ] = [ 5/8   3/8  ] = [ p_11^(2)  p_12^(2) ]
     [ 3/4  1/4 ] [ 3/4  1/4 ]   [ 9/16  7/16 ]   [ p_21^(2)  p_22^(2) ]

∴ p_21^(2) = 9/16

(ii) p_12^(2) is the probability of moving from state a_1 to a_2 in two steps; p_12^(2) = 3/8.

(iii) p^(2) is the probability distribution of the system after 2 steps.

p^(2) = p^(0) P² = [1/4, 3/4] [ 5/8   3/8  ] = [37/64, 27/64]
                              [ 9/16  7/16 ]

That is, p^(2) = [37/64, 27/64] = [ p_1^(2), p_2^(2) ]

(iv) p_1^(2) is the probability that the process is in the state a_1 after 2 steps. Hence p_1^(2) = 37/64.

(v) The vector p^(0) Pⁿ approaches the unique fixed probability vector of P, which we shall now find.

Let v = (x, y), where x + y = 1, and we must have vP = v.

That is, [x, y] [ 1/2  1/2 ] = [x, y]
                [ 3/4  1/4 ]

=> x/2 + 3y/4 = x   and   x/2 + y/4 = y

Using y = 1 − x, the first equation becomes x/2 + 3(1 − x)/4 = x, or 2x + 3(1 − x) = 4x, so x = 3/5.

Since x = 3/5 and y = 2/5, the vector p^(0) Pⁿ approaches the vector (3/5, 2/5).

(vi) Pⁿ approaches the matrix V whose rows are each the fixed probability vector of P:

Pⁿ approaches the matrix [ 3/5  2/5 ]
                         [ 3/5  2/5 ]
58. The t.p.m. of a Markov chain is given by

P = [ 1/2   0   1/2 ]
    [  1    0    0  ]
    [ 1/4  1/2  1/4 ]

and the initial probability distribution is p^(0) = (1/2, 1/2, 0).
Find p_13^(2), p_23^(2), p^(2) and p_3^(2).

>> First let us find the two-step transition matrix P².

P² = [ 1/2   0   1/2 ] [ 1/2   0   1/2 ]   [  3/8   1/4   3/8  ]
     [  1    0    0  ] [  1    0    0  ] = [  1/2    0    1/2  ]
     [ 1/4  1/2  1/4 ] [ 1/4  1/2  1/4 ]   [ 11/16  1/8   3/16 ]

∴ p_13^(2) = 3/8 and p_23^(2) = 1/2

p^(2) = p^(0) P² = [1/2, 1/2, 0] P² = [7/16, 1/8, 7/16]

∴ p^(2) = (7/16, 1/8, 7/16) and p_3^(2) = 7/16.
59. Prove that the Markov chain whose t.p.m. is

P = [  0   2/3  1/3 ]
    [ 1/2   0   1/2 ]
    [ 1/2  1/2   0  ]

is irreducible. Find the corresponding stationary probability vector.

>> We shall show that P is a regular stochastic matrix. For convenience we shall write the given matrix in the form

P = (1/6) [ 0  4  2 ]
          [ 3  0  3 ]
          [ 3  3  0 ]

Consider P² = (1/36) [ 0  4  2 ] [ 0  4  2 ] = (1/36) [ 18   6  12 ]
                     [ 3  0  3 ] [ 3  0  3 ]          [  9  21   6 ]
                     [ 3  3  0 ] [ 3  3  0 ]          [  9  12  15 ]

Since all the entries in P² are positive, we conclude that the t.p.m. P is regular. Hence the Markov chain having t.p.m. P is irreducible.

Next we shall find the fixed probability vector of P. If v = (x, y, z), we shall find v such that vP = v, where x + y + z = 1.

That is, [x, y, z] (1/6) [ 0  4  2 ] = [x, y, z]
                         [ 3  0  3 ]
                         [ 3  3  0 ]

=> (1/6) [ 3y + 3z ,  4x + 3z ,  2x + 3y ] = [x, y, z]

=> 3y + 3z = 6x ;  4x + 3z = 6y ;  2x + 3y = 6z

Solving these by using x + y + z = 1: the first equation gives y + z = 2x, and since y + z = 1 − x, we get x = 1/3; the second equation then gives y = 10/27 and hence z = 8/27.

Thus v = (1/3, 10/27, 8/27), i.e., (9/27, 10/27, 8/27), is the required stationary probability vector.
60. A habitual gambler is a member of two clubs A and B. He visits either of the clubs every day for playing cards. He never visits club A on two consecutive days. But if he visits club B on a particular day, then the next day he is as likely to visit club B as club A. Find the transition matrix of this Markov chain. Also,

(a) show that the matrix is a regular stochastic matrix and find the unique fixed probability vector;

(b) if the person had visited club B on Monday, find the probability that he visits club A on Thursday.

>> The transition matrix P of the Markov chain is formulated as follows.

         A     B
P = A [  0     1  ]  =  [ p_AA  p_AB ]
    B [ 1/2   1/2 ]     [ p_BA  p_BB ]

The first row corresponds to the fact that he never goes to club A on two consecutive days, which implies that he is sure to visit club B. The second row corresponds to the fact that if he goes to B on a particular day, he visits B or A on the following day with equal likelihood: the probability of going to A is 1/2 and the probability of going to B is also 1/2.
(a) Now consider P² = [  0    1  ] [  0    1  ] = [ 1/2  1/2 ]
                      [ 1/2  1/2 ] [ 1/2  1/2 ]   [ 1/4  3/4 ]

Since all the entries of P² are positive, P is a regular stochastic matrix.

We shall find the unique fixed probability vector: that is, to find v = (x, y) such that vP = v, where x + y = 1.

[x, y] [  0    1  ] = [x, y]
       [ 1/2  1/2 ]

=> y/2 = x   and   x + y/2 = y

Using y = 1 − x in y/2 = x we get 1 − x = 2x, so x = 1/3 and y = 2/3.

Thus v = (1/3, 2/3).

(b) Taking Monday as day 1, Thursday is 3 days after Monday. Given that the person visited club B on Monday, the required probability that he visits club A after 3 days is the entry p_BA^(3) of P³.

P³ = P² · P = [ 1/2  1/2 ] [  0    1  ] = [ 1/4  3/4 ]
              [ 1/4  3/4 ] [ 1/2  1/2 ]   [ 3/8  5/8 ]

∴ p_BA^(3) = 3/8. Thus the required probability is 3/8.
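Part (b) can be confirmed in two lines (NumPy assumed):

```python
import numpy as np

P = np.array([[0.0, 1.0], [0.5, 0.5]])   # rows/columns ordered (A, B)

P3 = np.linalg.matrix_power(P, 3)
print(P3[1, 0])                           # 0.375 = 3/8, the entry p_BA^(3)
```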
61. A student's study habits are as follows. If he studies one night, he is 70% sure not to study the next night. On the other hand, if he does not study one night, he is 60% sure not to study the next night. In the long run how often does he study?

>> The state space of the system is { A, B }, where A: studying, B: not studying. The associated transition matrix P is as follows.

         A     B
P = A [ 0.3   0.7 ]
    B [ 0.4   0.6 ]

In order to find the behaviour in the long run we have to find the unique fixed probability vector v of P: that is, to find v = (x, y) such that vP = v, where x + y = 1.

[x, y] [ 0.3  0.7 ] = [x, y]
       [ 0.4  0.6 ]

i.e., [ 0.3x + 0.4y ,  0.7x + 0.6y ] = [x, y]

=> 0.3x + 0.4y = x   and   0.7x + 0.6y = y

Using y = 1 − x in the first of the equations we have 0.3x + 0.4(1 − x) = x, or 1.1x = 0.4, so x = 4/11.

Since x = 4/11 and y = 7/11, v = (4/11, 7/11) = (p_A, p_B).

Thus we conclude that in the long run the student will study 4/11 of the time, i.e., about 36.36% of the time.
62. A man's smoking habits are as follows. If he smokes filter cigarettes one week, he switches to non-filter cigarettes the next week with probability 0.2. On the other hand, if he smokes non-filter cigarettes one week, there is a probability of 0.7 that he will smoke non-filter cigarettes the next week as well. In the long run how often does he smoke filter cigarettes?

>> The state space of the system is { A, B }, where A: smoking filter cigarettes, B: smoking non-filter cigarettes. The associated transition matrix is as follows.

         A     B
P = A [ 0.8   0.2 ]  =  (1/10) [ 8  2 ]
    B [ 0.3   0.7 ]            [ 3  7 ]

We have to find the unique fixed probability vector v = (x, y) such that vP = v, where x + y = 1.

[x, y] (1/10) [ 8  2 ] = [x, y]
              [ 3  7 ]

i.e., [ 8x + 3y ,  2x + 7y ] = [ 10x, 10y ]

=> 8x + 3y = 10x ,  2x + 7y = 10y

Using y = 1 − x in the first equation we get 8x + 3(1 − x) = 10x, so x = 3/5 and y = 2/5.

Hence v = (x, y) = (3/5, 2/5) = (p_A, p_B).

In the long run, he will smoke filter cigarettes 3/5, or 60%, of the time.
63. Each year a man trades his car for a new car among 3 brands of the popular company Maruti Udyog Limited. If he has a 'Standard' he trades it for a 'Zen'. If he has a 'Zen' he trades it for an 'Esteem'. If he has an 'Esteem' he is just as likely to trade it for a new 'Esteem' as for a 'Zen' or a 'Standard'. In 1996 he bought his first car, which was an Esteem.

(i) Find the probability that he has
(a) an Esteem in 1998   (b) a Standard in 1998
(c) a Zen in 1999       (d) an Esteem in 1999.

(ii) In the long run, how often will he have an Esteem?

>> The state space of the system is { A, B, C }, where A: Standard, B: Zen, C: Esteem. The associated transition matrix is as follows.

         A    B    C
P = A [  0    1    0  ]
    B [  0    0    1  ]
    C [ 1/3  1/3  1/3 ]

(i) With 1996 as the first year, 1998 is to be regarded as 2 years after, and 1999 as 3 years after. We need to compute P² and P³.

P² = [  0    1    0  ] [  0    1    0  ]   [  0    0    1  ]
     [  0    0    1  ] [  0    0    1  ] = [ 1/3  1/3  1/3 ]
     [ 1/3  1/3  1/3 ] [ 1/3  1/3  1/3 ]   [ 1/9  4/9  4/9 ]

P³ = P² · P = [ 1/3   1/3    1/3  ]
              [ 1/9   4/9    4/9  ]
              [ 4/27  7/27  16/27 ]

Since he started with an Esteem (state C), the required probabilities are entries of the third row:

(a) 1998 Esteem: p_CC^(2) = 4/9      } with reference to P²
(b) 1998 Standard: p_CA^(2) = 1/9    }

(c) 1999 Zen: p_CB^(3) = 7/27        } with reference to P³
(d) 1999 Esteem: p_CC^(3) = 16/27    }

(ii) We have to find the unique fixed probability vector v = (x, y, z) such that vP = v, where x + y + z = 1.

[x, y, z] [  0    1    0  ] = [x, y, z]
          [  0    0    1  ]
          [ 1/3  1/3  1/3 ]

i.e., [ z/3 ,  x + z/3 ,  y + z/3 ] = [x, y, z]

i.e., [ z ,  3x + z ,  3y + z ] = [ 3x, 3y, 3z ]

=> z = 3x ,  3x + z = 3y ,  3y + z = 3z

Consider 3x + z = 3y. Using z = 3x and y = 1 − x − z we get 6x = 3(1 − x − z) = 3 − 3x − 9x, or 18x = 3, so x = 1/6.

Hence we obtain y = 1/3, z = 1/2.

v = [x, y, z] = [1/6, 1/3, 1/2] = [ p_A, p_B, p_C ]

In the long run, the probability of him having an Esteem is p_C = 1/2. Thus in the long run he will have an Esteem 50% of the time.
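A sketch reproducing both parts of this example (NumPy assumed):

```python
import numpy as np

P = np.array([[0, 1, 0],
              [0, 0, 1],
              [1/3, 1/3, 1/3]])           # states (Standard, Zen, Esteem)

P2 = np.linalg.matrix_power(P, 2)
P3 = np.linalg.matrix_power(P, 3)
print(P2[2])        # [1/9 4/9 4/9]: 1998 probabilities starting from Esteem
print(P3[2])        # [4/27 7/27 16/27]: 1999 probabilities
print(np.linalg.matrix_power(P, 60)[2])   # approaches (1/6, 1/3, 1/2)
```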
64. Three boys A, B, C are throwing a ball to each other. A always throws the ball to B, and B always throws the ball to C. C is just as likely to throw the ball to B as to A. If C was the first person to throw the ball, find the probabilities that after three throws
(i) A has the ball   (ii) B has the ball   (iii) C has the ball.

>> State space = { A, B, C } and the associated t.p.m. is as follows.

         A    B    C
P = A [  0    1    0 ]
    B [  0    0    1 ]
    C [ 1/2  1/2   0 ]

Initially C has the ball, so the associated initial probability vector is p^(0) = (0, 0, 1).

Since the probabilities are desired after three throws, we have to find p^(3) = p^(0) P³.

Referring to Problem-56,

P³ = [ 1/2  1/2   0  ]
     [  0   1/2  1/2 ]
     [ 1/4  1/4  1/2 ]

p^(3) = p^(0) P³ = [0, 0, 1] P³ = [1/4, 1/4, 1/2] = [ p_A^(3), p_B^(3), p_C^(3) ]

Thus after three throws the probability that the ball is with A is 1/4, with B is 1/4, and with C is 1/2.
65. Two boys B1, B2 and two girls G1, G2 are throwing a ball from one to the other. Each boy throws the ball to the other boy with probability 1/2 and to each girl with probability 1/4. On the other hand, each girl throws the ball to each boy with probability 1/2 and never to the other girl. In the long run how often does each receive the ball?

>> State space = { B1, B2, G1, G2 } and the associated t.p.m. P is as follows.

          B1    B2    G1    G2
P = B1 [  0    1/2   1/4   1/4 ]
    B2 [ 1/2    0    1/4   1/4 ]
    G1 [ 1/2   1/2    0     0  ]
    G2 [ 1/2   1/2    0     0  ]

We need to find the fixed probability vector v = (a, b, c, d) such that vP = v.

Referring to Problem-53, we have v = (1/3, 1/3, 1/6, 1/6).

Thus we can say that in the long run each boy receives the ball 1/3 of the time and each girl 1/6 of the time.
66. A gambler's luck follows a pattern. If he wins a game, the probability of winning the next game is 0.6. However, if he loses a game, the probability of losing the next game is 0.7. There is an even chance of the gambler winning the first game.

(a) What is the probability of him winning the second game?
(b) What is the probability of him winning the third game?
(c) In the long run, how often will he win?

>> State space = { Win (W), Lose (L) } and the associated t.p.m. is as follows.

         W     L
P = W [ 0.6   0.4 ]  =  (1/10) [ 6  4 ]
    L [ 0.3   0.7 ]            [ 3  7 ]

Since the probability of winning the first game is 1/2, the initial probability vector is p^(0) = (1/2, 1/2).

(a) Now p^(1) = p^(0) P = [1/2, 1/2] (1/10) [ 6  4 ] = (1/20) [9, 11]
                                            [ 3  7 ]

Hence p^(1) = [9/20, 11/20] = [ p_W^(1), p_L^(1) ]

Thus the probability of him winning the second game is 9/20.

(b) p^(2) = p^(1) P = [9/20, 11/20] (1/10) [ 6  4 ] = (1/200) [54 + 33, 36 + 77] = [87/200, 113/200] = [ p_W^(2), p_L^(2) ]
                                           [ 3  7 ]

Thus the probability of him winning the third game is 87/200.

(c) We shall find the fixed probability vector v = (x, y) such that vP = v, where x + y = 1.

That is, [x, y] (1/10) [ 6  4 ] = [x, y]
                       [ 3  7 ]

=> 6x + 3y = 10x ,  4x + 7y = 10y

or 3y = 4x, and by using y = 1 − x we get 3(1 − x) = 4x, so x = 3/7 and y = 4/7.

Hence v = [3/7, 4/7] = [ p_W, p_L ].

Thus in the long run he wins 3/7 of the time.
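As a final numerical check of this example (NumPy assumed):

```python
import numpy as np

P = np.array([[0.6, 0.4], [0.3, 0.7]])    # states (Win, Lose)
p0 = np.array([0.5, 0.5])

print(p0 @ P)        # [0.45 0.55]: wins the second game with probability 9/20
print(p0 @ P @ P)    # [0.435 0.565]: wins the third game with probability 87/200
print(p0 @ np.linalg.matrix_power(P, 100))  # approaches (3/7, 4/7) ≈ (0.4286, 0.5714)
```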
EXERCISES

1. Identify the probability vectors from the following.
(a) (2/5, 3/5)   (b) (0, −1/3, 4/3)   (c) (1, 0, 1/6, 1/2, 1/3)
(d) (1/3, 0, 1/6, 1/2)   (e) (0.1, 0.2, 0.3, 0.4)

2. Find the associated probability vector for each of the following tuples.
(a) (1, 3, 5)   (b) (4, 0, 1, 2)   (c) (1/2, 2/3, 0, 2, 5/6)

3. If A = [a_ij] is a stochastic matrix of order n × n and v = (v1, v2, ..., vn) is a probability vector, show that vA is a probability vector.

4. If A and B are two stochastic matrices of order 3 × 3, prove that AB is also a stochastic matrix.

5. Show that (cf + ce + de, af + bf + ae, ad + bd + bc) is a fixed point of the stochastic matrix

[ 1−a−b     a       b   ]
[   c     1−c−d     d   ]
[   e       f     1−e−f ]

6. Show that the following matrix P is a regular stochastic matrix and also find its unique fixed probability vector.

P = [ 0.5  0.25  0.25 ]
    [ 0.5   0    0.5  ]
    [  0    1     0   ]

7. Given the t.p.m.

P = [  1    0  ]
    [ 1/2  1/2 ]

with initial probability distribution p^(0) = (1/3, 2/3), find the following:
(a) p_21^(3)   (b) p^(3)   (c) p_2^(3)

8. A software engineer goes to his office every day by motorbike or by car. He never goes by bike on two consecutive days, but if he goes by car on a day then he is equally likely to go by car or by bike the next day. Find the t.p.m. of the Markov chain. If the car is used on the first day of the week, find the probability that after 4 days (a) the bike is used, (b) the car is used.

9. A salesman's territory consists of 3 cities A, B, C. He never sells in the same city for 2 consecutive days. If he sells in city A, then the next day he sells in city B. However, if he sells in either B or C, then the next day he is twice as likely to sell in city A as in the other city. In the long run how often does he sell in each of the cities?

10. Show that the Markov chain with t.p.m. given by

P = (1/10) [ 6  2  2 ]
           [ 1  8  1 ]
           [ 6  0  4 ]

is irreducible. Find the corresponding stationary probability vector.
ANSWERS

1. (a), (d), (e) are probability vectors.

2. (a) (1/9, 3/9, 5/9)   (b) (4/7, 0, 1/7, 2/7)   (c) (1/8, 1/6, 0, 1/2, 5/24)

6. v = (4/11, 4/11, 3/11)

7. (a) 7/8   (b) (11/12, 1/12)   (c) 1/12

8. With states B: bike, C: car,

           B     C
   P = B [ 0     1  ]     (a) 5/16   (b) 11/16
       C [ 1/2  1/2 ]

9. v = (0.4, 0.45, 0.15); in the long run he sells 40% of the time in city A, 45% of the time in city B, and 15% of the time in city C.

10. v = (4/10, 4/10, 2/10)