
3.12 RANDOM PROCESSES


Let us consider an experiment of measuring the temperature of a room. Let there be a collection of thermometers. Each thermometer reading is a random variable which can take on any value from the sample space S. Also, at different times the readings of the thermometers may be different. Thus the room temperature is a function of both the sample space and time. In this example, we have extended the concept of a random variable by taking the time dimension into consideration. Here we assign a time function x(t, s) to every outcome s of S. There will be a family of all such functions. This family of functions X(t, S) is known as a random process or stochastic process. A random process X(t, S) represents an ensemble, or a set, or a family of time functions, where t and s are variables. In place of x(t, S) and X(t, S), the short notations x(t) and X(t) are often used. Figure 3.12.1 shows a few members of the ensemble: x_1(t) is the reading of the first thermometer, x_2(t) is the reading of the second thermometer, and so on. Each member is also known as a sample function, an ensemble member, or a realization of the process. A random process represents a single time function when t is variable and s is fixed; x_1(t) and x_2(t) are examples of single time functions.

[Figure: several sample functions of the ensemble, with their values (A_1, A_2, ...) marked at time t_1 and (B_1, B_2, ...) at time t_2]

Fig. 3.12.1  A Random Process

To determine the statistics of the room temperature, say the mean value, we may follow one of the following two procedures:

1. We may fix t to some value, say t_1. The result is a random variable X(t_1, S):

        X(t_1, S) = X(t_1) = [A_1, A_2, ..., A_n]

   The mean value of X(t_1), E[X(t_1)], can now be calculated. It is known as the ensemble average. It may be noted that the ensemble average is a function of time; there is an ensemble average corresponding to each time. Thus at time t_2 we have

        X(t_2, S) = X(t_2) = [B_1, B_2, ..., B_n]

   The ensemble average corresponding to time t_2, E[X(t_2)], can also be found out. Similarly, the ensemble average corresponding to any time can be found out.

2. We may consider a sample function, say x_1(t), over the entire time scale. Then the mean value of x_1(t) is defined as

        <x_1(t)> = lim_{T→∞} (1/2T) ∫_{-T}^{T} x_1(t) dt

Similarly, we can find the mean values of the other sample functions. The expected value of all these mean values is known as the time average and is given as

        <X(t)> = E[<x_i(t)>]

A random process for which the mean values of all sample functions are the same is known as a regular random process. In this case,

        <x_1(t)> = <x_2(t)> = ... = <x_n(t)> = <X(t)>

For some processes, the ensemble average is independent of time, i.e.

        E[X(t_1)] = E[X(t_2)] = ... = E[X(t)]

Such processes are known as stationary processes in the restricted sense. (Here stationarity is restricted to the mean; it may also be restricted to the second moment, third moment, etc.) If all the statistical properties of a random process are independent of time, then it is known as a stationary process in the strict sense. When we simply say stationary process, it is meant that the process is stationary in the strict sense.

When an ensemble average is equal to the corresponding time average, then the process is known as an ergodic process in the restricted sense. When all statistical ensemble properties are equal to the corresponding statistical time properties, then the process is known as an ergodic process in the strict sense. When we simply say ergodic process, it is meant that the process is ergodic in the strict sense.

It may be noted that the class of ergodic processes is a subset of the class of stationary processes, i.e. if a process is ergodic, then it is also stationary, but the converse is not necessarily true. A process which is not stationary is known as a non-stationary process.

It is obvious that the random process of Fig. 3.12.1 (room temperature) is non-stationary, because the ensemble averages at different times are not always the same.
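To make the distinction between the ensemble average and the time average concrete, the short numerical sketch below (a Python/NumPy illustration added here for clarity; it is not part of the original example) simulates an ensemble of sample functions x_i(t) = cos(2*pi*f0*t + theta_i) with random phases theta_i, then compares an ensemble average taken across the sample functions at one instant with the time average of a single sample function. The frequency f0, the record length and the ensemble size are arbitrary assumptions; for this (ergodic) process both averages come out close to zero.

import numpy as np

rng = np.random.default_rng(0)
f0 = 5.0                                      # assumed tone frequency (arbitrary choice)
t = np.linspace(-10.0, 10.0, 4001)            # finite record approximating the -T..T limit
theta = rng.uniform(0.0, 2.0 * np.pi, 500)    # one random phase per sample function

# Each row of X is one sample function x_i(t) = cos(2*pi*f0*t + theta_i)
X = np.cos(2.0 * np.pi * f0 * t[None, :] + theta[:, None])

ensemble_avg_t1 = X[:, 1000].mean()   # E[X(t1)]: average across the ensemble at a fixed time
time_avg_x1     = X[0, :].mean()      # <x_1(t)>: average of one sample function over time

print(ensemble_avg_t1, time_avg_x1)   # both are close to zero for this ergodic example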
3.13 MARKOV PROCESSES
Many times a given random variable is statistically dependent upon some finite number of previously occurring random variables. Thus, if

        f(x_n / x_{n-1}, x_{n-2}, ...) = f(x_n / x_{n-1}, x_{n-2}, ..., x_{n-k})        (3.13.1)

then we say that {x_n} is a kth-order Markov process. Here the occurrence of x_n is conditioned on the k previously occurring random variables x_{n-1}, x_{n-2}, ..., x_{n-k}. By putting k = 1, 2, 3, ..., we get the first-order Markov process, the second-order Markov process, and so on, respectively. For example, the first-order Markov process is given by

        f(x_n / x_{n-1}, x_{n-2}, ...) = f(x_n / x_{n-1})        (3.13.2)

and the second-order Markov process is given by

        f(x_n / x_{n-1}, x_{n-2}, ...) = f(x_n / x_{n-1}, x_{n-2})        (3.13.3)
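A standard consequence of Eq. (3.13.2), added here for clarity (it is not stated explicitly in the text), is that the joint density of a first-order Markov process factors into the density of the first sample and a chain of one-step conditional densities:

        f(x_1, x_2, ..., x_n) = f(x_1) f(x_2 / x_1) f(x_3 / x_2) ... f(x_n / x_{n-1})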
A particular class of Markov processes is the discrete-time, discrete-valued Markov processes, where the random variables are discrete and can assume only a finite set of possible values. Such Markov processes are also known as Markov chains. The order of the Markov chain depends upon the number of previously occurring random variables which condition the random variable x_n. Let us consider the first-order Markov chain, where the outcome at any time t_n depends only on the outcome at the immediately preceding time t_{n-1}. Thus, for each pair (x_{n-1}, x_n), the transition probability can be defined as the conditional probability

        P_ij = P(x_n = V_j / x_{n-1} = V_i)        (3.13.4)
where V_j and V_i are the outcomes of the random variables x_n and x_{n-1} respectively. The transition matrix T is defined as the q x q square matrix

        T = [P_ij] = | P_11  P_12  ...  P_1q |
                     | P_21  P_22  ...  P_2q |
                     | ...   ...   ...  ...  |
                     | P_q1  P_q2  ...  P_qq |        (3.13.5)

where q is the number of possible outcomes of x_n. The conditions to be satisfied for the matrix given in Eq. (3.13.5) are

        (i)   P_ij ≥ 0,   i, j = 1, 2, ..., q        (3.13.6)

        (ii)  Σ_{j=1}^{q} P_ij = 1,   i = 1, 2, ..., q        (3.13.7)

Matrices whose elements satisfy Eqs. (3.13.6) and (3.13.7) are also known as Stochastic Matrices or Markov Matrices. In addition, if the following condition is also satisfied,

        (iii) Σ_{i=1}^{q} P_ij = 1,   j = 1, 2, ..., q        (3.13.8)

then the matrix is known as a doubly stochastic matrix.
Interpretations of the Above Conditions

The stochastic matrix must be a square matrix, and all its elements must lie between 0 and 1. The sum of each row on the RHS of Eq. (3.13.5) must be unity. In addition, if the sum of each column on the RHS of Eq. (3.13.5) is also unity, then it is a doubly stochastic matrix.
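As a quick illustration of conditions (i) to (iii), the following Python/NumPy sketch (added here for illustration; the matrix values are assumptions, not taken from the text) tests whether a given transition matrix is a stochastic matrix and whether it is doubly stochastic. The example matrix satisfies the row condition but not the column condition.

import numpy as np

# A hypothetical 3-state transition matrix (values chosen only for illustration)
T = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])

square        = T.shape[0] == T.shape[1]
nonnegative   = bool(np.all(T >= 0))                  # Eq. (3.13.6): P_ij >= 0
rows_unity    = bool(np.allclose(T.sum(axis=1), 1))   # Eq. (3.13.7): every row sums to 1
columns_unity = bool(np.allclose(T.sum(axis=0), 1))   # Eq. (3.13.8): every column sums to 1

print("stochastic (Markov) matrix:", square and nonnegative and rows_unity)                     # True
print("doubly stochastic matrix:  ", square and nonnegative and rows_unity and columns_unity)   # False here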
Let there be a column vector known as the state distribution vector p(n), where

        p(n) = [P_1(n)  P_2(n)  ...  P_q(n)]^T        (3.13.9)

which gives the probabilities of all possible outcomes at time t_n. The initial state distribution vector is given by

        p(0) = [P_1(0)  P_2(0)  ...  P_q(0)]^T        (3.13.10)

A Markov chain is completely defined by the transition matrix and the state distribution vectors.
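As a small worked illustration of Eqs. (3.13.9) and (3.13.10) (again a Python sketch with assumed numbers, not an example from the text), the state distribution vector of a hypothetical two-state chain can be propagated one transition at a time. The sketch uses a row-vector convention, so p(n) is obtained by post-multiplying p(n-1) by the transition matrix T; with the column-vector convention of Eq. (3.13.9) one would use the transpose of T instead.

import numpy as np

# Hypothetical 2-state chain (say state 1 = "sunny", state 2 = "rainy");
# the probabilities are assumptions made up for this illustration.
T = np.array([[0.9, 0.1],
              [0.5, 0.5]])
p = np.array([1.0, 0.0])      # initial state distribution vector p(0)

for n in range(1, 6):
    p = p @ T                 # one step: p(n) = p(n-1) T  (row-vector convention)
    print(f"p({n}) = {p}")    # tends toward the steady-state distribution [5/6, 1/6]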

1. Which of the following is incorrect?
   (a) A - B = A B'        (b) (A B)' = A' B'        (c) A A' = 0        (d) A A = A
2. Pick the odd man out,
(b) stochastic function
(a) stochastic variable
(c) random variable (d) random experiment
3. Pick the odd man out.
   (a) binomial distribution        (b) normal distribution        (c) uniform distribution        (d) Rayleigh distribution
4. Which of the following is incorrect?
   (a) P(S) = 1        (b) P(A') = P(A) - 1        (c) 0 ≤ P(A) ≤ 1        (d) if A and B are mutually exclusive, then P(A + B) = P(A) + P(B)
5. The total area under the probability distribution curve is
   (a) 1        (b) 0        (c) depends on the nature of the distribution        (d) none of the above

6. The spectral density of white noise
   (a) varies with frequency        (b) varies with bandwidth        (c) varies with amplitude of the signal        (d) is constant
7. The theoretical power of white noise is
   (a) zero        (b) finite        (c) infinite        (d) depends on the frequency of the signal

8. The stationary process has
   (a) ensemble average equal to time average        (b) all the statistical properties dependent on time        (c) all the statistical properties independent of time        (d) zero variance
9. Events A and B are statistically independent if
   (a) A and B occur simultaneously        (b) A and B occur at different times        (c) occurrence of A includes occurrence of B        (d) none of the above

10. Pick the odd man out.
   (a) expectation        (b) variance        (c) standard deviation        (d) Tchebycheff's inequality

11. The probability density function of a random variable X is a e^(-bx/2) u(x). Then
   (a) a and b can be arbitrary        (b) a = b/2        (c) a = b        (d) a = 2b

12. The density function of a random variable X is given by: f(x) = 1/(b - a) for a ≤ x ≤ b, and 0 otherwise. The variable X is said to have
   (a) Poisson distribution        (b) Gaussian distribution        (c) Rayleigh distribution        (d) uniform distribution
13. The probability density function of a random variable is given by p(x) = k e^(-x²/2), -∞ < x < ∞. The value of k should be
   (a) 1/√(2π)        (b) √(2π)        (c) 2π        (d) π/2

State True or False
14. The classical approach for probability theory does not explain the situation when the number of outcomes of an experiment is small.
   (a) True        (b) False

15. Mutually exclusive events are also statistically independent.
   (a) True        (b) False

16. The probability that a continuous random variable takes on a particular value is zero.
   (a) True        (b) False

17. The unit of variance is the same as that of the random variable.
   (a) True        (b) False

18. If a random process is ergodic then it is also stationary.
   (a) True        (b) False

ANSWERS

1. (b)    2. (d)    3. (a)    4. (b)    5. (a)    6. (d)    7. (c)    8. (c)    9. (d)
10. (d)   11. (b)   12. (d)   13. (a)   14. (b)   15. (b)   16. (a)   17. (b)   18. (a)

3.1 Prove both forms of De Morgan's theorem by using Venn diagrams.
3.2 A box contains 2 white, 3 black and 4 red balls. Three balls are drawn in succession. What is the probability that they are of (a) different colours? (b) the same colour?
3.3 Three cards are drawn from an ordinary deck of 52 playing cards. What is the probability that (a) all three cards are even numbered and are of black colour? (b) one of them is an ace, and the other two are of the same colour as the ace?
3.4 Two fair dice are thrown. X denotes the total of the numbers on the two dice. Find the probability function and distribution function of X.
3.5 A function is given by

        f(x) = C(x + 1),  1 < x < 3
             = 0,         otherwise

    (a) Find the value of C for f(x) to be a density function.
    (b) Find the probability that X lies between 1 and 2.
3.6 The joint probability function of two discrete random variables X and Y is given by

        f(x, y) = Cx²y,  x = 1, 2;  y = 0, 1, 2
                = 0,     otherwise

    Find
    (a) the value of C
    (b) P(X > 1, Y ≤ 1) and
    (c) the marginal probability functions of X and Y.
3.7 The joint probability density function of two continuous random variables is given by

        f(x, y) = C(2x + 3y),  1 < x < 3,  0 < y < 2
                = 0,           otherwise

    Find
    (a) the value of C
    (b) P(X < 2, Y > 1) and
    (c) P[(X + Y) > 3]
3.8 For the random variables of Prob. 3.6, find
    (a) E(X²), E(Y)
    (b) E(3X + 2Y²)
3.9 For the random variables of Prob. 3.7, find
    (a) E(X²), E(Y)
    (b) E(3X + 2Y²)
3.10 Prove that E(cX) = cE(X) and E(X + Y) = E(X) + E(Y).
3.11 A fair coin is tossed five times. What is the probability of getting two heads? Also find the mean and standard deviation for the number of heads.
3.12 Only 15% of the total students of a college take part in sports. Special marks are awarded to these students. What is the probability that more than two students will get these special marks out of 100 students chosen at random?

3.13 A fair coin is tossed fifteen times. Find the probability of a head appearing more than six times and less than or equal to ten times.
3.14 Using MATLAB, solve Example 3.8.1
