Probability Theory and Random
Processes (ECE2005)
Dr. Jeetashee Aparajeeta
Assistant Professor (Sr)
SENSE
Random Processes – Temporal
Characteristics
Random Process
• In the real world, the random variable X can be defined as a function of time (time signals), space (images), or both (videos).
• It can represent either desired waveforms (communication signals) or undesired waveforms (noise).
• The desired signal is always accompanied by an undesired random waveform (noise).
• The noise corrupts the message and limits the performance of the system.
• So, to determine the performance of a system whose efficacy is limited by random waveforms, we need to describe and deal with such waveforms.
• The concept of a random process is based on enlarging the random variable concept to include time or space or both.
• So now the random variable is a function of the possible outcome s of an experiment and time t.
• It is denoted by a time function x(t, s) for every outcome s.
• The family of all such functions is called a random process.
• We can also denote a specific waveform of a random process (X(t)) as x(t).
• A random process X(t, s) or X(t) represents a family or ensemble of time
functions when t and s are variables.
• A few members of an ensemble are illustrated below.
• The graph of the function X(t, s), versus t for s fixed, is called a realization,
ensemble member, or sample function of the random process.
• For each fixed 𝑡𝑘 from the indexed set I, X(𝑡𝑘 , 𝑠) is a random variable.
[Figure: a few sample functions (ensemble members) of a random process plotted versus t, with the random variables X(t_k, s) indicated at fixed times t_k.]
Formal Definition
𝑋1 is used to denote the random variable associated with the process X(t) at time 𝑡1.
𝑋1 corresponds to a vertical “slice” through the ensemble at time 𝑡1 .
The statistical properties of 𝑋1 =X(𝑡1 ) describe the statistical
properties of the random processes at time 𝑡1 .
The expected value of 𝑋1 is called the ensemble average as well as the
expected or mean value of the random process (at time 𝑡1 ).
Since 𝑡1 may have various values, the mean value of a process may be
a function of time instead of a constant.
A random process represents a single number when both t and s are fixed.
Classification of Processes
• The random processes can be classified according to the characteristics of t
and the random variable X=X(t) at time t.
1. Continuous Random Process
2. Discrete Random Process
3. Continuous Random Sequence
4. Discrete Random Sequence
Continuous Random Process
X and t both are continuous
Eg: Thermal noise generated by
any realizable network can be
modeled as a sample function of
a continuous Random Process.
Discrete Random Process
X is discrete and t is continuous
These random processes can be
derived by heavily limiting the
sample functions of continuous
random process.
The sample functions have only two
discrete values: Positive level and
negative level.
Continuous Random Sequence
X is continuous and t is discrete
This can be generated by periodically
sampling the ensemble members of
continuous random process.
It is also called a Discrete-Time (DT) random process, as the continuous random sequence is defined only at discrete (sample) times.
Discrete Random Sequence
Both X and t are discrete.
It can be derived by rounding off samples of a DT random process or continuous random sequence.
Eg. Quantization in DSP systems
Deterministic and Nondeterministic
Process
If the future values of any sample function cannot be predicted
exactly from the observed past values, then the process is called
nondeterministic process.
Eg. Most continuous random processes are nondeterministic.
A process is called deterministic if future values of any sample
function can be predicted from past values.
Eg. X(t) = A cos(𝜔0 t + Θ)
Here A, Θ, or 𝜔0 can be random variables.
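As an illustration (a minimal Python/NumPy sketch; the amplitude A, frequency 𝜔0, and Θ uniform on (0, 2π) are assumed values, not from the slides): once the outcome θ of Θ is drawn, the entire sample function is fixed for all time, which is what makes this process deterministic.

import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0, 501)           # time axis (arbitrary span)
omega0 = 2 * np.pi * 2.0                 # assumed angular frequency (2 Hz)
A = 1.0                                  # assumed constant amplitude

for s in range(3):                       # three outcomes s of the experiment
    theta = rng.uniform(0.0, 2 * np.pi)  # one draw of Theta ~ U(0, 2*pi)
    x = A * np.cos(omega0 * t + theta)   # the whole sample function x(t, s)
    print(f"outcome {s}: theta = {theta:.3f}, x(0) = {x[0]:+.3f}")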
Stationary and Independence
A random process becomes a random variable when time is fixed at some particular value.
This random variable will possess statistical properties, such as mean,
moments, variance etc., that are related to its density function.
If two random variables are obtained from the process for two time instants,
then they will have statistical properties related to their joint density function.
N random variables will possess statistical properties related to their N-
Dimensional joint density function.
A random process is said to be stationary if all its statistical
properties do not change with time.
All other processes are called non-stationary.
Distribution and Density Functions
A random variable is fully characterized by its pdf or CDF. Likewise, we can fully define a random process with the help of its N-dimensional joint density function.
For a time instant 𝑡1 , the distribution function associated with the
random variable 𝑋1 = 𝑋(𝑡1 ) is denoted as 𝐹𝑋 (𝑥1 ; 𝑡1 ).
F_X(x1; t1) = P{X(t1) ≤ x1}, for any real number x1.
For two random variables 𝑋1 = 𝑋(𝑡1 ) and 𝑋2 = 𝑋(𝑡2 ), the second
order joint distribution function is the two dimensional extension of
the above equation
𝐹𝑋 𝑥1 , 𝑥2 ; 𝑡1 , 𝑡2 = 𝑃{𝑋(𝑡1 ) ≤ 𝑥1 , 𝑋(𝑡2 ) ≤ 𝑥2 }
For N random variables Xi = X(ti), i = 1, 2, …, N, the Nth-order joint distribution function is
F_X(x1, …, xN; t1, …, tN) = P{X(t1) ≤ x1, …, X(tN) ≤ xN}
Joint density functions of interest are found from appropriate
derivatives of the above three relationships.
$$f_X(x_1;t_1) = \frac{dF_X(x_1;t_1)}{dx_1}$$
$$f_X(x_1,x_2;t_1,t_2) = \frac{\partial^2 F_X(x_1,x_2;t_1,t_2)}{\partial x_1\,\partial x_2}$$
$$f_X(x_1,\dots,x_N;t_1,\dots,t_N) = \frac{\partial^N F_X(x_1,\dots,x_N;t_1,\dots,t_N)}{\partial x_1\cdots\partial x_N}$$
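As an illustration, F_X(x1; t1) can be estimated from an ensemble as the fraction of members whose value at t1 is at most x1. A minimal Python/NumPy sketch, assuming the example process X(t) = cos(𝜔0 t + Θ) with Θ uniform on (0, 2π) (all parameter values are arbitrary):

import numpy as np

rng = np.random.default_rng(1)
n = 100_000                              # ensemble size (arbitrary)
t1 = 0.25                                # fixed time instant
omega0 = 2 * np.pi                       # assumed angular frequency

theta = rng.uniform(0.0, 2 * np.pi, n)   # one Theta per ensemble member
x1 = np.cos(omega0 * t1 + theta)         # the random variable X1 = X(t1)

# Empirical F_X(x1; t1) = P{X(t1) <= x1} at a few levels
for level in (-0.5, 0.0, 0.5):
    print(f"F_X({level:+.1f}; t1={t1}) ≈ {np.mean(x1 <= level):.3f}")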
Statistical Independence
Two processes X(t) and Y(t) are statistically independent if the random variable group X(t1), X(t2), …, X(tN) is independent of the group Y(t1′), Y(t2′), …, Y(tM′) for any choice of times t1, …, tN, t1′, t2′, …, tM′.
Types of Stationary Processes
1. First Order Stationary Processes
2. Second-Order and Wide-Sense Stationary Process
3. N-Order and Strict-Sense Stationary Processes
First Order Stationary Processes
A random process is called stationary to order one if its first-order density function does not change with a shift in time origin:
f_X(x1; t1) = f_X(x1; t1 + Δ) for any t1 and any real Δ.
A consequence is that the mean E[X(t)] = X̄ is constant.
Second-Order and Wide-Sense
Stationary Process
A process is called stationary to order two if its second-order density function satisfies
f_X(x1, x2; t1, t2) = f_X(x1, x2; t1 + Δ, t2 + Δ) for all t1, t2, and Δ,
so the autocorrelation depends only on the time difference τ = t2 − t1.
A second-order stationary process is also first-order stationary, because the second-order density function determines the lower, first-order, density.
A process is wide-sense stationary (WSS) if E[X(t)] = X̄ is constant and R_XX(t, t + τ) = R_XX(τ); second-order stationarity implies wide-sense stationarity, but the converse is not true.
Problem
Consider the random process X(t)=3t+b, where b is uniformly
distributed random variable in the range (-2,2). Determine the
mean and the autocorrelation of X(t). Is X(t) wide sense
stationary? Justify your answer.
Solution
1. Mean = E[X(t)] = E[3t + b] = 3t, since E[b] = 0 for b uniform on (−2, 2).
2. R_XX(t, t + τ) = E[X(t)X(t + τ)] = E[(3t + b)(3t + 3τ + b)]
= E[9t² + 9tτ + 3tb + 3bt + 3bτ + b²]
= 9t² + 9tτ + E[b²]   (since E[b] = 0)
= 9t² + 9tτ + 4/3,   where $E[b^2] = \int_{-2}^{2} b^2\,\tfrac{1}{4}\,db = \tfrac{4}{3}$
X(t) is not WSS because its mean is a function of time.
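The result can be verified by simulation. A minimal Python/NumPy sketch (the sample size and the chosen t and τ are arbitrary):

import numpy as np

rng = np.random.default_rng(2)
b = rng.uniform(-2.0, 2.0, 1_000_000)    # b ~ U(-2, 2), one draw per realization

t, tau = 1.0, 0.5                        # arbitrary time instant and lag
x_t = 3 * t + b                          # X(t) across the ensemble
x_t_tau = 3 * (t + tau) + b              # X(t + tau) across the ensemble

print("E[X(t)]        ≈", round(x_t.mean(), 4), "(theory: 3t =", 3 * t, ")")
print("R_XX(t, t+tau) ≈", round(np.mean(x_t * x_t_tau), 4),
      "(theory:", round(9 * t**2 + 9 * t * tau + 4 / 3, 4), ")")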
Time Average
Time average of a quantity is defined as
$$A[\,\cdot\,] = \lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T}[\,\cdot\,]\,dt$$
A denotes the time average in a manner analogous to E for the statistical average.
The time average is taken over all time because, as applied to random processes, sample functions of processes are presumed to exist for all time.
The time mean value of a sample function x(t) is
$$\bar{x} = A[x(t)] = \lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T}x(t)\,dt$$
The time autocorrelation function is denoted as
$$\Re_{xx}(\tau) = A[x(t)\,x(t+\tau)] = \lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T}x(t)\,x(t+\tau)\,dt$$
Both x̄ and ℜxx(τ) are random variables over the ensemble; their expected values are the corresponding statistical averages:
$$E[\bar{x}] = \bar{X}, \qquad E[\Re_{xx}(\tau)] = R_{XX}(\tau)$$
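A minimal Python/NumPy sketch of both time averages for one sample function of an assumed process x(t) = cos(2πt + θ), with the integrals approximated by Riemann sums over a long but finite window:

import numpy as np

rng = np.random.default_rng(3)
dt = 0.01
t = np.arange(-500.0, 500.0, dt)         # finite stand-in for (-T, T)
theta = rng.uniform(0.0, 2 * np.pi)      # one fixed outcome: one sample function
x = np.cos(2 * np.pi * t + theta)

x_bar = np.sum(x) * dt / (t[-1] - t[0])  # A[x(t)] as a Riemann sum
print("time average ≈", round(x_bar, 5), "(theory: 0)")

k = 10                                   # lag of k samples: tau = k*dt = 0.1
r_tau = np.mean(x[:-k] * x[k:])          # A[x(t) x(t + tau)]
print("time autocorrelation at tau=0.1 ≈", round(r_tau, 4),
      "(theory: 0.5*cos(0.2*pi) ≈", round(0.5 * np.cos(0.2 * np.pi), 4), ")")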
Ergodic Process
If the random variables x̄ and ℜxx(τ) could be made to have zero variance, then
x̄ = X̄ and ℜxx(τ) = R_XX(τ),
i.e., the time averages x̄ and ℜxx(τ) will equal the statistical averages X̄ and R_XX(τ).
This is known as the ergodic theorem, and processes that satisfy it are called ergodic processes.
Two random processes are called jointly ergodic if they are individually
ergodic and also have a time cross-correlation function that equals the
statistical cross-correlation function.
$$\Re_{xy}(\tau) = \lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T}x(t)\,y(t+\tau)\,dt = R_{XY}(\tau)$$
Mean Ergodic Process
A process X(t) with a constant mean value X̄ is called mean-ergodic, or ergodic in the mean, if its statistical average X̄ equals the time average x̄ of any sample function x(t) with probability 1:
E[X(t)] = X̄ = A[x(t)] = x̄, with probability 1 for all x(t)
X(t) is mean-ergodic if and only if the variance of the time average is zero. Using the even symmetry C_XX(−τ) = C_XX(τ),
$$\sigma_{A_X}^2 = \lim_{T\to\infty}\frac{1}{2T}\int_{-2T}^{2T}\Big(1-\frac{|\tau|}{2T}\Big)C_{XX}(\tau)\,d\tau \;\le\; \lim_{T\to\infty}\frac{1}{2T}\int_{-2T}^{2T}\big|C_{XX}(\tau)\big|\,d\tau$$
The variance is zero if
1. C_XX(0) < ∞ and C_XX(τ) → 0 as τ → ∞, and
2. $\int_{-\infty}^{\infty}|C_{XX}(\tau)|\,d\tau < \infty$
A discrete sequence is called mean-ergodic if the time average of
samples equals the statistical average with probability 1.
$$A_X = \lim_{N\to\infty}\frac{1}{2N+1}\sum_{n=-N}^{N}X[n] = \bar{X}, \quad\text{with probability 1}$$
$$\sigma_{A_X}^2 = E\big[(A_X-\bar{X})^2\big] = \lim_{N\to\infty}\frac{1}{2N+1}\sum_{n=-2N}^{2N}\Big(1-\frac{|n|}{2N+1}\Big)C_{XX}[n] = 0$$
Problem
A zero-mean wide-sense stationary process X(t) has the autocorrelation function $R_{XX}(\tau) = C_{XX}(\tau) = e^{-2\alpha|\tau|}$ for α > 0 a constant. Determine if X(t) is mean-ergodic.
Solution
$$I = \lim_{T\to\infty}\frac{1}{T}\int_{0}^{2T}\Big(1-\frac{\tau}{2T}\Big)e^{-2\alpha\tau}\,d\tau, \quad\text{since } C_{XX}(-\tau)=C_{XX}(\tau)$$
By following the simple integral given in Appendix C, we get
$$I = \lim_{T\to\infty}\Big[\frac{1-e^{-4\alpha T}}{2\alpha T} - \frac{1-e^{-4\alpha T}-4\alpha T\,e^{-4\alpha T}}{8\alpha^2 T^2}\Big] = 0$$
Since σ_AX² = 0, X(t) is mean-ergodic.
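Evaluating the bracketed expression numerically for growing T (a minimal Python/NumPy sketch; the value of α is an arbitrary choice) shows the limit approaching zero, consistent with the result above:

import numpy as np

alpha = 0.5                              # arbitrary positive constant
for T in (1.0, 10.0, 100.0, 1000.0):
    e = np.exp(-4 * alpha * T)
    I = ((1 - e) / (2 * alpha * T)
         - (1 - e - 4 * alpha * T * e) / (8 * alpha**2 * T**2))
    print(f"T = {T:7.1f}   I = {I:.6f}")   # I -> 0 as T grows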
Correlation-Ergodic process
A stationary continuous process X(t) with autocorrelation
function 𝑅𝑋𝑋 (𝜏) is called autocorrelation-ergodic or ergodic in
the autocorrelation if, and only if, for all 𝜏
$$\lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T}X(t)\,X(t+\tau)\,dt = R_{XX}(\tau)$$
The necessary and sufficient condition is stated in terms of the process W(t) = X(t)X(t + λ), where λ is the offset:
E[W(t)] = E[X(t)X(t + λ)] = R_XX(λ)
R_WW(τ) = E[W(t)W(t + τ)] = E[X(t)X(t + λ)X(t + τ)X(t + τ + λ)]
C_WW(τ) = R_WW(τ) − {E[W(t)]}² = R_WW(τ) − R_XX²(λ)
Thus, X(t) is autocorrelation-ergodic if the variance integral defined in the previous slide is zero when C_XX(τ) is replaced by C_WW(τ).
A wide-sense stationary sequence X[n] is autocorrelation-ergodic if,
and only if, for all k
$$\lim_{N\to\infty}\frac{1}{2N+1}\sum_{n=-N}^{N}X[n]\,X[n+k] = R_{XX}[k]$$
Two processes X(t) and Y(t) are called cross-correlation-ergodic, or
ergodic in the correlation, if the time cross-correlation function is
equal to the statistical cross-correlation function
Auto-Correlation Function
Autocorrelation function of a random process X(t) is the correlation
𝐸[𝑋1𝑋2] of two random variables 𝑋1 = 𝑋(𝑡1 ) and 𝑋2 = 𝑋(𝑡2 ) defined
by the process at times 𝑡1 and 𝑡2 .
𝑅𝑋𝑋 𝑡1 , 𝑡2 = 𝐸[𝑋 𝑡1 𝑋(𝑡2 )]
For t1 = t and t2 = t + τ, with τ a real number, the equation becomes
R_XX(t, t + τ) = E[X(t)X(t + τ)]
If X(t) is wide-sense stationary, this depends only on τ and is written R_XX(τ).
Properties
1. |R_XX(τ)| ≤ R_XX(0)
This property states that R_XX(τ) is bounded by its value at the origin.
2. R_XX(−τ) = R_XX(τ)
This states that an autocorrelation function has even symmetry.
3. 𝑅𝑋𝑋 0 = 𝐸 𝑋 2(𝑡)
The bound is equal to the mean squared value called the power in the
process.
4. If E[X(t)] = X̄ ≠ 0 and X(t) is ergodic with no periodic components, then
$$\lim_{|\tau|\to\infty}R_{XX}(\tau) = \bar{X}^2$$
5. If 𝑋(𝑡) has a periodic component, then 𝑅𝑋𝑋 𝜏 will have a periodic
component with the same period.
6. If X(t) is ergodic, zero mean, and has no periodic component, then
$$\lim_{|\tau|\to\infty}R_{XX}(\tau) = 0$$
7. 𝑅𝑋𝑋 (𝜏) cannot have an arbitrary shape.
This means, any arbitrary function cannot be an autocorrelation function.
Problem
The autocorrelation function of a stationary ergodic process with no periodic components is
$$R_{XX}(\tau) = 25 + \frac{4}{1+6\tau^2}$$
Find the mean value and the variance of the process X(t).
Property 4:
E[X(t)] = X̄, and
$$\lim_{|\tau|\to\infty}R_{XX}(\tau) = \bar{X}^2 = 25$$
so X̄ = ±√25 = ±5.
Variance:
σ_X² = E[X²(t)] − (E[X(t)])²
Property 3:
But E[X²(t)] = R_XX(0) = 25 + 4 = 29, so
σ_X² = 29 − 25 = 4
Cross-Correlation Function
The cross-correlation function of two processes X(t) and Y(t) is
R_XY(t, t + τ) = E[X(t)Y(t + τ)]
If X(t) and Y(t) are at least jointly wide-sense stationary, it depends only on τ: R_XY(τ). If R_XY(t, t + τ) = 0, the two processes are called orthogonal.
Properties
For jointly wide-sense stationary processes:
1. R_XY(−τ) = R_YX(τ)
2. |R_XY(τ)| ≤ √(R_XX(0) R_YY(0))
3. |R_XY(τ)| ≤ ½[R_XX(0) + R_YY(0)]
Covariance function
The auto covariance function is,
𝐶𝑋𝑋 𝑡, 𝑡 + 𝜏 = 𝐸[{𝑋 𝑡 − 𝐸[𝑋(𝑡)]}{𝑋 𝑡 + 𝜏 − 𝐸[𝑋(𝑡 + 𝜏)]}]
𝐶𝑋𝑋 𝑡, 𝑡 + 𝜏 = 𝑅𝑋𝑋 𝑡, 𝑡 + 𝜏 − 𝐸 𝑋 𝑡 𝐸[𝑋(𝑡 + 𝜏)]
The cross covariance function for two processes is,
𝐶𝑋𝑌 𝑡, 𝑡 + 𝜏 = 𝐸[{𝑋 𝑡 − 𝐸[𝑋(𝑡)]}{𝑌 𝑡 + 𝜏 − 𝐸[𝑌(𝑡 + 𝜏)]}]
𝐶𝑋𝑌 𝑡, 𝑡 + 𝜏 = 𝑅𝑋𝑌 𝑡, 𝑡 + 𝜏 − 𝐸 𝑋 𝑡 𝐸[𝑌(𝑡 + 𝜏)]
For processes that are at least jointly wide-sense stationary,
𝐶𝑋𝑋 𝜏 = 𝑅𝑋𝑋 𝜏 − 𝑋ത 2
𝐶𝑋𝑌 𝜏 = 𝑅𝑋𝑌 𝜏 − 𝑋ത 𝑌ത
The variance of a w.s.s process does not depend upon time and with
𝜏 = 0,
𝜎𝑋 2 = 𝐸[{𝑋 𝑡 − 𝐸[𝑋(𝑡)]}2]=𝑅𝑋𝑋 0 − 𝑋ത 2
Two random processes X and Y are uncorrelated, if
𝐶𝑋𝑌 𝑡, 𝑡 + 𝜏 = 0
That means,
𝑅𝑋𝑌 𝑡, 𝑡 + 𝜏 = 𝐸 𝑋 𝑡 𝐸[𝑌(𝑡 + 𝜏)]
This means that, independent processes are uncorrelated.
The reverse is not true, except for joint Gaussian processes.
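A small numerical illustration of the last point at lag τ = 0 (a minimal Python/NumPy sketch; the two constructed pairs and the sample size are arbitrary assumptions): an independent pair gives a cross covariance near zero, while a dependent pair does not.

import numpy as np

rng = np.random.default_rng(4)
n = 200_000
x = rng.normal(0.0, 1.0, n)                 # samples of X at a fixed time
y_dep = 2.0 * x + rng.normal(0.0, 1.0, n)   # dependent on X
y_ind = rng.normal(0.0, 1.0, n)             # independent of X

def c_xy(a, b):
    # Estimate C_XY = E[(X - E[X])(Y - E[Y])] from paired samples
    return np.mean((a - a.mean()) * (b - b.mean()))

print("C_XY, dependent pair   ≈", round(c_xy(x, y_dep), 3), "(theory: 2)")
print("C_XY, independent pair ≈", round(c_xy(x, y_ind), 3), "(theory: 0)")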
Discrete time processes and
Sequences
All the equations we have discussed so far remain valid for discrete-time processes and discrete-time sequences.
Such processes and sequences are defined only at the sample times nTs.
For a discrete-time process X(nTs), the definitions take the forms given below.
Mean = E[X(nTs)]
R_XX(nTs, nTs + kTs) = E[X(nTs) X(nTs + kTs)]
C_XX(nTs, nTs + kTs) = R_XX(nTs, nTs + kTs) − E[X(nTs)] E[X(nTs + kTs)]
For a DT process Y(nTs), the cross-correlation and cross-covariance functions can be written as
R_XY(nTs, nTs + kTs) = E[X(nTs) Y(nTs + kTs)]
C_XY(nTs, nTs + kTs) = R_XY(nTs, nTs + kTs) − E[X(nTs)] E[Y(nTs + kTs)]
For processes that are jointly wide-sense stationary,
E[X(nTs)] = X̄ and E[Y(nTs)] = Ȳ
R_XX(nTs, nTs + kTs) = R_XX(kTs)
R_YY(nTs, nTs + kTs) = R_YY(kTs)
C_XX(nTs, nTs + kTs) = R_XX(kTs) − X̄²
C_YY(nTs, nTs + kTs) = R_YY(kTs) − Ȳ²
R_XY(nTs, nTs + kTs) = R_XY(kTs)
R_YX(nTs, nTs + kTs) = R_YX(kTs)
C_XY(nTs, nTs + kTs) = R_XY(kTs) − X̄Ȳ
C_YX(nTs, nTs + kTs) = R_YX(kTs) − ȲX̄
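For a concrete discrete-time case, R_XX(kTs) of a WSS sequence can be estimated by a time average over one long realization (valid when the sequence is autocorrelation-ergodic). A minimal Python/NumPy sketch using an assumed moving-average sequence X[n] = w[n] + 0.8 w[n−1] built from white Gaussian noise w:

import numpy as np

rng = np.random.default_rng(5)
n = 100_000
w = rng.normal(0.0, 1.0, n + 1)          # white Gaussian noise
x = w[1:] + 0.8 * w[:-1]                 # assumed WSS sequence X[n]

def r_hat(x, k):
    # Time-average estimate of R_XX[k] = E[X[n] X[n+k]]
    return np.mean(x[:len(x) - k] * x[k:])

for k in range(4):
    print(f"R_XX[{k}] ≈ {r_hat(x, k):+.3f}")
# Theory for this sequence: R[0] = 1 + 0.8**2 = 1.64, R[1] = 0.8, R[k>=2] = 0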
Measurement of correlation function
We can never measure the true correlation functions of two random
processes X(t) and Y(t), because we never have all sample functions
of the ensemble at our disposal.
To resolve the problem, we need to determine the time averages
based on finite time portions of single sample functions, taken large
enough to approximate true results for ergodic processes.
If x(t) and y(t) exist at least for t > −T, and t1 ≥ 0 is an arbitrary time, then
$$R_o(t_1+2T) = \frac{1}{2T}\int_{t_1-T}^{t_1+T}x(t)\,y(t+\tau)\,dt$$
If t1 = 0 and T is assumed large, then
$$R_o(2T) = \frac{1}{2T}\int_{-T}^{T}x(t)\,y(t+\tau)\,dt \approx \Re_{xy}(\tau) = R_{XY}(\tau)$$
[Block diagram: y(t) enters a delay of T − τ (input point A) and x(t) enters a delay of T (input point B); the two delayed signals are multiplied and the product is averaged by $(1/2T)\int_{t_1}^{t_1+2T}(\cdot)\,dt$ to produce R_o.]
For jointly ergodic process, the above system can approximately measure
their cross-correlation function (𝜏 is varied to obtain the complete function).
By connecting point A and B, and applying either x(t) or y(t) to the system,
we can also measure the autocorrelation functions 𝑅𝑋𝑋 (𝜏) and 𝑅𝑌𝑌 𝜏 .
Problem
Connect A and B together in the previous figure and use the system to
measure the autocorrelation function of the process 𝑋 𝑡 = 𝐴𝑐𝑜𝑠(𝜔0 𝑡 +
Θ).
Solution:
$$R_o(2T) = \frac{A^2}{2T}\int_{-T}^{T}\cos(\omega_0 t+\theta)\cos(\omega_0 t+\theta+\omega_0\tau)\,dt$$
$$= \frac{A^2}{4T}\int_{-T}^{T}\big[\cos(\omega_0\tau)+\cos(2\omega_0 t+2\theta+\omega_0\tau)\big]\,dt$$
The second term integrates to a quantity that decays as 1/T, so for large T, R_o(2T) ≈ (A²/2)cos(𝜔0τ), which for Θ uniform on (0, 2π) equals R_XX(τ).
𝜃 represents a specific value of the random variable Θ.
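The measurement can also be simulated directly. A minimal Python/NumPy sketch (A, 𝜔0, T, and the step size are arbitrary choices) comparing R_0(2T) with (A²/2)cos(𝜔0τ):

import numpy as np

rng = np.random.default_rng(6)
A, omega0 = 2.0, 2 * np.pi * 5.0         # assumed amplitude and frequency
theta = rng.uniform(0.0, 2 * np.pi)      # one outcome of Theta

T, dt = 50.0, 1e-3                       # averaging half-window and time step
t = np.arange(-T, T, dt)
x = A * np.cos(omega0 * t + theta)

for tau in (0.0, 0.05, 0.1):
    x_shift = A * np.cos(omega0 * (t + tau) + theta)   # x(t + tau)
    R0 = np.sum(x * x_shift) * dt / (2 * T)            # (1/2T) * integral
    print(f"tau = {tau:4.2f}: R0 ≈ {R0:+.4f}   "
          f"(A**2/2)cos(w0 tau) = {0.5 * A * A * np.cos(omega0 * tau):+.4f}")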
Gaussian Random Process
Consider a continuous random process and define N random variables
𝑋1 = 𝑋 𝑡1 , … , 𝑋𝑖 = 𝑋 𝑡𝑖 , … , 𝑋𝑁 = 𝑋(𝑡𝑁 ) corresponding to N time
instants 𝑡1 , … , 𝑡𝑖 , … , 𝑡𝑁 . If, for any N=1,2,… and any times 𝑡1 , 𝑡2, … , 𝑡𝑁 ,
these random variables are jointly Gaussian, then the process is called
Gaussian.
The mean values X̄i of X(ti):
X̄i = E[Xi] = E[X(ti)]
The elements of the covariance matrix [C_X] are
C_ik = C_XiXk = E[(Xi − X̄i)(Xk − X̄k)]
= E[{X(ti) − E[X(ti)]}{X(tk) − E[X(tk)]}]
= C_XX(ti, tk)
where C_XX(ti, tk) = R_XX(ti, tk) − E[X(ti)] E[X(tk)]
If the Gaussian process is wide-sense stationary, the mean is constant and C_XX(ti, tk) depends only on the time difference tk − ti, so the process is completely specified by X̄ and R_XX(τ).
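One standard way to draw jointly Gaussian samples X(t1), …, X(tN) with a prescribed covariance matrix [C_X] is through its Cholesky factor. A minimal Python/NumPy sketch, assuming zero mean and an exponential covariance C_XX(ti, tk) = exp(−2|ti − tk|) (an arbitrary WSS choice):

import numpy as np

rng = np.random.default_rng(7)
t = np.linspace(0.0, 1.0, 50)            # N = 50 time instants t_1, ..., t_N

# Assumed covariance matrix: C_XX(t_i, t_k) = exp(-2|t_i - t_k|), zero mean
C = np.exp(-2.0 * np.abs(t[:, None] - t[None, :]))

L = np.linalg.cholesky(C + 1e-10 * np.eye(len(t)))  # small jitter for stability
samples = L @ rng.standard_normal((len(t), 5000))   # 5000 joint realizations

C_hat = np.cov(samples)                  # empirical covariance matrix
print("max |C_hat - C| =", round(np.abs(C_hat - C).max(), 3))  # should be small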
Poisson Random Process
Probability Density Function
Joint probability Density
$$P[X(t_1)=k_1] = \frac{(\lambda t_1)^{k_1}e^{-\lambda t_1}}{k_1!}, \qquad k_1 = 0, 1, 2, \dots$$
The conditional probability of k2 occurrences over (0, t2), given that k1 events occurred over (0, t1), is just the probability that k2 − k1 events occurred over (t1, t2):
$$P[X(t_2)=k_2 \mid X(t_1)=k_1] = \frac{[\lambda(t_2-t_1)]^{k_2-k_1}e^{-\lambda(t_2-t_1)}}{(k_2-k_1)!}$$
$$P(k_1,k_2) = P[X(t_2)=k_2 \mid X(t_1)=k_1]\cdot P[X(t_1)=k_1] = \frac{(\lambda t_1)^{k_1}[\lambda(t_2-t_1)]^{k_2-k_1}e^{-\lambda t_2}}{k_1!\,(k_2-k_1)!}, \qquad k_2 \ge k_1$$
For the process random variables X1 and X2, the joint density now becomes
$$f_X(x_1,x_2) = \sum_{k_1=0}^{\infty}\sum_{k_2=k_1}^{\infty}P(k_1,k_2)\,\delta(x_1-k_1)\,\delta(x_2-k_2)$$
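The joint probability P(k1, k2) can be checked by simulating the counting process, using the fact that increments over disjoint intervals are independent Poisson random variables. A minimal Python/NumPy sketch (λ, t1, t2 and the checked pair k1 = 2, k2 = 5 are arbitrary choices):

import numpy as np
from math import exp, factorial

rng = np.random.default_rng(8)
lam, t1, t2 = 3.0, 1.0, 2.5              # assumed rate and observation times
n = 500_000

k1 = rng.poisson(lam * t1, n)                  # X(t1)
k2 = k1 + rng.poisson(lam * (t2 - t1), n)      # X(t2) = X(t1) + increment

p_sim = np.mean((k1 == 2) & (k2 == 5))         # simulated P[X(t1)=2, X(t2)=5]
p_formula = ((lam * t1)**2 * (lam * (t2 - t1))**3 * exp(-lam * t2)
             / (factorial(2) * factorial(3)))
print(f"simulated: {p_sim:.5f}   formula: {p_formula:.5f}")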
Complex Random Processes
If we include the notion of time in the complex random variable, then
the result will be a complex random process.
Z(t) = X(t) + jY(t)
where X(t) and Y(t) are real processes.
If X(t) and Y(t) are jointly stationary, then Z(t) is stationary.
If X(t) and Y(t) are jointly wide-sense stationary, then Z(t) is wide-sense stationary.
Two complex processes 𝑍𝑖 𝑡 and 𝑍𝑗 𝑡 are jointly wide sense
stationary if each is wide sense stationary and their cross correlation
function is a function of time differences only and not absolute time.
The mean value of Z(t) is,
𝐸𝑍 𝑡 =𝐸 𝑋 𝑡 + 𝑗𝐸[𝑌(𝑡)]
The autocorrelation function is defined by
R_ZZ(t, t + τ) = E[Z*(t) Z(t + τ)]
where Z*(t) is the complex conjugate of Z(t).
The autocovariance function is defined by
C_ZZ(t, t + τ) = E[{Z(t) − E[Z(t)]}* {Z(t + τ) − E[Z(t + τ)]}]
If Z(t) is wide sense stationary, then the mean value becomes a
constant.
Z̄ = X̄ + jȲ
The correlation functions are independent of absolute time:
𝑅𝑍𝑍 𝑡, 𝑡 + 𝜏 = 𝑅𝑍𝑍 (𝜏)
𝐶𝑍𝑍 𝑡, 𝑡 + 𝜏 = 𝐶𝑍𝑍 (𝜏)
For two complex processes 𝑍𝑖 𝑡 and 𝑍𝑗 (𝑡), cross-correlation and
cross-covariance functions are defined by
R_ZiZj(t, t + τ) = E[Zi*(t) Zj(t + τ)]
C_ZiZj(t, t + τ) = E[{Zi(t) − E[Zi(t)]}* {Zj(t + τ) − E[Zj(t + τ)]}]
If two processes are at least jointly wide sense stationary, then
𝑅𝑍𝑖 𝑍𝑗 𝑡, 𝑡 + 𝜏 = 𝑅𝑍𝑖𝑍𝑗 (𝜏) 𝑖 ≠ 𝑗
𝐶𝑍𝑖𝑍𝑗 𝑡, 𝑡 + 𝜏 = 𝐶𝑍𝑖 𝑍𝑗 (𝜏) 𝑖 ≠ 𝑗
𝑍𝑖 𝑡 and 𝑍𝑗 (𝑡) are uncorrelated processes if 𝐶𝑍𝑖𝑍𝑗 𝑡, 𝑡 + 𝜏 = 0, 𝑖 ≠ 𝑗
𝑍𝑖 𝑡 and 𝑍𝑗 (𝑡) are orthogonal processes if 𝑅𝑍𝑖 𝑍𝑗 𝑡, 𝑡 + 𝜏 = 0, 𝑖 ≠ 𝑗
Thank you