04-10-2018/CE 608
CE 513: STATISTICAL METHODS
IN CIVIL ENGINEERING
Lecture 11: Random Process
Dr. Budhaditya Hazra
Room: N-307
Department of Civil Engineering
Recall: Random Variable Def
The outcomes ξ1, ξ2, …, ξp of the sample space are each mapped by the random variable to a number X(ξ1), …, X(ξp). Schematically:
ξi → RV → X(ξi)
Random process
Each outcome ξi is now mapped to a sample function, i.e. a function of time:
ξi → RP → X(t, ξi)
Random process
First-order distribution (for a particular value of t):
F_X(x; t) = P[X(t) ≤ x]
First-order density function:
f_X(x; t) = ∂F_X(x; t)/∂x
2nd Order Averages
2nd order distribution:
F_X(x1, x2; t1, t2) = P[X(t1) ≤ x1, X(t2) ≤ x2]
2nd order density function:
f_X(x1, x2; t1, t2) = ∂²F_X(x1, x2; t1, t2)/∂x1 ∂x2
Expectations
Ensemble average: μ_X(t) = E[X(t)]
Autocorrelation: R_X(t1, t2) = E[X(t1) X(t2)]
Autocovariance: C_X(t1, t2) = R_X(t1, t2) − μ_X(t1) μ_X(t2)
Classification of stochastic processes
Strictly stationary: the statistics of X(t) are invariant under a shift of the time origin, i.e. X(t) and X(t + c) have the same distributions of every order for all c.
Thus both the first-order and second-order distributions are independent of the absolute time t; the second-order distribution depends only on the lag τ = t2 − t1.
Wide sense stationary
If stationary condition of a random process X(t) does
not hold for all n but holds for n ≤ k, then we say that
the process X(t)is stationary to order k.
If X(t) is stationary to order 2, then X(t) is said to be wide-
sense stationary (WSS) or weak stationary.
Wide sense stationary
• Stationarity of a random process is analogous to steady state in vibration problems
• One or more of the properties of the random process become independent of time
• Strict-sense stationarity (SSS): defined with respect to pdfs
• Wide-sense stationarity (WSS): defined with respect to moments
Stationarity: A Few Theorems
1. A random process that is stationary to order n is also stationary to all orders lower than n.
2. If {X(t), t ∈ T} is a strict-sense stationary random process, then it is also WSS.
3. If a random process X(t) is WSS, then it must also be covariance stationary.
R_x(τ) (WSS) examples
1) G(t) = A cos(ω0 t + φ), where φ is a uniform RV with φ ~ U(0, 2π). Determine the mean and the autocorrelation.
Ans: mean = 0; R(τ) = (A²/2) cos(ω0 τ)
2) G(t) = A cos(ωt + θ), where ω and θ are independent RVs with θ ~ U(0, 2π) and ω ~ U(ω1, ω2). Determine the mean and the autocorrelation.
Ans: mean = 0; R(τ) = A² [sin(ω2 τ) − sin(ω1 τ)] / (2τ(ω2 − ω1))
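A quick numerical check of example 1 (a Python sketch standing in for the MATLAB demos listed later; A, ω0, the fixed time t, and the ensemble size are illustrative choices): averaging over an ensemble of random phases should reproduce R(τ) = (A²/2) cos(ω0 τ), independent of t.

```python
import numpy as np

# Ensemble estimate of R(tau) = E[G(t) G(t+tau)] for G(t) = A cos(w0 t + phi),
# phi ~ U(0, 2*pi).  A, w0, t, and the sample size are illustrative choices.
rng = np.random.default_rng(0)
A, w0, t = 2.0, 3.0, 1.5          # amplitude, frequency, arbitrary fixed time
phi = rng.uniform(0.0, 2.0 * np.pi, size=200_000)

taus = np.linspace(0.0, 2.0, 21)
R_est = np.array([np.mean(A * np.cos(w0 * t + phi) *
                          A * np.cos(w0 * (t + tau) + phi)) for tau in taus])
R_theory = (A**2 / 2.0) * np.cos(w0 * taus)   # (A^2/2) cos(w0 tau)

max_err = np.max(np.abs(R_est - R_theory))
print(max_err)   # small Monte Carlo error -> estimate matches theory
```

Repeating the run with a different fixed t leaves the estimate unchanged, which is exactly the WSS property.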
Autocorrelation: Properties
1. It is an even function of τ: R_x(τ) = R_x(−τ)
2. Bounded by its value at the origin: |R_x(τ)| ≤ R_x(0)
3. R_x(0) = E[X²]
4. If X is periodic, R_x(τ) is also periodic
Cross-correlation
1. Two processes X(t) and Y(t) are called jointly stationary if
❖ each of them is WSS individually, and
❖ R_xy(t, t + τ) = R_xy(τ) and R_yx(t, t + τ) = R_yx(τ)
2. R_xy(τ) and R_yx(τ) are mirror images of each other: R_xy(τ) = R_yx(−τ)
White-Noise
• Note that, as the parameter λ gets larger, R_x(τ) becomes narrower
• We use this to define a special WSS process called white noise
• As λ → ∞, the process is very erratic and R_x(τ) approaches a Dirac delta function
• In order that R_x(τ) does not disappear completely, the amplitude parameter a must become very large
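In discrete time the delta-like autocorrelation is easy to see directly. A minimal sketch, assuming Gaussian white noise with an illustrative σ and sample size: the sample autocorrelation is large at lag 0 (≈ σ²) and near zero at every other lag.

```python
import numpy as np

# Discrete white noise: uncorrelated zero-mean samples.  Its sample
# autocorrelation should approximate sigma^2 * delta(k): large at lag 0,
# near zero elsewhere.  sigma and N are illustrative choices.
rng = np.random.default_rng(1)
sigma, N = 2.0, 100_000
x = rng.normal(0.0, sigma, size=N)

def sample_autocorr(x, k):
    """Biased estimator (1/N) * sum_n x[n] x[n-k]."""
    N = len(x)
    return np.dot(x[k:], x[:N - k]) / N

r0 = sample_autocorr(x, 0)                       # ~ sigma^2 = 4
r_off = [sample_autocorr(x, k) for k in (1, 5, 20)]  # ~ 0 at nonzero lags
print(r0, r_off)
```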
Ergodicity
Basic idea: Equivalence of temporal and ensemble averages
Ergodicity
A random process is said to be ergodic if it has the property that the time averages of sample functions of the process are equal to the corresponding statistical or ensemble averages.
E[X(t)] = ⟨X(t)⟩ = (1/T) ∫ from −T/2 to T/2 of x(t) dt
The sample autocorrelation can be calculated using the following formula:
R_X(τ) = ⟨X(t) X(t + τ)⟩ = (1/T) ∫ from −T/2 to T/2 of x(t) x(t + τ) dt
Ergodicity
• Consider a sample realization of a random process: x(0), x(1), …, x(N − 1)
• The sample mean of the sequence can be estimated as:
m̂_x(N) = (1/N) Σ from n=0 to N−1 of x(n)
• Since the sample is a realization of a random process, it has a constant ensemble mean E[X(n)] = m_x
If the sample mean m̂_x(N) of a WSS process converges to m_x in a mean-square sense as N → ∞, then the random process is said to be ergodic in the mean:
lim as N → ∞ of m̂_x(N) = m_x
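The random-phase sinusoid is the classic ergodic-in-mean example. A Python sketch of the idea (the constant level m, amplitude A, frequency ω0, and the time grid are illustrative assumptions): the time average of a single realization recovers the ensemble mean.

```python
import numpy as np

# Ergodicity in the mean: for one realization of a random-phase sinusoid
# plus a constant level m, the time average converges to the ensemble
# mean m.  m, A, w0, and the grid are illustrative choices.
rng = np.random.default_rng(2)
m, A, w0 = 1.0, 2.0, 5.0
phi = rng.uniform(0.0, 2.0 * np.pi)    # a single realization of the phase

T = 200.0
t = np.linspace(0.0, T, 400_001)
x = m + A * np.cos(w0 * t + phi)       # one sample function x(t)

time_avg = np.mean(x)                  # ~ (1/T) * integral of x(t) dt
print(time_avg)                        # close to the ensemble mean m = 1.0
```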
Sample autocorrelation of a WSS and Ergodic process
r_x(k) = E[x(n) x(n − k)]
For each k, the autocorrelation is the expected value of the process y_k(n) = x(n) x(n − k).
Using ergodicity, the autocorrelation is finally estimated as:
r̂_x(k, N) = (1/N) Σ from n=0 to N−1 of x(n) x(n − k)
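A sketch of this estimator at work, checked on an AR(1) process whose true autocorrelation is known in closed form (the AR coefficient a, noise level, and record length are illustrative assumptions):

```python
import numpy as np

# Biased sample-autocorrelation estimator r_hat(k, N) = (1/N) sum_n x[n] x[n-k],
# checked on an AR(1) process x[n] = a*x[n-1] + w[n], whose true autocorrelation
# is r(k) = sigma_w^2 / (1 - a^2) * a^k.  a, sigma_w, N are illustrative.
rng = np.random.default_rng(3)
a, sigma_w, N = 0.8, 1.0, 200_000
w = rng.normal(0.0, sigma_w, size=N)
x = np.zeros(N)
for n in range(1, N):
    x[n] = a * x[n - 1] + w[n]

def r_hat(x, k):
    return np.dot(x[k:], x[:len(x) - k]) / len(x)

r_true = lambda k: sigma_w**2 / (1.0 - a**2) * a**k
errs = [abs(r_hat(x, k) - r_true(k)) for k in range(5)]
print(errs)   # all small: one long record suffices, by ergodicity
```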
WSS & Ergodic process: example
Coming back to the random-phase sinusoid
G(t) = A cos(ω0 t + φ), where φ is a uniform RV with φ ~ U(0, 2π):
⟨X(t)⟩ = lim as T → ∞ of (1/2T) ∫ from −T to T of x(t) dt = lim as T → ∞ of (1/2T) ∫ from −T to T of A cos(ω0 t + φ) dt = 0
⟨X(t) X(t + τ)⟩ = lim as T → ∞ of (1/2T) ∫ from −T to T of x(t) x(t + τ) dt
= lim as T → ∞ of (1/2T) ∫ from −T to T of A² cos(ω0 t + ω0 τ + φ) cos(ω0 t + φ) dt
= (A²/2) cos(ω0 τ)
The time averages match the ensemble averages obtained earlier, so the process is ergodic in the mean and in correlation.
Applications
Noisy signals
Consider a signal buried in white noise, i.e. y(t) = s(t) + n(t)
Assume: noise and signal are uncorrelated, each with mean = 0
Therefore: R_yy(τ) = R_ss(τ) + R_nn(τ)
As R_nn(τ) decays very rapidly, the autocorrelation function of the signal, R_ss(τ), will dominate for larger values of τ
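A Python sketch of the noisy-signal demo (standing in for the MATLAB example listed later; the sinusoidal signal model and unit noise variance are illustrative assumptions): at lag 0 the noise adds its full variance, but at larger lags the autocorrelation of y tracks R_ss alone.

```python
import numpy as np

# y(n) = s(n) + n(n) with s and n uncorrelated and zero mean, so
# R_yy(k) = R_ss(k) + R_nn(k).  For white noise R_nn(k) ~ sigma^2*delta(k),
# so away from k = 0 the signal's autocorrelation dominates.
# The signal model, noise level, and lags are illustrative choices.
rng = np.random.default_rng(4)
N = 200_000
n_idx = np.arange(N)
s = np.sqrt(2.0) * np.cos(0.1 * n_idx)        # time-avg R_ss(k) = cos(0.1 k)
y = s + rng.normal(0.0, 1.0, size=N)          # add white noise, sigma^2 = 1

def r_hat(x, k):
    return np.dot(x[k:], x[:len(x) - k]) / len(x)

r0 = r_hat(y, 0)                               # ~ R_ss(0) + sigma^2 = 2
err_tail = max(abs(r_hat(y, k) - np.cos(0.1 * k)) for k in range(10, 40))
print(r0, err_tail)                            # tail matches R_ss closely
```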
Application of cross-correlation
Consider a wheeled vehicle moving over rough terrain as shown in the figure.
• Let the time function (profile) experienced by the leading wheel be x(t) and that by the trailing wheel be y(t)
• Let the autocorrelation of x(t) be R_xx(τ)
• Assume that the vehicle moves at a constant speed V
Then y(t) = x(t − Δ), where Δ = L/V, so R_xy(τ) = R_xx(τ − Δ): the cross-correlation peaks at the lag τ = Δ
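The time-delay problem from the MATLAB list can be sketched in Python as follows (the moving-average "terrain" model and the delay d = 25 samples are illustrative assumptions): locating the peak of the cross-correlation recovers the delay, and hence L/V.

```python
import numpy as np

# Trailing-wheel profile y[n] = x[n - d]: the cross-correlation
# R_xy(k) = E[x(n) y(n + k)] = R_xx(k - d) peaks at k = d, so the delay
# (and hence L/V) can be read off the peak.  The smooth-terrain model
# and d = 25 samples are illustrative choices.
rng = np.random.default_rng(5)
N, d = 100_000, 25
w = rng.normal(size=N + d)
x_full = np.convolve(w, np.ones(10) / 10.0, mode="same")  # smooth "terrain"
x = x_full[d:]               # leading wheel:  x[n] = x_full[n + d]
y = x_full[:N]               # trailing wheel: y[n] = x[n - d]

lags = np.arange(60)
r_xy = np.array([np.dot(x[:N - k], y[k:]) / (N - k) for k in lags])
d_hat = int(lags[np.argmax(r_xy)])   # lag of the cross-correlation peak
print(d_hat)                          # recovers the delay d = 25
```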
Application of cross-correlation
• Let x(t) and y(t) be observed in the presence of white noise (~N(0, σ²))
MATLAB examples
• Autocorrelation of a random phase sinusoid
• Noisy signal
• Time delay problem
Independent Increment Processes
• {X(t), t > 0} is said to have independent increments when
X(0), X(t1) − X(0), X(t2) − X(t1), …, X(tn) − X(t(n−1))
are independent
• If {X(t), t > 0} possesses independent increments and X(t + h) − X(s + h) has the same distribution as X(t) − X(s), then the process X(t) is said to have stationary independent increments
Arrival Process
Let t represent a time variable
• Suppose an experiment begins at t = 0
• Events of a particular kind occur randomly, the first at T1, the second at T2, and so on
• The RV Ti denotes the time at which the i-th event occurs
• The values ti of Ti (i = 1, 2, …) are called points of occurrence
Let Zn = Tn − T(n−1)
Arrival Process
Let Zn = Tn − T(n−1) with T0 = 0
Then Zn denotes the time between the (n − 1)-st and the n-th events
The sequence of ordered RVs {Zn, n ≥ 1} is sometimes called an interarrival process
Observe that Tn = Z1 + Z2 + ⋯ + Zn
The sequence {Tn, n ≥ 1} is called an arrival process
Counting Process
A random process {𝑋(𝑡), t ≥ 0} is said to be a counting process
if:
• X(t) represents the total number of events that have occurred in the interval (0, t)
• X(t) ≥ 0 and X(0) = 0
• X(t) is integer-valued and X(s) ≤ X(t) if s < t
• X(t) − X(s) equals the number of events that have occurred in the interval (s, t)
Counting Process
• X(t) is an independent-increment process if the numbers of events that occur in disjoint time intervals are independent
• A counting process X(t) possesses stationary increments if X(t + h) − X(s + h) has the same distribution as X(t) − X(s)
Poisson Counting Process
A counting process X(t) is said to be a Poisson process with rate (or intensity) λ (> 0) if
1. X(0) = 0
2. X(t) has independent increments
3. The number of events in any interval of length t is Poisson distributed with mean λt
Thus, the expected number of events in the unit interval (0, 1), or any other interval of unit length, is just λ (hence the name rate or intensity).
Example
Suppose vehicles are passing a bridge at the rate of two per minute.
Q1. In 5 minutes, what is the average number of vehicles?
Q2. What is the variance in 5 minutes?
Q3. What is the probability of at least one vehicle passing the bridge in those 5 minutes?
To determine the above, a Poisson process is assumed, where V(t) is the number of vehicles in the interval (0, t), with rate λ = 2:
P[V(t) = n] = ((λt)^n / n!) e^(−λt), n = 0, 1, 2, …
For t = 5: P[V(5) = n] = (10^n / n!) e^(−10) ⇒ μ_V(5) = 10 = σ²_V(5)
Q3: Do it yourself. Hint: P[V(5) ≥ 1] = 1 − P[V(5) = 0]
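The arithmetic of this example can be sketched directly from the Poisson pmf (this also confirms the Q3 hint numerically without spelling out the final answer):

```python
from math import exp, factorial

# Poisson counting of vehicles with rate lam = 2 per minute over t = 5
# minutes: V(5) ~ Poisson(lam*t = 10), so mean = variance = 10 and
# P(V(5) >= 1) = 1 - P(V(5) = 0).
lam, t = 2.0, 5.0
mu = lam * t                       # mean = variance = lam*t = 10

def pmf(n, mu):
    """Poisson pmf P(V = n) = mu^n e^{-mu} / n!"""
    return mu**n * exp(-mu) / factorial(n)

p_at_least_one = 1.0 - pmf(0, mu)  # = 1 - e^{-10}
print(mu, p_at_least_one)
```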
Interarrival Times for a Poisson Counting Process
A counting process X(t) is said to be a Poisson process with rate λ (> 0) if
1. X(0) = 0 and X(t) has independent increments
2. The number of events in any interval of length t is Poisson distributed with mean λt
The time intervals between successive events, or interarrival times, in a Poisson process X(t) with rate λ are IID exponential with parameter λ.
Let Z1, Z2, … be the RVs representing the lengths of the interarrival times in the Poisson process X(t).
{Z1 > t} takes place if and only if no event of the Poisson process occurs in the interval (0, t), so
P(Z1 ≤ t) = 1 − P(Z1 > t) = 1 − P[X(t) = 0] = 1 − e^(−λt)
Interarrival Times for a Poisson Counting Process
Conditioning on the first interarrival time,
P(Z2 > t) = ∫ P(Z2 > t | Z1 = τ) f1(τ) dτ = ∫ P[X(t + τ) − X(τ) = 0] f1(τ) dτ = e^(−λt)
which indicates that Z2 is also exponential with parameter λ and is independent of Z1.
Repeating the same argument, we conclude that Z1, Z2, … are IID exponential RVs with parameter λ.
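This can be checked numerically. One standard construction (used here as an assumption, not the only one) generates a Poisson process on (0, T] by drawing N ~ Poisson(λT) points uniformly on the interval; the gaps between the sorted points should then look exponential, with mean ≈ 1/λ and, since the exponential's standard deviation equals its mean, std ≈ 1/λ as well.

```python
import numpy as np

# Poisson process on (0, T] via the uniform-order-statistics construction:
# draw N ~ Poisson(lam*T) event times uniformly, sort, and examine the
# interarrival gaps.  lam and T are illustrative choices.
rng = np.random.default_rng(6)
lam, T = 2.0, 50_000.0
n = rng.poisson(lam * T)                       # number of events in (0, T]
times = np.sort(rng.uniform(0.0, T, size=n))   # arrival times T_1 < T_2 < ...
gaps = np.diff(times)                          # interarrival times Z_n

mean_gap = gaps.mean()                         # ~ 1/lam = 0.5
std_gap = gaps.std()                           # ~ 1/lam (exponential: std = mean)
print(mean_gap, std_gap)
```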
Arrival Times for a Poisson Counting Process
Let Tn = Z1 + Z2 + ⋯ + Zn denote the time of the n-th event of a Poisson process X(t) with rate λ.
Tn is a gamma RV with parameters (n, λ).
We know that the Zn are IID exponential RVs with parameter λ, and it can be proved that the sum of n IID exponential RVs with parameter λ is a gamma RV with parameters (n, λ).
Example: Semirandom telegraph signal
Determine the mean and covariance of Y(t)
Bernoulli Process
• Let X1, X2, … be independent Bernoulli RVs with P(Xn = 1) = p and P(Xn = 0) = q = 1 − p for all n.
• The collection of RVs {X(n), n ≥ 1} is a random process, and it is called a Bernoulli process.
• A sample sequence of the Bernoulli process can be obtained by
tossing a coin consecutively
➢ If a head appears, we assign 1,
➢ If a tail appears, we assign 0.
Random Walk
• Let Z1, Z2, … be independent RVs with P(Zn = 1) = p and P(Zn = −1) = q = 1 − p for all n.
• Let X(n) = Z1 + Z2 + ⋯ + Zn. The collection of RVs {X(n), n ≥ 1} is a random process, and it is called a simple random walk.
Random Walk
Repeat the same coin-tossing exercise as for the Bernoulli process
Homework: Find the mean, variance and autocorrelation of a simple random walk
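As a numerical companion to the homework (without giving the derivation away): since X(n) is a sum of n IID ±1 steps, the textbook moments E[X(n)] = n(2p − 1) and Var[X(n)] = 4np(1 − p) can be checked by simulation. The values p = 0.6, n = 50, and the ensemble size are illustrative choices.

```python
import numpy as np

# Monte Carlo check of the simple random walk X(n) = Z_1 + ... + Z_n with
# P(Z = 1) = p, P(Z = -1) = 1 - p:
#   E[X(n)] = n(2p - 1),  Var[X(n)] = 4np(1 - p).
rng = np.random.default_rng(7)
p, n, n_paths = 0.6, 50, 50_000
steps = np.where(rng.random((n_paths, n)) < p, 1, -1)   # +-1 steps
X_n = steps.sum(axis=1)                                 # X(n) for each path

mean_est, var_est = X_n.mean(), X_n.var()
mean_th, var_th = n * (2 * p - 1), 4 * n * p * (1 - p)  # 10 and 48
print(mean_est, var_est)   # close to the theoretical 10 and 48
```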
Wiener Process
A random process {X(t), t > 0} is called a Wiener process if
1. X(t) has stationary independent increments
2. The increment X(t) − X(s) (t > s) is normally distributed
3. E[X(t)] = 0
4. X(0) = 0
The Wiener process is also known as the Brownian motion process, since it originated as a model for Brownian motion, the motion of particles suspended in a fluid.
Wiener Process
A Wiener process X(t) has stationary independent increments in which the
increment 𝑋(𝑡)- 𝑋 𝑠 (𝑡 > 𝑠) is normally distributed with:
1. E𝑋 𝑡 =0
2. VAR 𝑋 𝑡 = 𝜎 2𝑡
3. When 𝜎 2 =1, X(t) is called a STANDARD Wiener process
The autocorrelation function of Wiener process 𝑅𝑥 𝑡, 𝑠 = 𝜎 2 min(t, s)
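Both properties are easy to check by building paths from independent Gaussian increments (a minimal sketch; σ, the time step, and the ensemble size are illustrative assumptions):

```python
import numpy as np

# A Wiener path can be built from independent Gaussian increments:
# X(t + dt) = X(t) + N(0, sigma^2 * dt).  Across an ensemble of paths,
# check Var[X(t)] ~ sigma^2 * t and R_x(t, s) ~ sigma^2 * min(t, s).
# sigma, dt, and the sizes are illustrative choices.
rng = np.random.default_rng(8)
sigma, dt, n_steps, n_paths = 1.5, 0.01, 200, 20_000
dW = rng.normal(0.0, sigma * np.sqrt(dt), size=(n_paths, n_steps))
X = np.cumsum(dW, axis=1)             # X(t) at t = dt, 2*dt, ..., 2.0

t, s = 2.0, 1.0                       # grid indices 199 and 99
var_t = X[:, 199].var()               # ~ sigma^2 * t    = 4.5
R_ts = np.mean(X[:, 199] * X[:, 99])  # ~ sigma^2 * min(t, s) = 2.25
print(var_t, R_ts)
```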
Wiener Process with Drift
1. X(t) has stationary independent increments
2. The increment X(t) − X(s) (t > s) is normally distributed
3. E[X(t)] = μt
4. X(0) = 0
The pdf of a standard Wiener process with drift coefficient μ is given by:
f_X(t)(x) = (1/√(2πt)) e^(−(x − μt)²/(2t))
Other processes related to Wiener
• Brownian Motion: 𝐵 𝑡 = 0.5𝑊 𝑡
• Brownian Bridge: X(t)= 𝐵 𝑡 – t𝐵(1)
• Geometric Brownian Motion : 𝐺 𝑡 = 𝑒 𝜇𝑡+𝜎𝑊(𝑡)
Convergence of Random Process
Definitions:
Sequence of RVs: {Xn, n ≥ 1}, n ∈ N, with E[Xn²] < ∞
Mean squared error: MSE(Xn, X) := E[(Xn − X)²]
Limit in mean square: l.i.m. as n → ∞ of Xn = X
is the same as stating that lim as n → ∞ of MSE(Xn, X) = 0