Smart Antennas for Wireless Communication
Principles of Random Variables and Processes
2025
Mustefa Badri
Contents
Definition of Random Variables
Probability Density Functions
Expectation and Moments
Common Probability Density Functions
Stationarity and Ergodicity
Autocorrelation and Power Spectral Density
Correlation Matrix
Definition of Random Variables
A random variable is a function that describes all possible outcomes of an experiment.

In the context of communication systems, the received voltages, currents, phases, time delays, and angles-of-arrival tend to be random variables.

Random variables can be either discrete or continuous.

A random variable is discrete if it can take on only a finite number of values during an observation interval.
Ex: The arrival angle for indoor multipath propagation.

A random variable is continuous if it can take on a continuum of values during an observation interval.
Ex: The voltage associated with receiver noise, or the phase of an arriving signal.

The behavior of random variables is best described using probability density functions (pdf).
Probability Density Functions
Every random variable x is characterized by a probability density function
p(x).
The probability density function (pdf) is established after a large number of
measurements have been performed, which determine the likelihood of all
possible values of x.
The probability that x will take on a range of values between two limits x_1 and x_2 is defined by

P(x_1 < x \le x_2) = \int_{x_1}^{x_2} p(x)\,dx

There are two important properties of pdfs. First, no event can have a negative probability. Thus

p(x) \ge 0 \quad \text{for all } x

Second, the probability that an x value exists somewhere over its range of values is certain. Thus

\int_{-\infty}^{\infty} p(x)\,dx = 1
Expectation and Moments
The statistical average is defined as the expected value, denoted by E. Thus, the expected value of x is defined as

E[x] = \int_{-\infty}^{\infty} x\,p(x)\,dx

Not only can we find the expected value of x, but we can also find the expected value of any function of x:

E[g(x)] = \int_{-\infty}^{\infty} g(x)\,p(x)\,dx

The expected value of x is typically called the first moment, denoted m_1. The nth moment, denoted m_n, is

m_n = E[x^n] = \int_{-\infty}^{\infty} x^n\,p(x)\,dx

The spreading about the first moment is called the variance and is defined as

\sigma^2 = E[(x - m_1)^2] = m_2 - m_1^2

The standard deviation, denoted by σ, is defined as the spread about the mean; thus

\sigma = \sqrt{E[(x - m_1)^2]}

The calculation of multiple moments is simplified by using the moment generating function, defined as

\Phi(s) = E[e^{sx}] = \int_{-\infty}^{\infty} e^{sx}\,p(x)\,dx, \qquad m_n = \left.\frac{d^n \Phi(s)}{ds^n}\right|_{s=0}

The moment generating function resembles the Laplace transform of the pdf.
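The sample-based sketch below (Python/NumPy; the Gaussian parameters are assumed for illustration) confirms that the sample moments satisfy sigma^2 = m_2 - m_1^2, and that differentiating an estimate of the moment generating function at s = 0 recovers m_1.

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.normal(loc=2.0, scale=3.0, size=1_000_000)  # m1 = 2, sigma = 3

    m1 = np.mean(x)               # first moment E[x]
    m2 = np.mean(x**2)            # second moment E[x^2]
    var = m2 - m1**2              # variance = m2 - m1^2
    print(m1, var, np.sqrt(var))  # ~2, ~9, ~3

    # Moment generating function Phi(s) = E[e^{sx}], estimated from samples;
    # its first derivative at s = 0 gives m1 (central finite difference).
    Phi = lambda s: np.mean(np.exp(s * x))
    h = 1e-4
    print((Phi(h) - Phi(-h)) / (2 * h))  # ~m1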
Common Probability Density Functions
These pdfs describe the characteristics of the receiver noise, the arriving multipath signals, and the distribution of the phase, envelope, and power of arriving signals.
Gaussian density
The Gaussian distribution generally defines the behavior of noise in receivers and also the nature of the random amplitudes of arriving multipath signals.

According to the central limit theorem, the sum of many independent random variables tends toward a Gaussian distribution as the number of terms increases.

The Gaussian density is defined as

p(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(x - m_1)^2 / (2\sigma^2)}
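A small simulation can make the central limit theorem concrete: summing N uniform random variables produces a histogram that closely follows the Gaussian density above. This is an illustrative Python/NumPy sketch; N, the sample size, and the bin count are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(2)
    # Sum N independent uniform variables; the sum tends toward Gaussian.
    N = 30
    s = rng.uniform(-0.5, 0.5, size=(100_000, N)).sum(axis=1)

    m1, sigma = np.mean(s), np.std(s)  # sample mean and std of the sums
    p_hat, edges = np.histogram(s, bins=80, density=True)
    c = 0.5 * (edges[:-1] + edges[1:])
    gauss = np.exp(-(c - m1)**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

    print(np.max(np.abs(p_hat - gauss)))  # small: histogram ~ Gaussian pdf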
Uniform density
The uniform distribution is normally attributed to the distribution of the random phase of propagating signals.

Not only does the phase delay tend to be uniformly distributed, but the angles of arrival of diverse propagating waves can often also take on a uniform distribution.

The uniform density is defined as

p(x) = \begin{cases} \dfrac{1}{x_2 - x_1}, & x_1 \le x \le x_2 \\ 0, & \text{otherwise} \end{cases}
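A brief numerical check (Python/NumPy sketch; the limits x1 = 0 and x2 = 2*pi model a random phase) that uniform samples have the mean (x1 + x2)/2 and variance (x2 - x1)^2/12 implied by this density:

    import numpy as np

    rng = np.random.default_rng(3)
    x1, x2 = 0.0, 2 * np.pi                  # e.g. a random phase in [0, 2*pi)
    phase = rng.uniform(x1, x2, size=1_000_000)

    print(np.mean(phase), (x1 + x2) / 2)     # ~pi
    print(np.var(phase), (x2 - x1)**2 / 12)  # ~(2*pi)^2 / 12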
Exponential density
The exponential density function is sometimes used to describe the angles of arrival of incoming signals.

It can also be used to describe the power distribution of a Rayleigh process.

The exponential density is the Erlang density with n = 1 [1] and is defined by

p(x) = \frac{1}{\bar{x}}\, e^{-x/\bar{x}}, \quad x \ge 0

where \bar{x} is the mean.
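The connection to a Rayleigh process can be illustrated numerically: the power (squared envelope) of two independent zero-mean Gaussian quadrature components is exponentially distributed with mean 2*sigma^2. This is a Python/NumPy sketch with assumed parameters.

    import numpy as np

    rng = np.random.default_rng(4)
    sigma = 1.0
    i = rng.normal(0, sigma, size=500_000)  # in-phase Gaussian component
    q = rng.normal(0, sigma, size=500_000)  # quadrature Gaussian component
    power = i**2 + q**2                     # power of a Rayleigh envelope

    # Compare the histogram with the exponential density of mean 2*sigma^2
    xbar = np.mean(power)                   # ~2*sigma^2
    p_hat, edges = np.histogram(power, bins=80, density=True)
    c = 0.5 * (edges[:-1] + edges[1:])
    print(np.max(np.abs(p_hat - np.exp(-c / xbar) / xbar)))  # small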
Rayleigh density
The Rayleigh probability density generally results when one finds the envelope of two independent zero-mean Gaussian processes.

This envelope can be found at the output of a linear filter whose inputs are Gaussian random variables.

Rayleigh distributions are normally attributed to the envelope of multipath signals when there is no direct path.

The Rayleigh density is defined as

p(x) = \frac{x}{\sigma^2}\, e^{-x^2/(2\sigma^2)}, \quad x \ge 0
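This can be verified directly in a short Python/NumPy sketch (sigma and the sample size are assumed): the envelope of two independent zero-mean Gaussian components follows the Rayleigh density above.

    import numpy as np

    rng = np.random.default_rng(5)
    sigma = 1.0
    i = rng.normal(0, sigma, size=500_000)
    q = rng.normal(0, sigma, size=500_000)
    r = np.sqrt(i**2 + q**2)                 # envelope of the two Gaussians

    p_hat, edges = np.histogram(r, bins=80, density=True)
    c = 0.5 * (edges[:-1] + edges[1:])
    rayleigh = (c / sigma**2) * np.exp(-c**2 / (2 * sigma**2))
    print(np.max(np.abs(p_hat - rayleigh)))  # small: envelope ~ Rayleigh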
Rician density
The Rician distribution is common for propagation channels in which a direct-path signal is added to the multipath signals.

The direct path inserts a nonrandom carrier, thereby modifying the Rayleigh distribution.

The Rician density is defined as

p(x) = \frac{x}{\sigma^2}\, e^{-(x^2 + A^2)/(2\sigma^2)}\, I_0\!\left(\frac{Ax}{\sigma^2}\right), \quad x \ge 0

where A is the amplitude of the nonrandom direct-path component and I_0(\cdot) is the modified Bessel function of the first kind and zero order.
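Adding a nonrandom direct-path amplitude A to one quadrature component turns the Rayleigh envelope into a Rician one, as the following sketch suggests (Python/NumPy with SciPy's i0; A and sigma are assumed values).

    import numpy as np
    from scipy.special import i0

    rng = np.random.default_rng(6)
    sigma, A = 1.0, 2.0                    # noise spread, direct-path amplitude
    ic = A + rng.normal(0, sigma, size=500_000)  # direct path + Gaussian
    qc = rng.normal(0, sigma, size=500_000)
    r = np.sqrt(ic**2 + qc**2)             # Rician envelope

    p_hat, edges = np.histogram(r, bins=80, density=True)
    c = 0.5 * (edges[:-1] + edges[1:])
    rician = (c / sigma**2) * np.exp(-(c**2 + A**2) / (2 * sigma**2)) \
             * i0(A * c / sigma**2)
    print(np.max(np.abs(p_hat - rician)))  # small: envelope ~ Rician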
Laplace density
The Laplace density function is generally attributed to the distribution of indoor or congested urban angles of arrival.

The Laplace density is given as

p(x) = \frac{1}{\sqrt{2}\,\sigma}\, e^{-\sqrt{2}\,|x|/\sigma}

Since the Laplace distribution is symmetric about the origin, the first moment is zero.

The second moment can be shown to be \sigma^2.
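A quick numerical check of these moments (Python/NumPy sketch; sigma is assumed, and NumPy's Laplace scale b corresponds to b = sigma/sqrt(2) in the density above):

    import numpy as np

    rng = np.random.default_rng(7)
    sigma = 1.5
    # NumPy parameterizes Laplace by scale b; here b = sigma / sqrt(2)
    x = rng.laplace(loc=0.0, scale=sigma / np.sqrt(2), size=1_000_000)

    print(np.mean(x))     # ~0 (first moment)
    print(np.mean(x**2))  # ~sigma^2 = 2.25 (second moment)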
Stationarity and Ergodicity
Stationarity
Stationary processes are ones in which the statistics of the random variables do not change at different times.

The time average of the random variable x can be written as

\langle x \rangle = \lim_{T\to\infty} \frac{1}{2T} \int_{-T}^{T} x(t)\,dt

or, for K discrete samples,

\langle x \rangle = \lim_{K\to\infty} \frac{1}{K} \sum_{k=1}^{K} x(k)

If all statistics of the random variable x do not change with time, the random process is said to be strict-sense stationary.

If the mean does not change with time and the autocorrelation depends only on the time difference, the process is said to be wide-sense stationary.

If x is wide-sense stationary, E[x] simplifies to

E[x(t)] = m_1, a constant independent of t.
Ergodicity

Ergodic processes are ones for which it is possible to estimate statistics such as the mean, variance, and autocorrelation from values measured in time.

In reality, the statistics might change for short blocks of time T but stabilize over longer blocks of time.

If by increasing T (or K) we can force the time-average estimate to converge to the statistical average, the process is said to be ergodic in the mean, or mean-ergodic. This can be written as

\lim_{T\to\infty} \frac{1}{2T} \int_{-T}^{T} x(t)\,dt = E[x]

or

\lim_{K\to\infty} \frac{1}{K} \sum_{k=1}^{K} x(k) = E[x]

If by increasing T (or K) we can force the variance estimate to converge to the statistical variance, the process is said to be ergodic in the variance, or variance-ergodic. This can be written as

\lim_{T\to\infty} \frac{1}{2T} \int_{-T}^{T} \left(x(t) - m_1\right)^2 dt = \sigma^2

or

\lim_{K\to\infty} \frac{1}{K} \sum_{k=1}^{K} \left(x(k) - m_1\right)^2 = \sigma^2
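The convergence of the time average to the statistical average can be observed numerically. A minimal Python/NumPy sketch, assuming a stationary Gaussian process with mean m1 = 1:

    import numpy as np

    rng = np.random.default_rng(8)
    m1 = 1.0                                   # statistical mean E[x]
    x = m1 + rng.normal(0, 1, size=1_000_000)  # one realization, K samples

    # Time-average estimate over growing block lengths K
    for K in (100, 10_000, 1_000_000):
        print(K, np.mean(x[:K]))               # converges toward m1 = 1.0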
Autocorrelation and Power Spectral Density
It is valuable to know how well a random variable correlates with itself at different points in time. That is, how does x at time t_1 correlate with x at time t_2?

This correlation is called an autocorrelation, since we are correlating x with itself. The autocorrelation is normally written as

R_x(t_1, t_2) = E[x(t_1)\,x(t_2)]

If the random variable x is a wide-sense stationary process, the autocorrelation can be rewritten as

R_x(\tau) = E[x(t)\,x(t + \tau)]

The autocorrelation value at τ = 0 is the second moment: R_x(0) = E[x^2] = m_2.

In practical systems, where we are constrained to process limited blocks of data, one is forced to estimate the autocorrelation using a time average. The estimate of the autocorrelation can therefore be defined as

\hat{R}_x(\tau) = \frac{1}{2T} \int_{-T}^{T} x(t)\,x(t + \tau)\,dt

or

\hat{R}_x(\tau) = \frac{1}{K} \sum_{k=1}^{K} x(k)\,x(k + \tau)

If increasing T (or K) forces the autocorrelation estimate to converge to the statistical autocorrelation, the process is said to be ergodic in the autocorrelation, or autocorrelation-ergodic. This can be written as

\lim_{T\to\infty} \hat{R}_x(\tau) = R_x(\tau)

The autocorrelation itself is a function of the time delay τ between two time-separated random variables. Thus, the autocorrelation is subject to Fourier analysis. The power spectral density is defined as the Fourier transform of the autocorrelation function:

S_x(\omega) = \int_{-\infty}^{\infty} R_x(\tau)\, e^{-j\omega\tau}\, d\tau

The units of the autocorrelation function for electrical systems are normally expressed in watts. Thus R_x(0) yields the average power of the random variable x.

The Fourier transform pair S_x and R_x is frequently referred to as the Wiener-Khinchin pair.
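The time-average autocorrelation estimate and the Wiener-Khinchin relation can be sketched numerically (Python/NumPy; the white Gaussian process, lag range, and sample size are assumed for illustration):

    import numpy as np

    rng = np.random.default_rng(9)
    K = 100_000
    x = rng.normal(0, 1, size=K)  # white, wide-sense stationary process

    # Time-average estimate of the autocorrelation R_x(tau)
    def autocorr(x, max_lag):
        return np.array([np.mean(x[:len(x) - t] * x[t:])
                         for t in range(max_lag)])

    R = autocorr(x, max_lag=64)
    print(R[0])                   # R_x(0) ~ average power (here ~1 W)

    # PSD as the Fourier transform of the (even) autocorrelation sequence
    R_full = np.concatenate([R, R[-2:0:-1]])  # exploit R_x(-tau) = R_x(tau)
    S = np.real(np.fft.fft(R_full))
    print(S[:4])                  # ~flat (white) spectrum near 1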
Correlation Matrix
The correlation matrix extends the autocorrelation to the vector \bar{x}(k) of signals received across the antenna elements and is defined as

R_{xx} = E[\bar{x}\,\bar{x}^H]

or, when estimated from K snapshots by a time average,

\hat{R}_{xx} = \frac{1}{K} \sum_{k=1}^{K} \bar{x}(k)\,\bar{x}^H(k)

where (\cdot)^H denotes the Hermitian (conjugate) transpose.
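A minimal Python/NumPy sketch of the snapshot estimate (the array size, steering vector with half-wavelength spacing, signal, and noise level are all illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(10)
    M, K = 4, 10_000                 # 4-element array, 10k snapshots
    theta = np.deg2rad(30)           # assumed angle of arrival
    a = np.exp(1j * np.pi * np.arange(M) * np.sin(theta))  # steering vector

    s = rng.normal(size=K) + 1j * rng.normal(size=K)       # random signal
    n = 0.1 * (rng.normal(size=(M, K)) + 1j * rng.normal(size=(M, K)))
    X = np.outer(a, s) + n           # M x K matrix of array snapshots

    # Time-average estimate: Rxx ~ (1/K) * sum_k x(k) x(k)^H
    Rxx = (X @ X.conj().T) / K
    print(Rxx.shape)                 # (4, 4), Hermitian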
The End