Transformation of multiple variables (Ignacio López)

FUNCTIONS AND TRANSFORMATIONS OF RANDOM VARIABLES


THE RANDOM VARIABLE

Random Variable. To review, it is a numerical variable whose value cannot be predicted with certainty before the experiment occurs, and whose behavior is described by a probability law. Let $\varepsilon$ be an experiment and $S$ its associated sample space. A function $X$ that assigns a real number $X(s)$ to each element $s \in S$ is called a random variable.

Discrete Random Variable. If the number of possible values of $X$ is finite or countably infinite, the values of $X$ can be listed as $x_1, \dots, x_n, \dots$, and $X$ is called a discrete random variable.

Let $X$ be a discrete random variable; then its range $R_X$ consists of the values $x_1, \dots, x_n, \dots$ With each possible outcome $x_i$ we associate a number $p(x_i) = P(X = x_i)$, called the probability of $x_i$, which satisfies

$$p(x_i) \ge 0 \;\; \forall i, \qquad \sum_{i=1}^{\infty} p(x_i) = 1$$

where $p$ is the probability function of the random variable $X$. The collection of pairs $(x_i, p(x_i))$, $i = 1, 2, \dots$, is called the probability distribution of $X$.

Given the random variables $X$ and $Y$, two cases occur:

- Case 1: If $X$ is a discrete random variable and $Y = H(X)$, then $Y$ is a discrete random variable.
- Case 2: If $X$ is a continuous random variable and $Y = H(X)$ is a discrete random variable, the probability distribution of $Y$ is determined from the equivalent event, in the range of $X$, that corresponds to each value of $Y$, provided the probability density function of $X$ is known. In general, if $\{Y = y_i\}$ is equivalent to an event $A$ in the range of $X$, then

$$q(y_i) = P(Y = y_i) = \int_A f(x)\,dx$$

Continuous Random Variable. $X$ is a continuous random variable if there exists a function $f$, the probability density function of $X$, that satisfies

$$f(x) \ge 0 \;\; \forall x, \qquad \int_{-\infty}^{\infty} f(x)\,dx = 1$$

Then, for any $a$ and $b$ such that $a \le b$, we have

$$P(a \le X \le b) = \int_a^b f(x)\,dx$$
a

Functions of Random Variables. Let $B$ and $C$ be equivalent events. If $B$ is the set of values of $X$ such that $H(x) \in C$, then

$$B = \{x \in R_X : H(x) \in C\}, \qquad B \subset R_X$$

Let $X$ be a continuous random variable with probability density function $f$ and let $H$ be a continuous function; then $Y = H(X)$ is a continuous random variable whose density function $g$ is obtained with the following steps (a sketch of the procedure in code follows the list):

- compute $G(y) = P(Y \le y)$ by expressing $\{Y \le y\}$ as an equivalent event in terms of $X$,
- differentiate $G(y)$ with respect to $y$ to obtain $g(y)$,
- find the values of $y$ in the range of $Y$ for which $g(y) > 0$.
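As an illustration, the three steps can be carried out symbolically. A minimal sketch, assuming $X$ uniform on $(0, 1)$ and $H(x) = x^2$ (both are illustrative choices, not taken from the text):

```python
# CDF technique sketch: X ~ Uniform(0, 1) and Y = X**2 (illustrative choices).
import sympy as sp

x, y = sp.symbols("x y", positive=True)
f = sp.Integer(1)  # density of X on (0, 1)

# Step 1: G(y) = P(Y <= y) = P(X <= sqrt(y)) for 0 < y < 1
G = sp.integrate(f, (x, 0, sp.sqrt(y)))

# Step 2: differentiate G(y) to obtain g(y)
g = sp.diff(G, y)
print(G, g)  # sqrt(y)  1/(2*sqrt(y))

# Step 3: g(y) = 1/(2*sqrt(y)) > 0 exactly on 0 < y < 1
```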

Let $X$ be a continuous random variable with probability density function $f$, with $f(x) > 0$ for $a < x < b$. Let $y = H(x)$ be strictly monotonic and differentiable for all $x$; then the random variable $Y = H(X)$ has the probability density function

$$g(y) = f(x)\left|\frac{dx}{dy}\right|$$
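A quick numerical sanity check of this formula, under the illustrative assumptions $X \sim \text{Exponential}(1)$ and the monotone map $Y = X^2$ on $(0, \infty)$, so that $x = \sqrt{y}$, $|dx/dy| = 1/(2\sqrt{y})$, and $g(y) = e^{-\sqrt{y}}/(2\sqrt{y})$:

```python
# Compare the predicted density g(y) = exp(-sqrt(y))/(2*sqrt(y)) with a
# Monte Carlo estimate of P(1 <= Y <= 4) for Y = X**2, X ~ Exponential(1).
import numpy as np
from scipy.integrate import quad

rng = np.random.default_rng(0)
y_samples = rng.exponential(1.0, 1_000_000) ** 2

g = lambda y: np.exp(-np.sqrt(y)) / (2 * np.sqrt(y))
a, b = 1.0, 4.0
print(quad(g, a, b)[0])                              # ~0.2325 (= e**-1 - e**-2)
print(np.mean((y_samples >= a) & (y_samples <= b)))  # close to the same value
```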

Let $X$ be a random variable on the sample space $S$, let $R_X$ be the range of $X$, and let $H$ be a real-valued function, so that $Y = H(X)$ is a random variable with range $R_Y$. For any event $C \subset R_Y$ one defines

$$P(C) = P(\{x \in R_X : H(x) \in C\})$$

Therefore, the probability of an event associated with the range of $Y$ is the probability of the equivalent event, expressed as a function of $X$.

Bivariate Random Variables. Let there be an experiment with associated sample space $S$, and functions $X_1 = X_1(s), \dots, X_n = X_n(s)$. In particular, if $X = X(s)$ and $Y = Y(s)$ are two functions that assign a real number to each outcome $s \in S$, then $(X, Y)$ is a two-dimensional random variable.

In the discrete case, $(X, Y)$ has finitely or countably infinitely many possible values. In the continuous case, $(X, Y)$ can take all values in a non-countable set of the Euclidean plane.

Let $(X, Y)$ be a two-dimensional discrete random variable. With each possible outcome $(x_i, y_j)$ we associate a number $p(x_i, y_j)$ that represents $P(X = x_i, Y = y_j)$ and satisfies

$$p(x_i, y_j) \ge 0 \;\; \forall (x, y), \qquad \sum_{j=1}^{\infty}\sum_{i=1}^{\infty} p(x_i, y_j) = 1$$

The function $p$, defined for all $(x_i, y_j)$ in the range of $(X, Y)$, is the probability function of $(X, Y)$, and the triplet $(x_i, y_j, p(x_i, y_j))$ is the probability distribution of $(X, Y)$.

Let $(X, Y)$ be a continuous random variable taking all its values in a region $R$ of the Euclidean plane; the joint probability density function $f$ is the function that satisfies

$$f(x, y) \ge 0 \;\; \forall (x, y) \in R, \qquad \iint_R f(x, y)\,dx\,dy = 1$$

The cumulative distribution function $F$ of the random variable $(X, Y)$ is defined by

$$F(x, y) = P(X \le x, Y \le y)$$

If $F$ is the cumulative distribution function of a two-dimensional variable with joint probability density function $f$, then

$$f(x, y) = \frac{\partial^2 F(x, y)}{\partial x\,\partial y}$$
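This identity can be checked symbolically. A minimal sketch, assuming the illustrative joint density $f(x, y) = e^{-x-y}$ for $x, y > 0$ (not a density from the text):

```python
# Integrate f to get F(x, y) = P(X <= x, Y <= y), then check that the mixed
# second partial derivative of F recovers f.
import sympy as sp

x, y, u, v = sp.symbols("x y u v", positive=True)
f = sp.exp(-u - v)                          # illustrative joint density

F = sp.integrate(f, (u, 0, x), (v, 0, y))   # (1 - e**-x)(1 - e**-y)
print(sp.simplify(sp.diff(F, x, y) - sp.exp(-x - y)))   # 0
```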
Cumulative Distribution Function. Let $X$ be a discrete or continuous random variable; $F$, the cumulative distribution function of $X$, is

$$F(x) = P(X \le x)$$

If $X$ is a discrete random variable,

$$F(x) = \sum_{x_j \le x} p(x_j)$$

If $X$ is a continuous random variable,

$$F(x) = \int_{-\infty}^{x} f(s)\,ds$$

$F$ is non-decreasing; that is, if $x_1 \le x_2$, then $F(x_1) \le F(x_2)$. Also,

$$\lim_{x \to -\infty} F(x) = 0, \qquad \lim_{x \to \infty} F(x) = 1$$

In the continuous case it also holds that $f(x) = \frac{d}{dx}F(x)$.

In the discrete case, with values $x_1, x_2, \dots$ ordered so that $x_1 < x_2 < x_3 < \dots$, if $F$ is the cumulative distribution function, then

$$p(x_i) = P(X = x_i) = F(x_i) - F(x_{i-1})$$
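This last relation is easy to verify numerically. A minimal sketch, using $\text{Binomial}(4, 1/2)$ as an illustrative discrete distribution (it reappears in the coin-toss example below):

```python
# Recover a discrete pmf from its CDF via p(x_i) = F(x_i) - F(x_{i-1}).
from scipy.stats import binom

F = binom(4, 0.5).cdf
for x in range(5):
    print(x, F(x) - F(x - 1))   # 0.0625, 0.25, 0.375, 0.25, 0.0625
```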

Discrete Probability Mass Function (PMF). $p_X(x) = P[X = x]$ is a probability law that fulfills three axioms:
- $0 \le p_X(x) \le 1$ for all $x$
- $\sum_{x_i} p_X(x_i) = 1$
- $P[a \le X \le b] = \sum_{a \le x_i \le b} p_X(x_i)$

Cumulative Distribution Function (CDF). $F_X(x)$ is the probability of the event that the random variable takes values less than or equal to $x$:

$$F_X(x) = P[X \le x] = \sum_{x_i \le x} p_X(x_i) \quad \text{in the discrete case}$$

In the continuous case,

$$P[x_1 \le X \le x_2] = \int_{x_1}^{x_2} f_X(x)\,dx, \qquad F_X(x) = P[X \le x] = \int_{-\infty}^{x} f_X(u)\,du$$

and it is known that

$$f_X(x) = \frac{dF_X(x)}{dx} = \frac{d}{dx}\int_{-\infty}^{x} f_X(u)\,du$$

Properties:
- $0 \le F_X(x) \le 1$
- $F_X(-\infty) = 0$ and $F_X(\infty) = 1$
- $F_X(x + \varepsilon) \ge F_X(x)$ for any $\varepsilon > 0$
- $F_X(x_2) - F_X(x_1) = P[x_1 < X \le x_2]$

The CDF is a monotonically non-decreasing function.

Joint Distribution of Random Variables.

Joint PMF: when two or more variables behave jointly,

$$p_{XY}(x, y) = P[(X = x) \cap (Y = y)]$$

$$F_{XY}(x, y) = \sum_{x_i \le x}\sum_{y_j \le y} p_{XY}(x_i, y_j), \; \text{which equals } P[(X \le x) \cap (Y \le y)]$$

Marginal PMF: the behavior of one variable without considering the other. For the random variable $X$:

$$p_X(x) = P[X = x] = \sum_{y_i} p_{XY}(x, y_i)$$

$$F_X(x) = P[X \le x] = \sum_{x_i \le x} p_X(x_i) = F_{XY}(x, \infty) = \sum_{x_i \le x}\sum_{y_j} p_{XY}(x_i, y_j)$$

Similarly, this is done for the random variable $Y$.

Conditional PMF: if the value of one of the random variables is known, say $Y = y_0$, the relative probabilities of the different values of the other variable are given by $p_{XY}(x, y_0)$; there is a conditional probability mass function of $X$ given $Y$:

$$p_{X/Y}(x, y) = P[X = x \mid Y = y] = \frac{P[(X = x) \cap (Y = y)]}{P[Y = y]}$$

which is equivalent to

$$p_{X/Y}(x, y) = \frac{p_{XY}(x, y)}{p_Y(y)} = \frac{p_{XY}(x, y)}{\sum_{x_i} p_{XY}(x_i, y)}$$

It also fulfills the axioms stated earlier:

$$0 \le p_{X/Y} \le 1 \;\; \forall y, x, \qquad \sum_{x_i} p_{X/Y}(x_i, y) = 1$$

The same holds for $Y$ given $X$.

Joint PMF from Marginal and Conditional Probabilities (a sketch in code follows):

$$p_{XY}(x, y) = p_{X/Y}(x, y)\,p_Y(y) = p_{Y/X}(y, x)\,p_X(x)$$
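A minimal sketch of the marginal, conditional, and reconstruction formulas above, reusing the illustrative 2 x 2 joint pmf introduced earlier (values chosen for illustration only):

```python
# Marginal and conditional pmfs from a joint pmf, then reconstruction of
# p_xy(x, y) as p_{x/y}(x, y) * p_y(y).
from collections import defaultdict

joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

p_y = defaultdict(float)
for (x, y), p in joint.items():
    p_y[y] += p                                  # marginal of Y: sum over x

cond = {(x, y): p / p_y[y] for (x, y), p in joint.items()}   # p_{x/y}

for (x, y), p in joint.items():                  # reconstruction check
    assert abs(cond[(x, y)] * p_y[y] - p) < 1e-12
print(dict(p_y))                                 # {0: 0.4, 1: 0.6}
```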
For continuous variables, the joint probability density function gives

$$P[x_1 \le X \le x_2,\; y_1 \le Y \le y_2] = \int_{x_1}^{x_2}\int_{y_1}^{y_2} f_{XY}(x, y)\,dy\,dx$$

The joint probability density function satisfies

$$f_{XY}(x, y) \ge 0 \quad \text{and} \quad \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{XY}(x, y)\,dx\,dy = 1$$

And the cumulative distribution,

$$F_{XY}(x, y) = P[(X \le x) \cap (Y \le y)] = \int_{-\infty}^{x}\int_{-\infty}^{y} f_{XY}(x_0, y_0)\,dy_0\,dx_0$$

from which

$$f_{XY}(x, y) = \frac{\partial^2 F_{XY}(x, y)}{\partial x\,\partial y}$$

TECHNIQUE OF VARIABLE TRANSFORMATION

One Variable. The idea is to apply the first method without first developing the distribution function. In the discrete case there is no real problem, as long as $X$ and $Y = u(X)$ are related one to one; the substitution must therefore be appropriate.

For example, let $X$ be the number of heads obtained in 4 tosses of a coin. Determine the probability distribution of $Y = (1 + X)^{-1}$. (A numerical check follows the solution.)

Solution. Applying the binomial with $n = 4$ and $p = 1/2$, the probability distribution of $X$ is given by

x      0     1     2     3     4
f(x)   1/16  4/16  6/16  4/16  1/16

Then, applying $y = (1 + x)^{-1}$ to replace the values of $X$ by those of $Y$, we have

y      1     1/2   1/3   1/4   1/5
g(y)   1/16  4/16  6/16  4/16  1/16

Or, equivalently, substituting $x = (1/y) - 1$ in

$$f(x) = \binom{4}{x}\left(\frac{1}{2}\right)^4 \quad \text{for } x = 0, 1, 2, 3, 4$$

we obtain

$$g(y) = f\!\left(\frac{1}{y} - 1\right) = \binom{4}{\frac{1}{y} - 1}\left(\frac{1}{2}\right)^4 \quad \text{for } y = 1,\, 1/2,\, 1/3,\, 1/4,\, 1/5$$
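The table of $g(y)$ can be reproduced directly. A minimal sketch pushing the binomial pmf through the transformation (exact arithmetic via fractions):

```python
# Push the Binomial(4, 1/2) pmf of X through y = 1/(1 + x) and collect
# the induced pmf of Y; the support and probabilities match the table.
from fractions import Fraction
from math import comb

f = {x: Fraction(comb(4, x), 16) for x in range(5)}
g = {Fraction(1, 1 + x): p for x, p in f.items()}
for yv, p in sorted(g.items(), reverse=True):
    print(yv, p)   # 1: 1/16, 1/2: 4/16, 1/3: 6/16, 1/4: 4/16, 1/5: 1/16
```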

Theorem. $g(y) = f(w(y))\,|w'(y)|$, where $x = w(y)$ is the inverse of $y = u(x)$, whenever $u'(x) \ne 0$.

For example, if the double arrow in the figure is made to oscillate, the random variable $\theta$ has the density

$$f(\theta) = \begin{cases} 1/\pi & -\pi/2 < \theta < \pi/2 \\ 0 & \text{otherwise} \end{cases}$$

Find the probability density of $X$, the abscissa on the $x$-axis at which the arrow points.

Solution. The relation between $x$ and $\theta$ is $x = a\tan\theta$, so

$$\frac{d\theta}{dx} = \frac{a}{a^2 + x^2}$$

and it follows that

$$g(x) = \frac{1}{\pi}\cdot\frac{a}{a^2 + x^2} \quad \text{for } -\infty < x < \infty$$

Example. If $X$ has a $N(0, 1)$ distribution, determine the probability density of $Z = X^2$.

Solution. Work with $Y = |X|$, whose density is $g(y) = \frac{2}{\sqrt{2\pi}}\,e^{-y^2/2}$ for $y > 0$. Since $z = y^2$ gives $y = z^{1/2}$ and $\frac{dy}{dz} = \frac{1}{2}z^{-1/2}$, then

$$h(z) = g\!\left(z^{1/2}\right)\left|\frac{dy}{dz}\right| = \frac{2}{\sqrt{2\pi}}\,e^{-z/2}\cdot\frac{1}{2\sqrt{z}} = \frac{e^{-z/2}}{\sqrt{2\pi z}} \quad \text{for } z > 0$$
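The derived $h(z)$ is the chi-square density with one degree of freedom, which can be confirmed against scipy:

```python
# Compare h(z) = exp(-z/2)/sqrt(2*pi*z) with scipy's chi-square pdf, df = 1.
import numpy as np
from scipy.stats import chi2

z = np.linspace(0.1, 5.0, 50)
h = np.exp(-z / 2) / np.sqrt(2 * np.pi * z)
print(np.allclose(h, chi2.pdf(z, df=1)))   # True
```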

Two Variables. This is the previous method, but considering two variables. Let the joint distribution be $f(x_1, x_2)$ and $Y = u(X_1, X_2)$; if the relation between $y$ and $x_1$ (holding $x_2$ constant), and between $y$ and $x_2$ (holding $x_1$ constant), is one to one, then

$$g(y, x_2) = f(x_1, x_2)\left|\frac{\partial x_1}{\partial y}\right|, \qquad g(x_1, y) = f(x_1, x_2)\left|\frac{\partial x_2}{\partial y}\right|$$

Example. Let $X_1$ and $X_2$ be independent random variables having Poisson distributions with parameters $\lambda_1$ and $\lambda_2$; determine the probability distribution of the random variable $Y = X_1 + X_2$.

Solution. By the independence of $X_1$ and $X_2$, the joint function is

$$f(x_1, x_2) = \frac{e^{-\lambda_1}\lambda_1^{x_1}\;e^{-\lambda_2}\lambda_2^{x_2}}{x_1!\,x_2!}$$

for $x_1 = 0, 1, \dots$ and $x_2 = 0, 1, \dots$ Since $y = x_1 + x_2$, we have $x_1 = y - x_2$, so

$$g(y, x_2) = f(y - x_2, x_2) = \frac{e^{-(\lambda_1 + \lambda_2)}\,\lambda_1^{y - x_2}\lambda_2^{x_2}}{(y - x_2)!\;x_2!}$$

and, summing over $x_2$,

$$h(y) = \sum_{x_2 = 0}^{y} \frac{e^{-(\lambda_1 + \lambda_2)}\,\lambda_1^{y - x_2}\lambda_2^{x_2}}{x_2!\,(y - x_2)!} = \frac{e^{-(\lambda_1 + \lambda_2)}(\lambda_1 + \lambda_2)^y}{y!}$$

by the binomial theorem, so $Y$ again has a Poisson distribution, with parameter $\lambda_1 + \lambda_2$.
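A quick simulation check of the convolution result, with the illustrative parameters $\lambda_1 = 1.5$ and $\lambda_2 = 2.5$:

```python
# Empirical pmf of X1 + X2 versus the Poisson(l1 + l2) pmf.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(2)
l1, l2 = 1.5, 2.5
y = rng.poisson(l1, 500_000) + rng.poisson(l2, 500_000)

for k in range(6):
    print(k, round(float(np.mean(y == k)), 4), round(poisson.pmf(k, l1 + l2), 4))
```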

Theorem. Let $f(x_1, x_2)$ be the joint distribution function of the random variables $X_1$ and $X_2$, and let

$$y_1 = u_1(x_1, x_2), \qquad y_2 = u_2(x_1, x_2)$$

Then it is possible to make the transformation

$$g(y_1, y_2) = f\big(w_1(y_1, y_2),\, w_2(y_1, y_2)\big)\,|J|$$

where $w_1$ and $w_2$ are the solutions of the simultaneous equations $y_1 = u_1(x_1, x_2)$, $y_2 = u_2(x_1, x_2)$ for $x_1$ and $x_2$, and $J$ is the Jacobian of the transformation:

$$J = \begin{vmatrix} \dfrac{\partial x_1}{\partial y_1} & \dfrac{\partial x_1}{\partial y_2} \\[6pt] \dfrac{\partial x_2}{\partial y_1} & \dfrac{\partial x_2}{\partial y_2} \end{vmatrix}$$
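A minimal symbolic sketch of the Jacobian, for the illustrative transformation $y_1 = x_1 + x_2$, $y_2 = x_1 - x_2$ (inverted to $x_1 = (y_1 + y_2)/2$, $x_2 = (y_1 - y_2)/2$):

```python
# Jacobian of the inverse map (w1, w2); for this linear example |J| = 1/2.
import sympy as sp

y1, y2 = sp.symbols("y1 y2")
x1 = (y1 + y2) / 2          # w1(y1, y2)
x2 = (y1 - y2) / 2          # w2(y1, y2)

J = sp.Matrix([[sp.diff(x1, y1), sp.diff(x1, y2)],
               [sp.diff(x2, y1), sp.diff(x2, y2)]]).det()
print(abs(J))               # 1/2
```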

MOMENT GENERATING FUNCTION TECHNIQUE

Theorem. If $X_1, X_2, \dots, X_n$ are independent random variables and $Y = X_1 + \dots + X_n$, then

$$M_Y(t) = \prod_{i=1}^{n} M_{X_i}(t)$$

where $M_{X_i}(t)$ is the value of the moment generating function of $X_i$ at $t$.

Example. Obtain the probability distribution of the sum of $n$ independent random variables $X_1, \dots, X_n$ that have Poisson distributions with parameters $\lambda_1, \dots, \lambda_n$, respectively.

Solution. We know that

$$M_{X_i}(t) = e^{\lambda_i (e^t - 1)}$$

and therefore, with $y = x_1 + \dots + x_n$,

$$M_Y(t) = \prod_{i=1}^{n} e^{\lambda_i (e^t - 1)} = e^{(\lambda_1 + \dots + \lambda_n)(e^t - 1)}$$

which is the moment generating function of a Poisson distribution with parameter $\lambda_1 + \dots + \lambda_n$.
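The product-of-mgfs step can be reproduced symbolically for $n = 2$. A minimal sketch:

```python
# Combine two Poisson mgfs exp(l_i*(e^t - 1)) and factor the exponent,
# recovering the mgf of Poisson(l1 + l2).
import sympy as sp

t, l1, l2 = sp.symbols("t lambda1 lambda2")
M = sp.exp(l1 * (sp.exp(t) - 1)) * sp.exp(l2 * (sp.exp(t) - 1))

exponent = sp.powsimp(M).args[0]   # l1*(exp(t)-1) + l2*(exp(t)-1)
print(sp.factor(exponent))         # (lambda1 + lambda2)*(exp(t) - 1)
```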
