4. Random Vectors

1 The General Case: Joint cdf

Multivariate random variable: A $k$-dimensional random variable (or random vector) is a function with domain $S$ and codomain $\mathbb{R}^k$: $(X_1,\dots,X_k): s\in S \mapsto (X_1(s),\dots,X_k(s))\in\mathbb{R}^k$. The function $(X_1(s),\dots,X_k(s))$ is usually written for simplicity as $(X_1,\dots,X_k)$.

Remark: If $k=2$ we have a bivariate random variable or two-dimensional random variable $(X,Y): s\in S \mapsto (X(s),Y(s))\in\mathbb{R}^2$.

Joint cumulative distribution function: Let $(X,Y)$ be a bivariate random variable. The real function of two real variables with domain $\mathbb{R}^2$ defined by $F_{X,Y}(x,y)=P(X\le x, Y\le y)$ is the joint cumulative distribution function of the two-dimensional random variable $(X,Y)$.

Example 1.1 Random Experiment: Roll two different dice (one red and one green) and write down the number of dots on the upper face of each die.

Random vector: $(X_{red},X_{green})$, where $X_i$ is the number of dots on die $i$, with $i=$ red or green.

Some probabilities: $$P(X_{red}=2,\ X_{green}=4)=\frac{1}{36}$$ $$P(X_{red}+X_{green}>10)=\frac{3}{36}=\frac{1}{12}$$ $$P\left(\frac{X_{red}}{X_{green}}\in\{1,2\}\right)=P\left(\frac{X_{red}}{X_{green}}=1\right)+P\left(\frac{X_{red}}{X_{green}}=2\right)=\frac{6}{36}+\frac{3}{36}=\frac{1}{4}$$
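These values can be checked by brute-force enumeration of the 36 equally likely outcomes. The following Python sketch is only an illustration of the counting argument (the helper name `prob` is ad hoc, and the last event is written out explicitly as the ratio taking the value 1 or 2):

```python
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))  # all 36 (red, green) pairs

def prob(event):
    """Probability of an event under the uniform distribution on the 36 outcomes."""
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

print(prob(lambda o: o == (2, 4)))              # 1/36
print(prob(lambda o: o[0] + o[1] > 10))         # 1/12
print(prob(lambda o: o[0] / o[1] in (1, 2)))    # 1/4
```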

Example 1.2 Random experiment: Two different and fair coins are tossed once.

Random vector: $(X_1,X_2)$, where $X_i$ represents the number of heads obtained with coin $i$, with $i=1,2$.

Some probabilities: $$P(X_1=0,X_2=0)=P(X_1=0,X_2=1)=P(X_1=1,X_2=0)=P(X_1=1,X_2=1)=\frac{1}{4}$$ $$P(X_1+X_2\le 1)=\frac{3}{4}$$

Properties of the joint cumulative distribution function:

  • $0\le F_{X,Y}(x,y)\le 1$

  • $F_{X,Y}(x,y)$ is non-decreasing with respect to $x$ and $y$:

    • $\Delta x>0 \Rightarrow F_{X,Y}(x+\Delta x,y)\ge F_{X,Y}(x,y)$

    • $\Delta y>0 \Rightarrow F_{X,Y}(x,y+\Delta y)\ge F_{X,Y}(x,y)$

  • $\lim_{x\to-\infty}F_{X,Y}(x,y)=0$, $\lim_{y\to-\infty}F_{X,Y}(x,y)=0$ and

    $\lim_{x\to+\infty,\,y\to+\infty}F_{X,Y}(x,y)=1$

  • $P(x_1<X\le x_2,\ y_1<Y\le y_2)=F_{X,Y}(x_2,y_2)-F_{X,Y}(x_1,y_2)-F_{X,Y}(x_2,y_1)+F_{X,Y}(x_1,y_1)$.

  • $F_{X,Y}(x,y)$ is right-continuous with respect to $x$ and $y$: $\lim_{x\to a^+}F_{X,Y}(x,y)=F_{X,Y}(a,y)$ and $\lim_{y\to b^+}F_{X,Y}(x,y)=F_{X,Y}(x,b)$.

Example 1.3 Random experiment: Two different and fair coins are tossed once.

Random vector: $(X_1,X_2)$, where $X_i$ represents the number of heads obtained with coin $i$, with $i=1,2$.

Joint cumulative distribution function: $$F_{X_1,X_2}(x_1,x_2)=P(X_1\le x_1, X_2\le x_2)=\begin{cases}0, & x_1<0 \text{ or } x_2<0\\ \frac{1}{4}, & 0\le x_1<1,\ 0\le x_2<1\\ \frac{1}{2}, & 0\le x_1<1,\ x_2\ge 1\\ \frac{1}{2}, & 0\le x_2<1,\ x_1\ge 1\\ 1, & x_1\ge 1,\ x_2\ge 1\end{cases}$$
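This piecewise CDF is easy to code, and coding it gives a quick check of the rectangle property listed above: $P(0<X_1\le 1,\ 0<X_2\le 1)$ should equal $P(X_1=1,X_2=1)=\frac{1}{4}$. A minimal Python sketch, written only for illustration:

```python
from fractions import Fraction

def F(x1, x2):
    """Joint CDF of (X1, X2) for two fair coins (Example 1.3)."""
    if x1 < 0 or x2 < 0:
        return Fraction(0)
    if x1 >= 1 and x2 >= 1:
        return Fraction(1)
    if x1 >= 1 or x2 >= 1:      # exactly one of the arguments is >= 1
        return Fraction(1, 2)
    return Fraction(1, 4)        # 0 <= x1 < 1 and 0 <= x2 < 1

# Rectangle property: P(0 < X1 <= 1, 0 < X2 <= 1) = F(1,1) - F(0,1) - F(1,0) + F(0,0)
print(F(1, 1) - F(0, 1) - F(1, 0) + F(0, 0))  # 1/4
```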

2 The General Case: Marginal cdf

The (marginal) cumulative distribution functions of $X$ and $Y$ can be obtained from the joint cumulative distribution function of $(X,Y)$:

  • The marginal cumulative distribution function of $X$: $F_X(x)=P(X\le x)=P(X\le x,\ Y<+\infty)=\lim_{y\to+\infty}F_{X,Y}(x,y)$.

  • The marginal cumulative distribution function of $Y$: $F_Y(y)=P(Y\le y)=P(X<+\infty,\ Y\le y)=\lim_{x\to+\infty}F_{X,Y}(x,y)$.

Remark: The joint distribution uniquely determines the marginal distributions, but the converse is not true.

Example 2.1 Random experiment: Two different and fair coins are tossed once.

Random vector: $(X_1,X_2)$, where $X_i$ represents the number of heads obtained with coin $i$, with $i=1,2$.

Joint cumulative distribution function: $$F_{X_1,X_2}(x_1,x_2)=\begin{cases}0, & x_1<0 \text{ or } x_2<0\\ \frac{1}{4}, & 0\le x_1<1,\ 0\le x_2<1\\ \frac{1}{2}, & 0\le x_1<1,\ x_2\ge 1\\ \frac{1}{2}, & 0\le x_2<1,\ x_1\ge 1\\ 1, & x_1\ge 1,\ x_2\ge 1\end{cases}$$ Marginal cumulative distribution function of $X_1$: $$F_{X_1}(x_1)=\lim_{x_2\to+\infty}F_{X_1,X_2}(x_1,x_2)=\begin{cases}0, & x_1<0\\ \frac{1}{2}, & 0\le x_1<1\\ 1, & x_1\ge 1\end{cases}$$

Example 2.2 Let $(X,Y)$ be a jointly distributed random variable with CDF: $$F_{X,Y}(x,y)=\begin{cases}1-e^{-x}-e^{-y}+e^{-x-y}, & x\ge 0,\ y\ge 0\\ 0, & \text{otherwise}\end{cases}$$ The marginal cumulative distribution function of the random variable $X$ is: $$F_X(x)=\begin{cases}1-e^{-x}, & x\ge 0\\ 0, & x<0\end{cases}$$

The marginal cumulative distribution function of the random variable $Y$ is: $$F_Y(y)=\begin{cases}1-e^{-y}, & y\ge 0\\ 0, & y<0\end{cases}$$

3 The General Case: Independence

Definition: The jointly distributed random variables $X$ and $Y$ are said to be independent if and only if for any two sets $B_1\subseteq\mathbb{R}$, $B_2\subseteq\mathbb{R}$ we have $$P(X\in B_1, Y\in B_2)=P(X\in B_1)\,P(Y\in B_2)$$

Remark: Independence implies that $F_{X,Y}(x,y)=F_X(x)F_Y(y)$, for any $(x,y)\in\mathbb{R}^2$.

Theorem: If $X$ and $Y$ are independent random variables and $h(X)$ and $g(Y)$ are two functions of $X$ and $Y$ respectively, then the random variables $U=h(X)$ and $V=g(Y)$ are also independent random variables.

Example 3.1 Random experiment: Two different and fair coins are tossed once.

Random vector: $(X_1,X_2)$, where $X_i$ represents the number of heads obtained with coin $i$, with $i=1,2$.

Are these random variables independent? $$F_{X_1}(x_1)=\begin{cases}0, & x_1<0\\ \frac{1}{2}, & 0\le x_1<1\\ 1, & x_1\ge 1\end{cases}\qquad F_{X_2}(x_2)=\begin{cases}0, & x_2<0\\ \frac{1}{2}, & 0\le x_2<1\\ 1, & x_2\ge 1\end{cases}$$

One can easily verify that $F_{X_1,X_2}(x_1,x_2)=F_{X_1}(x_1)\times F_{X_2}(x_2)$, where $$F_{X_1,X_2}(x_1,x_2)=\begin{cases}0, & x_1<0 \text{ or } x_2<0\\ \frac{1}{4}, & 0\le x_1<1,\ 0\le x_2<1\\ \frac{1}{2}, & 0\le x_1<1,\ x_2\ge 1\\ \frac{1}{2}, & 0\le x_2<1,\ x_1\ge 1\\ 1, & x_1\ge 1,\ x_2\ge 1\end{cases}$$
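One way to carry out this verification without checking each region by hand is to evaluate both sides on a grid of test points that covers every branch of the piecewise definitions. A minimal Python sketch, for illustration only:

```python
from fractions import Fraction

def F_marginal(x):
    """Marginal CDF of the number of heads on a single fair coin."""
    if x < 0:
        return Fraction(0)
    return Fraction(1, 2) if x < 1 else Fraction(1)

def F_joint(x1, x2):
    """Joint CDF of (X1, X2) from Example 1.3."""
    if x1 < 0 or x2 < 0:
        return Fraction(0)
    if x1 >= 1 and x2 >= 1:
        return Fraction(1)
    if x1 >= 1 or x2 >= 1:
        return Fraction(1, 2)
    return Fraction(1, 4)

grid = [-1, -0.5, 0, 0.5, 1, 2]  # points falling in every branch
print(all(F_joint(a, b) == F_marginal(a) * F_marginal(b) for a in grid for b in grid))  # True
```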

Example 3.2 Let $(X,Y)$ be a jointly distributed random variable with CDF: $$F_{X,Y}(x,y)=\begin{cases}1-e^{-x}-e^{-y}+e^{-x-y}, & x\ge 0,\ y\ge 0\\ 0, & \text{otherwise}\end{cases}$$ The marginal cumulative distribution functions of the random variables $X$ and $Y$ are: $$F_X(x)=\lim_{y\to+\infty}F_{X,Y}(x,y)=\begin{cases}1-e^{-x}, & x\ge 0\\ 0, & x<0\end{cases}\qquad F_Y(y)=\lim_{x\to+\infty}F_{X,Y}(x,y)=\begin{cases}1-e^{-y}, & y\ge 0\\ 0, & y<0\end{cases}$$

$X$ and $Y$ are independent random variables because $F_{X,Y}(x,y)=F_X(x)F_Y(y)$,

since $1-e^{-x}-e^{-y}+e^{-x-y}=(1-e^{-x})(1-e^{-y})$ for $x\ge 0$, $y\ge 0$.
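The factorisation can also be confirmed symbolically; a small check with sympy (used here purely as a sanity check) is:

```python
import sympy as sp

x, y = sp.symbols("x y", nonnegative=True)
joint = 1 - sp.exp(-x) - sp.exp(-y) + sp.exp(-x - y)   # joint CDF on x >= 0, y >= 0
product = (1 - sp.exp(-x)) * (1 - sp.exp(-y))          # product of the marginal CDFs

print(sp.simplify(joint - product))  # 0, so the joint CDF factorises
```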

4 Discrete Random Variables

4.1 Joint pmf

Let $D_{(X,Y)}$ be the set of discontinuities of the joint cumulative distribution function $F_{X,Y}(x,y)$, that is $$D_{(X,Y)}=\{(x,y)\in\mathbb{R}^2: P(X=x,Y=y)>0\}$$

Definition: $(X,Y)$ is a two-dimensional discrete random variable if and only if $$\sum_{(x,y)\in D_{(X,Y)}}P(X=x,Y=y)=1.$$

Remark: As in the univariate case, a multivariate discrete random variable can take either a finite number of possible values $(x_i,y_j)$, with $i=1,2,\dots,k_1$ and $j=1,2,\dots,k_2$, where $k_1$ and $k_2$ are finite integers, or a countably infinite number of values $(x_i,y_j)$, with $i=1,2,\dots$ and $j=1,2,\dots$. For the sake of generality we consider the latter case, that is $$D_{(X,Y)}=\{(x_i,y_j),\ i=1,2,\dots,\ j=1,2,\dots\}$$

Joint probability mass function: If $X$ and $Y$ are discrete random variables, then the function given by $f_{X,Y}(x,y)=P(X=x,Y=y)$ for $(x,y)\in D_{(X,Y)}$ is called the joint probability mass function of $(X,Y)$ (joint pmf) or joint probability distribution of the random variables $X$ and $Y$.

Theorem: A bivariate function $f_{X,Y}(x,y)$ can serve as the joint probability mass function of a pair of discrete random variables $X$ and $Y$ if and only if its values satisfy the conditions:

  • $f_{X,Y}(x,y)\ge 0$ for any $(x,y)\in\mathbb{R}^2$

  • $\sum_{(x,y)\in D_{(X,Y)}}f_{X,Y}(x,y)=\sum_{i=1}^{\infty}\sum_{j=1}^{\infty}f_{X,Y}(x_i,y_j)=1$

Remark: We can calculate any probability using this function. For instance, $$P((X,Y)\in B)=\sum_{(x,y)\in B}f_{X,Y}(x,y)$$

Example 4.1 Let $X$ and $Y$ be the random variables representing the monthly wages of husbands and wives, respectively, in a particular community. Suppose there are only three possible monthly wages (in euros): 0, 1000 and 2000. The joint probability mass function is

X=0 X=1000 X=2000
Y=0 0.05 0.15 0.10
Y=1000 0.10 0.10 0.30
Y=2000 0.05 0.05 0.10

The probability that a husband earns 2000 euros and the wife earns 1000 euros is given by $f_{X,Y}(2000,1000)=P(X=2000,Y=1000)=0.30$
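Because the support is finite, the joint pmf can be stored as a simple lookup table and any event probability obtained by summing over the outcomes in the event. A minimal Python sketch of this idea (the dictionary below just transcribes the table):

```python
# Joint pmf from the table: keys are (x, y) = (husband's wage, wife's wage) in euros
pmf = {(0, 0): 0.05, (1000, 0): 0.15, (2000, 0): 0.10,
       (0, 1000): 0.10, (1000, 1000): 0.10, (2000, 1000): 0.30,
       (0, 2000): 0.05, (1000, 2000): 0.05, (2000, 2000): 0.10}

assert abs(sum(pmf.values()) - 1.0) < 1e-9   # the probabilities sum to one

print(pmf[(2000, 1000)])                               # 0.30
# Probability of an arbitrary event, e.g. both spouses earning the same wage
print(sum(p for (x, y), p in pmf.items() if x == y))   # 0.25 (up to floating-point rounding)
```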

4.2 Joint cdf

Joint cumulative distribution function: If $X$ and $Y$ are discrete random variables, the function given by $$F_{X,Y}(x,y)=\sum_{s\le x}\sum_{t\le y}f_{X,Y}(s,t)\quad\text{for }(x,y)\in\mathbb{R}^2$$ is called the joint distribution function or joint cumulative distribution function of $X$ and $Y$.

4.3 Marginal pmf’s

Marginal probability distribution/function: If $X$ and $Y$ are discrete random variables and $f_{X,Y}(x,y)$ is the value of their joint probability distribution at $(x,y)$, the functions given by

$$P(X=x)=\begin{cases}\sum_{y\in D_Y}f_{X,Y}(x,y), & x\in D_X\\ 0, & x\notin D_X\end{cases}\qquad P(Y=y)=\begin{cases}\sum_{x\in D_X}f_{X,Y}(x,y), & y\in D_Y\\ 0, & y\notin D_Y\end{cases}$$

are, respectively, the marginal probability distributions of the random variables $X$ and $Y$, where $D_X$ and $D_Y$ are the ranges of $X$ and $Y$.

Example 4.2 $$P(X=x)=P(X=x,Y=0)+P(X=x,Y=1000)+P(X=x,Y=2000)$$ $$P(Y=y)=P(X=0,Y=y)+P(X=1000,Y=y)+P(X=2000,Y=y)$$

Applying these formulas we have:

X=0 X=1000 X=2000 fY(y)
Y=0 0.05 0.15 0.10 0.3
Y=1000 0.10 0.10 0.30 0.5
Y=2000 0.05 0.05 0.10 0.2
fX(x) 0.2 0.3 0.5 1

$$F_{X,Y}(1000,1000)=P(X=0,Y=0)+P(X=0,Y=1000)+P(X=1000,Y=0)+P(X=1000,Y=1000)=0.40$$ $$F_{X,Y}(0,1000)=P(X=0,Y=0)+P(X=0,Y=1000)=0.15$$
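The marginal pmf's and these CDF values follow mechanically from the joint pmf; the sketch below (reusing the same transcription of the table) shows one way to compute them in Python:

```python
wages = [0, 1000, 2000]
pmf = {(0, 0): 0.05, (1000, 0): 0.15, (2000, 0): 0.10,
       (0, 1000): 0.10, (1000, 1000): 0.10, (2000, 1000): 0.30,
       (0, 2000): 0.05, (1000, 2000): 0.05, (2000, 2000): 0.10}

# Marginal pmf's: sum the joint pmf over the other variable
f_X = {x: sum(p for (a, b), p in pmf.items() if a == x) for x in wages}
f_Y = {y: sum(p for (a, b), p in pmf.items() if b == y) for y in wages}
print(f_X)  # approximately {0: 0.2, 1000: 0.3, 2000: 0.5}
print(f_Y)  # approximately {0: 0.3, 1000: 0.5, 2000: 0.2}

def F(x, y):
    """Joint CDF: sum the pmf over all support points (a, b) with a <= x and b <= y."""
    return sum(p for (a, b), p in pmf.items() if a <= x and b <= y)

print(F(1000, 1000), F(0, 1000))  # approximately 0.40 and 0.15
```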

4.4 Independence

Independence of random variables: Two discrete random variables $X$ and $Y$ are independent if and only if, for all $(x,y)\in D_X\times D_Y$, $$P(X=x,Y=y)=P(X=x)\,P(Y=y).$$

X=0 X=1000 X=2000 fY(y)
Y=0 0.05 0.15 0.10 0.3
Y=1000 0.10 0.10 0.30 0.5
Y=2000 0.05 0.05 0.10 0.2
fX(x) 0.2 0.3 0.5 1

Are these two random variables independent?

$$P(X=2000,Y=2000)=P(X=2000)\times P(Y=2000)=0.1$$ Is this sufficient to say that $X$ and $Y$ are independent? NO! $$P(X=0,Y=0)=0.05\ \text{ but }\ P(X=0)P(Y=0)=0.06,$$ thus $X$ and $Y$ are not independent.
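In code, the natural check is to compare the joint pmf with the product of the marginals at every support point, not just at one; a single mismatch is enough to rule out independence. A small illustrative Python sketch:

```python
wages = [0, 1000, 2000]
pmf = {(0, 0): 0.05, (1000, 0): 0.15, (2000, 0): 0.10,
       (0, 1000): 0.10, (1000, 1000): 0.10, (2000, 1000): 0.30,
       (0, 2000): 0.05, (1000, 2000): 0.05, (2000, 2000): 0.10}

f_X = {x: sum(p for (a, b), p in pmf.items() if a == x) for x in wages}
f_Y = {y: sum(p for (a, b), p in pmf.items() if b == y) for y in wages}

# Independence requires f(x, y) = f_X(x) * f_Y(y) at EVERY point of the support
violations = [(x, y) for x in wages for y in wages
              if abs(pmf[(x, y)] - f_X[x] * f_Y[y]) > 1e-9]
print(violations)  # non-empty: for example (0, 0), where 0.05 != 0.2 * 0.3 = 0.06
```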

4.5 Conditional pmf’s

Conditional probability mass function of Y given X: The conditional probability mass function of a discrete random variable $Y$, given that another discrete random variable $X$ takes a specific value $x$, is defined as $$f_{Y|X=x}(y)=P(Y=y|X=x)=\frac{P(Y=y,X=x)}{P(X=x)}=\frac{f_{X,Y}(x,y)}{f_X(x)},\qquad f_X(x)>0,$$ for a fixed $x$. The conditional probability mass function of $X$ given $Y$ is defined by

$$f_{X|Y=y}(x)=\frac{f_{X,Y}(x,y)}{f_Y(y)},\qquad f_Y(y)>0.$$

Remarks:

  • The conditional probability functions satisfy all the properties of probability functions, and therefore $\sum_{i=1}^{\infty}f_{Y|X=x}(y_i)=1$.

  • If $X$ and $Y$ are independent, then $f_{Y|X=x}(y)=f_Y(y)$ and $f_{X|Y=y}(x)=f_X(x)$

Example: Consider the joint probability function

X=0 X=1000 X=2000 fY(y)
Y=0 0.05 0.15 0.10 0.3
Y=1000 0.10 0.10 0.30 0.5
Y=2000 0.05 0.05 0.10 0.2
fX(x) 0.2 0.3 0.5 1

Compute $P(Y=y|X=0)$.

$$P(Y=0|X=0)=\frac{P(Y=0,X=0)}{P(X=0)}=\frac{0.05}{0.2}=0.25$$ $$P(Y=1000|X=0)=\frac{P(Y=1000,X=0)}{P(X=0)}=\frac{0.1}{0.2}=0.5$$ $$P(Y=2000|X=0)=\frac{P(Y=2000,X=0)}{P(X=0)}=\frac{0.05}{0.2}=0.25$$
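Conditioning on $X=x$ amounts to taking the $X=x$ column of the table and renormalising it by $f_X(x)$; the sketch below does exactly that in Python (illustration only, the helper name is ad hoc):

```python
wages = [0, 1000, 2000]
pmf = {(0, 0): 0.05, (1000, 0): 0.15, (2000, 0): 0.10,
       (0, 1000): 0.10, (1000, 1000): 0.10, (2000, 1000): 0.30,
       (0, 2000): 0.05, (1000, 2000): 0.05, (2000, 2000): 0.10}

def conditional_pmf_Y_given_X(x):
    """f_{Y|X=x}: take the joint probabilities with X = x and divide by f_X(x)."""
    f_X_x = sum(p for (a, b), p in pmf.items() if a == x)
    return {y: pmf[(x, y)] / f_X_x for y in wages}

print(conditional_pmf_Y_given_X(0))  # approximately {0: 0.25, 1000: 0.5, 2000: 0.25}
```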

4.6 Conditional cdf’s

Definition: The conditional CDF of Y given X is defined by

$$F_{Y|X=x}(y)=P(Y\le y|X=x)=\sum_{y'\in D_Y,\ y'\le y}\frac{P(Y=y',X=x)}{P(X=x)}$$ for a fixed $x$, with $P(X=x)>0$.

Remark: It can be checked that $F_{Y|X=x}$ is indeed a CDF.

Exercise: Verify that $F_{Y|X=x}$ is non-decreasing and that $\lim_{y\to+\infty}F_{Y|X=x}(y)=1$.

1) For $\delta>0$, $$F_{Y|X=x}(y+\delta)-F_{Y|X=x}(y)=\frac{P(Y\le y+\delta,X=x)}{P(X=x)}-\frac{P(Y\le y,X=x)}{P(X=x)}\ge 0.$$ 2) $$\lim_{y\to+\infty}F_{Y|X=x}(y)=\lim_{y\to+\infty}P(Y\le y|X=x)=\lim_{y\to+\infty}\frac{P(Y\le y,X=x)}{P(X=x)}=\frac{P(Y<+\infty,X=x)}{P(X=x)}=\frac{P(X=x)}{P(X=x)}=1.$$

Example 4.3 Consider the conditional probabilities of $Y$ given that $X=0$ previously deduced: $$P(Y=0|X=0)=\frac{P(Y=0,X=0)}{P(X=0)}=\frac{0.05}{0.2}=0.25$$ $$P(Y=1000|X=0)=\frac{P(Y=1000,X=0)}{P(X=0)}=\frac{0.1}{0.2}=0.5$$ $$P(Y=2000|X=0)=\frac{P(Y=2000,X=0)}{P(X=0)}=\frac{0.05}{0.2}=0.25$$

Then the conditional CDF of $Y$ given that $X=0$ is

$$F_{Y|X=0}(y)=\begin{cases}0, & y<0\\ 0.25, & 0\le y<1000\\ 0.75, & 1000\le y<2000\\ 1, & y\ge 2000\end{cases}$$
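The same staircase can be produced numerically by accumulating the conditional pmf; the short Python sketch below evaluates $F_{Y|X=0}$ at a few test points (illustration only):

```python
pmf = {(0, 0): 0.05, (1000, 0): 0.15, (2000, 0): 0.10,
       (0, 1000): 0.10, (1000, 1000): 0.10, (2000, 1000): 0.30,
       (0, 2000): 0.05, (1000, 2000): 0.05, (2000, 2000): 0.10}

def conditional_cdf_Y_given_X(y, x):
    """F_{Y|X=x}(y): cumulative sum of the conditional pmf of Y given X = x."""
    f_X_x = sum(p for (a, b), p in pmf.items() if a == x)
    return sum(p for (a, b), p in pmf.items() if a == x and b <= y) / f_X_x

for y in (-1, 0, 500, 1000, 2500):
    print(y, conditional_cdf_Y_given_X(y, 0))  # 0, 0.25, 0.25, 0.75, 1 (up to rounding)
```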

5 Continuous Random Variables

5.1 Joint pdf and joint cdf

Definition: $(X,Y)$ is a two-dimensional continuous random variable with joint cumulative distribution function $F_{X,Y}(x,y)$ if and only if $X$ and $Y$ are continuous random variables and there is a non-negative real function $f_{X,Y}(x,y)$ such that $$F_{X,Y}(x,y)=P(X\le x,Y\le y)=\int_{-\infty}^{y}\int_{-\infty}^{x}f_{X,Y}(t,s)\,dt\,ds.$$

The function $f_{X,Y}(x,y)$ is the joint (probability) density of $X$ and $Y$.

Remark: Let $A$ be a set in $\mathbb{R}^2$. Then $$P((X,Y)\in A)=\iint_A f_{X,Y}(t,s)\,dt\,ds.$$

Example: Joint probability density function of the two-dimensional random variable $(P_1,S)$, where $P_1$ represents the price and $S$ the total sales (in 10000 units).

Joint probability density function: $$f_{P_1,S}(p,s)=\begin{cases}5pe^{-ps}, & 0.2<p<0.4,\ s>0\\ 0, & \text{otherwise}\end{cases}$$

Joint cumulative distribution function: $$F_{P_1,S}(p,s)=P(P_1\le p, S\le s)=\begin{cases}0, & p<0.2 \text{ or } s<0\\ -1+5p-5\,\dfrac{e^{-0.2s}-e^{-ps}}{s}, & 0.2<p<0.4,\ s\ge 0\\ 1-5\,\dfrac{e^{-0.2s}-e^{-0.4s}}{s}, & p\ge 0.4,\ s\ge 0\end{cases}$$ To get the CDF we need to make the following computations:

  • If $p<0.2$ or $s<0$, then $f_{P_1,S}(t,u)=0$ over the integration region and $P(P_1\le p, S\le s)=\int_{-\infty}^{p}\int_{-\infty}^{s}f_{P_1,S}(t,u)\,du\,dt=0$

  • If $0.2<p<0.4$ and $s\ge 0$, then $P(P_1\le p, S\le s)=\int_{-\infty}^{p}\int_{-\infty}^{s}f_{P_1,S}(t,u)\,du\,dt=\int_{0.2}^{p}\int_{0}^{s}5te^{-tu}\,du\,dt=-1+5p-5\,\frac{e^{-0.2s}-e^{-ps}}{s}$

  • If $p\ge 0.4$ and $s\ge 0$, then $P(P_1\le p, S\le s)=\int_{-\infty}^{p}\int_{-\infty}^{s}f_{P_1,S}(t,u)\,du\,dt=\int_{0.2}^{0.4}\int_{0}^{s}5te^{-tu}\,du\,dt=1-5\,\frac{e^{-0.2s}-e^{-0.4s}}{s}$
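The closed-form expression for the middle branch can be cross-checked by numerical integration of the density over the corresponding rectangle; the sketch below uses scipy's dblquad purely as a sanity check (the point $(p,s)=(0.3,2)$ is an arbitrary choice):

```python
import numpy as np
from scipy.integrate import dblquad

def cdf_formula(p, s):
    """Closed-form joint CDF for 0.2 < p < 0.4 and s > 0 (middle branch above)."""
    return -1 + 5 * p - 5 * (np.exp(-0.2 * s) - np.exp(-p * s)) / s

p, s = 0.3, 2.0
# Integrate f(t, u) = 5 t exp(-t u) over t in [0.2, p] (outer) and u in [0, s] (inner)
numeric, _ = dblquad(lambda u, t: 5 * t * np.exp(-t * u), 0.2, p, lambda t: 0, lambda t: s)
print(numeric, cdf_formula(p, s))  # the two values agree to numerical precision
```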

Theorem: A bivariate function can serve as a joint probability density function of a pair of continuous random variables $X$ and $Y$ if its values, $f_{X,Y}(x,y)$, satisfy the conditions:

  • $f_{X,Y}(x,y)\ge 0$, for all $(x,y)\in\mathbb{R}^2$

  • $\int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty}f_{X,Y}(x,y)\,dx\,dy=1$.

Property: Let $(X,Y)$ be a bivariate random variable and $B\subseteq\mathbb{R}^2$, then $$P((X,Y)\in B)=\iint_B f_{X,Y}(x,y)\,dx\,dy.$$

Example 5.1 Let $(X,Y)$ be a continuous bi-dimensional random variable with density function $f_{X,Y}$ given by $$f_{X,Y}(x,y)=\begin{cases}kx+y, & 0<x<1,\ 0<y<1\\ 0, & \text{otherwise}\end{cases}$$ Find $k$.

Solution: From the first condition, we know that $f_{X,Y}(x,y)\ge 0$, therefore $k\ge 0$. Additionally, $\int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty}f_{X,Y}(x,y)\,dx\,dy=1$. This is equivalent to $$\int_0^1\int_0^1(kx+y)\,dx\,dy=1 \Leftrightarrow \frac{1+k}{2}=1 \Leftrightarrow k=1.$$

Example: Let $(X,Y)$ be a continuous bi-dimensional random variable with density function $f_{X,Y}$ given by $$f_{X,Y}(x,y)=\begin{cases}x+y, & 0<x<1,\ 0<y<1\\ 0, & \text{otherwise}\end{cases}$$ Compute $P(X>Y)$.

Solution: Firstly, we notice that

$$P(X>Y)=\int_0^1\int_0^x(x+y)\,dy\,dx=\int_0^1\frac{3}{2}x^2\,dx=\frac{1}{2}$$
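As a quick symbolic check of this double integral, one can let sympy integrate the density over the region $\{0<y<x<1\}$ (a sanity-check sketch, not part of the solution):

```python
import sympy as sp

x, y = sp.symbols("x y")
f = x + y  # joint density on the unit square

# P(X > Y): integrate the density over the region 0 < y < x < 1
p = sp.integrate(sp.integrate(f, (y, 0, x)), (x, 0, 1))
print(p)  # 1/2
```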

Properties: Let $(X,Y)$ be a continuous bivariate random variable, with $f_{X,Y}$ its density function and $F_{X,Y}$ its joint CDF. Then,

  • $f_{X,Y}(x,y)=\frac{\partial^2 F_{X,Y}(x,y)}{\partial x\,\partial y}=\frac{\partial^2 F_{X,Y}(x,y)}{\partial y\,\partial x}$, almost everywhere.

5.2 Marginal pdf’s

  • Marginal density function of the random variable $X$: $f_X(x)=\int_{-\infty}^{+\infty}f_{X,Y}(x,v)\,dv$,

  • Marginal density function of the random variable $Y$: $f_Y(y)=\int_{-\infty}^{+\infty}f_{X,Y}(u,y)\,du$.

5.3 Marginal cdf’s

  • Marginal CDF of the random variable $X$: $F_X(x)=\lim_{y\to+\infty}F_{X,Y}(x,y)=\int_{-\infty}^{x}\int_{-\infty}^{+\infty}f_{X,Y}(u,v)\,dv\,du=\int_{-\infty}^{x}f_X(u)\,du$,

  • Marginal CDF of the random variable $Y$: $F_Y(y)=\lim_{x\to+\infty}F_{X,Y}(x,y)=\int_{-\infty}^{y}\int_{-\infty}^{+\infty}f_{X,Y}(u,v)\,du\,dv=\int_{-\infty}^{y}f_Y(v)\,dv$.

Example 5.2 Joint density function: $$f_{P,S}(p,s)=\begin{cases}5pe^{-ps}, & 0.2<p<0.4,\ s>0\\ 0, & \text{otherwise}\end{cases}$$

Marginal density function of $P$: $$f_P(p)=\int_{-\infty}^{+\infty}f_{P,S}(p,s)\,ds=\begin{cases}5\int_0^{+\infty}pe^{-ps}\,ds, & 0.2<p<0.4\\ 0, & \text{otherwise}\end{cases}=\begin{cases}5, & 0.2<p<0.4\\ 0, & \text{otherwise}\end{cases}$$

Marginal cumulative distribution function of $S$: $$F_S(s)=\lim_{p\to+\infty}F_{P,S}(p,s)=\begin{cases}0, & s<0\\ 1-5\,\dfrac{e^{-0.2s}-e^{-0.4s}}{s}, & s\ge 0\end{cases}$$
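Both marginal results can be reproduced symbolically; the sketch below uses sympy only to confirm that integrating out $s$ gives the constant density 5 and that integrating the joint density over $0.2<p<0.4$, $0<u<s$ recovers the expression for $F_S(s)$:

```python
import sympy as sp

p, s, u = sp.symbols("p s u", positive=True)
f = 5 * p * sp.exp(-p * u)  # joint density evaluated at (p, u), for 0.2 < p < 0.4, u > 0

# Marginal density of P: integrate out the sales variable over (0, oo)
print(sp.integrate(f, (u, 0, sp.oo)))  # 5

# Marginal CDF of S: integrate the joint density over 0 < u < s and 0.2 < p < 0.4
F_S = sp.integrate(f, (u, 0, s), (p, sp.Rational(1, 5), sp.Rational(2, 5)))
closed_form = 1 - 5 * (sp.exp(-s / 5) - sp.exp(-2 * s / 5)) / s
print(sp.simplify(F_S - closed_form))  # 0
```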

5.4 Conditional pdf’s

Definition: If $f_{X,Y}(x,y)$ is the joint probability density function of the continuous random variables $X$ and $Y$ and $f_Y(y)$ is the marginal density function of $Y$, the function given by $$f_{X|Y=y}(x)=\frac{f_{X,Y}(x,y)}{f_Y(y)},\quad x\in\mathbb{R}\ \text{(for fixed }y\text{)},\ f_Y(y)>0$$ is the conditional probability density function of $X$ given $\{Y=y\}$. Similarly, if $f_X(x)$ is the marginal density function of $X$, $$f_{Y|X=x}(y)=\frac{f_{X,Y}(x,y)}{f_X(x)},\quad y\in\mathbb{R}\ \text{(for fixed }x\text{)},\ f_X(x)>0$$ is the conditional probability density function of $Y$ given $\{X=x\}$.

Remark: Note that $P(X\in B|Y=y)=\int_B f_{X|Y=y}(x)\,dx$ for any $B\subseteq\mathbb{R}$.

Example 5.3 $(X,Y)$ is a random vector with the following joint density function: $$f_{X,Y}(x,y)=\begin{cases}y+x, & (x,y)\in(0,1)\times(0,1)\\ 0, & \text{otherwise}\end{cases}$$

Conditional density function of $Y$ given that $X=x$ (with $x\in(0,1)$):

$$f_X(x)=\int_0^1(y+x)\,dy=x+\frac{1}{2}$$

$$f_{Y|X=x}(y)=\frac{f_{X,Y}(x,y)}{f_X(x)}=\frac{x+y}{x+\frac{1}{2}},\qquad y\in(0,1)$$

Probability of $\{Y\ge 0.7\}$ given $\{X=0.5\}$: $$P(Y\ge 0.7|X=0.5)=\int_{0.7}^{1}f_{Y|X=0.5}(y)\,dy=\int_{0.7}^{1}(y+0.5)\,dy=0.405.$$
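The same number can be obtained symbolically by building the conditional density from the joint density and integrating; a short sympy sketch for this check:

```python
import sympy as sp

x, y = sp.symbols("x y")
f_joint = x + y                          # joint density on (0,1) x (0,1)
f_X = sp.integrate(f_joint, (y, 0, 1))   # marginal of X: x + 1/2
f_cond = f_joint / f_X                   # conditional density of Y given X = x

# P(Y >= 0.7 | X = 0.5)
p = sp.integrate(f_cond.subs(x, sp.Rational(1, 2)), (y, sp.Rational(7, 10), 1))
print(p)  # 81/200, i.e. 0.405
```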

Remark:

  • The conditional density functions of $X$ and $Y$ satisfy all the properties of a density function of a univariate random variable.

  • Note that we can always decompose a joint density function in the following way: $$f_{X,Y}(x,y)=f_X(x)\,f_{Y|X=x}(y)=f_Y(y)\,f_{X|Y=y}(x).$$

5.5 Independence

  • If $X$ and $Y$ are independent, then $f_{Y|X=x}(y)=f_Y(y)$ and $f_{X|Y=y}(x)=f_X(x)$.

Example 5.4 Consider the conditional density function of $Y$ given that $X=x$ (with $x\in(0,1)$): $$f_{Y|X=x}(y)=\frac{f_{X,Y}(x,y)}{f_X(x)}=\frac{x+y}{x+\frac{1}{2}},\qquad y\in(0,1).$$

$f_{Y|X=x}$ is indeed a density function: $$f_{Y|X=x}(y)\ge 0\quad\text{and}\quad\int_0^1\frac{x+y}{x+\frac{1}{2}}\,dy=1.$$

Example 5.5 Consider the conditional density function of $Y$ given that $X=x$ (with $x\in(0,1)$) and the marginal density function of $Y$: $$f_{Y|X=x}(y)=\frac{f_{X,Y}(x,y)}{f_X(x)}=\frac{x+y}{x+\frac{1}{2}},\quad y\in(0,1)\qquad\qquad f_Y(y)=y+\frac{1}{2},\quad y\in(0,1).$$

The random variables are not independent because $f_{Y|X=x}(y)\ne f_Y(y)$ for some $y\in(0,1)$.

5.6 Conditional cdf’s

Definition: The conditional CDF of $Y$ given $X=x$ is defined by $$F_{Y|X=x}(y)=\int_{-\infty}^{y}f_{Y|X=x}(s)\,ds=\int_{-\infty}^{y}\frac{f_{X,Y}(x,s)}{f_X(x)}\,ds$$ for a fixed $x$, with $f_X(x)>0$.

Remark: It can be checked that $F_{Y|X=x}$ is indeed a CDF.

Example 5.6 Consider the conditional density function of $Y$ given that $X=x$ (with $x\in(0,1)$): $$f_{Y|X=x}(y)=\frac{f_{X,Y}(x,y)}{f_X(x)}=\frac{x+y}{x+\frac{1}{2}},\qquad y\in(0,1).$$

For $x\in(0,1)$, the conditional cumulative distribution function is given by: $$F_{Y|X=x}(y)=\begin{cases}0, & y<0\\ \frac{y(2x+y)}{1+2x}, & 0\le y<1\\ 1, & y\ge 1\end{cases}$$

where $\frac{y(2x+y)}{1+2x}=\int_0^y\frac{x+s}{x+\frac{1}{2}}\,ds$.
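The middle branch of this conditional CDF can again be checked symbolically; a final sympy sketch:

```python
import sympy as sp

x, y, s = sp.symbols("x y s", positive=True)
f_cond = (x + s) / (x + sp.Rational(1, 2))   # conditional density of Y given X = x, at point s

# Conditional CDF for 0 <= y < 1: integrate the conditional density from 0 to y
F_cond = sp.integrate(f_cond, (s, 0, y))
print(sp.simplify(F_cond - y * (2 * x + y) / (1 + 2 * x)))  # 0
```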
