Stochastic Processes
Lecture: Multiple Random Variables (Ch. 5)
Dr. Ir. Richard C. Hendriks, 26/04/18
Delft University of Technology, Challenge the future

Organization
Plenary lectures.
Book: R.D. Yates and D.J. Goodman, Probability and Stochastic Processes: A Friendly Introduction for Electrical and Computer Engineers (third edition).
Slides, exams, answers to some selected exercises, etc.: http://cas.tudelft.nl/education/courses/ee2s3/
(The least) preparation for the next lecture: make sure to have understood the last lecture's material, read the corresponding chapters, and solve the selected exercises.
Your responsibility: study and solve the exercises before the next lecture.
Exam
The exam consists of two parts:
1. Part 1: Mid-term exam on the 28th of May.
2. Part 2: Final exam on the 3rd of July.
Both parts will consist of exercises from signal processing and stochastic processes (50/50). The end result is the average of the mid-term and final exams. The result of the mid-term exam is only valid if you take the final exam on July 3rd (hence, it is no longer valid for the re-exam). During the exam a HANDWRITTEN, double-sided A4 formula sheet may be used.

Topics to be Discussed (3rd edition)
Message of this course (1)
X(t) → linear system (LTI), h(n), h(t) → Y(t)

Message of this course (2)
X(t) → linear system (LTI), h(n), h(t) → Y(t)
Statistical descriptions of X(t): mean μ_X, autocorrelation function R_X(τ).
Statistical descriptions of Y(t): mean μ_Y = μ_X ∫ h(t) dt, autocorrelation R_Y(τ) = h(τ) ⋆ h(−τ) ⋆ R_X(τ).
Message of this course (3)
Some questions to answer:
What is a stochastic process?
How to characterize a stochastic process?
What is the autocorrelation function?
Statistical descriptions of the output Y(t) of an LTI system with WSS input X(t): mean μ_Y = μ_X ∫ h(t) dt, autocorrelation R_Y(τ) = h(τ) ⋆ h(−τ) ⋆ R_X(τ).

Definition of a Random Variable (1)
Experiments often take place in a physical world: we observe physical quantities. Probability theory works in a mathematical world, with mathematical tools. How do we map physical observations to numbers we can do mathematics on?
E.g., flip a coin: outcomes are head/tail.
E.g., observe flooding because of frequent rainfall: outcomes are yes/no.
Definition of a Random Variable (2)
x = X(Outcome) = X(s)
The outcome s lives in an arbitrary sample space, often related to the physical world; X maps it to the real numbers ℝ.

Definition of a Random Variable (3)
Experiment: head or tail in 3 tosses. Outcomes: TTT, TTH, THT, HTT, THH, HTH, HHT, HHH.
Random variable X(outcome): the number of heads H in three tosses.
X(TTT) = 0; X(TTH) = X(THT) = X(HTT) = 1; X(THH) = X(HTH) = X(HHT) = 2; X(HHH) = 3.
A random variable is simply a rule that assigns a number to each outcome in the sample space.
Definition of a Random Variable (4)
Events map to subsets of ℝ, e.g., A < X(s) ≤ B.
"Roll an even number" ↦ {2, 4, 6}; "grown up" ↦ [18, ∞).

From scalars to stochastic processes
A single random variable: X(s) = X. A stochastic process: X(t, s) = X(t).
One experiment ⇒ one outcome, i.e., one realization of the stochastic/random process.
Scalar random variables
Discrete random variable: discrete sample space; a countable number of outcomes; each outcome has a non-zero probability of occurrence.
Continuous random variable: continuous sample space; an infinite number of outcomes.
Example: pick a real-valued number between 3 and 5. How many outcomes? Every single outcome has probability zero: P[X = x] = 0. Therefore we consider events instead, e.g., 4 ≤ X ≤ 5.

How to describe a scalar RV?
Using the cumulative distribution function F_X(x). For continuous AND discrete RVs: F_X(x) = P[X ≤ x].
Properties:
F_X(−∞) = 0, F_X(∞) = 1
P[x_1 < X ≤ x_2] = F_X(x_2) − F_X(x_1)
For all x' ≥ x: F_X(x') ≥ F_X(x)
For continuous RVs: F_X(x) = P[X ≤ x] = ∫_{−∞}^{x} f_X(u) du
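As a concrete sketch of these CDF properties (my own example code, using the "pick a real number between 3 and 5" experiment from above, modelled as a uniform RV):

```python
# Sketch (assumed model): X ~ Uniform(3, 5). Events carry probability;
# single points do not.

def F_X(x):
    """CDF of X ~ Uniform(3, 5)."""
    if x < 3:
        return 0.0
    if x > 5:
        return 1.0
    return (x - 3) / 2

# P[x1 < X <= x2] = F_X(x2) - F_X(x1)
p_event = F_X(5) - F_X(4)   # P[4 < X <= 5]
p_point = F_X(4) - F_X(4)   # P[X = 4], a degenerate interval
print(p_event, p_point)     # 0.5 0.0
```

Note how the limiting properties F_X(−∞) = 0 and F_X(∞) = 1 show up as the two early returns.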
Example CDF
F_X(x) = P[X ≤ x]. (Figure: a plotted CDF with the values F_X(10) and F_X(20) marked.)
P(10 < X ≤ 20) = F_X(20) − F_X(10)

PDF versus PMF
Discrete RVs are usually described by their probability mass function (PMF).
Continuous RVs are usually described by their probability density function (PDF).
In both cases the CDF can also be used.
For discrete RVs the PDF also exists: it becomes a sum of delta functions, which is useful for mixed random variables.
Properties of PDF and PMF
Instead of a PMF (for discrete RVs), we have a probability density function, or pdf, for continuous RVs:
f_X(x) = dF_X(x)/dx
A probability density is NOT a probability. It can even be larger than one!
Example Scalar RV: from PMF to CDF
y: 0, 1, 2
P_Y(y): 0.1, 0.09, 0.81
Plot the CDF F_Y(y). What is F_Y(1)?

Example Scalar RV: from pdf to CDF
Given is the exponential pdf
f_T(t) = (1/3) e^{−t/3} for t ≥ 0, and 0 otherwise.
Calculate the CDF F_T(t).
Calculate the expected value E[T].
Example Scalar RV: from pdf to CDF
Given is the exponential pdf f_T(t) = (1/3) e^{−t/3} for t ≥ 0, and 0 otherwise.
F_T(t) = ∫_{−∞}^{t} f_T(u) du = ∫_{0}^{t} (1/3) e^{−u/3} du = [−e^{−u/3}]_{0}^{t} = 1 − e^{−t/3} for t ≥ 0.
E[T] = ∫_{0}^{∞} t (1/3) e^{−t/3} dt = [−t e^{−t/3}]_{0}^{∞} + ∫_{0}^{∞} e^{−t/3} dt = 0 + [−3 e^{−t/3}]_{0}^{∞} = 3.

From Scalar RVs to Multiple RVs
Model: Y = X + N. How to model this scenario? Using a pair of random variables: (X, Y) or (X, N).
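The closed-form results above can be sanity-checked numerically; a minimal sketch (my own, not course material) using plain Riemann sums:

```python
# Verify F_T(2) = 1 - e^{-2/3} and E[T] = 3 for the exponential pdf
# f_T(t) = (1/3) e^{-t/3}, t >= 0, by simple left Riemann sums.
import math

def f_T(t):
    return math.exp(-t / 3) / 3 if t >= 0 else 0.0

dt = 1e-3
grid = [k * dt for k in range(int(60 / dt))]   # [0, 60) captures almost all mass

F_at_2 = sum(f_T(t) * dt for t in grid if t < 2.0)   # numeric CDF at t = 2
E_T = sum(t * f_T(t) * dt for t in grid)             # numeric E[T]

print(F_at_2, 1 - math.exp(-2 / 3))   # both ≈ 0.4866
print(E_T)                            # ≈ 3.0
```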
Multiple random variables
(Figure: sample space S mapped to multiple random variables.)
We will now consider a collection of random variables X_1, X_2, ..., X_n (today, Chapter 5). Notice that we can also consider this as a vector X = [X_1, X_2, ..., X_n]^T (Chapter 8).

How to describe multiple RVs?
Remember for scalar RVs: CDF and pdf (or pmf). For multiple RVs we use the joint CDF F_{X,Y}(x, y) and the joint pdf f_{X,Y}(x, y) (or joint pmf P_{X,Y}(x, y)).
The joint cumulative distribution function:
F_{X,Y}(x, y) = P[X ≤ x, Y ≤ y]
The joint cumulative distribution function
F_{X,Y}(x, y) = P[X ≤ x, Y ≤ y]
Properties:
0 ≤ F_{X,Y}(x, y) ≤ 1
F_{X,Y}(∞, ∞) = 1
F_X(x) = F_{X,Y}(x, ∞)
F_Y(y) = F_{X,Y}(∞, y)
F_{X,Y}(x, −∞) = 0
F_{X,Y}(−∞, y) = 0
If x ≤ x_1 and y ≤ y_1, then F_{X,Y}(x, y) ≤ F_{X,Y}(x_1, y_1)

The joint cumulative distribution function
(Figure: F_{X,Y}(x, y) as a surface over the x,y-plane.)
Example: j-CDF
(Figure.)

The joint probability mass function
Using the j-CDF can be complex. It is often easier to work with the joint pmf and joint pdf.
The j-pmf: P_{X,Y}(x, y) = P[X = x, Y = y]
Let S_{X,Y} = {(x, y) | P_{X,Y}(x, y) > 0} be the set of possible (X, Y) pairs.
Example: P_{X,Y}(x, y):
        y = 0   y = 1   y = 2
x = 0:  0.01    0       0
x = 1:  0.09    0.09    0
x = 2:  0       0       0.81
Σ_{x ∈ S_X} Σ_{y ∈ S_Y} P_{X,Y}(x, y) = 1
The j-pmf for events
For discrete RVs X and Y and any set B in the x,y-plane, the probability of the event {(X, Y) ∈ B} is
P[B] = Σ_{(x,y) ∈ B} P_{X,Y}(x, y).
Example: let B be the event {X = Y}. Give P[B] for the j-pmf:
        y = 0   y = 1   y = 2
x = 0:  0.01    0       0
x = 1:  0.09    0.09    0
x = 2:  0       0       0.81

The marginal pmf
Example: given is the j-pmf P_{X,Y}(x, y) above. Give P_X(x) and P_Y(y).
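The event-probability sum can be checked mechanically; a sketch (my own code, using the table's values as reconstructed here; pairs not listed carry probability 0):

```python
# j-pmf stored as a dict; B = {X = Y} is selected by filtering the keys.
P = {(0, 0): 0.01, (1, 0): 0.09, (1, 1): 0.09, (2, 2): 0.81}

p_B = sum(p for (x, y), p in P.items() if x == y)

# Marginal pmf of X, obtained by summing out y.
P_X = {x: sum(p for (u, _), p in P.items() if u == x) for x in (0, 1, 2)}

print(p_B)   # ≈ 0.91
print(P_X)   # ≈ {0: 0.01, 1: 0.18, 2: 0.81}
```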
Example: From j-pmf to j-CDF
P_{X,Y}(x, y):
        y = 0   y = 1   y = 2
x = 0:  0.01    0       0
x = 1:  0.09    0.09    0
x = 2:  0       0       0.81
F_{X,Y}(x, y) = P[X ≤ x, Y ≤ y]:
        y = 0   y = 1   y = 2
x = 0:  0.01    0.01    0.01
x = 1:  0.10    0.19    0.19
x = 2:  0.10    0.19    1.00

From marginal pmf to j-pmf?
        y = 0   y = 1   P_X(x)
x = 0:  ??      ??      0.5
x = 1:  ??      ??      0.5
P_Y(y): 0.5     0.5
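The j-CDF is just a cumulative double sum over the j-pmf; a small sketch (my own code, table values as read from the example above):

```python
# F_{X,Y}(x, y) = sum of all pmf entries with u <= x and v <= y.
pmf = {(0, 0): 0.01, (1, 0): 0.09, (1, 1): 0.09, (2, 2): 0.81}

def F(x, y):
    return sum(p for (u, v), p in pmf.items() if u <= x and v <= y)

print(F(1, 1))   # ≈ 0.19
print(F(2, 2))   # ≈ 1.00
print(F(0, 2))   # ≈ 0.01
```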
From marginal pmf to j-pmf?
Three different joint pmfs, all with the same marginals P_X(x) = P_Y(y) = (0.5, 0.5):
        y = 0   y = 1   P_X(x)          y = 0   y = 1   P_X(x)          y = 0   y = 1   P_X(x)
x = 0:  0.5     0       0.5     x = 0:  0.25    0.25    0.5     x = 0:  0.1     0.4     0.5
x = 1:  0       0.5     0.5     x = 1:  0.25    0.25    0.5     x = 1:  0.4     0.1     0.5
P_Y(y): 0.5     0.5             P_Y(y): 0.5     0.5             P_Y(y): 0.5     0.5

From marginal pmf to j-pmf?
The j-pmf (or j-pdf) determines the marginal pmfs (or marginal pdfs), but not the other way around.
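This point is easy to demonstrate in code; a sketch (my own) with the three candidate joints:

```python
# Three distinct joint pmfs over {0,1} x {0,1} that share the same marginals.
joints = [
    {(0, 0): 0.5, (1, 1): 0.5},
    {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25},
    {(0, 0): 0.1, (0, 1): 0.4, (1, 0): 0.4, (1, 1): 0.1},
]

def marginal_x(pmf):
    """Sum out y to obtain P_X."""
    return {x: sum(p for (u, _), p in pmf.items() if u == x) for x in (0, 1)}

for pmf in joints:
    print(marginal_x(pmf))   # {0: 0.5, 1: 0.5} each time
```

Identical marginals, three different joints: the marginals alone cannot recover the joint.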
The joint probability density function
For continuous multiple random variables we use the joint pdf.
The j-pdf is related to the j-CDF by:
F_{X,Y}(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f_{X,Y}(u, v) du dv
f_{X,Y}(x, y) = ∂² F_{X,Y}(x, y) / (∂x ∂y)

Example j-pdf
For F_{X,Y}(x, y) = (x − 5)(y − 6) on 5 ≤ x < 6, 6 ≤ y < 7:
f_{X,Y}(x, y) = ∂²/(∂x ∂y) (x − 5)(y − 6) = 1 for 5 ≤ x < 6, 6 ≤ y < 7, and 0 otherwise.
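A quick numerical sketch (my own) confirming that the mixed partial derivative of F(x, y) = (x − 5)(y − 6) is indeed 1 inside the square:

```python
# Finite-difference estimate of d^2F/(dx dy) at an interior point.
def F(x, y):
    return (x - 5) * (y - 6)   # valid inside 5 <= x < 6, 6 <= y < 7 only

h = 1e-5
x0, y0 = 5.5, 6.5
f_est = (F(x0 + h, y0 + h) - F(x0 + h, y0)
         - F(x0, y0 + h) + F(x0, y0)) / h**2
print(f_est)   # ≈ 1.0
```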
The joint probability density function
Remember the properties of the pdf for scalar RVs:
f_X(x) ≥ 0 for all x
F_X(x) = ∫_{−∞}^{x} f_X(u) du
∫_{−∞}^{∞} f_X(x) dx = 1
Similarly, for the j-pdf we thus have:
f_{X,Y}(x, y) ≥ 0 for all (x, y)
F_{X,Y}(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f_{X,Y}(u, v) du dv
∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{X,Y}(x, y) dx dy = 1

The joint probability density function
The j-pdf and j-pmf thus simply extend what we already know from scalar RVs. However: finding the right integration area for these higher-dimensional integrals can be complicated!
P(A) = ∫∫_A f_{X,Y}(x, y) dx dy
Example
f_{X,Y}(x, y) = c for 0 ≤ y ≤ x ≤ 1, and 0 otherwise (a triangular region in the x,y-plane).
Value of c?

Example
f_{X,Y}(x, y) = c for 0 ≤ y ≤ x ≤ 1, and 0 otherwise.
Value of c? P(X > 1/2)?
P(X > 1/2) = ∫_{1/2}^{1} ∫_{0}^{x} 2 dy dx = 3/4
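The constant c and the event probability can be checked with a midpoint Riemann sum over the triangle; a sketch (my own code):

```python
# Grid approximation of the triangle 0 <= y <= x <= 1 with density c = 2.
n = 1000
d = 1.0 / n
total = 0.0    # should integrate to 1 (so c = 2 is correct)
p_half = 0.0   # P[X > 1/2]
for i in range(n):
    x = (i + 0.5) * d
    for j in range(n):
        y = (j + 0.5) * d
        if y <= x:                 # inside the triangle
            total += 2 * d * d
            if x > 0.5:
                p_half += 2 * d * d
print(total)    # ≈ 1.0
print(p_half)   # ≈ 0.75
```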
The marginal pdf
Remember for pmfs: P_Y(y) = Σ_{x ∈ S_X} P_{X,Y}(x, y) and P_X(x) = Σ_{y ∈ S_Y} P_{X,Y}(x, y).
How to obtain f_X(x) and f_Y(y) from f_{X,Y}(x, y)?
f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy
f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx
Proof using the CDF:
F_X(x) = P[X ≤ x] = ∫_{−∞}^{x} [ ∫_{−∞}^{∞} f_{X,Y}(u, y) dy ] du
f_X(x) = ∂F_X(x)/∂x

Illustration
(Figure: f_{X,Y}(x, y) with its marginals f_Y(y) and f_X(x).)
Example marginal pdf
(Same triangular region 0 ≤ y ≤ x ≤ 1 as before.) What are the marginal pdfs?

Independent RVs
Two RVs are independent if and only if:
Discrete RVs: P_{X,Y}(x, y) = P_X(x) P_Y(y)
Continuous RVs: f_{X,Y}(x, y) = f_X(x) f_Y(y)
Are X and Y independent here? No, as f_X(x) f_Y(y) ≠ f_{X,Y}(x, y).
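For the triangular density f(x, y) = 2 on 0 ≤ y ≤ x ≤ 1, the marginals work out to f_X(x) = 2x and f_Y(y) = 2(1 − y); a numerical sketch (my own) that also shows the independence test failing:

```python
# Marginals by numerically integrating out the other variable.
def f(x, y):
    return 2.0 if 0 <= y <= x <= 1 else 0.0

n = 10_000
d = 1.0 / n

def f_X(x):   # integrate out y
    return sum(f(x, (k + 0.5) * d) * d for k in range(n))

def f_Y(y):   # integrate out x
    return sum(f((k + 0.5) * d, y) * d for k in range(n))

x0, y0 = 0.8, 0.3
print(f_X(x0), 2 * x0)         # both ≈ 1.6
print(f_Y(y0), 2 * (1 - y0))   # both ≈ 1.4
# Product of marginals != joint at (x0, y0): X and Y are dependent.
print(f_X(x0) * f_Y(y0), f(x0, y0))
```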
Expected values of functions of RVs
Let g(X, Y) be a function of the variables X and Y. Expected value of W = g(X, Y), i.e., E[g(X, Y)]?
One route: determine the pdf of W and calculate ∫ w f_W(w) dw. However, it is often difficult to find the pdf of W = g(X, Y).
Much easier: E[W] = ∫∫ g(x, y) f_{X,Y}(x, y) dx dy

Expected values of sums of RVs
For the sum of two RVs we have:
E[X + Y] = E[X] + E[Y] (the typical value of X + Y)
Var[X + Y] = Var[X] + Var[Y] + 2 E[(X − E[X])(Y − E[Y])], where the last expectation is the covariance (the spread around E[X + Y]).
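Both identities can be verified on the discrete j-pmf table used earlier (my own sketch; table values as reconstructed):

```python
# Moments of X, Y and X + Y computed directly from the j-pmf.
pmf = {(0, 0): 0.01, (1, 0): 0.09, (1, 1): 0.09, (2, 2): 0.81}

EX = sum(x * p for (x, y), p in pmf.items())
EY = sum(y * p for (x, y), p in pmf.items())
VarX = sum((x - EX) ** 2 * p for (x, y), p in pmf.items())
VarY = sum((y - EY) ** 2 * p for (x, y), p in pmf.items())
Cov = sum((x - EX) * (y - EY) * p for (x, y), p in pmf.items())
VarSum = sum((x + y - EX - EY) ** 2 * p for (x, y), p in pmf.items())

# Var[X+Y] = Var[X] + Var[Y] + 2 Cov[X,Y]
print(VarSum, VarX + VarY + 2 * Cov)   # equal up to rounding
```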
Covariance
Cov[X, Y] = E[(X − E[X])(Y − E[Y])]
E[X + Y]: typical value of X + Y.
Var[X + Y]: spread around E[X + Y].
Cov[X, Y] = E[(X − E[X])(Y − E[Y])]: describes how X and Y vary together.

Covariance
(Figure: four scatter plots illustrating Cov[X, Y] > 0, Cov[X, Y] = 0, Cov[X, Y] < 0, and Cov[X, Y] = 0.)
Correlation
The correlation between X and Y is r_{X,Y} = E[XY].
Covariance: Cov[X, Y] = E[(X − E[X])(Y − E[Y])] = r_{X,Y} − E[X]E[Y]
Var[X + Y] = Var[X] + Var[Y] + 2 Cov[X, Y]
If X = Y: Cov[X, Y] = Var[X] = Var[Y] and r_{X,Y} = E[X²] = E[Y²]
Uncorrelated and orthogonal RVs
Two RVs are orthogonal if there is no correlation: r_{X,Y} = 0.
Two RVs are uncorrelated if Cov[X, Y] = 0.
Independent RVs are uncorrelated; the converse does not hold in general.
Independent vs. uncorrelated
(Figure.)

Example: covariance
(Figure: a region in the x,y-plane.)
What are E[X] and E[Y]? E[XY]? Cov(X, Y)?
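A classic uncorrelated-but-dependent pair (my own illustration, not necessarily the slide's example): Y = X² with X uniform on {−1, 0, 1}:

```python
# X uniform on {-1, 0, 1}, Y = X^2: Cov[X, Y] = 0, yet Y is a
# deterministic function of X, so they are clearly dependent.
pmf_x = {-1: 1 / 3, 0: 1 / 3, 1: 1 / 3}

EX = sum(x * p for x, p in pmf_x.items())           # E[X] = 0
EY = sum(x**2 * p for x, p in pmf_x.items())        # E[Y] = 2/3
EXY = sum(x * x**2 * p for x, p in pmf_x.items())   # E[XY] = E[X^3] = 0
cov = EXY - EX * EY
print(cov)   # 0.0 -> uncorrelated

# Dependence: P[Y = 1 | X = 1] = 1, while P[Y = 1] = 2/3.
```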
Example: covariance
(Figure: a region in the x,y-plane.)

Example: covariance
(Figure: a region in the x,y-plane.)
Gaussian variables
Joint Gaussian pdf
(Figure: surface plots of the joint Gaussian pdf.)
Gaussian variables
f_{X,Y}(x, y) = 1 / (2π σ_X σ_Y √(1 − ρ²_{X,Y})) · exp{ −[ ((x − E[X])/σ_X)² − 2 ρ_{X,Y} (x − E[X])(y − E[Y]) / (σ_X σ_Y) + ((y − E[Y])/σ_Y)² ] / (2(1 − ρ²_{X,Y})) }
For uncorrelated variables X and Y (Cov[X, Y] = 0):
f_{X,Y}(x, y) = 1/√(2π σ_X²) e^{−(x − E[X])² / (2σ_X²)} · 1/√(2π σ_Y²) e^{−(y − E[Y])² / (2σ_Y²)}

Gaussian variables
(Figure.)
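With ρ_{X,Y} = 0 the joint pdf indeed factors into the product of the two marginal Gaussian pdfs; a numerical sketch (my own, with assumed means 0 and standard deviations 1 and 2):

```python
# Evaluate the general bivariate Gaussian pdf and the product of marginals
# at one point; with rho = 0 they must agree.
import math

def bivariate_normal(x, y, mx, my, sx, sy, rho):
    z = ((x - mx) / sx) ** 2 \
        - 2 * rho * (x - mx) * (y - my) / (sx * sy) \
        + ((y - my) / sy) ** 2
    return math.exp(-z / (2 * (1 - rho ** 2))) / (
        2 * math.pi * sx * sy * math.sqrt(1 - rho ** 2))

def normal(x, m, s):
    return math.exp(-(x - m) ** 2 / (2 * s ** 2)) / math.sqrt(2 * math.pi * s ** 2)

x, y = 1.0, -0.5
joint = bivariate_normal(x, y, 0.0, 0.0, 1.0, 2.0, 0.0)
product = normal(x, 0.0, 1.0) * normal(y, 0.0, 2.0)
print(joint, product)   # equal up to rounding
```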
Multivariate models
(Figure.)

Multivariate models
(Figure.)
Practice before the next lecture
Some selected exercises (3rd ed.): 5.1.1, 5.2.1, 5.2.2, 5.3.2, 5.4.1, 5.5.3, 5.5.9, 5.5.8