Random Signals and Systems, Chapter 3. Jitendra K. Tugnait, Professor, Department of Electrical & Computer Engineering, Auburn University
Two Random Variables
Previously, we dealt with only one random variable. For the next few lectures, we will study two random variables:
» How they are related to each other
» How we describe this relationship
This is an intermediate step toward random signals.
Example
[Figure: a voltmeter across a random voltage source. The measured voltage (random) = random voltage + measurement error (a random variable).]
Joint Probability Distribution
Consider two RVs X and Y.
F_XY(x, y) = P(X ≤ x, Y ≤ y)   (the event is "X ≤ x AND Y ≤ y")
Properties:
1) 0 ≤ F(x, y) ≤ 1
2) F(−∞, y) = F(x, −∞) = F(−∞, −∞) = 0
3) F(∞, ∞) = 1
Properties cont'd
4) F(x, y) is a nondecreasing function as either x or y, or both, increase
5) F(∞, y) = F_Y(y),  F(x, ∞) = F_X(x)   (the marginal distributions)
   e.g. F(∞, y) = P(X ≤ ∞, Y ≤ y) = P(Y ≤ y) = F_Y(y)
Joint Density Function
f(x, y) = ∂²F(x, y)/∂x∂y   (the order of differentiation is not important)
f(x, y) dx dy = P(x < X ≤ x + dx, y < Y ≤ y + dy)
Properties:
1) f(x, y) ≥ 0
2) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1
3) F(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f(u, v) dv du
Joint Density Function cont'd
4) f_X(x) = ∫_{−∞}^{∞} f(x, y) dy,  f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx   (the marginal densities)
5) P(x1 < X ≤ x2, y1 < Y ≤ y2) = ∫_{y1}^{y2} ∫_{x1}^{x2} f(x, y) dx dy
Example
f(x, y) = 1 / [(x2 − x1)(y2 − y1)]   for x1 ≤ x ≤ x2, y1 ≤ y ≤ y2
        = 0   elsewhere
The joint density is constant over the rectangle, with height 1 / [(x2 − x1)(y2 − y1)].
Expected Values
Given two RVs X and Y, the expected value of g(X, Y) is
E{g(X, Y)} = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) f(x, y) dy dx
In particular, E{XY} = ∫∫ xy f(x, y) dy dx is called the correlation.
Example
f(x, y) = 1 / [(x2 − x1)(y2 − y1)] for x1 ≤ x ≤ x2, y1 ≤ y ≤ y2; 0 elsewhere.
E{XY} = ∫_{y1}^{y2} ∫_{x1}^{x2} xy / [(x2 − x1)(y2 − y1)] dx dy
      = (x2² − x1²)(y2² − y1²) / [4 (x2 − x1)(y2 − y1)]
      = (x1 + x2)(y1 + y2) / 4
Example cont'd
f_X(x) = ∫_{y1}^{y2} f(x, y) dy = (y2 − y1) / [(x2 − x1)(y2 − y1)] = 1 / (x2 − x1)   for x1 ≤ x ≤ x2
Similarly, f_Y(y) = ∫_{x1}^{x2} f(x, y) dx = 1 / (y2 − y1)   for y1 ≤ y ≤ y2
The marginals are uniform: X and Y are each uniformly distributed.
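The uniform-rectangle results above can be checked numerically. A minimal sketch (the limits x1, x2, y1, y2 below are hypothetical, chosen only for illustration): a midpoint-rule sum over the rectangle reproduces E{XY} = (x1 + x2)(y1 + y2)/4.

```python
import numpy as np

# Hypothetical limits, chosen only for illustration
x1, x2, y1, y2 = 0.0, 2.0, 1.0, 3.0
h = 1.0 / ((x2 - x1) * (y2 - y1))   # constant height of the joint density

# Midpoint-rule grid over the rectangle
n = 400
dx, dy = (x2 - x1) / n, (y2 - y1) / n
xs = x1 + (np.arange(n) + 0.5) * dx
ys = y1 + (np.arange(n) + 0.5) * dy
X, Y = np.meshgrid(xs, ys)

total = np.sum(h * np.ones_like(X)) * dx * dy   # integral of f: should be 1
exy = np.sum(X * Y * h) * dx * dy               # numerical E{XY}
closed_form = (x1 + x2) * (y1 + y2) / 4         # the slide's closed form
```

For these limits both `exy` and `closed_form` come out to 2, and the density integrates to 1.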
Example
Rectangular semiconductor substrate with dimensions having mean values of 2 cm and 1 cm.
» Actual dimensions are independent and uniformly distributed within ±0.01 cm of their means
a) Probability that both dimensions are larger than their mean values by 0.005 cm:
P(X > 2.005, Y > 1.005) = ∫_{1.005}^{1.01} ∫_{2.005}^{2.01} (1/0.02)(1/0.02) dx dy = (1/4)(1/4) = 1/16
Example cont'd
b) Probability that the larger dimension is greater than its mean value by 0.005 cm and the smaller dimension is less than its mean value by 0.005 cm:
P(X > 2.005, Y < 0.995) = ∫_{0.99}^{0.995} ∫_{2.005}^{2.01} (1/0.02)(1/0.02) dx dy = (1/4)(1/4) = 1/16
Example cont'd
c) The mean value of the area of the substrate:
E{A} = E{XY} = ∫_{0.99}^{1.01} ∫_{1.99}^{2.01} xy (1/0.02)(1/0.02) dx dy = 2 cm²
(By independence, E{XY} = E{X} E{Y} = 2 × 1 = 2.)
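A Monte Carlo sketch of parts (a) through (c), assuming the dimensions reconstructed above (means 2 cm and 1 cm, uniform within ±0.01 cm of the mean):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
# Independent uniform dimensions: X ~ U(1.99, 2.01), Y ~ U(0.99, 1.01)
X = rng.uniform(1.99, 2.01, n)
Y = rng.uniform(0.99, 1.01, n)

p_a = np.mean((X > 2.005) & (Y > 1.005))   # part (a): near 1/16
p_b = np.mean((X > 2.005) & (Y < 0.995))   # part (b): near 1/16
mean_area = np.mean(X * Y)                 # part (c): near 2 cm^2
```

With a million samples the empirical probabilities land within a fraction of a percent of 1/16 = 0.0625.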
Joint Density Function cont'd
Ex: Two RVs X and Y with
f(x, y) = A e^{−2x − 3y}   for x ≥ 0, y ≥ 0;  0 elsewhere
a) Value of A:
∫_0^∞ ∫_0^∞ A e^{−2x − 3y} dx dy = A ∫_0^∞ e^{−3y} [∫_0^∞ e^{−2x} dx] dy = A (1/2) ∫_0^∞ e^{−3y} dy = A (1/2)(1/3) = A/6 = 1
so A = 6.
Joint Density Function cont'd
b) P(X ≤ 1/2, Y ≤ 1/4) = ∫_0^{1/4} ∫_0^{1/2} 6 e^{−2x − 3y} dx dy = [1 − e^{−1}][1 − e^{−3/4}] ≈ 0.3335
c) The expected value of XY:
E{XY} = ∫_0^∞ ∫_0^∞ 6 xy e^{−2x − 3y} dy dx = 6 [∫_0^∞ x e^{−2x} dx][∫_0^∞ y e^{−3y} dy] = 6 (1/4)(1/9) = 1/6
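Both answers can be confirmed by brute-force numerical integration (a sketch; the infinite range in part (c) is truncated at 10, where the exponential tail is negligible):

```python
import numpy as np

def midgrid(a, b, n):
    """Cell midpoints and cell width for a midpoint-rule sum on [a, b]."""
    d = (b - a) / n
    return a + (np.arange(n) + 0.5) * d, d

f = lambda x, y: 6.0 * np.exp(-2 * x - 3 * y)

# b) P(X <= 1/2, Y <= 1/4)
xs, dx = midgrid(0.0, 0.5, 1000)
ys, dy = midgrid(0.0, 0.25, 1000)
X, Y = np.meshgrid(xs, ys)
p = np.sum(f(X, Y)) * dx * dy             # ~ (1 - e^-1)(1 - e^-3/4)

# c) E{XY}, truncating the infinite range at 10
xs, dx = midgrid(0.0, 10.0, 2000)
ys, dy = midgrid(0.0, 10.0, 2000)
X, Y = np.meshgrid(xs, ys)
exy = np.sum(X * Y * f(X, Y)) * dx * dy   # ~ 1/6
```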
Conditional Probability
Previously, we worked with conditional probability in terms of events:
P(A|B) = P(AB) / P(B)
We now define conditional probability in terms of random variables.
Example: What is the probability density of the actual voltage, given the value of the measured voltage?
[Figure: voltmeter. Measured voltage = actual voltage + measurement error.]
Conditional Probability
Consider two RVs X and Y with joint distribution F(x, y).
First, consider the probability that X ≤ x given that Y ≤ y:
F_X(x | Y ≤ y) = P(X ≤ x | Y ≤ y) = P(X ≤ x, Y ≤ y) / P(Y ≤ y) = F(x, y) / F_Y(y)
Now, suppose that y1 < Y ≤ y2 is given:
F_X(x | y1 < Y ≤ y2) = [F(x, y2) − F(x, y1)] / [F_Y(y2) − F_Y(y1)]
Now, suppose that Y = y is given (i.e. y1 = y2 = y):
P(X ≤ x | Y = y) = [F(x, y) − F(x, y)] / [F_Y(y) − F_Y(y)] = 0/0 ?
Conditional Probability cont'd
By taking the limit y2 → y1 (using something similar to l'Hospital's rule), we get
f_X(x | Y = y) = f(x, y) / f_Y(y)   (the conditional density, often written f(x|y))
f(x|y) = f(x, y) / f_Y(y),  f(y|x) = f(x, y) / f_X(x)   (conditional = joint / marginal)
Conditional Probability cont'd
Bayes' Theorem:
f(y|x) = f(x|y) f_Y(y) / f_X(x)
Total Probability:
f_X(x) = ∫ f(x, y) dy = ∫ f(x|y) f_Y(y) dy
f_Y(y) = ∫ f(x, y) dx = ∫ f(y|x) f_X(x) dx
Conditional Probability Example
Ex: f(x, y) = k(x + y) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1;  0 elsewhere
a) Value of k:
∫_0^1 ∫_0^1 k(x + y) dx dy = k(1/2 + 1/2) = k = 1
Conditional Probability Example
f(x, y) = x + y for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1;  0 elsewhere
b) Probability that X > 1/2 given that Y = 1/4?
f_Y(y) = ∫_0^1 (x + y) dx = 1/2 + y
f(x|y) = f(x, y) / f_Y(y) = (x + y) / (1/2 + y)
f(x | y = 1/4) = (x + 1/4) / (3/4)
P(X > 1/2 | Y = 1/4) = ∫_{1/2}^1 (x + 1/4)/(3/4) dx = (4/3)[x²/2 + x/4]_{1/2}^1 = (4/3)(1/2) = 2/3
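A short numerical check of the conditional-probability calculation (a sketch: build f(x|y) from the joint and marginal densities, then sum over x > 1/2):

```python
import numpy as np

# f(x, y) = x + y on the unit square
f_Y = lambda y: 0.5 + y                  # marginal density of Y
f_cond = lambda x, y: (x + y) / f_Y(y)   # conditional density f(x | y)

# P(X > 1/2 | Y = 1/4) by a midpoint-rule sum over x in (1/2, 1]
n = 10_000
dx = 0.5 / n
xs = 0.5 + (np.arange(n) + 0.5) * dx
p = np.sum(f_cond(xs, 0.25)) * dx        # matches the slide's 2/3
```

The midpoint rule is exact for a linear integrand, so `p` equals 2/3 to machine precision.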
Statistical Independence
Two RVs X and Y are statistically independent if and only if f(x, y) = f_X(x) f_Y(y)
» the joint density is separable
Statistical Independence
If X and Y are independent, then
f(x|y) = f(x, y) / f_Y(y) = f_X(x) f_Y(y) / f_Y(y) = f_X(x)
» Knowledge of y tells us nothing about x
» Similarly, f(y|x) = f_Y(y)
Uncorrelated Random Variables
The random variables X and Y are uncorrelated if
» E{XY} = E{X} E{Y}
Example
Two random variables, X and Y, have the joint density
f(x, y) = x + y for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1;  0 elsewhere
» Are they independent?
Answer: NO! The density function is not separable.
Example
Two random variables, X and Y, have the joint density
f(x, y) = 4xy for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1;  0 elsewhere
» Are they independent?
Answer: YES! The density function is separable:
f_X(x) = 2x for 0 ≤ x ≤ 1,  f_Y(y) = 2y for 0 ≤ y ≤ 1
Statistical Independence
If X and Y are independent, then
E{XY} = ∫∫ xy f(x, y) dx dy
      = ∫∫ xy f_X(x) f_Y(y) dx dy
      = ∫∫ [x f_X(x)][y f_Y(y)] dx dy
      = [∫ x f_X(x) dx][∫ y f_Y(y) dy]
      = E{X} E{Y}
» so X and Y are uncorrelated
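The chain of equalities above can be illustrated with the separable density f(x, y) = 4xy from the previous example (a grid-sum sketch):

```python
import numpy as np

# f(x, y) = 4xy on the unit square (separable, so X and Y are independent)
n = 1000
dx = 1.0 / n
m = (np.arange(n) + 0.5) * dx               # cell midpoints on [0, 1]
X, Y = np.meshgrid(m, m)

exy = np.sum(X * Y * 4 * X * Y) * dx * dx   # E{XY} under the joint density
ex = np.sum(m * 2 * m) * dx                 # E{X}, using the marginal f_X(x) = 2x
ey = ex                                     # by symmetry
# exy equals ex * ey, consistent with (2/3)^2 = 4/9
```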
Independent vs. Uncorrelated
Both "uncorrelated" and "independent" mean that the RVs are not related. They differ in how this lack of relationship is defined:
» independent: defined through the density function, f(x, y) = f_X(x) f_Y(y)
» uncorrelated: defined through first- and second-order statistics, E{XY} = E{X} E{Y}
Independence implies uncorrelatedness, but not the other way around.
Example
f(x, y) = k(xy + x + y + 1) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1
a) Are X and Y independent?
Answer: YES!  f(x, y) = k(xy + x + y + 1) = k(1 + x)(1 + y), which is separable.
b) Find k:
∫_0^1 ∫_0^1 k(1 + x)(1 + y) dx dy = k [x + x²/2]_0^1 [y + y²/2]_0^1 = k (3/2)(3/2) = 9k/4 = 1
so k = 4/9.
Example cont'd
f(x, y) = (4/9)(1 + x)(1 + y) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1
c) Find E{XY}:
f_X(x) = (2/3)(1 + x), so
E{X} = (2/3) ∫_0^1 x(1 + x) dx = (2/3)[x²/2 + x³/3]_0^1 = (2/3)(1/2 + 1/3) = 5/9
By symmetry, E{Y} = 5/9.
Since X and Y are independent, E{XY} = E{X} E{Y} = (5/9)(5/9) = 25/81
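A numerical sketch that checks the normalization constant and the expected values for this density:

```python
import numpy as np

n = 2000
dx = 1.0 / n
m = (np.arange(n) + 0.5) * dx        # cell midpoints on [0, 1]
X, Y = np.meshgrid(m, m)
k = 4 / 9
f = k * (X * Y + X + Y + 1)          # = k (1 + x)(1 + y)

total = np.sum(f) * dx * dx          # integrates to 1, confirming k = 4/9
ex = np.sum(X * f) * dx * dx         # E{X} = 5/9
exy = np.sum(X * Y * f) * dx * dx    # E{XY} = 25/81 = (5/9)^2
```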
Correlation Between RVs
[Figure: scatter plot of height (feet) vs. shoe size]
[Figure: scatter plot of vision (20/x) vs. shoe size]
[Figure: scatter plot of miles per gallon vs. car weight (lbs)]
Correlation Between RVs
We need to quantify the relationship between RVs. The correlation E{XY} is one way, but:
» X and Y may have different mean values and scales
» This makes it difficult to tell how related they are
If we subtract the means, we get the covariance:
Cov(X, Y) = E{(X − X̄)(Y − Ȳ)}
» The covariance is not affected by the mean values, but we still have a problem with RVs of different scales (i.e. different σ)
Problem with Correlation
[Figure: two scatter plots of the same relationship at different axis scales. One gives E{XY} = 80 and the other E{XY} = 5, even though the underlying relationship is identical: the raw correlation depends on scale.]
Correlation Coefficient
If we subtract the means and divide by the standard deviations, we get the correlation coefficient:
ρ = E{ [(X − X̄)/σ_X][(Y − Ȳ)/σ_Y] } = E{(X − X̄)(Y − Ȳ)} / (σ_X σ_Y)
» −1 ≤ ρ ≤ 1
» ρ > 0 means the RVs are positively correlated
» ρ < 0 means the RVs are negatively correlated
» ρ = 0 means the RVs are not correlated (uncorrelated)
» ρ = ±1 means the RVs are linearly related, e.g. Y = mX + b
   If ρ = 1, m > 0; if ρ = −1, m < 0
Correlation Coefficient
[Figure: the three scatter plots from before (height vs. shoe size, vision vs. shoe size, mpg vs. car weight), each annotated with its correlation coefficient, with values such as ρ ≈ 0.75, ρ ≈ 0, and ρ ≈ −0.75.]
Example
f(x, y) = x + y for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1;  0 elsewhere
X̄ = Ȳ = 7/12,  σ_X² = σ_Y² = 11/144
E{XY} = ∫_0^1 ∫_0^1 xy (x + y) dx dy = 1/3
E{XY} − X̄ Ȳ = 1/3 − 49/144 = −1/144
ρ = (−1/144) / (11/144) = −1/11
Slight negative correlation.
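A grid-based sketch that recomputes every moment in this example and confirms ρ = −1/11:

```python
import numpy as np

n = 2000
dx = 1.0 / n
m = (np.arange(n) + 0.5) * dx
X, Y = np.meshgrid(m, m)
f = X + Y                                # joint density on the unit square

E = lambda g: np.sum(g * f) * dx * dx    # expectation under f
ex, ey = E(X), E(Y)                      # both 7/12
cov = E(X * Y) - ex * ey                 # -1/144
var_x = E(X**2) - ex**2                  # 11/144; var_y is the same by symmetry
rho = cov / var_x                        # = cov / (sigma_x * sigma_y) here
```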
Correlation Between RVs cont'd
Ex: Z = aX + bY, with a, b constants. What is σ_Z²?
E{Z} = E{aX + bY} = aX̄ + bȲ
Z − Z̄ = a(X − X̄) + b(Y − Ȳ)
σ_Z² = E{(Z − Z̄)²} = E{a²(X − X̄)² + 2ab(X − X̄)(Y − Ȳ) + b²(Y − Ȳ)²}
     = a²σ_X² + 2ab E{(X − X̄)(Y − Ȳ)} + b²σ_Y²
     = a²σ_X² + 2ab ρ σ_X σ_Y + b²σ_Y²
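The variance formula can be verified on simulated data. A sketch with an arbitrarily chosen correlated pair (the constants a, b and the construction of Y below are illustrative assumptions, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500_000
a, b = 2.0, -3.0

# An illustrative correlated pair: Y depends linearly on X plus noise
X = rng.normal(0.0, 1.0, n)
Y = 0.6 * X + rng.normal(0.0, 0.8, n)

sx, sy = X.std(), Y.std()
rho = np.corrcoef(X, Y)[0, 1]
var_formula = a**2 * sx**2 + 2 * a * b * rho * sx * sy + b**2 * sy**2
var_direct = (a * X + b * Y).var()   # sample variance of Z = aX + bY
```

For sample moments the identity holds exactly, so `var_formula` and `var_direct` agree to machine precision; both are near the theoretical value a²(1) + 2ab(0.6) + b²(1) = 5.8.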
Formulas to Remember
ρ = E{(X − X̄)(Y − Ȳ)} / (σ_X σ_Y) = (E{XY} − X̄ Ȳ) / (σ_X σ_Y)
σ²_{aX+bY} = a²σ_X² + 2ab ρ σ_X σ_Y + b²σ_Y²
Density Function of the Sum of Two RVs
Suppose we have RVs X and Y that are statistically independent, and let Z = X + Y. What is f_Z(z)?
First find the probability distribution function:
F_Z(z) = P(Z ≤ z) = P(X + Y ≤ z)
[Figure: the region x + y ≤ z in the (x, y) plane, below the line x = z − y; integrate f(x, y) over this region.]
F_Z(z) = ∫_{−∞}^{∞} ∫_{−∞}^{z−y} f(x, y) dx dy
       = ∫_{−∞}^{∞} ∫_{−∞}^{z−y} f_X(x) f_Y(y) dx dy   (since X and Y are independent)
       = ∫_{−∞}^{∞} f_Y(y) [∫_{−∞}^{z−y} f_X(x) dx] dy
Density Function of the Sum of Two RVs cont'd
F_Z(z) = ∫_{−∞}^{∞} f_Y(y) [∫_{−∞}^{z−y} f_X(x) dx] dy
Differentiate both sides with respect to z using Leibniz's rule:
d/dt ∫_{g(t)}^{h(t)} A(t, s) ds = A(t, h(t)) h′(t) − A(t, g(t)) g′(t) + ∫_{g(t)}^{h(t)} ∂A(t, s)/∂t ds
This gives
f_Z(z) = ∫_{−∞}^{∞} f_Y(y) f_X(z − y) dy
Density Function of the Sum of Two RVs cont'd
f_Z(z) = ∫_{−∞}^{∞} f_Y(λ) f_X(z − λ) dλ
The probability density function of Z = X + Y, where X and Y are independent, is the convolution of the density functions of X and Y:
f_Z(z) = f_X(z) * f_Y(z)
For Z = X − Y:  f_Z(z) = ∫_{−∞}^{∞} f_Y(λ) f_X(z + λ) dλ
Convolution Example
f_X(x) = e^{−x} for x ≥ 0;  f_Y(y) = 1 for 0 ≤ y ≤ 1
Flip and shift f_Y(y):
f_Z(z) = ∫_{−∞}^{∞} f_X(λ) f_Y(z − λ) dλ
Convolution Example
f_X(λ) = e^{−λ} for λ ≥ 0;  f_Y(z − λ) = 1 for z − 1 ≤ λ ≤ z
[Figure: the flipped-and-shifted rectangle f_Y(z − λ) sliding past the exponential f_X(λ).]
Three cases:  z ≤ 0,  0 ≤ z ≤ 1,  z ≥ 1
Convolution Example
z ≤ 0:      f_Z(z) = 0
0 ≤ z ≤ 1:  f_Z(z) = ∫_0^z e^{−λ} dλ = 1 − e^{−z}
z ≥ 1:      f_Z(z) = ∫_{z−1}^z e^{−λ} dλ = e^{−(z−1)} − e^{−z}
[Figure: f_Z(z) rises from 0 to a peak of 1 − e^{−1} ≈ 0.63 at z = 1, then decays exponentially.]
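The three-case result can be cross-checked against a direct numerical convolution of the two sampled densities (a sketch; the exponential is truncated at 20, where its tail is negligible):

```python
import numpy as np

# Sampled densities: f_X(x) = e^{-x} for x >= 0, f_Y = 1 on [0, 1]
dz = 0.001
lam = np.arange(0.0, 20.0, dz)
fX = np.exp(-lam)
fY = np.where(lam <= 1.0, 1.0, 0.0)

fZ = np.convolve(fX, fY) * dz            # numerical convolution
z = np.arange(fZ.size) * dz

# Closed form from the slide (valid for z >= 0)
fZ_exact = np.where(z <= 1.0,
                    1.0 - np.exp(-z),
                    np.exp(-(z - 1.0)) - np.exp(-z))
max_err = np.max(np.abs(fZ - fZ_exact)[: lam.size])
```

The discretized convolution matches the closed form to within the grid resolution, including the peak of about 0.63 at z = 1.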
Correlation
For Z = X − Y:
f_Z(z) = ∫_{−∞}^{∞} f_Y(λ) f_X(z + λ) dλ
This is the deterministic correlation of the two functions. It is similar to convolution, but:
» No flipping, just shifting
» We must shift the f_X(z + λ) function
» When shifting, f_X(0) sits at λ = −z
Correlation Example
f_X(x) = e^{−x} for x ≥ 0;  f_Y(y) = 1 for 0 ≤ y ≤ 1;  Z = X − Y
f_Z(z) = ∫_{−∞}^{∞} f_Y(λ) f_X(z + λ) dλ
Shift f_X(x):  f_X(z + λ) = e^{−(z+λ)} for λ ≥ −z
Correlation Example
[Figure: the shifted exponential e^{−(z+λ)} sliding past the rectangle on 0 ≤ λ ≤ 1.]
Three cases:  z ≤ −1,  −1 ≤ z ≤ 0,  z ≥ 0
Correlation Example
z ≥ 0:       f_Z(z) = ∫_0^1 e^{−(z+λ)} dλ = e^{−z} − e^{−(z+1)} = e^{−z}(1 − e^{−1})
−1 ≤ z ≤ 0:  f_Z(z) = ∫_{−z}^1 e^{−(z+λ)} dλ = 1 − e^{−(z+1)}
z ≤ −1:      f_Z(z) = 0
Correlation Example
[Figure: f_Z(z) rises from 0 at z = −1 to a peak of 1 − e^{−1} ≈ 0.63 at z = 0, then decays as e^{−z}(1 − e^{−1}) for z ≥ 0.]
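A Monte Carlo sketch of this example: draw X exponential and Y uniform, form Z = X − Y, and compare a histogram with the closed-form density derived above.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
X = rng.exponential(1.0, n)          # f_X(x) = e^{-x}, x >= 0
Y = rng.uniform(0.0, 1.0, n)         # f_Y(y) = 1 on [0, 1]
Z = X - Y

def f_Z(z):
    """Closed-form density of Z = X - Y from the slides."""
    z = np.asarray(z, dtype=float)
    out = np.zeros_like(z)
    mid = (z >= -1.0) & (z < 0.0)
    out[mid] = 1.0 - np.exp(-(z[mid] + 1.0))
    pos = z >= 0.0
    out[pos] = np.exp(-z[pos]) * (1.0 - np.exp(-1.0))
    return out

edges = np.linspace(-1.0, 10.0, 111)     # bin width 0.1
hist, _ = np.histogram(Z, bins=edges, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
max_err = np.max(np.abs(hist - f_Z(centers)))
```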
Two Important Consequences
1) CENTRAL LIMIT THEOREM (CLT): The PDF of the sum of n independent random variables approaches a Gaussian density as n becomes large, regardless of the PDFs of the individual RVs. (Many RVs encountered in real life satisfy this.)
[Figure: repeated convolution of a uniform density; after a few convolutions the result already looks like a Gaussian bell curve.]
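A quick CLT demonstration (a sketch): sum 12 independent U(0, 1) variables, which gives mean 6 and variance 12·(1/12) = 1, and compare the histogram with the standard Gaussian density.

```python
import numpy as np

rng = np.random.default_rng(3)
n_terms, n_samples = 12, 500_000
# Sum of 12 independent U(0, 1): mean 6, variance 1
S = rng.uniform(0.0, 1.0, (n_samples, n_terms)).sum(axis=1)
Z = S - 6.0                          # zero mean, unit variance

edges = np.linspace(-4.0, 4.0, 81)
hist, _ = np.histogram(Z, bins=edges, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
gauss = np.exp(-centers**2 / 2.0) / np.sqrt(2.0 * np.pi)
max_err = np.max(np.abs(hist - gauss))
```

Even with only 12 terms the histogram tracks the Gaussian closely, which is why this construction is a classic way to approximate normal samples.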
Two Important Consequences
2) If X and Y are independent and Gaussian, then Z = X + Y is also Gaussian, with
Z̄ = X̄ + Ȳ  and  σ_Z² = σ_X² + σ_Y²