Two-Dimensional Random Vectors

Joint Cumulative Distribution Function

$F_{X,Y}(x, y) \triangleq P[X \le x \text{ and } Y \le y]$

Properties:
1) $F_{X,Y}(x, y)$ is a non-decreasing function of $x$ and $y$.
2) $F_{X,Y}(\infty, \infty) = 1$
3) $F_{X,Y}(-\infty, y) = F_{X,Y}(x, -\infty) = 0$
4) $F_{X,Y}(x, \infty) = F_X(x)$, $F_{X,Y}(\infty, y) = F_Y(y)$
5) $P[x_1 < X \le x_2,\ Y \le y] = F_{X,Y}(x_2, y) - F_{X,Y}(x_1, y)$
6) $P[x_1 < X \le x_2,\ y_1 < Y \le y_2] = F_{X,Y}(x_2, y_2) - F_{X,Y}(x_1, y_2) - F_{X,Y}(x_2, y_1) + F_{X,Y}(x_1, y_1)$
7) $F_{X,Y}(a^{+}, y) = \lim_{x \to a^{+}} F_{X,Y}(x, y) = F_{X,Y}(a, y)$ (continuity from the right)
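Property 6 (the rectangle probability) can be sanity-checked against a joint CDF of known form. As an illustration only, assume independent unit-rate exponential components; this choice, and the helper names, are mine, not from the notes.

```python
import math

# Illustrative joint CDF F(x, y) = (1 - e^-x)(1 - e^-y) for x, y >= 0
# (independent Exponential(1) components, chosen only for this sketch).
def F(x, y):
    return max(0.0, 1 - math.exp(-x)) * max(0.0, 1 - math.exp(-y))

def rect_prob(x1, x2, y1, y2):
    """P[x1 < X <= x2, y1 < Y <= y2] computed via property 6."""
    return F(x2, y2) - F(x1, y2) - F(x2, y1) + F(x1, y1)

# For a product-form CDF this must match the product of the two
# one-dimensional interval probabilities:
p = rect_prob(0.5, 1.5, 0.2, 1.0)
q = (math.exp(-0.5) - math.exp(-1.5)) * (math.exp(-0.2) - math.exp(-1.0))
print(abs(p - q) < 1e-12)  # True
```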
Joint Probability Density Function (pdf)

$f_{X,Y}(x, y) \triangleq \dfrac{\partial^2 F_{X,Y}(x, y)}{\partial x\,\partial y}$

Properties:
1) $f_{X,Y}(x, y) \ge 0$
2) $\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dx\,dy = 1$
3) $P[x_1 < X \le x_2,\ y_1 < Y \le y_2] = \int_{x_1}^{x_2}\int_{y_1}^{y_2} f_{X,Y}(x, y)\,dy\,dx$
4) $F_{X,Y}(x, y) = \int_{-\infty}^{x}\int_{-\infty}^{y} f_{X,Y}(u, v)\,dv\,du$
5) $f_{X,Y}(x, y)\,dx\,dy = P[x < X \le x + dx \text{ and } y < Y \le y + dy]$

Finding the marginal pdf from the joint pdf:

For continuous rvs:
$f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dy$, $\quad f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dx$

For discrete rvs:
$p_X(x) = \sum_y p_{X,Y}(x, y)$, $\quad p_Y(y) = \sum_x p_{X,Y}(x, y)$
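The discrete marginalization formulas can be sketched numerically; the joint pmf values below are illustrative, not from the notes.

```python
# Hypothetical discrete joint pmf p_{X,Y}(x, y), stored as {(x, y): prob}.
joint = {(0, 0): 0.25, (0, 1): 0.25,
         (1, 0): 0.125, (1, 1): 0.375}

def marginal_x(joint):
    """p_X(x) = sum over y of p_{X,Y}(x, y)."""
    out = {}
    for (x, _y), pr in joint.items():
        out[x] = out.get(x, 0.0) + pr
    return out

def marginal_y(joint):
    """p_Y(y) = sum over x of p_{X,Y}(x, y)."""
    out = {}
    for (_x, y), pr in joint.items():
        out[y] = out.get(y, 0.0) + pr
    return out

print(marginal_x(joint))  # {0: 0.5, 1: 0.5}
print(marginal_y(joint))  # {0: 0.375, 1: 0.625}
```

Each marginal sums to 1, as property 2 requires.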
Conditional Distributions

Conditional cdf:
$F_{Y|X}(y \mid x) \triangleq P[Y \le y \mid X = x]$

Since $P[X = x] = 0$ for a continuous rv, define

$F_{Y|X}(y \mid x) = \lim_{\Delta \to 0} P[Y \le y \mid x < X \le x + \Delta]
= \lim_{\Delta \to 0} \dfrac{P[Y \le y,\ x < X \le x + \Delta]}{P[x < X \le x + \Delta]}
= \lim_{\Delta \to 0} \dfrac{\int_{-\infty}^{y} f_{X,Y}(x, u)\,\Delta\,du}{f_X(x)\,\Delta}
= \dfrac{\int_{-\infty}^{y} f_{X,Y}(x, u)\,du}{f_X(x)}$  (eq. c)

Conditional pdf:
$f_{Y|X}(y \mid x) \triangleq \dfrac{d}{dy} F_{Y|X}(y \mid x)$

From eq. c,
$f_{Y|X}(y \mid x) = \dfrac{d}{dy} \dfrac{\int_{-\infty}^{y} f_{X,Y}(x, u)\,du}{f_X(x)} = \dfrac{f_{X,Y}(x, y)}{f_X(x)}$

As a consequence,
$f_{X,Y}(x, y) = f_X(x)\, f_{Y|X}(y \mid x) = f_Y(y)\, f_{X|Y}(x \mid y)$
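The relation $f_{X,Y} = f_X \cdot f_{Y|X}$ has a direct discrete analogue, $p_{Y|X}(y \mid x) = p_{X,Y}(x, y)/p_X(x)$. A minimal sketch with made-up pmf values:

```python
# Hypothetical joint pmf; conditioning on X = x renormalizes the "row" x.
joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.375}

def cond_y_given_x(joint, x):
    """p_{Y|X}(y | x) = p_{X,Y}(x, y) / p_X(x)."""
    px = sum(pr for (xi, _), pr in joint.items() if xi == x)  # marginal p_X(x)
    return {y: pr / px for (xi, y), pr in joint.items() if xi == x}

print(cond_y_given_x(joint, 1))  # {0: 0.25, 1: 0.75} -- sums to 1
```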
Example 4.5*

$F_{X,Y}(x, y) = \int_{-\infty}^{x}\int_{-\infty}^{y} f_{X,Y}(u, v)\,du\,dv$
Example 4.9*
Example. Jointly Gaussian Random Variables

$X$ and $Y$ are said to be jointly Gaussian if their joint pdf is given by

$f_{X,Y}(x, y) = \dfrac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}}\,
\exp\!\left\{ -\dfrac{1}{2(1-\rho^2)} \left[ \dfrac{(x-\mu_X)^2}{\sigma_X^2}
- 2\rho\,\dfrac{(x-\mu_X)(y-\mu_Y)}{\sigma_X\sigma_Y}
+ \dfrac{(y-\mu_Y)^2}{\sigma_Y^2} \right] \right\}$  (eq. g)

Note that there are five parameters: $\mu_X$, $\sigma_X^2$, $\mu_Y$, $\sigma_Y^2$, and $\rho$. $\rho$ is the correlation coefficient between $X$ and $Y$.

Exponent of eq. g (completing the square in $y$):

$-\dfrac{1}{2(1-\rho^2)}\left[ \dfrac{(x-\mu_X)^2}{\sigma_X^2}
- 2\rho\,\dfrac{(x-\mu_X)(y-\mu_Y)}{\sigma_X\sigma_Y}
+ \dfrac{(y-\mu_Y)^2}{\sigma_Y^2} \right]$

$= -\dfrac{1}{2(1-\rho^2)}\left[ (1-\rho^2)\dfrac{(x-\mu_X)^2}{\sigma_X^2}
+ \rho^2\dfrac{(x-\mu_X)^2}{\sigma_X^2}
- 2\rho\,\dfrac{(x-\mu_X)(y-\mu_Y)}{\sigma_X\sigma_Y}
+ \dfrac{(y-\mu_Y)^2}{\sigma_Y^2} \right]$

$= -\dfrac{(x-\mu_X)^2}{2\sigma_X^2}
- \dfrac{1}{2(1-\rho^2)}\left[ \dfrac{y-\mu_Y}{\sigma_Y} - \rho\,\dfrac{x-\mu_X}{\sigma_X} \right]^2$

$= -\dfrac{(x-\mu_X)^2}{2\sigma_X^2}
- \dfrac{\left[ y - \mu_Y - \rho\frac{\sigma_Y}{\sigma_X}(x-\mu_X) \right]^2}{2\sigma_Y^2(1-\rho^2)}$

The joint pdf in eq. g can therefore be written as the product of two Gaussian pdf's:

$f_{X,Y}(x, y) = \dfrac{1}{\sqrt{2\pi}\,\sigma_X}\,
e^{-\frac{(x-\mu_X)^2}{2\sigma_X^2}} \cdot
\dfrac{1}{\sqrt{2\pi}\,\sigma_Y\sqrt{1-\rho^2}}\,
e^{-\frac{\left[y-\mu_Y-\rho\frac{\sigma_Y}{\sigma_X}(x-\mu_X)\right]^2}{2\sigma_Y^2(1-\rho^2)}}$  (eq. g2)

Eq. g2 shows the relation $f_{X,Y}(x, y) = f_X(x)\,f_{Y|X}(y \mid x)$. The first term is the marginal pdf of $X$, which is Gaussian $N(\mu_X, \sigma_X^2)$. The second term is the conditional pdf $f_{Y|X}(y \mid x)$, which is also Gaussian, $N\!\left(\mu_Y + \rho\dfrac{\sigma_Y}{\sigma_X}(x-\mu_X),\ \sigma_Y^2(1-\rho^2)\right)$.
Alternatively, we could rewrite eq. g to show $f_{X,Y}(x, y) = f_Y(y)\,f_{X|Y}(x \mid y)$.

In summary, when $X$ and $Y$ are jointly Gaussian:
- The random variables $X$ and $Y$ are each marginally Gaussian.
- The conditional pdf $f_{Y|X}(y \mid x)$ is Gaussian with mean $\mu_Y + \rho\dfrac{\sigma_Y}{\sigma_X}(x-\mu_X)$ and variance $\sigma_Y^2(1-\rho^2)$.
- The conditional variance depends on $\rho$ but does not depend on the condition $X = x$.
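The factorization of eq. g into a marginal times a conditional can be checked numerically. The parameter values and helper names below are arbitrary choices for the sketch, not from the notes.

```python
import math

# Arbitrary parameters: means, standard deviations, correlation coefficient.
mx, sx, my, sy, rho = 1.0, 2.0, -1.0, 0.5, 0.6

def phi(z, mu, var):
    """Univariate Gaussian pdf N(mu, var) evaluated at z."""
    return math.exp(-(z - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def joint_pdf(x, y):
    """Bivariate Gaussian pdf, eq. g."""
    q = ((x - mx) ** 2 / sx ** 2
         - 2 * rho * (x - mx) * (y - my) / (sx * sy)
         + (y - my) ** 2 / sy ** 2)
    return math.exp(-q / (2 * (1 - rho ** 2))) / (
        2 * math.pi * sx * sy * math.sqrt(1 - rho ** 2))

# Compare eq. g against f_X(x) * f_{Y|X}(y|x) at an arbitrary point.
x0, y0 = 0.3, -0.7
lhs = joint_pdf(x0, y0)
rhs = phi(x0, mx, sx ** 2) * phi(y0, my + rho * sy / sx * (x0 - mx),
                                 sy ** 2 * (1 - rho ** 2))
print(abs(lhs - rhs) < 1e-12)  # True
```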
Moments of Bivariate Random Variables

Conditional Mean:
$E[Y \mid X = x] = \int_{-\infty}^{\infty} y\, f_{Y|X}(y \mid x)\,dy$

Example (discrete bivariate random variables). Find the conditional mean. [joint pmf table from the original slide]

Correlation:
$E[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} xy\, f_{X,Y}(x, y)\,dx\,dy$

$X$ and $Y$ are said to be orthogonal if their correlation $E[XY]$ is zero.

Example (discrete bivariate random variables). Find the correlation between $X$ and $Y$. [joint pmf table from the original slide]
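Since the pmf tables from the original slides did not survive, here is a sketch of both computations on a hypothetical table:

```python
# Hypothetical joint pmf (not the table from the slides).
joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.375}

def cond_mean_y_given_x(joint, x):
    """E[Y | X = x] = sum_y y * p_{Y|X}(y | x)."""
    px = sum(pr for (xi, _), pr in joint.items() if xi == x)
    return sum(y * pr / px for (xi, y), pr in joint.items() if xi == x)

corr = sum(x * y * pr for (x, y), pr in joint.items())  # E[XY]
print(cond_mean_y_given_x(joint, 0))  # 0.5
print(corr)                           # 0.375
```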
Example. Find the correlation between jointly Gaussian random variables.

Solution. $X$ and $Y$ are jointly Gaussian if their joint pdf is given by eq. g, with the five parameters $\mu_X$, $\sigma_X^2$, $\mu_Y$, $\sigma_Y^2$, and $\rho$. We have shown that for jointly Gaussian $X$ and $Y$:
- $X \sim N(\mu_X, \sigma_X^2)$, $\quad Y \sim N(\mu_Y, \sigma_Y^2)$
- $f_{Y|X}(y \mid x)$ is $N\!\left(\mu_Y + \rho\frac{\sigma_Y}{\sigma_X}(x-\mu_X),\ \sigma_Y^2(1-\rho^2)\right)$

$E[XY] = \int\!\!\int xy\, f_{X,Y}(x, y)\,dx\,dy
= \int x\, f_X(x) \left[ \int y\, f_{Y|X}(y \mid x)\,dy \right] dx$

$= \int x\, f_X(x) \left[ \mu_Y + \rho\frac{\sigma_Y}{\sigma_X}(x - \mu_X) \right] dx$

$= \mu_Y \int x\, f_X(x)\,dx + \rho\frac{\sigma_Y}{\sigma_X} \int x(x - \mu_X)\, f_X(x)\,dx$

$= \mu_X\mu_Y + \rho\frac{\sigma_Y}{\sigma_X}\left(E[X^2] - \mu_X^2\right)$

$= \mu_X\mu_Y + \rho\frac{\sigma_Y}{\sigma_X}\sigma_X^2
= \mu_X\mu_Y + \rho\,\sigma_X\sigma_Y$
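The result $E[XY] = \mu_X\mu_Y + \rho\sigma_X\sigma_Y$ can be sanity-checked by Monte Carlo, sampling $Y$ from the conditional Gaussian derived above; the parameter values are arbitrary.

```python
import random, math

random.seed(0)
mx, sx, my, sy, rho = 1.0, 2.0, -1.0, 0.5, 0.6  # arbitrary parameters
n = 200_000
acc = 0.0
for _ in range(n):
    x = random.gauss(mx, sx)
    # Y | X = x  ~  N(my + rho*(sy/sx)*(x - mx), sy^2*(1 - rho^2))
    y = random.gauss(my + rho * sy / sx * (x - mx),
                     sy * math.sqrt(1 - rho ** 2))
    acc += x * y
print(acc / n)  # close to mx*my + rho*sx*sy = -1.0 + 0.6 = -0.4
```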
Covariance

$\mathrm{cov}(X, Y) \triangleq E[(X - E[X])(Y - E[Y])]
= \int\!\!\int (x - \mu_X)(y - \mu_Y)\, f_{X,Y}(x, y)\,dx\,dy$

Property: $\mathrm{cov}(X, Y) = E[XY] - E[X]\,E[Y]$

$X$ and $Y$ are said to be uncorrelated if $\mathrm{cov}(X, Y) = 0$.

Example. For jointly Gaussian bivariate random variables,
$E[X] = \mu_X$, $\quad E[Y] = \mu_Y$, $\quad E[XY] = \mu_X\mu_Y + \rho\sigma_X\sigma_Y$
$\mathrm{cov}(X, Y) = E[XY] - E[X]E[Y] = \mu_X\mu_Y + \rho\sigma_X\sigma_Y - \mu_X\mu_Y = \rho\,\sigma_X\sigma_Y$

Homework. Discrete bivariate random variables. Find the covariance between $X$ and $Y$. [joint pmf table from the original slide]
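The shortcut $\mathrm{cov}(X,Y) = E[XY] - E[X]E[Y]$ can be verified against the definition on a small made-up pmf (values illustrative, not from the notes):

```python
# Hypothetical discrete joint pmf.
joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.375}

ex  = sum(x * pr for (x, y), pr in joint.items())          # E[X]
ey  = sum(y * pr for (x, y), pr in joint.items())          # E[Y]
exy = sum(x * y * pr for (x, y), pr in joint.items())      # E[XY]

# Covariance straight from the definition E[(X - E[X])(Y - E[Y])].
cov_def = sum((x - ex) * (y - ey) * pr for (x, y), pr in joint.items())

print(cov_def, exy - ex * ey)  # the two expressions agree
```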
Correlation Coefficient (or Normalized Covariance)

$\rho_{X,Y} \triangleq \dfrac{\mathrm{cov}(X, Y)}{\sigma_X \sigma_Y}$

Example. For jointly Gaussian random variables,
$\rho_{X,Y} = \dfrac{\mathrm{cov}(X, Y)}{\sigma_X\sigma_Y} = \dfrac{\rho\,\sigma_X\sigma_Y}{\sigma_X\sigma_Y} = \rho$

[Figure: plots of a jointly Gaussian pdf for different values of the correlation coefficient.]
Independent Random Variables

$X$ and $Y$ are said to be independent if $f_{X,Y}(x, y) = f_X(x)\,f_Y(y)$ for all values of $x, y$.

If $X$ and $Y$ are independent, $f_{Y|X}(y \mid x) = f_Y(y)$ and $f_{X|Y}(x \mid y) = f_X(x)$.

Property: independence implies uncorrelatedness.

Proof. If independent,
$E[XY] = \int\!\!\int xy\, f_X(x) f_Y(y)\,dx\,dy = \int x f_X(x)\,dx \int y f_Y(y)\,dy = E[X]\,E[Y]$
so $\mathrm{cov}(X, Y) = E[XY] - E[X]E[Y] = 0$ and
$\rho_{X,Y} = \dfrac{\mathrm{cov}(X, Y)}{\sigma_X\sigma_Y} = 0$.
Thus $X$ and $Y$ are uncorrelated.

The converse is not necessarily true. However, for jointly Gaussian random variables, the converse is true.

Proof. $f_{Y|X}(y \mid x)$ is $N\!\left(\mu_Y + \rho\frac{\sigma_Y}{\sigma_X}(x-\mu_X),\ \sigma_Y^2(1-\rho^2)\right)$. If uncorrelated, $\rho = 0$ and thus $f_{Y|X}(y \mid x) = N(\mu_Y, \sigma_Y^2) = f_Y(y)$, so $f_{X,Y}(x, y) = f_X(x)\,f_Y(y)$. Thus $X$ and $Y$ are independent.
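A quick numerical sketch of "independent implies uncorrelated": building a joint pmf as a product of marginals forces the covariance to zero. The grid and probabilities are illustrative.

```python
# Hypothetical marginals; the joint is product-form, so X, Y are
# independent by construction.
px = {0: 0.3, 1: 0.7}
py = {-1: 0.5, 2: 0.5}
joint = {(x, y): px[x] * py[y] for x in px for y in py}  # f = f_X * f_Y

ex  = sum(x * pr for (x, y), pr in joint.items())
ey  = sum(y * pr for (x, y), pr in joint.items())
exy = sum(x * y * pr for (x, y), pr in joint.items())

print(exy - ex * ey)  # 0 up to rounding: independent implies uncorrelated
```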
Cauchy–Schwarz Inequality

For any pair of random variables $X$ and $Y$,
$\left(E[XY]\right)^2 \le E[X^2]\,E[Y^2]$

Homework. Prove the Cauchy–Schwarz inequality.
Hint: $E[(X - \lambda Y)^2] \ge 0$ for any random variables $X$ and $Y$, and for any constant $\lambda$.

Bounds on the Correlation Coefficient

For any pair of random variables $X$ and $Y$, $\ |\rho_{X,Y}| \le 1$.

Proof. Notation: $E[X] = \mu_X$ and $\mathrm{VAR}(X) = \sigma_X^2$.
$\mathrm{cov}(X, Y) = E[(X - \mu_X)(Y - \mu_Y)]$; using the Cauchy–Schwarz inequality,
$\left|\mathrm{cov}(X, Y)\right| \le \sqrt{E[(X - \mu_X)^2]\, E[(Y - \mu_Y)^2]} = \sigma_X\sigma_Y$
Therefore $|\rho_{X,Y}| = \dfrac{|\mathrm{cov}(X, Y)|}{\sigma_X\sigma_Y} \le 1$.
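The bound $|\rho_{X,Y}| \le 1$ can be illustrated empirically: the sample correlation coefficient stays within $[-1, 1]$ even for a deliberately nonlinear relationship. The data-generating choices below are mine, for illustration only.

```python
import random

random.seed(1)
xs = [random.uniform(-3, 3) for _ in range(1000)]
ys = [x ** 2 + random.gauss(0, 1) for x in xs]  # nonlinear link, for variety

mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
sx = (sum((x - mx) ** 2 for x in xs) / len(xs)) ** 0.5
sy = (sum((y - my) ** 2 for y in ys) / len(ys)) ** 0.5
r = cov / (sx * sy)  # sample correlation coefficient

print(-1.0 <= r <= 1.0)  # True
```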
Variance of a Sum of Random Variables

$\mathrm{VAR}(X \pm Y) = \mathrm{VAR}(X) + \mathrm{VAR}(Y) \pm 2\,\mathrm{cov}(X, Y)$

Proof.
$\mathrm{VAR}(X + Y) = E[(X + Y)^2] - \left(E[X + Y]\right)^2$
$= E[X^2] + 2E[XY] + E[Y^2] - (E[X])^2 - 2E[X]E[Y] - (E[Y])^2$
$= \mathrm{VAR}(X) + \mathrm{VAR}(Y) + 2\,\mathrm{cov}(X, Y)$

Property: if $X$ and $Y$ are independent, $\mathrm{cov}(X, Y) = 0$ and thus
$\mathrm{VAR}(X + Y) = \mathrm{VAR}(X - Y) = \mathrm{VAR}(X) + \mathrm{VAR}(Y)$
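The identity $\mathrm{VAR}(X+Y) = \mathrm{VAR}(X) + \mathrm{VAR}(Y) + 2\,\mathrm{cov}(X,Y)$ holds exactly for sample moments normalized by $n$, which makes it easy to check on correlated samples; the data below are generated arbitrarily for the sketch.

```python
import random

random.seed(2)
xs = [random.gauss(0, 1) for _ in range(100_000)]
ys = [0.5 * x + random.gauss(0, 1) for x in xs]  # correlated with X

def mean(v):
    return sum(v) / len(v)

def var(v):
    """Population-normalized (divide-by-n) sample variance."""
    m = mean(v)
    return sum((t - m) ** 2 for t in v) / len(v)

cov = mean([x * y for x, y in zip(xs, ys)]) - mean(xs) * mean(ys)
lhs = var([x + y for x, y in zip(xs, ys)])
rhs = var(xs) + var(ys) + 2 * cov

print(abs(lhs - rhs) < 1e-6)  # True: identity holds for sample moments
```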