Two-dimensional Random Vectors

Joint Cumulative Distribution Function

$F_{X,Y}(x, y) \triangleq P[X \le x \text{ and } Y \le y]$

Properties:
1) $F_{X,Y}(\infty, \infty) = 1$
2) $F_{X,Y}(-\infty, y) = F_{X,Y}(x, -\infty) = 0$
3) $F_{X,Y}(x, y)$ is a non-decreasing function of $x$ and of $y$
4) $F_{X,Y}(x, \infty) = F_X(x)$, $F_{X,Y}(\infty, y) = F_Y(y)$
5) $P[x_1 < X \le x_2, \; Y \le y] = F_{X,Y}(x_2, y) - F_{X,Y}(x_1, y)$
6) $P[x_1 < X \le x_2, \; y_1 < Y \le y_2] = F_{X,Y}(x_2, y_2) - F_{X,Y}(x_1, y_2) - F_{X,Y}(x_2, y_1) + F_{X,Y}(x_1, y_1)$
7) $F_{X,Y}(a, y) = \lim_{x \to a^+} F_{X,Y}(x, y)$, i.e., the joint cdf is right-continuous in each argument
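Property 6 can be checked numerically. Below is a minimal sketch (an addition, not part of the original slides); it assumes NumPy and SciPy and uses a bivariate Gaussian as an arbitrary stand-in joint distribution, comparing the rectangle probability obtained from the joint cdf with a Monte Carlo estimate.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Illustrative joint distribution (our choice): a bivariate Gaussian.
rv = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, 0.5], [0.5, 2.0]])
F = lambda x, y: rv.cdf([x, y])  # joint cdf F_{X,Y}(x, y)

x1, x2, y1, y2 = -0.5, 1.0, -1.0, 0.8
# Property 6: probability of the rectangle (x1, x2] x (y1, y2] from the cdf.
p_rect = F(x2, y2) - F(x1, y2) - F(x2, y1) + F(x1, y1)

# Monte Carlo estimate of the same probability.
s = rv.rvs(size=200_000, random_state=0)
p_mc = np.mean((s[:, 0] > x1) & (s[:, 0] <= x2) & (s[:, 1] > y1) & (s[:, 1] <= y2))
print(p_rect, p_mc)  # the two values should agree to about two decimal places
```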

Joint Probability Density Function (pdf)

$f_{X,Y}(x, y) \triangleq \frac{\partial^2 F_{X,Y}(x, y)}{\partial x \, \partial y}$

Properties:
1) $f_{X,Y}(x, y) \ge 0$
2) $\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{X,Y}(x, y) \, dx \, dy = 1$
3) $P[x_1 < X \le x_2, \; y_1 < Y \le y_2] = \int_{x_1}^{x_2} \int_{y_1}^{y_2} f_{X,Y}(x, y) \, dy \, dx$
4) $F_{X,Y}(x, y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f_{X,Y}(u, v) \, dv \, du$
5) $f_{X,Y}(x, y) \, dx \, dy = P[x < X \le x + dx \text{ and } y < Y \le y + dy]$

Finding the marginal pdf from the joint pdf:
For continuous rvs: $f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y) \, dy$ and $f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x, y) \, dx$
For discrete rvs: $p_X(x) = \sum_y p_{X,Y}(x, y)$ and $p_Y(y) = \sum_x p_{X,Y}(x, y)$
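A short sketch of the marginalization formula (our addition, assuming SciPy; the bivariate Gaussian and its parameter values are illustrative choices, not from the slides):

```python
import numpy as np
from scipy.stats import multivariate_normal, norm
from scipy.integrate import quad

# Illustrative joint pdf whose X marginal is known to be N(0, 1).
rv = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, 0.5], [0.5, 2.0]])
x = 0.7
# Marginal pdf by integrating y out of the joint pdf: f_X(x) = ∫ f_{X,Y}(x, y) dy.
fX, _ = quad(lambda y: rv.pdf([x, y]), -np.inf, np.inf)
print(fX, norm(0.0, 1.0).pdf(x))  # should agree with the known N(0, 1) marginal
```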

Conditional Distributions

Conditional cdf: $F_{Y|X}(y|x) \triangleq P[Y \le y \mid X = x]$. For a continuous rv $X$ the event $[X = x]$ has probability zero, so the conditional cdf is defined through a limit:

$F_{Y|X}(y|x) = \lim_{\Delta \to 0} P[Y \le y \mid x < X \le x + \Delta] = \lim_{\Delta \to 0} \frac{P[Y \le y, \; x < X \le x + \Delta]}{P[x < X \le x + \Delta]} = \lim_{\Delta \to 0} \frac{\int_{-\infty}^{y} f_{X,Y}(x, u) \, \Delta \, du}{f_X(x) \, \Delta} = \frac{\int_{-\infty}^{y} f_{X,Y}(x, u) \, du}{f_X(x)}$ (eq. c)

Conditional pdf: $f_{Y|X}(y|x) \triangleq \frac{d}{dy} F_{Y|X}(y|x)$. From eq. c,

$f_{Y|X}(y|x) = \frac{d}{dy} \frac{\int_{-\infty}^{y} f_{X,Y}(x, u) \, du}{f_X(x)} = \frac{f_{X,Y}(x, y)}{f_X(x)}$

As a consequence,

$f_{X,Y}(x, y) = f_{Y|X}(y|x) \, f_X(x) = f_{X|Y}(x|y) \, f_Y(y)$
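The ratio definition of the conditional pdf can be checked numerically. A minimal sketch (our addition), again assuming SciPy and the same illustrative joint Gaussian; it verifies that $f_{Y|X}(\cdot \mid x_0)$ integrates to 1:

```python
import numpy as np
from scipy.stats import multivariate_normal
from scipy.integrate import quad

rv = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, 0.5], [0.5, 2.0]])
x0 = 0.7
# Marginal f_X(x0) by integrating the joint pdf over y.
fX, _ = quad(lambda y: rv.pdf([x0, y]), -np.inf, np.inf)
# Conditional pdf f_{Y|X}(y|x0) = f_{X,Y}(x0, y) / f_X(x0).
f_cond = lambda y: rv.pdf([x0, y]) / fX
total, _ = quad(f_cond, -np.inf, np.inf)
print(total)  # ≈ 1.0, as required of any pdf
```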

Example 4.5* (worked on the slide). Recovering the joint cdf from the joint pdf:

$F_{X,Y}(x, y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f_{X,Y}(u, v) \, dv \, du$

Example 4.9* (worked on the slide).

Example: Jointly Gaussian Random Variables

$X$ and $Y$ are said to be jointly Gaussian if their joint pdf is given by

$f_{X,Y}(x, y) = \frac{1}{2\pi \sigma_X \sigma_Y \sqrt{1 - \rho^2}} \exp\left\{ -\frac{1}{2(1 - \rho^2)} \left[ \left( \frac{x - \mu_X}{\sigma_X} \right)^2 - 2\rho \frac{(x - \mu_X)(y - \mu_Y)}{\sigma_X \sigma_Y} + \left( \frac{y - \mu_Y}{\sigma_Y} \right)^2 \right] \right\}$ (eq. g)

Note that there are five parameters: $\mu_X$, $\sigma_X$, $\mu_Y$, $\sigma_Y$, and $\rho$. Here $\rho$ is the correlation coefficient between $X$ and $Y$.

Exponent of eq. g:

$-\frac{1}{2(1 - \rho^2)} \left[ \left( \frac{x - \mu_X}{\sigma_X} \right)^2 - 2\rho \frac{(x - \mu_X)(y - \mu_Y)}{\sigma_X \sigma_Y} + \left( \frac{y - \mu_Y}{\sigma_Y} \right)^2 \right]$

Adding and subtracting $\rho^2 \left( \frac{x - \mu_X}{\sigma_X} \right)^2$ to complete the square:

$= -\frac{1}{2(1 - \rho^2)} \left[ (1 - \rho^2) \left( \frac{x - \mu_X}{\sigma_X} \right)^2 + \rho^2 \left( \frac{x - \mu_X}{\sigma_X} \right)^2 - 2\rho \frac{(x - \mu_X)(y - \mu_Y)}{\sigma_X \sigma_Y} + \left( \frac{y - \mu_Y}{\sigma_Y} \right)^2 \right]$

$= -\frac{(x - \mu_X)^2}{2\sigma_X^2} - \frac{1}{2(1 - \rho^2)} \left[ \frac{y - \mu_Y}{\sigma_Y} - \rho \frac{x - \mu_X}{\sigma_X} \right]^2$

$= -\frac{(x - \mu_X)^2}{2\sigma_X^2} - \frac{\left[ y - \mu_Y - \rho \frac{\sigma_Y}{\sigma_X}(x - \mu_X) \right]^2}{2\sigma_Y^2 (1 - \rho^2)}$

The joint pdf in eq. g can therefore be written as the product of two Gaussian pdfs:

$f_{X,Y}(x, y) = \frac{1}{\sqrt{2\pi} \, \sigma_X} e^{-\frac{(x - \mu_X)^2}{2\sigma_X^2}} \cdot \frac{1}{\sqrt{2\pi} \, \sigma_Y \sqrt{1 - \rho^2}} e^{-\frac{\left[ y - \mu_Y - \rho \frac{\sigma_Y}{\sigma_X}(x - \mu_X) \right]^2}{2\sigma_Y^2 (1 - \rho^2)}}$ (eq. g')

Eq. g' shows the relation $f_{X,Y}(x, y) = f_X(x) \, f_{Y|X}(y|x)$. The first factor is the marginal pdf of $X$, which is Gaussian $N(\mu_X, \sigma_X^2)$. The second factor is the conditional pdf $f_{Y|X}(y|x)$, which is also Gaussian:

$N\left( \mu_Y + \rho \frac{\sigma_Y}{\sigma_X}(x - \mu_X), \; \sigma_Y^2 (1 - \rho^2) \right)$
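The factorization in eq. g' lends itself to a numerical check. A hedged sketch (our addition, assuming SciPy), with illustrative values for the five parameters:

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

mx, sx, my, sy, rho = 1.0, 2.0, -0.5, 1.5, 0.6   # the five parameters (example values)
cov = [[sx**2, rho * sx * sy], [rho * sx * sy, sy**2]]
joint = multivariate_normal([mx, my], cov)

x, y = 0.3, -1.2
fX = norm(mx, sx).pdf(x)                          # marginal N(mu_X, sigma_X^2)
cond_mean = my + rho * (sy / sx) * (x - mx)       # conditional mean from eq. g'
cond_std = sy * np.sqrt(1 - rho**2)               # conditional standard deviation
fY_given_X = norm(cond_mean, cond_std).pdf(y)     # conditional pdf f_{Y|X}(y|x)
print(joint.pdf([x, y]), fX * fY_given_X)         # the two values agree
```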

Alternatively, eq. g can be rewritten to show $f_{X,Y}(x, y) = f_Y(y) \, f_{X|Y}(x|y)$.

In summary, when $X$ and $Y$ are jointly Gaussian:
- The random variables $X$ and $Y$ are each marginally Gaussian.
- The conditional pdf $f_{Y|X}(y|x)$ is Gaussian with mean $\mu_Y + \rho \frac{\sigma_Y}{\sigma_X}(x - \mu_X)$ and variance $\sigma_Y^2 (1 - \rho^2)$.
- The conditional variance depends on $\rho$ but does not depend on the condition $X = x$.

Moments of Bivariate Random Variables

Conditional Mean

$E[Y \mid X = x] = \int_{-\infty}^{\infty} y \, f_{Y|X}(y|x) \, dy$

Example (discrete bivariate random variables; pmf table given on the slide). Find the conditional mean.

Correlation

$E[XY] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x y \, f_{X,Y}(x, y) \, dx \, dy$

$X$ and $Y$ are said to be orthogonal if the correlation $E[XY]$ is zero.

Example (discrete bivariate random variables; pmf table given on the slide). Find the correlation between $X$ and $Y$. A numerical sketch of both computations follows below.
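Since the slide's pmf table is not recoverable, the sketch below substitutes a hypothetical pmf purely for illustration (the values are invented); it computes the correlation $E[XY]$ and the conditional means $E[Y \mid X = x]$ for a discrete pair, assuming NumPy:

```python
import numpy as np

# Hypothetical pmf table (invented values, for illustration only).
xs = np.array([0, 1])                  # support of X (rows)
ys = np.array([0, 1, 2])               # support of Y (columns)
p = np.array([[0.1, 0.2, 0.1],
              [0.2, 0.1, 0.3]])
assert np.isclose(p.sum(), 1.0)        # a pmf must sum to 1

# Correlation E[XY] = sum over x, y of x * y * p(x, y).
EXY = sum(x * y * p[i, j] for i, x in enumerate(xs) for j, y in enumerate(ys))

# Conditional pmf p_{Y|X}(y|x): normalize each row by the marginal p_X(x).
pY_given_X = p / p.sum(axis=1, keepdims=True)
EY_given_X = pY_given_X @ ys           # E[Y | X = x] for each x in xs
print(EXY, EY_given_X)
```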

Example. Find the correlation between jointly Gaussian random variables.

Solution. $X$ and $Y$ are jointly Gaussian with joint pdf given by eq. g, with the five parameters $\mu_X$, $\sigma_X$, $\mu_Y$, $\sigma_Y$, and $\rho$. We have shown that for jointly Gaussian $X$ and $Y$:

$X \sim N(\mu_X, \sigma_X^2)$, $Y \sim N(\mu_Y, \sigma_Y^2)$, and $f_{Y|X}(y|x) = N\left( \mu_Y + \rho \frac{\sigma_Y}{\sigma_X}(x - \mu_X), \; \sigma_Y^2 (1 - \rho^2) \right)$.

$E[XY] = \int \int x y \, f_{X,Y}(x, y) \, dx \, dy = \int x f_X(x) \left[ \int y \, f_{Y|X}(y|x) \, dy \right] dx$
$= \int x f_X(x) \left[ \mu_Y + \rho \frac{\sigma_Y}{\sigma_X}(x - \mu_X) \right] dx$
$= \mu_Y \int x f_X(x) \, dx + \rho \frac{\sigma_Y}{\sigma_X} \int x (x - \mu_X) f_X(x) \, dx$
$= \mu_X \mu_Y + \rho \frac{\sigma_Y}{\sigma_X} \left( E[X^2] - \mu_X^2 \right)$
$= \mu_X \mu_Y + \rho \frac{\sigma_Y}{\sigma_X} \sigma_X^2$
$= \mu_X \mu_Y + \rho \sigma_X \sigma_Y$
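A Monte Carlo sanity check of the result $E[XY] = \mu_X \mu_Y + \rho \sigma_X \sigma_Y$ (our addition; parameter values are illustrative):

```python
import numpy as np

mx, sx, my, sy, rho = 1.0, 2.0, -0.5, 1.5, 0.6   # illustrative parameter values
cov = [[sx**2, rho * sx * sy], [rho * sx * sy, sy**2]]
rng = np.random.default_rng(0)
xy = rng.multivariate_normal([mx, my], cov, size=1_000_000)
print(np.mean(xy[:, 0] * xy[:, 1]))              # Monte Carlo estimate of E[XY]
print(mx * my + rho * sx * sy)                   # closed form derived above
```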

Covariance

$\mathrm{cov}(X, Y) \triangleq E[(X - E[X])(Y - E[Y])] = \int \int (x - E[X])(y - E[Y]) \, f_{X,Y}(x, y) \, dx \, dy$

Property:

$\mathrm{cov}(X, Y) = E[XY] - E[X] \, E[Y]$

$X$ and $Y$ are said to be uncorrelated if $\mathrm{cov}(X, Y) = 0$.

Example. For jointly Gaussian bivariate random variables, $E[X] = \mu_X$, $E[Y] = \mu_Y$, and $E[XY] = \mu_X \mu_Y + \rho \sigma_X \sigma_Y$, so

$\mathrm{cov}(X, Y) = E[XY] - E[X] E[Y] = \mu_X \mu_Y + \rho \sigma_X \sigma_Y - \mu_X \mu_Y = \rho \sigma_X \sigma_Y$

Homework (discrete bivariate random variables; pmf table given on the slide). Find the covariance between $X$ and $Y$.
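The identity $\mathrm{cov}(X, Y) = E[XY] - E[X] E[Y]$ can be confirmed on samples. A minimal sketch assuming NumPy, with an arbitrary correlated pair of our own construction:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=100_000)
y = 0.8 * x + rng.normal(size=100_000)            # arbitrary correlated pair
print(np.mean(x * y) - np.mean(x) * np.mean(y))   # E[XY] - E[X]E[Y]
print(np.cov(x, y, bias=True)[0, 1])              # sample covariance (matches)
```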

Correlation Coefficient (or Normalized Covariance)

$\rho_{X,Y} \triangleq \frac{\mathrm{cov}(X, Y)}{\sigma_X \sigma_Y}$

Example. For jointly Gaussian random variables,

$\rho_{X,Y} = \frac{\mathrm{cov}(X, Y)}{\sigma_X \sigma_Y} = \frac{\rho \sigma_X \sigma_Y}{\sigma_X \sigma_Y} = \rho$

[Figure: plots of a jointly Gaussian pdf for different values of the correlation coefficient.]

Independent Random Variables

$X$ and $Y$ are said to be independent if $f_{X,Y}(x, y) = f_X(x) \, f_Y(y)$ for all values of $x, y$. If $X$ and $Y$ are independent, then $f_{Y|X}(y|x) = f_Y(y)$ and $f_{X|Y}(x|y) = f_X(x)$.

Property: independence implies uncorrelatedness.

Proof. If $X$ and $Y$ are independent, $f_{X,Y}(x, y) = f_X(x) f_Y(y)$, so

$E[XY] = \int \int x y \, f_X(x) f_Y(y) \, dx \, dy = \left( \int x f_X(x) \, dx \right) \left( \int y f_Y(y) \, dy \right) = E[X] \, E[Y]$

Hence $\mathrm{cov}(X, Y) = E[XY] - E[X] E[Y] = 0$ and $\rho_{X,Y} = \frac{\mathrm{cov}(X, Y)}{\sigma_X \sigma_Y} = 0$. Thus $X$ and $Y$ are uncorrelated.

The converse is not necessarily true. However, for jointly Gaussian random variables the converse is true.

Proof. $f_{Y|X}(y|x) = N\left( \mu_Y + \rho \frac{\sigma_Y}{\sigma_X}(x - \mu_X), \; \sigma_Y^2 (1 - \rho^2) \right)$. If $X$ and $Y$ are uncorrelated, then $\rho = 0$ and thus $f_{Y|X}(y|x) = N(\mu_Y, \sigma_Y^2) = f_Y(y)$, which gives $f_{X,Y}(x, y) = f_X(x) f_Y(y)$. Thus $X$ and $Y$ are independent.
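To illustrate that uncorrelated does not imply independent, a standard counterexample (our addition, not from the slides) is $Y = X^2$ with $X \sim N(0, 1)$:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=1_000_000)
y = x**2                                          # fully determined by x, hence dependent
print(np.mean(x * y) - np.mean(x) * np.mean(y))   # ≈ 0: the pair is uncorrelated
# Yet P[Y <= 1 | |X| <= 1] = 1 while P[Y <= 1] < 1, so X and Y are not independent.
```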

Cauchy-Schwarz Inequality

For any pair of random variables $X$ and $Y$,

$E[XY]^2 \le E[X^2] \, E[Y^2]$

Homework. Prove the Cauchy-Schwarz inequality. Hint: $E[(X - \lambda Y)^2] \ge 0$ for any random variables $X$ and $Y$ and for any constant $\lambda$.

Bounds on the Correlation Coefficient

For any pair of random variables $X$ and $Y$,

$-1 \le \rho_{X,Y} \le 1$

Proof. Notation: $E[X] = \mu_X$ and $\mathrm{VAR}(X) = \sigma_X^2$. Since $\mathrm{cov}(X, Y) = E[(X - \mu_X)(Y - \mu_Y)]$, the Cauchy-Schwarz inequality gives

$\mathrm{cov}(X, Y)^2 \le E[(X - \mu_X)^2] \, E[(Y - \mu_Y)^2] = \sigma_X^2 \sigma_Y^2$

Therefore $|\rho_{X,Y}| = \frac{|\mathrm{cov}(X, Y)|}{\sigma_X \sigma_Y} \le 1$.
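A quick numerical illustration of the inequality (our addition, assuming NumPy) on an arbitrary pair of independent samples:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.exponential(size=50_000)          # arbitrary distributions (our choice)
y = rng.normal(loc=2.0, size=50_000)
lhs = np.mean(x * y) ** 2                 # sample estimate of E[XY]^2
rhs = np.mean(x**2) * np.mean(y**2)       # sample estimate of E[X^2] E[Y^2]
print(lhs <= rhs)                         # True, consistent with Cauchy-Schwarz
```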

Variance of a Sum of Random Variables

$\mathrm{VAR}(X \pm Y) = \mathrm{VAR}(X) + \mathrm{VAR}(Y) \pm 2 \, \mathrm{cov}(X, Y)$

Proof.

$\mathrm{VAR}(X + Y) = E\left[ (X + Y - E[X] - E[Y])^2 \right] = E\left[ \left( (X - E[X]) + (Y - E[Y]) \right)^2 \right]$
$= E[(X - E[X])^2] + E[(Y - E[Y])^2] + 2 E[(X - E[X])(Y - E[Y])]$
$= \mathrm{VAR}(X) + \mathrm{VAR}(Y) + 2 \, \mathrm{cov}(X, Y)$

Property. If $X$ and $Y$ are independent, $\mathrm{cov}(X, Y) = 0$ and thus

$\mathrm{VAR}(X + Y) = \mathrm{VAR}(X - Y) = \mathrm{VAR}(X) + \mathrm{VAR}(Y)$
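A sample-based check of the variance-of-a-sum formula (our addition, assuming NumPy; the correlated pair is an arbitrary construction):

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(size=500_000)
y = 0.5 * x + rng.normal(size=500_000)           # correlated pair
lhs = np.var(x + y)                              # VAR(X + Y) from samples
rhs = np.var(x) + np.var(y) + 2 * np.cov(x, y, bias=True)[0, 1]
print(lhs, rhs)                                  # the two values agree
```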