ECE 450 Lecture #9 Part 2 Overview


ECE 450 - Lecture #9 Part 2 Overview: Bivariate Moments; Mean or Expected Value of Z = g(X, Y); Correlation and Covariance of RVs; Functions of RVs: Z = g(X, Y), finding f_Z(z). Method 1: First find F_Z(z), by definition; then differentiate to find f_Z(z). Method 2: Method of Auxiliary Variables.

Expected Value of Z = g(x, Y) For continuous RV s: E(Z) = E[g(X, Y)] g(x, y)f (x, y)dxdy zf Z (z)dz (often hard to find) For discrete RV s: E(Z) = E[g(X, Y)] j k g(x j, y k ) Pr(X x j, Y y k )

Property/Example: Expected Value. Say Z = X + Y. E(Z) = E(X + Y) = ∫∫ (x + y) f_XY(x, y) dx dy = ∫∫ x f_XY(x, y) dx dy + ∫∫ y f_XY(x, y) dx dy = ∫ x f_X(x) dx + ∫ y f_Y(y) dy = E(X) + E(Y). Note: Expectation is linear! Similarly, E(aX + bY + c) = a E(X) + b E(Y) + c.

Correlation of RVs. Definition: The correlation of RVs X and Y is E(XY). Calculation: E(XY) = ∫∫ xy f_XY(x, y) dx dy. Concept: Correlation is a measure of similarity. (Large) positive correlation: X, Y tend to be both negative or both positive, on the average. (Large, in absolute value) negative correlation: X, Y tend to be opposite in sign, on the average. Note: E(XY) ≠ E(X) E(Y), in general. Problem with correlation as a measure of similarity: the number may be large just due to the fact that the RVs take large values. Example: the correlation between GPAs and entry-level salaries would be larger if we measured salaries in cents rather than dollars.
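
The scale problem is easy to demonstrate numerically; a sketch with made-up GPA/salary figures (the linear model and noise level below are assumptions, purely for illustration):

```python
import numpy as np

# Correlation E(XY) depends on the units of measurement: re-expressing
# salaries in cents multiplies it by 100, though the relationship between
# the variables is unchanged.  Hypothetical data for illustration.
rng = np.random.default_rng(1)
gpa = rng.uniform(2.0, 4.0, size=10_000)
salary_dollars = 20_000 * gpa + rng.normal(0, 5_000, size=gpa.size)
salary_cents = 100 * salary_dollars

print(f"E(XY), salaries in dollars: {np.mean(gpa * salary_dollars):.3e}")
print(f"E(XY), salaries in cents:   {np.mean(gpa * salary_cents):.3e}")
```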

Covariance of RVs. Definition: The covariance of RVs X and Y is cov(X, Y) = E{(X − μ_X)(Y − μ_Y)}, where μ_X = E(X) and μ_Y = E(Y). Calculation: cov(X, Y) = ∫∫ (x − μ_X)(y − μ_Y) f_XY(x, y) dx dy. Property #1: cov(X, Y) = E(XY) − E(X) E(Y): the mean of the product minus the product of the means (similar to the expression for variance). Covariance is a (partially) normalized measure of similarity between RVs.

Proof of Property #1. Cov(X, Y) = ∫∫ (x − μ_X)(y − μ_Y) f(x, y) dx dy = ∫∫ (xy − μ_X y − x μ_Y + μ_X μ_Y) f(x, y) dx dy = ∫∫ xy f(x, y) dx dy − ∫∫ μ_X y f(x, y) dx dy − ∫∫ μ_Y x f(x, y) dx dy + ∫∫ μ_X μ_Y f(x, y) dx dy = E(XY) − term2 − term3 + E(X) E(Y), where terms 2 and 3 are computed on the next page.

Proof of Property #1, continued. So far, cov(X, Y) = E(XY) − term2 − term3 + E(X) E(Y), where term2 = ∫∫ μ_X y f(x, y) dx dy = μ_X ∫ y [∫ f(x, y) dx] dy = μ_X ∫ y f_Y(y) dy = μ_X μ_Y, and where term3 = ∫∫ μ_Y x f(x, y) dx dy = μ_Y ∫ x [∫ f(x, y) dy] dx = μ_Y ∫ x f_X(x) dx = μ_Y μ_X. Hence cov(X, Y) = E(XY) − μ_X μ_Y − μ_Y μ_X + μ_X μ_Y = E(XY) − μ_X μ_Y = E(XY) − E(X) E(Y).
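
Property #1 can also be checked numerically: on any sample, the averaged definition and the mean-of-product-minus-product-of-means form agree. A minimal sketch, with an arbitrary dependent pair assumed for the demo:

```python
import numpy as np

# Check cov(X, Y) = E(XY) - E(X)E(Y) against the defining formula.
# Assumed joint model for illustration: X ~ N(0, 1), Y = X + noise.
rng = np.random.default_rng(2)
x = rng.standard_normal(500_000)
y = x + 0.5 * rng.standard_normal(x.size)

by_definition = np.mean((x - x.mean()) * (y - y.mean()))
by_property_1 = np.mean(x * y) - x.mean() * y.mean()
print(f"definition:  {by_definition:.4f}")   # both ~1.0
print(f"property #1: {by_property_1:.4f}")
```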

More About Covariances. Var(X ± Y) = Var(X) + Var(Y) ± 2 cov(X, Y). Definition: RVs X and Y are uncorrelated if cov(X, Y) = 0. Property: X, Y independent ⇒ X, Y uncorrelated (but not, in general, conversely).
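
The classic counterexample for the converse is Y = X² with X symmetric about zero: cov(X, Y) = E(X³) = 0, yet Y is completely determined by X. A short sketch (the distribution is my choice, for illustration) that also exercises the Var(X + Y) identity:

```python
import numpy as np

# Uncorrelated does not imply independent: X ~ N(0, 1), Y = X**2 gives
# cov(X, Y) = E(X**3) = 0, yet Y is a deterministic function of X.
# With cov = 0, Var(X + Y) should match Var(X) + Var(Y).
rng = np.random.default_rng(3)
x = rng.standard_normal(1_000_000)
y = x ** 2

print(f"cov(X, Y) ~ {np.mean(x * y) - x.mean() * y.mean():+.4f}")  # ~0
print(f"Var(X+Y) = {np.var(x + y):.3f}")                  # ~3.0
print(f"Var(X) + Var(Y) = {np.var(x) + np.var(y):.3f}")   # ~3.0
```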

Correlation Coefficient. Defn: The correlation coefficient between RVs X and Y is ρ = cov(X, Y)/(σ_X σ_Y) (often denoted r). It measures the degree to which X and Y are statistically related in a linear sense. Property 1: −1 ≤ ρ ≤ 1. Property 2: If Y = aX + b, where a ≠ 0, then ρ = ±1 (ρ = 1 if a > 0). Note: the parameter ρ (or r) in the pdf for jointly Gaussian RVs is the correlation coefficient for X and Y.
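
Property 2 is quick to confirm on data; a sketch with arbitrary demo values of a and b:

```python
import numpy as np

# If Y = aX + b with a != 0, the correlation coefficient is +/-1,
# taking the sign of a.
rng = np.random.default_rng(4)
x = rng.uniform(-1, 1, size=100_000)

for a, b in [(3.0, 2.0), (-3.0, 2.0)]:   # arbitrary demo values
    y = a * x + b
    print(f"a = {a:+.0f}:  rho = {np.corrcoef(x, y)[0, 1]:+.4f}")
```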

Correlation Intuition re: Correlation Coefficient, r. Say we run an experiment in which we measure the outcomes of RVs X and Y, and plot the resulting points in the x-y plane. Case 1: r ≈ .9 (highly correlated in the positive sense; the points nearly fall on a line with positive slope). Case 2: r ≈ −.9 (highly correlated in the negative sense; the points nearly fall on a line with negative slope). [Scatter plots omitted from the transcription.]

Correlation Intuition, continued. Case 3: r ≈ 0 (little correlation; little dependence of any kind between X and Y). Case 4: r ≈ 0 (little correlation; little linear dependence, but obviously there exists a strong dependence between X and Y). [Scatter plots omitted from the transcription.]

Example. Givens: E(X) = 0, E(Y) = 2, var(X) = 4, var(Y) = 1, ρ = .4; W = X + Y; Z = 2X + 3Y. Find: a) The mean of W and the mean of Z: E(W) = E(X + Y) = E(X) + E(Y) = 0 + 2 = 2; E(Z) = E(2X + 3Y) = 2 E(X) + 3 E(Y) = 2(0) + 3(2) = 6. b) The variance of W and the variance of Z: var(W) = var(X + Y) = var(X) + var(Y) + 2 cov(X, Y) (need cov(X, Y)).

Example, continued. ρ = cov(X, Y)/(σ_X σ_Y), so cov(X, Y) = ρ σ_X σ_Y = .4(2)(1) = .8. Thus, var(W) = var(X + Y) = var(X) + var(Y) + 2 cov(X, Y) = 4 + 1 + 2(.8) = 6.6. And var(Z) = var(U + V), where U = 2X, V = 3Y: var(Z) = var(U) + var(V) + 2 cov(U, V). var(U) = var(2X) = 4 var(X) = 4(4) = 16; var(V) = var(3Y) = 9 var(Y) = 9(1) = 9; cov(U, V) = E(UV) − E(U) E(V) = E[(2X)(3Y)] − 2 E(X) · 3 E(Y) = 6 E(XY) − 0 = 6 E(XY) (need E(XY)).

Example, continued. To find E(XY): cov(X, Y) = E(XY) − E(X) E(Y), so .8 = E(XY) − (0)(2), giving E(XY) = .8. Thus, cov(U, V) = 6 E(XY) = 6(.8) = 4.8. Hence, var(Z) = var(U) + var(V) + 2 cov(U, V) = 16 + 9 + 2(4.8) = 34.6. c) The correlation coefficient, ρ_WZ, of W and Z: first we will find cov(W, Z).

Example, continued. Cov(W, Z) = E(WZ) − E(W) E(Z). E(WZ) = E[(X + Y)(2X + 3Y)] = E[2X² + 5XY + 3Y²] = 2 E(X²) + 5 E(XY) + 3 E(Y²). var(X) = E(X²) − (E(X))²: 4 = E(X²) − 0², so E(X²) = 4; var(Y) = E(Y²) − (E(Y))²: 1 = E(Y²) − (2)², so E(Y²) = 5. Thus, E(WZ) = 2(4) + 5(.8) + 3(5) = 27. Therefore, Cov(W, Z) = 27 − (2)(6) = 27 − 12 = 15.
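
All of the example's numbers can be verified by simulation. Any joint distribution with the given first and second moments works; the bivariate Gaussian below is assumed purely for convenience:

```python
import numpy as np

# Check E(W)=2, E(Z)=6, var(W)=6.6, var(Z)=34.6, cov(W,Z)=15 for
# W = X + Y, Z = 2X + 3Y, given E(X)=0, E(Y)=2, var(X)=4, var(Y)=1,
# rho = 0.4.  A bivariate Gaussian with these moments is assumed.
rng = np.random.default_rng(5)
cov_xy = 0.4 * 2.0 * 1.0                     # rho * sigma_X * sigma_Y = 0.8
cov = [[4.0, cov_xy], [cov_xy, 1.0]]
x, y = rng.multivariate_normal([0.0, 2.0], cov, size=2_000_000).T
w, z = x + y, 2 * x + 3 * y

print(f"E(W) = {w.mean():.2f}, E(Z) = {z.mean():.2f}")            # ~2, ~6
print(f"var(W) = {w.var():.2f}, var(Z) = {z.var():.2f}")          # ~6.6, ~34.6
print(f"cov(W, Z) = {np.mean(w * z) - w.mean() * z.mean():.2f}")  # ~15
```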

Example (continued from Lecture 8). Recall f_XY(x, y) = 6(1 − x − y) on the triangle with vertices (0, 0), (1, 0), (0, 1). So far: f_X(x) = 3(1 − x)² for 0 ≤ x < 1, 0 else; f_Y(y) = 3(1 − y)² for 0 ≤ y < 1, 0 else. Find (a) cov(X, Y) and (b) ρ. (a) Solution: we need E(XY), E(X), and E(Y), since cov(X, Y) = E(XY) − E(X) E(Y).

Example, continued. E(X) = ∫ x f_X(x) dx = ∫₀¹ 3x(1 − x)² dx = 1/4; E(Y) = ∫ y f_Y(y) dy = ∫₀¹ 3y(1 − y)² dy = 1/4. E(XY) = ∫∫ xy f_XY(x, y) dx dy = ∫_{x=0..1} ∫_{y=0..1−x} 6xy(1 − x − y) dy dx = ∫_{x=0..1} ∫_{y=0..1−x} (6xy − 6x²y − 6xy²) dy dx, where the inner upper limit comes from the triangle's boundary line y = −x + 1.

Example, continued. Inner integral: ∫_{y=0..1−x} (6xy − 6x²y − 6xy²) dy = 6x(1 − x)³ (1/2 − 1/3) = x(1 − x)³. Thus (evaluating the outer integral), E(XY) = ∫₀¹ x(1 − x)³ dx = 1/20, and cov(X, Y) = E(XY) − E(X) E(Y) = (1/20) − (1/4)(1/4) = (1/20) − (1/16) = −1/80.

Example, continued. b) Find ρ: ρ = cov(X, Y)/(σ_X σ_Y), so we need var(X) and var(Y) for the denominator; to get var(X), we need E(X²), and to get var(Y), we need E(Y²). E(X²) = ∫ x² f_X(x) dx = ∫₀¹ 3x²(1 − x)² dx = 1/10 = E(Y²). Thus, var(X) = var(Y) = E(X²) − (E(X))² = (1/10) − (1/16) = 3/80, and ρ = (−1/80)/(3/80) = −1/3.
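
The fractions are easy to confirm with a numerical double integral over the triangle (a midpoint-grid sketch; the grid size is an arbitrary choice):

```python
import numpy as np

# Confirm E(X) = 1/4, E(XY) = 1/20, cov = -1/80, rho = -1/3 for
# f(x, y) = 6(1 - x - y) on the triangle x >= 0, y >= 0, x + y <= 1.
n = 2000
xs = (np.arange(n) + 0.5) / n                     # midpoints on (0, 1)
x, y = np.meshgrid(xs, xs, indexing="ij")
f = np.where(x + y <= 1, 6 * (1 - x - y), 0.0)
dA = (1.0 / n) ** 2

EX = np.sum(x * f) * dA
EXY = np.sum(x * y * f) * dA
EX2 = np.sum(x ** 2 * f) * dA
cov = EXY - EX * EX                               # E(Y) = E(X) by symmetry
var = EX2 - EX ** 2                               # var(Y) = var(X) likewise
print(f"E(X) = {EX:.4f},  E(XY) = {EXY:.4f}")     # ~0.25, ~0.05
print(f"cov = {cov:.5f},  rho = {cov / var:.4f}") # ~-0.0125, ~-0.3333
```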

Functions of RVs, Method 1. Let Z = g(X, Y), where X and Y are RVs. F_Z(z) = Pr(Z ≤ z) = Pr(g(X, Y) ≤ z); the condition g(x, y) ≤ z defines a subset, say R(z), of the xy-plane, so F_Z(z) = ∫∫_{R(z)} f_XY(x, y) dx dy. Then: f_Z(z) = (d/dz) F_Z(z).

Example. Let Z = X² + Y², with f_XY(x, y) = [1/(2πσ²)] exp[−(x² + y²)/(2σ²)]; find f_Z(z). F_Z(z) = Pr(Z ≤ z) = ∫∫_{x²+y² ≤ z} f_XY(x, y) dx dy. Converting to polar coordinates (dx dy → r dr dθ): F_Z(z) = ∫_{θ=0..2π} ∫_{r=0..√z} [1/(2πσ²)] exp[−r²/(2σ²)] r dr dθ = (1 − e^{−z/(2σ²)}) U(z), via the substitution u = r²/(2σ²), which reduces the inner integral to ∫ e^{−u} du.

Example, continued. Thus, f_Z(z) = (d/dz) F_Z(z) = (d/dz) [(1 − e^{−z/(2σ²)}) U(z)] = [1/(2σ²)] e^{−z/(2σ²)} U(z) + (1 − e^{−z/(2σ²)}) δ(z) (product rule) = [1/(2σ²)] e^{−z/(2σ²)} U(z), since the δ-term's coefficient vanishes at z = 0. So Z is an exponential RV.
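
A Monte Carlo sanity check of this result (σ = 1.5 is an arbitrary choice for the demo): Z = X² + Y² should have mean 2σ² and the exponential CDF found above.

```python
import numpy as np

# Z = X**2 + Y**2, X and Y iid N(0, sigma**2): compare the empirical CDF
# of Z with the derived F_Z(z) = 1 - exp(-z / (2 sigma**2)).
rng = np.random.default_rng(6)
sigma = 1.5
z = np.sum(rng.normal(0, sigma, size=(1_000_000, 2)) ** 2, axis=1)

print(f"mean of Z: {z.mean():.3f}  (theory: {2 * sigma**2:.3f})")
for q in (1.0, 3.0, 6.0):
    emp = np.mean(z <= q)
    thy = 1 - np.exp(-q / (2 * sigma ** 2))
    print(f"P(Z <= {q}): empirical {emp:.4f}, theory {thy:.4f}")
```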

Method 2: Auxiliary Variables (not shown in Cooper & McGillem). Given inputs X, Y and output Z = g(X, Y), find f_Z(z). [Model of the actual problem: X, Y → g(x, y) → Z.] Method of auxiliary variables: create a new (auxiliary) variable, W = h(X, Y); usually just W = h(X, Y) = X. [Auxiliary-variable model for the same problem: X, Y feed g(x, y) → Z and h(x, y) → W (aux.).]

Method 2: Auxiliary Variables - 4-Step Approach - 1. Solve the output equations backwards for X and Y (in terms of Z and W). 2. Find the Jacobian of the transformation, J(x, y). 3. f_ZW(z, w) = f_XY(x, y)/|J(x, y)|, evaluated at x = x(z, w), y = y(z, w) (analogous to f_Y(y) = f_X(x)/|dy/dx| for a transformation of a single RV). 4. f_Z(z) = ∫ f_ZW(z, w) dw (getting rid of the auxiliary variable).

Method, Example : Let Z = X + Y; find f Z (z) X Y g(x, y) Z = X + Y h(x, y) W = X (aux.). Solve the output equations backwards for X and Y: Y = Z X = (in terms of Z, W) X = W 5

Method, Example, continued: (Z = X + Y; aux: W = X). Find the Jacobian of the transformation, J(x, y): J(x, y) = 3. f ZW (z, w) = d"new" det d"old" f (x, y) J(x, y) z,w dz dx det dw dx (w,z w) (w,z 4. f Z (z) = (**) f ZW (z,w)dw f f dz dy dw dy (w,z f 0 w)dw w) 6

Method, Example Going Further Special Case of Interest: Z = X + Y where X, Y independent Repeating the (**) result from p. 5: f Z (z) = f (w,z w)dw f (, z ) d If X and Y are independent, this becomes: f Z (z) = fx y ( )f Y (z )d 7

Example, continued. Summary: If Z = X + Y, and if X and Y are independent RVs, then f_Z(z) = f_X(z) * f_Y(z), a convolution. (This generalizes to the sum of an arbitrary number of independent RVs.) Specific example: If X and Y are both U(0, 1), and if X and Y are independent, find the pdf of Z = X + Y.

Specific Example, continued. Solving by graphical convolution: f_X(x) = 1 on 0 ≤ x ≤ 1; f_Y(y) = 1 on 0 ≤ y ≤ 1. (Convolution details will be reviewed during class.) The result, f_Z(z), is a triangle on 0 ≤ z ≤ 2, rising from 0 at z = 0 to a peak of 1 at z = 1 and falling back to 0 at z = 2. Check: area = 1.
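
The graphical result can be reproduced numerically by discretizing the two rectangular pdfs and convolving them (np.convolve approximates the convolution integral up to the grid step):

```python
import numpy as np

# Numerical convolution of two U(0, 1) pdfs: f_Z is the triangle on (0, 2)
# with peak f_Z(1) = 1 and unit area.
dx = 0.001
f_x = np.ones(int(1 / dx))            # U(0, 1) pdf sampled on a grid
f_z = np.convolve(f_x, f_x) * dx      # Riemann-sum approximation

for zq in (0.5, 1.0, 1.5):
    print(f"f_Z({zq}) ~ {f_z[int(round(zq / dx))]:.3f}")  # 0.5, 1.0, 0.5
print(f"area under f_Z: {f_z.sum() * dx:.3f}")            # ~1
```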

Other Things to Recall about Convolution. Important for discrete RVs: δ(x) * f(x) = f(x); δ(x − a) * f(x) = f(x − a). In-class application/example: Discrete RV X takes values 0 and 5 with equal probability; discrete RV Y takes values 1 and −1 with equal probability. Find the pdf of RV Z = X + Y if X and Y are independent.

Discrete RV X takes values 0 and 5 with equal probability; discrete RV Y takes values 1 and −1 with equal probability; find the pdf of RV Z = X + Y. Sketch of pdfs: f_X(x) = ½ δ(x) + ½ δ(x − 5); f_Y(y) = ½ δ(y + 1) + ½ δ(y − 1); convolving, f_Z(z) = ¼ δ(z + 1) + ¼ δ(z − 1) + ¼ δ(z − 4) + ¼ δ(z − 6).
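
The same four impulses fall out of a direct discrete convolution of the two pmfs; a short sketch:

```python
from itertools import product

# pmf of Z = X + Y for independent X in {0, 5}, Y in {-1, 1}, each value
# with probability 1/2: the convolution sum P(Z=z) = sum P(X=x) P(Y=z-x).
px = {0: 0.5, 5: 0.5}
py = {-1: 0.5, 1: 0.5}

pz = {}
for (xv, pxv), (yv, pyv) in product(px.items(), py.items()):
    pz[xv + yv] = pz.get(xv + yv, 0.0) + pxv * pyv

for zv in sorted(pz):
    print(f"P(Z = {zv:+d}) = {pz[zv]}")   # -1, +1, +4, +6, each 0.25
```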

Method, Example : Let Z = ; use aux. variable W = X; find f Z (z). Solve the output equations backwards for X and Y: X = W; Y = Z/W. Find the Jacobian of the transformation: dz dz d"new" dx dy J(x, y) = det det d"old" dw dw dx dy y x 0 x 3. f ZW (z, w) = f (x, y) J(x, y) z,w f (w,z / w w) 4. f Z (z) = 3