Random Signals and Systems. Chapter 3. Jitendra K Tugnait. Department of Electrical & Computer Engineering. Auburn University.


Two Random Variables
Previously, we only dealt with one random variable. For the next few lectures, we will study two random variables:
» How they are related to each other
» How we describe this relationship
This is an intermediate step toward random signals.

Example
A voltmeter reads a random voltage:
measured voltage = actual voltage + measurement error
The measurement error is a random variable, so the measured voltage is random as well.

Joint Probability Distribution
The joint probability distribution of two RVs X and Y is
F(x, y) = P(X ≤ x AND Y ≤ y)
Properties:
1) 0 ≤ F(x, y) ≤ 1
2) F(∞, ∞) = 1
3) F(−∞, y) = F(x, −∞) = F(−∞, −∞) = 0

Properties cont'd
4) F(x, y) is a nondecreasing function as either x or y, or both, increase
5) F(∞, y) = F_Y(y) and F(x, ∞) = F_X(x) are the marginal distributions, since
F(∞, y) = P(X ≤ ∞, Y ≤ y) = P(Y ≤ y) = F_Y(y)

Joint Density Function
The joint probability density function is
f(x, y) = ∂²F(x, y) / ∂x ∂y   (the order of differentiation is not important)
f(x, y) dx dy = P(x < X ≤ x + dx, y < Y ≤ y + dy)
Properties:
1) f(x, y) ≥ 0
2) ∫∫ f(x, y) dx dy = 1  (over the whole plane)
3) F(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f(u, v) dv du

Joint Density Function cont'd
4) f_X(x) = ∫ f(x, y) dy  and  f_Y(y) = ∫ f(x, y) dx   (the marginal densities)
5) P(x1 < X ≤ x2, y1 < Y ≤ y2) = ∫_{y1}^{y2} ∫_{x1}^{x2} f(x, y) dx dy

Example
f(x, y) = 1 / [(x2 − x1)(y2 − y1)]  for x1 ≤ x ≤ x2, y1 ≤ y ≤ y2
        = 0  elsewhere
The density is a flat box of height 1/[(x2 − x1)(y2 − y1)] over the rectangle [x1, x2] × [y1, y2].

Expected Values
Given two RVs X and Y, the expected value of g(X, Y) is
E[g(X, Y)] = ∫∫ g(x, y) f(x, y) dy dx
In particular,
E[XY] = ∫∫ x y f(x, y) dy dx
is called the correlation of X and Y.

Example
For the uniform density f(x, y) = 1/[(x2 − x1)(y2 − y1)] on x1 ≤ x ≤ x2, y1 ≤ y ≤ y2:
E[XY] = ∫_{x1}^{x2} ∫_{y1}^{y2} xy / [(x2 − x1)(y2 − y1)] dy dx
      = (x2² − x1²)(y2² − y1²) / [4 (x2 − x1)(y2 − y1)]
      = (x1 + x2)(y1 + y2) / 4

Example cont'd
f_X(x) = ∫_{y1}^{y2} f(x, y) dy = (y2 − y1) / [(x2 − x1)(y2 − y1)] = 1/(x2 − x1)  for x1 ≤ x ≤ x2
Similarly,
f_Y(y) = ∫_{x1}^{x2} f(x, y) dx = 1/(y2 − y1)  for y1 ≤ y ≤ y2
So X and Y are each uniformly distributed over their own intervals.
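As a quick numerical sketch (not part of the original slides), we can check the result E[XY] = (x1 + x2)(y1 + y2)/4 by Monte Carlo; the limits x1 = 1, x2 = 3, y1 = 2, y2 = 5 are arbitrary values chosen for illustration.

```python
import numpy as np

# Monte Carlo check of the uniform-rectangle example.
# Hypothetical limits chosen for illustration: x in [1, 3], y in [2, 5].
rng = np.random.default_rng(0)
x1, x2, y1, y2 = 1.0, 3.0, 2.0, 5.0
n = 200_000
x = rng.uniform(x1, x2, n)   # X ~ Uniform(x1, x2)
y = rng.uniform(y1, y2, n)   # Y ~ Uniform(y1, y2), independent of X

# E[XY] should match (x1 + x2)(y1 + y2)/4 = (4)(7)/4 = 7
e_xy = np.mean(x * y)
print(e_xy)   # close to 7
```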

Example
A rectangular semiconductor substrate has dimensions with mean values of 1 cm and 2 cm.
» The actual dimensions X and Y are independent and uniformly distributed within ±0.01 cm of their means.
a) Probability that both dimensions are larger than their mean values by 0.005 cm:
P(1.005 ≤ X, 2.005 ≤ Y) = ∫_{2.005}^{2.01} ∫_{1.005}^{1.01} (1/0.02)(1/0.02) dx dy = (0.005/0.02)² = 1/16

Example cont'd
b) Probability that the larger dimension is greater than its mean value by 0.005 cm and the smaller dimension is less than its mean value by 0.005 cm:
P(X ≤ 0.995, 2.005 ≤ Y) = ∫_{2.005}^{2.01} ∫_{0.99}^{0.995} (1/0.02)(1/0.02) dx dy = (0.005/0.02)² = 1/16

Example cont'd
c) The mean value of the area of the substrate:
E{A} = E{XY} = ∫_{0.99}^{1.01} ∫_{1.99}^{2.01} xy (1/0.02)(1/0.02) dy dx
     = [∫_{0.99}^{1.01} x/0.02 dx] [∫_{1.99}^{2.01} y/0.02 dy] = (1)(2) = 2 cm²
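The substrate example can be sanity-checked by simulation; this sketch uses the reconstructed means of 1 cm and 2 cm and the ±0.01 cm uniform spread stated above.

```python
import numpy as np

# Monte Carlo check of the substrate example:
# X ~ Uniform(0.99, 1.01) cm and Y ~ Uniform(1.99, 2.01) cm, independent.
rng = np.random.default_rng(1)
n = 1_000_000
x = rng.uniform(0.99, 1.01, n)
y = rng.uniform(1.99, 2.01, n)

p_a = np.mean((x > 1.005) & (y > 2.005))   # part (a): should be 1/16
mean_area = np.mean(x * y)                 # part (c): should be 2 cm^2
print(p_a, mean_area)
```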

Joint Density Function cont'd
Ex: Two RVs X and Y with
f(x, y) = A e^{−2x − 3y}  for x ≥ 0, y ≥ 0
        = 0  for x < 0 or y < 0
a) Value of A:
1 = ∫_0^∞ ∫_0^∞ A e^{−2x} e^{−3y} dx dy = A ∫_0^∞ (1/2) e^{−3y} dy = A/6
so A = 6.

Joint Density Function cont'd
b) P(X ≤ 1/2, Y ≤ 1/4) = ∫_0^{1/4} ∫_0^{1/2} 6 e^{−2x − 3y} dx dy = [1 − e^{−1}][1 − e^{−3/4}] ≈ 0.3335
c) The expected value of XY:
E[XY] = ∫_0^∞ ∫_0^∞ 6 xy e^{−2x − 3y} dy dx = 6 [∫_0^∞ x e^{−2x} dx][∫_0^∞ y e^{−3y} dy] = 6 (1/4)(1/9) = 1/6
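Because the density above factors as (2e^{−2x})(3e^{−3y}), X and Y are independent exponentials with rates 2 and 3; this sketch (not from the slides) uses that fact to verify parts (b) and (c) by Monte Carlo.

```python
import numpy as np

# f(x,y) = 6 e^{-2x-3y} = (2 e^{-2x})(3 e^{-3y}), so sample
# X ~ Exponential(rate 2) and Y ~ Exponential(rate 3) independently.
rng = np.random.default_rng(2)
n = 1_000_000
x = rng.exponential(scale=1 / 2, size=n)  # rate 2 -> scale 1/2
y = rng.exponential(scale=1 / 3, size=n)  # rate 3 -> scale 1/3

p_b = np.mean((x <= 0.5) & (y <= 0.25))   # ~ (1 - e^-1)(1 - e^-3/4) = 0.3335
e_xy = np.mean(x * y)                     # ~ 1/6
print(p_b, e_xy)
```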

Conditional Probability
Previously, we worked with conditional probability in terms of events:
P(A|B) = P(AB) / P(B)
We now define conditional probability in terms of random variables.
Example: What is the probability density of the actual voltage given that the measured voltage is 1.2 V?
measured voltage = actual voltage + measurement error

Conditional Probability
Consider two RVs X and Y with joint distribution F(x, y).
First, consider the probability that X ≤ x given that Y ≤ y:
F_X(x | Y ≤ y) = P(X ≤ x | Y ≤ y) = P(X ≤ x, Y ≤ y) / P(Y ≤ y) = F(x, y) / F_Y(y)
Now, suppose that y1 < Y ≤ y2 is given:
F_X(x | y1 < Y ≤ y2) = [F(x, y2) − F(x, y1)] / [F_Y(y2) − F_Y(y1)]
Now, suppose that Y = y1 is given (i.e., y2 = y1):
P(X ≤ x | Y = y1) = [F(x, y1) − F(x, y1)] / [F_Y(y1) − F_Y(y1)] = 0/0 ?

Conditional Probability cont'd
By taking the limit y2 → y1 (using something similar to l'Hôpital's rule), we get the conditional density
f_X(x | Y = y) = f(x, y) / f_Y(y)   (often written f(x|y))
Similarly,
f(y|x) = f(x, y) / f_X(x)
where f(x, y) is the joint density and f_X, f_Y are the marginals.

Conditional Probability cont'd
Bayes' Theorem:
f(y|x) = f(x|y) f_Y(y) / f_X(x)
Total Probability:
f_X(x) = ∫ f(x, y) dy = ∫ f(x|y) f_Y(y) dy
f_Y(y) = ∫ f(x, y) dx = ∫ f(y|x) f_X(x) dx

Conditional Probability Example
Ex:
f(x, y) = k(x + y)  for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1
        = 0  elsewhere
a) Value of k:
1 = ∫_0^1 ∫_0^1 k(x + y) dx dy = k(1/2 + 1/2)
so k = 1.

Conditional Probability Example
f(x, y) = x + y  for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1;  0 elsewhere
b) Probability that X > 1/2 given that Y = 1/4?
f_Y(y) = ∫_0^1 (x + y) dx = 1/2 + y,  so f_Y(1/4) = 3/4
f(x | y = 1/4) = f(x, 1/4) / f_Y(1/4) = (x + 1/4) / (3/4)
P(X > 1/2 | Y = 1/4) = ∫_{1/2}^1 (x + 1/4)/(3/4) dx = (4/3)[x²/2 + x/4]_{1/2}^{1} = (4/3)(3/4 − 1/4) = 2/3
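A numerical sketch (not from the slides) of part (b): integrate the conditional density f(x | Y = 1/4) = (x + 1/4)/(3/4) over (1/2, 1] on a fine grid and confirm the answer 2/3.

```python
import numpy as np

# Conditional density of X given Y = 1/4 for f(x,y) = x + y on [0,1]^2.
x = np.linspace(0, 1, 100_001)
f_cond = (x + 0.25) / 0.75        # f(x | Y = 1/4)
dx = x[1] - x[0]

mask = x >= 0.5
p = np.sum(f_cond[mask]) * dx     # Riemann sum of the density over (1/2, 1]
print(p)   # ~ 2/3
```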

Statistical Independence
Two RVs X and Y are statistically independent if and only if
f(x, y) = f_X(x) f_Y(y)
» the joint density is separable

Statistical Independence
If X and Y are independent, then
f(x|y) = f(x, y) / f_Y(y) = f_X(x) f_Y(y) / f_Y(y) = f_X(x)
» Knowledge of y tells us nothing about x
» Similarly, f(y|x) = f_Y(y)

Uncorrelated Random Variables
The random variables X and Y are uncorrelated if
» E{XY} = E{X} E{Y}

Example
Two random variables, X and Y, have the following joint density function:
f(x, y) = x + y  for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1;  0 elsewhere
» Are they independent?
Answer: NO! The density function is not separable: x + y cannot be written as f_X(x) f_Y(y).

Example
Two random variables, X and Y, have the following joint density function:
f(x, y) = 4xy  for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1;  0 elsewhere
» Are they independent?
Answer: YES! The density function is separable:
f_X(x) = 2x for 0 ≤ x ≤ 1,  f_Y(y) = 2y for 0 ≤ y ≤ 1,  and f(x, y) = f_X(x) f_Y(y)

Statistical Independence
If X and Y are independent, then
E{XY} = ∫∫ xy f(x, y) dx dy = ∫∫ xy f_X(x) f_Y(y) dx dy
      = ∫∫ [x f_X(x)][y f_Y(y)] dx dy
      = [∫ x f_X(x) dx][∫ y f_Y(y) dy]
      = E{X} E{Y}
» so X and Y are uncorrelated

Independent vs. Uncorrelated
Both uncorrelated and independent mean that the RVs are not related. They differ in how this lack of relationship is defined:
» Independent: density function, f(x, y) = f_X(x) f_Y(y)
» Uncorrelated: first and second order statistics, E{XY} = E{X} E{Y}
Independence implies uncorrelatedness, but uncorrelated RVs need not be independent.
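The distinction can be made concrete with a standard counterexample (not from the slides): take X uniform on (−1, 1) and Y = X², so Y is completely determined by X, yet the two are uncorrelated because E{XY} = E{X³} = 0 = E{X} E{Y}.

```python
import numpy as np

# Uncorrelated does NOT imply independent: X ~ Uniform(-1, 1), Y = X^2.
rng = np.random.default_rng(3)
x = rng.uniform(-1, 1, 1_000_000)
y = x ** 2                           # fully dependent on x

e_xy = np.mean(x * y)                # E{XY} = E{X^3} ~ 0
e_x_e_y = np.mean(x) * np.mean(y)    # E{X} E{Y} ~ 0
print(e_xy, e_x_e_y)                 # both near 0: uncorrelated, yet dependent
```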

Example
f(x, y) = k(xy + x + y + 1)  for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1
a) Are X and Y independent?
Answer: YES!
f(x, y) = k(xy + x + y + 1) = k(x + 1)(y + 1)
b) Find k:
1 = ∫_0^1 ∫_0^1 k(x + 1)(y + 1) dx dy = k [∫_0^1 (x + 1) dx][∫_0^1 (y + 1) dy] = k(3/2)(3/2)
so k = 4/9.

Example
f(x, y) = (4/9)(x + 1)(y + 1)  for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1
c) Find E{XY}:
f_X(x) = (2/3)(x + 1), so
E{X} = ∫_0^1 x (2/3)(x + 1) dx = (2/3)[x³/3 + x²/2]_0^1 = (2/3)(5/6) = 5/9
Similarly, E{Y} = 5/9.
Since X and Y are independent,
E{XY} = E{X} E{Y} = (5/9)(5/9) = 25/81

Correlation Between RVs
[Scatter plot: height in feet vs. shoe size]
[Scatter plot: vision 20/x vs. shoe size]
[Scatter plot: miles per gallon vs. car weight in lbs]

Correlation Between RVs cont'd
We need to quantify the relationship between RVs. The correlation E{XY} is one way, but:
» X and Y may have different mean values and scales
» This makes it difficult to tell much about how related they are
If we subtract the means, we get the covariance:
E{(X − X̄)(Y − Ȳ)}
» The covariance is not affected by the mean values, but we still have a problem with RVs of different scales (i.e., different σ)

Problem with Correlation
[Two scatter plots showing the same relationship between X and Y at different axis scales; the value of E{XY} differs greatly between the two plots even though the underlying relationship is the same]

Correlation Coefficient
If we subtract the means and divide by the standard deviations, we get the correlation coefficient:
ρ_XY = E{ [(X − X̄)/σ_X] [(Y − Ȳ)/σ_Y] } = (E{XY} − X̄ Ȳ) / (σ_X σ_Y)
» −1 ≤ ρ ≤ 1
» ρ > 0 means the RVs are positively correlated
» ρ < 0 means the RVs are negatively correlated
» ρ = 0 means the RVs are not correlated (uncorrelated)
» ρ = ±1 means the RVs are linearly related, e.g., Y = mX + b
  If ρ = 1, m > 0
  If ρ = −1, m < 0
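A short numerical sketch (not from the slides) of how ρ behaves on synthetic data, using NumPy's sample correlation coefficient; the slope, intercept, and noise level are arbitrary choices for illustration.

```python
import numpy as np

# Behaviour of the correlation coefficient rho on synthetic data.
rng = np.random.default_rng(4)
x = rng.normal(0, 1, 200_000)
noise = rng.normal(0, 1, 200_000)

rho_linear = np.corrcoef(x, 3 * x + 2)[0, 1]   # Y = mX + b with m > 0 -> rho = +1
rho_neg = np.corrcoef(x, -x + noise)[0, 1]     # negatively correlated (rho ~ -0.71)
rho_indep = np.corrcoef(x, noise)[0, 1]        # independent -> rho ~ 0
print(rho_linear, rho_neg, rho_indep)
```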

Correlation Coefficient cont'd
[The three earlier scatter plots (height in feet vs. shoe size, vision 20/x vs. shoe size, miles per gallon vs. car weight in lbs), now annotated with their correlation coefficients]

Example
f(x, y) = x + y  for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1;  0 elsewhere
X̄ = Ȳ = 7/12  (from the marginals f_X(x) = x + 1/2 and f_Y(y) = y + 1/2)
E{XY} = ∫_0^1 ∫_0^1 xy (x + y) dx dy = 1/3
E{XY} − X̄ Ȳ = 1/3 − (7/12)² = 48/144 − 49/144 = −1/144
Slight negative correlation.

Correlation Between RVs cont'd
Ex: Z = aX + bY, with a, b constants. What is σ_Z²?
E{Z} = E{aX + bY} = aX̄ + bȲ
σ_Z² = E{(Z − Z̄)²} = E{[a(X − X̄) + b(Y − Ȳ)]²}
     = a² E{(X − X̄)²} + 2ab E{(X − X̄)(Y − Ȳ)} + b² E{(Y − Ȳ)²}
     = a² σ_X² + 2ab ρ_XY σ_X σ_Y + b² σ_Y²
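The variance formula for Z = aX + bY can be checked directly on sampled data (a sketch, not from the slides; a = 2, b = 3 and the correlated-Gaussian construction are arbitrary choices).

```python
import numpy as np

# Monte Carlo check of sigma_Z^2 = a^2 sx^2 + 2ab rho sx sy + b^2 sy^2.
rng = np.random.default_rng(5)
n = 1_000_000
x = rng.normal(0, 1, n)
y = 0.5 * x + rng.normal(0, 1, n)    # y is correlated with x by construction

a, b = 2.0, 3.0
z = a * x + b * y

sx, sy = np.std(x), np.std(y)
rho = np.corrcoef(x, y)[0, 1]
predicted = a**2 * sx**2 + 2 * a * b * rho * sx * sy + b**2 * sy**2
print(np.var(z), predicted)   # the two agree
```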

Formulas to Remember
ρ_XY = E{ [(X − X̄)/σ_X] [(Y − Ȳ)/σ_Y] } = (E{XY} − X̄ Ȳ) / (σ_X σ_Y)
E{XY} = X̄ Ȳ + ρ_XY σ_X σ_Y
σ²_{aX+bY} = a² σ_X² + 2ab ρ_XY σ_X σ_Y + b² σ_Y²

Density Function of the Sum of Two RVs
Suppose we have RVs X and Y that are statistically independent, and let Z = X + Y. What is f_Z(z)?
First find the probability distribution function:
F_Z(z) = P(Z ≤ z) = P(X + Y ≤ z)
Integrate f(x, y) over the region below the line x = z − y:
F_Z(z) = ∫∫_{x+y ≤ z} f(x, y) dx dy
       = ∫_{−∞}^{∞} ∫_{−∞}^{z−y} f_X(x) f_Y(y) dx dy   (since X and Y are independent)
       = ∫_{−∞}^{∞} f_Y(y) [∫_{−∞}^{z−y} f_X(x) dx] dy

Density Function of the Sum of Two RVs cont'd
F_Z(z) = ∫_{−∞}^{∞} f_Y(y) [∫_{−∞}^{z−y} f_X(x) dx] dy
Differentiate both sides with respect to z using Leibniz's rule,
d/dt ∫_{g(t)}^{h(t)} A(t, s) ds = A(t, h(t)) h′(t) − A(t, g(t)) g′(t) + ∫_{g(t)}^{h(t)} ∂A(t, s)/∂t ds
to get
f_Z(z) = ∫_{−∞}^{∞} f_Y(y) f_X(z − y) dy

Density Function of the Sum of Two RVs cont'd
f_Z(z) = ∫_{−∞}^{∞} f_Y(λ) f_X(z − λ) dλ
The probability density function of Z = X + Y, where X and Y are independent, is the convolution of the density functions of X and Y:
f_Z(z) = f_X(z) * f_Y(z)
For Z = X − Y:
f_Z(z) = ∫_{−∞}^{∞} f_Y(λ) f_X(z + λ) dλ

Convolution Example
f_X(x) = 1 for 0 ≤ x ≤ 1;  f_Y(y) = e^{−y} for y ≥ 0
Flip and shift one density past the other:
f_Z(z) = ∫ f_Y(λ) f_X(z − λ) dλ

Convolution Example cont'd
f_X(z − λ) is a unit-height box on z − 1 ≤ λ ≤ z, and f_Y(λ) = e^{−λ} for λ ≥ 0.
Three cases:
» z < 0: the box and the exponential do not overlap
» 0 ≤ z ≤ 1: partial overlap, λ runs from 0 to z
» z > 1: the box overlaps fully, λ runs from z − 1 to z

Convolution Example cont'd
z < 0:  f_Z(z) = 0
0 ≤ z ≤ 1:  f_Z(z) = ∫_0^z e^{−λ} dλ = 1 − e^{−z}
z > 1:  f_Z(z) = ∫_{z−1}^{z} e^{−λ} dλ = e^{−(z−1)} − e^{−z} = (e − 1) e^{−z}
[Sketch: f_Z(z) rises from 0 to a peak of 1 − e^{−1} ≈ 0.63 at z = 1, then decays exponentially]
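The piecewise result can be checked by simulating Z = X + Y directly (a sketch, not from the slides): sample X uniform on (0, 1) and Y exponential with rate 1, then compare an empirical density estimate at a test point against 1 − e^{−z}.

```python
import numpy as np

# Monte Carlo check of the convolution example: Z = X + Y with
# X ~ Uniform(0,1), Y ~ Exponential(1), so f_Z(z) = 1 - e^{-z} on [0,1].
rng = np.random.default_rng(6)
n = 1_000_000
z = rng.uniform(0, 1, n) + rng.exponential(1.0, n)

# crude density estimate at t = 0.5 from a narrow window
t = 0.5
f_z_emp = np.mean((z > t - 0.01) & (z < t + 0.01)) / 0.02
print(f_z_emp)   # ~ 1 - e^{-0.5} = 0.3935
```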

Correlation
For Z = X − Y:
f_Z(z) = ∫_{−∞}^{∞} f_Y(λ) f_X(z + λ) dλ
This is the deterministic correlation of the two functions. It is similar to convolution, but:
» No flipping, just shifting
» Must shift the z + λ function
» When shifting, f_X(0) sits at λ = −z

Correlation Example
f_X(x) = e^{−x} for x ≥ 0;  f_Y(y) = 1 for 0 ≤ y ≤ 1
Shift f_X(x):
f_Z(z) = ∫ f_Y(λ) f_X(z + λ) dλ,  where f_X(z + λ) = e^{−(z+λ)} for λ ≥ −z

Correlation Example cont'd
f_Y(λ) is a unit-height box on 0 ≤ λ ≤ 1, and f_X(z + λ) = e^{−(z+λ)} starts at λ = −z.
Three cases:
» z ≥ 0: the exponential covers the whole box
» −1 ≤ z < 0: partial overlap, λ runs from −z to 1
» z < −1: no overlap

Correlation Example cont'd
z ≥ 0:  f_Z(z) = ∫_0^1 e^{−(z+λ)} dλ = e^{−z} − e^{−(z+1)} = e^{−z}(1 − e^{−1})
−1 ≤ z < 0:  f_Z(z) = ∫_{−z}^{1} e^{−(z+λ)} dλ = 1 − e^{−(z+1)}
z < −1:  f_Z(z) = 0

Correlation Example cont'd
f_Z(z) = ∫ f_Y(λ) f_X(z + λ) dλ
[Sketch: f_Z(z) is zero for z < −1, rises to a peak of 1 − e^{−1} ≈ 0.63 at z = 0, then decays exponentially for z > 0]
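The difference-of-RVs result can likewise be checked by simulation (a sketch, not from the slides): sample Z = X − Y with X exponential (rate 1) and Y uniform on (0, 1), and confirm the support z ≥ −1 and the peak value near 1 − e^{−1}.

```python
import numpy as np

# Monte Carlo check of the correlation example: Z = X - Y with
# X ~ Exponential(1), Y ~ Uniform(0,1); f_Z peaks at z = 0 with 1 - e^{-1}.
rng = np.random.default_rng(7)
n = 1_000_000
z = rng.exponential(1.0, n) - rng.uniform(0, 1, n)

print(np.mean(z < -1))                            # ~ 0: support is z >= -1
f_at_0 = np.mean((z > -0.01) & (z < 0.01)) / 0.02  # density estimate near z = 0
print(f_at_0)                                     # ~ 1 - e^{-1} = 0.632
```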

Two Important Consequences
1) CENTRAL LIMIT THEOREM (CLT)
The pdf of a sum of n independent random variables approaches a Gaussian density as n becomes large, regardless of the pdfs of the individual RVs. (Many RVs encountered in real life satisfy this.)
[Sketch: repeated convolution of a non-Gaussian pdf converging toward the Gaussian bell curve]
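The CLT can be illustrated numerically (a sketch, not from the slides): standardize the sum of n = 30 i.i.d. Uniform(0, 1) variables, which are far from Gaussian individually, and check that the result behaves like a standard Gaussian.

```python
import numpy as np

# CLT sketch: standardized sum of n i.i.d. Uniform(0,1) variables.
rng = np.random.default_rng(8)
n, trials = 30, 200_000
s = rng.uniform(0, 1, (trials, n)).sum(axis=1)

# each U(0,1) term has mean 1/2 and variance 1/12
z = (s - n / 2) / np.sqrt(n / 12)
print(np.mean(z), np.var(z))        # ~ 0 and ~ 1
print(np.mean(np.abs(z) < 1.96))    # ~ 0.95, as for a standard Gaussian
```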

Two Important Consequences
2) If X and Y are independent Gaussian RVs, then Z = X + Y is also Gaussian, with
Z̄ = X̄ + Ȳ
σ_Z² = σ_X² + σ_Y²