ENGG2430A Homework 2

ENGG2430A Homework 2. Due on Feb 9th.

1. Independence vs correlation

(a) For each of the following cases, compute the marginal pmfs from the joint pmfs. Explain whether the random variables X and Y are independent, and whether they are uncorrelated.

i. (X, Y) is uniformly distributed over {(1, 1), (1, −1), (−1, 1), (−1, −1)}.

Solution: The joint pmf of X and Y is P_XY(x, y) = 1/4 for x, y ∈ {−1, 1}. We can obtain the pmfs of X and Y by marginalization as follows:

P_X(x) = Σ_{y ∈ {−1, 1}} P_XY(x, y) = 1/2 for all x ∈ {−1, 1},
P_Y(y) = Σ_{x ∈ {−1, 1}} P_XY(x, y) = 1/2 for all y ∈ {−1, 1}.

It follows that P_XY(x, y) = P_X(x) P_Y(y), and so X and Y are independent. This also implies uncorrelatedness, as proved in the lecture.

ii. (X, Y) is uniformly distributed over {(0, 1), (0, −1), (1, 0), (−1, 0)}.

Solution: The marginal pmfs of X and Y are

P_X(α) = P_Y(α) = 1/2 for α = 0, and 1/4 for α ∈ {−1, 1}.

X and Y are not independent because P_Y(0) = 1/2 but P_{Y|X}(0 | 1) = 1 ≠ P_Y(0). However, they are uncorrelated because

E[XY] = (1/4)(0·1 + 0·(−1) + 1·0 + (−1)·0) = 0,
E[X] = (1/4)(0 + 0 + 1 + (−1)) = 0,
E[Y] = (1/4)(1 + (−1) + 0 + 0) = 0,

and so Cov(X, Y) = E[XY] − E[X]E[Y] = 0.
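As a quick numerical cross-check of part (a) (an added sketch, not part of the original solution; the helper name `analyze` is mine), the following Python snippet recomputes the marginals, tests the factorization condition for independence at every grid point, and computes the covariance for both joint pmfs:

```python
import numpy as np

def analyze(support):
    """Uniform joint pmf on `support`; print marginals, independence, covariance."""
    pts = list(support)
    joint = {pt: 1.0 / len(pts) for pt in pts}
    xs = sorted({x for x, _ in pts})
    ys = sorted({y for _, y in pts})
    pX = {x: sum(q for (a, _), q in joint.items() if a == x) for x in xs}
    pY = {y: sum(q for (_, b), q in joint.items() if b == y) for y in ys}
    # Independence requires the joint pmf to factor at EVERY point of the grid.
    indep = all(np.isclose(joint.get((x, y), 0.0), pX[x] * pY[y]) for x in xs for y in ys)
    EX = sum(x * q for (x, _), q in joint.items())
    EY = sum(y * q for (_, y), q in joint.items())
    EXY = sum(x * y * q for (x, y), q in joint.items())
    print("pX =", pX, " pY =", pY)
    print("independent:", indep, " Cov(X,Y) =", EXY - EX * EY)

analyze({(1, 1), (1, -1), (-1, 1), (-1, -1)})  # (a)(i): independent, Cov = 0
analyze({(0, 1), (0, -1), (1, 0), (-1, 0)})    # (a)(ii): dependent, Cov = 0
```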

(b) For each of the following cases, compute the marginal pdfs from the joint pdfs. Explain whether X and Y are independent, and whether they are uncorrelated.

i. (X, Y) is uniformly distributed over the unit disk C := {(x, y) : x² + y² ≤ 1}.

Solution: The joint pdf is f_XY(x, y) = 1/π for (x, y) ∈ C. The marginal pdfs are

f_X(x) = ∫ f_XY(x, y) dy = ∫_{−√(1−x²)}^{√(1−x²)} (1/π) dy = (2/π)√(1 − x²) for x ∈ [−1, 1],

and similarly f_Y(y) = (2/π)√(1 − y²) for y ∈ [−1, 1]. X and Y are not independent: the product f_X(x) f_Y(y) is clearly not uniform over C, and therefore not equal to the joint pdf. However, X and Y are uncorrelated:

E[XY] = ∫∫_C xy f_XY(x, y) dx dy = (1/π) ∫∫_C xy dx dy = 0,

which can be seen easily by symmetry. Similarly, E[X] = E[Y] = 0, and so Cov(X, Y) = 0 as desired.
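A small Monte Carlo sanity check of the disk case (my addition; rejection sampling from the square is just one convenient way to draw uniform points on C): the empirical covariance is near zero, while the conditional spread of Y visibly depends on X, confirming dependence:

```python
import numpy as np

rng = np.random.default_rng(0)

# Rejection sampling: keep uniform points of [-1, 1]^2 that land in the disk.
pts = rng.uniform(-1, 1, size=(200_000, 2))
pts = pts[(pts ** 2).sum(axis=1) <= 1.0]
x, y = pts[:, 0], pts[:, 1]

print("Cov(X, Y) ≈", np.cov(x, y)[0, 1])             # ≈ 0: uncorrelated
# Dependence: the conditional variance of Y shrinks as |X| grows.
print("Var(Y | |X| < 0.2) ≈", y[np.abs(x) < 0.2].var())
print("Var(Y | |X| > 0.9) ≈", y[np.abs(x) > 0.9].var())
```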

ii. The joint pdf is f_XY(x, y) = (1/(2π)) exp(−(x² + y²)/2), i.e., X and Y are jointly Gaussian.

Solution: Notice that the joint pdf can be factored as f(x)f(y), with f(α) := (1/√(2π)) exp(−α²/2), which is the standard Gaussian pdf. Therefore, X and Y are independent with marginal distributions f_X(α) = f_Y(α) = f(α).

Alternatively, suppose we do not know that f is a valid pdf that integrates to 1. We can still derive the same result by marginalization: c = 1 because

f_X(x) = ∫ (1/(2π)) exp(−(x² + y²)/2) dy = f(x) ∫ f(y) dy = c·f(x), where c := ∫ f(y) dy,
f_Y(y) = f(y) ∫ f(x) dx = c·f(y),
f_X(x) f_Y(y) = c²·f(x) f(y) = c²·f_XY(x, y),
1 = ∫∫ f_XY(x, y) dx dy = (∫ f(x) dx)(∫ f(y) dy) = c².

The integration for the first equality in the last line (∫∫ f_XY(x, y) dx dy = 1) can be performed by changing to polar coordinates, i.e.,

∫∫ (1/(2π)) exp(−(x² + y²)/2) dx dy = ∫_0^∞ ∫_0^{2π} (1/(2π)) exp(−r²/2) r dθ dr
= ∫_0^∞ r exp(−r²/2) dr = ∫_0^∞ exp(−u) du = 1, substituting u = r²/2.
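For a numerical cross-check of that integral (an added sketch, not from the original; the grid size and the [−8, 8] truncation are arbitrary choices), a simple midpoint-rule sum approximates ∫∫ f_XY dx dy:

```python
import numpy as np

# Midpoint-rule approximation of ∫∫ (1/(2π)) exp(-(x²+y²)/2) dx dy on [-8, 8]²;
# the tails beyond |x|, |y| > 8 contribute negligibly.
n, L = 2000, 8.0
edges = np.linspace(-L, L, n + 1)
mid = (edges[:-1] + edges[1:]) / 2
dx = edges[1] - edges[0]
X, Y = np.meshgrid(mid, mid)
f_joint = np.exp(-(X**2 + Y**2) / 2) / (2 * np.pi)
print("∫∫ f_XY dx dy ≈", f_joint.sum() * dx * dx)   # ≈ 1.0, confirming c² = 1
```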

Page. Sequence of random variables a Prove that i. VarX E[X] VarX Solution: Let Z X E[X]. It follows that E[Z] E[X] E[X] by the linearity of expectation. Furthermore, which is the desired result. ii. VarX + Y VarX + CovX, Y + VarY VarZ E[X E[X] ] VarX Solution: Assume for simplicity that X and Y have zero mean, since the mean does not affect the variance by i. By definition, VarX + Y E[X + Y ] E[X ] + E[XY] + E[Y ] VarX + CovX, Y + VarY where the second equality is again by the linearity of expectation. b Let X i for i {,..., n} be continuous random variables that are identically distributed. Suppose every pair of distinct random variables X i and X j have the same correlation. Compute i. the correlation of X X and X 3 X Solution: Let σ and ρ be the variance of X i and correlation of X i and X j respectively for i j. For simplicity, we can assume the variables have zero mean, since the value of the mean does not affect the correlation, similar to a i. Then, ρ E[X ix j ] σ, for i j. Similarly, E[X X X 3 X ] E[X X 3 ] E[X X ] E[X X 3 ] + E[X X ] ρσ σ ρσ + ρσ ρσ σ E[X X ] E[X ] E[X X ] + E[X ] σ ρσ + σ ρσ By symmetry, E[X 3 X ] ρσ, and so the desired correlation is, E[X X X 3 X ] ρ σ E[X X ]E[X 3 X ] ρσ. ii. the correlation of X X and X 3 X

ii. the correlation of X_2 − X_1 and X_4 − X_3.

Solution: Similar to (b)(i), we assume the X_i's have zero mean. Then

E[(X_2 − X_1)(X_4 − X_3)] = E[X_2 X_4] − E[X_2 X_3] − E[X_1 X_4] + E[X_1 X_3] = ρσ² − ρσ² − ρσ² + ρσ² = 0.

Thus, the desired correlation is 0.

(c) Explain why the answers to (b)(i) and (b)(ii) are the same/different.

Solution: The answers differ because, in (b)(i), the differences X_2 − X_1 and X_3 − X_2 share the same random variable X_2, whereas the differences in (b)(ii) share no variable. The correlation in (b)(i) is negative because, when X_2 is large, the first difference X_2 − X_1 tends to be large while the second difference X_3 − X_2 tends to be small.
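A quick simulation (my addition; i.i.d. standard normals give the ρ = 0 instance of the equicorrelated setup, which already exhibits both answers) confirms the two correlations and the intuition in (c):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal(size=(500_000, 4))   # i.i.d. columns: the rho = 0 case

d21 = X[:, 1] - X[:, 0]   # X2 - X1
d32 = X[:, 2] - X[:, 1]   # X3 - X2 (shares X2 with d21)
d43 = X[:, 3] - X[:, 2]   # X4 - X3 (shares nothing with d21)

print("corr(X2-X1, X3-X2) ≈", np.corrcoef(d21, d32)[0, 1])   # ≈ -0.5
print("corr(X2-X1, X4-X3) ≈", np.corrcoef(d21, d43)[0, 1])   # ≈ 0
# Intuition for (c): conditioned on X2 being large, d21 is large but d32 is small.
big = X[:, 1] > 1.5
print("E[d21 | X2 large] ≈", d21[big].mean(), " E[d32 | X2 large] ≈", d32[big].mean())
```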