ENGG3A Homework, due on Feb 9th.

1. Independence vs correlation

(a) For each of the following cases, compute the marginal pmfs from the joint pmfs. Explain whether the random variables $X$ and $Y$ are independent, and whether they are uncorrelated.

i. $(X, Y)$ is uniformly distributed over $\{(0,0), (0,1), (1,0), (1,1)\}$.

Solution: The joint pmf of $X$ and $Y$ is $P_{XY}(x, y) = \frac{1}{4}$ for $x, y \in \{0, 1\}$. We can obtain the pmfs of $X$ and $Y$ by marginalization as follows:
$$P_X(x) = \sum_{y \in \{0,1\}} P_{XY}(x, y) = \frac{1}{2} \quad \text{for all } x \in \{0, 1\},$$
$$P_Y(y) = \sum_{x \in \{0,1\}} P_{XY}(x, y) = \frac{1}{2} \quad \text{for all } y \in \{0, 1\}.$$
It follows that $P_{XY}(x, y) = P_X(x) P_Y(y)$, and so $X$ and $Y$ are independent. This also implies uncorrelatedness, as proved in the lecture.

ii. $(X, Y)$ is uniformly distributed over $\{(1,0), (0,1), (-1,0), (0,-1)\}$.

Solution: The joint pmf is $P_{XY}(x, y) = \frac{1}{4}$ at each of the four points. The marginal pmfs of $X$ and $Y$ are
$$P_X(\alpha) = P_Y(\alpha) = \begin{cases} \frac{1}{2} & \alpha = 0, \\ \frac{1}{4} & \alpha \in \{-1, 1\}. \end{cases}$$
$X$ and $Y$ are not independent because $P_Y(0) = \frac{1}{2}$ but $P_{Y|X}(0 \mid 0) = 0 \neq P_Y(0)$. However, they are uncorrelated because
$$E[XY] = \tfrac{1}{4}(1 \cdot 0) + \tfrac{1}{4}(0 \cdot 1) + \tfrac{1}{4}(-1 \cdot 0) + \tfrac{1}{4}(0 \cdot (-1)) = 0,$$
$$E[X] = \tfrac{1}{4}(1) + \tfrac{1}{2}(0) + \tfrac{1}{4}(-1) = 0, \qquad E[Y] = 0 \text{ similarly},$$
and so $\mathrm{Cov}(X, Y) = E[XY] - E[X]E[Y] = 0$.

(b) For each of the following cases, compute the marginal pdfs from the joint pdfs. Explain whether $X$ and $Y$ are independent, and whether they are uncorrelated.

i. $(X, Y)$ is uniformly distributed over the unit disk $\{(x, y) : x^2 + y^2 \le 1\}$.
Solution: $(X, Y)$ is uniformly distributed over the unit disk $C := \{(x, y) : x^2 + y^2 \le 1\}$, so the joint pdf is $f_{XY}(x, y) = \frac{1}{\pi}$ for $(x, y) \in C$. The marginal pdfs are
$$f_X(x) = \int f_{XY}(x, y)\, dy = \int_{-\sqrt{1-x^2}}^{\sqrt{1-x^2}} \frac{1}{\pi}\, dy = \frac{2\sqrt{1-x^2}}{\pi} \quad \text{for } x \in [-1, 1],$$
and similarly $f_Y(y) = \frac{2\sqrt{1-y^2}}{\pi}$ for $y \in [-1, 1]$. $X$ and $Y$ are not independent: the product $f_X(x) f_Y(y)$ is clearly not uniform, and therefore not equal to the joint pdf. However, $X$ and $Y$ are uncorrelated, as follows:
$$E[XY] = \iint xy\, f_{XY}(x, y)\, dx\, dy = \iint_C \frac{xy}{\pi}\, dx\, dy = 0,$$
which can be seen easily by symmetry (the integrand is odd in $x$). Similarly, $E[X] = E[Y] = 0$, and so $\mathrm{Cov}(X, Y) = 0$ as desired.

ii. The joint pdf is $f_{X,Y}(x, y) = \frac{1}{2\pi} \exp\!\left(-\frac{x^2 + y^2}{2}\right)$ (jointly Gaussian).

Solution: Notice that the joint pdf can be factored as $f(x) f(y)$ with $f(\alpha) := \frac{1}{\sqrt{2\pi}} \exp\!\left(-\frac{\alpha^2}{2}\right)$, which is the Gaussian distribution. Therefore, $X$ and $Y$ are independent with marginal distribution $f_X(\alpha) = f_Y(\alpha) = f(\alpha)$.

Alternatively, suppose we don't know that $f$ is a valid pdf that integrates to $1$. We can still derive the same result by marginalization. Let $c := \int f(\alpha)\, d\alpha$. Then
$$f_X(x) = \int \frac{1}{2\pi} \exp\!\left(-\frac{x^2 + y^2}{2}\right) dy = f(x) \underbrace{\int f(y)\, dy}_{= c} = c\, f(x),$$
and similarly $f_Y(y) = f(y) \int f(x)\, dx = c\, f(y)$, so that
$$f_X(x) f_Y(y) = c^2 f(x) f(y) = c^2 f_{X,Y}(x, y).$$
Moreover,
$$1 = \iint f_{XY}(x, y)\, dx\, dy = \left(\int f(x)\, dx\right)\left(\int f(y)\, dy\right) = c^2,$$
so $c = 1$ and hence $f_X(x) f_Y(y) = f_{X,Y}(x, y)$: $X$ and $Y$ are independent. The integration for the first equality can be performed by changing the coordinate system
to polar coordinates, i.e.,
$$\iint f_{XY}(x, y)\, dx\, dy = \iint \frac{1}{2\pi} \exp\!\left(-\frac{x^2 + y^2}{2}\right) dx\, dy = \frac{1}{2\pi} \int_0^\infty \int_0^{2\pi} \exp\!\left(-\frac{r^2}{2}\right) r\, d\theta\, dr = \int_0^\infty r \exp\!\left(-\frac{r^2}{2}\right) dr = \left[-\exp\!\left(-\frac{r^2}{2}\right)\right]_0^\infty = 1.$$

(c) Explain why uncorrelatedness does not imply independence.

Solution: From (a)(ii) and (b)(i), we see that uncorrelatedness does not imply independence. Intuitively, the requirement for independence is more stringent because there are many equations involved: e.g., in the discrete case, independence requires $P_{XY}(x, y) = P_X(x) P_Y(y)$ for all possible $x \in \mathcal{X}$ and $y \in \mathcal{Y}$. However, the requirement for uncorrelatedness consists of only one equation, $E[XY] = E[X]E[Y]$, and so does not cover the entire set of equations demanded for independence.
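The discrete example in (a)(ii) and the radial integral from the polar-coordinate computation both lend themselves to a quick numerical check. The sketch below is a minimal illustration in plain Python; the variable names and the midpoint-rule discretization (truncated at $r = 10$) are my own choices, not part of the original solution.

```python
import math

# Check of (a)(ii): uniform distribution on {(1,0),(0,1),(-1,0),(0,-1)}.
support = [(1, 0), (0, 1), (-1, 0), (0, -1)]
p = 1.0 / len(support)

px, py = {}, {}
for x, y in support:
    px[x] = px.get(x, 0.0) + p   # marginal pmf of X
    py[y] = py.get(y, 0.0) + p   # marginal pmf of Y

# Independence would require P_XY(x,y) = P_X(x)P_Y(y) for every pair;
# e.g. the point (1,1) has joint probability 0 but a positive product.
independent = all(
    abs((p if (x, y) in support else 0.0) - px[x] * py[y]) < 1e-12
    for x in px for y in py
)

ex = sum(x * q for x, q in px.items())
ey = sum(y * q for y, q in py.items())
exy = sum(x * y * p for x, y in support)
cov = exy - ex * ey

print(independent, cov)  # False 0.0 : dependent but uncorrelated

# Check of the polar-coordinate step: int_0^inf r e^{-r^2/2} dr = 1,
# approximated by the midpoint rule on [0, 10] (the tail beyond 10 is
# negligible, of order e^{-50}).
n = 100000
h = 10.0 / n
radial = 0.0
for i in range(n):
    r = (i + 0.5) * h            # midpoint of the i-th subinterval
    radial += r * math.exp(-r * r / 2) * h

print(abs(radial - 1) < 1e-6)    # True
```

The independence test deliberately scans the full product of the marginal supports, since a single factorization failure is enough to rule out independence, while uncorrelatedness needs only the one covariance computation.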
Page. Sequence of random variables a Prove that i. VarX E[X] VarX Solution: Let Z X E[X]. It follows that E[Z] E[X] E[X] by the linearity of expectation. Furthermore, which is the desired result. ii. VarX + Y VarX + CovX, Y + VarY VarZ E[X E[X] ] VarX Solution: Assume for simplicity that X and Y have zero mean, since the mean does not affect the variance by i. By definition, VarX + Y E[X + Y ] E[X ] + E[XY] + E[Y ] VarX + CovX, Y + VarY where the second equality is again by the linearity of expectation. b Let X i for i {,..., n} be continuous random variables that are identically distributed. Suppose every pair of distinct random variables X i and X j have the same correlation. Compute i. the correlation of X X and X 3 X Solution: Let σ and ρ be the variance of X i and correlation of X i and X j respectively for i j. For simplicity, we can assume the variables have zero mean, since the value of the mean does not affect the correlation, similar to a i. Then, ρ E[X ix j ] σ, for i j. Similarly, E[X X X 3 X ] E[X X 3 ] E[X X ] E[X X 3 ] + E[X X ] ρσ σ ρσ + ρσ ρσ σ E[X X ] E[X ] E[X X ] + E[X ] σ ρσ + σ ρσ By symmetry, E[X 3 X ] ρσ, and so the desired correlation is, E[X X X 3 X ] ρ σ E[X X ]E[X 3 X ] ρσ. ii. the correlation of X X and X 3 X
Solution: Similar to (b)(i), we assume the $X_i$'s have zero mean. Then
$$E[(X_1 - X_2)(X_3 - X_4)] = E[X_1 X_3] - E[X_1 X_4] - E[X_2 X_3] + E[X_2 X_4] = \rho\sigma^2 - \rho\sigma^2 - \rho\sigma^2 + \rho\sigma^2 = 0.$$
Thus, the desired correlation is $0$.

(c) Explain why the answers to (b)(i) and (b)(ii) are the same/different.

Solution: The difference between (b)(i) and (b)(ii) arises because, in (b)(i), the differences $X_1 - X_2$ and $X_3 - X_1$ share the same random variable $X_1$, but the differences in (b)(ii) do not. The correlation in (b)(i) is negative because, when $X_1$ is large, the first difference $X_1 - X_2$ tends to be large but the second difference $X_3 - X_1$ tends to be small.
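The pairwise-covariance bookkeeping in (b) can also be mechanized as a sanity check: represent each difference as a coefficient vector over $(X_1, \dots, X_4)$ and expand the covariance bilinearly under the equicorrelated model $\mathrm{Var}(X_i) = \sigma^2$, $\mathrm{Cov}(X_i, X_j) = \rho\sigma^2$ for $i \neq j$. This is a sketch under those assumptions; the helper names `cov` and `corr` are mine.

```python
def cov(u, v, rho, s2=1.0):
    """Cov(sum_i u[i] X_i, sum_j v[j] X_j) when Var(X_i) = s2 and
    Cov(X_i, X_j) = rho * s2 for i != j (the equicorrelated model)."""
    return sum(a * b * (s2 if i == j else rho * s2)
               for i, a in enumerate(u) for j, b in enumerate(v))

def corr(u, v, rho):
    """Correlation coefficient of the two linear combinations."""
    return cov(u, v, rho) / (cov(u, u, rho) * cov(v, v, rho)) ** 0.5

d12 = [1, -1, 0, 0]    # coefficients of X1 - X2
d31 = [-1, 0, 1, 0]    # coefficients of X3 - X1
d34 = [0, 0, 1, -1]    # coefficients of X3 - X4

for rho in (0.0, 0.25, 0.5):
    # (b)(i): the shared X1 forces correlation -1/2, for every rho < 1;
    # (b)(ii): disjoint differences are uncorrelated.
    print(round(corr(d12, d31, rho), 9), round(corr(d12, d34, rho), 9))
    # -0.5 0.0  on each line
```

Trying several values of $\rho$ makes the point of part (c) concrete: the $-\tfrac{1}{2}$ and $0$ answers do not depend on $\rho$ at all, only on whether the two differences share a variable.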