6.041/6.431: Probabilistic Systems Analysis (Fall) Recitation Solutions, October 9

1. Assume $a > 0$. Then

\[
\rho(aX+b,\,Y) = \frac{\operatorname{cov}(aX+b,\,Y)}{\sqrt{\operatorname{var}(aX+b)\operatorname{var}(Y)}}
= \frac{E\big[(aX+b-E[aX+b])(Y-E[Y])\big]}{\sqrt{a^2\operatorname{var}(X)\operatorname{var}(Y)}}
\]
\[
= \frac{E\big[(aX+b-aE[X]-b)(Y-E[Y])\big]}{a\sqrt{\operatorname{var}(X)\operatorname{var}(Y)}}
= \frac{a\,E\big[(X-E[X])(Y-E[Y])\big]}{a\sqrt{\operatorname{var}(X)\operatorname{var}(Y)}}
= \frac{\operatorname{cov}(X,Y)}{\sqrt{\operatorname{var}(X)\operatorname{var}(Y)}}
= \rho(X,Y).
\]

As an example where this property of the correlation coefficient is relevant, consider the homework and exam scores of students in a class. We expect the homework and exam scores to be positively correlated and thus to have a positive correlation coefficient. In this example, the property above means that the correlation coefficient will not change whether the exam is out of 5 points or any other positive number of points.

2. (a) Let $Z = X - Y$, where $X$ and $Y$ are independent exponential random variables with parameter $\lambda$. When $z \ge 0$:

\[
F_Z(z) = P(X - Y \le z) = P(X \le Y + z)
= \int_0^\infty \int_0^{y+z} f_{X,Y}(x,y)\,dx\,dy
= \int_0^\infty \lambda e^{-\lambda y} \int_0^{y+z} \lambda e^{-\lambda x}\,dx\,dy
\]
\[
= \int_0^\infty \lambda e^{-\lambda y}\big(1 - e^{-\lambda(y+z)}\big)\,dy
= 1 - e^{-\lambda z}\int_0^\infty \lambda e^{-2\lambda y}\,dy
= 1 - \tfrac{1}{2}e^{-\lambda z}.
\]
When $z < 0$, we must have $x \ge 0$ and $y \ge x - z$, so integrating over $x$ first:

\[
F_Z(z) = P(X \le Y + z)
= \int_0^\infty \lambda e^{-\lambda x} \int_{x-z}^\infty \lambda e^{-\lambda y}\,dy\,dx
= \int_0^\infty \lambda e^{-\lambda x}\, e^{-\lambda(x-z)}\,dx
= e^{\lambda z}\int_0^\infty \lambda e^{-2\lambda x}\,dx
= \tfrac{1}{2}e^{\lambda z}.
\]

Hence

\[
f_Z(z) = \frac{d}{dz}F_Z(z) =
\begin{cases}
\tfrac{\lambda}{2}\, e^{-\lambda z}, & z \ge 0\\[2pt]
\tfrac{\lambda}{2}\, e^{\lambda z}, & z < 0
\end{cases}
\;=\; \frac{\lambda}{2}\, e^{-\lambda |z|}.
\]

(b) Solving using the total probability theorem, we have:

\[
f_Z(z) = \int_{-\infty}^{\infty} f_X(x)\, f_{Z\mid X}(z \mid x)\,dx
= \int_{-\infty}^{\infty} f_X(x)\, f_{Y\mid X}(x - z \mid x)\,dx
= \int_0^\infty f_X(x)\, f_Y(x-z)\,dx,
\]

since given $X = x$ we have $Z = x - Y$, and $Y$ is independent of $X$. First, when $z < 0$ (so that $x - z > 0$ for all $x \ge 0$), we have:

\[
\int_0^\infty f_X(x)\, f_Y(x-z)\,dx
= \int_0^\infty \lambda e^{-\lambda x}\,\lambda e^{-\lambda(x-z)}\,dx
= \lambda e^{\lambda z}\int_0^\infty \lambda e^{-2\lambda x}\,dx
= \frac{\lambda}{2}\, e^{\lambda z}.
\]
Then, when $z \ge 0$ (so that $x - z \ge 0$ requires $x \ge z$), we have:

\[
\int_z^\infty f_X(x)\, f_Y(x-z)\,dx
= \int_z^\infty \lambda e^{-\lambda x}\,\lambda e^{-\lambda(x-z)}\,dx
= \lambda e^{\lambda z}\int_z^\infty \lambda e^{-2\lambda x}\,dx
= \lambda e^{\lambda z}\cdot\tfrac{1}{2}e^{-2\lambda z}
= \frac{\lambda}{2}\, e^{-\lambda z}.
\]

So again

\[
f_Z(z) = \frac{\lambda}{2}\, e^{-\lambda|z|}.
\]

3. (a) We have $X = R\cos\Theta$ and $Y = R\sin\Theta$, where $X$ and $Y$ are independent standard normal random variables. Recall that in polar coordinates the differential area is $dA = dx\,dy = r\,dr\,d\theta$. So, for $r \ge 0$,

\[
F_R(r) = P(R \le r)
= \int_0^r \int_0^{2\pi} f_X(r'\cos\theta)\, f_Y(r'\sin\theta)\, r'\,d\theta\,dr'
= \int_0^r \int_0^{2\pi} \frac{1}{2\pi}\, e^{-(r')^2/2}\, r'\,d\theta\,dr'
\]
\[
= \int_0^r r'\, e^{-(r')^2/2}\,dr'
= \int_0^{r^2/2} e^{-u}\,du \qquad \big(u = (r')^2/2\big)
= 1 - e^{-r^2/2}.
\]

Hence

\[
F_R(r) = \begin{cases} 1 - e^{-r^2/2}, & r \ge 0\\ 0, & r < 0 \end{cases}
\qquad\quad
f_R(r) = \frac{d}{dr}F_R(r) = r\, e^{-r^2/2}, \quad r \ge 0.
\]
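The two densities derived so far, the Laplace density of $Z = X - Y$ from problem 2 and the Rayleigh density of $R$ from part (a), can be sanity-checked by simulation. The following is a minimal sketch using only Python's standard library; the rate $\lambda = 1.5$, the sample size, the seed, and the test points are all arbitrary choices:

```python
import math
import random

random.seed(0)
n = 200_000
lam = 1.5  # arbitrary rate for problem 2's exponentials

# Problem 2: Z = X - Y for independent exponential(lam) X and Y.
zs = [random.expovariate(lam) - random.expovariate(lam) for _ in range(n)]
# Derived CDF: F_Z(z) = 1 - 0.5*exp(-lam*z) for z >= 0, 0.5*exp(lam*z) for z < 0.
for z0 in (-1.0, 0.0, 1.0):
    analytic = 1 - 0.5 * math.exp(-lam * z0) if z0 >= 0 else 0.5 * math.exp(lam * z0)
    empirical = sum(z <= z0 for z in zs) / n
    assert abs(empirical - analytic) < 0.01

# Problem 3(a): R = sqrt(X^2 + Y^2) for independent standard normal X and Y.
rs = [math.hypot(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]
# Derived CDF: F_R(r) = 1 - exp(-r^2/2).
for r0 in (0.5, 1.0, 2.0):
    analytic = 1 - math.exp(-r0 ** 2 / 2)
    empirical = sum(r <= r0 for r in rs) / n
    assert abs(empirical - analytic) < 0.01

print("Both derived CDFs match the simulations.")
```

With 200,000 samples the standard error of each empirical CDF value is about 0.001, so the 0.01 tolerance leaves a comfortable margin.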
(b) For $0 \le \theta \le 2\pi$,

\[
F_\Theta(\theta) = P(\Theta \le \theta)
= \int_0^\theta \int_0^\infty f_X(r\cos\theta')\, f_Y(r\sin\theta')\, r\,dr\,d\theta'
= \int_0^\theta \int_0^\infty \frac{1}{2\pi}\, e^{-r^2/2}\, r\,dr\,d\theta'
\]
\[
= \int_0^\theta \frac{1}{2\pi}\left(\int_0^\infty e^{-u}\,du\right) d\theta' \qquad \big(u = r^2/2\big)
= \frac{\theta}{2\pi}.
\]

Hence

\[
F_\Theta(\theta) = \begin{cases} 0, & \theta < 0\\[2pt] \dfrac{\theta}{2\pi}, & 0 \le \theta \le 2\pi\\[4pt] 1, & \theta > 2\pi \end{cases}
\qquad\quad
f_\Theta(\theta) = \frac{d}{d\theta}F_\Theta(\theta) = \frac{1}{2\pi}, \quad 0 \le \theta \le 2\pi.
\]

(c)

\[
F_{R,\Theta}(r,\theta) = P(R \le r,\,\Theta \le \theta)
= \int_0^\theta \int_0^r \frac{1}{2\pi}\, e^{-(r')^2/2}\, r'\,dr'\,d\theta'
= \int_0^\theta \frac{1}{2\pi}\left(\int_0^{r^2/2} e^{-u}\,du\right) d\theta' \qquad \big(u = (r')^2/2\big)
\]
\[
= \frac{\theta}{2\pi}\big(1 - e^{-r^2/2}\big), \qquad r \ge 0,\ 0 \le \theta \le 2\pi.
\]

In full,

\[
F_{R,\Theta}(r,\theta) = \begin{cases}
\dfrac{\theta}{2\pi}\big(1 - e^{-r^2/2}\big), & r \ge 0,\ 0 \le \theta \le 2\pi\\[6pt]
1 - e^{-r^2/2}, & r \ge 0,\ \theta > 2\pi\\[2pt]
0, & \text{otherwise}
\end{cases}
\]
\[
f_{R,\Theta}(r,\theta) = \frac{\partial^2}{\partial r\,\partial\theta}F_{R,\Theta}(r,\theta)
= \frac{r}{2\pi}\, e^{-r^2/2}, \qquad r \ge 0,\ 0 \le \theta \le 2\pi.
\]

Note: $R^2$ is exponentially distributed with parameter $\lambda = 1/2$. This gives a very convenient way to generate normal random variables from independent uniform and exponential random variables. We can generate an arbitrary random variable $X$ with CDF $F_X$ by first generating a uniform random variable on $[0,1]$ and then passing the samples from the uniform distribution through the inverse function $F_X^{-1}$. But since we don't have a closed-form expression for the CDF of a normal random variable, this method doesn't work. However,
we do have a closed-form expression for the inverse of the exponential CDF. Therefore, we can generate an exponential random variable $R^2$ with parameter $1/2$ and a uniform random variable $\Theta$ on $[0, 2\pi]$, and with these two distributions we can generate standard normal random variables $X = R\cos\Theta$ and $Y = R\sin\Theta$.

4. Problem 4., page 5 in text. See text for the proof. An alternative proof is given below.

Consider the problem of picking a parameter $\alpha$ to minimize the expected squared difference between the random variable $X$ and the scaled random variable $\alpha Y$:

\[
J(\alpha) = E\big[(X - \alpha Y)^2\big],
\]

with $Y \ne 0$. We start with a variational calculation to find the $\alpha$ that minimizes $J(\alpha)$. The minimizing value of $\alpha$ is found by setting the first derivative of $J(\alpha)$ to zero (this stationary point is a minimum since, for $Y \ne 0$, $\frac{d^2}{d\alpha^2}J(\alpha) = 2E[Y^2] > 0$):

\[
J(\alpha) = E[X^2] - 2\alpha E[XY] + \alpha^2 E[Y^2],
\qquad
\frac{dJ}{d\alpha} = -2E[XY] + 2\alpha E[Y^2] = 0,
\]

so

\[
\alpha^* = \frac{E[XY]}{E[Y^2]}
\]

minimizes $J(\alpha)$.
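Before plugging $\alpha^*$ back in, the minimizer can be sanity-checked numerically: with expectations replaced by sample averages, the sample version of $J(\alpha)$ is a parabola in $\alpha$ whose minimum falls exactly at $\sum_i x_i y_i / \sum_i y_i^2$. A minimal sketch follows; the joint distribution of $X$ and $Y$ below (here $X = 2Y + \text{noise}$, so the true minimizer is $\alpha^* = 2$) is an arbitrary choice for illustration:

```python
import random

random.seed(1)
n = 100_000

# Arbitrary correlated pair: Y standard normal, X = 2Y + noise.
ys = [random.gauss(0, 1) for _ in range(n)]
xs = [2 * y + random.gauss(0, 1) for y in ys]

def J(alpha):
    """Sample estimate of E[(X - alpha*Y)^2]."""
    return sum((x - alpha * y) ** 2 for x, y in zip(xs, ys)) / n

# Sample estimate of the minimizer alpha* = E[XY] / E[Y^2].
alpha_star = sum(x * y for x, y in zip(xs, ys)) / sum(y * y for y in ys)

# The sample J is a parabola in alpha, so alpha* beats every other alpha.
for offset in (-0.5, -0.1, 0.1, 0.5):
    assert J(alpha_star) < J(alpha_star + offset)
print(f"alpha* is approximately {alpha_star:.3f}")
```

The assertions hold exactly, not just approximately, because the sample objective is a quadratic in $\alpha$ minimized at the computed `alpha_star`.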
Then

\[
J(\alpha^*) = E\left[\left(X - \frac{E[XY]}{E[Y^2]}\,Y\right)^{\!2}\,\right]
= E[X^2] - 2\,\frac{(E[XY])^2}{E[Y^2]} + \frac{(E[XY])^2}{(E[Y^2])^2}\,E[Y^2]
= E[X^2] - \frac{(E[XY])^2}{E[Y^2]}.
\]

Since $J(\alpha^*)$ is the expectation of a square, $J(\alpha^*) \ge 0$. Rearranging this expression gives the Schwarz inequality for expected values:

\[
E[X^2]\,E[Y^2] \;\ge\; (E[XY])^2.
\]

Note that in the above derivation we assumed $Y \ne 0$ so that $E[Y^2] > 0$. If instead $Y = 0$, then the Schwarz inequality holds with equality, since then $E[XY] = 0$ and $E[Y^2] = 0$.
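The inequality is easy to check numerically: with expectations replaced by averages over any finite sample, it reduces to the Cauchy-Schwarz inequality for vectors and therefore holds exactly, not just up to sampling error. A minimal sketch (the joint distribution below is an arbitrary choice):

```python
import random

random.seed(2)
n = 10_000

# Any joint distribution works; here X is uniform and Y depends on X.
xs = [random.uniform(-1, 1) for _ in range(n)]
ys = [x ** 3 + random.gauss(0, 0.1) for x in xs]

e_x2 = sum(x * x for x in xs) / n                # E[X^2]
e_y2 = sum(y * y for y in ys) / n                # E[Y^2]
e_xy = sum(x * y for x, y in zip(xs, ys)) / n    # E[XY]

# Schwarz inequality: E[X^2] E[Y^2] >= (E[XY])^2.
assert e_x2 * e_y2 >= e_xy ** 2
print(f"E[X^2]E[Y^2] = {e_x2 * e_y2:.5f} >= (E[XY])^2 = {e_xy ** 2:.5f}")
```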
MIT OpenCourseWare
http://ocw.mit.edu

6.041 / 6.431 Probabilistic Systems Analysis and Applied Probability
Fall

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.