SOLUTION FOR HOMEWORK 12, STAT 435

Welcome to your 12th homework. It looks like this is the last one! As usual, try to find mistakes and get extra points! Now let us look at your problems.

1. Problem 7.22. Here the joint pmf is
$$ f(x_1, x_2) := f_{X_1,X_2}(x_1, x_2) = (1/36)\, x_1 x_2\, I(x_1 \in \{1,2,3\})\, I(x_2 \in \{1,2,3\}). $$
(a) Note that $y = x_1 x_2 \in \{1, 2, 3, 4, 6, 9\}$. Then it is simpler to calculate the pmf of $Y$ directly:
$$ f_Y(1) = f(1,1) = 1/36; \quad f_Y(2) = f(1,2) + f(2,1) = 1/9; \quad f_Y(3) = f(1,3) + f(3,1) = 1/6; $$
$$ f_Y(4) = f(2,2) = 1/9; \quad f_Y(6) = f(2,3) + f(3,2) = 1/3; \quad f_Y(9) = f(3,3) = 1/4. $$
Please check that the sum is 1.
(b) For $Y = X_1/X_2$ we get the support $y \in \{1/3, 1/2, 2/3, 1, 3/2, 2, 3\}$. Then again it is simpler to calculate the pmf of $Y$ directly:
$$ f_Y(1/3) = f(1,3) = 3/36; \quad f_Y(1/2) = f(1,2) = 2/36; \quad f_Y(2/3) = f(2,3) = 6/36; $$
$$ f_Y(1) = f(1,1) + f(2,2) + f(3,3) = (1+4+9)/36; \quad f_Y(3/2) = f(3,2) = 6/36; $$
$$ f_Y(2) = f(2,1) = 2/36; \quad f_Y(3) = f(3,1) = 3/36. $$
Please check that the sum is 1.

2. Problem 7.28. Note that the solution follows directly from the definitions of the negative binomial and geometric distributions, but in any case we will check this via the mgf method; below I do this one more time. Let $X$ be geometric with the probability of success denoted as $\theta$; then its moment generating function (I repeat its calculation, we also had it in one of the previous HWs) is
$$ M_X(t) = E e^{Xt} = \sum_{x=1}^{\infty} \theta (1-\theta)^{x-1} e^{xt} = \frac{\theta}{1-\theta} \sum_{x=1}^{\infty} [(1-\theta) e^t]^x. $$
On the right-hand side we have a geometric series which converges for all sufficiently small $t$, and we know how to calculate it, so I continue and get
$$ M_X(t) = \frac{\theta}{1-\theta} \cdot \frac{(1-\theta)e^t}{1-(1-\theta)e^t} = \frac{\theta e^t}{1-(1-\theta)e^t}. $$
On the other hand, if $Y \sim \mathrm{NegBinom}(\theta, k)$ then
$$ M_Y(t) = \sum_{x=k}^{\infty} \frac{(x-1)!}{(x-k)!(k-1)!}\, \theta^k (1-\theta)^{x-k} e^{xt} $$
$$ = \left[\frac{\theta}{1-\theta}\right]^k \sum_{x=k}^{\infty} \frac{(x-1)!}{(x-k)!(k-1)!}\, [(1-\theta)e^t]^x. $$
Now I would like to use the fact that the pmf of a negative binomial RV with probability of success $1-q$, where $q = (1-\theta)e^t$, sums to 1 (note that $1-q$ is a valid probability of success for sufficiently small $t$). We continue:
$$ M_Y(t) = \left[\frac{\theta}{1-\theta}\right]^k \frac{q^k}{(1-q)^k} \sum_{x=k}^{\infty} \frac{(x-1)!}{(x-k)!(k-1)!}\,(1-q)^k q^{x-k} = \left[\frac{\theta}{1-\theta}\right]^k \frac{(1-\theta)^k e^{kt}}{(1-(1-\theta)e^t)^k} = \frac{\theta^k e^{kt}}{(1-(1-\theta)e^t)^k}. $$
Now we can conclude that for $k = 2$ the mgf $M_Y(t)$ is indeed equal to $M_{X_1+X_2}(t) = M_{X_1}(t) M_{X_2}(t)$. In other words, the sum of two independent geometric RVs has the negative binomial distribution with the stop at the second success (of course, the probability of success in each underlying Bernoulli trial should be the same).

3. Problem 7.29. Let $X$ and $Y$ be independent standard normal. Then for $Z = X + Y$ we have $X = Z - Y$, and then using one of our methods we get
$$ f_{Z,Y}(z, y) = |\partial(z-y)/\partial z|\, f_{X,Y}(z-y, y) = (2\pi)^{-1} e^{-[(z-y)^2 + y^2]/2}. $$
Now note that
$$ (z-y)^2 + y^2 = 2y^2 - 2zy + z^2 = [2y^2 - 2zy + z^2/2] + z^2/2 = (2^{1/2} y - z/2^{1/2})^2 + z^2/2 = 2(y - z/2)^2 + z^2/2. $$
Using this we continue:
$$ f_{Z,Y}(z, y) = (2\pi)^{-1} e^{-(y-z/2)^2} e^{-z^2/4}. $$
Then
$$ f_Z(z) = e^{-z^2/4} (2\pi)^{-1/2} (1/2)^{1/2} \int_{-\infty}^{\infty} (2\pi(1/2))^{-1/2} e^{-(y-z/2)^2/[2(1/2)]}\, dy = [(2\pi)2]^{-1/2} e^{-z^2/[(2)(2)]}. $$
This implies that $Z \sim \mathrm{Norm}(0, 2)$, with zero mean and variance equal to 2. Please check that the mean and variance are reasonable.

4. Problem 7.3. Let $f_{X,Y}(x, y) = 12xy(1-y)\, I(0 < x < 1)\, I(0 < y < 1)$. Set $Z = XY^2$ and note that $Z \in (0,1)$. Then we can write two equivalent systems (direct and inverse relations):
$$ \begin{cases} Z = XY^2 \\ U = Y \end{cases} \qquad\qquad \begin{cases} X = Z/U^2 \\ Y = U \end{cases} $$
Note that if $z \in (0,1)$ then $z^{1/2} < u < 1$. Now we calculate the Jacobian:
$$ \begin{vmatrix} \partial x/\partial z & \partial x/\partial u \\ \partial y/\partial z & \partial y/\partial u \end{vmatrix} = \begin{vmatrix} 1/u^2 & -2z/u^3 \\ 0 & 1 \end{vmatrix} = u^{-2}. $$
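Before applying it, the Jacobian just computed can be sanity-checked by finite differences. This is a throwaway numerical sketch, not part of the graded solution; the evaluation point $(z, u) = (0.3, 0.7)$ is arbitrary.

```python
# Finite-difference check of the Jacobian for the inverse map
# (z, u) -> (x, y) = (z/u^2, u); the derivation gives |J| = 1/u^2.
h = 1e-6
z, u = 0.3, 0.7  # arbitrary interior point with sqrt(z) < u < 1

def x_of(z, u):
    return z / u**2

def y_of(z, u):
    return u

# Partial derivatives by central differences
dx_dz = (x_of(z + h, u) - x_of(z - h, u)) / (2 * h)
dx_du = (x_of(z, u + h) - x_of(z, u - h)) / (2 * h)
dy_dz = (y_of(z + h, u) - y_of(z - h, u)) / (2 * h)
dy_du = (y_of(z, u + h) - y_of(z, u - h)) / (2 * h)

jac = dx_dz * dy_du - dx_du * dy_dz
assert abs(jac - 1 / u**2) < 1e-5  # matches the analytic value u^{-2}
```

The same four-line check works for any of the bivariate transformations in this homework; only `x_of` and `y_of` change.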
Then
$$ f_{Z,U}(z, u) = u^{-2} f_{X,Y}(z/u^2, u) = u^{-2}\,[12 (z/u^2)\, u (1-u)]\, I(z \in (0,1),\ z^{1/2} < u < 1). $$
Further, we calculate the marginal density
$$ f_Z(z) = \int f_{Z,U}(z, u)\, du = 12 z \int_{z^{1/2}}^{1} u^{-3}(1-u)\, du = 12 z [(-1/2)(1 - z^{-1}) + (1 - z^{-1/2})] $$
$$ = 12 z [1/2 + (1/2)z^{-1} - z^{-1/2}] = (6z + 6 - 12 z^{1/2})\, I(z \in (0,1)). $$
Does it integrate to 1?

5. Problem 7.36. Here the joint pdf is $f_{X_1,X_2}(x_1, x_2) = 4 x_1 x_2\, I(0 < x_1 < 1,\ 0 < x_2 < 1)$. Then for the considered transformation we have
$$ \begin{cases} Y_1 = X_1^2 \\ Y_2 = X_1 X_2 \end{cases} \qquad\qquad \begin{cases} X_1 = Y_1^{1/2} \\ X_2 = Y_2/Y_1^{1/2} \end{cases} $$
with $y_1 \in (0,1)$ and $y_2 \in (0, y_1^{1/2})$. The corresponding Jacobian is
$$ \begin{vmatrix} \partial x_1/\partial y_1 & \partial x_1/\partial y_2 \\ \partial x_2/\partial y_1 & \partial x_2/\partial y_2 \end{vmatrix} = \begin{vmatrix} 1/(2y_1^{1/2}) & 0 \\ -y_2/(2y_1^{3/2}) & 1/y_1^{1/2} \end{vmatrix} = 1/(2y_1). $$
This yields
$$ f_{Y_1,Y_2}(y_1, y_2) = (1/(2y_1))\, 4 y_1^{1/2} (y_2/y_1^{1/2})\, I(y_1 \in (0,1))\, I(y_2 \in (0, y_1^{1/2})) = (2y_2/y_1)\, I(y_1 \in (0,1))\, I(y_2 \in (0, y_1^{1/2})). $$
Let us check this answer:
$$ \int_0^1 \left[\int_0^{y_1^{1/2}} f_{Y_1,Y_2}(y_1, y_2)\, dy_2\right] dy_1 = \int_0^1 (2/y_1)(y_1/2)\, dy_1 = 1. $$
It looks OK.

6. Problem 7.37. Here
$$ f_{X,Y}(x, y) = 24xy\, I(0 < x,\ 0 < y,\ x + y < 1). $$
For the studied transformation
$$ \begin{cases} Z = X + Y \\ W = X \end{cases} \qquad\qquad \begin{cases} X = W \\ Y = Z - W \end{cases} $$
with $w \in (0,1)$ and $w < z < 1$. The corresponding Jacobian is
$$ \begin{vmatrix} \partial x/\partial z & \partial x/\partial w \\ \partial y/\partial z & \partial y/\partial w \end{vmatrix} = \begin{vmatrix} 0 & 1 \\ 1 & -1 \end{vmatrix} = -1. $$
Then we use our rule to find
$$ f_{Z,W}(z, w) = |-1|\, f_{X,Y}(w, z-w)\, I(w \in (0,1),\ z \in (w,1)) = 24 w (z-w)\, I(w \in (0,1),\ z \in (w,1)). $$
Let us check the answer:
$$ \int_0^1 24 w \left[\int_w^1 (z-w)\, dz\right] dw = 24 \int_0^1 w [(1/2)(1-w^2) - w(1-w)]\, dw $$
$$ = 24 \int_0^1 [w/2 + (1/2)w^3 - w^2]\, dw = 24[(1/4) + (1/8) - (1/3)] = 24(6+3-8)/24 = 1. $$
The answer looks OK.

7. Problem 7.4. If $X \sim \mathrm{Binom}(\theta, n)$ then its mgf is $M_X(t) = [1 + \theta(e^t - 1)]^n$. Then for two independent $X_1 \sim \mathrm{Binom}(\theta, n_1)$ and $X_2 \sim \mathrm{Binom}(\theta, n_2)$ we have
$$ M_{X_1+X_2}(t) = M_{X_1}(t) M_{X_2}(t) = (1 + \theta(e^t - 1))^{n_1+n_2}, $$
which is the mgf of $\mathrm{Binom}(\theta, n_1+n_2)$.

8. Problem 7.43. Let $X \sim \mathrm{Gamma}(\alpha, \beta)$. Then $M_X(t) = (1-\beta t)^{-\alpha}$. Suppose that $X_1, \ldots, X_n$ are iid Gamma RVs with parameters $(\alpha, \beta)$. Then
$$ M_{X_1+X_2+\cdots+X_n}(t) = (1-\beta t)^{-\alpha n}. $$
This yields that the sum of $n$ iid Gamma RVs has the Gamma distribution with parameters $(n\alpha, \beta)$.

9. Problem 7.6. Here $X \sim \mathrm{Poisson}(\lambda = 3.3)$, where $X$ is the number of complaints per day. Then:
(a) $P(X = 2) = e^{-\lambda} \lambda^2 / 2!$.
(b) Here the RV of interest is $Y = X_1 + X_2$, with $X_1$ and $X_2$ being the numbers of complaints during the first and second days. Then we know that $X_1 + X_2 \sim \mathrm{Poisson}(2\lambda)$. You may quickly check the latter via
$$ M_X(t) = e^{\lambda(e^t - 1)}, \qquad M_{X_1+X_2}(t) = M_{X_1}(t) M_{X_2}(t) = e^{2\lambda(e^t - 1)}. $$
Using this fact we get $P(X_1 + X_2 = 5) = e^{-2\lambda} (2\lambda)^5 / 5!$.
(c) Here $X_1 + X_2 + X_3 \sim \mathrm{Poisson}(3\lambda)$, so
$$ P(12 \le X_1 + X_2 + X_3 \le 14) = \sum_{k=12}^{14} \frac{e^{-3\lambda} (3\lambda)^k}{k!}. $$
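The three Poisson answers above can be evaluated numerically, and the claim in (b) that $X_1 + X_2 \sim \mathrm{Poisson}(2\lambda)$ can be double-checked by directly convolving two Poisson($\lambda$) pmfs. A small stdlib-only sketch (an illustration, not part of the graded solution):

```python
from math import exp, factorial

lam = 3.3  # complaints per day, as in the problem

def pois_pmf(k, mu):
    """Poisson(mu) pmf at integer k."""
    return exp(-mu) * mu**k / factorial(k)

# (a) P(X = 2) for a single day
p_a = pois_pmf(2, lam)

# (b) P(X1 + X2 = 5): once via the Poisson(2*lam) result,
# once via direct convolution of two independent Poisson(lam) pmfs
p_b = pois_pmf(5, 2 * lam)
p_b_conv = sum(pois_pmf(j, lam) * pois_pmf(5 - j, lam) for j in range(6))
assert abs(p_b - p_b_conv) < 1e-12  # the two routes agree

# (c) probability of 12 to 14 complaints over three days, via Poisson(3*lam)
p_c = sum(pois_pmf(k, 3 * lam) for k in range(12, 15))
```

The convolution check is exactly the mgf identity $M_{X_1+X_2} = M_{X_1} M_{X_2}$ seen at the level of pmfs.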
10. Problem 7.63(a). It is known that if $X_i \sim \mathrm{Expon}(\theta)$ then $Y_k = \sum_{i=1}^{k} X_i \sim \mathrm{Gamma}(\alpha = k, \beta = \theta)$; see p.257, Ex.7.6. (It is easy to check this via the mgf approach.) Then
$$ P(X_1 + X_2 < 8) = P(Y_2 < 8), $$
where $Y_2 \sim \mathrm{Gamma}(\alpha = 2, \beta = \theta = 5)$. Then
$$ P(Y_2 < 8) = \int_0^8 \frac{x e^{-x/5}}{\Gamma(2)\, 5^2}\, dx = (1/5^2)\left[-5x e^{-x/5}\Big|_0^8 + 5\int_0^8 e^{-x/5}\, dx\right] $$
$$ = (1/25)[-40 e^{-8/5} + 25 - 25 e^{-8/5}] = (1/25)[25 - 65 e^{-8/5}] = 1 - (65/25) e^{-8/5}. $$

11. Problem 7.69. Let $X \sim \mathrm{Normal}(\mu, \sigma^2)$. Then $Y = e^X$ has support $(0, \infty)$ and $X = \ln(Y)$ is the inverse function. Then we apply our rule and get for the transformation at hand:
$$ f_Y(y) = |dx(y)/dy|\, f_X(\ln(y)) = y^{-1} (2\pi\sigma^2)^{-1/2} e^{-(\ln(y) - \mu)^2/(2\sigma^2)}\, I(y > 0). $$
This is the famous log-normal density, which plays an important role in many branches of statistics, in particular in regression.
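As a final sanity check, the closed-form probability in Problem 7.63(a) and the log-normal density in Problem 7.69 can both be verified by crude numerical integration. This is a throwaway sketch; the values of $\mu$ and $\sigma$ are arbitrary illustration choices, not from the problem.

```python
from math import exp, log, pi, sqrt

# Problem 7.63(a): P(Y2 < 8) for Y2 ~ Gamma(alpha=2, beta=5)
closed_form = 1 - (65 / 25) * exp(-8 / 5)

def gamma2_pdf(x, beta=5.0):
    # Gamma(2, beta) density: x * exp(-x/beta) / beta^2, using Gamma(2) = 1
    return x * exp(-x / beta) / beta**2

dx = 1e-4
numeric = sum(gamma2_pdf(k * dx) * dx for k in range(1, 80_000))  # Riemann sum over (0, 8)
assert abs(numeric - closed_form) < 1e-3

# Problem 7.69: the derived log-normal density should integrate to 1
mu, sigma = 0.5, 0.8  # arbitrary illustration values

def f_Y(y):
    return exp(-(log(y) - mu)**2 / (2 * sigma**2)) / (y * sqrt(2 * pi * sigma**2))

dy = 1e-3
total = sum(f_Y(k * dy) * dy for k in range(1, 200_000))  # Riemann sum over (0, 200)
assert abs(total - 1) < 1e-3
```

Both checks confirm the algebra above to three decimal places; with larger grids the agreement tightens accordingly.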