Solutions: Homework 3

1. Suppose that the random variables $Y_1,\dots,Y_n$ satisfy
\[ Y_i = \beta x_i + \varepsilon_i, \quad i = 1,\dots,n, \]
where $x_1,\dots,x_n \in \mathbb{R}$ are fixed values and $\varepsilon_1,\dots,\varepsilon_n \overset{\text{IID}}{\sim} \text{Normal}(0,\sigma^2)$ with $\sigma^2 \in \mathbb{R}^+$ known. Find $\hat{\beta} = \text{MLE}(\beta)$.

Solution: Observe that $Y_i \overset{\text{IND}}{\sim} \text{Normal}(\beta x_i, \sigma^2)$, $i = 1,\dots,n$. With this information, the log-likelihood function relative to an observed sample $y = (y_1,\dots,y_n) \in \mathbb{R}^n$ is
\[ \ell(\beta; y) = -\frac{n}{2}\ln(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^n (y_i - \beta x_i)^2. \]
The function $\ell(\,\cdot\,; y) : \mathbb{R} \to \mathbb{R}$ is continuously differentiable in $\beta$. In particular,
\[ \ell'(\beta; y) = \frac{1}{\sigma^2}\sum_{i=1}^n x_i (y_i - \beta x_i) = \frac{S_{xx}}{\sigma^2}\Big[\frac{\sum_{i=1}^n x_i y_i}{S_{xx}} - \beta\Big], \qquad S_{xx} = \sum_{i=1}^n x_i^2. \]
Notice that
\[ \ell'(\beta; y) > 0 \iff \beta < \frac{\sum_{i=1}^n x_i y_i}{S_{xx}}, \qquad \ell'(\beta; y) < 0 \iff \beta > \frac{\sum_{i=1}^n x_i y_i}{S_{xx}}. \]
Also $\lim_{\beta\to\pm\infty} \ell(\beta; y) = -\infty$. This means that $\ell(\,\cdot\,; y) : \mathbb{R} \to \mathbb{R}$ has a global maximum (which depends on $y$) at
\[ \beta_{\text{MAX}}(y) = \frac{\sum_{i=1}^n x_i y_i}{S_{xx}}. \]
Finally, the MLE of $\beta$ is the function $\beta_{\text{MAX}} : \mathbb{R}^n \to \mathbb{R}$ evaluated at the sample $Y = (Y_1,\dots,Y_n)$, i.e.
\[ \hat{\beta} = \beta_{\text{MAX}}(Y) = \frac{\sum_{i=1}^n x_i Y_i}{S_{xx}}. \]

2. Find the distribution of $\hat{\beta}$.

Solution: Observe that $\hat{\beta}$ can be expressed as
\[ \hat{\beta} = \sum_{i=1}^n c_i Y_i, \qquad c_i = \frac{x_i}{S_{xx}}. \]
The last relation says that $\hat{\beta}$ is a linear combination of independent random variables with Normal distributions. Then $\hat{\beta} \sim \text{Normal}(\mu_{\hat\beta}, \sigma^2_{\hat\beta})$, where
\[ \mu_{\hat\beta} = E[\hat{\beta}] = \sum_{i=1}^n c_i\,E[Y_i] = \frac{1}{S_{xx}}\sum_{i=1}^n x_i(\beta x_i) = \beta, \qquad \sigma^2_{\hat\beta} = \text{Var}[\hat{\beta}] = \sum_{i=1}^n c_i^2\,\text{Var}[Y_i] = \frac{\sigma^2}{S_{xx}^2}\sum_{i=1}^n x_i^2 = \frac{\sigma^2}{S_{xx}}. \]
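The closed-form MLE above is easy to check by simulation. A quick numerical sketch (not part of the original solution): the values $\beta = 2$, $\sigma = 1.5$, and the design points $x_i$ below are arbitrary assumptions; the sketch verifies that $\hat\beta$ is centered at $\beta$ with variance $\sigma^2/S_{xx}$.

```python
import random

random.seed(0)
beta, sigma = 2.0, 1.5                      # assumed true parameter values
x = [0.5 * i for i in range(1, 11)]         # fixed design points (assumed)
S_xx = sum(xi ** 2 for xi in x)

def beta_hat(y):
    """MLE: least-squares slope through the origin, sum(x_i y_i) / S_xx."""
    return sum(xi * yi for xi, yi in zip(x, y)) / S_xx

draws = []
for _ in range(20000):
    y = [beta * xi + random.gauss(0.0, sigma) for xi in x]
    draws.append(beta_hat(y))

mean_hat = sum(draws) / len(draws)
var_hat = sum((d - mean_hat) ** 2 for d in draws) / len(draws)
print(mean_hat)                  # close to beta = 2.0
print(var_hat, sigma ** 2 / S_xx)  # the two should be close
```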
3. Find the CRLB for estimating $\beta$ with $\hat{\beta}$.

Solution: Because $Y_1,\dots,Y_n$ are random variables whose distributions belong to the Exponential Family,
\[ \text{CRLB}(\beta) = \frac{1}{-E[\ell''(\beta; Y)]}, \]
where $\ell''(\beta; y) = -S_{xx}/\sigma^2$ for any observed sample $y \in \mathbb{R}^n$. This shows that $\ell''(\beta; Y)$ is constant as a function of $Y$. Then
\[ -E[\ell''(\beta; Y)] = \frac{S_{xx}}{\sigma^2} \;\Rightarrow\; \text{CRLB}(\beta) = \frac{\sigma^2}{S_{xx}}. \]

4. Show that $\hat{\beta} = \text{UMVUE}(\beta)$.

Solution: We proved earlier that $\hat{\beta} \sim \text{Normal}\big(\beta,\ \sigma^2/S_{xx}\big)$. From this it follows that $\hat{\beta}$ is an unbiased estimator for $\beta$. Also $\hat{\beta}$ has a distribution that belongs to the Exponential Family and
\[ \text{Var}[\hat{\beta}] = \frac{\sigma^2}{S_{xx}} = \text{CRLB}(\beta) \quad \forall\,\beta \in \mathbb{R}. \]
This proves that $\hat{\beta} = \text{UMVUE}(\beta)$.

5. Show that
\[ T = \frac{1}{n}\sum_{i=1}^n X_i^2 \]
is the UMVUE for $\sigma^2 \in \mathbb{R}^+$ if $X_1,\dots,X_n \overset{\text{IID}}{\sim} \text{Normal}(0,\sigma^2)$.

Solution: Given that the distributions of $X_1,\dots,X_n$ belong to the Exponential Family,
\[ \text{CRLB}(\sigma^2) = \frac{1}{-E[\ell''(\sigma^2; X)]}, \qquad X = (X_1,\dots,X_n). \]
The associated log-likelihood function relative to an observed sample $x = (x_1,\dots,x_n) \in \mathbb{R}^n$ is
\[ \ell(\sigma^2; x) = -\frac{n}{2}\ln(2\pi) - \frac{n}{2}\ln(\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^n x_i^2. \]
Notice that
\[ \ell''(\sigma^2; x) = \frac{n}{2(\sigma^2)^2} - \frac{1}{(\sigma^2)^3}\sum_{i=1}^n x_i^2. \]
With this information,
\[ -E[\ell''(\sigma^2; X)] = -\frac{n}{2(\sigma^2)^2} + \frac{n\,\text{Var}[X_1]}{(\sigma^2)^3} = \frac{n}{2(\sigma^2)^2} \;\Rightarrow\; \text{CRLB}(\sigma^2) = \frac{2(\sigma^2)^2}{n}. \]
On the other hand,
\[ T = \frac{\sigma^2}{n}\,U, \qquad U = \sum_{i=1}^n W_i \ \text{ with } W_i = (X_i/\sigma)^2. \]
Because $W_1,\dots,W_n \overset{\text{IID}}{\sim} \chi^2(1)$, $U \sim \chi^2(n)$. Then $T$ has a distribution that belongs to the Exponential Family (Change of Variables) and
\[ E[T] = \frac{\sigma^2}{n}\,E[U] = \sigma^2, \]
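The claim that $T$ attains the bound $2(\sigma^2)^2/n$ can be sanity-checked by simulation. A quick numerical sketch (not part of the original solution); $n = 20$, $\sigma = 1.3$, and the replication count are arbitrary assumptions.

```python
import random

random.seed(1)
n, sigma = 20, 1.3                 # assumed sample size and scale
sigma2 = sigma ** 2

def T(xs):
    """The estimator T = (1/n) * sum(X_i^2)."""
    return sum(x * x for x in xs) / len(xs)

vals = []
for _ in range(40000):
    xs = [random.gauss(0.0, sigma) for _ in range(n)]
    vals.append(T(xs))

mean_T = sum(vals) / len(vals)
var_T = sum((v - mean_T) ** 2 for v in vals) / len(vals)
crlb = 2 * sigma2 ** 2 / n
print(mean_T, sigma2)   # unbiasedness: the two should be close
print(var_T, crlb)      # variance should be close to the CRLB
```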
i.e., $T$ is an unbiased estimator for $\sigma^2$. Also
\[ \text{Var}[T] = \frac{(\sigma^2)^2}{n^2}\,\text{Var}[U] = \frac{(\sigma^2)^2}{n^2}\,(2n) = \frac{2(\sigma^2)^2}{n} = \text{CRLB}(\sigma^2) \quad \forall\,\sigma^2 \in \mathbb{R}^+. \]
From these calculations it follows that $T = \text{UMVUE}(\sigma^2)$.

6. Let $X_1,\dots,X_n \overset{\text{IID}}{\sim} \text{Bernoulli}(\theta)$ with $\theta \in (0,1)$. Find the Bayes Estimator $\hat{\theta}_B$ of $\theta$ with respect to the Uniform(0,1) prior under the Loss Function
\[ L(t,\theta) = \frac{(t-\theta)^2}{\theta(1-\theta)}, \quad t \in \mathbb{R},\ \theta \in (0,1). \]

Solution: The first step is to deduce the posterior distribution of $\theta$ given an observed sample $x = (x_1,\dots,x_n) \in \{0,1\}^n$. With the provided information,
\[ \pi(\theta \mid x) \propto \Big[\prod_{i=1}^n \theta^{x_i}(1-\theta)^{1-x_i}\Big] I(\theta \in (0,1)) = \theta^{s}(1-\theta)^{n-s}\,I(\theta \in (0,1)), \qquad s = \sum_{i=1}^n x_i. \]
Notice that this expression is the functional part of a Beta$(\alpha,\beta)$ density with parameters $\alpha = s+1$ and $\beta = n-s+1$. Then $\theta \mid x \sim \text{Beta}(\alpha,\beta)$. The next step is to calculate the posterior expected loss for our observed sample. In this case,
\[ E[L(t,\theta) \mid x] = \frac{1}{B(\alpha,\beta)}\int_{(0,1)} (t-\theta)^2\,\theta^{s-1}(1-\theta)^{n-s-1}\,d\theta, \quad t \in \mathbb{R}. \]
Observe that $f(\theta) = \theta^{s-1}(1-\theta)^{n-s-1}\,I(\theta \in (0,1))$, $\theta \in \mathbb{R}$, is proportional to a Beta$(\alpha^\star,\beta^\star)$ density with parameters $\alpha^\star = s$ and $\beta^\star = n-s$. Because of this observation,
\[ E[L(t,\theta) \mid x] = \frac{B(\alpha^\star,\beta^\star)}{B(\alpha,\beta)}\,E[(t-\theta^\star)^2 \mid x], \qquad \theta^\star \mid x \sim \text{Beta}(\alpha^\star,\beta^\star). \]
As a consequence of the above relation, the value $t_{\text{MIN}}(x) \in \mathbb{R}$ (which depends on $x$) that minimizes $E[L(t,\theta) \mid x]$ is the same one that minimizes $E[(t-\theta^\star)^2 \mid x]$. This last problem has an explicit solution (Squared Error Loss):
\[ t_{\text{MIN}}(x) = E[\theta^\star \mid x] = \frac{\alpha^\star}{\alpha^\star+\beta^\star} = \frac{s}{n} = \bar{x}. \]
Finally, the Bayes Estimator is the function $t_{\text{MIN}} : \{0,1\}^n \to \mathbb{R}$ evaluated at the sample $X = (X_1,\dots,X_n)$, i.e.
\[ \hat{\theta}_B = t_{\text{MIN}}(X) = \bar{X}. \]

Solutions: Homework 4

1. Let $X_1,\dots,X_n$ be a random sample from a Normal$(\theta,\sigma^2)$ population ($\sigma^2$ known). Consider estimating $\theta$ using Squared Error Loss and a Normal$(\mu,\tau^2)$ prior distribution for $\theta$. Let $\delta^\pi$ be the Bayes Estimator for $\theta$. Show that the posterior distribution of $\theta$ is Normal$(m, v^2)$ with parameters
\[ m = \frac{\tau^2}{\tau^2+\sigma^2/n}\,\bar{x} + \frac{\sigma^2/n}{\tau^2+\sigma^2/n}\,\mu, \qquad v^2 = \frac{\tau^2\,\sigma^2/n}{\tau^2+\sigma^2/n}. \]
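The argument above says the weighted-loss minimizer is $\bar{x}$, the mean of the tilted Beta$(s, n-s)$ posterior. A brute-force numerical sketch (not part of the original solution) checks this for one concrete sample; the sample below, the grids, and their resolutions are arbitrary assumptions.

```python
# Assumed observed Bernoulli sample: n = 10, s = 6.
x = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]
n, s = len(x), sum(x)

# Grid over theta for integration (interior of (0,1) only, since the
# loss divides by theta*(1-theta)).
TH = [k / 2000 for k in range(1, 2000)]
W = [th ** s * (1 - th) ** (n - s) for th in TH]   # posterior kernel
DEN = sum(W)

def post_expected_loss(t):
    """E[(t-theta)^2 / (theta*(1-theta)) | x] under the Uniform prior."""
    num = sum((t - th) ** 2 / (th * (1 - th)) * w for th, w in zip(TH, W))
    return num / DEN

# Brute-force minimization over candidate estimates t.
ts = [k / 200 for k in range(1, 200)]
t_min = min(ts, key=post_expected_loss)
print(t_min, s / n)   # minimizer should match the sample mean
```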
Solution: Our task reduces to identifying the functional part of the posterior density of $\theta$ given an observed sample $x = (x_1,\dots,x_n) \in \mathbb{R}^n$. Ignoring all terms that do not depend on $\theta$,
\[ \pi(\theta \mid x) \propto \exp\Big[-\frac{1}{2\sigma^2}\sum_{i=1}^n (x_i-\theta)^2\Big]\exp\Big[-\frac{1}{2\tau^2}(\theta-\mu)^2\Big]\,I(\theta \in \mathbb{R}). \]
Completing the square in $\theta$ (and again discarding terms free of $\theta$),
\[ \pi(\theta \mid x) \propto \exp\Big[-\frac{1}{2}\Big(\frac{n}{\sigma^2}+\frac{1}{\tau^2}\Big)\Big(\theta - \frac{(n/\sigma^2)\,\bar{x} + (1/\tau^2)\,\mu}{n/\sigma^2 + 1/\tau^2}\Big)^2\Big] = \exp\Big[-\frac{(\theta-m)^2}{2v^2}\Big], \]
which means that $\theta \mid x \sim \text{Normal}(m, v^2)$.

2. Find the Bayes Estimator $\hat{\theta}_B$ of $\theta$ under Squared Error Loss.

Solution: In general, when the posterior distribution admits at least second-order moments, the Bayes Estimator under Squared Error Loss is the Posterior Mean. In our case, the Normal distribution has finite moments of all orders. Then
\[ \delta^\pi(X) = E[\theta \mid X] = \frac{\tau^2}{\tau^2+\sigma^2/n}\,\bar{X} + \frac{\sigma^2/n}{\tau^2+\sigma^2/n}\,\mu = a\bar{X} + b \]
with
\[ a = \frac{\tau^2}{\tau^2+\sigma^2/n}, \qquad b = \frac{\sigma^2/n}{\tau^2+\sigma^2/n}\,\mu. \]

3. Complete parts (a)-(c) of Problem 7.6.

Solution: For any constants $a \in \mathbb{R}$ and $b \in \mathbb{R}\setminus\{0\}$, the estimator $\delta(X) = a\bar{X}+b$ has a Normal$(a\theta+b,\ a^2\sigma^2/n)$ distribution given $\theta$ (invariance under linear transformations). The associated Risk Function (Squared Error Loss) is
\[ R(\theta,\delta) = (E[\delta(X)]-\theta)^2 + \text{Var}[\delta(X)] = ((a\theta+b)-\theta)^2 + \frac{a^2\sigma^2}{n} = (b-(1-a)\theta)^2 + \frac{a^2\sigma^2}{n}. \]
In particular, for $\delta^\pi(X) = a_\tau\bar{X}+b_\tau$ the coefficients $a_\tau \in \mathbb{R}^+$ and $b_\tau \in \mathbb{R}$ satisfy the relation $b_\tau = (1-a_\tau)\mu$. Then
\[ R(\theta,\delta^\pi) = (b_\tau-(1-a_\tau)\theta)^2 + \frac{a_\tau^2\sigma^2}{n} = (1-a_\tau)^2(\mu-\theta)^2 + \frac{a_\tau^2\sigma^2}{n} = (1-c)^2(\mu-\theta)^2 + c^2\frac{\sigma^2}{n}, \qquad c = a_\tau = \frac{\tau^2}{\tau^2+\sigma^2/n}. \]
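The conjugate update can be verified against a brute-force grid posterior. A numerical sketch (not part of the original solution); the hyperparameters $\mu = 1$, $\tau^2 = 4$, the likelihood values $\sigma^2 = 2$, $n = 15$, $\theta = 3$, and the grid are arbitrary assumptions.

```python
import math
import random

random.seed(2)
mu0, tau2, sigma2, n = 1.0, 4.0, 2.0, 15   # assumed prior / model values
theta = 3.0                                 # assumed true mean
xs = [random.gauss(theta, math.sqrt(sigma2)) for _ in range(n)]
xbar = sum(xs) / n

# Closed-form conjugate update from the solution.
m = (tau2 / (tau2 + sigma2 / n)) * xbar + ((sigma2 / n) / (tau2 + sigma2 / n)) * mu0
v2 = (tau2 * sigma2 / n) / (tau2 + sigma2 / n)

# Brute-force posterior mean on a dense grid of theta values.
grid = [t / 100 for t in range(-500, 1000)]
def post_kernel(t):
    ll = -sum((xi - t) ** 2 for xi in xs) / (2 * sigma2)   # log-likelihood
    lp = -(t - mu0) ** 2 / (2 * tau2)                      # log-prior
    return math.exp(ll + lp)
ws = [post_kernel(t) for t in grid]
m_grid = sum(t * w for t, w in zip(grid, ws)) / sum(ws)
print(m, m_grid)   # the two posterior means should agree closely
```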
Finally, the associated Bayes Risk of the Bayes Estimator is
\[ B(\pi,\delta^\pi) = E[R(\theta,\delta^\pi)] = (1-c)^2\,E[(\mu-\theta)^2] + c^2\frac{\sigma^2}{n} = (1-c)^2\tau^2 + c^2\frac{\sigma^2}{n} \]
\[ = \frac{(\sigma^2/n)^2\,\tau^2}{(\tau^2+\sigma^2/n)^2} + \frac{\tau^4\,(\sigma^2/n)}{(\tau^2+\sigma^2/n)^2} = \frac{\tau^2(\sigma^2/n)\big(\sigma^2/n+\tau^2\big)}{(\tau^2+\sigma^2/n)^2} = \frac{\tau^2\,\sigma^2/n}{\tau^2+\sigma^2/n} = c\,\frac{\sigma^2}{n}. \]

Solutions: Homework 5

Consider a random sample of size $n$ from a distribution with discrete pdf $f(x \mid p) = p(1-p)^x$ for $x = 0, 1, 2, \dots$ and $0$ otherwise. Before proceeding, notice that $E(X_1) = (1-p)/p$ and $\text{Var}(X_1) = (1-p)/p^2$.

1. The MLE of $p$ is the value that solves
\[ \frac{d}{dp}\log f(x \mid p) = 0 \;\Rightarrow\; \frac{d}{dp}\log\Big[p^n(1-p)^{\sum_i x_i}\Big] = 0 \;\Rightarrow\; \frac{n}{p} - \frac{\sum_i x_i}{1-p} = 0 \;\Rightarrow\; n(1-p) = p\sum_i x_i \;\Rightarrow\; p = \frac{n}{\sum_i x_i + n}. \]
Now since
\[ \frac{d^2}{dp^2}\log f(x \mid p) = -\frac{n}{p^2} - \frac{\sum_i x_i}{(1-p)^2} < 0, \]
$\hat{p}$ is indeed a maximum. Further, the likelihood at $p = 1$ (when $\sum_i x_i > 0$) and at $p = 0$ is zero, so the boundary values are not maxima. Thus, the MLE of $p$ is
\[ \hat{p} = \frac{n}{\sum_i x_i + n}. \]

2. By the invariance properties of MLEs, the MLE of $\tau = (1-p)/p$ is
\[ \hat{\tau} = \frac{1}{\hat{p}} - 1 = \bar{X}. \]

3. The CRLB associated with unbiased estimators of $\tau$ is
\[ \frac{\big[\tfrac{d\tau}{dp}\big]^2}{I_n(p)}. \]
Now $\frac{d\tau}{dp} = \frac{d}{dp}\,(1-p)/p = -1/p^2$ and
\[ I_n(p) = -E\Big[\frac{d^2}{dp^2}\log f(X \mid p)\Big] = E\Big[\frac{n}{p^2} + \frac{\sum_i X_i}{(1-p)^2}\Big] = \frac{n}{p^2} + \frac{n(1-p)/p}{(1-p)^2} = \frac{n}{p^2} + \frac{n}{p(1-p)} = \frac{n}{p^2(1-p)}. \]
Thus the CRLB associated with unbiased estimators of $\tau$ is
\[ \frac{\big[\tfrac{d\tau}{dp}\big]^2}{I_n(p)} = \Big[\frac{1}{p^2}\Big]^2\,\frac{p^2(1-p)}{n} = \frac{1-p}{np^2}. \]
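The closed form $\hat{p} = n/(\sum_i x_i + n)$ can be checked against a direct grid maximization of the log-likelihood. A numerical sketch (not part of the original solution); $p = 0.3$, $n = 50$, and the sampler are assumptions (the standard library has no geometric sampler, so we count failures before the first success directly).

```python
import math
import random

random.seed(3)
p, n = 0.3, 50   # assumed true parameter and sample size

def geom0(p):
    """Number of failures before the first success (support 0, 1, 2, ...)."""
    k = 0
    while random.random() >= p:
        k += 1
    return k

xs = [geom0(p) for _ in range(n)]
s = sum(xs)
p_hat = n / (s + n)   # the closed-form MLE

def loglik(q):
    return n * math.log(q) + s * math.log(1 - q)

# Maximize the log-likelihood on a fine grid over (0, 1).
grid = [k / 10000 for k in range(1, 10000)]
p_grid = max(grid, key=loglik)
print(p_hat, p_grid)   # should agree to grid resolution
```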
4. Since the variance of $\bar{X}$ attains the CRLB (i.e., $\text{Var}(\bar{X}) = (1/n)\,\text{Var}(X_1) = (1-p)/(np^2)$), $\hat{\tau}$ is UMVUE for $\tau$.

5. First notice that $\hat{\tau}$ is unbiased for $\tau$; that is, $E(\bar{X}) = (1-p)/p$. Then since
\[ \lim_{n\to\infty} E[(\hat{\tau}-\tau)^2] = \lim_{n\to\infty} \text{Var}(\bar{X}) = \lim_{n\to\infty} \frac{1-p}{np^2} = 0, \]
$\hat{\tau}$ is MSEC (MSE consistent).

6. From the asymptotic results for MLEs we have
\[ \sqrt{n}(\hat{p}-p) \xrightarrow{d} N(0,\ 1/I_1(p)) \;\Rightarrow\; \sqrt{n}(\hat{p}-p) \xrightarrow{d} N(0,\ p^2(1-p)). \]
Then from the Delta Method results we have
\[ \sqrt{n}(\hat{\tau}-\tau) \xrightarrow{d} N\big(0,\ (d\tau/dp)^2/I_1(p)\big) \;\Rightarrow\; \sqrt{n}(\hat{\tau}-\tau) \xrightarrow{d} N\big(0,\ (1/p^4)\,p^2(1-p)\big) \;\Rightarrow\; \sqrt{n}(\hat{\tau}-\tau) \xrightarrow{d} N\big(0,\ (1-p)/p^2\big). \]
Thus, we can say (somewhat informally) that $\hat{\tau} \,\overset{\cdot}{\sim}\, N\big(\tau,\ (1-p)/(np^2)\big)$.

7. First the risk function associated with $\hat{\tau} = \bar{X}$. (Notice that in what follows the risk functions are functions of $p$; if so desired, they could be made functions of $\tau$.)
\[ R_{\hat{\tau}}(p) = E[L(\hat{\tau},\tau)] = E\Big[\frac{(\hat{\tau}-\tau)^2}{\tau(\tau+1)}\Big] = \frac{1}{\tau(\tau+1)}\,E[(\hat{\tau}-\tau)^2] = \frac{1}{\tau(\tau+1)}\,\text{Var}(\bar{X}) \quad \text{since } \hat{\tau} \text{ is unbiased for } \tau = (1-p)/p \]
\[ = \frac{p^2}{1-p}\cdot\frac{1-p}{np^2} = \frac{1}{n}, \]
using $\tau(\tau+1) = (1-p)/p^2$.
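That the weighted risk of $\hat{\tau}$ is the constant $1/n$ can be checked by simulation. A numerical sketch (not part of the original solution); $n = 10$, the two $p$ values, and the replication count are assumptions.

```python
import random

random.seed(4)
n = 10   # assumed sample size

def geom0(p):
    """Number of failures before the first success (support 0, 1, 2, ...)."""
    k = 0
    while random.random() >= p:
        k += 1
    return k

def risk_tau_hat(p, reps=50000):
    """Monte Carlo estimate of E[(Xbar - tau)^2 / (tau*(tau+1))]."""
    tau = (1 - p) / p
    w = tau * (tau + 1)   # equals Var(X_1) = (1-p)/p^2
    total = 0.0
    for _ in range(reps):
        xbar = sum(geom0(p) for _ in range(n)) / n
        total += (xbar - tau) ** 2 / w
    return total / reps

r1, r2 = risk_tau_hat(0.3), risk_tau_hat(0.7)
print(r1, r2, 1 / n)   # both estimates should be close to 1/n
```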
Now for the risk function associated with $\tilde{\tau} = n\bar{X}/(n+1)$:
\[ R_{\tilde{\tau}}(p) = E[L(\tilde{\tau},\tau)] = E\Big[\frac{(\tilde{\tau}-\tau)^2}{\tau(\tau+1)}\Big] = \frac{1}{\tau(\tau+1)}\Big[\text{Var}\big(n\bar{X}/(n+1)\big) + \text{bias}\big(n\bar{X}/(n+1)\big)^2\Big] \]
\[ = \frac{p^2}{1-p}\Big[\frac{n(1-p)}{(n+1)^2p^2} + \frac{(1-p)^2}{(n+1)^2p^2}\Big] = \frac{n}{(n+1)^2} + \frac{1-p}{(n+1)^2} = \frac{n+1-p}{(n+1)^2}. \]

[Figure: the two risk functions plotted against $p \in (0,1)$ for $n = 10$; $R_{\hat{\tau}}(p) = 1/n$ is the constant line near $0.100$, while $R_{\tilde{\tau}}(p)$ decreases from about $0.091$ to about $0.083$ and lies below it everywhere.]
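The dominance seen in the plot can be confirmed directly from the two closed forms. A small sketch (not part of the original solution), taking $R_{\hat\tau}(p) = 1/n$ and $R_{\tilde\tau}(p) = (n+1-p)/(n+1)^2$ as computed in this part, with $n = 10$ assumed to match the plotted scale.

```python
n = 10   # assumed sample size, matching the figure

def R_hat(p):
    """Risk of tau_hat = Xbar under the weighted loss: constant in p."""
    return 1 / n

def R_tilde(p):
    """Risk of tau_tilde = n*Xbar/(n+1) under the weighted loss."""
    return (n + 1 - p) / (n + 1) ** 2

ps = [k / 100 for k in range(1, 100)]
print(max(R_tilde(p) for p in ps))               # bounded above by 1/(n+1)
print(all(R_tilde(p) < R_hat(p) for p in ps))    # shrinkage dominates Xbar
```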
Solutions: Homework 6

1. (C&B 7.58, pg. 366) Let $X$ be one observation from the pdf
\[ f(x \mid \theta) = \Big(\frac{\theta}{2}\Big)^{|x|}(1-\theta)^{1-|x|}, \quad x = -1, 0, 1, \quad \theta \in [0,1]. \]

(a) Find the MLE of $\theta$. To find the MLE of $\theta$ for one observation of $X$, it is enough to determine the value of $\theta \in [0,1]$ that maximizes $f(x \mid \theta)$ for each of the 3 possible $X$ values. Notice that $f(x=1 \mid \theta) = f(x=-1 \mid \theta) = \theta/2$, and $\theta = 1$ is the value of $\theta$ that maximizes $f(x \mid \theta)$. Similarly $f(x=0 \mid \theta) = 1-\theta$, and $\theta = 0$ is the value of $\theta$ that maximizes $f(x \mid \theta)$. Therefore the MLE is
\[ \hat{\theta} = \begin{cases} 1 & x = \pm 1 \\ 0 & x = 0 \end{cases} \;\Rightarrow\; \hat{\theta} = |x|. \]

(b) Define an estimator
\[ T(X) = \begin{cases} 2 & x = 1 \\ 0 & \text{otherwise.} \end{cases} \]
Show that $T(X)$ is an unbiased estimator of $\theta$. Notice that $E_\theta[T(X)] = 2\,P(X=1) = 2(\theta/2) = \theta$.

(c) Find an estimator that is better than $T(X)$ and prove that it is better. From context I am working under the assumption that we are looking for a better unbiased estimator. First notice that
\[ E(\hat{\theta}) = P(|X|=1) = P(X=-1 \text{ or } X=1) = P(X=-1) + P(X=1) = \theta/2 + \theta/2 = \theta. \]
Thus, $\hat{\theta}$ is unbiased. Note further that
\[ \text{Var}(T(X)) = E(T(X)^2) - [E(T(X))]^2 = 4\,P(X=1) - \theta^2 = 4(\theta/2) - \theta^2 = 2\theta - \theta^2 \]
and
\[ \text{Var}(\hat{\theta}) = P(|X|=1) - \theta^2 = \theta - \theta^2. \]
Finally, for any $\theta \in [0,1]$ we have that $\text{Var}(\hat{\theta}) \le \text{Var}(T(X))$ and therefore the MLE $\hat{\theta}$ is a better estimator than $T(X)$.

Alternatively, one can show that $|x|$ is a complete and sufficient statistic for $\theta$ and then by Method 1 conclude that $\hat{\theta}$ is UMVUE. First, to show sufficiency, use the factorization theorem. Let $g(\theta, s) = (\theta/2)^s(1-\theta)^{1-s}$ for $s = |x|$ and $h(x) = I[x \in \{-1, 0, 1\}]$. Then $f(x \mid \theta) = g(|x|, \theta)\,h(x)$ and by the factorization theorem $|x|$ is sufficient for $\theta$. To see that $|x|$ is also complete, suppose that $m(|x|)$ is a function such that $E_\theta[m(|X|)] = 0$ for all $\theta \in [0,1]$. Then
\[ 0 = E_\theta[m(|X|)] = m(|-1|)(\theta/2) + m(|0|)(1-\theta) + m(|1|)(\theta/2) = m(1)\theta + m(0)(1-\theta) \]
for all $\theta \in [0,1]$. In particular, for $\theta = 0$ we have $0 = m(0)$ and for $\theta = 1$ we have $0 = m(1)$. Therefore if $0 = E_\theta[m(|X|)]$ for all $\theta$, then $m(|x|) = 0$ for $x \in \{-1, 0, 1\}$, so $P_\theta(m(|X|) = 0) = 1$ for all $\theta \in [0,1]$. By definition $|x|$ is complete. Since $\hat{\theta}$ is an unbiased function of a complete and sufficient statistic, it is UMVUE and therefore better than $T(X)$.
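Because the support has only three points, every claim in this problem can be checked exactly by enumeration. A small sketch (not part of the original solution); the $\theta$ grid is an arbitrary assumption.

```python
# Enumerate the three-point pmf f(x|theta) = (theta/2)^|x| (1-theta)^(1-|x|).
def pmf(x, theta):
    return (theta / 2) ** abs(x) * (1 - theta) ** (1 - abs(x))

def E(g, theta):
    """Exact expectation of g(X) over the support {-1, 0, 1}."""
    return sum(g(x) * pmf(x, theta) for x in (-1, 0, 1))

def Var(g, theta):
    return E(lambda x: g(x) ** 2, theta) - E(g, theta) ** 2

T = lambda x: 2 if x == 1 else 0   # the unbiased estimator T(X)
mle = lambda x: abs(x)             # the MLE theta_hat = |x|

ok = True
for k in range(1, 100):
    th = k / 100
    ok &= abs(E(T, th) - th) < 1e-12         # T(X) unbiased
    ok &= abs(E(mle, th) - th) < 1e-12       # theta_hat unbiased
    ok &= Var(mle, th) <= Var(T, th) + 1e-12  # theta_hat has smaller variance
print(ok)   # → True
```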
2. Let $X_1,\dots,X_n$ be iid $N(\theta, 1)$, $\theta \in \mathbb{R}$.

(a) By Method 1, and since we can assume that $\sum X_i$ is complete, if we can find an unbiased estimator of $\tau(\theta) = \theta^2$ that is a function of $\sum X_i$, then we are done. As a first guess, consider $\bar{X}^2$. Note that $\bar{X} \sim N(\theta, 1/n)$ and thus
\[ E(\bar{X}^2) = \text{Var}(\bar{X}) + E(\bar{X})^2 = 1/n + \theta^2. \]
This is obviously not an unbiased estimator of $\tau(\theta)$, but $T(X) = \bar{X}^2 - 1/n$ is. Since $T(X)$ is unbiased and a function of a complete and sufficient statistic (we have shown $\sum X_i$ to be sufficient for $\theta$ under $N(\theta, 1)$), $T(X)$ is UMVUE.

(b) To find $\text{Var}(T(X))$, notice that $Z = \sqrt{n}(\bar{X}-\theta) \sim N(0,1)$ and $Z^2 \sim \chi^2(1)$ with $E(Z^2) = 1$ and $\text{Var}(Z^2) = 2$. Additionally, we will use the fact that skewness (defined as the third central moment) is zero for symmetric distributions. That is, for $X \sim N(\mu,\sigma^2)$ we have $E(X-\mu)^3 = 0$; thus for $Z \sim N(0,1)$, $E(Z^3) = 0$. Now we can rewrite
\[ T(X) = \bar{X}^2 - 1/n = [\sqrt{n}(\bar{X}-\theta) + \sqrt{n}\theta]^2(1/n) - 1/n = [Z + \sqrt{n}\theta]^2(1/n) - 1/n. \]
Hence the variance of $T(X)$ is
\[ \text{Var}(T(X)) = \text{Var}\big([Z+\sqrt{n}\theta]^2(1/n)\big) = (1/n^2)\,\text{Var}\big(Z^2 + 2\sqrt{n}\theta Z + n\theta^2\big) \]
\[ = (1/n^2)\big[\text{Var}(Z^2) + 4n\theta^2\,\text{Var}(Z) + 4\sqrt{n}\theta\,\text{Cov}(Z^2, Z)\big] \]
\[ = (1/n^2)\big[2 + 4n\theta^2 + 4\sqrt{n}\theta\,(E(Z^3) - E(Z^2)E(Z))\big] \]
\[ = (1/n^2)\big[2 + 4n\theta^2\big] \quad \text{since } E(Z) = 0 \text{ and } E(Z^3) = 0 \text{ (see above)} \]
\[ = 2/n^2 + 4\theta^2/n. \]

(c) Since the $X_i$ are iid, the CRLB for $\tau(\theta) = \theta^2$ is $[\tau'(\theta)]^2/I_n(\theta)$ with $[\tau'(\theta)]^2 = 4\theta^2$ and
\[ I_n(\theta) = -E\Big[\frac{\partial^2}{\partial\theta^2}\log f(X \mid \theta)\Big] = -E\Big[\frac{\partial^2}{\partial\theta^2}\Big(-\frac{n}{2}\log(2\pi) - \frac{1}{2}\sum_{i=1}^n (X_i-\theta)^2\Big)\Big] = -E\Big[\frac{\partial}{\partial\theta}\sum_{i=1}^n (X_i-\theta)\Big] = -E[-n] = n. \]
Thus the CRLB is $4\theta^2/n$.

(d) Notice that
\[ \text{Var}(T(X)) = 2/n^2 + 4\theta^2/n = 2/n^2 + \text{CRLB} > \text{CRLB}. \]
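The variance formula $2/n^2 + 4\theta^2/n$ and its gap above the CRLB can be checked by simulation. A numerical sketch (not part of the original solution); $\theta = 1.5$, $n = 25$, and the replication count are assumptions.

```python
import random

random.seed(5)
theta, n = 1.5, 25   # assumed true mean and sample size

vals = []
for _ in range(60000):
    xbar = sum(random.gauss(theta, 1.0) for _ in range(n)) / n
    vals.append(xbar ** 2 - 1 / n)   # T(X) = Xbar^2 - 1/n

mean_T = sum(vals) / len(vals)
var_T = sum((v - mean_T) ** 2 for v in vals) / len(vals)
exact = 2 / n ** 2 + 4 * theta ** 2 / n   # derived Var(T)
crlb = 4 * theta ** 2 / n
print(mean_T, theta ** 2)   # unbiasedness: close to theta^2
print(var_T, exact, crlb)   # variance matches the formula, which exceeds the CRLB
```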
3. (7.60 of C&B, pg. 366) Find the best unbiased estimator of $1/\beta$ when $X_1,\dots,X_n$ are iid Gamma$(\alpha,\beta)$ and $\alpha$ is known. We will show that $T = \sum X_i$ is a complete and sufficient statistic and use Method 2. Notice that
\[ f(x \mid \alpha, \beta) = \frac{1}{\Gamma(\alpha)\beta^\alpha}\,x^{\alpha-1}\exp\{-x/\beta\}\,I[x > 0] = c(\beta)\,h(x)\exp\{q(\beta)\,t(x)\}\,I[x > 0], \]
where $c(\beta) = \beta^{-\alpha}$, $h(x) = x^{\alpha-1}/\Gamma(\alpha)$, $q(\beta) = -1/\beta$, and $t(x) = x$. Notice further that the support of the distribution $A = \{x : x > 0\}$ does not depend on $\beta$, and $\{q(\beta) : \beta > 0\} = \{-1/\beta : \beta > 0\} = (-\infty, 0)$ is open in $\mathbb{R}$. Thus the Gamma$(\alpha,\beta)$ distribution with $\alpha$ known is a member of the exponential family and the necessary requirements are fulfilled to conclude that $T = \sum X_i$ is complete and sufficient for $\beta$.

A logical first choice for an unbiased estimator of $1/\beta$ is $1/T$. Note that $T \sim \text{Gamma}(n\alpha, \beta)$, and also that
\[ E(1/T) = \int_0^\infty \frac{1}{T}\cdot\frac{1}{\Gamma(n\alpha)\beta^{n\alpha}}\,T^{n\alpha-1}\exp\{-T/\beta\}\,dT = \frac{1}{\Gamma(n\alpha)\beta^{n\alpha}}\int_0^\infty T^{n\alpha-2}\exp\{-T/\beta\}\,dT \]
\[ = \frac{\Gamma(n\alpha-1)\beta^{n\alpha-1}}{\Gamma(n\alpha)\beta^{n\alpha}}\int_0^\infty \frac{1}{\Gamma(n\alpha-1)\beta^{n\alpha-1}}\,T^{n\alpha-2}\exp\{-T/\beta\}\,dT = \frac{1}{(n\alpha-1)\beta}, \]
since the last integral of a density is $1$ and $\Gamma(a) = (a-1)\,\Gamma(a-1)$. So define $T^\star = (n\alpha-1)/T$. Going through the same procedure, it is easy to show that $T^\star$ is an unbiased estimator of $1/\beta$, and since it is a function of $T$ (which is complete and sufficient), it is UMVUE.
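The unbiasedness of $T^\star = (n\alpha - 1)/T$ can be checked by simulation. A numerical sketch (not part of the original solution); $\alpha = 3$, $\beta = 2$, $n = 8$, and the replication count are assumptions. Note that `random.gammavariate(alpha, beta)` uses the same shape/scale parameterization as the density above, so a sum of $n$ draws has the Gamma$(n\alpha, \beta)$ distribution used in the derivation.

```python
import random

random.seed(6)
alpha, beta, n = 3.0, 2.0, 8   # assumed shape, scale, and sample size

est = []
for _ in range(100000):
    T = sum(random.gammavariate(alpha, beta) for _ in range(n))
    est.append((n * alpha - 1) / T)   # the estimator T* = (n*alpha - 1)/T

mean_est = sum(est) / len(est)
print(mean_est, 1 / beta)   # should be close: T* is unbiased for 1/beta
```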