CHAPTER

Exercise 1
Suppose that $Y=(Y_1,\dots,Y_n)$ is a random sample from an $\mathrm{Exp}(\lambda)$ distribution. Then we may write
\[
f_Y(y)=\prod_{i=1}^n \lambda e^{-\lambda y_i}
=\underbrace{\lambda^n e^{-\lambda\sum_i y_i}}_{g_\lambda(T(y))}\cdot\underbrace{1}_{h(y)}.
\]
It follows, by the factorization theorem, that $T(Y)=\sum_{i=1}^n Y_i$ is a sufficient statistic for $\lambda$.

Exercise 2
Suppose that $Y=(Y_1,\dots,Y_n)$ is a random sample from an $\mathrm{Exp}(\lambda)$ distribution. Then the ratio of the joint pdfs at two different realizations of $Y$, $x$ and $y$, is
\[
\frac{f(x;\lambda)}{f(y;\lambda)}
=\frac{\lambda^n e^{-\lambda\sum_i x_i}}{\lambda^n e^{-\lambda\sum_i y_i}}
=e^{-\lambda\left(\sum_i x_i-\sum_i y_i\right)}.
\]
The ratio is constant in $\lambda$ iff $\sum_i y_i=\sum_i x_i$. Hence, by Lemma 1, $T(Y)=\sum_{i=1}^n Y_i$ is a minimal sufficient statistic for $\lambda$.

Exercise 3
The $Y_i$ are identically distributed, hence they have the same expectation, say $E(Y)$, for all $i=1,\dots,n$. Here, with pdf $f(y)=\frac{2}{\theta^2}(\theta-y)$ for $y\in[0,\theta]$, we have
\[
E(Y)=\frac{2}{\theta^2}\int_0^\theta y(\theta-y)\,dy
=\frac{2}{\theta^2}\int_0^\theta(\theta y-y^2)\,dy
=\frac{2}{\theta^2}\left[\frac{\theta}{2}y^2-\frac13 y^3\right]_0^\theta
=\frac{\theta}{3}.
\]
Bias: for $T(Y)=3\bar Y$,
\[
E(T(Y))=E(3\bar Y)=\frac{3}{n}\sum_{i=1}^n E(Y_i)=\frac{3}{n}\cdot n\cdot\frac{\theta}{3}=\theta.
\]
That is, $\mathrm{bias}(T(Y))=E(T(Y))-\theta=0$.
Variance: the $Y_i$ are identically distributed, hence they have the same variance, say $\mathrm{var}(Y)=E(Y^2)-[E(Y)]^2$, for all $i=1,\dots,n$. We need to calculate $E(Y^2)$:
\[
E(Y^2)=\frac{2}{\theta^2}\int_0^\theta y^2(\theta-y)\,dy
=\frac{2}{\theta^2}\int_0^\theta(\theta y^2-y^3)\,dy
=\frac{2}{\theta^2}\left[\frac{\theta}{3}y^3-\frac14 y^4\right]_0^\theta
=\frac{\theta^2}{6}.
\]
Hence
\[
\mathrm{var}(Y)=E(Y^2)-[E(Y)]^2=\frac{\theta^2}{6}-\frac{\theta^2}{9}=\frac{\theta^2}{18}.
\]
This gives
\[
\mathrm{var}(T(Y))=9\,\mathrm{var}(\bar Y)=\frac{9}{n^2}\sum_{i=1}^n\mathrm{var}(Y_i)=\frac{9}{n}\cdot\frac{\theta^2}{18}=\frac{\theta^2}{2n}.
\]
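As a quick numerical sanity check (not part of the original solution), the moments just derived can be verified by Monte Carlo. The sketch below samples from $f(y)=\frac{2}{\theta^2}(\theta-y)$ by inverting its cdf $F(y)=(2\theta y-y^2)/\theta^2$; the chosen $\theta$, $n$ and replication count are arbitrary illustrative assumptions.

```python
import random

def sample_y(theta, rng):
    """Draw from f(y) = (2/theta^2)(theta - y) on [0, theta] by inverse cdf:
    F(y) = (2*theta*y - y**2)/theta**2, so F^{-1}(u) = theta*(1 - sqrt(1 - u))."""
    return theta * (1.0 - (1.0 - rng.random()) ** 0.5)

def T(theta, n, rng):
    """The estimator T(Y) = 3 * Ybar for one sample of size n."""
    return 3.0 * sum(sample_y(theta, rng) for _ in range(n)) / n

rng = random.Random(42)
theta, n, reps = 3.0, 20, 20000
ts = [T(theta, n, rng) for _ in range(reps)]
mean_T = sum(ts) / reps
var_T = sum((t - mean_T) ** 2 for t in ts) / reps
# theory: E(T) = theta and var(T) = theta^2/(2n) = 0.225 here
print(mean_T, var_T)
```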
Consistency: $T(Y)$ is unbiased, so it is enough to check whether its variance tends to zero as $n$ tends to infinity. Indeed, $\mathrm{var}(T(Y))=\theta^2/(2n)\to 0$ as $n\to\infty$; that is, $T(Y)=3\bar Y$ is a consistent estimator of $\theta$.

Exercise 4
We have $X_i\sim\mathrm{Ber}(p)$ independently for $i=1,\dots,n$. Also, $\bar X=\frac1n\sum_{i=1}^n X_i$.
(a) For an estimator $\hat\vartheta$ to be consistent for $\vartheta$ we require that $\mathrm{MSE}(\hat\vartheta)\to 0$ as $n\to\infty$. We have
\[
\mathrm{MSE}(\hat\vartheta)=\mathrm{var}(\hat\vartheta)+[\mathrm{bias}(\hat\vartheta)]^2.
\]
We will now calculate the variance and bias of $\hat p=\bar X$:
\[
E(\bar X)=\frac1n\sum_{i=1}^n E(X_i)=\frac1n\,np=p,
\]
hence $\bar X$ is an unbiased estimator of $p$. Further, with $q=1-p$,
\[
\mathrm{var}(\bar X)=\frac1{n^2}\sum_{i=1}^n\mathrm{var}(X_i)=\frac1{n^2}\,npq=\frac{pq}{n}.
\]
Hence $\mathrm{MSE}(\bar X)=pq/n\to 0$ as $n\to\infty$; that is, $\bar X$ is a consistent estimator of $p$.
(b) The estimator $\hat\vartheta$ is asymptotically unbiased for $\vartheta$ if $E(\hat\vartheta)\to\vartheta$ as $n\to\infty$, that is, if the bias tends to zero as $n\to\infty$. Here we have $\widehat{pq}=\bar X(1-\bar X)$. Note that we can write
\[
E(\widehat{pq})=E[\bar X(1-\bar X)]=E[\bar X-\bar X^2]=E[\bar X]-E[\bar X^2],
\]
and
\[
E[\bar X^2]=\mathrm{var}(\bar X)+[E(\bar X)]^2=\frac{pq}{n}+p^2.
\]
That is,
\[
E(\widehat{pq})=p-\left(\frac{pq}{n}+p^2\right)=pq-\frac{pq}{n}=\frac{pq(n-1)}{n}\to pq\quad\text{as } n\to\infty.
\]
Hence, the estimator is asymptotically unbiased for $pq$.

Exercise 5
Here we have a single parameter $p$ and $g(p)=p$. By definition, the $\mathrm{CRLB}(p)$ is
\[
\mathrm{CRLB}(p)=\frac{\left\{\dfrac{dg(p)}{dp}\right\}^2}{E\left\{-\dfrac{d^2\log P(Y=y;p)}{dp^2}\right\}},\qquad(1)
\]
where the joint pmf of $Y=(Y_1,\dots,Y_n)^T$, with $Y_i\sim\mathrm{Ber}(p)$ independently, is
\[
P(Y=y;p)=\prod_{i=1}^n p^{y_i}(1-p)^{1-y_i}=p^{\sum_i y_i}(1-p)^{n-\sum_i y_i}.
\]
For the numerator of (1) we get $dg(p)/dp=1$. Further, for brevity denote $P=P(Y=y;p)$. For the denominator of (1) we calculate
\[
\log P=\sum_i y_i\log p+\Big(n-\sum_i y_i\Big)\log(1-p).
\]
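The closed form of $\log P$ above can be checked against a direct sum of individual Bernoulli log-pmfs; a small sketch with arbitrary illustrative data:

```python
import math
import random

def loglik_direct(y, p):
    """Sum of the individual Bernoulli log-pmfs."""
    return sum(math.log(p if yi == 1 else 1.0 - p) for yi in y)

def loglik_formula(y, p):
    """log P = (sum y_i) log p + (n - sum y_i) log(1 - p)."""
    s, n = sum(y), len(y)
    return s * math.log(p) + (n - s) * math.log(1.0 - p)

rng = random.Random(0)
y = [int(rng.random() < 0.4) for _ in range(25)]
diff = abs(loglik_direct(y, 0.4) - loglik_formula(y, 0.4))
print(diff)  # agrees up to floating-point rounding
```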
Differentiating,
\[
\frac{d\log P}{dp}=\frac{\sum_i y_i}{p}-\frac{n-\sum_i y_i}{1-p},
\qquad
\frac{d^2\log P}{dp^2}=-\frac{\sum_i y_i}{p^2}-\frac{n-\sum_i y_i}{(1-p)^2}.
\]
Hence, since $E(Y_i)=p$ for all $i$, we get
\[
E\left\{-\frac{d^2\log P}{dp^2}\right\}
=\frac{E\left(\sum_i Y_i\right)}{p^2}+\frac{n-E\left(\sum_i Y_i\right)}{(1-p)^2}
=\frac np+\frac{n}{1-p}=\frac{n}{p(1-p)}.
\]
Hence $\mathrm{CRLB}(p)=p(1-p)/n$. Since $\mathrm{var}(\bar Y)=p(1-p)/n$, it means that $\mathrm{var}(\bar Y)$ achieves the bound, and so $\bar Y$ has the minimum variance among all unbiased estimators of $p$.

Exercise 6
From lectures, we know that the joint pdf of $n$ independent normal rvs is
\[
f(y;\mu,\sigma^2)=(2\pi\sigma^2)^{-n/2}\exp\left\{-\frac{1}{2\sigma^2}\sum_{i=1}^n(y_i-\mu)^2\right\}.
\]
Denote $f=f(y;\mu,\sigma^2)$. Taking the log of the pdf we obtain
\[
\log f=-\frac n2\log(2\pi\sigma^2)-\frac1{2\sigma^2}\sum_{i=1}^n(y_i-\mu)^2.
\]
Thus, we have
\[
\frac{\partial\log f}{\partial\mu}=\frac1{\sigma^2}\sum_{i=1}^n(y_i-\mu)
\qquad\text{and}\qquad
\frac{\partial\log f}{\partial\sigma^2}=-\frac{n}{2\sigma^2}+\frac1{2\sigma^4}\sum_{i=1}^n(y_i-\mu)^2.
\]
It follows that
\[
\frac{\partial^2\log f}{\partial\mu^2}=-\frac{n}{\sigma^2},
\qquad
\frac{\partial^2\log f}{\partial\mu\,\partial\sigma^2}=-\frac1{\sigma^4}\sum_{i=1}^n(y_i-\mu),
\qquad
\frac{\partial^2\log f}{\partial(\sigma^2)^2}=\frac{n}{2\sigma^4}-\frac1{\sigma^6}\sum_{i=1}^n(y_i-\mu)^2.
\]
Hence, taking the expectation of each of the second derivatives (and changing the sign), we obtain the Fisher information matrix
\[
M=\begin{pmatrix}\dfrac{n}{\sigma^2}&0\\[4pt]0&\dfrac{n}{2\sigma^4}\end{pmatrix}.
\]
Now, let $g(\mu,\sigma^2)=\mu+\sigma^2$. Then we have $\partial g/\partial\mu=1$ and $\partial g/\partial\sigma^2=1$. So
\[
\mathrm{CRLB}(g(\mu,\sigma^2))=(1,1)\,M^{-1}\begin{pmatrix}1\\1\end{pmatrix}
=(1,1)\begin{pmatrix}\dfrac{\sigma^2}{n}&0\\[4pt]0&\dfrac{2\sigma^4}{n}\end{pmatrix}\begin{pmatrix}1\\1\end{pmatrix}
=\frac{\sigma^2}{n}+\frac{2\sigma^4}{n}=\frac{\sigma^2}{n}(1+2\sigma^2).
\]
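Since $M$ is diagonal, the quadratic form $(1,1)M^{-1}(1,1)^T$ is easy to evaluate numerically; a small sketch confirming the closed form (the parameter values are arbitrary illustrations):

```python
def crlb_mu_plus_sigma2(n, sigma2):
    """CRLB for g(mu, sigma^2) = mu + sigma^2 under N(mu, sigma^2) sampling.
    M = diag(n/sigma^2, n/(2 sigma^4)) is diagonal, so the bound is
    (1,1) M^{-1} (1,1)^T = sigma^2/n + 2*sigma^4/n."""
    inv_diag = (sigma2 / n, 2.0 * sigma2 ** 2 / n)  # diagonal of M^{-1}
    grad = (1.0, 1.0)                               # (dg/dmu, dg/dsigma^2)
    return sum(g * g * d for g, d in zip(grad, inv_diag))

for n, s2 in ((10, 2.0), (25, 1.5)):
    # compare with the closed form (sigma^2/n) * (1 + 2*sigma^2)
    print(crlb_mu_plus_sigma2(n, s2), (s2 / n) * (1.0 + 2.0 * s2))
```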
Exercise 7
Suppose that $Y_1,\dots,Y_n$ are independent $\mathrm{Poisson}(\lambda)$ random variables. Then we know that $T=\sum_{i=1}^n Y_i$ is a sufficient statistic for $\lambda$. Now, we need to find the distribution of $T$. We showed in Exercise 11 that the mgf of a $\mathrm{Poisson}(\lambda)$ rv is $M_Y(z)=e^{\lambda(e^z-1)}$. Hence, we may write (we use $z$ so as not to be confused with the values of $T$, denoted by $t$)
\[
M_T(z)=\prod_{i=1}^n M_{Y_i}(z)=\prod_{i=1}^n e^{\lambda(e^z-1)}=e^{n\lambda(e^z-1)}.
\]
Hence $T\sim\mathrm{Poisson}(n\lambda)$, and so its probability mass function is
\[
P(T=t)=\frac{(n\lambda)^t e^{-n\lambda}}{t!},\qquad t=0,1,2,\dots
\]
Next, suppose that $E\{h(T)\}=0$ for all $\lambda>0$. Then we have
\[
E\{h(T)\}=\sum_{t=0}^{\infty}h(t)\frac{(n\lambda)^t e^{-n\lambda}}{t!}=0
\quad\Longrightarrow\quad
\sum_{t=0}^{\infty}\frac{h(t)}{t!}(n\lambda)^t=0
\]
for all $\lambda>0$. A power series that vanishes identically has all its coefficients equal to zero; thus every coefficient $h(t)/t!$ is zero, so that $h(t)=0$ for all $t=0,1,2,\dots$. Since $T$ takes on the values $t=0,1,2,\dots$ with probability 1, it means that $P\{h(T)=0\}=1$ for all $\lambda$. Hence $T=\sum_{i=1}^n Y_i$ is a complete sufficient statistic.

Exercise 8
$S=\sum_{i=1}^n Y_i$ is a complete sufficient statistic for $\lambda$. We have seen that $T=\bar Y=S/n$ is a MVUE for $\lambda$. Now, we will find a unique MVUE of $\varphi=\lambda^2$. We have
\[
E(T^2)=E\left(\frac{S^2}{n^2}\right)=\frac1{n^2}E(S^2)=\frac1{n^2}\left[\mathrm{var}(S)+[E(S)]^2\right]
=\frac1{n^2}\left[n\lambda+n^2\lambda^2\right]=\frac{\lambda}{n}+\lambda^2=\frac{E(T)}{n}+\lambda^2.
\]
It means that
\[
E\left[T^2-\frac Tn\right]=\lambda^2,
\]
i.e., $T^2-\frac Tn=\bar Y^2-\frac{\bar Y}{n}$ is an unbiased estimator of $\lambda^2$. It is a function of a complete sufficient statistic; hence it is the unique MVUE of $\lambda^2$.
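A Monte Carlo sketch of the Exercise 8 result (not part of the original solution): using Knuth's Poisson sampler, the average of $\bar Y^2-\bar Y/n$ over many samples should settle near $\lambda^2$. The values of $\lambda$, $n$ and the replication count are illustrative assumptions.

```python
import math
import random

def poisson(lam, rng):
    """Knuth's Poisson sampler (fine for small lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(7)
lam, n, reps = 2.0, 10, 30000
total = 0.0
for _ in range(reps):
    ybar = sum(poisson(lam, rng) for _ in range(n)) / n
    total += ybar ** 2 - ybar / n   # the MVUE of lambda^2
est = total / reps
print(est)  # should settle near lambda^2 = 4.0
```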
Exercise 9
We may write
\[
P(Y=y;\lambda)=\frac{\lambda^y e^{-\lambda}}{y!}
=\frac{1}{y!}\exp\{y\log\lambda-\lambda\}
=\frac{1}{y!}\exp\{(\log\lambda)y-\lambda\}.
\]
Thus, we have $a(\lambda)=\log\lambda$, $b(y)=y$, $c(\lambda)=-\lambda$ and $h(y)=\frac{1}{y!}$. That is, $P(Y=y;\lambda)$ has a representation of the form required by Definition 11.

Exercise 10
(a) Here, for $y>0$, we have
\[
f(y;\lambda,\alpha)=\frac{\lambda^\alpha}{\Gamma(\alpha)}y^{\alpha-1}e^{-\lambda y}
=\exp\left\{-\lambda y+(\alpha-1)\log y+\log\left[\frac{\lambda^\alpha}{\Gamma(\alpha)}\right]\right\}.
\]
This has the required form of Definition 11, where $p=2$ and
\[
a_1(\lambda,\alpha)=-\lambda,\quad
a_2(\lambda,\alpha)=\alpha-1,\quad
b_1(y)=y,\quad
b_2(y)=\log y,\quad
c(\lambda,\alpha)=\log\left[\frac{\lambda^\alpha}{\Gamma(\alpha)}\right],\quad
h(y)=1.
\]
(b) By Theorem 8 (lecture notes) we have that $S_1(Y)=\sum_{i=1}^n Y_i$ and $S_2(Y)=\sum_{i=1}^n\log Y_i$ are the joint complete sufficient statistics for $\lambda$ and $\alpha$.

Exercise 11
To obtain the Method of Moments estimators we compare the population and the sample moments. For a one-parameter distribution we obtain $\hat\theta$ as the solution of
\[
E(Y)=\bar Y.\qquad(2)
\]
Here, with pdf $f(y)=\frac{2}{\theta^2}(\theta-y)$ for $y\in[0,\theta]$, we have
\[
E(Y)=\frac{2}{\theta^2}\int_0^\theta y(\theta-y)\,dy
=\frac{2}{\theta^2}\int_0^\theta(\theta y-y^2)\,dy
=\frac{2}{\theta^2}\left[\frac{\theta}{2}y^2-\frac13 y^3\right]_0^\theta
=\frac{\theta}{3}.
\]
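The moment $E(Y)=\theta/3$ can be confirmed by numerical integration; a midpoint-rule sketch (the value of $\theta$ and the grid size are illustrative choices):

```python
def mean_of_Y(theta, m=100000):
    """Midpoint-rule approximation of E(Y) = (2/theta^2) * int_0^theta y*(theta - y) dy."""
    h = theta / m
    total = sum((i + 0.5) * h * (theta - (i + 0.5) * h) for i in range(m))
    return (2.0 / theta ** 2) * total * h

print(mean_of_Y(3.0))  # close to theta/3 = 1.0
```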
Then by (2) we get the method of moments estimator of $\theta$: $\hat\theta=3\bar Y$.

Exercise 12
(a) First, we will show that the distribution belongs to an exponential family. Here, for $y>0$ and known $\alpha$, we have
\[
f(y;\lambda)=\frac{\lambda^\alpha}{\Gamma(\alpha)}y^{\alpha-1}e^{-\lambda y}
=y^{\alpha-1}\exp\left\{-\lambda y+\log\left[\frac{\lambda^\alpha}{\Gamma(\alpha)}\right]\right\}.
\]
This has the required form of Definition 11, where $p=1$ and
\[
a(\lambda)=-\lambda,\qquad b(y)=y,\qquad
c(\lambda)=\log\left[\frac{\lambda^\alpha}{\Gamma(\alpha)}\right],\qquad
h(y)=y^{\alpha-1}.
\]
By Theorem 8 (lecture notes) we have that $S(Y)=\sum_{i=1}^n Y_i$ is the complete sufficient statistic for $\lambda$.
(b) The likelihood function is
\[
L(\lambda;y)=\prod_{i=1}^n\frac{\lambda^\alpha}{\Gamma(\alpha)}y_i^{\alpha-1}e^{-\lambda y_i}
=\left(\frac{\lambda^\alpha}{\Gamma(\alpha)}\right)^n\left(\prod_{i=1}^n y_i^{\alpha-1}\right)e^{-\lambda\sum_i y_i}.
\]
Then the log-likelihood is
\[
l(\lambda;y)=\log L(\lambda;y)=n\alpha\log\lambda-n\log\Gamma(\alpha)+(\alpha-1)\sum_{i=1}^n\log y_i-\lambda\sum_{i=1}^n y_i.
\]
Then, we obtain the following derivative of $l(\lambda;y)$ with respect to $\lambda$:
\[
\frac{dl}{d\lambda}=\frac{n\alpha}{\lambda}-\sum_{i=1}^n y_i.
\]
This, set to zero, gives
\[
\hat\lambda=\frac{n\alpha}{\sum_{i=1}^n y_i}=\frac{\alpha}{\bar y}.
\]
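A small check (with made-up data and an assumed known $\alpha$) that the score $dl/d\lambda=n\alpha/\lambda-\sum_i y_i$ indeed vanishes at $\hat\lambda=\alpha/\bar y$:

```python
def score(lam, ys, alpha):
    """The score dl/dlambda = n*alpha/lambda - sum(y_i)."""
    return len(ys) * alpha / lam - sum(ys)

ys = [0.8, 1.3, 2.1, 0.5, 1.9]         # made-up positive data
alpha = 2.0                             # shape parameter, assumed known
lam_hat = alpha / (sum(ys) / len(ys))   # closed-form MLE: alpha / ybar
print(lam_hat, score(lam_hat, ys, alpha))  # the score is 0 at the MLE
```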
Hence $\mathrm{MLE}(\lambda)=\alpha/\bar Y$. So, by the invariance property, we get
\[
\mathrm{MLE}[g(\lambda)]=\mathrm{MLE}\left[\frac1\lambda\right]=\frac{1}{\mathrm{MLE}(\lambda)}=\frac{\bar Y}{\alpha}=\frac{1}{n\alpha}\sum_{i=1}^n Y_i=\frac{S(Y)}{n\alpha}.
\]
That is, $\mathrm{MLE}[g(\lambda)]$ is a function of the complete sufficient statistic $S(Y)=\sum_{i=1}^n Y_i$.
(c) To show that it is an unbiased estimator of $g(\lambda)$ we calculate:
\[
E[g(\hat\lambda)]=E\left(\frac{1}{n\alpha}\sum_{i=1}^n Y_i\right)=\frac{1}{n\alpha}\sum_{i=1}^n E(Y_i)=\frac{1}{n\alpha}\cdot n\cdot\frac{\alpha}{\lambda}=\frac1\lambda.
\]
It is an unbiased estimator and a function of a complete sufficient statistic; hence, by the Corollary given in Lectures, it is the unique $\mathrm{MVUE}(g(\lambda))$.

Exercise 13
(a) The likelihood is
\[
L(\beta_0,\beta_1;y)=(2\pi\sigma^2)^{-n/2}\exp\left\{-\frac1{2\sigma^2}\sum_{i=1}^n(y_i-\beta_0-\beta_1x_i)^2\right\}.
\]
Now, maximizing this is equivalent to minimizing
\[
S(\beta_0,\beta_1)=\sum_{i=1}^n(y_i-\beta_0-\beta_1x_i)^2,
\]
which is the criterion we use to find the least squares estimators. Hence, the maximum likelihood estimators are the same as the least squares estimators.
(c) The estimates of $\beta_0$ and $\beta_1$ are
\[
\hat\beta_0=\bar Y-\hat\beta_1\bar x=94.13,\qquad
\hat\beta_1=\frac{\sum_i x_iY_i-n\bar x\bar Y}{\sum_i x_i^2-n\bar x^2}=-1.266.
\]
Hence the estimate of the mean response at a given $x$ is $\hat E(Y\mid x)=94.13-1.266\,x$. For the temperature of $x=40$ degrees we obtain the estimate of expected hardness equal to $\hat E(Y\mid x=40)\approx 43.48$.

Exercise 14
The LS estimator of $\beta_1$ is
\[
\hat\beta_1=\frac{\sum_i x_iY_i-n\bar x\bar Y}{\sum_i x_i^2-n\bar x^2}.
\]
We will see that it has a normal distribution and is unbiased, and we will find its variance. Now, normality is clear from the fact that we may write
\[
\hat\beta_1=\frac{\sum_i x_iY_i-\bar x\sum_i Y_i}{\sum_i x_i^2-n\bar x^2}
=\frac{1}{s_{xx}}\sum_{i=1}^n(x_i-\bar x)Y_i,
\]
where $s_{xx}=\sum_{i=1}^n(x_i-\bar x)^2$, so that $\hat\beta_1$ is a linear function of $Y_1,\dots,Y_n$, each of which is normally distributed. Next, we have
\[
E(\hat\beta_1)=\frac{1}{s_{xx}}\sum_{i=1}^n(x_i-\bar x)E(Y_i)
=\frac{1}{s_{xx}}\left\{\sum_i x_iE(Y_i)-\bar x\sum_i E(Y_i)\right\}
=\frac{1}{s_{xx}}\left\{\sum_i x_i(\beta_0+\beta_1x_i)-\bar x\sum_i(\beta_0+\beta_1x_i)\right\}
\]
\[
=\frac{1}{s_{xx}}\left\{n\beta_0\bar x+\beta_1\sum_i x_i^2-n\beta_0\bar x-n\beta_1\bar x^2\right\}
=\frac{1}{s_{xx}}\,\beta_1\left\{\sum_i x_i^2-n\bar x^2\right\}
=\frac{1}{s_{xx}}\,\beta_1\,s_{xx}=\beta_1.
\]
Finally, since the $Y_i$'s are independent, we have
\[
\mathrm{var}(\hat\beta_1)=\frac{1}{s_{xx}^2}\sum_{i=1}^n(x_i-\bar x)^2\,\mathrm{var}(Y_i)
=\frac{1}{s_{xx}^2}\sum_{i=1}^n(x_i-\bar x)^2\,\sigma^2
=\frac{1}{s_{xx}^2}\,s_{xx}\,\sigma^2=\frac{\sigma^2}{s_{xx}}.
\]
Hence $\hat\beta_1\sim N(\beta_1,\sigma^2/s_{xx})$, and a $100(1-\alpha)\%$ confidence interval for $\beta_1$ is
\[
\hat\beta_1\pm t_{n-2,\frac\alpha2}\,\frac{S}{\sqrt{s_{xx}}},
\]
where $S^2=\frac{1}{n-2}\sum_{i=1}^n(Y_i-\hat\beta_0-\hat\beta_1x_i)^2$ is the MVUE for $\sigma^2$.

Exercise 15
(a) The likelihood is
\[
L(\theta;y)=\prod_{i=1}^n\theta y_i^{\theta-1}=\theta^n\left(\prod_{i=1}^n y_i\right)^{\theta-1},
\]
and so the log-likelihood is
\[
l(\theta;y)=n\log\theta+(\theta-1)\log\left(\prod_{i=1}^n y_i\right).
\]
Thus, solving the equation
\[
\frac{dl}{d\theta}=\frac n\theta+\log\left(\prod_{i=1}^n y_i\right)=0,
\]
we obtain the maximum likelihood estimator of $\theta$ as $\hat\theta=-n\big/\log\left(\prod_{i=1}^n Y_i\right)=-n\big/\sum_{i=1}^n\log Y_i$.
(b) The second derivative of the log-likelihood is
\[
\frac{d^2l}{d\theta^2}=-\frac{n}{\theta^2}.
\]
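A simulation sketch of the MLE from part (a) (not part of the original solution): since $F(y)=y^\theta$ on $(0,1)$, samples can be drawn as $Y=U^{1/\theta}$ by the inverse-cdf method. The averages below should sit near $\theta$ with variance near $\theta^2/n$; the chosen $\theta$, $n$ and replication count are illustrative assumptions.

```python
import math
import random

def mle_theta(ys):
    """MLE for f(y; theta) = theta * y**(theta - 1) on (0, 1)."""
    return -len(ys) / sum(math.log(y) for y in ys)

rng = random.Random(11)
theta, n, reps = 2.5, 100, 4000
ests = []
for _ in range(reps):
    ys = [rng.random() ** (1.0 / theta) for _ in range(n)]  # inverse cdf of F(y) = y**theta
    ests.append(mle_theta(ys))
mean_est = sum(ests) / reps
var_est = sum((e - mean_est) ** 2 for e in ests) / reps
print(mean_est, var_est)  # near theta = 2.5 and theta^2/n = 0.0625
```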
Hence we have
\[
\mathrm{CRLB}(\theta)=\frac{1}{E\left\{-\dfrac{d^2l}{d\theta^2}\right\}}=\frac{\theta^2}{n}.
\]
Thus, for large $n$, $\hat\theta\sim\mathrm{AN}(\theta,\theta^2/n)$.
(c) Here we have to replace $\mathrm{CRLB}(\theta)$ with its estimator $\hat\theta^2/n$ to obtain the approximate pivot. This gives
\[
Q(Y,\theta)=\frac{\hat\theta-\theta}{\hat\theta/\sqrt n}\ \sim\ \mathrm{AN}(0,1),
\qquad
P\left(-z_{\frac\alpha2}<\frac{\hat\theta-\theta}{\hat\theta/\sqrt n}<z_{\frac\alpha2}\right)\approx 1-\alpha,
\]
where $z_{\frac\alpha2}$ is such that $P(|Z|<z_{\frac\alpha2})=1-\alpha$, $Z\sim N(0,1)$. It may be rearranged to yield
\[
P\left(\hat\theta-z_{\frac\alpha2}\frac{\hat\theta}{\sqrt n}<\theta<\hat\theta+z_{\frac\alpha2}\frac{\hat\theta}{\sqrt n}\right)\approx 1-\alpha.
\]
Hence, an approximate $100(1-\alpha)\%$ confidence interval for $\theta$ is $\hat\theta\pm z_{\frac\alpha2}\,\hat\theta/\sqrt n$. Finally, the approximate $90\%$ confidence interval for $\theta$ is
\[
\hat\theta\pm1.6449\,\frac{\hat\theta}{\sqrt n},\qquad\text{where }\hat\theta=-n\Big/\log\Big(\prod_{i=1}^n Y_i\Big).
\]

Exercise 16
(a) For a Poisson distribution we have $\mathrm{MLE}(\lambda)$ equal to $\hat\lambda=\bar Y$ and
\[
\hat\lambda\ \sim\ \mathrm{AN}\!\left(\lambda,\frac{\lambda}{n}\right).
\]
Hence,
\[
\hat\lambda_1-\hat\lambda_2\ \sim\ \mathrm{AN}\!\left(\lambda_1-\lambda_2,\ \frac{\lambda_1}{n_1}+\frac{\lambda_2}{n_2}\right).
\]
So, after standardization, we get
\[
\frac{\hat\lambda_1-\hat\lambda_2-(\lambda_1-\lambda_2)}{\sqrt{\frac{\lambda_1}{n_1}+\frac{\lambda_2}{n_2}}}\ \sim\ \mathrm{AN}(0,1).
\]
Hence, the approximate pivot for $\lambda_1-\lambda_2$ is
\[
Q(Y,\lambda_1-\lambda_2)=\frac{\hat\lambda_1-\hat\lambda_2-(\lambda_1-\lambda_2)}{\sqrt{\frac{\hat\lambda_1}{n_1}+\frac{\hat\lambda_2}{n_2}}}\ \sim\ \mathrm{AN}(0,1).
\]
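Inverting the pivot above gives an interval for $\lambda_1-\lambda_2$; a sketch, where the summary values ($\bar y_1=2$, $\bar y_2=1$, $n_1=n_2=7$) are assumptions matching the seedling example of part (b):

```python
import math

def poisson_diff_ci(ybar1, n1, ybar2, n2, z=2.5758):
    """Approximate CI for lambda1 - lambda2 from the pivot Q;
    z = 2.5758 gives a 99% interval."""
    half = z * math.sqrt(ybar1 / n1 + ybar2 / n2)
    d = ybar1 - ybar2
    return d - half, d + half

lo, hi = poisson_diff_ci(2.0, 7, 1.0, 7)
print(lo, hi)  # an interval centred at 1.0
```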
Then, for $z_{\frac\alpha2}$ such that $P(|Z|<z_{\frac\alpha2})=1-\alpha$, $Z\sim N(0,1)$, we may write
\[
P\left(-z_{\frac\alpha2}<\frac{\hat\lambda_1-\hat\lambda_2-(\lambda_1-\lambda_2)}{\sqrt{\frac{\hat\lambda_1}{n_1}+\frac{\hat\lambda_2}{n_2}}}<z_{\frac\alpha2}\right)\approx1-\alpha,
\]
which gives
\[
P\left(\hat\lambda_1-\hat\lambda_2-z_{\frac\alpha2}\sqrt{\frac{\hat\lambda_1}{n_1}+\frac{\hat\lambda_2}{n_2}}<\lambda_1-\lambda_2<\hat\lambda_1-\hat\lambda_2+z_{\frac\alpha2}\sqrt{\frac{\hat\lambda_1}{n_1}+\frac{\hat\lambda_2}{n_2}}\right)\approx1-\alpha.
\]
That is, a $100(1-\alpha)\%$ CI for $\lambda_1-\lambda_2$ is
\[
\bar Y_1-\bar Y_2\pm z_{\frac\alpha2}\sqrt{\frac{\bar Y_1}{n_1}+\frac{\bar Y_2}{n_2}}.
\]
(b) Denote by $Y_i$ the density of seedlings of tree A at square-metre area $i$ around the tree, and by $X_i$ the density of seedlings of tree B at square-metre area $i$ around the tree. Then we may assume that $Y_i\sim\mathrm{Poisson}(\lambda_1)$ and $X_i\sim\mathrm{Poisson}(\lambda_2)$, all independently. We are interested in the difference in the mean density, i.e., in $\lambda_1-\lambda_2$. From the data we get
\[
\hat\lambda_1-\hat\lambda_2=2-1=1,\qquad
\frac{\hat\lambda_1}{n_1}+\frac{\hat\lambda_2}{n_2}=\frac27+\frac17=\frac37.
\]
Hence, the approximate $99\%$ CI for $\lambda_1-\lambda_2$ is
\[
\left[1-2.5758\sqrt{\tfrac37},\ 1+2.5758\sqrt{\tfrac37}\right]=[-0.69,\ 2.69].
\]
The CI includes zero; hence, at the $1\%$ significance level, there is no evidence to reject $H_0:\lambda_1=\lambda_2$ against $H_1:\lambda_1\neq\lambda_2$. That is, there is no evidence to say, at the $1\%$ significance level, that tree A produced a higher density of seedlings than tree B did.

Exercise 17
(a) $Y_i\sim\mathrm{Ber}(p)$ independently, $i=1,\dots,n$, and we are interested in testing the hypothesis $H_0:p=p_0$ against $H_1:p=p_1$. The likelihood is
\[
L(p;y)=p^{\sum_i y_i}(1-p)^{n-\sum_i y_i},
\]
and so we get the likelihood ratio
\[
\lambda(p)=\frac{L(p_0;y)}{L(p_1;y)}
=\frac{p_0^{\sum_i y_i}(1-p_0)^{n-\sum_i y_i}}{p_1^{\sum_i y_i}(1-p_1)^{n-\sum_i y_i}}
=\left\{\frac{p_0(1-p_1)}{p_1(1-p_0)}\right\}^{\sum_i y_i}\left\{\frac{1-p_0}{1-p_1}\right\}^{n}.
\]
Then, the critical region is
\[
R=\{y:\lambda(p)\le a\},
\]
where $a$ is a constant chosen to give significance level $\alpha$. It means that we reject the null hypothesis if
\[
\left\{\frac{p_0(1-p_1)}{p_1(1-p_0)}\right\}^{\sum_i y_i}\left\{\frac{1-p_0}{1-p_1}\right\}^{n}\le a,
\]
which is equivalent to
\[
\left\{\frac{p_0(1-p_1)}{p_1(1-p_0)}\right\}^{\sum_i y_i}\le b,
\]
or, after taking logs of both sides, to
\[
\sum_i y_i\,\log\left\{\frac{p_0(1-p_1)}{p_1(1-p_0)}\right\}\le c,
\]
where $b$ and $c$ are constants chosen to give significance level $\alpha$. When $p_1>p_0$ we have
\[
\log\left\{\frac{p_0(1-p_1)}{p_1(1-p_0)}\right\}<0.
\]
Hence, the critical region can be written as
\[
R=\{y:\bar y\ge d\},
\]
for some constant $d$ chosen to give significance level $\alpha$. By the central limit theorem, we have (when the null hypothesis is true, i.e., $p=p_0$):
\[
\bar Y\ \sim\ \mathrm{AN}\!\left(p_0,\ \frac{p_0(1-p_0)}{n}\right).
\]
Hence,
\[
Z=\frac{\bar Y-p_0}{\sqrt{p_0(1-p_0)/n}}\ \sim\ \mathrm{AN}(0,1),
\]
and we may write
\[
\alpha=P(\bar Y\ge d\mid p=p_0)\approx P(Z\ge z_\alpha),
\qquad z_\alpha=\frac{d-p_0}{\sqrt{p_0(1-p_0)/n}}.
\]
Hence $d=p_0+z_\alpha\sqrt{p_0(1-p_0)/n}$, and the critical region is
\[
R=\left\{y:\bar y\ge p_0+z_\alpha\sqrt{\frac{p_0(1-p_0)}{n}}\right\}.
\]
(b) The critical region does not depend on $p_1$; hence it is the same for all $p_1>p_0$, and so there is a uniformly most powerful test for $H_0:p=p_0$ against $H_1:p>p_0$. The power function is
\[
\beta(p)=P(\bar Y\in R\mid p)
=P\left(\bar Y\ge p_0+z_\alpha\sqrt{\frac{p_0(1-p_0)}{n}}\ \Big|\ p\right)
=P\left(\frac{\bar Y-p}{\sqrt{p(1-p)/n}}\ge g(p)\right)
=1-\Phi\{g(p)\},
\]
where
\[
g(p)=\frac{p_0+z_\alpha\sqrt{p_0(1-p_0)/n}-p}{\sqrt{p(1-p)/n}}
\]
and $\Phi$ denotes the cumulative distribution function of the standard normal distribution.
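The power function $\beta(p)=1-\Phi\{g(p)\}$ is easy to evaluate using the error function, since $\Phi(x)=\tfrac12\big(1+\mathrm{erf}(x/\sqrt2)\big)$; a sketch (the parameter values are illustrative):

```python
import math

def Phi(x):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def power(p, p0, n, z_alpha=1.6449):
    """beta(p) = P(Ybar >= p0 + z_alpha*sqrt(p0(1-p0)/n) | p) = 1 - Phi(g(p))."""
    d = p0 + z_alpha * math.sqrt(p0 * (1.0 - p0) / n)  # critical value for Ybar
    g = (d - p) / math.sqrt(p * (1.0 - p) / n)
    return 1.0 - Phi(g)

# at p = p0 the power reduces to the size alpha = 0.05,
# and it increases as p moves above p0
print(power(0.1, 0.1, 30), power(0.2, 0.1, 30), power(0.3, 0.1, 30))
```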
Question 18
(a) Let us denote by $Y_i\sim\mathrm{Ber}(p)$ the response of mouse $i$ to the drug candidate. Then, from Question 17, we have the following critical region:
\[
R=\left\{y:\bar y\ge p_0+z_\alpha\sqrt{\frac{p_0(1-p_0)}{n}}\right\}.
\]
Here $p_0=0.1$, $n=30$, $\alpha=0.05$, $z_\alpha=1.6449$. It gives
\[
R=\{y:\bar y\ge 0.19\}.
\]
From the sample we have $\hat p=\bar y=6/30=0.2\ge 0.19$; that is, there is evidence to reject the null hypothesis at the significance level $\alpha=0.05$.
(b) The power function is $\beta(p)=1-\Phi\{g(p)\}$, where
\[
g(p)=\frac{p_0+z_\alpha\sqrt{p_0(1-p_0)/n}-p}{\sqrt{p(1-p)/n}},
\qquad z_{0.05}=1.6449.
\]
When $n=30$, $p_0=0.1$ and $p=0.2$ we obtain $g(0.2)=-0.1356$ and
\[
\Phi(-0.1356)=1-\Phi(0.1356)=1-0.5539=0.4461.
\]
It gives the power equal to $\beta(0.2)=0.5539$. It means that the probability of a type II error is $0.4461$, which is rather high. This is because the value of the alternative hypothesis is close to the null hypothesis and also the number of observations is not large.
To find the sample size needed to get power $\beta(0.2)=0.8$ we calculate
\[
g(p)=\frac{-0.1\sqrt n+0.3\,z_{0.05}}{0.4}=-\frac{\sqrt n}{4}+\frac34\,z_{0.05}.
\]
For $\beta(p)=1-\Phi\{g(p)\}$ to be equal to $0.8$ it means that $\Phi\{g(p)\}=0.2$. From statistical tables we obtain $g(p)=-0.8416$. Hence, with $z_{0.05}=1.6449$,
\[
n=(4\cdot0.8416+3\cdot1.6449)^2\approx 68.9.
\]
At least $69$ mice are needed to obtain a test with power as high as $0.8$ for detecting that the proportion is $0.2$ rather than $0.1$.
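The arithmetic of Question 18 can be replayed numerically; a sketch using the constants from the solution ($z_{0.05}=1.6449$ and $\Phi^{-1}(0.8)=0.8416$):

```python
import math

def Phi(x):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

z05 = 1.6449
p0, p1, n = 0.1, 0.2, 30
d = p0 + z05 * math.sqrt(p0 * (1.0 - p0) / n)    # critical value, ~0.19
g = (d - p1) / math.sqrt(p1 * (1.0 - p1) / n)    # g(0.2), ~ -0.1356
beta = 1.0 - Phi(g)                              # power, ~0.5539
# sample size for power 0.8, using Phi(0.8416) = 0.8:
n_needed = math.ceil(((0.4 * 0.8416 + 0.3 * z05) / 0.1) ** 2)
print(round(d, 3), round(beta, 4), n_needed)
```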