Final Examination, Statistics 200C
T. Ferguson

1. (a) State the Borel-Cantelli Lemma and its converse.
(b) Let X_1, X_2, ... be i.i.d. from a distribution with density f(x) = θx^(−(θ+1)) on the interval (1, ∞). For what values of θ is it true that (1/n)X_n → 0 a.s.?

2. Let X_1, X_2, ... be independent random variables with X_k having the distribution

   X_k = +1 with probability 1/(2√k),
   X_k = −1 with probability 1/(2√k),
   X_k = 0 with probability 1 − 1/√k.

(a) Let S_n = Σ_{k=1}^n X_k. Find E(S_n) and Var(S_n). Note that Var(S_n) → ∞.
(b) Check that the UAN condition holds.
(c) Show whether or not (S_n − E(S_n))/√Var(S_n) converges in law to the standard normal distribution by checking the Lindeberg condition.

3. Suppose we are given n independent trials resulting in c possible cells, each trial having probability p_i of falling in cell i, for i = 1, ..., c. Let n_i denote the number of trials falling in cell i.
(a) What is Pearson's chi-square for testing the hypothesis that the true probabilities are p_i for i = 1, ..., c?
(b) Find the transformed chi-square with the transformation g(p) = log(p) applied to each cell. Find the modified transformed chi-square.
(c) What is the approximate large-sample distribution of the modified transformed chi-square if the true cell probabilities are p⁰_i for i = 1, ..., c?

4. In sampling from a population of N objects having values z_1, z_2, ..., z_N, first a sample of size n < N/2 is taken without replacement. Later a second sample of size n is taken from the remaining N − n objects without replacement. The difference of the means of the two samples is used to compare the samples. This leads to a rank statistic of the form S_N = Σ_{j=1}^N z_j a(R_j), where a(i) = 1/n for i = 1, ..., n, a(i) = −1/n for i = n+1, ..., 2n, and a(i) = 0 for i = 2n+1, ..., N.
(a) What are the mean and the variance of S_N?
(b) Assume that n → ∞ as N → ∞. Under what condition on the z_i is it true that (S_N − ES_N)/√Var(S_N) →_L N(0, 1)?
5. (a) Give the definition of the Kullback-Leibler information number, K(f_0, f_1).
(b) What is the Information Inequality?
(c) Suppose f_0(x) is the density of the binomial distribution B(n, 1/2) (with sample size n and probability of success 1/2), and f_1(x) is the density of the binomial distribution B(n, 3/4). Find K(f_0, f_1) and check that the inequality holds.

6. Let (X_1, Y_1), ..., (X_n, Y_n) be a sample from a bivariate distribution with density

   f(x, y | µ, θ) = θ²µx exp{−θx(1 + µy)}  for x > 0 and y > 0,

where µ > 0 and θ > 0 are parameters.
(a) Find the maximum likelihood estimates of µ and θ.
(b) Find the Fisher information matrix for this distribution.
(c) What is the asymptotic distribution of the MLE of µ when θ is unknown? What is the asymptotic distribution of the MLE of µ when θ is known?

7. Let X_1, ..., X_n be a sample from the Poisson distribution P(λ), let Y_1, ..., Y_n be a sample from a Poisson distribution P(λ + β_1), and let Z_1, ..., Z_n be a sample from the Poisson distribution P(λ + β_2), with all three parameters λ, β_1, β_2 unknown. Assume that all three samples are independent.
(a) Find the likelihood ratio test statistic for testing the hypothesis H_0: β_1 = β_2.
(b) What function of the likelihood ratio test statistic has asymptotically a chi-square distribution, and how many degrees of freedom does it have in this case?

8. A sample of size n is taken in a multinomial experiment with c² cells denoted (i, j), i = 1, ..., c and j = 1, ..., c. Let p_ij denote the probability of cell (i, j), and let n_ij denote the number falling in cell (i, j), so that Σ p_ij = 1 and Σ n_ij = n.
(a) Let H_1 denote the hypothesis of symmetry, that p_ij = p_ji for all i and j. Find the chi-square test of H_1 against all alternatives. How many degrees of freedom does it have?
(b) Let H_0 denote the hypothesis that all off-diagonal elements are equal: p_ij = q for all i ≠ j, for some q. Note that under H_0, p_11 + p_22 + ... + p_cc + c(c−1)q = 1. Find the chi-square test of H_0 against all alternatives. How many degrees of freedom?
(c) What, then, is the chi-square test of H_0 against H_1, and how many degrees of freedom does it have?
Solutions to the Final Examination, Stat 200C.

1. (a) If A_1, A_2, ... are events such that Σ_{j=1}^∞ P(A_j) < ∞, then P(A_n i.o.) = 0. Conversely, if the A_j are independent events and Σ_{j=1}^∞ P(A_j) = ∞, then P(A_n i.o.) = 1.
(b) Let ε be an arbitrary positive number. Then (1/n)X_n → 0 a.s. if, and only if, P((1/n)X_n > ε i.o.) = 0 for every such ε. Since

   Σ_{n=1}^∞ P((1/n)X_n > ε) = Σ_{n=1}^∞ P(X_n > εn) = Σ_{n=1}^∞ 1/(εn)^θ < ∞  if, and only if, θ > 1,

we have (1/n)X_n → 0 a.s. if, and only if, θ > 1.

2. (a) E(X_k) = 0 and Var(X_k) = 1/√k. So E(S_n) = 0 and B_n² = Var(S_n) = Σ_{k=1}^n 1/√k ∼ ∫_0^n x^(−1/2) dx = 2√n → ∞.
(b) max_j Var(X_j) = max_j 1/√j = 1, so [max_j Var(X_j)]/B_n² = 1/B_n² → 0.
(c) Since |X_j| ≤ 1 for all j,

   (1/B_n²) Σ_{j=1}^n E(X_j² I(|X_j| > εB_n)) ≤ (1/B_n²) Σ_{j=1}^n E(X_j² I(1 > εB_n)) = I(1 > εB_n) = 0

for sufficiently large n, since B_n → ∞. So the Lindeberg condition holds, and S_n/B_n →_L N(0, 1), or equivalently S_n/n^(1/4) →_L N(0, 2).

3. (a) χ²_P = Σ_{j=1}^c (n_j − np_j)²/(np_j) = n Σ_{j=1}^c (n_j/n − p_j)²/p_j.
(b) χ²_T = n Σ_{j=1}^c p_j (log(n_j/n) − log(p_j))², and χ²_TM = Σ_{j=1}^c n_j (log(n_j/n) − log(p_j))².
(c) The limiting distribution is noncentral chi-square, χ²_{c−1}(λ), with c − 1 degrees of freedom and noncentrality parameter λ = n Σ_{j=1}^c p⁰_j (log(p⁰_j) − log(p_j))².

4. (a) Since ā_N = 0, we have ES_N = 0. The variance of S_N is (N²/(N−1)) σ_z² σ_a², and since σ_a² = (1/N) Σ_{i=1}^N a(i)² = 2/(nN), we have Var(S_N) = (2N/(n(N−1))) σ_z².
(b) For asymptotic normality of S_N, we need

   [max_j (z_j − z̄_N)²][max_j (a(j) − ā_N)²] / (N σ_z² σ_a²) → 0.

We have max_j (a(j) − ā_N)² = 1/n², and σ_a² = 2/(nN). Then the above condition becomes max_j (z_j − z̄_N)²/(2n σ_z²) → 0.

5. (a) K(f_0, f_1) = E_0 log(f_0(X)/f_1(X)), where E_0 represents the expectation when f_0(x) is the density of X.
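As a numerical sanity check on Problem 2 (a sketch, not part of the original exam): with Var(X_k) = 1/√k, the partial sums B_n² = Σ_{k=1}^n 1/√k should grow like 2√n, so their ratio should approach 1.

```python
import math

# Sanity check for Problem 2: B_n^2 = sum_{k=1}^n 1/sqrt(k)
# grows like the integral of x^(-1/2) from 0 to n, which is 2*sqrt(n).
n = 1_000_000
B_n_sq = sum(1.0 / math.sqrt(k) for k in range(1, n + 1))
ratio = B_n_sq / (2.0 * math.sqrt(n))
print(ratio)  # close to 1 for large n
```

The ratio differs from 1 by a bounded constant over 2√n, so it tends to 1 as n grows.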
(b) K(f_0, f_1) ≥ 0, with equality if, and only if, f_0(x) and f_1(x) are densities of the same distribution.
(c) f_0(x) = (n choose x)(1/2)^n and f_1(x) = (n choose x)(3/4)^x (1/4)^(n−x), so f_0(x)/f_1(x) = 2^n / 3^x. So K(f_0, f_1) = E_0(n log 2 − X log 3) = n log 2 − (n/2) log 3 = (n/2)[log 4 − log 3]. This is obviously positive.

6. (a) l_n(θ, µ) = 2n log θ + n log µ + Σ log x_i − θ Σ x_i(1 + µy_i). Then ∂l_n/∂θ = (2n/θ) − Σ x_i(1 + µy_i) = 0 implies 2 = θ̂x̄ + θ̂µ̂(xy-bar), and ∂l_n/∂µ = (n/µ) − θ Σ x_i y_i = 0 implies 1 = θ̂µ̂(xy-bar). Solving these equations gives θ̂ = 1/x̄ and µ̂ = x̄/(xy-bar), where xy-bar = (1/n) Σ x_i y_i.
(b) ψ(x, y | θ, µ) = ((2/θ) − x(1 + µy), (1/µ) − θxy). Since Eψ = 0, this shows that E(XY) = 1/(µθ). Then

   ψ̇ = ( −2/θ²   −xy  )
        ( −xy    −1/µ² )

and

   I(θ, µ) = −E ψ̇ = ( 2/θ²     1/(µθ) )
                     ( 1/(µθ)   1/µ²   ).

(c) Since Det(I(θ, µ)) = 2/(θ²µ²) − 1/(θ²µ²) = 1/(µ²θ²), we have

   I(θ, µ)⁻¹ = µ²θ² (  1/µ²      −1/(µθ) )   (  θ²    −µθ )
                    ( −1/(µθ)     2/θ²   ) = ( −µθ    2µ² ).

So when θ is unknown, √n(µ̂ − µ) →_L N(0, 2µ²). When θ is known, the asymptotic variance of the MLE is the reciprocal of the lower right corner of the information matrix, namely µ². So √n(µ̃ − µ) →_L N(0, µ²). (Here, the MLE of µ when θ is known is µ̃ = 1/(θ·(XY-bar)).)

7. (a) The log-likelihood function is

   l_n = log L_n(λ, β_1, β_2) = −n(3λ + β_1 + β_2) + log(λ) Σ X_i + log(λ + β_1) Σ Y_i + log(λ + β_2) Σ Z_i

plus a term not involving the parameters. The likelihood equations are

   ∂l_n/∂λ = −3n + (1/λ) Σ X_i + (1/(λ + β_1)) Σ Y_i + (1/(λ + β_2)) Σ Z_i = 0
   ∂l_n/∂β_1 = −n + (1/(λ + β_1)) Σ Y_i = 0
   ∂l_n/∂β_2 = −n + (1/(λ + β_2)) Σ Z_i = 0

The maximum likelihood estimates are λ̂ = X̄, β̂_1 = Ȳ − X̄, and β̂_2 = Z̄ − X̄. In a similar way, the MLEs under H_0 are λ̃ = X̄ and β̃_1 = β̃_2 = (Ȳ + Z̄)/2 − X̄. The likelihood ratio test rejects H_0 for small values of

   Λ = L_n(λ̃, β̃_1, β̃_2) / L_n(λ̂, β̂_1, β̂_2) = ((Ȳ + Z̄)/2)^(n(Ȳ + Z̄)) / (Ȳ^(nȲ) Z̄^(nZ̄)).

(b) −2 log Λ has asymptotically a chi-square distribution with 1 degree of freedom.
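The closed form in Problem 5(c), K(f_0, f_1) = (n/2)[log 4 − log 3], can be checked numerically (a sketch, not part of the original exam) by summing E_0 log(f_0(X)/f_1(X)) directly over the binomial support:

```python
import math

def binom_pmf(n, p, x):
    # Binomial(n, p) probability mass at x.
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

def kl(n, p0, p1):
    # Kullback-Leibler number K(f0, f1) = E_0 log(f0(X)/f1(X)), by direct summation.
    return sum(binom_pmf(n, p0, x) * math.log(binom_pmf(n, p0, x) / binom_pmf(n, p1, x))
               for x in range(n + 1))

n = 10
print(kl(n, 0.5, 0.75))                        # direct sum
print((n / 2) * (math.log(4) - math.log(3)))   # closed form (n/2)[log 4 - log 3]
```

The two printed values agree to floating-point precision, and both are positive, as the Information Inequality requires.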
8. (a) Under H_1, the maximum likelihood estimates of the p_ij are p̂_ii = n_ii/n and, for i ≠ j, p̂_ij = (n_ij + n_ji)/(2n). There are (c−1) + (c−2) + ... + 1 = c(c−1)/2 restrictions in going from the general hypothesis to H_1. So the chi-square test of H_1 rejects H_1 if χ²(p̂) is greater than the appropriate cutoff point of a chi-square distribution with c(c−1)/2 degrees of freedom.
(b) Under H_0, the likelihood is proportional to [Π_{j=1}^c p_jj^(n_jj)] q^(n−m), where m = Σ_{j=1}^c n_jj. So the maximum likelihood estimates are p̃_jj = n_jj/n and q̃ = [1 − Σ_{j=1}^c p̃_jj]/(c(c−1)). There are c parameters estimated, so the chi-square test of H_0 rejects H_0 if χ²(p̃) is greater than the appropriate cutoff point of a chi-square distribution with (c² − 1) − c = c² − c − 1 degrees of freedom.
(c) The chi-square test of H_0 within H_1 rejects H_0 if χ²(p̃) − χ²(p̂) is greater than the appropriate cutoff point of a chi-square distribution with c² − c − 1 − c(c−1)/2 = c(c−1)/2 − 1 degrees of freedom.
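The degrees-of-freedom bookkeeping in Problem 8 can be sketched by counting free parameters directly (an illustrative check, not part of the original exam):

```python
def df_symmetry(c):
    # Test of H_1 (symmetry) against all alternatives: the full multinomial on
    # c^2 cells has c^2 - 1 free parameters; under symmetry there are
    # c(c+1)/2 - 1 (the diagonal plus one per unordered off-diagonal pair).
    return (c * c - 1) - (c * (c + 1) // 2 - 1)          # = c(c-1)/2

def df_equal_offdiag(c):
    # Test of H_0 (equal off-diagonal cells) against all alternatives:
    # H_0 estimates c parameters (p_11, ..., p_cc determine q).
    return (c * c - 1) - c                               # = c^2 - c - 1

def df_h0_within_h1(c):
    # Test of H_0 within H_1: the difference of the two counts.
    return df_equal_offdiag(c) - df_symmetry(c)          # = c(c-1)/2 - 1

for c in (2, 3, 4):
    print(c, df_symmetry(c), df_equal_offdiag(c), df_h0_within_h1(c))
```

For c = 3, for example, this gives 3, 5, and 2 degrees of freedom, matching c(c−1)/2, c² − c − 1, and c(c−1)/2 − 1.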