1. (6 points) True/False. Please read the statements carefully, as no partial credit will be given.
(a) If X and Y are independent, then Corr(X, Y) = 0.
(b) A consistent estimator must be asymptotically normal.
(c) The sample mean is a sufficient statistic for θ in the normal distribution N(µ, θ).
(d) The likelihood ratio test provides a best rejection region of size α for testing the simple hypotheses H0: θ = θ0 vs. H1: θ = θ1, and it is an unbiased test.
(e) A uniformly most powerful unbiased (UMPU) test has to be a uniformly most powerful (UMP) test.
(f) If the Xi are iid N(µ, σ²) for i = 1, …, n, then E(X̄S²) = µσ², where X̄ is the sample mean and S² is the sample variance.

2. (2 points) Which of the MLEs for θ in the following distributions may NOT be asymptotically efficient?
A) N(θ, 1)  B) N(0, θ)  C) Uniform[0, θ]  D) Bernoulli(θ)  E) Poisson(θ)

3. (2 points) Which of the following statements is WRONG?
A) If Y is an MLE for θ, then Y² is an MLE for θ².
B) If Y is an unbiased estimator for θ, then Y² is an unbiased estimator for θ².
C) If Y is a consistent estimator for θ, then Y² is a consistent estimator for θ².
D) If Y is a test statistic for the test H0: θ = 0 vs. Ha: θ ≠ 0, then e^Y is also a test statistic for the test.
E) If Y, computed from a random sample X1, X2, …, Xn, is a sufficient statistic for the parameter θ, then e^Y is also sufficient for θ.
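Not part of the exam, but statement 1(f) is easy to check numerically: for iid normal samples, X̄ and S² are independent, so E(X̄S²) = E(X̄)E(S²) = µσ². A minimal Monte Carlo sketch (assuming NumPy; the values of µ, σ, and n are arbitrary choices):

```python
import numpy as np

# Monte Carlo check of statement 1(f): for iid N(mu, sigma^2) samples,
# Xbar and S^2 are independent, so E(Xbar * S^2) = mu * sigma^2.
rng = np.random.default_rng(0)
mu, sigma, n, reps = 2.0, 3.0, 5, 200_000
x = rng.normal(mu, sigma, size=(reps, n))
xbar = x.mean(axis=1)
s2 = x.var(axis=1, ddof=1)      # unbiased sample variance (divides by n - 1)
est = np.mean(xbar * s2)        # Monte Carlo estimate of E(Xbar * S^2)
print(est, mu * sigma**2)       # estimate should be close to 18.0
```

Note this independence of X̄ and S² is special to the normal family, which is why (f) holds there.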
4. (2 points) Let X1, …, Xn be a random sample of size n from a density f(x; θ) with an unknown parameter θ, θ ∈ Θ ⊂ R. Assume all regularity conditions hold. Which of the following statements is CORRECT?
A) Let Y be a statistic with E(Y) = θ². Then Var(Y) ≥ 4θ²/(nI(θ)) by the Rao–Cramér lower bound.
B) The MLE of θ has the limiting distribution √n(θ̂_ML − θ0) →D N(0, I⁻¹(θ0)), where θ0 is the true parameter and I(θ) is the Fisher information for θ.
C) Under the null hypothesis, Wald's test statistic always exactly follows a χ²_p distribution, where p = dim(Θ) − dim(Θ0), and Θ0 and Θ are the null and entire parameter spaces, respectively.
D) If E(Xi) = θ for i = 1, …, n, then the sample mean X̄ is an unbiased estimator of θ and is asymptotically efficient regardless of the distribution of Xi.
E) The MLE of σ² constructed from a random sample of N(µ, σ²) is (finite-sample) efficient and sufficient for σ².
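The limiting distribution in option B can be illustrated by simulation. A sketch using the Poisson(θ) family (an assumed example, not from the exam), where the MLE is X̄ and I(θ) = 1/θ, so √n(X̄ − θ0) is approximately N(0, θ0):

```python
import numpy as np

# Illustration of 4(B): for Poisson(theta0), the MLE is Xbar and I(theta) = 1/theta,
# so sqrt(n)*(Xbar - theta0) is approximately N(0, theta0) for large n.
rng = np.random.default_rng(1)
theta0, n, reps = 4.0, 400, 50_000
x = rng.poisson(theta0, size=(reps, n))
z = np.sqrt(n) * (x.mean(axis=1) - theta0)
v = z.var()
print(v)   # should be close to 1/I(theta0) = theta0 = 4.0
```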
5. (4 points) Let X1, …, X5 be a random sample from the Bernoulli distribution with p.d.f. f(x; θ) = θ^x (1 − θ)^(1−x) for x = 0, 1, zero elsewhere, and θ ∈ (0, 1).
(a) Find the MLE of P(X = 0).
(b) Find a uniformly most powerful (UMP) test of size α for the one-sided test H0: θ = 1/2 vs. H1: θ < 1/2. Specify your rejection rule in terms of X̄.
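A sketch of the attainable sizes for part (b) (assuming SciPy; not part of the exam): the UMP test rejects for small values of the sufficient statistic Y = X1 + ⋯ + X5 ~ Binomial(5, 1/2) under H0, so the non-randomized sizes are the binomial lower-tail probabilities P(Y ≤ c):

```python
from scipy.stats import binom

# Attainable sizes for the UMP test in 5(b): reject when Y = sum(X_i) <= c,
# where Y ~ Binomial(5, 1/2) under H0; the size of each rule is P(Y <= c).
n, theta0 = 5, 0.5
sizes = {c: binom.cdf(c, n, theta0) for c in range(n + 1)}
for c, size in sizes.items():
    print(c, size)
# e.g. rejecting when Y <= 0 (i.e. Xbar = 0) gives size 1/32 = 0.03125
```

For a size exactly equal to an arbitrary α, a randomized test would be needed, since Y is discrete.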
6. (6 points) Suppose X1, X2, …, Xn are iid with pdf f(x; θ) = (1/θ)e^(−x/θ) for x > 0 and 0 elsewhere, θ > 0. In this case, all regularity conditions (R0–R5) are satisfied.
(a) Find the Fisher information of θ in the random sample, nI(θ).
(b) What is the asymptotic distribution of √n(X̄ − θ)? Note that X̄ is the MLE of θ in this case.
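As a numerical sanity check for part (b) (not part of the exam, assuming NumPy): for this exponential density I(θ) = 1/θ², so √n(X̄ − θ) should be approximately N(0, θ²):

```python
import numpy as np

# Check of 6(b): for the exponential density (1/theta)e^{-x/theta},
# I(theta) = 1/theta^2, so sqrt(n)*(Xbar - theta) ~ approx N(0, theta^2).
rng = np.random.default_rng(2)
theta, n, reps = 2.0, 500, 50_000
x = rng.exponential(theta, size=(reps, n))   # scale parameter = mean = theta
z = np.sqrt(n) * (x.mean(axis=1) - theta)
v = z.var()
print(v)   # should be close to theta^2 = 4.0
```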
(c) Show that the likelihood ratio test of H0: θ = θ0 vs. H1: θ ≠ θ0 is based on the statistic W = Σ_{i=1}^n X_i.
(d) Consider n = 10. Find c1 and c2 so that the test that rejects H0 when W ≤ c1 or W ≥ c2 has significance level α = 0.05. (Hint: obtain the null distribution of 2W/θ0 first.)
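A sketch of the quantile lookup for part (d) (assuming SciPy; not part of the exam): under H0, each 2Xi/θ0 is χ²(2), so 2W/θ0 ~ χ²(2n) = χ²(20), and an equal-tailed level-0.05 test uses the 0.025 and 0.975 quantiles of χ²(20):

```python
from scipy.stats import chi2

# Sketch for 6(d): under H0, 2W/theta0 ~ chi-square with 2n = 20 df,
# so an equal-tailed alpha = 0.05 test uses its 0.025 and 0.975 quantiles.
n, alpha = 10, 0.05
q_lo = chi2.ppf(alpha / 2, df=2 * n)       # approx 9.591
q_hi = chi2.ppf(1 - alpha / 2, df=2 * n)   # approx 34.170
print(q_lo, q_hi)
# Reject H0 when W <= c1 = theta0*q_lo/2 or W >= c2 = theta0*q_hi/2.
```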
7. (4 points) Let X1, …, Xn be a random sample from the Laplace distribution with p.d.f. f(x; θ) = (1/2)e^(−|x−θ|), for −∞ < x < ∞, −∞ < θ < ∞. Note that E(Xi) = θ and Var(Xi) = 2. The Laplace distribution satisfies all regularity conditions (R0–R5).
(a) (2 points) Find the maximum likelihood estimator of θ.
(b) (2 points) Calculate the asymptotic relative efficiency of the sample mean to the MLE. Which estimator is better?
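A simulation sketch for part (b) (not part of the exam, assuming NumPy): for this Laplace density I(θ) = 1, so the MLE (the sample median) has asymptotic variance 1/n while the sample mean has variance 2/n, and their variance ratio should be near 1/2:

```python
import numpy as np

# Sketch for 7(b): under f(x; theta) = (1/2)e^{-|x - theta|}, I(theta) = 1,
# so the MLE (sample median) has asymptotic variance 1/n versus 2/n for the
# sample mean; the ARE of the mean to the MLE is therefore 1/2.
rng = np.random.default_rng(3)
theta, n, reps = 0.0, 501, 40_000
x = rng.laplace(theta, 1.0, size=(reps, n))   # scale 1 gives Var(X_i) = 2
var_mean = x.mean(axis=1).var()
var_med = np.median(x, axis=1).var()
ratio = var_med / var_mean
print(ratio)   # should be close to 0.5
```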
8. (4 points) Consider the regression model with a single covariate: Yi = βxi + ei, i = 1, …, n, where the ei are iid N(0, σ²) and σ² is known. Assume the covariate xi is fixed (not random); that is, Yi ~ N(βxi, σ²) independently.
(a) Derive the MLE of β. (Hint: the likelihood function is the same as the joint distribution of the Yi's.)
(b) In this case, all regularity conditions (R0–R5) are satisfied. It can be shown that the Fisher information in the whole sample is nI(β) = Σ_{i=1}^n x_i²/σ². Prove that the β̂ from part (a) is a consistent and efficient estimator of β. (Assume that Σ_{i=1}^n x_i² → ∞ as n → ∞.)
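A simulation sketch for problem 8 (not part of the exam; assuming NumPy, with arbitrary choices of β, σ, and the fixed design): maximizing the normal likelihood here is least squares, giving the closed form β̂ = Σ x_iY_i / Σ x_i², which should be unbiased with variance attaining the Cramér–Rao bound σ²/Σ x_i²:

```python
import numpy as np

# Sketch for problem 8: with Y_i ~ N(beta*x_i, sigma^2) independently and x_i fixed,
# the MLE is the least-squares estimator beta_hat = sum(x_i*Y_i)/sum(x_i^2),
# with Var(beta_hat) = sigma^2/sum(x_i^2), the Cramer-Rao lower bound.
rng = np.random.default_rng(4)
beta, sigma, n, reps = 1.5, 2.0, 50, 20_000
x = np.linspace(0.1, 5.0, n)                  # fixed (non-random) covariates
y = beta * x + rng.normal(0, sigma, size=(reps, n))
beta_hat = (y @ x) / (x @ x)                  # closed-form MLE, one per replicate
print(beta_hat.mean())                        # unbiased: close to beta = 1.5
print(beta_hat.var(), sigma**2 / (x @ x))     # variance close to the CRLB
```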