THE UNIVERSITY OF MANCHESTER

MATH20802 STATISTICAL METHODS

21 June 2010, 9:45–11:45 (two hours)

Answer any FOUR of the questions. University-approved calculators may be used.

To be supplied by the Examinations Office: Mathematical Formula Tables

1 of 7 P.T.O.
1. This question explores the moment generating function of the normal distribution.

(i) Show that exp(t^2/2) is the moment generating function of the standard normal distribution.

(ii) Find the first four moments of the standard normal distribution by differentiating the moment generating function.

(iii) Suppose that Z ~ N(0, 1) and let Y = µ + σZ for some scalars µ and σ > 0. Find the moment generating function of Y. This is the form of the moment generating function of the N(µ, σ^2) distribution.

(iv) Let X_1 ~ N(µ_1, σ_1^2), let X_2 ~ N(µ_2, σ_2^2), and assume that X_1 and X_2 are independent. Find the moment generating function of S = X_1 + X_2.

(v) Use the linearity of expectation to find E(S) and Var(S).

(vi) Use the uniqueness of the moment generating function to determine the distribution of S.
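The identity in part (i) can be sanity-checked numerically. The sketch below (an editorial illustration, not part of the question) compares the empirical mean of exp(tZ) over simulated standard normal draws with the closed form exp(t^2/2):

```python
import numpy as np

# Monte Carlo check that E[exp(tZ)] = exp(t^2/2) for Z ~ N(0, 1)
rng = np.random.default_rng(0)
z = rng.standard_normal(1_000_000)

t = 1.0
empirical = np.exp(t * z).mean()   # sample estimate of the MGF at t
theoretical = np.exp(t**2 / 2)     # closed form, approx. 1.6487
print(empirical, theoretical)
```

With a million draws the two values agree to about two decimal places, which is consistent with the Monte Carlo standard error.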
2. Suppose θ̂ is an estimator of θ. Define what is meant by the following:

(i) θ̂ is an unbiased estimator of θ;
(ii) θ̂ is an asymptotically unbiased estimator of θ;
(iii) the bias of θ̂ (written as bias(θ̂));
(iv) the mean squared error of θ̂ (written as MSE(θ̂));
(v) θ̂ is a consistent estimator of θ.

Let X_i denote the time that it takes student i to complete a take-home exam, and suppose that X_1, X_2, ..., X_n constitute a random sample from an exponential distribution with parameter β. Consider the following estimators for θ = 1/β:

θ̂_1 = c · min(X_1, X_2, ..., X_n)   and   θ̂_2 = (1/n) Σ_{i=1}^n X_i.

(i) Determine the value of c (perhaps as a function of n) for which θ̂_1 is an unbiased estimator of θ.
(ii) Determine the variance and MSE of the estimator θ̂_1 from (i) as a function of the parameter θ.
(iii) Determine the bias, variance, and MSE of θ̂_2 as a function of the parameter θ.
(iv) Which of the two estimators (θ̂_1 or θ̂_2) is better, and why? (3 marks)
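For intuition about part (iv), a short simulation helps. The sketch below (an editorial aside, which assumes the standard result that c = n makes the minimum-based estimator unbiased for the exponential mean) compares the two estimators' mean squared errors:

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 2.0, 10, 200_000

# reps independent samples of size n from an exponential with mean theta = 1/beta
x = rng.exponential(theta, size=(reps, n))

t1 = n * x.min(axis=1)   # c = n: n * min(X_1, ..., X_n) is unbiased for theta
t2 = x.mean(axis=1)      # the sample mean

mse1 = ((t1 - theta) ** 2).mean()   # theory: theta^2 (does not shrink with n)
mse2 = ((t2 - theta) ** 2).mean()   # theory: theta^2 / n
print(mse1, mse2)
```

The sample mean wins by a factor of roughly n, since the MSE of the minimum-based estimator does not decrease as the sample grows.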
3. Let the random variable Y_i be the number of typographical errors on page i of a 400-page book (for i = 1, 2, ..., 400), and suppose that the Y_i are independent and identically distributed according to a Poisson distribution with parameter λ. Let the random variable X be the number of pages of this book that contain at least one typographical error. Suppose that you are told the value of X but are not told anything about the values of the Y_i.

(i) Identify the probability distribution of X and write down its probability mass function in terms of λ.
(ii) Write down the likelihood function of λ.
(iii) Find the maximum likelihood estimator (MLE) of λ by maximizing the log-likelihood function.
(iv) Suppose that the data reveal that 25 of the 400 pages contain a typographical error. What is the MLE of λ?
(v) Show that you could have used the invariance property of maximum likelihood estimators to determine the MLE of λ.
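Here a page contains at least one error with probability p = 1 − exp(−λ), so X is binomial with 400 trials. The sketch below (an editorial aside, assuming that setup) computes the invariance-based estimate for part (iv) and grid-checks that it maximizes the binomial log likelihood:

```python
import math

n_pages, x_obs = 400, 25

# X ~ Binomial(400, p) with p = 1 - exp(-lambda); by invariance,
# lambda_hat = -log(1 - p_hat) where p_hat = x/n is the MLE of p.
p_hat = x_obs / n_pages
lam_hat = -math.log(1 - p_hat)

def loglik(lam):
    p = 1 - math.exp(-lam)
    return x_obs * math.log(p) + (n_pages - x_obs) * math.log(1 - p)

# Crude grid check: the log likelihood is largest at lam_hat itself
vals = [loglik(lam_hat - 0.01), loglik(lam_hat), loglik(lam_hat + 0.01)]
print(lam_hat, vals.index(max(vals)))
```

With 25 of 400 pages affected, lam_hat = −log(0.9375) ≈ 0.0645 errors per page.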
4. Suppose X_1, X_2, ..., X_n are independent and identically distributed random variables with common probability density function (pdf)

f(x) = θ_2 x^{θ_2 − 1} / θ_1^{θ_2}   for 0 < x < θ_1,

where θ_1 > 0 and θ_2 > 0. Both θ_1 and θ_2 are unknown.

(i) Calculate the cumulative distribution function, mean and variance corresponding to the given pdf.
(ii) Write down the joint likelihood function of θ_1 and θ_2.
(iii) Determine the maximum likelihood estimator (MLE) of θ_1.
(iv) Determine the MLE of θ_2.
(v) Show that the MLE, θ̂_1, is a biased and consistent estimator for θ_1.
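A simulation makes the bias and consistency in part (v) visible. The sketch below (an editorial aside, assuming the pdf f(x) = θ_2 x^{θ_2−1}/θ_1^{θ_2} on (0, θ_1) and the standard MLEs θ̂_1 = max X_i and θ̂_2 = n / Σ log(θ̂_1/X_i) for this family) draws a large sample by inverting the CDF and evaluates both estimators:

```python
import numpy as np

rng = np.random.default_rng(2)
theta1, theta2, n = 3.0, 2.0, 5000

# Inverse-CDF sampling: F(x) = (x / theta1)^theta2 on (0, theta1),
# so x = theta1 * u^(1/theta2) for u ~ Uniform(0, 1)
u = rng.uniform(size=n)
x = theta1 * u ** (1 / theta2)

theta1_hat = x.max()                            # MLE of theta1 (always below theta1)
theta2_hat = n / np.log(theta1_hat / x).sum()   # MLE of theta2
print(theta1_hat, theta2_hat)
```

The sample maximum always undershoots θ_1 (hence the bias), but for n = 5000 it already sits within a fraction of a percent of the truth, illustrating consistency.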
5. Suppose we wish to test H_0: θ = θ_0 versus H_1: θ ≠ θ_0. Define what is meant by the following:

(i) the Type I error of the test;
(ii) the Type II error of the test;
(iii) the significance level of the test;
(iv) the power function of the test (denoted Π(θ)).

Suppose X_1, X_2, ..., X_n is a random sample from a Bernoulli distribution with parameter p. State the rejection region for each of the following tests:

(i) H_0: p = p_0 versus H_1: p ≠ p_0;
(ii) H_0: p = p_0 versus H_1: p < p_0;
(iii) H_0: p = p_0 versus H_1: p > p_0.

In each case, assume a significance level of α and that X̄ = (X_1 + X_2 + ... + X_n)/n has an approximate normal distribution.

Under the same assumptions, find the power function, Π(p), for each of the tests:

(i) H_0: p = p_0 versus H_1: p ≠ p_0;
(ii) H_0: p = p_0 versus H_1: p < p_0;
(iii) H_0: p = p_0 versus H_1: p > p_0. (3 marks)

In each case, you may express the power function, Π(p), in terms of Φ(·), the standard normal distribution function.
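For the one-sided test against H_1: p > p_0, the normal approximation rejects when X̄ exceeds p_0 + z_α √(p_0(1−p_0)/n), and the power function follows by standardising under the alternative. The sketch below (an editorial aside with illustrative values n = 100, p_0 = 0.5 that are not from the paper) evaluates that power function:

```python
from math import sqrt
from statistics import NormalDist

nd = NormalDist()
alpha, n, p0 = 0.05, 100, 0.5   # illustrative values
z = nd.inv_cdf(1 - alpha)       # upper-alpha critical value of N(0, 1)

def power_upper(p):
    """Pi(p) for H0: p = p0 vs H1: p > p0 under the normal approximation."""
    cutoff = p0 + z * sqrt(p0 * (1 - p0) / n)   # rejection threshold for X-bar
    return 1 - nd.cdf((cutoff - p) / sqrt(p * (1 - p) / n))

print(power_upper(p0))    # approx. alpha = 0.05 at p = p0
print(power_upper(0.6))   # power against p = 0.6
```

As expected, the power equals the significance level at p = p_0 and increases as p moves above p_0.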
6. State the Neyman-Pearson test for H_0: θ = θ_1 versus H_1: θ = θ_2, based on a random sample X_1, X_2, ..., X_n from a distribution with probability density function f(x; θ).

Let X_1, X_2, ..., X_n be a random sample from a Bernoulli distribution with parameter p.

(i) Find the most powerful test at level α for H_0: p = p_0 versus H_1: p = p_1, where p_1 > p_0 are constants. Show that the test rejects H_0 if and only if Σ_{i=1}^n X_i > k for some k.
(ii) Determine the power function, Π(p), of the test in part (i).
(iii) Find the value of k when α = 0.05, n = 5, p_0 = 0.5 and p_1 = 0.6.
(iv) Find β = Pr(Type II error) when n = 5, p_0 = 0.5 and p_1 = 0.6.

END OF EXAMINATION PAPER
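Parts (iii) and (iv) of question 6 lend themselves to an exact binomial computation; the sketch below is an editorial postscript, not part of the paper. Because Σ X_i is discrete, the smallest attainable size at k = 4 is 1/32 ≈ 0.031, below the nominal α = 0.05:

```python
from math import comb

def tail(k, n, p):
    """P(T > k) for T ~ Binomial(n, p)."""
    return sum(comb(n, t) * p**t * (1 - p) ** (n - t) for t in range(k + 1, n + 1))

n, p0, p1, alpha = 5, 0.5, 0.6, 0.05

# Smallest k with P(reject | H0) = P(T > k | p0) <= alpha
k = next(k for k in range(n + 1) if tail(k, n, p0) <= alpha)

beta = 1 - tail(k, n, p1)   # P(Type II error) = P(T <= k | p1)
print(k, beta)
```

The test therefore rejects only when all five trials succeed, and β = 1 − 0.6^5 ≈ 0.922 against p_1 = 0.6, reflecting how little power five Bernoulli trials provide.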