Probability and Statistics qualifying exam, May 2015

Name:

Instructions:

1. The exam is divided into three sections: Linear Models, Mathematical Statistics, and Probability. You must pass each section to pass the exam.

2. M.S. students: in the Mathematical Statistics section, please choose a total of 4 questions (say, 4 from Group 1 plus 0 from Group 2, or 2 from Group 1 plus 2 from Group 2). Ph.D. students: do the 4 questions from Group 2.

3. M.S. students: in the Probability section, please choose a total of 4 questions (say, 4 from Group 1 plus 0 from Group 2, or 2 from Group 1 plus 2 from Group 2). Ph.D. students: do the 4 questions from Group 2.

4. Remark: we consider an M.S. student to be someone whose goal is to get an M.S. degree in Statistics, irrespective of whether (s)he is also enrolled in the Ph.D. program.

5. Please justify all your answers.

6. Good luck!

Linear Models

1. Consider data (X_i, Y_i) ∈ R^2 related by

       Y_i = β_0 + β_1 X_i + ε_i,   i = 1, ..., n,

   where β_0, β_1 ∈ R are constants, the ε_i ~ N(0, σ^2) are i.i.d., and σ^2 is some known constant.

   (a) Write the least squares estimate β̂ of β = [β_0, β_1]' in matrix form and identify its distribution.
   (b) Under what conditions are the estimates β̂_0, β̂_1 uncorrelated?
   (c) Suppose you have a new X_0 and believe that Y_0 = β_0 + β_1 X_0 + ε_0, where ε_0 ~ N(0, σ^2) is independent of all other ε_i. Find a 95% prediction interval for Y_0.

2. Consider a model Y = Xβ + ε, where

       X = [1 1; 1 1; 1 0]   (a 3 × 2 matrix, written row by row),
       β = [β_0, β_1]',  and  ε ~ N(0, σ^2 I).

   (a) Demonstrate that this model's parameterization is identifiable.
   (b) Describe how you would test the hypothesis β_1 = 0 if σ^2 is a known constant. What is the rejection region for a size-α test?
   (c) Describe an estimate of σ^2 and prove that it is (un)biased.

3. Let X ∈ R^{m×n} be some nonzero matrix and M the perpendicular projection operator (PPO) onto C(X).

   (a) Prove that tr(M) = r(X).
   (b) Prove that 0 ≤ M_ii ≤ 1.
   (c) Prove that if M has full rank, then M = I.

4. Let Y ~ N(μ, Σ), where Y = [Y_1, Y_2, Y_3]', μ = [0, 0, 0]', and

       Σ = [σ^2 + ρ, ρ, 0; ρ, σ^2 + ρ, 0; 0, 0, σ^2] = σ^2 I + ρ e e',

   where ρ is not necessarily positive and e = [1, 1, 0]'.

   (a) For a given σ^2 > 0, which restrictions on ρ ensure that Σ is positive definite?
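As a numerical aside to Problem 1(a): the least squares estimate solves the normal equations X'X β̂ = X'Y. A minimal sketch in Python; the 3-point design matrix and responses below are my own illustration, not part of the exam.

```python
# Toy check of the least squares estimate: solve the normal equations
# X'X beta = X'Y for a small dataset with an intercept column.
# (The data are illustrative, not from the exam.)

X = [[1, 0], [1, 1], [1, 2]]   # design matrix: intercept and one covariate
Y = [1, 2, 3]                  # responses lying exactly on y = 1 + x

# Form X'X (2x2) and X'Y (2x1) explicitly.
XtX = [[sum(X[i][r] * X[i][c] for i in range(3)) for c in range(2)]
       for r in range(2)]
XtY = [sum(X[i][r] * Y[i] for i in range(3)) for r in range(2)]

# Solve the 2x2 system via Cramer's rule.
det = XtX[0][0] * XtX[1][1] - XtX[0][1] * XtX[1][0]
b0 = (XtY[0] * XtX[1][1] - XtX[0][1] * XtY[1]) / det
b1 = (XtX[0][0] * XtY[1] - XtY[0] * XtX[1][0]) / det

print(b0, b1)  # the points lie on y = 1 + x, so beta_hat = (1, 1)
```

Since the three points lie exactly on y = 1 + x, the solve recovers β̂ = (1, 1) with zero residuals.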

   (b) Suppose that μ and σ^2 are known and you have observed Y = y. Find an equation for the MLE ρ̂ of ρ. It may be helpful to note that

       det(A + uv') = (1 + v'A⁻¹u) det(A),
       (A + uv')⁻¹ = A⁻¹ − (A⁻¹ u v' A⁻¹) / (1 + v'A⁻¹u).
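The determinant and inverse identities quoted in the hint are the standard rank-one update (Sherman–Morrison) formulas, and they can be sanity-checked numerically. A minimal sketch with a hand-picked 2 × 2 example of my own (not from the exam):

```python
# Numeric check of det(A + uv') = (1 + v'A^{-1}u) det(A) and of the
# Sherman-Morrison inverse formula, for a hand-picked 2x2 example.

A = [[2.0, 0.0], [0.0, 3.0]]       # diagonal, so A^{-1} is immediate
Ainv = [[0.5, 0.0], [0.0, 1.0 / 3.0]]
u = [1.0, 1.0]
v = [1.0, 2.0]

# B = A + u v'
B = [[A[r][c] + u[r] * v[c] for c in range(2)] for r in range(2)]
detB = B[0][0] * B[1][1] - B[0][1] * B[1][0]

# Right-hand side of the determinant identity.
Ainv_u = [sum(Ainv[r][c] * u[c] for c in range(2)) for r in range(2)]
vAu = sum(v[r] * Ainv_u[r] for r in range(2))
detA = A[0][0] * A[1][1] - A[0][1] * A[1][0]
assert abs(detB - (1 + vAu) * detA) < 1e-12

# Sherman-Morrison: (A + uv')^{-1} = A^{-1} - A^{-1}uv'A^{-1}/(1 + v'A^{-1}u)
vAinv = [sum(v[r] * Ainv[r][c] for r in range(2)) for c in range(2)]
Binv_sm = [[Ainv[r][c] - Ainv_u[r] * vAinv[c] / (1 + vAu) for c in range(2)]
           for r in range(2)]

# Direct 2x2 inverse of B for comparison.
Binv = [[B[1][1] / detB, -B[0][1] / detB],
        [-B[1][0] / detB, B[0][0] / detB]]
for r in range(2):
    for c in range(2):
        assert abs(Binv_sm[r][c] - Binv[r][c]) < 1e-12
print("rank-one update identities verified")
```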

Mathematical Statistics

Group 1

1. Consider an i.i.d. sample X_1, ..., X_n ~ U[θ, θ + 1].

   (a) Prove that T(X) = (X_(1), X_(n)) is a minimal sufficient statistic for θ.
   (b) Is the statistic T(X) complete?

2. Let X ~ Bin(n, p), 0 < p < 1, and suppose that the prior distribution of p is Beta(α, β), i.e.

       π(p) = [Γ(α + β) / (Γ(α)Γ(β))] p^(α−1) (1 − p)^(β−1) 1_(0,1)(p),   α, β > 0,

   where Γ(α) = ∫_0^∞ t^(α−1) e^(−t) dt.

   (a) Find the posterior distribution of p, i.e., f(p | x).
   (b) Find the Bayes estimator of p under the squared loss function L(p, d) = (p − d)^2.

3. Consider an i.i.d. sample X_1, ..., X_n ~ f(x_1 | θ), where the density f(· | θ) is given by

       f(x_1 | θ) = θ x_1^(θ−1) 1_(0,1)(x_1),   θ > 1.

   Define the parameter g(θ) = θ^(−1).

   (a) Find the UMVU estimator of g(θ).
   (b) For a given n ∈ N, find a maximum likelihood estimator of g(θ). Describe its asymptotic distribution as n → ∞.

4. Consider an i.i.d. sample X_1, ..., X_n ~ U[0, θ], θ > 0, and the composite hypotheses

       H_0: θ ≤ θ_0  vs  H_1: θ > θ_0.

   For a fixed α ∈ (0, 1), find the UMP test.

Group 2

1. Let X_1, ..., X_n ~ N(μ, σ^2) be i.i.d., where μ = σ > 0.

   (a) Show that

       T(X) = (Σ_{i=1}^n X_i, Σ_{i=1}^n X_i^2)

       is minimal sufficient for this parametric family.
   (b) Is T(X) also complete? Please justify your answer.

2. Let X_1, ..., X_n ~ N(θ, aθ^2) be i.i.d., where θ > 0 and a > 0 is known.

   (a) Find an explicit expression for an EL (efficient likelihood) estimator θ̂_EL(X) of θ (this includes proving that θ̂_EL(X) is, indeed, an EL estimator).
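As a numerical aside to Group 1, Problem 2: Beta priors are conjugate for the binomial likelihood, so the posterior mean has a simple closed form, (α + x)/(α + β + n). The sketch below checks this against a brute-force grid computation; the parameter values α = 2, β = 3, n = 10, x = 4 are my own illustrative choices, not part of the exam.

```python
# Grid-based check of Beta-Binomial conjugacy: the posterior mean computed
# by numerical normalization should equal (alpha + x)/(alpha + beta + n).
# (Parameter values are illustrative, not from the exam.)

alpha, beta, n, x = 2.0, 3.0, 10, 4

# Unnormalized posterior on a fine midpoint grid over (0, 1):
# likelihood p^x (1-p)^(n-x) times prior p^(alpha-1) (1-p)^(beta-1).
N = 200_000
grid = [(i + 0.5) / N for i in range(N)]
post = [p ** (alpha + x - 1) * (1 - p) ** (beta + n - x - 1) for p in grid]

Z = sum(post)  # normalizing constant (up to the common grid spacing 1/N)
post_mean = sum(p * w for p, w in zip(grid, post)) / Z

exact = (alpha + x) / (alpha + beta + n)  # = 6/15 = 0.4
assert abs(post_mean - exact) < 1e-6
print(post_mean)
```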

   (b) Let g(θ) = log θ be a reparametrization. Show that g(θ̂_EL(X)) is asymptotically unbiased for g(θ).

3. Suppose we have two independent i.i.d. samples X_1, ..., X_n ~ Exp(λ_1) and Y_1, ..., Y_m ~ Exp(λ_2), where λ_1, λ_2 > 0.

   (a) Find the GLR test for the hypotheses

       H_0: λ_1 = λ_2  vs  H_1: λ_1 ≠ λ_2.

   (b) Show that the test in part (a) can be based on the statistic

       T(X, Y) = Σ_{i=1}^n X_i / (Σ_{i=1}^n X_i + Σ_{j=1}^m Y_j).

   (c) Find the distribution of T(X, Y) when H_0 is true.

4. Let X_1, ..., X_n, n ≥ 2, be an i.i.d. sample from a Poi(λ), λ > 0, parametric family.

   (a) Prove that E(S^2 | X̄) = X̄ a.s.
   (b) Use part (a) to establish that S^2 is not UMVU for λ.
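As a numerical aside to Group 2, Problem 3: under H_0 the two sample sums are independent Gamma variables with a common rate, so T(X, Y) is a ratio whose law does not depend on that rate, and its mean is n/(n + m). A seeded Monte Carlo sketch; the sample sizes, rate, and replication count are my own illustrative choices.

```python
import random

# Monte Carlo illustration: under H0 (common rate lam), the statistic
# T = sum(X) / (sum(X) + sum(Y)) has mean n / (n + m), consistent with
# its sums being Gamma(n, lam) and Gamma(m, lam).
# (Sample sizes, rate, and replication count are illustrative.)

random.seed(0)
n, m, lam, reps = 5, 3, 2.0, 20_000

ts = []
for _ in range(reps):
    sx = sum(random.expovariate(lam) for _ in range(n))
    sy = sum(random.expovariate(lam) for _ in range(m))
    ts.append(sx / (sx + sy))

t_mean = sum(ts) / reps
assert abs(t_mean - n / (n + m)) < 0.01   # n/(n+m) = 0.625
print(t_mean)
```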

Probability

Group 1

1. For Borel sets A, B, define d(A, B) = P(A △ B), where A △ B = (A ∪ B) \ (A ∩ B). Let {A_n}_{n∈N}, A be a collection of Borel sets. Prove that d(A_n, A) → 0 if and only if 1_{A_n} → 1_A in the L^2(P) sense.

2. Suppose that {A_n}_{n∈N} are independent events such that P(A_n) < 1 for all n ∈ N. Prove that

       P(∪_{n=1}^∞ A_n) = 1  ⟹  P(A_n i.o.) = 1.

3. Let X ~ Cauchy(0, 1), i.e.,

       f_X(x) = (1/π) · 1/(1 + x^2),   x ∈ R.

   Find the density of the random variable Y = 1/(1 + X^2).

4. Prove the classical central limit theorem. In other words, let {X_n}_{n∈N} be an i.i.d. sequence such that EX_1 = μ and Var(X_1) = σ^2. Then

       √n (X̄_n − μ)/σ →_d N(0, 1),   n → ∞,

   where X̄_n = (1/n) Σ_{i=1}^n X_i.

Group 2

1. Let {X_n}_{n∈N} be an i.i.d. sequence of non-degenerate random variables defined on a given probability space. Show that P(X_n converges) = 0.

2. Let {X_n}_{n∈N}, X be random variables, and let {F_n}_{n∈N}, F be their respective distribution functions. Assume that X_n →_d X, and that F is continuous. Prove that

       sup_{x∈R} |F_n(x) − F(x)| → 0,   n → ∞.

3. Please answer the following questions.

   (a) Let X and Y be two absolutely continuous random variables with joint density f_{X,Y}(x, y). Show that a conditional density f_{X|Y}(x | y) exists and establish its form.
   (b) Assume that P(W = c) = 1 for some c ∈ R, and let Z be some other random variable. Show that P(W = c | Z = z) = 1 for all z ∈ R.

4. Let {X_n}_{n∈N} be an i.i.d. sequence on a given probability space such that P(X_n = 0) = 1/2 = P(X_n = 1) for all n ∈ N. Show that the random variable

       X := Σ_{n=1}^∞ X_n / 2^n

   is well-defined and find its distribution.

End of exam.
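Group 2, Problem 4 can be illustrated deterministically: truncating the series after k fair bits produces every dyadic rational j/2^k from exactly one of the 2^k equally likely bit patterns, so the truncation is uniform on the dyadic grid. A minimal sketch (the choice k = 10 is my own, arbitrary):

```python
# Deterministic illustration: enumerate all 2^k equally likely bit
# patterns (X_1, ..., X_k) and compute the truncated sum
# sum_{i<=k} X_i / 2^i.  Each dyadic value j / 2^k occurs exactly once,
# i.e. the truncation is uniform on {0, 1/2^k, ..., (2^k - 1)/2^k}.

k = 10
values = []
for pattern in range(2 ** k):
    # bit i (most significant first) plays the role of X_i
    bits = [(pattern >> (k - i)) & 1 for i in range(1, k + 1)]
    values.append(sum(b / 2 ** i for i, b in enumerate(bits, start=1)))

expected = [j / 2 ** k for j in range(2 ** k)]
assert sorted(values) == expected   # dyadic fractions are exact in floats
print("truncated sums are exactly uniform on the dyadic grid")
```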