Statistics Ph.D. Qualifying Exam: Part II November 9, 2002


Student Name:

1. Answer 8 out of 12 problems. Mark the problems you selected in the following table.

   1   2   3   4   5   6   7   8   9   10   11   12

2. Write your answer right after each problem selected; attach more pages if necessary.

3. Assemble your work in the right order, following the original problem order.

1. Let {X_1, ..., X_n} be a random sample from the population with density f(x, θ) = e^{-(x-θ)} for x > θ, and f(x, θ) = 0 otherwise. Define Y_i = i(X_{(n-i+1)} - X_{(n-i)}) for i = 1, ..., n-1 and Y_n = n(X_{(1)} - θ), where X_{(r)} is the r-th order statistic, r = 1, ..., n. (a) Show that {Y_i, i = 1, ..., n} are independent and identically distributed random variables with density g(y) = e^{-y} for y > 0, and g(y) = 0 for y ≤ 0. Use this result to derive the UMVUE (Uniformly Minimum Variance Unbiased Estimator) of θ. (b) Suppose the prior density is h(θ) = e^{-(a-θ)} for θ < a, and h(θ) = 0 for θ ≥ a, where a is a known constant. Derive the Bayes estimator of θ and compare it with the UMVUE of θ.
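The claim in part (a) can be sanity-checked by simulation: for a shifted-exponential sample, the normalized spacings are again i.i.d. standard exponential (Rényi's representation). A quick numerical sketch (illustrative, not part of the exam; the constants are arbitrary choices):

```python
import random

random.seed(0)
n, theta, reps = 6, 2.0, 50_000
# collect samples of each normalized spacing Y_1, ..., Y_n
y_samples = [[] for _ in range(n)]
for _ in range(reps):
    xs = sorted(theta + random.expovariate(1.0) for _ in range(n))
    for i in range(1, n):                               # Y_i = i (X_(n-i+1) - X_(n-i))
        y_samples[i - 1].append(i * (xs[n - i] - xs[n - i - 1]))
    y_samples[n - 1].append(n * (xs[0] - theta))        # Y_n = n (X_(1) - theta)
# each Y_i should look like a standard exponential, hence have mean 1
means = [sum(col) / reps for col in y_samples]
```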

2. Let {X_1, ..., X_n} be a random sample from the population with density g(x; θ, µ_1, µ_2) = θ f(x; µ_1) + (1 - θ) f(x; µ_2), where f(x; µ) is the density of N(µ, 1) and 0 < θ < 1. Illustrate how to derive a procedure to compute the MLE (Maximum Likelihood Estimator) of {θ, µ_1, µ_2} using the EM algorithm.
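A minimal numerical sketch of the EM iteration for this two-component normal mixture (illustrative, not part of the exam; function names, starting values, and the simulated data are my own choices):

```python
import math, random

def em_mixture(xs, theta=0.5, mu1=0.0, mu2=1.0, iters=200):
    """EM for g(x) = theta*N(mu1, 1) + (1 - theta)*N(mu2, 1)."""
    phi = lambda x, mu: math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)
    for _ in range(iters):
        # E-step: posterior probability that each x came from component 1
        w = [theta * phi(x, mu1) / (theta * phi(x, mu1) + (1 - theta) * phi(x, mu2))
             for x in xs]
        # M-step: weighted-average updates of the three parameters
        theta = sum(w) / len(xs)
        mu1 = sum(wi * x for wi, x in zip(w, xs)) / sum(w)
        mu2 = sum((1 - wi) * x for wi, x in zip(w, xs)) / sum(1 - wi for wi in w)
    return theta, mu1, mu2

random.seed(0)
# simulate from a mixture with theta = 0.4, mu1 = -2, mu2 = 2
data = [random.gauss(-2, 1) if random.random() < 0.4 else random.gauss(2, 1)
        for _ in range(2000)]
theta_hat, mu1_hat, mu2_hat = em_mixture(data)
```

Each iteration weakly increases the observed-data likelihood, which is the standard justification for the procedure.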

3. Let {X_1, ..., X_n} be a random sample from the population with density f(x; θ_1, θ_2) = 1/(θ_2 - θ_1) if θ_1 ≤ x ≤ θ_2, and f(x; θ_1, θ_2) = 0 otherwise. (a) Show that the statistics X_{(1)} = min(X_1, ..., X_n) and X_{(n)} = max(X_1, ..., X_n) are sufficient and complete for the parameters {θ_1, θ_2}. (b) Derive the UMVUE of θ_2 - θ_1.
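As a numerical aside (illustrative, not part of the exam): for a Uniform(θ_1, θ_2) sample, E[X_{(n)} - X_{(1)}] = (θ_2 - θ_1)(n - 1)/(n + 1), so the range scaled by (n + 1)/(n - 1) is an unbiased candidate based on the pair (X_{(1)}, X_{(n)}); a simulation with arbitrary constants confirms the unbiasedness:

```python
import random

random.seed(0)
n, th1, th2, reps = 8, 1.0, 4.0, 50_000
# scale the sample range by (n+1)/(n-1) to correct its downward bias
ests = []
for _ in range(reps):
    xs = [random.uniform(th1, th2) for _ in range(n)]
    ests.append((n + 1) / (n - 1) * (max(xs) - min(xs)))
mean_est = sum(ests) / reps   # should be near theta2 - theta1 = 3
```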

4. Let U be a Uniform(0, 1) random variable, let λ be a constant with 0 < λ < 1, and let V be a random variable with support (0, 1) that is independent of U. (a) Prove that min(U/λ, (1-U)/(1-λ)) has a Uniform(0, 1) distribution. (b) Find P(min(U/V, (1-U)/(1-V)) > 0.5). (c) Suppose X ~ N(0, 4) and Z ~ N(0, 1), and let Φ denote the standard normal distribution function. Find a relation between the distribution functions of 2 min(Φ(X), 1 - Φ(X)) and 2 min(Φ(Z), 1 - Φ(Z)).
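A small simulation checking the Uniform(0, 1) claim in part (a) by comparing the empirical CDF of min(U/λ, (1-U)/(1-λ)) with the identity at a few points (illustrative, not part of the exam; λ = 0.3 is an arbitrary choice):

```python
import random

random.seed(0)
lam = 0.3
n = 100_000
# simulate T = min(U/lam, (1-U)/(1-lam))
samples = [min(u / lam, (1 - u) / (1 - lam))
           for u in (random.random() for _ in range(n))]
# empirical CDF of T; for a Uniform(0,1) variable it should be close to t itself
ecdf = lambda t: sum(s <= t for s in samples) / n
checks = {t: ecdf(t) for t in (0.25, 0.5, 0.75)}
```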

5. Let Y_1, ..., Y_N be independent random variables such that Y_i ~ Binomial(n_i, p_i), where p_i = 1/(1 + e^{-β'x_i}) and the x_i are vectors of fixed covariates, i = 1, ..., N. (a) Find a set of jointly sufficient statistics for β. (b) Suppose β̃ is the estimate of β that minimizes Q = Σ_{i=1}^{N} n_i p̂_i (1 - p̂_i) (x_i'β - l(p̂_i))^2, where p̂_i = Y_i/n_i and l(p̂_i) = log(p̂_i/(1 - p̂_i)), i = 1, ..., N. Find β̃. (c) Let β̂ be the MLE of β. Compare the two methods of parameter estimation. Which is better? Sketch a proof of your statement.
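For a scalar covariate, minimizing Q in part (b) is a weighted least-squares regression of the empirical logit l(p̂_i) on x_i, giving β̃ = Σ w_i x_i l(p̂_i) / Σ w_i x_i² with weights w_i = n_i p̂_i(1 - p̂_i); groups with p̂_i ∈ {0, 1} receive zero weight and drop out. A sketch assuming the model logit(p_i) = β'x_i, with illustrative names and simulated data (not part of the exam):

```python
import math, random

def wls_beta(ys, ns, xs):
    """Minimizer of Q = sum n_i phat_i (1 - phat_i) (x_i*beta - logit(phat_i))^2
    for a scalar covariate: weighted least squares of logit(phat_i) on x_i."""
    num = den = 0.0
    for y, n, x in zip(ys, ns, xs):
        phat = y / n
        w = n * phat * (1 - phat)          # weight; zero when phat is 0 or 1
        if w > 0:
            num += w * x * math.log(phat / (1 - phat))
            den += w * x * x
    return num / den

random.seed(0)
beta_true = 0.8
xs = [random.uniform(-2, 2) for _ in range(200)]
ns = [50] * 200
# binomial counts with success probability 1/(1 + exp(-beta*x))
ys = [sum(random.random() < 1 / (1 + math.exp(-beta_true * x)) for _ in range(n))
      for x, n in zip(xs, ns)]
beta_tilde = wls_beta(ys, ns, xs)
```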

6. Let X_1 and X_2 be independent and identically distributed with probability mass function p(x) = P(X = x) = p q^x, x = 0, 1, 2, ..., where 0 < p < 1 and q = 1 - p. (a) Find the UMVUE of (1 + p)^2. (b) Find the UMVUE of (1 + p)^2/p.

7. Let Y 1, Y 2,..., Y n, X 1, X 2 be i.i.d. with continuous distribution function F. Define Y (0) =, Y (n+1) = + and for j = 1, 2,..., n let Y (j) be the jth smallest Y j. (a) Find P [Y (j) X 1 Y (j+1) ] for j = 0, 1,..., n. (b) Find P [Y (j) X 1 Y (j+1) X 2 Y (j+2) ] for j = 0, 1,..., n 1.

8. Consider independent random samples X_{i1}, ..., X_{in} (i = 1, 2) from the uniform populations f(x_i) = 1/θ_i, 0 < x_i < θ_i, θ_i > 0, i = 1, 2. Let Y_i = max(X_{i1}, ..., X_{in}), i = 1, 2, and Y = max(Y_1, Y_2). (a) Show that the likelihood ratio statistic for testing the hypothesis H_0: θ_1 = θ_2 is given by Z = (Y_1 Y_2 / Y^2)^n. (b) Obtain the exact distribution of -2 log Z under H_0.
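One can check numerically that, under H_0, the statistic -2 log Z with Z = (Y_1 Y_2 / Y^2)^n has mean 2 and P(-2 log Z ≤ 2) = 1 - e^{-1} ≈ 0.632, matching a χ²(2) law. A Monte Carlo sketch (illustrative, not part of the exam; the sample size and θ are arbitrary):

```python
import math, random

random.seed(0)
n, theta, reps = 5, 1.0, 50_000
stats = []
for _ in range(reps):
    # two independent uniform samples with a common theta (H_0 holds)
    y1 = max(random.uniform(0, theta) for _ in range(n))
    y2 = max(random.uniform(0, theta) for _ in range(n))
    y = max(y1, y2)
    z = (y1 * y2 / y**2) ** n
    stats.append(-2.0 * math.log(z))
mean_stat = sum(stats) / reps                   # chi^2(2) mean is 2
p_le_2 = sum(s <= 2.0 for s in stats) / reps    # chi^2(2) CDF at 2 is 1 - e^{-1}
```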

9. Let X denote a nonnegative integer-valued random variable. The function λ(n) = P{X = n | X ≥ n}, n ≥ 0, is called the discrete hazard rate function. (a) Show that P{X = n} = λ(n) ∏_{i=0}^{n-1} (1 - λ(i)). (b) Show that we can simulate X by generating independent Uniform(0, 1) random variables U_1, U_2, ..., stopping at X = min{n : U_n ≤ λ(n)}. (c) Apply this method to simulating a geometric random variable. Explain why it works. (d) Suppose that λ(n) ≤ p < 1 for all n. Consider the following algorithm for simulating X and explain why it works: simulate X_i, U_i, i ≥ 1, where X_i is geometric with mean 1/p and U_i is a Uniform(0, 1) random variable; set S_k = X_1 + ... + X_k and let X = min{S_k : U_k ≤ λ(S_k)/p}.
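Part (c) can be checked numerically: for the geometric pmf p q^x on x = 0, 1, 2, ..., the survival probability is P(X ≥ n) = q^n, so the hazard rate is constant, λ(n) = p q^n / q^n = p, and the stopping rule returns the first index at which a uniform falls below p. A minimal sketch (illustrative, not part of the exam; names are my own):

```python
import random

def sim_hazard(lam):
    """Simulate X with discrete hazard rate lam(n): first n with U_n <= lam(n)."""
    n = 0
    while True:
        if random.random() <= lam(n):
            return n
        n += 1

random.seed(0)
p = 0.3
# geometric case: constant hazard lam(n) = p
draws = [sim_hazard(lambda n: p) for _ in range(100_000)]
mean_draw = sum(draws) / len(draws)   # should be near q/p = 0.7/0.3
```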

10. Suppose that Λ is distributed as V/s_0, where s_0 is some constant and V ~ χ²(k), and that, given Λ = λ, X_1, ..., X_n are i.i.d. Poisson distributed with parameter λ. Let T = Σ_{i=1}^{n} X_i. (a) Show that (Λ | T = t) is distributed as W/s, where s = s_0 + 2n and W ~ χ²(m) with m = k + 2t. (b) Show how quantiles of the χ² distribution can be used to determine level (1 - α) upper and lower limits for λ.
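Part (a) is a gamma-Poisson conjugate update written on the χ² scale, since χ²(k)/s_0 is a Gamma(k/2, scale 2/s_0) variable. A Monte Carlo sketch that conditions on T = t and compares the conditional mean with the claimed m/s (illustrative, not part of the exam; the constants and helper names are my own choices):

```python
import math, random

random.seed(0)
k, s0, n = 4.0, 2.0, 3

def chi2(df):
    # chi-square(df) is Gamma(shape = df/2, scale = 2)
    return random.gammavariate(df / 2.0, 2.0)

def poisson(mu):
    # Knuth's multiplication method (adequate for moderate mu)
    L, x, prod = math.exp(-mu), 0, random.random()
    while prod > L:
        prod *= random.random()
        x += 1
    return x

# draw (Lambda, T) pairs and keep the Lambda values with T == t_star
t_star, hits = 2, []
for _ in range(200_000):
    lam = chi2(k) / s0
    t = poisson(n * lam)
    if t == t_star:
        hits.append(lam)

# claim: Lambda | T = t is W/s with W ~ chi2(k + 2t) and s = s0 + 2n,
# so its mean is m/s = (k + 2t)/(s0 + 2n) = 1 for these constants
s, m = s0 + 2 * n, k + 2 * t_star
post_mean = sum(hits) / len(hits)
```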

11. Let (X_i, Y_i), i = 1, 2, ..., n, be n i.i.d. random vectors, where X_i ~ N(θ, 1), P(Y_i = 1) = p, P(Y_i = 0) = 1 - p, and X_i and Y_i are independent. Suppose that we can only observe the products Z_i = X_i Y_i, i = 1, 2, ..., n. (a) Find a minimal sufficient statistic for (p, θ). (b) Find the maximum likelihood estimators of p and θ.

12. Let X_1, ..., X_n be a random sample of size n from a population with density f(x; θ) = (x/θ²) e^{-x²/(2θ²)}, x > 0. Let Y_1, ..., Y_n be a random sample of size n from a population with density g(y; µ) = (1/µ) e^{-y/µ}, y > 0. (a) Find the uniformly most powerful test of H_0: θ ≤ θ_0 versus H_1: θ > θ_0 based on the sample {X_1, ..., X_n}. (b) Find the uniformly most powerful test of H_0: µ ≤ µ_0 versus H_1: µ > µ_0 based on the sample {Y_1, ..., Y_n}. (c) Consider the transformation Z_i = X_i² and then draw a connection between parts (a) and (b).
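As a numerical aside on part (c): if X has density f(x; θ) = (x/θ²) e^{-x²/(2θ²)}, then P(X² ≤ z) = 1 - e^{-z/(2θ²)}, so Z = X² is exponential with mean 2θ², which is what links the two testing problems. A quick check by inverse-transform sampling (illustrative, not part of the exam; θ = 1.5 is arbitrary):

```python
import math, random

random.seed(0)
theta = 1.5
# inverse transform: F(x) = 1 - exp(-x^2/(2 theta^2)) gives X = theta*sqrt(-2 ln U)
xs = [theta * math.sqrt(-2.0 * math.log(1 - random.random()))
      for _ in range(100_000)]
zs = [x * x for x in xs]
mean_z = sum(zs) / len(zs)   # exponential mean should be 2 theta^2 = 4.5
```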