Math 152. Rumbos. Fall 2009. Solutions to Assignment #12


1. Suppose that you observe $n$ iid Bernoulli($p$) random variables, denoted by $X_1, X_2, \ldots, X_n$. Find the LRT rejection region for the test of $H_o\colon p \leq p_o$ versus $H_1\colon p > p_o$ in terms of the test statistic $Y = \sum_{i=1}^n X_i$.

Solution: The likelihood ratio statistic is

$$\Lambda(x_1, x_2, \ldots, x_n) = \frac{\sup_{p \leq p_o} L(p \mid x_1, x_2, \ldots, x_n)}{L(\hat{p} \mid x_1, x_2, \ldots, x_n)},$$

where

$$L(p \mid x_1, x_2, \ldots, x_n) = p^y (1 - p)^{n - y}, \quad \text{for } y = \sum_{i=1}^n x_i,$$

is the likelihood function, and

$$\hat{p} = \frac{y}{n} = \bar{x}$$

is the MLE for $p$.

Observe that if $p_o \geq \hat{p}$, then

$$\sup_{p \leq p_o} L(p \mid x_1, x_2, \ldots, x_n) = L(\hat{p} \mid x_1, x_2, \ldots, x_n),$$

and so $\Lambda(x_1, x_2, \ldots, x_n) = 1$; thus, in this case we would not get a rejection region for the LRT,

$$R\colon \Lambda(x_1, x_2, \ldots, x_n) \leq c, \quad \text{for some } 0 < c < 1.$$

We therefore assume that $p_o < \hat{p}$, so that

$$\Lambda(x_1, x_2, \ldots, x_n) = \frac{L(p_o \mid x_1, x_2, \ldots, x_n)}{L(\hat{p} \mid x_1, x_2, \ldots, x_n)} = \frac{p_o^y (1 - p_o)^{n - y}}{\hat{p}^{\,y} (1 - \hat{p})^{n - y}},$$

where $\hat{p} > p_o$. This can in turn be written as

$$\Lambda(x_1, x_2, \ldots, x_n) = \left(\frac{p_o}{\hat{p}}\right)^{y} \left(\frac{1 - p_o}{1 - \hat{p}}\right)^{n - y}.$$

Setting $t = \hat{p}/p_o$, we see that $\Lambda$ can be written as a function of $t$ as

$$\Lambda(t) = \begin{cases} 1, & \text{for } 0 \leq t \leq 1; \\ t^{-n p_o t} \left(\dfrac{1 - p_o}{1 - p_o t}\right)^{n - n p_o t}, & \text{for } 1 < t \leq \dfrac{1}{p_o}, \end{cases}$$

since $\hat{p} > p_o$, where we have used the fact that $\hat{p} = y/n$, so that $y = n p_o t$. The graph of $\Lambda(t)$ looks like the one sketched in Figure 1, where we show the case $p_o = 1/4$ and $n = 20$ for $0 \leq t \leq 4$.

The sketch in Figure 1 shows that $\Lambda(t)$ decreases for $t > 1$; thus, given any positive value of $c$ with $p_o^n < c < 1$, there exists a value $t_2 > 1$ such that $\Lambda(t_2) = c$ and $\Lambda(t) \leq c$ for $t \geq t_2$. Thus, the LRT rejection region for the test of $H_o\colon p \leq p_o$ versus $H_1\colon p > p_o$ is equivalent to

$$\frac{\hat{p}}{p_o} \geq t_2, \quad \text{for some } t_2 > 1,$$

which we can rephrase in terms of $Y = \sum_{i=1}^n X_i$ as

$$R\colon Y \geq t_2\, n p_o.$$

This rejection region can also be phrased as

$$R\colon Y > n p_o + b,$$

for some $b > 0$. The value of $b$ is then determined by the significance level that we want to impose on the test, and $Y = \sum_{i=1}^n X_i$ is the statistic that counts the number of successes in the sample.

[Figure 1: Sketch of the graph of $\Lambda(t)$ for $p_o = 1/4$ and $n = 20$ over $0 \leq t \leq 4$: $\Lambda(t) = 1$ on $[0, 1]$ and decreases toward $0$ for $t > 1$.]
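As a numerical sanity check on this rejection region (a Python sketch, not part of the original solution; the values $p_o = 1/4$, $n = 20$, and $c = 0.1$ are illustrative choices), one can evaluate $\Lambda(t)$ and locate $t_2$ by bisection:

```python
def lam(t, p0=0.25, n=20):
    """Likelihood ratio Lambda(t) from Problem 1, where t = p-hat / p0."""
    if t <= 1.0:
        return 1.0                # the sup in the numerator is attained at p-hat
    y = n * p0 * t                # p-hat = y/n = p0 * t, so y = n * p0 * t
    return t ** (-y) * ((1 - p0) / (1 - p0 * t)) ** (n - y)

# Lambda decreases on (1, 1/p0), so Lambda(t2) = c can be solved by bisection.
p0, n, c = 0.25, 20, 0.1
lo, hi = 1.0, 1.0 / p0 - 1e-9     # stay just inside the right endpoint
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if lam(mid, p0, n) > c:
        lo = mid
    else:
        hi = mid
t2 = 0.5 * (lo + hi)
print(f"t2 = {t2:.4f}; reject H0 when Y >= n*p0*t2 = {n * p0 * t2:.2f}")
```

Since $Y$ is integer-valued, the region $Y \geq t_2\, n p_o$ amounts in practice to $Y \geq \lceil t_2\, n p_o \rceil$.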

2. Consider the likelihood ratio test for $H_o\colon p = p_o$ versus $H_1\colon p = p_1$, where $p_1 \neq p_o$, based on a random sample $X_1, X_2, \ldots, X_n$ from a Bernoulli($p$) distribution for $0 < p < 1$. Show that, if $p_1 > p_o$, then the likelihood ratio statistic for the test is a monotonically decreasing function of $Y = \sum_{i=1}^n X_i$. Conclude, therefore, that if the test rejects $H_o$ at significance level $\alpha$ for an observed value $\hat{y}$ of $Y$, it will also reject $H_o$ at that level for $Y > \hat{y}$.

Solution: The likelihood ratio statistic in this case is

$$\Lambda(x_1, x_2, \ldots, x_n) = \frac{p_o^y (1 - p_o)^{n - y}}{p_1^y (1 - p_1)^{n - y}}, \quad \text{for } y = \sum_{i=1}^n x_i,$$

which can be written as

$$\Lambda(x_1, x_2, \ldots, x_n) = a^n r^y,$$

where

$$a = \frac{1 - p_o}{1 - p_1} > 0 \quad \text{and} \quad r = \frac{p_o (1 - p_1)}{p_1 (1 - p_o)} < 1,$$

since $p_1 > p_o$. It then follows that the likelihood ratio statistic for the test is a monotonically decreasing function of $Y = \sum_{i=1}^n X_i$, since $r < 1$.

Suppose now that the test rejects $H_o$ at significance level $\alpha$ for an observed value $\hat{y}$ of $Y$; that is,

$$\Lambda(\hat{y}) \leq c,$$

for some $c$ in $(0, 1)$ determined by $\alpha$. Then, for any value of $y$ bigger than $\hat{y}$,

$$\Lambda(y) < \Lambda(\hat{y}) \leq c, \tag{1}$$

since $\Lambda$ is a decreasing function of $y$. It then follows from (1) that the LRT will also reject $H_o$ at that level for $Y > \hat{y}$.
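The monotonicity claim is easy to confirm numerically. In this sketch the values $p_o = 0.3$, $p_1 = 0.6$, and $n = 10$ are arbitrary illustrative choices:

```python
p0, p1, n = 0.3, 0.6, 10           # illustrative values with p1 > p0
a = (1 - p0) / (1 - p1)
r = (p0 * (1 - p1)) / (p1 * (1 - p0))
assert r < 1                        # the key fact behind the monotonicity

lam = [a ** n * r ** y for y in range(n + 1)]
# Lambda strictly decreases in y, so {Lambda <= c} has the form {Y >= cutoff}.
assert all(lam[y] > lam[y + 1] for y in range(n))
print([round(v, 4) for v in lam])
```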

3. We wish to use an LRT to test the hypothesis $H_o\colon \mu = \mu_o$ against the alternative $H_1\colon \mu \neq \mu_o$ based on a random sample, $X_1, X_2, \ldots, X_n$, from a normal($\mu$, 1) distribution.

(a) Give the maximum likelihood estimator, $\hat{\mu}$, for $\mu$ based on the sample.

Solution: The likelihood function in this problem is

$$L(\mu \mid x_1, x_2, \ldots, x_n) = \frac{1}{(2\pi)^{n/2}}\, e^{-\sum_{i=1}^n (x_i - \mu)^2 / 2}, \quad \text{for } \mu \in \mathbb{R}.$$

To find the MLE for $\mu$, it suffices to maximize the natural logarithm of the likelihood function,

$$\ell(\mu) = -\frac{1}{2} \sum_{i=1}^n (x_i - \mu)^2 - \frac{n}{2} \ln(2\pi), \quad \text{for } \mu \in \mathbb{R}.$$

Taking derivatives, we get

$$\ell'(\mu) = \sum_{i=1}^n (x_i - \mu) = \sum_{i=1}^n x_i - n\mu$$

and

$$\ell''(\mu) = -n < 0.$$

It then follows that $\hat{\mu} = \dfrac{1}{n} \sum_{i=1}^n x_i = \bar{x}$ is the only critical point of $\ell$, and $\ell''(\hat{\mu}) < 0$. Thus, the likelihood function is maximized at $\hat{\mu} = \bar{x}$, the sample mean.
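As a quick check on part (a) (an illustrative sketch assuming numpy is available; the simulated sample and the grid search are not part of the solution), a direct numerical maximization of $\ell$ recovers $\bar{x}$:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=50)   # simulated normal(mu = 2, 1) sample

def loglik(mu):
    """Log-likelihood l(mu) for a normal(mu, 1) sample, up to the constant."""
    return -0.5 * np.sum((x - mu) ** 2)

grid = np.linspace(x.min(), x.max(), 100001)
mu_numeric = grid[np.argmax([loglik(m) for m in grid])]
print(mu_numeric, x.mean())   # the grid maximizer matches x-bar to grid accuracy
```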

(b) Give the likelihood ratio statistic for the test.

Solution: The likelihood ratio statistic is

$$\Lambda(x_1, x_2, \ldots, x_n) = \frac{L(\mu_o \mid x_1, x_2, \ldots, x_n)}{L(\hat{\mu} \mid x_1, x_2, \ldots, x_n)},$$

where $\hat{\mu} = \bar{x}$ is the MLE for $\mu$. We then have that

$$\Lambda(x_1, x_2, \ldots, x_n) = \frac{e^{-\sum_{i=1}^n (x_i - \mu_o)^2 / 2}}{e^{-\sum_{i=1}^n (x_i - \bar{x})^2 / 2}}, \tag{2}$$

where

$$\sum_{i=1}^n (x_i - \mu_o)^2 = \sum_{i=1}^n (x_i - \bar{x} + \bar{x} - \mu_o)^2 = \sum_{i=1}^n (x_i - \bar{x})^2 + n (\bar{x} - \mu_o)^2,$$

since

$$\sum_{i=1}^n 2 (x_i - \bar{x})(\bar{x} - \mu_o) = 2 (\bar{x} - \mu_o) \sum_{i=1}^n (x_i - \bar{x}) = 0.$$

Hence, from (2) we have that

$$\Lambda(x_1, x_2, \ldots, x_n) = e^{-n (\bar{x} - \mu_o)^2 / 2}, \tag{3}$$

where $\bar{x}$ denotes the sample mean.

(c) Express the LRT rejection region in terms of the sample mean $\overline{X}_n$.

Solution: The LRT rejection region is given by

$$R\colon \Lambda(x_1, x_2, \ldots, x_n) \leq c,$$

for some $c$ with $0 < c < 1$. It then follows from equation (3) in part (b) that $H_o$ is rejected if

$$e^{-n (\bar{x} - \mu_o)^2 / 2} \leq c,$$

or, taking the natural logarithm on both sides of the last inequality,

$$-n (\bar{x} - \mu_o)^2 / 2 \leq \ln c,$$

or

$$n (\bar{x} - \mu_o)^2 \geq -2 \ln c,$$

or

$$\sqrt{n}\, |\bar{x} - \mu_o| \geq \sqrt{-2 \ln c} \equiv b > 0.$$

Thus, the LRT will reject $H_o$ if

$$\sqrt{n}\, |\overline{X}_n - \mu_o| \geq b,$$

for some $b > 0$ determined by the significance level $\alpha$.
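To pin down $b$ for a given $\alpha$: under $H_o$, the statistic $\sqrt{n}\,(\overline{X}_n - \mu_o)$ has a standard normal distribution, so $b$ is the two-sided critical value $z_{\alpha/2}$. A brief sketch (assuming scipy is available; $\alpha = 0.05$ is an illustrative choice):

```python
from math import exp
from scipy.stats import norm

alpha = 0.05                        # illustrative significance level
# Under H0, sqrt(n)*(Xbar - mu0) ~ N(0, 1), so b is the two-sided critical value.
b = norm.ppf(1 - alpha / 2)         # b is about 1.96 for alpha = 0.05
c = exp(-b ** 2 / 2)                # the matching LRT cutoff, since c = e^{-b^2/2}
print(f"b = {b:.4f}, c = {c:.4f}")  # reject H0 when sqrt(n)*|Xbar - mu0| >= b
```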

4. Let $X_1, X_2, \ldots, X_n$ denote a random sample from a uniform($0, \theta$) distribution for some parameter $\theta > 0$.

(a) Give the likelihood function $L(\theta \mid x_1, x_2, \ldots, x_n)$.

Solution: The pdf for each of the $X_i$'s is

$$f(x \mid \theta) = \begin{cases} \dfrac{1}{\theta}, & \text{if } 0 < x < \theta; \\ 0, & \text{otherwise.} \end{cases}$$

Thus, the likelihood function is

$$L(\theta \mid x_1, x_2, \ldots, x_n) = \begin{cases} \dfrac{1}{\theta^n}, & \text{if } 0 < x_1, x_2, \ldots, x_n < \theta; \\ 0, & \text{otherwise.} \end{cases} \tag{4}$$

(b) Give the maximum likelihood estimator for $\theta$.

Solution: Observe that the likelihood function in (4) is a decreasing function of $\theta$. Thus, $L(\theta \mid x_1, x_2, \ldots, x_n)$ is largest when $\theta$ is the smallest value it can take based on the sample. This value is the maximum of the values $x_1, x_2, \ldots, x_n$, because if $x_i > \theta$ for some $i$, then $L(\theta \mid x_1, x_2, \ldots, x_n) = 0$ according to the definition of the likelihood function given in (4). It then follows that the MLE for $\theta$ is

$$\hat{\theta} = \max\{X_1, X_2, \ldots, X_n\}.$$
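A small simulation sketch of part (b) (the values $\theta = 3$ and $n = 200$ are arbitrary): the MLE $\hat{\theta} = \max\{X_1, X_2, \ldots, X_n\}$ sits just below the true $\theta$, since no observation can exceed $\theta$:

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 3.0                          # true parameter (illustrative)
x = rng.uniform(0.0, theta, size=200)
theta_hat = x.max()                  # the MLE from part (b)
print(theta_hat, theta)              # theta_hat slightly undershoots theta
```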

5. Let $R$ denote the rejection region for an LRT of $H_o\colon \theta = \theta_o$ versus $H_1\colon \theta = \theta_1$ based on a random sample, $X_1, X_2, \ldots, X_n$, from a continuous distribution with pdf $f(x \mid \theta)$. Let $L(\theta \mid x_1, x_2, \ldots, x_n)$ denote the likelihood function. Suppose the LRT has significance level $\alpha$.

(a) Explain why

$$\alpha = \int_R L(\theta_o \mid x_1, x_2, \ldots, x_n)\, dx_1\, dx_2 \cdots dx_n.$$

Answer: The significance level $\alpha$ is the probability that the LRT will reject $H_o$ when $H_o$ is true; in other words, when $\theta = \theta_o$. Thus,

$$\alpha = P\big((X_1, X_2, \ldots, X_n) \in R\big) = \int_R f(x_1, x_2, \ldots, x_n \mid \theta_o)\, dx_1\, dx_2 \cdots dx_n,$$

where $R$ is the rejection region of the LRT and $f(x_1, x_2, \ldots, x_n \mid \theta_o)$ is the joint distribution of the sample in the case in which $H_o$ is true. It then follows that

$$\alpha = \int_R L(\theta_o \mid x_1, x_2, \ldots, x_n)\, dx_1\, dx_2 \cdots dx_n,$$

by the definition of the likelihood function.

(b) Explain why the power of the test is

$$\gamma(\theta_1) = \int_R L(\theta_1 \mid x_1, x_2, \ldots, x_n)\, dx_1\, dx_2 \cdots dx_n.$$

Answer: The power of the test, $\gamma(\theta_1)$, is the probability that the LRT will reject $H_o$ when $H_o$ is false; in other words, when $\theta = \theta_1$. Thus,

$$\gamma(\theta_1) = \int_R f(x_1, x_2, \ldots, x_n \mid \theta_1)\, dx_1\, dx_2 \cdots dx_n,$$

which yields

$$\gamma(\theta_1) = \int_R L(\theta_1 \mid x_1, x_2, \ldots, x_n)\, dx_1\, dx_2 \cdots dx_n,$$

by the definition of the likelihood function.

(c) Explain why $\alpha \leq c\, \gamma(\theta_1)$, where $c$ is the critical value used in the definition of the rejection region, $R$, for the LRT.

Solution: Since $\Lambda(x_1, x_2, \ldots, x_n) \leq c$ on $R$, for some $c \in (0, 1)$ determined by $\alpha$, it follows that

$$L(\theta_o \mid x_1, x_2, \ldots, x_n) \leq c\, L(\theta_1 \mid x_1, x_2, \ldots, x_n)$$

for all $(x_1, x_2, \ldots, x_n) \in R$. Consequently,

$$\int_R L(\theta_o \mid x_1, x_2, \ldots, x_n)\, dx_1\, dx_2 \cdots dx_n \leq c \int_R L(\theta_1 \mid x_1, x_2, \ldots, x_n)\, dx_1\, dx_2 \cdots dx_n,$$

from which the result follows in view of parts (a) and (b) of this problem.
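The inequality in part (c) can be illustrated by Monte Carlo. Everything in this sketch is an assumed setup rather than part of the problem: a single normal($\theta$, 1) observation, $\theta_o = 0$, $\theta_1 = 1$, and rejection region $R = \{\Lambda \leq c\}$ with $c = 0.5$:

```python
import numpy as np

rng = np.random.default_rng(2)
theta0, theta1, c = 0.0, 1.0, 0.5   # illustrative simple-vs-simple setup, n = 1

def Lambda(x):
    """LR statistic f(x | theta0) / f(x | theta1) for one normal(theta, 1) draw."""
    return np.exp(-(x - theta0) ** 2 / 2 + (x - theta1) ** 2 / 2)

x0 = rng.normal(theta0, 1.0, size=1_000_000)   # draws under H0
x1 = rng.normal(theta1, 1.0, size=1_000_000)   # draws under H1
alpha = np.mean(Lambda(x0) <= c)               # size: P(reject | theta = theta0)
gamma = np.mean(Lambda(x1) <= c)               # power: P(reject | theta = theta1)
print(alpha, c * gamma, alpha <= c * gamma)    # checks alpha <= c * gamma(theta1)
```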