Stat 8111 Final Exam, December 16

Eleven students took the exam; the scores were 92, 78, 4 in the 50s, 1 in the 40s, 1 in the 30s, and 3 in the 20s.

1. i) Let X_1, X_2, ..., X_n be iid, each Bernoulli(θ), where θ ∈ Θ = [0, 1]. What is a complete sufficient statistic for θ? What real valued functions γ(θ) have unbiased estimators? For such a γ(θ), what is its best unbiased estimator?

ii) Suppose now n = 3 and Θ = {0, 1/2, 1}. What is a complete sufficient statistic for this problem? Justify your answer. What real valued functions γ(θ) have unbiased estimators? For such a γ(θ), what is its best unbiased estimator?

2. Let X be a discrete random variable taking on values in the set of non-negative integers. Suppose the probability functions f_θ for 0 < θ < 1 have the form

    f_θ(x) = θ^x h(θ) λ(x)   for x = 0, 1, ...

Let 0 < θ_0 < 1 be given and consider testing

    H: θ ≤ θ_0   against   K: θ > θ_0

Let d_0 be the decision "accept H" and d_1 the decision "accept K". Let the loss function satisfy

    L(d_0, θ) = 0 for θ ≤ θ_0,   = θ − θ_0 for θ > θ_0
    L(d_1, θ) = 0 for θ ≥ θ_0,   = θ_0 − θ for θ < θ_0

If g is a prior density for θ and f_g is the corresponding marginal probability function for X, show that a Bayes test is given by

    δ_g(x) = d_1 when λ(x) f_g(x + 1) / (λ(x + 1) f_g(x)) > θ_0,   = d_0 otherwise

3. Let 0 < a < b < ∞ be fixed real numbers. Consider a decision problem where D = [a, b] is the space of possible decisions and Θ = [a, b] is the space of possible parameter values. For some fixed λ > 0 consider the loss function

    L(θ, d) = (2 / (λ(λ + 1))) { d[(d/θ)^λ − 1] + λ(θ − d) }

i) Show that L(θ, θ) = 0 and that for each fixed θ, L(θ, ·) is strictly convex in d.

ii) If g is a prior density for θ, find the no data Bayes estimator for g.

4. Suppose X is a discrete random variable taking on values 2, 3, 4, .... For θ ∈ Θ, a subset of the real numbers, let f_θ(·) be a probability function for X. Assume that f(x | θ) > 0 for all x and all θ. Assume that this family has the monotone likelihood ratio (MLR) property in T(X) = X.
Let Y = W(X) where

    W(j) = j if j is odd,   = j + 1 if j is even

Show that the family of distributions for Y has the MLR property in T(Y) = Y.

5. Let X be a random variable with a family of possible distributions indexed by the real parameter θ. Consider testing H: θ ≤ 0 against K: θ > 0. Let φ be a test and E_θ φ(X) its power function. Let g be a prior density for θ. Let

    P_g(type i error of φ) = ∫_{θ ≤ 0} E_θ φ(X) g(θ) dθ
    P_g(type ii error of φ) = ∫_{θ > 0} (1 − E_θ φ(X)) g(θ) dθ

Let 0 < α < 1 be fixed. We say that a test is prior most powerful (PMP) at level α if, subject to

    P_g(type i error of φ) ≤ α,

it minimizes P_g(type ii error of φ).

i) Find the form of the PMP level α test.

ii) Suppose the prior g belongs to a class of possible prior densities, say G. We say that a test is uniformly prior most powerful (UPMP) for the family G if, subject to

    P_g(type i error of φ) ≤ α for all g ∈ G,

it minimizes P_g(type ii error of φ) uniformly for g ∈ G. For the rest of the problem assume that X is a normal random variable with mean θ and variance one. For each g ∈ G let φ_g be the PMP level α test found in part i). Find the form of this test. Show that if there exists a g′ ∈ G such that

    φ_g′(x) = inf_{g ∈ G} φ_g(x) for all x,

then φ_g′ is a UPMP level α test.

iii) Now suppose G is the family of normal distributions with a known variance σ² but unknown mean µ ∈ [a, b], where −∞ < a < b < ∞ are specified real numbers. Show that the φ_g′ of part ii) must exist.

iv) Show that the φ_g′ of part ii) must exist when µ belongs to the set of real numbers.
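The claim of Problem 4 can be spot-checked numerically. A minimal sketch, assuming a concrete geometric-type family on {2, 3, ...} (my choice — the exam keeps f_θ general), which has MLR in x:

```python
# Spot-check of Problem 4: if f_theta has MLR in x, the induced family of
# Y = W(X) has MLR in y. The family below is an assumed example, not from
# the exam: f_theta(x) = theta^(x-2) * (1 - theta) on x = 2, 3, ...

def f(x, theta):
    return theta ** (x - 2) * (1 - theta)

def W(j):
    # pair up {even j, j + 1}: W(j) = j if j is odd, j + 1 if j is even
    return j if j % 2 == 1 else j + 1

def g(y, theta, xmax=400):
    # probability function of Y = W(X), truncating the series at xmax
    return sum(f(x, theta) for x in range(2, xmax) if W(x) == y)

t1, t2 = 0.3, 0.7                      # any theta_1 < theta_2
support = sorted({W(x) for x in range(2, 40)})
ratios = [g(y, t2) / g(y, t1) for y in support]

# MLR in Y: the likelihood ratio g(y|t2)/g(y|t1) is nondecreasing in y
assert all(r1 <= r2 for r1, r2 in zip(ratios, ratios[1:]))
print(ratios[:3])
```

The assertion mirrors the chain of inequalities used in the written solution; any other family with MLR in x would do as well.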

Answers

1. i) T(X_1, ..., X_n) = Σ_{i=1}^n X_i is complete sufficient. Since E_θ δ(T) is a polynomial in θ of degree at most n, and

    E_θ [ T(T − 1) ··· (T − k + 1) / (n(n − 1) ··· (n − k + 1)) ] = θ^k,

the functions γ(θ) with unbiased estimators are exactly the polynomials in θ of degree at most n, and every such polynomial has a best unbiased estimator.

ii) Let

    T(x_1, x_2, x_3) = 0 if Σ x_i = 0,   = 1.5 if Σ x_i = 1 or 2,   = 3 if Σ x_i = 3.

Note for θ ∈ {0, 1/2, 1}

    P_θ(T = 0) = (1 − θ)³,   P_θ(T = 1.5) = 3θ(1 − θ),   P_θ(T = 3) = θ³,

so by the factorization theorem T is sufficient. It is easy to check that T is complete: E_{θ=0} δ(T) = 0 implies that δ(0) = 0, and in the same way E_{θ=1} δ(T) = 0 implies that δ(3) = 0. Then

    E_{θ=1/2} δ(T) = (1/8)δ(0) + (3/4)δ(1.5) + (1/8)δ(3) = 0

implies that δ(1.5) = 0, and so T is complete. If γ is an arbitrary real valued function of θ, then its best unbiased estimator is given by δ(0) = γ(0), δ(3) = γ(1), and

    δ(1.5) = (4/3) [γ(1/2) − γ(0)/8 − γ(1)/8].

2. For the no data problem we have

    E_g L(d_0, θ) = ∫_{θ_0}^1 (θ − θ_0) g(θ) dθ = ∫_{θ_0}^1 θ g(θ) dθ − θ_0 P_g(θ > θ_0).

In the same way

    E_g L(d_1, θ) = θ_0 P_g(θ < θ_0) − ∫_0^{θ_0} θ g(θ) dθ.

Then it is easy to check that E_g L(d_1, θ) < E_g L(d_0, θ) when

    ∫_0^1 θ g(θ) dθ > θ_0.

Since

    f_g(x) = λ(x) ∫_0^1 θ^x h(θ) g(θ) dθ,

we have

    ∫_0^1 θ g(θ | x) dθ = λ(x) ∫_0^1 θ^{x+1} h(θ) g(θ) dθ / f_g(x) = λ(x) f_g(x + 1) / (λ(x + 1) f_g(x)).

Applying the no data result to the posterior, the Bayes test takes d_1 exactly when this posterior mean exceeds θ_0, which is the stated rule.
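The identity at the end of Answer 2 — the posterior mean of θ equals λ(x) f_g(x + 1) / (λ(x + 1) f_g(x)) — can be checked numerically. A sketch under assumed concrete choices (geometric family with λ(x) = 1 and h(θ) = 1 − θ, uniform prior, θ_0 = 0.4; none of these come from the exam):

```python
# Check of Answer 2's identity: posterior mean of theta given x equals
# lambda(x) f_g(x+1) / (lambda(x+1) f_g(x)). Assumed concrete choices:
# f_theta(x) = theta^x (1 - theta)  (so lambda(x) = 1, h(theta) = 1 - theta),
# prior g = Uniform(0, 1), threshold theta_0 = 0.4.

THETA0 = 0.4
GRID = [(i + 0.5) / 10000 for i in range(10000)]   # midpoint grid on (0, 1)

def f(x, theta):
    return theta ** x * (1 - theta)

def marginal(x):
    # f_g(x) = integral of f_theta(x) against the uniform prior
    return sum(f(x, t) for t in GRID) / len(GRID)

def posterior_mean(x):
    num = sum(t * f(x, t) for t in GRID)
    den = sum(f(x, t) for t in GRID)
    return num / den

for x in range(6):
    ratio = marginal(x + 1) / marginal(x)   # = lambda(x) f_g(x+1) / (lambda(x+1) f_g(x))
    assert abs(ratio - posterior_mean(x)) < 1e-6
    # the Bayes test of Problem 2 takes decision d_1 exactly when ratio > THETA0
    print(x, round(ratio, 4), ratio > THETA0)
```

For this prior the rule flips from d_0 to d_1 once x is large enough that the posterior mean climbs past θ_0, as the no data comparison predicts.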

3. The first part of i) is trivial; for the second we just need to differentiate the loss function twice to see that it is strictly convex and that its minimum occurs at d = θ. When taking the derivative it is enough to consider the function

    f(d) = d^{λ+1} / θ^λ − (λ + 1) d + λθ,

and since

    f′(d) = (λ + 1)(d/θ)^λ − (λ + 1),

the results follow.

ii) Now for a fixed d we have

    ∫_Θ L(θ, d) g(θ) dθ = (2 / (λ(λ + 1))) { d^{λ+1} E(1/θ^λ) − (λ + 1) d + λ E(θ) }.

Differentiating and setting the derivative equal to zero, we find the minimizing d is given by

    d = (E(1/θ^λ))^{−1/λ}.

4. Let a_1, b_1, a_2, b_2 be positive real numbers with b_1/a_1 ≤ b_2/a_2. Then

    b_1/a_1 ≤ (b_1 + b_2)/(a_1 + a_2) ≤ b_2/a_2.

To prove the first inequality note that

    b_1/a_1 ≤ (b_1 + b_2)/(a_1 + a_2) ⟺ b_1(a_1 + a_2) ≤ a_1(b_1 + b_2) ⟺ a_2 b_1 ≤ a_1 b_2 ⟺ b_1/a_1 ≤ b_2/a_2,

and the second is proved in the same way.

Let g(y | θ) be the probability function for Y. Let y_1 < y_2 be positive even integers and let θ_1 < θ_2. Then we have

    g(y_1 | θ_2) / g(y_1 | θ_1) = [f(y_1 | θ_2) + f(y_1 + 1 | θ_2)] / [f(y_1 | θ_1) + f(y_1 + 1 | θ_1)]
        ≤ f(y_1 + 1 | θ_2) / f(y_1 + 1 | θ_1)
        ≤ f(y_2 | θ_2) / f(y_2 | θ_1)
        ≤ [f(y_2 | θ_2) + f(y_2 + 1 | θ_2)] / [f(y_2 | θ_1) + f(y_2 + 1 | θ_1)]
        = g(y_2 | θ_2) / g(y_2 | θ_1).

Note the first and third inequalities follow from the remark above and the second from the MLR property of the original family.
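The closed form in Answer 3 ii) can be verified by brute force: minimize the Bayes risk over a grid of decisions and compare with (E(1/θ^λ))^{−1/λ}. A sketch with assumed concrete choices (Θ = D = [1, 2], λ = 2, uniform prior; the exam leaves all of these general):

```python
# Numerical check of the no-data Bayes estimator d = (E[theta^-lam])^(-1/lam)
# from Answer 3 ii). Assumed concrete choices: Theta = D = [1, 2], lam = 2,
# uniform prior g on [1, 2]; then E[theta^-2] = 1/2 and d* = sqrt(2).
import math

LAM = 2.0
A, B = 1.0, 2.0
THETAS = [A + (B - A) * (i + 0.5) / 400 for i in range(400)]  # midpoint grid

def loss(theta, d):
    # L(theta, d) = 2/(lam(lam+1)) * ( d[(d/theta)^lam - 1] + lam(theta - d) )
    return 2.0 / (LAM * (LAM + 1)) * (d * ((d / theta) ** LAM - 1) + LAM * (theta - d))

def bayes_risk(d):
    # expected loss under the uniform prior (no data)
    return sum(loss(t, d) for t in THETAS) / len(THETAS)

d_grid = [A + (B - A) * i / 2000 for i in range(2001)]
d_hat = min(d_grid, key=bayes_risk)                 # brute-force minimizer

e_inv = sum(t ** -LAM for t in THETAS) / len(THETAS)
d_star = e_inv ** (-1.0 / LAM)                      # closed form from the answer

assert abs(loss(1.5, 1.5)) < 1e-12                  # L(theta, theta) = 0
assert abs(d_hat - d_star) < 1e-3
print(d_hat, d_star, math.sqrt(2))
```

The grid minimizer and the closed form agree to the grid resolution, and both sit near √2 for this prior.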

5. i) Note

    P_g(type i error of φ) = P_g(θ ≤ 0) ∫ φ(x) f_{0,g}(x) dx,
    P_g(type ii error of φ) = P_g(θ > 0) ∫ (1 − φ(x)) f_{1,g}(x) dx,

where f_{0,g} and f_{1,g} denote the conditional densities of X given θ ≤ 0 and θ > 0. So the Neyman-Pearson Lemma tells us to reject H when f_{1,g} > k f_{0,g} for some constant k.

ii) Consider tests of the form

    φ_c(x) = 1 when x > c and φ_c(x) = 0 when x < c.

The power function E_θ φ_c(X) of such a test is a strictly increasing function of θ. Clearly every φ_g must be a φ_c for some choice of c, say c(g). Note that by assumption c(g′) ≥ c(g) for every other g in G, so any test φ_c meeting the level constraint for every g ∈ G must have cutoff c ≥ c(g) for all g, and hence c ≥ c(g′). So for θ > 0

    P_θ(type II error of φ_c) ≥ P_θ(type II error of φ_g′),

and hence

    P_g(type ii error of φ_c) ≥ P_g(type ii error of φ_g′) for every g ∈ G.

iii) For the prior g corresponding to mean µ let φ_µ be the PMP level α test. Then clearly c(µ) is a continuous function of µ, so it must achieve its maximum on the compact set [a, b].

iv) This follows since

    lim_{µ → ∞} c(µ) = −∞ = lim_{µ → −∞} c(µ).

The first limit holds since P_µ(θ ≤ 0) goes to 0 as µ approaches ∞, so for any fixed cutoff the type i error eventually falls below α. To prove the second it is enough to show that for any real number a we have that P_µ(X > a and θ ≤ 0) goes to 0 as µ approaches −∞. But this follows since the marginal distribution of X is normal with mean µ and variance 1 + σ².
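Parts iii) and iv) can be illustrated numerically: compute the level-α cutoff c(µ) for several prior means and watch it fall off in both directions, so its maximum is attained. A sketch with assumed values α = 0.05 and σ = 1 (the exam leaves these general):

```python
# Numerical illustration of Answer 5 iii)-iv): the level-alpha cutoff c(mu)
# of the PMP test falls off toward both extremes of mu, so it attains its
# maximum. Assumed values (not from the exam): alpha = 0.05, sigma = 1.
import math

ALPHA, SIGMA = 0.05, 1.0

def norm_tail(z):
    # P(Z > z) for a standard normal Z
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def type_i_error(c, mu):
    # P_g(type i) = integral over theta <= 0 of P_theta(X > c) g(theta) dtheta,
    # with the N(mu, sigma^2) prior, by a midpoint rule on [lo, 0]
    lo, n = min(mu - 10 * SIGMA, -10.0), 2000
    h = -lo / n
    total = 0.0
    for i in range(n):
        th = lo + (i + 0.5) * h
        dens = math.exp(-0.5 * ((th - mu) / SIGMA) ** 2) / (SIGMA * math.sqrt(2 * math.pi))
        total += norm_tail(c - th) * dens
    return total * h

def cutoff(mu):
    # bisection for c(mu): type_i_error is decreasing in c
    a, b = -60.0, 60.0
    for _ in range(60):
        m = 0.5 * (a + b)
        a, b = (m, b) if type_i_error(m, mu) > ALPHA else (a, m)
    return 0.5 * (a + b)

cs = {mu: cutoff(mu) for mu in (-8.0, -4.0, -1.0, 0.0, 1.0)}
print(cs)
```

The bisection only makes sense when P_g(θ ≤ 0) > α (here µ ≤ 1 or so); for larger µ the level constraint is slack for every cutoff, which is exactly the µ → ∞ limit in the written solution.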