Lecture 16: UMVUE: conditioning on sufficient and complete statistics


The 2nd method of deriving a UMVUE when a sufficient and complete statistic is available:
Find an unbiased estimator of ϑ, say U(X). Conditioning on a sufficient and complete statistic T(X), E[U(X)|T] is the UMVUE of ϑ.
We need to derive an explicit form of E[U(X)|T].
From the uniqueness of the UMVUE, it does not matter which U(X) is used. Thus, we should choose U(X) so as to make the calculation of E[U(X)|T] as easy as possible.
We do not need the distribution of T, but we do need to work out the conditional expectation E[U(X)|T]. Using the independence of some statistics (Basu's theorem), we may avoid working with conditional distributions directly.
UW-Madison (Statistics) Stat 709 Lecture 16, 2018

Example 7.3.24 (binomial family)
Let X_1,...,X_n be iid from binomial(k,θ) with known k and unknown θ ∈ (0,1). We want to estimate g(θ) = P_θ(X_1 = 1) = kθ(1−θ)^{k−1}.
Note that T = ∑_{i=1}^n X_i ~ binomial(nk,θ) is the sufficient and complete statistic for θ, but no unbiased estimator based on it is immediately evident.
To apply conditioning, we take the simple unbiased estimator of P_θ(X_1 = 1), the indicator function I(X_1 = 1). By Theorem 7.3.23, the UMVUE of g(θ) is ψ(T) = E[I(X_1 = 1)|T] = P(X_1 = 1|T).
We need to simplify ψ(T) and obtain an explicit form. When T = 0, P(X_1 = 1|T = 0) = 0. For t = 1,...,nk,

$$\psi(t) = P(X_1 = 1 \mid T = t) = \frac{P_\theta\left(X_1 = 1, \sum_{i=1}^n X_i = t\right)}{P_\theta\left(\sum_{i=1}^n X_i = t\right)} = \frac{P_\theta\left(X_1 = 1, \sum_{i=2}^n X_i = t-1\right)}{P_\theta\left(\sum_{i=1}^n X_i = t\right)}$$
$$= \frac{P_\theta(X_1 = 1)\,P_\theta\left(\sum_{i=2}^n X_i = t-1\right)}{P_\theta\left(\sum_{i=1}^n X_i = t\right)} = \frac{\left[k\theta(1-\theta)^{k-1}\right]\binom{(n-1)k}{t-1}\theta^{t-1}(1-\theta)^{(n-1)k-(t-1)}}{\binom{nk}{t}\theta^{t}(1-\theta)^{nk-t}} = \frac{k\binom{(n-1)k}{t-1}}{\binom{nk}{t}}$$
Hence, the UMVUE of g(θ) = kθ(1−θ)^{k−1} is
$$\psi(T) = \begin{cases} k\binom{(n-1)k}{T-1}\Big/\binom{nk}{T} & T = 1,\dots,nk \\ 0 & T = 0 \end{cases}$$
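The unbiasedness of ψ(T) can be verified exactly by summing ψ(t) against the binomial(nk,θ) p.m.f. of T. The sketch below does this in plain Python; the particular values n = 5, k = 3, θ = 0.37 are illustrative choices, not from the example.

```python
import math

def binom_pmf(m, p, x):
    # binomial(m, p) probability mass at x
    return math.comb(m, x) * p ** x * (1 - p) ** (m - x)

def psi(t, n, k):
    # UMVUE evaluated at T = t; math.comb returns 0 when t - 1 > (n-1)k
    if t == 0:
        return 0.0
    return k * math.comb((n - 1) * k, t - 1) / math.comb(n * k, t)

def expected_psi(n, k, theta):
    # E_theta[psi(T)] computed exactly over the pmf of T ~ binomial(nk, theta)
    return sum(psi(t, n, k) * binom_pmf(n * k, theta, t)
               for t in range(n * k + 1))

n, k, theta = 5, 3, 0.37
target = k * theta * (1 - theta) ** (k - 1)   # g(theta) = P_theta(X_1 = 1)
assert abs(expected_psi(n, k, theta) - target) < 1e-12
```

The check is exact (no simulation): the sum telescopes to kθ(1−θ)^{k−1} by the binomial theorem, as in the derivation above.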

Example 3.3
Let X_1,...,X_n be i.i.d. from the exponential distribution E(0,θ), F_θ(x) = (1 − e^{−x/θ}) I_{(0,∞)}(x). Consider the estimation of ϑ = 1 − F_θ(t).
X̄ is sufficient and complete for θ > 0.
I_{(t,∞)}(X_1) is unbiased for ϑ: E[I_{(t,∞)}(X_1)] = P(X_1 > t) = ϑ.
Hence T(X) = E[I_{(t,∞)}(X_1)|X̄] = P(X_1 > t|X̄) is the UMVUE of ϑ.
If the conditional distribution of X_1 given X̄ is available, then we can calculate P(X_1 > t|X̄) directly.
By Basu's theorem (Theorem 2.4), X_1/X̄ and X̄ are independent. By Proposition 1.10(vii),
$$P(X_1 > t \mid \bar X = \bar x) = P(X_1/\bar X > t/\bar X \mid \bar X = \bar x) = P(X_1/\bar X > t/\bar x)$$

To compute this unconditional probability, we need the distribution of X_1/∑_{i=1}^n X_i = X_1/(X_1 + ∑_{i=2}^n X_i). Using the transformation technique discussed in §1.3.1 and the fact that ∑_{i=2}^n X_i is independent of X_1 and has a gamma distribution, we obtain that X_1/∑_{i=1}^n X_i has the Lebesgue p.d.f. (n−1)(1−x)^{n−2} I_{(0,1)}(x). Hence
$$P(X_1 > t \mid \bar X = \bar x) = \int_{t/(n\bar x)}^{1} (n-1)(1-x)^{n-2}\,dx = \left(1 - \frac{t}{n\bar x}\right)^{n-1}$$
and the UMVUE of ϑ is
$$T(X) = \left(1 - \frac{t}{n\bar X}\right)^{n-1}.$$
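As a sanity check, E[T(X)] = e^{−t/θ} can be confirmed by simulation. The sketch below (illustrative values θ = 2, t = 1, n = 5, chosen here, not from the example) averages the UMVUE over exponential samples; note the estimator equals 0 on the event nX̄ ≤ t, since X_1/(nX̄) is supported on (0,1).

```python
import math
import random

random.seed(0)
n, theta, t = 5, 2.0, 1.0
N = 200_000
total = 0.0
for _ in range(N):
    # X_i ~ E(0, theta): exponential with mean theta
    xbar = sum(random.expovariate(1.0 / theta) for _ in range(n)) / n
    # UMVUE of P(X_1 > t); it is 0 when n*xbar <= t
    total += max(0.0, 1.0 - t / (n * xbar)) ** (n - 1)
mc = total / N
target = math.exp(-t / theta)   # P(X_1 > t) = e^{-t/theta}
assert abs(mc - target) < 0.01
```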

Example 3.4
Let X_1,...,X_n be i.i.d. from N(µ,σ²) with unknown µ ∈ R and σ² > 0.
From Example 2.18, T = (X̄, S²) is sufficient and complete for θ = (µ,σ²);
X̄ and (n−1)S²/σ² are independent;
X̄ has the N(µ,σ²/n) distribution;
(n−1)S²/σ² has the chi-square distribution χ²_{n−1}.
Using the method of solving for h directly, we find that
the UMVUE of µ is X̄;
the UMVUE of µ² is X̄² − S²/n;
the UMVUE of σ^r with r > 1−n is k_{n−1,r} S^r, where
$$k_{n,r} = \frac{n^{r/2}\,\Gamma(n/2)}{2^{r/2}\,\Gamma\!\left(\frac{n+r}{2}\right)};$$
the UMVUE of µ/σ is k_{n−1,−1} X̄/S, if n > 2.
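Two quick checks of the constant k_{n,r}: algebraically, k_{n−1,2} = 1 (so the UMVUE of σ² is S² itself), and k_{n−1,1}S should be unbiased for σ. The sketch below verifies the first exactly and the second by Monte Carlo; the values n = 6, σ = 1.5 are illustrative choices.

```python
import math
import random
import statistics

def k(n, r):
    # k_{n,r} = n^{r/2} Gamma(n/2) / (2^{r/2} Gamma((n+r)/2))
    return n ** (r / 2) * math.gamma(n / 2) / (2 ** (r / 2) * math.gamma((n + r) / 2))

n = 6
# Gamma((n+1)/2) = ((n-1)/2) Gamma((n-1)/2), so k_{n-1,2} = 1 exactly
assert abs(k(n - 1, 2) - 1.0) < 1e-12

random.seed(1)
sigma, N = 1.5, 100_000
acc = 0.0
for _ in range(N):
    x = [random.gauss(0.0, sigma) for _ in range(n)]
    acc += statistics.stdev(x)          # S, the sample standard deviation
# k_{n-1,1} S is unbiased for sigma
assert abs(k(n - 1, 1) * acc / N - sigma) < 0.02
```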

Example 3.4 (continued)
Suppose that ϑ satisfies P(X_1 ≤ ϑ) = p with a fixed p ∈ (0,1). Let Φ be the c.d.f. of the standard normal distribution. Then ϑ = µ + σΦ^{−1}(p) and its UMVUE is X̄ + k_{n−1,1} S Φ^{−1}(p).
Let c be a fixed constant and
$$\vartheta = P(X_1 \le c) = \Phi\!\left(\frac{c-\mu}{\sigma}\right).$$
We can find the UMVUE of ϑ using the method of conditioning. Since I_{(−∞,c]}(X_1) is an unbiased estimator of ϑ, the UMVUE of ϑ is E[I_{(−∞,c]}(X_1)|T] = P(X_1 ≤ c|T).
By Basu's theorem, the ancillary statistic Z(X) = (X_1 − X̄)/S is independent of T = (X̄, S²).

Example 3.4 (continued)
Then, by Proposition 1.10(vii),
$$P\!\left(X_1 \le c \mid T = (\bar x, s^2)\right) = P\!\left(Z \le \frac{c-\bar X}{S} \,\Big|\, T = (\bar x, s^2)\right) = P\!\left(Z \le \frac{c-\bar x}{s}\right).$$
It can be shown that Z has the Lebesgue p.d.f.
$$f(z) = \frac{\sqrt{n}\,\Gamma\!\left(\frac{n-1}{2}\right)}{\sqrt{\pi}\,(n-1)\,\Gamma\!\left(\frac{n-2}{2}\right)} \left[1 - \frac{n z^2}{(n-1)^2}\right]^{(n/2)-2} I_{\left(0,\,(n-1)/\sqrt{n}\right)}(|z|).$$
Hence the UMVUE of ϑ is
$$P(X_1 \le c \mid T) = \int_{-(n-1)/\sqrt{n}}^{(c-\bar X)/S} f(z)\,dz.$$
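A quick numerical check that the stated density of Z is properly normalized: integrate f over its support (−(n−1)/√n, (n−1)/√n) with a midpoint rule. The value n = 6 below is an illustrative choice.

```python
import math

def f(z, n):
    # Lebesgue p.d.f. of Z = (X_1 - Xbar)/S, supported on |z| < (n-1)/sqrt(n)
    if abs(z) >= (n - 1) / math.sqrt(n):
        return 0.0
    const = math.sqrt(n) * math.gamma((n - 1) / 2) / (
        math.sqrt(math.pi) * (n - 1) * math.gamma((n - 2) / 2))
    return const * (1 - n * z * z / (n - 1) ** 2) ** ((n - 4) / 2)

n = 6
b = (n - 1) / math.sqrt(n)
M = 100_000                          # midpoint rule on (-b, b)
h = 2 * b / M
total = sum(f(-b + (i + 0.5) * h, n) for i in range(M)) * h
assert abs(total - 1.0) < 1e-6       # f integrates to 1
```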

Example 3.4 (continued)
Suppose that we would like to estimate
$$\vartheta = \frac{1}{\sigma}\,\Phi'\!\left(\frac{c-\mu}{\sigma}\right),$$
the Lebesgue p.d.f. of X_1 evaluated at a fixed c, where Φ′ is the first-order derivative of Φ.
By the previous result, the conditional p.d.f. of X_1 given X̄ = x̄ and S² = s² is s^{−1} f((x − x̄)/s). Let f_T be the joint p.d.f. of T = (X̄, S²). Then
$$\vartheta = \int \frac{1}{s}\, f\!\left(\frac{c-\bar x}{s}\right) f_T(t)\,dt = E\!\left[\frac{1}{S}\, f\!\left(\frac{c-\bar X}{S}\right)\right].$$
Hence the UMVUE of ϑ is
$$\frac{1}{S}\, f\!\left(\frac{c-\bar X}{S}\right).$$
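The identity E[(1/S) f((c − X̄)/S)] = σ^{−1}Φ′((c − µ)/σ) can be checked by simulation. The sketch below redefines f from the previous slide so it is self-contained; the values µ = 0, σ = 1, c = 0.5, n = 6 are illustrative choices.

```python
import math
import random
import statistics

def f(z, n):
    # p.d.f. of Z = (X_1 - Xbar)/S from the previous slide
    if abs(z) >= (n - 1) / math.sqrt(n):
        return 0.0
    const = math.sqrt(n) * math.gamma((n - 1) / 2) / (
        math.sqrt(math.pi) * (n - 1) * math.gamma((n - 2) / 2))
    return const * (1 - n * z * z / (n - 1) ** 2) ** ((n - 4) / 2)

random.seed(4)
n, mu, sigma, c, N = 6, 0.0, 1.0, 0.5, 100_000
acc = 0.0
for _ in range(N):
    x = [random.gauss(mu, sigma) for _ in range(n)]
    xbar, s = sum(x) / n, statistics.stdev(x)
    acc += f((c - xbar) / s, n) / s          # the UMVUE of the density at c
mc = acc / N
# target: N(mu, sigma^2) density evaluated at c
target = math.exp(-((c - mu) / sigma) ** 2 / 2) / (sigma * math.sqrt(2 * math.pi))
assert abs(mc - target) < 0.01
```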

Example
Let X_1,...,X_n be i.i.d. with Lebesgue p.d.f. f_θ(x) = θx^{−2} I_{(θ,∞)}(x), where θ > 0 is unknown. Suppose that ϑ = P(X_1 > t) for a constant t > 0.
The smallest order statistic X_{(1)} is sufficient and complete for θ. Hence, the UMVUE of ϑ is
$$P\left(X_1 > t \mid X_{(1)} = x_{(1)}\right) = P\!\left(\frac{X_1}{X_{(1)}} > \frac{t}{X_{(1)}} \,\Big|\, X_{(1)} = x_{(1)}\right) = P\!\left(\frac{X_1}{X_{(1)}} > \frac{t}{x_{(1)}} \,\Big|\, X_{(1)} = x_{(1)}\right) = P\!\left(\frac{X_1}{X_{(1)}} > s\right) \quad \text{(Basu's theorem)},$$
where s = t/x_{(1)}. If s ≤ 1, this probability is 1.

Example (continued)
Consider s > 1 and assume θ = 1 in the calculation (the i = 1 term below is 0 since s > 1):
$$P\!\left(\frac{X_1}{X_{(1)}} > s\right) = \sum_{i=1}^n P\!\left(\frac{X_1}{X_{(1)}} > s,\, X_{(1)} = X_i\right) = \sum_{i=2}^n P\!\left(\frac{X_1}{X_{(1)}} > s,\, X_{(1)} = X_i\right)$$
$$= (n-1)\,P\!\left(\frac{X_1}{X_{(1)}} > s,\, X_{(1)} = X_n\right) = (n-1)\,P\left(X_1 > sX_n,\, X_2 > X_n, \dots, X_{n-1} > X_n\right)$$
$$= (n-1)\int_{x_1 > s x_n,\, x_2 > x_n, \dots,\, x_{n-1} > x_n} \prod_{i=1}^n \frac{1}{x_i^2}\, dx_1 \cdots dx_n = (n-1)\int_1^\infty \left[\int_{s x_n}^\infty \frac{dx_1}{x_1^2}\right] \prod_{i=2}^{n-1}\left[\int_{x_n}^\infty \frac{dx_i}{x_i^2}\right] \frac{dx_n}{x_n^2}$$
$$= (n-1)\int_1^\infty \frac{dx_n}{s\,x_n^{\,n+1}} = \frac{n-1}{ns} = \frac{(n-1)\,x_{(1)}}{nt}.$$

Example (continued)
This shows that the UMVUE of P(X_1 > t) is
$$h(X_{(1)}) = \begin{cases} \dfrac{(n-1)X_{(1)}}{nt} & X_{(1)} < t \\ 1 & X_{(1)} \ge t \end{cases}$$
Another solution
The UMVUE must be h(X_{(1)}). The Lebesgue p.d.f. of X_{(1)} is nθⁿ x^{−(n+1)} I_{(θ,∞)}(x).
Use the method of finding h:
If θ ≥ t, then P(X_1 > t) = 1 and P(t > X_{(1)}) = 0. Hence, if X_{(1)} ≥ t, h(X_{(1)}) must be 1 a.s. P_θ. The value of h(X_{(1)}) for X_{(1)} < t is not specified.

If θ < t, then since P(X_1 > t) = θ/t and h(x) = 1 for x ≥ t, unbiasedness E[h(X_{(1)})] = θ/t means
$$\frac{\theta}{t} = \int_\theta^t h(x)\,\frac{n\theta^n}{x^{n+1}}\,dx + \int_t^\infty \frac{n\theta^n}{x^{n+1}}\,dx = \int_\theta^t h(x)\,\frac{n\theta^n}{x^{n+1}}\,dx + \frac{\theta^n}{t^n},$$
i.e.,
$$\frac{\theta^{1-n}}{nt} = \int_\theta^t \frac{h(x)}{x^{n+1}}\,dx + \frac{1}{nt^n}.$$
Differentiating both sides w.r.t. θ leads to
$$\frac{(1-n)\,\theta^{-n}}{nt} = -\frac{h(\theta)}{\theta^{n+1}}.$$
Hence, for any X_{(1)} < t,
$$h(X_{(1)}) = \frac{(n-1)X_{(1)}}{nt}.$$
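The unbiasedness of h(X_{(1)}) can be confirmed by simulation, sampling from f_θ by inversion: F_θ(x) = 1 − θ/x for x > θ, so X = θ/U with U ~ Uniform(0,1]. The values θ = 1, t = 2, n = 5 below are illustrative choices.

```python
import random

random.seed(2)
n, theta, t = 5, 1.0, 2.0
N = 200_000
acc = 0.0
for _ in range(N):
    # inverse-CDF sampling: 1 - random.random() lies in (0, 1]
    x_min = min(theta / (1.0 - random.random()) for _ in range(n))   # X_(1)
    # UMVUE h(X_(1))
    acc += 1.0 if x_min >= t else (n - 1) * x_min / (n * t)
mc = acc / N
assert abs(mc - theta / t) < 0.01   # P(X_1 > t) = theta/t when theta < t
```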

Unbiased estimators of 0
If a sufficient and complete statistic is not available, then what should we do?
If W is unbiased for ϑ and T is sufficient, then by Theorem 2.5 (Rao-Blackwell), E(W|T) is better than W.
If we have another sufficient statistic S, should we consider E[E(W|T)|S]?
If there is a function h such that S = h(T), then by the properties of conditional expectation,
$$E[E(W \mid T) \mid S] = E(W \mid S) = E[E(W \mid S) \mid T].$$
That is, we should always condition on a simpler sufficient statistic, such as a minimal sufficient statistic.
To see when an unbiased estimator is best unbiased, we might ask how we could improve upon a given unbiased estimator.
Suppose that T(X) is unbiased for g(θ) and U(X) is a statistic satisfying E_θ(U) = 0 for all θ, i.e., U is unbiased for 0. Then, for any constant a,

T(X) + aU(X) is unbiased for g(θ). Can it be better than T(X)?
$$\mathrm{Var}_\theta(T + aU) = \mathrm{Var}_\theta(T) + 2a\,\mathrm{Cov}_\theta(T,U) + a^2\,\mathrm{Var}_\theta(U)$$
If for some θ₀, Cov_{θ₀}(T,U) < 0, then we can make 2a Cov_{θ₀}(T,U) + a² Var_{θ₀}(U) < 0 by choosing 0 < a < −2Cov_{θ₀}(T,U)/Var_{θ₀}(U).
Hence, T(X) + aU(X) is better than T(X) at least when θ = θ₀, and T(X) cannot be UMVUE.
Similarly, if Cov_{θ₀}(T,U) > 0 for some θ₀, then T(X) cannot be UMVUE either.
Thus, Cov_θ(T,U) = 0 for all θ and all unbiased estimators U(X) of 0 is necessary for T(X) to be a UMVUE.
It turns out that Cov_θ(T,U) = 0 for all U(X) unbiased for 0 is also sufficient for T(X) being a UMVUE.
UW-Madison (Statistics) Stat 709 Lecture 16, 2018
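This improvement can be seen concretely in a family with no complete sufficient statistic. The sketch below (an illustrative example chosen here, not from the slides) uses X_i ~ Uniform(θ, θ+1): T = X̄ − 1/2 is unbiased for θ, and U = sin(2πX_1) is unbiased for 0 for every θ, with Cov_θ(T,U) = −cos(2πθ)/(2πn) ≠ 0 at θ = 0. Since Var(U) = 1/2, the variance-minimizing choice is a = −Cov(T,U)/Var(U) = 1/(πn) at θ = 0, and T + aU strictly beats T, so X̄ − 1/2 is not UMVUE.

```python
import math
import random
import statistics

random.seed(5)
n, theta, N = 2, 0.0, 200_000
a = 1.0 / (math.pi * n)      # a = -Cov(T,U)/Var(U) at theta = 0
Ts, Vs = [], []
for _ in range(N):
    x = [theta + random.random() for _ in range(n)]   # X_i ~ Uniform(theta, theta+1)
    T = sum(x) / n - 0.5                              # unbiased for theta
    U = math.sin(2 * math.pi * x[0])                  # unbiased for 0, for every theta
    Ts.append(T)
    Vs.append(T + a * U)

# Var(T) = 1/(12n); Var(T + aU) = Var(T) - Cov(T,U)^2/Var(U) = Var(T) - 1/(2 pi^2 n^2)
assert statistics.variance(Vs) < statistics.variance(Ts) - 0.005
```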