Chapter 9. Properties of Point Estimators


Recap. Target parameter (population parameter): $\theta$. Population distribution $f(x;\theta)$:
$$f(x;\theta) = \begin{cases} \text{probability function}, & \text{discrete case}, \\ \text{density}, & \text{continuous case}. \end{cases}$$
The form of $f(x;\theta)$ is usually assumed to be known, except for the value of $\theta$. Sample: $X_1, X_2, \dots, X_n$ are iid with distribution $f(x;\theta)$. An estimator $\hat\theta$ is a function of the sample: $\hat\theta = \hat\theta(X_1, X_2, \dots, X_n)$. Also recall: MSE, unbiasedness, confidence intervals.

Relative efficiency. Two estimators for $\theta$: $\hat\theta_1$ and $\hat\theta_2$. The relative efficiency of $\hat\theta_1$ with respect to $\hat\theta_2$ is defined as
$$\text{eff}(\hat\theta_1, \hat\theta_2) = \frac{\text{MSE}(\hat\theta_2)}{\text{MSE}(\hat\theta_1)}.$$
Remark: When $\hat\theta_1$ and $\hat\theta_2$ are both unbiased, their relative efficiency reduces to
$$\text{eff}(\hat\theta_1, \hat\theta_2) = \frac{\text{Var}(\hat\theta_2)}{\text{Var}(\hat\theta_1)}.$$
Remark: When $\text{eff}(\hat\theta_1, \hat\theta_2) > 1$, $\hat\theta_1$ is more efficient than $\hat\theta_2$; when it is $< 1$, $\hat\theta_1$ is less efficient.
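As a numerical illustration (not in the original slides), the sketch below estimates the relative efficiency of the sample mean versus the sample median as estimators of the center of a normal population by Monte Carlo; the seed, sample size, and replication count are arbitrary choices.

```python
# Monte Carlo sketch: relative efficiency of the sample mean vs. the
# sample median for estimating theta when X_i ~ N(theta, 1).
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 5.0, 50, 20_000

samples = rng.normal(theta, 1.0, size=(reps, n))
mean_est = samples.mean(axis=1)
median_est = np.median(samples, axis=1)

mse_mean = np.mean((mean_est - theta) ** 2)
mse_median = np.mean((median_est - theta) ** 2)

# eff(mean, median) = MSE(median) / MSE(mean); > 1 means the mean wins.
print("eff(mean, median) ~", mse_median / mse_mean)  # ~ pi/2 for large n
```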

Minimum Variance Unbiased Estimator (MVUE). Goal: among all unbiased estimators, find the one with the smallest variance (the most efficient unbiased estimator). Key requirements: 1. It is an estimator: a function of the sample $X_1, X_2, \dots, X_n$. 2. It is unbiased. 3. It has minimal variance.

MVUE: Sufficient statistics. Definition: A statistic is a function of the sample $X_1, X_2, \dots, X_n$. Definition: A statistic $t = t(X_1, X_2, \dots, X_n)$ is said to be sufficient if the likelihood of the sample,
$$L(x_1, x_2, \dots, x_n; \theta) = f(x_1;\theta)\, f(x_2;\theta) \cdots f(x_n;\theta),$$
can be factored as
$$L(x_1, x_2, \dots, x_n; \theta) = g_\theta(t)\, h(x_1, x_2, \dots, x_n).$$

Examples of sufficient statistics 1. Bernoulli distribution. $X_1, X_2, \dots, X_n$ iid Bernoulli with parameter $p$ (target parameter). Then $\sum_{i=1}^n X_i$ is sufficient.
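A quick check via the factorization criterion, with $t = \sum_{i=1}^n x_i$:
$$L(x_1, \dots, x_n; p) = \prod_{i=1}^n p^{x_i}(1-p)^{1-x_i} = \underbrace{p^{t}(1-p)^{n-t}}_{g_p(t)} \cdot \underbrace{1}_{h(x_1, \dots, x_n)}.$$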

2. Poisson distribution. $X_1, X_2, \dots, X_n$ iid Poisson with parameter $\lambda$ (target parameter). Then $\sum_{i=1}^n X_i$ is sufficient.

3. Uniform distribution. $X_1, X_2, \dots, X_n$ iid uniform on the interval $[0, \theta]$ (target parameter $\theta$). Then $X_{(n)} = \max\{X_1, X_2, \dots, X_n\}$ is sufficient.

4. Normal distribution. $X_1, X_2, \dots, X_n$ iid $N(\mu, \sigma^2)$. (a) Suppose $\sigma$ is known and $\mu$ is the target parameter. Then $\sum_{i=1}^n X_i$ is sufficient. (b) Suppose $\mu$ and $\sigma$ are both unknown (target parameters). Then $\left(\sum_{i=1}^n X_i,\ \sum_{i=1}^n X_i^2\right)$ are jointly sufficient.

Remark: Sufficient statistics are not unique; there are many of them. Remark: What is the meaning of sufficiency? A sufficient statistic contains all the information about $\theta$ carried by the sample $X_1, X_2, \dots, X_n$: the conditional distribution of $X_1, X_2, \dots, X_n$ given a sufficient statistic $t = t(X_1, X_2, \dots, X_n)$ does not depend on $\theta$. (Verify this in the discrete case.)
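One way to see this numerically (a sketch with arbitrary numbers, not from the slides): for Bernoulli samples, the conditional law of the data given $t = \sum X_i$ should not depend on $p$. The code checks $P(X_1 = 1 \mid t = k)$ for two very different values of $p$; both estimates should be close to $k/n$.

```python
# Conditional distribution given a sufficient statistic is free of p.
import numpy as np

rng = np.random.default_rng(1)
n, k, reps = 10, 4, 200_000

for p in (0.3, 0.8):
    x = rng.random((reps, n)) < p      # Bernoulli(p) samples, reps rows
    keep = x.sum(axis=1) == k          # condition on the sufficient statistic
    print(f"p={p}: P(X1=1 | t=k) ~ {x[keep, 0].mean():.3f} (theory: {k/n})")
```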

MVUE: Rao-Blackwell theorem. Theorem: Let $\hat\theta = \hat\theta(X_1, X_2, \dots, X_n)$ be an unbiased estimator for $\theta$, and let $t$ be any sufficient statistic. Define $\hat\theta^* = E[\hat\theta(X_1, X_2, \dots, X_n) \mid t]$. Then $\hat\theta^*$ is an unbiased estimator for $\theta$ and
$$\text{Var}[\hat\theta^*] \le \text{Var}[\hat\theta].$$
Remark: $\hat\theta^*$ is a function of $t$ only. Proof: use the identity $\text{Var}[X] = E[\text{Var}[X \mid Y]] + \text{Var}[E[X \mid Y]]$. With $X = \hat\theta$ and $Y = t$, unbiasedness follows from $E[\hat\theta^*] = E[E[\hat\theta \mid t]] = E[\hat\theta] = \theta$, and $\text{Var}[\hat\theta] = E[\text{Var}[\hat\theta \mid t]] + \text{Var}[\hat\theta^*] \ge \text{Var}[\hat\theta^*]$.
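A small simulation sketch of the theorem in action (arbitrary numbers, not from the slides): start from the crude unbiased estimator $\hat\theta = X_1$ for a Bernoulli $p$ and condition on $t = \sum X_i$. By symmetry $E[X_1 \mid t] = t/n = \bar X$, so the Rao-Blackwellized estimator is the sample mean.

```python
# Rao-Blackwell: conditioning X_1 on the sufficient statistic t = sum(X_i)
# yields X_bar; both are unbiased, but the variance drops sharply.
import numpy as np

rng = np.random.default_rng(2)
p, n, reps = 0.3, 20, 100_000

x = rng.random((reps, n)) < p
crude = x[:, 0].astype(float)   # theta_hat = X_1, unbiased but noisy
rb = x.mean(axis=1)             # theta_hat* = E[X_1 | t] = X_bar

print("means:    ", crude.mean(), rb.mean())   # both ~ p (unbiased)
print("variances:", crude.var(), rb.var())     # p(1-p) vs. p(1-p)/n
```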

Observation: If there is only one function of $t$, say $h(t)$, that is an unbiased estimator for $\theta$, i.e. $E[h(t)] = \theta$, then $h(t)$ is the MVUE. (By Rao-Blackwell, any unbiased estimator can be improved to an unbiased function of $t$ with no larger variance, and that function must be $h(t)$.) Definition: We say a statistic $t$ is complete if $E[h(t)] = 0$ for every $\theta$ implies $h \equiv 0$. Remark: If $t$ is sufficient and complete, then there is at most one function of $t$, say $h(t)$, that is an unbiased estimator for $\theta$.

MVUE: A useful approach. To identify an MVUE: 1. Find a sufficient statistic, say $t$. 2. Argue that this statistic is complete. 3. Find an unbiased estimator $h(t)$ for $\theta$. (One can start from any unbiased estimator $\hat\theta$ and let $h(t) = E[\hat\theta \mid t]$.) 4. This estimator $h(t)$ is the MVUE.

MVUE: Examples 1. A coin with $P(H) = p$ (target parameter). Toss the coin $n$ times, and let
$$X_i = \begin{cases} 1, & \text{if the } i\text{-th toss is heads}, \\ 0, & \text{if the } i\text{-th toss is tails}. \end{cases}$$
Identify the MVUE for $p$.
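A sketch of the standard answer via the recipe above: $t = \sum_{i=1}^n X_i$ is sufficient (shown earlier) and complete (the usual binomial-family fact), and $h(t) = t/n$ is unbiased for $p$; hence
$$\hat p_{\text{MVUE}} = \bar X = \frac{1}{n}\sum_{i=1}^n X_i.$$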

2. Suppose $X_1, X_2, \dots, X_n$ are iid $N(\mu, \sigma^2)$. (a) If $\sigma^2$ is known, what is the MVUE for $\mu$? (b) If $\mu$ and $\sigma^2$ are both unknown, what are the MVUEs for $\mu$ and for $\sigma^2$?

3. Suppose $X_1, X_2, \dots, X_n$ are iid samples from the uniform distribution on $[0, \theta]$. Find an MVUE for $\theta$.
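A simulation sketch of the standard answer (the parameter values below are arbitrary): $X_{(n)}$ is sufficient and complete, and $E[X_{(n)}] = n\theta/(n+1)$, so the bias-corrected estimator $\frac{n+1}{n} X_{(n)}$ is unbiased and hence the MVUE.

```python
# Check that ((n+1)/n) * X_(n) is unbiased for theta under U[0, theta].
import numpy as np

rng = np.random.default_rng(3)
theta, n, reps = 2.0, 15, 100_000

x = rng.uniform(0.0, theta, size=(reps, n))
xmax = x.max(axis=1)
mvue = (n + 1) / n * xmax

print("E[X_(n)] ~", xmax.mean(), "(theory:", n * theta / (n + 1), ")")
print("E[MVUE]  ~", mvue.mean(), "(theory:", theta, ")")
```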

4. Suppose $X_1, X_2, \dots, X_n$ are iid samples from the Poisson distribution with parameter $\lambda$. Find an MVUE for $\lambda$. What about an MVUE for $e^{\lambda}$?
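For the first part, a sketch of the standard argument: $t = \sum_{i=1}^n X_i \sim \text{Poisson}(n\lambda)$ is sufficient and complete for the Poisson family, and $E[t/n] = \lambda$, so $\bar X$ is the MVUE for $\lambda$. The second part works the same way: find the function $g(t)$ whose expectation equals the target.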

Maximum Likelihood Estimate (MLE) and Consistency. MLE: find $\theta$ to maximize $L(x_1, x_2, \dots, x_n; \theta)$. [In this maximization problem, $x_1, x_2, \dots, x_n$ are regarded as fixed.]
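Equivalently, and usually easier in practice, maximize the log-likelihood (the logarithm is increasing, so the maximizer is unchanged):
$$\ell(\theta) = \log L(x_1, \dots, x_n; \theta) = \sum_{i=1}^n \log f(x_i; \theta), \qquad \hat\theta_{\text{MLE}} = \arg\max_{\theta} \ell(\theta).$$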

Examples 1. Suppose $X_1, X_2, \dots, X_n$ are iid samples from the Poisson distribution with parameter $\theta$. Find the MLE for $\theta$.

2. Suppose $X_1, X_2, \dots, X_n$ are iid samples from the uniform distribution on $[0, \theta]$. Find the MLE for $\theta$.

3. Suppose $X_1, X_2, \dots, X_n$ are iid samples from $N(\mu, \sigma^2)$. Find the MLEs for $\mu$ and $\sigma^2$.
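A numerical sketch of the three answers on simulated data (the true parameter values are arbitrary choices, not from the slides): the Poisson MLE is $\bar X$, the uniform MLE is $X_{(n)}$, and the normal MLEs are $\bar X$ and the uncorrected sample variance.

```python
# Closed-form MLEs for the three examples, checked on simulated samples.
import numpy as np

rng = np.random.default_rng(4)
n = 10_000

# 1. Poisson(theta): MLE is the sample mean.
x = rng.poisson(3.0, n)
print("Poisson MLE:", x.mean())            # ~ 3.0

# 2. Uniform[0, theta]: MLE is the sample maximum.
x = rng.uniform(0.0, 2.0, n)
print("Uniform MLE:", x.max())             # ~ 2.0 (approaches from below)

# 3. N(mu, sigma^2): MLEs are X_bar and the uncorrected sample variance.
x = rng.normal(1.0, 2.0, n)
mu_hat = x.mean()
sigma2_hat = np.mean((x - mu_hat) ** 2)    # divides by n, not n - 1
print("Normal MLEs:", mu_hat, sigma2_hat)  # ~ 1.0, ~ 4.0
```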

Properties of MLE. The MLE has the following nice properties under mild regularity conditions. 1. The MLE is a function of sufficient statistics. 2. Consistency: an estimator $\hat\theta = \hat\theta(X_1, X_2, \dots, X_n)$ is said to be consistent if $\hat\theta(X_1, X_2, \dots, X_n) - \theta \to 0$ in probability as $n \to \infty$; the MLE is consistent. 3. Asymptotic optimality: the MLE is asymptotically normal and asymptotically most efficient. 4. Invariance property: if $\hat\theta$ is the MLE for $\theta$, then $h(\hat\theta)$ is the MLE for $h(\theta)$ when $h$ is a one-to-one function. Example: Suppose $X_1, X_2, \dots, X_n$ are iid samples from the uniform distribution on $[0, \theta]$. Find the MLE for the variance.
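For the closing example: the variance of the uniform distribution on $[0, \theta]$ is $h(\theta) = \theta^2/12$, which is one-to-one for $\theta > 0$. Since the MLE for $\theta$ is $\hat\theta = X_{(n)}$ (Example 2 above), the invariance property gives
$$\widehat{\text{Var}} = h(\hat\theta) = \frac{X_{(n)}^2}{12}.$$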