SOLUTION FOR HOMEWORK 7, STAT 6331


1. Exerc. 7.33. Here we just recall that
$$\mathrm{MSE}(\hat p_B) = \frac{np(1-p)}{(\alpha+\beta+n)^2} + \Big(\frac{np+\alpha}{\alpha+\beta+n} - p\Big)^2.$$
Then you plug in $\alpha = \beta = (n/4)^{1/2}$. After simplifications,
$$\mathrm{MSE}(\hat p_B) = \frac{n}{4(n^{1/2}+n)^2},$$
and this is constant in $p$. How do we get such $\alpha$ and $\beta$? We take the partial derivative of the MSE in $p$ and set it equal to zero for all $p$.
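As a quick numerical sanity check (a minimal sketch assuming NumPy is available; the sample size $n = 25$ and the grid of $p$ values are arbitrary illustration choices), one can evaluate the two MSE terms on a grid of $p$ and confirm the sum is flat:

    import numpy as np

    # Check that MSE(p_B) with alpha = beta = sqrt(n/4) does not depend on p,
    # where p_B = (S_n + alpha)/(alpha + beta + n), S_n ~ Binomial(n, p).
    n = 25
    alpha = beta = np.sqrt(n / 4)

    p = np.linspace(0.01, 0.99, 9)
    variance = n * p * (1 - p) / (alpha + beta + n) ** 2       # Var_p(p_B)
    bias_sq = ((n * p + alpha) / (alpha + beta + n) - p) ** 2  # squared bias
    mse = variance + bias_sq

    print(mse)                              # identical at every p
    print(n / (4 * (np.sqrt(n) + n) ** 2))  # the claimed constant value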

2. Exerc. 7.37. Let $X_1,\dots,X_n$ be iid according to the pdf $f(x|\theta) = (2\theta)^{-1}I(|x| < \theta)$, $\theta \in \Omega = (0,\infty)$. We discussed in class that $|X|_{(n)} := \max_l |X_l|$ is the CSS. At the same time, $(X_{(1)}, X_{(n)})$ is not CSS because $E_\theta(X_{(1)}) = -E_\theta(X_{(n)})$. (What do you think about the SS statistic $(X_{(1)}, X_{(n)})$ for the cases of $\mathrm{Unif}(\theta, 2\theta)$ or $\mathrm{Unif}(\theta - 1/2,\, \theta + 1/2)$?) Now, $Y := |X|_{(n)}$ is SS by the Factorization Theorem because
$$f(\mathbf x|\theta) = (2\theta)^{-n}\, I\big(|x|_{(n)} \le \theta\big).$$
Further, let us find the pdf of $Y$. Write,
$$F_Y(y) = P(Y \le y) = P\big(|X|_{(n)} \le y\big) = P^n\big(|X_1| \le y\big) = \theta^{-n} y^n \quad \text{for } y \in (0,\theta).$$
Thus, the pdf of $Y$ is
$$f_Y(y) = \frac{d}{dz}F_Y(z)\Big|_{z=y} = n\theta^{-n} y^{n-1} I(0 < y < \theta). \qquad (1)$$
Let us show that $Y$ is CSS. Suppose that
$$E_\theta(g(Y)) = \int_0^\theta g(y)\, n\theta^{-n} y^{n-1}\, dy = 0 \quad \text{for all } \theta > 0. \qquad (2)$$
Then $dE_\theta(g(Y))/d\theta \equiv 0$ for all $\theta > 0$. This yields (remember the Leibniz rule on p. 69 on how to take the derivative)
$$-n^2\theta^{-n-1}\int_0^\theta g(y)\, y^{n-1}\, dy + n\, g(\theta)\, \theta^{n-1}\theta^{-n} \equiv 0, \quad \theta > 0.$$
But the integral in the above-written identity is zero due to (2). This implies that $n\, g(\theta)\, \theta^{-1} \equiv 0$ for all $\theta > 0$, and thus $g(\theta) \equiv 0$ for all $\theta > 0$. We proved (directly) that $Y$ is complete.

Our next step is to find an unbiased estimator $\delta(Y)$ which is also the UMVUE. Let us check, using (1), that
$$E_\theta(Y) = \int_0^\theta y\, n\theta^{-n} y^{n-1}\, dy = n\theta^{-n}\int_0^\theta y^n\, dy = n\theta^{-n}(n+1)^{-1}\theta^{n+1} = \frac{n}{n+1}\,\theta.$$
Thus, the UMVUE is
$$\delta^*(Y) = \frac{n+1}{n}\max_l |X_l|.$$

3. Exerc. 7.4. Let $X_1,\dots,X_n$ be a sample from Bernoulli($p$). We calculate the Fisher information for a single observation:
$$I(p) := E_p\Big\{-\frac{\partial^2}{\partial p^2}\ln\big(p^X(1-p)^{1-X}\big)\Big\} = E_p\Big\{-\frac{\partial^2}{\partial p^2}\big[X\ln(p) + (1-X)\ln(1-p)\big]\Big\}$$
$$= E_p\Big\{-\frac{\partial}{\partial p}\Big[\frac{X}{p} - \frac{1-X}{1-p}\Big]\Big\} = E_p\Big\{\frac{X}{p^2} + \frac{1-X}{(1-p)^2}\Big\} = \frac{(1-p)^2 p + p^2(1-p)}{p^2(1-p)^2} = \frac{(1-p)+p}{p(1-p)} = \frac{1}{p(1-p)}.$$
The Cramér–Rao lower bound tells us that
$$\mathrm{Var}_p(\delta(X)) \ge \frac{[\partial E_p(\delta(X))/\partial p]^2}{n I(p)}.$$
If $\delta(X)$ is unbiased, the numerator in the lower bound is 1, and this yields that $\mathrm{Var}_p(\delta(X)) \ge p(1-p)/n$; because $\mathrm{Var}_p(\bar X) = p(1-p)/n$, the sample mean is the best unbiased estimator of $p$.
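Before moving on, here is a minimal Monte Carlo sketch (assuming NumPy; $\theta = 3$ and $n = 20$ are arbitrary illustration values) checking that the UMVUE from Exerc. 7.37 is indeed unbiased:

    import numpy as np

    # Exerc. 7.37: delta*(Y) = ((n+1)/n) * max|X_l| should be unbiased for
    # theta when X_1, ..., X_n are iid Uniform(-theta, theta).
    rng = np.random.default_rng(0)
    theta, n, reps = 3.0, 20, 200_000

    X = rng.uniform(-theta, theta, size=(reps, n))
    Y = np.abs(X).max(axis=1)        # Y = |X|_(n), the CSS
    delta = (n + 1) / n * Y          # the UMVUE from the solution

    print(delta.mean())              # close to theta = 3.0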

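A similar sketch for Exerc. 7.4: the simulated variance of $\bar X$ should agree with the Cramér–Rao bound $p(1-p)/n$ (again, the particular $p$ and $n$ are arbitrary):

    import numpy as np

    # Exerc. 7.4: Var_p(Xbar) should match the Cramer-Rao bound p(1-p)/n,
    # since I(p) = 1/(p(1-p)) for a single Bernoulli observation.
    rng = np.random.default_rng(1)
    p, n, reps = 0.3, 50, 200_000

    X = rng.binomial(1, p, size=(reps, n))
    xbar = X.mean(axis=1)

    print(xbar.var())        # simulated Var_p(Xbar)
    print(p * (1 - p) / n)   # the lower bound, attained by the sample mean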
4. Exerc. 7.41. A sequence of iid RVs $X_1,\dots,X_n$ is observed. It is known that $E_{\mu,\sigma^2}(X) = \mu$, $\mathrm{Var}_{\mu,\sigma^2}(X) = \sigma^2$.

(a) If $\delta := \sum_{i=1}^n a_i X_i$, then
$$E_{\mu,\sigma^2}(\delta) = \sum_{i=1}^n a_i\mu = \mu\sum_{i=1}^n a_i = \mu \quad \text{iff} \quad \sum_{i=1}^n a_i = 1.$$

(b) Let us find $\{a_i\}$ that minimize the variance:
$$\mathrm{Var}_{\mu,\sigma^2}(\delta) = E_{\mu,\sigma^2}\Big(\Big(\sum_i a_i X_i - \mu\Big)^2\Big) = \mathrm{Var}_{\mu,\sigma^2}\Big(\sum_i a_i(X_i - \mu)\Big)$$
[because observations are iid we continue]
$$= \sum_i a_i^2\, \mathrm{Var}_{\mu,\sigma^2}(X_i) = \sigma^2\sum_i a_i^2.$$
Now we should find $\{a_i\}$ that minimize $\sum_i a_i^2$ given $\sum_i a_i = 1$. Let us check that $a_i \equiv 1/n$ are the extreme (what else can we try?). Write (in what follows the summation is over $i \in \{1,2,\dots,n\}$),
$$\sum_i a_i^2 = \sum_i\Big(a_i - \frac1n + \frac1n\Big)^2 = \sum_i\Big(a_i - \frac1n\Big)^2 + \frac2n\sum_i\Big(a_i - \frac1n\Big) + \frac1n.$$
The last sum is zero because $\sum_i a_i = 1$. As a result, we get that
$$\sum_i a_i^2 = \sum_i\Big(a_i - \frac1n\Big)^2 + \frac1n \ge \frac1n,$$
with the equality iff $a_i \equiv 1/n$.

5. Exerc. 7.47. We have $X = \mu + \epsilon$, where $\epsilon \sim N(0,\sigma^2)$ and $\mu$ is an underlying radius. A sample of size $n$ is observed. What is the UMVUE of $a = \pi\mu^2$? Here $\bar X$ is CSS (due to the exponential family), so I just note that $E_{\mu,\sigma^2}(\bar X^2) - n^{-1}\sigma^2 = \mu^2$, so
$$\hat a_{ub} = \pi(\bar X^2 - n^{-1}\sigma^2).$$
Because it is a function of the CSS, it is UMVUE. Here $\sigma^2$ is known, but what if it is also unknown? Consider the following estimator:
$$\tilde a := \pi(\bar X^2 - n^{-1}S^2).$$
What do you think about its properties?
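A tiny numerical illustration of part (b) of Exerc. 7.41 (a sketch assuming NumPy; the random weight vectors are arbitrary):

    import numpy as np

    # Exerc. 7.41(b): among weight vectors with sum(a) = 1, the value
    # sum(a**2) (hence Var(delta) = sigma^2 * sum(a**2)) is minimized
    # by a_i = 1/n.
    rng = np.random.default_rng(2)
    n = 10

    a = rng.random((5, n))
    a /= a.sum(axis=1, keepdims=True)   # normalize each row to sum to 1

    print((a ** 2).sum(axis=1))         # every entry is >= 1/n
    print(1 / n)                        # the minimum, attained at a_i = 1/n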

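And a simulation check of the unbiasedness claim in Exerc. 7.47 (the values of $\mu$, $\sigma$, and $n$ are arbitrary illustration choices):

    import numpy as np

    # Exerc. 7.47: pi * (Xbar**2 - sigma**2/n) should be unbiased for
    # a = pi * mu**2 when sigma^2 is known.
    rng = np.random.default_rng(3)
    mu, sigma, n, reps = 2.0, 1.5, 30, 200_000

    X = rng.normal(mu, sigma, size=(reps, n))
    xbar = X.mean(axis=1)
    a_hat = np.pi * (xbar ** 2 - sigma ** 2 / n)

    print(a_hat.mean())       # close to pi * mu**2
    print(np.pi * mu ** 2)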
= P p(x 1 = 1, X 2 = 1, X 3 = 1, X 4 = 1, X l = t) P p ( X l = t) = p4 [( 4)!/(t 4)!( t)!]p t 4 (1 p) t [!/t!( t)!]p t (1 p) t ] = ( 4)!t!!(t 4)! 7 Exerc 749 Let X 1,, X be a sample from Expo(λ) (a) Fid a UE of λ based o X (1) Well, because f X (x λ) = λ 1 e x/λ I(x > ) we use our techique to fid the desity of the first ordered observatio Remember how we do this: F X(1) (x λ) = P λ (X (1) x) = 1 P(X (1) > x) = 1 P λ (X 1 > x,, X > x) = 1 [λ 1 e z/λ dz] = 1 e x/λ Take derivative ad get f X(1) (x λ) = λ 1 e x/λ As we see, X (1) Expo(λ/) so E(X (1) ) = λ/ ad λ := X (1) is UE (b) Let us fid UMVUE Here Y := X l is CSS (agai due to the expoetial family) Because E λ (Y ) = λ we get ˆλ UMV U = X x 8 Exerc 752 Here X 1,,X are iid from Poisso(λ) (a) Write e λ λ X l f X (x λ) = = e λ λ x l x l! x l! As we see, this is a expoetial family with Y := X l beig the CSS We coclude, usig our theory, that X = Y/ is the UMVUE of λ (b) To aalyze directly E λ (S 2 X) = E λ {( 1) 1 (X l X) X} is possible but rather complicated Let be smart ad use the theory We kow that E λ (S 2 ) = λ because λ is the variace ad S 2 is UE of the variace But λ is also the mea for poisso distributio, so E λ (S 2 X) is the UMVUE of the mea Further, X is also UMVUE of λ ad X is the CSS, so by uiqueess of the UMVUE we have for Poisso distributio! E λ (S 2 X) = X 4

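And for Exerc. 7.49, a short simulation comparing the two unbiased estimators (parameter values arbitrary):

    import numpy as np

    # Exerc. 7.49: X_(1) ~ Expon(lambda/n), so n * X_(1) is unbiased for
    # lambda; the UMVUE Xbar is also unbiased but has much smaller variance.
    rng = np.random.default_rng(4)
    lam, n, reps = 2.0, 15, 200_000

    X = rng.exponential(lam, size=(reps, n))
    lam_tilde = n * X.min(axis=1)   # UE based on the minimum
    lam_hat = X.mean(axis=1)        # the UMVUE

    print(lam_tilde.mean(), lam_hat.mean())  # both close to lam = 2.0
    print(lam_tilde.var(), lam_hat.var())    # Var(n X_(1)) >> Var(Xbar)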
8. Exerc. 7.52. Here $X_1,\dots,X_n$ are iid from Poisson($\lambda$).

(a) Write
$$f_X(\mathbf x|\lambda) = \prod_{l=1}^n \frac{e^{-\lambda}\lambda^{x_l}}{x_l!} = \frac{e^{-n\lambda}\,\lambda^{\sum_l x_l}}{\prod_{l=1}^n x_l!}.$$
As we see, this is an exponential family with $Y := \sum_l X_l$ being the CSS. We conclude, using our theory, that $\bar X = Y/n$ is the UMVUE of $\lambda$.

(b) To analyze directly $E_\lambda(S^2|\bar X) = E_\lambda\{(n-1)^{-1}\sum_l (X_l - \bar X)^2\,|\,\bar X\}$ is possible but rather complicated. Let us be smart and use the theory. We know that $E_\lambda(S^2) = \lambda$ because $\lambda$ is the variance and $S^2$ is UE of the variance. But $\lambda$ is also the mean for the Poisson distribution, so $E_\lambda(S^2|\bar X)$ is the UMVUE of the mean. Further, $\bar X$ is also the UMVUE of $\lambda$ and $\bar X$ is the CSS, so by uniqueness of the UMVUE we have, for the Poisson distribution,
$$E_\lambda(S^2|\bar X) = \bar X.$$
Then we also can write that
$$\mathrm{Var}_\lambda(S^2) = \mathrm{Var}_\lambda\big(E_\lambda(S^2|\bar X)\big) + E_\lambda\big\{\mathrm{Var}_\lambda(S^2|\bar X)\big\} > \mathrm{Var}_\lambda\big(E_\lambda(S^2|\bar X)\big) = \mathrm{Var}_\lambda(\bar X).$$

(c) If $Y$ is CSS and $Z$ is any other statistic such that $E_\theta(Y) \equiv E_\theta(Z)$ for all $\theta \in \Omega$, then $E_\theta(Z|Y) = Y$ for all $\theta \in \Omega$. Indeed, let $E_\theta(Z|Y) =: g(Y)$; then $E_\theta(g(Y) - Y) \equiv 0$ for all $\theta \in \Omega$, because $E_\theta g(Y) = E_\theta(Z) = E_\theta(Y)$ for all $\theta \in \Omega$. Because $Y$ is complete, this yields that $g(Y) = Y$ a.s. Finally, we know that conditioning on a CSS reduces the variance, so
$$\mathrm{Var}_\theta(Z) > \mathrm{Var}_\theta\big(E_\theta(Z|Y)\big) = \mathrm{Var}_\theta(Y).$$

9. Exerc. 7.55. (a) Given that the pdf is $f(x|\theta) = \theta^{-1} I(0 < x < \theta)$. Here $Y := X_{(n)}$ is the CSS and $f_Y(y|\theta) = n\theta^{-n}y^{n-1}$. Then
$$E_\theta(Y^r) = \int_0^\theta y^r\, n\theta^{-n}y^{n-1}\, dy = n\theta^{-n}\int_0^\theta y^{n+r-1}\, dy = \frac{n}{n+r}\,\theta^{-n}\theta^{n+r} = \frac{n}{n+r}\,\theta^r.$$
As a result,
$$\widehat{\theta^r}_{\mathrm{UMVU}} := \frac{n+r}{n}\, X_{(n)}^r.$$

10. Exerc. 7.59. Here $X_1,\dots,X_n$ are iid from $N(\mu,\sigma^2)$. Find the UMVUE for $\sigma^p$, $p > 0$. Well, it is reasonable to try to analyze $(S^2)^{p/2}$ because $S^2$ is a good estimate of $\sigma^2$. We know that $S^2 \stackrel{D}{=} \sigma^2(n-1)^{-1}\chi^2_{n-1}$, so
$$E_{\mu,\sigma^2}(S^2)^{p/2} = \sigma^p\big[(n-1)^{-p/2} E\{(\chi^2_{n-1})^{p/2}\}\big].$$
Denote $K_p := (n-1)^{-p/2} E\{(\chi^2_{n-1})^{p/2}\}$. This is a function in $p$ which can be calculated for all $p$ (I skip its calculation, but you can think about the moment generating function for a chi-squared RV), and then
$$\hat\delta_p := \frac{(S^2)^{p/2}}{K_p} \qquad (3)$$
is the UMVUE. Indeed, $(\bar X, S^2)$ is the CSS for the normal distribution, and the mean of the estimator is $\sigma^p$.
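Two final simulation sketches (assuming NumPy; all parameter values are arbitrary illustration choices). First, for Exerc. 7.52: $S^2$ and $\bar X$ are both unbiased for $\lambda$, but $\bar X$ has the smaller variance:

    import numpy as np

    # Exerc. 7.52: for Poisson data, S^2 and Xbar are both unbiased for
    # lambda, but Var(S^2) > Var(Xbar), consistent with Xbar being the UMVUE.
    rng = np.random.default_rng(5)
    lam, n, reps = 3.0, 20, 200_000

    X = rng.poisson(lam, size=(reps, n))
    xbar = X.mean(axis=1)
    s2 = X.var(axis=1, ddof=1)      # sample variance with the (n-1) divisor

    print(xbar.mean(), s2.mean())   # both close to lam = 3.0
    print(xbar.var(), s2.var())     # Var(Xbar) < Var(S^2)

Second, for Exerc. 7.59: using the standard chi-squared moment $E\{(\chi^2_k)^{p/2}\} = 2^{p/2}\,\Gamma((k+p)/2)/\Gamma(k/2)$ (this closed form is the calculation skipped above), one can form $K_p$ explicitly and check that $\hat\delta_p$ from (3) has mean $\sigma^p$:

    import numpy as np
    from math import gamma

    # Exerc. 7.59: with k = n - 1 and
    #   K_p = k**(-p/2) * 2**(p/2) * Gamma((k+p)/2) / Gamma(k/2),
    # the estimator (S^2)**(p/2) / K_p from (3) should have mean sigma**p.
    rng = np.random.default_rng(6)
    mu, sigma, n, p, reps = 1.0, 2.0, 10, 3.0, 200_000

    k = n - 1
    K_p = k ** (-p / 2) * 2 ** (p / 2) * gamma((k + p) / 2) / gamma(k / 2)

    X = rng.normal(mu, sigma, size=(reps, n))
    s2 = X.var(axis=1, ddof=1)
    delta_p = s2 ** (p / 2) / K_p

    print(delta_p.mean())   # close to sigma**p = 8.0
    print(sigma ** p)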