Lecture 16: UMVUE: conditioning on sufficient and complete statistics

The 2nd method of deriving a UMVUE when a sufficient and complete statistic is available: find an unbiased estimator of $\vartheta$, say $U(X)$. Conditioning on a sufficient and complete statistic $T(X)$, $E[U(X)\mid T]$ is the UMVUE of $\vartheta$.

We need to derive an explicit form of $E[U(X)\mid T]$. From the uniqueness of the UMVUE, it does not matter which $U(X)$ is used. Thus, we should choose $U(X)$ so as to make the calculation of $E[U(X)\mid T]$ as easy as possible. We do not need the distribution of $T$, but we do need to work out the conditional expectation $E[U(X)\mid T]$. Using the independence of some statistics (Basu's theorem), we may avoid working with conditional distributions.

UW-Madison (Statistics), Stat 709, Lecture 16, 2018
Example 7.3.24 (binomial family)

Let $X_1,\dots,X_n$ be iid from binomial$(k,\theta)$ with known $k$ and unknown $\theta\in(0,1)$. We want to estimate $g(\theta)=P_\theta(X_1=1)=k\theta(1-\theta)^{k-1}$. Note that $T=\sum_{i=1}^n X_i\sim$ binomial$(kn,\theta)$ is the sufficient and complete statistic for $\theta$, but no unbiased estimator based on it is immediately evident. To apply conditioning, we take the simple unbiased estimator of $P_\theta(X_1=1)$, the indicator function $I(X_1=1)$. By Theorem 7.3.23, the UMVUE of $g(\theta)$ is
$$\psi(T)=E[I(X_1=1)\mid T]=P(X_1=1\mid T).$$
We need to simplify $\psi(T)$ and obtain an explicit form. When $T=0$, $P(X_1=1\mid T=0)=0$. For $t=1,\dots,kn$,
$$\psi(t)=P(X_1=1\mid T=t)=\frac{P_\theta\!\left(X_1=1,\sum_{i=1}^n X_i=t\right)}{P_\theta\!\left(\sum_{i=1}^n X_i=t\right)}
=\frac{P_\theta\!\left(X_1=1,\sum_{i=2}^n X_i=t-1\right)}{P_\theta\!\left(\sum_{i=1}^n X_i=t\right)}
=\frac{P_\theta(X_1=1)\,P_\theta\!\left(\sum_{i=2}^n X_i=t-1\right)}{P_\theta\!\left(\sum_{i=1}^n X_i=t\right)}$$
$$=\frac{\left[k\theta(1-\theta)^{k-1}\right]\binom{k(n-1)}{t-1}\theta^{t-1}(1-\theta)^{k(n-1)-(t-1)}}{\binom{kn}{t}\theta^{t}(1-\theta)^{kn-t}}
=\frac{k\binom{k(n-1)}{t-1}}{\binom{kn}{t}}.$$
Hence, the UMVUE of $g(\theta)=k\theta(1-\theta)^{k-1}$ is
$$\psi(T)=\begin{cases} k\binom{k(n-1)}{T-1}\Big/\binom{kn}{T} & T=1,\dots,kn\\[4pt] 0 & T=0.\end{cases}$$
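As a sanity check on the formula above (a quick numerical verification, not part of the lecture), we can confirm unbiasedness exactly: summing $\psi(t)$ against the binomial$(kn,\theta)$ pmf must reproduce $g(\theta)=k\theta(1-\theta)^{k-1}$.

```python
# Exact check that psi(t) = k*C(k(n-1), t-1) / C(kn, t) is unbiased for
# g(theta) = k*theta*(1-theta)^(k-1), by direct summation over the pmf of T.
from math import comb

def psi(t, n, k):
    """UMVUE of P_theta(X_1 = 1) evaluated at T = t."""
    if t == 0:
        return 0.0
    # math.comb returns 0 when t-1 > k(n-1), which is exactly what we want.
    return k * comb(k * (n - 1), t - 1) / comb(k * n, t)

def expected_psi(n, k, theta):
    """E_theta[psi(T)] with T ~ binomial(kn, theta), computed by summation."""
    return sum(
        psi(t, n, k) * comb(n * k, t) * theta**t * (1 - theta) ** (n * k - t)
        for t in range(n * k + 1)
    )

n, k, theta = 5, 3, 0.4
g = k * theta * (1 - theta) ** (k - 1)   # target g(theta)
print(abs(expected_psi(n, k, theta) - g) < 1e-12)  # True
```

The summation is exact (up to floating point), so the agreement with $g(\theta)$ holds to machine precision for any choice of $n$, $k$, $\theta$.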
Example 3.3

Let $X_1,\dots,X_n$ be i.i.d. from the exponential distribution $E(0,\theta)$, i.e., $F_\theta(x)=(1-e^{-x/\theta})I_{(0,\infty)}(x)$. Consider the estimation of $\vartheta=1-F_\theta(t)$. $\bar X$ is sufficient and complete for $\theta>0$, and $I_{(t,\infty)}(X_1)$ is unbiased for $\vartheta$: $E[I_{(t,\infty)}(X_1)]=P(X_1>t)=\vartheta$. Hence $T(X)=E[I_{(t,\infty)}(X_1)\mid\bar X]=P(X_1>t\mid\bar X)$ is the UMVUE of $\vartheta$. If the conditional distribution of $X_1$ given $\bar X$ were available, we could calculate $P(X_1>t\mid\bar X)$ directly. By Basu's theorem (Theorem 2.4), $X_1/\bar X$ and $\bar X$ are independent. By Proposition 1.10(vii),
$$P(X_1>t\mid\bar X=\bar x)=P(X_1/\bar X>t/\bar X\mid\bar X=\bar x)=P(X_1/\bar X>t/\bar x).$$
To compute this unconditional probability, we need the distribution of $X_1/\sum_{i=1}^n X_i = X_1/(X_1+\sum_{i=2}^n X_i)$. Using the transformation technique discussed in §1.3.1 and the fact that $\sum_{i=2}^n X_i$ is independent of $X_1$ and has a gamma distribution, we obtain that $X_1/\sum_{i=1}^n X_i$ has the Lebesgue p.d.f. $(n-1)(1-x)^{n-2}I_{(0,1)}(x)$. Hence
$$P(X_1>t\mid\bar X=\bar x)=\int_{t/(n\bar x)}^{1}(n-1)(1-x)^{n-2}\,dx=\left(1-\frac{t}{n\bar x}\right)^{n-1}$$
and the UMVUE of $\vartheta$ is
$$T(X)=\left(1-\frac{t}{n\bar X}\right)^{n-1}.$$
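A small Monte Carlo experiment (illustrative, not from the lecture; sample size, $\theta$, and $t$ are arbitrary choices) confirms that this estimator averages to $\vartheta=e^{-t/\theta}$:

```python
# Monte Carlo check: the UMVUE (1 - t/(n*Xbar))^(n-1) should average to
# P(X_1 > t) = exp(-t/theta) for the exponential E(0, theta) sample.
import math
import random

random.seed(0)
n, theta, t = 10, 2.0, 1.5
reps = 200_000

total = 0.0
for _ in range(reps):
    xbar = sum(random.expovariate(1 / theta) for _ in range(n)) / n
    # The estimator can be negative when n*Xbar < t; it is still the UMVUE.
    total += (1 - t / (n * xbar)) ** (n - 1)

est = total / reps
print(est, math.exp(-t / theta))   # the two values should be close
```

The agreement is within Monte Carlo error (a few parts in a thousand here); exact unbiasedness holds analytically.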
Example 3.4

Let $X_1,\dots,X_n$ be i.i.d. from $N(\mu,\sigma^2)$ with unknown $\mu\in\mathbb{R}$ and $\sigma^2>0$. From Example 2.18, $T=(\bar X,S^2)$ is sufficient and complete for $\theta=(\mu,\sigma^2)$; $\bar X$ and $(n-1)S^2/\sigma^2$ are independent; $\bar X$ has the $N(\mu,\sigma^2/n)$ distribution; $(n-1)S^2/\sigma^2$ has the chi-square distribution $\chi^2_{n-1}$. Using the method of solving for $h$ directly, we find that
the UMVUE of $\mu$ is $\bar X$;
the UMVUE of $\mu^2$ is $\bar X^2-S^2/n$;
the UMVUE of $\sigma^r$ with $r>1-n$ is $k_{n-1,r}S^r$, where
$$k_{n,r}=\frac{n^{r/2}\,\Gamma(n/2)}{2^{r/2}\,\Gamma((n+r)/2)};$$
the UMVUE of $\mu/\sigma$ is $k_{n-1,-1}\bar X/S$, if $n>2$.
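Two quick checks of the constant $k_{n,r}$ (mine, not part of the lecture; the parameter values are arbitrary): for $r=2$ it must reduce to $k_{n-1,2}=1$, since $S^2$ is already unbiased for $\sigma^2$, and a Monte Carlo run should give $E[k_{n-1,1}S]\approx\sigma$.

```python
# Checks of k_{n,r} = n^{r/2} Gamma(n/2) / (2^{r/2} Gamma((n+r)/2)).
import random
import statistics
from math import gamma

def k(n, r):
    return n ** (r / 2) * gamma(n / 2) / (2 ** (r / 2) * gamma((n + r) / 2))

# Exact check: k_{n-1,2} = 1 for every n, because E[S^2] = sigma^2.
for n in range(3, 12):
    assert abs(k(n - 1, 2) - 1) < 1e-12

# Monte Carlo check that k_{n-1,1} * S is unbiased for sigma.
random.seed(1)
n, sigma, reps = 8, 2.0, 100_000
acc = 0.0
for _ in range(reps):
    xs = [random.gauss(0.0, sigma) for _ in range(n)]
    acc += statistics.stdev(xs)          # sample standard deviation S
est = k(n - 1, 1) * acc / reps
print(est)                               # should be close to sigma = 2.0
```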
Example 3.4 (continued)

Suppose that $\vartheta$ satisfies $P(X_1\le\vartheta)=p$ with a fixed $p\in(0,1)$. Let $\Phi$ be the c.d.f. of the standard normal distribution. Then $\vartheta=\mu+\sigma\Phi^{-1}(p)$ and its UMVUE is $\bar X+k_{n-1,1}S\,\Phi^{-1}(p)$.
Let $c$ be a fixed constant and
$$\vartheta=P(X_1\le c)=\Phi\!\left(\frac{c-\mu}{\sigma}\right).$$
We can find the UMVUE of $\vartheta$ using the method of conditioning. Since $I_{(-\infty,c]}(X_1)$ is an unbiased estimator of $\vartheta$, the UMVUE of $\vartheta$ is $E[I_{(-\infty,c]}(X_1)\mid T]=P(X_1\le c\mid T)$. By Basu's theorem, the ancillary statistic $Z(X)=(X_1-\bar X)/S$ is independent of $T=(\bar X,S^2)$.
Example 3.4 (continued)

Then, by Proposition 1.10(vii),
$$P\big(X_1\le c\mid T=(\bar x,s^2)\big)=P\!\left(\frac{X_1-\bar X}{S}\le\frac{c-\bar X}{S}\,\Big|\,T=(\bar x,s^2)\right)=P\!\left(Z\le\frac{c-\bar x}{s}\right).$$
It can be shown that $Z$ has the Lebesgue p.d.f.
$$f(z)=\frac{\sqrt{n}\,\Gamma\!\left(\frac{n-1}{2}\right)}{\sqrt{\pi}\,(n-1)\,\Gamma\!\left(\frac{n-2}{2}\right)}\left[1-\frac{nz^2}{(n-1)^2}\right]^{(n-4)/2} I_{(0,(n-1)/\sqrt{n})}(|z|).$$
Hence the UMVUE of $\vartheta$ is
$$P(X_1\le c\mid T)=\int_{-(n-1)/\sqrt{n}}^{(c-\bar X)/S} f(z)\,dz.$$
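A numerical sanity check (mine, not from the lecture) that the density of $Z=(X_1-\bar X)/S$ stated above integrates to 1 on $|z|<(n-1)/\sqrt{n}$, using a simple composite Simpson rule:

```python
# Verify numerically that f(z) above is a probability density.
from math import gamma, pi, sqrt

def f_z(z, n):
    b = (n - 1) / sqrt(n)                          # support boundary
    if abs(z) >= b:
        return 0.0
    c = sqrt(n) * gamma((n - 1) / 2) / (sqrt(pi) * (n - 1) * gamma((n - 2) / 2))
    base = max(0.0, 1 - n * z * z / (n - 1) ** 2)  # guard round-off at boundary
    return c * base ** ((n - 4) / 2)

def simpson(f, a, b, m=20_000):
    """Composite Simpson's rule with an even number m of subintervals."""
    h = (b - a) / m
    odd = sum(f(a + (2 * i - 1) * h) for i in range(1, m // 2 + 1))
    even = sum(f(a + 2 * i * h) for i in range(1, m // 2))
    return (f(a) + f(b) + 4 * odd + 2 * even) * h / 3

for n in (5, 6, 10):
    b = (n - 1) / sqrt(n)
    total = simpson(lambda z: f_z(z, n), -b, b)
    print(n, total)   # each total should be close to 1
```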
Example 3.4 (continued)

Suppose that we would like to estimate
$$\vartheta=\frac{1}{\sigma}\,\Phi'\!\left(\frac{c-\mu}{\sigma}\right),$$
the Lebesgue p.d.f. of $X_1$ evaluated at a fixed $c$, where $\Phi'$ is the first-order derivative of $\Phi$. By the previous result, the conditional p.d.f. of $X_1$ given $\bar X=\bar x$ and $S^2=s^2$ is $s^{-1}f\!\left(\frac{x-\bar x}{s}\right)$. Let $f_T$ be the joint p.d.f. of $T=(\bar X,S^2)$. Then
$$\vartheta=\int \frac{1}{s}\,f\!\left(\frac{c-\bar x}{s}\right)f_T(t)\,dt=E\!\left[\frac{1}{S}\,f\!\left(\frac{c-\bar X}{S}\right)\right].$$
Hence the UMVUE of $\vartheta$ is
$$\frac{1}{S}\,f\!\left(\frac{c-\bar X}{S}\right).$$
Example

Let $X_1,\dots,X_n$ be i.i.d. with Lebesgue p.d.f. $f_\theta(x)=\theta x^{-2}I_{(\theta,\infty)}(x)$, where $\theta>0$ is unknown. Suppose that $\vartheta=P(X_1>t)$ for a constant $t>0$. The smallest order statistic $X_{(1)}$ is sufficient and complete for $\theta$. Hence, the UMVUE of $\vartheta$ is $P(X_1>t\mid X_{(1)})$:
$$P(X_1>t\mid X_{(1)}=x_{(1)})=P\!\left(\frac{X_1}{X_{(1)}}>\frac{t}{X_{(1)}}\,\Big|\,X_{(1)}=x_{(1)}\right)=P\!\left(\frac{X_1}{X_{(1)}}>\frac{t}{x_{(1)}}\,\Big|\,X_{(1)}=x_{(1)}\right)=P\!\left(\frac{X_1}{X_{(1)}}>s\right)\quad\text{(Basu's theorem)},$$
where $s=t/x_{(1)}$. If $s\le 1$, this probability is 1.
Example (continued)

Consider $s>1$ and assume $\theta=1$ in the calculation (the ratio $X_1/X_{(1)}$ is ancillary, so its distribution does not depend on $\theta$):
$$P\!\left(\frac{X_1}{X_{(1)}}>s\right)=\sum_{i=1}^{n}P\!\left(\frac{X_1}{X_{(1)}}>s,\;X_{(1)}=X_i\right)=\sum_{i=2}^{n}P\!\left(\frac{X_1}{X_{(1)}}>s,\;X_{(1)}=X_i\right)$$
(the $i=1$ term vanishes because $X_1/X_{(1)}=1<s$ on $\{X_{(1)}=X_1\}$)
$$=(n-1)P\!\left(\frac{X_1}{X_{(1)}}>s,\;X_{(1)}=X_n\right)=(n-1)P(X_1>sX_n,\;X_2>X_n,\dots,X_{n-1}>X_n)$$
$$=(n-1)\int_{x_1>sx_n,\,x_2>x_n,\dots,x_{n-1}>x_n}\prod_{i=1}^{n}x_i^{-2}\,dx
=(n-1)\int_1^\infty \frac{1}{sx_n}\left[\prod_{i=2}^{n-1}\int_{x_n}^\infty x_i^{-2}\,dx_i\right]x_n^{-2}\,dx_n$$
$$=\frac{n-1}{s}\int_1^\infty x_n^{-(n+1)}\,dx_n=\frac{n-1}{ns}=\frac{(n-1)x_{(1)}}{nt}.$$
Example (continued)

This shows that the UMVUE of $P(X_1>t)$ is
$$h(X_{(1)})=\begin{cases}\dfrac{(n-1)X_{(1)}}{nt} & X_{(1)}<t\\[4pt] 1 & X_{(1)}\ge t.\end{cases}$$
Another solution: use the method of finding $h$. The UMVUE must be of the form $h(X_{(1)})$, and the Lebesgue p.d.f. of $X_{(1)}$ is $n\theta^n x^{-(n+1)}I_{(\theta,\infty)}(x)$. If $\theta\ge t$, then $P(X_1>t)=1$ and $P(X_{(1)}<t)=0$. Hence, if $X_{(1)}\ge t$, $h(X_{(1)})$ must be 1 a.s. $P_\theta$; the value of $h(X_{(1)})$ for $X_{(1)}<t$ is not yet specified.
If $\theta<t$, unbiasedness requires
$$\frac{\theta}{t}=E[h(X_{(1)})]=\int_\theta^t h(x)\,n\theta^n x^{-(n+1)}\,dx+\int_t^\infty n\theta^n x^{-(n+1)}\,dx=\int_\theta^t h(x)\,n\theta^n x^{-(n+1)}\,dx+\frac{\theta^n}{t^n},$$
since $P(X_1>t)=\theta/t$. That is,
$$\frac{1}{nt\theta^{n-1}}=\int_\theta^t h(x)\,x^{-(n+1)}\,dx+\frac{1}{nt^n}.$$
Differentiating both sides w.r.t. $\theta$ leads to
$$-\frac{n-1}{nt\theta^{n}}=-\frac{h(\theta)}{\theta^{n+1}}.$$
Hence, for any $X_{(1)}<t$,
$$h(X_{(1)})=\frac{(n-1)X_{(1)}}{nt}.$$
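The two derivations can be cross-checked by simulation (my own illustration; the values of $n$, $\theta$, $t$ are arbitrary with $\theta<t$): averaging $h(X_{(1)})$ over samples should give $P(X_1>t)=\theta/t$.

```python
# Monte Carlo check: h(X_(1)) = (n-1)X_(1)/(n t) for X_(1) < t, else 1,
# should average to P(X_1 > t) = theta/t when theta < t.
import random

random.seed(2)
n, theta, t = 5, 1.0, 2.5
reps = 200_000

def sample_x(theta):
    # f_theta(x) = theta * x^{-2} on (theta, inf); F(x) = 1 - theta/x,
    # so inverse-CDF sampling gives X = theta / U with U ~ Uniform(0, 1).
    return theta / random.random()

total = 0.0
for _ in range(reps):
    x1 = min(sample_x(theta) for _ in range(n))      # X_(1)
    total += (n - 1) * x1 / (n * t) if x1 < t else 1.0
est = total / reps
print(est)   # should be close to theta/t = 0.4
```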
Unbiased estimators of 0

If a sufficient and complete statistic is not available, then what should we do? If $W$ is unbiased for $\vartheta$ and $T$ is sufficient, then by Theorem 2.5 (Rao-Blackwell), $E(W\mid T)$ is better than $W$. If we have another sufficient statistic $S$, should we consider $E[E(W\mid T)\mid S]$? If there is a function $h$ such that $S=h(T)$, then by the properties of conditional expectation,
$$E[E(W\mid T)\mid S]=E(W\mid S)=E[E(W\mid S)\mid T].$$
That is, we should always condition on a simpler sufficient statistic, such as a minimal sufficient statistic.
To see when an unbiased estimator is best unbiased, we might ask: how could we improve upon a given unbiased estimator? Suppose that $T(X)$ is unbiased for $g(\theta)$ and $U(X)$ is a statistic satisfying $E_\theta(U)=0$ for all $\theta$, i.e., $U$ is unbiased for 0. Then, for any constant $a$,
$T(X)+aU(X)$ is unbiased for $g(\theta)$. Can it be better than $T(X)$? We have
$$\mathrm{Var}_\theta(T+aU)=\mathrm{Var}_\theta(T)+2a\,\mathrm{Cov}_\theta(T,U)+a^2\,\mathrm{Var}_\theta(U).$$
If for some $\theta_0$, $\mathrm{Cov}_{\theta_0}(T,U)<0$, then we can make $2a\,\mathrm{Cov}_{\theta_0}(T,U)+a^2\,\mathrm{Var}_{\theta_0}(U)<0$ by choosing $0<a<-2\,\mathrm{Cov}_{\theta_0}(T,U)/\mathrm{Var}_{\theta_0}(U)$. Hence, $T(X)+aU(X)$ is better than $T(X)$ at least when $\theta=\theta_0$, and $T(X)$ cannot be UMVUE. Similarly, if $\mathrm{Cov}_{\theta_0}(T,U)>0$ for some $\theta_0$, then $T(X)$ cannot be UMVUE either (choose $a<0$). Thus, $\mathrm{Cov}_\theta(T,U)=0$ for all $\theta$ and all unbiased estimators $U(X)$ of 0 is necessary for $T(X)$ to be a UMVUE. It turns out that this condition is also sufficient for $T(X)$ being a UMVUE.
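The improvement argument can be seen concretely in a classical example of my own choosing (not from the lecture): with a single observation $X\sim$ Uniform$(\theta,\theta+1)$, $T=X-\tfrac12$ is unbiased for $\theta$ and $U=\sin(2\pi X)$ is unbiased for 0 (its integral over any unit interval vanishes), yet $\mathrm{Cov}_{\theta_0}(T,U)=-1/(2\pi)<0$ at $\theta_0=0$, so $T+aU$ with a suitable $a>0$ beats $T$ there:

```python
# Demonstration: T = X - 1/2 is unbiased for theta, U = sin(2*pi*X) is
# unbiased for 0, and T + aU has smaller variance than T at theta0 = 0.
import math
import random
import statistics

random.seed(3)
reps = 200_000
theta0 = 0.0
a = 1 / math.pi      # satisfies 0 < a < -2*Cov(T,U)/Var(U) = 2/pi at theta0

t_vals, tu_vals = [], []
for _ in range(reps):
    x = theta0 + random.random()        # X ~ Uniform(theta0, theta0 + 1)
    t = x - 0.5
    u = math.sin(2 * math.pi * x)
    t_vals.append(t)
    tu_vals.append(t + a * u)

var_t = statistics.pvariance(t_vals)    # should be near 1/12
var_tu = statistics.pvariance(tu_vals)  # strictly smaller than var_t
print(var_t, var_tu)
```

So $T$ cannot be UMVUE here, matching the covariance criterion; indeed no UMVUE of $\theta$ exists in this family.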