Fundamentals of Statistics

Size: px
Start display at page:

Download "Fundamentals of Statistics"

Transcription

1 Chapter 2 Fudametals of Statistics Exercise 1 (#2.9). Cosider the family of double expoetial distributios: P = { 2 1 e t µ : <µ< }. Show that P is ot a expoetial family. Solutio. Assume that P is a expoetial family. The there exist p- dimesioal Borel fuctios T (X) ad η(µ) (p 1) ad oe-dimesioal Borel fuctios h(x) ad ξ(µ) such that 2 1 exp{ t µ } = exp{[η(µ)] τ T (t) ξ(µ)}h(t) for ay t ad µ. Let X =(X 1,..., X ) be a radom sample from P P (i.e., X 1,..., X are idepedet ad idetically distributed with P P), where >p, T (X) = T (X i), ad h (X) =Π h(x i). The the joit Lebesgue desity of X is { } 2 exp x i µ = exp {[η(µ)] τ T (x) ξ(µ)} h (x) for ay x =(x 1,..., x ) ad µ, which implies that x i x i µ =[ η(µ)] τ T (x) ξ(µ) for ay x ad µ, where η(µ) =η(µ) η(0) ad ξ(µ) =ξ(µ) ξ(0). Defie ψ µ (x) = x i x i µ. We coclude that if x =(x 1,..., x ) ad y =(y 1,..., y ) such that T (x) =T (y), the ψ µ (x) =ψ µ (y) for all µ, which implies that vector of the ordered x i s is the same as the vector of the ordered y i s. O the other had, we may choose real umbers µ 1,..., µ p such that η(µ i ), i =1,..., p, are liearly idepedet vectors. Sice ψ µi (x) =[ η(µ i )] τ T (x) ξ(µ i ), i =1,..., p, 51

2 52 Chapter 2. Fudametals of Statistics for ay x, T (x) is the a fuctio of the p fuctios ψ µi (x), i =1,..., p. Sice >p, it ca be show that there exist x ad y i R such that ψ µi (x) =ψ µi (y), i =1,..., p, (which implies T (x) =T (y)), but the vector of ordered x i s is ot the same as the vector of ordered y i s. This cotradicts the previous coclusio. Hece, P is ot a expoetial family. Exercise 2 (#2.13). A discrete radom variable X with P (X = x) =γ(x)θ x /c(θ), x =0, 1, 2,..., where γ(x) 0, θ>0, ad c(θ) = x=0 γ(x)θx, is called a radom variable with a power series distributio. Show that (i) {γ(x)θ x /c(θ) :θ>0} is a expoetial family; (ii) if X 1,..., X are idepedet ad idetically distributed with a power series distributio γ(x)θ x /c(θ), the X i has the power series distributio γ (x)θ x /[c(θ)], where γ (x) is the coefficiet of θ x i the power series expasio of [c(θ)]. Solutio. (i) Note that γ(x)θ x /c(θ) = exp{x log θ log(c(θ))}γ(x). Thus, {γ(x)θ x /c(θ) :θ>0} is a expoetial family. (ii) From part (i), we kow that the atural parameter η = log θ, ad also ζ(η) = log (c(e η )). From the properties of expoetial families (e.g., Theorem 2.1 i Shao, 2003), the momet geeratig fuctio of X is ψ X (t) =e ζ(η+t) /e ζ(η) = c(θe t )/c(θ). The momet geeratig fuctio of X i is [c(θe t )] /[c(θ)], which is the momet geeratig fuctio of the power series distributio γ (x)θ x /[c(θ)]. Exercise 3 (#2.17). Let X be a radom variable havig the gamma distributio with shape parameter α ad scale parameter γ, where α is kow ad γ is ukow. let Y = σ log X. Show that (i) if σ>0is ukow, the the distributio of Y is i a locatio-scale family; (ii) if σ>0is kow, the the distributio of Y is i a expoetial family. Solutio. (i) The Lebesgue desity of X is 1 Γ(α)γ α xα 1 e x/γ I (0, ) (x). Applyig the result i the ote of Exercise 17 i Chapter 1, the Lebesgue desity for Y = σ log X is 1 { Γ(α)σ eα(y σ log γ)/σ exp e (y σ log γ)/σ}. It belogs to a locatio-scale family with locatio parameter η = σ log γ ad scale parameter σ.

3 Chapter 2. Fudametals of Statistics 53 (ii) Whe σ is kow, we rewrite the desity of Y as } 1 exp{αy/σ} exp { ey/σ σγ(α) γ α log γ. Therefore, the distributio of Y is from a expoetial family. Exercise 4. Let (X 1,..., X ) be a radom sample from N(0, 1). Show that Xi 2/ j=1 X2 j ad j=1 X2 j are idepedet, i =1,...,. Solutio. Note that X1 2,..., X 2 are idepedet ad have the chi-square distributio χ 2 1. Hece their joit Lebesgue desity is ce (y1+ +y)/2 y1 y, y j > 0, where c is a costat. Let U = j=1 X2 j ad V i = Xi 2 /U, i =1,...,. The Xi 2 = UV i ad j=1 V j = 1. The Lebesgue desity for (U, V 1,..., V 1 )is ce u/2 v u 1 u v 1 v = cu /2 1 e u/2 1 v 1 v 1 v 1 v 1, u > 0,v j > 0. Hece U ad (V 1 /U,..., V 1 /U) are idepedet. Sice V =1 (V V 1 ), we coclude that U ad V /U are idepedet. A alterative solutio ca be obtaied by usig Basu s theorem (e.g., Theorem 2.4 i Shao, 2003). Exercise 5. Let X =(X 1,..., X ) be a radom -vector havig the multivariate ormal distributio N (µj, D), where J is the -vector of 1 s, D = σ 2 1 ρ ρ ρ 1 ρ ρ ρ 1, ad ρ < 1. Show that X = 1 X i ad W = ( (X i X) 2 are idepedet, X has the ormal distributio N µ, 1+( 1)ρ σ ), 2 ad W/[(1 ρ)σ 2 ] has the chi-square distributio χ 2 1. Solutio. Defie A = ( 1) ( 1) ( 1) ( 1) 1 ( 1) ( 1).

4 54 Chapter 2. Fudametals of Statistics The AA τ = I (the idetity matrix) ad ADA τ = σ 2 1+( 1)ρ ρ ρ. Let Y = AX. The Y is ormally distributed with E(Y )= AE(X) = ( µ, 0,..., 0) ad Var(Y ) = ADA τ, i.e., the compoets of Y are idepedet. Let Y i be the ith compoet of Y. The, Y 1 = X ad Y i 2 = Y τ Y = X τ A τ AX = X τ X = X2 i. Hece X = Y 1 / ad W = (X i X) 2 = X2 i X 2 = Y i 2 Y1 2 = i=2 Y i 2. Sice Y i s are idepedet, X ad W are idepedet. Sice Y 1 has distributio N( µ, [1+( 1)ρ]σ 2 ), X = Y1 / ( has distributio N µ, 1+( 1)ρ σ ). 2 Sice Y 2,..., Y are idepedet ad idetically distributed as N(0, (1 ρ)σ 2 ), W/[(1 ρ)σ 2 ]= i=2 Y i 2/[(1 ρ)σ2 ] has the χ 2 1 distributio. Exercise 6. Let (X 1,..., X ) be a radom sample from the uiform distributio o the iterval [0, 1] ad let R = X () X (1), where X (i) is the ith order statistic. Derive the Lebesgue desity of R ad show that the limitig distributio of 2(1 R) is the chi-square distributio χ 2 4. Solutio. The joit Lebesgue desity of X (1) ad X () is { ( 1)(y x) 2 0 <x<y<1 f(x, y) = 0 otherwise (see, e.g., Example 2.9 i Shao, 2003). The, the joit Lebesgue desity of R ad X () is { ( 1)x 2 0 <x<y<1 g(x, y) = 0 otherwise ad, whe 0 <x<1, the Lebesgue desity of R is 1 g(x, y)dy = ( 1)y 2 ds = ( 1)x 2 (1 x) x for 0 <x<1. Cosequetly, the Lebesgue desity of 2(1 R) is { 1 h (x) = 4 x ( 1 2) x 2 0 <x<2 0 otherwise. ( Sice lim 1 x 2 2) = e x/2, lim h (x) =4 1 xe x/2 I (0, ) (x), which is the Lebesgue desity of the χ 2 4 distributio. By Scheffé s theorem (e.g.,

5 Chapter 2. Fudametals of Statistics 55 Propositio 1.18 i Shao, 2003), the limitig distributio of 2(1 R) is the χ 2 4 distributio. Exercise 7. Let (X 1,..., X ) be a radom sample from the expoetial distributio with Lebesgue desity θ 1 e (a x)/θ I (0, ) (x), where a Rad θ>0 are parameters. Let X (1) X () be order statistics, X (0) =0, ad Z i = X (i) X (i 1), i =1,...,. Show that (i) Z 1,..., Z are idepedet ad 2( i +1)Z i /θ has the χ 2 2 distributio; (ii) 2[ r X (i) +( r)x (r) a]/θ has the χ 2 2r distributio, r =1,..., ; (iii) X (1) ad Y are idepedet ad (X (1) a)/y has the Lebesgue desity ( ) 1+ t 1 I(0, ) (t), where Y =( 1) 1 (X i X (1) ). Solutio. If we ca prove the result for the case of a = 0 ad θ = 1, the the result for the geeral case follows by cosiderig the trasformatio (X i a)/θ, i =1,...,. Hece, we assume that a = 0 ad θ =1. (i) The joit Lebesgue desity of X (1),..., X () is f(x 1,..., x )= {!e x 1 x 0 <x 1 < <x 0 otherwise. The the joit Lebesgue desity of Z i, i =1,...,, is g(x 1,..., x )= {!e x 1 ( i+1)x i x x i > 0,,...,, 0 otherwise. Hece Z 1,..., Z are idepedet ad, for each i, the Lebesgue desity of 2Z i is ( i +1)e ( i+1)xi I (0, ) (x i ). The the desity of 2( i +1)Z i is 2 1 e xi/2 I (0, ) (x i ), which is the desity of the χ 2 2 distributio. (ii) For r =1,...,, r X (i) +( r)x (r) = r ( i +1)Z i. From (i), Z 1,..., Z are idepedet ad 2( i +1)Z i has the χ 2 2 distributio. Hece 2 r X (i) +( r)x (r) has the χ 2 2r distributio for ay r. (iii) Note that Y = 1 1 i=2 (X (i) X (1) )= 1 1 ( i +1)Z i. From the result i (i), Y ad X (1) are idepedet ad 2( 1)Y has the χ 2 2( 1) distributio. Hece the Lebesgue desity of Y is f Y (y) = ( 1) ( 1)! y 2 e ( 1)y I (0, ) (y). Note that the Lebesgue desity of X (1) is i=2

6 56 Chapter 2. Fudametals of Statistics f X(1) (x) = e x I (0, ) (x). Hece, for t > 0, the desity of the ratio X (1) /Y is (e.g., Example 1.15 i Shao, 2003) f(t) = x f Y (x)f X(1) (tx)dx ( 1) = 0 ( 1)! x 1 e (+t 1)x dx ( = 1+ t ) ( + t 1) x 1 e (+t 1)x dx 1 0 ( 1)! ( = 1+ t ). 1 Exercise 8 (#2.19). Let (X 1,..., X ) be a radom sample from the gamma distributio with shape parameter α ad scale parameter γ x ad let (Y 1,..., Y ) be a radom sample from the gamma distributio with shape parameter α ad scale parameter γ y. Assume that X i s ad Y i s are idepedet. Derive the distributio of the statistic X/Ȳ, where X ad Ȳ are the sample meas based o X i s ad Y i s, respectively. Solutio. From the property of the gamma distributio, X has the gamma distributio with shape parameter α ad scale parameter γ x ad Ȳ has the gamma distributio with shape parameter α ad scale parameter γ y. Sice X ad Ȳ are idepedet, the Lebesgue desity of the ratio X/Ȳ is, for t>0, f(t) = 1 [Γ(α)] 2 (γ x γ y ) α = Γ(2α)tα 1 [Γ(α)] 2 (γ x γ y ) α 0 ( t (tx) α 1 e tx/γx x α e x/γy dx γ x + 1 γ y ) 2α. Exercise 9 (#2.22). Let (Y i,z i ), i =1,...,, be idepedet ad idetically distributed radom 2-vectors. The sample correlatio coefficiet is defied to be 1 T = ( 1)S Y S Z (Y i Ȳ )(Z i Z), where Ȳ = 1 Y i, Z = 1 Z i, SY 2 =( 1) 1 (Y i Ȳ )2, ad SZ 2 =( 1) 1 (Z i Z) 2. (i) Assume that E Y i 4 < ad E Z i 4 <. Show that (T ρ) d N(0,c 2 ),

7 Chapter 2. Fudametals of Statistics 57 where ρ is the correlatio coefficiet betwee Y 1 ad Z 1 ad c is a costat. Idetify c i terms of momets of (Y 1,Z 1 ). (ii) Assume that Y i ad Z i are idepedetly distributed as N(µ 1,σ1) 2 ad N(µ 2,σ2), 2 respectively. Show that T has the Lebesgue desity Γ ( ) 1 2 ( πγ 2 )(1 t 2 ) ( 4)/2 I ( 1,1) (t). 2 (iii) Uder the coditios of part (ii), show that the result i (i) is the same as that obtaied by applyig Scheffé s theorem to the desity of T. Solutio. (i) Cosider first the special case of EY 1 = EZ 1 = 0 ad Var(Y 1 ) = Var(Z 1 )=1. LetW i = ( Y i,z i,yi 2,Z2 i,y ) iz i ad W = 1 W i. Sice W 1,..., W are idepedet ad idetically distributed ad Var(W 1 ) is fiite uder the assumptio of E Y 1 4 < ad E Z 1 4 <, by the cetral limit theorem, ( W θ) d N 5 (0, Σ), where θ =(0, 0, 1, 1,ρ) ad Σ= 1 ρ E(Y1 3 ) E(Y 1Z1 2 ) E(Y1 2 Z 1) ρ 1 E(Y1 2 Z 1) E(Z1 3 ) E(Y 1Z1 2 ) E(Y1 3 ) E(Y1 2 Z 1) E(Y1 4 ) 1 E(Y1 2 Z1 2 ) 1 E(Y1 3 Z 1) ρ E(Y 1Z1 2 ) E(Z1 3 ) E(Y1 2 Z1 2 ) 1 E(Z1 4 ) 1 E(Y 1Z1 3 ) ρ E(Y1 2 Z 1) E(Y 1Z1 2 ) E(Y1 3 Z 1) ρ E(Y 1Z1 3 ) ρ E(Y1 2 Z1 2 ) ρ 2. Defie h(x 1,x 2,x 3,x 4,x 5 )= x 5 x 1 x 2 (x3 x 2 1 )(x 4 x 2 2 ). The T = h( W ) ad ρ = h(θ). By the δ-method (e.g., Theorem 1.12 i Shao, 2003), [h( W ) h(θ)] d N(0,c 2 ), where c 2 = ξ τ Σξ ad ξ = w=θ =(0, 0, ρ/2, ρ/2, 1). Hece h(w) w c 2 = ρ 2 [E(Y1 4 )+E(Z1)+2E(Y Z1)]/4 2 ρ[e(y1 3 Z 1 )+E(Y 1 Z1)] 3 + E(Y1 2 Z1). 2 The result for the geeral case ca be obtaied by cosiderig the trasformatio (Y i EY i )/ Var(Y i ) ad (Z i EZ i )/ Var(Z i ). The value of c 2 is the give by the previous expressio with Y 1 ad Z 1 replaced by (Y 1 EY 1 )/ Var(Y 1 ) ad (Z 1 EZ 1 )/ Var(Z 1 ), respectively. (ii) We oly eed to cosider the case of µ 1 = µ 2 = 0 ad σ 2 1 = σ 2 2 =1. Let Y =(Y 1,..., Y ), Z =(Z 1,..., Z ), ad A Z be the -vector whose ith compoet is (Z i Z)/( 1S Z ). Note that ( 1)S 2 Y (A τ ZY ) 2 = Y τ B Z Y with B Z = I 1 JJ τ A Z A τ Z, where I is the idetity matrix of order ad J is the -vector of 1 s. Sice A τ Z A Z = 1 ad J τ A Z =0,B Z A Z =0,

8 58 Chapter 2. Fudametals of Statistics BZ 2 = B Z ad tr(b Z )= 2. Cosequetly, whe Z is cosidered to be a fixed vector, Y τ B Z Y ad A τ Z Y are idepedet, Aτ ZY is distributed as N(0, 1), Y τ B Z Y has the χ 2 2 distributio, ad 2A τ Z Y/ Y τ B Z Y has the t-distributio t 2. Sice T = A τ Z Y/( 1S Y ), P (T t) =E[P (T t Z)] [ ( A τ Z = E P Y )] Y τ B Z Y +(A τ Z Y t Z )2 [ ( A τ = E P Z Y Y τ B Z Y [ ( = E P t 2 t )] 2 1 t 2 = Γ( 1 2 ) ( 2)πΓ( 2 2 ) )] t 1 t 2 Z t 2 1 t 2 0 ) ( 1)/2 (1+ x2 dx, 2 where t 2 deotes a radom variable havig the t-distributio t 2 ad the third equality follows from the fact that t if ad oly if a t b 2 1 t 2 desity a a 2 +b 2 for real umbers a ad b ad t (0, 1). Thus, T has Lebesgue d P (T t) = dt 1 Γ( 2 ) πγ( 2 2 )(1 t2 ) ( 4)/2 I ( 1,1) (t). (iii) Uder the coditios of part (ii), ρ = 0 ad, from the result i (i), c = 1 ad T d N(0, 1). From the result i (ii), T has Lebesgue desity Γ( 1 2 ) ( ) ( 4)/2 πγ( 2 2 ) 1 t2 I (, )(t) 1 e t2 /2 2π by Stirlig s formula. By Scheffé s Theorem, T d N(0, 1). Exercise 10 (#2.23). Let X 1,..., X be idepedet ad idetically distributed radom variables with EX1 4 <, T =(Y,Z), ad T 1 = Y/ Z, where Y = 1 X i ad Z = 1 X2 i. (i) Show that (T θ) d N 2 (0, Σ) ad (T 1 ϑ) d N(0,c 2 ). Idetify θ, Σ,ϑ, ad c 2 i terms of momets of X 1. (ii) Repeat (i) whe X 1 has the ormal distributio N(0,σ 2 ). (iii) Repeat (i) whe X 1 has Lebesgue desity (2σ) 1 e x /σ. Solutio. (i) Defie θ j = E X 1 j, j =1, 2, 3, 4, ad W i =( X i,x 2 i ), i =1,...,. The T = 1 W i. Let θ = EW 1 =(θ 1,θ 2 ). By the cetral limit theorem, (T θ) d N 2 (0, Σ), where ( Σ= Var( X 1 ) Cov( X 1,X1 2 ) Cov( X 1,X1 2 ) Var(X1 2 ) ) ( θ2 θ1 2 θ = 3 θ 1 θ 2 θ 3 θ 1 θ 2 θ 4 θ2 2 ).

9 Chapter 2. Fudametals of Statistics 59 Let g(y, z) =y/ z. The T 1 = g(t ), g(θ) =θ 1 / θ 2, g y (y,z)=θ =1/ θ 2, ad g z (y,z)=θ = θ 1 /(2θ 3/2 2 ). The, by the δ-method, (T 1 ϑ) d N(0,c 2 ) with ϑ = θ 1 / θ 2 ad c 2 =1+ θ2 1θ 4 4θ 3 2 θ2 1. 4θ 3 (ii) We oly eed to calculate θ j. Whe X 1 is distributed as N(0,σ 2 ), a direct calculatio shows that θ 1 = 2σ/ π, θ 2 = σ 2, θ 3 =2 2σ 3 / π, ad θ 4 =3σ 4. (iii) Note that X 1 has the expoetial distributio with Lebesgue desity σ 1 e x/σ I (0, ) (x). Hece, θ j = σ j j!. θ 1θ 3 θ 2 2 Exercise 11 (#2.25). Let X be a sample from P P, where P is a family of distributios o the Borel σ-field o R. Show that if T (X) is a sufficiet statistic for P Pad T = ψ(s), where ψ is measurable ad S(X) is aother statistic, the S(X) is sufficiet for P P. Solutio. Assume first that all P i P are domiated by a σ-fiite measure ν. The, by the factorizatio theorem (e.g., Theorem 2.2 i Shao, 2003), dp dν (x) =g (T (x))h(x), P where h is a Borel fuctio of x (ot depedig o P ) ad g P (t) is a Borel fuctio of t. IfT = ψ(s), the dp dν (x) =g (ψ(s(x)))h(x) P ad, by the factorizatio theorem agai, S(X) is sufficiet for P P. Cosider the geeral case. Suppose that S(X) is ot sufficiet for P P. By defiitio, there exist at least two measures P 1 Pad P 2 P such that the coditioal distributios of X give S(X) uder P 1 ad P 2 are differet. Let P 0 = {P 1,P 2 }, which is a sub-family of P. Sice T (X) is sufficiet for P P, it is also sufficiet for P P 0. Sice all P i P 0 are domiated by the measure P 1 + P 2, by the previously proved result, S(X) is sufficiet for P P 0. Hece, the coditioal distributios of X give S(X) uder P 1 ad P 2 are the same. This cotradictio proves that S(X) is sufficiet for P P. Exercise 12. Let P = {f θ : θ Θ}, where f θ s are probability desities, f θ (x) > 0 for all x Rad, for ay θ Θ, f θ (x) is cotiuous i x. Let X 1 ad X 2 be idepedet ad idetically distributed as f θ. Show that if X 1 + X 2 is sufficiet for θ, the P is a expoetial family idexed by θ. Solutio. The joit desity of X 1 ad X 2 is f θ (x 1 )f θ (x 2 ). By the factorizatio theorem, there exist fuctios g θ (t) ad h(x 1,x 2 ) such that f θ (x 1 )f θ (x 2 )=g θ (x 1 + x 2 )h(x 1,x 2 ).

10 60 Chapter 2. Fudametals of Statistics The log f θ (x 1 ) + log f θ (x 2 )=g(x 1 + x 2,θ)+h 1 (x 1,x 2 ), where g(t, θ) = log g θ (t) ad h 1 (x 1,x 2 ) = log h(x 1,x 2 ). Let θ 0 Θ ad r(x, θ) = log f θ (x) log f θ0 (x) ad q(x, θ) =g(x, θ) g(x, θ 0 ). The Cosequetly, q(x 1 + x 2,θ) = log f θ (x 1 ) + log f θ (x 2 )+h 1 (x 1,x 2 ) log f θ0 (x 1 ) log f θ0 (x 2 ) h 1 (x 1,x 2 ) = r(x 1,θ)+r(x 2,θ). r(x 1 + x 2,θ)+r(0,θ)=q(x 1 + x 2,θ)=r(x 1,θ)+r(x 2,θ) for ay x 1, x 2, ad θ. Let s(x, θ) =r(x, θ) r(0,θ). The for ay x 1, x 2, ad θ. Hece, s(x 1,θ)+s(x 2,θ)=s(x 1 + x 2,θ) s(, θ) =s(1,θ) =0, ±1, ±2,... For ay ratioal umber m ( ad m are itegers ad m 0), s ( m,θ) = s ( 1 m,θ) = m m s ( 1 m,θ) = m s ( m m,θ) = m s (1,θ). Hece s(x, θ) =xs(1,θ) for ay ratioal x. From the cotiuity of f θ,we coclude that s(x, θ) =xs(1,θ) for ay x R, i.e., ay x R. The, for ay x ad θ, r(x, θ) =s(1,θ)x + r(0,θ) f θ (x) = exp{r(x, θ) + log f θ0 (x)} = exp{s(1,θ)x + r(0,θ) + log f θ0 (x)} = exp{η(θ)x ξ(θ)}h(x), where η(θ) =s(1,θ), ξ(θ) = r(0,θ), ad h(x) =f θ0 (x). This shows that P is a expoetial family idexed by θ. Exercise 13 (#2.30). Let X ad Y be two radom variables such that Y has the biomial distributio with size N ad probability π ad, give Y = y, X has the biomial distributio with size y ad probability p. (i) Suppose that p (0, 1) ad π (0, 1) are ukow ad N is kow. Show that (X, Y ) is miimal sufficiet for (p, π). (ii) Suppose that π ad N are kow ad p (0, 1) is ukow. Show

11 Chapter 2. Fudametals of Statistics 61 whether X is sufficiet for p ad whether Y is sufficiet for p. Solutio. (i) Let A = {(x, y) :x =0, 1,..., y, y =0, 1,..., N}. The joit probability desity of (X, Y ) with respect to the coutig measure is { = exp x log ( N y ) π y (1 π) N y ( y x ) p x (1 p) y x I A p π(1 p) + y log + N log(1 π) 1 p 1 π }( N y )( ) y I A. x Hece, (X, Y ) has a distributio from a expoetial family of full rak (0 <p<1 ad 0 <π<1). This implies that (X, Y ) is miimal sufficiet for (p, π). (ii) The joit probability desity of (X, Y ) ca be writte as { } ( )( ) p N y exp x log + y log(1 p) π y (1 π) N y I A. 1 p y x This is from a expoetial family ot of full rak. Let p 0 = 1 2, p 1 = 1 3, p 2 = 2 3, ad η(p) = (log p 1 p, log(1 p)). The, two vectors i R2, η(p 1 ) η(p 0 )=( log 2, 2 log 2 log 3) ad η(p 2 ) η(p 0 ) = (log 2, log 2 log 3), are liearly idepedet. By the properties of expoetial families (e.g., Example 2.14 i Shao, 2003), (X, Y ) is miimal sufficiet for p. Thus, either X or Y is sufficiet for p. Exercise 14 (#2.34). Let X 1,..., X be idepedet ad idetically distributed radom variables havig the Lebesgue desity { exp ( ) x µ 4 } σ ξ(θ), where θ =(µ, σ) Θ=R (0, ). Show that P = {P θ : θ Θ} is a expoetial family, where P θ is the joit distributio of X 1,..., X, ad that the statistic T = ( X i, X2 i, X3 i, ) X4 i is miimal sufficiet for θ Θ. Solutio. Let T (x) =( x i, x2 i, x3 i, i=3 x4 i ) for ay x = (x 1,..., x ) ad let η(θ) =σ 4 ( 4µ 3, 6µ 2, 4µ, 1). The joit desity of (X 1,..., X )is f θ (x) = exp { [η(θ)] τ T (x) µ 4 /σ 4 ξ(θ) }, which belogs to a expoetial family. For ay two sample poits x = (x 1,..., x ) ad y =(y 1,..., y ), { [( f θ (x) f θ (y) = exp 1 ) ( ) σ 4 x 4 i yi 4 4µ x 3 i yi 3 ( ) ( )]} + 6µ 2 x 2 i 4µ 3 x i y i, y 2 i

12 62 Chapter 2. Fudametals of Statistics which is free of parameter (µ, σ) if ad oly if T (x) =T (y). By Theorem 2.3(iii) i Shao (2003), T (X) is miimal sufficiet for θ. Exercise 15 (#2.35). Let (X 1,..., X ) be a radom sample of radom variables havig the Lebesgue desity f θ (x)=(2θ) 1[ I (0,θ) (x)+i (2θ,3θ) (x) ]. Fid a miimal sufficiet statistic for θ (0, ). Solutio. We use the idea of Theorem 2.3(i)-(ii) i Shao (2003). Let Θ r = {θ 1,θ 2,...} be the set of positive ratioal umbers, P 0 = {g θ : θ Θ r }, ad P = {g θ : θ>0}, where g θ (x) = f θ(x i ) for x =(x 1,..., x ). The P 0 Pad a.s. P 0 implies a.s. P (i.e., if a evet A satisfyig P (A) =0 for all P P 0, the P (A) = 0 for all P P). Let {c i } be a sequece of positive umbers satisfyig c i = 1 ad g (x) = c ig θi (x). Defie T =(T 1,T 2,...) with T i (x) =g θi (x)/g (x). By Theorem 2.3(ii) i Shao (2003), T is miimal sufficiet for θ Θ 0 (or P P 0 ). For ay θ>0, there is a sequece {θ ik } {θ i } such that lim k θ ik = θ. The g θ (x) = lim k g θik (x) = lim k T ik (x)g (x) holds for all x C with P (C) = 1 for all P P. By the factorizatio theorem, T is sufficiet for θ>0 (or P P). By Theorem 2.3(i) i Shao (2003), T is miimal sufficiet for θ>0. Exercise 16 (#2.36). Let (X 1,..., X ) be a radom sample of radom variables havig the Cauchy distributio with locatio parameter µ ad scale parameter σ, where µ Rad σ > 0 are ukow parameters. Show that the vector of order statistics is miimal sufficiet for (µ, σ). Solutio. The joit Lebesgue desity of (X 1,..., X )is f µ,σ (x) = σ π 1 σ 2 +(x i µ) 2, x =(x 1,..., x ). For ay x =(x 1,..., x ) ad y =(y 1,..., y ), suppose that f µ,σ (x) = ψ(x, y) f µ,σ (y) holds for ay µ ad σ, where ψ does ot deped o (µ, σ). Let σ = 1. The we must have [ 1+(y i µ) 2] = ψ(x, y) [1+(x i µ) 2] for all µ. Both sides of the above idetity ca be viewed as polyomials of degree 2 i µ. Compariso of the coefficiets to the highest terms gives

13 Chapter 2. Fudametals of Statistics 63 ψ(x, y) =1. Thus, [ 1+(y i µ) 2] = [1+(x i µ) 2] for all µ. As a polyomial of µ, the left-had side of the above idetity has 2 complex roots x i ± 1, i =1,...,, while the right-had side of the above idetity has 2 complex roots y i ± 1, i =1,...,. By the uique factorizatio theorem for the etire fuctios i complex aalysis, we coclude that the two sets of roots must agree. This meas that the ordered values of x i s are the same as the ordered values of y i s. By Theorem 2.3(iii) i Shao (2003), the order statistics of X 1,..., X is miimal sufficiet for (µ, σ). Exercise 17 (#2.40). Let (X 1,..., X ), 2, be a radom sample from a distributio havig Lebesgue desity f θ,j, where θ>0, j =1, 2, f θ,1 is the desity of N(0,θ 2 ), ad f θ,2 (x) =(2θ) 1 e x /θ. Show that T =(T 1,T 2 )is miimal sufficiet for (θ, j), where T 1 = X2 i ad T 2 = X i. Solutio A. Let P be the joit distributio of X 1,..., X. By the factorizatio theorem, T is sufficiet for (θ, j). Let P = {P : θ>0,j =1, 2}, P 1 = {P : θ>0,j =1}, ad P 2 = {P : θ>0,j =2}. Let S be a statistic sufficiet for P P. The S is sufficiet for P P j, j =1, 2. Note that P 1 is a expoetial family with T 1 as a miimal sufficiet statistic. Hece, there exists a Borel fuctio ψ 1 such that T 1 = ψ 1 (S) a.s. P 1. Sice all desities i P are domiated by those i P 1, we coclude that T 1 = ψ 1 (S) a.s. P. Similarly, P 2 is a expoetial family with T 2 as a miimal sufficiet statistic ad, thus, there exists a Borel fuctio ψ 2 such that T 2 = ψ 2 (S) a.s. P. This proves that T =(ψ 1 (S),ψ 2 (S)) a.s. P. Hece T is miimal sufficiet for (θ, j). Solutio B. Let P be the joit distributio of X 1,..., X. The Lebesgue desity of P ca be writte as { exp I {1}(j) 2θ 2 T 1 I }[ {2}(j) I{1} (j) T 2 θ (2πθ 2 ) + I ] {2}(j) /2 (2θ). Hece P = {P : θ>0,j =1, 2} is a expoetial family. Let ( I{1} (j) η(θ, j) = 2θ 2, I ) {2}(j). θ Note that η(1, 1)=( 1 2, 0), η(2 1/2, 1)=( 1, 0), ad η(1, 2)=(0, 1). The, η(2 1/2, 1) η(1, 1)=( 1 2, 0) ad η(1, 2) η(1, 1)=(1 2, 1) are two liearly idepedet vectors i R 2. Hece T =(T 1,T 2 ) is miimal sufficiet for (θ, j) (e.g., Example 2.14 i Shao, 2003).

14 64 Chapter 2. Fudametals of Statistics Exercise 18 (#2.41). Let (X 1,..., X ), 2, be a radom sample from a distributio with discrete probability desity f θ,j, where θ (0, 1), j = 1, 2, f θ,1 is the Poisso distributio with mea θ, ad f θ,2 is the biomial distributio with size 1 ad probability θ. (i) Show that T = X i is ot sufficiet for (θ, j). (ii) Fid a two-dimesioal miimal sufficiet statistic for (θ, j). Solutio. (i) To show that T is ot sufficiet for (θ, j), it suffices to show that, for some x t, P (X = x T = t) for j = 1 is differet from P (X = x T = t) for j = 2. Whe j =1, ( ) t ( 1) t x P (X = x T = t) = x t > 0, whereas whe j =2,P (X = x T = t) = 0 as log as x>1. (ii) Let g θ,j be the joit probability desity of X 1,..., X. Let P 0 = {g 1 4,1,g1 2,1,g1 2,2 }. The, a.s. P 0 implies a.s. P. By Theorem 2.3(ii) i Shao (2003), the two-dimesioal statistic ( ) g 1 2 S =,1 g 1 2,,2 = (e /4 2 T,e /2 W 2 T ) g 1 4,1 g 1 4,1 is miimal sufficiet for the family P 0, where { 1 Xi = 0 or 1,,...,, W = 0 otherwise. Sice there is a oe-to-oe trasformatio betwee S ad (T,W), we coclude that (T,W) is miimal sufficiet for the family P 0. For ay x = (x 1,..., x ), the joit desity of X 1,..., X is e θi {1}(j) (1 θ) I {2}(j) W I {2}(j) e T [I {1}(j) log θ+i {2} (j) log θ (1 θ) ] 1 x i!. Hece, by the factorizatio theorem, (T,W) is sufficiet for (θ, j). Theorem 2.3(i) i Shao (2003), (T,W) is miimal sufficiet for (θ, j). Exercise 19 (#2.44). Let (X 1,..., X ) be a radom sample from a distributio o R havig the Lebesgue desity θ 1 e (x θ)/θ I (θ, ) (x), where θ>0 is a ukow parameter. (i) Fid a statistic that is miimal sufficiet for θ. (ii) Show whether the miimal sufficiet statistic i (i) is complete. Solutio. (i) Let T (x) = x i ad W (x) = mi 1 i x i, where x =(x 1,..., x ). The joit desity of X =(X 1,..., X )is f θ (x) = e θ e T (x)/θ I (θ, ) (W (x)). By

15 Chapter 2. Fudametals of Statistics 65 For x =(x 1,..., x ) ad y =(y 1,..., y ), f θ (x) f θ (y) = e[t (y) T (x)]/θ I (θ, )(W (x)) I (θ, ) (W (y)) is free of θ if ad oly if T (x) =T (y) ad W (x) =W (y). Hece, the two-dimesioal statistic (T (X),W(X)) is miimal sufficiet for θ. (ii) A direct calculatio shows that, for ay θ, E[T (X)] = 2θ ad E[W (X)] =(1+ 1 )θ. Hece E[(2) 1 T (1 + 1 ) 1 W (X)] = 0 for ay θ ad (2) 1 T (1 + 1 ) 1 W (X) is ot a costat. Thus, (T,W) is ot complete. Exercise 20 (#2.48). Let T be a complete (or boudedly complete) ad sufficiet statistic. Suppose that there is a miimal sufficiet statistic S. Show that T is miimal sufficiet ad S is complete (or boudedly complete). Solutio. We prove the case whe T is complete. The case i which T is boudedly complete is similar. Sice S is miimal sufficiet ad T is sufficiet, there exists a Borel fuctio h such that S = h(t ) a.s. Sice h caot be a costat fuctio ad T is complete, we coclude that S is complete. Cosider T E(T S) =T E[T h(t )], which is a Borel fuctio of T ad hece ca be deoted as g(t ). Note that E[g(T )] = 0. By the completeess of T, g(t ) = 0 a.s., that is, T = E(T S) a.s. This meas that T is also a fuctio of S ad, therefore, T is miimal sufficiet. Exercise 21 (#2.53). Let X be a discrete radom variable with probability desity θ x =0 f θ (x) = (1 θ) 2 θ x 1 x =1, 2,... 0 otherwise, where θ (0, 1). Show that X is boudedly complete, but ot complete. Solutio. Cosider ay Borel fuctio h(x) such that E[h(X)] = h(0)θ + h(x)(1 θ) 2 θ x 1 =0 x=1 for ay θ (0, 1). Rewritig the left-had side of the above equatio i the ascedig order of the powers of θ, we obtai that h(1) + [h(x 1) 2h(x)+h(x + 1)] θ x =0 x=1 for ay θ (0, 1). Comparig the coefficiets of both sides, we obtai that h(1) = 0 ad h(x 1) h(x) =h(x) h(x+1). Therefore, h(x) =(1 x)h(0)

16 66 Chapter 2. Fudametals of Statistics for x =1, 2,... This fuctio is bouded if ad oly if h(0)=0. Ifh(x) is assumed to be bouded, the h(0) = 0 ad, hece, h(x) 0. This meas that X is boudedly complete. For h(x) =1 x, E[h(X)] = 0 for ay θ but h(x) 0. Therefore, X is ot complete. Exercise 22. Let X be a discrete radom variable with ( θ N θ ) P θ (X = x) = x)( x ( N, x =0, 1, 2,..., mi{θ, }, x N θ, ) where ad N are positive itegers, N, ad θ =0, 1,..., N. Show that X is complete. Solutio. Let g(x) be a fuctio of x {0, 1,..., }. Assume E θ [g(x)]=0 for ay θ, where E θ is the expectatio with respect to P θ. Whe θ =0, P 0 (X = x) =1ifx = 0 ad E 0 [g(x)] = g(0). Thus, g(0) = 0. Whe θ =1, P 1 (X 2) = 0 ad ( N 1 ) 1 E 1 [g(x)] = g(0)p 1 (X =0)+g(1)P 1 (X =1)=g(1) ( N. ) Sice E 1 [g(x)] = 0, we obtai that g(1) = 0. Similarly, we ca show that g(2) = = g() = 0. Hece X is complete. Exercise 23. Let X be a radom variable havig the uiform distributio o the iterval (θ, θ + 1), θ R. Show that X is ot complete. Solutio. Cosider g(x) = cos(2πx). The g(x) 0 but E[g(X)] = θ+1 θ cos(2πx)dx = for ay θ. Hece X is ot complete. si(2π(θ + 1)) si(2πθ) 2π Exercise 24 (#2.57). Let (X 1,..., X ) be a radom sample from the N(θ, θ 2 ) distributio, where θ>0is a parameter. Fid a miimal sufficiet statistic for θ ad show whether it is complete. Solutio. The joit Lebesgue desity of X 1,..., X is { } 1 (2πθ 2 ) exp 1 2θ 2 x 2 i + 1 x i 1. θ 2 Let ( η(θ) = 1 2θ 2, 1 ). θ The η( 1 2 ) η(1) = ( 3 2, 1) ad η( 1 2 ) η(1) = ( 1 2, 2) are liearly idepedet vectors i R 2. Hece T =( X2 i, X i) is miimal =0

17 Chapter 2. Fudametals of Statistics 67 sufficiet for θ. Note that ( E ad X 2 i ) = EX1 2 =2θ 2 ( ) 2 E X i = θ 2 +(θ) 2 =( + 2 )θ 2. Let h(t 1,t 2 )= 1 2 t 1 1 (+1) t2 2. The h(t 1,t 2 ) 0 but E[h(T )] = 0 for ay θ. Hece T is ot complete. Exercise 25 (#2.56). Suppose that (X 1,Y 1 ),..., (X,Y ) are idepedet ad idetically distributed radom 2-vectors ad X i ad Y i are idepedetly distributed as N(µ, σx 2 ) ad N(µ, σ2 Y ), respectively, with θ = (µ, σx 2,σ2 Y ) R (0, ) (0, ). Let X ad SX 2 be the sample mea ad variace for X i s ad Ȳ ad S2 Y be the sample mea ad variace for Y i s. Show that T =( X,Ȳ,S2 X,S2 Y ) is miimal sufficiet for θ but T is ot boudedly complete. Solutio. Let ) ad η = ( 1 2σX 2, µ σ 2 X ( S = Xi 2, Yi 2,, 1 2σY 2, µ σ 2 Y ) X i, Y i. The the joit Lebesgue desity of (X 1,Y 1 ),..., (X,Y )is } 1 {η (2π) exp τ S µ2 2σX 2 µ2 2σY 2 log(σ X σ Y ). Sice the parameter space {η : µ R,σX 2 > 0,σ2 Y > 0} is a threedimesioal curved hyper-surface i R 4, we coclude that S is miimal sufficiet. Note that there is a oe-to-oe correspodece betwee T ad S. Hece T is also miimal sufficiet. To show that T is ot boudedly complete, cosider h(t )=I { X> Ȳ } 1 2. The h(t ) 0.5 ad E[h(T )] = 0 for ay η, but h(t ) 0. Hece T is ot boudedly complete. Exercise 26 (#2.58). Suppose that (X 1,Y 1 ),..., (X,Y ) are idepedet ad idetically distributed radom 2-vectors havig the ormal distributio with EX 1 = EY 1 = 0, Var(X 1 )=Var(Y 1 ) = 1, ad Cov(X 1,Y 1 )=θ ( 1, 1). (i) Fid a miimal sufficiet statistic for θ. (ii) Show whether the miimal sufficiet statistic i (i) is complete or ot.

18 68 Chapter 2. Fudametals of Statistics (iii) Prove that T 1 = X2 i ad T 2 = Y i 2 are both acillary but (T 1,T 2 ) is ot acillary. Solutio. (i) The joit Lebesgue desity of (X 1,Y 1 ),..., (X,Y )is ( ) { 1 2π exp 1 1 θ 2 1 θ 2 (x 2 i + yi 2 )+ 2θ 1 θ 2 } x i y i. Let η = ( 1 ) 1 θ 2, 2θ 1 θ 2. The parameter space {η : 1 < θ < 1} is a curve i R 2. Therefore, ( (X2 i + Y i 2), (ii) Note that E[ X iy i ) is miimal sufficiet. (X2 i + Y i 2)] 2 = 0, but (X2 i + Y i 2 ) 2 0. Therefore, the miimal sufficiet statistic is ot complete. (iii) Both T 1 ad T 2 have the chi-square distributio χ 2, which does ot deped o θ. Hece both T 1 ad T 2 are acillary. Note that ( ) E(T 1 T 2 )=E Xi 2 = E ( X 2 i Y 2 i j=1 Y 2 j ) + E i j X 2 i Y 2 j = E(X 2 1 Y 2 1 )+( 1)E(X 2 1 )E(Y 2 1 ) = (1+2θ 2 )+2( 1), which depeds o θ. Therefore the distributio of (T 1,T 2 ) depeds o θ ad (T 1,T 2 ) is ot acillary. Exercise 27 (#2.59). Let (X 1,..., X ), >2, be a radom sample from the expoetial distributio o (a, ) with scale parameter θ. Show that (i) (X i X (1) ) ad X (1) are idepedet for ay (a, θ), where X (j) is the jth order statistic; (ii) Z i =(X () X (i) )/(X () X ( 1) ), i =1,..., 2, are idepedet of (X (1), (X i X (1) )). Solutio: (i) Let θ be arbitrarily fixed. Sice the joit desity of X 1,..., X is { θ e a/θ exp 1 θ } x i I (a, ) (x (1) ), where oly a is cosidered as a ukow parameter, we coclude that X (1) is sufficiet for a. Note that θ e (x a)/θ I (a, ) (x) is the Lebesgue desity

19 Chapter 2. Fudametals of Statistics 69 for X (1). For ay Borel fuctio g, E[g(X (1) )] = θ for ay a is equivalet to a a g(x)e (x a)/θ dx =0 g(x)e x/θ dx =0 for ay a, which implies g(x) = 0 a.e. with respect to Lebesgue measure. Hece, for ay fixed θ, X (1) is sufficiet ad complete for a. The Lebesgue desity of X i a is θ 1 e x/θ I (0, ) (x), which does ot deped o a. Therefore, for ay fixed θ, (X i X (1) )= [(X i a) (X (1) a)] is acillary. By Basu s theorem (e.g., Theorem 2.4 i Shao, 2003), (X i X (1) ) ad X (1) are idepedet for ay fixed θ. Sice θ is arbitrary, we coclude that (X i X (1) ) ad X (1) are idepedet for ay (a, θ). (ii) From Example 5.14 i Lehma (1983, p. 47), (X (1), (X i X (1) )) is sufficiet ad complete for (a, θ). Note that (X i a)/θ has Lebesgue desity e x I (0, ) (x), which does ot deped o (a, θ). Sice Z i = X () X (i) X () X ( 1) = X () a θ X () a θ X (i) a θ X ( 1) a θ the statistic (Z 1,..., Z 2 ) is acillary. By Basu s Theorem, (Z 1,..., Z 2 ) is idepedet of ( X (1), (X i X (1) ) ). Exercise 28 (#2.61). Let (X 1,..., X ), > 2, be a radom sample of radom variables havig the uiform distributio o the iterval [a, b], where <a<b<. Show that Z i =(X (i) X (1) )/(X () X (1) ), i =2,..., 1, are idepedet of (X (1),X () ) for ay a ad b, where X (j) is the jth order statistic. Solutio. Note that (X i a)/(b a) has the uiform distributio o the iterval [0, 1], which does ot deped o ay (a, b). Sice Z i = X (i) X (1) X () X (1) = X (i) a b a X () a b a X (1) a b a X (1) a b a the statistic (Z 2,..., Z 1 ) is acillary. By Basu s Theorem, the result follows if (X (1),X () ) is sufficiet ad complete for (a, b). The joit Lebesgue desity of X 1,..., X is (b a) I {a<x(1) <x () <b}. By the factorizatio theorem, (X (1),X () ) is sufficiet for (a, b). The joit Lebesgue desity of (X (1),X () )is ( 1) (b a) (y x) 2 I {a<x<y<b}.,,

20 70 Chapter 2. Fudametals of Statistics For ay Borel fuctio g(x, y), E[g(X (1),X () )] = 0 for ay a<bimplies that g(x, y)(y x) 2 dxdy =0 a<x<y<b for ay a < b. Hece g(x, y)(y x) 2 = 0 a.e. m 2, where m 2 is the Lebesgue measure o R 2. Sice (y x) 2 0 a.e. m 2, we coclude that g(x, y) = 0 a.e. m 2. Hece, (X (1),X () ) is complete. Exercise 29 (#2.62). Let (X 1,..., X ), >2, be a radom sample from a distributio P o R with EX1 2 <, X be the sample mea, X(j) be the jth order statistic, ad T =(X (1) + X () )/2. Cosider the estimatio of a parameter θ Ruder the squared error loss. (i) Show that X is better tha T if P = N(θ, σ 2 ), θ R, σ>0. (ii) Show that T is better tha X if P is the uiform distributio o the iterval (θ 1 2,θ+ 1 2 ), θ R. (iii) Fid a family P for which either X or T is better tha the other. Solutio. (i) Sice X is complete ad sufficiet for θ ad T X is acillary to θ, by Basu s theorem, T X ad X are idepedet. The R T (θ) =E[(T X)+( X θ)] 2 = E(T X) 2 + R X(θ) >R X(θ), where the last iequality follows from the fact that T X a.s. Therefore X is better. (ii) Let W = X (1) θ+x () θ 2. The the Lebesgue desity of W is f(w) = Therefore ET = EW + θ = θ ad O the other had, 2 ( 1 w + 2) <w<0 2 ( w) 1 0 <w< otherwise. R T (θ) =Var(T )=Var(W )= R X(θ) =Var( X) = Var(X 1) 1 2( + 1)( +2). = Hece, whe >2, R T (θ) <R X(θ). (iii) Cosider the family P = P 1 P 2, where P 1 is the family i part (i) ad P 2 is the family i part (ii). Whe P P 1, X is better tha T. Whe P P 2, T is better tha X. Therefore, either of them is better tha the other for P P. Exercise 30 (#2.64). Let (X 1,..., X ) be a radom sample of biary radom variables with P (X i =1)=θ (0, 1). Cosider estimatig θ with

21 Chapter 2. Fudametals of Statistics 71 the squared error loss. Calculate the risks of the followig estimators: (i) the oradomized estimators X (the sample mea) ad 0 if more tha half of X i s are 0 T 0 (X) = 1 if more tha half of X i s are if exactly half of X i s are 0; (ii) the radomized estimators { X with probability 1 T 1 (X) = 2 T 0 with probability 1 2 ad { X with probability X T 2 (X) = 1 2 with probability 1 X. Solutio. (i) Note that R T0 (θ) =E(T 0 θ) 2 = θ 2 P ( X <0.5)+(1 θ) 2 P ( X >0.5)+(0.5 θ) 2 P ( X =0.5). Whe =2k, P ( X k 1 ( ) 2k <0.5) = θ j (1 θ) 2k j, j ad Whe =2k +1, P ( X >0.5) = j=1 2k j=k+1 P ( X =0.5) = P ( X <0.5) = P ( X >0.5) = ( ) 2k θ j (1 θ) 2k j, j ( ) 2k θ k (1 θ) k. k k ( ) 2k +1 θ j (1 θ) 2k+1 j, j j=0 2k+1 j=k+1 ad P ( X =0.5)=0. (ii) A direct calculatio shows that R T1 (θ) =E(T 1 θ) 2 ( ) 2k +1 θ j (1 θ) 2k+1 j, j = 1 2 E( X θ) E(T 0 θ) 2 = θ(1 θ) R T 0 (θ),

22 72 Chapter 2. Fudametals of Statistics where R T0 (θ) is give i part (i), ad R T2 (θ) =E(T 2 θ) 2 [ = E X( X θ) 2 + ( ) X)] θ (1 ( ) 2 = E( X θ) 3 + θe( X 1 θ) θ (1 θ) = 1 3 E(X i θ)(x j θ)(x k θ) j=1 k=1 + θ2 (1 θ) + ( ) θ (1 θ) = E(X 1 θ) θ2 (1 θ) + ( ) θ (1 θ) = θ(1 θ)3 θ 3 (1 θ) 2 + θ2 (1 θ) + ( ) θ (1 θ), where the fourth equality follows from E( X θ) 2 = Var( X) =θ(1 θ)/ ad the fifth equality follows from the fact that E(X i θ)(x j θ)(x k θ) 0 if ad oly if i = j = k. Exercise 31 (#2.66). Cosider the estimatio of a ukow parameter θ 0 uder the squared error loss. Show that if T ad U are two estimators such that T U ad R T (P ) <R U (P ), the R T+ (P ) <R U+ (P ), where R T (P ) is the risk of a estimator T ad T + deotes the positive part of T. Solutio. Note that T = T + T, where T = max{ T,0} is the egative part of T, ad T + T = 0. The R T (P )=E(T θ) 2 = E(T + T θ) 2 = E(T + θ) 2 + E(T )+2θE(T 2 ) 2E(T + T ) = R T+ (P )+E(T )+2θE(T 2 ). Similarly, R U (P )=R U+ (P )+E(U )+2θE(U 2 ). Sice T U, T U. Also, θ 0. Hece, E(T 2 )+2θE(T ) E(U 2 )+2θE(U ). Sice R T (P ) <R U (P ), we must have R T+ (P ) <R U+ (P ). Exercise 32. Cosider the estimatio of a ukow parameter θ R uder the squared error loss. Show that if T ad U are two estimators such

23 Chapter 2. Fudametals of Statistics 73 that P (θ t<t <θ+ t) P (θ t<u<θ+ t) for ay t>0, the R T (P ) R U (P ). Solutio. From the coditio, for ay s>0. Hece, P ((T θ) 2 >s) P ((U θ) 2 >s) R T (P )=E(T θ) 2 = 0 0 = E(U θ) 2 = R U (P ). P ((T θ) 2 >s)ds P ((U θ) 2 >s)ds Exercise 33 (#2.67). Let (X 1,..., X ) be a radom sample from the expoetial distributio o (0, ) with scale parameter θ (0, ). Cosider the hypotheses H 0 : θ θ 0 versus H 1 : θ>θ 0, where θ 0 > 0isa fixed costat. Obtai the risk fuctio (i terms of θ) of the test rule T c = I (c, ) ( X) uder the 0-1 loss, where X is the sample mea ad c>0 is a costat. Solutio. Let L(θ, a) be the loss fuctio. The L(θ, 1) = 0 whe θ>θ 0, L(θ, 1) = 1 whe θ θ 0, L(θ, 0) = 0 whe θ θ 0, ad L(θ, 0) = 1 whe θ>θ 0. Hece, R Tc (θ) =E[L(θ, I (c, ) ( X))] = E [ L(θ, 1)I (c, ) ( X)+L(θ, 0)I (0,c] ( X) ] = L(θ, 1)P ( X >c)+l(θ, 0)P ( X c) { P ( X >c) θ θ0 = P ( X c) θ>θ 0. Sice X has the gamma distributio with shape parameter ad scale parameter θ, P ( X >c)= 1 θ ( 1)! c x 1 e x/θ dx. Exercise 34 (#2.71). Cosider a estimatio problem with a parametric family P = {P θ : θ Θ} ad the squared error loss. If θ 0 Θ satisfies that P θ P θ0 for ay θ Θ, show that the estimator T θ 0 is admissible.

24 74 Chapter 2. Fudametals of Statistics Solutio. Note that the risk R T (θ) = 0 whe θ = θ 0. Suppose that U is a estimator of θ ad R U (θ) =E(U θ) 2 R T (θ) for all θ. The R U (θ 0 )=0, i.e., E(U θ 0 ) 2 = 0 uder P θ0. Therefore, U = θ 0 a.s. P θ0. Sice P θ P θ0 for ay θ, we coclude that U = θ 0 a.s. P. Hece U = T a.s. P. Thus, T is admissible. Exercise 35 (#2.73). Let (X 1,..., X ) be a radom sample of radom variables with EX1 2 <. Cosider estimatig µ = EX 1 uder the squared error loss. Show that (i) ay estimator of the form a X + b is iadmissible, where X is the sample mea, a ad b are costats, ad a>1; (ii) ay estimator of the form X + b is iadmissible, where b 0 is a costat. Solutio. (i) Note that R a X+b (P )=E(a X + b µ) 2 = a 2 Var( X)+(aµ + b µ) 2 a 2 Var( X) = a 2 R X(P ) >R X(P ) whe a>1. Hece X is better tha a X + b with a>1. (ii) For b 0, R X+b (P )=E( X + b µ) 2 = Var( X)+b 2 > Var( X) =R X(P ). Hece X is better tha X + b with b 0. Exercise 36 (#2.74). Cosider a estimatio problem with ϑ [c, d] R, where c ad d are kow. Suppose that the actio space cotais [c, d] ad the loss fuctio is L( ϑ a ), where L( ) is a icreasig fuctio o [0, ). Show that ay decisio rule T with P (T (X) [c, d]) > 0 for some P P is iadmissible. Solutio. Cosider the decisio rule T 1 = ci (,c) (T )+TI [c,d] (T )+di (d, ) (T ). The T 1 ϑ T ϑ ad, sice L is a icreasig fuctio, for ay P P. Sice R T1 (P )=E[L( T 1 ϑ )] E[L( T ϑ )] = R T (P ) P ( T 1 (X) ϑ < T (X) ϑ ) =P (T (X) / [a, b]) > 0 holds for some P P, R T1 (P ) <R T (P ).

25 Chapter 2. Fudametals of Statistics 75 Hece T 1 is better tha T ad T is iadmissible. Exercise 37 (#2.75). Let X be a sample from P P, δ 0 (X) bea oradomized rule i a decisio problem with R k as the actio space, ad T be a sufficiet statistic for P P. Show that if E[I A (δ 0 (X)) T ] is a oradomized rule, i.e., E[I A (δ 0 (X)) T ]=I A (h(t )) for ay Borel A R k, where h is a Borel fuctio, the δ 0 (X) =h(t (X)) a.s. P. Solutio. From the assumptio, [ ] E c i I Ai (δ 0 (X)) T = c i I Ai (h(t )) for ay positive iteger, costats c 1,..., c, ad Borel sets A 1,..., A. Usig the results i Exercise 39 of Chapter 1, we coclude that for ay bouded cotiuous fuctio f, E[f(δ 0 (X)) T ]=f(h(t )) a.s. P. The, by the result i Exercise 45 of Chapter 1, δ 0 (X) =h(t ) a.s. P. Exercise 38 (#2.76). Let X be a sample from P P, δ 0 (X) be a decisio rule (which may be radomized) i a problem with R k as the actio space, ad T be a sufficiet statistic for P P. For ay Borel A R k, defie δ 1 (T,A)=E[δ 0 (X, A) T ]. Let L(P, a) be a loss fuctio. Show that [ ] L(P, a)dδ 1 (X, a) =E L(P, a)dδ 0 (X, a) T a.s. P. R k R k Solutio. If L is a simple fuctio (a liear combiatio of idicator fuctios), the the result follows from the defiitio of δ 1. For oegative L, it is the limit of a sequece of oegative icreasig simple fuctios. The the result follows from the result for simple L ad the mootoe covergece theorem for coditioal expectatios (Exercise 38 i Chapter 1). Exercise 39 (#2.80). Let X 1,..., X be radom variables with a fiite commo mea µ = EX i ad fiite variaces. Cosider the estimatio of µ uder the squared error loss. (i) Show that there is o optimal rule i I if I cotais all possible estimators. (ii) Fid a optimal rule i { } I 2 = c i X i : c i R, c i =1 if Var(X i )=σ 2 /a i with a ukow σ 2 ad kow a i, i =1,...,. (iii) Fid a optimal rule i I 2 if X 1,..., X are idetically distributed but

26 76 Chapter 2. Fudametals of Statistics are correlated with correlatio coefficiet ρ. Solutio. (i) Suppose that there exists a optimal rule T. Let P 1 ad P 2 be two possible distributios of X =(X 1,..., X ) such that µ = µ j uder P j ad µ 1 µ 2. Let R T (P ) be the risk of T. For T 1 (X) µ 1, R T1 (P 1 ) = 0. Sice T is better tha T 1, R T (P 1 ) R T1 (P 1 ) = 0 ad, hece, T µ 1 a.s. P 1. Let P =(P 1 +P 2 )/2. If X has distributio P, the µ =(µ 1 + µ 2 )/2. Let T 0 (X) (µ 1 + µ 2 )/2. The R T0 ( P ) = 0. Sice T is better tha T 0, R T ( P ) = 0 ad, hece, T (µ 1 + µ 2 )/2 a.s. P, which implies that T (µ 1 + µ 2 )/2 a.s. P 1 sice P 1 P. This is impossible sice µ 1 (µ 1 + µ 2 )/2. (ii) Let T = c ix i ad T = a ix i / a i. The R T (P ) = Var(T ) ( ) / ( ) 2 =Var a i X i a i = = / ( ) 2 a 2 i Var(X i ) a i / ( ) 2 a i σ 2 a i / ( ) = σ 2 a i. By the Cauchy-Schwarz iequality, ( )( ) ( c 2 ) 2 i a i c i =1. a i Hece, R T (P ) σ 2 ( c 2 ) i =Var c i X i = Var(T )=R T (P ). a i Therefore T is optimal. (iii) For ay T = c ix i, R T (P )=Var(T) = c 2 i σ 2 + c i c j ρσ 2 i j ( ) 2 = c 2 i σ 2 + ρσ 2 c i c 2 i

27 Chapter 2. Fudametals of Statistics 77 = σ 2 [(1 ρ) ] c 2 i + ρ ( ) 2/ σ 2 (1 ρ) c i + ρ = σ 2 [(1 ρ)/ + ρ] = Var( X), where the last equality follows from the Cauchy Schwarz iequality ( ) 2 c i c 2 i. Hece, the sample mea X is optimal. Exercise 40 (#2.83). Let X be a discrete radom variable with P (X = 1) = p, P (X = k) =(1 p) 2 p k, k =0, 1, 2,..., where p (0, 1) is ukow. Show that (i) U(X) is a ubiased estimator of 0 if ad oly if U(k) =ak for all k = 1, 0, 1, 2,... ad some a; (ii) T 0 (X) =I {0} (X) is ubiased for (1 p) 2 ad, uder the squared error loss, T 0 is a optimal rule i I, where I is the class of all ubiased estimators of (1 p) 2 ; (iii) T 0 (X) =I { 1} (X) is ubiased for p ad, uder the squared error loss, there is o optimal rule i I, where I is the class of all ubiased estimators of p. Solutio. (i) If U(X) is ubiased for 0, the E[U(X)] = U( 1)p + U(k)(1 p) 2 p k k=0 k=0 k=0 = U(k)p k 2 U(k)p k+1 + U( 1)p + U(k)p k+2 = U(0) + = + =0 k= 1 k= 1 U(k)p k+2 U(k +2)p k+2 2 k= 1 [U(k) 2U(k +1)+U(k + 2)]p k+2 k= 1 k=0 U(k +1)p k+2

28 78 Chapter 2. Fudametals of Statistics for all p, which implies U(0) = 0 ad U(k) 2U(k +1)+U(k +2)=0for k = 1, 0, 1, 2,..., or equivaletly, U(k) =ak, where a = U(1). (ii) Sice E[T 0 (X)] = P (X =0)=(1 p) 2, T 0 is ubiased. Let T be aother ubiased estimator of (1 p) 2. The T (X) T 0 (X) is ubiased for 0 ad, by the result i (i), T (X) =T 0 (X)+aX for some a. The, R T (p) =E[T 0 (X)+aX (1 p) 2 ] 2 = E(T 0 + ax) 2 +(1 p) 4 2(1 p) 2 E[T 0 (X)+aX] = E(T 0 + ax) 2 (1 p) 4 = a 2 P (X = 1) + P (X =0)+a 2 k 2 P (X = k) (1 p) 4 P (X =0) (1 p) 4 = Var(T 0 ). Hece T 0 is a optimal rule i I. (iii) Sice E[T 0 (X)] = P (X = 1) = p, T 0 is ubiased. Let T be aother ubiased estimator of p. The T (X) = T 0 (X)+aX for some a ad R T (p) =E(T 0 + ax) 2 p 2 =(1 a) 2 p + a 2 k=0 k=1 k 2 (1 p)p 2 p 2, which is a quadratic fuctio i a with miimum [ ] 1 a = 1+(1 p) k 2 p k 1 k=1 depedig o p. Therefore, there is o optimal rule i I. Exercise 41. Let X be a radom sample from a populatio ad θ be a ukow parameter. Suppose that there are k + 1 estimators of θ, T 1,..., T k+1, such that ET i = θ + k j=1 c i,jb j (θ), i =1,..., k + 1, where c i,j s are costats ad b j (θ) are fuctios of θ. Suppose that the determiat C = c 1,1 c 2,1 c k+1,1 c 1,k c 2,k c k+1,k 0.

29 Chapter 2. Fudametals of Statistics 79 Show that T = 1 C T 1 T 2 T k+1 c 1,1 c 2,1 c k+1,1 c 1,k c 2,k c k+1,k is a ubiased estimator of θ. Solutio. From the properties of a determiat, ET ET = 1 1 ET 2 ET k+1 c 1,1 c 2,1 c k+1,1 C c 1,k c 2,k c k+1,k θ + k = 1 j=1 c 1,jb j (θ) θ + k j=1 c k+1,jb j (θ) c 1,1 c k+1,1 C c 1,k c k+1,k 1 1 = θ c 1,1 c k+1,1 C c 1,k c k+1,k k + 1 j=1 c k 1,jb j (θ) j=1 c k+1,jb j (θ) c 1,1 c k+1,1 C c 1,k c k+1,k = θ, where the last equality follows from the fact that the last determiat is 0 because its first row is a liear combiatio of its other k rows. Exercise 42 (#2.84). Let X be a radom variable havig the biomial distributio with size ad probability p (0, 1). Show that there is o ubiased estimator of p 1. Solutio. Suppose that T (X) is a ubiased estimator of p 1. The for all p. However, k=0 E[T (X)] = k=0 ( ) T (k)p k (1 p) k k ( ) T (k)p k (1 p) k = 1 k p k=0 ( ) T (k) < k

30 80 Chapter 2. Fudametals of Statistics for ay p but p 1 diverges to as p 0. This is impossible. Hece, there is o ubiased estimator of p 1. Exercise 43 (#2.85). Let X =(X 1,..., X ) be a radom sample from N(θ, 1), where θ = 0 or 1. Cosider the estimatio of θ with actio space {0, 1}, i.e., the rage of ay estimator is {0, 1}. (i) Show that there does ot exist ay ubiased estimator of θ. (ii) Fid a estimator ˆθ of θ that is approximately ubiased, that is, lim E(ˆθ) =θ. Solutio. (i) Sice the actio space is {0, 1}, ay radomized estimator ˆθ ca be writte as T (X), where T is Borel, 0 T (X) 1, ad ˆθ = { 1 with probability T (X) 0 with probability 1 T (X). The E(ˆθ) =E[T (X)]. If ˆθ is ubiased, the E[T (X)] = θ for θ =0, 1. This implies that, whe θ =0,T (X) = 0 a.e. Lebesgue measure, whereas whe θ = 1, T (X) = 1 a.e. Lebesgue measure. This is impossible. Hece there does ot exist ay ubiased estimator of θ. (ii) Cosider ˆθ = I (, )( X ), where X is the sample mea. Sice X is 1/4 distributed as N(θ, 1 ), E(ˆθ) =P ( X ( > 1/4 )=1 Φ 1/4 θ ) ( +Φ 1/4 θ ), where Φ is the cumulative distributio fuctio of N(0, 1). Hece, whe θ = 0, lim E(ˆθ) =1 Φ( )+Φ( ) = 0 ad, whe θ = 1, lim E(ˆθ) = 1 Φ( )+Φ( ) =1. Exercise 44 (#2.92(c)). Let X be a sample from P θ, where θ Θ R. Cosider the estimatio of θ uder the absolute error loss fuctio a θ. Let Π be a give distributio o Θ with fiite mea. Fid a Bayes rule. Solutio. Let P θ X be the posterior distributio of θ ad P X be the margial of X. By Fubii s theorem, θ dp θ X dp X = θ dp θ dπ= θ dπ <. Hece, for almost all X, θ dp θ X <. From Exercise 11 i Chapter 1, if m X is a media of P θ X, the θ m X dp θ X θ a dp θ X for almost all X holds for ay a. Hece, E θ m X E θ T (X) for ay other estimator T (X). This shows that m X is a Bayes rule.

31 Chapter 2. Fudametals of Statistics 81 Exercise 45 (#2.93). Let X be a sample havig a probability desity f j (x) with respect to a σ-fiite measure ν, where j is ukow ad j {1,..., J} with a kow iteger J 2. Cosider a decisio problem i which the actio space is {1,..., J} ad the loss fuctio is { 0 if a = j L(j, a) = 1 if a j. (i) Obtai the risk of a decisio rule (which may be radomized). (ii) Let Π be a prior probability measure o {1,..., J} with Π({j}) =π j, j =1,..., J. Obtai the Bayes risk of a decisio rule. (iii) Obtai a Bayes rule uder the prior Π i (ii). (iv) Assume that J =2,π 1 = π 2 =0.5, ad f j (x) =φ(x µ j ), where φ(x) is the Lebesgue desity of the stadard ormal distributio ad µ j, j =1, 2, are kow costats. Obtai the Bayes rule i (iii). (v) Obtai a miimax rule whe J =2. Solutio. (i) Let δ be a radomized decisio rule. For ay X, let δ(x, j) be the probability of takig actio j uder the rule δ. Let E j be the expectatio takig uder f j. The [ J ] R δ (j) =E j L(j, k)δ(x, k) = E j [δ(x, k)]=1 E j [δ(x, j)], k=1 k j sice J k=1 δ(x, k) =1. (ii) The Bayes risk of a decisio rule δ is J J r δ = π j R δ (j) =1 π j E j [δ(x, j)]. j=1 (iii) Let δ be a rule satisfyig δ (X, j) = 1 if ad oly if π j f j (X) =g(x), where g(x) = max 1 k J π k f k (X). The δ is a Bayes rule, sice, for ay rule δ, J r δ =1 π j δ(x, j)f j (x)dν 1 j=1 j=1 j=1 J δ(x, j)g(x)dν =1 g(x)dν =1 = r δ. J j=1 g(x)=π jf j(x) π j f j (x)dν

32 82 Chapter 2. Fudametals of Statistics (iv) From the result i (iii), the Bayes rule δ (X, j) = 1 if ad oly if φ(x µ j ) >φ(x µ k ), k j. Sice φ(x µ j )=e (x µj)2 /2 / 2π, we ca obtai a oradomized Bayes rule that takes actio 1 if ad oly if X µ 1 < X µ 2. (v) Let c be a positive costat ad cosider a rule δ c such that δ c (X, 1)=1 if f 1 (X) >cf 2 (X), δ c (X, 2)=1iff 1 (X) <cf 2 (X), ad δ c (X, 1) = γ if f 1 (X) =cf 2 (X). Sice δ c (X, j) = 1 if ad oly if π j f j (X)=max k π k f k (X), where π 1 =1/(c + 1) ad π 2 = c/(c + 1), it follows from part (iii) of the solutio that δ c is a Bayes rule. Let P j be the probability correspodig to f j. The risk of δ c is P 1 (f 1 (X) cf 2 (X)) γp 1 (f 1 (x) =cf 2 (X)) whe j = 1 ad 1 P 2 (f 1 (X) cf 2 (X)) + γp 2 (f 1 (x) =cf 2 (X)) whe j =2. Let ψ(c) =P 1 (f 1 (X) cf 2 (X)) + P 2 (f 1 (X) cf 2 (X)) 1. The ψ is odecreasig i c, ψ(0) = 1, lim c ψ(c) =1,adψ(c) ψ(c ) = P 1 (f 1 (X) =cf 2 (X)) + P 2 (f 1 (X) =cf 2 (X)). Let c = if{c : ψ(c) 0}. If ψ(c )=ψ(c ), we set γ = 0; otherwise, we set γ = ψ(c )/[ψ(c ) ψ(c )]. The, the risk of δ c is a costat. For ay rule δ, sup j R δ (j) r δ r δc = R δc (j) = sup j R δc (j). Hece, δ c is a miimax rule. Exercise 46 (#2.94). Let ˆθ be a ubiased estimator of a ukow θ R. (i) Uder the squared error loss, show that the estimator ˆθ+c is ot miimax uless sup θ R T (θ) = for ay estimator T, where c 0 is a kow costat. (ii) Uder the squared error loss, show that the estimator cˆθ is ot miimax uless sup θ R T (θ) = for ay estimator T, where c (0, 1) is a kow costat. (iii) Cosider the loss fuctio L(θ, a) =(a θ) 2 /θ 2 (assumig θ 0). Show that ˆθ is ot miimax uless sup θ R T (θ) = for ay T. Solutio. (i) Uder the squared error loss, the risk of ˆθ + c is The Rˆθ+c (P )=E(ˆθ + c θ) 2 = c 2 + Var(ˆθ) =c 2 + Rˆθ(P ). sup P Rˆθ+c (P )=c 2 + sup Rˆθ(P ) P ad either sup P Rˆθ+c (P )= or sup P Rˆθ+c (P ) > sup Rˆθ(P P ). Hece, the oly case where ˆθ + c is miimax is whe sup P R T (P )= for ay estimator T. (ii) Uder the squared error loss, the risk of cˆθ is R cˆθ(p )=E(cˆθ θ) 2 =(1 c) 2 θ 2 + c 2 Var(ˆθ) =(1 c) 2 θ 2 + c 2 Rˆθ(P ). The, sup P Rˆθ+c (P )= ad the oly case where ˆθ +c is miimax is whe sup P R T (P )= for ay estimator T.

Lecture 23: Minimal sufficiency

Lecture 23: Minimal sufficiency Lecture 23: Miimal sufficiecy Maximal reductio without loss of iformatio There are may sufficiet statistics for a give problem. I fact, X (the whole data set) is sufficiet. If T is a sufficiet statistic

More information

Unbiased Estimation. February 7-12, 2008

Unbiased Estimation. February 7-12, 2008 Ubiased Estimatio February 7-2, 2008 We begi with a sample X = (X,..., X ) of radom variables chose accordig to oe of a family of probabilities P θ where θ is elemet from the parameter space Θ. For radom

More information

6. Sufficient, Complete, and Ancillary Statistics

6. Sufficient, Complete, and Ancillary Statistics Sufficiet, Complete ad Acillary Statistics http://www.math.uah.edu/stat/poit/sufficiet.xhtml 1 of 7 7/16/2009 6:13 AM Virtual Laboratories > 7. Poit Estimatio > 1 2 3 4 5 6 6. Sufficiet, Complete, ad Acillary

More information

Chapter 6 Principles of Data Reduction

Chapter 6 Principles of Data Reduction Chapter 6 for BST 695: Special Topics i Statistical Theory. Kui Zhag, 0 Chapter 6 Priciples of Data Reductio Sectio 6. Itroductio Goal: To summarize or reduce the data X, X,, X to get iformatio about a

More information

Lecture 16: UMVUE: conditioning on sufficient and complete statistics

Lecture 16: UMVUE: conditioning on sufficient and complete statistics Lecture 16: UMVUE: coditioig o sufficiet ad complete statistics The 2d method of derivig a UMVUE whe a sufficiet ad complete statistic is available Fid a ubiased estimator of ϑ, say U(X. Coditioig o a

More information

Lecture 7: Properties of Random Samples

Lecture 7: Properties of Random Samples Lecture 7: Properties of Radom Samples 1 Cotiued From Last Class Theorem 1.1. Let X 1, X,...X be a radom sample from a populatio with mea µ ad variace σ

More information

IIT JAM Mathematical Statistics (MS) 2006 SECTION A

IIT JAM Mathematical Statistics (MS) 2006 SECTION A IIT JAM Mathematical Statistics (MS) 6 SECTION A. If a > for ad lim a / L >, the which of the followig series is ot coverget? (a) (b) (c) (d) (d) = = a = a = a a + / a lim a a / + = lim a / a / + = lim

More information

This section is optional.
