18.441 Answers to Quiz 1

1. Let $P$ be the proportion of voters who will vote Yes. Suppose the prior probability distribution of $P$ is given by $\Pr(P \le p) = p$ for $0 < p < 1$. You take a poll by choosing nine voters at random, the choice of each being independent of who else was chosen. It is found that six of the nine will vote Yes. Find the posterior probability that more than half of all voters in the whole population will vote Yes.

Answer: The prior probability density function of $P$ is
$$f_P(p) = \frac{d}{dp} F_P(p) = \frac{d}{dp} \Pr(P \le p) = \frac{d}{dp}\, p = 1 \quad \text{if } 0 < p < 1,$$
and $0$ if $p < 0$ or $p > 1$. In other words, $P$ is uniformly distributed on the interval $[0, 1]$, or, in yet other words, $P \sim \mathrm{Beta}(1, 1)$. Let $X$ be the number of voters in the sample of nine who will vote Yes. Then the likelihood function is
$$L(p) = \Pr(X = 6 \mid P = p) = \binom{9}{6} p^6 (1-p)^3.$$
Multiplying the prior density by the likelihood gives us
$$[\text{constant}] \cdot 1 \cdot p^6 (1-p)^3 = [\text{constant}] \cdot p^{7-1} (1-p)^{4-1},$$
so we have $P \mid [X = 6] \sim \mathrm{Beta}(7, 4)$. In order to find $\Pr(P > 1/2 \mid X = 6)$, we need the value of the normalizing constant; we write the density as
$$f_{P \mid [X=6]}(p) = \frac{\Gamma(7+4)}{\Gamma(7)\,\Gamma(4)}\, p^{7-1} (1-p)^{4-1} = \frac{10!}{3!\,6!}\, p^6 (1-p)^3 = 840\, p^6 (1-p)^3 \quad \text{for } 0 < p < 1.$$
Then
$$\Pr(P > 1/2 \mid X = 6) = \int_{1/2}^1 840\, p^6 (1-p)^3 \, dp = 840 \int_{1/2}^1 \left( p^6 - 3p^7 + 3p^8 - p^9 \right) dp$$
$$= 840 \left[ \frac{p^7}{7} - \frac{3p^8}{8} + \frac{3p^9}{9} - \frac{p^{10}}{10} \right]_{p=1/2}^{p=1} = \Big[\, 120 p^7 - 315 p^8 + 280 p^9 - 84 p^{10} \,\Big]_{p=1/2}^{p=1}.$$
At $p = 1$ the bracket equals $120 - 315 + 280 - 84 = 1$. (If this were not $1$, we could infer that an error has occurred, since the density must integrate to $1$ over $[0, 1]$.) At $p = 1/2$ it equals
$$\frac{120}{2^7} - \frac{315}{2^8} + \frac{280}{2^9} - \frac{84}{2^{10}} = \frac{240 - 315 + 140 - 21}{256} = \frac{44}{256} = \frac{11}{64}.$$
Therefore
$$\Pr(P > 1/2 \mid X = 6) = 1 - \frac{11}{64} = \frac{53}{64} = 0.828125.$$
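The integral above can be checked exactly with Python's `fractions` module; this is a verification sketch, not part of the original solution. It uses the expanded antiderivative of the $\mathrm{Beta}(7,4)$ density.

```python
from fractions import Fraction

# Posterior density is Beta(7, 4): f(p) = 840 p^6 (1-p)^3.
# Expanding (1-p)^3 and integrating gives the antiderivative:
#   F(p) = 120 p^7 - 315 p^8 + 280 p^9 - 84 p^10.
def F(p):
    return 120*p**7 - 315*p**8 + 280*p**9 - 84*p**10

# Sanity check: the density integrates to 1 over [0, 1].
assert F(Fraction(1)) - F(Fraction(0)) == 1

# Posterior probability that more than half of voters vote Yes.
prob = F(Fraction(1)) - F(Fraction(1, 2))
print(prob, float(prob))  # 53/64 0.828125
```

Working in `Fraction` arithmetic keeps every intermediate value exact, so a mismatch with the hand computation would surface as an exact inequality rather than a rounding artifact.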
2. Suppose a family of probability distributions of a random variable $X$ is indexed by a parameter $\theta$.

(a) What does it mean to say that $T(X)$ is a sufficient statistic for $\theta$?

Answer: It means that the conditional probability distribution of $X$ given $T(X)$ does not depend on $\theta$; the conditional distribution remains the same as $\theta$ changes.

(b) Suppose $T(X)$ is a sufficient statistic for $\theta$. Explain why the value of the Rao–Blackwell estimator $E(\delta(X) \mid T(X))$ does not depend on $\theta$, even though the probability distribution of $\delta(X)$ must depend on $\theta$ in order that $\delta(X)$ make sense as an estimator of $\theta$.

Answer: The conditional distribution of $X$ given $T(X)$ does not depend on $\theta$. Therefore the conditional distribution of $\delta(X)$ given $T(X)$ does not depend on $\theta$. Therefore the conditional expectation of $\delta(X)$ given $T(X)$ does not depend on $\theta$.

3. Suppose $X_1, X_2 \sim \text{i.i.d. } \mathrm{Bernoulli}(p)$, i.e., they are independent and identically distributed and
$$X_1 = \begin{cases} 1 & \text{with probability } p, \\ 0 & \text{with probability } 1 - p. \end{cases}$$

(a) Show that $X_1 - X_2$ is not a complete statistic.

Answer: It is enough to find some function $g$ such that $E(g(X_1 - X_2))$ remains zero as $p$ changes. But we have
$$E(X_1 - X_2) = E(X_1) - E(X_2) = p - p = 0,$$
so we can take $g$ to be the identity function.

(b) Show that $X_1 + X_2$ is a sufficient statistic for $p$.

Answer: One way to do this is by appealing directly to the definition of sufficiency, i.e., by finding $\Pr(X_1 = x_1 \ \&\ X_2 = x_2 \mid X_1 + X_2)$ and observing that no $p$ appears in the answer:
$$\Pr(X_1 = x_1 \ \&\ X_2 = x_2 \mid X_1 + X_2 = x_1 + x_2) = \frac{p^{x_1}(1-p)^{1-x_1}\, p^{x_2}(1-p)^{1-x_2}}{\binom{2}{x_1 + x_2} p^{x_1 + x_2} (1-p)^{2 - (x_1 + x_2)}} = \frac{p^{x_1 + x_2}(1-p)^{2 - (x_1 + x_2)}}{\binom{2}{x_1 + x_2} p^{x_1 + x_2} (1-p)^{2 - (x_1 + x_2)}} = \frac{1}{\binom{2}{x_1 + x_2}},$$
and no $p$ appears here.
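The sufficiency claim can be checked directly for two Bernoulli draws: the conditional distribution of $(X_1, X_2)$ given $X_1 + X_2$ should come out identical for any two values of $p$. A small exact-arithmetic sketch (not part of the original solution):

```python
from fractions import Fraction
from itertools import product

def cond_dist(p):
    """Conditional pmf of (X1, X2) given T = X1 + X2, for iid Bernoulli(p)."""
    # joint pmf of (X1, X2)
    pmf = {(x1, x2): (p if x1 else 1 - p) * (p if x2 else 1 - p)
           for x1, x2 in product((0, 1), repeat=2)}
    out = {}
    for (x1, x2), q in pmf.items():
        t = x1 + x2
        p_t = sum(v for (a, b), v in pmf.items() if a + b == t)  # Pr(T = t)
        out[(x1, x2)] = q / p_t
    return out

# Identical conditional distributions for very different p: X1 + X2 is sufficient.
assert cond_dist(Fraction(1, 3)) == cond_dist(Fraction(3, 4))
print(cond_dist(Fraction(1, 3)))
```

Each conditional probability equals $1/\binom{2}{x_1+x_2}$, e.g. the two orderings of one Yes and one No each get probability $1/2$ regardless of $p$.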
(c) You may use the fact that $X_1 + X_2$ is a complete statistic. Show that $X_1 X_2$ is an unbiased estimator of $p^2$, and find the best unbiased estimator of $p^2$, i.e., the one with the smallest mean squared error among all unbiased estimators of $p^2$.

Answer: By independence, $E(X_1 X_2) = E(X_1)\,E(X_2) = p \cdot p = p^2$, so $X_1 X_2$ is unbiased. The Lehmann–Scheffé theorem says that the conditional expectation of an unbiased estimator given a complete sufficient statistic is the unique best unbiased estimator. So we seek $E(X_1 X_2 \mid X_1 + X_2)$. Notice that $X_1 X_2$ must be either $0$ or $1$, and is $1$ if and only if both $X_1$ and $X_2$ are $1$, and that happens if and only if $X_1 + X_2 = 2$. So
$$E(X_1 X_2 \mid X_1 + X_2) = \Pr(X_1 X_2 = 1 \mid X_1 + X_2) = \begin{cases} 1 & \text{if } X_1 + X_2 = 2, \\ 0 & \text{if } X_1 + X_2 = \text{either } 0 \text{ or } 1. \end{cases}$$
Since we also have
$$X_1 X_2 = \begin{cases} 1 & \text{if } X_1 + X_2 = 2, \\ 0 & \text{if } X_1 + X_2 = \text{either } 0 \text{ or } 1, \end{cases}$$
we can say that $E(X_1 X_2 \mid X_1 + X_2) = X_1 X_2$. In other words, $X_1 X_2$ is already the best unbiased estimator of $p^2$, and is unchanged by the Rao–Blackwell process of improving an estimator.

(d) Find the maximum likelihood estimator of $p^2$.

Answer: By invariance of maximum-likelihood estimators, the maximum-likelihood estimator of $p^2$ is just the square of the maximum-likelihood estimator of $p$. The likelihood function is
$$L(p) = \Pr(X_1 = x_1 \ \&\ X_2 = x_2) = p^{x_1 + x_2} (1-p)^{2 - x_1 - x_2}.$$
Therefore
$$\ell(p) = \log L(p) = (x_1 + x_2) \log p + (2 - x_1 - x_2) \log(1 - p),$$
$$\ell'(p) = \frac{x_1 + x_2}{p} - \frac{2 - x_1 - x_2}{1 - p} = \frac{(x_1 + x_2) - 2p}{p(1-p)} \begin{cases} > 0 & \text{if } 0 < p < (x_1 + x_2)/2, \\ = 0 & \text{if } p = (x_1 + x_2)/2, \\ < 0 & \text{if } (x_1 + x_2)/2 < p < 1. \end{cases}$$
Consequently $\hat{p} = (X_1 + X_2)/2$, and so the maximum-likelihood estimator of $p^2$ is $(X_1 + X_2)^2 / 4$.
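The calculus argument for the MLE can be double-checked numerically: for each of the four possible samples, a grid search over the likelihood $p^{x_1+x_2}(1-p)^{2-x_1-x_2}$ should peak at (or as close as the grid allows to) $p = (x_1 + x_2)/2$. A verification sketch:

```python
from fractions import Fraction

def likelihood(p, x1, x2):
    """Likelihood of p for a sample (x1, x2) of two Bernoulli(p) draws."""
    s = x1 + x2
    return p**s * (1 - p)**(2 - s)

# Grid-search the likelihood for every possible sample; the argmax should
# land at (or next to) p = (x1 + x2)/2, so by invariance the MLE of p^2
# is ((X1 + X2)/2)^2, matching the sign analysis of l'(p) above.
grid = [Fraction(k, 1000) for k in range(1, 1000)]
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    best = max(grid, key=lambda p: likelihood(p, x1, x2))
    assert abs(best - Fraction(x1 + x2, 2)) <= Fraction(1, 1000)
print("grid argmax matches (x1 + x2)/2 for all samples")
```

The open grid $(0, 1)$ mirrors the open parameter interval: for $x_1 + x_2 = 0$ or $2$ the likelihood is maximized at the boundary, which the grid approaches to within $1/1000$.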
(e) Consider the two estimators that you found above: the best unbiased estimator of $p^2$ and the maximum likelihood estimator of $p^2$. Which has a smaller mean squared error when $p = 1/2$?

Answer: The best unbiased estimator is
$$X_1 X_2 = \begin{cases} 0 & \text{if } X_1 + X_2 \in \{0, 1\}, \\ 1 & \text{if } X_1 + X_2 = 2, \end{cases}$$
and so it is
$$\begin{cases} 0 & \text{with probability } 1 - p^2, \\ 1 & \text{with probability } p^2. \end{cases}$$
Its mean squared error is
$$\underbrace{(0 - p^2)^2}_{\substack{\text{squared error when} \\ \text{the estimator is } 0}} \underbrace{(1 - p^2)}_{\substack{\text{probability that} \\ \text{the estimator is } 0}} + \underbrace{(1 - p^2)^2}_{\substack{\text{squared error when} \\ \text{the estimator is } 1}} \underbrace{p^2}_{\substack{\text{probability that} \\ \text{the estimator is } 1}} = \frac{3}{16} = 0.1875 \quad \text{when } p = 1/2.$$
The maximum-likelihood estimator is
$$\frac{(X_1 + X_2)^2}{4} = \begin{cases} 0 & \text{if } X_1 + X_2 = 0, \\ 1/4 & \text{if } X_1 + X_2 = 1, \\ 1 & \text{if } X_1 + X_2 = 2, \end{cases}$$
and so it is
$$\begin{cases} 0 & \text{with probability } (1 - p)^2, \\ 1/4 & \text{with probability } 2p(1 - p), \\ 1 & \text{with probability } p^2. \end{cases}$$
Its mean squared error is therefore
$$\underbrace{(0 - p^2)^2}_{\text{squared error}} \underbrace{(1 - p)^2}_{\text{probability}} + \underbrace{(1/4 - p^2)^2}_{\text{squared error}}\, \underbrace{2p(1 - p)}_{\text{probability}} + \underbrace{(1 - p^2)^2}_{\text{squared error}} \underbrace{p^2}_{\text{probability}} = \frac{5}{32} = 0.15625 \quad \text{when } p = 1/2.$$
So the MSE of the MLE is slightly smaller than that of the best unbiased estimator when $p = 1/2$.
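Both mean-squared-error computations can be reproduced exactly in a few lines; a verification sketch using exact rational arithmetic:

```python
from fractions import Fraction

p = Fraction(1, 2)
target = p**2  # the quantity being estimated, p^2 = 1/4

# Best unbiased estimator X1*X2: takes value 0 w.p. 1 - p^2, value 1 w.p. p^2.
mse_bue = (0 - target)**2 * (1 - p**2) + (1 - target)**2 * p**2

# MLE ((X1+X2)/2)^2: takes values 0, 1/4, 1 with Binomial(2, p) probabilities.
mse_mle = ((0 - target)**2 * (1 - p)**2
           + (Fraction(1, 4) - target)**2 * 2*p*(1 - p)
           + (1 - target)**2 * p**2)

print(mse_bue, mse_mle)  # 3/16 5/32
```

At $p = 1/2$ the MLE's middle value $1/4$ coincides with $p^2$, so that term contributes zero squared error, which is exactly where the MLE gains its advantage.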
4. Among families with two children, let $X$ be the score on a statistics test taken by the first child at age 12, and let $Y$ be the income of the second child at age 40. Suppose the pair $(X, Y)$ has a bivariate normal distribution, and $E(X) = 65$, $\mathrm{SD}(X) = 10$, $E(Y) = \$50{,}000$ per year, $\mathrm{SD}(Y) = \$10{,}000$ per year, and $\mathrm{corr}(X, Y) = 1/2$. [All of this is fiction.] Among families in which the first child scores 75 on the statistics test at age 12, in what proportion of cases does the second child have an income of at least $\$59{,}330$ at age 40?

Answer: In DeGroot & Schervish, we learn that
$$E(Y \mid X) = E(Y) + \mathrm{corr}(X, Y)\, \mathrm{SD}(Y)\, \frac{X - E(X)}{\mathrm{SD}(X)} = 50{,}000 + (1/2)(10{,}000)\, \frac{X - 65}{10}.$$
So $E(Y \mid X = 75) = 55{,}000$, and
$$\mathrm{var}(Y \mid X) = \left(1 - \mathrm{corr}(X, Y)^2\right) \mathrm{SD}(Y)^2 = \frac{3}{4} \cdot 10{,}000^2.$$
So $\mathrm{SD}(Y \mid X = 75) = \sqrt{3/4} \cdot 10{,}000 = 10{,}000\sqrt{3}/2$. Since the conditional distribution of $Y$ given that $X = 75$ is normal, we can say
$$\Pr(Y \ge 59{,}330 \mid X = 75) = 1 - \Pr(Y \le 59{,}330 \mid X = 75) = 1 - \Pr\!\left( \frac{Y - 55{,}000}{10{,}000\sqrt{3}/2} \le \frac{59{,}330 - 55{,}000}{10{,}000\sqrt{3}/2} \;\middle|\; X = 75 \right)$$
$$= 1 - \Phi\!\left( \frac{59{,}330 - 55{,}000}{10{,}000\sqrt{3}/2} \right) \approx 1 - \Phi(0.500) = 1 - 0.6915 = 0.3085.$$
So the event of interest occurs in about 30.85% of all cases.
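The conditional-normal computation above can be reproduced with only the standard library, using `math.erf` to build the standard normal CDF $\Phi$; this is a verification sketch, not part of the original solution.

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

mu_x, sd_x = 65, 10
mu_y, sd_y = 50_000, 10_000
rho = 0.5
x = 75

# Conditional mean and SD of Y given X = x for a bivariate normal pair.
cond_mean = mu_y + rho * sd_y * (x - mu_x) / sd_x   # 55,000
cond_sd = sd_y * math.sqrt(1 - rho**2)              # 10,000 * sqrt(3)/2 ~ 8,660

z = (59_330 - cond_mean) / cond_sd                  # ~ 0.500
print(1 - phi(z))  # ~ 0.3085
```

Note that the threshold $\$59{,}330$ is, up to rounding, exactly half a conditional standard deviation above the conditional mean, which is why the answer reduces to $1 - \Phi(0.500)$.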