Review Quiz

1. Prove that in a one-dimensional canonical exponential family, the complete and sufficient statistic achieves the Cramér-Rao lower bound (CRLB). That is, if
$$f(x \mid \theta) = h(x)\exp\{\theta T(x) - A(\theta)\},$$
where $\theta$ and $T(x)$ are scalars, then $T(X)$ achieves the Cramér-Rao lower bound.
Solution: Let $M_T(s) = E[e^{sT(X)}]$ be the moment generating function of $T(X)$. Then we have
$$M_T(s) = \exp\{A(\theta + s) - A(\theta)\},$$
and hence $E[T(X)] = A'(\theta)$ and $\mathrm{Var}[T(X)] = A''(\theta)$. (*Please do present a detailed proof of the above. Through this exercise you will appreciate the moment generating function even more.)

Now, since $T(X)$ is unbiased for $\tau(\theta) = A'(\theta)$, the Cramér-Rao lower bound is
$$\frac{[\tau'(\theta)]^2}{I(\theta)} = \frac{[A''(\theta)]^2}{I(\theta)},$$
where
$$I(\theta) = -E\left[\frac{\partial^2}{\partial\theta^2}\log f(X \mid \theta)\right] = A''(\theta),$$
because $\log f(x \mid \theta) = \log h(x) + \theta T(x) - A(\theta)$. Therefore we have
$$\mathrm{Var}[T(X)] = A''(\theta) = \frac{[A''(\theta)]^2}{A''(\theta)} = \frac{[\tau'(\theta)]^2}{I(\theta)}.$$
This proves that the complete and sufficient statistic indeed achieves the Cramér-Rao lower bound (CRLB).
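As a quick sanity check (my own illustration, not part of the quiz), the Poisson family written in canonical form has $A(\theta) = e^\theta$ and $T(X) = X$, so $\mathrm{Var}(T)$ should match the bound $A''(\theta) = e^\theta$. A short Monte Carlo sketch in Python, assuming NumPy is available:

```python
import numpy as np

# Poisson in canonical form: f(x|theta) = exp(theta*x - e^theta)/x!,
# so A(theta) = e^theta, T(X) = X, and the CRLB for unbiased estimation
# of A'(theta) is (A''(theta))^2 / A''(theta) = A''(theta) = e^theta.
rng = np.random.default_rng(0)
theta = 0.7
lam = np.exp(theta)              # mean A'(theta) = e^theta
x = rng.poisson(lam, size=1_000_000)

crlb = np.exp(theta)             # A''(theta)
var_t = x.var()                  # Monte Carlo Var(T)
print(var_t, crlb)               # the two should be close
```

With a million draws the sample variance agrees with the bound to within Monte Carlo error, illustrating that $T$ attains the CRLB exactly in this family.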
Theorem (Cramér-Rao Lower Bound, Version 1, scalar [one parameter]). Let $X_1, \dots, X_n$ be a random sample from a population with p.d.f. $f(x \mid \theta)$. Let $W(\mathbf{X})$ be an unbiased estimator of $\tau(\theta)$. Given some regularity conditions (continuity, differentiability, etc.), and assuming the domain of $f(x \mid \theta)$ does not depend on $\theta$, we have
$$\mathrm{Var}_\theta\big(W(\mathbf{X})\big) \ge \frac{[\tau'(\theta)]^2}{E_\theta\!\left[\left(\frac{\partial}{\partial\theta}\log f(\mathbf{X} \mid \theta)\right)^2\right]}.$$

Theorem (Cramér-Rao Lower Bound, Version 2, scalar). In the same setting and under the same regularity conditions, we have
$$\mathrm{Var}_\theta\big(W(\mathbf{X})\big) \ge \frac{[\tau'(\theta)]^2}{-E_\theta\!\left[\frac{\partial^2}{\partial\theta^2}\log f(\mathbf{X} \mid \theta)\right]}.$$

Definition: Fisher Information (scalar). Let $\mathbf{X}$ be the data we observe with joint pdf $f(\mathbf{x} \mid \theta)$; then the Fisher information is defined as
$$I(\theta) = E_\theta\!\left[\left(\frac{\partial}{\partial\theta}\log f(\mathbf{X} \mid \theta)\right)^2\right].$$
If $\log f(\mathbf{x} \mid \theta)$ is twice differentiable with respect to $\theta$ and under certain regularity conditions, the Fisher information may also be written as
$$I(\theta) = -E_\theta\!\left[\frac{\partial^2}{\partial\theta^2}\log f(\mathbf{X} \mid \theta)\right].$$

Note 1: If $X_1, \dots, X_n$ is an i.i.d. random sample with pdf $f(x \mid \theta)$, then (*notice the factor of $n$):
$$I_n(\theta) = E_\theta\!\left[\left(\frac{\partial}{\partial\theta}\log f(\mathbf{X} \mid \theta)\right)^2\right] = n\,E_\theta\!\left[\left(\frac{\partial}{\partial\theta}\log f(X_1 \mid \theta)\right)^2\right] = n\,I_1(\theta).$$

Note 2: One also realizes that one does not need the i.i.d. random sample assumption and can simply write the CRLB as
$$\mathrm{Var}_\theta\big(W(\mathbf{X})\big) \ge \frac{[\tau'(\theta)]^2}{I(\theta)}.$$

Definition: Fisher Information (vector). Let $\mathbf{X}$ be the data we observe with joint pdf $f(\mathbf{x} \mid \boldsymbol\theta)$, where $\boldsymbol\theta$ is a $p$-dimensional vector; then the Fisher information is a $p \times p$ matrix with its $(i,j)$th element defined as
$$[I(\boldsymbol\theta)]_{ij} = E_{\boldsymbol\theta}\!\left[\left(\frac{\partial}{\partial\theta_i}\log f(\mathbf{X} \mid \boldsymbol\theta)\right)\left(\frac{\partial}{\partial\theta_j}\log f(\mathbf{X} \mid \boldsymbol\theta)\right)\right].$$
Under some regularity conditions, we also have:
$$[I(\boldsymbol\theta)]_{ij} = -E_{\boldsymbol\theta}\!\left[\frac{\partial^2}{\partial\theta_i\,\partial\theta_j}\log f(\mathbf{X} \mid \boldsymbol\theta)\right].$$

Theorem (Cramér-Rao Lower Bound, Version 3, vector). If $W(\mathbf{X})$ is an unbiased estimator of $\boldsymbol\theta$, then
$$\mathrm{Cov}\big(W(\mathbf{X})\big) \succeq I(\boldsymbol\theta)^{-1}.$$
That is, $\mathrm{Cov}\big(W(\mathbf{X})\big) - I(\boldsymbol\theta)^{-1}$ is positive semi-definite.

Theorem (Cramér-Rao Lower Bound, Version 4, vector). If $W(\mathbf{X})$ is an unbiased estimator of $\tau(\boldsymbol\theta)$, then
$$\mathrm{Var}\big(W(\mathbf{X})\big) \ge \left(\frac{\partial \tau(\boldsymbol\theta)}{\partial \boldsymbol\theta}\right)^{\!\top} I(\boldsymbol\theta)^{-1} \left(\frac{\partial \tau(\boldsymbol\theta)}{\partial \boldsymbol\theta}\right).$$
Note that $\tau(\boldsymbol\theta)$ and $W(\mathbf{X})$ are most likely vectors also. Recall:
$$\mathrm{Cov}\big(W(\mathbf{X})\big) = E\big[(W - E W)(W - E W)^\top\big].$$
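The vector definition can be illustrated numerically. The following Python sketch uses my own hypothetical example, a sample from $N(\mu, \sigma^2)$ with $\boldsymbol\theta = (\mu, \sigma^2)$, whose per-observation information matrix is $\mathrm{diag}(1/\sigma^2,\ 1/(2\sigma^4))$; it estimates $E[s s^\top]$ for the total score and checks it against $n \cdot I_1(\boldsymbol\theta)$, which also demonstrates the factor-of-$n$ rule from Note 1:

```python
import numpy as np

# Hypothetical example: N(mu, sigma^2) with theta = (mu, sigma^2).
# Per-observation Fisher information: diag(1/sigma^2, 1/(2*sigma^4)).
rng = np.random.default_rng(3)
mu, sig2, n, reps = 1.0, 2.0, 10, 200_000
x = rng.normal(mu, np.sqrt(sig2), size=(reps, n))

# total score vector of the joint log-likelihood, one row per replication
s_mu = ((x - mu) / sig2).sum(axis=1)                       # d/d mu
s_s2 = (-0.5 / sig2 + (x - mu) ** 2 / (2 * sig2**2)).sum(axis=1)  # d/d sigma^2
s = np.column_stack([s_mu, s_s2])

info_mc = s.T @ s / reps                                   # Monte Carlo E[s s^T]
info_exact = n * np.diag([1 / sig2, 1 / (2 * sig2**2)])    # n * I_1(theta)
print(info_mc)
print(info_exact)
```

The Monte Carlo matrix should be close to $n \cdot I_1$, with near-zero off-diagonal entries, matching the $(i,j)$th-element definition above.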
2. Let $X \sim f(x \mid \theta)$ for some density function $f$ and define
$$g(\theta) = \int h(x)\, f(x \mid \theta)\, dx.$$
Prove that if $\frac{\partial}{\partial\theta} f(x \mid \theta)$ exists for all $x$ and $\theta$, and there is a function $M(x)$ with $\left|h(x)\,\frac{\partial}{\partial\theta} f(x \mid \theta)\right| \le M(x)$ for all $\theta$ and $\int M(x)\, dx < \infty$, then
$$g'(\theta) = \frac{d}{d\theta}\int h(x)\, f(x \mid \theta)\, dx = \int h(x)\,\frac{\partial}{\partial\theta} f(x \mid \theta)\, dx.$$
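Although the problem asks for a proof, a numeric illustration can make the interchange concrete. This sketch uses hypothetical choices of my own: $f(x \mid \theta)$ the $N(\theta, 1)$ density and $h(x) = x^2$, so that $g(\theta) = E[X^2] = \theta^2 + 1$ and $g'(\theta) = 2\theta$, which is compared against $\int h(x)\,\frac{\partial}{\partial\theta} f(x \mid \theta)\, dx$ evaluated by the trapezoid rule:

```python
import numpy as np

# Hypothetical illustration: f(x|theta) = N(theta, 1) density, h(x) = x^2.
# Then g(theta) = theta^2 + 1, so g'(theta) = 2*theta.
theta = 0.8
x = np.linspace(-12.0, 12.0, 200_001)
f = np.exp(-(x - theta) ** 2 / 2) / np.sqrt(2 * np.pi)
df_dtheta = (x - theta) * f            # d/dtheta of the N(theta, 1) density

# trapezoid rule, written out for compatibility across NumPy versions
y = x**2 * df_dtheta                   # integrand h(x) * (d/dtheta f)
rhs = float(np.sum(0.5 * (x[1:] - x[:-1]) * (y[1:] + y[:-1])))
lhs = 2 * theta                        # g'(theta) in closed form
print(lhs, rhs)
```

The two quantities agree to quadrature accuracy, as the lemma asserts they must whenever the domination condition holds.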
3. Let $Y_1, \dots, Y_n$ be independent with $Y_i \sim N(\alpha + \beta x_i,\ \sigma^2)$, where the constants $x_1, \dots, x_n$ are known whereas $\alpha$ and $\beta$ are unknown parameters ($\sigma^2$ is assumed known). Please derive the UMVUE of $\alpha$ and $\beta$.
Solution: The joint density is
$$f(\mathbf{y} \mid \alpha, \beta) = (2\pi\sigma^2)^{-n/2}\exp\left\{-\frac{1}{2\sigma^2}\sum_{i=1}^n y_i^2\right\}\exp\left\{\frac{\alpha}{\sigma^2}\sum_{i=1}^n y_i + \frac{\beta}{\sigma^2}\sum_{i=1}^n x_i y_i - \frac{1}{2\sigma^2}\sum_{i=1}^n (\alpha + \beta x_i)^2\right\}.$$
This is a full-rank exponential family, which means that $\left(\sum_i Y_i,\ \sum_i x_i Y_i\right)$ is a complete and sufficient statistic. From $E[Y_i] = \alpha + \beta x_i$, we have
$$E\left[\sum_i Y_i\right] = n\alpha + \beta\sum_i x_i, \qquad E\left[\sum_i x_i Y_i\right] = \alpha\sum_i x_i + \beta\sum_i x_i^2.$$
Multiplying the top equation by $\sum_i x_i$ and the bottom equation by $n$, and then subtracting one from the other, we have
$$\sum_i x_i\, E\left[\sum_i Y_i\right] - n\, E\left[\sum_i x_i Y_i\right] = \beta\left[\left(\sum_i x_i\right)^2 - n\sum_i x_i^2\right].$$
Therefore we conclude that
$$\hat\beta = \frac{\sum_i x_i \sum_i Y_i - n\sum_i x_i Y_i}{\left(\sum_i x_i\right)^2 - n\sum_i x_i^2}$$
is an unbiased estimator of $\beta$. In addition, we can easily derive that $\hat\alpha = \bar Y - \hat\beta\,\bar x$ is an unbiased estimator for $\alpha$. Since these unbiased estimators are functions of the data only through the complete and sufficient statistic, they are the UMVUEs.
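As a cross-check (my own illustration, with arbitrary simulated data), the UMVUE formulas above coincide algebraically with the least-squares estimates, so they should match `np.polyfit` exactly:

```python
import numpy as np

# The UMVUE slope formula equals the OLS slope: multiplying numerator and
# denominator by -1/n turns it into Sxy / Sxx.
rng = np.random.default_rng(4)
x = np.array([0.5, 1.0, 1.5, 2.0, 3.0, 4.0])   # known constants
alpha, beta = 2.0, -1.5                        # arbitrary true parameters
y = alpha + beta * x + rng.normal(0, 0.3, size=x.size)

n = x.size
beta_hat = (x.sum() * y.sum() - n * (x * y).sum()) / (x.sum() ** 2 - n * (x**2).sum())
alpha_hat = y.mean() - beta_hat * x.mean()

slope, intercept = np.polyfit(x, y, 1)         # least-squares fit for comparison
print(beta_hat, slope)
print(alpha_hat, intercept)
```

Agreement up to floating-point error confirms the algebra; unbiasedness could further be checked by averaging the estimates over many simulated data sets.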
4. Suppose that $X_1, \dots, X_n$ are a simple random sample from a Weibull distribution with density function
$$f(x \mid \theta) = \frac{\beta x^{\beta-1}}{\theta}\exp\left\{-\frac{x^\beta}{\theta}\right\}, \quad x > 0,$$
for some known constant $\beta > 0$. Note that when $\beta = 1$ the Weibull distribution is simply an exponential distribution.
(a) Show that $X^\beta$ is exponentially distributed.
(b) If we wish to test $H_0: \theta \le \theta_0$ versus $H_1: \theta > \theta_0$, derive the value $c$ such that the test that rejects $H_0$ whenever $T = \sum_{i=1}^n X_i^\beta > c$ is of size $\alpha$. Relate $c$ to a critical value of the $\chi^2$ distribution with $2n$ degrees of freedom.
(c) Show that the test in part (b) above is a UMP level $\alpha$ test.
(d) Please derive a confidence interval for $\theta$ by inverting the test.
Solution: (a) The Weibull random variable $X$ has a closed-form cumulative distribution function (cdf) given by
$$F_X(x) = 1 - \exp\left\{-\frac{x^\beta}{\theta}\right\}, \quad x > 0.$$
Thus the cdf of $Y = X^\beta$ is
$$F_Y(y) = P\big(X \le y^{1/\beta}\big) = 1 - \exp\left\{-\frac{y}{\theta}\right\}, \quad y > 0.$$
This is the cdf of the exponential distribution with mean $\theta$. The mgf of $Y$ is
$$M_Y(t) = (1 - \theta t)^{-1}, \quad t < 1/\theta.$$
(b) The mgf of $T = \sum_{i=1}^n X_i^\beta$ is
$$M_T(t) = (1 - \theta t)^{-n}.$$
The mgf of $2T/\theta$ is
$$M_{2T/\theta}(t) = M_T(2t/\theta) = (1 - 2t)^{-n}.$$
Thus we know $2T/\theta \sim \chi^2_{2n}$. This means that if we set
$$c = \frac{\theta_0}{2}\,\chi^2_{2n,\alpha},$$
where $\chi^2_{2n,\alpha}$ is the upper-$\alpha$ critical value of the $\chi^2_{2n}$ distribution, then the test that rejects $H_0$ whenever $T > c$ is of size $\alpha$.
(c) The likelihood is
$$L(\theta \mid \mathbf{x}) = \left[\prod_{i=1}^n \beta x_i^{\beta-1}\right]\theta^{-n}\exp\left\{-\frac{1}{\theta}\sum_{i=1}^n x_i^\beta\right\} = \left[\prod_{i=1}^n \beta x_i^{\beta-1}\right]\theta^{-n}\exp\{-T/\theta\}.$$
So $T$ is sufficient for this family of distributions. By the Karlin-Rubin theorem, it suffices to verify that this family has a
monotone likelihood ratio (MLR) in the statistic $T$. To verify this, we check that for $\theta_1 < \theta_2$,
$$\frac{L(\theta_1 \mid \mathbf{x})}{L(\theta_2 \mid \mathbf{x})} = \left(\frac{\theta_2}{\theta_1}\right)^{n}\exp\left\{-T\left(\frac{1}{\theta_1} - \frac{1}{\theta_2}\right)\right\}$$
is a strictly decreasing function of $T$, which holds since $1/\theta_1 - 1/\theta_2 > 0$. Equivalently, $L(\theta_2 \mid \mathbf{x})/L(\theta_1 \mid \mathbf{x})$ is strictly increasing in $T$, so the family has MLR in $T$, and by the Karlin-Rubin theorem the test that rejects $H_0$ whenever $T > c$ is UMP level $\alpha$.
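A simulation sketch (my own illustration, with hypothetical parameter values) confirms the distributional fact behind part (b): writing $X = (\theta E)^{1/\beta}$ for a standard exponential $E$ gives $X^\beta \sim \mathrm{Exp}(\text{mean } \theta)$, so $2T/\theta$ should have the $\chi^2_{2n}$ mean $2n$ and variance $4n$:

```python
import numpy as np

# Simulate Weibull draws via X = (theta * E)^(1/beta), E ~ Exp(1),
# so that X^beta ~ Exp(mean theta) and 2T/theta ~ chi-square(2n).
rng = np.random.default_rng(5)
theta, beta, n, reps = 1.7, 2.0, 8, 200_000   # hypothetical values
e = rng.exponential(1.0, size=(reps, n))
x = (theta * e) ** (1 / beta)

t = (x ** beta).sum(axis=1)     # test statistic T for each replication
w = 2 * t / theta               # pivotal quantity
print(w.mean(), 2 * n)          # chi-square(2n) mean
print(w.var(), 4 * n)           # chi-square(2n) variance
```

The same pivot drives part (d): inverting $T \le (\theta/2)\chi^2_{2n,\alpha}$ in $\theta$ yields the confidence bound, since $2T/\theta$ has a known distribution free of $\theta$.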