Final Examination Statistics 200C. T. Ferguson June 10, 2010


1. (a) State the Borel-Cantelli Lemma and its converse.
(b) Let X_1, X_2, ... be i.i.d. from a distribution with density f(x) = θ x^{-(θ+1)} on the interval (1, ∞). For what values of θ is it true that (1/n)X_n → 0?

2. Let X_1, X_2, ... be independent random variables with X_k having the distribution

X_k = { +√k/(k+1)  with probability 1/2,
      { −√k/(k+1)  with probability 1/2.

(a) Let S_n = Σ_{k=1}^n X_k. Find E(S_n) and Var(S_n). Note that Var(S_n) → ∞.
(b) Check that the UAN condition holds.
(c) Show whether or not (S_n − E(S_n))/√Var(S_n) converges in law to the standard normal distribution by checking the Lindeberg condition.

3. Suppose we are given n independent trials resulting in c possible cells, each trial having probability p_i of falling in cell i, for i = 1, ..., c. Let n_i denote the number of trials falling in cell i.
(a) What is Pearson's chi-square for testing the hypothesis that the true probabilities are p_i for i = 1, ..., c?
(b) Find the transformed chi-square with the transformation g(p) = log(p) applied to each cell. Find the modified transformed chi-square.
(c) What is the approximate large-sample distribution of the modified transformed chi-square if the true cell probabilities are p_i^0 for i = 1, ..., c?

4. In sampling from a population of N objects having values z_1, z_2, ..., z_N, first a sample of size n < N/2 is taken without replacement. Later a second sample of size n is taken from the remaining N − n objects without replacement. The difference of the means of the two samples is used to compare the samples. This leads to a rank statistic of the form S_N = Σ_{j=1}^N z_j a(R_j), where a(i) = 1 for i = 1, ..., n, a(i) = −1 for i = n+1, ..., 2n, and a(i) = 0 for i = 2n+1, ..., N.
(a) What are the mean and the variance of S_N?
(b) Assume that n → ∞ as N → ∞. Under what condition on the z_i is it true that (S_N − ES_N)/√Var(S_N) →_L N(0, 1)?

5. (a) Give the definition of the Kullback-Leibler information number, K(f_0, f_1).
(b) What is the Information Inequality?
(c) Suppose f_0(x) is the density of the binomial distribution B(n, 1/2) (with sample size n and probability of success 1/2), and f_1(x) is the density of the binomial distribution B(n, 3/4). Find K(f_0, f_1) and check that the inequality holds.

6. Let (X_1, Y_1), ..., (X_n, Y_n) be a sample from a bivariate distribution with density f(x, y | µ, θ) = θ² µ x exp{−θx(1 + µy)} for x > 0 and y > 0, where µ > 0 and θ > 0 are parameters.
(a) Find the maximum likelihood estimates of µ and θ.
(b) Find the Fisher information matrix for this distribution.
(c) What is the asymptotic distribution of the MLE of µ when θ is unknown? What is the asymptotic distribution of the MLE of µ when θ is known?

7. Let X_1, ..., X_n be a sample from the Poisson distribution P(λ), let Y_1, ..., Y_n be a sample from a Poisson distribution P(λ + β_1), and let Z_1, ..., Z_n be a sample from the Poisson distribution P(λ + β_2), with all three parameters, λ, β_1, β_2, unknown. Assume that all three samples are independent.
(a) Find the likelihood ratio test statistic for testing the hypothesis H_0: β_1 = β_2.
(b) What function of the likelihood ratio test statistic has asymptotically a chi-square distribution, and how many degrees of freedom does it have in this case?

8. A sample of size n is taken in a multinomial experiment with c² cells denoted (i, j), i = 1, ..., c and j = 1, ..., c. Let p_ij denote the probability of cell (i, j), and let n_ij denote the number falling in cell (i, j), so that Σ p_ij = 1 and Σ n_ij = n.
(a) Let H denote the hypothesis of symmetry, that p_ij = p_ji for all i and j. Find the chi-square test of H against all alternatives. How many degrees of freedom does it have?
(b) Let H_0 denote the hypothesis that all off-diagonal elements are equal: p_ij = q for all i ≠ j, for some q. Note that under H_0, p_11 + p_22 + ... + p_cc + c(c − 1)q = 1. Find the chi-square test of H_0 against all alternatives. How many degrees of freedom?
(c) What, then, is the chi-square test of H_0 against H, and how many degrees of freedom does it have?
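The Kullback-Leibler number in Problem 5(c) can be cross-checked by direct summation. This is a sketch of my own (the helper names are not from the exam), comparing the brute-force K(f_0, f_1) for B(n, 1/2) versus B(n, 3/4) against the closed form (n/2)[log 4 − log 3]:

```python
import math

def binom_pmf(n, p, x):
    """Binomial(n, p) mass function at x."""
    return math.comb(n, x) * p ** x * (1 - p) ** (n - x)

def kl_number(n, p0, p1):
    """K(f0, f1) = E_0 log(f0(X)/f1(X)) for f0 = B(n, p0), f1 = B(n, p1)."""
    return sum(binom_pmf(n, p0, x) * math.log(binom_pmf(n, p0, x) / binom_pmf(n, p1, x))
               for x in range(n + 1))

n = 10
numeric = kl_number(n, 0.5, 0.75)
closed_form = (n / 2) * (math.log(4) - math.log(3))  # (n/2)[log 4 - log 3]
```

The two values agree to floating-point precision, and both are positive, as the Information Inequality requires.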

Solutions to the Final Examination, Stat 200C, Spring 2010.

1. (a) If A_1, A_2, ... are events such that Σ_{j=1}^∞ P(A_j) < ∞, then P(A_n i.o.) = 0. Conversely, if the A_j are independent events and Σ_{j=1}^∞ P(A_j) = ∞, then P(A_n i.o.) = 1.
(b) Let ε be an arbitrary positive number. Then (1/n)X_n → 0 if, and only if, P((1/n)X_n > ε i.o.) = 0. Since Σ_{n=1}^∞ P((1/n)X_n > ε) = Σ_{n=1}^∞ P(X_n > εn) = Σ_{n=1}^∞ 1/(εn)^θ < ∞ if, and only if, θ > 1, we have (1/n)X_n → 0 if, and only if, θ > 1.

2. (a) E(X_k) = 0 and Var(X_k) = k/(k+1)². So E(S_n) = 0 and B_n² = Var(S_n) = Σ_{k=1}^n k/(k+1)² ≈ ∫_1^n (1/x) dx = log n.
(b) max_{j≤n} Var(X_j) = 1/4, so [max_{j≤n} Var(X_j)]/B_n² = 1/(4B_n²) → 0.
(c) Since |X_j| ≤ 1 for all j,

(1/B_n²) Σ_{j=1}^n E(X_j² I(|X_j| > εB_n)) ≤ (1/B_n²) Σ_{j=1}^n E(X_j² I(1 > εB_n)) = I(1 > εB_n) = 0

for sufficiently large n, since B_n → ∞. So the Lindeberg condition holds, and S_n/B_n →_L N(0, 1), or S_n/(log n)^{1/2} →_L N(0, 1).

3. (a) χ² = Σ_{j=1}^c (n_j − n p_j)²/(n p_j).
(b) χ²_T = n Σ_{j=1}^c p_j (log(n_j/n) − log(p_j))², and χ²_TM = Σ_{j=1}^c n_j (log(n_j/n) − log(p_j))².
(c) The limiting distribution is noncentral chi-square, χ²_{c−1}(λ), with c − 1 degrees of freedom and noncentrality parameter λ = n Σ_{j=1}^c p_j^0 (log(p_j^0) − log(p_j))².

4. (a) Since ā_N = 0, we have ES_N = 0. The variance of S_N is (N²/(N−1)) σ_z² σ_a², and since σ_a² = (1/N) Σ_{i=1}^N a(i)² = 2n/N, we have Var(S_N) = (2nN/(N−1)) σ_z².
(b) For asymptotic normality of S_N, we need

[max_j (z_j − z̄_N)²][max_j (a(j) − ā_N)²] / (N σ_z² σ_a²) → 0.

We have max_j (a(j) − ā_N)² = 1, and σ_a² = 2n/N. Then the above condition becomes

max_j (z_j − z̄_N)² / (2n σ_z²) → 0.

5. (a) K(f_0, f_1) = E_0 log(f_0(X)/f_1(X)), where E_0 represents the expectation when f_0(x) is the density of X.
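The conclusion of Problem 2 can be illustrated by simulation. This is my own check, assuming the problem's distribution is X_k = ±√k/(k+1), each sign with probability 1/2; it draws standardized sums S_n/B_n and verifies that they have mean near 0 and variance near 1, with B_n² growing slowly (about 6.5 at n = 2000, on the order of log n):

```python
import math
import random

random.seed(0)

def sample_S(n):
    """One draw of S_n = X_1 + ... + X_n with X_k = ±sqrt(k)/(k+1), each sign w.p. 1/2."""
    return sum(random.choice((-1.0, 1.0)) * math.sqrt(k) / (k + 1)
               for k in range(1, n + 1))

n = 2000
B2 = sum(k / (k + 1) ** 2 for k in range(1, n + 1))  # B_n^2 = Var(S_n), exactly

# 400 standardized draws S_n / B_n; under the CLT these are approximately N(0, 1)
draws = [sample_S(n) / math.sqrt(B2) for _ in range(400)]
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / (len(draws) - 1)
```

The sample size and replication count are arbitrary; the point is only that the standardized sums behave like standard normal draws, as the Lindeberg argument predicts.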

(b) K(f_0, f_1) ≥ 0, with equality if, and only if, f_0(x) and f_1(x) are densities of the same distribution.
(c) f_0(x) = (n choose x)(1/2)^n and f_1(x) = (n choose x)(3/4)^x (1/4)^{n−x}, so f_0(x)/f_1(x) = 2^n/3^x. So K(f_0, f_1) = E_0(n log 2 − X log 3) = n log 2 − (n/2) log 3 = (n/2)[log 4 − log 3]. This is obviously positive.

6. (a) l_n(θ, µ) = 2n log θ + n log µ + Σ log x_i − θ Σ x_i(1 + µy_i). ∂l_n/∂θ = (2n/θ) − Σ x_i(1 + µy_i) = 0 implies 2 = θ̂ X̄ + θ̂µ̂ XY, and ∂l_n/∂µ = (n/µ) − θ Σ x_i y_i = 0 implies 1 = θ̂µ̂ XY. Solving these equations gives θ̂ = 1/X̄ and µ̂ = X̄/XY, where XY = (1/n) Σ_{i=1}^n X_i Y_i.
(b) Ψ(x, θ, µ) = ((2/θ) − x(1 + µy), (1/µ) − θxy), which shows that E(XY) = 1/(µθ), so that

Ψ̇ = −[ 2/θ², xy ; xy, 1/µ² ]   and   I(θ, µ) = −E Ψ̇ = [ 2/θ², 1/(µθ) ; 1/(µθ), 1/µ² ].

(c) Since Det(I) = 1/(µ²θ²), we have

I(θ, µ)^{−1} = µ²θ² [ 1/µ², −1/(µθ) ; −1/(µθ), 2/θ² ] = [ θ², −µθ ; −µθ, 2µ² ].

So when θ is unknown, √n(µ̂_n − µ) →_L N(0, 2µ²). When θ is known, the asymptotic variance of the MLE is the reciprocal of the lower right corner of the information matrix, namely µ². So √n(µ̃_n − µ) →_L N(0, µ²). (Here, the MLE of µ is µ̃ = 1/(θ XY).)

7. (a) The log-likelihood function is

l_n = log L_n(λ, β_1, β_2) = −n(3λ + β_1 + β_2) + log(λ) Σ X_i + log(λ + β_1) Σ Y_i + log(λ + β_2) Σ Z_i

plus a term not involving the parameters. The likelihood equations are

∂l_n/∂λ = −3n + (1/λ) Σ X_i + (1/(λ + β_1)) Σ Y_i + (1/(λ + β_2)) Σ Z_i = 0
∂l_n/∂β_1 = −n + (1/(λ + β_1)) Σ Y_i = 0
∂l_n/∂β_2 = −n + (1/(λ + β_2)) Σ Z_i = 0

The maximum likelihood estimates are λ̂ = X̄, β̂_1 = Ȳ − X̄, and β̂_2 = Z̄ − X̄. In a similar way, the MLEs under H_0 are λ̃ = X̄ and β̃_1 = β̃_2 = (Ȳ + Z̄)/2 − X̄. The likelihood ratio test rejects H_0 for small values of

Λ_n = L_n(λ̃, β̃_1, β̃_2)/L_n(λ̂, β̂_1, β̂_2) = ((Ȳ + Z̄)/2)^{n(Ȳ + Z̄)} / (Ȳ^{nȲ} Z̄^{nZ̄}).

(b) −2 log Λ_n has asymptotically a chi-square distribution with 1 degree of freedom.
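The MLE formulas in Solution 6(a) can be sanity-checked by simulation. In the sketch below (my own check, with arbitrary parameter values and sample size), samples are drawn using the factorization implied by the density — X is marginally exponential with rate θ and, given X = x, Y is exponential with rate θµx — and the plug-in estimates recover the true parameters:

```python
import random

random.seed(1)

theta_true, mu_true = 2.0, 0.5
n = 200_000

# f(x, y | mu, theta) = theta^2 mu x exp{-theta x (1 + mu y)} factors as
# f(x) = theta exp(-theta x) times f(y | x) = theta mu x exp(-theta mu x y),
# so X ~ Exp(rate theta) and, given X = x, Y ~ Exp(rate theta*mu*x).
x_sum, xy_sum = 0.0, 0.0
for _ in range(n):
    x = random.expovariate(theta_true)
    y = random.expovariate(theta_true * mu_true * x)
    x_sum += x
    xy_sum += x * y

xbar = x_sum / n       # sample mean of the X_i
xybar = xy_sum / n     # sample mean of the X_i * Y_i

theta_hat = 1 / xbar   # MLE of theta from Solution 6(a)
mu_hat = xbar / xybar  # MLE of mu from Solution 6(a)
```

With n = 200,000 the estimates land within a few standard errors of the truth, consistent with the asymptotic variances 2µ² and µ² derived in 6(c).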

8. (a) Under H, the maximum likelihood estimates of the p_ij are p̂_ii = n_ii/n and, for i ≠ j, p̂_ij = (n_ij + n_ji)/(2n). There are (c−1) + (c−2) + ... + 1 = c(c−1)/2 restrictions going from the general hypothesis to H. So the chi-square test of H rejects H if χ²(p̂) is greater than the appropriate cutoff point for a chi-square distribution with c(c−1)/2 degrees of freedom.
(b) Under H_0, the likelihood is proportional to [Π_{j=1}^c p_jj^{n_jj}] q^m, where m = n − Σ_{j=1}^c n_jj. So the maximum likelihood estimates are p̃_jj = n_jj/n and q̃ = [1 − Σ_{j=1}^c p̃_jj]/(c(c−1)). There are c parameters estimated, so the chi-square test of H_0 rejects H_0 if χ²(p̃) is greater than the appropriate cutoff point for a chi-square distribution with c² − 1 − c = c² − c − 1 degrees of freedom.
(c) The chi-square test of H_0 within H rejects H_0 if χ²(p̃) − χ²(p̂) is greater than the appropriate cutoff point for a chi-square distribution with c² − c − 1 − c(c−1)/2 = (c(c−1) − 2)/2 degrees of freedom.
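The symmetry chi-square of Solution 8(a) is short enough to write out directly. This sketch is my own illustration (the 3×3 table is made-up data): it computes χ²(p̂) from the MLEs p̂_ii = n_ii/n, p̂_ij = (n_ij + n_ji)/(2n), and checks that it agrees with the equivalent pairwise form Σ_{i<j} (n_ij − n_ji)²/(n_ij + n_ji), on c(c−1)/2 degrees of freedom:

```python
def symmetry_chisq(counts):
    """Chi-square statistic for H: p_ij = p_ji in a c x c multinomial table.
    Under H the MLEs are p_ii = n_ii/n and p_ij = (n_ij + n_ji)/(2n),
    so the expected count in cell (i, j) is n * p_ij."""
    c = len(counts)
    chisq = 0.0
    for i in range(c):
        for j in range(c):
            # expected count under the symmetry MLE (diagonal terms contribute 0)
            expected = counts[i][j] if i == j else (counts[i][j] + counts[j][i]) / 2
            if expected > 0:
                chisq += (counts[i][j] - expected) ** 2 / expected
    return chisq, c * (c - 1) // 2  # statistic and degrees of freedom

# made-up 3x3 table of counts n_ij
table = [[20, 10, 5],
         [16, 30, 8],
         [9, 12, 25]]
chisq, df = symmetry_chisq(table)

# equivalent pairwise form: sum over i < j of (n_ij - n_ji)^2 / (n_ij + n_ji)
pairwise = sum((table[i][j] - table[j][i]) ** 2 / (table[i][j] + table[j][i])
               for i in range(3) for j in range(i + 1, 3))
```

The pairwise reduction follows because each off-diagonal pair contributes (n_ij − m)²/m + (n_ji − m)²/m with m = (n_ij + n_ji)/2, which collapses to (n_ij − n_ji)²/(n_ij + n_ji).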