Assignment 7

Exercise 4.3 Use the Continuity Theorem to prove the Cramér-Wold Theorem, Theorem 4.12. Hint: $a'X_n \stackrel{d}{\to} a'X$ implies that $\phi_{a'X_n}(1) \to \phi_{a'X}(1)$.

Sketch of solution: As we pointed out in class, the only tricky part of the Cramér-Wold Theorem is showing that if $a'X_n \stackrel{d}{\to} a'X$ for all $a$, then $X_n \stackrel{d}{\to} X$. But in this case, the Continuity Theorem states that $\phi_{a'X_n}(t) \to \phi_{a'X}(t)$ for all $t \in \mathbb{R}$. In particular, letting $t = 1$, we obtain $\phi_{X_n}(a) \to \phi_X(a)$. Since this is true for all $a$, we obtain once again from the Continuity Theorem that $X_n \stackrel{d}{\to} X$.

Exercise 4.4 Suppose $X \sim N_k(\mu, \Sigma)$, where $\Sigma$ is invertible. Prove that $(X - \mu)'\Sigma^{-1}(X - \mu) \sim \chi^2_k$. Hint: If $Q$ diagonalizes $\Sigma$, say $Q\Sigma Q' = \Lambda$, let $\Lambda^{1/2}$ be the diagonal, nonnegative matrix satisfying $\Lambda^{1/2}\Lambda^{1/2} = \Lambda$ and consider $Y'Y$, where $Y = (\Lambda^{1/2})^{-1} Q (X - \mu)$.

Sketch of solution: Using the hint, we obtain $Y'Y = (X - \mu)'\Sigma^{-1}(X - \mu)$. But with $X \sim N_k(\mu, \Sigma)$, we obtain $Y \sim N_k(0, I)$ because
$$\operatorname{Var} Y = (\Lambda^{1/2})^{-1} Q \Sigma Q' (\Lambda^{1/2})^{-1} = (\Lambda^{1/2})^{-1} \Lambda (\Lambda^{1/2})^{-1} = I.$$
This proves the result, since we see that $Y'Y$ is just the sum of $k$ independent squared standard normal random variables.
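As an illustrative aside, this distributional claim is easy to check by simulation in R; the particular mu, Sigma, and sample size below are arbitrary choices of mine, not part of the exercise.

# Simulation check (illustrative): (X - mu)' Sigma^{-1} (X - mu) should be chi^2_k
set.seed(1)
k <- 3
mu <- c(1, -2, 0.5)                      # arbitrary mean vector
A <- matrix(rnorm(k * k), k, k)
Sigma <- crossprod(A) + diag(k)          # arbitrary positive definite covariance
R <- chol(Sigma)                         # X = mu + Z R has covariance R'R = Sigma
Z <- matrix(rnorm(10000 * k), ncol = k)
X <- sweep(Z %*% R, 2, mu, "+")
M <- sweep(X, 2, mu)                     # rows are X - mu
Q <- rowSums((M %*% solve(Sigma)) * M)   # quadratic forms
c(mean(Q), var(Q))                       # compare with chi^2_k: mean k, variance 2k
ks.test(Q, "pchisq", df = k)             # should not reject the chi^2_k hypothesis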

Exercise 4.5 Let $X_1, X_2, \ldots$ be independent Poisson random variables with mean $\lambda = 1$. Define $Y_n = \sqrt{n}(\bar X_n - 1)$.

(a) Find $E(Y_n^+)$, where $Y_n^+ = Y_n I\{Y_n > 0\}$.

Sketch of solution: Noting that $Y_n = Y_n^+ - Y_n^-$ and $E\,Y_n = 0$, we see that $E\,Y_n^+ = E\,Y_n^-$. Thus, we can find $E\,Y_n^-$ instead of $E\,Y_n^+$. We shall use the fact that $n\bar X_n$ is Poisson with mean $n$. We obtain
$$E(Y_n^-) = E\left[\sqrt{n}\,(\bar X_n - 1)^-\right] = \frac{1}{\sqrt{n}} \sum_{k=0}^{n} (n - k)\, P(n\bar X_n = k) = \frac{e^{-n}}{\sqrt{n}} \left[\sum_{k=0}^{n} \frac{n^{k+1}}{k!} - \sum_{k=1}^{n} \frac{n^{k}}{(k-1)!}\right] = \frac{e^{-n}\, n^{n+1}}{\sqrt{n}\; n!} = \frac{n^{n+1/2}\, e^{-n}}{n!},$$
since the two sums telescope after reindexing the second as $\sum_{k=0}^{n-1} n^{k+1}/k!$.

(b) Find, with proof, the limit of $E(Y_n^+)$ and prove Stirling's formula
$$n! \sim \sqrt{2\pi}\, n^{n+1/2} e^{-n}.$$
Hint: Use the result of Exercise 3.12.

Sketch of solution: By the central limit theorem, $Y_n \stackrel{d}{\to} N(0, 1)$. Since $E\,Y_n^2 = 1$ for all $n$, Exercise 3.11 (with $\epsilon = 1$) shows that the $Y_n$ are uniformly integrable, and thus (by Exercise 3.1) $E\,Y_n \to E\,Z$, where $Z$ is standard normal. By a similar argument, $E\,Y_n^- \to E\,Z^-$. But $Y_n^+ = Y_n + Y_n^-$, so we conclude that $E\,Y_n^+ \to E\,Z + E\,Z^- = 1/\sqrt{2\pi}$. Combining this with the result of part (a), we obtain
$$\frac{n^{n+1/2}\, e^{-n}}{n!} \to \frac{1}{\sqrt{2\pi}},$$
which is equivalent to
$$\frac{\sqrt{2\pi}\, n^{n+1/2}\, e^{-n}}{n!} \to 1,$$
which is exactly the content of Stirling's approximation as expressed in Example 1.21.
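As a numerical illustration of parts (a) and (b), the Stirling ratio and the exact value of $E(Y_n^+)$ can both be computed directly in R; the function names and the truncation point below are my own choices.

# Illustrative check: n^(n+1/2) e^(-n) / n! should approach 1/sqrt(2*pi)
stirling_ratio <- function(n) exp((n + 0.5) * log(n) - n - lfactorial(n))
sapply(c(1, 10, 100, 1000), stirling_ratio)
1 / sqrt(2 * pi)                               # limiting value, about 0.3989

# E(Y_n^+) computed directly from the Poisson(n) distribution of n * Xbar_n
EYplus <- function(n) {
  k <- 0:(20 * n + 100)                        # truncation point; ample for these n
  sum(pmax(k - n, 0) * dpois(k, n)) / sqrt(n)
}
sapply(c(1, 10, 100), EYplus)                  # also approaches 1/sqrt(2*pi)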

Exercise 4.14 Let $(a_1, \ldots, a_n)$ be a random permutation of the integers $1, \ldots, n$. If $a_j < a_i$ for some $i < j$, then the pair $(i, j)$ is said to form an inversion. Let $X_n$ be the total number of inversions:
$$X_n = \sum_{j=2}^{n} \sum_{i=1}^{j-1} I\{a_j < a_i\}.$$
For example, if $n = 3$ and we consider the permutation $(3, 1, 2)$, there are 2 inversions since $1 = a_2 < a_1 = 3$ and $2 = a_3 < a_1 = 3$. This problem asks you to find the asymptotic distribution of $X_n$.

(a) Define $Y_1 = 0$ and, for $j > 1$, let $Y_j = \sum_{i=1}^{j-1} I\{a_j < a_i\}$ be the number of $a_i$ greater than $a_j$ to the left of $a_j$. Then the $Y_j$ are independent (you don't have to show this; you may wish to think about why, though). Find $E(Y_j)$ and $\operatorname{Var} Y_j$.

Sketch of solution: By symmetry, the event $a_j < a_i$ is just as probable as the event $a_i < a_j$, which means that the probability of this event is $1/2$. Therefore,
$$E\,Y_j = \sum_{i=1}^{j-1} P(a_j < a_i) = \frac{j-1}{2}.$$
For the variance of $Y_j$, we need the fact that for $i \ne k$,
$$P(a_j < a_i \text{ and } a_j < a_k) = P(a_j \text{ is the smallest of } \{a_j, a_i, a_k\}) = \frac{1}{3},$$
again by symmetry (because the smallest is just as likely to be any of $a_j$, $a_i$, or $a_k$). Thus, we obtain
$$\operatorname{Var} Y_j = \sum_{i=1}^{j-1}\sum_{k=1}^{j-1} \operatorname{Cov}\left(I\{a_j < a_i\},\, I\{a_j < a_k\}\right) = \sum_{i=1}^{j-1} \operatorname{Var} I\{a_j < a_i\} + 2 \sum_{1 \le i < k \le j-1} \left[P(a_j < a_i \text{ and } a_j < a_k) - P(a_j < a_i)\,P(a_j < a_k)\right]$$
$$= \frac{j-1}{4} + 2\binom{j-1}{2}\left[\frac{1}{3} - \frac{1}{4}\right] = \frac{3(j-1)}{12} + \frac{(j-1)(j-2)}{12} = \frac{j^2 - 1}{12}.$$
Notice that the above mean and variance are also correct for $Y_1$, which is defined as the constant $0$.
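These moment formulas are also easy to check by simulation; in the R sketch below, the values n = 8 and j = 5 and the number of replications are arbitrary choices of mine.

# Simulation check of E(Y_j) = (j-1)/2 and Var(Y_j) = (j^2-1)/12 (illustrative)
set.seed(2)
n <- 8; j <- 5                                  # arbitrary values for the check
Yj <- replicate(100000, {
  a <- sample(n)
  sum(a[1:(j - 1)] > a[j])                      # larger elements to the left of position j
})
c(mean(Yj), (j - 1) / 2)                        # empirical vs. theoretical mean
c(var(Yj), (j^2 - 1) / 12)                      # empirical vs. theoretical variance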

(b) Use $X_n = Y_1 + Y_2 + \cdots + Y_n$ to prove that
$$\frac{3}{2}\left(\frac{4X_n}{n^{3/2}} - \sqrt{n}\right) \stackrel{d}{\to} N(0, 1).$$

Sketch of solution: Let us verify the Lindeberg condition. Notice that
$$s_n^2 = \sum_{j=1}^{n} \frac{j^2 - 1}{12} = \frac{n(n+1)(2n+1)}{6(12)} - \frac{n}{12} = \frac{n(2n+5)(n-1)}{72} \sim \frac{n^3}{36}.$$
Furthermore, notice that $|Y_j - E\,Y_j| \le j - 1 < n$ for all $j \le n$ because $Y_j$ may only take values from $0$ through $j - 1$. Therefore,
$$\frac{1}{s_n^2} \sum_{j=1}^{n} E\left[(Y_j - E\,Y_j)^2\, I\{|Y_j - E\,Y_j| \ge \epsilon s_n\}\right] \le \frac{1}{s_n^2} \sum_{j=1}^{n} E\left[(Y_j - E\,Y_j)^2\right] I\{n \ge \epsilon s_n\} = I\{n \ge \epsilon s_n\}.$$
But since $s_n^2 \sim n^3/36$, we know that $I\{n \ge \epsilon s_n\}$ will be identically zero for $n$ large enough, no matter how small the fixed value of $\epsilon$ is. This means that the Lindeberg condition is satisfied, so we conclude that
$$\frac{X_n - E\,X_n}{s_n} \stackrel{d}{\to} N(0, 1).$$
We may check that
$$E\,X_n = \sum_{j=1}^{n} \frac{j-1}{2} = \frac{n(n+1)}{4} - \frac{n}{2} = \frac{n(n-1)}{4}.$$
Thus, with a bit of algebra, we obtain
$$\frac{3}{2}\left(\frac{4X_n}{n^{3/2}} - \sqrt{n}\right) = \left[\frac{X_n - E\,X_n}{s_n}\right]\sqrt{\frac{36\, s_n^2}{n^3}} - \frac{3}{2\sqrt{n}}.$$
Since $36 s_n^2 / n^3 \to 1$ and $3/(2\sqrt{n}) \to 0$ as $n \to \infty$, we conclude by Slutsky's theorem that
$$\frac{3}{2}\left(\frac{4X_n}{n^{3/2}} - \sqrt{n}\right) \stackrel{d}{\to} Z,$$
where $Z$ is standard normal, which proves the result.
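A short simulation sketch of part (b): for a moderate n (the choice n = 50 below is arbitrary), the empirical mean and variance of X_n should match the formulas above, and the statistic from part (b) should be approximately standard normal.

# Illustrative check of E(X_n), Var(X_n), and the limiting normality
set.seed(3)
count_inv <- function(a) { m <- outer(a, a, "<"); sum(m & lower.tri(m)) }
n <- 50
x <- replicate(20000, count_inv(sample(n)))
c(mean(x), n * (n - 1) / 4)                     # empirical vs. exact mean
c(var(x), n * (2 * n + 5) * (n - 1) / 72)       # empirical vs. exact variance
w <- 1.5 * (4 * x / n^1.5 - sqrt(n))            # statistic from part (b)
c(mean(w), sd(w))                               # should be close to 0 and 1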

(c) For $n = 10$, evaluate the distribution of inversions as follows. First, simulate 1000 permutations on $\{1, 2, \ldots, 10\}$ and, for each permutation, count the number of inversions. Plot a histogram of these 1000 numbers. Use the results of the simulation to estimate $P(X_{10} \le 24)$. Second, estimate $P(X_{10} \le 24)$ using a normal approximation. Can you find the exact integer $c$ such that $10!\, P(X_{10} \le 24) = c$?

Sketch of solution: Below is an R function that counts the number of inversions in a vector (by default a permutation of 10 elements), along with code to test it on 1000 permutations of 10 elements.

inv <- function(a = sample(n), n = 10) {
  m <- outer(a, a, "<")
  sum(m & lower.tri(m))
}
s <- replicate(1000, inv())
sum(s <= 24)
[1] 610

My simulation produced 610/1000 permutations with 24 or fewer inversions. Below is a histogram along with the code that created it.

pdf("inv.pdf")
hist(s, main="", ylab="Percent of Total", nclass=30, col=5, yaxt="n")
axis(2, at=10*(0:8), labels=0:8)
dev.off()

[Histogram of the 1000 simulated inversion counts s; the vertical axis shows percent of total.]

From part (b), we know that the exact theoretical mean and standard deviation for $X_{10}$ are $90/4 = 22.5$ and $\sqrt{2250/72} \approx 5.59$, respectively. Thus, using a continuity correction, we find the normal approximation
$$P(X_{10} \le 24) \approx \Phi([24.5 - 22.5]/5.59) = 0.640.$$
We could also use the normal approximation proved to be asymptotically valid in part (b), which gives approximate mean $100/4 = 25$ and standard deviation $10^{3/2}/6 \approx 5.27$; this yields the normal approximation
$$P(X_{10} \le 24) \approx \Phi([24.5 - 25]/5.27) = 0.462.$$
(Note how far apart the two normal approximations are in this case despite the fact that they are asymptotically equivalent!)

By the way, the exact theoretical value of $P(X_{10} \le 24)$ is $2308843/3628800 \approx 0.636$. If $K_{n,i}$ denotes the number of permutations on $n$ elements that contain exactly $i$ inversions, then I don't know of any closed-form expression for $K_{n,i}$, but there's a fairly simple recursion you might want to verify (which is how I obtained the number 2308843):
$$K_{n,i} = \sum_{j=0}^{n-1} K_{n-1,\, i-j} \quad \text{for } i > 0,\ n > 0,$$
where $K_{n,0} = 1$ for all $n \ge 1$ and $K_{n,i} = 0$ whenever $i < 0$ or $n = 0$.
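One possible R implementation of this recursion (the function name inversion_counts and the vector-based bookkeeping are my own choices) builds the whole vector of counts $K_{n,i}$ by summing shifted copies of the counts for $n - 1$ elements:

# Count permutations of 1..n by number of inversions via the recursion
# K[n,i] = sum_{j=0}^{n-1} K[n-1, i-j]  (entry i+1 of the vector holds K[., i]).
inversion_counts <- function(n) {
  K <- 1                                   # K[1, 0] = 1
  if (n >= 2) for (m in 2:n) {
    Knext <- numeric(length(K) + m - 1)
    for (j in 0:(m - 1)) {                 # inserting element m creates j new inversions
      Knext[(j + 1):(j + length(K))] <- Knext[(j + 1):(j + length(K))] + K
    }
    K <- Knext
  }
  K
}

K10 <- inversion_counts(10)
sum(K10) == factorial(10)                  # sanity check: counts sum to 10!
sum(K10[1:25])                             # permutations with at most 24 inversions
sum(K10[1:25]) / factorial(10)             # exact P(X_10 <= 24), to compare with the value above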

Exercise 4.15 Suppose that $X_1, X_2, X_3$ is a sample of size 3 from a beta(2, 1) distribution.

(a) Find $P(X_1 + X_2 + X_3 \le 1)$ exactly.

Sketch of solution: The density function of a beta(2, 1) distribution is $2x^{2-1}(1-x)^{1-1} = 2x$ for $x \in (0, 1)$. Thus, the probability is just the triple integral of the joint density function over the region $\{x_1 + x_2 + x_3 \le 1\}$:
$$\int_0^1 \int_0^{1-x_1} \int_0^{1-x_1-x_2} 8 x_1 x_2 x_3 \, dx_3 \, dx_2 \, dx_1 = \int_0^1 \int_0^{1-x_1} 4 x_1 x_2 (1 - x_1 - x_2)^2 \, dx_2 \, dx_1$$
$$= \frac{1}{3} \int_0^1 x_1 - 4x_1^2 + 6x_1^3 - 4x_1^4 + x_1^5 \, dx_1 = \frac{1}{90}.$$
As a decimal approximation, $1/90 \approx 0.0111$.
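A quick numerical check of the single integral in the last display, using R's integrate:

# Check of the reduced integral: (1/3) * Int_0^1 (x - 4x^2 + 6x^3 - 4x^4 + x^5) dx
integrate(function(x) (x - 4*x^2 + 6*x^3 - 4*x^4 + x^5) / 3, 0, 1)$value
1 / 90                                     # exact value, approximately 0.01111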

(b) Find $P(X_1 + X_2 + X_3 \le 1)$ using a normal approximation derived from the central limit theorem.

Sketch of solution: The mean of the beta(2, 1) distribution is $2/3$ and the variance is $1/18$. Thus, the central limit theorem approximation says that $X_1 + X_2 + X_3$ is distributed approximately normally with mean 2 and variance $1/6$. Standardizing gives
$$P(X_1 + X_2 + X_3 \le 1) \approx \Phi\left(\frac{1 - 2}{\sqrt{1/6}}\right) \approx 0.00715.$$

(c) Let $Z = I\{X_1 + X_2 + X_3 \le 1\}$. Approximate $E\,Z = P(X_1 + X_2 + X_3 \le 1)$ by $\bar Z = \sum_{i=1}^{1000} Z_i / 1000$, where $Z_i = I\{X_{i1} + X_{i2} + X_{i3} \le 1\}$ and the $X_{ij}$ are independent beta(2, 1) random variables. In addition to $\bar Z$, report the sample variance of the $Z_i$. (To think about: What is the theoretical value of $\operatorname{Var} Z$?)

Sketch of solution: I got $8/1000 = 0.008$ values of $X_1 + X_2 + X_3$ less than 1 in my simulation, code for which is shown below. This gives a sample variance of $0.00794$. Since $Z$ is a Bernoulli random variable with probability $1/90$, the theoretical variance is $89/90^2 \approx 0.011$.

sum(apply(matrix(rbeta(3000, 2, 1), nrow=1000), 1, sum) < 1)
[1] 8

(d) Approximate $P(X_1 + X_2 + X_3 \le \frac{3}{2})$ using the normal approximation and the simulation approach. (Don't compute the exact value, which is more difficult than in part (a); do you see why?)

Sketch of solution: With the simulation, I got $125/1000 = 0.125$ as the point estimate. The normal approximation gives
$$\Phi\left(\frac{1.5 - 2}{\sqrt{1/6}}\right) \approx 0.1103.$$
The triple integral in this case is much harder!
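For completeness, here is a small R sketch of the two approximations in part (d); the seed and the sample size of 1000 are arbitrary choices of mine.

# Part (d): simulation estimate and CLT-based normal approximation of P(X1+X2+X3 <= 1.5)
set.seed(5)
sums <- apply(matrix(rbeta(3000, 2, 1), nrow = 1000), 1, sum)
mean(sums <= 1.5)                          # simulation estimate; the solution above reported 125/1000
pnorm((1.5 - 2) / sqrt(1/6))               # normal approximation, approximately 0.110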