HOMEWORK I: PREREQUISITES FROM MATH 727

Question 1. Let X_1, X_2, ... be independent exponential random variables with mean µ.

(a) Show that for n ∈ Z^+, we have E[X_1^n] = µ^n n!.

(b) Show that almost surely, (X_1 + ··· + X_n)^{1/n} → 1.

(c) Find the limits in distribution of

    Y_n = (X_1 + ··· + X_n − nµ)/√n   and   Z_n = (X_1 + ··· + X_n − nµ)/√(X_1^4 + ··· + X_n^4).

Solution. (a) Let λ = 1/µ. Recall that the characteristic function is given by

    φ(t) = E e^{itX} = ∫_0^∞ λ exp(x(it − λ)) dx = [λ/(it − λ)] exp(x(it − λ)) |_0^∞ = λ/(λ − it).

We need to recall that

    exp(x(it − λ)) = e^{−xλ}[cos(xt) + i sin(xt)],

from which it is obvious that as x → ∞, we have exp(x(it − λ)) → 0. Thus φ(t) = λ/(λ − it). Also recall that if E|X|^n < ∞, we have that φ^{(n)}(0) = i^n E[X^n]. An easy calculation gives

    φ^{(n)}(t) = n! i^n λ/(λ − it)^{n+1},

from which the result follows: φ^{(n)}(0) = n! i^n/λ^n, so E[X^n] = µ^n n!.

(b) The law of large numbers gives that almost surely we have

    (X_1 + ··· + X_n)/n → µ ∈ (0, ∞).

Thus it suffices to show that if a sequence (a_n) satisfies a_n/n → a ∈ (0, ∞), then a_n^{1/n} → 1, but this is an easy exercise in analysis.

(c) It follows from the central limit theorem that Y_n → N(0, µ²) in distribution. Note that

    Z_n = (X_1 + ··· + X_n − nµ)/√(X_1^4 + ··· + X_n^4)
        = [(X_1 + ··· + X_n − nµ)/√n] / √((X_1^4 + ··· + X_n^4)/n).

The central limit theorem gives that the numerator converges in distribution to N(0, µ²). The law of large numbers and the continuity of the square-root function give that the denominator converges almost surely to √(E X_1^4) = √(24µ^4). Applying Slutsky's theorem, we obtain that the fraction converges in distribution to N(0, 1/(24µ²)).

Question 2. Let U_1, U_2, ... be independent random variables that are uniformly distributed in (0, 1). Let

    T := inf{n ≥ 2 : U_1 + ··· + U_n > 1}.

(a) Let a ∈ [0, 1]. Show that

    P(U_1 + ··· + U_n ≤ a) = a^n/n!.

(b) Show that for any integer-valued random variable X ≥ 0, we have

    EX = Σ_{n≥1} P(X ≥ n).

(c) Show that ET = e ≈ 2.7.

Solution. (a) We proceed by induction. The result is obvious in the case n = 1. Assume the result for n. Set Z_n = U_1 + ··· + U_n. Note that Z_n is independent of U_{n+1}. By the induction hypothesis we know the density of Z_n (at least up to the value a ≤ 1), namely z^{n−1}/(n−1)!. Hence we have that

    P(Z_n + U_{n+1} ≤ a) = ∫_0^a ∫_0^{a−z} [z^{n−1}/(n−1)!] du dz,

and the inductive step follows from an easy calculation.

(b) Note that

    X = Σ_{n≥1} 1[X ≥ n],

so that

    EX = Σ_{n≥1} E 1[X ≥ n] = Σ_{n≥1} P(X ≥ n).

(c) Immediate from the previous parts, and the fact that for n ≥ 1,

    {T ≥ n + 1} = {U_1 + ··· + U_n < 1},

so that ET = Σ_{n≥1} P(T ≥ n) = 1 + Σ_{n≥1} 1/n! = e.

Question 3. Let X_1, ..., X_n be i.i.d. Bernoulli random variables with parameter p ∈ (0, 1). Let

    T := ( (1/n) Σ_{i=1}^n X_i )²,   T_j := ( (1/(n−1)) Σ_{i≠j} X_i )²,

and

    J := nT − [(n−1)/n] Σ_{j=1}^n T_j.

(a) Show that T converges to p² in probability as n → ∞.

(b) Show that EJ = p².

Solution. (a) Follows immediately from the law of large numbers and continuity of the square function.

(b) Set S = X_1 + ··· + X_n. An easy calculation gives that

    Var(S/n) = p(1 − p)/n,

so that

    ET = Var(S/n) + (ES/n)² = p² + (p − p²)/n.

Similarly, we have that ET_j = p² + (p − p²)/(n−1). The linearity of expectation gives

    EJ = n[p² + (p − p²)/n] − [(n−1)/n] · n · [p² + (p − p²)/(n−1)]
       = np² + (p − p²) − (n−1)p² − (p − p²) = p².
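Both results above are easy to check numerically. The sketch below, with arbitrary illustrative sample counts and an arbitrary choice of p (none of these values come from the problems), estimates ET by simulation for Question 2(c) and compares the biased estimator T with the jackknife estimator J for Question 3(b):

```python
import math
import numpy as np

rng = np.random.default_rng(1)

# Question 2(c): T = inf{n >= 2 : U_1 + ... + U_n > 1} has ET = e.
# Since each U_i < 1, the running sum first exceeds 1 at some n >= 2 anyway.
def sample_T() -> int:
    total, n = 0.0, 0
    while total <= 1.0:
        total += rng.uniform()
        n += 1
    return n

est_T = np.mean([sample_T() for _ in range(100_000)])
print(est_T, math.e)                     # estimate should be close to e

# Question 3(b): J = n*T - ((n-1)/n) * sum_j T_j is an unbiased estimator
# of p^2, while T = (sample mean)^2 has bias (p - p^2)/n.
p, n, reps = 0.3, 10, 200_000
X = (rng.uniform(size=(reps, n)) < p).astype(float)   # Bernoulli(p) samples
S = X.sum(axis=1)
T = (S / n) ** 2
T_j = ((S[:, None] - X) / (n - 1)) ** 2               # leave-one-out estimates
J = n * T - (n - 1) / n * T_j.sum(axis=1)

print(T.mean(), p**2 + (p - p**2) / n)   # T's mean matches p^2 + bias
print(J.mean(), p**2)                    # J's mean should be close to p^2
```

The leave-one-out sums S − X_j are computed by broadcasting, which mirrors the identity Σ_{i≠j} X_i = S − X_j used implicitly in the solution.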

Question 4. Let T = X_1 + ··· + X_n, where the X_i are i.i.d. Poisson random variables with mean µ.

(a) Show that T is also a Poisson random variable.

(b) Compute E(E(X_1 X_2 | T)).

(c) Fix k ≥ 0 and let t ≥ 0. Compute φ(t) := E(1[X_1 = k] | T = t).

(d) Show that lim_{n→∞} φ(T) = P(X_1 = k) almost surely.

Solution. (a) An easy computation gives that the characteristic function for a Poisson random variable with mean λ is given by

    φ_λ(t) = exp(λ(e^{it} − 1)).

If we compute the characteristic function of T, the independence of the X_i gives that

    φ_T(t) = (φ_µ(t))^n = exp(nµ(e^{it} − 1)),

so that T has the same characteristic function as a Poisson random variable with mean nµ. Thus T must be a Poisson random variable with mean nµ.

(b) By an elementary property of conditional expectation, we have that

    E(E(X_1 X_2 | T)) = E(X_1 X_2) = EX_1 EX_2 = µ².

(c) Set T' = T − X_1, so that T = X_1 + T', where T' ~ Poi((n−1)µ) and is independent of X_1. For t ≥ k, we have

    P(X_1 = k | T = t) = P(X_1 = k, T' = t − k)/P(T = t)
                       = P(X_1 = k) P(T' = t − k)/P(T = t)
                       = (e^{−µ} µ^k/k!) (e^{−(n−1)µ} [(n−1)µ]^{t−k}/(t−k)!) / (e^{−nµ} (nµ)^t/t!)
                       = (1 − 1/n)^t [t(t−1)···(t−k+1)/k!] (1/(n−1))^k.

Thus

    φ(t) = (1 − 1/n)^t [t(t−1)···(t−k+1)/k!] (1/(n−1))^k

for t ≥ k, and φ(t) = 0 otherwise.

(d) We have that

    φ(T) = (1 − 1/n)^T [T(T−1)···(T−k+1)/k!] (1/(n−1))^k.

We know by the law of large numbers that T/n → µ almost surely. This gives that almost surely we have

    (1 − 1/n)^T = [(1 − 1/n)^n]^{T/n} → e^{−µ}.

The term T(T−1)···(T−k+1)(1/(n−1))^k behaves like (T/n)^k → µ^k as n → ∞, so we are able to conclude that almost surely

    lim_{n→∞} φ(T) = e^{−µ} µ^k/k! = P(X_1 = k),

as desired.

Question 5. (a) Let X have pmf given by P(X = a) = 1/6, P(X = b) = 2/6, and P(X = c) = 3/6. Suppose I roll a three-sided die with distribution X ten times. What is the probability that I get two a's, three b's, and five c's?

(b) Let U_1, ..., U_n be independent random variables that are uniformly distributed in [0, 1]. Order the random variables so that

    V_1 < V_2 < ··· < V_n,

where {V_1, ..., V_n} = {U_1, ..., U_n}. Recall that here V_1, ..., V_n are called the order statistics for U_1, ..., U_n. For h > 0 small and x ∈ [0, 1), show that

    P(x ≤ V_k ≤ x + h, V_{k−1} < x, V_{k+1} > x + h) = [n!/((k−1)!(n−k)!)] x^{k−1} (1 − x − h)^{n−k} h.

(c) Show that the pdf for V_k is given by a Beta distribution. Hint: you may use the fact that

    [P(x ≤ V_k ≤ x + h) − P(x ≤ V_k ≤ x + h, V_{k−1} < x, V_{k+1} > x + h)]/h → 0 as h → 0,

since the difference corresponds to the event that at least two of the points lie in the interval [x, x + h].

(d) Show that if F is the cdf of a random variable X, then F^{−1}(U_1) has the same distribution as X. Here, and after, assume that F is strictly increasing. Note that this result holds in general by defining F^{−1}(y) := sup{x ∈ R : F(x) < y}.

(e) Let F be the cdf for a continuous random variable. Let W_i = F^{−1}(U_i). Let Z_1 < Z_2 < ··· < Z_n be the order statistics for W_1, ..., W_n. Show that Z_k = F^{−1}(V_k).

(f) Show that if X_1, ..., X_n are independent continuous random variables with cdf F and pdf f and order statistics given by Y_1 < Y_2 < ··· < Y_n, then the pdf for Y_k is given by

    g_k(x) = [n!/((k−1)!(n−k)!)] F(x)^{k−1} (1 − F(x))^{n−k} f(x).

Solution. (a) Multinomial:

    [10!/(2! 3! 5!)] (1/6)² (1/3)³ (1/2)⁵.

(b) We think of the unit interval as partitioned into three disjoint intervals [0, x), [x, x + h], (x + h, 1]. The event of interest is the event that there are k − 1 points in [0, x), one point in [x, x + h], and n − k points in (x + h, 1]. This (multinomial-type) event happens with probability

    [n!/((k−1)! 1! (n−k)!)] x^{k−1} h (1 − x − h)^{n−k}.

(c) From the previous part, clearly the pdf is given by

    x ↦ [n!/((k−1)!(n−k)!)] x^{k−1} (1 − x)^{n−k}

for x ∈ [0, 1], which is the pdf of a Beta distribution with parameters k and n − k + 1.

(d) We have that

    P(F^{−1}(U_1) ≤ x) = P(U_1 ≤ F(x)) = F(x),

so that F^{−1}(U_1) has the cdf F.

(e) This is immediate from the monotonicity of F^{−1}:

    u ≤ v  ⟺  F^{−1}(u) ≤ F^{−1}(v).

(f) From the previous parts,

    (Y_1, ..., Y_n) =_d (F^{−1}(V_1), ..., F^{−1}(V_n)),

so that it suffices to find the pdf for F^{−1}(V_k). We have

    P(F^{−1}(V_k) ≤ x) = P(V_k ≤ F(x)) = ∫_0^{F(x)} [n!/((k−1)!(n−k)!)] t^{k−1} (1 − t)^{n−k} dt.

So then it follows from calculus (differentiating with respect to x, using the chain rule) that the pdf is given by

    [n!/((k−1)!(n−k)!)] F(x)^{k−1} (1 − F(x))^{n−k} f(x).

Question 6. Let (X_1, ..., X_n) be independent standard normal random variables and let X̄ = (1/n) Σ_{i=1}^n X_i denote the usual sample average.

(a) Compute the distribution of X̄ by using a change of variables: that is, with

    y_1 = x̄, y_2 = x_2 − x̄, ..., y_n = x_n − x̄,

find the distribution of Y = (Y_1, ..., Y_n), where Y_1 = X̄ and Y_i = X_i − X̄ for i ∈ [2, n], to obtain the distribution of X̄. Of course you already know what the answer should be, but using this method, as a bonus, you will find that X̄ is independent of (Y_2, ..., Y_n).

(b) Is the previous part regarding the independence true if the X_i are not normal random variables?

Solution. (a) Clearly, the transformation is bijective. We have:

    x_2 = y_2 + y_1, ..., x_n = y_n + y_1,

and

    x_1 = y_1 − y_2 − ··· − y_n.

We have that ∂x_1/∂y_1 = 1 and ∂x_1/∂y_i = −1 for all i ∈ [2, n], and for j ≥ 2 we have ∂x_j/∂y_1 = 1 and ∂x_j/∂y_j = 1, and everything else is zero. With some row operations, we get a triangular matrix, from which we easily see that the Jacobian

satisfies |J| = n, and by the change of variables formula we have that the joint pdf for Y is given by

    g(y) = n (2π)^{−n/2} e^{−(y_1 − y_2 − ··· − y_n)²/2} ∏_{i=2}^n e^{−(y_i + y_1)²/2}.

Some algebra gives that

    (y_1 − y_2 − ··· − y_n)² + (y_2 + y_1)² + ··· + (y_n + y_1)²
        = n y_1² + y_2² + ··· + y_n² + Σ_{2 ≤ i, j ≤ n} y_i y_j,

from which it follows that

    g(y) = (√n/√(2π)) e^{−n y_1²/2} h(y_2, ..., y_n),

for some function h that does not depend on y_1. From g we read off that X̄ is independent of (Y_2, ..., Y_n) and that X̄ ~ N(0, 1/n), as required.

(b) No. Consider the case where the X_i ~ Ber(1/2) and n = 2. Notice that

    {X̄ = 1} ⊂ {Y_2 = 0}.

We have that

    P(X̄ = 1, Y_2 = 0) = P(X̄ = 1) = 1/4.

However, clearly P(Y_2 = 0) ∈ (0, 1), so that P(X̄ = 1, Y_2 = 0) ≠ P(X̄ = 1) P(Y_2 = 0), and X̄ and Y_2 are not independent.

Question 7. Suppose X ~ N(µ, V), where µ ∈ R^n and V ∈ R^{n×n} is a positive definite symmetric covariance matrix, so that X has a nondegenerate multivariate normal distribution. Let A ∈ R^{n×n} be a matrix with det(A) ≠ 0. Think of X ∈ R^n as a column vector. What is the distribution of Y = AX? Note that the pdf for X is given by

    f(x) = (2π)^{−n/2} det(V)^{−1/2} exp[−(1/2)(x − µ)^t V^{−1} (x − µ)],

where x ∈ R^n, and t denotes matrix transposition.

Solution. Clearly, the transformation x ↦ Ax is a bijection. Set

    W = A V A^t,

so that

    W^{−1} = (A^{−1})^t V^{−1} A^{−1}

and

    det(W) = det(A) det(V) det(A^t) = det(A)² det(V).

We will show that Y ~ N(Aµ, W). The change of variables formula gives that the pdf for Y is given by

    y ↦ f(A^{−1} y)/|det(A)|
      = (2π)^{−n/2} det(V)^{−1/2} |det(A)|^{−1} exp[−(1/2)(A^{−1}y − µ)^t V^{−1} (A^{−1}y − µ)]
      = (2π)^{−n/2} [det(V) det(A)²]^{−1/2} exp[−(1/2)(y − Aµ)^t (A^{−1})^t V^{−1} A^{−1} (y − Aµ)]
      = (2π)^{−n/2} det(W)^{−1/2} exp[−(1/2)(y − Aµ)^t W^{−1} (y − Aµ)],

as required.
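The conclusion Y ~ N(Aµ, AVA^t) can be checked by simulation. Below is a minimal sketch; the particular µ, V, and A (and the sample count) are arbitrary illustrative choices, not part of the problem:

```python
import numpy as np

rng = np.random.default_rng(2)

mu = np.array([1.0, -2.0])
V = np.array([[2.0, 0.5],
              [0.5, 1.0]])        # symmetric positive definite
A = np.array([[1.0, 1.0],
              [0.0, 3.0]])        # det(A) = 3 != 0

X = rng.multivariate_normal(mu, V, size=200_000)   # rows are samples of X
Y = X @ A.T                                        # each row is A times a sample

print(Y.mean(axis=0), A @ mu)                      # empirical vs. exact mean
print(np.cov(Y.T), A @ V @ A.T)                    # empirical vs. exact covariance
```

The empirical mean and covariance of Y should agree with Aµ and AVA^t up to Monte Carlo error, matching the density computed above.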