
STAT-36700 Homework 1 - Solutions

Fall 2018, September 11, 2018

This document contains solutions for Homework 1. Please note that we have included several additional comments and approaches to the problems to give you better insight.

Problem 1. Suppose we toss a fair coin until we get exactly three heads. Describe the sample space $\Omega$. Let $X$ denote the number of tosses. Find the probability mass function of $X$.

Solution 1. The sample space here is

$$\Omega = \{\, X = 3: HHH;\quad X = 4: THHH, HTHH, HHTH;\quad X = 5: TTHHH, THTHH, \ldots;\quad \ldots \,\}$$

We then have the probability mass function, for $k = 3, 4, \ldots$:

$$P(X = k) = \underbrace{\binom{k-1}{2}\left(\frac{1}{2}\right)^2}_{\text{two and only two H's in first } (k-1) \text{ tosses}} \cdot \underbrace{\left(\frac{1}{2}\right)^{k-3}}_{\text{remaining } (k-3) \text{ T's in first } (k-1) \text{ tosses}} \cdot \underbrace{\left(\frac{1}{2}\right)}_{\text{third H on } k\text{th toss}} = \binom{k-1}{2}\left(\frac{1}{2}\right)^k$$
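
As a quick sanity check of this pmf (not part of the original solution), here is a minimal Monte Carlo sketch, assuming numpy is available; the helper name tosses_until_third_head is our own:

```python
# Monte Carlo sanity check of P(X = k) = C(k-1, 2) * (1/2)^k (illustrative sketch).
import numpy as np
from math import comb

rng = np.random.default_rng(0)

def tosses_until_third_head():
    """Toss a fair coin until the third head appears; return the toss count."""
    heads, k = 0, 0
    while heads < 3:
        k += 1
        heads += rng.integers(0, 2)  # 0 = tails, 1 = heads, each w.p. 1/2
    return k

samples = np.array([tosses_until_third_head() for _ in range(100_000)])
for k in range(3, 9):
    print(k, np.mean(samples == k), comb(k - 1, 2) * 0.5 ** k)  # empirical vs exact
```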

Problem 2. Consider events $A_1, A_2, \ldots, A_n$. Prove that

$$P\left(\bigcup_{i=1}^{n} A_i\right) \le \sum_{i=1}^{n} P(A_i).$$

Solution 2. We will proceed by mathematical induction.

Proof. Base case: If $n = 1$, then $P\left(\bigcup_{i=1}^{1} A_i\right) = P(A_1) = \sum_{i=1}^{1} P(A_i)$. So the theorem holds when $n = 1$. Also, for $n = 2$ the result holds using Lemma 0.2.

Inductive hypothesis: Suppose the theorem holds for all values of $n$ up to some $k$, $k \ge 1$. That is, we assume that $P\left(\bigcup_{i=1}^{k} A_i\right) \le \sum_{i=1}^{k} P(A_i)$.

Inductive step: Let $n = k + 1$. Then we have:

$$P\left(\bigcup_{i=1}^{k+1} A_i\right) = P\left(\left(\bigcup_{i=1}^{k} A_i\right) \cup A_{k+1}\right)$$
$$= P\left(\bigcup_{i=1}^{k} A_i\right) + P(A_{k+1}) - \underbrace{P\left(\left(\bigcup_{i=1}^{k} A_i\right) \cap A_{k+1}\right)}_{\ge 0,\ \text{since it is a probability}} \quad \text{(using Lemma 0.2)}$$
$$\le P\left(\bigcup_{i=1}^{k} A_i\right) + P(A_{k+1})$$
$$\le \sum_{i=1}^{k} P(A_i) + P(A_{k+1}) \quad \text{(by inductive hypothesis)}$$
$$= \sum_{i=1}^{k+1} P(A_i)$$

So the theorem holds for $n = k + 1$. By the principle of mathematical induction, the theorem holds for all $n \in \mathbb{N}$.

Problem 3. Suppose that $A$ and $B$ are independent events. Show that $A$ and $B^c$ are independent events. [1]

Solution 3.

Proof.

$$P(A)\,P(B^c) = P(A)(1 - P(B)) = P(A) - P(A)P(B) = P(A) - P(A \cap B) = P(A \cap B^c)$$

where the last step uses the disjoint decomposition $A = (A \cap B) \cup (A \cap B^c)$. So events $A$ and $B^c$ are independent, i.e. $A \perp B^c$.

[1] Independence is all about what information is contained in events. It is a very strong property to assume, but it is usually assumed initially for mathematical simplicity. As an extension to this question: are $A^c$ and $B^c$ also independent? Prove it or provide a counterexample.
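
To make Solution 3 concrete, here is a small exact check on a finite probability space (our own illustrative example, not part of the original solution): two fair dice, with A = "first die is even" and B = "second die is at least 5", which are independent by construction.

```python
# Exact verification that A independent of B implies A independent of B^c.
from fractions import Fraction

omega = [(i, j) for i in range(1, 7) for j in range(1, 7)]  # two fair dice

def prob(event):
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

A = lambda w: w[0] % 2 == 0   # first die even
B = lambda w: w[1] >= 5       # second die at least 5

assert prob(lambda w: A(w) and B(w)) == prob(A) * prob(B)            # A indep. of B
assert prob(lambda w: A(w) and not B(w)) == prob(A) * (1 - prob(B))  # A indep. of B^c
print(prob(A), 1 - prob(B), prob(lambda w: A(w) and not B(w)))       # 1/2, 2/3, 1/3
```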

Problem 4. Show that if $P(A) = 0$ or $P(A) = 1$ then $A$ is independent of every other event. Show that if $A$ is independent of itself then $P(A)$ is either 0 or 1.

Solution 4. (a) First part:

Proof. Consider an arbitrary event $B$. Suppose $P(A) = 0$. Since $A \cap B \subseteq A$, we have $0 \le P(A \cap B) \le P(A) = 0$. Hence $P(A \cap B) = 0 = P(A)\,P(B)$, which implies independence.

Now suppose $P(A) = 1$. Then $P(A^c) = 1 - P(A) = 0$. By the above proof, we have that $A^c \perp B$. Since $(A^c)^c = A$, by Problem 3 we get $A \perp B$.

(b) Second part:

Proof. Since we are given that $A \perp A$, this means that

$$P(A) = P(A \cap A) = P(A)\,P(A) = (P(A))^2 \implies P(A)(1 - P(A)) = 0$$

From this it follows that $P(A) = 0$ or $P(A) = 1$.

Problem 5. Let $X$ have CDF $F$. Find the CDF of $Y = \min\{0, X\}$.

Solution 5. In this case we have [2]:

$$F_Y(t) := P(Y \le t) = P(\min\{0, X\} \le t)$$
$$= 1 - P(\min\{0, X\} > t)$$
$$= 1 - P(0 > t)\,P(X > t)$$
$$= 1 - \mathbb{1}\{t < 0\}\,(1 - P(X \le t))$$
$$= 1 - \mathbb{1}\{t < 0\}\,(1 - F_X(t))$$

so that

$$F_Y(t) = \begin{cases} F_X(t) & t < 0 \\ 1 & t \ge 0 \end{cases}$$

[2] Key point: $\max(X, Y) \le a$ if and only if $X \le a$ and $Y \le a$. Similarly, $\min(X, Y) > a$ if and only if $X > a$ and $Y > a$.
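
A quick numerical check of the Solution 5 formula (a sketch, assuming numpy and scipy are available; the choice $X \sim N(0,1)$ is purely illustrative):

```python
# Empirical CDF of Y = min{0, X} versus F_Y(t) = F_X(t) for t < 0, and 1 for t >= 0.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
x = rng.standard_normal(200_000)
y = np.minimum(0.0, x)  # Y = min{0, X}

for t in [-2.0, -1.0, -0.5, 0.0, 1.0]:
    formula = norm.cdf(t) if t < 0 else 1.0
    print(t, np.mean(y <= t), formula)  # empirical vs formula
```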

Problem 6. A random variable $X$ is stochastically greater than a random variable $Y$ if $F_X(t) \le F_Y(t)$ for all $t$ and $F_X(t) < F_Y(t)$ for some $t$. Prove that, in this case,

$$P(X > t) \ge P(Y > t) \text{ for every } t, \quad \text{and} \quad P(X > t) > P(Y > t) \text{ for some } t.$$

Solution 6.

Proof. We note that by definition the events $\{X \le t\}$ and $\{X > t\}$ are complementary. As such we have that $P(X > t) = 1 - P(X \le t) = 1 - F_X(t)$. Using this simple fact we now proceed to prove the first part as follows [3]:

$$F_X(t) \le F_Y(t)\ \forall t \implies 1 - F_X(t) \ge 1 - F_Y(t)\ \forall t \implies P(X > t) \ge P(Y > t)\ \forall t$$

The second part follows similarly (just carefully noting the strict inequality):

$$F_X(t) < F_Y(t) \text{ for some } t \implies 1 - F_X(t) > 1 - F_Y(t) \text{ for that } t \implies P(X > t) > P(Y > t) \text{ for some } t$$

And thus the required statements are now proved.

[3] Key point: We just recognize that the CDF captures the left-tail probability of a random variable (in 1D), and the complementary event is the right-tail probability of the same random variable. We then note that the probabilities of an event and its complement sum to 1 (Lemma 0.1) to get a relationship between the CDF and the right-tail probabilities.

Problem 7. Define

$$F_X(t) = \begin{cases} 0 & t < 2 \\ \frac{t-2}{2} & 2 \le t \le 4 \\ 1 & t > 4 \end{cases}$$

Prove that $F$ is a valid CDF. Find the probability density function.

Solution 7. We will show that $F$ satisfies the required conditions to ensure it is a valid CDF.

(a) $\lim_{t \to -\infty} F_X(t) = 0$

Proof. We have $\lim_{t \to -\infty} F_X(t) = \lim_{t \to -\infty} 0 = 0$.

(b) $\lim_{t \to \infty} F_X(t) = 1$

Proof. We have $\lim_{t \to \infty} F_X(t) = \lim_{t \to \infty} 1 = 1$.

(c) $F_X(t)$ is nondecreasing for all $t$.

Proof. In the cases $t < 2$ and $t > 4$, $F_X(t)$ is defined to be constant valued at 0 and 1 respectively, so it is non-decreasing over those domains by definition. In the interval $[2, 4]$ we note that for $t_1, t_2 \in [2, 4]$ such that $t_2 \ge t_1$ we have:

$$F_X(t_2) - F_X(t_1) = \frac{t_2 - 2}{2} - \frac{t_1 - 2}{2} = \frac{t_2 - t_1}{2} \ge 0, \quad \text{since } t_2 \ge t_1 \text{ by assumption}$$

So $F_X(t)$ is nondecreasing for all $t$.

(d) $F_X(t)$ is right-continuous for all $t$.

Proof. In the cases $t < 2$, $t > 4$ and $t \in (2, 4)$, we note that $F_X(t)$ is continuous (and thus right-continuous). We just need to verify right-continuity at the end points of the interval $(2, 4)$, namely $t \in \{2, 4\}$. This is done as follows:

Case $t = 2$: $\lim_{t \to 2^+} F_X(t) = \lim_{t \to 2^+} \frac{t-2}{2} = 0 = F_X(2)$

Case $t = 4$: $\lim_{t \to 4^+} F_X(t) = \lim_{t \to 4^+} 1 = 1 = F_X(4)$

So $F_X(t)$ is right-continuous for all $t$.

(e) Find the PDF. Differentiating $F_X$ where it is differentiable gives:

$$f_X(t) = \begin{cases} 0 & t < 2 \\ \frac{1}{2} & 2 \le t \le 4 \\ 0 & t > 4 \end{cases}$$
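
Since $F$ is a valid CDF, we can also sample from it by inverse-transform sampling: on $[2, 4]$, $u = (t-2)/2$ inverts to $t = 2 + 2u$. The sketch below (assuming numpy; not part of the original solution) checks the empirical CDF of such samples against $F$:

```python
# Inverse-transform sampling from F(t) = (t - 2)/2 on [2, 4]: F^{-1}(u) = 2 + 2u.
import numpy as np

rng = np.random.default_rng(2)
samples = 2 + 2 * rng.random(200_000)  # F^{-1}(U) with U ~ Uniform(0, 1)

for t in [2.5, 3.0, 3.5, 4.0]:
    print(t, np.mean(samples <= t), (t - 2) / 2)  # empirical vs exact F(t)
```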

Problem 8. The uniform distribution on $[-3, 3]$ has density $f_X(x) = \frac{1}{6}$ for $x \in [-3, 3]$. Suppose $X$ has this density.

(a) Find $P(X = 1)$.
(b) Find $P(0.5 \le X \le 1.5)$.
(c) Find the CDF of $Y = X^2$.

Solution 8. Note that this is the continuous uniform distribution on $[-3, 3]$, not the discrete version.

(a) Find $P(X = 1)$. This is 0.

Proof. For a continuous density the mass is 0 at any given point. More formally we have

$$\int_1^1 \frac{1}{6}\,dx = \frac{1}{6}[x]_1^1 = 0$$

(b) Find $P(0.5 \le X \le 1.5)$. This is $\frac{1}{6}$.

Proof.

$$P(0.5 \le X \le 1.5) = \int_{0.5}^{1.5} \frac{1}{6}\,dx = \frac{1}{6}[x]_{0.5}^{1.5} = \frac{1}{6}$$

(c) Find the CDF of $Y = X^2$.

Proof. For $y \in [0, 9]$,

$$F_Y(y) = P(Y \le y) = P(X^2 \le y) = P(-\sqrt{y} \le X \le \sqrt{y}) = \int_{-\sqrt{y}}^{\sqrt{y}} \frac{1}{6}\,dx = \frac{1}{6}[x]_{-\sqrt{y}}^{\sqrt{y}} = \frac{\sqrt{y}}{3}$$

Hence, the CDF of $Y$ is

$$F_Y(y) = \begin{cases} 0 & \text{if } y < 0 \\ \frac{\sqrt{y}}{3} & \text{if } 0 \le y \le 9 \\ 1 & \text{if } y > 9 \end{cases}$$
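
A Monte Carlo cross-check of parts (b) and (c) (a sketch, assuming numpy; not part of the original solution):

```python
# Check P(0.5 <= X <= 1.5) ~ 1/6 and F_Y(y) = sqrt(y)/3 for Y = X^2, X ~ Uniform[-3, 3].
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(-3, 3, 500_000)

print(np.mean((0.5 <= x) & (x <= 1.5)), 1 / 6)  # part (b): empirical vs exact
y = x ** 2
for q in [1.0, 4.0, 9.0]:
    print(q, np.mean(y <= q), np.sqrt(q) / 3)   # part (c): empirical vs sqrt(y)/3
```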

Problem 9. Let $(X, Y)$ have a uniform distribution on the unit circle in the plane.

(a) Show that $X$ and $Y$ are not independent.
(b) Find $P(X^2 + Y^2 < 1/4)$.

Solution 9. We proceed as follows:

(a) Claim: $X, Y$ are not independent.

Proof. Consider the events $X > 0.8$ and $Y > 0.8$. We note that $P((X > 0.8) \cap (Y > 0.8)) = 0$, since the two regions are disjoint on the unit circle (the square $[0.8, 1] \times [0.8, 1]$ lies entirely outside it, as $0.8^2 + 0.8^2 > 1$). However $P(X > 0.8)\,P(Y > 0.8) > 0$. Hence, $X$ and $Y$ are not independent. [4]

[Figure 1: Figure for Q9(a).]

[4] Key point: To disprove independence, all you sometimes need is a single counterexample to show it does not hold. Alternatively, you can derive the marginals of $X$ and $Y$ and show that their product does not equal the joint density of $(X, Y)$.

(b) Find $P(X^2 + Y^2 < 1/4)$. Claim: this is $\frac{1}{4}$.

Proof. This is the probability that $(X, Y)$ falls in a smaller circle of radius $\frac{1}{2}$, given that they are jointly uniform on the unit circle. The required probability is simply the ratio of the areas of the circles in this case (since the joint distribution is continuous, the boundary of the inner circle has probability measure 0 and we don't have to be concerned about its exclusion): [5]

$$P\left(X^2 + Y^2 < 1/4\right) = \frac{\pi\left(\frac{1}{2}\right)^2}{\pi(1)^2} = \frac{1}{4}$$

[5] Key point: Try and draw a picture and exploit the geometry of the problem to find the quickest solution. In this case, because of the uniform distribution (and thus the uniform volume under the density in 3D), we are simply concerned with relative 2D areas to get our required probability.
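
Both parts can be verified numerically by drawing points uniformly on the unit disk via rejection sampling (a sketch, assuming numpy; not part of the original solution):

```python
# Uniform points on the unit disk by rejection sampling from the square [-1, 1]^2.
import numpy as np

rng = np.random.default_rng(4)
pts = rng.uniform(-1, 1, size=(2_000_000, 2))
disk = pts[np.sum(pts ** 2, axis=1) <= 1]  # keep points inside the unit circle
x, y = disk[:, 0], disk[:, 1]

# (a) The joint event has probability 0, but the product of marginals is positive.
print(np.mean((x > 0.8) & (y > 0.8)), np.mean(x > 0.8) * np.mean(y > 0.8))
# (b) P(X^2 + Y^2 < 1/4) should be approximately 1/4.
print(np.mean(x ** 2 + y ** 2 < 0.25))
```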

Problem 10. Let $X_1, X_2, \ldots, X_n \sim N(\mu, \sigma^2)$ be IID. Let $T = \sum_{i=1}^{n} X_i^2$. Find $E(T)$ and $\mathrm{Var}(T)$.

Solution 10. (a) We claim that $E[T] = n(\mu^2 + \sigma^2)$.

Proof. Since the $X_i$'s are IID, it follows that the $X_i^2$'s are IID. We then have

$$E[X_i^2] = E[X_1^2] = \mathrm{Var}(X_1) + (E[X_1])^2 = \mu^2 + \sigma^2$$

So then we have [6]:

$$E[T] = E\left[\sum_{i=1}^{n} X_i^2\right] = \sum_{i=1}^{n} E[X_i^2] \quad \text{(by linearity of expectation)}$$
$$= n\,E[X_1^2] \quad \text{(since the } X_i\text{'s are identically distributed)}$$
$$= n(\mu^2 + \sigma^2) \quad \text{(since } E[X_1^2] = \mathrm{Var}(X_1) + (E[X_1])^2\text{)}$$

[6] Key point: We did not rely on independence of the $X_i$'s here to derive the expectation of $T$. Simply using linearity of expectation and identically distributed $X_i$'s was enough. Always try and prove statements with the minimal required assumptions.
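
A quick simulation check of $E[T] = n(\mu^2 + \sigma^2)$ (a sketch, assuming numpy; the values of n, mu, sigma below are arbitrary illustrative choices):

```python
# Monte Carlo estimate of E[T] for T = sum of squared N(mu, sigma^2) draws.
import numpy as np

rng = np.random.default_rng(5)
n, mu, sigma = 10, 1.5, 2.0

X = rng.normal(mu, sigma, size=(200_000, n))
T = np.sum(X ** 2, axis=1)
print(T.mean(), n * (mu ** 2 + sigma ** 2))  # both should be approximately 62.5
```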

(b) We claim that $\mathrm{Var}[T] = n\left[(\mu^4 + 6\mu^2\sigma^2 + 3\sigma^4) - (\mu^2 + \sigma^2)^2\right]$.

Proof. We firstly note that [7]:

$$\mathrm{Var}[T] := \mathrm{Var}\left[\sum_{i=1}^{n} X_i^2\right] = \sum_{i=1}^{n} \mathrm{Var}[X_i^2] \quad \text{(since the } X_i\text{'s are independent)}$$
$$= n\,\mathrm{Var}[X_1^2] \quad \text{(since the } X_i\text{'s are identically distributed)}$$

Now $\mathrm{Var}[X_1^2] = E[X_1^4] - (E[X_1^2])^2$. From the previous part we know that $E[X_1^2] = \mu^2 + \sigma^2$. So we just need to find $E[X_1^4]$ (the fourth moment of $X_1$), which can be found directly using the MGF of $X_1$ as follows:

$$M_{X_1}(t) = e^{\mu t + \frac{1}{2}\sigma^2 t^2} \quad \text{(the MGF for the } N(\mu, \sigma^2)\text{)}$$
$$M^{(4)}_{X_1}(t) = \left[(\mu + \sigma^2 t)^4 + 6\sigma^2(\mu + \sigma^2 t)^2 + 3\sigma^4\right] e^{\mu t + \frac{1}{2}\sigma^2 t^2} \quad \text{(using Wolfram Mathematica!)}$$
$$E[X_1^4] = M^{(4)}_{X_1}(0) = \mu^4 + 6\mu^2\sigma^2 + 3\sigma^4$$

It follows that

$$\mathrm{Var}[T] = n\left[(\mu^4 + 6\mu^2\sigma^2 + 3\sigma^4) - (\mu^2 + \sigma^2)^2\right] = n\left(4\mu^2\sigma^2 + 2\sigma^4\right)$$

[7] Key point: Here we do rely on both independence of the $X_i$'s (so that the variance of the sum equals the sum of the variances) and their identical distribution to simplify the variance of $T$.
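
The fourth-moment computation attributed to Wolfram Mathematica above can also be checked symbolically (a sketch, assuming sympy is available):

```python
# Differentiate the N(mu, sigma^2) MGF four times at t = 0 to get E[X^4].
import sympy as sp

t, mu, sigma = sp.symbols("t mu sigma", real=True)
M = sp.exp(mu * t + sigma**2 * t**2 / 2)  # MGF of N(mu, sigma^2)

fourth_moment = sp.expand(sp.diff(M, t, 4).subs(t, 0))
print(fourth_moment)                                         # mu**4 + 6*mu**2*sigma**2 + 3*sigma**4
print(sp.expand(fourth_moment - (mu**2 + sigma**2)**2))      # Var[X^2] = 4*mu**2*sigma**2 + 2*sigma**4
```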

Appendix: Supporting Lemmas

Here we prove some lemmas from the lecture notes that we use repeatedly in the solutions.

Lemma 0.1. For an event $B$ we have $P(B^c) = 1 - P(B)$.

Proof. We note that $\Omega = B \cup B^c$. Since $B$ and $B^c$ are disjoint by definition, we have that

$$1 = P(\Omega) = P(B \cup B^c) = P(B) + P(B^c)$$

from which we obtain the required result.

Lemma 0.2. For events $A, B$ we have $P(A \cup B) = P(A) + P(B) - P(A \cap B)$.

Proof. We note firstly:

$$A = (A \cap B^c) \cup (A \cap B) \quad \text{(writing } A \text{ as a disjoint union of sets)}$$
$$P(A) = P(A \cap B^c) + P(A \cap B) \quad \text{(taking the probability measure of disjoint sets)}$$
$$P(A \cap B^c) = P(A) - P(A \cap B) \quad \text{(rearranging terms)}$$

By symmetry we also have that $P(B \cap A^c) = P(B) - P(A \cap B)$. Now finally we note that:

$$A \cup B = (A \cap B^c) \cup (A \cap B) \cup (B \cap A^c) \quad \text{(writing } A \cup B \text{ as a disjoint union of 3 sets)}$$
$$P(A \cup B) = P(A \cap B^c) + P(A \cap B) + P(B \cap A^c) \quad \text{(taking the probability measure of disjoint sets)}$$
$$= P(A) - P(A \cap B) + P(A \cap B) + P(B) - P(A \cap B) \quad \text{(using the above derived results)}$$
$$\implies P(A \cup B) = P(A) + P(B) - P(A \cap B)$$

which is the required result.