STAT-36700 Homework 2 - Solutions


Fall 2018

This contains solutions for Homework 2. Please note that we have included several additional comments and approaches to the problems to give you better insight.

Problem 1. Suppose we generate a random variable $X$ in the following way. First we flip a fair coin. If the coin is heads, take $X$ to have a $N(\mu_1, \sigma_1^2)$ distribution. If the coin is tails, take $X$ to have a $N(\mu_2, \sigma_2^2)$ distribution. Find:
(a) The mean of $X$.
(b) The standard deviation of $X$.

Solution 1. Let $Z \sim \mathrm{Bernoulli}(1/2)$; then $X \mid Z = 1 \sim N(\mu_1, \sigma_1^2)$ and $X \mid Z = 0 \sim N(\mu_2, \sigma_2^2)$. Note: the question asks for the standard deviation, not the variance, so be careful.

(a) By the tower property and linearity of expectation,
$E[X] = E[E[X \mid Z]] = E[Z\mu_1 + (1 - Z)\mu_2] = \frac{\mu_1 + \mu_2}{2}.$

(b) By the law of total variance,
$\mathrm{Var}(X) = \mathrm{Var}(E[X \mid Z]) + E[\mathrm{Var}(X \mid Z)]$
$= \mathrm{Var}(Z\mu_1 + (1 - Z)\mu_2) + E[Z\sigma_1^2 + (1 - Z)\sigma_2^2]$
$= \mathrm{Var}((\mu_1 - \mu_2)Z + \mu_2) + \frac{\sigma_1^2 + \sigma_2^2}{2}$
$= (\mu_1 - \mu_2)^2 \mathrm{Var}(Z) + \frac{\sigma_1^2 + \sigma_2^2}{2}$
$= \frac{(\mu_1 - \mu_2)^2}{4} + \frac{\sigma_1^2 + \sigma_2^2}{2},$
so
$\mathrm{sd}(X) = \sqrt{\mathrm{Var}(X)} = \sqrt{\frac{(\mu_1 - \mu_2)^2}{4} + \frac{\sigma_1^2 + \sigma_2^2}{2}}.$
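Not part of the original solution: a minimal Monte Carlo sketch, assuming NumPy and arbitrary illustrative parameters ($\mu_1 = 0$, $\sigma_1 = 1$, $\mu_2 = 3$, $\sigma_2 = 2$), that checks both formulas.

    import numpy as np

    rng = np.random.default_rng(0)
    mu1, sigma1 = 0.0, 1.0   # illustrative choices, not from the homework
    mu2, sigma2 = 3.0, 2.0
    n = 1_000_000

    # Simulate the two-stage experiment: fair coin, then the corresponding normal.
    z = rng.integers(0, 2, size=n)   # 1 = heads, 0 = tails
    x = np.where(z == 1,
                 rng.normal(mu1, sigma1, size=n),
                 rng.normal(mu2, sigma2, size=n))

    theory_mean = (mu1 + mu2) / 2
    theory_sd = np.sqrt((mu1 - mu2) ** 2 / 4 + (sigma1 ** 2 + sigma2 ** 2) / 2)
    print(x.mean(), theory_mean)   # both close to 1.5
    print(x.std(), theory_sd)      # both close to sqrt(4.75), about 2.18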

Problem 2. For a collection of random variables $X_1, \dots, X_n$ and constants $a_1, \dots, a_n$, prove that:
$\mathrm{Var}\left(\sum_i a_i X_i\right) = \sum_i \sum_j a_i a_j \mathrm{Cov}(X_i, X_j).$

Solution 2. Note: it is possible to use induction to prove this, given that we are handed the formula. However, sometimes just proceeding directly from the definition (in this case, of variance) can be easier.

$\mathrm{Var}\left(\sum_i a_i X_i\right) = E\left[\left(\sum_i a_i X_i\right)^2\right] - \left(E\left[\sum_i a_i X_i\right]\right)^2$
$= \sum_i \sum_j a_i a_j E[X_i X_j] - \sum_i \sum_j a_i a_j E[X_i] E[X_j]$
$= \sum_i \sum_j a_i a_j \left(E[X_i X_j] - E[X_i] E[X_j]\right)$
$= \sum_i \sum_j a_i a_j \mathrm{Cov}(X_i, X_j).$
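Not in the original write-up: a short numerical illustration of the identity, assuming NumPy and an arbitrary coefficient vector and covariance matrix, comparing the sample variance of $\sum_i a_i X_i$ with $\sum_i \sum_j a_i a_j \mathrm{Cov}(X_i, X_j)$.

    import numpy as np

    rng = np.random.default_rng(1)
    a = np.array([2.0, -1.0, 0.5])        # arbitrary coefficients
    cov = np.array([[1.0, 0.3, 0.2],      # arbitrary covariance matrix
                    [0.3, 2.0, -0.4],
                    [0.2, -0.4, 1.5]])

    # Right-hand side: sum_i sum_j a_i a_j Cov(X_i, X_j)
    rhs = a @ cov @ a

    # Left-hand side: sample variance of sum_i a_i X_i over many correlated draws
    x = rng.multivariate_normal(mean=np.zeros(3), cov=cov, size=1_000_000)
    lhs = np.var(x @ a)

    print(lhs, rhs)   # the two numbers should agree to a few decimal places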

Problem 3. Let $X \sim N(\mu, \sigma^2)$. Show that $M_X(t) = \exp(\mu t + \sigma^2 t^2 / 2)$.

Solution 3. Here we note that if $Z \sim N(0, 1)$ and $X \sim N(\mu, \sigma^2)$, then $X$ has the same distribution as $\mu + \sigma Z$. Note: here we can find an affine (i.e. scale and shift) representation of $X$ in terms of $Z$. The advantage is that the MGF of $Z$ is much easier to calculate (Lemma 0.2 in the appendix).

Proof. We proceed as follows:
$M_X(t) := E[e^{Xt}]$
$= E\left[e^{(\sigma Z + \mu)t}\right]$ (by the representation of $X$)
$= e^{\mu t} M_Z(\sigma t)$ (using Lemma 0.1)
$= e^{\mu t} e^{\sigma^2 t^2 / 2}$ (using Lemma 0.2)
$= e^{\mu t + \sigma^2 t^2 / 2}.$
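As an illustrative check (not part of the original solution, assuming NumPy and arbitrary values of $\mu$, $\sigma$, $t$), the sample mean of $e^{tX}$ over many normal draws should match $\exp(\mu t + \sigma^2 t^2 / 2)$.

    import numpy as np

    rng = np.random.default_rng(2)
    mu, sigma, t = 1.0, 0.5, 0.7   # arbitrary illustrative values
    x = rng.normal(mu, sigma, size=2_000_000)

    mc_mgf = np.mean(np.exp(t * x))                     # Monte Carlo estimate of E[e^{tX}]
    closed_form = np.exp(mu * t + sigma**2 * t**2 / 2)  # formula from Problem 3
    print(mc_mgf, closed_form)                          # both close to exp(0.76125), about 2.14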

Problem 4. Suppose that $X_1, X_2, \dots, X_n$ are i.i.d. random variables with $E[X_i] = \mu$ and $\mathrm{Var}[X_i] = \sigma^2$. Let
$Y = \frac{1}{n} \sum_{i=1}^n X_i$ and $S = \frac{1}{n-1} \sum_{i=1}^n (X_i - Y)^2.$
Show that
(a) $E[Y] = \mu$.
(b) $\mathrm{Var}[Y] = \sigma^2 / n$.
(c) $E[S] = \sigma^2$.

Solution 4. We firstly note that since the $X_i$'s are i.i.d., it follows that $E[X_i] = E[X_1] = \mu$ and $\mathrm{Var}(X_i) = \mathrm{Var}(X_1) = \sigma^2$ for all $i$.

(a) $E[Y] = \mu$.

Proof. We proceed directly:
$E[Y] = E\left[\frac{1}{n} \sum_i X_i\right]$
$= \frac{1}{n} \sum_i E[X_i]$ (by linearity of expectation)
$= \frac{1}{n} \sum_i E[X_1]$ (since the $X_i$'s are identically distributed)
$= \frac{1}{n} \cdot n\mu = \mu.$

Key point: We did not rely on independence of the $X_i$'s here to derive the expectation of $Y$. Simply using linearity of expectation and identically distributed $X_i$'s was enough. Always try and prove statements with the minimal required assumptions.

(b) $\mathrm{Var}[Y] = \sigma^2 / n$.

Proof. We have
$\mathrm{Var}(Y) = \mathrm{Var}\left(\frac{1}{n} \sum_i X_i\right)$
$= \frac{1}{n^2} \sum_i \mathrm{Var}(X_i)$ (by independence of the $X_i$'s)
$= \frac{1}{n^2} \sum_i \mathrm{Var}(X_1)$ (since the $X_i$'s are identically distributed)
$= \frac{1}{n^2} \cdot n\sigma^2 = \frac{\sigma^2}{n}.$

Key point: Here we do rely on both independence of the $X_i$'s and their identical distribution to simplify the variance of $Y$.

(c) $E[S] = \sigma^2$.

We note some preliminary useful identities to simplify the proof:
(i) $\sum_i (X_i - Y)^2 = \sum_i X_i^2 - nY^2$
(ii) $\sum_i (X_i - \mu) = \sum_i X_i - n\mu = nY - n\mu = n(Y - \mu)$

Key point: This question shows how to construct an unbiased estimator of the true variance from the sample data. These are useful identities to note down; they are used frequently in similar proofs.

Approach 1

Proof. Assuming the preliminary useful identities, we proceed as follows:
$E\left[\sum_i (X_i - Y)^2\right] = E\left[\sum_i X_i^2 - nY^2\right]$ (using identity (i))
$= \sum_i E[X_i^2] - nE[Y^2]$ (by linearity of expectation)
$= nE[X_1^2] - nE[Y^2]$ (since the $X_i$'s are i.i.d.)
$= n\left(\mathrm{Var}(X_1) + E[X_1]^2\right) - n\left(\mathrm{Var}(Y) + E[Y]^2\right)$
$= n(\sigma^2 + \mu^2) - n\left(\frac{\sigma^2}{n} + \mu^2\right)$ (from parts (a) and (b))
$= (n - 1)\sigma^2,$
so by linearity of expectation and rescaling by $\frac{1}{n-1}$: $E[S] = \sigma^2$.

Approach 2

Proof. Assuming the preliminary useful identities, we proceed with the following useful decomposition:
$\sum_i (X_i - Y)^2 = \sum_i (X_i - \mu + \mu - Y)^2$ (add and subtract $\mu$)
$= \sum_i \left[(X_i - \mu)^2 - 2(X_i - \mu)(Y - \mu) + (Y - \mu)^2\right]$ (expand the square)
$= \sum_i (X_i - \mu)^2 - 2(Y - \mu)\sum_i (X_i - \mu) + n(Y - \mu)^2$ (split into separate sums and take $(Y - \mu)$, which does not depend on $i$, outside the sum)
$= \sum_i (X_i - \mu)^2 - 2n(Y - \mu)^2 + n(Y - \mu)^2$ (using identity (ii))
$= \sum_i (X_i - \mu)^2 - n(Y - \mu)^2.$
Taking expectations,
$E\left[\sum_i (X_i - Y)^2\right] = \sum_i E[(X_i - \mu)^2] - nE[(Y - \mu)^2]$ (by linearity of expectation)
$= n\sigma^2 - n \cdot \frac{\sigma^2}{n}$ (since $E[(X_i - \mu)^2] = \sigma^2$ and $E[(Y - \mu)^2] = \mathrm{Var}(Y) = \sigma^2/n$ from part (b))
$= (n - 1)\sigma^2,$
which gives us the required result $E[S] = \sigma^2$ after rescaling by $\frac{1}{n-1}$.

Key point: Decompositions of complicated expressions are really insightful. It may lead to a longer proof in this case, but often the breakdown can be more insightful and interpretable. In coming weeks we will see the useful bias-variance decomposition, which is used to measure squared error loss in machine learning, so keep this add-and-subtract-$\mu$ approach in mind.
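Not part of the original solution: a brief simulation sketch, assuming NumPy and arbitrary choices of $n$, $\mu$, $\sigma$ and number of repetitions, that illustrates all three results by averaging the sample mean and the sample variance over many repeated samples.

    import numpy as np

    rng = np.random.default_rng(3)
    n, mu, sigma = 10, 2.0, 3.0   # arbitrary illustrative values
    reps = 200_000

    # Each row is one sample of size n; compute Y and S for every sample.
    x = rng.normal(mu, sigma, size=(reps, n))
    y = x.mean(axis=1)            # sample means Y
    s = x.var(axis=1, ddof=1)     # sample variances S with the 1/(n-1) factor

    print(y.mean(), mu)           # E[Y]   is approximately mu
    print(y.var(), sigma**2 / n)  # Var[Y] is approximately sigma^2 / n
    print(s.mean(), sigma**2)     # E[S]   is approximately sigma^2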

Problem 5. Let $(X, Y)$ have the uniform distribution on $[-1, 1] \times [-1, 1]$. Find the probability that $X + Y \ge 1/2$.

Solution 5. Claim: this is $\frac{9}{32}$.

Approach 1

Proof. This is the probability that $(X, Y)$ falls in the upper triangular region of the box $[-1, 1] \times [-1, 1]$, given that they are jointly uniform on the box. The required probability is simply the ratio of the area of the upper triangle to the area of the box. The triangle above the line $x + y = 1/2$ has legs of length $3/2$, so
$P(X + Y \ge 1/2) = P(Y \ge 1/2 - X) = \frac{\frac{1}{2}\left(\frac{3}{2}\right)^2}{4} = \frac{9/8}{4} = \frac{9}{32}.$

Key point: Try and draw a picture and exploit the geometry of the problem to find the quickest solution. In this case, because of the uniform distribution (and thus uniform volume in 3D), we are simply concerned with relative 2D areas to get our required probability.

[Figure: the square $[-1, 1] \times [-1, 1]$ with the triangular region $X + Y \ge 1/2$ above the line $x + y = 1/2$ shaded.]

Approach 2

Using integration: notice that $f_{X,Y}(x, y) = \frac{1}{4}$ on $[-1, 1] \times [-1, 1]$, so that $\iint f_{X,Y}(x, y)\,dx\,dy = 1$. Note: the integration approach is longer than the geometric approach used in Approach 1, but it is more general when the density is not uniform. Again, drawing a picture helps.
$P(X + Y \ge 1/2) = P(Y \ge 1/2 - X)$
$= \int_{-1/2}^{1} \int_{1/2 - x}^{1} \frac{1}{4}\,dy\,dx$
$= \int_{-1/2}^{1} \left(\frac{1}{8} + \frac{x}{4}\right) dx$
$= \left[\frac{x}{8} + \frac{x^2}{8}\right]_{-1/2}^{1}$
$= \frac{1}{4} + \frac{1}{32} = \frac{9}{32}.$
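A quick Monte Carlo check of the value $9/32$ (not in the original; assumes NumPy, with an arbitrary seed and sample size):

    import numpy as np

    rng = np.random.default_rng(4)
    n = 2_000_000

    # Draw (X, Y) uniformly on the square [-1, 1] x [-1, 1].
    x = rng.uniform(-1, 1, size=n)
    y = rng.uniform(-1, 1, size=n)

    print(np.mean(x + y >= 0.5), 9 / 32)   # empirical frequency vs 9/32 = 0.28125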

Problem 6. Let $(X, Y)$ have the uniform distribution on the set $\{(x, y) : x^2 + y^2 \le 1\}$. Find the joint density function of $(X, Y)$.

Solution 6. By definition this is the uniform distribution on the closed unit disk in $\mathbb{R}^2$. So we define the joint density $f_{X,Y}(x, y)$ as follows:
$f_{X,Y}(x, y) = \begin{cases} \frac{1}{\pi} & \text{if } x^2 + y^2 \le 1 \\ 0 & \text{otherwise.} \end{cases}$

Key point: Always remember to write down the density for all values $(x, y) \in \mathbb{R}^2$, i.e. include the zero density value for points outside the unit disk in this case.

Problem 7. Let $F$ be a continuous, strictly increasing CDF. Let $U$ be a random variable uniformly distributed on $[0, 1]$. Show that the random variable $Z = F^{-1}(U)$ has CDF $F$.

Remark. This result lets us draw samples from any such distribution using samples from the uniform distribution on $[0, 1]$.

Solution 7.

Proof. We denote the CDF of $Z$ by $F_Z$:
$F_Z(z) := P(Z \le z)$ (definition of the CDF of $Z$)
$= P(F^{-1}(U) \le z)$ (by definition of $Z$)
$= P(U \le F(z))$ (by invertibility of $F$)
$= F_U(F(z))$ (writing in terms of the CDF of $U$)
$= F(z)$ (using Lemma 0.3, since $U \sim \mathrm{Unif}(0, 1)$).
So the random variable $Z = F^{-1}(U)$ has CDF $F$, as required.
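To illustrate the remark (this example is not from the homework; it assumes NumPy and uses the Exponential(1) distribution, whose CDF $F(z) = 1 - e^{-z}$ gives $F^{-1}(u) = -\ln(1 - u)$):

    import numpy as np

    rng = np.random.default_rng(5)
    u = rng.uniform(0, 1, size=1_000_000)

    # Inverse-transform sampling: F^{-1}(u) = -log(1 - u) for the Exponential(1) CDF.
    z = -np.log(1 - u)

    # Compare the empirical CDF of the transformed samples with F(t) = 1 - exp(-t).
    for t in [0.5, 1.0, 2.0]:
        print(t, np.mean(z <= t), 1 - np.exp(-t))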

Problem 8. Let $X$ be uniformly distributed on $[-5, 1]$. Let $Y = X^4$. Find the CDF and PDF of $Y$.

Solution 8.
$F_Y(y) = P(Y \le y) = P(X^4 \le y)$
$= \mathbf{1}\{y \in [0, 1]\}\, P(-y^{1/4} \le X \le y^{1/4}) + \mathbf{1}\{y \in (1, 625]\}\, P(-y^{1/4} \le X \le 1)$
$= \mathbf{1}\{y \in [0, 1]\} \int_{-y^{1/4}}^{y^{1/4}} \frac{1}{6}\,dx + \mathbf{1}\{y \in (1, 625]\} \int_{-y^{1/4}}^{1} \frac{1}{6}\,dx$
$= \mathbf{1}\{y \in [0, 1]\}\, \frac{y^{1/4}}{3} + \mathbf{1}\{y \in (1, 625]\}\, \frac{1 + y^{1/4}}{6}.$

Note: Compare this to the approach taken in the similar Q8(c) of HW 1.

This implies that the CDF and PDF are:
$F_Y(y) = \begin{cases} 0 & y \le 0 \\ \frac{1}{3} y^{1/4} & 0 < y \le 1 \\ \frac{1}{6}\left(1 + y^{1/4}\right) & 1 < y \le 625 \\ 1 & y > 625 \end{cases}$
$f_Y(y) = F_Y'(y) = \begin{cases} \frac{1}{12} y^{-3/4} & 0 < y \le 1 \\ \frac{1}{24} y^{-3/4} & 1 < y \le 625 \\ 0 & \text{otherwise.} \end{cases}$
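A quick empirical check of the derived CDF (not part of the original; assumes NumPy, with arbitrary evaluation points):

    import numpy as np

    rng = np.random.default_rng(6)
    x = rng.uniform(-5, 1, size=1_000_000)
    y = x ** 4

    def cdf_y(t):
        # Piecewise CDF derived above.
        if t <= 0:
            return 0.0
        if t <= 1:
            return t ** 0.25 / 3
        if t <= 625:
            return (1 + t ** 0.25) / 6
        return 1.0

    for t in [0.5, 1.0, 10.0, 300.0]:
        print(t, np.mean(y <= t), cdf_y(t))   # empirical CDF vs derived F_Y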

Appendix: Supporting Lemmas

Here we prove some lemmas from the lecture notes that we use repeatedly in the solutions.

Lemma 0.1 (Scaling and shifting an MGF). For random variables $X$ and $Y$ such that $Y = aX + b$, we have the following relationship between the MGFs: $M_Y(t) = e^{bt} M_X(at)$, where the MGFs of $X$ and $Y$ are denoted $M_X(t)$ and $M_Y(t)$ respectively.

Proof.
$M_Y(t) := E[e^{Yt}]$
$= E\left[e^{(aX + b)t}\right]$ (by definition of $Y$)
$= E\left[e^{bt} e^{aXt}\right]$
$= e^{bt} E\left[e^{(at)X}\right]$
$= e^{bt} M_X(at),$
from which we obtain the required result.

Lemma 0.2 (MGF of the standard normal). For $Z \sim N(0, 1)$, $M_Z(t) = e^{t^2/2}$.

Proof.
$M_Z(t) := E[e^{Zt}]$
$= \int_{-\infty}^{\infty} e^{zt} f_Z(z)\,dz$
$= \int_{-\infty}^{\infty} e^{zt} \frac{1}{\sqrt{2\pi}} e^{-z^2/2}\,dz$
$= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-(z^2 - 2zt)/2}\,dz$
$= e^{t^2/2} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-(z - t)^2/2}\,dz$ (complete the square)
$= e^{t^2/2},$
since the remaining integral is the integral of the $N(t, 1)$ density over its entire support. This is the required result.

Lemma 0.3 (CDF of Unif(0,1)). Let $U \sim \mathrm{Unif}(0, 1)$ and let $F_U(u)$ denote the CDF of $U$. Then $F_U(u) = u$ for $u \in [0, 1]$.

Proof.
$F_U(u) := P(U \le u) = \int_0^u 1\,dt = [t]_0^u = u$ (by definition of $U$),
from which we obtain the required result.
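Finally, a small numerical check of Lemma 0.1 (not part of the original notes; it assumes NumPy, arbitrary values of $a$, $b$, $t$ with $at < 1$, and uses $X \sim \mathrm{Exponential}(1)$, whose MGF is $M_X(s) = 1/(1 - s)$ for $s < 1$):

    import numpy as np

    rng = np.random.default_rng(7)
    a, b, t = 0.5, 2.0, 0.6                    # arbitrary values with a * t < 1
    x = rng.exponential(1.0, size=2_000_000)   # X ~ Exponential(1)
    y = a * x + b

    lhs = np.mean(np.exp(t * y))               # Monte Carlo estimate of M_Y(t)
    rhs = np.exp(b * t) / (1 - a * t)          # e^{bt} * M_X(at) with the Exp(1) MGF
    print(lhs, rhs)                            # the two numbers should be close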