SOLUTION FOR HOMEWORK 12, STAT 4351

Welcome to your 12th homework. It looks like this is the last one! As usual, try to find mistakes and get extra points! Now let us look at your problems.

1. Problem 7.22. Here the joint pmf is

f(x_1, x_2) := f_{X_1,X_2}(x_1, x_2) = (1/36) x_1 x_2 I(x_1 ∈ {1,2,3}) I(x_2 ∈ {1,2,3}).

(a) Note that y = x_1 x_2 ∈ {1, 2, 3, 4, 6, 9}. Then it is simpler to calculate the pmf of Y directly:

f_Y(1) = f(1,1) = 1/36;  f_Y(2) = f(1,2) + f(2,1) = 1/9;  f_Y(3) = f(1,3) + f(3,1) = 1/6;
f_Y(4) = f(2,2) = 1/9;  f_Y(6) = f(2,3) + f(3,2) = 1/3;  f_Y(9) = f(3,3) = 1/4.

Please check that the sum is 1.

(b) For Y = X_1/X_2 we get the support y ∈ {1/3, 1/2, 2/3, 1, 3/2, 2, 3}. Then again it is simpler to calculate the pmf of Y directly:

f_Y(1/3) = f(1,3) = 3/36;  f_Y(1/2) = f(1,2) = 2/36;  f_Y(2/3) = f(2,3) = 6/36;
f_Y(1) = f(1,1) + f(2,2) + f(3,3) = (1 + 4 + 9)/36;
f_Y(3/2) = f(3,2) = 6/36;  f_Y(2) = f(2,1) = 2/36;  f_Y(3) = f(3,1) = 3/36.

Please check that the sum is 1.
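(The check below is my addition, not part of the original solution: a minimal Python sketch that enumerates the nine outcomes, tabulates the pmfs of Y = X_1 X_2 and Y = X_1/X_2, and confirms that each sums to 1.)

    from fractions import Fraction
    from collections import defaultdict

    # Joint pmf f(x1, x2) = x1*x2/36 on {1,2,3} x {1,2,3}.
    f = {(x1, x2): Fraction(x1 * x2, 36) for x1 in (1, 2, 3) for x2 in (1, 2, 3)}

    def pmf_of(transform):
        # Accumulate P(Y = y) for Y = transform(X1, X2).
        pmf = defaultdict(Fraction)
        for (x1, x2), p in f.items():
            pmf[transform(x1, x2)] += p
        return dict(pmf)

    prod = pmf_of(lambda a, b: a * b)            # part (a): Y = X1 * X2
    ratio = pmf_of(lambda a, b: Fraction(a, b))  # part (b): Y = X1 / X2

    print(sorted(prod.items()))   # 1/36, 1/9, 1/6, 1/9, 1/3, 1/4 at y = 1,2,3,4,6,9
    assert sum(prod.values()) == 1 and sum(ratio.values()) == 1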

2. Problem 7.28. Note that the solution follows directly from the definitions of the negative binomial and geometric distributions, but in any case we will check this via the mgf method; below I do this one more time. Let X be geometric with the probability of success denoted by θ. Then its moment generating function (I repeat its calculation; we also had it in one of the previous HWs) is

M_X(t) = E{e^{Xt}} = Σ_{x=1}^∞ θ(1−θ)^{x−1} e^{xt} = θ(1−θ)^{−1} Σ_{x=1}^∞ [(1−θ)e^t]^x.

On the right-hand side we have a geometric sum, which converges for all sufficiently small t, and we know how to calculate it, so I continue and get

M_X(t) = (θ/(1−θ)) · (1−θ)e^t/(1 − (1−θ)e^t) = θe^t/(1 − (1−θ)e^t).

On the other hand, if Y ~ NegBinom(θ, k) then

M_Y(t) = Σ_{x=k}^∞ [(x−1)!/((x−k)!(k−1)!)] θ^k (1−θ)^{x−k} e^{xt} = Σ_{x=k}^∞ [(x−1)!/((x−k)!(k−1)!)] [θ/(1−θ)]^k [(1−θ)e^t]^x.

Now I would like to use the fact that the pmf of a negative binomial RV with probability of success q = 1 − (1−θ)e^t sums to 1 (note that q is a valid probability of success for all sufficiently small t). We continue:

M_Y(t) = [θ/(1−θ)]^k q^{−k} (1−q)^k Σ_{x=k}^∞ [(x−1)!/((x−k)!(k−1)!)] q^k (1−q)^{x−k}
       = [θ/(1−θ)]^k (1−θ)^k e^{kt} (1 − (1−θ)e^t)^{−k} = θ^k e^{kt} (1 − (1−θ)e^t)^{−k}.

Now we can conclude that for k = 2 the mgf M_Y(t) is indeed equal to M_{X_1+X_2}(t) = M_{X_1}(t) M_{X_2}(t). In other words, the sum of two independent geometric RVs has the negative binomial distribution with the stop at the second success (of course, the probability of success in each underlying Bernoulli trial must be the same).

3. Problem 7.29. Let X and Y be independent standard normal. Then for Z = X + Y we have X = Z − Y, and using one of our methods we get

f_{Z,Y}(z, y) = |∂(z−y)/∂z| f_{X,Y}(z−y, y) = (2π)^{−1} e^{−[(z−y)^2 + y^2]/2}.

Now note that

(z−y)^2 + y^2 = 2y^2 − 2zy + z^2 = [2y^2 − 2zy + z^2/2] + z^2/2 = (2^{1/2}y − z/2^{1/2})^2 + z^2/2 = 2(y − z/2)^2 + z^2/2.

Using this we continue:

f_{Z,Y}(z, y) = (2π)^{−1} e^{−(y−z/2)^2} e^{−z^2/4}.

Then

f_Z(z) = e^{−z^2/4} (2π)^{−1/2} (1/2)^{1/2} ∫_{−∞}^∞ (2π(1/2))^{−1/2} e^{−(y−z/2)^2/[2(1/2)]} dy = [(2π)·2]^{−1/2} e^{−z^2/[(2)(2)]}.

This implies that Z ~ Norm(0, 2), with zero mean and variance equal to 2. Please check that the mean and the variance are reasonable.
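(A numerical sanity check of both conclusions; this sketch is my addition, and θ = 0.3, the seed, and the sample size are arbitrary choices. Note that scipy's nbinom counts failures before the k-th success rather than total trials, hence the shift.)

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n, theta = 100_000, 0.3  # arbitrary sample size and success probability

    # Problem 7.28: sum of two geometrics (number of trials) vs. NegBinom(theta, 2).
    s = rng.geometric(theta, n) + rng.geometric(theta, n)  # support {2, 3, ...}
    for y in range(2, 7):
        # scipy's nbinom counts failures before the 2nd success, hence y - 2.
        print(y, np.mean(s == y), stats.nbinom.pmf(y - 2, 2, theta))

    # Problem 7.29: sum of two standard normals should be Norm(0, 2).
    z = rng.standard_normal(n) + rng.standard_normal(n)
    print(z.mean(), z.var())  # close to 0 and 2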

4. Problem 7.31. Let f_{X,Y}(x, y) = 12xy(1−y) I(0 < x < 1) I(0 < y < 1). Set Z = XY^2 and note that Z ∈ (0, 1). Then we can write two equivalent systems (direct and inverse relations):

Z = XY^2, U = Y   ⟺   X = Z/U^2, Y = U.

Note that if z ∈ (0, 1) then z^{1/2} < u < 1. Now we calculate the Jacobian:

J = det [ ∂x/∂z  ∂x/∂u ; ∂y/∂z  ∂y/∂u ] = det [ 1/u^2  −2z/u^3 ; 0  1 ] = u^{−2}.

Then

f_{Z,U}(z, u) = u^{−2} f_{X,Y}(z/u^2, u) = u^{−2} [12(z/u^2)u(1−u)] I(z ∈ (0,1), z^{1/2} < u < 1).

Further, we calculate the marginal density

f_Z(z) = ∫ f_{Z,U}(z, u) du = 12z ∫_{z^{1/2}}^1 u^{−3}(1−u) du
       = 12z[(−1/2)(1 − z^{−1}) + (1 − z^{−1/2})] = 12z[1/2 + (1/2)z^{−1} − z^{−1/2}] = (6z + 6 − 12z^{1/2}) I(z ∈ (0,1)).

Does it integrate to 1?

5. Problem 7.36. Here the joint pdf is f_{X_1,X_2}(x_1, x_2) = 4x_1x_2 I(0 < x_1 < 1, 0 < x_2 < 1). Then for the considered transformation we have

Y_1 = X_1^2, Y_2 = X_1X_2   ⟺   X_1 = Y_1^{1/2}, X_2 = Y_2/Y_1^{1/2},

with y_1 ∈ (0, 1) and y_2 ∈ (0, y_1^{1/2}). The corresponding Jacobian is

J = det [ ∂x_1/∂y_1  ∂x_1/∂y_2 ; ∂x_2/∂y_1  ∂x_2/∂y_2 ] = det [ 1/(2y_1^{1/2})  0 ; −y_2/(2y_1^{3/2})  1/y_1^{1/2} ] = 1/(2y_1).

This yields

f_{Y_1,Y_2}(y_1, y_2) = (1/(2y_1)) · 4y_1^{1/2} (y_2/y_1^{1/2}) I(y_1 ∈ (0,1)) I(y_2 ∈ (0, y_1^{1/2}))
                      = (2y_2/y_1) I(y_1 ∈ (0,1)) I(y_2 ∈ (0, y_1^{1/2})).

Let us check this answer:

∫_0^1 [∫_0^{y_1^{1/2}} f_{Y_1,Y_2}(y_1, y_2) dy_2] dy_1 = ∫_0^1 (2/y_1)(y_1/2) dy_1 = 1.

It looks OK.
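(A numerical check of both transformations; this sketch is my addition. It uses the observation that 12xy(1−y) factors into [2x]·[6y(1−y)], so in Problem 7.31 X ~ Beta(2,1) and Y ~ Beta(2,2) are independent, and in Problem 7.36 the joint pdf 4x_1x_2 makes X_1, X_2 iid Beta(2,1). The cutoff 0.25, seed, and sample size are arbitrary.)

    import numpy as np
    from scipy import integrate, stats

    rng = np.random.default_rng(1)
    n = 200_000

    # Problem 7.31: does f_Z(z) = 6z + 6 - 12*sqrt(z) integrate to 1 on (0, 1)?
    f_Z = lambda z: 6 * z + 6 - 12 * np.sqrt(z)
    print(integrate.quad(f_Z, 0, 1)[0])  # ~1.0

    # Monte Carlo cross-check: X ~ Beta(2, 1), Y ~ Beta(2, 2), Z = X * Y^2.
    z = rng.beta(2, 1, n) * rng.beta(2, 2, n) ** 2
    print(np.mean(z < 0.25), integrate.quad(f_Z, 0, 0.25)[0])  # should agree

    # Problem 7.36: integrating f_{Y1,Y2} = 2*y2/y1 over y2 in (0, sqrt(y1)) gives 1,
    # i.e. Y1 = X1^2 should be uniform on (0, 1); test this on simulated data.
    y1 = rng.beta(2, 1, n) ** 2
    print(stats.kstest(y1, "uniform").pvalue)  # large p-value expected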

6. Problem 7.37. Here

f_{X,Y}(x, y) = 24xy I(0 < x < 1, 0 < y < 1, x + y < 1).

For the studied transformation

Z = X + Y, W = X   ⟺   X = W, Y = Z − W,

with w ∈ (0, 1) and w < z < 1. The corresponding Jacobian is

J = det [ ∂x/∂z  ∂x/∂w ; ∂y/∂z  ∂y/∂w ] = det [ 0  1 ; 1  −1 ] = −1.

Then we use our rule to find

f_{Z,W}(z, w) = |J| f_{X,Y}(w, z−w) I(w ∈ (0,1), z ∈ (w,1)) = 24w(z−w) I(w ∈ (0,1), z ∈ (w,1)).

Let us check the answer:

∫_0^1 24w[∫_w^1 (z−w) dz] dw = 24 ∫_0^1 w[(1/2)(1−w^2) − w(1−w)] dw = 24 ∫_0^1 [(1/2)w + (1/2)w^3 − w^2] dw
= 24[(1/4) + (1/8) − (1/3)] = 24(6 + 3 − 8)/24 = 1.

The answer looks OK.

7. Problem 7.41. If X ~ Binom(θ, n) then its mgf is M_X(t) = [1 + θ(e^t − 1)]^n. Then for two independent X_1 ~ Binom(θ, n_1) and X_2 ~ Binom(θ, n_2) we have

M_{X_1+X_2}(t) = M_{X_1}(t) M_{X_2}(t) = (1 + θ(e^t − 1))^{n_1+n_2},

which is the mgf of Binom(θ, n_1 + n_2).

8. Problem 7.43. Let X ~ Gamma(α, β). Then M_X(t) = (1 − βt)^{−α}. Suppose that X_1, ..., X_n are iid Gamma RVs with parameters (α, β). Then

M_{X_1+X_2+···+X_n}(t) = (1 − βt)^{−αn}.

This yields that the sum of n iid Gamma RVs has the Gamma distribution with parameters (nα, β).

9. Problem 7.61. Here X ~ Poisson(λ = 3.3), where X is the number of complaints per day. Then:

(a) P(X = 2) = e^{−λ}λ^2/2!.

(b) Here the RV of interest is Y = X_1 + X_2, with X_1 and X_2 being the numbers of complaints during the first and second days. Then we know that X_1 + X_2 ~ Poisson(2λ). You may quickly check the latter via

M_X(t) = e^{λ(e^t−1)},  M_{X_1+X_2}(t) = M_{X_1}(t) M_{X_2}(t) = e^{2λ(e^t−1)}.

Using this fact we get P(X_1 + X_2 = 5) = e^{−2λ}(2λ)^5/5!.

(c) Here X_1 + X_2 + X_3 ~ Poisson(3λ), so

P(2 ≤ X_1 + X_2 + X_3 ≤ 4) = Σ_{k=2}^4 e^{−3λ}(3λ)^k/k!.
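(For concreteness, the Poisson answers can be evaluated numerically, and the Problem 7.41 conclusion can be verified by convolving two binomial pmfs; this sketch is my addition, and the binomial parameters n_1 = 5, n_2 = 7, θ = 0.4 are arbitrary.)

    import numpy as np
    from scipy import stats

    lam = 3.3  # complaints per day, from the problem

    # Problem 7.61(a): P(X = 2).
    print(stats.poisson.pmf(2, lam))
    # Problem 7.61(b): P(X1 + X2 = 5), with X1 + X2 ~ Poisson(2*lam).
    print(stats.poisson.pmf(5, 2 * lam))
    # Problem 7.61(c): P(2 <= X1 + X2 + X3 <= 4), with the 3-day total ~ Poisson(3*lam).
    print(stats.poisson.cdf(4, 3 * lam) - stats.poisson.cdf(1, 3 * lam))

    # Problem 7.41: convolving Binom(5, 0.4) and Binom(7, 0.4) pmfs gives Binom(12, 0.4).
    p1 = stats.binom.pmf(np.arange(6), 5, 0.4)
    p2 = stats.binom.pmf(np.arange(8), 7, 0.4)
    print(np.allclose(np.convolve(p1, p2), stats.binom.pmf(np.arange(13), 12, 0.4)))  # True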

10. Problem 7.63(a). It is known that if X_i ~ Expon(θ) then Y_k = Σ_{i=1}^k X_i ~ Gamma(α = k, β = θ); see p. 257, Ex. 7.6. (It is easy to check this via the mgf approach.) Then

P(X_1 + X_2 < 8) = P(Y_2 < 8),

where Y_2 ~ Gamma(α = 2, β = θ = 5). Then

P(Y_2 < 8) = ∫_0^8 x e^{−x/5}/(Γ(2) 5^2) dx = (1/5^2)[−5x e^{−x/5}|_0^8 + 5 ∫_0^8 e^{−x/5} dx]
           = (1/25)[−40e^{−8/5} + 25 − 25e^{−8/5}] = (1/25)[25 − 65e^{−8/5}] = 1 − (65/25)e^{−8/5}.

11. Problem 7.69. Let X ~ Normal(µ, σ^2). Then Y = e^X has support (0, ∞), and X = ln(Y) is the inverse function. Then we apply our rule and get for the transformation at hand:

f_Y(y) = |dx(y)/dy| f_X(ln(y)) = y^{−1}(2πσ^2)^{−1/2} e^{−(ln(y)−µ)^2/(2σ^2)} I(y > 0).

This is the famous log-normal density, which plays an important role in many branches of statistics, in particular in regression.
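(Both closed-form answers match scipy's distributions; this final sketch is my addition, and µ = 0, σ = 1 in the log-normal check are arbitrary.)

    import numpy as np
    from scipy import stats

    # Problem 7.63(a): closed form vs. the Gamma(alpha=2, beta=5) cdf at 8.
    print(1 - (65 / 25) * np.exp(-8 / 5))    # ~0.4751
    print(stats.gamma.cdf(8, a=2, scale=5))  # same value

    # Problem 7.69: derived log-normal density vs. scipy's lognorm.
    mu, sigma = 0.0, 1.0  # arbitrary parameters for the check
    y = np.linspace(0.1, 5.0, 50)
    derived = np.exp(-(np.log(y) - mu) ** 2 / (2 * sigma ** 2)) / (y * np.sqrt(2 * np.pi * sigma ** 2))
    print(np.allclose(derived, stats.lognorm.pdf(y, s=sigma, scale=np.exp(mu))))  # True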