Math 180B Homework 9 Solutions

Problem 1 (Pinsky & Karlin, Exercise 5.1.3). Let $X$ and $Y$ be independent Poisson distributed random variables with parameters $\alpha$ and $\beta$, respectively. Determine the conditional distribution of $X$, given that $N = X + Y = n$.

Solution. Note that $N = X + Y$ has the Poisson distribution with parameter $\alpha + \beta$, and that $N \ge X$, so if $k > n$, then $P(X = k \mid N = n) = 0$. If $0 \le k \le n$, then by the independence of $X$ and $Y$ we have
\[
P(X = k, N = n) = P(X = k, Y = n-k) = P(X = k)\,P(Y = n-k) = \frac{e^{-\alpha}\alpha^k}{k!}\cdot\frac{e^{-\beta}\beta^{n-k}}{(n-k)!},
\]
so
\[
P(X = k \mid N = n) = \frac{P(X = k,\, X+Y = n)}{P(N = n)} = \frac{e^{-\alpha}\alpha^k}{k!}\cdot\frac{e^{-\beta}\beta^{n-k}}{(n-k)!}\cdot\frac{n!}{e^{-(\alpha+\beta)}(\alpha+\beta)^n} = \binom{n}{k}\frac{\alpha^k\beta^{n-k}}{(\alpha+\beta)^n} = \binom{n}{k}\left(\frac{\alpha}{\alpha+\beta}\right)^{k}\left(1-\frac{\alpha}{\alpha+\beta}\right)^{n-k}.
\]
That is, conditional on $N = n$, $X$ has the binomial distribution with parameters $n$ and $\alpha/(\alpha+\beta)$.
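
Remark. As an informal check (not part of the assigned solution), the binomial conditional law can be verified by simulation; the parameter values $\alpha = 3$, $\beta = 2$, $n = 5$ below are arbitrary illustrative choices.

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(0)
alpha, beta, n = 3.0, 2.0, 5          # arbitrary illustrative parameters

X = rng.poisson(alpha, size=1_000_000)
Y = rng.poisson(beta, size=1_000_000)
cond = X[X + Y == n]                  # samples of X given X + Y = n

for k in range(n + 1):
    emp = np.mean(cond == k)
    exact = binom.pmf(k, n, alpha / (alpha + beta))
    print(f"k={k}: empirical {emp:.4f}, Binomial(n, a/(a+b)) {exact:.4f}")
```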

Problem 2 (Pinsky & Karlin, Exercise 5.1.5). Suppose that a random variable $X$ is distributed according to a Poisson distribution with parameter $\lambda$. The parameter $\lambda$ is itself a random variable, exponentially distributed with density $f(x) = \theta e^{-\theta x}$ for $x \ge 0$. Find the probability mass function for $X$.

Solution. Stated precisely, the assumption of the problem is that if $x \ge 0$ and $k = 0, 1, 2, \ldots$, then
\[
P(X = k \mid \lambda = x) = \frac{e^{-x}x^k}{k!}.
\]
By the law of total probability, we have
\[
P(X = k) = \int_0^\infty P(X = k \mid \lambda = x)\,f(x)\,dx = \int_0^\infty \frac{e^{-x}x^k}{k!}\,\theta e^{-\theta x}\,dx = \frac{\theta}{k!}\int_0^\infty x^k e^{-(\theta+1)x}\,dx.
\]
Making the change of variables $u = (\theta+1)x$, simplifying, and using the fact that $\int_0^\infty u^k e^{-u}\,du = \Gamma(k+1) = k!$, we get
\[
P(X = k) = \frac{\theta}{(\theta+1)^{k+1}}\cdot\frac{1}{k!}\int_0^\infty u^k e^{-u}\,du = \frac{\theta}{(\theta+1)^{k+1}} = \frac{\theta}{\theta+1}\left(1-\frac{\theta}{\theta+1}\right)^{k}.
\]
Thus, $X$ has the geometric distribution with parameter $\theta/(\theta+1)$.

Problem 3 (Pinsky & Karlin, Exercise 5.1.9). Let $\{X(t) : t \ge 0\}$ be a Poisson process having rate parameter $\lambda = 2$. Determine the following expectations:

(a) $E[X(2)]$.

(b) $E[(X(1))^2]$.

(c) $E[X(1)X(2)]$.

Solution. (a) Since $X(2) \sim \mathrm{Poisson}(4)$, we have $E[X(2)] = 4$.

(b) Since $X(1) \sim \mathrm{Poisson}(2)$, we have $E[X(1)] = 2$ and $\mathrm{Var}(X(1)) = 2$, so
\[
E[(X(1))^2] = \mathrm{Var}(X(1)) + (E[X(1)])^2 = 2 + 2^2 = 6.
\]
(c) Since a Poisson process has independent increments, the random variables $X(1)$ and $X(2) - X(1)$ are independent. Moreover, $X(2) - X(1) \sim \mathrm{Poisson}(2)$. Therefore, we have
\[
E[X(1)X(2)] = E\big[(X(1))^2 + X(1)\big(X(2) - X(1)\big)\big] = E[(X(1))^2] + E[X(1)]\,E[X(2) - X(1)] = 6 + 2\cdot 2 = 10.
\]
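
Remark. As an informal check of Problem 2 (not part of the assigned solution), the exponential mixture of Poissons can be simulated and compared with the geometric pmf derived above; $\theta = 1.5$ is an arbitrary illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 1.5                                       # arbitrary illustrative rate

lam = rng.exponential(1 / theta, size=1_000_000)  # numpy uses scale = mean = 1/theta
X = rng.poisson(lam)                              # Poisson with random parameter lam

p = theta / (theta + 1)
for k in range(5):
    print(f"k={k}: empirical {np.mean(X == k):.4f}, geometric {p * (1 - p)**k:.4f}")
```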

Problem 4 (Pinsky & Karlin, Problem 5.1.2). Suppose that minor defects are distributed over the length of a cable as a Poisson process with rate $\alpha$, and that, independently, major defects are distributed over the cable according to a Poisson process of rate $\beta$. Let $X(t)$ be the number of defects, either major or minor, in the cable up to length $t$. Argue that $X(t)$ must be a Poisson process of rate $\alpha + \beta$.

Solution. Let $Y(t)$ be the number of minor defects over the length of the cable up to time $t$, and let $Z(t)$ be the number of major defects over the length of the cable up to time $t$. Then $\{Y(t) : t \ge 0\}$ and $\{Z(t) : t \ge 0\}$ are independent Poisson processes with rates $\alpha$ and $\beta$, respectively, and $X(t) = Y(t) + Z(t)$ for all $t \ge 0$.

We now verify that $\{X(t) : t \ge 0\}$ is a Poisson process of rate $\alpha + \beta$. First, we have $X(0) = Y(0) + Z(0) = 0$. Next, if $s \ge 0$ and $t > 0$, then $Y(s+t) - Y(s) \sim \mathrm{Poisson}(\alpha t)$ and $Z(s+t) - Z(s) \sim \mathrm{Poisson}(\beta t)$, so, since $Y(s+t) - Y(s)$ and $Z(s+t) - Z(s)$ are independent, and a sum of independent Poisson random variables is again Poisson with the summed parameter, it follows that
\[
X(s+t) - X(s) = \big(Y(s+t) - Y(s)\big) + \big(Z(s+t) - Z(s)\big) \sim \mathrm{Poisson}((\alpha+\beta)t).
\]
Finally, suppose we have time points $t_0 < t_1 < \cdots < t_n$. Then the random variables
\[
Y(t_1) - Y(t_0),\ \ldots,\ Y(t_n) - Y(t_{n-1}),\ Z(t_1) - Z(t_0),\ \ldots,\ Z(t_n) - Z(t_{n-1})
\]
are independent, so since $X(t_i) - X(t_{i-1}) = \big(Y(t_i) - Y(t_{i-1})\big) + \big(Z(t_i) - Z(t_{i-1})\big)$ for all $i = 1, \ldots, n$, it follows that the random variables $X(t_1) - X(t_0), \ldots, X(t_n) - X(t_{n-1})$ are independent. We conclude that $\{X(t) : t \ge 0\}$ is a Poisson process of rate $\alpha + \beta$.
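
Remark. As an informal illustration (not part of the assigned solution), one can simulate the two processes by their exponential inter-arrival gaps, merge the arrival times, and check that the merged count on $[0, t]$ has the Poisson mean and variance $(\alpha+\beta)t$. All parameter values below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
alpha, beta, horizon = 1.0, 2.5, 4.0      # arbitrary illustrative values

def arrival_times(rate, horizon, rng):
    """Arrival times of a rate-`rate` Poisson process on [0, horizon]."""
    times, t = [], 0.0
    while True:
        t += rng.exponential(1 / rate)    # exponential inter-arrival gap
        if t > horizon:
            return np.array(times)
        times.append(t)

n_trials = 20_000
counts = np.empty(n_trials)
for i in range(n_trials):
    merged = np.concatenate([arrival_times(alpha, horizon, rng),
                             arrival_times(beta, horizon, rng)])
    counts[i] = merged.size               # X(horizon) for the merged process

print("empirical mean:", counts.mean(), " empirical var:", counts.var())
print("Poisson mean = var = (alpha+beta)*horizon =", (alpha + beta) * horizon)
```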

Problem 5 (Pinsky & Karlin, Problem 5.1.3). The generating function of a probability mass function $p_k = P(X = k)$, for $k = 0, 1, \ldots$, is defined by
\[
g_X(s) = E[s^X] = \sum_{k=0}^\infty p_k s^k \quad \text{for } |s| < 1.
\]
Show that the generating function for a Poisson random variable $X$ with mean $\mu$ is given by $g_X(s) = e^{-\mu(1-s)}$.

Solution. If $X \sim \mathrm{Poisson}(\mu)$, then we just compute its generating function:
\[
g_X(s) = \sum_{k=0}^\infty P(X = k)\,s^k = \sum_{k=0}^\infty \frac{e^{-\mu}\mu^k}{k!}\,s^k = e^{-\mu}\sum_{k=0}^\infty \frac{(\mu s)^k}{k!} = e^{-\mu}e^{\mu s} = e^{-\mu(1-s)}.
\]
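
Remark. As an informal check (not part of the assigned solution), $E[s^X]$ can be estimated empirically and compared with $e^{-\mu(1-s)}$; $\mu = 2$ and the values of $s$ are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(3)
mu = 2.0                                   # arbitrary illustrative mean
X = rng.poisson(mu, size=1_000_000)

for s in (0.2, 0.5, 0.9):
    print(f"s={s}: E[s^X] ~ {np.mean(s ** X):.4f}, "
          f"exp(-mu*(1-s)) = {np.exp(-mu * (1 - s)):.4f}")
```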

Problem 6 (Pinsky & Karlin, Problem 5.1.9). Arrivals of passengers at a bus stop form a Poisson process $X(t)$ with rate $\lambda = 2$ per unit time. Assume that a bus departed at time $t = 0$ leaving no customers behind. Let $T$ denote the arrival time of the next bus. Then, the number of passengers present when it arrives is $X(T)$. Suppose that the bus arrival time $T$ is independent of the Poisson process and that $T$ has the uniform probability density function
\[
f_T(t) = \begin{cases} 1 & \text{for } 0 \le t \le 1, \\ 0 & \text{otherwise.} \end{cases}
\]
(a) Determine $E[X(T) \mid T = t]$ and $E[(X(T))^2 \mid T = t]$.

(b) Determine the mean $E[X(T)]$ and variance $\mathrm{Var}[X(T)]$.

Solution. (a) By the independence of $T$ and $\{X(t) : t \ge 0\}$, we have
\[
E[X(T) \mid T = t] = E[X(t) \mid T = t] = E[X(t)] = 2t
\]
and, since $X(t) \sim \mathrm{Poisson}(2t)$ gives $E[(X(t))^2] = \mathrm{Var}(X(t)) + (E[X(t)])^2 = 2t + 4t^2$,
\[
E[(X(T))^2 \mid T = t] = E[(X(t))^2 \mid T = t] = E[(X(t))^2] = 4t^2 + 2t.
\]
(b) By the law of total probability, we have
\[
E[X(T)] = \int_0^\infty E[X(T) \mid T = t]\,f_T(t)\,dt = \int_0^1 2t\,dt = 1.
\]
Similarly,
\[
E[(X(T))^2] = \int_0^1 (4t^2 + 2t)\,dt = \frac{7}{3},
\]
so
\[
\mathrm{Var}(X(T)) = E[(X(T))^2] - (E[X(T)])^2 = \frac{7}{3} - 1 = \frac{4}{3}.
\]

Problem 7 (Pinsky & Karlin, Exercise 5.2.4). Suppose that a book of 600 pages contains a total of 240 typographical errors. Develop a Poisson approximation for the probability that three particular successive pages are error-free.

Solution. The typographical errors in the book can be modeled as the arrivals of a Poisson process with rate $240/600 = 0.4$ errors per page. If $N$ is the number of typographical errors in three particular successive pages, then $N$ approximately has the Poisson distribution with rate $3 \cdot 240/600 = 1.2$, so $P(N = 0) \approx e^{-1.2} \approx 0.301$.
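
Remark. As an informal check of Problem 6 (not part of the assigned solution), sampling $T$ uniformly and then $X(T) \sim \mathrm{Poisson}(2T)$ should reproduce the mean $1$ and variance $4/3$ computed above.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000

T = rng.uniform(0.0, 1.0, n)      # bus arrival time, Uniform[0, 1]
X = rng.poisson(2.0 * T)          # passengers present: Poisson(2T) given T

print("E[X(T)]   ~", X.mean(), " (exact: 1)")
print("Var(X(T)) ~", X.var(), " (exact: 4/3 =", 4 / 3, ")")
```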

Problem 8 (Pinsky & Karlin, Problem 5.2.4). Suppose that $N$ points are uniformly distributed over the interval $[0, N)$. Determine the probability distribution for the number of points in the interval $[0, 1)$ as $N \to \infty$.

Solution. Let $N$ be a positive integer, let $U_1, \ldots, U_N \sim \mathrm{Uniform}([0, N))$ be independent random variables, and for $i = 1, \ldots, N$, let $X_i$ be the indicator random variable of the event $\{U_i \in [0, 1)\}$. That is,
\[
X_i = \mathbf{1}_{\{U_i \in [0,1)\}} = \begin{cases} 1 & \text{if } U_i \in [0, 1), \\ 0 & \text{otherwise.} \end{cases}
\]
Since $P(U_i \in [0, 1)) = 1/N$, we have $X_i \sim \mathrm{Bernoulli}(1/N)$ for each $i$, and $X_1, \ldots, X_N$ are independent. If $X$ denotes the number of the $U_i$'s in the interval $[0, 1)$, then
\[
X = X_1 + \cdots + X_N \sim \mathrm{Binomial}(N, 1/N).
\]
By the Poisson approximation to the binomial distribution, it follows that as $N \to \infty$, $X$ approximately has the Poisson distribution with rate $N \cdot 1/N = 1$.

Problem 9 (Pinsky & Karlin, Problem 5.2.5). Suppose that $N$ points are uniformly distributed over the surface of a circular disk of radius $r$. Determine the probability distribution for the number of points within a distance one of the origin as $N \to \infty$, $r \to \infty$, $N/(\pi r^2) \to \lambda$.

Solution. To simplify notation, let $D_R$ denote the surface of the disk of radius $R$ centered at the origin in $\mathbb{R}^2$. Let $N$ be a positive integer, and let $r \ge 1$. Let $U_1, \ldots, U_N$ be independent random variables distributed uniformly over $D_r$. That is, each $U_i$ has density
\[
f(x, y) = \begin{cases} \dfrac{1}{\pi r^2} & \text{if } x^2 + y^2 \le r^2, \\ 0 & \text{otherwise.} \end{cases}
\]
For $i = 1, \ldots, N$, let $X_i$ be the indicator random variable of the event $\{U_i \in D_1\}$. That is,
\[
X_i = \mathbf{1}_{\{U_i \in D_1\}} = \begin{cases} 1 & \text{if } U_i \in D_1, \\ 0 & \text{otherwise.} \end{cases}
\]
Since
\[
P(U_i \in D_1) = \iint_{D_1} \frac{1}{\pi r^2}\,dx\,dy = \frac{1}{r^2},
\]
we have $X_i \sim \mathrm{Bernoulli}(1/r^2)$ for each $i$, and $X_1, \ldots, X_N$ are independent. If $X$ denotes the number of the $U_i$'s within distance one of the origin, then
\[
X = X_1 + \cdots + X_N \sim \mathrm{Binomial}(N, 1/r^2).
\]
By the Poisson approximation to the binomial distribution, it follows that as $N \to \infty$ and $r \to \infty$ with $N/(\pi r^2) \to \lambda$, $X$ approximately has the Poisson distribution with rate $N \cdot 1/r^2 \to \pi\lambda$.
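
Remark. As an informal check of Problem 9 (not part of the assigned solution), only the radial distance of a uniform point in $D_r$ matters, and it is distributed as $r\sqrt{U}$ with $U \sim \mathrm{Uniform}(0,1)$. The values $\lambda = 0.5$ and $r = 20$ below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(5)
lam, r = 0.5, 20.0                        # arbitrary illustrative values
N = int(lam * np.pi * r**2)               # chosen so N / (pi r^2) ~ lam

n_trials = 20_000
counts = np.empty(n_trials, dtype=int)
for i in range(n_trials):
    dist = r * np.sqrt(rng.uniform(0, 1, N))   # radial distances of N uniform points
    counts[i] = np.count_nonzero(dist <= 1.0)  # points within distance 1 of origin

print("empirical mean:", counts.mean(), " empirical var:", counts.var())
print("Poisson rate pi*lam =", np.pi * lam)
```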

Problem 10 (Pinsky & Karlin, Problem 5.2.11). Let $X$ and $Y$ be jointly distributed random variables and $B$ an arbitrary set. Fill in the details that justify the inequality
\[
|P(X \in B) - P(Y \in B)| \le P(X \neq Y).
\]
Solution. We have
\[
\{X \in B\} = \{X \in B,\, Y \in B\} \cup \{X \in B,\, Y \notin B\} \subseteq \{Y \in B\} \cup \{X \neq Y\},
\]
and hence
\[
P(X \in B) \le P\big(\{Y \in B\} \cup \{X \neq Y\}\big) \le P(Y \in B) + P(X \neq Y).
\]
Therefore,
\[
P(X \in B) - P(Y \in B) \le P(X \neq Y).
\]
Switching the roles of $X$ and $Y$ in the argument above, we also have $P(Y \in B) - P(X \in B) \le P(X \neq Y)$, and hence
\[
|P(X \in B) - P(Y \in B)| \le P(X \neq Y).
\]
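
Remark. As an informal illustration (not part of the assigned solution), the inequality can be observed on a coupled pair in which $X$ differs from $Y$ only on a rare perturbation event; the distribution, perturbation probability, and set $B$ below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 1_000_000

# Coupled pair: X equals Y except on a rare perturbation, so P(X != Y) is small.
Y = rng.poisson(3.0, n)
flip = rng.uniform(0, 1, n) < 0.05
X = np.where(flip, Y + 1, Y)

B = [0, 1, 2]                              # an arbitrary event
pX = np.isin(X, B).mean()
pY = np.isin(Y, B).mean()
print("|P(X in B) - P(Y in B)| =", abs(pX - pY))
print("P(X != Y)               =", np.mean(X != Y))
```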