Stat 512 Homework key 2


October 4, 2015

REGULAR PROBLEMS

1. Suppose continuous random variable X belongs to the family of all distributions having a linear probability density function (pdf) over the interval [0, 1] and zero elsewhere. Let θ be the slope of the pdf.

(a) Derive the formula for the pdf f(x) of this family, making clear the range of permissible values of θ.
(b) Derive the formula for the cumulative distribution function (cdf) F(x) for this family.
(c) Derive an expression for the median value of X as a function of θ.

Ans: (a) Suppose

    f(x) = θx + b for x ∈ [0, 1], and 0 otherwise,

where b ∈ R and b may depend on θ. Integrating, we get that on the interval [0, 1],

    F(x) = θx^2/2 + bx + c

for some c ∈ R. Since F(0) = 0, c = 0. Therefore, on this interval, the cdf is

    F(x) = θx^2/2 + bx.    (1)

Now F(1) = 1 gives θ/2 + b = 1, so

    b = 1 - θ/2.    (2)

Next we need to find the range of θ. The only condition that needs to be satisfied is f(x) ≥ 0 on [0, 1].

If θ ≥ 0, then min over [0,1] of f(x) = min over [0,1] of (θx + b) = b = 1 - θ/2 ≥ 0, which requires θ ≤ 2.

If θ ≤ 0, then min over [0,1] of f(x) = min over [0,1] of (θx + b) = θ + b = θ/2 + 1 ≥ 0, which requires θ ≥ -2.

Hence, from (1) and (2), the formula for the pdf of the family is

    f(x) = θx + 1 - θ/2 for x ∈ [0, 1], and 0 otherwise,    (3)

where θ ∈ [-2, 2].
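The pdf in (3) can be sanity-checked numerically. The following short Python sketch (not part of the original key) confirms that for several slopes θ in [-2, 2], f is nonnegative on [0, 1] and integrates to 1:

```python
# Numeric sanity check (not part of the original key): for slopes theta
# in [-2, 2], the pdf f(x) = theta*x + 1 - theta/2 from (3) should be
# nonnegative on [0, 1] and integrate to 1.

def f(x, theta):
    return theta * x + 1 - theta / 2

def integrate(g, a, b, n=10_000):
    # midpoint rule; exact (up to rounding) for linear integrands
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

for theta in (-2.0, -1.0, 0.0, 1.0, 2.0):
    total = integrate(lambda x: f(x, theta), 0.0, 1.0)
    assert abs(total - 1.0) < 1e-9, (theta, total)
    assert min(f(i / 1000, theta) for i in range(1001)) >= -1e-12
print("pdf checks passed")
```

At the boundary slopes θ = ±2 the pdf touches zero at one endpoint, which is exactly where the constraint f(x) ≥ 0 becomes tight.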

(b) From (2) we get that

    F(x) = 0 for x ≤ 0,
    F(x) = θx^2/2 + (1 - θ/2)x for x ∈ [0, 1],    (4)
    F(x) = 1 for x ≥ 1,

where θ ∈ [-2, 2].

(c) Let M_X denote the median of X. We know that M_X ∈ [0, 1] since X takes values only in this interval. Then

    F(M_X) = 1/2  ⟺  θ M_X^2 + (2 - θ) M_X - 1 = 0.    (5)

By the quadratic formula,

    M_X = (θ - 2 ± √(4 + θ^2)) / (2θ).

Now we need to determine which root is M_X.

Case 1: θ > 0. Since -4θ < 0, we have 4 + θ^2 > (θ - 2)^2, so √(4 + θ^2) > 2 - θ. Therefore θ - 2 + √(4 + θ^2) > 0, and only the larger root is nonnegative; the other root, (θ - 2 - √(4 + θ^2)) / (2θ), is negative. Therefore

    M_X = (θ - 2 + √(4 + θ^2)) / (2θ).

Case 2: θ < 0. Since -4θ > 0, we have 4 + θ^2 < (θ - 2)^2, which gives θ - 2 + √(4 + θ^2) < 0. Both roots are positive, since for each of them the numerator and the denominator are both negative. Now, since F(x) is strictly increasing on [0, 1], it attains the value 1/2 at exactly one point of the interval, so only one root of the quadratic in (5) lies in [0, 1]; the other root is greater than 1. Hence M_X is the smaller root, which (because the denominator 2θ is negative) is again

    M_X = (θ - 2 + √(4 + θ^2)) / (2θ).

Hence, in general, for any θ ∈ [-2, 2] (taking the limiting value 1/2 at θ = 0),

    M_X = (θ - 2 + √(4 + θ^2)) / (2θ).    (6)

2. Now consider that continuous random variable X belongs to a two-parameter family of all distributions having a linear probability density function (pdf) with slope θ over the open interval (0, η).

(a) Derive the formula for the pdf f(x) of this family, making clear the range of permissible values of θ and η.
(b) Derive the formula for the cumulative distribution function (cdf) F(x) for this family.

(c) Derive an expression for the median value of X as a function of θ and η.

Ans: (a) Consider Y to be the random variable described in question 1, with pdf

    f_Y(y) = θ_Y y + 1 - θ_Y/2 for y ∈ [0, 1], and 0 otherwise.    (7)

Here we denoted the parameter θ_Y instead of θ to avoid confusion with the slope parameter θ of X. Now notice that Y = X/η, or X = ηY. This follows if we observe that the family of X can be obtained from that of Y by a simple transformation, i.e., multiplication by η. Hence, for x ∈ (0, η),

    f(x) = (1/η) f_Y(x/η) = (θ_Y / η^2) x + (1/η)(1 - θ_Y/2).

Therefore θ = θ_Y / η^2. Since the admissible range of θ_Y is [-2, 2], the admissible range of θ is [-2/η^2, 2/η^2]. Hence, for θ ∈ [-2/η^2, 2/η^2] and η > 0,

    f(x) = θx + 1/η - θη/2 for x ∈ (0, η), and 0 otherwise.    (8)

(b) Since F(0) = 0, the integration constant will be 0 (as in 1(b)). Therefore,

    F(x) = 0 for x ≤ 0,
    F(x) = θx^2/2 + (1/η - θη/2)x for x ∈ (0, η),    (9)
    F(x) = 1 for x ≥ η.

(c) Let M_Y be the median of Y. From 1(c) we get that

    M_Y = (θ_Y - 2 + √(4 + θ_Y^2)) / (2θ_Y).

Since X ηy, the median of X, M X ηm Y Hence, replacing θ Y by θη M X η( θ Y + 4 + θ Y θ Y ) θη + 4 + θ η 4 θη 3 Suppose random variable X has a negative binomial distribution for some p [0, 1] and some integer r 1 with {( x 1 P r[x x] r 1) p r (1 p) x r x {r, r + 1, } 0 else Prove that the above formula is a probability mass function Ans: P r[x x] xr ( ) x 1 p r (1 p) x r r 1 xr ( ) k + r 1 p r (1 p) k (replacing k x r) r 1 k0 ( ) ( ) k k + r 1 1 p p r+k r 1 p k0 ( )( ) (r+k) ( ) k k + r 1 1 p 1 ( 1) k r 1 p p k0 ( 1 p + p 1 ) r ( Ussing the negative binomial series since 1 p p p 1 Here we have used the negative binomial expansion for x < a, (x + a) n ( ) n + k 1 ( 1) k x k a (n+k) k k0 < 1 p ) 4

Alternative method:

    Σ_{x=r}^∞ Pr[X = x]
      = Σ_{x=r}^∞ C(x-1, r-1) p^r (1-p)^(x-r)
      = Σ_{k=0}^∞ C(k+r-1, r-1) p^r (1-p)^k        (replacing k = x - r)
      = (1-p)^(-r) Σ_{k=0}^∞ C(k+r-1, r-1) (1/p)^k (1/(p(1-p)))^(-(r+k))
      = (1-p)^(-r) Σ_{k=0}^∞ C(k+r-1, r-1) (-1)^k (-1/p)^k (1/(p(1-p)))^(-(r+k))
      = (1-p)^(-r) (-1/p + 1/(p(1-p)))^(-r)        (using the negative binomial series, since 1/p < 1/(p(1-p)))
      = (1-p)^(-r) (1/(1-p))^(-r)
      = 1.

Note: The condition |x| < |a| is important. If you have not shown that this condition holds in your work, points will be deducted. Moreover, if the terms you choose as x and a do not satisfy |x| < |a| at all, points will be deducted.

4. Suppose discrete random variable X belongs to the family of all distributions having probability mass function (pmf) of the form

    p(x) = Pr[X = x] = c(θ)/θ^x for x ∈ {0, 1, 2, …}, and 0 otherwise.

State clearly the range of permissible values of θ, explicitly providing the form of c(θ). What is the name of this parametric family?

Ans: Clearly c(θ) ≥ 0, and we need

    Σ_{x=0}^∞ p(x) = c(θ) Σ_{x=0}^∞ θ^(-x) = 1.

For the pmf to be nonnegative we need θ > 0, and then the geometric series Σ_{x=0}^∞ θ^(-x) converges iff θ > 1. Hence θ ∈ (1, ∞). In this case,

    Σ_{x=0}^∞ θ^(-x) = 1/(1 - 1/θ),

so c(θ) · 1/(1 - 1/θ) = 1, giving c(θ) = 1 - 1/θ. This is the geometric distribution (with success probability 1 - 1/θ).

5. Suppose continuous random variable X has the exponential distribution, X ~ E(λ), with pdf f(x) = λ e^(-λx) 1_(0,∞)(x) for λ > 0.

(a) What is the cumulative distribution function (cdf) for X?
(b) What is the median for X?
(c) For arbitrary positive s and t, find an expression for Pr(X > s + t | X > s).
(d) Suppose the distribution of X describes the time until failure of a light bulb that is left on continuously. Suppose it is still burning at time s. What is the median residual lifetime of the bulb after time s (i.e., the time after s at which there is exactly a 50% chance the bulb will still be lit)?

Solution.

(a) The cdf of X for x > 0, substituting w = λt, is

    F(x) = Pr(X ≤ x) = ∫_0^x λ e^(-λt) dt = ∫_0^(λx) e^(-w) dw = [-e^(-w)]_0^(λx) = 1 - e^(-λx),

and F(x) = 0 otherwise.

(b) The median m satisfies F(m) = 1/2, so

    1 - e^(-λm) = 1/2  ⟹  e^(-λm) = 1/2  ⟹  λm = log 2  ⟹  m = λ^(-1) log 2.

(c) This problem demonstrates the memoryless property of exponential distributions:

    Pr(X > s + t | X > s) = Pr(X > s + t, X > s) / Pr(X > s)
                          = Pr(X > s + t) / Pr(X > s)
                          = (1 - Pr(X ≤ s + t)) / (1 - Pr(X ≤ s))
                          = (1 - F(s + t)) / (1 - F(s))
                          = e^(-λ(s+t)) / e^(-λs)
                          = e^(-λt)
                          = Pr(X > t).

Note: a similar exercise shows that the discrete geometric distributions also exhibit this property.

(d) The median residual lifetime is the m > 0 such that Pr(X > s + m | X > s) = 1/2. However, applying the lack-of-memory property derived in (c), Pr(X > s + m | X > s) = Pr(X > m). Hence m is just the median of X, which was found in (b) to be m = λ^(-1) log 2.

6. Suppose continuous random variable X has the standard uniform distribution, X ~ U(0, 1), with pdf f(x) = 1_(0,1)(x). What are the cumulative distribution function and probability density function for W = log(X)?

Solution. First, we solve the problem as written. A more common transformation in practice would be Z = -log(X). Working either the problem as written or assuming Z was intended is fine for this homework.

To derive the cdf of W, we first need to note that the cdf of X is just

    F_X(x) = 0 for x < 0,  F_X(x) = x for 0 ≤ x < 1,  F_X(x) = 1 for x ≥ 1.

Then, we calculate

    F_W(w) = Pr(W ≤ w) = Pr(log(X) ≤ w) = Pr(X ≤ e^w) = e^w for w < 0, and 1 for w ≥ 0.

We note that the support of W is now the interval (-∞, 0). With this in mind, the pdf of W follows by differentiation:

    f_W(w) = dF_W(w)/dw = e^w 1_(-∞,0)(w).

The alternative formulation of this problem, for Z = -log(X), results in the following cdf:

    F_Z(z) = Pr(Z ≤ z) = Pr(-log(X) ≤ z)
           = Pr(log(X) ≥ -z)
           = Pr(X ≥ e^(-z))
           = Pr(X > e^(-z))        (by continuity)
           = 1 - Pr(X ≤ e^(-z))
           = 1 - e^(-z) for z ≥ 0, and 0 for z < 0.

Hence, Z follows the exponential distribution with λ = 1 and with pdf f_Z(z) = e^(-z) 1_(0,∞)(z).
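The conclusion that Z = -log(X) ~ E(1) can be checked by simulation. This Python sketch (not part of the original key) uses a fixed seed and compares the empirical cdf and median of Z to the Exp(1) values; the tolerances are loose to accommodate Monte Carlo error:

```python
# Monte Carlo sanity check (not part of the original key): if U ~ U(0,1),
# then Z = -log(U) should follow Exp(1): its empirical cdf at z should be
# close to 1 - exp(-z), and its median close to log(2).
import math
import random

random.seed(12345)
n = 200_000
# 1 - random.random() lies in (0, 1], avoiding log(0)
z = [-math.log(1 - random.random()) for _ in range(n)]

for z0 in (0.5, 1.0, 2.0):
    empirical = sum(v <= z0 for v in z) / n
    assert abs(empirical - (1 - math.exp(-z0))) < 0.01
assert abs(sorted(z)[n // 2] - math.log(2)) < 0.02
print("Z = -log(U) matches Exp(1)")
```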

MORE INVOLVED PROBLEMS

7. Suppose independent continuous random variables X_1 and X_2 each have the exponential distribution E(λ) with pdf f(x) = λ e^(-λx) 1_(0,∞)(x) for λ > 0.

(a) What are the cumulative distribution function and probability density function for W = min(X_1, X_2)?
(b) What are the cumulative distribution function and probability density function for W = max(X_1, X_2)?

Solution.

(a) The cdf calculation should be familiar from Problem 6(a) of Homework #1. Later on, we will study the distributions of arbitrary order statistics (not just the minimum or maximum of a random sample). Here, the distribution of W follows as

    F(w) = Pr(W ≤ w) = 1 - Pr(W > w)
         = 1 - Pr(X_1 > w, X_2 > w)
         = 1 - Pr(X_1 > w)^2
         = 1 - (1 - Pr(X_1 ≤ w))^2
         = 1 - (e^(-λw))^2
         = 1 - e^(-2λw)

for w > 0, and 0 for w ≤ 0. By differentiating, or by noting that W ~ E(2λ), we obtain the pdf

    f(w) = 2λ e^(-2λw) 1_(0,∞)(w).

Note: This exercise can be generalized to a sample X_1, …, X_n that is independent and identically distributed according to E(λ). What do you expect the distribution of W = min(X_1, …, X_n) to be, based on this problem?

(b) For the maximum, we use the independence and identical distribution of the X_i to find:

    F(w) = Pr(W ≤ w) = Pr(X_1 ≤ w, X_2 ≤ w) = Pr(X_1 ≤ w)^2 = (1 - e^(-λw))^2

for w > 0, and 0 for w ≤ 0. By differentiation of F(w), the pdf of W is

    f(w) = 2λ (1 - e^(-λw)) e^(-λw) 1_(0,∞)(w).

8. Let X_1 and X_2 be Poisson random variables with X_i ~ P(λ_i) and probability mass functions

    p_i(x) = Pr[X_i = x] = e^(-λ_i) λ_i^x / x! for x ∈ {0, 1, 2, …}, and 0 else.

Further suppose that X_1 and X_2 are independent random variables, so that the events {X_1 = x_1} and {X_2 = x_2} are independent for all x_1, x_2 ∈ {0, 1, 2, …}. (Note that we will eventually describe methods for deriving these answers using convolutions. However, this problem can be answered by just considering independent events and the results we have discussed for conditional probabilities.)

(a) What is the joint probability mass function p_{X_1,X_2}(k_1, k_2) = Pr(X_1 = k_1 ∩ X_2 = k_2)?
(b) Letting random variable S = X_1 + X_2, what is the probability mass function p_S(s)?
(c) Find the conditional probability mass function p_{X_1|S}(x | S = s). To what parametric family does this family belong?

Solution.

(a) By independence, we have that

    p_{X_1,X_2}(k_1, k_2) = Pr(X_1 = k_1 ∩ X_2 = k_2) = Pr(X_1 = k_1) Pr(X_2 = k_2) = p_1(k_1) p_2(k_2).

Hence, the joint distribution of X_1 and X_2 has pmf

    p_{X_1,X_2}(k_1, k_2) = e^(-(λ_1+λ_2)) λ_1^(k_1) λ_2^(k_2) / (k_1! k_2!) for k_1, k_2 ∈ {0, 1, 2, …}, and 0 else.

(b) We first derive a useful fact about Pr(S = s | X_2 = k):

    Pr(S = s | X_2 = k) = Pr(X_1 + X_2 = s | X_2 = k)
                        = Pr(X_1 + X_2 = s, X_2 = k) / Pr(X_2 = k)
                        = Pr(X_1 + X_2 = (s - k) + k, X_2 = k) / Pr(X_2 = k)
                        = Pr(X_1 = s - k, X_2 = k) / Pr(X_2 = k)
                        = Pr(X_1 = s - k) Pr(X_2 = k) / Pr(X_2 = k)        (by independence)
                        = Pr(X_1 = s - k).

Now, we proceed in calculating the mass function of S. For s = 0, 1, 2, …, we have

    p_S(s) = Pr(S = s)
           = Σ_{k=0}^∞ Pr(S = s | X_2 = k) Pr(X_2 = k)        (by the Law of Total Probability)
           = Σ_{k=0}^s Pr(S = s | X_2 = k) Pr(X_2 = k)        (terms with k > s vanish, since then Pr(X_1 = s - k) = 0)
           = Σ_{k=0}^s p_1(s - k) p_2(k)        (using our useful fact)
           = Σ_{k=0}^s [e^(-λ_1) λ_1^(s-k) / (s-k)!] [e^(-λ_2) λ_2^k / k!]
           = [e^(-(λ_1+λ_2)) (λ_1 + λ_2)^s / s!] Σ_{k=0}^s [s! / ((s-k)! k!)] (λ_1/(λ_1+λ_2))^(s-k) (λ_2/(λ_1+λ_2))^k
           = e^(-(λ_1+λ_2)) (λ_1 + λ_2)^s / s!        (recognizing the binomial sum, which equals 1),

which is the pmf of a Poisson distribution with mean parameter λ_1 + λ_2.

(c) Before we evaluate the conditional probability for k ∈ {0, 1, …, s}, note that for k ∉ {0, 1, …, s} the following probability is zero. For k ∈ {0, 1, …, s},

    Pr(X_1 = k | S = s) = Pr(S = s, X_1 = k) / Pr(S = s)
                        = Pr(S = s | X_1 = k) Pr(X_1 = k) / Pr(S = s)
                        = p_2(s - k) p_1(k) / p_S(s)        (applying the useful fact)
                        = [e^(-λ_2) λ_2^(s-k) / (s-k)!] [e^(-λ_1) λ_1^k / k!] / [e^(-(λ_1+λ_2)) (λ_1+λ_2)^s / s!]
                        = [s! / ((s-k)! k!)] (λ_1/(λ_1+λ_2))^k (λ_2/(λ_1+λ_2))^(s-k).

Equivalently, we could write

    Pr(X_1 = k | S = s) = [s! / ((s-k)! k!)] (λ_1/(λ_1+λ_2))^k (λ_2/(λ_1+λ_2))^(s-k) 1_{0,1,…,s}(k),

which emphasizes, for arbitrary k ∈ R, whether k is in the support of the conditional distribution or not.

This is the pmf of a Binomial distribution with probability of success p = λ_1/(λ_1+λ_2) in a sample of size s. We can also write

    X_1 | S = s ~ B(s, λ_1/(λ_1+λ_2)).

9. Consider an experiment consisting of two independent rolls of a weighted die. Let X_k be the random variable measuring the number showing on top of the die on the kth roll. Hence, the sample space for X_k is Ω_X = {1, 2, 3, 4, 5, 6}, with P_X(X_k = ω) = p_ω for all ω ∈ Ω_X and for k = 1, 2. Define S = X_1 + X_2.

(a) What is the support of S?
(b) Can you find values for the parameters p_1, …, p_6 such that all possible outcomes for S are equally likely? If so, do so. If not, prove that it cannot be done.

Solution.

(a) The support of S is the set of values s such that P_S(S = s) > 0. Since the probabilities for each outcome in Ω_X must be positive, the support of S is given by

    Ω_S = {2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12}.

This is the set of all possible outcomes of S.

(b) In order for the outcomes of S to be equally likely, we require that P_S(S = s) = P(s) = 1/11 for all s ∈ Ω_S. Thus, the following must hold:

    P(2) = p_1^2 = 1/11 and P(12) = p_6^2 = 1/11.

Then we have that p_1 p_6 = 1/11. However, this would imply that

    P(7) = 2(p_1 p_6 + p_2 p_5 + p_3 p_4) = 2(1/11 + p_2 p_5 + p_3 p_4) > 1/11,

since p_ω > 0 for each ω ∈ Ω_X. This suffices to show that no weighted die can be constructed such that the outcomes of S are all equally likely. Note that additional correct solutions may be constructed based on outcomes other than ω = 1 or 6.
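The two identities from Problem 8 (the Poisson convolution and the conditional Binomial) can be verified numerically. This Python sketch (not part of the original key) checks both for the illustrative, arbitrarily chosen rates λ_1 = 2.0 and λ_2 = 3.5:

```python
# Numeric check (not part of the original key): the convolution of two
# Poisson pmfs equals the Poisson(l1 + l2) pmf, and the conditional
# distribution of X1 given S = s is Binomial(s, l1 / (l1 + l2)).
from math import comb, exp, factorial

def pois(x, lam):
    return exp(-lam) * lam**x / factorial(x)

l1, l2 = 2.0, 3.5  # illustrative rates, not from the problem statement
for s in range(10):
    # Problem 8(b): convolution identity
    conv = sum(pois(s - k, l1) * pois(k, l2) for k in range(s + 1))
    assert abs(conv - pois(s, l1 + l2)) < 1e-12
    # Problem 8(c): conditional pmf vs Binomial(s, l1/(l1+l2))
    p = l1 / (l1 + l2)
    for k in range(s + 1):
        cond = pois(k, l1) * pois(s - k, l2) / pois(s, l1 + l2)
        binom = comb(s, k) * p**k * (1 - p) ** (s - k)
        assert abs(cond - binom) < 1e-12
print("Poisson sum and conditional binomial verified")
```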