Lecture 17: The Exponential and Some Related Distributions


1.) Definition

Definition: A continuous random variable $X$ is said to have the exponential distribution with parameter $\lambda > 0$ if the density of $X$ is

$$p(x) = \begin{cases} \lambda e^{-\lambda x} & \text{if } x > 0 \\ 0 & \text{otherwise.} \end{cases}$$

In this case, a simple calculation confirms that $\int_0^\infty p(x)\,dx = 1$. Also, the CDF of $X$ can be calculated explicitly:

$$F_X(x) = P\{X \le x\} = \int_0^x \lambda e^{-\lambda y}\,dy = 1 - e^{-\lambda x}.$$

2.) Moments

The moments of the exponential distribution can be calculated recursively using integration by parts. Let $X$ be an exponential random variable with parameter $\lambda$. Then

$$E[X^n] = \int_0^\infty x^n \lambda e^{-\lambda x}\,dx = \left[-x^n e^{-\lambda x}\right]_0^\infty + \frac{n}{\lambda}\int_0^\infty x^{n-1}\lambda e^{-\lambda x}\,dx = \frac{n}{\lambda}\,E[X^{n-1}] = \frac{n!}{\lambda^n},$$

using the fact that $E[X^0] = 1$. Taking $n = 1$ and then $n = 2$ gives:

$$E[X] = \frac{1}{\lambda}, \qquad \mathrm{Var}(X) = E[X^2] - E[X]^2 = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}.$$

Thus, the mean of an exponential random variable is equal to the reciprocal of its parameter, while the variance is equal to the mean squared.
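The mean and variance just derived can be sanity-checked numerically. The sketch below uses only Python's standard library; the parameter value, sample size, and tolerances are arbitrary illustrative choices, not part of the lecture.

```python
import random

# Monte Carlo check that E[X] = 1/lam and Var(X) = 1/lam^2 for an
# exponential random variable with rate lam.
lam = 2.0
random.seed(0)
samples = [random.expovariate(lam) for _ in range(200_000)]

n = len(samples)
mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n

assert abs(mean - 1 / lam) < 0.01      # E[X] = 1/lam = 0.5
assert abs(var - 1 / lam ** 2) < 0.01  # Var(X) = 1/lam^2 = 0.25
```

With 200,000 samples the standard error of both estimates is on the order of $10^{-3}$, so the tolerance of $0.01$ leaves ample margin.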

3.) Memorylessness

An important property of the exponential distribution is that it is memoryless: if $X$ is exponentially distributed with parameter $\lambda$, then for all $t, s \ge 0$,

$$P\{X > t + s \mid X > t\} = P\{X > s\}.$$

To verify this, use the CDF calculated above to obtain

$$P\{X > t\} = 1 - P\{X \le t\} = e^{-\lambda t},$$

and then calculate

$$P\{X > t + s \mid X > t\} = \frac{P\{X > t + s\}}{P\{X > t\}} = \frac{e^{-\lambda(t+s)}}{e^{-\lambda t}} = e^{-\lambda s} = P\{X > s\}.$$

In other words, if we think of $X$ as being the (random) time until some event occurs, then conditional on the event not having occurred by time $t$, i.e., on $X > t$, the distribution of the time remaining until the event does occur, $X - t$, is also exponential with parameter $\lambda$. It is as if the conditional probability of the event occurring between times $t$ and $t + s$ has no memory of the amount of time that has elapsed up to that point. Remarkably, the converse of this statement is also true.

Proposition: Suppose that $X$ is a continuous random variable with values in $[0, \infty)$. Then $X$ is memoryless if and only if $X$ is exponentially distributed for some parameter $\lambda > 0$.

Proof: See footnote on pg. 2 of Ross.

4.) Interpretation

Exponential random variables are often used to model the distribution of the amount of time elapsed until some particular event occurs. In this case, the memorylessness property implies that the likelihood that the event occurs in some short interval $[t, t + \delta t)$, conditional on it not yet having occurred, does not depend on $t$. Notice that for events that occur at discrete times, the geometric distribution has a similar property: the probability of observing the event in question on any particular trial, given that it has not yet occurred, is just $p$. In fact, the exponential distribution can be thought of as a limiting case of the geometric distribution when the success probability $p$ is small and when time is measured in units of size $1/p$.

Proposition: For each $n \ge 1$, let $X_n$ be a geometric random variable with parameter $\lambda/n$. Then, for every $t \ge 0$,

$$\lim_{n \to \infty} P\{X_n > nt\} = e^{-\lambda t}.$$

Proof: First, recall that

$$\lim_{n \to \infty} \left(1 - \frac{x}{n}\right)^n = e^{-x}.$$

The result can then be deduced by writing $x_k = k/n$ below and approximating the sum by an integral with respect to $x$:

$$P\{X_n > nt\} = \sum_{k > nt} \frac{\lambda}{n}\left(1 - \frac{\lambda}{n}\right)^{k-1} \approx \sum_{k > nt} \frac{\lambda}{n}\, e^{-\lambda k/n} \approx \int_t^\infty \lambda e^{-\lambda x}\,dx = e^{-\lambda t}.$$

Remark: This proposition implies that if $X_n$ is geometric with parameter $\lambda/n$, then the distribution of the random variable $T_n = \frac{1}{n} X_n$ is approximately exponential with parameter $\lambda$ when $n$ is large.

5.) The Gamma Distribution

Definition: A random variable $X$ is said to have the gamma distribution with parameters $\alpha, \lambda > 0$ if its density is

$$p(x) = \begin{cases} \dfrac{\lambda^\alpha}{\Gamma(\alpha)} x^{\alpha-1} e^{-\lambda x} & \text{if } x > 0 \\ 0 & \text{otherwise,} \end{cases}$$

where the gamma function $\Gamma(\alpha)$ is defined (for $\alpha > 0$) by the formula

$$\Gamma(\alpha) = \int_0^\infty y^{\alpha-1} e^{-y}\,dy.$$

Remark: Notice that the exponential distribution with parameter $\lambda$ is identical to the gamma distribution with parameters $(1, \lambda)$. Later we will show that if $X_1, \dots, X_n$ are independent exponential RVs, each with parameter $\lambda$, then the sum $X = X_1 + \cdots + X_n$ is a gamma RV with parameters $(n, \lambda)$. Because of this relationship, the gamma distribution is often used to model life spans.

Moments: Integration by parts shows that for $\alpha > 1$, the gamma function satisfies the following important recursion:

$$\Gamma(\alpha) = \left[-y^{\alpha-1} e^{-y}\right]_0^\infty + (\alpha - 1)\int_0^\infty y^{\alpha-2} e^{-y}\,dy = (\alpha - 1)\,\Gamma(\alpha - 1).$$
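The recursion $\Gamma(\alpha) = (\alpha - 1)\Gamma(\alpha - 1)$ can be confirmed directly with the standard library's gamma function. The test values below are arbitrary:

```python
import math

# Check the gamma-function recursion Gamma(alpha) = (alpha - 1) * Gamma(alpha - 1)
# for a few non-integer and integer arguments greater than 1.
for alpha in (1.5, 2.7, 6.0, 10.25):
    assert math.isclose(math.gamma(alpha),
                        (alpha - 1) * math.gamma(alpha - 1))
```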

In particular, if $\alpha = n$ is a positive integer, then because

$$\Gamma(1) = \int_0^\infty e^{-y}\,dy = 1,$$

we obtain

$$\Gamma(n) = (n-1)\,\Gamma(n-1) = (n-1)(n-2)\,\Gamma(n-2) = \cdots = (n-1)(n-2)\cdots 3 \cdot 2\,\Gamma(1) = (n-1)!.$$

This recursion can be used to calculate the moments of the gamma distribution. If $X$ has the gamma distribution with parameters $(\alpha, \lambda)$, then

$$E[X^n] = \int_0^\infty x^n \frac{\lambda^\alpha}{\Gamma(\alpha)} x^{\alpha-1} e^{-\lambda x}\,dx = \frac{\Gamma(n+\alpha)}{\lambda^n\,\Gamma(\alpha)} \int_0^\infty \frac{\lambda^{n+\alpha}}{\Gamma(n+\alpha)} x^{n+\alpha-1} e^{-\lambda x}\,dx = \frac{\Gamma(n+\alpha)}{\lambda^n\,\Gamma(\alpha)} = \frac{1}{\lambda^n}\prod_{k=0}^{n-1}(\alpha + k),$$

where the remaining integral equals one because the integrand is the gamma density with parameters $(n+\alpha, \lambda)$. In particular, taking $n = 1$ and $n = 2$ gives

$$E[X] = \frac{\alpha}{\lambda}, \qquad \mathrm{Var}(X) = E[X^2] - E[X]^2 = \frac{\alpha(\alpha+1)}{\lambda^2} - \frac{\alpha^2}{\lambda^2} = \frac{\alpha}{\lambda^2}.$$

6.) The Beta Distribution

Definition: A random variable $X$ is said to have the beta distribution with parameters $a, b > 0$ if its density is

$$p(x) = \begin{cases} \dfrac{1}{\beta(a,b)}\, x^{a-1}(1-x)^{b-1} & \text{if } x \in (0, 1) \\ 0 & \text{otherwise,} \end{cases}$$

where the beta function $\beta(a, b)$ is defined (for $a, b > 0$) by the formula

$$\beta(a, b) = \int_0^1 x^{a-1}(1-x)^{b-1}\,dx.$$
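The two expressions just derived for the gamma moments, $\Gamma(n+\alpha)/(\lambda^n \Gamma(\alpha))$ and $\prod_{k=0}^{n-1}(\alpha+k)/\lambda^n$, can be checked against each other numerically. The parameter values below are arbitrary:

```python
import math

# Check that the gamma-function form and the product form of the n-th
# moment of a Gamma(alpha, lam) random variable agree.
alpha, lam = 2.5, 3.0
for n in range(1, 6):
    via_gamma = math.gamma(n + alpha) / (lam ** n * math.gamma(alpha))
    via_product = math.prod(alpha + k for k in range(n)) / lam ** n
    assert math.isclose(via_gamma, via_product)

# The n = 1 and n = 2 cases recover the mean and the variance alpha/lam^2.
mean = alpha / lam
second_moment = alpha * (alpha + 1) / lam ** 2
assert math.isclose(second_moment - mean ** 2, alpha / lam ** 2)
```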

Remark: Notice that if $a = b = 1$, then $X$ is simply a standard uniform random variable. Also, if $X_1$ and $X_2$ are independent gamma-distributed RVs with parameters $(a, \theta)$ and $(b, \theta)$, respectively, then the random variable $X = X_1/(X_1 + X_2)$ is beta-distributed with parameters $(a, b)$. In particular, if $X_1$ and $X_2$ are independent exponential RVs with parameter $\lambda$, then $X = X_1/(X_1 + X_2)$ is uniformly distributed on $[0, 1]$. The beta distribution is often used to model the distribution of frequencies.

Moments: It can be shown that the beta function and the gamma function are related by the following identity:

$$\beta(a, b) = \frac{\Gamma(a)\,\Gamma(b)}{\Gamma(a + b)}.$$

In turn, we can use this identity to evaluate the moments of the beta distribution. Let $X$ be a beta random variable with parameters $(a, b)$. Then

$$E[X^n] = \frac{1}{\beta(a,b)}\int_0^1 x^n\, x^{a-1}(1-x)^{b-1}\,dx = \frac{\beta(n+a, b)}{\beta(a, b)} = \frac{\Gamma(n+a)\,\Gamma(b)}{\Gamma(n+a+b)} \cdot \frac{\Gamma(a+b)}{\Gamma(a)\,\Gamma(b)} = \frac{\Gamma(n+a)}{\Gamma(a)} \cdot \frac{\Gamma(a+b)}{\Gamma(n+a+b)} = \prod_{k=0}^{n-1} \frac{a+k}{a+b+k}.$$

Taking $n = 1$ and $n = 2$ gives

$$E[X] = \frac{a}{a+b}, \qquad \mathrm{Var}(X) = E[X^2] - E[X]^2 = \frac{a(a+1)}{(a+b)(a+b+1)} - \frac{a^2}{(a+b)^2} = \frac{ab}{(a+b)^2(a+b+1)}.$$
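The ratio-of-gammas characterization and the beta-gamma identity can both be checked numerically. The sketch below is a Monte Carlo illustration with arbitrary parameter choices $a = 2$, $b = 5$:

```python
import math
import random

# If X1 ~ Gamma(a, 1) and X2 ~ Gamma(b, 1) are independent, then
# X1 / (X1 + X2) should be Beta(a, b), with mean a / (a + b).
a, b = 2.0, 5.0
random.seed(1)
ratios = []
for _ in range(200_000):
    x1 = random.gammavariate(a, 1.0)
    x2 = random.gammavariate(b, 1.0)
    ratios.append(x1 / (x1 + x2))

mean = sum(ratios) / len(ratios)
assert abs(mean - a / (a + b)) < 0.005  # a/(a+b) = 2/7

# The identity beta(a, b) = Gamma(a)Gamma(b)/Gamma(a+b); for a = 2, b = 5
# this gives 1! * 4! / 6! = 1/30.
beta_ab = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
assert math.isclose(beta_ab, 1 / 30)
```

The Monte Carlo tolerance is generous: with 200,000 samples the standard error of the sample mean of a Beta(2, 5) variable is roughly $4 \times 10^{-4}$.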