Chapter 4 Continuous Random Variables and Probability Distributions

Part 3: The Exponential Distribution and the Poisson Process
Section 4.8: The Exponential Distribution
1 / 21

Exponential Distribution

One place the exponential distribution arises is in modeling the time or distance between occurrences of events:
- wait time between phone calls
- distance between recombination events on a DNA strand

It is also used to model the distribution of component lifetime, or the lifetime of a device. This is related to reliability: the length of time until a device fails.

The shape and probabilities of an exponential distribution depend on only one parameter, λ.
2 / 21

Exponential Distribution

Three different exponential distributions are shown below, for λ = 0.04, λ = 2, and λ = 5.

[Figure: exponential density curves f(x) vs. x for λ = 0.04, 2, 5]

The exponential distribution is connected to the Poisson distribution (through the Poisson process), and λ can be seen as a rate parameter: a long-term rate of occurrence per unit interval.
3 / 21

Exponential Distribution

Definition (Exponential Distribution)
The random variable X that equals the distance between successive events of a Poisson process with mean number of events λ > 0 per unit interval is an exponential random variable with parameter λ. The probability density function of X is

f(x) = λe^(−λx) for 0 ≤ x < ∞

[Figure: exponential density curve f(x) vs. x]
4 / 21
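As a quick sanity check on the definition above, a short Python sketch (the choice of λ = 2 and the integration grid are arbitrary, for illustration only) can confirm numerically that f(x) = λe^(−λx) integrates to 1 over [0, ∞):

```python
import math

def exp_pdf(x, lam):
    """Exponential density f(x) = lam * exp(-lam * x) for x >= 0."""
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

# Midpoint Riemann sum: a valid density should integrate to (approximately) 1.
lam = 2.0                       # illustrative rate parameter
dx = 1e-4
total = sum(exp_pdf((i + 0.5) * dx, lam) * dx for i in range(200_000))  # up to x = 20
print(round(total, 4))  # ≈ 1.0
```

The tail beyond x = 20 contributes only e^(−40), which is negligible here.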

Exponential Distribution

Definition (Mean and Variance for the Exponential Distribution)
For an exponential random variable with rate parameter λ,

µ = E(X) = 1/λ and σ² = V(X) = 1/λ²

[Figure: exponential density curves f(x) vs. x for λ = 0.04, 2, 5]

A smaller λ coincides with a larger expected value µ, and with more of the probability being pushed out into the right tail (relative to the other distributions).
5 / 21

The exponential distribution is connected to the Poisson process (next slide). Specifically, the probability distribution of the wait time (a continuous X) until the next event occurs in a Poisson process IS an exponential distribution.

X = wait time (a continuous r.v.)
X ∼ exponential(λ), where λ = rate of the process, i.e. the rate of event occurrence

How long you have to wait for an event depends on how often events occur. A Poisson process is a certain type of situation to which the Poisson distribution applies.
6 / 21

The Poisson Process

Based on the name, a Poisson process must be related to counting the number of events that occur in a window of time or region in space. In general, a Poisson process is a situation in which the following conditions hold:
- Events occur randomly, but with a long-term average rate of λ per unit time. For example, λ = 10 per hour, or λ = 240 per day.
- One hour is just as likely as another to have an event, and the likelihood of an event is completely independent of the past history.
- The events are rare enough that in a very short time interval, there is a negligible chance of more than one event. (H.B. Enderton, UCLA)

Any process that has these characteristics is called a Poisson process, and λ is called the rate of the process.
7 / 21

The Poisson Process

More on the Poisson process... Suppose we can model the number of calls arriving during an x-minute time window with a Poisson distribution (we're modeling a count). We assume that the calls arrive completely at random in time during the x minutes. Let the expected number of calls during a 1-minute interval be λ (a rate).

If λ = 2 (i.e., an average of 2 calls per minute), then:
- the expected number of calls in 1 minute is 2 calls,
- the expected number of calls in 2 minutes is 4 calls,
- and generically, the expected number of calls in x minutes is λx = 2x calls.

The expected number of calls depends on two things: the length of the time interval, x, and λ, the rate of occurrence per time unit.
8 / 21

Longer fixed time intervals have a higher expected number of calls. We can derive the exponential distribution as a wait time between events in a Poisson process.

Let N denote the number of calls in an x-minute time interval in a Poisson process with a rate parameter of λ events per minute. Then

N ∼ Poisson(λx).

The number of calls during a fixed time interval of x minutes has a Poisson distribution with mean λx. We saw this in Section 3.9.
9 / 21
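The claim that counts over an x-minute window have mean λx can be checked by simulating the process itself: generate exponential inter-arrival gaps and count how many arrivals land in [0, x]. This sketch is not from the slides; the values λ = 2 and x = 4 are arbitrary illustrations:

```python
import math
import random

random.seed(3)
lam, x = 2.0, 4.0   # illustrative rate (per minute) and window length (minutes)
trials = 20_000

def count_events(lam, x):
    """Count arrivals in [0, x] when inter-arrival gaps are exponential(lam)."""
    t, n = 0.0, 0
    while True:
        t += -math.log(1.0 - random.random()) / lam  # next exponential gap
        if t > x:
            return n
        n += 1

counts = [count_events(lam, x) for _ in range(trials)]
print(round(sum(counts) / trials, 2))  # should be near lam * x = 8
```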

Example (Radioactive pulses recorded by a Geiger counter)
From a time point of t = 0 minutes, we count the number of radioactive pulses recorded by a Geiger counter. Let the process be a Poisson process with a rate parameter λ = 6 (i.e., 6 pulses per minute on average).

What is the probability that in a 0.5-minute interval, at least one pulse is received?

[THINK: How many pulses do you expect to receive in this fixed 0.5-minute window?]

We'll need a probability distribution to calculate this probability.
10 / 21

Example (Radioactive pulses recorded by a Geiger counter, cont.)
ANS: Let N denote the number of pulses in a 0.5-minute interval. Then

N ∼ Poisson(0.5 × 6 = 3) (N is a Poisson r.v. with an expected value of 3)

P(N ≥ 1) = 1 − P(N = 0) = 1 − e^(−3)(3)⁰/0! = 0.950

Notice the units were consistent (in minutes) for both parts of the problem: how the rate was given to you, and how the question was asked (check this). Notice I could have set this up using the notation Y rather than N, i.e. Y ∼ Poisson(0.5 × 6 = 3), because it's an arbitrary labeling.
11 / 21
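The Geiger-counter calculation above is a one-liner to reproduce; this sketch just restates the slide's arithmetic in Python:

```python
import math

lam_per_min = 6             # pulse rate from the example
window = 0.5                # minutes
mu = lam_per_min * window   # expected count in the window = 3

# P(N >= 1) = 1 - P(N = 0) for N ~ Poisson(mu); P(N = 0) = e^{-mu} * mu^0 / 0!
p_at_least_one = 1 - math.exp(-mu)
print(round(p_at_least_one, 3))  # 0.95
```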

Back to deriving the exponential distribution...

HERE'S THE SWITCH TO WAIT TIME: Let X be the wait time until the first call from any start point in this setting.
- If you wait at least 3 minutes for a call, then NO CALL occurred in the first 3 minutes.
- If you wait at least 10 minutes for a call, then NO CALL occurred in the first 10 minutes.
- If you wait at least x minutes for a call, then NO CALL occurred in the first x minutes.
12 / 21

Let X be the wait time until the first call from any start point in this setting (a continuous random variable).

Wait time random variable:
P(X > x) = P(you wait at least x minutes for the first call)
         = P(there were no calls in the first x minutes)
         = P(N = 0), where N ∼ Poisson(λx)
         = e^(−λx)(λx)⁰/0!
         = e^(−λx)

For a fixed time interval of length x, we can compute the probability of 0 events using the Poisson distribution with mean λx.
13 / 21

We can now use this probability to derive the cumulative distribution function for X (the r.v. representing the wait time until the first call):

F(x) = P(X ≤ x) = 1 − P(X > x) = 1 − P(N = 0) = 1 − e^(−λx)

To get the pdf f(x) for X, we simply take the derivative:

f(x) = (d/dx) F(x) = λe^(−λx) for x ≥ 0

It just so happens that this f(x) IS an exponential probability density function with parameter λ, or mean 1/λ (see slides 4 & 5). Thus, wait time in a Poisson process is modeled with the exponential distribution.
14 / 21
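The derivation can be checked numerically: compute P(no events) from the Poisson pmf, form F(x) = 1 − e^(−λx), and confirm that a numerical derivative of F matches λe^(−λx). This is an illustrative sketch with arbitrary λ = 2 and x = 1:

```python
import math

lam, x = 2.0, 1.0   # illustrative rate and evaluation point

# P(X > x) = P(no events in [0, x]): Poisson(lam * x) pmf at 0
p_no_events = math.exp(-lam * x) * (lam * x) ** 0 / math.factorial(0)
F = 1 - p_no_events               # the derived cdf, F(x) = 1 - e^{-lam x}
f = lam * math.exp(-lam * x)      # the claimed derivative, the exponential pdf

# Central-difference numerical derivative of F as a sanity check on f = F'
h = 1e-6
F_prime = ((1 - math.exp(-lam * (x + h))) - (1 - math.exp(-lam * (x - h)))) / (2 * h)
print(round(F, 4), round(f, 4), round(F_prime, 4))
```

The last two printed values should agree to several decimal places.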

Example (Wait time in the Poisson process)
Suppose emergency alarms occur with no particular pattern (at random), but at an average rate of λ = 12 per day. What is the expected wait time for an emergency alarm?

ANS: Let X = wait time until the first emergency alarm (from any start point). X follows an exponential distribution with λ = 12, so

E(X) = 1/λ = 1/12 day

That is, we expect to wait 1/12 of a day (2 hours) until the first emergency alarm.
15 / 21

This has all been phrased in terms of the first occurrence from any start point, and so it also holds for modeling the wait time until the next occurrence from any point (same thing).

The random variable X that represents the distance between successive events in a Poisson process with rate parameter λ is an exponential random variable with parameter λ, or mean 1/λ.
16 / 21

Example (Wait time in the Poisson process)
Suppose calls are received at a 24-hour alcoholics anonymous hotline according to a Poisson process with a rate of λ = 0.5 calls per day. What is the probability that they will wait more than 2 days for a call (from any given start point)?

ANS: [Worked both ways; the slide left space for these in-class calculations.]
- Using the exponential distribution: X ∼ exponential(λ = 0.5), so P(X > 2) = e^(−0.5×2) = e^(−1) ≈ 0.368.
- Using the Poisson distribution: let N = number of calls in 2 days, N ∼ Poisson(0.5 × 2 = 1), so P(N = 0) = e^(−1)(1)⁰/0! = e^(−1) ≈ 0.368.
17 / 21
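The hotline example can be computed both ways, and the two routes must agree exactly; this sketch uses the numbers from the example:

```python
import math

lam = 0.5   # calls per day
t = 2       # days

# Exponential route: P(X > t) = e^{-lam * t}
p_exp = math.exp(-lam * t)

# Poisson route: P(no calls in t days) for N ~ Poisson(lam * t = 1)
p_pois = math.exp(-lam * t) * (lam * t) ** 0 / math.factorial(0)

print(round(p_exp, 4), round(p_pois, 4))  # both e^{-1} ≈ 0.3679
```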

The cdf, F(x), for an exponential distribution:

P(X ≤ x) = F(x) = ∫₀ˣ f(u) du = 1 − e^(−λx) for x ≥ 0.

Making the connection... For an exponential random variable X with parameter λ, we have µ = E(X) = 1/λ.

[Figure: exponential density curves f(x) vs. x for λ = 0.04, 2, 5]

A smaller λ (rate of occurrence) coincides with a larger expected value of X (wait time). A larger λ (rate of occurrence) coincides with a smaller expected value of X (wait time).
18 / 21

Lack of Memory Property of the Exponential Distribution

For an exponential random variable X,

P(X < t₁ + t₂ | X > t₁) = P(X < t₂)

Given that you've already waited t₁ = 5 minutes for an event, what is the probability that you'll have an event in the next five minutes (i.e., t₂ = 5)? It might seem like you should be "due" for an event, but you're essentially starting from scratch at the 5-minute point:

P(X < 10 | X > 5) = P(X < 5).
19 / 21
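The memoryless property is easy to see in simulation: condition a large exponential sample on X > t₁ and compare the conditional probability of an event in the next t₂ units with the unconditional P(X < t₂). This is an illustrative sketch; the values λ = 0.2 and t₁ = t₂ = 5 are arbitrary:

```python
import math
import random

random.seed(7)
lam, t1, t2 = 0.2, 5.0, 5.0   # illustrative rate and waiting times
n = 200_000
xs = [-math.log(1.0 - random.random()) / lam for _ in range(n)]

survived = [x for x in xs if x > t1]                    # condition on X > t1
cond = sum(x < t1 + t2 for x in survived) / len(survived)
fresh = sum(x < t2 for x in xs) / n                     # unconditional P(X < t2)
print(round(cond, 2), round(fresh, 2))  # both near 1 - e^{-1} ≈ 0.63
```

The two estimates should match to within simulation noise, exactly as the property claims.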

Example (Particle detection, p. 135, Example 4-22)
Let X denote the time between detections of a particle. Suppose detections of a particle follow a Poisson process with an average rate of occurrence of 10 detections every 14 minutes (λ = 10/14 per minute).

1) What is the probability that we detect a particle in the next 30 seconds?

ANS: Keep consistent units: 30 seconds = 0.5 minutes. X is a wait time in a Poisson process, so X ∼ exponential(λ = 10/14), and

P(X < 0.5) = F(0.5) = 1 − e^(−(10/14)×0.5) = 0.3003
20 / 21

Example (Particle detection, p. 135, Example 4-22, cont.)
2) Given that we have already waited for 3 minutes without a detection, what is the probability that a particle is detected in the next 30 seconds?

ANS:
P(X < 3.5 | X > 3) = P(X < 3.5 and X > 3) / P(X > 3)
                   = P(3 < X < 3.5) / P(X > 3)
                   = [F(3.5) − F(3)] / [1 − F(3)]
                   = 0.3003
                   = P(X < 0.5)

Recall one of the Poisson process characteristics: the likelihood of an event is completely independent of the past history.
21 / 21
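Both parts of the particle-detection example can be verified with the cdf F(x) = 1 − e^(−λx); by memorylessness the two answers coincide exactly. This sketch uses the rate from the example:

```python
import math

lam = 10 / 14   # detections per minute, from the example

def F(x):
    """Exponential cdf F(x) = 1 - e^{-lam x}."""
    return 1 - math.exp(-lam * x)

p1 = F(0.5)                          # part 1: P(X < 0.5)
p2 = (F(3.5) - F(3)) / (1 - F(3))    # part 2: P(X < 3.5 | X > 3)
print(round(p1, 4), round(p2, 4))    # both 0.3003, by memorylessness
```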