EE126: Probability and Random Processes


Lecture 19: Poisson Process
Abhay Parekh, UC Berkeley
March 31, 2011

Outline
1. Logistics
2. Review
3. Poisson Processes

Logistics

Poisson Process

A continuous-time version of the Bernoulli process: time is not divided into slots but runs over the non-negative reals. Three equivalent descriptions:

1. The number of arrivals in time t is a Poisson rv with mean λt: P(N(t) = k) = (λt)^k e^{−λt} / k!
2. The interarrival times are iid exponential rvs with parameter λ.
3. First-principles definition: the process is time homogeneous and the numbers of arrivals in non-overlapping intervals are independent. For a small interval of size δ, the prob of one arrival in that interval is approx λδ, the prob of no arrival is approx 1 − λδ, and the prob of more than one arrival is approx 0.

Depending on the situation, use version 1, 2, or 3 to your benefit.
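To see that descriptions 1 and 2 agree, here is a minimal simulation sketch (the code and parameters are my own, not from the lecture): it builds the process from iid exponential interarrivals and compares the empirical distribution of N(t) against the Poisson pmf.

```python
import math
import random

def count_arrivals(lam, t):
    """Count arrivals in [0, t] by accumulating exponential interarrivals."""
    n, clock = 0, random.expovariate(lam)
    while clock <= t:
        n += 1
        clock += random.expovariate(lam)
    return n

lam, t, trials = 0.6, 2.0, 200_000
counts = [count_arrivals(lam, t) for _ in range(trials)]
for k in range(5):
    empirical = counts.count(k) / trials
    pmf = (lam * t) ** k * math.exp(-lam * t) / math.factorial(k)
    print(f"P(N({t}) = {k}): empirical {empirical:.4f}, Poisson pmf {pmf:.4f}")
```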

Example: Fishing

Bob catches fish according to a Poisson process with rate λ = 0.6 per hour. If he catches at least one fish in the first two hours, he quits; otherwise he continues until he catches his first fish.

Prob Bob fishes for more than 2 hours: this happens iff Bob catches nothing in the first 2 hours. Poisson rv: P(N(2) = 0) = e^{−1.2}.

Prob Bob fishes for more than 2 but at most 5 hours: this happens when Bob catches nothing in [0, 2] and catches a fish in (2, 5], i.e. no arrivals in [0, 2] and at least one arrival in (2, 5]. The two events are independent. The prob of the first is e^{−1.2}; the prob of the second is P(interarrival time ≤ 3) = 1 − e^{−1.8}. So the answer is e^{−1.2}(1 − e^{−1.8}).

Example: Fishing (continued)

Same setup: λ = 0.6 per hour; Bob quits at the two-hour mark if he has caught at least one fish by then, and otherwise continues until his first catch.

Prob Bob catches at least 2 fish: this can only happen if he catches at least two fish in the first two hours. So the answer is 1 − P(catches 0 fish in 2 hours) − P(catches 1 fish in 2 hours) = 1 − P(N(2) = 0) − P(N(2) = 1) = 1 − e^{−1.2} − 1.2e^{−1.2}.

Expected number of fish caught: total fish caught = fish caught in [0, 2] + fish caught in (2, ∞). By linearity of expectation this is E[N(2)] + P(N(2) = 0) · 1 = 1.2 + e^{−1.2}.

Expected total fishing time given that it exceeds 4 hours: by memorylessness, 4 + E[time to catch a fish] = 4 + E[interarrival time] = 4 + 1/0.6 = 5 2/3 hours.
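A hedged Monte Carlo check of the fishing answers on this slide and the previous one; the simulation harness below is my own sketch of Bob's stopping rule, not part of the lecture.

```python
import math
import random

lam, trials = 0.6, 200_000
quit_time, total_fish = [], []
for _ in range(trials):
    arrivals = []
    clock = random.expovariate(lam)      # time of the next catch
    while clock <= 2.0:
        arrivals.append(clock)
        clock += random.expovariate(lam)
    if arrivals:                         # caught something by hour 2: quit at 2
        total_fish.append(len(arrivals))
        quit_time.append(2.0)
    else:                                # keep going until the first catch
        total_fish.append(1)
        quit_time.append(clock)

p_gt_2 = sum(t > 2.0 for t in quit_time) / trials
p_2_to_5 = sum(2.0 < t <= 5.0 for t in quit_time) / trials
p_ge_2 = sum(f >= 2 for f in total_fish) / trials
gt4 = [t for t in quit_time if t > 4.0]
print(p_gt_2, math.exp(-1.2))                             # ~0.301
print(p_2_to_5, math.exp(-1.2) * (1 - math.exp(-1.8)))    # ~0.251
print(p_ge_2, 1 - math.exp(-1.2) - 1.2 * math.exp(-1.2))  # ~0.337
print(sum(total_fish) / trials, 1.2 + math.exp(-1.2))     # ~1.501
print(sum(gt4) / len(gt4), 4 + 1 / 0.6)                   # ~5.667
```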

Merging and Splitting Poisson Processes

1. If you merge K independent Poisson processes with rates λ_1, ..., λ_K, you get one Poisson process of rate λ = Σ_i λ_i.
2. If you look at a random arrival of the merged process, the prob that it belongs to process i is λ_i / λ.
3. When you split a Poisson process of rate λ into k streams by routing each arrival independently to stream i with prob p_i, the k resulting processes are independent Poisson processes, and the i-th has rate p_i λ.

Merging and splitting are very helpful in solving problems with competing exponentials.
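A small simulation sketch of merging and splitting (the rates and routing probability below are arbitrary choices of mine): pool two independent processes and check that the fraction of arrivals from process 1 is λ_1/λ, then thin the merged stream and check that its rate is p·λ.

```python
import random

def pp_arrivals(lam, horizon):
    """Arrival times of a rate-lam Poisson process on [0, horizon]."""
    times, t = [], random.expovariate(lam)
    while t <= horizon:
        times.append(t)
        t += random.expovariate(lam)
    return times

lam1, lam2, horizon = 1.0, 3.0, 10_000.0
merged = [(t, 1) for t in pp_arrivals(lam1, horizon)]
merged += [(t, 2) for t in pp_arrivals(lam2, horizon)]
merged.sort()
frac_from_1 = sum(src == 1 for _, src in merged) / len(merged)
print(frac_from_1, lam1 / (lam1 + lam2))   # both ~0.25

# Splitting: route each merged arrival to a stream with prob p; the
# resulting stream should have rate p * (lam1 + lam2).
p = 0.3
stream_a = [t for t, _ in merged if random.random() < p]
print(len(stream_a) / horizon, p * (lam1 + lam2))  # both ~1.2
```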

Example: Light Bulbs

K identical light bulbs have lifetimes that are iid exponential with parameter λ. What are the expected value and variance of the time until the last one burns out?

Think of a failure as an arrival of a Poisson process. While k bulbs are working, the next failure arrives according to a Poisson process with rate kλ; initially k = K. So E[time to first failure with k bulbs working] = 1/(kλ). After each failure you have one less bulb, i.e. one less merged process:

E[time until all K fail] = Σ_{i=0}^{K−1} E[first failure with K − i bulbs working] = Σ_{i=1}^{K} 1/(iλ).

Since the stretches between successive failures are independent, the variance is similarly Σ_{i=1}^{K} 1/(i²λ²).
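A quick cross-check of these formulas (K, λ, and the harness are my own choices): simulate the maximum of K iid exponential lifetimes and compare its sample mean and variance with the two sums.

```python
import random
import statistics

K, lam, trials = 5, 0.5, 200_000
# Time until the last bulb dies = max of K iid Exponential(lam) lifetimes.
last = [max(random.expovariate(lam) for _ in range(K)) for _ in range(trials)]
print(statistics.mean(last), sum(1 / (i * lam) for i in range(1, K + 1)))
print(statistics.variance(last), sum(1 / (i * lam) ** 2 for i in range(1, K + 1)))
```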

Example: Spam Folders

Valid mail and spam arrive according to independent Poisson processes with rates λ_v = 4 per hour and λ_s = 2 per hour. Your spam filter tags a spam email correctly (sending it to the spam folder) with prob p = 0.8, and mistakenly tags a valid email as spam with prob q = 0.1.

1. What is the prob that an email in your inbox is spam?
2. What is the prob that an email in your spam folder is valid?
3. How often should you check your spam folder if you would like to see one new valid email each time you check?

(One way to work these out with the splitting property is sketched below.)
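The slides leave these three questions as exercises. Below is one way to answer them using the splitting property; the numeric answers are my own computation under that model, not taken from the lecture.

```python
# Assumed model, restated from the slide: independent Poisson streams,
# and the filter's decision splits each stream independently.
lam_v, lam_s = 4.0, 2.0   # valid and spam arrival rates (per hour)
p, q = 0.8, 0.1           # P(tag spam | spam), P(tag spam | valid)

# Split each stream by the filter decision.
inbox_spam, inbox_valid = lam_s * (1 - p), lam_v * (1 - q)    # 0.4, 3.6
folder_spam, folder_valid = lam_s * p, lam_v * q              # 1.6, 0.4

# 1. A random inbox arrival is spam with prob (rate of spam into inbox)/(total inbox rate).
print("P(inbox email is spam) =", inbox_spam / (inbox_spam + inbox_valid))            # 0.1
# 2. Same idea for the spam folder.
print("P(spam-folder email is valid) =", folder_valid / (folder_spam + folder_valid)) # 0.2
# 3. Valid email lands in the spam folder at rate 0.4/hour, so on average one
#    new valid email appears there every 1/0.4 = 2.5 hours.
print("Check every", 1 / folder_valid, "hours")
```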

Competing Poisson Processes

Two independent Poisson processes have rates λ_1 and λ_2. Let Y^i_k be the time of the k-th arrival in process i = 1, 2. Find P(Y^1_n < Y^2_m), the prob that the n-th arrival in process 1 comes before the m-th arrival in process 2.

Merge the two processes to get one of rate λ_1 + λ_2. The probability that a given arrival is from process 1 is p_1 = λ_1 / (λ_1 + λ_2). We need n arrivals routed to process 1 before m are routed to process 2, i.e. at least n successes within the first n + m − 1 trials (a midterm problem!):

P(Y^1_n < Y^2_m) = Σ_{k=0}^{m−1} C(n + m − 1, n + k) p_1^{n+k} (1 − p_1)^{m−1−k}
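A sketch (mine, not from the slides) that checks this closed-form sum against direct simulation of the two arrival times, each a sum of iid exponentials, for small n and m.

```python
import math
import random

def p_formula(n, m, lam1, lam2):
    """Sum over k = 0..m-1 of C(n+m-1, n+k) p1^(n+k) (1-p1)^(m-1-k)."""
    p1 = lam1 / (lam1 + lam2)
    return sum(math.comb(n + m - 1, n + k) * p1 ** (n + k) * (1 - p1) ** (m - 1 - k)
               for k in range(m))

def p_sim(n, m, lam1, lam2, trials=200_000):
    """Monte Carlo estimate of P(Y1_n < Y2_m)."""
    wins = 0
    for _ in range(trials):
        y1 = sum(random.expovariate(lam1) for _ in range(n))
        y2 = sum(random.expovariate(lam2) for _ in range(m))
        wins += y1 < y2
    return wins / trials

n, m, lam1, lam2 = 3, 2, 1.0, 2.0
print(p_formula(n, m, lam1, lam2), p_sim(n, m, lam1, lam2))  # both ~0.111
```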

Tasks on Processors

There are n tasks on n processors, with two stages to each task. The time for each stage is iid exponential with parameter 1. Find the prob that there are k half-done tasks when the first task finishes.

Until the first task finishes, there are exactly n iid competing exponentials at all times. Merge them to get a Poisson process of rate n; an arrival means that a stage has been completed. Now split the merged process n ways: each arrival in the merged process is tagged i with prob 1/n for i = 1, 2, ..., n (in a merged process, the prob a given arrival came from process i is λ_i / λ).

We want k half-done tasks at the moment the first duplicate tag appears, i.e. k + 1 distinct tags in a row and then a duplicate of one of them. The first tag is always new, the tag after i distinct tags is new with prob (n − i)/n, and the duplicate itself occurs with prob (k + 1)/n. So the answer is

(n − 1)/n · (n − 2)/n · ... · (n − k)/n · (k + 1)/n = (n − 1)(n − 2)···(n − k)(k + 1) / n^{k+1}.
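A hedged simulation of the tagging argument (the harness and parameters are mine): draw uniform tags until the first duplicate and compare the empirical distribution of the number of half-done tasks with the product formula above.

```python
import random

def half_done_at_first_finish(n):
    """Stage completions hit a uniform random processor; the first task
    finishes at the first repeated tag."""
    seen = set()
    while True:
        tag = random.randrange(n)
        if tag in seen:
            return len(seen) - 1   # the finished task is no longer half done
        seen.add(tag)

n, trials = 10, 200_000
counts = [half_done_at_first_finish(n) for _ in range(trials)]
for k in range(4):
    prob = 1.0
    for i in range(1, k + 1):      # (n-1)/n * ... * (n-k)/n
        prob *= (n - i) / n
    prob *= (k + 1) / n            # the duplicate hits one of the k+1 seen tags
    print(k, counts.count(k) / trials, prob)
```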

Conditional Distribution of Arrival Times

Given that N(t) = n, what is the joint distribution of the n arrival times Y_1, Y_2, ..., Y_n? Let us call this distribution f(t_1, t_2, ..., t_n). (Try it with n = 1 and Bernoulli... then with n = 1 and Poisson.)

For 0 < t_1 < t_2 < ... < t_n < t and small h_1, h_2, ..., h_n:

f(t_1, ..., t_n) = lim_{h_1,...,h_n → 0} P(t_i ≤ Y_i ≤ t_i + h_i, i = 1, ..., n | N(t) = n) / (h_1 h_2 ··· h_n)

We need exactly one arrival in each [t_i, t_i + h_i] for i = 1, ..., n, and no arrivals anywhere else in [0, t]. Pick the h_i small enough that these intervals are non-overlapping. Thus:

P(t_i ≤ Y_i ≤ t_i + h_i, i = 1, ..., n | N(t) = n)
= λh_1 e^{−λh_1} ··· λh_n e^{−λh_n} · e^{−λ(t − h_1 − h_2 − ··· − h_n)} / (e^{−λt} (λt)^n / n!)
= (n! / t^n) h_1 h_2 ··· h_n,

so f(t_1, ..., t_n) = n! / t^n.

Conditional Distribution of Arrival Times (continued)

Given that N(t) = n, the joint distribution of the n arrival times Y_1, Y_2, ..., Y_n is f(t_1, ..., t_n) = n! / t^n.

Interpretation: given n iid random variables X_1, ..., X_n with common density f_X, order them so that X_(1) < X_(2) < ... < X_(n). Then for x_1 < x_2 < ... < x_n,

f_{X_(1), X_(2), ..., X_(n)}(x_1, ..., x_n) = n! Π_{i=1}^{n} f_X(x_i).

If X_1, ..., X_n are iid uniform over [0, t]:

f_{X_(1), X_(2), ..., X_(n)}(x_1, ..., x_n) = n! / t^n.

So given that n arrivals have occurred, Y_1, ..., Y_n, considered as unordered random variables, are iid uniform over [0, t].
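A small empirical check of this interpretation (the setup is mine): condition a simulated process on N(t) = n by rejection, then compare the unordered arrival times with the moments of a Uniform(0, t) random variable.

```python
import random
import statistics

lam, t, n, keep = 1.0, 5.0, 5, []
while len(keep) < 50_000:
    arrivals, clock = [], random.expovariate(lam)
    while clock <= t:
        arrivals.append(clock)
        clock += random.expovariate(lam)
    if len(arrivals) == n:        # condition on exactly n arrivals in [0, t]
        keep.extend(arrivals)

# Uniform(0, 5) has mean 2.5 and variance 25/12 ~ 2.083.
print(statistics.mean(keep), statistics.pvariance(keep))
```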

Customers in a Bank

Two kinds of customers (A and B) enter a bank. Both arrival processes are Poisson (λ_A = 3 per minute and λ_B = 4 per minute). Type A customers always depart in 1 minute, and type B customers depart in a time geometrically distributed with mean 2.

1. N(t) customers arrive in t minutes. How is N(t) distributed?
2. 10 customers arrived in 10 minutes. What is the prob that 3 are of type A?
3. At t = 0 there are no customers in the bank. What is the prob that k type B customers arrive before the first type A customer?
4. At t = 1 there is exactly 1 type B job. Find the mean and variance of the additional time it must wait until it departs.

Customers in a Bank: Solutions

Same setup: independent Poisson arrival processes with λ_A = 3 and λ_B = 4 per minute; type A customers are always served in 1 minute, and type B customers are served in a time geometrically distributed with mean 2.

1. N(t) customers arrive in t minutes. How is N(t) distributed? By merging: Poisson with parameter 7t.
2. 10 customers arrived in 10 minutes. Prob that 3 are of type A: each arrival is type A independently with prob 3/7, so C(10, 3) (3/7)^3 (4/7)^7.
3. At t = 0 there are no customers in the bank. Prob that k type B customers arrive before the first type A customer: (4/7)^k (3/7).
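A quick numeric evaluation of answers 2 and 3 (my own check, not from the slides):

```python
import math

# P(3 of 10 merged arrivals are type A), binomial with success prob 3/7.
print(math.comb(10, 3) * (3 / 7) ** 3 * (4 / 7) ** 7)
# P(k type B arrivals before the first type A), for a few k.
for k in range(4):
    print(k, (4 / 7) ** k * (3 / 7))
```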

Customers in a Bank: Solutions (continued)

(4) At t = 1 there is exactly 1 type B job. Find the mean and variance of the additional time it must wait until it departs.

Suppose the job arrived at time K + X, with K an integer and X ∈ (0, 1); here K = 0. Given that an arrival occurred in (K, K + 1), the arrival time is uniformly distributed over that interval, so X is uniform in (0, 1). Since the type B service time is memoryless, the time left on the geometric clock until the job departs is again geometric with mean 2; call this time L. The total additional waiting time T is then distributed as X + L, with X and L independent:

E[T] = E[X] + E[L] = 0.5 + 2 = 2.5,
var[T] = var(X) + var(L) = 1/12 + 2 = 25/12

(L is geometric on {1, 2, ...} with p = 1/2, so var(L) = (1 − p)/p² = 2).

What is the distribution of X + L?
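One hedged way to answer the closing question (the derivation and code are my own): with X uniform on (0, 1) and L geometric on {1, 2, ...} with p = 1/2, the sum T = X + L is uniform on each interval (l, l + 1) with total weight P(L = l) = (1/2)^l, so its density at s > 1 is (1/2)^⌊s⌋. A Monte Carlo check of the mean and variance:

```python
import random
import statistics

def sample_T():
    x = random.random()               # X ~ Uniform(0, 1)
    l = 1
    while random.random() < 0.5:      # L ~ Geometric(1/2) on {1, 2, ...}
        l += 1
    return x + l

samples = [sample_T() for _ in range(200_000)]
print(statistics.mean(samples), 2.5)           # E[X] + E[L] = 0.5 + 2
print(statistics.variance(samples), 25 / 12)   # var(X) + var(L) = 1/12 + 2
```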