The story of the film so far... Mathematics for Informatics 4a. Continuous-time Markov processes. Counting processes


Mathematics for Informatics 4a. José Figueroa-O'Farrill. Lecture 19, 28 March 2012.

The story of the film so far...

We have been studying stochastic processes, i.e., systems whose time evolution has an element of chance; in particular, Markov processes, whose future depends only on the present and not on how we got there. Particularly tractable examples are the Markov chains, which are discrete both in time and space: random walks and branching processes. It is interesting to consider also Markov processes {X(t) : t ≥ 0} which are continuous in time. We will only consider those where the X(t) are discrete random variables, e.g., integer-valued. The main examples will be Poisson processes and birth-and-death processes, such as queues.

Continuous-time Markov processes

We will be considering continuous-time stochastic processes {X(t) : t ≥ 0}, where each X(t) is a discrete random variable taking integer values. Such a stochastic process has the Markov property if, for all i, j, k ∈ Z and real numbers 0 ≤ r < s < t,

P(X(t) = j | X(s) = i, X(r) = k) = P(X(t) = j | X(s) = i).

If, in addition, for any s, t ≥ 0, P(X(t+s) = j | X(s) = i) is independent of s, we say {X(t) : t ≥ 0} is (temporally) homogeneous.

Counting processes

A stochastic process {N(t) : t ≥ 0} is a counting process if N(t) represents the total number of events that have occurred up to time t; that is:

- N(t) ∈ {0, 1, 2, ...}
- if s < t, then N(s) ≤ N(t)
- for s < t, N(t) − N(s) is the number of events which have taken place in (s, t].

{N(t) : t ≥ 0} has independent increments if the numbers of events taking place in disjoint time intervals are independent; that is, the number N(t) of events in [0, t] and the number N(t+s) − N(t) of events in (t, t+s] are independent.

{N(t) : t ≥ 0} is (temporally) homogeneous if the distribution of the number of events occurring in any time interval depends only on the length of the interval: N(t_2 + s) − N(t_1 + s) has the same distribution as N(t_2) − N(t_1), for all t_1 < t_2 and s > 0.
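To make the counting-process definitions concrete, here is a minimal Python sketch (the event times and the interval are made-up values, not from the lecture). It computes N(t) from a sorted list of event times and checks that the increment N(t) − N(s) counts the events in (s, t]:

```python
import bisect

def count_events(event_times, t):
    """N(t): the number of events with occurrence time <= t.
    Assumes event_times is sorted in increasing order."""
    return bisect.bisect_right(event_times, t)

# Hypothetical event times of some counting process
events = [0.4, 1.1, 1.7, 3.2, 4.9]
s, t = 1.0, 4.0
# The increment N(t) - N(s) counts the events in (s, t]
print(count_events(events, t) - count_events(events, s))  # -> 3
```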

Poisson processes

Definition. {N(t) : t ≥ 0} is Poisson with rate λ > 0 if:

- N(0) = 0;
- it has independent increments;
- for all s, t ≥ 0 and n ∈ N,

  P(N(s+t) − N(s) = n) = e^{−λt} (λt)^n / n!

Since P(N(s+t) − N(s) = n) does not depend on s, Poisson processes are (temporally) homogeneous. Hence, taking s = 0, one sees that N(t) is Poisson distributed with mean E(N(t)) = λt.

Inter-arrival times

Let {N(t) : t ≥ 0} be a Poisson process with rate λ > 0. Let T_1 be the time of the first event and, for n > 1, let T_n be the time between the (n−1)st and the nth events. {T_1, T_2, ...} is the sequence of inter-arrival times.

P(T_1 > t) = P(N(t) = 0) = e^{−λt}, whence T_1 is exponentially distributed with mean 1/λ. The same is true for the other T_n; e.g.,

P(T_2 > t | T_1 = s) = P(0 events in (s, s+t] | T_1 = s)
                     = P(0 events in (s, s+t])      (indep. incr.)
                     = P(0 events in (0, t])
                     = e^{−λt}.

The {T_n} for n ≥ 1 are i.i.d. exponential random variables with mean 1/λ.

Waiting times

The waiting time until the nth event (n ≥ 1) is S_n = Σ_{i=1}^{n} T_i. Now S_n ≤ t if and only if N(t) ≥ n, whence

P(S_n ≤ t) = P(N(t) ≥ n) = Σ_{j≥n} e^{−λt} (λt)^j / j!

Differentiating with respect to t, we get the probability density function

f_{S_n}(t) = λ e^{−λt} (λt)^{n−1} / (n−1)!,

which is a gamma distribution. Moreover, E(S_n) = Σ_i E(T_i) = n/λ.

Example. Suppose that trains arrive at a station at a Poisson rate λ = 1 per hour.
1. What is the expected time until the 6th train arrives?
2. What is the probability that the elapsed time between the 6th and 7th trains exceeds 1 hour?

For 1, we are asked for E(S_6), where S_6 is the waiting time until the 6th train; hence E(S_6) = 6/λ = 6 hours. For 2, we are asked for P(T_7 > 1) = e^{−λ} = e^{−1} ≈ 37%.
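The train example can be checked by simulation. The sketch below (illustrative, not part of the lecture) builds a Poisson process from i.i.d. exponential inter-arrival times, exactly as in the inter-arrival-times slide, and estimates E(S_6) and P(T_7 > 1) for λ = 1 per hour:

```python
import random

def first_n_arrivals(lam, n, rng):
    """First n event times of a Poisson process of rate lam,
    built from i.i.d. Exp(lam) inter-arrival times."""
    times, t = [], 0.0
    for _ in range(n):
        t += rng.expovariate(lam)  # inter-arrival time T_k ~ Exp(lam)
        times.append(t)
    return times

lam, runs, rng = 1.0, 20000, random.Random(42)
s6_total, late7 = 0.0, 0
for _ in range(runs):
    arr = first_n_arrivals(lam, 7, rng)
    s6_total += arr[5]                # S_6: waiting time until the 6th train
    late7 += (arr[6] - arr[5]) > 1.0  # was the gap T_7 longer than 1 hour?

print(s6_total / runs)  # ~ E(S_6) = 6/lam = 6 hours
print(late7 / runs)     # ~ P(T_7 > 1) = exp(-1) ~ 0.37
```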

Time of occurrence is uniformly distributed

If we know that exactly one event has occurred by time t, how is the time of occurrence distributed? For s ≤ t,

P(T_1 < s | N(t) = 1) = P(T_1 < s and N(t) = 1) / P(N(t) = 1)
                      = P(N(s) = 1 and N(t) − N(s) = 0) / P(N(t) = 1)
                      = P(N(s) = 1) P(N(t) − N(s) = 0) / P(N(t) = 1)   (indep. incr.)
                      = λs e^{−λs} e^{−λ(t−s)} / (λt e^{−λt})
                      = s/t;

i.e., the time of occurrence is uniformly distributed on [0, t].

Example (Stopping game). Events occur according to a Poisson process {N(t) : t ≥ 0} with rate λ. Each time an event occurs we must decide whether or not to stop, with our objective being to stop at the last event to occur prior to some specified time τ. That is, if an event occurs at time t, 0 < t < τ, and we decide to stop, then we lose if there are any events in the interval (t, τ], and win otherwise. If we do not stop when an event occurs, and no additional events occur by time τ, then we also lose. Consider the strategy that stops at the first event that occurs after some specified time s, 0 < s < τ. What should s be to maximise the probability of winning?

We win if and only if there is precisely one event in (s, τ], hence

P(win) = P(N(τ) − N(s) = 1) = λ(τ−s) e^{−λ(τ−s)}.

We differentiate with respect to s to find that the maximum occurs for s = τ − 1/λ, for which P(win) = e^{−1} ≈ 37%. (A numerical check appears in the sketch after this example.)

Example (Bus or walk?). Buses arrive at a stop according to a Poisson process {N(t) : t ≥ 0} with rate λ. Your bus trip home takes b minutes from the time you get on the bus until you reach home. You can also walk home from the bus stop, the trip taking you w minutes. You adopt the following strategy: upon arriving at the bus stop, you wait for at most s minutes and start walking if no bus has arrived by that time. (Otherwise you catch the first bus that comes.) Is there an optimal s which minimises the duration of your trip home?

Let T denote the duration of the trip home from the time you arrive at the bus stop. Conditioning on the mode of transport,

E(T) = E(T | N(s) = 0) P(N(s) = 0) + E(T | N(s) > 0) P(N(s) > 0)
     = (s + w) e^{−λs} + (E(T_1) + b)(1 − e^{−λs}),

where T_1 is the first arrival time. Recall E(T_1) = 1/λ. Therefore,

E(T) = (s + w) e^{−λs} + (1/λ + b)(1 − e^{−λs})
     = (1/λ + b) + (s + w − 1/λ − b) e^{−λs},

whose behaviour (as a function of s) depends on the sign of w − 1/λ − b.

[Plots of E(T) against s for the three cases w − 1/λ − b < 0, w − 1/λ − b = 0 and w − 1/λ − b > 0.]

The minimum is either at s = 0 (walk without waiting) or at s = ∞ (wait for the bus, no matter how long).
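As a check on the stopping-game calculation, the snippet below (an illustrative sketch; the values of λ and τ are made up) evaluates P(win) = λ(τ−s) e^{−λ(τ−s)} on a grid and confirms that the maximiser is s = τ − 1/λ, with maximal win probability e^{−1}:

```python
import math

def p_win(s, lam, tau):
    """Win probability of the 'stop at the first event after s' strategy:
    the probability of exactly one event in (s, tau]."""
    x = lam * (tau - s)
    return x * math.exp(-x)

lam, tau = 2.0, 5.0  # made-up values for illustration
grid = [i * tau / 100000 for i in range(100001)]
best = max(grid, key=lambda s: p_win(s, lam, tau))
print(best, tau - 1 / lam)                  # both ~ 4.5
print(p_win(best, lam, tau), math.exp(-1))  # both ~ 0.3679
```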

Poisson processes as Markov processes I

A Poisson process {N(t) : t ≥ 0} is an example of a continuous-time homogeneous Markov process:

- the states are the natural numbers N = {0, 1, 2, ...};
- the possible transitions are between states n and n+1;
- the transition probabilities are, for 0 ≤ s < t,

P(N(t) = n+1 | N(s) = n) = P(N(t) = n+1 and N(s) = n) / P(N(s) = n)
                         = P(N(s) = n and N(t) − N(s) = 1) / P(N(s) = n)
                         = P(N(s) = n) P(N(t) − N(s) = 1) / P(N(s) = n)   (indep. incr.)
                         = P(N(t−s) = 1)
                         = λ(t−s) e^{−λ(t−s)}.

Poisson processes as Markov processes II

The inter-arrival time T_n is the time the system spends in state n−1 before making a transition to n; the T_n are exponentially distributed with mean 1/λ.

Now let us consider a general continuous-time Markov process, not necessarily Poisson, and let T_n denote the time the system spends in state n before making a transition to a different state. The Markov property says that, for s, t ≥ 0,

P(T_n > s + t | T_n > s) = P(T_n > t);

i.e., P(T_n > s + t | T_n > s) cannot depend on s, since how long the system has been in state n cannot matter. Thus T_n is memoryless, and this means that T_n is exponentially distributed.

Continuous and memoryless ⇒ exponential (I)

Let X be a continuous random variable taking non-negative values. We say that X is memoryless if, for all s, t ≥ 0,

P(X > s + t | X > s) = P(X > t).

This is equivalent to

P(X > s + t and X > s) / P(X > s) = P(X > s + t) / P(X > s) = P(X > t),

or to P(X > s + t) = P(X > t) P(X > s). Letting F(s) = P(X > s), this is equivalent to the functional equation F(s + t) = F(s) F(t).

Continuous and memoryless ⇒ exponential (II)

Theorem. The only (nontrivial) continuous functions F obeying F(s + t) = F(s) F(t) for all s, t ≥ 0 are F(s) = e^{−λs}, for some λ > 0.

Proof.
- F(0) = F(0)^2 implies that either F(0) = 0 or F(0) = 1.
- F(s) = F(s) F(0) implies F(0) = 1, since F is nontrivial.
- For all n ∈ N, F(n+1) = F(n) F(1), whence F(n) = F(1)^n.
- Since 0 < F(1) < 1, we can write F(1) = e^{−λ} for some λ > 0.
- F(k/n)^n = F(k) = F(1)^k = e^{−kλ}, whence F(k/n) = e^{−kλ/n}.
- Two continuous functions agreeing on the (non-negative) rationals agree on the (non-negative) reals, so F(s) = e^{−λs}.
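The functional equation F(s + t) = F(s) F(t) is easy to see numerically for the exponential survival function. A small sketch, with made-up values of λ, s and t:

```python
import math

def survival(t, lam):
    """F(t) = P(X > t) for X ~ Exp(lam)."""
    return math.exp(-lam * t)

lam, s, t = 1.5, 0.7, 2.0  # illustrative values
lhs = survival(s + t, lam) / survival(s, lam)  # P(X > s+t | X > s)
rhs = survival(t, lam)                         # P(X > t): memorylessness
print(lhs, rhs)  # equal, since F(s+t) = F(s) F(t)
```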

Warning. Continuity is essential! There are discrete distributions (e.g., the geometric) which are also memoryless. Of course, the exponential distribution is the continuous limit of the geometric distribution.

We conclude with the observation that a continuous-time Markov chain is a stochastic process such that, each time the system enters state i:

1. the amount of time it spends in that state before making a transition into a different state is exponentially distributed with mean, say, 1/λ_i; and
2. when the process leaves state i, it enters state j with some probability p_ij, where the p_ij satisfy (1) p_ii = 0 for all i, and (2) Σ_j p_ij = 1 for all i.

In a Poisson process all the λ_i are equal. (A simulation sketch of such a chain follows the summary below.)

Summary

- We have introduced continuous-time Markov processes {X(t) : t ≥ 0}, satisfying the Markov property that, for 0 ≤ r < s < t,

  P(X(t) = j | X(s) = i, X(r) = k) = P(X(t) = j | X(s) = i).

- We focussed on homogeneous processes, for which P(X(t+s) = j | X(s) = i) does not depend on s.
- Examples are the counting processes {N(t) : t ≥ 0}, of which an important special case are the Poisson processes, where N(t) is Poisson distributed with mean λt.
- Inter-arrival times in a Poisson process are exponential, waiting times are gamma distributed, and the time of occurrence (given one event) is uniformly distributed.
- Continuous-time Markov chains are defined by (1) a transition matrix [p_ij] (as in the discrete case) and (2) the exponential transition rates λ_i.
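Finally, a minimal simulation sketch of a continuous-time Markov chain as just defined: hold in state i for an Exp(λ_i) time, then jump to j with probability p_ij. The rates and transition matrix below are made-up illustrative values:

```python
import random

def simulate_ctmc(rates, P, state, t_max, rng):
    """Simulate a continuous-time Markov chain: in state i, hold for an
    Exp(rates[i]) time, then jump to state j with probability P[i][j].
    Returns the list of (jump time, new state) pairs up to t_max."""
    t, path = 0.0, [(0.0, state)]
    while True:
        t += rng.expovariate(rates[state])  # exponential holding time in state
        if t > t_max:
            return path
        # choose the next state according to row P[state] (note p_ii = 0)
        state = rng.choices(range(len(rates)), weights=P[state])[0]
        path.append((t, state))

rates = [1.0, 2.0, 0.5]       # lambda_i: rate of leaving state i
P = [[0.0, 0.7, 0.3],         # each row sums to 1, with p_ii = 0
     [0.5, 0.0, 0.5],
     [0.9, 0.1, 0.0]]
print(simulate_ctmc(rates, P, 0, 5.0, random.Random(1)))
```

A Poisson process is the special case in which every λ_i equals λ and each state i jumps to i+1 with probability 1.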