A review of Continuous Time MC STA 624, Spring 2015


Ruriko Yoshida, Dept. of Statistics, University of Kentucky (polytopes.net)

Continuous Time Markov chains

Definition A continuous time stochastic process is a collection {Z_α} of random variables from a common sample space to a state space, where the index α ranges over a subset of the real line (typically [0, ∞)).

Definition A stochastic process {Y_t} has the Markov property if, for all measurable A and all 0 < s < t,
P(Y_t ∈ A | Y_r, 0 ≤ r ≤ s) = P(Y_t ∈ A | Y_s).

Definition A continuous time Markov chain is a continuous time stochastic process with the Markov property that changes value at a countable set of times 0 = T_0 < T_1 < T_2 < ... with probability 1.

Definition For a finite state space continuous time Markov chain, the infinitesimal generator of the chain is an |S| × |S| matrix Q with Q(i, j) = q(i, j) for j ≠ i, and Q(i, i) = −∑_{j ≠ i} q(i, j).

Fact For {X_t} a continuous time Markov chain with jumps at 0 = T_0 < T_1 < ...:
1) the holding times T_i − T_{i−1} are independent random variables, and, conditional on X_{T_{i−1}}, the holding time T_i − T_{i−1} is an exponential random variable whose rate depends only on X_{T_{i−1}};
2) Y_i = X_{T_i} is a discrete time Markov chain, called the underlying discrete time chain, the embedded chain, or the jump chain.
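
The holding-time/jump-chain description above translates directly into a simulation recipe: hold at x for an Exp(q(x)) time, then jump according to the embedded chain. A minimal sketch in Python, using a small hypothetical generator Q chosen only for illustration:

```python
import random

# Hypothetical 3-state generator (rows sum to 0), chosen only for illustration.
Q = [[-3.0,  2.0,  1.0],
     [ 1.0, -1.0,  0.0],
     [ 2.0,  2.0, -4.0]]

def simulate_ctmc(Q, x0, t_end, rng):
    """Hold at x for an Exp(q(x)) time, then jump via the embedded chain,
    whose transition probabilities are q(x, y) / q(x) for y != x."""
    t, x = 0.0, x0
    path = [(0.0, x0)]                       # (jump time, new state) pairs
    while True:
        qx = -Q[x][x]                        # total exit rate q(x)
        if qx == 0.0:                        # absorbing state: stay forever
            break
        t += rng.expovariate(qx)             # exponential holding time
        if t >= t_end:
            break
        weights = [Q[x][y] if y != x else 0.0 for y in range(len(Q))]
        x = rng.choices(range(len(Q)), weights=weights)[0]
        path.append((t, x))
    return path

rng = random.Random(0)
path = simulate_ctmc(Q, 0, 10.0, rng)        # jump times and states on [0, 10]
```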

Definition Recurrence, transience, positive recurrence, and null recurrence are defined the same way as for discrete time Markov chains. A continuous time Markov chain is irreducible if, for all x, y ∈ S, p_t(x, y) > 0 for some t > 0.

Fact A continuous time Markov chain is irreducible, recurrent, or transient if and only if the underlying discrete chain is (respectively) irreducible, recurrent, or transient. However, one of the two chains can be positive recurrent while the other is null recurrent.

Ergodic Theorem for continuous time

Definition Define the return time to a state x in a continuous time Markov chain as R_x = inf{t > T_1 : X_t = x | X_0 = x}.

Theorem For an irreducible positive recurrent continuous time Markov chain on a countable state space, there exists a unique stationary distribution π with π(i) > 0 for all i. Moreover, π is the limiting distribution, and if R is the return time to x starting from x, then E(R) = 1/[q(x)π(x)], where q(x) = −Q(x, x) is the total exit rate from x. If the chain is null recurrent or transient, there is no stationary distribution π.
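
For a two-state chain the identity E(R) = 1/[q(x)π(x)] can be checked by direct simulation. A sketch with illustrative rates a and b (the names and values are ours, not from the slides):

```python
import random

# Two-state chain: rate a from state 0 to 1, rate b from 1 back to 0
# (illustrative rates). Stationary distribution: pi(0) = b / (a + b).
a, b = 2.0, 3.0
pi0 = b / (a + b)
q0 = a                           # exit rate q(0) from state 0

rng = random.Random(42)
n_trials = 200_000
total = 0.0
for _ in range(n_trials):
    # Return time to 0 starting from 0: Exp(a) holding time at 0,
    # then Exp(b) holding time at 1, then the chain is back at 0.
    total += rng.expovariate(a) + rng.expovariate(b)
estimate = total / n_trials

theory = 1.0 / (q0 * pi0)        # = (a + b) / (a * b)
```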

Poisson process

Definition A Poisson process is a process {X_t} satisfying:
- X_0 = 0;
- the number of events during one time interval does not affect the number of events during a disjoint time interval;
- the average rate at which events occur remains constant;
- events occur one at a time.
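
Equivalently, a Poisson process of rate λ has i.i.d. Exp(λ) inter-arrival times, which gives a very short simulation. A sketch (the rate and horizon below are arbitrary illustrative choices):

```python
import random

def poisson_process(rate, t_end, rng):
    """Event times on [0, t_end]: cumulative sums of Exp(rate) gaps."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(rate)   # i.i.d. exponential inter-arrival time
        if t > t_end:
            return times
        times.append(t)

rng = random.Random(1)
lam, horizon = 4.0, 1000.0
events = poisson_process(lam, horizon, rng)
avg_rate = len(events) / horizon     # long-run empirical rate, close to lam
```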

Birth death chain

Definition A birth death chain is a continuous time Markov chain on state space {0, 1, 2, ...} with q(i, i+1) = λ_i for i ≥ 0, q(i, i−1) = µ_i for i ≥ 1, and no other transitions.

Fact A Poisson process is a birth death chain with λ_i = λ and µ_i = 0 for all i.

Definition The Yule process is a cross between the Poisson process and the branching process: each individual present at time t splits into 2 during the time interval (t, t + Δt) with probability λΔt + o(Δt), and µ = 0.

Birth and death process

Theorem A birth death chain is transient if and only if
∑_{n=1}^∞ (µ_1 µ_2 ··· µ_n)/(λ_1 λ_2 ··· λ_n) < ∞.

Birth and death process

Theorem A birth death chain is positive recurrent if and only if
q = ∑_{n=0}^∞ (λ_0 λ_1 ··· λ_{n−1})/(µ_1 µ_2 ··· µ_n) < ∞
(the n = 0 term of the sum is taken to be 1). In that case the stationary distribution π is
π_i = (λ_0 λ_1 ··· λ_{i−1})/(µ_1 µ_2 ··· µ_i) · q^{−1}.
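
For a finite (truncated) birth death chain the formula for π can be evaluated directly, and the result should satisfy detailed balance across each edge, π_i λ_i = π_{i+1} µ_{i+1}. A sketch with arbitrary illustrative rates:

```python
# Truncated birth death chain on {0, ..., 4} with illustrative rates.
lam = [1.0, 2.0, 1.5, 0.5]        # birth rates lambda_0 .. lambda_3
mu = [None, 2.0, 1.0, 3.0, 2.0]   # death rates mu_1 .. mu_4 (index 0 unused)

weights = [1.0]                   # the n = 0 term equals 1
for i in range(1, 5):
    weights.append(weights[-1] * lam[i - 1] / mu[i])
q = sum(weights)
pi = [w / q for w in weights]     # pi_i = (lam_0...lam_{i-1})/(mu_1...mu_i) / q

# Detailed balance across each edge: pi_i * lam_i == pi_{i+1} * mu_{i+1}.
for i in range(4):
    assert abs(pi[i] * lam[i] - pi[i + 1] * mu[i + 1]) < 1e-12
```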

Kolmogorov forward equations

Theorem If q(x, y) is the rate at which a continuous time Markov chain moves from x to y and q(x) = ∑_{y ≠ x} q(x, y), then
d/dt p_t(x, y) = −q(y) p_t(x, y) + ∑_{z ≠ y} q(z, y) p_t(x, z),
where p_t(x, y) = P(X_t = y | X_0 = x) = P(X_{t+s} = y | X_s = x).
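
In matrix form the forward equations read d/dt P_t = P_t Q, which can be integrated numerically. A rough Euler sketch for a small hypothetical generator (the step size and horizon are arbitrary choices); each row of P_t should remain a probability distribution:

```python
# Euler integration of the forward equations d/dt P_t = P_t Q for a
# hypothetical 3-state generator (illustrative rates only).
Q = [[-2.0,  1.0,  1.0],
     [ 1.0, -3.0,  2.0],
     [ 1.0,  1.0, -2.0]]
n = len(Q)

P = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]  # P_0 = I
dt, steps = 1e-3, 5000                                              # up to t = 5
for _ in range(steps):
    # Entrywise: d/dt p_t(x, y) = -q(y) p_t(x, y) + sum_{z != y} q(z, y) p_t(x, z),
    # which is exactly the (x, y) entry of the matrix product P Q.
    P = [[P[x][y] + dt * sum(P[x][z] * Q[z][y] for z in range(n))
          for y in range(n)] for x in range(n)]

row_sums = [sum(row) for row in P]    # each row stays a distribution
```

By t = 5 every row of P is close to one and the same distribution, consistent with the ergodic theorem above.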

Kolmogorov backward equations

Theorem If q(x, y) is the rate at which a continuous time Markov chain moves from x to y and q(x) = ∑_{y ≠ x} q(x, y), then
d/dt p_t(x, y) = −q(x) p_t(x, y) + ∑_{z ≠ x} q(x, z) p_t(z, y),
where p_t(x, y) = P(X_t = y | X_0 = x) = P(X_{t+s} = y | X_s = x).

Balance equations

Definition If q(x, y) is the rate at which a continuous time Markov chain moves from x to y, then the balance equations are, for all x ∈ S:
π(x) ∑_{y ≠ x} q(x, y) = ∑_{y ≠ x} π(y) q(y, x).
The detailed balance equations (a.k.a. reversibility) are, for all x, y ∈ S:
π(x) q(x, y) = π(y) q(y, x).
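
One standard way to solve the balance equations numerically is uniformization: P = I + Q/q_max is a stochastic matrix with the same stationary distribution as the chain, so power iteration on P converges to π. A sketch for a hypothetical 3-state generator:

```python
# Solve pi Q = 0 via power iteration on the uniformized chain P = I + Q / q_max
# (the generator below is hypothetical, for illustration only).
Q = [[-2.0,  1.0,  1.0],
     [ 1.0, -3.0,  2.0],
     [ 1.0,  1.0, -2.0]]
n = len(Q)
q_max = max(-Q[i][i] for i in range(n))
P = [[(1.0 if i == j else 0.0) + Q[i][j] / q_max for j in range(n)]
     for i in range(n)]

pi = [1.0 / n] * n
for _ in range(1000):                 # pi <- pi P until it stabilizes
    pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

# Balance equations: pi(x) sum_{y != x} q(x, y) == sum_{y != x} pi(y) q(y, x).
for x in range(n):
    lhs = pi[x] * sum(Q[x][y] for y in range(n) if y != x)
    rhs = sum(pi[y] * Q[y][x] for y in range(n) if y != x)
    assert abs(lhs - rhs) < 1e-9
```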

M/M/1 queue

This is a birth and death process with λ_n = λ and µ_n = µ.
- ∑_{n=1}^∞ (µ/λ)^n < ∞ iff the chain is transient; thus it is transient iff µ < λ.
- ∑_{n=0}^∞ (λ/µ)^n < ∞ iff the chain is positive recurrent; thus it is positive recurrent iff λ < µ.
- If λ < µ, the stationary distribution is π(n) = (1 − ρ)ρ^n, where ρ = λ/µ.
- If λ < µ, the expected length of the queue is λ/(µ − λ).
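
The formula for the expected queue length can be sanity-checked by simulating the M/M/1 birth death chain and time-averaging the state. A sketch (the rates and horizon are arbitrary, chosen with λ < µ):

```python
import random

# Simulate the M/M/1 queue-length chain with illustrative rates lam < mu.
lam, mu = 1.0, 2.0
rng = random.Random(7)

t, n, t_end = 0.0, 0, 50_000.0
area = 0.0                                    # integral of queue length dt
while t < t_end:
    rate = lam + (mu if n > 0 else 0.0)       # total exit rate from state n
    dt = rng.expovariate(rate)
    area += n * min(dt, t_end - t)            # accumulate time-weighted length
    t += dt
    if rng.random() < lam / rate:
        n += 1                                # arrival (birth)
    else:
        n -= 1                                # departure (death)

avg_len = area / t_end                        # should approach lam / (mu - lam)
```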

M/M/k queue

This is a birth and death process with λ_n = λ and
µ_n = kµ if n ≥ k, and µ_n = nµ if n < k.
- ∑_{n=1}^∞ C (kµ/λ)^n < ∞ iff the chain is transient; thus it is transient iff kµ < λ.
- ∑_{n=0}^∞ C (λ/(kµ))^n < ∞ iff the chain is positive recurrent; thus it is positive recurrent iff λ < kµ.
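
The positive-recurrence series for a birth death chain can be evaluated numerically with M/M/k rates. A sketch (the particular λ, µ, k values are arbitrary) showing that the partial sums stabilize when λ < kµ and blow up when λ > kµ:

```python
# Partial sums of q = sum_n (lam_0...lam_{n-1}) / (mu_1...mu_n) for M/M/k,
# where lam_n = lam and mu_n = min(n, k) * mu (illustrative parameters).
def pr_partial_sum(lam, mu, k, n_max):
    total, term = 1.0, 1.0            # the n = 0 term equals 1
    for n in range(1, n_max + 1):
        term *= lam / (min(n, k) * mu)
        total += term
    return total

s200 = pr_partial_sum(2.0, 1.0, 3, 200)   # lam < k*mu: series converges...
s400 = pr_partial_sum(2.0, 1.0, 3, 400)   # ...partial sums barely move
blowup = pr_partial_sum(5.0, 1.0, 3, 200) # lam > k*mu: partial sums explode
```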

M/M/∞ queue

The M/M/∞ queue is never transient and is always positive recurrent. The stationary distribution is
π(n) = e^{−λ/µ} (λ/µ)^n / n!.
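
Since λ_n = λ and µ_n = nµ here, the stationary distribution is Poisson with mean λ/µ, and it can be verified against detailed balance. A sketch with illustrative rates:

```python
import math

# M/M/infinity with illustrative rates: stationary law is Poisson(lam / mu).
lam, mu = 3.0, 1.5
r = lam / mu

pi = [math.exp(-r) * r**n / math.factorial(n) for n in range(60)]

# Detailed balance across each edge: pi(n) * lam == pi(n+1) * (n+1) * mu.
for n in range(59):
    assert abs(pi[n] * lam - pi[n + 1] * (n + 1) * mu) < 1e-12

total = sum(pi)        # equals 1 up to a tiny truncation error
```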