Birth and Death Processes: Limiting Behaviour, Linear Growth with Immigration, Absorption


DTU Informatics, 247 Stochastic Processes, Lecture 6, October 27

Today: Limiting behaviour of birth and death processes; birth and death processes with absorbing states; finite state continuous time Markov chains. Next week: renewal phenomena. Two to three weeks from now: phase type distributions.

Limiting Behaviour for Birth and Death Processes

For an irreducible birth and death process we have: if $\pi_j > 0$, then
\[ \lim_{t\to\infty} P_{ij}(t) = \pi_j, \qquad \pi P(t) = \pi \quad \text{or} \quad \pi A = 0 . \]
We can always solve recursively for $\pi$:
\[ \mu_{n+1}\pi_{n+1} = \lambda_n \pi_n, \qquad \pi_{n+1} = \frac{\lambda_n}{\mu_{n+1}}\,\pi_n = \pi_0 \prod_{i=0}^{n} \frac{\lambda_i}{\mu_{i+1}}, \qquad \pi_0 = \left[\, 1 + \sum_{n=1}^{\infty} \prod_{i=0}^{n-1} \frac{\lambda_i}{\mu_{i+1}} \right]^{-1}. \]

Linear Growth with Immigration

\[ \lambda_n = n\lambda + a, \qquad \mu_n = n\mu . \]
With $\theta_0 = 1$ and $\theta_k = \prod_{i=0}^{k-1} \lambda_i/\mu_{i+1}$ we get
\[ \theta_k = \frac{a(a+\lambda)(a+2\lambda)\cdots(a+(k-1)\lambda)}{k!\,\mu^k}, \]
and $\pi_k = \theta_k / \sum_j \theta_j$. Since $\theta_{k+1}/\theta_k = (a+k\lambda)/((k+1)\mu) \to \lambda/\mu$, the sum converges and the limiting distribution exists precisely when $\lambda < \mu$.
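The recursion for $\pi$ is easy to evaluate numerically. The sketch below (not part of the lecture; the rate values `lam`, `mu`, `a` are illustrative assumptions) truncates the state space, builds the unnormalised $\theta_n$, and normalises:

```python
# Minimal sketch: limiting distribution of a birth-death process via the
# recursion theta_{n} = theta_{n-1} * lambda_{n-1} / mu_{n}, truncated at n_max.
# Rates below are an illustrative choice for linear growth with immigration.

def stationary_distribution(birth, death, n_max):
    """Return pi_0, ..., pi_{n_max}, normalised over the truncated state space."""
    theta = [1.0]                       # theta_0 = 1
    for n in range(1, n_max + 1):
        theta.append(theta[-1] * birth(n - 1) / death(n))
    total = sum(theta)
    return [t / total for t in theta]

lam, mu, a = 1.0, 2.0, 0.5              # lambda < mu, so the limit exists
pi = stationary_distribution(lambda n: n * lam + a,   # lambda_n = n*lam + a
                             lambda n: n * mu,        # mu_n = n*mu
                             n_max=200)
# Detailed balance should hold: lambda_n * pi_n == mu_{n+1} * pi_{n+1}.
```

The truncation at `n_max=200` is harmless here because $\theta_n$ decays geometrically at rate $\lambda/\mu = 1/2$.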

Logistic Model

Birth and death rates per individual are $\alpha(M - X(t))$ and $\beta(X(t) - N)$, so on the state space $\{N, N+1, \dots, M\}$
\[ \lambda_n = \alpha n (M - n), \qquad \mu_n = \beta n (n - N) . \]
With $\theta_{N+m} = \prod_{i=0}^{m-1} \lambda_{N+i}/\mu_{N+i+1}$, the stationary distribution becomes
\[ \pi_{N+m} = c\,\frac{N}{N+m} \binom{M-N}{m} \left( \frac{\alpha}{\beta} \right)^{m}, \qquad 0 \le m \le M-N, \]
where $c$ is the normalising constant.

Probability of Absorption

Theorem (Theorem 6, page 39). Consider a birth and death process with birth and death parameters $\lambda_n, \mu_n$, $n \ge 1$, where $\lambda_0 = 0$ so that $0$ is an absorbing state. The probability of absorption into state $0$ from the initial state $m$ is
\[ u_m = \begin{cases} \dfrac{\sum_{i=m}^{\infty} \rho_i}{1 + \sum_{i=1}^{\infty} \rho_i} & \text{if } \sum_{i=1}^{\infty} \rho_i < \infty \\[2ex] 1 & \text{if } \sum_{i=1}^{\infty} \rho_i = \infty \end{cases} \]
and the mean time to absorption is
\[ w_m = \begin{cases} \infty & \text{if } \sum_{i=1}^{\infty} \rho_i < \infty \\[2ex] \displaystyle\sum_{j=1}^{\infty} \frac{1}{\lambda_j \rho_j} + \sum_{k=1}^{m-1} \rho_k \sum_{j=k+1}^{\infty} \frac{1}{\lambda_j \rho_j} & \text{if } \sum_{i=1}^{\infty} \rho_i = \infty \end{cases} \]
where $\rho_0 = 1$ and $\rho_i = \prod_{j=1}^{i} \mu_j/\lambda_j$.

Derivation of Absorption Probabilities

First step analysis, with the embedded Markov chain
\[ P = \begin{pmatrix} 1 & 0 & 0 & 0 & \cdots \\ \frac{\mu_1}{\lambda_1+\mu_1} & 0 & \frac{\lambda_1}{\lambda_1+\mu_1} & 0 & \cdots \\ 0 & \frac{\mu_2}{\lambda_2+\mu_2} & 0 & \frac{\lambda_2}{\lambda_2+\mu_2} & \cdots \\ \vdots & & \ddots & & \ddots \end{pmatrix} \]
Let $u_i$ be the probability of absorption when starting in state $i$:
\[ u_i = \frac{\mu_i}{\lambda_i+\mu_i}\,u_{i-1} + \frac{\lambda_i}{\lambda_i+\mu_i}\,u_{i+1}, \quad i \ge 1, \qquad u_0 = 1, \]
so that
\[ u_{i+1} - u_i = \frac{\mu_i}{\lambda_i}\,(u_i - u_{i-1}), \quad i \ge 1 . \]

Derivation of Absorption Probabilities (cont.)

With $v_i = u_{i+1} - u_i$ we get $v_i = (\mu_i/\lambda_i)\,v_{i-1}$, leading to
\[ v_i = u_{i+1} - u_i = \rho_i v_0 \qquad \text{with } \rho_i = \prod_{j=1}^{i} \frac{\mu_j}{\lambda_j} . \]
If $\sum_i \rho_i = \infty$, boundedness of the $u_i$ forces $v_0 = 0$, and hence $u_m = 1$ for all $m$. Otherwise, using $\lim_{n\to\infty} u_n = 0$ and $u_0 = 1$,
\[ u_m = \frac{\sum_{i=m}^{\infty} \rho_i}{1 + \sum_{i=1}^{\infty} \rho_i}, \qquad m > 0 . \]
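As a numerical check of the theorem's first formula, the sketch below (my illustration, not from the lecture) uses the linear-growth rates $\lambda_n = n\lambda$, $\mu_n = n\mu$ with $\lambda > \mu$, for which $\rho_i = (\mu/\lambda)^i$ is summable and the answer is known in closed form, $u_m = (\mu/\lambda)^m$:

```python
# Sketch: u_m = (sum_{i>=m} rho_i) / (1 + sum_{i>=1} rho_i) with
# rho_i = prod_{j<=i} mu_j/lambda_j, truncating the infinite sums.

def absorption_probability(birth, death, m, n_terms=2000):
    rho, rhos = 1.0, []
    for i in range(1, n_terms + 1):
        rho *= death(i) / birth(i)      # rho_i
        rhos.append(rho)
    tail = sum(rhos[m - 1:])            # sum_{i=m}^inf rho_i (truncated)
    return tail / (1.0 + sum(rhos))

lam, mu = 2.0, 1.0                      # lam > mu: absorption is not certain
u3 = absorption_probability(lambda n: n * lam, lambda n: n * mu, m=3)
# theory predicts u_3 = (mu/lam)**3 = 0.125
```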

Derivation of Mean Time to Absorption

Let $w_i$ be the mean time to absorption starting in state $i$, with $w_0 = 0$:
\[ w_i = \frac{1}{\lambda_i+\mu_i} + \frac{\mu_i}{\lambda_i+\mu_i}\,w_{i-1} + \frac{\lambda_i}{\lambda_i+\mu_i}\,w_{i+1}, \quad i \ge 1 . \]
With $z_i = w_i - w_{i+1}$ (negative!) this becomes
\[ z_i = \frac{1}{\lambda_i} + \frac{\mu_i}{\lambda_i}\,z_{i-1}, \qquad \text{so} \qquad z_i = \rho_i \left( \sum_{j=1}^{i} \frac{1}{\lambda_j \rho_j} + z_0 \right), \quad z_0 = -w_1 . \]

Derivation of Mean Time to Absorption (cont.)

The boundary condition $\lim_{i\to\infty} z_i/\rho_i = 0$ gives $w_1 = \sum_{j=1}^{\infty} 1/(\lambda_j \rho_j)$. Inserting $z_k = w_k - w_{k+1}$ and summing,
\[ w_m = w_1 - \sum_{k=1}^{m-1} z_k = \sum_{j=1}^{\infty} \frac{1}{\lambda_j \rho_j} + \sum_{k=1}^{m-1} \rho_k \sum_{j=k+1}^{\infty} \frac{1}{\lambda_j \rho_j} . \]

Population Processes: probability of absorption

\[ \lambda_n = n\lambda, \qquad \mu_n = n\mu, \qquad X(0) = m, \qquad E[X(t)] = m\,e^{(\lambda-\mu)t} . \]
Here $\rho_i = (\mu/\lambda)^i$, so
\[ u_m = \frac{\sum_{i=m}^{\infty} \rho_i}{\sum_{i=0}^{\infty} \rho_i} = \begin{cases} (\mu/\lambda)^m & \text{when } \lambda > \mu \\ 1 & \text{when } \lambda \le \mu \end{cases} \]

Population Processes: mean time to absorption

With $m = 1$ and $\lambda < \mu$:
\[ w_1 = \sum_{i=1}^{\infty} \frac{1}{\lambda_i \rho_i} = \frac{1}{\lambda} \sum_{i=1}^{\infty} \frac{1}{i} \left( \frac{\lambda}{\mu} \right)^{i} = \frac{1}{\lambda} \int_0^{\lambda/\mu} \sum_{i=1}^{\infty} x^{i-1} \, dx = \frac{1}{\lambda} \int_0^{\lambda/\mu} \frac{dx}{1-x} = \frac{1}{\lambda} \log \frac{\mu}{\mu - \lambda} . \]
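The closed form for $w_1$ can be checked against a direct truncation of the series $\sum_j 1/(\lambda_j \rho_j)$. A minimal sketch, with illustrative rates $\lambda = 1$, $\mu = 2$ (so the exact value is $\log 2$):

```python
import math

# Sketch: compare the truncated series sum_{j>=1} 1/(lambda_j * rho_j) with the
# closed form (1/lam) * log(mu / (mu - lam)) for lambda_n = n*lam, mu_n = n*mu.
lam, mu = 1.0, 2.0                      # lam < mu, absorption certain

term, series = 1.0, 0.0
for j in range(1, 5000):
    term *= lam / mu                    # term = (lam/mu)^j = 1/rho_j
    series += term / (j * lam)          # 1/(lambda_j * rho_j), lambda_j = j*lam

closed_form = math.log(mu / (mu - lam)) / lam
```

The geometric factor $(\lambda/\mu)^j$ makes the truncation error negligible long before 5000 terms.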

Finite Continuous Time Markov Chains

\[ P_{ij}(t) = P\{X(t+s) = j \mid X(s) = i\} \]
(a) $P_{ij}(t) \ge 0$
(b) $\sum_{j=0}^{N} P_{ij}(t) = 1$
(c) $P_{ik}(s+t) = \sum_{j=0}^{N} P_{ij}(s)\,P_{jk}(t)$
(d) $\lim_{t\to 0^+} P_{ij}(t) = \begin{cases} 1, & i = j \\ 0, & i \ne j \end{cases}$

(c) is the Chapman-Kolmogorov equations; in matrix form $P(t+s) = P(t)P(s)$, with
\[ P'(t) = P(t)A \quad \text{(forward equations)}, \qquad P'(t) = A P(t) \quad \text{(backward equations)}, \]
\[ P(t) = e^{At} = \sum_{n=0}^{\infty} \frac{A^n t^n}{n!} = I + \sum_{n=1}^{\infty} \frac{A^n t^n}{n!} . \]

Stationary Distribution

$\pi A = 0$ with $\pi = (\pi_0, \pi_1, \dots, \pi_N)$; elementwise $\pi_j q_j = \sum_{i \ne j} \pi_i q_{ij}$, where $q_j = \sum_{k \ne j} q_{jk}$ and
\[ A = \begin{pmatrix} -q_0 & q_{01} & \cdots & q_{0N} \\ q_{10} & -q_1 & \cdots & q_{1N} \\ \vdots & & \ddots & \vdots \\ q_{N0} & q_{N1} & \cdots & -q_N \end{pmatrix} \]

Infinitesimal Description

\[ P\{X(t+h) = j \mid X(t) = i\} = q_{ij} h + o(h) \quad \text{for } i \ne j, \qquad P\{X(t+h) = i \mid X(t) = i\} = 1 - q_i h + o(h) . \]

Sojourn Description

1. The embedded Markov chain of state sequences $\xi_i$ has one step transition probabilities $p_{ij} = q_{ij}/q_i$.
2. Successive sojourn times $S_{\xi_i}$ are exponentially distributed with mean $1/q_{\xi_i}$.
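Both descriptions can be exercised numerically: $P(t)$ via the truncated power series for $e^{At}$, and $\pi$ by solving $\pi A = 0$ with $\sum_j \pi_j = 1$. A sketch under an assumed 3-state generator (the matrix `A` below is a toy example, not from the lecture):

```python
import numpy as np

# Toy generator: rows sum to 0, off-diagonal entries are the rates q_ij.
A = np.array([[-3.0,  2.0,  1.0],
              [ 1.0, -2.0,  1.0],
              [ 2.0,  2.0, -4.0]])

def transition_matrix(A, t, terms=60):
    """P(t) = I + sum_{n>=1} A^n t^n / n!, built incrementally."""
    P, term = np.eye(len(A)), np.eye(len(A))
    for n in range(1, terms):
        term = term @ A * (t / n)       # term becomes A^n t^n / n!
        P += term
    return P

def stationary(A):
    """Solve pi A = 0 together with sum(pi) = 1 as a least-squares system."""
    n = len(A)
    M = np.vstack([A.T, np.ones(n)])
    b = np.concatenate([np.zeros(n), [1.0]])
    pi, *_ = np.linalg.lstsq(M, b, rcond=None)
    return pi

P1 = transition_matrix(A, 1.0)
pi = stationary(A)
# Each row of P(t) is a probability vector, and pi P(t) = pi for every t.
```

The plain series is fine for this small, mild generator; for stiff or large generators one would use a scaling-and-squaring matrix exponential instead.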

Two State Markov Chain

\[ A = \begin{pmatrix} -\alpha_1 & \alpha_1 \\ \alpha_2 & -\alpha_2 \end{pmatrix}, \qquad A^2 = -(\alpha_1 + \alpha_2)\,A . \]
Thus $A^n = \left[-(\alpha_1+\alpha_2)\right]^{n-1} A$, and
\[ P(t) = I + \sum_{n=1}^{\infty} \frac{t^n A^n}{n!} = I - \frac{A}{\alpha_1+\alpha_2} \sum_{n=1}^{\infty} \frac{\left[-(\alpha_1+\alpha_2)t\right]^n}{n!} = I + \frac{A}{\alpha_1+\alpha_2} - \frac{A}{\alpha_1+\alpha_2}\,e^{-(\alpha_1+\alpha_2)t} . \]

Summary of Most Important Results

\[ P(t) = e^{At}, \qquad \pi A = 0 . \]
Under an assumption of irreducibility, $P(t) \to \Pi$ for $t \to \infty$, where every row of $\Pi$ equals $\pi$.

Additional Reading

Kai Lai Chung: Markov Chains with Stationary Transition Probabilities.
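The two-state closed form is compact enough to code directly. A sketch (the rate values `a1`, `a2` and the time point are illustrative assumptions):

```python
import math

# Closed form from the slide: for A = [[-a1, a1], [a2, -a2]],
# P(t) = I + (1 - exp(-(a1+a2) t)) / (a1 + a2) * A.
def two_state_P(a1, a2, t):
    f = (1.0 - math.exp(-(a1 + a2) * t)) / (a1 + a2)
    return [[1.0 - a1 * f, a1 * f],
            [a2 * f, 1.0 - a2 * f]]

P = two_state_P(a1=1.0, a2=3.0, t=50.0)
# For large t both rows approach pi = (a2, a1)/(a1 + a2) = (0.75, 0.25).
```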