6 Continuous-Time Birth and Death Chains

1 6 Continuous-Time Birth and Death Chains

Angela Peace, Biomathematics II, MATH 5355, Spring 2017.
Lecture notes follow: Allen, Linda J. S., An Introduction to Stochastic Processes with Applications to Biology, CRC Press.

2 General Birth and Death Processes

Let $\Delta X(\Delta t)$ denote the change in the process from $t$ to $t + \Delta t$:
$$\Delta X(\Delta t) = X(t + \Delta t) - X(t)$$
Standard notation for the birth and death rates when $X(t) = i$: $\lambda_i$ = birth rate and $\mu_i$ = death rate.
The CTMC $\{X(t) : t \in [0, \infty)\}$ may have a finite or infinite state space: $\{0, 1, \ldots, N\}$ or $\{0, 1, 2, \ldots\}$.

3 General Birth and Death Processes

The transition probabilities:
$$p_{i+j,i}(\Delta t) = \mathrm{Prob}\{\Delta X(\Delta t) = j \mid X(t) = i\} =
\begin{cases}
\lambda_i \Delta t + o(\Delta t), & j = 1 \\
\mu_i \Delta t + o(\Delta t), & j = -1 \\
1 - (\lambda_i + \mu_i)\Delta t + o(\Delta t), & j = 0 \\
o(\Delta t), & j \neq -1, 0, 1
\end{cases}$$
where $\lambda_i, \mu_i \geq 0$ for $i = 0, 1, 2, \ldots$ and $\mu_0 = 0$. If $\lambda_0 > 0$ then there is immigration.
$\Delta t$ is sufficiently small so that at most one change in state can occur in the time interval: a birth $i \to i + 1$ or a death $i \to i - 1$.

4 General Birth and Death Processes

The forward Kolmogorov differential equations can be derived directly from the transition probabilities. Consider the transition probability $p_{ji}(t + \Delta t)$:
$$\begin{aligned}
p_{ji}(t + \Delta t) &= p_{j-1,i}(t)[\lambda_{j-1}\Delta t + o(\Delta t)] + p_{j+1,i}(t)[\mu_{j+1}\Delta t + o(\Delta t)] \\
&\quad + p_{ji}(t)[1 - (\lambda_j + \mu_j)\Delta t + o(\Delta t)] + \sum_{k \neq -1,0,1} p_{j+k,i}(t)\, o(\Delta t) \\
&= p_{j-1,i}(t)\lambda_{j-1}\Delta t + p_{j+1,i}(t)\mu_{j+1}\Delta t + p_{ji}(t)[1 - (\lambda_j + \mu_j)\Delta t] + o(\Delta t)
\end{aligned}$$
If $j = 0$ then
$$p_{0i}(t + \Delta t) = p_{1i}(t)\mu_1 \Delta t + p_{0i}(t)[1 - \lambda_0 \Delta t] + o(\Delta t)$$
If $j = N$ and $\lambda_N = 0$, with $p_{ki}(t) = 0$ for $k > N$, then
$$p_{Ni}(t + \Delta t) = p_{N-1,i}(t)\lambda_{N-1}\Delta t + p_{Ni}(t)[1 - \mu_N \Delta t] + o(\Delta t)$$

5 General Birth and Death Processes

Question: Given the transition probabilities of the finite general birth and death process
$$\begin{aligned}
p_{ji}(t + \Delta t) &= p_{j-1,i}(t)\lambda_{j-1}\Delta t + p_{j+1,i}(t)\mu_{j+1}\Delta t + p_{ji}(t)[1 - (\lambda_j + \mu_j)\Delta t] + o(\Delta t) \\
p_{0i}(t + \Delta t) &= p_{1i}(t)\mu_1 \Delta t + p_{0i}(t)[1 - \lambda_0 \Delta t] + o(\Delta t) \\
p_{Ni}(t + \Delta t) &= p_{N-1,i}(t)\lambda_{N-1}\Delta t + p_{Ni}(t)[1 - \mu_N \Delta t] + o(\Delta t)
\end{aligned}$$
1. Derive the forward Kolmogorov differential equations for $dp_{ji}(t)/dt$, $dp_{0i}(t)/dt$, and $dp_{Ni}(t)/dt$.
2. What is the generator matrix $Q$?
3. What is the transition matrix $T$ of the embedded DTMC?
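
The Python sketch below is not part of the lecture notes; it shows one possible answer to items 2 and 3 for a finite chain, using the column convention $dp/dt = Qp$ used in these notes. The rates at the bottom are hypothetical.

```python
import numpy as np

def bd_generator(lam, mu):
    """Generator Q of a finite birth-death CTMC, column convention dp/dt = Q p.
    lam[i], mu[i] are the birth/death rates in state i; mu[0] = 0 and lam[N] = 0."""
    N = len(lam) - 1
    Q = np.zeros((N + 1, N + 1))
    for i in range(N + 1):
        Q[i, i] = -(lam[i] + mu[i])          # rate of leaving state i
        if i < N:
            Q[i + 1, i] = lam[i]             # birth i -> i+1
        if i > 0:
            Q[i - 1, i] = mu[i]              # death i -> i-1
    return Q

def embedded_dtmc(lam, mu):
    """Transition matrix T of the embedded (jump) chain, same column convention."""
    N = len(lam) - 1
    T = np.zeros((N + 1, N + 1))
    for i in range(N + 1):
        total = lam[i] + mu[i]
        if total == 0:                       # absorbing state
            T[i, i] = 1.0
        else:
            if i < N:
                T[i + 1, i] = lam[i] / total
            if i > 0:
                T[i - 1, i] = mu[i] / total
    return T

# Hypothetical rates: N = 4, lambda_i = 1 for i < N, mu_i = 2*i
lam = [1.0, 1.0, 1.0, 1.0, 0.0]
mu  = [0.0, 2.0, 4.0, 6.0, 8.0]
Q = bd_generator(lam, mu)
print(Q.sum(axis=0))    # columns of Q sum to zero
print(embedded_dtmc(lam, mu))
```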

6 Stationary Probability Distribution π for general CTMC

A stationary probability distribution $\pi = (\pi_0, \pi_1, \ldots)^T$ of a CTMC with generator matrix $Q$ satisfies
$$Q\pi = 0, \qquad \sum_{i=0}^{\infty} \pi_i = 1, \qquad \pi_i \geq 0.$$
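
As a small illustration of these defining conditions (my addition, with a made-up 3-state generator), the sketch below solves $Q\pi = 0$ together with the normalization constraint by least squares. For a birth and death chain the result can be checked against Theorem 6.1 on the next slide.

```python
import numpy as np

def stationary_distribution(Q):
    """Solve Q pi = 0 with sum(pi) = 1 for a finite-state CTMC
    (column convention dp/dt = Q p, so columns of Q sum to zero)."""
    n = Q.shape[0]
    A = np.vstack([Q, np.ones(n)])      # append the normalization row
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

# Hypothetical 3-state birth-death chain: lambda = (1, 1, 0), mu = (0, 2, 2)
Q = np.array([[-1.0,  2.0,  0.0],
              [ 1.0, -3.0,  2.0],
              [ 0.0,  1.0, -2.0]])
pi = stationary_distribution(Q)
print(pi, Q @ pi)                       # Q pi should be numerically zero
```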

7 Stationary Probability Distribution

Theorem 6.1 (π for birth and death processes)
Let $\{X(t) : t \in [0, \infty)\}$ be a general birth and death chain.
If the state space is infinite, $\{0, 1, 2, \ldots\}$, a unique positive stationary probability distribution exists iff $\mu_i > 0$ and $\lambda_{i-1} > 0$ for $i = 1, 2, \ldots$, and
$$\sum_{i=1}^{\infty} \frac{\lambda_0 \lambda_1 \cdots \lambda_{i-1}}{\mu_1 \mu_2 \cdots \mu_i} < \infty.$$
For $i = 1, 2, \ldots$, the stationary probability distribution equals
$$\pi_i = \frac{\lambda_0 \lambda_1 \cdots \lambda_{i-1}}{\mu_1 \mu_2 \cdots \mu_i}\, \pi_0 \qquad \text{and} \qquad \pi_0 = \frac{1}{1 + \displaystyle\sum_{i=1}^{\infty} \frac{\lambda_0 \lambda_1 \cdots \lambda_{i-1}}{\mu_1 \mu_2 \cdots \mu_i}}.$$
If the state space is finite, $\{0, 1, 2, \ldots, N\}$, a unique positive stationary probability distribution exists iff $\mu_i > 0$ and $\lambda_{i-1} > 0$ for $i = 1, 2, \ldots, N$. Here the above summations extend over $i = 1, 2, \ldots, N$.
If $\lambda_0 = 0$ and $\mu_i > 0$ for $i \geq 1$, then $\pi = (1, 0, 0, \ldots)^T$.

8 Stationary Probability Distribution

Example: Suppose a continuous-time birth and death Markov chain has birth and death rates $\lambda_i = b$ and $\mu_i = id$ for $i = 0, 1, 2, \ldots$. Show that the unique stationary probability distribution is
$$\pi_i = \frac{(b/d)^i e^{-b/d}}{i!}$$
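
A quick numerical sanity check of this example (not part of the notes; the values of $b$ and $d$ are arbitrary): build $\pi$ from the product formula of Theorem 6.1 on a truncated state space and compare with the claimed Poisson($b/d$) probabilities.

```python
import math

b, d, N = 2.0, 1.5, 60                       # N large enough that the tail is negligible

ratios = [1.0]                               # (lambda_0...lambda_{i-1})/(mu_1...mu_i); the i = 0 term is 1
for i in range(1, N + 1):
    ratios.append(ratios[-1] * b / (i * d))  # multiply by lambda_{i-1}/mu_i = b/(i d)

total = sum(ratios)
pi = [r / total for r in ratios]             # pi_0 = 1/total, pi_i = ratios[i] * pi_0

for i in range(5):
    poisson = (b / d) ** i * math.exp(-b / d) / math.factorial(i)
    print(i, round(pi[i], 6), round(poisson, 6))
```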

9 Simple Birth Process

Let $X(t)$ represent the population size at time $t$. Assume $X(0) = N$ so that $p_i(0) = \delta_{iN}$.
The only event is a birth; the population can only increase in size.
For $\Delta t$ sufficiently small the transition probabilities are
$$p_{i+j,i}(\Delta t) = \mathrm{Prob}\{\Delta X(\Delta t) = j \mid X(t) = i\} =
\begin{cases}
\lambda i \Delta t + o(\Delta t), & j = 1 \\
1 - \lambda i \Delta t + o(\Delta t), & j = 0 \\
o(\Delta t), & j \geq 2 \\
0, & j < 0
\end{cases}$$

10 Simple Birth Process

The probabilities $p_i(t) = \mathrm{Prob}\{X(t) = i\}$ are solutions of the forward Kolmogorov differential equations $dp/dt = Qp$, where
$$\frac{dp_i(t)}{dt} = \lambda(i-1)p_{i-1}(t) - \lambda i\, p_i(t), \quad i = N, N+1, \ldots$$
$$\frac{dp_i(t)}{dt} = 0, \quad i = 0, 1, \ldots, N-1$$
with initial conditions $p_i(0) = \delta_{iN}$. The state space is $\{N, N+1, \ldots\}$.
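
A small numerical illustration (my addition, hypothetical parameters): integrate these forward equations on a truncated state space $\{0, \ldots, M\}$ with scipy and check that the resulting mean tracks $N e^{\lambda t}$, the mean derived on a later slide.

```python
import numpy as np
from scipy.integrate import solve_ivp

lam, N, M, t_end = 0.5, 5, 200, 3.0          # hypothetical parameters; M is the truncation level

def rhs(t, p):
    i = np.arange(M + 1)
    dp = -lam * i * p                        # outflow: -lambda*i*p_i
    dp[1:] += lam * i[:-1] * p[:-1]          # inflow from i-1 at rate lambda*(i-1)
    return dp

p0 = np.zeros(M + 1)
p0[N] = 1.0                                  # p_i(0) = delta_{iN}
sol = solve_ivp(rhs, (0.0, t_end), p0, t_eval=[t_end], rtol=1e-8, atol=1e-10)
p = sol.y[:, -1]
mean = np.dot(np.arange(M + 1), p)
print(mean, N * np.exp(lam * t_end))         # close if M is large enough
```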

11 Simple Birth Process

The generating functions for the simple birth process correspond to a negative binomial distribution. We can see this by deriving PDEs for the p.g.f. and the m.g.f. and solving them using the method of characteristics.
Working through the details:
1. Multiply the forward equations by $z^i$ and sum over $i$ to get the PDE for the p.g.f.
$$\frac{\partial P(z,t)}{\partial t} = \lambda z(z-1)\frac{\partial P}{\partial z}$$
with initial condition $P(z, 0) = z^N$.
2. Derive the PDE for the m.g.f. from the above PDE using the change of variables $z = e^{\theta}$.
3. Use the method of characteristics to find the solution $M(\theta, t)$.
4. Write the solution for the p.g.f. by changing the variables back: $\theta = \ln z$.

12 Simple Birth Process

The m.g.f. for the simple birth process is
$$M(\theta, t) = \left[1 - e^{\lambda t}\left(1 - e^{-\theta}\right)\right]^{-N}$$
The p.g.f. for the simple birth process is
$$P(z, t) = \frac{(pz)^N}{(1 - zq)^N}, \qquad \text{where } p = e^{-\lambda t} \text{ and } q = 1 - e^{-\lambda t}.$$
The probabilities can be written as
$$p_n(t) = \binom{n-1}{n-N} e^{-\lambda N t}\left(1 - e^{-\lambda t}\right)^{n-N}, \qquad n = N, N+1, \ldots$$
The mean and variance are
$$m(t) = N e^{\lambda t} \qquad \text{and} \qquad \sigma^2(t) = N e^{2\lambda t}\left(1 - e^{-\lambda t}\right)$$
The mean corresponds to exponential growth; the variance increases exponentially with time.

13 Simple Death Process

Let $X(t)$ represent the population size at time $t$. Assume $X(0) = N$ so that $p_i(0) = \delta_{iN}$.
The only event is a death; the population can only decrease in size.
For $\Delta t$ sufficiently small the transition probabilities are
$$p_{i+j,i}(\Delta t) = \mathrm{Prob}\{\Delta X(\Delta t) = j \mid X(t) = i\} =
\begin{cases}
\mu i \Delta t + o(\Delta t), & j = -1 \\
1 - \mu i \Delta t + o(\Delta t), & j = 0 \\
o(\Delta t), & j \leq -2 \\
0, & j > 0
\end{cases}$$
The state space is $\{0, 1, 2, \ldots, N\}$.

14 Simple Death Process

The forward Kolmogorov equations are
$$\frac{dp_i(t)}{dt} = \mu(i+1)p_{i+1}(t) - \mu i\, p_i(t), \quad i = 0, 1, \ldots, N-1$$
$$\frac{dp_N(t)}{dt} = -\mu N p_N(t)$$
with initial conditions $p_i(0) = \delta_{iN}$.
Here 0 is an absorbing state. The stationary probability distribution is $\pi = (1, 0, \ldots, 0)^T$.

15 Simple Death Process

The generating function technique and the method of characteristics are used to find the probability distribution.
The PDE for the p.g.f. is
$$\frac{\partial P(z,t)}{\partial t} = \mu(1 - z)\frac{\partial P}{\partial z}, \qquad P(z, 0) = z^N$$
The PDE for the m.g.f. is
$$\frac{\partial M}{\partial t} = \mu\left(e^{-\theta} - 1\right)\frac{\partial M}{\partial \theta}, \qquad M(\theta, 0) = e^{N\theta}$$
Applying the method of characteristics yields the solutions:
$$P(z, t) = \left(1 - e^{-\mu t} + e^{-\mu t} z\right)^N \qquad \text{and} \qquad M(\theta, t) = \left(1 - e^{-\mu t}\left(1 - e^{\theta}\right)\right)^N$$

16 Simple Death Process

The p.g.f. has the form $P(z, t) = (q + pz)^N$ where $p = e^{-\mu t}$ and $q = 1 - e^{-\mu t}$.
This corresponds to a binomial distribution $b(N, p)$.
For $i = 0, 1, \ldots, N$ the probabilities satisfy
$$p_i(t) = \binom{N}{i} p^i q^{N-i}$$
The mean and variance for a binomial distribution $b(N, p)$ are $m(t) = Np$ and $\sigma^2(t) = Npq$:
$$m(t) = N e^{-\mu t} \qquad \text{and} \qquad \sigma^2(t) = N e^{-\mu t}\left(1 - e^{-\mu t}\right)$$
The mean corresponds to exponential decay; the variance decreases exponentially with time.
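
A Monte Carlo check (not from the notes; $\mu$, $N$, $t$ and the number of runs are arbitrary): simulate the simple death process with the Gillespie algorithm and compare the empirical distribution of $X(t)$ with the binomial $b(N, e^{-\mu t})$ probabilities above.

```python
import math, random

mu, N, t_end, runs = 0.7, 10, 1.0, 20000     # hypothetical parameters
random.seed(1)

counts = [0] * (N + 1)
for _ in range(runs):
    x, t = N, 0.0
    while x > 0:
        t += random.expovariate(mu * x)      # waiting time to the next death
        if t > t_end:
            break                            # X(t_end) is the current x
        x -= 1
    counts[x] += 1

p = math.exp(-mu * t_end)
for i in range(N + 1):
    exact = math.comb(N, i) * p**i * (1 - p)**(N - i)
    print(i, counts[i] / runs, round(exact, 4))
```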

17 Simple Birth and Death Process

Let $X(t)$ represent the population size at time $t$. Assume $X(0) = N$ so that $p_i(0) = \delta_{iN}$.
An event can be a birth or a death; the population can increase or decrease in size.
For $\Delta t$ sufficiently small the transition probabilities are
$$p_{i+j,i}(\Delta t) = \mathrm{Prob}\{\Delta X(\Delta t) = j \mid X(t) = i\} =
\begin{cases}
\mu i \Delta t + o(\Delta t), & j = -1 \\
\lambda i \Delta t + o(\Delta t), & j = 1 \\
1 - (\lambda + \mu) i \Delta t + o(\Delta t), & j = 0 \\
o(\Delta t), & j \neq -1, 0, 1
\end{cases}$$
The state space is $\{0, 1, 2, \ldots\}$. Since $\lambda_0 = 0$, the stationary distribution is $\pi = (1, 0, 0, \ldots)^T$.

18 Simple Birth and Death Process

The forward Kolmogorov equations are
$$\frac{dp_i(t)}{dt} = \lambda(i-1)p_{i-1}(t) + \mu(i+1)p_{i+1}(t) - (\lambda + \mu)i\, p_i(t)$$
$$\frac{dp_0(t)}{dt} = \mu p_1(t)$$
for $i = 0, 1, \ldots$, with initial conditions $p_i(0) = \delta_{iN}$.
A simple explicit expression for the probabilities $p_i(t)$ is not available, but we can determine the moments of the distribution by using the generating function technique.
First-order partial differential equations can be derived for the p.g.f. and the m.g.f.

19 Simple Birth and Death Process: Generating function technique

$P(z, t) = \sum_{i=0}^{\infty} p_i(t) z^i$ is the p.g.f. and $M(\theta, t) = P(e^{\theta}, t)$ is the m.g.f.
Multiply $dp(t)/dt = Qp(t)$ by $z^i$ and sum over $i$:
$$\sum_{i=0}^{\infty} z^i \frac{dp_i(t)}{dt} = \sum_{i=0}^{\infty} z^i \left[\lambda(i-1)p_{i-1}(t) + \mu(i+1)p_{i+1}(t) - (\lambda + \mu)i\, p_i(t)\right]$$
Interchanging differentiation and summation and collecting terms gives the PDE for the p.g.f.
$$\frac{\partial P(z,t)}{\partial t} = \left[\mu(1 - z) + \lambda z(z - 1)\right]\frac{\partial P}{\partial z}, \qquad P(z, 0) = z^N$$
The change of variables $z = e^{\theta}$ leads to the m.g.f. equation
$$\frac{\partial M(\theta,t)}{\partial t} = \left[\mu\left(e^{-\theta} - 1\right) + \lambda\left(e^{\theta} - 1\right)\right]\frac{\partial M}{\partial \theta}, \qquad M(\theta, 0) = e^{\theta N}$$

20 Simple Birth and Death Process: Generating function technique

Application of the method of characteristics to the m.g.f. equation yields the ODEs
$$\frac{dt}{d\tau} = 1, \qquad \frac{d\theta}{d\tau} = -\left[\mu\left(e^{-\theta} - 1\right) + \lambda\left(e^{\theta} - 1\right)\right], \qquad \frac{dM}{d\tau} = 0$$
with initial conditions $t(s, 0) = 0$, $\theta(s, 0) = s$, and $M(s, 0) = e^{sN}$.

21 Simple Birth and Death Process: Generating function technique

The m.g.f. is
$$M(\theta, t) = \begin{cases}
\left( \dfrac{e^{t(\mu-\lambda)}\left(\lambda e^{\theta} - \mu\right) - \mu\left(e^{\theta} - 1\right)}{e^{t(\mu-\lambda)}\left(\lambda e^{\theta} - \mu\right) - \lambda\left(e^{\theta} - 1\right)} \right)^N, & \text{if } \lambda \neq \mu \\[3ex]
\left( \dfrac{1 - (\lambda t - 1)\left(e^{\theta} - 1\right)}{1 - \lambda t\left(e^{\theta} - 1\right)} \right)^N, & \text{if } \lambda = \mu
\end{cases}$$
Making the change of variables $\theta = \ln z$ yields the p.g.f.
$$P(z, t) = \begin{cases}
\left( \dfrac{e^{t(\mu-\lambda)}(\lambda z - \mu) - \mu(z - 1)}{e^{t(\mu-\lambda)}(\lambda z - \mu) - \lambda(z - 1)} \right)^N, & \text{if } \lambda \neq \mu \\[3ex]
\left( \dfrac{1 - (\lambda t - 1)(z - 1)}{1 - \lambda t(z - 1)} \right)^N, & \text{if } \lambda = \mu
\end{cases}$$

22 Simple Birth and Death Process: Generating function technique

Unlike the simple birth process and the simple death process, these generating functions cannot be associated with a well-known probability distribution. However, probabilities and moments associated with the process can be obtained directly from the generating functions. Recall that
$$P(z, t) = \sum_{i=0}^{\infty} p_i(t) z^i \qquad \text{and} \qquad p_i(t) = \frac{1}{i!}\left.\frac{\partial^i P}{\partial z^i}\right|_{z=0}$$
The first term in the series expansion of $P(z, t)$ is $p_0(t) = P(0, t)$:
$$p_0(t) = \begin{cases}
\left( \dfrac{\mu - \mu e^{(\mu-\lambda)t}}{\lambda - \mu e^{(\mu-\lambda)t}} \right)^N, & \text{if } \lambda \neq \mu \\[3ex]
\left( \dfrac{\lambda t}{1 + \lambda t} \right)^N, & \text{if } \lambda = \mu
\end{cases}$$

23 Simple Birth and Death Process: Generating function technique

The probability of extinction has a simple expression as $t \to \infty$:
$$p_0(\infty) = \lim_{t \to \infty} p_0(t) = \begin{cases}
1, & \text{if } \lambda \leq \mu \\
\left( \dfrac{\mu}{\lambda} \right)^N, & \text{if } \lambda > \mu
\end{cases}$$
The mean and variance for $\lambda \neq \mu$ are
$$m(t) = N e^{(\lambda - \mu)t} \qquad \text{and} \qquad \sigma^2(t) = N\left( \frac{\lambda + \mu}{\lambda - \mu} \right) e^{(\lambda-\mu)t}\left(e^{(\lambda-\mu)t} - 1\right)$$
The mean corresponds to exponential growth when $\lambda > \mu$ and exponential decay when $\lambda < \mu$.
The mean and variance for $\lambda = \mu$ are
$$m(t) = N \qquad \text{and} \qquad \sigma^2(t) = 2N\lambda t$$
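
A simulation sketch (my addition; the parameters, with $\lambda < \mu$, are hypothetical): Gillespie paths of the simple birth and death process, comparing the fraction of extinct paths at time $t$ with the formula for $p_0(t)$ and the sample mean with $N e^{(\lambda - \mu)t}$.

```python
import math, random

lam, mu, N, t_end, runs = 0.9, 1.2, 3, 8.0, 20000
random.seed(2)

extinct, total_x = 0, 0
for _ in range(runs):
    x, t = N, 0.0
    while x > 0:
        t += random.expovariate((lam + mu) * x)          # time to next event
        if t > t_end:
            break
        x += 1 if random.random() < lam / (lam + mu) else -1
    extinct += (x == 0)
    total_x += x

e = math.exp((mu - lam) * t_end)
p0 = ((mu - mu * e) / (lam - mu * e)) ** N               # formula for lambda != mu
print("P(extinct by t):", extinct / runs, "formula:", round(p0, 4))
print("mean X(t):      ", total_x / runs, "formula:", round(N * math.exp((lam - mu) * t_end), 4))
```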

24 Simple Birth and Death Process: Population Extinction

The probability of extinction has a simple expression as $t \to \infty$:
$$\lim_{t \to \infty} p_0(t) = \begin{cases}
1, & \text{if } \lambda \leq \mu \\
\left( \dfrac{\mu}{\lambda} \right)^N, & \text{if } \lambda > \mu
\end{cases}$$
This is similar to the gambler's ruin problem, a semi-infinite random walk with an absorbing barrier at $x = 0$: $\mu$ plays the role of the probability of losing a game and $\lambda$ the probability of winning a game.
If the probability of losing (death) is greater than or equal to the probability of winning (birth), then in the long run the probability of losing all of the initial capital $N$ (probability of absorption) approaches 1.
If the probability of winning is greater than the probability of losing, then in the long run the probability of losing all of the initial capital is $(\mu/\lambda)^N$.

25 Population Extinction

Theorem
Let $\mu_0 = 0$ and $\lambda_0 = 0$ in a general birth and death chain with $X(0) = m \geq 1$.
1. Suppose $\mu_i > 0$ and $\lambda_i > 0$ for $i = 1, 2, \ldots$.
If
$$\sum_{i=1}^{\infty} \frac{\mu_1 \mu_2 \cdots \mu_i}{\lambda_1 \lambda_2 \cdots \lambda_i} = \infty \qquad \text{then} \qquad \lim_{t \to \infty} p_0(t) = 1.$$
If
$$\sum_{i=1}^{\infty} \frac{\mu_1 \mu_2 \cdots \mu_i}{\lambda_1 \lambda_2 \cdots \lambda_i} < \infty \qquad \text{then} \qquad \lim_{t \to \infty} p_0(t) = \frac{\displaystyle\sum_{i=m}^{\infty} \frac{\mu_1 \mu_2 \cdots \mu_i}{\lambda_1 \lambda_2 \cdots \lambda_i}}{\displaystyle 1 + \sum_{i=1}^{\infty} \frac{\mu_1 \mu_2 \cdots \mu_i}{\lambda_1 \lambda_2 \cdots \lambda_i}},$$
and this extinction probability tends to 0 as $m \to \infty$.
2. Suppose $\mu_i > 0$ for $i = 1, 2, \ldots$ and $\lambda_i > 0$ for $i = 1, 2, \ldots, N-1$, with $\lambda_i = 0$ for $i = N, N+1, \ldots$. Then $\lim_{t \to \infty} p_0(t) = 1$.
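
A numerical illustration of part 1 (not from the notes; the parameters are arbitrary): for the simple birth and death process, $\lambda_i = i\lambda$ and $\mu_i = i\mu$ with $\lambda > \mu$, the truncated series should reproduce the extinction probability $(\mu/\lambda)^m$ from the earlier slides.

```python
lam, mu, m, terms = 1.5, 1.0, 4, 2000        # hypothetical values; 'terms' truncates the series

ratio = 1.0
tail_from_1, tail_from_m = 0.0, 0.0
for i in range(1, terms + 1):
    ratio *= (mu * i) / (lam * i)            # (mu_1...mu_i)/(lambda_1...lambda_i) = (mu/lam)^i
    tail_from_1 += ratio
    if i >= m:
        tail_from_m += ratio

p_extinct = tail_from_m / (1.0 + tail_from_1)
print(p_extinct, (mu / lam) ** m)            # both should be close to (mu/lambda)^m
```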

26 First Passage Time

The time until the process reaches a certain state for the first time is the first passage time.
Suppose the population size is $a$ and we want to find the time it takes until it reaches a size of $b$ (with $b < a$ or $b > a$).
First passage time problems are related to the backward Kolmogorov equations.
We will see how to compute the expected time to extinction.

27 First Passage Time: Time to Extinction

Suppose $\lambda_0 = 0$, $\mu_0 = 0$, and $\lim_{t \to \infty} p_0(t) = 1$ in a finite Markov chain. Define $\tau = (\tau_1, \tau_2, \ldots, \tau_n)$ as the vector of expected times to extinction, where $\tau_i$ is the expected time to extinction starting from state $i$.
Then $\tau$ is the solution of the linear system
$$\tau^T \tilde{Q} = -\mathbf{1}^T$$
where $\tilde{Q}$ is the $n \times n$ generator matrix $Q$ with the first row and column deleted. These elements are deleted to correspond with $\tau_0 = 0$.
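
A Python sketch (my addition; the rates define a small hypothetical chain): build $Q$ with the column convention, delete the row and column for state 0, and solve $\tau^T \tilde{Q} = -\mathbf{1}^T$ as the equivalent linear system $\tilde{Q}^T \tau = -\mathbf{1}$.

```python
import numpy as np

# Hypothetical finite chain on {0, 1, ..., 5} with lambda_0 = 0 and lambda_N = 0.
lam = np.array([0.0, 1.0, 1.0, 1.0, 1.0, 0.0])   # lambda_0 = 0: state 0 is absorbing
mu  = np.array([0.0, 1.5, 3.0, 4.5, 6.0, 7.5])   # mu_i = 1.5 * i
N = len(lam) - 1

# Generator Q with column convention dp/dt = Q p (columns sum to zero).
Q = np.zeros((N + 1, N + 1))
for i in range(N + 1):
    Q[i, i] = -(lam[i] + mu[i])
    if i < N:
        Q[i + 1, i] = lam[i]
    if i > 0:
        Q[i - 1, i] = mu[i]

Qtilde = Q[1:, 1:]                               # delete first row and column (state 0)
tau = np.linalg.solve(Qtilde.T, -np.ones(N))     # tau[i-1] = expected extinction time from state i
print(tau)
```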

28 Logistic Growth Process

Recall the deterministic logistic model
$$\frac{dn}{dt} = rn\left(1 - \frac{n}{K}\right), \qquad n(0) = n_0 > 0$$
where $r$ is the intrinsic growth rate and $K$ is the carrying capacity, so that $\lim_{t \to \infty} n(t) = K$.
The right-hand side equals the birth rate minus the death rate:
$$\lambda_n - \mu_n = rn - \frac{r}{K}n^2$$
As in the DTMC case, there is more than one way to describe a stochastic logistic growth process.

29 Logistic Growth Process

Recall the formulation for the DTMC stochastic logistic growth model.
Case a: for $i = 0, 1, 2, \ldots, 2K$,
$$b_i = r\left(i - \frac{i^2}{2K}\right) \qquad \text{and} \qquad d_i = \frac{r i^2}{2K}$$
Case b:
$$b_i = \begin{cases} ri, & i = 0, 1, 2, \ldots, N \\ 0, & i > N \end{cases} \qquad \text{and} \qquad d_i = \frac{r i^2}{K}$$

30 Logistic Growth Process

Analogous formulation for the CTMC stochastic logistic growth model.
Case a: for $i = 0, 1, 2, \ldots, 2K$,
$$\lambda_i = r\left(i - \frac{i^2}{2K}\right) \qquad \text{and} \qquad \mu_i = \frac{r i^2}{2K}$$
Case b:
$$\lambda_i = \begin{cases} ri, & i = 0, 1, 2, \ldots, N \\ 0, & i > N \end{cases} \qquad \text{and} \qquad \mu_i = \frac{r i^2}{K}$$
Both cases correspond to the deterministic system
$$\frac{dn}{dt} = rn\left(1 - \frac{n}{K}\right)$$
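
A simulation sketch (not from the notes; $r$, $K$, the initial state and the time horizon are arbitrary): the Gillespie algorithm applied to case (a) of the CTMC logistic model.

```python
import random

# Case (a): lambda_i = r*(i - i^2/(2K)), mu_i = r*i^2/(2K), for i = 0, 1, ..., 2K.
r, K, x0, t_end = 1.0, 50, 10, 20.0          # hypothetical parameters
random.seed(3)

t, x = 0.0, x0
path = [(t, x)]
while t < t_end and x > 0:
    birth = r * (x - x**2 / (2 * K))         # lambda_x (zero at x = 2K)
    death = r * x**2 / (2 * K)               # mu_x
    total = birth + death
    t += random.expovariate(total)           # time to the next event
    if t > t_end:
        break
    x += 1 if random.random() < birth / total else -1
    path.append((t, x))

print("final state:", x, "after", len(path) - 1, "events (deterministic K =", K, ")")
```

A typical path fluctuates around the carrying capacity for a long time before eventual extinction, which is the point of the remaining slides.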

31 Logistic Growth Process

[Figure slide; the image is not included in this transcription.]

32 Logistic Growth Process

Unlike the deterministic logistic growth model, in the limit the stochastic logistic growth process does not approach $K$:
Extinction (state 0) is an absorbing state.
For a large population size, the time to extinction is very large (but finite).
