# Stochastic process: a family of random variables indexed by t



## 1. Stochastic process
- A stochastic process X is a family of random variables indexed by time t
- $X = \{X(t),\ t \ge 0\}$ is a continuous-time stochastic process
- $X = \{X(t),\ t = 0, 1, \dots\}$ is a discrete-time stochastic process
- $X(t)$ is the state at time t; it is itself a random variable
- If $X(t)$ is discrete (continuous), the process is called a discrete-state (continuous-state) stochastic process

## 2. Stochastic process, examples
- The number of passengers at a bus station with respect to (w.r.t.) time: a continuous-time, discrete-state stochastic process
- The temperature of a room w.r.t. time: a continuous-time, continuous-state stochastic process

## 3. Stochastic process, example: random walk
- $S_n = X_1 + X_2 + \dots + X_n$, $n = 1, 2, \dots$, where the $X_i$ are independent, identically distributed (i.i.d.) random variables
- E.g., $X_i = 1$ with $p = 0.5$ and $X_i = -1$ with $p = 0.5$ gives a discrete-time (symmetric) random walk
- Question: how does the random walk behave if the probability of $X_i$ differs from the above?
- Brownian motion is a continuous-time random walk: $B(t) - B(s)$ obeys a normal distribution $N(0, t - s)$
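
The ±1 walk above is easy to simulate, which also answers the question on the slide: with $p \ne 0.5$ the walk drifts at rate $2p - 1$ per step. A minimal sketch (the function name is illustrative):

```python
import random

def random_walk(n, p=0.5, seed=0):
    """Simulate S_n = X_1 + ... + X_n with X_i = +1 (prob. p) or -1 (prob. 1-p)."""
    rng = random.Random(seed)
    s = 0
    path = []
    for _ in range(n):
        s += 1 if rng.random() < p else -1
        path.append(s)
    return path

# With p = 0.5 the walk has zero drift: E[S_n] = 0.
# With p != 0.5 it drifts: E[S_n] = n(2p - 1).
n = 100_000
biased = random_walk(n, p=0.6, seed=1)
print(biased[-1] / n)  # close to 2*0.6 - 1 = 0.2
```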

## 4. Characteristics of a stochastic process
- Relationship of the random variables $X(t_i)$, $i = 1, 2, \dots$
- Joint distribution function of $X = \{X(t_1), X(t_2), \dots, X(t_n)\}$:
  $F_X(\mathbf{x}; \mathbf{t}) = P\{X(t_1) \le x_1, X(t_2) \le x_2, \dots, X(t_n) \le x_n\}$, for all $n$
- Very complicated in general, but it can be simplified for most commonly used stochastic processes

## 5. Stationary process and independent process
- Stationary process: $X(t)$ is stationary if $F_X(\mathbf{x}; \mathbf{t})$ is invariant to any shift in time:
  $F_X(\mathbf{x}; \mathbf{t}) = F_X(\mathbf{x}; \mathbf{t} + \tau)$
- Independent process: the simplest stochastic process (white noise), in which the $X(t)$ are mutually independent:
  $F_X(\mathbf{x}; \mathbf{t}) = P(X(t_1) \le x_1)\, P(X(t_2) \le x_2) \cdots P(X(t_n) \le x_n)$

## 6. Markov process
- Markovian property (memoryless):
  $P[X(t_{n+1}) = x_{n+1} \mid X(t_n) = x_n, X(t_{n-1}) = x_{n-1}, \dots, X(t_1) = x_1] = P[X(t_{n+1}) = x_{n+1} \mid X(t_n) = x_n]$
- The future state depends only on the current state, independent of the historical states; this greatly simplifies the stochastic process
- The sojourn time in a state is exponentially (geometrically) distributed for a continuous-time (discrete-time) Markov process
- Categories: discrete/continuous-time, discrete/continuous-state Markov process

## 7. Point process
- A point process $N(t)$ is a special case of a stochastic process:
  - $N(t)$ is integer-valued: $N(t) = 0, 1, 2, \dots$; $t \ge 0$
  - $N(t)$ is non-decreasing w.r.t. time $t$

## 8. Renewal process
- A renewal process $N(t)$ is a special case of a point process:
  - Independent holding times
  - Identically distributed holding times: $F(x) = P[T_n \le x]$, $n = 1, 2, \dots$, where $T_n$ is the holding time during which $N(t) = n$
- I.e., the holding times are i.i.d. with a general distribution

## 9. Poisson process
- A Poisson (arrival) process $N(t)$ is the special case of a renewal process in which the holding time (interarrival time) obeys the exponential distribution with parameter $\lambda$
- $N(t)$: the number of arrivals in $[0, t]$
- $N(t)$ has a Poisson distribution with parameter $\lambda t$:
  $P(N(t) = k) = \frac{(\lambda t)^k}{k!} e^{-\lambda t}, \quad k = 0, 1, 2, \dots$
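
The counting distribution above is straightforward to evaluate; a quick sketch with illustrative parameters that checks it is a proper distribution with mean $\lambda t$:

```python
import math

def poisson_pmf(k, lam, t):
    """P(N(t) = k) for a Poisson process with rate lam over [0, t]."""
    mu = lam * t
    return mu**k / math.factorial(k) * math.exp(-mu)

lam, t = 2.0, 3.0  # rate 2 arrivals per unit time, window [0, 3]
probs = [poisson_pmf(k, lam, t) for k in range(100)]
print(sum(probs))   # ~1.0: the pmf sums to one
print(sum(k * p for k, p in enumerate(probs)))  # mean ~ lam*t = 6
```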

## 10. Poisson process
- Mean, variance, and squared coefficient of variation:
  $E(N(t)) = \lambda t$, $\sigma^2(N(t)) = \lambda t$, $c_{N(t)}^2 = \frac{1}{\lambda t}$
- $T$: the interarrival time, the interval between two successive arrivals, is a random variable
- $T$ is exponentially distributed with parameter $\lambda$:
  $E(T) = \frac{1}{\lambda}$, $\sigma^2(T) = \frac{1}{\lambda^2}$, $c_T = 1$
- $T$ has the memoryless property

## 11. Memoryless property: interpretation

## 12-13. More on Poisson process

## 14. More on Poisson process
- An arrival is equally likely to happen in each small interval: a purely random process
- The Poisson process is extremely useful:
  - Memoryless property, exponential interarrival times
  - An excellent model for many practical processes, e.g., the arrival process of telephone calls
  - Many physical phenomena behave like a Poisson process, e.g., X-ray emission
  - Job sizes in computers obey an exponential distribution, while the arrival process does not affect the response time too much
  - Limiting theorem

## 15. Limiting theorem
- Limiting theorem (Palm '43, Khinchin '60): merge $n$ ($n \gg 1$) i.i.d. arrival processes, each a renewal process with an arbitrary interarrival distribution and arrival rate $\lambda/n$; the aggregated arrival process then approaches a Poisson process with rate $\lambda$
- Hence the Poisson process is a widely used model
- Exceptions:
  - Internet traffic: self-similar, bursty, heavy-tailed traffic
  - Transportation traffic: batch arrivals

## 16. Markov modulated Poisson process (MMPP)
- The arrival rate of a Poisson process is modulated by a state $r$ on an upper level
- State $r$ transits according to a Markov process
- E.g., $r(t)$ obeys a Markov process with 2 states, with average sojourn times $1/r_1$ and $1/r_2$, respectively; in state 1 or 2, the Poisson arrival rate is $\lambda_1$ or $\lambda_2$
- MMPP is a stochastic process with a positive correlation coefficient

## 17. Markovian arrival process (MAP)
- A MAP is defined by two matrices, $D_0$ and $D_1$:
  - Elements of $D_0$ represent hidden phase transitions
  - Elements of $D_1$ represent observable transitions; each transition of $D_1$ represents an arrival event
- Properties: $(D_0)_{ii} < 0$; $(D_0)_{ij} \ge 0$ for $i \ne j$; $(D_1)_{ij} \ge 0$
- $D = D_0 + D_1$ is the infinitesimal generator of the phase process: $D\mathbf{1} = \mathbf{0}$
- Solve $\omega D = \mathbf{0}$, $\omega \mathbf{1} = 1$; the equivalent arrival rate is $\lambda = \omega D_1 \mathbf{1}$
- MAP is a very general arrival process (similar to PH distributions): it includes the Poisson process, MMPP, and most point processes
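
The rate computation $\omega D = \mathbf{0}$, $\omega \mathbf{1} = 1$, $\lambda = \omega D_1 \mathbf{1}$ can be sketched for a 2-state MMPP written as a MAP (all parameter values below are made up for illustration):

```python
# Phase sojourn rates r1, r2 and per-phase Poisson rates lam1, lam2.
r1, r2 = 1.0, 2.0
lam1, lam2 = 3.0, 5.0

D0 = [[-(r1 + lam1), r1], [r2, -(r2 + lam2)]]  # hidden phase transitions
D1 = [[lam1, 0.0], [0.0, lam2]]                # arrival transitions

# D = D0 + D1 is the phase-process generator (rows sum to 0).
D = [[D0[i][j] + D1[i][j] for j in range(2)] for i in range(2)]
assert all(abs(sum(row)) < 1e-12 for row in D)

# Solve w D = 0, w 1 = 1 in closed form for the 2-state case:
# w1*D[0][0] + w2*D[1][0] = 0  =>  w1 = -w2 * D[1][0] / D[0][0]
w2 = 1.0 / (1.0 - D[1][0] / D[0][0])
w = [1.0 - w2, w2]

# Equivalent arrival rate: lambda = w D1 1
rate = sum(w[i] * sum(D1[i]) for i in range(2))
print(w, rate)  # stationary phase distribution and mean arrival rate
```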

## 18. Markovian arrival process
- Define the system state of a MAP as $S(t) := (N(t), J(t))$, where $N(t)$ is the number of arrivals up to $t$ and $J(t)$ is the phase state of the MAP at time $t$; then $S(t)$ is a Markov process with the block-bidiagonal rate matrix
  $\begin{pmatrix} D_0 & D_1 & & \\ & D_0 & D_1 & \\ & & \ddots & \ddots \end{pmatrix}$
- If we use a MAP to represent a Poisson arrival process with rate $\lambda$, then $D_0 = -\lambda$ and $D_1 = \lambda$

## 19. Relation map among some typical random processes
- (Diagram relating: semi-Markov process, random walk, Markov process, birth-death process, Poisson process, pure birth process, Markov arrival process, renewal process)

## 20. Part I: discrete-time Markov chain (DTMC)
- A discrete-time, discrete-state Markov process
- Notation: $X = \{X_1, X_2, \dots, X_n\}$
- Memoryless:
  $P[X_n = j \mid X_{n-1} = i, X_{n-2} = i_{n-2}, \dots] = P[X_n = j \mid X_{n-1} = i]$
- Transition probability: $p_{ij,n} := P[X_n = j \mid X_{n-1} = i]$
- Homogeneous: $p_{ij} = p_{ij,n}$ (independent of $n$)

## 21. Markov chain: basic elements
- State space $S = \{1, 2, \dots, S\}$, assumed discrete and finite
- Transition probability matrix $P = [p_{ij}]$, $i, j \in S$
- $m$-step transition probability: $p_{ij}^{(m)} := P[X_{n+m} = j \mid X_n = i]$
- Chapman-Kolmogorov equation: $p_{ij}^{(m)} = \sum_{k=1}^{S} p_{ik}^{(m-1)} p_{kj}$, $m = 2, 3, \dots$
- In matrix form: $P^{(m)} = [p_{ij}^{(m)}] = P^m$
- The sojourn time in state $i$ is geometrically distributed with parameter $p_{ii}$
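
The matrix form of Chapman-Kolmogorov is easy to check directly: the 2-step probabilities are just $P \cdot P$. A minimal sketch with an illustrative 2-state matrix:

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# A small transition matrix (values are illustrative).
P = [[0.9, 0.1],
     [0.4, 0.6]]

# Chapman-Kolmogorov: P^(2) = P * P.
P2 = matmul(P, P)
# Direct check of p_00^(2) = p_00*p_00 + p_01*p_10
assert abs(P2[0][0] - (0.9 * 0.9 + 0.1 * 0.4)) < 1e-12
print(P2)  # rows still sum to 1
```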

## 22. Example 1: binary communication channel
- Bit states: 0 or 1; a bit gets corrupted on the link with probability $p_{01} = a$, $p_{00} = 1 - a$, $p_{10} = b$, $p_{11} = 1 - b$ ($a$ and $b$ are the corruption probabilities)
- Transition probability matrix:
  $P = \begin{pmatrix} 1-a & a \\ b & 1-b \end{pmatrix}$
- (Chalk: draw the state transition diagram)
- Limiting probability matrix:
  $\lim_{n \to \infty} P^{(n)} = \frac{1}{a+b}\begin{pmatrix} b & a \\ b & a \end{pmatrix}$
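
The limiting matrix can be verified numerically: raising $P$ to a high power makes every row converge to $(b, a)/(a+b)$. A sketch with illustrative values of $a$ and $b$:

```python
# Binary channel with corruption probabilities a and b (illustrative values).
a, b = 0.2, 0.3
P = [[1 - a, a], [b, 1 - b]]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Raise P to a high power; every row converges to (b, a) / (a + b).
Pn = P
for _ in range(200):
    Pn = matmul(Pn, P)

expected = (b / (a + b), a / (a + b))  # (0.6, 0.4)
print(Pn[0], Pn[1], expected)
```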

## 23. Example 2: simple queue
- At each discrete time $t = 1, 2, 3, \dots$:
  - A customer arrives with probability $p$ (the interarrival time is geometrically distributed with parameter $p$)
  - A job leaves the server with probability $q$, $q > p$ (the job size is geometrically distributed with parameter $q$)
- State: the number of jobs in the system
- (Chalk: draw the state transition diagram)
- The transition probability matrix is similar to that of a discrete-time birth-death process, with a slight difference

## 24. Irreducibility
- Irreducible: every state can be reached from every other state, i.e., for any $i, j$ there exists an $m_0$ such that $p_{ij}^{(m_0)} > 0$
- States $i$ and $j$ communicate if the above condition holds in both directions
- Irreducible: all states communicate with each other
- Absorbing state: $p_{jj} = 1$; a chain containing an absorbing state $i$ is reducible

## 25. Recurrence
- $f_{jj}^{(n)}$: the probability that the first return to $j$ occurs $n$ steps after leaving $j$
- $f_{jj} := \sum_{n=1}^{\infty} f_{jj}^{(n)}$: the probability that the chain ever returns to $j$
- $j$ is recurrent if $f_{jj} = 1$; $j$ is transient if $f_{jj} < 1$
- If $j$ is recurrent, the mean recurrence time is $m_{jj} = \sum_{n=1}^{\infty} n f_{jj}^{(n)}$
- If $m_{jj} < \infty$, $j$ is positive recurrent; if $m_{jj} = \infty$, $j$ is null recurrent
- E.g., a random walk returning to the origin, with first-return probabilities $0, \tfrac12, \tfrac14, \tfrac18, \tfrac1{16}, \dots$
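
The definitions can be exercised on the slide's first-return sequence, taking it as $f^{(1)} = 0$ and $f^{(n)} = (1/2)^{n-1}$ for $n \ge 2$ (an assumption about how the listed numbers line up with $n$): the terms sum to 1 (recurrent) and the mean recurrence time is finite (positive recurrent).

```python
# First-return probabilities: f[n] indexed by n, f[0] unused.
f = [0.0, 0.0] + [0.5 ** (n - 1) for n in range(2, 200)]

f_jj = sum(f)                                 # probability of ever returning
m_jj = sum(n * f[n] for n in range(len(f)))   # mean recurrence time
print(f_jj, m_jj)  # ~1.0 (recurrent) and ~3.0 (positive recurrent)
```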

## 26. Periodicity
- Period of a state $i$: the greatest common divisor (gcd) of the step counts at which a return is possible, $d(i) = \gcd\{n : p_{ii}^{(n)} > 0\}$
- A DTMC (discrete-time Markov chain) is aperiodic iff all states have period 1
- Example: the period is 2 for a DTMC that deterministically alternates between two states; the limit of $P^n$ does not exist, but that of $P^{2n}$ does

## 27. Ergodicity
- Ergodic Markov chain: positive recurrent, irreducible, and aperiodic (equivalently: finite, irreducible, and aperiodic)
- Physical meaning: if ergodic, time average = ensemble average
- A process may be ergodic for some moments only; fully ergodic: ergodic for all moments
- See page 35 of the Gross book

## 28. Limiting distribution
- If a finite DTMC is aperiodic and irreducible, then $\lim_{n \to \infty} p_{ij}^{(n)}$ exists and is independent of $i$
- 3 types of probabilities:
  - Limiting probability: $l_j := \lim_{n \to \infty} p_{ij}^{(n)}$
  - Time-average probability: $p_j := \lim_{t \to \infty} \frac{1}{t} \sum_{n=1}^{t} \mathbf{1}(X(n) = j)$
  - Stationary probability: $\pi$ s.t. $\pi P = \pi$

## 29. Limiting probability
- Limiting probability of $i$: the limiting probability of entering $i$; it may not exist (consider the periodic case)
- Time-average probability of $i$: the percentage of time spent in state $i$
- Stationary probability: given that the initial state is chosen according to $\pi$, all future states follow the same distribution

## 30. Example: binary communication channel
- Limiting probability: does not exist if $a = b = 1$; otherwise $l = \left(\frac{b}{a+b}, \frac{a}{a+b}\right)$
- Time-average probability: $p = \left(\frac{b}{a+b}, \frac{a}{a+b}\right)$
- Stationary probability: solve $\pi P = \pi$ and $\pi \mathbf{e} = 1$
- Conclusion: the limiting probability may not exist; otherwise all three are identical. $\pi$ and $p$ are always the same.

## 31. Limiting probability
- Relations of the 3 probabilities:
  - If the limiting probability exists, then these 3 probabilities match
  - If ergodic, the limiting probability exists and the 3 probabilities match
- How to solve: $\pi P = \pi$, $\pi \mathbf{e} = 1$ (note $P\mathbf{e} = \mathbf{e}$)
- $\pi$ is the (left) eigenvector of matrix $P$ with eigenvalue 1; this is the balance equation
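
For an ergodic chain, $\pi$ can be found without an eigenvalue solver by power iteration, repeatedly applying $\pi \leftarrow \pi P$ until it stops changing. A minimal sketch with an illustrative 2-state chain:

```python
def stationary(P, iters=500):
    """Approximate pi with pi P = pi by power iteration from a uniform start."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

P = [[0.9, 0.1],
     [0.4, 0.6]]  # illustrative ergodic chain
pi = stationary(P)

# Verify the balance equation pi P = pi
new = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]
print(pi, new)  # both ~ (0.8, 0.2)
```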

## 32. Example: discrete-time birth-death process
- Models the dynamics of a population
- Parameters: increase by 1 with birth probability $b$; decrease by 1 with death probability $d$; stay in the same state with probability $1 - b - d$
- (Chalk: write the transition probability matrix $P$, solve the equation for the stationary probabilities, and understand the balance equation)
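
The chalk exercise can be sketched in code for a chain truncated at a finite population $K$ (truncation and all parameter values are assumptions for illustration). The stationary probabilities of a birth-death chain satisfy detailed balance, $\pi_k b = \pi_{k+1} d$, so $\pi_k \propto (b/d)^k$:

```python
# Discrete-time birth-death chain on states 0..K (illustrative parameters).
K, b, d = 5, 0.3, 0.5

P = [[0.0] * (K + 1) for _ in range(K + 1)]
for k in range(K + 1):
    if k < K:
        P[k][k + 1] = b          # birth
    if k > 0:
        P[k][k - 1] = d          # death
    P[k][k] = 1.0 - sum(P[k])    # stay

# Detailed balance: pi_k * b = pi_{k+1} * d  =>  pi_k proportional to (b/d)^k.
rho = b / d
pi = [rho ** k for k in range(K + 1)]
s = sum(pi)
pi = [x / s for x in pi]

# Verify the balance equation pi P = pi
new = [sum(pi[i] * P[i][j] for i in range(K + 1)) for j in range(K + 1)]
print(max(abs(new[j] - pi[j]) for j in range(K + 1)))  # ~0
```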

## 33. Part II: continuous-time Markov chain (CTMC)
- A continuous-time, discrete-state Markov process
- Definition (Markovian property): $X(t)$ is a CTMC if, for any $n$ and any sequence $t_1 < t_2 < \dots < t_n$,
  $P[X(t_n) = j \mid X(t_{n-1}) = i_{n-1}, X(t_{n-2}) = i_{n-2}, \dots, X(t_1) = i_1] = P[X(t_n) = j \mid X(t_{n-1}) = i_{n-1}]$

## 34. Exponentially distributed sojourn time of a CTMC
- The sojourn time $\tau_i$ in state $i$, $i = 1, 2, \dots, S$, is exponentially distributed; this can be proved from the Markovian property:
  - Memorylessness gives $P[\tau_i > s + t \mid \tau_i > s] = P[\tau_i > t]$, hence $P[\tau_i > s + t] = P[\tau_i > s]\, P[\tau_i > t]$
  - Differentiating w.r.t. $s$ at $s = 0$, with $f_i(t) := -\frac{d}{dt} P[\tau_i > t]$, yields $\frac{d}{dt} \ln P[\tau_i > t] = -f_i(0)$, i.e., $P[\tau_i > t] = e^{-f_i(0)\, t}$ and $f_i(t) = f_i(0)\, e^{-f_i(0)\, t}$
- The sojourn time is exponentially distributed; $f_i(0)$ is the transition rate out of state $i$

## 35. Basic elements of a CTMC
- State space $S = \{1, 2, \dots, S\}$, assumed discrete and finite for simplicity
- Transition probability: $p_{ij}(s, t) := P[X(t) = j \mid X(s) = i]$, $s \le t$; in matrix form $P(s, t) := [p_{ij}(s, t)]$
- If homogeneous, $P(t) := P(s, s + t)$ is independent of the start time $s$
- Chapman-Kolmogorov equation: $P(s, t) = P(s, u)\, P(u, t)$, $s \le u \le t$

## 36. Infinitesimal generator (transition rate matrix)
- Differentiate the C-K equation and let $u \to t$:
  $\frac{\partial P(s, t)}{\partial t} = P(s, t)\, Q(t)$, $s \le t$, where $Q(t) := \lim_{\Delta t \to 0} \frac{P(t, t + \Delta t) - I}{\Delta t}$
- $Q(t)$ is called the infinitesimal generator of the transition matrix function $P(s, t)$, since $P(s, t) = e^{\int_s^t Q(u)\, du}$
- Its elements are
  $q_{ii}(t) = \lim_{\Delta t \to 0} \frac{p_{ii}(t, t + \Delta t) - 1}{\Delta t} \le 0, \qquad q_{ij}(t) = \lim_{\Delta t \to 0} \frac{p_{ij}(t, t + \Delta t)}{\Delta t} \ge 0,\ i \ne j$
- $\sum_{j=1}^{S} q_{ij}(t) = 0$ for all $i$, i.e., $Q(t)\,\mathbf{e} = \mathbf{0}$
- Relation between the transition rate $q_{ij}$ and the probability $p_{ij}$: $-q_{ii}$ is the rate of flow out of $i$; $q_{ij}$ is the rate of flow from $i$ into $j$

## 37. Infinitesimal generator
- If homogeneous, denote $Q = Q(t)$
- Derivative of the transition probability matrix: $\frac{dP(t)}{dt} = P(t)\, Q$
- With initial condition $P(0) = I$, we have $P(t) = e^{Qt}$, where
  $e^{Qt} := I + Qt + \frac{Q^2 t^2}{2!} + \frac{Q^3 t^3}{3!} + \dots$
- The infinitesimal generator (transition rate matrix) $Q$ generates the transition probability matrix $P(t)$
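
The power series for $e^{Qt}$ can be evaluated directly for a small generator; a sketch with an illustrative 2-state $Q$ (truncating the series is an approximation, but it converges very fast for small $\|Qt\|$):

```python
def expm(Q, t, terms=60):
    """e^{Qt} by the truncated power series I + Qt + (Qt)^2/2! + ..."""
    n = len(Q)
    A = [[Q[i][j] * t for j in range(n)] for i in range(n)]              # Qt
    result = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]  # I
    term = [row[:] for row in result]
    for m in range(1, terms):
        term = [[sum(term[i][k] * A[k][j] for k in range(n)) / m
                 for j in range(n)] for i in range(n)]                   # (Qt)^m / m!
        result = [[result[i][j] + term[i][j] for j in range(n)] for i in range(n)]
    return result

# 2-state CTMC generator (rates are illustrative)
Q = [[-2.0, 2.0],
     [1.0, -1.0]]
Pt = expm(Q, 0.5)
print(Pt)  # rows of P(t) sum to 1, entries are probabilities
```

For this 2-state $Q$ the exact answer is known in closed form, e.g. $p_{00}(t) = \frac{1 + 2 e^{-3t}}{3}$, which the series reproduces.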

## 38. State probability
- $\pi(t)$ is the state probability (row) vector at time $t$
- We have $\pi(t) = \pi(0)\, P(t) = \pi(0)\, e^{Qt}$
- Differentiating: $\frac{d\pi(t)}{dt} = \pi(0)\, e^{Qt}\, Q = \pi(t)\, Q$
- Decomposed into elements: $\frac{d\pi_i(t)}{dt} = \pi_i(t)\, q_{ii} + \sum_{j \ne i} \pi_j(t)\, q_{ji}$

## 39. Steady-state probability
- For an ergodic Markov chain, $\pi := \lim_{t \to \infty} \pi(t)$ exists
- Balance equation: $\frac{d\pi_i(t)}{dt} = \pi_i(t)\, q_{ii} + \sum_{j \ne i} \pi_j(t)\, q_{ji} = 0$
- In matrix form: $\pi Q = \mathbf{0}$, $\pi \mathbf{e} = 1$ (note $Q\mathbf{e} = \mathbf{0}$)
- Compare this with the discrete case ($\pi P = \pi$)
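
For a 2-state CTMC the system $\pi Q = \mathbf{0}$, $\pi \mathbf{e} = 1$ has a simple closed form, which makes a quick sanity check (rates are illustrative):

```python
# Steady state of a 2-state CTMC: pi Q = 0, pi e = 1.
# For Q = [[-q12, q12], [q21, -q21]], pi = (q21, q12) / (q12 + q21).
q12, q21 = 2.0, 1.0   # illustrative transition rates
Q = [[-q12, q12], [q21, -q21]]

pi = [q21 / (q12 + q21), q12 / (q12 + q21)]

# Check pi Q = 0
flow = [sum(pi[i] * Q[i][j] for i in range(2)) for j in range(2)]
print(pi, flow)  # (1/3, 2/3) and ~(0, 0)
```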

## 40. Birth-death process
- A special Markov chain (continuous time): state $k$ transits only to its neighbors $k - 1$ and $k + 1$, $k = 1, 2, 3, \dots$
- A model for population dynamics; the basis of fundamental queues
- State space $S = \{0, 1, 2, \dots\}$; at state $k$ (population size):
  - Birth rate $\lambda_k$, $k = 0, 1, 2, \dots$
  - Death rate $\mu_k$, $k = 1, 2, 3, \dots$

## 41. Infinitesimal generator
- Elements of $Q$:
  - $q_{k,k+1} = \lambda_k$, $k = 0, 1, 2, \dots$
  - $q_{k,k-1} = \mu_k$, $k = 1, 2, 3, \dots$
  - $q_{k,j} = 0$ for $|k - j| > 1$
  - $q_{k,k} = -\lambda_k - \mu_k$
- (Chalk: write the tri-diagonal matrix $Q$)

## 42. State transition probability analysis
- In a period $\Delta t$, the transition probabilities from state $k$ are:
  - $k \to k+1$: $p_{k,k+1}(\Delta t) = \lambda_k \Delta t + o(\Delta t)$
  - $k \to k-1$: $p_{k,k-1}(\Delta t) = \mu_k \Delta t + o(\Delta t)$
  - $k \to k$: $p_{k,k}(\Delta t) = [1 - \lambda_k \Delta t - o(\Delta t)][1 - \mu_k \Delta t - o(\Delta t)] = 1 - \lambda_k \Delta t - \mu_k \Delta t + o(\Delta t)$
  - $k \to$ others: $p_{k,j}(\Delta t) = o(\Delta t)$ for $|k - j| > 1$

## 43. State transition equation
- State transition equation, for $k = 1, 2, \dots$:
  $\pi_k(t + \Delta t) = \pi_k(t)\, p_{k,k}(\Delta t) + \pi_{k-1}(t)\, p_{k-1,k}(\Delta t) + \pi_{k+1}(t)\, p_{k+1,k}(\Delta t) + o(\Delta t)$
- The case $k = 0$ is omitted here (homework)

## 44. State transition equation
- Substituting the transition probabilities:
  $\pi_k(t + \Delta t) = \pi_k(t)(1 - \lambda_k \Delta t - \mu_k \Delta t) + \pi_{k-1}(t)\, \lambda_{k-1} \Delta t + \pi_{k+1}(t)\, \mu_{k+1} \Delta t + o(\Delta t)$
- In derivative form:
  $\frac{d\pi_k(t)}{dt} = -(\lambda_k + \mu_k)\, \pi_k(t) + \lambda_{k-1}\, \pi_{k-1}(t) + \mu_{k+1}\, \pi_{k+1}(t)$, $k = 1, 2, \dots$
  $\frac{d\pi_0(t)}{dt} = -\lambda_0\, \pi_0(t) + \mu_1\, \pi_1(t)$, $k = 0$
- This is the equation $\frac{d\pi}{dt} = \pi Q$; it describes the transient behavior of the system states

## 45. Easier way to analyze a birth-death process
- State transition rate diagram: state $k$ has incoming rates $\lambda_{k-1}$ (from $k-1$) and $\mu_{k+1}$ (from $k+1$), and outgoing rates $\lambda_k$ (to $k+1$) and $\mu_k$ (to $k-1$)
- These are transition rates, not probabilities; to use probabilities, multiply by $dt$ and add a self-loop transition
- The diagram contains the same information as the $Q$ matrix

## 46. State transition rate diagram
- Look at state $k$ in the state transition rate diagram:
  - Flow rate into state $k$: $I_k = \lambda_{k-1}\, \pi_{k-1}(t) + \mu_{k+1}\, \pi_{k+1}(t)$
  - Flow rate out of state $k$: $O_k = (\lambda_k + \mu_k)\, \pi_k(t)$
- Derivative of the state probability:
  $\frac{d\pi_k(t)}{dt} = I_k - O_k = \lambda_{k-1}\, \pi_{k-1}(t) + \mu_{k+1}\, \pi_{k+1}(t) - (\lambda_k + \mu_k)\, \pi_k(t)$

## 47. State transition rate diagram
- Look at one state, or a set of neighboring states: the change rate of the state probability = incoming flow rate − outgoing flow rate
- Balance equation: in steady state the change rate of the state probability is 0, so incoming flow rate = outgoing flow rate; solving these equations gives the steady-state probabilities
- This technique is commonly used to analyze queueing systems and Markov systems
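
The cut argument above can be sketched for a birth-death process truncated at a finite state $K$ (truncation and rate values are assumptions for illustration): cutting between states $k$ and $k+1$, the balance equation is $\lambda_k \pi_k = \mu_{k+1} \pi_{k+1}$, which solves the chain recursively.

```python
# Steady state of a truncated birth-death process (illustrative rates).
K = 10
lam = [1.0] * K          # lam_k for k = 0..K-1
mu = [2.0] * K           # mu_{k+1} for k = 0..K-1

# Cut balance: lam_k pi_k = mu_{k+1} pi_{k+1}
pi = [1.0]
for k in range(K):
    pi.append(pi[-1] * lam[k] / mu[k])
s = sum(pi)
pi = [x / s for x in pi]

# Verify local balance across every cut
for k in range(K):
    assert abs(lam[k] * pi[k] - mu[k] * pi[k + 1]) < 1e-12
print(pi[0])  # probability of an empty system
```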

## 48. Pure birth process
- Death rates are $\mu_k = 0$; birth rates are identical, $\lambda_k = \lambda$, for simplicity
- The derivative equations of the state probabilities become:
  $\frac{d\pi_k(t)}{dt} = \lambda\, \pi_{k-1}(t) - \lambda\, \pi_k(t)$, $k = 1, 2, \dots$
  $\frac{d\pi_0(t)}{dt} = -\lambda\, \pi_0(t)$, $k = 0$
- Assuming the initial state is $k = 0$, i.e., $\pi_0(0) = 1$, we have $\pi_0(t) = e^{-\lambda t}$

## 49. Pure birth process
- For state $k = 1$: $\frac{d\pi_1(t)}{dt} = \lambda\, \pi_0(t) - \lambda\, \pi_1(t)$, which gives $\pi_1(t) = \lambda t\, e^{-\lambda t}$
- Generally,
  $\pi_k(t) = \frac{(\lambda t)^k}{k!} e^{-\lambda t}$, $t \ge 0$, $k = 0, 1, 2, \dots$
- The same as the Poisson distribution!
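
The closed form can be cross-checked numerically: Euler-integrating the derivative equations $\frac{d\pi_k}{dt} = \lambda \pi_{k-1} - \lambda \pi_k$ from $\pi_0(0) = 1$ reproduces the Poisson pmf (step size and parameters below are illustrative; Euler integration only approximates the ODE):

```python
import math

lam, t_end, steps, K = 1.0, 2.0, 50_000, 10
dt = t_end / steps

pi = [1.0] + [0.0] * K   # start in state 0
for _ in range(steps):
    new = pi[:]
    new[0] += dt * (-lam * pi[0])
    for k in range(1, K + 1):
        new[k] += dt * (lam * pi[k - 1] - lam * pi[k])
    pi = new

poisson = [(lam * t_end) ** k / math.factorial(k) * math.exp(-lam * t_end)
           for k in range(K + 1)]
print(max(abs(pi[k] - poisson[k]) for k in range(K + 1)))  # small
```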

## 50. Pure birth process and Poisson process
- A pure birth process is exactly a Poisson process: exponential interarrival times with mean $1/\lambda$, and $k$, the number of arrivals during $[0, t]$, satisfies
  $P_k = \frac{(\lambda t)^k}{k!} e^{-\lambda t}$, $k = 0, 1, 2, \dots$
- Z-transform of the random variable $k$: used to calculate the mean and variance of $k$
- Laplace transform of the interarrival time: used to calculate the mean and variance of the interarrival time (shown in class or left as an exercise)

## 51. Arrival time of a Poisson process
- $X$ is the time at which exactly $k$ arrivals have occurred: $X = t_1 + t_2 + \dots + t_k$, where $t_k$ is the interarrival time of the $k$-th arrival
- $X$ is also called a $k$-Erlang distributed random variable
- The probability density function of $X$, via convolution of Laplace transforms: $X^*(s) = [A^*(s)]^k$; looking up a table of Laplace transforms,
  $f_X(x) = \frac{\lambda (\lambda x)^{k-1}}{(k-1)!} e^{-\lambda x}$
  (arrival rate $\lambda$ at $x$, times $k-1$ arrivals during $(0, x)$)
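
The Erlang density above can be sanity-checked by numerical integration: it should integrate to 1 and have mean $k/\lambda$ (the sum of $k$ exponentials with mean $1/\lambda$). Parameters and the integration grid below are illustrative:

```python
import math

def erlang_pdf(x, k, lam):
    """k-Erlang density: sum of k i.i.d. exponential(lam) interarrival times."""
    return lam * (lam * x) ** (k - 1) / math.factorial(k - 1) * math.exp(-lam * x)

k, lam = 3, 2.0
# Simple Riemann-sum integration over a wide enough range
N, xmax = 100_000, 30.0
h = xmax / N
total = sum(erlang_pdf(i * h, k, lam) for i in range(1, N)) * h
mean = sum(i * h * erlang_pdf(i * h, k, lam) for i in range(1, N)) * h
print(total, mean)  # ~1.0 and ~k/lam = 1.5
```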


### Queueing Theory I Summary! Little s Law! Queueing System Notation! Stationary Analysis of Elementary Queueing Systems " M/M/1 " M/M/m " M/M/1/K "

Queueing Theory I Summary Little s Law Queueing System Notation Stationary Analysis of Elementary Queueing Systems " M/M/1 " M/M/m " M/M/1/K " Little s Law a(t): the process that counts the number of arrivals

### Solutions to Homework Discrete Stochastic Processes MIT, Spring 2011

Exercise 6.5: Solutions to Homework 0 6.262 Discrete Stochastic Processes MIT, Spring 20 Consider the Markov process illustrated below. The transitions are labelled by the rate q ij at which those transitions

### ISE/OR 760 Applied Stochastic Modeling

ISE/OR 760 Applied Stochastic Modeling Topic 2: Discrete Time Markov Chain Yunan Liu Department of Industrial and Systems Engineering NC State University Yunan Liu (NC State University) ISE/OR 760 1 /

### Lecturer: Olga Galinina

Renewal models Lecturer: Olga Galinina E-mail: olga.galinina@tut.fi Outline Reminder. Exponential models definition of renewal processes exponential interval distribution Erlang distribution hyperexponential

### Markov Processes Hamid R. Rabiee

Markov Processes Hamid R. Rabiee Overview Markov Property Markov Chains Definition Stationary Property Paths in Markov Chains Classification of States Steady States in MCs. 2 Markov Property A discrete

### Name of the Student:

SUBJECT NAME : Probability & Queueing Theory SUBJECT CODE : MA 6453 MATERIAL NAME : Part A questions REGULATION : R2013 UPDATED ON : November 2017 (Upto N/D 2017 QP) (Scan the above QR code for the direct

### ISM206 Lecture, May 12, 2005 Markov Chain

ISM206 Lecture, May 12, 2005 Markov Chain Instructor: Kevin Ross Scribe: Pritam Roy May 26, 2005 1 Outline of topics for the 10 AM lecture The topics are: Discrete Time Markov Chain Examples Chapman-Kolmogorov

### An Introduction to Stochastic Modeling

F An Introduction to Stochastic Modeling Fourth Edition Mark A. Pinsky Department of Mathematics Northwestern University Evanston, Illinois Samuel Karlin Department of Mathematics Stanford University Stanford,

LECTURE 19 Readings: Finish Section 5.2 Lecture outline Markov Processes I Checkout counter example. Markov process: definition. -step transition probabilities. Classification of states. Example: Checkout

### Markov Processes and Queues

MIT 2.853/2.854 Introduction to Manufacturing Systems Markov Processes and Queues Stanley B. Gershwin Laboratory for Manufacturing and Productivity Massachusetts Institute of Technology Markov Processes

### Non Markovian Queues (contd.)

MODULE 7: RENEWAL PROCESSES 29 Lecture 5 Non Markovian Queues (contd) For the case where the service time is constant, V ar(b) = 0, then the P-K formula for M/D/ queue reduces to L s = ρ + ρ 2 2( ρ) where

### Stochastic Modelling Unit 1: Markov chain models

Stochastic Modelling Unit 1: Markov chain models Russell Gerrard and Douglas Wright Cass Business School, City University, London June 2004 Contents of Unit 1 1 Stochastic Processes 2 Markov Chains 3 Poisson

### 2905 Queueing Theory and Simulation PART III: HIGHER DIMENSIONAL AND NON-MARKOVIAN QUEUES

295 Queueing Theory and Simulation PART III: HIGHER DIMENSIONAL AND NON-MARKOVIAN QUEUES 16 Queueing Systems with Two Types of Customers In this section, we discuss queueing systems with two types of customers.

### Bulk input queue M [X] /M/1 Bulk service queue M/M [Y] /1 Erlangian queue M/E k /1

Advanced Markovian queues Bulk input queue M [X] /M/ Bulk service queue M/M [Y] / Erlangian queue M/E k / Bulk input queue M [X] /M/ Batch arrival, Poisson process, arrival rate λ number of customers in

### Modelling data networks stochastic processes and Markov chains

Modelling data networks stochastic processes and Markov chains a 1, 3 1, 2 2, 2 b 0, 3 2, 3 u 1, 3 α 1, 6 c 0, 3 v 2, 2 β 1, 1 Richard G. Clegg (richard@richardclegg.org) November 2016 Available online

### Budapest University of Tecnology and Economics. AndrásVetier Q U E U I N G. January 25, Supported by. Pro Renovanda Cultura Hunariae Alapítvány

Budapest University of Tecnology and Economics AndrásVetier Q U E U I N G January 25, 2000 Supported by Pro Renovanda Cultura Hunariae Alapítvány Klebelsberg Kunó Emlékére Szakalapitvány 2000 Table of

### Analysis of an Infinite-Server Queue with Markovian Arrival Streams

Analysis of an Infinite-Server Queue with Markovian Arrival Streams Guidance Professor Associate Professor Assistant Professor Masao FUKUSHIMA Tetsuya TAKINE Nobuo YAMASHITA Hiroyuki MASUYAMA 1999 Graduate

### Birth and Death Processes. Birth and Death Processes. Linear Growth with Immigration. Limiting Behaviour for Birth and Death Processes

DTU Informatics 247 Stochastic Processes 6, October 27 Today: Limiting behaviour of birth and death processes Birth and death processes with absorbing states Finite state continuous time Markov chains

### Stochastic Petri Net

Stochastic Petri Net Serge Haddad LSV ENS Cachan & CNRS & INRIA haddad@lsv.ens-cachan.fr Petri Nets 2013, June 24th 2013 1 Stochastic Petri Net 2 Markov Chain 3 Markovian Stochastic Petri Net 4 Generalized

### 8. Statistical Equilibrium and Classification of States: Discrete Time Markov Chains

8. Statistical Equilibrium and Classification of States: Discrete Time Markov Chains 8.1 Review 8.2 Statistical Equilibrium 8.3 Two-State Markov Chain 8.4 Existence of P ( ) 8.5 Classification of States

### LECTURE 3. Last time:

LECTURE 3 Last time: Mutual Information. Convexity and concavity Jensen s inequality Information Inequality Data processing theorem Fano s Inequality Lecture outline Stochastic processes, Entropy rate

### 6 Continuous-Time Birth and Death Chains

6 Continuous-Time Birth and Death Chains Angela Peace Biomathematics II MATH 5355 Spring 2017 Lecture notes follow: Allen, Linda JS. An introduction to stochastic processes with applications to biology.

### Class 11 Non-Parametric Models of a Service System; GI/GI/1, GI/GI/n: Exact & Approximate Analysis.

Service Engineering Class 11 Non-Parametric Models of a Service System; GI/GI/1, GI/GI/n: Exact & Approximate Analysis. G/G/1 Queue: Virtual Waiting Time (Unfinished Work). GI/GI/1: Lindley s Equations

Sidney Resnick Adventures in Stochastic Processes with Illustrations Birkhäuser Boston Basel Berlin Table of Contents Preface ix CHAPTER 1. PRELIMINARIES: DISCRETE INDEX SETS AND/OR DISCRETE STATE SPACES

### Chapter 2. Poisson Processes. Prof. Shun-Ren Yang Department of Computer Science, National Tsing Hua University, Taiwan

Chapter 2. Poisson Processes Prof. Shun-Ren Yang Department of Computer Science, National Tsing Hua University, Taiwan Outline Introduction to Poisson Processes Definition of arrival process Definition

### Markov Processes Cont d. Kolmogorov Differential Equations

Markov Processes Cont d Kolmogorov Differential Equations The Kolmogorov Differential Equations characterize the transition functions {P ij (t)} of a Markov process. The time-dependent behavior of the

### 4.7.1 Computing a stationary distribution

At a high-level our interest in the rest of this section will be to understand the limiting distribution, when it exists and how to compute it To compute it, we will try to reason about when the limiting

### Let (Ω, F) be a measureable space. A filtration in discrete time is a sequence of. F s F t

2.2 Filtrations Let (Ω, F) be a measureable space. A filtration in discrete time is a sequence of σ algebras {F t } such that F t F and F t F t+1 for all t = 0, 1,.... In continuous time, the second condition

### Lecture 11: Introduction to Markov Chains. Copyright G. Caire (Sample Lectures) 321

Lecture 11: Introduction to Markov Chains Copyright G. Caire (Sample Lectures) 321 Discrete-time random processes A sequence of RVs indexed by a variable n 2 {0, 1, 2,...} forms a discretetime random process

### The Markov Chain Monte Carlo Method

The Markov Chain Monte Carlo Method Idea: define an ergodic Markov chain whose stationary distribution is the desired probability distribution. Let X 0, X 1, X 2,..., X n be the run of the chain. The Markov