Markovské řetězce se spojitým parametrem (Continuous-Time Markov Chains)


1 Markovské řetězce se spojitým parametrem
Mgr. Rudolf B. Blažek, Ph.D., prof. RNDr. Roman Kotecký, DrSc.
Katedra počítačových systémů, Katedra teoretické informatiky
Fakulta informačních technologií, České vysoké učení technické v Praze
Rudolf Blažek & Roman Kotecký, 2011
Statistika pro informatiku MI-SPI, ZS 2011/12, Přednáška 20
Evropský sociální fond Praha & EU: Investujeme do vaší budoucnosti

2 Mgr. Rudolf B. Blažek, Ph.D., prof. RNDr. Roman Kotecký, DrSc.
Department of Computer Systems, Department of Theoretical Informatics
Faculty of Information Technologies, Czech Technical University in Prague
Rudolf Blažek & Roman Kotecký, 2011
Statistics for Informatics MI-SPI, ZS 2011/12, Lecture 20
The European Social Fund Prague & EU: We Invest in Your Future

3 Recall: Discrete-Time Markov Chains
The Markov property for a discrete-time Markov chain:
$P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \dots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i) = P_{i,j} = p_{i,j} = p(i,j)$
In continuous time it is difficult to define the conditional probability given $X_r$ for all $r < s$; instead we work with all choices of $0 \le s_0 < s_1 < \dots < s_n < s$ (for all $n \ge 1$).

4 Definition (Markovský řetězec se spojitým parametrem)
$X_t$, $t \ge 0$, is a continuous-time Markov chain if for any times $0 \le s_0 < s_1 < \dots < s_n < s$ and any states $i_0, i_1, \dots, i_n, i, j$ (for all $n \ge 1$) we have
$P(X_{t+s} = j \mid X_s = i, X_{s_n} = i_n, \dots, X_{s_0} = i_0) = P(X_t = j \mid X_0 = i) = p_t(i,j)$

5 Construction: Discrete-Time Markov Chain with Poisson Process Timing (Example)
Let $N(t)$, $t \ge 0$, be a Poisson process with rate $\lambda$. Let $Y_n$ be a discrete-time Markov chain with transition probabilities $u(i,j)$. Assume $N(t)$ is independent of $Y_n$. Then $X_t = Y_{N(t)}$ is a continuous-time Markov chain: $X_t$ jumps according to $u(i,j)$ at each arrival of $N(t)$.
Most continuous-time Markov chains can be constructed in a similar way. We will show how later today.
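
A minimal simulation sketch of this construction (not from the lecture; Python with NumPy is assumed, and the 3-state matrix U, the rate lam, and the seed are hypothetical example values):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-state discrete-time chain Y_n with transition matrix u(i, j)
U = np.array([[0.0, 0.5, 0.5],
              [0.3, 0.0, 0.7],
              [0.6, 0.4, 0.0]])
lam = 2.0  # rate of the driving Poisson process N(t)

def simulate_X(t_end, x0=0):
    """Simulate X_t = Y_{N(t)}: at each arrival of N(t), jump according to u(i, j)."""
    t, state = 0.0, x0
    path = [(t, state)]
    while True:
        t += rng.exponential(1.0 / lam)          # inter-arrival time ~ Exp(lam)
        if t > t_end:
            break
        state = rng.choice(len(U), p=U[state])   # next state drawn from row u(state, .)
        path.append((t, state))
    return path

print(simulate_X(5.0))   # list of (jump time, state) pairs up to time 5
```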

6 Construction: Discrete-Time Markov Chain with Poisson Process Timing (Example)
$N(t)$, $t \ge 0$, is a Poisson process with rate $\lambda$, therefore the number of arrivals in $(0,t)$ is a Poisson random variable: $N(t) \sim \mathrm{Poisson}(\lambda t)$, i.e. $P(N(t) = n) = e^{-\lambda t} \frac{(\lambda t)^n}{n!}$.
Therefore
$p_t(i,j) = P(X_t = j \mid X_0 = i) = \sum_{n=0}^{\infty} P(N(t) = n, X_t = j \mid X_0 = i)$

7 Construction: Discrete-Time Markov Chain with Poisson Process Timing (Example)
$p_t(i,j) = \sum_{n=0}^{\infty} P(N(t) = n, X_t = j \mid X_0 = i)$
Since $X_t = Y_{N(t)}$ and $N(t)$ is independent of the Markov chain $Y_n$,
$p_t(i,j) = \sum_{n=0}^{\infty} P(N(t) = n, Y_{N(t)} = j \mid Y_{N(0)} = i) = \sum_{n=0}^{\infty} P(N(t) = n)\, P(Y_n = j \mid Y_0 = i)$

8 Construction: Discrete-Time Markov Chain with Poisson Process Timing (Example)
$p_t(i,j) = \sum_{n=0}^{\infty} P(N(t) = n)\, P(Y_n = j \mid Y_0 = i) = \sum_{n=0}^{\infty} e^{-\lambda t} \frac{(\lambda t)^n}{n!}\, u^n(i,j)$
Here $N(t) \sim \mathrm{Poisson}(\lambda t)$ and $u^n(i,j)$ is the $n$-step transition probability of the discrete-time Markov chain $Y_n$.
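
As a quick numerical sanity check of this series (a sketch, assuming Python/NumPy; U, lam, and t are the same hypothetical example values as above), one can truncate the sum and verify that the resulting matrix $p_t$ is stochastic:

```python
import numpy as np
from math import exp, factorial

# Same hypothetical U and lam as in the simulation sketch above
U = np.array([[0.0, 0.5, 0.5],
              [0.3, 0.0, 0.7],
              [0.6, 0.4, 0.0]])
lam, t = 2.0, 0.7

# p_t = sum_n e^{-lam t} (lam t)^n / n! * U^n, truncated at 60 terms
p_t = sum(exp(-lam * t) * (lam * t) ** n / factorial(n) * np.linalg.matrix_power(U, n)
          for n in range(60))

print(p_t)               # p_t(i, j) for the continuous-time chain X_t = Y_{N(t)}
print(p_t.sum(axis=1))   # each row sums to 1 (up to truncation error)
```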

9 Chapman-Kolmogorov Equation
Theorem: $\sum_k p_s(i,k)\, p_t(k,j) = p_{s+t}(i,j)$
Idea of the proof: We want to go from $i$ to $j$ in time $s+t$. First we must go to some state $k$ in time $s$, then we must finish by going from $k$ to $j$ in time $t$. We must consider all states $k$, thus we sum over $k$.
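
The theorem is easy to check numerically for the construction above, since in matrix form it reads $p_s\, p_t = p_{s+t}$ (a sketch reusing the truncated series; all values are hypothetical):

```python
import numpy as np
from math import exp, factorial

U = np.array([[0.0, 0.5, 0.5],
              [0.3, 0.0, 0.7],
              [0.6, 0.4, 0.0]])
lam = 2.0

def p(t, n_terms=60):
    """Transition matrix p_t of X_t = Y_{N(t)}, via the truncated series from slide 8."""
    return sum(exp(-lam * t) * (lam * t) ** n / factorial(n)
               * np.linalg.matrix_power(U, n) for n in range(n_terms))

s, t = 0.4, 1.1
# Chapman-Kolmogorov in matrix form: p_s @ p_t == p_{s+t}
print(np.allclose(p(s) @ p(t), p(s + t)))   # True
```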

10 Chapman-Kolmogorov Equation
Theorem: $\sum_k p_s(i,k)\, p_t(k,j) = p_{s+t}(i,j)$
Assume we know $p_t(i,j)$ for all $t \le t_0$ (and for all $i,j$). The theorem then allows us to calculate $p_t(i,j)$ for all $t > 0$!
Let $t_0 \downarrow 0$ and consider the derivative at 0: $\lim_{t_0 \to 0^+} \big(p_{t_0}(i,j) - 0\big)/t_0$.
The derivative at 0 should be enough to determine $p_t(i,j)$ for all $t > 0$. Well, it is! (We will show this later today.)

11 Jump Rates
Definition (Intenzita přechodu): Denote the derivative of $p_t(i,j)$ with respect to time at 0 as
$q(i,j) = \lim_{h \to 0^+} \frac{p_h(i,j)}{h}$ for $j \ne i$.
If the derivative exists, then we call $q(i,j)$ the jump rate from $i$ to $j$. Note that we do not calculate $q(i,i)$.
Why is it called a jump rate? Let's look at our construction of the Markov chain...

12 Jump Rates
Why is $q(i,j)$ called a jump rate? Previous example: $X_t = Y_{N(t)}$; $Y_n$ is a discrete-time MC with transition probabilities $u(i,j)$; $N(t)$, $t \ge 0$, is a Poisson process with rate $\lambda$, independent of $Y_n$.
If $X_t$ is at $i$, it makes jumps with rate $\lambda$ (a Poisson process) and goes to $j$ with probability $u(i,j)$. This is a thinning of the Poisson process: $X_t$ jumps from $i$ to $j$ as a Poisson process with rate $\lambda\, u(i,j)$.
Next we will show that in this example $q(i,j) = \lambda\, u(i,j)$. That is why we call $q(i,j)$ the jump rate from $i$ to $j$.

13 Jump Rates
Why is $q(i,j)$ called a jump rate? Previous example: $X_t = Y_{N(t)}$; $Y_n$ is a discrete-time MC with transition probabilities $u(i,j)$; $N(t)$, $t \ge 0$, is a Poisson process with rate $\lambda$, independent of $Y_n$. We derived before
$p_t(i,j) = \sum_{n=0}^{\infty} e^{-\lambda t} \frac{(\lambda t)^n}{n!}\, u^n(i,j)$
Therefore
$\frac{p_h(i,j)}{h} = \frac{1}{h} \sum_{n=0}^{\infty} e^{-\lambda h} \frac{(\lambda h)^n}{n!}\, u^n(i,j)$

14 Jump Rates
Why is $q(i,j)$ called a jump rate? Splitting off the first two terms of the series,
$\frac{p_h(i,j)}{h} = \frac{e^{-\lambda h}}{h}\, u^0(i,j) + e^{-\lambda h}\, \lambda\, u^1(i,j) + e^{-\lambda h} \sum_{n=2}^{\infty} \frac{\lambda^n h^{n-1}}{n!}\, u^n(i,j)$
Let $h \to 0$: the first term vanishes since $u^0(i,j) = 0$ for $j \ne i$ (a transition in 0 steps), the second term tends to $\lambda\, u(i,j)$ since $e^{-\lambda h} \to 1$, and the remaining sum tends to 0.

15 Jump Rates
Why is $q(i,j)$ called a jump rate? Summary of the previous example: $X_t = Y_{N(t)}$; $Y_n$ is a discrete-time MC with transition probabilities $u(i,j)$; $N(t)$, $t \ge 0$, is a Poisson process with rate $\lambda$, independent of $Y_n$. $X_t$ jumps from $i$ to $j$ as a Poisson process with rate $\lambda\, u(i,j)$. The jump rates of the process $X_t$ are
$q(i,j) = \lim_{h \to 0^+} \frac{p_h(i,j)}{h} = \lim_{h \to 0^+} e^{-\lambda h}\, \lambda\, u(i,j) = \lambda\, u(i,j)$
Most Markov chains can be constructed in a similar way! That is why for any continuous-time Markov chain we call $q(i,j)$ the jump rate from $i$ to $j$.
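
The limit $q(i,j) = \lambda\, u(i,j)$ can also be checked numerically by evaluating $p_h(i,j)/h$ for a small $h$ (a sketch with the same hypothetical U and lam as above):

```python
import numpy as np
from math import exp, factorial

U = np.array([[0.0, 0.5, 0.5],
              [0.3, 0.0, 0.7],
              [0.6, 0.4, 0.0]])
lam = 2.0

def p(t, n_terms=40):
    """Truncated series for p_t of X_t = Y_{N(t)} (slide 8)."""
    return sum(exp(-lam * t) * (lam * t) ** n / factorial(n)
               * np.linalg.matrix_power(U, n) for n in range(n_terms))

h = 1e-4
i, j = 0, 1
print(p(h)[i, j] / h)   # ~ 0.9998, close to q(0, 1) = lam * u(0, 1) = 1.0
```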

16 Jump Rates
Simple example: Poisson Process as a Markov Chain. Consider a Poisson process with rate $\lambda$ and let $X(t)$ be the number of arrivals of the process in $(0,t)$. Then $X(t)$ increases from $n$ to $n+1$ at rate $\lambda$:
$q(n, n+1) = \lambda$ for all $n \ge 0$
This simple example will allow us to create other, more complicated examples...

17 Jump Rates
Queueing System M/M/s (diagram): arriving customers from an infinite customer population form a Poisson process with rate $\lambda$; they wait in an infinite FCFS queue for one of $s$ parallel servers with $\mathrm{Exp}(\mu)$ service times; served customers depart.

18 Jump Rates
Example: M/M/s Queue. Consider load balancing across $s$ replicated database servers. A request is routed to the next available server; requests line up in a single queue if all servers are busy. Requests arrive at the times of a Poisson process with rate $\lambda$:
$q(n, n+1) = \lambda$ for all $n \ge 0$
Service times are independent random variables $\sim \mathrm{Exp}(\mu)$:
$q(n, n-1) = n\mu$ if $0 \le n \le s$, and $q(n, n-1) = s\mu$ if $n \ge s$.
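
A sketch of these jump rates assembled into a matrix, truncated to a finite number of states purely for illustration (Python/NumPy assumed; the values of lam, mu, s, and the truncation level are hypothetical; the diagonal entries $-\lambda_n$ anticipate the matrix Q defined on slide 24):

```python
import numpy as np

def mms_rates(lam, mu, s, max_n):
    """Jump-rate matrix of an M/M/s queue, truncated at max_n customers for illustration.
    Off-diagonal entries are q(n, m); the diagonal holds -lam_n (the convention of slide 24)."""
    Q = np.zeros((max_n + 1, max_n + 1))
    for n in range(max_n + 1):
        if n < max_n:
            Q[n, n + 1] = lam                # arrivals: q(n, n+1) = lam
        if n > 0:
            Q[n, n - 1] = mu * min(n, s)     # departures: n*mu if n <= s, else s*mu
        Q[n, n] = -Q[n].sum()                # -lam_n, so each row sums to 0
    return Q

print(mms_rates(lam=3.0, mu=1.0, s=2, max_n=5))
```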

19 Jump Rates
Example: M/M/s Queue with Balking. The same setup; requests arrive as a Poisson process ($\lambda$). The load balancer forwards a request randomly to a different data center with probability $1 - a_n$ if there are already $n$ requests, i.e. the new request stays with probability $a_n$:
$q(n, n+1) = \lambda\, a_n$ for all $n \ge 0$
Service times are as before, independent $\sim \mathrm{Exp}(\mu)$:
$q(n, n-1) = n\mu$ if $0 \le n \le s$, and $q(n, n-1) = s\mu$ if $n \ge s$.
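
The balking variant only changes the arrival rates; a sketch with a hypothetical staying probability $a_n = 1/(n+1)$:

```python
import numpy as np

def mms_balking_rates(lam, mu, s, a, max_n):
    """As mms_rates above, but arrivals are thinned by the staying probability a(n)."""
    Q = np.zeros((max_n + 1, max_n + 1))
    for n in range(max_n + 1):
        if n < max_n:
            Q[n, n + 1] = lam * a(n)         # q(n, n+1) = lam * a_n
        if n > 0:
            Q[n, n - 1] = mu * min(n, s)     # service rates unchanged
        Q[n, n] = -Q[n].sum()
    return Q

# Hypothetical staying probabilities a_n = 1 / (n + 1)
print(mms_balking_rates(lam=3.0, mu=1.0, s=2, a=lambda n: 1.0 / (n + 1), max_n=5))
```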

20 Jump Rates
Jump Rates vs. Transition Probabilities. In our example the jump rates are $q(i,j) = \lambda\, u(i,j)$: we leave $i$ with rate $\lambda$ (a Poisson process)... and go to $j$ with probability $u(i,j)$.
Note this can be reversed: $u(i,j) = q(i,j)/\lambda$. This is how we can construct the MC from its jump rates... but what is $\lambda$?

21 Constructing Markov Chains from Jump Rates
From jump rates to a Markov chain. Let $\lambda_i = \sum_{j \ne i} q(i,j)$; this is the rate at which $X_t$ leaves $i$. If $\lambda_i = \infty$, then $X_t$ leaves $i$ immediately... so we assume $\lambda_i < \infty$. For $\lambda_i > 0$ we define (a routing matrix) $r(i,j) = q(i,j)/\lambda_i$.
If $X_t$ is in state $i$ with $\lambda_i = 0$, then $X_t$ stays in $i$ forever. If $\lambda_i > 0$, then $X_t$ waits in $i$ for a random time $\sim \mathrm{Exp}(\lambda_i)$ and then jumps to $j$ with probability $r(i,j)$.
We have many Poisson processes, one with rate $\lambda_i$ for each state $i$.
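
A sketch of this hold-and-jump construction as a simulator (Python/NumPy assumed; the 3-state rate matrix and the seed are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_ctmc(q, x0, t_end):
    """Hold-and-jump construction: wait Exp(lam_i) in state i, then move with r(i, j) = q(i, j)/lam_i."""
    t, state = 0.0, x0
    path = [(t, state)]
    while True:
        rates = q[state].copy()
        rates[state] = 0.0                      # only jump rates to other states
        lam_i = rates.sum()                     # lam_i = sum_{j != i} q(i, j)
        if lam_i == 0.0:                        # lam_i = 0: stay in i forever
            break
        t += rng.exponential(1.0 / lam_i)       # holding time ~ Exp(lam_i)
        if t > t_end:
            break
        state = rng.choice(len(q), p=rates / lam_i)  # routing matrix r(i, j)
        path.append((t, state))
    return path

# Hypothetical 3-state jump rates q(i, j); the diagonal is ignored by the simulator
q = np.array([[0.0, 1.0, 1.0],
              [0.5, 0.0, 1.0],
              [1.0, 2.0, 0.0]])
print(simulate_ctmc(q, x0=0, t_end=5.0))
```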

22 Constructing Markov Chains from Jump Rates
A single Poisson process for Markov chains with bounded rates. Let $\lambda_i = \sum_{j \ne i} q(i,j)$ and $\Lambda = \max_i \lambda_i$. Assume that $\Lambda < \infty$ (bounded rates) and define
$u(i,j) = q(i,j)/\Lambda$ for $j \ne i$, and $u(i,i) = 1 - \sum_{j \ne i} u(i,j)$.
$X_t$ tries to make a transition with rate $\Lambda$ (a Poisson process); it leaves $i$ for $j$ with probability $u(i,j)$ and stays in $i$ with probability $u(i,i)$. Any MC with bounded rates can thus be constructed as in our initial example!
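
A sketch of this uniformization step, turning bounded jump rates $q(i,j)$ into the pair $(\Lambda, u)$ used in the initial construction (the example rates are hypothetical):

```python
import numpy as np

def uniformize(q):
    """Given bounded jump rates q(i, j), return Lambda = max_i lam_i and the matrix u(i, j),
    so that the chain is driven by a single Poisson(Lambda) process, as in the first example."""
    rates = q - np.diag(np.diag(q))      # keep only the off-diagonal jump rates
    lam = rates.sum(axis=1)              # lam_i = sum_{j != i} q(i, j)
    Lam = lam.max()                      # Lambda = max_i lam_i, assumed finite
    U = rates / Lam                      # u(i, j) = q(i, j) / Lambda for j != i
    np.fill_diagonal(U, 1.0 - lam / Lam) # u(i, i) = 1 - sum_{j != i} u(i, j)
    return Lam, U

q = np.array([[0.0, 1.0, 1.0],
              [0.5, 0.0, 1.0],
              [1.0, 2.0, 0.0]])
Lam, U = uniformize(q)
print(Lam)              # 3.0
print(U.sum(axis=1))    # each row of u sums to 1
```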

23 Kolmogorov's Backward and Forward Equations
Computing transition probabilities. Our goal is to use the jump rates $q(i,j)$ to find the transition probabilities $p_t(i,j) = P(X_t = j \mid X_0 = i)$.
Recall the Chapman-Kolmogorov equation: $\sum_k p_s(i,k)\, p_t(k,j) = p_{s+t}(i,j)$.
It can be differentiated to show that
$p_t'(i,j) = \sum_{k \ne i} q(i,k)\, p_t(k,j) - \lambda_i\, p_t(i,j)$

24 Kolmogorov's Backward and Forward Equations
Kolmogorov's Backward Equation. Define the matrix
$Q(i,j) = q(i,j)$ if $j \ne i$, and $Q(i,i) = -\lambda_i$.
The equation $p_t'(i,j) = \sum_{k \ne i} q(i,k)\, p_t(k,j) - \lambda_i\, p_t(i,j)$ can be rewritten in matrix notation as $p_t' = Q\, p_t$. This is Kolmogorov's backward equation.

25 Kolmogorov's Backward and Forward Equations
Kolmogorov's backward equation is $p_t' = Q\, p_t$.
Using similar techniques, one can obtain Kolmogorov's forward equation: $p_t' = p_t\, Q$.

26 Kolmogorov's Backward and Forward Equations
Solving Kolmogorov's Backward Equation. Theorem: The solution of Kolmogorov's backward equation $p_t' = Q\, p_t$ is $p_t = e^{Qt}$ (just as if $Q$ were a number). The exponential function of the matrix $Q$ is defined as $e^{Qt} = \sum_{n=0}^{\infty} \frac{(Qt)^n}{n!}$.
Proof: by direct differentiation.
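
A sketch of the solution in practice, using SciPy's matrix exponential (scipy.linalg.expm) on a hypothetical generator Q, together with a finite-difference check of the backward equation:

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical generator: off-diagonal entries q(i, j), diagonal entries -lam_i
Q = np.array([[-2.0,  1.0,  1.0],
              [ 0.5, -1.5,  1.0],
              [ 1.0,  2.0, -3.0]])

t = 0.8
P_t = expm(Q * t)            # p_t = e^{Qt}

print(P_t.sum(axis=1))       # each row sums to 1, as a transition matrix should
# Finite-difference check of the backward equation p_t' = Q p_t
h = 1e-6
print(np.allclose((expm(Q * (t + h)) - P_t) / h, Q @ P_t, atol=1e-4))   # True
```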
