Markovské řetězce se spojitým parametrem


Markovské řetězce se spojitým parametrem (Continuous-Time Markov Chains)
Mgr. Rudolf B. Blažek, Ph.D., prof. RNDr. Roman Kotecký, DrSc.
Department of Computer Systems, Department of Theoretical Informatics
Faculty of Information Technology, Czech Technical University in Prague
(c) Rudolf Blažek & Roman Kotecký, 2011
Statistics for Informatics, MI-SPI, ZS 2011/12, Lecture 20
The European Social Fund, Prague & EU: We Invest in Your Future

Recall: Discrete-Time Markov Chains

The Markov property for a discrete-time Markov chain:

P(X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, ..., X_0 = i_0) = P(X_{n+1} = j | X_n = i) = P_{i,j} = p_{i,j} = p(i, j)

In continuous time it is difficult to define the conditional probability given X_r for all r < s; instead we work with all choices of times 0 ≤ s_0 < s_1 < ... < s_n < s (for all n ≥ 1).

Definition (Markovský řetězec se spojitým parametrem). X_t, t ≥ 0 is a continuous-time Markov chain if for any times 0 ≤ s_0 < s_1 < ... < s_n < s and any states i_0, i_1, ..., i_n, i, j (for all n ≥ 1) we have

P(X_{t+s} = j | X_s = i, X_{s_n} = i_n, ..., X_{s_0} = i_0) = P(X_t = j | X_0 = i) = p_t(i, j)

Construction: Discrete-Time Markov Chain with Poisson Process Timing

Example. Let N(t), t ≥ 0 be a Poisson process with rate λ. Let Y_n be a discrete-time Markov chain with transition probabilities u(i, j). Assume N(t) is independent of Y_n. Then X_t = Y_{N(t)} is a continuous-time Markov chain: X_t jumps according to u(i, j) at each arrival of N(t).

Most continuous-time Markov chains can be constructed in a similar way. We will show how later today.
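The construction X_t = Y_{N(t)} can be simulated directly: draw exponential inter-arrival times for the Poisson clock and move the discrete-time chain at each arrival. This is only a minimal sketch; the function name `simulate_xt`, the 2-state matrix `u`, and the rate `lam` are hypothetical choices for illustration, not from the lecture.

```python
import numpy as np

def simulate_xt(u, lam, i0, t_end, rng):
    """Simulate X_t = Y_{N(t)}: a discrete-time chain with transition
    matrix u, jumping at the arrival times of a Poisson process with
    rate lam. Returns the jump times and the visited states."""
    times, states = [0.0], [i0]
    t, state = 0.0, i0
    while True:
        t += rng.exponential(1.0 / lam)    # inter-arrival ~ Exp(lam)
        if t > t_end:
            break
        # at each Poisson arrival, jump according to u(i, j)
        state = int(rng.choice(len(u), p=u[state]))
        times.append(t)
        states.append(state)
    return times, states

rng = np.random.default_rng(0)
u = np.array([[0.0, 1.0], [0.5, 0.5]])     # hypothetical 2-state DTMC
times, states = simulate_xt(u, lam=3.0, i0=0, t_end=10.0, rng=rng)
```

Between arrivals the process simply sits in its current state, which is exactly what makes the holding times exponential.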

Construction: Discrete-Time Markov Chain with Poisson Process Timing

N(t), t ≥ 0 is a Poisson process with rate λ, therefore the number of arrivals in (0, t) is a Poisson random variable, N(t) ~ Poisson(λt):

P(N(t) = n) = e^{-λt} (λt)^n / n!

Therefore

p_t(i, j) = P(X_t = j | X_0 = i) = Σ_{n=0}^∞ P(N(t) = n, X_t = j | X_0 = i)

Construction: Discrete-Time Markov Chain with Poisson Process Timing

p_t(i, j) = Σ_{n=0}^∞ P(N(t) = n, X_t = j | X_0 = i)
          = Σ_{n=0}^∞ P(N(t) = n, Y_{N(t)} = j | Y_{N(0)} = i)     (since X_t = Y_{N(t)})
          = Σ_{n=0}^∞ P(N(t) = n) P(Y_n = j | Y_0 = i)             (N(t) is independent of the MC Y_n)

Construction: Discrete-Time Markov Chain with Poisson Process Timing

p_t(i, j) = Σ_{n=0}^∞ P(N(t) = n) P(Y_n = j | Y_0 = i)
          = Σ_{n=0}^∞ e^{-λt} (λt)^n / n! · u^n(i, j)

where N(t) ~ Poisson(λt) and u^n(i, j) is the n-step transition probability of the discrete-time Markov chain Y_n.

Chapman-Kolmogorov Equation

Theorem.  Σ_k p_s(i, k) p_t(k, j) = p_{s+t}(i, j)

Idea of the proof: We want to go from i to j in time s + t. First we must go to some state k in time s; then we must finish by going from k to j in time t. We must consider all states k, thus we sum over k.
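The Chapman-Kolmogorov equation can be checked numerically for the Poisson-timed construction by evaluating the series p_t = Σ_n e^{-λt}(λt)^n/n! · u^n and comparing p_s p_t with p_{s+t}. A minimal sketch, assuming a hypothetical 2-state chain `u` and rate `lam`; `p_matrix` and the truncation at 60 terms are illustrative choices:

```python
import numpy as np

def p_matrix(u, lam, t, n_terms=60):
    """p_t = sum_{n>=0} e^{-lam*t} (lam*t)^n / n! * u^n (truncated),
    the transition matrix of X_t = Y_{N(t)}."""
    p = np.zeros_like(u, dtype=float)
    u_n = np.eye(len(u))               # u^0
    w = np.exp(-lam * t)               # Poisson weight for n = 0
    for n in range(n_terms):
        p += w * u_n
        u_n = u_n @ u
        w *= lam * t / (n + 1)         # next Poisson weight
    return p

u = np.array([[0.1, 0.9], [0.6, 0.4]])   # hypothetical 2-state DTMC
lam, s, t = 2.0, 0.7, 1.3
lhs = p_matrix(u, lam, s) @ p_matrix(u, lam, t)   # p_s p_t
rhs = p_matrix(u, lam, s + t)                     # p_{s+t}
```

For these parameters `lhs` and `rhs` agree to machine precision, matching the theorem.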

Chapman-Kolmogorov Equation

Σ_k p_s(i, k) p_t(k, j) = p_{s+t}(i, j)

Assume we know p_t(i, j) for all t ≤ t_0 (and for all i, j). The theorem then allows us to calculate p_t(i, j) for all t > 0.

Let t_0 → 0 and consider the derivative at 0: lim_{h→0+} (p_h(i, j) − p_0(i, j)) / h. The derivative at 0 should be enough to determine p_t(i, j) for all t > 0. Well, it is! (We will show this later today.)

Jump Rates

Definition (Intenzita přechodu). Denote the derivative of p_t(i, j) with respect to time at 0 as

q(i, j) = lim_{h→0+} p_h(i, j) / h    for j ≠ i

If the derivative exists, then we call q(i, j) the jump rate from i to j. Note that we do not calculate q(i, i).

Why is it called a jump rate? Let's look at our construction of the Markov chain...

Jump Rates: Why is q(i, j) called a jump rate?

Previous example: X_t = Y_{N(t)}; Y_n is a discrete-time MC with transition probabilities u(i, j); N(t), t ≥ 0 is a Poisson process with rate λ, independent of Y_n.

If X_t is at i, it makes jumps with rate λ (a Poisson process) and goes to j with probability u(i, j). This is a Poisson process thinning: X_t jumps from i to j as a Poisson process with rate λ u(i, j).

Next we will show that in this example q(i, j) = λ u(i, j). That is why we call q(i, j) the jump rate from i to j.

Jump Rates: Why is q(i, j) called a jump rate?

We obtained before

p_t(i, j) = Σ_{n=0}^∞ e^{-λt} (λt)^n / n! · u^n(i, j)

Therefore

p_h(i, j) / h = (1/h) Σ_{n=0}^∞ e^{-λh} (λh)^n / n! · u^n(i, j)

Jump Rates: Why is q(i, j) called a jump rate?

p_h(i, j) / h = (1/h) [ e^{-λh} (λh)^0 / 0! · u^0(i, j) + e^{-λh} (λh)^1 / 1! · u^1(i, j) + e^{-λh} Σ_{n=2}^∞ (λh)^n / n! · u^n(i, j) ]

Let h → 0. The first term vanishes, since u^0(i, j) = 0 for j ≠ i (a transition in 0 steps is impossible). The second term tends to λ u(i, j). The remaining sum is of order h, so it tends to 0.

Jump Rates: Summary of the previous example

X_t = Y_{N(t)}; Y_n is a discrete-time MC with transition probabilities u(i, j); N(t), t ≥ 0 is a Poisson process with rate λ, independent of Y_n. X_t jumps from i to j as a Poisson process with rate λ u(i, j). The jump rates of the process X_t are

q(i, j) = lim_{h→0} p_h(i, j) / h = lim_{h→0} λ e^{-λh} u(i, j) = λ u(i, j)

Most Markov chains can be constructed in a similar way! That is why for any continuous-time Markov chain we call q(i, j) the jump rates from i to j.
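The limit q(i, j) = λ u(i, j) can be observed numerically: evaluate the series for p_h(i, j) at a small h and divide by h. This sketch assumes a hypothetical 2-state chain `u` and rate `lam`; the helper `p_h` and the truncation length are illustrative:

```python
import numpy as np

def p_h(u, lam, h, n_terms=30):
    """Truncated series p_h = sum_n e^{-lam*h} (lam*h)^n / n! * u^n."""
    p = np.zeros_like(u, dtype=float)
    u_n, w = np.eye(len(u)), np.exp(-lam * h)
    for n in range(n_terms):
        p += w * u_n
        u_n = u_n @ u
        w *= lam * h / (n + 1)
    return p

u = np.array([[0.2, 0.8], [0.3, 0.7]])   # hypothetical 2-state DTMC
lam, h = 2.0, 1e-6
q01 = p_h(u, lam, h)[0, 1] / h           # should approach lam * u(0, 1)
```

For h this small, `q01` agrees with λ u(0, 1) = 1.6 up to terms of order h.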

Jump Rates: Poisson Process as a Markov Chain

Simple example: Poisson process. Consider a Poisson process with rate λ. Let X(t) be the number of arrivals of the process in (0, t). X(t) increases from n to n + 1 at rate λ:

q(n, n+1) = λ    for all n ≥ 0

This simple example will allow us to create other, more complicated examples...

Jump Rates: Queueing System M/M/s

[Diagram: customers from an infinite population arrive as a Poisson process with rate λ, wait in an infinite FCFS queue, are served by s parallel servers with Exp(μ) service times, and depart.]

Jump Rates: Queueing System M/M/s

Example: M/M/s queue. Consider load-balancing across s replicated database servers. A request is routed to the next available server. Requests line up in a single queue if all servers are busy.

Requests arrive at the times of a Poisson process with rate λ:

q(n, n+1) = λ     for all n ≥ 0

Service times are random, independent ~ Exp(μ):

q(n, n−1) = nμ    if 0 < n ≤ s
q(n, n−1) = sμ    if n ≥ s
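The M/M/s jump rates above translate directly into code: arrivals always occur at rate λ, while the departure rate is μ times the number of busy servers, min(n, s). A minimal sketch; the helper name `mms_rate` is a hypothetical choice:

```python
def mms_rate(n, m, lam, mu, s):
    """Jump rate q(n, m) of an M/M/s queue: arrival rate lam,
    service rate mu per server, s servers."""
    if m == n + 1:                 # arrival: n -> n + 1
        return lam
    if m == n - 1 and n >= 1:      # departure: min(n, s) servers busy
        return min(n, s) * mu
    return 0.0
```

All other transitions (jumps by more than one) have rate zero, which is what makes this a birth-death chain.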

Jump Rates: Queueing System M/M/s with Balking

Example: M/M/s queue with balking. The same setup; requests arrive as a Poisson process (λ). The load balancer forwards a request randomly to a different data center with probability 1 − a_n if there are n requests in the system. I.e., the new request stays with probability a_n:

q(n, n+1) = λ a_n    for all n ≥ 0

Service times are as before, independent ~ Exp(μ):

q(n, n−1) = nμ    if 0 < n ≤ s
q(n, n−1) = sμ    if n ≥ s

Jump Rates vs. Transition Probabilities

Example. In our example the jump rates are q(i, j) = λ u(i, j): we leave i with rate λ (a Poisson process)... and go to j with probability u(i, j). Note this can be reversed:

u(i, j) = q(i, j) / λ

This is how we can construct the MC from its jump rates... but what is λ?

Constructing Markov Chains from Jump Rates

Constructing the MC from its jump rates: Let λ_i = Σ_{j≠i} q(i, j). This is the rate at which X_t leaves i. If λ_i = ∞, then X_t leaves i immediately... so we assume λ_i < ∞. For λ_i > 0 we define (a routing matrix)

r(i, j) = q(i, j) / λ_i

If X_t is in state i with λ_i = 0, then X_t stays in i forever. If λ_i > 0, then X_t waits in i for a random time ~ Exp(λ_i) and then jumps to j with probability r(i, j). We have many Poisson processes: one with rate λ_i for each state i.
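This construction is itself a simulation recipe: in state i, wait Exp(λ_i), then route with r(i, j) = q(i, j)/λ_i. A minimal sketch of that recipe; the function name `simulate_ctmc` and the 2-state rate matrix are hypothetical examples:

```python
import numpy as np

def simulate_ctmc(q, i0, t_end, rng):
    """Simulate a CTMC from its jump rates q (only off-diagonal
    entries are used): wait Exp(lam_i) in state i, then jump to j
    with probability r(i, j) = q(i, j) / lam_i."""
    t, state = 0.0, i0
    path = [(0.0, i0)]
    while True:
        rates = q[state].copy()
        rates[state] = 0.0
        lam_i = rates.sum()            # lam_i = sum_{j != i} q(i, j)
        if lam_i == 0.0:               # lam_i = 0: stays here forever
            break
        t += rng.exponential(1.0 / lam_i)
        if t > t_end:
            break
        state = int(rng.choice(len(q), p=rates / lam_i))
        path.append((t, state))
    return path

rng = np.random.default_rng(1)
q = np.array([[0.0, 2.0], [3.0, 0.0]])   # hypothetical 2-state rates
path = simulate_ctmc(q, 0, 5.0, rng)
```

Each entry of `path` records a jump time and the state entered, so holding times can be read off as differences of consecutive times.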

A Single Poisson Process for Markov Chains with Bounded Rates

Constructing the MC from bounded jump rates: Let λ_i = Σ_{j≠i} q(i, j) and Λ = max_i λ_i. Assume that Λ < ∞ (bounded rates) and define

u(i, j) = q(i, j) / Λ               for j ≠ i
u(i, i) = 1 − Σ_{j≠i} u(i, j)

X_t tries to make a transition with rate Λ (a Poisson process). X_t leaves i for j with probability u(i, j), but stays in i with probability u(i, i). Any MC with bounded rates can be constructed as in our initial example!
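This bounded-rates construction (often called uniformization) amounts to one small computation: find Λ, scale the off-diagonal rates, and put the leftover mass on the diagonal. A sketch under those assumptions; the helper name `uniformize` is hypothetical:

```python
import numpy as np

def uniformize(q):
    """Return (Lam, u) for bounded jump rates q: Lam = max_i lam_i,
    u(i, j) = q(i, j) / Lam for j != i,
    u(i, i) = 1 - sum_{j != i} u(i, j)."""
    off = np.asarray(q, dtype=float).copy()
    np.fill_diagonal(off, 0.0)             # only off-diagonal rates matter
    lam = off.sum(axis=1)                  # lam_i = sum_{j != i} q(i, j)
    Lam = lam.max()
    u = off / Lam
    np.fill_diagonal(u, 1.0 - u.sum(axis=1))
    return Lam, u

q = np.array([[0.0, 2.0], [3.0, 0.0]])     # hypothetical jump rates
Lam, u = uniformize(q)
```

Here Λ = 3 and u = [[1/3, 2/3], [1, 0]]: state 0 "fails" to jump with probability 1/3, which is exactly how a single rate-Λ Poisson clock reproduces the slower rate λ_0 = 2.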

Kolmogorov's Backward and Forward Equations: Computing Transition Probabilities

Our goal is to use the jump rates q(i, j) to find the transition probabilities

p_t(i, j) = P(X_t = j | X_0 = i)

Recall the Chapman-Kolmogorov equation:

Σ_k p_s(i, k) p_t(k, j) = p_{s+t}(i, j)

It can be differentiated to show that

p'_t(i, j) = Σ_{k≠i} q(i, k) p_t(k, j) − λ_i p_t(i, j)

Kolmogorov's Backward Equation

Define a matrix

Q(i, j) = q(i, j)    if j ≠ i
Q(i, i) = −λ_i

The equation

p'_t(i, j) = Σ_{k≠i} q(i, k) p_t(k, j) − λ_i p_t(i, j)

can be rewritten in matrix notation as p'_t = Q p_t. This is Kolmogorov's backward equation.

Kolmogorov's Backward and Forward Equations

Kolmogorov's backward equation is

p'_t = Q p_t

Using similar techniques, one can obtain Kolmogorov's forward equation:

p'_t = p_t Q

Solving the Kolmogorov Backward Equation

Theorem. The solution of Kolmogorov's backward equation p'_t = Q p_t can be obtained as

p_t = e^{Qt}

(similarly as if Q were a number). The exponential function for the matrix Q is defined as

e^{Qt} = Σ_{n=0}^∞ (Qt)^n / n!

Proof: By direct differentiation.
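In practice p_t = e^{Qt} can be evaluated with a library matrix exponential and checked against the backward equation by a finite difference. A minimal sketch using SciPy's `scipy.linalg.expm`; the 2-state rate matrix `q` and the time `t` are hypothetical choices:

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical jump rates; Q(i, j) = q(i, j) for j != i and
# Q(i, i) = -lam_i, so every row of Q sums to zero.
q = np.array([[0.0, 2.0], [3.0, 0.0]])
Q = q - np.diag(q.sum(axis=1))

t = 0.8
p_t = expm(Q * t)                  # candidate solution p_t = e^{Qt}

h = 1e-6                           # finite-difference check of p'_t = Q p_t
deriv = (expm(Q * (t + h)) - p_t) / h
```

Each row of `p_t` is a probability distribution, and the finite-difference derivative matches Q p_t up to terms of order h, as the theorem asserts.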