LECTURE #6 BIRTH-DEATH PROCESS
1 LECTURE #6 BIRTH-DEATH PROCESS. Queueing Theory and Applications in Networks. Assoc. Prof., Ph.D. (รศ.ดร. อนันต์ ผลเพิ่ม), Computer Engineering Department, Kasetsart University
2 Outline: Birth-Death Process; Markov Process Property; Continuous-Time Birth-Death Markov Chains; State Transition Diagram; A Pure Birth System; A Pure Death System; A Birth-Death Process; Equilibrium Solution
3 Birth-Death Process: a Markov process that is homogeneous, aperiodic, and irreducible; it may run in discrete time or continuous time; state changes can only happen between neighboring states.
4 Birth-Death Process: the state is the size of a population; the system is in state E_k when the population consists of k members. The population size changes by at most one at a time: an increase by one is a birth, a decrease by one is a death. The transition probabilities p_ij do not change with time.
5 Birth-Death Process: the one-step transition probabilities are
p_ij = λ_i if j = i+1 (birth); μ_i if j = i−1 (death); 1 − λ_i − μ_i if j = i; 0 otherwise.
6 Birth-Death Process: μ_i = death (population size decreases by one), with μ_0 = 0 (no population, no death); λ_i = birth (population size increases by one), with λ_i > 0 (birth is allowed). Pure birth = no decrements, only increments; pure death = no increments, only decrements.
7 Queueing Theory Model: population = customers in the queueing system; death = a customer departure from the system; birth = a customer arrival to the system.
8 Transition matrix: P is tridiagonal; row i has μ_i on the subdiagonal (j = i−1), 1 − λ_i − μ_i on the diagonal (j = i), and λ_i on the superdiagonal (j = i+1).
9 Discrete-Time Markov Chains: the process occupies a discrete state (position) and is permitted to change state only at discrete time instants.
10 Discrete-Time Markov Chains: P{X_n = j | X_1 = i_1, X_2 = i_2, ..., X_{n−1} = i_{n−1}} = P{X_n = j | X_{n−1} = i_{n−1}}, where n = 1, 2, 3, ... Here X_n = j means the system is in state j at time n. The system can begin in state X_0 with initial probability P[X_0 = x]. P{X_n = j | X_{n−1} = i_{n−1}} is the one-step transition probability.
11 Discrete-Time Markov Chains: from the initial probabilities and the one-step transition probabilities, we can find the probability of being in each state at time n.
12 Continuous-Time Markov Chains: P{X(t_{n+1}) = j | X(t_1) = i_1, X(t_2) = i_2, ..., X(t_n) = i_n} = P{X(t_{n+1}) = j | X(t_n) = i_n}, where n = 1, 2, 3, ... and t_1 < t_2 < ... < t_n. The process occupies a discrete state (position), but state changes may occur at arbitrary times.
13 Markov Process Property: the time the process spends in any state must be memoryless. Discrete-time Markov chains have geometrically distributed state times; continuous-time Markov chains have exponentially distributed state times.
14 Markov Process Property: for a discrete-time Markov chain with self-transition probability p in state i, P[system stays in state i for N time units | system is in state i] = p^N, and P[system stays in state i for N time units before exiting state i] = p^N (1 − p): geometrically distributed state times. Dept. of Computer Engineering, Kasetsart University, Thailand
15 Markov Process Property: for a continuous-time Markov chain in which the probability of leaving state i during an interval Δt is μΔt, P[system stays in state i for time T | system is in state i] = (1 − μΔt)^{T/Δt} → e^{−μT} as Δt → 0: exponentially distributed state times.
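The limit (1 − μΔt)^{T/Δt} → e^{−μT} can be checked numerically. A minimal sketch; the values of μ and T are illustrative, not from the slides:

```python
import math

# Check that (1 - mu*dt)**(T/dt) approaches e^{-mu*T} as dt -> 0,
# i.e. that a memoryless per-step exit probability mu*dt yields an
# exponentially distributed holding time in the limit.
mu, T = 2.0, 1.5                  # illustrative rate and holding time
exact = math.exp(-mu * T)
for dt in (1e-1, 1e-2, 1e-3, 1e-4):
    approx = (1 - mu * dt) ** (T / dt)
    print(f"dt={dt:g}  approx={approx:.6f}  exact={exact:.6f}")
```

The printed approximations visibly converge to e^{−μT} as Δt shrinks.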
16 Continuous-Time Birth-Death Markov Chains: let λ_i = birth rate in state i and μ_i = death rate in state i. Then, for a small interval Δt:
P[state i to state i−1 in Δt] = μ_i Δt
P[state i to state i+1 in Δt] = λ_i Δt
P[state i to state i in Δt] = 1 − (λ_i + μ_i) Δt
P[state i to any other state in Δt] = 0
17 Continuous-Time Birth-Death Markov Chains: as Δt → 0,
p_ij = μ_i Δt if j = i−1; 1 − (λ_i + μ_i) Δt if j = i; λ_i Δt if j = i+1; 0 otherwise.
18 State Transition Diagram: states ..., i−1, i, i+1, ... connected by birth rates λ_{i−1}, λ_i, λ_{i+1}, ... (to the right) and death rates μ_i, μ_{i+1}, μ_{i+2}, ... (to the left). X(t) = number in the system at time t = births − deaths in (0, t); p_i(t) = P[X(t) = i] = probability that the system is in state i at time t.
19 State Transition Diagram: from t to t + Δt,
p_0(t+Δt) = p_0(t)[1 − λ_0 Δt] + p_1(t) μ_1 Δt
p_i(t+Δt) = p_i(t)[1 − (λ_i + μ_i) Δt] + p_{i+1}(t) μ_{i+1} Δt + p_{i−1}(t) λ_{i−1} Δt
Letting Δt → 0:
dp_0(t)/dt = −λ_0 p_0(t) + μ_1 p_1(t)
dp_i(t)/dt = −(λ_i + μ_i) p_i(t) + μ_{i+1} p_{i+1}(t) + λ_{i−1} p_{i−1}(t)
with the normalization Σ_{i=0}^∞ p_i(t) = 1.
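These forward equations can be integrated numerically. A minimal sketch (not part of the lecture), truncating the chain at a finite number of states; the rates λ_i = 1.0 and μ_i = 1.5 are illustrative assumptions:

```python
# Euler integration of dp_i/dt = -(lambda_i + mu_i) p_i
#                              + mu_{i+1} p_{i+1} + lambda_{i-1} p_{i-1},
# with the chain truncated at N states.
N = 50                        # truncation level (states 0..N-1)
lam = [1.0] * N               # birth rates lambda_i (illustrative)
mu = [0.0] + [1.5] * (N - 1)  # death rates mu_i, with mu_0 = 0
p = [0.0] * N
p[0] = 1.0                    # start empty: p_0(0) = 1
dt, T = 1e-3, 10.0
for _ in range(int(T / dt)):
    dp = []
    for i in range(N):
        out = (lam[i] if i < N - 1 else 0.0) + mu[i]  # no births out of the top state
        d = -out * p[i]
        if i + 1 < N:
            d += mu[i + 1] * p[i + 1]
        if i > 0:
            d += lam[i - 1] * p[i - 1]
        dp.append(d)
    p = [pi + di * dt for pi, di in zip(p, dp)]
print(sum(p))                 # total probability is conserved
```

Because every flow term appears once as outflow and once as inflow, the sum of the dp_i is exactly zero, so Σ p_i stays at 1 throughout the integration.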
20 Flow Balance Method: draw a closed boundary in the state transition diagram and observe all flow in and flow out across the boundary.
21 Flow Balance Method: Flow Out = Flow In. Drawing a closed boundary around state i: (λ_i + μ_i) p_i = μ_{i+1} p_{i+1} + λ_{i−1} p_{i−1}. Drawing a closed boundary around state 0: λ_0 p_0 = μ_1 p_1.
22 Flow Balance Method: draw a closed boundary that encloses states 0 through i (closed at infinity); balancing flow across it gives λ_i p_i = μ_{i+1} p_{i+1}.
23 Flow Balance General Form: drawing a closed boundary around state i gives the Global Balance Equation p_i Σ_{j≠i} p_ij = Σ_{j≠i} p_j p_ji. Drawing a closed boundary between states i and j gives the Detailed Balance Equation p_i p_ij = p_j p_ji.
24 A Pure Birth System: assume μ_k = 0 for all k and λ_k = λ for all k. The system begins at time t = 0 with 0 members: p_k(0) = 1 if k = 0, and p_k(0) = 0 if k ≠ 0.
25 A Pure Birth System: the general equations
dp_0(t)/dt = −λ_0 p_0(t) + μ_1 p_1(t)
dp_k(t)/dt = −(λ_k + μ_k) p_k(t) + μ_{k+1} p_{k+1}(t) + λ_{k−1} p_{k−1}(t)
reduce to
dp_0(t)/dt = −λ p_0(t)
dp_k(t)/dt = −λ p_k(t) + λ p_{k−1}(t)
Solution for p_0(t): p_0(t) = e^{−λt}.
26 A Pure Birth System: for k = 1, dp_1(t)/dt = −λ p_1(t) + λ p_0(t) = −λ p_1(t) + λ e^{−λt}, so p_1(t) = λt e^{−λt}. In general, for k ≥ 0 and t ≥ 0,
p_k(t) = ((λt)^k / k!) e^{−λt}
the Poisson distribution.
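The Poisson closed form can be checked against a direct integration of the pure-birth equations. A sketch; λ and t_end are illustrative values:

```python
import math

# Compare Euler integration of dp_0/dt = -lam*p_0,
# dp_k/dt = -lam*p_k + lam*p_{k-1} with the closed form
# p_k(t) = (lam*t)^k / k! * e^{-lam*t}.
lam, t_end, dt = 2.0, 1.0, 1e-4
K = 40                        # truncation level for the state space
p = [0.0] * K
p[0] = 1.0                    # system starts empty
for _ in range(int(t_end / dt)):
    dp = [-lam * p[0]] + [lam * (p[k - 1] - p[k]) for k in range(1, K)]
    p = [pi + di * dt for pi, di in zip(p, dp)]
for k in range(5):
    closed = (lam * t_end) ** k / math.factorial(k) * math.exp(-lam * t_end)
    print(k, round(p[k], 4), round(closed, 4))
```

With a small step size the integrated probabilities agree with the Poisson pmf to several decimal places.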
27 Poisson Distribution [figure: plot of the Poisson distribution]
28 A Poisson Process: the arrival of customers, with λ = the average rate at which customers arrive. p_k(t) = probability that k arrivals occur during (0, t); K = number of arrivals in an interval of length t. What is the average number of arrivals in the interval, E[K]?
29 A Poisson Process:
E[K] = Σ_{k=0}^∞ k p_k(t)
     = e^{−λt} Σ_{k=0}^∞ k (λt)^k / k!
     = e^{−λt} Σ_{k=1}^∞ (λt)^k / (k−1)!
     = λt e^{−λt} Σ_{k=0}^∞ (λt)^k / k!
     = λt e^{−λt} e^{λt} = λt
30 Pure Birth Process Example: Linear Birth Process (Yule-Furry Process). Consider cells which reproduce according to the following rules: 1) a cell present at time t has probability λΔt + o(Δt) of splitting in two in the interval (t, t + Δt); 2) this probability is independent of age; 3) events between different cells are independent. Modified from The Theory of Stochastic Processes by D. R. Cox and H. D. Miller.
31 Pure Birth Process Example: Cell Division [figure: cell division over time]
32 Pure Birth Process Example: Non-Probabilistic Analysis. Let n(t) = number of cells at time t and λ = birth rate per single cell, so that λ n(t) Δt births occur in (t, t + Δt):
n(t + Δt) = n(t) + λ n(t) Δt
n(t + Δt) − n(t) = λ n(t) Δt
n′(t) = λ n(t), i.e. (d/dt) log n(t) = λ
log n(t) = λt + c, so n(t) = K e^{λt}; with n(0) = n_0, n(t) = n_0 e^{λt}.
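The deterministic argument can be sketched numerically: stepping n(t+Δt) = n(t) + λn(t)Δt approaches n_0 e^{λt}. The values of λ, n_0, and t_end are illustrative:

```python
import math

# Euler steps of n'(t) = lam * n(t) versus the closed form n0 * e^{lam*t}.
lam, n0, t_end, dt = 0.5, 100.0, 4.0, 1e-4   # illustrative values
n = n0
for _ in range(int(t_end / dt)):
    n += lam * n * dt
print(round(n, 2), round(n0 * math.exp(lam * t_end), 2))
```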
33 Pure Birth Process Example: Probabilistic Analysis. Let N(t) = number of cells at time t and P_n(t) = P{N(t) = n}. The probability of a birth in (t, t + Δt), given {N(t) = n}, is nλΔt + o(Δt). Then
P_n(t + Δt) = P_n(t)(1 − nλΔt + o(Δt)) + P_{n−1}(t)((n−1)λΔt + o(Δt))
P_n(t + Δt) − P_n(t) = −nλΔt P_n(t) + (n−1)λΔt P_{n−1}(t) + o(Δt)
[P_n(t + Δt) − P_n(t)] / Δt = −nλ P_n(t) + (n−1)λ P_{n−1}(t) + o(Δt)/Δt
P_n′(t) = −nλ P_n(t) + (n−1)λ P_{n−1}(t) as Δt → 0
34 Pure Birth Process Example: Probabilistic Analysis. P_n′(t) = −nλ P_n(t) + (n−1)λ P_{n−1}(t), with initial condition P_{n_0}(0) = P{N(0) = n_0} = 1. The solution is
P_n(t) = C(n−1, n_0−1) e^{−n_0 λt} (1 − e^{−λt})^{n−n_0}, n = n_0, n_0+1, ...
a negative binomial distribution: the probability that the n_0-th success occurs on trial n.
35 Pure Birth Process Example: Probabilistic Analysis. Suppose p = probability of success and q = 1 − p = probability of failure. Then the first (n−1) trials result in (n_0 − 1) successes and (n − n_0) failures, followed by a success on the n-th trial:
P_n = C(n−1, n_0−1) p^{n_0−1} q^{n−n_0} · p = C(n−1, n_0−1) p^{n_0} q^{n−n_0}
With p = e^{−λt} and q = 1 − e^{−λt}, P_n(t) is the same as the previous equation, n = n_0, n_0+1, ...
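A quick numerical check of the negative binomial solution: it should sum to 1 over n ≥ n_0, and its mean should equal n_0 e^{λt}, matching the deterministic growth result. The values of λ, t, and n_0 are illustrative:

```python
import math

# Check that P_n(t) = C(n-1, n0-1) * p^{n0} * q^{n-n0}, with p = e^{-lam*t}
# and q = 1 - p, sums to 1 and has mean n0 * e^{lam*t}.
lam, t, n0 = 1.0, 1.0, 3      # illustrative values
p = math.exp(-lam * t)
q = 1.0 - p
pmf = {n: math.comb(n - 1, n0 - 1) * p**n0 * q**(n - n0)
       for n in range(n0, 500)}           # truncate the infinite support
total = sum(pmf.values())
mean = sum(n * P for n, P in pmf.items())
print(round(total, 9), round(mean, 6), round(n0 * math.exp(lam * t), 6))
```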
36 Yule-Furry Process: Yule studied this process in connection with the theory of evolution, i.e. the population consists of the species within a genus, and the creation of a new element is due to mutations. The model neglects the probability of a species dying out and the size of each species. Furry used the same model for radioactive transmutations. (A genus is a low-level taxonomic rank used in the classification of living organisms.)
37 A Pure Death System: example: microbial risk analysis (e.g. a bacterium that causes disease). Assume μ_k = μ for all k and λ_k = 0 for all k. The system begins with N members, k = 1, 2, 3, ..., N.
38 A Pure Death Process:
p_k(t) = ((μt)^{N−k} / (N−k)!) e^{−μt}, 0 < k ≤ N
dp_0(t)/dt = μ ((μt)^{N−1} / (N−1)!) e^{−μt}, k = 0
the Erlang distribution.
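The pure-death closed form can be verified by integrating the corresponding equations directly. A sketch; μ, N, t_end, and dt are illustrative values:

```python
import math

# Euler-integrate the pure-death equations (start in state N, death rate mu)
# and compare with p_k(t) = (mu*t)^{N-k}/(N-k)! * e^{-mu*t} for 0 < k <= N.
mu, N, t_end, dt = 1.0, 5, 2.0, 1e-4
p = [0.0] * (N + 1)
p[N] = 1.0                    # the system begins with N members
for _ in range(int(t_end / dt)):
    dp = [mu * p[1]]          # state 0 only receives flow (no death from 0)
    for k in range(1, N + 1):
        inflow = mu * p[k + 1] if k < N else 0.0
        dp.append(inflow - mu * p[k])
    p = [pi + di * dt for pi, di in zip(p, dp)]
for k in range(1, N + 1):
    closed = (mu * t_end) ** (N - k) / math.factorial(N - k) * math.exp(-mu * t_end)
    print(k, round(p[k], 4), round(closed, 4))
```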
39 M/M/1
40 A Birth-Death Process: M/M/1. Assume λ_k = λ for k ≥ 0 and μ_k = μ for k ≥ 1. The system begins at time t = 0 with 0 members.
41 A Birth-Death Process: M/M/1. A birth-death process with constant coefficients λ and μ. In the notation Interarrival Time / Service Time / #Servers: Memoryless / Memoryless / 1 server. M/M/1 = a single-server queue with Poisson arrivals and an exponential distribution for service time.
42 A Birth-Death Process: M/M/1. A(t) = CDF of the interarrival time: with λ_k = λ for k ≥ 0, A(t) = 1 − e^{−λt}. B(x) = CDF of the service time: with μ_k = μ for k ≥ 1, B(x) = 1 − e^{−μx}.
43 A Birth-Death Process: M/M/1.
dp_0(t)/dt = −λ p_0(t) + μ p_1(t)
dp_k(t)/dt = −(λ + μ) p_k(t) + λ p_{k−1}(t) + μ p_{k+1}(t)
Now we try to find the solution! Hint: solve by using z-transforms.
44 A Birth-Death Process: M/M/1. With i customers initially, the time-dependent state probabilities are
p_k(t) = e^{−(λ+μ)t} [ ρ^{(k−i)/2} I_{k−i}(at) + ρ^{(k−i−1)/2} I_{k+i+1}(at) + (1 − ρ) ρ^k Σ_{j=k+i+2}^∞ ρ^{−j/2} I_j(at) ]
where ρ = λ/μ, a = 2μ ρ^{1/2}, and I_k(x) = Σ_{m=0}^∞ (x/2)^{k+2m} / ((k+m)! m!) is the modified Bessel function.
Time-dependent behavior of the state probabilities.
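The Bessel-function building block of the transient solution can be evaluated straight from the series on the slide. A sketch, checked against tabulated values of I_0(1) and I_1(2):

```python
import math

# Evaluate the modified Bessel function from the truncated series
# I_k(x) = sum_{m>=0} (x/2)^{k+2m} / ((k+m)! m!).
def bessel_i(k, x, terms=60):
    return sum((x / 2) ** (k + 2 * m) / (math.factorial(k + m) * math.factorial(m))
               for m in range(terms))

print(round(bessel_i(0, 1.0), 6))   # tabulated I_0(1) ~ 1.266066
print(round(bessel_i(1, 2.0), 6))   # tabulated I_1(2) ~ 1.590637
```

The series converges very quickly for moderate x, which is why the truncation at 60 terms is more than enough here.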
45 Equilibrium Solution: the time-dependent solution is unmanageable; p_k(t) is not too useful (transient). Let p_k be the limiting probability that the system contains k members: p_k = lim_{t→∞} p_k(t) = P[system in state E_k]. p_k is not time-dependent.
46 Equilibrium Solution: from
dp_0(t)/dt = −λ_0 p_0(t) + μ_1 p_1(t)
dp_k(t)/dt = −(λ_k + μ_k) p_k(t) + λ_{k−1} p_{k−1}(t) + μ_{k+1} p_{k+1}(t)
setting lim_{t→∞} dp_k(t)/dt = 0 we obtain
0 = −λ_0 p_0 + μ_1 p_1, k = 0
0 = −(λ_k + μ_k) p_k + λ_{k−1} p_{k−1} + μ_{k+1} p_{k+1}, k ≥ 1
47 Equilibrium Solution: together with the conservation relation Σ_{k=0}^∞ p_k = 1, we can find p_0 and p_i.
48 Equilibrium Solution: from 0 = −λ_0 p_0 + μ_1 p_1 (k = 0) and 0 = −(λ_k + μ_k) p_k + λ_{k−1} p_{k−1} + μ_{k+1} p_{k+1} (k ≥ 1), the k = 0 equation yields λ_0 p_0 = μ_1 p_1, i.e. p_1 = (λ_0/μ_1) p_0.
49 Equilibrium Solution: and for k = 1,
(λ_k + μ_k) p_k = λ_{k−1} p_{k−1} + μ_{k+1} p_{k+1}
(λ_1 + μ_1) p_1 = λ_0 p_0 + μ_2 p_2
(λ_1 + μ_1)(λ_0/μ_1) p_0 = λ_0 p_0 + μ_2 p_2
(λ_1 λ_0/μ_1) p_0 + λ_0 p_0 = λ_0 p_0 + μ_2 p_2
(λ_1 λ_0/μ_1) p_0 = μ_2 p_2
p_2 = (λ_1 λ_0 / (μ_1 μ_2)) p_0
50 Equilibrium Solution: in general,
p_0 = [ 1 + Σ_{i=1}^∞ Π_{k=0}^{i−1} (λ_k / μ_{k+1}) ]^{−1}
p_i = p_0 Π_{k=0}^{i−1} (λ_k / μ_{k+1}), i ≥ 1
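Applying the product form to the M/M/1 case (λ_k = λ, μ_k = μ) should recover the familiar p_i = (1 − ρ)ρ^i. A sketch with illustrative λ = 1, μ = 2, truncating the infinite sum at 200 states:

```python
# Product-form equilibrium of a birth-death chain, specialized to M/M/1,
# compared with the known geometric solution p_i = (1 - rho) * rho**i.
lam, mu, N = 1.0, 2.0, 200    # illustrative rates, truncation level
rho = lam / mu
prod = [1.0]                  # prod[i] = product_{k=0}^{i-1} lambda_k / mu_{k+1}
for _ in range(1, N):
    prod.append(prod[-1] * lam / mu)
p0 = 1.0 / sum(prod)          # normalization: p_0 = [sum of products]^{-1}
p = [p0 * pr for pr in prod]
for i in range(4):
    print(i, round(p[i], 6), round((1 - rho) * rho**i, 6))
```

Since ρ < 1 here, the truncated normalizing sum differs from the infinite one only by a geometrically small tail, so the two columns agree to machine precision.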
51 Homework: please check our class web site for the homework assignment. Homework is due on 27 July 2010 (in class).
More informationLecture Notes 7 Random Processes. Markov Processes Markov Chains. Random Processes
Lecture Notes 7 Random Processes Definition IID Processes Bernoulli Process Binomial Counting Process Interarrival Time Process Markov Processes Markov Chains Classification of States Steady State Probabilities
More informationStructured Markov Chains
Structured Markov Chains Ivo Adan and Johan van Leeuwaarden Where innovation starts Book on Analysis of structured Markov processes (arxiv:1709.09060) I Basic methods Basic Markov processes Advanced Markov
More informationOutline. Finite source queue M/M/c//K Queues with impatience (balking, reneging, jockeying, retrial) Transient behavior Advanced Queue.
Outline Finite source queue M/M/c//K Queues with impatience (balking, reneging, jockeying, retrial) Transient behavior Advanced Queue Batch queue Bulk input queue M [X] /M/1 Bulk service queue M/M [Y]
More information1 IEOR 4701: Continuous-Time Markov Chains
Copyright c 2006 by Karl Sigman 1 IEOR 4701: Continuous-Time Markov Chains A Markov chain in discrete time, {X n : n 0}, remains in any state for exactly one unit of time before making a transition (change
More informationQueueing. Chapter Continuous Time Markov Chains 2 CHAPTER 5. QUEUEING
2 CHAPTER 5. QUEUEING Chapter 5 Queueing Systems are often modeled by automata, and discrete events are transitions from one state to another. In this chapter we want to analyze such discrete events systems.
More informationCS 237: Probability in Computing
CS 237: Probability in Computing Wayne Snyder Computer Science Department Boston University Lecture 13: Normal Distribution Exponential Distribution Recall that the Normal Distribution is given by an explicit
More informationIntroduction to Markov Chains, Queuing Theory, and Network Performance
Introduction to Markov Chains, Queuing Theory, and Network Performance Marceau Coupechoux Telecom ParisTech, departement Informatique et Réseaux marceau.coupechoux@telecom-paristech.fr IT.2403 Modélisation
More informationIEOR 4106: Spring Solutions to Homework Assignment 7: Due on Tuesday, March 22.
IEOR 46: Spring Solutions to Homework Assignment 7: Due on Tuesday, March. More of Chapter 5: Read the rest of Section 5.3, skipping Examples 5.7 (Coupon Collecting), 5. (Insurance claims)and Subsection
More informationExample: physical systems. If the state space. Example: speech recognition. Context can be. Example: epidemics. Suppose each infected
4. Markov Chains A discrete time process {X n,n = 0,1,2,...} with discrete state space X n {0,1,2,...} is a Markov chain if it has the Markov property: P[X n+1 =j X n =i,x n 1 =i n 1,...,X 0 =i 0 ] = P[X
More informationSolutions to Homework Discrete Stochastic Processes MIT, Spring 2011
Exercise 6.5: Solutions to Homework 0 6.262 Discrete Stochastic Processes MIT, Spring 20 Consider the Markov process illustrated below. The transitions are labelled by the rate q ij at which those transitions
More informationExamples of Countable State Markov Chains Thursday, October 16, :12 PM
stochnotes101608 Page 1 Examples of Countable State Markov Chains Thursday, October 16, 2008 12:12 PM Homework 2 solutions will be posted later today. A couple of quick examples. Queueing model (without
More informationSolution: The process is a compound Poisson Process with E[N (t)] = λt/p by Wald's equation.
Solutions Stochastic Processes and Simulation II, May 18, 217 Problem 1: Poisson Processes Let {N(t), t } be a homogeneous Poisson Process on (, ) with rate λ. Let {S i, i = 1, 2, } be the points of the
More informationChapter 5. Statistical Models in Simulations 5.1. Prof. Dr. Mesut Güneş Ch. 5 Statistical Models in Simulations
Chapter 5 Statistical Models in Simulations 5.1 Contents Basic Probability Theory Concepts Discrete Distributions Continuous Distributions Poisson Process Empirical Distributions Useful Statistical Models
More informationBIRTH DEATH PROCESSES AND QUEUEING SYSTEMS
BIRTH DEATH PROCESSES AND QUEUEING SYSTEMS Andrea Bobbio Anno Accademico 999-2000 Queueing Systems 2 Notation for Queueing Systems /λ mean time between arrivals S = /µ ρ = λ/µ N mean service time traffic
More informationLecture 10: Semi-Markov Type Processes
Lecture 1: Semi-Markov Type Processes 1. Semi-Markov processes (SMP) 1.1 Definition of SMP 1.2 Transition probabilities for SMP 1.3 Hitting times and semi-markov renewal equations 2. Processes with semi-markov
More information