CDA6530: Performance Models of Computers and Networks. Chapter 3: Review of Practical Stochastic Processes


Definition
A stochastic process X = {X(t), t ∈ T} is a collection of random variables (rvs): one rv X(t) for each t ∈ T.
Index set T: the set of possible values of t (here t always means time).
  T countable: discrete-time process
  T the real numbers: continuous-time process
State space: the set of possible values of X(t).

Counting Process
A stochastic process that represents the number of events that have occurred by time t; a continuous-time, discrete-state process {N(t), t > 0} with:
  N(0) = 0
  N(t) ≥ 0
  N(t) increasing (non-decreasing) in t
  N(t) - N(s) is the number of events that happen in the time interval [s, t]

Counting Process
A counting process has independent increments if the numbers of events in disjoint intervals are independent:
  P(N_1 = n_1, N_2 = n_2) = P(N_1 = n_1) P(N_2 = n_2) if N_1 and N_2 count events in disjoint intervals
A counting process has stationary increments if the number of events in [t_1 + s, t_2 + s] has the same distribution as the number of events in [t_1, t_2], for any s > 0.

Bernoulli Process
N_t: the number of successes by time t = 0, 1, ... is a counting process with independent and stationary increments.
p: probability of success. Note: t is discrete time.
When n ≤ t:  P(N_t = n) = C(t, n) p^n (1-p)^(t-n),  i.e., N_t ~ B(t, p)
E[N_t] = tp,  Var[N_t] = tp(1-p)
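As a quick sanity check, here is a minimal Python sketch (my own illustration, not from the slides) that evaluates this binomial pmf and confirms the mean and variance formulas by direct summation; the values t = 20, p = 0.3 are arbitrary.

```python
from math import comb

def bernoulli_process_pmf(t, p, n):
    """P(N_t = n) for a Bernoulli process: the binomial(t, p) pmf."""
    return comb(t, n) * p**n * (1 - p)**(t - n)

t, p = 20, 0.3  # arbitrary example values
pmf = [bernoulli_process_pmf(t, p, n) for n in range(t + 1)]

mean = sum(n * pmf[n] for n in range(t + 1))
var = sum((n - mean) ** 2 * pmf[n] for n in range(t + 1))

print(f"E[N_t]   = {mean:.4f}  (formula tp      = {t * p})")
print(f"Var[N_t] = {var:.4f}  (formula tp(1-p) = {t * p * (1 - p):.4f})")
```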

Bernoulli Process
X: the time between successes follows a geometric distribution:
P(X = n) = (1-p)^(n-1) p

Little-o Notation
Definition: f(h) is o(h) if lim_{h→0} f(h)/h = 0
  f(h) = h^2 is o(h); f(h) = h is not
  f(h) = h^r, r > 1 is o(h); sin(h) is not
If f(h) and g(h) are o(h), then f(h) + g(h) = o(h)
Note: h is continuous

Example: Exponential R.V.
An exponential r.v. X with parameter λ has CDF P(X < h) = 1 - e^(-λh), h > 0.
Why does this matter here? Expanding the exponential, 1 - e^(-λh) = λh + o(h): over a small interval of length h, an exponential arrival occurs with probability λh + o(h).

Poisson Process
A counting process {N(t), t ≥ 0} with rate λ (t is continuous):
  N(0) = 0
  Independent and stationary increments
  P(N(h) = 1) = λh + o(h)
  P(N(h) ≥ 2) = o(h)
Thus, P(N(h) = 0) = ?  P(N(h) = 0) = 1 - λh + o(h)
Notation: P_n(t) = P(N(t) = n)

Drift Equations
P_n(t + Δt) = P_n(t)(1 - λΔt) + P_{n-1}(t) λΔt + o(Δt),  n ≥ 1

For n = 0:  P_0(t + Δt) = P_0(t)(1 - λΔt) + o(Δt)
Thus dP_0(t)/dt = -λ P_0(t), and so P_0(t) = e^(-λt).  Why?
Thus the inter-arrival time is exponentially distributed with the same rate λ.
Remember the exponential r.v.: F_X(x) = 1 - e^(-λx), which means P(X > t) = e^(-λt); {X > t} means that at time t there is still no arrival.
X^(n): the time until n consecutive arrivals is an Erlang r.v. of order n.

dP_n(t)/dt = λ P_{n-1}(t) - λ P_n(t),   P_0(t) = e^(-λt)
Solving recursively gives P_n(t) = (λt)^n e^(-λt) / n!
This is similar to a Poisson r.v.: P(X = k) = e^(-λ) λ^k / k!
You can think of the Poisson r.v. as the static distribution of a Poisson process at time t.
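A small numerical check (my own sketch, not part of the slides): it verifies by finite differences that P_n(t) = (λt)^n e^(-λt)/n! satisfies the drift equation dP_n/dt = λP_{n-1} - λP_n; the values λ = 1.5, t = 2.0, n = 3 are arbitrary.

```python
from math import exp, factorial

def P(n, t, lam):
    """Poisson-process distribution P(N(t) = n) = (lam*t)^n e^(-lam*t) / n!"""
    if n < 0:
        return 0.0
    return (lam * t) ** n * exp(-lam * t) / factorial(n)

lam, t, n, h = 1.5, 2.0, 3, 1e-6            # arbitrary example values

lhs = (P(n, t + h, lam) - P(n, t - h, lam)) / (2 * h)   # numerical dP_n/dt
rhs = lam * P(n - 1, t, lam) - lam * P(n, t, lam)       # drift-equation right-hand side

print(f"dP_n/dt (numerical)   = {lhs:.8f}")
print(f"lam*P_(n-1) - lam*P_n = {rhs:.8f}")
```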

Poisson Process
Take an i.i.d. sequence of exponential rvs {X_i} with rate λ.
Define N(t) = max{n : X_1 + ... + X_n ≤ t}; then {N(t)} is a Poisson process.
Meaning: a Poisson process is composed of independent arrivals with exponential inter-arrival times.
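To make this construction concrete, here is a minimal simulation sketch (my own illustration, not from the slides): it generates exponential inter-arrival times with rate λ and checks that the resulting N(t) has mean close to λt. The values λ = 2.0, t = 10.0 are arbitrary.

```python
import random

def simulate_N(t_end, lam):
    """Count arrivals in [0, t_end] built from exponential inter-arrival times."""
    t, n = 0.0, 0
    while True:
        t += random.expovariate(lam)   # exponential inter-arrival time with rate lam
        if t > t_end:
            return n
        n += 1

lam, t_end, runs = 2.0, 10.0, 20000    # arbitrary example values
avg = sum(simulate_N(t_end, lam) for _ in range(runs)) / runs
print(f"simulated E[N(t)] = {avg:.3f},  theory lam*t = {lam * t_end}")
```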

Poisson Process
If N(t) is a Poisson process and exactly one event occurs in [0, t], then the time of that event, denoted by the r.v. X, is uniformly distributed on [0, t]:
  f_{X | N(t)=1}(x | 1) = 1/t,  0 ≤ x ≤ t
Meaning: given that an arrival happens, it could happen at any time in the interval with equal probability.
The exponential distribution is memoryless; this is one reason why we simply speak of "arrivals with rate λ": an arrival is equally likely at any time.

Poisson Process
If N_1(t) and N_2(t) are independent Poisson processes with rates λ_1 and λ_2, then N(t) = N_1(t) + N_2(t) is a Poisson process with rate λ = λ_1 + λ_2.
Intuitive explanation: a Poisson process is generated by many independent entities (n), each with a small chance (p) of arriving; the arrival rate is proportional to the population size, λ = np.
The process is still Poisson if two large groups of entities arrive mixed together.

Poisson Process
N(t) is a Poisson process with rate λ, and M_n is a Bernoulli process with success probability p.
Construct a new process L(t) by counting only the n-th event of N(t) whenever M_n > M_{n-1} (i.e., the Bernoulli process has a success at step n).
Then L(t) is a Poisson process with rate λp.
Useful in analyses based on random sampling.
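Here is a small simulation sketch of this thinning property (my own illustration, not from the slides): each arrival of a rate-λ Poisson process is kept independently with probability p, and the kept arrivals should form a process of rate λp. The values λ = 3.0 and p = 0.25 are arbitrary.

```python
import random

def thinned_count(t_end, lam, p):
    """Simulate a rate-lam Poisson process on [0, t_end]; keep each arrival w.p. p."""
    t, kept = 0.0, 0
    while True:
        t += random.expovariate(lam)
        if t > t_end:
            return kept
        if random.random() < p:        # Bernoulli "success": keep this arrival
            kept += 1

lam, p, t_end, runs = 3.0, 0.25, 10.0, 20000   # arbitrary example values
avg = sum(thinned_count(t_end, lam, p) for _ in range(runs)) / runs
print(f"simulated rate = {avg / t_end:.3f},  theory lam*p = {lam * p}")
```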

Example 1
A web server's failures are described by a Poisson process with rate λ = 2.4/day, i.e., the time between failures, X, is an exponential r.v. with mean E[X] = 10 hrs.
  P(time between failures < 1 day) = ?
  P(5 failures in 1 day) = ?
  P(N(5) < 10) = ?
  If we look in on the system at a random time, what is the probability of no failures during the next 24 hours?
  Each failure is a memory failure with prob. 1/9 and a CPU failure with prob. 8/9, and failures occur as independent events. What is the process governing memory failures?
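A hedged worked computation for these questions (my own sketch, not the instructor's solution; it interprets N(5) as the number of failures in 5 days and uses memorylessness for the random-observation question):

```python
from math import exp, factorial

lam = 2.4  # failures per day

def poisson_pmf(k, mean):
    return mean ** k * exp(-mean) / factorial(k)

# P(time between failures < 1 day) = 1 - e^(-lam*1)
p_lt_1day = 1 - exp(-lam)

# P(5 failures in 1 day): Poisson pmf with mean lam*1
p_5_in_1day = poisson_pmf(5, lam)

# P(N(5) < 10): fewer than 10 failures in 5 days (mean lam*5) -- assumed interpretation
p_N5_lt_10 = sum(poisson_pmf(k, lam * 5) for k in range(10))

# Looking in at a random time: by memorylessness, P(no failure in next 24 h) = e^(-lam*1)
p_none_next_day = exp(-lam)

# Memory failures: thinning with p = 1/9 gives a Poisson process with rate lam/9 per day
memory_rate = lam / 9

print(p_lt_1day, p_5_in_1day, p_N5_lt_10, p_none_next_day, memory_rate)
```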

Example 2
The arrival of claims at an insurance company follows a Poisson process. On average the company gets 100 claims per week. Each claim size follows an exponential distribution with mean $700.00. The company offers two types of policies: the first type has no deductible and the second has a $250.00 deductible. If the claim sizes and policy types are independent of each other and of the number of claims, and twice as many policy holders have deductibles as not, what is the mean liability of the company in any 13-week period?
First, split the claims into two Poisson arrival processes:
  X: claims with no deductible
  Y: claims with a deductible
Second, what is the formula for the liability?
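One possible way to finish the computation (my own hedged sketch, not the slides' solution): split the rate 2:1 between deductible and no-deductible claims, pay the full claim on no-deductible policies, and pay max(C - 250, 0) on deductible ones; for an exponential claim C with mean 700, E[max(C - 250, 0)] = 700·e^(-250/700) by memorylessness.

```python
from math import exp

rate = 100            # claims per week
weeks = 13
mean_claim = 700.0
deductible = 250.0

rate_no_ded = rate / 3          # one third of policy holders have no deductible
rate_ded = rate * 2 / 3         # two thirds have the $250 deductible

# Expected payout per claim of each type
pay_no_ded = mean_claim                                   # full claim amount
pay_ded = mean_claim * exp(-deductible / mean_claim)      # E[max(C - 250, 0)] for exponential C

# Mean liability over 13 weeks: E[number of claims of each type] * E[payout per claim]
liability = weeks * (rate_no_ded * pay_no_ded + rate_ded * pay_ded)
print(f"mean 13-week liability ~= ${liability:,.2f}")
```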

Birth-Death Process
A continuous-time, discrete-space stochastic process {N(t), t > 0}, N(t) ∈ {0, 1, ...}; N(t) is the population at time t.
  P(N(t+h) = n+1 | N(t) = n) = λ_n h + o(h)
  P(N(t+h) = n-1 | N(t) = n) = μ_n h + o(h)
  P(N(t+h) = n | N(t) = n) = 1 - (λ_n + μ_n) h + o(h)
λ_n: birth rates;  μ_n: death rates, μ_0 = 0
Q: what is P_n(t) = P(N(t) = n), n = 0, 1, ...?

Birth-Death Process
Similar to the Poisson process drift equation.
Initial condition: P_n(0).
If μ_i = 0 and λ_i = λ for all i, then the B-D process is a Poisson process.

Stationary Behavior of B-D Process
Most real systems reach equilibrium as t → ∞:
  no change in P_n(t) as t changes
  no dependence on the initial condition
P_n = lim_{t→∞} P_n(t)
The drift equation becomes:  0 = λ_{n-1} P_{n-1} + μ_{n+1} P_{n+1} - (λ_n + μ_n) P_n

Transition State Diagram
Balance equations:
  rate of transitions into n = rate of transitions out of n
  rate of transitions to the left = rate of transitions to the right
  λ_{n-1} P_{n-1} = μ_n P_n

Probability requirement:  Σ_n P_n = 1
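The balance equations give the product-form solution P_n = P_0 · (λ_0 λ_1 ... λ_{n-1}) / (μ_1 μ_2 ... μ_n), with P_0 fixed by the probability requirement. The sketch below (my own illustration, not from the slides) computes this for a finite birth-death chain; the rates λ = 2, μ = 3, K = 10 describe an arbitrary M/M/1/K-style example.

```python
def bd_stationary(birth, death):
    """Stationary distribution of a finite birth-death chain.

    birth[n-1] = lambda_(n-1) and death[n-1] = mu_n, for n = 1..K.
    """
    K = len(birth)
    unnorm = [1.0]                        # P_0 before normalization
    for n in range(1, K + 1):
        unnorm.append(unnorm[-1] * birth[n - 1] / death[n - 1])
    total = sum(unnorm)                   # probability requirement: sum to 1
    return [x / total for x in unnorm]

lam, mu, K = 2.0, 3.0, 10                 # arbitrary example values
P = bd_stationary([lam] * K, [mu] * K)
print("P_0 =", round(P[0], 4), "  sum =", round(sum(P), 6))
```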

Markov Process
The probability of a future state depends only on the present state.
{X(t), t > 0} is a MP if for any set of times t_1 < ... < t_{n+1} and any set of states x_1, ..., x_{n+1}:
  P(X(t_{n+1}) = x_{n+1} | X(t_1) = x_1, ..., X(t_n) = x_n) = P(X(t_{n+1}) = x_{n+1} | X(t_n) = x_n)
The B-D process and the Poisson process are MPs.

Markov Chain
A discrete-state MP is called a Markov Chain (MC):
  discrete-time MC
  continuous-time MC
First, consider the discrete-time MC.
Define the transition probability matrix P = [P_ij], where P_ij = P(X_{n+1} = j | X_n = i).

Chapman-Kolmogorov Equation
What is the state distribution after n transitions?
A: define the n-step transition probability P^n_ij = P(X_{m+n} = j | X_m = i); the Chapman-Kolmogorov equation states P^{n+m}_ij = Σ_k P^n_ik P^m_kj.  Why?

If the MC has n states:
  P^2_ij = Σ_{k=1}^{n} P_ik P_kj   ⇒   [P^2_ij] = P · P
Define the n-step transition probability matrix P^(n) = [P^n_ij].
The C-K equation means:
  P^(n+m) = P^(n) P^(m)
  P^(n) = P · P^(n-1) = P^n
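A minimal sketch (my own illustration) of the C-K equation in matrix form: n-step transition probabilities are simply matrix powers of P. The 2-state transition matrix used here is an arbitrary example.

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def n_step(P, n):
    """n-step transition matrix P^(n) = P^n (Chapman-Kolmogorov)."""
    size = len(P)
    result = [[1.0 if i == j else 0.0 for j in range(size)] for i in range(size)]
    for _ in range(n):
        result = mat_mul(result, P)
    return result

P = [[0.9, 0.1],     # arbitrary example transition matrix
     [0.4, 0.6]]
print(n_step(P, 5))  # 5-step transition probabilities
```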

Markov Chain
Irreducible MC: every state can be reached from every other state.
Periodic MC: a state i has period k if any return to state i occurs in a multiple of k steps.
  If k = 1, the state is called aperiodic.
An MC is aperiodic if all of its states are aperiodic.

An irreducible, aperiodic, finite-state MC is ergodic: it has a stationary (steady-state) probability distribution
  π = (π_0, π_1, ..., π_n),  which satisfies π = πP and Σ_i π_i = 1.

Example
Markov on-off model (or 0-1 model): from state 0 the chain moves to state 1 with probability α, and from state 1 back to state 0 with probability β.
Q: what are the steady-state probabilities?
  P = [ 1-α    α
         β    1-β ]
  π_0 = (1-α) π_0 + β π_1
  π_1 = α π_0 + (1-β) π_1
  π_0 + π_1 = 1
  ⇒  π_0 = β/(α+β),  π_1 = α/(α+β)

An Alternative Calculation
Use the balance equation across the cut between states 0 and 1 (rate of transitions to the left = rate of transitions to the right):
  α π_0 = β π_1
  π_0 + π_1 = 1
  ⇒  π_0 = β/(α+β),  π_1 = α/(α+β)
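A small check of this result (my own sketch; the values α = 0.3, β = 0.1 are arbitrary): iterate π ← πP for the on-off chain and compare with the closed form β/(α+β), α/(α+β).

```python
alpha, beta = 0.3, 0.1   # arbitrary example transition probabilities

# Closed-form steady state of the on-off chain
pi0 = beta / (alpha + beta)
pi1 = alpha / (alpha + beta)

# Check by power iteration: pi <- pi * P converges for this ergodic chain
P = [[1 - alpha, alpha],
     [beta, 1 - beta]]
pi = [0.5, 0.5]
for _ in range(1000):
    pi = [pi[0] * P[0][0] + pi[1] * P[1][0],
          pi[0] * P[0][1] + pi[1] * P[1][1]]

print(f"closed form: ({pi0:.4f}, {pi1:.4f}),  iterated: ({pi[0]:.4f}, {pi[1]:.4f})")
```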

Discrete-Time MC State Staying Time
X_i: the number of time steps the MC stays in the same state i.
  P(X_i = k) = P_ii^(k-1) (1 - P_ii)
  X_i follows a geometric distribution with average staying time 1/(1 - P_ii).
In a continuous-time MC, what is the staying time? An exponentially distributed time.

Homogeneous Continuous-Time Markov Chain
  P(X(t+h) = j | X(t) = i) = λ_ij h + o(h)
We have the properties:
  P(X(t+h) = i | X(t) = i) = 1 - Σ_{j≠i} λ_ij h + o(h)
  P(X(t+h) ≠ i | X(t) = i) = Σ_{j≠i} λ_ij h + o(h)
The state holding time is exponentially distributed with rate λ = Σ_{j≠i} λ_ij.  Why?
Because the holding time is the minimum of the independent exponential times to the next states, and the minimum of independent exponential r.v.s is again exponential, with rate equal to the sum of their rates.

Steady State
For an ergodic continuous-time MC, define π_i = P(X = i).
Consider the state transition diagram: rate of transitions out of state i = rate of transitions into state i.
  π_i Σ_{j≠i} λ_ij = Σ_{j≠i} π_j λ_ji
  Σ_i π_i = 1

Infinitesimal Generator
Define Q = [q_ij], where
  q_ij = -Σ_{k≠i} λ_ik   when i = j
  q_ij = λ_ij            when i ≠ j
Q is called the infinitesimal generator.
With π = [π_1 π_2 ...], the steady state satisfies
  πQ = 0,  π1 = 1
Why?
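A minimal sketch (my own illustration) of solving πQ = 0 with π1 = 1 for a small continuous-time MC: build Q from the transition rates, then replace one (redundant) balance equation by the normalization condition. The 3-state rates are an arbitrary example.

```python
import numpy as np

# Arbitrary example: transition rates lambda_ij of a 3-state CTMC (zeros on the diagonal)
rates = np.array([[0.0, 2.0, 1.0],
                  [3.0, 0.0, 1.0],
                  [1.0, 2.0, 0.0]])

# Infinitesimal generator: q_ij = lambda_ij (i != j), q_ii = -sum of row i's other entries
Q = rates.copy()
np.fill_diagonal(Q, -rates.sum(axis=1))

# Solve pi Q = 0 with sum(pi) = 1: replace one equation by the normalization condition
A = Q.T.copy()
A[-1, :] = 1.0
b = np.zeros(len(Q))
b[-1] = 1.0
pi = np.linalg.solve(A, b)

print("steady-state pi =", pi.round(4), "  check piQ =", (pi @ Q).round(6))
```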

Discrete vs. Continuous MC
Discrete-time MC:
  jumps at time ticks
  staying time: geometric distribution
  transition matrix P
  steady state: π = πP, π1 = 1
  state transition diagram: has self-jump loops, probabilities on the arcs
Continuous-time MC:
  jumps at any continuous time t
  staying time: exponential distribution
  infinitesimal generator Q
  steady state: πQ = 0, π1 = 1
  state transition diagram: no self-jump loops, transition rates on the arcs

Semi-Markov Process
X(t): discrete state, continuous time.
State jumps follow a Markov chain Z_n.
The holding time in state i follows some distribution Y^(i).
If Y^(i) is exponentially distributed, then X(t) is a continuous-time Markov chain.

Steady State
Let π_j = lim_{n→∞} P(Z_n = j)  (the embedded chain's stationary distribution)
Let P_j = lim_{t→∞} P(X(t) = j)  (the long-run fraction of time in state j)
Why?
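A small sketch (my own illustration, assuming the standard semi-Markov relation P_j = π_j E[Y^(j)] / Σ_k π_k E[Y^(k)], i.e., the embedded-chain distribution weighted by mean holding times): given an embedded transition matrix and mean holding times, it computes both distributions. The 2-state numbers are arbitrary.

```python
import numpy as np

# Arbitrary example: embedded (jump-chain) transition matrix and mean holding times
Z = np.array([[0.0, 1.0],            # from state 0 the chain always jumps to 1, and vice versa
              [1.0, 0.0]])
mean_hold = np.array([2.0, 0.5])     # E[Y^(0)], E[Y^(1)]

# Embedded-chain stationary distribution: solve pi = pi Z with sum(pi) = 1
A = Z.T - np.eye(2)
A[-1, :] = 1.0
pi = np.linalg.solve(A, np.array([0.0, 1.0]))

# Time-stationary distribution: weight by mean holding times (assumed relation above)
P = pi * mean_hold
P /= P.sum()

print("embedded pi =", pi, "  time-average P =", P)
```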