Assignment 3 with Reference Solutions


Exercise 3.1: Poisson Process

Given are k independent sources s_i of jobs, as shown in the figure below. The interarrival time between jobs of each source s_i is exponentially distributed with parameter λ_i, i = 1, ..., k.

[Figure: sources s_1, s_2, ..., s_k merged into a single stream]

Consider the stream which arises by merging all sources s_i. Prove that this stream is a Poisson process with parameter λ = λ_1 + ... + λ_k.

Solution 3.1

The Poisson distribution with parameter λ is defined by the probability mass function

    P_n(t) = (λt)^n / n! · e^{-λt}

and describes the probability that n events occur in a time interval of length t when events happen with rate λ. Hence each source s_i is a Poisson process and generates jobs according to

    P_j(t) = (λ_i t)^j / j! · e^{-λ_i t},

where j is the number of events from source s_i.

We now consider the situation that n jobs are generated in a certain time interval of length t. There are several ways in which these n jobs can be distributed over the k sources. For simplicity we consider the case k = 2, i.e. two arbitrary sources s_1 and s_2, each a Poisson process. In a time interval of t units they produce N_1 and N_2 jobs, respectively, with N_1 + N_2 = n. We are looking for the random process defined by the summation of both: given that f_1 and f_2 are the distribution functions of s_1 and s_2, the resulting process s = s_1 + s_2 has distribution f_1 * f_2, the convolution of the two random variables.

P(N + N 2 n) n P(N j)p(n 2 n j) j0 n j0 (λ t) j j! e λ t (λ (n j) 2t) (n j)! e λ 2t e λt e λ n (λ 2t t) j (λ 2 t) (n j) j0 j! (n j)! e λ t e λ 2t n j0 e λ t e λ 2t n! e λ t e λ 2t n! j!(n j)! (λ t) j (n j) (λ 2 t) n j0 n j0 e λ t e λ 2t n! (λ t + λ 2 t) n e (λ +λ 2 )t n! ((λ + λ 2 )t) n λt (λt)n e n! n! j!(n j)! (λ t) j (λ 2 t) ( ) n (λ t) j (n j) (λ 2 t) j Convolution of two distribution functions (n j) Hence we show that the merging is valid for two arbitrarily sources it will also be valid for k sources, that could be shown by complete induction over k. 2

Exercise 3.2: Poisson Process

Given is a job source which generates jobs according to a Poisson process with parameter λ. We want to split the stream of jobs into m substreams b_i, i = 1, ..., m, as shown in the following figure.

[Figure: the stream is split into substreams b_1, b_2, ..., b_m with probabilities p_1, p_2, ..., p_m]

Let p_i be the probability that a job will be assigned to substream i. Show that each substream b_i is a Poisson process with parameter λ p_i, assuming that the assignments are made independently.

Hint: Σ_{i=0}^{∞} x^i / i! = e^x

Solution 3.2

Let S be the Poisson process with parameter λ. We consider a time interval of length t with N jobs occurring in that time interval. The probability for that case is given by

    P(S = N) = (λt)^N / N! · e^{-λt}

A job is assigned to substream b_i with probability p_i. Hence, the probability that k of the N jobs are assigned to substream b_i is binomially distributed:

    P(k of N jobs to b_i) = (N choose k) · p_i^k (1 - p_i)^{N-k}

To obtain P(b_i = k) we have to take into consideration that N jobs occur at all and that k of them are assigned to substream b_i, i.e. we sum over all N ≥ k.

    P(b_i = k) = Σ_{N=k}^{∞} P(b_i = k | S = N) · P(S = N)

               = Σ_{N=k}^{∞} (N choose k) p_i^k (1 - p_i)^{N-k} · (λt)^N / N! · e^{-λt}

               = (e^{-λt} p_i^k / k!) · Σ_{N=k}^{∞} (λt)^N (1 - p_i)^{N-k} / (N-k)!

               = (e^{-λt} p_i^k / k!) · Σ_{N=0}^{∞} (λt)^{N+k} (1 - p_i)^N / N!      (change of index N → N + k)

               = (e^{-λt} p_i^k (λt)^k / k!) · Σ_{N=0}^{∞} ((1 - p_i) λt)^N / N!

               = (e^{-λt} (p_i λt)^k / k!) · e^{(1 - p_i) λt}                        (apply hint)

               = (p_i λt)^k / k! · e^{-λt + λt - p_i λt}

               = (p_i λt)^k / k! · e^{-p_i λt}

The result is again a Poisson process, with parameter λ p_i.
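Again as an illustrative check outside the reference solution, the splitting (thinning) property can be simulated: the count on substream b_i over an interval [0, t) should have mean and variance λ p_i t. The parameter values and seed below are arbitrary choices.

```python
import random

rng = random.Random(7)
lam, p_i, t, trials = 2.0, 0.3, 5.0, 20000

counts = []
for _ in range(trials):
    # number of jobs of the parent rate-lam Poisson process in [0, t)
    n, clock = 0, rng.expovariate(lam)
    while clock < t:
        n += 1
        clock += rng.expovariate(lam)
    # independent splitting: each job goes to substream b_i with probability p_i
    counts.append(sum(1 for _ in range(n) if rng.random() < p_i))

mean = sum(counts) / trials
var = sum((c - mean) ** 2 for c in counts) / trials
# Poisson(lam * p_i * t): mean and variance should both be near 3
print(round(mean, 2), round(var, 2))
```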

Exercise 3.3: Poisson Process

A computer service (e.g. a web server) starts at time t = 0. Jobs arrive at random in a Poisson fashion. Each job takes X time units of service time, where X is a random variable. Consider the following two cases for X:

- X = c constant
- X exponentially distributed with pdf b(x) = µe^{-µx}

and for both cases find

a) the probability P that the second job will not have to wait,
b) W, the average waiting time of the second job.

Solution 3.3

[Figure: job 1 arrives at t_1 and occupies the server for its service time x_1; job 2 arrives at t_2]

We define the following notions:

- A random variable A for the interarrival time, which is exponentially distributed with parameter λ, i.e. a(t) = λe^{-λt} and A(t) = ∫_0^t a(x) dx = 1 - e^{-λt}.
- A random variable X for the service time, which is either constant or exponentially distributed with parameter µ.

a) The second job does not have to wait if its interarrival time A_2 is at least as large as the service time X_1 of job 1, i.e. A_2 ≥ X_1. Hence we have to calculate P(no waiting) = P(A_2 ≥ X_1).

X = c constant: In that case X_1 = c and

    P(no waiting) = P(A_2 ≥ c) = ∫_c^∞ λe^{-λt} dt = [-e^{-λt}]_c^∞ = (-0) - (-e^{-λc}) = e^{-λc}

X_1 exponentially distributed with pdf b(x) = µe^{-µx}: Here

    P(no waiting) = P(A_2 ≥ X_1) = ∫_{x_1=0}^{∞} P(A_2 ≥ x_1 | X_1 = x_1) · b(x_1) dx_1

From before we know the solution for the first factor to be

    P(A_2 ≥ x_1) = ∫_{t=x_1}^{∞} a(t) dt = [-e^{-λt}]_{t=x_1}^{∞} = e^{-λ x_1}

so that

    P(no waiting) = ∫_{x_1=0}^{∞} e^{-λ x_1} · µe^{-µ x_1} dx_1 = µ ∫_{x_1=0}^{∞} e^{-(λ+µ) x_1} dx_1 = µ / (λ + µ)

b) Waiting time of the second job. The situation is depicted in the following figure.

[Figure: job 2 arrives during the service time x_1 of job 1 and waits for the remainder]

The second job has to wait if X_1 > A_2; the waiting time is then W = X_1 - A_2. If A_2 = i and X_1 = c, the waiting time is w = c - i, for 0 ≤ i ≤ c. In general, the average waiting time is given by

    E[W] = ∫_0^∞ t · P(X_1 - A_2 = t) dt

In the first case, we have X = c constant:

    E[W] = ∫_0^c (c - i) · a(i) di = ∫_0^c (c - i) λe^{-λi} di = (e^{-λc} + λc - 1) / λ

X exponentially distributed with pdf b(x) = µe^{-µx}: In that case the waiting time depends on the distribution of X, i.e. W = X - A_2. For the case X = c we obtained above

    W_c := (e^{-λc} + λc - 1) / λ

Conditioning on the service time X = x and using W_x from the constant case:

    E[W] = ∫_{x=0}^{∞} W_x · b(x) dx
         = ∫_{x=0}^{∞} (e^{-λx} + λx - 1)/λ · µe^{-µx} dx
         = (1/λ) · (µ/(λ+µ) + λ/µ - 1)
         = λ / (µ(λ + µ))
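Both parts can be checked with a short Monte Carlo simulation (an illustrative sketch, not part of the reference solution; the values of λ, µ, c and the seed are arbitrary choices):

```python
import math
import random

# illustrative parameter values: lam (arrival rate), mu (service rate), c
rng = random.Random(3)
lam, mu, c, trials = 1.0, 2.0, 1.0, 200000

# part a), X = c: the second job does not wait iff A2 >= c
p_const = sum(rng.expovariate(lam) >= c for _ in range(trials)) / trials
# part a), X ~ Exp(mu): the second job does not wait iff A2 >= X1
p_exp = sum(rng.expovariate(lam) >= rng.expovariate(mu)
            for _ in range(trials)) / trials

# part b): the waiting time of the second job is max(X1 - A2, 0)
w_const = sum(max(c - rng.expovariate(lam), 0.0) for _ in range(trials)) / trials
w_exp = sum(max(rng.expovariate(mu) - rng.expovariate(lam), 0.0)
            for _ in range(trials)) / trials

print(round(p_const, 3), round(math.exp(-lam * c), 3))                # vs e^(-lam c)
print(round(p_exp, 3), round(mu / (lam + mu), 3))                     # vs mu/(lam+mu)
print(round(w_const, 3), round((math.exp(-lam * c) + lam * c - 1) / lam, 3))
print(round(w_exp, 3), round(lam / (mu * (lam + mu)), 3))
```

With λ = 1, µ = 2, c = 1 the analytical values are e^{-1} ≈ 0.368 for the constant-case no-wait probability and (by coincidence of these parameters) also for its mean waiting time, 2/3 for the exponential no-wait probability, and 1/6 for the exponential mean waiting time.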

Exercise 3.4: Markov Chains

In the following figure the state diagram of a homogeneous Markov chain is shown.

[Figure: states 0, 1, 2, 3 with transitions 0→1 (p), 0→2 (1-p), 1→3 (1), 2→1 (1-q), 2→3 (q), 3→0 (1)]

a) Explain the notion of homogeneous.
b) Find the probability transition matrix P.
c) Show that the Markov chain is irreducible.
d) Is the Markov chain aperiodic?
e) Solve for the equilibrium probability vector π.
f) What happens if we set p = q = 1 or p = q = 0?

Solution 3.4

a) A Markov chain is denoted as homogeneous if the following is valid:

    P(X_{m+n} = j | X_m = i) = P(X_n = j | X_0 = i)

That means the transition probability from state i to state j is independent of time.

b) The probability transition matrix is given by

        | 0   p     1-p   0 |
    P = | 0   0     0     1 |
        | 0   1-q   0     q |
        | 1   0     0     0 |

c) A Markov chain is irreducible if every state communicates with every other state. Two states i and j communicate if state i can reach state j and vice versa, i.e. p^(n)_{i,j} > 0 and p^(k)_{j,i} > 0 for some n, k ≥ 1. In the lecture we defined

    p^(n+m)_{i,j} = Σ_k p^(n)_{i,k} · p^(m)_{k,j}

We list positive transition probabilities showing reachability of the states (assuming 0 < p < 1 and 0 < q < 1):

    p_{0,1} = p          p_{0,2} = 1-p        p^(2)_{0,3} ≥ p_{0,2} p_{2,3} = (1-p)q
    p_{1,3} = 1          p^(2)_{1,0} = p_{1,3} p_{3,0} = 1
    p_{2,1} = 1-q        p_{2,3} = q          p^(2)_{2,0} = p_{2,3} p_{3,0} = q
    p_{3,0} = 1          p^(2)_{3,1} = p_{3,0} p_{0,1} = p
                         p^(2)_{3,2} = p_{3,0} p_{0,2} = 1-p

All of these are positive, hence every state can reach every other state: the Markov chain is irreducible.

d) We calculate the period of each state from its return paths:

    p^(3)_{0,0} ≥ p_{0,1} p_{1,3} p_{3,0} = p · 1 · 1 = p
    p^(3)_{1,1} ≥ p_{1,3} p_{3,0} p_{0,1} = 1 · 1 · p = p
    p^(3)_{2,2} ≥ p_{2,3} p_{3,0} p_{0,2} = q · 1 · (1-p) = q(1-p)
    p^(3)_{3,3} ≥ p_{3,0} p_{0,2} p_{2,3} = 1 · (1-p) · q = (1-p)q

Every cycle in the state diagram (0→1→3→0 and 0→2→3→0) has length 3, so a return to any state is possible only after a multiple of 3 steps. There is only one equivalence class, and it has period 3: the Markov chain is periodic, not aperiodic.

e) The equilibrium probability vector is obtained by solving the linear equation

    (π_0, π_1, π_2, π_3) = (π_0, π_1, π_2, π_3) · P

with P as in part b), together with the normalization Σ_{i=0}^{3} π_i = 1. The result is (note that 3 + (1-p)(1-q) = 4 - p - q + pq):

    π_0 = 1 / (4 - p - q + pq)
    π_1 = (p + (1-q)(1-p)) / (4 - p - q + pq)
    π_2 = (1-p) / (4 - p - q + pq)
    π_3 = 1 / (4 - p - q + pq)
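A quick numerical cross-check with arbitrary illustrative values of p and q: because the chain is periodic, naive power iteration of π ← πP would oscillate rather than converge, so instead we verify directly that the closed-form vector from e) is a fixed point of π = πP and sums to 1.

```python
# closed-form equilibrium from e) versus the transition matrix from b);
# p and q are illustrative values
def transition_matrix(p, q):
    return [
        [0.0, p,       1.0 - p, 0.0],
        [0.0, 0.0,     0.0,     1.0],
        [0.0, 1.0 - q, 0.0,     q],
        [1.0, 0.0,     0.0,     0.0],
    ]

def equilibrium(p, q):
    d = 4.0 - p - q + p * q  # = 3 + (1-p)(1-q)
    return [1.0 / d, (p + (1.0 - q) * (1.0 - p)) / d, (1.0 - p) / d, 1.0 / d]

p, q = 0.3, 0.6
P = transition_matrix(p, q)
pi = equilibrium(p, q)
# one step of the chain: (pi P)_j = sum_i pi_i * P[i][j]
pi_next = [sum(pi[i] * P[i][j] for i in range(4)) for j in range(4)]

fixed_point = all(abs(a - b) < 1e-12 for a, b in zip(pi, pi_next))
print(fixed_point, round(sum(pi), 12))
```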

f) Special cases:

p = q = 1: The chain degenerates to the deterministic cycle 0 → 1 → 3 → 0. State 2 can no longer be entered, which is consistent with the equilibrium vector π = (1/3, 1/3, 0, 1/3).

p = q = 0: The chain becomes the deterministic cycle 0 → 2 → 1 → 3 → 0, which visits all four states; the equilibrium vector is π = (1/4, 1/4, 1/4, 1/4) and the period becomes 4.