Assignment 3 with Reference Solutions

Exercise 3.1: Poisson Process

Given are k independent sources s_i of jobs as shown in the figure below. The interarrival time between jobs for each source s_i is exponentially distributed with parameter λ_i, i = 1, ..., k.

[Figure: sources s_1, s_2, ..., s_k merged into a single stream]

Consider the stream which arises by merging all sources s_i. Prove that this stream is a Poisson process with parameter λ = λ_1 + ... + λ_k.

Solution 3.1

The Poisson distribution with parameter λ is defined by the probability mass function

    P_n(t) = ( (λt)^n / n! ) e^{-λt}

and describes the probability of the occurrence of n events in a time interval of length t when events happen with rate λ. Hence each source s_i is a Poisson process and generates jobs according to

    P_j(t) = ( (λ_i t)^j / j! ) e^{-λ_i t},

where j is the number of events from source s_i.

We now consider the situation that n jobs are generated in a certain time interval of length t. There are several ways in which these n jobs can be distributed over the k sources. For simplicity we consider the case k = 2. Hence we have two arbitrary sources s_1 and s_2, each a Poisson process, producing N_1 and N_2 jobs with N_1 + N_2 = n in the time interval of length t, respectively. We are looking for the random process defined by the summation of the two random processes s_1 and s_2. Given that f_1 and f_2 are the distribution functions of s_1 and s_2, the resulting random process s is given by s = s_1 + s_2 with distribution f_1 * f_2, called the convolution of the random variables.
    P(N_1 + N_2 = n)
      = Σ_{j=0}^{n} P(N_1 = j) P(N_2 = n-j)                                   (convolution of the two distributions)
      = Σ_{j=0}^{n} ( (λ_1 t)^j / j! ) e^{-λ_1 t} · ( (λ_2 t)^{n-j} / (n-j)! ) e^{-λ_2 t}
      = e^{-λ_1 t} e^{-λ_2 t} Σ_{j=0}^{n} (λ_1 t)^j (λ_2 t)^{n-j} / ( j! (n-j)! )
      = ( e^{-λ_1 t} e^{-λ_2 t} / n! ) Σ_{j=0}^{n} ( n! / (j! (n-j)!) ) (λ_1 t)^j (λ_2 t)^{n-j}
      = ( e^{-λ_1 t} e^{-λ_2 t} / n! ) Σ_{j=0}^{n} C(n, j) (λ_1 t)^j (λ_2 t)^{n-j}
      = ( e^{-λ_1 t} e^{-λ_2 t} / n! ) (λ_1 t + λ_2 t)^n                      (binomial theorem)
      = ( ((λ_1 + λ_2) t)^n / n! ) e^{-(λ_1 + λ_2) t}

This is again of the form ( (λt)^n / n! ) e^{-λt} with λ = λ_1 + λ_2. Hence we have shown that the merging of two arbitrary sources yields a Poisson process; the statement for k sources follows by complete induction over k.
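As a quick sanity check, the convolution above can be evaluated numerically and compared against the Poisson distribution with the summed rate. This is a minimal sketch; the rate values λ_1 t = 0.7 and λ_2 t = 1.9 are arbitrary illustrative choices, not part of the exercise:

```python
import math

def poisson_pmf(lam_t, n):
    """P_n(t) = (lam*t)^n / n! * e^{-lam*t}."""
    return (lam_t ** n) / math.factorial(n) * math.exp(-lam_t)

l1_t, l2_t = 0.7, 1.9   # lambda_1 * t and lambda_2 * t (arbitrary example values)
for n in range(10):
    # convolution: sum over all splits of n jobs between the two sources
    conv = sum(poisson_pmf(l1_t, j) * poisson_pmf(l2_t, n - j) for j in range(n + 1))
    # direct Poisson pmf with the summed rate
    direct = poisson_pmf(l1_t + l2_t, n)
    assert abs(conv - direct) < 1e-12
print("convolution matches Poisson(lambda_1 + lambda_2)")
```

Since the comparison is exact up to floating-point rounding, any mismatch would indicate an error in the derivation.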
Exercise 3.2: Poisson Process

Given is a job source which generates jobs according to a Poisson process with parameter λ. We want to split the stream of jobs into m substreams b_i, i = 1, ..., m, as shown in the following figure.

[Figure: one stream split into substreams b_1, ..., b_m with probabilities p_1, ..., p_m]

Let p_i be the probability that a job will be assigned to substream i. Show that each substream b_i is a Poisson process with parameter λ p_i, assuming that the assignments are made independently.

Hint: Σ_{i=0}^{∞} x^i / i! = e^x

Solution 3.2

Let S be the Poisson process with parameter λ. We consider a time interval of length t with N jobs occurring in that time interval. The probability for that case is given by

    P(S = N) = ( (λt)^N / N! ) e^{-λt}

A job is assigned to substream b_i with probability p_i. Hence, the probability that k of the N jobs are assigned to substream b_i is binomial:

    P(k of N jobs to b_i) = C(N, k) p_i^k (1 - p_i)^{N-k}

We have to take into consideration that N jobs occur and k of them are assigned to substream b_i.
    P(b_i = k)
      = Σ_{N=k}^{∞} P(b_i = k | S = N) P(S = N)
      = Σ_{N=k}^{∞} C(N, k) p_i^k (1 - p_i)^{N-k} · ( (λt)^N / N! ) e^{-λt}
      = e^{-λt} ( p_i^k / k! ) Σ_{N=k}^{∞} (λt)^N (1 - p_i)^{N-k} / (N-k)!     (since C(N, k) / N! = 1 / (k! (N-k)!))
      = e^{-λt} ( p_i^k (λt)^k / k! ) Σ_{N=0}^{∞} ( λt (1 - p_i) )^N / N!      (change of index N → N + k)
      = e^{-λt} ( p_i^k (λt)^k / k! ) e^{λt (1 - p_i)}                          (apply the hint)
      = ( (p_i λt)^k / k! ) e^{-λt + λt - p_i λt}
      = ( (p_i λt)^k / k! ) e^{-p_i λt}

The result is again a Poisson process, with parameter λ p_i.
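The splitting (thinning) result can be checked the same way: sum the binomially thinned Poisson probabilities over N and compare with the Poisson distribution of rate p_i λ. The values λt = 2.0 and p_i = 0.3 below are arbitrary examples, and the infinite sum is truncated once its terms are negligible:

```python
import math

def poisson_pmf(lam_t, n):
    return (lam_t ** n) / math.factorial(n) * math.exp(-lam_t)

lam_t, p = 2.0, 0.3          # example values for lambda*t and p_i
for k in range(8):
    # P(b_i = k): binomial thinning summed over the total number of jobs N,
    # truncated at N = k + 80 where the Poisson terms are negligibly small
    split = sum(math.comb(N, k) * p**k * (1 - p)**(N - k) * poisson_pmf(lam_t, N)
                for N in range(k, k + 80))
    assert abs(split - poisson_pmf(p * lam_t, k)) < 1e-10
print("thinned stream is Poisson(p_i * lambda * t)")
```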
Exercise 3.3: Poisson Process

A computer service (e.g. a web server) starts at time t = 0. Jobs arrive at random in a Poisson fashion. Each job takes X time units of service time; X is a random variable. Consider the following two cases for X:

X = c constant
X exponentially distributed with b(x) = µ e^{-µx}

and for both cases find out

a) the probability P that the second job will not have to wait
b) W, the average waiting time of the second job

Solution 3.3

The situation is depicted in the following figure.

[Figure: timeline from the start with arrivals at t_1 and t_2; job 1 occupies the server for its service time x_1]

We define the following notions:

A random variable A for the inter-arrival time, which is exponentially distributed with parameter λ, i.e. a(t) = λ e^{-λt} and A(t) = ∫_0^t a(x) dx = 1 - e^{-λt}.

A random variable X for the service time, which is either a constant or exponentially distributed with parameter µ.

a) The second job does not have to wait if the inter-arrival time A_2 of this job is larger than the service time X_1 of job 1, i.e. A_2 ≥ X_1. Hence we have to calculate P(no waiting) = P(A_2 ≥ X_1).

X = c constant: In that case X_1 = c and

    P(no waiting) = P(A_2 ≥ c) = ∫_c^∞ λ e^{-λt} dt = [ -e^{-λt} ]_c^∞ = (-0) - (-e^{-λc}) = e^{-λc}
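The result P(no waiting) = e^{-λc} for the constant service time is easy to confirm by simulation; λ = 1 and c = 0.5 below are arbitrary example values, not from the exercise:

```python
import math
import random

random.seed(1)
lam, c = 1.0, 0.5            # arbitrary example rate and constant service time
n = 100_000
# count how often the second inter-arrival time exceeds the service time c
no_wait = sum(random.expovariate(lam) >= c for _ in range(n))
estimate = no_wait / n
exact = math.exp(-lam * c)   # e^{-lambda c}
print(f"simulated {estimate:.4f} vs exact {exact:.4f}")
assert abs(estimate - exact) < 0.01
```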
X exponentially distributed with pdf b(x) = µ e^{-µx}: We condition on the service time of job 1,

    P(no waiting) = P(A_2 ≥ X_1) = ∫_0^∞ P(A_2 ≥ x_1 | X_1 = x_1) b(x_1) dx_1

From before we know the solution for the first factor:

    P(A_2 ≥ x_1) = 1 - A(x_1) = e^{-λ x_1}

Hence

    P(no waiting) = ∫_0^∞ e^{-λ x_1} µ e^{-µ x_1} dx_1 = µ ∫_0^∞ e^{-(λ+µ) x_1} dx_1 = µ / (λ + µ)

b) Waiting time of the second job. The situation is depicted in the following figure.

[Figure: timeline with arrivals at t_1 and t_2; the service time x_1 of job 1 exceeds the inter-arrival time, so job 2 must wait]

The second job has to wait if X_1 > A_2; then the waiting time is given by w = X_1 - A_2. If A_2 = i and X_1 = c, then the waiting time is w = c - i, for 0 ≤ i ≤ c. In general, the average waiting time is given by

    E[W] = ∫_0^∞ t · P(X_1 - A_2 = t) dt

In the first case, we have X = c constant:

    E[W] = ∫_0^c (c - i) a(i) di = ∫_0^c (c - i) λ e^{-λi} di = c - (1 - e^{-λc})/λ = ( e^{-λc} + λc - 1 ) / λ

X exponentially distributed with pdf b(x) = µ e^{-µx}: In that case the waiting time depends on the distribution of X, i.e. W = X - A_2. For the case X = c we got

    W_c = ( e^{-λc} + λc - 1 ) / λ
Conditioning on the service time X_1 = x and using W_x from the constant case with c replaced by x:

    E[W] = ∫_0^∞ W_x b(x) dx
         = ∫_0^∞ ( ( e^{-λx} + λx - 1 ) / λ ) µ e^{-µx} dx
         = (1/λ) ( µ/(λ+µ) + λ/µ - 1 )
         = λ / ( µ (λ + µ) )
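Both waiting-time formulas can be validated with a small Monte Carlo experiment: draw the inter-arrival time A_2 and the service time X, and average W = max(X - A_2, 0). The parameter values below are arbitrary examples:

```python
import math
import random

random.seed(2)
lam, mu, c = 1.0, 2.0, 0.8   # arbitrary example parameters
n = 200_000
a2 = [random.expovariate(lam) for _ in range(n)]   # inter-arrival times A_2

# case X = c constant: W = max(c - A_2, 0)
w_const = sum(max(c - a, 0.0) for a in a2) / n
exact_const = (math.exp(-lam * c) + lam * c - 1) / lam

# case X ~ Exp(mu): W = max(X - A_2, 0)
w_exp = sum(max(random.expovariate(mu) - a, 0.0) for a in a2) / n
exact_exp = lam / (mu * (lam + mu))

assert abs(w_const - exact_const) < 0.01
assert abs(w_exp - exact_exp) < 0.01
```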
Exercise 3.4: Markov Chains

In the following figure the state diagram of a homogeneous Markov chain is shown.

[Figure: four states 0, 1, 2, 3 with transitions 0→1 (p), 0→2 (1-p), 1→3 (1), 2→1 (1-q), 2→3 (q), 3→0 (1)]

a) Explain the notion of homogeneous.
b) Find the probability transition matrix P.
c) Show that the Markov chain is irreducible.
d) Is the Markov chain aperiodic?
e) Solve for the equilibrium probability vector π.
f) What happens if we set p = q = 1 or p = q = 0?

Solution 3.4

a) A Markov chain is denoted as homogeneous if the following is valid:

    P(X_{m+n} = j | X_m = i) = P(X_n = j | X_0 = i)

that means, the transition probability from state i to state j is independent of time.

b) The probability transition matrix is given by

        | 0   p    1-p  0 |
    P = | 0   0    0    1 |
        | 0   1-q  0    q |
        | 1   0    0    0 |

c) A Markov chain is irreducible if every state communicates with every other state. The property communicate means that state i can reach state j and vice versa, i.e. p^{(n)}_{i,j} > 0 and p^{(k)}_{j,i} > 0 for some n, k. In the lecture we defined

    p^{(n+m)}_{i,j} = Σ_k p^{(n)}_{i,k} p^{(m)}_{k,j}
List reachability of the states (assuming 0 < p < 1 and 0 < q < 1):

    p_{0,1} = p        p_{0,2} = 1-p                    p^{(2)}_{0,3} ≥ p_{0,2} p_{2,3} = (1-p) q
    p_{1,3} = 1        p^{(2)}_{1,0} = p_{1,3} p_{3,0} = 1    p^{(3)}_{1,2} = p_{1,3} p_{3,0} p_{0,2} = 1-p
    p_{2,1} = 1-q      p_{2,3} = q                      p^{(2)}_{2,0} = p_{2,3} p_{3,0} = q
    p_{3,0} = 1        p^{(2)}_{3,1} = p_{3,0} p_{0,1} = p    p^{(2)}_{3,2} = p_{3,0} p_{0,2} = 1-p

Every state can reach every other state, hence the Markov chain is irreducible.

d) Every state has a return path of length 3:

    p^{(3)}_{0,0} ≥ p_{0,1} p_{1,3} p_{3,0} = p
    p^{(3)}_{1,1} ≥ p_{1,3} p_{3,0} p_{0,1} = p
    p^{(3)}_{2,2} ≥ p_{2,3} p_{3,0} p_{0,2} = q (1-p)
    p^{(3)}_{3,3} ≥ p_{3,0} p_{0,2} p_{2,3} = (1-p) q

However, there is also a return path of length 4, namely 0 → 2 → 1 → 3 → 0 with probability p_{0,2} p_{2,1} p_{1,3} p_{3,0} = (1-p)(1-q) > 0. Since the chain is irreducible there is only one equivalence class, and all its states share the same period gcd(3, 4) = 1. Hence the Markov chain is aperiodic for 0 < p < 1 and 0 < q < 1. (In the degenerate cases of part f) the chain becomes periodic.)

e) The equilibrium is obtained by solving the following linear equation together with the normalization Σ_{i=0}^{3} π_i = 1:

                           | 0   p    1-p  0 |
    (π_0, π_1, π_2, π_3) · | 0   0    0    1 |  =  (π_0, π_1, π_2, π_3)
                           | 0   1-q  0    q |
                           | 1   0    0    0 |

The result for the equilibrium probabilities is:

    π_0 = 1 / ( 3 + (1-p)(1-q) )                = 1 / ( 4 - p - q + pq )
    π_1 = ( p + (1-p)(1-q) ) / ( 3 + (1-p)(1-q) ) = ( 1 - q + pq ) / ( 4 - p - q + pq )
    π_2 = (1-p) / ( 3 + (1-p)(1-q) )            = (1-p) / ( 4 - p - q + pq )
    π_3 = 1 / ( 3 + (1-p)(1-q) )                = 1 / ( 4 - p - q + pq )
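The closed-form equilibrium vector can be verified exactly with rational arithmetic: plug π into π P = π for sample values of p and q (the choices p = 1/3, q = 1/4 below are arbitrary):

```python
from fractions import Fraction as F

p, q = F(1, 3), F(1, 4)        # arbitrary example values with 0 < p, q < 1
P = [[0,     p, 1 - p, 0],
     [0,     0, 0,     1],
     [0, 1 - q, 0,     q],
     [1,     0, 0,     0]]

d = 4 - p - q + p * q          # normalization constant 3 + (1-p)(1-q)
pi = [1 / d, (1 - q + p * q) / d, (1 - p) / d, 1 / d]

# check pi * P == pi (global balance) and sum(pi) == 1 exactly
for j in range(4):
    assert sum(pi[i] * P[i][j] for i in range(4)) == pi[j]
assert sum(pi) == 1
```

Because `Fraction` arithmetic is exact, the assertions hold with equality rather than within a floating-point tolerance.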
f) Special cases:

p = q = 1: From state 0 the chain moves to state 1 with certainty, so state 2 is no longer reachable and becomes transient. The remaining states form the deterministic cycle 0 → 1 → 3 → 0, a periodic chain with period 3.

p = q = 0: All transitions become deterministic and the chain collapses to the single cycle 0 → 2 → 1 → 3 → 0, a periodic chain with period 4.