Solutions Stochastic Processes and Simulation II, May 18, 2017

Problem 1: Poisson Processes

Let {N(t), t ≥ 0} be a homogeneous Poisson process on [0, ∞) with rate λ. Let {S_i, i = 1, 2, ...} be the points of the Poisson process, such that S_1 ≤ S_2 ≤ S_3 ≤ ....

a) Provide the distribution of S_2 − S_1. (4p)

Solution: The difference between two subsequent points of a homogeneous Poisson process is exponentially distributed; in this case with expectation 1/λ.

b) For x > 0, compute P[S_2 − S_1 ≤ x | N(1) = 2]. (4p)

Solution: By N(1) = 2 we know that there are exactly two points in the interval [0, 1]. By the order statistic property the positions of the two points (not taking their order into account) are independent uniforms on [0, 1], say U_1 and U_2. The distance between those two uniforms can be computed as follows (in the computation, s and t are the positions of the uniforms):

P[S_2 − S_1 ≤ x | N(1) = 2] = P[|U_2 − U_1| ≤ x]
  = ∫_0^1 ∫_0^1 1(|s − t| ≤ x) ds dt
  = ∫_0^1 ∫_0^t 1(|s − t| ≤ x) ds dt + ∫_0^1 ∫_t^1 1(|s − t| ≤ x) ds dt
  = 2 ∫_0^1 ∫_0^t 1(t − s ≤ x) ds dt   (by symmetry)
  = 2 ( ∫_0^x ∫_0^t ds dt + ∫_x^1 ∫_{t−x}^t ds dt )
  = 2 ( ∫_0^x t dt + ∫_x^1 x dt )
  = x² + 2x(1 − x).

c) Assume that at the points of the Poisson process {N(t), t ≥ 0} (as defined above) a busload of passengers arrives at a restaurant. The numbers of passengers in the buses are independent and identically distributed with a geometric distribution with parameter p. That is, if k is a strictly positive integer, then the number of passengers in a bus is k with probability p(1 − p)^{k−1}, and the mean number of passengers in a bus is 1/p. Let N'(t) be the number of passengers that have arrived at the restaurant up to time t. What sort of process is {N'(t), t ≥ 0}? Compute the mean of N'(t). (4p)

Solution: The process is a compound Poisson process with E[N'(t)] = λt/p by Wald's equation.
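
A minimal Monte Carlo sketch (not part of the original exam) that checks the answers to parts b) and c); the values of lam, p, t, x, the seed and the sample size n are arbitrary choices.

# Monte Carlo check of parts b) and c); parameter values are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
lam, p, t, x, n = 2.0, 0.3, 5.0, 0.4, 100_000

# Part b): given N(1) = 2, the two points are i.i.d. Uniform(0, 1), so
# P[S_2 - S_1 <= x | N(1) = 2] = P[|U_2 - U_1| <= x] = x^2 + 2x(1 - x).
u1, u2 = rng.uniform(size=n), rng.uniform(size=n)
print(np.mean(np.abs(u1 - u2) <= x), x**2 + 2 * x * (1 - x))

# Part c): buses arrive as a Poisson(lam) process, each carrying a Geometric(p)
# number of passengers (support 1, 2, ...), so E[N'(t)] = lam * t / p.
buses = rng.poisson(lam * t, size=n)
passengers = np.array([rng.geometric(p, size=b).sum() for b in buses])
print(passengers.mean(), lam * t / p)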

Problem 2: Renewal Theory

Let {N(t), t ≥ 0} be a renewal process with inter-arrival time distribution function F(x) = P(X ≤ x) = (1 − p) + p(1 − e^{−λx}) = 1 − p e^{−λx} for x ≥ 0. Note that one might write F(x) = P(X = 0) + P(X > 0) F_1(x), where F_1(x) = 1 − e^{−λx} is the distribution function of an exponentially distributed random variable. That is, the interarrival time X is 0 with probability (1 − p) and, conditioned on not being 0, the interarrival time is exponentially distributed with expectation 1/λ.

a) Provide the distribution of N(0). (4p)

Solution: Note that the process is a compound Poisson process with extra weight N(0) at time 0, where P(N(0) ≥ k) for k ≥ 0 is the probability that the first k interarrival times were all 0. So, P(N(0) ≥ k) = (1 − p)^k and P(N(0) = k) = p(1 − p)^k for k ≥ 0.

b) Show that m(t) = E[N(t)] = (1 − p)/p + λt/p. (4p)

Hint: One way (most straightforward, perhaps not the easiest) is to deduce the renewal equation, which in this case reads
m(t) = F(t) + P(X = 0) m(t) + ∫_0^t m(t − x) P(X > 0) f_1(x) dx   for t ≥ 0.
If you use this equation, you have to justify it.

Solution: The easiest way is to observe that the process described is a compound Poisson process with extra weight N(0) at time 0. Here N(0) has a shifted geometric distribution with expectation (1/p) − 1, the batches after time 0 arrive at the points of a homogeneous Poisson process with rate λ, and the batch sizes (the weights at an arrival) have expectation 1/p. Hence m(t) = E[N(0)] + λt · (1/p) = (1 − p)/p + λt/p.

It is also possible to use the renewal equation, with X_1 the time of the first arrival:

m(t) = E[N(t)] = E[E[N(t) | X_1]]
  = P(X_1 = 0) E[N(t) | X_1 = 0] + ∫_0^∞ f_1(x) P(X_1 > 0) E[N(t) | X_1 = x] dx
  = P(X_1 = 0)(1 + m(t)) + P(X_1 > 0) ∫_0^t f_1(x)(1 + m(t − x)) dx
  = P(X ≤ t) + P(X_1 = 0) m(t) + P(X_1 > 0) ∫_0^t f_1(x) m(t − x) dx,

as desired. Note that we can rewrite this as

p m(t) = m(t) − P(X = 0) m(t) = F(t) + P(X > 0) ∫_0^t m(t − x) f_1(x) dx,

and filling in m(t) = (1 − p)/p + λt/p gives on the left hand side 1 − p + λt, while the right hand side is given by

1 − p e^{−λt} + ∫_0^t λ e^{−λx} (1 − p + λ(t − x)) dx
  = 1 − p e^{−λt} + (1 − p + λt)(1 − e^{−λt}) − ∫_0^t λ² x e^{−λx} dx
  = 1 − p e^{−λt} + (1 − p + λt)(1 − e^{−λt}) − ( [−λx e^{−λx}]_{x=0}^t + ∫_0^t λ e^{−λx} dx )
  = 1 − p e^{−λt} + (1 − p + λt)(1 − e^{−λt}) + λt e^{−λt} − (1 − e^{−λt})
  = 1 − p + λt,

as desired. We have used integration by parts to evaluate ∫_0^t λ² x e^{−λx} dx.

c) Let Y(t) be the time until the next arrival at time t. So, Y(t) = S_{N(t)+1} − t, where S_i is the time of the i-th arrival. Provide lim_{t→∞} E[Y(t)]. (4p)

Solution: See Example 7.19 (page 433) in the book. Or use the relation between the renewal process and a compound Poisson process, in which the waiting time to the next event is always exponentially distributed with expectation 1/λ, so lim_{t→∞} E[Y(t)] = 1/λ.
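
As a quick sanity check (not part of the original exam), the sketch below simulates this renewal process and compares the empirical mean of N(t) with (1 − p)/p + λt/p; the values of lam, p, t, the seed, the number of runs and the helper name count_renewals are arbitrary choices made here.

# Simulation sketch for the renewal process of Problem 2: interarrival times are
# 0 with probability 1 - p and Exp(lam) otherwise.
import numpy as np

rng = np.random.default_rng(1)
lam, p, t, runs = 1.5, 0.4, 10.0, 10_000

def count_renewals(horizon):
    """Number of renewal points in [0, horizon], including the batch at time 0."""
    n, s = 0, 0.0
    while True:
        # Next interarrival: 0 with probability 1 - p, Exp(lam) with probability p.
        s += rng.exponential(1.0 / lam) if rng.uniform() < p else 0.0
        if s > horizon:
            return n
        n += 1

estimate = np.mean([count_renewals(t) for _ in range(runs)])
print(estimate, (1 - p) / p + lam * t / p)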

Problem 3: Queueing Theory

Consider an M/G/1 queue in which customers arrive according to a Poisson process with rate λ, and each customer needs exactly one time unit of service. (So, the workloads are not random.)

a) Provide the condition that λ should satisfy in order for the number of customers in the queue not to go to infinity. (2p)

Solution: λ < 1, because the long-run departure rate during busy periods (one customer per time unit) should exceed the long-run arrival rate.

Let V(t) be the remaining workload in the system at time t. That is, V(t) is the number of customers in the queue plus the remaining time until the customer in service leaves. Let W_Q be the expected time a customer spends in the queue.

b*) Argue that (1/t) ∫_0^t V(s) ds → λ(W_Q + 1/2) as t → ∞. (4p)

Solution: Assume that every customer pays at a rate equal to its remaining service time. So the expected total amount a customer pays is W_Q (it pays at rate 1 while in the queue) plus 1/2 (the integral of its remaining service time while in service). Over a long time the number of customers that have arrived is about λ per time unit. The rate at which the system earns at time s is V(s), because every customer in the queue pays at rate 1, and the customer in service pays at a rate equal to its remaining service time. Equating the long-run rate at which the system earns, lim_{t→∞} (1/t) ∫_0^t V(s) ds, to the long-run rate at which customers pay, λ(W_Q + 1/2), gives the desired result.

c) Argue that (1/t) ∫_0^t V(s) ds → W_Q as t → ∞. (3p)

Solution: By PASTA, a newly arrived customer sees on average lim_{t→∞} (1/t) ∫_0^t V(s) ds of work in front of him, which is equal to the time he or she has to stay in the queue (with expectation W_Q).

d) What is the asymptotic fraction of time that there are no customers in the system? (3p)
Hint: this question can be solved without parts b) and c).

Solution: Let N(t) be the number of arrivals up to time t, each of which brings in 1 time unit of workload. The total time the server has been working up to time t is equal to N(t) − V(t). Since the system does not explode, neither does V(t). So the fraction of time the server has been busy is (N(t) − V(t))/t. If t is large, V(t)/t converges to 0 and N(t)/t converges to the expected number of arrivals per time unit, which is λ. So, the fraction of time the system is idle converges to 1 − λ.
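
A short simulation sketch (not part of the original exam): equating the limits in parts b) and c) gives W_Q = λ(W_Q + 1/2), i.e. W_Q = λ/(2(1 − λ)), and the code below checks this together with the idle fraction 1 − λ from part d). The rate lam, the seed and the number of customers are arbitrary choices.

# M/D/1 simulation via the Lindley recursion with deterministic service time 1.
import numpy as np

rng = np.random.default_rng(2)
lam, n = 0.7, 200_000                        # arrival rate and number of customers

gaps = rng.exponential(1.0 / lam, size=n)    # Poisson arrivals: i.i.d. Exp(lam) gaps
waits = np.zeros(n)                          # time each customer spends in the queue
for i in range(1, n):
    # Lindley recursion: wait of customer i given wait of customer i-1.
    waits[i] = max(waits[i - 1] + 1.0 - gaps[i], 0.0)

last_departure = gaps.sum() + waits[-1] + 1.0   # arrival + wait + service of the last customer
print("idle fraction:", 1.0 - n / last_departure, "vs", 1.0 - lam)
print("mean time in queue:", waits.mean(), "vs", lam / (2.0 * (1.0 - lam)))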

Problem 4: Brownian Motion and Stationary Processes

Let {B(t), t ≥ 0} be a standard Brownian motion.

a) Provide the definition of a standard Brownian motion. (4p)

Solution: A standard Brownian motion {B(t), t ≥ 0} is a stochastic process in continuous time with continuous state space R which satisfies:
- B(0) = 0;
- B has independent increments, i.e. B(t_3) − B(t_2) and B(t_1) − B(t_0) are independent if t_0 < t_1 ≤ t_2 < t_3;
- for s < t, B(t) − B(s) ~ N(0, t − s);
- B is almost surely continuous.

Let µ and σ > 0 be constants. Define {Y(t), t ≥ 0} through Y(t) = e^{σB(t)+µt} for all t ≥ 0.

b) Compute the mean and the variance of Y(t). (4p)

Solution:

E[e^{σB(t)+µt}] = e^{µt} E[e^{σB(t)}]
  = e^{µt} ∫_{−∞}^{∞} (1/√(2πt)) e^{σx} e^{−x²/(2t)} dx
  = e^{µt} ∫_{−∞}^{∞} (1/√(2πt)) e^{−(x² − 2σtx)/(2t)} dx
  = e^{µt} ∫_{−∞}^{∞} (1/√(2πt)) e^{−[(x − σt)² − (σt)²]/(2t)} dx
  = e^{µt + σ²t/2} ∫_{−∞}^{∞} (1/√(2πt)) e^{−(x − σt)²/(2t)} dx
  = e^{µt + σ²t/2}.

The last equality holds because the integrand is the density of a normal distribution with mean σt and variance t. For the variance compute (multiplying µ and σ by 2 in the above computation)
E[Y(t)²] = E[e^{2σB(t)+2µt}] = e^{2µt + 2σ²t}.
So Var[Y(t)] = E[Y(t)²] − (E[Y(t)])² = e^{2µt + 2σ²t} − e^{2µt + σ²t}.

c) For this subquestion, assume that µ = 0. Provide the distribution function of the first time Y(t) is larger than or equal to 2. That is, let
T = inf{t ≥ 0 : Y(t) ≥ 2}
and find F(t) = P(T ≤ t). (4p)

Solution: Note T = inf{t ≥ 0 : Y(t) ≥ 2} = inf{t ≥ 0 : B(t) ≥ log[2]/σ}. By P(T ≤ t) = P(sup_{s ≤ t} B(s) ≥ log[2]/σ) and the reflection principle we obtain
P(T ≤ t) = 2 P(B(t) > log[2]/σ) = 2 P(N(0, t) > log[2]/σ).
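
A Monte Carlo sketch (not part of the original exam) checking the moments from part b) and the hitting-time distribution from part c); the values of mu, sigma, t, the seed and the sample sizes are arbitrary choices, and the path simulation is only a grid approximation.

# Check E[Y(t)], Var[Y(t)] and P(T <= t) = 2 P(N(0, t) > log(2)/sigma).
import numpy as np
from math import erfc, log, sqrt

rng = np.random.default_rng(3)
mu, sigma, t, n = 0.1, 0.5, 2.0, 500_000

# Part b): Y(t) = exp(sigma*B(t) + mu*t) with B(t) ~ N(0, t).
y = np.exp(sigma * rng.normal(0.0, sqrt(t), size=n) + mu * t)
print(y.mean(), np.exp(mu * t + sigma**2 * t / 2))
print(y.var(), np.exp(2 * mu * t + 2 * sigma**2 * t) - np.exp(2 * mu * t + sigma**2 * t))

# Part c) (with mu = 0): 2 P(N(0, t) > log(2)/sigma) = erfc(log(2)/(sigma*sqrt(2*t))),
# checked by simulating B on a grid; the discretisation slightly underestimates
# the probability of reaching the level log(2)/sigma before time t.
paths, steps = 5_000, 1_000
incr = rng.normal(0.0, sqrt(t / steps), size=(paths, steps))
hit = (np.cumsum(incr, axis=1).max(axis=1) >= log(2) / sigma).mean()
print(hit, erfc(log(2) / (sigma * sqrt(2 * t))))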

Problem 5: Simulation

Consider a non-homogeneous Poisson process on [0, ∞) with intensity
λ(x) = 1 + 1/(1 + x).
Assume that we are only able to generate realisations of uniform random variables and Poisson distributed random variables. Throughout the question all random variables generated are independent.

a) Let T > 0 and create the above non-homogeneous Poisson process on the interval [0, T) as follows. Generate N, a Poisson distributed random variable with expectation 2T. Generate N uniform random variables on [0, 1] and denote those random variables by U_1, U_2, ..., U_N. Generate another N uniform random variables on [0, 1] and denote those random variables by U'_1, U'_2, ..., U'_N. If 2U'_i < 1 + 1/(1 + T U_i), then place a point at T U_i. If 2U'_i > 1 + 1/(1 + T U_i), then ignore U_i further. Argue that this procedure indeed generates the points of a non-homogeneous Poisson process on [0, T) with intensity λ(x) = 1 + 1/(1 + x). (4p)

Solution: First create a homogeneous Poisson process with intensity 2 on [0, T). The number of points is, by the definition of a homogeneous Poisson process, distributed as a Poisson(2T) random variable. And if the number of points is known, you can use the order statistic property to place the points on the interval [0, T) as independent uniforms. Then use rejection sampling, by retaining a point at x with probability λ(x)/2, independently of the other points.

b) How can we generate a realisation of the same non-homogeneous Poisson process using two Poisson distributed random variables and a number of uniformly distributed random variables, where this number of uniformly distributed random variables has lower expectation than the corresponding number in part a)? (3p)

Solution: Create, independently, green and red points, each according to a homogeneous Poisson process with intensity 1 (as above). Keep all green points (so there is no need to generate extra uniforms for rejection sampling on the green points) and use rejection sampling with acceptance probability 1/(1 + x) on the red points.

c*) A method to find the first point of this inhomogeneous Poisson process is the following. Simulate two independent uniform random variables on [0, 1] (say U_1 and U_2). Set X_1 = −log[U_1] and X_2 = U_2/(1 − U_2), and let the first point of the non-homogeneous Poisson process be min(X_1, X_2). Show that min(X_1, X_2) is indeed the first point of the non-homogeneous Poisson process. (5p)

Hint: Show that X_1 is distributed as the first point in a Poisson process with intensity λ_1(x) = 1 and X_2 is distributed as the first point in a non-homogeneous Poisson process with intensity λ_2(x) = 1/(1 + x). Argue that the minimum of those two random variables is distributed as the first point in the original non-homogeneous Poisson process.

Solution: Use the idea of part b) with red and green points. The first green point arrives after an exponentially distributed time with expectation 1 (which is distributed as −log[U_1]). The first red point is obtained using Example 11.13 (page 677) with a = 1. The first point in the combination of the two Poisson processes is the minimum of the first red point and the first green point.
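
A sketch (not part of the original exam) of the thinning procedure from part a), together with a check of the first-point construction from part c). The value of T, the seed, the number of runs and the helper name nhpp_by_thinning are arbitrary choices made here.

# Thinning construction of the NHPP with intensity 1 + 1/(1 + x), and a check
# that min(X1, X2) from part c) has the same first-point distribution.
import numpy as np

rng = np.random.default_rng(4)
T, runs = 5.0, 50_000

def nhpp_by_thinning(T):
    """Points of the NHPP with intensity 1 + 1/(1 + x) on [0, T), as in part a)."""
    n = rng.poisson(2 * T)                 # homogeneous PP with intensity 2 has Poisson(2T) points
    u, u_acc = rng.uniform(size=n), rng.uniform(size=n)
    x = T * u                              # order statistic property: i.i.d. uniforms on [0, T)
    return np.sort(x[2 * u_acc < 1 + 1 / (1 + x)])   # keep a point at x with probability lambda(x)/2

# Part c): first point is min(X1, X2) with X1 ~ Exp(1) and X2 = U2/(1 - U2),
# which satisfies P(X2 > x) = 1/(1 + x).
first_thinned = np.empty(runs)
for i in range(runs):
    pts = nhpp_by_thinning(T)
    first_thinned[i] = pts[0] if pts.size else np.inf   # rare runs with no point before T
u1, u2 = rng.uniform(size=runs), rng.uniform(size=runs)
first_direct = np.minimum(-np.log(1.0 - u1), u2 / (1 - u2))   # 1 - U1 is also Uniform(0,1); avoids log(0)

# Both constructions should give the same first-point distribution; compare the
# empirical P(first point <= 0.5) (0.5 < T, so the truncation at T does not matter).
print((first_thinned <= 0.5).mean(), (first_direct <= 0.5).mean())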