MATH 56A: STOCHASTIC PROCESSES CHAPTER 6


6. Renewal

Mathematically, renewal refers to a continuous time stochastic process with states $0, 1, 2, \dots$,
\[ N_t \in \{0, 1, 2, 3, \dots\}, \]
so that you only have jumps from $x$ to $x + 1$ and the probability of jumping from $x$ to $x + 1$ depends only on how long the process was at state $x$. Renewal occurs at each jump.
\[ N_t := \text{number of jumps that occur in the time interval } (0, t] \]
The jumps (renewals) occur at times $Y_0$, $Y_0 + T_1$, $Y_0 + T_1 + T_2$, etc., and
\[ Y_0 + T_1 + \cdots + T_n = \inf\{t : N_t = n + 1\}. \]
The interpretation is that there is an object or process which lasts for a certain amount of time which is random. When the object dies or the process stops, you replace the object with a new one or you restart the process from the beginning: you renew the process each time it stops. The number $N_t$ is equal to the number of times renewal has taken place up to time $t$. $Y_0$ is the lifetime of the initial process, $T_1$ is the lifetime of the second one, $T_2$ is the lifetime of the third one, etc. If the first process starts from the beginning then $Y_0 = 0$ and the numbering is different: $T_n$ becomes the lifetime of the $n$th process and
\[ T_1 + \cdots + T_n = \inf\{t : N_t = n\}. \]
I gave a light bulb as an example. There are three kinds of light bulbs:
(1) The guaranteed light bulb, which will last exactly 100 hours.
(2) The Poisson light bulb. This light bulb is as good as new as long as it is working. Assume it has an expected life of 100 hours ($\lambda = 1/100$).
(3) A general light bulb, which has a general probability distribution with the property that its expected life is 100 hours.

Date: November 4, 2006.
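To make the definitions concrete, here is a minimal simulation sketch (not from the notes; the function names and the choice of a uniform law for the general bulb are my own assumptions). It builds the renewal times $T_1 + \cdots + T_n$ and counts the jumps $N_t$; each kind of bulb yields about $t/100$ renewals, consistent with an expected life of 100 hours.

import random

def renewal_count(lifetime, t_max):
    """Count the renewals (jumps of N_t) in (0, t_max], taking Y_0 = 0."""
    t, n = 0.0, 0
    while True:
        t += lifetime()          # T_1 + ... + T_n = time of the n-th renewal
        if t > t_max:
            return n
        n += 1

mu = 100.0                       # expected life: 100 hours in every case
bulbs = {
    "guaranteed": lambda: mu,                        # lasts exactly 100 hours
    "Poisson":    lambda: random.expovariate(1/mu),  # rate lambda = 1/100
    "general":    lambda: random.uniform(50, 150),   # an assumed general law, mean 100
}
for name, life in bulbs.items():
    avg = sum(renewal_count(life, 10_000.0) for _ in range(200)) / 200
    print(f"{name:10s}: average N_t = {avg:.1f}   (t/mu = 100.0)")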

In all three cases,
\[ \mu = E(T) = 100 \]
where $T$ is the length of time that the light bulb lasts. The first question is: which light bulb is worth more? The answer is that they are all worth the same: they all give an expected utility of 100 hours of light. With the general light bulb, there is another question: how long do you expect the light bulb to last after it has been used for a certain amount of time? This depends on the light bulb. For example, if the guaranteed light bulb has been used for 50 hours then it is only worth half as much as a new one. If the Poisson light bulb has lasted 50 hours then it is still worth the same as a new one. We will look at the value of a general light bulb (or a renewal process with a general distribution).

6.1. Renewal theorem. The guaranteed light bulb is an example of a periodic renewal process. Each renewal occurs at a multiple of 100 hours.

Definition 6.1. A renewal process is periodic if renewals always occur at (random) integer multiples of a fixed time interval $\Delta t$, starting with the first renewal, which occurs at time $Y_0$.

The renewal theorem says that, if renewal is not periodic, then the occurrences of the renewal will be spread out evenly around the clock: the probability that a renewal occurs depends only on the length of time you wait. Since the average waiting time is $\mu$, the probability is approximately the proportion of $\mu$ that you wait: $P = \Delta t/\mu$. For the light bulb, suppose you install a million light bulbs at the same time. Then after a while the number of light bulbs that burn out each day will be constant. This (after dividing by one million) will be the equilibrium distribution.

Theorem 6.2 (Renewal Theorem). If a renewal process is aperiodic then, as $t \to \infty$,
\[ P(\text{renewal occurs in time } (t, t + dt]) \to \frac{dt}{\mu} \]
where $\mu = E(T)$. This is equivalent to saying that
\[ \lim_{t \to \infty} E(\text{number of renewals in time } (t, t + s]) = \frac{s}{\mu}, \]
i.e.,
\[ \lim_{t \to \infty} E(N_{t+s} - N_t) = \frac{s}{\mu}. \]
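The theorem is easy to test numerically. The sketch below (my own check, with an assumed aperiodic lifetime law, uniform on $[50, 150]$ so that $\mu = 100$) estimates $E(N_{t+s} - N_t)$ for a large $t$ and compares it with $s/\mu$.

import random

def renewals_in_window(lifetime, t, s):
    """Return the number of renewals that fall in (t, t + s]."""
    clock, count = 0.0, 0
    while clock <= t + s:
        clock += lifetime()
        if t < clock <= t + s:
            count += 1
    return count

lifetime = lambda: random.uniform(50.0, 150.0)   # aperiodic, mu = 100
t, s, trials = 50_000.0, 500.0, 2_000
est = sum(renewals_in_window(lifetime, t, s) for _ in range(trials)) / trials
print(f"E(N_(t+s) - N_t) = {est:.2f},   s/mu = {s/100.0:.2f}")   # both close to 5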

6.2. age of current process. At any time $t$, let $A_t$ be the life of the current process. This would be the answer to the question: how long ago did you replace the light bulb? The book says that the pair $(N_t, A_t)$ determines the future of the process. $B_t$ denotes the remaining life of the current process. (How long will the current light bulb last?) First I needed the following lemma.

6.2.1. picture for an expected value.

Figure 1. The shaded area above the distribution function $F(t)$ is equal to the expectation.

Lemma 6.3. If $T$ is a nonnegative random variable then the expected value of $T$ is given by
\[ E(T) = \int_0^\infty (1 - F(t))\, dt. \]

Proof. The expected value of $T$ is defined by the integral
\[ E(T) = \int_0^\infty t f(t)\, dt. \]
Substituting the integral
\[ t = \int_0^t ds = \int_{s \le t} ds \]
we get:
\[ E(T) = \int_0^\infty \int_{s \le t} f(t)\, ds\, dt. \]
On the other hand,
\[ 1 - F(s) = P(T > s) = \int_s^\infty f(t)\, dt. \]

So,
\[ \int_0^\infty (1 - F(s))\, ds = \int_0^\infty \int_s^\infty f(t)\, dt\, ds = \iint_{s \le t} f(t)\, ds\, dt = E(T). \]

6.2.2. distribution of current age. What is the density function for the current age $A_t$ for large $t$? I.e., what is $P(s < A_t \le s + \Delta s)$? This is given by
\[ P(s < A_t \le s + \Delta s) \approx \frac{\Delta s}{\mu}\, (1 - F(s)) \]
because: the renewal event must occur in a time interval of length $\Delta s$ (ending $s$ before the present); by the renewal theorem this has probability approximately $\Delta s/\mu$. Then the next renewal event must occur at some time greater than $s$ later. That has probability $1 - F(s)$, where $F(s)$ is the distribution function of the length of time that each renewal process lasts. This is an approximation for large $t$ which depends on the Renewal Theorem. See the figure.

Figure 2. The age of the current process tells when the last renewal occurred.

To get the density function of the current age $A_t$ we have to take the limit as $\Delta s \to 0$:
\[ \psi_A(s) = \lim_{\Delta s \to 0} \frac{1}{\Delta s}\, P(s < A_t \le s + \Delta s) = \frac{1 - F(s)}{\mu}. \]
The lemma says that the integral of this density function is 1 (as it should be):
\[ \int_0^\infty \psi_A(s)\, ds = \int_0^\infty \frac{1 - F(s)}{\mu}\, ds = \frac{\mu}{\mu} = 1. \]
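Here is a small Monte Carlo sketch (my own check; the uniform lifetime law is an assumption) of the formula $\psi_A(s) = (1 - F(s))/\mu$. For lifetimes uniform on $[0, 200]$ we have $F(s) = s/200$ and $\mu = 100$, so $P(A_t \le 100) = \int_0^{100} (1 - s/200)/100\, ds = 0.75$.

import random

def age_at(t, lifetime):
    """Run renewals past time t; return the age A_t of the live process."""
    clock = 0.0
    while True:
        start = clock
        clock += lifetime()
        if clock > t:
            return t - start     # time elapsed since the last renewal

lifetime = lambda: random.uniform(0.0, 200.0)   # F(s) = s/200, mu = 100
t, trials = 10_000.0, 20_000
ages = [age_at(t, lifetime) for _ in range(trials)]
emp = sum(a <= 100.0 for a in ages) / trials
print(f"P(A_t <= 100): simulated {emp:.3f}, predicted 0.750")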

For the case of the exponential distribution we have $F(t) = 1 - e^{-\lambda t}$ and $\mu = 1/\lambda$. So
\[ f(t) = \lambda e^{-\lambda t} = \lambda (1 - F(t)) = \frac{1 - F(t)}{\mu} = \psi_A(t) \]
and the age of the current process has the same distribution as the entire lifespan of the process.

6.2.3. distribution of remaining life. The remaining life or residual life of the process at time $t$ is simply how long we have to wait for the next renewal. It is called $B_t$. It is a little more complicated to analyze.

Figure 3. The residual life $B_t$ seems to depend on the time of the last renewal.

We want to calculate the distribution function $\Psi_B(x) = P(B_t \le x)$. In order for this event to occur, we first need the last renewal to occur in some interval $ds$ at time $s$ before $t$. This has probability $ds/\mu$. Then we need the event not to occur again during the time interval of length $s$ before time $t$, but we need it to occur sometime in the time interval of length $x$ after time $t$. This occurs with probability $F(s + x) - F(s)$. But $s$ is arbitrary. So we sum over all $s$ and get:
\[ \Psi_B(x) = \int_0^\infty (F(s + x) - F(s))\, \frac{ds}{\mu}. \]
(See the figure.) To get the density function we should differentiate with respect to $x$:
\[ \psi_B(x) = \Psi_B'(x) = \int_0^\infty f(s + x)\, \frac{ds}{\mu}. \]
If we substitute $t = s + x$, $dt = ds$, we get:
\[ \psi_B(x) = \int_x^\infty f(t)\, \frac{dt}{\mu} = \frac{1 - F(x)}{\mu} = \psi_A(x). \]

In other words, the current age $A_t$ and the residual life $B_t$ have the same probability distribution.

6.2.4. relativity argument. The symmetry between past and future was the point which I wanted to explain using relativity. Instead of having a fixed time $t$ and looking at the renewal process as occurring at random times, you could think of the renewals as fixed and pick your current time $t$ at random. If you pick $t$ in some renewal period of length $C$ then the point of time that you choose is uniformly distributed (has equal probability of being at any point in the interval). In particular, the left and right parts have the same distribution.

The sum $C_t := A_t + B_t$ is equal to the total duration of the current process. To find the distribution function for $C_t$ you can use this relativity argument. Assume that renewal has occurred a very large number of times, say $N$. By the law of large numbers, the total amount of time this takes is very close to $N\mu$. Of these $N$ renewals, $f(x)\,dx$ represents the proportion of renewal periods of duration $x$ to $x + dx$. So, $N f(x)\,dx$ is the number of times this occurs. Since these renewal periods all have (essentially) the same length, the total length of time for all of these renewal periods is just the product $x N f(x)\,dx$. If you pick a time at random then the probability that the time you pick will be in one of these intervals is
\[ P(x < C_t \le x + dx) = \frac{x N f(x)\,dx}{N\mu} = \frac{x f(x)\,dx}{\mu}. \]
Therefore, the density function for $C_t$ is $\psi_C(x) = x f(x)/\mu$. For example, for the exponential distribution with rate $\lambda$, we get:
\[ \psi_C(x) = \frac{x f(x)}{\mu} = \frac{x \lambda e^{-\lambda x}}{1/\lambda} = \lambda^2 x e^{-\lambda x}. \]
This is the Gamma distribution with parameters $\lambda$ and $\alpha = 2$. The reason is that, in this case, $C_t$ is the sum of two independent exponential variables $A_t$, $B_t$ with rate $\lambda$.

6.3. Convolution. The convolution is used to describe the density function for the sum of independent random variables. It occurs in this chapter because the lifespans of the renewal periods are independent. So, the density function for the time of the $n$-th renewal is given by a convolution.

6.3.1. density of $X + Y$. Suppose that $X, Y$ are independent random variables with density functions $f(x)$, $g(y)$ respectively. Then we discussed in class that there are two ways to find the density $h(z)$ of $Z = X + Y$. The first method is intuitive and the second is rigorous.

Method 1. I assumed that $X, Y \ge 0$. But this assumption was not necessary; it just made it easier to talk about. $h(z)\,dz$ is the probability that $X + Y$ will lie in the interval $[z, z + dz]$. But in order for this to happen we first need $X$ to lie in some interval $[x, x + dx]$ where $x \le z$. This occurs with probability $f(x)\,dx$. Then we need $Y$ to be in the interval $[z - x, z - x + dz]$. This occurs with probability $g(z - x)\,dz$. So,
\[ h(z)\,dz = \int_0^z f(x)\, g(z - x)\, dx\, dz. \]
Divide by $dz$ to get
\[ h(z) = \int_0^z f(x)\, g(z - x)\, dx. \]
This is the convolution:
\[ h = f * g. \]

Method 2. Suppose that the distribution functions of $X, Y, Z$ are $F, G, H$. Then
\[ H(z) = P(X + Y \le z) = \int G(z - x)\, f(x)\, dx. \]
Differentiate both sides to get
\[ h(z) = \int g(z - x)\, f(x)\, dx. \]

6.3.2. $\Gamma$ distribution. Suppose that you have a Poisson process with rate $\lambda$. Let $T$ be the length of time you have to wait for the $\alpha$th occurrence of the event. Then $T$ has a $\Gamma$ distribution with parameters $\lambda$ and $\alpha$. Since the expected value of the waiting time for the Poisson event is $1/\lambda$, the expected value of $T$ must be $E(T) = \alpha/\lambda$. To get the density function of $T$ we take the convolution of $\alpha$ exponential densities:
\[ f = \underbrace{\phi * \phi * \cdots * \phi}_{\alpha}. \]
For example, when $\alpha = 2$ we get:
\[ f(t) = \int_0^t \phi(x)\, \phi(t - x)\, dx = \int_0^t \lambda e^{-\lambda x}\, \lambda e^{-\lambda (t - x)}\, dx = \int_0^t \lambda^2 e^{-\lambda t}\, dx = \lambda^2 t e^{-\lambda t}. \]
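Before stating the general formula, here is a quick numerical check of the $\alpha = 2$ computation (a sketch; the grid, rate, and names are my own choices). It convolves the exponential density with itself on a grid and compares the result with $\lambda^2 t e^{-\lambda t}$.

import math

lam, dx = 0.5, 0.001
grid = [i * dx for i in range(4000)]             # t in [0, 4)
phi = [lam * math.exp(-lam * x) for x in grid]   # exponential density

def convolve_at(k):
    """Riemann sum for (phi * phi)(t) = int_0^t phi(x) phi(t - x) dx at t = k*dx."""
    return sum(phi[i] * phi[k - i] for i in range(k + 1)) * dx

for k in (1000, 2000, 3000):
    t = k * dx
    exact = lam**2 * t * math.exp(-lam * t)
    print(f"t = {t:.1f}: convolution {convolve_at(k):.5f}, Gamma formula {exact:.5f}")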

In general you get:
\[ f(t) = \frac{1}{(\alpha - 1)!}\, \lambda^\alpha t^{\alpha - 1} e^{-\lambda t} \]
if $\alpha$ is an integer, and for any $\alpha$:
\[ f(t) = \frac{1}{\Gamma(\alpha)}\, \lambda^\alpha t^{\alpha - 1} e^{-\lambda t} \]
where $\Gamma(\alpha)$ is what it has to be when $\lambda = 1$:
\[ \Gamma(\alpha) = \int_0^\infty t^{\alpha - 1} e^{-t}\, dt. \]
One example of the $\Gamma$-distribution is $\chi^2_r$, the chi-squared distribution with $r$ degrees of freedom. This is the $\Gamma$-distribution with $\lambda = 1/2$ and $\alpha = r/2$.

6.4. M/G/1 queueing. In this model, we have people lining up in a queue and one server taking care of these people one at a time. Let's assume the server is a machine. In the notation M/G/1, the 1 stands for the number of servers. The M means that the customers are entering the queue according to a Poisson process with some fixed rate $\lambda$. The G means that the server does its job according to some fixed probability distribution which is general, i.e., it could be anything.

This is a renewal process where renewal occurs at the moment the queue is empty. At that time, the system is back in its original state with no memory of what happened.

$X_n$ = # people who enter the line during the $n$-th service period.
$U_n$ = length of time to serve the $n$-th person.

So, $E(X_n) = \lambda\mu$ where $\mu = E(U_n)$. We need to assume that $\lambda\mu < 1$. Otherwise, the line gets longer and longer.

$Y_n$ = # people in the queue right after the $n$-th person has been served.

Then $Y_{n+1} - Y_n = X_{n+1} - 1$ because $X_{n+1}$ is the number of people who enter the line and one person leaves. (Let $Y_0 = 1$ so that the equation also holds for $n = 0$.)

6.4.1. stopping time. A busy period is when the queue and server are active. Rest periods are when there is no one in the line. The queue will alternate between busy periods and rest periods. Define the stopping

time $\tau$ to be the number of people served during the first busy period. Then the first busy time (duration of the 1st busy period) is
\[ S = U_1 + U_2 + \cdots + U_\tau. \]
To find a formula for $\tau$ we used exercise 5.6 on p. 28:
(a) $M_n = X_1 + X_2 + \cdots + X_n - nE(X)$ is a uniformly integrable martingale.
(b) $M_0 = 0$.
(c) OST: $E(M_\tau) = E(M_0) = 0$. This gives us:
(d) (Wald's equation) $E(X_1 + \cdots + X_\tau) = E(\tau) E(X)$.
But the sum of the numbers $X_n$ gives the total number of people who entered the line after the first person. So:
\[ X_1 + \cdots + X_\tau = \tau - 1. \]
Put this into Wald's equation and we get:
\[ E(\tau) - 1 = E(\tau) E(X) = E(\tau) \lambda\mu \]
where $\mu = E(U)$. Solve for $E(\tau)$ to get
\[ E(\tau) = \frac{1}{1 - \lambda\mu}. \]

6.4.2. equilibrium distribution. We want to know about the equilibrium distribution of the numbers $Y_n$. The stopping time $\tau$ is the smallest number so that $Y_\tau = 0$. This means that $\tau$ is the time it takes for $Y_n$ to go from state $Y_0 = 1$ to state $Y_\tau = 0$. So $\tau + 1$ is the number of steps to go from $0$ to $0$. (In one step it goes from $0$ to $1$.) Therefore, in the equilibrium distribution $\pi$ of the Markov chain $Y_n$ we have
\[ \pi_0 = \frac{1}{E(\tau) + 1} = \frac{1 - \lambda\mu}{2 - \lambda\mu}. \]
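The busy-period formula is easy to check by simulation. The sketch below (my own construction; the uniform service law and the helper names are assumptions) runs the embedded chain $Y_n$ with $Y_0 = 1$, drawing $X_{n+1}$ as a Poisson variate with mean $\lambda U$, and compares the average number of customers served per busy period with $1/(1 - \lambda\mu)$.

import random

lam = 0.5                                    # Poisson arrival rate
service = lambda: random.uniform(0.0, 2.0)   # assumed general service law, mu = 1

def poisson(mean):
    """Sample a Poisson variate by counting unit-rate exponential arrivals."""
    total, k = random.expovariate(1.0), 0
    while total < mean:
        total += random.expovariate(1.0)
        k += 1
    return k

def busy_period_customers():
    """Serve customers until the queue first empties (Y_tau = 0); Y_0 = 1."""
    y, served = 1, 0
    while y > 0:
        u = service()               # U_n: service time of the n-th person
        y += poisson(lam * u) - 1   # Y_{n+1} = Y_n + X_{n+1} - 1
        served += 1
    return served

trials = 20_000
est = sum(busy_period_customers() for _ in range(trials)) / trials
print(f"E(tau): simulated {est:.2f}, formula 1/(1 - lam*mu) = {1/(1 - 0.5):.2f}")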