Math 180C, Spring 2018 Supplement on the Renewal Equation


These remarks supplement our text and set down some of the material discussed in my lectures. Unexplained notation is as in the text or in lecture.

1. In various contexts (continuous-time Markov chains, renewal processes, etc.) we encounter probabilities or expectations that depend on time t and satisfy a renewal equation:

(1.1) H(t) = h(t) + ∫_0^t H(t − s) f(s) ds,

or, briefly,

(1.2) H = h + H ∗ f.

Here f is a probability density function on [0, ∞), h : [0, ∞) → R is a bounded function (typically continuous), and the function H is the unknown.
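The convolution form (1.1) also suggests a direct numerical scheme: discretize the integral and march forward in t. The sketch below is my own illustration, not part of the text; solve_renewal is a hypothetical helper, and the right-endpoint Riemann sum is one of several reasonable discretizations.

```python
import numpy as np

def solve_renewal(h, f, T=10.0, dt=0.01):
    """Approximate H solving H(t) = h(t) + int_0^t H(t-s) f(s) ds
    on the grid t_k = k*dt, 0 <= t_k < T.  The convolution is replaced
    by a right-endpoint Riemann sum, so H[k] uses only H[0..k-1]."""
    t = np.arange(0.0, T, dt)
    H = np.empty_like(t)
    for k in range(len(t)):
        # sum_{j=1}^{k} H(t_k - j*dt) f(j*dt) dt  ~  int_0^{t_k} H(t_k - s) f(s) ds
        conv = np.dot(H[:k][::-1], f(t[1:k + 1])) * dt
        H[k] = h(t[k]) + conv
    return t, H
```

The scheme is first order in dt; a smaller step trades run time for accuracy.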

2. Example. The renewal function M(t) := E[N(t)] satisfies

M(t) = Σ_{n=1}^∞ P[W_n ≤ t] = Σ_{n=1}^∞ F_n(t),

where the cdf F_n(t) := P[W_n ≤ t] satisfies (condition on the value of X_1, using the Law of Total Probability)

F_n(t) = ∫_0^t F_{n−1}(t − s) f(s) ds = F_{n−1} ∗ f(t).

Therefore, because F_1 = F,

M(t) = F(t) + Σ_{n=2}^∞ ∫_0^t F_{n−1}(t − s) f(s) ds = F(t) + ∫_0^t Σ_{k=1}^∞ F_k(t − s) f(s) ds = F(t) + ∫_0^t M(t − s) f(s) ds;

that is,

(2.1) M = F + M ∗ f,

a special case of (1.2) with h = F. Integrating by parts, we can see that

M ∗ f(t) = ∫_0^t M(t − s) f(s) ds = [M(t − s) F(s)]_{s=0}^{s=t} + ∫_0^t m(t − s) F(s) ds = ∫_0^t F(t − s) m(s) ds = F ∗ m(t),

the boundary term vanishing because M(0) = F(0) = 0. (Recall that m := M′ is the renewal density.) Consequently, we also have

(2.2) M = F + F ∗ m.

3. Some aspects of the solution H of the renewal equation (1.2) that appear in the preceding example are generally true, as seen in parts (a) and (b) of the following theorem. Part (c) is the Key Renewal Theorem.

Theorem. Let the bounded function h be fixed.

(a) If H satisfies

(3.1) H = h + h ∗ m,

then max_{0≤s≤t} |H(s)| < ∞ for each t > 0 (H is said to be "locally bounded") and

(3.2) H = h + H ∗ f.

(b) Conversely, if H is a locally bounded solution of (3.2), then H also satisfies (3.1).

(c) Suppose that H satisfies (3.1) (or, equivalently, (3.2)), that h(t) ≥ 0 for all t ≥ 0, and that h(t) ≤ h(s) for all t ≥ s ≥ t_0, for some t_0 > 0 (h is "eventually decreasing"). Then

(3.3) lim_{t→∞} H(t) = (1/μ) ∫_0^∞ h(u) du,

where, as usual, μ = ∫_0^∞ t f(t) dt is the inter-arrival time mean.

The proof of this theorem was sketched in class using Laplace transforms; I will not repeat it here.

4. Example. The renewal function M(t) = E[N(t)] satisfies the renewal equation M = h + M ∗ f with the choice h = F. Unfortunately, as noted in class, ∫_0^∞ F(t) dt = +∞, so part (c) of the Theorem is not informative.
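As a numerical sanity check (my own illustration, reusing the solve_renewal sketch above; the exponential density is an assumed example): with f(s) = λe^{−λs} the renewal process is Poisson, so solving (2.1) with h = F should reproduce M(t) = λt, while for the integrable, decreasing choice h(t) = e^{−t} the Key Renewal Theorem (3.3) predicts H(t) → (1/μ) ∫_0^∞ e^{−u} du = λ.

```python
lam = 2.0                                 # assumed rate for the example
f = lambda u: lam * np.exp(-lam * u)      # exponential inter-arrival density

# (2.1) with h = F: for the Poisson process, M(t) = lam * t exactly.
t, M = solve_renewal(h=lambda u: 1.0 - np.exp(-lam * u), f=f, T=10.0)
print(M[-1], lam * t[-1])                 # the two values should nearly agree

# (3.3) with h(t) = exp(-t): mu = 1/lam, so the predicted limit is lam.
t, H = solve_renewal(h=lambda u: np.exp(-u), f=f, T=30.0)
print(H[-1])                              # approximately 2.0
```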

5. Key observation. All of our examples rely on a simple but crucial observation. The sequence W = {W_1, W_2, W_3, ...} of arrival times of a renewal process splits into the first arrival time W_1 and the (shifted) remaining arrival times W′ = {W′_1, W′_2, W′_3, ...}, where

W′_k := W_{k+1} − W_1 = X_2 + ··· + X_{k+1}.

Notice that W′ is independent of W_1 and has the same distribution as W. Another way to say this is that, given that W_1 = s, the sequence {W_2, W_3, W_4, ...} becomes {s + W′_1, s + W′_2, s + W′_3, ...}, and, as before, W′ := {W′_1, W′_2, W′_3, ...} has the same distribution as W.

6. Example. Fix y > 0 and consider H(t) := P[γ_t > y] for t ≥ 0. On the one hand, γ_t = X_1 − t on the event {X_1 > t + y}. On the event {t < X_1 ≤ t + y}, γ_t cannot be > y. Finally, on the event {X_1 ≤ t} we condition on the value of X_1 = W_1 being s and observe that in this case

γ_t = min{W_n : W_n > t, n ≥ 2} − t = min{s + W′_k : W′_k > t − s, k ≥ 1} − t = min{W′_k : W′_k > t − s, k ≥ 1} − (t − s) = γ′_{t−s}.

Therefore

P[γ_t > y | X_1 = s] = P[γ′_{t−s} > y] = H(t − s),

in which the second equality follows because γ′_{t−s} has the same distribution as γ_{t−s}, being related to W′ in the same way that γ_{t−s} is related to W. By the Law of Total Probability,

P[γ_t > y, X_1 ≤ t] = ∫_0^t P[γ_t > y | X_1 = s] f(s) ds = ∫_0^t H(t − s) f(s) ds.

It follows that H(t) = h(t) + H ∗ f(t) for t ≥ 0, where h(t) = P[X_1 > t + y] = 1 − F(t + y). We have

∫_0^∞ h(t) dt = ∫_y^∞ [1 − F(u)] du,

and so by Theorem 3(c) above

lim_{t→∞} P[γ_t > y] = (1/μ) ∫_y^∞ [1 − F(u)] du, y > 0.

In other words, the distribution of the random variable γ_t converges, as t → ∞, to that of a random variable γ_∞ with cdf

F_{γ_∞}(y) = 1 − (1/μ) ∫_y^∞ [1 − F(u)] du = (1/μ) ∫_0^y [1 − F(u)] du, y ≥ 0,

and density (differentiate!) f_{γ_∞}(y) = [1 − F(y)]/μ.
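The limit law for γ_t is easy to see in simulation. The sketch below is my own illustration; Uniform(0,1) inter-arrival times are an assumed choice, not from the text. For that choice μ = 1/2, so the predicted limiting tail is (1/μ) ∫_y^1 (1 − u) du = (1 − y)² for 0 ≤ y ≤ 1.

```python
import numpy as np

rng = np.random.default_rng(0)

def residual_life(t, n_paths=20_000):
    """Sample gamma_t = W_{N(t)+1} - t for Uniform(0,1) inter-arrivals."""
    out = np.empty(n_paths)
    for i in range(n_paths):
        w = 0.0
        while w <= t:                     # walk the renewals past time t
            w += rng.uniform(0.0, 1.0)
        out[i] = w - t                    # residual life at time t
    return out

g = residual_life(t=50.0)
y = 0.3
print((g > y).mean())                     # empirical P[gamma_t > y]
print((1.0 - y) ** 2)                     # predicted limit (1 - y)^2
```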

7. Example. In this example we take H(t) = (1/μ) E[γ_t]. On the one hand, because γ_t = W_{N(t)+1} − t, Wald's Identity tells us that H(t) = 1 + M(t) − t/μ. On the other hand, by the discussion in Example 6,

γ_t = 1{X_1 > t} (X_1 − t) + 1{X_1 ≤ t} γ′_{t−s} |_{s=W_1}.

Therefore, by the Law of Total Probability,

H(t) = (1/μ) E[X_1 − t; X_1 > t] + ∫_0^t H(t − s) f(s) ds, t ≥ 0.

That is, H = h + H ∗ f, where

h(t) = (1/μ) ∫_t^∞ (x − t) f(x) dx = (1/μ) ∫_t^∞ [1 − F(x)] dx,

and the second equality above follows from integration by parts. As we saw in class,

∫_0^∞ h(t) dt = (1/μ) ∫_0^∞ ∫_t^∞ [1 − F(x)] dx dt = (1/μ) ∫_0^∞ ∫_0^x dt [1 − F(x)] dx = (1/μ) ∫_0^∞ x [1 − F(x)] dx = (1/μ) ∫_0^∞ (x²/2) f(x) dx = (σ² + μ²)/(2μ),

where the penultimate equality results from integration by parts. By Theorem 3(c),

lim_{t→∞} [M(t) − t/μ] = (1/μ) ∫_0^∞ h(t) dt − 1 = (σ² + μ²)/(2μ²) − 1 = (σ² − μ²)/(2μ²).
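This limit, too, can be checked by Monte Carlo; the following sketch is my own illustration with the same assumed Uniform(0,1) inter-arrivals, for which μ = 1/2 and σ² = 1/12, so the predicted value of lim_t [M(t) − t/μ] is (1/12 − 1/4)/(2 · 1/4) = −1/3.

```python
import numpy as np

rng = np.random.default_rng(1)

def count_renewals(t, n_paths=50_000):
    """Sample N(t) for Uniform(0,1) inter-arrival times."""
    counts = np.zeros(n_paths, dtype=np.int64)
    for i in range(n_paths):
        w, n = 0.0, 0
        while True:
            w += rng.uniform(0.0, 1.0)    # next inter-arrival time
            if w > t:
                break
            n += 1
        counts[i] = n
    return counts

mu = 0.5                                  # mean of Uniform(0,1)
t = 40.0
N = count_renewals(t)
print(N.mean() - t / mu)                  # should be near -1/3
```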

8. Example. This example stems from the class discussion of the Central Limit Theorem for renewal processes. We define K(t) := E[N(t)²] and V(t) := Var[N(t)] = K(t) − M(t)². Notice that N(t) = 0 on the event {X_1 > t}, while on {X_1 ≤ t} we have N(t) = 1 + N′(t − s)|_{s=W_1}, using the logic (and the prime notation) of parts 5 and 6. Therefore

N(t)² = 1{X_1 ≤ t} + 2 · 1{X_1 ≤ t} N′(t − s)|_{s=W_1} + 1{X_1 ≤ t} [N′(t − s)]²|_{s=W_1},

and therefore, for 0 < s ≤ t,

E[N(t)² | X_1 = s] = 1 + 2 E[N′(t − s)] + E[[N′(t − s)]²] = 1 + 2 M(t − s) + K(t − s).

It now follows from the Law of Total Probability that

K(t) = P[X_1 ≤ t] + ∫_0^t [2 M(t − s) + K(t − s)] f(s) ds,

or (what is the same)

K(t) = F(t) + (2M + K) ∗ f(t).

But M = F + M ∗ f, and so K = F + (2M + K) ∗ f means that K = (2M − F) + K ∗ f. That is, K satisfies the renewal equation with h = 2M − F. It follows that

K = (2M − F) + (2M − F) ∗ m = M + 2 (M ∗ m),

where the last equality uses (2.2) in the form F + F ∗ m = M, and finally that

V = M + 2 (M ∗ m) − M².

From this, and our knowledge that M(t) = t/μ + (σ² − μ²)/(2μ²) + o(1) when the X_k have finite variance σ², we can deduce that

V(t) = σ² t/μ³ + o(t), t → ∞.
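The variance asymptotics admit the same kind of check (my illustration, reusing the count_renewals sampler above with its assumed Uniform(0,1) inter-arrivals): σ²/μ³ = (1/12)/(1/8) = 2/3, so Var[N(t)]/t should be near 2/3 for large t.

```python
t = 40.0
N = count_renewals(t)                     # sampler from the previous sketch
print(N.var() / t)                        # should be near sigma^2/mu^3 = 2/3
```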