Manual for SOA Exam MLC.


Chapter 10. Poisson processes.
Section 10.5. Nonhomogeneous Poisson processes.
Extract from: Arcones' Manual for SOA Exam MLC, Fall 2009 Edition, available at http://www.actexmadriver.com/

Definition 1. The counting process $\{N(t) : t \ge 0\}$ is said to be a nonhomogeneous Poisson process with intensity function $\lambda(t)$, $t \ge 0$, if
(i) $N(0) = 0$.
(ii) For each $t > 0$, $N(t)$ has a Poisson distribution with mean $m(t) = \int_0^t \lambda(s)\, ds$.
(iii) For each $0 \le t_1 < t_2 < \cdots < t_m$, the increments $N(t_1), N(t_2) - N(t_1), \ldots, N(t_m) - N(t_{m-1})$ are independent random variables.

A nonhomogeneous Poisson process with $\lambda(t) = \lambda$ for each $t \ge 0$ is a regular (homogeneous) Poisson process.

$m(t)$ is called the mean value function of the nonhomogeneous Poisson process.

The increments of a nonhomogeneous Poisson process are independent, but not necessarily stationary.

A nonhomogeneous Poisson process is a Markov process.

For each $0 \le s < t$, $N(t) - N(s)$ has a Poisson distribution with mean $m(t) - m(s) = \int_s^t \lambda(x)\, dx$.

For each $0 \le t_1 < t_2 < \cdots < t_m$ and all integers $k_1, \ldots, k_m \ge 0$ (the probability is zero unless $k_1 \le k_2 \le \cdots \le k_m$),
\[
\begin{aligned}
& P\{N(t_1) = k_1, N(t_2) = k_2, \ldots, N(t_m) = k_m\} \\
&= P\{N(t_1) = k_1,\, N(t_2) - N(t_1) = k_2 - k_1, \ldots, N(t_m) - N(t_{m-1}) = k_m - k_{m-1}\} \\
&= e^{-m(t_1)} \frac{(m(t_1))^{k_1}}{k_1!} \cdot e^{-(m(t_2) - m(t_1))} \frac{(m(t_2) - m(t_1))^{k_2 - k_1}}{(k_2 - k_1)!} \cdots e^{-(m(t_m) - m(t_{m-1}))} \frac{(m(t_m) - m(t_{m-1}))^{k_m - k_{m-1}}}{(k_m - k_{m-1})!}.
\end{aligned}
\]
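The definition specifies the distribution of the counts but not how to generate sample paths. One standard approach is thinning: simulate a homogeneous Poisson process with rate $\lambda_{\max} \ge \lambda(t)$ and accept an arrival at time $t$ with probability $\lambda(t)/\lambda_{\max}$. Below is a minimal Python sketch of this idea; it is not part of the manual, and the intensity $\lambda(t) = 2 + \cos t$ and the function names are purely illustrative.

```python
import math
import random

def simulate_nhpp(intensity, lam_max, horizon, rng=random):
    """Thinning: arrival times of a nonhomogeneous Poisson process
    with intensity(t) <= lam_max on [0, horizon]."""
    arrivals = []
    t = 0.0
    while True:
        # candidate arrivals form a homogeneous Poisson process of rate lam_max
        t += rng.expovariate(lam_max)
        if t > horizon:
            return arrivals
        # keep the candidate with probability intensity(t) / lam_max
        if rng.random() * lam_max <= intensity(t):
            arrivals.append(t)

# Illustrative intensity (an assumption, not from the manual): lambda(t) = 2 + cos(t) <= 3
lam = lambda t: 2.0 + math.cos(t)
m = lambda t: 2.0 * t + math.sin(t)          # m(t) = int_0^t lambda(s) ds

paths = [simulate_nhpp(lam, 3.0, 5.0) for _ in range(20000)]
print(sum(len(p) for p in paths) / len(paths), m(5.0))  # sample mean of N(5) vs m(5)
```

With this construction the empirical mean of $N(5)$ should be close to $m(5)$, consistent with property (ii) of the definition.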

Example 1. For a nonhomogeneous Poisson process the intensity function is given by
\[
\lambda(t) = \begin{cases} 5 & \text{if } t \in (1, 2], (3, 4], \ldots \\ 3 & \text{if } t \in (0, 1], (2, 3], \ldots \end{cases}
\]
Find the probability that the number of observed occurrences in the time period $(1.25, 3]$ is more than two.

Solution: $N(3) - N(1.25)$ has a Poisson distribution with mean
\[
m(3) - m(1.25) = \int_{1.25}^{3} \lambda(t)\, dt = \int_{1.25}^{2} 5\, dt + \int_{2}^{3} 3\, dt = 3.75 + 3 = 6.75.
\]
Hence,
\[
P\{N(3) - N(1.25) > 2\} = 1 - e^{-6.75}\left(1 + 6.75 + \frac{(6.75)^2}{2}\right) = 0.9642515816.
\]
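A quick numerical check of this answer (an illustrative Python sketch, not part of the manual):

```python
import math

mean = 5 * (2 - 1.25) + 3 * (3 - 2)                    # m(3) - m(1.25) = 6.75
prob = 1 - math.exp(-mean) * (1 + mean + mean**2 / 2)  # P{N(3) - N(1.25) > 2}
print(mean, prob)                                      # 6.75, approximately 0.96425
```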

Let $S_n$ be the time of the $n$-th occurrence. $S_n$ is an extended random variable with values in $[0, \infty]$. Then,
\[
P\{S_n > t\} = P\{N(t) \le n - 1\} = \sum_{j=0}^{n-1} e^{-m(t)} \frac{(m(t))^j}{j!} = P\{\text{Gamma}(n, 1) > m(t)\}.
\]
If $\lim_{t \to \infty} m(t) = \infty$, then
\[
P\{S_n = \infty\} = \lim_{t \to \infty} P\{S_n > t\} = P\{\text{Gamma}(n, 1) > \lim_{t \to \infty} m(t)\} = 0
\]
and $S_n$ is a (proper) random variable. The density of $S_n$ is
\[
f_{S_n}(t) = e^{-m(t)} \frac{(m(t))^{n-1}}{(n-1)!}\, m'(t) = e^{-m(t)} \frac{(m(t))^{n-1}}{(n-1)!}\, \lambda(t), \quad t \ge 0.
\]
If $\lim_{t \to \infty} m(t) < \infty$, then $S_n$ is a mixed random variable with
\[
P\{S_n = \infty\} = P\{\text{Gamma}(n, 1) > \lim_{t \to \infty} m(t)\} > 0
\]
and the density of its continuous part is
\[
f_{S_n}(t) = e^{-m(t)} \frac{(m(t))^{n-1}}{(n-1)!}\, \lambda(t), \quad t \ge 0.
\]
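As a quick illustration of the identity $P\{S_n > t\} = P\{\text{Gamma}(n,1) > m(t)\}$, the sketch below compares the Poisson sum with the Gamma tail probability. It assumes SciPy is available and reuses the illustrative intensity $\lambda(t) = 2 + \cos t$ from the simulation sketch above; none of this is from the manual.

```python
import math
from scipy.stats import gamma

def m(t):
    # mean value function of the illustrative intensity lambda(t) = 2 + cos(t)
    return 2.0 * t + math.sin(t)

n, t = 3, 4.0
poisson_sum = sum(math.exp(-m(t)) * m(t) ** j / math.factorial(j) for j in range(n))
gamma_tail = gamma.sf(m(t), a=n)   # P{Gamma(n, 1) > m(t)}
print(poisson_sum, gamma_tail)     # both equal P{S_n > t}; they should agree
```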

Let $T_n = S_n - S_{n-1}$ be the $n$-th interarrival time. For a nonhomogeneous Poisson process, $\{T_n\}_{n=1}^{\infty}$ are not necessarily independent random variables. For $s, t \ge 0$,
\[
\begin{aligned}
P\{T_{n+1} > t \mid S_n = s\} &= P\{S_{n+1} > s + t \mid S_n = s\} = P\{N(s + t) = n \mid S_n = s\} \\
&= P\{N(s + t) - N(s) = 0 \mid S_n = s\} = P\{N(s + t) - N(s) = 0\} = e^{-(m(s+t) - m(s))}.
\end{aligned}
\]
Notice that the event $\{S_n = s\}$ depends on the nonhomogeneous Poisson process only up to time $s$. Hence, $\{N(s + t) - N(s) = 0\}$ and $\{S_n = s\}$ are independent.
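Because the increment $N(s+t) - N(s)$ is independent of the past, the conditional probability above is just the probability of an empty increment, which can be checked directly by simulation. A small Monte Carlo sketch (again using the illustrative intensity $\lambda(t) = 2 + \cos t$; the values of $s$ and $t$ are arbitrary and not from the manual):

```python
import math
import random

lam = lambda t: 2.0 + math.cos(t)       # illustrative intensity, bounded by 3
m = lambda t: 2.0 * t + math.sin(t)     # its mean value function
s, t, lam_max, trials = 1.0, 0.8, 3.0, 100000

def empty_increment():
    """True if a thinned NHPP path has no arrival in (s, s + t]."""
    u = s
    while True:
        u += random.expovariate(lam_max)
        if u > s + t:
            return True
        if random.random() * lam_max <= lam(u):
            return False

estimate = sum(empty_increment() for _ in range(trials)) / trials
print(estimate, math.exp(-(m(s + t) - m(s))))   # the two numbers should be close
```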

Example 2. For a nonhomogeneous Poisson process, the intensity function is given by
\[
\lambda(t) = \begin{cases} t & \text{for } 0 < t \le 4, \\ 4 & \text{for } 4 < t. \end{cases}
\]
If $S_5 = 2$, calculate the probability that $S_6 > 5$.

Solution: We have that
\[
P\{S_6 > 5 \mid S_5 = 2\} = P\{N(5) = 5 \mid S_5 = 2\} = P\{N(5) - N(2) = 0 \mid S_5 = 2\} = P\{N(5) - N(2) = 0\}
\]
and
\[
m(5) - m(2) = \int_2^5 \lambda(t)\, dt = \int_2^4 t\, dt + \int_4^5 4\, dt = 6 + 4 = 10.
\]
Hence, $P\{S_6 > 5 \mid S_5 = 2\} = P\{N(5) - N(2) = 0\} = e^{-10}$.
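A quick numerical check of this calculation (illustrative Python, not from the manual):

```python
import math

mean = (4**2 - 2**2) / 2 + 4 * (5 - 4)   # m(5) - m(2) = 6 + 4 = 10
print(mean, math.exp(-mean))             # 10, e^{-10} is about 4.54e-05
```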