The Poisson Process

Don Johnson. This work is produced by OpenStax-CNX (module m11255, Version 1.4, May 24, 2005) and licensed under the Creative Commons Attribution License 1.0 (http://creativecommons.org/licenses/by/1.0).

Some signals have no waveform. Consider the measurement of when lightning strikes occur within some region; the random process is the sequence of event times, which has no intrinsic waveform. Such processes are termed point processes, and have been shown (see Snyder [2]) to have a simple mathematical structure. Define some quantities first. Let $N_t$ be the number of events that have occurred up to time $t$ (observations are by convention assumed to start at $t = 0$). This quantity is termed the counting process, and has the shape of a staircase function: the counting function consists of a series of plateaus always equal to an integer, with jumps between plateaus occurring when events occur. The increment $N_{t_1,t_2} = N_{t_2} - N_{t_1}$ corresponds to the number of events in the interval $[t_1, t_2)$. Consequently, $N_t = N_{0,t}$. The event times comprise the random vector $W$; the dimension of this vector is $N_t$, the number of events that have occurred. The occurrence of events is governed by a quantity known as the intensity $\lambda(t; N_t; W)$ of the point process through the probability law

$$\Pr[N_{t,t+\Delta t} = 1 \mid N_t; W] = \lambda(t; N_t; W)\,\Delta t$$

for sufficiently small $\Delta t$. Note that this probability is a conditional probability; it can depend on how many events occurred previously and when they occurred. The intensity can also vary with time to describe non-stationary point processes. The intensity has units of events per unit time, and it can be viewed as the instantaneous rate at which events occur.

The simplest point process from a structural viewpoint, the Poisson process, has no dependence on process history. A stationary Poisson process results when the intensity equals a constant: $\lambda(t; N_t; W) = \lambda_0$. Thus, in a Poisson process, a coin is flipped every $\Delta t$ seconds, with a constant probability of heads (an event) occurring that equals $\lambda_0 \Delta t$ and is independent of the occurrence of past (and future) events. When this probability varies with time, the intensity equals $\lambda(t)$, a non-negative signal, and a nonstationary Poisson process results. (In the literature, stationary Poisson processes are sometimes termed homogeneous, nonstationary ones inhomogeneous.)

From the Poisson process's definition, we can derive the probability laws that govern event occurrence. These fall into two categories: the count statistics $\Pr[N_{t_1,t_2} = n]$, the probability of obtaining $n$ events in an interval $[t_1, t_2)$, and the time of occurrence statistics $p_{W^{(n)}}(w)$, the joint distribution of the first $n$ event times in the observation interval. These times form the vector $W^{(n)}$, the occurrence time vector of dimension $n$. From these two probability distributions, we can derive the sample function density.
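
The coin-flipping description above maps directly onto a simulation. The sketch below is an illustration added here, not part of the original module; the constant intensity, step size, and function name are arbitrary choices. It approximates the process on a fine grid: in each slot of width $\Delta t$ an event occurs with probability $\lambda(t)\,\Delta t$, independently of every other slot, and the running sum of these Bernoulli outcomes is the staircase counting process $N_t$.

```python
import numpy as np

def simulate_counting_process(intensity, t_end, dt=1e-3, rng=None):
    """Approximate a (possibly nonstationary) Poisson process on [0, t_end).

    In each slot of width dt an event occurs with probability
    intensity(t) * dt, independently of all other slots, mirroring the
    coin-flip description of the process.  Returns the grid times and
    the counting process N_t evaluated on that grid.
    """
    rng = np.random.default_rng() if rng is None else rng
    t = np.arange(0.0, t_end, dt)
    p_event = np.clip(intensity(t) * dt, 0.0, 1.0)   # Pr[one event in a slot]
    events = rng.random(t.size) < p_event            # the "coin flips"
    return t, np.cumsum(events)                      # staircase counting process

if __name__ == "__main__":
    lam0 = 5.0                                       # stationary intensity, events per unit time
    t, N = simulate_counting_process(lambda t: np.full_like(t, lam0), t_end=10.0)
    print("events observed:", N[-1], " expected about:", lam0 * 10.0)
```

Shrinking dt improves the approximation at the cost of more coin flips.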

1 Count Statistics

We derive a differential-difference equation that $\Pr[N_{t_1,t_2} = n]$, $t_1 < t_2$, must satisfy for event occurrence in an interval to be regular and independent of event occurrences in disjoint intervals. Let $t_1$ be fixed and consider event occurrence in the intervals $[t_1, t_2)$ and $[t_2, t_2 + \delta)$, and how these contribute to the occurrence of events in the union of the two intervals. If $k$ events occur in $[t_1, t_2)$, then $n - k$ must occur in $[t_2, t_2 + \delta)$. Furthermore, the scenarios for different values of $k$ are mutually exclusive. Consequently,

$$\Pr[N_{t_1,t_2+\delta} = n] = \sum_{k=0}^{n} \Pr[N_{t_1,t_2} = k,\ N_{t_2,t_2+\delta} = n-k]$$
$$= \Pr[N_{t_2,t_2+\delta} = 0 \mid N_{t_1,t_2} = n]\,\Pr[N_{t_1,t_2} = n] + \Pr[N_{t_2,t_2+\delta} = 1 \mid N_{t_1,t_2} = n-1]\,\Pr[N_{t_1,t_2} = n-1] + \sum_{k=2}^{n} \Pr[N_{t_2,t_2+\delta} = k \mid N_{t_1,t_2} = n-k]\,\Pr[N_{t_1,t_2} = n-k] \qquad (1)$$

Because of the independence of event occurrence in disjoint intervals, the conditional probabilities in this expression equal the unconditional ones. When $\delta$ is small, only the first two terms will be significant to first order in $\delta$. Rearranging and taking the obvious limit, we have the equation defining the count statistics:

$$\frac{d}{dt_2}\Pr[N_{t_1,t_2} = n] = -\lambda(t_2)\,\Pr[N_{t_1,t_2} = n] + \lambda(t_2)\,\Pr[N_{t_1,t_2} = n-1]$$

To solve this equation, we apply a z-transform to both sides. Defining the transform of $\Pr[N_{t_1,t_2} = n]$ to be $P(t_2, z)$ (remember, $t_1$ is fixed and can be suppressed notationally), we have

$$\frac{\partial P(t_2, z)}{\partial t_2} = -\lambda(t_2)\left(1 - z^{-1}\right)P(t_2, z)$$

Applying the boundary condition that $P(t_1, z) = 1$, this simple first-order differential equation has the solution

$$P(t_2, z) = e^{-\left(1 - z^{-1}\right)\int_{t_1}^{t_2}\lambda(\alpha)\,d\alpha}$$

To evaluate the inverse z-transform, we simply exploit the Taylor series expression for the exponential, and we find that a Poisson probability mass function governs the count statistics for a Poisson process.

$$\Pr[N_{t_1,t_2} = n] = \frac{\left(\int_{t_1}^{t_2}\lambda(\alpha)\,d\alpha\right)^n}{n!}\, e^{-\int_{t_1}^{t_2}\lambda(\alpha)\,d\alpha} \qquad (2)$$

The integral of the intensity occurs frequently, and we succinctly denote it by $\Lambda_{t_1}^{t_2} = \int_{t_1}^{t_2}\lambda(\alpha)\,d\alpha$. When the Poisson process is stationary, the intensity equals a constant, and the count statistics depend only on the difference $t_2 - t_1$.
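
Equation (2) lends itself to a quick numerical check. The sketch below is illustrative only and not from the original module; the ramp intensity, interval, and trial count are arbitrary choices. It simulates many realizations with the same fine-grid coin-flip approximation as before and compares the empirical distribution of $N_{t_1,t_2}$ with the Poisson probability mass function of parameter $\Lambda_{t_1}^{t_2}$.

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(0)
t1, t2, dt = 0.0, 4.0, 1e-3
intensity = lambda t: 1.0 + 0.5 * t                  # a nonstationary (ramp) intensity
grid = np.arange(t1, t2, dt)
Lambda = np.sum(intensity(grid)) * dt                # Lambda_{t1}^{t2}, integrated numerically

# Count the events falling in [t1, t2) for many independent realizations.
trials = 2000
counts = np.array([
    int(np.sum(rng.random(grid.size) < intensity(grid) * dt))
    for _ in range(trials)
])

# Empirical probability mass function versus the Poisson pmf of equation (2).
for n in range(4, 13):
    empirical = np.mean(counts == n)
    poisson = Lambda**n / factorial(n) * exp(-Lambda)
    print(f"n={n:2d}  empirical {empirical:.3f}   Poisson({Lambda:.1f}) {poisson:.3f}")
```

With 2000 trials the empirical frequencies fluctuate by roughly 0.01, so the agreement is approximate but clearly visible.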

2 Time of occurrence statistics

To derive the multivariate distribution of $W^{(n)}$, we use the count statistics and the independence properties of the Poisson process. The density we seek satisfies

$$\int_{w_1}^{w_1+\delta_1}\cdots\int_{w_n}^{w_n+\delta_n} p_{W^{(n)}}(v)\,dv = \Pr[W_1 \in [w_1, w_1+\delta_1), \ldots, W_n \in [w_n, w_n+\delta_n)]$$

The expression on the right equals the probability that no events occur in $[t_1, w_1)$, one event in $[w_1, w_1+\delta_1)$, no event in $[w_1+\delta_1, w_2)$, etc. Because of the independence of event occurrence in these disjoint intervals, we can multiply together the probabilities of these event occurrences, each of which is given by the count statistics:

$$\Pr[W_1 \in [w_1, w_1+\delta_1), \ldots, W_n \in [w_n, w_n+\delta_n)] = e^{-\Lambda_{t_1}^{w_1}}\,\lambda(w_1)\delta_1 e^{-\Lambda_{w_1}^{w_1+\delta_1}}\, e^{-\Lambda_{w_1+\delta_1}^{w_2}}\,\lambda(w_2)\delta_2 e^{-\Lambda_{w_2}^{w_2+\delta_2}}\cdots\lambda(w_n)\delta_n e^{-\Lambda_{w_n}^{w_n+\delta_n}}$$

for small $\delta_k$. From this approximation, we find that the joint distribution of the first $n$ event times equals

$$p_{W^{(n)}}(w) = \begin{cases}\displaystyle\prod_{k=1}^{n}\lambda(w_k)\; e^{-\int_{t_1}^{w_n}\lambda(\alpha)\,d\alpha} & \text{if } w_1 \le w_2 \le \cdots \le w_n \\ 0 & \text{otherwise}\end{cases} \qquad (3)$$

3 Sample function density

For Poisson processes, the sample function density describes the joint distribution of counts and event times within a specified time interval. Thus, it can be written as

$$\Pr[N_t,\ t_1 \le t < t_2] = \Pr[N_{t_1,t_2} = n \mid W_1 = w_1, \ldots, W_n = w_n]\; p_{W^{(n)}}(w)$$

The second term in the product equals the distribution derived previously for the time of occurrence statistics. The conditional probability equals the probability that no events occur between $w_n$ and $t_2$; from the Poisson process's count statistics, this probability equals $e^{-\Lambda_{w_n}^{t_2}}$. Consequently, the sample function density for the Poisson process, be it stationary or not, equals

$$\Pr[N_t,\ t_1 \le t < t_2] = \prod_{k=1}^{n}\lambda(w_k)\; e^{-\int_{t_1}^{t_2}\lambda(\alpha)\,d\alpha} \qquad (4)$$
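
Equation (4) is the quantity one evaluates, usually in logarithmic form, to score an observed event record against a candidate intensity. The sketch below is added for illustration and is not part of the original module; the event times, candidate intensities, and function name are hypothetical. It computes $\sum_k \log\lambda(w_k) - \int_{t_1}^{t_2}\lambda(\alpha)\,d\alpha$, approximating the integral with a Riemann sum.

```python
import numpy as np

def log_sample_function_density(event_times, intensity, t1, t2, grid_points=10000):
    """Logarithm of the Poisson sample function density, equation (4):
    the sum of log(lambda(w_k)) minus the integral of lambda over [t1, t2).
    The integral is approximated by a Riemann sum on a uniform grid."""
    w = np.asarray(event_times, dtype=float)
    grid = np.linspace(t1, t2, grid_points, endpoint=False)
    integral = np.sum(intensity(grid)) * (t2 - t1) / grid_points
    return float(np.sum(np.log(intensity(w))) - integral)

if __name__ == "__main__":
    w = [0.4, 1.1, 1.7, 2.9, 3.3]                    # hypothetical event times in [0, 4)
    flat = lambda t: np.full_like(np.asarray(t, dtype=float), 1.25)
    ramp = lambda t: 0.5 + 0.4 * np.asarray(t, dtype=float)
    print("flat intensity:", log_sample_function_density(w, flat, 0.0, 4.0))
    print("ramp intensity:", log_sample_function_density(w, ramp, 0.0, 4.0))
```

Comparing such log densities across candidate intensities is the basis of likelihood methods for point-process data.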

4 Properties

From the probability distributions derived on the previous pages, we can discern many structural properties of the Poisson process. These properties set the stage for delineating other point processes from the Poisson. They, as described subsequently, have much more structure and are much more difficult to handle analytically.

4.1 The Counting Process

The counting process $N_t$ is an independent increment process. For a Poisson process, the numbers of events in disjoint intervals are statistically independent of each other, meaning that we have an independent increment process. When the Poisson process is stationary, increments taken over equi-duration intervals are identically distributed as well as being statistically independent. Two important results obtain from this property. First, the counting process's covariance function $K_N(t, u)$ equals $\sigma^2 \min\{t, u\}$. This close relation to the Wiener waveform process indicates the fundamental nature of the Poisson process in the world of point processes. Note, however, that the Poisson counting process is not continuous almost surely. Second, the sequence of counts forms an ergodic process, meaning we can estimate the intensity parameter from observations.

The mean and variance of the number of events in an interval can be easily calculated from the Poisson distribution. Alternatively, we can calculate the characteristic function and evaluate its derivatives. The characteristic function of an increment equals

$$\Phi_{N_{t_1,t_2}}(v) = e^{\left(e^{iv} - 1\right)\Lambda_{t_1}^{t_2}}$$

The first two moments and variance of an increment of the Poisson process, be it stationary or not, equal

$$E[N_{t_1,t_2}] = \Lambda_{t_1}^{t_2} \qquad (5)$$
$$E\!\left[N_{t_1,t_2}^2\right] = \Lambda_{t_1}^{t_2} + \left(\Lambda_{t_1}^{t_2}\right)^2$$
$$\sigma^2(N_{t_1,t_2}) = \Lambda_{t_1}^{t_2}$$

Note that the mean equals the variance here, a trademark of the Poisson process.

4.2 Poisson process event times form a Markov process

Consider the conditional density $p_{W_n \mid W_{n-1},\ldots,W_1}(w_n \mid w_{n-1}, \ldots, w_1)$. This density equals the ratio of the event time densities for the $n$- and $(n-1)$-dimensional event time vectors. Simple substitution yields

$$\forall w_n,\ w_n \ge w_{n-1}:\quad p_{W_n \mid W_{n-1},\ldots,W_1}(w_n \mid w_{n-1}, \ldots, w_1) = \lambda(w_n)\, e^{-\int_{w_{n-1}}^{w_n}\lambda(\alpha)\,d\alpha} \qquad (6)$$

Thus the $n$th event time depends only on when the $(n-1)$th event occurs, meaning that we have a Markov process. Note that event times are ordered: the $n$th event must occur after the $(n-1)$th, etc. Thus, the values of this Markov process keep increasing, meaning that from this viewpoint the event times form a nonstationary Markovian sequence. When the process is stationary, the evolutionary density is exponential. It is this special form of event occurrence time density that defines a Poisson process.

4.3 Interevent intervals in a Poisson process form a white sequence

Exploiting the previous property, the duration of the $n$th interval $\tau_n = w_n - w_{n-1}$ does not depend on the lengths of previous (or future) intervals. Consequently, the sequence of interevent intervals forms a "white" sequence. The sequence may not be identically distributed unless the process is stationary. In the stationary case, interevent intervals are truly white (they form an IID sequence) and have an exponential distribution:

$$\forall \tau_n,\ \tau_n \ge 0:\quad p_{\tau_n}(\tau_n) = \lambda_0\, e^{-\lambda_0 \tau_n} \qquad (7)$$

To show that the exponential density for a white sequence corresponds to the most "random" distribution, Parzen [1] proved that the ordered times of $n$ events sprinkled independently and uniformly over a given interval form a stationary Poisson process. If the density of event sprinkling is not uniform, the resulting ordered times constitute a nonstationary Poisson process with an intensity proportional to the sprinkling density.
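
Equation (7) also gives the standard recipe for simulating a stationary Poisson process exactly: draw IID exponential interevent intervals and accumulate them into event times. The sketch below is an added illustration, not part of the original module; the rate and time horizon are arbitrary choices. It spot-checks two of the properties above, namely that the empirical event rate matches $\lambda_0$ and that successive intervals are essentially uncorrelated, as a white sequence should be.

```python
import numpy as np

rng = np.random.default_rng(1)
lam0, t_end = 3.0, 2000.0

# Interevent intervals are IID exponential with density lam0 * exp(-lam0 * tau),
# equation (7); the event times are their cumulative sums.
n_max = int(2 * lam0 * t_end)                 # generous upper bound on the event count
tau = rng.exponential(scale=1.0 / lam0, size=n_max)
w = np.cumsum(tau)
w = w[w < t_end]                              # keep the event times in [0, t_end)

intervals = np.diff(w)
print("empirical rate      :", w.size / t_end, " (lam0 =", lam0, ")")
print("mean interval       :", intervals.mean(), " (1/lam0 =", 1.0 / lam0, ")")
corr = np.corrcoef(intervals[:-1], intervals[1:])[0, 1]
print("lag-1 interval corr :", corr, " (near 0 for a white sequence)")
```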

5 Doubly stochastic Poisson processes

Here, the intensity $\lambda(t)$ equals a sample function drawn from some waveform process. (In waveform processes, the analogous concept does not have nearly the impact it does here.) Because intensity waveforms must be non-negative, the intensity process must be nonzero mean and non-Gaussian. We shall assume throughout that the intensity process is stationary for simplicity. This model arises in those situations in which the event occurrence rate clearly varies unpredictably with time. Such processes have the property that the variance-to-mean ratio of the number of events in any interval exceeds one. In the process of deriving this last property, we illustrate the typical way of analyzing doubly stochastic processes: condition on the intensity equaling a particular sample function, use the statistical characteristics of nonstationary Poisson processes, then "average" with respect to the intensity process.

To calculate the expected number $N_{t_1,t_2}$ of events in an interval, we use conditional expected values:

$$E[N_{t_1,t_2}] = E\big[E[N_{t_1,t_2} \mid \lambda(t),\ t_1 \le t < t_2]\big] = E\left[\int_{t_1}^{t_2}\lambda(\alpha)\,d\alpha\right] = (t_2 - t_1)\,E[\lambda(t)]$$

This result can also be written as the expected value of the integrated intensity: $E[N_{t_1,t_2}] = E[\Lambda_{t_1}^{t_2}]$. Similar calculations yield the increment's second moment and variance.

$$E\!\left[N_{t_1,t_2}^2\right] = E[\Lambda_{t_1}^{t_2}] + E\!\left[\left(\Lambda_{t_1}^{t_2}\right)^2\right] \qquad (8)$$
$$\sigma^2(N_{t_1,t_2}) = E[\Lambda_{t_1}^{t_2}] + \sigma^2(\Lambda_{t_1}^{t_2})$$

Using the last result, we find that the variance-to-mean ratio in a doubly stochastic process always exceeds unity, equaling one plus the variance-to-mean ratio of the intensity process.

The approach of sample-function conditioning can also be used to derive the density of the number of events occurring in an interval for a doubly stochastic Poisson process. Conditioned on the occurrence of a sample function, the probability of $n$ events occurring in the interval $[t_1, t_2)$ equals (cf. equation (2))

$$\Pr[N_{t_1,t_2} = n \mid \lambda(t),\ t_1 \le t < t_2] = \frac{\left(\Lambda_{t_1}^{t_2}\right)^n}{n!}\, e^{-\Lambda_{t_1}^{t_2}}$$

Because $\Lambda_{t_1}^{t_2}$ is a random variable, the unconditional distribution equals this conditional probability averaged with respect to this random variable's density. This average is known as the Poisson Transform of the random variable's density.

$$\Pr[N_{t_1,t_2} = n] = \int_{0}^{\infty}\frac{\alpha^n}{n!}\, e^{-\alpha}\, p_{\Lambda_{t_1}^{t_2}}(\alpha)\,d\alpha \qquad (9)$$

References

[1] E. Parzen. Stochastic Processes. Holden-Day, San Francisco, 1962.

[2] D. L. Snyder. Random Point Processes. Wiley, New York, 1975.