Lecture 15. Theory of random processes, Part III: Poisson random processes. Harrison H. Barrett, University of Arizona


OUTLINE

- Poisson and independence
- Poisson and rarity; binomial selection
- Poisson point processes: definition and basic properties
- Mean and covariance (stationary and nonstationary)
- Doubly stochastic Poisson variables and processes
- Poisson and non-Poisson light sources
- Karhunen-Loève for the Poisson case
- Characteristic functionals

Poisson and independence

The postulates that lead to the Poisson law are:

(a) The number of counts in any time interval is statistically independent of the number in any other nonoverlapping interval.

(b) If we consider a very small time interval of duration \Delta T, the probability of a single count in this interval approaches some constant a times \Delta T:

Pr(1 in \Delta T) = a \, \Delta T.  (11.1)

(Note that a has dimensions of reciprocal time; it will soon be interpreted as a rate.)

(c) The probability of more than one count in a small interval \Delta T is zero:

Pr(1 in \Delta T) + Pr(0 in \Delta T) = 1.  (11.2)

THE POISSON PROBABILITY LAW

If the postulates are satisfied with a = const., then (derivation in B&M)

Pr(N in T) = \frac{(aT)^N}{N!} \exp(-aT).  (11.13)

Mean = rate × time:  \bar{N} = aT
Variance = mean:  Var(N) = \langle (N - \bar{N})^2 \rangle = aT = \bar{N}
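
The mean-equals-variance property can be checked numerically; a minimal stdlib-only sketch, with illustrative parameters (rate a = 3 counts/s, window T = 2 s) not taken from the lecture:

```python
import math
import random
import statistics

# Illustrative parameters: rate a = 3.0 counts/s, counting window T = 2.0 s,
# so the Poisson mean is a*T = 6 counts.
a, T = 3.0, 2.0
mean_N = a * T

def poisson_sample(mu, rng):
    """Draw one Poisson variate by Knuth's product-of-uniforms method."""
    L = math.exp(-mu)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(42)
samples = [poisson_sample(mean_N, rng) for _ in range(200_000)]
m = statistics.fmean(samples)
v = statistics.pvariance(samples)
print(round(m, 2), round(v, 2))  # sample mean and variance should both be near 6
```

With 200,000 draws the sampling error is well below 0.1, so both printed numbers sit close to the theoretical value aT = 6.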

A generalization

Assume the postulates are satisfied but the rate is a nonrandom function of time, a(t). Example: a(t) could be a short pulse, provided the pulse shape and amplitude are the same on each repetition. Then

Pr[N in (t, t+T)] = \frac{\bar{N}^N}{N!} \exp(-\bar{N}),  (11.18)

where now \bar{N} is a function of time given by

\bar{N} = \int_t^{t+T} dt' \, a(t').  (11.19)

Variance still equals mean:

Var(N) = \langle (N - \bar{N})^2 \rangle = \int_t^{t+T} dt' \, a(t') = \bar{N}

Punchline: Variation in the rate does not spoil independence or Poissonicity, provided the rate is not random.

Poisson and rarity

Consider a set of M Bernoulli trials (coin flips, say), where the probability of success (heads) is p. The total number N of successes in M trials is given by the binomial law:

Pr(N | M, p) = \binom{M}{N} p^N q^{M-N},  q \equiv 1 - p

Mean number of successes: \bar{N} = Mp.

Now let M become very large and p become very small in such a way that Mp (or \bar{N}) remains constant. Then

\lim_{M \to \infty} Pr(N | M, p) = \frac{\bar{N}^N}{N!} \exp(-\bar{N}).  (11.21)

Rarity begets Poissonicity.
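
The limit (11.21) is easy to verify numerically; the sketch below fixes \bar{N} = Mp = 4 (an arbitrary illustrative value) and watches the worst-case difference between the binomial and Poisson pmfs shrink as M grows:

```python
from math import comb, exp, factorial

# Hold Nbar = M*p = 4 fixed (arbitrary choice) and let M grow; the binomial
# pmf should approach the Poisson pmf, per eq. (11.21).
Nbar = 4.0

def binom_pmf(N, M, p):
    return comb(M, N) * p**N * (1 - p)**(M - N)

def poisson_pmf(N, mu):
    return mu**N * exp(-mu) / factorial(N)

errs = []
for M in (10, 100, 10_000):
    p = Nbar / M
    # Worst-case pmf discrepancy over the first few counts
    err = max(abs(binom_pmf(N, M, p) - poisson_pmf(N, Nbar)) for N in range(11))
    errs.append(err)
    print(M, round(err, 6))
```

The discrepancy drops by roughly a factor of M at each step, from a few percent at M = 10 to below 10^-3 at M = 10,000.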

Binomial selection of a Poisson

In many applications of the binomial law, the number of trials M is random. Important example: radiation detection by an inefficient photon-counting detector. Suppose M photons are incident on the detector in time T and that each has a probability \eta (called the quantum efficiency) of producing a photoelectron. If the photoelectric interactions are statistically independent, then the conditional probability Pr(N | M) for getting N photoelectrons is a binomial. The marginal probability Pr(N), however, is given by

Pr(N) = \sum_{M=N}^{\infty} Pr(N | M) \, Pr(M).  (11.22)

Binomial selection, cont.

Now suppose that the photons satisfy the three postulates, so that M is a Poisson random variable. With Pr(N | M) binomial, we have

Pr(N) = \sum_{M=N}^{\infty} Pr(N | M) \, Pr(M) = \sum_{M=N}^{\infty} \binom{M}{N} \eta^N (1 - \eta)^{M-N} \, \frac{\bar{M}^M \exp(-\bar{M})}{M!}.  (11.23)

A change of variables, K = M - N, and a little algebra shows that

Pr(N) = \exp(-\eta \bar{M}) \, \frac{(\eta \bar{M})^N}{N!}.  (11.24)

Thus N obeys a Poisson law with mean \bar{N} = \eta \bar{M}.

Binomial selection theorem

Binomial selection of a Poisson yields a Poisson, and the mean of the output of the selection process is the mean of the input times the binomial probability of success.

Interesting corollary: The number of times the photon is not detected is also a Poisson random variable. Moreover, the number of nondetections is statistically independent of the number of detections. In a sequence of Bernoulli trials where the number of attempts is random, the number of successes is independent of the number of failures if and only if the number of attempts is a Poisson random variable.
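
Both the theorem and its corollary can be illustrated by a short Monte Carlo sketch; the incident mean \bar{M} = 10 and quantum efficiency \eta = 0.3 below are illustrative choices, not values from the lecture:

```python
import math
import random
import statistics

# Illustrative numbers: incident photon mean Mbar = 10, quantum efficiency 0.3.
rng = random.Random(1)
Mbar, eta = 10.0, 0.3

def poisson_sample(mu, rng):
    """Knuth's product-of-uniforms Poisson sampler."""
    L = math.exp(-mu)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

det, miss = [], []
for _ in range(100_000):
    M = poisson_sample(Mbar, rng)
    N = sum(rng.random() < eta for _ in range(M))  # binomial selection of M
    det.append(N)
    miss.append(M - N)

mean_det = statistics.fmean(det)
var_det = statistics.pvariance(det)
cov = statistics.fmean(d * m for d, m in zip(det, miss)) \
      - mean_det * statistics.fmean(miss)
# Detections should be Poisson with mean eta*Mbar = 3 (so variance = mean),
# and uncorrelated with the nondetections.
print(round(mean_det, 2), round(var_det, 2), round(cov, 2))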

Cascaded binomial selection

Suppose a light source emits photons randomly in all directions. Some of the photons fall on the photocathode of a photomultiplier tube, and some of these produce photoelectrons. Assume:

- Photon emissions are independent, mean \bar{M}
- Photon directions are independent, isotropic
- Fraction \Omega / 4\pi of photons fall on the PMT
- Photoelectrons are produced independently
- Fraction \eta of photons produce photoelectrons

Cascaded binomial selection, cont.

Since binomial selection of a Poisson yields a Poisson, and a Poisson is fully characterized by its mean, the probability law on the number of photoelectrons is

Pr(N) = \frac{\bar{N}^N}{N!} \exp(-\bar{N}),

where the mean number of photoelectrons is given by

\bar{N} = \eta \, \frac{\Omega}{4\pi} \, \bar{M}.

We can continue this process and include other binomial selections (absorption, reflection, etc.).

Key point: A Poisson source followed by any number of binomial selections gives a Poisson.

Doubly stochastic Poisson random variables

Consider Poisson random variables where the Poisson mean is random. Example: a fluctuating optical source. Though N is a discrete random variable with only integer values, its mean \bar{N} can take on any value in (0, \infty). Thus a random \bar{N} must be described by a probability density function pr(\bar{N}). The probability of N photoelectrons is now given by

Pr(N) = \int_0^{\infty} d\bar{N} \, Pr(N | \bar{N}) \, pr(\bar{N}) = \frac{1}{N!} \int_0^{\infty} d\bar{N} \, \bar{N}^N \exp(-\bar{N}) \, pr(\bar{N}).  (11.25)

This expression is called the Poisson transform of pr(\bar{N}).

Example of interest in optics: the Poisson transform of an exponential is a Bose-Einstein distribution.
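
The exponential-to-Bose-Einstein example can be checked by direct quadrature of (11.25); a stdlib-only sketch with an arbitrary illustrative mean N0 = 2.5:

```python
from math import exp, factorial

# Poisson transform of the exponential density pr(Nbar) = (1/N0) exp(-Nbar/N0)
# should reproduce the Bose-Einstein (geometric) law
# Pr(N) = N0**N / (1 + N0)**(N + 1). N0 = 2.5 is an arbitrary choice.
N0 = 2.5

def poisson_transform(N, pr, upper=100.0, steps=100_000):
    # Midpoint-rule quadrature of (1/N!) * integral Nbar^N e^-Nbar pr(Nbar) dNbar
    h = upper / steps
    total = 0.0
    for i in range(steps):
        nb = (i + 0.5) * h
        total += nb**N * exp(-nb) * pr(nb)
    return total * h / factorial(N)

pr_exp = lambda nb: exp(-nb / N0) / N0
results = []
for N in range(5):
    numeric = poisson_transform(N, pr_exp)
    exact = N0**N / (1 + N0)**(N + 1)
    results.append((numeric, exact))
    print(N, round(numeric, 6), round(exact, 6))
```

The quadrature agrees with the closed-form geometric law to better than 10^-4 for each N, confirming the stated transform pair.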

[Figure: illustration of a doubly stochastic process]

Doubly stochastic mean and variance

The mean number of photoelectrons is given by

\bar{\bar{N}} = \sum_{N=0}^{\infty} N \, Pr(N) = \int_0^{\infty} d\bar{N} \, pr(\bar{N}) \sum_{N=0}^{\infty} \frac{N}{N!} \bar{N}^N \exp(-\bar{N}) = \int_0^{\infty} d\bar{N} \, \bar{N} \, pr(\bar{N}) = \langle \bar{N} \rangle.  (11.28)

The double overbar indicates two separate averages: one with respect to Pr(N | \bar{N}), another with respect to pr(\bar{N}).

The variance is computed similarly (see B&M):

Var(N) = \langle (N - \bar{\bar{N}})^2 \rangle = \bar{\bar{N}} + Var(\bar{N}).  (11.32)

The first term is the Poisson variance appropriate to the average value of the mean. The second term, often called the excess variance, is the result of randomness in the Poisson mean.
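
Equation (11.32) can be verified by simulation; the uniform density for \bar{N} below is an arbitrary illustrative choice:

```python
import math
import random
import statistics

# Monte Carlo check of Var(N) = <Nbar> + Var(Nbar), eq. (11.32), using an
# arbitrary illustrative density: Nbar uniform on [2, 8].
rng = random.Random(7)

def poisson_sample(mu, rng):
    """Knuth's product-of-uniforms Poisson sampler."""
    L = math.exp(-mu)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

samples = []
for _ in range(100_000):
    nbar = rng.uniform(2.0, 8.0)  # draw the random Poisson mean first
    samples.append(poisson_sample(nbar, rng))

m2 = statistics.fmean(samples)
v2 = statistics.pvariance(samples)
# Theory: <Nbar> = 5 and Var(Nbar) = (8 - 2)**2 / 12 = 3, so Var(N) = 5 + 3 = 8.
print(round(m2, 2), round(v2, 2))
```

The sample variance lands near 8, visibly exceeding the sample mean of 5: the excess variance Var(\bar{N}) = 3 is exactly the super-Poisson part.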

Binomial selection from a non-poisson source Consider binomial selection (probability η) of a doubly stochastic Poisson random variable (mean M). Mean: N = η M Variance: Var(N) = η M + η 2 Var(M). (11.39) Note that Poisson part of the variance scales as η while the excess variance scales as η 2. Key point: Small efficiency reduces the excess variance relative to the Poisson part. Again, rarity begets Poissonicity 15

So when is a light source Poisson?

- When its intensity is constant
- When its intensity fluctuates but a small fraction of the light is used
- When its intensity fluctuates rapidly (temporal incoherence)
- When it is perfectly monochromatic (spatially and temporally coherent)
- When it is in a quantum-mechanical coherent state

And when is a light source not Poisson?

- When its intensity fluctuates slowly
- When it is one mode of a thermal radiation field
- When it is quasithermal
- When one considers an ensemble of sources

Multivariate Poisson statistics: quick reminder

Consider an array of photon-counting detectors, with g_j counts in the j-th detector. The data thus consist of the set {g_j, j = 1, 2, ..., J}. If the individual counts g_j are independent Poisson random variables, then the multivariate probability law factors:

Pr({g_j}) = \prod_{j=1}^{J} Pr(g_j) = \prod_{j=1}^{J} \exp(-\bar{g}_j) \, \frac{\bar{g}_j^{g_j}}{g_j!}.  (11.40)

Covariance matrix

Independent counts are necessarily uncorrelated, so

K_{jk} = \langle [g_j - \bar{g}_j][g_k - \bar{g}_k] \rangle = \bar{g}_j \, \delta_{jk}.  (11.41)

Alternate notation: the data vector g has components {g_1, g_2, ..., g_J}, and the mean vector \bar{g} has components {\bar{g}_1, \bar{g}_2, ..., \bar{g}_J}. Then

K_g = \langle [g - \bar{g}][g - \bar{g}]^t \rangle = diag(\bar{g})

Temporal point processes

The output current of a photon-counting detector is

i(t) = \sum_{n=1}^{N} i_0(t - t_n),  (11.62)

where i_0(t) is the current pulse produced by a count at t = 0, t_n is the time of occurrence of the n-th count (0 < t_n \le T), and N is the total number of counts in (0, T). This current waveform can be written as

i(t) = z(t) * i_0(t),  (11.63)

where * denotes convolution and z(t) is a random point process defined by

z(t) = \sum_{n=1}^{N} \delta(t - t_n), \quad (0 < t \le T).  (11.64)

Objective: understand the statistical properties of z(t).

The stationary Poisson case

If the Poisson postulates hold and the rate a = constant, then

Pr(N in T) = \frac{(aT)^N}{N!} \exp(-aT)  (11.13)
pr(t_n) = a / \bar{N} = 1/T
\langle z(t) \rangle = a
K_z(t, t') = \langle [z(t) - a][z(t') - a] \rangle = a \, \delta(t - t')
Var{z(t)} = \infty
S_z(\nu) = a

Key point: All statistical properties of z(t) are determined by the rate a.

The nonstationary Poisson case

If the Poisson postulates hold and the rate a(t) is a nonrandom function of time, then

Pr(N in T) = \frac{\left[ \int_0^T dt' \, a(t') \right]^N}{N!} \exp\left[ -\int_0^T dt' \, a(t') \right]
pr(t_n) = \frac{a(t_n)}{\int_0^T dt \, a(t)},  (11.70)
\langle z(t) \rangle = a(t)
K_z(t, t') = a(t) \, \delta(t - t')
Var{z(t)} = \infty
Power spectral density: not defined

Key point: All statistical properties of z(t) are determined by the rate function a(t).
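
A nonstationary Poisson process can be simulated by thinning a stationary one (the Lewis-Shedler method, which is a direct use of binomial selection); the sketch below uses an illustrative rate a(t) = 50(1 + sin 2\pi t) on (0, 1), not a rate from the lecture:

```python
import math
import random
import statistics

# Lewis-Shedler thinning for a nonstationary Poisson process with an
# illustrative rate a(t) = 50*(1 + sin(2*pi*t)) on the interval (0, 1).
rng = random.Random(3)
a = lambda t: 50.0 * (1.0 + math.sin(2.0 * math.pi * t))
a_max, T = 100.0, 1.0  # a_max bounds a(t) from above on (0, T)

def sample_times(rng):
    # Generate candidate events at the constant rate a_max, then keep a
    # candidate at time t with probability a(t)/a_max (binomial selection).
    times, t = [], 0.0
    while True:
        t += rng.expovariate(a_max)
        if t > T:
            return times
        if rng.random() < a(t) / a_max:
            times.append(t)

counts = [len(sample_times(rng)) for _ in range(20_000)]
mean_N = statistics.fmean(counts)
var_N = statistics.pvariance(counts)
# Theory: mean = integral of a(t) over (0, 1) = 50, and variance = mean.
print(round(mean_N, 1), round(var_N, 1))
```

Because the rate function is nonrandom, the total count is still Poisson: the simulated mean and variance both come out near \int_0^1 a(t) dt = 50.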

Spectral description of a temporal Poisson process?

For stationary random processes, we use the Fourier domain since the autocorrelation operator is diagonal there. Recall from Lecture 13:

\langle F(\nu) F^*(\nu') \rangle = S(\nu) \, \delta(\nu - \nu').

Karhunen-Loève → Fourier domain.

For a nonstationary Poisson process, however, the autocorrelation operator is diagonal in the time domain. From the last slide:

K_z(t, t') = a(t) \, \delta(t - t')

Fourier transformation would just undiagonalize it!

Karhunen-Loève → time domain.

Spatial point processes

The spatial counterpart of the temporal point process z(t) is g(r), defined by

g(r) = \sum_{n=1}^{N} \delta(r - r_n),  (11.72)

where r is a spatial position vector. For example, g(r) could describe the pattern of photon interactions on a piece of film or the (unbinned) output of an ideal photon-counting detector.

Spatial Poisson postulates and their consequences

(a) The number of counts in any area A_1 is statistically independent of the number in any other nonoverlapping area A_2.

(b) If we consider a very small area \Delta A contained in A and centered on point r, the probability of a single count in this area during an observation time approaches a deterministic function b(r) times \Delta A:

Pr(1 in \Delta A) = b(r) \, \Delta A,  (11.73)

(c) The probability of more than one count in a small area \Delta A is zero:

Pr(1 in \Delta A) + Pr(0 in \Delta A) = 1.  (11.74)

Note that we have allowed Pr(1 in \Delta A) to be a function of position, but b(r) in (11.73) is a fixed function and not yet a random process.

[Figure: illustration of fluence and random process]

Properties of spatial Poisson processes

pr(r_n) = \frac{b(r_n)}{\int_A d^2r \, b(r)};  (11.76)
Pr(N) = \frac{\bar{N}^N \exp(-\bar{N})}{N!};  (11.77)
\bar{N} = \int_A d^2r \, b(r).  (11.78)
\langle g(r) \rangle = b(r)
K_g(r, r') = b(r) \, \delta(r - r').  (11.94)

Key point: All properties are determined by b(r), the mean number of counts per unit area, or mean photon fluence.
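
These properties suggest a simple recipe for sampling a spatial Poisson process: draw N from (11.77), then place the points independently with density (11.76). A sketch with an illustrative constant fluence b = 40 on the unit square, checking postulate (a) on two halves of the square:

```python
import math
import random
import statistics

# Sample a spatial Poisson process with constant fluence b = 40 counts per
# unit area on the unit square (illustrative value), so Nbar = 40. With a
# constant fluence, pr(r) = b/Nbar is uniform.
rng = random.Random(5)
Nbar = 40.0

def poisson_sample(mu, rng):
    """Knuth's product-of-uniforms Poisson sampler."""
    L = math.exp(-mu)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

left, right = [], []
for _ in range(20_000):
    N = poisson_sample(Nbar, rng)
    xs = [rng.random() for _ in range(N)]  # x-coordinates; y is irrelevant here
    nl = sum(x < 0.5 for x in xs)
    left.append(nl)
    right.append(N - nl)

mean_l, mean_r = statistics.fmean(left), statistics.fmean(right)
cov_lr = statistics.fmean(l * r for l, r in zip(left, right)) - mean_l * mean_r
# Counts in the two nonoverlapping halves should each average b * (area) = 20
# and be uncorrelated, per postulate (a).
print(round(mean_l, 1), round(mean_r, 1), round(cov_lr, 2))
```

Both half-square counts average near 20, and their covariance is consistent with zero, as the independence postulate requires for nonoverlapping areas.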

Stochastic Wigner distribution function

For a nonstationary spatial Poisson random process, the power spectral density is not defined, and it is not needed, since the autocorrelation operator is diagonal in the space domain. If we really must have a Fourier description, however, we can use the stochastic WDF. Recall the definition from Lecture 13:

W_g(r_0, \rho) = \int d^q r \, g(r_0 + \tfrac{1}{2} r) \, g^*(r_0 - \tfrac{1}{2} r) \exp(-2\pi i \rho \cdot r).  (8.140)

For a spatial Poisson process, we find

\bar{W}_g(r_0, \rho) = b(r_0).

In this sense, a Poisson process is white noise.

Characteristic functional

The PDF of a Poisson process is not defined, and the variance is infinite, but we can describe all statistical properties by a characteristic functional. Recall the definition from Lecture 13:

\Psi_f(s) = \langle \exp[-2\pi i (s, f)] \rangle,  (8.94)

where (s, f) is the usual L^2 scalar product. For a Poisson random process:

\Psi_g(s) = \left\langle \exp\left[ -2\pi i \int_A d^2r \, s(r) \sum_{n=1}^{N} \delta(r - r_n) \right] \right\rangle = \left\langle \exp\left[ -2\pi i \sum_{n=1}^{N} s(r_n) \right] \right\rangle

The expectation is performed in two steps: first over the set of random variables {r_n} for fixed N, then over N. The result is

\Psi_g(s) = \exp\left\{ -\bar{N} + \int_A d^2r_n \, b(r_n) \, e^{-2\pi i \, s(r_n)} \right\},  (11.150)

where b(r) is the usual photon fluence (which must be nonrandom for this expression to hold).
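
Formula (11.150) can be sanity-checked by Monte Carlo, averaging the phase factor over simulated realizations; the constant fluence b = 5 on the unit square and the test function s(r) = c x with c = 0.3 below are assumed illustrative choices:

```python
import cmath
import math
import random

# Monte Carlo check of the Poisson characteristic functional for a constant
# fluence b = 5 on the unit square (so Nbar = 5) and the illustrative test
# function s(r) = c*x with c = 0.3.
rng = random.Random(2)
b = Nbar = 5.0
c = 0.3

def poisson_sample(mu, rng):
    """Knuth's product-of-uniforms Poisson sampler."""
    L = math.exp(-mu)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

trials = 100_000
acc = 0 + 0j
for _ in range(trials):
    N = poisson_sample(Nbar, rng)
    total_s = sum(c * rng.random() for _ in range(N))  # sum_n s(r_n), x_n uniform
    acc += cmath.exp(-2j * math.pi * total_s)
mc = acc / trials

# Closed form from (11.150): exp(-Nbar + b * integral_0^1 exp(-2*pi*i*c*x) dx)
integral = (1 - cmath.exp(-2j * math.pi * c)) / (2j * math.pi * c)
exact = cmath.exp(-Nbar + b * integral)
print(round(abs(mc - exact), 3))  # discrepancy should be at the MC noise level
```

The Monte Carlo average agrees with the closed form to within the expected 1/sqrt(trials) statistical error, for both the real and imaginary parts.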

Uses of the Poisson characteristic functional

- Compute statistics of filtered Poisson processes
- Develop realistic models of textured objects
- Describe the statistics of digital images with random objects
- Treat difficult (non-Gaussian) speckle problems
- Extend to the doubly stochastic case

Doubly stochastic spatial Poisson processes

Suppose the fluence is a random function. Important example: random objects. g(r) is Poisson for repeated imaging of one object... but not Poisson when we average over objects. The autocovariance is modified to

K_g(r, r') = \bar{b}(r) \, \delta(r - r') + K_b(r, r').  (11.116)

The first term represents the average Poisson random process; the second term is the autocovariance of the fluence b(r).

Summary

- Poisson random variables and processes arise from two different basic assumptions: independence and rarity
- Binomial selection of a Poisson yields a Poisson, and creates one if you didn't have it to start with
- Poisson random processes are sums of delta functions
- All statistics are contained in the rate or fluence
- Poisson random processes are inherently uncorrelated (unless you insist on using the Fourier domain!)
- Randomness of the rate or fluence spoils the independence and creates correlations