Poisson Variables
Robert DeSerio

Poisson processes

A Poisson process is one in which events are randomly distributed in time, space, or some other variable, with the numbers of events in non-overlapping intervals statistically independent. For example, naturally occurring gamma rays detected in a scintillation detector are randomly distributed in time, and chocolate chips in a cookie dough are randomly distributed in volume. For simplicity, we will limit our discussion to events randomly distributed in time.

A homogeneous Poisson process is one in which the long-term average event rate is constant. The average rate will be denoted Γ, and in any interval Δt the expected number of events is

    µ = Γ Δt    (1)

A nonhomogeneous Poisson process is one in which the average rate of events changes and so might be expressed as some function Γ(t) of time. The number of events expected in an interval from t_1 to t_2 would then be the integral

    µ = ∫_{t_1}^{t_2} Γ(t) dt    (2)

While the expected number of events µ for a given experiment need not be an integer, the number of events n actually observed must be. Moreover, due to the randomness of the events, n may be more or less than µ, with the probability for a given n depending only on µ and given by

    P(n) = e^{−µ} µ^n / n!    (3)

This is the Poisson distribution, a discrete probability distribution, with each P(n) giving the probability for that particular n to occur. Two Poisson distributions, for µ = 1.5 and µ = 10, are shown in Fig. 1. This figure shows the parent distributions. Real sample distributions would be expected to vary somewhat from the parent, getting closer to the parent as the sample size N increases.
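As a concrete illustration of Eq. 3 and of the parent-versus-sample distinction, the short Python sketch below (our own addition, not part of the original manual; it assumes numpy is available) tabulates the parent probabilities for µ = 1.5 and µ = 10 next to the frequencies observed in a simulated sample of N = 1000 counts. Rerunning with larger N shows the sample frequencies settling toward the parent values.

    import math
    import numpy as np

    def poisson_pmf(n, mu):
        # Parent probability P(n) = e^{-mu} mu^n / n!  (Eq. 3)
        return math.exp(-mu) * mu**n / math.factorial(n)

    rng = np.random.default_rng(0)
    N = 1000                                  # sample size
    for mu in (1.5, 10.0):
        sample = rng.poisson(mu, N)           # N simulated counts
        print(f"mu = {mu}")
        for n in range(16):
            parent = poisson_pmf(n, mu)
            observed = np.mean(sample == n)   # fraction of the sample equal to n
            print(f"  n = {n:2d}: parent = {parent:.4f}, sample = {observed:.4f}")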

[Figure 1: Poisson probabilities for means of µ = 1.5 and µ = 10. Each panel plots P(n) against n for one parent distribution.]

The exponential probability density function

A better way of describing Γ is as a probability per unit time that an event will occur. That is,

    dP = Γ dt    (4)

where dP is the differential probability that an event will occur in the infinitesimal time interval dt. Of course, some care must be taken when translating a rate to a probability per unit time. For example, if Γ = 10/s, it is obviously not true that the probability is 10 that an event will occur in any particular second. However, if that same rate is expressed as Γ = 0.01/ms, it is roughly true that the probability is 0.01 that an event will happen in any particular millisecond. Eq. 4 only becomes exact in the limit of infinitesimal dt.

Equation 4 also describes the decay process of an excited state of an atom, a nucleus, or a subatomic particle. In these cases, dP = Γ dt is the probability for the excited state to decay in the next time interval dt, and Γ is called the decay rate for the excited state rather than an event rate.
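Eq. 4 also suggests a direct way to simulate a homogeneous Poisson process: divide time into small steps dt and let an event occur in each step with probability Γ dt. The sketch below (our own illustration with arbitrary parameter choices; numpy assumed) counts events in many repeated 1 s intervals at Γ = 10/s and checks that the average count comes out near µ = ΓT.

    import numpy as np

    rng = np.random.default_rng(1)
    Gamma = 10.0             # event rate, 10 per second (as in the text)
    T = 1.0                  # counting interval, seconds
    dt = 1e-3                # time step; Eq. 4 is exact only as dt -> 0
    p = Gamma * dt           # per-step event probability, dP = Gamma dt

    trials = 10_000
    steps = int(T / dt)
    counts = (rng.random((trials, steps)) < p).sum(axis=1)
    print("average count:", counts.mean(), " expected mu = Gamma*T =", Gamma * T)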

Equation 4 can be shown to lead directly to the Poisson probability distribution. The first step is to see how it leads to the exponential probability density function (pdf) giving the probability dP_e(t) that the next Poisson event (or the decay of an excited state) will occur in the interval from t to t + dt. (Note that Eq. 4 is equivalent to dP_e(0) = Γ dt and must be the t = 0 limiting case of the general solution.)

If the probability of no event (or survival of the excited state) to a time t is denoted P(0; t), then the probability of no event (or survival) to t + dt would be the product of this probability with the probability of no event (or no decay) in the interval dt following t. Since the probability of an event (or decay) in this interval is Γ dt, the probability of no event (or no decay) in this interval is 1 − Γ dt, and thus

    P(0; t + dt) = P(0; t)(1 − Γ dt)    (5)

Rearranging and substituting (P(0; t + dt) − P(0; t))/dt = dP(0; t)/dt gives

    dP(0; t)/dt = −Γ P(0; t)    (6)

which has the general solution P(0; t) = A e^{−Γt}. Because we must start with no event (or no decay) at t = 0, P(0; 0) = 1 and so A = 1, giving

    P(0; t) = e^{−Γt}    (7)

Then, the differential probability dP_e(t) for the next event (or decay) to occur in the interval from t to t + dt is given by the probability of no event (or no decay) in the interval from 0 to t, followed by an event (or a decay) in the next interval dt. The former has probability P(0; t) = e^{−Γt} and the latter has probability Γ dt. Thus

    dP_e(t) = Γ e^{−Γt} dt    (8)

Equation 8 is a continuous probability density function (pdf). It is properly normalized, i.e., its integral over all times from 0 to ∞ is unity, as required. It also has the very reasonable property that the expectation value of the random variable t, the time to the next event (or to the decay), is given by

    ⟨t⟩ = ∫_0^∞ t Γ e^{−Γt} dt = 1/Γ    (9)

In the case of decay, the expectation value ⟨t⟩, henceforth denoted τ, is called the lifetime of the excited state. Thus, Γ and τ are equivalent ways to quantify the decay process. If the decay rate is 10/s, the lifetime is 0.1 s. Moreover, Eq. 8 is often expressed in terms of the lifetime rather than the decay rate:

    dP_e(t) = (1/τ) e^{−t/τ} dt    (10)

The probability for decay within a time τ is found by integrating Eq. 8 from 0 to τ and gives the value 1 − 1/e. Thus, for a large sample of excited states at t = 0, the fraction 1 − 1/e of them will have decayed by τ (a fraction 1/e survives). The time it would take for half the sample to decay is called the half-life and is easily shown to be τ ln 2.
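These results are easy to spot-check numerically. The sketch below (our own; numpy assumed) draws a large sample of decay times from the exponential pdf of Eq. 10 and verifies the lifetime, the 1 − 1/e decay fraction, and the half-life.

    import numpy as np

    rng = np.random.default_rng(2)
    Gamma = 10.0                          # decay rate, 1/s
    tau = 1.0 / Gamma                     # lifetime, Eq. 9
    t = rng.exponential(tau, 100_000)     # decay times drawn from Eq. 10

    print("mean decay time:", t.mean(), " expect tau =", tau)
    print("fraction decayed by tau:", np.mean(t < tau), " expect 1 - 1/e = 0.632...")
    print("fraction decayed by tau*ln(2):", np.mean(t < tau * np.log(2)), " expect 0.5")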

The Poisson probability distribution

There are several possible derivations of the Poisson probability distribution. It is often derived as a limiting case of the binomial probability distribution. The derivation to follow relies on Eq. 4 and begins with Eq. 7 for the probability P(0; t) that there will be no events in some finite interval t. Next, a recursion relation is derived for the probability, denoted P(n+1; t), for there to be n + 1 events in a time t, which will be based on the probability P(n; t) for one less event. For there to be n + 1 events in t, three independent things must happen in the following order (their probabilities given in parentheses):

- There must be n events up to some point t′ in the interval from 0 to t (P(n; t′) by definition).
- An event must occur in the infinitesimal interval from t′ to t′ + dt′ (Γ dt′ by Eq. 4).
- There must be no events in the interval from t′ to t (P(0; t − t′) by definition).

The probability of n + 1 events in the interval from 0 to t is the product of the three probabilities above, integrated over all t′ from 0 to t to take into account that the last event may occur at any time in the interval. That is,

    P(n+1; t) = ∫_0^t P(n; t′) Γ P(0; t − t′) dt′    (11)

From Eq. 7 we already have P(0; t − t′) = e^{−Γ(t−t′)}, and thus Eq. 11 becomes

    P(n+1; t) = Γ ∫_0^t P(n; t′) e^{−Γ(t−t′)} dt′    (12)

With the following definition

    P(n; t) = e^{−Γt} P̃(n; t)    (13)

Eq. 12 becomes

    e^{−Γt} P̃(n+1; t) = Γ ∫_0^t e^{−Γt′} P̃(n; t′) e^{−Γ(t−t′)} dt′

    P̃(n+1; t) = Γ ∫_0^t P̃(n; t′) dt′    (14)

From Eqs. 7 and 13, P̃(0; t) = 1, and P̃(1; t) can then be found from an application of Eq. 14, as can each term after it. (A symbolic check of the recursion is sketched next, followed by the hand integrations.)
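Since Eq. 14 is just repeated integration, it can be handed to a computer algebra system as a cross-check. A minimal sketch, assuming the sympy package is available (our addition, not part of the manual):

    import sympy as sp

    t, u, Gamma = sp.symbols("t u Gamma", positive=True)

    Pn = sp.Integer(1)                 # P~(0; t) = 1, from Eqs. 7 and 13
    for n in range(1, 5):
        # Eq. 14: P~(n; t) = Gamma * integral from 0 to t of P~(n-1; u) du
        Pn = Gamma * sp.integrate(Pn.subs(t, u), (u, 0, t))
        print(f"P~({n}; t) =", Pn)

The printed terms (Γt, Γ²t²/2, Γ³t³/6, Γ⁴t⁴/24) reproduce the hand integrations that follow.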

Applying Eq. 14 for the first few terms,

    P̃(1; t) = Γ ∫_0^t P̃(0; t′) dt′ = Γ ∫_0^t dt′ = Γt    (15)

    P̃(2; t) = Γ ∫_0^t P̃(1; t′) dt′ = Γ ∫_0^t Γt′ dt′ = Γ²t²/2    (16)

    P̃(3; t) = Γ ∫_0^t P̃(2; t′) dt′ = Γ ∫_0^t (Γ²t′²/2) dt′ = Γ³t³/(2·3)    (17)

The pattern clearly emerges that

    P̃(n; t) = (Γt)^n / n!    (18)

And thus, with Eq. 13, the Poisson probabilities become

    P(n; t) = e^{−Γt} (Γt)^n / n!    (19)

Note that, as expected, the right side depends only on the combination µ = Γt. Using that substitution and dropping the now meaningless t dependence on the left, the standard Poisson probability distribution of Eq. 3 results. Although the Poisson probabilities were derived assuming a homogeneous process, they are also appropriate for nonhomogeneous processes with the appropriate value of µ (Eq. 2).

Several important properties of the Poisson distribution are easily investigated. For example, the normalization condition that some value of n from 0 to infinity must occur translates to

    Σ_{n=0}^∞ P(n) = 1    (20)

and is easily verified from the series expansion of the exponential function

    e^µ = Σ_{n=0}^∞ µ^n / n!    (21)

The expected number of events is found from the following weighted average

    ⟨n⟩ = Σ_{n=0}^∞ n P(n)    (22)

and is evaluated as follows:

    ⟨n⟩ = Σ_{n=1}^∞ n e^{−µ} µ^n / n!
        = µ Σ_{n=1}^∞ e^{−µ} µ^{n−1} / (n−1)!
        = µ Σ_{m=0}^∞ e^{−µ} µ^m / m!
        = µ    (23)
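Both series results are easy to confirm numerically. In the sketch below (our own, plain Python), the sums are truncated at n = 100, which is safe because the terms fall off factorially:

    import math

    mu = 3.7                  # an arbitrary test value
    P = [math.exp(-mu) * mu**n / math.factorial(n) for n in range(100)]
    print("sum of P(n):  ", sum(P))                               # Eq. 20: should be 1
    print("sum of n*P(n):", sum(n * p for n, p in enumerate(P)))  # Eq. 23: should be mu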

In the first line, the explicit form for P(n) is used and the n = 0 term is explicitly dropped, as it does not contribute to the sum. In the second line, the numerator's n is canceled against the one in the denominator's n!, and one µ is also factored out in front of the sum. In the third line, m = n − 1 is substituted, forcing a change in the indexing from n = 1, 2, ... to m = 0, 1, .... And in the last line, the normalization property is used. Thus, the Poisson probability distribution gives the required result that the expectation value (or parent average) of n is equal to µ.

We will also want to investigate the standard deviation of the Poisson distribution, which is evaluated from the expectation value

    σ² = ⟨(n − µ)²⟩ = ⟨n²⟩ − µ²    (24)

The expectation value ⟨n²⟩ is evaluated as follows:

    ⟨n²⟩ = Σ_{n=0}^∞ n² P(n)
         = Σ_{n=1}^∞ n² e^{−µ} µ^n / n!
         = µ Σ_{n=1}^∞ n e^{−µ} µ^{n−1} / (n−1)!
         = µ Σ_{m=0}^∞ (m + 1) e^{−µ} µ^m / m!
         = µ [ Σ_{m=0}^∞ m e^{−µ} µ^m / m! + Σ_{m=0}^∞ e^{−µ} µ^m / m! ]
         = µ [µ + 1]
         = µ² + µ    (25)

In the second line, the form for P(n) is substituted and the n = 0 term is dropped from the sum, as it does not contribute. In the third line, one power of µ is factored out of the sum and one n is canceled against one in the n!. The indexing and lower limit of the sum are adjusted in the fourth line using m = n − 1. In the fifth line, the m + 1 factor is separated into two terms, which are evaluated separately in the sixth line: the first is just ⟨m⟩ and evaluates to µ by Eq. 23, and the second evaluates to 1 by the normalization condition, Eq. 20.

Now combining Eq. 25 with Eq. 24 gives the result

    σ² = µ    (26)

implying that the parent variance is equal to the mean, i.e., that the standard deviation is given by σ = √µ.
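As a final numerical check (our own sketch; numpy assumed), samples drawn from a Poisson distribution should show a sample mean and a sample variance that are both close to µ:

    import numpy as np

    rng = np.random.default_rng(3)
    for mu in (1.5, 10.0):
        n = rng.poisson(mu, 1_000_000)
        # Both statistics should be near mu: <n> = mu (Eq. 23), sigma^2 = mu (Eq. 26)
        print(f"mu = {mu}: sample mean = {n.mean():.4f}, sample variance = {n.var():.4f}")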