Poisson Approximation for Independent Geometric Random Variables


International Mathematical Forum, 2, 2007, no. 65, 3211-3218

K. Teerapabolarn (corresponding author; e-mail: kanint@buu.ac.th) and P. Wongasem
Department of Mathematics, Faculty of Science, Burapha University, Chonburi 20131, Thailand

Abstract

The Stein-Chen method is used to derive two formulas, a uniform bound and a non-uniform bound, for the error of the Poisson approximation to a sum of n independent geometric random variables. The application of these formulas is illustrated with the Poisson approximation to the negative binomial distribution.

Mathematics Subject Classification: Primary 60F05

Keywords: Geometric summands; negative binomial distribution; Poisson approximation; Stein-Chen method

1 Introduction and main results

In the past few years there have been many papers on the Stein-Chen method for bounding Poisson approximation errors (see [3], [10] and the references therein) and on applications of the technique to binomial, compound Poisson and geometric approximation; see [4], [2] and [6]. In this paper we use the Stein-Chen method to bound the error of the Poisson approximation to the distribution of a sum of n independent geometric random variables, which can be thought of as the number of failures before the n-th success in a sequence of Bernoulli trials.

Before constructing such a bound, we define a random variable X as the number of failures before the first success in a sequence of Bernoulli trials with success probability p, 0 < p < 1. Thus

$$P(X = k) = (1-p)^{k} p, \qquad k = 0, 1, 2, \ldots,$$

is the geometric probability function with parameter p.
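As a small sanity check on this parametrization (added here; it is not part of the original paper), the Python sketch below compares simulated frequencies of the "failures before the first success" count with the probability function above. Note that numpy's geometric sampler counts trials up to and including the first success, so one is subtracted.

```python
import numpy as np

# Illustrative sketch (not from the paper): the number of failures before the
# first success has pmf P(X = k) = (1 - p)^k * p and mean (1 - p) / p.
rng = np.random.default_rng(0)
p = 0.3
# numpy's geometric variates count trials until the first success (values >= 1),
# so the number of failures is that count minus one.
samples = rng.geometric(p, size=200_000) - 1

for k in range(5):
    print(f"k={k}: empirical {np.mean(samples == k):.4f}   exact {(1 - p) ** k * p:.4f}")
print("sample mean:", samples.mean(), "  theoretical (1-p)/p:", (1 - p) / p)
```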

Let X_1, ..., X_n be independent geometric random variables with

$$P(X_i = k) = (1-p_i)^{k} p_i, \qquad k = 0, 1, 2, \ldots,$$

where the success probabilities p_i are not necessarily identical. Let $W = \sum_{i=1}^{n} X_i$ and $\lambda = E(W) = \sum_{i=1}^{n}(1-p_i)p_i^{-1}$; then W is the number of failures before the n-th success in a sequence of independent Bernoulli trials. If the p_i are all equal to p, then

$$P(W = k) = \binom{n+k-1}{k}(1-p)^{k}p^{n}, \qquad k = 0, 1, 2, \ldots,$$

is the negative binomial distribution with parameters n and p. In this paper we use the Stein-Chen method to obtain uniform and non-uniform error bounds for approximating the distribution of W by the Poisson distribution with parameter λ. The following theorems are our main results.

Theorem 1.1. Let W and λ be defined as above. Then, for every A ⊆ ℕ ∪ {0},

$$\left|P(W \in A) - \sum_{k \in A}\frac{e^{-\lambda}\lambda^{k}}{k!}\right| \le \sum_{i=1}^{n}\min\left\{\frac{\lambda^{-1}(1-e^{-\lambda})}{p_i},\, 1\right\}(1-p_i)^{2}p_i^{-1}. \tag{1.1}$$

Theorem 1.2. For w_0 ∈ ℕ ∪ {0}, the following formula holds:

$$\left|P(W \le w_0) - \sum_{k=0}^{w_0}\frac{e^{-\lambda}\lambda^{k}}{k!}\right| \le \lambda^{-1}(e^{\lambda}-1)\sum_{i=1}^{n}\min\left\{\frac{1}{p_i(w_0+1)},\, 1\right\}(1-p_i)^{2}p_i^{-1}. \tag{1.2}$$

Remarks. 1. The error bounds in (1.1) and (1.2) are small when all the quantities 1−p_i are small; that is, each theorem yields a good Poisson approximation when every 1−p_i is small. Furthermore, the error bound in (1.2) depends on w_0 and decreases as w_0 increases.

2. In the case A = {0} or w_0 = 0, applying Lemma 2.1(3) of Teerapabolarn and Neammanee [11] gives the inequality

$$\left|P(W = 0) - e^{-\lambda}\right| \le \lambda^{-2}(\lambda + e^{-\lambda} - 1)\sum_{i=1}^{n}(1-p_i)^{2}p_i^{-1}. \tag{1.3}$$
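The following Python sketch (added here; it is not part of the paper) compares the uniform bound (1.1) with the exact worst-set error $\sup_A|P(W\in A) - P_\lambda(A)|$, i.e. the total variation distance, for one illustrative choice of non-identical p_i; the distribution of W is obtained by truncated convolution.

```python
import numpy as np
from math import exp, factorial

# Numerical sketch (not from the paper): Theorem 1.1's bound versus the exact
# total variation distance between W and Poisson(lambda).
p = np.array([0.7, 0.8, 0.9, 0.95])
lam = np.sum((1 - p) / p)                  # lambda = E(W)
K = 60                                     # truncation point of the support

def geom_pmf(pi, K):
    k = np.arange(K + 1)
    return (1 - pi) ** k * pi              # P(X_i = k), k = 0,...,K

# pmf of W by repeated convolution (mass beyond K is negligible here)
w_pmf = np.array([1.0])
for pi in p:
    w_pmf = np.convolve(w_pmf, geom_pmf(pi, K))[: K + 1]

pois = np.array([exp(-lam) * lam ** k / factorial(k) for k in range(K + 1)])

tv = 0.5 * np.sum(np.abs(w_pmf - pois))    # = sup_A |P(W in A) - P_lambda(A)|
bound = np.sum(np.minimum((1 - np.exp(-lam)) / (lam * p), 1.0) * (1 - p) ** 2 / p)

print(f"lambda = {lam:.4f}")
print(f"exact total variation distance = {tv:.5f}")
print(f"Theorem 1.1 uniform bound      = {bound:.5f}")
```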

2 Proof of main results

We prove the main results by the Stein-Chen method. The method was originally formulated for normal approximation by Stein in 1972 and was adapted to the Poisson case by Chen in 1975. Its starting point is Stein's equation for the Poisson distribution with parameter λ, which, for a given function h, is

$$\lambda f(w+1) - w f(w) = h(w) - P_\lambda(h), \tag{2.1}$$

where $P_\lambda(h) = e^{-\lambda}\sum_{k=0}^{\infty}h(k)\lambda^{k}/k!$ and f and h are bounded real-valued functions defined on ℕ ∪ {0}.

For A ⊆ ℕ ∪ {0}, let h_A be defined by

$$h_A(w) = \begin{cases} 1 & \text{if } w \in A,\\ 0 & \text{if } w \notin A.\end{cases} \tag{2.2}$$

From Barbour et al. [3], p. 7, the solution f_A of (2.1) with h = h_A is of the form

$$f_A(w) = \begin{cases} (w-1)!\,\lambda^{-w}e^{\lambda}\bigl[P_\lambda(h_{A\cap C_{w-1}}) - P_\lambda(h_A)P_\lambda(h_{C_{w-1}})\bigr] & \text{if } w \ge 1,\\ 0 & \text{if } w = 0,\end{cases} \tag{2.3}$$

where C_w = {0, ..., w}, and, for w_0 ∈ ℕ ∪ {0}, the solution f_{C_{w_0}} of (2.1) can be expressed as

$$f_{C_{w_0}}(w) = \begin{cases} (w-1)!\,\lambda^{-w}e^{\lambda}\,P_\lambda(h_{C_{w_0}})\,P_\lambda(1-h_{C_{w-1}}) & \text{if } w_0 < w,\\ (w-1)!\,\lambda^{-w}e^{\lambda}\,P_\lambda(h_{C_{w-1}})\,P_\lambda(1-h_{C_{w_0}}) & \text{if } 1 \le w \le w_0,\\ 0 & \text{if } w = 0.\end{cases} \tag{2.4}$$

The following lemmas are needed to prove the main results.

Lemma 2.1. Let Δf_A(w) = f_A(w+1) − f_A(w). Then, for A ⊆ ℕ ∪ {0} and w ∈ ℕ, we have

$$\bigl|\Delta f_A(w)\bigr| \le \min\left\{\lambda^{-1}(1-e^{-\lambda}),\ \frac{1}{w}\right\}. \tag{2.5}$$

Proof. This is obtained from Barbour et al. [3], p. 8.
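As an optional numerical check (added here; not in the original paper), the sketch below evaluates the explicit solution (2.3) for one arbitrary test set A and verifies the bound (2.5); the choice of A and the truncation level K are illustrative assumptions.

```python
from math import exp, factorial

# Numerical sketch (not from the paper): evaluate the Stein solution f_A of
# (2.3) and check the bound (2.5) of Lemma 2.1 for a test set A.
lam = 1.5
K = 40                                   # truncation of the Poisson support
pois = [exp(-lam) * lam ** k / factorial(k) for k in range(K + 1)]

A = set(range(0, K + 1, 2))              # arbitrary test set: the even integers

def P_lambda(S):
    """Poisson(lam) probability of a subset S of {0,...,K}."""
    return sum(pois[k] for k in S)

def f_A(w):
    """Explicit solution (2.3); f_A(0) = 0."""
    if w == 0:
        return 0.0
    C = set(range(w))                    # C_{w-1} = {0,...,w-1}
    bracket = P_lambda(A & C) - P_lambda(A) * P_lambda(C)
    return factorial(w - 1) * lam ** (-w) * exp(lam) * bracket

bound_const = (1 - exp(-lam)) / lam
for w in range(1, 13):
    diff = abs(f_A(w + 1) - f_A(w))
    print(f"w={w:2d}  |Delta f_A(w)| = {diff:.5f}   bound (2.5) = {min(bound_const, 1 / w):.5f}")
```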

Lemma 2.2. For w_0 ∈ ℕ ∪ {0} and w ∈ ℕ, the following estimate holds:

$$\bigl|\Delta f_{C_{w_0}}(w)\bigr| \le \lambda^{-1}(e^{\lambda}-1)\min\left\{\frac{1}{w_0+1},\ \frac{1}{w}\right\}. \tag{2.6}$$

Proof. By Lemma 2.1(3) of Teerapabolarn and Neammanee [12], we have

$$\sup_{w \ge 1}\bigl|\Delta f_{C_{w_0}}(w)\bigr| \le \frac{\lambda^{-1}(e^{\lambda}-1)}{w_0+1}.$$

Now we shall show that

$$\bigl|\Delta f_{C_{w_0}}(w)\bigr| \le \frac{\lambda^{-1}(e^{\lambda}-1)}{w}.$$

By (2.4), direct calculation gives

$$\Delta f_{C_{w_0}}(w) = \begin{cases} (w-1)!\,\lambda^{-(w+1)}e^{-\lambda}\displaystyle\sum_{j=0}^{w_0}\frac{\lambda^{j}}{j!}\sum_{k=w+1}^{\infty}\frac{(w-k)\lambda^{k}}{k!} & \text{if } w \ge w_0+1,\\[2mm] (w-1)!\,\lambda^{-(w+1)}e^{-\lambda}\displaystyle\sum_{j=w_0+1}^{\infty}\frac{\lambda^{j}}{j!}\sum_{k=0}^{w}\frac{(w-k)\lambda^{k}}{k!} & \text{if } 1 \le w \le w_0. \end{cases} \tag{2.7}$$

Since $\sum_{k=w+1}^{\infty}(w-k)\lambda^{k}/k! < 0$ and $\sum_{k=0}^{w}(w-k)\lambda^{k}/k! > 0$, it follows from (2.7) that $\Delta f_{C_{w_0}}(w) < 0$ if $w \ge w_0+1$ and $\Delta f_{C_{w_0}}(w) > 0$ if $1 \le w \le w_0$.

For $w \ge w_0+1$, using $e^{-\lambda}\sum_{j=0}^{w_0}\lambda^{j}/j! \le 1$ and (2.7), we have

$$\begin{aligned}
0 < -\Delta f_{C_{w_0}}(w) &\le (w-1)!\,\lambda^{-(w+1)}\sum_{l=w+1}^{\infty}\frac{(l-w)\lambda^{l}}{l!}\\
&= (w-1)!\left\{\frac{1}{(w+1)!} + \frac{2\lambda}{(w+2)!} + \frac{3\lambda^{2}}{(w+3)!} + \cdots\right\}\\
&= \frac{1}{w}\left\{\frac{1}{w+1} + \frac{2\lambda}{(w+1)(w+2)} + \frac{3\lambda^{2}}{(w+1)(w+2)(w+3)} + \cdots\right\}\\
&\le \frac{1}{w\lambda}\left\{\frac{\lambda}{2!} + \frac{2\lambda^{2}}{3!} + \frac{3\lambda^{3}}{4!} + \cdots\right\}\\
&\le \frac{\lambda^{-1}(e^{\lambda}-1)}{w},
\end{aligned}$$

and, for $1 \le w \le w_0$,

$$0 < \Delta f_{C_{w_0}}(w) \le \frac{\lambda^{-1}(e^{\lambda}-1)}{w_0+1} \le \frac{\lambda^{-1}(e^{\lambda}-1)}{w}.$$

Hence (2.6) is obtained.

Proof of Theorem 1.1. We shall show that (1.1) holds. From (2.1) with h = h_A, we have

$$P(W \in A) - \sum_{k\in A}\frac{e^{-\lambda}\lambda^{k}}{k!} = E\bigl[\lambda f(W+1) - W f(W)\bigr], \tag{2.8}$$

where f = f_A is defined in (2.3). Since $\lambda = \sum_{i=1}^{n}(p_i^{-1}-1)$ and $W = \sum_{i=1}^{n}X_i$, the triangle inequality gives

$$\bigl|E[\lambda f(W+1) - W f(W)]\bigr| \le \sum_{i=1}^{n}\bigl|E\bigl[(p_i^{-1}-1)f(W+1) - X_i f(W)\bigr]\bigr|. \tag{2.9}$$

Let W_i = W − X_i. Then W_i and X_i are independent and, for each i, we get

$$\begin{aligned}
E\bigl[(p_i^{-1}-1)f(W+1) - X_i f(W)\bigr]
&= E\bigl[(p_i^{-1}-1)f(W_i+X_i+1) - X_i f(W_i+X_i)\bigr]\\
&= E\Bigl\{E\bigl[(p_i^{-1}-1)f(W_i+X_i+1) - X_i f(W_i+X_i)\,\big|\,X_i\bigr]\Bigr\}\\
&= \sum_{k=0}^{\infty} E\bigl[(p_i^{-1}-1)f(W_i+k+1) - k f(W_i+k)\bigr]\,P(X_i = k)\\
&= E\bigl[(p_i^{-1}-1)f(W_i+1)\bigr]p_i
 + E\bigl[(p_i^{-1}-1)f(W_i+2) - f(W_i+1)\bigr]p_i(1-p_i)\\
&\qquad + \sum_{k=2}^{\infty} E\bigl[(p_i^{-1}-1)f(W_i+k+1) - k f(W_i+k)\bigr]p_i(1-p_i)^{k}\\
&= E\bigl[(1-p_i)f(W_i+1)\bigr] + E\bigl[(1-p_i)^{2}f(W_i+2) - p_i(1-p_i)f(W_i+1)\bigr]\\
&\qquad + \sum_{k=2}^{\infty} E\bigl[(1-p_i)^{k+1}f(W_i+k+1) - kp_i(1-p_i)^{k}f(W_i+k)\bigr]\\
&= \bigl[(1-p_i) - p_i(1-p_i)\bigr]E[f(W_i+1)] + E\bigl[(1-p_i)^{2}f(W_i+2)\bigr]\\
&\qquad + \sum_{k=2}^{\infty} E\bigl[(1-p_i)^{k+1}f(W_i+k+1) - kp_i(1-p_i)^{k}f(W_i+k)\bigr]\\
&= (1-p_i)^{2}E[f(W_i+1)] + \sum_{k=2}^{\infty} E\bigl[(1-p_i)^{k}f(W_i+k) - kp_i(1-p_i)^{k}f(W_i+k)\bigr]\\
&= (1-p_i)^{2}E[f(W_i+1)] + \sum_{k=2}^{\infty} E\bigl[(1-p_i)^{k+1}f(W_i+k)\bigr]
 - \sum_{k=2}^{\infty} E\bigl[(k-1)p_i(1-p_i)^{k}f(W_i+k)\bigr]\\
&= (1-p_i)^{2}E[f(W_i+1)] + \sum_{k=2}^{\infty} k(1-p_i)^{k+1}E[f(W_i+k)]
 - \sum_{k=2}^{\infty} k(1-p_i)^{k+1}E[f(W_i+k+1)]\\
&\qquad - (1-p_i)^{2}E[f(W_i+2)]\\
&= (1-p_i)^{2}E\bigl[f(W_i+1) - f(W_i+2)\bigr]
 + \sum_{k=2}^{\infty} k(1-p_i)^{k+1}E\bigl[f(W_i+k) - f(W_i+k+1)\bigr]\\
&= \sum_{k=1}^{\infty} k(1-p_i)^{k+1}E\bigl[f(W_i+k) - f(W_i+k+1)\bigr],
\end{aligned}$$

where the passage from the line containing $(k-1)p_i(1-p_i)^{k}$ to the next line uses $p_i(1-p_i)^{k} = (1-p_i)^{k} - (1-p_i)^{k+1}$ together with a shift of the summation index.

By Lemma 2.1 and the fact that $W_i + k \ge k$,

$$\begin{aligned}
\bigl|E\bigl[(p_i^{-1}-1)f(W+1) - X_i f(W)\bigr]\bigr|
&\le \sum_{k=1}^{\infty} k(1-p_i)^{k+1} E\bigl|f(W_i+k) - f(W_i+k+1)\bigr|\\
&\le \sum_{k=1}^{\infty} k(1-p_i)^{k+1}\min\left\{\lambda^{-1}(1-e^{-\lambda}),\ \frac{1}{k}\right\}\\
&\le \min\left\{\lambda^{-1}(1-e^{-\lambda})\sum_{k=1}^{\infty}k(1-p_i)^{k+1},\ \sum_{k=1}^{\infty}(1-p_i)^{k+1}\right\}\\
&= \min\bigl\{\lambda^{-1}(1-e^{-\lambda})(1-p_i)^{2}p_i^{-2},\ (1-p_i)^{2}p_i^{-1}\bigr\}\\
&= \min\left\{\frac{\lambda^{-1}(1-e^{-\lambda})}{p_i},\ 1\right\}(1-p_i)^{2}p_i^{-1}.
\end{aligned} \tag{2.10}$$

Hence, by (2.8), (2.9) and (2.10), (1.1) holds.

Proof of Theorem 1.2. Using the same argument as in the proof of Theorem 1.1, with Lemma 2.2 in place of Lemma 2.1, the theorem is obtained.
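For the reader's convenience (this short verification is added here and is not in the original paper), the two series evaluations used in the last steps of (2.10) are the standard geometric sums

$$\sum_{k=1}^{\infty}(1-p_i)^{k+1} = (1-p_i)^{2}\sum_{j=0}^{\infty}(1-p_i)^{j} = \frac{(1-p_i)^{2}}{p_i},
\qquad
\sum_{k=1}^{\infty}k(1-p_i)^{k+1} = (1-p_i)^{2}\sum_{k=1}^{\infty}k(1-p_i)^{k-1} = \frac{(1-p_i)^{2}}{p_i^{2}}.$$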

3 An illustrative application

This section gives an illustrative application of the theorems: the negative binomial distribution is approximated by the Poisson distribution with parameter λ = np^{-1}(1−p), in both a uniform and a non-uniform form. Finally, we compare our uniform bound with some other results.

3.1 Uniform and non-uniform bounds

When the p_i are identical to p, we have λ = np^{-1}(1−p), and the uniform bound of Theorem 1.1 for approximating the negative binomial distribution by the Poisson distribution takes the form

$$\left|P(W \in A) - \sum_{k\in A}\frac{e^{-\lambda}\lambda^{k}}{k!}\right| \le \min\bigl\{1-e^{-\lambda},\ \lambda p\bigr\}\frac{1-p}{p}, \tag{3.1}$$

where A ⊆ ℕ ∪ {0}. For w_0 ∈ ℕ ∪ {0}, the non-uniform bound of Theorem 1.2 takes the form

$$\left|P(W \le w_0) - \sum_{k=0}^{w_0}\frac{e^{-\lambda}\lambda^{k}}{k!}\right| \le \min\left\{\frac{1}{w_0+1},\ p\right\}(e^{\lambda}-1)\frac{1-p}{p}, \tag{3.2}$$

and in the case w_0 = 0, by (1.3), we have

$$\bigl|P(W = 0) - e^{-\lambda}\bigr| \le \lambda^{-1}(\lambda + e^{-\lambda} - 1)(1-p) = \frac{(\lambda + e^{-\lambda} - 1)p}{n}. \tag{3.3}$$

It should be noted that the bounds in (3.1) and (3.2) are small when 1−p is small or λ is small; that is, each of (3.1) and (3.2) gives a good Poisson approximation for small 1−p or small λ. Moreover, the bound in (3.2) can be decreased by increasing w_0.

3.2 Comparison with some other results

We now compare our result (3.1) with the following bounds for $\sup_{A}\bigl|P(W\in A) - \sum_{k\in A}e^{-\lambda}\lambda^{k}/k!\bigr|$:

$$\frac{1-p}{p} \quad \text{(Vervaat [13])}, \tag{3.4}$$

$$\frac{1-p}{2p} \quad \text{(Romanowska [8])}, \tag{3.5}$$

$$\lambda(1-p) \quad \text{(Gerber [5])}, \tag{3.6}$$

$$\frac{\lambda(1-p)}{p} \quad \text{(Pfeifer [7])}, \tag{3.7}$$

$$\frac{(1-e^{-\lambda})(1-p)}{p} \quad \text{(Barbour [1])}, \tag{3.8}$$

$$\min\left\{\frac{3}{4e},\ \lambda\right\}\frac{1-p}{p} \quad \text{(Roos [9])}. \tag{3.9}$$

Note that (3.8) can also be derived from Theorem 1.C (ii) of Barbour et al. [3]. Since our bound (3.1) is the minimum of the bounds in (3.6) and (3.8), it is better than the bounds in (3.4) and (3.7). If $1-p < \frac{1}{2n}$ (or $\lambda < \log\bigl(1-\frac{1}{2}\bigr)^{-1} = \log 2$), then our bound is also better than Romanowska's bound (3.5). Observe that Roos's bound (3.9) is better than Romanowska's bound; hence the bound in (3.1) is better than the bounds in (3.5) and (3.9) if $1-p < \frac{3}{4en}$ (or $\lambda < \log\bigl(1-\frac{3}{4e}\bigr)^{-1}$).
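The following Python sketch (added here; not part of the paper) tabulates the bound (3.1) together with the bounds (3.4)-(3.9) and the exact worst-set error, for one illustrative choice of n and p in the regime 1−p < 3/(4en) discussed above; the formulas are transcribed directly from the display.

```python
import numpy as np
from math import comb, exp, factorial, e

# Numerical sketch (not from the paper): the uniform bound (3.1) versus the
# bounds (3.4)-(3.9) and the exact total variation distance, for one (n, p).
n, p = 10, 0.98
q = 1 - p
lam = n * q / p

bounds = {
    "(3.1) this paper": min(1 - exp(-lam), lam * p) * q / p,
    "(3.4) Vervaat":    q / p,
    "(3.5) Romanowska": q / (2 * p),
    "(3.6) Gerber":     lam * q,
    "(3.7) Pfeifer":    lam * q / p,
    "(3.8) Barbour":    (1 - exp(-lam)) * q / p,
    "(3.9) Roos":       min(3 / (4 * e), lam) * q / p,
}

# exact total variation distance between NB(n, p) and Poisson(lam)
K = 40
nb = np.array([comb(n + k - 1, k) * q ** k * p ** n for k in range(K + 1)])
po = np.array([exp(-lam) * lam ** k / factorial(k) for k in range(K + 1)])
print(f"exact sup_A error = {0.5 * np.sum(np.abs(nb - po)):.6f}")
for name, b in bounds.items():
    print(f"{name:17s} = {b:.6f}")
```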

Acknowledgements

The authors would like to thank the Faculty of Science, Burapha University, for financial support.

References

[1] A.D. Barbour, Asymptotic expansions in the Poisson limit theorem, Ann. Probab., 15 (1987), 748-766.

[2] A.D. Barbour, L.H.Y. Chen, W.L. Loh, Compound Poisson approximation for nonnegative random variables via Stein's method, Ann. Probab., 20 (1992), 1843-1866.

[3] A.D. Barbour, L. Holst, S. Janson, Poisson Approximation, Oxford Studies in Probability 2, Clarendon Press, Oxford, 1992.

[4] W. Ehm, Binomial approximation to the Poisson binomial distribution, Statist. Probab. Lett., 11 (1991), 7-16.

[5] H.U. Gerber, Error bounds for the compound Poisson approximation, Insurance Math. Econom., 3 (1984), 191-194.

[6] E. Peköz, Stein's method for geometric approximation, J. Appl. Probab., 33 (1996), 707-713.

[7] D. Pfeifer, On the distance between mixed Poisson and Poisson distributions, Statist. Decisions, 5 (1987), 367-379.

[8] M. Romanowska, A note on the upper bound for the distance in total variation between the binomial and the Poisson distributions, Statist. Neerlandica, 31 (1977), 127-130.

[9] B. Roos, Improvements in the Poisson approximation of mixed Poisson distributions, J. Statist. Plann. Inference, 113 (2003), 467-483.

[10] C.M. Stein, Approximate Computation of Expectations, IMS, Hayward, California, 1986.

[11] K. Teerapabolarn, K. Neammanee, A non-uniform bound on Poisson approximation in somatic cell hybrid model, Math. Biosci., 195 (2005), 56-64.

[12] K. Teerapabolarn, K. Neammanee, Poisson approximation for sums of dependent Bernoulli random variables, Acta Math., 22 (2006), 87-99.

[13] W. Vervaat, Upper bounds for the distance in total variation between the binomial or negative binomial and the Poisson distribution, Statist. Neerlandica, 23 (1969), 79-86.

Received: June 27, 2007