
ON THE NUMBER OF POSITIVE SUMS OF INDEPENDENT RANDOM VARIABLES

P. ERDÖS AND M. KAC¹

1. Introduction. In a recent paper² the authors have introduced a method for proving certain limit theorems of the theory of probability. The purpose of the present note is to apply this general method in order to establish the following:

THEOREM. Let $X_1, X_2, \ldots$ be independent random variables each having mean 0 and variance 1 and such that the central limit theorem is applicable. Let $s_k = X_1 + X_2 + \cdots + X_k$ and let $N_n$ denote the number of $s_k$'s, $1 \le k \le n$, which are positive. Then
$$\lim_{n \to \infty} \operatorname{Prob}\left\{ \frac{N_n}{n} < \alpha \right\} = \frac{2}{\pi} \arcsin \alpha^{1/2}, \qquad 0 \le \alpha \le 1.$$

This theorem has been proved in the binomial case $\operatorname{Prob}\{X_j = 1\} = \operatorname{Prob}\{X_j = -1\} = 1/2$ by P. Lévy,³ who also indicated that it is true for more general random variables. However, the authors felt that their proof is of independent interest, especially since it follows an already established pattern.

2. The invariance principle. We first prove the following: If the theorem can be established for one particular sequence of independent random variables $Y_1, Y_2, \ldots$ satisfying the conditions of the theorem, then the conclusion of the theorem holds for all sequences of independent random variables which satisfy the conditions of the theorem. In other words, if the limiting distribution exists it is independent of the distributions of the individual $X$'s.

Let
$$\psi(s) = \begin{cases} 1 & \text{if } s > 0, \\ 0 & \text{if } s \le 0. \end{cases}$$

Received by the editors March 6, 1947.
¹ This paper was written while both authors were John Simon Guggenheim Memorial Fellows.
² P. Erdös and M. Kac, On certain limit theorems of the theory of probability, Bull. Amer. Math. Soc. vol. 52 (1946) pp. 292-302.
³ P. Lévy, Sur certains processus stochastiques homogènes, Compositio Math. vol. 7 (1939) pp. 283-339, in particular, Corollaire 2 on pp. 303-304.
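The theorem lends itself to a quick simulation: generate many independent walks with standard normal steps, record for each walk the fraction of positive partial sums, and compare the empirical distribution with $(2/\pi)\arcsin \alpha^{1/2}$. A minimal sketch (the function names and parameters are ours, not the paper's):

```python
import math
import random

random.seed(1)  # fixed seed so the checks below are deterministic

def positive_fraction(n):
    """Fraction of the first n partial sums s_k that are positive,
    for one walk with standard normal steps (mean 0, variance 1)."""
    s, pos = 0.0, 0
    for _ in range(n):
        s += random.gauss(0.0, 1.0)
        if s > 0:
            pos += 1
    return pos / n

n, trials = 200, 3000
fractions = [positive_fraction(n) for _ in range(trials)]

def empirical_cdf(alpha):
    # empirical Prob{N_n / n < alpha}
    return sum(f < alpha for f in fractions) / trials

def p(alpha):
    # limiting arc sine distribution (2/pi) * arcsin(alpha^{1/2})
    return (2 / math.pi) * math.asin(math.sqrt(alpha))
```

The U-shape of the arc sine law shows up immediately: fractions near 0 and 1 are far more likely than fractions near 1/2, even though the steps are symmetric.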

Note that $N_n = \sum_{r=1}^{n} \psi(s_r)$. Let now
$$n_j = [jn/k], \qquad j = 0, 1, \ldots, k,$$
and put
$$D_n = \frac{1}{n} \sum_{j=1}^{k} \sum_{r=n_{j-1}+1}^{n_j} \bigl( \psi(s_{n_j}) - \psi(s_r) \bigr).$$
We have
$$E(|D_n|) \le \frac{1}{n} \sum_{j=1}^{k} \sum_{r=n_{j-1}+1}^{n_j} E\bigl( |\psi(s_{n_j}) - \psi(s_r)| \bigr),$$
and we wish now to estimate $E(|\psi(s_{n_j}) - \psi(s_r)|)$. Notice that for $n_{j-1}+1 \le r \le n_j$
$$E\bigl( |\psi(s_{n_j}) - \psi(s_r)| \bigr) = \operatorname{Prob}\{ s_{n_j} > 0,\ s_r \le 0 \} + \operatorname{Prob}\{ s_{n_j} \le 0,\ s_r > 0 \},$$
and that for $\varepsilon > 0$
$$\operatorname{Prob}\{ s_{n_j} > 0,\ s_r \le 0 \} = \operatorname{Prob}\{ s_{n_j} \ge \varepsilon n_j^{1/2},\ s_r \le 0 \} + \operatorname{Prob}\{ 0 < s_{n_j} < \varepsilon n_j^{1/2},\ s_r \le 0 \}$$
$$\le \operatorname{Prob}\{ s_{n_j} - s_r \ge \varepsilon n_j^{1/2} \} + \operatorname{Prob}\{ 0 < s_{n_j} < \varepsilon n_j^{1/2} \}.$$
By Chebysheff's inequality
$$\operatorname{Prob}\{ s_{n_j} - s_r \ge \varepsilon n_j^{1/2} \} \le \frac{n_j - r}{\varepsilon^2 n_j},$$
and hence
$$\operatorname{Prob}\{ s_{n_j} > 0,\ s_r \le 0 \} \le \frac{n_j - r}{\varepsilon^2 n_j} + \operatorname{Prob}\{ 0 < s_{n_j} < \varepsilon n_j^{1/2} \}.$$
In exactly the same way we obtain
$$\operatorname{Prob}\{ s_{n_j} \le 0,\ s_r > 0 \} \le \frac{n_j - r}{\varepsilon^2 n_j} + \operatorname{Prob}\{ -\varepsilon n_j^{1/2} < s_{n_j} < 0 \}.$$

Thus, for $n_{j-1}+1 \le r \le n_j$,
$$E\bigl( |\psi(s_{n_j}) - \psi(s_r)| \bigr) \le \frac{2(n_j - r)}{\varepsilon^2 n_j} + \operatorname{Prob}\{ -\varepsilon n_j^{1/2} < s_{n_j} < \varepsilon n_j^{1/2} \}.$$
Finally,
$$E(|D_n|) \le \frac{2}{n \varepsilon^2} \sum_{j=1}^{k} \frac{1}{n_j} \sum_{r=n_{j-1}+1}^{n_j} (n_j - r) + \frac{1}{n} \sum_{j=1}^{k} (n_j - n_{j-1}) \operatorname{Prob}\{ -\varepsilon n_j^{1/2} < s_{n_j} < \varepsilon n_j^{1/2} \}$$
$$= \frac{1}{n \varepsilon^2} \sum_{j=1}^{k} \frac{(n_j - n_{j-1})(n_j - n_{j-1} - 1)}{n_j} + \frac{1}{n} \sum_{j=1}^{k} (n_j - n_{j-1}) \operatorname{Prob}\{ -\varepsilon n_j^{1/2} < s_{n_j} < \varepsilon n_j^{1/2} \} = R(n, \varepsilon, k).$$
We note that, letting $n \to \infty$ while keeping $k$ and $\varepsilon$ fixed, we get (using the central limit theorem)
$$\lim_{n \to \infty} R(n, \varepsilon, k) = \frac{1}{k \varepsilon^2} \sum_{j=1}^{k} \frac{1}{j} + \frac{1}{(2\pi)^{1/2}} \int_{-\varepsilon}^{\varepsilon} e^{-u^2/2}\,du < \frac{1 + \log k}{k \varepsilon^2} + \varepsilon.$$
Let $\delta > 0$; we have
$$\operatorname{Prob}\{ |D_n| \ge \delta \} \le R(n, \varepsilon, k)/\delta,$$
or, recalling the definition of $D_n$,
$$\operatorname{Prob}\left\{ \left| \frac{1}{n} \sum_{r=1}^{n} \psi(s_r) - \frac{1}{n} \sum_{j=1}^{k} (n_j - n_{j-1}) \psi(s_{n_j}) \right| \ge \delta \right\} \le \frac{R(n, \varepsilon, k)}{\delta}.$$
We have furthermore
$$\operatorname{Prob}\left\{ \frac{1}{n} \sum_{r=1}^{n} \psi(s_r) < \alpha \right\} = \operatorname{Prob}\left\{ \frac{1}{n} \sum_{r=1}^{n} \psi(s_r) < \alpha,\ |D_n| < \delta \right\} + \operatorname{Prob}\left\{ \frac{1}{n} \sum_{r=1}^{n} \psi(s_r) < \alpha,\ |D_n| \ge \delta \right\}$$
$$\le \operatorname{Prob}\left\{ \frac{1}{n} \sum_{j=1}^{k} (n_j - n_{j-1}) \psi(s_{n_j}) < \alpha + \delta \right\} + \frac{R(n, \varepsilon, k)}{\delta}.$$
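Both ingredients of the bound $\lim_n R(n, \varepsilon, k) < (1 + \log k)/(k\varepsilon^2) + \varepsilon$ are elementary and easy to check numerically: the harmonic sum is at most $1 + \log k$, and the Gaussian mass of $(-\varepsilon, \varepsilon)$ is less than $\varepsilon$ because the density is bounded by $(2\pi)^{-1/2}$. A sketch (function names ours):

```python
import math

# First term: (1/(k eps^2)) * sum_{j=1}^{k} 1/j <= (1 + log k)/(k eps^2),
# i.e. the harmonic number H_k is at most 1 + log k.
for k in (1, 2, 10, 100, 10000):
    harmonic = sum(1.0 / j for j in range(1, k + 1))
    assert harmonic <= 1 + math.log(k)

# Second term: (2 pi)^{-1/2} * integral_{-eps}^{eps} e^{-u^2/2} du < eps,
# since the integrand is at most 1 and 2/(2 pi)^{1/2} < 1.
def gauss_mass(eps, m=4000):
    # midpoint rule on [-eps, eps]
    h = 2 * eps / m
    total = sum(math.exp(-(-eps + (j + 0.5) * h) ** 2 / 2) for j in range(m))
    return total * h / math.sqrt(2 * math.pi)

for eps in (0.1, 0.5, 1.0):
    assert gauss_mass(eps) < eps
```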

In the same way we obtain
$$\operatorname{Prob}\left\{ \frac{1}{n} \sum_{r=1}^{n} \psi(s_r) < \alpha \right\} \ge \operatorname{Prob}\left\{ \frac{1}{n} \sum_{j=1}^{k} (n_j - n_{j-1}) \psi(s_{n_j}) < \alpha - \delta \right\} - \frac{R(n, \varepsilon, k)}{\delta}.$$
Combining the above inequalities we have
$$(1) \qquad \operatorname{Prob}\left\{ \frac{1}{n} \sum_{j=1}^{k} (n_j - n_{j-1}) \psi(s_{n_j}) < \alpha - \delta \right\} - \frac{R(n, \varepsilon, k)}{\delta} \le \operatorname{Prob}\left\{ \frac{N_n}{n} < \alpha \right\}$$
$$\le \operatorname{Prob}\left\{ \frac{1}{n} \sum_{j=1}^{k} (n_j - n_{j-1}) \psi(s_{n_j}) < \alpha + \delta \right\} + \frac{R(n, \varepsilon, k)}{\delta}.$$
Let now $G_1, G_2, \ldots$ be independent, normally distributed random variables each having mean 0 and variance 1, and let $R_j = G_1 + \cdots + G_j$. It then follows, almost immediately, from the multidimensional central limit theorem that
$$\lim_{n \to \infty} \operatorname{Prob}\left\{ \frac{1}{n} \sum_{j=1}^{k} (n_j - n_{j-1}) \psi(s_{n_j}) < \beta \right\} = \operatorname{Prob}\left\{ \frac{1}{k} \sum_{j=1}^{k} \psi(R_j) < \beta \right\} = p_k(\beta).$$
If in (1) we let $n \to \infty$ while keeping $k$, $\varepsilon$ and $\delta$ fixed we obtain
$$(2) \qquad p_k(\alpha - \delta) - \frac{1 + \log k}{k \varepsilon^2 \delta} - \frac{\varepsilon}{\delta} \le \liminf_{n \to \infty} \operatorname{Prob}\left\{ \frac{N_n}{n} < \alpha \right\} \le \limsup_{n \to \infty} \operatorname{Prob}\left\{ \frac{N_n}{n} < \alpha \right\} \le p_k(\alpha + \delta) + \frac{1 + \log k}{k \varepsilon^2 \delta} + \frac{\varepsilon}{\delta}.$$
Notice that the above inequality holds for all random variables satisfying the conditions of our theorem. Thus suppose that for some particular sequence $Y_1, Y_2, \ldots$ we can prove that
$$\lim_{n \to \infty} \operatorname{Prob}\left\{ \frac{N_n}{n} < \alpha \right\} = \frac{2}{\pi} \arcsin \alpha^{1/2} = p(\alpha), \qquad 0 \le \alpha < 1.$$

Then, making use of (2) with $\alpha$ replaced by $\alpha - \delta$ and by $\alpha + \delta$, we get at once
$$p(\alpha - \delta) - \frac{1 + \log k}{k \varepsilon^2 \delta} - \frac{\varepsilon}{\delta} \le p_k(\alpha) \le p(\alpha + \delta) + \frac{1 + \log k}{k \varepsilon^2 \delta} + \frac{\varepsilon}{\delta}.$$
Making use of (2) again we are led to
$$p(\alpha - 2\delta) - \frac{2(1 + \log k)}{k \varepsilon^2 \delta} - \frac{2\varepsilon}{\delta} \le \liminf_{n \to \infty} \operatorname{Prob}\left\{ \frac{N_n}{n} < \alpha \right\} \le \limsup_{n \to \infty} \operatorname{Prob}\left\{ \frac{N_n}{n} < \alpha \right\} \le p(\alpha + 2\delta) + \frac{2(1 + \log k)}{k \varepsilon^2 \delta} + \frac{2\varepsilon}{\delta}.$$
Since this inequality holds for all $k$, $\varepsilon$ and $\delta$ we can put, for instance, $\delta = k^{-1/10}$, $\varepsilon = k^{-1/5}$, and obtain, by letting $k \to \infty$,
$$\lim_{n \to \infty} \operatorname{Prob}\left\{ \frac{N_n}{n} < \alpha \right\} = p(\alpha).$$
Use has been made of the fact that $p(\alpha)$ is continuous for all $\alpha$. The invariance principle stated at the beginning of this section has thus been proved.

3. Explicit calculations for a particular sequence of random variables. Having established the invariance principle we could appeal to P. Lévy's result concerning the binomial case and thus complete the proof of the general theorem. We prefer, however, to work with another sequence of independent random variables, because we feel that the calculations are of some independent interest and because they again emphasize the usefulness of integral equations in proving limit theorems.⁴

We consider independent random variables $Y_1, Y_2, \ldots$ such that
$$\operatorname{Prob}\{ Y_j < u \} = \frac{1}{2} \int_{-\infty}^{u} e^{-|y|}\,dy, \qquad j = 1, 2, \ldots,$$
and we set $s_j = Y_1 + \cdots + Y_j$.

⁴ See M. Kac, On the average of a certain Wiener functional and a related limit theorem in calculus of probability, Trans. Amer. Math. Soc. vol. 59 (1946) pp. 401-414.
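The distribution just introduced is the bilateral exponential (Laplace) law with density $e^{-|y|}/2$. It can be sampled as the difference of two independent mean-one exponentials, and its second moment is 2 rather than 1, which is why the paper remarks that $E(Y_j^2) \ne 1$. A quick sketch of both facts (names ours; the moment is checked by quadrature, the sampler by a seeded Monte Carlo):

```python
import math
import random

random.seed(2)

def laplace_sample():
    # the difference of two independent mean-1 exponentials has density e^{-|y|}/2
    return random.expovariate(1.0) - random.expovariate(1.0)

def second_moment(h=0.001, L=40.0):
    # E(Y^2) for the density e^{-|y|}/2; by symmetry this equals
    # integral_0^infty y^2 e^{-y} dy = 2  (midpoint rule on (0, L))
    m = int(L / h)
    return sum(((j + 0.5) * h) ** 2 * math.exp(-(j + 0.5) * h) for j in range(m)) * h

samples = [laplace_sample() for _ in range(20000)]
mean = sum(samples) / len(samples)
```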

It should be noted that although $E(Y_j^2) \ne 1$ we are justified in using the $Y$'s, inasmuch as $N_n$ is independent of the variance of the $Y$'s as long as the variance is different from 0. In fact, $N_n$ remains unaltered if one multiplies all the $Y$'s by the same positive constant.

We are going to calculate
$$\phi_n(u) = E(\exp(u N_n)) = E\left( \exp\left( u \sum_{j=1}^{n} \psi(s_j) \right) \right).$$
Setting
$$p(y) = \frac{1}{2} \exp(-|y|)$$
we have
$$\phi_n(u) = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} \exp\left( u \sum_{j=1}^{n} \psi(y_1 + \cdots + y_j) \right) p(y_1) \cdots p(y_n)\,dy_1 \cdots dy_n,$$
or, introducing the variables
$$s_1 = y_1, \quad s_2 = y_1 + y_2, \quad \ldots, \quad s_n = y_1 + y_2 + \cdots + y_n,$$
$$\phi_n(u) = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} \exp\left( u \sum_{j=1}^{n} \psi(s_j) \right) p(s_1) p(s_2 - s_1) \cdots p(s_n - s_{n-1})\,ds_1\,ds_2 \cdots ds_n.$$
We also introduce the auxiliary functions
$$F_n(u, s_n) = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} \exp\left( u \sum_{j=1}^{n} \psi(s_j) \right) p(s_1) p(s_2 - s_1) \cdots p(s_n - s_{n-1})\,ds_1 \cdots ds_{n-1},$$
with
$$F_1(u, s_1) = \exp(u \psi(s_1)) p(s_1).$$
We have, for $n > 1$,
$$F_n(u, s_n) = \exp(u \psi(s_n)) \int_{-\infty}^{\infty} p(s_n - s_{n-1}) F_{n-1}(u, s_{n-1})\,ds_{n-1}.$$
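Two of the claims above can be checked directly by simulation: $N_n$ is unchanged when all the $Y$'s are multiplied by a positive constant, and $\phi_n(u) = E(\exp(u N_n))$ can be estimated by Monte Carlo. For $n = 2$ a separate closed-form computation of ours (not carried out explicitly in the paper) gives $\phi_2(u) = (3 + 2e^u + 3e^{2u})/8$, which the estimate should reproduce. A sketch (names and parameters ours):

```python
import math
import random

random.seed(3)

def laplace():
    # difference of two independent mean-1 exponentials: density e^{-|y|}/2
    return random.expovariate(1.0) - random.expovariate(1.0)

def walk_positive_count(ys):
    # N_n for one realization: number of positive partial sums
    s, n_pos = 0.0, 0
    for y in ys:
        s += y
        if s > 0:
            n_pos += 1
    return n_pos

# Scale invariance: multiplying every Y by a positive constant leaves N_n unchanged.
for _ in range(200):
    ys = [laplace() for _ in range(10)]
    assert walk_positive_count(ys) == walk_positive_count([3.7 * y for y in ys])

# Monte Carlo estimate of phi_n(u) = E(exp(u * N_n)) for n = 2, u = -0.4
u, trials = -0.4, 20000
est = sum(math.exp(u * walk_positive_count([laplace(), laplace()]))
          for _ in range(trials)) / trials
a = math.exp(u)
exact = (3 + 2 * a + 3 * a * a) / 8  # our closed-form value for phi_2(u)
```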

In more convenient notation,
$$(3) \qquad F_n(u, s) = \exp(u \psi(s)) \int_{-\infty}^{\infty} p(s - t) F_{n-1}(u, t)\,dt,$$
and
$$(4) \qquad \phi_n(u) = \int_{-\infty}^{\infty} F_n(u, s)\,ds.$$
Since
$$|F_n(u, s)| \le \exp(\operatorname{Re} u) \max_{-\infty < t < \infty} |F_{n-1}(u, t)|$$
and
$$|F_1(u, s)| \le \exp(\operatorname{Re} u),$$
we get
$$|F_n(u, s)| \le \exp(n \operatorname{Re} u).$$
Thus the series
$$G(u, s; z) = \sum_{n=1}^{\infty} F_n(u, s) z^{n-1}$$
converges for sufficiently small $|z|$. Using (3), we obtain, almost immediately,
$$G(u, s; z) - \exp(u \psi(s)) p(s) = z \exp(u \psi(s)) \int_{-\infty}^{\infty} p(s - t) G(u, t; z)\,dt,$$
or
$$(5) \qquad \exp(-u \psi(s)) G(u, s; z) - \frac{1}{2} \exp(-|s|) = \frac{z}{2} \int_{-\infty}^{\infty} \exp(-|s - t|) G(u, t; z)\,dt.$$
For $s > 0$ we have
$$(6) \qquad e^{-u} G(u, s; z) - \frac{1}{2} e^{-s} = \frac{z}{2} e^{-s} \int_{-\infty}^{s} e^{t} G(u, t; z)\,dt + \frac{z}{2} e^{s} \int_{s}^{\infty} e^{-t} G(u, t; z)\,dt.$$
It is easily seen that, for sufficiently small $|z|$, $G(u, t; z)$ is an absolutely integrable function of $t$.
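The recursion (3) can also be iterated numerically. The convolution with $p(s - t) = e^{-|s-t|}/2$ splits into the same two one-sided exponential integrals as in (6), which makes it computable with cumulative trapezoid sums. The sketch below (grid parameters and names ours) recovers $\phi_1(u) = (1 + e^u)/2$, and for $\phi_2(u)$ the value $(3 + 2e^u + 3e^{2u})/8$ obtained by a separate closed-form computation of ours:

```python
import math

u = -0.4
L, h = 25.0, 0.005
N = int(round(2 * L / h)) + 1
s_grid = [-L + i * h for i in range(N)]

def psi(x):
    return 1.0 if x > 0 else 0.0

# F_1(u, s) = exp(u psi(s)) p(s),  p(y) = exp(-|y|)/2
F1 = [math.exp(u * psi(x)) * 0.5 * math.exp(-abs(x)) for x in s_grid]

def convolve_with_p(F):
    # integral p(s - t) F(t) dt
    #   = (1/2) e^{-s} int_{-L}^{s} e^{t} F dt + (1/2) e^{s} int_{s}^{L} e^{-t} F dt,
    # the same splitting used in (6); cumulative trapezoid sums over the grid.
    g_plus = [math.exp(t) * f for t, f in zip(s_grid, F)]
    g_minus = [math.exp(-t) * f for t, f in zip(s_grid, F)]
    left = [0.0] * N
    for i in range(1, N):
        left[i] = left[i - 1] + 0.5 * h * (g_plus[i - 1] + g_plus[i])
    right = [0.0] * N
    for i in range(N - 2, -1, -1):
        right[i] = right[i + 1] + 0.5 * h * (g_minus[i] + g_minus[i + 1])
    return [0.5 * (math.exp(-x) * a_ + math.exp(x) * b_)
            for x, a_, b_ in zip(s_grid, left, right)]

def trapz(F):
    return h * (sum(F) - 0.5 * (F[0] + F[-1]))

# recursion (3): F_n = exp(u psi(s)) * (p * F_{n-1})
F2 = [math.exp(u * psi(x)) * c for x, c in zip(s_grid, convolve_with_p(F1))]

a = math.exp(u)
phi1 = trapz(F1)   # equation (4); should be (1 + e^u)/2
phi2 = trapz(F2)   # should be (3 + 2 e^u + 3 e^{2u})/8 (our closed-form value)
```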

Differentiating (6) twice with respect to $s$ we are readily led to
$$\frac{d^2 G}{ds^2} + (z e^u - 1) G = 0, \qquad s > 0,$$
and in the same way we get for $s < 0$
$$\frac{d^2 G}{ds^2} + (z - 1) G = 0.$$
Thus
$$G = A \exp((1 - z e^u)^{1/2} s) + B \exp(-(1 - z e^u)^{1/2} s), \qquad s > 0,$$
$$G = C \exp((1 - z)^{1/2} s) + D \exp(-(1 - z)^{1/2} s), \qquad s < 0,$$
where $A$, $B$, $C$, $D$ are functions of $u$ and $z$. Since $G$ is an integrable function of $s$ we must have $A = D = 0$. Thus
$$(7) \qquad G = B \exp(-(1 - z e^u)^{1/2} s), \qquad s > 0,$$
$$(8) \qquad G = C \exp((1 - z)^{1/2} s), \qquad s < 0.$$
From the absolute integrability of $G$ and the integral equation (5) it follows that $\exp(-u \psi(s)) G(u, s; z)$ is continuous for all $s$ and, in particular, for $s = 0$. This observation yields
$$(9) \qquad B e^{-u} = C.$$
Rewriting (6) in the form
$$e^{-u} G(s) - \frac{1}{2} e^{-s} = \frac{z}{2} e^{-s} \int_{-\infty}^{0} e^{t} G(t)\,dt + \frac{z}{2} e^{-s} \int_{0}^{s} e^{t} G(t)\,dt + \frac{z}{2} e^{s} \int_{s}^{\infty} e^{-t} G(t)\,dt,$$
and substituting expressions (7) and (8), we get after a few elementary transformations
$$1 + \frac{z}{1 + (1 - z)^{1/2}}\,C = \frac{z}{1 - (1 - z e^u)^{1/2}}\,B.$$
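The two conditions on $B$ and $C$ can be verified numerically. Writing $\beta = (1 - ze^u)^{1/2}$ and $\gamma = (1 - z)^{1/2}$, they are solved by $C = 1/(\beta + \gamma)$ and $B = Ce^u$ (an equivalent form of the expressions for $B$ and $C$ given below), and the resulting $G$ then satisfies the integral equation (5) at every $s$. A sketch at one choice of $u$ and $z$ (names ours):

```python
import math

u, z = 0.3, 0.2
a = math.exp(u)
g = math.sqrt(1 - z)        # gamma = (1 - z)^{1/2}
b = math.sqrt(1 - z * a)    # beta  = (1 - z e^u)^{1/2}

# Candidate solution of (9) and the junction relation:
C = 1.0 / (b + g)
B = C * a

# (9):  B e^{-u} = C
assert abs(B * math.exp(-u) - C) < 1e-12
# junction relation:  1 + z C / (1 + gamma) = z B / (1 - beta)
assert abs(1 + z * C / (1 + g) - z * B / (1 - b)) < 1e-12

# Direct check of the integral equation (5) at sample points s:
def G(t):
    return B * math.exp(-b * t) if t > 0 else C * math.exp(g * t)

def lhs(s):
    return math.exp(-u * (1.0 if s > 0 else 0.0)) * G(s) - 0.5 * math.exp(-abs(s))

def rhs(s, Lim=40.0, h=0.001):
    # (z/2) * integral e^{-|s-t|} G(t) dt, midpoint rule on [-Lim, Lim]
    m = int(2 * Lim / h)
    total = 0.0
    for j in range(m):
        t = -Lim + (j + 0.5) * h
        total += math.exp(-abs(s - t)) * G(t)
    return 0.5 * z * total * h

for s in (0.7, -0.7):
    assert abs(lhs(s) - rhs(s)) < 1e-3
```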

This together with (9) yields
$$B = \frac{(1 - z)^{1/2} - (1 - z e^u)^{1/2}}{z(e^u - 1)}\,e^u, \qquad C = \frac{(1 - z)^{1/2} - (1 - z e^u)^{1/2}}{z(e^u - 1)},$$
and thus $G$ is completely determined.⁵ In particular, we get
$$\int_{-\infty}^{\infty} G(u, s; z)\,ds = \frac{(1 - z)^{1/2} - (1 - z e^u)^{1/2}}{z(e^u - 1)} \left[ \frac{e^u}{(1 - z e^u)^{1/2}} + \frac{1}{(1 - z)^{1/2}} \right].$$
On the other hand (see (4))
$$\int_{-\infty}^{\infty} G(u, s; z)\,ds = \sum_{n=1}^{\infty} \phi_n(u) z^{n-1},$$
and hence $\phi_n(u)$ is the coefficient of $z^{n-1}$ in the power series expansion of
$$\frac{(1 - z)^{1/2} - (1 - z e^u)^{1/2}}{z(e^u - 1)} \left[ \frac{e^u}{(1 - z e^u)^{1/2}} + \frac{1}{(1 - z)^{1/2}} \right].$$
A simple computation gives finally
$$E(\exp(u N_n)) = \phi_n(u) = \frac{1}{e^u - 1} \sum_{k+l=n} C_{1/2,k}\,C_{-1/2,l}\,(-1)^{k+l} (1 - e^{ku})(1 + e^{(l+1)u}),$$
where $C_{a,j}$ denotes the binomial coefficient $\binom{a}{j}$. Setting $u = i\xi/n$ we obtain the characteristic function of $N_n/n$. For large $k$ and $l$ we have
$$C_{1/2,k} \sim \frac{(-1)^{k-1}}{2k - 1} \cdot \frac{1}{(\pi k)^{1/2}}, \qquad C_{-1/2,l} \sim \frac{(-1)^{l}}{(\pi l)^{1/2}},$$
and it is easily seen that, in the limit $n \to \infty$, we can use these asymptotic expressions in the sum.

⁵ Having found an explicit expression for $G$ it is not too difficult to find explicit expressions for $F_n(u, s)$ and verify the recursive relation (3). Since this recursive relation together with the formula for $F_1(u, s)$ determines $G$ uniquely, one might substitute the verification process for the derivation of the formula for $G$.
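The coefficient extraction can be checked mechanically: expanding the factors $(1 - z)^{\pm 1/2}$ and $(1 - ze^u)^{\pm 1/2}$ as power series, multiplying them out, and reading off the coefficient of $z^{n-1}$ reproduces the closed-form sum. A sketch (function names ours; we use the algebraically equivalent expanded numerator $ae^{\vphantom{u}}(1-z)^{1/2}(1-az)^{-1/2} - (1-az)^{1/2}(1-z)^{-1/2} + (1-a)$ with $a = e^u$):

```python
import math

def binom(a, k):
    # generalized binomial coefficient C(a, k) = a(a-1)...(a-k+1)/k!
    c = 1.0
    for j in range(k):
        c *= (a - j) / (j + 1)
    return c

def phi_closed(n, u):
    # phi_n(u) = (e^u - 1)^{-1} sum_{k+l=n} C_{1/2,k} C_{-1/2,l} (-1)^{k+l}
    #            * (1 - e^{ku}) (1 + e^{(l+1)u})
    a = math.exp(u)
    total = 0.0
    for k in range(n + 1):
        l = n - k
        total += (binom(0.5, k) * binom(-0.5, l) * (-1) ** (k + l)
                  * (1 - a ** k) * (1 + a ** (l + 1)))
    return total / (a - 1)

def phi_series(n, u):
    # coefficient of z^{n-1} in the generating function, via series multiplication
    a = math.exp(u)
    f1 = [binom(0.5, k) * (-1.0) ** k for k in range(n + 1)]   # (1 - z)^{1/2}
    g1 = [binom(-0.5, k) * (-a) ** k for k in range(n + 1)]    # (1 - a z)^{-1/2}
    f2 = [binom(0.5, k) * (-a) ** k for k in range(n + 1)]     # (1 - a z)^{1/2}
    g2 = [binom(-0.5, k) * (-1.0) ** k for k in range(n + 1)]  # (1 - z)^{-1/2}
    conv1 = sum(f1[k] * g1[n - k] for k in range(n + 1))
    conv2 = sum(f2[k] * g2[n - k] for k in range(n + 1))
    return (a * conv1 - conv2) / (a - 1)

for u in (-0.5, 0.3):
    a = math.exp(u)
    # sanity: phi_1(u) = E(exp(u psi(s_1))) = (1 + e^u)/2
    assert abs(phi_closed(1, u) - (1 + a) / 2) < 1e-12
    for n in range(1, 9):
        assert abs(phi_closed(n, u) - phi_series(n, u)) < 1e-10
```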

Thus
$$\lim_{n \to \infty} E\left( \exp\left( \frac{i\xi N_n}{n} \right) \right) = \lim_{n \to \infty} \frac{1}{\pi (1 - e^{i\xi/n})} \sum_{k=1}^{n-1} \frac{(1 - e^{i\xi k/n})(1 + e^{i\xi(n-k+1)/n})}{(2k - 1)\,(k(n - k))^{1/2}}$$
$$= \frac{1}{2\pi i \xi} \int_{0}^{1} \frac{(e^{i\xi x} - 1)(1 + e^{i\xi(1-x)})}{x\,(x(1 - x))^{1/2}}\,dx = \frac{1}{\pi} \int_{0}^{1} \frac{e^{i\xi x}}{(x(1 - x))^{1/2}}\,dx = \int_{0}^{1} e^{i\xi x}\,d\left( \frac{2}{\pi} \arcsin x^{1/2} \right).$$
We complete the proof by appealing to the continuity theorem for Fourier-Stieltjes transforms.

SYRACUSE UNIVERSITY AND CORNELL UNIVERSITY
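As a numerical check on the final integral (the closed evaluation in terms of a Bessel function is ours, not the paper's): the substitution $x = \sin^2\theta$ turns $(1/\pi)\int_0^1 e^{i\xi x}(x(1-x))^{-1/2}\,dx$ into $(2/\pi)\int_0^{\pi/2} e^{i\xi\sin^2\theta}\,d\theta$, which equals $e^{i\xi/2} J_0(\xi/2)$, the well-known characteristic function of the arc sine distribution:

```python
import cmath
import math

def arcsine_cf(xi, m=20000):
    # (1/pi) * integral_0^1 e^{i xi x} / sqrt(x(1-x)) dx, with x = sin^2(theta):
    # = (2/pi) * integral_0^{pi/2} e^{i xi sin^2(theta)} d theta  (midpoint rule)
    h = (math.pi / 2) / m
    total = sum(cmath.exp(1j * xi * math.sin((j + 0.5) * h) ** 2) for j in range(m))
    return (2 / math.pi) * h * total

def bessel_j0(x, terms=40):
    # J_0(x) = sum_{m >= 0} (-1)^m (x/2)^{2m} / (m!)^2
    s, t = 0.0, 1.0
    for m in range(terms):
        s += t
        t *= -((x / 2) ** 2) / ((m + 1) ** 2)
    return s

assert abs(arcsine_cf(0.0) - 1.0) < 1e-9   # a characteristic function equals 1 at 0
for xi in (1.0, 3.0):
    assert abs(arcsine_cf(xi) - cmath.exp(1j * xi / 2) * bessel_j0(xi / 2)) < 1e-6
```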