Bull. Korean Math. Soc. 38 (2001), No. 4, pp. 763-772

ON CONVERGENCE OF SERIES OF INDEPENDENT RANDOM VARIABLES

Soo Hak Sung and Andrei I. Volodin

Abstract. The rate of convergence for an almost surely convergent series $S_n = \sum_{i=1}^n X_i$ of independent random variables is studied in this paper. More specifically, when $S_n$ converges almost surely to a random variable $S$, the tail series $T_n \equiv S - S_{n-1} = \sum_{i=n}^\infty X_i$ is a well-defined sequence of random variables with $T_n \to 0$ almost surely. Conditions are provided so that, for a given positive sequence $\{b_n, n \ge 1\}$, the limit law $\sup_{k \ge n} |T_k|/b_n \stackrel{P}{\to} 0$ holds. This result generalizes a result of Nam and Rosalsky [4].

1. Introduction

Throughout this paper, $\{X_n, n \ge 1\}$ is a sequence of independent random variables defined on a probability space $(\Omega, \mathcal{F}, P)$. As usual, their partial sums will be denoted by $S_n = \sum_{i=1}^n X_i$, $n \ge 1$. If $S_n$ converges almost surely (a.s.) to a random variable $S$, then (with $S_0 = 0$)

$$T_n \equiv S - S_{n-1} = \sum_{i=n}^\infty X_i, \quad n \ge 1,$$

is a well-defined sequence of random variables (referred to as the tail series) with

(1.1)  $T_n \to 0$ a.s.

Received April 6, 2001.
2000 Mathematics Subject Classification: 60F05, 60F15.
Key words and phrases: convergence in probability, tail series, independent random variables, weak law of large numbers, almost sure convergence.
The first author was supported by Korea Research Foundation Grant (KRF-2000-015-DP0049).
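To make the tail-series definition above concrete, here is a minimal numerical sketch (our illustration, not part of the paper; the absolutely summable sequence $x_i = (-1)^i/2^i$ is a hypothetical stand-in for one realization of $\{X_i\}$):

```python
# Tail series T_n = S - S_{n-1} = sum_{i=n}^infinity x_i for the
# (hypothetical) absolutely summable sequence x_i = (-1)**i / 2**i.

def x(i):
    return (-1) ** i / 2 ** i

LIMIT = 200  # truncation point; the neglected remainder is ~2**-200

def partial_sum(n):
    """S_n = sum_{i=1}^n x_i (with S_0 = 0)."""
    return sum(x(i) for i in range(1, n + 1))

S = partial_sum(LIMIT)  # numerically indistinguishable from the limit S

def tail(n):
    """T_n = S - S_{n-1}, i.e. sum_{i=n}^infinity x_i."""
    return S - partial_sum(n - 1)

# Here |T_n| = (2/3) * 2**-n, so the tails vanish geometrically.
for n in (1, 5, 10):
    print(n, tail(n))
```

Note that `tail(1)` recovers the full sum $S = -1/3$, and $|T_n|$ halves at each step, mirroring in a deterministic setting the a.s. convergence $T_n \to 0$.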
In this paper, we shall be concerned with the rate at which $S_n$ converges to $S$ or, equivalently, at which the tail series $T_n$ converges to 0. Recalling that (1.1) is equivalent to

$$\sup_{k \ge n} |T_k| \stackrel{P}{\to} 0,$$

Nam and Rosalsky [4] proved the limit law

(1.2)  $\sup_{k \ge n} \dfrac{|T_k|}{b_n} \stackrel{P}{\to} 0$

if $\{X_n, n \ge 1\}$ is a sequence of independent random variables with $EX_n = 0$, $n \ge 1$, and $\{b_n, n \ge 1\}$ is a sequence of positive constants such that

(1.3)  $\sum_{i=n}^\infty E|X_i|^p = o(b_n^p)$ for some $1 < p \le 2$.

Rosalsky and Rosenblatt [7] extended Nam and Rosalsky's result to Banach spaces of Rademacher type $p$, and Rosalsky and Rosenblatt [8] extended it to the setting of martingale differences.

The main purpose of this paper is to generalize Nam and Rosalsky's result. More specifically, we will provide conditions more general than (1.3) under which the limit law (1.2) holds.

2. Preliminary lemmas

Some lemmas are needed to establish the main results. The first lemma is due to von Bahr and Esseen [1].

Lemma 1 (von Bahr and Esseen [1]). Let $X_1, \dots, X_n$ be random variables such that $E\{X_{m+1} \mid S_m\} = 0$ for $0 \le m \le n-1$, where $S_0 = 0$ and $S_m = \sum_{i=1}^m X_i$ for $1 \le m \le n$. Then

$$E|S_n|^p \le 2 \sum_{i=1}^n E|X_i|^p \quad \text{for all } 1 \le p \le 2.$$

Note that Lemma 1 holds when $X_1, \dots, X_n$ are independent random variables with $EX_m = 0$ for $1 \le m \le n$.

Nam, Rosalsky, and Volodin [6] proved the following maximal inequality for the tail series using Etemadi's [3] maximal inequality for the partial sums. Furthermore, the maximal inequalities for the partial sums and the tail series hold for Banach space valued random variables.
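The moment inequality of Lemma 1 can be sanity-checked by simulation. The sketch below is our illustration (not from the paper), with i.i.d. mean-zero uniform variables and $p = 3/2$ chosen arbitrarily:

```python
# Monte Carlo check of Lemma 1 (von Bahr-Esseen inequality):
# E|S_n|^p <= 2 * sum_{i=1}^n E|X_i|^p for independent mean-zero X_i,
# illustrated with X_i ~ Uniform[-1, 1], n = 5, p = 1.5.
import numpy as np

rng = np.random.default_rng(0)
n, p, trials = 5, 1.5, 200_000

X = rng.uniform(-1.0, 1.0, size=(trials, n))
lhs = np.mean(np.abs(X.sum(axis=1)) ** p)  # empirical E|S_n|^p

# For Uniform[-1, 1], E|X_i|^p = integral_0^1 x^p dx = 1/(p + 1).
rhs = 2 * n / (p + 1)  # = 4.0, the von Bahr-Esseen upper bound

print(lhs, rhs)
```

With this setup the empirical left side comes out well below the bound of 4, as the lemma guarantees.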
Lemma 2 (Nam, Rosalsky, and Volodin [6]). Let $\{X_n, n \ge 1\}$ be a sequence of independent random variables such that $\sum_{n=1}^\infty X_n$ converges a.s. Then, setting $T_n = \sum_{i=n}^\infty X_i$, $n \ge 1$,

$$P\Big\{ \sup_{k \ge n} |T_k| > \epsilon \Big\} \le 4 \sup_{k \ge n} P\Big\{ |T_k| > \frac{\epsilon}{4} \Big\}, \quad \epsilon > 0.$$

Lemma 3. Let $\{X_i, n \le i \le N\}$ be a sequence of independent random variables with $EX_i = 0$, $n \le i \le N$. Let $\{g_i(x), n \le i \le N\}$ be a sequence of functions defined on $[0, \infty)$ such that

(2.1)  $0 \le g_i(0) \le g_i(x)$, $\dfrac{g_i(x)}{x} \uparrow$, $\dfrac{g_i(x)}{x^p} \downarrow$ on $(0, \infty)$, $n \le i \le N$,

for some $1 \le p \le 2$. Let $\{b_i, n \le i \le N\}$ be a sequence of positive constants. Then for $\epsilon > 0$

$$P\Big\{ \Big| \sum_{i=n}^N \frac{X_i}{b_i} \Big| > \epsilon \Big\} \le \Big( \frac{2^{2p+1}}{\epsilon^p} + \frac{2^2}{\epsilon} \Big) \sum_{i=n}^N \frac{E g_i(|X_i|)}{g_i(b_i)}.$$

Proof. Define $X_i' = X_i I(|X_i| \le b_i)$ and $X_i'' = X_i I(|X_i| > b_i)$ for $n \le i \le N$. Noting that $EX_i = 0$ for $n \le i \le N$, it follows that $X_i = X_i' - EX_i' + X_i'' - EX_i''$. By using Markov's inequality and Lemma 1, we have

(2.2)

$$P\Big\{ \Big| \sum_{i=n}^N \frac{X_i}{b_i} \Big| > \epsilon \Big\} \le P\Big\{ \Big| \sum_{i=n}^N \frac{X_i' - EX_i'}{b_i} \Big| > \frac{\epsilon}{2} \Big\} + P\Big\{ \Big| \sum_{i=n}^N \frac{X_i'' - EX_i''}{b_i} \Big| > \frac{\epsilon}{2} \Big\}$$
$$\le \frac{2^p}{\epsilon^p} E\Big| \sum_{i=n}^N \frac{X_i' - EX_i'}{b_i} \Big|^p + \frac{2}{\epsilon} E\Big| \sum_{i=n}^N \frac{X_i'' - EX_i''}{b_i} \Big|$$
$$\le \frac{2^{p+1}}{\epsilon^p} \sum_{i=n}^N \frac{E|X_i' - EX_i'|^p}{b_i^p} + \frac{2}{\epsilon} \sum_{i=n}^N \frac{E|X_i'' - EX_i''|}{b_i}$$
$$\le \frac{2^{2p+1}}{\epsilon^p} \sum_{i=n}^N \frac{E|X_i'|^p}{b_i^p} + \frac{2^2}{\epsilon} \sum_{i=n}^N \frac{E|X_i''|}{b_i}.$$
It follows from (2.1) that on the set $\{x : x \le b_i\}$ we have $x^p/b_i^p \le g_i(x)/g_i(b_i)$. Thus we have

(2.3)  $\sum_{i=n}^N \dfrac{E|X_i'|^p}{b_i^p} \le \sum_{i=n}^N \dfrac{E g_i(|X_i'|)}{g_i(b_i)} \le \sum_{i=n}^N \dfrac{E g_i(|X_i|)}{g_i(b_i)}.$

On the set $\{x : x > b_i\}$, we have $x/b_i \le g_i(x)/g_i(b_i)$. Thus we get

(2.4)  $\sum_{i=n}^N \dfrac{E|X_i''|}{b_i} \le \sum_{i=n}^N \dfrac{E g_i(|X_i''|)}{g_i(b_i)} \le \sum_{i=n}^N \dfrac{E g_i(|X_i|)}{g_i(b_i)}.$

The result follows from (2.2), (2.3), and (2.4).

Remark. When $p = 1$, Lemma 3 holds without the independence condition. However, the independence condition is necessary if $1 < p \le 2$; a counterexample can easily be obtained.

The following lemma was proved by Nam and Rosalsky [5].

Lemma 4. Let $\{X_n, n \ge 1\}$ be a sequence of independent and symmetric random variables such that $\sum_{n=1}^\infty X_n$ converges a.s. Then the tail series $\{T_n = \sum_{i=n}^\infty X_i, n \ge 1\}$ is a well-defined sequence of random variables, and for every $\epsilon > 0$ and $n \ge 1$ the inequality

$$P\Big\{ \sup_{k \ge n} |T_k| > \epsilon \Big\} \le 2 P\{ |T_n| > \epsilon \}$$

holds.

The following lemma gives conditions so that the tail series weak law of large numbers

(2.5)  $\dfrac{T_n}{b_n} \stackrel{P}{\to} 0$

and the limit law

(2.6)  $\sup_{k \ge n} \dfrac{|T_k|}{b_n} \stackrel{P}{\to} 0$

are equivalent, where $T_n = \sum_{i=n}^\infty X_i$, $n \ge 1$.
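The truncation bound of Lemma 3 can likewise be sanity-checked numerically. The sketch below is our illustration (not from the paper), taking $X_i \sim \mathrm{Uniform}[-1,1]$, $g_i(x) = x^2$ (so $p = 2$: $g_i(x)/x$ is nondecreasing and $g_i(x)/x^2$ is constant on $(0,\infty)$), and $b_i = 1$:

```python
# Monte Carlo check of Lemma 3's bound for a concrete case:
# X_i ~ Uniform[-1, 1] independent (EX_i = 0), g_i(x) = x**2, b_i = 1,
# p = 2, indices i = 1..5, eps = 2.
import numpy as np

rng = np.random.default_rng(1)
N, p, eps, trials = 5, 2, 2.0, 200_000

X = rng.uniform(-1.0, 1.0, size=(trials, N))
lhs = np.mean(np.abs(X.sum(axis=1)) > eps)  # P{|sum_i X_i / b_i| > eps}

# sum_i E g_i(|X_i|) / g_i(b_i) = N * E[X_1**2] = N / 3
rhs = (2 ** (2 * p + 1) / eps ** p + 2 ** 2 / eps) * N / 3

print(lhs, rhs)
```

The bound is far from tight here (an upper bound of about 16.7 against an exceedance probability near 0.1), which is to be expected: the constants in Lemma 3 are chosen for generality, not sharpness.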
Lemma 5. Let $\{X_n, n \ge 1\}$ be a sequence of independent random variables such that $\sum_{n=1}^\infty X_n$ converges a.s., and let $\{b_n, n \ge 1\}$ be a sequence of positive constants. Assume that either $\{b_n, n \ge 1\}$ is nonincreasing or $\{X_n, n \ge 1\}$ is a sequence of symmetric random variables. Then (2.5) and (2.6) are equivalent.

Proof. Since (2.6) clearly implies (2.5), it suffices to show that (2.5) implies (2.6). When $\{b_n\}$ is nonincreasing, Nam and Rosalsky [5] proved that (2.5) implies (2.6). Now let $\{X_n\}$ be a sequence of independent and symmetric random variables. By Lemma 4, we have

$$P\Big\{ \sup_{k \ge n} \frac{|T_k|}{b_n} > \epsilon \Big\} \le 2 P\Big\{ \frac{|T_n|}{b_n} > \epsilon \Big\}.$$

Hence (2.5) implies (2.6).

3. Main results

With the preliminaries accounted for, the main results of this paper may be established. Theorem 1 gives conditions under which the series $\sum_{n=1}^\infty X_n$ converges a.s.

Theorem 1. Let $\{X_n, n \ge 1\}$ be a sequence of independent random variables with $EX_n = 0$, $n \ge 1$, and let $\{b_n, n \ge 1\}$ be a sequence of positive constants. Let $\{g_n(x), n \ge 1\}$ be a sequence of functions defined on $[0, \infty)$ such that

(3.1)  $0 \le g_n(0) \le g_n(x)$ and $0 < g_n(x) \uparrow$ as $n \uparrow \infty$ for each $x > 0$,

and

(3.2)  $\dfrac{g_n(x)}{x} \uparrow$, $\dfrac{g_n(x)}{x^p} \downarrow$ on $(0, \infty)$, $n \ge 1$,

for some $1 < p \le 2$. If

(3.3)  $\sum_{n=1}^\infty E g_n(|X_n|) < \infty$,

then $\{T_n, n \ge 1\}$ is a well-defined sequence of random variables, and for all $n \ge 1$ and all $\epsilon > 0$,

$$P\Big\{ \sup_{k \ge n} \frac{|T_k|}{b_n} > \epsilon \Big\} \le \Big( \frac{2^{4p+3}}{\epsilon^p} + \frac{2^6}{\epsilon} \Big) \sum_{i=n}^\infty \frac{E g_i(|X_i|)}{g_i(b_n)}.$$
Proof. First we show that $T_n$ is a well-defined random variable, i.e., that $\sum_{n=1}^\infty X_n$ converges a.s. Let $X_n' = X_n I(|X_n| \le 1)$, $n \ge 1$. Noting that (3.2) implies that $g_n(x)$ is nondecreasing on $(0, \infty)$, $n \ge 1$, we have

$$\sum_{n=1}^\infty P\{ |X_n| > 1 \} \le \sum_{n=1}^\infty P\{ g_n(|X_n|) \ge g_n(1) \} \le \sum_{n=1}^\infty \frac{E g_n(|X_n|)}{g_n(1)} \le \frac{1}{g_1(1)} \sum_{n=1}^\infty E g_n(|X_n|) < \infty$$

by (3.1) and (3.3). It follows from (3.2) that on the set $\{x : x \le 1\}$ we have $x^2 \le x^p \le g_n(x)/g_n(1)$. Thus we have

$$\sum_{n=1}^\infty \mathrm{Var}(X_n') \le \sum_{n=1}^\infty E|X_n'|^2 \le \sum_{n=1}^\infty \frac{E g_n(|X_n'|)}{g_n(1)} \le \frac{1}{g_1(1)} \sum_{n=1}^\infty E g_n(|X_n|) < \infty.$$

On the set $\{x : x > 1\}$ we have $x \le g_n(x)/g_n(1)$. Using this and the fact that $EX_n = 0$, we get

$$\sum_{n=1}^\infty |EX_n'| = \sum_{n=1}^\infty |E X_n I(|X_n| > 1)| \le \sum_{n=1}^\infty E|X_n| I(|X_n| > 1) \le \sum_{n=1}^\infty \frac{E g_n(|X_n| I(|X_n| > 1))}{g_n(1)} \le \frac{1}{g_1(1)} \sum_{n=1}^\infty E g_n(|X_n|) < \infty.$$

Thus $\sum_{n=1}^\infty X_n$ converges a.s. by Kolmogorov's three-series theorem.
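The three bounding series in the argument above can be evaluated numerically in a special case. The sketch below is our illustration (not part of the paper): take $X_n = Y_n/2^n$ with $Y_n$ i.i.d. $N(0,1)$ and truncation level 1; each of the three series is then finite, as the proof requires.

```python
# Numeric check of the three series from the proof for X_n = Y_n / 2**n,
# Y_n i.i.d. N(0, 1), with the truncation X_n' = X_n * I(|X_n| <= 1).
import math

def tail_prob(n):
    # P{|X_n| > 1} = P{|Y_n| > 2**n} = erfc(2**n / sqrt(2))
    return math.erfc(2 ** n / math.sqrt(2.0))

# Partial sums over n = 1..30; the terms decay so fast that these are
# numerically indistinguishable from the full series.
s1 = sum(tail_prob(n) for n in range(1, 31))  # sum_n P{|X_n| > 1}
# Var(X_n') <= E X_n**2 = 4**-n, so s2 dominates sum_n Var(X_n'):
s2 = sum(4.0 ** -n for n in range(1, 31))
# sum_n |E X_n'| = 0: the truncated normal is still symmetric.
s3 = 0.0

print(s1, s2, s3)
```

All three quantities are finite (roughly 0.046, 1/3, and 0), so Kolmogorov's three-series theorem applies in this special case, in line with the general argument.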
Now we find an upper bound for $P\{\sup_{k \ge n} |T_k|/b_n > \epsilon\}$. Observe that, by Theorem 8.3 of Chow and Teicher [2], for $\epsilon > 0$

$$P\Big\{ \frac{|T_k|}{b_n} > \frac{\epsilon}{4} \Big\} = P\Big\{ \lim_{M \to \infty} \Big| \sum_{i=k}^M X_i \Big| > \frac{\epsilon b_n}{4} \Big\} \le \liminf_{M \to \infty} P\Big\{ \Big| \sum_{i=k}^M X_i \Big| > \frac{\epsilon b_n}{4} \Big\}.$$

Thus, it follows by Lemma 2 and Lemma 3 with $b_i = b_n$, $k \le i \le M$, that

$$P\Big\{ \sup_{k \ge n} \frac{|T_k|}{b_n} > \epsilon \Big\} \le 4 \sup_{k \ge n} P\Big\{ \frac{|T_k|}{b_n} > \frac{\epsilon}{4} \Big\}$$
$$\le 4 \sup_{k \ge n} \liminf_{M \to \infty} P\Big\{ \Big| \sum_{i=k}^M \frac{X_i}{b_n} \Big| > \frac{\epsilon}{4} \Big\}$$
$$\le 4 \sup_{k \ge n} \liminf_{M \to \infty} \Big( \frac{2^{2p+1}}{(\epsilon/4)^p} + \frac{2^2}{\epsilon/4} \Big) \sum_{i=k}^M \frac{E g_i(|X_i|)}{g_i(b_n)}$$
$$= \Big( \frac{2^{4p+3}}{\epsilon^p} + \frac{2^6}{\epsilon} \Big) \sum_{i=n}^\infty \frac{E g_i(|X_i|)}{g_i(b_n)}.$$

Hence the proof is completed.

Theorem 2. Let $\{X_n, n \ge 1\}$ be a sequence of independent random variables with $EX_n = 0$, $n \ge 1$, and let $\{g_n(x), n \ge 1\}$ be a sequence of functions defined on $[0, \infty)$ satisfying (3.1) and (3.2). Let $\{b_n, n \ge 1\}$ be a sequence of positive constants such that

(3.4)  $\sum_{i=n}^\infty E g_i(|X_i|) = o(g_n(b_n))$.

Then $\{T_n, n \ge 1\}$ is a well-defined sequence of random variables obeying (2.6).

Proof. Since (3.4) implies (3.3), $T_n$ is well-defined by Theorem 1. Also, since $g_i(b_n) \ge g_n(b_n)$ for $i \ge n$ by (3.1), Theorem 1 implies that

$$P\Big\{ \sup_{k \ge n} \frac{|T_k|}{b_n} > \epsilon \Big\} \le \Big( \frac{2^{4p+3}}{\epsilon^p} + \frac{2^6}{\epsilon} \Big) \sum_{i=n}^\infty \frac{E g_i(|X_i|)}{g_i(b_n)} \le \Big( \frac{2^{4p+3}}{\epsilon^p} + \frac{2^6}{\epsilon} \Big) \frac{1}{g_n(b_n)} \sum_{i=n}^\infty E g_i(|X_i|) = o(1).$$
Thus the proof is completed.

Nam and Rosalsky [4] proved the following corollary, which is the special case $g_n(x) = x^p$, $x \ge 0$, $n \ge 1$, of Theorem 2.

Corollary. Let $\{X_n, n \ge 1\}$ be a sequence of independent random variables with $EX_n = 0$, $n \ge 1$, and let $\{b_n, n \ge 1\}$ be a sequence of positive constants such that

(3.5)  $\sum_{i=n}^\infty E|X_i|^p = o(b_n^p)$ for some $1 < p \le 2$.

Then $\{T_n, n \ge 1\}$ is a well-defined sequence of random variables obeying (2.6).

The following example illustrates the sharpness of Theorem 2. It shows that if $o$ in (3.4) is replaced by $O$, then Theorem 2 can fail. It also shows that if $p > 2$, then Theorem 2 can fail.

Example. Let $\{Y_n, n \ge 1\}$ be a sequence of independent and identically distributed $N(0, 1)$ random variables, and let $\{a_n, n \ge 1\}$ be a sequence of positive constants such that $\sum_{n=1}^\infty a_n^2 < \infty$. Let $X_n = a_n Y_n$, $b_n = \big( \sum_{i=n}^\infty a_i^2 \big)^{1/2}$, $n \ge 1$, and $g_n(x) = x^p$ ($p > 1$), $x \ge 0$, $n \ge 1$. Then $\{X_n, n \ge 1\}$ is a sequence of independent and symmetric random variables with $EX_n = 0$, $n \ge 1$. Since $\sum_{n=1}^\infty \mathrm{Var}(X_n) = \sum_{n=1}^\infty a_n^2 < \infty$, $T_n$ is well-defined.

To show that the limit law $\sup_{k \ge n} |T_k|/b_n \stackrel{P}{\to} 0$ does not hold, it is enough, by Lemma 5, to show that the tail series weak law of large numbers (2.5) does not hold. To do this, let $S_{n,k} = \sum_{i=n}^k X_i$, $k \ge n$, and let $\phi_{n,k}(t)$ be the characteristic function of $S_{n,k}$. Then $S_{n,k} \sim N(0, \sum_{i=n}^k a_i^2)$, and so

$$\phi_{n,k}(t) = \exp\Big\{ -\frac{1}{2} t^2 \sum_{i=n}^k a_i^2 \Big\} \to \exp\Big\{ -\frac{1}{2} t^2 \sum_{i=n}^\infty a_i^2 \Big\} \quad \text{as } k \to \infty.$$

Since $\lim_{k \to \infty} \phi_{n,k}(t)$ is the characteristic function of $N(0, \sum_{i=n}^\infty a_i^2)$, it follows by the Lévy continuity theorem that $T_n \sim N(0, \sum_{i=n}^\infty a_i^2)$. Thus $T_n/b_n$ has a standard normal distribution, which implies that the tail series weak law of large numbers does not hold.

Now let $a_n = 1/2^n$, $n \ge 1$. Note that

$$\sum_{i=n}^\infty E g_i(|X_i|) = E|Y_1|^p \sum_{i=n}^\infty \frac{1}{2^{ip}} = \frac{C_1}{2^{np}} \quad \text{and} \quad g_n(b_n) = \Big( \sum_{i=n}^\infty \frac{1}{2^{2i}} \Big)^{p/2} = \frac{C_2}{2^{np}}$$

for some constants $C_1 > 0$ and $C_2 > 0$. Thus $\sum_{i=n}^\infty E g_i(|X_i|) = O(g_n(b_n))$ and, as shown before, $T_n$ is well-defined, but the limit law $\sup_{k \ge n} |T_k|/b_n \stackrel{P}{\to} 0$ does not hold.

On the other hand, if we take $a_n = 1/n$, $n \ge 1$, then

$$\sum_{i=n}^\infty E g_i(|X_i|) = E|Y_1|^p \sum_{i=n}^\infty \frac{1}{i^p} \le \frac{C_3}{n^{p-1}} \quad \text{and} \quad g_n(b_n) = \Big( \sum_{i=n}^\infty \frac{1}{i^2} \Big)^{p/2} \ge \frac{C_4}{n^{p/2}}$$

for some constants $C_3 > 0$ and $C_4 > 0$. Thus, when $p > 2$, the limit law $\sup_{k \ge n} |T_k|/b_n \stackrel{P}{\to} 0$ does not hold although (3.4) holds and $T_n$ is well-defined.

References

[1] B. von Bahr and C.-G. Esseen, Inequalities for the r-th absolute moment of a sum of random variables, 1 ≤ r ≤ 2, Ann. Math. Statist. 36 (1965), 299-303.
[2] Y. S. Chow and H. Teicher, Probability Theory: Independence, Interchangeability, Martingales, 2nd ed., Springer-Verlag, New York, 1988.
[3] N. Etemadi, On some classical results in probability theory, Sankhyā Ser. A 47 (1985), 215-221.
[4] E. Nam and A. Rosalsky, On the rate of convergence of series of random variables, Teor. Ĭmovīrnost. ta Mat. Statist. 52 (1995), 120-131 (in Ukrainian); English translation in Theor. Probab. Math. Statist. 52 (1996), 129-140.
[5] E. Nam and A. Rosalsky, On the convergence rate of series of independent random variables, in: E. Brunner and M. Denker (eds.), Research Developments in Probability and Statistics: Festschrift in Honor of Madan L. Puri on the Occasion of his 65th Birthday, pp. 33-44, VSP International Science Publishers, Utrecht, The Netherlands, 1996.
[6] E. Nam, A. Rosalsky, and A. Volodin, On convergence of series of independent random elements in Banach spaces, Stochastic Anal. Appl. 17 (1999), 85-98.
[7] A. Rosalsky and J. Rosenblatt, On the rate of convergence of series of Banach space valued random elements, Nonlinear Anal. 30 (1997), 4237-4248.
[8] A. Rosalsky and J. Rosenblatt, On convergence of series of random variables with applications to martingale convergence and to convergence of series with orthogonal summands, Stochastic Anal. Appl. 16 (1998), 553-566.
[9] W. F. Stout, Almost Sure Convergence, Academic Press, New York, 1974.

Soo Hak Sung, Department of Applied Mathematics, Paichai University, Taejon 302-735, Korea
E-mail: sungsh@mail.paichai.ac.kr

Andrei I. Volodin, Department of Mathematics and Statistics, University of Regina, Regina, Saskatchewan, Canada
E-mail: volodin@math.uregina.ca