A BERRY–ESSÉEN BOUND FOR STUDENT'S STATISTIC IN THE NON-I.I.D. CASE


V. Bentkus, M. Bloznelis, F. Götze

Abstract. We establish a Berry–Esséen bound for Student's statistic for independent (non-identically) distributed random variables. The bound is very general and sharp. In particular, it implies an estimate analogous to the classical Berry–Esséen bound. In the i.i.d. case it also yields the best known sufficient conditions for the Central Limit Theorem for studentized sums. For non-i.i.d. random variables the bound shows that the Lindeberg condition is sufficient for the Central Limit Theorem for studentized sums.

1. Introduction and Results

Let $X_1, \dots, X_N$ be independent random variables. Let $t_N = \sqrt{N}\,\overline{X}/\hat\sigma$ denote the Student statistic, where

$$\overline{X} = \frac 1N\,(X_1 + \dots + X_N), \qquad \hat\sigma^2 = \frac 1{N-1}\sum_{i=1}^N (X_i - \overline{X})^2.$$

We shall investigate the rate of the normal approximation of $t_N$ when the random variables $X_1, \dots, X_N$ are non-identically distributed. Define

$$\delta_N = \sup_x \bigl| P(t_N < x) - \Phi(x) \bigr|,$$

where $\Phi(x)$ is the standard normal distribution function. Denote

$$B_N^2 = \sum_{i=1}^N \operatorname{Var} X_i \qquad \text{and} \qquad L_N = B_N^{-3}\sum_{i=1}^N E|X_i|^3.$$

Mathematics Subject Classification. Primary 62E20; secondary 60F05.
Key words and phrases. Student's statistic, Berry–Esséen bound, non-identically distributed random variables, convergence rate, Central Limit Theorem, self-normalized sums.
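To make the objects above concrete, here is a small numerical sketch (ours, not part of the paper): it computes $t_N$ for a sample and estimates $\delta_N$ by Monte Carlo for one non-identically distributed row. The function names and the choice of uniformly distributed summands with growing scales are illustrative assumptions.

```python
import math
import numpy as np

def student_t(x):
    # t_N = sqrt(N) * sample mean / sigma_hat, where sigma_hat^2 is the
    # unbiased sample variance (1/(N-1)) * sum (X_i - mean)^2
    x = np.asarray(x, dtype=float)
    return math.sqrt(x.size) * x.mean() / x.std(ddof=1)

def normal_cdf(x):
    # standard normal distribution function Phi
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def delta_estimate(sampler, n, reps=4000, seed=1):
    # Monte Carlo estimate of delta_N = sup_x |P(t_N < x) - Phi(x)|,
    # evaluated at the jump points of the empirical distribution function
    rng = np.random.default_rng(seed)
    ts = np.sort([student_t(sampler(rng, n)) for _ in range(reps)])
    phis = np.array([normal_cdf(t) for t in ts])
    upper = np.arange(1, reps + 1) / reps
    lower = np.arange(0, reps) / reps
    return float(np.maximum(np.abs(upper - phis), np.abs(lower - phis)).max())

# a non-identically distributed row: X_i uniform on [-sqrt(i), sqrt(i)]
def uniform_row(rng, n):
    return rng.uniform(-1.0, 1.0, size=n) * np.sqrt(np.arange(1, n + 1))
```

For rows like this one, which satisfy the Lindeberg condition, the estimated sup-distance is already small for moderate $N$, in line with Theorem 1.1 below.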

Theorem 1.1. Let $X_1, \dots, X_N$ have mean zero and $B_N > 0$. Then there exists an absolute constant $c > 0$ such that

$$\delta_N \le c\,L_N. \qquad (1.1)$$

Inequality (1.1) extends to studentized sums the classical bound of Esséen (1945):

$$\sup_x \bigl| P(X_1 + \dots + X_N < x B_N) - \Phi(x) \bigr| \le c\,L_N.$$

Theorem 1.1 is a consequence of a general result formulated as Theorem 1.2. For a given number $V > 0$, define the truncated random variables

$$Z_i = V^{-1} X_i\, \mathbb{I}\{X_i^2 \le V^2\}, \qquad 1 \le i \le N, \qquad (1.2)$$

where $\mathbb{I}\{A\}$ denotes the indicator function of the event $A$. Denote

$$M^2 = \sum_{i=1}^N \operatorname{Var} Z_i, \qquad \Pi = \sum_{i=1}^N P(X_i^2 > V^2), \qquad \Lambda = M^{-3}\sum_{i=1}^N E|Z_i|^3$$

and

$$\Upsilon_0 = M^{-1}\max_{1 \le i \le N} |E Z_i|, \qquad \Upsilon_1 = M^{-1}\sum_{i=1}^N |E Z_i|, \qquad \Upsilon_2 = M^{-2}\sum_{i=1}^N (E Z_i)^2.$$

Theorem 1.2. Let $X_1, \dots, X_N$ be arbitrary independent random variables. Then there exists an absolute constant $c > 0$ such that, for any $V > 0$,

$$\delta_N \le c\Pi + c\Upsilon_0 + c\Upsilon_2 + c\Lambda. \qquad (1.3)$$

Example 3.1 in the Appendix shows that the term $\Upsilon_2$ in the bound (1.3) is optimal.

Corollary 1.1. Let the independent random variables $X_1, \dots, X_N$ be arbitrary. Then there exists an absolute constant $c > 0$ such that, for any $V > 0$,

$$\delta_N \le c\Pi + c\Upsilon_1 + c\Lambda.$$

Let $a$ denote the largest solution of the equation

$$a^2 = E X_1^2\, \mathbb{I}\{X_1^2 \le N a^2\}. \qquad (1.4)$$

Theorem 1.2 yields the following Berry–Esséen bound for the Student statistic in the i.i.d. case.
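For discrete laws, the quantities entering Theorem 1.2 can be evaluated exactly. The sketch below is ours and assumes the definitions of $\Pi$, $\Lambda$, $\Upsilon_0$, $\Upsilon_1$, $\Upsilon_2$ as stated above (reconstructed from the garbled source); it computes them for a given truncation level $V$.

```python
import numpy as np

def truncation_terms(laws, V):
    # laws: list of (values, probabilities) pairs for independent discrete X_i;
    # Z_i = V^{-1} X_i 1{X_i^2 <= V^2}
    tail = var_sum = abs_sum = sq_sum = third = 0.0
    max_abs = 0.0
    for values, probs in laws:
        v = np.asarray(values, dtype=float)
        p = np.asarray(probs, dtype=float)
        z = np.where(v**2 <= V**2, v / V, 0.0)
        ez = float(p @ z)
        tail += float(p[v**2 > V**2].sum())     # contribution to Pi
        var_sum += float(p @ z**2) - ez**2      # contribution to M^2
        abs_sum += abs(ez)
        sq_sum += ez**2
        third += float(p @ np.abs(z)**3)
        max_abs = max(max_abs, abs(ez))
    M = var_sum ** 0.5
    return {"Pi": tail, "Lambda": third / M**3, "Ups0": max_abs / M,
            "Ups1": abs_sum / M, "Ups2": sq_sum / M**2}

# N = 100 symmetric two-point laws X_i = +-1; with V = 2 nothing is truncated,
# so Pi and all Upsilon terms vanish and only Lambda is non-zero
laws = [([-1.0, 1.0], [0.5, 0.5])] * 100
terms = truncation_terms(laws, V=2.0)
```

In this centered example only $\Lambda = 100\cdot(1/8)/5^3 = 0.1$ survives, matching the $O(N^{-1/2})$ rate.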

Corollary 1.2 (Bentkus and Götze (1994)). Let $X_1, \dots, X_N$ be i.i.d. random variables. Assume that the largest solution $a$ of equation (1.4) is positive. There exists an absolute constant $c > 0$ such that

$$\delta_N \le c\,N P(X_1^2 > N a^2) + c\,N|E Z_1| + c\,N E|Z_1|^3, \qquad (1.5)$$

where $Z_1 = (N a^2)^{-1/2} X_1\,\mathbb{I}\{X_1^2 \le N a^2\}$.

The explicit bounds of Theorems 1.1 and 1.2 and Corollaries 1.1 and 1.2 apply to arbitrary independent random variables $X_1, \dots, X_N$. The random variables may depend on any parameters (for instance, on $N$). An application of a chosen bound reduces to the estimation of moments or tails of the random variables $X_i$ and to summation of the resulting expressions over $1 \le i \le N$, since the bounds depend additively on the separate summands. The bounds are indeed sharp, and a number of consequences may be derived; for example, the Central Limit Theorem for the Student test follows under the weakest known conditions (see the discussion at the end of the section).

All the results remain valid if instead of the Student statistic $t_N$ one considers the self-normalized sums

$$S_N = \frac{X_1 + \dots + X_N}{\sqrt{X_1^2 + \dots + X_N^2}}.$$

In particular, Theorems 1.1, 1.2 and the Corollaries hold with $\delta_N$ replaced by $\sup_x |P(S_N < x) - \Phi(x)|$. Indeed (see the argument in the proof of Lemma 2.3 below),

$$\sup_x \bigl| P(t_N < x) - P(S_N < x) \bigr| \le c\Pi + c\Upsilon_0 + c\Upsilon_2 + c\Lambda.$$

The convergence of self-normalized sums was investigated by a number of authors; see, for instance, Efron (1969), Logan, Mallows, Rice and Shepp (1973), LePage, Woodroofe and Zinn (1981), Griffin and Kuelbs (1989, 1991), Hahn, Kuelbs and Weiner (1990), Griffin and Mason (1991). Convergence rates and Edgeworth expansions for Student's and related statistics were considered by Chung (1946), Chibisov (1980), Helmers and van Zwet (1982), van Zwet (1984), Helmers (1985), Slavova (1985), Bhattacharya and Ghosh (1986), Hall (1987, 1988), Praskova (1989), Friedrich (1989), Bhattacharya and Denker (1990), Bentkus, Götze and van Zwet (1994), Bentkus and Götze (1994), etc. For i.i.d.
$X_1, \dots, X_N$, the rate $O(N^{-1/2})$ under the third moment condition was obtained by Slavova (1985). The bound $\delta_N \le c\,N^{-1/2}\beta_3/\sigma^3$, where $\beta_s = E|X_1|^s$ and $\sigma^2 = E X_1^2$, follows from a general result of Bentkus and Götze (1994). A related $O(N^{-1/2})$ result under the additional condition that there exists $y \in \mathbb{R}$ such that $E\,p(y)\bigl(1 - p(y)\bigr) > 0$, where $p(y) = P\{X \ge y \mid |X| \ge y\}$,

was proved by Hall (1988) (see Bai and Shepp (1994) for a discussion of this condition). Most of the papers mentioned above deal with the case of independent and identically distributed random variables. Friedrich (1989) constructed a Berry–Esséen bound for a general class of statistics without the i.i.d. assumption. In the i.i.d. case the result of Friedrich (1989) yields $\delta_N = O(N^{-1/2})$ provided $\beta_{10/3}$ is finite, whereas our result implies $\delta_N \le c\,N^{-1/2}\beta_3/\sigma^3$.

In the remaining part of the Introduction we discuss sufficient conditions for the Central Limit Theorem (CLT) for studentized sums in the i.i.d. and non-i.i.d. cases. We say that a sequence of studentized sums satisfies the CLT if it converges in distribution to the standard normal law. We shall apply the bounds in the particular case where the observations $X_1, \dots, X_N$ are taken from a (fixed) infinite sequence $X_1, X_2, \dots$ of independent random variables and $t_N = t_N(X_1, \dots, X_N)$. By $\xi, \xi_1, \xi_2, \dots$ we shall denote i.i.d. standard normal variables.

The CLT for i.i.d. random variables. Let $X_1, X_2, \dots$ be i.i.d. Then the right-hand side of (1.5) tends to zero as $N \to \infty$ if and only if $X_1$ belongs to the domain of attraction of the standard normal law; see Bentkus and Götze (1994). Thus (1.5) yields the weakest known sufficient condition for the Central Limit Theorem for Student's statistic (Maller (1981), Csörgő and Mason (1987)). There is a conjecture (communicated to us by D. M. Mason and seemingly difficult to prove) that the condition is necessary as well.

The CLT for non-i.i.d. random variables. Let $X_1, X_2, \dots$ denote a sequence of independent random variables with mean zero and finite variances $\sigma_i^2 = E X_i^2 < \infty$. It is well known (Feller (1971)) that the Lindeberg condition

$$\forall\,\varepsilon > 0: \qquad B_N^{-2}\sum_{i=1}^N E X_i^2\,\mathbb{I}\{X_i^2 \ge \varepsilon B_N^2\} \to 0 \qquad (1.6)$$

is sufficient for

$$\mathcal{L}\bigl(B_N^{-1}(X_1 + \dots + X_N)\bigr) \to N(0,1), \qquad (1.7)$$

and it is necessary for (1.7) provided that

$$\max_{1 \le i \le N} \sigma_i^2/B_N^2 \to 0.$$
(1.8)

Corollary 1.3. Assume that $X_1, X_2, \dots$ have mean zero and satisfy the Lindeberg condition (1.6). Then $\delta_N \to 0$; that is, $\mathcal{L}(t_N)$ converges to the standard normal distribution.

Remark 1.1. Corollary 1.3 tells us that the Lindeberg condition is sufficient for

$$\mathcal{L}(t_N) \to N(0,1). \qquad (1.9)$$
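The self-normalized sum $S_N$ introduced earlier is linked to $t_N$ by a deterministic, strictly monotone transformation; this standard algebraic identity (our numerical check, not a computation from the paper) is why sup-distance bounds transfer between the two statistics.

```python
import math
import numpy as np

def student_t(x):
    x = np.asarray(x, dtype=float)
    return math.sqrt(x.size) * x.mean() / x.std(ddof=1)

def self_normalized(x):
    # S_N = (X_1 + ... + X_N) / sqrt(X_1^2 + ... + X_N^2)
    x = np.asarray(x, dtype=float)
    return float(x.sum() / math.sqrt((x**2).sum()))

rng = np.random.default_rng(7)
x = rng.standard_cauchy(25)   # heavy tails are fine: both statistics always exist
t, s, n = student_t(x), self_normalized(x), 25
# the standard identity t_N = S_N * sqrt((N-1)/(N - S_N^2)),
# a strictly increasing function of S_N (note S_N^2 <= N by Cauchy-Schwarz)
assert abs(t - s * math.sqrt((n - 1) / (n - s**2))) < 1e-9
```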

It is interesting to note that the Lindeberg condition (1.6) is not necessary for (1.9) even if (1.8) holds. More precisely, there exists a sequence (see Example 1.1 below) of centered independent random variables which satisfies (1.8) and (1.9) and which fails to satisfy (1.6).

Example 1.1 (cf. Feller (1971), p. 521). Let $\tau_1, \tau_2, \dots$ be a sequence of independent symmetric random variables such that $P(|\tau_i| = i^2) = 1 - P(\tau_i = 0) = i^{-2}$ for all $i \ge 1$. Assume that the i.i.d. standard normal random variables $\xi_1, \xi_2, \dots$ are independent of $\tau_1, \tau_2, \dots$. It is easy to see that the independent centered random variables $X_i = \tau_i + \xi_i$ satisfy (1.8) and (1.9) and fail to satisfy the Lindeberg condition. In this case $B_N^{-1}(X_1 + \dots + X_N) \to_P 0$. Notice as well that in this particular case (1.9) follows from the Borel–Cantelli lemma (or from Corollary 1.1 with an appropriate choice of $V = V(N)$). Thus we conclude that, in the case of finite variances and centered summands which are asymptotically negligible in the sense of (1.8), studentization of the sum may lead to better results than normalization by $B_N$. However, in general a different situation may arise. In the following example we construct a sequence $X_1, X_2, \dots$ of independent (non-centered) random variables such that (1.7) holds and $\mathcal{L}(t_N) \not\to N(0,1)$.

Example 1.2. Define

$$X_i = i^{-1/2}\xi_i + (-1)^i i^{-p}, \qquad p > 0. \qquad (1.10)$$

It is easy to see that $B_N^2 \sim \ln(N+2)$. Furthermore, $X_1 + \dots + X_N$ is distributed as $B_N\xi_1 + \sum_{i=1}^N (-1)^i i^{-p}$. Therefore $\mathcal{L}\bigl((X_1 + \dots + X_N)/\sqrt{\ln(N+2)}\bigr)$ converges to $\mathcal{L}(\xi)$, the standard normal distribution. Let $V^2 = V^2(N) = \ln(N+2)$. In what follows we show that, for the sequence (1.10),

$$\mathcal{L}(t_N) \to N(0,1) \qquad (1.11)$$

if and only if

$$\sum_{i=1}^N (E Z_i)^2 \to 0, \qquad (1.12)$$

where the random variables $Z_i$ are defined in (1.2). A calculation shows that all terms other than $\sum_{i=1}^N (E Z_i)^2$ on the right-hand side of (1.3) tend to 0 as $N \to \infty$, for each $p > 0$.
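The decisive quantity of Example 1.2 is easy to explore numerically. The sketch below is a heuristic of ours: it uses the approximation $E Z_i \approx V^{-1}(-1)^i i^{-p}$ with $V^2 = \ln(N+2)$, ignoring the truncation of the Gaussian component, so it illustrates (1.12) rather than computing the exact quantity.

```python
import numpy as np

def drift_term(p, n):
    # approximate sum_{i <= N} (E Z_i)^2 for Example 1.2, using
    # E Z_i ~ V^{-1} (-1)^i i^{-p} with V^2 = ln(N + 2); heuristic, not exact
    i = np.arange(1, n + 1, dtype=float)
    return float((i ** (-2.0 * p)).sum() / np.log(n + 2.0))

# p > 1/2: the series sum i^{-2p} converges, so the term decays like 1/ln N
# p = 1/2: sum i^{-1} ~ ln N, so the ratio approaches a positive constant
# p < 1/2: sum i^{-2p} ~ N^{1-2p} dominates ln N and the term blows up
```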

Hence (1.12) implies (1.11). Notice that (1.12) holds if and only if $p > 1/2$. Therefore, in order to prove the converse implication (1.11) $\Rightarrow$ (1.12), it suffices to show that

$$t_N \to_P 0 \quad \text{for } 0 < p < 1/2, \qquad (1.13)$$

and that

$$\mathcal{L}(t_N) \to N(0, 1/2) \quad \text{for } p = 1/2. \qquad (1.14)$$

Proofs of (1.13) and (1.14) are given in the Appendix.

Remark 1.2. Example 1.2 demonstrates that the limiting behavior of the distribution of studentized sums crucially depends on $\sum_{i=1}^N (E Z_i)^2$. This observation can be generalized. Let $X_1, X_2, \dots$ be a sequence of independent random variables. Assume that, for a sequence of positive numbers $V(N)$ and each $\varepsilon > 0$,

$$\max_{1 \le k \le N} P\bigl(|X_k| > \varepsilon V(N)\bigr) \to 0 \qquad (1.15)$$

and

$$\mathcal{L}\bigl((X_1 + \dots + X_N)/V(N)\bigr) \to N(0,1). \qquad (1.16)$$

Then (see Petrov (1975)), for each $\varepsilon > 0$,

$$\sum_{i=1}^N P\bigl(X_i^2 \ge \varepsilon^2 V^2(N)\bigr) \to 0, \qquad \max_{1 \le i \le N}|E Z_i| \to 0, \qquad M \to 1, \qquad \sum_{i=1}^N E|Z_i|^3 \to 0$$

as $N \to \infty$. Hence if (1.15) and (1.16) hold then (1.12) implies (1.11). A simple analysis shows that the converse implication (1.11) $\Rightarrow$ (1.12) is also true provided (1.15) and (1.16) hold.

A similar effect was noticed by Robbins (1948). He considered the distribution of the Student statistic based on independent normal observations $X_i \sim N(\mu_i, \sigma^2)$, $\sigma^2 > 0$, with $\mu_i = E X_i$ such that $\sum_{i=1}^N \mu_i = 0$. In particular, Robbins (1948) showed that in this case the distribution function depends only on $N$ and $\lambda = \sum_{i=1}^N \mu_i^2/(2\sigma^2)$. In his results $\lambda$ plays a role similar to that of $\sum_{i=1}^N (E Z_i)^2$ in our considerations.

2. Proofs

The proofs are close to those given in Bentkus and Götze (1994). By means of truncation we replace the Student statistic $t_N = t_N(X_1, \dots, X_N)$ by a smooth function $S = S(X_1, \dots, X_N)$ of the sample (see Lemma 2.3 below). Then we apply Esséen's smoothing inequality. When estimating the difference between the characteristic functions $\varphi(t) = \exp\{-t^2/2\}$ and $f(t) = E\exp\{itS\}$ we split $f$ into a conditional

product of two characteristic functions (different for different values of $t$) of conditionally independent random variables. This type of approach has been used by Helmers and van Zwet (1982), van Zwet (1984), and was extended in Götze and van Zwet (1992), Bentkus, Götze and van Zwet (1994) to the case of general symmetric statistics of i.i.d. random variables. The crucial step of the proof in the non-identically distributed case is to bound the characteristic function of the sum (which may contain, for example, only one summand) by a characteristic function which splits into a product. For this purpose we use a randomization by means of Bernoulli random variables.

The section is organized as follows. First we derive Corollary 1.1 from Theorem 1.2. Then we show that Corollary 1.1 implies Theorem 1.1, Corollary 1.2 and Corollary 1.3. The proofs of these implications are simple. The real problem is the proof of the main result, Theorem 1.2. Several lemmas are needed for that proof. In particular, the preparatory Lemma 2.4 may serve as a simplified version of the proof of Theorem 1.2: the result of the lemma supplies a sufficiently precise bound for the difference of the characteristic functions for Fourier frequencies $|t| \le 1$; for frequencies $|t| \ge 1$ a much more elaborate technique is needed in order to ensure a sufficiently fast decrease (as $|t| \to \infty$) of the bound.

Proof of Corollary 1.1. We shall derive the Corollary from Theorem 1.2. Let us consider two incompatible cases: (a) there exists $1 \le i \le N$ such that $M \le |E Z_i|$; (b) for all $1 \le i \le N$ the reverse inequality $|E Z_i| < M$ holds. In case (a) the Corollary follows from the trivial bound $\delta_N \le 1$, since the inequality $M \le |E Z_i|$ implies $\delta_N \le 1 \le M^{-1}|E Z_i| \le \Upsilon_1$. In case (b) we have $M^{-1}|E Z_i| \le 1$ for all $1 \le i \le N$. Consequently, $\Upsilon_2 \le \Upsilon_1$. Obviously $\Upsilon_0 \le \Upsilon_1$, and the result follows from the bound of Theorem 1.2.

Proof of Theorem 1.1. If $L_N \ge 1/4$, then the desired bound follows from the trivial estimate $\delta_N \le 1$.
Therefore we shall assume that $L_N \le 1/4$. Let us apply the bound $\delta_N \le c\Pi + c\Upsilon_1 + c\Lambda$ of Corollary 1.1. Choose $V = B_N$. By the Chebyshev inequality, $\Pi \le L_N$. Obviously $\Lambda \le M^{-3}L_N$. Recall that $E X_i = 0$, and therefore $E Z_i = -B_N^{-1} E X_i\,\mathbb{I}\{X_i^2 > B_N^2\}$. Thus, by the Chebyshev inequality, we obtain $\Upsilon_1 \le M^{-1}L_N$. To conclude the proof of the Theorem, it suffices to show that $M^2 \ge 1/2$. We have

$$\operatorname{Var} Z_i = E Z_i^2 - (E Z_i)^2 \ge B_N^{-2} E X_i^2 - B_N^{-2} E X_i^2\,\mathbb{I}\{X_i^2 > B_N^2\} - |E Z_i|,$$

since $|Z_i| \le 1$. Applying the Chebyshev inequality and summing over $i$, we obtain

$$M^2 \ge 1 - L_N - \sum_{i=1}^N |E Z_i| \ge 1 - 2L_N \ge 1/2,$$

since, again by the Chebyshev inequality, $\sum_{i=1}^N |E Z_i| \le L_N$, and $L_N \le 1/4$.

Proof of Corollary 1.2. Choose $V^2 = N a^2$ and apply Corollary 1.1. Due to the i.i.d. assumption, we have

$$\delta_N \le c\,N P(X_1^2 > N a^2) + c\,N M^{-1}|E Z_1| + c\,N M^{-3} E|Z_1|^3,$$

and it remains to show that $M^2 \ge 1/2$. We can assume that $N|E Z_1| \le 1/2$, since otherwise the trivial bound $\delta_N \le 1$ implies the result. Therefore, due to the choice of $a$ and $|Z_i| \le 1$, we have

$$M^2 = \sum_{i=1}^N E Z_i^2 - \sum_{i=1}^N (E Z_i)^2 = 1 - \sum_{i=1}^N (E Z_i)^2 \ge 1 - N|E Z_1| \ge 1/2.$$

Proof of Corollary 1.3. We have to show that $\delta_N \to 0$ as $N \to \infty$. Let us choose $V = B_N$ and apply the bound $\delta_N \le c\Pi + c\Upsilon_1 + c\Lambda$ of Corollary 1.1. Then it is sufficient to prove that $\Pi \to 0$, $\Upsilon_1 \to 0$ and $\Lambda \to 0$. Denote

$$\lambda_\varepsilon = B_N^{-2}\sum_{i=1}^N E X_i^2\,\mathbb{I}\{X_i^2 \ge \varepsilon B_N^2\}.$$

Then, for any $\varepsilon > 0$, the Lindeberg condition (1.6) yields $\lambda_\varepsilon \to 0$ as $N \to \infty$. The Chebyshev inequality implies $\Pi \le \lambda_1 \to 0$. Using $E X_i = 0$, the inequality $|Z_i| \le 1$ and the Chebyshev inequality, we have

$$\operatorname{Var} Z_i = E Z_i^2 - (E Z_i)^2 \ge B_N^{-2} E X_i^2 - 2 B_N^{-2} E X_i^2\,\mathbb{I}\{X_i^2 \ge B_N^2\},$$

whence, summing over $i$, we obtain $M^2 \ge 1 - 2\lambda_1$. Consequently, $M \ge 1/2$ for sufficiently large $N$. To bound $\Upsilon_1$, use $E X_i = 0$ and the Chebyshev inequality. Then $\Upsilon_1 \le M^{-1}\lambda_1$, and $\Upsilon_1 \to 0$ since $M \ge 1/2$ and $\lambda_1 \to 0$. It remains to show that $\Lambda \to 0$. For given $0 < \varepsilon < 1$, we have

$$\sum_{i=1}^N E|Z_i|^3 = B_N^{-3}\sum_{i=1}^N E|X_i|^3\,\mathbb{I}\{X_i^2 \le \varepsilon^2 B_N^2\} + B_N^{-3}\sum_{i=1}^N E|X_i|^3\,\mathbb{I}\{\varepsilon^2 B_N^2 \le X_i^2 \le B_N^2\} \le \varepsilon\,B_N^{-2}\sum_{i=1}^N E X_i^2 + B_N^{-2}\sum_{i=1}^N E X_i^2\,\mathbb{I}\{X_i^2 \ge \varepsilon^2 B_N^2\}.$$

Multiplying by $M^{-3}$ we get $\Lambda \le \varepsilon M^{-3} + \lambda_{\varepsilon^2} M^{-3}$. Recall that $M \ge 1/2$ for sufficiently large $N$. Therefore we have $0 \le \limsup_{N\to\infty}\Lambda \le 8\varepsilon$, and $\lim\Lambda = 0$ follows, since $\varepsilon > 0$ is arbitrary.

It remains to prove Theorem 1.2. Introduce the notation

$$Y_i = M^{-1}(Z_i - E Z_i), \quad 1 \le i \le N, \qquad Y = Y_1 + \dots + Y_N, \qquad \eta = \eta_1 + \dots + \eta_N, \quad \text{where } \eta_i = Y_i^2 - E Y_i^2.$$
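The Lindeberg functional $\lambda_\varepsilon$ used in the proof of Corollary 1.3 above can be computed in closed form for simple rows. The sketch below is ours: it takes $X_i$ uniform on $[-\sqrt{i}, \sqrt{i}]$ and uses the elementary identity $E X^2\,\mathbb{I}\{|X| \ge u\} = (a^3 - u^3)/(3a)$ for a uniform law on $[-a, a]$ with $0 \le u \le a$.

```python
import numpy as np

def lindeberg_uniform_row(n, eps):
    # lambda_eps = B_N^{-2} sum_i E X_i^2 1{X_i^2 >= eps B_N^2}
    # for X_i uniform on [-a_i, a_i] with a_i = sqrt(i)
    a = np.sqrt(np.arange(1, n + 1, dtype=float))
    B2 = float((a**2 / 3.0).sum())          # sum of Var X_i = a_i^2 / 3
    u = np.sqrt(eps * B2)                   # truncation point eps^{1/2} B_N
    tails = np.where(a > u, (a**3 - np.minimum(u, a)**3) / (3.0 * a), 0.0)
    return float(tails.sum() / B2)

# the summands are bounded by sqrt(N) while eps * B_N^2 grows like eps * N^2 / 6,
# so lambda_eps vanishes identically once N is large: the Lindeberg condition holds
```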

The random variables $Y_1, \dots, Y_N$ (resp. $\eta_1, \dots, \eta_N$) have mean zero and are independent. Furthermore, $\sum_{i=1}^N E Y_i^2 = 1$ and $\sum_{i=1}^N E|Y_i|^3 \le 8\Lambda$. We shall assume throughout the rest of the paper that, for a small absolute constant $c_0 > 0$,

$$\Lambda \le c_0, \qquad \Upsilon_2 \le c_0, \qquad \Upsilon_0 \le c_0, \qquad (2.1)$$

since otherwise the result follows from the obvious estimate $\delta_N \le 1$.

Let $\alpha = (\alpha_1, \dots, \alpha_N)$ denote a sequence of i.i.d. Bernoulli random variables which are independent of $Y_1, \dots, Y_N$ and such that $P(\alpha_1 = 1) = 1 - P(\alpha_1 = 0) = m$ with some $0 < m \le 1$.

Lemma 2.1. Assume that (2.1) holds. Denote

$$\kappa_i = \alpha_i\bigl(\eta_i + 2M^{-1} Y_i E Z_i\bigr), \qquad 1 \le i \le N.$$

Then there exists an absolute constant $c > 0$ such that

$$\sum_{i=1}^N E|Y_i\kappa_i| \le c\,m\Lambda, \qquad E\Bigl|\sum_{i=1}^N \kappa_i\Bigr|^{3/2} \le c\,m\Lambda.$$

The U-statistic

$$U = \sum_{1 \le i,j \le N,\ i \ne j} \alpha_i Y_i\kappa_j$$

satisfies $E|U|^{3/2} \le c\,m^{7/4}\Lambda$. The proof of Lemma 2.1 is given in the Appendix.

Lemma 2.2. Let $\beta$ be a random variable with finite third moment. Then

$$\bigl|E\exp\{it\beta\}\bigr|^2 \le 1 - t^2\operatorname{Var}\beta + \tfrac{4}{3}|t|^3 E|\beta|^3. \qquad (2.2)$$

Let $\alpha_1$ be a Bernoulli random variable with $P(\alpha_1 = 1) = 1 - P(\alpha_1 = 0) = m$. Then

$$E\bigl(1 - \alpha_1 t^2 E\beta^2 + \alpha_1\tfrac{4}{3}|t|^3 E|\beta|^3\bigr)^{1/2} \le \exp\Bigl\{-\frac{m t^2}{2}E\beta^2 + \frac{2m|t|^3}{3}E|\beta|^3\Bigr\}. \qquad (2.3)$$

Proof of Lemma 2.2. The proof of (2.2) is easy and can be found in Petrov (1975). Formula (2.3) is a simple consequence of Jensen's inequality.

Let $g : \mathbb{R} \to \mathbb{R}$ denote a function which is infinitely many times differentiable, has bounded derivatives, and satisfies $1/8 \le g(x) \le 2$ for $x \in \mathbb{R}$ and $g(x) = 1/\sqrt{x}$ for $1/2 \le x \le 7/4$. Define the statistic

$$S = Y g(1 + \eta + 2A + \Upsilon_2), \qquad \text{where } A = M^{-1}\sum_{i=1}^N Y_i E Z_i.$$
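Inequality (2.2) is elementary to check numerically for discrete laws. The sketch below (ours) verifies the bound $|E\exp\{it\beta\}|^2 \le 1 - t^2\operatorname{Var}\beta + \tfrac43|t|^3E|\beta|^3$ over a grid of frequencies for one three-point law.

```python
import numpy as np

def cf_squared(values, probs, t):
    # |E exp(i t beta)|^2 for a discrete law
    v = np.asarray(values, dtype=float)
    p = np.asarray(probs, dtype=float)
    return float(abs((p * np.exp(1j * t * v)).sum()) ** 2)

def bound_2_2(values, probs, t):
    # right-hand side of (2.2): 1 - t^2 Var(beta) + (4/3)|t|^3 E|beta|^3
    v = np.asarray(values, dtype=float)
    p = np.asarray(probs, dtype=float)
    var = float((p * v**2).sum() - (p * v).sum() ** 2)
    m3 = float((p * np.abs(v)**3).sum())
    return 1.0 - t**2 * var + (4.0 / 3.0) * abs(t)**3 * m3

values, probs = [-1.0, 0.0, 1.0], [0.3, 0.4, 0.3]
for t in np.linspace(-3.0, 3.0, 61):
    assert cf_squared(values, probs, t) <= bound_2_2(values, probs, t) + 1e-12
```

The bound is only useful for small $|t|$, where the quadratic term dominates; for large $|t|$ the right-hand side exceeds 1 and the inequality is trivial.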

Lemma 2.3. Assume that (2.1) holds. There exists an absolute constant $c > 0$ such that

$$\delta_N \le \sup_x\bigl|P(S < x) - \Phi(x)\bigr| + c\Pi + c\Upsilon_0 + c\Lambda. \qquad (2.4)$$

The proof of Lemma 2.3 is given in the Appendix.

Recall that by $\xi, \xi_1, \xi_2, \dots$ we denote i.i.d. standard normal random variables. Define $\psi_i = (E Y_i^2)^{1/2}\xi_i$, $1 \le i \le N$. In what follows $\theta$ and $\theta_1, \theta_2, \dots$ denote generic real numbers such that $|\theta| \le 1$ and $|\theta_i| \le 1$. Furthermore, $\vartheta, \vartheta_1, \vartheta_2, \dots$ denote i.i.d. random variables uniformly distributed in $[0,1]$. For a vector-valued differentiable function $H$, the mean value formula may be written as

$$H(b) - H(a) = E\,H'\bigl(a + \vartheta(b-a)\bigr)(b-a), \qquad a, b \in \mathbb{R}.$$

We shall use the notation

$$E_\zeta F(\zeta, \beta) = E\bigl(F(\zeta, \beta) \mid \beta\bigr), \qquad E^\zeta F(\zeta, \beta) = E\bigl(F(\zeta, \beta) \mid \zeta\bigr)$$

for conditional expectations; thus $E_\zeta$ integrates out $\zeta$. For $Q = q_1 + \dots + q_N$ we shall denote $Q^{(i)} = Q - q_i$ and, similarly, $Q^{(i,j)} = Q - q_i - q_j$.

Lemma 2.4. Assume that (2.1) holds. Let $H : \mathbb{R} \to \mathbb{C}$ denote a bounded, infinitely many times differentiable function with bounded derivatives. Then, for $1 \le k \le N$,

$$\Bigl| E\,H\bigl((Y_1 + \dots + Y_k)\,g(1 + \kappa + \Upsilon_2)\bigr) - E\,H(\psi_1 + \dots + \psi_k) \Bigr| \le c\,\Gamma\,(\Lambda + \Upsilon_2), \qquad (2.5)$$

where

$$\kappa = \sum_{i=1}^k \kappa_i, \qquad \kappa_i = \eta_i + 2M^{-1} Y_i E Z_i, \quad 1 \le i \le k,$$

and

$$\Gamma = \|H\| + \|H\|^{1/4}\|H'\|^{3/4} + \|H'\| + \|H''\| + \|H'''\|, \qquad \|H\| = \sup_x |H(x)|.$$

Proof of Lemma 2.4. We shall prove the lemma for $k = N$ only; then $\mathcal{L}(\psi_1 + \dots + \psi_N) = \mathcal{L}(\xi)$, and the corresponding sums in (2.5) are equal to $\Upsilon_2$ and $\Lambda$ respectively. It suffices to prove that

$$\bigl| E\,H\bigl(Y g(1+\kappa+\Upsilon_2)\bigr) - E\,H\bigl(Y g(1+\kappa)\bigr) \bigr| \le c\,\Gamma\,\Upsilon_2, \qquad \bigl| E\,H\bigl(Y g(1+\kappa)\bigr) - E\,H(Y) \bigr| \le c\,\Gamma\,\Lambda$$

and

$$\bigl| E\,H(Y) - E\,H(\xi) \bigr| \le c\,\Gamma\,\Lambda.$$

Expanding $g$ in powers of $\Upsilon_2$ and then expanding $H$, we obtain the first inequality. The proof of the third inequality is easy and does not differ from the one given in Bentkus et al. (1990), where the i.i.d. case was considered. Let us prove the second inequality. Expanding $g$ in powers of $\kappa$ we have

$$Y g(1+\kappa) = Y g(1) + Y\kappa\,g'(1+\theta_1\kappa) = Y + D\,g'(1+\theta_1\kappa) + U\,g'(1+\theta_1\kappa).$$

Here we split $Y\kappa$ into the sum $D + U$, where

$$D = \sum_{i=1}^N Y_i\kappa_i, \qquad U = \sum_{1\le i,j\le N,\ i\ne j} Y_i\kappa_j.$$

Expanding $H$ in powers of $D\,g'(1+\theta_1\kappa)$ we obtain

$$E\,H\bigl(Y g(1+\kappa)\bigr) = E\,H\bigl(Y + U g'(1+\theta_1\kappa)\bigr) + R \qquad \text{with } |R| \le c\,\Gamma\,E|D|.$$

An application of Lemma 2.1 gives $|R| \le c\,\Gamma\,\Lambda$. Expanding

$$U g'(1+\theta_1\kappa) = U g'(1) + U\theta_1\kappa\,g''(1+\theta\theta_1\kappa)$$

and applying the inequalities

$$|H(s) - H(t)| \le \|H'\|\,|t-s| \quad \text{for } \|H'\|\,|t-s| \le 1,$$

and, for $\|H'\|\,|t-s| > 1$,

$$|H(s) - H(t)| \le 2\|H\| \le 2\|H\|\bigl(\|H'\|\,|t-s|\bigr)^{3/4},$$

we get

$$E\,H\bigl(Y + U g'(1+\theta_1\kappa)\bigr) = E\,H\bigl(Y + U g'(1)\bigr) + R$$

with

$$|R| \le c\,\Gamma\,E\bigl(|U|\,|\kappa|\bigr)^{3/4} \le c\,\Gamma\bigl(E|U|^{3/2} + E|\kappa|^{3/2}\bigr) \le c\,\Gamma\,\Lambda.$$

In the last step we applied the inequality $ab \le a^2 + b^2$ with $a = |U|^{3/4}$ and $b = |\kappa|^{3/4}$, and Lemma 2.1. Since $\|H'\|$ and $\|H''\|$ are uniformly bounded, we can write

$$E\,H\bigl(Y + U g'(1)\bigr) = E\,H(Y) + E\,H'(Y)\,U g'(1) + R,$$

with

$$|R| \le c\,\Gamma\,E|U|^{3/2} \le c\,\Gamma\,\Lambda.$$

It remains to estimate

$$E\,H'(Y)\,U = \sum_{1\le i,j\le N,\ i\ne j} E\,Y_i\kappa_j H'(Y).$$

Recall that $Y^{(i)}$ denotes the sum $Y$ without the summand $Y_i$. The Taylor expansion of $H'(Y) = H'(Y_i + Y^{(i)})$ in powers of $Y_i$, together with $E Y_i = 0$, yields

$$E\,Y_i\kappa_j H'(Y_i + Y^{(i)}) = E\,Y_i\kappa_j\bigl[H''(Y^{(i)})\,Y_i + H'''(Y^{(i)} + \vartheta_2\vartheta_1 Y_i)\,\vartheta_1 Y_i^2\bigr],$$

where $\vartheta_1$ and $\vartheta_2$ are mutually independent (and independent of $Y_1, \dots, Y_N$) random variables uniformly distributed in $[0,1]$. Thus $E\,H'(Y)\,U = R_1 + R_2$, where

$$R_1 = E\sum_{1\le i,j\le N,\ i\ne j} Y_i^2\kappa_j H''(Y^{(i)}) \qquad \text{and} \qquad |R_2| \le c\,\Gamma\sum_{1\le i,j\le N} E|Y_i|^3 E|\kappa_j| \le c\,\Gamma\,\Lambda.$$

Here we estimated

$$|\kappa_j| \le |\eta_j| + 2|Y_j|\,M^{-1}|E Z_j|, \qquad 2|Y_j|\,M^{-1}|E Z_j| \le Y_j^2 + \bigl(M^{-1}E Z_j\bigr)^2$$

and

$$\sum_{j=1}^N E|\kappa_j| \le c\sum_{j=1}^N\bigl(E Y_j^2 + (M^{-1}E Z_j)^2\bigr) \le c.$$

Let us estimate $R_1$. Expanding

$$H''(Y^{(i)}) = H''(Y^{(i,j)} + Y_j) = H''(Y^{(i,j)}) + Y_j H'''(Y^{(i,j)} + \vartheta Y_j)$$

and using $E\kappa_j = 0$, we have

$$E\,Y_i^2\kappa_j H''(Y^{(i)}) = E\,Y_i^2\kappa_j Y_j H'''(Y^{(i,j)} + \vartheta Y_j).$$

Hence

$$|R_1| \le c\,\Gamma\sum_{i=1}^N E Y_i^2\sum_{j=1}^N E|\kappa_j Y_j| \le c\,\Gamma\sum_{j=1}^N E|\kappa_j Y_j| \le c\,\Gamma\,\Lambda,$$

thus proving the lemma.

Proof of Theorem 1.2. Without loss of generality we shall assume that (2.1) is fulfilled. An application of Lemma 2.3 reduces the proof to the verification of the inequality

$$\sup_x\bigl|P(S < x) - P(\xi < x)\bigr| \le c\Lambda + c\Upsilon_2, \qquad (2.6)$$

assuming that (2.1) holds. In order to prove (2.6), let us apply the Esséen inequality for characteristic functions. Write

$$f(t) = E\exp\{itS\} = E\exp\bigl\{itY g(1+\eta+2A+\Upsilon_2)\bigr\}, \qquad \varphi(t) = E\exp\{it\xi\} = \exp\{-t^2/2\}.$$

Estimating

$$\int_{|t|\le C_1}\bigl|f(t)-\varphi(t)\bigr|\,\frac{dt}{|t|} \le c\Lambda + c\Upsilon_2$$

by Lemma 2.4, we see that (2.6) is a consequence of

$$\int_{C_1\le|t|\le T}\bigl|f(t)-\varphi(t)\bigr|\,\frac{dt}{|t|} \le c\Lambda + c\Upsilon_2, \qquad (2.7)$$

where $T = c_1/\sum_{i=1}^N E|Y_i|^3$. Here we may choose the absolute constant $C_1$ sufficiently large and the absolute constant $c_1$ sufficiently small. We may assume that the interval $(C_1, T)$ is non-empty, since otherwise the theorem follows from (2.1). Define the function

$$m(t) = C_2\,\frac{\ln|t|}{t^2} < \frac12, \qquad C_1 \le |t| \le T,$$

where $C_2$ is a sufficiently large absolute constant. Throughout the proof we shall write $h \approx g$ if

$$\int_{C_1\le|t|\le T}\bigl|h(t)-g(t)\bigr|\,\frac{dt}{|t|} \le c\Lambda + c\Upsilon_2.$$

In particular, (2.7) is equivalent to $f \approx \varphi$. We shall prove inequality (2.7) in two steps. In the first step, using randomization by means of Bernoulli random variables with parameter $m(t)$, we shall split $f$ into a product of two conditionally independent characteristic functions. The first characteristic function will account for the contribution of the $m(t)$-th part of the sum $Y_1 + \dots + Y_N$ and will ensure the convergence of the integral, cf. Bentkus and Götze (1994).

Step 1. Let $(Y_1', \dots, Y_N')$ denote an independent copy of $(Y_1, \dots, Y_N)$. Recall that $\alpha_1, \dots, \alpha_N$ denotes a sequence of i.i.d. Bernoulli random variables; assume that

$P(\alpha_1 = 1) = 1 - P(\alpha_1 = 0) = m(t)$ and that all these random variables are independent. It is easy to verify that

$$f(t) = E\exp\bigl\{it(X+Z)\,g(1+\gamma+\rho)\bigr\},$$

where

$$X = \alpha_1 Y_1 + \dots + \alpha_N Y_N, \qquad Z = (1-\alpha_1)Y_1 + \dots + (1-\alpha_N)Y_N,$$

$$\gamma = \sum_{i=1}^N \alpha_i\gamma_i \quad \text{with } \gamma_i = \eta_i + 2M^{-1}Y_i E Z_i,$$

$$\rho = \sum_{i=1}^N (1-\alpha_i)\rho_i + \Upsilon_2 \quad \text{with } \rho_i = \eta_i + 2M^{-1}Y_i E Z_i, \qquad \eta_i = Y_i^2 - E Y_i^2, \quad 1 \le i \le N.$$

Let us show that $f(t) \approx f_2(t) = E\exp\{itW\}$, where

$$W = X g(1+\rho) + U g'(1+\rho) + Z g(1+\rho) + Z\gamma\,g'(1+\rho)$$

and

$$U = \sum_{1\le j,k\le N,\ j\ne k} \alpha_j Y_j\,\alpha_k\gamma_k.$$

Denote $D = \sum_{i=1}^N \alpha_i Y_i\gamma_i$. Expanding $g(1+\gamma+\rho)$ in powers of $\gamma$ we have

$$(X+Z)\,g(1+\gamma+\rho) = X g(1+\rho) + X\gamma\,g'(1+\rho+\theta_1\gamma) + Z g(1+\rho) + Z g'(1+\rho)\gamma + Z g''(1+\rho+\theta_2\gamma)\,\theta_3\gamma^2;$$

$$X\gamma\,g'(1+\rho+\theta_1\gamma) = (D+U)\,g'(1+\rho+\theta_1\gamma) = D\,g'(1+\rho+\theta_1\gamma) + U g'(1+\rho) + U\theta_1\gamma\,g''(1+\rho+\theta_4\gamma).$$

Expanding now the exponent in powers of $itD\,g'(1+\rho+\theta_1\gamma)$ we obtain

$$f(t) = f_1(t) + R, \qquad f_1(t) = E\exp\{itW + itr\},$$

with

$$r = Z g''(1+\rho+\theta_2\gamma)\,\theta_3\gamma^2 + U\theta_1\gamma\,g''(1+\rho+\theta_4\gamma)$$

and

$$|R| \le c\,|t|\,E|D| \le c\,|t|\,m(t)\Lambda.$$

In the last estimate we applied Lemma 2.1. Since the function $u \mapsto \exp\{iu\}$ and its derivatives are uniformly bounded, there exists an absolute constant $c > 0$ such that $f_1(t) = f_2(t) + R$ with $|R| \le c\,|t|^{3/4}E|r|^{3/4}$. The inequality $ab \le a^2 + b^2$ with $a = |U|^{3/4}$ and $b = |\gamma|^{3/4}$ implies

$$E|r|^{3/4} \le c\,E|\gamma|^{3/2}E_Y|Z|^{3/4} + c\,E|U\gamma|^{3/4} \le c\,E|\gamma|^{3/2} + c\,E|U|^{3/2} \le c\,m(t)\Lambda + c\,m^{7/4}(t)\Lambda.$$

In the last step we estimated $E_Y|Z|^{3/4} \le c$ uniformly over $\alpha$ and applied Lemma 2.1. We have

$$|f(t) - f_2(t)| \le c\bigl(|t| + |t|^{3/4}\bigr)m(t)\Lambda.$$

Hence $f \approx f_2$, because the factor $m(t)$ ensures the convergence of the integral with respect to the measure $dt/|t|$ as $|t| \to \infty$. Since the function $u \mapsto \exp\{iu\}$ and its derivatives are bounded, there exists an absolute constant $c > 0$ such that $|f_2(t) - f_3(t) - f_4(t)| \le |R|$, where

$$f_3(t) = E\exp\bigl\{it\,[X g(1+\rho) + Z g(1+\rho) + Z\gamma\,g'(1+\rho)]\bigr\},$$

$$f_4(t) = E\exp\bigl\{it\,[X g(1+\rho) + Z g(1+\rho) + Z\gamma\,g'(1+\rho)]\bigr\}\;itU g'(1+\rho)$$

and $|R| \le c\,|t|^{3/2}E|U|^{3/2}$. Applying Lemma 2.1 we get $|R| \le c\,|t|^{3/2}m^{7/4}(t)\Lambda$. Hence $f_2 \approx f_3 + f_4$. Furthermore, denoting

$$f_5(t) = E\exp\bigl\{it\,[X g(1+\rho) + Z g(1+\rho)]\bigr\}\;itU g'(1+\rho),$$

we have $|f_4(t) - f_5(t)| \le |t|^{3/2}R$, with $R = E|U|\,|Z\gamma|^{1/2}$. Estimating $E_Y|Z|^{1/2} \le (E_Y|Z|^2)^{1/4} \le c$ uniformly with respect to $\alpha$, we obtain

$$R = E|U|\,|Z\gamma|^{1/2} \le c\,E|U|\,|\gamma|^{1/2} \le c\bigl(E|U|^{3/2} + E|\gamma|^{3/2}\bigr) \le c\,m(t)\Lambda.$$

In the last step we used the inequality $a^{2/3}b^{1/3} \le a + b$ with $a^{2/3} = |U|$ and $b^{1/3} = |\gamma|^{1/2}$, and applied Lemma 2.1. Since the integral with respect to the measure $|t|^{3/2}m(t)\,dt/|t|$ converges, we conclude that $f_4 \approx f_5$. We rewrite $f_5(t)$ in a form in which some independent random variables get separated. Write

$$P_{k,l} := \exp\bigl\{it\,[X^{(k,l)} g(1+\rho) + Z g(1+\rho)]\bigr\}\,g'(1+\rho), \qquad 1 \le k,l \le N,\ k \ne l.$$

Recall that $X^{(k,l)}$ denotes the sum $\alpha_1 Y_1 + \dots + \alpha_N Y_N$ with the summands $\alpha_k Y_k$ and $\alpha_l Y_l$ removed. Then

$$f_5(t) = it\,E\sum_{1\le k,l\le N,\ k\ne l} \alpha_k Y_k\exp\bigl\{it\alpha_k Y_k g(1+\rho)\bigr\}\;\alpha_l\gamma_l\exp\bigl\{it\alpha_l Y_l g(1+\rho)\bigr\}\,P_{k,l}.$$

Expanding the exponents in powers of $it\alpha_k Y_k g(1+\rho)$ and $it\alpha_l Y_l g(1+\rho)$ we have

$$|f_5(t)| \le c\,|t|\,E_\alpha\sum_{1\le k,l\le N,\ k\ne l}\alpha_k\,|t|\,E Y_k^2\;\alpha_l\,|t|\,E|\gamma_l Y_l| \le c\,|t|^3 m^2(t)\sum_{l=1}^N E|\gamma_l Y_l| \le c\,|t|^3 m^2(t)\Lambda.$$

Here we used the fact that $Y_k$, $\gamma_l$ and $P_{k,l}$ are conditionally independent for $k \ne l$, that $|P_{k,l}| \le c$, and that $E Y_k = 0$, $E\gamma_l = 0$. Hence $f_5 \approx 0$, and we conclude that $f \approx f_3$. We shall show that $f_3 \approx f_6 + f_7$, where

$$f_6(t) = E\exp\bigl\{it\,[X g(1+\rho) + Z g(1+\rho)]\bigr\},$$

$$f_7(t) = E\,itZ\gamma\,g'(1+\rho)\exp\bigl\{it\,[X g(1+\rho) + Z g(1+\rho)]\bigr\}.$$

Using the inequality $|\exp\{ia\} - 1 - ia| \le |a|^{3/2}$ with $a = tZ\gamma\,g'(1+\rho)$, we obtain $f_3(t) = f_6(t) + f_7(t) + R$ with

$$|R| \le c\,|t|^{3/2}E|\gamma|^{3/2}E_Y|Z|^{3/2} \le c\,|t|^{3/2}E|\gamma|^{3/2} \le c\,|t|^{3/2}m(t)\Lambda.$$

Thus $f_3 \approx f_6 + f_7$. Next we will prove that $f_7 \approx 0$. Introduce for brevity

$$T_j := E_Y\exp\bigl\{itX^{(j)} g(1+\rho)\bigr\}, \qquad Q := \exp\bigl\{itZ g(1+\rho)\bigr\}\,Z g'(1+\rho).$$

Then

$$f_7(t) = it\,E\sum_{j=1}^N \alpha_j\gamma_j\exp\bigl\{it\alpha_j Y_j g(1+\rho)\bigr\}\,T_j\,Q.$$

Expanding the exponent, estimating $\alpha_j \le 1$ and using $E\gamma_j = 0$, we have

$$|f_7(t)| \le c\,t^2\sum_{j=1}^N E\bigl(|\gamma_j Y_j|\,|T_j Q|\bigr).$$

Using inequality (2.2) with $\beta = Y_l\alpha_l g(1+\rho)$, with the expectation taken with respect to $Y_l$, and using the fact that $1/8 \le g \le 2$, we get

$$|T_j| \le \widetilde T_j := \prod_{1\le l\le N,\ l\ne j}\Bigl(1 - \frac{t^2\alpha_l}{64}E Y_l^2 + \frac{32}{3}|t|^3\alpha_l E|Y_l|^3\Bigr)^{1/2}.$$

Using the simple bound $E_Y|Q| \le c$ and inequality (2.3) for $E_\alpha\widetilde T_j$, we get

$$E|Q|\,\widetilde T_j \le c\,E_\alpha\widetilde T_j \le c\,\exp\{-W\} \le c\,\exp\{-10^{-3}t^2 m(t)\},$$

where

$$W = \frac{t^2 m(t)}{128}\sum_{1\le l\le N,\ l\ne j}E Y_l^2 - \frac{16}{3}|t|^3 m(t)\sum_{l=1}^N E|Y_l|^3 \ge 10^{-3}\,t^2 m(t).$$

The last inequality holds for $C_1 \le |t| \le T$, provided that $\sum_{1\le l\le N,\ l\ne j}E Y_l^2 \ge 1/2$; this is satisfied due to (2.1), since otherwise $E Y_j^2 \ge 1/2$ and $E|Y_j|^3 \ge (E Y_j^2)^{3/2} \ge c_0$. Thus we have

$$|f_7(t)| \le c\,t^2\exp\{-C_3 t^2 m(t)\}\sum_{i=1}^N E|\gamma_i Y_i| \le c\,t^2\exp\{-C_3 t^2 m(t)\}\,\Lambda,$$

where $C_3$ is a positive absolute constant. Hence $f_7 \approx 0$.

It remains to prove that $f_6 \approx \varphi$. Introduce a sequence of i.i.d. Bernoulli random variables $\tau = (\tau_1, \dots, \tau_N)$ such that $P(\tau_1 = 1) = P(\tau_1 = 0) = 1/2$. Let $(Y_1'', \dots, Y_N'')$ be an independent copy of $(Y_1, \dots, Y_N)$. Assume that $\tau$, $\alpha$, $(Y_1, \dots, Y_N)$, $(Y_1', \dots, Y_N')$ and $(Y_1'', \dots, Y_N'')$

are independent. Denote

$$X(\tau) = \alpha_1\tau_1 Y_1 + \dots + \alpha_N\tau_N Y_N \qquad \text{and} \qquad X(1-\tau) = \alpha_1(1-\tau_1)Y_1 + \dots + \alpha_N(1-\tau_N)Y_N.$$

Observe that $X(\tau)$ and $X(1-\tau)$ are conditionally independent given $\alpha$ and $\tau$. We have

$$f_6(t) = E\exp\bigl\{itZ g(1+\rho)\bigr\}\exp\bigl\{it\bigl(X(\tau) + X(1-\tau)\bigr)g(1+\rho)\bigr\}.$$

Write

$$Q_0 := \exp\bigl\{itZ g(1+\rho)\bigr\}, \qquad Q_1 := \exp\bigl\{itX(\tau)g(1+\rho)\bigr\}, \qquad Q_2 := \exp\bigl\{itX(1-\tau)g(1+\rho)\bigr\}.$$

Expanding $g(1+\rho)$ in powers of $\rho$ we obtain

$$g(1+\rho) = g(1) + g'(1)\rho + \theta_1 g''(1+\theta_2\rho)\rho^2.$$

Expanding the exponents we get

$$\exp\bigl\{itX(\tau)\theta_1 g''(1+\theta_2\rho)\rho^2\bigr\} = 1 + r_1 \qquad \text{with } |r_1| \le c\,|tX(\tau)\rho^2|^{3/4},$$

$$\exp\bigl\{-2^{-1}itX(\tau)\rho\bigr\} = 1 - 2^{-1}itX(\tau)\rho + r_2 \qquad \text{with } |r_2| \le c\,|tX(\tau)\rho|^{3/2},$$

and hence (recall that $g(1) = 1$ and $g'(1) = -1/2$)

$$Q_1 = \exp\bigl\{itX(\tau)\bigl(1 + g'(1)\rho + \theta_1 g''(1+\theta_2\rho)\rho^2\bigr)\bigr\} = \exp\{itX(\tau)\}\exp\bigl\{-2^{-1}itX(\tau)\rho\bigr\}\exp\bigl\{itX(\tau)\theta_1 g''(1+\theta_2\rho)\rho^2\bigr\} = F(\tau) + P(\tau) + R(\tau),$$

where

$$F(\tau) = \exp\{itX(\tau)\}, \qquad P(\tau) = -2^{-1}itX(\tau)\rho\exp\{itX(\tau)\}$$

and

$$|R(\tau)| \le c\,|tX(\tau)|^{3/4}|\rho|^{3/2} + c\,|tX(\tau)\rho|^{3/2}.$$

Similarly, $Q_2 = F(1-\tau) + P(1-\tau) + R(1-\tau)$.

On the other hand one may write

$$Q_2 = \exp\bigl\{itX(1-\tau)\bigl(1 + g'(1+\theta_1\rho)\rho\bigr)\bigr\} = F(1-\tau) + R_0(1-\tau),$$

where

$$|R_0(1-\tau)| \le c\,|tX(1-\tau)\rho|^{1/2}.$$

We have

$$f_6 = E\,Q_0\bigl(F(\tau) + P(\tau) + R(\tau)\bigr)Q_2 = E\,Q_0 F(\tau)\bigl(F(1-\tau) + P(1-\tau) + R(1-\tau)\bigr) + E\,Q_0 P(\tau)\bigl(F(1-\tau) + R_0(1-\tau)\bigr) + E\,Q_0 R(\tau)Q_2.$$

In what follows we shall prove that

$$f_6 \approx E\,Q_0 F(\tau)F(1-\tau). \qquad (2.8)$$

First, let us prove that $E\,Q_0 R(\tau)Q_2 \approx 0$. The random variables $R(\tau)$ and $Q_2$ are conditionally independent given $\alpha$, $\tau$ and the summands $Y_i$ with $\alpha_i = 0$, and $|Q_0| \le 1$. Hence

$$\bigl|E\,Q_0 R(\tau)Q_2\bigr| \le E\,E_Y|R(\tau)|\,\bigl|E_Y Q_2\bigr|.$$

Proceeding in the same way as above when estimating $T_j$, we get

$$\bigl|E_Y Q_2\bigr| \le \prod_{l=1}^N\Bigl(1 - \frac{t^2(1-\tau_l)\alpha_l}{64}E Y_l^2 + \frac{32}{3}|t|^3(1-\tau_l)\alpha_l E|Y_l|^3\Bigr)^{1/2}.$$

The estimate

$$E_Y|R(\tau)| \le c\bigl(|t|^{3/4} + |t|^{3/2}\bigr)|\rho|^{3/2},$$

which holds uniformly with respect to $\tau$, together with integration with respect to $\tau$ yields

$$E_\tau\,E_Y|R(\tau)|\,\bigl|E_Y Q_2\bigr| \le c\bigl(|t|^{3/4} + |t|^{3/2}\bigr)|\rho|^{3/2}\prod_{l=1}^N T_l, \qquad (2.9)$$

where

$$T_l = \Bigl(1 - \frac{t^2\alpha_l}{128}E Y_l^2 + \frac{16}{3}|t|^3\alpha_l E|Y_l|^3\Bigr)^{1/2}, \qquad 1 \le l \le N.$$

Now we may integrate the product on the right-hand side of (2.9), first with respect to $Y$ and afterwards with respect to $\alpha$. Applying Lemma 2.1 and (2.1) we get

$$E_Y|\rho|^{3/2} \le 2^{3/2}E_Y\Bigl|\sum_{i=1}^N(1-\alpha_i)\rho_i\Bigr|^{3/2} + 2^{3/2}\Upsilon_2^{3/2} \le c\,(\Lambda + \Upsilon_2).$$

Hence,

$$E\bigl(|t|^{3/4} + |t|^{3/2}\bigr)|\rho|^{3/2}\prod_{l=1}^N T_l \le c\bigl(|t|^{3/4} + |t|^{3/2}\bigr)(\Lambda + \Upsilon_2)\,E_\alpha\prod_{l=1}^N T_l.$$

The expectation $E_\alpha\prod_{l=1}^N T_l$ does not exceed

$$\exp\Bigl\{-c\,m(t)\,t^2\sum_{l=1}^N E Y_l^2 + c\,m(t)\,|t|^3\sum_{i=1}^N E|Y_i|^3\Bigr\} \le \exp\{-C_3 m(t)t^2\}$$

for $C_1 \le |t| \le T$. We conclude that $E\,Q_0 R(\tau)Q_2 \approx 0$. Proceeding in a similar way we get $E\,Q_0 F(\tau)R(1-\tau) \approx 0$.

Let us prove that $E\,Q_0 P(\tau)R_0(1-\tau) \approx 0$. Expanding the exponent in powers of $it\alpha_j\tau_j Y_j$ we get

$$\bigl|E_Y P(\tau)\bigr| = 2^{-1}|t\rho|\,\Bigl|E_Y\sum_{j=1}^N \alpha_j\tau_j Y_j\exp\bigl\{it\alpha_j\tau_j Y_j + itX^{(j)}(\tau)\bigr\}\Bigr| \le c\,|t\rho|\sum_{j=1}^N G_j H_j,$$

where $G_j = |t|\,E Y_j^2$ and

$$H_j := \bigl|E_Y\exp\{itX^{(j)}(\tau)\}\bigr| \le \prod_{1\le l\le N,\ l\ne j}\bigl(1 - t^2\alpha_l\tau_l E Y_l^2 + \tfrac{4}{3}|t|^3\alpha_l\tau_l E|Y_l|^3\bigr)^{1/2}.$$

An application of Lemma 2.2 leads to

$$E_{\alpha,\tau}H_j \le \exp\Bigl\{-4^{-1}t^2 m(t)\sum_{1\le l\le N,\ l\ne j}E Y_l^2 + \tfrac13|t|^3 m(t)\sum_{l=1}^N E|Y_l|^3\Bigr\} \le \exp\{-C_3 m(t)t^2\} \qquad (2.10)$$

for $C_1 \le |t| \le T$. Here we assumed that $\sum_{1\le l\le N,\ l\ne j}E Y_l^2 > 1/2$, since otherwise the theorem follows from (2.1). Estimating

$$|Q_0| \le 1, \qquad E_Y|X(1-\tau)|^{1/2} \le c, \qquad E_Y|\rho|^{3/2} \le c\Lambda + c\Upsilon_2, \qquad \sum_{j=1}^N G_j \le c\,|t|,$$

and using (2.10), we get

$$\bigl|E\,Q_0 P(\tau)R_0(1-\tau)\bigr| = \bigl|E_{\alpha,\tau}\,Q_0\,E_Y P(\tau)\,E_Y R_0(1-\tau)\bigr| \le c\,|t|^{3/2}\sum_{j=1}^N G_j\,E_{\alpha,\tau}H_j\,E_Y|\rho|^{3/2} \le c\,|t|^{5/2}\exp\{-C_3 m(t)t^2\}\,(\Lambda + \Upsilon_2).$$

Thus we conclude that $E\,Q_0 P(\tau)R_0(1-\tau) \approx 0$.

Let us prove that $E\,Q_0 P(\tau)F(1-\tau) \approx 0$. Expanding the exponent as above we get

$$\bigl|E_Y X(\tau)\exp\{itX(\tau)\}\bigr| \le c\sum_{j=1}^N G_j H_j.$$

Observe that $\bigl|E_Y F(1-\tau)\bigr| \le \bigl|E_Y\exp\{itX^{(j)}(1-\tau)\}\bigr|$ for all $j$. Hence $\bigl|E_\tau\,X(\tau)\exp\{itX(\tau)\}F(1-\tau)\bigr|$ is bounded from above by

$$\sum_{j=1}^N G_j\,E_\tau\bigl|E_{Y,Y''}\exp\bigl\{itX^{(j)}(\tau) + itX^{(j)}(1-\tau)\bigr\}\bigr| \le \sum_{j=1}^N G_j\exp\{-C_3 m(t)t^2\} \le c\,|t|\exp\{-C_3 m(t)t^2\},$$

for $C_1 \le |t| \le T$. The relation $E\,Q_0 P(\tau)F(1-\tau) \approx 0$ now follows, provided that the estimate

$$\bigl|E_Y\,Q_0\,\rho\bigr| \le c\bigl(1 + |t|^{1/2} + |t|^{3/4} + |t|\bigr)(\Lambda + \Upsilon_2) \qquad (2.11)$$

holds uniformly in $\alpha$. Let us prove (2.11). Define $\hat\rho := \sum_{l=1}^N\hat\rho_l$, where $\hat\rho_l = (1-\alpha_l)\rho_l$. We have

$$E_Y\,\rho\,Q_0 = E_Y\,\hat\rho\,Q_0 + R, \qquad |R| \le \Upsilon_2\,E_Y|Q_0| \le \Upsilon_2.$$

Write $g_0(u) := g(u + \Upsilon_2)$ and

$$\hat\rho\,Q_0 = \sum_{j=1}^N \hat\rho_j\exp\bigl\{itZ g_0(1+\hat\rho)\bigr\} = \sum_{j=1}^N \hat\rho_j\exp\bigl\{it(1-\alpha_j)Y_j g_0(1+\hat\rho)\bigr\}\exp\bigl\{itZ^{(j)} g_0(1+\hat\rho)\bigr\}.$$

Expanding the first exponent we get $\hat\rho\,Q_0 = f_8 + R$,

where

$$f_8(t) = \sum_{j=1}^N \hat\rho_j\exp\bigl\{itZ^{(j)} g_0(1+\hat\rho)\bigr\}$$

and

$$E_Y|R| \le c\,|t|\sum_{j=1}^N E_Y|\hat\rho_j Y_j| \le c\,|t|\,\Lambda;$$

see the proof of the first inequality of Lemma 2.1. Expanding

$$g_0(1+\hat\rho) = g_0(1) + g_0'(1+\theta\hat\rho)\bigl(\hat\rho_j + \hat\rho^{(j)}\bigr)$$

and then expanding the exponent in $f_8$ as follows,

$$\exp\bigl\{itZ^{(j)} g_0(1+\hat\rho)\bigr\} = \exp\bigl\{itZ^{(j)} g_0(1) + itZ^{(j)} g_0'(1+\theta\hat\rho)\hat\rho^{(j)}\bigr\} + r, \qquad |r| \le c\,\bigl|tZ^{(j)}\hat\rho_j\bigr|^{1/2},$$

we get $f_8 = f_9 + R$. Here

$$f_9 = \sum_{j=1}^N \hat\rho_j\exp\bigl\{itZ^{(j)} g_0(1) + itZ^{(j)} g_0'(1+\theta\hat\rho)\hat\rho^{(j)}\bigr\}$$

and

$$E_Y|R| \le c\,|t|^{1/2}\sum_{j=1}^N E_Y|\hat\rho_j|^{3/2}\bigl|Z^{(j)}\bigr|^{1/2} \le c\,|t|^{1/2}\sum_{j=1}^N E_Y|\hat\rho_j|^{3/2} \le c\,|t|^{1/2}\Lambda.$$

Here we used the independence of $\hat\rho_j$ and $Z^{(j)}$ as well as the bounds $E_Y|Z^{(j)}|^{1/2} \le (E_Y|Z^{(j)}|^2)^{1/4} \le c$ and $\sum_{j=1}^N E_Y|\hat\rho_j|^{3/2} \le c\,\Lambda$; for the last inequality see the proof of Lemma 2.1. Consider the second summand in the argument of the exponent in $f_9$. Let $D^{(j)}$ denote the diagonal part of the product $Z^{(j)}\hat\rho^{(j)}$, that is,

$$D^{(j)} = \sum_{1\le k\le N,\ k\ne j}(1-\alpha_k)Y_k\hat\rho_k,$$

and let $U^{(j)}$ denote the rest, i.e., $U^{(j)} + D^{(j)} = Z^{(j)}\hat\rho^{(j)}$. Expanding the exponent in powers of $itD^{(j)} g_0'(1+\theta\hat\rho)$ we obtain $f_9 = f_{10} + R$, where

$$f_{10}(t) = \sum_{j=1}^N \hat\rho_j\exp\bigl\{itZ^{(j)} g_0(1) + itU^{(j)} g_0'(1+\theta\hat\rho)\bigr\}$$

and

$$E_Y|R| \le c\,|t|\,E_Y\sum_{j=1}^N\Bigl|\hat\rho_j\sum_{1\le l\le N,\ l\ne j}Y_l\hat\rho_l\Bigr| \le c\,|t|\sum_{j=1}^N E_Y|\hat\rho_j|\sum_{l=1}^N E_Y|Y_l\hat\rho_l|.$$

Estimating $\sum_{j=1}^N E_Y|\hat\rho_j| \le c$, as was done in the proof of Lemma 2.4 (with $\kappa_j$ replaced by $\rho_j$), and estimating $\sum_{l=1}^N E_Y|Y_l\hat\rho_l| \le c\,\Lambda$, we get $E_Y|R| \le c\,|t|\,\Lambda$.

Let us consider the second summand in the argument of the exponent in $f_{10}$. We may write

$$U^{(j)} g_0'(1+\theta\hat\rho) = U^{(j)} g_0'(1) + U^{(j)}\hat\rho\,\theta\,g_0''(1+\theta_1\theta\hat\rho) \qquad \text{and} \qquad U^{(j)}\hat\rho = U^{(j)}\hat\rho_j + U^{(j)}\hat\rho^{(j)}.$$

Applying the simple inequality

$$\bigl|\exp\{i(a+b)\} - 1\bigr| \le c\bigl(|a|^{1/2} + |b|^{3/4}\bigr),$$

where $c > 0$ is an absolute constant, we have

$$\exp\bigl\{itZ^{(j)} g_0(1) + itU^{(j)} g_0'(1+\theta\hat\rho)\bigr\} = \exp\bigl\{itZ^{(j)} g_0(1) + itU^{(j)} g_0'(1)\bigr\} + r_1 + r_2,$$

where

$$|r_1| \le \bigl|tU^{(j)}\hat\rho_j\bigr|^{1/2}, \qquad |r_2| \le \bigl|tU^{(j)}\hat\rho^{(j)}\bigr|^{3/4}.$$

Hence $f_{10} = f_{11} + R$, where

$$f_{11}(t) = \sum_{j=1}^N \hat\rho_j\exp\bigl\{itZ^{(j)} g_0(1) + itU^{(j)} g_0'(1)\bigr\}, \qquad |R| \le \sum_{j=1}^N|\hat\rho_j|\Bigl(\bigl|tU^{(j)}\hat\rho_j\bigr|^{1/2} + \bigl|tU^{(j)}\hat\rho^{(j)}\bigr|^{3/4}\Bigr).$$

Observe that $E_Y f_{11} = 0$, since $\hat\rho_j$ and $(Z^{(j)}, U^{(j)})$ are independent and $E_Y\hat\rho_j = 0$. Furthermore, estimating

$$E_Y|\hat\rho_j|^{3/2}\,E_Y\bigl|U^{(j)}\bigr|^{1/2} \le c\,E_Y|\hat\rho_j|^{3/2}, \qquad E_Y|\hat\rho_j|\,E_Y\bigl|U^{(j)}\hat\rho^{(j)}\bigr|^{3/4} \le c\,E_Y|\hat\rho_j|\,E_Y\bigl(|U^{(j)}|^{3/2} + |\hat\rho^{(j)}|^{3/2}\bigr), \qquad \sum_{j=1}^N E_Y|\hat\rho_j| \le c,$$

and using Lemma 2.1 combined with (2.1), we obtain $E_Y|R| \le c\bigl(|t|^{1/2} + |t|^{3/4}\bigr)\Lambda$.

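The elementary bound $|\exp\{i(a+b)\}-1| \le c(|a|^{1/2}+|b|^{3/4})$ used above follows from $|e^{ix}-1| \le \min(2,|x|)$ together with $\min(2,x) \le \sqrt2\,x^{1/2}$ and $\min(2,x) \le 2^{1/4}x^{3/4}$ for $x\ge 0$. A small numerical sketch (illustrative only; the concrete constant $c=3$ is chosen for the experiment and is not claimed to be optimal):

```python
import cmath

# Check |exp(i(a+b)) - 1| <= C*(|a|**0.5 + |b|**0.75) on a grid of real a, b.
C = 3.0

def check(a, b):
    lhs = abs(cmath.exp(1j * (a + b)) - 1.0)
    rhs = C * (abs(a) ** 0.5 + abs(b) ** 0.75)
    return lhs <= rhs + 1e-12  # tiny slack for floating-point rounding

grid = [k / 7.0 for k in range(-70, 71)]  # values in [-10, 10]
assert all(check(a, b) for a in grid for b in grid)
print("inequality holds on the grid")
```

The check is deterministic; the grid and the constant are only a sanity check of the stated elementary inequality, not part of the proof.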
Thus (2.11) is proved, and we have $E\,Q_0\,P(\tau)F(1-\tau) \to 0$. Similarly we prove that $E\,Q_0\,F(\tau)P(1-\tau) \to 0$. We arrive at (2.8).

Step 2. Observe that $E\,Q_0\,F(\tau)F(1-\tau) = E\,Q_0\exp\{itX\} =: f_{12}(t)$. We shall show that $f_{12} \approx \varphi$. Recall that $\psi_i$, $1\le i\le N$, denote independent centered Gaussian random variables with variances $E\psi_i^2 = E Y_i^2$, $1\le i\le N$. Given $\alpha$, let us apply Lemma 2.4 conditionally. We get
$$E_{\psi_1}\cdots E_{\psi_N}\Big|f_{12} - E\exp\Big\{it\sum_{i=1}^N(1-\alpha_i)\psi_i\Big\}\,E_Y\,Q_0\Big| \le c\,|t|^3\,(\Lambda + \Upsilon_2).$$
Thus, using
$$\big|E_\alpha E_Y \exp\{itX\}\big| \le \exp\{-C_3\, m(t)\, t^2\},$$
we obtain
$$f_{12} \approx E\exp\Big\{it\sum_{i=1}^N(1-\alpha_i)\psi_i\Big\}\exp\{itX\}.$$
Furthermore,
$$E\exp\Big\{it\sum_{i=1}^N(1-\alpha_i)\psi_i\Big\}\exp\Big\{it\sum_{i=1}^N\alpha_i\psi_i'\Big\} = \exp\{-t^2/2\}.$$
Here $\psi_1',\dots,\psi_N'$ denote independent copies of $\psi_i$, $1\le i\le N$. Thus Theorem 1.2 is proved.

3. Appendix

Proof of Lemma 2.1. Let us prove the first inequality of the Lemma. Recall that $\kappa_i = \alpha_i\big(\eta_i + M^{-1}Y_i\,E Z_i\big)$. By the triangle inequality we have
$$E_{Y_i}|\kappa_i| \le m\,E_{Y_i}|\eta_i| + m\,E Y_i^2\,M^{-1}|E Z_i|,$$
since $E\alpha_i = m$. Obviously $E_{Y_i}|\eta_i| \le 2\,E|Y_i|^3 \le cM^{-3}E|Z_i|^3$. Similarly $E Y_i^2 \le cM^{-2}E Z_i^2$. Consequently, an application of Hölder's inequality yields $E Y_i^2\,M^{-1}|E Z_i| \le cM^{-3}E|Z_i|^3$. Summing over $i$ we derive the desired bound $\sum_{i=1}^N E_{Y_i}|\kappa_i| \le cm\Lambda$.

To prove the second and the third inequalities of the Lemma, we shall apply the following well-known bound. Assume that $T_0 = 0, T_1, \dots, T_N$ is a martingale-difference sequence, that is, $E(T_j \mid T_1, \dots, T_{j-1}) = 0$. Then
$$E\Big|\sum_{i=1}^N T_i\Big|^p \le c(p)\sum_{i=1}^N E|T_i|^p, \qquad\text{for } 1\le p\le 2. \tag{3.1}$$

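For independent mean-zero summands, a special case of a martingale-difference sequence, inequality (3.1) is the von Bahr–Esseen bound, which for $1\le p\le 2$ holds with $c(p)=2$. A seeded Monte Carlo sketch (illustrative only, not part of the proof; the constant 2 and the choice of the distribution of the $T_i$ are assumptions made just for this experiment):

```python
import random

random.seed(1)
p, n, reps = 1.5, 50, 4000

# T_i = +/- a_i with a_i in {1, 2, 3}: independent, mean zero,
# hence a martingale-difference sequence. Compare a Monte Carlo
# estimate of E|T_1 + ... + T_n|^p with 2 * sum_i E|T_i|^p.
def sample_sum_moment():
    acc = 0.0
    for _ in range(reps):
        s = sum(random.choice((-1.0, 1.0)) * (i % 3 + 1) for i in range(n))
        acc += abs(s) ** p
    return acc / reps

lhs = sample_sum_moment()
rhs = 2.0 * sum((i % 3 + 1) ** p for i in range(n))  # E|T_i|^p = a_i^p exactly
print(lhs, rhs, lhs <= rhs)
```

With these parameters the left-hand side is far below the right-hand side, as (3.1) predicts; the experiment only illustrates the inequality and says nothing about the optimal constant.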
To prove (3.1) define $f(u) = |u|^p$. Then $|f'(s) - f'(t)| \le c|t-s|^{p-1}$ for $1 < p \le 2$. Writing $S_j = T_1 + \dots + T_j$ and expanding into the Taylor series, we have
$$E f(S_{j+1}) = E f(S_j) + E f'(S_j)T_{j+1} + R,$$
where
$$|R| \le c\,E\big|f'(S_j + \theta T_{j+1}) - f'(S_j)\big|\,|T_{j+1}| \le c\,E|T_{j+1}|^p.$$
Thus, it follows that $E f(S_{j+1}) \le E f(S_j) + c\,E|T_{j+1}|^p$, since $E f'(S_j)T_{j+1} = 0$. Applying the last inequality iteratively, we derive (3.1).

Let us prove the second inequality of the Lemma. The random variables $\kappa_1, \dots, \kappa_N$ are independent and have mean zero. Thus, by (3.1),
$$E\Big|\sum_{i=1}^N \kappa_i\Big|^{3/2} \le c\sum_{i=1}^N E|\kappa_i|^{3/2} \le cm\Lambda, \tag{3.2}$$
since, similarly to the proof of the first inequality, $E|\kappa_i|^{3/2} \le cm\,M^{-3}E|Z_i|^3$.

For the proof of the third inequality $E|U|^{3/2} \le cm^{7/4}\Lambda$ of the Lemma, write $U = W + W'$, where
$$W = \sum_{1\le i<j\le N} \alpha_i Y_i \kappa_j, \qquad W' = \sum_{1\le j<i\le N} \alpha_i Y_i \kappa_j.$$
It is sufficient to estimate $E|W|^{3/2}$ and $E|W'|^{3/2}$ since $|U|^{3/2} \le 2|W|^{3/2} + 2|W'|^{3/2}$. Let us estimate $E|W|^{3/2}$; the estimation of $E|W'|^{3/2}$ is similar. Split the sum $W$ as follows: $W = T_2 + \dots + T_N$, where $T_j = \kappa_j(\alpha_1 Y_1 + \dots + \alpha_{j-1}Y_{j-1})$, $2 \le j \le N$. By (3.1), we obtain
$$E|W|^{3/2} \le c\sum_{j=2}^N E|T_j|^{3/2}. \tag{3.3}$$
Furthermore,
$$E|T_j|^{3/2} = E|\kappa_j|^{3/2}\,E|\alpha_1Y_1+\dots+\alpha_{j-1}Y_{j-1}|^{3/2} \le E|\kappa_j|^{3/2}\big(E(\alpha_1Y_1+\dots+\alpha_{j-1}Y_{j-1})^2\big)^{3/4} \le cm^{3/4}\,E|\kappa_j|^{3/2}.$$
Thus (3.3) together with (3.2) implies $E|W|^{3/2} \le cm^{7/4}\Lambda$, which completes the proof of the Lemma.

Proof of Lemma 2.3. Recall that $S = Yg(1+\eta+2A+\Upsilon_2)$, where $A = M^{-1}\sum_{i=1}^N Y_i\,E Z_i$.

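The proof of (3.1) above rests on the Hölder bound $|f'(s)-f'(t)| \le c|t-s|^{p-1}$ for $f(u)=|u|^p$. For $p=3/2$ one has $f'(u) = \tfrac32\,\mathrm{sgn}(u)\,|u|^{1/2}$, and the bound holds with $c=3$ (a deterministic grid check; the constant is chosen for the experiment, not optimal):

```python
import math

def fprime(u, p=1.5):
    # derivative of f(u) = |u|**p, i.e. p * sgn(u) * |u|**(p-1)
    return p * math.copysign(abs(u) ** (p - 1.0), u) if u != 0 else 0.0

C = 3.0
grid = [k / 9.0 for k in range(-90, 91)]  # points in [-10, 10]
ok = all(
    abs(fprime(s) - fprime(t)) <= C * abs(s - t) ** 0.5 + 1e-12
    for s in grid for t in grid
)
assert ok
print("Hoelder bound verified on the grid")
```

The worst case is $t=-s$, where the left-hand side equals $3|s|^{1/2}$ and the right-hand side equals $3\sqrt2\,|s|^{1/2}$, so the margin is visible directly.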
Introduce the statistic $t_Z = t(Z_1,\dots,Z_N)$ based on the observations $Z_1,\dots,Z_N$. Furthermore, denote
$$\bar S = (Y+C)\,g(1+W), \qquad W = \eta + 2A + \Upsilon_2 - N^{-1}(Y+C)^2, \qquad C = M^{-1}\sum_{i=1}^N E Z_i.$$
To prove the Lemma, it is sufficient to show that
$$\sup_x\big|P(t < x) - P(t_Z < x)\big| \le 2\Pi, \tag{3.4}$$
$$\sup_x\big|P(t_Z < x) - P(\bar S < x)\big| \le c\Lambda, \tag{3.5}$$
$$\sup_x\big|P(\bar S < x) - \Phi(x)\big| \le \sup_x\big|P(S < x) - \Phi(x)\big| + c\Upsilon_0 + c\Lambda. \tag{3.6}$$
To prove (3.4) notice that the event $\{t(X_1,\dots,X_N) \ne t(Z_1,\dots,Z_N)\}$ has probability less than $\Pi = \sum_{i=1}^N P(X_i^2 > V^2)$.

Let us prove (3.5). It is easy to verify that
$$t(Z_1,\dots,Z_N) = \frac{Y+C}{\sqrt{1+W}}.$$
The function $g(u) = 1/\sqrt{u}$ for $1/2 \le u \le 7/4$. Therefore the event $\{t(Z_1,\dots,Z_N) \ne (Y+C)g(1+W)\}$ has probability less than $P(|W| > 1/4)$. Thus, it suffices to show that $P(|W| > 1/4) \le c\Lambda$. Notice that
$$|W| \le |\eta| + 2|A| + \Upsilon_2 + 2N^{-1}Y^2 + 2N^{-1}C^2.$$
By (2.1), we have $\Upsilon_2 + 2N^{-1}C^2 \le 1/8$. Therefore
$$P(|W| > 1/4) \le P(|\eta| > 1/24) + P(2|A| > 1/24) + P(2N^{-1}Y^2 > 1/24).$$
We have $P(|\eta| > 1/24) \le c\,E|\eta|^{3/2} \le c\sum_{i=1}^N E|Y_i|^3 \le c\Lambda$. Similarly $P(2|A| > 1/24) \le c\,E|A|^{3/2} \le c\Lambda$. To conclude the proof of (3.5) notice that
$$N^{-1/2} \le \sum_{i=1}^N E|Y_i|^3 \le \Lambda \quad\text{since}\quad \sum_{i=1}^N E Y_i^2 = 1, \tag{3.7}$$

and therefore $P(2N^{-1}Y^2 > 1/24) \le cN^{-1}E\,Y^2 = cN^{-1} \le c\Lambda$.

It remains to prove (3.6). Expanding $g$ in powers of $N^{-1}(Y+C)^2$ we obtain
$$\bar S = Yg(1+W) + Cg(1+W) = Yg(1+\eta+2A+\Upsilon_2) + R = S + R,$$
where $|R| \le R_1 + R_2$ with $R_1 = c|C| = c\,\Upsilon_0$ and $R_2 = cN^{-1}|Y|(Y+C)^2$. Writing $\Phi(x) = P(\xi < x)$, where $\xi$ is a standard normal random variable, we have
$$\sup_x\big|P(\bar S < x) - \Phi(x)\big| \le \sup_x\big|P(S < x) - \Phi(x)\big| + I_1 + I_2$$
with
$$I_1 = \sup_x P\big(\xi \in [x,\, x + 2R_1 + 2N^{-1/2}]\big), \qquad I_2 = P\big(R_2 \ge N^{-1/2}\big).$$
Chebyshev's inequality and (2.1) give
$$I_2 = P\big(R_2 > N^{-1/2}\big) \le cN^{-1/2}\big(E|Y|C^2 + E|Y|^3\big) \le cN^{-1/2}.$$
Estimating
$$I_1 = \sup_x P\big(\xi \in [x,\, x + 2R_1 + 2N^{-1/2}]\big) \le cR_1 + cN^{-1/2} = c\,\Upsilon_0 + cN^{-1/2},$$
and using (3.7), we obtain (3.6), which completes the proof of the Lemma.

Proof of (1.13). For $X_1, X_2, \dots$ defined by (1.10) we have
$$X_i^2 = i^{-1}\xi_i^2 + 2(-1)^i i^{-1/2-p}\xi_i + i^{-2p}.$$
Therefore, for $0 < p < 1/2$,
$$P\big(X_1^2 + \dots + X_N^2 > N^{1-2p}\big) \to 1,$$
since
$$X_1^2 + \dots + X_N^2 \ge 2\sum_{i=1}^N(-1)^i i^{-1/2-p}\xi_i + \sum_{i=1}^N i^{-2p},$$

the first series on the right-hand side converges a.s. by Kolmogorov's Three Series Theorem, and $\sum_{i=1}^N i^{-2p} \ge c(p)\,N^{1-2p}$ with $c(p) > 1$, for $N \ge 2$. Now it follows that $t_N \to 0$ in probability, since $(X_1+\dots+X_N)/\sqrt{\ln(N+2)}$ is asymptotically standard normal. That proves (1.13).

Proof of (1.14). In this case $p = 1/2$, and in order to prove (1.14) it suffices to show that
$$T_N := B_N^{-2}\big(X_1^2+\dots+X_N^2\big) \to 2 \ \text{in probability}, \qquad\text{where } B_N^2 = \sum_{i=1}^N i^{-1}.$$
For any $\varepsilon > 0$, we have
$$P\big(|T_N - 2| > 2\varepsilon\big) = P\Big(\Big|\sum_{i=1}^N i^{-1}(\xi_i^2-1) + 2\sum_{i=1}^N(-1)^i i^{-1}\xi_i\Big| > 2\varepsilon B_N^2\Big)$$
$$\le \varepsilon^{-2}B_N^{-4}\,E\Big(\sum_{i=1}^N i^{-1}(\xi_i^2-1)\Big)^2 + P\Big(\Big|\sum_{i=1}^N(-1)^i i^{-1}\xi_i\Big| > \varepsilon B_N^2\Big).$$
In the last step we applied Chebyshev's inequality. A simple calculation shows that the first summand tends to zero as $N\to\infty$. The second summand tends to zero because the series $\sum_{i=1}^\infty(-1)^i i^{-1}\xi_i$ converges a.s. by Kolmogorov's Three Series Theorem. Hence (1.14) is proved.

Example 3.1. The term $\Upsilon_2 = M^{-2}\sum_{i=1}^N(E Z_i)^2$ in the bound (1.3) is optimal in the sense that it cannot be replaced by
$$\Gamma_\delta := M^{-1}\sum_{i=1}^N |E Z_i|^{2+\delta} \quad\text{or by}\quad \Upsilon_2^{1+\delta}, \tag{3.8}$$
with some $\delta > 0$. Indeed, fix $0 < \varepsilon < 1$ and introduce the sequence $X_1, X_2, \dots$ defined by $X_i = \xi_i + (-1)^i\varepsilon$, where the $\xi_i$ denote i.i.d. standard normal random variables. Choose $V(N) = \sqrt{N}$. A simple calculation shows that $\lim_{N\to\infty}\Upsilon_2 = \varepsilon^2$, and that all other summands on the right-hand side of (1.3) tend to zero as $N\to\infty$. On the other hand, we have
$$P(t_N < 1) = P\Big(\xi < \sqrt{1+\varepsilon^2} + R_N + \varepsilon\big(1-(-1)^N\big)\big/(2\sqrt N)\Big), \tag{3.9}$$

where $R_N$ is a random variable such that $R_N \to 0$ in probability as $N\to\infty$. It follows from (3.9) that there exists an absolute positive constant $c$ such that
$$\liminf_{N\to\infty}\delta_N \ge \liminf_{N\to\infty}\big|P(t_N < 1) - P(\xi < 1)\big| \ge c\,\varepsilon^2. \tag{3.10}$$
The relations (3.9) and (3.10) contradict any estimate of the type (1.3) with the terms (3.8) in place of $\Upsilon_2$, since $\lim_{N\to\infty}\Gamma_\delta = 0$ and $\lim_{N\to\infty}\Upsilon_2^{1+\delta} = \varepsilon^{2+2\delta}$.

References

Bai, Z. D. and Shepp, L. A., A note on the conditional distribution of X when X y is given, Statistics & Probability Letters 19 (1994).
Bentkus, V. and Götze, F., The Berry–Esséen bound for Student's statistic, SFB 343 Preprint (1994), Universität Bielefeld (to appear in Ann. Probab.).
Bentkus, V., Götze, F., Paulauskas, V. and Račkauskas, A., The accuracy of Gaussian approximation in Banach spaces, SFB 343 Preprint, Universität Bielefeld (in Russian: Itogi Nauki i Tehniki, Ser. Sovr. Probl. Matem., Moskva, VINITI 81 (1991); to appear in English in Encyclopedia of Mathematical Sciences, Springer).
Bentkus, V., Götze, F. and van Zwet, W., An Edgeworth expansion for symmetric statistics, SFB 343 Preprint, Universität Bielefeld.
Bhattacharya, R. N. and Denker, M., Asymptotic Statistics, Birkhäuser, Basel–Boston–Berlin.
Bhattacharya, R. N. and Ghosh, J. K., On the validity of the formal Edgeworth expansion, Ann. Statist. 6 (1978).
Chibisov, D. M., Asymptotic expansion for the distribution of a statistic admitting a stochastic expansion I, Theory Probab. Appl. 25 (1980).
Chung, K. L., The approximate distribution of Student's statistic, Ann. Math. Statist. 17 (1946).
Csörgő, S. and Mason, D. M., Approximations of weighted empirical processes with applications to extreme, trimmed and self-normalized sums, Proc. of the First World Congress of the Bernoulli Soc., Tashkent, USSR, vol. 2, VNU Science Press, Utrecht, 1987.
Efron, B., Student's t-test under symmetry conditions, J. Amer. Statist. Assoc. (1969).
Esséen, C.-G., Fourier analysis of distribution functions. A mathematical study of the Laplace–Gaussian law, Acta Math. 77 (1945).
Feller, W., An Introduction to Probability Theory and Its Applications, Vol. II, 2nd ed., Wiley, New York.
Friedrich, K. O., A Berry–Esséen bound for functions of independent random variables, Ann. Statist. 17 (1989).
Götze, F. and van Zwet, W., Edgeworth expansions for asymptotically linear statistics, SFB 343 Preprint, Universität Bielefeld, 1991 (revised version).
Griffin, P. S. and Kuelbs, J. D., Self-normalized laws of the iterated logarithm, Ann. Probab. 17 (1989).
Griffin, P. S. and Kuelbs, J. D., Some extensions of the LIL via self-normalizations, Ann. Probab. 19 (1991).

Griffin, P. S. and Mason, D. M., On the asymptotic normality of self-normalized sums, Math. Proc. Camb. Phil. Soc. 109 (1991).
Hahn, M. G., Kuelbs, J. and Weiner, D. C., The asymptotic joint distribution of self-normalized censored sums and sums of squares, Ann. Probab. 18 (1990).
Hall, P., Edgeworth expansion for Student's t statistic under minimal moment conditions, Ann. Probab. 15 (1987).
Hall, P., On the effect of random norming on the rate of convergence in the Central Limit Theorem, Ann. Probab. 16 (1988).
Helmers, R., The Berry–Esséen bound for studentized U-statistics, Canad. J. Statist. 13 (1985).
Helmers, R. and van Zwet, W., The Berry–Esséen bound for U-statistics, Statistical Decision Theory and Related Topics (S. S. Gupta and J. O. Berger, eds.), vol. 1, Academic Press, New York, 1982.
LePage, R., Woodroofe, M. and Zinn, J., Convergence to a stable distribution via order statistics, Ann. Probab. 9 (1981).
Logan, B., Mallows, C., Rice, S. and Shepp, L., Limit distributions of self-normalized sums, Ann. Probab. 1 (1973).
Maller, R. A., A theorem on products of random variables, with application to regression, Austral. J. Statist. 23 (1981).
Petrov, V. V., Sums of Independent Random Variables, Springer, New York, 1975.
Prášková, Z., Sampling from a finite set of random variables: the Berry–Esséen bound for the studentized mean, Proceedings of the Fourth Prague Symposium on Asymptotic Statistics, Charles Univ., Prague, 1989.
Robbins, H., The distribution of Student's t when the population means are unequal, Ann. Math. Statist. 19 (1948).
Slavova, V. V., On the Berry–Esséen bound for Student's statistic, Lecture Notes in Math. 1155 (1985).
van Zwet, W. R., A Berry–Esséen bound for symmetric statistics, Z. Wahrsch. Verw. Gebiete 66 (1984).


EXISTENCE AND UNIQUENESS OF SOLUTIONS CAUCHY PROBLEM FOR NONLINEAR INFINITE SYSTEMS OF PARABOLIC DIFFERENTIAL-FUNCTIONAL EQUATIONS UNIVERSITATIS IAGELLONICAE ACTA MATHEMATICA, FASCICULUS XL 22 EXISTENCE AND UNIQUENESS OF SOLUTIONS CAUCHY PROBLEM FOR NONLINEAR INFINITE SYSTEMS OF PARABOLIC DIFFERENTIAL-FUNCTIONAL EQUATIONS by Anna

More information

THE ASYMPTOTICS OF L-STATISTICS FOR NON I.I.D. VARIABLES WITH HEAVY TAILS

THE ASYMPTOTICS OF L-STATISTICS FOR NON I.I.D. VARIABLES WITH HEAVY TAILS PROBABILITY AN MATHEMATICAL STATISTICS Vol. 31, Fasc. 2 2011, pp. 285 299 THE ASYMPTOTICS OF L-STATISTICS FOR NON I.I.. VARIABLES WITH HEAVY TAILS BY AAM BA R C Z Y K, ARNOL JA N S S E N AN MARKUS PAU

More information

SOLUTIONS OF SEMILINEAR WAVE EQUATION VIA STOCHASTIC CASCADES

SOLUTIONS OF SEMILINEAR WAVE EQUATION VIA STOCHASTIC CASCADES Communications on Stochastic Analysis Vol. 4, No. 3 010) 45-431 Serials Publications www.serialspublications.com SOLUTIONS OF SEMILINEAR WAVE EQUATION VIA STOCHASTIC CASCADES YURI BAKHTIN* AND CARL MUELLER

More information

Asymptotic efficiency of simple decisions for the compound decision problem

Asymptotic efficiency of simple decisions for the compound decision problem Asymptotic efficiency of simple decisions for the compound decision problem Eitan Greenshtein and Ya acov Ritov Department of Statistical Sciences Duke University Durham, NC 27708-0251, USA e-mail: eitan.greenshtein@gmail.com

More information

ONE DIMENSIONAL MARGINALS OF OPERATOR STABLE LAWS AND THEIR DOMAINS OF ATTRACTION

ONE DIMENSIONAL MARGINALS OF OPERATOR STABLE LAWS AND THEIR DOMAINS OF ATTRACTION ONE DIMENSIONAL MARGINALS OF OPERATOR STABLE LAWS AND THEIR DOMAINS OF ATTRACTION Mark M. Meerschaert Department of Mathematics University of Nevada Reno NV 89557 USA mcubed@unr.edu and Hans Peter Scheffler

More information

On variable bandwidth kernel density estimation

On variable bandwidth kernel density estimation JSM 04 - Section on Nonparametric Statistics On variable bandwidth kernel density estimation Janet Nakarmi Hailin Sang Abstract In this paper we study the ideal variable bandwidth kernel estimator introduced

More information

Conditional moment representations for dependent random variables

Conditional moment representations for dependent random variables Conditional moment representations for dependent random variables W lodzimierz Bryc Department of Mathematics University of Cincinnati Cincinnati, OH 45 22-0025 bryc@ucbeh.san.uc.edu November 9, 995 Abstract

More information

Non linear functionals preserving normal distribution and their asymptotic normality

Non linear functionals preserving normal distribution and their asymptotic normality Armenian Journal of Mathematics Volume 10, Number 5, 018, 1 0 Non linear functionals preserving normal distribution and their asymptotic normality L A Khachatryan and B S Nahapetian Abstract We introduce

More information

The circular law. Lewis Memorial Lecture / DIMACS minicourse March 19, Terence Tao (UCLA)

The circular law. Lewis Memorial Lecture / DIMACS minicourse March 19, Terence Tao (UCLA) The circular law Lewis Memorial Lecture / DIMACS minicourse March 19, 2008 Terence Tao (UCLA) 1 Eigenvalue distributions Let M = (a ij ) 1 i n;1 j n be a square matrix. Then one has n (generalised) eigenvalues

More information

arxiv: v1 [math.st] 2 Apr 2016

arxiv: v1 [math.st] 2 Apr 2016 NON-ASYMPTOTIC RESULTS FOR CORNISH-FISHER EXPANSIONS V.V. ULYANOV, M. AOSHIMA, AND Y. FUJIKOSHI arxiv:1604.00539v1 [math.st] 2 Apr 2016 Abstract. We get the computable error bounds for generalized Cornish-Fisher

More information

Notes on Poisson Approximation

Notes on Poisson Approximation Notes on Poisson Approximation A. D. Barbour* Universität Zürich Progress in Stein s Method, Singapore, January 2009 These notes are a supplement to the article Topics in Poisson Approximation, which appeared

More information

Theorem 2.1 (Caratheodory). A (countably additive) probability measure on a field has an extension. n=1

Theorem 2.1 (Caratheodory). A (countably additive) probability measure on a field has an extension. n=1 Chapter 2 Probability measures 1. Existence Theorem 2.1 (Caratheodory). A (countably additive) probability measure on a field has an extension to the generated σ-field Proof of Theorem 2.1. Let F 0 be

More information

POSITIVE DEFINITE FUNCTIONS AND MULTIDIMENSIONAL VERSIONS OF RANDOM VARIABLES

POSITIVE DEFINITE FUNCTIONS AND MULTIDIMENSIONAL VERSIONS OF RANDOM VARIABLES POSITIVE DEFINITE FUNCTIONS AND MULTIDIMENSIONAL VERSIONS OF RANDOM VARIABLES ALEXANDER KOLDOBSKY Abstract. We prove that if a function of the form f( K is positive definite on, where K is an origin symmetric

More information

Packing-Dimension Profiles and Fractional Brownian Motion

Packing-Dimension Profiles and Fractional Brownian Motion Under consideration for publication in Math. Proc. Camb. Phil. Soc. 1 Packing-Dimension Profiles and Fractional Brownian Motion By DAVAR KHOSHNEVISAN Department of Mathematics, 155 S. 1400 E., JWB 233,

More information

PRECISE ASYMPTOTIC IN THE LAW OF THE ITERATED LOGARITHM AND COMPLETE CONVERGENCE FOR U-STATISTICS

PRECISE ASYMPTOTIC IN THE LAW OF THE ITERATED LOGARITHM AND COMPLETE CONVERGENCE FOR U-STATISTICS PRECISE ASYMPTOTIC IN THE LAW OF THE ITERATED LOGARITHM AND COMPLETE CONVERGENCE FOR U-STATISTICS REZA HASHEMI and MOLUD ABDOLAHI Department of Statistics, Faculty of Science, Razi University, 67149, Kermanshah,

More information

PACKING-DIMENSION PROFILES AND FRACTIONAL BROWNIAN MOTION

PACKING-DIMENSION PROFILES AND FRACTIONAL BROWNIAN MOTION PACKING-DIMENSION PROFILES AND FRACTIONAL BROWNIAN MOTION DAVAR KHOSHNEVISAN AND YIMIN XIAO Abstract. In order to compute the packing dimension of orthogonal projections Falconer and Howroyd 997) introduced

More information

Anti-concentration Inequalities

Anti-concentration Inequalities Anti-concentration Inequalities Roman Vershynin Mark Rudelson University of California, Davis University of Missouri-Columbia Phenomena in High Dimensions Third Annual Conference Samos, Greece June 2007

More information

Introduction and Preliminaries

Introduction and Preliminaries Chapter 1 Introduction and Preliminaries This chapter serves two purposes. The first purpose is to prepare the readers for the more systematic development in later chapters of methods of real analysis

More information

On probabilities of large and moderate deviations for L-statistics: a survey of some recent developments

On probabilities of large and moderate deviations for L-statistics: a survey of some recent developments UDC 519.2 On probabilities of large and moderate deviations for L-statistics: a survey of some recent developments N. V. Gribkova Department of Probability Theory and Mathematical Statistics, St.-Petersburg

More information

A note on the growth rate in the Fazekas Klesov general law of large numbers and on the weak law of large numbers for tail series

A note on the growth rate in the Fazekas Klesov general law of large numbers and on the weak law of large numbers for tail series Publ. Math. Debrecen 73/1-2 2008), 1 10 A note on the growth rate in the Fazekas Klesov general law of large numbers and on the weak law of large numbers for tail series By SOO HAK SUNG Taejon), TIEN-CHUNG

More information

4 Sums of Independent Random Variables

4 Sums of Independent Random Variables 4 Sums of Independent Random Variables Standing Assumptions: Assume throughout this section that (,F,P) is a fixed probability space and that X 1, X 2, X 3,... are independent real-valued random variables

More information

CONVOLUTION OPERATORS IN INFINITE DIMENSION

CONVOLUTION OPERATORS IN INFINITE DIMENSION PORTUGALIAE MATHEMATICA Vol. 51 Fasc. 4 1994 CONVOLUTION OPERATORS IN INFINITE DIMENSION Nguyen Van Khue and Nguyen Dinh Sang 1 Introduction Let E be a complete convex bornological vector space (denoted

More information

Zeros of lacunary random polynomials

Zeros of lacunary random polynomials Zeros of lacunary random polynomials Igor E. Pritsker Dedicated to Norm Levenberg on his 60th birthday Abstract We study the asymptotic distribution of zeros for the lacunary random polynomials. It is

More information

arxiv: v2 [math.st] 20 Feb 2013

arxiv: v2 [math.st] 20 Feb 2013 Exact bounds on the closeness between the Student and standard normal distributions arxiv:1101.3328v2 [math.st] 20 Feb 2013 Contents Iosif Pinelis Department of Mathematical Sciences Michigan Technological

More information

Asymptotically Efficient Nonparametric Estimation of Nonlinear Spectral Functionals

Asymptotically Efficient Nonparametric Estimation of Nonlinear Spectral Functionals Acta Applicandae Mathematicae 78: 145 154, 2003. 2003 Kluwer Academic Publishers. Printed in the Netherlands. 145 Asymptotically Efficient Nonparametric Estimation of Nonlinear Spectral Functionals M.

More information

Brownian motion. Samy Tindel. Purdue University. Probability Theory 2 - MA 539

Brownian motion. Samy Tindel. Purdue University. Probability Theory 2 - MA 539 Brownian motion Samy Tindel Purdue University Probability Theory 2 - MA 539 Mostly taken from Brownian Motion and Stochastic Calculus by I. Karatzas and S. Shreve Samy T. Brownian motion Probability Theory

More information

THE L 2 -HODGE THEORY AND REPRESENTATION ON R n

THE L 2 -HODGE THEORY AND REPRESENTATION ON R n THE L 2 -HODGE THEORY AND REPRESENTATION ON R n BAISHENG YAN Abstract. We present an elementary L 2 -Hodge theory on whole R n based on the minimization principle of the calculus of variations and some

More information

Bahadur representations for bootstrap quantiles 1

Bahadur representations for bootstrap quantiles 1 Bahadur representations for bootstrap quantiles 1 Yijun Zuo Department of Statistics and Probability, Michigan State University East Lansing, MI 48824, USA zuo@msu.edu 1 Research partially supported by

More information

Advanced Statistics II: Non Parametric Tests

Advanced Statistics II: Non Parametric Tests Advanced Statistics II: Non Parametric Tests Aurélien Garivier ParisTech February 27, 2011 Outline Fitting a distribution Rank Tests for the comparison of two samples Two unrelated samples: Mann-Whitney

More information

On non negative solutions of some quasilinear elliptic inequalities

On non negative solutions of some quasilinear elliptic inequalities On non negative solutions of some quasilinear elliptic inequalities Lorenzo D Ambrosio and Enzo Mitidieri September 28 2006 Abstract Let f : R R be a continuous function. We prove that under some additional

More information

Regularity of the density for the stochastic heat equation

Regularity of the density for the stochastic heat equation Regularity of the density for the stochastic heat equation Carl Mueller 1 Department of Mathematics University of Rochester Rochester, NY 15627 USA email: cmlr@math.rochester.edu David Nualart 2 Department

More information

ERRATA: Probabilistic Techniques in Analysis

ERRATA: Probabilistic Techniques in Analysis ERRATA: Probabilistic Techniques in Analysis ERRATA 1 Updated April 25, 26 Page 3, line 13. A 1,..., A n are independent if P(A i1 A ij ) = P(A 1 ) P(A ij ) for every subset {i 1,..., i j } of {1,...,

More information

BURGESS INEQUALITY IN F p 2. Mei-Chu Chang

BURGESS INEQUALITY IN F p 2. Mei-Chu Chang BURGESS INEQUALITY IN F p 2 Mei-Chu Chang Abstract. Let be a nontrivial multiplicative character of F p 2. We obtain the following results.. Given ε > 0, there is δ > 0 such that if ω F p 2\F p and I,

More information

A Law of the Iterated Logarithm. for Grenander s Estimator

A Law of the Iterated Logarithm. for Grenander s Estimator A Law of the Iterated Logarithm for Grenander s Estimator Jon A. Wellner University of Washington, Seattle Probability Seminar October 24, 2016 Based on joint work with: Lutz Dümbgen and Malcolm Wolff

More information

Probability and Measure

Probability and Measure Part II Year 2018 2017 2016 2015 2014 2013 2012 2011 2010 2009 2008 2007 2006 2005 2018 84 Paper 4, Section II 26J Let (X, A) be a measurable space. Let T : X X be a measurable map, and µ a probability

More information

Density estimators for the convolution of discrete and continuous random variables

Density estimators for the convolution of discrete and continuous random variables Density estimators for the convolution of discrete and continuous random variables Ursula U Müller Texas A&M University Anton Schick Binghamton University Wolfgang Wefelmeyer Universität zu Köln Abstract

More information

Asymptotic Statistics-III. Changliang Zou

Asymptotic Statistics-III. Changliang Zou Asymptotic Statistics-III Changliang Zou The multivariate central limit theorem Theorem (Multivariate CLT for iid case) Let X i be iid random p-vectors with mean µ and and covariance matrix Σ. Then n (

More information