FUNCTIONAL LAWS OF THE ITERATED LOGARITHM FOR THE INCREMENTS OF THE COMPOUND EMPIRICAL PROCESS

Myriam MAUMY

Abstract. Let $\{Y_i : i \ge 1\}$ be a sequence of i.i.d. random variables and let $\{U_i : i \ge 1\}$ be a sequence of independent random variables uniformly distributed on $(0,1)$. The two sequences are assumed to be independent. Let $\{\alpha_{n,c}(t) : 0 \le t \le 1\}$ be the compound empirical process generated by the first $n$ observations from these two sequences of random variables. Let $0 < h_n < 1$ be a sequence of constants such that $h_n \to 0$ as $n \to \infty$. We investigate the strong limiting behavior, as $n \to \infty$, of the increment function $\{\alpha_{n,c}(x + h_n u) - \alpha_{n,c}(x) : 0 \le u \le 1\}$, where $0 \le x \le 1 - h_n$. Under suitable regularity assumptions imposed upon $h_n$, we prove functional laws of the iterated logarithm for this increment function.

AMS classification: 60F17. Keywords: laws of the iterated logarithm, compound empirical process.

1. Notation

Let $U_1, U_2, \ldots$ be a sequence of independent uniform $(0,1)$ random variables (r.v.) and, for each integer $n \ge 1$, let

$U_n(x) = \dfrac{1}{n}\sum_{i=1}^{n} 1_{\{U_i \le x\}}$, $0 \le x \le 1$,

be the right-continuous empirical distribution function based on the first $n$ of these uniform $(0,1)$ random variables. Here, $1_A$ denotes the indicator function of the event $A$. The uniform empirical process is denoted, for each integer $n \ge 1$, by

$\alpha_n(x) = \sqrt{n}\,\bigl( U_n(x) - x \bigr)$, $0 \le x \le 1$.
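To fix ideas, here is a minimal numerical sketch of $U_n$ and $\alpha_n$ (Python with NumPy assumed; the sample size and evaluation grid are illustrative choices, not part of the paper).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
U = rng.uniform(0.0, 1.0, size=n)            # U_1, ..., U_n ~ Uniform(0, 1)

def U_n(x, sample):
    """Right-continuous empirical distribution function U_n(x)."""
    x = np.atleast_1d(x)
    return (sample[None, :] <= x[:, None]).mean(axis=1)

def alpha_n(x, sample):
    """Uniform empirical process alpha_n(x) = sqrt(n) * (U_n(x) - x)."""
    x = np.atleast_1d(x)
    return np.sqrt(len(sample)) * (U_n(x, sample) - x)

grid = np.linspace(0.0, 1.0, 201)
print(alpha_n(grid, U).min(), alpha_n(grid, U).max())   # O(1) fluctuations
```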

For any $0 < h < 1$ and integer $n \ge 1$, consider the increment function

(1.1) $\xi_n(h, x; u) = \alpha_n(x + hu) - \alpha_n(x)$, for $0 \le u \le 1$ and $0 \le x \le 1 - h$.

Let $Y = Y_1, Y_2, \ldots$ be a sequence of independent and identically distributed (i.i.d.) random variables. Moreover, we suppose that $\mathbb{E}Y = 1$ and that $0 < \sigma^2 = \operatorname{Var} Y < \infty$. The sequences $\{U_i : i \ge 1\}$ and $\{Y_i : i \ge 1\}$ are assumed to be independent. For each integer $n \ge 1$, let

$U_{n,c}(x) = \dfrac{1}{n}\sum_{i=1}^{n} Y_i 1_{\{U_i \le x\}}$, $0 \le x \le 1$,

be the right-continuous compound empirical distribution function, based on the first $n$ of these random variables. Here, the subscript $c$ is used with the meaning of compound. The compound empirical process is written, for each integer $n \ge 1$, as

(1.2) $\alpha_{n,c}(x) = \sqrt{n}\,\bigl( U_{n,c}(x) - x \bigr)$, $0 \le x \le 1$.

For any $0 < h < 1$ and integer $n \ge 1$, consider the increment function

(1.3) $\xi_{n,c}(h, x; u) = \alpha_{n,c}(x + hu) - \alpha_{n,c}(x)$, for $0 \le u \le 1$ and $0 \le x \le 1 - h$.

A sequence of constants $(h_n)_{n \ge 1}$ will be said to satisfy the Csörgő–Révész–Stute (see, e.g., Csörgő–Révész [4], Stute [15]) conditions (CRS) if the following hold:

(S.1) $0 < h_n < 1$ for $n \ge 1$, $h_n \downarrow 0$ and $n h_n \uparrow \infty$ as $n \to \infty$;
(S.2) $\log(1/h_n)/\log\log n \to \infty$ as $n \to \infty$;
(S.3) $n h_n/\log n \to \infty$ as $n \to \infty$.

At times, we will make use of the condition

(S.4) $n h_n/\log\log n \to \infty$ as $n \to \infty$.

The Strassen set $S$ (see, e.g., Strassen [14]) consists of all absolutely continuous functions $f$ on $[0,1]$ such that $f(0) = 0$ and $\int_0^1 (\dot f(s))^2\,ds \le 1$, where $\dot f$ denotes the Lebesgue derivative of $f$. Let $B(0,1)$ be the set of all bounded functions on $[0,1]$. The sup-norm of a function $f \in B(0,1)$ is denoted by $\|f\| = \sup_{0 \le s \le 1} |f(s)|$. For any $\varepsilon > 0$ and $C \subseteq B(0,1)$, we denote by $C^\varepsilon$ the set of all functions $f \in B(0,1)$ such that there exists a $g \in C$ with $\|f - g\| < \varepsilon$.
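For instance, $h_n = (\log n)^2/n$ (for $n$ large) satisfies (S.1)–(S.4). The sketch below (Python with NumPy assumed; the choice $Y_i \sim$ Uniform$(0.5, 1.5)$, which is bounded with $\mathbb{E}Y = 1$, and the values of $n$, $h$, $x$ are illustrative) builds the compound empirical process (1.2) and one increment function (1.3).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
U = rng.uniform(0.0, 1.0, size=n)            # uniform marks U_i
Y = rng.uniform(0.5, 1.5, size=n)            # bounded i.i.d. Y_i with E[Y] = 1

def alpha_nc(x, U, Y):
    """Compound empirical process alpha_{n,c}(x) = sqrt(n) * (U_{n,c}(x) - x)."""
    x = np.atleast_1d(x)
    n = len(U)
    U_nc = (Y[None, :] * (U[None, :] <= x[:, None])).sum(axis=1) / n
    return np.sqrt(n) * (U_nc - x)

def xi_nc(h, x, u, U, Y):
    """Increment function xi_{n,c}(h, x; u) = alpha_{n,c}(x + h*u) - alpha_{n,c}(x)."""
    return alpha_nc(x + h * np.asarray(u), U, Y) - alpha_nc(x, U, Y)

h, x = 0.05, 0.3
u_grid = np.linspace(0.0, 1.0, 101)
print(np.abs(xi_nc(h, x, u_grid, U, Y)).max())    # sup-norm of one increment function
```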

2. Strassen-type functional laws of the iterated logarithm for the increments of compound empirical processes

2.1. Introduction

The tail empirical process (see, e.g., Mason [9]) is defined for each integer $n \ge 1$ by

$h_n^{-1/2}\,\alpha_n(h_n x)$, $0 \le x \le 1$,

where $\alpha_n$ is the uniform empirical process, and $h_n$ is a sequence of positive constants such that $h_n \to 0$ and $n h_n \to \infty$ as $n \to \infty$. In order to motivate the results of this section, we cite the functional law of the iterated logarithm obtained by Mason (1988) for the tail empirical process based on $\alpha_n$. This result is stated in Theorem 2.1 below.

THEOREM 2.1. Under (S.1) and (S.4), the sequence of functions

(2.1) $\Bigl\{ \dfrac{\alpha_n(h_n x)}{\sqrt{2 h_n \log\log n}} : 0 \le x \le 1 \Bigr\}$

is almost surely relatively compact in $B(0,1)$ endowed with the topology of uniform convergence on $[0,1]$. Moreover, the set of limit points is equal to the Strassen set $S$.

Proof. See Mason [9].

Next, we state in Theorem 2.2 below the functional law of the iterated logarithm due to Deheuvels and Mason (1992) for the increment process $\xi_n$, as defined in (1.1).

THEOREM 2.2. Under the CRS conditions (S.1)–(S.3), for any $\varepsilon > 0$, there exists almost surely a finite $n_\varepsilon$ such that, for all $n \ge n_\varepsilon$, we have

(2.2) $\Bigl\{ \dfrac{\xi_n(h_n, x; \cdot)}{\sqrt{2 h_n \log(1/h_n)}} : 0 \le x \le 1 - h_n \Bigr\} \subseteq S^\varepsilon$.

Moreover, for any $f \in S$ and each $\varepsilon > 0$, there exists almost surely a finite $n_{\varepsilon,f}$ such that, for all $n \ge n_{\varepsilon,f}$, there exists an $x$, with $x = x_{n,\varepsilon,f} \in [0, 1 - h_n]$, such that

(2.3) $\Bigl\| \dfrac{\xi_n(h_n, x; \cdot)}{\sqrt{2 h_n \log(1/h_n)}} - f \Bigr\| < \varepsilon$.

Proof. See Deheuvels and Mason [5].

2.2. Main result

We now state the versions of the functional laws of the iterated logarithm which hold for the increment process $\xi_{n,c}$ of the compound empirical process, as defined in (1.3). Our main result is as follows.

THEOREM 2.3. Suppose that the sequence $\{Y_i : i \ge 1\}$ of i.i.d. random variables is such that $\mathbb{E}Y_i = 1$, $\operatorname{Var} Y_i = \sigma^2 < \infty$ and $|Y_i - 1| \le M$, for $i \ge 1$. Under the CRS conditions (S.1)–(S.3), for any $\varepsilon > 0$ there exists almost surely a finite $n_\varepsilon$ such that, for all $n \ge n_\varepsilon$, we have

(2.4) $\Bigl\{ \dfrac{\xi_{n,c}(h_n, x; \cdot)}{\sqrt{2(\sigma^2+1) h_n \log(1/h_n)}} : 0 \le x \le 1 - h_n \Bigr\} \subseteq S^\varepsilon$.

Moreover, for any $f \in S$ and each $\varepsilon > 0$, there exists almost surely a finite $n_{\varepsilon,f}$ such that, for all $n \ge n_{\varepsilon,f}$, there exists an $x$, with $x = x_{n,\varepsilon,f} \in [0, 1 - h_n]$, such that

(2.5) $\Bigl\| \dfrac{\xi_{n,c}(h_n, x; \cdot)}{\sqrt{2(\sigma^2+1) h_n \log(1/h_n)}} - f \Bigr\| < \varepsilon$.

3. Preliminary results on compound processes

3.1. Results on the compound Poisson process

A compound Poisson process $\{\Pi_c(t) : t \ge 0\}$ with rate $\lambda$ is defined by

(3.1) $\Pi_c(t) = \sum_{i=1}^{\Pi(t)} Y_i$, $t \ge 0$,

where $\{\Pi(t) : t \ge 0\}$ denotes a Poisson process with rate $\lambda > 0$, and $\{Y_i : i \ge 1\}$ is a sequence of independent and identically distributed random variables with common distribution function $F(y) = P(Y \le y)$, where $Y = Y_1$. The Poisson process $\{\Pi(t) : t \ge 0\}$ and the sequence $\{Y_i : i \ge 1\}$ are assumed to be independent. Recall that the subscript $c$ in $\Pi_c$ is used with the meaning of compound.

Remark 3.1. In (3.1), we use the convention $\sum_{i=1}^{0} Y_i = 0$.

THEOREM 3.1. The compound Poisson process $\{\Pi_c(t) : t \ge 0\}$ with rate $\lambda$ based upon $\{Y_i : i \ge 1\}$ has stationary independent increments, and characteristic function given, for any $t \ge 0$, by

$\varphi_{\Pi_c(t)}(u) = \exp\bigl( -\lambda t\,(1 - \varphi_Y(u)) \bigr)$,

where $\varphi_Y(u)$ is the characteristic function of $Y$. Provided that $\mathbb{E}Y^2 < \infty$, $\Pi_c(t)$ has finite second moments given, for $t \ge 0$, by

$\mathbb{E}\,\Pi_c(t) = \mu\lambda t$, $\operatorname{Var}\Pi_c(t) = (\sigma^2 + \mu^2)\lambda t$,

where $\mu$ and $\sigma^2$ are the mean and variance, respectively, of $Y$.
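The moment formulas in Theorem 3.1 are easy to check by simulation. A minimal sketch (Python with NumPy assumed; the values of $\lambda$, $t$ and the law of $Y$ are illustrative): for $Y \sim$ Uniform$(0.5, 1.5)$ one has $\mu = 1$ and $\sigma^2 = 1/12$, so $\mathbb{E}\,\Pi_c(t) = \lambda t$ and $\operatorname{Var}\Pi_c(t) = (13/12)\lambda t$.

```python
import numpy as np

rng = np.random.default_rng(2)
lam, t, reps = 3.0, 2.0, 50_000
mu, sigma2 = 1.0, 1.0 / 12.0                  # Y ~ Uniform(0.5, 1.5)

counts = rng.poisson(lam * t, size=reps)      # Pi(t) ~ Poisson(lambda * t), one draw per replication
Pi_c = np.array([rng.uniform(0.5, 1.5, size=k).sum() for k in counts])   # compound sums (empty sum = 0)

print(Pi_c.mean(), mu * lam * t)                    # both close to 6.0
print(Pi_c.var(), (sigma2 + mu ** 2) * lam * t)     # both close to 6.5
```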

Proof. See, for example, Karlin and Taylor [7].

THEOREM 3.2. The centered compound Poisson process $\{\Pi_c(t) - \mu\lambda t : t \ge 0\}$ is a martingale with respect to the $\sigma$-algebras $\mathcal{A}_s = \sigma\{\Pi_c(u) : u \le s\}$.

Proof. See, e.g., Maumy [11].

3.2. A useful Poisson approximation theorem

Our next result relates compound empirical measures to compound Poisson processes. It relies on the fact that, conditionally on their total number of points, compound Poisson processes are compound empirical measures. It goes as follows. Consider a separable metric space $E$ with Borel $\sigma$-algebra of subsets $\mathcal{B}$. Denote by $\mu$ a bounded positive Borel measure on $E$. Next, consider an independent and identically distributed sequence of $E$-valued random variables $\xi_1, \xi_2, \ldots$, with probability distribution

$P(\xi \in B) = \bar\mu(B) = \mu(B)/\mu(E)$, for $B \in \mathcal{B}$.

Consider also an independent and identically distributed sequence of random variables $Y = Y_1, Y_2, \ldots$, with common distribution function $F$ and such that $\mathbb{E}Y = 1$. Moreover, the sequence of random variables $\{Y_i : i \ge 1\}$ is independent of the sequence of $E$-valued random variables $\{\xi_i : i \ge 1\}$. Define now a random variable $N(\lambda)$, independent of $\xi_1, \xi_2, \ldots$ and of $Y = Y_1, Y_2, \ldots$, following a Poisson distribution with mean $\lambda > 0$, i.e.

$P(N(\lambda) = k) = \dfrac{\lambda^k}{k!}\exp(-\lambda)$ for $k \in \mathbb{N}$.

The Poisson process on $E$ with mean measure $\nu = \lambda\bar\mu$ is defined as the random point measure on $E$ given by

$\Pi = \sum_{i=1}^{N(\lambda)} \delta_{\xi_i}$.

The compound Poisson process on $(E, \mathbb{R})$ with mean measure $\nu_c = \lambda\bar\mu\,\mathbb{E}Y = \lambda\bar\mu$ is defined as the marked point measure on $(E, \mathbb{R})$ given by

$\Pi_c = \sum_{i=1}^{N(\lambda)} Y_i\,\delta_{\xi_i}$,

where we use the convention that a sum over an empty index set equals $0$. Now, the most important feature of the random measure $\Pi_c$ is that, for any collection $A_1, A_2, \ldots$ of disjoint Borel subsets of $(E, \mathbb{R})$, the random variables $\Pi_c(A_1), \Pi_c(A_2), \ldots$ are independent. Moreover, for each Borel set $A$, the random variable $\Pi_c(A)$ follows a compound Poisson distribution with mean $\nu_c(A)$. It is not difficult to check that, if $\Pi_{1,c}$ and $\Pi_{2,c}$ are two independent compound Poisson processes with mean measures $\nu_{1,c}$ and $\nu_{2,c}$, then the superposition $\Pi_{1,c} + \Pi_{2,c}$ of $\Pi_{1,c}$ and $\Pi_{2,c}$ is a compound Poisson process with mean measure $\nu_{1,c} + \nu_{2,c}$ (see, for example, [7]).

Given this fact, we may get rid of our previous assumption that the mean measures of the compound Poisson processes are bounded, by defining a general compound Poisson process with mean measure

$\nu_c = \sum_{i=-\infty}^{\infty} \nu_{i,c}$.

This process is obtained by superposing a countable sequence of independent compound Poisson processes, with mean measures $\{\nu_{i,c} : -\infty < i < \infty\}$. Of particular interest is the case where $\nu_c$ is the compound measure on $\mathbb{R}$ and $\nu_{i,c}$ is the uniform measure on $[i, i+1)$ weighted by $Y_i$. In this case, one obtains the compound Poisson process on the line. If we denote this process by $\Pi_{c,\infty}$, it is convenient to define, on $\mathbb{R}_+$, its cumulated right-continuous distribution function

$\Pi_c(t) = \Pi_{c,\infty}([0, t])$, for $0 \le t \le 1$.

Given these facts, the following theorem is straightforward. We denote by $\overset{d}{=}$ equality in distribution.

THEOREM 3.3. We have the following distributional identity. For any $\lambda > 0$,

$\{ n U_{n,c}(t) : 0 \le t \le 1 \} \overset{d}{=} \{ \Pi_c(\lambda t) : 0 \le t \le 1 \mid N(\lambda) = n \}$.

Proof. Omitted.

Since the choice of $\lambda > 0$ in Theorem 3.3 is arbitrary, it is convenient to choose $\lambda = n$. By so doing, we have the equality of expectations, with $\Pi_{n,c}(t) = \Pi_c(nt)$,

$\mathbb{E}\bigl( n U_{n,c}(t) \bigr) = \mathbb{E}\,\Pi_c(nt) = \mathbb{E}\,\Pi_{n,c}(t) = nt$, for $0 \le t \le 1$.

Below, we denote by $\mathcal{B}_h$ the Borel $\sigma$-algebra of subsets induced on $RC[0,h]$ (the space of right-continuous functions on $[0,h]$, endowed with the topology of uniform convergence).

THEOREM 3.4. For each $0 < h < 1$, there exists a constant $0 < C_h < \infty$ such that, for all $n \ge 1$ and all $B \in \mathcal{B}_h$,

$P\bigl( \{n U_{n,c}(t) : 0 \le t \le h\} \in B \bigr) \le C_h\, P\bigl( \{\Pi_{n,c}(t) : 0 \le t \le h\} \in B \bigr)$.

Proof. Consider first a $\lambda > 0$ and a random variable $Z$ following a Poisson $\mathrm{Po}(\lambda)$ distribution, i.e., such that

$P(Z = k) = \dfrac{\lambda^k}{k!}\exp(-\lambda)$ for $k \in \mathbb{N}$.

It is easily checked that

(3.2) $M(\lambda) = \max_{k \in \mathbb{N}} P(Z = k) = P(Z = \lfloor\lambda\rfloor) = \dfrac{\lambda^{\lfloor\lambda\rfloor}}{\lfloor\lambda\rfloor!}\exp(-\lambda)$,

where $\lfloor u\rfloor \le u < \lfloor u\rfloor + 1$ denotes the integer part of $u$. Some routine calculus making use of the Stirling formula (that is, $n! = (1 + o(1))(n/e)^n\sqrt{2\pi n}$ as $n \to \infty$) shows that, as $\lambda \to \infty$,

(3.3) $M(\lambda) = \dfrac{1 + o(1)}{\sqrt{2\pi\lambda}}$.
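The asymptotics (3.3) can be checked numerically in a few lines (Python assumed; the values of $\lambda$ are illustrative), computing $M(\lambda) = P(Z = \lfloor\lambda\rfloor)$ exactly through the log-probability.

```python
import math

def M(lam):
    """Maximal probability M(lambda) of a Poisson(lambda) variable, attained at k = floor(lambda)."""
    k = math.floor(lam)
    return math.exp(k * math.log(lam) - lam - math.lgamma(k + 1))

for lam in (10.0, 100.0, 1000.0, 10000.0):
    print(lam, M(lam), 1.0 / math.sqrt(2.0 * math.pi * lam))   # the two values get closer as lambda grows
```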

Now, under the assumptions of Theorem 3.4, set $R = N(nh)$ and $R' = N(n) - N(nh)$. Observe that $R$ and $R'$ are independent and follow Poisson distributions, with respective parameters $nh$ and $n(1-h)$. Since $R + R' = N(n)$ also follows a Poisson distribution, with parameter $n$, it follows from (3.2)–(3.3) that, as $n \to \infty$,

(3.4) $\sup_{k \in \mathbb{N}} \dfrac{P(R' = k)}{P(R + R' = n)} \le \dfrac{M(n(1-h))}{M(n)} = \dfrac{(1 + o(1))/\sqrt{2\pi n(1-h)}}{(1 + o(1))/\sqrt{2\pi n}} \to \dfrac{1}{\sqrt{1-h}}$.

It follows from (3.4) that we may define a finite constant $C_h$ by setting

(3.5) $C_h = \max\Bigl\{ \sup_{n \ge 1} \dfrac{M(n(1-h))}{M(n)},\ \dfrac{1}{\sqrt{1-h}} \Bigr\}$.

Introduce now the events

$E_1 = \bigl\{ \{n U_{n,c}(t) : 0 \le t \le h\} \in B \bigr\}$ and $E_2 = \bigl\{ \{\Pi_{n,c}(t) : 0 \le t \le h\} \in B \bigr\}$.

Observe, via Theorem 3.3 and (3.4)–(3.5), that, uniformly over $B \in \mathcal{B}_h$,

$P(E_1) = P(E_2 \mid R + R' = n) = \sum_{k=0}^{n} P\bigl( E_2 \cap \{R = n - k\} \bigr)\dfrac{P(R' = k)}{P(R + R' = n)} \le P(E_2)\,\sup_{k \in \mathbb{N}}\dfrac{P(R' = k)}{P(R + R' = n)} \le C_h\, P(E_2)$,

which was to be proved.

3.3. Results on the modulus of continuity of compound processes

Let $h$ be a real number such that $0 < h < 1$ and let $C$ be an interval $[s, t]$ in $[0,1]$; write $|C| = t - s$, and let $X(C) = X(t) - X(s)$ denote the corresponding increment. Introduce now the modulus of continuity $\omega_X$ of the process $X$, defined by

$\omega_X(h) = \sup_{|C| \le h} |X(C)| = \sup_{|t-s| \le h} |X(t) - X(s)| = \sup_{0 \le u \le h}\ \sup_{0 \le t \le 1-u} |X(t + u) - X(t)|$.

INEQUALITY 3.1. Let $h, \delta$ be real numbers such that $0 < h \le \delta \le 1/2$. Suppose that the i.i.d. random variables $Y = Y_1, \ldots, Y_n$ are such that $|Y_i - \mathbb{E}Y_i| \le M$ for $i = 1, \ldots, n$ and $\operatorname{Var} Y = \sigma^2 < \infty$. Then, for each $n \ge 1$ and $\beta > 0$, we have

$P\bigl( \omega_{\Pi_{n,c}}(h) \ge \beta\sqrt{nh} \bigr) \le \dfrac{2}{h\delta}\exp\Bigl( -\dfrac{\beta^2(1-\delta)}{2(\sigma^2+1)}\,\psi\Bigl( \dfrac{M\beta}{(\sigma^2+1)\sqrt{nh}} \Bigr) \Bigr)$.

Proof. See, e.g., Maumy [11].

INEQUALITY 3.2. Let $h, \delta$ be real numbers such that $0 < h \le \delta \le 1/2$. Suppose that the i.i.d. random variables $Y = Y_1, \ldots, Y_n$ are such that $|Y_i - \mathbb{E}Y_i| \le M$ for $i = 1, \ldots, n$ and $\operatorname{Var} Y = \sigma^2 < \infty$. Then, for each $n \ge 1$ and $\beta > 0$, we have

(3.6) $P\bigl( \omega_{\alpha_{n,c}}(h) \ge \beta\sqrt{h} \bigr) \le \dfrac{C_h}{h\delta}\exp\Bigl( -\dfrac{\beta^2(1-\delta)^3}{2(\sigma^2+1)}\,\psi\Bigl( \dfrac{M\beta}{(\sigma^2+1)\sqrt{nh}} \Bigr) \Bigr)$.

Proof. Omitted.
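The modulus of continuity $\omega_{\alpha_{n,c}}(h)$ is straightforward to approximate on a simulated sample by restricting the suprema to a fine grid. A minimal sketch (Python with NumPy assumed; the sample size, grid resolution, $h$, and the law of $Y$ are illustrative), which also prints the normalization $\sqrt{2(\sigma^2+1)h\log(1/h)}$ of Theorem 2.3 as a rough finite-$n$ comparison only:

```python
import numpy as np

def omega(values, grid, h):
    """Approximate omega_X(h) = sup_{|t-s| <= h} |X(t) - X(s)| for X sampled on `grid`."""
    step = grid[1] - grid[0]
    max_lag = int(round(h / step))
    sup = 0.0
    for lag in range(1, max_lag + 1):
        sup = max(sup, float(np.max(np.abs(values[lag:] - values[:-lag]))))
    return sup

rng = np.random.default_rng(3)
n, h, sigma2 = 2000, 0.05, 1.0 / 12.0
U = rng.uniform(0.0, 1.0, size=n)
Y = rng.uniform(0.5, 1.5, size=n)                      # E[Y] = 1, Var(Y) = 1/12
grid = np.linspace(0.0, 1.0, 1001)
U_nc = (Y[None, :] * (U[None, :] <= grid[:, None])).sum(axis=1) / n
alpha_nc = np.sqrt(n) * (U_nc - grid)                  # compound empirical process on the grid

print(omega(alpha_nc, grid, h))                        # empirical modulus of continuity
print(np.sqrt(2 * (sigma2 + 1) * h * np.log(1 / h)))   # normalization from Theorem 2.3 (illustration only)
```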

3.4. The compound empirical process is a martingale

Finally, we shall show that behind the compound empirical process there is a martingale structure. In order to prove this result, we define, for each $n \ge 1$ and $0 \le t \le 1$, the $\sigma$-algebras

$\mathcal{F}_{n,c}(t) = \sigma\bigl( Y_i,\ 1_{\{U_i \le s\}} : 1 \le i \le n,\ s \le t \bigr)$.

Introduce, for each $n \ge 1$ and $0 \le t \le 1$,

(3.7) $M_{n,c}(t) = n\Bigl( U_{n,c}(t) - \displaystyle\int_0^t \dfrac{1 - U_{n,c}(s)}{1 - s}\,ds \Bigr)$.

LEMMA 3.1. For each $n \ge 1$, $\bigl( M_{n,c}(t), \mathcal{F}_{n,c}(t) \bigr)$, $0 \le t \le 1$, is a compensator.

Proof. Consider the counting processes $N_i(t) = 1_{\{U_i \le t\}}$, $t \in [0,1]$, $i \in \{1, \ldots, n\}$ (see Jacod [6]). For each $i$ such that $1 \le i \le n$, denote by

$M_i(t) = N_i(t) - \displaystyle\int_0^t \dfrac{1_{\{U_i \ge s\}}}{1 - s}\,ds$, $0 \le t \le 1$,

and let $M_i^c(t)$ be its compound version, defined for each $i$ such that $1 \le i \le n$ by

$M_i^c(t) = Y_i N_i(t) - \displaystyle\int_0^t \dfrac{Y_i 1_{\{U_i \ge s\}}}{1 - s}\,ds = Y_i M_i(t)$, $0 \le t \le 1$.

Note that, for all $s < t$,

$\mathbb{E}\bigl( M_i^c(t) \mid \mathcal{F}_{n,c}(s) \bigr) = \mathbb{E}\bigl( Y_i M_i(t) \mid \mathcal{F}_{n,c}(s) \bigr) = Y_i\,\mathbb{E}\bigl( M_i(t) \mid \mathcal{F}_{n,c}(s) \bigr)$.

We recall from Jacod [6] that $M_i(t)$ is a compensator with respect to the family of $\sigma$-algebras $\mathcal{F}_n(t) = \sigma\bigl( 1_{\{U_i \le s\}} : 1 \le i \le n,\ s \le t \bigr)$. Since the sequence $\{Y_i : i \ge 1\}$ is independent of the independent, uniformly distributed on $(0,1)$ random variables $U_1, \ldots, U_n$, we have, for all $s < t \le 1$,

$\mathbb{E}\bigl( M_i^c(t) \mid \mathcal{F}_{n,c}(s) \bigr) = Y_i\,\mathbb{E}\bigl( M_i(t) \mid \mathcal{F}_{n,c}(s) \bigr) = Y_i\,\mathbb{E}\bigl( M_i(t) \mid \mathcal{F}_n(s) \bigr) = Y_i M_i(s) = M_i^c(s)$,

from which we conclude that $\bigl( M_i^c(t), \mathcal{F}_{n,c}(t) \bigr)$, $0 \le t \le 1$, is a compensator. Moreover, since a sum of compensators is again a compensator and since (3.7) holds,

$M_{n,c}(t) = \sum_{i=1}^{n} M_i^c(t)$

is a compensator on $[0,1]$ with respect to $\{\mathcal{F}_{n,c}(t)\}$. This achieves the proof.

LEMMA 3.2. We have, for each $n \ge 1$,

$\sqrt{n}\,\Bigl( \dfrac{U_{n,c}(t) - t}{1 - t} \Bigr) = \dfrac{\alpha_{n,c}(t)}{1 - t} = \dfrac{1}{\sqrt{n}}\displaystyle\int_0^t \dfrac{dM_{n,c}(s)}{1 - s}$, $0 \le t < 1$.

Proof. We have

$I_n = \dfrac{1}{n}\displaystyle\int_0^t \dfrac{dM_{n,c}(s)}{1 - s} = \displaystyle\int_0^t \dfrac{dU_{n,c}(s)}{1 - s} - \displaystyle\int_0^t \dfrac{1 - U_{n,c}(s)}{(1 - s)^2}\,ds = \displaystyle\int_0^t \dfrac{dU_{n,c}(s)}{1 - s} - \displaystyle\int_0^t \bigl( 1 - U_{n,c}(s) \bigr)\, d\Bigl( \dfrac{1}{1 - s} \Bigr)$.

Integrating the last term by parts gives

$\displaystyle\int_0^t \bigl( 1 - U_{n,c}(s) \bigr)\, d\Bigl( \dfrac{1}{1 - s} \Bigr) = \dfrac{1 - U_{n,c}(t)}{1 - t} - 1 + \displaystyle\int_0^t \dfrac{dU_{n,c}(s)}{1 - s}$,

so that

$I_n = 1 - \dfrac{1 - U_{n,c}(t)}{1 - t} = \dfrac{U_{n,c}(t) - t}{1 - t} = \dfrac{1}{\sqrt{n}}\,\dfrac{\alpha_{n,c}(t)}{1 - t}$.
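Since Lemma 3.2 is a pathwise identity and $U_{n,c}$ is a step function, both integrals can be evaluated in closed form on a simulated sample and compared. A minimal sketch (Python with NumPy assumed; $n$, $t$ and the law of $Y$ are illustrative), checking the identity in the equivalent form $(U_{n,c}(t) - t)/(1 - t) = n^{-1}\int_0^t (1-s)^{-1}\,dM_{n,c}(s)$:

```python
import numpy as np

rng = np.random.default_rng(4)
n, t = 200, 0.7
U = np.sort(rng.uniform(0.0, 1.0, size=n))
Y = rng.uniform(0.5, 1.5, size=n)

def U_nc(s):
    """Compound empirical distribution function U_{n,c}(s)."""
    return (Y * (U <= s)).sum() / n

lhs = (U_nc(t) - t) / (1.0 - t)                       # left-hand side of the identity

# Right-hand side: int_0^t dU_{n,c}(s)/(1-s) - int_0^t (1 - U_{n,c}(s))/(1-s)^2 ds.
jump_part = (Y[U <= t] / (1.0 - U[U <= t])).sum() / n        # jumps of U_{n,c} of size Y_i/n at U_i <= t
knots = np.concatenate(([0.0], U[U <= t], [t]))              # U_{n,c} is constant between consecutive knots
levels = np.array([U_nc(s) for s in knots[:-1]])             # its value on [knots[j], knots[j+1])
drift_part = ((1.0 - levels) * (1.0 / (1.0 - knots[1:]) - 1.0 / (1.0 - knots[:-1]))).sum()

print(lhs, jump_part - drift_part)                    # the two numbers agree up to rounding error
```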

For convenience, we set, for each $n \ge 1$ and $0 \le t < 1$,

$\overline{M}_{n,c}(t) = \sqrt{n}\,\Bigl( \dfrac{U_{n,c}(t) - t}{1 - t} \Bigr) = \dfrac{\alpha_{n,c}(t)}{1 - t}$.

Lemma 3.3 establishes that $\overline{M}_{n,c}(t)$ is a martingale with respect to $\{\mathcal{F}_{n,c}(t)\}$.

LEMMA 3.3. $\bigl( \overline{M}_{n,c}(t), \mathcal{F}_{n,c}(t) \bigr)$, $0 \le t \le 1$, is a martingale.

Proof. We show that $\overline{M}_{n,c}(t)$ is a martingale with respect to $\mathcal{F}_{n,c}(t)$ by using the fact that the integral of a predictable process with respect to a martingale is again a martingale, provided that the appropriate moments exist (see Shorack and Wellner [13], p. 261).

4. Proof of Theorem 2.3

In the remainder of this section, we present a proof of Theorem 2.3. Throughout, and unless otherwise specified, we assume that the CRS conditions (S.1)–(S.3) are satisfied. For $0 < h < 1$, let

(4.1) $L_{n,c}(h, x; u) = \dfrac{\Pi_{n,c}(x + hu) - \Pi_{n,c}(x) - nhu}{\sqrt{n}}$, for $0 \le u \le 1$ and $0 \le x \le 1 - h$.

LEMMA 4.1. There exists a constant $C_{h_n}$ with $0 < C_{h_n} < \infty$, such that the following property holds. For each choice of $\{x_1, \ldots, x_m\} \subseteq \{k h_n : 0 \le k \le h_n^{-1} - 1\}$ with $0 < m h_n \le 1/2$, and Borel subsets $B_1, \ldots, B_m$ of $B(0,1)$ endowed with the topology of uniform convergence on $[0,1]$, if

$A_1 = \{ \xi_{n,c}(h_n, x_i; \cdot) \in B_i,\ i = 1, \ldots, m \}$ and $A_2 = \{ L_{n,c}(h_n, x_i; \cdot) \in B_i,\ i = 1, \ldots, m \}$,

then $P(A_1) \le C_{h_n} P(A_2)$.

Proof. The proof of Lemma 4.1 is practically the same as the proof of Theorem 3.4, after a change of scale. We therefore omit the details.

The next lemma, known as Schilder's theorem (see, e.g., Schilder [12]), gives a large deviation result which will be instrumental for our needs. For any $f \in B(0,1)$, set

$J(f) = \displaystyle\int_0^1 \bigl( \dot f(s) \bigr)^2\,ds$, when $f$ is absolutely continuous on $[0,1]$ with Lebesgue derivative $\dot f$, and $J(f) = \infty$ otherwise,

and, for any $B \subseteq B(0,1)$, define $J(B) = \inf_{f \in B} J(f)$.

LEMMA 4.2. Let $\{W(x) : x \ge 0\}$ be a standard Wiener process and, for any $\gamma > 0$, set $W_\gamma(x) = 2^{-1/2}\gamma^{-1} W(\gamma x)$, for $0 \le x \le 1$. Then the following properties hold:

(i) For each closed subset $F$ of $B(0,1)$ endowed with the topology of uniform convergence on $[0,1]$,

(4.2) $\limsup_{\gamma \to \infty} \gamma^{-1}\log P(W_\gamma \in F) \le -J(F)$.

(ii) For each open subset $G$ of $B(0,1)$ endowed with the topology of uniform convergence on $[0,1]$,

(4.3) $\liminf_{\gamma \to \infty} \gamma^{-1}\log P(W_\gamma \in G) \ge -J(G)$.

Proof. For the proof of Lemma 4.2, we refer to [16].

The following lemma establishes the second part (i.e., (2.5)) of Theorem 2.3 for $\xi_{n,c}$.

LEMMA 4.3. Under the CRS conditions, for every $f \in S$ and $\varepsilon > 0$, there exists almost surely a finite $n_{\varepsilon,f}$ such that, for all $n \ge n_{\varepsilon,f}$, there exists an $x = x_{n,\varepsilon,f} \in [0, 1 - h_n]$ such that

$\Bigl\| \dfrac{\xi_{n,c}(h_n, x; \cdot)}{\sqrt{2(\sigma^2+1) h_n \log(1/h_n)}} - f \Bigr\| < \varepsilon$

holds.

Proof. Set $N_\varepsilon(f) = \{ g \in B(0,1) : \|f - g\| < \varepsilon \}$. By Lemma 4.1, applied with $x_i = i h_n$, $i = 1, \ldots, m_n = \lfloor 1/(2h_n) \rfloor$, and $B_i = B(0,1) \setminus N_\varepsilon(f)$, for $i = 1, \ldots, m_n$, we obtain

(4.4) $P_n = P\Bigl( \bigcap_{i=1}^{m_n} \Bigl\{ \dfrac{\xi_{n,c}(h_n, x_i; \cdot)}{\sqrt{2(\sigma^2+1) h_n \log(1/h_n)}} \notin N_\varepsilon(f) \Bigr\} \Bigr) \le C_{h_n}\, P\Bigl( \bigcap_{i=1}^{m_n} \Bigl\{ \dfrac{L_{n,c}(h_n, x_i; \cdot)}{\sqrt{2(\sigma^2+1) h_n \log(1/h_n)}} \notin N_\varepsilon(f) \Bigr\} \Bigr) = C_{h_n}\Bigl( 1 - P\Bigl( \dfrac{L_{n,c}(h_n, 0; \cdot)}{\sqrt{2(\sigma^2+1) h_n \log(1/h_n)}} \in N_\varepsilon(f) \Bigr) \Bigr)^{m_n} = C_{h_n}\bigl( 1 - P_{1,n} \bigr)^{m_n}$.

In a second step, we evaluate $P_{1,n}$. For this, we apply the strong invariance principles of Komlós, Major and Tusnády [8] to the compound Poisson process. These enable us to construct on the same probability space a compound Poisson process $\{\Pi_c(t) : t \ge 0\}$ and a standard Wiener process $\{W(t) : t \ge 0\}$. We have the following theorem.

THEOREM 4.1. We can define a Wiener process $\{W(t) : t \ge 0\}$ such that, if $\mathbb{E}\exp(tY) < \infty$ in a neighbourhood of $0$, then

(4.5) $P\Bigl( \sup_{0 \le t \le T} \bigl| \Pi_c(t) - \mu t - \sqrt{\sigma^2 + \mu^2}\,W(t) \bigr| \ge C_1\log T + z \Bigr) \le C_2\exp(-C_3 z)$,

for all $z > 0$, where $C_1$, $C_2$ and $C_3$ are positive constants.

Proof. See, e.g., Csörgő, Deheuvels and Horváth [3].

By (4.1), (4.4) and (4.5), we see that

$P_{1,n} = P\Bigl( \dfrac{L_{n,c}(h_n, 0; \cdot)}{\sqrt{2(\sigma^2+1) h_n \log(1/h_n)}} \in N_\varepsilon(f) \Bigr) = P\Bigl( \dfrac{\Pi_c(n h_n\,\cdot) - n h_n\,\cdot}{\sqrt{2(\sigma^2+1)\, n h_n \log(1/h_n)}} \in N_\varepsilon(f) \Bigr)$,

with $\Pi_c(0) = 0$. Then, we have

(4.6) $P_{1,n} \ge P\Bigl( \dfrac{W(n h_n\,\cdot)}{\sqrt{2 n h_n \log(1/h_n)}} \in N_{\varepsilon/2}(f) \Bigr) - P\Bigl( \sup_{0 \le u \le 1} \dfrac{\bigl| \bigl( \Pi_c(n h_n u) - n h_n u \bigr)/\sqrt{\sigma^2+1} - W(n h_n u) \bigr|}{\sqrt{2 n h_n \log(1/h_n)}} \ge \varepsilon/2 \Bigr) = P_{2,n} - P_{3,n}^{\varepsilon/2}$.

Notice that

$\dfrac{W(n h_n\,\cdot)}{\sqrt{2 n h_n \log(1/h_n)}} \overset{d}{=} 2^{-1/2}\bigl( \log(1/h_n) \bigr)^{-1} W\bigl( \log(1/h_n)\,\cdot \bigr)$,

under the conditions (S.2)–(S.3), so that

$P_{2,n} = P\bigl( W_{\log(1/h_n)} \in N_{\varepsilon/2}(f) \bigr)$, where $W_\gamma(x) = 2^{-1/2}\gamma^{-1} W(\gamma x)$, for $0 \le x \le 1$.

Since $f \in S$, we obviously have $J\bigl( N_{\varepsilon/2}(f) \bigr) < 1$. Thus, by (4.3), it follows that for any $\rho \in \bigl( J(N_{\varepsilon/2}(f)), 1 \bigr)$, we have, for all $n$ sufficiently large,

(4.7) $P_{2,n} \ge \exp\bigl( -\rho\log(1/h_n) \bigr) = h_n^\rho$.

Next, observe that (S.1) implies that

$\dfrac{\log(n h_n)}{\sqrt{2 n h_n \log(1/h_n)}} \to 0$ as $n \to \infty$.

This, when combined with (4.5), (4.6) and (S.3), implies that, for large $n$,

(4.8) $P_{3,n}^{\varepsilon/2} \le P\Bigl( \sup_{0 \le x \le n h_n} \Bigl| \dfrac{\Pi_c(x) - x}{\sqrt{\sigma^2+1}} - W(x) \Bigr| \ge C_1\log(n h_n) + \dfrac{\varepsilon}{4}\sqrt{2 n h_n \log(1/h_n)} \Bigr) \le C_2\exp\Bigl( -C_3\,\dfrac{\varepsilon}{4}\sqrt{2 n h_n \log(1/h_n)} \Bigr) \le \dfrac{1}{2}\exp\bigl( -\rho\log(1/h_n) \bigr) = \dfrac{1}{2} h_n^\rho$.

By (4.6), (4.7) and (4.8), we have ultimately, as $n \to \infty$,

$P_{1,n} \ge P_{2,n} - P_{3,n}^{\varepsilon/2} \ge h_n^\rho - \dfrac{1}{2} h_n^\rho$, so that $P_{1,n} \ge \dfrac{1}{2} h_n^\rho$,

which, when combined with (4.4), yields for all large $n$ the inequalities

(4.9) $P_n \le C_{h_n}\exp\Bigl( -\dfrac{1}{2} m_n h_n^\rho \Bigr) \le C_{h_n}\exp\Bigl( -\dfrac{1}{2}\,\dfrac{1}{4 h_n}\, h_n^\rho \Bigr) \le C_{h_n}\exp\Bigl( -\dfrac{1}{8}\, h_n^{\rho - 1} \Bigr)$.

Since $\rho - 1 < 0$, by (S.2), the right-hand side of (4.9) is ultimately less than or equal to $C_{h_n}\exp\bigl( -\tfrac{1}{8}(\log n)^r \bigr)$ for an arbitrary $r > 1$. It follows that $\sum_n P_n < \infty$, which by the Borel–Cantelli lemma implies (2.5), as sought.

We now turn to the proof of (2.4). Fix any $\gamma > 0$ and $\varepsilon > 0$. Set $\nu_k = \lfloor (1 + \gamma)^k \rfloor$, for $k \ge 1$, and let

$b_n = \sqrt{2(\sigma^2 + 1) h_n \log(1/h_n)}$.

Next, consider the events

$C_k(\varepsilon, \gamma) = \bigl\{ (n/\nu_{k+1})^{1/2}\, b_{\nu_{k+1}}^{-1}\, \xi_{n,c}(h_{\nu_{k+1}}, x; \cdot) \notin S^\varepsilon \text{ for some } 0 \le x \le 1 - h_{\nu_{k+1}} \text{ and } \nu_k < n \le \nu_{k+1} \bigr\}$,

and

$D_k(\varepsilon, \gamma) = \bigl\{ b_{\nu_{k+1}}^{-1}\, \xi_{\nu_{k+1},c}(h_{\nu_{k+1}}, x; \cdot) \notin S^\varepsilon \text{ for some } 0 \le x \le 1 - h_{\nu_{k+1}} \bigr\}$.

LEMMA 4.4. For every $\varepsilon > 0$ and $\gamma > 0$, there exists a $K = K_{\varepsilon,\gamma}$ such that $k \ge K$ implies

(4.10) $P\bigl( C_k(\varepsilon, \gamma) \bigr) \le C_{h_n}\, P\bigl( D_k(\varepsilon/2, \gamma) \bigr)$.

Proof. Let $r_i$, $i \ge 1$, be an enumeration of the rationals in $[0, 1 - h_{\nu_{k+1}}]$, and introduce the events, for $i \ge 1$ and $\nu_k < n \le \nu_{k+1}$,

$E_{k,i,n}(\varepsilon) = \bigl\{ (n/\nu_{k+1})^{1/2}\, b_{\nu_{k+1}}^{-1}\, \xi_{n,c}(h_{\nu_{k+1}}, r_i; \cdot) \notin S^\varepsilon \bigr\}$, $E_{k,n}(\varepsilon) = \bigcup_{i \ge 1} E_{k,i,n}(\varepsilon)$,

and

$F_{k,i,n}(\varepsilon) = \bigl\{ \bigl\| b_{\nu_{k+1}}^{-1}\bigl( \xi_{\nu_{k+1},c}(h_{\nu_{k+1}}, r_i; \cdot) - (n/\nu_{k+1})^{1/2}\, \xi_{n,c}(h_{\nu_{k+1}}, r_i; \cdot) \bigr) \bigr\| < \varepsilon \bigr\}$.

Observe that, for any $\varepsilon_1 > 0$ and $\varepsilon_2 > 0$, the sequences of events $\{E_{k,i,n}(\varepsilon_1) : i \ge 1\}$ and $\{F_{k,i,n}(\varepsilon_2) : i \ge 1\}$ are independent. Denote by $A^c$ the complement of the event $A$. Decomposing $C_k(\varepsilon, \gamma)$ according to the first pair $(q, i)$, $\nu_k < q \le \nu_{k+1}$, $i \ge 1$, for which $E_{k,i,q}(\varepsilon)$ occurs, intersecting each term with the independent event $F_{k,i,q}(\varepsilon/2)$, and observing that, by the triangle inequality, $E_{k,i,q}(\varepsilon) \cap F_{k,i,q}(\varepsilon/2)$ entails $b_{\nu_{k+1}}^{-1}\xi_{\nu_{k+1},c}(h_{\nu_{k+1}}, r_i; \cdot) \notin S^{\varepsilon/2}$, we obtain

(4.11) $\Bigl( \inf_{\nu_k < n \le \nu_{k+1}}\ \inf_{m \ge 1} P\bigl( F_{k,m,n}(\varepsilon/2) \bigr) \Bigr)\, P\bigl( C_k(\varepsilon, \gamma) \bigr) \le P\bigl( D_k(\varepsilon/2, \gamma) \bigr)$.

Next, we see that, for any $0 < h \le 1/2$, $\beta > 0$ and $n \ge 1$, we have

$P\Bigl( \sup_{0 \le u \le h} \sqrt{n}\,|\alpha_{n,c}(u)| \ge \beta \Bigr) \le P\Bigl( \sup_{0 \le u \le h} \dfrac{n\,|U_{n,c}(u) - u|}{1 - u} \ge \beta \Bigr)$,

which, by the fact that $n(U_{n,c}(u) - u)/(1 - u)$ is a martingale in $u$ (see the preliminary results), is

$P\Bigl( \sup_{0 \le u \le h} \sqrt{n}\,|\alpha_{n,c}(u)| \ge \beta \Bigr) \le \dfrac{\mathbb{E}\bigl( n^2 (U_{n,c}(h) - h)^2 \bigr)}{\bigl( \beta(1 - h) \bigr)^2} \le \dfrac{4(\sigma^2 + 1) n h}{\beta^2}$.

From this last inequality, we obtain that, for all $\nu_k < n \le \nu_{k+1}$, $m \ge 1$ and all $k$ large enough,

$P\bigl( F_{k,m,n}(\varepsilon/2)^c \bigr) = P\Bigl( \nu_{k+1}^{-1/2} \sup_{0 \le u \le h_{\nu_{k+1}}} \Bigl| \sum_{i=n+1}^{\nu_{k+1}} \bigl( Y_i 1_{\{r_m < U_i \le r_m + u\}} - u \bigr) \Bigr| \ge (\varepsilon/2)\, b_{\nu_{k+1}} \Bigr) \le \dfrac{4(\sigma^2 + 1)(\nu_{k+1} - \nu_k)\, h_{\nu_{k+1}}}{\varepsilon^2\, b_{\nu_{k+1}}^2\, \nu_{k+1}} \to 0$ as $k \to \infty$.

Using this bound in (4.11) yields (4.10).

LEMMA 4.5. We have

(4.12) $\lim_{\gamma \to 0}\ \limsup_{k \to \infty}\ \max_{\nu_k < n \le \nu_{k+1}}\ \sup_{0 \le x \le 1 - h_{\nu_{k+1}}} \bigl\| \bigl( (n/\nu_{k+1})^{1/2}\, b_{\nu_{k+1}}^{-1} - b_n^{-1} \bigr)\, \xi_{n,c}(h_{\nu_{k+1}}, x; \cdot) \bigr\| = 0$ a.s.

Proof. The left-hand side of (4.12) is less than or equal to

(4.13) $\lim_{\gamma \to 0}\ \limsup_{k \to \infty}\ \max_{\nu_k < n \le \nu_{k+1}} \bigl| b_n\,(n/\nu_{k+1})^{1/2}\, b_{\nu_{k+1}}^{-1} - 1 \bigr| \cdot \limsup_{n \to \infty} b_n^{-1}\, \omega_{\alpha_{n,c}}(h_n)$,

where $\omega_{\alpha_{n,c}}(h_n)$ is defined, for $0 < h_n < 1$ and $n \ge 1$, by

$\omega_{\alpha_{n,c}}(h_n) = \sup_{0 \le x \le 1 - h_n} \bigl\| \xi_{n,c}(h_n, x; \cdot) \bigr\|$.

Making use of Theorem 2.1 in Maumy [10], we have

(4.14) $\lim_{n \to \infty} b_n^{-1}\, \omega_{\alpha_{n,c}}(h_n) = 1$ a.s.

Moreover, condition (S.1) implies that $n h_n \uparrow \infty$ and $h_n \downarrow 0$ as $n \to \infty$, so that $b_n = \sqrt{2(\sigma^2+1) h_n \log(1/h_n)}$ is ultimately nonincreasing and such that, for $\nu_k < n \le \nu_{k+1}$,

$1 \le \dfrac{b_n}{b_{\nu_{k+1}}} \le \dfrac{b_{\nu_k}}{b_{\nu_{k+1}}} \le \bigl( 1 + o(1) \bigr)\Bigl( \dfrac{\nu_{k+1}}{\nu_k} \Bigr)^{1/2} \to (1 + \gamma)^{1/2}$ as $k \to \infty$.

This, in combination with (4.13) and (4.14), readily yields (4.12).

LEMMA 4.6. We have

(4.15) $\lim_{\gamma \to 0}\ \limsup_{k \to \infty}\ \max_{\nu_k < n \le \nu_{k+1}}\ \sup_{0 \le x \le 1 - h_{\nu_{k+1}}} b_n^{-1} \bigl\| \xi_{n,c}(h_{\nu_{k+1}}, x; \cdot) - \xi_{n,c}(h_n, x; \cdot) \bigr\| = 0$ a.s.

Proof. From $h_n \downarrow 0$ we get that, for $\nu_k < n \le \nu_{k+1}$,

$h_n - h_{\nu_{k+1}} = h_n\bigl( 1 - h_{\nu_{k+1}}/h_n \bigr) \le h_n\bigl( 1 - h_{\nu_{k+1}}/h_{\nu_k} \bigr)$.

From $n h_n \uparrow \infty$ we obtain that, for all $k$ large enough, $h_{\nu_{k+1}}/h_{\nu_k} \ge \nu_k/\nu_{k+1} \ge 1 - \gamma$. Therefore, for $\nu_k < n \le \nu_{k+1}$ and all $k$ large enough,

$h_n - h_{\nu_{k+1}} \le h_n\bigl( 1 - h_{\nu_{k+1}}/h_{\nu_k} \bigr) \le \gamma h_n$.

Hence the left-hand side of (4.15) is less than or equal to

(4.16) $\lim_{\gamma \to 0} \Bigl( \limsup_{n \to \infty} b_n^{-1}\, \omega_{\alpha_{n,c}}(\gamma h_n) \Bigr)$.

By (4.14) and (4.16), it is easy to conclude (4.15).

In view of Lemmas 4.4–4.6, we will show in the sequel that the proof of (2.4) boils down to showing that, for every $\varepsilon > 0$ and $\gamma > 0$,

(4.17) $\sum_{k=1}^{\infty} P\bigl( D_k(\varepsilon, \gamma) \bigr) < \infty$.

Toward this end, we will make use of the following inequality.

LEMMA 4.7. For all $0 < \theta < 1/2$, $\varepsilon > 0$ and $n \ge 1$, we have

$P\bigl( b_n^{-1}\xi_{n,c}(h_n, x; \cdot) \notin S^\varepsilon \text{ for some } 0 \le x \le 1 - h_n \bigr) \le (\theta h_n)^{-1}\, P\bigl( b_n^{-1}\xi_{n,c}(h_n, 0; \cdot) \notin S^{\varepsilon/2} \bigr) + P\bigl( b_n^{-1}\omega_{\alpha_{n,c}}(\theta h_n) > \varepsilon/4 \bigr)$.

Proof. For $i\theta h_n \le x \le (i+1)\theta h_n$ and $i \in \{0, 1, \ldots, \mu_n - 1\}$, where $\mu_n = \lfloor (h_n^{-1} - 1)/\theta \rfloor$, or for $\mu_n\theta h_n \le x \le 1 - h_n$ and $i = \mu_n$, we have, uniformly over $0 \le u \le 1$,

$\bigl| \xi_{n,c}(h_n, x; u) - \xi_{n,c}(h_n, i\theta h_n; u) \bigr| \le \bigl| \alpha_{n,c}(x + u h_n) - \alpha_{n,c}(i\theta h_n + u h_n) \bigr| + \bigl| \alpha_{n,c}(x) - \alpha_{n,c}(i\theta h_n) \bigr| \le 2\,\omega_{\alpha_{n,c}}(\theta h_n)$.

The remainder of the proof is obvious.

We are now prepared to show that (4.17) holds for all $\varepsilon > 0$ and $\gamma > 0$. We use Lemma 4.7 to obtain that, for any $\theta > 0$ sufficiently small,

$P\bigl( D_k(\varepsilon, \gamma) \bigr) \le (\theta h_{\nu_{k+1}})^{-1}\, P\bigl( b_{\nu_{k+1}}^{-1}\xi_{\nu_{k+1},c}(h_{\nu_{k+1}}, 0; \cdot) \notin S^{\varepsilon/2} \bigr) + P\bigl( b_{\nu_{k+1}}^{-1}\omega_{\alpha_{\nu_{k+1},c}}(\theta h_{\nu_{k+1}}) \ge \varepsilon/4 \bigr) =: Q_{1,k} + Q_{2,k}$.

Next, we use Lemma 4.1 to obtain the inequality

$\theta h_{\nu_{k+1}}\, Q_{1,k} = P\bigl( b_{\nu_{k+1}}^{-1}\xi_{\nu_{k+1},c}(h_{\nu_{k+1}}, 0; \cdot) \notin S^{\varepsilon/2} \bigr) \le C_{h_{\nu_{k+1}}}\, P\bigl( b_{\nu_{k+1}}^{-1} L_{\nu_{k+1},c}(h_{\nu_{k+1}}, 0; \cdot) \notin S^{\varepsilon/2} \bigr) = C_{h_{\nu_{k+1}}}\, Q_{3,k}$.

We now follow the arguments of the proof of Lemma 4.3 to obtain the following analogue of (4.6), with $P_{3,n}^{\varepsilon/2}$ being as in (4.8):

(4.18) $Q_{3,k} \le P\bigl( W_{\log(1/h_{\nu_{k+1}})} \notin S^{\varepsilon/4} \bigr) + P_{3,\nu_{k+1}}^{\varepsilon/4} = Q_{4,k} + Q_{5,k}$.

Since $B(0,1) \setminus S^{\varepsilon/4}$ is closed (with respect to the topology of uniform convergence) and obviously satisfies $J\bigl( B(0,1) \setminus S^{\varepsilon/4} \bigr) > 1$, it follows from (4.2) that, for any $\rho_1 \in \bigl( 1, J(B(0,1) \setminus S^{\varepsilon/4}) \bigr)$, we have, for all $k$ sufficiently large,

(4.19) $Q_{4,k} \le \exp\bigl( -\rho_1\log(1/h_{\nu_{k+1}}) \bigr) = h_{\nu_{k+1}}^{\rho_1}$.

Moreover, the same arguments as used for (4.7) and (4.8) show that, for all $k$ sufficiently large,

(4.20) $Q_{5,k} \le h_{\nu_{k+1}}^{\rho_1}$.

Thus, combining (4.18), (4.19) and (4.20), we obtain, for all large $k$,

(4.21) $Q_{1,k} \le C_{h_{\nu_{k+1}}}\,(\theta h_{\nu_{k+1}})^{-1}\, Q_{3,k} \le \dfrac{2 C_{h_{\nu_{k+1}}}}{\theta}\, h_{\nu_{k+1}}^{\rho_1 - 1} = \dfrac{2 C_{h_{\nu_{k+1}}}}{\theta}\exp\Bigl( -(\rho_1 - 1)\log\dfrac{1}{h_{\nu_{k+1}}} \Bigr)$.

Since (S.2) implies that $\log(1/h_{\nu_{k+1}})/\log k \to \infty$ as $k \to \infty$, we see from (4.21) that $Q_{1,k} \le 1/k^2$ for all large $k$, so that

(4.22) $\sum_{k=1}^{\infty} Q_{1,k} < \infty$.

We use (3.6), taken with $\delta = 1/2$, $h = \theta h_{\nu_{k+1}}$, $n = \nu_{k+1}$ and $\beta = \dfrac{\varepsilon}{4\sqrt{\theta}}\sqrt{2(\sigma^2+1)\log(1/h_{\nu_{k+1}})}$, to obtain

(4.23) $Q_{2,k} \le 16\, C_{\theta h_{\nu_{k+1}}}\,(\theta h_{\nu_{k+1}})^{-1}\exp\Bigl( -\dfrac{\varepsilon^2\log(1/h_{\nu_{k+1}})}{128\theta}\, \psi\Bigl( \dfrac{M\varepsilon\sqrt{2\log(1/h_{\nu_{k+1}})}}{4\theta\sqrt{(\sigma^2+1)\,\nu_{k+1}\, h_{\nu_{k+1}}}} \Bigr) \Bigr)$.

Since (S.3) implies that $n h_n/\log(1/h_n) \to \infty$, and $\psi(x) \to 1$ as $x \to 0$, the right-hand side of (4.23) is ultimately less than or equal to

$\dfrac{16\, C_{\theta h_{\nu_{k+1}}}}{\theta h_{\nu_{k+1}}}\exp\Bigl( -\dfrac{\varepsilon^2}{256\theta}\log\dfrac{1}{h_{\nu_{k+1}}} \Bigr)$.

Thus, the same argument as used for (4.22) shows that, for any $0 < \theta < \varepsilon/128$, we have

(4.24) $\sum_{k=1}^{\infty} Q_{2,k} < \infty$.

Combining (4.22) and (4.24), we see that (4.17) holds, as sought. By the Borel–Cantelli lemma and Lemmas 4.4–4.6, it follows that, for any $\varepsilon > 0$, whenever $\gamma > 0$ is sufficiently small, the event

$\bigcup_{\nu_k < n \le \nu_{k+1}} \Bigl\{ \bigl\{ b_n^{-1}\xi_{n,c}(h_n, x; \cdot) : 0 \le x \le 1 - h_{\nu_{k+1}} \bigr\} \not\subseteq S^\varepsilon \Bigr\}$

holds finitely often with probability $1$. The following lemma shows that the same is true for the event

$\bigcup_{\nu_k < n \le \nu_{k+1}} \Bigl\{ \bigl\{ b_n^{-1}\xi_{n,c}(h_n, x; \cdot) : 1 - h_{\nu_{k+1}} < x \le 1 - h_n \bigr\} \not\subseteq S^\varepsilon \Bigr\}$,

this completing the proof of (2.4).

LEMMA 4.8. We have

(4.25) $\lim_{\gamma \to 0}\ \limsup_{k \to \infty}\ \max_{\nu_k < n \le \nu_{k+1}}\ \sup_{1 - h_{\nu_{k+1}} \le x \le 1 - h_n} b_n^{-1} \bigl\| \xi_{n,c}(h_n, x; \cdot) - \xi_{n,c}(h_n, 1 - h_{\nu_{k+1}}; \cdot) \bigr\| = 0$ a.s.

Proof. By the same arguments as used in the proof of Lemma 4.6, we see that the left-hand side of (4.25) is less than or equal to

$\lim_{\gamma \to 0} \Bigl( 2 \limsup_{n \to \infty} b_n^{-1}\, \omega_{\alpha_{n,c}}(\gamma h_n) \Bigr)$,

which equals $0$ a.s. by (4.14) and (4.16).

References

[1] P. Billingsley. Convergence of Probability Measures. Wiley, 1968.
[2] Y. S. Chow and H. Teicher. Probability Theory. Springer-Verlag.
[3] M. Csörgő, P. Deheuvels, and L. Horváth. An approximation of stopped sums with applications in queueing theory. Adv. Appl. Prob., 19:674–690, 1987.
[4] M. Csörgő and P. Révész. Strong Approximations in Probability and Statistics. Academic Press, 1981.
[5] P. Deheuvels and D. M. Mason. Functional laws of the iterated logarithm for the increments of empirical and quantile processes. Ann. Probab., 20:1248–1287, 1992.
[6] J. Jacod. Multivariate point processes: predictable projection, Radon–Nikodym derivatives, representation of martingales. Z. Wahrscheinlichkeitstheorie verw. Gebiete, 31:235–253, 1975.
[7] S. Karlin and H. M. Taylor. A Second Course in Stochastic Processes. Academic Press, 1981.
[8] J. Komlós, P. Major, and G. Tusnády. An approximation of partial sums of independent rv's and the sample d.f. I. Z. Wahrscheinlichkeitstheorie verw. Gebiete, 32:111–131, 1975.
[9] D. M. Mason. A strong invariance theorem for the tail empirical process. Ann. Inst. H. Poincaré, 24:491–506, 1988.
[10] M. Maumy. Le comportement des oscillations du processus empirique composé. C. R. Acad. Sci. Paris, Sér. I, 333, 2001.
[11] M. Maumy. Sur les oscillations du processus de Poisson composé. C. R. Acad. Sci. Paris, Sér. I, 334:75–78, 2002.

[12] M. Schilder. Some asymptotic formulas for Wiener integrals. Trans. Amer. Math. Soc., 125:63–85, 1966.
[13] G. R. Shorack and J. A. Wellner. Empirical Processes with Applications to Statistics. Wiley, 1986.
[14] V. Strassen. An invariance principle for the law of the iterated logarithm. Z. Wahrscheinlichkeitstheorie und verw. Gebiete, 3:211–226, 1964.
[15] W. Stute. The oscillation behaviour of empirical processes. Ann. Probab., 10:86–107, 1982.
[16] A. D. Ventsel. Rough limit theorems on large deviations for Markov processes. Theory Probab. Appl., 21, 1976.

L.S.T.A., Université Paris VI, Boîte Courrier 158, 8e étage, Plateau A, 175 rue du Chevaleret, 75013 Paris, France
E-mail address: maumy@ccr.jussieu.fr
