Applied Probability Modeling of Data Networks;


Applied Probability Modeling of Data Networks; 1. Random measures and point processes; weak convergence to PRM and Lévy processes.

Sidney Resnick, School of Operations Research and Industrial Engineering, Rhodes Hall, Cornell University, Ithaca NY 14853 USA. http://www.orie.cornell.edu/~sid

August 5, 2007

1. Outline of Lectures

1. Random measures and point processes; weak convergence to PRM and Lévy processes.
2. Multivariate regular variation; the Poisson transform; stable processes.
3. Introduction and survey of data network modeling. (Fairly nontechnical; some statistics.)
4. Some stylized applied probability network models: a renewal input model.
5. Some (more) stylized applied probability network models: a large time scale input model.
5.5 Some (more more) stylized applied probability network models: heavy traffic and heavy tails in GI/G/1.

Lectures primarily based on Resnick (2006).

2. Introduction: Point processes and random measures

2.1. Setup. (Resnick, 2006, Chapter 3.) Setup for the study of random measures, point processes and the associated topology yielding vague convergence.

$E$: a nice space (locally compact with countable base, lccb). Typically $E$ is a subset of a finite dimensional Euclidean space, e.g. a subset of compactified $\mathbb{R}^d$ or $\mathbb{R}^d_+$.

$\mathcal{E}$: the Borel $\sigma$-algebra of $E$.

The space of all Radon measures on $E$:
$$M_+(E) = \{\mu : \mu \text{ is a non-negative measure on } \mathcal{E} \text{ and } \mu \text{ is Radon}\}. \qquad (1)$$
A measure $\mu$ is called Radon if $\mu(K) < \infty$ for all $K \in \mathcal{K}(E)$ = the compacta of $E$. Thus compact sets have finite $\mu$-mass.

Subset of $M_+(E)$ consisting of non-negative integer valued measures, i.e., point measures: call $m(\cdot) \in M_+(E)$ a point measure if
$$m(\cdot) = \sum_i \epsilon_{x_i}(\cdot), \qquad x_i \in E,$$
where for $A \in \mathcal{E}$
$$\epsilon_{x_i}(A) = \begin{cases} 1, & \text{if } x_i \in A,\\ 0, & \text{if } x_i \notin A.\end{cases}$$
So $m(\cdot)$ is Radon: $m(K) < \infty$ for $K \in \mathcal{K}(E)$. Call $\{x_i\}$ the atoms or points; $m$ is the function which counts how many atoms fall in a set. Then
$$M_p(E) \subset M_+(E), \qquad M_p(E) = \{\text{all Radon point measures}\}.$$
Test functions: those continuous functions which vanish on complements of compact sets:
$$C_K^+(E) = \{f : E \mapsto \mathbb{R}_+ : f \text{ is continuous with compact support}\}.$$
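To fix the notation in computational terms, here is a tiny illustrative sketch (not part of the lectures; the atoms and the test function are arbitrary choices) of a point measure $m = \sum_i \epsilon_{x_i}$ on a subset of $\mathbb{R}^2$, evaluated both on sets, $m(A) = \#\{i : x_i \in A\}$, and on test functions, $m(f) = \sum_i f(x_i)$.

```python
import numpy as np

class PointMeasure:
    """m = sum_i epsilon_{x_i}: counts atoms in sets and integrates test functions."""
    def __init__(self, atoms):
        self.atoms = np.asarray(atoms, dtype=float)   # shape (npoints, dim)

    def of_set(self, indicator):
        """m(A) = number of atoms x_i with indicator(x_i) True."""
        return int(sum(indicator(x) for x in self.atoms))

    def of_fn(self, f):
        """m(f) = integral of f with respect to m = sum_i f(x_i)."""
        return float(sum(f(x) for x in self.atoms))

m = PointMeasure([[0.2, 0.3], [0.7, 0.1], [1.5, 2.0]])
print(m.of_set(lambda x: np.all(x <= 1.0)))     # atoms in the compact set [0,1]^2 -> 2
print(m.of_fn(lambda x: np.exp(-x.sum())))      # m(f) for f(x) = exp(-(x1 + x2))
```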

2.2. Convergence and topology in $M_+(E)$.

Convergence concept: vague convergence in $M_+(E)$ or $M_p(E)$. Suppose $\mu_n \in M_+(E)$, $n \ge 0$. Then $\mu_n \xrightarrow{v} \mu_0$ if for all $f \in C_K^+(E)$ we have
$$\mu_n(f) := \int_E f(x)\,\mu_n(dx) \to \mu_0(f) := \int_E f(x)\,\mu_0(dx), \qquad (n \to \infty).$$
The topology is specified by basis sets of the form
$$\{\mu \in M_+(E) : \mu(f_i) \in (a_i, b_i),\ i = 1, \dots, d\},$$
where $f_i \in C_K^+(E)$ and $0 \le a_i \le b_i$. Unions of basis sets form the class of open sets constituting the vague topology.

The topology is metrizable as a CSMS (complete separable metric space). There exists a vague metric $d(\cdot,\cdot)$ specified as follows: there exists some sequence of functions $f_i \in C_K^+(E)$ such that for $\mu_1, \mu_2 \in M_+(E)$
$$d(\mu_1, \mu_2) = \sum_{i=1}^{\infty} \frac{|\mu_1(f_i) - \mu_2(f_i)| \wedge 1}{2^i}.$$

Interpret $\mu \in M_+(E)$ as an object determined by components indexed by $C_K^+(E)$:
$$\mu = \{\mu(f),\ f \in C_K^+(E)\} \quad\text{or}\quad \mu = \{\mu(f_i),\ i \ge 1\}.$$
This leads to a compactness criterion: a subset $M \subset M_+(E)$ is vaguely relatively compact iff
$$\sup_{\mu \in M} \mu(f) < \infty, \qquad \forall f \in C_K^+(E).$$

2.3. Random elements. The open subsets of $M_+(E)$ generate the Borel $\sigma$-field:
$$\mathcal{M}_+(E) = \text{Borel } \sigma\text{-field of subsets of } M_+(E),$$
so $(M_+(E), \mathcal{M}_+(E))$ is a measurable space. A random measure is a measurable map
$$(\Omega, \mathcal{B}) \mapsto (M_+(E), \mathcal{M}_+(E)),$$
and a point process is a measurable map
$$(\Omega, \mathcal{B}) \mapsto (M_p(E), \mathcal{M}_p(E)).$$

3. Weak convergence of random measures

The following are the usual and convenient methods of showing weak convergence of random elements of $M_+(E)$.

In $M_+(E)$, random measures $\{\eta_n(\cdot), n \ge 0\}$ converge weakly, $\eta_n \Rightarrow \eta_0$, iff for any family $\{h_j\}$ with $h_j \in C_K^+(E)$ we have
$$(\eta_n(h_j),\ j \ge 1) \Rightarrow (\eta_0(h_j),\ j \ge 1) \quad\text{in } \mathbb{R}^\infty.$$
Nice: one assumes a sequence $\{h_j\}$ and proves $\mathbb{R}^\infty$ convergence; this reduces to proving $\mathbb{R}^d$-convergence for any fixed $d$. Often, in fact, this reduces to one dimensional convergence.

Method of Laplace functionals: a convenient transform technique for manipulating distributions of point processes and random measures.

Definition. Let $\eta : (\Omega, \mathcal{A}, P) \mapsto (M_+(E), \mathcal{M}_+(E))$ be a random measure. The Laplace functional of the random measure $\eta$ is the non-negative function $\Psi_\eta : C_K^+(E) \mapsto \mathbb{R}_+$,
$$\Psi_\eta(f) = E\exp\{-\eta(f)\} = \int_\Omega \exp\{-\eta(\omega, f)\}\,dP(\omega) = \int_{M_+(E)} \exp\{-\mu(f)\}\,P\circ\eta^{-1}(d\mu).$$
Weak convergence of a sequence of random measures in $M_+(E)$ is equivalent to convergence of the Laplace functionals of the random measures for each $f \in C_K^+(E)$. More detail: (Resnick, 1987, Section 3.5) or Kallenberg (1983), Neveu (1977).

Theorem 1 (Convergence criterion). Let $\{\eta_n, n \ge 0\}$ be random elements of $M_+(E)$. Then $\eta_n \Rightarrow \eta_0$ in $M_+(E)$ iff
$$\Psi_{\eta_n}(f) = E e^{-\eta_n(f)} \to E e^{-\eta_0(f)} = \Psi_{\eta_0}(f), \qquad \forall f \in C_K^+(E). \qquad (2)$$

So weak convergence is characterized by convergence of Laplace functionals on $C_K^+(E)$.

Proof (of one half). Suppose $\eta_n \Rightarrow \eta_0$ in $M_+(E)$. The map $M_+(E) \mapsto [0, \infty)$ defined by $\mu \mapsto \mu(f)$ is continuous, so the continuous mapping theorem gives
$$\eta_n(f) \Rightarrow \eta_0(f) \quad\text{in } \mathbb{R}.$$
Thus $e^{-\eta_n(f)} \Rightarrow e^{-\eta_0(f)}$, and by Lebesgue's dominated convergence theorem
$$E e^{-\eta_n(f)} \to E e^{-\eta_0(f)}.$$

4. Poisson random measure

Let $N : (\Omega, \mathcal{A}) \mapsto (M_p(E), \mathcal{M}_p(E))$ be a point process with state space $E$, where $\mathcal{M}_p(E)$ is the $\sigma$-algebra of subsets of $M_p(E)$ generated by the open sets.

Definition 1. $N$ is a Poisson process with mean measure $\mu$, or synonymously a Poisson random measure (PRM($\mu$)), if
1. For $A \in \mathcal{E}$,
$$P[N(A) = k] = \begin{cases} e^{-\mu(A)}\,\dfrac{(\mu(A))^k}{k!}, & \text{if } \mu(A) < \infty,\\[4pt] 0, & \text{if } \mu(A) = \infty.\end{cases}$$
2. If $A_1, \dots, A_k$ are disjoint subsets of $E$ in $\mathcal{E}$, then $N(A_1), \dots, N(A_k)$ are independent random variables.

4.1. Properties.

Transformation preserves Poisson-i-ness.

Proposition 1. Suppose $T : E \mapsto E'$ is a measurable mapping of one nice space $E$ into another nice space $E'$ such that inverse images of compacta are compact:
$$K' \in \mathcal{K}(E') \implies T^{-1}K' := \{e \in E : Te \in K'\} \in \mathcal{K}(E).$$
If $N$ is PRM($\mu$) on $E$, then $N' := N \circ T^{-1}$ is PRM($\mu'$) on $E'$, where $\mu' := \mu \circ T^{-1}$.

Proof. Poisson marginals and independence are preserved.

Augmentation preserves Poisson-i-ness.

Proposition 2. Suppose $\{X_n\}$ are random elements of a nice space $E_1$ such that $\sum_n \epsilon_{X_n}$ is PRM($\mu$). Suppose $\{J_n\}$ are iid random elements of a second nice space $E_2$ with common probability distribution $F$, and suppose the Poisson process and the sequence $\{J_n\}$ are defined on the same probability space and are independent. Then the point process on $E_1 \times E_2$
$$\sum_n \epsilon_{(X_n, J_n)}$$
is PRM with mean measure $\mu \times F$.

Laplace functional of PRM($\mu$). The Poisson process can be identified by the characteristic form of its Laplace functional.

Theorem 2 (Laplace functional of PRM). The point process $N$ is PRM($\mu$) iff its Laplace functional is of the form
$$\Psi_N(f) = \exp\Bigl\{-\int_E (1 - e^{-f(x)})\,\mu(dx)\Bigr\}, \qquad f \in C_K^+(E). \qquad (3)$$
Proof. Verify (3) by the usual measure theory argument:
- Let $f = \lambda 1_A$ for $\lambda > 0$. Then $\Psi_N(f) = E\exp\{-\lambda N(A)\}$ and (3) is verified by direct calculation.
- Let $f = \sum_{i=1}^k \lambda_i 1_{A_i}$ for $\lambda_i > 0$ and $A_1, \dots, A_k$ a partition. Then again verify (3) by direct calculation.
- Finish with a limiting argument, approximating general $f$ by simple $f_n \uparrow f$.

Construction of PRM. Suppose $N$ is PRM($\mu$) on a nice space $E$, such that $\mu(E) < \infty$. Represent $N$ as a Poissonized sum of point masses:
- Define the probability measure $F$ on $E$ by $F(dx) = \mu(dx)/\mu(E)$.
- Let $\{X_n, n \ge 1\}$ be iid random elements of $E$ with common distribution $F$.
- Let $\tau$ be independent of $\{X_n\}$, where $\tau$ has a Poisson distribution with parameter $\mu(E)$.
- Define
$$N = \begin{cases} \sum_{i=1}^{\tau} \epsilon_{X_i}, & \text{if } \tau \ge 1,\\ 0 \ (\text{the null measure}), & \text{if } \tau = 0.\end{cases}$$
Then $N$ is PRM($\mu$).

Proof. Compute the Laplace functional.
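To make the Poissonized-sum construction concrete, here is a small simulation sketch (not from the lectures; the intensity $\mu = c \cdot$ Lebesgue on $E = [0,1]^2$ and the test function $f$ are illustrative choices). It builds PRM($\mu$) exactly as above and checks the Laplace functional formula (3) against a Monte Carlo average.

```python
import numpy as np

rng = np.random.default_rng(0)

c = 5.0                                          # mu(E), with E = [0,1]^2 and mu = c * Lebesgue
f = lambda x: 2.0 * np.exp(-x.sum(axis=-1))      # a nonnegative continuous test function on E

def prm_sample():
    """One realization of PRM(mu) as a Poissonized sum of iid points."""
    tau = rng.poisson(c)                         # Poisson number of atoms, mean mu(E)
    return rng.random((tau, 2))                  # iid atoms with distribution F = mu / mu(E)

# Monte Carlo estimate of the Laplace functional E exp{-N(f)} = E exp{-sum_i f(X_i)}
nrep = 50_000
mc = np.mean([np.exp(-f(prm_sample()).sum()) for _ in range(nrep)])

# Right side of (3): exp{ -int_E (1 - e^{-f(x)}) mu(dx) }, via a midpoint Riemann sum
g = (np.arange(400) + 0.5) / 400
X, Y = np.meshgrid(g, g)
theory = np.exp(-c * np.mean(1.0 - np.exp(-2.0 * np.exp(-(X + Y)))))

print(f"Monte Carlo  E exp(-N(f)) = {mc:.4f}")
print(f"formula (3)               = {theory:.4f}")
```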

5. Sample measures and PRM

Suppose that for each $n \ge 1$, $\{X_{n,j}, j \ge 1\}$ is a sequence of iid random elements of $(E, \mathcal{E})$. We call
$$\sum_{j=1}^{n} \epsilon_{X_{n,j}}$$
a sample measure.

Theorem 3 (Basic convergence). We have
$$\sum_{j=1}^{n} \epsilon_{X_{n,j}} \Rightarrow \sum_k \epsilon_{j_k} = \text{PRM}(\mu) \qquad (4)$$
in $M_p(E)$, iff
$$nP[X_{n,1} \in \cdot\,] = E\Bigl(\sum_{j=1}^{n} \epsilon_{X_{n,j}}(\cdot)\Bigr) \xrightarrow{v} \mu \qquad (5)$$
in $M_+(E)$. Furthermore (4) or (5) is equivalent to the version with a time component:
$$\sum_{j=1}^{n} \epsilon_{(j/n,\, X_{n,j})} \Rightarrow \sum_k \epsilon_{(t_k, j_k)} = \text{PRM}(\text{LEB} \times \mu) \qquad (6)$$
in $M_p([0, \infty) \times E)$.

Proof. Compute Laplace functionals of the sample measures and decide when they converge. For $f \in C_K^+(E)$,
$$E e^{-\sum_{j=1}^n \epsilon_{X_{n,j}}(f)} = E e^{-\sum_{j=1}^n f(X_{n,j})} = \bigl(E e^{-f(X_{n,1})}\bigr)^n = \Bigl(1 - \frac{E\bigl(n(1 - e^{-f(X_{n,1})})\bigr)}{n}\Bigr)^n = \Bigl(1 - \frac{\int_E (1 - e^{-f(x)})\, nP[X_{n,1} \in dx]}{n}\Bigr)^n,$$
and this converges to
$$\exp\Bigl\{-\int_E (1 - e^{-f(x)})\,\mu(dx)\Bigr\},$$
the Laplace functional of PRM($\mu$), iff
$$\int_E (1 - e^{-f(x)})\, nP[X_{n,1} \in dx] \to \int_E (1 - e^{-f(x)})\,\mu(dx),$$
and this last statement is equivalent to the vague convergence in (5).
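A quick numerical illustration of Theorem 3 (a sketch under illustrative assumptions, not part of the lectures): take $Z_1, Z_2, \dots$ iid Pareto($\alpha$) and $X_{n,j} = Z_j/b_n$ with $b_n = n^{1/\alpha}$, so that $nP[X_{n,1} > x] \to x^{-\alpha}$. The number of scaled points exceeding $x$ should then be approximately Poisson with mean $x^{-\alpha}$.

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, n, x = 1.5, 10_000, 0.7
b_n = n ** (1 / alpha)                     # quantile scaling: n * P[Z > b_n] = 1

nrep = 5_000
counts = np.empty(nrep, dtype=int)
for r in range(nrep):
    Z = rng.pareto(alpha, size=n) + 1.0    # Pareto with P[Z > z] = z^{-alpha}, z >= 1
    counts[r] = np.sum(Z / b_n > x)        # points of the sample measure in (x, infinity)

lam = x ** (-alpha)                        # limiting PRM mean measure of (x, infinity]
print("empirical mean / variance of counts:", counts.mean(), counts.var())
print("Poisson limit mean nu(x, inf]:", lam)
print("P[count = 0] empirical vs exp(-lam):", np.mean(counts == 0), np.exp(-lam))
```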

Corollary 1 (Variant of basic convergence). Suppose additionally that $0 < a_n \to \infty$. Then for a measure $\mu \in M_+(E)$ we have
$$\frac{1}{a_n}\sum_{j=1}^{n} \epsilon_{X_{n,j}} \Rightarrow \mu \qquad (7)$$
in $M_+(E)$ iff
$$\frac{n}{a_n} P[X_{n,1} \in \cdot\,] = E\Bigl(\frac{1}{a_n}\sum_{j=1}^{n} \epsilon_{X_{n,j}}(\cdot)\Bigr) \xrightarrow{v} \mu \qquad (8)$$
in $M_+(E)$.

Proof. Similar to the previous proof. Compute the Laplace functional of the quantity on the left side of (7):
$$E e^{-\frac{1}{a_n}\sum_{i=1}^n \epsilon_{X_{n,i}}(f)} = \bigl(E e^{-\frac{1}{a_n} f(X_{n,1})}\bigr)^n = \Bigl(1 - \frac{\int_E \bigl(1 - e^{-\frac{1}{a_n} f(x)}\bigr)\, nP[X_{n,1} \in dx]}{n}\Bigr)^n,$$
and this converges to $e^{-\mu(f)}$, the Laplace functional of $\mu$, iff
$$\int_E \bigl(1 - e^{-\frac{1}{a_n} f(x)}\bigr)\, nP[X_{n,1} \in dx] \to \mu(f). \qquad (9)$$
Since $a_n \to \infty$,
$$\int_E \bigl(1 - e^{-f(x)/a_n}\bigr)\, nP[X_{n,1} \in dx] \to \mu(f) \quad\text{iff}\quad \int_E \frac{f(x)}{a_n}\, nP[X_{n,1} \in dx] \to \mu(f) \quad\text{iff}\quad \frac{n}{a_n} P[X_{n,1} \in \cdot\,] \xrightarrow{v} \mu.$$

6. Weak convergence of partial sums to Lévy processes

Result for $d$ dimensions. Set $E = [-\infty, \infty]^d \setminus \{0\}$. Denote random vectors by $X = (X^{(1)}, \dots, X^{(d)})$.

Theorem 4. Suppose for each $n \ge 1$ that $\{X_{n,j}, j \ge 1\}$ are iid random vectors such that
$$nP[X_{n,1} \in \cdot\,] \xrightarrow{v} \nu(\cdot) \qquad (10)$$
in $M_+(E)$, where $\nu$ is a Lévy measure, and for each $j = 1, \dots, d$,
$$\lim_{\epsilon \downarrow 0} \limsup_{n \to \infty} nE\Bigl(\bigl(X_{n,1}^{(j)}\bigr)^2 1_{[|X_{n,1}| \le \epsilon]}\Bigr) = 0. \qquad (11)$$
Define the partial sum process based on the $n$th array row:
$$X_n(t) := \sum_{k=1}^{[nt]} \Bigl(X_{n,k} - E\bigl(X_{n,k} 1_{[|X_{n,k}| \le 1]}\bigr)\Bigr), \qquad t \ge 0.$$
Then (10) and (11) imply $X_n \Rightarrow X_0$ in $D([0, \infty), \mathbb{R}^d)$, where $X_0(\cdot)$ is a Lévy jump process with Lévy measure $\nu$.

Proof from Resnick and Greenwood (1979), Resnick (2006). Proof in several steps.

Step 1: Basic convergence (Theorem 3) and (10) imply
$$\sum_{k=1}^{\infty} \epsilon_{(k/n,\, X_{n,k})} \Rightarrow \sum_k \epsilon_{(t_k, j_k)} = \text{PRM}(\text{LEB} \times \nu) \qquad (12)$$
in $M_p([0, \infty) \times E)$.

Step 2: Two continuity assertions:
(i) With respect to the distribution of PRM(LEB $\times\ \nu$), the restriction map $M_p([0, \infty) \times E) \mapsto M_p([0, \infty) \times \{x : |x| > \epsilon\})$ defined by
$$m \mapsto m\big|_{[0, \infty) \times \{x : |x| > \epsilon\}}$$
is almost surely continuous.
(ii) The summation functional from $M_p([0, \infty) \times \{x : |x| > \epsilon\})$ to $D([0, T], \mathbb{R}^d)$ defined by
$$\sum_k \epsilon_{(\tau_k, j_k)} \mapsto \sum_{\tau_k \le (\cdot)} j_k$$
is almost surely continuous with respect to the distribution of PRM(LEB $\times\ \nu$).

Step 3: From the first continuity assertion in Step 2, the convergence statement in Step 1, and the continuous mapping theorem we get the restricted convergence
$$\sum_k 1_{[|X_{n,k}| > \epsilon]}\, \epsilon_{(k/n,\, X_{n,k})} \Rightarrow \sum_k 1_{[|j_k| > \epsilon]}\, \epsilon_{(t_k, j_k)} \qquad (13)$$
in $M_p([0, \infty) \times \{x : |x| > \epsilon\})$. From the second continuity assertion in Step 2, we get from (13)
$$\sum_{k=1}^{[n\cdot]} X_{n,k} 1_{[|X_{n,k}| > \epsilon]} \Rightarrow \sum_{t_k \le (\cdot)} j_k 1_{[|j_k| > \epsilon]} \qquad (14)$$
in $D([0, T], \mathbb{R}^d)$. Similarly we get
$$\sum_{k=1}^{[n\cdot]} X_{n,k} 1_{[\epsilon < |X_{n,k}| \le 1]} \Rightarrow \sum_{t_k \le (\cdot)} j_k 1_{[\epsilon < |j_k| \le 1]}. \qquad (15)$$
Step 4: In (15), take expectations and apply (10) to get
$$[n\cdot]\, E\bigl(X_{n,1} 1_{[\epsilon < |X_{n,1}| \le 1]}\bigr) \to (\cdot)\int_{\{x : \epsilon < |x| \le 1\}} x\,\nu(dx) \qquad (16)$$

in $D([0, T], \mathbb{R}^d)$. To justify this, observe first for any $t > 0$ that
$$[nt]\, E\bigl(X_{n,1} 1_{[\epsilon < |X_{n,1}| \le 1]}\bigr) = \frac{[nt]}{n}\int_{\{x : |x| \in (\epsilon, 1]\}} x\, nP[X_{n,1} \in dx] \to t\int_{\{x : |x| \in (\epsilon, 1]\}} x\,\nu(dx),$$
since $nP[X_{n,1} \in \cdot\,] \xrightarrow{v} \nu(\cdot)$. Convergence is locally uniform in $t$ and hence takes place in $D([0, T], \mathbb{R}^d)$.

Step 5: Difference (14) $-$ (16). The result is
$$X_n^{(\epsilon)}(\cdot) := \sum_{k=1}^{[n\cdot]} X_{n,k} 1_{[|X_{n,k}| > \epsilon]} - [n\cdot]\, E\bigl(X_{n,1} 1_{[\epsilon < |X_{n,1}| \le 1]}\bigr) \Rightarrow X_0^{(\epsilon)}(\cdot) := \sum_{t_k \le (\cdot)} j_k 1_{[|j_k| > \epsilon]} - (\cdot)\int_{\{x : |x| \in (\epsilon, 1]\}} x\,\nu(dx). \qquad (17)$$
From the Itô representation of a Lévy process, for almost all $\omega$, as $\epsilon \downarrow 0$,
$$X_0^{(\epsilon)}(\cdot) \to X_0(\cdot), \quad\text{locally uniformly in } t.$$
Let $d(\cdot,\cdot)$ be the Skorohod metric on $D[0, \infty)$. Local uniform convergence implies Skorohod convergence, so
$$d\bigl(X_0^{(\epsilon)}(\cdot), X_0(\cdot)\bigr) \to 0$$

almost surely as $\epsilon \downarrow 0$. Almost sure convergence implies weak convergence, so $X_0^{(\epsilon)}(\cdot) \Rightarrow X_0(\cdot)$ in $D([0, \infty), \mathbb{R}^d)$.

Step 6: By the Converging Together Theorem it suffices to show
$$\lim_{\epsilon \downarrow 0} \limsup_{n \to \infty} P[d(X_n^{(\epsilon)}, X_n) > \delta] = 0.$$
Convergence holds in $D([0, \infty), \mathbb{R}^d)$ if Skorohod convergence holds in $D([0, T], \mathbb{R}^d)$ for any $T$. The Skorohod metric on $D([0, T], \mathbb{R}^d)$ is bounded above by the uniform metric on $D([0, T], \mathbb{R}^d)$. So it suffices to show
$$\lim_{\epsilon \downarrow 0} \limsup_{n \to \infty} P\Bigl[\sup_{0 \le t \le T} |X_n^{(\epsilon)}(t) - X_n(t)| > \delta\Bigr] = 0, \quad\text{for any } \delta > 0.$$
Recalling the definitions of $X_n^{(\epsilon)}$ and $X_n$:
$$X_n(t) - X_n^{(\epsilon)}(t) = \sum_{k=1}^{[nt]} X_{n,k} 1_{[|X_{n,k}| \le \epsilon]} - [nt]\, E\bigl(X_{n,1} 1_{[|X_{n,1}| \le \epsilon]}\bigr) = \sum_{k=1}^{[nt]} \Bigl(X_{n,k} 1_{[|X_{n,k}| \le \epsilon]} - E\bigl(X_{n,k} 1_{[|X_{n,k}| \le \epsilon]}\bigr)\Bigr)$$

and so
$$P\Bigl[\sup_{0 \le t \le T} |X_n^{(\epsilon)}(t) - X_n(t)| > \delta\Bigr] \le P\Bigl[\sup_{0 \le t \le T} \Bigl|\sum_{k=1}^{[nt]} \Bigl(X_{n,k} 1_{[|X_{n,k}| \le \epsilon]} - E\bigl(X_{n,k} 1_{[|X_{n,k}| \le \epsilon]}\bigr)\Bigr)\Bigr| > \delta\Bigr]$$
$$= P\Bigl[\sup_{0 \le j \le nT} \Bigl|\sum_{k=1}^{j} \Bigl(X_{n,k} 1_{[|X_{n,k}| \le \epsilon]} - E\bigl(X_{n,k} 1_{[|X_{n,k}| \le \epsilon]}\bigr)\Bigr)\Bigr| > \delta\Bigr].$$
Now use the fact that $|x| \le \sum_{i=1}^{d} |x^{(i)}|$ and we get the bound
$$\sum_{i=1}^{d} P\Bigl[\sup_{0 \le j \le nT} \Bigl|\sum_{k=1}^{j} \Bigl(X_{n,k}^{(i)} 1_{[|X_{n,k}| \le \epsilon]} - E\bigl(X_{n,k}^{(i)} 1_{[|X_{n,k}| \le \epsilon]}\bigr)\Bigr)\Bigr| > \frac{\delta}{d}\Bigr]$$

and by Kolmogorov's inequality this has upper bound
$$(\delta/d)^{-2} \sum_{i=1}^{d} \mathrm{Var}\Bigl(\sum_{k=1}^{[nT]} X_{n,k}^{(i)} 1_{[|X_{n,k}| \le \epsilon]}\Bigr) = (\delta/d)^{-2} \sum_{i=1}^{d} [nT]\,\mathrm{Var}\bigl(X_{n,1}^{(i)} 1_{[|X_{n,1}| \le \epsilon]}\bigr) \le (\delta/d)^{-2} \sum_{i=1}^{d} [nT]\, E\Bigl(\bigl(X_{n,1}^{(i)}\bigr)^2 1_{[|X_{n,1}| \le \epsilon]}\Bigr).$$
Taking $\lim_{\epsilon \downarrow 0} \limsup_{n \to \infty}$, we easily get 0 by (11).


References

O. Kallenberg. Random Measures. Akademie-Verlag, Berlin, third edition, 1983.
J. Neveu. Processus ponctuels. In École d'Été de Probabilités de Saint-Flour VI, 1976, Lecture Notes in Math., Vol. 598. Springer-Verlag, Berlin, 1977.
S.I. Resnick and P. Greenwood. A bivariate stable characterization and domains of attraction. J. Multivariate Anal., 9(2), 1979.
S.I. Resnick. Heavy Tail Phenomena: Probabilistic and Statistical Modeling. Springer Series in Operations Research and Financial Engineering. Springer-Verlag, New York, 2007.
S.I. Resnick. Extreme Values, Regular Variation and Point Processes. Springer-Verlag, New York, 1987.

Applied Probability Modeling of Data Networks; 2. Multivariate regular variation; the Poisson transform; stable processes.

Sidney Resnick, School of Operations Research and Industrial Engineering, Rhodes Hall, Cornell University, Ithaca NY 14853 USA. http://www.orie.cornell.edu/~sid

August 5, 2007

1. Multivariate regular variation of functions

Basic when studying models built from iid objects.

Regular variation of functions. $C \subset \mathbb{R}^d$ is a cone; i.e., $x \in C$ implies $tx \in C$ for $t > 0$. Examples:
$$C = (0, \infty] \ (d = 1); \quad [0, \infty]^d \setminus \{0\}, \ (0, \infty]^d; \quad [-\infty, \infty]^d \setminus \{0\}, \ [-\infty, \infty]^d \setminus \{-\infty\}; \quad [0, \infty] \times (0, \infty] \ (d = 2).$$
A function $h : C \mapsto (0, \infty)$ is monotone if it is either non-decreasing or non-increasing in each component. For $h$ non-decreasing: $x, y \in C$ and $x \le y$ imply $h(x) \le h(y)$.

Definition 1. Suppose $h \ge 0$ is a measurable function defined on $C$, and suppose $\mathbf{1} = (1, \dots, 1) \in C$. Call $h$ multivariate regularly varying with limit function $\lambda$ provided $\lambda(x) > 0$ for $x \in C$, and for all $x \in C$ we have
$$\lim_{t \to \infty} \frac{h(tx)}{h(t\mathbf{1})} = \lambda(x). \qquad (1)$$
Properties. Scaling arguments give the following properties:
1. For $d = 1$, $C = (0, \infty)$ this says
$$\lim_{t \to \infty} \frac{h(tx)}{h(t)} = \lambda(x), \qquad x > 0,$$
and a scaling argument shows
$$\lambda(x) = x^\rho, \quad\text{for some } \rho \in \mathbb{R},$$
since
$$\lambda(xy) = \lambda(x)\lambda(y), \qquad x > 0,\ y > 0.$$

2. For $d > 1$, fix $x \in C$ and set $U(t) = h(tx)$; then $U$ is one dimensional regularly varying, since for any $s > 0$
$$\lim_{t \to \infty} \frac{U(ts)}{U(t)} = \lim_{t \to \infty} \frac{h(tsx)}{h(tx)} = \lim_{t \to \infty} \frac{h(tsx)/h(t\mathbf{1})}{h(tx)/h(t\mathbf{1})} = \frac{\lambda(sx)}{\lambda(x)},$$
and we conclude for some $\rho(x) \in \mathbb{R}$ that $U \in RV_{\rho(x)}$ and
$$\frac{\lambda(sx)}{\lambda(x)} = s^{\rho(x)}.$$
3. One can check that $\rho(x)$ does not depend on $x$, so the limit function satisfies the homogeneity property
$$\lambda(sx) = s^{\rho}\lambda(x).$$

2. Regular variation and measures

2.1. $d = 1$, $C = (0, \infty]$. Suppose $Z \ge 0$ is a random variable with distribution function $F$. Regular variation of $1 - F$ has the following equivalences:
(i) $\bar F = 1 - F \in RV_{-\alpha}$, $\alpha > 0$.
(ii) There exists a sequence $\{b_n\}$ with $b_n \to \infty$ such that
$$\lim_{n \to \infty} n\bar F(b_n x) = x^{-\alpha}, \qquad x > 0.$$
(iii) There exists a sequence $\{b_n\}$ with $b_n \to \infty$ such that
$$\mu_n(\cdot) := nP\Bigl[\frac{Z}{b_n} \in \cdot\Bigr] \xrightarrow{v} \nu_\alpha(\cdot) \qquad (2)$$
in $M_+(0, \infty]$, where $\nu_\alpha(x, \infty] = x^{-\alpha}$.
Generalizing (i) or (ii) to higher dimensions can be done, but dealing with distribution functions in higher dimensions is awkward. However, (iii) generalizes nicely.
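A minimal concrete instance of (ii) and (iii), assuming an exactly Pareto tail (an illustrative special case):
$$\bar F(x) = x^{-\alpha},\ x \ge 1; \qquad b_n := F^{\leftarrow}(1 - 1/n) = n^{1/\alpha}; \qquad n\bar F(b_n x) = n\,(n^{1/\alpha} x)^{-\alpha} = x^{-\alpha},$$
so (ii) holds exactly (not just in the limit), and hence $nP[Z/b_n \in \cdot\,] \xrightarrow{v} \nu_\alpha$ on $(0, \infty]$.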

2.2. $d > 1$, $C = [0, \infty]^d \setminus \{0\}$. Equivalences for regular variation of probability measures on $C = [0, \infty]^d \setminus \{0\}$. In each, we understand the phrase "Radon measure" to mean a Radon measure which is not identically zero and which is not degenerate at a point.

Theorem 1. (Repeated use of the symbols $\nu$, $b(\cdot)$, $\{b_n\}$ from statement to statement does not require these objects to be exactly the same in different statements.)
1. There exists a Radon measure $\nu$ on $C$ such that
$$\lim_{t \to \infty} \frac{1 - F(tx)}{1 - F(t\mathbf{1})} = \lim_{t \to \infty} \frac{P[Z \in t[0, x]^c]}{P[Z \in t[0, \mathbf{1}]^c]} = \nu([0, x]^c), \qquad (3)$$
for all points $x \in [0, \infty) \setminus \{0\}$ which are continuity points of the function $x \mapsto \nu([0, x]^c)$.
2. There exists a function $b(t) \to \infty$ and a Radon measure $\nu$ on $C$, called the limit measure, such that in $M_+(C)$
$$tP\Bigl[\frac{Z}{b(t)} \in \cdot\Bigr] \xrightarrow{v} \nu, \qquad t \to \infty. \qquad (4)$$

3. There exists a sequence $b_n \to \infty$ and a Radon measure $\nu$ on $C$ such that in $M_+(C)$
$$nP\Bigl[\frac{Z}{b_n} \in \cdot\Bigr] \xrightarrow{v} \nu, \qquad n \to \infty. \qquad (5)$$
4. Polar coordinate version: define
$$(R, \Theta) = \Bigl(|Z|, \frac{Z}{|Z|}\Bigr) \quad\text{and}\quad \aleph_+ = \{x \in C : |x| = 1\}.$$
There exists a probability measure $S(\cdot)$ on $\aleph_+$, called the angular measure, and a function $b(t) \to \infty$ such that
$$tP\Bigl[\Bigl(\frac{R}{b(t)}, \Theta\Bigr) \in \cdot\Bigr] \xrightarrow{v} c\,\nu_\alpha \times S \qquad (6)$$
in $M_+\bigl((0, \infty] \times \aleph_+\bigr)$, for some $c > 0$.

5. There exists a probability measure $S(\cdot)$ on $\aleph_+$ and a sequence $b_n \to \infty$ such that
$$nP\Bigl[\Bigl(\frac{R}{b_n}, \Theta\Bigr) \in \cdot\Bigr] \xrightarrow{v} c\,\nu_\alpha \times S \qquad (7)$$
in $M_+\bigl((0, \infty] \times \aleph_+\bigr)$, for some $c > 0$.

Remark 1. Regular variation for functions: the limit function scales; i.e., it is homogeneous. Regular variation for measures: the homogeneity property in Cartesian coordinates translates into a product measure in polar coordinates.

3. Regular variation and the Poisson random measure

Multivariate regular variation is additionally equivalent to the induced sample measures converging weakly to Poisson random measure limits.

Theorem 2. Suppose $\{Z, Z_1, Z_2, \dots\}$ are iid; after transformation to polar coordinates the sequence is $\{(R, \Theta), (R_1, \Theta_1), (R_2, \Theta_2), \dots\}$. Any of the equivalences in Theorem 1 are also equivalent to:
6. There exists $b_n \to \infty$ such that
$$\sum_{i=1}^{n} \epsilon_{Z_i/b_n} \Rightarrow \text{PRM}(\nu) \qquad (8)$$
in $M_p(E)$.
7. There exists a sequence $b_n \to \infty$ such that
$$\sum_{i=1}^{n} \epsilon_{(R_i/b_n,\, \Theta_i)} \Rightarrow \text{PRM}(c\,\nu_\alpha \times S) \qquad (9)$$
in $M_p\bigl((0, \infty] \times \aleph_+\bigr)$.
These conditions imply that for any sequence $k = k(n) \to \infty$ such that $n/k \to \infty$ we have:

8. In $M_+(E)$,
$$\frac{1}{k}\sum_{i=1}^{n} \epsilon_{Z_i/b(n/k)} \Rightarrow \nu, \qquad (10)$$
and
9. In $M_+\bigl((0, \infty] \times \aleph_+\bigr)$,
$$\frac{1}{k}\sum_{i=1}^{n} \epsilon_{(R_i/b(n/k),\, \Theta_i)} \Rightarrow c\,\nu_\alpha \times S, \qquad (11)$$
and 8 or 9 is equivalent to any of 1–7, provided $k(\cdot)$ satisfies $k(n)/k(n+1) \to 1$.

3.1. Variant with time coordinate. The result with a time coordinate is needed for proving weak convergence of partial sum processes or maximal processes in the space $D[0, \infty)$.
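A small simulation sketch of item 9 (illustrative choices, not from the lectures): for iid bivariate vectors with independent Pareto($\alpha$) components, keep the roughly $k$ observations with the largest norms; their scaled radii approximate $\nu_\alpha$ and their angles approximate the angular measure $S$ (which, for independent components, concentrates near the axes).

```python
import numpy as np

rng = np.random.default_rng(2)
alpha, n, k = 1.5, 200_000, 2_000

# iid vectors with independent Pareto(alpha) components (an illustrative MRV example;
# independence of components makes the angular measure S concentrate near the axes)
Z = rng.pareto(alpha, size=(n, 2)) + 1.0

R = np.linalg.norm(Z, axis=1)            # radius |Z|
Theta = Z / R[:, None]                   # angle Z/|Z| on the unit-sphere part of C
b_nk = np.quantile(R, 1 - k / n)         # empirical b(n/k): the upper k/n quantile of R

top = R > b_nk                           # the roughly k most extreme points
# Radial check: (1/k) * #{i : R_i/b(n/k) > r} should be near r^{-alpha}
for r in (1.5, 2.0, 4.0):
    emp = np.sum(R[top] / b_nk > r) / k
    print(f"r={r}: empirical {emp:.3f}  vs  r^-alpha = {r ** -alpha:.3f}")

# Empirical angular measure S: histogram of the angle of the extreme points
ang = np.arctan2(Theta[top, 1], Theta[top, 0])   # in [0, pi/2] for nonnegative vectors
hist, edges = np.histogram(ang, bins=6, range=(0, np.pi / 2), density=True)
print("estimated angular density on [0, pi/2]:", np.round(hist, 2))
```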

Theorem 3. Suppose $\{Z, Z_1, Z_2, \dots\}$ are iid random elements of $[0, \infty)^d$. Then multivariate regular variation of the distribution of $Z$ on $C = [0, \infty]^d \setminus \{0\}$,
$$nP\Bigl[\frac{Z}{b_n} \in \cdot\Bigr] \xrightarrow{v} \nu,$$
is also equivalent to
$$\sum_j \epsilon_{(j/n,\, Z_j/b_n)} \Rightarrow \text{PRM}(\text{LEB} \times \nu) \qquad (12)$$
in $M_p([0, \infty) \times C)$.

4. Weak convergence of partial sums to stable processes

We can specialize the result on convergence of sums to Lévy processes. Assume, for simplicity, $d = 1$ and $C = [-\infty, \infty] \setminus \{0\}$. Regular variation of the tail probabilities on the cone $\mathbb{R} \setminus \{0\}$,
$$tP\Bigl[\frac{Z_1}{b(t)} \in \cdot\Bigr] \xrightarrow{v} \nu(\cdot),$$
is equivalent to $P[|Z_1| > x] \in RV_{-\alpha}$ together with the tail balance conditions
$$\lim_{x \to \infty} \frac{P[Z_1 > x]}{P[|Z_1| > x]} = p, \qquad \lim_{x \to \infty} \frac{P[Z_1 \le -x]}{P[|Z_1| > x]} = q = 1 - p.$$

Corollary 1. Consider the special case where $\{Z_n, n \ge 1\}$ are iid random variables on $\mathbb{R}$ and set $X_{n,j} = Z_j/b_n$ for some $b_n \to \infty$. Define $\nu$ for $x > 0$ and $0 < \alpha < 2$ by
$$\nu((x, \infty]) = p x^{-\alpha}, \qquad \nu((-\infty, -x]) = q x^{-\alpha}, \qquad (13)$$
where $0 \le p \le 1$ and $q = 1 - p$. Then
$$\sum_{j=1}^{[n\cdot]} \frac{Z_j}{b(n)} - [n\cdot]\, E\Bigl(\frac{Z_1}{b(n)}\, 1_{\bigl[\bigl|\frac{Z_1}{b(n)}\bigr| \le 1\bigr]}\Bigr) \Rightarrow X_\alpha(\cdot), \qquad (14)$$
in $D[0, \infty)$, where the limit is $\alpha$-stable Lévy motion with Lévy measure $\nu$, iff
$$nP\Bigl[\frac{Z_1}{b_n} \in \cdot\Bigr] \xrightarrow{v} \nu \qquad (15)$$
in $M_+(C)$.

Proof of sufficiency. Given regular variation, plug into the result about convergence of sums to a Lévy process. We only need to check the truncated second moment condition
$$\lim_{\epsilon \downarrow 0} \limsup_{n \to \infty} nE\bigl(X_{n,1}^2\, 1_{[|X_{n,1}| \le \epsilon]}\bigr) = 0, \quad\text{where } X_{n,1} = \frac{Z_1}{b(n)}.$$
We have by Karamata's theorem,
$$nE\Bigl(\Bigl(\frac{Z_1}{b(n)}\Bigr)^2 1_{\bigl[\bigl|\frac{Z_1}{b(n)}\bigr| \le \epsilon\bigr]}\Bigr) \to \int_{[|x| \le \epsilon]} x^2\,\nu(dx) = \frac{p\alpha\epsilon^{2-\alpha}}{2-\alpha} + \frac{q\alpha\epsilon^{2-\alpha}}{2-\alpha} = (\text{const})\,\epsilon^{2-\alpha},$$
and as $\epsilon \downarrow 0$ we have $\epsilon^{2-\alpha} \to 0$, as required for the partial sum process to converge to the Lévy process.
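A simulation sketch of Corollary 1 in the symmetric case $p = q = 1/2$ (illustrative choices, not from the lectures; by symmetry the centering term in (14) vanishes): the normalized sums $S_n/b_n$ with $b_n = n^{1/\alpha}$ should have approximately the same $\alpha$-stable distribution for different $n$.

```python
import numpy as np

rng = np.random.default_rng(3)
alpha = 1.2          # tail index, 0 < alpha < 2
nrep = 5_000         # independent copies of the normalized sum

def sym_pareto(size):
    """Symmetric heavy tails: P[|Z| > z] = z^{-alpha} for z >= 1, with p = q = 1/2."""
    mag = rng.pareto(alpha, size) + 1.0
    sign = rng.choice([-1.0, 1.0], size)
    return sign * mag

def normalized_sum(n):
    """S_n / b_n with b_n = n^{1/alpha}; centering is unnecessary by symmetry."""
    Z = sym_pareto((nrep, n))
    return Z.sum(axis=1) / n ** (1 / alpha)

q = [0.05, 0.25, 0.5, 0.75, 0.95]
for n in (100, 1_000):
    print(f"n={n:5d}  quantiles of S_n/b_n:", np.round(np.quantile(normalized_sum(n), q), 2))
# The two rows should nearly agree: both normalized sums are close to the same
# symmetric alpha-stable law, the limit in (14).
```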


References

O. Kallenberg. Random Measures. Akademie-Verlag, Berlin, third edition, 1983.
J. Neveu. Processus ponctuels. In École d'Été de Probabilités de Saint-Flour VI, 1976, Lecture Notes in Math., Vol. 598. Springer-Verlag, Berlin, 1977.
S.I. Resnick and P. Greenwood. A bivariate stable characterization and domains of attraction. J. Multivariate Anal., 9(2), 1979.
S.I. Resnick. Heavy Tail Phenomena: Probabilistic and Statistical Modeling. Springer Series in Operations Research and Financial Engineering. Springer-Verlag, New York, 2007.
S.I. Resnick. Extreme Values, Regular Variation and Point Processes. Springer-Verlag, New York, 1987.

Applied Probability Modeling of Data Networks; 3. Intro & survey of data network modeling.

Sidney Resnick, School of Operations Research and Industrial Engineering, Rhodes Hall, Cornell University, Ithaca NY 14853 USA. http://www.orie.cornell.edu/~sid

August 5, 2007

1. Introduction

Action: Does data network traffic behave statistically like telephone network traffic? Stop assuming the two types of networks behave the same; start checking.

Initially at Bellcore (now Telcordia) and later at AT&T Labs-Research, high resolution measurements (data) were collected. Usually the data consisted of counts of bits, bytes, packets etc. per unit time (e.g. per millisecond). This could then be aggregated to coarser time scales, for example from 1 millisecond to 10 milliseconds to 1 second and beyond.

Significant examples:
- LANs and WANs: Willinger et al. (1997), Duffy et al. (1993), Leland et al. (1993), Willinger et al. (1995).
- WWW traffic: Crovella and Bestavros (1996), Crovella and Bestavros (1997).

Measurements on data networks exhibit features surprising by the standards of classical queueing and telephone network models. These are called invariants, which are to networks what stylized facts are to finance.

2. Stylized facts

1. Heavy tails abound for such things as file sizes, transmission rates, and durations (file transfers, connection lengths). (See Arlitt and Williamson (1996), Leland et al. (1994), Maulik et al. (2002), Resnick and Rootzén (2000), Resnick (2003), Willinger and Paxson (1998), Willinger et al. (1998), Willinger (1998).)
Note: a random variable $X$ has a heavy tail if
$$P[X > x] \approx x^{-\alpha}, \qquad \alpha > 0,\ x \text{ large}.$$
The tail exhibits power law decay. Limited moments: $E(|X|^{\alpha + \delta}) = \infty$ for $\delta > 0$; if $1 < \alpha < 2$, no variance!

2. The number of bits or packets per slot exhibits long range dependence across time slots (e.g., Leland et al. (1993), Willinger et al. (1995)). There is also a perception of self-similarity as the width of the time slot varies across a range of time scales exceeding a typical round trip time.
Note: a stationary process $\{X_n\}$ possesses long range dependence if the dependence between variables decays slowly as the gap between the variables increases:
$$\mathrm{Corr}(X_0, X_h) \approx (\text{const})\, h^{-\beta}, \qquad 0 < \beta < 1.$$
3. Network traffic is bursty, with rare but influential periods of very high transmission rates punctuating typical periods of modest activity.
- "Bursty" is a somewhat ill defined concept associated with heavy tailed transmission rates.
- Introduces peak loads to the network.
- Associated with large files transmitted over fast links.
- Not associated with the truism: "Traffic at a heavily loaded link is normally distributed."

3. Broad Issues (BIs)

BI-1 Role of statistics and applied probability.
Statistics: empirically identify phenomena and properties of the data so as to better understand what network data in the wild should look like. NOT prediction, typically. Examples:
- Identify the presence of heavy tails, long range dependence, self-similarity;
- Understand the different statistical properties of various applications and protocols (ftp, http, mail, streaming audio).
Applied probability: build models which explain relations and explain empirically observed phenomena.
- Example: sizes of files stored on a server follow a Pareto power law tail, which causes long range dependence. Paradigm: heavy tails cause long range dependence.
- Build models of end user behavior which allow construction of simulation tools to study the effect of tweaking protocols.

BI-2 Problem of time scales. Can applied math, applied probability & statistics make contributions to data network analysis and planning in "Internet time"?
- Developers have short attention spans and little patience with outsiders' toys.
- A two year time horizon to write a PhD thesis and really understand something is a ridiculously long time horizon.
- Pessimist's view: the best the mathy community can hope for is to cause paradigm shifts with explanations which may lag behind developments.
- "Take the money and run" mentality. ("Anyone who gets a PhD does not understand economics.") Long development time for a project means other people are earning.

4. An approach to modeling: infinite source Poisson model

A fluid model. Suppose sessions are characterized by:
- Session initiation times $\{\Gamma_k\}$, where $\{\Gamma_k\}$ is homogeneous Poisson on $(-\infty, \infty)$ with rate $\lambda$.
- A sequence of iid marks independent of $\{\Gamma_k\}$: each Poisson point $\Gamma_k$ receives a mark which characterizes its input characteristics:
$$(F_k, L_k, R_k) = (\text{file size},\ \text{duration},\ \text{rate}), \qquad F_k = L_k R_k.$$
All three quantities are seen empirically to be marginally heavy tailed:
$$P[F > x] \approx x^{-\alpha_F}, \qquad P[L > x] \approx x^{-\alpha_L}, \qquad P[R > x] \approx x^{-\alpha_R},$$
with (usually) $1 < \alpha_F, \alpha_R, \alpha_L < 2$.

Examples of mark $(F_k, L_k, R_k)$ structures:
- Early simple models assumed constant input rates, $R_k \equiv 1$, so that the total input rate at time $t$ is $M(t) = \#$ active sources at time $t$.
- Random but constant rates $R_k$ during the transmission interval. (Can even make the rate time varying.)
What is the distribution of the triple $(F_k, L_k, R_k)$?
- Can depend on the application protocol.
- Can depend on how data is segmented and how sessions are defined from connection level data of packet headers.
- Different distributional assumptions lead to radically different model predictions.

Fluid input models for the cumulative input in time $(a, b]$:
$$A(a, b] = \text{work inputted in the time interval } (a, b].$$
Process approximation to cumulative input.

Large time scale approximation:
$$\frac{A(0, Tt] - b(T)\,t}{a(T)} \xrightarrow{d} X(t), \qquad T \to \infty,$$
where possible limits include (Mikosch et al., 2002)
- fractional Brownian motion (Gaussian marginals, lrd),
- stable Lévy motion (heavy tailed marginals, stationary independent increments, self-similar).
BUT: it is not easy to find agreement between these approximate models and data (Guerin et al., 2003).

Small time scale approximations: block time into small time slots $(k\delta, (k+1)\delta]$ and consider, as $\delta \to 0$,
$$\{A(k\delta, (k+1)\delta],\ k \in \mathbb{N}\},$$
depending on the interaction of input rates and tails. Will need $\lambda = \lambda(\delta) \to \infty$ (à la heavy traffic limit theorems).

Small time scale approximation results depend on the distributional assumptions on $(F, L, R)$. Either
- the distributional limit is approximately highly correlated normal random variables, or
- the distributional limit is approximately highly dependent stable infinite variance random variables.
Compute a dependence measure across different slots. The models predict lrd for $\{A(k\delta, (k+1)\delta],\ k \in \mathbb{N}\}$ (D'Auria and Resnick, 2006a,b).

5. Summary: Stylized facts and small time scale approximation

- Stylized fact 1 (heavy tails): built in, for both the (F, R) model and the (L, R) model.
- Stylized fact 2 (tail of traffic per slot): (F, R) model: $P[A(0, \delta] > x] \approx x^{-(\alpha_R + \alpha_F)}$, fixed $\delta$, $x \to \infty$; (L, R) model: $P[A(0, \delta] > x] \approx x^{-\alpha_R}$, fixed $\delta$, $x \to \infty$.
- Stylized fact 3 (LRD across slots): (F, R) model: $\mathrm{Cov}(k) \sim c\,\bar F_L^{(0)}(k)$, fixed $\delta$, $k \to \infty$; (L, R) model: $\mathrm{EDM}(k) \sim c\,\bar F_L^{(0)}(k)$, fixed $\delta$, $k \to \infty$.
- Stylized fact 4 (is cumulative traffic per slot N(0,1)?): (F, R) model: $\bigl(A(0, \delta] - \text{centering}(\delta)\bigr)/a(\delta) \xrightarrow{d} N(0, 1)$; (L, R) model: $\bigl(A(0, \delta] - \text{centering}(\delta)\bigr)/b(\delta) \xrightarrow{d} X_{\alpha_R}$.

6. Some Technical Points

Tech Pt 1: Identify in the connection level data Poisson time points from packet headers, and validate the choice.
Quick & dirty (Q&D) solution: check whether the interpoint distances are iid (sample acf almost 0) and exponentially distributed (qq-plots).
Q&D rules of thumb:
- The behavior of lots of humans acting independently is often well modelled by a Poisson process.
- Starting times of machine triggered downloads cannot be modelled as a Poisson process.
Example: UCB: inter-arrival times of requests in http sessions via modem. (A sketch of the Q&D check follows.)
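A minimal sketch of the Q&D Poisson check (synthetic arrival times stand in for the UCB trace, which is not reproduced here): compare the interarrival quantiles with exponential quantiles, and look at the sample autocorrelations of the gaps.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic stand-in for observed arrival times: a homogeneous Poisson process on [0, T]
lam, T = 2.0, 5_000.0
arrivals = np.cumsum(rng.exponential(1 / lam, size=int(1.2 * lam * T)))
arrivals = arrivals[arrivals <= T]
gaps = np.diff(arrivals)                               # interpoint distances

# QQ check against the exponential: sorted gaps vs exponential quantiles (should be ~linear)
n = len(gaps)
exp_q = -np.log(1 - (np.arange(1, n + 1) - 0.5) / n)   # standard exponential quantiles
slope = np.polyfit(exp_q, np.sort(gaps), 1)[0]         # should be near the mean gap 1/lam
print(f"QQ slope {slope:.3f}  vs  mean gap {gaps.mean():.3f}")

# Sample autocorrelation of the gaps at small lags (should be ~0 if iid)
def sample_acf(x, lags):
    x = x - x.mean()
    denom = np.sum(x * x)
    return [np.sum(x[:-h] * x[h:]) / denom for h in lags]

acf = sample_acf(gaps, range(1, 6))
print("acf(1..5):", np.round(acf, 3), " 95% band +/-", round(1.96 / np.sqrt(n), 3))
```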

[Figure: UCB data, http sessions via modem. Left: QQ-plot of interarrival times against the exponential distribution. Right: autocorrelation function of the interarrival times.]

Tech Pt 2: Heavy tails. A rv $X$ has a heavy (right) tail if
$$P[X > x] \approx x^{-\alpha}, \qquad x \to \infty.$$
Notes:
- $0 < \alpha < 1$: very heavy; mean & variance infinite.
- $1 < \alpha < 2$: heavy; the frequent case where the mean is finite but the variance is infinite.
- $\alpha > 2$: heavy with finite variance; typical of financial data.
For large $x$, $P[\log X > x] \approx e^{-\alpha x}$, $x \to \infty$, so inference can be based on the exponential density and thresholding techniques to account for the distribution following this law only for large $x$. For many purposes, one does not need to know the whole distribution but just the tail.
Example: BU data: an influential study from the mid 90s of file sizes downloaded in a web session. (A small estimation sketch follows.)
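A small sketch of the two standard tail diagnostics just described (synthetic Pareto data stands in for the BU file sizes; the choice of $k$ upper order statistics is illustrative): the Hill estimator and a QQ-plot slope estimate of $1/\alpha$ based on the log exceedances.

```python
import numpy as np

rng = np.random.default_rng(5)
alpha_true, n, k = 0.66, 50_000, 1_000             # k = number of upper order statistics used
X = rng.pareto(alpha_true, n) + 1.0                # stand-in heavy tailed sample, tail x^{-alpha}

order = np.sort(X)[::-1]                           # decreasing order statistics X_(1) >= X_(2) >= ...

# Hill estimator based on the k largest observations
hill_inv_alpha = np.mean(np.log(order[:k]) - np.log(order[k]))
print(f"Hill estimate of alpha: {1 / hill_inv_alpha:.3f}  (true {alpha_true})")

# QQ-plot estimate: regress the log exceedances of the k largest order statistics on
# exponential quantiles; the slope estimates 1/alpha (log of a Pareto tail is ~exponential)
exp_q = -np.log(np.arange(1, k + 1) / (k + 1))     # quantiles -log(i/(k+1)), i = 1..k
slope = np.polyfit(exp_q, np.log(order[:k] / order[k]), 1)[0]
print(f"QQ-plot estimate of alpha: {1 / slope:.3f}")
```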

[Figure: sizes of WWW downloads, BU experiment, Room 272, Nov 94. Left: QQ-plot of log sorted data against exponential quantiles. Right: Hill plot of the estimate of alpha against the number of order statistics.]

Tech Pt 3: Checking for independence.
Q&D method 1: the standard time series method checks whether the sample correlation function
$$\hat\rho(h) = \frac{\sum_{i=1}^{n-h} (X_i - \bar X)(X_{i+h} - \bar X)}{\sum_{i=1}^{n} (X_i - \bar X)^2}, \qquad h = 1, 2, \dots,$$
is close to identically 0. How to put meaning to the phrase "close to 0"?
- If finite variances, Bartlett's formula provides asymptotic normal theory.
- If heavy tailed, the Davis and Resnick formula provides asymptotic distributions for $\hat\rho(h)$.
Q&D method 2: if the data are heavy tailed, take a function of the data (say the log) to get a lighter tail and test. (But this may obscure the importance of large values.)
Q&D method 3: subset method. Split the data into (say) 2 subsets. Plot the acf of each half separately. If iid, the pictures should look the same.

Example: UNC connection data. The data contain connection vectors ordered by the connection initiation times. Clusters are obtained by considering only the connection start times, ordered as they appeared in time on the UNC link, and arbitrarily using a time threshold of 5 milliseconds to separate clusters. This yields the clusters. Do the cluster heads look Poisson?

[Figure: QQ-plot of cluster head interarrivals against exponential quantiles, and sample ACF of the differenced cluster initiation times.]

Tech Pt 4: Is the data stationary? Usually not, and there are, for example, diurnal cycles. (There is only so much Red Bull (Jolt, Coke, ...) that a human can consume.)
Q&D coping method: take a slab of the copious data which looks stationary. Usually there is too much data. (!!??)
- Rule of thumb: don't take more than 4 hours.
- Depending on the data, it can be a couple of minutes; e.g. connection data.
Should we try to model the cycles?

Tech Pt 5: Long range dependence. A stationary $L_2$ sequence $\{\xi_n, n \ge 1\}$ has long-range dependence if
$$\mathrm{Cov}(\xi_n, \xi_{n+h}) \approx h^{-\beta}, \qquad h \to \infty,$$
for $0 < \beta < 1$. How to test? Q&D method: the sample acf should not drop to 0 quickly as the lag increases.
EXAMPLE: CompanyX: packet counts per unit time on CompanyX's WAN (including trans-Atlantic traffic).

[Figure: sample ACF of the CompanyX packet counts per unit time (series pktcount).]

7. How do heavy tails cause long range dependence?

Assume the infinite source Poisson model with
$$R_k \equiv 1, \qquad F_k = L_k, \qquad 1 < \alpha_L < 2.$$
Recall
$$M(t) = \#\ \text{active sources at time } t = \text{total input rate at time } t = \text{analogue of the packet count per unit time}.$$
Background and warmup: for each fixed $t$, $M(t)$ is a Poisson random variable. Why? When $1 < \alpha_L < 2$, $M(\cdot)$ has a stationary version. Assume
$$\sum_k \epsilon_{\Gamma_k} = \text{PRM}(\lambda\,dt) \quad\text{on } \mathbb{R}. \quad\text{Then}\quad \xi := \sum_k \epsilon_{(\Gamma_k, L_k)} = \text{PRM}(\lambda\,dt \times F_L)$$

on $\mathbb{R} \times [0, \infty)$, and
$$M(t) = \sum_k 1_{[\Gamma_k \le t < \Gamma_k + L_k]} = \xi\bigl(\{(s, l) : s \le t < s + l\}\bigr) = \xi(B)$$
is Poisson with mean
$$E\Bigl(\xi\bigl(\{(s, l) : s \le t < s + l\}\bigr)\Bigr) = \int_{s=-\infty}^{t} \bar F_L(t - s)\,\lambda\,ds = \lambda\mu_L.$$
[Figure: the region $B = \{(s, l) : s \le t < s + l\}$ in the $(s, l)$ plane.]

The process $\{M(t), t \in \mathbb{R}\}$ is stationary, with covariance function computed as follows. For $s \ge 0$, write $M(t) = \xi(A_1) + \xi(A_2)$ and $M(t+s) = \xi(A_2) + \xi(A_3)$, where $A_2$ consists of the (start time, length) pairs of sessions active at both $t$ and $t+s$. Then
$$\mathrm{Cov}(M(t), M(t+s)) = \mathrm{Cov}\bigl(\xi(A_1) + \xi(A_2),\ \xi(A_2) + \xi(A_3)\bigr),$$
and because $A_1$, $A_2$, $A_3$ are disjoint, so that $\xi(A_1)$, $\xi(A_2)$, $\xi(A_3)$ are independent, this is
$$= \mathrm{Cov}\bigl(\xi(A_2), \xi(A_2)\bigr) = \mathrm{Var}\bigl(\xi(A_2)\bigr) = E\bigl(\xi(A_2)\bigr) = \lambda\int_{u=-\infty}^{t} \bar F_L(t + s - u)\,du = \lambda\int_{s}^{\infty} \bar F_L(v)\,dv \sim c\, s\bar F_L(s) = c'\, s^{-(\alpha_L - 1)} l(s).$$
[Figure: the regions $A_1$, $A_2$, $A_3$ in the (start time, length) plane, relative to the times $t$ and $t+s$.]

The slow decay of the covariance as a function of the lag $s$ characterizes long range dependence.
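A simulation sketch of the mechanism just computed (illustrative parameter choices, not from the lectures): simulate the infinite source Poisson model with Pareto($\alpha_L$) durations, $1 < \alpha_L < 2$, and look at the sample autocorrelation of the active-source count $M$; it should decay like a power of the lag rather than exponentially.

```python
import numpy as np

rng = np.random.default_rng(6)
lam, alpha_L, T = 5.0, 1.5, 5_000.0     # arrival rate, duration tail index in (1,2), horizon

# Infinite source Poisson input: Poisson session starts on [0, T], Pareto(alpha_L) durations
n_arr = rng.poisson(lam * T)
starts = rng.uniform(0.0, T, n_arr)
durations = rng.pareto(alpha_L, n_arr) + 1.0      # P[L > x] = x^{-alpha_L}, x >= 1

# M(t) = number of sessions active at integer times t (discard a warm-up period)
grid = np.arange(500.0, T, 1.0)
M = np.array([np.sum((starts <= t) & (t < starts + durations)) for t in grid])

def sample_acf(x, lags):
    x = x - x.mean()
    denom = np.sum(x * x)
    return np.array([np.sum(x[:-h] * x[h:]) / denom for h in lags])

lags = [1, 5, 10, 50, 100, 500]
print("sample ACF of M at lags", lags, ":", np.round(sample_acf(M, lags), 3))
# The autocorrelation decays roughly like a power of the lag (about lag^{-(alpha_L - 1)}),
# not exponentially: heavy tailed durations produce long range dependence in M.
```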

8. References


M. Arlitt and C.L. Williamson. Web server workload characterization: the search for invariants (extended version). In Proceedings of the ACM Sigmetrics Conference, Philadelphia, PA, 1996.
M. Crovella and A. Bestavros. Self-similarity in world wide web traffic: evidence and possible causes. In Proceedings of the ACM SIGMETRICS '96 International Conference on Measurement and Modeling of Computer Systems, volume 24, 1996.
M. Crovella and A. Bestavros. Self-similarity in world wide web traffic: evidence and possible causes. IEEE/ACM Transactions on Networking, 5(6), 1997.
B. D'Auria and S.I. Resnick. Data network models of burstiness. Adv. in Appl. Probab., 38(2), June 2006a.
B. D'Auria and S.I. Resnick. The influence of dependence on data network models of burstiness. Technical Report #1449, Cornell University, 2006b. Available at legacy.orie.cornell.edu/~sid.
D.E. Duffy, A.A. McIntosh, M. Rosenstein, and W. Willinger. Analyzing telecommunications traffic data from working common channel signaling subnetworks. In M.E. Tarter and M.D. Lock, editors, Computing Science and Statistics Interface, Proceedings of the 25th Symposium on the Interface, volume 25, San Diego, California, 1993.
C.A. Guerin, H. Nyberg, O. Perrin, S.I. Resnick, H. Rootzén, and C. Stărică. Empirical testing of the infinite source Poisson data traffic model. Stochastic Models, 19(2), 2003.
W.E. Leland, M.S. Taqqu, W. Willinger, and D.V. Wilson. Statistical analysis of high time-resolution ethernet LAN traffic measurements. In Proceedings of the 25th Symposium on the Interface between Statistics and Computer Science, 1993.
W.E. Leland, M.S. Taqqu, W. Willinger, and D.V. Wilson. On the self-similar nature of ethernet traffic (extended version). IEEE/ACM Trans. Netw., 2(1):1-15, 1994.
K. Maulik, S.I. Resnick, and H. Rootzén. Asymptotic independence and a network traffic model. J. Appl. Probab., 39(4), 2002.
T. Mikosch, S.I. Resnick, H. Rootzén, and A.W. Stegeman. Is network traffic approximated by stable Lévy motion or fractional Brownian motion? Ann. Appl. Probab., 12(1):23-68, 2002.
S.I. Resnick and H. Rootzén. Self-similar communication models and very heavy tails. Ann. Appl. Probab., 10, 2000.
S.I. Resnick. Modeling data networks. In B. Finkenstadt and H. Rootzén, editors, SemStat: Seminaire Europeen de Statistique, Extreme Values in Finance, Telecommunications, and the Environment. Chapman-Hall, London, 2003.
W. Willinger and V. Paxson. Where mathematics meets the Internet. Notices of the American Mathematical Society, 45(8), 1998.
W. Willinger, M.S. Taqqu, W.E. Leland, and D.V. Wilson. Self-similarity in high speed packet traffic: analysis and modelling of ethernet traffic measurements. Statist. Sci., 10:67-85, 1995.
W. Willinger, M.S. Taqqu, W.E. Leland, and D.V. Wilson. Self-similarity through high variability: statistical analysis of ethernet LAN traffic at the source level (extended version). IEEE/ACM Transactions on Networking, 5(1):71-96, 1997.
W. Willinger, V. Paxson, and M.S. Taqqu. Self-similarity and heavy tails: structural modeling of network traffic. In R.J. Adler, R.E. Feldman, and M.S. Taqqu, editors, A Practical Guide to Heavy Tails: Statistical Techniques and Applications. Birkhäuser Boston Inc., Boston, MA, 1998.
W. Willinger. Data network traffic: heavy tails are here to stay. Presentation at Extremes Risk and Safety, Nordic School of Public Health, Gothenberg, Sweden, August 1998.

Applied Probability Modeling of Data Networks; 4. Some Stylized Applied Probability Network Models: A Renewal Input Model.

Sidney Resnick, School of Operations Research and Industrial Engineering, Rhodes Hall, Cornell University, Ithaca NY 14853 USA. http://www.orie.cornell.edu/~sid, sir1@cornell.edu

August 1, 2007

1. Introduction: Very Heavy Tails

Attempts to explain network self-similarity focus on heavy tailed transmission times of sources sending data to one or more servers:
$$P[\text{On-period duration} > x] \approx x^{-\beta}.$$
Reasons for assuming $1 < \beta < 2$:
- (Applied) Willinger et al. (Bellcore) analyzed 700+ source destination pairs and estimated the tail parameter of on-periods. The value was usually in the range (1, 2).
- (Theoretical) Mathematical analysis has been based on renewal theory. Without a finite mean, stationary versions of renewal processes do not exist and (uncontrolled) buffer content stochastic processes would not be stable.
Need for the case $0 < \beta < 1$: BUT
- The BU study (Crovella, Bestavros, ...) of file sizes downloaded in www sessions over 4 months in 2 labs: in November 1994, in room 272, $\beta \approx 0.66$.
- A Calgary study of file lengths downloaded from various servers found $\beta$'s in the range 0.4 to 0.6.

[Figure 1: BU file lengths, Nov 1994, Room 272. Left: QQ-plot of log sorted data against exponential quantiles. Right: Hill plot of the estimate of alpha against the number of order statistics.]

1.1. Summary of the difficulties with the case $0 < \beta < 1$.
- Models will not be stable in the conventional senses; normalization is necessary to keep control.
- Models will not have stationary versions.
- Some common performance measures which are expressed in terms of moments may not be applicable.
- Nervousness about models where moments do not exist. Confusion between the concepts of unbounded support and infinite moments. (The normal, exponential, gamma, Weibull, ... have unbounded support.)

2. Infinite source Poisson model = M/G/$\infty$ input model

Notation and concepts:
$\{\Gamma_n, n \ge 1\}$ = times of session initiations; homogeneous Poisson points on $[0, \infty)$ with rate $\lambda$.
$\{L_n\}$ = session durations; iid non-negative rvs with common distribution $F_L(x)$ satisfying
$$\bar F_L(x) = x^{-\beta} l(x), \qquad 0 < \beta < 1.$$
$$m(t) = \int_0^t \bar F_L(s)\,ds \sim c\,t^{1-\beta} l(t), \quad\text{the truncated mean function}.$$
$$M(t) = \text{number of active sessions at } t = \sum_{n=1}^{\infty} 1_{[\Gamma_n \le t < \Gamma_n + L_n]} = \#\ \text{of busy servers in the M/G/}\infty\ \text{telephone model}; \qquad E(M(t)) = \lambda m(t).$$
$$A(t) = \int_0^t M(s)\,ds = \text{cumulative work inputted in } [0, t], \text{ assuming unit rate input}.$$

$r$ = server work rate;
$X(t)$ = content process, with
$$dX(t) = dA(t) - r\,1_{[X(t) > 0]}\,dt;$$
$$\tau(\gamma) = \inf\{t > 0 : X(t) \ge \gamma\}.$$

3. Sessions initiated at renewal times (Mikosch)

See Mikosch and Resnick (2006).

3.1. The model.
$\{S_n, n \ge 1\}$ = times of session initiations; an ordinary renewal process:
$$S_0 = 0, \qquad S_n = \sum_{i=1}^{n} X_i, \qquad \{X_n\} \text{ iid}, \quad X_i \sim F_X, \quad \bar F_X(x) = 1 - F_X(x) \in RV_{-\alpha}.$$
$\{L_n\}$ = session durations; iid non-negative rvs with common distribution $F_L(x)$ satisfying $\bar F_L \in RV_{-\beta}$, with $\{L_n\}$ independent of $\{S_n\}$.
$$M(t) = \text{number of active sessions at } t = \sum_{n=1}^{\infty} 1_{[S_n \le t < S_n + L_n]}.$$
$$A(t) = \int_0^t M(s)\,ds = \text{cumulative work inputted in } [0, t], \text{ assuming unit rate input}.$$

3.2. Cases.
1. Comparable tails: $\beta = \alpha$ and $\bar F_X(x) \sim c\,\bar F_L(x)$, $c > 0$, as $x \to \infty$.
(a) The distribution tails of $X_1$ and $L_1$ are essentially the same.
(b) For simplicity, we assume $c = 1$.
(c) A kind of stability for $M$, which converges weakly without normalization.
2. $F_L$ heavier-tailed:
(a) $0 < \beta < \alpha < 1$, or, if $\beta = \alpha$, then $\bar F_X(x)/\bar F_L(x) \to 0$ as $x \to \infty$, so that the distribution tail of $X_1$ is lighter than the distribution tail of $L_1$.
(b) $0 = \beta < \alpha < 1$, so that the distribution tail of $L_1$ is slowly varying and thus again heavier than that of $X_1$.
This case implies buildup in the $M$ process.
3. $F_X$ heavier-tailed: $\beta > \alpha$, so that the distribution tail of $X_1$ is heavier than the distribution tail of $L_1$. Renewal epochs are sparse relative to session lengths. Of less interest.

[Figure 2: Paths of $M$; $\alpha = \beta = 0.9$ (left); $\alpha = \beta = 0.6$ (right).]

[Figure 3: Paths of $M$; $(\alpha, \beta) = (0.9, 0.2)$ (left); $(\alpha, \beta) = (0.9, 0.4)$ (right).]

3.3. Warm-up: Mean value analysis for $\alpha, \beta < 1$. Obtain the asymptotic behavior of $E(M(t))$ from Karamata's Tauberian theorem. Let
$$U(x) = \sum_{n=0}^{\infty} F_X^{n*}(x), \qquad x > 0, \quad\text{the renewal function}.$$
Since $0 < \alpha < 1$, it is well known (e.g. Feller, 1971) that, as $x \to \infty$,
$$U(x) \sim \bigl(\Gamma(1-\alpha)\Gamma(1+\alpha)\,\bar F_X(x)\bigr)^{-1} =: c(\alpha)\,x^{\alpha}/l_F(x).$$
Therefore
$$E M(t) = \int_0^t \bar F_L(t - x)\,U(dx) = \bar F_L(t)U(t)\int_0^1 \frac{\bar F_L(t(1-s))}{\bar F_L(t)}\,\frac{U(t\,ds)}{U(t)} \sim \bar F_L(t)U(t)\int_0^1 (1-s)^{-\beta}\,\alpha s^{\alpha-1}\,ds \sim c'(\alpha)\,\frac{\bar F_L(t)}{\bar F_X(t)}.$$

So
$$E M(t) \sim c'(\alpha)\,\frac{\bar F_L(t)}{\bar F_X(t)}, \qquad (t \to \infty).$$
Conclusions:
Case (1): comparable tails. $E(M(t))$ converges to a constant.
Case (2): $F_L$ is more heavy-tailed than $F_X$. $EM(t) \to \infty$.
Case (3): $F_X$ is more heavy-tailed than $F_L$. $EM(t) \to 0$, hence $M(t) \xrightarrow{L_1} 0$ and so $M(t) \Rightarrow 0$. So Case (3) may be of lesser interest. (Renewals are so sparse relative to event durations that at any time there is not likely to be an event in progress.)
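A Monte Carlo sketch of the mean value analysis for Case (2) (illustrative choices, not from the lectures): Pareto($\alpha$) interarrivals and Pareto($\beta$) durations with $\beta < \alpha < 1$, for which $\bar F_L(t)/\bar F_X(t) = t^{\alpha - \beta}$, so $EM(t)$ should grow at that rate.

```python
import numpy as np

rng = np.random.default_rng(7)
alpha, beta = 0.9, 0.5        # interarrival tail index alpha, duration tail index beta < alpha

def mean_M(t, nrep=2_000):
    """Monte Carlo estimate of E M(t) for the renewal-input session model."""
    vals = np.empty(nrep)
    for r in range(nrep):
        gaps = rng.pareto(alpha, size=int(t) + 1) + 1.0   # Pareto(alpha) interarrivals, each >= 1,
        S = np.cumsum(gaps)                               # so S is guaranteed to pass t
        S = S[S <= t]
        L = rng.pareto(beta, size=len(S)) + 1.0           # Pareto(beta) session durations
        vals[r] = np.sum(t < S + L)                       # sessions started by t and still active
    return vals.mean()

for t in (1_000.0, 10_000.0):
    est = mean_M(t)
    rate = t ** (alpha - beta)    # F_L-bar(t) / F_X-bar(t) = t^(alpha - beta) for these tails
    print(f"t={t:8.0f}   E M(t) ~ {est:7.2f}   t^(alpha-beta) = {rate:7.2f}   ratio {est / rate:.3f}")
# The ratio should be roughly stable across t: E M(t) grows like F_L-bar(t)/F_X-bar(t).
```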

3.4. Summary of results.

Table 1: Limiting behavior of $M(t)$ as $t \to \infty$.
- Conditions: $0 < \alpha < 1$, $\bar F_X \sim \bar F_L$. Limit behavior: $M(t) \Rightarrow$ a random limit.
- Conditions: $0 \le \beta < \alpha < 1$, or $0 < \alpha = \beta < 1$ and $\bar F_X = o(\bar F_L)$. Limit behavior: $\dfrac{\bar F_X(t)}{\bar F_L(t)}\,M(t) \Rightarrow$ a random limit.
- Conditions: $0 < \beta < 1$, $E(X_1) < \infty$. Limit behavior: $M(t)/(t\bar F_L(t)) \to$ constant; centered and scaled, $M(t)$ has a Gaussian rv as limit.
- Conditions: $0 < \beta \le \alpha = 1$, $E(X_1) = \infty$. Limit behavior: $M(t)$ requires a random centering; $M(t)\,\mu(t)/(t\bar F_L(t)) \to$ constant, where $\mu(t)$ = the truncated first moment of $X$.
- Conditions: $E(X_1) < \infty$, $E(L_1) < \infty$. Limit behavior: a stationary version of $M(\cdot)$ exists.

Focus on the first 2 rows, corresponding to $\alpha, \beta < 1$.

3.5. Renewal input: $\alpha, \beta < 1$, comparable tails. A kind of stability exists for this case, since the finite dimensional distributions of $M(t\,\cdot)$ converge in distribution to a limit.

Preliminaries. Define
$$N(x) = \sum_{n=0}^{\infty} 1_{[S_n \le x]} = \inf\{n : S_n > x\} = S^{\leftarrow}(x), \qquad x \ge 0, \quad\text{the renewal counting function}.$$
Let
$$\sum_k \epsilon_{(t_k, j_k)} = \text{PRM}(\text{Leb} \times \nu_\alpha) \quad\text{on } [0, \infty) \times (0, \infty] := E, \qquad \nu_\alpha(x, \infty] = x^{-\alpha},$$
$$X_\alpha(t) = \sum_{t_k \le t} j_k, \qquad t \ge 0, \quad\text{a non-decreasing } \alpha\text{-stable Lévy motion with Lévy measure } \nu_\alpha,$$
$$b(t) = \Bigl(\frac{1}{1 - F_X}\Bigr)^{\leftarrow}(t), \qquad t \to \infty, \quad\text{so that } t\,\bar F_X(b(t)) \to 1; \quad b = \text{the quantile function of } F_X.$$

$$X^{(s)}(t) := \frac{S_{[st]}}{b(s)} \Rightarrow X_\alpha(t), \qquad (s \to \infty),$$
that is, the renewal epochs are asymptotically stable. Inverting, $(X^{(s)})^{\leftarrow} \Rightarrow X_\alpha^{\leftarrow}$, i.e.
$$\bar F_X(s)\,N(s\,\cdot) \Rightarrow X_\alpha^{\leftarrow}(\cdot),$$
or, setting $u^{-1} = \bar F_X(s)$,
$$\frac{1}{u}\,N(b(u)\,\cdot) \Rightarrow X_\alpha^{\leftarrow}(\cdot), \qquad u \to \infty, \qquad \frac{1}{u}\sum_{n=0}^{\infty} \epsilon_{S_n/b(u)} \Rightarrow X_\alpha^{\leftarrow} \quad\text{in } M_+[0, \infty).$$
We have not yet referenced $L$.

3.6. Comparable tails; $\alpha, \beta < 1$. Define the time change map $T : D[0, \infty) \times M_+(E) \mapsto M_+(E)$, $(E = [0, \infty) \times (0, \infty])$, by $T(x, m) = \tilde m$, where $\tilde m$ is defined by
$$\tilde m(f) = \int f(x(u), v)\,m(du, dv), \qquad f \in C_K^+(E).$$
If $m$ is a point measure with representation $m = \sum_k \epsilon_{(\tau_k, y_k)}$, then
$$T(x, m) = \sum_k \epsilon_{(x(\tau_k), y_k)}.$$


More information

Brownian Motion. 1 Definition Brownian Motion Wiener measure... 3

Brownian Motion. 1 Definition Brownian Motion Wiener measure... 3 Brownian Motion Contents 1 Definition 2 1.1 Brownian Motion................................. 2 1.2 Wiener measure.................................. 3 2 Construction 4 2.1 Gaussian process.................................

More information

GARCH processes continuous counterparts (Part 2)

GARCH processes continuous counterparts (Part 2) GARCH processes continuous counterparts (Part 2) Alexander Lindner Centre of Mathematical Sciences Technical University of Munich D 85747 Garching Germany lindner@ma.tum.de http://www-m1.ma.tum.de/m4/pers/lindner/

More information

Exponential Distribution and Poisson Process

Exponential Distribution and Poisson Process Exponential Distribution and Poisson Process Stochastic Processes - Lecture Notes Fatih Cavdur to accompany Introduction to Probability Models by Sheldon M. Ross Fall 215 Outline Introduction Exponential

More information

Markov processes and queueing networks

Markov processes and queueing networks Inria September 22, 2015 Outline Poisson processes Markov jump processes Some queueing networks The Poisson distribution (Siméon-Denis Poisson, 1781-1840) { } e λ λ n n! As prevalent as Gaussian distribution

More information

On the estimation of the heavy tail exponent in time series using the max spectrum. Stilian A. Stoev

On the estimation of the heavy tail exponent in time series using the max spectrum. Stilian A. Stoev On the estimation of the heavy tail exponent in time series using the max spectrum Stilian A. Stoev (sstoev@umich.edu) University of Michigan, Ann Arbor, U.S.A. JSM, Salt Lake City, 007 joint work with:

More information

Some functional (Hölderian) limit theorems and their applications (II)

Some functional (Hölderian) limit theorems and their applications (II) Some functional (Hölderian) limit theorems and their applications (II) Alfredas Račkauskas Vilnius University Outils Statistiques et Probabilistes pour la Finance Université de Rouen June 1 5, Rouen (Rouen

More information

Packet Size

Packet Size Long Range Dependence in vbns ATM Cell Level Trac Ronn Ritke y and Mario Gerla UCLA { Computer Science Department, 405 Hilgard Ave., Los Angeles, CA 90024 ritke@cs.ucla.edu, gerla@cs.ucla.edu Abstract

More information

Theorem 2.1 (Caratheodory). A (countably additive) probability measure on a field has an extension. n=1

Theorem 2.1 (Caratheodory). A (countably additive) probability measure on a field has an extension. n=1 Chapter 2 Probability measures 1. Existence Theorem 2.1 (Caratheodory). A (countably additive) probability measure on a field has an extension to the generated σ-field Proof of Theorem 2.1. Let F 0 be

More information

Finite-time Ruin Probability of Renewal Model with Risky Investment and Subexponential Claims

Finite-time Ruin Probability of Renewal Model with Risky Investment and Subexponential Claims Proceedings of the World Congress on Engineering 29 Vol II WCE 29, July 1-3, 29, London, U.K. Finite-time Ruin Probability of Renewal Model with Risky Investment and Subexponential Claims Tao Jiang Abstract

More information

Poisson random measure: motivation

Poisson random measure: motivation : motivation The Lévy measure provides the expected number of jumps by time unit, i.e. in a time interval of the form: [t, t + 1], and of a certain size Example: ν([1, )) is the expected number of jumps

More information

Asymptotics for posterior hazards

Asymptotics for posterior hazards Asymptotics for posterior hazards Pierpaolo De Blasi University of Turin 10th August 2007, BNR Workshop, Isaac Newton Intitute, Cambridge, UK Joint work with Giovanni Peccati (Université Paris VI) and

More information

Multivariate Risk Processes with Interacting Intensities

Multivariate Risk Processes with Interacting Intensities Multivariate Risk Processes with Interacting Intensities Nicole Bäuerle (joint work with Rudolf Grübel) Luminy, April 2010 Outline Multivariate pure birth processes Multivariate Risk Processes Fluid Limits

More information

Stable Lévy motion with values in the Skorokhod space: construction and approximation

Stable Lévy motion with values in the Skorokhod space: construction and approximation Stable Lévy motion with values in the Skorokhod space: construction and approximation arxiv:1809.02103v1 [math.pr] 6 Sep 2018 Raluca M. Balan Becem Saidani September 5, 2018 Abstract In this article, we

More information

Optional Stopping Theorem Let X be a martingale and T be a stopping time such

Optional Stopping Theorem Let X be a martingale and T be a stopping time such Plan Counting, Renewal, and Point Processes 0. Finish FDR Example 1. The Basic Renewal Process 2. The Poisson Process Revisited 3. Variants and Extensions 4. Point Processes Reading: G&S: 7.1 7.3, 7.10

More information

LIMITS FOR QUEUES AS THE WAITING ROOM GROWS. Bell Communications Research AT&T Bell Laboratories Red Bank, NJ Murray Hill, NJ 07974

LIMITS FOR QUEUES AS THE WAITING ROOM GROWS. Bell Communications Research AT&T Bell Laboratories Red Bank, NJ Murray Hill, NJ 07974 LIMITS FOR QUEUES AS THE WAITING ROOM GROWS by Daniel P. Heyman Ward Whitt Bell Communications Research AT&T Bell Laboratories Red Bank, NJ 07701 Murray Hill, NJ 07974 May 11, 1988 ABSTRACT We study the

More information

A source model for ISDN packet data traffic *

A source model for ISDN packet data traffic * 1 A source model for ISDN packet data traffic * Kavitha Chandra and Charles Thompson Center for Advanced Computation University of Massachusetts Lowell, Lowell MA 01854 * Proceedings of the 28th Annual

More information

Extremogram and Ex-Periodogram for heavy-tailed time series

Extremogram and Ex-Periodogram for heavy-tailed time series Extremogram and Ex-Periodogram for heavy-tailed time series 1 Thomas Mikosch University of Copenhagen Joint work with Richard A. Davis (Columbia) and Yuwei Zhao (Ulm) 1 Jussieu, April 9, 2014 1 2 Extremal

More information

On Kesten s counterexample to the Cramér-Wold device for regular variation

On Kesten s counterexample to the Cramér-Wold device for regular variation On Kesten s counterexample to the Cramér-Wold device for regular variation Henrik Hult School of ORIE Cornell University Ithaca NY 4853 USA hult@orie.cornell.edu Filip Lindskog Department of Mathematics

More information

r-largest jump or observation; trimming; PA models

r-largest jump or observation; trimming; PA models r-largest jump or observation; trimming; PA models Sidney Resnick School of Operations Research and Industrial Engineering Rhodes Hall, Cornell University Ithaca NY 14853 USA http://people.orie.cornell.edu/sid

More information

Beyond the color of the noise: what is memory in random phenomena?

Beyond the color of the noise: what is memory in random phenomena? Beyond the color of the noise: what is memory in random phenomena? Gennady Samorodnitsky Cornell University September 19, 2014 Randomness means lack of pattern or predictability in events according to

More information

Extremogram and ex-periodogram for heavy-tailed time series

Extremogram and ex-periodogram for heavy-tailed time series Extremogram and ex-periodogram for heavy-tailed time series 1 Thomas Mikosch University of Copenhagen Joint work with Richard A. Davis (Columbia) and Yuwei Zhao (Ulm) 1 Zagreb, June 6, 2014 1 2 Extremal

More information

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS PROBABILITY: LIMIT THEOREMS II, SPRING 15. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please

More information

Random sets. Distributions, capacities and their applications. Ilya Molchanov. University of Bern, Switzerland

Random sets. Distributions, capacities and their applications. Ilya Molchanov. University of Bern, Switzerland Random sets Distributions, capacities and their applications Ilya Molchanov University of Bern, Switzerland Molchanov Random sets - Lecture 1. Winter School Sandbjerg, Jan 2007 1 E = R d ) Definitions

More information

The strictly 1/2-stable example

The strictly 1/2-stable example The strictly 1/2-stable example 1 Direct approach: building a Lévy pure jump process on R Bert Fristedt provided key mathematical facts for this example. A pure jump Lévy process X is a Lévy process such

More information

Shape of the return probability density function and extreme value statistics

Shape of the return probability density function and extreme value statistics Shape of the return probability density function and extreme value statistics 13/09/03 Int. Workshop on Risk and Regulation, Budapest Overview I aim to elucidate a relation between one field of research

More information

Limit theorems for dependent regularly varying functions of Markov chains

Limit theorems for dependent regularly varying functions of Markov chains Limit theorems for functions of with extremal linear behavior Limit theorems for dependent regularly varying functions of In collaboration with T. Mikosch Olivier Wintenberger wintenberger@ceremade.dauphine.fr

More information

Notes 1 : Measure-theoretic foundations I

Notes 1 : Measure-theoretic foundations I Notes 1 : Measure-theoretic foundations I Math 733-734: Theory of Probability Lecturer: Sebastien Roch References: [Wil91, Section 1.0-1.8, 2.1-2.3, 3.1-3.11], [Fel68, Sections 7.2, 8.1, 9.6], [Dur10,

More information

Department of Mathematics

Department of Mathematics Department of Mathematics Ma 3/13 KC Border Introduction to Probability and Statistics Winter 217 Lecture 13: The Poisson Process Relevant textbook passages: Sections 2.4,3.8, 4.2 Larsen Marx [8]: Sections

More information

September 21, 2004 ACTIVITY RATES WITH VERY HEAVY TAILS

September 21, 2004 ACTIVITY RATES WITH VERY HEAVY TAILS September 21, 24 ACTIVITY RATES WITH VERY HEAVY TAILS THOMAS MIKOSCH AND SIDNEY RESNICK Abstract. Consider a data networ model in which sources begin to transmit at renewal time points {S n}. Transmissions

More information

The main results about probability measures are the following two facts:

The main results about probability measures are the following two facts: Chapter 2 Probability measures The main results about probability measures are the following two facts: Theorem 2.1 (extension). If P is a (continuous) probability measure on a field F 0 then it has a

More information

Poisson Processes. Stochastic Processes. Feb UC3M

Poisson Processes. Stochastic Processes. Feb UC3M Poisson Processes Stochastic Processes UC3M Feb. 2012 Exponential random variables A random variable T has exponential distribution with rate λ > 0 if its probability density function can been written

More information

X n D X lim n F n (x) = F (x) for all x C F. lim n F n(u) = F (u) for all u C F. (2)

X n D X lim n F n (x) = F (x) for all x C F. lim n F n(u) = F (u) for all u C F. (2) 14:17 11/16/2 TOPIC. Convergence in distribution and related notions. This section studies the notion of the so-called convergence in distribution of real random variables. This is the kind of convergence

More information

WEIBULL RENEWAL PROCESSES

WEIBULL RENEWAL PROCESSES Ann. Inst. Statist. Math. Vol. 46, No. 4, 64-648 (994) WEIBULL RENEWAL PROCESSES NIKOS YANNAROS Department of Statistics, University of Stockholm, S-06 9 Stockholm, Sweden (Received April 9, 993; revised

More information

Chapter 2. Poisson Processes. Prof. Shun-Ren Yang Department of Computer Science, National Tsing Hua University, Taiwan

Chapter 2. Poisson Processes. Prof. Shun-Ren Yang Department of Computer Science, National Tsing Hua University, Taiwan Chapter 2. Poisson Processes Prof. Shun-Ren Yang Department of Computer Science, National Tsing Hua University, Taiwan Outline Introduction to Poisson Processes Definition of arrival process Definition

More information

E[X n ]= dn dt n M X(t). ). What is the mgf? Solution. Found this the other day in the Kernel matching exercise: 1 M X (t) =

E[X n ]= dn dt n M X(t). ). What is the mgf? Solution. Found this the other day in the Kernel matching exercise: 1 M X (t) = Chapter 7 Generating functions Definition 7.. Let X be a random variable. The moment generating function is given by M X (t) =E[e tx ], provided that the expectation exists for t in some neighborhood of

More information

SELF-SIMILAR COMMUNICATION MODELS AND VERY HEAVY TAILS. By Sidney Resnick 1 and Holger Rootzén 2 Cornell University and Chalmers University

SELF-SIMILAR COMMUNICATION MODELS AND VERY HEAVY TAILS. By Sidney Resnick 1 and Holger Rootzén 2 Cornell University and Chalmers University he Annals of Applied Probability 2000, Vol. 10, No. 3, 753 778 SELF-SIMILAR COMMUNICAION MODELS AND VERY HEAVY AILS By Sidney Resnick 1 and Holger Rootzén 2 Cornell University and Chalmers University Several

More information

I. ANALYSIS; PROBABILITY

I. ANALYSIS; PROBABILITY ma414l1.tex Lecture 1. 12.1.2012 I. NLYSIS; PROBBILITY 1. Lebesgue Measure and Integral We recall Lebesgue measure (M411 Probability and Measure) λ: defined on intervals (a, b] by λ((a, b]) := b a (so

More information

LARGE DEVIATION PROBABILITIES FOR SUMS OF HEAVY-TAILED DEPENDENT RANDOM VECTORS*

LARGE DEVIATION PROBABILITIES FOR SUMS OF HEAVY-TAILED DEPENDENT RANDOM VECTORS* LARGE EVIATION PROBABILITIES FOR SUMS OF HEAVY-TAILE EPENENT RANOM VECTORS* Adam Jakubowski Alexander V. Nagaev Alexander Zaigraev Nicholas Copernicus University Faculty of Mathematics and Computer Science

More information

REGULARLY VARYING MEASURES ON METRIC SPACES: HIDDEN REGULAR VARIATION AND HIDDEN JUMPS

REGULARLY VARYING MEASURES ON METRIC SPACES: HIDDEN REGULAR VARIATION AND HIDDEN JUMPS REGULARLY VARYING MEASURES ON METRIC SPACES: HIDDEN REGULAR VARIATION AND HIDDEN JUMPS FILIP LINDSKOG, SIDNEY I. RESNICK, AND JOYJIT ROY Abstract. We develop a framework for regularly varying measures

More information

On Kusuoka Representation of Law Invariant Risk Measures

On Kusuoka Representation of Law Invariant Risk Measures MATHEMATICS OF OPERATIONS RESEARCH Vol. 38, No. 1, February 213, pp. 142 152 ISSN 364-765X (print) ISSN 1526-5471 (online) http://dx.doi.org/1.1287/moor.112.563 213 INFORMS On Kusuoka Representation of

More information

Additive functionals of infinite-variance moving averages. Wei Biao Wu The University of Chicago TECHNICAL REPORT NO. 535

Additive functionals of infinite-variance moving averages. Wei Biao Wu The University of Chicago TECHNICAL REPORT NO. 535 Additive functionals of infinite-variance moving averages Wei Biao Wu The University of Chicago TECHNICAL REPORT NO. 535 Departments of Statistics The University of Chicago Chicago, Illinois 60637 June

More information

The Self-similar and Multifractal Nature of a Network Traffic Model

The Self-similar and Multifractal Nature of a Network Traffic Model The Self-similar and Multifractal Nature of a Network Traffic Model Krishanu Maulik Sidney Resnick December 4, 00 Abstract: We look at a family of models for Internet traffic with increasing input rates

More information

TOWARDS BETTER MULTI-CLASS PARAMETRIC-DECOMPOSITION APPROXIMATIONS FOR OPEN QUEUEING NETWORKS

TOWARDS BETTER MULTI-CLASS PARAMETRIC-DECOMPOSITION APPROXIMATIONS FOR OPEN QUEUEING NETWORKS TOWARDS BETTER MULTI-CLASS PARAMETRIC-DECOMPOSITION APPROXIMATIONS FOR OPEN QUEUEING NETWORKS by Ward Whitt AT&T Bell Laboratories Murray Hill, NJ 07974-0636 March 31, 199 Revision: November 9, 199 ABSTRACT

More information

Stochastic Hybrid Systems: Applications to Communication Networks

Stochastic Hybrid Systems: Applications to Communication Networks research supported by NSF Stochastic Hybrid Systems: Applications to Communication Networks João P. Hespanha Center for Control Engineering and Computation University of California at Santa Barbara Talk

More information

Random Process Lecture 1. Fundamentals of Probability

Random Process Lecture 1. Fundamentals of Probability Random Process Lecture 1. Fundamentals of Probability Husheng Li Min Kao Department of Electrical Engineering and Computer Science University of Tennessee, Knoxville Spring, 2016 1/43 Outline 2/43 1 Syllabus

More information

Weak convergence and Brownian Motion. (telegram style notes) P.J.C. Spreij

Weak convergence and Brownian Motion. (telegram style notes) P.J.C. Spreij Weak convergence and Brownian Motion (telegram style notes) P.J.C. Spreij this version: December 8, 2006 1 The space C[0, ) In this section we summarize some facts concerning the space C[0, ) of real

More information

In Proceedings of the Tenth International Conference on on Parallel and Distributed Computing Systems (PDCS-97), pages , October 1997

In Proceedings of the Tenth International Conference on on Parallel and Distributed Computing Systems (PDCS-97), pages , October 1997 In Proceedings of the Tenth International Conference on on Parallel and Distributed Computing Systems (PDCS-97), pages 322-327, October 1997 Consequences of Ignoring Self-Similar Data Trac in Telecommunications

More information

µ X (A) = P ( X 1 (A) )

µ X (A) = P ( X 1 (A) ) 1 STOCHASTIC PROCESSES This appendix provides a very basic introduction to the language of probability theory and stochastic processes. We assume the reader is familiar with the general measure and integration

More information

9 Brownian Motion: Construction

9 Brownian Motion: Construction 9 Brownian Motion: Construction 9.1 Definition and Heuristics The central limit theorem states that the standard Gaussian distribution arises as the weak limit of the rescaled partial sums S n / p n of

More information

Lecture 5. If we interpret the index n 0 as time, then a Markov chain simply requires that the future depends only on the present and not on the past.

Lecture 5. If we interpret the index n 0 as time, then a Markov chain simply requires that the future depends only on the present and not on the past. 1 Markov chain: definition Lecture 5 Definition 1.1 Markov chain] A sequence of random variables (X n ) n 0 taking values in a measurable state space (S, S) is called a (discrete time) Markov chain, if

More information

COMPOSITION SEMIGROUPS AND RANDOM STABILITY. By John Bunge Cornell University

COMPOSITION SEMIGROUPS AND RANDOM STABILITY. By John Bunge Cornell University The Annals of Probability 1996, Vol. 24, No. 3, 1476 1489 COMPOSITION SEMIGROUPS AND RANDOM STABILITY By John Bunge Cornell University A random variable X is N-divisible if it can be decomposed into a

More information

Functional Limit theorems for the quadratic variation of a continuous time random walk and for certain stochastic integrals

Functional Limit theorems for the quadratic variation of a continuous time random walk and for certain stochastic integrals Functional Limit theorems for the quadratic variation of a continuous time random walk and for certain stochastic integrals Noèlia Viles Cuadros BCAM- Basque Center of Applied Mathematics with Prof. Enrico

More information

Lecture 9. d N(0, 1). Now we fix n and think of a SRW on [0,1]. We take the k th step at time k n. and our increments are ± 1

Lecture 9. d N(0, 1). Now we fix n and think of a SRW on [0,1]. We take the k th step at time k n. and our increments are ± 1 Random Walks and Brownian Motion Tel Aviv University Spring 011 Lecture date: May 0, 011 Lecture 9 Instructor: Ron Peled Scribe: Jonathan Hermon In today s lecture we present the Brownian motion (BM).

More information

Extreme Value Analysis and Spatial Extremes

Extreme Value Analysis and Spatial Extremes Extreme Value Analysis and Department of Statistics Purdue University 11/07/2013 Outline Motivation 1 Motivation 2 Extreme Value Theorem and 3 Bayesian Hierarchical Models Copula Models Max-stable Models

More information

Asymptotic Analysis of Exceedance Probability with Stationary Stable Steps Generated by Dissipative Flows

Asymptotic Analysis of Exceedance Probability with Stationary Stable Steps Generated by Dissipative Flows Asymptotic Analysis of Exceedance Probability with Stationary Stable Steps Generated by Dissipative Flows Uğur Tuncay Alparslan a, and Gennady Samorodnitsky b a Department of Mathematics and Statistics,

More information

The largest eigenvalues of the sample covariance matrix. in the heavy-tail case

The largest eigenvalues of the sample covariance matrix. in the heavy-tail case The largest eigenvalues of the sample covariance matrix 1 in the heavy-tail case Thomas Mikosch University of Copenhagen Joint work with Richard A. Davis (Columbia NY), Johannes Heiny (Aarhus University)

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 7 9/25/2013

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 7 9/25/2013 MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.65/15.070J Fall 013 Lecture 7 9/5/013 The Reflection Principle. The Distribution of the Maximum. Brownian motion with drift Content. 1. Quick intro to stopping times.

More information

CHARACTERIZATIONS AND EXAMPLES OF HIDDEN REGULAR VARIATION

CHARACTERIZATIONS AND EXAMPLES OF HIDDEN REGULAR VARIATION CHARACTERIZATIONS AND EXAMPLES OF HIDDEN RULAR VARIATION KRISHANU MAULIK AND SIDNEY RESNICK Abstract. Random vectors on the positive orthant whose distributions possess hidden regular variation are a subclass

More information

Math 576: Quantitative Risk Management

Math 576: Quantitative Risk Management Math 576: Quantitative Risk Management Haijun Li lih@math.wsu.edu Department of Mathematics Washington State University Week 11 Haijun Li Math 576: Quantitative Risk Management Week 11 1 / 21 Outline 1

More information

Asymptotic Delay Distribution and Burst Size Impact on a Network Node Driven by Self-similar Traffic

Asymptotic Delay Distribution and Burst Size Impact on a Network Node Driven by Self-similar Traffic Èíôîðìàöèîííûå ïðîöåññû, Òîì 5, 1, 2005, ñòð. 4046. c 2004 D'Apice, Manzo. INFORMATION THEORY AND INFORMATION PROCESSING Asymptotic Delay Distribution and Burst Size Impact on a Network Node Driven by

More information

IEOR 8100: Topics in OR: Asymptotic Methods in Queueing Theory. Fall 2009, Professor Whitt. Class Lecture Notes: Wednesday, September 9.

IEOR 8100: Topics in OR: Asymptotic Methods in Queueing Theory. Fall 2009, Professor Whitt. Class Lecture Notes: Wednesday, September 9. IEOR 8100: Topics in OR: Asymptotic Methods in Queueing Theory Fall 2009, Professor Whitt Class Lecture Notes: Wednesday, September 9. Heavy-Traffic Limits for the GI/G/1 Queue 1. The GI/G/1 Queue We will

More information

Positive Harris Recurrence and Diffusion Scale Analysis of a Push Pull Queueing Network. Haifa Statistics Seminar May 5, 2008

Positive Harris Recurrence and Diffusion Scale Analysis of a Push Pull Queueing Network. Haifa Statistics Seminar May 5, 2008 Positive Harris Recurrence and Diffusion Scale Analysis of a Push Pull Queueing Network Yoni Nazarathy Gideon Weiss Haifa Statistics Seminar May 5, 2008 1 Outline 1 Preview of Results 2 Introduction Queueing

More information

6. Brownian Motion. Q(A) = P [ ω : x(, ω) A )

6. Brownian Motion. Q(A) = P [ ω : x(, ω) A ) 6. Brownian Motion. stochastic process can be thought of in one of many equivalent ways. We can begin with an underlying probability space (Ω, Σ, P) and a real valued stochastic process can be defined

More information

Continuous-time Markov Chains

Continuous-time Markov Chains Continuous-time Markov Chains Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ October 23, 2017

More information

A NEW PROOF OF THE WIENER HOPF FACTORIZATION VIA BASU S THEOREM

A NEW PROOF OF THE WIENER HOPF FACTORIZATION VIA BASU S THEOREM J. Appl. Prob. 49, 876 882 (2012 Printed in England Applied Probability Trust 2012 A NEW PROOF OF THE WIENER HOPF FACTORIZATION VIA BASU S THEOREM BRIAN FRALIX and COLIN GALLAGHER, Clemson University Abstract

More information

Heavy Tails: The Origins and Implications for Large Scale Biological & Information Systems

Heavy Tails: The Origins and Implications for Large Scale Biological & Information Systems Heavy Tails: The Origins and Implications for Large Scale Biological & Information Systems Predrag R. Jelenković Dept. of Electrical Engineering Columbia University, NY 10027, USA {predrag}@ee.columbia.edu

More information

Operational Risk and Pareto Lévy Copulas

Operational Risk and Pareto Lévy Copulas Operational Risk and Pareto Lévy Copulas Claudia Klüppelberg Technische Universität München email: cklu@ma.tum.de http://www-m4.ma.tum.de References: - Böcker, K. and Klüppelberg, C. (25) Operational VaR

More information

Statistical analysis of peer-to-peer live streaming traffic

Statistical analysis of peer-to-peer live streaming traffic Statistical analysis of peer-to-peer live streaming traffic Levente Bodrog 1 Ákos Horváth 1 Miklós Telek 1 1 Technical University of Budapest Probability and Statistics with Applications, 2009 Outline

More information

STAT 7032 Probability Spring Wlodek Bryc

STAT 7032 Probability Spring Wlodek Bryc STAT 7032 Probability Spring 2018 Wlodek Bryc Created: Friday, Jan 2, 2014 Revised for Spring 2018 Printed: January 9, 2018 File: Grad-Prob-2018.TEX Department of Mathematical Sciences, University of Cincinnati,

More information

Lecture Notes Introduction to Ergodic Theory

Lecture Notes Introduction to Ergodic Theory Lecture Notes Introduction to Ergodic Theory Tiago Pereira Department of Mathematics Imperial College London Our course consists of five introductory lectures on probabilistic aspects of dynamical systems,

More information