Exercises in Extreme Value Theory
2016 spring semester

1. Show that $L(t) = \log t$ is a slowly varying function but $t^{\varepsilon}$ is not if $\varepsilon \neq 0$.

2. If the random variable $X$ has distribution $F$ with finite variance, then show that the condition
   \[ \lim_{x\to\infty} \frac{x^2\, P(|X| > x)}{\int_{|y|\le x} y^2 \, dF(y)} = 0 \]
   automatically holds.
   Hint: To obtain a contradiction, suppose that $x^2 P(|X| > x)$ does not converge to 0, i.e. along an increasing sequence $x_n$ it is at least $\varepsilon$. If this happens, then using
   \[ E(X^2) \ge \sum_{n=2}^{\infty} (x_n^2 - x_{n-1}^2)\, P(|X| > x_n), \]
   one can give an infinite lower bound on the second moment of $X$.

3. Let $W_1$ and $W_2$ be independent standard normal random variables.
   (a) Check that $1/W_2^2$ has the Lévy distribution, that is, the stable law with index $\alpha = 1/2$ and $\kappa = 1$, i.e. it has density
   \[ \frac{1}{\sqrt{2\pi y^3}} \exp\left(-\frac{1}{2y}\right) \quad \text{for } y \ge 0. \]
   (b) Prove that $W_1/W_2$ has Cauchy distribution.

4. Let $X$ have a symmetric stable law with index $\alpha$. Show that $E|X|^p < \infty$ for $p < \alpha$.
   Hint: If $\varphi(t) = E(e^{itX})$ denotes the characteristic function of $X$, then the following estimate can be used without proof for $u > 0$:
   \[ P\left(|X| > \frac{2}{u}\right) \le \frac{1}{u} \int_{-u}^{u} (1 - \varphi(t)) \, dt, \]
   together with the identity
   \[ E(|X|^p) = p \int_0^{\infty} t^{p-1}\, P(|X| > t) \, dt. \]

5. For functions $f : \mathbb{R} \to \mathbb{R}$, consider Cauchy's functional equation $f(x+y) = f(x) + f(y)$. Prove that the only solutions of the equation are the linear functions $f(x) = \alpha x$ for some $\alpha \in \mathbb{R}$, if we assume that $f$ is
   (a) continuous;
   (b) continuous at one point;
   (c) monotonic on some interval;
   (d) bounded on some interval;
   (e) measurable.
   Hint: Lusin's theorem ensures that by the measurability of $f$ there is a subset $F \subseteq [0,1]$ with Lebesgue measure $2/3$ such that $f$ is uniformly continuous on $F$. Then for any $h \in (0,1/3)$, the sets $F$ and $F - h = \{x - h : x \in F\}$ cannot be disjoint. Use this to conclude that the function $f$ is bounded on an interval $(0,\delta)$ for some small $\delta > 0$.

6. Let $X_1, X_2, \ldots$ be random variables defined on the probability space $(\Omega, \mathcal{A}, P)$ which have the same distribution function $F(x)$, but we do not assume anything about their dependence structure. Let $M_n := \max_{1\le i\le n} X_i$.
   (a) Suppose that there is an $\alpha > 0$ such that $\int |x|^{\alpha} \, dF(x) < \infty$, that is, $E(|X_i|^{\alpha}) < \infty$. Prove that for any $\varepsilon > 0$ and for fixed $\delta > 0$,
   \[ \lim_{n\to\infty} P\left(n^{-(1/\alpha + \varepsilon)} M_n > \delta\right) = 0. \]
   (b) Suppose that there is an $s > 0$ such that $\int \exp(s|x|)\, dF(x) < \infty$, that is, $E(\exp(s|X_i|)) < \infty$. Prove that for any sequence $b_n$ that goes to $\infty$ and for fixed $\delta > 0$,
   \[ \lim_{n\to\infty} P\left((b_n \log n)^{-1} M_n > \delta\right) = 0. \]
   Hint: Use the inequality
   \[ P\left(\max_{1\le i\le n} |X_i| > \lambda\right) = P\left(\bigcup_{i=1}^{n} \{|X_i| > \lambda\}\right) \le \sum_{i=1}^{n} P(|X_i| > \lambda) = n\, P(|X_1| > \lambda) \]
   and a Markov type inequality.

7. Let $X_1, X_2, \ldots$ be independent and identically distributed random variables with common distribution function $F(x) = P(X_i < x)$. Let $M_n := \max_{1\le i\le n} X_i$. If $F(x) < 1$ for all $x < \infty$ and $\lim_{x\to\infty} x^{\alpha}(1 - F(x)) = b$ for some fixed constants $\alpha, b \in (0,\infty)$ (that is, $1 - F(x) \sim b x^{-\alpha}$ as $x \to \infty$), then show that the distribution of $(bn)^{-1/\alpha} M_n$ converges weakly to the Fréchet distribution:
   \[ P\left((bn)^{-1/\alpha} M_n < x\right) \to \mathbb{1}(x > 0)\exp\left(-x^{-\alpha}\right). \]

8. Let $X_1, X_2, \ldots$ be independent and identically distributed random variables with common distribution function $F(x) = P(X_i < x)$. Let $M_n := \max_{1\le i\le n} X_i$. If $F(x) < 1$ for all $x < \infty$ and $\lim_{x\to\infty} e^{\lambda x}(1 - F(x)) = b$ for some fixed constants $\lambda, b \in (0,\infty)$ (that is, $1 - F(x) \sim b e^{-\lambda x}$ as $x \to \infty$), then show that the distribution of $\lambda M_n - \log(bn)$ converges weakly to the Gumbel distribution:
   \[ P(\lambda M_n - \log(bn) < x) \to \exp\left(-e^{-x}\right). \]
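For the concrete tails in Exercises 7 and 8, the distribution function of the normalized maximum is available in closed form as $F(\cdot)^n$, so the stated limits can be checked numerically. A minimal sketch (the Pareto and exponential examples, the value $n = 10^6$, and the evaluation points are my own choices, not part of the exercises):

```python
# Numeric check of the Frechet and Gumbel limits in Exercises 7-8
# for two explicit distributions with b = 1.
import math

n = 10**6

# Exercise 7 with the Pareto tail 1 - F(x) = x^(-alpha) for x >= 1:
# P(n^(-1/alpha) M_n < x) = F(x n^(1/alpha))^n = (1 - x^(-alpha)/n)^n.
alpha = 2.0
frechet_err = max(
    abs((1 - x**(-alpha) / n)**n - math.exp(-x**(-alpha)))
    for x in (0.5, 1.0, 2.0, 5.0)
)

# Exercise 8 with the Exp(1) tail 1 - F(x) = e^(-x) for x >= 0 (lambda = 1):
# P(M_n - log n < x) = (1 - e^(-x)/n)^n.
gumbel_err = max(
    abs((1 - math.exp(-x) / n)**n - math.exp(-math.exp(-x)))
    for x in (-2.0, 0.0, 1.0, 3.0)
)

print(frechet_err, gumbel_err)  # both errors are far below 1e-4 for this n
assert frechet_err < 1e-4 and gumbel_err < 1e-4
```

The elementary estimate $(1 - a/n)^n = e^{-a}(1 + O(a^2/n))$ explains why the error is of order $1/n$ at each fixed point.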
9. Let $X_1, X_2, \ldots$ be independent and identically distributed random variables with common distribution function $F(x) = P(X_i < x)$. Let $M_n := \max_{1\le i\le n} X_i$. If $F(x_0) = 1$ and $F(x) < 1$ for all $x < x_0$, and $\lim_{x\to x_0} (x_0 - x)^{-\alpha}(1 - F(x)) = b$ for some fixed constants $\alpha, b \in (0,\infty)$ (that is, $1 - F(x) \sim b(x_0 - x)^{\alpha}$ as $x \to x_0$), then show that the distribution of $(bn)^{1/\alpha}(M_n - x_0)$ converges weakly to the Weibull distribution:
   \[ P\left((bn)^{1/\alpha}(M_n - x_0) < x\right) \to \mathbb{1}(x < 0)\exp\left(-(-x)^{\alpha}\right) + \mathbb{1}(x \ge 0). \]

10. Let $X_1, X_2, \ldots$ be independent and identically distributed random variables with common distribution function $F(x) = P(X_i < x)$. Suppose that the common distribution has a finite right endpoint, that is, $x_F = \sup\{x \in \mathbb{R} : F(x) < 1\} < \infty$, and that the right endpoint has a positive mass: $P(X_i = x_F) = 1 - F(x_F) > 0$. Let $M_n := \max_{1\le i\le n} X_i$. Prove that for any sequence of reals $(u_n)$, if the sequence of probabilities $P(M_n < u_n)$ converges, then the limit is either 0 or 1.
   Hint: Use the proposition about Poisson approximation.

11. Let $X_1, X_2, \ldots$ be independent and identically distributed random variables with common distribution function $F(x) = P(X_i < x)$. Let $M_n := \max_{1\le i\le n} X_i$. Assume that $F(x) < 1$ for all $x \in \mathbb{R}$. Prove that the following two assertions are equivalent.
   (a) There is a sequence of constants $a_n \in \mathbb{R}$ such that for any $\varepsilon > 0$,
   \[ \lim_{n\to\infty} P(|M_n - a_n| > \varepsilon) = 0. \]
   (b) For any $\varepsilon > 0$,
   \[ \lim_{x\to\infty} \frac{1 - F(x+\varepsilon)}{1 - F(x)} = 0. \]
   Hint: The condition in (a) is equivalent to having
   \[ n(1 - F(a_n + \varepsilon)) \to 0 \quad\text{and}\quad n(1 - F(a_n - \varepsilon)) \to \infty \]
   for any $\varepsilon > 0$. To prove that (a) implies (b), note that by the condition equivalent to (a), for the nondecreasing sequence $b_n = \inf\{x \in \mathbb{R} : F(x) > 1 - 1/n\}$, we have $a_n - b_n \to 0$. For any $x$ large enough, there is an integer $n(x)$ such that $b_{n(x)} \le x \le b_{n(x)+1}$. Use this and $1 - F(b_n) \le 1/n$ to upper bound the ratio in condition (b). To prove that (b) implies (a), the sequence $a_n$ can be chosen as $a_n = \inf\{x \in \mathbb{R} : F(x) > 1 - 1/n\}$.

12. Let $X_1, X_2, \ldots$ be an iid sequence of Poisson random variables with parameter $\lambda > 0$, that is,
   \[ P(X = k) = e^{-\lambda}\frac{\lambda^k}{k!} \quad\text{for } k = 0,1,2,\ldots \]
   and let $M_n := \max_{1\le i\le n} X_i$. Show that there is no normalization under which the sequence of maxima $M_n$ has a non-degenerate limit law.
   Hint: Show that the necessary condition
   \[ \lim_{x\to x_F} \frac{\overline F(x+)}{\overline F(x)} = 1 \]
   fails to hold along a sequence of integers, where $\overline F(x) = 1 - F(x)$ is the tail probability function. More precisely, $\overline F(k+1)/\overline F(k)$ converges to 0 along the integers $k$.
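The hint of Exercise 12 can be illustrated numerically: the Poisson tail $\overline F(k) = P(X \ge k)$ is dominated by its first term $e^{-\lambda}\lambda^k/k!$, so $\overline F(k+1)/\overline F(k) \approx \lambda/(k+1) \to 0$. A sketch with the arbitrary choice $\lambda = 4$ (illustration only, not part of the exercise):

```python
# Ratio bar F(k+1)/bar F(k) for a Poisson(4) tail, which tends to 0,
# so the necessary condition for a non-degenerate limit of maxima fails.
import math

lam = 4.0

def poisson_tail(k, terms=60):
    """bar F(k) = P(X >= k), summed term by term to avoid huge factorials."""
    term = math.exp(-lam) * lam**k / math.factorial(k)
    total = 0.0
    for j in range(terms):
        total += term
        term *= lam / (k + j + 1)  # next Poisson probability
    return total

ratios = [poisson_tail(k + 1) / poisson_tail(k) for k in (10, 20, 40, 80)]
print(ratios)  # roughly lam/(k+1), decreasing toward 0
assert all(r2 < r1 for r1, r2 in zip(ratios, ratios[1:]))
assert ratios[-1] < 0.1
```

For the negative binomial tail of Exercise 13 the analogous ratio converges to $1 - p > 0$ instead, so the same condition fails there as well.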
13. Let $X_1, X_2, \ldots$ be an iid sequence of negative binomial random variables with parameters $p \in (0,1)$ and $m \in \mathbb{N}$, that is,
   \[ P(X = k) = \binom{k+m-1}{k} p^m (1-p)^k \quad\text{for } k = 0,1,2,\ldots \]
   and let $M_n := \max_{1\le i\le n} X_i$. Show that there is no normalization under which the sequence of maxima $M_n$ has a non-degenerate limit law.
   Hint: Show that the necessary condition
   \[ \lim_{x\to x_F} \frac{\overline F(x+)}{\overline F(x)} = 1 \]
   fails to hold along a sequence of integers, where $\overline F(x) = 1 - F(x)$ is the tail probability function. More precisely, $\overline F(k+1)/\overline F(k)$ converges to $1-p$ along the integers $k$. To this end, show and use the asymptotic identity
   \[ P(X = k) = \frac{k^{m-1}}{(m-1)!}\, p^m (1-p)^k (1 + o(1)) \]
   as $k \to \infty$.

14. Let $X$ be a random variable with distribution function $F$ where the tail is $1 - F(x) = x^{-\alpha}$ for $x \ge 1$ with some $\alpha > 0$. Then we know that $F \in \mathrm{MDA}(\Phi_\alpha)$. Which MDA does the distribution of $X^p$ and that of $\ln(X)$ belong to if $p > 0$? What are the normalization constants?

15. Suppose that $X > 0$ is a random variable and $\alpha > 0$ is a number. Show that the following are equivalent:
   (a) $X$ has the Fréchet distribution function
   \[ \Phi_\alpha(x) = \begin{cases} 0 & \text{if } x \le 0, \\ \exp(-x^{-\alpha}) & \text{if } x > 0; \end{cases} \]
   (b) $\ln X^{\alpha}$ has the Gumbel distribution function $\Lambda(x) = \exp(-e^{-x})$;
   (c) $-X^{-1}$ has the Weibull distribution function
   \[ \Psi_\alpha(x) = \begin{cases} \exp(-(-x)^{\alpha}) & \text{if } x \le 0, \\ 1 & \text{if } x > 0. \end{cases} \]

16. Let $X_1, X_2, \ldots$ be an independent but not identically distributed sequence of random variables; let $X_k$ be exponential with parameter $\lambda_k$. Denote by $m_n = \min(X_1, \ldots, X_n)$ the minimum record up to $n$ and let $m = \lim_{n\to\infty} m_n$, which exists since $m_n$ is nonincreasing.
   (a) Show that if $\sum_{k=1}^{\infty} \lambda_k < \infty$, then with probability one, the minimum record is broken finitely many times and $m > 0$.
   (b) Show that if $\sum_{k=1}^{\infty} \lambda_k = \infty$, then with probability one, the minimum record is broken infinitely many times and $m = 0$.
   Hint: One can use the representation of $X_k$ as the first point of a Poisson point process on $\mathbb{R}_+$ with intensity $\lambda_k$ and the fact that the union of independent Poisson point processes with intensities $\lambda_k$ for $k = 1,2,\ldots$ is a Poisson point process with intensity $\sum_{k=1}^{\infty} \lambda_k$.

17. Let $X_1, X_2, \ldots$ be an independent but not identically distributed sequence of random variables; let $X_k$ be exponential with parameter $\lambda_k$. Denote by $M_n = \max(X_1, \ldots, X_n)$ the maximum record up to $n$ and let $M = \lim_{n\to\infty} M_n$, which exists since $M_n$ is nondecreasing.
   (a) Show that for any constant $K > 0$, if $\sum_n e^{-\lambda_n K} = \infty$, then with probability one there are infinitely many indices $n$ such that $X_n > K$. Further, for any $K > 0$, if $\sum_n e^{-\lambda_n K} < \infty$, then with probability one there are finitely many indices $n$ such that $X_n > K$.
   Hint: Use the Borel–Cantelli lemmas.
   (b) Conclude that if $\sum_n e^{-\lambda_n K} = \infty$ for every $K > 0$, then with probability one, $M = \infty$ and the maximum record is broken infinitely many times. If $\sum_n e^{-\lambda_n K} < \infty$ for every $K > 0$, then with probability one, $M < \infty$ and the maximum record is broken finitely many times.
   (c) If there are $K_1 > K_2 > 0$ such that $\sum_n e^{-\lambda_n K_1} < \infty$ but $\sum_n e^{-\lambda_n K_2} = \infty$, then there is a critical $K_c$ such that for smaller values of $K$ the sum is infinite and for larger values of $K$ the sum is finite. Show that in this case, with probability one, $\limsup_{n\to\infty} X_n = K_c$.
   (d) Suppose that there is a critical $K_c$ as described above. Show that if $\sum_n e^{-\lambda_n K_c} = \infty$, then with probability one the maximum record is broken finitely many times.
   Hint: The condition shows that there are infinitely many indices $n$ such that $X_n > K_c$, but for any $\varepsilon > 0$, there are only finitely many with $X_n > K_c + \varepsilon$.
   (e) Suppose that there is a critical $K_c$ as described above. Show that if $\sum_n e^{-\lambda_n K_c} < \infty$, then with positive probability the maximum record is broken infinitely many times. Compute this probability.
   Hint: Observe that the maximum record is broken infinitely many times if and only if $X_n < K_c$ for all indices $n$, since $\limsup_{n\to\infty} X_n = K_c$.
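The two regimes of Exercise 16 show up clearly in simulation: with a summable intensity sequence the minimum record is broken only a few times and the limit $m$ stays positive, while with a non-summable one the records keep coming and the running minimum tends to 0. A sketch; the choices $\lambda_k = 2^{-k}$ (summable) and $\lambda_k = 1$ (non-summable), the seed, and the trial counts are my own, not part of the exercise:

```python
# Minimum records of independent exponentials X_k ~ Exp(lambda_k)
# in the two regimes of Exercise 16.
import random

random.seed(0)

def min_records(lambdas):
    """Return (number of times the minimum record is broken, final minimum)."""
    current, records = float("inf"), 0
    for lam in lambdas:
        x = random.expovariate(lam)
        if x < current:
            current, records = x, records + 1
    return records, current

trials = 500
# Convergent case: lambda_k = 2^(-k), so sum(lambda_k) = 1 < infinity.
conv = [min_records([2.0**(-k) for k in range(1, 41)]) for _ in range(trials)]
# Divergent case: lambda_k = 1, so sum(lambda_k) = infinity.
div = [min_records([1.0] * 1000) for _ in range(trials)]

# P(X_k is the current minimum) = lambda_k / (lambda_1 + ... + lambda_k),
# so the expected record count is sum_k lambda_k / (lambda_1 + ... + lambda_k):
# about 1.6 in the convergent case, about H_1000 ~ 7.5 in the divergent case.
mean_rec_conv = sum(r for r, _ in conv) / trials
mean_rec_div = sum(r for r, _ in div) / trials
mean_min_div = sum(m for _, m in div) / trials  # m_1000 ~ Exp(1000), mean 1/1000

print(mean_rec_conv, mean_rec_div, mean_min_div)
assert mean_rec_conv < 3 < mean_rec_div
assert mean_min_div < 0.01
```

The record-count formula in the comment uses the standard fact that among independent exponentials the index of the minimum is proportional to the intensities; the same bookkeeping, applied to maxima, is what drives the Borel–Cantelli arguments in Exercise 17.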