Exercises in Extreme value theory


2016 spring semester

1. Show that $L(t) = \log t$ is a slowly varying function, but $t^{\epsilon}$ is not if $\epsilon \neq 0$.

2. If the random variable $X$ has distribution $F$ with finite variance, then show that the condition
$$\lim_{x\to\infty} \frac{x^2 P(|X| > x)}{\int_{|y| \le x} y^2 \, dF(y)} = 0$$
automatically holds.
Hint: To obtain a contradiction, suppose that $x^2 P(|X| > x)$ does not converge to $0$, i.e. along an increasing sequence $x_n \to \infty$ it is at least $\varepsilon$. If this happens, then using
$$E(|X|^2) \ge \sum_{n=2}^{\infty} \left(x_n^2 - x_{n-1}^2\right) P(|X| > x_n),$$
one can give an infinite lower bound on the second moment of $X$.

3. Let $W_1$ and $W_2$ be independent standard normal random variables.
(a) Check that $1/W_2^2$ has the Lévy distribution, that is, the stable law with index $\alpha = 1/2$ and $\kappa = 1$, i.e. it has density
$$\frac{1}{\sqrt{2\pi y^3}} \exp\left(-\frac{1}{2y}\right) \quad \text{for } y \ge 0.$$
(b) Prove that $W_1/W_2$ has Cauchy distribution.

4. Let $X$ have a symmetric stable law with index $\alpha$. Show that $E|X|^p < \infty$ for $p < \alpha$.
Hint: If $\varphi(t) = E(e^{itX})$ denotes the characteristic function of $X$, then the following estimate can be used without proof for $u > 0$:
$$P\left(|X| > \frac{2}{u}\right) \le \frac{1}{u} \int_{-u}^{u} (1 - \varphi(t)) \, dt,$$
as well as the identity
$$E(|X|^p) = p \int_0^{\infty} t^{p-1} P(|X| > t) \, dt.$$

5. For functions $f : \mathbb{R} \to \mathbb{R}$, consider Cauchy's functional equation $f(x+y) = f(x) + f(y)$. Prove that the only solutions of the equation are the linear functions $f(x) = \alpha x$ for some $\alpha \in \mathbb{R}$, if we assume that $f$ is

(a) continuous;
(b) continuous at one point;
(c) monotonic on some interval;
(d) bounded on some interval;
(e) measurable.
Hint: Lusin's theorem ensures that, by the measurability of $f$, there is a subset $F \subseteq [0,1]$ with Lebesgue measure $2/3$ such that $f$ is uniformly continuous on $F$. Then for any $h \in (0, 1/3)$, the sets $F$ and $F - h = \{x - h : x \in F\}$ cannot be disjoint. Use this to conclude that the function $f$ is bounded on an interval $(0, \delta)$ for some small $\delta > 0$.

6. Let $X_1, X_2, \ldots$ be random variables defined on the probability space $(\Omega, \mathcal{A}, P)$ which have the same distribution function $F(x)$, but we do not assume anything about their dependence structure. Let $M_n := \max_{1 \le i \le n} X_i$.
(a) Suppose that there is an $\alpha > 0$ such that $\int |x|^{\alpha} \, dF(x) < \infty$, that is, $E(|X_i|^{\alpha}) < \infty$. Prove that for any $\varepsilon > 0$ and for fixed $\delta > 0$,
$$\lim_{n\to\infty} P\left(n^{-(1/\alpha + \varepsilon)} M_n > \delta\right) = 0.$$
(b) Suppose that there is an $s > 0$ such that $\int \exp(s|x|) \, dF(x) < \infty$, that is, $E(\exp(s|X_i|)) < \infty$. Prove that for any sequence $b_n$ that goes to $\infty$ and for fixed $\delta > 0$,
$$\lim_{n\to\infty} P\left((b_n \log n)^{-1} M_n > \delta\right) = 0.$$
Hint: Use the inequality
$$P\left(\max_{1 \le i \le n} |X_i| > \lambda\right) = P\left(\bigcup_{i=1}^{n} \{|X_i| > \lambda\}\right) \le \sum_{i=1}^{n} P(|X_i| > \lambda) = n P(|X_1| > \lambda)$$
and a Markov type inequality.

7. Let $X_1, X_2, \ldots$ be independent and identically distributed random variables with common distribution function $F(x) = P(X_i < x)$. Let $M_n := \max_{1 \le i \le n} X_i$. If $F(x) < 1$ for all $x < \infty$ and
$$\lim_{x\to\infty} x^{\alpha} (1 - F(x)) = b$$
for some fixed constants $\alpha, b \in (0, \infty)$ (that is, $1 - F(x) \sim b x^{-\alpha}$ as $x \to \infty$), then show that the distribution of $(bn)^{-1/\alpha} M_n$ converges weakly to the Fréchet distribution:
$$P\left((bn)^{-1/\alpha} M_n < x\right) \to \mathbf{1}(x > 0) \exp\left(-x^{-\alpha}\right).$$

8. Let $X_1, X_2, \ldots$ be independent and identically distributed random variables with common distribution function $F(x) = P(X_i < x)$. Let $M_n := \max_{1 \le i \le n} X_i$. If $F(x) < 1$ for all $x < \infty$ and
$$\lim_{x\to\infty} e^{\lambda x} (1 - F(x)) = b$$
for some fixed constants $\lambda, b \in (0, \infty)$ (that is, $1 - F(x) \sim b e^{-\lambda x}$ as $x \to \infty$), then show that the distribution of $\lambda M_n - \log(bn)$ converges weakly to the Gumbel distribution:
$$P\left(\lambda M_n - \log(bn) < x\right) \to \exp\left(-e^{-x}\right).$$
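The Fréchet limit in Problem 7 is easy to check numerically. Below is a minimal simulation sketch (standard library only) for the concrete tail $1 - F(x) = x^{-\alpha}$ for $x \ge 1$, so that $b = 1$ and the normalization is $n^{-1/\alpha} M_n$; samples are drawn by inverse transform. The parameter choices ($\alpha = 2$, $n = 2000$, $3000$ trials) are illustrative, not from the exercise.

```python
import math
import random

random.seed(0)
alpha = 2.0    # tail index: 1 - F(x) = x**(-alpha) for x >= 1, so b = 1
n = 2000       # number of variables per maximum
trials = 3000  # number of simulated maxima

def pareto_sample():
    # inverse transform: if U is uniform on (0, 1], then U**(-1/alpha)
    # has tail P(X > x) = x**(-alpha) for x >= 1
    u = 1.0 - random.random()
    return u ** (-1.0 / alpha)

x = 1.5                    # point at which the limiting CDF is evaluated
norm = n ** (1.0 / alpha)  # (b*n)**(1/alpha) with b = 1
hits = sum(
    max(pareto_sample() for _ in range(n)) / norm < x
    for _ in range(trials)
)
empirical = hits / trials
frechet = math.exp(-x ** (-alpha))  # Phi_alpha(x) = exp(-x**(-alpha))
print(f"empirical {empirical:.3f} vs Frechet {frechet:.3f}")
```

With these parameters the empirical frequency should agree with $\Phi_2(1.5) = e^{-4/9} \approx 0.641$ up to a Monte Carlo error of a few hundredths.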

9. Let $X_1, X_2, \ldots$ be independent and identically distributed random variables with common distribution function $F(x) = P(X_i < x)$. Let $M_n := \max_{1 \le i \le n} X_i$. If $F(x_0) = 1$ and $F(x) < 1$ for all $x < x_0$, and
$$\lim_{x \to x_0} (x_0 - x)^{-\alpha} (1 - F(x)) = b$$
for some fixed constants $\alpha, b \in (0, \infty)$ (that is, $1 - F(x) \sim b (x_0 - x)^{\alpha}$ as $x \to x_0$), then show that the distribution of $(bn)^{1/\alpha} (M_n - x_0)$ converges weakly to the Weibull distribution:
$$P\left((bn)^{1/\alpha} (M_n - x_0) < x\right) \to \mathbf{1}(x < 0) \exp\left(-(-x)^{\alpha}\right) + \mathbf{1}(x \ge 0).$$

10. Let $X_1, X_2, \ldots$ be independent and identically distributed random variables with common distribution function $F(x) = P(X_i < x)$. Suppose that the common distribution has a finite right endpoint, that is, $x_F = \sup\{x \in \mathbb{R} : F(x) < 1\} < \infty$, and that the right endpoint carries positive mass: $P(X_i = x_F) = 1 - F(x_F) > 0$. Let $M_n := \max_{1 \le i \le n} X_i$. Prove that for any sequence of reals $(u_n)$, if the sequence of probabilities $P(M_n < u_n)$ converges, then the limit is either $0$ or $1$.
Hint: Use the proposition about Poisson approximation.

11. Let $X_1, X_2, \ldots$ be independent and identically distributed random variables with common distribution function $F(x) = P(X_i < x)$. Let $M_n := \max_{1 \le i \le n} X_i$. Assume that $F(x) < 1$ for all $x \in \mathbb{R}$. Prove that the following two assertions are equivalent.
(a) There is a sequence of constants $a_n \in \mathbb{R}$ such that for any $\varepsilon > 0$,
$$\lim_{n\to\infty} P(|M_n - a_n| > \varepsilon) = 0.$$
(b) For any $\varepsilon > 0$,
$$\lim_{x\to\infty} \frac{1 - F(x + \varepsilon)}{1 - F(x)} = 0.$$
Hint: The condition in (a) is equivalent to having
$$n(1 - F(a_n + \varepsilon)) \to 0 \quad \text{and} \quad n(1 - F(a_n - \varepsilon)) \to \infty$$
for any $\varepsilon > 0$.
To prove that (a) implies (b), note that by the condition equivalent to (a), for the nondecreasing sequence $b_n = \inf\{x \in \mathbb{R} : F(x) > 1 - 1/n\}$, $a_n - b_n \to 0$ holds. For any $x$ large enough, there is an integer $n(x)$ such that $b_{n(x)} \le x \le b_{n(x)+1}$. Use this and $1 - F(b_n) \le 1/n$ to upper bound the ratio in condition (b).
To prove that (b) implies (a), the sequence $a_n$ can be chosen as $a_n = \inf\{x \in \mathbb{R} : F(x) > 1 - 1/n\}$.

12. Let $X_1, X_2, \ldots$ be an iid sequence of Poisson random variables with parameter $\lambda > 0$, that is,
$$P(X = k) = e^{-\lambda} \frac{\lambda^k}{k!} \quad \text{for } k = 0, 1, 2, \ldots,$$
and let $M_n := \max_{1 \le i \le n} X_i$. Show that there is no normalization under which the sequence of maxima $M_n$ has a non-degenerate limit law.
Hint: Show that the necessary condition
$$\lim_{x \to x_F} \frac{\overline{F}(x+)}{\overline{F}(x)} = 1$$
fails to hold along a sequence of integers, where $\overline{F}(x) = 1 - F(x)$ is the tail probability function. More precisely, $\overline{F}(k+1)/\overline{F}(k)$ converges to $0$ along the integers $k$.
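The failure of the necessary condition in Problem 12 can be observed directly. The sketch below (with an illustrative parameter $\lambda = 3$, not prescribed by the exercise) computes the Poisson tail $\overline{F}(k) = P(X > k)$ as a partial sum of point masses; the ratios $\overline{F}(k+1)/\overline{F}(k)$ visibly decay toward $0$ rather than approach $1$.

```python
import math

lam = 3.0  # illustrative Poisson parameter

def tail(k, terms=100):
    # P(X > k) summed directly over the point masses; using 1 - CDF here
    # would suffer catastrophic cancellation for large k
    return sum(math.exp(-lam) * lam ** j / math.factorial(j)
               for j in range(k + 1, k + 1 + terms))

ratios = [tail(k + 1) / tail(k) for k in (10, 20, 30)]
print([round(r, 4) for r in ratios])  # decreasing, roughly lam / (k + 2)
```

Since $P(X = k+1)/P(X = k) = \lambda/(k+1) \to 0$, the tail is dominated by its first term and the ratio behaves like $\lambda/(k+2)$, consistent with the hint.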

13. Let $X_1, X_2, \ldots$ be an iid sequence of negative binomial random variables with parameters $p \in (0,1)$ and $m \in \mathbb{N}$, that is,
$$P(X = k) = \binom{k+m-1}{k} p^m (1-p)^k \quad \text{for } k = 0, 1, 2, \ldots,$$
and let $M_n := \max_{1 \le i \le n} X_i$. Show that there is no normalization under which the sequence of maxima $M_n$ has a non-degenerate limit law.
Hint: Show that the necessary condition
$$\lim_{x \to x_F} \frac{\overline{F}(x+)}{\overline{F}(x)} = 1$$
fails to hold along a sequence of integers, where $\overline{F}(x) = 1 - F(x)$ is the tail probability function. More precisely, $\overline{F}(k+1)/\overline{F}(k)$ converges to $1 - p$ along the integers $k$. To this end, show and use the asymptotic identity
$$P(X = k) = \frac{k^{m-1}}{(m-1)!}\, p^m (1-p)^k (1 + o(1))$$
as $k \to \infty$.

14. Let $X$ be a random variable with distribution function $F$ where the tail is $1 - F(x) = x^{-\alpha}$ for $x \ge 1$ with some $\alpha > 0$. Then we know that $F \in \mathrm{MDA}(\Phi_\alpha)$. Which MDA do the distributions of $X^p$ and of $\ln(X)$ belong to if $p > 0$? What are the normalizing constants?

15. Suppose that $X > 0$ is a random variable and $\alpha > 0$ is a number. Show that the following are equivalent:
(a) $X$ has the Fréchet distribution function
$$\Phi_\alpha(x) = \begin{cases} 0 & \text{if } x \le 0, \\ \exp\left(-x^{-\alpha}\right) & \text{if } x > 0; \end{cases}$$
(b) $\ln X^\alpha$ has the Gumbel distribution function $\Lambda(x) = \exp\left(-e^{-x}\right)$;
(c) $-X^{-1}$ has the Weibull distribution function
$$\Psi_\alpha(x) = \begin{cases} \exp\left(-(-x)^\alpha\right) & \text{if } x \le 0, \\ 1 & \text{if } x > 0. \end{cases}$$

16. Let $X_1, X_2, \ldots$ be an independent but not identically distributed sequence of random variables; let $X_k$ be exponential with parameter $\lambda_k$. Denote by $m_n = \min(X_1, \ldots, X_n)$ the minimum record up to $n$, and let $m = \lim_{n\to\infty} m_n$, which exists since $m_n$ is nonincreasing.
(a) Show that if $\sum_{k=1}^{\infty} \lambda_k < \infty$, then with probability one the minimum record is broken finitely many times and $m > 0$.
(b) Show that if $\sum_{k=1}^{\infty} \lambda_k = \infty$, then with probability one the minimum record is broken infinitely many times and $m = 0$.

Hint: One can use the representation of $X_k$ as the first point of a Poisson point process on $\mathbb{R}_+$ with intensity $\lambda_k$, and the fact that the union of independent Poisson point processes with intensities $\lambda_k$, $k = 1, 2, \ldots$, is a Poisson point process with intensity $\sum_{k=1}^{\infty} \lambda_k$.

17. Let $X_1, X_2, \ldots$ be an independent but not identically distributed sequence of random variables; let $X_k$ be exponential with parameter $\lambda_k$. Denote by $M_n = \max(X_1, \ldots, X_n)$ the maximum record up to $n$, and let $M = \lim_{n\to\infty} M_n$, which exists since $M_n$ is nondecreasing.
(a) Show that for any constant $K > 0$: if $\sum_n e^{-\lambda_n K} = \infty$, then with probability one there are infinitely many indices $n$ such that $X_n > K$; further, if $\sum_n e^{-\lambda_n K} < \infty$, then with probability one there are only finitely many indices $n$ such that $X_n > K$.
Hint: Use the Borel–Cantelli lemmas.
(b) Conclude that if $\sum_n e^{-\lambda_n K} = \infty$ for every $K > 0$, then with probability one $M = \infty$ and the maximum record is broken infinitely many times. If $\sum_n e^{-\lambda_n K} < \infty$ for every $K > 0$, then with probability one $M < \infty$ and the maximum record is broken finitely many times.
(c) If there are $K_1 > K_2 > 0$ such that $\sum_n e^{-\lambda_n K_1} < \infty$ but $\sum_n e^{-\lambda_n K_2} = \infty$, then there is a critical value $K_c$ such that the sum is infinite for smaller values of $K$ and finite for larger values of $K$. Show that in this case, with probability one, $\limsup_n X_n = K_c$.
(d) Suppose that there is a critical $K_c$ as described above. Show that if $\sum_n e^{-\lambda_n K_c} = \infty$, then with probability one the maximum record is broken finitely many times.
Hint: The condition shows that there are infinitely many indices $n$ such that $X_n > K_c$, but for any $\varepsilon > 0$, there are only finitely many with $X_n > K_c + \varepsilon$.
(e) Suppose that there is a critical $K_c$ as described above. Show that if $\sum_n e^{-\lambda_n K_c} < \infty$, then with positive probability the maximum record is broken infinitely many times. Compute this probability.
Hint: Observe that the maximum record is broken infinitely many times if and only if $X_n < K_c$ for all indices $n$, since $\limsup_n X_n = K_c$.
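As a sanity check for Problem 16(a), note that the infimum of independent $\mathrm{Exp}(\lambda_k)$ variables satisfies $P(m > t) = \prod_k e^{-\lambda_k t} = e^{-t \sum_k \lambda_k}$, i.e. $m$ is itself exponential with rate $\sum_k \lambda_k$. The sketch below simulates this (standard library only) with the illustrative choice $\lambda_k = 2^{-k}$, so that $\sum_k \lambda_k = 1$ and $m$ should be approximately $\mathrm{Exp}(1)$; the infinite sequence is truncated at $40$ terms, leaving a negligible residual rate.

```python
import random

random.seed(1)
K = 40          # truncation of the infinite sequence (residual rate ~ 2**-40)
trials = 20000
rates = [2.0 ** (-k) for k in range(1, K + 1)]  # lambda_k = 2**-k, sum ~ 1

def limiting_min():
    # minimum of independent exponentials with the given rates
    return min(random.expovariate(lam) for lam in rates)

mean_m = sum(limiting_min() for _ in range(trials)) / trials
print(f"mean of the limiting minimum: {mean_m:.3f}")  # Exp(1) has mean 1
```

The sample mean should land within a few hundredths of $1$, the mean of $\mathrm{Exp}(1)$, confirming in particular that $m > 0$ with probability one when $\sum_k \lambda_k < \infty$.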