The Contraction Method on C([0, 1]) and Donsker's Theorem
1 The Contraction Method on C([0, 1]) and Donsker's Theorem
Henning Sulzbach, J. W. Goethe-Universität Frankfurt a. M.
YEP VII: Probability, random trees and algorithms, Eindhoven, March 12, 2010.
Joint work with Ralph Neininger.
2–6 Quickselect / Find
Task: Given a list S of n distinct elements, FIND the element of rank k.
Algorithm:
- Choose one element x of S uniformly at random (the pivot).
- Comparing all other elements with x gives the sublists S_< and S_>.
- If I_n := |S_<| ≥ k, search recursively in S_< for the element of rank k; otherwise search recursively in S_> for the element of rank k − I_n.
- Stop if the considered set is of size 1.
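The algorithm just described can be sketched in a few lines of Python (an illustrative sketch, not part of the original slides; the pivot's own rank is handled explicitly, so the recursion into S_> uses rank k − I_n − 1):

```python
import random

def quickselect(s, k):
    """Return the element of rank k (1-based) among the distinct elements of s."""
    if len(s) == 1:
        return s[0]
    pivot = random.choice(s)
    smaller = [x for x in s if x < pivot]   # S_<
    larger = [x for x in s if x > pivot]    # S_>
    if len(smaller) >= k:                   # I_n >= k: recurse into S_<
        return quickselect(smaller, k)
    if len(smaller) == k - 1:               # the pivot itself has rank k
        return pivot
    return quickselect(larger, k - len(smaller) - 1)
```

For example, quickselect(list(range(100)), 10) returns 9, whatever pivots are drawn.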
7–9 Quickselect / Find
For simplicity take k = 1. Let X_n be the number of key comparisons and I_n the size of the left sublist. This leads to

    X_n =^d X_{I_n} + n − 1,

where (X_j) and I_n are independent and I_n is uniformly distributed on {0, ..., n − 1}. Scaling gives

    Y_n := X_n / n =^d (I_n / n) Y_{I_n} + (n − 1)/n.
10–12 Recursive equation

    Y_n := X_n / n =^d (I_n / n) Y_{I_n} + (n − 1)/n.

A possible limit Y should satisfy

    Y =^d U Y + 1,    (1)

for independent U, Y with U uniform on [0, 1].
Observe: µ ∈ M(R) satisfies (1) iff F(µ) = µ for the map F : M(R) → M(R), F(µ) = L(U Y + 1) with U, Y independent and L(Y) = µ.
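As an illustrative sanity check not on the original slides, iterating y ↦ U y + 1 on an empirical sample drives the distribution towards the fixed point of (1); its mean solves E[Y] = E[U] E[Y] + 1, i.e. E[Y] = 2, and similarly Var(Y) = 1/2:

```python
import random

def approximate_fixed_point(n_samples=50_000, n_iter=40, seed=1):
    """Iterate the map F(mu) = L(U*Y + 1) on an empirical sample started at the
    point mass in 0; the contraction property drives it to the fixed point of (1)."""
    rng = random.Random(seed)
    ys = [0.0] * n_samples
    for _ in range(n_iter):
        ys = [rng.random() * y + 1.0 for y in ys]
    return ys

ys = approximate_fixed_point()
mean = sum(ys) / len(ys)   # comes out close to E[Y] = 2
```

After 40 iterations the ℓ1-distance to the fixed point is of order 2^(−40), so the remaining error is purely Monte Carlo noise.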
13–16 Contraction
For µ, ν ∈ M_1(R) (the probability measures on R with finite first moment) let

    ℓ_1(µ, ν) = inf { E|X − Y| : L(X) = µ, L(Y) = ν }.

Then ℓ_1 is a complete metric on M_1(R), and ℓ_1(µ_n, µ) → 0 implies µ_n → µ weakly.
Show: F is a contraction on M_1(R) with respect to ℓ_1.
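On the real line the infimum defining ℓ_1 is attained by the monotone coupling, so for two empirical distributions of equal size ℓ_1 is just the average distance between matched order statistics. A small sketch (illustration, not part of the talk):

```python
def l1_empirical(xs, ys):
    """ℓ1 distance between two empirical distributions of equal size n: on R the
    optimal coupling matches order statistics, so ℓ1 = (1/n) Σ |x_(i) − y_(i)|."""
    assert len(xs) == len(ys)
    return sum(abs(a - b) for a, b in zip(sorted(xs), sorted(ys))) / len(xs)
```

For example, l1_empirical([0, 2], [1, 1]) gives 1.0, realized by the monotone matching 0 ↔ 1, 2 ↔ 1.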
17–19 Contraction
Proof: Given ε > 0, let X, Y be such that L(X) = µ, L(Y) = ν and E|X − Y| ≤ ℓ_1(µ, ν) + ε, and let U be uniform on [0, 1], independent of (X, Y). Then

    ℓ_1(F(µ), F(ν)) ≤ E|U X + 1 − (U Y + 1)| = E[U] E|X − Y| ≤ E[U] (ℓ_1(µ, ν) + ε).

Letting ε → 0 gives ℓ_1(F(µ), F(ν)) ≤ E[U] ℓ_1(µ, ν) = ℓ_1(µ, ν)/2.
20–21 The stochastic fixed-point equation

    Y =^d U Y + 1

has a unique solution µ in M_1(R), and it is easy to show ℓ_1(Y_n, Y) → 0 for L(Y) = µ.
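As an illustrative cross-check (not on the slides), one can also simulate the recursion for X_n directly through the sublist sizes and compare X_n/n with the fixed point's mean E[Y] = 2:

```python
import random

def find_min_comparisons(n, rng):
    """Key comparisons of Quickselect for rank k = 1 on n keys, simulated via
    the recursion X_n = X_{I_n} + n − 1 with I_n uniform on {0, ..., n − 1}."""
    comps = 0
    while n > 1:
        comps += n - 1          # compare every other key with the pivot
        n = rng.randrange(n)    # size I_n of the left sublist
    return comps

rng = random.Random(7)
n = 10_000
mean = sum(find_min_comparisons(n, rng) for _ in range(2000)) / (2000 * n)
```

The Monte Carlo mean of X_n/n comes out near 2, consistent with ℓ_1(Y_n, Y) → 0 and E[Y] = 2.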
22 Process version
Consider all ranks 1 ≤ k ≤ n simultaneously in the same run of the algorithm: X_n(k). Rescaling gives the random variables Y_n(k/n) = X_n(k)/n, defined on the lattice of points k/n. Fit Y_n to a step function in D([0, 1]). Grübel and Rösler ('95) prove

    Y_n →^d Y in (D([0, 1]), d_Sk).
23–27 Donsker's Theorem
Let X_1, X_2, ... be iid random variables with E[X_1] = 0, E[X_1^2] = 1 and E|X_1|^{2+ε} < ∞ for some ε > 0. Define S^n = (S^n_t)_{t ∈ [0,1]} by

    S^n_t = (1/√n) ( Σ_{k=1}^{⌊nt⌋} X_k + (nt − ⌊nt⌋) X_{⌊nt⌋+1} ),    t ∈ [0, 1].

[Figure: a sample path of S^n.]

Theorem (Donsker, 1951). S^n →^d B in (C([0, 1]), ‖·‖_sup), where B is a standard Brownian motion.
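The polygonal process S^n is easy to simulate; the following sketch (illustrative, using ±1 increments, which satisfy the moment assumptions) evaluates S^n_t on a grid of times:

```python
import math
import random

def donsker_path(n, t_grid, seed=0):
    """Evaluate the linearly interpolated partial-sum process S^n_t at the
    points of t_grid, using iid ±1 increments (mean 0, variance 1)."""
    rng = random.Random(seed)
    xs = [rng.choice([-1.0, 1.0]) for _ in range(n)]
    partial = [0.0]
    for x in xs:
        partial.append(partial[-1] + x)
    out = []
    for t in t_grid:
        j = min(int(n * t), n - 1)   # ⌊nt⌋, capped so X_{⌊nt⌋+1} exists at t = 1
        out.append((partial[j] + (n * t - j) * xs[j]) / math.sqrt(n))
    return out
```

By construction S^n_0 = 0 and Var(S^n_1) = 1, which a Monte Carlo average over many seeds reproduces.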
28–33 Decomposition of Brownian motion in space
It holds that

    (B_t)_t =^d ( (B_t + B'_t) / √2 )_t    (2)

for two independent standard Brownian motions B, B'. [Figure: sample paths of B, B' and (B + B')/√2.]
In general, there is no such decomposition in space for S^n, and equation (2) is satisfied by any centered Gaussian process.
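A Monte Carlo sanity check of (2) (illustrative code, not part of the slides): mixing two independent Brownian endpoints as (B_1 + B'_1)/√2 again gives mean 0 and variance 1:

```python
import math
import random

def bm_endpoint(rng, steps=16):
    """B_1 for a standard Brownian motion, sampled via Gaussian increments."""
    dt = 1.0 / steps
    return sum(rng.gauss(0.0, math.sqrt(dt)) for _ in range(steps))

rng = random.Random(42)
m = 4000
mixed = [(bm_endpoint(rng) + bm_endpoint(rng)) / math.sqrt(2.0) for _ in range(m)]
var_hat = sum(x * x for x in mixed) / m   # should be close to Var(B_1) = 1
```

The same mixing applied to the full paths preserves the covariance min(s, t), which is what (2) asserts.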
34–38 Decomposition of Brownian motion in time
[Figure: concatenation of B and B'.]
It holds that

    (B_t)_t =^d ( 1_{t ≤ 1/2} (1/√2) B_{2t} + 1_{t > 1/2} ( (1/√2) B_1 + (1/√2) B'_{2t−1} ) )_t
             = (1/√2) φ_2(B) + (1/√2) ψ_2(B'),    (3)

with the continuous and linear maps φ_β, ψ_β : C([0, 1]) → C([0, 1]),

    φ_β(f)(t) = 1_{t ≤ 1/β} f(βt) + 1_{t > 1/β} f(1),
    ψ_β(f)(t) = 1_{t ≤ 1/β} f(0) + 1_{t > 1/β} f( (βt − 1)/(β − 1) ).
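The maps φ_β and ψ_β are straightforward to implement; the following sketch (illustrative, with placeholder paths f, g vanishing at 0 standing in for B and B') concatenates two paths as in (3):

```python
import math

def phi(beta, f):
    """φ_β(f)(t) = f(βt) for t ≤ 1/β, and f(1) afterwards."""
    return lambda t: f(beta * t) if t <= 1.0 / beta else f(1.0)

def psi(beta, f):
    """ψ_β(f)(t) = f(0) for t ≤ 1/β, and f((βt − 1)/(β − 1)) afterwards."""
    return lambda t: f(0.0) if t <= 1.0 / beta else f((beta * t - 1.0) / (beta - 1.0))

# Placeholder paths with f(0) = g(0) = 0:
f = lambda t: t * (1.0 - t)
g = lambda t: math.sin(math.pi * t)

# Right-hand side of (3): the first half runs f at double speed, the second
# half starts at f(1)/√2 and runs g at double speed.
h = lambda t: (phi(2, f)(t) + psi(2, g)(t)) / math.sqrt(2.0)
```

For instance h(0.25) = f(0.5)/√2 and h(0.75) = (f(1) + g(0.5))/√2, and h is continuous at t = 1/2 because g(0) = 0.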
39 Theorem (Uniqueness - characterization of Brownian motion)
Let X = (X_t)_{t ∈ [0,1]} be a continuous real-valued stochastic process satisfying (3). Then there exists a constant σ ≥ 0 such that X =^d σ B for a standard Brownian motion B.
40–43 Time-decomposition of the random walk
[Figure: concatenation of two partial-sum paths.]
It holds that

    S^n =^d ( ⌊n/2⌋/n )^{1/2} φ_{n/⌊n/2⌋}( S^{⌊n/2⌋} ) + ( (n − ⌊n/2⌋)/n )^{1/2} ψ_{n/(n−⌊n/2⌋)}( Ŝ^{n−⌊n/2⌋} ),    (4)

where S^{⌊n/2⌋} and Ŝ^{n−⌊n/2⌋} are independent copies built from the two halves of the walk. Now

    ⌊n/2⌋/n → 1/2    and    (n − ⌊n/2⌋)/n → 1/2

suggests the weak convergence S^n → B.
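For even n the identity (4) even holds pathwise when both halves are built from the same increments; a small check (illustrative code; interp_path, phi and psi are sketches of S^n, φ_β and ψ_β):

```python
import math
import random

def interp_path(xs):
    """Rescaled, linearly interpolated partial-sum path on [0, 1] built from
    the increments xs (a sketch of the process S^n of the slides)."""
    n = len(xs)
    partial = [0.0]
    for x in xs:
        partial.append(partial[-1] + x)
    def s(t):
        j = min(int(n * t), n - 1)
        return (partial[j] + (n * t - j) * xs[j]) / math.sqrt(n)
    return s

def phi(beta, f):
    return lambda t: f(beta * t) if t <= 1.0 / beta else f(1.0)

def psi(beta, f):
    return lambda t: f(0.0) if t <= 1.0 / beta else f((beta * t - 1.0) / (beta - 1.0))

rng = random.Random(0)
n = 10
xs = [rng.choice([-1.0, 1.0]) for _ in range(n)]
s_n = interp_path(xs)
left = interp_path(xs[: n // 2])    # S^{n/2}: first half of the increments
right = interp_path(xs[n // 2:])    # Ŝ^{n/2}: second half of the increments
rhs = lambda t: math.sqrt(0.5) * (phi(2, left)(t) + psi(2, right)(t))
```

Both sides agree at every t up to floating-point error, which makes the distributional identity (4) plausible.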
44–47 The Zolotarev metric on C([0, 1])
Let M(C([0, 1])) be the set of probability measures on C([0, 1]). For µ, ν ∈ M(C([0, 1])) and s > 0 define the Zolotarev distance of µ and ν by

    ζ_s(µ, ν) = sup_{f ∈ F_s} |E[f(X) − f(Y)]|,    with L(X) = µ, L(Y) = ν,

and, for s = m + α with 0 < α ≤ 1 and m := ⌈s⌉ − 1 ∈ N_0,

    F_s := { f ∈ C^m(C([0, 1]), R) : ‖D^m f(x) − D^m f(y)‖ ≤ ‖x − y‖^α for all x, y ∈ C([0, 1]) }.

Set ζ_s(X, Y) = ζ_s(L(X), L(Y)).
48–51 Properties of the Zolotarev distance ζ_s
It holds that ζ_s(X, Y) < ∞ if
- E‖X‖^s < ∞ and E‖Y‖^s < ∞,
- E[g(X, ..., X)] = E[g(Y, ..., Y)] for all k ≤ m and all bounded, k-linear functions g : C([0, 1])^k → R.
In the following we assume finiteness of all ζ_s-distances considered.

Lemma. ζ_s is (s, +)-ideal, i.e.

    ζ_s(φ(X), φ(Y)) ≤ ‖φ‖^s ζ_s(X, Y)

for any continuous and linear map φ : C([0, 1]) → C([0, 1]), with operator norm

    ‖φ‖ = sup_{f ∈ C([0,1]), ‖f‖ = 1} ‖φ(f)‖.

Furthermore,

    ζ_s(X_1 + X_2, Y_1 + Y_2) ≤ ζ_s(X_1, Y_1) + ζ_s(X_2, Y_2)

for (X_1, Y_1) and (X_2, Y_2) independent. Proof: Zolotarev ('76).
52–56 Proof of Donsker's Theorem (sketch)
We are in the case 2 < s < 3 and have to consider E[f(X, X)] for continuous, bilinear functions f : C([0, 1])^2 → R. This is done by controlling the covariance function.
Since S^n and B do not share their covariance function (of course, E[S^n_s S^n_t] → min(s, t)), we also consider the process B^n defined by

    B^n_t = B_{⌊nt⌋/n} + (nt − ⌊nt⌋) ( B_{(⌊nt⌋+1)/n} − B_{⌊nt⌋/n} ),

the linear interpolation of B on the lattice {k/n : 0 ≤ k ≤ n}. A contraction argument shows

    ζ_{2+ε}(S^n, B^n) = O(n^{−δ})

for δ < ε/2. Together with

    ‖B^n − B‖ → 0 a.s.,

Donsker's Theorem follows from the following properties of Zolotarev's distance.
57–60 Properties of ζ_s in C([0, 1])

Theorem. Let 0 < s ≤ 3, i.e. m ∈ {0, 1, 2}. Let (X_n)_{n≥1}, X be random variables in (C([0, 1]), ‖·‖) where, for each n ≥ 1, X_n is piecewise linear with intervals of length at least r_n. If

    ζ_s(X_n, X) = o( log^{−m}(1/r_n) )    as n → ∞,

then X_n → X in distribution.
Proof: This basically follows from a result of Barbour from the context of Stein's method.

Theorem. Suppose 0 < s ≤ 3, i.e. m ∈ {0, 1, 2}. Let (X_n)_{n≥1}, (Y_n)_{n≥1}, Z be random variables in (C([0, 1]), ‖·‖) where, for each n ≥ 1, X_n and Y_n are piecewise linear with intervals of length at least r_n. If Y_n → Z in distribution and

    ζ_s(X_n, Y_n) = o( log^{−m}(1/r_n) )    as n → ∞,

then X_n → Z in distribution.
More informationHypothesis testing for Stochastic PDEs. Igor Cialenco
Hypothesis testing for Stochastic PDEs Igor Cialenco Department of Applied Mathematics Illinois Institute of Technology igor@math.iit.edu Joint work with Liaosha Xu Research partially funded by NSF grants
More informationFiltrations, Markov Processes and Martingales. Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition
Filtrations, Markov Processes and Martingales Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition David pplebaum Probability and Statistics Department,
More informationsparse and low-rank tensor recovery Cubic-Sketching
Sparse and Low-Ran Tensor Recovery via Cubic-Setching Guang Cheng Department of Statistics Purdue University www.science.purdue.edu/bigdata CCAM@Purdue Math Oct. 27, 2017 Joint wor with Botao Hao and Anru
More informationIntegral representations in models with long memory
Integral representations in models with long memory Georgiy Shevchenko, Yuliya Mishura, Esko Valkeila, Lauri Viitasaari, Taras Shalaiko Taras Shevchenko National University of Kyiv 29 September 215, Ulm
More informationPROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS
PROBABILITY: LIMIT THEOREMS II, SPRING 218. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please
More informationOPERATOR THEORY ON HILBERT SPACE. Class notes. John Petrovic
OPERATOR THEORY ON HILBERT SPACE Class notes John Petrovic Contents Chapter 1. Hilbert space 1 1.1. Definition and Properties 1 1.2. Orthogonality 3 1.3. Subspaces 7 1.4. Weak topology 9 Chapter 2. Operators
More informationEconomics 204. The Transversality Theorem is a particularly convenient formulation of Sard s Theorem for our purposes: with r 1+max{0,n m}
Economics 204 Lecture 13 Wednesday, August 12, 2009 Section 5.5 (Cont.) Transversality Theorem The Transversality Theorem is a particularly convenient formulation of Sard s Theorem for our purposes: Theorem
More informationStatistical Inference on Large Contingency Tables: Convergence, Testability, Stability. COMPSTAT 2010 Paris, August 23, 2010
Statistical Inference on Large Contingency Tables: Convergence, Testability, Stability Marianna Bolla Institute of Mathematics Budapest University of Technology and Economics marib@math.bme.hu COMPSTAT
More informationOn the fast convergence of random perturbations of the gradient flow.
On the fast convergence of random perturbations of the gradient flow. Wenqing Hu. 1 (Joint work with Chris Junchi Li 2.) 1. Department of Mathematics and Statistics, Missouri S&T. 2. Department of Operations
More informationRANDOM FRACTALS AND PROBABILITY METRICS
RANDOM FRACTALS AND PROBABILITY METRICS JOHN E. HUTCHINSON AND LUDGER RÜSCHENDORF Abstract. New metrics are introduced in the space of random measures and are applied, with various modifications of the
More informationSTABILITY AND UNIFORM APPROXIMATION OF NONLINEAR FILTERS USING THE HILBERT METRIC, AND APPLICATION TO PARTICLE FILTERS 1
The Annals of Applied Probability 0000, Vol. 00, No. 00, 000 000 STABILITY AND UNIFORM APPROXIMATION OF NONLINAR FILTRS USING TH HILBRT MTRIC, AND APPLICATION TO PARTICL FILTRS 1 By François LeGland and
More informationApproximating irregular SDEs via iterative Skorokhod embeddings
Approximating irregular SDEs via iterative Skorokhod embeddings Universität Duisburg-Essen, Essen, Germany Joint work with Stefan Ankirchner and Thomas Kruse Stochastic Analysis, Controlled Dynamical Systems
More informationLarge deviations and fluctuation exponents for some polymer models. Directed polymer in a random environment. KPZ equation Log-gamma polymer
Large deviations and fluctuation exponents for some polymer models Timo Seppäläinen Department of Mathematics University of Wisconsin-Madison 211 1 Introduction 2 Large deviations 3 Fluctuation exponents
More informationUseful Probability Theorems
Useful Probability Theorems Shiu-Tang Li Finished: March 23, 2013 Last updated: November 2, 2013 1 Convergence in distribution Theorem 1.1. TFAE: (i) µ n µ, µ n, µ are probability measures. (ii) F n (x)
More informationOn Stopping Times and Impulse Control with Constraint
On Stopping Times and Impulse Control with Constraint Jose Luis Menaldi Based on joint papers with M. Robin (216, 217) Department of Mathematics Wayne State University Detroit, Michigan 4822, USA (e-mail:
More informationIntroduction to Empirical Processes and Semiparametric Inference Lecture 02: Overview Continued
Introduction to Empirical Processes and Semiparametric Inference Lecture 02: Overview Continued Michael R. Kosorok, Ph.D. Professor and Chair of Biostatistics Professor of Statistics and Operations Research
More information