Superconcentration inequalities for centered Gaussian stationary processes
1 Superconcentration inequalities for centered Gaussian stationary processes. Kevin Tanguy, Toulouse University. June 21. 1 / 22
2 Outline: What is superconcentration? Convergence of extremes (Gaussian case). Superconcentration inequality for stationary Gaussian sequences. Tools and sketch of the proof.
3 What is superconcentration?
4-10 Let X = (X_1, ..., X_n) ~ N(0, Γ) with E[X_i^2] = 1, and set M_n = max_i X_i. Question: what is a good upper bound on Var(M_n)? Classical concentration theory gives Var(M_n) ≤ max_i Var(X_i): a sharp inequality in general, but one that does not depend on Γ.
11-13 Take Γ = Id (the X_i's independent). Then Var(M_n) ≤ 1, but in fact Var(M_n) ≤ C / log n for some constant C > 0.
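The slides do not include numerics, but the 1/log n decay in the independent case is easy to see by simulation. The following sketch (sample sizes and seed are arbitrary choices, not from the talk) estimates Var(M_n) for i.i.d. standard Gaussians:

```python
import numpy as np

# Monte Carlo illustration: for i.i.d. N(0,1) variables, the empirical
# Var(M_n) decays roughly like 1/log n, far below the classical bound 1.
rng = np.random.default_rng(0)

def max_variance(n, samples=10_000, block=5_000_000):
    """Empirical variance of M_n = max of n i.i.d. N(0,1) variables,
    drawn in column chunks so memory stays bounded."""
    cols = max(1, block // samples)
    maxima = np.full(samples, -np.inf)
    done = 0
    while done < n:
        c = min(cols, n - done)
        maxima = np.maximum(maxima, rng.standard_normal((samples, c)).max(axis=1))
        done += c
    return float(maxima.var())

variances = {n: max_variance(n) for n in (10, 100, 10_000)}
for n, v in variances.items():
    print(f"n={n:6d}  Var(M_n)~{v:.3f}  Var(M_n)*log(n)~{v * np.log(n):.3f}")
```

The product Var(M_n) * log n stays roughly constant while Var(M_n) itself shrinks, consistent with the C / log n bound.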
14-18 Classical theory is therefore suboptimal: in Chatterjee's terminology, this is the superconcentration phenomenon. It occurs in many different models: the largest eigenvalue in random matrix theory; branching random walk.
19-21 Branching random walk. Take a binary tree of depth N, put i.i.d. X_e ~ N(0, 1) on each edge e, and for each path π from the top to the bottom of the tree set X_π = Σ_{e ∈ π} X_e. Classical theory: Var(max_π X_π) ≤ N (since X_π ~ N(0, N)). In fact, Var(max_π X_π) ≤ C [Bramson-Ding-Zeitouni].
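This gap between the classical bound N and the true O(1) variance is visible numerically. A minimal simulation sketch (depths, sample size and seed are arbitrary choices, not from the talk):

```python
import numpy as np

# Branching random walk on a binary tree of depth N with i.i.d. N(0,1)
# edge weights. Classical theory only gives Var(max) <= N, but the
# empirical variance stays bounded in N (cf. Bramson-Ding-Zeitouni).
rng = np.random.default_rng(1)

def brw_maxima(depth, samples=2_000):
    """Samples of max_pi X_pi: grow all 2^k partial path sums level by level."""
    sums = np.zeros((samples, 1))
    for _ in range(depth):
        # every node branches into two children, each adding a fresh N(0,1) edge
        sums = np.repeat(sums, 2, axis=1)
        sums += rng.standard_normal(sums.shape)
    return sums.max(axis=1)

brw_vars = {N: float(brw_maxima(N).var()) for N in (6, 12)}
for N, v in brw_vars.items():
    print(f"depth N={N:2d}  classical bound {N}  empirical Var(max)~{v:.3f}")
```

Doubling the depth doubles the classical bound, while the empirical variance barely moves.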
22-26 Classical theory is suboptimal: Chatterjee's superconcentration phenomenon. Many different models: largest eigenvalue in random matrix theory; branching random walk; discrete Gaussian free field on Z^d; free energy in spin glass theory (SK model); first passage percolation; stationary Gaussian sequences.
27 What is superconcentration? Convergence of extremes (Gaussian case).
28-33 Stationary Gaussian sequences. Theorem [Berman 64]. Let (X_i)_{i ≥ 0} be a centered, normalized stationary Gaussian sequence with covariance function φ. Assume φ(n) = o(1/log n) as n → ∞. Then a_n (M_n - b_n) → G in distribution as n → ∞, where a_n = √(2 log n) and P(G ≥ t) = 1 - e^{-e^{-t}}, t ∈ R (Gumbel). Note: P(G ≥ t) ≤ e^{-t}.
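Berman's convergence is easy to visualize in the simplest case φ = 0 (i.i.d. variables). A small sketch follows; note that the centering b_n used below is the standard extreme-value choice for Gaussians, which the slides do not spell out (only a_n = √(2 log n) is given), so it is an assumption here:

```python
import numpy as np

# Illustration of the Gumbel limit for i.i.d. N(0,1) maxima. The centering
# b_n is the standard extreme-value constant (an assumption; the slides only
# specify a_n = sqrt(2 log n)).
rng = np.random.default_rng(2)

n, samples = 10_000, 5_000
a_n = np.sqrt(2 * np.log(n))
b_n = a_n - (np.log(np.log(n)) + np.log(4 * np.pi)) / (2 * a_n)

maxima = np.concatenate([rng.standard_normal((500, n)).max(axis=1)
                         for _ in range(samples // 500)])
z = a_n * (maxima - b_n)

# For the Gumbel law, P(G <= 0) = exp(-1); also check the stated tail bound
# P(G >= t) = 1 - exp(-exp(-t)) <= exp(-t) on a grid.
emp_cdf_at_0 = float((z <= 0).mean())
ts = np.linspace(0.0, 5.0, 11)
gumbel_tail = 1 - np.exp(-np.exp(-ts))
print(emp_cdf_at_0, np.exp(-1), bool(np.all(gumbel_tail <= np.exp(-ts))))
```

The empirical CDF of a_n (M_n - b_n) at 0 is already close to e^{-1} at n = 10^4 (convergence is slow, of order 1/log n).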
34-36 Results from classical theory: Var(M_n) ≤ 1. Question: what is the correct bound? Proposition [Chatterjee 14]: Var(M_n) ≤ C / log n.
37 What is superconcentration? Convergence of extremes (Gaussian case). Superconcentration inequality for stationary Gaussian sequences.
38-41 Question: is there a (super)concentration inequality reflecting the convergence of extremes? The situation is analogous to the CLT: for Y_1, ..., Y_n i.i.d. Rademacher (P(Y_1 = 1) = P(Y_1 = -1) = 1/2), one has n^{-1/2} Σ_{i=1}^n Y_i → N(0, 1) in distribution (CLT), and the matching concentration inequality P(n^{-1/2} Σ_{i=1}^n Y_i ≥ t) ≤ e^{-t^2/2}, t ≥ 0.
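The Rademacher tail bound in the analogy is Hoeffding's inequality; a quick simulation sketch (sample sizes and seed are arbitrary choices, not from the talk) confirms the simulated tails sit below it:

```python
import numpy as np

# Check of the sub-Gaussian tail P(n^{-1/2} sum Y_i >= t) <= exp(-t^2/2)
# for i.i.d. Rademacher signs (Hoeffding's inequality), via simulation.
rng = np.random.default_rng(3)

n, samples = 400, 50_000
Y = rng.integers(0, 2, size=(samples, n), dtype=np.int8) * 2 - 1  # +/-1 signs
S = Y.sum(axis=1, dtype=np.int32) / np.sqrt(n)

rademacher_tails = {t: (float((S >= t).mean()), float(np.exp(-t * t / 2)))
                    for t in (0.5, 1.0, 2.0)}
for t, (emp, bound) in rademacher_tails.items():
    print(f"t={t}  empirical {emp:.4f} <= bound {bound:.4f}")
```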
42-45 (Super)concentration inequality? Gaussian concentration inequality [Borell, Sudakov-Tsirelson 76]. Take X ~ N(0, Id) and F : R^n → R Lipschitz. Then P(|F(X) - E[F(X)]| ≥ t) ≤ 2 e^{-t^2 / (2 ||F||^2_Lip)}, t ≥ 0. If F(x) = max_i x_i then ||F||_Lip ≤ 1, so P(a_n |M_n - E[M_n]| ≥ t) ≤ 2 e^{-t^2 / (2 a_n^2)}, t ≥ 0, with a_n = √(2 log n). This does not reflect the Gumbel asymptotics.
46-50 Question: a (super)concentration inequality reflecting the convergence of extremes? Theorem [T. 15]. Let (X_i)_{i ≥ 0} be a centered stationary Gaussian sequence with covariance function φ. Assume φ(n) = o(1/log n) as n → ∞, plus technical hypotheses. Then P(|M_n - E[M_n]| ≥ t) ≤ 6 e^{-c t √(log n)}, t ≥ 0. Remark: the same inequality holds with b_n instead of E[M_n], so P(√(log n) |M_n - b_n| ≥ t) ≤ C e^{-c t}, t ≥ 0, which reflects the Gumbel asymptotics.
51 Recall. Theorem [Berman 64]. Let (X_i)_{i ≥ 0} be a centered, normalized stationary Gaussian sequence with covariance function φ. Assume φ(n) = o(1/log n) as n → ∞. Then a_n (M_n - b_n) → G in distribution, where a_n = √(2 log n) and P(G ≥ t) = 1 - e^{-e^{-t}}, t ∈ R (Gumbel). Note: P(G ≥ t) ≤ e^{-t}.
52-53 Theorem [T. 15]. Let (X_i)_{i ≥ 0} be a centered stationary Gaussian sequence with covariance function φ. Assume φ(n) = o(1/log n) as n → ∞, plus technical hypotheses. Then P(|M_n - E[M_n]| ≥ t) ≤ 6 e^{-c t √(log n)}, t ≥ 0. The same inequality holds with b_n instead of E[M_n]; it reflects the Gumbel asymptotics and implies Var(max_i X_i) ≤ C / log n (optimal).
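The last claim follows from the tail bound by integration. A sketch of the step, with c the constant of the theorem:

```latex
\operatorname{Var}(M_n) \le \mathbb{E}\big[(M_n - \mathbb{E}[M_n])^2\big]
  = \int_0^\infty 2t \,\mathbb{P}\big(|M_n - \mathbb{E}[M_n]| \ge t\big)\,dt
  \le \int_0^\infty 12\, t\, e^{-c t \sqrt{\log n}}\,dt
  = \frac{12}{c^2 \log n}\int_0^\infty u\, e^{-u}\,du
  = \frac{12}{c^2 \log n},
```

using the substitution u = c t √(log n) and ∫_0^∞ u e^{-u} du = 1.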
54-56 Tools. Proof? Chatterjee's scheme of proof for the variance, carried out at the exponential level. A general theorem then implies the superconcentration inequality for stationary Gaussian sequences.
57-59 Key steps of the proof: a semigroup representation of the variance; hypercontractivity; proper use of a covering C of {1, ..., n}.
60-66 Interpolation, independent case. Ornstein-Uhlenbeck semigroup: P_t f(x) = ∫_{R^n} f(x e^{-t} + √(1 - e^{-2t}) y) dγ(y), t ≥ 0, x ∈ R^n. Heat equation: ∂_t P_t f = L P_t f, where L = Δ - x·∇. Then Var(f) = E[f^2] - E[f]^2 = E[(P_0 f)^2] - (E[P_∞ f])^2 = -∫_0^∞ (d/dt) E[(P_t f)^2] dt = -2 ∫_0^∞ E[(P_t f)(∂_t P_t f)] dt = ... = 2 ∫_0^∞ e^{-2t} Σ_{i=1}^n E[(P_t ∂_i f)^2] dt.
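The final identity can be sanity-checked numerically in one dimension with Gauss-Hermite quadrature. A sketch (the test function f(x) = x^3 and the quadrature sizes are arbitrary choices, not from the talk):

```python
import numpy as np

# Check of Var(f) = 2 * int_0^inf e^{-2t} E[(P_t f')^2] dt for the
# Ornstein-Uhlenbeck semigroup in dimension 1, with Gauss-Hermite quadrature
# for the standard Gaussian measure. For f(x) = x^3, Var(f) = E[x^6] = 15.
nodes, weights = np.polynomial.hermite_e.hermegauss(80)
weights = weights / weights.sum()            # probabilist normalization

f = lambda x: x ** 3
df = lambda x: 3 * x ** 2                    # f'

def P_t(g, t, x):
    """(P_t g)(x) = E[g(x e^{-t} + sqrt(1 - e^{-2t}) Z)], Z ~ N(0,1)."""
    s = np.sqrt(1.0 - np.exp(-2.0 * t))
    return g(np.exp(-t) * x[:, None] + s * nodes[None, :]) @ weights

var_f = weights @ f(nodes) ** 2 - (weights @ f(nodes)) ** 2

ts = np.linspace(0.0, 12.0, 1201)
integrand = np.array([np.exp(-2.0 * t) * (weights @ P_t(df, t, nodes) ** 2)
                      for t in ts])
# trapezoidal rule in t; the truncated tail beyond t = 12 is of order e^{-24}
rhs = 2.0 * float(np.sum((integrand[1:] + integrand[:-1]) / 2.0 * np.diff(ts)))
print(var_f, rhs)
```

Both sides agree to the accuracy of the time discretization.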
67-69 Hypercontractivity: P_t is hypercontractive from L^{p_t} into L^2, with p_t = 1 + e^{-2t} < 2: E[(P_t h)^2]^{1/2} ≤ E[|h|^{p_t}]^{1/p_t}. Apply this to h = ∂_i(e^{θ M_n}), θ ∈ R, combined with a concentration lemma. The general case is similar, with a further step to treat the correlations.
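The hypercontractive inequality itself can be checked numerically in one dimension with the same quadrature machinery. A sketch (the test function h(x) = x^2 and the grid of times are arbitrary choices, not from the talk):

```python
import numpy as np

# Check of Nelson's hypercontractive bound
#   E[(P_t h)^2]^{1/2} <= E[|h|^{p_t}]^{1/p_t},  p_t = 1 + e^{-2t} < 2,
# for the Ornstein-Uhlenbeck semigroup in dimension 1, h(x) = x^2.
nodes, weights = np.polynomial.hermite_e.hermegauss(80)
weights = weights / weights.sum()            # probabilist normalization

h = lambda x: x ** 2

hyper = []
for t in (0.25, 0.5, 1.0, 2.0):
    p_t = 1.0 + np.exp(-2.0 * t)
    s = np.sqrt(1.0 - np.exp(-2.0 * t))
    # (P_t h)(x_j) via quadrature in the inner Gaussian variable
    Pth = h(np.exp(-t) * nodes[:, None] + s * nodes[None, :]) @ weights
    lhs = float(weights @ Pth ** 2) ** 0.5
    rhs = float(weights @ np.abs(h(nodes)) ** p_t) ** (1.0 / p_t)
    hyper.append((t, lhs, rhs))
    print(f"t={t}: L2 norm of P_t h = {lhs:.4f} <= L^p_t norm of h = {rhs:.4f}")
```

The L^2 norm of P_t h stays below the weaker L^{p_t} norm of h at every t, which is exactly the smoothing the proof exploits.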
70-72 Conclusion. Similar results hold for: stationary Gaussian sequences with other covariance behaviors; stationary Gaussian processes on R^d; the discrete Gaussian free field on Z^d, d ≥ 3 (more generally, an infinite group of finite type with a volume growth condition is fine); the uniform measure on the sphere S^{n-1} ⊂ R^n; log-concave measures on R^n under convexity assumptions.
73-74 Conclusion. Hypercontractivity is relevant for Gaussian processes close to the independent case (variance of order 1/log n). The discrete Gaussian free field on Z^2 shows completely different behavior: Var(M_n) = O(1) [Bramson-Ding-Zeitouni], with convergence in distribution to a randomly shifted Gumbel law [Bramson-Ding-Zeitouni 15]; there, hypercontractivity alone does not work.
75 Thank you for your attention.
More informationConvergence to equilibrium of Markov processes (eventually piecewise deterministic)
Convergence to equilibrium of Markov processes (eventually piecewise deterministic) A. Guillin Université Blaise Pascal and IUF Rennes joint works with D. Bakry, F. Barthe, F. Bolley, P. Cattiaux, R. Douc,
More informationExtrema of discrete 2D Gaussian Free Field and Liouville quantum gravity
Extrema of discrete 2D Gaussian Free Field and Liouville quantum gravity Marek Biskup (UCLA) Joint work with Oren Louidor (Technion, Haifa) Discrete Gaussian Free Field (DGFF) D R d (or C in d = 2) bounded,
More informationRecent results on branching Brownian motion on the positive real axis. Pascal Maillard (Université Paris-Sud (soon Paris-Saclay))
Recent results on branching Brownian motion on the positive real axis Pascal Maillard (Université Paris-Sud (soon Paris-Saclay)) CMAP, Ecole Polytechnique, May 18 2017 Pascal Maillard Branching Brownian
More informationδ -method and M-estimation
Econ 2110, fall 2016, Part IVb Asymptotic Theory: δ -method and M-estimation Maximilian Kasy Department of Economics, Harvard University 1 / 40 Example Suppose we estimate the average effect of class size
More informationSTA205 Probability: Week 8 R. Wolpert
INFINITE COIN-TOSS AND THE LAWS OF LARGE NUMBERS The traditional interpretation of the probability of an event E is its asymptotic frequency: the limit as n of the fraction of n repeated, similar, and
More informationInstitute of Actuaries of India
Institute of Actuaries of India Subject CT3 Probability & Mathematical Statistics May 2011 Examinations INDICATIVE SOLUTION Introduction The indicative solution has been written by the Examiners with the
More informationStein s Method: Distributional Approximation and Concentration of Measure
Stein s Method: Distributional Approximation and Concentration of Measure Larry Goldstein University of Southern California 36 th Midwest Probability Colloquium, 2014 Concentration of Measure Distributional
More informationSTAT 200C: High-dimensional Statistics
STAT 200C: High-dimensional Statistics Arash A. Amini April 27, 2018 1 / 80 Classical case: n d. Asymptotic assumption: d is fixed and n. Basic tools: LLN and CLT. High-dimensional setting: n d, e.g. n/d
More informationBoolean Functions: Influence, threshold and noise
Boolean Functions: Influence, threshold and noise Einstein Institute of Mathematics Hebrew University of Jerusalem Based on recent joint works with Jean Bourgain, Jeff Kahn, Guy Kindler, Nathan Keller,
More information9.1 Orthogonal factor model.
36 Chapter 9 Factor Analysis Factor analysis may be viewed as a refinement of the principal component analysis The objective is, like the PC analysis, to describe the relevant variables in study in terms
More informationprocess on the hierarchical group
Intertwining of Markov processes and the contact process on the hierarchical group April 27, 2010 Outline Intertwining of Markov processes Outline Intertwining of Markov processes First passage times of
More informationLeast squares under convex constraint
Stanford University Questions Let Z be an n-dimensional standard Gaussian random vector. Let µ be a point in R n and let Y = Z + µ. We are interested in estimating µ from the data vector Y, under the assumption
More information4 Sums of Independent Random Variables
4 Sums of Independent Random Variables Standing Assumptions: Assume throughout this section that (,F,P) is a fixed probability space and that X 1, X 2, X 3,... are independent real-valued random variables
More informationInvariances in spectral estimates. Paris-Est Marne-la-Vallée, January 2011
Invariances in spectral estimates Franck Barthe Dario Cordero-Erausquin Paris-Est Marne-la-Vallée, January 2011 Notation Notation Given a probability measure ν on some Euclidean space, the Poincaré constant
More informationDiscrete Ricci curvature: Open problems
Discrete Ricci curvature: Open problems Yann Ollivier, May 2008 Abstract This document lists some open problems related to the notion of discrete Ricci curvature defined in [Oll09, Oll07]. Do not hesitate
More informationLecture 9. d N(0, 1). Now we fix n and think of a SRW on [0,1]. We take the k th step at time k n. and our increments are ± 1
Random Walks and Brownian Motion Tel Aviv University Spring 011 Lecture date: May 0, 011 Lecture 9 Instructor: Ron Peled Scribe: Jonathan Hermon In today s lecture we present the Brownian motion (BM).
More informationErgodic Theorems. Samy Tindel. Purdue University. Probability Theory 2 - MA 539. Taken from Probability: Theory and examples by R.
Ergodic Theorems Samy Tindel Purdue University Probability Theory 2 - MA 539 Taken from Probability: Theory and examples by R. Durrett Samy T. Ergodic theorems Probability Theory 1 / 92 Outline 1 Definitions
More informationNotes, March 4, 2013, R. Dudley Maximum likelihood estimation: actual or supposed
18.466 Notes, March 4, 2013, R. Dudley Maximum likelihood estimation: actual or supposed 1. MLEs in exponential families Let f(x,θ) for x X and θ Θ be a likelihood function, that is, for present purposes,
More informationConcentration estimates for the isoperimetric constant of the super critical percolation cluster
Concentration estimates for the isoperimetric constant of the super critical percolation cluster arxiv:1110.6006v2 [math.pr] 8 Nov 2011 Eviatar B. Procaccia, Ron Rosenthal November 9, 2011 Abstract We
More informationDelta Method. Example : Method of Moments for Exponential Distribution. f(x; λ) = λe λx I(x > 0)
Delta Method Often estimators are functions of other random variables, for example in the method of moments. These functions of random variables can sometimes inherit a normal approximation from the underlying
More informationDistance-Divergence Inequalities
Distance-Divergence Inequalities Katalin Marton Alfréd Rényi Institute of Mathematics of the Hungarian Academy of Sciences Motivation To find a simple proof of the Blowing-up Lemma, proved by Ahlswede,
More informationStein s Method and Stochastic Geometry
1 / 39 Stein s Method and Stochastic Geometry Giovanni Peccati (Luxembourg University) Firenze 16 marzo 2018 2 / 39 INTRODUCTION Stein s method, as devised by Charles Stein at the end of the 60s, is a
More informationComputer Vision Group Prof. Daniel Cremers. 9. Gaussian Processes - Regression
Group Prof. Daniel Cremers 9. Gaussian Processes - Regression Repetition: Regularized Regression Before, we solved for w using the pseudoinverse. But: we can kernelize this problem as well! First step:
More information5 Birkhoff s Ergodic Theorem
5 Birkhoff s Ergodic Theorem Birkhoff s Ergodic Theorem extends the validity of Kolmogorov s strong law to the class of stationary sequences of random variables. Stationary sequences occur naturally even
More informationStein s method and stochastic analysis of Rademacher functionals
E l e c t r o n i c J o u r n a l o f P r o b a b i l i t y Vol. 15 (010), Paper no. 55, pages 1703 174. Journal URL http://www.math.washington.edu/~ejpecp/ Stein s method and stochastic analysis of Rademacher
More informationIf the [T, Id] automorphism is Bernoulli then the [T, Id] endomorphism is standard
If the [T, Id] automorphism is Bernoulli then the [T, Id] endomorphism is standard Christopher Hoffman Daniel Rudolph September 27, 2002 Abstract For any - measure preserving map T of a probability space
More informationThe Johnson-Lindenstrauss Lemma
The Johnson-Lindenstrauss Lemma Kevin (Min Seong) Park MAT477 Introduction The Johnson-Lindenstrauss Lemma was first introduced in the paper Extensions of Lipschitz mappings into a Hilbert Space by William
More informationOberwolfach workshop on Combinatorics and Probability
Apr 2009 Oberwolfach workshop on Combinatorics and Probability 1 Describes a sharp transition in the convergence of finite ergodic Markov chains to stationarity. Steady convergence it takes a while to
More information