Bulk Universality for Random Matrices via the local relaxation flow


1 Bulk Universality for Random Matrices via the local relaxation flow. László Erdős, Ludwig-Maximilians-Universität, Munich, Germany. Disentis, Jul 26-30, 2010. Joint with B. Schlein, H.T. Yau, and J. Yin

2 INTRODUCTION. Basic question [Wigner]: Consider a large matrix whose elements are random variables with a given probability law. What can be said about the statistical properties of the eigenvalues? Do some universal patterns emerge and what determines them? Gaussian Unitary Ensemble (GUE): H = (h_jk)_{1≤j,k≤N} is a hermitian N×N matrix with h_jk = \bar h_kj = N^{-1/2}(x_jk + i y_jk) and h_kk = √(2/N) x_kk, where x_jk, y_jk (for j < k) and x_kk are independent, centered Gaussian random variables with variance 1/2. Classical ensembles: GUE, GOE, GSE, Wishart (H = A*A)
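As a concrete reference point (my addition, not part of the talk), here is a minimal Python/numpy sketch of this GUE normalization; the function name sample_gue and the size N = 500 are illustrative choices.

import numpy as np

def sample_gue(N, rng):
    # x_jk, y_jk (j < k) and x_kk are independent, centered Gaussians with variance 1/2
    x = rng.normal(0.0, np.sqrt(0.5), size=(N, N))
    y = rng.normal(0.0, np.sqrt(0.5), size=(N, N))
    H = np.triu((x + 1j * y) / np.sqrt(N), 1)         # h_jk = N^{-1/2}(x_jk + i y_jk), j < k
    H = H + H.conj().T                                # h_kj = conj(h_jk)
    H = H + np.diag(np.sqrt(2.0 / N) * x.diagonal())  # h_kk = sqrt(2/N) x_kk
    return H

rng = np.random.default_rng(0)
H = sample_gue(500, rng)
print(np.allclose(H, H.conj().T))                     # Hermitian
off = H[np.triu_indices(500, 1)]
print(500 * np.mean(np.abs(off) ** 2))                # approx 1, i.e. E|h_jk|^2 = 1/N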

3 Semicircle density: ρ_sc(x) = √(4 − x²)/(2π), supported on [−2, 2]. Eigenvalues: x_1 ≤ x_2 ≤ ... ≤ x_N. Typical eigenvalue spacing is x_{i+1} − x_i ∼ 1/N (in the bulk). Observations: i) Semicircle density. ii) Level repulsion. Holds for GUE, GOE, GSE. For Wishart: Marchenko-Pastur law

4 SINE KERNEL. Probability density of the eigenvalues (wrt. Lebesgue): p(x_1, x_2, ..., x_N). The k-point marginal density (= correlation function) is given by p^{(k)}_N(x_1, x_2, ..., x_k) = ∫_{R^{N−k}} p(x_1, ..., x_k, x_{k+1}, ..., x_N) dx_{k+1} ... dx_N. Special case k = 1 (density): ϱ_N(x) := p^{(1)}_N(x) = ∫_{R^{N−1}} p(x, x_2, ..., x_N) dx_2 ... dx_N. It allows one to compute expectations of observables of one eigenvalue: E (1/N) Σ_{i=1}^N O(x_i) = ∫ O(x) ϱ_N(x) dx ≈ (1/2π) ∫ O(x) √(4 − x²) dx

5 Higher k is similar; for example, k = 2 gives E [1/(N(N−1))] Σ_{i≠j} O(x_i, x_j) = ∫ O(x, y) p^{(2)}_N(x, y) dx dy. Correlation functions answer questions like: How many eigenvalues are in an interval I? What is the probability that there are 5 eigenvalues in I and 3 in J? They do not (directly) answer questions like: What is the distribution of the 200-th eigenvalue? What is the distribution of the gap between neighboring eigenvalues? Nevertheless, the second question can be answered by exclusion-inclusion formulas.

6 Local level correlation statistics [Wigner, Dyson, Mehta]:
lim_{N→∞} [ϱ(E)]^{−2} p^{(2)}_N( E + x_1/(Nϱ(E)), E + x_2/(Nϱ(E)) ) = det{ S(x_i − x_j) }_{i,j=1}^2, S(x) = sin(πx)/(πx), |E| < 2,
= 1 − ( sin π(x_1 − x_2) / π(x_1 − x_2) )² ⟹ Level repulsion.
k-point correlation functions are given by k×k determinants:
lim_{N→∞} [ϱ(E)]^{−k} p^{(k)}_N( E + x_1/(Nϱ(E)), E + x_2/(Nϱ(E)), ..., E + x_k/(Nϱ(E)) ) = det{ S(x_i − x_j) }_{i,j=1}^k.
Note that the limit is independent of E as long as E is in the bulk spectrum, i.e. |E| < 2.
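A quick numerical illustration (my addition, not from the talk): for k = 2 the determinant equals 1 − S(x_1 − x_2)², which vanishes quadratically as the two arguments merge; this is the level repulsion stated above.

import numpy as np

def S(x):
    # numpy's sinc is sin(pi x)/(pi x), i.e. exactly the sine kernel
    return np.sinc(x)

def rho2(x1, x2):
    # 2x2 determinant det{S(x_i - x_j)} = S(0)^2 - S(x1 - x2) S(x2 - x1)
    return S(0.0) ** 2 - S(x1 - x2) * S(x2 - x1)

for d in [1.0, 0.1, 0.01]:
    print(d, rho2(0.0, d))   # ~ (pi d)^2 / 3 for small d: quadratic vanishing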

7 Gap distribution:
E [ # of e.v. pairs with gap less than s near E ] / [ # of eigenvalues near E ] ≈ ∫_0^s p(α) dα, p(α) := (d²/dα²) det(1 − K_α), K_α(x, y) = sin π(x − y) / π(x − y) as an operator on L²((0, α)).
Wigner surmise (not exact but a very good approximation): p(α) ≈ (πα/2) e^{−πα²/4} for the symmetric case, p(α) ≈ (32/π²) α² e^{−4α²/π} for the hermitian case.
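A small sanity check (my addition): both Wigner surmises quoted above integrate to 1, so they are genuine probability densities for the rescaled gap.

import numpy as np

a = np.linspace(0.0, 20.0, 200001)
p_sym  = (np.pi * a / 2) * np.exp(-np.pi * a**2 / 4)           # symmetric (GOE) surmise
p_herm = (32 / np.pi**2) * a**2 * np.exp(-4 * a**2 / np.pi)    # hermitian (GUE) surmise
print(np.trapz(p_sym, a), np.trapz(p_herm, a))                 # both approximately 1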

8 More precisely: Fix an energy E with |E| < 2, and for any s > 0 and for some N-dependent parameter t with 1/N ≪ t ≪ 1 let
Λ(s) := [1/(2Nt ϱ_sc(E))] #{ j ≤ N : λ_{j+1} − λ_j ≤ s/(N ϱ_sc(E)), |λ_j − E| ≤ t },
i.e., the proportion of rescaled eigenvalue differences below a threshold s in a large but still microscopic vicinity of an energy E. By the exclusion-inclusion formula, a change of variables and the universal limit of the correlation functions,
EΛ(s) = [1/(2Nt ϱ)] Σ_{m=2}^N (−1)^m (N choose m) ∫_{−t}^{t} dv_1 ... ∫_{−t}^{t} dv_m 1{ max_{i,j} |v_i − v_j| ≤ s/(Nϱ) } p^{(m)}_N(E + v_1, E + v_2, ..., E + v_m)
→ Σ_{m=2}^∞ [(−1)^m/(m−1)!] ∫_0^s da_2 ... ∫_0^s da_m det( sin π(a_i − a_j)/π(a_i − a_j) )_{i,j=1}^m (with a_1 := 0),
where ϱ = ϱ_sc(E).

9 History, Applications and Conjectures. Nuclear Physics: the excitation spectra of heavy nuclei are expected to have the same local statistical properties as the eigenvalues of GOE (Wigner, 1955). Quantum Chaos Conjecture (vague): for classical dynamics with potential V, the eigenvalue gaps of −Δ + V follow GOE statistics if the dynamics is chaotic, and Poisson statistics if it is integrable. Geodesic flow: Bohigas-Giannoni-Schmit (1984), Berry-Tabor (1977)

10 Riemann ζ-function: the gap distribution of the zeros of the ζ function is given by GUE (Montgomery, 1973). Anderson Model (1958): V_ω random potential on R^d or Z^d, random Schrödinger operator H = −Δ + λV_ω; the eigenvalue statistics of H in the delocalization regime (λ small) are expected to be given by GOE. In the localization regime (λ large or near the spectral edge) Poisson statistics is proven. Universality conjecture (vague): Statistics of eigenvalues depends on the symmetry class but is otherwise independent of the details of the ensemble.

11 KNOWN RESULTS FOR UNITARY ENSEMBLES. Unitary ensemble: Hermitian matrices with density P(H)dH ∼ e^{−Tr V(H)} dH, invariant under H → UHU* for any unitary U (includes GUE). The joint density function of the eigenvalues is explicitly known:
p(λ_1, ..., λ_N) = const. ∏_{i<j} (λ_i − λ_j)^β e^{−Σ_j V(λ_j)};
classical ensembles β = 1, 2, 4 (orthogonal, unitary, symplectic symmetry classes; GOE, GUE, GSE for the Gaussian case). Correlation functions can be explicitly computed via orthogonal polynomials due to the Vandermonde determinant structure. Large N asymptotics of orthogonal polynomials ⟹ local eigenvalue statistics independent of V. But the density of e.v. depends on V.

12 Many previous results: Dyson (1962-76), Mehta (1960- ): classical Gaussian ensembles via Hermite polynomials. General case by Deift et al. (1999), Pastur-Shcherbina (2008), Bleher-Its (1999), Deift et al. (2000-, GOE and GSE), Lubinsky (2008)... All results are limited to invariant ensembles.

13 PREVIOUS METHODS FOR RANDOM MATRICES. Moment method: compute E Tr H^k. Semicircle law on the macroscopic scale [Wigner, 1955]. Suitable at the spectral edges [Sinai, Soshnikov 1999] for the Tracy-Widom law. Green Function Method: compute E Tr (z − H)^{−1}. The resolvent is unstable inside the spectrum; randomness stabilizes it. Explicit formulas + asymptotic analysis, e.g. orthogonal polynomial methods or Brezin-Hikami and Johansson's formulae for the hermitian case. Problems: Need explicit formulas. Universality???

14 (HERMITIAN) WIGNER ENSEMBLE. H = H* = (h_jk)_{1≤j,k≤N} is a hermitian N×N matrix, N ≫ 1, with h_kj = N^{−1/2}(x_kj + i y_kj), h_kk = √(2/N) x_kk, where x_kj, y_kj (1 ≤ k < j ≤ N) and x_kk (1 ≤ k ≤ N) are i.i.d. with E x_jk = 0 and E x_jk² = 1/2 (so E|h_jk|² = 1/N). No explicit formula for the eigenvalue distribution. Not unitarily invariant, unless V(x) = x², i.e. Gaussian. The symmetric Wigner ensemble is defined similarly. Our final result: Universality for (generalized) Wigner matrices

15 SEMICIRCLE LAW (WIGNER 1955). N[I] := number of eigenvalues in the interval I, ϱ_sc(E) := (1/2π)√((4 − E²)_+). For any δ > 0 and |E| ≤ 2,
lim_{η→0} lim_{N→∞} P( | N[E − η/2, E + η/2]/(Nη) − ϱ_sc(E) | ≥ δ ) = 0.
Remark 1: The semicircle law is independent of the distribution of the entries. Remark 2: Wigner's result concerns the macroscopic density, i.e. the density in intervals containing order N eigenvalues.
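A numerical illustration (my addition, not from the talk) of this statement for one sampled GUE matrix: the eigenvalue count in a window of width η around E, divided by Nη, approximates ϱ_sc(E); the window and the energy below are arbitrary illustrative choices.

import numpy as np

def rho_sc(E):
    return np.sqrt(np.maximum(4.0 - E**2, 0.0)) / (2.0 * np.pi)

rng = np.random.default_rng(2)
N = 2000
x = rng.normal(0.0, np.sqrt(0.5), (N, N))
y = rng.normal(0.0, np.sqrt(0.5), (N, N))
H = np.triu((x + 1j * y) / np.sqrt(N), 1)
H = H + H.conj().T + np.diag(np.sqrt(2.0 / N) * x.diagonal())
ev = np.linalg.eigvalsh(H)

E, eta = 0.5, 0.2                                   # a window around the energy E
count = np.sum(np.abs(ev - E) < eta / 2)
print(count / (N * eta), rho_sc(E))                 # the two numbers are close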

16 UNIVERSALITY IN THE BULK. Theorem [E-Yau-Yin, 2010]. Suppose that H is a hermitian or symmetric (generalized) Wigner matrix with Eh_ij = 0 and σ²_ij = E|h_ij|², with normalization Σ_j σ²_ij = 1, with (cN)^{−1} ≤ σ²_ij ≤ C/N, and subexponential decay. Then the correlation functions of the local statistics are given by those of GUE/GOE in the bulk, i.e. for |E| < 2, k ≥ 1 and any nice test function O : R^k → R we have (# is GUE or GOE):
lim_{b→0} lim_{N→∞} ∫_{E−b}^{E+b} (dE'/2b) ∫_{R^k} dα_1 ... dα_k O(α_1, ..., α_k) [1/ϱ_sc(E')^k] ( p^{(k)}_N − p^{(k)}_{#,N} )( E' + α_1/(N ϱ_sc(E')), ..., E' + α_k/(N ϱ_sc(E')) ) = 0.
[Earlier result for Wigner with σ²_ij = 1/N: E-Schlein-Yau, 2009]

17 DYSON BROWNIAN MOTION. The Gaussian convolution matrix (after rescaling) H_t = H_0 + √t V interpolates (Johansson [2001] for t > 0, hermitian case). It can also be obtained by evolving the matrix elements with an OU process:
∂_t u_t = L u_t, L = (1/2) d²/dx² − (x/2) d/dx, H_t = e^{−t/2} H_0 + (1 − e^{−t})^{1/2} V.
Denote the probability density of the eigenvalues by f_t(x) µ(dx).
GUE: µ = C e^{−N H(x)} dx, H(x) = Σ_{i=1}^N x_i²/2 − (1/N) Σ_{i≠j} log|x_j − x_i|
(PDE) ∂_t f_t = L f_t, −∫ f L f dµ = (1/2N) ∫ |∇f|² dµ =: D_µ(f),
L = (1/2N) Σ_{i=1}^N ∂_i² + Σ_{i=1}^N ( −x_i/2 + (1/N) Σ_{j≠i} 1/(x_i − x_j) ) ∂_i

18 (SDE) dx_i = dB_i/√N + ( −x_i/2 + (1/N) Σ_{j≠i} 1/(x_i − x_j) ) dt
Derivation of DBM (for the BM case, h_ij = N^{−1/2} B_ij, for simplicity):
∂λ_α/∂h_ik = ū_α(i) u_α(k), ∂u_α(i)/∂h_lk = Σ_{β≠α} [1/(λ_α − λ_β)] ū_β(l) u_α(k) u_β(i),
∂²λ_α/(∂h_ik ∂h_lj) = Σ_{β≠α} [1/(λ_α − λ_β)] u_β(l) ū_α(j) ū_β(i) u_α(k) + c.c.
dλ_α = (1/√N) Σ_{ik} ū_α(i) u_α(k) dB_ik + (1/N) Σ_{ik} Σ_{β≠α} [1/(λ_α − λ_β)] u_β(i) ū_α(k) ū_β(i) u_α(k) dt = dB_α/√N + (1/N) Σ_{β≠α} dt/(λ_α − λ_β),
where one can check that Σ_{ik} ū_α(i) u_α(k) dB_ik are independent white noises for different α's.
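Below is a rough Euler-Maruyama sketch (my addition, not from the talk) of the SDE above; the step size, number of steps, system size and initial configuration are arbitrary illustrative choices, and the per-step re-sorting is a crude way of keeping the labelled points ordered.

import numpy as np

def dbm_step(x, dt, rng):
    # dx_i = dB_i/sqrt(N) + ( -x_i/2 + (1/N) sum_{j != i} 1/(x_i - x_j) ) dt
    N = x.size
    diff = x[:, None] - x[None, :]
    np.fill_diagonal(diff, np.inf)                  # drop the j = i term
    drift = -0.5 * x + (1.0 / diff).sum(axis=1) / N
    noise = rng.normal(0.0, np.sqrt(dt / N), size=N)
    return np.sort(x + drift * dt + noise)

rng = np.random.default_rng(1)
x = np.sort(rng.normal(0.0, 1.0, 50))               # arbitrary initial points
for _ in range(5000):
    x = dbm_step(x, 1e-4, rng)
print(x.min(), x.max())   # the empirical law drifts toward the semicircle on [-2, 2]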

19 (SDE) dx_i = dB_i/√N + ( −x_i/2 + (1/N) Σ_{j≠i} 1/(x_i − x_j) ) dt
Idea: The equilibrium is the invariant ensemble (GUE, etc.) with known local statistics. For local statistics, only local equilibrium needs to be achieved, which is faster.
Bakry-Emery method: The speed of convergence to a measure µ = e^{−H}/Z is given by the lower bound on the Hessian of H:
∇²H ≥ K ⟹ S_µ(f_t) ≤ e^{−tK} S_µ(f_0), D_µ(√f_t) ≤ e^{−tK} D_µ(√f_0), and the LSI holds: K S_µ(f_t) ≤ D_µ(√f_t).
In our case K = O(1), so the time to global equilibrium is O(1). [Morally, Johansson's work is t = O(1), but with a different method and for a different reason]

20 KEY STEPS. Step 1. Good local semicircle law, including control near the edge. Method: system of self-consistent equations for the Green function; control the error by large deviation methods. Step 2. Universality for Gaussian divisible matrices with a small (of size N^{−ε}) Gaussian component. Method: modify the DBM to speed up its local relaxation, then show that the modification is irrelevant for statistics involving differences of eigenvalues. Step 3. Universality for arbitrary Wigner matrices. Method: remove the small Gaussian component of Step 2 by resolvent perturbation theory and moment matching.

21 DYSON CONJECTURE. Theorem [Strong local ergodicity of DBM] [E-Yau-Yin, 2010]. Suppose that H is a hermitian or symmetric Wigner matrix with Eh_ij = 0 and σ²_ij = E|h_ij|², with (cN)^{−1} ≤ σ²_ij ≤ C/N, Σ_j σ²_ij = 1, and subexponential decay. Let E ∈ [−2 + c, 2 − c] with some c > 0. Then for any ε > 0 and 0 < b < c/2, any integer n and any nice test function O : R^n → R we have
lim_{N→∞} sup_{t ≥ N^{−1+ε}} ∫_{E−b}^{E+b} (dE'/2b) ∫_{R^n} dα_1 ... dα_n O(α_1, ..., α_n) [1/ϱ(E')^n] ( p^{(n)}_{t,N} − p^{(n)}_{µ,N} )( E' + α_1/(Nϱ(E')), ..., E' + α_n/(Nϱ(E')) ) = 0,
where p^{(n)}_{t,N} are the correlation functions of the eigenvalues of the DBM flow. Dyson's conjecture [1962]: Local equilibrium is already attained after time t ∼ N^{−1+ε}.

22 HISTORICAL DETOUR. We have been subsequently improving the local semicircle law in several papers, focusing on shrinking the window. The following prototype result is optimal away from the edge. Local Semicircle Law [E-Schlein-Yau]. Let N[I] := #{λ_n ∈ I} be the number of eigenvalues in I ⊂ R. Then
P( | N[E − K/(2N), E + K/(2N)]/K − ϱ_sc(E) | ≥ δ ) ≤ C e^{−cδ√K}
for all |E| < 2 and K > 0, uniformly in N ≥ N_0(E, δ), and the eigenvectors are completely delocalized. Method: Green function + large deviation. The local semicircle law is always the starting point of any analysis.

23 Extended states: eigenvector delocalization. There is no concept of absolutely continuous spectrum, unlike for random Schrödinger operators. Let v = (v_1, v_2, ..., v_N) ∈ C^N be ℓ²-normalized. Complete localization: a few large components, e.g. v = (1, 0, 0, ..., 0) ⟹ ‖v‖_p = 1 for all 2 ≤ p ≤ ∞. Complete delocalization: components of the same size, e.g. v = (N^{−1/2}, N^{−1/2}, ...) ⟹ ‖v‖_p = N^{1/p − 1/2} for all 2 ≤ p ≤ ∞.
Theorem 2 [E-Schlein-Yau, 2008]. Assume E e^{|x_ij|^ε} < ∞. Fix |E| < 2, K > 0 and 2 < p < ∞; then for large M
P( ∃ v: ‖v‖_2 = 1, Hv = xv, |x − E| ≤ K/N, ‖v‖_p ≥ M N^{1/p − 1/2} ) ≤ C e^{−c√M}
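A quick empirical check (my addition, not from the talk) of complete delocalization for one sampled GUE matrix: the ℓ^∞ norms of all eigenvectors are of order N^{−1/2} up to logarithmic factors; the size N = 1000 is an arbitrary choice.

import numpy as np

rng = np.random.default_rng(3)
N = 1000
x = rng.normal(0.0, np.sqrt(0.5), (N, N))
y = rng.normal(0.0, np.sqrt(0.5), (N, N))
H = np.triu((x + 1j * y) / np.sqrt(N), 1)
H = H + H.conj().T + np.diag(np.sqrt(2.0 / N) * x.diagonal())

_, U = np.linalg.eigh(H)                  # columns of U are l^2-normalized eigenvectors
sup_norms = np.abs(U).max(axis=0)         # l^infinity norm of each eigenvector
print(np.sqrt(N) * sup_norms.max())       # stays O(1) up to logarithmic factors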

24 Level repulsion and Wegner estimate. Theorem 3 [E-Schlein-Yau, 2008]. Fix k and assume that the single-entry distribution ν is sufficiently smooth and decays. Let |E| < 2; then
P( N[E − ε/(2N), E + ε/(2N)] ≥ k ) ≤ C_k ε^{k²}, uniformly in ε and N.
i) Level repulsion for k = 2. [For Poisson eigenvalues it would be ε^k.]
ii) The exponent is optimal; it agrees with the GUE case: p(x_1, ..., x_N) ∼ ∏_{j<k}(x_j − x_k)² e^{−Σ_j x_j²} ⟹ P(N_ε ≥ k) ∼ ε^{k²}.
(The exponent in the symmetric case is different, but also optimal.)
Method: Green function + local semicircle law + integration by parts in probability space.

25 Universality for hermitian Wigner matrices. Johansson [2001], Ben Arous-Péché [2005] proved the sine kernel for hermitian Wigner matrices with a Gaussian component of the form H = H_0 + √t V ∈ C^{N×N}, H_0 Wigner, V GUE, t > 0. Key explicit formula [Brézin-Hikami 1996, Johansson 2001]: the two-point correlation function of H in terms of p_0 (the e.v. distribution of H_0) is
p^{(2)}(x_1, x_2) = ∫ det( K_{t,N}(x_i, x_j; y) )_{i,j=1}^2 p_0(y) dy, with
K_{t,N}( E, E + τ/N; y ) = ∮_γ dz ∮_Γ dw g_N(z, w; τ) e^{N(f_N(w) − f_N(z))},
f_N(z) = (1/2t)(z² − 2Ez) + (1/N) Σ_j log(z − y_j), g_N = ...
Remark: available only for the hermitian case.

26 To identify the saddle points at energy E, one needs to solve for z:
0 = f_N'(z) = (1/t)(z − E) + (1/N) Σ_j 1/(z − y_j),
where the y_j's are the eigenvalues of the Wigner matrix. Approximately,
0 = (1/t)(z − E) + (1/N) Σ_j 1/(z − y_j) ≈ (1/t)(z − E) + ∫ ϱ_sc(y) dy/(z − y),  (*)
and here Im z ∼ t. Thus (*) can be justified if the semicircle law is valid on scale O(t). Choosing t = N^{−1+ε}, the sine kernel is proved for matrices H = H_0 + √t V, t = N^{−1+ε}.

27 Theorem [E-Péché-Ramírez-Schlein-Yau, 2009]. The sine kernel holds for the hermitian Wigner ensemble if the distribution of the matrix elements has subexponential decay and some smoothness (no Gaussian component needed). Method: extend Johansson's result to a tiny Gaussian component, t = N^{−1+ε}, using the local semicircle law in the saddle point analysis of the contour integral. To remove this tiny Gaussian component, use the reverse heat flow: given a smooth distribution u(x)dx, choose
g_t = ( 1 − tL + (tL)²/2! − ... ) u;
then e^{tL} g_t will have a Gaussian component and will be close to u.

28 Method of the reverse heat flow. Let u_0 be the density of the matrix elements of H_0; then the matrix elements of H = H_0 + √t V have density u_t = e^{tL} u_0, L = (1/2) d²/dx². The probability density of all matrix elements, F_0 = u_0^{N²} at t = 0 and F_t = (e^{tL} u_0)^{N²}, satisfy
‖F_t − F_0‖ ≤ N² ‖e^{tL} u_0 − u_0‖ ≤ CN² t, t = N^{−1+ε}.
Key observation: compare F with e^{tL} G for some G that satisfies the local semicircle law; G may even depend on t. A simple way to find a better G: for smooth u_0, choose G = g_t^{N²}, g_t = (1 − tL) u_0; then ‖e^{tL} g_t − u_0‖ = O(t²).
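A small finite-difference check (my addition, not from the talk) of the last estimate, under the assumption of a uniform grid and an arbitrary smooth test density: with L = (1/2) d²/dx², applying the heat semigroup e^{tL} to g_t = (1 − tL)u returns u up to an error of order t².

import numpy as np

x = np.linspace(-10.0, 10.0, 2001)
dx = x[1] - x[0]
u = np.exp(-x**2) * (1.0 + 0.3 * np.sin(x))          # a smooth test density (illustrative)
u /= np.trapz(u, x)

def heat(f, t):
    # e^{tL} with L = (1/2) d^2/dx^2 is convolution with a centered Gaussian of variance t
    k = np.exp(-x**2 / (2.0 * t))
    k /= k.sum()
    return np.convolve(f, k, mode='same')

u_xx = np.gradient(np.gradient(u, dx), dx)           # second derivative of u
for t in [0.1, 0.05, 0.025]:
    g = u - 0.5 * t * u_xx                           # g_t = (1 - tL) u
    print(t, np.abs(heat(g, t) - u).max())           # error shrinks roughly like t^2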

29 Related results. Four-Moment Theorem [Tao-Vu 2009]: if the first four moments of the single entry distribution of two Wigner matrices match, then the local e.v. statistics coincide. Idea: expand the function x_i(x_jk, j = 1, ..., N; k = 1, ..., N) in a Taylor series in the x_jk (recall N^{−1/2} x_jk is the matrix element). Input: local semicircle law + eigenvector delocalization. Combined with Johansson's result, it yields universality for hermitian Wigner matrices under a third-moment and a support condition. Universality for the hermitian case with no condition apart from subexponential decay [ERSTVY, 2009] (combine the two methods). All these results used explicit hermitian formulas; the method is not robust. End of the historical detour: back to the new method.

30 KEY STEPS (recall). Step 1. Good local semicircle law, including control near the edge. Method: system of self-consistent equations for the Green function; control the error by large deviation methods. Step 2. Universality for Gaussian divisible matrices with a small (of size N^{−ε}) Gaussian component. Method: modify the DBM to speed up its local relaxation, then show that the modification is irrelevant for statistics involving differences of eigenvalues. Step 3. Universality for arbitrary Wigner matrices. Method: remove the small Gaussian component of Step 2 by resolvent perturbation theory and moment matching.

31 Step 1: Strong semicircle law for general Wigner matrices. Theorem [E-Yau-Yin, 2010]. Let Eh_ij = 0, E|h_ij|² = σ²_ij, where B := (σ²_ij) is doubly stochastic and c ≤ Nσ²_ij ≤ C. Assume
P( |h_ij| ≥ xσ_ij ) ≤ ϑ e^{−x^ϑ}, x > 0.
Then for any z = E + iη with |E| ≤ 5 and 1/N ≲ η ≤ 10, with very high probability:
max_i |G_ii(z) − m_sc(z)| ≲ 1/√(Nη), max_{i≠j} |G_ij(z)| ≲ 1/√(Nη), |m(z) − m_sc(z)| ≲ 1/(Nη),
where G_ij = ((H − z)^{−1})_{ij}, m(z) = N^{−1} Tr G = N^{−1} Σ_i G_ii, and ≲ hides log N corrections.
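A numerical sanity check (my addition, not from the talk): for one sampled GUE matrix, m(z) = N^{−1} Tr(H − z)^{−1} is already close to m_sc(z) at the small scale η = (log N)²/N. The closed-form branch below is valid for the chosen spectral parameter with Re z > 0, Im z > 0; the size and energy are illustrative.

import numpy as np

def m_sc(z):
    # Stieltjes transform of the semicircle law; this branch is correct for Re z > 0, Im z > 0
    return (-z + np.sqrt(z * z - 4 + 0j)) / 2

rng = np.random.default_rng(4)
N = 2000
x = rng.normal(0.0, np.sqrt(0.5), (N, N))
y = rng.normal(0.0, np.sqrt(0.5), (N, N))
H = np.triu((x + 1j * y) / np.sqrt(N), 1)
H = H + H.conj().T + np.diag(np.sqrt(2.0 / N) * x.diagonal())
ev = np.linalg.eigvalsh(H)

z = 0.3 + 1j * np.log(N)**2 / N           # eta = (log N)^2 / N, far below the macroscopic scale
m = np.mean(1.0 / (ev - z))               # m(z) = N^{-1} Tr (H - z)^{-1}
print(m, m_sc(z), abs(m - m_sc(z)))       # difference is small, of order 1/(N eta) up to logs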

32 Corollary 1 [Counting function]. Let n(E) := N^{−1} #{λ_j ≤ E} and n_sc(E) := ∫_{−2}^E ϱ_sc(x) dx. Then for any E, with very high probability |n(E) − n_sc(E)| ≲ 1/N. Corollary 2 [Eigenvalue locations]. Let γ_j be the classical location of the j-th eigenvalue, i.e. ∫_{−2}^{γ_j} ϱ_sc(x) dx = j/N. Then for any j ≤ N/2, with very high probability |λ_j − γ_j| ≲ N^{−2/3} j^{−1/3}. All these estimates are optimal (up to log corrections).
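A short sketch (my addition, not from the talk) comparing sampled GUE eigenvalues with the classical locations γ_j obtained by numerically inverting n_sc; the grid resolution and the size N = 1000 are illustrative choices.

import numpy as np

N = 1000
grid = np.linspace(-2.0, 2.0, 400001)
dgrid = grid[1] - grid[0]
cdf = np.cumsum(np.sqrt(4.0 - grid**2) / (2.0 * np.pi)) * dgrid   # n_sc on the grid
gamma = np.interp(np.arange(1, N + 1) / N, cdf, grid)             # gamma_j solves n_sc(gamma_j) = j/N

rng = np.random.default_rng(5)
x = rng.normal(0.0, np.sqrt(0.5), (N, N))
y = rng.normal(0.0, np.sqrt(0.5), (N, N))
H = np.triu((x + 1j * y) / np.sqrt(N), 1)
H = H + H.conj().T + np.diag(np.sqrt(2.0 / N) * x.diagonal())
ev = np.linalg.eigvalsh(H)

bulk = slice(N // 4, 3 * N // 4)
print(N * np.abs(ev - gamma)[bulk].max())     # O(log N) in the bulk: rigidity on scale ~ 1/N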

33 From Stieltjes transform to counting function. Let ϱ(λ)dλ be a signed measure on a compact interval in R, say [−3, 3], and suppose its Stieltjes transform satisfies
|Im m(x + iy)| ≤ U/(Ny), y > 0, |x| + y ≤ 10.
Then for any E_1, E_2 ∈ [−3, 3] we have
| ∫ f_{E_1,E_2,η}(λ) ϱ(λ) dλ | ≤ C U |log η| / N,
where f_{E_1,E_2,η} is a smoothed characteristic function of [E_1, E_2] on scale η.
Helffer-Sjöstrand formula: Let f ∈ C²(R) and let χ(y) be a cutoff function supported in [−1, 1]. Define the quasianalytic extension of f as \tilde f(x + iy) = (f(x) + iyf'(x))χ(y); then
f(λ) = (1/2π) ∫_{R²} ∂_{\bar z} \tilde f(x + iy) / (λ − x − iy) dx dy = (1/2π) ∫_{R²} [ iyf''(x)χ(y) + i(f(x) + iyf'(x))χ'(y) ] / (λ − x − iy) dx dy.

34 Detour: similar results for universal matrices. Theorem [E-Yau-Yin, 2010]. Let Eh_ij = 0, E|h_ij|² = σ²_ij, where B := (σ²_ij) is doubly stochastic. Let M := (max_ij σ²_ij)^{−1}. Assume P( |h_ij| ≥ xσ_ij ) ≤ ϑ e^{−x^ϑ}, x > 0. Then for any z = E + iη with Mη ≫ 1, |E| ≤ 2 − κ_0, κ_0 > 0, and any ε > 0, K > 0:
P( max_i |G_ii(z) − m_sc(z)| ≳ 1/√(Mη) ) ≤ C N^{−K},
P( max_{i≠j} |G_ij(z)| ≳ 1/√(Mη) ) ≤ C N^{−K},
P( |m(z) − m_sc(z)| ≥ N^ε/(Mη) ) ≤ C N^{−K}.
For E away from the edge, P( |n(E) − n_sc(E)| ≥ N^ε/M ) ≤ C N^{−K}.

35 Step 2: Local ergodicity of Dyson Brownian Motion. Assumption I [Convexity]. The Hamiltonian of the invariant measure µ ∼ e^{−NH(x)} on R^N has the representation
H(x) = β [ Σ_{i=1}^N U(x_i) − (1/N) Σ_{i<j} log|x_j − x_i| ], ∂_t f_t = L f_t, L = (1/2N) Δ − (1/2)(∇H)·∇.
Assumption II [Macroscopic limit of the density]. There exists a continuous, compactly supported density function ϱ(x) ≥ 0, ∫_R ϱ = 1, such that for any fixed a, b ∈ R,
lim_{N→∞} sup_{t ≥ 0} | ∫ (1/N) Σ_{j=1}^N 1(x_j ∈ [a, b]) f_t(x) dµ(x) − ∫_a^b ϱ(x) dx | = 0.

36 Assumption III [Eigenvalues are close to their classical locations]. Let γ_j denote the classical location of the j-th eigenvalue (given by ϱ). Suppose there is an ε > 0 such that
sup_{t ≥ N^{−2ε}} ∫ (1/N) Σ_{j=1}^N (x_j − γ_j)² f_t(x) µ(dx) ≤ C N^{−1−2ε}.
I.e., the typical distance between the random points x_j and their classical locations is less than N^{−1/2−ε}.
Assumption IV [Upper bound on the local density]. For any I ⊂ R, let N_I := Σ_{i=1}^N 1(x_i ∈ I) denote the number of points in I. For any compact subinterval I ⊂ {E : ϱ(E) > 0} with |I| ≥ N^{−1+σ},
sup_{τ ≥ N^{−2ε}} ∫ 1{ N_I ≥ KN|I| } f_τ dµ ≤ C_n K^{−n}, n = 1, 2, ...

37 Theorem [E-Schlein-Yau-Yin, 2009]. Suppose Assumption I [convexity] holds. Suppose that at time t_0 = N^{−2ε} the density f_{t_0} satisfies a bounded entropy condition, i.e., for some m,
S_µ(f_{t_0}) := ∫ f_{t_0} log f_{t_0} dµ ≤ CN^m,
and suppose that Assumptions II, III and IV hold for the solution of ∂_t f_t = L f_t. Let E ∈ R and b > 0 be such that min{ϱ(x) : x ∈ [E − b, E + b]} > 0. Then for any δ > 0, k ≥ 1 and any nice test function O : R^k → R we have
lim_{N→∞} sup_{t ≥ N^{−2ε+δ}} ∫_{E−b}^{E+b} (dE'/2b) ∫_{R^k} dα_1 ... dα_k O(α_1, ..., α_k) [1/ϱ(E')^k] ( p^{(k)}_{t,N} − p^{(k)}_{µ,N} )( E' + α_1/(Nϱ(E')), ..., E' + α_k/(Nϱ(E')) ) = 0,
where ε > 0 is the exponent from Assumption III. The method gives an effective error bound with a trade-off between ε and b; in particular b given by a negative power of N is allowed.

38 Local relaxation flow. We modify the dynamics by adding a fictitious term to H:
\tilde H = H + W, W = Σ_j (x_j − γ_j)²/(2R²),
where R ≪ 1 and the γ_j are the classical locations of the eigenvalues, defined by ∫_{−2}^{γ_j} ϱ_sc(x)dx = j/N. New equilibrium measure: ω ∼ e^{−N\tilde H}. We then have
⟨v, ∇²\tilde H(x) v⟩ ≥ (1/R²)‖v‖² + (1/N) Σ_{i<j} (v_i − v_j)²/(x_i − x_j)², v ∈ R^N,
i.e. the relaxation time is t ∼ R² (Bakry-Emery). We also keep the second term, coming from the Hessian of −log|x_i − x_j|.

39 Theorem 1 [Relaxation of the modified dynamics]. Suppose
⟨v, ∇²\tilde H(x) v⟩ ≥ (1/R²)‖v‖² + (1/N) Σ_{i<j} (v_i − v_j)²/(x_i − x_j)², v ∈ R^N.
Consider the forward equation ∂_t q_t = \tilde L q_t, t ≥ 0, with initial condition q_0 = q and with reversible measure ω. Then
∂_t D_ω(√q_t) ≤ −(1/R²) D_ω(√q_t) − (1/2N²) Σ_{i,j=1}^N ∫ (∂_i √q_t − ∂_j √q_t)²/(x_i − x_j)² dω,
(1/2N²) ∫_0^∞ ds Σ_{i,j=1}^N ∫ (∂_i √q_s − ∂_j √q_s)²/(x_i − x_j)² dω ≤ D_ω(√q),
and the logarithmic Sobolev inequality holds: S_ω(q) ≤ CR² D_ω(√q). Thus the time to equilibrium is of order R²: S_ω(q_t) ≤ e^{−Ct/R²} S_ω(q_0).

40 Theorem 2 [Gap distribution estimated by the Dirichlet form]. Let q ∈ L^∞ be a density, ∫ q dω = 1. Then for any τ > 0 we have
| (1/N) Σ_i ∫ G(N(x_i − x_{i+1})) q dω − (1/N) Σ_i ∫ G(N(x_i − x_{i+1})) dω | ≤ C ( D_ω(√q) τ / N )^{1/2} + C √(S_ω(q)) e^{−cτ/R²}.
A similar result holds for G_{i,m}(x). As a comparison, we show what entropy control alone can reach. Theorem: for any τ > 0 and |E| < 2,
| (1/N) Σ_i ∫ G(N(x_i − E)) [ q − 1 ] dω | ≤ C ( S_ω(q) [τ + e^{−cτ/R²}] )^{1/2} = CR √(S_ω(q)),
after optimizing over τ in the last step. ⟹ N problem!

41 We will use this for q = g_τ, where g_t dω = f_t dµ, i.e. g_t = f_t/ψ. Comparison of dynamics:
∂_t f_t = L f_t, f_t µ → µ slowly [true DBM];
∂_t q_t = \tilde L q_t, q_t ω → ω fast [approximation].
Initial state: f_0 = q_0 ψ. Write f_t = g_t ψ and compare f_t µ = g_t ω with q_t ω.
Theorem 3 [Comparison of dynamics]. Set τ = R² N^ε. Let g_t = f_t/ψ. Assume that S_ω(g_0) ≤ CN^m with some fixed m. Then the entropy and the Dirichlet form satisfy the estimates
S_ω(g_{τ/2}) ≤ C N R^{−2} Q, D_ω(√g_τ) ≤ C N R^{−4} Q.
Recall: Q := sup_{t ≥ 0} ∫ Σ_j (x_j − γ_j)² f_t dµ. Note: g_t does NOT evolve according to the \tilde L dynamics (then the estimate would be exponentially small).

42 Still, differentiating the entropy (similar to Boltzmann's ∂_t S = −D),
∂_t S_ω(g_t) = −(2/N) Σ_j ∫ (∂_j √g_t)² dω + Σ_j ∫ b_j ∂_j g_t dω, b_j = ∂_j W = (x_j − γ_j)/R².
From the Schwarz inequality,
∂_t S_ω(g_t) ≤ −D_ω(√g_t) + CN Σ_j ∫ b_j² g_t dω ≤ −D_ω(√g_t) + CNQ R^{−4}.  (*)
Together with the LSI, we have
∂_t S_ω(g_t) ≤ −CR^{−2} S_ω(g_t) + CNQ R^{−4},
which, after integrating from t = t_0 to τ/2, proves the bound on S_ω(g_{τ/2}), using that τ = R² N^ε and
S_ω(g_0) = S_µ(f_{t_0}) − log(Z/\tilde Z) + N ∫ W f_{t_0} dµ ≤ CN^m.
In fact, we have S_ω(g_s) ≤ CR² · NQR^{−4} = CNQR^{−2} for any s ≥ τ/2. The estimate for D is obtained by integrating (*) from τ/2 to τ.

43 Step 3: Green function comparison theorem. Theorem: Let H^{(v)} and H^{(w)} be two generalized N×N Wigner matrices, with matrix elements h_ij given by the random variables N^{−1/2} v_ij and N^{−1/2} w_ij, respectively, with v_ij and w_ij satisfying the uniform subexponential decay condition. Define interpolating matrices H_0 = H^{(v)}, H_1, ..., H_{N²} = H^{(w)}, i.e. order the matrix elements in some way and let H_γ be such that its matrix elements h_ij follow the v-distribution up to position γ and the w-distribution afterwards. Assume that for any y ≥ N^{−1+ε} and |E| < 2, the diagonal resolvent entries ( (H_γ − E − iy)^{−1} )_{kk} are bounded with high probability. Assume moment matching:
E v_ij^k = E w_ij^k, k = 1, 2, 3, and |E v_ij^4 − E w_ij^4| ≤ N^{−δ}.

44 Let F be a nice function and G^{(v)}(z) = (H^{(v)} − z)^{−1} the resolvent. Then
E F( ∏_{j=1}^{k_1} (1/N) Tr G^{(v)}(z_j^1), ..., ∏_{j=1}^{k_n} (1/N) Tr G^{(v)}(z_j^n) ) − E F( G^{(v)} replaced by G^{(w)} ) → 0
for any collection of spectral parameters z_j^m = E_j^m ± iη, with η ≥ N^{−1−ε} and |E_j^m| ≤ 2 − 2κ. A similar statement holds for matrix elements. Moral of the statement: if the four moments (almost) match, the local statistics are the same.

45 The four moment condition first appeared in Tao-Vu's Four Moment Theorem, stating that observables of the form
F( Nλ_{i_1}, Nλ_{i_2}, ..., Nλ_{i_k} ), i_1 < i_2 < ... < i_k,
have the same expectation under the two ensembles if the first four moments of the single entry distribution match. Their proof also used the local semicircle law and was quite involved due to resonances [they used level repulsion]. Our proof is a simple resolvent perturbation, exploiting the stability of the resolvent: |G(z)| ≤ 1/η (it never blows up).

46 Matching lemma. We have proved so far universality for Wigner matrices whose single entry distribution contains a small (of size N^{−ε}) Gaussian convolution. We also proved that local statistics are stable if the four moments (almost) match. To prove universality for an arbitrary single entry distribution, we need to approximate it by another one with a small Gaussian component so that the first four moments almost match. This is an elementary lemma. Let m_k(ξ) = Eξ^k be the k-th moment of a random variable ξ.

47 Matching lemma: Let m_3 and m_4 be two numbers such that m_4 − m_3² − 1 ≥ 0 and m_4 ≤ C_2. Let ξ^G be a Gaussian random variable with mean 0 and variance 1. Then for any sufficiently small γ > 0 there exists a real random variable ξ_γ with subexponential decay, independent of ξ^G, such that the first four moments of
ξ' = (1 − γ)^{1/2} ξ_γ + γ^{1/2} ξ^G
are m_1(ξ') = 0, m_2(ξ') = 1, m_3(ξ') = m_3, and |m_4(ξ') − m_4| ≤ Cγ. Note that m_4 − m_3² − 1 ≥ 0 holds for the moments of any random variable with m_1 = 0, m_2 = 1.
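The Monte Carlo check below (my addition, not from the talk) illustrates only the easy half of the moment arithmetic behind the lemma: mixing a standardized law ξ_γ with an independent standard Gaussian as ξ' = (1 − γ)^{1/2} ξ_γ + γ^{1/2} ξ^G moves the third and fourth moments only by O(γ). The two-point law used for ξ_γ is an arbitrary example, not the one constructed in the lemma.

import numpy as np

rng = np.random.default_rng(6)
n, gamma, p = 10**7, 0.05, 0.3
# a standardized two-point law with nonzero third moment (mean 0, variance 1)
vals = np.array([np.sqrt((1 - p) / p), -np.sqrt(p / (1 - p))])
xi_g = rng.choice(vals, size=n, p=[p, 1 - p])
xi_G = rng.normal(0.0, 1.0, n)                         # independent standard Gaussian
xi_new = np.sqrt(1 - gamma) * xi_g + np.sqrt(gamma) * xi_G

for k in (2, 3, 4):
    print(k, np.mean(xi_g**k), np.mean(xi_new**k))     # 3rd and 4th moments shift only by O(gamma)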

48 SUMMARY: KEY STEPS. Step 1. Good local semicircle law, including control near the edge. Method: system of self-consistent equations for the Green function; control the error by large deviation methods. Step 2. Universality for Gaussian divisible matrices with a small (of size N^{−ε}) Gaussian component. Method: modify the DBM to speed up its local relaxation, then show that the modification is irrelevant for statistics involving differences of eigenvalues. Step 3. Universality for arbitrary Wigner matrices. Method: remove the small Gaussian component of Step 2 by perturbation theory and moment matching.

49 OPEN PROBLEMS. 1. Extend universality to band matrices, W ≪ N. 2. Prove delocalization for band matrices, W ≪ N. 3. Extend the method to β-ensembles. Random band matrices: H is symmetric with independent but not identically distributed entries with mean zero and variance E N|h_kl|² = e^{−|k−l|/W}. Conjecture: narrow band, W ≪ √N ⟹ localization, Poisson statistics; broad band, W ≫ √N ⟹ delocalization, GOE statistics. Even the Gaussian case is completely open.
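A minimal sketch (my addition, not from the talk) of how one might sample a symmetric Gaussian band matrix with the variance profile written above; the sizes N = 1000, W = 20 and the symmetrization convention are illustrative choices.

import numpy as np

def band_matrix(N, W, rng):
    k = np.arange(N)
    var = np.exp(-np.abs(k[:, None] - k[None, :]) / W) / N   # E|h_kl|^2 = exp(-|k-l|/W)/N
    A = rng.normal(0.0, 1.0, (N, N)) * np.sqrt(var)
    return (A + A.T) / np.sqrt(2.0)   # symmetric; off-diagonal variances match the profile

H = band_matrix(1000, 20, np.random.default_rng(7))
ev = np.linalg.eigvalsh(H)
print(ev[0], ev[-1])                  # extreme eigenvalues of one sample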

50 Some proofs for Step 1: Local semicircle law. H^{(i)} denotes the (N−1)×(N−1) minor of H after removing the i-th column/row, and similarly for H^{(ij)} etc. Let a^i be the i-th column of H. The Green functions are G = (H − z)^{−1}, G^{(i)} = (H^{(i)} − z)^{−1}. Then from the Schur decomposition we have
G_ii = ( (H − z)^{−1} )_{ii} = 1 / ( h_ii − z − a^i·G^{(i)}a^i ).
With the notation K_ij = h_ij − zδ_ij − a^i·G^{(ij)}a^j we also have G_ij = −G_jj G^{(j)}_ii K_ij, and a different set of formulas (i, j, k distinct):
G_ii = G^{(j)}_ii + G_ij G_ji / G_jj, G_ij = G^{(k)}_ij + G_ik G_kj / G_kk.
Rule of thumb: G_ii ∼ O(1), G_ij is small.

51 Crudest local semicircle law (for σ²_ij = 1/N). Write
G_ii = 1 / ( h_ii − z − E_i a^i·G^{(i)}a^i − Z_i ), with Z_i := a^i·G^{(i)}a^i − E_i a^i·G^{(i)}a^i.
By a simple computation, E_i a^i·G^{(i)}a^i = (1/N) Tr G^{(i)} = ((N−1)/N) m^{(i)}(z), and by the interlacing property of the eigenvalues of H and H^{(i)},
|m(z) − m^{(i)}(z)| ≲ 1/(Nη), η = Im z.
Putting it all together,
G_ii = 1 / ( −z − m(z) + Ω_i ), Ω_i = h_ii − Z_i + O(1/(Nη)).

52 G_ii = 1/( −z − m(z) + Ω_i ), Ω_i = h_ii − Z_i + O(1/(Nη)).
Expand and sum up; with Ω := max_i |Ω_i|,
m(z) = 1/( −z − m(z) ) + O(Ω).
By the stability analysis of this equation, and m_sc = 1/( −z − m_sc ),
|m − m_sc| ≤ CΩ / √(Ω + κ), κ := | |E| − 2 |.
By the large deviation bound on Z_i, we have Ω ≲ (Nη)^{−1/2}, thus
|m − m_sc| ≲ min( 1/√(Nηκ), 1/(Nη)^{1/4} ).
It can be improved in various directions.
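A tiny illustration (my addition, not from the talk) of the stability of this self-consistent equation: iterating m ↦ 1/(−z − m) in the upper half plane converges to m_sc(z). The spectral parameter below is an arbitrary choice with Re z > 0, so the closed-form branch shown is the right one.

import numpy as np

z = 0.5 + 0.5j                      # spectral parameter in the upper half plane, Re z > 0
m = 0.0 + 0.0j
for _ in range(100):
    m = 1.0 / (-z - m)              # the self-consistent map; it preserves Im m > 0
m_closed = (-z + np.sqrt(z * z - 4)) / 2   # closed-form root with Im > 0 for this z
print(m, m_closed)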

53 Some large deviation results. Let a_i (1 ≤ i ≤ N) be independent complex random variables with mean zero, variance σ², and uniform subexponential decay P(|a_i| ≥ xσ) ≤ ϑ exp(−x^ϑ), x ≥ 1, with some ϑ > 0. Let A_i, B_ij ∈ C (1 ≤ i, j ≤ N). Then for any ξ > 1 there exists a constant L_0, depending on ϑ and ξ, such that
P( | Σ_{i=1}^N a_i A_i | ≥ (log N)^{L_0} σ ( Σ_i |A_i|² )^{1/2} ) ≤ C exp[ −(log N)^ξ ],
P( | Σ_{i=1}^N \bar a_i B_ii a_i − Σ_{i=1}^N σ² B_ii | ≥ (log N)^{L_0} σ² ( Σ_i |B_ii|² )^{1/2} ) ≤ C exp[ −(log N)^ξ ],
P( | Σ_{i≠j} \bar a_i B_ij a_j | ≥ (log N)^{L_0} σ² ( Σ_{i≠j} |B_ij|² )^{1/2} ) ≤ C exp[ −(log N)^ξ ].

54 For example,
|Z_i|² ≲ Σ_{j,k≠i} σ_ij |G^{(i)}_jk|² σ_ki ≲ (1/N²) Σ_{j,k≠i} |G^{(i)}_jk|² = (1/N²) Σ_{j≠i} ( |G^{(i)}|² )_jj = (1/(N²η)) Σ_{j≠i} Im G^{(i)}_jj ≤ C/(Nη),
using Σ_k |G_jk|² = Σ_k G_jk G^*_kj = (GG^*)_jj = Im G_jj / η, and similarly
|Z_ij| = |a^i·G^{(ij)}a^j| ≲ 1/√(Nη).

55 Improved local semicircle law, I: bound on matrix elements. Self-consistent equation for the diagonal elements:
G_ii = 1 / ( −z − Σ_j σ²_ij G_jj + Y_i ), Y_i := h_ii + A_i − Z_i,
A_i := σ²_ii G_ii + Σ_j σ²_ij G_ji G_ij / G_ii, Z_i := a^i·G^{(i)}a^i − E_i a^i·G^{(i)}a^i.
Further key quantities:
Λ_d := max_k |G_kk − m_sc|, Λ_o := max_{j≠k} |G_jk|, Λ := |m − m_sc|
(they all depend on z = E + iη, but this is suppressed in the notation).

56 Dichotomy lemma. Away from the spectral edge (for simplicity): IF Λ_o + Λ_d ≤ (log N)^{−2}, THEN Λ_o + Λ_d ≲ 1/√(Nη) with high probability.
Proof. For the off-diagonal part, use
|G_ij| = |G_ii G^{(i)}_jj K_ij| ≤ C( |h_ij| + |Z_ij| ) ≲ 1/√(Nη).
For the diagonal part, we will use the self-consistent equation and
|Y_i| ≤ |h_ii| + |A_i| + |Z_i| ≲ 1/√N + ( 1/N + C/(Nη) ) + 1/√(Nη) ≲ 1/√(Nη).

57 Expansion of the self-consistent equation. With the notation v_i := G_ii − m_sc, we have
v_i = 1/( −z − m_sc − Σ_j σ²_ij v_j + Y_i ) − m_sc.
We know that |z + m_sc| ≥ c, and by the assumption Λ_o + Λ_d ≤ (log N)^{−2} we can expand:
v_i = m²_sc ( Σ_j σ²_ij v_j − Y_i ) + O( ( Σ_j σ²_ij v_j − Y_i )² ),
using m_sc + 1/(z + m_sc) = 0. (A more precise expansion is possible; this is the key to further improvements.) Introduce
\bar v := (1/N) Σ_i v_i, \bar Y := (1/N) Σ_i Y_i, Y := max_i |Y_i|, ζ := m²_sc.
Then \bar v = ζ \bar v − ζ \bar Y + O(Λ²_d + Y²).

58 From the previous slide,
\bar v = −[ζ/(1 − ζ)] \bar Y + O( [1/|1 − ζ|](Λ²_d + Y²) ) = O( [1/|1 − ζ|](Λ²_d + Y) ),
where the second bound is crude (another key to improvement). Let B_ij = σ²_ij be the covariance matrix, and let Q = I − |e⟩⟨e| be the projection onto the orthogonal complement of the trivial eigenvector e = N^{−1/2}(1, ..., 1). Then
v_i − \bar v = ζ Σ_j B_ij (v_j − \bar v) − ζ(Y_i − \bar Y) + O(Λ²_d + Y²).
Both sides sum to zero, so the equation lives on the range of Q and we can invert 1 − ζB there:
v_i − \bar v = −Σ_j [ ζQ(1 − ζB)^{−1} ]_ij (Y_j − \bar Y) + O( ‖ζQ(1 − ζB)^{−1}‖ (Λ²_d + Y²) ) = O( ‖ζQ(1 − ζB)^{−1}‖ (Λ²_d + Y) ) (crude),
hence |v_i| ≤ ( ‖ζQ(1 − ζB)^{−1}‖ + |ζ/(1 − ζ)| ) · O(Λ²_d + Y).

59 |v_i| ≤ ( ‖ζQ(1 − ζB)^{−1}‖ + |ζ/(1 − ζ)| ) · O(Λ²_d + Y).
Here use that Spec(B) ⊂ [−1 + δ_−, 1 − δ_+] ∪ {1}, δ_± > 0; thus
‖ζQ(1 − ζB)^{−1}‖ ≤ Σ_{k≤n} |ζ|^{k+1} ‖QB^k‖ + N Σ_{k>n} |ζ|^{k+1} ‖QB^k‖_2 ≲ n + N Σ_{k>n} (1 − δ_±)^k ≤ C (log N)/δ_±,
by choosing n ∼ (log N)/δ_±. Neglecting the edge deterioration (i.e. that |1 − ζ| = |1 − m²_sc| ∼ √(κ + η)), we get the quadratic dichotomy:
Λ_d ≤ C(Λ²_d + Y)(log N).
If Λ_d ≪ 1/(log N), then Λ_d ≤ C(log N) Y ≲ (log N)/√(Nη).

60 Continuity argument. How do we guarantee Λ_d ≪ 1/(log N)? Set R(z) := (log N)^{−2}, S(z) := (log N)^C/√(Nη); note S(z) < R(z). We have proved (with high probability) that
Λ_o(z) + Λ_d(z) ≤ R(z) ⟹ Λ_o(z) + Λ_d(z) ≤ S(z). (*)
For η = Im z = 10 (say), one has Λ_o(z) + Λ_d(z) ≤ 3/10, so the expansion is possible (since |z + m| > 3/10) and one gets that the conclusion of (*) holds at η = 10. Now fix E and reduce η; note that η ↦ Λ_o(z) + Λ_d(z) is continuous, so Λ_o(z) + Λ_d(z) ≤ S(z) continues to hold as long as S(z) < R(z).

61 Improved local semicircle law, II. The key to improving the previous argument is to exploit a CLT-like cancellation in \bar Y = (1/N) Σ_i Y_i, i.e., mainly in (1/N) Σ_i Z_i. Recall that Z_i = a^i·G^{(i)}a^i − E_i a^i·G^{(i)}a^i. Here Z_i and Z_j are not independent, but almost. Lemma:
E | (1/N) Σ_i Z_i |^p ≤ (Cp)^p E[ Λ_o^{2p} + N^{−p} ],
or, after plugging in the size Λ_o ≲ (Nη)^{−1/2}:
E | (1/N) Σ_i Z_i |^p ≤ (Cp)^p (Nη)^{−p} ⟹ | (1/N) Σ_i Z_i | ≲ 1/(Nη).

62 Sketch of the sketch of the proof, for p = 2:
E | (1/N) Σ_i Z_i |² = (1/N²) Σ_{i≠j} E Z_i \bar Z_j + (1/N²) Σ_i E|Z_i|².
We look only at the first term, which (by symmetry) is E Z_1 \bar Z_2:
E Z_1 \bar Z_2 = E [ a^1·G^{(1)}a^1 − E_1 a^1·G^{(1)}a^1 ][ conj( a^2·G^{(2)}a^2 − E_2 a^2·G^{(2)}a^2 ) ].
Make the main terms independent by
G^{(1)}_kl = G^{(12)}_kl + G^{(1)}_{k2} G^{(1)}_{2l} / G^{(1)}_{22} =: P^{(12)}_kl + P^{(1)}_kl
(superscripts indicate independence of the given rows/columns). Then
E Z_1 \bar Z_2 = E [ Ê_1 a^1·P^{(12)}a^1 + Ê_1 a^1·P^{(1)}a^1 ][ conj( Ê_2 a^2·P^{(12)}a^2 + Ê_2 a^2·P^{(2)}a^2 ) ],
where Ê_i := I − E_i, i.e. acting on a random variable X we have Ê_i X = X − E_i X.

63 So
E Z_1 \bar Z_2 = E [ Ê_1 a^1·P^{(12)}a^1 + Ê_1 a^1·P^{(1)}a^1 ][ conj( Ê_2 a^2·P^{(12)}a^2 + Ê_2 a^2·P^{(2)}a^2 ) ], with Ê_i := I − E_i.
Note that if A^{(i)} is independent of the i-th column and Y is arbitrary, then Ê_i A^{(i)} = 0 and E_i[ Y A^{(i)} ] = A^{(i)} E_i Y; in particular
E ( Ê_1 X^{(2)} )( Ê_2 X^{(1)} ) = E [ (E_1 Ê_1 X^{(2)}) · Ê_2 X^{(1)} ] = 0 (with Y := X^{(2)}, A^{(1)} := Ê_2 X^{(1)}), since E_1 Ê_1 = 0. Thus
E Z_1 \bar Z_2 = E [ Ê_1 a^1·P^{(1)}a^1 ][ conj( Ê_2 a^2·P^{(2)}a^2 ) ],
and E_1 | a^1·P^{(1)}a^1 |² = Σ_{kl} |P^{(1)}_kl|² σ²_k σ²_l ≲ Λ_o⁴.

64 Some proofs for Step 2: Bakry-Emery argument. A simple calculation shows
∂_t S(f_t) = ∫ (L f_t) log f_t dµ + ∫ L f_t dµ = −(1/2) ∫ |∇f_t|²/f_t dµ = −4 D(√f_t),   (1)
using ∫ L f_t dµ = 0. Now we compute ∂_t D(√f_t). Let h := √f for simplicity; then
∂_t h_t = (1/(2h_t)) ∂_t h_t² = (1/(2h_t)) L h_t² = L h_t + (1/(2h_t)) |∇h_t|².
In the last step we used that L h² = |∇h|² + 2hLh, which can be seen directly from L = (1/2)Δ − (1/2)(∇H)·∇.

65 We compute (dropping the t subscript for brevity; ∫ means ∫ ... dµ):
∂_t D(√f_t) = (1/2) ∂_t ∫ |∇h|² = ∫ (∇h)·(∇ ∂_t h) = ∫ (∇h)·(∇Lh) + ∫ (∇h)·∇( |∇h|²/(2h) )
= ∫ (∇h)·L(∇h) + ∫ (∇h)·[∇, L]h + Σ_ij ∫ [ (∂_ih)(∂_jh)(∂_i∂_jh)/h − (∂_ih)²(∂_jh)²/(2h²) ]
= −(1/2) ∫ (∇h)·(∇²H)(∇h) − (1/2) Σ_ij ∫ ( ∂_i∂_j h − (∂_ih)(∂_jh)/h )²,   (2)
where we used the commutator [∇, L] = −(1/2)(∇²H)∇.

66 Local relaxation flow. We modify the dynamics by adding a fictitious term to H:
\tilde H = H + W, W = Σ_j (x_j − γ_j)²/(2R²),
where R ≪ 1 and the γ_j are the classical locations of the eigenvalues, defined by ∫_{−2}^{γ_j} ϱ_sc(x)dx = j/N. New equilibrium measure: ω ∼ e^{−N\tilde H}. We then have
⟨v, ∇²\tilde H(x) v⟩ ≥ (1/R²)‖v‖² + (1/N) Σ_{i<j} (v_i − v_j)²/(x_i − x_j)², v ∈ R^N,
i.e. the relaxation time is t ∼ R² (Bakry-Emery). We also keep the second term, coming from the Hessian of −log|x_i − x_j|.

67 Theorem 1 [Relaxation of the modified dynamics]. Suppose
⟨v, ∇²\tilde H(x) v⟩ ≥ (1/R²)‖v‖² + (1/N) Σ_{i<j} (v_i − v_j)²/(x_i − x_j)², v ∈ R^N.
Consider the forward equation ∂_t q_t = \tilde L q_t, t ≥ 0, with initial condition q_0 = q and with reversible measure ω. Then
∂_t D_ω(√q_t) ≤ −(1/R²) D_ω(√q_t) − (1/2N²) Σ_{i,j=1}^N ∫ (∂_i √q_t − ∂_j √q_t)²/(x_i − x_j)² dω,
(1/2N²) ∫_0^∞ ds Σ_{i,j=1}^N ∫ (∂_i √q_s − ∂_j √q_s)²/(x_i − x_j)² dω ≤ D_ω(√q),
and the logarithmic Sobolev inequality holds: S_ω(q) ≤ CR² D_ω(√q). Thus the time to equilibrium is of order R²: S_ω(q_t) ≤ e^{−Ct/R²} S_ω(q_0).

68 Theorem [Reformulation]. Let
Q := sup_{t ≥ t_0} ∫ Σ_j (x_j − γ_j)² f_t dµ, t_0 = N^{−2ε},
and assume that Q ≤ CN^m with some exponent m. Fix n and an array of positive integers m = (m_1, m_2, ..., m_n) ∈ N_+^n. Let G : R^n → R be a nice function and define
G_{i,m}(x) := G( N(x_i − x_{i+m_1}), N(x_{i+m_1} − x_{i+m_2}), ..., N(x_{i+m_{n−1}} − x_{i+m_n}) ).
Then for any sufficiently small ε > 0 and for τ ≥ 3t_0 we have
| (1/N) Σ_i ∫ G_{i,m}(x) f_τ dµ − (1/N) Σ_i ∫ G_{i,m}(x) dµ | ≤ C N^ε √(Q/τ) + C e^{−cN^ε}.
Since Q ≲ N^{−2ε} in applications, the rhs is small if τ ≫ N^{−2ε}. From this theorem and the upper bound one can easily get the limit of correlation functions. Note that the observable G depends on differences.

69 Theorem 2 [Gap distribution estimated by the Dirichlet form]. Let q ∈ L^∞ be a density, ∫ q dω = 1. Then for any τ > 0 we have
| (1/N) Σ_i ∫ G(N(x_i − x_{i+1})) q dω − (1/N) Σ_i ∫ G(N(x_i − x_{i+1})) dω | ≤ C ( D_ω(√q) τ / N )^{1/2} + C √(S_ω(q)) e^{−cτ/R²}.
A similar result holds for G_{i,m}(x). As a comparison, we show what entropy control alone can reach. Theorem: for any τ > 0 and |E| < 2,
| (1/N) Σ_i ∫ G(N(x_i − E)) [ q − 1 ] dω | ≤ C ( S_ω(q) [τ + e^{−cτ/R²}] )^{1/2} = CR √(S_ω(q)),
after optimizing over τ in the last step. ⟹ N problem!

70 Proof. Run the \tilde L flow with q as initial condition. Split
(1/N) Σ_i ∫ G(N(x_i − x_{i+1})) [ q − 1 ] dω = (1/N) Σ_i ∫ G(...) [ (q − q_τ) + (q_τ − 1) ] dω.
For the second term use the entropy:
| (1/N) Σ_i ∫ G(...) (q_τ − 1) dω | ≤ C √(S_ω(q_τ)) ≤ C √(S_ω(q)) e^{−cτ/R²}.
For the first term, integrate its time derivative:
(1/N) Σ_i ∫ G(...)(q − q_τ) dω = −∫_0^τ ds (1/N) Σ_i ∫ G(...) ∂_s q_s dω
= ∫_0^τ ds (1/(2N²)) ∫ Σ_j (∂_j q_s) ∂_j [ Σ_i G(N(x_i − x_{i+1})) ] dω
= ∫_0^τ ds (1/(2N)) Σ_i ∫ (∂_i q_s − ∂_{i+1} q_s) G'(N(x_i − x_{i+1})) dω
= ∫_0^τ ds (1/N) Σ_i ∫ (∂_i √q_s − ∂_{i+1} √q_s) √q_s G'(N(x_i − x_{i+1})) dω.

71 After a Schwarz inequality,
| (1/N) Σ_i ∫ G(N(x_i − x_{i+1})) [ q − q_τ ] dω | = | ∫_0^τ ds (1/N) Σ_i ∫ (∂_i √q_s − ∂_{i+1} √q_s) √q_s G'(N(x_i − x_{i+1})) dω |
≤ [ ∫_0^τ ds Σ_i ∫ (∂_i √q_s − ∂_{i+1} √q_s)² / (N²(x_i − x_{i+1})²) dω ]^{1/2} [ ∫_0^τ ds Σ_i ∫ G'(N(x_i − x_{i+1}))² (x_i − x_{i+1})² q_s dω ]^{1/2}
≤ [ ∫_0^τ ds Σ_{i,j} ∫ (∂_i √q_s − ∂_j √q_s)² / (N²(x_i − x_j)²) dω ]^{1/2} · C √(τ/N).
The first factor is estimated by D_ω(√q), using the additional term in the Bakry-Emery bound.

72 We will use this for q = g_τ, where g_t dω = f_t dµ, i.e. g_t = f_t/ψ. Comparison of dynamics:
∂_t f_t = L f_t, f_t µ → µ slowly [true DBM];
∂_t q_t = \tilde L q_t, q_t ω → ω fast [approximation].
Compare f_t µ = g_t ω with ω for t ≥ 3t_0.
Theorem 3 [Comparison of dynamics]. Set τ = R² N^ε ≥ 3t_0. Let g_t = f_t/ψ. Assume that S_µ(f_{t_0}) ≤ CN^m with some fixed m. Then the entropy and the Dirichlet form satisfy the estimates
S_ω(g_{τ/2}) ≤ C N R^{−2} Q, D_ω(√g_τ) ≤ C N R^{−4} Q.
Recall: Q := sup_{t ≥ t_0} ∫ Σ_j (x_j − γ_j)² f_t dµ. Note: g_t does NOT evolve according to the \tilde L dynamics (then the estimate would be exponentially small).

73 Still, differentiating the entropy (similar to Boltzmann's ∂_t S = −D),
∂_t S_ω(g_t) = −(2/N) Σ_j ∫ (∂_j √g_t)² dω + Σ_j ∫ b_j ∂_j g_t dω, b_j = ∂_j W = (x_j − γ_j)/R².
From the Schwarz inequality,
∂_t S_ω(g_t) ≤ −D_ω(√g_t) + CN Σ_j ∫ b_j² g_t dω ≤ −D_ω(√g_t) + CNQ R^{−4}.  (*)
Together with the LSI, we have
∂_t S_ω(g_t) ≤ −CR^{−2} S_ω(g_t) + CNQ R^{−4},
which, after integrating from t = t_0 to τ/2, proves the bound on S_ω(g_{τ/2}), using that τ = R² N^ε and
S_ω(g_0) = S_µ(f_{t_0}) − log(Z/\tilde Z) + N ∫ W f_{t_0} dµ ≤ CN^m.
In fact, we have S_ω(g_s) ≤ CR² · NQR^{−4} = CNQR^{−2} for any s ≥ τ/2. The estimate for D is obtained by integrating (*) from τ/2 to τ.

74 Completion of the proof of Step 2. Theorem [Recall]:
Q := sup_{t ≥ t_0} ∫ Σ_j (x_j − γ_j)² f_t dµ, t_0 = N^{−2ε},
G_{i,m}(x) := G( N(x_i − x_{i+m_1}), N(x_{i+m_1} − x_{i+m_2}), ..., N(x_{i+m_{n−1}} − x_{i+m_n}) ).
Then for any sufficiently small ε > 0 and for τ ≥ 3t_0 we have
| (1/N) Σ_i ∫ G_{i,m}(x) f_τ dµ − (1/N) Σ_i ∫ G_{i,m}(x) dµ | ≤ C N^ε √(Q/τ) + C e^{−cN^ε}.
Proof: Recall that t_0 = N^{−2ε}, τ = R² N^ε. Choose q = g_τ = f_τ/ψ in Theorem 2. Then Theorems 2 and 3 give
| (1/N) Σ_i ∫ G_{i,m}(x) f_τ dµ − (1/N) Σ_i ∫ G_{i,m}(x) dω | ≤ C N^ε √(Q/τ) + C e^{−cN^ε}.
Use the same bound for f_0 = 1 (so f_t = 1) to compare f_τ µ with µ.

75 Step 3: Green function comparison theorem. Theorem: Let H^{(v)} and H^{(w)} be two generalized N×N Wigner matrices, with matrix elements h_ij given by the random variables N^{−1/2} v_ij and N^{−1/2} w_ij, respectively, with v_ij and w_ij satisfying the uniform subexponential decay condition. Define interpolating matrices H_0 = H^{(v)}, H_1, ..., H_{N²} = H^{(w)}, i.e. order the matrix elements in some way and let H_γ be such that its matrix elements h_ij follow the v-distribution up to position γ and the w-distribution afterwards. Assume that for any y ≥ N^{−1+ε} and |E| < 2, the diagonal resolvent entries ( (H_γ − E − iy)^{−1} )_{kk} are bounded with high probability. Assume moment matching:
E v_ij^k = E w_ij^k, k = 1, 2, 3, and |E v_ij^4 − E w_ij^4| ≤ N^{−δ}.

76 Let F be a nice function and G^{(v)}(z) = (H^{(v)} − z)^{−1} the resolvent. Then
E F( ∏_{j=1}^{k_1} (1/N) Tr G^{(v)}(z_j^1), ..., ∏_{j=1}^{k_n} (1/N) Tr G^{(v)}(z_j^n) ) − E F( G^{(v)} replaced by G^{(w)} ) → 0
for any collection of spectral parameters z_j^m = E_j^m ± iη, with η ≥ N^{−1−ε} and |E_j^m| ≤ 2 − 2κ. A similar statement holds for matrix elements. Moral of the statement: if the four moments (almost) match, the local statistics are the same.

77 Sketch of the proof of the comparison theorem.
E F( (1/N) Tr 1/(H^{(v)} − z) ) − E F( (1/N) Tr 1/(H^{(w)} − z) ) = Σ_{γ=1}^{γ(N)} [ E F( (1/N) Tr 1/(H_{γ−1} − z) ) − E F( (1/N) Tr 1/(H_γ − z) ) ].
H_{γ−1} and H_γ differ only in the (i, j) and (j, i) matrix elements:
H_{γ−1} = Q + N^{−1/2} V, V := v_ij e_i e_j^* + v_ji e_j e_i^*,
H_γ = Q + N^{−1/2} W, W := w_ij e_i e_j^* + w_ji e_j e_i^*, Q_ij = Q_ji = 0.
Define the Green functions R = (Q − z)^{−1}, S = (H_γ − z)^{−1}.

78 Step 1: We compare S_kk = ( (H_γ − z)^{−1} )_kk with R_kk = ( (Q − z)^{−1} )_kk by resolvent perturbation:
R = S + N^{−1/2} S V S + N^{−1} (SV)² S + ... + N^{−9/2} (SV)⁹ S + N^{−5} (SV)^{10} R.
Note that off-diagonal elements can be estimated by diagonal ones, morally |G_kl| ≤ max_k |G_kk|. V has at most two nonzero elements, so
(SVS)_kl = S_ki v_ij S_jl + S_kj v_ji S_il.
In general, ((SV)^m S)_kk is a finite sum of products of entries S_kl and v_ij. For the last term, use the trivial bound |R_ij| ≤ 1/η.

79 Step 2. Compare the resolvents of H_{γ−1} and H_γ with the resolvent R = (Q − z)^{−1}:
S = R − N^{−1/2} RVR + N^{−1} (RV)² R − N^{−3/2} (RV)³ R + N^{−2} (RV)⁴ R − N^{−5/2} (RV)⁵ S,
so we can write
(1/N) Tr S = \hat R + ξ, ξ = Σ_{m=1}^4 N^{−m/2} \hat R^{(m)} + N^{−5/2} Ω,
\hat R = (1/N) Tr R, \hat R^{(m)} = (−1)^m (1/N) Tr (RV)^m R, Ω = −(1/N) Tr (RV)⁵ S.
Each of these is a sum of a few terms, e.g.
\hat R^{(2)} = (1/N) Σ_k [ R_ki v_ij R_jj v_ji R_ik + R_ki v_ij R_ji v_ij R_jk + R_kj v_ji R_ii v_ij R_jk + R_kj v_ji R_ij v_ji R_ik ]  ((i, j) fixed!),
and similar formulas hold for the other terms.
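A quick numerical check (my addition, not from the talk) of the finite resolvent expansion above, with an arbitrary Hermitian Q and a perturbation V supported on the (0,1), (1,0) entries; the size N = 200 and the spectral parameter are illustrative choices.

import numpy as np

rng = np.random.default_rng(8)
N = 200
A = rng.normal(0.0, 1.0, (N, N)) + 1j * rng.normal(0.0, 1.0, (N, N))
Q = (A + A.conj().T) / np.sqrt(2.0 * N)      # a Hermitian "unperturbed" matrix
V = np.zeros((N, N), dtype=complex)          # perturbation living on two entries only
V[0, 1], V[1, 0] = 1.0 + 1.0j, 1.0 - 1.0j

z = 0.3 + 0.05j
I = np.eye(N)
R = np.linalg.inv(Q - z * I)
S = np.linalg.inv(Q + V / np.sqrt(N) - z * I)

approx, term = R.copy(), R.copy()
for m in range(1, 5):
    term = -term @ V @ R / np.sqrt(N)        # (-1)^m N^{-m/2} (RV)^m R, built recursively
    approx += term
    print(m, np.abs(approx - S).max())       # error drops by roughly N^{-1/2} per extra order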

80 Taylor expand:
E F( (1/N) Tr 1/(H_γ − z) ) = E F( \hat R + ξ ) = E [ F(\hat R) + F'(\hat R)ξ + (1/2)F''(\hat R)ξ² + ... + (1/5!)F^{(5)}(\hat R + ξ')ξ⁵ ] = Σ_{m=0}^5 N^{−m/2} E A^{(m)},
where ξ' ∈ [0, ξ]; the A^{(m)}'s are defined as
A^{(0)} = F(\hat R), A^{(1)} = F'(\hat R) \hat R^{(1)}, A^{(2)} = (1/2) F''(\hat R)(\hat R^{(1)})² + F'(\hat R) \hat R^{(2)}, etc.
The expectation values of the terms A^{(m)}, m ≤ 4, with respect to v_ij are determined by the first four moments of v_ij; for example,
E A^{(2)} = F'(\hat R) [ (1/N) Σ_k R_ki R_jj R_ik + ... ] E|v_ij|² + (1/2) F''(\hat R) [ (1/N²) Σ_{k,l} R_ki R_jk R_lj R_il + ... ] E|v_ij|².

81 The same expansion holds for the resolvent of H_{γ−1}, but with w_ij replacing v_ij. If the four moments match, then the first four (fully expanded) terms in
E F( (1/N) Tr 1/(H_γ − z) ) = Σ_{m=0}^5 N^{−m/2} E A^{(m)}
coincide. The difference is given by the fifth (error) term, which is of order N^{−5/2}:
A^{(5)} = F'(\hat R)Ω + F^{(5)}(\hat R + ξ')(\hat R^{(1)})⁵ + ..., Ω = −(1/N) Tr (RV)⁵ S.
It involves S, which is correlated with V, but one can Schwarz it out. If the fourth moments match only up to N^{−δ}, then
| E F( (1/N) Tr 1/(H_γ − z) ) − E F( (1/N) Tr 1/(H_{γ−1} − z) ) | ≤ CN^{−5/2} + CN^{−2−δ}.
Summing up the telescopic sum with N² terms, we get
| E F( (1/N) Tr 1/(H^{(v)} − z) ) − E F( (1/N) Tr 1/(H^{(w)} − z) ) | ≤ CN^{−1/2} + CN^{−δ}.

82 SUMMARY. All results hold for general Wigner matrices; no explicit formula is used. Local semicircle law for the DOS on the optimal scale 1/N. Eigenvectors are fully delocalized away from the spectral edges. Level repulsion for k eigenvalues with the optimal exponent. Universality of local eigenvalue statistics for general Wigner matrices with subexponentially decaying matrix elements. OPEN: Universality for band matrices and random Schrödinger operators.


More information

1 Math 241A-B Homework Problem List for F2015 and W2016

1 Math 241A-B Homework Problem List for F2015 and W2016 1 Math 241A-B Homework Problem List for F2015 W2016 1.1 Homework 1. Due Wednesday, October 7, 2015 Notation 1.1 Let U be any set, g be a positive function on U, Y be a normed space. For any f : U Y let

More information

MATH 205C: STATIONARY PHASE LEMMA

MATH 205C: STATIONARY PHASE LEMMA MATH 205C: STATIONARY PHASE LEMMA For ω, consider an integral of the form I(ω) = e iωf(x) u(x) dx, where u Cc (R n ) complex valued, with support in a compact set K, and f C (R n ) real valued. Thus, I(ω)

More information

Non white sample covariance matrices.

Non white sample covariance matrices. Non white sample covariance matrices. S. Péché, Université Grenoble 1, joint work with O. Ledoit, Uni. Zurich 17-21/05/2010, Université Marne la Vallée Workshop Probability and Geometry in High Dimensions

More information

From random matrices to free groups, through non-crossing partitions. Michael Anshelevich

From random matrices to free groups, through non-crossing partitions. Michael Anshelevich From random matrices to free groups, through non-crossing partitions Michael Anshelevich March 4, 22 RANDOM MATRICES For each N, A (N), B (N) = independent N N symmetric Gaussian random matrices, i.e.

More information

A Remark on Hypercontractivity and Tail Inequalities for the Largest Eigenvalues of Random Matrices

A Remark on Hypercontractivity and Tail Inequalities for the Largest Eigenvalues of Random Matrices A Remark on Hypercontractivity and Tail Inequalities for the Largest Eigenvalues of Random Matrices Michel Ledoux Institut de Mathématiques, Université Paul Sabatier, 31062 Toulouse, France E-mail: ledoux@math.ups-tlse.fr

More information

Quantum Chaos and Nonunitary Dynamics

Quantum Chaos and Nonunitary Dynamics Quantum Chaos and Nonunitary Dynamics Karol Życzkowski in collaboration with W. Bruzda, V. Cappellini, H.-J. Sommers, M. Smaczyński Phys. Lett. A 373, 320 (2009) Institute of Physics, Jagiellonian University,

More information

4 / Gaussian Ensembles. Level Density

4 / Gaussian Ensembles. Level Density 4 / Gaussian Ensembles. Level Density In this short chapter we reproduce a statistical mechanical argument of Wigner to "derive" the level density for the Gaussian ensembles. The joint probability density

More information

A Generalization of Wigner s Law

A Generalization of Wigner s Law A Generalization of Wigner s Law Inna Zakharevich June 2, 2005 Abstract We present a generalization of Wigner s semicircle law: we consider a sequence of probability distributions (p, p 2,... ), with mean

More information

Determinantal point processes and random matrix theory in a nutshell

Determinantal point processes and random matrix theory in a nutshell Determinantal point processes and random matrix theory in a nutshell part I Manuela Girotti based on M. Girotti s PhD thesis and A. Kuijlaars notes from Les Houches Winter School 202 Contents Point Processes

More information

Modification of the Porter-Thomas distribution by rank-one interaction. Eugene Bogomolny

Modification of the Porter-Thomas distribution by rank-one interaction. Eugene Bogomolny Modification of the Porter-Thomas distribution by rank-one interaction Eugene Bogomolny University Paris-Sud, CNRS Laboratoire de Physique Théorique et Modèles Statistiques, Orsay France XII Brunel-Bielefeld

More information

Double contour integral formulas for the sum of GUE and one matrix model

Double contour integral formulas for the sum of GUE and one matrix model Double contour integral formulas for the sum of GUE and one matrix model Based on arxiv:1608.05870 with Tom Claeys, Arno Kuijlaars, and Karl Liechty Dong Wang National University of Singapore Workshop

More information

On the principal components of sample covariance matrices

On the principal components of sample covariance matrices On the principal components of sample covariance matrices Alex Bloemendal Antti Knowles Horng-Tzer Yau Jun Yin February 4, 205 We introduce a class of M M sample covariance matrices Q which subsumes and

More information

Random matrix pencils and level crossings

Random matrix pencils and level crossings Albeverio Fest October 1, 2018 Topics to discuss Basic level crossing problem 1 Basic level crossing problem 2 3 Main references Basic level crossing problem (i) B. Shapiro, M. Tater, On spectral asymptotics

More information

III. Quantum ergodicity on graphs, perspectives

III. Quantum ergodicity on graphs, perspectives III. Quantum ergodicity on graphs, perspectives Nalini Anantharaman Université de Strasbourg 24 août 2016 Yesterday we focussed on the case of large regular (discrete) graphs. Let G = (V, E) be a (q +

More information

Rectangular Young tableaux and the Jacobi ensemble

Rectangular Young tableaux and the Jacobi ensemble Rectangular Young tableaux and the Jacobi ensemble Philippe Marchal October 20, 2015 Abstract It has been shown by Pittel and Romik that the random surface associated with a large rectangular Young tableau

More information

On the concentration of eigenvalues of random symmetric matrices

On the concentration of eigenvalues of random symmetric matrices On the concentration of eigenvalues of random symmetric matrices Noga Alon Michael Krivelevich Van H. Vu April 23, 2012 Abstract It is shown that for every 1 s n, the probability that the s-th largest

More information

Burgers equation in the complex plane. Govind Menon Division of Applied Mathematics Brown University

Burgers equation in the complex plane. Govind Menon Division of Applied Mathematics Brown University Burgers equation in the complex plane Govind Menon Division of Applied Mathematics Brown University What this talk contains Interesting instances of the appearance of Burgers equation in the complex plane

More information

Quantum chaos in composite systems

Quantum chaos in composite systems Quantum chaos in composite systems Karol Życzkowski in collaboration with Lukasz Pawela and Zbigniew Pucha la (Gliwice) Smoluchowski Institute of Physics, Jagiellonian University, Cracow and Center for

More information

1 Tridiagonal matrices

1 Tridiagonal matrices Lecture Notes: β-ensembles Bálint Virág Notes with Diane Holcomb 1 Tridiagonal matrices Definition 1. Suppose you have a symmetric matrix A, we can define its spectral measure (at the first coordinate

More information

On rational approximation of algebraic functions. Julius Borcea. Rikard Bøgvad & Boris Shapiro

On rational approximation of algebraic functions. Julius Borcea. Rikard Bøgvad & Boris Shapiro On rational approximation of algebraic functions http://arxiv.org/abs/math.ca/0409353 Julius Borcea joint work with Rikard Bøgvad & Boris Shapiro 1. Padé approximation: short overview 2. A scheme of rational

More information

Multiple Random Variables

Multiple Random Variables Multiple Random Variables Joint Probability Density Let X and Y be two random variables. Their joint distribution function is F ( XY x, y) P X x Y y. F XY ( ) 1, < x

More information

The Transition to Chaos

The Transition to Chaos Linda E. Reichl The Transition to Chaos Conservative Classical Systems and Quantum Manifestations Second Edition With 180 Illustrations v I.,,-,,t,...,* ', Springer Dedication Acknowledgements v vii 1

More information

ON THE SECOND-ORDER CORRELATION FUNCTION OF THE CHARACTERISTIC POLYNOMIAL OF A HERMITIAN WIGNER MATRIX

ON THE SECOND-ORDER CORRELATION FUNCTION OF THE CHARACTERISTIC POLYNOMIAL OF A HERMITIAN WIGNER MATRIX ON THE SECOND-ORDER CORRELATION FUNCTION OF THE CHARACTERISTIC POLYNOMIAL OF A HERMITIAN WIGNER MATRIX F. GÖTZE AND H. KÖSTERS Abstract. We consider the asymptotics of the second-order correlation function

More information

Brownian Motion. 1 Definition Brownian Motion Wiener measure... 3

Brownian Motion. 1 Definition Brownian Motion Wiener measure... 3 Brownian Motion Contents 1 Definition 2 1.1 Brownian Motion................................. 2 1.2 Wiener measure.................................. 3 2 Construction 4 2.1 Gaussian process.................................

More information

Beyond the Gaussian universality class

Beyond the Gaussian universality class Beyond the Gaussian universality class MSRI/Evans Talk Ivan Corwin (Courant Institute, NYU) September 13, 2010 Outline Part 1: Random growth models Random deposition, ballistic deposition, corner growth

More information

The Eigenvector Moment Flow and local Quantum Unique Ergodicity

The Eigenvector Moment Flow and local Quantum Unique Ergodicity The Eigenvector Moment Flow and local Quantum Unique Ergodicity P. Bourgade Cambridge University and Institute for Advanced Study E-mail: bourgade@math.ias.edu H.-T. Yau Harvard University and Institute

More information

APPENDIX A. Background Mathematics. A.1 Linear Algebra. Vector algebra. Let x denote the n-dimensional column vector with components x 1 x 2.

APPENDIX A. Background Mathematics. A.1 Linear Algebra. Vector algebra. Let x denote the n-dimensional column vector with components x 1 x 2. APPENDIX A Background Mathematics A. Linear Algebra A.. Vector algebra Let x denote the n-dimensional column vector with components 0 x x 2 B C @. A x n Definition 6 (scalar product). The scalar product

More information

Wigner s semicircle law

Wigner s semicircle law CHAPTER 2 Wigner s semicircle law 1. Wigner matrices Definition 12. A Wigner matrix is a random matrix X =(X i, j ) i, j n where (1) X i, j, i < j are i.i.d (real or complex valued). (2) X i,i, i n are

More information

arxiv: v2 [math.pr] 13 Jul 2018

arxiv: v2 [math.pr] 13 Jul 2018 Eigenvectors distribution and quantum unique ergodicity for deformed Wigner matrices L. Benigni LPSM, Université Paris Diderot lbenigni@lpsm.paris arxiv:7.0703v2 [math.pr] 3 Jul 208 Abstract We analyze

More information

Density of States for Random Band Matrices in d = 2

Density of States for Random Band Matrices in d = 2 Density of States for Random Band Matrices in d = 2 via the supersymmetric approach Mareike Lager Institute for applied mathematics University of Bonn Joint work with Margherita Disertori ZiF Summer School

More information

Stein s Method and Characteristic Functions

Stein s Method and Characteristic Functions Stein s Method and Characteristic Functions Alexander Tikhomirov Komi Science Center of Ural Division of RAS, Syktyvkar, Russia; Singapore, NUS, 18-29 May 2015 Workshop New Directions in Stein s method

More information

COMPLEX HERMITE POLYNOMIALS: FROM THE SEMI-CIRCULAR LAW TO THE CIRCULAR LAW

COMPLEX HERMITE POLYNOMIALS: FROM THE SEMI-CIRCULAR LAW TO THE CIRCULAR LAW Serials Publications www.serialspublications.com OMPLEX HERMITE POLYOMIALS: FROM THE SEMI-IRULAR LAW TO THE IRULAR LAW MIHEL LEDOUX Abstract. We study asymptotics of orthogonal polynomial measures of the

More information

Universality for a class of random band matrices. 1 Introduction

Universality for a class of random band matrices. 1 Introduction Universality for a class of random band matrices P. Bourgade New York University, Courant Institute bourgade@cims.nyu.edu H.-T. Yau Harvard University htyau@math.harvard.edu L. Erdős Institute of Science

More information

Stochastic Differential Equations Related to Soft-Edge Scaling Limit

Stochastic Differential Equations Related to Soft-Edge Scaling Limit Stochastic Differential Equations Related to Soft-Edge Scaling Limit Hideki Tanemura Chiba univ. (Japan) joint work with Hirofumi Osada (Kyushu Unv.) 2012 March 29 Hideki Tanemura (Chiba univ.) () SDEs

More information

Scattering for the NLS equation

Scattering for the NLS equation Scattering for the NLS equation joint work with Thierry Cazenave (UPMC) Ivan Naumkin Université Nice Sophia Antipolis February 2, 2017 Introduction. Consider the nonlinear Schrödinger equation with the

More information

Applications and fundamental results on random Vandermon

Applications and fundamental results on random Vandermon Applications and fundamental results on random Vandermonde matrices May 2008 Some important concepts from classical probability Random variables are functions (i.e. they commute w.r.t. multiplication)

More information

Microscopic behavior for β-ensembles: an energy approach

Microscopic behavior for β-ensembles: an energy approach Microscopic behavior for β-ensembles: an energy approach Thomas Leblé (joint with/under the supervision of) Sylvia Serfaty Université Paris 6 BIRS workshop, 14 April 2016 Thomas Leblé (Université Paris

More information

Free probabilities and the large N limit, IV. Loop equations and all-order asymptotic expansions... Gaëtan Borot

Free probabilities and the large N limit, IV. Loop equations and all-order asymptotic expansions... Gaëtan Borot Free probabilities and the large N limit, IV March 27th 2014 Loop equations and all-order asymptotic expansions... Gaëtan Borot MPIM Bonn & MIT based on joint works with Alice Guionnet, MIT Karol Kozlowski,

More information

arxiv: v7 [math.pr] 28 Apr 2014

arxiv: v7 [math.pr] 28 Apr 2014 The Annals of Applied Probability 204, Vol. 24, No. 3, 935 00 DOI: 0.24/3-AAP939 c Institute of Mathematical Statistics, 204 UNIVERSALITY OF COVARIANCE MATRICES arxiv:0.250v7 [math.pr] 28 Apr 204 By Natesh

More information

Assessing the dependence of high-dimensional time series via sample autocovariances and correlations

Assessing the dependence of high-dimensional time series via sample autocovariances and correlations Assessing the dependence of high-dimensional time series via sample autocovariances and correlations Johannes Heiny University of Aarhus Joint work with Thomas Mikosch (Copenhagen), Richard Davis (Columbia),

More information