Random Matrices with Blocks of Intermediate Scale: Strongly Correlated Band Matrices


Random Matrices with Blocks of Intermediate Scale: Strongly Correlated Band Matrices

Jiayi Tong

Advisor: Dr. Todd Kemp

May 30, 2017

Department of Mathematics
University of California, San Diego

Contents

1 Introduction
2 Notation and Background
  2.1 Band Matrix
  2.2 Wick's Theorem
3 Models
  3.1 Model
  3.2 Proof

1 Introduction

A random matrix is simply a matrix, all of whose entries are random. This means that X_n is an n × n matrix with entries X_ij (1 ≤ i, j ≤ n) that are random variables. The eigenvalues of symmetric matrices with random independent entries are well understood. In the 1950s, Wigner's semicircle law was first observed by Eugene Wigner [4].

Definition 1. Let {Y_{i,j}}_{1 ≤ i ≤ j ≤ n} be random variables such that:

- {Y_{i,j}}_{1 ≤ i ≤ j ≤ n} are independent.
- The diagonal entries {Y_{i,i}}_{1 ≤ i ≤ n} are identically distributed, and the off-diagonal entries {Y_{i,j}}_{1 ≤ i < j ≤ n} are identically distributed.
- E[Y_{ij}²] < ∞ for all i, j.

Let

  Y_n[i, j] = Y_{ij} if i ≤ j, and Y_n[i, j] = Y_{ji} if i > j.

With t = E[Y_{1,2}²] > 0, the matrices X_n = n^{−1/2} Y_n are Wigner matrices.

No matter what distribution the entries of a Wigner matrix have, the density of eigenvalues converges to a universal distribution, known as Wigner's semicircle law:

  σ_t(dx) = (1/(2πt)) √((4t − x²)₊) dx

Figure 1: The density of eigenvalues of an instance of X_4000, a Gaussian Wigner matrix.
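The semicircle law is easy to see numerically. Below is a minimal NumPy sketch (the sampler name is ours): we sample a Gaussian Wigner matrix X_n = Y_n/√n and compare the empirical eigenvalue moments with the semicircle moments for t = 1, namely ∫x² σ_1(dx) = 1 and ∫x⁴ σ_1(dx) = 2, and with the support [−2, 2].

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_wigner(n, rng):
    """Sample X_n = Y_n / sqrt(n): Y_n symmetric with Gaussian entries.

    Off-diagonal entries have variance 1 (diagonal variance 2, the GOE
    convention); the semicircle limit is insensitive to the diagonal."""
    a = rng.standard_normal((n, n))
    y = (a + a.T) / np.sqrt(2)
    return y / np.sqrt(n)

n = 1000
eigs = np.linalg.eigvalsh(gaussian_wigner(n, rng))

# Semicircle predictions for t = 1: second moment 1, fourth moment 2,
# support [-2, 2] (up to finite-n fluctuations at the edge).
print(np.mean(eigs**2), np.mean(eigs**4), eigs.min(), eigs.max())
```

For n = 1000 the printed moments already sit close to the limiting values.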

However, when the entries of the random matrix are correlated, few things are well understood by now. Therefore, in this paper, our aim is to study the empirical eigenvalue distribution of square random models with some structured correlations. To be more specific, we study the eigenvalue distribution of band matrices whose bands on the diagonals are independent of each other, but the identically distributed entries along each band are correlated.

The asymptotic empirical eigenvalue distribution can be understood by considering the limits of the matrix moments in expectation:

  lim_{n→∞} (1/n) E Tr[X_n^k], where k ∈ N.

Why do these matrix moments relate to the eigenvalue distribution? First, we need to understand what a histogram is. There are some data points λ_1, λ_2, ..., λ_n. These data points need to be put into some bins or intervals, and the number of the points in each bin should be counted so that we know the heights of the bins. For example, in the interval [a, b], we need to count

  #{j ∈ {1, ..., n} : λ_j ∈ [a, b]}

We can use the indicator function 1_{[a,b]}. It is defined by

  1_{[a,b]}(x) = 1 if a ≤ x ≤ b, and 0 otherwise.

To get the desired count, we need to add all the indicator functions up as follows:

  #{j ∈ {1, ..., n} : λ_j ∈ [a, b]} = 1_{[a,b]}(λ_1) + 1_{[a,b]}(λ_2) + ... + 1_{[a,b]}(λ_n)

Now, we have the height of the interval [a, b]. To get the whole histogram, we should compute the random variables

  C_{[a,b]} := (1/n) Σ_{j=1}^n 1_{[a,b]}(λ_j)

for all a < b. Since the λ_j are eigenvalues of a random matrix, it is very hard to compute them exactly. Therefore, we approximate the function C_{[a,b]} by a sequence of polynomials P_m. As the degree of the polynomials becomes higher and higher, we can find a sequence of polynomials with P_m(x) → 1_{[a,b]}(x) for every x.
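The bin-height statistic C_{[a,b]} is nothing more than an average of indicator values. A tiny sketch (the "eigenvalue" list here is made up purely for illustration):

```python
import numpy as np

lam = np.array([-1.2, -0.3, 0.1, 0.4, 0.8, 1.5])  # stand-in "eigenvalues"
n = len(lam)

def indicator(a, b, x):
    """1_[a,b](x): 1 if a <= x <= b, else 0."""
    return 1.0 if a <= x <= b else 0.0

def C(a, b):
    """Normalized bin height C_[a,b] = (1/n) * sum_j 1_[a,b](lambda_j)."""
    return sum(indicator(a, b, x) for x in lam) / n

print(C(0.0, 1.0))   # 3 of the 6 points lie in [0, 1] -> 0.5
print(C(-2.0, 2.0))  # all points lie in [-2, 2]       -> 1.0
```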

Then, we have

  C_{[a,b]} := (1/n) Σ_{j=1}^n 1_{[a,b]}(λ_j) = lim_{m→∞} (1/n) Σ_{j=1}^n P_m(λ_j)

To get C_{[a,b]}, we need to compute

  (1/n) Σ_{j=1}^n P(λ_j)

for every polynomial P(x) = a_0 + a_1 x + a_2 x² + ... + a_k x^k for some real numbers a_0, a_1, ..., a_k. Then

  (1/n) Σ_{j=1}^n P(λ_j) = (1/n) Σ_{j=1}^n (a_0 + a_1 λ_j + a_2 λ_j² + ... + a_k λ_j^k)
    = a_0 + a_1 (1/n) Σ_{j=1}^n λ_j + a_2 (1/n) Σ_{j=1}^n λ_j² + ... + a_k (1/n) Σ_{j=1}^n λ_j^k

So we need to know (1/n) Σ_{j=1}^n λ_j^k for each moment k. Now, we have X_n, a symmetric random matrix. It is orthogonally diagonalizable:

  X_n = Q D Qᵀ

where Q is the eigenvector matrix and D is diagonal with the eigenvalues on the diagonal:

  D = [ λ_1  0   ...  0   ]
      [ 0    λ_2 ...  0   ]
      [ ...       ...     ]
      [ 0    0   ...  λ_n ]

Then X_n^k = Q D^k Qᵀ, where D^k is diagonal with entries λ_1^k, ..., λ_n^k. And thus,

  Tr(X_n^k) = Tr(Q D^k Qᵀ) = Tr(Qᵀ Q D^k) = Tr(D^k)

since Tr(AB) = Tr(BA) in general. Hence,

  Tr(X_n^k) = Tr(D^k) = Σ_{j=1}^n λ_j^k

Therefore, if we know the matrix moments (1/n) Tr(X_n^k) for each integer k, then we know the quantities (1/n) Σ_j λ_j^k, so we know all quantities (1/n) Σ_j P(λ_j) for polynomials P. Then we are able to approximate the height of each bin or interval [a, b] as closely as we need.

Since X_n is a random matrix with random variables as entries in our model, we cannot compute Tr(X_n^k) exactly. However, E Tr[X_n^k] can be computed exactly in our model. Then we can prove that Var((1/n) Tr(X_n^k)) is small as the matrix size n → ∞. The variance calculation is done in the paper [2]. So we really do know the limit shape of the histogram by computing (1/n) E Tr[X_n^k] after normalization. We know that

  Tr(X_n^k) = Σ_{i_1,i_2,...,i_k=1}^n X_{i_1 i_2} X_{i_2 i_3} ... X_{i_k i_1}    (1)

Taking the expectation of both sides of equation (1),

  E Tr(X_n^k) = Σ_{i_1,...,i_k=1}^n E[X_{i_1 i_2} X_{i_2 i_3} ... X_{i_k i_1}]    (2)

In this paper, we will first define the maximally correlated band random matrix which we work on, and then give the proof that the limits of this matrix's moments in expectation are the moments of a Gaussian.
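Both identities above, the spectral formula Tr(X^k) = Σ_j λ_j^k and the entrywise expansion in equation (1), can be verified on a small symmetric matrix. A quick sketch (NumPy assumed):

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(1)
a = rng.standard_normal((4, 4))
x = (a + a.T) / 2          # small symmetric test matrix
k = 3

# Spectral side: sum of k-th powers of the eigenvalues.
lam = np.linalg.eigvalsh(x)
spectral = np.sum(lam**k)

# Entry side: Tr(X^k) = sum over index tuples of X_{i1 i2} X_{i2 i3} ... X_{ik i1},
# as in equation (1).
entry = sum(
    x[i[0], i[1]] * x[i[1], i[2]] * x[i[2], i[0]]
    for i in product(range(4), repeat=k)
)

print(abs(spectral - entry) < 1e-10)  # True
```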

2 Notation and Background

2.1 Band Matrix

In order to guarantee that the matrix has real eigenvalues, we will make the assumption that X_n is a symmetric matrix, i.e. X_ij = X_ji, where 1 ≤ i, j ≤ n.

Let {Y_m} be a sequence of random variables, where 0 ≤ m ≤ n − 1.

Definition 2. Define a matrix with entries X_{i,i+m} = X_{i,i−m} = (1/√n) Y_m as a random band matrix, which has the following form:

  [ X_{1,1}  X_{1,2}  ...  X_{1,n} ]
  [ X_{2,1}  X_{2,2}  ...  X_{2,n} ]
  [   ...       ...   X_{i,i+m} ...]
  [ X_{n,1}  X_{n,2}  ...  X_{n,n} ]

is equivalent to

  (1/√n) [ Y_0      Y_1      ...  Y_{n−1} ]
         [ Y_1      Y_0      ...  Y_{n−2} ]
         [  ...      ...     ...    ...   ]
         [ Y_{n−1}  Y_{n−2}  ...  Y_0     ]

We call this matrix the maximally correlated band matrix. The maximally correlated band matrix is given the following assumptions:

- The entries Y_m are independent of each other.
- The entries Y_m on the diagonals are identically distributed standard normal random variables.

2.2 Wick's Theorem

To calculate the limiting matrix moments in expectation, we apply Wick's Theorem.
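Before turning to Wick's theorem, the structure in Definition 2 is easy to check numerically. A minimal sketch (the constructor name and the NumPy indexing trick are ours; it implements the reading X[i, j] = Y_{|i−j|}/√n, one random variable per band):

```python
import numpy as np

def max_correlated_band(n, rng):
    """Maximally correlated band matrix: X[i, j] = Y_{|i-j|} / sqrt(n),
    with Y_0, ..., Y_{n-1} i.i.d. standard normals (one variable per band)."""
    y = rng.standard_normal(n)
    idx = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    return y[idx] / np.sqrt(n)

x = max_correlated_band(5, np.random.default_rng(0))
print(np.allclose(x, x.T))          # symmetric -> True
for d in range(1, 5):
    band = np.diag(x, d)
    # every entry on the d-th band is the same random variable Y_d / sqrt(n)
    print(np.allclose(band, band[0]))
```

This makes the phrase "maximally correlated" concrete: within a band the entries are not merely correlated but identical.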

Wick's theorem is a method of reducing high-order moments to combinatorial expressions involving only covariances of Gaussian random variables. It is used for the formula expressing the higher moments of a Gaussian distribution in terms of the second moments.

Definition 3. Define P_2(k) as the set of all pairings of {1, 2, ..., k}. Let π be one of the possible pairings in P_2(k), written

  π = {{l_1, l'_1}, {l_2, l'_2}, ..., {l_{k/2}, l'_{k/2}}}

Note that, for even k,

  #P_2(k) = (k − 1)!! = (k − 1)(k − 3)(k − 5)...(5)(3)(1)

Theorem 1 (Wick's Theorem). Let X_1, X_2, ..., X_k be jointly Gaussian random variables (not necessarily independent). Then

  E[X_1 X_2 ... X_k] = Σ_{π ∈ P_2(k)} Π_{{l, l'} ∈ π} E[X_l X_{l'}]

Example 2.1. When k = 4, i_1, i_2, i_3, i_4 ∈ {1, ..., n} and the X_ij are the entries of a matrix; then, according to equation (2),

  E Tr(X⁴) = Σ_{i_1,...,i_4=1}^n E[X_{i_1 i_2} X_{i_2 i_3} X_{i_3 i_4} X_{i_4 i_1}]

where by Wick's theorem

  E[X_{i_1 i_2} X_{i_2 i_3} X_{i_3 i_4} X_{i_4 i_1}]
    = E[X_{i_1 i_2} X_{i_2 i_3}] E[X_{i_3 i_4} X_{i_4 i_1}]
    + E[X_{i_1 i_2} X_{i_3 i_4}] E[X_{i_2 i_3} X_{i_4 i_1}]
    + E[X_{i_1 i_2} X_{i_4 i_1}] E[X_{i_2 i_3} X_{i_3 i_4}]    (3)

The three pairings of four variables can be expressed graphically as chord diagrams.
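Both the pairing count (k − 1)!! and the Wick sum can be checked directly. A small sketch (function names are ours): taking all X_i equal to a single standard normal Z, every covariance is 1, and the Wick sum collapses to the number of pairings, recovering the classical moments E[Z⁴] = 3 and E[Z⁶] = 15.

```python
from math import prod

def pairings(elems):
    """Generate all pairings of an even-size list, as lists of 2-tuples."""
    if not elems:
        yield []
        return
    first, rest = elems[0], elems[1:]
    for j, other in enumerate(rest):
        remaining = rest[:j] + rest[j + 1:]
        for p in pairings(remaining):
            yield [(first, other)] + p

def wick_moment(cov):
    """E[X_1 ... X_k] via Wick's theorem, given the covariance matrix cov."""
    k = len(cov)
    return sum(prod(cov[a][b] for a, b in p) for p in pairings(list(range(k))))

# #P_2(4) = 3!! = 3, and with all covariances 1 (X_1 = ... = X_4 = Z), E[Z^4] = 3.
print(len(list(pairings([0, 1, 2, 3]))))          # 3
print(wick_moment([[1] * 4 for _ in range(4)]))   # 3
# #P_2(6) = 5!! = 15, E[Z^6] = 15.
print(len(list(pairings(list(range(6))))))        # 15
```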

In conclusion, by Wick's Theorem, we can express the convergence of matrix moments in expectation as follows:

  (1/n) E Tr[X_n^k] = (1/n) Σ_{i_1,...,i_k=1}^n E[X_{i_1,i_2} X_{i_2,i_3} ... X_{i_k,i_1}]
    = (1/n) Σ_{i_1,...,i_k=1}^n Σ_{π ∈ P_2(k)} Π_{{l,l'} ∈ π} E[X_{i_l,i_{l+1}} X_{i_{l'},i_{l'+1}}]    (4)

with the indices read cyclically, i.e. i_{k+1} = i_1.

3 Models

In this section, we will study the empirical eigenvalue distribution by applying specific random variables in the band matrices.

3.1 Model

Here is a histogram of the eigenvalues of a 4000 × 4000 matrix sampled from the maximally correlated band model (Definition 2) with Gaussian entries:

Figure 2: The density of eigenvalues of an instance of X_4000, a Gaussian band matrix.

Theorem 2 (Kemp, Tong, 2017). Let X_n be a maximally correlated band matrix according to Definition 2. The limit matrix moments are the moments

of a Gaussian distribution:

  lim_{n→∞} (1/n) E Tr[X_n^k] = 0 for k odd, and (k − 1)!! for k even,

where k = 1, 2, 3, ...

The full proof of this theorem will be presented in Section 3.2. Before the proof, some propositions are needed.

Proposition 1. Let X_n be a maximally correlated band matrix, where X_{i,i+m} = X_{i,i−m} = (1/√n) Y_m. Let {l, l'} be a block in a pairing

  π = {{l_1, l'_1}, {l_2, l'_2}, ..., {l_{k/2}, l'_{k/2}}}

Because of the independence of the entries, the term E[X_{i_l,i_{l+1}} X_{i_{l'},i_{l'+1}}] in equation (4) does not equal zero if and only if

  |i_l − i_{l+1}| = |i_{l'} − i_{l'+1}|

Proof. If |i_l − i_{l+1}| ≠ |i_{l'} − i_{l'+1}|, then the two entries lie on different bands and are independent, so

  E[X_{i_l,i_{l+1}} X_{i_{l'},i_{l'+1}}] = E[X_{i_l,i_{l+1}}] E[X_{i_{l'},i_{l'+1}}] = (1/n) E[Y_{|i_l−i_{l+1}|}] E[Y_{|i_{l'}−i_{l'+1}|}] = (1/n) · 0 · 0 = 0

Therefore, to get a non-zero E[X_{i_l,i_{l+1}} X_{i_{l'},i_{l'+1}}], there are two cases for the indices:

- Positive case: i_l − i_{l+1} = i_{l'} − i_{l'+1}
- Negative case: i_l − i_{l+1} = i_{l'+1} − i_{l'}

Proposition 2. Only the pairings of the indices which are under the negative case, i_l − i_{l+1} = i_{l'+1} − i_{l'},

contribute to the leading order of the results in Theorem 2. The proof of this proposition will be presented in Section 3.2. Before proving the general k-th moment, a specific example with k = 4 will be explained first.

Example 3.1. When k = 4, there are four indices i_1, i_2, i_3, i_4 ∈ {1, ..., n}. The matrix moment's expectation can be expressed as:

  (1/n) E Tr[X⁴]    (5)
  = (1/n) Σ_{i_1,...,i_4=1}^n E[X_{i_1 i_2} X_{i_2 i_3} X_{i_3 i_4} X_{i_4 i_1}]    (6)
  = (1/n) Σ_{i_1,...,i_4=1}^n ( E[X_{i_1 i_2} X_{i_2 i_3}] E[X_{i_3 i_4} X_{i_4 i_1}]
      + E[X_{i_1 i_2} X_{i_3 i_4}] E[X_{i_2 i_3} X_{i_4 i_1}]
      + E[X_{i_1 i_2} X_{i_4 i_1}] E[X_{i_2 i_3} X_{i_3 i_4}] )    (7)

Then, from the propositions, we see that there are four cases for these four indices:

Case I: If |i_1 − i_2| = |i_2 − i_3| = |i_3 − i_4| = |i_4 − i_1|, then all three terms can be non-zero.

Case II: If |i_1 − i_2| = |i_2 − i_3| ≠ |i_3 − i_4| = |i_4 − i_1|, then E[X_{i_1 i_2} X_{i_2 i_3}] E[X_{i_3 i_4} X_{i_4 i_1}] ≠ 0, and the other two terms are 0.

Case III: If |i_1 − i_2| = |i_3 − i_4| ≠ |i_2 − i_3| = |i_4 − i_1|, then E[X_{i_1 i_2} X_{i_3 i_4}] E[X_{i_2 i_3} X_{i_4 i_1}] ≠ 0, and the other two terms are 0.

Case IV: If |i_1 − i_2| = |i_4 − i_1| ≠ |i_2 − i_3| = |i_3 − i_4|, then E[X_{i_1 i_2} X_{i_4 i_1}] E[X_{i_2 i_3} X_{i_3 i_4}] ≠ 0, and the other two terms are 0.

To calculate the large-n limit of equation (7), we discuss each case separately and then sum them up to get the leading term of the limit.

Case I: |i_1 − i_2| = |i_2 − i_3| = |i_3 − i_4| = |i_4 − i_1| = m

where m = 0, 1, 2, ..., n − 1. Under this case, for each m there are only O(n) admissible index tuples (i_1 is free, and each subsequent index differs from the previous one by ±m, with the cycle closing up), so equation (7) restricted to Case I can be expressed as:

  (1/n) Σ_{Case I} ( E[X_{i_1 i_2} X_{i_2 i_3}] E[X_{i_3 i_4} X_{i_4 i_1}] + E[X_{i_1 i_2} X_{i_3 i_4}] E[X_{i_2 i_3} X_{i_4 i_1}] + E[X_{i_1 i_2} X_{i_4 i_1}] E[X_{i_2 i_3} X_{i_3 i_4}] )
  = (1/n³) Σ_{m=0}^{n−1} O(n) ( E[Y_m²]² + E[Y_m²]² + E[Y_m²]² )
  = 3 (1/n³) O(n²) = O(1/n)    (8)

As n → ∞, O(1/n) → 0. Therefore, this case does not contribute to the leading term of the convergence.

Case II: |i_1 − i_2| = |i_2 − i_3| ≠ |i_3 − i_4| = |i_4 − i_1|

We change the absolute values into positive and negative cases:

  i_1 − i_2 = ±(i_2 − i_3) ≠ i_3 − i_4 = ±(i_4 − i_1)

The reason why we make this change is that we want to convert the relation between the indices into a system of linear equations. We then reduce the matrices which consist of the coefficients of the system to get the rank or nullity of the matrices. This value gives us the leading order under each case, so that we can tell which cases of indices contribute to the leading term in the limit.

There are three possible subcases:

II.1 (both equations with the positive sign):
  i_1 − i_2 = +(i_2 − i_3)
  i_3 − i_4 = +(i_4 − i_1)

II.2 (one of the equations with the positive sign):
  i_1 − i_2 = +(i_2 − i_3)
  i_3 − i_4 = −(i_4 − i_1)

or

  i_1 − i_2 = −(i_2 − i_3)
  i_3 − i_4 = +(i_4 − i_1)

The reason these two systems are put together is that there are only two rows in the coefficient matrices, where one matrix can be acquired from the other by changing the order of its rows.

II.3 (both equations with the negative sign):
  i_1 − i_2 = −(i_2 − i_3)
  i_3 − i_4 = −(i_4 − i_1)

Then, we convert each system of equations into a coefficient matrix and solve for the rank or nullity of the matrix.

II.1: Moving all the terms to the left side of the equations, we have the following matrix equation:

  [ 1  −2   1   0 ] [ i_1 ]   [ 0 ]
  [ 1   0   1  −2 ] [ i_2 ] = [ 0 ]
                    [ i_3 ]
                    [ i_4 ]

Then, we have the coefficient matrix:

  [ 1  −2   1   0 ]
  [ 1   0   1  −2 ]

By reduction and calculation, the rank is 2 and the nullity is 4 − 2 = 2. The nullity of the matrix gives us the dimension of the null space, so we have the order of the number of solutions. Once we have the order of the number of solutions, we can justify whether the case contributes to the limit's leading term, which is how Proposition 2 will be proved.

II.2: Similarly, we have

  [ 1  −2   1   0 ] [ i_1 ]   [ 0 ]
  [ 1   0  −1   0 ] [ i_2 ] = [ 0 ]
                    [ i_3 ]
                    [ i_4 ]

The coefficient matrix is:

  [ 1  −2   1   0 ]
  [ 1   0  −1   0 ]

whose rank is 2 and nullity is 4 − 2 = 2.

II.3: In this case, i_2 and i_4 are canceled in the equations, and each equation gives i_1 = i_3:

  [ 1   0  −1   0 ] [ i_1 ]   [ 0 ]
  [ 1   0  −1   0 ] [ i_2 ] = [ 0 ]
                    [ i_3 ]
                    [ i_4 ]

Therefore, we have three free parameters: i_1 (= i_3), i_2, i_4. The nullity of the coefficient matrix is 3.

Under Case II, because of the independence and mean zero of entries on different bands,

  E[X_{i_1 i_2} X_{i_3 i_4}] E[X_{i_2 i_3} X_{i_4 i_1}] = 0 and E[X_{i_1 i_2} X_{i_4 i_1}] E[X_{i_2 i_3} X_{i_3 i_4}] = 0

Then, equation (7) restricted to Case II can be expressed as:

  (1/n) Σ_{Case II} ( E[X_{i_1 i_2} X_{i_2 i_3}] E[X_{i_3 i_4} X_{i_4 i_1}] + 0 + 0 )
  = (1/n³) Σ_{Case II} E[Y_m²] E[Y_{m'}²]    (with m = |i_1 − i_2|, m' = |i_3 − i_4|)
  = (1/n³) (O(n³) + O(n²)) = 1 + O(1/n)    (9)

Only in II.3 is the nullity of the coefficient matrix 3. This subcase gives the O(n³) in the calculation above. As n → ∞, O(1/n) → 0. The leading term is contributed by II.3, under which both of the coefficient equations are using the negative sign.
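The rank and nullity computations for the three subcases can be reproduced mechanically. A small sketch (the helper name is ours, NumPy assumed):

```python
import numpy as np

def nullity(m):
    """Nullity = number of columns minus rank."""
    m = np.array(m, dtype=float)
    return m.shape[1] - np.linalg.matrix_rank(m)

# Case II coefficient matrices in the variables (i1, i2, i3, i4):
II1 = [[1, -2,  1,  0],   # i1 - i2 = +(i2 - i3)
       [1,  0,  1, -2]]   # i3 - i4 = +(i4 - i1)
II2 = [[1, -2,  1,  0],   # i1 - i2 = +(i2 - i3)
       [1,  0, -1,  0]]   # i3 - i4 = -(i4 - i1)  =>  i1 = i3
II3 = [[1,  0, -1,  0],   # i1 - i2 = -(i2 - i3)  =>  i1 = i3
       [1,  0, -1,  0]]   # i3 - i4 = -(i4 - i1)  =>  i3 = i1

print(nullity(II1), nullity(II2), nullity(II3))  # 2 2 3
```

Only the doubly negative subcase II.3 reaches nullity 3, matching the discussion above.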

Case III: |i_1 − i_2| = |i_3 − i_4| ≠ |i_2 − i_3| = |i_4 − i_1|

We change the absolute values into positive and negative cases:

  i_1 − i_2 = ±(i_3 − i_4) ≠ i_2 − i_3 = ±(i_4 − i_1)

There are three possible subcases as well:

III.1 (both equations with the positive sign):
  i_1 − i_2 = +(i_3 − i_4)
  i_2 − i_3 = +(i_4 − i_1)

III.2 (one of the equations with the positive sign):
  i_1 − i_2 = +(i_3 − i_4)
  i_2 − i_3 = −(i_4 − i_1)
or
  i_1 − i_2 = −(i_3 − i_4)
  i_2 − i_3 = +(i_4 − i_1)

III.3 (both equations with the negative sign):
  i_1 − i_2 = −(i_3 − i_4)
  i_2 − i_3 = −(i_4 − i_1)

Similarly, in III.1 and III.2 the rank of the coefficient matrix is 2 and the nullity is 2, but III.3 gives rank 1 and nullity 3. Moreover, under Case III, because of the independence and mean zero of entries on different bands,

  E[X_{i_1 i_2} X_{i_2 i_3}] E[X_{i_3 i_4} X_{i_4 i_1}] = 0 and E[X_{i_1 i_2} X_{i_4 i_1}] E[X_{i_2 i_3} X_{i_3 i_4}] = 0

Then, equation (7) restricted to Case III can be expressed as:

  (1/n) Σ_{Case III} ( 0 + E[X_{i_1 i_2} X_{i_3 i_4}] E[X_{i_2 i_3} X_{i_4 i_1}] + 0 )
  = (1/n³) Σ_{Case III} E[Y_m²] E[Y_{m'}²]
  = (1/n³) (O(n³) + O(n²)) = 1 + O(1/n)    (10)

Same as in Case II, only under III.3 is the nullity of the coefficient matrix 3, which gives the O(n³). As n → ∞, O(1/n) → 0. The leading term is contributed by III.3, under which both of the coefficient equations use the negative sign.

Case IV: |i_1 − i_2| = |i_4 − i_1| ≠ |i_2 − i_3| = |i_3 − i_4|

We change the absolute values into positive and negative cases:

  i_1 − i_2 = ±(i_4 − i_1) ≠ i_2 − i_3 = ±(i_3 − i_4)

There are three possible subcases as well:

IV.1 (both equations with the positive sign):
  i_1 − i_2 = +(i_4 − i_1)
  i_2 − i_3 = +(i_3 − i_4)

IV.2 (one of the equations with the positive sign):
  i_1 − i_2 = +(i_4 − i_1)
  i_2 − i_3 = −(i_3 − i_4)
or
  i_1 − i_2 = −(i_4 − i_1)
  i_2 − i_3 = +(i_3 − i_4)

IV.3 (both equations with the negative sign):
  i_1 − i_2 = −(i_4 − i_1)
  i_2 − i_3 = −(i_3 − i_4)

Similarly, in IV.1 and IV.2 the rank of the coefficient matrix is 2 and the nullity is 2, but IV.3 gives rank 1 and nullity 3. Therefore, under Case IV, because of the independence and mean zero of entries on different bands,

  E[X_{i_1 i_2} X_{i_2 i_3}] E[X_{i_3 i_4} X_{i_4 i_1}] = 0 and E[X_{i_1 i_2} X_{i_3 i_4}] E[X_{i_2 i_3} X_{i_4 i_1}] = 0

Then, equation (7) restricted to Case IV can be expressed as:

  (1/n) Σ_{Case IV} ( 0 + 0 + E[X_{i_1 i_2} X_{i_4 i_1}] E[X_{i_2 i_3} X_{i_3 i_4}] )
  = (1/n³) Σ_{Case IV} E[Y_m²] E[Y_{m'}²]
  = (1/n³) (O(n³) + O(n²)) = 1 + O(1/n)    (11)

Same as in Case II, only under IV.3 is the nullity of the coefficient matrix 3, which gives the O(n³). As n → ∞, O(1/n) → 0. The leading term is contributed by IV.3, under which both of the coefficient equations use the negative sign.

Now, we finish discussing all the cases and sum equations (8), (9), (10), (11) up; then

  (1/n) E Tr[X⁴] = (1/n) Σ_{i_1,...,i_4=1}^n ( E[X_{i_1 i_2} X_{i_2 i_3}] E[X_{i_3 i_4} X_{i_4 i_1}] + E[X_{i_1 i_2} X_{i_3 i_4}] E[X_{i_2 i_3} X_{i_4 i_1}] + E[X_{i_1 i_2} X_{i_4 i_1}] E[X_{i_2 i_3} X_{i_3 i_4}] )
  = O(1/n) + 1 + O(1/n) + 1 + O(1/n) + 1 + O(1/n)
  = 3 + O(1/n)

As n → ∞, O(1/n) → 0. The leading term 3 is contributed by II.3, III.3 and IV.3, under which both of the coefficient equations use the negative sign.

3.2 Proof

Proof. Recall the equations:

  (1/n) E Tr[X_n^k] = (1/n) Σ_{i_1,...,i_k=1}^n E[X_{i_1,i_2} X_{i_2,i_3} ... X_{i_k,i_1}]    (12)
  = (1/n) Σ_{i_1,...,i_k=1}^n Σ_{π ∈ P_2(k)} Π_{{l,l'} ∈ π} E[X_{i_l,i_{l+1}} X_{i_{l'},i_{l'+1}}]    (13)
  = (1/n) (1/n^{k/2}) P(n)    (14)
  = (1/n^{k/2+1}) P(n)    (15)

where P(n) is a polynomial function of n counting the index tuples that satisfy |i_l − i_{l+1}| = |i_{l'} − i_{l'+1}| for every block {l, l'} ∈ π. We need to find which relations between the indices give O(n^{k/2+1}), which can be canceled with the n^{k/2+1} and contribute to the leading term of the convergence.

Based on the idea in Example 3.1, we separate the relations between the indices into several cases:

- All blocks {l, l'} in π satisfy i_{l+1} = i_{l'}.
- Part of the blocks {l, l'} in π satisfy i_{l+1} = i_{l'}.

- All blocks {l, l'} in π satisfy i_{l+1} ≠ i_{l'}.

The reason why the cases depend on whether i_{l+1} = i_{l'} is that the cancellation of terms in the system of equations might affect the rank of the corresponding coefficient matrices.

Case I: All blocks {l, l'} in π satisfy i_{l+1} = i_{l'}. We change the absolute values into positive and negative cases as we did in Example 3.1:

  i_l − i_{l+1} = ±(i_{l'} − i_{l'+1})

I.1: All of the blocks {l, l'} take the positive sign, i.e.

  i_l − i_{l+1} = +(i_{l'} − i_{l'+1})

When all the index equations use the positive sign, we have a matrix equation

  A (i_1, i_2, ..., i_k)ᵀ = 0

where A is the (k/2) × k matrix of coefficients, one row per block of π. By reducing the matrix, the rank is k/2, so the nullity is k − k/2 = k/2. Equation (12) can then be expressed as:

  (1/n) E Tr[X_n^k] = (1/n) Σ_{i_1,...,i_k=1}^n Σ_{π ∈ P_2(k)} Π_{{l,l'} ∈ π} E[X_{i_l,i_{l+1}} X_{i_{l'},i_{l'+1}}]
  = (1/n^{k/2+1}) O(n^{k/2}) = O(1/n)    (16)

As!,theresultdoesotcotributetotheleadigterm. I.: : Some equatios use positive sig ad others use egative sig, i.e i i + =+(i + i ), i i + = (i + i ) Assume the umber of egative idex equatios is j, which meas j of k idex equatios have the form i i + =+(i + i )wherei + = i.the coe ciets of idices with this form ca be writte as a (k j) k matrix: 6 4 0 0... 0 0 0... 0..................... 0 0 0 0 By reducig this matrix, the dimesio of ull space of this matrix is k j. The, addig o j free parameters which are caceled by satisfyig i + = i, the ullity equals to k. 3 7 5 Therefore, we have ETr[Xk =O (7) which does ot cotribute to the leadig term, either. I.3: All the sigs are egative i.e. i i + = (i i + ) The umber of pairs (, )isk,thereforetherearekequatiosofidices. Sice i + = i,thei = i +. There are k free parameters because of i + = i. All the i left are all equal to each other, which ca be see as oe sigle free parameter. Therefore, the total umber of free parameters is k +,thatistheullityisk +. The, equatio () ca be re-writte as: ETr[Xk = = i,...,i k = E[X i,i X i,i 3...X ik,i X Y i,...,i k = P (k) {, } E[X i,i + X i,i + = k O(k+ ) = O() () 9

Case II: Part of the blocks {l, l'} in π satisfy i_{l+1} = i_{l'}. Under this case, we discuss the same three subcases as in Case I.

II.1: All the signs are positive, i.e. i_l − i_{l+1} = +(i_{l'} − i_{l'+1}). Since there is no negative equation, no index can be canceled from the equations. Therefore, a (k/2) × k coefficient matrix can be derived. After reducing the coefficient matrix, the rank is still k/2. We will have

  (1/n) E Tr[X_n^k] = O(1/n)    (19)

which does not contribute to the leading term of the convergence.

II.2: Some equations use the positive sign and others use the negative sign, i.e. i_l − i_{l+1} = +(i_{l'} − i_{l'+1}) for some blocks and i_l − i_{l+1} = −(i_{l'} − i_{l'+1}) for the others. Assume there are j blocks {l, l'} in π satisfying i_{l+1} = i_{l'}. There are three possibilities:

- All of these j blocks satisfy i_l − i_{l+1} = −(i_{l'} − i_{l'+1}).
- Some of these j blocks satisfy i_l − i_{l+1} = −(i_{l'} − i_{l'+1}).
- None of these j blocks satisfy i_l − i_{l+1} = −(i_{l'} − i_{l'+1}).

Under the first possibility, the j pairs of terms can be regarded as j free parameters and be canceled out from the equations. The coefficients of the remaining indices construct a (k/2 − j) × k matrix whose null space has dimension k/2 − j + 1. Thus, there are k/2 + 1 free parameters. Then, we have

  (1/n) E Tr[X_n^k] = O(1)    (20)

Under the other two possibilities, there are fewer than j free parameters which can be canceled out from the equations, so the nullity of the matrices is less than k/2 + 1. In other words, we will get

  (1/n) E Tr[X_n^k] = O(1/n)    (21)

which does not contribute to the leading term under these two possibilities.

II.3: All the signs are negative, i.e. i_l − i_{l+1} = −(i_{l'} − i_{l'+1})

Two situations can happen under this case. The first one is that, although the equations are negative, there is no cancellation of the indices. This situation is the same as II.1 (all the signs positive), therefore we have (1/n) E Tr[X_n^k] = O(1/n). The second one is that there exists cancellation of the indices. The indices which are canceled can be regarded as free parameters. There is no general pattern for the coefficient matrix, so we check the signs of the indices. Different from II.2, the signs in front of the same index in different equations are not all the same. Besides the cancellation caused by the negative equations, no more terms will be canceled. Therefore, the rank of the matrix will be larger than k/2 − 1, which means there are fewer than k/2 + 1 free parameters, so the result of (1/n) E Tr[X_n^k] cannot contribute to the leading term.

Case III: All blocks {l, l'} in π satisfy i_{l+1} ≠ i_{l'}.

III.1: All the signs are positive, i.e. i_l − i_{l+1} = +(i_{l'} − i_{l'+1})

III.2: Some equations use the positive sign and others use the negative sign, i.e. i_l − i_{l+1} = +(i_{l'} − i_{l'+1}) for some blocks and i_l − i_{l+1} = −(i_{l'} − i_{l'+1}) for the others.

Under both III.1 and III.2, the signs in front of the same two indices are not always different. It means that the rank will be bigger than k/2 − 1; hence, the nullity is less than k/2 + 1, and we cannot get a contribution of order one as in equation (18).

III.3: All the signs are negative, i.e. i_l − i_{l+1} = −(i_{l'} − i_{l'+1})

After moving all the indices in the equations to the left-hand side, the signs in front of the same two indices are different; or we can say each i_l − i_{l+1} and i_{l'+1} − i_{l'} equal each other. It means one row of the (k/2) × k coefficient matrix will be all 0; then the rank of the matrix is k/2 − 1. Thus, the number of free parameters is k/2 + 1. Then, we have

  (1/n) E Tr[X_n^k] = O(1)    (22)

According to the cases discussed above, we sum equations (16), (17), (18), (19), (20), (21), (22) under each case together. Then,

  lim_{n→∞} (1/n) E Tr[X_n^k]
  = lim_{n→∞} (1/n) Σ_{i_1,...,i_k=1}^n Σ_{π ∈ P_2(k)} Π_{{l,l'} ∈ π} E[X_{i_l,i_{l+1}} X_{i_{l'},i_{l'+1}}]
  = lim_{n→∞} (1/n^{k/2+1}) (k − 1)!! (n^{k/2+1} + O(n^{k/2}))
  = (k − 1)!! lim_{n→∞} (1 + O(1/n))
  = (k − 1)!!

The reason why the answer to this equation is (k − 1)!! is that, by Definition 3, the number of all pairings of the integers {1, 2, ..., k} is #P_2(k) = (k − 1)!!. We can see that equations (18), (20), (22), under the negative cases, have O(n^{k/2+1}), which is finally canceled out with the n^{k/2+1}, contributing to the leading term of the results. (For odd k there are no pairings of {1, ..., k}, so the sum in equation (13) is empty and the limit is 0.)
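As a rough numerical companion to the proof, Theorem 2 can be probed by Monte Carlo. This is only a sketch under the symmetric Toeplitz reading of Definition 2 (X[i, j] = Y_{|i−j|}/√n), and at finite n the empirical moments carry noticeable corrections; the second and fourth moments are printed for comparison with 1!! = 1 and 3!! = 3.

```python
import numpy as np

def max_correlated_band(n, rng):
    """X[i, j] = Y_{|i-j|} / sqrt(n), one standard normal per band."""
    y = rng.standard_normal(n)
    idx = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    return y[idx] / np.sqrt(n)

rng = np.random.default_rng(7)
n, trials = 400, 10
m2 = m4 = 0.0
for _ in range(trials):
    lam = np.linalg.eigvalsh(max_correlated_band(n, rng))
    m2 += np.mean(lam**2) / trials   # averaged (1/n) Tr X^2
    m4 += np.mean(lam**4) / trials   # averaged (1/n) Tr X^4

# Compare against the Gaussian moments claimed in Theorem 2: 1 and 3.
print(m2, m4)
```

The second moment lands very close to 1; the fourth moment gives a finite-n value to set against 3.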

References

[1] Kemp, Todd. Math 247A Lecture Notes: Introduction to Random Matrix Theory. University of California, San Diego. <http://www.math.ucsd.edu/~tkemp/247A.Notes.pdf>

[2] Kemp, Todd, and Zimmermann, David. Random Matrices with Log-Range Correlations, and Log-Sobolev Inequalities. Preprint.

[3] Mingo, James A., and Speicher, Roland. Free Probability and Random Matrices. Springer-Verlag New York, 2017. Print.

[4] Wigner, E. Characteristic Vectors of Bordered Matrices with Infinite Dimensions. Ann. of Math. 62, 548-564 (1955).