Fluctuations of Random Matrices and Second Order Freeness


Fluctuations of Random Matrices and Second Order Freeness
James Mingo, with B. Collins, P. Śniady, and R. Speicher
SEA'06 Workshop, Massachusetts Institute of Technology, July 9-14, 2006

[Figure: eigenvalue histogram of a single random matrix next to the corresponding free probability limit density; panels labelled "random matrix" and "free probability"]

references

J. Mingo and A. Nica: Annular noncrossing permutations and partitions, and second-order asymptotics for random matrices, Int. Math. Res. Not. (28), 1413-1460, 2004.

J. Mingo and R. Speicher: Second order freeness and fluctuations of random matrices: I. Gaussian and Wishart matrices and cyclic Fock spaces, J. Funct. Anal. 235 (2006), 226-270.

J. Mingo, P. Śniady, and R. Speicher: Second order freeness and fluctuations of random matrices: II. Unitary random matrices, Adv. Math., to appear, arXiv:math.OA/0405258.

B. Collins, J. Mingo, P. Śniady, and R. Speicher: Second order freeness and fluctuations of random matrices: III. Higher order freeness and free cumulants, arXiv:math.OA/0606431.

second order freeness & fluctuations

Voiculescu's freeness property gives a universal rule for calculating mixed moments of a and b if a and b are free and the moments of a and b are known; second order freeness will do for fluctuations what first order freeness (i.e. Voiculescu's freeness) did for moments.

For X_N an N × N random matrix,

α_k = lim_{N→∞} (1/N) E(Tr(X_N^k)) = moments of the limit distribution.

In many examples we also have a second order limit distribution:

Y_{N,k} = Tr(X_N^k − α_k I_N) converges in distribution,

α_{k,l} = lim_{N→∞} cov(Y_{N,k}, Y_{N,l}) exists, and

k_r(Y_{N,i_1}, ..., Y_{N,i_r}) → 0 for r > 2 (all classical cumulants of order higher than two vanish in the limit).

examples

GUE: X_N = (f_{ij}) a self-adjoint N × N random matrix, f_{ij} a complex Gaussian r.v.; E(f_{ij}) = 0, E(|f_{ij}|²) = 1/N.

lim_{N→∞} (1/N) E(Tr(X_N^k)) = ∫_{−2}^{2} t^k √(4 − t²)/(2π) dt   (Wigner's semi-circle law)

Y_{N,k} = Tr(X_N^k − α_k I_N) is asymptotically Gaussian, with

α_{m,n} = lim_{N→∞} cov(Y_{N,m}, Y_{N,n}) = ∫∫_{[−2,2]²} (x^m − y^m)(x^n − y^n) ρ(x, y) dx dy,

ρ(x, y) = (4 − xy) / (4π² (x − y)² √(4 − x²) √(4 − y²)).
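As a quick sanity check on these formulas (my own illustration, not part of the talk), here is a minimal Monte Carlo sketch in Python that samples GUE matrices with the normalization above and estimates cov(Tr(X_N^m), Tr(X_N^n)); the diagonal entries should be close to α_{1,1} = 1 and α_{2,2} = 2. The matrix size, trial count, and helper name gue are illustrative choices.

import numpy as np

rng = np.random.default_rng(0)
N, trials = 40, 20000   # illustrative; the simulations later in the talk use N = 40 and 50,000 trials

def gue(N):
    # self-adjoint matrix with E(f_ij) = 0 and E(|f_ij|^2) = 1/N, matching the normalization above
    Z = (rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))) / np.sqrt(2)
    return (Z + Z.conj().T) / np.sqrt(2 * N)

samples = []
for _ in range(trials):
    X = gue(N)
    samples.append([np.trace(X).real, np.trace(X @ X).real])

print(np.cov(np.array(samples), rowvar=False))
# diagonal entries estimate alpha_{1,1} ~ 1 and alpha_{2,2} ~ 2;
# the off-diagonal entry estimates alpha_{1,2}, which vanishes for the GUE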

combinatorial interpretation

Wigner's semi-circle law √(4 − t²)/(2π) dt has moments given by the number of non-crossing pairings:

α_k = #{non-crossing pairings of [k]}.

The covariance is given by the fluctuation moments

α_{m,n} = lim_{N→∞} cov(Y_{N,m}, Y_{N,n}) = ∫∫_{[−2,2]²} (x^m − y^m)(x^n − y^n) ρ(x, y) dx dy,

ρ(x, y) = (4 − xy) / (4π² (x − y)² √(4 − x²) √(4 − y²)).
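For a concrete check of the moment statement (again my own illustration, not from the talk), the sketch below counts non-crossing pairings of [k] by a simple recursion; the resulting counts 1, 2, 5, 14, ... are the Catalan numbers, which are indeed the even moments of the semicircle law.

from math import comb

def nc_pairings(k):
    # number of non-crossing pairings of {1, ..., k}: pair point 1 with point 2j, which splits
    # the remaining points into two smaller non-crossing problems (inside and outside the pair)
    if k % 2 == 1:
        return 0
    if k == 0:
        return 1
    return sum(nc_pairings(2 * j - 2) * nc_pairings(k - 2 * j) for j in range(1, k // 2 + 1))

for k in (2, 4, 6, 8):
    catalan = comb(k, k // 2) // (k // 2 + 1)
    print(k, nc_pairings(k), catalan)   # both columns agree: 1, 2, 5, 14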

The fluctuation moments are given by the number of non-crossing annular pairings:

α_{m,n} = #{non-crossing (m, n)-annular pairings}.

second order freeness

(1/N) E(Tr(X_N^k)) = α_k + O(N^{−2})

E(Tr(X_N^m − α_m I_N) Tr(X_N^n − α_n I_N)) = α_{m,n} + O(N^{−2})

General situation: given {A_N}_N with A_N → a, i.e.

lim_{N→∞} (1/N) E(Tr(A_N^k)) = ϕ(a^k),

lim_{N→∞} E(Tr(A_N^m − ϕ(a^m) I_N) Tr(A_N^n − ϕ(a^n) I_N)) = ϕ_2(a^m, a^n),

and {B_N}_N with B_N → b, i.e.

lim_{N→∞} (1/N) E(Tr(B_N^k)) = ϕ(b^k),

lim_{N→∞} E(Tr(B_N^m − ϕ(b^m) I_N) Tr(B_N^n − ϕ(b^n) I_N)) = ϕ_2(b^m, b^n).

First order freeness tells us how to calculate ϕ(a^{r_1} b^{r_2} ··· a^{r_{k−1}} b^{r_k}) from {ϕ(a^k)}_k and {ϕ(b^k)}_k; second order freeness tells us how to calculate ϕ_2(a^{r_1} b^{r_2} ··· a^{r_{k−1}} b^{r_k}, a^{s_1} b^{s_2} ··· a^{s_{l−1}} b^{s_l}) from the fluctuation moments of a and b.

Suppose we want to calculate ϕ_2(a^{r_1} b^{r_2} ··· a^{r_{k−1}} b^{r_k}, a^{s_1} b^{s_2} ··· a^{s_{l−1}} b^{s_l}). Then it suffices to calculate

ϕ_2((a^{r_1} − α_{r_1})(b^{r_2} − β_{r_2}) ··· (a^{r_{k−1}} − α_{r_{k−1}})(b^{r_k} − β_{r_k}), (a^{s_1} − α_{s_1})(b^{s_2} − β_{s_2}) ··· (a^{s_{l−1}} − α_{s_{l−1}})(b^{s_l} − β_{s_l}))

and apply induction (α_k = ϕ(a^k), β_k = ϕ(b^k)).
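As a reminder of what the first order rule produces (a standard example, not spelled out on this slide): if a and b are free, then ϕ((a − ϕ(a))(b − ϕ(b))) = 0 and ϕ((a − ϕ(a))(b − ϕ(b))(a − ϕ(a))(b − ϕ(b))) = 0, and expanding these identities gives

ϕ(ab) = ϕ(a) ϕ(b),

ϕ(abab) = ϕ(a²) ϕ(b)² + ϕ(a)² ϕ(b²) − ϕ(a)² ϕ(b)².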

the definition

So let p_1, p_2, ..., p_k and q_1, q_2, ..., q_l be centred polynomials, alternating between a and b, i.e. ϕ(p_i) = ϕ(q_j) = 0 for all i and j, and if p_i is a polynomial in a then p_{i+1} (setting p_{k+1} = p_1) is a polynomial in b; likewise for the q_j.

If a and b are free of second order then

ϕ_2(p_1 p_2 ··· p_k, q_l q_{l−1} ··· q_1) = 0 if k ≠ l,

ϕ_2(p_1 p_2 ··· p_k, q_k q_{k−1} ··· q_1) = ϕ(p_1 q_1) ··· ϕ(p_k q_k) + ϕ(p_1 q_2) ··· ϕ(p_k q_1) + ϕ(p_1 q_3) ··· ϕ(p_k q_2) + ··· + ϕ(p_1 q_k) ··· ϕ(p_k q_{k−1}),

i.e. a sum over the k cyclic rotations of the q's (the "spoke diagrams").

example

ϕ_2(p_1 p_2 p_3, q_3 q_2 q_1) = ϕ(p_1 q_1) ϕ(p_2 q_2) ϕ(p_3 q_3) + ϕ(p_1 q_2) ϕ(p_2 q_3) ϕ(p_3 q_1) + ϕ(p_1 q_3) ϕ(p_2 q_1) ϕ(p_3 q_2)

[Figure: the three spoke diagrams on the (3, 3)-annulus, one for each of the terms ϕ(p_1 q_1)ϕ(p_2 q_2)ϕ(p_3 q_3), ϕ(p_1 q_2)ϕ(p_2 q_3)ϕ(p_3 q_1), and ϕ(p_1 q_3)ϕ(p_2 q_1)ϕ(p_3 q_2)]
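The right-hand side is just a sum over cyclic shifts of the q's. A small sketch (my own illustration; the function name and the sample inputs are made up) that evaluates this spoke-diagram sum from a table of the mixed moments ϕ(p_i q_j):

def phi2_alternating(phi_pq):
    # phi_pq[i][j] = phi(p_{i+1} q_{j+1}); returns the spoke-diagram sum,
    # i.e. the sum over cyclic shifts s of phi(p_1 q_{1+s}) phi(p_2 q_{2+s}) ... phi(p_k q_{k+s})
    k = len(phi_pq)
    total = 0.0
    for s in range(k):
        prod = 1.0
        for i in range(k):
            prod *= phi_pq[i][(i + s) % k]
        total += prod
    return total

# for k = 3 this reproduces exactly the three terms written above
print(phi2_alternating([[1.0, 2.0, 3.0],
                        [4.0, 5.0, 6.0],
                        [7.0, 8.0, 9.0]]))   # = 1*5*9 + 2*6*7 + 3*4*8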

In particular, for low-order mixed moments:

ϕ_2(a^r, b^s) = 0

ϕ_2(a^{r_1} b^{r_2}, a^s) = ϕ_2(a^{r_1}, a^s) ϕ(b^{r_2})

ϕ_2(a^{r_1} b^{r_2}, a^{s_1} b^{s_2}) = (ϕ(a^{r_1+s_1}) − ϕ(a^{r_1}) ϕ(a^{s_1})) (ϕ(b^{r_2+s_2}) − ϕ(b^{r_2}) ϕ(b^{s_2})) + ϕ_2(a^{r_1}, a^{s_1}) ϕ(b^{r_2}) ϕ(b^{s_2}) + ϕ(a^{r_1}) ϕ(a^{s_1}) ϕ_2(b^{r_2}, b^{s_2})

second order limit distributions

GUE: for X_N a GUE matrix,

lim_{N→∞} (1/N) E(Tr(X_N^k)) = α_k = #{non-crossing pairings of [k]},

lim_{N→∞} E(Tr(X_N^m − α_m) Tr(X_N^n − α_n)) = α_{m,n} = #{non-crossing (m, n)-annular pairings}.

complex Wishart

G = G_{M,N} an M × N matrix of complex i.i.d. Gaussian entries, X_N = G_{M,N}* G_{M,N} a complex Wishart matrix with covariance I_M, M/N → c.

lim_{M,N→∞} (1/N) E(Tr(X_N^k)) = α_k = Σ_{π ∈ NC(k)} c^{#(π)}

lim_{M,N→∞} E(Tr(X_N^m − α_m) Tr(X_N^n − α_n)) = α_{m,n} = Σ_{π ∈ S_NC(m,n)} c^{#(π)}
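The first sum can be made completely explicit: the number of non-crossing partitions of [k] with j blocks is the Narayana number (1/k) C(k, j) C(k, j−1), so α_k = Σ_j (1/k) C(k, j) C(k, j−1) c^j. Below is a small sketch (my own illustration; it assumes Gaussian entries of variance 1/N, so that α_1 = c) comparing this formula with simulated Wishart trace moments.

import numpy as np
from math import comb

rng = np.random.default_rng(1)

def wishart_moment(k, c):
    # alpha_k = sum over non-crossing partitions of [k] of c^{#blocks},
    # grouped by number of blocks via the Narayana numbers
    return sum(comb(k, j) * comb(k, j - 1) // k * c ** j for j in range(1, k + 1))

M, N, trials = 80, 40, 2000   # illustrative sizes; c = M/N = 2
c = M / N
est = np.zeros(4)
for _ in range(trials):
    G = (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2 * N)
    X = G.conj().T @ G          # N x N complex Wishart matrix
    P = np.eye(N, dtype=complex)
    for k in range(4):
        P = P @ X
        est[k] += np.trace(P).real / N
est /= trials

print([wishart_moment(k, c) for k in (1, 2, 3, 4)])   # exact limits: 2, 6, 22, 90 for c = 2
print(est)                                            # Monte Carlo estimates (finite-N bias expected)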

more Wishart matrices

Suppose {D_{M,1}, ..., D_{M,s}}_M are independent of G_{M,N} and have a second order limit distribution, i.e.

lim_{M→∞} (1/M) E(Tr(D_{i_1} ··· D_{i_k})) = ϕ(d_{i_1} ··· d_{i_k}),

lim_{M→∞} k_2(Tr(D_{i_1} ··· D_{i_m}), Tr(D_{i_{m+1}} ··· D_{i_{m+n}})) = ϕ_2(d_{i_1} ··· d_{i_m}, d_{i_{m+1}} ··· d_{i_{m+n}})

(and a vanishing condition on higher cumulants). Then

lim_{M,N→∞} (1/N) E(Tr(G* D_{i_1} G G* D_{i_2} G ··· G* D_{i_k} G)) = Σ_{π ∈ NC(k)} c^{#(π)} ϕ_{π^{−1}γ}(d_{i_1}, ..., d_{i_k}),

where, if π = (j_1, ..., j_l) is a cycle, ϕ_π(d_1, ..., d_n) := ϕ(d_{j_1} d_{j_2} ··· d_{j_l}), extended multiplicatively over the cycles of π; γ = (1, 2, 3, ..., k).

second order limit distributions

lim_{M,N→∞} k_2(Tr(G* D_{i_1} G G* D_{i_2} G ··· G* D_{i_m} G), Tr(G* D_{i_{m+1}} G G* D_{i_{m+2}} G ··· G* D_{i_{m+n}} G))

= Σ_{π ∈ S_NC(m,n)} c^{#(π)} ϕ_{π^{−1}γ_{m,n}}(d_{i_1}, ..., d_{i_{m+n}})

+ Σ_{π_1 × π_2 ∈ NC(m) × NC(n)} c^{#(π_1)+#(π_2)} Σ_{i,j} ϕ_{s_i, t_j} ∏_{i′ ≠ i} ϕ_{s_{i′}} ∏_{j′ ≠ j} ϕ_{t_{j′}},

where π_1 = s_1 ··· s_p and π_2 = t_1 ··· t_q are the cycle decompositions, ϕ_{s_i, t_j} is the second order moment ϕ_2 taken along the cycles s_i and t_j, the argument (d_{i_1}, ..., d_{i_{m+n}}) of each ϕ has been suppressed, and γ_{m,n} = (1, 2, 3, ..., m)(m + 1, ..., m + n).

asymptotic second order freeness

Suppose {A_N}_N and {B_N}_N are two families of random matrices such that:

each of {A_N}_N and {B_N}_N has a second order limit distribution,

A_N and B_N are independent,

the joint distribution of the entries of B_N is invariant under conjugation by a unitary.

Then {A_N}_N and {B_N}_N are asymptotically free of second order.

epilogue

P.S.: freeness of third and higher order

Fluctuations of Random Matrices and Second Order Freeness, Part 2: R-transform formulas
Roland Speicher, Queen's University, Kingston, Canada

Second order freeness and fluctuations of random matrices:

Mingo + Speicher: I. Gaussian and Wishart matrices and cyclic Fock spaces, J. Funct. Anal. 235 (2006), 226-270

Mingo + Śniady + Speicher: II. Unitary random matrices, arXiv:math/0405258 (to appear in Adv. Math.)

Collins + Mingo + Śniady + Speicher: III. Higher order freeness and free cumulants, arXiv:math/0606431

Warning

In the following all our random matrices are complex. On the level of eigenvalue distributions the theory is robust against changing complex to real or quaternionic. For fluctuations, however, the phenomena are more sensitive and there are typically extra factors in the real case (still to be worked out rigorously!).

Definition [Mingo, Speicher]: Consider N × N random matrices (A_N)_{N ∈ ℕ}. They have a second order limit distribution if for all m, n ≥ 1 the limits

α_n := lim_{N→∞} E[tr(A_N^n)]

and

α_{m,n} := lim_{N→∞} cov(Tr(A_N^m), Tr(A_N^n))

exist and if all higher classical cumulants of Tr(A_N^m) go to zero. This means that the family (Tr(A_N^m) − E[Tr(A_N^m)])_{m ∈ ℕ} converges to a Gaussian family.

In many cases (Gaussian, Wishart, Haar unitary random matrices) such a second order limit distribution exists.
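To make the vanishing-cumulant condition concrete, here is a small Monte Carlo sketch (my own illustration, not from the slides) that estimates the classical cumulants k_2, k_3, k_4 of Tr(A_N²) for a GUE matrix; k_3 and k_4 should be small compared with k_2 ≈ 2, reflecting the asymptotically Gaussian fluctuations.

import numpy as np
from scipy.stats import kstat

rng = np.random.default_rng(2)
N, trials = 40, 20000

def gue(N):
    # self-adjoint matrix with entries of variance 1/N
    Z = (rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))) / np.sqrt(2)
    return (Z + Z.conj().T) / np.sqrt(2 * N)

data = np.array([np.trace(np.linalg.matrix_power(gue(N), 2)).real for _ in range(trials)])
for r in (2, 3, 4):
    print(r, kstat(data, r))   # k_2 near 2; k_3 and k_4 near 0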

Example: Gaussian random matrix A (N = 40, trials = 50,000)

[Figure: histograms of Tr(A), Tr(A²), Tr(A⁴) with normal probability plots; empirical covariances 0.99, 1.99, 35.85]

Var(Tr(A)) = 1,  Var(Tr(A²)) = 2,  Var(Tr(A⁴)) = 36

Example: Wishart matrix B (N = M = 40, trials = 50,000)

[Figure: histograms of Tr(B), Tr(B²), Tr(B³) with normal probability plots; empirical covariances 0.99, 18.14, 299.6]

Var(Tr(B)) = 1,  Var(Tr(B²)) = 18,  Var(Tr(B³)) = 300

Problem: What can we say about the fluctuations of C := A + B in a generic situation?

What we need to understand are the mixed fluctuations in A and B, i.e. limits of

cov(Tr(A^{m_1} B^{n_1} A^{m_2} ···), Tr(A^{m̃_1} B^{ñ_1} A^{m̃_2} ···)).

Theorem [Mingo, Śniady, Speicher]: Consider random matrices A_N and B_N such that

A_N has a second order limit distribution,

B_N has a second order limit distribution,

A_N and B_N are independent,

B_N is invariant under unitary conjugation.

Then A_N and B_N are asymptotically free of second order.

Definition [Mingo, Speicher]: A and B are free of second order if the special mixed covariances

cov(Tr(cyclically alternating, centred product), Tr(cyclically alternating, centred product))

are calculated in a specific way (by summing over spoke diagrams in the corresponding annular representation of the two traces) from the expectations of A and of B.

All mixed covariances can be reduced to the above situation; thus second order freeness is a rule for calculating mixed correlations in A and B from the expectations and covariances of A and of B.

In particular: If A and B are free, then the second order distribution (covariances) of A + B depends only on the expectations and covariances of A and of B.

Example: We have

α^{A+B}_{1,2} = α^A_{1,2} + α^B_{1,2} + 2 α^A_1 α^B_{1,1} + 2 α^B_1 α^A_{1,1},

i.e.,

cov(Tr(A + B), Tr((A + B)²)) = cov(Tr(A), Tr(A²)) + cov(Tr(B), Tr(B²)) + 2 E[tr(A)] cov(Tr(B), Tr(B)) + 2 E[tr(B)] cov(Tr(A), Tr(A)).
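Here is a rough Monte Carlo sketch (my own illustration, with the same conventions as before: GUE A, complex Wishart B with M = N, both of size N = 40) comparing the two sides of this identity at finite N; the printed numbers should be close, up to sampling error and finite-N corrections.

import numpy as np

rng = np.random.default_rng(3)
N, trials = 40, 20000

def gue(N):
    Z = (rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))) / np.sqrt(2)
    return (Z + Z.conj().T) / np.sqrt(2 * N)

def wishart(N):
    G = (rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))) / np.sqrt(2 * N)
    return G.conj().T @ G

stats = []
for _ in range(trials):
    A, B = gue(N), wishart(N)
    C = A + B
    stats.append([np.trace(A).real, np.trace(A @ A).real,
                  np.trace(B).real, np.trace(B @ B).real,
                  np.trace(C).real, np.trace(C @ C).real])
trA, trA2, trB, trB2, trC, trC2 = np.array(stats).T

lhs = np.cov(trC, trC2)[0, 1]
rhs = (np.cov(trA, trA2)[0, 1] + np.cov(trB, trB2)[0, 1]
       + 2 * (np.mean(trA) / N) * np.var(trB, ddof=1)
       + 2 * (np.mean(trB) / N) * np.var(trA, ddof=1))
print(lhs, rhs)   # should agree up to sampling error and finite-N corrections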

To treat these formulas in general, linearize the problem by going over to free cumulants κ or R-transforms R.

Recall the first order case: Put

G(x) = 1/x + Σ_{n=1}^{∞} α_n / x^{n+1}   (Cauchy transform)

and define

R(x) = Σ_{n=1}^{∞} κ_n x^{n−1}   (R-transform)

by the relation

1/G(x) + R(G(x)) = x.
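A quick check on the semicircle distribution (a standard example, not on the slide): there κ_2 = 1 and all other free cumulants vanish, so R(z) = z. The Cauchy transform G(x) = (x − √(x² − 4))/2 satisfies G(x)² − x G(x) + 1 = 0, hence

1/G(x) + R(G(x)) = 1/G(x) + G(x) = x.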

There is a combinatorial structure behind this: the relation between the α's and the κ's is given by summing over non-crossing partitions, one term per partition, with one factor κ per block:

α_1 = κ_1

α_2 = κ_2 + κ_1²

α_3 = κ_3 + 3 κ_1 κ_2 + κ_1³

α_4 = κ_4 + 4 κ_1 κ_3 + 2 κ_2² + 6 κ_1² κ_2 + κ_1⁴
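These sums can be generated by the standard recursion α_n = Σ_{s=1}^{n} κ_s Σ_{i_1 + ··· + i_s = n − s} α_{i_1} ··· α_{i_s} (with α_0 = 1). A small symbolic sketch (my own illustration) implementing the recursion and reproducing the expansion of α_4 above:

from itertools import product
import sympy as sp

def moments_from_free_cumulants(kappa, n_max):
    # kappa[s] = free cumulant kappa_s; returns [alpha_0, alpha_1, ..., alpha_{n_max}]
    alpha = [sp.Integer(1)]
    for n in range(1, n_max + 1):
        total = 0
        for s in range(1, n + 1):
            # sum over compositions i_1 + ... + i_s = n - s with i_j >= 0
            for comp in product(range(n - s + 1), repeat=s):
                if sum(comp) == n - s:
                    term = kappa[s]
                    for i in comp:
                        term *= alpha[i]
                    total += term
        alpha.append(sp.expand(total))
    return alpha

k1, k2, k3, k4 = sp.symbols('kappa1 kappa2 kappa3 kappa4')
print(moments_from_free_cumulants({1: k1, 2: k2, 3: k3, 4: k4}, 4)[4])
# kappa4 + 4*kappa1*kappa3 + 2*kappa2**2 + 6*kappa1**2*kappa2 + kappa1**4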

Theorem [Voiculescu 1986, Speicher 1994]: Let A and B be free. Then one has

R_{A+B}(z) = R_A(z) + R_B(z),

or equivalently

κ^{A+B}_m = κ^A_m + κ^B_m for all m.

Second order R-transform formula

From the expectations α_m and the covariances α_{m,n} we define second order free cumulants κ_{m,n} as follows. Put

G(x, y) := Σ_{m,n ≥ 1} α_{m,n} x^{−(m+1)} y^{−(n+1)}

and define

R(x, y) = Σ_{m,n ≥ 1} κ_{m,n} x^{m−1} y^{n−1}

by the equation:

G(x, y) = G′(x) G′(y) R(G(x), G(y)) + G′(x) G′(y) / (G(x) − G(y))² − 1/(x − y)²

or

G(x, y) = G′(x) G′(y) R(G(x), G(y)) + ∂²/∂x∂y [ log( (G(x) − G(y)) / (x − y) ) ].

These equations encode relations between the α_{m,n} and the κ_{m,n} (containing also the κ_m), e.g.:

α_{1,1} = κ_{1,1} + κ_2

α_{1,2} = κ_{1,2} + 2 κ_1 κ_{1,1} + 2 κ_3 + 2 κ_1 κ_2

α_{2,2} = κ_{2,2} + 4 κ_1 κ_{1,2} + 4 κ_1² κ_{1,1} + 4 κ_4 + 8 κ_1 κ_3 + 2 κ_2² + 4 κ_1² κ_2

There is again a rich combinatorial theory behind this; the above sums run over annular non-crossing permutations.
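As a consistency check (my own remark, not on the slide): for a GUE matrix the only non-vanishing free cumulant is κ_2 = 1 and all second order free cumulants κ_{m,n} vanish, so these relations give α_{1,1} = 1, α_{1,2} = 0 and α_{2,2} = 2κ_2² = 2, in agreement with the covariances ≈ 0.99 and 1.99 observed in the Gaussian simulation above.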

Note also the following special case: If the second order free cumulants are zero (as it happens for Gaussian and Wishart random matrices), the above formula reduces to

G(x, y) = G′(x) G′(y) / (G(x) − G(y))² − 1/(x − y)²

or

G(x, y) = ∂²/∂x∂y [ log( (G(x) − G(y)) / (x − y) ) ],

which says that the fluctuations in such a case are determined by the eigenvalue distribution. This is the formula of [Bai and Silverstein, 2004] for the fluctuations of general Wishart matrices.

Theorem [Collins, Mingo, Śniady, Speicher]: Let A and B be free of second order. Then one has

κ^{A+B}_{m,n} = κ^A_{m,n} + κ^B_{m,n} for all m, n,

or equivalently

R_{A+B}(x, y) = R_A(x, y) + R_B(x, y).

This allows one to calculate the fluctuations of A + B from the moments and fluctuations of A and the moments and fluctuations of B, if A and B are free of second order (for example, for independent random matrices where one of them is unitarily invariant).

Examples: Take an independent Gaussian random matrix A and a Wishart random matrix B (N = M) and consider

C = A + B  and  D = A² + B.

Note: C has vanishing second order free cumulants, thus the Bai-Silverstein formula can be applied to get its fluctuations; for D this is not the case!

Example: C = A + B (N = 40, trials = 50,000)

[Figure: histograms of Tr(C), Tr(C²), Tr(C³) with normal probability plots; empirical covariances 2.01, 28.3, ≈ 598.8]

Var(Tr(C)) = 2,  Var(Tr(C²)) = 28,  Var(Tr(C³)) = 600

Example: D = A² + B (N = 40, trials = 50,000)

[Figure: histograms of Tr(D), Tr(D²), Tr(D³) with normal probability plots; empirical covariances 3.01, 117.5, 4105.8]

Var(Tr(D)) = 3,  Var(Tr(D²)) = 112,  Var(Tr(D³)) = 4090