A note on a Marčenko-Pastur type theorem for time series. Jianfeng Yao
1 A note on a Marčenko-Pastur type theorem for time series Jianfeng Yao Workshop on High-dimensional statistics The University of Hong Kong, October 2011
2 Overview 1 High-dimensional data and the sample covariance matrix 2 Variations on the Marčenko-Pastur theorem 3 Seeking a similar theorem for time series: Model and main result; Main idea of the proof; Application to an ARMA process
3 1 High-dimensional data and the sample covariance matrix 2 Variations on the Marčenko-Pastur theorem 3 Seeking a similar theorem for time series: Model and main result; Main idea of the proof; Application to an ARMA process
4 The sample covariance matrix $S_n$. Consider a sequence of $p$-dimensional random vectors $x_1, \ldots, x_n$ with population covariance matrix $\Sigma = \mathrm{Cov}(x_1)$. The sample covariance matrix is $S_n = \frac{1}{n}\sum_{j=1}^n x_j x_j^T$, or $S_n = \frac{1}{n}\sum_{j=1}^n (x_j - \bar x)(x_j - \bar x)^T$. Classical theory: $p$ is fixed and $n \to \infty$, so $S_n \to \Sigma$ almost surely. In particular, the $p$ (random) eigenvalues $\lambda_1^{S_n} \ge \cdots \ge \lambda_p^{S_n}$ of $S_n$ converge to the eigenvalues of $\Sigma$.
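As a quick numerical sketch of this classical regime (the dimension p = 5 and the diagonal Σ below are arbitrary illustrative choices, not from the talk), the eigenvalues of S_n approach those of Σ when p stays fixed and n grows:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 5
pop_eigs = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Sigma_half = np.diag(np.sqrt(pop_eigs))      # Sigma = diag(1,...,5)

def scm_eigs(n):
    # S_n = (1/n) sum_j x_j x_j^T for n i.i.d. N(0, Sigma) vectors
    X = rng.standard_normal((n, p)) @ Sigma_half
    return np.sort(np.linalg.eigvalsh(X.T @ X / n))

err_small = np.abs(scm_eigs(100) - pop_eigs).max()
err_large = np.abs(scm_eigs(1_000_000) - pop_eigs).max()
print(err_small, err_large)   # the error shrinks as n grows with p fixed
```

With p fixed, the sample eigenvalues are consistent estimators of the population ones; the next slides show how this breaks down when p grows with n.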
5 High-dimensional data. Some typical data settings: portfolio data, climate surveys, speech analysis, the ORL face database, micro-arrays, each with its own data dimension $p$, sample size $n$, and ratio $c = p/n$. Important: the dimension-to-sample ratio $c = p/n$ is not always close to 0; it is frequently larger than 1.
6 An effect of high dimensions: $S_n \not\approx \Sigma$. Take $(x_i)$ i.i.d. $N_p(0, I_p)$, so $\Sigma = I_p$; $p = 40$, $n = 160$, $c = p/n = 0.25$. Warning: the eigenvalues $\lambda_j^{S_n}$ do not converge to 1! [Figure: histogram of the eigenvalues $\{\lambda_j^{S_n}\}$ of $S_n$; blue curve: the MP law of index $c = 0.25$.]
7 [Figure: histogram of the eigenvalues of $S_n$ for $p = 40$, $n = 320$, $c = p/n = 0.125$; blue curve: the MP law of index $c = 0.125$.]
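The spreading effect shown on these two slides is easy to reproduce; a minimal sketch (the random seed is arbitrary) with Σ = I_p, p = 40, n = 160, so c = 0.25:

```python
import numpy as np

rng = np.random.default_rng(1)
p, n = 40, 160                          # c = p/n = 0.25, as on the slide
X = rng.standard_normal((p, n))         # i.i.d. N(0,1) entries, Sigma = I_p
eig = np.linalg.eigvalsh(X @ X.T / n)   # eigenvalues of S_n

c = p / n
a, b = (1 - np.sqrt(c)) ** 2, (1 + np.sqrt(c)) ** 2   # MP support edges
print(eig.min(), eig.max())   # spread over roughly [a, b] = [0.25, 2.25], not near 1
```

Even though every population eigenvalue equals 1, the sample eigenvalues spread over approximately the MP support [0.25, 2.25].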
8 The Marčenko-Pastur distributions. Theorem (a simplified version; Marčenko & Pastur, 1967). Assume:
- $X = (X_{ij}) = (x_1, \ldots, x_n)$ is a $p \times n$ array of i.i.d. variables with mean 0 and variance 1 (so $\Sigma = \mathrm{Cov}(x_1) = I_p$); not necessarily Gaussian, but with finite 4th moment;
- $p, n \to \infty$ with $p/n \to c \in (0, 1]$.
Then the empirical distribution of the eigenvalues of $S_n = \frac{1}{n} XX^T$, namely $F_n = \frac{1}{p}\sum_{j=1}^p \delta_{\lambda_j^{S_n}}$, converges (in probability) to the distribution with density function $f(x) = \frac{1}{2\pi c x}\sqrt{(x-a)(b-x)}$, $a \le x \le b$, where $a = (1-\sqrt{c})^2$ and $b = (1+\sqrt{c})^2$.
9 The Marčenko-Pastur distribution: $f(x) = \frac{1}{2\pi c x}\sqrt{(x-a)(b-x)}$, $(1-\sqrt{c})^2 = a \le x \le b = (1+\sqrt{c})^2$. [Figure: densities of the Marčenko-Pastur law for $c = 1/4$, $1/8$, $1/2$, with the corresponding support edges $a$ and $b$.]
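The density above is straightforward to code; this sketch (grid sizes are arbitrary) checks numerically that it integrates to 1:

```python
import numpy as np

def mp_density(x, c):
    # Marcenko-Pastur density f(x) = sqrt((x-a)(b-x)) / (2 pi c x) on [a, b]
    a, b = (1 - np.sqrt(c)) ** 2, (1 + np.sqrt(c)) ** 2
    out = np.zeros_like(x, dtype=float)
    m = (x > a) & (x < b)
    out[m] = np.sqrt((x[m] - a) * (b - x[m])) / (2 * np.pi * c * x[m])
    return out

c = 0.25
x = np.linspace(0.0, 3.0, 300001)
dx = x[1] - x[0]
total_mass = mp_density(x, c).sum() * dx   # Riemann sum of the density
print(total_mass)   # close to 1: the MP law is a probability distribution for c in (0, 1]
```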
10 1 High-dimensional data and the sample covariance matrix 2 Variations on the Marčenko-Pastur theorem 3 Seeking a similar theorem for time series: Model and main result; Main idea of the proof; Application to an ARMA process
11 Marčenko-Pastur's original version; Silverstein's 1995 version; Bai and Zhou's 2008 version.
12 Marčenko-Pastur 1967
13 Translated and published by AMS
14 Marčenko-Pastur, cont'd. Consider $B_p = A_p + \sum_{i=1}^n \tau_i x_i x_i^*$, where $(\tau_i)$ are i.i.d. real; $A_p$ is non-random and Hermitian; $(x_i)$ is a sequence of independent complex-valued vectors, independent of $(\tau_i)$.
15 Marčenko-Pastur, cont'd. Assume:
I. Concentration: $n/p \to c > 0$;
II. the ESD of $A_p$ tends to a (possibly defective) nonrandom measure $\nu_0$;
III. the coordinates $(q_1, \ldots, q_p)$ of each (column) vector $x_i$ satisfy
1. $E[q_k] = 0$, $E|q_k|^2 = 1$, $E|q_k|^4 < \infty$;
2. $E[q_i \bar q_j] = p^{-1}\delta_{ij} + a_{ij}(p)$, $E[q_i \bar q_j q_l \bar q_m] = p^{-2}[\delta_{ij}\delta_{lm} + \delta_{im}\delta_{jl}] + \varphi_{il}(p)\varphi_{jm}(p) + b_{ijlm}(p)$, with $p\,\big(\sum_{ij} |a_{ij}(p)|^2\big)^{1/2} \to 0$, $\sum_{ij} |\varphi_{ij}(p)|^2 \to 0$, $p^{1/2}\,\big(\sum_{ijlm} |b_{ijlm}(p)|^2\big)^{1/2} \to 0$;
IV. $(\tau_i)$ are i.i.d. with distribution function $H(x)$.
Note. Independent columns with nearly uncorrelated components.
16 Marčenko-Pastur, cont'd. Then the ESD function of $B_p$ converges (in probability) to a nondecreasing function $\nu(\lambda)$ at all points of continuity; its Stieltjes transform $s(z)$ is the unique solution, in the region $\{\Im(z) > 0\}$, of the equation
$s(z) = s_0\!\left(z - c \int \frac{\tau}{1 + \tau s(z)}\, dH(\tau)\right)$,
where $s_0$ is the ST of $\nu_0$ (the LSD of $(A_p)$).
Special cases:
1. If $A_p \equiv 0$, then $s_0(z) = -1/z$, and $z = -\frac{1}{s(z)} + c \int \frac{\tau}{1 + \tau s(z)}\, dH(\tau)$.
2. If moreover $\tau_i \equiv 1$, then $H = \delta_1$ and $z = -\frac{1}{s(z)} + \frac{c}{1 + s(z)}$, which can be solved and leads to the MP distribution with index $c$.
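Special case 2 can be probed by simulation. In this sketch (matrix sizes and the test point z are arbitrary choices), B_p = Σ_i x_i x_i^T is built with i.i.d. N(0, 1/p) coordinates, so that τ_i ≡ 1 and A_p = 0, and the empirical Stieltjes transform is checked against the limiting equation at one point of the upper half-plane:

```python
import numpy as np

rng = np.random.default_rng(2)
p, n = 2000, 1000
c = n / p                       # note the ratio here is n/p -> c, as in Assumption I
Y = rng.standard_normal((p, n))
B = Y @ Y.T / p                 # B_p = sum_i x_i x_i^T, coordinates of variance 1/p

z = 1.0 + 0.5j                  # a point in the upper half-plane
s = np.trace(np.linalg.inv(B - z * np.eye(p))) / p   # empirical Stieltjes transform

residual = abs(z - (-1 / s + c / (1 + s)))   # special case with A_p = 0, tau_i = 1
print(residual)                 # small: the empirical ST nearly solves the equation
```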
17 The paper in MathSciNet: Silverstein 1995
18 The paper: Silverstein, cont'd
19 And the theorem: Silverstein's 1995 theorem
20 Silverstein versus Marčenko-Pastur. Silverstein: write $X_n = (x_1, \ldots, x_N)$ and set $y_j = T_n^{1/2} x_j$; then Silverstein's matrix $B_n$ is
$B_n = \frac{1}{N} T_n^{1/2} X_n X_n^* T_n^{1/2} = \frac{1}{N} \sum_{j=1}^N y_j y_j^*$,
i.e. the SCM of i.i.d. random vectors $(y_j)$ with a special structure: more general correlations $T_n$ within the coordinates; the convergence of the ESD is stronger (almost sure); the matrix entries need only be square-integrable.
Marčenko-Pastur: the final equations already appeared in Marčenko-Pastur, and their matrices have an extra additive term $A_n$. (Note, however, that there is a similar paper by Silverstein and Bai where this additional matrix $A_n$ is present.)
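A tiny sketch of the identity above (the positive definite T below is an arbitrary example): by construction, B_n coincides with the SCM of the transformed vectors y_j = T_n^{1/2} x_j:

```python
import numpy as np

rng = np.random.default_rng(3)
p, N = 5, 8
A = rng.standard_normal((p, p))
T = A @ A.T + np.eye(p)                 # an arbitrary positive definite T_n
w, V = np.linalg.eigh(T)
T_half = V @ np.diag(np.sqrt(w)) @ V.T  # symmetric square root T_n^{1/2}

X = rng.standard_normal((p, N))         # columns x_j with i.i.d. coordinates
Y = T_half @ X                          # y_j = T_n^{1/2} x_j, so Cov(y_j) = T_n
B1 = T_half @ X @ X.T @ T_half / N      # Silverstein's B_n
B2 = Y @ Y.T / N                        # SCM of the vectors y_j
print(np.allclose(B1, B2))              # the two forms coincide
```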
21 Yet another MP type theorem. The problem with Silverstein's theorem is that various within-component correlations cannot always be put into the form $y_j = T_n^{1/2} x_j$ where $x_j$ has i.i.d. coordinates. Unfortunately, random vectors are not all Gaussian...
22 Bai and Zhou 2008 theorem
23
24 Note. Independent columns; condition 1) specifies a very general correlation pattern within the components.
25 1 High-dimensional data and the sample covariance matrix 2 Variations on the Marčenko-Pastur theorem 3 Seeking a similar theorem for time series: Model and main result; Main idea of the proof; Application to an ARMA process
26 What's next? What about dependent columns, for example a $p$-dimensional time series $x_1, \ldots, x_n$ with stationary population covariance matrix $\Sigma = \mathrm{Cov}(x_t)$? The problem is open for linear time series with general $\Sigma$. A solution is given below for linear time series with $\Sigma = I_p$.
27 An example: S&P 500 daily stock prices; $p = 488$ stocks; $n = 1000$ daily returns $r_t(i) = \log p_t(i)/p_{t-1}(i)$, from … to ….
28 The sample correlation matrix. Let the SCM be
$S_n = \frac{1}{n} \sum_{t=1}^n (r_t - \bar r)(r_t - \bar r)^T$.
We consider the sample correlation matrix $R_n$ with
$R_n(i,j) = \frac{S_n(i,j)}{[S_n(i,i)\, S_n(j,j)]^{1/2}}$.
The 10 largest and 10 smallest eigenvalues of $R_n$ are: [table shown on the slide]
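In code, R_n is obtained from S_n by a diagonal rescaling. A sketch with placeholder Gaussian "returns" standing in for the S&P 500 series (sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
p, n = 6, 250
r = rng.standard_normal((n, p))            # stand-in for the n return vectors r_t
r_bar = r.mean(axis=0)
S = (r - r_bar).T @ (r - r_bar) / n        # the SCM S_n
d = 1.0 / np.sqrt(np.diag(S))
R = S * np.outer(d, d)                     # R_n(i,j) = S_n(i,j) / sqrt(S_n(i,i) S_n(j,j))
print(np.diag(R))                          # all ones, as for any correlation matrix
```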
29 Sample eigenvalues of stock returns. The important questions here are: give an explanation for the largest eigenvalues (using the theory of spikes and factor models); find the (population) correlation structure between the 488 returns from the bulk eigenvalues.
30 A Marčenko-Pastur type theorem for time series. A very special case of time series; the ST of the LSD is characterised by an equation depending on the spectral density of the time series.
31 The model. Consider a univariate real-valued linear process
$z_t = \sum_{k=0}^{\infty} \phi_k \varepsilon_{t-k}$, $t \in \mathbb{Z}$, (1)
where $(\varepsilon_k)$ is a real-valued i.i.d. noise with mean zero and variance 1. The $p$-dimensional process $(X_t)$ considered in this paper is made of $p$ independent copies of the linear process $(z_t)$, i.e. for $X_t = (X_{1t}, \ldots, X_{pt})$,
$X_{it} = \sum_{k=0}^{\infty} \phi_k \varepsilon_{i,t-k}$, $t \in \mathbb{Z}$,
where the $p$ coordinate error processes $\{(\varepsilon_{1,t}), \ldots, (\varepsilon_{p,t})\}$ are independent copies of the univariate error process $\{\varepsilon_t\}$ in (1).
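A sketch of how this model can be simulated; the geometric filter φ_k = 0.5^k, truncated at 30 lags, is an arbitrary absolutely summable choice:

```python
import numpy as np

rng = np.random.default_rng(5)
p, n = 4, 500
phi = 0.5 ** np.arange(30)                     # truncated filter (phi_k)
eps = rng.standard_normal((p, n + phi.size))   # p independent error processes
# X_{it} = sum_k phi_k eps_{i,t-k}: one causal convolution per coordinate
X = np.array([np.convolve(eps[i], phi, mode="valid")[:n] for i in range(p)])
print(X.shape)                                 # (p, n): p independent copies of (z_t)
```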
32 The SCM $S_n$. Let $X_1, \ldots, X_n$ be the observations of the time series; we consider the ESD of the SCM
$S_n = \frac{1}{n} \sum_{j=1}^n X_j X_j^*$. (2)
Note. A previous paper by Jing et al. (2009) considers a similar question, but our result here is much more general (although still not general enough!).
33 Theorem (Y. 2011). Assume that the following conditions hold:
1. the dimensions $p, n \to \infty$ and $p/n \to c \in (0, \infty)$;
2. the error process has a fourth moment: $E\varepsilon_t^4 < \infty$;
3. the linear filter $(\phi_k)$ is absolutely summable, i.e. $\sum_{k=0}^{\infty} |\phi_k| < \infty$.
Then, almost surely, the ESD of $S_n$ tends to a nonrandom probability distribution $F$. Moreover, the Stieltjes transform $s = s(z)$ of $F$ (as a mapping from $\mathbb{C}^+$ into $\mathbb{C}^+$) satisfies the equation
$z = -\frac{1}{s} + \frac{1}{2\pi} \int_0^{2\pi} \frac{1}{cs + \{2\pi f(\lambda)\}^{-1}}\, d\lambda$, (3)
where $f(\lambda)$ is the spectral density of the linear process $(z_t)$:
$f(\lambda) = \frac{1}{2\pi} \Big| \sum_{k=0}^{\infty} \phi_k e^{ik\lambda} \Big|^2$, $\lambda \in [0, 2\pi)$. (4)
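Equation (4) can be sanity-checked numerically: by Parseval's identity, ∫₀^{2π} f(λ) dλ = Σ_k φ_k² = γ₀, the variance of z_t. A sketch with the (arbitrary) geometric filter φ_k = 0.5^k:

```python
import numpy as np

phi = 0.5 ** np.arange(60)               # filter phi_k = 0.5^k, truncated
lam = np.linspace(0, 2 * np.pi, 20000, endpoint=False)
# transfer function sum_k phi_k e^{i k lambda}, evaluated on the grid
transfer = (phi[None, :] * np.exp(1j * np.outer(lam, np.arange(60)))).sum(axis=1)
f = np.abs(transfer) ** 2 / (2 * np.pi)  # spectral density (4)

integral = f.mean() * 2 * np.pi          # integral of f over [0, 2 pi)
gamma0 = (phi ** 2).sum()                # sum_k phi_k^2 = gamma_0
print(integral, gamma0)                  # both close to 4/3
```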
34 Main idea of the proof. The data matrix $(X_1, \ldots, X_n)$ has $p$ i.i.d. rows; the correlations within a row are the autocovariances of the base linear process $(z_t)$: for all $1 \le i \le p$,
$\mathrm{Cov}(X_{is}, X_{it}) = \mathrm{Cov}(z_s, z_t) = \gamma_{t-s}$, $1 \le s, t \le n$.
We can then adapt Bai and Zhou's theorem to this case, with an evaluation of these autocovariances and an exchange of the roles of columns and rows.
35 Application of Bai and Zhou's theorem. It remains to evaluate the LSD, say $H$, of the (deterministic) covariance matrix $T_n$ of each coordinate process $(X_{i1}, \ldots, X_{in})$. This equals the $n$-th order Toeplitz matrix associated with $\tilde f = 2\pi f$:
$T_n(s,t) = \gamma_{t-s}$, $1 \le s, t \le n$, and $\tilde f(\lambda) = \sum_{k=-\infty}^{\infty} \gamma_k e^{ik\lambda}$, $\lambda \in [0, 2\pi)$.
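A sketch building T_n for the AR(1)-type filter φ_k = φ^k (an arbitrary example, whose autocovariances are γ_k = φ^{|k|}/(1−φ²)) and checking that the eigenvalues of the Toeplitz matrix lie in the range of its symbol f̃:

```python
import numpy as np

phi, n = 0.5, 200
# autocovariances gamma_k = phi^{|k|} / (1 - phi^2) of the AR(1)-type process
k = np.abs(np.arange(n)[:, None] - np.arange(n)[None, :])
T = phi ** k / (1 - phi ** 2)           # Toeplitz matrix T_n(s,t) = gamma_{t-s}

lam = np.linspace(0, 2 * np.pi, 1000, endpoint=False)
f_tilde = 1 / np.abs(1 - phi * np.exp(1j * lam)) ** 2   # symbol f~ = 2 pi f

w = np.linalg.eigvalsh(T)
print(w.min(), w.max())   # within [min f~, max f~] = [1/(1+phi)^2, 1/(1-phi)^2]
```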
36 Cont'd. Assume that we can apply Bai and Zhou's theorem: first we identify the LSD $H$ of the Toeplitz matrices $T_n$; second, the ESD of $\frac{n}{p} S_n$ converges a.s. to a nonrandom probability distribution whose Stieltjes transform $m$ solves the equation
$z = -\frac{1}{m} + \frac{1}{c} \int \frac{x}{1 + mx}\, dH(x)$.
We need a theorem from Gábor Szegő to compute the last integral.
37 A theorem of Szegő. The Fourier coefficients $(\gamma_k)$ of the function $\tilde f$ are absolutely summable; $\tilde f$ is smooth, with well-defined minimum $a$ and maximum $b$ on $[0, 2\pi]$. By the fundamental eigenvalue distribution theorem of Szegő for Toeplitz matrices: for any function $\varphi$ continuous on $[a, b]$, and denoting the eigenvalues of $T_n$ by $\sigma_1^{(n)}, \ldots, \sigma_n^{(n)}$, it holds that
$\lim_{n\to\infty} \frac{1}{n} \sum_{k=1}^n \varphi(\sigma_k^{(n)}) = \frac{1}{2\pi} \int_0^{2\pi} \varphi(\tilde f(\lambda))\, d\lambda$.
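Szegő's theorem can be checked numerically, here with ϕ(x) = x² and the same AR(1)-type autocovariances γ_k = φ^{|k|}/(1−φ²) (an arbitrary test case); both sides should approach Σ_k γ_k²:

```python
import numpy as np

phi, n = 0.5, 400
k = np.abs(np.arange(n)[:, None] - np.arange(n)[None, :])
T = phi ** k / (1 - phi ** 2)        # Toeplitz matrix of gamma_k = phi^{|k|}/(1-phi^2)

sig = np.linalg.eigvalsh(T)
lhs = (sig ** 2).mean()              # (1/n) sum_k varphi(sigma_k), varphi(x) = x^2

lam = np.linspace(0, 2 * np.pi, 100000, endpoint=False)
f_tilde = 1 / np.abs(1 - phi * np.exp(1j * lam)) ** 2
rhs = (f_tilde ** 2).mean()          # (1/(2 pi)) integral of varphi(f~(lambda))
print(lhs, rhs)                      # both close to (1+phi^2)/(1-phi^2)^3
```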
38 Cont'd. Consequently, the ESD of $T_n$ weakly converges to a nonrandom distribution $H$ with support $[a, b]$, and we have
$\int_0^{\infty} \varphi(x)\, dH(x) = \frac{1}{2\pi} \int_0^{2\pi} \varphi(\tilde f(\lambda))\, d\lambda$. (5)
Hence we get
$z = -\frac{1}{m} + \frac{1}{c} \int \frac{x}{1 + mx}\, dH(x) = -\frac{1}{m} + \frac{1}{2\pi c} \int_0^{2\pi} \frac{1}{m + 1/\tilde f(\lambda)}\, d\lambda$.
The final equation is obtained by observing the relation $s(z) = \frac{1}{c} m(z/c)$.
39 Application to p-dimensional ARMA(1,1) series. The base process is
$z_t = \phi z_{t-1} + \varepsilon_t + \theta \varepsilon_{t-1}$, $t \in \mathbb{Z}$,
where $|\phi| < 1$ and $\theta$ is real. The general equation (3) for the ST reduces to
$z = -\frac{1}{s} + \frac{\theta}{cs\theta - \phi} - \frac{(\phi + \theta)(1 + \phi\theta)}{(cs\theta - \phi)^2\, \epsilon(\alpha)\sqrt{\alpha^2 - 4}}$, (6)
with
$\alpha = \frac{cs(1 + \theta^2) + 1 + \phi^2}{cs\theta - \phi}$, $\epsilon(\alpha) = \mathrm{sgn}(\Im\alpha)$. (7)
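The reduced equation can be cross-checked against the general equation (3) by direct numerical integration. In the sketch below, the closed form coded in `z_arma` is my reading of (6)-(7); the parameters and the test point s are arbitrary, and ε(α) selects the branch of the square root:

```python
import numpy as np

phi, theta, c = 0.4, 0.2, 0.2
lam = np.linspace(0, 2 * np.pi, 200000, endpoint=False)
# spectral density times 2 pi for the ARMA(1,1) process
f_tilde = np.abs(1 + theta * np.exp(1j * lam)) ** 2 / np.abs(1 - phi * np.exp(1j * lam)) ** 2

def z_general(s):
    # the general equation (3): z = -1/s + (1/2pi) * integral of 1/(c s + 1/f~)
    return -1 / s + np.mean(1.0 / (c * s + 1.0 / f_tilde))

def z_arma(s):
    # the ARMA(1,1) closed form (6)-(7), as reconstructed from the slide
    d = c * s * theta - phi
    alpha = (c * s * (1 + theta ** 2) + 1 + phi ** 2) / d
    eps = np.sign(alpha.imag)
    return (-1 / s + theta / d
            - (phi + theta) * (1 + phi * theta) / (d ** 2 * eps * np.sqrt(alpha ** 2 - 4)))

s = -0.3 + 0.7j                     # an arbitrary point with Im(s) > 0
gap = abs(z_general(s) - z_arma(s))
print(gap)                          # close to 0: the two expressions agree
```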
40 Density plots of the LSD. [Figure: densities of the LSD from the ARMA(1,1) model. Left to right and top to bottom: $(\phi, \theta, c) = (0.4, 0, 0.2)$, $(0.4, 0.2, 0.2)$, $(0.4, 0.6, 0.2)$, $(0.8, 0.2, 0.2)$.]
41 Conclusions. Marčenko-Pastur theorems are a fundamental tool for understanding the spectral distributions of large sample covariance matrices. For dependent observations (time series, for example), much work needs to be done; existing results are only partial answers.
More informationFor a stochastic process {Y t : t = 0, ±1, ±2, ±3, }, the mean function is defined by (2.2.1) ± 2..., γ t,
CHAPTER 2 FUNDAMENTAL CONCEPTS This chapter describes the fundamental concepts in the theory of time series models. In particular, we introduce the concepts of stochastic processes, mean and covariance
More informationEigenvalue spectra of time-lagged covariance matrices: Possibilities for arbitrage?
Eigenvalue spectra of time-lagged covariance matrices: Possibilities for arbitrage? Stefan Thurner www.complex-systems.meduniwien.ac.at www.santafe.edu London July 28 Foundations of theory of financial
More informationReview Session: Econometrics - CLEFIN (20192)
Review Session: Econometrics - CLEFIN (20192) Part II: Univariate time series analysis Daniele Bianchi March 20, 2013 Fundamentals Stationarity A time series is a sequence of random variables x t, t =
More informationRandom matrix pencils and level crossings
Albeverio Fest October 1, 2018 Topics to discuss Basic level crossing problem 1 Basic level crossing problem 2 3 Main references Basic level crossing problem (i) B. Shapiro, M. Tater, On spectral asymptotics
More informationMultivariate Random Variable
Multivariate Random Variable Author: Author: Andrés Hincapié and Linyi Cao This Version: August 7, 2016 Multivariate Random Variable 3 Now we consider models with more than one r.v. These are called multivariate
More informationSTAT 200C: High-dimensional Statistics
STAT 200C: High-dimensional Statistics Arash A. Amini May 30, 2018 1 / 57 Table of Contents 1 Sparse linear models Basis Pursuit and restricted null space property Sufficient conditions for RNS 2 / 57
More informationRegularized Estimation of High Dimensional Covariance Matrices. Peter Bickel. January, 2008
Regularized Estimation of High Dimensional Covariance Matrices Peter Bickel Cambridge January, 2008 With Thanks to E. Levina (Joint collaboration, slides) I. M. Johnstone (Slides) Choongsoon Bae (Slides)
More informationHYPOTHESIS TESTING ON LINEAR STRUCTURES OF HIGH DIMENSIONAL COVARIANCE MATRIX
Submitted to the Annals of Statistics HYPOTHESIS TESTING ON LINEAR STRUCTURES OF HIGH DIMENSIONAL COVARIANCE MATRIX By Shurong Zheng, Zhao Chen, Hengjian Cui and Runze Li Northeast Normal University, Fudan
More informationTime Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY
Time Series Analysis James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY & Contents PREFACE xiii 1 1.1. 1.2. Difference Equations First-Order Difference Equations 1 /?th-order Difference
More informationTime series models in the Frequency domain. The power spectrum, Spectral analysis
ime series models in the Frequency domain he power spectrum, Spectral analysis Relationship between the periodogram and the autocorrelations = + = ( ) ( ˆ α ˆ ) β I Yt cos t + Yt sin t t= t= ( ( ) ) cosλ
More informationStochastic Processes. M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno
Stochastic Processes M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno 1 Outline Stochastic (random) processes. Autocorrelation. Crosscorrelation. Spectral density function.
More informationConvergence of empirical spectral distributions of large dimensional quaternion sample covariance matrices
Ann Inst Stat Math 206 68:765 785 DOI 0007/s0463-05-054-0 Convergence of empirical spectral distributions of large dimensional quaternion sample covariance matrices Huiqin Li Zhi Dong Bai Jiang Hu Received:
More informationNonparametric Eigenvalue-Regularized Precision or Covariance Matrix Estimator
Nonparametric Eigenvalue-Regularized Precision or Covariance Matrix Estimator Clifford Lam Department of Statistics, London School of Economics and Political Science Abstract We introduce nonparametric
More informationSignals and Spectra - Review
Signals and Spectra - Review SIGNALS DETERMINISTIC No uncertainty w.r.t. the value of a signal at any time Modeled by mathematical epressions RANDOM some degree of uncertainty before the signal occurs
More informationRobust covariance matrices estimation and applications in signal processing
Robust covariance matrices estimation and applications in signal processing F. Pascal SONDRA/Supelec GDR ISIS Journée Estimation et traitement statistique en grande dimension May 16 th, 2013 FP (SONDRA/Supelec)
More informationAdvanced Econometrics
Advanced Econometrics Marco Sunder Nov 04 2010 Marco Sunder Advanced Econometrics 1/ 25 Contents 1 2 3 Marco Sunder Advanced Econometrics 2/ 25 Music Marco Sunder Advanced Econometrics 3/ 25 Music Marco
More informationLarge System Analysis of Projection Based Algorithms for the MIMO Broadcast Channel
Large System Analysis of Projection Based Algorithms for the MIMO Broadcast Channel Christian Guthy and Wolfgang Utschick Associate Institute for Signal Processing, Technische Universität München Joint
More information