SF2943: A FEW COMMENTS ON SPECTRAL DENSITIES
FILIP LINDSKOG

The aim of this brief note is to get some basic understanding of spectral densities by computing, plotting and interpreting the spectral densities of the stationary time series

(1) X_t = 0.8 X_{t-1} + Z_t + 0.9 Z_{t-1},
(2) X_t = -0.8 X_{t-1} - 0.9 X_{t-2} + Z_t,
(3) X_t = -0.8 X_{t-1} + Z_t,
(4) X_t = 0.8 X_{t-1} + Z_t,

where \{Z_t\} is an iid sequence of standard normal random variables. The processes are examples of causal linear time series: the roots of the autoregressive polynomials are all located outside the unit circle in the complex plane.

In order to get some very preliminary feeling for the four systems above, we study how the linear systems transform the input (z_1, z_2, z_3, \dots) = (1, 0, 0, \dots) into output values (x_1, x_2, x_3, \dots). The result (with linear interpolation) is shown in Figure 1. The AR(1) process (3) shows a periodic behavior, although a rather trivial one (plus, minus, plus, minus, etc.). The AR(2) process (2) shows a more interesting periodic behavior, with cycle lengths of approximately 3 time steps.

For a zero-mean stationary time series with ACVF \gamma satisfying \sum_h |\gamma(h)| < \infty, the spectral density is given by

(5)   f(\lambda) = \frac{1}{2\pi} \sum_{h=-\infty}^{\infty} e^{-ih\lambda} \gamma(h).

The assumption \sum_h |\gamma(h)| < \infty ensures that the sum in (5) is absolutely convergent (makes sense). Since e^{-ih\lambda} = \cos(h\lambda) - i\sin(h\lambda) and since \cos and \sin both have period 2\pi, it is sufficient to consider only \lambda in (-\pi, \pi]. Since \sin is an odd function, \sin(-x) = -\sin(x), and the sum in (5) is computed over a set of h-values that is symmetric about 0,

f(\lambda) = \frac{1}{2\pi} \sum_{h=-\infty}^{\infty} e^{-ih\lambda} \gamma(h) = \frac{1}{2\pi} \sum_{h=-\infty}^{\infty} \cos(h\lambda) \gamma(h).

In particular, f is a real-valued even function, f(\lambda) = f(-\lambda). Therefore, it is sufficient to consider only \lambda in [0, \pi]. Moreover, it can be shown that f(\lambda) \ge 0.
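The impulse-response experiment behind Figure 1 is easy to replicate. The following sketch (in Python/NumPy rather than the R used later in this note; the helper `impulse_response` is not from the note) feeds the input (1, 0, 0, ...) through the four recursions and reproduces the alternating signs of process (3) and the roughly 3-step cycle of process (2).

```python
import numpy as np

def impulse_response(phi, theta=(), n=20):
    """x_t = sum_j phi[j]*x_{t-1-j} + z_t + sum_j theta[j]*z_{t-1-j},
    driven by the input z = (1, 0, 0, ...)."""
    z = np.zeros(n)
    z[0] = 1.0
    x = np.zeros(n)
    for t in range(n):
        # autoregressive part: coefficients applied to past outputs
        ar = sum(p * x[t - 1 - j] for j, p in enumerate(phi) if t - 1 - j >= 0)
        # moving-average part: current and past inputs
        ma = z[t] + sum(q * z[t - 1 - j] for j, q in enumerate(theta) if t - 1 - j >= 0)
        x[t] = ar + ma
    return x

x1 = impulse_response(phi=(0.8,), theta=(0.9,))  # process (1), ARMA(1,1)
x2 = impulse_response(phi=(-0.8, -0.9))          # process (2): ~3-step cycles
x3 = impulse_response(phi=(-0.8,))               # process (3): +, -, +, -, ...
x4 = impulse_response(phi=(0.8,))                # process (4): monotone decay
```

Process (3) gives x_t = (-0.8)^t, so the sign flips every step, while process (2) has local peaks at roughly every third time step, matching the spectral-density peaks discussed below.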
Let us compute the spectral density of a causal AR(1) process with \gamma(h) = \sigma^2 \varphi^{|h|} / (1 - \varphi^2):

f(\lambda) = \frac{\sigma^2}{2\pi(1-\varphi^2)} \Big( 1 + \sum_{h=1}^{\infty} \varphi^h \big( e^{ih\lambda} + e^{-ih\lambda} \big) \Big)
           = \Big\{ \sum_{h=0}^{\infty} z^h = \frac{1}{1-z} \text{ if } |z| < 1 \Big\}
           = \frac{\sigma^2}{2\pi(1-\varphi^2)} \Big( 1 + \frac{\varphi e^{i\lambda}}{1-\varphi e^{i\lambda}} + \frac{\varphi e^{-i\lambda}}{1-\varphi e^{-i\lambda}} \Big)
           = \frac{\sigma^2}{2\pi(1-\varphi^2)} \cdot \frac{1 - \varphi^2 e^{i\lambda} e^{-i\lambda}}{1 - \varphi(e^{i\lambda} + e^{-i\lambda}) + \varphi^2 e^{i\lambda} e^{-i\lambda}}
           = \frac{\sigma^2}{2\pi} \big( 1 - 2\varphi\cos\lambda + \varphi^2 \big)^{-1}.

For a causal ARMA(p, q) process it can be shown that

f(\lambda) = \frac{\sigma^2}{2\pi} \frac{|\theta(e^{-i\lambda})|^2}{|\varphi(e^{-i\lambda})|^2}, \quad \lambda \in (-\pi, \pi].

In particular, for a causal AR(2) process f(\lambda) is maximized for \lambda = \arccos\big( \varphi_1(\varphi_2 - 1)/(4\varphi_2) \big). For the AR(2) process in (2) we get \lambda \approx \arccos(-0.42) \approx 2.0 maximising the spectral density. This corresponds to the cycle length 2\pi/\lambda \approx 3.

The spectral densities of (1) to (4), for \lambda \in [0, \pi], are shown in Figure 2. The AR(1) process (3) has a spectral density with a maximum at \lambda = \pi. This is consistent with the cycle length of 2\pi/\pi = 2 shown in Figure 1. The AR(2) process (2) has a spectral density with a maximum at \lambda \approx 2. This is consistent with the cycle length of 2\pi/2 = \pi \approx 3 shown in Figure 1. The processes (1) and (4) have spectral densities with maxima at \lambda = 0. They show no periodic behavior (cycles of infinite length).

Notice that

\int_{-\pi}^{\pi} e^{ik\lambda} f(\lambda) \, d\lambda = \frac{1}{2\pi} \sum_{h=-\infty}^{\infty} \gamma(h) \int_{-\pi}^{\pi} e^{i(k-h)\lambda} \, d\lambda = \gamma(k)

since the integral vanishes for h \neq k (and equals 2\pi for h = k). We notice that \gamma(0) = \int_{-\pi}^{\pi} f(\lambda) \, d\lambda. In particular, the comparison of the spectral densities might be visualized more clearly by dividing the spectral densities by the stationary variance \gamma(0) of the respective processes.

Finally, we want to see if the sample analogues of the spectral densities, based on simulated samples \{x_1, \dots, x_n\}, are sufficiently accurate to draw the same conclusions as if we knew the spectral densities. The result is shown in Figure 4. The sample analogue of
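The AR(1) closed form and the AR(2) peak formula can both be checked numerically. This sketch (Python/NumPy; the note's own code is in R, and the variable names are illustrative) compares the closed form against a truncated version of the defining sum (5), using \gamma(h) = \sigma^2 \varphi^{|h|}/(1-\varphi^2), and evaluates the AR(2) peak location for process (2) both by the arccos formula and by brute force on the density.

```python
import numpy as np

# AR(1) check: closed form f(lam) = sigma^2 / (2*pi*(1 - 2*phi*cos(lam) + phi^2))
# versus the defining sum (5) truncated at |h| <= 200 (the tail is O(phi^200)).
phi, sigma2 = 0.8, 1.0
lam = np.linspace(0.0, np.pi, 201)

closed = sigma2 / (2 * np.pi * (1 - 2 * phi * np.cos(lam) + phi ** 2))

h = np.arange(-200, 201)
gamma = sigma2 * phi ** np.abs(h) / (1 - phi ** 2)
truncated = (gamma * np.cos(np.outer(lam, h))).sum(axis=1) / (2 * np.pi)
max_err = np.max(np.abs(closed - truncated))

# AR(2) peak for process (2): phi1 = -0.8, phi2 = -0.9.
phi1, phi2 = -0.8, -0.9
lam_peak = np.arccos(phi1 * (phi2 - 1) / (4 * phi2))  # ~2.0; cycle length 2*pi/lam_peak ~ 3

# Same peak by brute force: f is maximized where |phi(e^{-i lam})|^2 is minimized.
denom = (1 + phi1 ** 2 + phi2 ** 2 + 2 * phi2
         + 2 * (phi1 * phi2 - phi1) * np.cos(lam) - 4 * phi2 * np.cos(lam) ** 2)
lam_argmax = lam[np.argmin(denom)]
```

The expanded denominator is the same expression used by `ar2specdens` in the R code at the end of the note, so this doubles as a check of that function.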
(5) is obtained by replacing the autocovariance function \gamma by the sample autocovariance function \hat{\gamma} in (5):

\hat{f}(\lambda) = \frac{1}{2\pi} \sum_{h=-n+1}^{n-1} e^{-ih\lambda} \hat{\gamma}(h).

The sample analogue of the spectral density corresponds to what is called the periodogram of \{x_1, \dots, x_n\}:

I_n(\lambda) = \frac{1}{n} \Big| \sum_{t=1}^{n} x_t e^{-it\lambda} \Big|^2.

It can be shown that

I_n(\omega_k) = 2\pi \hat{f}(\omega_k), \quad \omega_k = \frac{2\pi k}{n} \in (-\pi, \pi].
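The relation between the periodogram and the sample analogue of (5) can be verified directly. This sketch (Python/NumPy, on an arbitrary fixed sample rather than the note's simulated AR/ARMA paths; `gamma_hat` is an illustrative helper) evaluates both sides at a nonzero Fourier frequency, where subtracting the sample mean does not change the periodogram.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(64)   # any sample works; the identity is algebraic
n = len(x)
xbar = x.mean()

def gamma_hat(h):
    """Sample autocovariance at lag h (mean-corrected, divisor n)."""
    h = abs(h)
    return np.sum((x[: n - h] - xbar) * (x[h:] - xbar)) / n

k = 5                       # any k with 0 < w <= pi
w = 2 * np.pi * k / n       # Fourier frequency w_k = 2*pi*k/n

# Periodogram I_n(w) = (1/n) |sum_t x_t e^{-i t w}|^2
I = np.abs(np.sum(x * np.exp(-1j * w * np.arange(1, n + 1)))) ** 2 / n

# Sample analogue of (5): f_hat(w) = (1/2pi) sum_{|h|<n} gamma_hat(h) e^{-i h w}
f_hat = sum(gamma_hat(h) * np.exp(-1j * h * w) for h in range(-n + 1, n)).real / (2 * np.pi)
```

At w_k the two quantities agree up to floating-point error: I_n(w_k) = 2*pi*f_hat(w_k).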
R-code

ar2specdens <- function(l, s, phi1, phi2) {
  s^2 / (2*pi*(1 + phi1^2 + phi2^2 + 2*phi2 + 2*(phi1*phi2 - phi1)*cos(l) - 4*phi2*cos(l)^2))
}
arma11specdens <- function(l, s, phi, theta) {
  s^2 * (1 + theta^2 + 2*theta*cos(l)) / (2*pi*(1 + phi^2 - 2*phi*cos(l)))
}

xvals <- (0:100)*pi/100
plot(xvals, arma11specdens(xvals, 1, 0.8, 0.9), type="l", xlab="", ylab="",
     ylim=c(0, max(ar2specdens(xvals, 1, -0.8, -0.9))))
lines(xvals, ar2specdens(xvals, 1, -0.8, -0.9))
lines(xvals, ar2specdens(xvals, 1, -0.8, 0))
lines(xvals, ar2specdens(xvals, 1, 0.8, 0))

ar2roots <- polyroot(c(1, 0.8, 0.9))
(ar2roots[1]*ar2roots[2])^2 / ((ar2roots[1]*ar2roots[2]-1)*(ar2roots[2]-ar2roots[1])) *
  (ar2roots[1]/(ar2roots[1]^2-1) - ar2roots[2]/(ar2roots[2]^2-1))

g01 <- (1 + 1.7^2/(1-0.8^2))
g02 <- (ar2roots[1]*ar2roots[2])^2 / ((ar2roots[1]*ar2roots[2]-1)*(ar2roots[2]-ar2roots[1])) *
  (ar2roots[1]/(ar2roots[1]^2-1) - ar2roots[2]/(ar2roots[2]^2-1))
g03 <- 1/(1-0.8^2)
g04 <- 1/(1-0.8^2)

plot(xvals, arma11specdens(xvals, 1, 0.8, 0.9)/g01, type="l", xlab="", ylab="",
     ylim=c(0, max(ar2specdens(xvals, 1, -0.8, -0.9)/Re(g02))))
lines(xvals, ar2specdens(xvals, 1, -0.8, -0.9)/Re(g02))
lines(xvals, ar2specdens(xvals, 1, -0.8, 0)/g03)
lines(xvals, ar2specdens(xvals, 1, 0.8, 0)/g04)

z <- 1
tmpmat <- matrix(0, 4, 20)
tmpmat[1,1] <- z
tmpmat[2,1] <- z
tmpmat[3,1] <- z
tmpmat[4,1] <- z
tmpmat[1,2] <- 0.8*tmpmat[1,1] + 0.9*z
tmpmat[2,2] <- -0.8*tmpmat[2,1]
tmpmat[3,2] <- -0.8*tmpmat[3,1]
tmpmat[4,2] <- 0.8*tmpmat[4,1]
for (i in (3:20)) {
  tmpmat[1,i] <- 0.8*tmpmat[1,i-1]
  tmpmat[2,i] <- -0.8*tmpmat[2,i-1] - 0.9*tmpmat[2,i-2]
  tmpmat[3,i] <- -0.8*tmpmat[3,i-1]
  tmpmat[4,i] <- 0.8*tmpmat[4,i-1]
}
plot(tmpmat[1,], type="l", ylim=c(-1, 1.8), xlab="", ylab="", col="red", xaxt="n")
axis(1, (0:20))
lines(tmpmat[2,], col="blue")
lines(tmpmat[3,], col="green")
lines(tmpmat[4,], col="black")

arma1.sim <- arima.sim(model=list(ar=c(-0.8, -0.9)), n=200)
arma2.sim <- arima.sim(model=list(ar=c(-0.8)), n=200)
arma3.sim <- arima.sim(model=list(ar=c(0.8)), n=200)
arma4.sim <- arima.sim(model=list(ar=c(0.8), ma=c(0.9)), n=200)

# The original periodogram calls were lost in extraction; spectrum() is a
# reasonable reconstruction (it plots the periodogram against frequency in
# cycles per unit time, hence the xvals/(2*pi) scaling of the overlays).
spectrum(arma1.sim)
lines(xvals/(2*pi), ar2specdens(xvals, 1, -0.8, -0.9))
spectrum(arma2.sim)
lines(xvals/(2*pi), ar2specdens(xvals, 1, -0.8, 0))
spectrum(arma3.sim)
lines(xvals/(2*pi), ar2specdens(xvals, 1, 0.8, 0))
spectrum(arma4.sim)
lines(xvals/(2*pi), arma11specdens(xvals, 1, 0.8, 0.9))
Figures

Figure 1. Illustration of how the systems transform input to output
Figure 2. Spectral densities
Figure 3. Spectral densities normalized by dividing by the stationary variance of the time series
Figure 4. Periodograms based on samples of size 200 and the corresponding spectral densities