STAT 248: EDA & Stationarity Handout 3
GSI: Gido van de Ven
September 17th

1 Introduction

In today's section we will deal with the following topics: the mean function, the auto- and cross-covariance functions, stationarity, the sample acf, and the correlogram. Useful readings are Chapters 1 and 2 of Shumway and Stoffer's book and Chapters 2 and 3 of Cryer and Chan's book. Note that both books are freely available online through the university library system.

2 Stationarity

Sometimes we may wish to specify the collection of joint distributions of all finite-dimensional vectors $(X_{t_1}, X_{t_2}, \ldots, X_{t_n})$, $t = (t_1, \ldots, t_n) \in T^n$, $n \in \{1, 2, \ldots\}$. In such a case we need to be sure that a stochastic process with the specified distributions really does exist. Kolmogorov's theorem guarantees that this is true under minimal conditions on the specified distribution functions.

We are going to describe a time series by its first and second moments: the mean function and the covariance function, respectively. You can think of the covariance function as the average cross-product relative to the joint distribution $f(x_r, x_s)$.

Definition [mean function]: The mean function, provided it exists, is defined as

$$\mu_{X_t} = E(X_t) = \int_{-\infty}^{\infty} x f_t(x)\,dx, \qquad (1)$$

where $F_t(x) = P(X_t \le x)$ and $f_t(x) = \partial F_t(x)/\partial x$.

Definition [autocovariance function (acvf)]: If $\{X_t, t \in T\}$ is a process such that $\mathrm{Var}(X_t) < \infty$ for each $t \in T$, then its autocovariance function $\gamma_X(\cdot,\cdot)$ is defined as

$$\gamma_X(r, s) = \mathrm{Cov}(X_r, X_s) = E[(X_r - E(X_r))(X_s - E(X_s))], \qquad r, s \in T. \qquad (2)$$

The autocovariance function measures the linear dependence between two points on the same series observed at different times. If $X_r$ and $X_s$ are independent, then $\gamma_X(r, s) = 0$. However, if $\gamma_X(r, s) = 0$, then $X_r$ and $X_s$ are NOT necessarily independent.

Remark: What happens when $s = r$?
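The caveat above (zero covariance does not imply independence) is easy to check numerically. A minimal Python sketch, given as an illustration only (the handout itself uses R): with $X \sim N(0,1)$ and $Y = X^2$, the pair is fully dependent, yet $\mathrm{Cov}(X, Y) = E[X^3] = 0$.

```python
import random

random.seed(0)
n = 100_000

# X ~ N(0, 1); Y = X^2 is a deterministic function of X (fully dependent on X),
# yet Cov(X, Y) = E[X^3] - E[X] E[X^2] = 0 for a standard normal.
x = [random.gauss(0.0, 1.0) for _ in range(n)]
y = [xi * xi for xi in x]

mean_x = sum(x) / n
mean_y = sum(y) / n
cov_xy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / n

print(round(cov_xy, 3))  # close to 0 despite perfect dependence
```

The sample covariance comes out near zero even though $Y$ is completely determined by $X$, which is exactly why the acvf only measures linear dependence.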
Definition [(weak) stationarity]: The time series $\{X_t, t \in \mathbb{Z}\}$, with index set $\mathbb{Z} = \{0, \pm 1, \pm 2, \ldots\}$, is said to be (weakly) stationary if:

1. $E|X_t|^2 < \infty$ for all $t \in \mathbb{Z}$;
2. $E(X_t) = \mu$ for all $t \in \mathbb{Z}$;
3. $\gamma_X(r, s) = \gamma_X(r + t, s + t)$ for all $r, s, t \in \mathbb{Z}$.

Intuitively, a time series is stationary if it has finite variance, its mean value is constant and does not depend on time, and its covariance function depends on $r$ and $s$ only through the difference $r - s$.

Stationary processes play a crucial role in the analysis of time series. Of course, many observed time series are decidedly non-stationary in appearance. Frequently such data sets can be transformed into series which can reasonably be modelled as realizations of some stationary process. The theory of stationary processes is then used for the analysis, fitting and prediction of the resulting series. In all of this the autocovariance function is a primary tool.

Remark 1: Stationarity as defined here is frequently referred to in the literature as weak stationarity, covariance stationarity, stationarity in the wide sense, or second-order stationarity. When we refer to stationarity, we will mean the three properties in the definition above.

Remark 2: If $\{X_t, t \in \mathbb{Z}\}$ is stationary, then $\gamma_X(r, s) = \gamma_X(r - s, 0)$ for all $r, s \in \mathbb{Z}$. It is therefore convenient to redefine the autocovariance function of a stationary process as a function of just one variable:

$$\gamma_X(h) = \gamma_X(h, 0) = \mathrm{Cov}(X_{t+h}, X_t), \qquad t, h \in \mathbb{Z}. \qquad (3)$$

The function $\gamma_X(\cdot)$ will be referred to as the autocovariance function of $\{X_t\}$ and $\gamma_X(h)$ as its value at lag $h$. Note that $\gamma_X(s, t) = \gamma_X(t, s)$ for all points $s$ and $t$.

Elementary properties of the autocovariance function: If $\gamma(\cdot)$ is the autocovariance function of a stationary process $\{X_t, t \in \mathbb{Z}\}$, then:

- $\gamma(0) \ge 0$;
- $|\gamma(h)| \le \gamma(0)$ for all $h \in \mathbb{Z}$;
- $\gamma(h) = \gamma(-h)$ for all $h \in \mathbb{Z}$.

Definition [strict stationarity]: The time series $\{X_t, t \in \mathbb{Z}\}$ is said to be strictly stationary if the joint distributions of $(X_{t_1}, \ldots, X_{t_k})$ and $(X_{t_1+h}, \ldots, X_{t_k+h})$ are the same for all positive integers $k$ and for all $t_1, \ldots, t_k, h \in \mathbb{Z}$.

Strict stationarity means intuitively that the graphs over two equal-length time intervals of a realization of the time series should exhibit similar statistical characteristics. For example, the proportion of ordinates not exceeding a given level $x$ should be roughly the same for both intervals.

Remark 1: The previous definition is equivalent to the statement that $(X_1, \ldots, X_k)$ and $(X_{1+h}, \ldots, X_{k+h})$ have the same joint distribution for all positive integers $k$ and integers $h$.

Remark 2: A strictly stationary process with finite second moments is (weakly) stationary. The converse is not necessarily true.
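As an aside (not part of the original handout), nonstationarity is easy to see numerically. The Python sketch below simulates many realizations of the random walk $X_t = Z_1 + \cdots + Z_t$ with i.i.d. standard normal steps; its variance $\mathrm{Var}(X_t) = t$ grows with $t$, so the process violates the constant-covariance requirement of weak stationarity.

```python
import random

random.seed(1)
n_paths, n_steps = 2000, 100

# Simulate many independent random-walk realizations X_t = Z_1 + ... + Z_t.
paths = []
for _ in range(n_paths):
    x, path = 0.0, []
    for _ in range(n_steps):
        x += random.gauss(0.0, 1.0)
        path.append(x)
    paths.append(path)

def var_at(t):
    """Across-realization sample variance of X_t (theoretical value: t)."""
    vals = [p[t - 1] for p in paths]
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)

# Var(X_t) = t for a standard-normal random walk, so no single variance
# describes the whole series and the process cannot be (weakly) stationary.
print(var_at(10), var_at(100))
```

The estimated variance at $t = 100$ is roughly ten times the variance at $t = 10$, matching $\mathrm{Var}(X_t) = t$.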
Remark 3: If $\{X_t, t \in \mathbb{Z}\}$ is a stationary Gaussian process, then $\{X_t\}$ is strictly stationary, since for all $n \in \{1, 2, \ldots\}$ and for all $h, t_1, t_2, \ldots, t_n \in \mathbb{Z}$ the random vectors $(X_{t_1}, \ldots, X_{t_n})$ and $(X_{t_1+h}, \ldots, X_{t_n+h})$ have the same mean and covariance matrix, and hence the same distribution.

Definition [autocorrelation function (acf)]: The autocorrelation function of a stationary time series is the function whose value at lag $h$ is

$$\rho_X(h) = \frac{\gamma_X(h)}{\gamma_X(0)} = \mathrm{Corr}(X_{t+h}, X_t), \qquad t, h \in \mathbb{Z}.$$

The Cauchy-Schwarz inequality shows that $-1 \le \rho(h) \le 1$ for all $h$. Further, $\rho_X(h) = 0$ if $X_t$ and $X_{t+h}$ are uncorrelated, and $\rho_X(h) = \pm 1$ if $X_{t+h} = \alpha_0 + \alpha_1 X_t$. The value $\rho_X(h)$ is a rough measure of the ability to forecast the series at time $t + h$ from the value at time $t$.

Definition [cross-covariance function (ccvf)]: If $\{X_t, t \in T\}$ and $\{Y_t, t \in T\}$ are processes such that $\mathrm{Var}(X_t) < \infty$ and $\mathrm{Var}(Y_t) < \infty$ for each $t \in T$, then the cross-covariance function $\gamma_{XY}(\cdot,\cdot)$ is defined as

$$\gamma_{XY}(r, s) = \mathrm{Cov}(X_r, Y_s) = E[(X_r - E(X_r))(Y_s - E(Y_s))], \qquad r, s \in T. \qquad (4)$$

Of course there is also a scaled version of the cross-covariance function:

Definition [cross-correlation function (ccf)]: The cross-correlation function of $\{X_t, t \in T\}$ and $\{Y_t, t \in T\}$ is defined as

$$\rho_{XY}(s, t) = \frac{\gamma_{XY}(s, t)}{\sqrt{\gamma_X(s, s)\,\gamma_Y(t, t)}}, \qquad s, t \in T.$$

3 Estimation of the Mean and Correlation Functions

As we have just one realization of our time series, the assumption of stationarity becomes critical. Somehow, we must use averages over this single realization to estimate the population mean and covariance functions. If a time series is stationary, the mean function is constant, $\mu_t = \mu$. In that case we can estimate it by the sample mean:

$$\hat{\mu} = \bar{x} = \frac{1}{n} \sum_{t=1}^{n} x_t.$$

Assuming stationarity, the autocovariance and autocorrelation functions can be estimated as follows.
Definition: The sample autocovariance function is defined as

$$\hat{\gamma}(h) = \frac{1}{n} \sum_{t=1}^{n-h} (x_{t+h} - \bar{x})(x_t - \bar{x}),$$

with $\hat{\gamma}(-h) = \hat{\gamma}(h)$, for $h = 0, 1, \ldots, n-1$. Dividing by $n$ rather than by $n - h$ ensures that the function is nonnegative definite, although it is not an unbiased estimate of $\gamma(h)$.

Definition: The sample autocorrelation function is defined, analogously, as

$$\hat{\rho}(h) = \frac{\hat{\gamma}(h)}{\hat{\gamma}(0)}.$$

Figure 1: Left: sample acf of Gaussian white noise. Right: sample acf of the series generated by $X_t = t + Z_t$, where $Z_t$ is Gaussian white noise (i.e. $X_t$ is white noise with a deterministic trend $t$).

> white.noise = as.ts(rnorm(100))
> acf(white.noise, main = "Correlogram of 'white noise'")
> t = 1:100
> Xt = t + white.noise
> acf(Xt, main = "Correlogram of 'white noise with trend'")

The sample autocorrelation function has a sampling distribution that allows us to assess whether the data come from a completely random (white) series, or whether correlations are statistically significant at some lags.

Large-sample distribution of the acf: Under general conditions, if $x_t$ is white noise, then for large $n$ the sample acf $\hat{\rho}_X(h)$, for $h = 1, 2, \ldots, H$, where $H$ is fixed but arbitrary, is approximately normally distributed with zero mean and standard deviation

$$\sigma_{\hat{\rho}_X(h)} = \frac{1}{\sqrt{n}}.$$

Remark 1: Based on this result, we obtain a rough method of assessing whether peaks in $\hat{\rho}(h)$ are significant, by determining whether the observed peak lies outside the interval $\pm 2/\sqrt{n}$; for a white noise sequence, approximately 95% of the sample acf values should fall within these limits. After trying to reduce a time series to a white noise series, the acf values of the residuals should then lie roughly within the limits given above.

Remark 2: The sample autocovariance and autocorrelation functions can be computed for non-stationary processes. For data containing a trend, $\hat{\rho}(h)$ will exhibit slow decay as $h$ increases, and for data with a substantial deterministic periodic component, $\hat{\rho}(h)$ will exhibit similar behaviour with the same periodicity.

Remark 3: The sample cross-covariance and sample cross-correlation functions are defined analogously to the sample autocovariance and sample autocorrelation functions.

3.1 R functions

mean(), acf(), acf(type = "covariance"), ccf(), ccf(type = "covariance").

4 Bibliography

This handout is based on handouts prepared by Irma Hernandez-Magallanes, previous GSI for this course. Additional sources that are used, and that could be useful for you:

- Time Series: Data Analysis and Theory by David R. Brillinger
- Time Series: Theory and Methods by Peter Brockwell & Richard Davis
- Time Series Analysis and Its Applications: With R Examples by Robert Shumway & David Stoffer
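As a closing illustration of the estimators in Section 3, here is a small Python translation (an illustration only; the handout itself uses R's acf()). The helper names sample_acvf and sample_acf are hypothetical, not part of any library. The code computes $\hat{\gamma}(h)$ with divisor $n$ and $\hat{\rho}(h) = \hat{\gamma}(h)/\hat{\gamma}(0)$, and counts how many lags of a white noise series fall inside the $\pm 2/\sqrt{n}$ bands.

```python
import random

def sample_acvf(x, h):
    """Sample autocovariance at lag h >= 0, using the divisor n (not n - h)."""
    n = len(x)
    xbar = sum(x) / n
    return sum((x[t + h] - xbar) * (x[t] - xbar) for t in range(n - h)) / n

def sample_acf(x, max_lag):
    """Sample autocorrelation rho_hat(h) = gamma_hat(h) / gamma_hat(0)."""
    g0 = sample_acvf(x, 0)
    return [sample_acvf(x, h) / g0 for h in range(max_lag + 1)]

random.seed(2)
n = 500
white_noise = [random.gauss(0.0, 1.0) for _ in range(n)]
rho = sample_acf(white_noise, 20)

# rho_hat(0) is exactly 1, and |rho_hat(h)| <= 1 because the divisor-n
# estimator is nonnegative definite (Cauchy-Schwarz).
assert abs(rho[0] - 1.0) < 1e-12
assert all(abs(r) <= 1.0 + 1e-12 for r in rho)

# For white noise, roughly 95% of lags h >= 1 should fall inside +/- 2/sqrt(n).
band = 2.0 / n ** 0.5
inside = sum(1 for r in rho[1:] if abs(r) < band)
print(inside, "of", len(rho) - 1, "lags inside the white-noise bands")
```

Running the same functions on a trended series such as $x_t = t + z_t$ would instead show the slow decay of $\hat{\rho}(h)$ described in Remark 2.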
More informationStochastic Processes. M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno
Stochastic Processes M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno 1 Outline Stochastic (random) processes. Autocorrelation. Crosscorrelation. Spectral density function.
More informationLecture 4a: ARMA Model
Lecture 4a: ARMA Model 1 2 Big Picture Most often our goal is to find a statistical model to describe real time series (estimation), and then predict the future (forecasting) One particularly popular model
More informationStationary Stochastic Time Series Models
Stationary Stochastic Time Series Models When modeling time series it is useful to regard an observed time series, (x 1,x,..., x n ), as the realisation of a stochastic process. In general a stochastic
More informationCommunication Theory II
Communication Theory II Lecture 8: Stochastic Processes Ahmed Elnakib, PhD Assistant Professor, Mansoura University, Egypt March 5 th, 2015 1 o Stochastic processes What is a stochastic process? Types:
More informationECON3327: Financial Econometrics, Spring 2016
ECON3327: Financial Econometrics, Spring 2016 Wooldridge, Introductory Econometrics (5th ed, 2012) Chapter 11: OLS with time series data Stationary and weakly dependent time series The notion of a stationary
More informationChapter 3: Regression Methods for Trends
Chapter 3: Regression Methods for Trends Time series exhibiting trends over time have a mean function that is some simple function (not necessarily constant) of time. The example random walk graph from
More informationClassic Time Series Analysis
Classic Time Series Analysis Concepts and Definitions Let Y be a random number with PDF f Y t ~f,t Define t =E[Y t ] m(t) is known as the trend Define the autocovariance t, s =COV [Y t,y s ] =E[ Y t t
More informationNonlinear Time Series Modeling
Nonlinear Time Series Modeling Part II: Time Series Models in Finance Richard A. Davis Colorado State University (http://www.stat.colostate.edu/~rdavis/lectures) MaPhySto Workshop Copenhagen September
More informationTime Series Examples Sheet
Lent Term 2001 Richard Weber Time Series Examples Sheet This is the examples sheet for the M. Phil. course in Time Series. A copy can be found at: http://www.statslab.cam.ac.uk/~rrw1/timeseries/ Throughout,
More informationStatistical signal processing
Statistical signal processing Short overview of the fundamentals Outline Random variables Random processes Stationarity Ergodicity Spectral analysis Random variable and processes Intuition: A random variable
More information