5 Transfer function modelling
MSc Further Time Series Analysis

5.1 The model

Consider the construction of a model for a time series $(Y_t)$ whose values are influenced by the earlier values of a series $(X_t)$. Thus the process $(Y_t)$ is dynamically related to the process $(X_t)$. We may think of $(X_t)$ as the input to a system and of $(Y_t)$ as the output, or of $(X_t)$ as a process of explanatory variables and of $(Y_t)$ as a process of dependent variables. In general, we shall model $(Y_t)$ as a linearly filtered version of $(X_t)$, where the filter used is one-sided, i.e., causal, and also includes a constant term $\mu$. Thus, at the heart of the model, we have a relationship of the form

$$Y_t = \mu + \sum_{j=0}^{\infty} v_j X_{t-j},$$

which represents the systematic dynamics of the model. The effect of the input process takes time to work through to the output process. Furthermore, there may be a time-delay before the input starts to influence the output, i.e., a positive integer $k$ such that $v_j = 0$ for $0 \le j \le k-1$. (In comparison with Section 3 on linear filters, note the reversal of the roles of the processes $(X_t)$ and $(Y_t)$ as input and output.)

In practice, this model needs to be developed further, because the relationship between $(X_t)$ and $(Y_t)$ will not be exact but will be subject to disturbance. We introduce a disturbance or noise term $U_t$ to arrive at the following equation for what is known as the transfer function model or distributed lag model:

$$Y_t = \mu + \sum_{j=0}^{\infty} v_j X_{t-j} + U_t \qquad (-\infty < t < \infty). \tag{1}$$

The disturbance process $(U_t)$ in Equation (1) is unobservable and is not necessarily a white noise process. We make the following two assumptions:

1. $(U_t)$ is a zero-mean stationary process.
2. $(U_t)$ is uncorrelated with the input process $(X_t)$.

If we further assume that $(X_t)$ is a stationary process, then it follows that the output process $(Y_t)$ is also a stationary process.
If the data being modelled are non-stationary then they may be differenced to reduce them to stationarity before fitting the model of Equation (1). In such cases it may well be appropriate, as is the case for univariate models, to assume that the differenced data, both input and output, have zero mean, which implies that the constant $\mu$ in the model Equation (1) is taken to be zero.
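As a numerical sketch of the model of Equation (1), the following Python fragment (all parameter values hypothetical) simulates an output series from a stationary input, a causal filter with a time-delay, and an added zero-mean disturbance:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical filter weights: a time-delay of k = 2 means v_0 = v_1 = 0.
v = np.array([0.0, 0.0, 1.5, 0.8, 0.3])
mu, n = 10.0, 400

x = rng.normal(size=n)            # stationary input process (here white noise)
u = 0.5 * rng.normal(size=n)      # zero-mean disturbance, uncorrelated with (x_t)

# Equation (1): y_t = mu + sum_j v_j x_{t-j} + u_t, for t where all lags exist
y = np.full(n, np.nan)
for t in range(len(v) - 1, n):
    y[t] = mu + sum(v[j] * x[t - j] for j in range(len(v))) + u[t]
```

Because the input and disturbance both have zero mean here, the simulated output fluctuates around $\mu$.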
Example. Consider a manufacturer who decides on the amount of a certain product that he will start to produce at time $t$, basing his decision upon the predicted selling price. Suppose that it takes $k$ time periods from start to completion of the product. Let $Y_t$ denote the quantity of the product ready for supply at time $t$ and $X_t$ the market price at time $t$. If the manufacturer uses simple exponential smoothing, then the predicted price $\hat{x}_t$ any number of steps ahead at time $t$ may be written in the form

$$\hat{x}_t = (1-\alpha) \sum_{j=0}^{\infty} \alpha^j X_{t-j}.$$

Assuming a simple linear relationship between planned production and predicted price, together with the added disturbance term, we may write

$$Y_{t+k} = \mu + \delta \hat{x}_t + U_{t+k} = \mu + \delta(1-\alpha) \sum_{j=0}^{\infty} \alpha^j X_{t-j} + U_{t+k}.$$

Equivalently, writing $\delta(1-\alpha) = \beta$,

$$Y_t = \mu + \beta \sum_{j=0}^{\infty} \alpha^j X_{t-k-j} + U_t = \mu + \beta \sum_{j=k}^{\infty} \alpha^{j-k} X_{t-j} + U_t. \tag{2}$$

Note the presence of a time-delay $k$. Alternatively, using the lag operator, we may write

$$Y_t = \mu + \beta L^k \sum_{j=0}^{\infty} (\alpha L)^j X_t + U_t = \mu + \frac{\beta L^k}{1-\alpha L} X_t + U_t. \tag{3}$$

The processes $(X_t)$ and $(Y_t)$ as described will not be stationary in general, but they may be differenced to transform them to stationarity. The transformed series will still satisfy essentially the same Equations (2) and (3), with $\mu = 0$.

Returning to the general case, we assume that the disturbance process $(U_t)$ is an ARMA process with infinite moving average representation $U_t = \psi(L)\epsilon_t$, where $(\epsilon_t)$ is a white noise process with variance $\sigma^2$ and uncorrelated with the input process $(X_t)$. Equation (1) may then be written as

$$Y_t = \mu + v(L)X_t + \psi(L)\epsilon_t, \tag{4}$$

where $v(z)$ is the generating function of the coefficients of the filter. In the present setting $v(z)$ is also referred to as the transfer function of the filter. The first two terms on the
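The equivalence of the direct form (2) and the lag-operator form (3) can be checked numerically. This sketch (with illustrative values of $\alpha$, $\beta$ and $k$) computes $\beta L^k/(1-\alpha L)\,X_t$ by the recursion $s_t = \alpha s_{t-1} + \beta X_{t-k}$ and compares it with the explicit geometric sum:

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, beta, k, n = 0.6, 2.0, 3, 300   # hypothetical parameter values
x = rng.normal(size=n)

# Recursive form implied by Equation (3): (1 - alpha L) s_t = beta L^k x_t
s = np.zeros(n)
for t in range(k, n):
    s[t] = alpha * s[t - 1] + beta * x[t - k]

# Direct form of Equation (2): beta * sum_{j=k}^{t} alpha^(j-k) x_{t-j}
t = n - 1
direct = beta * sum(alpha ** (j - k) * x[t - j] for j in range(k, t + 1))
```

Unrolling the recursion reproduces the geometric sum term by term, so the two values agree to machine precision.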
right hand side of Equation (4) represent the systematic dynamics of the model, and the third term the disturbance dynamics. We may write the ARMA model for $(U_t)$ more explicitly in the form $\phi(L)U_t = \theta(L)\epsilon_t$. Thus $\phi(z)$ is the autoregressive characteristic polynomial and $\theta(z)$ is the moving average characteristic polynomial for the disturbance process.

We assume that the transfer function $v(z)$ may be expressed as a rational function, a ratio of polynomials,

$$v(z) = \frac{\omega(z)z^k}{\delta(z)}, \tag{5}$$

where $k$ is the time-delay, the denominator (autoregressive) polynomial $\delta$ is given by

$$\delta(z) = 1 - \delta_1 z - \delta_2 z^2 - \cdots - \delta_p z^p,$$

for some $p$, and the numerator (moving average) polynomial $\omega$ by

$$\omega(z) = \omega_0 - \omega_1 z - \omega_2 z^2 - \cdots - \omega_q z^q,$$

for some $q$. The corresponding recursive filter is $\delta(L)Y_t = \omega(L)L^k X_t$. Equation (4) becomes

$$Y_t = \mu + \frac{\omega(L)L^k}{\delta(L)} X_t + \frac{\theta(L)}{\phi(L)} \epsilon_t. \tag{6}$$

The polynomial $\omega(z)$ has $\omega_0 \ne 1$ in general, because a multiplicative constant has been absorbed into it. The minus sign in front of the subsequent $\omega_i$ reflects the SAS usage. To have a well-defined model, we assume that all the roots of the characteristic equations $\delta(z) = 0$ and $\phi(z) = 0$ lie outside the unit circle in the complex plane.

The model of Equation (1)/(4)/(6) may be rewritten as

$$\delta(L)Y_t = \mu^* + \omega(L)L^k X_t + U_t^*,$$

where $\mu^* = \delta(1)\mu$ and $U_t^* = \delta(L)U_t$, to exhibit explicitly a recursive, autoregressive aspect of the model for $(Y_t)$.

Assuming that the processes $(X_t)$ and $(Y_t)$ are stationary, let $\mu_X$ and $\mu_Y$ denote their respective means. Taking expectations in Equation (1),

$$\mu_Y = \mu + \Big(\sum_{j=0}^{\infty} v_j\Big)\mu_X. \tag{7}$$

The quantity $\sum_j v_j$ is sometimes referred to as the total multiplier: the change in $\mu_Y$ per unit change in $\mu_X$, i.e., the long-term effect on $(Y_t)$ of a unit change in $\mu_X$. Note that the total multiplier may also be written as $v(1) = \omega(1)/\delta(1)$.
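The rational form (5) can be expanded back into the weights $v_j$ by passing a unit impulse through the recursive filter $\delta(L)Y_t = \omega(L)L^k X_t$; the resulting weights also let us check the total-multiplier identity $\sum_j v_j = \omega(1)/\delta(1)$. A sketch with hypothetical coefficients ($\omega_0 = 0.8$, $\omega_1 = 0.3$, $\delta_1 = 0.5$, $k = 2$):

```python
import numpy as np

# Hypothetical example: omega(z) = 0.8 - 0.3 z, delta(z) = 1 - 0.5 z, delay k = 2
omega = [0.8, -0.3]     # coefficients of z^0, z^1 (SAS sign convention: omega_1 = 0.3)
delta1, k = 0.5, 2

# Impulse response: unit impulse at t = 0 through delta(L) y_t = omega(L) L^k x_t
n = 60
x = np.zeros(n)
x[0] = 1.0
v = np.zeros(n)
for t in range(n):
    ma = sum(omega[i] * x[t - k - i] for i in range(len(omega)) if t - k - i >= 0)
    prev = v[t - 1] if t >= 1 else 0.0
    v[t] = delta1 * prev + ma

# Total multiplier: sum_j v_j = v(1) = omega(1) / delta(1)
total_multiplier = v.sum()
```

Here $v_2 = \omega_0 = 0.8$, $v_3 = \delta_1\omega_0 - \omega_1 = 0.1$, and thereafter the weights decay geometrically at rate $\delta_1$.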
5.2 The cross-correlation function and model identification

Taking $(X_t)$ to be stationary, we assume temporarily that $(X_t)$ has zero mean, which does not alter the second-order moments of the model but simplifies the notation in deriving the results of this sub-section. Recall that in Equation (1) $(U_t)$ is also zero-mean stationary and that $(X_t)$ and $(U_t)$ are uncorrelated with each other. For any $\tau$, multiplying through by $X_{t-\tau}$ and taking expectations, we obtain

$$\gamma^{21}_{\tau} = \sum_{j=0}^{\infty} v_j \gamma^{11}_{\tau-j}, \tag{8}$$

where $\gamma^{21}_{\tau}$ is the cross-covariance, $\gamma^{21}_{\tau} = E(Y_t X_{t-\tau})$, and $\gamma^{11}_{\tau}$ is the autocovariance, $\gamma^{11}_{\tau} = E(X_t X_{t-\tau})$. Equation (8) simplifies in the special case when $(X_t)$ is a white noise process, in which case $\gamma^{21}_{\tau} = v_{\tau}\gamma^{11}_0$, which we may rewrite as

$$v_{\tau} = \rho^{21}_{\tau} \sqrt{\frac{\gamma^{22}_0}{\gamma^{11}_0}} \propto \rho^{21}_{\tau}. \tag{9}$$

The result of Equation (9) shows that, if the input process is white noise, we have a straightforward method of estimating the transfer function from the sample cross-correlation function and the sample variances of the input and output processes:

$$\hat{v}_{\tau} = r^{21}_{\tau} \sqrt{\frac{c^{22}_0}{c^{11}_0}} \propto r^{21}_{\tau}. \tag{10}$$

Now, in general, $(X_t)$ is not a white noise process but is some other ARMA process. Hence there exists some filter with generating function $w(z)$ such that $w(L)X_t = \eta_t$, where $(\eta_t)$ is a white noise process, uncorrelated with $(U_t)$. Applying this filter to Equation (1), we obtain

$$Y^*_t = \mu^* + \sum_{j=0}^{\infty} v_j \eta_{t-j} + U^*_t, \tag{11}$$

where $Y^*_t = w(L)Y_t$, $\mu^* = w(1)\mu$ and $U^*_t = w(L)U_t$. Equation (11) is similar in form to the model Equation (1), with the same filter $(v_j)$, but with the input process white noise. Hence the results of Equations (9) and (10) may be applied to the filtered processes $(Y^*_t)$ and $(\eta_t)$ to estimate $(v_j)$.

In attempting to find an appropriate transfer function model, a standard approach involves first filtering the input and output processes, using the same filter for both, so as to convert the input process to white noise. Such a procedure is commonly known as pre-whitening.
It presupposes that the processes $(X_t)$ and $(Y_t)$ have, if necessary, already been transformed to stationarity by differencing. By examining the cross-correlation function of the pre-whitened process data, we may be able to identify a suitable transfer function model.
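The estimator of Equation (10) can be sketched numerically. In the fragment below the input is already white noise, so no pre-whitening filter is needed (in the general case one would first apply $w(L)$ to both series); all weights and sample sizes are hypothetical. Since $\hat{v}_\tau = r^{21}_\tau\sqrt{c^{22}_0/c^{11}_0} = c^{21}_\tau/c^{11}_0$, the weights are estimated directly from sample cross-covariances:

```python
import numpy as np

rng = np.random.default_rng(2)
v_true = np.array([0.0, 0.0, 2.0, 1.0, 0.5])   # hypothetical weights, delay k = 2
n = 20000

x = rng.normal(size=n)                          # white-noise input
u = 0.3 * rng.normal(size=n)                    # disturbance, uncorrelated with x
y = np.convolve(x, v_true)[:n] + u              # Equation (1) with mu = 0

def cross_cov(y, x, tau):
    """Sample version of gamma^{21}_tau = E(Y_t X_{t-tau})."""
    return float(np.mean(y[tau:] * x[: n - tau])) if tau > 0 else float(np.mean(y * x))

c11_0 = float(np.var(x))
v_hat = np.array([cross_cov(y, x, tau) / c11_0 for tau in range(len(v_true))])
```

The estimated weights reproduce the delay ($\hat{v}_0 \approx \hat{v}_1 \approx 0$) and the nonzero weights to within sampling error.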
5.3 Example: sales

Using the example introduced in Section 4.2, recall that it is the first differences of the series ind and sales that appear to be stationary. The main reason for investigating the series would appear to be to predict sales from the leading indicator. Hence we shall attempt to construct a transfer function model with the first differences of sales as the output $(Y_t)$ and the first differences of the leading indicator as the input $(X_t)$. The following SAS program fits an appropriate model, chosen from among models of the general form of Equation (6).

proc arima data=indsales;
   identify var=ind(1) nlag=20;
   estimate q=1 noint;
   identify var=sales(1) crosscor=ind(1) nlag=20;
   estimate q=1 input=(3 $ / (1) ind) noint;
   forecast lead=5 out=results;
run;

In fact, this program would have to be developed iteratively, step by step. The first identify statement specifies what is going to be the input variable, here the first differences of the variable ind, and produces the autocorrelation and other functions. The autocorrelation function output on page 7 suggests that we should fit an MA(1) model to the differences, i.e., an ARIMA(0,1,1) model to ind. The first estimate statement fits this model, and the noint option indicates that a zero mean is being assumed. The output on page 8 exhibits the fitted model, and the p-values of the portmanteau statistics show that the model fits well. The fitted model for the input process is used

1. for pre-whitening the input and output variables before calculation of their cross-correlation function, and
2. for calculating forecast values of the input variable, which are in turn used in the calculation of forecast values of the output variable.
After the input process has been modelled, the second identify statement, with the crosscor option, produces (i) the autocorrelation and other functions for what is going to be the output variable, the first differences of sales, and (ii) the cross-correlation function of the first differences of sales and ind, automatically pre-whitened using the model fitted to ind by the previous estimate statement. Examination of the cross-correlation function on page 10 indicates that there is a time-delay of 3 units. Thereafter, the ccf appears to die away geometrically. This suggests a transfer function of the form

$$v(z) = \frac{\omega_0 z^3}{1 - \delta_1 z}.$$

We are fortunate in having a clear-cut structure to the cross-correlation function here! Note that this cross-correlation function differs from the one in Section 4.2, where the variables had not been pre-whitened.
As part of the process of model identification, we also have to specify a model for the disturbance process $(U_t)$. We might just try fitting the transfer function model, assuming a few simple models for the disturbance process, to find the simplest one that works. Another, more systematic approach involves first finding a rough estimate $\hat{v}$ of the transfer function, using Equation (10). An estimated disturbance process is then given by

$$\hat{u}_t = y_t - \hat{v}(L)x_t.$$

At least in simple cases, we may readily calculate the values of this estimated disturbance process and fit an ARMA model to them. In the present case, using Equation (10), first estimates of the parameters of our proposed form of transfer function are given by $\hat{\omega}_0 = \hat{v}_3 = 4.71$ and $\hat{\delta}_1 = \hat{v}_4/\hat{v}_3 = r^{21}_4/r^{21}_3$.

From analysis of the estimated disturbance process, or by some trial and error, it turns out that an MA(1) model for $(U_t)$ is appropriate here. The second estimate statement in the SAS program fits our chosen transfer function model to the data, with the first difference of sales as the output variable, as specified in the previous identify statement. The q=1 option specifies that the disturbance process is to be modelled as an MA(1) process. The input option is used to specify the input variable and the form of the rational transfer function of Equation (5). In general, the input option takes the form

input = ( k $ ( numerator lags ) / ( denominator lags ) x )

where k is the time-delay (Shift in the terminology of the SAS output), numerator lags specifies the numerator polynomial, denominator lags specifies the denominator polynomial, and x represents the input variable. In the present case, the time-delay is 3. The term / (1) after the dollar sign specifies that the numerator polynomial is a constant and that the denominator polynomial is of order 1, i.e., $1 - \delta_1 z$. The input variable will be the first difference of ind.
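For the proposed form $v(z) = \omega_0 z^3/(1 - \delta_1 z)$, the weights satisfy $v_3 = \omega_0$ and $v_{j+1} = \delta_1 v_j$ for $j \ge 3$, which is what justifies the rough estimates $\hat{\omega}_0 = \hat{v}_3$ and $\hat{\delta}_1 = \hat{v}_4/\hat{v}_3$. A small sketch, taking $\omega_0 = 4.71$ from the text and an illustrative (hypothetical) $\delta_1 = 0.6$:

```python
# Weights of v(z) = omega_0 z^3 / (1 - delta_1 z): geometric decay after the delay
omega0, delta1 = 4.71, 0.6      # omega_0 from the text; delta_1 is illustrative
v = [0.0, 0.0, 0.0] + [omega0 * delta1 ** (j - 3) for j in range(3, 10)]

# Initial estimates in the style of Section 5.3
omega0_hat = v[3]               # first nonzero weight
delta1_hat = v[4] / v[3]        # ratio of successive weights
```

Both parameters are recovered exactly because the weights follow the geometric pattern by construction; with sample weights they would only be rough starting values.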
The noint option specifies that there is to be no constant term in the transfer function model. The SAS output on pages 11 and 12 gives the fitted model in the form

$$Y_t = \frac{\hat{\omega}_0}{1 - \hat{\delta}_1 L} X_{t-3} + \epsilon_t - \hat{\theta}_1 \epsilon_{t-1}.$$

Apart from one value that is significant at the 5% level, the p-values of the diagnostic statistics on page 11 indicate that the fitted model is satisfactory, in that

1. the autocorrelations of the residuals are consistent with being from a white noise process, and
2. the cross-correlations are consistent with the residuals, and hence the disturbance process, being uncorrelated with the input process.

Finally, the forecast statement in the SAS program produces forecasts of sales for the next five time-points. We might note that the forecast values are fairly similar to the ones obtained at the end of Section 4 using the VAR(5) model, and would be even more so if constant terms had not been included in the VAR(5) model.
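A simplified sketch of why the delay helps forecasting: writing the systematic part as $s_t = \delta_1 s_{t-1} + \omega_0 x_{t-3}$, forecasts of $Y$ up to 3 steps ahead need no forecast of $X$ at all, since only input values up to time $n$ enter. The parameter values below are hypothetical, and the disturbance forecast is set to zero (exact for lead times $\ge 2$; the one-step forecast would also carry a $-\hat{\theta}_1\hat{\epsilon}_n$ term, omitted here for simplicity):

```python
import numpy as np

rng = np.random.default_rng(3)
omega0, delta1 = 4.7, 0.7       # hypothetical fitted transfer function values
n = 200
x = rng.normal(size=n)          # observed (differenced) input up to time n

# Systematic part: s_t = delta1 * s_{t-1} + omega0 * x_{t-3}
s = np.zeros(n + 3)
for t in range(3, n + 3):
    s[t] = delta1 * s[t - 1] + omega0 * x[t - 3]

# Thanks to the delay of 3, s_{n+1}, s_{n+2}, s_{n+3} use only observed x values,
# so they serve as forecasts of y at lead times 1 to 3 (disturbance forecast zero).
forecasts = s[n : n + 3]
```

Beyond lead time 3, forecast values of the input (from its own ARIMA model, as SAS does) would have to be substituted for the unobserved $x$ values.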
[SAS output, pages 7–12; the numerical values did not survive transcription, so only the structure is summarised here.

Page 7: "The ARIMA Procedure" — variable ind, period of differencing 1, 149 observations (1 eliminated by differencing); autocorrelation plot of the differenced series, with "." marking two standard errors.

Page 8: conditional least squares estimate of the MA(1) parameter for ind (p < 0.05); variance estimate, AIC, SBC, 149 residuals (AIC and SBC do not include the log determinant); autocorrelation check of residuals; model summary for ind: no mean term, moving average factor in B**(1).

Page 9: variable sales, period of differencing 1, 149 observations (1 eliminated by differencing); autocorrelation plot of the differenced series; correlation of sales and ind, with both series pre-whitened.

Page 10: cross-correlation plot of the pre-whitened series, with "." marking two standard errors; both variables pre-whitened by the MA(1) filter fitted to ind.

Page 11: conditional least squares estimates for the full transfer function model — MA1,1 for sales (shift 0), NUM1 and DEN1,1 for ind (shift 3), all significant; 145 residuals; correlations of parameter estimates; autocorrelation check of residuals; cross-correlation check of residuals with input ind.

Page 12: model summary for sales — period of differencing 1, no mean term, moving average factor in B**(1); input ind with shift 3, period of differencing 1, overall regression factor and denominator factor in B**(1); forecasts for sales with standard errors and 95% confidence limits.]
More informationAdvanced Econometrics
Based on the textbook by Verbeek: A Guide to Modern Econometrics Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna May 2, 2013 Outline Univariate
More informationLecture 4a: ARMA Model
Lecture 4a: ARMA Model 1 2 Big Picture Most often our goal is to find a statistical model to describe real time series (estimation), and then predict the future (forecasting) One particularly popular model
More informationFORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL
FORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL B. N. MANDAL Abstract: Yearly sugarcane production data for the period of - to - of India were analyzed by time-series methods. Autocorrelation
More informationCovariances of ARMA Processes
Statistics 910, #10 1 Overview Covariances of ARMA Processes 1. Review ARMA models: causality and invertibility 2. AR covariance functions 3. MA and ARMA covariance functions 4. Partial autocorrelation
More informationFor a stochastic process {Y t : t = 0, ±1, ±2, ±3, }, the mean function is defined by (2.2.1) ± 2..., γ t,
CHAPTER 2 FUNDAMENTAL CONCEPTS This chapter describes the fundamental concepts in the theory of time series models. In particular, we introduce the concepts of stochastic processes, mean and covariance
More informationWhite Noise Processes (Section 6.2)
White Noise Processes (Section 6.) Recall that covariance stationary processes are time series, y t, such. E(y t ) = µ for all t. Var(y t ) = σ for all t, σ < 3. Cov(y t,y t-τ ) = γ(τ) for all t and τ
More informationProblem Set 2 Solution Sketches Time Series Analysis Spring 2010
Problem Set 2 Solution Sketches Time Series Analysis Spring 2010 Forecasting 1. Let X and Y be two random variables such that E(X 2 ) < and E(Y 2 )
More information6 NONSEASONAL BOX-JENKINS MODELS
6 NONSEASONAL BOX-JENKINS MODELS In this section, we will discuss a class of models for describing time series commonly referred to as Box-Jenkins models. There are two types of Box-Jenkins models, seasonal
More informationChapter 12: An introduction to Time Series Analysis. Chapter 12: An introduction to Time Series Analysis
Chapter 12: An introduction to Time Series Analysis Introduction In this chapter, we will discuss forecasting with single-series (univariate) Box-Jenkins models. The common name of the models is Auto-Regressive
More information10) Time series econometrics
30C00200 Econometrics 10) Time series econometrics Timo Kuosmanen Professor, Ph.D. 1 Topics today Static vs. dynamic time series model Suprious regression Stationary and nonstationary time series Unit
More informationStatistics of stochastic processes
Introduction Statistics of stochastic processes Generally statistics is performed on observations y 1,..., y n assumed to be realizations of independent random variables Y 1,..., Y n. 14 settembre 2014
More informationarxiv: v1 [stat.me] 5 Nov 2008
arxiv:0811.0659v1 [stat.me] 5 Nov 2008 Estimation of missing data by using the filtering process in a time series modeling Ahmad Mahir R. and Al-khazaleh A. M. H. School of Mathematical Sciences Faculty
More informationTime Series Examples Sheet
Lent Term 2001 Richard Weber Time Series Examples Sheet This is the examples sheet for the M. Phil. course in Time Series. A copy can be found at: http://www.statslab.cam.ac.uk/~rrw1/timeseries/ Throughout,
More informationLINEAR STOCHASTIC MODELS
LINEAR STOCHASTIC MODELS Let {x τ+1,x τ+2,...,x τ+n } denote n consecutive elements from a stochastic process. If their joint distribution does not depend on τ, regardless of the size of n, then the process
More information1. Fundamental concepts
. Fundamental concepts A time series is a sequence of data points, measured typically at successive times spaced at uniform intervals. Time series are used in such fields as statistics, signal processing
More informationIDENTIFICATION OF ARMA MODELS
IDENTIFICATION OF ARMA MODELS A stationary stochastic process can be characterised, equivalently, by its autocovariance function or its partial autocovariance function. It can also be characterised by
More informationLecture 1: Fundamental concepts in Time Series Analysis (part 2)
Lecture 1: Fundamental concepts in Time Series Analysis (part 2) Florian Pelgrin University of Lausanne, École des HEC Department of mathematics (IMEA-Nice) Sept. 2011 - Jan. 2012 Florian Pelgrin (HEC)
More informationCHAPTER 8 MODEL DIAGNOSTICS. 8.1 Residual Analysis
CHAPTER 8 MODEL DIAGNOSTICS We have now discussed methods for specifying models and for efficiently estimating the parameters in those models. Model diagnostics, or model criticism, is concerned with testing
More informationMODELING INFLATION RATES IN NIGERIA: BOX-JENKINS APPROACH. I. U. Moffat and A. E. David Department of Mathematics & Statistics, University of Uyo, Uyo
Vol.4, No.2, pp.2-27, April 216 MODELING INFLATION RATES IN NIGERIA: BOX-JENKINS APPROACH I. U. Moffat and A. E. David Department of Mathematics & Statistics, University of Uyo, Uyo ABSTRACT: This study
More information3 Theory of stationary random processes
3 Theory of stationary random processes 3.1 Linear filters and the General linear process A filter is a transformation of one random sequence {U t } into another, {Y t }. A linear filter is a transformation
More informationTHE ROYAL STATISTICAL SOCIETY 2009 EXAMINATIONS SOLUTIONS GRADUATE DIPLOMA MODULAR FORMAT MODULE 3 STOCHASTIC PROCESSES AND TIME SERIES
THE ROYAL STATISTICAL SOCIETY 9 EXAMINATIONS SOLUTIONS GRADUATE DIPLOMA MODULAR FORMAT MODULE 3 STOCHASTIC PROCESSES AND TIME SERIES The Society provides these solutions to assist candidates preparing
More informationTime Series Examples Sheet
Lent Term 2001 Richard Weber Time Series Examples Sheet This is the examples sheet for the M. Phil. course in Time Series. A copy can be found at: http://www.statslab.cam.ac.uk/~rrw1/timeseries/ Throughout,
More informationAPPLIED ECONOMETRIC TIME SERIES 4TH EDITION
APPLIED ECONOMETRIC TIME SERIES 4TH EDITION Chapter 2: STATIONARY TIME-SERIES MODELS WALTER ENDERS, UNIVERSITY OF ALABAMA Copyright 2015 John Wiley & Sons, Inc. Section 1 STOCHASTIC DIFFERENCE EQUATION
More informationDiscrete time processes
Discrete time processes Predictions are difficult. Especially about the future Mark Twain. Florian Herzog 2013 Modeling observed data When we model observed (realized) data, we encounter usually the following
More informationTime Series: Theory and Methods
Peter J. Brockwell Richard A. Davis Time Series: Theory and Methods Second Edition With 124 Illustrations Springer Contents Preface to the Second Edition Preface to the First Edition vn ix CHAPTER 1 Stationary
More informationForecasting with ARMA Models
LECTURE 4 Forecasting with ARMA Models Minumum Mean-Square Error Prediction Imagine that y(t) is a stationary stochastic process with E{y(t)} = 0. We may be interested in predicting values of this process
More informationStatistical Methods for Forecasting
Statistical Methods for Forecasting BOVAS ABRAHAM University of Waterloo JOHANNES LEDOLTER University of Iowa John Wiley & Sons New York Chichester Brisbane Toronto Singapore Contents 1 INTRODUCTION AND
More information11. Further Issues in Using OLS with TS Data
11. Further Issues in Using OLS with TS Data With TS, including lags of the dependent variable often allow us to fit much better the variation in y Exact distribution theory is rarely available in TS applications,
More informationExercises - Time series analysis
Descriptive analysis of a time series (1) Estimate the trend of the series of gasoline consumption in Spain using a straight line in the period from 1945 to 1995 and generate forecasts for 24 months. Compare
More information7. Forecasting with ARIMA models
7. Forecasting with ARIMA models 309 Outline: Introduction The prediction equation of an ARIMA model Interpreting the predictions Variance of the predictions Forecast updating Measuring predictability
More informationEconometría 2: Análisis de series de Tiempo
Econometría 2: Análisis de series de Tiempo Karoll GOMEZ kgomezp@unal.edu.co http://karollgomez.wordpress.com Segundo semestre 2016 IX. Vector Time Series Models VARMA Models A. 1. Motivation: The vector
More informationARIMA Modelling and Forecasting
ARIMA Modelling and Forecasting Economic time series often appear nonstationary, because of trends, seasonal patterns, cycles, etc. However, the differences may appear stationary. Δx t x t x t 1 (first
More informationNon-Stationary Time Series and Unit Root Testing
Econometrics II Non-Stationary Time Series and Unit Root Testing Morten Nyboe Tabor Course Outline: Non-Stationary Time Series and Unit Root Testing 1 Stationarity and Deviation from Stationarity Trend-Stationarity
More informationCircle the single best answer for each multiple choice question. Your choice should be made clearly.
TEST #1 STA 4853 March 6, 2017 Name: Please read the following directions. DO NOT TURN THE PAGE UNTIL INSTRUCTED TO DO SO Directions This exam is closed book and closed notes. There are 32 multiple choice
More information388 Index Differencing test ,232 Distributed lags , 147 arithmetic lag.
INDEX Aggregation... 104 Almon lag... 135-140,149 AR(1) process... 114-130,240,246,324-325,366,370,374 ARCH... 376-379 ARlMA... 365 Asymptotically unbiased... 13,50 Autocorrelation... 113-130, 142-150,324-325,365-369
More informationTMA4285 December 2015 Time series models, solution.
Norwegian University of Science and Technology Department of Mathematical Sciences Page of 5 TMA4285 December 205 Time series models, solution. Problem a) (i) The slow decay of the ACF of z t suggest that
More information