Problem Set 2 Solution Sketches Time Series Analysis Spring 2010
Forecasting

1. Let X and Y be two random variables such that E(X²) < ∞ and E(Y²) < ∞.

a) Compute Ê(Y|X) (the linear projection of Y on X and a constant).

Ê(Y|X) = α_0 + α_1 X, where α_1 = cov(X, Y)/var(X) and α_0 = E(Y) − α_1 E(X). This is the population equivalent of a simple linear regression.

b) Give a concrete example where E(Y|X) ≠ Ê(Y|X).

There was an example in the last homework: if X ~ N(0, 1) and Y = X², then E(Y|X) = X², but Ê(Y|X) = 1, since α_1 = cov(X, X²)/var(X) = E(X³) = 0 and α_0 = E(X²) = 1. (The difference between the conditional mean and the linear projection is the reason we distinguish between martingale difference and white noise sequences.)

2. Let Y_t be an MA(∞) process: Y_t = µ + ε_t + ψ_1 ε_{t−1} + ψ_2 ε_{t−2} + ..., where ε_t ~ w.n.(0, σ²) and Σ_i ψ_i² =: S < ∞.

a) Let information at time t be given by I_t = {ε_t, ε_{t−1}, ε_{t−2}, ...}. Derive the optimal linear forecast of Y_{t+h}, h > 0, based on I_t under mean square loss. (Hint: follow the naive procedure and prove that it gives you the linear projection of Y_{t+h} on I_t.)

Y_{t+h} = µ + ε_{t+h} + ψ_1 ε_{t+h−1} + ... + ψ_{h−1} ε_{t+1} + ψ_h ε_t + ψ_{h+1} ε_{t−1} + ...

Claim: Ŷ_{t+h|t} = Ê(Y_{t+h}|I_t) = µ + ψ_h ε_t + ψ_{h+1} ε_{t−1} + ...

To show that Ŷ_{t+h|t} is indeed the linear projection of Y_{t+h} on I_t, we need to verify

E[(ε_{t+h} + ψ_1 ε_{t+h−1} + ... + ψ_{h−1} ε_{t+1}) ε_j] = 0 for all j ≤ t.

But this is immediate, since ε_t is white noise.
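The example in part b) can be checked by simulation. A minimal Monte Carlo sketch (the sample size and variable names are my own choices) estimates the projection coefficients from the moment formulas above:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.standard_normal(n)   # X ~ N(0, 1)
y = x ** 2                   # so E(Y|X) = X^2

# Linear projection coefficients, estimated by sample moments:
# alpha_1 = cov(X, Y) / var(X), alpha_0 = E(Y) - alpha_1 * E(X)
alpha_1 = np.cov(x, y)[0, 1] / x.var()
alpha_0 = y.mean() - alpha_1 * x.mean()

print(alpha_0, alpha_1)  # approximately 1 and 0
```

The fitted projection is (approximately) the constant 1, while the conditional mean X² is anything but constant.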
b) Show that the forecast error associated with the optimal linear forecast of Y_{t+h} follows an MA(h−1) process.

e_{t,h} = ε_{t+h} + ψ_1 ε_{t+h−1} + ... + ψ_{h−1} ε_{t+1}

For h fixed, this is precisely the definition of an MA(h−1) process.

c) Derive the variance of the forecast error and show that it approaches the unconditional variance of Y_t as h → ∞.

V(Y_t) = V(ε_t + ψ_1 ε_{t−1} + ...) = σ²(1 + ψ_1² + ψ_2² + ...)

V(e_{t,h}) = σ²(1 + ψ_1² + ψ_2² + ... + ψ_{h−1}²)

Clearly V(e_{t,h}) converges to V(Y_t) as h → ∞.

d) Assuming, in addition, that the marginal distribution of ε_t is normal, construct a 95% interval forecast (i.e. a confidence interval for the value of Y_{t+h}).

Using the normality assumption, the point forecast ± 2 standard errors of the forecast error, Ŷ_{t+h|t} ± 2√V(e_{t,h}), is an approximate 95% CI.

3. (Forecasting under alternative loss functions.) One way to solve the general minimization problem

min_f E[L(Y_{t+h}, f(X_t))]   (1)

is to construct a solution pointwise for each possible value of X_t. Thus, to obtain the optimal predictor f*: support(X_t) → R, suppose that the observed value of X_t is x_t and find a number ŷ* that solves the conditional problem

min_ŷ E[L(Y_{t+h}, ŷ) | X_t = x_t] = min_ŷ ∫ L(y, ŷ) dF(y|x_t),

where F(y|x_t) denotes the conditional distribution of Y_{t+h} given X_t = x_t. Then one can set f*(x_t) = ŷ*. By the law of iterated expectations, and the monotonicity of the expectations operator, f* will then solve (1). Suppose Y_t follows an AR(1) process

Y_t = φY_{t−1} + ε_t,
where |φ| < 1 and ε_t ~ i.i.d. unif[−1, 1]. Furthermore, let L(y, ŷ) = a(y − ŷ) if y > ŷ and L(y, ŷ) = b if y ≤ ŷ, where a and b are positive constants.

a) Let X_t = (Y_t, Y_{t−1}, ...). Show rigorously that E(Y_{t+1}|X_t) = φY_t. Then derive the conditional distribution of Y_{t+1} given X_t.

One needs to show that E(ε_{t+1}|Y_t, Y_{t−1}, ...) = 0. Note that Y_{t−j} = Σ_{s=0}^∞ φ^s ε_{t−j−s}, j = 0, 1, 2, ... In other words, Y_{t−j} depends only on shocks dated t − j and earlier. Because of the i.i.d. assumption, ε_{t+1} is independent of (ε_t, ε_{t−1}, ...), and so ε_{t+1} is independent of any function of (ε_t, ε_{t−1}, ...), even a vector-valued one. It follows that ε_{t+1} is independent of (Y_t, Y_{t−1}, ...). Therefore, E(ε_{t+1}|Y_t, Y_{t−1}, ...) = E(ε_{t+1}) = 0. Clearly, the conditional distribution of Y_{t+1} given Y_t is uniform[−1 + φY_t, 1 + φY_t].

b) Write down E[L(Y_{t+1}, ŷ)|X_t] as an integral. Split the integral into two parts depending on whether Y_{t+1} − ŷ is positive or negative. Do the integration.

For ŷ ∈ [−1 + φY_t, 1 + φY_t] we can write

E[L(Y_{t+1}, ŷ)|X_t] = ∫_{ŷ}^{1+φY_t} a(y − ŷ) · (1/2) dy + ∫_{−1+φY_t}^{ŷ} b · (1/2) dy,

where 1/2 is the density of the uniform[−1 + φY_t, 1 + φY_t] distribution.

c) The expression you got in part b) should be a function of ŷ and X_t. Find the minimum w.r.t. ŷ. This is the optimal predictor of Y_{t+1} under L given X_t.

d) Compare this predictor with E(Y_{t+1}|X_t) = φY_t, the optimal predictor under square loss. In particular, is there bias relative to E(Y_{t+1}|X_t)?

Spectral analysis

4. Suppose you observe a time series at regular daily intervals (e.g. at noon every day). It should be intuitively clear that there is nothing you can learn about cycles with periods less than 1 day, i.e. frequencies greater than 2π. It is less clear, but also true, that you can't learn about cycles with periods shorter than 2 days, i.e. frequencies greater than π. (This is why the spectral density is defined for frequencies less than
π only.) To see this formally, let Y_t = cos(ωt), ω > π, t ∈ Z. Show that there exists ω* ∈ [0, π] such that Y_t = cos(ω*t) for all t.

There exist an integer k > 0 and ω̃ ∈ [−π, π] such that ω = ω̃ + 2kπ. Then Y_t = cos(ωt) = cos(2kπt + ω̃t) = cos(ω̃t), where the last equality follows because kt is an integer and the period of the cosine function is 2π. Finally, note that cosine is an even function, so that Y_t = cos(|ω̃|t), where |ω̃| ∈ [0, π]. Set ω* := |ω̃|.

5. Let Y_t ~ MA(1); in particular, let Y_t = θε_{t−1} + ε_t, where |θ| > 1 and ε_t is white noise with variance σ². Show that the process has an invertible representation, i.e. there exists a white noise process η_t such that Y_t = (1/θ)η_{t−1} + η_t. (Hints: Given the form of the proposed representation, use the filtering theorem to derive the spectral density of η_t. Show that it's constant, and argue that the fact that it's constant implies that the process is white noise.)

Define η_t := (1 + (1/θ)L)^{−1} Y_t. We need to show that η_t is white noise. Now, Y_t = (1 + θL)ε_t, so that η_t = (1 + (1/θ)L)^{−1}(1 + θL)ε_t = h(L)ε_t, where h(L) = (1 + (1/θ)L)^{−1}(1 + θL). The spectral density of η_t is then given by

s_η(ω) = h(e^{−iω}) h(e^{iω}) · σ²/(2π) = θ²σ²/(2π).

Hence, η_t has a constant spectral density. We know that white noise sequences have constant spectral densities, but is the reverse implication true? The answer is yes, because there is a one-to-one relationship between the autocovariance sequence of a covariance stationary time series and its spectral density. This is the uniqueness part of the Herglotz theorem.

6. Volatile time series are often smoothed out by some sort of averaging. In particular, let X_t be a covariance stationary time series and let X̄_t denote its smoothed version defined by the m-period centered moving average

X̄_t = (X_{t−m} + ... + X_{t−1} + X_t + X_{t+1} + ... + X_{t+m})/(2m + 1).

a) Let m = 1. Using lag operator notation, write down the linear filter that transforms X_t into X̄_t.
X̄_t = (X_{t−1} + X_t + X_{t+1})/3 = ((1/3)L + 1/3 + (1/3)L^{−1}) X_t;

therefore, the linear filter that transforms X_t into X̄_t is h(L) = (1/3)L + 1/3 + (1/3)L^{−1}.
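As a quick numerical check on this filter (a sketch in Python rather than Matlab), apply it to simulated white noise and evaluate its squared gain at a few frequencies; the zero at ω = 2π/3 and the overall variance reduction show up immediately:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)            # white noise with variance 1

# h(L) = (1/3)L + 1/3 + (1/3)L^{-1}: the centered 3-term moving average
x_bar = (x[:-2] + x[1:-1] + x[2:]) / 3

print(x.var(), x_bar.var())
# for white noise the smoothed variance is about 1/3, since the squared
# gain (1/9)(1 + 2cos w)^2 averages to 1/3 over [-pi, pi]

# squared gain |h(e^{-iw})|^2 at selected frequencies
for w in (0.0, np.pi / 2, 2 * np.pi / 3, np.pi):
    gain = abs(np.exp(-1j * w) / 3 + 1 / 3 + np.exp(1j * w) / 3) ** 2
    print(round(w, 3), round(gain, 4))      # 1 at w=0, ~1/9 at pi/2 and pi, ~0 at 2pi/3
```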
b) Find the transfer function of the filter (i.e. the function by which the spectral density of X_t has to be multiplied to obtain the spectral density of X̄_t).

h(e^{−iω}) h(e^{iω}) = ((1/3)e^{−iω} + 1/3 + (1/3)e^{iω}) · ((1/3)e^{iω} + 1/3 + (1/3)e^{−iω}) = ((1/3)(1 + 2cos(ω)))² = (1/9)(1 + 2cos(ω))²

c) Compare the spectral density of X_t with the spectral density of X̄_t. Calculate the zeros of the transfer function analytically. Then use Matlab or any other program to graph the rest of the filter function. Which frequencies are missing from the spectrum of X̄_t? Which frequencies are dampened? Which frequencies are amplified? Assuming you have monthly observations, calculate the period of the cycles corresponding to these frequencies.

1 + 2cos(ω) = 0, i.e. cos(ω) = −1/2, i.e. ω = 2π/3. (It's enough to find the root that lies in [0, π].) This means that cycles corresponding to the frequency ω = 2π/3 will be missing from the spectral density of X̄_t. For, say, monthly data, this frequency corresponds to a period of 2π/ω = 3 months.

[Figure: the transfer function (1/9)(1 + 2cos(ω))², plotted against frequency for m = 1.]

d) Explain why it is appropriate to describe the effect of averaging as smoothing.

As seen in the figure, the transfer function is less than or equal to 1 for all frequencies in [0, π]. This means that the contribution of all cycles to the variance is reduced (V(X̄_t) ≤ V(X_t)). This is especially true for high-frequency cycles: the variance contribution of cycles with frequencies greater than or equal to π/2 (periods less than or equal to 4 months) gets multiplied by a factor of about 1/9 or less. Therefore, the time series plot of X̄_t should look smoother than that of X_t.

7. This is a real data exercise. The ascii text file clothing.txt contains 105 observations on the monthly inflation rate of the price index of clothing articles in Hungary from 1995:04 to 2003:12.
a) Write a code that calculates the Bartlett estimator of the spectral density evaluated at the frequencies 2πj/T, j = 1, ..., (T−1)/2, for T odd. (See equations [6.3.14] and [6.3.15] in Hamilton.)

b) Plot the estimated spectral density using (i) q = 20 and (ii) q = 40 autocorrelations. Describe briefly the effect of this choice on the estimator. Identify the cycles that seem to contribute the most to the variance of inflation.

You should find a very pronounced peak at about frequency 1 to 1.1, which corresponds to a 6-month cycle. It's pretty clear that this cycle should be related to the change of seasons (summer vs. winter). There is a smaller but still obvious peak at frequency approx. 2.1, which gives a period of 3 months. This gives a shorter seasonal cycle. (Hungary has four pretty distinct seasons: winter, spring, summer and fall.)

On q = 20 vs. q = 40: I thought this choice would make more of a difference, but the two graphs appear quite similar visually. The choice of q in the estimator governs the bias-variance tradeoff present in any nonparametric estimation method. Large q means using more estimated autocorrelations in estimating the spectrum; this will give less bias but more variance (the estimated spectrum will appear very spiky or "noisy"). A small q will give a smoother picture (less variance but more bias). No free lunch: oversmoothing might obscure details, while undersmoothing might introduce spurious details. Try q = 5 vs. q = 50 to really see what I am talking about.

c) Apply the seasonal differencing filter (1 − L^{12}) to the inflation rate. Estimate and plot the spectral density of the transformed sequence for a single choice of q. Interpret the result.

The seasonal differencing filter will eliminate the 6-month and 3-month cycles (see the corresponding calculations in Hamilton).

ARMA modeling

8. This is a real data exercise.
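Stepping back to problem 7a: a sketch of the Bartlett estimator in Python rather than Matlab. The triangular weights 1 − j/(q + 1) below are the standard Bartlett kernel; cross-check the exact form against Hamilton's equations [6.3.14] and [6.3.15] before relying on it.

```python
import numpy as np

def bartlett_spectrum(y, q):
    """Bartlett (triangular-kernel) spectral density estimate at the
    Fourier frequencies 2*pi*j/T, j = 1, ..., (T-1)//2."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    ybar = y.mean()
    # sample autocovariances gamma_0, ..., gamma_q
    gamma = np.array([np.sum((y[j:] - ybar) * (y[:T - j] - ybar)) / T
                      for j in range(q + 1)])
    weights = 1 - np.arange(1, q + 1) / (q + 1)        # Bartlett kernel
    freqs = 2 * np.pi * np.arange(1, (T - 1) // 2 + 1) / T
    cosines = np.cos(np.outer(freqs, np.arange(1, q + 1)))
    s = (gamma[0] + 2 * (weights * gamma[1:] * cosines).sum(axis=1)) / (2 * np.pi)
    return freqs, s

# sanity check on white noise: the estimate should hover around 1/(2*pi)
rng = np.random.default_rng(1)
freqs, s = bartlett_spectrum(rng.standard_normal(105), q=20)
print(len(freqs), s.mean())
```

For the exercise itself one would call bartlett_spectrum(np.loadtxt('clothing.txt'), q) with q = 20 and q = 40 and plot s against freqs.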
The ascii text file arma.txt contains 500 observations from an ARMA(p, q) process, ordered as a column vector (the first entry is the earliest observation). Your job is to identify the underlying model, i.e. find the orders p_0 and q_0 that adequately describe the data. You can work with any software package.
a) Estimate and graph the autocorrelation function along with the approximate 95% confidence bands. Estimate and graph the partial autocorrelation function along with the approximate 95% confidence bands. Make an educated guess about the values of p and q based on these graphs.

Since both graphs show exponential-type decay, the process is likely to be mixed (i.e. ARMA as opposed to pure AR or MA). Though it's hard to be specific about the orders, note that the ACF shows a pretty big drop after the first lag and then the decay is much more gradual afterwards. To me this suggests just one MA term.

b) Implement the Hannan-Rissanen procedure for various (p, q) combinations to see if your initial guess holds up. (Hint: You can use, say, K = 30 in the initial autoregression. I am also willing to let you know that p and q are both less than or equal to 6.) You'll continue to work with the model you identified here on the next problem set.

The true process was ARMA(3,1). If your code is correct, you will indeed find that the minimum BIC occurs at p = 3 and q = 1.

Notes on the ARMA identification problem

Matlab notes

If you are working with Matlab, here are a few hints. An easy way to get the data into Matlab is to put the text file in the Matlab work directory and use the command y = load('arma.txt'). This creates the vector y consisting of the sample observations. Also, there are autocorr and parcorr functions in Matlab, although I don't think they are part of the basic package. (It's pretty easy to write a routine that does the job anyway.)

On the Hannan-Rissanen procedure

Looking at my notes, I noticed that I gave you the form of the Bayesian Information Criterion (BIC) incorrectly (embarrassing once again). By way of compensation, I am giving you step-by-step instructions on the Hannan-Rissanen procedure. It has the correct formula for the BIC.
Suppose Y_t is to be modeled as an ARMA(p, q) process, with 0 ≤ p ≤ p̄ and 0 ≤ q ≤ q̄. In this case one can choose p and q according to the Hannan-Rissanen procedure (Granger and Newbold 1986). Let K > p̄ be an integer and suppose there are q̄ + K + T observations available on Y_t, indexed as t = −q̄ − K + 1, ..., 0, 1, ..., T.

For a large K, use OLS to estimate the autoregression

Y_t = a_0 + a_1 Y_{t−1} + ... + a_K Y_{t−K} + ε_t,  t = −q̄ + 1, ..., T.

Record the residuals ε̂_t = Y_t − â_0 − â_1 Y_{t−1} − ... − â_K Y_{t−K}, t = −q̄ + 1, ..., T.

Use the residual sequence ε̂_t to estimate (by OLS) an ARMA(p, q) model:

Y_t = c + φ_1 Y_{t−1} + ... + φ_p Y_{t−p} + θ_1 ε̂_{t−1} + ... + θ_q ε̂_{t−q} + e_t for t = 1, ..., T.

Calculate the BIC

log(σ̂²_{p,q}) + (p + q + 1) log(T)/T,

where σ̂²_{p,q} is the estimated error variance from the ARMA model above.

Repeat the previous step for different values of (p, q) with 0 ≤ p ≤ p̄ and 0 ≤ q ≤ q̄ until you find the combination that minimizes this criterion.
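The steps above translate directly into code. A sketch in Python rather than Matlab, exercised on a simulated ARMA(2,1) series instead of arma.txt (which is not reproduced here); the helper names are my own:

```python
import numpy as np

def hannan_rissanen(y, K, p_max, q_max):
    """Two-step Hannan-Rissanen order selection.
    Step 1: long AR(K) by OLS to proxy the innovations.
    Step 2: for each (p, q), OLS of y on its own lags and lagged
    residuals; pick (p, q) minimizing log(sigma2) + (p+q+1)*log(T)/T."""
    y = np.asarray(y, dtype=float)

    def ols_resid_and_var(Y, X):
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        e = Y - X @ beta
        return e, e @ e / len(Y)

    # Step 1: AR(K) with intercept
    lags = np.column_stack([y[K - j:len(y) - j] for j in range(1, K + 1)])
    X = np.column_stack([np.ones(len(y) - K), lags])
    eps, _ = ols_resid_and_var(y[K:], X)    # residual proxy for epsilon_t

    # Step 2: grid search over (p, q)
    yy = y[K:]                              # align y with the residual series
    best = (np.inf, None)
    for p in range(p_max + 1):
        for q in range(q_max + 1):
            m = max(p, q)
            T = len(yy) - m
            cols = [np.ones(T)]
            cols += [yy[m - j:len(yy) - j] for j in range(1, p + 1)]
            cols += [eps[m - j:len(eps) - j] for j in range(1, q + 1)]
            _, sigma2 = ols_resid_and_var(yy[m:], np.column_stack(cols))
            bic = np.log(sigma2) + (p + q + 1) * np.log(T) / T
            if bic < best[0]:
                best = (bic, (p, q))
    return best[1]

# check on a simulated ARMA(2,1): y_t = 0.75 y_{t-1} - 0.3 y_{t-2} + e_t + 0.6 e_{t-1}
rng = np.random.default_rng(7)
e = rng.standard_normal(3000)
y = np.zeros(3000)
for t in range(2, 3000):
    y[t] = 0.75 * y[t - 1] - 0.3 * y[t - 2] + e[t] + 0.6 * e[t - 1]

print(hannan_rissanen(y, K=20, p_max=4, q_max=4))
```

With the actual data one would run hannan_rissanen(np.loadtxt('arma.txt'), K=30, p_max=6, q_max=6), which, per the notes above, should select (3, 1).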
More informationLecture 4a: ARMA Model
Lecture 4a: ARMA Model 1 2 Big Picture Most often our goal is to find a statistical model to describe real time series (estimation), and then predict the future (forecasting) One particularly popular model
More informationMODELING INFLATION RATES IN NIGERIA: BOX-JENKINS APPROACH. I. U. Moffat and A. E. David Department of Mathematics & Statistics, University of Uyo, Uyo
Vol.4, No.2, pp.2-27, April 216 MODELING INFLATION RATES IN NIGERIA: BOX-JENKINS APPROACH I. U. Moffat and A. E. David Department of Mathematics & Statistics, University of Uyo, Uyo ABSTRACT: This study
More informationTheoretical and Simulation-guided Exploration of the AR(1) Model
Theoretical and Simulation-guided Exploration of the AR() Model Overview: Section : Motivation Section : Expectation A: Theory B: Simulation Section : Variance A: Theory B: Simulation Section : ACF A:
More informationINTRODUCTION TO TIME SERIES ANALYSIS. The Simple Moving Average Model
INTRODUCTION TO TIME SERIES ANALYSIS The Simple Moving Average Model The Simple Moving Average Model The simple moving average (MA) model: More formally: where t is mean zero white noise (WN). Three parameters:
More informationMultivariate Time Series: VAR(p) Processes and Models
Multivariate Time Series: VAR(p) Processes and Models A VAR(p) model, for p > 0 is X t = φ 0 + Φ 1 X t 1 + + Φ p X t p + A t, where X t, φ 0, and X t i are k-vectors, Φ 1,..., Φ p are k k matrices, with
More informationRegression with correlation for the Sales Data
Regression with correlation for the Sales Data Scatter with Loess Curve Time Series Plot Sales 30 35 40 45 Sales 30 35 40 45 0 10 20 30 40 50 Week 0 10 20 30 40 50 Week Sales Data What is our goal with
More informationLecture 6a: Unit Root and ARIMA Models
Lecture 6a: Unit Root and ARIMA Models 1 2 Big Picture A time series is non-stationary if it contains a unit root unit root nonstationary The reverse is not true. For example, y t = cos(t) + u t has no
More informationEstimating AR/MA models
September 17, 2009 Goals The likelihood estimation of AR/MA models AR(1) MA(1) Inference Model specification for a given dataset Why MLE? Traditional linear statistics is one methodology of estimating
More informationNANYANG TECHNOLOGICAL UNIVERSITY SEMESTER II EXAMINATION MAS451/MTH451 Time Series Analysis TIME ALLOWED: 2 HOURS
NANYANG TECHNOLOGICAL UNIVERSITY SEMESTER II EXAMINATION 2012-2013 MAS451/MTH451 Time Series Analysis May 2013 TIME ALLOWED: 2 HOURS INSTRUCTIONS TO CANDIDATES 1. This examination paper contains FOUR (4)
More informationProf. Dr. Roland Füss Lecture Series in Applied Econometrics Summer Term Introduction to Time Series Analysis
Introduction to Time Series Analysis 1 Contents: I. Basics of Time Series Analysis... 4 I.1 Stationarity... 5 I.2 Autocorrelation Function... 9 I.3 Partial Autocorrelation Function (PACF)... 14 I.4 Transformation
More informationLecture 2: ARMA(p,q) models (part 2)
Lecture 2: ARMA(p,q) models (part 2) Florian Pelgrin University of Lausanne, École des HEC Department of mathematics (IMEA-Nice) Sept. 2011 - Jan. 2012 Florian Pelgrin (HEC) Univariate time series Sept.
More informationSome Time-Series Models
Some Time-Series Models Outline 1. Stochastic processes and their properties 2. Stationary processes 3. Some properties of the autocorrelation function 4. Some useful models Purely random processes, random
More informationTime Series: Theory and Methods
Peter J. Brockwell Richard A. Davis Time Series: Theory and Methods Second Edition With 124 Illustrations Springer Contents Preface to the Second Edition Preface to the First Edition vn ix CHAPTER 1 Stationary
More informationChapter 2: simple regression model
Chapter 2: simple regression model Goal: understand how to estimate and more importantly interpret the simple regression Reading: chapter 2 of the textbook Advice: this chapter is foundation of econometrics.
More informationAdvanced Machine Learning Practical 4b Solution: Regression (BLR, GPR & Gradient Boosting)
Advanced Machine Learning Practical 4b Solution: Regression (BLR, GPR & Gradient Boosting) Professor: Aude Billard Assistants: Nadia Figueroa, Ilaria Lauzana and Brice Platerrier E-mails: aude.billard@epfl.ch,
More informationStatistics 910, #5 1. Regression Methods
Statistics 910, #5 1 Overview Regression Methods 1. Idea: effects of dependence 2. Examples of estimation (in R) 3. Review of regression 4. Comparisons and relative efficiencies Idea Decomposition Well-known
More informationWeek 9: An Introduction to Time Series
BUS41100 Applied Regression Analysis Week 9: An Introduction to Time Series Dependent data, autocorrelation, AR and periodic regression models Max H. Farrell The University of Chicago Booth School of Business
More informationVolatility. Gerald P. Dwyer. February Clemson University
Volatility Gerald P. Dwyer Clemson University February 2016 Outline 1 Volatility Characteristics of Time Series Heteroskedasticity Simpler Estimation Strategies Exponentially Weighted Moving Average Use
More informationDynamic Time Series Regression: A Panacea for Spurious Correlations
International Journal of Scientific and Research Publications, Volume 6, Issue 10, October 2016 337 Dynamic Time Series Regression: A Panacea for Spurious Correlations Emmanuel Alphonsus Akpan *, Imoh
More informationCh. 15 Forecasting. 1.1 Forecasts Based on Conditional Expectations
Ch 15 Forecasting Having considered in Chapter 14 some of the properties of ARMA models, we now show how they may be used to forecast future values of an observed time series For the present we proceed
More informationLECTURES 2-3 : Stochastic Processes, Autocorrelation function. Stationarity.
LECTURES 2-3 : Stochastic Processes, Autocorrelation function. Stationarity. Important points of Lecture 1: A time series {X t } is a series of observations taken sequentially over time: x t is an observation
More information1 Class Organization. 2 Introduction
Time Series Analysis, Lecture 1, 2018 1 1 Class Organization Course Description Prerequisite Homework and Grading Readings and Lecture Notes Course Website: http://www.nanlifinance.org/teaching.html wechat
More informationFE570 Financial Markets and Trading. Stevens Institute of Technology
FE570 Financial Markets and Trading Lecture 5. Linear Time Series Analysis and Its Applications (Ref. Joel Hasbrouck - Empirical Market Microstructure ) Steve Yang Stevens Institute of Technology 9/25/2012
More information2. An Introduction to Moving Average Models and ARMA Models
. An Introduction to Moving Average Models and ARMA Models.1 White Noise. The MA(1) model.3 The MA(q) model..4 Estimation and forecasting of MA models..5 ARMA(p,q) models. The Moving Average (MA) models
More informationApplied Forecasting (LECTURENOTES) Prof. Rozenn Dahyot
Applied Forecasting (LECTURENOTES) Prof. Rozenn Dahyot SCHOOL OF COMPUTER SCIENCE AND STATISTICS TRINITY COLLEGE DUBLIN IRELAND https://www.scss.tcd.ie/rozenn.dahyot Michaelmas Term 2017 Contents 1 Introduction
More informationIDENTIFICATION OF ARMA MODELS
IDENTIFICATION OF ARMA MODELS A stationary stochastic process can be characterised, equivalently, by its autocovariance function or its partial autocovariance function. It can also be characterised by
More information