Reliability and Risk Analysis. Time Series, Types of Trend Functions and Estimates of Trends


Stochastic process

The sequence of random variables {Y_t, t = 0, ±1, ±2, ...} is called a stochastic process. The mean function of a stochastic process {Y_t} is the function μ_t defined by

    μ_t = E(Y_t),    t = 0, ±1, ±2, ...

The autocovariance function is defined as

    γ_{t,s} = C(Y_t, Y_s),    t, s = 0, ±1, ±2, ...,

where C(Y_t, Y_s) = E[(Y_t − μ_t)(Y_s − μ_s)] = E[Y_t Y_s] − μ_t μ_s. The autocorrelation function is given by

    ρ_{t,s} = C(Y_t, Y_s) / √(D(Y_t) D(Y_s)) = γ_{t,s} / √(γ_{t,t} γ_{s,s})

Stationarity

A process {Y_t} is said to be strictly stationary if the joint distribution of Y_{t_1}, Y_{t_2}, ..., Y_{t_n} is the same as the joint distribution of Y_{t_1−k}, Y_{t_2−k}, ..., Y_{t_n−k} for all choices of time lag k. If the function γ_{s,t} depends on its arguments only through their difference k = s − t, we can introduce the notation γ_k = γ_{s−t} = γ_{s,t}. If, in addition, the mean function μ_t of the process is constant for all t (μ_t = μ), the process {Y_t} is said to be weakly stationary.

Stationarity: autocovariance and autocorrelation function

The autocovariance function γ_k of a stationary stochastic process is

    γ_k = C(Y_t, Y_{t−k}) = E[(Y_t − μ)(Y_{t−k} − μ)],

and the autocorrelation function (ACF) ρ_k is given by

    ρ_k = C(Y_t, Y_{t−k}) / √(D(Y_t) D(Y_{t−k})) = γ_k / γ_0

Partial autocorrelation function

The correlation between two random variables is often caused by their correlation with a third variable. The partial autocorrelation measures the correlation between Y_t and Y_{t−k} after removing the effect of the intermediate variables Y_{t−1}, ..., Y_{t−k+1}. The partial autocorrelation at lag k is expressed by the regression coefficient φ_kk in the autoregression

    Y_t = φ_k1 Y_{t−1} + φ_k2 Y_{t−2} + ... + φ_kk Y_{t−k} + e_t,

where e_t is uncorrelated with Y_{t−j}, j ≥ 1. As a function of the lag k, φ_kk is called the partial autocorrelation function (PACF) and denoted ρ_kk.

Partial autocorrelation function

Multiplying both sides of the above equation by Y_{t−j} and taking expectations gives

    γ_j = φ_k1 γ_{j−1} + φ_k2 γ_{j−2} + ... + φ_kk γ_{j−k},

so for j = 1, 2, ..., k

    ρ_j = φ_k1 ρ_{j−1} + φ_k2 ρ_{j−2} + ... + φ_kk ρ_{j−k},

that is,

    ρ_1 = φ_k1 ρ_0 + φ_k2 ρ_1 + ... + φ_kk ρ_{k−1}
    ρ_2 = φ_k1 ρ_1 + φ_k2 ρ_0 + ... + φ_kk ρ_{k−2}
    ...
    ρ_k = φ_k1 ρ_{k−1} + φ_k2 ρ_{k−2} + ... + φ_kk ρ_0

These equations are called the Yule-Walker equations.
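The Yule-Walker system above can also be solved numerically for φ_kk. A minimal sketch in Python/numpy rather than R, purely for illustration (the function name is mine):

```python
import numpy as np

def pacf_yule_walker(rho, k):
    """Solve the k-th Yule-Walker system for (phi_k1, ..., phi_kk)
    and return phi_kk. rho holds the autocorrelations rho_1, rho_2, ..."""
    full = np.concatenate(([1.0], rho))   # prepend rho_0 = 1
    # left-hand side: correlation matrix with entries rho_|i-j|
    R = np.array([[full[abs(i - j)] for j in range(k)] for i in range(k)])
    phi = np.linalg.solve(R, rho[:k])     # right-hand side (rho_1, ..., rho_k)
    return phi[-1]
```

For an AR(1) process with ρ_j = φ^j this yields φ_11 = φ and φ_kk = 0 for k > 1, as expected.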

Partial autocorrelation function

Using Cramér's rule for k = 1, 2, ... we sequentially obtain

    ρ_11 = φ_11 = ρ_1,

    ρ_22 = φ_22 = (ρ_2 − ρ_1²) / (1 − ρ_1²),

and in general

    ρ_kk = φ_kk = det(P_k*) / det(P_k),

where P_k is the k×k autocorrelation matrix with (i, j) entry ρ_|i−j| (ones on the diagonal) and P_k* is P_k with its last column replaced by (ρ_1, ρ_2, ..., ρ_k)ᵀ.

Estimates

The parameters μ, γ_0 and ρ_k are in general unknown. We use the estimates

    μ̂ = Ȳ = (1/n) Σ_{t=1}^{n} Y_t,    γ̂_0 = (1/n) Σ_{t=1}^{n} (Y_t − Ȳ)²,

where n is the number of measurements (the length of the time series), and

    ρ̂_k = Σ_{t=k+1}^{n} (Y_t − Ȳ)(Y_{t−k} − Ȳ) / Σ_{t=1}^{n} (Y_t − Ȳ)²,    k = 1, 2, ..., n − 1

(in R: acf).
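The estimate ρ̂_k translates directly into code. A small numpy sketch of the sample ACF, mirroring what R's acf computes (the function name is mine):

```python
import numpy as np

def sample_acf(y, max_lag):
    """Sample autocorrelations rho_hat_1, ..., rho_hat_max_lag."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    ybar = y.mean()
    denom = np.sum((y - ybar) ** 2)       # equals n * gamma_hat_0
    return np.array([
        np.sum((y[k:] - ybar) * (y[:n - k] - ybar)) / denom
        for k in range(1, max_lag + 1)
    ])

print(sample_acf([1, 2, 3, 4], 1))        # [0.25]
```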

Estimates

For the sample partial autocorrelation function we can use the recursive formula (in R: pacf)

    ρ̂_11 = ρ̂_1,

    ρ̂_kk = (ρ̂_k − Σ_{j=1}^{k−1} ρ̂_{k−1,j} ρ̂_{k−j}) / (1 − Σ_{j=1}^{k−1} ρ̂_{k−1,j} ρ̂_j),

    ρ̂_kj = ρ̂_{k−1,j} − ρ̂_kk ρ̂_{k−1,k−j},    j = 1, 2, ..., k − 1
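This recursion (the Durbin-Levinson algorithm) is easy to implement. A numpy sketch, with names of my choosing (cf. R's pacf):

```python
import numpy as np

def sample_pacf(rho, max_lag):
    """PACF phi_kk for k = 1..max_lag via the recursive formula above.
    rho holds the (sample) autocorrelations rho_1, rho_2, ..."""
    rho = np.asarray(rho, dtype=float)
    pacf = np.empty(max_lag)
    phi = np.array([rho[0]])              # phi_11 = rho_1
    pacf[0] = rho[0]
    for k in range(2, max_lag + 1):
        num = rho[k - 1] - phi @ rho[k - 2::-1]   # rho_k - sum phi_{k-1,j} rho_{k-j}
        den = 1.0 - phi @ rho[:k - 1]             # 1 - sum phi_{k-1,j} rho_j
        phi_kk = num / den
        phi = np.append(phi - phi_kk * phi[::-1], phi_kk)
        pacf[k - 1] = phi_kk
    return pacf
```

For AR(1) autocorrelations ρ_j = φ^j it returns φ at lag 1 and values near zero beyond, matching the Yule-Walker solution.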

White noise process

The white noise process {ε_t} is an important stationary stochastic process. It is a sequence of independent random variables with the same distribution, with zero mean and constant variance. It fulfills

    ρ_k = 1 for k = 0, ρ_k = 0 for k ≠ 0;    ρ_kk = 1 for k = 0, ρ_kk = 0 for k ≠ 0.

Gaussian white noise is a sequence of independent random variables with the distribution N(0, σ_ε²).

Deterministic trend

Example: the process Y_t = Y_0 + a t, t = 1, ..., n, contains a deterministic linear trend; Y_0 denotes the initial value. The graph shows this process for n = 100, Y_0 = 0, a = 1.

Stochastic trend

Example: the random walk Y_t = Y_{t−1} + ε_t, t = 1, ..., n, where ε_t ~ WN(0, σ²). Substituting repeatedly,

    Y_t = Y_{t−1} + ε_t = (Y_{t−2} + ε_{t−1}) + ε_t = (Y_{t−3} + ε_{t−2}) + ε_{t−1} + ε_t = ... = Y_0 + ε_1 + ... + ε_t = Y_0 + Σ_{i=1}^{t} ε_i

Y_0 denotes the initial value. Two possible realizations (simulations) of this process (n = 100, Y_0 = 0, ε_t ~ WN(0, 1)) are shown in the graphs.

Stochastic trend

Example: the random walk with drift Y_t = Y_{t−1} + a + ε_t, t = 1, ..., n, where ε_t ~ WN(0, σ²). Substituting repeatedly,

    Y_t = Y_{t−1} + a + ε_t = (Y_{t−2} + a + ε_{t−1}) + a + ε_t = (Y_{t−3} + a + ε_{t−2}) + 2a + ε_{t−1} + ε_t = ... = Y_0 + a t + Σ_{i=1}^{t} ε_i

Y_0 denotes the initial value. One possible realization (simulation) of this process (n = 100, Y_0 = 0, ε_t ~ WN(0, 1)) is shown in the graph.
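Both examples can be simulated in a few lines using the cumulative-sum form derived above. A sketch in Python/numpy rather than R (seed and parameters arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)            # fixed seed for reproducibility
n, y0, a = 100, 0.0, 1.0
eps = rng.normal(0.0, 1.0, size=n)        # white noise WN(0, 1)
t = np.arange(1, n + 1)

walk = y0 + np.cumsum(eps)                # Y_t = Y_0 + sum_{i<=t} eps_i
drift_walk = y0 + a * t + np.cumsum(eps)  # Y_t = Y_0 + a*t + sum_{i<=t} eps_i
```

Note that the drift adds the deterministic line a·t on top of the same stochastic path.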

Regression

The basis of classical time series analysis is the decomposition of a series into a trend T_t, a seasonal component S_t and a residual component e_t:

    Y_t = T_t + S_t + e_t    (additive model),
    Y_t = T_t · S_t · e_t    (multiplicative model).

Linear filters can be used to estimate the trend:

    T̂_t = Σ_{i=−∞}^{∞} λ_i Y_{t+i}

Regression

A simple example of a linear filter is the moving average with constant weights

    T̂_t = 1/(2a + 1) Σ_{i=−a}^{a} Y_{t+i}

The smoothed value of the time series at time τ is obtained as the average of {y_{τ−a}, ..., y_τ, ..., y_{τ+a}}. For example, for a = 2, 12 and 40 we have

    a = 2:  λ_i = {1/5, 1/5, 1/5, 1/5, 1/5}
    a = 12: λ_i = {1/25, ..., 1/25}    (25 times)
    a = 40: λ_i = {1/81, ..., 1/81}    (81 times)
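A centered moving average is a convolution with equal weights. A minimal numpy sketch (function name mine, cf. R's filter):

```python
import numpy as np

def moving_average(y, a):
    """Centered moving average with 2a+1 equal weights.
    Returns n - 2a smoothed values; the a values at each end are lost."""
    w = np.full(2 * a + 1, 1.0 / (2 * a + 1))
    return np.convolve(np.asarray(y, dtype=float), w, mode="valid")

print(moving_average(np.arange(10.0), 2))   # [2. 3. 4. 5. 6. 7.]
```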

Regression

The graph shows the monthly production of beer in Australia from January 1956 to August 1995.

Regression

The graphs show moving averages of length 5 (a = 2), 25 (a = 12) and 81 (a = 40).

Regression: decomposition

Moving averages (in R computed with the function filter) are the basis of classical decomposition, which R performs with the function decompose. The function stl offers a somewhat more sophisticated method of decomposition.
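The classical decomposition that R's decompose performs can be sketched by hand: estimate the trend with a centered moving average (a 2×m filter when the period m is even), subtract it, and average the detrended values over each seasonal position. A simplified numpy version (function name mine; the real decompose also handles the series ends and the multiplicative form):

```python
import numpy as np

def decompose_additive(y, m):
    """Trend by centered moving average (2 x m filter for even m),
    seasonal effects by phase-wise means of the detrended series,
    centered so they sum to zero. Returns (trend, seasonal)."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    if m % 2:                              # odd period: plain m-term average
        w = np.full(m, 1.0 / m)
    else:                                  # even period: half weights at the ends
        w = np.concatenate(([0.5], np.ones(m - 1), [0.5])) / m
    h = len(w) // 2
    trend = np.convolve(y, w, mode="valid")      # covers y[h : n-h]
    detrended = y[h:n - h] - trend
    phase = np.arange(h, n - h) % m
    seasonal = np.array([detrended[phase == p].mean() for p in range(m)])
    return trend, seasonal - seasonal.mean()
```

On a series that is exactly a linear trend plus a period-m seasonal pattern, this recovers both components exactly.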

Regression: linear regression

The figure shows the evolution of the gross monthly wage in the Czech Republic (2000-2012, quarterly data).

Regression: linear regression

We estimate the trend using a regression line in the time variable t = year − 1999, t = 1, ..., 13:

                Estimate      St. error    t-test    p-value
    intercept   11875.9388    286.5841     41.44     0.0000
    t            1051.1374     34.6343     30.35     0.0000

Regression: linear regression

We include dummy variables q_1, q_2, q_3, q_4 in the model to describe seasonality:

    q_1 = (1, 0, 0, 0, 1, 0, 0, 0, ..., 1, 0, 0, 0)
    q_2 = (0, 1, 0, 0, 0, 1, 0, 0, ..., 0, 1, 0, 0)
    q_3 = (0, 0, 1, 0, 0, 0, 1, 0, ..., 0, 0, 1, 0)
    q_4 = (0, 0, 0, 1, 0, 0, 0, 1, ..., 0, 0, 0, 1)

           Estimate      St. error    t-test    p-value
    t        484.4349    152.8450      3.17     0.0027
    t²       111.4610     23.3280      4.78     0.0000
    t³        −5.7907      1.0430     −5.55     0.0000
    q_1    11684.2530    282.8990     41.30     0.0000
    q_2    12457.0559    287.3388     43.35     0.0000
    q_3    12138.0542    291.5674     41.63     0.0000
    q_4    13921.6368    295.6681     47.09     0.0000
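A fit of this form can be reproduced with ordinary least squares on a design matrix holding t, t², t³ and the four quarterly dummies. A numpy sketch (the coefficient values below are hypothetical round numbers, not the fitted table; the series is noiseless so the fit recovers them exactly):

```python
import numpy as np

n_years = 13                               # 2000-2012, quarterly data
t = np.arange(1, 4 * n_years + 1, dtype=float)
dummies = np.tile(np.eye(4), (n_years, 1))           # columns q1..q4
X = np.column_stack([t, t ** 2, t ** 3, dummies])    # no intercept: q1..q4 absorb it
beta = np.array([480.0, 110.0, -5.8, 11700.0, 12450.0, 12140.0, 13920.0])
y = X @ beta                               # hypothetical wage series, no noise

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)     # OLS estimates
```

Predictions for 2013 are then X_new @ beta_hat with t = 53, ..., 56 and the corresponding dummy rows.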

Regression: linear regression (figure)

Regression: linear regression

The predictions for 2013 (with 95% confidence intervals) are summarized in the table:

               prediction    lower       upper
    2013 Q1    24423.14      24008.05    24838.23
    2013 Q2    25237.73      24777.01    25698.44
    2013 Q3    24943.50      24431.21    25455.79
    2013 Q4    26734.30      26164.51    27304.09