Statistics of stochastic processes
- Francine Beatrix Murphy
Introduction: statistics of stochastic processes

Generally, statistics is performed on observations $y_1, \dots, y_n$ assumed to be realizations of independent random variables $Y_1, \dots, Y_n$. In statistics of stochastic processes (= time series analysis) we will assume $y_1, \dots, y_n$ to be realizations of a stochastic process $\dots, Y_1, \dots, Y_n, \dots$ with some rules for dependence.

14 settembre / 31
Introduction: aims of time series analysis

Drawing inference from available data; first, though, we need to find an appropriate model. Once a model has been selected, we can:
- provide a compact and correct description of the data (trend, seasonal and random terms)
- adjust the data (filtering, missing values) [separating noise from signal]
- test hypotheses (increasing trend? influence of factors?) and understand causes
- predict future values

Remarks
- Random terms are generally described as a stationary process.
- Linear analysis (additive decomposition into trend, seasonal and stationary-process terms).
Introduction: stationary processes

Definition. A stochastic process $\{X_t\}_{t \in \mathbb{Z}}$ is (strictly) stationary if the joint distribution of $(X_{t_1}, X_{t_2}, \dots, X_{t_k})$ is equal to the distribution of $(X_{t_1+h}, X_{t_2+h}, \dots, X_{t_k+h})$ for all $k \in \mathbb{N}$, $h \in \mathbb{Z}$, $t_1, t_2, \dots, t_k \in \mathbb{Z}$.

In particular, if a stationary stochastic process has finite second moment, then $E(X_t)$ and $\mathrm{Cov}(X_t, X_{t+h})$ do not depend on $t$.
Introduction: stationary processes, 2

Linear time series analysis looks only at second-order properties. Then:

Definition. A stochastic process $\{X_t\}_{t \in \mathbb{Z}}$ is (weakly) stationary if it is in $L^2$ and
$E(X_t) = \mu$, $\mathrm{Cov}(X_t, X_{t+h}) = \gamma(h)$ for all $t$.

If a Gaussian process is stationary, then it is strictly stationary. (A Gaussian process is one all of whose finite-dimensional distributions are multivariate normal.)
Introduction: reminders on the multivariate normal

Definition. $Y = (Y_1, \dots, Y_n)$ is multivariate normal if, for all $a \in \mathbb{R}^n$, $a^t Y$ is univariate normal.

Equivalently, $Y$ is multivariate normal iff there exist $b \in \mathbb{R}^n$, an $n \times m$ matrix $A$, and $X = (X_1, \dots, X_m)$ independent standard normal r.v.'s such that $Y = AX + b$. It follows that $E(Y) = b$ and $\mathrm{Cov}(Y) = AA^t$, i.e. $Y \sim N(b, AA^t)$.

An alternative characterization is via the characteristic function.

If $\mathrm{Cov}(Y) = S$ is positive definite (i.e. invertible), then $Y \sim N(\mu, S)$ has density
$f_Y(y) = (2\pi)^{-n/2} |S|^{-1/2} \exp\{-(y - \mu)^t S^{-1} (y - \mu)/2\}$
(a non-singular distribution).
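As a sanity check, the characterization $Y = AX + b$ (with $X$ independent standard normals, so that $E(Y) = b$ and $\mathrm{Cov}(Y) = AA^t$) can be simulated. This is a hedged sketch in which the matrix $A$, the vector $b$ and the sample size are arbitrary illustrative choices:

```python
import numpy as np

# Build Y = A X + b from independent standard normals X and check
# empirically that E(Y) is close to b and Cov(Y) is close to A A^t.
rng = np.random.default_rng(0)

A = np.array([[1.0, 0.0],
              [0.5, 2.0]])          # illustrative 2x2 matrix
b = np.array([1.0, -1.0])           # illustrative mean vector

n = 200_000
X = rng.standard_normal((2, n))     # columns: independent N(0,1) pairs
Y = (A @ X).T + b                   # each row is one draw of Y

mean_hat = Y.mean(axis=0)
cov_hat = np.cov(Y, rowvar=False)

print(mean_hat)                     # close to b
print(cov_hat)                      # close to A @ A.T
```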
Introduction: Gaussian processes

Definition. A process $\{X_t\}$ is Gaussian if for any $n > 0$ and any $(t_1, \dots, t_n)$ the vector $X = (X_{t_1}, \dots, X_{t_n})$ has a non-singular multivariate normal distribution.

Let $\mu = (\mu_{t_1}, \dots, \mu_{t_n}) = E(X)$ and $\mathrm{Cov}(X) = \Gamma = \{\gamma(t_i, t_j)\}_{i,j = 1, \dots, n}$. Then $X$ has density
$g(x; \mu, \Gamma) = (2\pi)^{-n/2} |\Gamma|^{-1/2} \exp\{-\tfrac{1}{2}(x - \mu)^t \Gamma^{-1} (x - \mu)\}$.

$\{X_t\}$ is (weakly) stationary if $\mu_t \equiv \mu$ and $\gamma(t_i, t_j) = \gamma(|t_i - t_j|)$; it is then also strictly stationary, as the distribution depends only on $\mu$ and $\Gamma$.

Linear time series analysis is very well suited to Gaussian processes; less so to non-Gaussian ones.
Introduction: Hilbert spaces

Many time series problems can be solved using Hilbert space theory. Indeed, the space $L^2(\Omega)$ is a Hilbert space with
$\langle X, Y \rangle = E(XY)$, $\|X - Y\|^2 = E(|X - Y|^2)$.
Restricting to the zero-mean subspace, $\langle X, Y \rangle = \mathrm{Cov}(X, Y)$.
Introduction: detrending data

Often data do not appear to arise from stationary processes. Two strategies:
- estimate the trend and then study the residuals (deviations from trend), via smoothing or polynomial (especially straight-line) fitting;
- study the differenced series.
In all cases, transformations may be useful. More systematic model fitting will come later.
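The strategies above can be sketched on a synthetic series (linear trend plus noise); the series, window length and all variable names below are illustrative choices, not from the slides:

```python
import numpy as np

# Three detrending strategies on a synthetic series x_t = 0.5 t + noise.
rng = np.random.default_rng(1)
t = np.arange(100)
x = 0.5 * t + rng.normal(0.0, 1.0, size=100)    # trend + random term

# 1. Smoothing: centred 5-point moving average as trend estimate.
kernel = np.ones(5) / 5
trend_ma = np.convolve(x, kernel, mode="valid")  # length 96, edges lost

# 2. Polynomial (here: straight-line) fitting by least squares.
slope, intercept = np.polyfit(t, x, deg=1)
residuals = x - (slope * t + intercept)          # deviations from trend

# 3. Differencing: a linear trend becomes a constant after one difference.
dx = np.diff(x)

print(slope)            # close to the true slope 0.5
print(dx.mean())        # also close to 0.5
```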
Example data sets (plots)
- Johnson & Johnson quarterly earnings per share, with 3-point and 5-point smoothing; deviations from the moving average, also in log scale.
- Sunspot numbers, raw and after a square-root transformation.
- PanAm international air passengers (monthly, in thousands); annual aggregates; seasonal (monthly) component.
- Level of Lake Huron (ft); deviations from trend.
- Red wine sales in Australia (kilolitres); deviations from trend; seasonal variation.
- Global temperature data (anomalies from the mean), monthly and yearly averages; recent years with regression line.
- Measles in England (cases per biweek).
- EEG trace from a subject with epilepsy (time in arbitrary units).
De-trend and de-seasonalize (period $T = 2q$)
- moving (yearly) average: $m_t = \frac{1}{T}\left(\tfrac{1}{2} x_{t-q} + \sum_{j=-(q-1)}^{q-1} x_{t+j} + \tfrac{1}{2} x_{t+q}\right)$
- seasonal deviation: $w_k = \frac{1}{n} \sum_{j=0}^{n-1} (x_{jT+k} - m_{jT+k})$, $k = 1, \dots, T$
- seasonal component: $\hat{s}_k = w_k - \frac{1}{T} \sum_{i=1}^{T} w_i$, $k = 1, \dots, T$, extended periodically by $\hat{s}_t = \hat{s}_{t - \lfloor (t-1)/T \rfloor T}$ for $t > T$
- deseasonalized data: $d_t = x_t - \hat{s}_t$
- $\hat{m}_t$: trend component estimated on the deseasonalized data
- $\hat{Y}_t = x_t - \hat{m}_t - \hat{s}_t$: random component.

Otherwise, difference the data: $\nabla_T X_t := X_t - X_{t-T}$. The $\nabla_T X_t$ are de-seasonalized; a trend can then be eliminated from them.
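A hedged sketch of these steps in Python/NumPy, on a synthetic monthly series with period $T = 12$, $q = 6$; the test series and all names are illustrative, and indices are 0-based rather than the 1-based $k$ of the formulas:

```python
import numpy as np

# Classical decomposition: moving-average trend, seasonal means, deseasonalize.
rng = np.random.default_rng(2)
T, q, n_years = 12, 6, 8
t = np.arange(n_years * T)
season = 3.0 * np.sin(2 * np.pi * t / T)                 # true seasonal term
x = 0.1 * t + season + rng.normal(0.0, 0.3, size=t.size)  # trend + season + noise

# m_t: centred moving average with half weights at the two ends.
weights = np.r_[0.5, np.ones(T - 1), 0.5] / T
m = np.convolve(x, weights, mode="valid")     # estimates the trend at t = q..n-q-1
m_full = np.full_like(x, np.nan)
m_full[q:-q] = m

# w_k: average of x_t - m_t over each position k in the period (0-based).
detr = x - m_full
w = np.array([np.nanmean(detr[k::T]) for k in range(T)])

# Seasonal component, centred so that it sums to zero over one period.
s_hat = w - w.mean()

# Deseasonalized data d_t = x_t - s_hat_t (s_hat extended periodically).
d = x - np.tile(s_hat, n_years)

print(np.round(s_hat, 2))      # close to the true seasonal pattern
```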
Autocovariance and autocorrelation functions

If a process $\{X_t\}$ is stationary, $\gamma(h) := \mathrm{Cov}(X_t, X_{t+h})$ is its autocovariance function (ACVF).

Recall the correlation $\rho(X, Y) = \mathrm{Cov}(X, Y)/\sqrt{V(X)V(Y)}$. For a stationary process $V(X_t) = V(X_{t+h}) = \gamma(0)$, hence
$\rho(h) = \rho(X_t, X_{t+h}) = \gamma(h)/\gamma(0)$
is the autocorrelation function (ACF).

First properties of the ACVF:
- $\gamma(h) = \gamma(-h)$ [stationarity implies $\mathrm{Cov}(X_t, X_{t+h}) = \mathrm{Cov}(X_{t-h}, X_t)$]
- $|\gamma(h)| \le \gamma(0)$ [as $|\rho(X, Y)| \le 1$]
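These definitions translate directly into sample estimators. The following is an illustrative sketch (the function name and the $1/n$ normalization are choices of this sketch, though $1/n$ is the common convention); the symmetry $\gamma(h) = \gamma(-h)$ is built in:

```python
import numpy as np

def sample_acvf(x, h):
    """Sample autocovariance at lag h (1/n convention, symmetric in h)."""
    x = np.asarray(x, dtype=float)
    n, h = x.size, abs(h)                 # gamma(h) = gamma(-h)
    xbar = x.mean()
    return np.sum((x[: n - h] - xbar) * (x[h:] - xbar)) / n

# White noise: gamma(0) ~ 1 and the sample ACF at h > 0 is near zero.
rng = np.random.default_rng(3)
x = rng.standard_normal(1000)

gamma0 = sample_acvf(x, 0)
rho = [sample_acvf(x, h) / gamma0 for h in range(1, 6)]
print(rho)                                # all small for white noise
```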
Simple stationary processes and their ACVF
- IID(0, $\sigma^2$): $\{X_t\}_{t \in \mathbb{Z}}$ independent and identically distributed r.v.'s with $E(X_t) = 0$, $V(X_t) = \sigma^2$: $\gamma(0) = \sigma^2$, $\gamma(h) = 0$ for $h > 0$.
- WN(0, $\sigma^2$) [white noise]: $\{X_t\}_{t \in \mathbb{Z}}$ uncorrelated random variables with mean 0 and variance $\sigma^2$: $\gamma(0) = \sigma^2$, $\gamma(h) = 0$ for $h > 0$.

A WN(0, $\sigma^2$) process need not be independent. For instance, if $\{Z_t\}_{t \in \mathbb{Z}}$ are IID N(0,1) [normal r.v.'s], then
$X_t = Z_t$ for $t$ odd, $\quad X_t = (Z_{t-1}^2 - 1)/\sqrt{2}$ for $t$ even
is WN(0,1) but not IID(0,1). It is not IID since (e.g.) $X_1$ and $X_2$ are obviously not independent; that $\{X_t\}$ is WN is left as an exercise.

Less contrived examples of $\{X_t\}$ WN but not IID will be seen later in the course.
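A numerical check (not a proof) of this example: simulate the process and verify that the sample mean, variance and lag-one covariance look like WN(0,1), while a nonlinear cross-moment reveals the dependence. The wrap-around at $t = 0$ from `np.roll` is a harmless artifact of the sketch:

```python
import numpy as np

# X_t = Z_t for odd t, (Z_{t-1}^2 - 1)/sqrt(2) for even t, with Z_t IID N(0,1).
rng = np.random.default_rng(4)
n = 200_000
Z = rng.standard_normal(n)

X = np.where(np.arange(n) % 2 == 1, Z, (np.roll(Z, 1) ** 2 - 1) / np.sqrt(2))

print(X.mean(), X.var())             # both close to 0 and 1: consistent with WN(0,1)
lag1 = np.mean(X[:-1] * X[1:])       # sample gamma(1), close to 0
print(lag1)

# Dependence: for odd t, E[X_t^2 X_{t+1}] = E[Z^2 (Z^2 - 1)]/sqrt(2) = sqrt(2) != 0.
dep = np.mean(X[1:-1:2] ** 2 * X[2::2])
print(dep)                           # clearly nonzero, so X is not IID
```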
Moving average processes and their ACVF

MA(1) [moving average]: $\{X_t\}_{t \in \mathbb{Z}}$ is MA(1) if
$X_t = Z_t + \vartheta Z_{t-1}$, $t \in \mathbb{Z}$,
where $\vartheta \in \mathbb{R}$ and $\{Z_t\} \sim WN(0, \sigma^2)$. A simple computation gives $\gamma(0) = \sigma^2(1 + \vartheta^2)$, $\gamma(1) = \vartheta\sigma^2$, $\gamma(h) = 0$ for $h > 1$.

Similarly, $\{X_t\}_{t \in \mathbb{Z}}$ is MA(q) if
$X_t = Z_t + \vartheta_1 Z_{t-1} + \dots + \vartheta_q Z_{t-q}$, $t \in \mathbb{Z}$,
with $\vartheta_1, \dots, \vartheta_q \in \mathbb{R}$, $\{Z_t\} \sim WN(0, \sigma^2)$. Another simple computation leads to $\gamma(h) = 0$ for $h > q$.
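The MA(1) autocovariances can be verified by simulation; a sketch with illustrative parameter values:

```python
import numpy as np

# MA(1): X_t = Z_t + theta Z_{t-1}; expect gamma(0) = sigma^2 (1 + theta^2),
# gamma(1) = theta sigma^2, gamma(h) = 0 for h > 1.
rng = np.random.default_rng(5)
theta, sigma = 0.6, 1.0
n = 200_000

Z = rng.normal(0.0, sigma, size=n + 1)
X = Z[1:] + theta * Z[:-1]

g0 = np.mean(X * X)                 # sample gamma(0)
g1 = np.mean(X[:-1] * X[1:])        # sample gamma(1)
g2 = np.mean(X[:-2] * X[2:])        # sample gamma(2)
print(g0, g1, g2)                   # close to 1.36, 0.6, 0
```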
AutoRegressive processes

AR(1) [AutoRegressive]: $\{X_t\}_{t \in \mathbb{Z}}$ is AR(1) if it is stationary and
$X_t = \varphi X_{t-1} + Z_t$, $t \in \mathbb{Z}$, where $\varphi \in \mathbb{R}$, $\{Z_t\} \sim WN(0, \sigma^2)$.  (1)

(1) is an infinite set of equations; it is not obvious that a stationary process satisfying them exists (this will be discussed later). Note that we are not saying $\{X_t\}_{t \in \mathbb{N}}$ is the Markov chain defined through $X_t = \varphi X_{t-1} + Z_t$, $t > 0$, with $X_0$ some prescribed r.v.

Now assume a stationary process $\{X_t\}_{t \in \mathbb{Z}}$ exists satisfying (1) and $E(X_t Z_s) = 0$ for $t < s$ (this latter property seems natural, as $X_t$ should be defined in terms of $Z_t$ and the previous noise terms). Then
$\gamma(0) = V(X_t) = E((\varphi X_{t-1} + Z_t)^2) = \varphi^2 V(X_{t-1}) + \sigma^2 + 2\varphi E(X_{t-1} Z_t) = \varphi^2 \gamma(0) + \sigma^2$.
Hence $\gamma(0) = \frac{\sigma^2}{1 - \varphi^2}$ (which makes sense only if $\varphi^2 < 1$).
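The formula $\gamma(0) = \sigma^2/(1 - \varphi^2)$ can be checked by simulation. The sketch below generates the recursion from a starting value drawn with the stationary variance (one convenient construction among several, with illustrative parameters):

```python
import numpy as np

# AR(1) with |phi| < 1: expect V(X_t) = sigma^2 / (1 - phi^2).
rng = np.random.default_rng(6)
phi, sigma = 0.7, 1.0
n = 100_000

Z = rng.normal(0.0, sigma, size=n)
X = np.empty(n)
X[0] = rng.normal(0.0, sigma / np.sqrt(1 - phi**2))   # stationary start
for t in range(1, n):
    X[t] = phi * X[t - 1] + Z[t]

print(X.var())           # close to 1 / (1 - 0.49), i.e. about 1.96
```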
AutoRegressive processes, 2

Remarks: we have found $\varphi^2 < 1$, i.e. $|\varphi| < 1$, as a necessary condition for an AR(1) satisfying $E(X_t Z_s) = 0$ for $t < s$; it will also turn out to be sufficient. An implicit assumption in the computations was $E(X_t) = 0$ (this can be proved analogously). One can then compute $\gamma(h)$ for $h > 0$ in the same way (left as an exercise).
CHAPTER 2 FUNDAMENTAL CONCEPTS This chapter describes the fundamental concepts in the theory of time series models. In particular, we introduce the concepts of stochastic processes, mean and covariance
More informationCh. 14 Stationary ARMA Process
Ch. 14 Stationary ARMA Process A general linear stochastic model is described that suppose a time series to be generated by a linear aggregation of random shock. For practical representation it is desirable
More informationcovariance function, 174 probability structure of; Yule-Walker equations, 174 Moving average process, fluctuations, 5-6, 175 probability structure of
Index* The Statistical Analysis of Time Series by T. W. Anderson Copyright 1971 John Wiley & Sons, Inc. Aliasing, 387-388 Autoregressive {continued) Amplitude, 4, 94 case of first-order, 174 Associated
More informationSTAT 248: EDA & Stationarity Handout 3
STAT 248: EDA & Stationarity Handout 3 GSI: Gido van de Ven September 17th, 2010 1 Introduction Today s section we will deal with the following topics: the mean function, the auto- and crosscovariance
More information1. Stochastic Processes and Stationarity
Massachusetts Institute of Technology Department of Economics Time Series 14.384 Guido Kuersteiner Lecture Note 1 - Introduction This course provides the basic tools needed to analyze data that is observed
More informationStatistics of Stochastic Processes
Prof. Dr. J. Franke All of Statistics 4.1 Statistics of Stochastic Processes discrete time: sequence of r.v...., X 1, X 0, X 1, X 2,... X t R d in general. Here: d = 1. continuous time: random function
More informationTime Series Models and Inference. James L. Powell Department of Economics University of California, Berkeley
Time Series Models and Inference James L. Powell Department of Economics University of California, Berkeley Overview In contrast to the classical linear regression model, in which the components of the
More informationECON 616: Lecture 1: Time Series Basics
ECON 616: Lecture 1: Time Series Basics ED HERBST August 30, 2017 References Overview: Chapters 1-3 from Hamilton (1994). Technical Details: Chapters 2-3 from Brockwell and Davis (1987). Intuition: Chapters
More informationBasic concepts and terminology: AR, MA and ARMA processes
ECON 5101 ADVANCED ECONOMETRICS TIME SERIES Lecture note no. 1 (EB) Erik Biørn, Department of Economics Version of February 1, 2011 Basic concepts and terminology: AR, MA and ARMA processes This lecture
More informationTime Series Outlier Detection
Time Series Outlier Detection Tingyi Zhu July 28, 2016 Tingyi Zhu Time Series Outlier Detection July 28, 2016 1 / 42 Outline Time Series Basics Outliers Detection in Single Time Series Outlier Series Detection
More informationChapter 3 - Temporal processes
STK4150 - Intro 1 Chapter 3 - Temporal processes Odd Kolbjørnsen and Geir Storvik January 23 2017 STK4150 - Intro 2 Temporal processes Data collected over time Past, present, future, change Temporal aspect
More informationWeek 5 Quantitative Analysis of Financial Markets Characterizing Cycles
Week 5 Quantitative Analysis of Financial Markets Characterizing Cycles Christopher Ting http://www.mysmu.edu/faculty/christophert/ Christopher Ting : christopherting@smu.edu.sg : 6828 0364 : LKCSB 5036
More informationA nonparametric test for seasonal unit roots
Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna To be presented in Innsbruck November 7, 2007 Abstract We consider a nonparametric test for the
More informationTime Series I Time Domain Methods
Astrostatistics Summer School Penn State University University Park, PA 16802 May 21, 2007 Overview Filtering and the Likelihood Function Time series is the study of data consisting of a sequence of DEPENDENT
More informationClassic Time Series Analysis
Classic Time Series Analysis Concepts and Definitions Let Y be a random number with PDF f Y t ~f,t Define t =E[Y t ] m(t) is known as the trend Define the autocovariance t, s =COV [Y t,y s ] =E[ Y t t
More informationBrooklyn College, CUNY. Lecture Notes. Christian Beneš
Brooklyn College, CUNY Math 4506 Time Series Lecture Notes Spring 2015 Christian Beneš cbenes@brooklyn.cuny.edu http://userhome.brooklyn.cuny.edu/cbenes/timeseries.html Math 4506 (Spring 2015) January
More informationEcon 623 Econometrics II Topic 2: Stationary Time Series
1 Introduction Econ 623 Econometrics II Topic 2: Stationary Time Series In the regression model we can model the error term as an autoregression AR(1) process. That is, we can use the past value of the
More informationLesson 2: Analysis of time series
Lesson 2: Analysis of time series Time series Main aims of time series analysis choosing right model statistical testing forecast driving and optimalisation Problems in analysis of time series time problems
More informationECO 513 Fall 2009 C. Sims CONDITIONAL EXPECTATION; STOCHASTIC PROCESSES
ECO 513 Fall 2009 C. Sims CONDIIONAL EXPECAION; SOCHASIC PROCESSES 1. HREE EXAMPLES OF SOCHASIC PROCESSES (I) X t has three possible time paths. With probability.5 X t t, with probability.25 X t t, and
More informationLecture 3: Autoregressive Moving Average (ARMA) Models and their Practical Applications
Lecture 3: Autoregressive Moving Average (ARMA) Models and their Practical Applications Prof. Massimo Guidolin 20192 Financial Econometrics Winter/Spring 2018 Overview Moving average processes Autoregressive
More informationIntroduction to Time Series Analysis. Lecture 7.
Last lecture: Introduction to Time Series Analysis. Lecture 7. Peter Bartlett 1. ARMA(p,q) models: stationarity, causality, invertibility 2. The linear process representation of ARMA processes: ψ. 3. Autocovariance
More informationTime series and spectral analysis. Peter F. Craigmile
Time series and spectral analysis Peter F. Craigmile http://www.stat.osu.edu/~pfc/ Summer School on Extreme Value Modeling and Water Resources Universite Lyon 1, France. 13-24 Jun 2016 Thank you to The
More informationSimple Descriptive Techniques
Simple Descriptive Techniques Outline 1 Types of variation 2 Stationary Time Series 3 The Time Plot 4 Transformations 5 Analysing Series that Contain a Trend 6 Analysing Series that Contain Seasonal Variation
More informationX t = a t + r t, (7.1)
Chapter 7 State Space Models 71 Introduction State Space models, developed over the past 10 20 years, are alternative models for time series They include both the ARIMA models of Chapters 3 6 and the Classical
More informationCh 5. Models for Nonstationary Time Series. Time Series Analysis
We have studied some deterministic and some stationary trend models. However, many time series data cannot be modeled in either way. Ex. The data set oil.price displays an increasing variation from the
More informationCh. 15 Forecasting. 1.1 Forecasts Based on Conditional Expectations
Ch 15 Forecasting Having considered in Chapter 14 some of the properties of ARMA models, we now show how they may be used to forecast future values of an observed time series For the present we proceed
More information7 Introduction to Time Series
Econ 495 - Econometric Review 1 7 Introduction to Time Series 7.1 Time Series vs. Cross-Sectional Data Time series data has a temporal ordering, unlike cross-section data, we will need to changes some
More informationLecture 2: ARMA(p,q) models (part 2)
Lecture 2: ARMA(p,q) models (part 2) Florian Pelgrin University of Lausanne, École des HEC Department of mathematics (IMEA-Nice) Sept. 2011 - Jan. 2012 Florian Pelgrin (HEC) Univariate time series Sept.
More informationELEG 3143 Probability & Stochastic Process Ch. 6 Stochastic Process
Department of Electrical Engineering University of Arkansas ELEG 3143 Probability & Stochastic Process Ch. 6 Stochastic Process Dr. Jingxian Wu wuj@uark.edu OUTLINE 2 Definition of stochastic process (random
More informationUnivariate Time Series Analysis; ARIMA Models
Econometrics 2 Fall 24 Univariate Time Series Analysis; ARIMA Models Heino Bohn Nielsen of4 Outline of the Lecture () Introduction to univariate time series analysis. (2) Stationarity. (3) Characterizing
More information3 Theory of stationary random processes
3 Theory of stationary random processes 3.1 Linear filters and the General linear process A filter is a transformation of one random sequence {U t } into another, {Y t }. A linear filter is a transformation
More informationStochastic process for macro
Stochastic process for macro Tianxiao Zheng SAIF 1. Stochastic process The state of a system {X t } evolves probabilistically in time. The joint probability distribution is given by Pr(X t1, t 1 ; X t2,
More informationLecture 19 Box-Jenkins Seasonal Models
Lecture 19 Box-Jenkins Seasonal Models If the time series is nonstationary with respect to its variance, then we can stabilize the variance of the time series by using a pre-differencing transformation.
More informationEconometric Forecasting
Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 1, 2014 Outline Introduction Model-free extrapolation Univariate time-series models Trend
More informationTime Series: Theory and Methods
Peter J. Brockwell Richard A. Davis Time Series: Theory and Methods Second Edition With 124 Illustrations Springer Contents Preface to the Second Edition Preface to the First Edition vn ix CHAPTER 1 Stationary
More informationA time series is a set of observations made sequentially in time.
Time series and spectral analysis Peter F. Craigmile Analyzing time series A time series is a set of observations made sequentially in time. R. A. Fisher: One damned thing after another. Time series analysis
More information7 Introduction to Time Series Time Series vs. Cross-Sectional Data Detrending Time Series... 15
Econ 495 - Econometric Review 1 Contents 7 Introduction to Time Series 3 7.1 Time Series vs. Cross-Sectional Data............ 3 7.2 Detrending Time Series................... 15 7.3 Types of Stochastic
More informationGaussian processes. Basic Properties VAG002-
Gaussian processes The class of Gaussian processes is one of the most widely used families of stochastic processes for modeling dependent data observed over time, or space, or time and space. The popularity
More informationX random; interested in impact of X on Y. Time series analogue of regression.
Multiple time series Given: two series Y and X. Relationship between series? Possible approaches: X deterministic: regress Y on X via generalized least squares: arima.mle in SPlus or arima in R. We have
More informationCovariance Stationary Time Series. Example: Independent White Noise (IWN(0,σ 2 )) Y t = ε t, ε t iid N(0,σ 2 )
Covariance Stationary Time Series Stochastic Process: sequence of rv s ordered by time {Y t } {...,Y 1,Y 0,Y 1,...} Defn: {Y t } is covariance stationary if E[Y t ]μ for all t cov(y t,y t j )E[(Y t μ)(y
More informationEmpirical Market Microstructure Analysis (EMMA)
Empirical Market Microstructure Analysis (EMMA) Lecture 3: Statistical Building Blocks and Econometric Basics Prof. Dr. Michael Stein michael.stein@vwl.uni-freiburg.de Albert-Ludwigs-University of Freiburg
More informationLecture 1: Fundamental concepts in Time Series Analysis (part 2)
Lecture 1: Fundamental concepts in Time Series Analysis (part 2) Florian Pelgrin University of Lausanne, École des HEC Department of mathematics (IMEA-Nice) Sept. 2011 - Jan. 2012 Florian Pelgrin (HEC)
More informationCh 4. Models For Stationary Time Series. Time Series Analysis
This chapter discusses the basic concept of a broad class of stationary parametric time series models the autoregressive moving average (ARMA) models. Let {Y t } denote the observed time series, and {e
More informationLevinson Durbin Recursions: I
Levinson Durbin Recursions: I note: B&D and S&S say Durbin Levinson but Levinson Durbin is more commonly used (Levinson, 1947, and Durbin, 1960, are source articles sometimes just Levinson is used) recursions
More informationTime Series 2. Robert Almgren. Sept. 21, 2009
Time Series 2 Robert Almgren Sept. 21, 2009 This week we will talk about linear time series models: AR, MA, ARMA, ARIMA, etc. First we will talk about theory and after we will talk about fitting the models
More informationStochastic Processes
Stochastic Processes Stochastic Process Non Formal Definition: Non formal: A stochastic process (random process) is the opposite of a deterministic process such as one defined by a differential equation.
More informationLevinson Durbin Recursions: I
Levinson Durbin Recursions: I note: B&D and S&S say Durbin Levinson but Levinson Durbin is more commonly used (Levinson, 1947, and Durbin, 1960, are source articles sometimes just Levinson is used) recursions
More informationSTAT Financial Time Series
STAT 6104 - Financial Time Series Chapter 4 - Estimation in the time Domain Chun Yip Yau (CUHK) STAT 6104:Financial Time Series 1 / 46 Agenda 1 Introduction 2 Moment Estimates 3 Autoregressive Models (AR
More information