Stochastic Processes
1 Stochastic Processes: I

consider "bowl of worms" model for oscilloscope experiment:

[figure: SAPAscope 2.0 screenshot of the oscilloscope experiment]

SAPA2e 22, 23 | II-1
2 Stochastic Processes: II

stochastic process is:
- informally: bowl + drawing mechanism
- formally: family of random variables (RVs) indexed by $t$

$t$ called time for convenience

real-valued RV: mapping from sample space to real line

denote individual real-valued RVs by $X_t$ or $X(t)$

notation:
- discrete parameter stochastic process: $\{X_t\} = \{X_t : t \in \mathbb{Z}\}$, where $\mathbb{Z} = \{\ldots, -2, -1, 0, 1, 2, \ldots\}$
- continuous parameter stochastic process: $\{X(t)\} = \{X(t) : t \in \mathbb{R}\}$, where $\mathbb{R} = \{t : -\infty < t < \infty\}$

SAPA2e 22, 23, 24 | II-2
3 Basic Theory for Stochastic Processes: I

cumulative probability distribution function (CPDF):

$F_t(a) \equiv P[X_t \le a]$  (note dependence on $t$)

can define bivariate CPDF (and multivariate CPDF):

$F_{t_1, t_2}(a_1, a_2) \equiv P[X_{t_1} \le a_1,\ X_{t_2} \le a_2]$

too complicated to use as models for time series

summarize using moments (assuming they exist)

SAPA2e 24, 25 | II-3
4 Basic Theory for Stochastic Processes: II

first moment:

$\mu_t \equiv E\{X_t\} = \int_{-\infty}^{\infty} x \, dF_t(x)$  (see pages 26-7)
  $= \int_{-\infty}^{\infty} x f_t(x) \, dx$ if $f_t(x) = dF_t(x)/dx$ exists
  $= \sum_i x_i \, p(x_i)$ if $F_t(\cdot)$ is a step function

second-order central moments:

$\sigma_{t_0, t_1} \equiv \mathrm{cov}\{X_{t_0}, X_{t_1}\} = E\{(X_{t_0} - \mu_{t_0})(X_{t_1} - \mu_{t_1})\}$

$\sigma_t^2 \equiv \mathrm{var}\{X_t\} = \int_{-\infty}^{\infty} (x - \mu_t)^2 \, dF_t(x)$

still too complicated!

SAPA2e 25, 26 | II-4
5 Simplification: Stationary Processes

$\{X_t\}$ is by definition stationary if:

1. $E\{X_t\} = \mu$, a finite constant independent of $t$
2. $\mathrm{cov}\{X_{t+\tau}, X_t\} = s_\tau$, a finite constant that can depend on $\tau$ (known as the lag), but not $t$

autocovariance sequence (ACVS) for $\{X_t\}$: $\{s_\tau\} = \{s_\tau : \tau \in \mathbb{Z}\}$

autocovariance function (ACVF) for $\{X(t)\}$: $s(\cdot)$ = function of $\tau$ defined by $s(\tau)$ for $\tau \in \mathbb{R}$

several terms for this type of stationarity: second-order, covariance, wide sense, weak

other type of stationarity: complete (strong, strict) stationarity

SAPA2e 28, 29 | II-5
6 Basic Properties: I

$\mathrm{var}\{X_t\} = \mathrm{cov}\{X_t, X_t\} = s_0$ (i.e., variance constant over $t$)

$s_{-\tau} = s_\tau$ (i.e., symmetric about $\tau = 0$)

correlation between $X_t$ and $X_{t+\tau}$:

$\mathrm{corr}\{X_{t+\tau}, X_t\} \equiv \frac{\mathrm{cov}\{X_{t+\tau}, X_t\}}{\sqrt{\mathrm{var}\{X_t\}\,\mathrm{var}\{X_{t+\tau}\}}} = \frac{\mathrm{cov}\{X_{t+\tau}, X_t\}}{\mathrm{var}\{X_t\}} = \frac{s_\tau}{s_0} \equiv \rho_\tau$

autocorrelation sequence (ACS) for $\{X_t\}$: $\{\rho_\tau\} = \{\rho_\tau : \tau \in \mathbb{Z}\}$

autocorrelation function (ACF) for $\{X(t)\}$: $\rho(\cdot)$ = function defined by $\rho(\tau)$ for $\tau \in \mathbb{R}$

$|\rho_\tau| \le 1$ implies $|s_\tau| \le s_0$

SAPA2e 29 | II-6
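The ACVS and ACS above are properties of the process itself; in practice they must be estimated from a single realization. The slides do not give an estimator, so the following is a sketch using the common biased (divide-by-$N$) estimator, which conveniently preserves $|\hat\rho_\tau| \le 1$; numpy, the helper name `sample_acvs`, and the seed are our own choices:

```python
import numpy as np

def sample_acvs(x, max_lag):
    """Biased sample ACVS: s_hat_tau = (1/N) * sum_t (x_{t+tau} - xbar)(x_t - xbar)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    return np.array([np.dot(xc[tau:], xc[:n - tau]) / n for tau in range(max_lag + 1)])

rng = np.random.default_rng(42)
x = rng.standard_normal(1000)   # white noise realization for illustration
s_hat = sample_acvs(x, 5)
rho_hat = s_hat / s_hat[0]      # sample ACS; rho_hat[0] is always 1
```

Because the divide-by-$N$ estimator yields a positive semidefinite sequence, the sample ACS obeys the same $|\rho_\tau| \le 1$ bound as the theoretical one.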
7 Basic Properties: II

$\{s_\tau\}$ is positive semidefinite, i.e.,

$\sum_{j=0}^{n-1} \sum_{k=0}^{n-1} s_{t_j - t_k}\, a_j a_k \ge 0 \qquad (1)$

holds for any positive integer $n$, any set of $n$ real numbers $a_0, \ldots, a_{n-1}$ and any set of $n$ integers $t_0, \ldots, t_{n-1}$

proof: for $W \equiv \sum_{j=0}^{n-1} a_j X_{t_j} = \mathbf{a}^T \mathbf{V}$, have

$0 \le \mathrm{var}\{W\} = \mathrm{var}\{\mathbf{a}^T \mathbf{V}\} \stackrel{[2.2]}{=} \mathbf{a}^T \Sigma \mathbf{a}$

$\Sigma$ is var/cov matrix for $\mathbf{V}$, i.e., its $(j,k)$th element is $\mathrm{cov}\{X_{t_j}, X_{t_k}\} = s_{t_j - t_k}$

$\mathbf{a}^T \Sigma \mathbf{a}$ = left-hand side of (1)

(1) imposes severe limitation (Makhoul, 1990)

SAPA2e 29, 30 | II-7
8 Basic Properties: III

var/cov matrix for $[X_0, \ldots, X_{N-1}]^T$ is Toeplitz; when $N = 6$, have

$\begin{bmatrix} s_0 & s_1 & s_2 & s_3 & s_4 & s_5 \\ s_1 & s_0 & s_1 & s_2 & s_3 & s_4 \\ s_2 & s_1 & s_0 & s_1 & s_2 & s_3 \\ s_3 & s_2 & s_1 & s_0 & s_1 & s_2 \\ s_4 & s_3 & s_2 & s_1 & s_0 & s_1 \\ s_5 & s_4 & s_3 & s_2 & s_1 & s_0 \end{bmatrix}$

if $\{X_t\}$ Gaussian, completely characterized by $\mu$ and $\{s_\tau\}$

extension: complex-valued stationary process $\{Z_t\}$

$\mathrm{cov}\{Z_{t+\tau}, Z_t\} \equiv E\{(Z_{t+\tau} - \mu)(Z_t - \mu)^*\} = s_\tau$

(asterisk indicates complex conjugate)

now have $s_{-\tau} = s_\tau^*$ (rather than $s_{-\tau} = s_\tau$)

SAPA2e 30, 31 | II-8
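One concrete way to see the positive semidefiniteness condition at work is to form the $N = 6$ Toeplitz var/cov matrix from a valid ACVS and inspect its eigenvalues. A sketch with ACVS values of our own choosing (the lag-0/lag-1 pattern of an MA(1) process with $\theta = 0.5$ and unit innovation variance; numpy is assumed):

```python
import numpy as np

# ACVS values chosen for illustration: s_0 = 1 + theta^2, s_1 = -theta for theta = 0.5,
# all higher lags zero (the MA(1) pattern); not taken from the slides
N = 6
s = np.zeros(N)
s[0] = 1.25
s[1] = -0.5

# Toeplitz var/cov matrix for [X_0, ..., X_{N-1}]^T: (j,k) element is s_{|j-k|}
Sigma = np.array([[s[abs(j - k)] for k in range(N)] for j in range(N)])

# positive semidefiniteness of {s_tau} shows up as nonnegative eigenvalues of Sigma
eigvals = np.linalg.eigvalsh(Sigma)
```

Any finite section of a positive semidefinite ACVS gives a positive semidefinite Toeplitz matrix; an arbitrary symmetric sequence would not pass this eigenvalue check.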
9 Example: White Noise Process

assume $E\{X_t\} = \mu$ and $\mathrm{var}\{X_t\} = \sigma^2 < \infty$

$\{X_t\}$ called white noise if, when $\tau \ne 0$, $X_{t+\tau}$ and $X_t$ are uncorrelated; i.e., $\mathrm{cov}\{X_{t+\tau}, X_t\} = 0$

white noise is stationary process with ACVS

$s_\tau = \begin{cases} \sigma^2, & \tau = 0; \\ 0, & \text{otherwise} \end{cases}$

six realizations of white noise follow, all with $\mu = 0$ and $\sigma^2 = 1$

SAPA2e 32, 33 | II-9
10 White Noise with Gaussian Distribution

[figure: realization plotted versus $t$]

SAPA2e 33 | II-10

11 White Noise with Uniform Distribution

[figure: realization plotted versus $t$]

SAPA2e 33 | II-11

12 White Noise with Double Exponential Distribution

[figure: realization plotted versus $t$]

SAPA2e 33 | II-12

13 White Noise with Discrete Distribution

[figure: realization plotted versus $t$]

SAPA2e 33 | II-13

14 White Noise with Random Distributions

[figure: realization plotted versus $t$]

SAPA2e 33 | II-14

15 White Noise with Blocky Distribution

[figure: realization plotted versus $t$]

SAPA2e 33 | II-15
16 White Noise and IID Noise

all six examples have exactly the same second-order properties!

as the fifth and sixth examples illustrate, the distributions of $X_t$ and $X_{t+1}$ need not be the same for second-order stationarity to hold

related concept: IID noise

$\{X_t : t \in \mathbb{Z}\}$ are independent and identically distributed

all IID noise with finite variance is white noise, but the converse is false

SAPA2e 33 | II-16
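A quick illustration of white-but-not-IID noise in the spirit of the fifth and sixth examples: alternate between two different zero-mean, unit-variance distributions, so the terms are uncorrelated but not identically distributed. The specific pair of distributions, sample size, and seed are our choices, not the slides':

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10000

# even-indexed terms Gaussian, odd-indexed terms uniform; both zero mean, variance 1,
# so the series is white noise even though it is not IID
x = np.empty(n)
x[0::2] = rng.standard_normal(n // 2)
x[1::2] = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), n // 2)  # var = (2*sqrt(3))^2 / 12 = 1

xc = x - x.mean()
s0_hat = np.mean(xc**2)             # should be near sigma^2 = 1
s1_hat = np.mean(xc[1:] * xc[:-1])  # should be near 0: uncorrelated at lag 1
```

The second-order summaries cannot distinguish this series from Gaussian white noise, which is exactly the point of the slide.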
17 Example: Moving Average Process: I

let $\{\varepsilon_t\}$ be white noise with mean 0 and variance $\sigma_\varepsilon^2$

construct MA($q$) process:

$X_t = \mu - \sum_{j=0}^{q} \theta_{q,j}\, \varepsilon_{t-j} = \mu - \sum_{j=-\infty}^{\infty} \theta_{q,j}\, \varepsilon_{t-j}$,

where $\theta_{q,0} \ne 0$, $\theta_{q,q} \ne 0$ and $\theta_{q,j} = 0$ for $j < 0$ or $j > q$

claim: $\{X_t\}$ is stationary, which follows if we can show that
- $E\{X_t\}$ does not depend on $t$
- $\mathrm{cov}\{X_{t+\tau}, X_t\}$ does not depend on $t$ (but can depend on $\tau$)

now $E\{X_t\} = \mu$ (why is this true?)

SAPA2e 34 | II-17
18 Example: Moving Average Process: II

since $X_t - \mu = -\sum_j \theta_{q,j}\, \varepsilon_{t-j}$, have

$\mathrm{cov}\{X_{t+\tau}, X_t\} = E\left\{ \left( \sum_k \theta_{q,k}\, \varepsilon_{t+\tau-k} \right) \left( \sum_j \theta_{q,j}\, \varepsilon_{t-j} \right) \right\}$
$= \sum_k \sum_j \theta_{q,k}\, \theta_{q,j}\, E\{\varepsilon_{t+\tau-k}\, \varepsilon_{t-j}\}$
$= \sigma_\varepsilon^2 \sum_j \theta_{q,j+\tau}\, \theta_{q,j} \equiv s_\tau$ (why?)

note: no restrictions on the $\theta_{q,j}$'s

SAPA2e 34 | II-18
19 Example: Moving Average Process: III

in terms of just $\theta_{q,0}, \ldots, \theta_{q,q}$, can reexpress ACVS as

$s_\tau = \begin{cases} \sigma_\varepsilon^2 \sum_{j=0}^{q-|\tau|} \theta_{q,j+|\tau|}\, \theta_{q,j}, & |\tau| \le q; \\ 0, & \text{otherwise} \end{cases}$

note: ACVS for MA($q$) process is identically zero when $|\tau| > q$

note: variance of MA($q$) process is $s_0 = \sigma_\varepsilon^2 \sum_{j=0}^{q} \theta_{q,j}^2$

two realizations of MA(1) process $X_t = \varepsilon_t - \theta_{1,1}\, \varepsilon_{t-1}$ constructed from Gaussian white noise $\{\varepsilon_t\}$ follow, one with $\theta_{1,1} = +1$, and the other, $\theta_{1,1} = -1$

SAPA2e 34 | II-19
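For the MA(1) process $X_t = \varepsilon_t - \theta_{1,1}\varepsilon_{t-1}$, the formula on this slide gives $s_0 = \sigma_\varepsilon^2(1 + \theta_{1,1}^2)$, $s_1 = -\sigma_\varepsilon^2\theta_{1,1}$, and $s_\tau = 0$ for $|\tau| > 1$. A simulation sketch checking this against sample moments; numpy, the series length, and the seed are our choices:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200000  # long realization so sample moments settle down

# MA(1) with the slides' sign convention: X_t = eps_t - theta_{1,1} * eps_{t-1}
theta, var_eps = 1.0, 1.0
eps = np.sqrt(var_eps) * rng.standard_normal(n + 1)
x = eps[1:] - theta * eps[:-1]

# theoretical ACVS for this MA(1)
s0_theory = var_eps * (1 + theta**2)   # 2.0
s1_theory = -var_eps * theta           # -1.0, so rho_1 = s_1/s_0 = -1/2

# sample counterparts (mean is 0 by construction)
s0_hat = np.mean(x * x)
s1_hat = np.mean(x[1:] * x[:-1])
s2_hat = np.mean(x[2:] * x[:-2])       # should be near 0: ACVS vanishes beyond lag q = 1
```

With $\theta_{1,1} = +1$ this reproduces the $\rho_1 = -1/2$ quoted on the next slide.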
20 MA(1) Process with $\theta_{1,1} = +1$; $\rho_1 = s_1/s_0 = -1/2$

[figure: realization plotted versus $t$]

SAPA2e 35 | II-20

21 MA(1) Process with $\theta_{1,1} = -1$; $\rho_1 = 1/2$

[figure: realization plotted versus $t$]

SAPA2e 35 | II-21
22 Example: Autoregressive Process

let $\{\varepsilon_t\}$ be white noise with mean 0 and variance $\sigma_p^2$

construct AR($p$) process:

$X_t = \mu + \sum_{j=1}^{p} \phi_{p,j}(X_{t-j} - \mu) + \varepsilon_t$ with $\phi_{p,p} \ne 0$

claim: $\{X_t\}$ is stationary under restrictions on the $\phi_{p,j}$'s

proof: Priestley (1981)

rich class of processes (Chapter 8)

extension: ARMA($p$,$q$) process (Box & Jenkins, 1970)

SAPA2e 34, 35, 37 | II-22
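Realizations like those on the following AR(2) slides can be generated by direct recursion on the defining equation. A sketch assuming unit innovation variance and $\mu = 0$; numpy, the burn-in length, series length, and seed are our own choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_burn = 5000, 500

# AR(2) coefficients from the following slides: X_t = 0.75 X_{t-1} - 0.5 X_{t-2} + eps_t
phi1, phi2 = 0.75, -0.5
eps = rng.standard_normal(n + n_burn)
x = np.zeros(n + n_burn)
for t in range(2, n + n_burn):
    x[t] = phi1 * x[t - 1] + phi2 * x[t - 2] + eps[t]

# discard the burn-in so the retained stretch is approximately stationary
# (the zero initial conditions are not draws from the stationary distribution)
x = x[n_burn:]
```

These coefficients satisfy the stationarity restrictions alluded to above, so the recursion settles down rather than exploding.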
23 Second Order Autoregressive Process

1st example of realization of AR(2) process $X_t = 0.75 X_{t-1} - 0.5 X_{t-2} + \varepsilon_t$ with $\sigma_2^2 = 1$

[figure: realization plotted versus $t$]

SAPA2e 36 | II-23

24 Second Order Autoregressive Process

2nd example of realization of AR(2) process $X_t = 0.75 X_{t-1} - 0.5 X_{t-2} + \varepsilon_t$ with $\sigma_2^2 = 1$

[figure: realization plotted versus $t$]

SAPA2e 36 | II-23

25 Second Order Autoregressive Process

3rd example of realization of AR(2) process $X_t = 0.75 X_{t-1} - 0.5 X_{t-2} + \varepsilon_t$ with $\sigma_2^2 = 1$

[figure: realization plotted versus $t$]

SAPA2e 36 | II-23

26 Second Order Autoregressive Process

4th example of realization of AR(2) process $X_t = 0.75 X_{t-1} - 0.5 X_{t-2} + \varepsilon_t$ with $\sigma_2^2 = 1$

[figure: realization plotted versus $t$]

SAPA2e 36 | II-23
27 Fourth Order Autoregressive Process

1st example of realization of AR(4) process (here $\sigma_4^2 = 0.002$)

$X_t = 2.7607 X_{t-1} - 3.8106 X_{t-2} + 2.6535 X_{t-3} - 0.9238 X_{t-4} + \varepsilon_t$

[figure: realization plotted versus $t$]

SAPA2e 36 | II-24

28 Fourth Order Autoregressive Process

2nd example of realization of AR(4) process (here $\sigma_4^2 = 0.002$)

$X_t = 2.7607 X_{t-1} - 3.8106 X_{t-2} + 2.6535 X_{t-3} - 0.9238 X_{t-4} + \varepsilon_t$

[figure: realization plotted versus $t$]

SAPA2e 36 | II-24

29 Fourth Order Autoregressive Process

3rd example of realization of AR(4) process (here $\sigma_4^2 = 0.002$)

$X_t = 2.7607 X_{t-1} - 3.8106 X_{t-2} + 2.6535 X_{t-3} - 0.9238 X_{t-4} + \varepsilon_t$

[figure: realization plotted versus $t$]

SAPA2e 36 | II-24

30 Fourth Order Autoregressive Process

4th example of realization of AR(4) process (here $\sigma_4^2 = 0.002$)

$X_t = 2.7607 X_{t-1} - 3.8106 X_{t-2} + 2.6535 X_{t-3} - 0.9238 X_{t-4} + \varepsilon_t$

[figure: realization plotted versus $t$]

SAPA2e 36 | II-24
31 Example: Harmonic Process: I

for fixed $L > 0$, let $A_1, \ldots, A_L, B_1, \ldots, B_L$ be a set of uncorrelated zero mean RVs such that $\mathrm{var}\{A_l\} = \mathrm{var}\{B_l\} = \sigma_l^2$

construct harmonic process:

$X_t = \mu + \sum_{l=1}^{L} A_l \cos(2\pi f_l t) + B_l \sin(2\pi f_l t)$,

where $\mu$ and the $f_l$'s are real-valued constants

can reexpress harmonic process as

$X_t = \mu + \sum_{l=1}^{L} D_l \cos(2\pi f_l t + \phi_l)$,

where $D_l^2 = A_l^2 + B_l^2$ and $\tan(\phi_l) = -B_l/A_l$

note: also have $A_l = D_l \cos(\phi_l)$ and $B_l = -D_l \sin(\phi_l)$

SAPA2e 37 | II-25
32 First Order Gaussian Harmonic Process

1st example of realization of harmonic process $X_t = A \cos(2\pi f_1 t) + B \sin(2\pi f_1 t)$, where $A$ & $B$ are independent Gaussian RVs with unit variance

[figure: realization plotted versus $t$]

SAPA2e 38 | II-26

33 First Order Gaussian Harmonic Process

2nd example of realization of harmonic process $X_t = A \cos(2\pi f_1 t) + B \sin(2\pi f_1 t)$, where $A$ & $B$ are independent Gaussian RVs with unit variance

[figure: realization plotted versus $t$]

SAPA2e 38 | II-26

34 First Order Gaussian Harmonic Process

3rd example of realization of harmonic process $X_t = A \cos(2\pi f_1 t) + B \sin(2\pi f_1 t)$, where $A$ & $B$ are independent Gaussian RVs with unit variance

[figure: realization plotted versus $t$]

SAPA2e 38 | II-26
35 Example: Harmonic Process: II

claim: harmonic process

$X_t = \mu + \sum_{l=1}^{L} A_l \cos(2\pi f_l t) + B_l \sin(2\pi f_l t)$

is stationary (?!)

proof is subject of Exercise [38], which shows ACVS to be

$s_\tau = \sum_{l=1}^{L} \sigma_l^2 \cos(2\pi f_l \tau)$

(same $f_l$'s as making up $X_t$, but cosine terms are all "in phase")

SAPA2e 37, 38 | II-27
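Because the claim concerns the process (not one realization), the ACVS can be checked by averaging across an ensemble of realizations at two different values of $t$. A Monte Carlo sketch for $L = 1$ with $\mu = 0$; the frequency, variance, lag, ensemble size, Gaussian choice for $A$ and $B$, and seed are all ours:

```python
import numpy as np

rng = np.random.default_rng(3)
reps = 200000                  # ensemble size
f, sig2, tau = 0.1, 1.0, 3     # frequency, common variance, lag

# L = 1 harmonic process: X_t = A cos(2 pi f t) + B sin(2 pi f t),
# A and B uncorrelated zero-mean RVs with variance sig2 (Gaussian here)
A = np.sqrt(sig2) * rng.standard_normal(reps)
B = np.sqrt(sig2) * rng.standard_normal(reps)

def X(t):
    return A * np.cos(2 * np.pi * f * t) + B * np.sin(2 * np.pi * f * t)

# ensemble estimates of cov{X_{t+tau}, X_t} at two different t values;
# the claimed ACVS value is sig2 * cos(2 pi f tau), the same for every t
theory = sig2 * np.cos(2 * np.pi * f * tau)
cov_hat_t0 = np.mean(X(0 + tau) * X(0))
cov_hat_t5 = np.mean(X(5 + tau) * X(5))
```

Both ensemble estimates should agree with each other and with $\sigma_1^2 \cos(2\pi f_1 \tau)$, illustrating that the covariance depends on the lag but not on $t$.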
36 Example: Harmonic Process: III

specialize to case where RVs $A_l$ and $B_l$ are Gaussian

recall: if $Y_0, Y_1, \ldots, Y_{\nu-1}$ are IID zero mean Gaussian RVs with unit variance, then $\chi_\nu^2 \equiv Y_0^2 + Y_1^2 + \cdots + Y_{\nu-1}^2$ is said to have a chi-square distribution with $\nu$ degrees of freedom

since $A_l/\sigma_l$ and $B_l/\sigma_l$ are independent zero mean, unit variance Gaussian RVs, $D_l^2/\sigma_l^2 = (A_l^2 + B_l^2)/\sigma_l^2$ has a chi-square distribution with 2 degrees of freedom

probability density function (PDF) for a $\chi_2^2$ RV is given by

$f_{\chi_2^2}(u) = \begin{cases} e^{-u/2}/2, & u \ge 0; \\ 0, & u < 0 \end{cases}$

SAPA2e 38, 39 | II-28
37 Example: Harmonic Process: IV

can deduce PDF for $D_l^2$ from fact that PDF for $D_l^2/\sigma_l^2$ is $\chi_2^2$:

$f_{D_l^2}(u) = \begin{cases} e^{-u/(2\sigma_l^2)}/(2\sigma_l^2), & u \ge 0; \\ 0, & u < 0 \end{cases}$

above is special case of exponential PDF $f(u) = e^{-u/\lambda}/\lambda$ with mean value $\lambda = 2\sigma_l^2$

random amplitude $D_l$ is thus the square root of an exponential RV and is said to obey a Rayleigh distribution

symmetry of bivariate distribution for $A_l$ & $B_l$ dictates $\phi_l$ to be uniformly distributed over $(-\pi, \pi]$ and independent of $D_l$

thus Gaussian harmonic processes involve Rayleigh distributed $D_l$'s and uniformly distributed $\phi_l$'s, with all $2L$ RVs being independent of one another

SAPA2e 39 | II-29
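The exponential claim is easy to spot-check by simulation: draw many $(A_l, B_l)$ pairs, form $D_l^2 = A_l^2 + B_l^2$, and compare its sample mean and tail probability with the exponential values $\lambda = 2\sigma_l^2$ and $P[D_l^2 > u] = e^{-u/\lambda}$. The sample size, $\sigma_l$, threshold $u$, and seed are our choices:

```python
import numpy as np

rng = np.random.default_rng(5)
reps = 200000
sig = 1.5           # sigma_l
sig2 = sig**2

# D^2 = A^2 + B^2 with A, B independent N(0, sigma_l^2):
# exponential with mean 2*sigma_l^2, so D itself is Rayleigh
A = sig * rng.standard_normal(reps)
B = sig * rng.standard_normal(reps)
D2 = A**2 + B**2

mean_hat = D2.mean()                    # should approach 2*sig2 = 4.5
u = 3.0
tail_hat = np.mean(D2 > u)              # empirical P[D^2 > u]
tail_theory = np.exp(-u / (2 * sig2))   # exponential tail exp(-u/lambda)
```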
38 Example: Harmonic Process: V

now consider

$X_t = \mu + \sum_{l=1}^{L} D_l \cos(2\pi f_l t + \phi_l)$,

where the $D_l$'s are positive constants, while the $\phi_l$'s are IID RVs uniformly distributed over $(-\pi, \pi]$

$X_t$ is necessarily bounded and hence cannot have a Gaussian distribution

SAPA2e 39 | II-30
39 First-Order Non-Gaussian Process

1st example of realization of process $X_t = \sqrt{2}\, \cos(2\pi f_1 t + \phi)$, where $\phi$ is uniformly distributed over interval $(-\pi, \pi]$

[figure: realization plotted versus $t$]

SAPA2e 38 | II-31

40 First-Order Non-Gaussian Process

2nd example of realization of process $X_t = \sqrt{2}\, \cos(2\pi f_1 t + \phi)$, where $\phi$ is uniformly distributed over interval $(-\pi, \pi]$

[figure: realization plotted versus $t$]

SAPA2e 38 | II-31

41 First-Order Non-Gaussian Process

3rd example of realization of process $X_t = \sqrt{2}\, \cos(2\pi f_1 t + \phi)$, where $\phi$ is uniformly distributed over interval $(-\pi, \pi]$

[figure: realization plotted versus $t$]

SAPA2e 38 | II-31
42 Example: Harmonic Process: VI

claim: $X_t = \mu + \sum_{l=1}^{L} D_l \cos(2\pi f_l t + \phi_l)$ is a stationary process

to prove when $L = 1$, i.e., $X_t = \mu + D \cos(2\pi f t + \phi)$, need to show $E\{X_t\}$ & $\mathrm{cov}\{X_{t+\tau}, X_t\}$ do not depend on $t$

now

$E\{X_t\} = \mu + D\, E\{\cos(2\pi f t + \phi)\} = \mu + D \int_{-\pi}^{\pi} \cos(2\pi f t + \phi)\, \frac{1}{2\pi}\, d\phi = \mu$ (why?)

SAPA2e 39 | II-32
43 Example: Harmonic Process: VII

since $X_t - \mu = D \cos(2\pi f t + \phi)$, have

$\mathrm{cov}\{X_{t+\tau}, X_t\} = E\{(D \cos(2\pi f[t+\tau] + \phi))(D \cos(2\pi f t + \phi))\}$
$= D^2 \int_{-\pi}^{\pi} \cos(2\pi f[t+\tau] + \phi)\, \cos(2\pi f t + \phi)\, \frac{1}{2\pi}\, d\phi$
$\stackrel{\text{trig}}{=} \frac{D^2}{4\pi} \int_{-\pi}^{\pi} \cos(2\pi f[2t+\tau] + 2\phi) + \cos(2\pi f \tau)\, d\phi$
$= D^2 \cos(2\pi f \tau)/2 \equiv s_\tau$

thus can dispense with random amplitudes!

basis for spectral representation theorem (Chapter 4)

Exercises [2.20] & [2.21]: non-uniformly distributed $\phi$ doesn't work, and neither do uncorrelated uniformly distributed $\phi_l$'s

SAPA2e 39, 40, 49, 50 | II-33
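The fixed-amplitude, random-phase result above can also be spot-checked by ensemble averaging. A Monte Carlo sketch; the frequency, lag, ensemble size, and seed are our choices, with the amplitude $D = \sqrt{2}$ matching the earlier non-Gaussian realizations:

```python
import numpy as np

rng = np.random.default_rng(11)
reps = 200000          # ensemble size
f, tau = 0.2, 4        # frequency and lag
D = np.sqrt(2.0)       # fixed (non-random) amplitude

# fixed amplitude, random phase: X_t = D cos(2 pi f t + phi), phi ~ Uniform(-pi, pi]
phi = rng.uniform(-np.pi, np.pi, reps)

def X(t):
    return D * np.cos(2 * np.pi * f * t + phi)

# the derivation gives E{X_t} = 0 and cov{X_{t+tau}, X_t} = D^2 cos(2 pi f tau)/2,
# both independent of t
mean_hat = np.mean(X(0))
theory = D**2 * np.cos(2 * np.pi * f * tau) / 2
cov_hat_t0 = np.mean(X(0 + tau) * X(0))
cov_hat_t7 = np.mean(X(7 + tau) * X(7))
```

The agreement at two different $t$ values illustrates why the random phase alone, without random amplitudes, suffices for stationarity.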
44 Stationary Processes as Models: I

stationarity is a property of models, not of data

spectral analysis assumes stationarity

need to examine assumption for each time series

example: spinning rotor series (Figure 41)

[figure: series plotted versus bin number]

SAPA2e 41, 42 | II-34
45 Stationary Processes as Models: II

can detrend using linear regression model: $X_t = \alpha + \beta t + Y_t$

letting $\hat\alpha$ and $\hat\beta$ denote least squares estimates, detrended series gotten from $\hat{Y}_t = X_t - (\hat\alpha + \hat\beta t)$

[figure: detrended series plotted versus bin number]

SAPA2e 41, 42 | II-35
46 Stationary Processes as Models: III

can also get detrended series by first differencing (filtering):

$X_t^{(1)} \equiv X_t - X_{t-1} = \beta + Y_t^{(1)}$, where $Y_t^{(1)} \equiv Y_t - Y_{t-1}$

[figure: differenced series plotted versus bin number]

SAPA2e 41, 42 | II-36
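The two detrending routes on these slides, least squares regression and first differencing, can be sketched on synthetic data. The trend parameters, noise model (white noise standing in for $Y_t$), series length, and seed are our choices:

```python
import numpy as np

rng = np.random.default_rng(9)
n = 200
t = np.arange(n)
alpha, beta = 10.0, 0.05

# trend plus stationary noise: X_t = alpha + beta*t + Y_t (Y_t white noise here)
y = rng.standard_normal(n)
x = alpha + beta * t + y

# detrend 1: least squares fit, Y_hat_t = X_t - (alpha_hat + beta_hat * t)
beta_hat, alpha_hat = np.polyfit(t, x, 1)   # polyfit returns highest degree first
y_hat = x - (alpha_hat + beta_hat * t)

# detrend 2: first difference, X_t - X_{t-1} = beta + (Y_t - Y_{t-1})
dx = np.diff(x)
```

Note the difference in what stationary series results: regression leaves $\hat{Y}_t \approx Y_t$, while differencing leaves $\beta + Y_t^{(1)}$, whose noise part is an MA(1)-type filtered version of $Y_t$.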
47 Stationary Processes as Models: IV

second example: standard resistor series (Figure 43)

[figure: series plotted versus days (from 1/1/80)]

SAPA2e 43 | II-37
More information3 ARIMA Models. 3.1 Introduction
3 ARIMA Models 3. Introduction In Chapters and, we introduced autocorrelation and cross-correlation functions (ACFs and CCFs) as tools for clarifying relations that may occur within and between time series
More informationTime series and spectral analysis. Peter F. Craigmile
Time series and spectral analysis Peter F. Craigmile http://www.stat.osu.edu/~pfc/ Summer School on Extreme Value Modeling and Water Resources Universite Lyon 1, France. 13-24 Jun 2016 Thank you to The
More informationfor valid PSD. PART B (Answer all five units, 5 X 10 = 50 Marks) UNIT I
Code: 15A04304 R15 B.Tech II Year I Semester (R15) Regular Examinations November/December 016 PROBABILITY THEY & STOCHASTIC PROCESSES (Electronics and Communication Engineering) Time: 3 hours Max. Marks:
More informationGaussian Random Variables Why we Care
Gaussian Random Variables Why we Care I Gaussian random variables play a critical role in modeling many random phenomena. I By central limit theorem, Gaussian random variables arise from the superposition
More informationECE353: Probability and Random Processes. Lecture 18 - Stochastic Processes
ECE353: Probability and Random Processes Lecture 18 - Stochastic Processes Xiao Fu School of Electrical Engineering and Computer Science Oregon State University E-mail: xiao.fu@oregonstate.edu From RV
More informationA time series is a set of observations made sequentially in time.
Time series and spectral analysis Peter F. Craigmile Analyzing time series A time series is a set of observations made sequentially in time. R. A. Fisher: One damned thing after another. Time series analysis
More informationLong-range dependence
Long-range dependence Kechagias Stefanos University of North Carolina at Chapel Hill May 23, 2013 Kechagias Stefanos (UNC) Long-range dependence May 23, 2013 1 / 45 Outline 1 Introduction to time series
More information3.0 PROBABILITY, RANDOM VARIABLES AND RANDOM PROCESSES
3.0 PROBABILITY, RANDOM VARIABLES AND RANDOM PROCESSES 3.1 Introduction In this chapter we will review the concepts of probabilit, rom variables rom processes. We begin b reviewing some of the definitions
More informationRoss Bettinger, Analytical Consultant, Seattle, WA
ABSTRACT DYNAMIC REGRESSION IN ARIMA MODELING Ross Bettinger, Analytical Consultant, Seattle, WA Box-Jenkins time series models that contain exogenous predictor variables are called dynamic regression
More informationLecture 2: ARMA(p,q) models (part 2)
Lecture 2: ARMA(p,q) models (part 2) Florian Pelgrin University of Lausanne, École des HEC Department of mathematics (IMEA-Nice) Sept. 2011 - Jan. 2012 Florian Pelgrin (HEC) Univariate time series Sept.
More informationRandom Variables and Their Distributions
Chapter 3 Random Variables and Their Distributions A random variable (r.v.) is a function that assigns one and only one numerical value to each simple event in an experiment. We will denote r.vs by capital
More informationECON 616: Lecture 1: Time Series Basics
ECON 616: Lecture 1: Time Series Basics ED HERBST August 30, 2017 References Overview: Chapters 1-3 from Hamilton (1994). Technical Details: Chapters 2-3 from Brockwell and Davis (1987). Intuition: Chapters
More information5: MULTIVARATE STATIONARY PROCESSES
5: MULTIVARATE STATIONARY PROCESSES 1 1 Some Preliminary Definitions and Concepts Random Vector: A vector X = (X 1,..., X n ) whose components are scalarvalued random variables on the same probability
More informationTesting for IID Noise/White Noise: I
Testing for IID Noise/White Noise: I want to be able to test null hypothesis time series {x t } or set of residuals {r t } is IID(0, 2 ) or WN(0, 2 ) there are many such tests, including informal test
More information16.584: Random (Stochastic) Processes
1 16.584: Random (Stochastic) Processes X(t): X : RV : Continuous function of the independent variable t (time, space etc.) Random process : Collection of X(t, ζ) : Indexed on another independent variable
More informationARIMA Modelling and Forecasting
ARIMA Modelling and Forecasting Economic time series often appear nonstationary, because of trends, seasonal patterns, cycles, etc. However, the differences may appear stationary. Δx t x t x t 1 (first
More informationStatistics of Stochastic Processes
Prof. Dr. J. Franke All of Statistics 4.1 Statistics of Stochastic Processes discrete time: sequence of r.v...., X 1, X 0, X 1, X 2,... X t R d in general. Here: d = 1. continuous time: random function
More informationIf we want to analyze experimental or simulated data we might encounter the following tasks:
Chapter 1 Introduction If we want to analyze experimental or simulated data we might encounter the following tasks: Characterization of the source of the signal and diagnosis Studying dependencies Prediction
More informationClass 1: Stationary Time Series Analysis
Class 1: Stationary Time Series Analysis Macroeconometrics - Fall 2009 Jacek Suda, BdF and PSE February 28, 2011 Outline Outline: 1 Covariance-Stationary Processes 2 Wold Decomposition Theorem 3 ARMA Models
More informationSTAT Chapter 5 Continuous Distributions
STAT 270 - Chapter 5 Continuous Distributions June 27, 2012 Shirin Golchi () STAT270 June 27, 2012 1 / 59 Continuous rv s Definition: X is a continuous rv if it takes values in an interval, i.e., range
More informationRegression with correlation for the Sales Data
Regression with correlation for the Sales Data Scatter with Loess Curve Time Series Plot Sales 30 35 40 45 Sales 30 35 40 45 0 10 20 30 40 50 Week 0 10 20 30 40 50 Week Sales Data What is our goal with
More information