Stochastic Processes


Stochastic Process: Non-Formal Definition

A stochastic process (random process) is the opposite of a deterministic process such as one defined by a differential equation. A stochastic process deals with more than one possible reality of how a process might evolve: even if the starting point (initial condition) is known, there are many paths the process may follow, some more probable than others. For processes in time, a stochastic process is simply a process that develops in time according to probabilistic rules.

Stochastic Process: Formal Definition

A stochastic process X is a family of random variables, dependent upon a parameter which usually denotes time (T) and defined on some sample space (Ω). Mathematically,

    {X_t, t ∈ T} = {X_t(ω), t ∈ T, ω ∈ Ω}

Of course the parameter does not always have to denote time. It could, for example, be a vector representing location in space, in which case the process represents a random variable that varies across two-dimensional space. We will not delve into that level of detail here.

Stochastic Processes. Edited: February 2011.

Stochastic Process: Discrete Time

When T is a set of integers representing specific time points, we have a stochastic process in discrete time. In this case we generally write the random variable as X_n. The random variable X_n will depend on earlier values of the process, that is, on X_{n-1}, X_{n-2}, ...

Stochastic Process: Continuous Time

When T is the real line (or some interval of the real line), we have a stochastic process in continuous time. We will focus on this definition for our study. The random variable X(t) will depend on the values of X(u) for u < t.

Examples of Stochastic Processes

Random Walk Models, such as exchange rate data. In a random walk model, changes in the rate are independent normal random variables with zero mean and the standard deviation of the actual data. In essence, upward and downward movements in the rate are equally likely, and there is no scope for profiting by speculation except by luck.

Aside: J.P. Morgan's famous stock market prediction was that "Prices will fluctuate." Bachelier's Theory of Speculation in 1900 postulated that prices fluctuate randomly.

These models make sense in a world where:
1. Most price changes result from temporary imbalances between buyers and sellers.
2. Stronger price shocks are unpredictable.
3. Under the efficient capital market hypothesis, the current price of a stock reflects all information about it.
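The random walk model above can be sketched numerically; the number of steps, the standard deviation, and the starting level are illustrative assumptions, not values from the notes:

```python
import numpy as np

rng = np.random.default_rng(42)

# Random walk sketch for an exchange-rate-like series: changes are
# independent normal draws with zero mean.  T, sigma, and the starting
# level are illustrative.
T = 500
sigma = 0.01
increments = rng.normal(loc=0.0, scale=sigma, size=T)
log_rate = 100.0 + np.cumsum(increments)

# With zero-mean increments, up and down moves are equally likely, so
# the best forecast of the next value is simply the current value.
forecast = log_rate[-1]
```

Because the increments have zero mean, no trading rule based on past values improves on this naive forecast, which is the "no scope for profiting" point made above.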

These models do not make sense if:
1. One believes in technical analysis.
2. Random walk models assume that returns are normally or log-normally distributed, so the frequency of extreme events is underestimated.

NOTE: These models will be the focus of our study.

Poisson Processes, such as photon emissions. A photon is a minute particle of light measured by a special machine. The problem: the machine has a dead-time period. That is, after it counts a photon it has to recharge before it can count the next one, so the counts are underestimated. A stochastic model would need to be formulated that deals not only with the emissions but also with the dead time, based on some probability of occurrence.

A homogeneous Poisson process satisfies:
o Starts at zero.
o It is stationary and has independent increments.
o For every t > 0, X(t) has a Poisson distribution.

NOTE: Because Poisson processes are counting processes, they inherently have jumps.

Epidemics, such as SARS. The disease is detected, spreads, and is eventually controlled to eradicate it. The questions that arise: how soon will it spread? How widespread will it be? How many should you vaccinate, given that some will die anyway, so that the ultimate goal of eradication is reached? The graphs generally show a peak when the epidemic is at its height.
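The three properties of a homogeneous Poisson process can be illustrated by simulating arrival times; the rate and time horizon below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Homogeneous Poisson process sketch (e.g. photon counts): inter-arrival
# times are iid Exponential with mean 1/rate, so the count X(t) has a
# Poisson(rate * t) distribution.  rate and horizon are illustrative.
rate = 5.0       # expected events per unit time
horizon = 10.0
arrivals = np.cumsum(rng.exponential(1.0 / rate, size=200))
arrivals = arrivals[arrivals <= horizon]

# X(horizon): starts at zero, jumps by one at each arrival, never decreases.
count = arrivals.size
```

The stationary, independent increments follow from the memoryless exponential inter-arrival times, and the unit jumps at each arrival are the jumps noted above.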

Point Processes, such as earthquakes. Empirically we say that on average there is one earthquake per year. In actuality, however, there are years with no recorded earthquakes at all. Thus there is a clustering effect in these processes. The questions would be: is it related to magnitude? Is it strictly by chance? And so on.

Reproduction Processes, such as yeast cells. A mother yeast cell produces daughter cells after a random time. Each daughter cell has to evolve to the mother stage before it can reproduce, and that evolution also takes a random time. Each cell can only reproduce a fixed number of times before it dies. The question: how many cells will there be after some fixed time?

Evolutionary Processes, of interest to evolutionary biologists when they study the gene behavior of plants.

These processes all exhibit the Markov property.

Definition of Terms Used in Stochastic Processes

Recall: a financial time series, for this course, is a collection of financial measurements over time. For example, the log returns over time can be stated as:

    {r_t} = {r_1, r_2, ..., r_T} for T observations

Time Invariant

A system in which all quantities governing the system's behavior remain constant with time; that is, the system's response to a given input does not depend on the time it is applied. For example:

    System A: y(t) = t * x(t)  (not time invariant)
    System B: y(t) = 10 * x(t)  (time invariant)

White Noise

White noise is a simple type of stochastic process (random signal) whose terms are iid with zero mean. However, the iid requirement is often too restrictive in practice. A Gaussian white noise is a stochastic process with the following characteristics:
o Mean of zero.
o Finite variance.
o Zero autocorrelations.

Finite-Dimensional Distributions (fidis)

The fidis of the stochastic process X are the distributions of the finite-dimensional vectors (X_{t_1}, ..., X_{t_n}), for all possible choices of times t_1, ..., t_n ∈ T and every n ≥ 1. The collection of fidis determines the distribution of the stochastic process.
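A minimal Gaussian white noise sketch, checking its three characteristics on a simulated sample (the sample size and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)

# Gaussian white noise: zero mean, finite variance, and zero
# autocorrelation at every non-zero lag.
T = 2000
a = rng.normal(0.0, 1.0, size=T)

mean_hat = a.mean()          # should be near 0
var_hat = a.var()            # should be near 1 (finite)

# Sample lag-1 autocorrelation; for white noise it should sit within
# roughly +/- 2/sqrt(T) of zero.
rho1_hat = np.corrcoef(a[:-1], a[1:])[0, 1]
```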

Stationarity

A strictly stationary process is a stochastic process whose joint probability distribution does not change when shifted in time or space. Stationarity thus captures the time-invariant behavior of a time series. Determining whether a series is stationary allows for proper identification and development of forecasting models.

Two types of stationarity:
o Strict stationarity: the distributions are time invariant.
o Weak stationarity: only the first two moments are time invariant; that is, the data values fluctuate with constant variation around a constant level.

How this is used in time series analysis: raw data are often transformed to remove the trend effect (de-trended). If the time plot of {r_t} varies around a fixed level within a finite range, the series is said to be weakly stationary. If the first two moments of future r_t are the same as those of the observed data, we infer the series is weakly stationary. Most financial time series exhibit weak-form stationarity.

Markov Property

Given the present (X_{k-1}), the future (X_k) is independent of the past (X_{k-2}, X_{k-3}, ..., X_1). In other words, this is the lack-of-memory property of a time series.

Counting Process

A process X(t), in discrete or continuous time, for which the possible values of X(t) are the natural numbers (0, 1, 2, ...), with the property that X(t) is a non-decreasing function of t.

A Sample Path

A sample path of a stochastic process is a particular realization of the process; that is, a particular set of values X(t) for all t, generated according to the (stochastic) rules of the process.

Increments

The increments of a stochastic process are the changes X(t) - X(s) between time points s and t (s < t). Processes in which the increments over non-overlapping time intervals are independent and stationary are of particular importance. Examples: random walks; Lévy processes such as Poisson processes; and Brownian motion.

Statistical Measures for Linear Time Series

Gaussian Processes

A stochastic process is called Gaussian if all its fidis are multivariate Gaussian. The distribution of a Gaussian stochastic process is determined entirely by the collection of the expectations and covariance matrices of the fidis.

Mean and Variance of Returns

The sample mean and sample variance of a return series {r_t} are:

    μ̂ = (1/T) Σ_{t=1}^T r_t
    σ̂² = (1/(T-1)) Σ_{t=1}^T (r_t - μ̂)²

To test H_0: μ = 0 versus H_a: μ ≠ 0, compute

    t = μ̂ / (σ̂ / √T) ~ N(0, 1) approximately

Reject H_0 of zero mean if |t| > Z_{α/2}.

Covariance and Correlation

The covariance of random variables X and Y is defined as Cov(X, Y) = E[(X - μ_x)(Y - μ_y)]. If X and Y move in sync, the covariance will be large; if they are independent, the positive and negative terms should cancel out to give a value around zero.
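The zero-mean test can be sketched as follows; the simulated return series (drift 0.002, volatility 0.01, T = 1000) is an illustrative assumption, not data from the notes:

```python
import numpy as np

rng = np.random.default_rng(2)

# Zero-mean test sketch: t = mu_hat / (sigma_hat / sqrt(T)), compared
# against the standard normal critical value Z_{alpha/2}.
r = rng.normal(0.002, 0.01, size=1000)   # illustrative return series

T = r.size
mu_hat = r.mean()
sigma_hat = r.std(ddof=1)                # uses the 1/(T-1) sample variance
t_stat = mu_hat / (sigma_hat / np.sqrt(T))

z_crit = 1.96                            # Z_{alpha/2} for alpha = 0.05
reject_zero_mean = abs(t_stat) > z_crit
```

With this drift, the t statistic sits well above the 5% critical value, so the test rejects the zero-mean hypothesis.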

The lag-l autocovariance Cov(r_t, r_{t-l}) = γ_l has two useful properties:

    γ_0 = Var(r_t);    γ_{-l} = γ_l

The correlation coefficient of random variables X and Y is defined as:

    ρ_{x,y} = Cov(X, Y) / √(Var(X) Var(Y))

It measures the strength of linear dependence between X and Y and lies between -1 and +1.

Correlation and Causation

NOTE: Correlation does not imply causation. The meaningfulness of a correlation can be evaluated by considering:
o the number of pairs tested
o the number of points in each time series
o the sniff test
o statistical tests

Serial (or Auto-) Correlation

The lag-l autocorrelation of r_t is:

    ρ_l = γ_l / γ_0

The existence of serial correlation implies that the return is predictable, indicating market inefficiency.

Aside: market inefficiency is a condition that occurs when current prices do not reflect the available information regarding securities.

Sample Autocorrelation Function (ACF)

The lag-l sample autocorrelation is:

    ρ̂_l = Σ_{t=l+1}^T (r_t - r̄)(r_{t-l} - r̄) / Σ_{t=1}^T (r_t - r̄)²

where r̄ is the sample mean and T is the sample size.

Testing for zero serial correlations (market efficiency):

Single test: H_0: ρ_l = 0 versus H_a: ρ_l ≠ 0, with test statistic

    t = ρ̂_l / √(1/T) = √T ρ̂_l ~ N(0, 1)

Reject H_0 if |t| > Z_{α/2}.

Joint test (Ljung-Box statistic): H_0: ρ_1 = ... = ρ_m = 0 versus H_a: ρ_l ≠ 0 for some l, with

    Q(m) = T(T+2) Σ_{l=1}^m ρ̂_l² / (T - l) ~ χ²_m

Reject H_0 if Q(m) exceeds the χ²_m critical value at the chosen significance level.
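A sketch of the sample ACF and the Ljung-Box joint test, applied to simulated white noise; the series length, number of lags, and 5% level are illustrative choices:

```python
import numpy as np
from scipy.stats import chi2

def sample_acf(r, max_lag):
    """Sample autocorrelations rho_hat_1 .. rho_hat_max_lag."""
    r = np.asarray(r, dtype=float)
    dev = r - r.mean()
    denom = np.sum(dev ** 2)
    return np.array([np.sum(dev[l:] * dev[:-l]) / denom
                     for l in range(1, max_lag + 1)])

def ljung_box(r, m):
    """Ljung-Box Q(m) = T(T+2) * sum_l rho_hat_l**2 / (T - l)."""
    T = len(r)
    rho = sample_acf(r, m)
    return T * (T + 2) * np.sum(rho ** 2 / (T - np.arange(1, m + 1)))

rng = np.random.default_rng(3)
r = rng.normal(size=500)          # white noise: no serial correlation

m = 10
Q = ljung_box(r, m)
critical = chi2.ppf(0.95, df=m)   # reject H0 if Q(m) exceeds this
```

For white noise, Q(m) behaves like a χ²_m draw, so it rarely exceeds the critical value; a genuinely autocorrelated series would push Q(m) far above it.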

What are we looking for when studying autocorrelations?
o If stock returns are truly random, we expect all lags to show a correlation of around zero.
o Today's volatility is a good predictor of tomorrow's, so we expect high autocorrelations at short lags.
o For macro variables such as sales, yesterday's value is a good predictor of today's, so we expect high autocorrelations at short lags.
o If the analysis period is changed, you may be able to study day-of-week or day-of-year effects; these will show up at lags of 7 and 365, respectively.

Univariate Time Series

Purpose:
1. Building a model for r_t.
2. Understanding models for r_t: properties, forecasting, etc.

Linear Time Series

A series r_t is linear if its predictable part is a linear function of the information set F_{t-1} and the shocks are independent with the same distribution (iid). That is,

    r_t = μ + Σ_{i=0}^∞ ψ_i a_{t-i}

Where:
o μ is a constant;
o the weights ψ_i (with ψ_0 = 1) are associated with the impulse responses;
o {a_t} is an iid sequence with mean zero and a well-defined distribution; these are generally called shocks (or innovations).

Univariate Linear Time Series Models:
1. Autoregressive (AR) models
2. Moving average (MA) models
3. Mixed ARMA models
4. Seasonal models
5. Regression models with time series errors
6. Fractionally differenced models (long-memory models)

Important Properties of a Model
1. Stationarity conditions.
2. Basic properties: mean, variance, and serial dependence.
3. Empirical model building: specification, estimation, and checking.
4. Forecasting.

Considerations for Empirical Model Building
1. Ethical and financial considerations.
2. It is as much an art as a science.
3. How much detail: over- versus under-specification.
4. NOTE: All models are wrong; they only approximate reality, they are not the reality.
5. Constant tweaking is essential before a model that stands the test of time is found.

AR Models

Simple AR models are similar to a simple linear regression model with lagged variables.

AR(1) Model

An AR(1) model can be stated as follows:

    r_t = φ_0 + φ_1 r_{t-1} + a_t

Where:
o φ_0, φ_1 are real numbers, referred to as the parameters;
o a_t is assumed to be a white noise series with mean 0 and finite variance.

AR(p) Model

An AR(p) model (similar to a multiple regression model with lagged variables) can be stated as follows:

    r_t = φ_0 + φ_1 r_{t-1} + ... + φ_p r_{t-p} + a_t

Both models suggest that the past values jointly determine the conditional expectation of today's value.
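The AR(1) equation can be simulated directly; the parameter values below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate the AR(1) model r_t = phi0 + phi1 * r_{t-1} + a_t, with a_t
# Gaussian white noise.  phi0, phi1, sigma_a, and T are illustrative.
phi0, phi1, sigma_a = 0.5, 0.6, 1.0
T = 1000

r = np.empty(T)
r[0] = phi0 / (1 - phi1)            # start at the stationary mean
for t in range(1, T):
    r[t] = phi0 + phi1 * r[t - 1] + rng.normal(0.0, sigma_a)
```

Each simulated value is a linear function of the previous value plus a fresh shock, exactly the regression-with-lagged-variable structure described above.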

Properties of the AR(1) Model

Stationarity

Assuming weak stationarity, a necessary and sufficient condition is |φ_1| < 1.

Recall that under stationarity E(r_t) = E(r_{t-1}) = μ. Taking the expectation of

    r_t = φ_0 + φ_1 r_{t-1} + a_t

we have μ = φ_0 + φ_1 μ, so the mean is

    μ = φ_0 / (1 - φ_1)

Similarly, under stationarity Var(r_t) = Var(r_{t-1}). Taking the square and expectation of r_t - μ = φ_1 (r_{t-1} - μ) + a_t gives Var(r_t) = φ_1² Var(r_{t-1}) + σ_a², so the variance is

    Var(r_t) = σ_a² / (1 - φ_1²)
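A quick numerical check of the two AR(1) moment formulas, mean μ = φ_0/(1 - φ_1) and variance σ_a²/(1 - φ_1²), using illustrative parameter values:

```python
import numpy as np

rng = np.random.default_rng(5)

# Check the AR(1) moment formulas by long simulation (values illustrative):
#   mean     mu = phi0 / (1 - phi1)
#   variance    = sigma_a**2 / (1 - phi1**2)
phi0, phi1, sigma_a = 1.0, 0.5, 2.0
mu_theory = phi0 / (1 - phi1)               # 2.0
var_theory = sigma_a**2 / (1 - phi1**2)     # 16/3

T = 100_000
r = np.empty(T)
r[0] = mu_theory
for t in range(1, T):
    r[t] = phi0 + phi1 * r[t - 1] + rng.normal(0.0, sigma_a)

mu_hat, var_hat = r.mean(), r.var()         # both land close to theory
```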

Autocorrelations of the AR(1) model: ρ_0 = 1, ρ_1 = φ_1, ρ_2 = φ_1², and in general ρ_k = φ_1^k, so the ACF decays exponentially as k increases. Note: if φ_1 > 0, the ACF decays smoothly at rate φ_1; if φ_1 < 0, the magnitudes decay at rate |φ_1| while the signs alternate.
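The exponential decay and the sign-alternating case can be seen by tabulating ρ_k = φ_1^k for the illustrative values φ_1 = ±0.8:

```python
import numpy as np

# Theoretical AR(1) ACF rho_k = phi1 ** k for phi1 = +0.8 and -0.8.
ks = np.arange(6)

acf_pos = 0.8 ** ks        # phi1 > 0: smooth exponential decay
acf_neg = (-0.8) ** ks     # phi1 < 0: magnitudes decay, signs alternate
```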

AR(2) Model

An AR(2) model can be stated as follows:

    r_t = φ_0 + φ_1 r_{t-1} + φ_2 r_{t-2} + a_t

Where:
o φ_0, φ_1, φ_2 are real numbers, referred to as the parameters;
o a_t is assumed to be a white noise series with mean 0 and finite variance.

Stationarity of AR(2)
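One standard statement of the AR(2) stationarity condition, given here as an assumption since the notes do not spell it out, is that all roots of the characteristic polynomial 1 - φ_1 z - φ_2 z² = 0 lie outside the unit circle. A sketch of that check:

```python
import numpy as np

def ar2_is_stationary(phi1: float, phi2: float) -> bool:
    """Check weak stationarity of r_t = phi0 + phi1*r_{t-1} + phi2*r_{t-2} + a_t
    via the roots of 1 - phi1*z - phi2*z**2 = 0 (all must lie outside the
    unit circle)."""
    # np.roots takes coefficients from highest to lowest degree.
    roots = np.roots([-phi2, -phi1, 1.0])
    return bool(np.all(np.abs(roots) > 1.0))
```

For example, φ_1 = 0.5, φ_2 = 0.3 gives a stationary model, while φ_1 = 0.5, φ_2 = 0.6 does not; with φ_2 = 0 the check reduces to the AR(1) condition |φ_1| < 1.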