
CHAPTER 2
FUNDAMENTAL CONCEPTS

This chapter describes the fundamental concepts in the theory of time series models. In particular, we introduce the concepts of stochastic processes, mean and covariance functions, stationary processes, and autocorrelation functions.

2.1 Time Series and Stochastic Processes

The sequence of random variables {Y_t : t = 0, ±1, ±2, ±3, ...} is called a stochastic process and serves as a model for an observed time series. It is known that the complete probabilistic structure of such a process is determined by the set of distributions of all finite collections of the Y's. Fortunately, we will not have to deal explicitly with these multivariate distributions. Much of the information in these joint distributions can be described in terms of means, variances, and covariances. Consequently, we concentrate our efforts on these first and second moments. (If the joint distributions of the Y's are multivariate normal distributions, then the first and second moments completely determine all the joint distributions.)

2.2 Means, Variances, and Covariances

For a stochastic process {Y_t : t = 0, ±1, ±2, ±3, ...}, the mean function is defined by

    μ_t = E(Y_t)    for t = 0, ±1, ±2, ...    (2.2.1)

That is, μ_t is just the expected value of the process at time t. In general, μ_t can be different at each time point t.

The autocovariance function, γ_{t,s}, is defined as

    γ_{t,s} = Cov(Y_t, Y_s)    for t, s = 0, ±1, ±2, ...    (2.2.2)

where Cov(Y_t, Y_s) = E[(Y_t − μ_t)(Y_s − μ_s)] = E(Y_t Y_s) − μ_t μ_s.

The autocorrelation function, ρ_{t,s}, is given by

    ρ_{t,s} = Corr(Y_t, Y_s)    for t, s = 0, ±1, ±2, ...    (2.2.3)

where

    Corr(Y_t, Y_s) = Cov(Y_t, Y_s) / √[Var(Y_t) Var(Y_s)] = γ_{t,s} / √(γ_{t,t} γ_{s,s})    (2.2.4)
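These are ensemble quantities: μ_t, γ_{t,s}, and ρ_{t,s} average over the distribution of the process at fixed time points, not over time within a single observed series. The following R sketch (ours, not from the text) approximates them by simulating many independent realizations of the random walk defined shortly in Equation (2.2.8); the replication count and the use of standard normal e's are our own choices.

> # Minimal sketch (not from the text): approximate the ensemble moments
> # mu_t = E(Y_t), gamma_{t,s} = Cov(Y_t, Y_s), and rho_{t,s} = Corr(Y_t, Y_s)
> # by averaging across many independent realizations of the random walk
> # of Equation (2.2.8), built from standard normal e's (sigma_e^2 = 1).
> set.seed(1)
> nrep <- 10000                                  # number of independent realizations
> n <- 25                                        # time points t = 1, ..., n
> Y <- t(apply(matrix(rnorm(nrep*n), nrep, n), 1, cumsum))   # one realization per row
> round(colMeans(Y)[c(1, 5, 25)], 2)             # estimates of mu_t (theory: 0)
> round(c(cov(Y)[5, 5], cov(Y)[5, 10]), 2)       # estimates of gamma_{5,5} and gamma_{5,10} (theory: both 5)
> round(c(cor(Y)[1, 2], sqrt(1/2)), 3)           # rho_{1,2} versus sqrt(1/2) from Equation (2.2.13)

The estimates should fall close to the values derived for the random walk later in this section.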

We review the basic properties of expectation, variance, covariance, and correlation in Appendix A on page 24. Recall that both covariance and correlation are measures of the (linear) dependence between random variables but that the unitless correlation is somewhat easier to interpret.

The following important properties follow from known results and our definitions:

    γ_{t,t} = Var(Y_t)                  ρ_{t,t} = 1
    γ_{t,s} = γ_{s,t}                   ρ_{t,s} = ρ_{s,t}
    |γ_{t,s}| ≤ √(γ_{t,t} γ_{s,s})      |ρ_{t,s}| ≤ 1        (2.2.5)

Values of ρ_{t,s} near ±1 indicate strong (linear) dependence, whereas values near zero indicate weak (linear) dependence. If ρ_{t,s} = 0, we say that Y_t and Y_s are uncorrelated.

To investigate the covariance properties of various time series models, the following result will be used repeatedly: If c_1, c_2, ..., c_m and d_1, d_2, ..., d_n are constants and t_1, t_2, ..., t_m and s_1, s_2, ..., s_n are time points, then

    Cov[ Σ_{i=1}^{m} c_i Y_{t_i}, Σ_{j=1}^{n} d_j Y_{s_j} ] = Σ_{i=1}^{m} Σ_{j=1}^{n} c_i d_j Cov(Y_{t_i}, Y_{s_j})    (2.2.6)

The proof of Equation (2.2.6), though tedious, is a straightforward application of the linear properties of expectation. As a special case, we obtain the well-known result

    Var[ Σ_{i=1}^{n} c_i Y_{t_i} ] = Σ_{i=1}^{n} c_i² Var(Y_{t_i}) + 2 Σ_{i=2}^{n} Σ_{j=1}^{i−1} c_i c_j Cov(Y_{t_i}, Y_{t_j})    (2.2.7)

The Random Walk

Let e_1, e_2, ... be a sequence of independent, identically distributed random variables each with zero mean and variance σ_e². The observed time series, {Y_t : t = 1, 2, ...}, is constructed as follows:

    Y_1 = e_1
    Y_2 = e_1 + e_2
    ⋮
    Y_t = e_1 + e_2 + ⋯ + e_t    (2.2.8)

Alternatively, we can write

    Y_t = Y_{t−1} + e_t    (2.2.9)

with initial condition Y_1 = e_1. If the e's are interpreted as the sizes of the steps taken (forward or backward) along a number line, then Y_t is the position of the random walker at time t. From Equation (2.2.8), we obtain the mean function

    μ_t = E(Y_t) = E(e_1 + e_2 + ⋯ + e_t)
        = E(e_1) + E(e_2) + ⋯ + E(e_t) = 0 + 0 + ⋯ + 0

so that

    μ_t = 0    for all t    (2.2.10)

We also have

    Var(Y_t) = Var(e_1 + e_2 + ⋯ + e_t)
             = Var(e_1) + Var(e_2) + ⋯ + Var(e_t) = σ_e² + σ_e² + ⋯ + σ_e²

so that

    Var(Y_t) = t σ_e²    (2.2.11)

Notice that the process variance increases linearly with time.

To investigate the covariance function, suppose that 1 ≤ t ≤ s. Then we have

    γ_{t,s} = Cov(Y_t, Y_s) = Cov(e_1 + e_2 + ⋯ + e_t, e_1 + e_2 + ⋯ + e_t + e_{t+1} + ⋯ + e_s)

From Equation (2.2.6), we have

    γ_{t,s} = Σ_{i=1}^{s} Σ_{j=1}^{t} Cov(e_i, e_j)

However, these covariances are zero unless i = j, in which case they equal Var(e_i) = σ_e². There are exactly t of these so that γ_{t,s} = t σ_e².

Since γ_{t,s} = γ_{s,t}, this specifies the autocovariance function for all time points t and s and we can write

    γ_{t,s} = t σ_e²    for 1 ≤ t ≤ s    (2.2.12)

The autocorrelation function for the random walk is now easily obtained as

    ρ_{t,s} = γ_{t,s} / √(γ_{t,t} γ_{s,s}) = √(t/s)    for 1 ≤ t ≤ s    (2.2.13)

The following numerical values help us understand the behavior of the random walk.

    ρ_{1,2} = √(1/2) = 0.707        ρ_{8,9} = √(8/9) = 0.943
    ρ_{24,25} = √(24/25) = 0.980    ρ_{1,25} = √(1/25) = 0.200

The values of Y at neighboring time points are more and more strongly and positively correlated as time goes by. On the other hand, the values of Y at distant time points are less and less correlated.

A simulated random walk is shown in Exhibit 2.1 where the e's were selected from a standard normal distribution. Note that even though the theoretical mean function is

zero for all time points, the fact that the variance increases over time and that the correlation between process values nearby in time is nearly 1 indicate that we should expect long excursions of the process away from the mean level of zero.

The simple random walk process provides a good model (at least to a first approximation) for phenomena as diverse as the movement of common stock prices and the position of small particles suspended in a fluid (so-called Brownian motion).

Exhibit 2.1  Time Series Plot of a Random Walk

[Figure: time series plot of the simulated random walk; vertical axis "Random Walk", horizontal axis "Time" from 0 to 60.]

> win.graph(width=4.875, height=2.5, pointsize=8)
> data(rwalk) # rwalk contains a simulated random walk
> plot(rwalk, type='o', ylab='Random Walk')

A Moving Average

As a second example, suppose that {Y_t} is constructed as

    Y_t = (e_t + e_{t−1}) / 2    (2.2.14)

where (as always throughout this book) the e's are assumed to be independent and identically distributed with zero mean and variance σ_e². Here

    μ_t = E(Y_t) = E[(e_t + e_{t−1}) / 2] = [E(e_t) + E(e_{t−1})] / 2 = 0

and

    Var(Y_t) = Var[(e_t + e_{t−1}) / 2] = [Var(e_t) + Var(e_{t−1})] / 4 = 0.5 σ_e²

Also

    Cov(Y_t, Y_{t−1}) = Cov[(e_t + e_{t−1}) / 2, (e_{t−1} + e_{t−2}) / 2]
        = [Cov(e_t, e_{t−1}) + Cov(e_t, e_{t−2}) + Cov(e_{t−1}, e_{t−1}) + Cov(e_{t−1}, e_{t−2})] / 4
        = Cov(e_{t−1}, e_{t−1}) / 4    (as all the other covariances are zero)
        = 0.25 σ_e²

or

    γ_{t,t−1} = 0.25 σ_e²    for all t

Furthermore,

    Cov(Y_t, Y_{t−2}) = Cov[(e_t + e_{t−1}) / 2, (e_{t−2} + e_{t−3}) / 2] = 0    since the e's are independent.

Similarly, Cov(Y_t, Y_{t−k}) = 0 for k > 1, so we may write

    γ_{t,s} = 0.5 σ_e²     for |t − s| = 0
            = 0.25 σ_e²    for |t − s| = 1
            = 0            for |t − s| > 1    (2.2.15)

For the autocorrelation function, we have

    ρ_{t,s} = 1      for |t − s| = 0
            = 0.5    for |t − s| = 1
            = 0      for |t − s| > 1    (2.2.16)

since 0.25 σ_e² / (0.5 σ_e²) = 0.5.

Notice that ρ_{2,1} = ρ_{3,2} = ρ_{4,3} = ρ_{9,8} = 0.5. Values of Y precisely one time unit apart have exactly the same correlation no matter where they occur in time. Furthermore, ρ_{3,1} = ρ_{4,2} = ⋯ = ρ_{t,t−2} and, more generally, ρ_{t,t−k} is the same for all values of t. This leads us to the important concept of stationarity.
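As a quick sanity check on Equations (2.2.15) and (2.2.16), the R sketch below (ours, not from the text) simulates many independent draws of (Y_{t−2}, Y_{t−1}, Y_t) for this moving average with standard normal e's; the sample variances, covariances, and lag-one correlation should land near 0.5, 0.25, 0, and 0.5.

> # Minimal sketch (not from the text): Monte Carlo check of Equations
> # (2.2.15)-(2.2.16) for Y_t = (e_t + e_{t-1})/2 with sigma_e^2 = 1.
> set.seed(2)
> nrep <- 20000
> e <- matrix(rnorm(nrep*4), nrep, 4)          # columns: e_{t-3}, e_{t-2}, e_{t-1}, e_t
> Y <- (e[, 2:4] + e[, 1:3]) / 2               # columns: Y_{t-2}, Y_{t-1}, Y_t
> c(var(Y[, 3]), cov(Y[, 3], Y[, 2]), cov(Y[, 3], Y[, 1]))   # about 0.5, 0.25, 0
> cor(Y[, 3], Y[, 2])                          # about 0.5, regardless of t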

2.3 Stationarity

To make statistical inferences about the structure of a stochastic process on the basis of an observed record of that process, we must usually make some simplifying (and presumably reasonable) assumptions about that structure. The most important such assumption is that of stationarity. The basic idea of stationarity is that the probability laws that govern the behavior of the process do not change over time. In a sense, the process is in statistical equilibrium. Specifically, a process {Y_t} is said to be strictly stationary if the joint distribution of Y_{t_1}, Y_{t_2}, ..., Y_{t_n} is the same as the joint distribution of Y_{t_1−k}, Y_{t_2−k}, ..., Y_{t_n−k} for all choices of time points t_1, t_2, ..., t_n and all choices of time lag k.

Thus, when n = 1 the (univariate) distribution of Y_t is the same as that of Y_{t−k} for all t and k; in other words, the Y's are (marginally) identically distributed. It then follows that E(Y_t) = E(Y_{t−k}) for all t and k so that the mean function is constant for all time. Additionally, Var(Y_t) = Var(Y_{t−k}) for all t and k so that the variance is also constant over time.

Setting n = 2 in the stationarity definition we see that the bivariate distribution of Y_t and Y_s must be the same as that of Y_{t−k} and Y_{s−k}, from which it follows that Cov(Y_t, Y_s) = Cov(Y_{t−k}, Y_{s−k}) for all t, s, and k. Putting k = s and then k = t, we obtain

    γ_{t,s} = Cov(Y_{t−s}, Y_0) = Cov(Y_0, Y_{s−t}) = Cov(Y_0, Y_{|t−s|}) = γ_{0,|t−s|}

That is, the covariance between Y_t and Y_s depends on time only through the time difference |t − s| and not otherwise on the actual times t and s. Thus, for a stationary process, we can simplify our notation and write

    γ_k = Cov(Y_t, Y_{t−k})    and    ρ_k = Corr(Y_t, Y_{t−k})    (2.3.1)

Note also that ρ_k = γ_k / γ_0.

The general properties given in Equation (2.2.5) now become

    γ_0 = Var(Y_t)      ρ_0 = 1
    γ_k = γ_{−k}        ρ_k = ρ_{−k}
    |γ_k| ≤ γ_0         |ρ_k| ≤ 1        (2.3.2)

If a process is strictly stationary and has finite variance, then the covariance function must depend only on the time lag.

A definition that is similar to that of strict stationarity but is mathematically weaker

is the following: A stochastic process {Y_t} is said to be weakly (or second-order) stationary if

1. the mean function is constant over time, and
2. γ_{t,t−k} = γ_{0,k} for all time t and lag k.

In this book the term stationary when used alone will always refer to this weaker form of stationarity. However, if the joint distributions for the process are all multivariate normal distributions, it can be shown that the two definitions coincide. For stationary processes, we usually only consider k ≥ 0.

White Noise

A very important example of a stationary process is the so-called white noise process, which is defined as a sequence of independent, identically distributed random variables {e_t}. Its importance stems not from the fact that it is an interesting model itself but from the fact that many useful processes can be constructed from white noise. The fact that {e_t} is strictly stationary is easy to see since

    Pr(e_{t_1} ≤ x_1, e_{t_2} ≤ x_2, ..., e_{t_n} ≤ x_n)
        = Pr(e_{t_1} ≤ x_1) Pr(e_{t_2} ≤ x_2) ⋯ Pr(e_{t_n} ≤ x_n)                  (by independence)
        = Pr(e_{t_1−k} ≤ x_1) Pr(e_{t_2−k} ≤ x_2) ⋯ Pr(e_{t_n−k} ≤ x_n)            (identical distributions)
        = Pr(e_{t_1−k} ≤ x_1, e_{t_2−k} ≤ x_2, ..., e_{t_n−k} ≤ x_n)               (by independence)

as required. Also, μ_t = E(e_t) is constant and

    γ_k = Var(e_t)  for k = 0,    γ_k = 0  for k ≠ 0

Alternatively, we can write

    ρ_k = 1  for k = 0,    ρ_k = 0  for k ≠ 0    (2.3.3)

The term white noise arises from the fact that a frequency analysis of the model shows that, in analogy with white light, all frequencies enter equally. We usually assume that the white noise process has mean zero and denote Var(e_t) by σ_e².

The moving average example, on page 14, where Y_t = (e_t + e_{t−1})/2, is another example of a stationary process constructed from white noise. In our new notation, we have for the moving average process that

    ρ_k = 1      for k = 0
        = 0.5    for |k| = 1
        = 0      for |k| ≥ 2
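A simulated white noise path looks patternless, and its sample autocorrelations at nonzero lags should be close to zero. The R sketch below (ours, not from the text) illustrates this with Gaussian white noise; the series length and the use of R's acf() function for sample autocorrelations are our own choices, since estimation of autocorrelations is treated formally later in the book.

> # Minimal sketch (not from the text): Gaussian white noise and its
> # sample autocorrelations, which should hover near zero for k != 0.
> set.seed(3)
> e <- rnorm(200)                              # e_t iid N(0, 1), so sigma_e^2 = 1
> r <- acf(e, lag.max=10, plot=FALSE)$acf      # r[1] is lag 0 (always 1)
> round(r[1:6], 2)                             # lags 0 through 5
> plot(e, type='o', ylab='White Noise')        # a patternless path around zero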

Random Cosine Wave

As a somewhat different example,† consider the process defined as follows:

    Y_t = cos[2π(t/12 + Φ)]    for t = 0, ±1, ±2, ...

where Φ is selected (once) from a uniform distribution on the interval from 0 to 1. A sample from such a process will appear highly deterministic since Y_t will repeat itself identically every 12 time units and look like a perfect (discrete time) cosine curve. However, its maximum will not occur at t = 0 but will be determined by the random phase Φ. The phase Φ can be interpreted as the fraction of a complete cycle completed by time t = 0. Still, the statistical properties of this process can be computed as follows:

    E(Y_t) = E{cos[2π(t/12 + Φ)]}
           = ∫₀¹ cos[2π(t/12 + φ)] dφ
           = (1/2π) sin[2π(t/12 + φ)] evaluated from φ = 0 to φ = 1
           = (1/2π) {sin[2π(t/12 + 1)] − sin(2πt/12)}

But this is zero since the sines must agree. So μ_t = 0 for all t. Also

    γ_{t,s} = E{cos[2π(t/12 + Φ)] cos[2π(s/12 + Φ)]}
            = ∫₀¹ cos[2π(t/12 + φ)] cos[2π(s/12 + φ)] dφ
            = ½ ∫₀¹ {cos[2π(t − s)/12] + cos[2π((t + s)/12 + 2φ)]} dφ
            = ½ {cos[2π(t − s)/12] + (1/4π) sin[2π((t + s)/12 + 2φ)] evaluated from φ = 0 to φ = 1}
            = ½ cos[2π(t − s)/12]

So the process is stationary with autocorrelation function

    ρ_k = cos(2πk/12)    for k = 0, ±1, ±2, ...    (2.3.4)

† This example contains optional material that is not needed in order to understand most of the remainder of this book. It will be used in Chapter 13, Introduction to Spectral Analysis.
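The stationarity here is easy to miss from a single realization, which looks like a deterministic cosine. The R sketch below (ours, not from the text) simulates many independent phases and compares a few lag-k correlations with ρ_k = cos(2πk/12) from Equation (2.3.4); the replication count is our own choice.

> # Minimal sketch (not from the text): Monte Carlo check of Equation (2.3.4)
> # for Y_t = cos(2*pi*(t/12 + Phi)) with Phi uniform on (0, 1).
> set.seed(4)
> nrep <- 10000
> Phi <- runif(nrep)                                     # one random phase per realization
> Y <- sapply(1:24, function(t) cos(2*pi*(t/12 + Phi)))  # nrep x 24 matrix of realizations
> c(cor(Y[, 1], Y[, 2]), cos(2*pi*1/12))                 # lag 1: both about 0.866
> c(cor(Y[, 1], Y[, 4]), cos(2*pi*3/12))                 # lag 3: both about 0
> c(cor(Y[, 1], Y[, 7]), cos(2*pi*6/12))                 # lag 6: both exactly -1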

This example suggests that it will be difficult to assess whether or not stationarity is a reasonable assumption for a given time series on the basis of the time sequence plot of the observed data.

The random walk of page 12, where Y_t = e_1 + e_2 + ⋯ + e_t, is also constructed from white noise but is not stationary. For example, the variance function, Var(Y_t) = t σ_e², is not constant; furthermore, the covariance function γ_{t,s} = t σ_e² for 0 ≤ t ≤ s does not depend only on time lag. However, suppose that instead of analyzing {Y_t} directly, we consider the differences of successive Y-values, denoted ∇Y_t. Then ∇Y_t = Y_t − Y_{t−1} = e_t, so the differenced series, {∇Y_t}, is stationary. This represents a simple example of a technique found to be extremely useful in many applications. Clearly, many real time series cannot be reasonably modeled by stationary processes since they are not in statistical equilibrium but are evolving over time. However, we can frequently transform nonstationary series into stationary series by simple techniques such as differencing. Such techniques will be vigorously pursued in the remaining chapters.

2.4 Summary

In this chapter we have introduced the basic concepts of stochastic processes that serve as models for time series. In particular, you should now be familiar with the important concepts of mean functions, autocovariance functions, and autocorrelation functions. We illustrated these concepts with the basic processes: the random walk, white noise, a simple moving average, and a random cosine wave. Finally, the fundamental concept of stationarity introduced here will be used throughout the book.

EXERCISES

2.1 Suppose E(X) = 2, Var(X) = 9, E(Y) = 0, Var(Y) = 4, and Corr(X, Y) = 0.25. Find:
(a) Var(X + Y).
(b) Cov(X, X + Y).
(c) Corr(X + Y, X − Y).

2.2 If X and Y are dependent but Var(X) = Var(Y), find Cov(X + Y, X − Y).

2.3 Let X have a distribution with mean μ and variance σ², and let Y_t = X for all t.
(a) Show that {Y_t} is strictly and weakly stationary.
(b) Find the autocovariance function for {Y_t}.
(c) Sketch a "typical" time plot of Y_t.

2.4 Let {e_t} be a zero mean white noise process. Suppose that the observed process is Y_t = e_t + θe_{t−1}, where θ is either 3 or 1/3.
(a) Find the autocorrelation function for {Y_t} both when θ = 3 and when θ = 1/3.
(b) You should have discovered that the time series is stationary regardless of the value of θ and that the autocorrelation functions are the same for θ = 3 and θ = 1/3. For simplicity, suppose that the process mean is known to be zero and the variance of Y_t is known to be 1. You observe the series {Y_t} for t = 1, 2, ..., n and suppose that you can produce good estimates of the autocorrelations ρ_k. Do you think that you could determine which value of θ is correct (3 or 1/3) based on the estimate of ρ_k? Why or why not?

2.5 Suppose Y_t = 5 + 2t + X_t, where {X_t} is a zero-mean stationary series with autocovariance function γ_k.
(a) Find the mean function for {Y_t}.
(b) Find the autocovariance function for {Y_t}.
(c) Is {Y_t} stationary? Why or why not?

2.6 Let {X_t} be a stationary time series, and define Y_t = X_t for t odd and Y_t = X_t + 3 for t even.
(a) Show that Cov(Y_t, Y_{t−k}) is free of t for all lags k.
(b) Is {Y_t} stationary?

2.7 Suppose that {Y_t} is stationary with autocovariance function γ_k.
(a) Show that W_t = ∇Y_t = Y_t − Y_{t−1} is stationary by finding the mean and autocovariance function for {W_t}.
(b) Show that U_t = ∇²Y_t = ∇[Y_t − Y_{t−1}] = Y_t − 2Y_{t−1} + Y_{t−2} is stationary. (You need not find the mean and autocovariance function for {U_t}.)

2.8 Suppose that {Y_t} is stationary with autocovariance function γ_k. Show that for any fixed positive integer n and any constants c_1, c_2, ..., c_n, the process {W_t} defined by W_t = c_1 Y_t + c_2 Y_{t−1} + ⋯ + c_n Y_{t−n+1} is stationary. (Note that Exercise 2.7 is a special case of this result.)

2.9 Suppose Y_t = β_0 + β_1 t + X_t, where {X_t} is a zero-mean stationary series with autocovariance function γ_k and β_0 and β_1 are constants.
(a) Show that {Y_t} is not stationary but that W_t = ∇Y_t = Y_t − Y_{t−1} is stationary.
(b) In general, show that if Y_t = μ_t + X_t, where {X_t} is a zero-mean stationary series and μ_t is a polynomial in t of degree d, then ∇^m Y_t = ∇(∇^{m−1} Y_t) is stationary for m ≥ d and nonstationary for 0 ≤ m < d.

2.10 Let {X_t} be a zero-mean, unit-variance stationary process with autocorrelation function ρ_k. Suppose that μ_t is a nonconstant function and that σ_t is a positive-valued nonconstant function. The observed series is formed as Y_t = μ_t + σ_t X_t.
(a) Find the mean and covariance function for the {Y_t} process.
(b) Show that the autocorrelation function for the {Y_t} process depends only on the time lag. Is the {Y_t} process stationary?
(c) Is it possible to have a time series with a constant mean and with Corr(Y_t, Y_{t−k}) free of t but with {Y_t} not stationary?

2.11 Suppose Cov(X_t, X_{t−k}) = γ_k is free of t but that E(X_t) = 3t.
(a) Is {X_t} stationary?
(b) Let Y_t = 7 − 3t + X_t. Is {Y_t} stationary?

2.12 Suppose that Y_t = e_t − e_{t−12}. Show that {Y_t} is stationary and that, for k > 0, its autocorrelation function is nonzero only for lag k = 12.

2.13 Let Y_t = e_t − θ(e_{t−1})². For this exercise, assume that the white noise series is normally distributed.
(a) Find the autocorrelation function for {Y_t}.
(b) Is {Y_t} stationary?

2.14 Evaluate the mean and covariance function for each of the following processes. In each case, determine whether or not the process is stationary.
(a) Y_t = θ_0 + t e_t.
(b) W_t = ∇Y_t, where Y_t is as given in part (a).
(c) Y_t = e_t e_{t−1}. (You may assume that {e_t} is normal white noise.)

2.15 Suppose that X is a random variable with zero mean. Define a time series by Y_t = (−1)^t X.
(a) Find the mean function for {Y_t}.
(b) Find the covariance function for {Y_t}.
(c) Is {Y_t} stationary?

2.16 Suppose Y_t = A + X_t, where {X_t} is stationary and A is random but independent of {X_t}. Find the mean and covariance function for {Y_t} in terms of the mean and autocovariance function for {X_t} and the mean and variance of A.

2.17 Let {Y_t} be stationary with autocovariance function γ_k. Let Ȳ = (1/n) Σ_{t=1}^{n} Y_t. Show that

    Var(Ȳ) = γ_0/n + (2/n) Σ_{k=1}^{n−1} (1 − k/n) γ_k = (1/n) Σ_{k=−n+1}^{n−1} (1 − |k|/n) γ_k

2.18 Let {Y_t} be stationary with autocovariance function γ_k. Define the sample variance as S² = [1/(n − 1)] Σ_{t=1}^{n} (Y_t − Ȳ)².
(a) First show that Σ_{t=1}^{n} (Y_t − μ)² = Σ_{t=1}^{n} (Y_t − Ȳ)² + n(Ȳ − μ)².
(b) Use part (a) to show that E(S²) = [n/(n − 1)] γ_0 − [n/(n − 1)] Var(Ȳ).
(c) Show further that E(S²) = γ_0 − [2/(n − 1)] Σ_{k=1}^{n−1} (1 − k/n) γ_k. (Use the results of Exercise 2.17 for the last expression.)
(d) If {Y_t} is a white noise process with variance γ_0, show that E(S²) = γ_0.

2.19 Let Y_1 = θ_0 + e_1, and then for t > 1 define Y_t recursively by Y_t = θ_0 + Y_{t−1} + e_t. Here θ_0 is a constant. The process {Y_t} is called a random walk with drift.
(a) Show that Y_t may be rewritten as Y_t = t θ_0 + e_t + e_{t−1} + ⋯ + e_1.
(b) Find the mean function for Y_t.
(c) Find the autocovariance function for Y_t.

2.20 Consider the standard random walk model where Y_t = Y_{t−1} + e_t with Y_1 = e_1.
(a) Use the above representation of Y_t to show that μ_t = μ_{t−1} for t > 1 with initial condition μ_1 = E(e_1) = 0. Hence show that μ_t = 0 for all t.
(b) Similarly, show that Var(Y_t) = Var(Y_{t−1}) + σ_e² for t > 1 with Var(Y_1) = σ_e², and hence Var(Y_t) = t σ_e².
(c) For 0 ≤ t ≤ s, use Y_s = Y_t + e_{t+1} + e_{t+2} + ⋯ + e_s to show that Cov(Y_t, Y_s) = Var(Y_t) and, hence, that Cov(Y_t, Y_s) = min(t, s) σ_e².

2.21 For a random walk with random starting value, let Y_t = Y_0 + e_t + e_{t−1} + ⋯ + e_1 for t > 0, where Y_0 has a distribution with mean μ_0 and variance σ_0². Suppose further that Y_0, e_1, ..., e_t are independent.
(a) Show that E(Y_t) = μ_0 for all t.
(b) Show that Var(Y_t) = t σ_e² + σ_0².
(c) Show that Cov(Y_t, Y_s) = min(t, s) σ_e² + σ_0².
(d) Show that Corr(Y_t, Y_s) = √[(t σ_e² + σ_0²)/(s σ_e² + σ_0²)] for 0 ≤ t ≤ s.

2.22 Let {e_t} be a zero-mean white noise process, and let c be a constant with |c| < 1. Define Y_t recursively by Y_t = c Y_{t−1} + e_t with Y_1 = e_1.
(a) Show that E(Y_t) = 0.
(b) Show that Var(Y_t) = σ_e² (1 + c² + c⁴ + ⋯ + c^{2t−2}). Is {Y_t} stationary?
(c) Show that Corr(Y_t, Y_{t−1}) = c √[Var(Y_{t−1})/Var(Y_t)] and, in general,

    Corr(Y_t, Y_{t−k}) = c^k √[Var(Y_{t−k})/Var(Y_t)]    for k > 0

Hint: Argue that Y_{t−1} is independent of e_t. Then use Cov(Y_t, Y_{t−1}) = Cov(c Y_{t−1} + e_t, Y_{t−1}).
(d) For large t, argue that

    Var(Y_t) ≈ σ_e²/(1 − c²)    and    Corr(Y_t, Y_{t−k}) ≈ c^k    for k > 0

so that {Y_t} could be called asymptotically stationary.
(e) Suppose now that we alter the initial condition and put Y_1 = e_1/√(1 − c²). Show that now {Y_t} is stationary.

2.23 Two processes {Z_t} and {Y_t} are said to be independent if for any time points t_1, t_2, ..., t_m and s_1, s_2, ..., s_n the random variables {Z_{t_1}, Z_{t_2}, ..., Z_{t_m}} are independent of the random variables {Y_{s_1}, Y_{s_2}, ..., Y_{s_n}}. Show that if {Z_t} and {Y_t} are independent stationary processes, then W_t = Z_t + Y_t is stationary.

2.24 Let {X_t} be a time series in which we are interested. However, because the measurement process itself is not perfect, we actually observe Y_t = X_t + e_t. We assume that {X_t} and {e_t} are independent processes. We call X_t the signal and e_t the measurement noise or error process. If {X_t} is stationary with autocorrelation function ρ_k, show that {Y_t} is also stationary with

    Corr(Y_t, Y_{t−k}) = ρ_k / (1 + σ_e²/σ_X²)    for k ≥ 1

We call σ_X²/σ_e² the signal-to-noise ratio, or SNR. Note that the larger the SNR, the closer the autocorrelation function of the observed process {Y_t} is to the autocorrelation function of the desired signal {X_t}.

2.25 Suppose Y_t = β_0 + Σ_{i=1}^{k} [A_i cos(2πf_i t) + B_i sin(2πf_i t)], where β_0, f_1, f_2, ..., f_k are constants and A_1, A_2, ..., A_k, B_1, B_2, ..., B_k are independent random variables with zero means and variances Var(A_i) = Var(B_i) = σ_i². Show that {Y_t} is stationary and find its covariance function.

2.26 Define the function Γ_{t,s} = ½ E[(Y_t − Y_s)²]. In geostatistics, Γ_{t,s} is called the semivariogram.
(a) Show that for a stationary process Γ_{t,s} = γ_0 − γ_{|t−s|}.
(b) A process is said to be intrinsically stationary if Γ_{t,s} depends only on the time difference |t − s|. Show that the random walk process is intrinsically stationary.

2.27 For a fixed, positive integer r and constant φ, consider the time series defined by Y_t = e_t + φ e_{t−1} + φ² e_{t−2} + ⋯ + φ^r e_{t−r}.
(a) Show that this process is stationary for any value of φ.
(b) Find the autocorrelation function.

2.28 (Random cosine wave extended) Suppose that Y_t = R cos(2π(f t + Φ)) for t = 0, ±1, ±2, ..., where 0 < f < ½ is a fixed frequency and R and Φ are uncorrelated random variables, with Φ uniformly distributed on the interval (0, 1).
(a) Show that E(Y_t) = 0 for all t.
(b) Show that the process is stationary with γ_k = ½ E(R²) cos(2πf k).
Hint: Use the calculations leading up to Equation (2.3.4), on page 19.

2.29 (Random cosine wave extended further) Suppose that

    Y_t = Σ_{j=1}^{m} R_j cos[2π(f_j t + Φ_j)]    for t = 0, ±1, ±2, ...

where 0 < f_1 < f_2 < ⋯ < f_m < ½ are m fixed frequencies, and R_1, Φ_1, R_2, Φ_2, ..., R_m, Φ_m are uncorrelated random variables with each Φ_j uniformly distributed on the interval (0, 1).
(a) Show that E(Y_t) = 0 for all t.
(b) Show that the process is stationary with γ_k = ½ Σ_{j=1}^{m} E(R_j²) cos(2πf_j k).
Hint: Do Exercise 2.28 first.

2.30 (Mathematical statistics required) Suppose that Y_t = R cos[2π(f t + Φ)] for t = 0, ±1, ±2, ..., where R and Φ are independent random variables and f is a fixed frequency. The phase Φ is assumed to be uniformly distributed on (0, 1), and the amplitude R has a Rayleigh distribution with pdf f(r) = r e^{−r²/2} for r > 0. Show that for each time point t, Y_t has a normal distribution. (Hint: Let Y = R cos[2π(f t + Φ)] and X = R sin[2π(f t + Φ)]. Now find the joint distribution of X and Y. It can also be shown that all of the finite dimensional distributions are multivariate normal and hence the process is strictly stationary.)

Appendix A: Expectation, Variance, Covariance, and Correlation

In this appendix, we define expectation for continuous random variables. However, all of the properties described hold for all types of random variables, discrete, continuous, or otherwise. Let X have probability density function f(x) and let the pair (X, Y) have joint probability density function f(x, y).

The expected value of X is defined as E(X) = ∫ x f(x) dx. (If ∫ |x| f(x) dx < ∞; otherwise E(X) is undefined.) E(X) is also called the expectation of X or the mean of X and is often denoted μ or μ_X.

Properties of Expectation

If h(x) is a function such that ∫ |h(x)| f(x) dx < ∞, it may be shown that

    E[h(X)] = ∫ h(x) f(x) dx    (2.A.1)

Similarly, if ∫∫ |h(x, y)| f(x, y) dx dy < ∞, it may be shown that

    E[h(X, Y)] = ∫∫ h(x, y) f(x, y) dx dy    (2.A.2)

As a corollary to Equation (2.A.1), we easily obtain the important result

    E(aX + bY + c) = a E(X) + b E(Y) + c    (2.A.3)

We also have

    E(XY) = ∫∫ x y f(x, y) dx dy    (2.A.4)

The variance of a random variable X is defined as

    Var(X) = E{[X − E(X)]²}

(provided E(X²) exists). The variance of X is often denoted by σ² or σ_X².

Properties of Variance

    Var(X) ≥ 0    (2.A.5)

    Var(a + bX) = b² Var(X)    (2.A.6)

If X and Y are independent, then

    Var(X + Y) = Var(X) + Var(Y)    (2.A.7)

In general, it may be shown that

    Var(X) = E(X²) − [E(X)]²    (2.A.8)

The positive square root of the variance of X is called the standard deviation of X and is often denoted by σ or σ_X. The random variable (X − μ_X)/σ_X is called the standardized version of X. The mean and standard deviation of a standardized variable are always zero and one, respectively.

The covariance of X and Y is defined as Cov(X, Y) = E[(X − μ_X)(Y − μ_Y)].

Properties of Covariance

    Cov(a + bX, c + dY) = bd Cov(X, Y)    (2.A.9)

    Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)    (2.A.10)

    Cov(X + Y, Z) = Cov(X, Z) + Cov(Y, Z)    (2.A.11)

    Cov(X, X) = Var(X)    (2.A.12)

    Cov(X, Y) = Cov(Y, X)    (2.A.13)

If X and Y are independent,

    Cov(X, Y) = 0    (2.A.14)
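These identities also hold exactly for the sample versions of variance and covariance, so they are easy to verify numerically. The short R sketch below (ours, not from the text) checks (2.A.6), (2.A.9), and (2.A.10) on an arbitrary simulated pair; the particular distributions and constants are our own choices.

> # Minimal sketch (not from the text): numerical check of (2.A.6), (2.A.9), (2.A.10).
> set.seed(5)
> X <- rnorm(100000); Y <- 0.6*X + rnorm(100000)       # a dependent pair (X, Y)
> c(var(2 + 3*X), 9*var(X))                            # Var(a + bX) = b^2 Var(X)
> c(cov(1 + 2*X, 3 - 4*Y), 2*(-4)*cov(X, Y))           # Cov(a + bX, c + dY) = bd Cov(X, Y)
> c(var(X + Y), var(X) + var(Y) + 2*cov(X, Y))         # Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)

Each printed pair should agree (up to rounding), since the sample variance and covariance satisfy the same algebraic identities as their theoretical counterparts.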

The correlation coefficient of X and Y, denoted by Corr(X, Y) or ρ, is defined as

    ρ = Corr(X, Y) = Cov(X, Y) / √[Var(X) Var(Y)]

Alternatively, if X* is a standardized X and Y* is a standardized Y, then ρ = E(X*Y*).

Properties of Correlation

    −1 ≤ Corr(X, Y) ≤ 1    (2.A.15)

    Corr(a + bX, c + dY) = sign(bd) Corr(X, Y)    (2.A.16)

where

    sign(bd) = 1 if bd > 0,  0 if bd = 0,  −1 if bd < 0

Corr(X, Y) = ±1 if and only if there are constants a and b such that Pr(Y = a + bX) = 1.
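Property (2.A.16) says that correlation is unaffected by changes of location and by positive changes of scale, and only flips sign when the scale factors have opposite signs. A short R sketch (ours, not from the text) illustrating this with an arbitrary simulated pair:

> # Minimal sketch (not from the text): illustration of (2.A.15)-(2.A.16).
> set.seed(6)
> X <- rexp(100000); Y <- X + rexp(100000)     # a positively correlated pair
> c(cor(X, Y), cor(5 + 2*X, 1 + 10*Y))         # equal, since sign(bd) = +1
> c(cor(X, Y), cor(5 - 2*X, 1 + 10*Y))         # opposite in sign, since sign(bd) = -1
> cor(X, 3 - 7*X)                              # exactly -1: a linear function of X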
