Consider a discrete-time random walk x_t = u_1 + … + u_t with i.i.d. N(0, σ²) increments u_t.

Exercise: Show that n^(-3/2) Σ_{t=1..n} x_t →_L N(0, σ²/3), where →_L denotes convergence in law.

Exercise: Show that
(i) x_t ~ N(0, tσ²),
(ii) s < t ⟹ x_t - x_s ~ N(0, (t-s)σ²),
(iii) q < r ≤ s < t ⟹ x_r - x_q and x_t - x_s are independent.

Exercise: Show that the discrete-time random walk satisfies the difference equation x_t = φ x_{t-1} + u_t with x_0 = 0 and φ = 1.

Solution:
n^(-3/2) Σ_{t=1..n} x_t = n^(-3/2) (u_1 + (u_1+u_2) + … + (u_1+…+u_n))
= n^(-3/2) (n u_1 + (n-1) u_2 + … + u_n)
~ N(0, n^(-3) (n² + (n-1)² + … + 1²) σ²)
= N(0, n(n+1)(2n+1)/(6n³) σ²) → N(0, σ²/3).
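The limiting result in the first exercise can also be checked by simulation. A minimal sketch (the sample size n and replication count m below are chosen only for illustration, with σ = 1):

```r
# Monte Carlo check of n^(-3/2)*sum(x_t) ~ N(0, sigma^2/3) for a
# Gaussian random walk x_t = u_1 + ... + u_t with sigma = 1.
set.seed(1)
n <- 200; m <- 5000
s <- replicate(m, sum(cumsum(rnorm(n))) / n^1.5)
var(s)                                # should be close to 1/3
n * (n + 1) * (2 * n + 1) / (6 * n^3) # exact finite-n variance: 0.3358375
```

The exact variance n(n+1)(2n+1)/(6n³) is the one computed in the solution above; the simulation merely confirms the normal approximation.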

Exercise: Compare the daily quotes of the S&P 500 index (symbol: ^GSPC) with realizations of a random walk. Download the historical prices from Yahoo!Finance as a csv file ^GSPC.csv into the working directory C:\SP500. Import the data into R and plot the log closing prices and 3 realizations of a random walk (with drift) with matching parameters (starting value, mean and variance of log returns).

setwd("C:/SP500")    # R uses / as path separator
Y <- read.csv("^GSPC.csv",header=TRUE,na.strings="null")
                     # in the downloaded file, missing values are represented
                     # by the string "null" rather than by the symbol NA
Y <- na.omit(Y)      # rows with missing values are omitted
N <- nrow(Y); D <- as.Date(Y[,1])    # dates in column 1
cl <- log(Y[,6])                     # adjusted close prices in column 6
r <- cl[2:N]-cl[1:(N-1)]; n <- N-1   # (log) returns
my <- mean(r); sigma <- sd(r)        # sample moments
par(mar=c(2,2,1,1)); COL <- c("red","green","blue")
plot(D,cl,type="l",ylim=range(cl)+c(-1,1))
for (j in 1:3) {
  u <- rnorm(n,mean=my,sd=sigma)
  x <- cumsum(c(cl[1],u)); lines(D,x,col=COL[j])
}

Exercise: Compare the (log) returns with matching Gaussian increments and resampled returns.

par(mar=c(2,2,0.5,0.5)); YL <- c(-0.09,0.09)
plot(r,type="l",ylim=YL)                         # returns
u <- rnorm(n,mean=my,sd=sigma); plot(u,type="l",ylim=YL)

Obviously, the density of the returns has more probability mass near the center as well as in the tails than a normal density. A more realistic sample of synthetic returns can be obtained by resampling the given returns.

u <- sample(r,size=n,replace=TRUE); plot(u,type="l",ylim=YL)

Resampling blocks of returns rather than individual returns produces clusters of different volatility.

# k=n/b nonoverlapping blocks of length b:
b <- 25; k <- trunc(n/b); K <- sample(1:k,k,replace=TRUE)
u <- NULL; for (j in K) u <- c(u,r[(j-1)*b+1:b])

# k=n/b overlapping blocks of length b:
b <- 25; k <- trunc(n/b); K <- sample(1:(n-b),k,replace=TRUE)
u <- NULL; for (j in K) u <- c(u,r[j+1:b])

# stationary bootstrap: blocks of random length
q <- .996; K <- sample(1:n,n,replace=TRUE)   # mean length=1/(1-q)
for (i in 2:n) if (runif(1)<q)               # increase block with prob. q
  K[i] <- ifelse(K[i-1]!=n,K[i-1]+1,1)
u <- r[K]
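In the stationary bootstrap above, each block is extended with probability q, so the block length L is geometrically distributed and the stated mean length 1/(1-q) follows directly:

```latex
P(L=k) = q^{k-1}(1-q), \quad k = 1,2,\ldots, \qquad
E(L) = (1-q)\sum_{k=1}^{\infty} k\,q^{k-1} = \frac{1-q}{(1-q)^{2}} = \frac{1}{1-q}.
```

With q = .996 the average block length is 1/(1-.996) = 250 observations, roughly one year of daily returns.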

Given observations x_1,…,x_n from a discrete-time random walk, a simple continuous-time process can be defined by x_n(t) = x_[nt], t ∈ [0,1], where x_0 = 0 and [nt] is the greatest integer less than or equal to nt.

Exercise: Show that as n → ∞ (i) Var(x_n(1)) → ∞ and (ii) 0 < lim Var(n^(-α) x_n(1)) < ∞ holds only for α = 1/2.

Exercise: Show that the continuous-time process

x_n(t) = n^(-1/2) x_[nt], t ∈ [0,1],

satisfies
(i) x_n(0) = 0,
(ii) x_n(t) →_L N(0, tσ²),
(iii) s < t ⟹ x_n(t) - x_n(s) →_L N(0, (t-s)σ²),
(iv) q < r ≤ s < t ⟹ x_n(r) - x_n(q) and x_n(t) - x_n(s) are independent if n is sufficiently large.

Exercise: Plot realizations of the discrete-time process x_t and the continuous-time process x_n(t) for σ² = 1 and n = 10.

par(mar=c(2,2,0.5,0.5),pch=19); n <- 10
t <- c(0:n); x <- cumsum(c(0,rnorm(n))); plot(t,x)
t <- t/n; x <- x/n^.5; plot(t,x,type="s")
# "s": stair steps (move first horizontal, then vertical)
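The exponent 1/2 in the normalization n^(-1/2) x_[nt] is the only choice that produces a nondegenerate limit. Since Var(x_n) = nσ²,

```latex
\operatorname{Var}\!\left(n^{-\alpha}x_{n}\right) = n^{1-2\alpha}\sigma^{2}
\;\longrightarrow\;
\begin{cases}
\infty, & \alpha < 1/2,\\
\sigma^{2}, & \alpha = 1/2,\\
0, & \alpha > 1/2,
\end{cases}
```

which settles the variance exercise above.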

Exercise: Plot realizations of the process x_n(t) for σ² = 1 and n = 100, 1000.

As n increases, the height of the jumps in the graph of x_n(t) decreases. We can therefore expect continuity in the limit. Of course, this does not imply smoothness.

A continuous-time process B(t), t ∈ [0,1], is called Brownian motion with variance σ² if
(i) B(0) = 0,
(ii) B(t) ~ N(0, tσ²),
(iii) s < t ⟹ B(t) - B(s) ~ N(0, (t-s)σ²),
(iv) q < r ≤ s < t ⟹ B(r) - B(q) and B(t) - B(s) are independent.

Brownian motion with variance 1 is called standard Brownian motion or Wiener process.

Exercise: Show that s < t ⟹ Cov(B(s),B(t)) = sσ².

Solution:
Cov(B(s),B(t)) = Cov(B(s),(B(t)-B(s))+B(s))
= Cov(B(s),B(t)-B(s)) + Cov(B(s),B(s))
= Cov(B(s)-B(0),B(t)-B(s)) + Var(B(s))
= 0 + sσ²

It can be shown that any realization of Brownian motion is everywhere continuous and nowhere differentiable with probability 1. Indeed, for 0 < h we have

E(B(t+h) - B(t))² = Var(B(t+h) - B(t)) = ((t+h) - t)σ² = hσ²

and

E((B(t+h) - B(t))/h)² = h^(-2) hσ² = h^(-1) σ² → ∞ as h → 0.
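Combining Cov(B(s),B(t)) = sσ² with Var(B(s)) = sσ² and Var(B(t)) = tσ² gives the correlation of the two coordinates, an immediate consequence worth recording:

```latex
\operatorname{Corr}\bigl(B(s),B(t)\bigr)
= \frac{s\sigma^{2}}{\sqrt{s\sigma^{2}}\sqrt{t\sigma^{2}}}
= \sqrt{s/t}, \qquad 0 < s < t.
```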

For any fixed 0 < τ ≤ 1, the central limit theorem applied to the mean [nτ]^(-1) Σ_{t=1..[nτ]} u_t of the fraction u_1,…,u_[nτ] of the whole sample u_1,…,u_n gives

[nτ]^(-1/2) Σ_{t=1..[nτ]} u_t →_L N(0, σ²)

and

x_n(τ) = n^(-1/2) Σ_{t=1..[nτ]} u_t = ([nτ]/n)^(1/2) [nτ]^(-1/2) Σ_{t=1..[nτ]} u_t →_L √τ N(0, σ²) = N(0, τσ²).

In contrast, the functional central limit theorem concerns the asymptotic behavior of x_n regarded as a stochastic function of τ, i.e., x_n →_L B, where B is Brownian motion with variance σ². For the extension of convergence in law to random functions, it is required, among other conditions, that

(x_n(τ_1),…,x_n(τ_k))ᵀ →_L (B(τ_1),…,B(τ_k))ᵀ

for any 0 ≤ τ_1 < … < τ_k ≤ 1.

The continuous mapping theorem (CMT): For a sequence of random variables x_n and a continuous function g, we have

x_n →_L x ⟹ g(x_n) →_L g(x).

Analogously, we have for a sequence of stochastic functions f_n and a continuous functional g,

f_n →_L f ⟹ g(f_n) →_L g(f).

Functionals map a function into a real number and a stochastic function into a random variable, respectively. Examples: (i) g(f) = f(1), (ii) g(f) = ∫₀¹ f(τ) dτ.

Example: g₁(f) = ∫₀¹ f(τ) dτ

g₁(x_n) = ∫₀¹ x_n(τ) dτ = n^(-1) Σ_{t=1..n} x_n((t-1)/n) = n^(-3/2) (x_1 + … + x_{n-1})   (since x_0 = 0)

x_n →_L B ⟹ g₁(x_n) = n^(-3/2) Σ x_{t-1} →_L g₁(B) = ∫₀¹ B(τ) dτ ~ N(0, σ²/3)

Example: g₂(f) = ∫₀¹ (f(τ))² dτ

g₂(x_n) = ∫₀¹ (x_n(τ))² dτ = n^(-1) Σ_{t=1..n} (x_n((t-1)/n))² = n^(-2) Σ_{t=1..n} x_{t-1}²

x_n →_L B ⟹ g₂(x_n) = n^(-2) Σ x_{t-1}² →_L g₂(B) = ∫₀¹ (B(τ))² dτ
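A quick consistency check for the second example: taking expectations on both sides, with E(x_{t-1}²) = (t-1)σ² and E(B(τ)²) = τσ²,

```latex
E\!\left(n^{-2}\sum_{t=1}^{n}x_{t-1}^{2}\right)
= n^{-2}\sigma^{2}\sum_{t=1}^{n}(t-1)
= \frac{n-1}{2n}\,\sigma^{2}
\;\longrightarrow\;
\frac{\sigma^{2}}{2}
= \int_{0}^{1}\tau\sigma^{2}\,d\tau
= E\!\int_{0}^{1}\bigl(B(\tau)\bigr)^{2}\,d\tau ,
```

so the means of the left- and right-hand sides agree in the limit.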

Under the unit root hypothesis H₀: φ = 1, the expected value of the denominator of the statistic

φ̂ - φ = φ̂ - 1 = Σ_{t=2..n} x_{t-1} u_t / Σ_{t=2..n} x_{t-1}²

is given by

E(Σ x_{t-1}²) = Σ E(u_1 + … + u_{t-1})² = Σ (t-1)σ² = n(n-1)σ²/2,

which implies that we need to multiply φ̂ - 1 by n in order to obtain a nondegenerate asymptotic distribution. The estimator φ̂ is called a superconsistent estimator, because it converges to φ = 1 at a faster rate than usual. We have

n(φ̂ - 1) = n Σ x_{t-1} u_t / Σ x_{t-1}²
= (n^(-1) (u_1 u_2 + (u_1+u_2) u_3 + … + (u_1+…+u_{n-1}) u_n)) / (n^(-2) Σ x_{t-1}²)
= ((x_n(1))² - n^(-1) Σ u_t²) / (2 ∫₀¹ (x_n(τ))² dτ)
≈ ((x_n(1))² - σ²) / (2 ∫₀¹ (x_n(τ))² dτ) = g₃(x_n)
→_L g₃(B) = ((B(1))² - σ²) / (2 ∫₀¹ (B(τ))² dτ)
= ((W(1))² - 1)/2 / ∫₀¹ (W(τ))² dτ,

where W = B/σ is standard Brownian motion. The random variable in the numerator is 0.5 times a demeaned χ²(1)-variable and is therefore skewed to the right.
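The form of the numerator rests on an identity obtained by squaring x_n = u_1 + … + u_n:

```latex
x_{n}^{2} = \Bigl(\sum_{t=1}^{n}u_{t}\Bigr)^{2}
= \sum_{t=1}^{n}u_{t}^{2} + 2\sum_{t=2}^{n}x_{t-1}u_{t}
\quad\Longrightarrow\quad
\sum_{t=2}^{n}x_{t-1}u_{t} = \frac{1}{2}\Bigl(x_{n}^{2}-\sum_{t=1}^{n}u_{t}^{2}\Bigr).
```

Dividing by n and using x_n/√n = x_n(1) together with n^(-1) Σ u_t² →_p σ² (law of large numbers) gives the limit of the numerator.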

We cannot use the statistic n(φ̂ - 1) to test the unit root hypothesis H₀: φ = 1 against the alternative hypothesis H_A: φ < 1 unless we have critical values. For the calculation of critical values, we do not need to use the asymptotic distribution of the respective test statistic. Instead, we can use Monte Carlo techniques. First, we generate m pseudo-random samples u_1^(j),…,u_n^(j), j = 1,…,m, of N(0,1) variates and then compute n(φ̂ - 1) for each sample. Finally, order statistics are used to estimate the quantiles of interest.

Exercise: Find critical values for the test statistic n(φ̂ - 1). Use n = 25, 50, 100, 1000, m = 1000, 10000, and α = .05.

m <- 10000; n <- 50; n1 <- n-1; phi <- rep(0,m)
for (i in 1:m) {
  u <- rnorm(n); x <- cumsum(u)[1:n1]
  phi[i] <- sum(x*u[2:n])/sum(x*x)   # phi = phihat-1
}
q <- quantile(phi,probs=.05); cr.val <- n*q; cr.val
-7.654794

Analogously, we obtain the remaining values (α = .05):

n       for m=1000   for m=10000
25      -7.7         -7.3
50      -7.7         -7.8
100     -8.0         -7.8
1000    -8.5         -8.0

Clearly, the values obtained with m = 10000 are more reliable than those obtained with m = 1000. Furthermore, we can use the critical values obtained for large values of n, e.g., n = 1000, as estimates of the critical values of the asymptotic distribution.

Exercise: Test H₀: φ = 1 for a synthetic AR(1) series.

x <- arima.sim(list(order=c(1,0,0),ar=.7),n=25)   # φ=.7
n <- 25; x1 <- x[1:(n-1)]
phi <- sum(x1*x[2:n])/sum(x1*x1)
n*(phi-1)
-8.74

The unit root hypothesis H₀ can be rejected, because the value of the test statistic is less than the critical value for a sample size of 25, i.e., -8.74 < -7.3.

Exercise: Suppose that x_1,…,x_n are non-stochastic and u_1,…,u_n are uncorrelated with common mean 0 and variance σ². Show that in the linear regression model y_t = β x_t + u_t the variance of the OLS estimator β̂ is given by

var(β̂) = σ² / Σ_{t=1..n} x_t².

Another way of testing the unit root hypothesis is to write the model x_t = φ x_{t-1} + u_t as

Δx_t = x_t - x_{t-1} = φ x_{t-1} + u_t - x_{t-1} = φ* x_{t-1} + u_t,

where φ* = φ - 1, and reject H₀ if the value of the OLS estimator

φ̂* = Σ Δx_t x_{t-1} / Σ x_{t-1}²

or, alternatively, the conventional OLS t-ratio

t = φ̂* / (var̂(φ̂*))^(1/2), where var̂(φ̂*) = s² / Σ x_{t-1}²

and s² is the usual OLS estimator of the error variance σ², is much smaller than 0. The test based on t is called the Dickey-Fuller test. Clearly, t has neither a t-distribution nor a limiting normal distribution if φ = 1.
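Solution sketch for the regression exercise: with non-stochastic regressors, substituting y_t = β x_t + u_t into the usual OLS estimator β̂ = Σ x_t y_t / Σ x_t² gives

```latex
\hat{\beta} = \beta + \frac{\sum_{t}x_{t}u_{t}}{\sum_{t}x_{t}^{2}},
\qquad
\operatorname{var}(\hat{\beta})
= \frac{\sum_{t}x_{t}^{2}\operatorname{var}(u_{t})}{\bigl(\sum_{t}x_{t}^{2}\bigr)^{2}}
= \frac{\sigma^{2}}{\sum_{t}x_{t}^{2}},
```

where the variance step uses that the u_t are uncorrelated, so the variance of Σ x_t u_t is the sum of the individual variances x_t² σ².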

A more realistic model is obtained by introducing additional lags in order to allow for serial correlation:

x_t = φ₁ x_{t-1} + … + φ_p x_{t-p} + u_t

Writing the model as

Δx_t = (φ₁ - 1) x_{t-1} + φ₂ x_{t-2} + … + φ_p x_{t-p} + u_t
= [(φ₁+…+φ_p) - 1] x_{t-1} - (φ₂+…+φ_p) Δx_{t-1} - (φ₃+…+φ_p) Δx_{t-2} - … - φ_p Δx_{t-(p-1)} + u_t
= φ* x_{t-1} + δ₁ Δx_{t-1} + … + δ_{p-1} Δx_{t-(p-1)} + u_t,

where φ* = φ₁+…+φ_p - 1 and δ_j = -(φ_{j+1}+…+φ_p), we see that the unit root hypothesis Φ(1) = 1 - φ₁ - … - φ_p = 0 is equivalent to the hypothesis H₀: φ* = 0.

Including also a constant term and a linear time trend, we obtain an even more general model:

Δx_t = α + βt + φ* x_{t-1} + δ₁ Δx_{t-1} + … + δ_p Δx_{t-p} + u_t

The test of the hypothesis H₀: φ* = 0, which is based on the conventional OLS t-ratio for φ*, is called augmented Dickey-Fuller (ADF) test. In practice, it is extremely hard to decide whether a constant term and a time trend should be included and how many lags should be included. Unfortunately, different model specifications typically produce different test results.

Exercise: Apply an augmented Dickey-Fuller test to the log S&P 500 series created above. Hint:

library(tseries)   # the package tseries is loaded
help(adf.test)