Bootstrapping Cointegrating Regressions 1


Yoosoon Chang, Department of Economics, Rice University
Joon Y. Park, School of Economics, Seoul National University
Kevin Song, Department of Economics, Yale University

Abstract

In this paper, we consider bootstrapping cointegrating regressions. It is shown that the method of bootstrap, if properly implemented, generally yields consistent estimators and test statistics for cointegrating regressions. We do not assume any specific data generating process, and employ the sieve bootstrap based on approximated finite-order vector autoregressions for the regression errors and the first differences of the regressors. In particular, we establish the bootstrap consistency for the OLS method. The bootstrap method can thus be used to correct for the finite sample bias of the OLS estimator and to approximate the asymptotic critical values of the OLS-based test statistics in general cointegrating regressions. The bootstrap OLS procedure, however, is not efficient. For efficient estimation and hypothesis testing, we consider the procedure proposed by Saikkonen (1991) and Stock and Watson (1993) relying on the regression augmented with the leads and lags of differenced regressors. The bootstrap versions of their procedures are shown to be consistent, and can be used to do inferences that are asymptotically valid. A Monte Carlo study is conducted to investigate the finite sample performances of the proposed bootstrap methods.

This version: February 8, 2002

Key words and phrases: cointegrating regression, sieve bootstrap, efficient estimation and hypothesis testing, AR approximation.

1 Park thanks the Department of Economics at Rice University, where he is an Adjunct Professor, for its continuing hospitality and secretarial support. Correspondence address to: Yoosoon Chang, Department of Economics, Rice University, 6100 Main Street, Houston, TX, Tel: , Fax: , yoosoo@rice.edu.

1. Introduction

The bootstrap has become a standard tool for econometric analysis. Roughly, the purpose of using the bootstrap methodology is twofold: to find the distributions of statistics whose asymptotic distributions are unknown or dependent upon nuisance parameters, and to obtain refinements of the asymptotic distributions that are closer to the finite sample distributions of the statistics. It is well known that the bootstrap statistics have the same asymptotic distributions as the corresponding sample statistics for a very wide, if not all, class of models, and therefore, the unknown or nuisance parameter dependent limit distributions can be approximated by the bootstrap simulations. Moreover, if properly implemented for pivotal statistics, the bootstrap simulations indeed provide better approximations to the finite sample distributions of the statistics than their asymptotics. See Horowitz (2002) for an excellent nontechnical survey on the subject.

The purpose of this paper is to develop the bootstrap theory for cointegrating regressions. The bootstrap can be potentially more useful for models with nonstationary time series than for the standard models with stationary time series, since the statistical theories for the former are generally nonstandard and depend, often in a very complicated manner, upon various nuisance parameters. Nevertheless, the bootstrap theories for the former are much less developed compared to those for the latter. Virtually all of the published works on the theoretical aspects of the nonstationary bootstrap consider simple unit root models. See, e.g., Basawa et al. (1991), Chang (2000), Chang and Park (2002b) and Park (2000, 2002). The bootstrap cointegrating regression has been studied only by simulations as in Li and Maddala (1997). The bootstrap method, however, is used quite frequently and extensively by empirical researchers to approximate the distributions of the statistics in more general models with nonstationary time series.

We consider the sieve bootstrap to resample from the cointegrating regressions.
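The sieve resampling idea can be sketched in a few lines. The following minimal univariate illustration is ours, not the paper's (all names are hypothetical): fit an autoregression by OLS, resample the centered residuals, and rebuild the series recursively from the fitted coefficients.

```python
import numpy as np

def sieve_bootstrap(w, q, n_boot, rng):
    """Sketch: fit an AR(q) to w by OLS, resample centered residuals,
    and rebuild bootstrap series recursively from the fitted AR."""
    n = len(w)
    # regressor matrix with columns w_{t-1}, ..., w_{t-q}
    X = np.column_stack([w[q - k - 1:n - k - 1] for k in range(q)])
    phi, *_ = np.linalg.lstsq(X, w[q:], rcond=None)
    resid = w[q:] - X @ phi
    resid = resid - resid.mean()               # center before resampling
    samples = []
    for _ in range(n_boot):
        eps = rng.choice(resid, size=n, replace=True)
        w_star = np.empty(n)
        w_star[:q] = w[:q]                     # initialize with the data
        for t in range(q, n):
            # w*_t = phi_1 w*_{t-1} + ... + phi_q w*_{t-q} + eps*_t
            w_star[t] = phi @ w_star[t - q:t][::-1] + eps[t]
        samples.append(w_star)
    return samples

rng = np.random.default_rng(0)
# AR(1) data for illustration
w = np.empty(300)
w[0] = 0.0
e = rng.standard_normal(300)
for t in range(1, 300):
    w[t] = 0.5 * w[t - 1] + e[t]
boot = sieve_bootstrap(w, q=2, n_boot=20, rng=rng)
```

The multivariate version used in the paper replaces the AR(q) fit by a VAR(q) fit, as spelled out in Section 3.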
The method does not assume any specific data generating processes, and the data are simply fitted by a VAR of order increasing with the sample size. The bootstrap samples are then constructed using the fitted VAR from the resampled innovations. We show in the paper that under such a scheme the bootstrap becomes consistent for both the usual OLS and the efficient OLS by Saikkonen (1991) and Stock and Watson (1993). The sieve bootstrap can therefore be used to reduce the finite sample bias of the OLS estimator, and also to find the asymptotic critical values of the tests based on the OLS estimator. The bootstrapped OLS estimator, however, is inefficient just as is the sample OLS estimator. The bootstrap does not improve the efficiency. To attain efficiency, we need to bootstrap the efficient OLS estimator. The sieve bootstrap can be naturally implemented to do resampling for the efficient estimator, which itself relies on the idea of sieve estimation of the cointegrating regression. We show in the paper that the sieve bootstrap is generally consistent for the efficient OLS method.

Though we focus on the prototype multivariate cointegration model in the paper for concreteness, the theory we derive here can be used to analyze more general cointegrated models. The immediate extensions are the cointegrating regressions with more flexible deterministic trends including those allowing for structural breaks. Cointegrating regressions with shifts in the coefficients can also be analyzed using the methods developed in the paper.

Moreover, our theory extends further to the cointegrating regressions represented as error correction models, seemingly unrelated cointegrated models and panels with cointegration in individual units. The sieve bootstrap proposed here can be applied to all such models only with some obvious modifications. The estimation of and testing for the cointegration parameters can therefore be performed or refined by the sieve bootstrap. It is well expected that the sieve bootstrap is consistent for the aforementioned models if implemented properly as suggested in the paper, and it can be proved rigorously, if necessary, using the theory established in the paper.

The rest of the paper is organized as follows. Section 2 introduces the model, assumptions and some preliminary results. The multivariate cointegrating regression with a detailed specification for the data generating process is given, and a strong invariance principle, which can be used for both the sample and bootstrap asymptotics, is introduced and discussed. The standard asymptotic results for the cointegrating regressions are then derived. In Section 3, the sieve bootstrap procedure is presented, and the relevant bootstrap asymptotics are developed. The bootstrap consistency is established there. Section 4 summarizes our simulation study, and Section 5 concludes the paper. All the mathematical proofs are given in Section 6.

A word on notation. Following the standard convention, we use the superscript "*" to signify whatever is related to the bootstrap samples and dependent upon the realization of the samples. The usual notations for the modes of convergence such as $\to_{a.s.}$, $\to_p$ and $\to_d$ are used without additional references, and the notation $=_d$ denotes the equality in distribution. The stochastic order symbols $o_p$ and $O_p$ are also used frequently. Moreover, we often use such standard notations with the superscript "*" to imply their bootstrap counterparts. We use the notation $|\cdot|$ to denote the Euclidean norm for vectors and matrices, i.e., $|x|^2 = x'x$ and $|A|^2 = \mathrm{tr}\,A'A$ for a vector $x$ and a matrix $A$.
For matrices, we also use the operator norm $\|\cdot\|$, i.e., $\|A\| = \max_x |Ax|/|x|$ for a vector $x$ and a matrix $A$ which are of conformable dimensions. For a matrix $A$, $\mathrm{vec}(A)$ denotes a column vector which stacks the row vectors of $A$.

2. The Model, Assumptions and Preliminary Results

2.1 The Model and Assumptions

We consider the regression model given by
$$y_t = \Pi x_t + u_t \qquad (1)$$
$$x_t = x_{t-1} + v_t$$
for $t = 1, 2, \ldots$, where $w_t = (u_t', v_t')'$ is an $(\ell+m)$-dimensional stationary vector process. Under this specification, the model introduced in (1) becomes a multivariate cointegrating regression. For the subsequent development of our theory, we let $x_0$ be any random variable which is stochastically bounded,

and let $(w_t)$ be a linear process given by
$$w_t = \Psi(L)\varepsilon_t \qquad (2)$$
where
$$\Psi(z) = \sum_{k=0}^{\infty} \Psi_k z^k$$
We make the following assumptions.

Assumption 2.1 We assume

(a) $(\varepsilon_t)$ are iid random variables such that $E\varepsilon_t = 0$, $E\varepsilon_t\varepsilon_t' = \Sigma > 0$ and $E|\varepsilon_t|^a < \infty$ for some $a \ge 4$.

(b) $\det \Psi(z) \ne 0$ for all $|z| \le 1$, and $\sum_{k=0}^{\infty} |k|^b |\Psi_k| < \infty$ for some $b \ge 1$.

The conditions in Assumption 2.1 are not necessary for our results in this section. In particular, the iid assumption on the innovations $(\varepsilon_t)$ is not required. It is introduced here just to make our forthcoming bootstrap procedure more meaningful. All the subsequent theoretical results may be obtained under weaker martingale difference assumptions on $(\varepsilon_t)$. Also, 1-summability of $(\Psi_k)$ is assumed to simplify the proofs, and can be weakened to 1/2-summability. Yet, a wide class of cointegrated models, including the Gaussian error correction models considered in Johansen (1988, 1991), can be represented as the model specified in (1) and (2) with $(\varepsilon_t)$ and $(\Psi_k)$ satisfying the conditions in Assumption 2.1.

2.2 Invariance Principles

For the iid random vectors $(\varepsilon_t)$, we define
$$W_n(r) = n^{-1/2} \sum_{t=1}^{[nr]} \varepsilon_t$$
Then the invariance principle for the iid random vectors $(\varepsilon_t)$ holds, i.e.,
$$W_n \to_d W \qquad (3)$$
where $W$ is a vector Brownian motion with variance $\Sigma$. In particular, we have

Lemma 2.2 Let $E|\varepsilon_t|^a < \infty$ for some $a > 2$. Then we may define on a common probability space $W_n$ and $W$ such that
$$P\left\{ \sup_{0\le r\le 1} |W_n(r) - W(r)| > n^{-1/2} c_n \right\} \le K c_n^{-a} E|\varepsilon_t|^a$$
for any numerical sequence $(c_n)$, $c_n = n^{1/a+\delta}$ for some $\delta > 0$, where $K$ is an absolute constant depending only on $a$ and $\ell+m$.

Lemma 2.2 follows immediately from the strong approximation result established in Einmahl (1987). For any given $a > 2$, we may choose $\delta$ such that $0 < \delta < 1/2 - 1/a$ to show
$$\sup_{0\le r\le 1} |W_n(r) - W(r)| = o_p(1)$$
and therefore, the invariance principle (3) follows directly from Lemma 2.2. The use of strong approximation for the proof of the invariance principle (3) is very useful in our context, since it can also be directly applied to derive the corresponding invariance principle for the bootstrap samples. This will be shown in the next section.

The invariance principle for $(\varepsilon_t)$ directly carries over to the one for $(w_t)$. We have from the Beveridge–Nelson decomposition that
$$w_t = \Psi(1)\varepsilon_t + (\bar{w}_{t-1} - \bar{w}_t)$$
with $\bar{w}_t = \sum_{k=0}^{\infty} \bar\Psi_k \varepsilon_{t-k}$ and $\bar\Psi_k = \sum_{i=k+1}^{\infty} \Psi_i$, and therefore,
$$B_n(r) = n^{-1/2} \sum_{t=1}^{[nr]} w_t = n^{-1/2}\,\Psi(1) \sum_{t=1}^{[nr]} \varepsilon_t + n^{-1/2}(\bar{w}_0 - \bar{w}_{[nr]}) = \Psi(1) W_n(r) + R_n(r) \qquad (4)$$
where
$$\sup_{0\le r\le 1} |R_n(r)| = o_p(1)$$
since $(\bar{w}_t)$ is well-defined and stationary under our 1-summability condition on $(\Psi_k)$. The reader is referred to Phillips and Solo (1992) for more details. It now follows immediately from (4) that
$$B_n \to_d B \qquad (5)$$
where $B$ is an $(\ell+m)$-dimensional vector Brownian motion with variance $\Omega = \Psi(1)\Sigma\Psi(1)'$. We call $\Omega$ the long-run variance of $(w_t)$. If we let $\Gamma(k) = E w_t w_{t+k}'$ be the autocovariance function of $(w_t)$, then we may define the long-run variance of $(w_t)$ as $\Omega = \sum_{k=-\infty}^{\infty} \Gamma(k)$. Correspondingly, we denote by $\Delta$ the one-way long-run variance of $(w_t)$, i.e., $\Delta = \sum_{k=0}^{\infty} \Gamma(k)$. We let $B = (B_1', B_2')'$ and partition $\Omega = (\Omega_{ij})$ and $\Delta = (\Delta_{ij})$ into cell matrices for $i,j = 1,2$, conformably with $w_t = (u_t', v_t')'$.

2.3 Inference on Parameters

As is well known, the parameter $\Pi$ in the multivariate cointegrating regression (1) can be consistently estimated by the OLS estimator $\hat\Pi$, whose limiting distribution is given by
$$n(\hat\Pi - \Pi)' \to_d \left(\int_0^1 B_2 B_2'\right)^{-1} \left(\int_0^1 B_2\,dB_1' + \Delta_{21}\right)$$

as $n \to \infty$. Though the OLS estimator $\hat\Pi$ is super-consistent, it is asymptotically biased and inefficient when $(u_t)$ and $(v_t)$ are correlated. Moreover, the tests based on the OLS estimator $\hat\Pi$ are generally invalid. For the test of the hypothesis
$$H_0 : g(\pi) = 0 \qquad (6)$$
where $g : \mathbb{R}^{\ell m} \to \mathbb{R}^s$ is continuously differentiable with first-order derivative $G = \partial g/\partial \pi'$, $\pi = \mathrm{vec}\,\Pi$, we may consider the Wald-type statistic
$$\hat{T}_n = \hat{g}'\left(\hat{G}\,(M_n^{-1} \otimes \hat\Omega_{11})\,\hat{G}'\right)^{-1}\hat{g} \qquad (7)$$
where $\hat{g} = g(\hat\pi)$, $\hat{G} = G(\hat\pi)$ and $M_n = \sum_{t=1}^{n} x_t x_t'$. For the practical implementation, of course, the long-run variance $\Omega_{11}$ of $(u_t)$ must be estimated. The statistic has the limiting distribution
$$\hat{T}_n \to_d \tau' Q^{-1} \tau \qquad (8)$$
where $\tau = G\,(M^{-1} \otimes I_\ell)\left(\int_0^1 B_2 \otimes dB_1 + \delta_{21}\right)$ and $Q = G\,(M^{-1} \otimes \Omega_{11})\,G'$, using the notations $G = G(\pi)$, $\delta_{21} = \mathrm{vec}\,\Delta_{21}$ and $M = \int_0^1 B_2 B_2'$. The asymptotic distribution of $\hat{T}_n$ is thus nonstandard and dependent upon various nuisance parameters. See, e.g., Park and Phillips (1988) for the distribution theories for $\hat\Pi$ and $\hat{T}_n$.

For the efficient estimation of $\Pi$, we consider the procedure suggested by Saikkonen (1991) and Stock and Watson (1993), which is based on the regressions augmented with the leads and lags of the first-differenced regressors $\Delta x_t$. Under Assumption 2.1, we may write
$$u_t = \sum_{k=-\infty}^{\infty} \Pi_k v_{t-k} + \epsilon_t \qquad (9)$$
where $(\epsilon_t)$ is uncorrelated with $(v_t) = (\Delta x_t)$ at all leads and lags with variance $\Omega_{11\cdot2} = \Omega_{11} - \Omega_{12}\Omega_{22}^{-1}\Omega_{21}$, and $\sum_{k=-\infty}^{\infty} |\Pi_k| < \infty$. The reader is referred to Saikkonen (1991) and the references cited therein for the representation in (9). We are therefore led to consider the regression
$$y_t = \Pi x_t + \sum_{|k|\le p} \Pi_k \Delta x_{t-k} + \epsilon_{pt} \qquad (10)$$
where $\epsilon_{pt} = \epsilon_t + \sum_{|k|>p} \Pi_k \Delta x_{t-k}$. Since $(\Pi_k)$ are absolutely summable, we may well expect that the error $(\epsilon_{pt})$ will become close to $(\epsilon_t)$ if we let the number $p$ of included leads and lags of the differenced regressors increase appropriately as the sample size grows. Indeed, Saikkonen (1991) shows that if we let $p \to \infty$ such that $p^3/n \to 0$ and $n^{1/2}\sum_{|k|>p}|\Pi_k| \to 0$, then the regression (10) is asymptotically equivalent to
$$y_t = \Pi x_t + \epsilon_t \qquad (11)$$
In particular, the OLS estimator $\tilde\Pi$ of $\Pi$ in regression (10), which we call the efficient OLS estimator, has the asymptotics given by
$$n(\tilde\Pi - \Pi)' \to_d \left(\int_0^1 B_2 B_2'\right)^{-1} \int_0^1 B_2\,d(B_1 - \Omega_{12}\Omega_{22}^{-1}B_2)' \qquad (12)$$

which is precisely the same as the asymptotics of the OLS estimator of $\Pi$ from regression (11). Note that $B_1 - \Omega_{12}\Omega_{22}^{-1}B_2$ has reduced variance $\Omega_{11\cdot2} = \Omega_{11} - \Omega_{12}\Omega_{22}^{-1}\Omega_{21}$ relative to the variance $\Omega_{11}$ of $B_1$. The asymptotic variance of $\tilde\Pi$ is therefore smaller than that of $\hat\Pi$. Moreover, $B_1 - \Omega_{12}\Omega_{22}^{-1}B_2$ is independent of $B_2$, and consequently the limiting distribution of $\tilde\Pi$ is mixed normal, which is quite a contrast to the nonstandard limit theory of $\hat\Pi$. As a result, the usual chi-square test is valid if we use a test statistic based on $\tilde\Pi$. Indeed, for testing the hypothesis (6), we may use
$$\tilde{T}_n = \tilde{g}'\left(\tilde{G}\,(\tilde{M}_n^{-1} \otimes \hat\Omega_{11\cdot2})\,\tilde{G}'\right)^{-1}\tilde{g} \qquad (13)$$
where $\tilde{g} = g(\tilde\pi)$ and $\tilde{G} = G(\tilde\pi)$ are defined similarly as $\hat{g}$ and $\hat{G}$ introduced in (7), and $\tilde{M}_n$ is the matrix defining the sample covariance of $\tilde\pi$, which corresponds to $M_n$ for $\hat\pi$ given also in (7). Then it follows that
$$\tilde{T}_n \to_d \chi^2 \qquad (14)$$
As before, $\Omega_{11\cdot2}$ should be consistently estimated for practical applications.

In what follows, we reestablish the asymptotics for the augmented regression (10) using a weaker condition on the expansion rate for $p$, relative to the one required by Saikkonen (1991). We assume

Assumption 2.3 Let $p \to \infty$ and $p = o(n^{1/2})$ as $n \to \infty$.

Our condition on $p$ is substantially weaker than the one used in Saikkonen (1991). Our upper bound for $p$ is extended to $o(n^{1/2})$ compared to his $o(n^{1/3})$. Moreover, we do not impose any restriction on the minimum rate at which $p$ must grow. Thus we may allow for the logarithmic rate that is usually imposed for the common order selection rules such as AIC and BIC. The logarithmic rate, however, is not allowed in Saikkonen (1991) unless $(\Pi_k)$ decreases at a geometric rate. The rate for $p$ here is therefore valid for more general stationary processes $(w_t)$, as can be seen clearly in our proof for the following lemma.

Lemma 2.4 Under Assumptions 2.1 and 2.3, we have (12) and (14) as $n \to \infty$.

The asymptotics established in the above lemma will be used as a basis for the subsequent development of our bootstrap asymptotics.

3. Bootstrap Procedures and Their Asymptotics

3.1
Bootstrap Procedures

In this section, we introduce the bootstrap procedures for our cointegrating regression model (1) and develop their asymptotics. For our bootstrap procedures introduced below, we may

use various consistent estimates for $\Pi$. Therefore, we use the generic notation $\dot\Pi$ to denote any estimate of $\Pi$ that is $n$-consistent. More explicitly, we let
$$\dot\Pi = \hat\Pi,\ \tilde\Pi,\ \Pi_0$$
where $\hat\Pi$ and $\tilde\Pi$ are the OLS estimators of $\Pi$ in regressions (1) and (10), respectively, and $\Pi_0$ denotes the hypothesized or estimated value of $\Pi$ under the restriction given by the hypothesis (6). Other estimates of $\Pi$, which are asymptotically equivalent to any of these estimators, can also be used.

The cointegrating regression model (1) with the specification (2) of its stationary component as a linear process can be bootstrapped using the standard sieve method. The method of sieve bootstrap requires us to fit the linear process $(w_t)$ to a finite order VAR with the order increasing as the sample size grows. We may rewrite $(w_t)$ as a VAR
$$\Phi(L) w_t = \varepsilon_t$$
with $\Phi(z) = I - \sum_{k=1}^{\infty} \Phi_k z^k$, since $\Psi(L)$ in (2) is invertible, and therefore it is reasonable to approximate $(w_t)$ as a finite order VAR
$$w_t = \Phi_1 w_{t-1} + \cdots + \Phi_q w_{t-q} + \varepsilon_{qt} \qquad (15)$$
with $\varepsilon_{qt} = \varepsilon_t + \sum_{k>q} \Phi_k w_{t-k}$. The order $q$ of the approximated VAR is set to increase at a controlled rate of $n$, as we will specify below. In practice, it can be chosen by one of the commonly used order selection rules such as AIC and BIC.

Assumption 3.1 Let $q \to \infty$ and $q = o(n^{1/2})$ as $n \to \infty$.

The required condition on the expansion rate for $q$ in Assumption 3.1 is identical to that for $p$ given in Assumption 2.3. Outlined below are the necessary steps for obtaining the bootstrap samples $(x_t^*)$ and $(y_t^*)$ for $(x_t)$ and $(y_t)$, respectively.

Step 1: Fit regression (1) to obtain $(\hat u_t)$, i.e.,
$$y_t = \dot\Pi x_t + \hat u_t$$
and define $\hat w_t = (\hat u_t', v_t')'$, where $v_t = \Delta x_t$. As mentioned above, we may use various estimates $\dot\Pi$ of $\Pi$ here.

Step 2: Apply the sieve estimation method to the VAR (15) of $(\hat w_t)$ to get the fitted values $(\hat\varepsilon_{qt})$ of $(\varepsilon_{qt})$, i.e.,
$$\hat w_t = \hat\Phi_1 \hat w_{t-1} + \cdots + \hat\Phi_q \hat w_{t-q} + \hat\varepsilon_{qt} \qquad (16)$$
Obtain $(\varepsilon_t^*)$ by resampling the centered fitted residuals
$$\left( \hat\varepsilon_{qt} - \frac{1}{n}\sum_{t=1}^{n} \hat\varepsilon_{qt} \right)$$

and construct the bootstrap samples $(w_t^*)$ recursively using
$$w_t^* = \hat\Phi_1 w_{t-1}^* + \cdots + \hat\Phi_q w_{t-q}^* + \varepsilon_t^*$$
given the initial values $w_t^* = w_t$ for $t = 1, \ldots, q$. This step amounts to the usual sieve bootstrap for $(\hat w_t)$.

Step 3: Define $w_t^* = (u_t^{*\prime}, v_t^{*\prime})'$ analogously as $w_t = (u_t', v_t')'$. Obtain the bootstrap samples $(x_t^*)$ by integrating $(v_t^*)$, i.e., $x_t^* = x_0^* + \sum_{k=1}^{t} v_k^*$, with $x_0^* = x_0$, and generate the bootstrap samples $(y_t^*)$ from
$$y_t^* = \dot\Pi x_t^* + u_t^*$$
The estimate $\dot\Pi$ of $\Pi$ here need not be the same as the one used in Step 1. The bootstrap samples $(x_t^*)$ and $(y_t^*)$ can be used to simulate the distributions of various statistics as explained below.

Discussions on some practical issues arising from the implementation of our bootstrap method are in order. First, the choice of an estimator for $\Pi$ should be made in Steps 1 and 3. The choice in Step 3 is not important, as long as we regard the chosen estimate as the true parameter for the bootstrap samples. The choice of the estimate $\dot\Pi$ for $\Pi$ we use in Step 1, however, is very important, as it would affect the subsequent bootstrap procedure in Steps 2 and 3. In particular, the order selection rule applied to determine the model in Step 2 can be very sensitive to the choice of $\dot\Pi$ made in Step 1. Naturally, we recommend using the best possible estimate, i.e., the most efficient estimate available incorporating all restrictions for the hypothesis testing. This is what is observed by Chang and Park (2002b) for the bootstrap unit root tests. Li and Maddala (1997) also find that using the true value in Step 1 yields the best result for the bootstrap hypothesis tests. Second, the initializations for the generations of $(w_t^*)$ and $(x_t^*)$ are necessary respectively in Steps 2 and 3. The choices $w_t^* = w_t$ for $t = 1, \ldots, q$ and $x_0^* = x_0$ make the results conditional on these initial values of the samples. To reduce or eliminate such dependencies, we may generate a sufficiently large number of $(w_t^*)$ and retain only the last $n$ drawings, or include a constant so that the estimate of $\Pi$ is independent of the initial value of $(x_t^*)$.
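Steps 1–3 can be sketched in code for a bivariate toy version of the prototype model, with a scalar cointegrating coefficient and a fixed VAR(2) sieve. This is our own minimal illustration, not the paper's implementation; all names and the toy data generating process are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- a toy cointegrated sample: y_t = pi*x_t + u_t, x_t = x_{t-1} + v_t ---
n, pi_true = 400, 1.0
eps = rng.standard_normal((n, 2))
v = eps[:, 1]
u = 0.5 * np.roll(v, 1)
u[0] = 0.0
u = u + eps[:, 0]                      # errors correlated with lagged v
x = np.cumsum(v)
y = pi_true * x + u

# Step 1: fit the cointegrating regression to get residuals u_hat
pi_hat = (x @ y) / (x @ x)
u_hat = y - pi_hat * x
w_hat = np.column_stack([u_hat, v])    # w_hat_t = (u_hat_t, v_t)

# Step 2: fit a VAR(q) to w_hat by OLS and center the residuals
q = 2
Z = np.hstack([w_hat[q - k - 1:n - k - 1] for k in range(q)])  # lags 1..q
Phi, *_ = np.linalg.lstsq(Z, w_hat[q:], rcond=None)            # (2q, 2)
res = w_hat[q:] - Z @ Phi
res = res - res.mean(axis=0)

def one_bootstrap_estimate():
    """Resample residuals, rebuild w*, integrate v* to x*, build y*,
    and return the bootstrap OLS estimate of pi."""
    e_star = res[rng.integers(0, len(res), size=n)]
    w_star = np.zeros((n, 2))
    w_star[:q] = w_hat[:q]             # initialize with sample values
    for t in range(q, n):
        lags = w_star[t - q:t][::-1].reshape(-1)  # w*_{t-1}, ..., w*_{t-q}
        w_star[t] = lags @ Phi + e_star[t]
    u_star, v_star = w_star[:, 0], w_star[:, 1]
    # Step 3: x* integrates v*; y* treats pi_hat as the true parameter
    x_star = x[0] + np.cumsum(v_star)
    y_star = pi_hat * x_star + u_star
    return (x_star @ y_star) / (x_star @ x_star)

boot = np.array([one_bootstrap_estimate() for _ in range(200)])
```

The array `boot` approximates the bootstrap distribution of the OLS estimator, from which bias corrections and critical values can be computed.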
We consider the bootstrap version of regression (1) given as
$$y_t^* = \Pi x_t^* + u_t^* \qquad (17)$$
which yields the bootstrap estimator for the usual OLS, viz.,
$$\hat\Pi^{*\prime} = \left( \sum_{t=1}^{n} x_t^* x_t^{*\prime} \right)^{-1} \left( \sum_{t=1}^{n} x_t^* y_t^{*\prime} \right)$$
We also look at the bootstrap version of regression (10), which we write as
$$y_t^* = \Pi x_t^* + \sum_{|k|\le p} \Pi_k^* \Delta x_{t-k}^* + \epsilon_{pt}^* \qquad (18)$$

where
$$\epsilon_{pt}^* = \sum_{|k|>p} \Pi_k^* v_{t-k}^* + \epsilon_t^*$$
The efficient bootstrap OLS estimator, i.e., the OLS estimator from regression (18), is given by
$$\tilde\Pi^{*\prime} = \left[ \sum_{t=1}^{n} x_t^* x_t^{*\prime} - \left(\sum_{t=1}^{n} x_t^* v_{pt}^{*\prime}\right)\left(\sum_{t=1}^{n} v_{pt}^* v_{pt}^{*\prime}\right)^{-1}\left(\sum_{t=1}^{n} v_{pt}^* x_t^{*\prime}\right) \right]^{-1} \left[ \sum_{t=1}^{n} x_t^* y_t^{*\prime} - \left(\sum_{t=1}^{n} x_t^* v_{pt}^{*\prime}\right)\left(\sum_{t=1}^{n} v_{pt}^* v_{pt}^{*\prime}\right)^{-1}\left(\sum_{t=1}^{n} v_{pt}^* y_t^{*\prime}\right) \right]$$
where
$$v_{pt}^* = (\Delta x_{t+p}^{*\prime}, \ldots, \Delta x_{t-p}^{*\prime})'$$
In both regressions (17) and (18), we denote by $\Pi$ the estimate $\dot\Pi$ used to generate the bootstrap samples $(y_t^*)$ from $(x_t^*)$ and $(u_t^*)$. Note that in regression (18) the coefficients of the leads and lags of the differenced regressors depend on the realized samples, which we signify by attaching the superscript "*" to the coefficients $\Pi_k$'s. The test statistics
$$\hat{T}_n^* = \hat{g}^{*\prime}\left(\hat{G}^*\,(M_n^{*-1} \otimes \hat\Omega_{11})\,\hat{G}^{*\prime}\right)^{-1}\hat{g}^*, \qquad \tilde{T}_n^* = \tilde{g}^{*\prime}\left(\tilde{G}^*\,(\tilde{M}_n^{*-1} \otimes \hat\Omega_{11\cdot2})\,\tilde{G}^{*\prime}\right)^{-1}\tilde{g}^*$$
are constructed from the bootstrap OLS and efficient estimators, $\hat\Pi^*$ and $\tilde\Pi^*$, analogously as their sample counterparts $\hat{T}_n$ and $\tilde{T}_n$ defined in (7) and (13). For the definitions of the bootstrap statistics $\hat{T}_n^*$ and $\tilde{T}_n^*$, we may use the long-run variance estimates obtained from each bootstrap sample, say $\Omega_{11}^*$ and $\Omega_{11\cdot2}^*$, in the places of the sample estimates $\hat\Omega_{11}$ and $\hat\Omega_{11\cdot2}$. This would not affect any of our subsequent results, since we only consider the first order asymptotics.

3.2 Bootstrap Asymptotics

The asymptotic theories of the estimators $\hat\Pi^*$ and $\tilde\Pi^*$ can be developed similarly as those for $\hat\Pi$ and $\tilde\Pi$. To develop their asymptotics, we first establish the bootstrap invariance principle for $(\varepsilon_t^*)$. We have

Lemma 3.2 Under Assumptions 2.1 and 3.1,
$$E^*|\varepsilon_t^*|^a = O_p(1)$$
as $n \to \infty$.

Roughly, Lemma 3.2 allows us to regard the bootstrap samples $(\varepsilon_t^*)$ as iid random variables with finite $a$-th moment, given a sample realization. Following the usual convention, we

use the notation $P^*$ to denote the bootstrap probability conditional on the samples. The notation $E^*$ signifies the expectation taken with respect to $P^*$, for each realization of the samples. To develop the bootstrap asymptotics, it is convenient to introduce the bootstrap stochastic order symbols $o_p^*$ and $O_p^*$, which correspond respectively to $o_p$ and $O_p$. Let $\epsilon > 0$ be given arbitrarily, and $M > 0$ be chosen accordingly. For a bootstrap statistic $T_n^*$, we define $T_n^* = o_p^*(1)$ if and only if $P\{P^*\{|T_n^*| > \epsilon\} > \epsilon\} < \epsilon$ for all large $n$, and $T_n^* = O_p^*(1)$ if and only if $P\{P^*\{|T_n^*| > M\} > \epsilon\} < \epsilon$ for all large $n$. They all satisfy the usual rules applicable for $o_p$ and $O_p$. See Chang and Park (2002b) for more details. In particular, whenever $T_n^* \to_d T$ in $P$, we have $T_n^* = O_p^*(1)$ and $T_n^* + o_p^*(1) \to_d T$ in $P$. Note also that $E^*|T_n^*| = o_p(1)$ and $O_p(1)$ imply $T_n^* = o_p^*(1)$ and $O_p^*(1)$, respectively.

Define
$$W_n^*(r) = n^{-1/2} \sum_{t=1}^{[nr]} \varepsilon_t^*$$
analogously as $W_n$ introduced earlier in Section 2, and let $W$ be a vector Brownian motion with variance $\Sigma$. It follows from the strong approximation result in Lemma 2.2 applied to the bootstrap samples $(\varepsilon_t^*)$ that we may choose $W$ satisfying
$$P^*\left\{ \sup_{0\le r\le 1} |W_n^*(r) - W(r)| > n^{-1/2} c_n \right\} \le K c_n^{-a} E^*|\varepsilon_t^*|^a$$
for $(c_n)$ and $K$ given exactly as in Lemma 2.2. Note in particular that $K$ does not depend on the realization of the samples. However, due to the result in Lemma 3.2, we have
$$\sup_{0\le r\le 1} |W_n^* - W| = o_p^*(1)$$
as long as $a > 2$, and consequently
$$W_n^* \to_d W \ \text{in } P$$
An `in probability' version of the bootstrap invariance principle is therefore established for the bootstrap samples $(\varepsilon_t^*)$. The corresponding invariance principle for $(w_t^*)$ can also be derived easily. We let
$$\hat\Phi(1) = I - \sum_{k=1}^{q} \hat\Phi_k \quad \text{and} \quad \hat\Psi(1) = \hat\Phi(1)^{-1}$$

Then we may deduce after some algebra that
$$w_t^* = \hat\Psi(1)\varepsilon_t^* + \hat\Psi(1)\sum_{i=1}^{q}\left(\sum_{j=i}^{q}\hat\Phi_j\right)(w_{t-i}^* - w_{t-i+1}^*) = \hat\Psi(1)\varepsilon_t^* + (\bar{w}_{t-1}^* - \bar{w}_t^*)$$
where $\bar{w}_t^* = \hat\Psi(1)\sum_{i=1}^{q}\left(\sum_{j=i}^{q}\hat\Phi_j\right)w_{t-i+1}^*$. We therefore have
$$B_n^*(r) = n^{-1/2} \sum_{t=1}^{[nr]} w_t^* = n^{-1/2}\,\hat\Psi(1) \sum_{t=1}^{[nr]} \varepsilon_t^* + n^{-1/2}(\bar{w}_0^* - \bar{w}_{[nr]}^*) = \hat\Psi(1) W_n^*(r) + R_n^*(r)$$
It is thus well expected that the invariance principle holds for $(w_t^*)$ if $\hat\Psi(1) \to_p \Psi(1)$ and $\sup_{0\le r\le 1} |R_n^*(r)| = o_p^*(1)$. Let $B$ be the vector Brownian motion introduced in (5). Then we may indeed show that

Theorem 3.3 Under Assumptions 2.1 and 3.1, we have $B_n^* \to_d B$ in $P$ as $n \to \infty$.

Let
$$z_t^* = z_0^* + \sum_{i=1}^{t} w_i^*$$
Then we have

Lemma 3.4 Under Assumptions 2.1 and 3.1, we have
$$n^{-2}\sum_{t=1}^{n} z_t^* z_t^{*\prime} \to_d \int_0^1 BB' \ \text{in } P$$
$$n^{-1}\sum_{t=1}^{n} z_t^* w_t^{*\prime} \to_d \int_0^1 B\,dB' + \Delta \ \text{in } P$$
as $n \to \infty$.

The limit distributions of the bootstrap OLS estimator and the associated statistic can now be obtained easily from Lemma 3.4, and are given in the following theorem. Let $\tau$ and $Q$ be defined as in (8).

Theorem 3.5 Under Assumptions 2.1 and 3.1, we have
$$n(\hat\Pi^* - \dot\Pi)' \to_d \left(\int_0^1 B_2 B_2'\right)^{-1}\left(\int_0^1 B_2\,dB_1' + \Delta_{21}\right) \ \text{in } P$$
and
$$\hat{T}_n^* \to_d \tau' Q^{-1} \tau \ \text{in } P$$
as $n \to \infty$.

The bootstrap is therefore consistent for the OLS regression. The bootstrap estimator $\hat\Pi^*$ has the same limiting distribution as $\hat\Pi$. Therefore, in particular, the bootstrap can be used to remove the asymptotic bias in $\hat\Pi$, since the bootstrap distribution has the same asymptotic bias as the sample distribution. The bias corrected OLS estimator defined by
$$\hat\Pi_c = \hat\Pi - E^*(\hat\Pi^* - \dot\Pi)$$
does not have asymptotic bias, and is thus expected to have smaller bias in finite samples as well. Hypothesis testing can also be done using the OLS estimator, if the critical values are obtained from the bootstrap distribution. For example, consider the test of the hypothesis (6). If we let $\hat{c}_n^*(\alpha)$ be the bootstrap critical value given by
$$P^*\left\{\hat{T}_n^* \le \hat{c}_n^*(\alpha)\right\} = 1 - \alpha$$
for a prescribed size $\alpha$, then we have
$$P\left\{\hat{T}_n \le \hat{c}_n^*(\alpha)\right\} \to 1 - \alpha$$
as $n \to \infty$. Therefore, the bootstrap test has the asymptotically correct size.

We now consider the asymptotics for the bootstrap efficient estimator $\tilde\Pi^*$ obtained from (18). Denote by $f^*(\lambda)$ and $\Gamma^*(k)$, respectively, the spectral density and autocovariance function of $(w_t^*)$. For all bootstrap samples yielding a spectral density bounded away from zero and an absolutely summable autocovariance function, we would have the representation
$$u_t^* = \sum_{k=-\infty}^{\infty} \Pi_k^* v_{t-k}^* + \epsilon_t^* \qquad (19)$$
where $\sum_{k=-\infty}^{\infty} |\Pi_k^*| < \infty$ and $(\epsilon_t^*)$ are uncorrelated with $(v_t^*)$ at all leads and lags. As shown in Lemma A3 in the Appendix, we have
$$\sup_{\lambda} \|f^*(\lambda) - f(\lambda)\| = o_p^*(1)$$
and
$$\sum_{k=-\infty}^{\infty} \Gamma^*(k) = \sum_{k=-\infty}^{\infty} \Gamma(k) + o_p^*(1)$$
It therefore follows that
$$P^*\{f^*(\lambda) > \epsilon\} \to_p 1$$

for some $\epsilon > 0$ and
$$P^*\left\{ \sum_{k=-\infty}^{\infty} |\Gamma^*(k)| < \infty \right\} \to_p 1$$
since $f(\lambda)$ is bounded away from zero and $\Gamma(k)$ is absolutely summable. We may therefore deduce that the probability of realization of samples which allow for the representation (19) approaches one as the sample size increases. The bootstrapped augmented regression (18) is therefore well justified.

Similarly as for the asymptotic equivalence between the sample regressions (10) and (11), we have the bootstrap asymptotic equivalence between the regression (18) and
$$y_t^* = \Pi x_t^* + \epsilon_t^* \qquad (20)$$
for the inference on $\Pi$. We define
$$N_n^* = n^{-1}\sum_{t=1}^{n} x_t^* \epsilon_t^{*\prime}, \qquad M_n^* = n^{-2}\sum_{t=1}^{n} x_t^* x_t^{*\prime}$$
and
$$\tilde{N}_n^* = n^{-1}\left[\sum_{t=1}^{n} x_t^* \epsilon_{pt}^{*\prime} - \left(\sum_{t=1}^{n} x_t^* v_{pt}^{*\prime}\right)\left(\sum_{t=1}^{n} v_{pt}^* v_{pt}^{*\prime}\right)^{-1}\left(\sum_{t=1}^{n} v_{pt}^* \epsilon_{pt}^{*\prime}\right)\right] = N_n^* + P_n^*$$
$$\tilde{M}_n^* = n^{-2}\left[\sum_{t=1}^{n} x_t^* x_t^{*\prime} - \left(\sum_{t=1}^{n} x_t^* v_{pt}^{*\prime}\right)\left(\sum_{t=1}^{n} v_{pt}^* v_{pt}^{*\prime}\right)^{-1}\left(\sum_{t=1}^{n} v_{pt}^* x_t^{*\prime}\right)\right] = M_n^* + Q_n^*$$
Then we have

Lemma 3.6 Under Assumptions 2.1, 2.3 and 3.1, we have $P_n^*, Q_n^* = o_p^*(1)$ as $n \to \infty$.

Therefore, the bootstrap asymptotics for the regressions (18) and (20) are the same. Consequently it follows that

Theorem 3.7 Under Assumptions 2.1, 2.3 and 3.1, we have
$$n(\tilde\Pi^* - \dot\Pi)' \to_d \left(\int_0^1 B_2 B_2'\right)^{-1} \int_0^1 B_2\,d(B_1 - \Omega_{12}\Omega_{22}^{-1}B_2)' \ \text{in } P$$
and
$$\tilde{T}_n^* \to_d \chi^2 \ \text{in } P$$

as $n \to \infty$.

The limit distribution of the bootstrap efficient estimator $\tilde\Pi^*$ is equivalent to that of its sample counterpart $\tilde\Pi$, just as in the case for the OLS estimator. The associated statistics $\tilde{T}_n$ and $\tilde{T}_n^*$ also have the identical asymptotic distribution. The bootstrap consistency of the test statistic for the cointegrating regression (1) therefore extends to the dynamic cointegrating regression (10) augmented with the leads and lags of the differenced regressors. We may also consider the bias corrected estimator
$$\tilde\Pi_c = \tilde\Pi - E^*(\tilde\Pi^* - \dot\Pi)$$
for the efficient estimator $\tilde\Pi$. Though it has no asymptotic bias, the correction may still reduce the bias in finite samples. Moreover, the bootstrap critical value for the test $\tilde{T}_n$ can be found from
$$P^*\left\{\tilde{T}_n^* \le \tilde{c}_n^*(\alpha)\right\} = 1 - \alpha$$
for the size $\alpha$ test. The bootstrap test based on $\tilde{c}_n^*(\alpha)$ would then have the correct asymptotic size, since
$$P\left\{\tilde{T}_n \le \tilde{c}_n^*(\alpha)\right\} \to 1 - \alpha$$
as $n \to \infty$. Note that $\tilde{T}_n$ is asymptotically pivotal, unlike $\hat{T}_n$ whose limiting distribution depends on various nuisance parameters. Therefore, the bootstrap test based on $\tilde{T}_n^*$ may provide an asymptotic refinement. In this case, the bootstrap test utilizing the bootstrap critical values $\tilde{c}_n^*(\alpha)$ would yield rejection rates closer to the nominal values, compared to the test relying on the asymptotic chi-square critical values.

4. Simulations

In this section, we conduct a set of simulations to investigate the finite sample performances of the bootstrap procedures in the context of cointegrating regressions. Our simulations are based on the simple bivariate cointegrating regression model
$$y_t = \pi x_t + u_t$$
$$x_t = x_{t-1} + v_t$$
where the regression error $(u_t)$ and the innovation $(v_t)$ for the regressor $(x_t)$ are generated as stationary processes given by
$$u_t = \varphi_1 v_{t-1} + \varepsilon_{1t}$$
$$v_t = \varphi_2 v_{t-1} + \varepsilon_{2t}$$
The innovations $(\varepsilon_t)$, $\varepsilon_t = (\varepsilon_{1t}, \varepsilon_{2t})'$, are drawn from the bivariate normal distribution with covariance matrix
$$\Sigma = \begin{pmatrix} 1 & \rho \\ \rho & 1 \end{pmatrix}$$

Our simulation setup is entirely analogous to that of Stock and Watson (1993). In our simulations, we set $\varphi_1 = 0.6$, $\varphi_2 = 0.3$, $\rho = 0.5$, and $\pi = 1, 1.1$. We consider two least squares regressions
$$y_t = \hat\pi x_t + \hat{u}_t \qquad (21)$$
$$y_t = \tilde\pi x_t + \tilde\pi_1 \Delta x_t + \tilde\pi_2 \Delta x_{t-1} + \tilde\epsilon_t \qquad (22)$$
and compare the finite sample performances of the estimators $\hat\pi$ and $\tilde\pi$ and the test statistics $\hat{T}_n$ and $\tilde{T}_n$ based respectively on $\hat\pi$ and $\tilde\pi$. Note that our simulation setup introduces nonzero asymptotic correlation between the regressor and the regression error in regression (21), causing the asymptotic bias of $\hat\pi$ and the invalidity of $\hat{T}_n$. The problems are however expected to vanish, at least asymptotically, in regression (22). As explained earlier, $\tilde\pi$ has no asymptotic bias, and the test using $\tilde{T}_n$ is asymptotically valid. For our simulation setup, we have in regression (22)
$$\pi_1 = \rho, \qquad \pi_2 = \varphi_1 - \rho\varphi_2$$
and
$$\epsilon_t = \varepsilon_{1t} - \rho\varepsilon_{2t}$$
Notice that $(\epsilon_t)$ defined as above are uncorrelated with $(\varepsilon_{2t})$ at all leads and lags, and therefore are orthogonal also to $(v_t)$ at all leads and lags, which is required for the validity of the model (22) as an efficient cointegrating regression introduced in (10).

To examine the effectiveness of the bootstrap bias correction, the finite sample performances of $\hat\pi$ and $\tilde\pi$ are compared to those of the bias corrected estimators $\hat\pi_c$ and $\tilde\pi_c$, in terms of bias, variance and mean square error (MSE). For the sake of comparisons, the scaled biases, variances and MSE's for the estimators $\hat\pi$, $\tilde\pi$, $\hat\pi_c$ and $\tilde\pi_c$ are computed. For the test statistics, we compute the rejection probabilities under both the null and the alternative hypotheses, for which we set the value of $\pi$ respectively at 1 and 1.1. Rejection probabilities are obtained using the critical values from the $\chi^2$ distribution for the sample test statistics, $\hat{T}_n$ and $\tilde{T}_n$, and from the bootstrap distribution for the bootstrap tests, $\hat{T}_n^*$ and $\tilde{T}_n^*$. The simulation results are presented in Tables 1, 2 and 3, respectively for the models with no deterministic components, with drift and with a linear trend.
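Under our reading of this design (with the digits lost in extraction restored as $\varphi_1 = 0.6$, $\varphi_2 = 0.3$, $\rho = 0.5$), the implied coefficients $\pi_1 = \rho$ and $\pi_2 = \varphi_1 - \rho\varphi_2$ can be verified numerically by a long regression of $u_t$ on $(v_t, v_{t-1})$. The snippet below is our own sketch, not the paper's code:

```python
import numpy as np

rng = np.random.default_rng(42)
phi1, phi2, rho, n = 0.6, 0.3, 0.5, 200_000

# innovations with unit variances and correlation rho
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
e = rng.standard_normal((n, 2)) @ L.T
e1, e2 = e[:, 0], e[:, 1]

# v_t = phi2 * v_{t-1} + e2_t ;  u_t = phi1 * v_{t-1} + e1_t
v = np.zeros(n)
for t in range(1, n):
    v[t] = phi2 * v[t - 1] + e2[t]
u = np.empty(n)
u[0] = e1[0]
u[1:] = phi1 * v[:-1] + e1[1:]

# regress u_t on (v_t, v_{t-1}); theory gives (rho, phi1 - rho*phi2)
X = np.column_stack([v[1:], v[:-1]])
b, *_ = np.linalg.lstsq(X, u[1:], rcond=None)
```

The projection error $u_t - \rho v_t - (\varphi_1 - \rho\varphi_2)v_{t-1} = \varepsilon_{1t} - \rho\varepsilon_{2t}$ is orthogonal to every $\varepsilon_{2s}$, so the OLS coefficients converge to exactly these values.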
We use superscripts to signify that the associated estimators/tests are computed from the models with drift and with a linear trend, respectively. Samples of sizes $n = 25, 50, 100$ and $200$ are considered. For each case, 5,000 samples are simulated, and for each of the simulated samples, bootstrap repetitions are carried out to compute the bootstrap estimators and test statistics.

The simulation results are largely consistent with the theory derived in the paper. The bias of $\hat\pi$ is quite noticeable and does not vanish as the sample size increases. It turns out, however, that the bootstrap is quite effective in reducing the bias of $\hat\pi$. The bootstrap reduces the bias of $\hat\pi$ substantially in all cases that we consider in our simulations, and the bias reduction becomes more effective as the sample size increases. In some of the reported cases, the bias corrected estimate has bias as small as approximately 2% of that of $\hat\pi$. On the other hand, the bootstrap bias correction has little impact on the sampling variation.
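In code, the bias correction and the bootstrap critical value reduce to two one-liners over the array of bootstrap draws: $\hat\pi_c = \hat\pi - (\text{mean of bootstrap estimates} - \dot\pi)$, and $c^*(\alpha)$ is the $(1-\alpha)$ quantile of the bootstrapped statistics. A schematic sketch with our own names and placeholder inputs:

```python
import numpy as np

def bias_corrected(pi_hat, pi_dot, boot_estimates):
    """pi_c = pi_hat - E*(pi* - pi_dot): the bootstrap mean of (pi* - pi_dot)
    estimates the bias of the estimator."""
    return pi_hat - (np.mean(boot_estimates) - pi_dot)

def bootstrap_critical_value(boot_stats, alpha):
    """c*(alpha): reject when the sample statistic exceeds this quantile."""
    return np.quantile(boot_stats, 1.0 - alpha)

# toy check: draws centered at pi_dot + 0.1 imply an estimated bias of 0.1
draws = 1.1 + np.zeros(500)              # pretend pi_dot = 1.0
pi_c = bias_corrected(1.05, 1.0, draws)  # 1.05 - 0.1 = 0.95
crit = bootstrap_critical_value(np.arange(100.0), 0.05)
```

In practice `boot_estimates` and `boot_stats` would come from the sieve bootstrap replications described in Section 3.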

The sample variances of $\hat\pi_c$ are roughly comparable to, or even slightly larger than, those of $\hat\pi$. The bootstrap also reduces the bias in $\tilde\pi$, albeit the magnitudes of the improvements are not comparable to those for $\hat\pi$. Clearly, $\tilde\pi$ is asymptotically unbiased, and thus there is not as much room for improvement in $\tilde\pi$ as in $\hat\pi$. The bootstrap does not reduce the sampling variations for $\tilde\pi$, just as for $\hat\pi$, and $\tilde\pi_c$ has no smaller sample variances. For both estimators, the bootstrap bias corrections become relatively more important for models with a mean or a linear trend.

It appears that the bootstrap generates quite precise critical values for both tests $\hat{T}_n$ and $\tilde{T}_n$, even when the sample sizes are small. In particular, the performance of $\hat{T}_n^*$ is quite satisfactory. The test $\hat{T}_n$ is invalid, and naturally, yields rejection probabilities under the null that are very distinct from its nominal test sizes. However, its bootstrap version $\hat{T}_n^*$ has quite accurate null rejection probabilities even for small samples. As expected, the test $\tilde{T}_n$ based on the efficient estimator $\tilde\pi$ performs much better than its OLS counterpart $\hat{T}_n$ in terms of the rejection probabilities under the null. Its bootstrap version $\tilde{T}_n^*$ also improves the finite sample performances of $\tilde{T}_n$. While the null rejection probabilities of both bootstrap tests $\hat{T}_n^*$ and $\tilde{T}_n^*$ approach the nominal values as the size of the samples increases, the bootstrap efficient test $\tilde{T}_n^*$ does better in small samples than its OLS counterpart $\hat{T}_n^*$. This is more so for models with a mean or a linear trend. Due to the presence of non-uniform size distortions, the finite sample power comparisons between $\hat{T}_n$ and $\tilde{T}_n$ and their bootstrap counterparts $\hat{T}_n^*$ and $\tilde{T}_n^*$ are not clear. It just seems that they are all roughly comparable.

5. Conclusion

In this paper we consider the bootstrap for cointegrating regressions.
We introduce the sieve bootstrap based on a VAR of order increasing with the sample size, and establish its consistency and asymptotic validity for two procedures: the usual OLS, and the efficient OLS relying on the regressions augmented with the leads and lags of the differenced regressors. For the usual OLS, the bootstrap can thus be employed to correct for biases in the estimated parameters and to compute the critical values of the tests. With the bootstrap bias correction, the OLS estimator becomes asymptotically unbiased. Moreover, the OLS-based tests become asymptotically valid if the bootstrap critical values are used. The bootstrap OLS method, however, is not efficient. For efficient inference, we should base our bootstrap procedure on the efficient OLS method. The sieve bootstrap proposed in the paper appears to improve upon the efficient OLS method in finite samples. It generally reduces the finite sample biases of the estimators and yields sizes that are closer to the nominal sizes of the tests.

The theory and method developed in the paper can be used to analyze more general cointegrated models. Models with deterministic trends and/or structural breaks can be analyzed similarly. Error correction models, and seemingly unrelated and panel cointegration models, are other examples to which our theory and method are readily applicable. Indeed, the required modifications of the theory and method for such extensions are minimal, and can be made with only some obvious adjustments. The theory discussed in the paper is concerned only with the consistency of the bootstrap. For the pivotal statistics,

however, it might well be the case that the bootstrap provides the asymptotic refinements as well. Our simulation study in the paper is in fact somewhat indicative of this possibility. Moreover, it appears that the theoretical demonstration of the asymptotic refinement is possible using the techniques developed in Park (2001) to establish the bootstrap refinement for the unit root tests.

6. Mathematical Proofs

6.1 Useful Lemmas and Their Proofs

We consider the regression

    w_t = α̃_1 w_{t-1} + ... + α̃_q w_{t-q} + ε̃_{qt}    (23)

which is the fitted version of regression (5). Since π̂ is consistent, it is well expected that the fitted regression (23) with (w_t) should be asymptotically equivalent to the fitted regression (6) with (ŵ_t) introduced in Step 1 of our bootstrap procedure.

Lemma A1. Under Assumptions 2.1 and 3.1, we have

    α̂_k = α̃_k + O_p(n^{-1/2})

uniformly in 1 ≤ k ≤ q, as n → ∞. Moreover,

    max_{1≤t≤n} |ε̂_{qt} − ε̃_{qt}| = O_p(n^{-1/2})

as n → ∞.

Proof of Lemma A1. It follows immediately from the definition of (ŵ_t) that

    max_{1≤i,j≤q} | Σ_{t=1}^n ŵ_{t-i} ŵ'_{t-j} − Σ_{t=1}^n w_{t-i} w'_{t-j} |
        ≤ max_{1≤t≤n} |ŵ_t − w_t| ( Σ_{t=1}^n |ŵ_t| + Σ_{t=1}^n |w_t| )

However, we have

    max_{1≤t≤n} |ŵ_t − w_t| = O_p(n^{-1/2})

because

    |ŵ_t − w_t| = | (y_t − x_t'π̂, Δx_t')' − (u_t, v_t')' | ≤ |π̂ − π| |x_t|

and |π̂ − π| = O_p(n^{-1}), max_{1≤t≤n} |x_t| = O_p(n^{1/2}). Since

    Σ_{t=1}^n |w_t|, Σ_{t=1}^n |ŵ_t| = O_p(n)

we have

    max_{1≤i,j≤q} | Σ_{t=1}^n ŵ_{t-i} ŵ'_{t-j} − Σ_{t=1}^n w_{t-i} w'_{t-j} | = O_p(n^{1/2})

The rest of the proof is rather straightforward, and we omit the details.

Let

    w_t^q = (w'_{t-1}, ..., w'_{t-q})'

and define

    M_q = E w_t^q w_t^{q}'

Moreover, denote by f the spectral density of (w_t). Then we have

Lemma A2. Under Assumption 2.1, we have

    ‖ M_q^{-1} ‖ ≤ ( 2π inf_λ ‖f(λ)‖ )^{-1}

for all q ≥ 1.

Proof of Lemma A2. Let c be an eigenvector associated with the smallest eigenvalue λ_min of M_q. Define φ_q(λ) = (e^{iλ}, ..., e^{iqλ}) and use "¯" to denote its conjugate. It follows that

    λ_min = c' M_q c
          = ∫_{-π}^{π} ( c' φ_q(λ) ) f(λ) ( φ̄_q(λ)' c ) dλ
          ≥ ( inf_λ ‖f(λ)‖ ) ∫_{-π}^{π} | φ̄_q(λ)' c |² dλ
          = 2π inf_λ ‖f(λ)‖

However, we have ‖M_q^{-1}‖ = λ_min^{-1}, from which the stated result can be deduced immediately.

Lemma A3. Under Assumptions 2.1 and 3.1, we have

    sup_λ | f̂(λ) − f(λ) | = o_p(1)

and

    Σ_{k=-∞}^{∞} Γ̂(k) = Σ_{k=-∞}^{∞} Γ(k) + o_p(1)

as n → ∞.

Proof of Lemma A3. Given Lemma A1, the stated results are just straightforward extensions of Lemma A2 in Chang and Park (2002b). Here we only obtain "in probability" versions, instead of "almost sure" versions, since the results in Lemma A1 hold only in probability.

Lemma A4. Under Assumptions 2.1 and 3.1, we have

    E* | Σ_{t=1}^n ( w*_{t-i} w*'_{t-j} − Γ*(i − j) ) |² = O_p(n)

uniformly in i and j, as n → ∞.

Proof of Lemma A4. Once again, the stated result follows exactly as in Lemma A4 in Chang and Park (2002b), due to Lemma A1, under some obvious modifications to deal with multiple time series.

Lemma A5. Under Assumptions 2.1, 2.3 and 3.1, we have

    E* ‖ ( Σ_{t=1}^n v*_{pt} v*'_{pt} )^{-1} ‖ = O_p(n^{-1})

and

    E* | Σ_{t=1}^n x*_t v*'_{pt} | = O_p(n p^{1/2})

as n → ∞.

Proof of Lemma A5. The proof is essentially identical to that of Lemma 3.3 in Chang and Park (2002b).

Lemma A6. Under Assumptions 2.1, 2.3 and 3.1, we have

    Σ_{t=1}^n v*_{pt} ε*'_{pt} = O_p(n^{1/2} p^{1/2})

and

    n^{-1} Σ_{t=1}^n x*_t ( ε*_t − ε*_{pt} ) = o_p(1)

as n → ∞.

Proof of Lemma A6. Define (Ψ̂_{pk}) such that

    ε*_{pt} − ε*_t = Σ_{|k|>p} Π̂_k v*_{t-k} = Σ_{|k|>p} Ψ̂_{pk} ε*_{t-k}

Note that

    Σ_{|k|>p} |Ψ̂_{pk}| ≤ a Σ_{|k|>p} |Π̂_k|

for some constant a > 0, as one may easily deduce.

To show the first part, we write, for 1 ≤ i ≤ p,

    Σ_{t=1}^n v*_{t-i} ε*'_{pt} = Σ_{t=1}^n v*_{t-i} ε*'_t + Σ_{t=1}^n v*_{t-i} ( ε*_{pt} − ε*_t )'

It is easy to see that

    E* | Σ_{t=1}^n v*_{t-i} ε*'_t |² = O_p(n)

uniformly in 1 ≤ i ≤ p. Therefore, it suffices to show that

    Σ_{t=1}^n v*_{t-i} ( ε*_{pt} − ε*_t )' = o_p(n^{1/2})    (24)

uniformly in 1 ≤ i ≤ p. However, we have, as in the proof of Lemma 3.1 in Chang and Park (2002a),

    | Σ_{t=1}^n v*_{t-i} ( Σ_{|j|>p} Ψ̂_{pj} ε*_{t-j} )' |
        ≤ ( Σ_{|k|>p} |Ψ̂_{pk}| ) O_p(n^{1/2}) ≤ ( a Σ_{|k|>p} |Π̂_k| ) O_p(n^{1/2})

uniformly in 1 ≤ i ≤ p, and (24) follows immediately.

To prove the second part, we define

    ξ*_t = Σ_{i=1}^{t} ε*_i    (25)

so that

    z*_t = Ψ̂(1) ξ*_t + ( w̄*_0 − w̄*_t )

It follows that

    Σ_{t=1}^n z*_t ( ε*_{pt} − ε*_t )'
        = Ψ̂(1) Σ_{t=1}^n ξ*_t ( ε*_{pt} − ε*_t )'
          + w̄*_0 Σ_{t=1}^n ( ε*_{pt} − ε*_t )'
          − Σ_{t=1}^n w̄*_t ( ε*_{pt} − ε*_t )'    (26)

We have

    Σ_{t=1}^n ( ε*_{pt} − ε*_t ) = Σ_{|k|>p} Ψ̂_{pk} Σ_{t=1}^n ε*_{t-k}
        ≤ ( Σ_{|k|>p} |Ψ̂_{pk}| ) O_p(n^{1/2})
        ≤ ( a Σ_{|k|>p} |Π̂_k| ) O_p(n^{1/2})    (27)

We also have, similarly as in the proof of the first part,

    Σ_{t=1}^n w̄*_t ( ε*_{pt} − ε*_t )' ≤ ( a Σ_{|k|>p} |Π̂_k| ) O_p(n)    (28)

Moreover, we have, as in the proof of Lemma 3.1 in Chang and Park (2002a),

    Σ_{t=1}^n ξ*_t ( ε*_{pt} − ε*_t )'
        ≤ ( Σ_{|k|>p} |Ψ̂_{pk}| ) O_p(n) ≤ ( a Σ_{|k|>p} |Π̂_k| ) O_p(n)    (29)

The second part can now be easily deduced from (26) and (27)–(29). The proof is therefore complete.

6.2 Proofs of Lemmas and Theorems

Proof of Lemma 2.2. The stated result follows from Einmahl (1987). In particular, he shows that his Equation 1.3 holds for all δ when 2 < s < 4, and for δ ≤ K_s with γ < 1/(2s − 4) when s ≥ 4, in his notation. In either case, his δ_n is greater than our n^{1/a+δ} with any δ > 0, as long as n is sufficiently large. His result is therefore applicable as we formulate here.

Proof of Lemma 2.4. We write

    n ( π̃ − π ) = ( n^{-2} Σ_{t=1}^n x_t x_t' − Q_n )^{-1} ( n^{-1} Σ_{t=1}^n x_t ε_t' + P_n )

where

    P_n = n^{-1} [ Σ_{t=1}^n x_t ( ε_{pt} − ε_t )'
            − ( Σ_{t=1}^n x_t v'_{pt} ) ( Σ_{t=1}^n v_{pt} v'_{pt} )^{-1} ( Σ_{t=1}^n v_{pt} ε'_{pt} ) ]

    Q_n = n^{-2} ( Σ_{t=1}^n x_t v'_{pt} ) ( Σ_{t=1}^n v_{pt} v'_{pt} )^{-1} ( Σ_{t=1}^n v_{pt} x_t' )

where, in turn,

    v_{pt} = ( v'_{t+p}, ..., v'_{t-p} )'

To get the stated result, it suffices to show that P_n, Q_n = o_p(1) under Assumptions 2.1 and 2.3. In the subsequent proof, we use

    Σ_{t=1}^n x_t v'_{pt} = O_p(n p^{1/2})    (30)

    ‖ ( Σ_{t=1}^n v_{pt} v'_{pt} )^{-1} ‖ = O_p(n^{-1})    (31)

which are the multivariate extensions of the results established in Chang and Park (2002a). The required extensions are straightforward, and the details are omitted.

It follows immediately from (30) and (31) that

    Q_n = n^{-2} O_p(n p^{1/2}) O_p(n^{-1}) O_p(n p^{1/2}) = O_p(n^{-1} p)

since

    ‖Q_n‖ ≤ n^{-2} ‖ Σ_{t=1}^n x_t v'_{pt} ‖ ‖ ( Σ_{t=1}^n v_{pt} v'_{pt} )^{-1} ‖ ‖ Σ_{t=1}^n v_{pt} x_t' ‖

as one may easily see. We now show that P_n = o_p(1). We first write

    ‖P_n‖ ≤ n^{-1} ‖ Σ_{t=1}^n x_t ( ε_{pt} − ε_t )' ‖
            + n^{-1} ‖ Σ_{t=1}^n x_t v'_{pt} ‖ ‖ ( Σ_{t=1}^n v_{pt} v'_{pt} )^{-1} ‖ ‖ Σ_{t=1}^n v_{pt} ε'_{pt} ‖
          = A_n + B_n

We may show, as in the proof of Lemma 3.1 of Chang and Park (2002a), that

    A_n ≤ ( Σ_{|k|>p} |Π_k| ) O_p(1)

Moreover, we have

    ‖ Σ_{t=1}^n v_{t-i} ε'_{pt} ‖ ≤ ‖ Σ_{t=1}^n v_{t-i} u'_t ‖ + Σ_{|j|≤p} |Π_j| ‖ Σ_{t=1}^n v_{t-i} v'_{t-j} ‖ = O_p(n^{1/2})

uniformly in i for |i| ≤ p. It therefore follows that

    B_n = n^{-1} O_p(n p^{1/2}) O_p(n^{-1}) O_p(n^{1/2} p^{1/2}) = O_p(n^{-1/2} p)

as was to be shown.

Proof of Lemma 3.2. Given the result in Lemma A1, the proof is the trivial extension of the proof of Lemma 3.2 in Park (2002). The details are, therefore, omitted.

Proof of Theorem 3.3. To derive the bootstrap invariance principle for (w*_t) from that of (ε*_t), we need to show that

    α̂(1) →_p α(1)    (32)

and

    P* { max_{1≤t≤n} n^{-1/2} |w̄*_t| > ε } = o_p(1)    (33)

for any ε > 0. Let α̃(1) be defined exactly as α̂(1), using the fitted coefficients (α̃_k) in regression (23). It follows immediately from Lemma A1 that

    α̂(1) = α̃(1) + O_p(n^{-1/2} q)

Moreover, we may deduce, as in the proof of Lemma 3.5 in Chang and Park (2002a), that

    α̃(1) = α(1) + O_p(n^{-1/2} q) + o(q^{-b})

using the result in Shibata (1980). We therefore have α̂(1) = α(1) + o_p(1), and obtain (32). The proof of (33) is essentially identical to the proof of Theorem 3.3 in Park (2002).

Proof of Lemma 3.4. Set z*_0 = 0 for simplicity. The required modification to allow for a nonzero z*_0 is trivial. The first part follows immediately, since n^{-1/2} z*_0 = O_p(1) and

    n^{-2} Σ_{t=1}^n z*_t z*'_t =_d ∫_0^1 B* B*'

for large n. To prove the second part, we let (ξ*_t) be defined as in (25), so that we have

    z*_t = Ψ̂(1) ξ*_t + ( w̄*_0 − w̄*_t )

It is straightforward to show that

    z*_0 w̄*'_0, w̄*_0 Σ_{t=1}^n ε*'_t Ψ̂(1)', Σ_{t=1}^n w̄*_t ε*'_t Ψ̂(1)' = O_p(n^{1/2})

and we therefore have, for large n,

    n^{-1} Σ_{t=1}^n z*_t w*'_t
        = Ψ̂(1) ( n^{-1} Σ_{t=1}^n ξ*_t ε*'_t ) Ψ̂(1)' + n^{-1} Σ_{t=1}^n w*_t w̄*'_t + o_p(1)    (34)

It follows that

    n^{-1} Σ_{t=1}^n ξ*_t ε*'_t →_d ∫_0^1 B* dB*'

by the bootstrap invariance principle and Kurtz and Protter (1991). Moreover, it can be deduced, analogously as in Lemma A4, that

    E* | Σ_{t=1}^n ( w*_t w̄*'_t − E* w*_t w̄*'_t ) |² = O_p(n)

and we have

    n^{-1} Σ_{t=1}^n w*_t w̄*'_t = E* w*_t w̄*'_t + O_p(n^{-1/2})

However,

    E* w*_t w̄*'_t = Σ_{k=1}^{∞} Γ*(k) = Σ_{k=1}^{∞} Γ(k) + o_p(1)

from which, together with (34), the stated result follows immediately.

Proof of Theorem 3.5. The results can easily be derived from Lemma 3.4, using the bootstrap invariance principle and the continuous mapping theorem.

Proof of Lemma 3.6. Notice that

    ‖Q*_n‖ ≤ n^{-2} ‖ Σ_{t=1}^n x*_t v*'_{pt} ‖ ‖ ( Σ_{t=1}^n v*_{pt} v*'_{pt} )^{-1} ‖ ‖ Σ_{t=1}^n v*_{pt} x*'_t ‖

It therefore follows that

    Q*_n = n^{-2} O_p(n p^{1/2}) O_p(n^{-1}) O_p(n p^{1/2}) = O_p(n^{-1} p)

from Lemma A5. This shows that Q*_n = o_p(1). Moreover, we have

    ‖P*_n‖ ≤ n^{-1} ‖ Σ_{t=1}^n x*_t ( ε*_{pt} − ε*_t )' ‖
            + n^{-1} ‖ Σ_{t=1}^n x*_t v*'_{pt} ‖ ‖ ( Σ_{t=1}^n v*_{pt} v*'_{pt} )^{-1} ‖ ‖ Σ_{t=1}^n v*_{pt} ε*'_{pt} ‖

and it follows from Lemmas A5 and A6 that

    P*_n = o_p(1) + n^{-1} O_p(n p^{1/2}) O_p(n^{-1}) O_p(n^{1/2} p^{1/2}) = o_p(1)

as required to be shown.

Proof of Theorem 3.7. Due to Lemma 3.6, we have

    n ( π̃* − π̃ ) = ( n^{-2} Σ_{t=1}^n x*_t x*'_t )^{-1} ( n^{-1} Σ_{t=1}^n x*_t ε*'_t ) + o_p(1)

The bootstrap asymptotic distribution of π̃* can now be easily deduced from the bootstrap invariance principle and Kurtz and Protter (1991). The bootstrap asymptotic distribution of T̃* may similarly be obtained.

References

Basawa, I.V., A.K. Mallik, W.P. McCormick, J.H. Reeves and R.L. Taylor (1991). "Bootstrapping unstable first-order autoregressive processes," Annals of Statistics 19: 1098-1101.

Chang, Y. and J.Y. Park (2002a). "On the asymptotics of ADF tests for unit roots," forthcoming in Econometric Reviews.

Chang, Y. and J.Y. Park (2002b). "A sieve bootstrap for the test of a unit root," forthcoming in Journal of Time Series Analysis.

Einmahl, U. (1987). "A useful estimate in the multidimensional invariance principle," Probability Theory and Related Fields 76: 81-101.

Horowitz, J. (2002). "The bootstrap," forthcoming in Handbook of Econometrics, Vol. 5. Elsevier, Amsterdam.

Johansen, S. (1988). "Statistical analysis of cointegration vectors," Journal of Economic Dynamics and Control 12: 231-254.

Johansen, S. (1991). "Estimation and hypothesis testing of cointegration vectors in Gaussian vector autoregressive models," Econometrica 59: 1551-1580.

Kurtz, T.G. and P. Protter (1991). "Weak limit theorems for stochastic integrals and stochastic differential equations," Annals of Probability 19: 1035-1070.

Li, H. and G.S. Maddala (1997). "Bootstrapping cointegrating regressions," Journal of Econometrics 80: 297-318.

Park, J.Y. (2001). "Bootstrap unit root tests," mimeographed, School of Economics, Seoul National University.

Park, J.Y. (2002). "An invariance principle for sieve bootstrap in time series," forthcoming in Econometric Theory.

Park, J.Y. and P.C.B. Phillips (1988). "Statistical inference in regressions with integrated processes: Part 1," Econometric Theory 4: 468-497.

Phillips, P.C.B. and V. Solo (1992). "Asymptotics for linear processes," Annals of Statistics 20: 971-1001.

Saikkonen, P. (1991). "Asymptotically efficient estimation of cointegration regressions," Econometric Theory 7: 1-21.

Shibata, R. (1980). "Asymptotically efficient selection of the order of the model for estimating parameters of a linear process," Annals of Statistics 8: 147-164.

Stock, J.H. and M.W. Watson (1993). "A simple estimator of cointegrating vectors in higher order integrated systems," Econometrica 61: 783-820.

28 27 Table.: Fiite Sample Performaces of the Estimators estimator bias 2 var 2 MSE 25 ^¼ ^¼ c ~¼ ~¼ c ^¼ ^¼ c ~¼ ~¼ c ^¼ ^¼ c ~¼ ~¼ c ^¼ ^¼ c ~¼ ~¼ c Table.2: Fiite Sample Performaces of the Test Statistics sizes powers test % test 5% test % test % test 5% test % test 25 ^T ^T ~T ~T ^T ^T ~T ~T ^T ^T ~T ~T ^T ^T ~T ~T

Table 2.1: Finite Sample Performances of the Estimators
(bias, variance, and MSE of π̂^μ, π̂^μ_c, π̃^μ, and π̃^μ_c in the model with drift, for sample sizes n = 25, 50, 100, and 200)

Table 2.2: Finite Sample Performances of the Test Statistics
(sizes and powers of T̂^μ, T̂^{μ*}, T̃^μ, and T̃^{μ*} at the 1%, 5%, and 10% levels, for sample sizes n = 25, 50, 100, and 200)

Table 3.1: Finite Sample Performances of the Estimators
(bias, variance, and MSE of π̂^τ, π̂^τ_c, π̃^τ, and π̃^τ_c in the model with a linear trend, for sample sizes n = 25, 50, 100, and 200)

Table 3.2: Finite Sample Performances of the Test Statistics
(sizes and powers of T̂^τ, T̂^{τ*}, T̃^τ, and T̃^{τ*} at the 1%, 5%, and 10% levels, for sample sizes n = 25, 50, 100, and 200)
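As a companion to the size results reported in the tables, the sketch below shows how bootstrap critical values of the kind used for the tests can be computed in practice. It is a simplified stand-in, not the paper's exact construction: the statistic is a naive OLS t-ratio for the slope, the sieve is a VAR(q) fitted to the residuals and the differenced regressor, and the null is imposed at the point estimate when the bootstrap samples are generated. The DGP, the order q, the replication count B, and the 5% level are illustrative assumptions.

```python
import numpy as np

def tstat(y, x, b0):
    """Naive OLS t-statistic for H0: slope = b0 (no serial-correlation correction)."""
    b = float(x @ y / (x @ x))
    u = y - b * x
    s2 = u @ u / (len(y) - 1)
    return (b - b0) / np.sqrt(s2 / (x @ x)), b, u

def bootstrap_critical_value(y, x, level=0.05, q=1, B=199, seed=1):
    """Sieve-bootstrap critical value for the two-sided t-test.

    Bootstrap samples are generated with the slope fixed at the point
    estimate, so the |t*| draws approximate the null distribution of |t|
    even though the errors are serially correlated.
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    _, b_hat, u = tstat(y, x, 0.0)
    v = np.diff(x)
    w = np.column_stack([u[1:], v])
    # VAR(q) fit of w_t on its q lags, most recent lag first
    Z = np.column_stack([w[q - j - 1:len(w) - j - 1] for j in range(q)])
    W = w[q:]
    A = np.linalg.lstsq(Z, W, rcond=None)[0]
    eps = W - Z @ A
    eps = eps - eps.mean(axis=0)               # centre the fitted innovations
    ts = np.empty(B)
    for b in range(B):
        e = eps[rng.integers(0, len(eps), size=n + q)]
        ws = np.zeros((n + q, 2))
        for t in range(q, n + q):
            ws[t] = ws[t - q:t][::-1].reshape(-1) @ A + e[t]
        ws = ws[q:]
        xs = np.cumsum(ws[:, 1])               # bootstrap regressor
        ys = b_hat * xs + ws[:, 0]             # null imposed at the estimate
        ts[b] = abs(tstat(ys, xs, b_hat)[0])
    return float(np.quantile(ts, 1.0 - level))

# Toy sample: x_t a random walk, u_t an AR(1), true slope = 1
rng = np.random.default_rng(42)
n = 100
x = np.cumsum(rng.standard_normal(n))
u = np.zeros(n)
for t in range(1, n):
    u[t] = 0.5 * u[t - 1] + rng.standard_normal()
y = 1.0 * x + u
t_obs = tstat(y, x, 1.0)[0]
cv = bootstrap_critical_value(y, x, level=0.05)
reject = abs(t_obs) > cv                       # bootstrap 5% test of H0: slope = 1
```

Comparing |t| with the bootstrap quantile, rather than with a normal critical value, is what restores accurate null rejection probabilities when the naive statistic itself is invalid.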


More information

Geometry of LS. LECTURE 3 GEOMETRY OF LS, PROPERTIES OF σ 2, PARTITIONED REGRESSION, GOODNESS OF FIT

Geometry of LS. LECTURE 3 GEOMETRY OF LS, PROPERTIES OF σ 2, PARTITIONED REGRESSION, GOODNESS OF FIT OCTOBER 7, 2016 LECTURE 3 GEOMETRY OF LS, PROPERTIES OF σ 2, PARTITIONED REGRESSION, GOODNESS OF FIT Geometry of LS We ca thik of y ad the colums of X as members of the -dimesioal Euclidea space R Oe ca

More information

Algebra of Least Squares

Algebra of Least Squares October 19, 2018 Algebra of Least Squares Geometry of Least Squares Recall that out data is like a table [Y X] where Y collects observatios o the depedet variable Y ad X collects observatios o the k-dimesioal

More information

Optimally Sparse SVMs

Optimally Sparse SVMs A. Proof of Lemma 3. We here prove a lower boud o the umber of support vectors to achieve geeralizatio bouds of the form which we cosider. Importatly, this result holds ot oly for liear classifiers, but

More information

Spatial Nonstationarity and Spurious Regression: The Case with Row-Normalized Spatial Weights Matrix

Spatial Nonstationarity and Spurious Regression: The Case with Row-Normalized Spatial Weights Matrix Spatial Nostatioarity ad Spurious Regressio: The Case with Row-Normalized Spatial Weights Matrix March 26, 29 Abstract This paper ivestigates the spurious regressio i the spatial settig where the regressat

More information

Estimation for Complete Data

Estimation for Complete Data Estimatio for Complete Data complete data: there is o loss of iformatio durig study. complete idividual complete data= grouped data A complete idividual data is the oe i which the complete iformatio of

More information

An Introduction to Randomized Algorithms

An Introduction to Randomized Algorithms A Itroductio to Radomized Algorithms The focus of this lecture is to study a radomized algorithm for quick sort, aalyze it usig probabilistic recurrece relatios, ad also provide more geeral tools for aalysis

More information

Lecture 24: Variable selection in linear models

Lecture 24: Variable selection in linear models Lecture 24: Variable selectio i liear models Cosider liear model X = Z β + ε, β R p ad Varε = σ 2 I. Like the LSE, the ridge regressio estimator does ot give 0 estimate to a compoet of β eve if that compoet

More information

ECONOMETRIC THEORY. MODULE XIII Lecture - 34 Asymptotic Theory and Stochastic Regressors

ECONOMETRIC THEORY. MODULE XIII Lecture - 34 Asymptotic Theory and Stochastic Regressors ECONOMETRIC THEORY MODULE XIII Lecture - 34 Asymptotic Theory ad Stochastic Regressors Dr. Shalabh Departmet of Mathematics ad Statistics Idia Istitute of Techology Kapur Asymptotic theory The asymptotic

More information

Entropy Rates and Asymptotic Equipartition

Entropy Rates and Asymptotic Equipartition Chapter 29 Etropy Rates ad Asymptotic Equipartitio Sectio 29. itroduces the etropy rate the asymptotic etropy per time-step of a stochastic process ad shows that it is well-defied; ad similarly for iformatio,

More information

σ 2 ) Consider a discrete-time random walk x t =u 1 + +u t with i.i.d. N(0,σ 2 ) increments u t. Exercise: Show that UL N(0, 3

σ 2 ) Consider a discrete-time random walk x t =u 1 + +u t with i.i.d. N(0,σ 2 ) increments u t. Exercise: Show that UL N(0, 3 Cosider a discrete-time radom walk x t =u + +u t with i.i.d. N(,σ ) icremets u t. Exercise: Show that UL ( x) = 3 x t L σ N(, 3 ). Exercise: Show that U (i) x t ~N(,tσ ), (ii) s

More information

Estimation of the Mean and the ACVF

Estimation of the Mean and the ACVF Chapter 5 Estimatio of the Mea ad the ACVF A statioary process {X t } is characterized by its mea ad its autocovariace fuctio γ ), ad so by the autocorrelatio fuctio ρ ) I this chapter we preset the estimators

More information

SOME THEORY AND PRACTICE OF STATISTICS by Howard G. Tucker

SOME THEORY AND PRACTICE OF STATISTICS by Howard G. Tucker SOME THEORY AND PRACTICE OF STATISTICS by Howard G. Tucker CHAPTER 9. POINT ESTIMATION 9. Covergece i Probability. The bases of poit estimatio have already bee laid out i previous chapters. I chapter 5

More information

Double Stage Shrinkage Estimator of Two Parameters. Generalized Exponential Distribution

Double Stage Shrinkage Estimator of Two Parameters. Generalized Exponential Distribution Iteratioal Mathematical Forum, Vol., 3, o. 3, 3-53 HIKARI Ltd, www.m-hikari.com http://dx.doi.org/.9/imf.3.335 Double Stage Shrikage Estimator of Two Parameters Geeralized Expoetial Distributio Alaa M.

More information

If, for instance, we were required to test whether the population mean μ could be equal to a certain value μ

If, for instance, we were required to test whether the population mean μ could be equal to a certain value μ STATISTICAL INFERENCE INTRODUCTION Statistical iferece is that brach of Statistics i which oe typically makes a statemet about a populatio based upo the results of a sample. I oesample testig, we essetially

More information

ECE 901 Lecture 12: Complexity Regularization and the Squared Loss

ECE 901 Lecture 12: Complexity Regularization and the Squared Loss ECE 90 Lecture : Complexity Regularizatio ad the Squared Loss R. Nowak 5/7/009 I the previous lectures we made use of the Cheroff/Hoeffdig bouds for our aalysis of classifier errors. Hoeffdig s iequality

More information

Lecture Stat Maximum Likelihood Estimation

Lecture Stat Maximum Likelihood Estimation Lecture Stat 461-561 Maximum Likelihood Estimatio A.D. Jauary 2008 A.D. () Jauary 2008 1 / 63 Maximum Likelihood Estimatio Ivariace Cosistecy E ciecy Nuisace Parameters A.D. () Jauary 2008 2 / 63 Parametric

More information

Preponderantly increasing/decreasing data in regression analysis

Preponderantly increasing/decreasing data in regression analysis Croatia Operatioal Research Review 269 CRORR 7(2016), 269 276 Prepoderatly icreasig/decreasig data i regressio aalysis Darija Marković 1, 1 Departmet of Mathematics, J. J. Strossmayer Uiversity of Osijek,

More information

An almost sure invariance principle for trimmed sums of random vectors

An almost sure invariance principle for trimmed sums of random vectors Proc. Idia Acad. Sci. Math. Sci. Vol. 20, No. 5, November 200, pp. 6 68. Idia Academy of Scieces A almost sure ivariace priciple for trimmed sums of radom vectors KE-ANG FU School of Statistics ad Mathematics,

More information

Marcinkiwiecz-Zygmund Type Inequalities for all Arcs of the Circle

Marcinkiwiecz-Zygmund Type Inequalities for all Arcs of the Circle Marcikiwiecz-ygmud Type Iequalities for all Arcs of the Circle C.K. Kobidarajah ad D. S. Lubisky Mathematics Departmet, Easter Uiversity, Chekalady, Sri Laka; Mathematics Departmet, Georgia Istitute of

More information

Lecture 2: Monte Carlo Simulation

Lecture 2: Monte Carlo Simulation STAT/Q SCI 43: Itroductio to Resamplig ethods Sprig 27 Istructor: Ye-Chi Che Lecture 2: ote Carlo Simulatio 2 ote Carlo Itegratio Assume we wat to evaluate the followig itegratio: e x3 dx What ca we do?

More information

6. Kalman filter implementation for linear algebraic equations. Karhunen-Loeve decomposition

6. Kalman filter implementation for linear algebraic equations. Karhunen-Loeve decomposition 6. Kalma filter implemetatio for liear algebraic equatios. Karhue-Loeve decompositio 6.1. Solvable liear algebraic systems. Probabilistic iterpretatio. Let A be a quadratic matrix (ot obligatory osigular.

More information

[412] A TEST FOR HOMOGENEITY OF THE MARGINAL DISTRIBUTIONS IN A TWO-WAY CLASSIFICATION

[412] A TEST FOR HOMOGENEITY OF THE MARGINAL DISTRIBUTIONS IN A TWO-WAY CLASSIFICATION [412] A TEST FOR HOMOGENEITY OF THE MARGINAL DISTRIBUTIONS IN A TWO-WAY CLASSIFICATION BY ALAN STUART Divisio of Research Techiques, Lodo School of Ecoomics 1. INTRODUCTION There are several circumstaces

More information

CEE 522 Autumn Uncertainty Concepts for Geotechnical Engineering

CEE 522 Autumn Uncertainty Concepts for Geotechnical Engineering CEE 5 Autum 005 Ucertaity Cocepts for Geotechical Egieerig Basic Termiology Set A set is a collectio of (mutually exclusive) objects or evets. The sample space is the (collectively exhaustive) collectio

More information

Advanced Stochastic Processes.

Advanced Stochastic Processes. Advaced Stochastic Processes. David Gamarik LECTURE 2 Radom variables ad measurable fuctios. Strog Law of Large Numbers (SLLN). Scary stuff cotiued... Outlie of Lecture Radom variables ad measurable fuctios.

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 21 11/27/2013

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 21 11/27/2013 MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 21 11/27/2013 Fuctioal Law of Large Numbers. Costructio of the Wieer Measure Cotet. 1. Additioal techical results o weak covergece

More information

G. R. Pasha Department of Statistics Bahauddin Zakariya University Multan, Pakistan

G. R. Pasha Department of Statistics Bahauddin Zakariya University Multan, Pakistan Deviatio of the Variaces of Classical Estimators ad Negative Iteger Momet Estimator from Miimum Variace Boud with Referece to Maxwell Distributio G. R. Pasha Departmet of Statistics Bahauddi Zakariya Uiversity

More information

6a Time change b Quadratic variation c Planar Brownian motion d Conformal local martingales e Hints to exercises...

6a Time change b Quadratic variation c Planar Brownian motion d Conformal local martingales e Hints to exercises... Tel Aviv Uiversity, 28 Browia motio 59 6 Time chage 6a Time chage..................... 59 6b Quadratic variatio................. 61 6c Plaar Browia motio.............. 64 6d Coformal local martigales............

More information

17. Joint distributions of extreme order statistics Lehmann 5.1; Ferguson 15

17. Joint distributions of extreme order statistics Lehmann 5.1; Ferguson 15 17. Joit distributios of extreme order statistics Lehma 5.1; Ferguso 15 I Example 10., we derived the asymptotic distributio of the maximum from a radom sample from a uiform distributio. We did this usig

More information

PSYCHOLOGICAL RESEARCH (PYC 304-C) Lecture 9

PSYCHOLOGICAL RESEARCH (PYC 304-C) Lecture 9 Hypothesis testig PSYCHOLOGICAL RESEARCH (PYC 34-C Lecture 9 Statistical iferece is that brach of Statistics i which oe typically makes a statemet about a populatio based upo the results of a sample. I

More information

The standard deviation of the mean

The standard deviation of the mean Physics 6C Fall 20 The stadard deviatio of the mea These otes provide some clarificatio o the distictio betwee the stadard deviatio ad the stadard deviatio of the mea.. The sample mea ad variace Cosider

More information

. Prelimiaries ad otatios Let f C q [ 1 1] be give, where q 0. The for a xed r such that q < r q + 1 we dee a polyomial H r f of degree at most + r 1

. Prelimiaries ad otatios Let f C q [ 1 1] be give, where q 0. The for a xed r such that q < r q + 1 we dee a polyomial H r f of degree at most + r 1 Poitwise Gopegauz Estimates for Iterpolatio T. Kilgore ad J. Presti Abstract We derive some ew poitwise estimates for the error i simultaeous approximatio of a fuctio f C q [ 1 1] ad its derivatives by

More information

Week 10. f2 j=2 2 j k ; j; k 2 Zg is an orthonormal basis for L 2 (R). This function is called mother wavelet, which can be often constructed

Week 10. f2 j=2 2 j k ; j; k 2 Zg is an orthonormal basis for L 2 (R). This function is called mother wavelet, which can be often constructed Wee 0 A Itroductio to Wavelet regressio. De itio: Wavelet is a fuctio such that f j= j ; j; Zg is a orthoormal basis for L (R). This fuctio is called mother wavelet, which ca be ofte costructed from father

More information

1.010 Uncertainty in Engineering Fall 2008

1.010 Uncertainty in Engineering Fall 2008 MIT OpeCourseWare http://ocw.mit.edu.00 Ucertaity i Egieerig Fall 2008 For iformatio about citig these materials or our Terms of Use, visit: http://ocw.mit.edu.terms. .00 - Brief Notes # 9 Poit ad Iterval

More information

Supplementary Material to A General Method for Third-Order Bias and Variance Corrections on a Nonlinear Estimator

Supplementary Material to A General Method for Third-Order Bias and Variance Corrections on a Nonlinear Estimator Supplemetary Material to A Geeral Method for Third-Order Bias ad Variace Correctios o a Noliear Estimator Zheli Yag School of Ecoomics, Sigapore Maagemet Uiversity, 90 Stamford Road, Sigapore 178903 emails:

More information

CONDITIONAL PROBABILITY INTEGRAL TRANSFORMATIONS FOR MULTIVARIATE NORMAL DISTRIBUTIONS

CONDITIONAL PROBABILITY INTEGRAL TRANSFORMATIONS FOR MULTIVARIATE NORMAL DISTRIBUTIONS CONDITIONAL PROBABILITY INTEGRAL TRANSFORMATIONS FOR MULTIVARIATE NORMAL DISTRIBUTIONS Satiago Rico Gallardo, C. P. Queseberry, F. J. O'Reilly Istitute of Statistics Mimeograph Series No. 1148 Raleigh,

More information

Self-normalized deviation inequalities with application to t-statistic

Self-normalized deviation inequalities with application to t-statistic Self-ormalized deviatio iequalities with applicatio to t-statistic Xiequa Fa Ceter for Applied Mathematics, Tiaji Uiversity, 30007 Tiaji, Chia Abstract Let ξ i i 1 be a sequece of idepedet ad symmetric

More information

Chapter 6 Sampling Distributions

Chapter 6 Sampling Distributions Chapter 6 Samplig Distributios 1 I most experimets, we have more tha oe measuremet for ay give variable, each measuremet beig associated with oe radomly selected a member of a populatio. Hece we eed to

More information

Quantile regression with multilayer perceptrons.

Quantile regression with multilayer perceptrons. Quatile regressio with multilayer perceptros. S.-F. Dimby ad J. Rykiewicz Uiversite Paris 1 - SAMM 90 Rue de Tolbiac, 75013 Paris - Frace Abstract. We cosider oliear quatile regressio ivolvig multilayer

More information

Estimation of Gumbel Parameters under Ranked Set Sampling

Estimation of Gumbel Parameters under Ranked Set Sampling Joural of Moder Applied Statistical Methods Volume 13 Issue 2 Article 11-2014 Estimatio of Gumbel Parameters uder Raked Set Samplig Omar M. Yousef Al Balqa' Applied Uiversity, Zarqa, Jorda, abuyaza_o@yahoo.com

More information

Sieve Estimators: Consistency and Rates of Convergence

Sieve Estimators: Consistency and Rates of Convergence EECS 598: Statistical Learig Theory, Witer 2014 Topic 6 Sieve Estimators: Cosistecy ad Rates of Covergece Lecturer: Clayto Scott Scribe: Julia Katz-Samuels, Brado Oselio, Pi-Yu Che Disclaimer: These otes

More information