Wavelet Methods for Time Series Analysis. Part IX: Wavelet-Based Bootstrapping


Part IX: Wavelet-Based Bootstrapping

- start with some background on bootstrapping and its rationale
- describe adjustments to the bootstrap that allow it to work with correlated time series
- describe how the decorrelating property of the DWT can be used to develop a wavelet-based bootstrap for certain time series
- describe wavestrapping, an adaptive procedure based upon finding a decorrelating transform from a wavelet packet table

IX-1

Motivating Question

- let X = [X_0, X_1, ..., X_{N-1}]^T be a finite portion of a stationary process with autocovariance sequence (ACVS) {s_τ}
- let {ρ_τ} be the corresponding autocorrelation sequence (ACS): ρ_τ = s_τ / s_0, where s_τ = cov{X_t, X_{t+τ}} and s_0 = var{X_t}
- given a time series, we can estimate its ACS at τ = 1 using
  ρ̂_1 = Σ_{t=0}^{N-2} X_t X_{t+1} / Σ_{t=0}^{N-1} X_t^2
  under the assumption that E{X_t} = 0
- Q: given the amount N of data we have, how close can we expect ρ̂_1 to be to the true unknown ρ_1? i.e., how can we assess the sampling variability in ρ̂_1?

IX-2
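The estimator above is easy to compute directly; here is a minimal numpy sketch (the function name rho_hat_1 is mine, not from the slides):

```python
import numpy as np

def rho_hat_1(x):
    """Unit lag sample autocorrelation for a zero-mean series:
    sum_{t=0}^{N-2} x_t x_{t+1} divided by sum_{t=0}^{N-1} x_t^2."""
    x = np.asarray(x, dtype=float)
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)
```

Because the denominator uses all N terms while the numerator uses only N-1 products, the Cauchy-Schwarz inequality guarantees |ρ̂_1| ≤ 1; for a perfectly alternating series such as [1, -1, 1, -1] the estimate is -3/4 rather than -1.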

Classic Approach - Large Sample Theory: I

- in what follows, let N(μ, σ^2) denote a Gaussian (normal) random variable (RV) with mean μ and variance σ^2
- if the X_t's were independent and identically distributed (IID), so that ρ_1 = 0, the distribution of ρ̂_1 becomes arbitrarily close to that of an N(0, 1/N) RV as N → ∞ (requires suitable conditions)

IX-3

Classic Approach - Large Sample Theory: II

- more generally, the distribution of ρ̂_1 is close to that of an N(ρ_1, σ_N^2) RV as N → ∞, where
  σ_N^2 ≈ (1/N) Σ_{τ=-∞}^{∞} [ρ_τ^2 (1 + 2ρ_1^2) + ρ_{τ+1} ρ_{τ-1} - 4 ρ_1 ρ_τ ρ_{τ-1}]
- in practice the above result is unappealing because it requires knowledge of the theoretical ACS, and it requires the ACS to damp down sufficiently fast, which rules out long memory processes (LMPs)
- while large sample theory has been worked out for ρ̂_1 under certain conditions, similar theory for other statistics can be hard to come by

IX-4

Alternative Approach - Bootstrapping: I

- if the X_t's were IID, we could apply bootstrapping to assess the variability in ρ̂_1, as follows
- suppose we have the following time series of length N = 8, which is a realization of a Gaussian white noise process:
  x = [1.9, -2.2, 0.1, -1.0, 0.6, 0.5, -1.3, 0.3]^T,
  for which ρ̂_1 ≈ 0.23 (for white noise, the true value of ρ_1 is 0)
- generate a new time series x^(1) by randomly sampling from x:
  x^(1) = [-2.2, 0.1, 0.1, -1.0, 1.9, 1.9, 0.6, 0.1]^T,
  for which ρ̂_1^(1) ≈ -0.13 (note: sampling is done with replacement)
- do again to get x^(2) = [0.3, 0.5, 1.9, 0.6, 0.3, 0.5, -2.2, -2.2]^T, for which ρ̂_1^(2) ≈ 0.39

IX-5

Alternative Approach - Bootstrapping: II

- repeat a large number of times to get ρ̂_1^(1), ρ̂_1^(2), ..., ρ̂_1^(M)
- plots show the histogram for {ρ̂_1^(m): m = 1, ..., 10,000}, along with the probability density function (PDF) for an N(0, 1/8) RV (left-hand plot) and an approximation to the true PDF for ρ̂_1 (right-hand plot)
- [figure: two histograms of ρ̂_1^(m); vertical lines indicate ρ̂_1]
- can regard the sample distribution of {ρ̂_1^(m)} as an approximation to the unknown distribution of ρ̂_1

IX-6
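The IID bootstrap described on this and the previous slide can be sketched as follows (numpy; the function names, seeds and choice of random generator are mine):

```python
import numpy as np

def rho_hat_1(x):
    """Unit lag sample autocorrelation, assuming E{X_t} = 0."""
    x = np.asarray(x, dtype=float)
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

def iid_bootstrap_rho(x, M=10_000, seed=0):
    """Resample x with replacement M times (appropriate only if the X_t's
    are IID) and return the M bootstrapped unit lag autocorrelations."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    boot = np.empty(M)
    for m in range(M):
        boot[m] = rho_hat_1(rng.choice(x, size=len(x), replace=True))
    return boot

# Gaussian white noise of length N = 128, as in the example on the next slide
x = np.random.default_rng(42).standard_normal(128)
boot = iid_bootstrap_rho(x, M=2000)
```

For white noise the spread of `boot` should be close to the large-sample standard deviation 1/sqrt(N) ≈ 0.088.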

Alternative Approach - Bootstrapping: III

- the bootstrap approximation to the distribution of ρ̂_1 gets better as N increases
- consider a sample of Gaussian white noise of length N = 128, for which ρ̂_1 ≈ -0.02
- [figure: plot of x_t for t = 0, ..., 127, and histogram of ρ̂_1^(m); vertical line indicates ρ̂_1]
- the sample distribution of {ρ̂_1^(m)} agrees quite well with the approximate true PDF

IX-7

Bootstrapping Correlated Time Series: I

- key assumption: x was a realization of IID RVs
- if this is not true (usually the case with time series!), the sample distribution of {ρ̂_1^(m)} can be badly misleading as an approximation to the unknown distribution of ρ̂_1
- as an example, consider a realization of a fractionally differenced (FD) process with parameter δ = 1/4, for which ρ̂_1 ≈ 0.23 (for an FD(1/4) process, ρ_1 = 1/3)
- [figure: plot of x_t for t = 0, ..., 127]

IX-8

Bootstrapping Correlated Time Series: II

- use the same procedure as before to get ρ̂_1^(1), ρ̂_1^(2), ..., ρ̂_1^(M)
- plot shows the histogram for {ρ̂_1^(m): m = 1, ..., 10,000}, along with an approximation to the true PDF for ρ̂_1
- [figure: histogram of ρ̂_1^(m); vertical lines indicate ρ̂_1 and ρ_1]
- the bootstrap approximation gets even worse as N increases
- to correct the problem caused by correlation in the time series, can use specialized time or frequency domain bootstrapping if the ACS damps down sufficiently fast

IX-9

Parametric Bootstrapping: I

- one well-known time domain bootstrapping scheme is the parametric (or residual) bootstrap
- suppose we can assume that our time series is a realization of a portion X_0, ..., X_{N-1} of a first order autoregressive (AR(1)) process: X_t = φ_1 X_{t-1} + ε_t, where |φ_1| < 1 and {ε_t} is white noise with zero mean and variance σ_ε^2 (this model is widely used in geophysics)
- for an AR(1) process we have var{X_t} = σ_ε^2 / (1 - φ_1^2) and ρ_τ = φ_1^{|τ|}
- in particular, ρ_1 = φ_1, so we can estimate φ_1 using φ̂_1 = ρ̂_1

IX-10

Parametric Bootstrapping: II

- since ε_t = X_t - φ_1 X_{t-1}, can form residuals r_t = X_t - φ̂_1 X_{t-1}, t = 1, ..., N-1, with the idea that r_t will be a good approximation to ε_t (note: there are N-1 residuals rather than N)
- let r_0^(1), r_1^(1), ..., r_{N-1}^(1) be a random sample from r_1, r_2, ..., r_{N-1} (as before, sampling is done with replacement)
- let X_0^(1) = r_0^(1) / (1 - φ̂_1^2)^{1/2} ("stationary initial condition")
- form X_t^(1) = φ̂_1 X_{t-1}^(1) + r_t^(1), t = 1, ..., N-1, yielding the bootstrapped time series X_0^(1), X_1^(1), ..., X_{N-1}^(1)

IX-11

Parametric Bootstrapping: III

- use X_0^(1), X_1^(1), ..., X_{N-1}^(1) to compute ρ̂_1^(1)
- let r_0^(2), r_1^(2), ..., r_{N-1}^(2) be a second random sample from r_1, r_2, ..., r_{N-1}; use these to form a second bootstrapped series X_0^(2), X_1^(2), ..., X_{N-1}^(2), from which we form ρ̂_1^(2)
- repeat this procedure M times to get ρ̂_1^(1), ρ̂_1^(2), ..., ρ̂_1^(M)

IX-12
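The AR(1) residual bootstrap just described can be sketched end to end (numpy; the simulated input series, seeds and function names are mine, not from the slides):

```python
import numpy as np

def rho_hat_1(x):
    """Unit lag sample autocorrelation, assuming E{X_t} = 0."""
    x = np.asarray(x, dtype=float)
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

def ar1_parametric_bootstrap(x, M=1000, seed=0):
    """Parametric (residual) bootstrap under an assumed AR(1) model,
    with phi-hat taken to be the unit lag sample autocorrelation."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    n = len(x)
    phi = rho_hat_1(x)
    r = x[1:] - phi * x[:-1]                  # the N-1 residuals r_1, ..., r_{N-1}
    boot = np.empty(M)
    for m in range(M):
        rs = rng.choice(r, size=n, replace=True)
        xb = np.empty(n)
        xb[0] = rs[0] / np.sqrt(1.0 - phi**2)  # stationary initial condition
        for t in range(1, n):
            xb[t] = phi * xb[t - 1] + rs[t]
        boot[m] = rho_hat_1(xb)
    return boot

# simulate an AR(1) realization with phi_1 = 1/3 (burn-in discarded)
rng = np.random.default_rng(7)
e = rng.standard_normal(628)
z = np.zeros(628)
for t in range(1, 628):
    z[t] = z[t - 1] / 3.0 + e[t]
x = z[500:]                                    # N = 128
boot = ar1_parametric_bootstrap(x, M=500)
```

The sample distribution of `boot` then approximates the sampling distribution of ρ̂_1 under the fitted AR(1) model.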

Parametric Bootstrapping: IV

- as an example, consider a realization of an AR(1) process with φ_1 = ρ_1 = 1/3, for which ρ̂_1 ≈ 0.38
- [figure: plot of x_t for t = 0, ..., 127]
- plot shows the histogram for {ρ̂_1^(m): m = 1, ..., 10,000}, along with an approximation to the true PDF for ρ̂_1
- [figure: histogram of ρ̂_1^(m); vertical lines indicate ρ̂_1 and ρ_1]

IX-13

Parametric Bootstrapping: V

- the important assumption here is that the time series is well modeled by an AR(1) process
- to see what happens if this assumption fails, reconsider the FD(1/4) realization and treat it as if it were an AR(1) realization
- since ρ̂_1 ≈ 0.23, we would set φ̂_1 ≈ 0.23
- plot shows the histogram for {ρ̂_1^(m): m = 1, ..., 10,000}, along with an approximation to the true PDF for ρ̂_1
- [figure: histogram of ρ̂_1^(m); vertical lines indicate ρ̂_1 and ρ_1]

IX-14

Parametric Bootstrapping: VI

- more generally, can fit a pth order autoregressive process
  X_t = Σ_{u=1}^{p} φ_u X_{t-u} + ε_t
  and use r_t = X_t - Σ_{u=1}^{p} φ̂_u X_{t-u} to form new series and then ρ̂_1^(m)
- note that the number of residuals is N - p, so it is best to stick with small values of p
- there are several variations on the basic scheme, one of which is to use r̃_t = r_t - r̄ rather than r_t, where r̄ is the sample mean of the residuals (usually close to zero, but sometimes not)

IX-15

Block Bootstrapping

- another time domain approach is block bootstrapping, which is nonparametric and has some nice theoretical properties, but is a bit trickier to describe and implement

IX-16

Frequency Domain Bootstrapping

- phase scrambling: scramble the phases θ_k of the discrete Fourier transform (DFT) of the data {X_t} and apply the inverse DFT to create a new series, where
  X̃_k = Σ_{t=0}^{N-1} X_t e^{-i2πkt/N} = A_k e^{iθ_k}
- periodogram-based bootstrapping: in addition to phase scrambling, invoke the large sample result that the A_k's are approximately uncorrelated, with a distribution related to a chi-square RV with 2 degrees of freedom
- circulant embedding bootstrapping: form a nonparametric estimate of the spectral density function and generate realizations using circulant embedding

IX-17
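Phase scrambling can be sketched as follows (numpy; keeping the DC and Nyquist coefficients fixed is a standard implementation detail so that the surrogate stays real-valued with the same sample mean; the function name is mine):

```python
import numpy as np

def phase_scramble(x, seed=0):
    """Surrogate series with the same DFT amplitudes A_k as x but with
    the phases theta_k of the remaining coefficients drawn uniformly."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    n = len(x)
    X = np.fft.rfft(x)
    Y = np.abs(X) * np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, len(X)))
    Y[0] = X[0]                       # keep DC term: preserves the sample mean
    if n % 2 == 0:
        Y[-1] = X[-1]                 # keep the (real) Nyquist coefficient
    return np.fft.irfft(Y, n=n)

x = np.random.default_rng(3).standard_normal(128)
xs = phase_scramble(x)
```

By construction the surrogate has exactly the same periodogram as the original series, so second-order structure is preserved while everything encoded in the phases is destroyed.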

Rationale for Wavelet Domain Bootstrapping

- time and frequency domain approaches are both problematic for long memory processes
- the DWT decorrelates certain time series X, including long memory processes (these are ruled out by time and frequency domain bootstrapping because their ACSs damp down slowly)
- a level J_0 partial DWT maps X to W_1, W_2, ..., W_{J_0} and V_{J_0}, with the RVs in the W_j's being approximately uncorrelated (note: the scaling coefficients V_{J_0} are still highly correlated)

IX-18

DWT of a Long Memory Process: I

- [figure: realization of an FD(0.4) time series X, t = 0, ..., 1023, along with its sample ACS out to lag 32]
- the sample autocorrelation sequence (ACS) is, for τ ≥ 0,
  ρ̂_{X,τ} = Σ_{t=0}^{N-1-τ} X_t X_{t+τ} / Σ_{t=0}^{N-1} X_t^2
- note that the ACS dies down slowly

IX-19

DWT of a Long Memory Process: II V 7 W 7 W 6 W 5 W 4 W 3 W 2 W 4 20 5 5 5 5 5 5 5 5 5 5 5 5 5 5 0 32 τ (lag) LA(8) DWT of FD(0.4) series and sample ACSs for each W j & V 7, along with 95% confidence intervals for white noise IX 20

DWT of a Long Memory Process: III

- second example: [figure: ACS for an FD(0.45) process, lags 0 to 64]
- unit lag autocorrelations for W_j using the Haar, D(4) and LA(8) wavelet filters (other autocorrelations are very small):

  j   Haar     D(4)     LA(8)
  1   0.0626   0.0797   0.0767
  2   0.0947   0.1320   0.1356
  3   0.1331   0.1511   0.1510
  4   0.1121   0.1559   0.1535

IX-21

DWT of a Long Memory Process: IV

- [figure: spectral density functions (SDFs) for X and for the W_j's]
- the SDFs for the W_j's are relatively flat (they would correspond to white noise if perfectly flat), and the remaining variation is well approximated by the SDF of an AR(2) process
- the height increases as j increases (the variance of W_j sets the height)

IX-22

DWT of a Long Memory Process: V

- maximum absolute cross-correlations for wavelet coefficients in W_j and W_{j'} for j < j' ≤ 4:

        Haar              D(4)              LA(8)
  j\j'  2     3     4     2     3     4     2     3     4
  1     0.13  0.17  0.14  0.09  0.09  0.04  0.06  0.03  0.00
  2           0.17  0.12        0.12  0.11        0.08  0.03
  3                 0.18              0.13              0.08

IX-23

Recipe for Wavelet Domain Bootstrapping: I

1. given X of length N = 2^J, compute the level J_0 = J - 2 partial DWT W_1, ..., W_{J_0} and V_{J_0} (there are 4 coefficients in each of W_{J_0} and V_{J_0})
2. randomly sample with replacement N/2^j times from W_j to create the bootstrapped vector W_j^(b), j = 1, ..., J_0
3. do the same for V_{J_0} to create V_{J_0}^(b) (theory is lacking here, but this does better in computer experiments than using just V_{J_0})
4. apply the inverse transform to W_1^(b), ..., W_{J_0}^(b) and V_{J_0}^(b) to obtain the bootstrapped time series X^(b)
5. compute the unit lag sample autocorrelation ρ̂_1^(b)

repeat the above many times to build up a sample distribution of bootstrapped autocorrelations

IX-24
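The five steps above can be sketched end to end in numpy. For brevity this uses the Haar DWT rather than the LA(8) transform used in the slides, and all function names are mine; a library such as PyWavelets would supply longer filters:

```python
import numpy as np

def haar_dwt(x, J0):
    """Level-J0 partial Haar DWT: returns [W_1, ..., W_J0] and V_J0."""
    v = np.asarray(x, dtype=float)
    Ws = []
    for _ in range(J0):
        w = (v[1::2] - v[0::2]) / np.sqrt(2.0)   # wavelet (difference) coefficients
        v = (v[1::2] + v[0::2]) / np.sqrt(2.0)   # scaling (average) coefficients
        Ws.append(w)
    return Ws, v

def haar_idwt(Ws, v):
    """Invert haar_dwt (the transform is orthonormal)."""
    v = v.copy()
    for w in reversed(Ws):
        out = np.empty(2 * len(v))
        out[0::2] = (v - w) / np.sqrt(2.0)
        out[1::2] = (v + w) / np.sqrt(2.0)
        v = out
    return v

def rho_hat_1(x):
    x = np.asarray(x, dtype=float)
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

def wavelet_bootstrap_rho(x, M=1000, seed=0):
    """Steps 1-5 of the recipe: resample within each W_j (and within
    V_J0) with replacement, invert, and compute rho-hat-1."""
    rng = np.random.default_rng(seed)
    n = len(x)
    J0 = int(np.log2(n)) - 2          # leaves 4 coefficients in W_J0 and V_J0
    Ws, v = haar_dwt(x, J0)
    boot = np.empty(M)
    for m in range(M):
        Wb = [rng.choice(w, size=len(w), replace=True) for w in Ws]
        vb = rng.choice(v, size=len(v), replace=True)
        boot[m] = rho_hat_1(haar_idwt(Wb, vb))
    return boot

x = np.random.default_rng(11).standard_normal(128)   # stand-in for an FD series
boot = wavelet_bootstrap_rho(x, M=200)
```

Resampling within each scale preserves the scale-by-scale variances (and hence, approximately, the SDF) while scrambling the order of the nearly uncorrelated coefficients.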

Recipe for Wavelet Domain Bootstrapping: II

- computer experiments indicate an improvement over the block bootstrap for FD processes
- variation: replace X by a series of length 2N given by
  X^(c) = [X_0, X_1, ..., X_{N-2}, X_{N-1}, X_{N-1}, X_{N-2}, ..., X_1, X_0]^T;
  i.e., use reflection rather than circular boundary conditions

IX-25

Motivation for Wavestrapping: I

- the DWT does not adequately decorrelate all time series
- consider the first order moving average (MA(1)) process X_t = ε_t + 0.99 ε_{t-1}

IX-26

Motivation for Wavestrapping: II

- [figure: SDFs for the MA(1) process and for the associated W_j's]
- note that the SDF of W_1 is not approximately flat
- idea: use a transform selected from a wavelet packet table

IX-27

Motivation for Wavestrapping: III

- consider the following level J_0 = 4 wavelet packet table (WPT): W_{0,0} = X is split into W_{1,0} and W_{1,1}; these are split into W_{2,0}, ..., W_{2,3}; these into W_{3,0}, ..., W_{3,7}; and these into W_{4,0}, ..., W_{4,15}, with band edges at frequencies 0, 1/16, 1/8, 3/16, 1/4, 5/16, 3/8, 7/16 and 1/2
- shaded boxes in the table identify an orthonormal transform that is a better decorrelator of the MA(1) process than the DWT

IX-28

Motivation for Wavestrapping: IV

- [figure: SDFs for the MA(1) process and for the associated W_{j,n}'s]

IX-29

Motivation for Wavestrapping: V

- the first 15 of the W_{j,n} SDFs have variations of less than 3 dB, but those for W_{4,13}, W_{4,14} and W_{4,15} vary by 3.9, 5.3 and 7.2 dB
- increasing the depth of the WPT to J_0 = 6 allows us to replace these by three j = 5 level subvectors W_{5,26}, W_{5,27}, W_{5,28} and six j = 6 level subvectors W_{6,58}, ..., W_{6,63}
- the resulting WPT has SDFs that all vary by less than 3 dB
- idea: adaptively select the transform by using white noise tests

IX-30

Recipe for Wavestrapping: I

1. given X of length 2^J, compute a level J_0 = J - 2 WPT (enter step 2 with starting values j = n = 0 and W_{0,0} = X)
2. if j = J_0, retain W_{j,n}; if j < J_0, do a white noise test on W_{j,n}, either a portmanteau test on the autocorrelation estimates for W_{j,n} or a cumulative periodogram test; if we fail to reject the null hypothesis, retain W_{j,n}; if we reject, discard W_{j,n} (after transforming it into W_{j+1,2n} and W_{j+1,2n+1}) and repeat this step twice again (on both W_{j+1,2n} and W_{j+1,2n+1})
3. the desired adaptively chosen transform consists of all subvectors retained after step 2 has been applied as many times as needed; randomly sample (with replacement) from each subvector in the transform to create the similarly dimensioned wavestrapped subvectors

IX-31

Recipe for Wavestrapping: II

4. apply the inverse transform to obtain the bootstrapped time series X^(b)
5. compute the unit lag sample autocorrelation ρ̂_1^(b)

repeat the above many times to build up a sample distribution of bootstrapped autocorrelations

IX-32
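One concrete choice for the portmanteau white noise test in step 2 is the Ljung-Box statistic; here is a sketch (the lag cutoff K = 10 and the 5% chi-square critical value 18.31 for 10 degrees of freedom are my choices, not taken from the slides):

```python
import numpy as np

def ljung_box(w, K=10):
    """Ljung-Box portmanteau statistic Q = n(n+2) sum_{k=1}^{K} rho_k^2/(n-k);
    under the white noise null, Q is approximately chi-square with K df."""
    w = np.asarray(w, dtype=float)
    w = w - w.mean()
    n = len(w)
    denom = np.dot(w, w)
    q = 0.0
    for k in range(1, K + 1):
        rho_k = np.dot(w[:-k], w[k:]) / denom    # lag-k sample autocorrelation
        q += rho_k * rho_k / (n - k)
    return n * (n + 2) * q

def looks_like_white_noise(w, critical=18.31):   # chi-square 95% point, 10 df
    """Fail-to-reject means the subvector W_{j,n} would be retained."""
    return ljung_box(w) < critical

w = np.random.default_rng(5).standard_normal(512)
```

A subvector that fails this test would be split into its two children in the wavelet packet table and each child tested in turn, exactly as in step 2 above.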

Summary of Computer Experiments - I

                                   Wavestrap
  Process  N     Boundary    DWT   Port  Pgrm   Block  True
  WN       128   periodic    8.2   8.7   8.8    8.1    8.7
                 reflection  8.3   8.6   8.7
           1024  periodic    3.1   3.1   3.1    3.0    3.1
                 reflection  3.2   3.2   3.1
  AR(1)    128   periodic    5.7   5.2   5.1    5.4    4.8
                 reflection  5.5   5.1   5.4
           1024  periodic    1.6   1.5   1.5    1.5    1.4
                 reflection  1.6   1.5   1.5
  MA(1)    128   periodic    7.1   6.8   6.8    6.5    6.3
                 reflection  7.0   6.8   6.6
           1024  periodic    2.6   2.4   2.3    2.2    2.2
                 reflection  2.6   2.4   2.4
  FD       128   periodic    9.4   8.3   8.5    7.7    10.7
                 reflection  9.9   8.8   9.6
           1024  periodic    4.4   4.2   4.2    3.4    5.3
                 reflection  4.7   4.5   4.7

IX-33

Summary of Computer Experiments - II

- the table gives standard deviations (×100) of unit lag sample autocorrelations obtained from DWT-based bootstrapping, from two forms of wavestrapping (portmanteau and cumulative periodogram tests) and from the block bootstrap, along with the true standard deviations
- the four models considered are white noise (WN); the AR(1) process X_t = 0.9 X_{t-1} + ε_t; the MA(1) process X_t = ε_t + 0.99 ε_{t-1}; and a fractionally differenced (FD) process with δ = 0.45
- wavestrapping with the portmanteau test and reflection boundary conditions does better than, or is comparable to, the block bootstrap (the current state of the art), except for the MA(1) process, for which the block bootstrap is ideally suited

IX-34

Application to BMW Stock Prices - I

- [figure: log of daily returns on BMW share prices over roughly 6000 days]
- the series has a small unit lag sample autocorrelation: ρ̂_1 ≈ 0.08
- large sample theory appropriate for Gaussian white noise gives a standard error of 1/√N ≈ 0.013

IX-35

Application to BMW Stock Prices - II

- Gaussianity is suspect: the data are better modeled by a t distribution with 3.9 degrees of freedom
- the block bootstrap with block sizes 30, 50, 100, 200 and 500 gives standard errors of 0.012, 0.012, 0.014, 0.016 and 0.015
- the DWT-based bootstrap and the wavestrap give 0.023 and 0.020
- this confirms the presence of autocorrelation (small, but presumably exploitable by traders)

IX-36

Applications to Bivariate Climate Time Series - I

- [figure: Pacific decadal oscillation (PDO) index (thick curve) and March 15th snow depth on Mt. Rainier (thin curve), 1900 to 2000]
- the sample cross-correlation is
  ρ̂_{XY} = Σ_{t=0}^{N-1} (X_t - X̄)(Y_t - Ȳ) / [Σ_{t=0}^{N-1} (X_t - X̄)^2 Σ_{t=0}^{N-1} (Y_t - Ȳ)^2]^{1/2} ≈ -0.27
- Q: given such a short series, is this significantly different from zero?

IX-37
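The cross-correlation above is straightforward to compute (numpy sketch; the function name is mine):

```python
import numpy as np

def cross_correlation(x, y):
    """Sample cross-correlation rho_XY at lag zero, as defined above."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return np.dot(xc, yc) / np.sqrt(np.dot(xc, xc) * np.dot(yc, yc))
```

Wavestrapping each series and recomputing this statistic on the resampled pairs yields the null distribution used on the next slide.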

Applications to Bivariate Climate Time Series - II

- a histogram of the wavestrapped cross-correlations says yes
- [figure: histogram of wavestrapped cross-correlations, spanning roughly -0.4 to 0.4]

IX-38

Comment on Other Approaches

- stick with the DWT, but use a J_0 = J - 3 or J_0 = J - 4 partial DWT so that there are 8 or 16 coefficients in both W_{J_0} and V_{J_0}, and then use a parametric bootstrap on V_{J_0}
- in addition to using a parametric bootstrap on V_{J_0}, use a parametric or block bootstrap separately on each subvector W_j
- for FD processes, although each W_j is close to white noise, its deviation from white noise is captured to a very good approximation by an AR(1) or AR(2) process

IX-39

References: I

C. Angelini, D. Cava, G. Katul and B. Vidakovic (2005), Resampling Hierarchical Processes in the Wavelet Domain: A Case Study Using Atmospheric Turbulence, Physica D, 207, pp. 24-40

M. Breakspear, M. J. Brammer, E. T. Bullmore, P. Das and L. M. Williams (2004), Spatiotemporal Wavelet Resampling for Functional Neuroimaging Data, Human Brain Mapping, 23, pp. 1-25

M. Breakspear, M. Brammer and P. A. Robinson (2003), Construction of Multivariate Surrogate Sets from Nonlinear Data Using the Wavelet Transform, Physica D, 182, pp. 1-22

E. Bullmore, J. Fadili, V. Maxim, L. Şendur, B. Whitcher, J. Suckling, M. Brammer and M. Breakspear (2004), Wavelets and Functional Magnetic Resonance Imaging of the Human Brain, NeuroImage, 23, pp. S234-S249

E. Bullmore, C. Long, J. Suckling, J. Fadili, G. Calvert, F. Zelaya, T. A. Carpenter and M. Brammer (2001), Colored Noise and Computational Inference in Neurophysiological (fMRI) Time Series Analysis: Resampling Methods in Time and Wavelet Domains, Human Brain Mapping, 12, pp. 61-78

IX-40

References: II

A. C. Davison and D. V. Hinkley (1997), Bootstrap Methods and their Application, Cambridge University Press

H. Feng, T. R. Willemain and N. Shang (2005), Wavelet-Based Bootstrap for Time Series Analysis, Communications in Statistics: Simulation and Computation, 34(2), pp. 393-413

S. Golia (2002), Evaluating the GPH Estimator via Bootstrap Technique, in Proceedings in Computational Statistics COMPSTAT 2002, edited by W. Härdle and B. Rönz. Heidelberg: Physica-Verlag, pp. 343-8

D. B. Percival and W. L. B. Constantine (2006), Exact Simulation of Gaussian Time Series from Nonparametric Spectral Estimates with Application to Bootstrapping, Statistics and Computing, 16(1), pp. 25-35

D. B. Percival, S. Sardy and A. C. Davison (2000), Wavestrapping Time Series: Adaptive Wavelet-Based Bootstrapping, in Nonlinear and Nonstationary Signal Processing, edited by W. J. Fitzgerald, R. L. Smith, A. T. Walden and P. C. Young. Cambridge, England: Cambridge University Press, pp. 442-70

IX-41

References: III

A. M. Sabatini (1999), Wavelet-Based Estimation of 1/f-Type Signal Parameters: Confidence Intervals Using the Bootstrap, IEEE Transactions on Signal Processing, 47(12), pp. 3406-9

B. J. Whitcher (2006), Wavelet-Based Bootstrapping of Spatial Patterns on a Finite Lattice, Computational Statistics & Data Analysis, 50(9), pp. 2399-421

IX-42