NONLINEAR TIME SERIES ANALYSIS, WITH APPLICATIONS TO MEDICINE


NONLINEAR TIME SERIES ANALYSIS, WITH APPLICATIONS TO MEDICINE
José María Amigó
Centro de Investigación Operativa, Universidad Miguel Hernández, Elche (Spain)
J.M. Amigó (CIO), Nonlinear time series analysis, 1 / 41

LECTURE 4. NONLINEAR METHODS IN MEDICINE I: GENERAL TECHNIQUES

OUTLINE
1 What is nonlinear time series analysis?
2 Practical issues
3 Conventional linear methods
4 State-space reconstruction
5 Noise reduction
6 Testing stationarity & determinism
7 Surrogate data testing
8 References

1. What is nonlinear TSA?
Chapter 2: from first principles to phenomenology (bottom-up approach).
This chapter: from phenomenology to the system (top-down approach).

1. What is nonlinear TSA?
Working hypothesis: Nature is deterministic, nonlinear, and dissipative.
Picture: the data are measurements at successive times of a system evolving, usually in continuous time,
t ↦ x(t) ∈ Ω  ⟹  s_n = s(x(t_n)),
where s : Ω → R is called the measurement function and the t_n are the measurement times.
Scope: extract information from the data. But there are quite a few practical issues.

1. What is nonlinear TSA?
Examples of time series:
Meteorological observations (temperature, pressure, humidity, ...)
Financial data (trading prices, market averages, exchange rates, ...)
Biomedical records (EEG, ECG, epidemiological data, ...)

1. What is nonlinear TSA?
Procedure:
1 Reconstruct the state space
2 Reduce the noise
3 Test stationarity and determinism
4 Characterize the reconstructed attractor
5 Check the validity of the conclusions (surrogate data)
Remark. The exact order depends on whether the methods need a reconstructed state space.

2. Practical issues
Time series analysis involves a number of practical issues.
1) Sampling and delay time. In practice, data are sampled at a constant sampling frequency 1/Δt, i.e.,
s_0, s_1, ..., s_n, ... = s(x(0)), s(x(Δt)), ..., s(x(nΔt)), ...
where Δt is the sampling time and nΔt ≡ t_n are the measurement times. Δt should be much smaller than the variation scale of x(t).
A better option:
s_0, s_τ, ..., s_nτ, ... = s(x(0)), s(x(τΔt)), ..., s(x(nτΔt)), ...
The parameter τ ≥ 1 is called the lag or delay time.

2. Practical issues
2) Finiteness. Real-world time series are finite, but most quantities of interest (entropies, Lyapunov exponents λ, dimensions, etc.) involve infinite limits!
Remedies: produce as many data as possible, use alternative estimators, use finite-size correction terms, ...
General rule: it suffices to obtain the scaling behavior.

2. Practical issues
3) Data contamination. Any error in the determination of the states.
If the error is propagated by the dynamics: dynamical or multiplicative noise.
If the error is not propagated by the dynamics: observational or additive noise.
Modelling additive noise:
s_n = s(x(nΔt)) + w_n,
where (w_n)_{n≥0} is white noise (i.e., an i.i.d. random process). One can use noise-reduction techniques.

2. Practical issues
4) Stationarity. Stationarity means that the statistical properties (averages, deviations, etc.) do not depend on which part of the time series one considers.
Causes of nonstationarity:
If the system is random: the process is nonstationary.
If the system is deterministic: transients, changes of parameters.
A time series too short to capture the longest characteristic time scale.

3. Conventional linear methods
Conventional linear methods are useful for some basic purposes, like checking stationarity or discriminating randomness from chaoticity.
1) Stationarity with 1st- and 2nd-order statistics. If x = x_1, ..., x_N is a nonstationary time series, then its mean
⟨x⟩ = (1/N) Σ_{n=1}^{N} x_n
and/or its standard deviation
σ = sqrt( (1/(N−1)) Σ_{n=1}^{N} (x_n − ⟨x⟩)² )
change with N.

3. Conventional linear methods
Test:
1 Calculate ⟨x⟩ and σ for the 1st half (1 ≤ n ≤ ⌊N/2⌋).
2 Calculate ⟨x⟩ and σ for the 2nd half (⌊N/2⌋ + 1 ≤ n ≤ N).
3 If the means of the two halves differ by more than a few standard errors of each half (σ/√(N/2)), then stationarity is a problem.
Another possibility: plot a histogram of the probability distribution p(x) (number of bins ≈ √N). p(x) should remain approximately constant across different segments if the data source is stationary.
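The two-halves test above can be sketched in a few lines of NumPy (the function name and the 3-standard-error threshold are illustrative choices, not prescribed by the slides):

```python
import numpy as np

def stationarity_check(x, n_sigma=3.0):
    """Two-halves stationarity test: return True (stationarity not rejected)
    if the means of the two halves differ by less than a few standard
    errors sigma/sqrt(N/2) of each half."""
    x = np.asarray(x, dtype=float)
    half = len(x) // 2
    a, b = x[:half], x[half:2 * half]
    std_err = max(a.std(ddof=1), b.std(ddof=1)) / np.sqrt(half)
    return bool(abs(a.mean() - b.mean()) < n_sigma * std_err)

t = np.arange(4000)
print(stationarity_check(np.sin(0.1 * t)))              # stationary signal: True
print(stationarity_check(np.sin(0.1 * t) + 0.001 * t))  # drifting mean: False
```

This is only a first-order screen; a series can pass it and still be nonstationary in higher-order statistics.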

3. Conventional linear methods
2) Determinism with Fourier analysis. Assumption: x = x_1, ..., x_N can be represented by a superposition of sines and cosines of various amplitudes and frequencies:
x_n ≈ a_0/2 + Σ_{ν=1}^{N/2} [ a_ν cos(2πνn/N) + b_ν sin(2πνn/N) ]
with amplitudes
a_ν = (2/N) Σ_{n=1}^{N} x_n cos(2πνn/N),  b_ν = (2/N) Σ_{n=1}^{N} x_n sin(2πνn/N).
Fundamental frequency: f_0 = 1/N
Highest frequency: f_c = 1/2 (Nyquist frequency)
Frequencies of the harmonics: f_ν = ν/N = νf_0 (1 ≤ ν ≤ N/2)

3. Conventional linear methods
Definition (Power spectral density). The power S(f) at frequency νf_0 is S_ν = a_ν² + b_ν².
Quasiperiodic signals have a finite number of sharp spectral peaks. Chaotic signals have a continuous (or "broadband") spectrum, perhaps with some embedded peaks. But random noise also has a continuous spectrum.
If the power spectrum diverges as f^{−α}, the time series must be considered nonstationary.

3. Conventional linear methods
Figure. Power spectrum P(f) of a chaotic signal: a time series (x component) generated with the Rössler oscillator, on log-log axes.
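A periodogram matching the definition S_ν = a_ν² + b_ν² can be computed with the real FFT. A minimal sketch (the 2/N normalisation is folded into the coefficients; the test signal is illustrative):

```python
import numpy as np

def power_spectrum(x):
    """S_nu = a_nu^2 + b_nu^2 for nu = 0..N/2, via the real FFT."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    coeffs = np.fft.rfft(x)                  # complex amplitude at each harmonic
    return (2.0 / N * np.abs(coeffs)) ** 2

N = 1024
n = np.arange(N)
S = power_spectrum(np.sin(2 * np.pi * 64 * n / N))  # pure harmonic at nu = 64
print(np.argmax(S[1:]) + 1)                          # sharp peak at nu = 64
```

A chaotic or noisy input spreads the power over a broad band instead of a single bin.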

3. Conventional linear methods
Definition. The autocorrelation function of a stationary time series x = x_1, ..., x_N,
G(k) ≈ Σ_{n=1}^{N−k} (x_n − ⟨x⟩)(x_{n+k} − ⟨x⟩) / Σ_{n=1}^{N−k} (x_n − ⟨x⟩)²,
measures how strongly on average each data point is correlated with the one k time steps away (0 ≤ k ≤ N−1).
G(0) = 1
G(k) = 1 for perfect correlation
G(k) = −1 for perfect anticorrelation
G(k) = 0 for uncorrelated data
k is called the lag.

3. Conventional linear methods
Facts:
Random processes have decaying autocorrelations, but the decay rate depends on the properties of the process.
Autocorrelations of chaotic signals decay exponentially with increasing lag.

3. Conventional linear methods
Theorem (Wiener-Khinchin). The Fourier transform of G(k) is the power spectrum:
S_ν = G(0) + G(K) cos(2πνK/N) + 2 Σ_{k=1}^{K−1} G(k) cos(2πνk/N),
where K is the maximum lag, which should be taken about N/4.
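The autocorrelation function defined above is easy to compute directly; a sketch with the slide's normalisation (the function name is illustrative):

```python
import numpy as np

def autocorrelation(x, k_max):
    """G(k) for k = 0..k_max, normalised so that G(0) = 1, using the
    N - k overlapping terms at each lag as in the slide's definition."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    N = len(x)
    G = np.empty(k_max + 1)
    for k in range(k_max + 1):
        G[k] = np.dot(x[:N - k], x[k:]) / np.dot(x[:N - k], x[:N - k])
    return G

# For a sinusoid sin(0.1 n) the autocorrelation oscillates as cos(0.1 k):
G = autocorrelation(np.sin(0.1 * np.arange(5000)), 50)
print(round(G[0], 6))   # 1.0 by construction
```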

4. State-space reconstruction
Assumption: s_0, s_1, ..., s_n, ... has been obtained from a deterministic system evolving on an attractor,
s_n = s(x((n_0 + nτ)Δt)),
where
s : Ω → R is the measurement function,
x(t) is the time evolution of the system,
Δt is the sampling time,
τ is the delay time,
n_0 allows for removing transients.

4. State-space reconstruction
Comments:
If the system is deterministic, s_{n+1} = φ(s_n, s_{n−1}, ...), but noise can blur this relation.
The plots of s_{n+1} vs s_n, s_{n−1}, ... are called return maps.
The underlying dynamical system (Ω, f) is not known.
If f is dissipative, dimension of the attractor < dimension of Ω.

4. State-space reconstruction
State-space reconstruction allows us to construct a new space Ω_new with an equivalent attractor, i.e.,
1 every point in Ω_new maps to a unique point under the dynamics,
2 the corresponding attractors in Ω and Ω_new have the same Lyapunov exponents λ and dimensions.
The change of coordinates between Ω and Ω_new is called an embedding.

4. State-space reconstruction
Theorem (Takens). Given a time series s_{n_0}, s_{n_0+τ}, s_{n_0+2τ}, ..., s_{n_0+kτ}, ..., the time-delay vectors
s(n) = (s_{n−(m−1)τ}, ..., s_{n−τ}, s_n)
constitute an adequate embedding provided m ≥ ⌊2D_0⌋ + 1, where D_0 is the box-counting dimension of the attractor. The parameter m is called the embedding dimension.
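Building the time-delay vectors of Takens' theorem is a one-liner with array slicing; a minimal sketch (the function name is an illustrative choice):

```python
import numpy as np

def delay_embed(s, m, tau):
    """Delay vectors s(n) = (s_{n-(m-1)tau}, ..., s_{n-tau}, s_n), one per row."""
    s = np.asarray(s, dtype=float)
    n0 = (m - 1) * tau
    cols = [s[n0 - j * tau : len(s) - j * tau] for j in range(m - 1, -1, -1)]
    return np.column_stack(cols)

E = delay_embed(np.arange(10.0), m=3, tau=2)
print(E[0])   # [0. 2. 4.]  i.e. (s_0, s_2, s_4)
```

Each row is one reconstructed state; a series of length N yields N − (m−1)τ such states.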

4. State-space reconstruction
Illustration 1: C.J. Stam, Clin. Neurophys. 116 (2005) 226.

4. State-space reconstruction
The state-space reconstruction has two parameters:
1 The embedding dimension m (all we know is m ≥ ⌊2D_0⌋ + 1)
2 The delay time τ
How to choose them?

4. State-space reconstruction
Choosing m: false nearest neighbors.
1 For each s(n) find an s(l) such that their distance in the reconstructed state space,
R_n(m) = sqrt( (s_l − s_n)² + (s_{l−τ} − s_{n−τ})² + ... + (s_{l−(m−1)τ} − s_{n−(m−1)τ})² ),
is minimal. s(l) is called the nearest neighbor of s(n).
2 Calculate R_n(m+1). If R_n(m+1) ≫ R_n(m), then s(n) and s(l) are false neighbors. Criterion for falseness:
|s_{l−mτ} − s_{n−mτ}| / R_n(m) ≳ 15
3 Sample the time series and plot the fraction of false nearest neighbors as a function of m.
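The false-nearest-neighbors count can be sketched with a brute-force O(N²) neighbor search (fine for short series; the function name is illustrative, the threshold 15 is the slide's):

```python
import numpy as np

def fnn_fraction(s, m, tau, threshold=15.0):
    """Fraction of false nearest neighbours when going from dimension m
    to m + 1 (brute-force nearest-neighbour search)."""
    s = np.asarray(s, dtype=float)
    idx = np.arange(m * tau, len(s))       # indices with an (m+1)-th coordinate
    V = np.column_stack([s[idx - j * tau] for j in range(m - 1, -1, -1)])
    false = 0
    for a in range(len(V)):
        d = np.linalg.norm(V - V[a], axis=1)
        d[a] = np.inf                      # exclude the point itself
        b = int(np.argmin(d))              # nearest neighbour in dimension m
        R = d[b]
        # falseness criterion: the extra coordinate blows the distance up
        if R > 0 and abs(s[idx[a] - m * tau] - s[idx[b] - m * tau]) / R > threshold:
            false += 1
    return false / len(V)

# A sine is already unfolded in m = 2, so almost no neighbours are false:
print(fnn_fraction(np.sin(0.3 * np.arange(300)), m=2, tau=5))
```

One repeats this for m = 1, 2, 3, ... and picks the smallest m where the fraction drops to (near) zero.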

4. State-space reconstruction
Illustration 2: M. Perc, Eur. J. Phys. 26 (2005) 757.

4. State-space reconstruction
Choosing τ: minimum mutual information.
If τ is too small, the reconstructed attractor is stretched out along the diagonal of the embedding space.
If τ is too large, the reconstructed attractor is excessively folded.

4. State-space reconstruction
Recommended method: use the first minimum of the mutual information,
I(τ) = Σ_{i=1}^{N_b} Σ_{j=1}^{N_b} p_ij(τ) log p_ij(τ) − 2 Σ_{i=1}^{N_b} p_i log p_i,
where
N_b is the number of bins in [s_min, s_max],
p_i is the probability that s_n is in bin i,
p_ij(τ) is the probability that s_n is in bin i and s_{n+τ} is in bin j.
The size of the bins is not critical as long as they are sufficiently small.
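A histogram-based estimate of I(τ) can be sketched as follows (this uses the equivalent joint-vs-marginals form of the mutual information; the function name and bin count are illustrative):

```python
import numpy as np

def mutual_information(s, tau, n_bins=16):
    """I(tau) in bits from a joint histogram of (s_n, s_{n+tau})."""
    s = np.asarray(s, dtype=float)
    joint, _, _ = np.histogram2d(s[:-tau], s[tau:], bins=n_bins)
    p_ij = joint / joint.sum()
    p_i, p_j = p_ij.sum(axis=1), p_ij.sum(axis=0)
    nz = p_ij > 0
    return float(np.sum(p_ij[nz] * np.log2(p_ij[nz] / np.outer(p_i, p_j)[nz])))

s = np.sin(0.1 * np.arange(5000))
mi = [mutual_information(s, tau) for tau in range(1, 40)]
# I(tau) decays from its tau = 1 value; its first local minimum is the
# recommended delay time
```

In practice one plots I(τ) against τ and reads off the first local minimum.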

5. Noise reduction
There are noise-reduction algorithms based on time-delay embeddings.
Noisy measurements: s_n = x_n + ζ_n, where ζ_n is supposed to be a random variable with fast-decaying autocorrelation and no correlation with x_n.
σ = sqrt(⟨ζ²⟩) is called the noise amplitude or the noise level. σ can be estimated from a plot of the data or from a correlation sum.

5. Noise reduction
Method (nonlinear filtering). Let s(n) = (s_{n−(m−1)τ}, ..., s_{n−τ}, s_n) be the embedding vectors. Then replace the middle coordinate s_{n−⌊m/2⌋τ} by the average middle coordinate of the neighboring points:
ŝ_{n−⌊m/2⌋τ} = (1/|B_ε(s(n))|) Σ_{s(k)∈B_ε(s(n))} s_{k−⌊m/2⌋τ},
where B_ε(s(n)) is the m-dimensional ball of radius ε centered at s(n), and |B_ε(s(n))| is the number of points in it. Use ε = 2σ or ε = 3σ.
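A direct sketch of this nonlinear filter, assuming the embedding parameters and ε are already chosen (function name and demo parameters are illustrative):

```python
import numpy as np

def nonlinear_filter(s, m, tau, eps):
    """Replace the middle coordinate of each embedding vector by the average
    middle coordinate over the eps-ball of neighbouring embedding vectors."""
    s = np.asarray(s, dtype=float)
    n0 = (m - 1) * tau
    idx = np.arange(n0, len(s))
    V = np.column_stack([s[idx - j * tau] for j in range(m - 1, -1, -1)])
    mid = (m // 2) * tau
    out = s.copy()
    for a in range(len(V)):
        nb = np.linalg.norm(V - V[a], axis=1) <= eps  # B_eps(s(n)), incl. itself
        out[idx[a] - mid] = s[idx[nb] - mid].mean()
    return out

rng = np.random.default_rng(2)
clean = np.sin(0.1 * np.arange(2000))
noisy = clean + 0.1 * rng.normal(size=2000)
filtered = nonlinear_filter(noisy, m=5, tau=1, eps=0.5)
# the rms distance to the clean signal typically drops after one pass
```

Averaging within the ε-ball suppresses the noise while the deterministic structure, shared by all neighbors, survives; the few samples at the ends of the series are left untouched.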

6. Testing stationarity & determinism
Possible methods to test stationarity:
1 Linear methods.
2 Nonlinear methods: recurrence plots, the cross-prediction error statistic, ...
The cross-prediction error statistic will be explained below (as a test for determinism).

6. Testing stationarity & determinism
Possible methods to test determinism:
1 Visual methods: return maps, recurrence plots, ...
2 Fingerprints: power spectrum, Lyapunov exponents, ...
3 Forbidden ordinal patterns
4 Kaplan-Glass test
5 Cross-prediction error statistic

6. Testing stationarity & determinism
6.1. Kaplan-Glass test. Idea: neighboring trajectories should point in about the same direction if the system is deterministic.
1 Quantize the reconstructed state space with a box partition.
2 Each pass of the trajectory through box k generates a unit vector e_p determined by the entry and exit points in/from the box.
3 The average directional vector of box k is
V_k = (1/n) Σ_{p=1}^{n} e_p,
where n is the total number of passes through box k.
4 Let κ be the average length ‖V_k‖ over all boxes: (i) κ ≈ 0 for random data; (ii) κ ≈ 1 for deterministic data.
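A simplified sketch of the Kaplan-Glass statistic for a 2-D reconstructed state space (as an assumption, it uses one unit vector per trajectory step instead of one per entry-to-exit pass, and only counts boxes with several passes; names and parameters are illustrative):

```python
import numpy as np

def kaplan_glass_kappa(V, n_boxes=8, min_passes=5):
    """Average length of the mean directional vector over the occupied boxes
    of a coarse partition of the (2-D) state space."""
    d = np.diff(V, axis=0)
    norms = np.linalg.norm(d, axis=1)
    keep = norms > 0
    d = d[keep] / norms[keep, None]            # unit displacement vectors
    pts = V[:-1][keep]
    lo, hi = V.min(axis=0), V.max(axis=0)
    boxes = np.clip(((pts - lo) / (hi - lo + 1e-12) * n_boxes).astype(int),
                    0, n_boxes - 1)
    lengths = []
    for box in {tuple(b) for b in boxes}:
        mask = np.all(boxes == box, axis=1)
        if mask.sum() >= min_passes:
            lengths.append(np.linalg.norm(d[mask].mean(axis=0)))
    return float(np.mean(lengths))

theta = 0.1 * np.arange(2000)
circle = np.column_stack([np.cos(theta), np.sin(theta)])   # deterministic flow
rng = np.random.default_rng(3)
print(kaplan_glass_kappa(circle) > kaplan_glass_kappa(rng.normal(size=(2000, 2))))
```

For the deterministic circular flow all displacements inside a box align, so κ ≈ 1; for random data they cancel and κ is small.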

6. Testing stationarity & determinism
6.2. Cross-prediction error statistic. Stationary deterministic systems are predictable, at least in the short term.
Let s_0, s_1, ..., s_N be a test data set in a longer series. Question: s_{N+1} = ?
Idea: suppose s_n = s(x_n), where x_0, x_1, ..., x_N ∈ R^d is deterministic, x_{n+1} = f(x_n). If x_N ≈ x_k, then
x_{N+1} = f(x_N) ≈ f(x_k) = x_{k+1},
provided f is continuous. Answer: s_{N+1} ≈ s_{k+1}.

6. Testing stationarity & determinism
In practice, though, x_0, x_1, ..., x_N are unknown. Use the embedding vectors s(n) = (s_{n−(m−1)τ}, ..., s_{n−τ}, s_n) to obtain states equivalent to the original ones. Thus:
s(n+1) ≈ s(k+1)
Even better:
s(n+1) ≈ (1/|B_ε(s(n))|) Σ_{s(k)∈B_ε(s(n))} s(k+1)
Prediction: the last component of s(n+1).

6. Testing stationarity & determinism
Determinism test:
1 Split the data (s_n)_{n=1}^{N} into I short segments S_i (N/I ≈ 500).
2 For each pair S_i, S_j, make predictions in S_j using the data of S_i.
3 Compute the rms of the prediction errors, δ_ij.
4 If δ_ij ≫ average, then S_i is a bad model for S_j.
5 Compare (δ_ij)_min and (δ_ij)_max with the average.
The matrix δ_ij is usually color-coded and plotted.
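The steps above can be sketched with a locally-constant predictor (the ε-ball average of the previous slide); function names, segment count, and ε are illustrative choices:

```python
import numpy as np

def delay_vectors(s, m, tau):
    idx = np.arange((m - 1) * tau, len(s) - 1)   # keep a one-step-ahead target
    V = np.column_stack([s[idx - j * tau] for j in range(m - 1, -1, -1)])
    return V, s[idx + 1]

def cross_prediction_matrix(s, m, tau, n_seg, eps):
    """delta_ij: rms error of locally-constant predictions in segment j
    built from the eps-ball neighbours found in segment i."""
    s = np.asarray(s, dtype=float)
    segments = np.array_split(s, n_seg)
    delta = np.full((n_seg, n_seg), np.nan)
    for i, si in enumerate(segments):
        Vi, yi = delay_vectors(si, m, tau)
        for j, sj in enumerate(segments):
            Vj, yj = delay_vectors(sj, m, tau)
            errs = []
            for v, target in zip(Vj, yj):
                nb = np.linalg.norm(Vi - v, axis=1) <= eps
                if nb.any():
                    errs.append(yi[nb].mean() - target)
            if errs:
                delta[i, j] = np.sqrt(np.mean(np.square(errs)))
    return delta

delta = cross_prediction_matrix(np.sin(0.1 * np.arange(1200)), m=2, tau=5,
                                n_seg=4, eps=0.2)
# for stationary deterministic data all delta_ij stay uniformly small
```

A row or column of large δ_ij flags a segment that the rest of the series cannot predict, i.e., nonstationarity; uniformly large δ_ij indicates a lack of determinism.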

6. Testing stationarity & determinism
Illustration 3: M. Perc, Eur. J. Phys. 26 (2005) 757.

7. Surrogate data testing
Chaotic systems mimic (white and colored) noise. Random systems with a nonuniform power spectrum can mimic chaos.
Recommendation: test conclusions about whether a time series is chaotic with surrogate data [T. Schreiber and A. Schmitz, Physica D 142 (2000) 346]. Surrogate data are designed to mimic the statistical properties of chaotic data but with the determinism removed.

7. Surrogate data testing
1 Surrogate data with the same probability distribution: shuffle the data.
2 Surrogate data with the same power spectrum: use a surrogate series Y = (y_n)_{n=1}^{N} with the same Fourier amplitudes but with random phases,
y_n = a_0/2 + Σ_{ν=1}^{N/2} √S_ν sin(2π(νn/N + r_ν)),
where the r_ν are N/2 uniform random numbers chosen from 0 ≤ r_ν < 1.
Note. p(y) tends to be nearly Gaussian.
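Both kinds of surrogate are easy to generate with the FFT; a sketch (randomizing phases in the rfft domain is equivalent to the random-phase sum above; names are illustrative):

```python
import numpy as np

def phase_randomized_surrogate(x, rng):
    """Surrogate with the same Fourier amplitudes but uniformly random phases."""
    x = np.asarray(x, dtype=float)
    X = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(X))
    phases[0] = 0.0                    # keep the mean coefficient real
    if len(x) % 2 == 0:
        phases[-1] = 0.0               # keep the Nyquist coefficient real
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x))

rng = np.random.default_rng(1)
x = np.sin(0.2 * np.arange(1024)) + 0.1 * rng.normal(size=1024)
shuffled = rng.permutation(x)             # surrogate 1: same distribution
y = phase_randomized_surrogate(x, rng)    # surrogate 2: same power spectrum
print(np.allclose(np.abs(np.fft.rfft(x)), np.abs(np.fft.rfft(y))))   # True
```

If a nonlinear statistic (dimension, Lyapunov exponent, prediction error, ...) computed on the data falls well outside its distribution over an ensemble of such surrogates, the null hypothesis of linearly filtered noise can be rejected.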

References
1 H. Kantz and T. Schreiber, Nonlinear Time Series Analysis. Cambridge University Press, 2004.
2 W.H. Press et al., Numerical Recipes. Cambridge University Press, 2007.
3 M. Small, Applied Nonlinear Time Series Analysis. World Scientific, 2005.
4 J.C. Sprott, Chaos and Time Series Analysis. Oxford University Press, 2003.