On multiscale entropy analysis for physiological data


Physica A, www.elsevier.com/locate/physa

On multiscale entropy analysis for physiological data

Ranjit A. Thuraisingham, Georg A. Gottwald
School of Mathematics and Statistics, University of Sydney, NSW 2006, Australia

Received 26 October 2004; received in revised form 1 October 2005

Abstract

We perform an analysis of cardiac data using multiscale entropy as proposed in Costa et al. [Multiscale entropy analysis of complex physiological time series, Phys. Rev. Lett. 89 (2002) 068102]. We reproduce the signatures of the multiscale entropy for the three cases of young healthy hearts, atrial fibrillation and congestive heart failure. We show that one has to be cautious in interpreting these signatures in terms of the underlying dynamics. In particular, we show that different dynamical systems can exhibit the same signatures depending on the sampling time, and that similar systems may have different signatures depending on the time scales involved. Besides the total amount of data, we identify the sampling time, the correlation time and the period of possible nonlinear oscillations as important time scales which have to be involved in a detailed analysis of the signatures of multiscale entropies. We illustrate our ideas with the Lorenz equation as a simple deterministic chaotic system.

© 2005 Elsevier B.V. All rights reserved.

Keywords: Multiscale entropy; Cardiac arrhythmias

1. Introduction

Over the past years, interest has risen in applying methods and concepts from nonlinear dynamics to problems in physiology. This is evidenced by several focus issues on cardiology and nonlinear dynamics [1-5]. It has been proposed that the normal heartbeat is associated with complex nonlinear dynamics and chaos [6-8]. This has opened new avenues to use methods from nonlinear dynamics as diagnostic tools for the analysis of physiological data for certain heart conditions [9-11].
Moreover, physicists and mathematicians have been inspired by the wealth of interesting physiological problems to develop new tools and methods. One such problem in cardiology is to extract information from a given set of heart rate data about the health status of a patient. This is obviously of great clinical importance for diagnostics. One aim is to find early signs of cardiac arrhythmias, which are often precursors of fatal malfunctions such as ventricular fibrillation. Cardiovascular disease is the main cause of death in the industrialized world, and ventricular fibrillation is the most common route to death for patients with cardiac conditions. Early detection of these possibly fatal heart conditions allows for preventive drug management, application of radio-frequency ablation, or the use of an implantable defibrillator.

Corresponding author. E-mail address: gottwald@maths.usyd.edu.au (G.A. Gottwald).
doi:10.1016/j.physa.2005.10.008

It is widely believed that a healthy heart is characterized by chaotic heart rate time series, whereas the intermittent appearance of periodicities in heart data is associated with the onset of congestive heart failure [12]. The naive picture behind this is that a chaotic heart, which generically involves a multitude of spatial and temporal scales, is more adaptive and can react better to external changes of its input. The occurrence of regularity in heart data, as observed with certain heart conditions, is often associated with the emergence of re-entrant spiral waves whose rotation frequency dominates the inter-beat intervals. This is a consequence of the heart being an excitable medium (see Ref. [13] for an overview of cardiac arrhythmias).

There have been several approaches coming from nonlinear dynamics to identify signatures in heart data (see, for example, Ref. [14] for an overview). The inevitable presence of noise forbids the use of traditional measures of chaoticity such as Lyapunov exponents (see Ref. [15] for an overview of nonlinear time series methods). Amongst the measures used to quantify complexity in physiological data are entropy methods, which are mostly based on symbolic dynamics [16-20], methods based on recurrence plots [21], and methods analyzing the multiscale power laws of the length distribution of low-variability periods in heartbeat dynamics [22].

In a recent paper [1], a new entropy-based measure of complexity, the multiple scale entropy (MSE), was introduced. The authors applied their new complexity measure to distinguish between young healthy hearts and congestive heart failure. Moreover, they were able to distinguish atrial fibrillation from healthy hearts. The former is associated with erratic fluctuations similar to uncorrelated noise [23,24]; the latter with fluctuations stemming from an underlying chaotic deterministic dynamics. The key to their method lies in a multiscale approach.
We explain their method in the next section and reproduce their results for real clinical heart data in Section 3. However, we will show in Section 4 that care has to be taken when interpreting the signatures of their complexity measure and drawing conclusions about possible deterministic dynamics. We show that the signatures depend on the ratio of the sampling time $t_s$ of the time series, the correlation time $t_c$ and the period of possible nonlinear oscillations $t_p$.

2. Multiscale entropy

The concept of entropy has been widely used to quantify complexity. Traditional entropy definitions such as the Shannon entropy or the Kolmogorov-Sinai entropy characterize the gain of information and measure disorder and uncertainty. The Kolmogorov-Sinai entropy $h_{KS}$ measures the exponential rate at which information is obtained. In principle, Pesin's identity [25], which states that the sum of all positive Lyapunov exponents forms an upper bound for $h_{KS}$, makes this entropy attractive for characterizing complexity. However, its definition requires an infinite data series with infinitely accurate precision and resolution [26]. This is never the case for experimental data. Entropy-based complexity measures designed to deal with short and noisy time series were recently introduced. Based on the so-called approximate entropy introduced in Ref. [27], and its modification, the sample entropy [28], a multiscale entropy has been introduced and successfully applied to physiological data [1]. We note that the term multiscale entropy is also used in a different context in image processing [29].

The multiscale entropy is based on the application of approximate entropy [27] or sample entropy [28]. We will therefore briefly review both of these entropies. For a good review of the connection between these measures of entropy and their historical connection see Ref. [30].
Given a time series $\{x_i\} = \{x_1, x_2, \ldots, x_N\}$ of length $N$, one can define $m$-dimensional sequence vectors $y^{(m)}(i) = \{x_i, x_{i+1}, \ldots, x_{i+m-1}\}$. Two vectors $y^{(m)}(i)$ and $y^{(m)}(j)$ are called similar if their distance
$$d(i,j) = \max\{|x(i+k) - x(j+k)| : 0 \le k \le m-1\}$$
is smaller than a specified tolerance level $\delta$. For each of the $N-m+1$ vectors $y^{(m)}(i)$ the number of similar vectors $y^{(m)}(j)$ is determined by measuring their respective distances. Let $n_i^{(m)}$ be the number of vectors similar to $y^{(m)}(i)$. The relative frequency of finding a vector $y^{(m)}(j)$ which is similar to $y^{(m)}(i)$ within a tolerance level $\delta$ is given by
$$C_i^{(m)}(\delta) = \frac{n_i^{(m)}}{N-m+1}. \qquad (2.1)$$
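As a concrete illustration, the similarity count $n_i^{(m)}$ and the relative frequency $C_i^{(m)}(\delta)$ of Eq. (2.1) can be sketched in a few lines of Python. The function names and the brute-force double loop are our own choices, not part of the paper; self-matching ($j = i$) is allowed here, as in the approximate-entropy variant described in the text.

```python
def count_similar(x, i, m, delta):
    """n_i^(m): number of vectors y^(m)(j) within Chebyshev distance
    delta of y^(m)(i); the self-match j == i is included."""
    N = len(x)
    n_i = 0
    for j in range(N - m + 1):
        # d(i, j) = max_k |x[i+k] - x[j+k]| for 0 <= k <= m-1
        d = max(abs(x[i + k] - x[j + k]) for k in range(m))
        if d <= delta:
            n_i += 1
    return n_i


def relative_frequency(x, i, m, delta):
    """C_i^(m)(delta) = n_i^(m) / (N - m + 1), as in Eq. (2.1)."""
    return count_similar(x, i, m, delta) / (len(x) - m + 1)
```

Since every template matches itself, $C_i^{(m)}(\delta)$ is always at least $1/(N-m+1)$ in this convention.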

To define an entropy-like quantity we look at the average of the logarithm of $C_i^{(m)}(\delta)$. Under the hypothesis of stationarity we obtain
$$H_N^{(m)}(\delta) = \frac{1}{N-m+1} \sum_{i=1}^{N-m+1} \log C_i^{(m)}(\delta). \qquad (2.2)$$
The mean rate of creation of information can now be defined as
$$h_{\mathrm{approx}}(\delta, m) = \lim_{N\to\infty} \big[ H_N^{(m)}(\delta) - H_N^{(m+1)}(\delta) \big], \qquad (2.3)$$
or equivalently, using ergodicity,
$$h_{\mathrm{approx}}(\delta, m) = -\lim_{N\to\infty} \frac{1}{m}\, H_N^{(m)}(\delta). \qquad (2.4)$$

In the evaluation of $h_{\mathrm{approx}}(\delta, m)$ the vectors $y^{(m)}(i)$ are allowed to self-match. This results in biased statistics. Therefore the sample entropy $h_{\mathrm{sample}}(\delta, m)$ was introduced [28], which avoids self-matching in the procedure described above. As with the Kolmogorov-Sinai entropy, both sample and approximate entropy provide a measure of the information increase over one step from $m \to m+1$. To be able to resolve complexity on scales larger than this smallest scale, multiscale entropy was introduced [1]. Here the entropy is calculated not by comparing the vectors $y^{(m)}(i)$ directly, but instead new coarse-grained sequence vectors $y^{(m)}(i;\tau)$ are compared, constructed from a coarse-grained, averaged time series $\{\nu_j(\tau)\}$,
$$\nu_j(\tau) = \frac{1}{\tau} \sum_{i=(j-1)\tau+1}^{j\tau} x_i, \qquad 1 \le j \le N/\tau,$$
where $\tau = 1, 2, 3, \ldots$. Note that $\nu_j(1) = x_j$. For nonzero $\tau$ the original time series $\{x_i\}$ is segmented into $N/\tau$ coarse-grained sequences, each segment being of length $\tau$. For each segment the mean value $\nu_j(\tau)$ is calculated; these now constitute the coarse-grained time series $\{\nu_j(\tau)\}$. From this time series the $m$-dimensional sequence vectors $y^{(m)}(i;\tau)$ are built as before. The coarse graining introduces smoothing and decorrelation of the vectors. Now the multiscale entropy can be defined as the sample (or approximate) entropy for the new vectors, defined by using $\nu_j(\tau)$ instead of $x_i$. Both sample entropy and approximate entropy involve two free parameters, the sequence length $m$ and the tolerance level $\delta$.
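The coarse-graining step and the resulting MSE signature can be sketched as follows. This is a minimal sketch, not the authors' program: it assumes the usual sample-entropy convention of excluding self-matches and counting template pairs of lengths $m$ and $m+1$ over the same index range, and the function names are our own.

```python
import math


def coarse_grain(x, tau):
    """nu_j(tau): means of consecutive, non-overlapping blocks of length tau."""
    return [sum(x[(j - 1) * tau : j * tau]) / tau
            for j in range(1, len(x) // tau + 1)]


def sample_entropy(x, m, delta):
    """Sample entropy -log(A/B): B (resp. A) counts pairs of templates of
    length m (resp. m+1) within Chebyshev distance delta; no self-matches."""
    N = len(x)

    def count(mm):
        c = 0
        for i in range(N - m):
            for j in range(i + 1, N - m):
                if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= delta:
                    c += 1
        return c

    B, A = count(m), count(m + 1)
    return -math.log(A / B) if A > 0 and B > 0 else float("inf")


def multiscale_entropy(x, m, delta, scales):
    """MSE signature: sample entropy of the coarse-grained series at each
    scale tau, with delta held fixed across scales as described above."""
    return [sample_entropy(coarse_grain(x, tau), m, delta) for tau in scales]
```

A plot of `multiscale_entropy(x, 2, delta, range(1, 21))` against the scale factor corresponds to the signatures shown in Fig. 1.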
Their output measures the likelihood that two sequences of length $m$ which are close to each other stay close at the next step, within a tolerance level $\delta$. If the output is zero, consecutive sequences are identical. Obviously, as $\delta$ decreases the two entropies increase, as it becomes harder to find consecutive sequences which are close within the prescribed $\delta$-tolerance level. The proposed measures are therefore not absolute measures. In our analysis we used $m = 2$ and $\delta = 0.15\sigma$, where $\sigma$ is the standard deviation of the original time series. For a discussion of optimal choices for $m$ and $\delta$, see Ref. [33].

In Ref. [31] it was noted that, through the scale parameter $\tau$ and the associated process of averaging, the variation of the data decreases with increasing $\tau$: the standard deviation of the coarse-grained data decreases with increasing scale $\tau$, while the tolerance level $\delta$ is kept fixed for all scales and is not adjusted to the standard deviation at each scale. When $\delta$ is fixed for all scales, one is therefore also measuring the standard deviation and not just entropy. However, as pointed out in Ref. [32], the $\tau$-independent tolerance level $\delta$, determined as a percentage of the standard deviation of the original time series, exactly avoids the differences in entropy being caused by differences in standard deviations. Moreover, there does not exist a universal relationship between entropy and standard deviation; the relationship between the two depends on the correlation properties. For example, differences in standard deviation cannot be used to clearly distinguish between young healthy patients and patients with congestive heart failure [32]. In Ref. [32] a simple example is given whereby a periodic signal with variance $\sigma_p$ and a random signal with variance $\sigma_r$ are considered. The entropy of the periodic signal is always smaller than that of the random signal, even if $\sigma_p \gg \sigma_r$.
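The example from Ref. [32] is easy to reproduce numerically. The sketch below is our own construction (arbitrary amplitudes, period and seed, and a compact self-contained sample-entropy helper); it uses the paper's choices $m = 2$ and $\delta = 0.15\sigma$ and compares a large-amplitude periodic signal with a small-amplitude random one.

```python
import math
import random


def sampen(x, m=2, delta=None):
    """Compact sample entropy; delta defaults to 0.15 * std(x), the
    tolerance choice used in the text."""
    if delta is None:
        mu = sum(x) / len(x)
        delta = 0.15 * math.sqrt(sum((v - mu) ** 2 for v in x) / len(x))
    N = len(x)

    def count(mm):
        return sum(
            1
            for i in range(N - m)
            for j in range(i + 1, N - m)
            if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= delta
        )

    B, A = count(m), count(m + 1)
    return -math.log(A / B) if A > 0 and B > 0 else float("inf")


random.seed(1)
# Periodic signal with large amplitude vs. random signal with small amplitude:
periodic = [10.0 * math.sin(2.0 * math.pi * i / 20.0) for i in range(400)]
noise = [random.gauss(0.0, 0.1) for _ in range(400)]

# Despite its far larger variance, the periodic signal has the smaller entropy.
sp, sn = sampen(periodic), sampen(noise)
```

Here `sp < sn` even though the variance of `periodic` exceeds that of `noise` by several orders of magnitude, illustrating that sample entropy is not simply a proxy for standard deviation.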
In Section 4, we show that this statement depends on the relationship between the correlation time of the random signal, the period of the periodic signal and the sampling time.

3. Analysis of cardiac data using multiscale entropy

In this section, we reproduce the analysis of Ref. [1]. We use data provided in Ref. [34], which consist of consecutive heartbeat intervals (R-R intervals) derived from young healthy patients, patients with congestive heart failure and patients with atrial fibrillation. The results are depicted in Fig. 1 and show distinctive signatures for all three cases. The signatures were obtained by using MSE and subsequently averaging over all patients. For the averaging of the healthy heart signals and the congestive heart failure data we used 15 data sets each. For atrial fibrillation we had only one case. The time series were filtered to remove spurious outliers.

The differences in the MSE signatures are striking. However, we note that we also found cases of congestive heart failure with the signature of a young healthy heart. Moreover, although the differences between the three groups are striking for their averages, we found that MSE is not a reliable diagnostic tool for individual patients: the MSE signatures of individual patients are very hard to associate with the signatures of their averages. However, we note that the authors of Ref. [1] stress in Ref. [36] that MSE is a detector of complexity rather than a diagnostic tool. The question of whether MSE can be used as a diagnostic tool to discriminate between healthy and pathological patients has been discussed in Refs. [35,36]. In Ref. [35] it is argued that the (averaged) signature of multiscale entropy for elderly healthy patients is similar to the (averaged) signature of patients with congestive heart failure, and hence multiscale entropy is not a good diagnostic tool for cardiac conditions. In a rebuttal [36], the authors argue that the purpose of multiscale entropy is to quantify complexity rather than to provide a diagnostic tool.
Fluctuations of aging and pathological systems show a lesser degree of complexity when compared with healthy systems. The underlying hypothesis is, as mentioned in the Introduction, that a healthy physiological system needs to exhibit processes which run on several different time scales in order to adapt to an ever-changing environment. It is in this sense that complexity is a sign of health. Multiscale analysis is able to quantify the degree of complexity, as argued in Ref. [36]; in Ref. [30] multiscale analysis is applied to binary DNA sequences and synthetic computer codes to quantify complexity. In Ref. [36] it is stressed that complexity rather than irregularity is investigated using multiscale entropy. This makes it possible to discriminate between atrial fibrillation and congestive heart failure. In Ref. [36] it is argued that increased irregularity is not necessarily associated with increased complexity. In the next section, we extend this point and show that one has to be careful in drawing any conclusions from multiscale entropy about dynamical properties such as irregularity or regularity, and that certain a priori knowledge of the dynamical time scales involved is needed to draw conclusions. In the spirit of Ref. [36], we see multiscale analysis as a general tool to study complexity.

Fig. 1. Plot of the multiscale signature for young healthy hearts (stars), atrial fibrillation (circles) and congestive heart failure (crosses).

The next

section looks at general aspects of multiscale entropy analysis and pitfalls in the interpretation of signatures and conclusions about complexity.

4. Validity of the multiscale entropy method for the interpretation of physiological data

In this section, we investigate the interpretation of the signatures of the multiscale entropy of heart data given in Ref. [1]. It was suggested that the signature for atrial fibrillation is associated with white-noise data and an underlying stochastic dynamics. In contrast, the signature of young healthy hearts was associated with underlying chaotic dynamics. The signature of congestive heart failure was believed to originate from the occurrence of regularity in the dynamics. We show that such a one-to-one correspondence between signatures and the underlying dynamics producing them cannot be made, and that care has to be taken when interpreting signatures of multiscale entropy. To interpret signatures, it is necessary to have a clear idea about the following three time scales involved: the sampling time $t_s$ of the time series, the correlation time $t_c$ of the data, and the period of possible nonlinear oscillations $t_p$. To illustrate our argument we use data from numerical simulations of the Lorenz system
$$\dot{x} = \sigma(y - x), \qquad \dot{y} = rx - y - xz, \qquad \dot{z} = xy - bz. \qquad (4.1)$$
To assure that the dynamics has settled down onto its respective regular or chaotic attractor, we always allow for a transient before sampling a data set. We can measure the correlation time $t_c$ by computing the autocorrelation function of a data set, and the period $t_p$ by detecting peaks in the Fourier spectrum of a time series. Note that $t_p = \infty$ if no such peaks are present.

We first concentrate on systems without any regularity. By that we mean either purely random data or chaotic dynamics without any prevalent nonlinear frequencies. In Fig. 2a, we show the result for a purely random data set.
The time series consists of white-noise data. The resemblance to the signature for atrial fibrillation is evident (compare with Fig. 1). The monotonic decrease of entropy as a function of the scale factor $\tau$ is due to the averaging involved in the calculation of the multiscale entropy. The larger $\tau$, the more consecutive data points are averaged. As the average is constant (zero in our case), the entropy decreases due to the law of large numbers.

However, a similar picture can be obtained by analyzing a data set which originates from the Lorenz system with parameters resulting in chaotic dynamics. A finite correlation time $t_c$ is typical for such chaotic dynamics. If the sampling time $t_s$ is chosen such that $t_s \gg t_c$, measurements or samplings cannot resolve the deterministic chaoticity exhibiting a finite correlation time. Instead, the resulting time series appears uncorrelated and yields a signature similar to the $\delta$-correlated white-noise data, as can be seen in Fig. 2b. If, however, the sampling is done at a rate with $t_s \ll t_c$, the time series can resolve the finite correlation, and we see indeed the signature of a young healthy heart as in Fig. 1. In Fig. 3, we show the result for the same parameters as in Fig. 2b but now for a smaller sampling time. The initial increase in entropy with increasing scale factor $\tau$ can be explained by noting that increasing the scale factor, i.e., averaging consecutive $\tau$ data points, effectively decorrelates a data set with a finite correlation time $t_c$. This results in an increase of disorder and entropy. For larger scale factors the entropy has to decrease inevitably, as discussed above for the purely random case.

We now look at chaotic dynamics which involve prevalent nonlinear frequencies with finite periods $t_p$. These frequencies can be detected in a Fourier spectrum. We show examples of such Fourier spectra in Fig. 4.
The local signature of the multiscale entropy for a time series of a chaotic system with prevalent frequencies depends crucially on the ratio of the time scale associated with the regular periodic part, $t_p$, and the time scale associated with the chaotic dynamics, $t_c$. In Fig. 5, we show a sketch of the two extreme cases, $t_p \gg t_c$ and $t_p \ll t_c$. Note that the autocorrelation time $t_c$ of a signal depends on its sampling time $t_s$.
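To make the procedure concrete, the following sketch integrates the Lorenz system (4.1) with a hand-rolled fixed-step RK4 scheme, discards a transient, and estimates the correlation time $t_c$ as the first lag at which the autocorrelation drops below $1/e$. The step size, initial condition, transient length and the $1/e$ criterion are our own choices; the paper does not specify its numerical scheme.

```python
import math


def lorenz_series(r=28.0, sigma=10.0, b=8.0 / 3.0, dt=0.01,
                  n_samples=4000, transient=1000):
    """Integrate the Lorenz system (4.1) with fixed-step RK4, discard an
    initial transient, and return the sampled x-component (t_s = dt here)."""
    def f(s):
        x, y, z = s
        return (sigma * (y - x), r * x - y - x * z, x * y - b * z)

    def rk4_step(s):
        k1 = f(s)
        k2 = f(tuple(u + 0.5 * dt * k for u, k in zip(s, k1)))
        k3 = f(tuple(u + 0.5 * dt * k for u, k in zip(s, k2)))
        k4 = f(tuple(u + dt * k for u, k in zip(s, k3)))
        return tuple(u + dt / 6.0 * (a + 2.0 * p + 2.0 * q + w)
                     for u, a, p, q, w in zip(s, k1, k2, k3, k4))

    s = (1.0, 1.0, 1.0)
    xs = []
    for step in range(transient + n_samples):
        s = rk4_step(s)
        if step >= transient:
            xs.append(s[0])
    return xs


def correlation_time(x, dt):
    """Estimate t_c as the first lag where the autocorrelation drops
    below 1/e; returns infinity if it never does."""
    mu = sum(x) / len(x)
    var = sum((u - mu) ** 2 for u in x)
    for lag in range(1, len(x)):
        c = sum((x[i] - mu) * (x[i + lag] - mu) for i in range(len(x) - lag))
        if c / var < 1.0 / math.e:
            return lag * dt
    return float("inf")
```

With $t_s = dt = 0.01$ t.u. the sampled series resolves the finite correlation ($t_s \ll t_c$); sub-sampling the returned series (e.g. keeping only every few-hundredth point) mimics the regime $t_s \gg t_c$, in which the data look uncorrelated. The period $t_p$ could be estimated analogously from the dominant peak of a Fourier spectrum.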

Fig. 2. (a) Plot of the multiscale entropy for a purely random data set; the time series consists of data drawn from a white-noise distribution with a variance of 1. (b) Plot of the multiscale entropy for the Lorenz system with $r = 28$, $\sigma = 10$ and $b = 8/3$. A time series of $4.5 \times 10^3$ data points has been used with a total time length of 4.5 t.u. The time scales are $t_c = 0.3$ t.u. and $t_s = 3.0$ t.u.

The signature of regular dynamics is a dip of the multiscale entropy for certain values of the scale factor $\tau$, whereas the local signature of chaotic dynamics is an increase of the multiscale entropy. The difference in MSE between the two cases depicted in Fig. 5 becomes clear if we recollect that the coarse-graining factor $\tau$ smoothes the signal by averaging. In the case $t_p \gg t_c$ this averaging simply results in smoothing over the fast, finitely correlated part of the dynamics, resulting in a smoother, more regular signal. In this case, we expect a dip in MSE with initially increasing $\tau$. However, in the case $t_p \ll t_c$, the only effect of increasing $\tau$ is to decorrelate the finite-$t_c$ signal, and no initial dip can be observed. The occurrence or non-occurrence of the initial dip in MSE is linked to the two competing effects of increasing the coarse-graining factor $\tau$: the effect of decorrelation, leading to an increase of MSE, and the effect of smoothing, leading to a decrease in MSE if $t_c$ is small enough. In the case $t_p \gg t_c$ we obtain indeed a signature resembling that of congestive heart failure (compare with Fig. 1). In Fig. 6, we show results for the Lorenz system with $r = 213$, $\sigma = 10$ and $b = 8/3$, which yield the Fourier spectrum in Fig. 4.
For these parameters we have $t_p = 0.5$ t.u. and $t_c = 0.08$ t.u. The influence of the sampling time $t_s$ on the signature of the multiscale entropy can be understood by looking at the sketch in Fig. 5a. Increasing $t_s$ will decorrelate the chaotic component, which is correlated on the fast

time scale $t_c$. However, increasing $t_s$ will not affect the slow time scale part of the regular component (if $t_s$ is moderate). We note that the characteristic decrease of the multiscale entropy for small $\tau$ is strongly dependent on the sampling time $t_s$ for noise-free data. The inclusion of $\delta$-correlated noise yields a more pronounced dip (see Fig. 6).

Fig. 3. Plot of the multiscale entropy for the Lorenz system with $r = 28$, $\sigma = 10$ and $b = 8/3$. A time series of $4.5 \times 10^3$ data points has been used with a total time length of 4.5 t.u. and $t_s = 0.001$ t.u.

Fig. 4. Fourier spectrum of a time series of the Lorenz system with $r = 213$, $\sigma = 10$ and $b = 8/3$. The maximal peak corresponds to $t_p = 0.5$ t.u.

5. Summary

This paper is an attempt to model and understand the MSE of cardiac data. This is carried out using electrocardiogram data and the Lorenz model. We have revisited the analysis of physiological cardiac data by means of the multiscale entropy proposed in Ref. [1]. Signatures of the multiscale entropy for the three cases of young

Fig. 5. Sketch of a time series which involves a finite frequency and a finite correlation length. The regular part is depicted with dashed lines, the chaotic part with solid lines. The boxes are of length $\tau$ and indicate the segmentation of the signal by smoothing over $\tau$: (a) the case $t_p \gg t_c$; (b) the case $t_p \ll t_c$.

Fig. 6. Plot of the multiscale entropy for the Lorenz system with $r = 213$, $\sigma = 10$ and $b = 8/3$. Lower curve: noise-free data; $t_p = 0.5$ t.u., $t_c = 0.08$ t.u. and $t_s = 0.00402$ t.u. Upper curve: Gaussian noise with variance 2.89 has been added as measurement noise to the data.

healthy hearts, atrial fibrillation and congestive heart failure were identified, and we discussed the interpretation of these signatures. By studying the Lorenz equation as a simple deterministic chaotic system, we showed that the MSE signature one obtains from a time series of a dynamical system depends on the sampling time interval, the correlation time and, if any frequencies are present, the period of oscillation. For example, the sampling rate can cause decorrelation or suppress periodicities, and thus affects the observed MSE signature. This is a note of caution to the reader to be wary of drawing definite conclusions about the nature of the underlying dynamical system from the MSE signature of a sampled time series without detailed knowledge of the different time scales involved. The degree of complexity depends crucially on the time scales under consideration.

Chaotic time series with and without nonlinear frequencies yield the same MSE signature as a purely white-noise signal if the sampling time is greater than the correlation time and the period of possible nonlinear frequencies. In the case where the correlation time in the sampled data is much greater than the sampling time (and the periods of possible nonlinear frequencies are at least smaller than the correlation time), coarse graining will cause the sample entropy to increase initially due to decorrelation before it begins to decrease according to the law of large numbers. The MSE signature of the young healthy heart shows such a behaviour, suggesting the presence of complex long-time correlations. However, we note that in the case of cardiac data, which is a time series of interbeat intervals, the sampling time interval obviously cannot be varied. Whether periodicities in a chaotic signal result in the MSE signature of congestive heart failure depends crucially on the relation between the correlation time $t_c$ and the period $t_p$.
Only if t_c << t_p do we find the dip in multiscale entropy which was found to be an indication of loss of complexity and the appearance of regularity in congestive heart failure [1]. The finite correlation, however, does not necessarily have to come from a deterministic chaotic system; it could as well be some noise with finite correlation time. In conclusion: a time series constructed from the repeated succession of the same Mahler symphony may look complex on short intervals, whereas the actual signal is periodic [37]. In the same way, we have to be careful when drawing conclusions about the underlying dynamics from time series analysis. In a recent paper [30], published after our submission, the authors address issues similar to ours. In Ref. [30] the influence of uncorrelated noise and of the sample frequency on multiscale entropy is investigated; in addition to our work, Ref. [30] also investigates the influence of outliers within a given data set on multiscale entropy. We note, however, that the definition of sample time in their work differs from ours. Whereas sample time in Ref. [30] defines the accuracy with which Holter monitor data are collected to produce R-R interval data series, we use sample time to denote the sampling interval of the data set which is directly analyzed (note that, according to our definition, there is no freedom in choosing different sampling times for R-R interval data). Whereas changes in the sampling time of the analyzed data series, and its ratio to the correlation time and to possibly present regular frequencies, were found here to be of great importance, changes in the sample frequency as defined in Ref. [30] were not important. The study of the influence of noise in Ref.
[30] is limited to uncorrelated noise and does not address the connection between the correlation time of the noise and the correlation time of the deterministic system or the period of the embedded regular dynamics.

Acknowledgements

We are very grateful to Madalena Costa for her help with handling the cardiac data and to Douglas Lake for providing the latest sample entropy program. We also thank Niels Wessel for valuable discussions. G.A.G. gratefully acknowledges support by the Australian Research Council, DP0452147. We thank the anonymous referee for pointing out to us Refs. [30,31,35].

References

[1] M. Costa, A.L. Goldberger, C.-K. Peng, Multiscale entropy analysis of complex physiological time series, Phys. Rev. Lett. 89 (2002) 068102.
[2] Spotlight issue on chaos in the cardiovascular system, Cardiovasc. Res. 31 (1996).

[3] Focus issue: fibrillation in normal ventricular myocardium, Chaos 8 (1998).
[4] Focus issue: mapping and control of complex cardiac arrhythmias, Chaos 12 (2002).
[5] Special issue on the heart, Chaos, Solitons Fractals 13 (2002) 1579-1762.
[6] A.L. Goldberger, Is the normal heartbeat chaotic or homeostatic?, News Physiol. Sci. 6 (1991) 87-91.
[7] F.X. Witkowski, K.M. Kavanagh, P.A. Penkoske, R. Plonsey, M.L. Spano, W.L. Ditto, D.T. Kaplan, Evidence for determinism in ventricular fibrillation, Phys. Rev. Lett. 75 (1995) 1230-1233.
[8] G. Sugihara, W. Allan, D. Sobel, K.D. Allan, Nonlinear control of heart rate variability in human infants, Proc. Natl. Acad. Sci. USA 93 (1996) 2608-2613.
[9] T.A. Denton, G.A. Diamond, R.H. Helfant, S. Khan, H. Karagueuzian, Fascinating rhythm: a primer on chaos theory and its application to cardiology, Am. Heart J. 120 (1990) 1419-1440.
[10] J.E. Skinner, A.L. Goldberger, G. Mayer-Kress, R.E. Ideker, Chaos in the heart: implications for clinical cardiology, Biotechnol. 8 (1990) 1018-1024.
[11] A.L. Goldberger, Nonlinear dynamics for clinicians: chaos theory, fractals and complexity at the bedside, Lancet 347 (1996) 1312-1314.
[12] C.-S. Poon, C.K. Merrill, Decrease of cardiac chaos in congestive heart failure, Nature 389 (1997) 492-495.
[13] D.J. Christini, L. Glass, Introduction: mapping and control of complex cardiac arrhythmias, Chaos 12 (2002) 732-739.
[14] C.D. Wagner, P.B. Persson, Chaos in the cardiovascular system: an update, Cardiovasc. Res. 40 (1998) 257-264.
[15] H. Kantz, T. Schreiber, Nonlinear Time Series Analysis, second ed., Cambridge University Press, Cambridge, 2004.
[16] J. Kurths, A. Voss, A. Witt, P. Saparin, H.J. Kleiner, N. Wessel, Quantitative analysis of heart rate variability, Chaos 5 (1995) 88-94.
[17] A. Voss, J. Kurths, H.J. Kleiner, A. Witt, N. Wessel, P. Saparin, K.J. Osterziel, R. Schurath, R. Dietz, The application of methods of non-linear dynamics for the improved and predictive recognition of patients threatened by sudden cardiac death, Cardiovasc. Res. 31 (1996) 419-433.
[18] N. Wessel, A. Voss, J. Kurths, A. Schirdewan, K. Hnatkova, M. Malik, Evaluation of renormalised entropy for risk stratification using heart rate variability data, Med. Biol. Eng. Comput. 38 (2000) 680-685.
[19] N. Wessel, C. Ziehmann, J. Kurths, U. Meyerfeldt, A. Schirdewan, A. Voss, Short-term forecasting of life-threatening cardiac arrhythmias based on symbolic dynamics and finite-time growth rates, Phys. Rev. E 61 (2000) 733-739.
[20] N. Wessel, H. Malberg, U. Meyerfeldt, A. Schirdewan, J. Kurths, Classifying simulated and physiological heart rate variability signals, Comput. Cardiol. 29 (2002) 133-135.
[21] N. Marwan, N. Wessel, U. Meyerfeldt, A. Schirdewan, J. Kurths, Recurrence-plot-based measures of complexity and their application to heart-rate-variability data, Phys. Rev. E 66 (2002) 026702.
[22] M. Säkki, J. Kalda, M. Vainu, M. Laan, The distribution of low-variability periods in human heartbeat dynamics, Physica A 338 (2004) 255-260.
[23] J. Hayano, F. Yamasaki, S. Sakata, A. Okada, S. Mukai, T. Fujinami, Spectral characteristics of ventricular response to atrial fibrillation, Am. J. Physiol. 273 (1997) H2811-H2816.
[24] W. Zeng, L. Glass, Statistical properties of heartbeat intervals during atrial fibrillation, Phys. Rev. E 54 (1996) 1779-1784.
[25] Ya.B. Pesin, Characteristic Lyapunov exponents and smooth ergodic theory, Russ. Math. Surveys 32 (1977) 55.
[26] J.R. Dorfman, An Introduction to Chaos in Nonequilibrium Statistical Mechanics, Cambridge University Press, Cambridge, 1999.
[27] S.M. Pincus, Approximate entropy as a measure of system complexity, Proc. Natl. Acad. Sci. USA 88 (1991) 2297-2301.
[28] J.S. Richman, J.R. Moorman, Physiological time-series analysis using approximate entropy and sample entropy, Am. J. Physiol. Heart Circ. Physiol. 278 (2000) H2039.
[29] J.L. Starck, F. Murtagh, A. Bijaoui, Image Processing and Data Analysis, Cambridge University Press, Cambridge, 1998.
[30] M. Costa, A.L. Goldberger, C.-K. Peng, Multiscale entropy analysis of biological signals, Phys. Rev. E 71 (2005) 021906.
[31] V.V. Nikulin, T. Brismar, Comment on "Multiscale entropy analysis of complex physiological time series", Phys. Rev. Lett. 92 (2004) 089803.
[32] M. Costa, A.L. Goldberger, C.-K. Peng, Reply to [31], Phys. Rev. Lett. 92 (2004) 089804.
[33] D.E. Lake, J.S. Richman, M.P. Griffin, J.R. Moorman, Sample entropy analysis of neonatal heart rate variability, Am. J. Physiol. Regul. Integr. Comp. Physiol. 283 (2002) R789-R797.
[34] MIT-BIH Normal Sinus Rhythm Database, BIDMC Congestive Heart Failure Database and MIT-BIH Atrial Fibrillation Database, available at http://www.physionet.org/physiobank/database/#ecg.
[35] N. Wessel, A. Schirdewan, J. Kurths, Comment: intermittently decreased beat-to-beat variability in congestive heart failure, Phys. Rev. Lett. 91 (2003) 119801.
[36] M. Costa, A.L. Goldberger, C.-K. Peng, Reply to [35], Phys. Rev. Lett. 91 (2003) 119802.
[37] P.E. Rapp, T. Schmah, Complexity measures in molecular psychiatry, Mol. Psychiatry 1 (1996) 408-416.