Use of the Autocorrelation Function for Frequency Stability Analysis


W.J. Riley, Hamilton Technical Services

Introduction

This paper describes the use of the autocorrelation function (ACF) as a complement to other statistical and spectral methods for frequency stability analysis.

The Autocorrelation Function

The autocorrelation function of a time series z_t for lag k is defined as:

    ρ_k = E{(z_t − μ)(z_{t+k} − μ)} / σ²

where E{} is the expectation operator, z_t is the value of the time series at time t, μ is its mean, and σ² is its variance. A common estimate of the autocorrelation function is:

    r_k = Σ_{t=1}^{N−k} (z_t − z̄)(z_{t+k} − z̄) / Σ_{t=1}^{N} (z_t − z̄)²

where z̄ = (1/N) Σ z_t is the sample mean, and where the lags are k = 0, 1, ..., K, with K at most N − 1. An equivalent (and faster) estimate of the summation in the numerator can be made as the product of the Fourier transforms of the two terms, based on the fact that convolution in the time domain is equivalent to multiplication in the frequency domain. The autocorrelation sequence calculated using the Fast Fourier Transform (FFT) produces autocorrelation points at lags up to one-half of the data length. An autocorrelation plot is often restricted to fewer points to better show the values at smaller lags.

Uses of the Autocorrelation Function

Because the ACF and the power spectrum are related by the Fourier transform, they are mathematically equivalent and contain the same information. However, the power spectrum is more familiar, and its interpretation is generally easier. The autocorrelation sequence is most useful for theoretical work, for determining the non-whiteness of data or residuals, for detecting periodic components in data, and for identifying the dominant power law noise type. The latter technique is the main subject of this paper.

Examples of Autocorrelation Plots for Power Law Noise

Figure 1 shows autocorrelation plots, along with the corresponding data, for white, flicker and random walk noise.
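As a concrete illustration (not from the paper), the direct estimate r_k and its FFT-based equivalent can be sketched in Python with numpy; the function names are illustrative:

```python
import numpy as np

def acf_direct(z, K):
    """Lag 0..K autocorrelation estimate r_k by direct summation."""
    z = np.asarray(z, dtype=float)
    N = len(z)
    d = z - z.mean()
    denom = np.sum(d * d)
    return np.array([np.sum(d[:N - k] * d[k:]) / denom for k in range(K + 1)])

def acf_fft(z, K):
    """Same estimate via the FFT: multiply the transform by its conjugate."""
    z = np.asarray(z, dtype=float)
    N = len(z)
    d = z - z.mean()
    nfft = 2 ** int(np.ceil(np.log2(2 * N)))     # zero-pad to avoid circular wrap-around
    F = np.fft.rfft(d, nfft)
    acov = np.fft.irfft(F * np.conj(F))[:K + 1]  # autocovariance sums at lags 0..K
    return acov / acov[0]

rng = np.random.default_rng(1)
z = rng.standard_normal(1000)
assert np.allclose(acf_direct(z, 20), acf_fft(z, 20))
```

Zero-padding the FFT to at least twice the data length is what makes the circular correlation equal the linear one; without it, values from the end of the record wrap around into the small lags.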
There is clearly a large difference in the degree of correlation among these data sets, as indicated by the shapes of their autocorrelation functions.

Figure 1a. White FM Noise Autocorrelation Plot
Figure 1b. Flicker FM Noise Autocorrelation Plot
Figure 1c. Random Walk FM Noise Autocorrelation Plot

Lag Scatter Plots

A lag scatter plot is a plot of the phase or frequency data plotted against itself with a lag of k: the data at time t+k is plotted on the y-axis versus the value at time t on the x-axis. This plot is another way of showing the degree of correlation in the data, and the slope of a linear fit to these points is closely related to the lag k autocorrelation. Examples of lag 1 scatter plots for frequency data having the five most common power law noise types (white PM, flicker PM, white FM, flicker FM, and random walk FM) are shown in Figure 2 below.

Figure 2. Lag 1 Scatter Plots for Frequency Data

Power Law Noise Identification

It is possible to identify the dominant power law noise process in a phase or frequency data record by examination of its autocorrelation function. In particular, the lag 1 autocorrelation can be used for that purpose, as shown in the following plot. The Stable32 Autocorrelation function provides an estimate of the power law noise type (white, flicker, random walk, flicker walk, or random run) for the particular data type (phase or frequency). This estimate is based on the lag 1 autocorrelation value, and is effective for data sets of 30 points and larger. White (uncorrelated) noise has a nominal lag 1 autocorrelation of zero, while flicker PM and white PM noise have nominal lag 1 autocorrelations of −1/3 and −1/2, respectively. The more divergent noises have lag 1 autocorrelation values that are positive, dependent on the number of samples, and tend to be large (approaching 1). For frequency data, the threshold values that separate the white, flicker and random walk FM power law noise types can be determined by empirical fits of the form a + b ln(N), where N is the number of analysis points, to the lag 1 values for equal amounts of the two adjacent noise types. Those lines have the same slope versus log N and are separated by 0.5, with the boundary between flicker FM and random walk FM limited to a maximum value of 0.998.
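The claim that the fitted slope of a lag scatter plot tracks the lag autocorrelation can be checked numerically. A minimal sketch in Python (numpy assumed; the variable names are illustrative, and random walk FM is used because the effect is easiest to see on strongly correlated data):

```python
import numpy as np

rng = np.random.default_rng(2)
y = np.cumsum(rng.standard_normal(2000))  # simulated random walk FM frequency data

k = 1                                     # lag used for the scatter plot
x, ylag = y[:-k], y[k:]                   # scatter points (y_t, y_{t+k})
slope = np.polyfit(x, ylag, 1)[0]         # slope of a least-squares linear fit

d = y - y.mean()
r1 = np.sum(d[:-k] * d[k:]) / np.sum(d * d)  # lag 1 autocorrelation estimate
print(slope, r1)                          # the two values nearly coincide here
```

For white noise both the slope and r_1 scatter around zero, while for the divergent noises both approach 1, which is what the Figure 2 panels show visually.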
The flicker PM and white PM noises have boundary values of −0.17 and −0.42 that are independent of N. These noise regions are shown in Figure 3 below. It is difficult, however, to distinguish the more divergent noise processes by this method (e.g. flicker walk FM and random run FM for frequency data, and flicker FM and above (lower α) for phase data).

Figure 3. Lag 1 Autocorrelation Power Law Noise Boundaries for Frequency Data. (The plot shows the lag 1 autocorrelation versus the number of analysis points N, from the minimum of 30 up to 10^5, with regions for random walk FM above the 0.6 + 0.05 ln N line, capped at 0.998; flicker FM above 0.1 + 0.05 ln N; white FM near zero; flicker PM below −0.17; and white PM below −0.42.)

A more refined method for identifying power law noises using the lag 1 autocorrelation has been suggested by C. Greenhall [4], based on the properties of fractionally integrated noises having spectral densities of the form f^(−2δ). For δ < 1/2, the process is stationary and has a lag 1 autocorrelation, independent of N, equal to ρ₁ = δ/(1 − δ), and the noise type can therefore be estimated from δ = r₁/(1 + r₁). For frequency data, white PM noise has ρ₁ = −1/2, flicker PM noise has ρ₁ = −1/3, and white FM noise has ρ₁ = 0. For the more divergent noises, first differences of the data are taken until a stationary process is obtained (δ < 0.25). The noise identification process therefore uses p = −round(2δ) − 2d, where round(2δ) is 2δ rounded to the nearest integer and d is the number of times that the data is differenced to bring δ below 0.25. If z is a τ-average of frequency data y(t), then α = p; if z is a τ-average of phase data x(t), then α = p + 2, where α is the usual power law exponent of S(f) ∝ f^α, thereby determining the noise type at that averaging time. This is the method used in Stable32 for estimating the noise type from the lag 1 autocorrelation. It has excellent discrimination for all common power law noises for both phase and frequency data, including difficult cases with mixed noises. We encourage you to experiment with it for your applications.
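The identification procedure just described can be sketched in Python (a simplified reading of the method as summarized above, with numpy assumed; this is not Stable32's actual code and omits averaging-factor handling):

```python
import numpy as np

def lag1_acf(z):
    """Lag 1 autocorrelation estimate r1."""
    d = z - z.mean()
    return np.sum(d[:-1] * d[1:]) / np.sum(d * d)

def noise_id(z, data_type="frequency", dmax=3):
    """Estimate the power law exponent alpha from the lag 1 autocorrelation.

    Difference the data until delta = r1/(1 + r1) < 0.25 (stationary),
    then p = -round(2*delta) - 2*d, and alpha = p for frequency data
    or alpha = p + 2 for phase data."""
    z = np.asarray(z, dtype=float)
    d = 0                                   # number of differencing passes
    while True:
        r1 = lag1_acf(z)
        delta = r1 / (1.0 + r1)
        if delta < 0.25 or d >= dmax:
            break
        z = np.diff(z)                      # first difference the data
        d += 1
    p = -int(round(2 * delta)) - 2 * d
    return p if data_type == "frequency" else p + 2

rng = np.random.default_rng(3)
wfm = rng.standard_normal(1000)             # white FM: expect alpha near 0
rwfm = np.cumsum(rng.standard_normal(1000)) # random walk FM: expect alpha near -2
print(noise_id(wfm), noise_id(rwfm))
```

For the most divergent noises (flicker walk, random run) additional differencing passes are needed, which is why dmax allows several; as noted above, discrimination is weaker there.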

Stable32 Autocorrelation Function

The Stable32 Autocorrelation function has provisions for plotting the ACF for a selectable number of lags after applying a chosen averaging factor to the phase or frequency data. It provides an estimate of the power law noise type (white, flicker, random walk, flicker walk, or random run) for the particular data type (phase or frequency), displayed as a message on the plot. This estimate includes both the name of the closest power law noise type and the estimated alpha value, based on the lag 1 autocorrelation value, and is available for data sets of 30 points and larger. The estimated alpha value is particularly helpful for mixed noises because it indicates the approximate proportions of the two dominant noise types (e.g. alpha = 1.50 indicates an equal mixture of white and flicker PM noise). Stable32 also has provisions for plotting a lag scatter plot for a selected lag number, both as an insert on the autocorrelation plot and as a separate plot that includes 1, 2 and 3 sigma error boxes. These lag scatter plots include least-squares linear fits to the plotted points. Typical Stable32 ACF and lag scatter plots are shown in Figure 4 below.

Figure 4a. Autocorrelation Plot
Figure 4b. Lag Scatter Plot

References

The following references apply to the use of the autocorrelation function in the Stable32 program for frequency stability analysis.

1. W.H. Press, B.P. Flannery, S.A. Teukolsky and W.T. Vetterling, Numerical Recipes in C, Cambridge University Press, 1988, ISBN 0-521-35465-X, Section 13.2.
2. G. Box and G. Jenkins, Time Series Analysis, Holden-Day, Inc., 1976, ISBN 0-8162-1104-3.
3. D.B. Percival and A.T. Walden, Spectral Analysis for Physical Applications, Cambridge University Press, 1993, ISBN 0-521-43541-2.
4. C. Greenhall, "Another Power-Law Identifier That Uses Lag-1 Autocorrelation", private communication, August 28, 2003.

W. Riley 8/3/03