ANALYSIS OF HIGH DIMENSIONAL TIME SERIES: OCEAN BOTTOM SEISMOGRAPH DATA

Genshiro Kitagawa (1) and Tetsuo Takanami (2)
(1) The Institute of Statistical Mathematics, 4-6-7 Minami-Azabu, Minato-ku, Tokyo, Japan
(2) Hokkaido University, Research Center for Seismology and Volcanology, Kita-ku, Sapporo, Japan

Abstract. To explore the underground velocity structure based on OBS (Ocean Bottom Seismograph) data, it is necessary to extract reflection or refraction waves from data contaminated with relatively large direct waves. In this paper, we consider the time series decomposition, the spatial decomposition and the time-space decomposition of the data. In the spatial decomposition and the time-space decomposition, the difference of the travel times (the delay) corresponding to the underground layer structure is taken into account.

Key words and phrases: Explosion seismology, underground structure, Bayesian modeling, general state space model, Monte Carlo filter, self-organization.

1. Introduction

In a cooperative research project with the University of Bergen, the Research Center for Seismology and Volcanology, Hokkaido University, performed a series of experiments to observe artificial seismic signals by ocean bottom seismograph (OBS) near Norway (Berg et al. (2001)). The objective was to explore the underground velocity structure. In one experiment, for example, 39 OBSs were set on the bottom of the sea (500-2000 m depth) with distances of 10-30 km and observed the signals generated by an air-gun equipped on board. The ship moved with a constant speed and generated a signal 982 times, once every 200 m (70 seconds). At each OBS, four-channel time series (2 horizontal and 2 vertical (high-gain and low-gain) components) were observed with a sampling interval of 1/256 second. As a result, four-channel time series with 5366 observations were obtained at each OBS. In this article, we consider methods of extracting information about the underground velocity structure from the multi-channel time series obtained from the array of OBSs.
In particular, an important problem is the estimation of the reflection waves from the observed seismograms. For that purpose, we first apply a time series decomposition model. We then develop a spatial model that takes into account the delay of the propagation of the direct and reflection waves. Finally, we extend the model to a space-time model that takes into account both the time series structure and the spatial delay structure.
Fig. 1. Exploring underground structure by OBS data (sea surface, sea bottom, OBS).

2. State Space Decomposition of Time Series

2.1 Separation of Direct Waves and Reflection/Refraction Waves

The problem of extracting reflection waves and refraction waves from relatively large direct waves by state space modeling is considered here. Note that in this section we consider the modeling and smoothing of a single channel, so the subscript identifying the channel is omitted. For the extraction of the reflection (or refraction) waves from the direct waves, we consider the model

(2.1)  y_n = r_n + s_n + ε_n,

where r_n, s_n and ε_n represent the direct wave, the reflection wave and the observation noise, respectively. To separate these three components and to extract information about the underground structure from the complicated observed time series, it is assumed that both r_n and s_n are expressed by autoregressive models

(2.2)  r_n = Σ_{i=1}^{m} a_i r_{n-i} + u_n,    s_n = Σ_{i=1}^{ℓ} b_i s_{n-i} + v_n,

where the AR orders m and ℓ and the AR coefficients a_i and b_i are unknown, and u_n, v_n and ε_n are white noise sequences with u_n ~ N(0, τ_1²), v_n ~ N(0, τ_2²) and ε_n ~ N(0, σ²), respectively (Kitagawa and Takanami (1985)). The models in (2.1) and (2.2) can be combined in the state space model form

(2.3)  x_n = F x_{n-1} + G w_n,    y_n = H x_n + ε_n,

where x_n is the (m+ℓ)-dimensional state vector defined by x_n = (r_n, ..., r_{n-m+1}, s_n, ..., s_{n-ℓ+1})^T, and w_n = (u_n, v_n)^T is a two-dimensional system noise. F, G and H are, respectively,
(m+ℓ)×(m+ℓ), (m+ℓ)×2 and 1×(m+ℓ) matrices defined by

(2.4)  F = [ F_a  O ; O  F_b ],  G = [ e_1  0 ; 0  e_1 ],  H = [ 1 0 ... 0 | 1 0 ... 0 ],

where F_a is the m×m companion matrix with first row (a_1, ..., a_m), F_b is the ℓ×ℓ companion matrix with first row (b_1, ..., b_ℓ), and e_1 = (1, 0, ..., 0)^T. The variance-covariance matrix of w_n is given by

(2.5)  Q_n = [ τ_1²  0 ; 0  τ_2² ].

2.2 Separation of the Time Series and Estimation of the Parameters

If all of the parameters m, ℓ and θ = (a_1, ..., a_m, b_1, ..., b_ℓ)^T are given, the state vector x_n can be estimated by the Kalman filter and the fixed interval smoother (Anderson and Moore (1979)). In actual estimation, however, the parameters of the model are unknown. The autoregressive coefficients of the model, a_i and b_i, were estimated by fitting the AR model with observation noise

(2.6)  y_n = r_n + ε_n,    r_n = Σ_{i=1}^{m} a_i r_{n-i} + u_n

to portions of the data where apparently only the direct wave or only the reflection wave exists. The state space representation of this model is obtained as the special case ℓ = 0 (or m = 0) of (2.2). The log-likelihood of this AR(m) (or AR(ℓ)) plus noise model is given by

(2.7)  ℓ(θ_m) = -(N/2) log 2π - (1/2) Σ_{n=1}^{N} log r_n - (1/2) Σ_{n=1}^{N} ε_n²/r_n,

where ε_n = y_n - H x_{n|n-1} and r_n = H V_{n|n-1} H^T + σ², with x_{n|n-1} and V_{n|n-1} being the mean and the variance-covariance matrix of the one-step-ahead predictor of the state obtained by the Kalman filter (Jones (1980)).

2.3 Estimation of the Time Varying Variance

The variances of the autoregressive models for the direct and reflection waves, τ_1² and τ_2², are related to the amplitudes of the waves and are actually time varying. Namely, the variance is almost zero before the direct or reflection wave arrives, becomes large
depending on the amplitude of the wave, and then goes back to zero as the signal dies out. These variance parameters play the role of signal-to-noise ratios, and their estimation is the key problem for the extraction of the reflection waves. A self-organizing state space model was successfully applied for the estimation of the time-varying variance (Kitagawa (1998)). In this method, the original state vector x_n is augmented with the time-varying parameter θ_n as

(2.8)  z_n = [x_n^T, θ_n^T]^T,

where the parameter θ_n, for the present problem, is defined by

(2.9)  θ_n = [log_10 τ_{1n}², log_10 τ_{2n}²]^T.

The logarithm of the variance is used to assure the positivity of τ_{1n}² and τ_{2n}². We further assume that this parameter θ_n changes according to the random walk model

(2.10)  log_10 τ_{jn}² = log_10 τ_{j,n-1}² + η_{jn},  j = 1, 2,

where η_{jn} is a Gaussian white noise with η_{jn} ~ N(0, ξ_j²). The state space model for this augmented state is easily obtained from the original state space model for x_n and (2.10). It can be expressed in nonlinear state space model form:

(2.11)  z_n = F_n(z_{n-1}, v_n),    y_n = H_n(z_n, w_n).

Then, by applying the Monte Carlo filter/smoother (Gordon et al. (1993), Kitagawa (1996)), we can estimate the state z_n from the observations. Since the augmented state z_n contains both x_n and θ_n, the marginal posterior distributions of x_n and θ_n can be obtained simultaneously, and the variances of the direct and reflection waves are obtained automatically.

3. Spatial Filtering/Smoothing

3.1 Time-lag Structure of the Data and Spatial Smoothing

The actual time series observed at an OBS contains signals of direct waves, reflection waves, refraction waves and observation noise. Just beneath the air-gun, the direct wave (a compression wave with velocity about 1.48 km/sec) that travels through the water arrives first and dominates the time series.
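As a concrete illustration of Sections 2.1-2.2, the sketch below (in Python/NumPy; the function names and interface are ours, not from the paper) builds the state space matrices of (2.3)-(2.5) for given AR coefficients and evaluates the Kalman filter log-likelihood (2.7) by the prediction-error decomposition:

```python
import numpy as np

def decomposition_matrices(a, b, tau1_sq, tau2_sq):
    """State space matrices F, G, H, Q of (2.3)-(2.5) for the combined
    AR(m) direct-wave and AR(l) reflection-wave decomposition model."""
    m, l = len(a), len(b)
    F = np.zeros((m + l, m + l))
    F[0, :m] = a                        # companion block for r_n
    F[1:m, :m - 1] = np.eye(m - 1)
    F[m, m:] = b                        # companion block for s_n
    F[m + 1:, m:-1] = np.eye(l - 1)
    G = np.zeros((m + l, 2))
    G[0, 0] = G[m, 1] = 1.0             # u_n drives r, v_n drives s
    H = np.zeros(m + l)
    H[0] = H[m] = 1.0                   # y_n = r_n + s_n + eps_n
    Q = np.diag([tau1_sq, tau2_sq])
    return F, G, H, Q

def kalman_loglik(y, F, G, H, Q, sigma_sq, x0, V0):
    """Exact log-likelihood (2.7) accumulated while running the Kalman filter."""
    x, V = np.asarray(x0, float), np.asarray(V0, float)
    ll = -0.5 * len(y) * np.log(2.0 * np.pi)
    for yn in y:
        x = F @ x                              # one-step-ahead prediction
        V = F @ V @ F.T + G @ Q @ G.T
        eps = yn - H @ x                       # innovation eps_n
        r = H @ V @ H + sigma_sq               # innovation variance r_n
        ll -= 0.5 * (np.log(r) + eps ** 2 / r)
        K = V @ H / r                          # Kalman gain
        x = x + K * eps
        V = V - np.outer(K, H @ V)
        V = 0.5 * (V + V.T)                    # keep V numerically symmetric
    return ll
```

Maximizing `kalman_loglik` over the AR coefficients and noise variances corresponds to the maximum likelihood fitting described in Section 2.2.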
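The self-organizing estimation of Section 2.3 can also be sketched on a toy version of the model: a signal y_n ~ N(0, σ_n²) whose log variance follows the random walk (2.10), tracked by a bootstrap Monte Carlo filter. This is our own minimal sketch, not the paper's implementation, and the particle count and random-walk scale are arbitrary choices:

```python
import numpy as np

def tv_variance_pf(y, n_particles=2000, xi=0.2, seed=1):
    """Bootstrap Monte Carlo filter for a toy self-organizing model:
    y_n ~ N(0, sigma_n^2), with log10 sigma_n^2 a random walk as in (2.10).
    Returns the filtered mean of log10 sigma_n^2 at each time step."""
    rng = np.random.default_rng(seed)
    theta = rng.normal(0.0, 1.0, n_particles)     # particles for log10 sigma^2
    means = np.empty(len(y))
    for n, yn in enumerate(y):
        theta = theta + rng.normal(0.0, xi, n_particles)  # system model (2.10)
        var = 10.0 ** theta
        logw = -0.5 * (np.log(var) + yn ** 2 / var)       # Gaussian log-likelihood
        w = np.exp(logw - logw.max())                     # underflow-safe weights
        w /= w.sum()
        means[n] = np.dot(w, theta)                       # filtered mean
        theta = theta[rng.choice(n_particles, n_particles, p=w)]  # resample
    return means
```

On data whose amplitude jumps, the filtered log variance rises accordingly, which is exactly the behavior exploited to locate the arriving waves.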
However, since the velocities of the waves in the ground are faster than that in the water (2-8 km/sec), a reflection (or refraction) wave comes first for epicentral distances larger than approximately 5 km. As an example, assume the following three-parallel-layer structure: the depth and the velocity of the water layer are h_0 km and v_0 km/sec, and the widths and the velocities of the three layers are h_1, h_2, h_3 km and v_1, v_2, v_3 km/sec, respectively.
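For such a layered model, the direct-wave and head-wave travel times can be computed directly. The sketch below (ours; the numerical layer parameters are the ones assumed later in the text for Figure 2) implements the standard formulas for the simplest wave paths:

```python
import numpy as np

# Three-layer model assumed in the text: water layer plus three crustal layers.
h = [2.1, 2.0, 3.0, 5.0]   # thicknesses (km)
v = [1.5, 2.5, 3.5, 7.0]   # P-wave velocities (km/sec)

def d01():
    # Horizontal offset of the critically refracted leg at the sea bottom.
    return v[0] * h[0] / np.sqrt(v[1] ** 2 - v[0] ** 2)

def t_wave0(D, k=1):
    """Arrival time of Wave(0^(2k-1)): the direct water wave (k = 1) or
    a water multiple with k - 1 extra surface/bottom bounces."""
    return np.sqrt(((2 * k - 1) * h[0]) ** 2 + D ** 2) / v[0]

def t_wave01(D, k=1):
    """Arrival time of Wave(0^(2k-1) 1): a head wave travelling along the
    sea bottom after 2k - 1 water legs (valid beyond the critical distance)."""
    return (2 * k - 1) * np.sqrt(h[0] ** 2 + d01() ** 2) / v[0] \
        + (D - (2 * k - 1) * d01()) / v[1]
```

At D = 10 km, for instance, the head wave along the sea bottom arrives well before the direct wave, while water multiples (k = 2) arrive later, which matches the ordering of arrivals discussed around Figure 2.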
Table 1. Wave types and arrival times

Wave(0^{2k-1}):    t = v_0^{-1} sqrt((2k-1)² h_0² + D²)
Wave(0^{2k-1} 1):  t = (2k-1) v_0^{-1} sqrt(h_0² + d_{01}²) + v_1^{-1} (D - (2k-1) d_{01})
Wave(0^{2k-1} 2):  t = (2k-1) v_0^{-1} sqrt(h_0² + d_{02}²) + 2 v_1^{-1} sqrt(h_1² + d_{12}²) + v_2^{-1} d̃_2
Wave(01232):       t = v_0^{-1} sqrt(h_0² + d_{03}²) + 2 v_1^{-1} sqrt(h_1² + d_{13}²) + 2 v_2^{-1} sqrt(h_2² + d_{23}²) + v_3^{-1} d̃_3

where d_{ij} = v_i h_i / sqrt(v_j² - v_i²), d̃_2 = D - (2k-1) d_{02} - 2 d_{12}, and d̃_3 = D - d_{03} - 2 d_{13} - 2 d_{23}.

Fig. 2. Left: Examples of wave types: Wave(0), Wave(000) and Wave(012). Right: Arrival times of various waves. Horizontal axis: epicentral distance D (km); vertical axis: reduced arrival time t - D/6 (sec). From bottom up on the vertical axis: Wave(0), (01), (012), (000), (0001), (01232), (00012), (00000), (0001232).

The wave path is identified by the notation Wave(i_1 ... i_k), with i_j = 0, 1, 2, 3, where Wave(0) denotes the direct wave (compression wave) that travels directly from the air-gun to the OBS, Wave(01) denotes the wave that travels along the surface of the bottom of the sea, and Wave(00012) denotes the wave that is reflected at the bottom and the surface of the sea and then travels along the interface between the first and the second layers (Figure 2). Table 1 shows the travel times of various waves. At each OBS these waves arrive successively (Telford et al. (1990)). Figure 2 shows the plot of the arrival times, t, versus the epicentral distances, D, for the various wave paths. The parameters of the 3-layer structure are assumed to be h_0 = 2.1 km, h_1 = 2 km, h_2 = 3 km, h_3 = 5 km, v_0 = 1.5 km/sec, v_1 = 2.5 km/sec, v_2 = 3.5 km/sec, v_3 = 7.0 km/sec. It can be seen that the order of the
arrival times changes in a complex way with the horizontal distance D, even for such a simple parallel-layer structure.

Table 2. Wave types (Wave(0), Wave(0^3), Wave(0^5), Wave(01), Wave(012), Wave(01232)) and delays of arrival times for various epicentral distances (km).

3.2 Difference of the Arrival Times and Spatial Smoothing

At each OBS, 982 time series were observed, with the location of the explosion shifted by 200 m. Therefore two consecutive series are cross-correlated, and by using this it is expected that we can detect information that was difficult to obtain from a single time series. Table 2 shows the difference of the arrival times between two consecutive time series, computed for each wave type and for several epicentral distances D. The differences for the waves that travel along an interface between two layers, such as Wave(01), Wave(012) and Wave(01232), are constants independent of the epicentral distance D. The delay time becomes smaller for a deeper layer or a faster wave. On the other hand, for the direct waves that pass through the water, such as Wave(0), Wave(000) and Wave(00000), the difference gradually increases with the epicentral distance D and converges to approximately 34 for distances D > 10 km. This indicates that for D > 10 km the arrival time is approximately a linear function of the distance D. Taking this fact into account, we consider the following model, in which the time series structure is ignored:

(3.1)  s_{n,j} = 2 s_{n-k,j-1} - s_{n-2k,j-2} + v_{n,j},    y_{n,j} = s_{n,j} + w_{n,j},

where k is the difference of the arrival times between channels j-1 and j. By defining the state vector x_{n,j} = [s_{n,j}, s_{n-k,j-1}]^T, we obtain the state space representation

x_{n,j} = F x_{n-k,j-1} + G v_{n,j},    y_{n,j} = H x_{n,j} + w_{n,j}.

Therefore, if the delay time k is given, we can easily obtain estimates of the "signal" by the Kalman filter and the smoother. If one value of k dominates in a region, it can be estimated by the maximum likelihood method.
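The second-difference extrapolation across channels in (3.1) reproduces the signal exactly whenever the wavefront shifts by exactly k samples per channel, as the following sketch (our own, in Python/NumPy) verifies on a synthetic delayed waveform:

```python
import numpy as np

def spatial_predict(S, n, j, k):
    """One-step-ahead spatial predictor of model (3.1): linear extrapolation
    across channels j-1 and j-2 with a per-channel delay of k samples."""
    return 2.0 * S[n - k, j - 1] - S[n - 2 * k, j - 2]

# A synthetic wavefront delayed by exactly k samples per channel: S[n, j] = f(n - k j).
k, N, J = 7, 200, 5
t = np.arange(N)
S = np.stack([np.sin(0.1 * (t - k * j)) for j in range(J)], axis=1)
err = abs(spatial_predict(S, 100, 4, k) - S[100, 4])
```

The prediction error is zero for any waveform of this form, because 2 f(n - kj) - f(n - kj) = f(n - kj); the system noise v_{n,j} in (3.1) absorbs departures from this idealized delay structure.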
However, in actual data, several different waves may appear at the same time in the same channel.
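A standard remedy, developed below as the mixture-lag model, is to keep several candidate lags k in parallel and to reweight each one by the Gaussian likelihood of its spatial prediction. A minimal sketch of one such reweighting step (ours, not the paper's code; the candidate predictions and noise variance are made-up illustrative values):

```python
import numpy as np

def update_lag_weights(alpha, y, s_hat, r):
    """Multiply each candidate-lag weight by the Gaussian likelihood
    exp(-(y - s_hat_k)^2 / 2r) of its prediction and renormalize."""
    w = alpha * np.exp(-0.5 * (y - s_hat) ** 2 / r)
    return w / w.sum()

# Three candidate lags predicting 0, 1 and 5; the observation 1.0
# concentrates the weight on the middle candidate.
alpha = update_lag_weights(np.full(3, 1.0 / 3.0), 1.0, np.array([0.0, 1.0, 5.0]), 1.0)
```

Iterating this step over time lets the weights track whichever delay currently explains the data best, while retaining mass on the alternatives.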
To cope with this situation, we consider a mixture-lag model defined by

(3.2)  s_{n,j} = Σ_{k=1}^{K} α_{n,j,k} ŝ_{n,j,k},    y_{n,j} = s_{n,j} + w_{n,j},

where ŝ_{n,j,k} is the one-step-ahead predictor of s_{n,j} defined by ŝ_{n,j,k} = 2 s_{n-k,j-1} - s_{n-2k,j-2}, and α_{n,j,k} is the mixture weight at time n and channel j. In the recursive filtering, this mixture weight can be updated by

(3.3)  α_{n,j,k} ∝ α_{n-k,j-1,k} exp{ -(y_{n,j} - ŝ_{n,j,k})² / 2 r_{n|n-1} }.

4. Space-Time Filtering/Smoothing

In general, the sequential computational method for filtering and smoothing cannot be extended to space-time filtering/smoothing problems. However, for our special situation, in which the signal propagates in one direction, a reasonable approximate algorithm can be developed. We consider a multivariate version of the decomposition model in (2.1),

(4.1)  y_{n,j} = r_{n,j} + s_{n,j} + ε_{n,j},

where r_{n,j}, s_{n,j} and ε_{n,j} denote the direct wave, the reflection wave and the observation noise component of channel j. The direct wave and the reflection wave components are assumed to follow the AR models

(4.2)  r_{n,j} = Σ_{i=1}^{m} a_{i,j} r_{n-i,j} + v^r_{n,j},    s_{n,j} = Σ_{i=1}^{ℓ} b_{i,j} s_{n-i,j} + v^s_{n,j}.

Considering the delay structure, we also use the model

(4.3)  r_{n,j} = r_{n-k_r,j-1} + u^r_{n,j},    s_{n,j} = s_{n-k_s,j-1} + u^s_{n,j}.

An approximate estimation algorithm can be developed by combining the filtering and smoothing algorithms in time and in space (channel). However, as mentioned in Section 3, we need to consider mixtures of various wave types (that is, various time-lag components).

5. Summary

Several methods for the analysis of OBS data for exploring underground structure have been shown. In the time series decomposition, a smoothness prior approach is taken. In the spatial and spatial-temporal filtering/smoothing, the delay structure of the various types of waves is considered. In the presentation, we shall show numerical results for actual OBS data.
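To make the structure of the space-time model concrete, the temporal AR prediction (4.2) and the spatial delay prediction (4.3) can be merged into a single predictor. The convex combination below is our own simplistic stand-in for the approximate algorithm sketched in Section 4, and the weight w is a hypothetical tuning parameter, not something specified in the paper:

```python
import numpy as np

def space_time_predict(R, a, n, j, k_r, w=0.5):
    """Combine the temporal AR predictor of (4.2) with the spatial delay
    predictor of (4.3) for the direct-wave component r_{n,j}.
    R is an (n_time, n_channel) array of the component values."""
    time_pred = sum(a[i] * R[n - 1 - i, j] for i in range(len(a)))  # eq (4.2)
    space_pred = R[n - k_r, j - 1]                                  # eq (4.3)
    return w * time_pred + (1.0 - w) * space_pred
```

A full treatment would embed both predictors in one state space model and let the filter weigh them through their noise variances; the convex combination simply makes the two information sources explicit.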
References

Akaike, H. and Kitagawa, G.: The Practice of Time Series Analysis, Springer-Verlag, New York (1999).
Anderson, B.D.O. and Moore, J.B.: Optimal Filtering, Prentice-Hall, New Jersey (1979).
Berg, E., Amundsen, L., Morton, A., Mjelde, R., Shimamura, H., Shiobara, H., Kanazawa, T., Kodaira, S. and Fjekkanger, J.P.: Three-component OBS-data processing for lithology and fluid prediction in the mid-Norway margin, NE Atlantic, Earth, Planets and Space, Vol. 53, No. 2, 75-89 (2001).
Harrison, P.J. and Stevens, C.F.: Bayesian forecasting (with discussion), J. R. Statist. Soc. B, 38, 205-247 (1976).
Higuchi, T.: A method to separate the spin synchronized signals using a Bayesian approach (in Japanese with English abstract), Proceedings of the Institute of Statistical Mathematics, 41, 15-30 (1993).
Kalman, R.E.: A new approach to linear filtering and prediction problems, Trans. Amer. Soc. Mech. Eng., J. Basic Engineering, 82, 35-45 (1960).
Kashiwagi, N.: On the use of the Kalman filter for spatial smoothing, Annals of the Institute of Statistical Mathematics, 45, 21-34 (1993).
Kitagawa, G.: Monte Carlo filter and smoother for non-Gaussian nonlinear state space models, Journal of Computational and Graphical Statistics, 5, 1-25 (1996).
Kitagawa, G.: Self-organizing state space model, Journal of the American Statistical Association, 93, 1203-1215 (1998).
Kitagawa, G. and Gersch, W.: Smoothness Priors Analysis of Time Series, Lecture Notes in Statistics, No. 116, Springer-Verlag, New York (1996).
Shimamura, H.: OBS technical description, Cruise Report, Inst. of Solid Earth Physics Report, Univ. of Bergen, ed. Sellevoll, M.A., 72 (1988).
Telford, W.M., Geldart, L.P. and Sheriff, R.E.: Applied Geophysics, Second Edition, Cambridge University Press, Cambridge (1990).
Whittaker, E.T.: On a new method of graduation, Proc. Edinburgh Math. Assoc., 78, 81-89 (1923).
Title: Modern Signal Extraction Methods in Computatio
Author(s): TAKANAMI, Tetsuo; KITAGAWA, Genshiro
Citation: Journal of the Faculty of Science, Hokkaido Un
Issue Date: 2000-03-24
Doc URL: http://hdl.handle.net/2115/8858
Radial Basis Functions: a Bayesian treatment David Barber Bernhard Schottky Neural Computing Research Group Department of Applied Mathematics and Computer Science Aston University, Birmingham B4 7ET, U.K.
More informationAnalysis of the 29th May 2008 Ölfus earthquake and aftershock sequence using three-component t processing on ICEARRAY
Analysis of the 29th May 2008 Ölfus earthquake and aftershock sequence using three-component t processing on ICEARRAY Benedikt Halldórsson Steven J. Gibbons International Symposium on Strong-motion Earthquake
More informationVolume 30, Issue 3. A note on Kalman filter approach to solution of rational expectations models
Volume 30, Issue 3 A note on Kalman filter approach to solution of rational expectations models Marco Maria Sorge BGSE, University of Bonn Abstract In this note, a class of nonlinear dynamic models under
More informationCrib Sheet : Linear Kalman Smoothing
Crib Sheet : Linear Kalman Smoothing Gabriel A. Terejanu Department of Computer Science and Engineering University at Buffalo, Buffalo, NY 14260 terejanu@buffalo.edu 1 Introduction Smoothing can be separated
More informationComparison of DDE and ETDGE for. Time-Varying Delay Estimation. H. C. So. Department of Electronic Engineering, City University of Hong Kong
Comparison of DDE and ETDGE for Time-Varying Delay Estimation H. C. So Department of Electronic Engineering, City University of Hong Kong Tat Chee Avenue, Kowloon, Hong Kong Email : hcso@ee.cityu.edu.hk
More informationTREND ESTIMATION AND THE HODRICK-PRESCOTT FILTER
J. Japan Statist. Soc. Vol. 38 No. 1 2008 41 49 TREND ESTIMATION AND THE HODRICK-PRESCOTT FILTER Andrew Harvey* and Thomas Trimbur** The article analyses the relationship between unobserved component trend-cycle
More informationVelocity-Interface Structure of the Southwestern Ryukyu Subduction Zone from EW OBS/MCS Data
Marine Geophysical Researches 22: 265-287, 2001. 2002 Kluwer Academic Publishers. Printed in the Netherlands. Velocity-Interface Structure of the Southwestern Ryukyu Subduction Zone from EW9509-1 OBS/MCS
More informationThe Practice of Time Series Analysis
Hirotugu Akaike Genshiro Kitagawa Editors The Practice of Time Series Analysis Springer Contents Preface to the English Version Preface The Structure of This Book and General References v vii ix 1 Control
More informationL09. PARTICLE FILTERING. NA568 Mobile Robotics: Methods & Algorithms
L09. PARTICLE FILTERING NA568 Mobile Robotics: Methods & Algorithms Particle Filters Different approach to state estimation Instead of parametric description of state (and uncertainty), use a set of state
More informationUltra sonic? Heat and Temp.? Force and Disp.? Where is the best location? Void
Inverse Problems in Engineering: Theory and Practice 3rd Int. Conference on Inverse Problems in Engineering June 3-8, 999, Port Ludlow, WA, USA ME0 OPTIMIZATION OF MEASUREMENTS FOR INVERSE PROBLEM Kenji
More informationX t = a t + r t, (7.1)
Chapter 7 State Space Models 71 Introduction State Space models, developed over the past 10 20 years, are alternative models for time series They include both the ARIMA models of Chapters 3 6 and the Classical
More informationBayesian Dynamic Linear Modelling for. Complex Computer Models
Bayesian Dynamic Linear Modelling for Complex Computer Models Fei Liu, Liang Zhang, Mike West Abstract Computer models may have functional outputs. With no loss of generality, we assume that a single computer
More informationComments on \Wavelets in Statistics: A Review" by. A. Antoniadis. Jianqing Fan. University of North Carolina, Chapel Hill
Comments on \Wavelets in Statistics: A Review" by A. Antoniadis Jianqing Fan University of North Carolina, Chapel Hill and University of California, Los Angeles I would like to congratulate Professor Antoniadis
More informationSequential Monte Carlo Methods for Bayesian Computation
Sequential Monte Carlo Methods for Bayesian Computation A. Doucet Kyoto Sept. 2012 A. Doucet (MLSS Sept. 2012) Sept. 2012 1 / 136 Motivating Example 1: Generic Bayesian Model Let X be a vector parameter
More informationInferring biological dynamics Iterated filtering (IF)
Inferring biological dynamics 101 3. Iterated filtering (IF) IF originated in 2006 [6]. For plug-and-play likelihood-based inference on POMP models, there are not many alternatives. Directly estimating
More informationJ. Cwik and J. Koronacki. Institute of Computer Science, Polish Academy of Sciences. to appear in. Computational Statistics and Data Analysis
A Combined Adaptive-Mixtures/Plug-In Estimator of Multivariate Probability Densities 1 J. Cwik and J. Koronacki Institute of Computer Science, Polish Academy of Sciences Ordona 21, 01-237 Warsaw, Poland
More informationFundamentals of Data Assimilation
National Center for Atmospheric Research, Boulder, CO USA GSI Data Assimilation Tutorial - June 28-30, 2010 Acknowledgments and References WRFDA Overview (WRF Tutorial Lectures, H. Huang and D. Barker)
More informationis used in the empirical trials and then discusses the results. In the nal section we draw together the main conclusions of this study and suggest fut
Estimating Conditional Volatility with Neural Networks Ian T Nabney H W Cheng y 1 Introduction It is well known that one of the obstacles to eective forecasting of exchange rates is heteroscedasticity
More informationContinuous-time Gaussian Autoregression Peter Brockwell, Richard Davis and Yu Yang, Statistics Department, Colorado State University, Fort Collins, CO
Continuous-time Gaussian Autoregression Peter Brockwell, Richard Davis and Yu Yang, Statistics Department, Colorado State University, Fort Collins, CO 853-877 Abstract The problem of tting continuous-time
More informationA Subspace Approach to Estimation of. Measurements 1. Carlos E. Davila. Electrical Engineering Department, Southern Methodist University
EDICS category SP 1 A Subspace Approach to Estimation of Autoregressive Parameters From Noisy Measurements 1 Carlos E Davila Electrical Engineering Department, Southern Methodist University Dallas, Texas
More informationExpectation propagation for signal detection in flat-fading channels
Expectation propagation for signal detection in flat-fading channels Yuan Qi MIT Media Lab Cambridge, MA, 02139 USA yuanqi@media.mit.edu Thomas Minka CMU Statistics Department Pittsburgh, PA 15213 USA
More informationLecture 3. G. Cowan. Lecture 3 page 1. Lectures on Statistical Data Analysis
Lecture 3 1 Probability (90 min.) Definition, Bayes theorem, probability densities and their properties, catalogue of pdfs, Monte Carlo 2 Statistical tests (90 min.) general concepts, test statistics,
More informationMonte Carlo Methods for Statistical Inference: Variance Reduction Techniques
Monte Carlo Methods for Statistical Inference: Variance Reduction Techniques Hung Chen hchen@math.ntu.edu.tw Department of Mathematics National Taiwan University 3rd March 2004 Meet at NS 104 On Wednesday
More informationLinear Dynamical Systems
Linear Dynamical Systems Sargur N. srihari@cedar.buffalo.edu Machine Learning Course: http://www.cedar.buffalo.edu/~srihari/cse574/index.html Two Models Described by Same Graph Latent variables Observations
More informationDesign of FIR Smoother Using Covariance Information for Estimating Signal at Start Time in Linear Continuous Systems
Systems Science and Applied Mathematics Vol. 1 No. 3 2016 pp. 29-37 http://www.aiscience.org/journal/ssam Design of FIR Smoother Using Covariance Information for Estimating Signal at Start Time in Linear
More informationKalman Filter. Predict: Update: x k k 1 = F k x k 1 k 1 + B k u k P k k 1 = F k P k 1 k 1 F T k + Q
Kalman Filter Kalman Filter Predict: x k k 1 = F k x k 1 k 1 + B k u k P k k 1 = F k P k 1 k 1 F T k + Q Update: K = P k k 1 Hk T (H k P k k 1 Hk T + R) 1 x k k = x k k 1 + K(z k H k x k k 1 ) P k k =(I
More information1. INTRODUCTION State space models may be formulated in avariety of ways. In this paper we consider rst the linear Gaussian form y t = Z t t + " t " t
A simple and ecient simulation smoother for state space time series analysis BY J. DURBIN Department of Statistics, London School of Economics and Political Science, London WCA AE, UK. durbinja@aol.com
More informationComputer Intensive Methods in Mathematical Statistics
Computer Intensive Methods in Mathematical Statistics Department of mathematics johawes@kth.se Lecture 16 Advanced topics in computational statistics 18 May 2017 Computer Intensive Methods (1) Plan of
More informationA Hybrid Method of Forecasting in the Case of the Average Daily Number of Patients
Journal of Computations & Modelling, vol.4, no.3, 04, 43-64 ISSN: 79-765 (print), 79-8850 (online) Scienpress Ltd, 04 A Hybrid Method of Forecasting in the Case of the Average Daily Number of Patients
More informationGaussian process for nonstationary time series prediction
Computational Statistics & Data Analysis 47 (2004) 705 712 www.elsevier.com/locate/csda Gaussian process for nonstationary time series prediction Soane Brahim-Belhouari, Amine Bermak EEE Department, Hong
More informationExpectation Propagation in Dynamical Systems
Expectation Propagation in Dynamical Systems Marc Peter Deisenroth Joint Work with Shakir Mohamed (UBC) August 10, 2012 Marc Deisenroth (TU Darmstadt) EP in Dynamical Systems 1 Motivation Figure : Complex
More informationPERIODIC KALMAN FILTER: STEADY STATE FROM THE BEGINNING
Journal of Mathematical Sciences: Advances and Applications Volume 1, Number 3, 2008, Pages 505-520 PERIODIC KALMAN FILER: SEADY SAE FROM HE BEGINNING MARIA ADAM 1 and NICHOLAS ASSIMAKIS 2 1 Department
More informationSquare-Root Algorithms of Recursive Least-Squares Wiener Estimators in Linear Discrete-Time Stochastic Systems
Proceedings of the 17th World Congress The International Federation of Automatic Control Square-Root Algorithms of Recursive Least-Squares Wiener Estimators in Linear Discrete-Time Stochastic Systems Seiichi
More informationLagrangian Data Assimilation and Manifold Detection for a Point-Vortex Model. David Darmon, AMSC Kayo Ide, AOSC, IPST, CSCAMM, ESSIC
Lagrangian Data Assimilation and Manifold Detection for a Point-Vortex Model David Darmon, AMSC Kayo Ide, AOSC, IPST, CSCAMM, ESSIC Background Data Assimilation Iterative process Forecast Analysis Background
More informationSTA 4273H: Statistical Machine Learning
STA 4273H: Statistical Machine Learning Russ Salakhutdinov Department of Statistics! rsalakhu@utstat.toronto.edu! http://www.utstat.utoronto.ca/~rsalakhu/ Sidney Smith Hall, Room 6002 Lecture 11 Project
More information