Posterior Covariance vs. Analysis Error Covariance in Data Assimilation
1 Posterior Covariance vs. Analysis Error Covariance in Data Assimilation

F.-X. Le Dimet (1), I. Gejadze (2), V. Shutyaev (3)
(1) Université de Grenoble, France
(2) University of Strathclyde, Glasgow, UK
(3) Institute of Numerical Mathematics, RAS, Moscow, Russia

September 27, 2013

F.-X. Le Dimet (INRIA), Posterior covariance, September 27, 2013
2 Overview

- Introduction
- Analysis Error Covariance via Hessian
- Posterior Covariance: a Bayesian Approach
- Effective Covariance Estimates
- Implementation: Some Remarks
- Asymptotic Properties
- Numerical Example
- Conclusion
3 Introduction 1

There are two basic approaches to Data Assimilation:
- variational methods,
- the Kalman filter.

Both lead to the minimization of a cost function:

J(u) = (1/2)(V_b^{-1}(u − u_b), u − u_b)_X + (1/2)(V_o^{-1}(Cφ − y), Cφ − y)_{Y_o},   (1)

where u_b ∈ X is a prior initial-value function (background state), y ∈ Y_o is a prescribed function (observational data), Y_o is an observation space, and C : Y → Y_o is a linear bounded operator. Both approaches give the same optimal solution ū.
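The two terms of the cost function (1) are easy to sketch in a discretized setting. The snippet below is an illustrative toy, assuming a linear model map and placeholder choices for V_b, V_o, C, u_b and y (none of these come from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 3                          # state and observation dimensions (assumed)
Vb = np.eye(n)                       # background error covariance (placeholder)
Vo = 0.5 * np.eye(m)                 # observation error covariance (placeholder)
C = rng.standard_normal((m, n))      # linear bounded observation operator (placeholder)
ub = np.zeros(n)                     # background state u_b
y = rng.standard_normal(m)           # observational data

def J(u, phi=lambda u: u):
    """Discrete analogue of (1); phi maps the initial state to the model state."""
    db = u - ub                      # background departure
    do = C @ phi(u) - y              # observation departure
    return 0.5 * db @ np.linalg.solve(Vb, db) + 0.5 * do @ np.linalg.solve(Vo, do)

print(J(ub))                         # cost of the background state itself
```

At u = u_b the background term vanishes and only the observation misfit contributes, which is a quick sanity check on any implementation.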
4 Introduction 2

ū is the solution of the optimality system:

∂φ/∂t = F(φ) + f, t ∈ (0, T);  φ|_{t=0} = u,   (2)

∂φ*/∂t + (F′(φ))*φ* = C*V_o^{-1}(Cφ − y), t ∈ (0, T);  φ*|_{t=T} = 0,   (3)

V_b^{-1}(u − u_b) − φ*|_{t=0} = 0.   (4)
5 Introduction 3

An analysis has two inputs:
- the background u_b,
- the observations y.

Both carry errors, and the question is: what is the impact of these errors on the analysis?

The background can be considered from two different viewpoints:
- Variational viewpoint: the background is a regularization term in Tikhonov's sense, introduced to make the problem well posed.
- Bayesian viewpoint: the background is a priori information on the analysis.
6 Introduction 4

In the linear case both approaches give the same error covariance for the analysis: the inverse of the Hessian of the cost function. In the nonlinear case we get two different objects:
- variational approach: the analysis error covariance;
- Bayesian approach: the posterior covariance.

Questions: How can these objects be computed or approximated? What are the differences between them?
7 Analysis Error Covariance 1: True Solution and Errors

We assume the existence of a true solution u^t and an associated true state φ^t satisfying

∂φ^t/∂t = F(φ^t) + f, t ∈ (0, T);  φ^t|_{t=0} = u^t.   (5)

Then the errors are defined by u_b = u^t + ξ_b, y = Cφ^t + ξ_o, with covariances V_b and V_o.
8 Analysis Error Covariance 2: Discrepancy Evolution

Let δφ = φ − φ^t, δu = u − u^t. Then for regular F there exists φ̃ = φ^t + τ(φ − φ^t), τ ∈ [0, 1], such that

∂δφ/∂t − F′(φ̃)δφ = 0, t ∈ (0, T);  δφ|_{t=0} = δu,   (6)

∂φ*/∂t + (F′(φ))*φ* = C*V_o^{-1}(Cδφ − ξ_o);  φ*|_{t=T} = 0,   (7)

V_b^{-1}(δu − ξ_b) − φ*|_{t=0} = 0.   (8)
9 Analysis Error Covariance 3: Exact Equation for the Analysis Error

Let us introduce the operator R(φ) : X → Y as follows:

R(φ)v = ψ, v ∈ X,   (9)

where ψ is the solution of the tangent linear problem

∂ψ/∂t − F′(φ)ψ = 0;  ψ|_{t=0} = v.   (10)

Then the system for the errors can be represented as a single operator equation for δu:

H(φ, φ̃)δu = V_b^{-1}ξ_b + R*(φ)C*V_o^{-1}ξ_o,   (11)

where

H(φ, φ̃) = V_b^{-1} + R*(φ)C*V_o^{-1}CR(φ̃).   (12)
10 Analysis Error Covariance 4: The Operator H

The operator H(φ, φ̃) : X → X can be defined by

∂ψ/∂t − F′(φ̃)ψ = 0;  ψ|_{t=0} = v,   (13)

∂ψ*/∂t + (F′(φ))*ψ* = C*V_o^{-1}Cψ;  ψ*|_{t=T} = 0,   (14)

H(φ, φ̃)v = V_b^{-1}v − ψ*|_{t=0}.   (15)

The operator H(φ, φ̃) is in general neither symmetric nor positive definite. For φ = φ̃ = θ it becomes the Hessian H(θ) of the cost function J₁ in the following auxiliary DA problem: find δu and δφ such that J₁(δu) = inf_v J₁(v), where

J₁(δu) = (1/2)(V_b^{-1}(δu − ξ_b), δu − ξ_b)_X + (1/2)(V_o^{-1}(Cδφ − ξ_o), Cδφ − ξ_o)_{Y_o},   (16)

and δφ satisfies the problem

∂δφ/∂t − F′(θ)δφ = 0;  δφ|_{t=0} = δu.   (17)
11 Analysis Error Covariance 5: Analysis Error Covariance via the Hessian

The optimal solution (analysis) error δu is assumed to be unbiased, i.e. E[δu] = 0, and

V_{δu} = E[(·, δu)_X δu] = E[(·, u − u^t)_X (u − u^t)].   (18)

The best values of φ and φ̃ independent of ξ_o, ξ_b are apparently φ^t; with the transitions R(φ̃) → R(φ^t), R*(φ) → R*(φ^t), the error equation reduces to

H(φ^t)δu = V_b^{-1}ξ_b + R*(φ^t)C*V_o^{-1}ξ_o,   (19)

where

H(·) = V_b^{-1} + R*(·)C*V_o^{-1}CR(·).   (20)

We express δu from equation (19),

δu = H^{-1}(φ^t)(V_b^{-1}ξ_b + R*(φ^t)C*V_o^{-1}ξ_o),

and obtain for the analysis error covariance

V_{δu} = H^{-1}(φ^t)(V_b^{-1} + R*(φ^t)C*V_o^{-1}CR(φ^t))H^{-1}(φ^t) = H^{-1}(φ^t).   (21)
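In the matrix setting, identity (21) can be verified directly: the middle factor V_b^{-1} + R*C*V_o^{-1}CR is exactly the Hessian (20), so the product collapses to H^{-1}. A minimal numerical check, with arbitrary assumed SPD covariances and a random matrix standing in for the composition C R(φ^t):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 5, 3
A = rng.standard_normal((n, n)); Vb = A @ A.T + n * np.eye(n)   # SPD background covariance (assumed)
B = rng.standard_normal((m, m)); Vo = B @ B.T + m * np.eye(m)   # SPD observation covariance (assumed)
CR = rng.standard_normal((m, n))     # stand-in for the combined operator C R(phi^t)

H = np.linalg.inv(Vb) + CR.T @ np.linalg.inv(Vo) @ CR           # Hessian, eq. (20)
Hinv = np.linalg.inv(H)

# Covariance of delta_u = H^{-1}(Vb^{-1} xi_b + (CR)^T Vo^{-1} xi_o), eq. (21):
middle = np.linalg.inv(Vb) @ Vb @ np.linalg.inv(Vb) \
       + CR.T @ np.linalg.inv(Vo) @ Vo @ np.linalg.inv(Vo) @ CR
V_du = Hinv @ middle @ Hinv

print(np.allclose(V_du, Hinv))       # True: the analysis error covariance equals H^{-1}
```

The `middle` factor is written out term by term so that the propagation of the two error sources through H^{-1} is visible before it simplifies.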
12 Analysis Error Covariance 6: Approximations

In practice the true field φ^t is not known, so we have to use an approximation φ̄ associated with the optimal solution ū defined by the real data (ū_b, ȳ), i.e. we use

V_{δu} = H^{-1}(φ̄).   (22)

In (Rabier and Courtier, 1992) the error equation is derived in the form

(V_b^{-1} + R*(φ)C*V_o^{-1}CR(φ))δu = V_b^{-1}ξ_b + R*(φ)C*V_o^{-1}ξ_o.   (23)

The error due to the transitions R(φ̃) → R(φ^t) and R*(φ) → R*(φ^t) we call the linearization error. The use of φ̄ instead of φ^t in the Hessian computations leads to another error, which shall be called the origin error.
13 Posterior Covariance: Bayesian Approach 1

Given u_b ~ N(ū_b, V_b) and y ~ N(ȳ, V_o), the following expression for the posterior distribution of u is derived from Bayes' theorem:

p(u|ȳ) = C exp(−(1/2)(V_b^{-1}(u − ū_b), u − ū_b)_X) exp(−(1/2)(V_o^{-1}(Cφ − ȳ), Cφ − ȳ)_{Y_o}).   (24)

The solution of the variational DA problem with the data y = ȳ and u_b = ū_b is equal to the mode of p(u|ȳ) (see e.g. Lorenc, 1986; Tarantola, 1987). Accordingly, the Bayesian posterior covariance is defined by

V_{δu} = E[(·, u − E[u])_X (u − E[u])],   (25)

with u ~ p(u|ȳ).
14 Posterior Covariance: Bayesian Approach 2

In order to compute V_{δu} by the Monte Carlo method, one must generate a sample of pseudo-random realizations u_i from p(u|ȳ). We will consider the u_i to be the solutions of the DA problem with the perturbed data u_b = ū_b + ξ_b and y = ȳ + ξ_o, where ξ_b ~ N(0, V_b), ξ_o ~ N(0, V_o). Further, we assume that E[u] = ū, where ū is the solution of the unperturbed problem, in which case V_{δu} can be approximated as follows:

V_{δu} = E[(·, u − ū)_X (u − ū)] = E[(·, δu)_X δu].   (26)
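For a linear model each perturbed-data minimization has the closed form δu = H^{-1}(V_b^{-1}ξ_b + C*V_o^{-1}ξ_o), so the Monte Carlo procedure can be sketched end to end. The dimensions and covariances below are assumptions for illustration; in the nonlinear setting each sample would instead require a full variational minimization:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2
Vb = np.array([[1.0, 0.3], [0.3, 1.0]])   # background covariance (assumed)
Vo = 0.25 * np.eye(n)                      # observation covariance (assumed)
C = np.eye(n)                              # identity observation operator (assumed)

H = np.linalg.inv(Vb) + C.T @ np.linalg.inv(Vo) @ C   # Hessian of the linear problem
Hinv = np.linalg.inv(H)

N = 20000                                  # sample size
xi_b = rng.multivariate_normal(np.zeros(n), Vb, size=N)   # background perturbations
xi_o = rng.multivariate_normal(np.zeros(n), Vo, size=N)   # observation perturbations

# Each perturbed analysis error, solved in closed form (linear case):
du = np.linalg.inv(Vb) @ xi_b.T + C.T @ np.linalg.inv(Vo) @ xi_o.T
du = np.linalg.solve(H, du).T              # shape (N, n)

V_mc = du.T @ du / N                       # sample covariance, estimate of (26)
err = np.linalg.norm(V_mc - Hinv) / np.linalg.norm(Hinv)
print(err)                                 # small relative deviation from H^{-1}
```

In this linear toy the sample covariance converges to H^{-1}, consistent with the statement that the posterior and analysis error covariances coincide in the linear case.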
15 Posterior Covariance: O.S. for the Errors

Unperturbed optimality system with u_b = ū_b, y = ȳ:

∂φ̄/∂t = F(φ̄) + f;  φ̄|_{t=0} = ū,   (27)

∂φ̄*/∂t + (F′(φ̄))*φ̄* = C*V_o^{-1}(Cφ̄ − ȳ);  φ̄*|_{t=T} = 0,   (28)

V_b^{-1}(ū − ū_b) − φ̄*|_{t=0} = 0.   (29)

With perturbations u_b = ū_b + ξ_b, y = ȳ + ξ_o, where ξ_b ∈ X, ξ_o ∈ Y_o, let δu = u − ū, δφ = φ − φ̄, δφ* = φ* − φ̄*. Then

∂δφ/∂t = F(φ) − F(φ̄);  δφ|_{t=0} = δu,   (30)

∂δφ*/∂t + (F′(φ))*δφ* = [(F′(φ̄) − F′(φ))]*φ̄* + C*V_o^{-1}(Cδφ − ξ_o),   (31)

V_b^{-1}(δu − ξ_b) − δφ*|_{t=0} = 0.   (32)
16 Posterior Covariance: Exact Error Equations

Introducing φ̃₁ = φ̄ + τ₁δφ and φ̃₂ = φ̄ + τ₂δφ, τ₁, τ₂ ∈ [0, 1], we derive the exact system for the errors:

∂δφ/∂t = F′(φ̃₁)δφ;  δφ|_{t=0} = δu,   (33)

∂δφ*/∂t + (F′(φ))*δφ* = −[(F″(φ̃₂))φ̄*]*δφ + C*V_o^{-1}(Cδφ − ξ_o),   (34)

V_b^{-1}(δu − ξ_b) − δφ*|_{t=0} = 0,   (35)

equivalent to a single operator equation for δu:

ℋ(φ, φ̃₁, φ̃₂)δu = V_b^{-1}ξ_b + R*(φ)C*V_o^{-1}ξ_o,   (36)

where

ℋ(φ, φ̃₁, φ̃₂) = V_b^{-1} + R*(φ)(C*V_o^{-1}C − [(F″(φ̃₂))φ̄*]*)R(φ̃₁).   (37)
17 Posterior Covariance: The Operator ℋ

ℋ(φ, φ̃₁, φ̃₂) : X → X is defined by solving

∂ψ/∂t = F′(φ̃₁)ψ;  ψ|_{t=0} = v,   (38)

∂ψ*/∂t + (F′(φ))*ψ* = −[(F″(φ̃₂))φ̄*]*ψ + C*V_o^{-1}Cψ;  ψ*|_{t=T} = 0,   (39)

ℋ(φ, φ̃₁, φ̃₂)v = V_b^{-1}v − ψ*|_{t=0}.   (40)

If φ = φ̃₁ = φ̃₂, then ℋ(φ) becomes the Hessian of the cost function of the original DA problem; it is symmetric, and positive definite if ū is a minimum of J(u). This system is often referred to as the second-order adjoint (Le Dimet et al., 2002). As above, we assume that E[δu] = 0, and we consider the following approximations:

R(φ̃₁) ≈ R(φ̄),  R*(φ) ≈ R*(φ̄),  [(F″(φ̃₂))φ̄*]* ≈ [(F″(φ̄))φ̄*]*.   (41)
18 Posterior Covariance via Hessian

The exact error equation (36) is approximated as follows:

ℋ(φ̄)δu = V_b^{-1}ξ_b + R*(φ̄)C*V_o^{-1}ξ_o,   (42)

where

ℋ(·) = V_b^{-1} + R*(·)(C*V_o^{-1}C − [(F″(·))φ̄*]*)R(·).   (43)

Now we express δu,

δu = ℋ^{-1}(φ̄)(V_b^{-1}ξ_b + R*(φ̄)C*V_o^{-1}ξ_o),

and obtain an approximate expression for the posterior error covariance:

V₁ = ℋ^{-1}(φ̄)(V_b^{-1} + R*(φ̄)C*V_o^{-1}CR(φ̄))ℋ^{-1}(φ̄) = ℋ^{-1}(φ̄)H(φ̄)ℋ^{-1}(φ̄),   (44)

where H(φ̄) is the Hessian of the cost function J₁ computed at θ = φ̄. Other approximations of the posterior covariance:

V₂ = ℋ^{-1}(φ̄),  V₃ = H^{-1}(φ̄).   (45)
19 Posterior Covariance: Effective Estimates

To suppress the linearization errors, the effective inverse Hessian may be used for estimating the analysis error covariance (see Gejadze et al., 2011):

V_{δu} = E[H^{-1}(φ)].   (46)

The same can be done for the posterior error covariance V_{δu}:

V_{δu} = E[ℋ^{-1}(φ, φ̃₁, φ̃₂) H(φ) ℋ^{-1}(φ, φ̃₁, φ̃₂)].

First, we substitute the possibly asymmetric and indefinite operator ℋ(φ, φ̃₁, φ̃₂) by ℋ(φ):

V_{δu} ≈ V₁ᵉ = E[ℋ^{-1}(φ) H(φ) ℋ^{-1}(φ)].   (47)

Next, by assuming H(φ)ℋ^{-1}(φ) ≈ I, we get

V_{δu} ≈ V₂ᵉ = E[ℋ^{-1}(φ)].   (48)

Finally, by assuming ℋ^{-1}(φ) ≈ H^{-1}(φ), we obtain yet another approximation:

V_{δu} ≈ V₃ᵉ = E[H^{-1}(φ)].   (49)
20 Numerical Example

φ(x, t) is governed by the 1D Burgers equation with a nonlinear viscous term:

∂φ/∂t + (1/2)∂(φ²)/∂x = ∂/∂x(ν(φ) ∂φ/∂x),   (50)

φ = φ(x, t), t ∈ (0, T), x ∈ (0, 1), with the Neumann boundary conditions

∂φ/∂x|_{x=0} = ∂φ/∂x|_{x=1} = 0,   (51)

and the viscosity coefficient

ν(φ) = ν₀ + ν₁(∂φ/∂x)²,  ν₀, ν₁ = const > 0.   (52)

Two initial conditions u^t = φ^t(x, 0) are considered (case A and case B).
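A minimal explicit finite-difference integration of (50)-(52) can be sketched as follows. The grid, time step, the values of ν₀ and ν₁, and the initial condition are all assumptions for illustration (the actual cases A and B are not specified in the transcription):

```python
import numpy as np

def step(phi, dx, dt, nu0, nu1):
    """One explicit Euler step of phi_t + (phi^2/2)_x = (nu(phi) phi_x)_x."""
    g = np.gradient(phi, dx)                 # phi_x
    g[0] = g[-1] = 0.0                       # Neumann BC (51): phi_x = 0 at x = 0, 1
    nu = nu0 + nu1 * g**2                    # nonlinear viscosity (52)
    adv = np.gradient(0.5 * phi**2, dx)      # (phi^2/2)_x
    diff = np.gradient(nu * g, dx)           # (nu phi_x)_x
    new = phi + dt * (diff - adv)
    new[0], new[-1] = new[1], new[-2]        # re-enforce zero slope at the ends
    return new

nx = 101
x = np.linspace(0.0, 1.0, nx)
dx, dt = x[1] - x[0], 1e-3                   # dt kept below dx^2 / (2 nu_max)
nu0, nu1 = 1e-2, 1e-3                        # assumed constants
phi = np.cos(np.pi * x)                      # assumed initial condition u
for _ in range(100):                         # integrate to T = 0.1
    phi = step(phi, dx, dt, nu0, nu1)
print(float(phi.max()), float(phi.min()))
```

Note how the gradient-dependent viscosity increases exactly where the solution steepens, which is what makes this model a convenient nonlinear test bed for comparing covariance estimates.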
21 Numerical Example: Case A

(Figure: evolution of the field φ for case A.)
22 Numerical Example: Case B

(Figure: evolution of the field φ for case B.)
23 Numerical Example: Squared Riemann Distance

μ(A, B) = (Σ_{i=1}^{M} ln² γᵢ)^{1/2}.

(Table: μ²(V₃, V̂), μ²(V₂, V̂), μ²(V₁, V̂), μ²(V₃ᵉ, V̂), μ²(V₂ᵉ, V̂), μ²(V₁ᵉ, V̂) for the case A and case B variants.)
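Taking the γᵢ to be the generalized eigenvalues of the pencil (A, B), which is the standard affine-invariant (Riemann) metric on symmetric positive definite matrices, the distance can be computed as below; the example matrices are assumptions:

```python
import numpy as np

def riemann_dist(A, B):
    """mu(A, B) = (sum_i ln^2 gamma_i)^(1/2), gamma_i generalized eigenvalues of (A, B)."""
    L = np.linalg.cholesky(B)
    Linv = np.linalg.inv(L)
    gammas = np.linalg.eigvalsh(Linv @ A @ Linv.T)   # same spectrum as B^{-1} A
    return float(np.sqrt(np.sum(np.log(gammas) ** 2)))

A = np.array([[2.0, 0.0], [0.0, 1.0]])
B = np.eye(2)
print(riemann_dist(A, B))     # ln 2 ~ 0.6931
```

The distance is symmetric in its arguments: swapping A and B only flips the signs of the ln γᵢ, which the squaring removes.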
24 The reference mean deviation σ̂(x) (related to V̂), cases A2 and A8.

(Figure: σ̂(x) for cases A2 and A8.)

The mean deviation vector σ is defined as follows: σ(i) = (V(i, i))^{1/2}.
25 The reference mean deviation σ̂(x) (related to V̂), cases B6 and B9.

(Figure: σ̂(x) for cases B6 and B9.)
26 Absolute errors in the correlation matrix: V₃, V₃ᵉ, V₁ᵉ (case A2). (Figure: panels a, b, c.)

27 Absolute errors in the correlation matrix: V₃, V₃ᵉ, V₁ᵉ (case A8). (Figure: panels a, b, c.)

28 Absolute errors in the correlation matrix: V₃, V₃ᵉ, V₁ᵉ (case B6). (Figure: panels a, b, c.)
29 Conclusion

- Variational and Bayesian approaches to DA lead to the minimization of the same cost function.
- For error estimation we obtain two different concepts: the analysis error covariance for the variational approach, and the posterior covariance for the Bayesian approach.
- In the linear case the two covariances coincide.
- In the nonlinear case strong discrepancies can occur.
- Algorithms for the estimation and approximation of these covariances are proposed.

Gejadze, I., Shutyaev, V., Le Dimet, F.-X. Analysis error covariance versus posterior covariance in variational data assimilation problems. Q. J. R. Meteorol. Soc. (2013).
University of Pennsylvania EABCN Training School May 10, 2016 Bayesian Inference Ingredients of Bayesian Analysis: Likelihood function p(y φ) Prior density p(φ) Marginal data density p(y ) = p(y φ)p(φ)dφ
More informationLecture 2: From Linear Regression to Kalman Filter and Beyond
Lecture 2: From Linear Regression to Kalman Filter and Beyond January 18, 2017 Contents 1 Batch and Recursive Estimation 2 Towards Bayesian Filtering 3 Kalman Filter and Bayesian Filtering and Smoothing
More informationNonlinear variational inverse problems
6 Nonlinear variational inverse problems This chapter considers highly nonlinear variational inverse problems and their properties. More general inverse formulations for nonlinear dynamical models will
More informationarxiv: v1 [stat.ml] 10 Apr 2017
Integral Transforms from Finite Data: An Application of Gaussian Process Regression to Fourier Analysis arxiv:1704.02828v1 [stat.ml] 10 Apr 2017 Luca Amrogioni * and Eric Maris Radoud University, Donders
More informationEstimating functional uncertainty using polynomial chaos and adjoint equations
0. Estimating functional uncertainty using polynomial chaos and adjoint equations February 24, 2011 1 Florida State University, Tallahassee, Florida, Usa 2 Moscow Institute of Physics and Technology, Moscow,
More informationON THE COMPARISON OF BOUNDARY AND INTERIOR SUPPORT POINTS OF A RESPONSE SURFACE UNDER OPTIMALITY CRITERIA. Cross River State, Nigeria
ON THE COMPARISON OF BOUNDARY AND INTERIOR SUPPORT POINTS OF A RESPONSE SURFACE UNDER OPTIMALITY CRITERIA Thomas Adidaume Uge and Stephen Seastian Akpan, Department Of Mathematics/Statistics And Computer
More informationROUNDOFF ERRORS; BACKWARD STABILITY
SECTION.5 ROUNDOFF ERRORS; BACKWARD STABILITY ROUNDOFF ERROR -- error due to the finite representation (usually in floatingpoint form) of real (and complex) numers in digital computers. FLOATING-POINT
More informationBayesian methods in the search for gravitational waves
Bayesian methods in the search for gravitational waves Reinhard Prix Albert-Einstein-Institut Hannover Bayes forum Garching, Oct 7 2016 Statistics as applied Probability Theory Probability Theory: extends
More informationExperimental 2D-Var assimilation of ARM cloud and precipitation observations
456 Experimental 2D-Var assimilation of ARM cloud and precipitation oservations Philippe Lopez, Angela Benedetti, Peter Bauer, Marta Janisková and Martin Köhler Research Department Sumitted to Q. J. Roy.
More informationIn the derivation of Optimal Interpolation, we found the optimal weight matrix W that minimizes the total analysis error variance.
hree-dimensional variational assimilation (3D-Var) In the derivation of Optimal Interpolation, we found the optimal weight matrix W that minimizes the total analysis error variance. Lorenc (1986) showed
More informationBayesian Logistic Regression
Bayesian Logistic Regression Sargur N. University at Buffalo, State University of New York USA Topics in Linear Models for Classification Overview 1. Discriminant Functions 2. Probabilistic Generative
More informationLearning the hyper-parameters. Luca Martino
Learning the hyper-parameters Luca Martino 2017 2017 1 / 28 Parameters and hyper-parameters 1. All the described methods depend on some choice of hyper-parameters... 2. For instance, do you recall λ (bandwidth
More informationOnline appendix to On the stability of the excess sensitivity of aggregate consumption growth in the US
Online appendix to On the stability of the excess sensitivity of aggregate consumption growth in the US Gerdie Everaert 1, Lorenzo Pozzi 2, and Ruben Schoonackers 3 1 Ghent University & SHERPPA 2 Erasmus
More information4. DATA ASSIMILATION FUNDAMENTALS
4. DATA ASSIMILATION FUNDAMENTALS... [the atmosphere] "is a chaotic system in which errors introduced into the system can grow with time... As a consequence, data assimilation is a struggle between chaotic
More informationERASMUS UNIVERSITY ROTTERDAM
Information concerning Colloquium doctum Mathematics level 2 for International Business Administration (IBA) and International Bachelor Economics & Business Economics (IBEB) General information ERASMUS
More informationApproximate Bayesian Computation
Approximate Bayesian Computation Michael Gutmann https://sites.google.com/site/michaelgutmann University of Helsinki and Aalto University 1st December 2015 Content Two parts: 1. The basics of approximate
More informationThe Kalman Filter ImPr Talk
The Kalman Filter ImPr Talk Ged Ridgway Centre for Medical Image Computing November, 2006 Outline What is the Kalman Filter? State Space Models Kalman Filter Overview Bayesian Updating of Estimates Kalman
More informationBayesian Inverse problem, Data assimilation and Localization
Bayesian Inverse problem, Data assimilation and Localization Xin T Tong National University of Singapore ICIP, Singapore 2018 X.Tong Localization 1 / 37 Content What is Bayesian inverse problem? What is
More informationSimple Examples. Let s look at a few simple examples of OI analysis.
Simple Examples Let s look at a few simple examples of OI analysis. Example 1: Consider a scalar prolem. We have one oservation y which is located at the analysis point. We also have a ackground estimate
More informationarxiv:quant-ph/ v1 4 Nov 2005
arxiv:quant-ph/05043v 4 Nov 2005 The Sufficient Optimality Condition for Quantum Information Processing V P Belavkin School of Mathematical Sciences, University of Nottingham, Nottingham NG7 2RD E-mail:
More informationThe Bayesian approach to inverse problems
The Bayesian approach to inverse problems Youssef Marzouk Department of Aeronautics and Astronautics Center for Computational Engineering Massachusetts Institute of Technology ymarz@mit.edu, http://uqgroup.mit.edu
More informationOn variational data assimilation for 1D and 2D fluvial hydraulics
On variational data assimilation for D and D fluvial hydraulics I. Gejadze, M. Honnorat, FX Le Dimet, and J. Monnier LJK - MOISE project-team, Grenoble, France. Contact: Jerome.Monnier@imag.fr Civil Engineering
More informationA Hybrid Approach to Estimating Error Covariances in Variational Data Assimilation
A Hybrid Approach to Estimating Error Covariances in Variational Data Assimilation Haiyan Cheng, Mohamed Jardak, Mihai Alexe, Adrian Sandu,1 Computational Science Laboratory Department of Computer Science
More information