DART_LAB Tutorial Section 5: Adaptive Inflation

©UCAR 2014. The National Center for Atmospheric Research is sponsored by the National Science Foundation. Any opinions, findings and conclusions or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Variance inflation for observations: An adaptive error tolerant filter

[Figure: prior PDF and observation likelihood, with the prior S.D. and expected separation S.D. marked; the actual separation is 4.714 expected SDs.]

1. For an observed variable, we have an estimate of the prior-observation inconsistency.
2. Expected $(\bar{y}_{prior} - y_o)^2 = \sigma_{prior}^2 + \sigma_{obs}^2$.
This assumes that the prior and the observation are unbiased. Is a large separation due to model error or to random chance?
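A quick way to make this consistency check concrete is the MATLAB sketch below. It is illustrative only (the variable names and values are hypothetical, not from the DART_LAB scripts): it compares the actual prior-mean-to-observation separation with the separation expected from the prior and observation variances, in units of expected standard deviations as in the figure above.

```matlab
% Sketch: measure prior/observation inconsistency for one observed variable.
% All names and values are illustrative, not taken from the DART code.
prior_mean = 0.0;        % ensemble mean of the prior for the observed variable
prior_var  = 1.0;        % ensemble variance of the prior
obs        = 5.0;        % observed value
obs_var    = 1.0;        % observation error variance

D = abs(prior_mean - obs);                % actual separation
expected_sd = sqrt(prior_var + obs_var);  % expected separation S.D. (unbiased case)

fprintf('Actual separation = %.3f expected SDs\n', D / expected_sd);
% A value well above 1 suggests the prior is overconfident (e.g. model error)
% rather than ordinary sampling noise.
```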

Variance inflation for observations: An adaptive error tolerant filter

[Figure: prior PDF with inflated S.D. and observation likelihood; after inflation the actual separation is 3.698 expected SDs.]

1. For an observed variable, we have an estimate of the prior-observation inconsistency.
2. Expected $(\bar{y}_{prior} - y_o)^2 = \sigma_{prior}^2 + \sigma_{obs}^2$.
3. Inflating increases the expected separation, which increases the apparent consistency between the prior and the observation.

Variance inflation for observations: An adaptive error tolerant filter

[Figure: prior PDF with inflated S.D. and observation likelihood; actual separation 3.698 expected SDs.]

The distance, D, from the prior mean $\bar{y}$ to the observation is distributed as $N(0, \lambda\sigma_{prior}^2 + \sigma_{obs}^2) = N(0, \theta^2)$. The probability that $y_o$ is observed, given λ, is therefore
$p(y_o \mid \lambda) = (2\pi\theta^2)^{-1/2} \exp\!\left(-D^2 / 2\theta^2\right)$.
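The likelihood above is easy to evaluate. The following minimal MATLAB sketch (illustrative names, not DART code) wraps it in an anonymous function:

```matlab
% Sketch: likelihood of observing y_o at distance D from the prior mean,
% given a candidate inflation factor lambda; theta^2 = lambda*sigma_prior^2 + sigma_obs^2.
obs_likelihood = @(D, lambda, prior_var, obs_var) ...
    exp(-0.5 * D.^2 ./ (lambda .* prior_var + obs_var)) ...
    ./ sqrt(2 * pi * (lambda .* prior_var + obs_var));

obs_likelihood(4.0, 1.5, 1.0, 1.0)   % example evaluation (illustrative values)
```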

Variance inflation for observations: An adaptive error tolerant filter

Use Bayesian statistics to get an estimate of the inflation factor λ.

[Figure: prior PDF and observation likelihood in observation space (top); prior λ PDF versus the observation-space inflation factor λ (bottom).]

Assume the prior for λ is Gaussian:
$p(\lambda, t_k \mid Y_{t_{k-1}}) = N(\bar{\lambda}_p, \sigma_{\lambda,p}^2)$.

Variance inflation for observations: An adaptive error tolerant filter

Use Bayesian statistics to get an estimate of the inflation factor λ.

[Figure: prior PDF and observation likelihood (top); prior λ PDF (bottom).]

We have assumed a Gaussian prior $p(\lambda, t_k \mid Y_{t_{k-1}})$. Recall that $p(y_k \mid \lambda)$ can be evaluated from the normal PDF. Then
$p(\lambda, t_k \mid Y_{t_k}) = p(y_k \mid \lambda)\, p(\lambda, t_k \mid Y_{t_{k-1}}) / \text{normalization}$.

Variance inflation for observations: An adaptive error tolerant filter

Use Bayesian statistics to get an estimate of the inflation factor λ.

[Figure: prior inflated by λ = 2.75 and observation likelihood (top); prior λ PDF (bottom).]

Get $p(y_k \mid \lambda = 2.75)$ from the normal PDF. Multiply by $p(\lambda = 2.75, t_k \mid Y_{t_{k-1}})$ to get $p(\lambda = 2.75, t_k \mid Y_{t_k})$:
$p(\lambda, t_k \mid Y_{t_k}) = p(y_k \mid \lambda)\, p(\lambda, t_k \mid Y_{t_{k-1}}) / \text{normalization}$.

Variance inflation for observations: An adaptive error tolerant filter

Use Bayesian statistics to get an estimate of the inflation factor λ.

[Figure: prior inflated by λ = 1.5 and observation likelihood (top); prior λ PDF (bottom).]

Get $p(y_k \mid \lambda = 1.5)$ from the normal PDF. Multiply by $p(\lambda = 1.5, t_k \mid Y_{t_{k-1}})$ to get $p(\lambda = 1.5, t_k \mid Y_{t_k})$:
$p(\lambda, t_k \mid Y_{t_k}) = p(y_k \mid \lambda)\, p(\lambda, t_k \mid Y_{t_{k-1}}) / \text{normalization}$.

Variance inflation for observations: An adaptive error tolerant filter

Use Bayesian statistics to get an estimate of the inflation factor λ.

[Figure: prior inflated by λ = 2.5 and observation likelihood (top); prior λ PDF (bottom).]

Get $p(y_k \mid \lambda = 2.5)$ from the normal PDF. Multiply by $p(\lambda = 2.5, t_k \mid Y_{t_{k-1}})$ to get $p(\lambda = 2.5, t_k \mid Y_{t_k})$:
$p(\lambda, t_k \mid Y_{t_k}) = p(y_k \mid \lambda)\, p(\lambda, t_k \mid Y_{t_{k-1}}) / \text{normalization}$.

Variance inflation for observations: An adaptive error tolerant filter

Use Bayesian statistics to get an estimate of the inflation factor λ.

[Figure: observation likelihood (top); prior λ PDF, posterior, and the likelihood of y being observed given λ (bottom).]

Repeat for a range of values of λ. The posterior must now be put in the same form as the prior (Gaussian):
$p(\lambda, t_k \mid Y_{t_k}) = p(y_k \mid \lambda)\, p(\lambda, t_k \mid Y_{t_{k-1}}) / \text{normalization}$.
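To illustrate "repeat for a range of values of λ", the sketch below evaluates the unnormalized posterior for λ on a grid and normalizes it numerically. This is only an illustration (the values are hypothetical, and the actual DART algorithm does not tabulate the posterior this way):

```matlab
% Sketch: evaluate the (unnormalized) posterior for lambda on a grid.
D          = 4.0;    % prior-mean-to-observation distance (illustrative)
prior_var  = 1.0;    % prior ensemble variance
obs_var    = 1.0;    % observation error variance
lambda_bar = 1.0;    % mean of the Gaussian prior for lambda
lambda_sd  = 0.6;    % standard deviation of the prior for lambda

lambda = linspace(1, 6, 501);              % candidate inflation values
theta2 = lambda * prior_var + obs_var;     % theta^2 for each candidate
like   = exp(-0.5 * D^2 ./ theta2) ./ sqrt(2*pi*theta2);    % p(y_k | lambda)
prior  = exp(-0.5 * (lambda - lambda_bar).^2 / lambda_sd^2) ...
         / sqrt(2*pi*lambda_sd^2);         % p(lambda, t_k | Y_{t_k-1})
post   = like .* prior;                    % unnormalized posterior
post   = post / trapz(lambda, post);       % normalize numerically
```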

Variance inflation for observations: An adaptive error tolerant filter

Use Bayesian statistics to get an estimate of the inflation factor λ.

[Figure: observation likelihood (top); prior and posterior λ PDFs, nearly indistinguishable (bottom).]

There is very little information about λ in a single observation. The posterior and prior are very similar; the normalized posterior is indistinguishable from the prior.
$p(\lambda, t_k \mid Y_{t_k}) = p(y_k \mid \lambda)\, p(\lambda, t_k \mid Y_{t_{k-1}}) / \text{normalization}$

Variance inflation for observations: An adaptive error tolerant filter

Use Bayesian statistics to get an estimate of the inflation factor λ.

[Figure: observation likelihood (top); difference between the posterior and prior λ PDFs, with the maximum density shifted to the right (bottom).]

There is very little information about λ in a single observation. The posterior and prior are very similar, but their difference shows a slight shift toward larger values of λ.
$p(\lambda, t_k \mid Y_{t_k}) = p(y_k \mid \lambda)\, p(\lambda, t_k \mid Y_{t_{k-1}}) / \text{normalization}$

Variance inflation for observations: An adaptive error tolerant filter

Use Bayesian statistics to get an estimate of the inflation factor λ.

[Figure: observation likelihood (top); prior λ PDF with the posterior maximum found by search; the maximum becomes the new λ mean (bottom).]

One option is to use a Gaussian prior for λ: select the maximum (mode) of the posterior as the mean of the updated Gaussian, then do a fit for the updated standard deviation.
$p(\lambda, t_k \mid Y_{t_k}) = p(y_k \mid \lambda)\, p(\lambda, t_k \mid Y_{t_{k-1}}) / \text{normalization}$

Variance inflation for observations: An adaptive error tolerant filter

A. Computing the updated inflation mean, $\bar{\lambda}_u$.

The mode of $p(y_k \mid \lambda)\, p(\lambda, t_k \mid Y_{t_{k-1}})$ can be found analytically! Solving
$\partial\big[\,p(y_k \mid \lambda)\, p(\lambda, t_k \mid Y_{t_{k-1}})\,\big] / \partial\lambda = 0$
leads to a 6th order polynomial in θ. This can be reduced to a cubic equation and solved to give the mode. The new $\bar{\lambda}_u$ is set to the mode. This is relatively cheap compared to computing regressions.
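DART finds this mode analytically by reducing the derivative condition to a cubic. Purely as an illustration, the mode can also be located with a simple bounded numerical search on the negative log of the product; the names and values below are hypothetical.

```matlab
% Sketch: find the mode of p(y_k | lambda) * p(lambda, t_k | Y_{t_k-1}) numerically.
% (Not the DART algorithm, which solves a cubic analytically.)
D          = 4.0;  prior_var = 1.0;  obs_var = 1.0;   % illustrative values
lambda_bar = 1.0;  lambda_sd = 0.6;

neg_log_post = @(lam) 0.5 * D^2 ./ (lam*prior_var + obs_var) ...
             + 0.5 * log(lam*prior_var + obs_var) ...
             + 0.5 * (lam - lambda_bar).^2 / lambda_sd^2;

lambda_u = fminbnd(neg_log_post, 1.0, 100.0)   % updated inflation mean (mode)
```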

Variance inflation for observations: An adaptive error tolerant filter

B. Computing the updated inflation variance, $\sigma_{\lambda,u}^2$.

1. Evaluate the numerator at the mean $\bar{\lambda}_u$ and at a second point, e.g. $\bar{\lambda}_u + \sigma_{\lambda,p}$.
2. Find $\sigma_{\lambda,u}$ so that $N(\bar{\lambda}_u, \sigma_{\lambda,u}^2)$ goes through $p(\bar{\lambda}_u)$ and $p(\bar{\lambda}_u + \sigma_{\lambda,p})$.
3. Compute it as $\sigma_{\lambda,u} = \sqrt{-\sigma_{\lambda,p}^2 / (2\ln r)}$, where $r = p(\bar{\lambda}_u + \sigma_{\lambda,p}) / p(\bar{\lambda}_u)$.
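Continuing the illustrative sketch from the previous slide, the two-point Gaussian fit for the updated standard deviation takes only a couple of lines:

```matlab
% Sketch: fit the updated Gaussian S.D. for lambda from two evaluations of the
% unnormalized posterior, following steps 1-3 above.
% Reuses neg_log_post, lambda_u and lambda_sd from the previous sketch.
post = @(lam) exp(-neg_log_post(lam));

r           = post(lambda_u + lambda_sd) / post(lambda_u);   % r = p(mode + sd_p) / p(mode)
lambda_sd_u = sqrt(-lambda_sd^2 / (2 * log(r)));             % updated S.D.
% Updated inflation distribution: N(lambda_u, lambda_sd_u^2).
```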

Single Variable Computations with Adaptive Error Correction

[Figure: prior PDF and observation likelihood.]

1. Compute the updated inflation distribution, $p(\lambda, t_k \mid Y_{t_k})$.

Single Variable Computations with Adaptive Error Correction

[Figure: inflated prior ensemble and observation likelihood.]

1. Compute the updated inflation distribution, $p(\lambda, t_k \mid Y_{t_k})$.
2. Inflate the ensemble using the mean of the updated λ distribution.

Single Variable Computations with Adaptive Error Correction

[Figure: inflated prior ensemble, observation likelihood, and the resulting posterior.]

1. Compute the updated inflation distribution, $p(\lambda, t_k \mid Y_{t_k})$.
2. Inflate the ensemble using the mean of the updated λ distribution.
3. Compute the posterior for y using the inflated prior.

Single Variable Computations with Adaptive Error Correction

[Figure: original prior ensemble, observation likelihood, posterior, and increments.]

1. Compute the updated inflation distribution, $p(\lambda, t_k \mid Y_{t_k})$.
2. Inflate the ensemble using the mean of the updated λ distribution.
3. Compute the posterior for y using the inflated prior.
4. Compute the increments from the ORIGINAL prior ensemble.
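The four steps above can be strung together for a single observed variable. The MATLAB sketch below is an illustrative obs-space example with hypothetical values, not the DART_LAB implementation; in particular it reuses the numerical mode search from the earlier sketch in place of DART's analytic cubic solution.

```matlab
% Sketch: one observation assimilated with adaptive inflation (illustrative).
ens     = [-1.2 -0.4 0.1 0.7 1.5];     % prior ensemble (hypothetical values)
obs     = 4.0;   obs_var = 1.0;
lam_bar = 1.0;   lam_sd  = 0.6;        % current inflation distribution

prior_mean = mean(ens);   prior_var = var(ens);
D = abs(prior_mean - obs);

% 1. Update the inflation distribution (numerical mode search, as sketched earlier).
nlp = @(lam) 0.5*D^2./(lam*prior_var + obs_var) + 0.5*log(lam*prior_var + obs_var) ...
    + 0.5*(lam - lam_bar).^2/lam_sd^2;
lam_u = fminbnd(nlp, 1.0, 100.0);

% 2. Inflate the ensemble about its mean using the updated inflation mean.
inf_ens = prior_mean + sqrt(lam_u) * (ens - prior_mean);
inf_var = lam_u * prior_var;

% 3. Compute the posterior for y using the inflated prior (product of Gaussians).
post_var  = 1 / (1/inf_var + 1/obs_var);
post_mean = post_var * (prior_mean/inf_var + obs/obs_var);

% 4. Compute increments from the ORIGINAL prior ensemble: shift and rescale the
%    original members to the posterior mean and variance (equivalent to updating
%    the inflated members and differencing against the original ones).
post_ens   = post_mean + sqrt(post_var/prior_var) * (ens - prior_mean);
increments = post_ens - ens;
```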

Single Variable Computations with Adaptive Error Correction

Adaptive inflation can be tested with the MATLAB script oned_model_inf.m. Five different values that control adaptive inflation can be explored:
- Minimum value of inflation, often set to 1 (no deflation).
- Inflation damping (more on this later); a value of 1.0 turns it off.
- Maximum value of inflation.
- The initial value of the inflation standard deviation.
- A lower bound on the inflation standard deviation (it will asymptote to zero if allowed).

Single Variable Computations with Adaptive Error Correction

[GUI screenshot: controls to turn on adaptive inflation and the inflation settings; the current inflation value and its standard deviation are displayed.]

Single Variable Computations with Adaptive Error Correction

Try adaptive inflation. Pick a lower value for the inflation standard deviation. Use an initial lower bound on inflation of 1.0 and a large upper bound. Try introducing a model bias. What happens if the lower bound is less than 1?

Inflation Damping

The inflation mean is damped towards 1 at every assimilation time. With inf_damping = 0.9, 90% of the inflation's difference from 1.0 is retained. Damping can be useful in models with observations that are heterogeneous in time. For instance, a well-observed hurricane crosses a model domain: adaptive inflation increases along the hurricane track, but after the hurricane passes there are fewer observations and so much inflation is no longer needed. For large earth system models, the following values may work: inf_sd_initial = 0.6, inf_damping = 0.9, inf_sd_lower_bound = 0.6.
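The damping step itself is a one-liner, sketched here with illustrative variable names (not DART's internal names):

```matlab
% Sketch: damp the inflation mean toward 1 before each assimilation time.
lambda_mean = 1.8;                                    % current inflation mean (illustrative)
inf_damping = 0.9;                                    % 90% of (lambda_mean - 1) is retained
lambda_mean = 1 + inf_damping * (lambda_mean - 1);    % damped value: 1.72
```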

Adaptive Multivariate Inflation Algorithm

Suppose we want a single global multivariate inflation, λ_s, instead. Make the same least squares assumption that is used in the ensemble filter: inflating the state variables by λ_s inflates the observation priors by the same amount. This gives the same likelihood as before,
$p(y_o \mid \lambda) = (2\pi\theta^2)^{-1/2} \exp\!\left(-D^2 / 2\theta^2\right)$, with $\theta = \sqrt{\lambda_s \sigma_{prior}^2 + \sigma_{obs}^2}$.
Compute the updated distribution for λ_s exactly as for the single observed variable.

Implementation of the Adaptive Multivariate Inflation Algorithm

1. Apply inflation to the state variables with the mean of the λ_s distribution.
2. Do the following for the observations at a given time, sequentially:
   a. Compute the forward operator to get the prior ensemble.
   b. Compute the updated estimate of the λ_s mean and variance.
   c. Compute increments for the prior ensemble.
   d. Regress the increments onto the state variables.
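The outline above can be written as a compact MATLAB sketch. This is illustrative only: it uses identity forward operators (each observation measures one state variable directly), an EAKF-style obs-space update, and the numerical λ_s mode search from the earlier sketch, so it is a structural stand-in for the DART implementation rather than the real thing.

```matlab
% Sketch: sequential assimilation with a single global adaptive inflation lambda_s.
rng(0);
ens_size = 20;   model_size = 4;
state    = randn(ens_size, model_size);            % prior state ensemble
obs_ind  = [1 3];   obs = [2.0 -1.5];   obs_var = [1.0 1.0];
lam_mean = 1.5;     lam_sd = 0.6;                  % current lambda_s distribution

% 1. Apply inflation to the state variables with the lambda_s mean.
lam_applied = lam_mean;
state = mean(state,1) + sqrt(lam_applied) * (state - mean(state,1));

% 2. Process the observations at this time sequentially.
for k = 1:numel(obs)
    obs_prior = state(:, obs_ind(k));              % a. forward operator (identity here)
    pm  = mean(obs_prior);   D = abs(pm - obs(k));
    pv  = var(obs_prior);                          % variance of the inflated prior
    pv0 = pv / lam_applied;                        % back out the inflation from step 1

    % b. Update the lambda_s mean (numerical mode search; DART solves a cubic).
    %    The lambda_s variance would be updated here too, via the two-point fit.
    nlp = @(lam) 0.5*D^2./(lam*pv0 + obs_var(k)) + 0.5*log(lam*pv0 + obs_var(k)) ...
        + 0.5*(lam - lam_mean).^2/lam_sd^2;
    lam_mean = fminbnd(nlp, 1.0, 100.0);

    % c. Obs-space increments (EAKF: shift the mean, rescale the spread).
    uv   = 1/(1/pv + 1/obs_var(k));                % updated obs-space variance
    um   = uv*(pm/pv + obs(k)/obs_var(k));         % updated obs-space mean
    incr = um + sqrt(uv/pv)*(obs_prior - pm) - obs_prior;

    % d. Regress the increments onto each state variable.
    for i = 1:model_size
        c = cov(obs_prior, state(:,i));            % 2x2 sample covariance matrix
        state(:,i) = state(:,i) + (c(1,2)/pv) * incr;
    end
end
```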

Spatially varying adaptive inflation algorithm

Have a distribution of λ for each state variable, λ_s,i. Use the prior correlation from the ensemble to determine the impact of λ_s,i on the prior variance for a given observation. If γ is the correlation between state variable i and the observation, then
$\theta = \sqrt{\big[1 + \gamma\big(\sqrt{\lambda_{s,i}} - 1\big)\big]^2 \sigma_{prior}^2 + \sigma_{obs}^2}$.
The equation for finding the mode of the posterior is now full 12th order, and an analytic solution appears unlikely. A Taylor expansion of θ around the mean of λ_s,i can be used instead; retaining the linear term is normally quite accurate, and there is an analytic solution for the mode of the product in this case!
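The likelihood scale for this case is simple to compute; here is a small illustrative MATLAB sketch (names are hypothetical, not DART's):

```matlab
% Sketch: theta for the spatially varying inflation likelihood.
% gamma: ensemble correlation between state variable i and the observation;
% lam_i: candidate inflation value for state variable i.
theta = @(lam_i, gamma, prior_var, obs_var) ...
    sqrt((1 + gamma * (sqrt(lam_i) - 1))^2 * prior_var + obs_var);

% With gamma = 1 this reduces to the single-variable case,
% sqrt(lam_i*prior_var + obs_var); with gamma = 0 the observation carries no
% information about lam_i and theta is just sqrt(prior_var + obs_var).
theta(2.0, 1.0, 1.0, 1.0)   % example: sqrt(2*1 + 1)
```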

Spatially Varying Adaptive Inflation

Spatially varying adaptive inflation can be tested with the MATLAB script run_lorenz_96_inf.m. Three different values that control adaptive inflation can be explored:
- Minimum value of inflation, often set to 1 (no deflation).
- Inflation damping; a value of 1.0 turns it off.
- The value of the inflation standard deviation. The lower bound on the standard deviation is set to the same value, so the standard deviation just stays fixed at the selected value.

Spatially Varying Adaptive Inflation

[GUI screenshot: adaptive inflation controls and the adaptive inflation spatial plot.]

Spatially Varying Adaptive Inflation

Explore some of the following:
- How does adaptive inflation change as the localization is changed?
- How does adaptive inflation change for different values of the inflation standard deviation?
- If the lower bound is smaller than 1, does deflation (inflation < 1) happen?

Simulating Model Error in the 40-Variable Lorenz-96 Model

Inflation can deal with all sorts of errors, including model error. Model error can be simulated in Lorenz-96 by changing the forcing. The synthetic observations come from a model with forcing = 8.0. Both run_lorenz_96 and run_lorenz_96_inf allow model error.

Spatially Varying Adaptive Inflation

[GUI screenshot: model error control; change the forcing for the assimilating model here.]

Spatially Varying Adaptive Inflation with Model Error

Explore some of the following:
- Change the model forcing to a larger or smaller value (say 6 or 10). How does adaptive inflation respond to model error?
- Do good values of the localization change as model error increases?
