Professor David B. Stephenson


Transcription:

Grand Challenges in Probabilistic Climate Prediction
Professor David B. Stephenson
Isaac Newton Workshop on Probabilistic Climate Prediction
D. Stephenson, M. Collins, J. Rougier, and R. Chandler, University of Exeter, 20-23 Sep 2010
http://www.newton.ac.uk/programmes/clp/clpw02.html

Aims of the workshop: to debate the strengths and weaknesses of existing frameworks for inferring and evaluating predictions of real-world climate, and to identify and formulate pressing and potentially solvable problems in the mathematics of probabilistic climate prediction.

Breakout themes
Theme A: Statistical frameworks for prediction (Chair: Jonty Rougier). What statistical frameworks are available for quantifying uncertainty in climate predictions, and what are their respective strengths and limitations?
Theme B: Calibration of climate predictions (Chair: David Stephenson). What grounds do we have for believing that our predictions are well-calibrated, and how should we go about calibrating (outputs from) climate models?
Theme C: Evaluation of climate predictions (Chair: Chris Ferro). How should we go about evaluating probabilistic climate predictions? How can we best determine the skill and reliability of probabilistic climate predictions at various space and time scales?
Theme D: Model processes and inadequacies (Chair: Mat Collins). How should we represent and quantify known limitations in the physical processes simulated by climate models (e.g. the ability to simulate long-lasting blocking events, the correct intensity of extreme storm events, etc.)?
Theme E: Problem area to be decided (Chair: Richard Chandler). What is climate? How best to design multi-model ensembles? How best to communicate uncertainty?

What is climate? We cannot observe the climate; we merely observe the states of the various components of the climate system at various instants in time. It can be helpful to study imperfectly observed toy models such as the logistic map, x_{t+1} = 1 - a*x_t^2, although the climate is much more complicated. Climate is:
- the invariant measure in a hypothetical thought experiment in which the forcing is held constant indefinitely;
- what is obtained by taking the joint PDF for the state and the parameters, and then integrating it over the parameters;
- the PDF of the current system state given all the historical observations.
Current climate models are not configured to produce PDFs, and hence additional effort and assumptions (e.g. ergodicity) are required to compute climate as defined above.
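The logistic-map analogy can be made concrete: under the first definition above, "climate" is the long-run invariant distribution of states under fixed forcing. A minimal sketch for x_{t+1} = 1 - a*x_t^2, where the parameter value, burn-in and sample sizes are illustrative choices rather than anything specified in the slides:

```python
import numpy as np

# Toy 'climate': approximate the invariant measure of the logistic map
# x_{t+1} = 1 - a*x_t^2 by iterating past the transient and histogramming
# the visited states ('weather') under a fixed parameter a ('forcing').
def invariant_sample(a=1.8, x0=0.1, n_burn=1_000, n_keep=100_000):
    x = x0
    for _ in range(n_burn):      # discard the spin-up transient
        x = 1.0 - a * x * x
    states = np.empty(n_keep)
    for t in range(n_keep):
        x = 1.0 - a * x * x
        states[t] = x
    return states

density, edges = np.histogram(invariant_sample(), bins=50, range=(-1, 1), density=True)
print(density[:5])               # estimated invariant density over [-1, 1]
```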

What is climate? We considered what is meant by "the climate of Exeter on 1 Jan 2070". "Probability does not exist" (Bruno de Finetti). The probability distribution of weather in Exeter on 1 Jan 2070 can be estimated using:
- ensemble runs from different initial conditions;
- neighbouring time and/or space values (under homogeneity and trend assumptions).
NOTE: BOTH APPROACHES INVOLVE PROBABILITY MODEL ASSUMPTIONS. Hence "Climate does not exist" (DBS). Probabilistic climate prediction = using non-existent p to forecast non-existent c!

Theme A: Statistical frameworks. We need more than weights: we need a credible statistical model for the distribution of real climate Y given information from a multi-model ensemble,

X' = sum_{i=1}^m w_i X_i

- It is incorrect to interpret the weights w_i as probabilities for each model.
- We need to know the dependencies among the X_i to find the variance of X'.
- X' does not equal real climate Y: we require a statistical model for the distribution of Y given the X_i.

Notation: Y = real climate; X_i = climate model run; Z = observation of climate.
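To see why the dependencies matter: Var(X') = w' Sigma w involves every inter-model covariance, so correlated ensemble members give far less error reduction than the familiar 1/m rate for independent ones. A small numerical illustration, where the common correlation value 0.7 is hypothetical:

```python
import numpy as np

# Variance of the weighted ensemble mean X' = sum_i w_i X_i depends on the
# full covariance matrix of the ensemble, not just on each member's variance.
m = 10
w = np.full(m, 1.0 / m)                                  # equal weights
rho = 0.7                                                # hypothetical inter-model correlation
Sigma = rho * np.ones((m, m)) + (1 - rho) * np.eye(m)    # unit-variance members

print(w @ Sigma @ w)   # 0.73 for rho=0.7, versus 1/m = 0.1 if the runs were independent
```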

How should we use multi-model simulations...

... to make inference about future observables? [Figure: observations and projections under different scenarios, with uncertainty around the projections]

Forecast assimilation: the dual of data assimilation.

Data assimilation: p(x | y) = p(y | x) p(x) / p(y)
Forecast assimilation: p(y_f | x_f) = p(x_f | y_f) p(y_f) / p(x_f)

Stephenson, D.B., Coelho, C.A.S., Balmaseda, M. and Doblas-Reyes, F.J. (2005): Forecast Assimilation: A unified framework for the combination of multi-model weather and climate predictions, Tellus A, 57(3), pp. 253-264.
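For linear-Gaussian distributions the forecast-assimilation update above has a closed form. A minimal sketch of one such update, with illustrative calibration numbers that are not taken from the paper:

```python
# Forecast assimilation as a Bayes update: treat the raw (possibly biased,
# over-confident) model forecast x_f as data about the observable y_f.
mu_y, var_y = 15.6, 3.35**2       # climatological prior p(y_f)       (illustrative)
a, b, var_e = 0.3, 0.95, 1.5**2   # likelihood p(x_f | y_f) = N(a + b*y_f, var_e)

def forecast_assimilate(x_f):
    prec = 1.0 / var_y + b**2 / var_e                    # posterior precision
    mean = (mu_y / var_y + b * (x_f - a) / var_e) / prec
    return mean, 1.0 / prec

mean, var = forecast_assimilate(19.0)
print(f"p(y_f | x_f=19.0) = N({mean:.2f}, {var:.2f})")
```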

The multi-model ensemble: a fruit bowl of opportunity, {X_1, X_2, ..., X_m}. Note: it is not a random sample from one homogeneous population (and it does not include all possible fruit!).

What does reality look like? An inconvenient truth: the actual true climate Y is inferred from observations Z. It could not have been drawn out of my fruit bowl! How can we infer properties of it from the fruit in the fruit bowl?

Smoothies (multi-model means). A smoothie is a weighted average of fruits; it is not an item of real fruit (important information has been lost by averaging). The choice of weights for making smoothies is non-unique:

X' = sum_{i=1}^m w_i X_i != Y

We require modelling frameworks for obtaining samples of real fruit from the posterior distribution p(Y | X), not smoothies (E(X) or E(X | Y)).

Should we use everything in the fruit bowl? Should we select subsets? "All fruit are wrong, but some are tasty" - Granny Smith

Homogeneous sub-samples. How to relate Y to X? Are the {X_i} independent draws from a distribution centred on Y? Are the {X_i} second-order exchangeable with each other and with Y? How best to model the model discrepancy Y - X_i? "All fruit are equal, but some are more equal than others" - Granny Orwell

Statistical frameworks. All such frameworks are simplifying assumptions about the complex MME and its relation to reality. Models can be based on Bayesian judgements about the MME or on frequentist sampling ideas. For the frequentist interpretation one should define the POPULATION OF ALL POSSIBLE MODEL RUNS and how one selects a sample from it:
- random combination of model components;
- expert judgement on model components;
- evolution of models (viable species).
The models and their assumptions should be clearly stated and rigorously assessed.

Themes B and D: Model biases, Y = b(f) + d.
1. Can we express current bias correction procedures (which currently give different answers) in a common statistical framework, in a way that will help us learn about our model biases?
2. What can we assume about d today and d in the future? How do we deal with non-linear changes in d?
3. How can model processes be directly represented in the statistical framework, rather than frameworks using model parameters?
4. How can we test whether a statistical framework is well formulated? Which assumptions can be tested?
5. How can we start to work with societally relevant variables, e.g. extremes rather than 30-year means?
Modelling centres have very few b-and-d'ers and too many f'ers.

Calibration uncertainty (John Ho, Mat Collins, Simon Brown, Chris Ferro). How to infer the distribution of future climate Y' from the distributions of Y, X and X' (Y = observed present, X = modelled present, X' = modelled future)?
1. No calibration: assume Y' and X' have identical distributions (i.e. no model biases!), i.e. F_Y' = F_X'.
2. Bias correction: assume Y = B(X) where B(.) = F_Y^{-1}(F_X(.)); then Y' = B(X').
3. Change factor: assume Y' = C(Y) where C(.) = F_X'^{-1}(F_X(.)).
4. Other: e.g. adjust parameters in parametric fits (e.g. mu_Y' = mu_Y + mu_X' - mu_X).
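These strategies can all be written as quantile mappings between distribution functions. A sketch using empirical quantiles, under that reading of the slide (function and variable names are mine, not prescribed by the source):

```python
import numpy as np

def quantile_map(src, ref, values):
    """Empirical F_ref^{-1}(F_src(.)): pass values through the source CDF,
    then read off the corresponding quantiles of the reference sample."""
    p = np.searchsorted(np.sort(src), values) / len(src)
    return np.quantile(ref, np.clip(p, 0.0, 1.0))

# Y = observed present, X = modelled present, Xf = modelled future (samples)
def no_calibration(Y, X, Xf):   # assume F_{Y'} = F_{X'}
    return Xf

def bias_correction(Y, X, Xf):  # Y' = B(X'),  B = F_Y^{-1} o F_X
    return quantile_map(X, Y, Xf)

def change_factor(Y, X, Xf):    # Y' = C(Y),   C = F_{X'}^{-1} o F_X
    return quantile_map(X, Xf, Y)
```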

Example: daily summer temperatures in London. Daily mean air temperatures. Y = observations 1970-1999 from the E-OBS gridded dataset (Haylock et al., 2008). X = HadRM3 standard run (25 km resolution) forced by HadCM3, SRES A1B scenario. n = 30 x 120 = 3600 days. Black line = sample mean; red line = 99th percentile.

Probability density functions: black line = PDF of observed data 1970-1999; blue line = PDF of climate model data 1970-1999; red line = PDF of climate model data 2070-2099.

Linear calibration (constant shape). With Gaussian margins the quantile mappings are linear, and the sample estimates are mu_Y = 15.61, sigma_Y = 3.35 (observations), mu_X = 15.95, sigma_X = 3.01 (model, 1970-1999), mu_X' = 19.65, sigma_X' = 4.03 (model, 2070-2099).

Bias correction: B(x) = F_Y^{-1}(F_X(x)) = mu_Y + (sigma_Y/sigma_X)(x - mu_X), which for our data gives E(Y') = 15.61 + (3.35/3.01)(19.65 - 15.95) = 19.7, a mean warming of 4.1.

Change factor: C(y) = F_X'^{-1}(F_X(y)) = mu_X' + (sigma_X'/sigma_X)(y - mu_X), which for our data gives E(Y') = 19.65 + (4.03/3.01)(15.61 - 15.95) = 19.2, a mean warming of 3.6.

The two approaches give different future mean temperatures!
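For reference, both numbers follow directly from the linear constant-shape forms; a quick check using the summary statistics as reconstructed above:

```python
mu_Y, sd_Y   = 15.61, 3.35   # observations, 1970-1999
mu_X, sd_X   = 15.95, 3.01   # model, 1970-1999
mu_Xf, sd_Xf = 19.65, 4.03   # model, 2070-2099

# Bias correction: E(Y') = mu_Y + (sd_Y/sd_X) * (mu_Xf - mu_X)
bc = mu_Y + (sd_Y / sd_X) * (mu_Xf - mu_X)
# Change factor:   E(Y') = mu_Xf + (sd_Xf/sd_X) * (mu_Y - mu_X)
cf = mu_Xf + (sd_Xf / sd_X) * (mu_Y - mu_X)

print(f"bias correction: {bc:.1f}  (warming {bc - mu_Y:.1f})")  # 19.7, +4.1
print(f"change factor:   {cf:.1f}  (warming {cf - mu_Y:.1f})")  # 19.2, +3.6
```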

Change in the 10-summer level, 2040-69 relative to 1970-99. [Maps: no calibration, bias correction, change factor; shading shows T_g - T_o.] Substantial differences between the different estimates!

Theme C: Evaluation of predictions. What is a good climate prediction, and how can we measure goodness? How can our judgement about the quality of climate predictions be informed by:
- climate model validation?
- short lead-time predictions?
- perfect model experiments?
- hindcasts?
- expert judgements about the ability of climate models to simulate relevant physical processes?
How can we avoid, or account for, any double-counting?

Design of experiments.

Perturbed-parameter ensembles: perturb parameters in one model to see the resulting differences as a measure of differences between models; to explore sensitivities to parameter values and parametric uncertainty; and to assess the relative importance of different parameters (the potentially very large number of combinations that could arise in exploring sensitivities might motivate use of an emulator).

Multi-model ensembles: to estimate model discrepancy, and to increase the range/variability spanned by the ensemble (greater in practice than is obtained by perturbing parameters in most individual models). This additional variability is wanted, although the reason for wanting it could be articulated better; the range should include the mouldy fruit and be broad enough that some members can be rejected, as evidence that the range has been fully explored. How to trade off between resolution and ensemble size?

Experiments are likely to be multi-purpose because of the effort and cost involved. Purposes may be ascertaining (for example) model sensitivities or distributions of aspects of climate, so models need to be robust to the many questions likely to be addressed. The purpose influences the choice of parameter values; a remaining question is how that choice is made.

Sequential design was agreed on, but raises issues to explore. The whole sequence may not be set a priori. Emulators need testing. They can be used for successive reduction of the parameter space covered, according to what is learned (space shown to be irrelevant), and to guide the few runs of a simulator needed to achieve better accuracy. Dimension reduction gives benefit in reducing effort. The question of which factors are relevant to include, or to use as criteria in experiment design, transformed into the issue of how to use previous (existing) information to inform the next steps: what are the dominant sources of uncertainty? This includes the use of simpler models in a hierarchy of models to inform subsequent or more complex models along/up a chain of development.

An identified problem to pursue: how to design multi-model experiments that are efficient in reaching the desired objective (especially optimal estimates of distributions of aspects of climate) with the given resources. This requires optimising a utility function, but how should its weights be set (an evolution of the question of how to weight members of a multi-model ensemble)?

Tails of distributions are of interest for low-probability but high (intolerable?) impacts. Monte Carlo would usually be efficient for many parameters and for exploring tails. There is a need to test any emulator (and simulator!) for extremes. For such interests we may want to initially fill parameter space (Latin hypercube, Sobol sequence); an initial emulator use was suggested in plenary. The shape of parameter space may be poorly defined, and the parameters may not all be independent.
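A space-filling design such as the Latin hypercube mentioned above is a common way to initially fill parameter space before fitting an emulator. A minimal sketch, where the dimension and sample size are arbitrary:

```python
import numpy as np

def latin_hypercube(n, d, seed=0):
    """n points in [0, 1]^d: each axis is cut into n equal slices and every
    slice is hit exactly once, giving better coverage than plain Monte Carlo."""
    rng = np.random.default_rng(seed)
    cols = [(rng.permutation(n) + rng.random(n)) / n for _ in range(d)]
    return np.column_stack(cols)

design = latin_hypercube(n=20, d=3)   # 20 candidate parameter settings
# Scale each column to the physical range of the corresponding model parameter
# before running (or emulating) the simulator at these settings.
print(design.shape)                   # (20, 3)
```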

Some grand challenges.
1. Reconcile operational and conceptual probabilistic definitions of climate and climate trend.
2. Develop statistical frameworks for MMEs that can be used to make reliable predictions of societally relevant variables. Ideally the frameworks will incorporate process information from models (physical metrics), be able to cope with complex biases (non-linear, time-dependent, etc.), and inform us about the design of experiments for future MMEs that include climate models of different complexity and resolution.
3. Develop a rigorous evaluation procedure for assessing and comparing the frameworks, in order to build confidence in this important component of a probabilistic climate prediction system.
4. Find better ways of communicating the various sources of uncertainty in probabilistic climate predictions (e.g. dynamic graphics, best judgements).