
MURI Project: Integration and Visualization of Multisource Information for Mesoscale Meteorology: Statistical and Cognitive Approaches to Visualizing Uncertainty, 2001-2006

Overview of Achievements, October 2001 to October 2003
Adrian Raftery, P.I.
MURI Overview Presentation, 17 October 2003
© 2003 Adrian E. Raftery

Project Goals
- Develop statistical methods for assessing uncertainty in numerical weather predictions
- Develop ways of visualizing and communicating this uncertainty to users
- Develop tools for users to implement the methods
Project Team: statisticians, atmospheric scientists, psychologists, and visualization specialists from the University of Washington

Summary of Achievements
- Expanded UW mesoscale ensemble forecast system and results
- Calibrated probabilistic forecasting methods developed
- New verification concepts and tools developed
- Observation and evaluation of how naval forecasters use model information completed
- New Web tools developed: expanded UW Ensemble website, MUM (MURI Uncertainty Monitor), shared data archive and data extraction tools
- Intense between-group interaction and interdisciplinary learning
- Many students, papers, and presentations
- The springboard for a highly productive next 3 years

UW Mesoscale Ensemble System
- Development of an operational mesoscale ensemble system
- Representation of initial-condition uncertainty with 8 analyses from different major operational forecast centers
- Representation of physics uncertainty through perturbations to the mesoscale model
- New ensemble members through mirroring, a new idea due to Tony Eckel
- Demonstrated a robust skill-spread relationship
A sketch of the mirroring construction follows this slide.
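
The slide names mirroring without spelling out the construction. Below is a minimal sketch of one common reading, assuming each new member is the reflection of an existing analysis about the ensemble centroid; the function name and the toy field are illustrative only.

```python
import numpy as np

def mirror_ensemble(members):
    """Augment an ensemble by mirroring each member about the ensemble mean.

    members : array of shape (n_members, ...) with one analysis/forecast field
              per row.  Returns an array with 2*n_members rows: the originals
              followed by their reflections through the centroid.
    """
    members = np.asarray(members, dtype=float)
    centroid = members.mean(axis=0)
    mirrored = 2.0 * centroid - members        # reflect each member through the mean
    return np.concatenate([members, mirrored], axis=0)

# Example: 8 analyses of a tiny 2 x 2 temperature field -> 16 members.
rng = np.random.default_rng(0)
analyses = 280.0 + rng.normal(scale=1.5, size=(8, 2, 2))
augmented = mirror_ensemble(analyses)
print(augmented.shape)                                               # (16, 2, 2)
print(np.allclose(augmented.mean(axis=0), analyses.mean(axis=0)))    # centroid preserved
```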

Grid-Based Bias Correction
- There is much literature on bias correction at observation sites, but little on grid-based bias correction. Grid-based bias correction cannot use observations at the point at which the correction is being made.
- Three different approaches were developed by MURI researchers:
  - Sliding time window at each grid point, with the bias estimated by comparing the forecast with the verifying analysis (not observations): a 14-day window was found to perform well. Substantial reduction in RMSE (up to 30%). A sketch of this approach follows this slide.
  - Global method using observations and modern nonlinear regression with ACE, CART, and Bayesian model selection. Independent variables: functions of latitude, longitude, elevation, distance from ocean, and land use. Fitted using 2001 temperature data for the Pacific Northwest; evaluated using 2002 data. 12% reduction in RMSE.
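
A minimal sketch of the first approach, the sliding-window grid-point bias correction against the verifying analysis. The 14-day default follows the slide; the array shapes and function name are assumptions for illustration.

```python
import numpy as np

def sliding_window_bias_correction(forecasts, analyses, window=14):
    """Grid-point bias correction from a trailing window of forecast errors.

    forecasts, analyses : arrays of shape (n_days, ny, nx); analyses are the
        verifying analyses (not station observations), as on the slide.
    Returns bias-corrected forecasts for the days that have a full window of
    history (days window .. n_days-1).
    """
    forecasts = np.asarray(forecasts, dtype=float)
    analyses = np.asarray(analyses, dtype=float)
    errors = forecasts - analyses
    corrected = []
    for t in range(window, forecasts.shape[0]):
        bias = errors[t - window:t].mean(axis=0)   # mean error over the trailing window
        corrected.append(forecasts[t] - bias)
    return np.stack(corrected)

# Example with synthetic, warm-biased forecasts on a 4 x 5 grid over 30 days.
rng = np.random.default_rng(0)
truth = rng.normal(size=(30, 4, 5))
fcst = truth + 0.8 + rng.normal(scale=0.3, size=truth.shape)
print(sliding_window_bias_correction(fcst, truth).shape)   # (16, 4, 5)
```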

Grid-Based Bias Correction (continued)
  - Local method using recent neighboring observations, with "neighboring" defined as close to the grid point spatially and in terms of land use and elevation: good performance with a 20-day sliding window and neighborhoods of 1 degree of latitude and longitude, 500 m of elevation, and broad land-use categories. 25% reduction in RMSE. A sketch of the neighbor selection follows this slide.
  - Combination of the global and local methods using regression: 26% reduction in RMSE. A small improvement over the local method alone, but the global component allows good estimates where there are few observational assets.
- Open question: comparison and combination of the analysis-based and observation-based approaches
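
A minimal sketch of the neighbor selection behind the local method, assuming "close" means within the 1-degree, 500 m, and land-use windows quoted on the slide. The data layout and the fallback behavior are illustrative assumptions, not the project's implementation.

```python
import numpy as np

def local_bias(grid_lat, grid_lon, grid_elev, grid_landuse, stations, errors,
               dlat=1.0, dlon=1.0, delev=500.0):
    """Estimate bias at a grid point from recent errors at neighboring stations.

    stations : list of dicts with 'lat', 'lon', 'elev', 'landuse'.
    errors   : array (n_stations, n_days) of forecast-minus-observation errors
               over the trailing training window (e.g. 20 days).
    A station is a neighbor if it lies within dlat/dlon degrees and delev metres
    of the grid point and shares its broad land-use category.
    """
    mask = np.array([
        abs(s["lat"] - grid_lat) <= dlat and
        abs(s["lon"] - grid_lon) <= dlon and
        abs(s["elev"] - grid_elev) <= delev and
        s["landuse"] == grid_landuse
        for s in stations
    ])
    if not mask.any():
        return 0.0               # fall back to no correction (or to the global method)
    return float(np.asarray(errors)[mask].mean())

# Example: two urban neighbors contribute, the distant mountain station does not.
stations = [{"lat": 47.5, "lon": -122.3, "elev": 120.0, "landuse": "urban"},
            {"lat": 47.9, "lon": -122.2, "elev": 90.0,  "landuse": "urban"},
            {"lat": 46.1, "lon": -121.0, "elev": 1400.0, "landuse": "forest"}]
errs = np.array([[0.9, 1.1], [0.7, 1.3], [-0.2, 0.1]])   # 3 stations x 2 days of errors
print(local_bias(47.6, -122.4, 100.0, "urban", stations, errs))   # 1.0
```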

Statistical Assessment of Uncertainty
Development of three calibrated probabilistic forecasting methods, possibly for the first time:
- BMA (Bayesian Model Averaging): optimally combines an ensemble of deterministic forecasts from different MM5 initializations with probabilistic error distributions to produce a calibrated and sharp predictive probability distribution of a future meteorological quantity
- CHMOS (Conditionally Heteroscedastic Model Output Statistics): extends standard MOS by using the predictive distribution from regression and allowing for nonconstant variance, yielding a calibrated probability distribution of a future meteorological quantity
- GOP (Geostatistical Output Perturbation Method): probabilistic forecasting of entire fields, combining a deterministic forecast with a spatial statistical model of a random field to produce a calibrated probability distribution of an entire field at a future time point

Bayesian Model Averaging
- A principled statistical method for combining probabilistic forecasts from different models and analyses, using weights that reflect their relative skills. It yields both a probabilistic forecast (a PDF for future weather quantities and events) and a deterministic forecast, and it honors the spread-skill relationship.
- Provides a theoretical explanation of a paradox: why do ensembles show a spread-skill relationship, and thus provide useful predictions of forecast skill, and yet remain uncalibrated?
- The predictive distribution can be represented as an analytic function, or as an equally weighted ensemble of any desired size
- The weights provide a basis for selecting ensemble members
- We established good training-period lengths: 20 days for temperature, 40 days for sea-level pressure
A sketch of the BMA mixture follows this slide.
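
A minimal sketch of the BMA predictive distribution as a weighted mixture, assuming Gaussian kernels, a single common spread, and a simple linear bias correction. The actual method fits member-specific parameters and weights by maximum likelihood over a training period, which is not reproduced here.

```python
import numpy as np
from scipy.stats import norm

def bma_pdf(y, forecasts, weights, sigma, a=0.0, b=1.0):
    """BMA predictive density for a temperature-like variable: a weighted
    mixture of normals, each centred on a linearly bias-corrected member
    forecast a + b*f.  The kernel, common sigma, and correction are
    illustrative assumptions."""
    centres = a + b * np.asarray(forecasts, dtype=float)
    return np.sum(np.asarray(weights) * norm.pdf(y, loc=centres, scale=sigma))

def bma_sample(n, forecasts, weights, sigma, a=0.0, b=1.0, rng=None):
    """Draw an equally weighted ensemble of any desired size from the mixture."""
    rng = rng or np.random.default_rng()
    centres = a + b * np.asarray(forecasts, dtype=float)
    idx = rng.choice(len(centres), size=n, p=np.asarray(weights))
    return rng.normal(loc=centres[idx], scale=sigma)

# Example: 5-member ensemble with skill-based weights.
members = np.array([271.2, 272.5, 270.9, 273.1, 272.0])
w = np.array([0.35, 0.25, 0.20, 0.12, 0.08])
print(bma_pdf(272.0, members, w, sigma=1.4))
print(bma_sample(10, members, w, sigma=1.4, rng=np.random.default_rng(1)))
```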

BMA Results
- The BMA predictive distribution is much better calibrated than the raw ensemble
- It is also sharp, with prediction intervals 62% shorter than those produced by climatology in an out-of-sample evaluation
- The BMA deterministic forecast had an RMSE that was:
  - 55% better than climatology
  - 11% better than the best single forecast
  - 6% better than the ensemble mean

CHMOS: Conditionally Heteroscedastic Model Output Statistics
Easy-to-implement linear postprocessing techniques for mesoscale ensembles:
- EMOS: ensemble model output statistics
- CHMOS: conditionally heteroscedastic model output statistics
- MEMOS: mean-based EMOS (uses only the ensemble mean as predictor)
- MCHMOS: mean-based CHMOS (uses only the ensemble mean as predictor)
Good properties:
- calibrated
- sharp
- CHMOS and MCHMOS honor the spread-error relationship
- a quick-and-dirty approximation to BMA
A sketch of the CHMOS predictive distribution follows this slide.
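
A minimal sketch of the CHMOS idea: a regression-style predictive mean combined with a variance that grows with ensemble spread, which is how the spread-error relationship is honored. The coefficient values and the variance form c + d * s^2 are placeholders; in practice they would be estimated from a training set.

```python
import numpy as np

def chmos_predictive(forecasts, coefs_mean, intercept, c, d):
    """Conditionally heteroscedastic MOS predictive distribution (sketch).

    Mean:     intercept + sum_k coefs_mean[k] * forecasts[k]
    Variance: c + d * ensemble variance   (the 'conditionally heteroscedastic'
              part: predictive uncertainty tracks ensemble spread)
    Returns (predictive mean, predictive standard deviation).
    """
    forecasts = np.asarray(forecasts, dtype=float)
    mu = intercept + np.dot(coefs_mean, forecasts)
    var = c + d * forecasts.var(ddof=1)
    return mu, np.sqrt(var)

# Example with placeholder coefficients for a 4-member SLP ensemble (hPa).
mu, sd = chmos_predictive([1012.3, 1013.1, 1011.8, 1012.9],
                          coefs_mean=[0.25, 0.25, 0.25, 0.25],
                          intercept=0.4, c=0.8, d=1.2)
print(f"predictive mean {mu:.1f} hPa, std dev {sd:.2f} hPa")
```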

CHMOS Results
- In a cross-validation study for SLP in the Puget Lowland (January-March 2000), on the test set CHMOS shows a 12% reduction in MAE and a 24% reduction in the log/ignorance score over raw (smoothed) ensemble output
- In true forecast mode for 2-m temperature in the Puget Lowland (spring 2000), MEMOS shows up to an 80% reduction in the log/ignorance score over raw (smoothed) ensemble output
- In true forecast mode for SLP in the Puget Lowland (spring 2000), CHMOS shows up to a 1% reduction in MAE and a 35% reduction in ignorance over raw (smoothed) ensemble output

Forecasting Fields: The Geostatistical Output Perturbation (GOP) Method
- A calibrated, sharp probabilistic forecasting method
- Honors the spatial characteristics of the error field
- Simulates an ensemble of forecast fields using geostatistical error models and fast random-field simulation methods
- Contours look rougher than meteorologists are used to
A sketch of the GOP idea follows this slide.
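
A minimal sketch of the GOP idea on a toy grid, assuming an exponential covariance for the error field and dense simulation with numpy. The project's implementation fits the geostatistical error model to past forecast errors and uses fast random-field simulation rather than a dense covariance factorization.

```python
import numpy as np

def gop_simulations(forecast_field, lats, lons, n_sims=4,
                    sill=1.0, rang=3.0, nugget=0.2, rng=None):
    """Geostatistical Output Perturbation sketch: add spatially correlated
    Gaussian error fields to a deterministic forecast.  The exponential
    covariance sill*exp(-d/rang) plus a nugget is an illustrative choice."""
    rng = rng or np.random.default_rng()
    lon_g, lat_g = np.meshgrid(lons, lats)
    pts = np.column_stack([lat_g.ravel(), lon_g.ravel()])
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)   # pairwise distances (degrees)
    cov = sill * np.exp(-d / rang) + nugget * np.eye(len(pts))
    errors = rng.multivariate_normal(np.zeros(len(pts)), cov, size=n_sims)
    return forecast_field.ravel() + errors        # each row is one simulated field

# Example: four simulated fields on a small Pacific Northwest grid.
lats = np.arange(42.0, 50.0, 2.0)
lons = np.arange(-124.0, -118.0, 2.0)
base = 280.0 + np.zeros((lats.size, lons.size))
fields = gop_simulations(base, lats, lons, rng=np.random.default_rng(2))
print(fields.shape)   # (4, 12): 4 simulations over the 12 grid points
```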

Example of GOP Simulations
[Figure: four simulated forecast fields for 01/12/2002 over the Pacific Northwest; axes span roughly latitude 42-50 N and longitude 124-118 W, with a common color scale from 255 to 290.]

Verification
- Established the principle of maximizing sharpness subject to calibration
- Developed a theoretical framework for verifying probabilistic forecasts
- Developed the concepts of probabilistic calibration, exceedance calibration, and marginal calibration
- The concept of marginal calibration was adopted at the 2002 NCAR verification workshop
- Developed verification tools such as the marginal calibration plot
- Established results about scoring systems (e.g., showed that the probability score is improper)
A sketch of a probabilistic calibration check follows this slide.
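
A minimal sketch of one of these concepts, probabilistic calibration, checked via the probability integral transform (PIT): for a calibrated forecaster the PIT values are close to uniform. Gaussian predictive distributions and the synthetic data are assumptions for illustration.

```python
import numpy as np
from scipy.stats import norm

def pit_values(obs, pred_means, pred_sds):
    """Probability integral transform: evaluate each predictive CDF at the
    verifying observation.  For a probabilistically calibrated forecaster the
    PIT values are (approximately) uniform on [0, 1]."""
    return norm.cdf(np.asarray(obs), loc=np.asarray(pred_means), scale=np.asarray(pred_sds))

# Example: a calibrated Gaussian forecaster produces a flat PIT histogram.
rng = np.random.default_rng(3)
mu = rng.normal(size=5000)
y = rng.normal(loc=mu, scale=1.0)             # truth drawn from the forecast distribution
hist, _ = np.histogram(pit_values(y, mu, 1.0), bins=10, range=(0.0, 1.0))
print(hist)                                    # roughly equal counts per bin
```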

Rapid Mesoscale Forecast Verification (MVT)
- Developed an automated user-interface tool for rapid mesoscale forecast verification (MVT), allowing rapid and accurate selection of fields for verification. MVT allows selection, data access, and setup of a verification case in minutes, as compared to hours to days.
- Investigation of rapid image-processing algorithms revealed a combination of algorithms yielding a 30-fold increase in computational verification speed with essentially no degradation in accuracy (< 1%).
- MVT technology allows consideration of phase and timing errors in automated verification of grid forecasts, an attribute not routinely considered in automated verification schemes.

Human Factors
- How are naval forecasters using model information now? Work completed: qualitative observations and statistical evaluations
- Hypothetical uses of probabilistic information: work in progress, with experiments using meteorology students
- Evaluation of alternative techniques for display: experiments planned

Human Factors: Experiments Carried Out
- Conducted forecaster observations at NAMOC (Norfolk, VA), NAS Whidbey, and NOAA
- Developed a verbal-protocol coding scheme appropriate for weather forecasting
- Conducted a verbal-protocol study of 6 forecasters at NAS Whidbey and at NOAA
- Conducted an uncertainty questionnaire study at NAS Whidbey (joint APL/Psych)
- Conducted an uncertainty-products experiment at UW (joint APL/Psych/Atmos)
- Conducted a lens-model analysis of NAS Whidbey wind forecasts (190 TAFs)
- Analyzed the comparative accuracy of forecasts by MM5 alone and by forecasters using MM5

Visualization and Data Sharing
- Expanded the operational UW Ensemble website
- Developed an online prototype interface for visualizing uncertainty, the MURI Uncertainty Monitor (MUM):
  - will facilitate more rapid assessment of model uncertainty
  - produces new types of model-performance and forecast-uncertainty visualizations
- Developed a web-based data extraction tool for research and operational post-processing
- Developed a Java Web Start tool for mesoscale verification analyses
- Jeff Baars, jointly appointed between the Atmos and Stat groups, works on data sharing and verification

Transitions and Leveraging of MURI Research
- ONR Code 322 funded research on mesoscale verification, Jan 03 to Dec 06
- WA Sea Grant funded research on a system to deliver environmental information to Puget Sound boaters, Feb 04 to Jan 07
- Army SBIR announcement on Portraying Uncertainty in Battlefield Weather Situation Awareness
- Presentation to the Commander, Naval METOC Command, March 03
- Briefing by Doug Brown (Army) to the NATO working group on Battlearea Meteorological Systems and Support, Nov 02; great interest reported
- Considerable contact with Whidbey Island Naval Air Station forecasters
- Invited talk by the P.I. at the Army Applied Statistics Conference, Oct 03

Major Student Involvement
- 5 undergraduate students supported (2 in Stat, 3 at APL)
- 9 graduate students supported (2 in Atmos, 5 in Stat, 2 in Psych)

Intense Between-Group Collaboration
- 6 one-day all-MURI meetings: August 2001, January 2002, May 2002, October 2002, April 2003, October 2003
- Active MURI email list: muri@stat.washington.edu
- MURI web site: www.stat.washington.edu/muri
- Attendance by all groups at Mass's weather forecasting classes
- Attendance by all groups at Gneiting and Raftery's course on Inference for Deterministic Simulation Models
- MURI visualization working group
- MURI verification working group
- MURI data committee involving members from all groups

Joint attendance at conferences, e.g.:
- Verification Workshop at NCAR, July 2002
- Pacific Northwest Weather Workshop, March 2003
- Mesoscale Ensemble Workshop, Monterey, July 2003
- UW Mesoscale Verification Workshop, organized by Cliff Mass, July 2003
- Ensemble Forecasting Workshop, Québec, Sept 2003
- American Meteorological Society Annual Meeting, Jan 2004 (including 3 talks by statisticians)
Other joint activities:
- Joint visits to Whidbey NAS (esp. Psych and APL)
- Regular APL-Psych meetings (every two weeks)
- Many other ad hoc inter-group meetings

Quantitative Summary: FY 2002
- Refereed publications: 15
- Invited presentations: 7
- Contributed presentations: 12
- Honors: 4
(FY 2003 report not yet available)

Summary of Achievements
- New ensemble forecast system and results
- Calibrated probabilistic forecasting methods developed
- New verification concepts and tools developed
- Observation and cognitive task analysis of how naval forecasters use models completed
- Web-based uncertainty visualization, data extraction, and data-sharing tools developed
- Major student involvement
- Already, a range of potential transitions
- Intense between-group interaction and learning

Next Steps: Prototypes and Going Operational
- Continue, complete, and publish current ensemble, probabilistic forecasting, bias correction, verification, human factors, and visualization work
- Practical implementation of probabilistic forecasting products
- Experimentation in a quasi-operational setting on the Web (UW Ensemble web site and APL's MUM)
- Going operational!