DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited.

ESPC Analysis of Interagency Extended-Range Ensemble Prediction

Carolyn Reynolds
Naval Research Laboratory
Monterey, CA 93943-5502
phone: (831) 656-4728   fax: (831) 656-4769   e-mail: carolyn.reynolds@nrlmry.navy.mil

Award Number: N0001415WX00968

LONG-TERM GOALS

The long-term goal of the project is the development of a coupled atmosphere-ocean ensemble prediction system that can provide probabilistic environmental information to meet Navy and DoD operations and planning needs throughout the globe, from the undersea environment to the upper atmosphere and from the tropics to the poles, on weekly to monthly timescales.

OBJECTIVES

Evaluation and validation of multi-model extended-range forecasts produced under projects such as the High-Impact Weather Prediction Project (HIWPP) and the North American Multi-model Ensemble (NMME). This will allow for determination of when and why multi-model ensembles are superior to single-model ensembles for predicting phenomena, such as the Madden-Julian Oscillation (MJO), that are central to extended-range forecast skill.

APPROACH

The focus of this component of ESPC is on examining the utility of multi-model ensembles for extended-range prediction. Ensemble-based probabilistic approaches become necessary as integration periods are increased. The specific approach is to first collect retrospective and quasi-real-time ensembles produced under multi-agency efforts, such as HIWPP and NMME, and then apply diagnostic tools to evaluate and validate these multi-agency, multi-model ensemble forecasts. The focus is on phenomena tied to extended-range predictability, such as the MJO. Specific diagnostics include tools developed to evaluate the prediction skill of the MJO and associated moist processes and teleconnections, and diagnostics to evaluate the Hadley circulation in deterministic forecasts. In this project, these diagnostics will be applied to individual ensemble members to understand the variability between ensemble members produced from the same forecast model, and the differences between ensembles from different forecast models. Understanding these different types of variability will aid in understanding when and why multi-model ensemble forecasts are superior to single-model ensembles.

Key personnel: Song Yang and Carolyn Reynolds (NRL MRY).
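As a concrete illustration of the per-member analysis described above, the sketch below separates within-model ensemble spread from between-model differences for a scalar diagnostic (e.g., an MJO or teleconnection index). It is a minimal Python/NumPy sketch under an assumed array layout and assumed names; it is not the WU-11 diagnostic code itself.

```python
import numpy as np

def spread_decomposition(index):
    """Separate within-model and between-model variability of a scalar
    forecast index.

    index : array of shape (n_models, n_members, n_leads), e.g. an MJO or
        AO index for each model, ensemble member, and forecast lead time
        (illustrative layout; real archives differ).
    Returns (within_model, between_model), each of shape (n_leads,).
    """
    # Mean over members for each model at each lead time
    model_means = index.mean(axis=1)                 # (n_models, n_leads)

    # Within-model spread: member standard deviation about its own
    # model mean, averaged over models
    within = index.std(axis=1, ddof=1).mean(axis=0)  # (n_leads,)

    # Between-model spread: standard deviation of the model means
    between = model_means.std(axis=0, ddof=1)        # (n_leads,)
    return within, between


if __name__ == "__main__":
    # Synthetic example: 4 models, 10 members each, 32 daily lead times
    rng = np.random.default_rng(0)
    fake_index = rng.standard_normal((4, 10, 32)).cumsum(axis=2) * 0.1
    within, between = spread_decomposition(fake_index)
    print("lead-1 within/between spread:", within[0], between[0])
```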

WORK COMPLETED FOR FY15

We have researched different candidate data sets to determine their suitability for multi-agency intercomparison diagnostics. Each data set has pros and cons for our evaluation purposes:

1) The NOAA High-Impact Weather Prediction Project (HIWPP) data set became available in early 2015. On the positive side, it is multi-model (two NOAA models and NAVGEM) with standardized output, and it allows intercomparison between high-resolution deterministic forecasts and lower-resolution ensembles. On the negative side, the models do not have interactive ice or ocean components, and the forecast integration times are limited (out to 15 days for the ensembles). We have not pursued HIWPP evaluation at this point.

2) North American Multi-model Ensemble (NMME): NMME-1 is not useful for our purposes because the archived fields are monthly averages, but NMME-2 has a potentially extensive suite of daily output from coupled models. The downside of NMME-2 is that center participation is much more limited than for NMME-1, and the available data (fields and dates) vary by center and model. We have opened NMME accounts and downloaded preliminary data.

3) The WMO Year of Tropical Convection/Madden-Julian Oscillation (YOTC/MJO) 20-day forecasts for selected MJO periods of interest are promising. On the positive side, the archive includes both coupled and uncoupled forecasts for periods when the MJO is active. On the negative side, it covers only certain periods of interest during 2009 and 2010, and only deterministic forecasts. We identified several problems with the online archive: the 20-day and 2-day forecast data had been switched (one would download the 2-day data when requesting the 20-day data), and the 20-day forecast data for NAVGEM were not available on the server. We alerted the archivists to these problems and they have since been corrected. We have downloaded this multi-model data set and evaluated and compared these forecasts for teleconnection predictive skill.

4) WMO/WWRP/THORPEX-WCRP Subseasonal to Seasonal Project (S2S): this archive should be ideal for our purposes, containing many coupled ensemble systems. The archive opened in spring 2015 and is currently being populated at different rates for different systems. We have downloaded data for the ECMWF system, learned how to read the data, and have begun our evaluation (a minimal example of reading these fields is sketched below). The ECMWF S2S forecasts extend to 32 days (46 days after May 12, 2015) and are initialized twice per week (Monday and Thursday). The availability of both a control run and ensemble runs makes it possible to determine whether the ensemble forecast improves the extended-range forecasts. Preliminary analysis of the January-April 2015 data, shown in the results section, indicates that the S2S data set is useful for this project because of the long time periods available. Other model outputs will be downloaded for similar analyses soon.

Because of its suitability and availability, our analysis of the archived data has begun with the YOTC/MJO 20-day data, applying diagnostics to compare NAVGEM forecasts of the Arctic Oscillation and Antarctic Oscillation to those of other systems, as described in the results section.
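For reference, the sketch below illustrates one way the downloaded ECMWF S2S fields can be read for these analyses, using xarray with the cfgrib engine. The file name, the variable short name ("gh"), and the dimension names are assumptions made for illustration and should be checked against the actual GRIB files retrieved from the archive.

```python
# Minimal sketch (assumed file and variable names) of reading an S2S GRIB
# file with xarray/cfgrib and extracting 1000-hPa geopotential height for
# all perturbed ensemble members.
import xarray as xr

ds = xr.open_dataset(
    "ecmwf_s2s_pf_20150101.grib",          # hypothetical downloaded file
    engine="cfgrib",
    backend_kwargs={"filter_by_keys": {"typeOfLevel": "isobaricInhPa"}},
)

# "gh" is the typical cfgrib short name for geopotential height; dimension
# names ("number" for ensemble member, "step" for lead time) are typical
# for ECMWF GRIB but should be verified against the actual file.
z1000 = ds["gh"].sel(isobaricInhPa=1000)

print(z1000.dims)   # e.g. ('number', 'step', 'latitude', 'longitude')
print(z1000.sizes)
```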

RESULTS

We have downloaded all of the YOTC/MJO 20-day forecasts and calculated both the Arctic Oscillation (AO) index and the Antarctic Oscillation (AAO) index for each of the forecasts. The AO and AAO are useful for characterizing the large-scale flow in the high latitudes. Large-scale teleconnection patterns such as the AO and AAO are hypothesized to be a source of extended-range predictability and have also been related to the MJO. We leverage the development of the AO and AAO diagnostics under ESPC WU-11.

Figure 1: The AO index from the NCEP analysis (black), ECMWF 20-day forecasts (blue), and NAVGEM 20-day forecasts (red) starting from 20 OCT 2009 (top panel), 25 OCT 2009 (middle panel), and 29 OCT 2009 (bottom panel). Forecasts and analyses are taken from the YOTC/MJO 20-day forecast archive.

Figure 1 shows time series of 20-day forecasts of the AO index from ECMWF (blue) and NAVGEM (red), compared with the AO index calculated from the NCEP analysis (black). The top, middle, and bottom panels are for forecasts started on 20 October, 25 October, and 29 October 2009. In early November, the AO increased rapidly from values near -2 to values greater than +2 over the course of four days. Forecasts started on 20 October, 10 days prior to the rapid AO increase, miss the event. Forecasts started on 25 October, 5 days prior to the event, capture the rapid increase but miss the subsequent decrease. Forecasts started on 29 October, just prior to the event, capture both the rapid increase and the subsequent decrease (though the decrease is too weak in the NAVGEM forecasts).
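The AO and AAO indices in these comparisons are derived from geopotential height fields (the AO from 1000-hPa height, as noted in the discussion of Fig. 2). A minimal sketch of one common projection-based way to compute such an index is given below; the loading pattern, file names, and normalization constant are assumptions for illustration, and the WU-11 diagnostics may differ in detail.

```python
# Sketch: project daily 1000-hPa geopotential height anomalies (poleward
# of 20N) onto a prescribed loading pattern (e.g., the leading EOF of
# height anomalies) and standardize the result.
import numpy as np
import xarray as xr

def projection_index(z1000_anom, loading, std_ref):
    """Project height anomalies onto a loading pattern.

    z1000_anom : DataArray (time, lat, lon) of height anomalies
    loading    : DataArray (lat, lon) loading (EOF) pattern
    std_ref    : scalar used to standardize the raw projection
    """
    # Area weighting by cos(latitude) so polar grid points are not overcounted
    weights = np.cos(np.deg2rad(z1000_anom["lat"]))
    proj = (z1000_anom * loading * weights).sum(dim=("lat", "lon"))
    return proj / std_ref

# Hypothetical usage (file names and std_ref are placeholders):
# loading = xr.open_dataarray("ao_loading_pattern.nc")
# anom = forecast_z1000 - climatology_z1000
# ao_index = projection_index(anom, loading, std_ref=120.0)
```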

Figure 2: The AO index as computed from the NCEP analysis (solid black curve), ECMWF reanalysis (dashed black curve), and the YOTC/MJO 20-day forecasts (thin colored curves) starting from 30 DEC 2009. The thick red curve shows the ensemble-mean AO. The top left panel includes forecasts from all 13 forecast systems. The top right panel includes only the four forecast systems that produced the most skillful MJO forecasts in the inter-comparison. The bottom left panel includes only the four forecast systems that produced the least skillful MJO forecasts.

Figure 2 shows time series of 20-day forecasts of the AO index from the different systems participating in the YOTC/MJO inter-comparison (thin colored curves). Also included is the AO index computed from the NCEP analysis (solid black curve) and the ECMWF analysis (dashed black curve). The ensemble-mean AO is indicated by the thick red curve. The upper left panel includes results from all 13 forecast systems, the upper right panel includes only the four systems with the most skillful MJO forecasts, and the lower left panel includes only the four systems with the least skillful MJO forecasts.

There are several points to notice in Fig. 2. The first is that the AO values produced from the ECMWF and NCEP analyses (black solid and dashed curves) differ. Small differences are not unexpected, as the AO is calculated using 1000-hPa geopotential height and analyses from different systems will not be identical, but the magnitude of these differences can be greater than 0.5 on the AO index scale. These differences are not small compared to typical AO index forecast errors at early lead times, and therefore care must be taken when using a particular analysis as verification. The initial spread of the forecast-system AO index values is another indication of uncertainty in that field, and illustrates system dependence in a derived field such as geopotential height, despite the fact that all forecasts started from a prescribed analysis. The second point is that the ensemble mean (thick red curve) tends to outperform any individual ensemble member. This is not an unexpected result and is consistent with findings on the skill of multi-model ensemble means for many different applications. The third point is that there is not a clear relationship between skillful MJO forecasts and skillful AO forecasts for this particular case: the individual ensemble forecasts and ensemble means for the least-skillful and most-skillful MJO forecast systems are similar. Qualitative evaluation of the other 20-day forecasts also does not indicate a clear relationship between MJO skill and AO skill, but we will evaluate these potential relationships quantitatively in future work.

Figure 3: Example of the ECMWF AO and AAO index 32-day ensemble forecast from 1 JAN 2015, calculated using fields from the S2S database. The control run and the first 10 members of the ensemble run are presented. AO_ob and AAO_ob (thick black curves) are from the NCEP analysis. AO_mean and AAO_mean (thick red curves) are the ensemble means. The heavy magenta dots (AO_ec and AAO_ec) are the AO and AAO at ECMWF analysis times.

Figure 3 is an example of the ECMWF AO and AAO index 32-day ensemble forecast from 1 JAN 2015, calculated using fields downloaded from the S2S database. The NCEP analysis AO and AAO indices (thick black curves) are also shown. The ECMWF analysis AO and AAO (available twice a week, shown as magenta dots) are clearly different from the NCEP analysis values, especially for the AAO, again illustrating the sensitivity of the verification to the particular analysis chosen. The ensemble-mean forecasts (thick red curves) are superior to the control forecasts (thin solid blue curves) at longer lead times, especially for the AAO. These results are consistent with the expected superior performance of the ensemble mean over individual forecasts, though verification over many cases, planned for next year, is needed to quantify this improvement.
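A minimal sketch of the kind of quantitative verification planned for next year is shown below: root-mean-square error of the ensemble-mean and control index forecasts against an analysis-based index, aggregated over many start dates. Array names and shapes are assumptions for illustration.

```python
import numpy as np

def rmse_vs_lead(forecast, verification):
    """Root-mean-square error as a function of lead time.

    forecast     : array (n_cases, n_leads) of forecast index values
    verification : array (n_cases, n_leads) of analysis-based index values
    """
    return np.sqrt(np.mean((forecast - verification) ** 2, axis=0))

# Hypothetical usage over many start dates:
# members shape (n_cases, n_members, n_leads); control shape (n_cases, n_leads)
# ens_mean = members.mean(axis=1)
# print("ensemble-mean RMSE:", rmse_vs_lead(ens_mean, analysis_ao))
# print("control RMSE:      ", rmse_vs_lead(control, analysis_ao))
```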

IMPACT/APPLICATIONS

These experiments allow for an assessment of the potential utility of extended-range multi-model ensemble forecasts and will allow the Navy ESPC coupled system to be assessed against forecasts from other systems. Development of a skillful coupled ensemble forecast system will allow for DoD decision support on monthly time scales as well as improved capabilities reliant upon extended-range forecasts, such as trans-ocean ship routing.

TRANSITIONS

The potential for extended-range forecasting demonstrated in this program will be transitioned to operations through existing and future 6.4 programs.

RELATED PROJECTS

This project leverages the diagnostic tools developed under ESPC WU-11, Integrated Skill Diagnostics. It analyzes interagency multi-model ensembles produced and/or archived under other projects, including the High-Impact Weather Prediction Project (HIWPP), the North American Multi-model Ensemble (NMME), the WMO Year of Tropical Convection/Madden-Julian Oscillation (YOTC/MJO) forecast archive, and the WMO/WWRP/THORPEX-WCRP Subseasonal to Seasonal Project (S2S).