Performance of Multi-Model Ensemble (MME) Seasonal Prediction at APEC Climate Center (APCC) During 2008


2008/SOM3/ISTWG/SYM/034
Agenda Item: 4

Performance of Multi-Model Ensemble (MME) Seasonal Prediction at APEC Climate Center (APCC) During 2008

Submitted by: APEC Climate Center (APCC)

APEC Climate Symposium
Lima, Peru, 19-21 August 2008

APEC CLIMATE CENTER
4th Working Group Meeting / Science Advisory Committee Meeting
CLIMATE PREDICTION AND APPLICATION TACKLING REGIONAL NEEDS
APCC/WG/Doc. 3.3(1) (22.VII.2008)
Item: 3.3
ENGLISH ONLY
Lima, Republic of Peru, 19-21 August 2008

PERFORMANCE OF MME SEASONAL PREDICTION AT APCC DURING 2008
(Submitted by APEC Climate Center)

Summary and Purpose of Document

This document discusses some outputs from the APCC's MME forecasts since 2007 SON.

ACTION PROPOSED

The WG participants are invited to peruse the document and give their feedback, along with suggestions, if any, on the performance of the MME system since 2007. General comments from the other WG members are also welcome.

References: APCC forecasts and other products (see http://www.apcc21.net)

3.3 PERFORMANCE OF MME SEASONAL PREDICTION AT APCC DURING 2008

3.3.1 Forecast Verification

The results of forecast verification against observations are shown here for 2007 SON to 2008 MAM, the seasons since the last APCC Symposium. Pattern anomaly correlation coefficients (ACC) are used to evaluate MME prediction performance over the globe, East Asia and Australia, as samples of the performance (performance for other regions can be assessed from our webpage). The forecasts from four deterministic MME schemes are evaluated.

In 2007 SON, the MME forecasts for both precipitation and T850 generally show stable forecast skill, except for the MRG scheme in East Asia. The forecast skill of the SSE scheme is superior to that of the other MME schemes for precipitation, while the SCM scheme is better than the others for temperature. In 2007 DJF, the MME forecasts for precipitation demonstrate quite good prediction skill globally and over Australia. It is interesting that the forecast skill for precipitation by these MME schemes is superior to that for temperature. In 2008 MAM, the MME schemes show very good forecast skill for both precipitation and temperature globally and over East Asia. Over Australia, the MME schemes still show high forecast skill for precipitation but lower skill for temperature.

In general, the MME schemes have generated skilful forecasts for precipitation from 2007 DJF to 2008 MAM, and skilful forecasts for temperature from 2008 JFM to 2008 MAM.
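The pattern ACC used for this evaluation is not written out in the document; the sketch below shows one common formulation, an area-weighted anomaly correlation over a region, as a minimal illustration. It assumes forecast and observed anomaly fields on a regular latitude-longitude grid, and the function and variable names are illustrative rather than the operational APCC code.

```python
# Minimal sketch of a pattern anomaly correlation coefficient (ACC).
# Assumes 2-D forecast and observed anomaly fields (departures from
# climatology) on a regular latitude-longitude grid; illustrative only.
import numpy as np

def pattern_acc(fcst_anom, obs_anom, lats):
    """Area-weighted pattern anomaly correlation over a region.

    fcst_anom, obs_anom : arrays of shape (nlat, nlon)
    lats                : latitudes in degrees for the nlat rows
    """
    w = np.cos(np.deg2rad(lats))[:, None] * np.ones_like(obs_anom)
    f = fcst_anom - np.average(fcst_anom, weights=w)  # remove area-mean anomaly
    x = obs_anom - np.average(obs_anom, weights=w)
    return np.sum(w * f * x) / np.sqrt(np.sum(w * f**2) * np.sum(w * x**2))
```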

a. 2007 SON [figure]

b. 2007 DJF [figure]

c. 2008 JFM [figure]

d. 2008 FMA [figure]

e. 2008 MAM [figure]

3.3.2 Hindcast Verification

In order to evaluate the deterministic MME hindcast, three skill scores are used in this section. A detailed description of the mean squared skill score (MSSS) is provided by WMO (2002), so only a brief description is presented here.

Let x_{ij} and f_{ij} (i = 1, ..., n) denote time series of observations and continuous deterministic forecasts, respectively, for a grid point or station j over the period of verification (POV). Their averages over the POV, \bar{x}_j and \bar{f}_j, and their sample variances s_{xj}^2 and s_{fj}^2, are given by:

\bar{x}_j = \frac{1}{n}\sum_{i=1}^{n} x_{ij}, \qquad \bar{f}_j = \frac{1}{n}\sum_{i=1}^{n} f_{ij}

s_{xj}^2 = \frac{1}{n}\sum_{i=1}^{n}\left(x_{ij}-\bar{x}_j\right)^2, \qquad s_{fj}^2 = \frac{1}{n}\sum_{i=1}^{n}\left(f_{ij}-\bar{f}_j\right)^2

The mean squared error of the forecasts is:

\mathrm{MSE}_j = \frac{1}{n}\sum_{i=1}^{n}\left(f_{ij}-x_{ij}\right)^2

For the case of cross-validated (see Section 3.4) POV climatology forecasts, where forecast/observation pairs are reasonably temporally independent of each other (so that only one year at a time is withheld), the mean squared error of climatology forecasts (Murphy, 1988) is:

\mathrm{MSE}_{cj} = \left(\frac{n}{n-1}\right)^2 s_{xj}^2

The mean squared skill score (MSSS) for point j is defined as one minus the ratio of the mean squared error of the forecasts to the mean squared error of climatology forecasts:

\mathrm{MSSS}_j = 1 - \frac{\mathrm{MSE}_j}{\mathrm{MSE}_{cj}}

For the three domains described in Sec. 3.1.1 it is recommended that an overall MSSS be provided. This is computed as:

\mathrm{MSSS} = 1 - \frac{\sum_j w_j\,\mathrm{MSE}_j}{\sum_j w_j\,\mathrm{MSE}_{cj}}
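As a concrete illustration of these definitions, the following sketch computes the per-point MSE_j, the cross-validated climatology error MSE_cj, MSSS_j and the overall cos(latitude)-weighted MSSS for gridded hindcasts. It is a minimal sketch under the conventions above, not the operational APCC verification code, and the names are illustrative.

```python
# Sketch of MSSS following the definitions above: per-point MSE_j,
# cross-validated climatology MSE_cj = (n/(n-1))**2 * s_xj**2 (1/n variances),
# MSSS_j = 1 - MSE_j/MSE_cj, and a cos(latitude)-weighted overall MSSS.
# Illustrative only.
import numpy as np

def msss_fields(fcst, obs, lats):
    """fcst, obs : arrays (n_years, nlat, nlon); lats : latitudes in degrees."""
    n = fcst.shape[0]
    mse = np.mean((fcst - obs) ** 2, axis=0)               # MSE_j
    s2x = np.mean((obs - obs.mean(axis=0)) ** 2, axis=0)   # s_xj^2
    mse_c = (n / (n - 1)) ** 2 * s2x                       # MSE_cj
    msss_j = 1.0 - mse / mse_c                             # per-point MSSS_j
    w = np.cos(np.deg2rad(lats))[:, None] * np.ones_like(mse)
    msss_all = 1.0 - np.sum(w * mse) / np.sum(w * mse_c)   # overall MSSS
    return msss_j, msss_all
```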

where w_j is unity for verifications at stations, and equals \cos(\theta_j), \theta_j being the latitude at grid point j, for verifications on latitude-longitude grids. For either \mathrm{MSSS}_j or \mathrm{MSSS}, a corresponding root mean squared skill score (RMSSS) can be obtained easily from:

\mathrm{RMSSS} = 1 - \left(1 - \mathrm{MSSS}\right)^{1/2}

\mathrm{MSSS}_j for fully cross-validated forecasts (with one year at a time withheld) can be expanded (Murphy, 1988) as:

\mathrm{MSSS}_j = \frac{2\,\frac{s_{fj}}{s_{xj}}\, r_{fxj} - \left(\frac{s_{fj}}{s_{xj}}\right)^2 - \left(\frac{\bar{f}_j-\bar{x}_j}{s_{xj}}\right)^2 + \frac{2n-1}{(n-1)^2}}{1 + \frac{2n-1}{(n-1)^2}}

where r_{fxj} is the product-moment correlation of the forecasts and observations at point or station j:

r_{fxj} = \frac{\frac{1}{n}\sum_{i=1}^{n}\left(f_{ij}-\bar{f}_j\right)\left(x_{ij}-\bar{x}_j\right)}{s_{fj}\, s_{xj}}

The first three terms of the decomposition of \mathrm{MSSS}_j are related to phase errors (through the correlation), amplitude errors (through the ratio of the forecast to observed variances) and overall bias error, respectively, of the forecasts. These terms provide the opportunity for those wishing to use the forecasts as input into regional and local forecasts to adjust or weight the forecasts as they deem appropriate. The last term takes into account the fact that the climatology forecasts are cross-validated as well.

A recommended skill score for categorical deterministic forecasts is the Gerrity Skill Score (GSS). If x_i and f_i now denote an observation and the corresponding forecast of category i (i = 1, ..., 3), let n_{ij} be the count of instances with forecast category i and observed category j. The full contingency table is defined by the nine n_{ij}. Graphically, the nine cell counts are usually arranged with the forecasts defining the table rows and the observations the table columns:
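The decomposition can also be evaluated directly; the sketch below computes the phase, amplitude, bias and cross-validation terms for a single point and returns the same value as 1 - MSE_j/MSE_cj above (again a minimal illustration with assumed names, not operational code).

```python
# Sketch of the Murphy (1988) decomposition of MSSS_j for one point,
# using the 1/n sample-variance convention of the definitions above.
import numpy as np

def msss_decomposition(f, x):
    """f, x : 1-D arrays of forecasts and observations over the POV."""
    n = len(x)
    sf, sx = f.std(), x.std()                    # 1/n standard deviations
    r = np.mean((f - f.mean()) * (x - x.mean())) / (sf * sx)
    phase = 2.0 * (sf / sx) * r                  # phase (correlation) term
    amplitude = -((sf / sx) ** 2)                # amplitude (variance-ratio) term
    bias = -(((f.mean() - x.mean()) / sx) ** 2)  # overall bias term
    cv = (2 * n - 1) / (n - 1) ** 2              # cross-validation term
    return (phase + amplitude + bias + cv) / (1.0 + cv)
```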

Table 1: General three-by-three contingency table (forecasts in rows, observations in columns).

                          Observations
                  Below Normal  Near Normal  Above Normal  Row total
Forecasts
  Below Normal        n_11          n_12          n_13        n_1.
  Near Normal         n_21          n_22          n_23        n_2.
  Above Normal        n_31          n_32          n_33        n_3.
  Column total        n_.1          n_.2          n_.3          T

In Table 1, n_{i.} and n_{.i} represent the row and column sums respectively, and T is the total number of cases.

Generally, at least about 90 forecast/observation pairs are required to estimate a three-by-three contingency table properly. It is therefore recommended that the provided tables be aggregated by users over windows of target periods, such as several adjacent months or overlapping three-month periods, or over verification points. In the latter case the weights W_i should be used when summing n_{ij} over different points i (see discussion on Table 4). W_i is defined as:

W_i = 1 when verification is done at stations or at single grid points within a 10-degree box;
W_i = \cos(\theta_i) at grid point i when verification is done on a grid, where \theta_i is the latitude at grid point i.

On a 2.5-degree latitude-longitude grid the minimally acceptable sample is easily attained, even with a record as short as n = 10, by aggregating over all grid points within a 10-degree box. Alternatively, an adequate sample can be achieved in this case by aggregating over three adjacent months or overlapping three-month periods and within a 5-degree box. Regardless, scores derived from any contingency table should be accompanied by error bars, confidence intervals or a level of significance. Contingency tables such as the one in Table 3 are mandatory for level 3 verification in the core SVS.

The relative sample frequencies p_{ij} are defined as the ratios of the cell counts to the total number of forecast/observation pairs N (n is reserved to denote the length of the POV):

p_{ij} = \frac{n_{ij}}{N}
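For the categorical verification, the sketch below accumulates the three-by-three contingency table of Table 1 from tercile category labels (1 = below, 2 = near, 3 = above normal), with optional weights W_i such as cos(latitude) when aggregating over grid points. It is a minimal illustration with assumed names.

```python
# Sketch: build a 3x3 contingency table (forecast rows, observation columns)
# from tercile category labels 1..3, with optional weights W_i
# (e.g. cos(latitude)) when aggregating over grid points. Illustrative only.
import numpy as np

def contingency_table(fcst_cat, obs_cat, weights=None):
    """fcst_cat, obs_cat : integer category arrays with values 1, 2 or 3."""
    fcst_cat = np.ravel(fcst_cat)
    obs_cat = np.ravel(obs_cat)
    w = np.ones(fcst_cat.shape) if weights is None else np.ravel(weights)
    table = np.zeros((3, 3))
    for i in range(1, 4):          # forecast category (row)
        for j in range(1, 4):      # observed category (column)
            table[i - 1, j - 1] = w[(fcst_cat == i) & (obs_cat == j)].sum()
    return table
```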

The sample probability distributions of forecasts and observations, respectively, then become:

\hat{p}(f_i) = \sum_{j=1}^{3} p_{ij}, \qquad i = 1, \ldots, 3

\hat{p}(x_i) = \sum_{j=1}^{3} p_{ji}, \qquad i = 1, \ldots, 3

A recommended skill score for the three-by-three table, which has many desirable properties and is easy to compute, is the Gerrity Skill Score (GSS). The definition of the score uses a scoring matrix s_{ij} (i, j = 1, ..., 3), which is a tabulation of the reward or penalty accorded to every forecast/observation outcome represented by the contingency table:

\mathrm{GSS} = \sum_{i=1}^{3}\sum_{j=1}^{3} p_{ij}\, s_{ij}

The scoring matrix is given by:

s_{ii} = \frac{1}{2}\left(\sum_{r=1}^{i-1} a_r^{-1} + \sum_{r=i}^{2} a_r\right)

s_{ij} = \frac{1}{2}\left(\sum_{r=1}^{i-1} a_r^{-1} - (j-i) + \sum_{r=j}^{2} a_r\right), \qquad 1 \le i < j \le 3

where

a_i = \frac{1 - \sum_{r=1}^{i} \hat{p}(x_r)}{\sum_{r=1}^{i} \hat{p}(x_r)}

Note that GSS is computed using the sample probabilities, not those on which the original categorizations were based (i.e. 0.33, 0.33, 0.33).

The spatial distributions of MSSS, correlation and GSS for both precipitation and 850 hPa temperature (T850) are shown below for the seasons FMA, MAM, AMJ, MJJ, JJA and JAS.
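Putting the pieces together, this last sketch computes the Gerrity Skill Score from a contingency table such as the one built above, using the observed sample probabilities for the a_r ratios; it is a minimal illustration of the formulas, not the APCC implementation.

```python
# Sketch of the Gerrity Skill Score (GSS) for a 3x3 contingency table,
# following the scoring-matrix formulas above. Illustrative only.
import numpy as np

def gerrity_skill_score(table):
    """table : 3x3 array of (possibly weighted) counts, forecasts in rows."""
    K = 3
    p = table / table.sum()              # relative sample frequencies p_ij
    p_obs = p.sum(axis=0)                # observed category probabilities
    cum = np.cumsum(p_obs)[:K - 1]       # cumulative observed probabilities
    a = (1.0 - cum) / cum                # a_r, r = 1, 2
    b = 1.0 / (K - 1)
    s = np.zeros((K, K))
    for i in range(K):
        for j in range(i, K):            # scoring matrix (symmetric)
            s[i, j] = b * (np.sum(1.0 / a[:i]) - (j - i) + np.sum(a[j:]))
            s[j, i] = s[i, j]
    return np.sum(p * s)
```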

In boreal spring, the MME prediction for T850 performs quite well in the tropical regions. It also has quite good skill over most of the North American continent, Siberia, southern Africa, western Europe and the Middle East. However, the MME prediction for precipitation has only marginal skill in parts of the equatorial belt, such as the Indochina peninsula, northern Borneo, the Philippines, northwestern Australia, northern Brazil, Mexico and the western coast of the US, western equatorial Africa, and part of the Middle East.

In boreal summer, the MME predicts T850 well in Mongolia, northeastern China, the whole Indian peninsula, the Indochina peninsula, the eastern Maritime Continent, western Australia, equatorial America and the northern part of South America, western and eastern Canada, eastern tropical Africa and northwestern Africa. The MME predictions for precipitation, however, have only marginal skill over the Maritime Continent and northeastern Brazil.

The skill scores for temperature prediction are, in general, better than those for precipitation in both boreal spring and summer. It is also found that the prediction skill for temperature is better in boreal summer than in boreal spring, while the prediction skill for precipitation is better in boreal spring than in boreal summer.

a. FMA [figure]


Gerrity Skill Score [figure]

b. MAM [figure]


Gerrity Skill Score [figure]

c.amj WG 3.3(1) 19/30


Gerrity Skill Score [figure]

d. MJJ [figure]


Gerrity Skill Score [figure]

e. JJA [figure]


Gerrity Skill Score [figure]

f. JAS [figure]


Gerrity Skill Score [figure]