Evaluating weather and climate forecasts
1 Evaluating weather and climate forecasts Chris Ferro Department of Mathematics University of Exeter, UK RSS Highlands Local Group and University of St Andrews (St Andrews, 1 September 2016)
2 Monitoring forecast performance (8 August 2016)
3 Informing forecast development (8 August 2016)
4 Guiding responses to forecasts Figure 9.7 Relative error measures of CMIP5 model performance, based on the global seasonal-cycle climatology ( ) computed from the historical experiments. Rows and columns represent individual variables and models, respectively. The error measure is a space-time root-mean-square error (RMSE), which, treating each variable separately, is portrayed as a relative error by normalizing the result by the median error of all model results (Gleckler et al., 2008). For example, a value of 0.20 indicates that a model's RMSE is 20% larger than the median CMIP5 error for that variable, whereas a value of -0.20 means the error is 20% smaller than the median error. No colour (white) indicates that model results are currently unavailable. A diagonal split of a grid square shows the relative error with respect to both the default reference data set (upper left triangle) and the alternate (lower right triangle). The relative errors are calculated independently for the default and alternate data sets. All reference data used in the diagram are summarized in Table 9.3. IPCC AR5 Chapter 9
5 Introduction: forecasts and their evaluation Probability forecasts: proper scoring rules Evaluation when truth is uncertain: unbiased scoring rules Ensemble forecasts: fair scoring rules Conclusion
6 Types of forecast (8 August 2016)
7 Probabilities and ensembles Definition: A probability forecast is a probability distribution for a predictand. Definition: An ensemble forecast is a set of (equally likely) possible values for a predictand. Weather/climate models produce ensembles, not probabilities. Ensembles are produced by running a model several times with perturbed initial conditions and/or perturbed model parameters. Post-processing turns ensembles into probability forecasts.
8 An ensemble forecast (11 August 2016)
9 Use scoring rules to measure forecast performance Definition: A scoring rule, s, is a real-valued function of a forecast, f, and an outcome, x. Calculate the score, s(f, x), for each forecast and measure performance by the mean score, e.g. mean squared error. We use negatively-oriented scoring rules: lower scores indicate better forecasts.
10 Use scoring rules to measure forecast performance Definition: A scoring rule, s, is a real-valued function of a forecast, f, and an outcome, x. Calculate the score, s(f, x), for each forecast and measure performance by the mean score, e.g. mean squared error. We use negatively-oriented scoring rules: lower scores indicate better forecasts. Other measures are prone to inflation due to trends. Example: correlation = 0.87
11 Use scoring rules to measure forecast performance Definition: A scoring rule, s, is a real-valued function of a forecast, f, and an outcome, x. Calculate the score, s(f, x), for each forecast and measure performance by the mean score, e.g. mean squared error. We use negatively-oriented scoring rules: lower scores indicate better forecasts. Post-processing models, x ~ f(x; w, θ) for ensemble w and parameter θ, can be fitted by minimum score estimation: θ̂ := argmin_θ Σ_{j=1}^n s(f_j, x_j), where f_j(x) := f(x; w_j, θ).
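As a toy illustration of minimum score estimation (the data, the additive-bias model and the grid search are all invented for this sketch; real post-processing would use a richer model and a numerical optimiser):

```python
# Toy minimum-score estimation: fit an additive bias correction theta so
# that the post-processed forecast mean is (ensemble mean + theta), scored
# by the squared error of that mean. Data are invented for the sketch.
ensembles = [[2.0, 3.0, 4.0], [5.0, 6.0, 7.0], [1.0, 2.0, 3.0]]
outcomes = [4.0, 7.0, 3.0]  # each outcome sits exactly 1 above the ensemble mean

def mean_score(theta):
    total = 0.0
    for w, x in zip(ensembles, outcomes):
        f_mean = sum(w) / len(w) + theta  # post-processed forecast mean
        total += (f_mean - x) ** 2        # squared-error score s(f_j, x_j)
    return total / len(outcomes)

# crude grid search standing in for the argmin over theta
thetas = [i / 100 for i in range(-300, 301)]
theta_hat = min(thetas, key=mean_score)
print(theta_hat)  # 1.0: the mean score is minimised at the true bias
```
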
12 Introduction: forecasts and their evaluation Probability forecasts: proper scoring rules Evaluation when truth is uncertain: unbiased scoring rules Ensemble forecasts: fair scoring rules Conclusion
13 Use proper scoring rules for probability forecasts Definition: A scoring rule is proper if the expected score, E_x{s(f, x)}, over any distribution, p, for x, is optimized at f = p.
14 Use proper scoring rules for probability forecasts Definition: A scoring rule is proper if the expected score, E_x{s(f, x)}, over any distribution, p, for x, is optimized at f = p. Example: Let x ∈ {0, 1}, p := Pr(x = 1) and f ∈ [0, 1]. s_1(f, x) := (f - x)^2 is proper since E_x{s_1(f, x)} = (f - p)^2 + p(1 - p) is minimized at f = p.
15 Use proper scoring rules for probability forecasts Definition: A scoring rule is proper if the expected score, E_x{s(f, x)}, over any distribution, p, for x, is optimized at f = p. Example: Let x ∈ {0, 1}, p := Pr(x = 1) and f ∈ [0, 1]. s_1(f, x) := (f - x)^2 is proper since E_x{s_1(f, x)} = (f - p)^2 + p(1 - p) is minimized at f = p. s_2(f, x) := |f - x| is improper since E_x{s_2(f, x)} = p + (1 - 2p)f is minimized at f = 0 if p < 1/2 and at f = 1 if p > 1/2.
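The binary example can be checked numerically; this small sketch (not from the talk) searches a grid of forecast values for the minimiser of each expected score:

```python
# Expected scores for a Bernoulli(p) outcome under the two scoring rules:
# s1 (the Brier score) should be minimised at f = p, s2 at 0 or 1.
def expected_score(score, p, f):
    return p * score(f, 1) + (1 - p) * score(f, 0)

s1 = lambda f, x: (f - x) ** 2
s2 = lambda f, x: abs(f - x)

p = 0.3
grid = [k / 100 for k in range(101)]
f1 = min(grid, key=lambda f: expected_score(s1, p, f))
f2 = min(grid, key=lambda f: expected_score(s2, p, f))
print(f1, f2)  # 0.3 0.0: s1 rewards honesty, s2 rewards hedging to 0
```
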
16 Good forecasts are calibrated and sharp p_f := conditional distribution of outcomes given forecast f Definition: Forecasts are calibrated if p_f = f for all f.
17 Good forecasts are calibrated and sharp p_f := conditional distribution of outcomes given forecast f Definition: Forecasts are calibrated if p_f = f for all f. Definition: Forecasts are sharp if p_f is concentrated for all f.
18 Good forecasts are calibrated and sharp p_f := conditional distribution of outcomes given forecast f Definition: Forecasts are calibrated if p_f = f for all f. Definition: Forecasts are sharp if p_f is concentrated for all f. Let x ∈ {0, 1}, f ∈ [0, 1] and p_f := Pr(x = 1 | f). Plot the graph of p_f and the distribution of f.
19 Proper scoring rules favour calibrated, sharp forecasts Let S(f, p) := E_x{s(f, x)} when x ~ p, so that E_{f,x}{s(f, x)} = E_f[E_{x|f}{s(f, x) | f}] = E_f{S(f, p_f)}, where p_f denotes the conditional distribution of x given f, and S(f, p_f) = S(f, p_f) - S(p_f, p_f) + S(p_f, p_f) =: C(f, p_f) + R(p_f).
20 Proper scoring rules favour calibrated, sharp forecasts Let S(f, p) := E_x{s(f, x)} when x ~ p, so that E_{f,x}{s(f, x)} = E_f[E_{x|f}{s(f, x) | f}] = E_f{S(f, p_f)}, where p_f denotes the conditional distribution of x given f, and S(f, p_f) = S(f, p_f) - S(p_f, p_f) + S(p_f, p_f) =: C(f, p_f) + R(p_f). Let s be proper, so that S(f, p) ≥ S(p, p) for all f and p. C measures calibration: C(f, p_f) ≥ 0, with equality when p_f = f. R measures sharpness: R is concave. Example: Let x ∈ {0, 1}, f ∈ [0, 1] and s(f, x) := (f - x)^2. Then C(f, p_f) = (f - p_f)^2 and R(p_f) = p_f(1 - p_f).
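A quick numerical check of the Brier-score decomposition at a single forecast value (the values of f and p_f are chosen arbitrarily):

```python
# Brier-score check of S(f, p_f) = C(f, p_f) + R(p_f) at one point.
def S(f, p):  # expected Brier score when x ~ Bernoulli(p)
    return p * (f - 1) ** 2 + (1 - p) * f ** 2

f, p_f = 0.7, 0.5
C = S(f, p_f) - S(p_f, p_f)  # calibration term
R = S(p_f, p_f)              # sharpness term
# C equals (f - p_f)^2 = 0.04 and R equals p_f(1 - p_f) = 0.25, up to rounding
print(C, R)
```
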
21 Windspeed forecasts 36h-ahead forecasts of 80 m windspeed at Lerwick for n = 580 days from . Let x = 1 if windspeed exceeds 15 m s^-1. Score: n^-1 Σ_{j=1}^n (f_j - x_j)^2 = 0.14 (0.01)
22 Windspeed forecasts 36h-ahead forecasts of 80 m windspeed at Lerwick for n = 580 days from . Let x = 1 if windspeed exceeds 15 m s^-1. Score: n^-1 Σ_{j=1}^n (f_j - x_j)^2 = 0.14 (0.01) Forecasts take values p_k = k/12 for k = 0, 1, ..., 12. Let n_k be the number of times the forecast is p_k and let x̄_k be the mean of the corresponding outcomes. Calibration: n^-1 Σ_k n_k (p_k - x̄_k)^2 = 0.02 Sharpness: n^-1 Σ_k n_k x̄_k (1 - x̄_k) = 0.12
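The binned decomposition can be verified on synthetic data (Bernoulli outcomes invented for the sketch; these are not the Lerwick forecasts). For binary outcomes the identity mean Brier score = calibration + sharpness holds exactly:

```python
import random
random.seed(1)

# Synthetic check: forecasts on the grid p_k = k/12, outcomes drawn as
# Bernoulli(p_k), then the mean Brier score versus the sum of the binned
# calibration and sharpness terms (an exact algebraic identity).
n = 5000
forecasts, outcomes = [], []
for _ in range(n):
    f = random.randrange(13) / 12
    forecasts.append(f)
    outcomes.append(1 if random.random() < f else 0)

score = sum((f - x) ** 2 for f, x in zip(forecasts, outcomes)) / n
cal = sha = 0.0
for k in range(13):
    sub = [x for f, x in zip(forecasts, outcomes) if f == k / 12]
    if sub:
        xbar = sum(sub) / len(sub)
        cal += len(sub) * (k / 12 - xbar) ** 2 / n  # calibration term
        sha += len(sub) * xbar * (1 - xbar) / n     # sharpness term
print(abs(score - (cal + sha)) < 1e-9)  # True: the two sides agree
```
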
23 Windspeed forecasts: calibration diagram
24 Introduction: forecasts and their evaluation Probability forecasts: proper scoring rules Evaluation when truth is uncertain: unbiased scoring rules Ensemble forecasts: fair scoring rules Conclusion
25 Observation error x = true value of predicted quantity y = observed value of predicted quantity f = forecast probability distribution for x s = scoring rule: assigns score s(f, y) to f
26 Observation error x = true value of predicted quantity y = observed value of predicted quantity f = forecast probability distribution for x s = scoring rule: assigns score s(f, y) to f Observation error causes two problems. 1. Proper scoring rules favour good forecasts of y, not of x. Proper scores can favour forecasters who issue worse forecasts of the truth. 2. Scores s(f, y) are biased estimates of s(f, x). Changes in proper scores over time can be due to changes in observation quality.
27 Observation error Example: Let x, y ∈ {0, 1}, f ∈ [0, 1], s(f, y) = (f - y)^2. Denote the observation error probabilities by r_x := Pr(y ≠ x | x). Then E_y{s(f, y)} = (f - q)^2 + q(1 - q), where q = (1 - r_1)p + r_0(1 - p) when Pr(x = 1) = p. Graphs of E_y{s(f, y)} against f when p = 0.1 and the observation error probabilities are r_0 = r_1 = 0 (solid line) and r_0 = r_1 = 0.1, ..., 0.5 (dashed). Dots mark the minima.
28 Proper, unbiased scoring rules x = true value of predicted quantity y = observed value of predicted quantity r = conditional distribution of y given x s = scoring rule: assigns score s(f, y) to f
29 Proper, unbiased scoring rules x = true value of predicted quantity y = observed value of predicted quantity r = conditional distribution of y given x s = scoring rule: assigns score s(f, y) to f Definition: The scoring rule is proper under r if E_y{s(f, y)}, calculated for any distribution, p, for x, is optimized at f = p.
30 Proper, unbiased scoring rules x = true value of predicted quantity y = observed value of predicted quantity r = conditional distribution of y given x s = scoring rule: assigns score s(f, y) to f Definition: The scoring rule is proper under r if E_y{s(f, y)}, calculated for any distribution, p, for x, is optimized at f = p. Definition: The scoring rule is unbiased for s_0 under r if E_{y|x}{s(f, y) | x} = s_0(f, x) for all f and x.
31 Proper, unbiased scoring rules x = true value of predicted quantity y = observed value of predicted quantity r = conditional distribution of y given x s = scoring rule: assigns score s(f, y) to f Definition: The scoring rule is proper under r if E_y{s(f, y)}, calculated for any distribution, p, for x, is optimized at f = p. Definition: The scoring rule is unbiased for s_0 under r if E_{y|x}{s(f, y) | x} = s_0(f, x) for all f and x. A scoring rule, s, has both properties if it is unbiased under r for a proper scoring rule, s_0.
32 Binary predictands Let x, y ∈ {0, 1}, f ∈ [0, 1], r_x := Pr(y ≠ x | x). Proposition: Let s_0 be a (non-trivial, proper) scoring rule. There exists a scoring rule, s, that is unbiased for s_0 under r if and only if r_0 + r_1 ≠ 1, in which case s(f, y) = s_0(f, y) + r_y {s_0(f, y) - s_0(f, 1 - y)} / (1 - r_0 - r_1).
33 Binary predictands Let x, y ∈ {0, 1}, f ∈ [0, 1], r_x := Pr(y ≠ x | x). Proposition: Let s_0 be a (non-trivial, proper) scoring rule. There exists a scoring rule, s, that is unbiased for s_0 under r if and only if r_0 + r_1 ≠ 1, in which case s(f, y) = s_0(f, y) + r_y {s_0(f, y) - s_0(f, 1 - y)} / (1 - r_0 - r_1). Example: Artificial aircraft icing data (after Briggs et al. 2005, Monthly Weather Review): 1000 forecasts of f = Pr(icing), error probabilities r_0 = r_1 = 0.2, scoring rule s_0(f, x) = (f - x)^2. Mean scores: s_0(f, x) = (0.006), s_0(f, y) = (0.009), s(f, y) = (0.014).
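The proposition's correction can be verified for the Brier case (my own sketch, with arbitrary forecast values): averaging the corrected score over the noisy observation y given the truth x recovers the noise-free score exactly.

```python
# Corrected Brier score from the proposition, checked for unbiasedness:
# E_{y|x} s(f, y) should equal s0(f, x) for every f and x.
def s0(f, x):
    return (f - x) ** 2

def s(f, y, r0, r1):
    r = (r0, r1)[y]  # r_y as in the proposition
    return s0(f, y) + r * (s0(f, y) - s0(f, 1 - y)) / (1 - r0 - r1)

r0 = r1 = 0.2
ok = True
for f in (0.1, 0.5, 0.9):
    for x in (0, 1):
        r = (r0, r1)[x]  # probability the observation flips the truth
        exp_s = (1 - r) * s(f, x, r0, r1) + r * s(f, 1 - x, r0, r1)
        ok &= abs(exp_s - s0(f, x)) < 1e-12
print(ok)  # True: the corrected score is unbiased for the Brier score
```
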
34 Continuous predictands Given proper s_0, we need s to satisfy the integral equation ∫ s(f, y) r(y | x) dy = s_0(f, x) for all f and x.
35 Continuous predictands Given proper s_0, we need s to satisfy the integral equation ∫ s(f, y) r(y | x) dy = s_0(f, x) for all f and x. Example: The (proper) Dawid-Sebastiani scoring rule is s_0(f, x) = log σ + (x - µ)^2 / (2σ^2), where µ and σ are the mean and standard deviation of f. Let E(y | x) = x and var(y | x) = c^2. Then s(f, y) = log σ + {(y - µ)^2 - c^2} / (2σ^2).
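A Monte-Carlo sanity check of the corrected Dawid-Sebastiani score (all parameter values are arbitrary): since E{(y - µ)^2 | x} = (x - µ)^2 + c^2, subtracting c^2 removes the bias introduced by the observation noise.

```python
import math
import random
random.seed(0)

# Monte-Carlo check: with y = x + N(0, c^2) observation noise, the mean of
# the corrected score s(f, y) approaches the noise-free score s0(f, x).
mu, sigma, c, x = 1.0, 2.0, 0.5, 1.8  # arbitrary forecast and truth values

def s0(value):
    return math.log(sigma) + (value - mu) ** 2 / (2 * sigma ** 2)

def s(y):
    return math.log(sigma) + ((y - mu) ** 2 - c ** 2) / (2 * sigma ** 2)

n = 200_000
mean_s = sum(s(x + random.gauss(0, c)) for _ in range(n)) / n
print(abs(mean_s - s0(x)) < 0.01)  # True for this sample size
```
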
36 Introduction: forecasts and their evaluation Probability forecasts: proper scoring rules Evaluation when truth is uncertain: unbiased scoring rules Ensemble forecasts: fair scoring rules Conclusion
37 Evaluating ensemble forecasts Weather/climate models produce ensembles, not probabilities. Performance depends on the meaning of the ensemble: 1. empirical distribution is a probability forecast? 2. set of functionals (e.g. quantiles) of a probability forecast? 3. simple random sample from a probability forecast?
38 Evaluating ensemble forecasts Weather/climate models produce ensembles, not probabilities. Performance depends on the meaning of the ensemble: 1. empirical distribution is a probability forecast? 2. set of functionals (e.g. quantiles) of a probability forecast? 3. simple random sample from a probability forecast? Let members w_1, ..., w_m be i.i.d. with unknown distribution f. Try evaluating a proper scoring rule, s(f_m, x), for the e.d.f., f_m. When x ~ p, E_{w,x}{s(f_m, x)} typically is not optimized at f = p. Example: Let x ~ Ber(p), w_i ~ Ber(f) and s(f_m, x) = (w̄ - x)^2. If m = 10 then E_{w,x}{s(f_m, x)} is minimized at f = 0 for p ≤ 5%.
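The m = 10 example can be reproduced exactly by summing over the binomial distribution of the ensemble members (a sketch of the calculation, not code from the talk):

```python
import math

# Exact expected score of the e.d.f. Brier score (wbar - x)^2 with m = 10
# Bernoulli(f) members, summing over the binomial distribution of members.
def expected_score(f, p, m=10):
    total = 0.0
    for k in range(m + 1):
        prob_k = math.comb(m, k) * f ** k * (1 - f) ** (m - k)
        wbar = k / m
        total += prob_k * (p * (wbar - 1) ** 2 + (1 - p) * wbar ** 2)
    return total

p = 0.05
grid = [k / 100 for k in range(101)]
f_best = min(grid, key=lambda f: expected_score(f, p))
print(f_best)  # 0.0, not 0.05: the e.d.f. score rewards hedging to zero
```
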
39 Use fair scoring rules for ensemble forecasts Definition: A scoring rule is fair if the expected score, E_{w,x}{s(w, x)}, over i.i.d. ensemble members sampled from a distribution f and any distribution, p, for x, is optimized at f = p. Fair scoring rules effectively evaluate f given only a sample from f, e.g. E_{w|f,x}{s(w, x) | f, x} is a proper scoring rule for f.
40 Use fair scoring rules for ensemble forecasts Definition: A scoring rule is fair if the expected score, E_{w,x}{s(w, x)}, over i.i.d. ensemble members sampled from a distribution f and any distribution, p, for x, is optimized at f = p. Fair scoring rules effectively evaluate f given only a sample from f, e.g. E_{w|f,x}{s(w, x) | f, x} is a proper scoring rule for f. Theorem: Let x, w_1, ..., w_m ∈ {0, 1} and let s(w, x) = s_{k,x}, where k = mw̄ is the number of ensemble members equal to 1. The scoring rule is fair if (m - k)(s_{k+1,0} - s_{k,0}) = k(s_{k-1,1} - s_{k,1}) for k = 0, 1, ..., m and s_{k+1,0} ≥ s_{k,0} for k = 0, 1, ..., m - 1. Example: s(w, x) = (w̄ - x)^2 - w̄(1 - w̄)/(m - 1) is fair.
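Summing over the binomial distribution of ensemble members shows that the fair score's expectation is minimised at the honest f = p (a sketch with m = 10 and an arbitrary p; its expectation reduces algebraically to (f - p)^2 + p(1 - p)):

```python
import math

# Expected value of the fair score (wbar - x)^2 - wbar(1 - wbar)/(m - 1)
# over m = 10 Bernoulli(f) members and a Bernoulli(p) outcome.
def expected_fair(f, p, m=10):
    total = 0.0
    for k in range(m + 1):
        prob_k = math.comb(m, k) * f ** k * (1 - f) ** (m - k)
        wbar = k / m
        penalty = wbar * (1 - wbar) / (m - 1)
        total += prob_k * (p * ((wbar - 1) ** 2 - penalty)
                           + (1 - p) * (wbar ** 2 - penalty))
    return total

p = 0.05
grid = [k / 100 for k in range(101)]
f_best = min(grid, key=lambda f: expected_fair(f, p))
print(f_best)  # 0.05: the fair score is minimised at the honest f = p
```
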
41 Introduction: forecasts and their evaluation Probability forecasts: proper scoring rules Evaluation when truth is uncertain: unbiased scoring rules Ensemble forecasts: fair scoring rules Conclusion
42 Summary Evaluation makes forecasts more useful. Performance depends on the meaning of the forecast. Rank probability forecasts using proper scoring rules. Rank i.i.d. ensemble forecasts using fair scoring rules. Account for observation error using unbiased scoring rules. Other methods may be needed to understand performance.
43 Some other research topics Evaluating forecasts of high-dimensional space-time fields Evaluating forecasts of phenomena such as hurricanes Evaluating forecasts of rare, extreme events Predicting the performance of climate forecasts
44 References Ferro CAT (2016) Proper scoring rules in the presence of observation error. In preparation. Ferro CAT (2014) Fair scores for ensemble forecasts. Quarterly Journal of the Royal Meteorological Society, 140. Fricker TE, Ferro CAT, Stephenson DB (2013) Three recommendations for evaluating climate predictions. Meteorological Applications, 20. Mitchell K, Ferro CAT (2016) Proper scoring rules for interval probabilistic forecasts. Submitted. Otto FEL, Ferro CAT, Fricker TE, Suckling EB (2015) On judging the credibility of climate predictions. Climatic Change, 132.
46 Concavity of R(p) := S(p, p) for binary outcomes For x ∈ {0, 1}, S(f, p) := E_x{s(f, x)} = p s(f, 1) + (1 - p) s(f, 0). Fix f ∈ [0, 1]. The graph of S(f, p) against p is a straight line intersecting the graph of R(p) at p = f. If s is proper then S(f, p) ≥ R(p) for all p. This can happen only if the straight line is tangent to R(p) at f and if R(p) is concave.
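For the Brier score the tangency argument can be checked directly: the gap S(f, p) - R(p) works out to (f - p)^2, so the line lies above R and touches it only at p = f (a small sketch, with f chosen arbitrarily):

```python
# Numerical illustration of the tangency argument for the Brier score:
# S(f, p) = p(f - 1)^2 + (1 - p)f^2 is linear in p and touches
# R(p) = p(1 - p) from above exactly at p = f.
def S(f, p):
    return p * (f - 1) ** 2 + (1 - p) * f ** 2

def R(p):
    return p * (1 - p)

f = 0.3
gaps = [S(f, p) - R(p) for p in [k / 100 for k in range(101)]]
above = all(g >= -1e-12 for g in gaps)  # the line never dips below R
touch = abs(S(f, f) - R(f)) < 1e-12     # and meets it at p = f
print(above, touch)  # True True
```
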
47 Decision problems define proper scoring rules Let L(x̂, x) := loss incurred by action x̂ when the outcome is x. Let x̂_f := argmin_x̂ E_x{L(x̂, x)} when x ~ f be the Bayes action. The incurred loss, s(f, x) := L(x̂_f, x), is a proper scoring rule.
48 Evaluating climate forecasts 1. Long lead times and unknowable greenhouse gas concentrations mean few or no observations are available. Evaluate hindcasts instead of forecasts. Hindcast scores inform belief about forecast performance. Train to become good judges of relative performance. 2. Good models or good forecasts? Initial-condition ensembles are not designed to be well calibrated, so proper scoring rules may be inappropriate.
49 Proper scoring rules for interval probability forecasts Forecaster must issue one of several fixed probability intervals, e.g. [0, .05], (.05, .15], ..., (.95, 1]. Scoring rules are functions of an interval, I, and an outcome, x. Definition: A scoring rule is interval-proper if its expectation, E_x{s(I, x)}, over any distribution, p, for x, is optimized at the interval containing p (I ∋ p). Example: The scoring rule s(I, x) := (m - x)^2, where m is the midpoint of I, is not interval-proper: for example, E_x{s(I, x)} is optimized by I = [0, .05] when p = .06. Let I_k := [a_{k-1}, a_k). For any increasing sequence g_j, the scoring rule s(I_k, x) := Σ_{j=1}^{k-1} (g_j - g_{j-1})(a_j - x) is interval-proper. Example: s(I_k, x) := (x - a_{k-1})(x - a_k) is interval-proper.
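A numerical check that the quadratic example s(I_k, x) = (x - a_{k-1})(x - a_k) is interval-proper for a binary outcome (the equally spaced edges here are my own choice, not the slide's .05/.15/... edges):

```python
# Interval-propriety check: for x ~ Bernoulli(p) the expected score of the
# quadratic interval score is minimised by the interval containing p.
edges = [k / 10 for k in range(11)]  # intervals I_k = [a_{k-1}, a_k)

def expected(k, p):  # E_x{s(I_k, x)}: x = 0 gives a*b, x = 1 gives (1-a)(1-b)
    a, b = edges[k - 1], edges[k]
    return (1 - p) * a * b + p * (1 - a) * (1 - b)

ok = True
for p in (0.03, 0.26, 0.55, 0.98):
    k_best = min(range(1, 11), key=lambda k: expected(k, p))
    ok &= edges[k_best - 1] <= p <= edges[k_best]
print(ok)  # True: the optimal interval always contains p
```
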
50 Windspeed forecasts Weibull forecast distributions with density f and cdf F. Two proper scoring rules are the logarithmic score, -log f(x), and the continuous ranked probability score, ∫_0^∞ {F(t) - H(t - x)}^2 dt, where H is the Heaviside step function.
51 Windspeed forecasts: PIT histogram Probability Integral Transform: if x ~ F then F(x) ~ Uni(0, 1). Plot the distribution of the PIT values, u_j := F_j(x_j).
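A PIT sketch in code (synthetic standard-normal forecasts with outcomes drawn from the forecast distribution itself, invented for illustration): in this calibrated case the PIT histogram should be approximately flat.

```python
import math
import random
random.seed(42)

# PIT sketch: outcomes drawn from the forecast cdf F, so the PIT values
# F(x) should be close to uniform on (0, 1); bin them into a histogram.
def norm_cdf(z):
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

n = 20_000
pit = [norm_cdf(random.gauss(0, 1)) for _ in range(n)]  # calibrated case
bins = [0] * 10
for u in pit:
    bins[min(int(u * 10), 9)] += 1
flat = all(abs(b - n / 10) < 300 for b in bins)
print(flat)  # True: every decile holds roughly n/10 of the PIT values
```
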
MTH 201 Multivariable calculus and differential equations Practice problems Vectors, dot product, and cross product 1. Find the component form and length of vector P Q with the following initial point
More informationProper Scores for Probability Forecasts Can Never Be Equitable
APRIL 2008 J O L LIFFE AND STEPHENSON 1505 Proper Scores for Probability Forecasts Can Never Be Equitable IAN T. JOLLIFFE AND DAVID B. STEPHENSON School of Engineering, Computing, and Mathematics, University
More informationMulti-Model Calibrated Probabilistic Seasonal Forecasts of Regional Arctic Sea Ice Coverage
Multi-Model Calibrated Probabilistic Seasonal Forecasts of Regional Arctic Sea Ice Coverage 2018 Polar Prediction Workshop Arlan Dirkson, William Merryfield, Bertrand Denis Thanks to Woosung Lee and Adam
More informationThe benefits and developments in ensemble wind forecasting
The benefits and developments in ensemble wind forecasting Erik Andersson Slide 1 ECMWF European Centre for Medium-Range Weather Forecasts Slide 1 ECMWF s global forecasting system High resolution forecast
More informationStandardized Verification System for Long-Range Forecasts. Simon Mason
Standardized Verification System for Long-Range Forecasts Simon Mason simon@iri.columbia.edu MedCOF 2015 Training Workshop Madrid, Spain, 26 30 October 2015 SVS for LRF: Goal Provide verification information
More informationApplied Time Series Topics
Applied Time Series Topics Ivan Medovikov Brock University April 16, 2013 Ivan Medovikov, Brock University Applied Time Series Topics 1/34 Overview 1. Non-stationary data and consequences 2. Trends and
More informationThe Measurement and Characteristics of Professional Forecasters Uncertainty. Kenneth F. Wallis. Joint work with Gianna Boero and Jeremy Smith
Seventh ECB Workshop on Forecasting Techniques The Measurement and Characteristics of Professional Forecasters Uncertainty Kenneth F. Wallis Emeritus Professor of Econometrics, University of Warwick http://go.warwick.ac.uk/kfwallis
More informationInflow Forecasting for Hydropower Operations: Bayesian Model Averaging for Postprocessing Hydrological Ensembles
Inflow Forecasting for Hydropower Operations: Bayesian Model Averaging for Postprocessing Hydrological Ensembles Andreas Kleiven, Ingelin Steinsland Norwegian University of Science & Technology Dept. of
More informationthe amount of the data corresponding to the subinterval the width of the subinterval e x2 to the left by 5 units results in another PDF g(x) = 1 π
Math 10A with Professor Stankova Worksheet, Discussion #42; Friday, 12/8/2017 GSI name: Roy Zhao Problems 1. For each of the following distributions, derive/find all of the following: PMF/PDF, CDF, median,
More informationHigh School Modeling Standards
High School Modeling Standards Number and Quantity N-Q.1 - Use units as a way to understand problems and to guide the solution of multi-step problems; choose and interpret units consistently in formulas;
More informationCalculus I Homework: Linear Approximation and Differentials Page 1
Calculus I Homework: Linear Approximation and Differentials Page Example (3..8) Find the linearization L(x) of the function f(x) = (x) /3 at a = 8. The linearization is given by which approximates the
More informationClimate Models and Snow: Projections and Predictions, Decades to Days
Climate Models and Snow: Projections and Predictions, Decades to Days Outline Three Snow Lectures: 1. Why you should care about snow 2. How we measure snow 3. Snow and climate modeling The observational
More informationDownscaling and Probability
Downscaling and Probability Applications in Climate Decision Aids May 11, 2011 Glenn Higgins Manager, Environmental Sciences and Engineering Department Downscaling and Probability in Climate Modeling The
More informationMultivariate verification: Motivation, Complexity, Examples
Multivariate verification: Motivation, Complexity, Examples A.Hense, A. Röpnack, J. Keune, R. Glowienka-Hense, S. Stolzenberger, H. Weinert Berlin, May, 5th 2017 Motivations for MV verification Data assimilation
More informationVerification challenges in developing TC response procedures based on ensemble guidance
Verification challenges in developing TC response procedures based on ensemble guidance Grant Elliott Senior Metocean Meteorologist ECMWF UEF 2016 workshop Disclaimer and important notice This presentation
More informationProducts of the JMA Ensemble Prediction System for One-month Forecast
Products of the JMA Ensemble Prediction System for One-month Forecast Shuhei MAEDA, Akira ITO, and Hitoshi SATO Climate Prediction Division Japan Meteorological Agency smaeda@met.kishou.go.jp Contents
More informationFig P3. *1mm/day = 31mm accumulation in May = 92mm accumulation in May Jul
Met Office 3 month Outlook Period: May July 2014 Issue date: 24.04.14 Fig P1 3 month UK outlook for precipitation in the context of the observed annual cycle The forecast presented here is for May and
More informationSeasonal Climate Watch June to October 2018
Seasonal Climate Watch June to October 2018 Date issued: May 28, 2018 1. Overview The El Niño-Southern Oscillation (ENSO) has now moved into the neutral phase and is expected to rise towards an El Niño
More informationSUPPLEMENTARY INFORMATION
doi:10.1038/nature12310 We present here two additional Tables (Table SI-1, 2) and eight further Figures (Figures SI-1 to SI-8) to provide extra background information to the main figures of the paper.
More informationCalculus I Homework: Linear Approximation and Differentials Page 1
Calculus I Homework: Linear Approximation and Differentials Page Questions Example Find the linearization L(x) of the function f(x) = (x) /3 at a = 8. Example Find the linear approximation of the function
More informationProblem 1 (20) Log-normal. f(x) Cauchy
ORF 245. Rigollet Date: 11/21/2008 Problem 1 (20) f(x) f(x) 0.0 0.1 0.2 0.3 0.4 0.0 0.2 0.4 0.6 0.8 4 2 0 2 4 Normal (with mean -1) 4 2 0 2 4 Negative-exponential x x f(x) f(x) 0.0 0.1 0.2 0.3 0.4 0.5
More informationMixture EMOS model for calibrating ensemble forecasts of wind speed
Mixture EMOS model for calibrating ensemble forecasts of wind speed Sándor Baran a and Sebastian Lerch b,c a Faculty of Informatics, University of Debrecen, Hungary arxiv:1507.06517v3 [stat.ap] 11 Dec
More informationTowards Operational Probabilistic Precipitation Forecast
5 Working Group on Verification and Case Studies 56 Towards Operational Probabilistic Precipitation Forecast Marco Turco, Massimo Milelli ARPA Piemonte, Via Pio VII 9, I-10135 Torino, Italy 1 Aim of the
More informationProbabilistic Weather Forecasting in R
CONTRIBUTED ARTICLE 1 Probabilistic Weather Forecasting in R by Chris Fraley, Adrian Raftery, Tilmann Gneiting, McLean Sloughter and Veronica Berrocal Abstract This article describes two R packages for
More informationForecasting AOSC 200 Tim Canty. Class Web Site: Lecture 26 Nov 29, Weather Forecasting
Forecasting AOSC 200 Tim Canty Class Web Site: http://www.atmos.umd.edu/~tcanty/aosc200 Topics for today: Forecasting Lecture 26 Nov 29, 2018 1 Weather Forecasting People have been trying to predict the
More informationCHAPTER 4: DATASETS AND CRITERIA FOR ALGORITHM EVALUATION
CHAPTER 4: DATASETS AND CRITERIA FOR ALGORITHM EVALUATION 4.1 Overview This chapter contains the description about the data that is used in this research. In this research time series data is used. A time
More informationClimate Change Adaptation for ports and navigation infrastructure
Climate Change Adaptation for ports and navigation infrastructure The application of climate projections and observations to address climate risks in ports Iñigo Losada Research Director IHCantabria Universidad
More informationUsing Multivariate Adaptive Constructed Analogs (MACA) data product for climate projections
Using Multivariate Adaptive Constructed Analogs (MACA) data product for climate projections Maria Herrmann and Ray Najjar Chesapeake Hypoxia Analysis and Modeling Program (CHAMP) Conference Call 2017-04-21
More informationRegional forecast quality of CMIP5 multimodel decadal climate predictions
Regional forecast quality of CMIP5 multimodel decadal climate predictions F. J. Doblas-Reyes ICREA & IC3, Barcelona, Spain V. Guemas (IC3, Météo-France), J. García-Serrano (IPSL), L.R.L. Rodrigues, M.
More informationUsing Innovative Displays of Hydrologic Ensemble Traces
Upper Colorado River Basin Water Forum November 7, 2018 Communicating Uncertainty and Risk in Water Resources: Using Innovative Displays of Hydrologic Ensemble Traces Richard Koehler, Ph.D. NOAA/NWS, Boulder,
More informationCalibrated Uncertainty in Deep Learning
Calibrated Uncertainty in Deep Learning U NCERTAINTY IN DEEP LEARNING W ORKSHOP @ UAI18 Volodymyr Kuleshov August 10, 2018 Estimating Uncertainty is Crucial in Many Applications Assessing uncertainty can
More informationDeclarative Statistics
Declarative Statistics Roberto Rossi, 1 Özgür Akgün, 2 Steven D. Prestwich, 3 S. Armagan Tarim 3 1 The University of Edinburgh Business School, The University of Edinburgh, UK 2 Department of Computer
More informationpresented by: Latest update: 11 January 2018
Seasonal forecasts presented by: Latest update: 11 January 2018 The seasonal forecasts presented here by Seasonal Forecast Worx are based on forecast output of the coupled ocean-atmosphere models administered
More informationMachine Learning. Lecture 9: Learning Theory. Feng Li.
Machine Learning Lecture 9: Learning Theory Feng Li fli@sdu.edu.cn https://funglee.github.io School of Computer Science and Technology Shandong University Fall 2018 Why Learning Theory How can we tell
More informationTraining: Climate Change Scenarios for PEI. Training Session April Neil Comer Research Climatologist
Training: Climate Change Scenarios for PEI Training Session April 16 2012 Neil Comer Research Climatologist Considerations: Which Models? Which Scenarios?? How do I get information for my location? Uncertainty
More informationECMWF products to represent, quantify and communicate forecast uncertainty
ECMWF products to represent, quantify and communicate forecast uncertainty Using ECMWF s Forecasts, 2015 David Richardson Head of Evaluation, Forecast Department David.Richardson@ecmwf.int ECMWF June 12,
More informationForecasting precipitation for hydroelectric power management: how to exploit GCM s seasonal ensemble forecasts
INTERNATIONAL JOURNAL OF CLIMATOLOGY Int. J. Climatol. 27: 1691 1705 (2007) Published online in Wiley InterScience (www.interscience.wiley.com).1608 Forecasting precipitation for hydroelectric power management:
More informationProbability and climate change UK/Russia Climate Workshop, October, 2003
Probability and climate change UK/Russia Climate Workshop, October, 2003 Myles Allen Department of Physics University of Oxford myles.allen@physics.ox.ac.uk South Oxford on January 5 th, 2003 Photo courtesy
More informationNATCOR. Forecast Evaluation. Forecasting with ARIMA models. Nikolaos Kourentzes
NATCOR Forecast Evaluation Forecasting with ARIMA models Nikolaos Kourentzes n.kourentzes@lancaster.ac.uk O u t l i n e 1. Bias measures 2. Accuracy measures 3. Evaluation schemes 4. Prediction intervals
More informationCalibration of ECMWF forecasts
from Newsletter Number 142 Winter 214/15 METEOROLOGY Calibration of ECMWF forecasts Based on an image from mrgao/istock/thinkstock doi:1.21957/45t3o8fj This article appeared in the Meteorology section
More information