How to make projections of daily-scale temperature variability in the future: a cross-validation study using the ENSEMBLES regional climate models
Jouni Räisänen, Department of Physics, University of Helsinki
8 March 2012
Daily mean temperatures in Helsinki in January
Mean = -4.2, StDev = 6.3, Skewness = -1.3
Mean = -5.5, StDev = 5.2, Skewness = -0.7
What about observations 2069-2098?
Mean = -2.5, StDev = 4.2, Skewness = -0.4
Two classes of MOS approaches for projecting future climate:
- Delta change: apply the simulated change (future climate in the model minus recent climate in the model) to the recent observed climate → climate projection.
- Bias correction: estimate the model bias from the recent climate (model vs. observations) and remove it from the simulated future climate → climate projection.
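For the time mean alone, the two approaches can be sketched as below; the function names and NumPy-based interface are illustrative, not taken from the study:

```python
import numpy as np

def delta_change_mean(obs_ctrl, mod_ctrl, mod_scen):
    """Delta change: add the simulated change in the time mean
    (scenario minus control) to the observed daily values."""
    return obs_ctrl + (np.mean(mod_scen) - np.mean(mod_ctrl))

def bias_correction_mean(obs_ctrl, mod_ctrl, mod_scen):
    """Bias correction: subtract the model's control-period mean bias
    from the simulated scenario-period daily values."""
    return mod_scen - (np.mean(mod_ctrl) - np.mean(obs_ctrl))
```

Note that when only the mean is adjusted, both approaches give the same projected time mean, but the delta change projection inherits the observed day-to-day variability while the bias-corrected projection inherits the simulated variability.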
Questions
- How to best combine biased model results with observations?
- How large is the uncertainty associated with the choice of bias correction / delta change methods, compared with other uncertainties (in particular, the variation of climate change between different models)?
Here these issues are explored for local frequency distributions of daily mean temperature. No attention is paid to temporal or spatial autocorrelation structure, or to inter-variable correlations.
Model data
A subset of 6 ENSEMBLES RCM simulations (6 different driving GCMs, 6 different RCMs):
- CNRM-RM5.1 / ARPEGE
- ETHZ-CLM / HadCM3Q0
- HadRM3Q3 / HadCM3Q3
- HadRM3Q16 / HadCM3Q16
- MPI-M-REMO / ECHAM5
- SMHIRCA / BCM
25 km resolution, A1B scenario. Control period 1971-2000; scenario periods 2069-2098 and 2001-2030.
Projections for daily mean temperatures in future climates: 10 methods

  Quantity changed / corrected        Delta change   Bias correction
  Mean                                      1               6
  Mean + StDev                              2               7
  Mean + StDev + Skewness                   3               8
  Quantile mapping (non-parametric)         4               9
  Quantile mapping (linear fit)             5              10

In the quantile mapping methods (4-5, 9-10), the projected change in the time mean temperature may differ from that simulated by the model.
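As an illustration of the moment-based methods, a possible sketch of methods 2 and 7 (mean + StDev), assuming simple linear scaling of anomalies; the exact formulation used in the study may differ:

```python
import numpy as np

def delta_change_mean_std(obs_ctrl, mod_ctrl, mod_scen):
    """Method 2 sketch: scale observed anomalies by the simulated
    ratio of standard deviations and add the simulated mean change."""
    scale = np.std(mod_scen) / np.std(mod_ctrl)
    return (np.mean(obs_ctrl)
            + (np.mean(mod_scen) - np.mean(mod_ctrl))
            + scale * (obs_ctrl - np.mean(obs_ctrl)))

def bias_correction_mean_std(obs_ctrl, mod_ctrl, mod_scen):
    """Method 7 sketch: rescale simulated anomalies by the observed-to-model
    ratio of control-period standard deviations and remove the mean bias."""
    scale = np.std(obs_ctrl) / np.std(mod_ctrl)
    return (np.mean(obs_ctrl)
            + (np.mean(mod_scen) - np.mean(mod_ctrl))
            + scale * (mod_scen - np.mean(mod_scen)))
```

Method 3 / method 8 would additionally adjust the skewness; the principle is the same, with one more moment matched.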
Quantile mapping (delta change mode): in this particular case, it would be inferred that a daily mean temperature of -13°C or lower would occur in the future climate (2069-2098) as frequently as -20°C or lower in the observed climate (1971-2000).
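The delta-change form of non-parametric quantile mapping (method 4) can be sketched as below: each observed quantile is shifted by the simulated change at that same quantile. The discretization into 99 percentiles is an assumption made for illustration:

```python
import numpy as np

def quantile_delta_change(obs_ctrl, mod_ctrl, mod_scen,
                          quantiles=np.linspace(1, 99, 99)):
    """Shift each observed quantile by the simulated change in the
    same quantile; returns the projected future quantiles."""
    return (np.percentile(obs_ctrl, quantiles)
            + np.percentile(mod_scen, quantiles)
            - np.percentile(mod_ctrl, quantiles))
```

In the slide's example, the simulated change at the cold tail maps the observed -20°C quantile to about -13°C in the projection.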
Distribution of daily mean temperatures: Helsinki, January, 1971-2000
Distribution of daily mean temperatures: Helsinki, January, 1971-2000 and 2069-2098 Based on ensemble mean of 6 RCM simulations
Distribution of daily mean temperatures: Helsinki, January, 1971-2000 and 2069-2098 Results for the 6 RCM simulations shown separately
Conclusions
- The uncertainty in bias correction / delta change methods is not negligible, particularly in the extreme tails of the distribution.
- Still, the uncertainty in climate change itself (i.e., different changes in different models) is generally larger.
- This example was for a single location and one of the 12 months, but the same conclusions hold qualitatively for most other cases.
Which projection method is best? The principle of cross-validation:
- The recent climate in model N serves as pseudo-observations.
- The recent and future climates in the other N-1 models are used to make a projection for the future climate of model N.
- The projection is verified against the actual future climate in model N → verification statistics.
- Repeat for all choices of model N, using the different (delta change and bias correction) methods for projection.
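The cross-validation loop can be sketched as follows; the dict-based interface and function names are illustrative:

```python
import numpy as np

def cross_validate(ctrl, scen, project, score):
    """Leave-one-out cross-validation over models.

    ctrl, scen -- dicts {model: daily T array} for the control and
                  scenario periods
    project    -- projection method: project(pseudo_obs, mod_ctrl, mod_scen)
    score      -- verification statistic: score(prediction, truth)
    """
    results = []
    for truth in ctrl:                  # model N acts as pseudo-reality
        for other in ctrl:              # each of the remaining N-1 models
            if other == truth:
                continue
            # the control climate of model N plays the role of observations
            pred = project(ctrl[truth], ctrl[other], scen[other])
            results.append(score(pred, scen[truth]))
    return np.mean(results)
```

Running this loop separately for each of the 10 projection methods gives the comparative verification statistics shown on the following slides.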
Cross-validation statistics: 2069-2098
[Figure: MSE for the delta change methods 1-5 and the bias correction methods 6-10]
Mean square error (MSE) of the temperature distribution, averaged over the quantiles 0-100%, the 6 verifying models, the 12 months and the 386 grid boxes:

$$\mathrm{MSE} = \frac{1}{6}\sum_{m=1}^{6}\frac{1}{12}\sum_{t=1}^{12}\frac{1}{A}\int_A\frac{1}{100\%}\int_{0}^{100\%}\left(T_{\mathrm{Pred}} - T_{\mathrm{Ver}}\right)^{2}\,dq\,da$$
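For one month and one grid box, the quantile integral of the squared difference can be approximated by a discrete average over percentiles; the number of quantile points here is an assumption:

```python
import numpy as np

def quantile_mse(t_pred, t_ver, n_q=101):
    """Approximate the 0-100% quantile integral of the squared
    difference between predicted and verifying distributions."""
    q = np.linspace(0, 100, n_q)
    diff = np.percentile(t_pred, q) - np.percentile(t_ver, q)
    return np.mean(diff ** 2)
```

The full statistic then averages this quantity over the verifying models, months and grid boxes, as in the equation above.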
Cross-validation statistics: 2069-2098 (continued)
Red = MSE contribution from the error in the time mean temperature
Grey = MSE contribution from the error in the distribution of temperature around the mean value
Cross-validation statistics: 2001-2030
Same MSE statistic as on the previous slides, now for the period 2001-2030.
Red = MSE contribution from the error in the time mean temperature
Grey = MSE contribution from the error in the distribution of temperature around the mean value
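The red/grey decomposition splits the quantile differences into their mean (error in the time mean) and the residual spread (error in the shape of the distribution). This is a sketch of one way to compute it; by the usual variance decomposition, the two parts sum exactly to the total quantile MSE:

```python
import numpy as np

def mse_decomposition(t_pred, t_ver, n_q=101):
    """Split the quantile MSE into a time-mean part ('red') and a
    distribution-shape part ('grey')."""
    q = np.linspace(0, 100, n_q)
    diff = np.percentile(t_pred, q) - np.percentile(t_ver, q)
    mean_part = np.mean(diff) ** 2                      # red
    shape_part = np.mean((diff - np.mean(diff)) ** 2)   # grey
    return mean_part, shape_part
```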
Key points from the preceding slides
- Most of the cross-validation error (for the whole T distribution) is associated with the error in the time mean temperature.
- Delta change and bias correction methods show comparable performance.
- Methods 1 and 6 (the same delta change or bias correction over the whole distribution) perform worse than the other methods, which treat different parts of the distribution differently.
- The apparent superiority of the quantile mapping methods (4-5, 9-10) in 2069-2098 is mainly due to smaller errors in the projected time mean temperature!
Important caveat
For some applications, the extremes of the temperature distribution may be disproportionately important, and bulk measures (like MSE) are not particularly sensitive to errors in the extremes!
Ranking of the 10 methods for different parts of the distribution (2069-2098)
There is no single method that would be best in all parts of the distribution, not to speak of different months and different areas.
...which suggests that
- The uncertainty associated with the choice of delta change / bias correction methods is (to some extent) unavoidable.
- Thus, at least in theory, different methods should be used in parallel when making projections (in addition to using different climate models, etc.).
- Still, it seems wise to avoid the worst-performing methods (= those that apply the same change / bias correction over the whole distribution?).
Example: 1st percentile of daily mean T in January, Helsinki, 2069-2098
Black = ensemble mean from the 6 RCM simulations; other colors = the 6 RCM simulations separately (1971-2000 shown for comparison).
Nominal 5-95% range shown (which most likely underestimates the real uncertainty).
Methods 1 and 6 excluded.
Questions? Other comments?