Which Climate Model is Best?

Which Climate Model is Best?
Ben Santer, Program for Climate Model Diagnosis and Intercomparison, Lawrence Livermore National Laboratory, Livermore, CA 94550
Adapting for an Uncertain Climate: Preparing for the Next 100 Years, University of Minnesota, Chanhassen, Minnesota, September 17th, 2010

Introducing PCMDI
PCMDI: Program for Climate Model Diagnosis and Intercomparison
Service: Coordinate international climate modeling simulations (standard benchmark experiments) and enable the broader science community to analyze and evaluate models
Goal: Quantify how well models simulate present-day climate and evaluate uncertainty in projections of future climate change
PCMDI was established in 1989 and has been at Lawrence Livermore National Laboratory since then

Structure
Introduction: Why we need climate models
Evaluating climate models
Problems in identifying the best and worst models
Can we reduce uncertainties in model projections of future climate change?
Conclusions

Computer models can perform the control experiment that we can't do in the real world
[Figure: average surface temperature change (°C). Meehl et al., Journal of Climate (2004)]

Models are the only credible tools we have for projecting the climatic shape of things to come
[Figure: projected global temperature for the A2, A1B, B1, and constant-composition scenarios, together with the 20th-century simulation. Source: IPCC Fourth Assessment Report (2007)]

Structure
Introduction: Why we need climate models
Evaluating climate models
Problems in identifying the best and worst models
Can we reduce uncertainties in model projections of future climate change?
Conclusions

We routinely test how well current climate models simulate:
Today's annual average climate
The daily cycle of temperature, clouds, and rainfall
Changes over the seasonal cycle
The response to massive volcanic eruptions
Ocean uptake of products of atmospheric tests of nuclear weapons
The climate changes of the past 30 to 150 years
Climates of the deep past (like the last Ice Age)
Weather
Modes of natural climate variability (like El Niño)

Evaluating how well computer models simulate seasonal changes in climate
[Figure: a "portrait plot" of performance for 22 models (A-V) used in the IPCC Fourth Assessment, colored from best to worst, with mean and median rankings, across 24 climate variables: surface latent and sensible heat fluxes; surface temperature; reflected shortwave and outgoing longwave radiation (all-sky and clear-sky); total cloud cover; precipitation; total column water vapor; sea-level pressure; meridional and zonal wind stress; surface, 850 mb, and 200 mb winds; specific humidity at 400 and 850 mb; temperature at 200 and 850 mb; and geopotential height at 500 mb. Gleckler, Taylor, and Doutriaux, Journal of Geophysical Research (2008)]
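
To make the metric behind such a plot concrete, here is a minimal Python sketch (not PCMDI's production code) in the spirit of the Gleckler et al. relative-error measure: compute an RMSE for each model and variable, then express each RMSE relative to the median RMSE across models for that variable. The input numbers are hypothetical.

```python
# Sketch of a Gleckler-style relative error, on hypothetical inputs.
import numpy as np

def rmse(model_field, obs_field):
    """Root-mean-square error between a simulated and an observed field."""
    return np.sqrt(np.mean((model_field - obs_field) ** 2))

def relative_errors(errors):
    """Express each model's RMSE relative to the per-variable median RMSE.

    errors: array of shape (n_models, n_variables) of RMSE values.
    Returns the same shape; 0 means median skill, negative means better
    than the typical model, positive means worse.
    """
    median = np.median(errors, axis=0)   # per-variable median RMSE
    return (errors - median) / median

# Hypothetical example: 4 models, 3 variables
errors = np.array([[1.2, 0.8, 2.0],
                   [1.0, 1.1, 1.5],
                   [0.9, 0.9, 1.8],
                   [1.5, 1.0, 1.6]])
print(relative_errors(errors))
```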

Can we improve the representation of clouds and rainfall in climate models?
[Figure: cloud fraction at the Point Barrow ARM site. Xie et al., Journal of Geophysical Research (2008)]

Structure
Introduction: Why we need climate models
Evaluating climate models
Problems in identifying the best and worst models
Can we reduce uncertainties in model projections of future climate change?
Conclusions

How do we make the best use of scientific information from a large archive of climate model output?
Not all computer models are of equal quality
Because of differences in model quality, there is increasing interest in weighting computer projections of future climate change
Most studies have weighted individual models by simple measures of their performance in simulating today's average climate
Is it a model democracy ("one model, one vote")? Or should we pay more attention to better models?
Can we define a Letterman-like list of the top 10 climate models?
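
As a toy illustration of the choice between "one model, one vote" and performance weighting, the following Python sketch averages the same projections two ways. The projections, the skill scores, and the rule of weighting by normalized skill are all assumptions for illustration; the literature uses many different weighting rules.

```python
# "Model democracy" versus performance weighting, on hypothetical numbers.
import numpy as np

projected_warming = np.array([2.1, 2.8, 3.4, 2.5, 4.0])  # degrees C, hypothetical
skill_scores      = np.array([0.9, 0.7, 0.4, 0.8, 0.2])  # hypothetical, higher = better

# Equal weights: every model gets one vote.
equal_weights = np.full_like(skill_scores, 1.0 / skill_scores.size)

# Performance weights: proportional to skill, normalized to sum to one.
perf_weights = skill_scores / skill_scores.sum()

print("One model, one vote: ", np.dot(equal_weights, projected_warming))
print("Performance-weighted:", np.dot(perf_weights, projected_warming))
```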

An example of the difficulties involved in identifying the best models: the case of water vapor
We found that:
There is an emerging human-caused signal in the increasing moisture content of Earth's atmosphere
This signal is primarily due to human-caused increases in well-mixed greenhouse gases

Although the models showed important differences in their performance, they had equal weight in the fingerprint study
Santer et al., Proceedings of the U.S. National Academy of Sciences (2007)

Does model quality matter in climate fingerprinting?

If we use only the top ten models, can we still identify a human fingerprint in observed water vapor changes?
We identified the top ten models (out of 22 in the CMIP-3 archive) in three different ways, using measures of model performance in simulating:
The climatological mean state and seasonal cycle pattern ("M+SC")
The amplitude and pattern of variability on different timescales (monthly, 2-year, 10-year; "VA+VP")
Mean state, seasonal cycle, and variability ("ALL")
This was done for:
Two different variables: water vapor and sea-surface temperature (SST)
Five different geographical regions: AMO, PDO, Niño 3.4, tropical oceans (30°N-30°S), and near-global oceans (50°N-50°S)

How did we do the model ranking?
M+SC: 20 model performance metrics
VA+VP: 50 model performance metrics
ALL: 70 model performance metrics
For each set of metrics, model ranking was done in two different ways:
Parametrically: rank is the average of the normalized values of the individual metrics ("P")
Non-parametrically: rank is the average of the ranks for each individual metric ("NP")
In each case, we identified the top ten and bottom ten models
12 cases in all: 3 groups of metrics (M+SC, VA+VP, ALL) × 2 ranking schemes (P, NP) × 2 groups of models (top ten, bottom ten)
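
A minimal Python sketch of the two ranking schemes, assuming a hypothetical metrics matrix in which lower values mean smaller errors. The random inputs and the normalization choice are illustrative, not the actual Santer et al. procedure:

```python
# Parametric ("P") versus non-parametric ("NP") model ranking, sketched
# on a random, hypothetical metrics matrix (rows = models, cols = metrics).
import numpy as np

rng = np.random.default_rng(0)
metrics = rng.random((22, 20))   # e.g. 22 CMIP-3 models x 20 M+SC metrics

# Parametric: normalize each metric across models, average per model.
z = (metrics - metrics.mean(axis=0)) / metrics.std(axis=0)
p_score = z.mean(axis=1)         # lower = better

# Non-parametric: rank models on each metric, then average the ranks.
ranks = metrics.argsort(axis=0).argsort(axis=0)   # 0 = best on that metric
np_score = ranks.mean(axis=1)                     # lower = better

top_ten_p = np.argsort(p_score)[:10]
top_ten_np = np.argsort(np_score)[:10]
print("Top ten (P): ", sorted(top_ten_p))
print("Top ten (NP):", sorted(top_ten_np))
print("Overlap:", len(set(top_ten_p) & set(top_ten_np)))
```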

Relationship between different measures of model skill

How do we make the best use of scientific information from a large archive of climate model output?
[Figure: model rankings based on 20 different performance metrics versus rankings based on 50 different performance metrics. Santer et al., Proceedings of the U.S. National Academy of Sciences (2009)]

Is the identification of a human fingerprint in water vapor changes sensitive to model quality information?

Structure
Introduction: Why we need climate models
Evaluating climate models
Problems in identifying the best and worst models
Can we reduce uncertainties in model projections of future climate change?
Conclusions

Can we reduce uncertainties in regional-scale projections of future climate change?
[Figure: projected temperature change (degrees C). Slides courtesy of Dr. Mike Wehner, Lawrence Berkeley National Laboratory]

Why is it difficult to assess the reliability of climate model projections?
In weather forecasting, we make predictions days, weeks, or months in advance; we have the data necessary to test the skill of the forecast
In climate change experiments, we make projections years, decades, or centuries into the future; limited data are available for assessing the skill of the projections
The relationship between a climate model's skill in simulating present or past conditions and its predictive skill is largely unknown

So how can we assess the reliability of climate model projections?
We can evaluate model physics
We can build confidence by testing a model's ability to simulate a wide variety of phenomena
We can study individual feedback processes that determine how sensitive the climate system is to CO2 increases
We can see how well we did with projections made 5-15 years ago
We can make true decadal predictions of future climate: start ("initialize") a model with present-day (or past) ocean and land surface conditions, and then make a true forecast of the climate of the next decade

The quest for the Holy Grail: uncovering relationships between present-day observables and future changes
The response of snow cover to global warming in models is related to their snow response to spring warming (Hall and Xu, Geophysical Research Letters, 2006)
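
In code, an emergent-constraint argument of this kind boils down to a cross-model regression. The sketch below is a hypothetical illustration with invented numbers, not a reproduction of Hall and Xu's analysis:

```python
# Emergent-constraint sketch: regress each model's future change on a
# present-day observable across the ensemble, then read off the value
# implied by the observation. All numbers are hypothetical.
import numpy as np

# Per-model values: a present-day observable (e.g. seasonal-cycle snow
# response) and the corresponding projected long-term change.
observable    = np.array([0.6, 0.8, 1.0, 1.2, 1.4, 0.9, 1.1])
future_change = np.array([-4.0, -5.5, -7.0, -8.0, -9.5, -6.0, -7.5])

# Least-squares fit of future change against the observable.
slope, intercept = np.polyfit(observable, future_change, 1)

observed_value = 1.05   # hypothetical observational estimate
constrained = slope * observed_value + intercept
print(f"Constrained projection: {constrained:.1f} "
      f"(unweighted ensemble mean: {future_change.mean():.1f})")
```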

Conclusions: Model evaluation and weighting projections
We routinely evaluate climate model performance in a variety of different ways
Model errors are complex in space and in time
Even for a relatively straightforward application (identifying a human fingerprint in observed water vapor changes), it is not easy to make an unambiguous identification of the top ten models
In the water vapor case, there is not a clear relationship between model errors in simulating the mean state and the temporal variability
It may be difficult to come up with objective, scientifically defensible schemes for weighting projections of future climate change
Best practice: do model weighting in many different ways, and examine the sensitivity of the things you care about (like fingerprint detectability) to the different weighting options
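
A hypothetical sketch of that best practice: recompute the quantity of interest under several weighting rules (all invented here) and examine the spread.

```python
# Sensitivity of a weighted projection to the choice of weighting rule.
# The projections, skill scores, and rules are illustrative assumptions.
import numpy as np

projections = np.array([2.1, 2.8, 3.4, 2.5, 4.0])   # hypothetical, degrees C
skill       = np.array([0.9, 0.7, 0.4, 0.8, 0.2])   # hypothetical

schemes = {
    "equal":        np.ones_like(skill),
    "linear skill": skill,
    "skill^2":      skill ** 2,
    "top 3 only":   (skill >= np.sort(skill)[-3]).astype(float),
}
for name, w in schemes.items():
    w = w / w.sum()   # normalize weights to sum to one
    print(f"{name:>12}: {np.dot(w, projections):.2f} C")
```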

Conclusions: Reducing projection uncertainties
Projection uncertainties are largely due to uncertainties in key feedback mechanisms (clouds, snow, ice, water vapor)
Observations may help to constrain uncertainties in model estimates of feedback processes