The Observational Climate Record


The Observational Climate Record. Deke Arndt, NOAA, Asheville, NC, USA, February 2018. National Oceanic and Atmospheric Administration, NOAA Satellite and Information Service.

National Centers for Environmental Information, Asheville, NC

About Me
I am a meteorologist by training, but I've worked in drought or climate since the late 1990s. I'm not the lead expert in any of the following material; I use it operationally.

Objectives
Long-term goal (this and future engagements): inform your role as a station scientist who gets many climate questions from many directions.
Today's objectives:
- Identify and address some common misconceptions or points of confusion about the observational record.
- Point to peer-reviewed literature that more fully describes approaches and conclusions.
I want you to be confident using this information to get your job(s) done.

Global Surface Temperature: Difference from 20th-Century Average, in °F
2017: +0.84 °C / +1.51 °F above the 1901-2000 average; 3rd warmest year on record.
Probability of warmest rank: 2016, 57%; 2015, 32%; 2017, 10%; all others, 1%.

U.S. Temperature: 2017
CONUS: 54.6 °F; 2.6 °F above the 20th-century average; 3rd warmest year. 2012 (55.3 °F) and 2016 (54.9 °F) were warmer.
21st consecutive year warmer than the 20th-century average.
Every CONUS state and AK had an above-average (top one-third of history) or warmer year, for the 3rd consecutive year.
Record warm states: AZ, GA, NM, NC, SC.
- Arizona's temperature of 63.1 °F surpassed the previous record, set in 2014, by 0.8 °F. Tmax and Tmin ranked as the 5th and 4th highest temps on record, respectively.
http://www.ncdc.noaa.gov/sotc/national

Tmin vs. Tmax relative to [their own] history. http://www.ncdc.noaa.gov/sotc/national

Temperature Anomalies by County. https://www.ncdc.noaa.gov/sotc/national/2017/13/supplemental/page-3/

[Slide graphic: state records since 2010 — coolest, warmest, wettest, driest — listed by individual month, by season, and by calendar year. Key to seasons: CY: calendar year (Jan-Dec); 1Q: 1st quarter (Jan-Feb-Mar); Spr: spring (Mar-Apr-May); an asterisk marks a record since broken. Source: Climate at a Glance, http://www.ncdc.noaa.gov/cag]

Contents
- How do we compute US and global temperatures?
- Station siting
- Corrections
- In situ vs. satellite-derived temperatures
- Precision
- Dealing with dropouts
- Where do I find this stuff?

First, good news
The surface temperature record has been extensively scrutinized this past decade. This led to improvements in network and station metadata, algorithms that detect and correct disruptions in time series, and, ultimately, a more robust and better-understood US and global temperature time series.

Multiple indicators
We could talk about: Arctic amplification, glaciers, permafrost, sea level rise, rising lake temperatures, ocean heat content, ocean acidification.

What are temperature anomalies?
For studying climate change, temperature departures, or anomalies, are often more important than absolute temperatures. These are differences from an average, or baseline, temperature, often the "normal."
- Positive anomaly: observed temperature warmer than the baseline.
- Negative anomaly: observed temperature cooler than the baseline.

Why use anomalies?
Anomalies take advantage of the meteorological property of scale. Temperature patterns vs. normal are large and fairly consistent, even if the absolute temperatures within them vary due to terrain, proximity to bodies of water, elevation, or other effects. This is especially true on multiday time scales. Using anomalies helps minimize problems when stations are added to, removed from, or missing from the monitoring network.
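The arithmetic behind an anomaly is simple subtraction; a minimal sketch (station names and values here are illustrative, not real NOAA data):

```python
def anomaly(observed_f, normal_f):
    """Departure from the baseline ("normal"): positive = warmer, negative = cooler."""
    return observed_f - normal_f

# Hypothetical observed monthly-mean temperatures vs. each station's normal (degrees F)
obs = {"Station A": 78.4, "Station B": 61.0}
normals = {"Station A": 76.2, "Station B": 62.5}

anoms = {name: anomaly(obs[name], normals[name]) for name in obs}
# Station A: +2.2 F (positive anomaly); Station B: -1.5 F (negative anomaly)
```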

Globe derived from grid boxes

Grid box derived from stations, buoys, ships
Step 1: Compute station anomalies (the observed monthly average temperature at each station minus the normal monthly average temperature at that station).
Step 2: Keep the anomalies; scrap the observations ("work in anomaly space").
Step 3: Compute the average anomaly for the grid box.
Step 4: Average the grid boxes, weighting by latitude.
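Steps 3 and 4 can be sketched in a few lines. This is only the idea, with made-up grid boxes and anomaly values; operational products use more careful area weighting and handling of empty boxes. The cos(latitude) weight makes each equal-angle box count roughly in proportion to its surface area.

```python
import math

def grid_box_anomaly(station_anoms):
    """Step 3: simple mean of the station anomalies inside one grid box."""
    return sum(station_anoms) / len(station_anoms)

# (center latitude in degrees, station anomalies in the box, degrees F) -- invented values
boxes = [
    (60.0, [1.2, 0.8, 1.0]),  # high-latitude box: smallest area weight
    (30.0, [0.3, 0.5]),
    (0.0,  [0.1, 0.2, 0.0]),  # equatorial box: largest area weight
]

# Step 4: latitude-weighted average of the box anomalies
weights = [math.cos(math.radians(lat)) for lat, _ in boxes]
box_anoms = [grid_box_anomaly(anoms) for _, anoms in boxes]
global_anom = sum(w * a for w, a in zip(weights, box_anoms)) / sum(weights)
# global_anom sits below the unweighted mean of the boxes because the
# warmest box here is at high latitude and gets the smallest weight
```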

Ocean data are shared through a similar international system.

Global + US + OK Temperature Anomalies since 1900

Dealing with threats to data validity

Random error vs. bias
Random, garden-variety bad data, precision limits, etc., generally don't keep us up at night. Problems tend to bounce around the average, and over increasingly large time and space scales they balance out.
What keeps us up at night: detecting and correcting systematic biases. These show up as shifts in a long-term record that can be subtle or dramatic.

Random error vs. bias
We often get asked:
- How can you tell the temperature of the planet down to 0.01 °C when the instruments only measure to 1.0 °C?
- Great question, and straightforward answer: the power of large numbers, in particular the large number of observations.
Follow-up question:
- But what about the instrument only being accurate to within +/- 2.0 °C?
- Same great question, same straightforward answer: the power of large numbers.
Think of it this way:
- How can you tell LeBron James is a 33.2% three-point shooter when we only measure whole makes or misses?

Random error vs. bias
Random error forgives itself when the number of observations is sufficiently large (and the scale of analysis is sufficiently large to match).
Random error:
- "The ref said his toe was on the line!"
- "He did/didn't get fouled!"
However:
- What if they moved the three-point line during his career?
- What about the effects of aging? Changing teams? Other rule changes?
Points to make here:
1. It's the bias, not the random error, that threatens large-scale analysis based on many data points.
2. Bias keeps climate data people up at night.
3. The causes of bias are often changes in practice or instrumentation.
4. The pointers to bias lie in the metadata.
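A quick simulation makes the contrast concrete (all numbers here are synthetic, not station data): whole-degree rounding is a random error that averages away over many observations, while a systematic shift survives any amount of averaging.

```python
import random

random.seed(42)
TRUE_ANOM = 0.50  # the "true" anomaly, in degrees

# 100,000 readings from thermometers that only report whole degrees
obs = [round(TRUE_ANOM + random.gauss(0.0, 2.0)) for _ in range(100_000)]
mean_random = sum(obs) / len(obs)        # lands very close to 0.50

# The same readings with a +1.0-degree systematic bias (e.g., an
# instrument change or station move that was never corrected)
biased = [o + 1.0 for o in obs]
mean_biased = sum(biased) / len(biased)  # lands very close to 1.50: the bias does not average out
```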

Major Known Causes of Station Shifts
- Time-of-observation ("TOB") changes
- Changes in instrumentation
- Changes in station environment
All of these can be documented or undocumented.

Observation of Time of Observation, over Time
We have become a nation of morning observers. This introduced a cool bias over time. TOB is a statistical correction for this effect. From Vose et al. (2003).
Literature: Vose, R. S., C. N. Williams Jr., T. C. Peterson, T. R. Karl, and D. R. Easterling (2003), An evaluation of the time of observation bias adjustment in the U.S. Historical Climatology Network, Geophys. Res. Lett., 30(20), 2046, doi:10.1029/2003GL018111.

Changes in instrumentation and siting
1980s: much of the network shifted from traditional shelters and liquid-in-glass thermometers to the MMTS package.
Competing effects: brought stations closer to structures, but a much cooler instrument package. (Photos: NOAA Photo Library; National Weather Service.)
Literature: Menne, M. J., C. N. Williams Jr., and R. S. Vose, 2009: The United States Historical Climatology Network monthly temperature data, Version 2. Bulletin of the American Meteorological Society, 90, 993-1007.

Urbanization and Ruralization
Reno, Nevada Tmin: comparison with neighbors. The urban signal is associated with 14%-21% of the rise in unadjusted Tmin since 1895, and 6%-9% since 1960. Homogenization effectively removes this urban signal from individual and aggregate station records, such that it becomes insignificant during the last 50-80 years.
Literature: Hausfather, Z., M. J. Menne, C. N. Williams, T. Masters, R. Broberg, and D. Jones, 2013: Quantifying the effect of urbanization on U.S. Historical Climatology Network temperature records. J. Geophys. Res. Atmos., 118, 481-494, doi:10.1029/2012JD018509.

Effects of these issues on CONUS
Changes in observation practice had different effects on Tmax vs. Tmin trends. Before any sort of homogenization:
Tmax: widespread shifts artificially cooled the true rate of change.
- Artificial cooling since 1950: changing time of observation.
- Artificial cooling (primarily mid-1980s): liquid-in-glass thermometers → MMTS electronic resistance thermistors.
Tmin: these shifts work in opposition to each other.
- Artificial cooling since 1950: changing time of observation.
- Some artificial cooling from 1930-50: station moves to somewhat cooler microclimates (ruralization).
- Artificial warming since the mid-1980s: associated with installation of MMTS.
- Conclusion: raw Tmin data likely underestimate the overall trend since 1950 (when time-of-observation shifts dominate) and overestimate the overall trend since 1979 (when shifts associated with MMTS installation dominate).
Literature: Menne, M. J., C. N. Williams Jr., and M. A. Palecki, 2010: On the reliability of the U.S. surface temperature record. Journal of Geophysical Research, 115, D11108, doi:10.1029/2009JD013094.

Checking Our Work: how do we know we're getting it right?

How do we know corrections work?
NCDC uses a homogenization algorithm designed to account for shifts and reduce the error in trend calculations. Benchmarking experiments broadly affirmed the approach. Comparison with hourly reanalyses also indicates the corrections are in the correct direction.
Vose, R. S., S. Applequist, M. J. Menne, C. N. Williams Jr., and P. Thorne (2012), An intercomparison of temperature trends in the U.S. Historical Climatology Network and recent atmospheric reanalyses, Geophys. Res. Lett., 39, L10703, doi:10.1029/2012GL051387.
Williams, C. N., M. J. Menne, and P. W. Thorne, 2012: Benchmarking the performance of pairwise homogenization of surface temperatures in the United States. Journal of Geophysical Research-Atmospheres, 117, D5, doi:10.1029/2011JD016761.
Zhang, J., W. Zheng, and M. J. Menne, 2012: A Bayes factor model for detecting artificial discontinuities via pairwise comparisons. Journal of Climate, 25, 8462-8474, doi:10.1175/JCLI-D-12-00052.1.

Compared to the Climate Reference Network
USHCN: a network of ~1,200 stations used to calculate CONUS temperature until 2014; now defunct.
ClimDiv: uses about five times as many stations and a gridded method to calculate CONUS (and now Alaska) temperature and precipitation; became the operational set of choice in 2014.

Compared to the Climate Reference Network

Why adjust the past?
Rationale: adjusted datasets should reflect what the data would show with today's instrumentation and practices. Unadjusted data are perpetually publicly available, and unadjusted data are the primary input for each monthly analysis (we don't just assume the corrections from last month's run).
Dear colleague: if you take anything away from today's presentation, please note that our corrections actually reduce the global temperature trend.
Karl, T. R., and Coauthors, 2015: Possible artifacts of data biases in the recent global surface warming hiatus. Science, 348, 1469-1472, doi:10.1126/science.aaa5632.
Vose, R. S., D. S. Arndt, V. F. Banzon, D. R. Easterling, B. Gleason, B. Huang, E. Kearns, J. H. Lawrimore, M. J. Menne, T. C. Peterson, R. W. Reynolds, T. M. Smith, C. N. Williams Jr., and D. L. Wuertz, 2012: NOAA's merged land-ocean surface temperature analysis. Bull. Amer. Met. Soc., 93, 1677-1685, doi:10.1175/BAMS-D-11-00241.1.

SST corrections dominate the global signal
None of the boundaries between these methods are sharp (with the exception of the WW2 years themselves).
- Bucket method: dominates pre-WW2.
- Engine room intake (ERI): dominates the last half of the 20th century.
- Buoys: emergent in the late 20th century; becoming dominant.
Matthews & Matthews (2013): Comparing historical and modern methods of sea surface temperature measurement. Ocean Sci., 9, 695-711.
Kennedy, J. J., N. A. Rayner, R. O. Smith, D. E. Parker, and M. Saunby (2011), Reassessing biases and other uncertainties in sea surface temperature observations measured in situ since 1850: 2. Biases and homogenization, J. Geophys. Res., 116, D14104, doi:10.1029/2010JD015220.

SST observations, historically
Big drop-off in bucket use during WW2; a rapid rebound in bucket use after WW2, then replacement by the cheaper, safer ERI. This results in a warm bias, generally, over time.
Journal of Geophysical Research: Atmospheres, Volume 116, Issue D14, D14104, 22 July 2011, doi:10.1029/2010JD015220.

Other random stuff

Tsfc vs. TLT: Related but not Equated
Surface temperature. Represents: meteorological surface temperature (approx. 1.5 m AGL). Measured by: thousands of in situ stations. Familiar datasets: GHCN, nclimdiv, CRUTemp. Challenges: environment drift, changing instruments.
Temperature of the lower troposphere ("satellite"). Represents: bulk temperature from the surface to about 8,000 m (26,000 ft). Measured: indirectly; derived from radiances in microwave frequencies. Familiar datasets: RSS, UAH. Challenges: orbital drift, changing instruments.

Precision
How do we come up with 0.1- or 0.01-degree values when individual stations measure in whole degrees? More samples = more statistical power. (Image: NBA.com)
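The earlier basketball analogy works as arithmetic, too: each shot is recorded only as a whole make or miss, yet across thousands of attempts the season percentage is pinned down to well under a percentage point. The attempt counts below are invented for illustration.

```python
import math

# Each observation is a whole make (1) or miss (0), but the standard
# error of the estimated percentage shrinks with the square root of the
# number of attempts -- the same "more samples = more power" principle.
makes, attempts = 1000, 3012                     # hypothetical season totals
pct = makes / attempts                           # ~0.332 -> a "33.2% shooter"
stderr = math.sqrt(pct * (1 - pct) / attempts)   # ~0.0086, i.e., under one percentage point
```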

Specific Questions

Some Specific Questions
"I'm most curious about the span of the record: as we didn't have stations on many continents for the early years of the record, how were those years reconstructed?"
- Global-scale analysis is still possible: the oceans, and Africa, were better sampled than you might think.
- Some organizations analyze global temperatures back to the mid-1800s; we start our official record at 1880.
- With that said, uncertainties are higher during the late 19th century, partly because of sparser coverage.

Some Specific Questions
"How are future analyses looking? Leaning more on surface stations or satellite obs?"
- Satellites will become more important over time as the period of record lengthens. This is especially true for variables that aren't in the in situ wheelhouse (i.e., variables other than surface temperature and, to some extent, precipitation).

Some Specific Questions
"Without getting into the weeds, what are the key differences between the NOAA monthly analyses and those of other labs (NASA, Hadley, JMA)?"
- Short version: the main difference lately is how aggressive or conservative we are in data-sparse areas (Antarctic, Africa, Arctic).
- Especially the Arctic, which is warming at a substantially different rate than the rest of the planet.
- Also, minor differences in station composition, quality assurance, and how missing data are handled.
- We also present against different base periods, which can bother people, although this has no effect on trends.

Some Specific Questions
"Is there a way to know how much data (by percentage) is discarded or deemed inferior for the analyses?"
- Two ways to look at this: rejecting garden-variety bad data, and identifying station discontinuities.
- The flag rate for individual observations is about a quarter of a percent.
- The vast majority of stations have at least one detected break in their record.

Resources

NCEI FAQ and Similar Resources
About global temperature uncertainty: https://www.ncdc.noaa.gov/monitoring-references/faq/global-precision.php
About anomalies vs. absolute temperatures: https://www.ncdc.noaa.gov/monitoring-references/faq/anomalies.php
Shorter version of the same: https://www.ncdc.noaa.gov/monitoring-references/dyk/anomalies-vs-temperature

Data Specific to this Presentation
Global, US temperature time series:
- Climate at a Glance: http://www.ncdc.noaa.gov/cag
- Also for states and for climate divisions within states
Raw and adjusted HCN data:
- http://www1.ncdc.noaa.gov/pub/data/ghcn/v3/
- qcu files: unadjusted (raw)
- qca files: adjusted
Comparison of CONUS temperature methods to CRN:
- http://www.ncdc.noaa.gov/temp-and-precip/national-temperature-index/