MeteoSwiss acceptance procedure for automatic weather stations


J. Fisler, M. Kube, E. Grueter and B. Calpini
MeteoSwiss, Krähbühlstrasse 58, 8044 Zurich, Switzerland
Phone: +41 44 256 9433, Email: joel.fisler@meteoswiss.ch

ABSTRACT

MeteoSwiss is the national weather service and the national meteorological authority of Switzerland. Over the last years MeteoSwiss has started to integrate surface observations from partner and private networks into its Data Warehouse (central data platform). The integration of these 3rd-party data sources raises the question of the quality of the data at disposal. MeteoSwiss has therefore decided to develop a new certification, or acceptance, procedure that allows each data source to be qualified and the data delivered to users with known quality flags. MeteoSwiss is currently implementing this procedure. This paper describes the principles and difficulties associated with the process and shows the value of an acceptance procedure that is applied both to MeteoSwiss's own surface network and to AWS sites from partner networks. It may also provide sound background information and a tool for decision makers in other National Meteorological Services.

1 Introduction

In 2010 the national weather service MeteoSwiss decided to develop a quality assurance tool to assess automatic weather stations (codename AWSCheck). The reason was that MeteoSwiss had started to use meteorological data from both public and private partners and wanted to be able to check its quality. The goal of AWSCheck is thus to make a statement about the quality of the data provided by an AWS and to label it according to its field of application. AWSCheck uses standards defined in meteorology and climatology to assess and rank an instrument as fully compliant (green), compliant (yellow) or not compliant (red).
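The three-level rating can be pictured as a simple ordered scale. The following is a minimal sketch; the rule of combining an instrument's per-criterion results by taking the worst one is our assumption for illustration, as the paper does not spell out a combination rule, and all names are ours.

```python
# Three-level AWSCheck rating scale (green/yellow/red).
# Aggregating per-criterion results by taking the worst one is an
# illustrative assumption, not a rule stated in the paper.

ORDER = {"green": 0, "yellow": 1, "red": 2}

def aggregate(criterion_ratings):
    """Combine per-criterion ratings into one instrument label."""
    return max(criterion_ratings, key=ORDER.__getitem__)
```

Under this assumption, a single partly compliant criterion is enough to lower an otherwise green instrument to yellow: `aggregate(["green", "yellow", "green"])` yields `"yellow"`.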
This paper describes AWSCheck in detail and presents the first results obtained by applying AWSCheck to MeteoSwiss's own SwissMetNet. These findings were used to optimize AWSCheck. Assessments of partner stations are planned for 2013 and will be conducted by a neutral third-party organization not related to MeteoSwiss. In this paper the organization conducting the assessment is called the auditor, whereas the owner of the AWS is called the operator (e.g. MeteoSwiss or a private company with its own weather stations).

2 Obtaining good measurements

According to WMO-CIMO (2010a), accurate surface observations are obtained by:

1. Selecting the proper sites
2. Designing the proper frame for holding the Automatic Weather Station (AWS)
3. Using high quality instruments
4. Ensuring proper maintenance and troubleshooting procedures
5. Working with a well-trained staff

AWSCheck includes these criteria for each instrument on an AWS, and each instrument is assessed separately: a given site can be well suited for one parameter (e.g. temperature) but not appropriate for another (e.g. precipitation or wind).

3 Quality assurance process using AWSCheck

The assessment process for an automatic weather station consists of three steps:

1. Pre-analysis: Does the AWS meet the basic requirements to be certified? Which instruments are installed? Can the operator provide AWS maintenance and history documentation?
2. Audit on site: Auditor and operator fill out the provided forms at the location of the AWS. As a result, AWSCheck gives an immediate 1st-level assessment for each instrument using the categories green, yellow and red (see below).
3. Final acceptance procedure: Based on at least one year of data, certain operating figures and quality checks are calculated: What is the data availability of the AWS? What is the maximum downtime of an instrument? Does the AWS provide data fast enough?

In a first step the auditor contacts the operator of an AWS and asks for documents containing:

- The exact location, warden contact information and other metadata about the AWS
- Lists of installed instruments, plans, station history, etc.
- Maintenance: duties and responsibilities of warden and operator, maintenance frequency
- Calibration documentation for each instrument
- If possible, a set of at least one year of meteorological data

Using this information the auditor is able to fill out parts of the AWSCheck assessment report. The next step consists of visiting the AWS together with the operator and/or warden. Filling out the report should be a mutual process. Reaching a consensus on parameters can be challenging (e.g. the evaluation of the site classification in a mountainous region). In cases where no agreement is possible, the auditor is responsible for taking a decision but adds the operator's objections to the final report. This step can include an agreement on certain improvements to be made by the operator (such as cutting trees, adjusting measurement heights or modernizing instruments) which lead to a higher classification of the AWS.
The agreement should also state how these measures are checked by the auditor after they have been completed (sending pictures, a signed agreement or revisiting the AWS). Based on the completed report, AWSCheck provides an assessment of each instrument installed on an AWS. The instruments are labelled using the following three categories:

fully compliant (green label): All WMO/CIMO requirements fulfilled; usable for climatology, meteorology and warning purposes.
compliant (yellow label): Most requirements fulfilled; usable for meteorology and warnings.
not compliant (red label): Some important WMO requirements not fulfilled! Be aware of the limitations and use data with caution, e.g. for weather warnings.

Table 1: The three quality categories that AWSCheck uses to label instruments on an AWS

After at least one year of collected meteorological data, the auditor can complete the last step of the assessment. An automatic data analysis tool implemented by MeteoSwiss calculates the following parameters and checks whether the defined thresholds have been reached:

- Measurement and transmission interval: do they comply with the defined requirements?
- Data delivery time: is the data delivered to the MeteoSwiss Data Warehouse within one sampling step after the integration time of the previous sample (typically less than 10 minutes after a measurement)?
- Data availability: how many data were delivered in total over a one-year period (typical threshold: better than 96% of the maximum achievable in one year)?
- Maximum downtime of an instrument: is it below 3 days (green) or 5 days (yellow) in 80% of the cases?
- Data quality and plausibility checks based on calculations using parallel measurements, neighbouring AWS or interpolations
- Other quality parameters that can be calculated and checked automatically

The data analysis step is planned for 2014 and will be implemented directly into the MeteoSwiss Data Warehouse, delivering reports for each AWS on a regular basis.
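The operating figures above lend themselves to automation. The following is a simplified sketch, not the MeteoSwiss implementation: it assumes a 10-minute measurement interval, applies only the 96% availability and 3-day/5-day downtime thresholds quoted in the text (ignoring the "in 80% of the cases" qualification and all other checks), and all function and variable names are ours.

```python
from datetime import datetime, timedelta

def rate_instrument(timestamps, interval_min=10):
    """Rate one instrument from the timestamps of measurements received
    over one year: availability vs. the expected count, and the longest
    gap between consecutive deliveries as the maximum downtime."""
    expected = int(timedelta(days=365) / timedelta(minutes=interval_min))
    availability = len(timestamps) / expected

    # Longest gap between consecutive deliveries = maximum downtime.
    max_gap = max(
        (b - a for a, b in zip(timestamps, timestamps[1:])),
        default=timedelta(days=365),
    )

    if availability >= 0.96 and max_gap <= timedelta(days=3):
        return "green"   # fully compliant
    if max_gap <= timedelta(days=5):
        return "yellow"  # compliant
    return "red"         # not compliant
```

A complete record at 10-minute spacing rates green; the same record with a four-day outage drops to yellow, and with a six-day outage to red. The real tool would additionally evaluate delivery delay and plausibility against neighbouring stations.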

4 Assessment of the automatic weather station and instruments

So far the assessment of an AWS includes the following basic meteorological variables: temperature, humidity, pressure, precipitation, wind, radiation and sunshine duration. An extension of AWSCheck to further variables will be considered in the future. For each variable AWSCheck contains parameters which can roughly be grouped into three categories:

Part I: Instruments: precision, heating, ventilation, protection against direct sunlight, etc.
Part II: Measurement site classification and mounting arrangement of instruments
Part III: Maintenance and monitoring

Part I: AWSCheck already includes lists of the most commonly used instruments and their manufacturers' specifications (data sheets), covering sensitivity and accuracy as well as technical information such as ventilation, heating, etc. These values are based on what manufacturers have published for their products; they are checked against the values published in the CIMO guide (WMO, 2010a) and rated accordingly. The auditor is also able to enter values manually if a certain instrument is not yet included in the list.

Part II includes a site classification according to Leroy (WMO, 2010b; p. 48) and an examination of correct mounting arrangements, mounting heights and emplacement. This part is likely to produce some lively discussions between the auditor and the operator. Nevertheless, it also holds a lot of potential for improving the measurement quality (e.g. by cutting trees that influence the measurement, choosing a better site, adjusting the measurement height, etc.).

Part III looks at how often the instruments are calibrated, controlled and maintained, and also whether there is an automatic monitoring system or a dual measurement to detect faulty instruments.

5 Adaptations for Switzerland

The CIMO guide gives guidelines with some degree of adaptation possible on certain parameters.
As the Swiss national weather service, MeteoSwiss uses these guidelines while also taking into account the specific mountainous orography and harsh environment of the country.

Measurement height: The CIMO guide indicates a measurement height for precipitation and temperature between 1.2 m and 2 m. MeteoSwiss has fixed the measurement height at 1.5 m for precipitation and at 2 m for temperature/humidity.

Calibration: The CIMO guide recommends calibration of pressure sensors on a yearly basis. MeteoSwiss experience shows that certain sensors require a new calibration only every 5 to 8 years, depending on the model. These adapted criteria are also considered within AWSCheck.

Maintenance: The CIMO guide recommends daily cleaning of pyranometers. MeteoSwiss relies on weekly cleaning by the warden.

Sunshine duration: CIMO defines the threshold for sunshine as 120 W/m². For historic reasons MeteoSwiss uses a threshold of 200 W/m².

Precipitation: WMO allows rain gauges with an orifice area of 200 to 500 cm²; SwissMetNet only tolerates an area of 200 cm².

Special considerations apply to the siting of instruments in Switzerland. In certain regions, such as mountainous terrain, it is not possible to strictly follow the CIMO recommendations.

Measurement height: In mountainous regions above 1000 m the snow cover can reach 1.5 m and more. Sensors are therefore sometimes installed higher than 2 m. To account for this, AWSCheck allows a measurement height of up to 3 m for mountain stations.
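The relaxed height rule for temperature/humidity sensors can be expressed as a one-line check. This is a sketch under two assumptions of ours: "mountain station" is taken to mean a site above 1000 m (the paper ties the relaxed limit to snow levels at such elevations), and the CIMO band of 1.2 m to 2 m is accepted as the lower bound; the function name is hypothetical.

```python
def temperature_height_ok(sensor_height_m, station_elevation_m):
    """Check a temperature/humidity sensor height against the Swiss rule:
    nominally up to 2 m, relaxed to 3 m for mountain stations
    (assumed here to mean sites above 1000 m elevation)."""
    upper = 3.0 if station_elevation_m > 1000 else 2.0
    return 1.2 <= sensor_height_m <= upper
```

For example, a sensor at 2.6 m would fail the check at a lowland site but pass at a station at 1600 m.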

Topography: The Leroy site classification assumes flat terrain without obstacles. In mountain regions or in steep valleys some of the Leroy criteria cannot be met. AWSCheck therefore allows higher Leroy classes at sites considered as special topography.

All these Swiss-specific adaptations are duly documented and referenced in the AWSCheck documentation: the goal is for AWSCheck to become an objective tool for the certification process. Its final rating (fully compliant / compliant / not compliant) must be traceable both for the user (the auditor) and for the operator of the AWS site. This opens the opportunity to understand why a site has not reached a given degree of certification and, possibly, to upgrade the AWS site and thereby improve its rating.

6 Possible implementation of AWSCheck

Currently AWSCheck is realized in Excel. This approach has certain limitations, and the aim is therefore to create a mobile application usable on a tablet or mobile device in the field. Figure 1 shows a possible implementation of the AWSCheck app.

Figure 1: Mock-up of a possible AWSCheck app for mobile devices and tablets

7 First results

AWSCheck has been applied as a test to five SwissMetNet stations. The findings from these tests were used to improve the assessment procedure. The most important findings:

- All sensors used on SwissMetNet stations were fully compliant with WMO standards (high precision, correct ventilation and shielding, etc.).
- Site classification according to Leroy is the most challenging parameter and requires an experienced auditor. If an instrument failed according to AWSCheck, the Leroy site class was in most cases responsible for the red label of an SMN station.
- There is a gap between the defined maintenance interval for certain tasks and how the warden fulfils the tasks in reality. In some cases the grass was not cut to an adequate height or the instruments were not clean. Since the audits were snapshots done without notifying the warden, it is not clear whether the lack of maintenance was an exception or the normal case. AWSCheck, however, only assesses the defined maintenance intervals; checking whether those commitments are met by the warden is the operator's job.
- Wind is measured in most cases at 10 m, even where the terrain roughness would require the sensor to be installed higher. See Figure 5 for more information.

Some examples of fully compliant, compliant and not compliant stations:

Figure 2: A fully compliant lowland (left) and mountain (right) AWS with high quality instruments, a very good measurement site according to Leroy and very good maintenance (not visible in the picture).

Figure 3: Two AWS with high quality instruments, but due to obstacles and a lower Leroy classification most instruments are ranked as compliant. Some instruments in the left picture could reach the fully compliant level if trees were cut.

Figure 4: Measuring precipitation on a roof (left), temperature on a tower (middle) or next to a road (right) is not compliant with climatological and meteorological standards.

Furthermore, the measurement heights of 160 MeteoSwiss stations were compared with the AWSCheck guidelines. The results show that for temperature, precipitation and wind the WMO requirements are fully met in about 90% of the cases (temperature 94%, precipitation 92%, wind 88%), with the remaining few percent split between partly compliant and not compliant. In the other cases either the AWS is at a location with heavy snow in winter, so that parameters are measured above the expected maximum snow level, or the sensors are deliberately installed on a tower (e.g. a radio/IT communication tower, see Figure 4) to measure upper-air conditions.

Figure 5: Measurement heights of 160 MeteoSwiss AWS compared with the AWSCheck guidelines. Green is fully compliant with the WMO guidelines, yellow is partly compliant and red is not compliant.

8 Conclusion

With AWSCheck it is possible to flag data based on an assessment of the automatic weather station (AWS) where the data is collected. This quality assessment should nevertheless be treated with caution, because the label green does not in every case stand for the best data and vice versa (a station flagged as red can deliver perfect data collected, e.g., at 5 m instead of 2 m; see below). MeteoSwiss therefore considers this tool not as a bulletproof certification procedure but as a set of guidelines data providers have to fulfil (comparable, e.g., to the guidelines defined by the German weather service; DWD, 2001). Furthermore, AWSCheck can be used to improve an AWS by giving specific suggestions on what to change (e.g. measurement height, instrument replacement, better siting) in order to achieve a higher data quality.

The first aim of AWSCheck is to provide metadata about how the surface observations are obtained. Users can then decide whether the data is usable for their work even if AWSCheck classifies an AWS site as not suitable for their specific field of application. An example could be a climate researcher using temperature data collected 5 m above ground from alpine weather stations: due to high snow levels the WMO recommendation of 2 m above ground cannot be fulfilled, so AWSCheck flags the data as usable for warnings only (red). Creating awareness of how meteorological data is collected and what its possible fields of application are is thus the first goal of AWSCheck. Allowing a national meteorological service to use surface observations from 3rd-party networks and to deliver this information to other data users with full confidence about the quality of the information provided is a second, equally important goal.

9 References

DWD (2001). Richtlinien für automatische Klimastationen. Offenbach, Deutscher Wetterdienst.

Kube, Marlen and Stocker, Catherine (2012). Qualitätssicherung von Messstationen. Fachkonzept SMN3, Version 0.6.
WMO (2010a). Guide to Meteorological Instruments and Methods of Observation. Geneva, World Meteorological Organization. WMO-No. 8 (2008 edition, updated in 2010).

WMO (2010b). Commission for Instruments and Methods of Observation, Report of the Fifteenth Session. Helsinki, 2-8 September 2010. WMO-No. 1064.