Air Quality Screening Modeling


Air Quality Screening Modeling: 2007 Meteorology Simulation with WRF
OTC Modeling Committee Meeting, September 16, 2010, Baltimore, MD

This presentation is based upon the following technical reports available from the OTC:
Sensitivity testing of WRF physics parameterizations and protocol for meteorological modeling in support of regional SIP air quality modeling in the Ozone Transport Region (OTR)
Assessment of 2007 WRF meteorological modeling in support of regional air quality modeling in the Ozone Transport Region (OTR)

Design Approach and Conclusions
Simulated WRF meteorological fields were found to be acceptable, based on performance metrics for surface-level temperature, wind speed and direction, water vapor, precipitation, wind profiler comparisons, cloud cover, and planetary boundary layer (PBL) height
Collaborative process with other RPOs, with assistance from the University of Maryland
Participating states were GA, IA, MD, NC, NY, and VA
Benchmark and sensitivity simulations were set up to develop a common configuration

Initial Work
The goal was to ensure that WRF simulations performed by each state modeling center would yield no differences in the simulated meteorological fields when using a common set of input data, prepared by UMD for a 20-day period in summer and winter of 2007, with an agreed-upon vertical and horizontal framework for the WRF simulation
This benchmark effort, however, showed significant differences between the state simulations, suggesting the need for a common hardware/software configuration
Using a statically compiled WRF executable with 8 CPUs and a common operating system (OS), the numerical differences between the state modeling centers were found to be minimal
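As an illustration of the kind of benchmark check described above, the sketch below compares a few surface fields from two WRF output files and reports the largest absolute difference per variable. The file names and the variable list are placeholders, not the actual benchmark protocol.

# Minimal sketch (not the OTC benchmark tool): compare selected fields from two
# wrfout files produced by different modeling centers and report the largest
# absolute difference per variable. File paths and the variable list are
# illustrative placeholders.
from netCDF4 import Dataset
import numpy as np

FIELDS = ["T2", "U10", "V10", "Q2", "PSFC"]  # common WRF surface diagnostics

def max_abs_diff(file_a, file_b, fields=FIELDS):
    """Return {variable: max |a - b|} for the listed fields."""
    diffs = {}
    with Dataset(file_a) as a, Dataset(file_b) as b:
        for name in fields:
            va = a.variables[name][:]
            vb = b.variables[name][:]
            diffs[name] = float(np.max(np.abs(va - vb)))
    return diffs

if __name__ == "__main__":
    # Hypothetical file names for two state modeling centers' benchmark runs
    for var, d in max_abs_diff("wrfout_center_A.nc", "wrfout_center_B.nc").items():
        print(f"{var}: max abs difference = {d:.6g}")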

WRF Sensitivity Comparison
Goal: determine a suitable configuration for the annual simulation
Options tested:
Planetary boundary layer (PBL) schemes: ACM2, MYJ, YSU, and modified Blackadar
Microphysics: WSM5 and WSM6
Land surface: NOAH and Pleim-Xiu (PX)
Time periods: summer and winter periods
Comparison data sets:
Techniques Development Laboratory (TDL) data from the National Weather Service (NWS): wind speed (with and without calms), temperature, and humidity
Clean Air Status and Trends Network (CASTNet) data: wind speed and temperature
Wind profiler data from NWS
Satellite cloud cover data from UMD
Precipitation data from NWS

Selected Configuration for the 2007 OTC WRF Simulation
WRF version 3.1
North American Mesoscale model (NAM) analysis fields as input to the 36 km and 12 km domains in a two-way nested mode
OBSGRID used for enhanced nudging with observed data
Model runs were made in five-and-a-half-day blocks with a 12-hour overlap
Annual modeling model options:
WSM6 microphysics
RRTM longwave and shortwave radiation
Pleim-Xiu land surface and surface layer models
Modified Blackadar planetary boundary layer (PBL) scheme
Kain-Fritsch cumulus parameterization
Analysis nudging applied for winds above and below the PBL
Analysis nudging applied for temperature and moisture above the PBL only
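The five-and-a-half-day block structure with a 12-hour overlap can be illustrated with a short sketch. The start date, block length, and overlap below simply restate the protocol; the function and variable names are illustrative, and this is not the actual OTC run script.

# Minimal sketch of the run segmentation described above: five-and-a-half-day
# blocks with a 12-hour overlap covering calendar year 2007.
from datetime import datetime, timedelta

def run_blocks(year=2007, block_days=5.5, overlap_hours=12):
    """Yield (start, end) pairs for overlapping WRF run segments."""
    start = datetime(year, 1, 1)
    year_end = datetime(year + 1, 1, 1)
    block = timedelta(days=block_days)
    overlap = timedelta(hours=overlap_hours)
    while start < year_end:
        end = min(start + block, year_end)
        yield start, end
        # The next block starts 12 hours before the current block ends,
        # so consecutive segments overlap for spin-up.
        start = end - overlap

for seg_start, seg_end in run_blocks():
    print(seg_start.strftime("%Y-%m-%d %HZ"), "->", seg_end.strftime("%Y-%m-%d %HZ"))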

WRF modeling domains at 36 km and 12 km, with the 12 km air quality modeling domain shown shaded and the MANE-VU domain shown in blue

WRF Simulation Assessment: TDL Data Monthly Statistics
Wind speed: WRF over-predicted (positive bias) in winter months and under-predicted (negative bias) in summer months; the normalized bias is higher in summer because of the low observed wind speeds
Temperature: WRF predictions exhibit a positive bias in winter and a negative bias in summer
Humidity: WRF overestimated the observed humidity
Overall: WRF performance is in line with the acceptable range of bias reported for meteorological modeling systems

An example over the 12 km domain: wind speed monthly statistics comparing the WRF simulation against TDL data (wind speed in m/s)

Wind Speed                  Jan     Feb     Mar     Apr     May     Jun     Jul     Aug     Sep     Oct     Nov     Dec
Sample_Size                 600076  541340  601626  582365  597352  574771  592688  590041  576115  599801  582704  606478
Mean_Obs                    3.61    3.84    3.73    3.82    3.04    2.62    2.34    2.26    2.53    2.90    3.23    3.12
Mean_Model                  3.71    3.97    3.67    3.73    2.89    2.45    2.18    2.10    2.55    2.98    3.37    3.21
STD_Obs                     2.57    2.74    2.72    2.78    2.38    2.21    1.97    1.97    2.21    2.43    2.58    2.64
STD_Model                   2.11    2.38    2.26    2.42    1.92    1.79    1.55    1.52    1.84    2.04    2.20    2.30
Mean_Bias                   0.11    0.13    0.06    0.09    0.15    0.18    0.16    0.16    0.03    0.07    0.14    0.09
STD_Bias                    1.77    1.84    1.82    1.77    1.67    1.62    1.53    1.57    1.59    1.64    1.69    1.73
RMSE                        1.78    1.84    1.82    1.78    1.68    1.63    1.54    1.57    1.59    1.64    1.70    1.74
Normalized_Mean_Bias (%)    3.03    3.27    1.52    2.27    4.95    6.73    6.69    7.20    1.04    2.54    4.22    2.81
Mean_Fractional_Bias (%)    24.57   22.68   20.08   17.98   22.31   25.04   28.19   30.11   35.60   33.06   31.91   30.95
Mean_Error                  1.29    1.34    1.34    1.34    1.27    1.22    1.17    1.18    1.19    1.21    1.24    1.25
Normalized_Mean_Error (%)   35.72   35.01   35.84   35.02   41.61   46.34   50.05   52.29   46.95   41.87   38.52   40.01
Mean_Fractional_Error (%)   58.24   57.29   60.84   61.61   74.43   83.54   88.52   92.07   84.48   75.07   67.76   70.97
Correlation_Coefficient     0.73    0.75    0.75    0.78    0.72    0.69    0.65    0.63    0.71    0.74    0.76    0.76
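The statistics in these tables are standard model-evaluation metrics. The sketch below computes them from paired observation and model arrays, assuming the conventional definitions (normalized metrics relative to the observed total, fractional metrics relative to the pairwise average); it is an illustration, not the tool used to produce the tables.

# Minimal sketch of the evaluation statistics reported in the tables, assuming
# the conventional definitions used in meteorological model evaluation.
import numpy as np

def evaluation_stats(obs, model):
    """Return a dict of paired model-vs-observation statistics."""
    obs = np.asarray(obs, dtype=float)
    model = np.asarray(model, dtype=float)
    diff = model - obs
    avg = 0.5 * (model + obs)
    return {
        "Sample_Size": obs.size,
        "Mean_Obs": obs.mean(),
        "Mean_Model": model.mean(),
        "STD_Obs": obs.std(),
        "STD_Model": model.std(),
        "Mean_Bias": diff.mean(),
        "STD_Bias": diff.std(),
        "RMSE": np.sqrt(np.mean(diff ** 2)),
        "Normalized_Mean_Bias (%)": 100.0 * diff.sum() / obs.sum(),
        "Mean_Fractional_Bias (%)": 100.0 * np.mean(diff / avg),
        "Mean_Error": np.abs(diff).mean(),
        "Normalized_Mean_Error (%)": 100.0 * np.abs(diff).sum() / obs.sum(),
        "Mean_Fractional_Error (%)": 100.0 * np.mean(np.abs(diff) / avg),
        "Correlation_Coefficient": np.corrcoef(obs, model)[0, 1],
    }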

Similarly, temperature monthly statistics over the 12 km domain (temperature in K)

Temperature                 Jan     Feb     Mar     Apr     May     Jun     Jul     Aug     Sep     Oct     Nov     Dec
Sample_Size                 615669  555571  618981  598522  618278  596032  613847  612687  595867  615711  595098  617887
Mean_Obs                    272.98  268.91  279.09  282.37  290.24  294.48  295.42  296.54  292.54  288.08  278.59  273.56
Mean_Model                  273.50  269.17  278.59  281.98  289.48  293.64  294.82  296.20  292.58  288.38  279.22  273.93
STD_Obs                     9.03    9.38    10.00   8.48    6.91    5.89    5.29    6.14    6.60    7.06    7.04    8.32
STD_Model                   8.02    8.63    9.23    7.82    6.44    5.68    5.10    5.66    5.75    6.21    6.31    7.67
Mean_Bias                   0.52    0.25    0.50    0.39    0.77    0.84    0.61    0.34    0.04    0.30    0.63    0.37
STD_Bias                    2.22    2.22    2.43    2.32    2.39    2.19    2.05    2.08    2.11    2.16    2.10    2.24
RMSE                        2.27    2.24    2.48    2.35    2.52    2.35    2.13    2.10    2.11    2.18    2.20    2.27
Normalized_Mean_Bias (%)    0.19    0.09    0.18    0.14    0.26    0.29    0.20    0.11    0.01    0.10    0.23    0.14
Mean_Fractional_Bias (%)    0.20    0.10    0.17    0.13    0.26    0.28    0.20    0.11    0.02    0.11    0.23    0.14
Mean_Error                  1.64    1.64    1.84    1.75    1.89    1.78    1.61    1.60    1.61    1.63    1.62    1.64
Normalized_Mean_Error (%)   0.60    0.61    0.66    0.62    0.65    0.61    0.55    0.54    0.55    0.57    0.58    0.60
Mean_Fractional_Error (%)   0.61    0.61    0.66    0.62    0.65    0.61    0.55    0.54    0.55    0.57    0.58    0.60
Correlation_Coefficient     0.97    0.97    0.97    0.96    0.94    0.93    0.92    0.94    0.95    0.96    0.96    0.96

An example over the MANE-VU region at 12 km grid spacing: wind speed monthly statistics comparing the WRF simulation against TDL data (wind speed in m/s)

Wind Speed                  Jan     Feb     Mar     Apr     May     Jun     Jul     Aug     Sep     Oct     Nov     Dec
Sample_Size                 124364  112077  124880  121021  123017  117069  121552  120333  118168  124638  121222  126654
Mean_Obs                    3.71    4.21    4.03    3.92    3.02    2.82    2.51    2.36    2.47    2.73    3.21    3.31
Mean_Model                  3.64    4.11    3.81    3.67    2.72    2.54    2.26    2.09    2.33    2.60    3.16    3.24
STD_Obs                     3.05    3.19    3.11    2.91    2.47    2.36    2.15    2.16    2.35    2.55    2.80    3.03
STD_Model                   2.22    2.56    2.32    2.39    1.73    1.67    1.56    1.49    1.60    1.81    2.08    2.39
Mean_Bias                   0.08    0.10    0.23    0.25    0.30    0.27    0.25    0.28    0.14    0.13    0.05    0.07
STD_Bias                    2.30    2.31    2.29    2.03    1.90    1.85    1.69    1.75    1.82    1.95    2.05    2.16
RMSE                        2.30    2.31    2.30    2.05    1.92    1.87    1.71    1.77    1.83    1.95    2.05    2.16
Normalized_Mean_Bias (%)    2.05    2.37    5.59    6.40    10.01   9.69    9.89    11.67   5.61    4.74    1.61    2.22
Mean_Fractional_Bias (%)    26.53   21.28   18.74   16.86   22.27   24.22   29.25   30.88   37.92   35.05   32.46   31.34
Mean_Error                  1.48    1.54    1.55    1.46    1.38    1.32    1.26    1.26    1.26    1.32    1.38    1.42
Normalized_Mean_Error (%)   39.99   36.66   38.44   37.30   45.52   46.96   50.10   53.44   51.20   48.18   42.83   42.70
Mean_Fractional_Error (%)   66.45   61.60   63.16   64.41   79.43   82.50   90.62   94.92   91.57   86.37   74.66   75.68
Correlation_Coefficient     0.66    0.70    0.68    0.72    0.64    0.62    0.63    0.60    0.63    0.65    0.68    0.71

An example over the MANE-VU region at 12 km grid spacing: temperature monthly statistics comparing the WRF simulation against TDL data (temperature in K)

Temperature                 Jan     Feb     Mar     Apr     May     Jun     Jul     Aug     Sep     Oct     Nov     Dec
Sample_Size                 130525  117480  131319  127015  131983  126779  131809  130928  127676  132503  127340  131827
Mean_Obs                    272.73  267.94  275.76  280.42  288.67  293.00  294.67  294.79  291.53  287.60  277.86  272.59
Mean_Model                  273.12  268.08  275.15  279.83  287.54  291.95  293.75  294.26  291.32  287.66  278.31  272.86
STD_Obs                     8.16    6.12    8.26    6.89    6.71    5.54    4.99    5.48    6.03    6.40    5.50    5.55
STD_Model                   7.06    5.42    7.16    6.02    6.06    5.16    4.71    4.85    4.89    5.25    4.74    4.93
Mean_Bias                   0.39    0.14    0.61    0.59    1.13    1.05    0.92    0.53    0.20    0.06    0.45    0.27
STD_Bias                    2.14    2.05    2.43    2.26    2.39    2.15    2.05    2.11    2.20    2.26    2.01    2.09
RMSE                        2.18    2.05    2.50    2.33    2.64    2.40    2.24    2.18    2.21    2.26    2.06    2.11
Normalized_Mean_Bias (%)    0.14    0.05    0.22    0.21    0.39    0.36    0.31    0.18    0.07    0.02    0.16    0.10
Mean_Fractional_Bias (%)    0.15    0.06    0.21    0.20    0.39    0.36    0.31    0.18    0.06    0.03    0.17    0.10
Mean_Error                  1.60    1.46    1.86    1.69    2.00    1.85    1.73    1.68    1.69    1.71    1.52    1.55
Normalized_Mean_Error (%)   0.59    0.54    0.67    0.60    0.69    0.63    0.59    0.57    0.58    0.59    0.55    0.57
Mean_Fractional_Error (%)   0.59    0.55    0.67    0.60    0.69    0.63    0.59    0.57    0.58    0.60    0.55    0.57
Correlation_Coefficient     0.97    0.94    0.96    0.95    0.93    0.92    0.91    0.92    0.94    0.94    0.93    0.93

Comparison of WRF and TDL daily wind speed and bias for August 2007, over the full 12 km domain and over the MANE-VU region. Each panel shows observed and modeled wind speed (m/s) by date, with the daily bias on a secondary axis.

Comparison of WRF and TDL daily temperature and bias for August 2007, over the full 12 km domain and over the MANE-VU region. Each panel shows observed and modeled temperature (K) by date, with the daily bias on a secondary axis.

Comparison of WRF and TDL daily humidity and bias for August 2007, over the full 12 km domain and over the MANE-VU region. Each panel shows observed and modeled humidity (g/kg) by date, with the daily bias on a secondary axis.
WRF reproduces the observed double peak; however, it over-predicts the early-morning peak and under-predicts the daytime minimum.
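The daily curves in these figures come from aggregating matched hourly observation/model pairs. A short sketch of that kind of aggregation is shown below; the column names and the choice of a simple daily mean are illustrative assumptions, not the exact procedure used for the plots.

# Minimal sketch of a daily aggregation behind plots like those above:
# given hourly matched observation/model pairs, compute daily mean observed
# and modeled values and the daily mean bias. Column names are illustrative.
import pandas as pd

def daily_series(pairs: pd.DataFrame) -> pd.DataFrame:
    """`pairs` has columns ['time', 'obs', 'model'] with hourly station data."""
    daily = (pairs.set_index("time")
                  .resample("D")[["obs", "model"]]
                  .mean())
    daily["bias"] = daily["model"] - daily["obs"]
    return daily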

CASTNet Data
Wind speed: WRF over-predicted wind speed compared to CASTNet measurements
Temperature: WRF predicted higher temperatures than those measured by CASTNet
Note: unlike the NWS TDL data, CASTNet data were not used in the FDDA (four-dimensional data assimilation) or nudging process; CASTNet measurements are also made at rural stations

Over the 12 km domain: wind speed monthly statistics comparing the WRF simulation against CASTNet data (wind speed in m/s)

Wind Speed                  Jan     Feb     Mar     Apr     May     Jun     Jul     Aug     Sep     Oct     Nov     Dec
Sample_Size                 36234   31973   35355   32630   34636   33309   34065   33890   33507   35871   35561   37073
Mean_Obs                    2.86    2.98    2.87    3.02    2.22    1.93    1.77    1.73    1.85    2.26    2.52    2.58
Mean_Model                  3.71    3.87    3.59    3.67    2.54    2.20    2.06    1.89    2.11    2.74    3.18    3.20
STD_Obs                     1.91    2.03    1.98    2.14    1.61    1.42    1.23    1.18    1.33    1.68    1.82    1.96
STD_Model                   2.03    2.31    2.17    2.29    1.72    1.59    1.41    1.35    1.54    1.85    2.01    2.25
Mean_Bias                   0.85    0.88    0.71    0.65    0.32    0.27    0.30    0.17    0.26    0.48    0.66    0.62
STD_Bias                    1.66    1.86    1.77    1.76    1.46    1.37    1.26    1.24    1.32    1.46    1.55    1.68
RMSE                        1.86    2.06    1.90    1.87    1.50    1.40    1.30    1.25    1.34    1.54    1.68    1.79
Normalized_Mean_Bias (%)    29.67   29.69   24.78   21.39   14.40   14.00   16.84   9.65    14.17   21.35   26.29   24.03
Mean_Fractional_Bias (%)    25.67   21.68   18.72   15.33   4.53    1.01    2.99    4.76    0.50    10.18   18.51   14.37
Mean_Error                  1.42    1.53    1.43    1.44    1.19    1.09    1.03    0.98    1.06    1.20    1.29    1.31
Normalized_Mean_Error (%)   49.60   51.31   49.87   47.82   53.47   56.67   58.26   56.90   57.15   53.09   51.35   50.64
Mean_Fractional_Error (%)   52.64   54.33   54.86   56.13   65.91   68.73   69.50   70.64   70.13   63.52   58.09   56.50
Correlation_Coefficient     0.65    0.64    0.64    0.69    0.61    0.59    0.55    0.53    0.59    0.66    0.68    0.69

Wind Profiler Data
Five sites: Baltimore, MD (BLTMD); Rutgers, NJ (RUTNJ); Stowe, MA (STWMA); Charlotte, NC (CHANC); and Raleigh, NC (RALNC)
Qualitative comparison of the measured vertical profiles of wind speed with WRF predictions

BLTMD site: wind profiler (blue) vs. WRF (red)

RUTNJ site: wind profiler (blue) vs. WRF (red)

STWMA site: wind profiler (blue) vs. WRF (red)

Comparison of Observed and WRF Cloud Fraction
Cloud fraction derived from GOES-12 satellite observations by UMD's Surface Radiation Budget (SRB) product
Cloud fraction estimates are made on an hourly basis at 0.5-degree resolution for an area bounded by 70-125 W longitude and 25-50 N latitude
WRF-predicted cloud fraction obtained from MCIP-processed WRF output
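To compare the two products cell by cell, the WRF cloud fraction has to be aggregated onto the 0.5-degree SRB grid. The sketch below is one simple way to do that with block averaging; it is not the MCIP/SRB processing chain, and the variable names are illustrative placeholders.

# Minimal sketch: average a WRF cloud fraction field onto a 0.5-degree grid so
# it can be compared cell by cell with the satellite-derived SRB cloud fraction.
import numpy as np

def to_half_degree(lat2d, lon2d, cldfra2d,
                   lat_edges=np.arange(25.0, 50.5, 0.5),
                   lon_edges=np.arange(-125.0, -69.5, 0.5)):
    """Block-average WRF grid-cell cloud fraction into 0.5-degree boxes."""
    lat, lon, cf = lat2d.ravel(), lon2d.ravel(), cldfra2d.ravel()
    # Sum of values and count of samples falling in each 0.5-degree box
    total, _, _ = np.histogram2d(lat, lon, bins=[lat_edges, lon_edges], weights=cf)
    count, _, _ = np.histogram2d(lat, lon, bins=[lat_edges, lon_edges])
    with np.errstate(invalid="ignore"):
        return total / count  # NaN where no WRF cells fall in a box

# Example comparison against an SRB field on the same 0.5-degree grid
# (srb_cf assumed to be a 2-D array aligned with the box edges above):
# wrf_cf = to_half_degree(XLAT, XLONG, column_cloud_fraction)
# valid = np.isfinite(wrf_cf) & np.isfinite(srb_cf)
# r = np.corrcoef(wrf_cf[valid], srb_cf[valid])[0, 1]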

Comparison of cloud fraction from observations (SRB) and WRF predictions (MCIP) for August 2, 2007 at 1700. General agreement in pattern over the domain.

Comparison of cloud fraction from observations (SRB) and WRF predictions (MCIP) for August 3, 2007 at 1700. General agreement in pattern over the domain.

Planetary Boundary Layer (PBL) Height Comparison
Lidar-based observed PBL height at CCNY
WRF-based PBL height at the CCNY grid cell, and the maximum value over the 9-by-9 block of grid cells surrounding the CCNY cell
The maximum over the 9-by-9 block is found to be closer to the lidar-based observed PBL height
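The extraction described above (value at the nearest grid cell plus the maximum over the surrounding 9-by-9 block) can be sketched as follows. XLAT, XLONG, and PBLH are standard wrfout variables; the CCNY coordinates and file name are approximate, illustrative values, and this is not the actual comparison code.

# Minimal sketch of the PBL-height extraction: WRF PBLH at the grid cell
# nearest a site and the maximum over the 9x9 block of cells centered on it.
import numpy as np
from netCDF4 import Dataset

CCNY_LAT, CCNY_LON = 40.82, -73.95  # approximate location of CCNY

def pblh_at_site(wrfout_path, t_index, site_lat=CCNY_LAT, site_lon=CCNY_LON):
    with Dataset(wrfout_path) as nc:
        lat = nc.variables["XLAT"][t_index]
        lon = nc.variables["XLONG"][t_index]
        pblh = nc.variables["PBLH"][t_index]
    # Nearest grid cell by simple squared-degree distance; adequate for
    # illustrating the selection of a single 12 km cell.
    j, i = np.unravel_index(
        np.argmin((lat - site_lat) ** 2 + (lon - site_lon) ** 2), lat.shape)
    window = pblh[max(j - 4, 0):j + 5, max(i - 4, 0):i + 5]
    return pblh[j, i], window.max()

# cell_value, neighborhood_max = pblh_at_site("wrfout_d02.nc", t_index=18)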

Comparison of PBL heights from lidar and WRF at the CCNY grid cell during 2007. Data are from 1000 to 1300 EST, shown with a 1:1 line and a best-fit line.

Precipitation
NCEP gridded Stage IV precipitation data, based on measurements, were obtained
Compared against WRF estimates on a monthly accumulation basis
Overall, WRF performed fairly well over the non-summer months, in terms of both amount and pattern
Over the months of April through September, WRF over-predicted precipitation amounts, except for August, when the model under-predicted precipitation because of the passage of Hurricane Dean in late August
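A monthly WRF precipitation total can be formed by differencing the model's cumulative rain fields at the start and end of the month, as sketched below. RAINC and RAINNC are standard wrfout accumulation variables; the sketch assumes they accumulate continuously over the period of interest and omits the bookkeeping needed to splice the overlapping 5.5-day run blocks, and the file names are illustrative.

# Minimal sketch of a monthly precipitation accumulation from WRF output.
from netCDF4 import Dataset

def monthly_accumulation(first_file, last_file):
    """Total precipitation (mm) between the first time of `first_file`
    and the last time of `last_file`, grid cell by grid cell."""
    def total_precip(path, t_index):
        with Dataset(path) as nc:
            return (nc.variables["RAINC"][t_index]
                    + nc.variables["RAINNC"][t_index])
    return total_precip(last_file, -1) - total_precip(first_file, 0)

# august_total = monthly_accumulation("wrfout_2007-08-01.nc", "wrfout_2007-08-31.nc")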

Comparison of measured (Stage IV) and WRF-predicted precipitation for the month of January 2007

Comparison of measured (Stage IV) and WRF-predicted precipitation for the month of August 2007

Summary and Conclusions
WRF simulations show good agreement with NWS TDL surface-based measurements
Similarly good agreement was noted with CASTNet data
WRF captured the cloud-cover pattern when compared with satellite-based data
WRF simulated low-level jets (LLJ) and compares well with wind profiler data
WRF-based estimates of planetary boundary layer (PBL) heights compared well with lidar-based PBL heights at the New York City location
WRF-simulated precipitation patterns are found to match measurements well