
Comparison of NYHOPS hydrodynamic model SST predictions with satellite observations in the Hudson River tidal, estuarine, and coastal plume region

Shashi Bhushan 1, Alan F. Blumberg 2, Nickitas Georgas 3

1 Graduate Student, Center for Maritime Systems, Stevens Institute of Technology, Castle Point on Hudson, Hoboken, NJ 07030, Ph: 281.673.2927, Email: sbhushan@absconsulting.com
2 George Meade Bond Professor and Director, Center for Maritime Systems, Stevens Institute of Technology, Castle Point on Hudson, Hoboken, NJ 07030, Phone: 201.216.5289, Email: Alan.Blumberg@stevens.edu
3 Senior Research Engineer, Center for Maritime Systems, Stevens Institute of Technology, Castle Point on Hudson, Hoboken, NJ 07030, Phone: 201.216.8218, Email: Nickitas.Georgas@stevens.edu

Abstract

The New York Harbor Observation and Prediction System, now in its third generation (NYHOPS v3), combines a network of real-time sensors with a hydrodynamic forecasting computer model to assess prevailing ocean, environmental, and meteorological conditions and to provide short- and long-term forecasts of those conditions. The older NYHOPS v2 model used spatially uniform surface heat flux forcing and neglected barometric pressure gradient forcing. The scope of this work was to assess the sensitivity of the NYHOPS Sea Surface Temperature (SST) predictions to the spatial variability of the surface boundary condition. We compared two runs using different meteorological forcing: 1) spatially varying wind stress and air pressure forcing, but spatially uniform heat flux forcing for the entire NYHOPS region (NYHOPS v2 surface boundary conditions with air pressure), and 2) spatially varying wind stress, air pressure, and heat flux forcing (NYHOPS v3 surface boundary conditions with air pressure). The SST modeled with NYHOPS was then compared against validated GOES-12 satellite SST mapped onto the NYHOPS grid nodes. A remarkable improvement (error reduction of roughly 60%) in the SST predicted by NYHOPS v3 was observed relative to NYHOPS v2. Similar error ranges were observed when the NYHOPS v3 modeled SST and the satellite SST were each compared against in-situ observations, indicating that NYHOPS provides an effective SST prediction tool.

Introduction

The New York Harbor Observation and Prediction System (NYHOPS) consists of a network of sensors and an estuarine and coastal ocean forecasting hydrodynamic computer model.

NYHOPS performs a real-time assessment of ocean, weather, and environmental conditions in the New York Harbor region and forecasts those conditions in the short and long term. The sensor network comprises surface and bottom sensors, data buoys, and Acoustic Doppler Current Profilers (ADCPs) for measurement of pressure, currents, salinity, and water temperature. It also includes a CODAR (high-frequency radar) system for broad-scale measurement of currents and waves. In addition, the system utilizes commuter ferries with onboard sensors to measure salinity and temperature. Continuous observations of meteorological conditions are obtained from weather stations.

The modeling system is based on the Estuarine, Coastal and Ocean Model (ECOMSED), which has been derived from the Princeton Ocean Model (Blumberg and Mellor 1987). Its spatial coverage includes the NY/NJ Harbor, the Hudson River Estuary up to the Troy Dam, Long Island Sound, and the NY Bight out to the continental shelf. Forcing in the NYHOPS system includes 93 river systems, 241 freshwater discharges, and 39 power plants (Georgas and Blumberg 2010, this issue). Spatially varying, time-sequential wind stresses are provided from data extracted from the output of the North American Mesoscale (NAM) meteorological model. Surface heat fluxes are calculated with QUAL-2E-type formulations (Georgas and Blumberg 2008) from meteorological parameters provided by NAM and oceanographic parameters calculated by NYHOPS: wind speed, air temperature, relative humidity, cloud cover, and parameterized shortwave solar radiation are used. The NYHOPS v2 model used a spatially uniform heat flux: a one-dimensional, time-sequential heat flux input was extracted from NAM near JFK airport, which is centrally located in the NYHOPS domain. NYHOPS v2 also neglected the direct atmospheric pressure load.

The objectives of the work presented herein were the following:

1) To improve the NYHOPS sea surface temperature (SST) predictions by studying the sensitivity of NYHOPS to the following time-sequential, NAM-based meteorological forcings:
   a) spatially varying wind stress and air pressure, but spatially uniform heat flux in the entire NYHOPS region (NYHOPS v2 surface boundary condition with air pressure);
   b) spatially varying wind stress, air pressure, and heat flux (air temperature, relative humidity, cloud cover, and the parameterized shortwave solar radiation) (NYHOPS v3 surface boundary condition with air pressure).

2) To assess the SST obtained from the geostationary satellite GOES-12 against in-situ observations.

3) To compare the performance of the NYHOPS modeled SST under the above meteorological forcings against SST obtained from the GOES-12 satellite, thereby creating a procedure for evaluation of NYHOPS predicted SST against satellite SST.
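The QUAL-2E-type heat-flux formulation itself is not reproduced in this paper. Purely as an illustration of how a bulk surface heat-flux budget can be assembled from the listed inputs (shortwave radiation, air temperature, relative humidity, cloud cover, and wind speed), the following Python sketch sums shortwave, net longwave, sensible, and latent terms; the specific formulas, coefficients, and the function name net_surface_heat_flux are generic textbook assumptions, not the NYHOPS implementation.

```python
import numpy as np

# Illustrative constants (assumed values, not the NYHOPS/QUAL-2E coefficients)
STEFAN_BOLTZMANN = 5.67e-8   # W m-2 K-4
RHO_AIR = 1.2                # kg m-3
CP_AIR = 1004.0              # J kg-1 K-1
L_VAPOR = 2.45e6             # J kg-1
C_H = 1.3e-3                 # bulk transfer coefficient, sensible heat (assumed)
C_E = 1.5e-3                 # bulk transfer coefficient, latent heat (assumed)

def saturation_vapor_pressure(t_celsius):
    """Approximate saturation vapor pressure (hPa), Magnus-type formula."""
    return 6.112 * np.exp(17.67 * t_celsius / (t_celsius + 243.5))

def net_surface_heat_flux(q_sw, t_air, t_sea, rel_humidity, cloud_cover, wind_speed):
    """Generic bulk estimate of net surface heat flux (W m-2, positive into the ocean).

    q_sw: incoming shortwave radiation (W m-2); t_air, t_sea in deg C;
    rel_humidity and cloud_cover as fractions (0-1); wind_speed in m s-1.
    """
    # Shortwave absorbed at the sea surface (6% albedo assumed)
    q_short = (1.0 - 0.06) * q_sw

    # Net longwave loss with a simple cloud correction (illustrative Berliand-type form)
    t_sea_k = t_sea + 273.15
    e_air = rel_humidity * saturation_vapor_pressure(t_air)
    q_long = 0.97 * STEFAN_BOLTZMANN * t_sea_k**4 \
        * (0.39 - 0.05 * np.sqrt(e_air)) * (1.0 - 0.6 * cloud_cover**2)

    # Sensible heat loss (bulk formula)
    q_sensible = RHO_AIR * CP_AIR * C_H * wind_speed * (t_sea - t_air)

    # Latent heat loss (bulk formula; specific humidities from vapor pressures)
    e_sea = saturation_vapor_pressure(t_sea)
    q_latent = RHO_AIR * L_VAPOR * C_E * wind_speed * 0.622 * (e_sea - e_air) / 1013.0

    return q_short - q_long - q_sensible - q_latent
```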

The period chosen for this study was from March 1, 2007 to June 17, 2007.

Data Preparation

Data Preparation for model forcing

The meteorological data were taken from the outputs of the NAM model. The Weather Research and Forecasting (WRF)-based NAM currently runs at 12 km horizontal resolution and 3-hour temporal resolution. These data were in the standard World Meteorological Organization GRIdded Binary (GRIB) format. WGRIB, an operational National Centers for Environmental Prediction (NCEP) program, was used to decode the GRIB files, and the decoded data were written in the ASCII and binary formats required by the NYHOPS model input files. The NAM data contained the u and v components of wind velocity, barometric pressure, air temperature, and relative humidity at its grid nodes. Since the NAM grid differs from the NYHOPS grid, the data were interpolated to the NYHOPS grid (Figure 1). The individual daily data files were then concatenated in increasing time order over the evaluation period.

Figure 1. NAM and NYHOPS model grids
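In the operational chain, NCEP's WGRIB utility decoded the GRIB files and separate routines wrote the NYHOPS ASCII/binary inputs. As a minimal sketch of the same decode-and-interpolate step, assuming the pygrib and scipy libraries and hypothetical file paths, variable short names, and grid-node arrays, one NAM field could be brought onto the NYHOPS grid nodes as follows; this is illustrative only, not the actual preprocessing code.

```python
import numpy as np
import pygrib                      # assumed GRIB reader; the operational workflow used NCEP's WGRIB
from scipy.interpolate import griddata

def nam_field_on_nyhops_grid(grib_path, short_name, nyhops_lon, nyhops_lat):
    """Read one NAM field from a GRIB file and interpolate it to NYHOPS grid nodes.

    grib_path, short_name, and the NYHOPS node arrays are placeholders; short_name
    would be e.g. '10u', '10v', 'prmsl', '2t', or '2r' for the fields listed in the text.
    """
    grbs = pygrib.open(grib_path)
    grb = grbs.select(shortName=short_name)[0]
    values, lats, lons = grb.data()
    grbs.close()

    # Scattered-to-scattered interpolation from NAM nodes to NYHOPS nodes
    points = np.column_stack([lons.ravel(), lats.ravel()])
    targets = np.column_stack([np.ravel(nyhops_lon), np.ravel(nyhops_lat)])
    return griddata(points, values.ravel(), targets, method='linear')
```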

Preparation of Satellite SST data

SST data from the GOES-12 satellite, which covers the eastern region of the United States, were used to assess the performance of satellite SST against in-situ SST taken from National Data Buoy Center (NDBC) stations. The validated GOES SST was then used to assess the performance of the NYHOPS modeled SST over the entire NYHOPS domain. The source of the data was the Global Ocean Data Assimilation Experiment, sponsored by the Office of Naval Research. The time frame of the acquired data was March 1, 2007 to June 17, 2007.

The National Oceanic and Atmospheric Administration's (NOAA) Office of Satellite Data Processing and Distribution generates SST from the images retrieved from the GOES satellite. A new cloud masking methodology based on a probabilistic (Bayesian) approach is used for improved retrieval accuracy; this algorithm provides SST with a probability of cloud contamination (Merchant et al. 2004).

The satellite data were mapped to the NYHOPS grid nodes using a nearest-neighbor algorithm. Data were not available for the Hudson River estuary grid nodes. After mapping the available data to the NYHOPS grid nodes, they were matched within ±30 minutes of the NYHOPS modeled SST hourly output time step and then used to create an hourly persistent time series (hourly gaps between two available time steps filled with the preceding value). NOAA's National Ocean Service (NOS) skill assessment software was then used to assess the satellite SST against in-situ observations.

Methodology for Model Performance Assessment

Skill Assessment

Model skill assessment (Hess et al. 2003), as the name suggests, measures the performance of a model validated against observations. The National Ocean Service (NOS) has developed a software package to check the performance of oceanographic models. The observed data from relevant stations were retrieved for analysis. The software handles sea surface elevation, water temperature, salinity, and currents, and includes tidal prediction, harmonic analysis, gap filling for missing data, and filtering routines. Three gap-filling methods are provided in the package, namely linear interpolation, cubic spline interpolation, and singular value decomposition (SVD). SVD was used in the present analysis because it gives better results for larger gaps. Fourier filtering is used to remove short-period variations and noise extrema. Table 1 lists the statistical variables computed in the skill assessment and their acceptance criteria. A target frequency of occurrence is used as an acceptance criterion. There are also statistics expressed as durations of errors, e.g. S(X), where S denotes the statistic and X the defined magnitude of error, which are compared against a target duration limit L.
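The nearest-neighbor mapping, ±30-minute matching, and hourly persistence described above for the satellite SST could be sketched as follows; the use of scipy's KD-tree and pandas merge_asof, as well as the function names, are assumptions for illustration and not the processing code actually used.

```python
import numpy as np
import pandas as pd
from scipy.spatial import cKDTree

def map_sst_to_node(sat_lon, sat_lat, sat_sst, node_lon, node_lat):
    """Nearest-neighbor mapping of satellite SST pixels to one NYHOPS grid node.

    A simple planar KD-tree lookup is used for illustration; the paper states
    only that a nearest-neighbor algorithm was applied.
    """
    tree = cKDTree(np.column_stack([np.ravel(sat_lon), np.ravel(sat_lat)]))
    _, idx = tree.query([node_lon, node_lat])
    return np.ravel(sat_sst)[idx]

def hourly_persistent_series(sat_times, sat_values, model_hours):
    """Match satellite SST to the hourly model output within +/-30 minutes, then
    fill the remaining hourly gaps with the preceding available value
    (the 'persistent' series described in the text)."""
    sat = pd.DataFrame({'time': pd.to_datetime(sat_times),
                        'sst': sat_values}).sort_values('time')
    model = pd.DataFrame({'time': pd.to_datetime(model_hours)}).sort_values('time')
    matched = pd.merge_asof(model, sat, on='time',
                            tolerance=pd.Timedelta('30min'), direction='nearest')
    matched['sst'] = matched['sst'].ffill()   # persistence between available passes
    return matched
```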

Variable                                            Criterion
Series Mean, SM                                     None
Root Mean Square Error, RMSE                        None
Standard Deviation, SD                              None
Negative Outlier Frequency, NOF(2X)                 1%
Central Frequency, CF(X)                            90%
Positive Outlier Frequency, POF(2X)                 1%
Maximum Duration of Positive Outliers, MDPO(2X)     L
Maximum Duration of Negative Outliers, MDNO(2X)     L
Worst Case Outlier Frequency, WOF(2X)               0.5%

Table 1. Standard criteria of acceptance

Stations for analysis of NYHOPS modeled SST performance

Figure 2 shows the stations chosen for the NYHOPS SST performance comparison among the three NYHOPS runs, i.e., NYHOPS v2, NYHOPS v2 with spatially varying time-sequential air pressure, and NYHOPS v3 surface boundary conditions with spatially varying time-sequential air pressure. Station names are listed in Table 2.

1) Hudson River at Albany, NY
2) Hudson River at Poughkeepsie, NY
3) Hudson River at South Dock at West Point, NY
4) Hudson River south of Hastings-on-Hudson, NY
5) George Washington Bridge, NJ
6) Hudson River at 79th Street, NY
7) Pier 40, NY
8) The Battery, NY
9) Sandy Hook, NJ
10) Atlantic City, NJ
11) NDBC Station 44009, Delaware Bay, 26 NM southeast of Cape May, NJ
12) NDBC Station 44025, Long Island, 33 NM south of Islip, NY
13) NDBC Station 44017, 23 NM southwest of Montauk Point, NY

Table 2. Stations used in NYHOPS surface boundary condition sensitivity runs

Stations Considered for Satellite SST Validation

For the purpose of assessing the satellite SST, the three NDBC stations of Table 2 (stations 11-13) were chosen.
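As a minimal sketch of the frequency-type statistics in Table 1 above, the following Python function assumes NOS-style definitions (CF(X) as the fraction of errors within ±X, POF(2X) and NOF(2X) as the fractions of errors beyond +2X and below -2X) and defaults to the 3 ºC water-temperature criterion cited later in the paper; the duration statistics (MDPO, MDNO, WOF) are omitted, and the function name is illustrative.

```python
import numpy as np

def skill_statistics(model, observed, x=3.0):
    """Frequency-type skill statistics patterned after Table 1.

    NOS-style definitions are assumed: CF(X) is the percentage of errors within
    +/-X, POF(2X)/NOF(2X) the percentages exceeding +2X or falling below -2X.
    X defaults to the 3 deg C water-temperature criterion used in the paper.
    """
    error = np.asarray(model, dtype=float) - np.asarray(observed, dtype=float)
    return {
        'SM': error.mean(),                                  # series mean (here, of the error)
        'RMSE': np.sqrt(np.mean(error ** 2)),
        'SD': error.std(ddof=1),
        'CF(X)': 100.0 * np.mean(np.abs(error) <= x),        # central frequency, %
        'POF(2X)': 100.0 * np.mean(error > 2 * x),           # positive outlier frequency, %
        'NOF(2X)': 100.0 * np.mean(error < -2 * x),          # negative outlier frequency, %
    }
```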

Figure 2. Observation stations used in this study

Performance Assessment of NYHOPS modeled SST against Satellite SST

Assessment of the performance of the NYHOPS modeled SST against satellite SST was carried out in two steps:

1) Skill assessments of the persistent satellite SST data and of the NYHOPS modeled SST were carried out separately against observed SST at the three NDBC stations mentioned above.

2) The available satellite data were matched within a ±30-minute window of the NYHOPS SST (creating a non-persistent series). A spatial comparison over the NYHOPS region was performed between the satellite SST available in the NYHOPS domain and the NYHOPS SST at the corresponding grid nodes. A correlation between satellite SST and NYHOPS SST was calculated at all NYHOPS grid nodes where satellite data were available. Some grid nodes were eliminated from the analysis on the basis of a hypothesis test: only those with enough satellite data to test whether a significant correlation existed between the NYHOPS SST and the time-matched satellite SST were included. The claim tested was "there is significant linear correlation." The parameter used for linear correlation is r (Bobko 2001):

H0: r = 0
H1: r ≠ 0

Under H0, the test statistic t = r·sqrt(n - 2) / sqrt(1 - r²) has a t distribution with n - 2 degrees of freedom.
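A minimal sketch of this significance test for a single grid node is given below; the use of scipy, the function name, and the minimum-sample cutoff of 10 (anticipating the threshold discussed with Figure 8) are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def significant_correlation(model_sst, satellite_sst, alpha=0.05, min_samples=10):
    """Test H0: r = 0 against H1: r != 0 for one grid node.

    The test statistic t = r * sqrt(n - 2) / sqrt(1 - r**2) has a t distribution
    with n - 2 degrees of freedom under H0. Nodes with fewer than min_samples
    matched pairs are excluded (a 10-sample cutoff is described in the text).
    """
    model_sst = np.asarray(model_sst, dtype=float)
    satellite_sst = np.asarray(satellite_sst, dtype=float)
    n = len(model_sst)
    if n < min_samples:
        return None, False                     # not enough matched data
    r = np.corrcoef(model_sst, satellite_sst)[0, 1]
    t = r * np.sqrt(n - 2) / np.sqrt(1.0 - r ** 2)
    p = 2.0 * stats.t.sf(abs(t), df=n - 2)     # two-sided p-value
    return r, p < alpha
```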

Root Mean Square (RMS) differences ("errors") between NYHOPS SST and satellite SST were also calculated over the NYHOPS domain to observe the spatial variability.

Results

Performance Assessment of NYHOPS modeled SST against in-situ observations

The previously mentioned NOS skill assessment software was used to obtain the RMS errors of the modeled SST against the observed SST for the following three model cases with different forcings:

1) NYHOPS v2 run
2) NYHOPS v2 with spatially varying time-sequential air pressure run
3) NYHOPS v3 with spatially varying time-sequential air pressure run

The modeled hindcast temperatures met the NOS standards for all the model runs, as all the statistical variables were within the accepted limits defined in the NOS acceptance criteria. The skill ranged from 0.96 to 1 for almost all stations and all models. The SST performance of the model improved with the inclusion of the spatially varying heat flux in NYHOPS v3. The central frequencies in all cases were 100%, with the accepted magnitude of error being 3 ºC (Patchen 2007). As the central frequency and skill values of the different models were similar at the different stations, RMS error values were used to compare the performance of the NYHOPS runs. As can be seen from Figures 3 and 4, when the model is run with spatially varying, time-sequential heat flux forcing, it performs better (approximately 60% reduction in SST RMS errors) than a run without the spatially varying heat flux forcing. As a result, the NYHOPS v3 run outperforms both the NYHOPS v2 run and the NYHOPS v2 run with spatially varying air pressure in predicting sea surface temperature.

Figure 3. Improvement in NYHOPS SST prediction due to spatially varying heat flux inclusion

Figure 4. Mean Absolute Error in SST predictions at different stations

Performance Assessment of Satellite SST and comparison with NYHOPS modeled SST

The skill values for the persistent satellite SST data compared to observations at the three NOAA NDBC stations ranged from 0.97 to 0.99. All other statistical variables were within the NOS standards defined in the Skill Assessment section. In addition, Figure 11 represents the average of the RMS error values provided with the satellite data (satellite/observation) (Merchant et al. 2004). The RMS error values are close to 1 ºC over the entire domain and are relatively greater in the Hudson River plume region.

The RMS errors obtained from the comparison of satellite SST against in-situ observations at the three NOAA NDBC stations were also close to 1 ºC (Figure 5). Similar RMS errors were reported by Maturi et al. (2008). Thus the persistent satellite SST series qualifies, both by NOS model performance standards and with reference to Maturi et al. (2008), as an SST model that provides relevant SST in the NYHOPS region.

The NYHOPS v3 simulated SSTs at the three open-water NOAA stations were analyzed against the corresponding observations using the NOS software. The NYHOPS v3 model skill ranged from 0.98 to 0.99. Figure 5 compares the RMS errors obtained from the skill tables, after running the NOS software, for the following two cases:

a) satellite persistent series SST against observed SST
b) NYHOPS v3 SST against observed SST

Figure 6 represents the mean absolute SST errors at the three stations. NYHOPS v3 performs better than a satellite SST persistence time series model at all three observation stations, and the two technologies, satellite and NYHOPS, give comparable results.

Figure 5. RMSE of Satellite SST and NYHOPS v3 SST

Figure 6. Mean Absolute Error of Satellite SST and NYHOPS v3 SST

Figure 7 shows the coverage of the satellite during the study period. The proportion of available data with respect to the maximum frequency obtained for a grid cell is plotted in this figure. The variable resolution of the NYHOPS grid contributes to the observed pattern (more observations fall within the bounds of larger cells).

Figure 7. Satellite coverage of GOES-12 in the NYHOPS region

Figure 8 shows the variability of the satellite SST over the study period. The gaps in this and the following figures are larger than those in Figure 7; they correspond to grid cells where fewer than 10 satellite SST data were collected within the evaluation period. The threshold of 10 was chosen on the basis of the hypothesis test explained earlier.

Figure 8. Standard Deviation of the Satellite SST data

Spatial Performance Assessment of NYHOPS models

Another objective was to compare the performance of NYHOPS forced with spatially uniform (NYHOPS v2) against spatially variable (NYHOPS v3) heat flux forcing over the full NYHOPS domain. For this purpose, RMS errors between the respective modeled SSTs and the satellite SSTs were calculated over the entire NYHOPS domain after matching the available satellite SST within ±30 minutes of the NYHOPS hourly output time step.
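A sketch of the per-node computation behind the spatial comparisons is given below: RMS differences and correlation coefficients are computed at each grid node from the time-matched series, keeping only nodes with at least 10 matches and a correlation significant at the 95% level. The array layout and function name are assumptions for illustration, not the analysis code actually used.

```python
import numpy as np
from scipy import stats

def spatial_rmse_and_correlation(model_sst, satellite_sst, min_samples=10, alpha=0.05):
    """Per-node RMS difference and correlation between matched model and satellite SST.

    model_sst and satellite_sst are assumed to be (n_times, n_nodes) arrays of
    time-matched values, with NaN where no satellite datum fell within +/-30
    minutes of the model hour (this layout is an illustrative assumption).
    Nodes with too few matches, or without a correlation significant at the
    95% level, are left as NaN.
    """
    n_nodes = model_sst.shape[1]
    rmse = np.full(n_nodes, np.nan)
    corr = np.full(n_nodes, np.nan)
    for j in range(n_nodes):
        valid = ~np.isnan(model_sst[:, j]) & ~np.isnan(satellite_sst[:, j])
        n = valid.sum()
        if n < min_samples:
            continue                                   # not enough matched data at this node
        diff = model_sst[valid, j] - satellite_sst[valid, j]
        rmse[j] = np.sqrt(np.mean(diff ** 2))
        r, p = stats.pearsonr(model_sst[valid, j], satellite_sst[valid, j])
        if p < alpha:                                  # keep only significant correlations
            corr[j] = r
    return rmse, corr
```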

Comparing Figures 9 and 10 shows that, over the entire NYHOPS domain, the modeled SST performance improves in the NYHOPS v3 run; this is reflected in the reduction of the compared SST RMS errors in the NYHOPS v3 case. To further assess these results, the strength of the linear correlation between modeled SST and satellite SST was tested over the entire NYHOPS domain wherever satellite SST data were available. Correlation coefficient spatial plots were made for the grid nodes for which it could be said with 95% confidence that a significant correlation existed between the matched NYHOPS v2 and NYHOPS v3 modeled SST, respectively, and the satellite SST (Figures 12 and 13). The correlation coefficients of the NYHOPS v3 modeled SST against the satellite SST were higher than those of the NYHOPS v2 modeled SST, complementing the RMS error plots.

Figure 9 represents the RMS errors obtained from the comparison of satellite SST against the NYHOPS v2 modeled SST. Figure 10 represents the RMS errors obtained from the comparison of satellite SST against the NYHOPS v3 modeled SST. Figure 11 represents the average of the SST RMS error values given alongside the satellite data (satellite/observation) (Merchant et al. 2004); this represents the inherent RMS error induced in the satellite data during extraction from the satellite images. Figure 12 represents the strength of the linear correlation between the satellite SST and the NYHOPS v2 modeled SST. Figure 13 represents the strength of the linear correlation between the satellite SST and the NYHOPS v3 modeled SST.

Conclusions

The study shows that sea surface temperature prediction improved greatly (~60% error reduction) with the inclusion of the spatially varying, time-sequential heat flux into NYHOPS (NYHOPS v3). The geostationary satellite GOES-12 provides relevant SST data in the NYHOPS offshore region. The NYHOPS sea surface temperature (SST) was also compared with the GOES-12 satellite SST. The NYHOPS-modeled and the satellite-inferred SST were tested for skill against in-situ observed SST; the error ranges were found to be similar, indicating that NYHOPS provides an effective SST prediction tool.

Figure 9. RMS difference ("error", ºC) between NYHOPS v2 SST predictions and satellite-inferred SST.

Figure 10. RMS difference ("error", ºC) between NYHOPS v3 SST predictions and satellite-inferred SST.

Figure 11. RMSE in the satellite-inferred SST data (internal quality control).

Figure 12. Correlation coefficient between satellite SST and NYHOPS v2 predictions.

Figure 13. Correlation coefficient between satellite SST and NYHOPS v3 predictions.

Acknowledgements

Dr. Wei Li, Research Associate Professor (Department of Physics and Engineering Physics, Stevens Institute of Technology), and Ms. Jee Ko, Research Engineer (Center for Maritime Systems, Stevens Institute of Technology), assisted greatly with this work.

References

Blumberg, A. F., and Mellor, G. L. (1987). "A description of a three-dimensional coastal ocean circulation model." Three-Dimensional Coastal Ocean Models, N. S. Heaps (Ed.), 1-16.

Bobko, P. (2001). Correlation and Regression: Applications for Industrial Organizational Psychology and Management. Sage Publications, California, 58.

Georgas, N., and Blumberg, A. F. (2008). The New York Bight Shelf Harbor Dynamic Study: Ocean Forecast Sensitivity to Forecasts of Atmospheric Forcing. Final Report, Stevens Institute of Technology, New Jersey, ONR grant N00014-06-1-1027.

Georgas, N., and Blumberg, A. F. (2010). "Establishing Confidence in Marine Forecast Systems: The design and skill assessment of the New York Harbor Observation and Prediction System, version 3 (NYHOPS v3)." Estuarine and Coastal Modeling, Proc. 11th Intern. Conf., Amer. Soc. Civ. Eng., this issue.

Hess, K. W., Gross, T. F., Schmalz, R. A., Kelley, J. G. W., Aikman III, F., Wei, E., and Vincent, M. S. (2003). NOS standards for evaluating operational nowcast and forecast hydrodynamic model systems. NOAA Technical Report NOS CS 17, 47.

Maturi, E., Harris, A., Merchant, C., Mittaz, J., Potash, B., Meng, W., and Sapper, J. (2008). NOAA's Sea Surface Temperature products from operational geostationary satellites, 1886.

Merchant, C. J., Old, C. P., and MacCullam, S. (2004). Bayesian cloud screening for GOES-12: Algorithm theoretical basis, 13.

Patchen, R. (2007). "Establishment of a Delaware Bay model evaluation environment." Estuarine and Coastal Modeling, Proc. 10th Intern. Conf., Amer. Soc. Civ. Eng., 783-818, 2008.