Daily OI SST Trip Report
Richard W. Reynolds
National Climatic Data Center (NCDC), Asheville, NC
July 29, 2005


I spent the month of July 2005 working with Professor Dudley Chelton at the College of Oceanic and Atmospheric Sciences, Oregon State University, in Corvallis. The purpose of my visit was to develop a higher-resolution sea surface temperature (SST) analysis. The facilities at Oregon State were quite good. I was given a private office with a door, which provided a quiet place to work and increased my efficiency. I was also provided a faculty card so I could use the university's recreation center and travel free on the Corvallis bus system. In addition, I had a two-bedroom duplex at no cost to NCDC. The NCDC link to my office PC (via VPN) was also very good. With it I had complete access to my PC and all the other computer services as though I were at my desk in the Asheville Federal Building. The only problem I had was a loss of service for two days following a computer maintenance procedure at NCDC.

1. Background

The original SST analysis (Reynolds and Smith 1994; Reynolds et al. 2002), henceforth OI.v2, uses optimum interpolation (OI) and is analyzed weekly on a 1° spatial grid. This analysis uses in situ measurements from ships and buoys as well as infrared satellite retrievals from the Advanced Very High Resolution Radiometer (AVHRR). Chelton and Wentz (2005) compared the OI.v2 with a daily analysis described by Thiébaux et al. (2003) and with satellite data from the passive microwave Advanced Microwave Scanning Radiometer - EOS (AMSR-E), where EOS is the Earth Observing System. The analysis described by Thiébaux et al. is called the Real-Time Global SST analysis (RTG_SST) and uses the same data used in the OI.v2. However, the RTG_SST has been run daily since 30 January 2001 on a 1/2° grid instead of weekly on a 1° grid, and it uses smaller spatial error correlation scales than those used in the OI.v2. Chelton and Wentz (2005) showed that the RTG_SST analysis agreed better with AMSR-E than the OI.v2 did, even though, as will be discussed below, the AVHRR data are often sparse because of cloud cover. Thus, it was proposed to design a new daily OI analysis on a 1/4° grid with smaller spatial error covariance scales. The analysis would be tested using AVHRR and in situ data. However, it would be designed to include other satellite data sources, including AMSR.
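The core of any OI analysis of this kind is a weighted blend of a first-guess field with observation increments, where the weights are set by the assumed spatial error correlation scales and noise-to-signal ratios. As a rough illustration, the sketch below shows a minimal one-dimensional OI update in Python; the Gaussian correlation function, the scale and noise-to-signal values, and the function and variable names are illustrative assumptions, not the actual analysis code.

```python
import numpy as np

def oi_update(first_guess, grid_km, obs, obs_km, scale_km=150.0, noise_to_signal=0.5):
    """Minimal 1-D optimum interpolation update (illustrative only).

    first_guess     : first-guess SST on the analysis grid
    grid_km, obs_km : positions (km) of grid points and observations
    obs             : observed SST values
    scale_km        : assumed e-folding scale of the error correlation
    noise_to_signal : assumed observation-noise to signal standard-deviation ratio
    """
    # Gaussian error correlation between observation locations (signal part)
    d_oo = obs_km[:, None] - obs_km[None, :]
    c_oo = np.exp(-0.5 * (d_oo / scale_km) ** 2)

    # Correlation between analysis grid points and observation locations
    d_go = grid_km[:, None] - obs_km[None, :]
    c_go = np.exp(-0.5 * (d_go / scale_km) ** 2)

    # Observation-error covariance, expressed relative to the signal variance
    r = (noise_to_signal ** 2) * np.eye(len(obs))

    # Innovations: observation minus first guess interpolated to obs locations
    innov = obs - np.interp(obs_km, grid_km, first_guess)

    # OI weights and analysis increment
    increment = c_go @ np.linalg.solve(c_oo + r, innov)
    return first_guess + increment

# Example: three observations correcting a flat 20 C first guess
grid = np.arange(0.0, 1000.0, 25.0)            # 25-km analysis grid
background = np.full(grid.shape, 20.0)
analysis = oi_update(background, grid,
                     obs=np.array([21.0, 20.5, 19.2]),
                     obs_km=np.array([150.0, 400.0, 800.0]))
```

Smaller correlation scales confine each observation's influence to its neighborhood, which is what allows a higher-resolution analysis to retain sharper SST gradients.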

2. Preliminary Work

The weekly OI analysis method used by Reynolds et al. (2002) includes a preliminary correction of the AVHRR satellite data before it is used in the OI. This correction is necessary because the OI method assumes that the data do not contain long-term biases. The bias correction uses a Poisson technique to remove satellite biases relative to in situ data before the OI analysis is begun. This method has been used successfully. However, its major problem is that each correction is performed independently, so there is no time continuity of the correction. In most cases, the cause of the bias, for example the presence of volcanic aerosols, does persist in time. A new method was therefore developed using an OI bias correction, computed as an analysis of the difference between in situ data and each type of satellite data. To provide continuity in time, the OI bias analysis uses the preceding OI bias analysis as a first guess. This bias correction was initially computed weekly and compared with the Poisson method. Tom Smith (NCDC) computed spatial error correlations and signal-to-noise ratios. Because the in situ data were noisy, the scales were large and had to be assumed isotropic and homogeneous. The values that worked best were a bias noise-to-signal ratio (standard deviation) of 4 and a spatial error correlation scale of 1500 km. Examination of the results showed that the differences between the Poisson bias correction and the OI bias correction were modest. However, Tom Smith and I felt that the OI bias correction was superior because of its time continuity. Thus, the OI bias method will be used in the daily OI.

3. The Daily Analysis

To begin the daily analysis, it was first necessary to reacquire the data, since the original data had been averaged into weekly values on a 1° grid. The operational data that had been used were the version produced by the US Navy from AVHRR data (May et al. 1998). However, there was also a Pathfinder AVHRR reanalysis project (Kilpatrick et al. 2001). Pathfinder has the potential to be better than the operational set, because a reanalysis allows time to develop corrections to the AVHRR dataset when errors occur. It was decided to use both types of AVHRR data. In addition, AMSR data were obtained from Remote Sensing Systems (RSS) (http://www.ssmi.com) as gridded data for ascending and descending passes on a daily 1/4° grid. The in situ data were obtained from the International Comprehensive Ocean-Atmosphere Data Set (ICOADS).

The 1° OI code was modified to run daily on a 1/4° grid and to allow multiple data sets instead of the original single set. For this purpose the OI noise-to-signal ratios and correlation scales had to be modified. These scales had been computed by Tom Smith for weekly data. Using methods he derived, the spatial scales were scaled by a factor of 0.25 from the weekly scales and the noise-to-signal ratios were scaled by a factor of 1.3. The spatial scales are shown in Figure 1 and range from values below 100 km in regions of western boundary currents to values above 300 km in regions with small SST variance. The values are similar to the 100-400 km range used in the RTG_SST. The weekly OI bias correction required little change: it was modified to use the daily SST data files used by the OI and is computed daily using the most recent 7 days of data (see the sketch below).

Before I left North Carolina, the daily OI codes were run for both Pathfinder and operational Navy AVHRR data for 2003. However, there was only time to look in detail at January 2003. Once I arrived in Corvallis, the AVHRR and in situ data were extended back to 2002. In addition, the AMSR data were obtained from RSS and the daily OI was run from June 2002 (the start of AMSR) through the end of 2003. Thus, the following daily OI versions have been completed: AVHRR Pathfinder (January 2002 - December 2003), AVHRR operational Navy (January 2002 - December 2003), and AMSR (June 2002 - December 2003). All analyses include the same in situ data and were run with and without satellite bias correction.
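The key property of the OI bias correction is its time continuity: each day's bias analysis starts from the previous day's analysis and is drawn toward the most recent 7 days of in situ minus satellite differences, with the pull set by the noise-to-signal ratio. The sketch below shows this recursion for a single grid point, reduced to a scalar update; the super-observation treatment, the gain written from a noise-to-signal ratio of 4, and all names are simplifying assumptions for illustration, not the NCDC code.

```python
import numpy as np

def daily_bias_update(prev_bias, insitu_minus_sat_7day, noise_to_signal=4.0):
    """Schematic one-point OI bias update with time continuity (illustrative).

    prev_bias             : yesterday's bias analysis at this grid point (deg C)
    insitu_minus_sat_7day : collocated (in situ - satellite) differences from
                            the most recent 7 days near this grid point
    noise_to_signal       : assumed bias noise-to-signal ratio (std dev)
    """
    diffs = np.asarray(insitu_minus_sat_7day, dtype=float)
    if diffs.size == 0:
        # No data: the first guess (yesterday's bias) is carried forward unchanged.
        return prev_bias

    # Treat the 7-day mean difference as a single "super observation".
    super_obs = diffs.mean()

    # Scalar OI gain: signal variance / (signal + noise variance), written in
    # terms of the noise-to-signal standard-deviation ratio.
    gain = 1.0 / (1.0 + noise_to_signal ** 2)

    # Nudge the first guess toward the observed difference.
    return prev_bias + gain * (super_obs - prev_bias)

# Example: a persistent ~0.3 C in situ minus satellite difference emerging over several days
bias = 0.0
for day_diffs in ([0.28, 0.35], [0.31], [], [0.25, 0.33, 0.30]):
    bias = daily_bias_update(bias, day_diffs)
```

With a noise-to-signal ratio of 4 the gain is small (about 0.06), so the estimated bias evolves slowly and persists from day to day; this continuity is exactly what the independently computed Poisson corrections lacked.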

4. The Results

As can be expected (e.g., Chelton and Wentz 2005), the coverage of AMSR data is dramatically improved over that of AVHRR. To demonstrate this within the framework of the daily OI, the number of days with daytime data and with nighttime data during January 2003 is shown for both Pathfinder and AMSR in Figures 2 and 3. The same color scales are used for both figures. The results show the impact of clouds on AVHRR: most regions north of 40°N and south of 40°S tend to have only about 5 days of data during the entire month. The numbers increase in the tropics to roughly 15 days in cloud-free regions. The impact of clouds in tropical regions, such as the Intertropical Convergence Zone (ITCZ), is evident from the reduction in the number of days with data. The number of AMSR days with data is much greater, especially north of 40°N and south of 40°S. There are drop-offs along the ITCZ and in the tropics because AMSR cannot retrieve SSTs in regions with precipitating water. Figure 3 also shows a drop-off in daytime observations near 50°S due to sun glint. In addition, there is an as yet unexplained drop-off in the AMSR observations that occurs during the day along roughly 160°W and at night along roughly 20°E.
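Counts like those in Figures 2 and 3 are simply the number of days in the month with a valid (non-missing) retrieval in each 1/4° cell, tallied separately for daytime and nighttime passes. A possible sketch in Python follows; the array layout, the missing-value convention, and the function names are assumptions for illustration only.

```python
import numpy as np

def days_with_data(daily_fields, missing_value=-999.0):
    """Count the number of days with a valid retrieval in each grid cell.

    daily_fields : array of shape (n_days, n_lat, n_lon) holding one gridded
                   satellite SST field per day (e.g., a month of 1/4 degree data)
    Returns an (n_lat, n_lon) integer map, as in Figures 2 and 3.
    """
    valid = (daily_fields != missing_value) & np.isfinite(daily_fields)
    return valid.sum(axis=0)

# Example with synthetic data: 31 days on a coarse grid, ~40% of cells cloud-covered
rng = np.random.default_rng(0)
sst = 20.0 + rng.normal(size=(31, 180, 360))
sst[rng.random(sst.shape) < 0.4] = -999.0   # mark "cloudy" cells as missing

day_count = days_with_data(sst)             # computed separately for day and night passes
```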

Careful intercomparisons of the different OI versions showed large fluctuations in the bias correction. These differences were evaluated, and a problem was found in the OI bias program. The correction had been scaled to use 7 days of observations once a week. However, for the daily version a running set of the most recent 7 days of data is used every day, so the observations were usually repeated in the OI bias correction 7 times. This was important because the previous analysis was the first guess. To correct this, the noise-to-signal ratio was reduced by the square of the number of days repeated, 7. Once this was done, the OI bias correction became more stable and the corrections were similar to the weekly corrections. Unfortunately, there was only enough time to identify the problem and not enough time to reprocess the data. Thus, the results that follow show only the daily OI without the bias correction.

Figures 4-6 summarize the overall response of the daily OI and show interesting differences among the satellite systems. Figure 4 shows the 18-month average anomalies from the Pathfinder and operational Navy AVHRR analyses and their difference. The overall anomaly patterns are similar. However, the difference plot shows that Pathfinder is cooler in the tropics than the operational Navy product. These differences lie along regions of persistent cloudiness such as the ITCZ and the South Pacific Convergence Zone. Comparisons of the number of observations, not shown here, indicate an increase in the number of Pathfinder observations compared to the Navy product. This suggests that clouds may contaminate some of the Pathfinder data. The differences along 60°S are due to a problem with the Navy operational product. When the Navy moved from NOAA-16 to NOAA-17 in the spring of 2003, a low stratus cloud test, which had worked well for NOAA-15 and NOAA-16, was continued for NOAA-17. However, this test limited the coverage and eliminated too many good SSTs. The test was corrected 24 August 2004 (Dan Olszewski, personal communication, 2005).

Figure 5 shows a similar comparison of the long-term mean anomaly for Pathfinder and AMSR. The tropical differences again suggest cloud contamination of the Pathfinder product. However, since the differences are stronger than those shown in Figure 4 and are spatially similar, clouds may contaminate both AVHRR products. In the tropical North Atlantic, dust from the Sahara Desert is blown toward the west and frequently causes negative AVHRR biases, particularly in the summer. This appears to explain some of the differences shown in the Atlantic between the equator and 20°N.

Figure 6 shows the standard deviation of SST anomalies for Pathfinder and AMSR for the same period. The patterns are similar. However, the difference map shows that the variability for Pathfinder AVHRR is greater along about 10°N. Since this occurs along a region that may have AVHRR cloud contamination, as suggested by Figures 4 and 5, cloud contamination may be a source of artificial variability in AVHRR. The difference map also shows that AMSR has greater variability than Pathfinder outside the tropics. This may be due to better AMSR sampling since, as shown in Figures 2 and 3, AMSR has more days with data south of 40°S and north of 40°N.

Examples of daily SST fields and their gradients are shown in Figures 7 and 8. From top to bottom, the SST fields are: the Pathfinder AVHRR data, the OI.v2 analysis, the RTG_SST analysis, the AMSR data, the daily OI using Pathfinder, and the daily OI using AMSR. These figures are similar to those in Chelton and Wentz (2005), except that the daily OI is not included there. Figure 7 shows these fields and their gradients for the eastern tropical Pacific for 28 May 2003. In this region, as reported by Chelton and Wentz (2005), tropical instability waves are frequently observed just north of the equator. Chelton and Wentz reported that the gradients in AVHRR and AMSR were reduced in the RTG_SST analysis and severely reduced in the OI.v2. These results are clearly shown in Figure 7. The figure also shows that these gradients have been preserved in the daily OI. Furthermore, the gradients in both daily OI versions are similar and in both cases sharper than in the RTG_SST. The meridional and zonal length scales in the daily OI (see Figure 1) were both less than 100 km. In the RTG_SST analysis, the length scales ranged from 100 to 400 km based on climatological SST gradients. Since the gradients in the eastern tropical Pacific are smaller than in the western boundary regions, the RTG_SST scales are most likely larger than those used in the daily OI.
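The gradient panels in Figures 7 and 8 show the magnitude of the horizontal SST gradient on the 1/4° grid. One possible way to compute such a field is sketched below, assuming Python and centered differences with the zonal spacing shrunk by the cosine of latitude; the function and variable names are illustrative.

```python
import numpy as np

EARTH_RADIUS_KM = 6371.0

def sst_gradient_magnitude(sst, lat_deg, lon_deg):
    """Magnitude of the horizontal SST gradient (deg C per 100 km), illustrative.

    sst     : 2-D SST field, shape (n_lat, n_lon)
    lat_deg : 1-D latitudes of the grid rows (degrees)
    lon_deg : 1-D longitudes of the grid columns (degrees)
    """
    lat = np.deg2rad(lat_deg)
    lon = np.deg2rad(lon_deg)

    # Grid spacing in km: meridional spacing is nearly constant, zonal spacing
    # shrinks with the cosine of latitude.
    dy_km = EARTH_RADIUS_KM * np.gradient(lat)                  # shape (n_lat,)
    dx_km = EARTH_RADIUS_KM * np.cos(lat)[:, None] * np.gradient(lon)[None, :]

    dsst_dy = np.gradient(sst, axis=0) / dy_km[:, None]         # deg C / km
    dsst_dx = np.gradient(sst, axis=1) / dx_km

    return 100.0 * np.hypot(dsst_dx, dsst_dy)                   # deg C / 100 km

# Example on a synthetic 1/4 degree sub-grid
lats = np.arange(-10.0, 10.0, 0.25)
lons = np.arange(220.0, 280.0, 0.25)
field = 27.0 - 0.05 * np.abs(lats)[:, None] + 0.01 * np.sin(np.deg2rad(lons))[None, :]
grad = sst_gradient_magnitude(field, lats, lons)
```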

Figure 8 shows similar results in a region including the Gulf Stream. Once again the gradients evident in AMSR are reduced in the RTG_SST and greatly reduced in the OI.v2. However, there is a very striking difference compared with Figure 7: there are almost no AVHRR Pathfinder data shown in the top panel of Figure 8. Furthermore, as shown in Figure 2, Pathfinder data were available for roughly only 5 days during January 2003, with no observations along some of the regions with the largest Gulf Stream gradients. Thus, the SST gradient features must be relatively stationary in time; otherwise the RTG_SST gradients would not appear similar to the AMSR gradients because of the different temporal sampling. The figure also shows that the gradients in the daily OI using Pathfinder are not as sharp as the RTG_SST gradients. Here, the meridional OI spatial scales are larger than 100 km in part of this region (Figure 1), while the RTG_SST scales must be at their minimum value of 100 km. However, the daily OI gradients sharpen dramatically when AMSR data are used instead of Pathfinder data.

5. Plans for Further Work

The daily SST analysis shows progress, and it is planned to produce an operational version using AVHRR data available back to 1985, which is the present start of the Pathfinder dataset. In addition, there will be another version using microwave and other satellite data for the more recent period, which would certainly include the full AMSR period beginning in June 2002. Before this is done, a number of tuning and other adjustments need to be made:

1. The present analysis is on a 1/4° grid. The microwave SST data available from RSS are also on a 1/4° grid. However, the OI grid needs to be shifted 1/8° meridionally to match the RSS grid. This change is needed because gradients become reduced when interpolating between different grids. This will require recomputation of all data products and of the land-sea mask, as well as modification of the OI code itself. Once this is done, the OI versions will be recomputed for the 2002-03 period. This will include the correction to the noise-to-signal ratio used in the OI bias. In addition, daily SSTs generated from sea-ice coverage will also be added.

2. The spatial scales and signal-to-noise ratios for the different data sets will also be reexamined. Tom Smith has computed a preliminary version using the daily OI. This version is similar to the results shown in Figure 1 but does not include a reduction in scales in the region of the eastern tropical Pacific. In this case the OI resolution will be reduced, as suggested by a comparison of the gradients in Figures 7 and 8. A further test run will be carried out using a constant 100-km spatial scale for comparison.

3. The constant spatial and signal-to-noise ratios for the OI bias correction will be retested using a simulation. This will be done using the bias differences between the Pathfinder and the operational Navy AVHRR products, as shown in Figure 4. In the simulation it will be assumed that one of the two products is 'truth' and the other is 'biased'. The true product will be subsampled at the scales of the actual in situ data with representative random in situ noise, and the biased satellite product will be corrected by the OI bias procedure. The root mean square (RMS) difference between the bias-corrected product and the true product will be computed using different scales. The scales with the lowest RMS difference will be selected (a schematic version of this experiment is sketched after this list).

4. The RTG_SST sometimes shows day-to-day noise that may be caused by bad data. It is very likely that this will also occur in the daily OI. Statistics of the day-to-day differences will be computed to help define typical differences. Cases with large differences will then be examined to determine how best to reduce this noise.
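To make item 3 concrete, the sketch below outlines one way such a truth/biased twin experiment could be scored in Python. The Gaussian-weighted local bias estimate is only a stand-in for the actual OI bias procedure, and the names, grid sizes, and noise levels are illustrative assumptions.

```python
import numpy as np

def simulate_bias_correction_skill(truth, biased, n_insitu=500, insitu_noise=0.5,
                                   scale_gridpoints=6.0, seed=0):
    """Schematic version of the planned scale-selection simulation (illustrative).

    truth, biased    : 2-D SST fields on the same grid; one product plays 'truth',
                       the other plays the 'biased' satellite product
    n_insitu         : number of simulated in situ observations
    insitu_noise     : std dev (deg C) of random noise added to the in situ samples
    scale_gridpoints : e-folding scale of the Gaussian weighting, in grid points
    Returns the RMS difference between the corrected product and the truth.
    """
    rng = np.random.default_rng(seed)
    n_lat, n_lon = truth.shape

    # Subsample the 'truth' at sparse pseudo in situ locations with random noise.
    ii = rng.integers(0, n_lat, n_insitu)
    jj = rng.integers(0, n_lon, n_insitu)
    insitu = truth[ii, jj] + rng.normal(0.0, insitu_noise, n_insitu)

    # Gaussian-weighted estimate of the (in situ minus satellite) bias at every
    # grid point -- a simple stand-in for the full OI bias correction.
    diffs = insitu - biased[ii, jj]
    lat_idx, lon_idx = np.meshgrid(np.arange(n_lat), np.arange(n_lon), indexing="ij")
    bias_field = np.zeros_like(truth)
    wsum = np.zeros_like(truth)
    for d, i, j in zip(diffs, ii, jj):
        w = np.exp(-0.5 * ((lat_idx - i) ** 2 + (lon_idx - j) ** 2) / scale_gridpoints ** 2)
        bias_field += w * d
        wsum += w
    bias_field /= np.maximum(wsum, 1e-6)

    corrected = biased + bias_field
    return np.sqrt(np.mean((corrected - truth) ** 2))

# Example: try several scales and keep the one with the lowest RMS difference
rng = np.random.default_rng(1)
truth = 20.0 + rng.normal(size=(60, 90)).cumsum(axis=1) * 0.05   # smooth synthetic field
biased = truth + 0.4                                             # uniform 0.4 C bias
scores = {s: simulate_bias_correction_skill(truth, biased, scale_gridpoints=s)
          for s in (2.0, 6.0, 12.0)}
best_scale = min(scores, key=scores.get)
```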

References

Chelton, D. B., and F. J. Wentz, 2005: Global high-resolution satellite observations of sea-surface temperature for numerical weather prediction and climate research. Bull. Amer. Meteor. Soc., 86, in press.

Kilpatrick, K. A., G. P. Podesta, and R. Evans, 2001: Overview of the NOAA/NASA advanced very high resolution radiometer Pathfinder algorithm for sea surface temperature and associated matchup database. J. Geophys. Res., 106, 9179-9198.

May, D. A., M. M. Parmeter, D. S. Olszewski, and B. D. McKenzie, 1998: Operational processing of satellite sea surface temperature retrievals at the Naval Oceanographic Office. Bull. Amer. Meteor. Soc., 79, 397-407.

Reynolds, R. W., N. A. Rayner, T. M. Smith, D. C. Stokes, and W. Wang, 2002: An improved in situ and satellite SST analysis for climate. J. Climate, 15, 1609-1625.

Reynolds, R. W., and T. M. Smith, 1994: Improved global sea surface temperature analyses using optimum interpolation. J. Climate, 7, 929-948.

Thiébaux, J., E. Rogers, W. Wang, and B. Katz, 2003: A new high-resolution blended real-time global sea surface temperature analysis. Bull. Amer. Meteor. Soc., 84, 645-656.

Figure 1. Spatial error covariance scales used in the daily 1/4° OI. The zonal scales are shown in the top panel; the meridional scales in the bottom panel.

Figure 2. Number of days with Pathfinder AVHRR data for January 2003 on a 1/4° grid. The daytime number is shown in the top panel; the nighttime number is shown in the bottom panel.

Figure 3. Number of days with AMSR-E data for January 2003 on a 1/4° grid. The daytime number is shown in the top panel; the nighttime number is shown in the bottom panel.

Figure 4. The average daily 1/4° OI anomaly for the 18-month period July 2002 - December 2003. The OI using Pathfinder AVHRR data is shown in the top panel; the OI using operational Navy AVHRR data is shown in the middle panel; the difference (top minus middle) is shown in the bottom panel.

Figure 5. The average daily 1/4° OI anomaly for the 18-month period July 2002 - December 2003. The OI using Pathfinder AVHRR data is shown in the top panel; the OI using AMSR-E data is shown in the middle panel; the difference (top minus middle) is shown in the bottom panel.

Figure 6. The standard deviation of the daily 1/4° OI anomaly for the 18-month period July 2002 - December 2003. The OI using Pathfinder AVHRR data is shown in the top panel; the OI using AMSR-E data is shown in the middle panel; the difference (top minus middle) is shown in the bottom panel.

Figure 7. Four SST analyses and two satellite data averages for the eastern tropical Pacific for 28 May 2003. The left panels show the SST; the right panels show the corresponding magnitudes of the SST gradients. From top to bottom the panels are: Pathfinder AVHRR data, Reynolds OI.v2 weekly analysis, NCEP RTG daily analysis, AMSR-E data, Reynolds Pathfinder daily analysis, Reynolds AMSR-E daily analysis.

Figure 8. Four SST analyses and two satellite data averages for the Gulf Stream for 29 January 2003. The left panels show the SST; the right panels show the corresponding magnitudes of the SST gradients. From top to bottom the panels are: Pathfinder AVHRR data, Reynolds OI.v2 weekly analysis, NCEP RTG daily analysis, AMSR-E data, Reynolds Pathfinder daily analysis, Reynolds AMSR-E daily analysis.