OSCAR - OCean Surface Current Analyses - Real-time (NOAA)
Methodology

Altimeter data processing | Sea level gridding | Vector data processing | Gridded surface currents | Data quality | Applications | Climate prediction | Products with GOES SST | Model comparisons


[Figure: Data flow and processing diagram]

Satellite Data Acquisition and Processing Functions: Operational surface currents will result from the flow of data from satellite to end product through the key processing steps described below.

Altimeter sea level data access and error analysis in near real time (Cheney): Altimeter Geophysical Data Records (GDRs) typically require several weeks or more for processing all the geophysical, sensor, and orbit ephemeris corrections, and are designed for research use where the highest accuracy is required. The NOAA/NESDIS Laboratory for Satellite Altimetry and other groups have developed interim GDRs (IGDRs) to meet operational needs with somewhat degraded accuracy. Cheney et al. (1997) show that these differ by less than 5 cm rms for Topex/Poseidon. During the project, it is likely that altimeter data will be available from three satellite series: Topex/Poseidon/Jason-1, Geosat Follow-On (GFO), and ERS/Envisat. Recently, Jason-1 and Envisat were launched to replace their predecessors Topex/Poseidon and ERS, respectively. Each of these new satellites and GFO will have near-real time capability such that processed sea surface heights should be available with a delay of less than 2 days. The result will be an operational flow of global altimeter data sufficient for constructing maps of sea surface topography, averaged over 7-10 day intervals, with at least 1x1 degree resolution and an accuracy of 5 cm or better. Additional details for each mission are given below:

  1. Topex/Poseidon/Jason-1. The highly successful Topex/Poseidon mission is the primary data source for the monthly maps we now produce for our ongoing research and for the Climate Diagnostics Bulletin. It has been operational since 1992 and will be continued with the 2001 launch of Jason-1. Both missions are sponsored by NASA and CNES. Jason-1 will provide identical coverage to 66 degrees latitude with a 10-day ground track repeat. Its high altitude and state-of-the-art instrument systems should yield the most accurate measurements yet for sea surface topography: approximately 4 cm for 1-sec (6 km) averages along the ground track, with a goal of reducing this figure to 2 cm. Data to be provided by NASA in near-real time will be slightly less accurate, approximately 5 cm, but adjustments can be performed to minimize the additional uncertainty, which arises primarily from the orbit determination.
  2. Geosat Follow-On (GFO). GFO is a U.S. Navy mission launched in 1998 which builds on the 4-year sea surface height record of the Geosat altimeter (1985-89). Like its predecessor, GFO provides coverage to 72 degrees latitude with a 17-day ground track repeat. Serious hardware and software problems during the first 2 years of the mission were overcome, and the satellite was declared fully operational in 2000. The Navy Doppler orbits have errors of several meters, so laser tracking through a commercial contract (Raytheon) is required. This is currently providing near-real time (3-4 day delay) IGDR data with an absolute accuracy of 10-15 cm, but because this error is due mostly to the satellite orbit determination, a level of 5 cm can still be achieved with routine ground processing (for mesoscale applications, the GFO profiles can be adjusted with respect to a mean surface; for climate-scale sea level signals, the GFO profiles can be adjusted relative to Jason-1). NOAA and Navy have worked closely together to establish a ground system, including orbit determination from laser tracking, that provides GFO altimeter data within 12 hours of observation.
  3. ERS/Envisat. ESA launched Envisat in 2001 to continue the decade-long record of sea height compiled by the ERS-1 and ERS-2 altimeters. Envisat will follow the ERS 35-day repeat orbit providing coverage to 82 degrees latitude. Near-real time sea surface height measurements are expected to fall between Jason-1 and GFO in terms of accuracy, but the same adjustments used for GFO will also enable Envisat to achieve the 5-cm level. Altimeter data from Envisat will be provided by ESA and CNES with a delay of 1-2 days.

Sea level objective analysis and latitude/longitude gridding (Mitchum): The available altimetric sea surface height (SSH) datasets will be analyzed routinely to produce gridded SSH fields for the estimation of the geostrophic surface current. The quality of the velocity calculations depends significantly on this analysis step, and it must be done with great care. The work proposed begins with the gridded analysis using Topex/Poseidon (T/P) data that has already been developed as part of the work described by Lagerloef et al. (1999). The current method first creates a 1/4° x 1/4° product and then spatially smooths those fields to obtain the final 1° x 1° product. In principle, therefore, higher resolution products are possible, but these have not yet been evaluated as carefully as the 1° x 1° product. We are now making the transition between the Topex/Poseidon and Jason-1 altimeters as the primary data resource. We also plan to put major effort into both improving the spatial resolution beyond the present 1° x 1° product and including data from ERS/Envisat and GFO. As noted above, the ERS time series will continue with Envisat, as the T/P series will continue with Jason-1, and thus the methods will apply for the foreseeable future. As part of the investigation into the best way to include the ERS/Envisat and GFO data, we will evaluate the tradeoffs inherent in using various multiple-altimeter combinations. All of the above work is planned for the first two years of the project, with the third year of effort devoted to the transfer of the appropriate software systems to NESDIS and the testing of the system there.

Sea Surface Height Gridding Technique: The gridding method begins by temporally interpolating the altimeter data to the times desired in the final output. These times are still spaced at approximately 10 days, so no interpolated value is more than 5 days from the time of observation. At this stage of analysis, we also apply corrections for tide model errors by notching out the alias periods and apply a correction for the low frequency T/P drift error using the estimate provided by Mitchum (2000). Finally, a correction for large-scale, but high frequency, alongtrack error is made, similar to that described by Le Traon et al. (1998). This leaves an alongtrack set of SSH values at common time intervals, assumed to be the most reliable estimates of the true SSH field.
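The temporal step above can be sketched as follows. This is a minimal illustration: the function name and its 5-day cutoff argument are our own, and the tide-alias notch filtering and T/P drift corrections are omitted.

```python
import numpy as np

def interp_to_analysis_times(obs_times, obs_ssh, analysis_times, max_gap=5.0):
    """Linearly interpolate along-track SSH to the analysis times.

    Values are kept only where an analysis time lies within `max_gap`
    days of an observation, mirroring the constraint that no
    interpolated value is more than 5 days from an observation.
    """
    ssh = np.interp(analysis_times, obs_times, obs_ssh)
    # Mask analysis times that fall too far from any observation.
    nearest = np.min(np.abs(analysis_times[:, None] - obs_times[None, :]), axis=1)
    ssh[nearest > max_gap] = np.nan
    return ssh

# One track point observed roughly every 10 days, interpolated to a
# uniform 10-day grid; the last grid time has no nearby observation.
obs_t = np.array([0.0, 9.9, 19.8, 29.7])
obs_h = np.array([0.10, 0.12, 0.08, 0.11])        # SSH in meters
grid_t = np.array([0.0, 10.0, 20.0, 30.0, 50.0])
print(interp_to_analysis_times(obs_t, obs_h, grid_t))
```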

The spatial gridding at each time point works by considering the SSH field to consist of a propagating signal plus a stationary signal. An example of the latter would be the mean seasonal cycle at mid-latitudes due to seasonal heating, which has large zonal scales but no temporal lags between different longitudes. In advance of the gridding calculations we create estimates of the propagation speed and the variance ratio of the propagating signals to the stationary signals on a global 1/4° x 1/4° grid using calculations on the entire T/P dataset. The production of the gridded set then proceeds by making an estimate of the SSH on the same 1/4° x 1/4° grid in a very simple fashion. Specifically, only the nearest T/P data point is used. The propagating portion of the signal is estimated by taking the data from that nearest point, but from an earlier or later time, depending on the value of the propagation speed estimate at that point. The stationary part of the signal is simply the nearest T/P data point. These two signal estimates are then averaged with weighting proportional to the variance ratio estimate that was made in advance. The result at this stage of the analysis is thus a 1/4° x 1/4° gridded field. This field is then smoothed with a 2-dimensional Gaussian filter to the desired final grid spacing. The skill at properly interpolating propagating ocean features is demonstrated with the superposition of the derived velocity fields and a GOES SST image of tropical instability waves on the equator (Fig. 2). These waves propagate westward with 10-20 day periods and alias the sampling during the 10-day T/P repeat cycle. A simple static 10-day SSH optimal interpolation would not properly account for this propagation.
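A minimal sketch of the propagating/stationary blend at a single grid point follows. The function name and the way the lag and variance ratio are supplied are assumptions for illustration; the operational code reads them from the precomputed global propagation-speed and variance-ratio maps.

```python
import numpy as np

def blend_estimate(track_ssh, lag_steps, var_ratio):
    """Blend the propagating and stationary SSH estimates at one grid point.

    track_ssh : SSH time series at the nearest T/P point
    lag_steps : integer time offset implied by the local propagation-speed
                estimate (sign depends on propagation direction)
    var_ratio : variance of propagating signal / variance of stationary signal
    Returns the blended estimate at the final (analysis) time.
    """
    stationary = track_ssh[-1]               # nearest point, analysis time
    propagating = track_ssh[-1 - lag_steps]  # same point, lagged time
    w = var_ratio / (1.0 + var_ratio)        # weight on the propagating part
    return w * propagating + (1.0 - w) * stationary

# Purely propagating signal (var_ratio large) takes the lagged value;
# purely stationary signal (var_ratio 0) takes the current value.
track = np.array([0.0, 1.0, 2.0, 3.0])
print(blend_estimate(track, 1, 1.0))
```

With equal variances the estimate is the simple average of the lagged and current values, which is the sense in which the weighting is proportional to the variance ratio.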

The analysis method has another significant advantage over some more complicated methods. If more of the T/P track data is averaged to make the initial 1/4° x 1/4° estimates, then it is inevitable that the grid estimates that lie between tracks will have smaller values than occur on the tracks, since nearly all interpolates are bounded by the endpoint values used to compute the interpolation. The net result is a "trackiness" in the variance of the gridded SSH field. Our method avoids this problem, producing variance fields that are smoothly varying in space. It was, in fact, this feature that motivated the development of this method in the first place. Not all interpolation methods necessarily suffer from this problem, and many researchers use optimal interpolation (OI), which is based on estimates of the signal and noise covariance functions, to solve the gridding problem. As pointed out by Chelton and Schlax (1991), however, such analyses should really be referred to as "suboptimal" interpolations because the required information about the signal and error covariance characteristics is rarely if ever known adequately. These authors argue that OI might be better viewed as a filtering technique, with the user-defined covariance functions defining the space and time scales of interest. It can be difficult, however, to determine how a specific set of covariance function estimates translates to a frequency/wavenumber response function. In our method, though, we have the advantage that the response function can be controlled exactly, albeit at the cost of the uncertainty in the 1/4° x 1/4° grid estimates.

Spatial Resolution: To date, our method has been primarily validated after smoothing to a 1° x 1° grid (Fig. 2, for example). There is evidence that we can also do well with higher resolution fields. First, we have successfully tracked mesoscale eddies from the Big Island of Hawaii for thousands of kilometers and for more than a year in time (Holland and Mitchum, 2001) using the 1° x 1° product, indicating that mesoscale variability is clearly retained in the higher resolution 1/4° x 1/4° grids that were used to generate the 1° x 1° grids. Second, we have compared our results to a very complex OI calculation described recently by Kuragano and Kamachi (2000). These authors defined covariance functions that were allowed to modulate in space and time, and which also included features necessary to describe propagating signals. This analysis is highly complex and very computationally intensive, but compared well to the SSH data obtained at tide gauge stations in the Pacific. Our analysis, however, did as well or better in the same tide gauge comparison. It has the benefit of being simple enough to allow rapid production of the products, which is an advantage for the operational product that is the desired endpoint of the proposed work.

Altimeter Gridding Tasks: The work that we propose falls into two sections. In the first year of the project we will begin producing routine gridded fields with existing algorithms and continue work to improve the resolution of the present gridded products, with the emphasis on obtaining reliable estimates of the ocean mesoscale variability. In parallel we will evaluate the improvements obtained by including data from the ERS and GFO missions. By including these data we hope to improve our higher resolution grids, but even if this is not possible, we expect that we will still be able to improve the present 1° x 1° product. As a first step, the present gridding method will be applied to the ERS and GFO data, with appropriate changes to accommodate the different space/time sampling of those altimetric systems. Since the ERS and GFO repeat periods are 35 and 17 days, the initial temporal processing described above for the T/P data will have to be modified, and determining the optimal way to do this is a task for the first year of the project.

During the second year of the project we will focus on evaluating the success of these improvements to the existing product, and will then iterate our algorithm development depending on the results of the evaluations. We will first repeat some of the evaluations that we have done for the 1° x 1° product. We have compared those gridded data to tide gauge data, and have also compared the gridded products to data along T/P tracks when the data from those particular tracks have been withheld from the analysis. In addition, Mitchum will collaborate with the lead PI (Lagerloef) on evaluations of the final surface velocity product. Since computation of the velocity field is very demanding of the SSH grid, because of the need to have accurate gradients of the field, these evaluations have in our experience been very useful in identifying strengths and weaknesses of the gridded SSH product. Finally, the addition of the NRL group to our team allows another powerful method of evaluation. This group is presently producing 1/4° x 1/4° numerical model simulations of the tropical Pacific that Mitchum has already been examining in a different context. Given the SSH simulations from this model, we can subsample the model output in a fashion identical to that of the altimeters, and then apply our gridding calculations to those "data". We can then compare the resulting 1/4° x 1/4° gridded SSH product to the original model output. Obviously this approach can be iterated to make further improvements in our gridding method, while still keeping our final product a purely data-based one. Mitchum is already looking at the NRL model output for another project, so many of the tools that we need to carry out this part of the proposed work are already in place.
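The model-subsampling evaluation can be sketched as a small observing-system experiment. All names here are hypothetical, and `nearest_fill` is a deliberately trivial stand-in for the actual propagating/stationary gridding algorithm; the point is only the subsample-regrid-score loop.

```python
import numpy as np

def subsample_and_score(model_ssh, track_mask, grid_fn):
    """Sample model SSH where the altimeter samples, re-grid those
    "observations", and score the result against the full model truth.

    model_ssh  : 2-D model SSH field (the "truth")
    track_mask : boolean array, True at simulated altimeter sample points
    grid_fn    : gridding function mapping sparse samples to a full field
    Returns the RMS difference between the gridded product and the truth.
    """
    obs = np.where(track_mask, model_ssh, np.nan)
    gridded = grid_fn(obs)
    return float(np.sqrt(np.nanmean((gridded - model_ssh) ** 2)))

def nearest_fill(obs):
    # Trivial stand-in gridder: fill gaps column-wise with the nearest
    # valid value along each column.
    filled = obs.copy()
    for j in range(filled.shape[1]):
        col = filled[:, j]
        valid = np.flatnonzero(~np.isnan(col))
        if valid.size:
            idx = np.abs(np.arange(col.size)[:, None] - valid[None, :]).argmin(axis=1)
            filled[:, j] = col[valid[idx]]
    return filled

truth = np.tile(np.linspace(0.0, 1.0, 16)[:, None], (1, 16))
mask = np.zeros_like(truth, dtype=bool)
mask[::4, :] = True                     # every 4th row "observed"
print(subsample_and_score(truth, mask, nearest_fill))
```

Iterating on `grid_fn` while holding the model truth fixed is exactly the loop described above, with the final product remaining purely data-based.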

During the final year of the project we will continue to evaluate and improve the gridded SSH products, but a major task will be to produce well-documented code and procedures for making the gridded SSH fields and to port these to Cheney's group at NESDIS. We also anticipate that we will go to NESDIS to work with personnel there to solve any problems that arise in creating the operational product.

Vector wind data acquisition in near real time and gridded daily maps (Bourassa): Vector wind processing will be provided as a shared contribution from Florida State University (FSU) as part of previously funded processing and analysis projects. The fields are based on an objective approach for gridding scatterometer winds developed by Bourassa and presently used for our monthly Climate Diagnostics Bulletin velocity maps. Several improvements to the process used for gridding NSCAT winds (Pegion et al. 2000) result in a more timely product and fewer rain-related problems. Gridded scatterometer winds are currently based on the research quality level 2 wind products. The system will be modified to ingest the near real time winds obtained from NESDIS. The research quality fields have been enhanced by utilizing data before and after the analysis time. A similar procedure will be applied to the near real-time product, resulting in a total delivery delay of approximately 24 hours. We will create daily-averaged 1° x 1° fields for the study region. Higher resolutions up to 1/4° will be examined experimentally in conjunction with the higher resolution SSH data described above. The production of the wind fields will be automated, and they will be made available through the Internet. We have secured a real-time feed of QuikSCAT data (as part of several other projects) through NOAA/NESDIS. This data stream is expected to be available through the lifetime of this project.

Vector Wind Gridding Technique: The functional requirements of our system include near-real time capability; objectively determined tuning parameters (i.e., weights); and treatment of inhomogeneous sampling of in-situ surface observations. Similar research quality fields are validated through ocean model comparisons and independent checks against surface observations. Weights are applied to each constraint in the cost function. There are three constraints: a misfit to observations, a smoothing term, and a misfit to curl. The second and third terms are relative to a background field. Currently, this background field is based solely on scatterometer observations. The creation of this background field is responsible for most of the delay in creating the objective wind field. Changes to this field can be examined with the goal of reducing the delay in production. The influence of the background field, relative to the observations, is controlled by the ratio of the weight for misfit to observations to the weights for the other constraints. For research quality scatterometer data, the influence of the background field is small compared to the influence of the observations. The weights required for the variational method will be optimized through cross validation (Pegion et al. 2000). Cross validation is an objective method for determining fitting values. Through a separate NOPP proposal (O'Brien, FSU), Bourassa and others plan to improve the quality of the near-real time wind product. These changes should lead to improvements in gridded fields for areas of light winds and frequent rain, such as a portion of the region proposed in this pilot study.
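A toy one-dimensional version of the three-term cost function makes the weighting explicit. The function and weight names are our own, and in one dimension a first derivative stands in for the curl term; the operational analysis is two-dimensional and vector-valued.

```python
import numpy as np

def wind_cost(u, u_obs, u_bg, w_obs, w_smooth, w_curl):
    """Toy 1-D analogue of the three-constraint variational cost:
    misfit to observations, smoothness, and misfit to "curl", with the
    last two measured relative to a background field. The weights are
    the tuning parameters that cross validation would select.
    """
    misfit = w_obs * np.sum((u - u_obs) ** 2)
    # Smoothness of the departure from the background field.
    d = u - u_bg
    smooth = w_smooth * np.sum(np.diff(d, 2) ** 2)
    # First derivative as a 1-D stand-in for the curl constraint.
    curl = w_curl * np.sum((np.gradient(u) - np.gradient(u_bg)) ** 2)
    return misfit + smooth + curl
```

Raising `w_obs` relative to the other two weights pulls the analysis toward the observations and away from the background, which is the ratio described in the text.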

Processing vector wind and sea level data into gridded surface current maps (Lagerloef): Background and methods: The surface currents are computed directly from the gridded surface topography and surface wind analyses described above. This employs a straightforward linear combination of geostrophic and wind-driven (Ekman) motion (Lagerloef et al., 1999). The technique is tuned to best represent the ageostrophic motion of the WOCE/TOGA 15 m drogue drifters relative to the surface wind stress. Geostrophic velocities are computed with sea level gradients derived from satellite sea surface height analyses. The mean altimeter surface height field is also subtracted and replaced by the mean annual 0-1000 dbar dynamic height derived from the NOAA/NODC atlas (Levitus et al., 1994a,b) to preclude the influence of marine geoid errors on the altimeter data. We applied the variational analysis of Special Sensor Microwave Imager (SSMI) winds (Atlas et al., 1996) as a proxy for satellite scatterometer vector winds for the initial development and testing of our approach because they provided continuity with the Topex/Poseidon data set from 1992 to the present. We have been incorporating NASA QuikSCAT vector winds since that satellite became operational in 1999.
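Away from the equator, the linear geostrophic-plus-Ekman combination can be sketched as follows. The mixed-layer depth `h` and drag parameter `r` are illustrative values standing in for the drifter-tuned parameters, and the simple linear-drag Ekman balance is a sketch, not the published formulation; the near-equatorial blend is treated separately below.

```python
import numpy as np

G, OMEGA, RHO = 9.81, 7.292e-5, 1025.0   # gravity, Earth rotation rate, density

def surface_current(ssh, taux, tauy, lat, dx, dy, h=32.5, r=2.2e-4):
    """Minimal sketch of the geostrophic + Ekman combination (valid away
    from the equator, where f is not small).

    ssh        : sea surface height [m] on a regular grid (rows = latitude)
    taux, tauy : wind stress components [Pa]
    lat        : latitude in degrees; dx, dy: grid spacing [m]
    h, r       : illustrative mixed-layer depth and linear drag parameters
    """
    f = 2.0 * OMEGA * np.sin(np.deg2rad(lat))
    dhdy, dhdx = np.gradient(ssh, dy, dx)        # sea level gradients
    ug, vg = -(G / f) * dhdy, (G / f) * dhdx     # geostrophic part
    # Linear Ekman model: rho*h*(r + i f)(ue + i ve) = taux + i tauy
    denom = RHO * h * (r**2 + f**2)
    ue = (r * taux + f * tauy) / denom
    ve = (r * tauy - f * taux) / denom
    return ug + ue, vg + ve
```

As a sanity check, a sea level rise toward the north with no wind yields westward geostrophic flow in the Northern Hemisphere.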

The difficulty of computing near equatorial geostrophic currents was treated by devising a weighted blend of the equatorial beta-plane and conventional f-plane geostrophic equations (also described in Lagerloef et al., 1999). Ekman currents were derived from a two-parameter linear model fitted to drifter data. This provided a formulation that allowed smooth transitions across the equator without the problem of the equatorial singularity where the Coriolis acceleration crosses zero. We found through subsequent analyses that the method was seriously deficient in the eastern Pacific in the vicinity of the equatorial cold tongue. Generally, the currents we derived were biased toward westward flow by as much as 0.1 to 0.4 m/s on the equator, and failed to reproduce the seasonal reversal in the boreal spring. The bias was from two sources: (1) the geostrophic method was overly smoothed in the meridional direction by the polynomial fit we used on the sea level gradients, and thus did not resolve a geostrophic minimum at the equator, and (2) the Ekman parameterization was not physically realistic enough to represent the strong vertical shear presented by the Equatorial Undercurrent (EUC) in the region. In the interim, F. Bonjean, working with Lagerloef, developed a more physically realistic model for deriving the near equatorial currents from gridded topography and winds (Bonjean and Lagerloef, 2002). The first step was to include a realistic shear model, based on Stommel (1960), for the Ekman component. This produces much more accurate results on the equator and becomes nearly identical to the Lagerloef et al. (1999) formulation poleward of about 5 degrees latitude. The second innovation was to apply a unique set of orthogonal polynomial basis functions, symmetric about the equator, to solve the geostrophic and Ekman terms across the equatorial singularity. With these improvements, the disparity on the equator between the mean satellite-derived currents and drifter climatologies is practically eliminated (Fig. 5). All velocity maps will be obtained from the new model.
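The idea of a smooth transition across the equator can be illustrated with a simple latitude-dependent weight. The Gaussian form and its e-folding scale below are assumptions for illustration only; they are not the Lagerloef et al. (1999) weighting nor the orthogonal-polynomial basis of Bonjean and Lagerloef (2002).

```python
import numpy as np

def equatorial_blend_weight(lat, scale=2.2):
    """Hypothetical Gaussian weight: ~1 at the equator (beta-plane
    solution dominates), ~0 poleward (f-plane solution dominates).
    `scale` is an illustrative e-folding latitude in degrees.
    """
    return np.exp(-(lat / scale) ** 2)

def blended_velocity(v_beta, v_f, lat):
    """Blend beta-plane and f-plane velocity estimates at latitude `lat`,
    giving a formulation that is smooth across the equatorial
    singularity where the Coriolis parameter crosses zero."""
    w = equatorial_blend_weight(lat)
    return w * v_beta + (1.0 - w) * v_f
```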

Data quality and error analysis using drifter, ship and moored observations (Lagerloef): It is often claimed that no data should be published without a good estimate of its measurement errors. The scientific and practical utility of the satellite surface currents depends upon their demonstrated reliability for particular applications. We will perform a thorough investigation to quantify the satellite surface currents' errors (the measured variance that is unrelated to the ocean variability of interest) and their response function (the extent to which they do measure the ocean's true variability). E. Johnson of ESR will focus on this effort, which will entail comparison of the satellite currents with in situ data from the global drifter network, moored current meters in the TAO array, and ship ADCP sections from the semi-annual NOAA TAO mooring maintenance cruises, as well as comparison with NRL operational ocean general circulation models (OGCMs). Three issues will be considered:

  1. The spatial and temporal error characteristics and covariances
  2. The relative errors from using altimeter operational IGDR versus more accurate GDR data
  3. Error reduction obtained by applying two or more altimeter data sets

Since satellite and in situ systems do not measure exactly the same quantities, all will produce errors, both geophysical and instrumental, with respect to the "true" ocean variability (generally taken as that variability measured in common by both instruments). The data-based error analysis will use the strengths of each individual data set to test different aspects of the satellite measurement. Current meter moorings maintained by the TOGA/TAO and related programs provide data with full time resolution but extremely limited spatial coverage; these can be used to calculate the satellite measurement's errors as a function of frequency at the few locations sampled. Frequency information is also available over the much larger spatial domain of the accumulated drifter data set. Since a drifter moves only with local currents, it samples predominantly in time and for our present purposes can be treated as a time series. These can be compared to pseudo-Lagrangian velocities extracted from the satellite currents along drifter tracks. We will analyze the frequency characteristics of the difference between the two data sets. This will provide vastly increased statistical confidence at time scales shorter than one year. A comparable analysis over spatial scale is possible using shipboard ADCP data (e.g., Johnson and Plimpton, 1999; Johnson and Luther, 1994). A moving ship samples predominantly in space, and the resulting sections can be treated as a snapshot of ocean variability. As before, we will extract matching satellite velocities and analyze the differences between the paired sections for their spatial scales. Both the drifter/satellite and the ADCP/satellite comparisons will be further sub-sampled as functions of latitude or longitude to give some indication of the spatial inhomogeneity of the above statistics. The objective is to provide our best rigorous assessment of the space/time error covariance matrix for this region of the ocean.
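The pseudo-Lagrangian drifter comparison can be sketched as follows. Nearest-neighbor sampling stands in for the full space/time interpolation actually required, and all names are hypothetical.

```python
import numpy as np

def along_track_velocity(grid_u, lats, lons, track_lat, track_lon):
    """Extract pseudo-Lagrangian satellite velocities along a drifter
    track by nearest-neighbor lookup in the gridded field.

    grid_u             : gridded zonal velocity, shape (nlat, nlon)
    lats, lons         : 1-D grid coordinate vectors
    track_lat/lon      : drifter positions at successive times
    """
    i = np.abs(lats[:, None] - track_lat[None, :]).argmin(axis=0)
    j = np.abs(lons[:, None] - track_lon[None, :]).argmin(axis=0)
    return grid_u[i, j]

def difference_spectrum(sat_u, drift_u, dt_days=1.0):
    """Power spectrum of the satellite-minus-drifter velocity difference,
    the frequency-domain error statistic described in the text."""
    diff = sat_u - drift_u
    power = np.abs(np.fft.rfft(diff - diff.mean())) ** 2
    freqs = np.fft.rfftfreq(diff.size, d=dt_days)   # cycles per day
    return freqs, power
```

Accumulating such spectra over many drifter segments, binned by latitude or longitude, gives the spatially resolved frequency-dependent error statistics described above.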

It remains a challenge to characterize such errors with the sparse and inhomogeneous in situ data available. We will therefore turn to an analysis of operational OGCM output from NRL, collaborating with G. Jacobs, in which the variability can be calculated exactly. Specifically, we will sub-sample the OGCM SSH fields as the satellite would, produce the equivalent surface current estimates for comparison to the OGCM velocities, and then reproduce the above error analyses over frequency and spatial scales. We can thus quantify the response of the satellite-based measurement to modeled ocean phenomena, also as a function of frequency and spatial scale. We will be able to derive two especially useful products. First, we can assign definite error characteristics to the satellite-based measurement as a function of frequency and spatial scale, rather than simply identifying a difference variance vis-a-vis a particular in situ data set having its own error characteristics. Second, we will be able to combine knowledge of the satellite current's sampling error and response function to evaluate its ability to measure various ocean phenomena.

NOAA CoastWatch applications, public accessibility and communication (Polovina): The Honolulu Laboratory, SWFSC, NMFS, NOAA conducts a wide range of research on fisheries and protected species in the central and western Pacific. In recent years, physical and biological data obtained from satellite remote sensing have been essential to many aspects of this work. For example, geostrophic currents estimated from Topex/Poseidon were used to simulate transport of lobster larvae around the Hawaiian Islands (Polovina et al. 1999), sea surface temperature (SST) measured from GOES was used to identify, track, and direct research sampling of cyclonic eddies in the Hawaiian Islands (Seki et al. 2001), and data from Topex/Poseidon, SeaWiFS, and satellite-based SST were used to identify migration habitat of loggerhead turtles in the central North Pacific (Polovina et al. 2000). The Hawaii Regional CoastWatch Node, funded by NOAA/NESDIS, is located at the Honolulu Laboratory. It serves as a regional focal point for the acquisition and processing of remotely sensed environmental data and the timely distribution of derived products to marine researchers, resource managers, and the general public. It currently has 330 registered users and many others who access products distributed on the web page (http://coastwatch.nmfs.hawaii.edu/).

One product resulting from a collaboration between the Honolulu Laboratory and Hawaii CoastWatch is a set of regional maps of Topex/Poseidon-estimated sea surface height and geostrophic currents. This product represents a reasonable first step in utilizing satellite altimetry data, yet it suffers from several shortcomings. First, it uses altimetry data aggregated over 1 degree of latitude and ten-day periods, neglecting propagation as described above. Second, the geostrophic equations fail close to the equator. Finally, it addresses only geostrophic currents and not Ekman transport. The estimated current vectors resulting from this proposal represent important improvements over our existing approach, since they will offer higher spatial and temporal resolution, include the equatorial region, and include both geostrophic and Ekman currents.

The ocean current product produced from this NOPP proposal will be used for various research and management investigations, including: i) identifying mesoscale ocean features, particularly fronts and eddies, in near real time so that they can be sampled by research cruises to improve our understanding of physical-biological links at these features; ii) monitoring the spatial and temporal dynamics of specific fronts and eddies which serve as habitat for pelagic fishes and protected species; and iii) using the ocean current vectors to drive a simulation model of the transport of various particles in the ocean, including fish larvae, marine debris, and oil spills. The operational data will be accessible to the extensive CoastWatch user community via a link or mirror site at the Hawaii CoastWatch web page (http://coastwatch.nmfs.hawaii.edu/). We will continue to communicate with users, promote the data use and dissemination, and evaluate users' responses and special needs.

Climate Prediction studies (Kousky): The Climate Prediction Center (CPC), NOAA, will evaluate the impact of the operational surface velocity data on El Niño-Southern Oscillation (ENSO) monitoring and prediction. A sophisticated ocean data assimilation system (ODAS) constructed by the Coupled Model Branch of the Environmental Modeling Center of NCEP is routinely used by CPC to monitor the subsurface temperature changes associated with ENSO. The same system is also used in the initialization of the coupled model for the prediction of tropical Pacific SST at NCEP. The ODAS assimilates only surface and subsurface temperature data and does not assimilate any ocean current data, and we find the quality of the ocean model surface currents is not satisfactory (Acreo-Schertzer et al. 1997). A comparison of drifting buoy and ocean model surface currents suggests that a large part of the discrepancies is associated with gaps in the drifting buoys' space-time sampling. When the model is sampled only at the buoy locations, the differences are reduced from 30 cm/s to 5 cm/s in the equatorial belt (Reynolds 1997). The satellite-derived ocean surface currents have much better spatial and temporal coverage than drifting buoys and current moorings together. These fields will be very valuable for evaluating the ocean model surface currents. Furthermore, the satellite-derived ocean current data can be assimilated into the ODAS, which will likely lead to improvements in ENSO forecast skill. The operational satellite-derived ocean surface current data set will also be used to monitor oceanic climate changes in real time, complementing other climate monitoring variables in the Climate Diagnostics Bulletin. Lastly, this ocean surface current data will be used in diagnostic studies of ENSO variability (cf. Fig. 3).

Merged products with GOES SST (Legeckis): During this study, the East and West geostationary satellites will continue to provide sea surface temperatures (SST) of the central and eastern equatorial Pacific at 30-minute intervals. Time lapse investigations of SST images have demonstrated the ability to detect motions of low frequency phenomena in the Pacific such as Tropical Instability Waves (TIWs) (Fig. 2), wind driven coastal upwelling, and the formation and propagation of eddies off Hawaii and Central America. The unique feature of the time lapse approach is the ability to separate high frequency cloud motions from low frequency surface ocean events. This is accomplished by forming daily composites of GOES SST to eliminate some of the clouds and then animating the composites at a rate of at least 5 days per second. The result is an ocean in motion.
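The daily compositing step exploits the fact that clouds are colder than the sea surface in the infrared, so a per-pixel maximum over a day of hourly images suppresses much of the cloud contamination. A minimal sketch, assuming NaN marks missing pixels; the operational compositing and cloud clearing is more elaborate.

```python
import numpy as np

def daily_warmest_composite(hourly_sst):
    """Composite hourly GOES SST over one day by taking the warmest
    valid value at each pixel.

    hourly_sst : array of shape (time, lat, lon), NaN where missing
    Returns the (lat, lon) warmest-pixel composite.
    """
    # Clouds are cold in the infrared, so the maximum over the day
    # tends to keep the cloud-free sea surface value.
    return np.nanmax(hourly_sst, axis=0)

# Three "hourly" frames for a 1x2 region: cloud-contaminated cold
# values and a missing pixel are rejected in favor of the warm ocean.
stack = np.array([[[20.0, np.nan]],
                  [[15.0, 21.0]],
                  [[np.nan, 19.0]]])
print(daily_warmest_composite(stack))
```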

Two GOES data sets will be used in evaluations with the velocity vectors. The first is the NESDIS operational, cloud cleared, hourly SST product developed by Paul Menzel and Fred Wu at NESDIS/NOAA. This has a resolution of about 6 km, covers the region between 180W - 30W and 60N - 45S, and spans SSTs between -3.15 and 35.1C at intervals of 0.15C. The second data set is an hourly selectable window from a GOES satellite at a resolution of about 4 km, which is used to view individual GOES infrared channels and to test SST algorithms. This data set can be optimized to detect ocean fronts; it is not cloud cleared, so the interaction of clouds and SST fronts is more apparent. The NOAA CoastWatch sites are now using and distributing these GOES SST products. They have proven useful in detecting ocean eddies in the lee of Hawaii, monitoring TIW events associated with La Nina, and observing coastal upwelling off the western coast of North and Central America.

The additional value gained in the interpretation of GOES observations by combining them with the operational satellite-derived surface velocity fields will be studied in detail. GOES provides an excellent view of wave propagation, and this observation can be combined with independent measurements of currents. It will be particularly useful to examine how well the TIW phase propagation is reproduced in the currents in comparison to the SST. Recent studies show evidence in QuikSCAT data of wind stress modulation coupled with SST fronts in the TIWs (D. Chelton, personal communication). There are also transient diurnal heating events evident in the GOES SST animations that would be useful for air-sea interaction studies. The combination of the velocity and SST data therefore offers insights into several interesting equatorial dynamics problems.

Comparisons with NRL model assimilation fields (Jacobs): Two ocean models running at the Naval Research Laboratory (NRL) in Mississippi are used operationally by the Navy. The models, known by their abbreviations NLOM and NCOM, each assimilate altimeter sea level data with different nudging techniques. NLOM also has a bulk mixed layer formulation, while NCOM employs a full Mellor-Yamada turbulence closure scheme, so the two models would be expected to differ in the wind-driven response of their surface currents. Our partnership is interested in evaluating the differences between these ocean model fields and the operational surface currents we propose here. The result will be a better understanding of the quality of both products. G. Jacobs will aid in accessing the data and collaborate in the interpretation of comparisons. The model fields will also be applied to the error analysis described above.

OSCAR Project Office
Earth and Space Research

1910 Fairview Ave E, Suite 210
Seattle WA 98102-3620

ESR webmast.oscar@noaa.gov