Northeast Fisheries Science Center
Oceanography Branch

NEFSC Scientific Computer System (SCS)
Alongtrack Data Processing

Contents

History
Purpose of this web site
How to access the data
Table of Cruise Notes w/links to figures & logs
Tables of variables recorded
Notes on recording and archiving data
Notes on processing data

History

During the GLOBEC years (1992-1999), the NMFS shipboard data was rarely processed. The raw data was stored on disk and then, in most cases, archived to 8mm tape and later to CDROM. Most of the GLOBEC cruises were eventually processed with a MATLAB routine called "procescs.m", and the processed data for those cruises was posted on the GLOBEC homepage.

In January 2000, with GLOBEC field work complete, we began developing a method to routinely load the NEFSC SCS shipboard data into the ORACLE database. This is a joint effort between the center's Data Management Services and the Oceanography Branch. The conventions developed to archive data in a standardized format are documented elsewhere.

Beginning in early 2001, we occasionally received salt sample calibration files from Joyce Dennecour (Narragansett Lab) and compared the Niskin bottle samples to the flow-through values. See the link in "notes on processing" below.

When the Henry Bigelow came on line in 2007, we started sending alongtrack data to the Shipboard Automated Meteorological and Oceanographic System (SAMOS) based at Florida State University. In 2008, the ALBATROSS and DELAWARE started sending data as well. A routine was devised to download the netCDF data posted by the SAMOS system and convert it to the ORACLE tables at our lab. The advantage of this system is that the data has already been merged and cleaned to some extent. The disadvantage, at first, was that not ALL the variables were being sent; salinity, for example, was not one of the parameters originally processed through this system. We are working out the bugs of this new system in 2008 and hope to make it the new standard routine.

Some ADCP processing has occasionally been done. Ideally, in the future, we would like to integrate the velocity data into the same database.

Purpose of this web site

This site simply provides a listing of processed cruises (see Table 1 below) with links to figures that were generated during the processing. The figures include basic time series and ship track plots so that the user can browse any particular cruise and determine what data is available. A log of notes associated with processing each cruise is included as well; the range of values specified as "acceptable" in the despiking operation is documented in the log file. Both the raw and "processed" data associated with each cruise are available on request. Note that the "processed" data is a single ascii file with one-minute samples of several variables (time, lat, lon, airt, temp, salt, fluorescence, u-wind, v-wind, barop, and humidity) that has also been loaded into ORACLE. At the time of this writing, it excludes other data like bottom depth, ship speed, etc.; an estimate of those variables can be made given the position.
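
For readers who want to parse the processed file directly, here is a minimal Perl sketch. The whitespace-delimited layout and the column order (matching the list above) are my assumptions, so check a sample file first:

    use strict;
    use warnings;

    # Sketch: read one-minute processed alongtrack records from ship.dat.
    # ASSUMED format: one record per line, whitespace-delimited, columns in
    # the order listed above -- verify against an actual file.
    open my $fh, '<', 'ship.dat' or die "cannot open ship.dat: $!\n";
    while (my $line = <$fh>) {
        chomp $line;
        next if $line =~ /^\s*(#|$)/;   # skip blank and comment lines
        my ($time, $lat, $lon, $airt, $temp, $salt, $fluor,
            $uwind, $vwind, $barop, $humid) = split ' ', $line;
        printf "%s %9.4f N %9.4f W  temp=%5.2f  salt=%6.3f\n",
               $time, $lat, $lon, $temp, $salt;
    }
    close $fh;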

How to access the data

One may access the data in one of two ways:

  1. find the processed ascii file in /jgofdata/ship/<cruise>/ship.dat (or ask me to post it on ftp if you do not have access to that disk)
  2. visit the "Alongtrack" section of the NEFSC Oceanography "data" page, which allows users to extract the processed data they need by time and/or geographic box from the ORACLE database
Table 1. List of Cruises and links to figures and logs (separate document)



Table 2.  Raw Data Model

Variable          Datatype      Units                      Standard ALBATROSS Filename   Standard DELAWARE Filename
ship              VARCHAR2
cruise_code       CHAR
year              NUMBER(4)
yrday1_gmt        NUMBER(8,4)   fractional days            all files                     all files
latitude          NUMBER(7,4)   degrees north              gps2trimble-gga.aco           dgps1-gga.aco
longitude         NUMBER(7,4)   degrees west               gps2trimble-gga.aco           dgps1-gga.aco
bottom_depth      NUMBER(6,1)   meters                     see notes below               see notes below
airtemp           NUMBER(4,1)   deg C                      youngmet-xdr.aco              young-met-parent.aco
ss5temp           NUMBER(4,1)   deg C                      furunoseatemp-mtw.aco         furuno-seatemp1.aco
salinity          NUMBER(6,3)   practical salinity units   tsgtemperature.aco            tsg-unit-temp.aco
fluorescence      NUMBER        parts per million          fluro.aco                     raw-fluro.aco
wind_speed        NUMBER(6,2)   knots                      fyoungavgtruewinddir.aco      true-wind-speed-p.aco
wind_direction    NUMBER(6,2)   degrees true (from)        fyoungavgtruewinddir.aco      true-wind-speed-p.aco
humidity          NUMBER(5,1)   %                          youngmet-xdr.aco              young-met-parent.aco
baro_pressure     NUMBER(6,1)   bars                       youngmet-xdr.aco              young-met-parent.aco

*note: wind speed and direction are converted to eastward and northward wind components (meters/second) in the processed data file; a minimal sketch of that conversion follows
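
A minimal Perl sketch of that conversion, using the usual meteorological convention that direction is the heading the wind blows FROM (the knots-to-m/s factor and the subroutine name are mine, for illustration):

    use strict;
    use warnings;

    # Convert wind speed (knots) and direction (degrees true, FROM)
    # to eastward (u) and northward (v) components in meters/second.
    sub wind_to_uv {
        my ($spd_kts, $dir_deg) = @_;
        my $spd = $spd_kts * 0.514444;                 # knots -> m/s
        my $rad = $dir_deg * 3.14159265358979 / 180;   # degrees -> radians
        my $u   = -$spd * sin($rad);                   # eastward component
        my $v   = -$spd * cos($rad);                   # northward component
        return ($u, $v);
    }

    # example: a 10-knot wind from the northeast (045 deg true)
    my ($u, $v) = wind_to_uv(10, 45);   # u and v both negative (blowing toward the southwest)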
 

Notes on recording and archiving data
 

  • most files listed in Table 2 above are currently stored as raw data on the NEFSC internal website, which is not accessible to the public; the directory names are built from the ship, year, and cruise codes
    • where xxx is "del" or "alb"
    • where YY is the two-digit year code
    • where ZZ is the cruise code; some cruises have multiple legs labeled l1, l2, etc.
  • other directories at the same level as "compress" are raw, eventdata, adcp, and misc
    • where the "eventdata" directory contains
      • sensorhistory files
      • nodc files
    • raw files are basically the same as the aco files except that they have:
      • a slightly higher sampling rate
      • a lot of extra text characters in each record
    • adcp directory has:
      • four types of RDI files (.cfg, XYr.ZZZ, XYp.ZZZ, and XYn.ZZZ) associated with configuration, raw ping data, processed data, and navigation data, respectively
      • data should appear here whenever the chief scientist requests it
    • misc directory usually has pictures (*.jpg)
  • aco files are the "ascii compressed" version of appended .raw files; the ".tpl" files have a few lines explaining what is contained in each aco file.
  • "yrday1_gmt" happens to be the convention now used by the SCS people for yearday. Other forms are:
    • yrday0_local  where noontime Jan 1st would be 0.5
    • yrday1_local   where noontime Jan 1st would be 1.5
    • yrday0_utc    where "utc" stands for "universal time constant" (i think) but is the same as "gmt"
    • yrday1_utc
  • We have asked the SCS ETs to conform to these file structure standards as much as possible. A consistent protocol will make it much easier for us.
  • Bottom_depth is the most difficult variable to process since the record is full of spikes, and very little has been done about this problem yet (a simple range-check sketch follows this list). Since bottom depth is fairly well documented for the NE shelf, I do not attempt to improve that data.

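A small Perl illustration of the yearday conventions above (my own sketch, not part of the SCS software):

    use strict;
    use warnings;

    # Yearday conventions for an epoch time (seconds since 1970), per the
    # bullet list above; gmtime() gives the GMT/UTC forms, localtime() the
    # local forms.
    my ($sec, $min, $hour, undef, undef, undef, undef, $yday) = gmtime(time());
    my $frac       = ($hour + $min/60 + $sec/3600) / 24;   # fraction of the day
    my $yrday0_gmt = $yday + $frac;       # Jan 1 noon = 0.5 ($yday is 0-based)
    my $yrday1_gmt = $yrday0_gmt + 1;     # Jan 1 noon = 1.5 (the SCS convention)
    printf "yrday1_gmt = %.4f\n", $yrday1_gmt;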
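
As an illustration of the despiking mentioned above, a minimal acceptable-range check in Perl might look like the sketch below; the limits and missing-value code are examples only (the ranges actually used for each cruise are recorded in its log file):

    use strict;
    use warnings;

    # Replace values outside an acceptable [lo, hi] range with a
    # missing-value code. Example limits only; see each cruise log for
    # the ranges actually used.
    my $MISSING = -99.999;
    sub despike {
        my ($lo, $hi, @vals) = @_;
        return map { ($_ >= $lo && $_ <= $hi) ? $_ : $MISSING } @vals;
    }

    my @ss5temp = (12.3, 12.4, 57.0, 12.5);   # 57.0 is an obvious spike
    my @clean   = despike(0, 30, @ss5temp);   # -> (12.3, 12.4, -99.999, 12.5)
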
Notes on processing data
  • The step-by-step Perl processing procedure ("PROCESCS.PLX") is documented here
  • In cases where the normal "compressed" data was not archived, an alternative method of processing was developed in Oct 2000 using the Perl routine "procelg.plx", which operates on the eventlog files.
  • In cases where salinity samples were taken (i.e., from Niskin bottles), a calibration check is conducted as described here (a rough sketch of the comparison follows this list).

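A rough Perl sketch of that bottle-versus-flow-through comparison is given below; the [yearday, salinity] record layout and the sample numbers are illustrative assumptions, not the actual calibration routine:

    use strict;
    use warnings;

    # Mean offset between Niskin bottle salinities and the flow-through
    # (TSG) values nearest in time. Record layout [yearday, salinity] is
    # assumed here for illustration.
    sub mean_offset {
        my ($bottles, $tsg) = @_;
        my ($sum, $n) = (0, 0);
        for my $bot (@$bottles) {
            # flow-through sample closest in time to this bottle
            my ($nearest) = sort { abs($a->[0] - $bot->[0])
                               <=> abs($b->[0] - $bot->[0]) } @$tsg;
            $sum += $bot->[1] - $nearest->[1];
            $n++;
        }
        return $n ? $sum / $n : undef;
    }

    my @bottles = ([101.501, 32.815], [101.751, 33.020]);   # made-up example values
    my @tsg     = ([101.500, 32.780], [101.750, 32.990]);
    printf "mean bottle-minus-TSG offset = %.3f PSU\n",
           mean_offset(\@bottles, \@tsg);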




    For further information contact: James.Manning@noaa.gov



    (Modified Apr. 14 2008)