NOAA Geospatial Messages
First NOAA Geospatial Data Workshop
The First NOAA Geospatial Data Workshop was held in Silver Spring from Nov. 28 to Dec. 1, 2000. The workshop included sections on metadata, data access, modern applications within NOAA, NOAA's spatial data infrastructure, and many other topics.
Searchable Workshop Agenda
Ted Habermann from NOAA's National Geophysical Data Center spoke about using the NNDC Interface Database for accessing distributed databases. He applied that technique to a database containing the agenda for this meeting, which can be used to search the agenda.
Geospatial Tools
ESRI Spatial Database Engine
The ESRI Spatial Database Engine (SDE) is a tool used with relational database systems to make the data in those databases available to ESRI GIS clients (that is, to spatially enable those databases). SDE FAQs are available at ESRI. This forum includes information on SDE from NOAA users.
Loading Data Into SDE
Presently we have three ways of loading data into SDE: 1) from shapefiles, 2) by spatially enabling DBMS tables, and 3) from FreeForm-described data files. |
Automatically Loading And Optimizing Shapefiles In SDE
Introduction

Probably everyone who has worked with SDE and shapefiles is familiar with the shp2sde utility. Depending on the size and number of shapefiles, this may be enough. However, for large shapefiles and large quantities, I have a Perl script which will load, set optimum layer grid sizes, set optimum Oracle table extents, calculate the layer envelope, and grant select access to other users on any number of shapefiles. I'll cover each of these steps next, and finish by presenting the Perl script. (Thanks to Tim Clark of ESRI for providing the optimizing and tuning information which is the foundation of this article.)

Loading and Optimizing Shapefiles

Optimizing layer grid sizes

The -g option (layer grid sizes) is required when running shp2sde, so what value do you use? I usually have no idea, so I pick a reasonable number for starters, for example:
Then I run sdelayer -o si_stats once the SDE layer has been created to see how good my guess was (with the parameter we're trying to change in bold, brown highlights):
(I don't know why Levels 2 and 3 are being ignored.) Well, I've been told it's good to have the average features per grid be around 75 to 150, and since I'm taking a mechanical approach to optimizing the grid size, I'll consider the job done when Level 1 has an average of 112.5 +/- 37.5 features per grid. Tom Carey's fast algorithm for approaching this range is to estimate a new grid size by multiplying the previous grid size by the square root of the ratio of the target and actual average features per grid, which in this case gives me 5 * sqrt(112.5 / 428.66) = 2.56147302. This formula reflects the geometric argument that if features are evenly distributed, then (in this case) 5 * 5 is proportional to 428.66, and so X * X should be proportional to 112.5, where 5 is the current grid size with a 428.66 average and X is the grid size that should give a 112.5 average. Dividing these two proportionalities and solving for X yields the general formula X = G * sqrt(T / M), where X is the new grid size, G is the current grid size, T is the target average features per grid, and M is the current average features per grid. So here goes my second try (and since SDE is ignoring Levels 2 and 3, so will I):
If the points were really evenly distributed we would have hit the target average (112.5) on the first try. However, applying the same formula to the latest approximation should yield an even better approximation, so we'll continue until we're in the ballpark:
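As a side note, the refinement step just described can be sketched in a few lines (Python purely for illustration; the real workflow reruns sdelayer -o si_stats after each change to get the new measured average):

```python
import math

def next_grid_size(current_grid, target_avg, measured_avg):
    """Tom Carey's refinement: X = G * sqrt(T / M)."""
    return current_grid * math.sqrt(target_avg / measured_avg)

# First refinement from the worked example above:
# grid size 5 with a measured average of 428.66 features per grid,
# aiming for the target average of 112.5 features per grid.
print(next_grid_size(5.0, 112.5, 428.66))  # about 2.5615
```

In practice you loop: load the layer, read the Level 1 average from sdelayer -o si_stats, apply the formula, and stop once the average lands in the 75 to 150 range.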
We're in the 75 to 150 range, so we'll stop optimizing the layer grid sizes now.

Avoid the DEFAULTS configuration section

Here's what the DEFAULTS configuration section looks like in one of our dbtune.sde files (with the interesting parts in bold, blue highlights):
This means that when you don't specify a configuration keyword when running shp2sde (using the -k option), the attribute, feature, and spatial index tables and the attribute and 2nd spatial indexes can be no larger than 2,027,520 bytes (40,960 bytes times 55 extents times 90%), and the feature and 1st spatial indexes can be no larger than 506,880 bytes (10,240 bytes times 55 extents times 90%). So for loading large shapefiles I have created the IMPORT configuration section (unlike DEFAULTS, the name IMPORT has no special significance):
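The arithmetic behind those DEFAULTS limits is simple enough to check mechanically. A quick throwaway sketch (Python purely for illustration; the 90% figure is the usable fraction cited above):

```python
def max_segment_bytes(extent_bytes, max_extents, usable_fraction=0.90):
    # Upper bound on a segment built from fixed-size extents:
    # extent size x maximum number of extents x usable fraction.
    return round(extent_bytes * max_extents * usable_fraction)

# The DEFAULTS limits quoted above
print(max_segment_bytes(40_960, 55))  # 2027520
print(max_segment_bytes(10_240, 55))  # 506880
```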
So when I run shp2sde with the -k IMPORT option, I'm allowing for nearly 1000 megabytes for all the tables and indexes. The downside is that any table or index which occupies only a fraction of a megabyte still has that entire extent allocated for its use, which can result in some slack space. To avoid wasted space, I create a custom configuration section for each shapefile I load. Here is an example:
I determined the table and index sizes for a shapefile after I had loaded it into SDE with the IMPORT configuration section, and then I deleted the layer and loaded the shapefile again using a configuration section sized just for that shapefile. Note that I'm still using 1 megabyte extent sizes. Originally I specified only single extents that were the size of the tables or indexes (e.g., A_INIT about 138 M, A_NEXT 0, and A_MAXX 1), but I found that because of tablespace fragmentation I couldn't always request a 138 MB extent. Feel free to tune extent sizes versus number of extents as you see fit. Here are the Oracle queries I ran to determine the table and index sizes:
You can ask your database administrator what the block size is for your database. Here is one way to determine this yourself (it may require DBA privileges):
Miscellaneous

Some SDE applications may require the layer envelope to be calculated, which is done with the following command:
And letting others see your new SDE layer can be helpful:
The Perl Script

Doing this by hand is how I (and most people) start doing tasks, but when I was presented with a multiple-CD set of lots of shapefiles, it was time to automate. The following Perl script is somewhat customized for the ESRI Data & Maps CD sets (e.g., it prefixes shapefile names with directory codes), but much of the script is generally applicable to any collection of shapefiles.
error in command
I'm trying to use your Perl script but I get this. The command:

shp2sde -o create -l polbndl,gid -f /usr/users/sde/sic_umts/MiguelHeras/Australia/polbndl.shp -g 5,1 5,45 -x -200,-110,1000000 -c 100 -a all -k IMPORT -e + -G 4326 -u sde -p sdesic4 -s luca >> log.txt

Layer polbndl,gid is incompatible with the input shape file type.
ESRI SDE Shape to Layer Loading Utility Mon Feb 25 12:09:01 2002
------------------------------------------------------------------------

Could you help me?
Spatially Enabling DBMS Tables
Introduction

In Automatically Loading And Optimizing Shapefiles In SDE we load shapefiles into SDE, and in my next article we load raw binary or ASCII data into SDE. In this article we look at spatially enabling data that are already in a DBMS.

dbms2sde

I have patterned the command-line usage of dbms2sde after the style of the ESRI administrative commands (options and variations specific to dbms2sde are shown in bold, brown highlights):
Presently dbms2sde only spatially enables point data, in which each row of a table corresponds to a single point, and all information about that point is contained within that row. It can create either a point shape or a rectangle shape for each table row (I could easily add an ellipse option; contact me if this interests you). I could also extend dbms2sde to spatially enable tables with line or area shapes whose vertices are derived from more than a single point (unlike the current -o rect option); contact me if this interests you. However, there are a variety of schemes in which the vertices might need to be accessed, and I will probably support only one. (But which one? For example, the vertices of a line or area could be stored in a single table row, or stored in a second table related to the business table, or stored in an external file.) Spatially enabling a table seems to take much longer than loading a shapefile. Probably the reason for this is that doubly-nested queries are used when spatially enabling a table. This roughly means that the time to spatially enable a table is proportional to the square of the number of rows. What dbms2sde does is apply the query specified by the -w option to the business table (if not specified, the query is equivalent to "select all rows", or "select all rows where 1 = 1"). For each row from the business table matching this query, shape information and a new where clause based on that row are formed, and then all rows in the business table matching this new where clause are updated with the shape information. This latter (or inner) query is why specifying the -key option can be extremely important. If the -key option is omitted, then dbms2sde forms a where clause based on the x and y coordinates (and z and measure values as well, if they're specified).
So, for example, if a particular row has a latitude equal to 40 and a longitude equal to -107, then all rows with that latitude and longitude will be updated with the shape information based on those coordinates. If there are five such rows, then those rows will each be updated five times. This still yields an SDE layer with the correct shape information, but it can be very slow. The -key option overrides this by using the specified column(s) for the latter (or inner) query. So, for example, if dbms2sde is run with "-key ID", and a particular row has an ID value of 106, then all rows where ID equals 106 will be updated with the shape information derived from that row. At least with Oracle, it is a tremendous boon to run dbms2sde with the -key option specifying the table's primary key. In Oracle the primary key is guaranteed to be unique, and it is indexed as well, so it is safe and faster. (Unlike omitting the -key option, which, although inefficient, merely updates rows with the same coordinate values repeatedly, specifying the -key option with column(s) having duplicate values can attach incorrect shape information to those rows which share the same key column(s) values. Unless, of course, those rows also happen to have the same coordinate values. I have added a check in dbms2sde which warns when duplicate values in key column(s) are found.) So, in a nutshell: ensure that your tables have a primary key (or the equivalent, indexed), and run dbms2sde specifying this column or columns with the -key option. Here are some examples of running dbms2sde without the -key option, with the -key option but on a column which is neither the primary key nor indexed, and finally with the -key option on a column which is the primary key.
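To make the cost difference concrete, here is a toy simulation (not dbms2sde itself; the rows and column names are made up) counting how many row-updates the inner query performs when it matches on coordinates versus on a unique key column:

```python
# Hypothetical mini-table: two rows share the same coordinates.
rows = [
    {"id": 1, "lat": 40, "lon": -107},
    {"id": 2, "lat": 40, "lon": -107},  # duplicate coordinates
    {"id": 3, "lat": 41, "lon": -106},
]

def updates_by_coords(rows):
    # Without -key: the inner query matches every row sharing the
    # outer row's coordinates, so duplicate-coordinate rows are
    # updated once per duplicate.
    total = 0
    for outer in rows:
        total += sum(1 for r in rows
                     if (r["lat"], r["lon"]) == (outer["lat"], outer["lon"]))
    return total

def updates_by_key(rows):
    # With -key on a unique column: exactly one update per row.
    total = 0
    for outer in rows:
        total += sum(1 for r in rows if r["id"] == outer["id"])
    return total

print(updates_by_coords(rows))  # 5 (rows 1 and 2 each updated twice)
print(updates_by_key(rows))     # 3
```

With n rows all sharing the same coordinates, the coordinate-matched version performs n * n row-updates, which is the quadratic behavior described above.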
Here we received an Oracle error complaining about our rollback segments being too small. Specifying a key column under certain conditions seems to bypass this error condition, but under other conditions involving a key column we will see this error again.
In this case, with the -key option run on a column that has unique values but is unindexed, dbms2sde does run to completion in just under three hours.
In this last case dbms2sde runs to completion in under half an hour. This is because the ID column is a primary key and is automatically indexed in Oracle. Incidentally, here's what happens when you run dbms2sde specifying a column having duplicate values with the -key option. To do this I dropped the primary key and then set the ID column of one row to have the same value as another row. I then indexed the ID column (not strictly necessary, but I'd rather wait half an hour than three hours).
So here the error presents itself again, although we processed nearly all rows. At this point I increased the size of the rollback segments, and was able to bypass the error without changing how I was running dbms2sde.
Optimizing

Much of the discussion in my previous article Automatically Loading And Optimizing Shapefiles In SDE applies to dbms2sde. You can take a self-guided demo of a web enterprise application of dbms2sde which allows you to select tables, spatially enable them, optimize their grid sizes, determine optimum configuration sections, and so on. Click here to take the tour. (Sorry, but this doesn't fully work unless you're coming from a computer in the noaa.gov domain. I'm working on that...)
However, there is one difference regarding optimizing grid sizes and table extents. It seems that the SDE libraries (which dbms2sde builds on) don't use the specified grid sizes until the table is fully enabled; I've observed SDE making the spatial layer with a larger spatial index, which then shrinks in size once the grid sizes are applied. This means that I can't calculate the table extents after I've optimized the grid sizes (if I do, dbms2sde reports an error in which the maximum number of extents has been exceeded). So I have to calculate the table extents before I optimize the grid sizes, and then use what seems like a needlessly large configuration section. Oh well, disk space is cheap, right?

Where do I get dbms2sde?

Click here to go to the ESRI ArcScripts page showing SDE utilities. There you can download a zipped tar file containing source code and Windows NT and Sun Solaris binaries.
Blue Angel Metadata Management System
We are exploring the Blue Angel Enterprise MetaStar system for managing FGDC-compliant metadata.
ESRI ArcIMS (Internet Map Server)
ESRI ArcIMS is a tool used to make GIS data available to web browsers, the Geography Network, the free GIS browser ArcExplorer, and other ESRI clients. This forum includes information on ArcIMS from NOAA users.
ArcIMS technical discussion
On a more pragmatic level, I was left with the impression that a number of us are, or will be, creating ArcIMS installations. This can be a frustrating process due to the lack of documentation and the numerous "interesting" features that are in ArcIMS. Are there other folks who would be willing to share their questions, suggestions, successes and frustrations? I'll throw out a first question - following up on Daniel Martin's suggestion, I've finally gotten an extract server working. Does anyone know how to limit which layers are included in the extracted files? - Tiffany
Must be in the AXL
This is an interesting problem that must be faced by all data servers. In the Interface Database, this information is provided by the display_fields. In DODS, it comes from the DAS and the DDS. In the live access server it comes from the configuration files (I think). My understanding of ArcIMS suggests that all of the information about what gets served is in the AXL file that describes the service. Maybe you could post that AXL file and we could have a look?
Pieces to the ArcIMS puzzle
I work with Ted Habermann at NGDC, and I have been searching the web for ArcIMS sites and would like to share some of the ideas and concepts I found. The first is a url for a company that has an overview of some of the pieces of ArcIMS. It's more of a general concept. The url is: http://www.edgetech-us.com/Prd/Softw/ArcIMS.htm
ArcIMS Toolbar help file
An example of a help file for the toolbar in ArcIMS

The USGS has a page that describes how to use the tools in the toolbar for ArcIMS. The file contains the standard tools for ArcIMS. It was done nicely and the url is: http://idaho.usgs.gov/projects/sr3/help/index.html I use this help file as a link below the toolbar in the ArcIMS viewer I've created.
Trying to give some answers to common questions about ArcIMS
Questions and answers about ArcIMS

Another site tries to answer some questions about ArcIMS. Some example questions are:
http://www.co.cabarrus.nc.us/Pages/Gis/applications.htm
Documentation, tutorials, and tips and tricks for using ArcIMS
ArcIMS

Documentation, tutorials, and tips and tricks for using ArcIMS, the latest Internet Map Server from ESRI. It has such things as ArcOnline, ArcIMS 3.0 - An Application Developer's Perspective, Geography Network, ArcIMS Tutorials, ArcXML, and ArcIMS Scripts. The url is: http://gis.about.com/cs/arcims/?once=true&
Relationship between layers in the mapservice and in the HTML viewer
The HTML viewer numbers layers in the reverse of the order in which they are listed in the MapService (defined in the AXL file). This reference remains constant even if the layer is not currently displayed in the viewer (due to scale constraints). For example: if the cities layer is defined as the next-to-last layer in the AXL file, the viewer will always address it as layer 1 (the count begins at 0).
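The numbering rule can be sketched as a tiny helper (hypothetical function, Python just for illustration):

```python
def viewer_index(axl_position, total_layers):
    # The HTML viewer counts layers from 0 in the reverse of the
    # order they appear in the AXL MapService; axl_position counts
    # from 0 at the top of the AXL file.
    return total_layers - 1 - axl_position

# With 3 layers, the next-to-last layer in the AXL file is always
# addressed by the viewer as layer 1, displayed or not.
print(viewer_index(1, 3))  # 1
```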
General Instruction page on how to use the Layers area in ArcIMS
Instructions on how to turn on layers and other things in ArcIMS

These general instructions are specific to this site, but they get the idea across on how to write up instructions for using the layers area in ArcIMS. The url is: http://gisweb.sgrdc.com/Website/Lowndes2/instructions.html The main page for this site is: http://gisweb.sgrdc.com/Website/Lowndes2/viewer.htm?Title=VALOR%20GIS
Another toolbar help file for ArcIMS
Another toolbar help file for ArcIMS

This site gives another way to do the toolbar help file. The url is: http://205.169.141.11/imd/gis/arcims_help.htm
ESRI GIS main Software Page
ESRI main GIS software page

This page from ESRI lists all the software that they have. The url is: http://www.esri.com/software/index.html For a direct link to ArcIMS at ESRI, go to: http://www.esri.com/software/arcims/index.html
ArcIMS tutorial
The USGS ArcIMS tutorial

The USGS has an ArcIMS tutorial page that includes:
The url is: http://webgis.wr.usgs.gov/arcims.htm
ArcOnline by ESRI
ESRI ArcOnline is the complete technical resource center for ArcGIS software. You can access System Requirements, Downloads, Technical Papers, Developer help pages, and Data Models for ArcGIS Desktop (ArcView 8.1, ArcEditor, ArcInfo) and ArcSDE, as well as ArcIMS and ArcPad. You can also browse for answers to your technical questions in the Knowledge Base and participate in user-to-user Discussion Forums. The url is: http://arconline.esri.com/arconline/index.cfm?pid=6
ArcIMS examples
MetroGIS DataFinder

This ArcIMS site has different functionality than what comes out of the box with the ArcIMS HTML viewer. Examples are:
http://gis.metc.state.mn.us/website/DF_GeneralMap/viewer.htm?Title=MetroGIS |
ArcIMS examples
USGS Snake River Corridor Project

This ArcIMS site has different functionality than what comes out of the box with the ArcIMS HTML viewer. Examples are:
http://idims.wr.usgs.gov/website/sr3beta/viewer.htm?Title=Snake%20River%20Corridor%20Project |
Generic out of the box ArcIMS sites
Out of the box ArcIMS sites

Here are some examples of what comes straight out of the box with the ArcIMS HTML viewer. These sites basically took the out-of-the-box approach with hardly any modifications to the HTML viewer in ArcIMS. The url is: http://205.169.141.11/website/basins/viewer.htm?Title=Mesa%20County%20Drainage%20Basins Another one is: http://map2.ngdc.noaa.gov/website/pkd/viewer.htm
More ArcIMS examples
GIS at Mesa County, Colorado

For the ArcIMS sites located at Mesa County, click on any of the interactive maps sections. They have over 15 different examples of ArcIMS sites, which include:
There seem to be two main approaches to viewing their data. One is to select by some criteria, such as parcel, address, or township, and then bring up an ArcIMS site after the selection. The second approach is a direct link to the ArcIMS site with no selection criteria applied first. I found that these are not always available; sometimes I couldn't connect to the database for a site and wasn't able to view the ArcIMS site.
The url is:
More ArcIMS examples
National Marine Fisheries

This site has numerous examples of ArcIMS sites; most look like they are "out of the box" sites using the HTML viewer for ArcIMS. The url is: http://www.fakr.noaa.gov/arcims/
Evaluation of Two Internet GIS Sites Using ArcIMS
Evaluation of Two Internet GIS Sites Using ArcIMS

This site talks about the different approaches of these two sites using ArcIMS, ideas about what could make them better, and what the author liked and disliked about each site. The url is: http://www.geog.uno.edu/~dwelch/arcims1.htm
ArcIMS – The Solution to Streamlining Data Collection, Modeling, Planning and Design
ArcIMS – The Solution to Streamlining Data Collection, Modeling, Planning and Design

This site talks about using ArcIMS as a solution to your data needs. The url is: http://www.esri.com/library/userconf/proc00/professional/papers/PAP591/p591.htm
SpatialDirect™ plug-in adds data extraction and delivery to ArcIMS
SpatialDirect™ plug-in adds data extraction and delivery to ArcIMS

Allows data to be retrieved to the desktop in a user-specified data format and projection. The url is: http://www.safe.com/press/arcims_june00.htm
FAQ about GIS and ArcIMS from ESRI
Answers to questions about GIS and ArcIMS from ESRI

A random collection of questions about GIS and ArcIMS that ESRI put together. Some example questions are:
ArcIMS Demos from ESRI
ArcIMS Demos

ESRI has demos of ArcIMS sites. The url is: http://eslims.esri.com/arcimsdemos.htm
ArcIMS/ArcView IMS Server
ESS ArcIMS/ArcView IMS Server

This site has helpful ideas about ArcView and ArcIMS. The url is: http://wwwims1.gsc.nrcan.gc.ca/
Prototype (or operational) ArcIMS Sites in NOAA
A number of groups in NOAA are experimenting with serving geospatial data on the web using ESRI's ArcIMS. This area has links to those sites. Questions (and answers) about the sites can be posted to the threads as well!
ArcIMS Prototypes at NGDC
These map servers are running on NT and accessing data primarily in various SDE installations (Informix, DB2, and Oracle) at NGDC. John Cartwright is the technical wizard for these sites!
How to do specific things in ArcIMS
This area is for people who have figured out how to do things in ArcIMS.
Changing the title of your ArcIMS viewer
Changing the title

The title of the HTML Viewer can be set when creating the web site with ArcIMS Designer. The default title is "ArcIMS 3.0 Viewer". You can change the title after the HTML Viewer is created by editing the default.htm file. Change the following line to include your own title text string: var theTitle = "My Very Own Viewer"; For additional information, see the ESRI documentation on Customizing ArcIMS.
More than one way to do specific tasks in ArcIMS
Different paths to achieve similar results in modifying ArcIMS

I didn't find the title variable in the file default.html, but I found it in the viewer.html file. All the .html files are located on the machine where you created your web site; the general structure is /Base directory/website/name of ArcIMS site/, and the html files are in that ArcIMS site directory. Or you could just search for the file. These are ways I found to get things done in ArcIMS. There is more than one way to do things in ArcIMS, and I'm showing the ones I found to work. If others find better ways to do these things, please post them here in HyperNews. I am learning, as are other folks, and if we get a good list of how to do things in ArcIMS, it will be a lot easier for data managers to get things done. The general way to do things in ArcIMS is to use Author, make your changes, save the file, go into ArcIMS Administrator, click the Refresh button, and review the results of the change in your browser.
Changing the name of the layer in your ArcIMS viewer
Changing the name of the layer

In ArcIMS Author:
For additional information, see the ESRI documentation about Customizing ArcIMS |
Deciding what fields get displayed and changing field names for a table to alias field names
Deciding what fields get displayed and changing field names to alias field names

Use an editor and open the file ArcIMSParam.js. You can search for the file, or find it in the usual directory structure: the .html files and ArcIMSParam.js are located on the machine where you created your web site, under /Base directory/website/name of ArcIMS site/. With ArcIMSParam.js open:
What fields get displayed
Putting in alias field names
The following is from the ESRI documentation called Customizing ArcIMS:

Limiting the fields displayed

To limit the fields returned in a selection, query, or identify, change the value of the variable selectFields to set the fields you want displayed. The default value is #ALL#, which indicates all fields are displayed. Field names must be in upper case to match what the ArcIMS Spatial Server returns. Since query operations are typically done on the active layer, you probably want the field display to change when the active layer changes. To make this happen, set swapSelectFields to true. If swapSelectFields is true, then a list of field names must be created for each layer. To create the list of fields for a layer, set the array variable selFieldList. Assign an element for each layer in this array, with the topmost layer set at index 0. Each line of the array is assigned like this: selFieldList[2]="NAME #ID# #SHAPE# POP"; An element is required for each layer. The ID and Shape fields must be included in the list and must be surrounded by #s. This notation indicates that these fields are not in the database but instead are generated by the server. Image layers are assigned #ALL# since they have no attributes. An example of the assignment of these three variables is shown in Chapter 3, "The HTML Viewer JavaScript Library, ArcIMSParams.js, Identify/Select/Query/Buffer parameters".
ESRI write-up on using aliases for the field names
Using aliases for the field names

To display an alias field name instead of its original name, set useFieldAlias to true. When useFieldAlias is true, a list of field names and their aliases must be created for each layer. To create the list of field names and aliases for a layer, set the array variable fieldAliasList. Assign an element for each layer in this array, with the topmost layer set at index 0. The list is a string containing pairs of field names and their aliases, separated by a colon. Each pair is separated by a bar (|). Each element of the array is assigned like this:
fieldAliasList[0]="NAME:City Name|POP:Population"; Because an element is required for each layer, if you don't want to assign aliases for a layer, set the element to an empty string ("") as shown for element [1]. The viewer checks for an alias to use and only swaps the field name if it finds a name/alias pair for that layer's field in the list. An example of this assignment is shown in the description for the useFieldAlias array in Chapter 3, "The HTML Viewer JavaScript Library, ArcIMSParams.js, Identify/Select/Query/Buffer parameters".
Full table names not required
The quotes around "true" are REQUIRED, and the field names in this section do not need to be fully resolved:

// fields to be returned in identify/selection/query request. . . #ALL#=all fields
var selectFields = "true";
// swap out the list of returned fields?
// If true, a list must be defined in selFieldList[n] for each layer to update selectFields
var swapSelectFields="true";
// array for each layer's returned fields if swapSelectFields=true
var selFieldList = new Array();
// sample set for world - if not #ALL#, id and shape fields required. Separate with a space
selFieldList[0]="FGDC_ID TITLE EMAIL #ID# #SHAPE#";
selFieldList[1]="NAME #ID# #SHAPE#";
Changing the backgound image in ArcIMS
Changing the background image in ArcIMS

The task is to use my own background images in all the frames in ArcIMS. The frames are controlled by a series of html files. See the ESRI documentation on HTML Viewer frame layout, available at: http://arconline.esri.com/arconline/documentation/ims_/HTMLViewer1.pdf?PID=6 All the .html files are located on the machine where you created your web site; the general structure is /Base directory/website/name of ArcIMS site/. Or you could just search for all the html files. I changed the following line to point at my background image (Noaashield.gif), not the background image provided by ESRI: ...body... Background="/Images/Noaashield.gif"> I modified all the following files to get a consistent look in all the frames that comprise the ArcIMS site:
Using your own footer in ArcIMS
Using your own footer in ArcIMS

The way I found to get this to work is to edit the file bottom.html.
<A HREF="http://dudley.ngdc.noaa.gov/website/hazards/viewer.htm" target="_blank">Global Natural Hazards Viewer Home</A> | <A HREF="http://nndc.noaa.gov/?home.shtml" target="_blank" ALT="NNDC Home">NNDC Home</A> | <A HREF="http://www.ngdc.noaa.gov/seg/hazard/hazards.shtml" target="_blank">Natural Hazard Home</A>

Save the file, go into ArcIMS Administrator, click the Refresh button, and review the results of the change in your browser. Note: Each frame is a predetermined size. If I put in a 2-line footer, the second line gets cut off. There is only room for a one-line footer unless you change the size of the frame.
ESRI has a write-up on headers and footers in Customizing ArcIMS; see pages 24-26 for information on this subject.
Putting in the words "Current Tool" for the tool that is currently enabled
Putting in the words "Current Tool:" in the lower left hand corner of the ArcIMS viewer

I saw this capability on another ArcIMS site and thought it was a nice feature. In the default HTML viewer for ArcIMS, the tool name is displayed in the lower left corner. Some end users might not know what these words mean, so I put the words "Current Tool:" right before the name of the tool. I did this by modifying the Modeframe.html file. After the Body BGCOLOR line, I put in the text "Current Tool:" to accomplish this task.
<BODY BGCOLOR="Silver" LEFTMARGIN=0 TOPMARGIN=0 BACKGROUND="images/noaashield.gif">
Tool bar help file
Putting in the link for the toolbar help file

The USGS has a help page that describes how to use the tools in the toolbar for the ArcIMS viewer. The file contains the standard tools for the standard ArcIMS HTML viewer. It was done nicely and the url is: http://idaho.usgs.gov/projects/sr3/help/index.html I took the contents of that file and saved it locally to my PC. If you use other non-standard tools, you could just add them to this file, or if you use fewer tools, you could take out the tools that are not available in your ArcIMS viewer. I put a link directly below the toolbar called Tool Bar Help in my ArcIMS viewer. The way I achieved this is to put the help file that I saved on my local PC in the ArcIMS /images directory; all the icons for the help file are already in that directory, so I didn't have to bring them over to the /images directory.
Then I went and modified the Toolbar.html file.
I put the following line right above the </body> and </html> tags:

To have a separate window appear for the help file, you must use the following attribute in your a href tag: target="_blank"
Layer's Area Help file
Putting in the link for the Layers Area help file

I put a link directly below the Layers Area called Layers Area Help in my ArcIMS viewer. I created an HTML page with a screen shot of my ArcIMS viewer. I saved the screen shot as a .gif file and put that file in the /images directory. Then in the HTML file I made an image link to the screen shot and put text that I found on the web below the screen shot.
The contents of the Layers Area Help file can be viewed at:
Then I went and modified the Toc.html file. To have a separate window appear for the help file, you must use the following attribute in your a href tag: target="_blank"
Setting the scale factor (when to turn on a layer at a certain Zoom level)
|
Setting the scale factorHere's an example of when you might want to turn a layer on when you zoom to a certain level. I have a site that I call Hazards and I have 3 science layers and a country outline layer. I turn the country layer on when they bring up ArcIMS, but people might be intereted in hazards and there location to major roads. If I turned on major roads at the world level it wouldn't really help the end user and this layer could clutter up the Global view. The end user is looking at a world map and then zooms into the USA and then zooms to a state. At this level, I could turn on the Major roads layer automatically by setting the scale factor for the layer Major Roads. This is done by doing the following: Set scale factors
Remove scale factors
Also, the ESRI documentation called Using ArcIMS talks about this subject. I couldn't find a direct link to the documentation but I believe this book comes with ArcIMS. |
How to indicate to the end user that additional layers are available as you zoom in
|
How to indicate that additional layers are available as you zoom inI saw a site that tried to tell the end user that additional layers are available as you zoom in. I don't think there an easy way to do this and I couldn't come up with a better suggestion than theres. The site put a text statement that says: Note:If you zoom in more layers could be available. This text was put under the layers area as a statement with no links. The way to do this is to add the following to the file toc.html above the end body and end html tag:
Note:If you zoom in more layers could be available. |
One of the best resources is the ESRI Book - Cutomizing ArcIMS 3.0 - html viewer
|
ESRI Book - Cutomizing ArcIMS 3.0 - html viewerThis book is the best one I've seen on how to do specific tasks in ArcIMS. It is not on the ESRI web site and to my knowledge it is only available when you buy the ArcIMS product. I will use there table of contents to show what topics they cover. The first 2 chapters are only 30 pages long but they are extremely helpful. I can't reproduce these page because they are copyrighted.
Chapter 1 Introducing the HTML Viewer 5
Some highlights in the book are:
Chapter 1
These page really help the developer with the file organization and where to put files for ArcIMS.
The HTML Viewer frames - Page 8
Chapter 2 |
Changing the background color in the main map area
|
Changing the background color in the main map areaI wanted to make the background for the Oceans Blue of the main map area, then I wanted the Country boundary layer on top of the Blue background. This is done by a 2 step process:
|
Changing the ESRI Copyright text on the main map area
|
Changing the ESRI Copyright text message in the main map windowESRI has a statement at the bottom of the main map area window that says Map created with ArcIMS - Copyright 1992 - 2001 ESRI Inc.
The above statement can be changed to any other statement. I decided to take out the ESRI copyright statement and put in
my own statement: How do I achieve this? Edit the file ArcIMSparam.js
Refresh the map service in ArcIMS Administrator (last button on the tool bar) Reload your browser and see if changes are present. |
How to not display the Overview Map
|
How to get rid of the overview map.This is a 2 step process to achieve this:
|
Georeferenced tif
|
How to get a Georeferenced tif into ArcIMSEtopo.tif is the file I want to Georeferenced.You need to create an additional file for the tif image to be able to to georegister it. The file needs to be what ESRI call a World File with a .trw file extension. The format for the ESRI World file is:
Example:
Here's the World file created for the Global Bathemetry and topography tif file.
After creating the .trw world file for the etopo.tif file, put these 2 files in the image directory for your ArcIMS site. Start up Author and the new layer (Etopo.tif) should be there. Save in ArcIMS Author. Refresh the Map service in ArcIMS Administrator. Refresh your browser and check and make sure changes are there. Overlay vector files over image file to make sure registration is fine. |
Modifing the size and location of the image in the main map area
|
Modifing the size and location of the image in the main map area
Problem:
Solution:
To change the Size of the imageIn Mapframe.html:
To change the location of where the image is displayed in the Map AreaIn Mapframe.html:
|
How to Set The Map's Initial Zoom Area
| ||
How to Set The Map's Initial Zoom Area
|
Adding metadata links for the TOC fields
|
Adding metadata links for the TOC fieldsProblem:
Solution:
document.writeln('<td><font face="Arial" size="-1">' + t.LayerName[i] + '</font></td>'); with
if (t.hrefLayer[i]) {
In ArcIMSParam.js:
// href info for layers list in TOC
Example: hrefLayer[5] = '"http://map1.ngdc.noaa.gov/index.html"'; NOTE: A common error is the single and double quotes are improperly placed. The format for the hreflayer line is single quote, double quote, url, double quote and single quote.
If you have six layers in the TOC area:
The second layer corresponds to hrefLayer[1] The third layer corresponds to hrefLayer[2] The fourth layer corresponds to hrefLayer[3] The fifth layer corresponds to hrefLayer[4] The sixth layer corresponds to hrefLayer[5] When the end user clicks on the field name a separate window will appear with the associated metadata for that layer.
|
Taking out the tools: Overview map and legend layer toggle button
| ||
Deleting out the tools: Legend and Overview Map from the ToolbarProblem:
Solution:
In toolbar.htm: document.write(' // I commented out the following lines to get rid of the legend and Overmap // if ((parent.MapFrame.hasTOC) && (parent.MapFrame.aimsLegendPresent)) { // // Legend toggle. . . requires aimsLegend.js // document.write(' '); | ');// document.write(''); // isSecond = !isSecond; // document.writeln(' // } // if (parent.MapFrame.hasOVMap) { // // Overview Map toggle . . . requires overview map // document.write(' '); | ');// document.write(''); // isSecond = !isSecond; // document.writeln(' // if (isSecond) document.write(' // } Then save the file toolbar.html and run your ArcIMS site and the tools should be taken out of the toolbar. |
Adding Stripes to The Layers List
| ||
Adding Stripes to The Layers List
|
Adding The Legend/Layer Toggle Button
| ||||||||
Adding The Legend/Layer Toggle Button
|
Making the Continents Layer Unactivable
| ||
Making the Continents Layer Unactivable
|
Adding The Overview Frame
| ||||||||||
Adding The Overview Frame
|
Adding The Overview Frame
|
Hi, I was able to add a new frame for Overview map. But there is a extent box for overview map present in Mapframe. Can you help me to get rid of it? Thanks, Kavi
|
Relocating the Refresh Map Button
| ||||||||||
Relocating The Refresh Map Button
|
Adding the Shapefile Extractor Capability
| ||||||||
Adding the Shapefile Extractor Capability
|
Adding The External Viewer
| ||||||||||||||
Adding The External Viewer
|
Legend Truncated
|
ArcIMS 4.0 truncates the lower part of the legend. How do I prevent it from doing so?
|
I don't want to switch between Legend and Layer
|
Hello, I want to know if it's be possible to display the legend of each layer and the layer menu in a same HTML frame (the legend at the left of each layer and the active and visible button at the right of the name of the layer) Thanks a lot for help me.
|
Zoom limit in arcIMS
|
I'm trying to constrain zooming so that you can't get any closer then 1" = 500' since thats approximately the zoom depth that the hillshade and DOQQ graphics that I'm using become unviewable due to pixelation. Anyway, I've been trying to find a way to have a lower limit set on zooming without having to re-write large portions of aimsClick.js, aimsCommon.js and/or aimsNavigation.js. Any clues as to how to go about this would be fabulous. Cheers -Colin |
Setting variables for hyper links in ArcIMSparam.js files
|
I have sections of the code for the ArcIMSparam.js files and the AimsIdentify.js files. This code details the modifications that were successful in activating hyper links to fields when an Identify command has been operated. This is the sections in the ArcIMSparam.js file that directs which fields are selected when an Identify operation has been run as well as how to turn those fields into hyperlinks: // fields to be returned in identify/selection/query request. . . #ALL#=all fields var selectFields= "true"; //var selectFields= "#ID# #SHAPE#"; // swap out the list of returned fields? //If true, a list must be defined in selFieldList[n] for each layer to update selectFields var swapSelectFields=true; // array for each layer's returned fields if swapSelectFields=true var selFieldList = new Array(); // sample set for world - if not #ALL#, id and shape fields required. Separate with a space selFieldList[0]="SDE_USER.MAR_SEDS.MGGID #ID# #SHAPE#"; selFieldList[1]="SDE_USER.MAR_SEDS.MGGID #ID# #SHAPE#"; selFieldList[2]="#ALL#"; selFieldList[3]="#ALL#"; selFieldList[4]="#ALL#"; selFieldList[5]="NAME CONTINENT #ID# #SHAPE#"; selFieldList[6]="#ALL#"; // use the field alias in the data display? //If true, a list must be defined in fieldAliasList[n] for each layer defining aliases for those fields needing them var useFieldAlias=false; // array for aliases for each layer's returned fields if useFieldAlias=true var fieldAliasList = new Array(); // sample set for world - fieldname:alias pairs separated by a bar (|)... if no aliases, use empty string ("") fieldAliasList[0]="SDE_USER.MAR_SEDS.URL|URL"; fieldAliasList[1]="SDE_USER.MAR_SEDS.URL|URL"; fieldAliasList[2]=""; fieldAliasList[3]=""; fieldAliasList[4]=""; fieldAliasList[5]="NAME:CountryName"; fieldAliasList[6]=""; // Hide the ID field display? The ID Field must be included in field list, but we don't have to show it. var hideIDFieldData = true; // Hide the shape field display? 
The Shape Field must be included in field list, but we don't have to show it. var hideShapeFieldData = true; // parameters for setting up hyperlinks in data display var hyperLinkLayers = new Array(); // layers to have hyperlink var hyperLinkFields = new Array(); // field in those layers to be used for hyperlink var hyperLinkPrefix = new Array(); // prefix (if any) to place before field value to make hyperlink url var hyperLinkSuffix = new Array(); // suffix (if any) to place after field value to make hyperlink url hyperLinkLayers[0] = "Marine Geology Station Locations"; hyperLinkFields[0] = "SDE_USER.MAR_SEDS.MGGID"; hyperLinkPrefix[0] = "http://oas.ngdc.noaa.gov/mgg/plsql/geolin.set_expand?v_mggid="; hyperLinkSuffix[0] = ""; |
modification to aimsIdentify.js file to correct hyperlinkprefix and hyperlinksuffix
|
There seemed to be an error in the default code in the aimsIdentify.js file. We made a correction to one line of code at line 256. Here is the corrected code: if (showHyper) { for (var s1=0;s1<hyperLinkFields.length;s1++) { if (hyperLinkFields[s1]==fName1[f]) { var theLinkURL = hyperLinkPrefix[s1] + fValue1[f] + hyperLinkSuffix[s1]; Win1.document.write('<a href="' + theLinkURL + '" target="_blank">'); isHyper=true; |
Reasoning for ArcIMS defaulting to JPEG instead of GIF files
|
I found this on the ArcIMS e-mail discussion forum site. It explains why ESRI cannot allow the generation of the GIF format, but does for JPEG files. The GIF Format: GIF was developed to provide a highly compressed raster format for interchanging images across slow speed telephone lines and is thus excellent for use on the Internet. The LZW compression scheme relies on the fact that many computer generated images contain large areas of identical colors. These repeating sequences of colors can be represented by much smaller code sequences. For this reason, it is best to use GIF for simple to moderately complex images typical of maps and diagrams. LZW is a "lossless" type of compression so the resultant images never suffer from image artifacts or quality degradation associated with "lossy" techniques like JPEG. LZW is founded on the use of Palettes and is limited to 256 simultaneous colors. Since the GIF format utilizes the LZW compression technology patented by Unisys Corporation, you must establish a licensing agreement with Unisys before you can obtain GIF support from ESRI. Solid colors in GIF images sometimes appear dithered in some web browsers, but this dithering can be removed by using the 'safety palette', a 216 color palette used for web raster graphics that ensures that they don't appear dithered when displayed on a wide range of browser types and platforms. The JFIF (JPEG) Format: It is commonly believed that JPEG is an image format, however JPEG is actually a compression format and not a file format. The JPEG File Interchange Format (JFIF) is a standard file format, which uses JPEG compression technology, a 24-bit compression technique for storing full color and gray scale images. JPEG was created out of the need to shrink the storage requirements of photo realistic images and therefore is optimized to work with natural scenes and portraits, which are dominated by slow shifts in color frequency and relatively low contrast. 
This is exactly where GIF performs poorly since there are few repeating sequences to encode. JPEG compression produces files four times larger on average than LZW for diagram-like images but can produce significantly smaller images than LZW when used for complex images like satellite photos. Unlike GIF/LZW, JPEG is royalty free and does not require a special licensing agreement for its use. JPEG can encode images with up to 16.7 million colors. You should avoid framing your images, using dense grids, or any graphical techniques which introduce abrupt changes in contrast for optimum image quality. Compression "artifacts," small distortions in colors which may appear as blotchy regions are common in diagram type images when using JPEG. The PNG Format: The PNG graphics format was defined during 1995-96 to overcome the problems with the copyright issues of GIF/LZW. The compression of PNG (where all the copyright issues started with) is based on the zlib algorithms from Mark Adler e.o., which is free software donated to the public domain. PNG's compression is fully lossless--and it can support up to 48-bit truecolor or 16-bit grayscale. For more information: http://www.freesoftware.com/pub/png Two file size comparisons: Image1 JPEG - 94K GIF - 20K PNG - 14K (8Bit) PNG - 17K (24BIT) Image2 JPEG - 109K GIF - 197K PNG - 170K (8Bit) PNG - 433K(24BIT) (http://forums.esri.com/forums/Index.cfm?CFApp=64&Message_ID=64617) |
Hyperlink problems
|
Dear All I have set up hyperlinks from some layers within ArcIMS 3.1 with both the Prefix and Suffix functions. Now those layers with hyperlinks cannot be identified with the identify button. Any ideas would be greatly appreciated Best Wishes Ian May |
Hyperlink issues
|
Ian: I am not sure about the specifics of your problem. Would you repost your problem with the actual code from your ArcIMSparam.js and aimsidentify.js files. All I need to look at are lines 250-260 in aimsidentify.js file and lines 285-335 in the ArcIMSparam.js file. Feel free to e-mail me directly at josh.klaus@noaa.gov. Josh |
Hyperlink to new window problem
|
Hi! Recently, I have a problem with my browser. When i click on a hyperlink that links me to a new window, a new window appears and gets stuck there... I could only see the title bar and menu bar of the new window only. I have tried reinstalling and repair Internet Explorer6 SP1, but was no use... Please tell me how to troubleshoot this problem. Your help is greatly appreciated. George
|
Same problem
|
I am having exactly the same problem. hyperlinks that should open new windows in explorer or email cycle and fail.
|
ArcIMS@NOAA: The Video Conference
|
The second NOAA-wide ArcIMS Video Conference will be held between 12:00 noon and 2 PM (EST) on December 7, 2002. The conference will provide a forum for learning about applications of ArcIMS within NOAA. The conference will center around informal presentations by ArcIMS users from a number of groups in NOAA. Information about the conference will be posted inthis discussion thread. |
Presentations
|
Presentations that will be given during the conference:
|
Using ArcIMS to Improve Metadta Quality
|
Ted Habermann, Josh Klaus, and Todd Pearce (NESDIS)We are exploring ArcIMS as a tool for examining and improving spatial metadata for NOAA datasets. Our initial prototype is designed as a tool for metadata managers. We will discuss several potentially interesting steps in our process:
|
NOAA Data Shoreline Explorer
|
NOAA Data Shoreline Explorer The National Geodetic Survey ArcIMS application for extracting and displaying U.S Coastal Shoreline shapefiles. |
NgsMap2
|
NgsMap2 The National Geodetic Survey's ArcIMS application for realtime display of survey control points in the U.S. |
CSC ArcIMS ActiveX Template Code Base
|
Although a number of organizations use ArcIMS sites to serve data, many only employ ArcIMS's “out of the box” functionality. A template for ArcIMS applications developed by the National Oceanic and Atmospheric Administration Coastal Services Center provides a standardized, fully featured code base that ArcIMS ActiveX Connector programmers can use to quickly create and customize new ArcIMS sites. The template provides a uniform look and feel for Internet mapping services and allows complex customizations that include integration with ArcObjects and ArcSDE. This presentation highlights the features particular to the Center’s template and methods for easy customizations by other ArcIMS programmers. Examples of how the template has been used in Center products, including a raster and vector data distribution system, as well as an interactive tutorial for teaching the process of identifying and visualizing coastal hazards and their potential risks, will also be quickly demonstrated. Everyone is encouraged to download a copy of the associated Powerpoint presentation, found at the following URL: http://astrolabe.csc.noaa.gov/Presentation/template.html
|
How put the mouse coordinates into a TXT file
|
Hello,
I need to get the X and Y coordinates of a mouse click into the map, as put them to a TXt file. Anyone can help me?
Thanks Luis Ladeira
|
ArcIMS Administration
|
Here's a place to discuss ArcIMS administration issues. One good discussion forum is at ESRI: http://support.esri.com/ choose the ArcIMS link on the left navbar, and then the "Discussion Forum" link on the main page that follows. There is a new page that covers details of NGDC's administration approach at http://hypernews.ngdc.noaa.gov/HyperNews/get/doc-arcims-admin.html
|
Free 3D Viewer - Geospatial Explorer
|
Regarding your post on tools, www.cyze.com/download has a free 3D Viewer. It was based on their initial implementation used by the US EPA FIELDS team. Geospatial Explorer (Win32 Application) HTH TS
|
ArcIMS technical discussion
|
GeoSpatial Data in the NOAA Line-Offices
|
This section of the forum provides information about GeoSpatial activities in NOAA's Line Offices.
|
NESDIS
|
This section of the forum provides information about GeoSpatial activities in NESDIS.
|
National Ocean Service
|
This section of the forum provides information about GeoSpatial activities in the National Ocean Service.
|
National Weather Service
|
This section of the forum provides information about GeoSpatial activities in the National Weather Service.
|
Office of Atmospheric Research
|
This section of the forum provides information about GeoSpatial activities in the Office of Atmospheric Research.
|
National Marine Fisheries Service
|
This section of the forum provides information about GeoSpatial activities in the National Marine Fisheries Service.
|
Science means compare and contrast
|
Comparing and contrasting pictures is an important tool in the quest for scientific understanding. Many of the systems we looked at lead to a single picture. The Web Image Spreadsheet Tool allows users to interactively create tables of images.
|
VENTS Seismicity Maps
|
The VENTS project at PMEL brings together a wide variety of geospatial data relevant to hydrothermal vents on the seafloor. Their site includes a page with a set of thumbnails that can be clicked to see maps of the seismicity of the NE Pacific each quarter. The same gifs can be displayed in a WIST that provides a very flexible user interface that allows comparison of maps for different time periods. Check "Control Panel" and change the display to fit your needs! |
Metadata
|
High quality Metadata is an important part of an data system. Use this section of the forum for discussing Metadata issues.
|
NGDC Metadata Workshop Feb. 7-9, 2001
|
At the GeoSpatial / Metadata workshop in Silver Spring during December representatives of all NOAA Line Offices agreed that existing NOAA metadata can be improved in many ways. It was also agreed that increased communication within the NOAA metadata community would be an important step in the process of improving metadata management and quality. NGDC will host a metadata workshop February 7-9, 2001 as a step toward increasing that communication. The workshop is designed for people who are actively working with NOAA metadata and metadata management tools. The first day of the workshop will feature detailed discussion of NOAA metadata quality and management. This discussion might include topics like: 1. Application of the FGDC Metadata Content Standard to NOAA data, i.e. what information goes in which sections of the FGCD, NOAA Supplemental Fields. 2. The process of updating NOAA metadata in NOAAServer: existing process and potential for automation. 3. The NOAAServer interface, simple vs. advanced searches, hierarchical theme keywords. 4. Other topics that might emerge from this group prior to the workshop. Please use the web-based discussion group at http://hypernews.ngdc.noaa.gov/HyperNews/get/geospatial/6/1.html to contribute topics or comments on existing topics. The second day of the workshop will begin with a focus on metadata management tools. The Data Centers are using COTS management tools developed by Blue Angel Technologies (http://www.blueangeltech.com). These tools are broad enough to address metadata management problems throughout NOAA and flexible enough to apply to NOAA data management needs in many other areas. Technical representatives from Blue Angel will introduce these tools on the morning of the second day and describe user case-studies that demonstrate several different applications of the tools. Specific questions that you might like Blue Angel to address can be posted at http://hypernews.ngdc.noaa.gov/HyperNews/get/geospatial/6/1.html. 
The afternoon of the second day offers several possibilities: more general discussion of specific NOAA Metadata problems, or more technical discussion of the Blue Angel tools. Either of these topics could be addressed in a plenary session or breakouts could support both topics. The third day is optional and will focus on specific technical aspects of using the Blue Angel tools. This day is designed as an advanced technical support day for NGDC staff that are working with these tools. We will cover installation and configuration of the tools, advanced aspects of data entry page design, display of metadata using XSL stylesheets and applications of the Blue Angel Software Development Kit (SDK). This SDK will support populating metadatabases directly from data processing systems. This message is being sent to potential metadata contacts throughout NOAA as a way to identify people that might be interested in attending the workshop. Please share this announcement with others that might be interested in attending and let me know if you plan to attend so that I can plan accordingly! -- Ted Habermann Information Architect NOAA, National Geophysical Data Center V: 303.497.6472 F: 303.497.6513 Ted.Habermann@noaa.gov
|
Discussion Topics: NOAA Metadata
|
Use this section of the forum to contribute discussion topics or to comment on existing topics.
|
Ideas for integrating metadata and data access
|
At the GeoSpatial/Metadata Workshop I presented several ideas for integrating metadata and data access systems in order to improve the effectiveness of NOAA's metadata. The ideas were Spatial Surrogates, Hierarchical Keywords, Spatially Enable Metadata, Lineage Tracking, Attribute Ranges / Inventory. Since the workshop I have added Keywords to <meta> tags to the list. Full resolution gifs from my talk are available at http://www.ngdc.noaa.gov/seg/talks/GSDWorkshop/habermann-2. Smaller versions are reproduced in this section of the forum. Feedback on these ideas would be great and identification of partners in implementing pilot projects would be incredible.
|
Spatial Surrogates
|
The Idea:The Implementation: |
Hierarchical Keywords
|
The Idea:The Implementation: |
Keywords to <meta> tags
|
The Idea:The Implementation: |
Spatially Enable Metadata
|
The Idea:The Implementation: |
Lineage Tracking
|
The Idea:The Implementation: |
Attribute Ranges / Inventory
|
The Idea:The Implementation: |
Metadata Management Tools - Web Interfaces
|
There are many metadata management tools available from many sources that might be helpful for NOAA Metadata Managers. This section of the forum is the place to share information about those tools.
|
FGDC Metadata Tools Page
|
The FGDC Metadata Tools page is at http://www.fgdc.gov/metadata/toollist/metatool.html
|
Web-based metadata entry tool from CCRS
|
The Canadian Center for Remote Sensing (CCRS) web-based metadata entry tool.
|
Review by Andrew Rushin (NOS Metadata Specialist)
|
Andrew's review is available at http://meridian.ngdc.noaa.gov/intranet/Metadata/webBasedToolReview.htm
|
We need versitile metadata management foundations
|
Andrew makes some great points that. I particularly support his idea that a single web-based tool is not going to satisfy all of our needs. More details are available at: http://meridian.ngdc.noaa.gov/intranet/Metadata/accesspaths.html
|
webtool discussion
|
Dear Andrew, The issues you raise are quite familiar to me. I developed a metadata standard before FGDC for an integrated global database (http://www.ngdc.noaa.gov/seg/fliers/se-2006.shtml). If you look at that, you will see that my metadata concept fits your second purpose - to thoroughly document the dataset. Also, because of the importance of metadata for (a) operational software, such as GIS; and (b) scientific documentation, I incorporated the metadata into the product in a fully integrated way. In fact, the on-line interface (follow the links) requires the user to go THROUGH the metadata to access files for download, and some of the metadata actually resides in operational GIS header files to fulfill an operability requirement (which provides quality assurance). Hence, users obtain data from within the frame of reference (context) of the documentation and integration concept, which is important (or critical) to understanding its use. The datasets as a whole are thus packaged with the metadata record for the dataset, including all lower-level documentation in appropriate system files. The idea is to ensure that they are never separate. Enter FGDC. We now have the concept of SEPARATING data from metadata for search purposes; a bad idea from the very beginning. I got a $40k grant from FGDC to explore the problem of optimal storage taking into account built-in hierarchies (integrated data products), such that one can PRODUCE (not store) FGDC records from a more optimal documentation structure. The problem to be solved, that FGDC did not even consider, is that metadata for related products have tremendous overlap and are inseparable from data for maintenance and archive purposes. The approach I explored would resolve those overlaps for a given integrated product and dynamically produce FGDC records on demand (at the level of interest). 
This approach is needed in complex databases because there literally could be thousands of FGDC records depending on the level at which the report is generated; however it cannot be left to an external system to accomplish because it needs to be part of the archive data product. The simplest solution is to generate the thousands of flat FGDC files and store them with the product (disk space is cheap and they're not really that big). But still, one needs a system to generate them in the first place, not through hand entry but capturing already coded information in system files. That strongly reinforces the issues you mentioned, that search and documentation are different things. Also, search does not require a full record (except in someone's Orwellian nightmere). That being the case, and given archival requirements for data and metadata (together) it only makes sense that the product metadata, maintained by the scientist or data specialist, should be the primary authority. That means that updates and corrections must propagate from the product to the search system, not the other way around. Now, the Blue Angel approach is still organized along the concept of individual FGDC flat records for an "FGDC data thing" (FDT). There are no relationships encoded between FDT's, i.e., no way to capture the hierarchies I mentioned above. The metadata are not stored as flat files, but in a different structure that allows for repeat information and lookup fields. This is designed to ease construction of a metadata record de novo. Ted is working on a capability to read in a flat file so it doesn't all have to be hand keyed. But there is no way the Blue Angel system can deal with the fundamental problem of hierarchical relationships in related datasets (or within datasets with multiple layers). 
The entity-attribute repeat entries were an attempt to capture the detailed sub-hierarchies, but there are too many built-in assumptions about how these are related to the overall record for it to work generally. My conclusion is that there must be two systems. There is the system the data producer/manager uses to develop data products and accompanying metadata (automated or not), and there is a separate clearinghouse search system. The first must be the primary authority for information and it must feed reports to the clearinghouse by whatever means works. Naturally, an automated connection is desired. Hence I am interested in building stand-alone data products that can produce FGDC records on demand (or have them prepared in advance by whatever system is used to create and edit them, and stored on the archive product). In that scenario, not all the FGDC information need be accessed for search purposes, or it could be delivered at different levels of detail as needed. In my more cynical moments, I suggest that all we really need at the national search level is a MARC record format, produced from archive data/metadata products. In other words, the FGDC approach has seriously confused these two purposes and produced a standard that is good for documentation but not for distributed search, while building a distributed clearinghouse system that really demands a simpler search record. All the best. John Kineman Andrew Rushin wrote: > > Attached is a review of the web-based metadata entry tool from CCRS. I found it to be a very nifty idea that if > properly executed, could solve some problems. However, the FGDC standard is complicated; and I have not yet seen a > web-based tool that adequately handles that complexity. I have not been able to access Blue Angel's metadata entry > tool. (Ted Habermann is working on a firewall issue.) I am interested to see how they have decided to approach it. > If you have any comments, please send them my way! >
|
Managing Metadata in HTML
|
HTML and associated tools like javascript, cascading style shttes, and extensible stylesheet language bring us unprecedented capabilities to display metadata in informative and accessible ways. These capabilities also come with some new challenges that are specifically related to limitations of these approaches.
|
Newlines
|
Newline characters do not exist in HTML text. Line breaks are represented by <br> tags or implied by other tags (H1, H2,..., P, LI). Blocks of text that include newline characters will, therefore, have the words at the end of each line crunched together with those at the beginning of the next line: the last line will look like "of thenext line". This also affects things like addresses in metadata; they must be comma separated rather than appearing on multiple lines. Of course, newlines included in preformatted sections (<pre>....</pre>) will be displayed, but this eliminates the capability to take advantage of HTML formatting within those sections. Usually this is not the desired look.
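A minimal sketch of the usual workaround (the function name and sample address are illustrative, not part of any NOAA tool): escape the text for HTML first, then replace each newline with a <br> tag before display.

```python
import html

def text_to_html(text):
    """Escape HTML special characters, then replace each newline
    with a <br> tag so the line breaks survive HTML rendering."""
    escaped = html.escape(text)
    return escaped.replace("\n", "<br>\n")

# A multi-line address like those found in metadata records.
address = "National Geophysical Data Center\n325 Broadway\nBoulder, CO"
print(text_to_html(address))
```

This keeps normal HTML formatting available for the rest of the page, unlike wrapping the whole block in <pre> tags.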
|
Managing Metadata in Relational Databases
|
Relational database systems are powerful tools for managing data of many kinds. The client-server architecture supported by those systems is particularly useful for groups managing data collections using web interfaces. Aspects of metadata management in RDBMS can be discussed here.
|
Metadatabase Design
| |||||||||||||||||||||||||||||
The design of the database for holding metadata is very different from the design that you might imagine or that you might create with only a small amount of database experience. It took me quite a while to figure out, but once you do, it is incredibly simple and powerful. It is what I call the Attribute / Value design. One table holds definitions of all of the attributes that can exist. In the FGDC case these definitions include whether or not the attribute can be repeated, along with some other properties. Another table holds the values of those attributes. A simple example: Attribute Table:
Value Table:
In this case we are using an attribute / value approach for a book database. It contains the titles, authors, and publication dates for two classic works on metadata. The titles are identified in the value table as rows that have Attribute ID = 1. The authors have Attribute ID = 2. If we want all of the fields for a book, we select rows from the value table where Record ID is 1. If a book has two authors, you just add another row to the value table for that record. What is confusing about this is that each "record" is actually spread out over many rows in the value table, one for each attribute. What is cool about it is that I can add any number of attributes without changing the database design, and there are no empty fields for attributes that some records do not have. Actually there are a bunch of cool things in addition to this. One of them is that Blue Angel can use this simple design for almost any sort of data that you can imagine in addition to metadata. This is what makes Blue Angel such a powerful tool for us to get to know. I can explain this more if you are interested. An example of a simple implementation for software documentation is available. |
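Here is a minimal sketch of the Attribute / Value design, using SQLite in place of the Oracle and Access back ends discussed elsewhere in this forum. The table layout follows the description above, but the book title and author names are placeholders (the original example tables did not survive conversion of this page):

```python
import sqlite3

# Two tables: one defines the attributes that can exist, the other
# holds every value, keyed by (record id, attribute id).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE attribute (
        attribute_id INTEGER PRIMARY KEY,
        name         TEXT NOT NULL,
        repeatable   INTEGER NOT NULL DEFAULT 0
    );
    CREATE TABLE value (
        record_id    INTEGER NOT NULL,
        attribute_id INTEGER NOT NULL REFERENCES attribute,
        value        TEXT
    );
""")
conn.executemany("INSERT INTO attribute VALUES (?, ?, ?)",
                 [(1, "Title", 0), (2, "Author", 1), (3, "Publication Date", 0)])
# Record 1 is one "book" spread across four rows -- note the repeated Author.
conn.executemany("INSERT INTO value VALUES (?, ?, ?)",
                 [(1, 1, "A Classic Work on Metadata"),
                  (1, 2, "First Author"),
                  (1, 2, "Second Author"),
                  (1, 3, "2001")])

# Reassemble all fields for record 1: one result row per attribute occurrence.
rows = conn.execute("""
    SELECT a.name, v.value
    FROM value v
    JOIN attribute a ON a.attribute_id = v.attribute_id
    WHERE v.record_id = 1
    ORDER BY a.attribute_id, v.value
""").fetchall()
for name, val in rows:
    print(f"{name}: {val}")
```

Adding a new attribute is just another row in the attribute table; no ALTER TABLE is ever needed, which is what makes the design so flexible.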
Discussion Topics: Questions for Blue Angel
|
Use this section to post questions about metadata management tools to be addressed by technical representatives from Blue Angel Technologies (or others).
|
BAT support for the Biological Data Profile
|
Does the BAT database (is it still called 'Repository'?) support the FGDC Biological Data Profile? The BDP is an FGDC-approved profile that includes several elements specifically related to biological data that have a geospatial component. More information about the BDP is available at http://www.nbii.gov/datainfo/metadata/standards/index.html. NODC and other NOAA components have a growing number of biological data sets being described using the National Biological Information Infrastructure [NBII] metadata creation tool 'metamaker', which supports a version of the BDP. We want to include all of the BDP extensions in one of our record sets, preferably without going through the hassle of redefining the template to accommodate this standard profile. Is this profile already created in the database?
|
BAT support for multiple theme (and other) keyword thesauri
|
In the MS Access version of Repository, I can identify multiple keyword thesauri, but it is not clear how to make a link to each thesaurus as a picklist of choices. For example, I made an XML file with 'nodc sea area names' to use as place keywords. I also want to use the somewhat different GCMD Place keyword list as a second choice of thesaurus for keywords. I can manually identify and type in each thesaurus, but I don't know how to make this a repeatable 'two-tiered' selection: first tier: choose a thesaurus from one or more thesauri; second tier: choose one or more terms from the selected thesaurus; repeat as needed. A more realistic example of this requirement: NODC is developing a keyword dictionary from which a thesaurus will be derived. We want one 'theme keyword thesaurus' to be the NODC list. The NOAA Central Library uses the Library of Congress Subject Headings as their authoritative 'theme keyword thesaurus', so a second choice is this list. NGDC has advocated use of the GCMD Keyword lists, which represent a third keyword thesaurus. Many individual data collection projects have their own keyword lists. To be most useful to different customer groups, the data describer may decide to use more than one of these lists. Does the Oracle version of 'Repository' provide this capability?
|
BAT support for multiple record sets for a single data center in Oracle version
|
In the MS Access version of Repository, I have multiple record sets (standard products, non-standard products, nndcserver products, etc.). Will NODC and the other data centers be able to maintain multiple record sets of our own data in the Oracle version of Repository?
|
Participants
|
The present list of participants for the meeting is available at http://lithophyte.ngdc.noaa.gov/~haber/Participants.htm
|
Agenda
|
No hidden agendas here. This meeting has two sections and potentially two tracks. Wednesday is a day for NOAA Metadata Specialists to discuss the present NOAA Metadata Management System and the future of that system. Topics for discussion can be entered into the Discussion Topics section of this forum. This discussion can continue on Thursday afternoon if need be. Our colleagues from Blue Angel have put together a plan for the technical track of the meeting that starts on Thursday morning. This plan is designed for people interested in the more technical aspects of the Blue Angel Tools. It includes an overview and introductory segments that will allow those people to come up to speed quickly. That plan is presented in this section. Questions for Blue Angel can be posted above. |
Thursday Morning
|
|
Thursday Afternoon
|
|
Friday
|
|
Proposed Agenda for Wednesday
|
Possible time slots for discussion
|
8:30 - 8:45 Welcome / Logistics
|
We will discuss logistics and welcome participants to the workshop!
|
8:45 - 10:00 FGDC Metadata Content Standard - What is it and how do/can we use it?
|
The FGDC provides a broad content standard for metadata that cannot be applied to specific types of data without some interpretation. Different groups in NOAA have interpreted some sections in slightly different ways. These differences, while minor, cause difficulties in managing and accessing the metadata. If we can agree on those interpretations it will simplify the metadata management task and enable us to take advantage of system developments across NOAA. The FGDC includes some sections that may support powerful capabilities that we are not presently taking advantage of. These include spatial keywords, lineage tracking, attribute definitions and ranges, and others.
|
10:15 - 11:00 FGDC Metadata Content Standard - Why we can't use it.
|
One of the most common conclusions reached about the FGDC content standard is that it doesn't fit my data (despite the fact that it is so broad). This conclusion is the jumping off point from which the development of stovepipe metadata systems is justified. It is important that we understand aspects of NOAA data that are not addressed by FGDC. Are these common across many groups in NOAA?
|
11:00 - 12:00 Metadata System Requirements - What do we need this system to do?
|
If we lived in a perfect world what would the NOAA metadata management infrastructure look like and do? Do we need such an infrastructure? Would it consist of a myriad of specialized systems? Would it be centralized? Would it be made up of nodes? How would the system be searched? Would it connect to the NSDI? Would it be GeoSpatial? Who would operate the system? Who would manage the metadata content? What would Napster look like if it were a metadata system? Who pays for the system?
|
1:00 - 2:00 NOAAServer - Under the hood
|
NOAAServer provides a metadata search mechanism that is presently used by a number of NOAA groups. How does the present system work at the metadata production nodes and at the hub? How is the content transferred to NOAAServer? What does it look like when it is transferred? What happens when it gets there? How is it indexed? How is the index searched?
|
2:00 - 3:00 NOAAServer - Interfaces
|
The present interface to NOAAServer was initially developed several years ago. It provides keyword, spatial extent, and temporal extent search interfaces. Other interfaces might be useful and might extend the metadata search capability in new ways. Options might include direct URL interfaces, keyword hierarchies, spatial hierarchies, ….
|
3:15 - 4:00 Metadata Management / Discovery Alternatives
|
There are many alternative approaches to managing metadata and to using metadata in the data discovery process. As the amount of metadata increases our metadata management / discovery systems may need to evolve. What are present systems and alternative systems? What are the advantages and disadvantages of these alternatives? Do we need NOAAServer? GCMD Portals? FGDC Clearinghouse Nodes? How do metadata and HTML searches compare?
|
4:00 - 5:00 Conclusions / Recommendations
|
What have we learned today? Are there topics that we will continue to discuss over the next two days? Are there conclusions or recommendations that we can make? Who do we make them to?
|
Meeting Report
|
The Metadata Workshop in Boulder brought together a diverse collection of people from throughout NOAA with very different metadata experience and needs. This section summarizes some of the interchanges.
|
What we like and dislike about metadata
|
We started the meeting with all participants describing what they liked and disliked about metadata. A few of the thoughts: What I like about Metadata
|
What is Metadata
|
Anne Ball suggested that we agree on what metadata is before we discuss it further. Some suggestions:
|
|
|
Shape File of Power Plants?
|
I'm looking for a source for US power plant locations in shape file format. Would like to use it to filter hot spots detected on satellite data for our wildfire program. Anyone know of a source? Thanks in advance!
|
NGDC Geospatial Data Services Group Minutes
|
The NGDC Geospatial Data Services Group meets occasionally to discuss their activities. This is where minutes of those meetings are kept.
|
April 3, 2002
|
Topics discussed during the April 3, 2002 meeting go here.
|
Plans / Templates
|
I have been trying to improve my understanding and implementation of project management as part of my work on CLASS. Chas Schirmer (who works with Eric) has made a rather strong recommendation for the ideas presented in Steve McConnell's book "Software Project Survival Guide". We have a copy of this book available for the group. The website for this book is at http://www.construx.com (click on the survival guide website link). That site includes templates for a number of plans. I am experimenting with those templates to see if they might help this group understand what we are doing. Examples of a project plan and an architecture plan are in the DDSS directory on our intranet (http://meridian.ngdc.noaa.gov/intranet/Geospatial/DDSS/). What do we need to add/subtract from these templates to make them useful?
|
CVS Client Overview
|
Setup: To use CVS, you need an account on cave.ngdc.noaa.gov (contact linux-admin@rt.ngdc.noaa.gov) and you need to belong to the group nndccvs. You also need software to run CVS. The command-line cvs tool is part of the Linux distribution; however, there are more convenient GUI clients. For Linux there is Cervisia, located at http://cervisia.sourceforge.net For Windows and Mac you can use something from the CVS GUI project: http://cvsgui.sourceforge.net/ TortoiseCVS also looks interesting: http://www.tortoisecvs.org/

Once you have the above set up, you will need to know the CVSROOT. In our case the CVSROOT is ":pserver:cave-username@cave.ngdc.noaa.gov:/cvs/projects/nndc" This means we are using CVS's password server, connecting as your cave username, the machine is cave.ngdc.noaa.gov, and the project home directory is /cvs/projects/nndc

Here are some links for tutorials and such: Miros page. I think you should read the basic concepts page.

A quick tutorial from me (using command-line cvs). First one must log in; if your CVSROOT is set, all should be fine here:
>cvs login
Now we will create a place on the local machine to put CVS files:
>mkdir CVS
>cd CVS
Now to check something out:
>cvs checkout tutorial/files
Notice how this creates the entire directory structure. Now change to the directory tutorial/files, edit the users.txt file to include your name, and save the file. Now check it back in:
>cvs commit
Now the file on cave is up to date. If you are done and do not need the files anymore, I recommend releasing the files. Change back to the CVS directory:
>cvs release -d tutorial
It says you have 0 files changed (which is good because your file was updated). That is a sample session.

Now I will tell you how to import your files into CVS. First use the guidelines page to determine the proper directory structure on CVS. This is important because it is a pain to change once created. Then change to the directory that contains the files and/or directories you want to import. Now type the command:
>cvs import -m "some cvs log message" module/directory ngdc start
Your files are now in CVS. For a better tutorial and documentation, go to the CVS home documentation. Don't forget to read the CVS guidelines page; this is not a final document or anything, it needs comments and questions and such: guidelines doc |
meeting notes 4-3-2002 on Gravity
|
Gravity project progress |
Spatial Snapshots
|
Goal: To create snapshots of GLOBE (School) data as once done by Ken Tanaka and spatially enable them using Mark's dbms->sde program. Highlights: Tom Gaines and I will work with Ken Tanaka's old sde_school.sql procedures to: 1) Create a database link from nndc to FSL's gdbd production machine; 2) Create a materialized view (snapshot) of GLOBE observation data, with special interest in grabbing the "firstReport, lastReport" data in order to create the timeline histograms for FYP; 3) Have this view refresh once per week; 4) Develop a trigger and/or Oracle job that will take the snapshot and spatially enable it using Mark's dbms->sde. The aim is to execute this process in Oracle without having to use a cron job to run a shell script.
|
April 9, 2002
|
The regular time for these meetings has been moved to 9:00 AM on Tuesdays. At this meeting Ted discussed application of the plans that he has been working on for CLASS to projects in this group. He showed examples of a project plan (http://meridian.ngdc.noaa.gov/intranet/Geospatial/DDSS/Data%20Discovery%20Support%20System%20Project%20Plan.htm) and an architecture plan (http://meridian.ngdc.noaa.gov/intranet/Geospatial/DDSS/Data%20Discovery%20Support%20System%20Architecture.htm). Ted also discussed a proposed task list for NGDC in the CLASS project.
|
June 11, 2002
|
Meeting Minutes
Geospatial Data Group weekly meeting on June 11, 2002
Attendee list: John Cartwright, Mark Ohrenschall, Dan Kowal, Thomas Gaines, Kris Nuttycombe, Josh Klaus, Travis Stevens, Ted Habermann, James Barkley, Karen Horan
Meeting location: WADI room (1B307)
Meeting time: 3:00 - 3:40
Minutes recorded by: James Barkley |
minute items
|
1. Karen talked about metadata standards in the gridded data sets she was working on loading. This opened up some discussion about how to make people care about metadata.
2. Jimmy talked about the WIST tool, which is currently being revamped by Herbert and Jimmy. Their goal is to have the WIST configuration files in an XML structure and the processing in Java.
3. Seven copies of "The Elements of Java Style" have been purchased for the group in order to raise coding standards in accordance with the FSA results.
4. There was some talk about eliminating the test layer of the IDB.
5. Josh has put a hazards viewer with shape files on NVDS.
|
June 18, 2002
|
Meeting minutes for June 18, 2002 Geospatial Data Group weekly meeting Attendee list: John Cartwright, Mark Ohrenschall, Dan Kowal, Thomas Gaines, Kris Nuttycombe, Josh Klaus, Travis Stevens, Ted Habermann, James Barkley, Karen Horan. Meeting location: WADI room (1B307) Meeting time: 9:05 - 10:15 Minutes recorded by: James Barkley There was some discussion about improving the OSEI fire hazard layer. It was suggested that the markers be changed to circles, but it was decided that this would confuse the users since hurricane events are also circles. It was decided that they will stay as rectangles but will increase in size. We received a copy of NNVL's MySQL database, from which Ted extracted 1112 lat/lon items. There was some discussion about having interns get images from the NNVL database. There was a lot of discussion on the Satellite Metadata Characterization Project. Ted gave a top-down explanation of the project, which prompted a number of important questions. Jimmy has created an XML characterization of the level 1b file headers and is working on getting it into Blue Angel. Kris has created a schema as a Blue Angel template for describing our comprehensive metadata model. There are potential issues with the namespace element. Tom has been working on characterizing the metadata from the DMSP database. Kris has reviewed FGDC's remote sensing extensions in order to see how well our metadata model fits with FGDC's. Minutes from the last meeting were approved. Tom is working to automate the process of spatially enabling layers; if he is successful this will be a large windfall. Progress is being made with the hurricane tracking as far as getting geotiffs and putting them into map services. There was some discussion about group communication. Ted offered some guidance on presenting and explaining work to others.
|
June 25, 2002
|
Meeting minutes for June 25, 2002 Geospatial Data Group weekly meeting -------------------------------------------------------------- Attendee list: John Cartwright, Mark Ohrenschall, Dan Kowal, Thomas Gaines, Kris Nuttycombe, Josh Klaus, Travis Stevens, James Barkley Absentee list: Ted Habermann, Karen Horan. Meeting location: WADI room (1B307) Meeting time: 11:00 - 11:50 Minutes recorded by: James Barkley -------------------------------------------------------------- FYP --- FYP was discussed. Kris has rewritten the hierarchy server code in JSP. This has improved code readability and also yielded a significant efficiency gain. The Oracle statistics are set up for FYP. However, there are data integrity issues with FYP, as well as interest in getting more up-to-date data. There has been some criticism from Paleo about the reliability of the data. AUTO-SPATIAL ENABLE (GLOBE, NNVL, OSEI) ----------------------------------- One of Mark's programs is the best candidate for auto-spatially enabling data. Tom and Mark are going to design a model for the spatial enabling system. Tom will modify some daemons to fit into this model. NNVL -> FGDC ----------- Josh took 600 of the 1000 or so images from NNVL and produced them as a layer on the map server. They can be seen displayed at http://map.ngdc.noaa.gov/website/stp/hurricanes/ (they are the yellow dots). COLLECTION MAP SERVICES ---------------------- This would be a capability foundation for a wide variety of (all?) map services. The biggest problem is pulling spatial data from Blue Angel. SATELLITE METADATA CHARACTERIZATION PROJECT ------------------------------------------ Ted, Jimmy, and Kris decided to use a relational database to implement the metadata crosswalk model. This will increase search capabilities. An Oracle database has been created on cheetah for this project, and all of the SAA information has been loaded into it. The 1b file header information should be in the database soon, too. Jimmy and Kris are collaborating to develop a web interface for the database. GRIDDED METADATA MODEL ----------------------- Mark has been working on the gravity grids and the ecosystems data in LAS/DODS. There are a few small problems, such as displaying large sets in Ferret. HAZARDS YEARBOOK ---------------- Dan has put the tsunamis, tsunami runups, and earthquakes in the hazards yearbook; volcanoes and hot springs remain. Tom has been cleaning up the hazards data. MINUTES APPROVAL ---------------- Last meeting's minutes were approved by the group. CVS --- There was some discussion about CVS, and some debate about a collective model versus a project-based model. No consensus was reached.
|
Meeting minutes for July 02, 2002
|
Meeting minutes for July 02, 2002 Geospatial Data Group weekly meeting -------------------------------------------------------------- Attendee list: John Cartwright, Mark Ohrenschall, Dan Kowal, Thomas Gaines, Kris Nuttycombe, Josh Klaus, Travis Stevens, James Barkley Absentee list: Karen Horan. Meeting location: WADI room (1B307) Meeting time: 9:00 - 10:15 Minutes recorded by: James Barkley -------------------------------------------------------------- Sharon Meesnick from NCDDC visited. WCS is an emerging specification for delivering data that could possibly be used on top of DODS. The Resource Description Framework (RDF) is another interesting emerging technology, but there is no clear place for it in our projects. John Muller is retiring. The satellite characterization metadata project is coming along: most of the datasets are in the database and the interface for doing crosswalks is in place. The SRID in FYP is not getting written to the metadata tables. We have started thinking about the user interface for FYP. Dan is in the process of making sure all the FYP search and display looks are working. Tom has deployed the WAR file on nndc2. The "Meta-tool" is on metadata1 instead of redbull. The IDB Metadata Connection has been run on some gravity sets. We should think about what new machines are coming in and what needs to be on them.
|
July 9, 2002
|
Meeting minutes for July 09, 2002 Geospatial Data Group weekly meeting -------------------------------------------------------------- Attendee list: Karen Horan, Travis Stevens, Josh Klaus, Thomas Gaines, Kris Nuttycombe, Ted Habermann, Mark Ohrenschall, James Barkley Absentee list: John Cartwright, Dan Kowal Meeting location: WADI room (1B307) Meeting time: 9:15 - 10:30 Minutes recorded by: James Barkley -------------------------------------------------------------- The Satellite Metadata Characterization project is progressing rapidly. The interface for doing the crosswalks has been completed and is being used to create crosswalks. It still may need some fine-tuning. All of the data has been streamlined and put in the database. Tom suggested adding a "note" option to the crosswalk tool so that one can attach a note to a crosswalk. Kris, Tom, Travis, and Jimmy should be thinking about general documentation for this project, as well as a homepage for the project. The datasets still missing are the CDML satellite statistics and the DMSP. People have been using the metadata comparison tool at http://metadata.ngdc.noaa.gov/metadata/login.jsp. Ted suggested we investigate the possibility of a geodetic datablade that works in polar coordinates, not grids. One of our values is to fill metadata fields as completely as possible. Ted will email the CLASS vision statement to the group. Everyone was given a list of "ilities" to keep in mind when considering project development. There was a lengthy discussion on some of them. The items that were heavily focused on included usability, manageability, testability, and security. Ted has asked us to rank these ourselves, giving each item a "high", "medium", or "low" ranking. Travis, Jimmy, Kris, and Ted will be on travel in Washington D.C. from the 21st through the 26th. Tom, Mark, and John will be on travel in Washington D.C. the 23rd through the 26th. Thanks for the bagels, Karen!
|
Minutes 7/30
|
Meeting minutes for July 30, 2002 Geospatial Data Group weekly meeting -------------------------------------------------------------- Attendee list: Karen Horan, James Barkley, Mark Ohrenschall, John Cartwright, Dan Kowal, Travis Stevens, Kris Nuttycombe, Ted Habermann, Josh Klaus, Tom Gaines *Joe Zajic arrived at 9:30 Meeting location: WADI room (1B307) Meeting time: 9:10 - 10:10 Minutes recorded by: James Barkley -------------------------------------------------------------- The CLASS talk from the Senate hearing on NOAA's satellite systems said that NOAA does not have its act together in its future planning. The group had a successful trip to D.C. NNVL The lat/lon was added to the database. This changes the transfer process. We ran Perl scripts to populate the lat/lon columns and modified the interface for adding images to include lat/lon. This means that all future images should have lat/lon in them. OSEI Ted argued about the architecture of their metadata system. Ted's argument was that it is important to get the data into the database as soon as possible; quality checking then becomes an ongoing process. In a plan to implement an area sort, Ted and others will attempt to come up with some suggested keywords for theme keywords (science words) and place keywords (GIS). We should be able to determine place keywords in an automated way. Aurora has already completed 700 images. OSEI says they have 10,000 images, but Ted thinks this is a "brutal over-estimate." DBMS2SDE A where clause needs to be added in DBMS2SDE; otherwise an estimated 10% downtime occurs from dropping tables and re-creating them. COASTWATCH Ted's CoastWatch talk was well received. The CoastWatch images are getting extensive metadata generated for each of their geotiffs. Unfortunately the Satellite Active Archive (SAA) does not keep this metadata when the images are transferred to them. SOURCEFORGE Joe Zajic visited us. He has lots of project management experience in software development and is excited to set up a SourceForge for lots of different groups including ours, IP, CLASS, CoRIS, and others. This "NOAAForge" has many advantages, including better user-developer interaction, communication among collaborating groups, and project management. METADATA At OSEI we talked about the metadata tools and how they might be refined. Grouping and "batch" operations on groups were discussed, along with many other suggestions for an improved interface. Having the OSEI website on top of the IDB or BAT would be an interesting programming task. SMC MARC was added, and Kris will add the hierarchical info while Karen populates it. Someone needs to send an email to Tom Passin to dump the database and interface on CD and send it to us. ARCIMS NMFS and the Climate Reference Network want JSP and ArcIMS development from us. CLASS A white paper on accessing gridded data to be able to deal with custom raster data is now available. This could be an interesting aspect of the netCDF study. One of the largest potential problems with CLASS was that it had only dataset-dependent features. Eric Kihn wants to get an AIX box from IBM and set up an Informix gridded datablade for the CLASS project.
|
ArcIMS Training / Workshop at NGDC October 2002
|
Information presented at the meeting and questions / answers go here.
|
Presentations
|
|
The Big Geospatial Picture
|
This is Ted Habermann's Presentation: http://www.ngdc.noaa.gov/seg/talks/TheBigGeospatialPicture_files/frame.htm
|
Using this forum | For information on this forum contact: Ted Habermann |