Greg,
At NCSU we are working with FGDC metadata in our NDIIPP project
focusing on preservation of state and local government digital
geospatial data. Much of the state data and some of the local data
arrives with FGDC metadata. In our ingest workflow we are validating,
normalizing, synchronizing and remediating existing FGDC metadata, and
batch generating FGDC metadata where none exists. In the latter case the
focus is on the subset of software-generated technical metadata
elements, supplemented with information from agency templates and
selectively added details from data inventories. We are normalizing to
the ESRI Profile of FGDC in order to
take advantage of additional technical metadata elements that are
available. Not all of this has been fully automated yet, as we are
still working out the kinks and refining the workflow.
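To give a concrete sense of the validation step, here is a minimal
sketch of a required-element check over an FGDC CSDGM record in XML.
The element paths use the standard FGDC short names (idinfo, citeinfo,
metd, etc.); the sample record and the particular set of required paths
are illustrative only, not our production rules.

```python
# Minimal sketch: flag FGDC CSDGM records missing required elements.
# REQUIRED_PATHS is a hypothetical subset chosen for illustration.
import xml.etree.ElementTree as ET

REQUIRED_PATHS = [
    "idinfo/citation/citeinfo/title",
    "idinfo/descript/abstract",
    "idinfo/timeperd/timeinfo",
    "metainfo/metd",
]

def missing_elements(xml_text):
    """Return the required FGDC paths absent from a metadata record."""
    root = ET.fromstring(xml_text)
    return [p for p in REQUIRED_PATHS if root.find(p) is None]

record = """<metadata>
  <idinfo>
    <citation><citeinfo><title>Sample layer</title></citeinfo></citation>
  </idinfo>
</metadata>"""

print(missing_elements(record))
```

In practice a check like this complements mp, which does the full
standards validation; a lightweight pass is handy for triaging large
batches before remediation.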
We've participated in some FGDC-ISO crosswalk tests in order to prepare
for the upcoming shift from FGDC ver. 2 to the North American profile of
the ISO standard. In exchange scenarios such as this we strip the
ESRI Profile elements and synchronization tags for standards
compliance. Tools we use include ArcCatalog, the NPS Metadata
Toolkit, mp, and cns. In the past we also used an in-house modified
version of the NOAA Metadata Collector extension for batch processes
(rewriting the dialog-driven tool as a non-dialog batch processor for
extraction of technical elements)--the NPS Toolkit has superseded that
for batch processes but there are questions about ongoing support of the
NPS software. ArcObjects-based processing may be a next direction for
some of the things that we'd like to automate further--e.g., better
control of synchronization.
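The stripping step can be sketched roughly as follows, assuming records
in ArcCatalog's XML form, where the ESRI Profile shows up as an Esri
element plus Sync attributes on synchronized elements (the sample
record is a toy example):

```python
# Hedged sketch: remove ESRI Profile additions (the <Esri> subtree and
# Sync attributes) from an FGDC record before standards-compliant exchange.
import xml.etree.ElementTree as ET

def strip_esri_profile(xml_text):
    """Drop <Esri> subtrees and Sync attributes from an FGDC XML record."""
    root = ET.fromstring(xml_text)
    for parent in root.iter():
        for child in list(parent):
            if child.tag == "Esri":
                parent.remove(child)
    for elem in root.iter():
        elem.attrib.pop("Sync", None)  # remove synchronization markers
    return ET.tostring(root, encoding="unicode")

record = """<metadata>
  <Esri><CreaDate>20060101</CreaDate></Esri>
  <idinfo>
    <citation><citeinfo><title Sync="TRUE">Sample layer</title></citeinfo></citation>
  </idinfo>
</metadata>"""

cleaned = strip_esri_profile(record)
```

The result can then be run through cns and mp to confirm compliance
against the base standard.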
Our partner on the NDIIPP project is the state GIS agency, which
conducts metadata outreach and training for local agencies. As part of
the project we are compiling metadata quality information to be passed
back to the metadata outreach effort to inform the training process and
hopefully improve the quality of metadata seen in the longer term.
Quality issues in this case include things like a change in datum or a
change in format not being recorded, and other inaccuracies. Lack of
concurrency between the data and metadata is a common problem.
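One simple way to surface the concurrency problem is to compare the
metadata date (metainfo/metd, in CCYYMMDD form) against the dataset
file's modification time. This is an illustrative check only, not our
actual quality-review procedure, and the function names are made up:

```python
# Illustrative staleness check: does the dataset postdate its metadata?
import os
import xml.etree.ElementTree as ET
from datetime import datetime, date

def metadata_is_stale(xml_text, data_path):
    """True if the dataset file was modified after the metadata date."""
    metd = ET.fromstring(xml_text).findtext("metainfo/metd")
    if metd is None or len(metd) != 8:
        return True  # no usable metadata date: treat as needing review
    md_date = datetime.strptime(metd, "%Y%m%d").date()
    data_date = date.fromtimestamp(os.path.getmtime(data_path))
    return data_date > md_date
```

A mismatch doesn't prove the metadata is wrong, of course, but it is a
cheap signal for prioritizing records for human review.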
We've also mapped FGDC elements, along with other technical and
administrative metadata cultivated in the ingest preparation process, to
DSpace QDC for repository ingest. The plan is to include the FGDC
record itself as well as a METS wrapper as bitstreams in the DSpace
item--we're still fleshing this part out. The DSpace QDC mapping is
seen as a single spoke in the repository preparation process, as we'd
also like to investigate mapping to FEDORA SIPs, etc.
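The shape of the crosswalk can be sketched as a table-driven mapping
from FGDC element paths to qualified DC field names. The pairings shown
here are just the obvious ones (title, abstract, originator, theme
keywords) for illustration, not our full mapping:

```python
# Sketch of a table-driven FGDC -> qualified Dublin Core crosswalk.
# FGDC_TO_QDC holds illustrative pairings, not the complete mapping.
import xml.etree.ElementTree as ET

FGDC_TO_QDC = {
    "idinfo/citation/citeinfo/title": "dc.title",
    "idinfo/descript/abstract": "dc.description.abstract",
    "idinfo/citation/citeinfo/origin": "dc.contributor.author",
    "idinfo/keywords/theme/themekey": "dc.subject",
}

def crosswalk(xml_text):
    """Map FGDC element values onto qualified DC field names."""
    root = ET.fromstring(xml_text)
    qdc = {}
    for path, field in FGDC_TO_QDC.items():
        values = [e.text for e in root.findall(path) if e.text]
        if values:
            qdc[field] = values  # repeatable fields keep all values
    return qdc

record = """<metadata><idinfo>
  <citation><citeinfo>
    <origin>Sample Agency</origin><title>Sample layer</title>
  </citeinfo></citation>
  <descript><abstract>County parcels.</abstract></descript>
  <keywords><theme>
    <themekey>parcels</themekey><themekey>cadastral</themekey>
  </theme></keywords>
</idinfo></metadata>"""

qdc_fields = crosswalk(record)
```

Keeping the mapping as data rather than code makes it easy to swap in a
different spoke (Fedora, METS, etc.) later.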
Going back to the mid-to-late 90's we've been involved
in a variety of other projects with FGDC metadata components: serving
as the NPS metasearch source (using Isite to index and serve SGML
records via Z39.50 GEO profile); cross-walking FGDC to MARC as a
beta-tester of a USGS tool, to incorporate geospatial data into the
library catalog; and developing a public access discovery system that
involves disambiguating key access elements from FGDC metadata
(including key access elements or facets that are not discretely defined
in FGDC) to inform discovery and selection. Also, over the years we've
provided metadata authoring training and support to grad students and
research staff working on various campus projects.
Best regards,
Steve Morris
gmarch wrote:
>Hello,
>
>I am curious to know if any DLF partners are actively involved with FGDC
>metadata related projects?
>
>Thank you,
>
>Greg
>
>Gregory March
>Graduate Research Assistant
>Map Library
>15 Hoskins Library
>University of Tennessee
>Knoxville, TN 37996
>(865)974-4315
>[log in to unmask]
>
>
--
Steve Morris
Head of Digital Library Initiatives
North Carolina State University Libraries
Phone: (919) 515-1361 Fax: (919) 515-3031
[log in to unmask]