We use LinkScan to check the 856 fields for those monograph and serial
records that have them. Every week, the catalog dumps an export of record
numbers, titles, and 856 fields into one large file per category of
record. LinkScan runs through each file and reports broken links for
monographs and serials. The group in our Technical Services department
responsible for the data uses the reports to clean things up.
Our catalog has about 8 million records, but far fewer of them are
serials or monographs with 856 fields.
The review is manual and sporadic: the reports do get looked at, but
probably not as often as they are generated.
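
For anyone rolling their own instead, here is a minimal sketch in Python
of that kind of weekly check. It assumes a tab-delimited export of record
number, title, and URL, one per line; the file layout, column order,
User-Agent string, and timeout are illustrative, not anything
LinkScan-specific.

import csv
import sys
import urllib.error
import urllib.request

TIMEOUT = 10  # seconds per request

def check(url):
    """Return None if the URL resolves, else a short error string."""
    req = urllib.request.Request(
        url, method="HEAD",
        headers={"User-Agent": "catalog-linkcheck/0.1"})
    try:
        with urllib.request.urlopen(req, timeout=TIMEOUT):
            pass
    except urllib.error.HTTPError as e:
        return "HTTP %d" % e.code
    except (urllib.error.URLError, OSError) as e:
        return "error: %s" % e
    return None

def main(path):
    with open(path, newline="") as f:
        for row in csv.reader(f, delimiter="\t"):
            if len(row) != 3:  # skip malformed export lines
                continue
            rec_no, title, url = row
            problem = check(url)
            if problem:
                print("\t".join((rec_no, problem, title, url)))

if __name__ == "__main__":
    main(sys.argv[1])

A real checker would also retry with GET (some servers reject HEAD),
throttle requests per host, and honor an exclusion list for sites that
don't want to be checked.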
--
Ken Varnum
Web Systems Manager
University of Michigan Library
300C Hatcher Graduate Library
Ann Arbor, MI 48109-1190
E: [log in to unmask]
T: 734-615-3287
F: 734-647-6897
http://www.lib.umich.edu/users/varnum
On 2/23/12 12:02 PM, "Tod Olson" <[log in to unmask]> wrote:
>There's been some recent discussion at our site about revi(s|v)ing URL
>checking in our catalog, and I was wondering if other sites have any
>strategies that they have found to be effective.
>
>We used to run some home-grown link checking software. It fit nicely into
>a shell pipeline, so it was easy to filter out sites that didn't want to
>be link checked. But the reports still had too many spurious errors. And
>with over a million links in the catalog, there are some issues of scale,
>both for checking the links and consuming any report.
>
>Anyhow, if you have some system you use as part of catalog link
>maintenance, or if there's some link checking software that you've had
>good experiences with, or if there's some related experience you'd like
>to share, I'd like to hear about it.
>
>Thanks,
>
>-Tod
>
>
>Tod Olson <[log in to unmask]>
>Systems Librarian
>University of Chicago Library