There is another MARC handling library, written in Java, called marc4j ( https://github.com/marc4j/marc4j ). It is used heavily by the open-source project SolrMarc, which extracts information from MARC records and builds Solr records. One of the plug-ins for SolrMarc that our university's library uses gathers summary holdings information, including translating the 853 and 863 fields to text. It even attempts to collapse a series of consecutive holdings into a range, for example: 2015, no.1 - 2017, no.46 (2015:Jan 5 - 2017:Nov 13), although I just found that in that case it missed a few gaps in the middle of the range. I can make the necessary code available if anyone is interested.

-Bob Haschart
University of Virginia Library

On 1/19/2018 7:32 PM, Julie Cole wrote:
> It will take me a while to delve into this all and understand it, so I can determine that the code IS indeed helpful. But one thing I know is that, as people, you are all very helpful. Thanks so much for sharing. I love this community.
> Julie.
>
> -----Original Message-----
> From: Code for Libraries [mailto:[log in to unmask]] On Behalf Of Spurgin, Kristina M.
> Sent: Friday, January 19, 2018 8:13 AM
> To: [log in to unmask]
> Subject: Re: [CODE4LIB] MARC Holdings
>
> That MFHD.pm code is so helpful! Thanks for sharing.
>
> -Kristina
>
>> -----Original Message-----
>> From: Code for Libraries [mailto:[log in to unmask]] On Behalf Of
>> Mike Rylander
>> Sent: Thursday, January 18, 2018 5:49 PM
>> To: [log in to unmask]
>> Subject: Re: [CODE4LIB] MARC Holdings
>>
>> Evergreen has Perl code for generating holdings statements from MFHD
>> that we use in the serials module. The core MFHD module could
>> certainly be used directly (with, perhaps, a namespace change), and
>> the serials module code could serve as a guide for how to make use of
>> the MFHD module.
>> The latter is, of course, tied to Evergreen's data
>> structures and general architecture, but the specific calls made by
>> the serials code to use the MFHD module should be relatively
>> transparent to a Perl developer, and you can ignore all the Evergreen
>> bits. We're dealing with MARC data (MARC::Record objects) at the
>> point that we're calling the MFHD module, so that should be
>> translatable to one's own code.
>>
>> See MFHD.pm and the contents of the MFHD directory here:
>> http://git.evergreen-ils.org/?p=Evergreen.git;a=tree;f=Open-ILS/src/perlmods/lib/OpenILS/Utils
>>
>> The serials module is implemented by the code here:
>> http://git.evergreen-ils.org/?p=Evergreen.git;a=blob;f=Open-ILS/src/perlmods/lib/OpenILS/Application/Serial.pm
>>
>> HTH,
>>
>> --
>> Mike Rylander
>> | President
>> | Equinox Open Library Initiative
>> | phone: 1-877-OPEN-ILS (673-6457)
>> | email: [log in to unmask] | web: http://equinoxinitiative.org
>>
>>
>> On Thu, Jan 18, 2018 at 4:25 PM, Spurgin, Kristina M.
>> <[log in to unmask]> wrote:
>>> The MARC libraries cited make it easy to work with MARC in general,
>>> but unfortunately the Perl and Ruby versions don't come with any
>>> help for the "interesting" problem of transforming Holdings 853s and
>>> 863s into human-readable summary holdings statements (like you'd
>>> record in the 866 or 867). (I haven't worked with PyMARC.)
>>>
>>> We had the same sort of need you describe years ago and came up with
>>> some Perl code that clunkily (and in some cases not quite 100%
>>> accurately) does this. It's not publicly available to point to, but
>>> I could send the relevant part of that code if you are interested.
>>>
>>> There has been mild grumbling (from those who pay attention to our
>>> serials display) about the not-great way this works, and we are
>>> working on a new discovery interface, so it's on my list to improve
>>> the summary holdings generation from 853s/863s.
>>> I did some searching for code to do this, but didn't find anything
>>> in my first attempt. If you find something useful that someone else
>>> has for this, please do share!
>>>
>>> I've been thinking through a good approach, but don't have anything
>>> implemented yet.
>>>
>>> best,
>>> Kristina
>>>
>>> -=-
>>> Kristina M. Spurgin -- Library Data Strategist
>>> E-Resources & Serials Management, Davis Library
>>> University of North Carolina at Chapel Hill
>>> CB#3938, Davis Library -- Chapel Hill, NC 27514-8890
>>> 919-962-3825 -- [log in to unmask]
>>>
>>>> -----Original Message-----
>>>> From: Code for Libraries [mailto:[log in to unmask]] On Behalf
>>>> Of Andromeda Yelton
>>>> Sent: Thursday, January 18, 2018 4:01 PM
>>>> To: [log in to unmask]
>>>> Subject: Re: [CODE4LIB] MARC Holdings
>>>>
>>>> Note that if Perl isn't your thing, there are MARC libraries in
>>>> several languages - Python and Ruby at least, probably others I
>>>> don't remember off the top of my head (since I work in Python and
>>>> Ruby, no shade to other people's languages :).
>>>> https://github.com/edsu/pymarc , https://github.com/ruby-marc/ruby-marc .
>>>>
>>>> On Thu, Jan 18, 2018 at 12:50 PM, Julie Cole <[log in to unmask]> wrote:
>>>>
>>>>> Hello all,
>>>>> I'm pretty new to the world of library systems, and this is my first post.
>>>>>
>>>>> Anyone have any experience parsing MARC Holdings records (853 and
>>>>> 863) into a more readable 866 or 867 format?
>>>>> We want to export our holdings from our ILS into our Discovery
>>>>> Layer, and are trying to save some of the money that the ILS
>>>>> vendor would charge us to create the records.
>>>>>
>>>>> The parsing doesn't look fun, so I was hoping someone has some
>>>>> code to use as a starting point.
>>>>> Also, I'm not sure how clean our data in the 853 and 863 is, so
>>>>> any scripts or advice on gotchas when cleaning that up would be
>>>>> appreciated.
>>>>> We have about 60,000 holdings records.
>>>>>
>>>>> Thanks,
>>>>> Julie.
>>>>>
>>>>>
>>>>> Julie Cole
>>>>> Library Systems Administrator
>>>>> Langara College Library
>>>>> Vancouver, BC
>>>>>
>>>>
>>>>
>>>> --
>>>> Andromeda Yelton
>>>> Senior Software Engineer, MIT Libraries: https://libraries.mit.edu/
>>>> President, Library & Information Technology Association: http://www.lita.org
>>>> http://andromedayelton.com @ThatAndromeda
>>>> <http://twitter.com/ThatAndromeda>
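[Editor's note] For readers exploring the 853/863 parsing problem discussed in this thread, here is a minimal sketch in plain Python of the core idea behind tools like Evergreen's MFHD module and the SolrMarc plug-in Bob mentions: pair each 863 (enumeration values) with its 853 (captions) via the $8 linkage subfield, then concatenate caption + value for each enumeration level. This is a hypothetical illustration, not code from any of the tools named above; the function name and dict-based field representation are invented, it ignores chronology captions ($i-$l), alternate numbering, and the messy-data gotchas the thread warns about, and real records would come from a library such as pymarc.

```python
def render_holdings(captions, holdings):
    """Render human-readable summary holdings from MFHD-style data.

    captions: dict mapping an 853 $8 link number to that field's
        caption subfields, e.g. {"1": {"a": "v.", "b": "no."}}.
    holdings: list of 863 fields as subfield dicts, each with an
        "8" subfield of the form "link.sequence".
    Returns one statement string per 863 field.
    """
    statements = []
    # Sort by the $8 "link.sequence" value so output follows MFHD order.
    for h in sorted(holdings, key=lambda d: d["8"]):
        link = h["8"].split(".")[0]          # link number before the dot
        cap = captions[link]
        parts = []
        # Enumeration captions and values live in subfields $a-$f;
        # walk them in order and join caption with its value.
        for code in "abcdef":
            if code in cap and code in h:
                # A value like "1-6" is a compressed (ranged) holding.
                parts.append(cap[code] + h[code])
        statements.append(":".join(parts))
    return statements


captions = {"1": {"a": "v.", "b": "no."}}
holdings = [
    {"8": "1.1", "a": "15", "b": "1-6"},   # v.15:no.1-6 (compressed)
    {"8": "1.2", "a": "16", "b": "1"},     # v.16:no.1 (single issue)
]
print(render_holdings(captions, holdings))  # ['v.15:no.1-6', 'v.16:no.1']
```

As Bob's note about missed gaps suggests, collapsing consecutive 863s into ranges is where the real difficulty lives: a correct implementation has to detect breaks in the enumeration sequence before emitting a hyphenated range, rather than assuming the first and last issues bound an unbroken run.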