Hi Bill,
I have been doing some work with Symphony logs using Elasticsearch. It is simple to install and use, though I recommend Elasticsearch: The Definitive Guide (http://shop.oreilly.com/product/0636920028505.do) as a reference. The main problem is the size of the history logs; ours run on the order of 5,000,000 lines per month.
Originally I used a simple Python script to load each record. The script broke each line down into its command code and data codes, then loaded the records using curl. This failed initially because Symphony writes extended characters to title fields. I then ported the script to Python 3.3, which was not difficult, and everything loaded fine -- but it took too long to finish a month's worth of data. I am now experimenting with the Bulk API (http://www.elastic.co/guide/en/elasticsearch/reference/current/docs-bulk.html) to improve performance.
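To give you an idea, here is a rough sketch of the approach, not my actual script. I'm assuming the history-log lines are caret ('^') delimited with each field's first two characters acting as the data code; the index name, type, URL, and batch size are placeholders you'd adjust for your own setup.

import json
import requests  # stand-in for the curl calls in my original script

ES_BULK_URL = "http://localhost:9200/_bulk"
BATCH_SIZE = 5000  # records per bulk request; tune to taste

def parse_line(line):
    """Split a history-log line into a command code and a dict of data codes.
    Assumes '^'-delimited fields whose first two characters are the code."""
    parts = [p for p in line.rstrip("\n").split("^") if p]
    record = {"command": parts[0] if parts else ""}
    for field in parts[1:]:
        record[field[:2]] = field[2:]
    return record

def bulk_load(path):
    """Stream the log file and POST batches to the Elasticsearch Bulk API."""
    # Drop "_type" on newer Elasticsearch versions, which no longer accept it.
    action = json.dumps({"index": {"_index": "symphony-hist", "_type": "transaction"}})
    headers = {"Content-Type": "application/x-ndjson"}
    buf = []
    # errors="replace" guards against Symphony's extended characters in titles
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            buf.append(action)
            buf.append(json.dumps(parse_line(line)))
            if len(buf) >= 2 * BATCH_SIZE:
                requests.post(ES_BULK_URL, data="\n".join(buf) + "\n", headers=headers)
                buf = []
    if buf:  # flush whatever is left over
        requests.post(ES_BULK_URL, data="\n".join(buf) + "\n", headers=headers)

if __name__ == "__main__":
    bulk_load("hist.log")

The key difference from the original script is that records are batched into newline-delimited JSON and sent in one request per batch instead of one curl call per record.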
I would certainly be willing to share what I have written if you would like. The code is too experimental to post to GitHub, however.
Edmonton Public Library
Andrew Nisbet
ILS Administrator
T: 780.496.4058 F: 780.496.8317
-----Original Message-----
From: Code for Libraries [mailto:[log in to unmask]] On Behalf Of William Denton
Sent: March-18-15 3:55 PM
To: [log in to unmask]
Subject: [CODE4LIB] Anyone analyzed SirsiDynix Symphony transaction logs?
I'm going to analyze a whack of transaction logs from our Symphony ILS so that we can dig into collection usage. Any of you out there done this? Because the system is so closed and proprietary I understand it's not easy (perhaps
impossible?) to share code (publicly?), but if you've dug into it I'd be curious to know, not just about how you parsed the logs but then what you did with it, whether you loaded bits of data into a database, etc.
Looking around, I see a few examples of people using the system's API, but that's it.
Bill
--
William Denton ↔ Toronto, Canada ↔ https://www.miskatonic.org/