Francis,

I was able to use Logstash's existing patterns for what I needed.

Depending on how you configure EZProxy's logging, the format can be identical to Apache's.
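
For example, if EZProxy's LogFormat directive is set to mirror Apache's combined format, the stock grok pattern handles it (rough, untested sketch; the LogFormat line is illustrative, so check the EZProxy docs for the exact syntax):

  # EZProxy config.txt -- an Apache-combined-style log line (illustrative)
  LogFormat %h %l %u %t "%r" %s %b "%{Referer}i" "%{User-agent}i"

  # Logstash filter -- parse each line with the built-in pattern
  filter {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
  }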

I may have some custom expressions for query params, but you can also do a lot with ES' dynamic fields, which will keep the index smaller.
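
For instance, the kv filter can split a query string into separate fields without any custom grok, and Elasticsearch's dynamic mapping then picks the new fields up automatically (sketch only; the source field and prefix are just examples):

  filter {
    # "request" comes from COMBINEDAPACHELOG; split on both "?" and "&"
    # so each query param becomes its own field. Tokens without "=" (the
    # path itself) are ignored.
    kv {
      source      => "request"
      field_split => "&?"
      prefix      => "qp_"
    }
  }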

I have the template on GitHub, but I'm not sure it's the latest. I'll check and post the link.
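
For the LDAP lookups mentioned below, the ruby filter is roughly along these lines (simplified sketch using the Logstash 1.x/2.x event syntax; the host, base DN, attribute names, and "username" field are placeholders, and it assumes the net-ldap gem is available to Logstash's JRuby):

  filter {
    ruby {
      # Connect once at startup (placeholder host/port)
      init => "
        require 'net/ldap'
        @ldap = Net::LDAP.new(host: 'ldap.example.edu', port: 389)
      "
      # Look up the username from the log line and tag the event with
      # department and user type (attribute names vary by directory).
      code => "
        user = event['username']
        if user
          entries = @ldap.search(
            base:   'ou=people,dc=example,dc=edu',
            filter: Net::LDAP::Filter.eq('uid', user)
          )
          entry = entries && entries.first
          if entry
            event['department'] = entry['departmentNumber'].first
            event['user_type']  = entry['eduPersonPrimaryAffiliation'].first
          end
        end
      "
    }
  }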



Jason

------ Original message ------
From: Francis Kayiwa
Date: 03/21/2015 8:53 AM
To: [log in to unmask];
Subject: Re: [CODE4LIB] Anyone analyzed SirsiDynix Symphony transaction logs?

On 3/19/15 3:53 PM, Jason Stirnaman wrote:
> I've been using the ELK (elastic + logstash(1) + kibana)(2) stack for EZProxy log analysis.
> Yes, the index can grow really fast with log data, so I have to be selective about what I store. I'm not familiar with the Symphony log format, but Logstash has filters to handle just about any data that you want to parse, including multiline. Maybe for some log entries, you don't need to store the full entry at all but only a few bits or a single tag?
>
> And because it's Ruby underneath, you can filter using custom Ruby. I use that to do LDAP lookups on user names so we can get department and user-type stats.

Hey Jason,

Did you have to create custom grok filters for the EZProxy log format?
It's been on my mind, and if you've already done the work... ;-)

Cheers,

./fxk

--
Your analyst has you mixed up with another patient.  Don't believe a
thing he tells you.