Well, a log file is theoretically structured. Every log record is a very flat set of fields, so every line of the log file could become one Lucene document. Then one could use Solr to search, filter and facet the records.
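To make that concrete, here is a rough sketch of what I mean, assuming a log4j-style layout (the regex and field names are just an illustration, adjust them to your own format):

    import re

    # e.g. "2009-03-24 14:40:01,123 ERROR com.example.Foo - connection refused"
    LOG_LINE = re.compile(
        r'(?P<timestamp>\S+ \S+)\s+(?P<level>\w+)\s+(?P<logger>\S+)\s+-\s+(?P<message>.*)')

    def parse(line):
        m = LOG_LINE.match(line)
        return m.groupdict() if m else None

    # Each resulting dict is one flat "document": timestamp, level, logger, message.
    # Once indexed, faceting is a plain Solr query, along the lines of
    #   /select?q=*:*&facet=true&facet.field=level&facet.field=logger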
Of course, this requires parsing the log file back into its record components. Most log files were created for output, not for re-input. But if you can parse them back, you might be able to do a custom data import. Or, if you can intercept the records before they hit serialization, you might be able to index the fields directly.

Or you could just buy Splunk ( http://www.splunk.com/ ) and be done with it. Parsing and visualizing log files is exactly what they set out to deal with. No (great) open source solution yet.

Regards,
   Alex.
Personal blog: http://blog.outerthoughts.com/
Research group: http://www.clt.mq.edu.au/Research/
- I think age is a very high price to pay for maturity (Tom Stoppard)

On Tue, Mar 24, 2009 at 2:40 PM, Matthew Runo <mr...@zappos.com> wrote:
> Well, I think you'll have the same problem. Lucene, and Solr (since it's
> built on Lucene) are both going to expect a structured document as input.
> Once you send in a bunch of documents, you can then query them for whatever
> you want to find.
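P.S. In case it helps to see the "structured document as input" point spelled out: once a line is parsed into fields, pushing it to Solr is just an HTTP POST to the XML update handler. A rough, untested sketch (URL and field names are assumptions, and your schema will likely want a unique id field as well):

    import urllib2
    from xml.sax.saxutils import escape

    def index(record, solr_url='http://localhost:8983/solr/update'):
        # record is the flat dict produced by parsing one log line
        fields = ''.join('<field name="%s">%s</field>' % (k, escape(v))
                         for k, v in record.items())
        xml = '<add><doc>%s</doc></add>' % fields
        req = urllib2.Request(solr_url, xml, {'Content-Type': 'text/xml'})
        urllib2.urlopen(req)

    # A final POST of '<commit/>' makes the new documents searchable.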