On Dec 21, 2007, at 8:16 AM, Erik Hatcher wrote:
On Dec 21, 2007, at 1:33 AM, alexander lind wrote:
I have a pretty big app in the works, and in short it will need to
index a lot of items, each with some core attributes and hundreds
of optional attributes.
The app then needs to be able to make queries like
'find all items with attribute_1=yes, attribute_5=10,
attribute_8 > 50, attribute_12 != green',
etc.
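For what it's worth, a query like that maps naturally onto Solr filter queries (fq parameters). A minimal sketch of building such a request, assuming the attribute field names from the example above and standard Solr/Lucene query syntax (the exclusive ">" condition is written as an integer range here, assuming attribute_8 holds integers):

```python
from urllib.parse import urlencode

# Each attribute condition becomes one Solr fq (filter query) parameter.
# Field names mirror the example above; negation and ranges use
# standard Lucene query syntax.
params = [
    ("q", "*:*"),                     # match all items, narrow via filters
    ("fq", "attribute_1:yes"),        # attribute_1 = yes
    ("fq", "attribute_5:10"),         # attribute_5 = 10
    ("fq", "attribute_8:[51 TO *]"),  # attribute_8 > 50 (integer field assumed)
    ("fq", "-attribute_12:green"),    # attribute_12 != green
    ("rows", "0"),                    # only the count is needed, skip documents
]
query_string = urlencode(params)
print(query_string)
```

The resulting string would be appended to the select handler URL (e.g. the default http://localhost:8983/solr/select?...); the numFound value in the response is the exact match count.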
I need exact counts, no approximations.
I could do this using a regular database like MySQL, but I know it
will become rather slow at around 400-500k items with 100 or so
attributes each.
My question is, would Solr be able to handle this better? I was
thinking perhaps I could use faceted searches for this?
Personally, I think Solr would handle this better, and you'd get
full-text search as an added bonus! :) But of course it's advisable
to try it and see. Solr is quite easy to get rolling with, so it'd
be well worth the experiment.
The one part I am most worried about is that Solr would start doing
approximations once the amount of data grows big, like in the 500k
range, or even once I reach the 1M range.
Do you know if there is a way to make sure that all facet counts are
exact when dealing with this many records, and is it feasible to do
this, i.e. not making it too slow by forcing the exact counts?
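For reference, Solr's facet counts are computed exactly rather than sampled; what is truncated by default is the *number of distinct values returned per field* (facet.limit). A sketch of the faceting parameters that would return every value with its exact count, reusing the hypothetical attribute field names from earlier in the thread:

```python
from urllib.parse import urlencode

# Facet parameters for exact per-value counts on chosen attribute fields.
# facet.limit=-1 lifts the default cap on how many distinct values are
# returned; the counts themselves are always exact, not approximated.
facet_params = [
    ("q", "*:*"),
    ("rows", "0"),                   # counts only, no documents
    ("facet", "true"),
    ("facet.field", "attribute_1"),  # one facet.field per attribute of interest
    ("facet.field", "attribute_5"),
    ("facet.limit", "-1"),           # return all distinct values
    ("facet.mincount", "1"),         # skip values with zero matches
]
qs = urlencode(facet_params)
print(qs)
```

Whether this stays fast at 500k-1M documents depends mostly on the number of distinct values per field, which is exactly the kind of thing a quick trial run would show.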
Thanks
Alec