We're using Solr as the backbone for our shiny new helpdesk application, and
by and large it's been a big win... especially in terms of search
performance.  But before I pat myself on the back (really, the Solr devs are
the ones who've done a great job), I have a question regarding commit
frequency.

While our app doesn't need truly realtime search, documents get updated and
replaced somewhat frequently, and those changes need to be visible in the
index within 500ms.  At the moment, I'm using autocommit to satisfy this,
but I've run across a few threads mentioning that frequent commits may cause
some serious performance issues.
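For context, the relevant bit of my solrconfig.xml currently looks roughly
like the sketch below (this assumes a Solr version with soft-commit support;
the maxTime values are just what I've been experimenting with, not settled
numbers):

```xml
<updateHandler class="solr.DirectUpdateHandler2">
  <!-- Hard commit: flush to stable storage periodically,
       without opening a new searcher each time -->
  <autoCommit>
    <maxTime>60000</maxTime>
    <openSearcher>false</openSearcher>
  </autoCommit>
  <!-- Soft commit: make updates visible to searches within ~500ms -->
  <autoSoftCommit>
    <maxTime>500</maxTime>
  </autoSoftCommit>
</updateHandler>
```

The idea being that the cheap soft commits handle the 500ms visibility
requirement, while the expensive hard commits happen far less often.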

Our average document size is quite small (less than 10k), and I'm expecting
that we're going to have a maximum of around 100k documents per day on any
given index; most of these will be replacing existing documents.

So, rather than getting bitten by this down the road, I figure I may as well
(a) ask if anybody else here is running a similar setup or has any input,
and then (b) do some heavy load testing via a fake data generator.
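For (b), the fake data generator I have in mind is nothing fancy... just
something like the Python sketch below. The field names are placeholders, not
our actual schema; the point is to churn out small (&lt;10 KB) ticket-like
documents that I can POST to the index in bulk:

```python
import json
import random
import string


def fake_ticket(doc_id):
    """Generate one fake helpdesk ticket.

    Field names are made up for illustration; swap in the real schema.
    """
    def words(n):
        # n random lowercase "words" of 3-10 characters each
        return " ".join(
            "".join(random.choices(string.ascii_lowercase, k=random.randint(3, 10)))
            for _ in range(n)
        )

    return {
        "id": f"ticket-{doc_id}",
        "subject": words(8),
        "body": words(200),  # keeps each document well under 10 KB
        "status": random.choice(["open", "pending", "closed"]),
    }


def batch(n, start=0):
    """Build a batch of n tickets, ready to serialize as JSON for Solr."""
    return [fake_ticket(start + i) for i in range(n)]


if __name__ == "__main__":
    docs = batch(1000)
    print(len(docs), "docs,", len(json.dumps(docs)), "bytes total")
```

Reusing `start` offsets lets me generate overlapping id ranges, which should
approximate our replace-heavy update pattern.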

Thanks in advance!
