Hi Solr Gurus

We are thinking about optimizing our production master/slave Solr setup, and
just wanted to poll the group on the following questions:

1. Currently we are using the autocommit feature with a setting of 50 docs and 5
minutes. The requirement now is to reduce this time, so we are considering
switching to a purely time-based autocommit. The autocommit time would be *1 min*.
Can anyone think of any disadvantages this change could have on the index? Is it
possible that the autocommit process itself takes more than 1 min?
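
For reference, a minimal sketch of the change we have in mind in solrconfig.xml,
assuming the standard <autoCommit> block under <updateHandler> (maxTime is in
milliseconds):

    <updateHandler class="solr.DirectUpdateHandler2">
      <!-- current setup: commit after 50 docs or 5 minutes, whichever comes first -->
      <!-- proposed: drop maxDocs and commit purely on time, once a minute -->
      <autoCommit>
        <maxTime>60000</maxTime> <!-- 60000 ms = 1 min -->
      </autoCommit>
    </updateHandler>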

2. We want to track the average time it takes to perform a commit operation.
Right now in production we have Lucid Solr 1.4 on the master/slaves, but we are
still using the old script-based replication method. We will be moving to the
new Java-based replication soon, hence we want to focus more on autocommit and
the time it takes to commit the data. So, how can we trace autocommit activity
in the logs? Does autocommit execute the commit script present under the bin folder?
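
For context, this is roughly the Java-based (HTTP) replication configuration we
plan to move to; the masterUrl, pollInterval and confFiles values below are
placeholders, not our actual settings:

    <!-- master solrconfig.xml -->
    <requestHandler name="/replication" class="solr.ReplicationHandler">
      <lst name="master">
        <!-- publish a new index version to slaves after each commit/optimize -->
        <str name="replicateAfter">commit</str>
        <str name="replicateAfter">optimize</str>
        <str name="confFiles">schema.xml,stopwords.txt</str>
      </lst>
    </requestHandler>

    <!-- slave solrconfig.xml -->
    <requestHandler name="/replication" class="solr.ReplicationHandler">
      <lst name="slave">
        <str name="masterUrl">http://master-host:8983/solr/replication</str>
        <str name="pollInterval">00:01:00</str>
      </lst>
    </requestHandler>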

3. What should be the optimum interval for optimizing the index? After going
through some posts like
http://www.mail-archive.com/solr-user@lucene.apache.org/msg10920.html, it
makes sense to optimize infrequently.
How can this be configured in 1.4? Currently we optimize twice a day using the
optimize script. Also, can there be a situation where an optimize conflicts
with a commit operation? If yes, how can we avoid that?
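
If I understand correctly, the optimize script essentially posts an explicit
<optimize/> update command, so the equivalent would be something like the
following (the URL and the waitFlush/waitSearcher values are just an example,
not what our script actually passes):

    <!-- posted to http://master-host:8983/solr/update -->
    <optimize waitFlush="true" waitSearcher="true"/>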

Many Thanks & Regards
Dipti Khullar
