Hi,

You probably want to use CommitWithin (http://wiki.apache.org/solr/CommitWithin) to keep the number of commits to a minimum.
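Since you index through SolrJ, the per-request variant looks roughly like this. It's only a minimal sketch against SolrJ 3.x; the URL, field names and the 1000 ms window are placeholders, not your actual setup:

  import org.apache.solr.client.solrj.SolrServer;
  import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
  import org.apache.solr.client.solrj.request.UpdateRequest;
  import org.apache.solr.common.SolrInputDocument;

  public class CdcUpdater {
      public static void main(String[] args) throws Exception {
          SolrServer server = new CommonsHttpSolrServer("http://localhost:8983/solr");

          // Document built from the transformed CDC message
          SolrInputDocument doc = new SolrInputDocument();
          doc.addField("id", "12345");

          // Ask Solr to make the document searchable within 1000 ms,
          // instead of issuing an explicit commit for every message.
          UpdateRequest req = new UpdateRequest();
          req.add(doc);
          req.setCommitWithin(1000);
          req.process(server);
          // no server.commit() here -- Solr schedules the commit itself
      }
  }

That way the client never calls commit() and Solr collapses many incoming updates into far fewer commits.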
Some other questions:

* Are you using spellcheck with buildOnCommit? That totally kills commit performance (see the config sketch at the bottom of this mail).
* What's your index size, total RAM, and RAM allocated to the JVM?

--
Jan Høydahl, search solution architect
Cominvent AS - www.cominvent.com
Solr Training - www.solrtraining.com

On 2. nov. 2011, at 03:58, vijay.sampath wrote:

> Hi All,
>
> I recently started working on SOLR 3.3 and need your expertise to suggest a
> solution. I'm working on a POC in which I've imported 3.5 million document
> records using DIH. We have a source system which publishes change data
> capture in an XML format. The requirement is to integrate SOLR with the
> real-time CDC updates. I've written a utility program which receives the
> XML message, transforms it and updates SOLR using SolrJ. The source system
> publishes at least 3-4 messages per second, and the requirement is to have
> the changes reflected within 1-2 seconds. Right now it takes almost 15-25
> seconds to get the changes committed in SOLR. I know that committing on every
> record, or every second, would hamper search and indexing.
>
> I thought of having a master for writes and a slave for reads, but again I'm
> not sure how fast the replication would be, since the requirement is to have
> the change data capture reflected within 1-2 seconds.
>
> Any thoughts or suggestions are appreciated. Thanks again.
>
>
>
> --
> View this message in context:
> http://lucene.472066.n3.nabble.com/Solr-real-time-update-taking-time-tp3472709p3472709.html
> Sent from the Solr - User mailing list archive at Nabble.com.
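PS: buildOnCommit is set per spellchecker in solrconfig.xml. A minimal sketch of where the flag lives (the component layout and field names here are just examples, not your actual config):

  <searchComponent name="spellcheck" class="solr.SpellCheckComponent">
    <lst name="spellchecker">
      <str name="name">default</str>
      <str name="field">spell</str>
      <str name="spellcheckIndexDir">./spellchecker</str>
      <!-- Rebuilding the spellcheck index on every commit is expensive.
           Keep this false and rebuild explicitly (spellcheck.build=true)
           when needed. -->
      <str name="buildOnCommit">false</str>
    </lst>
  </searchComponent>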