Hello,

There is no definitive rule for this; it depends on your situation: the size 
of your documents, resource constraints, and possibly a heavy analysis chain. 
And when (re)indexing a large amount of data, your autocommit time/limit is 
probably more important than batch size.

In our case, some collections are fine with batch sizes of 5000+, while 
others are happy with just a hundred. One has small documents and no text 
analysis; the other is quite the opposite.

Finding a sweet spot is trial and error.
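As a rough illustration, client-side batching can be as simple as chunking the 
document stream and sending each chunk as one update request. This is a minimal 
Python sketch; the collection name, field names, and batch size of 500 are 
placeholders, and SolrJ users would do the equivalent with UpdateRequest and a 
list of SolrInputDocuments:

```python
import json

def batch_docs(docs, batch_size):
    """Yield successive lists of at most batch_size documents."""
    batch = []
    for doc in docs:
        batch.append(doc)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # final partial batch

# Hypothetical usage: each batch becomes the JSON body of one update request,
# e.g. POSTed to http://localhost:8983/solr/<collection>/update
docs = ({"id": str(i), "title_t": "doc %d" % i} for i in range(1200))
for batch in batch_docs(docs, 500):
    payload = json.dumps(batch)
    # send payload here; rely on autoCommit/commitWithin rather than
    # committing per batch when bulk indexing
```

Timing a full reindex at a few different batch sizes with something like this 
is usually the quickest way to find your sweet spot.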

Cheers,
Markus

-----Original message-----
> From:Lucky Sharma <goku0...@gmail.com>
> Sent: Thursday 25th April 2019 21:48
> To: solr-user@lucene.apache.org
> Subject: Solr-Batch Update
> 
> Hi all,
> While creating an update request to Solr, it is recommended to create
> batch requests instead of many small updates. What is the optimum batch
> size? Is there any number or computation that can help us
> with this?
> 
> 
> -- 
> Warm Regards,
> 
> Lucky Sharma
> Contact No :+91 9821559918
> 
