We are doing much the same thing (in a way): we delete frequently and in bulk,
around 20M+ documents at a time. All I do after each deletion is fire the
command below on each of our Solr nodes and be patient, since it takes quite a
long time.

curl -vvv "http://node1.solr.xxxxx.com/collection1/update?optimize=true&distrib=false" >> /tmp/__solr_clener_log
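
With distrib=false the optimize only touches the core on the node you hit, so
the request has to be repeated against every node. If you have more than a
couple of nodes, a small shell loop saves doing it by hand. This is just a
sketch: the hostnames node1..node3 are placeholders for your real ones, and it
reuses the same log path as above.

    # Placeholder hostnames; substitute your actual Solr nodes.
    for node in node1 node2 node3; do
        curl -sS "http://${node}.solr.xxxxx.com/collection1/update?optimize=true&distrib=false" \
            >> /tmp/__solr_clener_log
    done

If you would rather not keep curl blocked for the whole merge, the optimize
request also accepts waitSearcher=false, and maxSegments=N merges down to N
segments instead of a single one, which is considerably cheaper on a large
index.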

After the optimisation finishes, curl returns the XML below:

<?xml version="1.0" encoding="UTF-8"?>
<response>
  <lst name="responseHeader">
    <int name="status">0</int>
    <int name="QTime">10268995</int>
  </lst>
</response>
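
QTime is reported in milliseconds, so the 10268995 above works out to roughly
2 hours 51 minutes for a single optimize. If you append every run to the log
as above, something like this (assuming the same log path) pulls the timings
back out:

    grep -o '<int name="QTime">[0-9]*</int>' /tmp/__solr_clener_log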

Regards,
Amey

> Date: Wed, 31 Dec 2014 02:32:37 -0700
> From: inna.gel...@elbitsystems.com
> To: solr-user@lucene.apache.org
> Subject: Frequent deletions
> 
> Hello,
> We perform frequent deletions from our index, which greatly increases the
> index size.
> How can we perform an optimization in order to reduce the size.
> Please advise,
> Thanks.