Oh yes, on Windows, using Java 1.6 and Solr 1.4.1.

Ok let me try that one...

Thank you so much.

Regards,
Rajani



2011/12/16 Tomás Fernández Löbbe <tomasflo...@gmail.com>

> Are you on Windows? There is a JVM bug that makes Solr keep the old files
> around even though they are no longer used. The files will eventually be
> removed, but if you want them gone immediately, try optimizing twice; the
> second optimize doesn't do much, but it will remove the old files.
>
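[Editor's note: the double optimize suggested above can be issued over plain HTTP. The URL below assumes a stock single-core Solr 1.4 install on localhost; adjust host, port, and path to match your deployment.]

```shell
# Issue the optimize command twice. On Windows, the second pass lets
# Solr delete the old segment files that the JVM kept open after the
# first optimize. Assumes a default Solr 1.4 core at localhost:8983.
curl 'http://localhost:8983/solr/update?optimize=true'
curl 'http://localhost:8983/solr/update?optimize=true'
```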
> On Fri, Dec 16, 2011 at 9:10 AM, Juan Pablo Mora <jua...@informa.es>
> wrote:
>
> > Maybe you are generating a snapshot of your index attached to the
> > optimize? Look for post-commit or post-optimize events in your
> > solrconfig.xml.
> >
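[Editor's note: the kind of listener referred to above looks like the following in `solrconfig.xml`. This is a sketch based on the stock Solr 1.4 example config; the `snapshooter` path is an assumption and varies per install.]

```xml
<!-- A postOptimize listener like this runs the snapshooter after every
     optimize, leaving a full copy of the index on disk (which can make
     the data directory appear to double in size). Comment it out or
     remove it if snapshots are not wanted. -->
<updateHandler class="solr.DirectUpdateHandler2">
  <listener event="postOptimize" class="solr.RunExecutableListener">
    <str name="exe">solr/bin/snapshooter</str>
    <str name="dir">.</str>
    <bool name="wait">true</bool>
  </listener>
</updateHandler>
```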
> > ________________________________________
> > From: Rajani Maski [rajinima...@gmail.com]
> > Sent: Friday, December 16, 2011 11:11
> > To: solr-user@lucene.apache.org
> > Subject: Solr Optimization Fail
> >
> > Hi,
> >
> >  When we do an optimize, it actually reduces the data size, right?
> >
> > I have an index of size 6 GB (5 million documents). The index was built
> > with commits every 10,000 documents.
> >
> > Now I tried to optimize it with the HTTP optimize command. When I did
> > that, the data size grew to 12 GB. Why might this have happened?
> >
> > And can anyone please suggest a fix for it?
> >
> > Thanks
> > Rajani
> >
>
