optimize is taking too much time

2010-02-12 Thread mklprasad

hi 
in my Solr I have 14,245,223 records, about 50 GB.
Now when I load a new record and it tries to optimize the docs, it is
taking too much memory and time.


Can anybody please tell me whether there is a property in Solr to avoid this?

Thanks in advance
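The merge-related settings Solr exposes live in solrconfig.xml; in a Solr 1.4-era config (the era of this thread) the relevant block looks roughly like the sketch below. The values shown are illustrative placeholders, not tuning advice:

```xml
<!-- solrconfig.xml sketch (Solr 1.4-era layout; values are illustrative) -->
<mainIndex>
  <!-- how many same-level segments accumulate before they are merged -->
  <mergeFactor>10</mergeFactor>
  <!-- buffered docs before a new segment is flushed to disk -->
  <maxBufferedDocs>1000</maxBufferedDocs>
  <!-- store each segment as a single compound (.cfs) file -->
  <useCompoundFile>true</useCompoundFile>
</mainIndex>
```

Note that none of these settings trigger a full optimize by themselves; they only shape how segments are flushed and merged in the background.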

-- 
View this message in context: 
http://old.nabble.com/optimize-is-taking-too-much-time-tp27561570p27561570.html
Sent from the Solr - User mailing list archive at Nabble.com.



Re: optimize is taking too much time

2010-02-17 Thread mklprasad



hossman wrote:
> 
> : in my Solr I have 14,245,223 records, about 50 GB.
> : Now when I load a new record and it tries to optimize the docs, it is
> : taking too much memory and time.
> 
> : Can anybody please tell me whether there is a property in Solr to avoid
> : this?
> 
> Solr isn't going to optimize the index unless you tell it to -- how are 
> you indexing your docs? are you sure you don't have something programmed 
> to send an optimize command?
> 
> 
> -Hoss

Yes,
from my code I call the server.optimize() method for every load
(I am now planning to remove this from the code).
At the config level I have mergeFactor=10.
I have a doubt: will mergeFactor only do a merge, or will it also perform
the optimization? If not, do I need to call optimize myself, and in that
case, for my 50 GB index, will it take less time?

Please clarify.
Thanks in advance.
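The usual fix for "optimize on every load" is to optimize only after every N loads. A minimal sketch of that idea, kept separate from the actual indexing code: the class name and the threshold are invented for illustration; the real expensive call would still be the server.optimize() mentioned above, invoked only when this policy says an optimize is due.

```java
// Illustrative sketch: decide when to optimize instead of optimizing on
// every load. Class name and threshold are made up for this example.
public class OptimizePolicy {
    private final int loadsPerOptimize; // e.g. optimize once per 100 loads
    private int loadsSinceOptimize = 0;

    public OptimizePolicy(int loadsPerOptimize) {
        this.loadsPerOptimize = loadsPerOptimize;
    }

    // Call once per completed load; returns true when an optimize is due.
    public boolean loadCompleted() {
        loadsSinceOptimize++;
        if (loadsSinceOptimize >= loadsPerOptimize) {
            loadsSinceOptimize = 0;
            return true; // caller would now invoke server.optimize()
        }
        return false;
    }
}
```

With loadsPerOptimize set to, say, 100, the 50 GB index gets rewritten once per hundred loads instead of on every single one; in between, ordinary commits and background merges keep the index searchable.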




Re: optimize is taking too much time

2010-02-18 Thread mklprasad



Jagdish Vasani-2 wrote:
> 
> Hi,
> 
> You should not optimize the index after each insert of a document; instead
> you should optimize it after inserting a good number of documents,
> because optimize merges all segments into one according to the settings
> of the Lucene index.
> 
> thanks,
> Jagdish
> On Fri, Feb 12, 2010 at 4:01 PM, mklprasad  wrote:
> 
>>
>> hi
>> in my Solr I have 14,245,223 records, about 50 GB.
>> Now when I load a new record and it tries to optimize the docs, it is
>> taking too much memory and time.
>>
>> Can anybody please tell me whether there is a property in Solr to avoid
>> this?
>>
>> Thanks in advance
>>
> 
> 

Yes,
thanks for the reply.
I have removed the optimize() call from my code, but I have a doubt:
1. Will mergeFactor internally do any optimization, or do we have to
specify it?
2. Even if Solr initiates an optimize, with large data like 52 GB will
it take a huge amount of time?

Thanks,
Prasad






Too many .cfs files

2010-03-03 Thread mklprasad

Hi all,
I have set my mergeFactor to 10.
I loaded 1 million docs into Solr, and after that I can see 14 .cfs files
in my data/index folder.
Shouldn't mergeFactor merge them once the 11th segment appears?

Please clarify.

Thanks,
Prasad
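Fourteen files need not mean merging is broken. This can be sketched with a toy model of Lucene's log merge policy (a simplification: real Lucene also merges by segment size and doc count, and may defer merges; the class and method names here are invented): each flush creates a level-0 segment, and once mergeFactor segments accumulate at a level they merge into one segment at the next level. So up to mergeFactor - 1 segments can legitimately sit at each level at once.

```java
import java.util.ArrayList;
import java.util.List;

// Toy model of a log merge policy: each flush adds a level-0 segment;
// whenever `mergeFactor` segments accumulate at one level, they merge
// into a single segment at the next level (merges can cascade upward).
public class MergeSim {
    static int segmentsAfter(int flushes, int mergeFactor) {
        List<Integer> levels = new ArrayList<>(); // levels.get(i) = segments at level i
        for (int f = 0; f < flushes; f++) {
            if (levels.isEmpty()) levels.add(0);
            levels.set(0, levels.get(0) + 1);
            for (int i = 0; i < levels.size(); i++) {
                if (levels.get(i) == mergeFactor) {
                    levels.set(i, 0); // merge this level away...
                    if (i + 1 == levels.size()) levels.add(0);
                    levels.set(i + 1, levels.get(i + 1) + 1); // ...into one bigger segment
                }
            }
        }
        return levels.stream().mapToInt(Integer::intValue).sum();
    }

    public static void main(String[] args) {
        System.out.println(segmentsAfter(11, 10));  // 2: one merged + one new
        System.out.println(segmentsAfter(14, 10));  // 5: one merged + four new
        System.out.println(segmentsAfter(100, 10)); // 1: merges cascaded fully
    }
}
```

Under this model, segment counts drift up and down between 1 and roughly mergeFactor per level, so 14 .cfs files after a million docs is plausible. Only an explicit optimize forces everything down to a single segment.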

-- 
View this message in context: 
http://old.nabble.com/Too-many-.cfs-files-tp2508p2508.html
Sent from the Solr - User mailing list archive at Nabble.com.