Shawn,

Increasing the fetch size and increasing my heap accordingly did the
trick. Thanks a lot for your help; your suggestions helped me a lot.

Hope these suggestions will be helpful to others who are facing a similar
issue; rough sketches of the config changes are included below.
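
For reference, the fetch size is set on the JdbcDataSource in
data-config.xml. The sketch below is only an illustration; the driver
class, URL, and batchSize value are placeholders rather than my exact
settings:

  <!-- batchSize maps to the JDBC fetch size: a larger value means fewer
       round trips to the database, but more rows held in memory -->
  <dataSource type="JdbcDataSource"
              driver="your.jdbc.Driver"
              url="jdbc:yourdb://host:port/yourdb"
              user="user" password="password"
              batchSize="10000"/>

The heap was raised to match when starting Solr (how much you need depends
on how much data ends up cached), for example:

  java -Xmx1024m -jar start.jar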

Thanks,
Barani

Shawn Heisey wrote:
> 
> Do keep looking into the batchSize, but I think I might have found the 
> issue.  If I understand things correctly, you will need to add 
> processor="CachedSqlEntityProcessor" to your first entity.  It's only 
> specified on the other two.  Assuming you have enough RAM and heap space 
> available in your JVM to load the results of all three queries, that 
> ought to make it work very quickly.
> 
> If I'm right, basically what it's doing is issuing a real SQL query 
> against your first table for every entry it has read for the other two 
> tables.
> 
> Shawn
> 
> On 3/6/2010 11:58 AM, JavaGuy84 wrote:
>> Shawn,
>>
>> Thanks a lot for your response,
>>
> >> Yes, the DB connection is still active; it is still fetching data from
> >> the DB.
>>
> >> I am using Red Hat MetaMatrix as the backend DB, and I am trying to find
> >> the parameter for setting the JDBC fetch size.
>>
> >> Do you think this problem is mostly due to the fetch size?
>>
>> Thanks,
>> Barani
>>
> 
> 
> 
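
For anyone else hitting this: the entity layout Shawn describes above ends
up looking roughly like the sketch below. The entity, table, and column
names are made up, and the where= attribute is the cache-lookup syntax
that CachedSqlEntityProcessor expects:

  <document>
    <!-- per Shawn's suggestion, the first entity is cached as well -->
    <entity name="parent"
            query="select id, name from parent_table"
            processor="CachedSqlEntityProcessor">
      <!-- cached child entities are joined from memory instead of
           running one SQL query per parent row -->
      <entity name="childA"
              query="select parent_id, field_a from child_table_a"
              processor="CachedSqlEntityProcessor"
              where="parent_id=parent.id"/>
      <entity name="childB"
              query="select parent_id, field_b from child_table_b"
              processor="CachedSqlEntityProcessor"
              where="parent_id=parent.id"/>
    </entity>
  </document>

All three result sets are pulled into memory once, which is why the heap
increase was needed.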

-- 
View this message in context: 
http://old.nabble.com/SOLR-takes-more-than-9-hours-to-index-300000-rows-tp27805403p27825172.html
Sent from the Solr - User mailing list archive at Nabble.com.
