Good afternoon,

It may be a problem in your app.

If your crawler is a Java app, try to limit the amount of memory it uses.
The JVM options must come before -jar, otherwise they are passed to your
application as arguments instead of to the JVM, e.g.:

java -Xms64m -Xmx128m -XX:NewSize=64m -XX:MaxNewSize=64m -XX:PermSize=128m
-XX:MaxPermSize=128m -jar my-app-with-dependencies.jar

-Xms/-Xmx set the initial and maximum heap size, and the -XX options cap
the young and permanent generations.

Look for these parameters in the script that starts Solr, too; a sketch is below.
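
For example, if you run Solr with the bundled Jetty example, something like
this (just a sketch; the heap sizes here are assumptions, adjust them to
your machine and your index size):

java -Xms256m -Xmx512m -XX:PermSize=128m -XX:MaxPermSize=128m -jar start.jar

If Solr runs inside Tomcat or another servlet container instead, the same
flags usually go into JAVA_OPTS / CATALINA_OPTS in the container's startup
script.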


On Fri, May 11, 2012 at 4:09 PM, Thiago <thiagosousasilve...@gmail.com> wrote:

> I'm having problems with memory when I'm using Solr. I have an application
> that crawls the web for some documents. It does a lot of consecutive
> indexing. But after some days of crawling, I'm having problems with memory.
> My Java process is consuming a lot of memory and it doesn't seem OK. My
> computer is starting to swap and my crawler is running very slowly. My
> professor told me that it is using the cache. What can I do? Is there any
> option I should choose to solve this problem?
>
> Thanks in advance
>
> Thiago
>



-- 
Carlos Alberto Schneider
Informant - (47) 38010919 - 9904-5517
