As a general guide, use the following process.

1. Set your JVM heap to a fairly large size.
2. Load Solr.
3. Run a bunch of common queries that cover the range of what production will see. Be sure to use the most expensive operations you expect, such as facets and filters, and all of the fields that might be referenced. The idea is to force Lucene and Solr to load their various caches.
4. Check the available JVM heap memory (see the sketch after this list).
5. Reset the JVM heap limit to that number plus a reasonable margin, such as at least 250M. The goal is to have enough for Solr to work, but not so much that tons of Java garbage can accumulate, which will eventually cause very slow garbage collections.
6. Restart Solr.
7. Re-execute the test queries.
8. Verify that the available JVM heap is still reasonable, like 250M.
9. Make sure that available OS system memory outside of the JVM is at least half of your index size, at a minimum. Caching the full index in OS system memory is preferable.
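
To put numbers on step 4: the heap figures that tools like jconsole report
come from java.lang.Runtime, and the arithmetic is simple. A minimal sketch,
assuming it runs inside the same JVM as Solr (for example from a small JSP
on the same Tomcat; the class name is just for illustration):

    // Reports the heap of whatever JVM it runs in, so it is only
    // meaningful when executed inside the Solr/Tomcat JVM.
    public class HeapHeadroom {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            long mb = 1024L * 1024L;
            long max = rt.maxMemory() / mb;        // the -Xmx ceiling
            long committed = rt.totalMemory() / mb;
            long used = committed - rt.freeMemory() / mb;
            // "Available" = what can still be allocated before hitting -Xmx.
            System.out.println("used=" + used + "M, available="
                    + (max - used) + "M, max=" + max + "M");
        }
    }

If "available" settles around a few hundred megabytes after the warm-up
queries, your -Xmx is close to right; if it is several gigabytes, the heap
is oversized.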

In short, it sounds like your system is woefully underconfigured for your index: a 44G index on a machine with 7.5G of total RAM and a 4G heap leaves only about 3.5G of OS memory to cache 44G of index data. If it happens to run some of the time, consider yourself very lucky. If it doesn't run reasonably well, which seems to be the case, start by properly configuring it with sufficient OS system memory and enough, but not too much, JVM heap memory.

-- Jack Krupansky

-----Original Message----- From: Manivannan Selvadurai
Sent: Thursday, February 28, 2013 10:58 AM
To: solr-user
Subject: Re: Solr 3.6 - Out Of Memory Exception

Hi,

Thanks for the quick reply.

Total memory on the server is around 7.5G. Even though there are around
61075834 docs, the index size is around 44G. I tried changing the
directoryFactory to MMapDirectory, but it didn't help. Previously we used
Lucene to query for term vectors using TermVectorMapper, and we didn't face
any issues other than a large response time (40 sec).
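
For reference, the direct Lucene access looked roughly like this (the index
path is a placeholder, and the API shown is Lucene 3.6):

    import java.io.File;
    import org.apache.lucene.index.IndexReader;
    import org.apache.lucene.store.MMapDirectory;

    public class TermVectorProbe {
        public static void main(String[] args) throws Exception {
            // Open the index memory-mapped; "/data/index" is made up.
            MMapDirectory dir = new MMapDirectory(new File("/data/index"));
            IndexReader reader = IndexReader.open(dir);
            try {
                System.out.println("numDocs=" + reader.numDocs());
                // Term vectors were read per document, e.g. via
                // reader.getTermFreqVector(docId, "content", mapper)
                // with our custom TermVectorMapper.
            } finally {
                reader.close();
                dir.close();
            }
        }
    }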

As our index grew we decided to move to Solr, but now we are facing these
issues: OOM, large QTime, etc. Is this normal for our index, or are we
missing something?

With Thanks
Manivannan.



On Thu, Feb 28, 2013 at 9:01 PM, Jan Høydahl <jan....@cominvent.com> wrote:

How much memory does the server have in total? For such a large index you
should leave PLENTY of free memory for the OS to cache your index
efficiently. A quick thing to try is to upgrade to Solr 4.1, as the index
size itself will shrink dramatically and you will get better utilization of
whatever memory you have. Also, you should read this blog post on how to
optimize your HW resources:
http://blog.thetaphi.de/2012/07/use-lucenes-mmapdirectory-on-64bit.html

My gut feeling is that you still need to allocate more than 4G for Solr
until you get rid of all OOMs.

--
Jan Høydahl, search solution architect
Cominvent AS - www.cominvent.com
Solr Training - www.solrtraining.com

On 28 Feb 2013, at 16:08, Manivannan Selvadurai <manivan...@unmetric.com> wrote:

> Hi,
>
>      I'm using Solr 3.6 on Tomcat 6; Xmx is set to 4096m.
>
> I have indexed about 61075834 documents using a shingle filter with max
> shingle size 3. Basically I have a lot of terms. Whenever I send 3-4
> queries at a time to get the term vector component, I get the following
> exception.
>
> SEVERE: java.lang.OutOfMemoryError: Java heap space
>        at org.apache.lucene.search.HitQueue.getSentinelObject(HitQueue.java:76)
>        at org.apache.lucene.search.HitQueue.getSentinelObject(HitQueue.java:22)
>        at org.apache.lucene.util.PriorityQueue.initialize(PriorityQueue.java:116)
>        at org.apache.lucene.search.HitQueue.<init>(HitQueue.java:67)
>        at org.apache.lucene.search.TopScoreDocCollector.<init>(TopScoreDocCollector.java:275)
>        at org.apache.lucene.search.TopScoreDocCollector.<init>(TopScoreDocCollector.java:37)
>        at org.apache.lucene.search.TopScoreDocCollector$InOrderTopScoreDocCollector.<init>(TopScoreDocCollector.java:42)
>        at org.apache.lucene.search.TopScoreDocCollector$InOrderTopScoreDocCollector.<init>(TopScoreDocCollector.java:40)
>        at org.apache.lucene.search.TopScoreDocCollector.create(TopScoreDocCollector.java:258)
>        at org.apache.lucene.search.TopScoreDocCollector.create(TopScoreDocCollector.java:238)
>        at org.apache.solr.search.SolrIndexSearcher.getDocListNC(SolrIndexSearcher.java:1285)
>        at org.apache.solr.search.SolrIndexSearcher.getDocListC(SolrIndexSearcher.java:1178)
>        at org.apache.solr.search.SolrIndexSearcher.search(SolrIndexSearcher.java:377)
>        at org.apache.solr.handler.component.QueryComponent.process(QueryComponent.java:394)
>        at org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:186)
>        at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:129)
>        at org.apache.solr.core.SolrCore.execute(SolrCore.java:1376)
>        at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:365)
>        at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:260)
>        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
>        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
>        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
>        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
>        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
>        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
>        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
>        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298)
>        at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:859)
>        at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588)
>        at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
>        at java.lang.Thread.run(Thread.java:662)
>
>
>           Even if the query returns data, it takes a lot of time, around
> 230 sec (QTime=230000). Is there any way to optimize my index?
>
>
>
>
> --
> With Thanks,
> Manivannan




--
With Best Regards,
Manivannan
