How much memory does the server have in total? For such a large index you should leave PLENTY of free memory for the OS to cache the index efficiently. A quick thing to try is to upgrade to Solr 4.1; the index size itself will shrink dramatically and you will get better utilization of whatever memory you have. Also, you should read this blog post on how to get the most out of your hardware: http://blog.thetaphi.de/2012/07/use-lucenes-mmapdirectory-on-64bit.html
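A quick way to sanity-check your setup against that post is something like the shell commands below. This is only a rough sketch, and the index path is just an example -- point it at your core's actual data directory:

    # Confirm you are running a 64-bit JVM (the version banner should mention "64-Bit")
    java -version

    # Compare the on-disk index size with the memory left over for the OS page cache
    du -sh /path/to/solr/data/index    # example path, adjust to your core
    free -m                            # look at the "free" and "cached" columns

    # The blog post also recommends not capping virtual address space when
    # relying on MMapDirectory, for the shell that starts Tomcat
    ulimit -v unlimited

The point is simply that whatever RAM you do not hand to the Java heap is what the OS can use to cache the index files, so the more of the index that fits in that leftover memory, the faster your queries will run.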
My gut feel is that you still need to allocate more than 4G for Solr until you get rid of all the OOMs -- see the setenv.sh sketch below the quoted message.

--
Jan Høydahl, search solution architect
Cominvent AS - www.cominvent.com
Solr Training - www.solrtraining.com

On 28 Feb 2013, at 16:08, Manivannan Selvadurai <manivan...@unmetric.com> wrote:

> Hi,
>
> I'm using Solr 3.6 on Tomcat 6, and Xmx is set to 4096m.
>
> I have indexed about 61075834 documents using a shingle filter with max
> shingle size 3, so basically I have a lot of terms. Whenever I send 3-4
> queries at a time to the termvector component, I get the following
> exception:
>
> SEVERE: java.lang.OutOfMemoryError: Java heap space
>         at org.apache.lucene.search.HitQueue.getSentinelObject(HitQueue.java:76)
>         at org.apache.lucene.search.HitQueue.getSentinelObject(HitQueue.java:22)
>         at org.apache.lucene.util.PriorityQueue.initialize(PriorityQueue.java:116)
>         at org.apache.lucene.search.HitQueue.<init>(HitQueue.java:67)
>         at org.apache.lucene.search.TopScoreDocCollector.<init>(TopScoreDocCollector.java:275)
>         at org.apache.lucene.search.TopScoreDocCollector.<init>(TopScoreDocCollector.java:37)
>         at org.apache.lucene.search.TopScoreDocCollector$InOrderTopScoreDocCollector.<init>(TopScoreDocCollector.java:42)
>         at org.apache.lucene.search.TopScoreDocCollector$InOrderTopScoreDocCollector.<init>(TopScoreDocCollector.java:40)
>         at org.apache.lucene.search.TopScoreDocCollector.create(TopScoreDocCollector.java:258)
>         at org.apache.lucene.search.TopScoreDocCollector.create(TopScoreDocCollector.java:238)
>         at org.apache.solr.search.SolrIndexSearcher.getDocListNC(SolrIndexSearcher.java:1285)
>         at org.apache.solr.search.SolrIndexSearcher.getDocListC(SolrIndexSearcher.java:1178)
>         at org.apache.solr.search.SolrIndexSearcher.search(SolrIndexSearcher.java:377)
>         at org.apache.solr.handler.component.QueryComponent.process(QueryComponent.java:394)
>         at org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:186)
>         at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:129)
>         at org.apache.solr.core.SolrCore.execute(SolrCore.java:1376)
>         at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:365)
>         at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:260)
>         at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
>         at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
>         at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
>         at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
>         at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
>         at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
>         at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
>         at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298)
>         at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:859)
>         at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588)
>         at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
>         at java.lang.Thread.run(Thread.java:662)
>
> Even when the query returns data it takes a lot of time, around 230
> seconds (QTime=230000). Is there any way to optimize my index?
>
> --
> With Thanks,
> Manivannan
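As mentioned above, here is a rough sketch of how you could raise the Tomcat heap via setenv.sh. Treat the 6g figure and the GC log path as placeholders; the right heap size is whatever makes the OOMs go away while still leaving a good chunk of RAM free for the OS cache:

    # $CATALINA_HOME/bin/setenv.sh  (create it if it does not exist; Tomcat sources it on startup)
    # Fixed min/max heap so the JVM does not spend time resizing
    export CATALINA_OPTS="$CATALINA_OPTS -Xms6g -Xmx6g"
    # Optional: GC logging so you can see how full the heap actually gets
    export CATALINA_OPTS="$CATALINA_OPTS -verbose:gc -XX:+PrintGCDetails -Xloggc:/var/log/tomcat6/gc.log"

Restart Tomcat after the change and watch the GC log while you replay a few of those termvector queries; that should tell you whether the new heap size is enough or whether you need to go higher.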