Hi,

I am getting the following error during indexing. I am trying to index 14
million records, but each document is very small.

*Error:*
2011-11-08 14:53:24,634 ERROR [STDERR] (Thread-12)
java.lang.OutOfMemoryError: GC overhead limit exceeded

2011-11-08 14:54:07,910 ERROR [org.apache.coyote.http11.Http11Protocol]
(http-10.32.7.136-8180-2) Error reading request, ignored

java.lang.OutOfMemoryError: GC overhead limit exceeded

2011-11-08 14:53:54,961 ERROR [STDERR]
(DefaultQuartzScheduler_QuartzSchedulerThread) Exception in thread
"DefaultQuartzScheduler_QuartzSchedulerThread"

2011-11-08 14:54:21,780 ERROR
[org.apache.catalina.core.ContainerBase.[jboss.web].[localhost].[/solr].[jsp]]
(http-10.32.7.136-8180-9) Servlet.service() for servlet jsp threw exception

java.lang.OutOfMemoryError: GC overhead limit exceeded

2011-11-08 14:54:18,417 ERROR [org.apache.catalina.connector.CoyoteAdapter]
(http-10.32.7.136-8180-7) An exception or error occurred in the container
during the request processing

java.lang.OutOfMemoryError: GC overhead limit exceeded

2011-11-08 14:54:18,417 ERROR [org.apache.catalina.connector.CoyoteAdapter]
(http-10.32.7.136-8180-6) An exception or error occurred in the container
during the request processing

java.lang.OutOfMemoryError: GC overhead limit exceeded

2011-11-08 14:54:36,237 SEVERE
[org.apache.solr.handler.dataimport.SolrWriter] (Thread-19) Exception while
solr commit.

java.lang.RuntimeException: java.lang.OutOfMemoryError: GC overhead limit exceeded
    at org.apache.solr.core.SolrCore.getSearcher(SolrCore.java:1099)
    at org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:425)
    at org.apache.solr.update.processor.RunUpdateProcessor.processCommit(RunUpdateProcessorFactory.java:85)
    at org.apache.solr.handler.dataimport.SolrWriter.commit(SolrWriter.java:179)
    at org.apache.solr.handler.dataimport.DocBuilder.finish(DocBuilder.java:236)
    at org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:208)
    at org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:359)
    at org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:427)
    at org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:408)
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
    at java.util.Arrays.copyOfRange(Arrays.java:3209)
    at java.lang.String.<init>(String.java:215)
    at org.apache.lucene.index.TermBuffer.toTerm(TermBuffer.java:122)
    at org.apache.lucene.index.SegmentTermEnum.term(SegmentTermEnum.java:176)
    at org.apache.lucene.index.TermInfosReader.<init>(TermInfosReader.java:122)
    at org.apache.lucene.index.SegmentCoreReaders.<init>(SegmentCoreReaders.java:75)
    at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:114)
    at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:92)
    at org.apache.lucene.index.DirectoryReader.<init>(DirectoryReader.java:235)
    at org.apache.lucene.index.ReadOnlyDirectoryReader.<init>(ReadOnlyDirectoryReader.java:34)
    at org.apache.lucene.index.DirectoryReader.doReopen(DirectoryReader.java:484)
    at org.apache.lucene.index.DirectoryReader.access$000(DirectoryReader.java:45)
    at org.apache.lucene.index.DirectoryReader$2.doBody(DirectoryReader.java:476)
    at org.apache.lucene.index.SegmentInfos$FindSegmentsFile.run(SegmentInfos.java:750)
    at org.apache.lucene.index.DirectoryReader.doReopenNoWriter(DirectoryReader.java:471)
    at org.apache.lucene.index.DirectoryReader.doReopen(DirectoryReader.java:429)
    at org.apache.lucene.index.DirectoryReader.reopen(DirectoryReader.java:392)
    at org.apache.solr.search.SolrIndexReader.reopen(SolrIndexReader.java:414)
    at org.apache.solr.search.SolrIndexReader.reopen(SolrIndexReader.java:425)
    at org.apache.solr.search.SolrIndexReader.reopen(SolrIndexReader.java:35)
    at org.apache.solr.core.SolrCore.getSearcher(SolrCore.java:1080)
    ... 8 more

2011-11-08 14:54:34,905 WARN 
[org.jboss.system.server.profileservice.hotdeploy.HDScanner] (HDScanner)
Scan failed

java.lang.OutOfMemoryError: GC overhead limit exceeded

2011-11-08 14:54:25,132 ERROR [STDERR]
(DefaultQuartzScheduler_QuartzSchedulerThread) java.lang.OutOfMemoryError:
GC overhead limit exceeded

2011-11-08 14:54:36,238 ERROR [STDERR]
(DefaultQuartzScheduler_QuartzSchedulerThread)   at
java.util.TreeMap.key(TreeMap.java:1206)

2011-11-08 14:54:36,238 ERROR [STDERR]
(DefaultQuartzScheduler_QuartzSchedulerThread)   at
java.util.TreeMap.firstKey(TreeMap.java:267)

2011-11-08 14:54:36,238 ERROR [STDERR]
(DefaultQuartzScheduler_QuartzSchedulerThread)   at
java.util.TreeSet.first(TreeSet.java:377)

2011-11-08 14:54:36,238 ERROR [STDERR]
(DefaultQuartzScheduler_QuartzSchedulerThread)   at
org.quartz.simpl.RAMJobStore.acquireNextTrigger(RAMJobStore.java:1131)

2011-11-08 14:54:36,238 ERROR [STDERR]
(DefaultQuartzScheduler_QuartzSchedulerThread)   at
org.quartz.core.QuartzSchedulerThread.run(QuartzSchedulerThread.java:233)

2011-11-08 14:54:36,238 ERROR [STDERR]
(ContainerBackgroundProcessor[StandardEngine[jboss.web]]) Exception in
thread "ContainerBackgroundProcessor[StandardEngine[jboss.web]]"

2011-11-08 14:54:36,239 ERROR [STDERR]
(ContainerBackgroundProcessor[StandardEngine[jboss.web]])
java.lang.OutOfMemoryError: GC overhead limit exceeded

2011-11-08 14:54:36,240 ERROR [STDERR] (Timer-Log4jService) Exception in
thread "Timer-Log4jService"

2011-11-08 14:54:36,240 ERROR [STDERR] (Timer-Log4jService)
java.lang.OutOfMemoryError: GC overhead limit exceeded

2011-11-08 14:54:36,247 INFO 
[org.apache.solr.handler.dataimport.SolrWriter] (Thread-19) Read
dataimport.properties

2011-11-08 14:54:36,256 INFO 
[org.apache.solr.handler.dataimport.SolrWriter] (Thread-19) Wrote last
indexed time to
/data/solr/Proliphiq/ProliphiqSearch/ProliphiqSolr_Master/profileAutoSuggest/conf/dataimport.properties

2011-11-08 14:54:36,256 INFO 
[org.apache.solr.update.processor.UpdateRequestProcessor] (Thread-19)
{deleteByQuery=*:*,add=[17505883, 17505887, 17505891, 17505895, 17505899,
17505903, 17505907, 17505911, ... (14673008 adds)]} 0 11

2011-11-08 14:54:36,256 INFO 
[org.apache.solr.handler.dataimport.DocBuilder] (Thread-19) Time taken =
6:40:39.955

Do I need to increase the JVM heap size?
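If so, I assume the change would go in JBoss's run.conf via JAVA_OPTS,
something like the sketch below (the path and the 4g/2g values are only
placeholders for illustration, not my actual settings):

```shell
# bin/run.conf -- hypothetical example; actual file location and values may differ.
# Raise the maximum heap (-Xmx) so the collector has headroom during the
# full import, and set an initial heap (-Xms) to avoid repeated resizing.
JAVA_OPTS="-Xms2g -Xmx4g $JAVA_OPTS"
```
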

My solrconfig settings are given below.

<indexDefaults>
    <useCompoundFile>false</useCompoundFile>
    <mergeFactor>25</mergeFactor>
    <maxBufferedDocs>2</maxBufferedDocs>
    <ramBufferSizeMB>1024</ramBufferSizeMB>
    <maxMergeDocs>2147483647</maxMergeDocs>
    <maxFieldLength>10000</maxFieldLength>
    <writeLockTimeout>1000</writeLockTimeout>
    <commitLockTimeout>10000</commitLockTimeout>

and the main index (<mainIndex>) values are:

    <useCompoundFile>false</useCompoundFile>
    <ramBufferSizeMB>512</ramBufferSizeMB>
    <mergeFactor>10</mergeFactor>
    <maxMergeDocs>2147483647</maxMergeDocs>
    <maxFieldLength>10000</maxFieldLength>

Do I need to increase ramBufferSizeMB further?

Please provide your inputs.

Regards,
Siva