Adding these options will make the JVM write the heap dump where you want:
-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/path/to/the/dump
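
For a Tomcat install (which the stack trace below suggests you have), one
place to put these is CATALINA_OPTS in Tomcat's bin/setenv.sh -- just a
sketch, and the dump path here is only an example, adjust it to your layout:

    CATALINA_OPTS="$CATALINA_OPTS -XX:+HeapDumpOnOutOfMemoryError"
    CATALINA_OPTS="$CATALINA_OPTS -XX:HeapDumpPath=/var/tmp/solr-oom.hprof"
    export CATALINA_OPTS

catalina.sh sources setenv.sh on startup, so the options apply without
editing the main scripts.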

Then you can open the dump with jhat:
jhat /path/to/the/dump/your_stack.bin

jhat itself will probably throw an OutOfMemoryError due to the large size of
the dump. If that happens, give more memory to jhat's own JVM:

jhat -J-mx2000m my_stack.bin

Then you can analyze the heap as it was at the moment of the
OutOfMemoryError by pointing a browser at:

http://localhost:7000
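
Once jhat's web server is up, two pages are particularly useful for hunting
the leak (paths as served by the JDK 6 jhat; verify against your JDK
version):

    http://localhost:7000/histo/  - heap histogram: instance counts and
                                    total bytes per class
    http://localhost:7000/oql/    - OQL console for ad-hoc queries, e.g.
                                    select s from java.lang.String s where s.count > 500

The OQL field names (like String's count) depend on the JDK that produced
the dump, so treat that query as an example only.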

Please let me know if you find something. I ran into the same problem a
while ago and couldn't fix it.


Jeff Newburn wrote:
> 
> Added the parameter and it didn't seem to dump when it hit the gc limit
> error.  Any other thoughts?
> 
> -- 
> Jeff Newburn
> Software Engineer, Zappos.com
> jnewb...@zappos.com - 702-943-7562
> 
> 
>> From: Bill Au <bill.w...@gmail.com>
>> Reply-To: <solr-user@lucene.apache.org>
>> Date: Thu, 1 Oct 2009 12:16:53 -0400
>> To: <solr-user@lucene.apache.org>
>> Subject: Re: Solr Trunk Heap Space Issues
>> 
>> You probably want to add the following command line option to java to
>> produce a heap dump:
>> 
>> -XX:+HeapDumpOnOutOfMemoryError
>> 
>> Then you can use jhat to see what's taking up all the space in the heap.
>> 
>> Bill
>> 
>> On Thu, Oct 1, 2009 at 11:47 AM, Mark Miller <markrmil...@gmail.com>
>> wrote:
>> 
>>> Jeff Newburn wrote:
>>>> I am trying to update to the newest version of solr from trunk as of
>>>> May 5th.  I updated and compiled from trunk as of yesterday
>>>> (09/30/2009).  When I try to do a full import I am receiving a GC heap
>>>> error after changing nothing in the configuration files.  Why would
>>>> this happen in the most recent versions but not in the version from a
>>>> few months ago?
>>> Good question. The error means it's spending too much time trying to
>>> garbage collect without making much progress.
>>> Why so much more garbage to collect just by updating? Not sure...
>>> 
>>>> The stack
>>>> trace is below.
>>>> 
>>>> Oct 1, 2009 8:34:32 AM org.apache.solr.update.processor.LogUpdateProcessor finish
>>>> INFO: {add=[166400, 166608, 166698, 166800, 166811, 167097, 167316, 167353, ...(83 more)]} 0 35991
>>>> Oct 1, 2009 8:34:32 AM org.apache.solr.common.SolrException log
>>>> SEVERE: java.lang.OutOfMemoryError: GC overhead limit exceeded
>>>>     at java.util.Arrays.copyOfRange(Arrays.java:3209)
>>>>     at java.lang.String.<init>(String.java:215)
>>>>     at com.ctc.wstx.util.TextBuffer.contentsAsString(TextBuffer.java:384)
>>>>     at com.ctc.wstx.sr.BasicStreamReader.getText(BasicStreamReader.java:821)
>>>>     at org.apache.solr.handler.XMLLoader.readDoc(XMLLoader.java:280)
>>>>     at org.apache.solr.handler.XMLLoader.processUpdate(XMLLoader.java:139)
>>>>     at org.apache.solr.handler.XMLLoader.load(XMLLoader.java:69)
>>>>     at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:54)
>>>>     at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:131)
>>>>     at org.apache.solr.core.SolrCore.execute(SolrCore.java:1316)
>>>>     at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:338)
>>>>     at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:241)
>>>>     at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
>>>>     at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
>>>>     at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
>>>>     at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:175)
>>>>     at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:128)
>>>>     at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
>>>>     at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
>>>>     at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:286)
>>>>     at org.apache.coyote.http11.Http11NioProcessor.process(Http11NioProcessor.java:879)
>>>>     at org.apache.coyote.http11.Http11NioProtocol$Http11ConnectionHandler.process(Http11NioProtocol.java:719)
>>>>     at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:2080)
>>>>     at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>>>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>>>>     at java.lang.Thread.run(Thread.java:619)
>>>> 
>>>> Oct 1, 2009 8:40:06 AM org.apache.solr.core.SolrCore execute
>>>> INFO: [zeta-main] webapp=/solr path=/update params={} status=500 QTime=5265
>>>> Oct 1, 2009 8:40:12 AM org.apache.solr.common.SolrException log
>>>> SEVERE: java.lang.OutOfMemoryError: GC overhead limit exceeded
>>>> 
>>>> 
>>> 
>>> 
>>> --
>>> - Mark
>>> 
>>> http://www.lucidimagination.com
>>> 
>>> 
>>> 
>>> 
> 
> 
> 

-- 
View this message in context: 
http://www.nabble.com/Solr-Trunk-Heap-Space-Issues-tp25701422p25704560.html
Sent from the Solr - User mailing list archive at Nabble.com.
