A little before and after. The before is from around May 5th; the after
is trunk.

http://myhardshadow.com/memanalysis/before.png
http://myhardshadow.com/memanalysis/after.png

Mark Miller wrote:
> Took a peek at the checkout from around the time he says he's using.
>
> CharTokenizer appears to be holding onto much larger char[] arrays now
> than before. Same with snowball.Among - used to be almost nothing, now
> it's large.
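The buffer behavior Mark describes can be sketched like this (a hypothetical illustration, not Lucene's actual CharTokenizer code): a reused tokenizer buffer only ever grows, so once one oversized token passes through, the cached instance retains the large char[] for its whole lifetime.

```java
// Hypothetical sketch (not Lucene's actual code): a grow-only reusable buffer.
// After one large token, the capacity stays allocated even for small tokens.
public class GrowOnlyBuffer {
    private char[] buffer = new char[16];

    // Grow by doubling until the requested size fits; never shrink back.
    public char[] ensureCapacity(int needed) {
        if (buffer.length < needed) {
            int newSize = buffer.length;
            while (newSize < needed) {
                newSize *= 2;
            }
            buffer = java.util.Arrays.copyOf(buffer, newSize);
        }
        return buffer;
    }

    public int capacity() {
        return buffer.length;
    }

    public static void main(String[] args) {
        GrowOnlyBuffer b = new GrowOnlyBuffer();
        b.ensureCapacity(10_000); // one huge token...
        b.ensureCapacity(16);     // ...and the memory stays allocated
        System.out.println(b.capacity()); // prints 16384
    }
}
```

With one such buffer cached per thread per field, a handful of oversized tokens can pin a surprising amount of heap.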
>
> The new TokenStream stuff appears to be clinging. Needs to find some
> inner peace.
>
> Yonik Seeley wrote:
>   
>> On Mon, Oct 5, 2009 at 4:54 PM, Jeff Newburn <jnewb...@zappos.com> wrote:
>>> Ok we have done some more testing on this issue.  When I only have the 1
>>> core the reindex completes fine.  However, when I added a second core with
>>> no documents it runs out of heap again.  This time the heap was 322Mb of
>>> LRUCache.  The 1 query that warms returns exactly 2 documents so I have no
>>> idea where the LRUCache is getting its information or what is even in there.
>> I guess the obvious thing to check would be the custom search component.
>> Does it access documents?  I don't see how else the document cache
>> could self populate with so many entries (assuming it is the document
>> cache again).
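For context on why 322MB of LRUCache is plausible from relatively few queries: Solr's caches are bounded by entry count rather than bytes, so if something iterates many documents, each entry can be large and the total balloons. A minimal sketch of an entry-count-bounded LRU cache (an illustration, not Solr's actual LRUCache implementation):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of an LRU cache bounded by number of entries, not bytes.
// A few hundred thousand large cached documents can still consume
// hundreds of MB even though the entry limit is respected.
public class SimpleLruCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public SimpleLruCache(int maxEntries) {
        super(16, 0.75f, true); // access-order gives LRU eviction behavior
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Evict the least-recently-used entry once the count limit is hit.
        return size() > maxEntries;
    }
}
```

Usage: a `SimpleLruCache<>(2)` holding entries 1 and 2 will evict entry 1 as soon as entry 3 is inserted, regardless of how many bytes each entry occupies.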
>>
>> -Yonik
>> http://www.lucidimagination.com
>>
>>
>>
>>
>>> --
>>> Jeff Newburn
>>> Software Engineer, Zappos.com
>>> jnewb...@zappos.com - 702-943-7562
>>>
>>>       
>>>> From: Yonik Seeley <yo...@lucidimagination.com>
>>>> Reply-To: <solr-user@lucene.apache.org>
>>>> Date: Mon, 5 Oct 2009 13:32:32 -0400
>>>> To: <solr-user@lucene.apache.org>
>>>> Subject: Re: Solr Trunk Heap Space Issues
>>>>
>>>> On Mon, Oct 5, 2009 at 1:00 PM, Jeff Newburn <jnewb...@zappos.com> wrote:
>>>>> Ok I have eliminated all queries for warming and am still getting the heap
>>>>> space dump.  Any ideas at this point what could be wrong?  This seems like a
>>>>> huge increase in memory to go from indexing without issues to not being able
>>>>> to even with warming off.
>>>> Do you have any custom Analyzers, Tokenizers, TokenFilters?
>>>> Another change is that token streams are reused by caching in a
>>>> thread-local, so every thread in your server could potentially have a
>>>> copy of an analysis chain (token stream) per field that you have used.
>>>>  This normally shouldn't be an issue since these will be small.  Also,
>>>> how many unique fields do you have?
>>>>
>>>> -Yonik
>>>> http://www.lucidimagination.com
>>>>
>>>>
>>>>
>>>>> Jeff Newburn
>>>>> Software Engineer, Zappos.com
>>>>> jnewb...@zappos.com - 702-943-7562
>>>>>
>>>>>
>>>>>> From: Jeff Newburn <jnewb...@zappos.com>
>>>>>> Reply-To: <solr-user@lucene.apache.org>
>>>>>> Date: Thu, 01 Oct 2009 08:41:18 -0700
>>>>>> To: "solr-user@lucene.apache.org" <solr-user@lucene.apache.org>
>>>>>> Subject: Solr Trunk Heap Space Issues
>>>>>>
>>>>>> I am trying to update to the newest version of Solr from trunk as of May
>>>>>> 5th.  I updated and compiled from trunk as of yesterday (09/30/2009).  When
>>>>>> I try to do a full import I am receiving a GC heap error after changing
>>>>>> nothing in the configuration files.  Why would this happen in the most
>>>>>> recent versions but not in the version from a few months ago?  The stack
>>>>>> trace is below.
>>>>>>
>>>>>> Oct 1, 2009 8:34:32 AM org.apache.solr.update.processor.LogUpdateProcessor finish
>>>>>> INFO: {add=[166400, 166608, 166698, 166800, 166811, 167097, 167316, 167353, ...(83 more)]} 0 35991
>>>>>> Oct 1, 2009 8:34:32 AM org.apache.solr.common.SolrException log
>>>>>> SEVERE: java.lang.OutOfMemoryError: GC overhead limit exceeded
>>>>>>     at java.util.Arrays.copyOfRange(Arrays.java:3209)
>>>>>>     at java.lang.String.<init>(String.java:215)
>>>>>>     at com.ctc.wstx.util.TextBuffer.contentsAsString(TextBuffer.java:384)
>>>>>>     at com.ctc.wstx.sr.BasicStreamReader.getText(BasicStreamReader.java:821)
>>>>>>     at org.apache.solr.handler.XMLLoader.readDoc(XMLLoader.java:280)
>>>>>>     at org.apache.solr.handler.XMLLoader.processUpdate(XMLLoader.java:139)
>>>>>>     at org.apache.solr.handler.XMLLoader.load(XMLLoader.java:69)
>>>>>>     at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:54)
>>>>>>     at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:131)
>>>>>>     at org.apache.solr.core.SolrCore.execute(SolrCore.java:1316)
>>>>>>     at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:338)
>>>>>>     at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:241)
>>>>>>     at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
>>>>>>     at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
>>>>>>     at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
>>>>>>     at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:175)
>>>>>>     at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:128)
>>>>>>     at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
>>>>>>     at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
>>>>>>     at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:286)
>>>>>>     at org.apache.coyote.http11.Http11NioProcessor.process(Http11NioProcessor.java:879)
>>>>>>     at org.apache.coyote.http11.Http11NioProtocol$Http11ConnectionHandler.process(Http11NioProtocol.java:719)
>>>>>>     at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:2080)
>>>>>>     at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>>>>>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>>>>>>     at java.lang.Thread.run(Thread.java:619)
>>>>>>
>>>>>> Oct 1, 2009 8:40:06 AM org.apache.solr.core.SolrCore execute
>>>>>> INFO: [zeta-main] webapp=/solr path=/update params={} status=500 QTime=5265
>>>>>> Oct 1, 2009 8:40:12 AM org.apache.solr.common.SolrException log
>>>>>> SEVERE: java.lang.OutOfMemoryError: GC overhead limit exceeded
>>>>>>
>>>>>> --
>>>>>> Jeff Newburn
>>>>>> Software Engineer, Zappos.com
>>>>>> jnewb...@zappos.com - 702-943-7562
>>>>>>
>


-- 
- Mark

http://www.lucidimagination.com
