...empty core allows the system to run out of heap.
--
Jeff Newburn
Software Engineer, Zappos.com
jnewb...@zappos.com - 702-943-7562

> From: Mark Miller
> Date: Tue, 06 Oct 2009 17:21:47 -0400
> Subject: Re: Solr Trunk Heap Space Issues
Jeff Newburn wrote:
> So could that potentially explain our use of more RAM on indexing? Or is
> this a rare edge case?
>
I think it could explain the JVM using more RAM while indexing - but it
should be fairly easily recoverable from what I can tell - so no
explanation on the OOM yet. Still looking.
This is looking like it's just a Lucene oddity you get when adding a
single doc, due to some changes with the NRT stuff.
Okay - I'm sorry - serves me right for working sick.
Now that I have put on my glasses and correctly tagged my two eclipse tests:
it still appears that trunk likes to use more RAM.
I switched both tests to one million iterations and watched the heap.
The test from the build around May 5th ...
Okay, I juggled the tests in eclipse and flipped the results. So they
make sense.
Sorry - goose chase on this one.
I don't see this with trunk... I just tried TestIndexingPerformance
with 1M docs, and it seemed to work fine.
Memory use stabilized at 40MB.
Most memory use was for indexing (not analysis).
char[] topped out at 4.5MB
-Yonik
http://www.lucidimagination.com
Yonik Seeley wrote:
> The obvious thing to check would be the custom search component.
> Does it access documents? I don't see how else the document cache
> could self-populate with so many entries (assuming it is the document
> cache again).
>
> -Yonik
> http://www.lucidimagination.com
...idea where the LRUCache is getting its information or what is even in
there.
--
Jeff Newburn
Software Engineer, Zappos.com
jnewb...@zappos.com - 702-943-7562

> From: Yonik Seeley
> Date: Mon, 5 Oct 2009 13:32:32 -0400
> Subject: Re: Solr Trunk Heap Space Issues
Jeff Newburn wrote:
> Ok I have eliminated all queries for warming and am still getting the heap
> space dump. Any ideas at this point what could be wrong? This seems like a
> huge increase in memory to go from indexing without issues to not being
> able to even with warming off.
>
How about a ...
Jeff Newburn wrote:
> ...that side change enough to push up the memory limits where we would run
> out like this? Also, would FastLRU cache make a difference?
> hits : 0
> hitratio : 0.00
> inserts : 0
> evictions : 0
> size : 0
> warmupTime : 0
> cumulative_lookups : 0
> cumulative_hits : 0
> cumulative_hitratio : 0.00
> cumulative_inserts : 0
> cumulative_evictions : 0
>
> --
> Jeff Newburn
> Software Engineer, Zappos.com
> From: Jeff Newburn
> Date: Fri, 02 Oct 2009 08:28:44 -0700
> Subject: Re: Solr Trunk Heap Space Issues
>
> The warmers return 11 fields:
> 3 Strings
> 2 booleans
> 2 doubles
> 2 longs
> 1 sint (solr.SortableIntField)
>
> Let me know if you need ...
cumulative_hitratio : 0.00
cumulative_inserts : 0
cumulative_evictions : 0
--
Jeff Newburn
Software Engineer, Zappos.com
jnewb...@zappos.com - 702-943-7562

> From: Yonik Seeley
> Date: Fri, 2 Oct 2009 10:04:27 -0400
> Subject: Re: Solr Trunk Heap Space Issues
On Fri, Oct 2, 2009 at 10:02 AM, Mark Miller wrote:
> Yes - now give us the FieldCache section from the stats section please :)

And the fieldValueCache section too (used for multi-valued faceting).
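(Both of those show up on Solr's admin stats page - with the stock example
config that would be http://localhost:8983/solr/admin/stats.jsp; host and
port will differ per install.)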
Jeff Newburn wrote:
> that side change enough to push up the memory limits where we would run out
> like this?
>
Yes - now give us the FieldCache section from the stats section please :)
It's not likely gonna do you any good, but it could be good information
for us.
--
- Mark
http://www.lucidimagination.com
I loaded the jvm and started indexing. It is a test server, so unless some
errant query came in there was no searching. Our instance has only 512MB,
but my concern is the obvious memory requirement leap, since it worked
before. What other data would be helpful with this?
Jeff Newburn wrote:
> Ok I was able to get a heap dump from the GC limit error.
>
> 1 instance of LRUCache is taking 170MB
> 1 instance of SchemaIndex is taking 56MB
> 4 instances of SynonymMap are taking 112MB
>
> There is no searching going on during this index update process.
>
> Any ideas what ...
...my May version did this without any problems whatsoever.
--
Jeff Newburn
Software Engineer, Zappos.com
jnewb...@zappos.com - 702-943-7562

> From: Mark Miller
> Date: Thu, 01 Oct 2009 17:57:28 -0400
> Subject: Re: Solr Trunk Heap Space Issues
On Thu, Oct 1, 2009 at 4:35 PM, Yonik Seeley wrote:
> Since isTokenized() more reflects if something is tokenized at the
> Lucene level, perhaps we need something that specifies if there is
> more than one logical value per field value? I'm drawing a blank on a
> good name for such a method though.
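To make the idea concrete, here is a rough sketch of what such a hook could
look like on Solr's FieldType - the method name is invented (a good name is
exactly what's missing), so treat it as a strawman, not a committed API:

  // Hypothetical addition to org.apache.solr.schema.FieldType.
  /**
   * True if a single field value expands to more than one logical
   * value, independent of whether the Lucene-level analyzer
   * tokenizes it.
   */
  public boolean hasMultipleLogicalValues() {
    // Strawman default: mirror today's isTokenized() behavior.
    return isTokenized();
  }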
On Thu, Oct 1, 2009 at 3:37 PM, Mark Miller wrote:
> Still interested in seeing his field sanity output to see what's possibly
> being doubled.

Strangely enough, I'm having a hard time seeing caching at the different levels.
I made a multi-segment index (2 segments), and then did a sort and facet:
Whoops. There is my lazy brain for you - march, may, august - all the
same ;)
Okay - forgot Solr went straight down and used FieldSortedHitQueue.
So it all still makes sense ;)
Still interested in seeing his field sanity output to see what's possibly
being doubled.
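If it helps, that sanity output can also be dumped programmatically with
Lucene 2.9's FieldCacheSanityChecker - a rough sketch, to be wired in (e.g.
behind a debug hook) and called in the same JVM after the suspect sort/facet
has run, so the caches are populated:

  import org.apache.lucene.search.FieldCache;
  import org.apache.lucene.util.FieldCacheSanityChecker;
  import org.apache.lucene.util.FieldCacheSanityChecker.Insanity;

  // Reports FieldCache entries that show up at multiple reader levels
  // (the kind of doubling suspected here). Empty output = sane.
  public static void dumpSanity() {
    for (Insanity insanity
        : FieldCacheSanityChecker.checkSanity(FieldCache.DEFAULT)) {
      System.out.println(insanity);
    }
  }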
bq. Tons of changes since... including the per-segment
searching/sorting/function queries (I think).

Yup. I actually didn't think so, because that was committed to Lucene in
February - but it didn't come into Solr till March 10th. March 5th just
ducked it.
On Thu, Oct 1, 2009 at 11:41 AM, Jeff Newburn wrote:
> I am trying to update to the newest version of solr from trunk as of May
> 5th.
Tons of changes since... including the per-segment
searching/sorting/function queries (I think).
Do you sort on any single valued fields that you also facet on?
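The mechanism behind that question: with the per-segment changes, sorting now
populates the FieldCache once per segment, while single-valued faceting still
works against the top-level reader - so the same field can end up un-inverted
twice. A contrived sketch of the effect against the Lucene 2.9 API (the index
path and field name are placeholders):

  import java.io.File;
  import org.apache.lucene.index.IndexReader;
  import org.apache.lucene.search.FieldCache;
  import org.apache.lucene.store.FSDirectory;

  public class FieldCacheDoubling {
    public static void main(String[] args) throws Exception {
      IndexReader reader =
          IndexReader.open(FSDirectory.open(new File("/path/to/index")));
      try {
        // Top-level entry: one int[] sized to the whole index (faceting).
        FieldCache.DEFAULT.getInts(reader, "price");

        // Per-segment entries: one int[] per segment (sorting).
        // The same values are now cached twice - the potential doubling.
        for (IndexReader segment : reader.getSequentialSubReaders()) {
          FieldCache.DEFAULT.getInts(segment, "price");
        }
      } finally {
        reader.close();
      }
    }
  }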
Mark Miller wrote:
> You might use jmap to take a look at the heap (you can do it while it's
> live with Java 6)

Errr - just so I don't screw anyone in a production environment - it
will freeze your app while it's getting the info.
--
- Mark
http://www.lucidimagination.com
Jeff Newburn wrote:
> Added the parameter and it didn't seem to dump when it hit the gc limit
> error. Any other thoughts?
>
You might use jmap to take a look at the heap (you can do it while it's
live with Java 6) or to force a heap dump when you specify.
Since it's spending 98% of the time in GC ...
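For reference, the jmap incantations are along these lines (standard JDK 6
tools; the pid and file name are placeholders):

  jmap -histo:live <pid>
  jmap -dump:live,format=b,file=solr-heap.hprof <pid>

The first prints a live-object histogram, the second writes a binary dump
that jhat can open - and per the caveat elsewhere in this thread, both pause
the JVM while they run.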
Added the parameter and it didn't seem to dump when it hit the gc limit
error. Any other thoughts?
--
Jeff Newburn
Software Engineer, Zappos.com
jnewb...@zappos.com - 702-943-7562

> From: Bill Au
> Date: Thu, 1 Oct 2009 12:16:53 -0400
> Subject: Re: Solr Trunk Heap Space Issues
You probably want to add the following command line option to java to
produce a heap dump:
-XX:+HeapDumpOnOutOfMemoryError
Then you can use jhat to see what's taking up all the space in the heap.
Bill
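For example, the startup line and follow-up might look like this - the flag
and jhat are standard JDK 6, while the heap size, paths, and pid are
placeholders:

  java -Xmx512m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp \
       -jar start.jar

  jhat /tmp/java_pid12345.hprof

jhat serves its results over HTTP (port 7000 by default); the histogram page
there is a quick way to spot the biggest consumers.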
I am trying to update to the newest version of solr from trunk as of May
5th. I updated and compiled from trunk as of yesterday (09/30/2009). When
I try to do a full import I am receiving a GC heap error after changing
nothing in the configuration files. Why would this happen in the most
recent version?