Thanks!

I've seen a few formulae like this go by over the months. Can someone
please make a wiki page for memory and processing estimation with
locality properties?  Or is there a Lucene page we can use?

Lance 

-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Yonik
Seeley
Sent: Tuesday, December 04, 2007 8:06 AM
To: solr-user@lucene.apache.org
Subject: Re: out of heap space, every day

On Dec 4, 2007 10:59 AM, Brian Whitman <[EMAIL PROTECTED]> wrote:
> > For faceting and sorting, yes.  For normal search, no.
>
> Interesting you mention that, because one of the other changes since
> last week besides the index growing is that we added a sort on an sint
> field to the queries.
>
> Is it reasonable that a sint sort would require over 2.5GB of heap on
> an 8M-doc index? Is there any empirical data on how much RAM that will
> need?

int[maxDoc()] + String[nTerms()] + size_of_all_unique_terms.
Then double that to allow for a warming searcher.
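As a rough illustration, that formula can be turned into a back-of-envelope estimate. All the inputs below (term count, average term length, per-String overhead) are hypothetical assumptions for the sake of the example, not measurements from Brian's index:

```python
# Back-of-envelope heap estimate for sorting on an "sint" field, per the
# formula above: int[maxDoc()] + String[nTerms()] + size of unique terms.
# The overhead constants are illustrative guesses, not JVM measurements.

def sint_sort_heap_bytes(max_doc, n_terms, avg_term_bytes,
                         ref_bytes=4, string_overhead=40):
    ord_array = 4 * max_doc                  # int[maxDoc()], 4 bytes/doc
    term_refs = ref_bytes * n_terms          # String[nTerms()] references
    term_data = n_terms * (string_overhead + avg_term_bytes)  # term bodies
    return ord_array + term_refs + term_data

# Hypothetical 8M-doc index where nearly every doc has a unique sint value:
est = sint_sort_heap_bytes(max_doc=8_000_000, n_terms=8_000_000,
                           avg_term_bytes=10)
# Double it to allow for a warming searcher:
print(2 * est / 1e9, "GB")   # on the order of 1 GB under these assumptions
```

The point is less the exact number than the shape: with one unique term per document, the String array and term storage dominate, so heap grows with both maxDoc() and the number of unique sort values.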

You can decrease this memory usage by using a plain "integer" field
instead of an "sint" field if you don't need range queries.  The memory
usage then drops to a straight int[maxDoc()] (4 bytes per document).
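For the same hypothetical 8M-document index, the integer-field case is easy to work out (4 bytes per document, no per-term Strings):

```python
# Sort cache for a plain "integer" field is just int[maxDoc()].
max_doc = 8_000_000           # hypothetical 8M-doc index
heap = 4 * max_doc            # 4 bytes per document
print(heap / 1e6, "MB")       # 32.0 MB, before doubling for a warming searcher
```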

-Yonik
