Hi Nicolas,

I was doing something similar to your solution to get better search times.
I downloaded your patch, but I have a problem in one class. I'm not sure if
I'm doing something wrong, but if I want to compile the project I must
change, in IndexSchema:

        //private Similarity similarity;

        and put:

        private SimilarityFactory similarityFactory;

Am I doing something incorrectly, or is this a little bug?
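For reference, the change described above looks roughly like the following. This is a minimal, self-contained sketch of the pattern (holding a factory instead of a Similarity directly); the Similarity and SimilarityFactory stubs below are illustrative placeholders, not the real Solr/Lucene classes:

```java
// Hypothetical stubs standing in for the real Lucene/Solr classes.
class Similarity { }

interface SimilarityFactory {
    Similarity getSimilarity();
}

class IndexSchema {
    // Before the patch the schema held a Similarity directly:
    // private Similarity similarity;

    // After the patch it holds a factory instead:
    private SimilarityFactory similarityFactory = new SimilarityFactory() {
        public Similarity getSimilarity() {
            return new Similarity();
        }
    };

    Similarity getSimilarity() {
        // The similarity is now created through the factory.
        return similarityFactory.getSimilarity();
    }
}

public class Demo {
    public static void main(String[] args) {
        IndexSchema schema = new IndexSchema();
        System.out.println(schema.getSimilarity() != null);
    }
}
```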

Thanks,

Rober.

-----Original Message-----
From: Nicolas DESSAIGNE [mailto:[EMAIL PROTECTED]
Sent: Friday, June 20, 2008 12:01
To: solr-user@lucene.apache.org
Subject: RE: never desallocate RAM...during search

Hi Robert,

We actually had a similar problem to yours (slow highlighting of big
documents). We fixed it by extending the copyField functionality:
https://issues.apache.org/jira/browse/SOLR-538

We just updated the patch. It should work perfectly on trunk.

Please tell us if it answers your problem.
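For anyone following along: if I read SOLR-538 correctly, the patch adds a maxChars attribute to copyField, so only a bounded prefix of a large field gets copied into a separate field used for highlighting. A hypothetical schema.xml fragment (the field names here are made up for illustration):

```xml
<!-- The full document body, stored for retrieval. -->
<field name="content" type="text" indexed="true" stored="true"/>
<!-- A smaller stored copy used only for highlighting. -->
<field name="content_hl" type="text" indexed="true" stored="true"/>

<!-- maxChars (added by the SOLR-538 patch) limits how much is copied. -->
<copyField source="content" dest="content_hl" maxChars="10000"/>
```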

Nicolas

-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
Sent: Wednesday, June 18, 2008 15:49
To: solr-user@lucene.apache.org
Subject: RE: never desallocate RAM...during search

Hi Otis,

Thank you for your attention.

I've been reading the Lucene and Solr mailing lists for days and no one
seems to have a similar problem, which is why this behaviour seems a bit
strange to me.

I can try what you suggest about the GC, but is what I'm describing normal
behaviour? Must I configure my JVM with special GC parameters for Solr?

Thanks a lot. I hope I can arrive at a solution with your help.

Rober.
-----Original Message-----
From: Otis Gospodnetic [mailto:[EMAIL PROTECTED]
Sent: Wednesday, June 18, 2008 14:55
To: solr-user@lucene.apache.org
Subject: Re: never desallocate RAM...during search

Hi,
I don't have the answer about why the cache still shows "true", but as far
as memory usage goes, based on your description I'd guess the memory is
allocated and used by the JVM, which typically tries not to run GC unless
it needs to.  So if you want to get rid of that used memory, you need to
talk to the JVM and persuade it to run GC.  I don't think there is a way to
manage memory usage directly.  There is System.gc() that you can call, but
that's only a "suggestion" for the JVM to run GC.


Otis --
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch


----- Original Message ----
> From: Roberto Nieto <[EMAIL PROTECTED]>
> To: solr-user <solr-user@lucene.apache.org>
> Sent: Wednesday, June 18, 2008 7:43:12 AM
> Subject: never desallocate RAM...during search
>
> Hi users,
>
> Some days ago I asked a question about RAM use during searches, but I
> didn't solve my problem with the ideas some expert users gave me. After
> making some tests I can ask a more specific question, hoping someone can
> help me.
>
> My problem is that I need highlighting and I have quite big docs (txt of
> 40MB). The conclusion of my tests is that if I set "rows" to 10, the
> content of the first 10 results is cached. This is normal, because it's
> probably needed for the highlighting, but this memory is never deallocated
> although I set Solr's caches to 0. With this, the memory grows until it is
> close to the heap limit, then the GC starts to deallocate memory... but at
> that point the searches are quite slow. Is this normal behavior? Can I
> configure some Solr parameter to force the deallocation of results after
> each search? [I'm using Solr 1.2]
>
> Another thing I found is that although I comment out (in solrconfig) all
> these options:
> ----> filterCache, queryResultCache, documentCache, enableLazyFieldLoading,
> useFilterForSortedQuery, boolTofilterOptimizer
> the stats always show "caching:true".
>
> I'm probably missing something stupid, but I can't find it.
>
> If anyone can help me... I'm quite desperate.
>
>
> Rober.
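On the caches mentioned in the message above: rather than commenting the cache entries out of solrconfig.xml, one thing to try is declaring them explicitly with size zero, which removes any ambiguity about what the defaults are. A hypothetical fragment (a sketch only; I have not verified this against Solr 1.2, and it does not address the memory held by in-flight requests):

```xml
<!-- Explicit zero-size caches instead of commented-out entries. -->
<filterCache class="solr.LRUCache"
             size="0" initialSize="0" autowarmCount="0"/>
<queryResultCache class="solr.LRUCache"
                  size="0" initialSize="0" autowarmCount="0"/>
<documentCache class="solr.LRUCache"
               size="0" initialSize="0" autowarmCount="0"/>
```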
