How many documents does each search find? What does this mean: "number
of index hits: 7.2GB."

Above a threshold, the more memory you give Java, the more time it
spends collecting garbage. You want to start with very little memory
and gradually increase the heap size until the program stops using
all of it, and then add maybe 10%. The operating system is better at
managing memory than Java is, and it is faster to leave the full
index data in the OS disk buffers. It is counterintuitive, but it is
true.
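
For example (just a sketch: the heap numbers here are made up, and it
assumes you start Solr with the stock example Jetty launcher; adjust
for your own start script):

    # Start with a small, fixed-size heap and watch GC / heap usage
    java -Xms2g -Xmx2g -jar start.jar

    # If the heap stays full, step -Xmx up (3g, 4g, ...) until usage
    # levels off, then add roughly 10% headroom. Leave the rest of
    # the RAM unassigned so the OS can cache the index files.

You can watch how much of the heap Solr actually uses with jstat
(jstat -gcutil <pid> 5000) or a JMX console such as jconsole.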

Another problem you will run into is 'Large Pages'. This is an OS
tuning parameter, not a Java or Solr setting. You did not say which
OS you use, but here is an explanation for Linux:
http://lwn.net/Articles/423584/
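
(If you are on Linux, a quick way to check whether transparent huge
pages are turned on, assuming a kernel new enough to have them, is:

    cat /sys/kernel/mm/transparent_hugepage/enabled

The bracketed value, e.g. "[always] madvise never", is the current
setting. On RHEL 6 the path is
/sys/kernel/mm/redhat_transparent_hugepage/enabled.)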

On Mon, Aug 13, 2012 at 6:16 PM, feroz_kh <feroz...@yahoo.com> wrote:
> 1. So we have 24.5GB assigned to the JVM, which is half of the total memory of
> 48GB RAM. (If that's what you meant, and if I am getting that right?)
> 2. The size of *.fdt and *.fdx is around 300MB and 50MB respectively, so that's
> definitely less than 5%.
> Do you see a problem there?
>
> Is there a way we can force or tune things so that the response
> time remains constant or doesn't degrade a lot (i.e. almost double) when
> the index size is doubled?
> Or is there nothing we can do about it?
>
>
>
> --
> View this message in context: 
> http://lucene.472066.n3.nabble.com/Solr-Index-linear-growth-Performance-degradation-tp4000934p4001034.html
> Sent from the Solr - User mailing list archive at Nabble.com.



-- 
Lance Norskog
goks...@gmail.com
