Shawn, 

By double-checking the params used to start Solr with the jps command, I found
that the max heap size is already set to 10G. So I made a big mistake yesterday.
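For reference, this is roughly how I checked it (the process line below is a hypothetical sample, not my actual command line) -- the setting to look for in the jps output is -Xmx:

```shell
# `jps -lvm` lists running JVMs with their main class and startup
# arguments; the max heap appears as -Xmx. Hypothetical sample line:
sample='12345 start.jar -Xms10g -Xmx10g -DSTOP.PORT=7983'

# Extract just the max-heap flag from the argument list.
echo "$sample" | grep -o '\-Xmx[0-9]*[gm]'
# -> -Xmx10g
```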

But in the Solr admin UI, when I select the collection with the performance
problem, the overview page shows that the heap memory is only about 8M. What
is wrong?

Every time I search for different characters, the QTime in the response header
is always greater than 300ms. If I search again, since I can hit the cache,
the response time drops to about 30ms.
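To show where I read that number from: QTime is the server-side processing time (in ms) that Solr reports in the responseHeader of every response. A sketch, with a hypothetical host/collection and a sample response body:

```shell
# A query like this (hypothetical host and collection name):
#   curl 'http://localhost:8983/solr/mycollection/select?q=text:foo&wt=json'
# returns a responseHeader carrying QTime. Sample response body:
response='{"responseHeader":{"status":0,"QTime":312}}'

# Pull out the QTime field; repeating the identical query normally hits
# the queryResultCache and comes back with a much smaller QTime.
echo "$response" | grep -o '"QTime":[0-9]*'
# -> "QTime":312
```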


--
Sent from my NetEase Mail (mobile edition)


On 2016-05-10 11:35:27, "Shawn Heisey" <apa...@elyograg.org> wrote:
>On 5/9/2016 9:11 PM, lltvw wrote:
>> You are right, the max heap is 512MB, thanks.
>
>90 million documents split into 12 shards means 7.5 million documents
>per shard.
>
>With that many documents and a 512MB heap, you're VERY lucky if Solr
>doesn't experience OutOfMemoryError problems -- which will make Solr's
>behavior very unpredictable.
>
>Your server has plenty of memory.  Because of the very small max heap,
>it is probably spending a lot of time doing garbage collection.  You'll
>actually need to *increase* your heap size.  I would recommend starting
>with 4GB.
>
>Exactly how to do this will depend on how you're starting it.  If you
>are starting it with "bin/solr" then you can add a -m 4g option to that
>commandline.
>
>Thanks,
>Shawn
>
