In a word: "stress test". Here's the blog I wrote on the topic outlining
why it's hard to give a more helpful answer...

https://lucidworks.com/2012/07/23/sizing-hardware-in-the-abstract-why-we-dont-have-a-definitive-answer/
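For concreteness, here's a rough Python sketch of that kind of stress test. The
collection name, fields, and facet request are placeholders; substitute your real
pivot/JSON-facet queries and ramp the thread count while watching CPU on the nodes:

# Minimal load-test sketch (collection/field names are hypothetical).
# Fires concurrent JSON Facet queries at Solr and reports latency,
# so you can watch CPU on the nodes while the thread count ramps up.
import json
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

SOLR = "http://localhost:8983/solr/mycollection/query"  # adjust to your cluster

FACET_BODY = json.dumps({
    "query": "*:*",
    "limit": 0,
    "facet": {                        # nested terms facet, roughly a pivot
        "by_region": {
            "type": "terms", "field": "region",
            "facet": {"by_model": {"type": "terms", "field": "model"}}
        }
    }
}).encode("utf-8")

def one_query():
    req = urllib.request.Request(
        SOLR, data=FACET_BODY, headers={"Content-Type": "application/json"})
    start = time.time()
    with urllib.request.urlopen(req) as resp:
        resp.read()
    return time.time() - start

def run(threads, queries_per_thread):
    with ThreadPoolExecutor(max_workers=threads) as pool:
        latencies = list(pool.map(lambda _: one_query(),
                                  range(threads * queries_per_thread)))
    latencies.sort()
    print(f"{threads:3d} threads: median {latencies[len(latencies) // 2]:.3f}s "
          f"p95 {latencies[int(len(latencies) * 0.95)]:.3f}s")

if __name__ == "__main__":
    for threads in (1, 2, 4, 8, 16):   # ramp load until CPU saturates
        run(threads, queries_per_thread=20)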

You might want to explore the HyperLogLog approach, which provides
pretty good estimates without using nearly as many resources.
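For example, a rough sketch of an approximate distinct-count request using the
hll aggregation in the JSON Facet API (collection and field names are made up):

# Sketch: approximate distinct counts per facet bucket using the
# JSON Facet API's hll aggregation (HyperLogLog under the hood)
# instead of exact unique counts. Collection/field names are hypothetical.
import json
import urllib.request

SOLR = "http://localhost:8983/solr/mycollection/query"

body = json.dumps({
    "query": "*:*",
    "limit": 0,
    "facet": {
        "by_region": {
            "type": "terms",
            "field": "region",
            "facet": {"approx_users": "hll(user_id)"}   # estimate, not exact
        }
    }
}).encode("utf-8")

req = urllib.request.Request(
    SOLR, data=body, headers={"Content-Type": "application/json"})
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["facets"])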

Best,
Erick

On Tue, Jun 20, 2017 at 11:36 AM, Lewin Joy (TMS) <lewin....@toyota.com> wrote:
> ** PROTECTED Confidential (internal parties only)
> Hi,
>
> Is there any way to estimate the CPU needed to set up a Solr environment?
> We use pivot facets extensively, both through the JSON Facet API and in native
> queries.
>
> For our 150 million record collection, we are seeing CPU usage hit 100% even
> under small loads.
> If we have to increase our configuration, is there some way we can estimate
> the CPU usage we would need?
>
> We have five VMs with 8 CPUs each and 32 GB RAM, of which Solr uses a 24 GB heap.
>
> Thanks,
> Lewin
