Great, thanks Shawn... As you said: **For 204GB of data per server, I recommend at least 128GB of total RAM, preferably 256GB**. So, if I have 204GB of data on a single server/shard, then I should prefer 256GB of RAM, with which searching will be fast and will never slow down. Is that right?
On Wed, Mar 25, 2015 at 9:50 PM, Shawn Heisey <apa...@elyograg.org> wrote:

> On 3/25/2015 8:42 AM, Nitin Solanki wrote:
> > Server configuration:
> > 8 CPUs.
> > 32 GB RAM
> > O.S. - Linux
>
> <snip>
>
> > are running. Java heap set to 4096 MB in Solr. While indexing,
>
> <snip>
>
> > *Currently*, I have 1 shard with 2 replicas using SOLR CLOUD.
> > Data Size:
> > 102G solr/node1/solr/wikingram_shard1_replica2
> > 102G solr/node2/solr/wikingram_shard1_replica1
>
> If both of those are on the same machine, I'm guessing that you're
> running two Solr instances on that machine, so there's 8GB of RAM used
> for Java. That means you have about 24 GB of RAM left for caching ...
> and 200GB of index data to cache.
>
> 24GB is not enough to cache 200GB of index. If there is only one Solr
> instance (leaving 28GB for caching) with 102GB of data on the machine,
> it still might not be enough. See that "SolrPerformanceProblems" wiki
> page I linked in my earlier email.
>
> For 102GB of data per server, I recommend at least 64GB of total RAM,
> preferably 128GB.
>
> For 204GB of data per server, I recommend at least 128GB of total RAM,
> preferably 256GB.
>
> Thanks,
> Shawn
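The sizing logic Shawn describes above can be sketched in a few lines of shell. This is only an illustration of the arithmetic (total RAM minus the JVM heaps of all Solr instances on the box is roughly what the OS page cache can use to cache the index); the variable names and the figures plugged in are taken from this thread, not from any Solr tooling:

```shell
#!/bin/sh
# Rough page-cache headroom estimate for a Solr box.
# Figures below are the ones reported in this thread (assumptions, not measured).
total_ram_gb=32     # machine RAM
heap_gb=4           # -Xmx per Solr instance (4096 MB)
instances=2         # two replicas running on the same machine
index_gb=204        # combined on-disk index size

# RAM left for the OS page cache after the JVM heaps are carved out.
cache_gb=$((total_ram_gb - heap_gb * instances))
echo "Page-cache headroom: ${cache_gb}GB for ${index_gb}GB of index"
```

With the numbers above this prints a 24GB headroom against 204GB of index, which is the mismatch Shawn points out; his recommendation is to raise total RAM until the headroom is at least half the index size, ideally all of it.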