On 10/23/2018 7:15 AM, Daniel Carrasco wrote:
> Hello,
> Thanks for your response.
> We've already thought about that and doubled the instances. Right now
> every Solr instance has 60GB of RAM (40GB configured for Solr) and a
> 16-core CPU. The entire data set can be stored in RAM without filling
> it (talking about raw data, of course, not processed data).
Why are you making the heap so large? I've set up servers that can
handle hundreds of millions of Solr documents in a much smaller heap. A
40GB heap would be something you might do if you're handling billions of
documents on one server.
When you say the entire data can be stored in RAM ... are you counting
that 40GB you gave to Solr? Because you can't count that -- that's for
Solr, NOT the index data.
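The index data is cached by the operating system's page cache, which
lives outside the Java heap, so the memory you actually need is
roughly index size plus heap plus OS overhead. A quick way to check
both numbers, assuming a typical Linux install with the index under
/var/solr/data (adjust the path for your layout):

  # total on-disk size of all index directories
  du -sch /var/solr/data/*/data/index

  # how much memory is free vs. used as disk cache
  free -h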
The heap size should never be dictated by the amount of memory in the
server. It should be made as large as it needs to be for the job, and
no larger.
https://wiki.apache.org/solr/SolrPerformanceProblems#RAM
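If you want to experiment with a smaller heap, here is one way to set
it, assuming a standard install that uses bin/solr and solr.in.sh.
The 8g value is only a placeholder; the right number depends on your
document counts, cache sizes, and query patterns:

  # one-off, at startup:
  bin/solr start -m 8g

  # or permanently, in solr.in.sh:
  SOLR_HEAP="8g"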
> About the usage, I've checked the RAM and CPU usage and they are not
> fully used.
What exactly are you looking at? I've had people swear that they can't
see a problem with their systems when Solr is REALLY struggling to keep
up with what it has been asked to do.
Further down on the page I linked above is a section about asking for
help. If you can provide the screenshot it mentions there, that would
be helpful. Here's a direct link to that section:
https://wiki.apache.org/solr/SolrPerformanceProblems#Asking_for_help_on_a_memory.2Fperformance_issue
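For reference, the screenshot that section asks for is just the
output of the top utility with the process list sorted by memory:

  top
  # then press shift-M inside top to sort processes by memory usage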
Thanks,
Shawn