On 8/10/2015 7:07 PM, rohit wrote:
> Thanks Shawn. I was looking at SOLR Admin UI and also using top command on
> server. 

The amount of free memory shown by tools like those is not a good way
to determine what's happening with your memory.  As I said before, it's
completely normal for the OS to utilize almost all of your physical
memory (mostly for the disk cache), even if your programs only require a
fraction of it.  Free memory by itself is not a meaningful metric.

> I'm running an endurance test for 4 hours at 50 TPS, and I see the
> physical memory keep increasing during that time.  We also have a
> scheduled delta import during that window which can import up to 4
> million docs.  After the import I see the memory increase again, and
> there comes a point when there is no more memory left, which in turn
> leads to OOM.

If you're hitting OOM, then you need to increase the heap size; Solr is
requiring more memory than you have assigned to the heap.  Be aware that
increasing the heap will reduce the amount of memory available for the
OS disk cache, which may reduce performance.
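For example, with the bin/solr script that ships with Solr 5.x --
assuming that's how you start Solr -- the heap can be raised either
persistently or per-start.  The 4g value here is only an illustration,
not a recommendation:

```
# in bin/solr.in.sh (solr.in.cmd on Windows):
SOLR_HEAP="4g"

# or as a one-off at startup:
bin/solr start -m 4g
```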

There may be ways you can reduce heap usage by adjusting your
configuration or the way you use Solr.

http://wiki.apache.org/solr/SolrPerformanceProblems#Java_Heap

There is other good information on that page.  I would encourage you to
go to the top of the page and read all of it.

50 queries per second is a lot of load for a single server.  I would
only expect to see success with that many queries per second if the
index fits entirely into the OS disk cache.  It may also be necessary
for the index to be relatively small.

> I have noticed one more thing: if there is no activity on the server --
> no import, no search going on -- I have not seen the memory come down
> from the state it reached after the test.

In general, once Java grabs memory, it doesn't give it back to the
operating system.  As I previously mentioned, it cannot grab more than
you ask for, plus some overhead.  The overhead may be a few hundred MB,
which is not very much when you're talking about multiple gigabytes.
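You can see the distinction from inside the JVM itself.  A minimal
sketch (the class name is my own, not anything from Solr): "committed"
is the memory the JVM has claimed from the OS, which is roughly what top
reports for the process, and it does not shrink back down just because
the objects inside the heap have been garbage-collected.

```java
public class HeapDemo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024 * 1024;
        long max = rt.maxMemory() / mb;          // the -Xmx ceiling
        long committed = rt.totalMemory() / mb;  // claimed from the OS so far
        long used = (rt.totalMemory() - rt.freeMemory()) / mb;
        System.out.println("max=" + max + "MB committed=" + committed
                + "MB used=" + used + "MB");
    }
}
```

Run it under load and again when idle: "used" falls after a GC, but
"committed" stays high, which is why top keeps showing high memory after
the test ends.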

> Couple of things to notice: 
> 
> 1. We are storing data and also indexing it (not sure if that is
> causing the problem).
> 2. Is 8 GB enough for 10 million or more docs to index?
> 3. We have a custom handler which extends Solr handlers to return data
> when the client calls the Solr handler.

I couldn't tell you whether 8GB is enough for 10 million documents.
That depends on what's in those documents, what your schema.xml says,
how you query Solr, and a few other factors.  Even if you tell me the
answers to these questions, I *still* may not be able to say whether
it's enough.  I *might* be able to tell you that it's NOT enough,
though.  The only way to be absolutely sure is to prototype -- actually
try it out.

https://lucidworks.com/blog/sizing-hardware-in-the-abstract-why-we-dont-have-a-definitive-answer/

8GB of RAM is a *very* small system in the world of Solr.  My systems
have 64GB of RAM, and I frequently wish that was 256GB.  My indexes are
somewhat larger than yours, though.

Thanks,
Shawn
