On Fri, 2011-09-09 at 18:48 +0200, Mike Austin wrote:
> Our index is very small with 100k documents and a light load at the moment.
> If I wanted to use the smallest possible RAM on the server, how would I do
> this and what are the issues?

The index size depends just as much on the size of the documents as on
their number, but assuming that your documents are relatively small, I
don't see any issues. 100K is such a small amount that you will get
decent OS caching even on a very low memory server.

Plain searches work well with low memory, but faceting might be tricky
as it requires a temporary memory overhead for each concurrent search.
Limiting the number of concurrent searches to 1 or 2 might be a good
idea.
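
One way to cap concurrency, assuming you run the Jetty-based example
setup that ships with Solr (the file and element names below are from
that distribution, so adjust for your container), is to shrink the
servlet thread pool in etc/jetty.xml:

```xml
<!-- etc/jetty.xml from the Solr example distribution (assumed setup).
     Capping the request thread pool effectively limits how many
     searches - and thus faceting overheads - can run at once. -->
<Set name="ThreadPool">
  <New class="org.mortbay.thread.QueuedThreadPool">
    <Set name="minThreads">1</Set>
    <Set name="maxThreads">2</Set>
  </New>
</Set>
```

Note that this also throttles non-search requests such as updates, so
it is a blunt instrument; it just happens to be a simple one.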

During tests of hierarchical faceting with Solr trunk, I tried running
with 32MB for a very simple 1 million document index and it worked
surprisingly well (better with 48MB though). For stable Solr I would
expect the memory requirement to be somewhat higher.
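
For reference, the heap limit in such a test is simply set on the JVM
when launching the example server (this assumes the stock start.jar
from the Solr distribution):

```shell
# Cap the JVM heap at 48MB; raise it if you hit OutOfMemoryError
# during faceted searches.
java -Xmx48m -jar start.jar
```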


Could you tell us what you're aiming at? What is low memory for you, how
large is your index in bytes, what response times do you hope for and
what is the expected query rate?