Chris Hostetter wrote:
> How big is your physical index directory on disk?

It's about 2.9G now.
Is there a direct connection between the size of the index and RAM usage?

> Your best bet is to allocate as much ram to the server as you can.
> Depending on how full your caches are, and what hit ratios you are getting
> (the "STATISTICS" link from the Admin screen will tell you) you might want
> to make some of them smaller to reduce the amount of RAM Solr uses for
> them.
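(For reference, those caches are configured in solrconfig.xml; a sketch with deliberately small, purely illustrative sizes — not a tuning recommendation:)

```xml
<!-- solrconfig.xml: shrink the caches to reduce heap pressure (example values) -->
<filterCache
    class="solr.LRUCache"
    size="512"
    initialSize="128"
    autowarmCount="64"/>

<queryResultCache
    class="solr.LRUCache"
    size="512"
    initialSize="128"
    autowarmCount="64"/>

<documentCache
    class="solr.LRUCache"
    size="512"
    initialSize="128"/>
```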
Hm, after disabling all caches I still get OutOfMemoryErrors.
All I do currently while testing is to delete documents. No searching or inserting. Typically after deleting about 20,000 documents the server throws the first error message.
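(Per the advice above about allocating more RAM: the heap ceiling is set when the servlet container's JVM is launched. With the Jetty start.jar bundled with Solr it would look something like this — the 1024m value is illustrative, not a recommendation:)

```shell
# Raise the maximum JVM heap when launching Solr's bundled Jetty (example value)
java -Xmx1024m -jar start.jar
```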

> From an actual index standpoint, if you don't care about doc/field boosts
> or lengthNorms, then the omitNorms="true" option on your fields (or
> fieldtypes) will help save one byte per document per field you use it on.
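(For reference, omitNorms is set per field or per fieldType in schema.xml; an illustrative fragment — the field names here are hypothetical:)

```xml
<!-- schema.xml: disable norms on fields that need neither index-time boosts
     nor length normalization; saves one byte per document per field -->
<field name="id"    type="string" indexed="true" stored="true" omitNorms="true"/>
<field name="title" type="text"   indexed="true" stored="true" omitNorms="true"/>
```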
That is something I could test, though I doubt it will significantly change the size of the index.

One thing that appears suspicious to me is that everything went fine as long as the number of documents was below 10 million. Problems started when this limit was exceeded. But maybe this is just a coincidence.

Marcus
