This may be more of a general Java question than a Solr one, but I'm a
bit confused.
We have a largish Solr index: about 8M documents, with a data dir of
about 70GB. We're adding about 500K new docs a week and serving about
1 query/second.
Recently (since we crossed roughly the 6M-document mark) Resin has
been dying with the following:
/usr/local/resin/log/stdout.log:[12:08:21.749] [28304] HTTP/1.1 500
Java heap space
/usr/local/resin/log/stdout.log:[12:08:21.749]
java.lang.OutOfMemoryError: Java heap space
Only a restart of Resin gets it going again, and then it crashes again
within 24 hours.
It's a 4GB machine and we run Resin with args="-J-mx2500m -J-ms2000m".
We can't really raise the heap any higher on this machine.
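For reference, Resin's -J prefix just forwards each option to the
underlying JVM, so (as far as I understand it) the line above is
equivalent to these heap settings; -ms/-mx are the older aliases of
-Xms/-Xmx:

```shell
# Resin wrapper args; each -J-prefixed option is passed to the JVM.
# Both lines request a 2000 MB initial / 2500 MB maximum heap:
args="-J-mx2500m -J-ms2000m"
args="-J-Xmx2500m -J-Xms2000m"
```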
Are there 'native' memory requirements for Solr as a function of index
size? Does a 70GB index require some minimum amount of wired RAM? Or
is there some misconfiguration in Resin, Solr, or my system? I don't
know Java well, but it seems strange that the VM can't page memory out
to disk, or do anything else besides stopping the server.
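To see how close we actually get to the -mx ceiling before the crash,
I've been poking at the heap with something like the sketch below (the
class name HeapProbe is my own; the body could just as well live in a
JSP or servlet running inside Resin):

```java
// Minimal heap-usage probe: reports how much of the -mx ceiling is in use.
public class HeapProbe {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long max = rt.maxMemory();      // ceiling set by -mx / -Xmx
        long total = rt.totalMemory();  // heap currently committed by the VM
        long free = rt.freeMemory();    // unused portion of the committed heap
        long used = total - free;       // heap actually occupied by objects
        System.out.printf("used=%dMB committed=%dMB max=%dMB%n",
                used >> 20, total >> 20, max >> 20);
    }
}
```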
-- Brian Whitman