You might want to look at shrinking or eliminating your caches if
you're running out of RAM. Some of them may have a low hit rate,
which you can check on the Stats page. Caches with a low hit rate
only consume RAM and CPU cycles.
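
For example, the caches live in solrconfig.xml. A sketch of what
smaller settings might look like (the sizes here are illustrative, not
recommendations -- tune them against the hit rates you see on the
Stats page):

```xml
<!-- solrconfig.xml: hypothetical reduced cache sizes.
     A cache with a poor hit rate can be shrunk or removed entirely. -->
<filterCache      class="solr.LRUCache" size="512" initialSize="128" autowarmCount="64"/>
<queryResultCache class="solr.LRUCache" size="512" initialSize="128" autowarmCount="64"/>
<documentCache    class="solr.LRUCache" size="512" initialSize="128"/>
```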

Also, using this JVM arg might reduce the memory footprint:
-XX:+UseCompressedOops
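
Assuming the start.jar launch from your message, the flag just slots
into the existing command line (heap sizes are the ones you're already
using; the flag only does anything on a 64-bit JVM):

```shell
# Same start command as before, with compressed ordinary object
# pointers enabled to shrink per-object overhead on a 64-bit heap.
/usr/lib/jvm/jre/bin/java \
  -XX:+UseConcMarkSweepGC \
  -XX:+UseCompressedOops \
  -Xms1G -Xmx5G \
  -jar start.jar
```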

In the end, though, the surest fix would be to move to an instance
type with more RAM: http://www.ec2instances.info/

Michael Della Bitta

------------------------------------------------
Appinions | 18 East 41st St., Suite 1806 | New York, NY 10017
www.appinions.com
Where Influence Isn’t a Game


On Mon, Aug 6, 2012 at 1:48 PM, Jon Drukman <jdruk...@gmail.com> wrote:
> Hi there.  I am running Solr 1.4.1 on an Amazon EC2 box with 7.5GB of RAM.
>  It was set up about 18 months ago and has been largely trouble-free.
>  Unfortunately, lately it has started to run out of memory pretty much
> every day.  We are seeing
>
> SEVERE: java.lang.OutOfMemoryError: Java heap space
>
> When that happens, a simple query like
> "http://localhost:8983/solr/select?q=*:*"
> returns nothing.
>
> I am starting Solr with the following:
>
> /usr/lib/jvm/jre/bin/java -XX:+UseConcMarkSweepGC -Xms1G -Xmx5G -jar
> start.jar
>
> It would be vastly preferable if Solr could just exit when it gets a memory
> error, because we have it running under daemontools, and that would cause
> an automatic restart.  After restarting, Solr works fine for another 12-18
> hours.  Not ideal but at least it wouldn't require human intervention to
> get it going again.
>
> What can I do to reduce the memory pressure?  Does Solr require the entire
> index to fit in memory at all times?  The on disk size is 15GB.  There are
> 27.5 million documents, but they are all tiny (mostly one line forum
> comments like "this game is awesome").
>
> We're using Sun's OpenJDK 1.6, if that matters.
>
> -jsd-
