Hello,

Thanks for your answers and suggestions. I can also get a heap dump with the jmap command, but the resulting file is so big that jhat runs out of memory itself when reading it.
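For reference, this is roughly what I ran (the PID and file name are placeholders, and this assumes Java 6's jmap; jhat needs a large heap of its own, passed through with -J):

    # dump the heap of the running Solr/Jetty process
    jmap -dump:format=b,file=solr-heap.bin <pid>
    # give jhat a 4 GB heap so it can load the dump
    jhat -J-Xmx4g solr-heap.bin

Even with a bigger heap jhat struggled with the file, which is why I moved to the YourKit analyser mentioned below.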
I traced my problem back (using the heap dump analyser from yourkit.com) to the FieldCache. In fact it had nothing to do with the index optimization, but with some queries doing faceted search that run in the background from cron. I will come back with another email about that. But nevertheless, wouldn't it be a good idea to print the stack trace when SEVERE errors are encountered?

nicolae

On Mon, Aug 3, 2009 at 3:20 AM, Bill Au <bill.w...@gmail.com> wrote:
> Your heap may just be too small, or you may have a memory leak. A stack
> trace may not help you, since the thread that encountered the
> OutOfMemoryError may not be where the memory leak is. A heap dump will
> tell you what's using up all the memory in your heap.
> Bill
>
> On Thu, Jul 30, 2009 at 3:54 PM, Nicolae Mihalache <xproma...@yahoo.com>
> wrote:
>
> > Hello,
> >
> > I'm a new user of Solr, but I have worked a bit with Lucene before. I
> > get an out-of-memory exception when optimizing the index through Solr
> > and I would like to find out why. However, the only message I get on
> > standard output is:
> >
> > Jul 30, 2009 9:20:22 PM org.apache.solr.common.SolrException log
> > SEVERE: java.lang.OutOfMemoryError: Java heap space
> >
> > Is there a way to get a stack trace for this exception? I had a look at
> > the java.util.logging options and didn't find anything.
> >
> > My Solr runs in a standard configuration inside Jetty.
> > Any suggestion would be appreciated.
> >
> > Thanks,
> > nicolae
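P.S. One more thing I found while digging: the JVM can write a heap dump for you at the exact moment the OutOfMemoryError happens, which sidesteps the missing stack trace entirely. Something along these lines in the Jetty start command should work (the dump path is just an example):

    java -XX:+HeapDumpOnOutOfMemoryError \
         -XX:HeapDumpPath=/var/tmp/solr-oom.hprof \
         -jar start.jar

If I read the HotSpot docs correctly, the resulting .hprof file can then be loaded into jhat or YourKit like any other dump.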