A quick guess:
If you are running Solr inside Tomcat, for example, be sure to grant unlimited virtual memory to that process, e.g. by putting
"ulimit -v unlimited"
in your Tomcat init script (if you're using Linux).
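For reference, a minimal sketch of what such an init-script change might look like; the catalina.sh path and CATALINA_HOME variable are assumptions, adjust them for your installation:

```shell
#!/bin/sh
# Hypothetical excerpt from a Tomcat init/startup script (paths are assumptions).
# Lift the virtual-memory limit before the JVM starts, so memory-mapped
# index access (e.g. Lucene's MMapDirectory) is not capped by the default limit.
ulimit -v unlimited

# Start Tomcat as usual; CATALINA_HOME is assumed to be set in the environment.
exec "$CATALINA_HOME/bin/catalina.sh" run
```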

On 09.01.2012 at 06:29, James wrote:
I am building the Solr index on Hadoop, and at the reduce step I run a task that
merges the indexes. Each part of the index is about 1 GB, and I have 10 indexes to
merge together. I always get "Java heap space" errors, even though the heap size is
about 2 GB. I wonder which part uses so much memory, and how to avoid
the OOM during the merge process.



--
Ralf Matulat
Deutscher Bundestag
Platz der Republik 1
11011 Berlin
Referat IT 1 - Anwendungsadministration
ralf.matu...@bundestag.de
Tel.: 030 - 227 34260
