Hello again,

I run Solr on Tomcat under Windows and use the Tomcat monitor to start the service. I have set the minimum heap size to 512 MB and the maximum to 1024 MB. The system has 2 GB of RAM. The error that I get after sending
approximately 300 MB is:

java.lang.OutOfMemoryError: Java heap space
   at org.xmlpull.mxp1.MXParser.fillBuf(MXParser.java:2947)
   at org.xmlpull.mxp1.MXParser.more(MXParser.java:3026)
   at org.xmlpull.mxp1.MXParser.nextImpl(MXParser.java:1384)
   at org.xmlpull.mxp1.MXParser.next(MXParser.java:1093)
   at org.xmlpull.mxp1.MXParser.nextText(MXParser.java:1058)
   at org.apache.solr.handler.XmlUpdateRequestHandler.readDoc(XmlUpdateRequestHandler.java:332)
   at org.apache.solr.handler.XmlUpdateRequestHandler.update(XmlUpdateRequestHandler.java:162)
   at org.apache.solr.handler.XmlUpdateRequestHandler.handleRequestBody(XmlUpdateRequestHandler.java:84)
   at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:77)
   at org.apache.solr.core.SolrCore.execute(SolrCore.java:658)
   at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:191)
   at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:159)
   at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
   at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
   at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:230)
   at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:175)
   at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:128)
   at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:104)
   at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
   at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:261)
   at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:844)
   at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:581)
   at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:447)
   at java.lang.Thread.run(Thread.java:619)

After sleeping on the problem I see that it does not stem directly from Solr, but from the
module org.xmlpull.mxp1.MXParser. Hmmm. I'm open to suggestions and ideas.

First, is this doable at all?
If yes, will I have to modify the code to save the file to disk and then read it back
in order to index it in chunks?
Or can I get it working on a stock Solr install?
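
In case chunking is the way to go, here is roughly what I have in mind (just a sketch, not tested: the update URL, class name, and batch size are made up, and it assumes the documents are already available as individual <doc>...</doc> strings):

import java.io.OutputStreamWriter;
import java.io.Writer;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.List;

// Rough sketch: posts documents to Solr's XML update handler in small
// batches instead of one huge <add>, so no single request comes anywhere
// near 300 MB.
public class ChunkedIndexer {

    private static final String UPDATE_URL = "http://localhost:8080/solr/update"; // adjust for your setup
    private static final int BATCH_SIZE = 100; // made-up value, tune as needed

    public static void indexInChunks(List<String> docXmlSnippets) throws Exception {
        StringBuilder batch = new StringBuilder();
        int count = 0;
        for (String doc : docXmlSnippets) {
            batch.append(doc);
            if (++count % BATCH_SIZE == 0) {
                post("<add>" + batch + "</add>");
                batch.setLength(0); // start a new, small request
            }
        }
        if (batch.length() > 0) {
            post("<add>" + batch + "</add>");
        }
        post("<commit/>"); // commit once at the end
    }

    private static void post(String body) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(UPDATE_URL).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "text/xml; charset=UTF-8");
        Writer w = new OutputStreamWriter(conn.getOutputStream(), "UTF-8");
        w.write(body);
        w.close();
        if (conn.getResponseCode() != 200) {
            throw new RuntimeException("Solr returned HTTP " + conn.getResponseCode());
        }
        conn.disconnect();
    }
}

The idea is that each POST stays small, so the XML parser never has to buffer anything close to the full 300 MB.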

Thanks,

Brian

Norberto Meijome wrote:
On Wed, 05 Sep 2007 17:18:09 +0200
Brian Carmalt <[EMAIL PROTECTED]> wrote:

I've been trying to index a 300 MB file to Solr 1.2. I keep getting out-of-memory heap errors.
Even on an empty index with one gig of VM memory it still won't work.

Hi Brian,

VM != heap memory.

VM = OS memory
heap memory = memory made available by the JVM to the Java process. Heap
memory errors are hardly ever an issue with the app itself (other than, of course,
with bad programming... but that doesn't seem to be the issue here so far)
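
If you want to double-check what the JVM actually ended up with, a throwaway class like this (nothing Solr-specific, name made up) prints the effective limits:

// Throwaway check: prints the heap limits the running JVM was actually
// given, so you can confirm your -Xms/-Xmx (or Tomcat monitor) settings
// really took effect.
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024 * 1024;
        System.out.println("max heap   : " + rt.maxMemory() / mb + " MB");   // roughly -Xmx
        System.out.println("total heap : " + rt.totalMemory() / mb + " MB"); // currently allocated
        System.out.println("free heap  : " + rt.freeMemory() / mb + " MB");
    }
}

Run it with the same flags you give the container, e.g. java -Xmx1024m HeapCheck, to see how the settings map to what the JVM reports.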


[EMAIL PROTECTED] [Thu Sep  6 14:59:21 2007]
/usr/home/betom
$ java -X
[...]
    -Xms<size>        set initial Java heap size
    -Xmx<size>        set maximum Java heap size
    -Xss<size>        set java thread stack size
[...]

For example, start Solr as:
java  -Xms64m -Xmx512m   -jar start.jar

YMMV with respect to the actual values you use.

Good luck,
B
_________________________
{Beto|Norberto|Numard} Meijome

Windows caters to everyone as though they are idiots. UNIX makes no such assumption. It assumes you know what you are doing, and presents the challenge of figuring it out for yourself if you don't.

I speak for myself, not my employer. Contents may be hot. Slippery when wet. 
Reading disclaimers makes you go blind. Writing them is worse. You have been 
Warned.

