You're saying that the file you're indexing is 500M? Pretty big...

First, I'd ask whether you really want to index it as a single document or
whether you can break it up into smaller sub-files. That depends on what
the content is, I guess.
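
If you do split it, something like the following might work. It's just a
rough, untested sketch: the 1MB chunk size and the "text" field are
assumptions, so adjust to your content and schema:

import java.io.BufferedReader;
import java.io.FileReader;
import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
import org.apache.solr.common.SolrInputDocument;

// Rough sketch: index one big text file as many ~1MB documents.
// "file", the chunk size, and the "text" field are assumptions.
CommonsHttpSolrServer solr =
    new CommonsHttpSolrServer("http://localhost:8983/solr/file");
BufferedReader reader = new BufferedReader(new FileReader(file));
StringBuilder chunk = new StringBuilder();
String line;
int part = 0;
while ((line = reader.readLine()) != null) {
  chunk.append(line).append('\n');
  if (chunk.length() >= 1024 * 1024) {        // ~1MB per document
    SolrInputDocument doc = new SolrInputDocument();
    doc.addField("id", file.getAbsolutePath() + "#" + part++);
    doc.addField("text", chunk.toString());
    solr.add(doc);
    chunk.setLength(0);                       // start the next chunk
  }
}
// ...send the last partial chunk the same way, then commit.
reader.close();
solr.commit();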

Second, you can certainly index something this big, you just need
enough memory. Sounds like a 64-bit machine is in order.
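
While you're at it, make sure the -Xmx you set in Eclipse is actually
reaching the right JVM; launch configurations are easy to get wrong. A
quick diagnostic (plain Java, nothing Solr-specific):

// Print the maximum heap the running JVM will actually use.
// If this doesn't match your -Xmx, the VM args aren't being applied.
long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
System.out.println("Max heap: " + maxMb + "MB");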

Third, make sure you're committing before you try to index this
thing, so you're sure you have as many resources available
as possible.
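
With SolrJ that's just an explicit commit before you send the big
request, assuming the same "solr" object from your code:

// Flush anything pending so the server starts with as much free
// memory as possible before the large extract request.
solr.commit();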

Fourth, where is your error actually coming from? Is it the SolrJ program
(which I presume is running on your local machine) or the Solr server
(and where is that running)?
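
The stack trace should tell you: if the OutOfMemoryError is thrown in
your own process you'll see your classes (and SolrJ's) in it; if it only
shows up in Tomcat's logs, it's the server side. One way to make the
client side obvious:

try {
  solr.request(req);
} catch (Exception e) {
  // Exceptions here were reported by (or while talking to) the server;
  // an OutOfMemoryError thrown on this thread is the client JVM.
  e.printStackTrace();
}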

Best
Erick

On Fri, Sep 30, 2011 at 7:55 AM, hadi <md.anb...@gmail.com> wrote:
> I wrote a simple program with solrj that indexes files, but after a
> minute it crashed and
> *java.lang.OutOfMemoryError: Java heap space* appeared.
>
> I used Eclipse and my machine has about 2GB of memory. I set
> -Xms1024M -Xmx2048M in the VM args of both Tomcat and my application's
> Debug Configuration, uncommented maxBufferedDocs in solrconfig and set it
> to 100, then ran my application again, but it crashed as soon as it
> reached files greater than 500MB.
>
> Is there any config for indexing large files with solrj?
> The details of my solrj code are below:
>
> String urlString = "http://localhost:8983/solr/file";
> CommonsHttpSolrServer solr = new CommonsHttpSolrServer(urlString);
>
> ContentStreamUpdateRequest req = new
> ContentStreamUpdateRequest("/update/extract");
>
> req.addFile(file);
> req.setParam("literal.id", file.getAbsolutePath());
> req.setParam("literal.name", file.getName());
> req.setAction(ACTION.COMMIT, true, true);
>
> solr.request(req);
>
>
>
