On Tue, Feb 17, 2009 at 1:10 PM, Otis Gospodnetic <
otis_gospodne...@yahoo.com> wrote:
> Right. But I was trying to point out that a single 150MB Document is not
> in fact what the o.p. wants to do. For example, if your 150MB represents,
> say, a whole book, should that really be a single document?
-- http://sematext.com/ -- Lucene - Solr - Nutch
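[For illustration of Otis's point, here is a hedged sketch, not code from this thread: the field names "book_id", "chapter", and "content" and the helper class are made up. It shows indexing a book as one SolrJ document per chapter instead of one 150 MB document.]

import java.util.List;

import org.apache.solr.client.solrj.SolrServer;
import org.apache.solr.common.SolrInputDocument;

public class PerChapterIndexer {

    // One Solr document per chapter; "book_id" ties the chapters back to the book.
    public static void index(SolrServer server, String bookId, List<String> chapters)
            throws Exception {
        int chapterNo = 1;
        for (String chapterText : chapters) {
            SolrInputDocument doc = new SolrInputDocument();
            doc.addField("id", bookId + "-ch" + chapterNo);
            doc.addField("book_id", bookId);
            doc.addField("chapter", chapterNo);
            doc.addField("content", chapterText);  // a few hundred KB each instead of 150 MB
            server.add(doc);
            chapterNo++;
        }
        server.commit();
    }
}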
From: Shalin Shekhar Mangar
To: solr-user@lucene.apache.org
Sent: Tuesday, February 17, 2009 2:48:08 PM
Subject: Re: Outofmemory error for large files
On Tue, Feb 17, 2009 at 10:26 AM, Otis Gospodnetic <
otis_gospodne...@yahoo.com> wrote:
> Siddharth,
>
> But does your 150MB file represent a single Document? That doesn't sound
> right.
>
Otis, SolrJ writes the whole XML in memory before writing it to the server. That
may be one reason behind Siddharth's problem.
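[A rough sketch of Shalin's point, not the actual SolrJ code path; the field names and sizes are invented. The whole <doc> element is materialized as a single in-memory String (much as SolrJ's update request serialization does), so a 150 MB field is effectively held in memory twice before anything goes over the wire.]

import org.apache.solr.client.solrj.util.ClientUtils;
import org.apache.solr.common.SolrInputDocument;

public class XmlInMemorySketch {
    public static void main(String[] args) {
        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", "file-1");
        doc.addField("content", bigText());               // stands in for the 150 MB field

        // The whole <doc> element becomes one String before it is written to the
        // server, so the content is held in memory at least twice.
        String xml = ClientUtils.toXML(doc);
        System.out.println("serialized length: " + xml.length());
    }

    private static String bigText() {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 1000000; i++) {               // scaled down so the sketch runs
            sb.append("lorem ipsum ");
        }
        return sb.toString();
    }
}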
From: "Gargate, Siddharth"
Sent: ..., 2009 12:39:53 PM
Subject: RE: Outofmemory error for large files
Otis,
I haven't tried it yet, but what I meant is:
If we divide the content into multiple parts, then words will be split across two
different Solr documents. If the main document contains 'Hello World', then these
two words could land in separate documents, and a phrase search for 'Hello World'
would no longer match.
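[One way around the boundary problem Siddharth describes, purely an illustrative sketch rather than something proposed in the thread; the chunk sizes are arbitrary. Splitting with an overlap lets a phrase that straddles a chunk boundary still appear whole in at least one chunk.]

import java.util.ArrayList;
import java.util.List;

public class OverlapSplitter {

    // Split text into chunks of chunkSize characters, each chunk starting
    // "overlap" characters before the previous one ended (overlap < chunkSize).
    static List<String> split(String text, int chunkSize, int overlap) {
        List<String> chunks = new ArrayList<String>();
        int start = 0;
        while (start < text.length()) {
            int end = Math.min(start + chunkSize, text.length());
            chunks.add(text.substring(start, end));
            if (end == text.length()) {
                break;
            }
            start = end - overlap;   // back up so boundary phrases are repeated
        }
        return chunks;
    }

    public static void main(String[] args) {
        String text = "... Hello World ...";     // imagine 150 MB of text here
        // With 16-char chunks and 12 chars of overlap, "Hello World" survives
        // intact in both chunks even though it sits near a boundary.
        for (String chunk : split(text, 16, 12)) {
            System.out.println("[" + chunk + "]");
        }
    }
}

[A phrase query would still only match within a single chunk, but with enough overlap the common case is covered; the trade-off is that the overlapping text is indexed, and scored, more than once.]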
From: "Gargate, Siddharth"
To: solr-user@lucene.apache.org
Sent: Monday, February 16, 2009 2:00:58 PM
Subject: Outofmemory error for large files
I am trying to index a text file of around 150 MB with a 1024 MB max heap, but I
get an OutOfMemoryError in the SolrJ code.
"Gargate, Siddharth"
To: solr-user@lucene.apache.org
Sent: Monday, February 16, 2009 2:00:58 PM
Subject: Outofmemory error for large files
I am trying to index around 150 MB text file with 1024 MB max heap. But
I get Outofmemory error in the SolrJ code.
Exception in thread "mai
I am trying to index around 150 MB text file with 1024 MB max heap. But
I get Outofmemory error in the SolrJ code.
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:2882)
at java.lang.AbstractStringBuilder.expandCapacity
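[Those two frames are the classic signature of a StringBuilder outgrowing its backing array: each expansion allocates a larger char[] and copies the old one. A hedged sketch of the kind of code that produces this when reading a 150 MB file into a single String; this is not the thread's actual code, and the file path here simply comes from the command line.]

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class ReadWholeFile {
    public static void main(String[] args) throws IOException {
        StringBuilder sb = new StringBuilder();
        BufferedReader reader = new BufferedReader(new FileReader(args[0]));
        try {
            char[] buf = new char[8192];
            int n;
            while ((n = reader.read(buf)) != -1) {
                // When the builder outgrows its backing array, expandCapacity
                // allocates a bigger char[] and Arrays.copyOf copies the old one
                // -- the exact frames in the stack trace above.
                sb.append(buf, 0, n);
            }
        } finally {
            reader.close();
        }
        String content = sb.toString();   // yet another full copy of the text
        System.out.println("read " + content.length() + " chars");
    }
}

[Since a Java char is two bytes, 150 MB of mostly-ASCII text becomes roughly 300 MB of char data before counting the temporary copies made while growing the builder and the XML that SolrJ builds on top, so a 1024 MB heap can plausibly run out.]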