On Tue, Feb 17, 2009 at 1:10 PM, Otis Gospodnetic <
otis_gospodne...@yahoo.com> wrote:
> Right. But I was trying to point out that a single 150MB Document is not
> in fact what the o.p. wants to do. For example, if your 150MB represents,
> say, a whole book, should that really be a single document?
-- http://sematext.com/ -- Lucene - Solr - Nutch
From: Shalin Shekhar Mangar
To: solr-user@lucene.apache.org
Sent: Tuesday, February 17, 2009 2:48:08 PM
Subject: Re: Outofmemory error for large files
On Tue, Feb 17, 2009 at 10:26 AM, Otis Gospodnetic <
otis_gospodne...@yahoo.com> wrote:
> Siddharth,
>
> But does your 150MB file represent a single Document? That doesn't sound
> right.
>
Otis, Solrj writes the whole XML in memory before writing it to the server. That
may be one reason behind Siddharth's OutOfMemory error.
Sent: Tuesday, February 17, 2009 12:39:53 PM
Subject: RE: Outofmemory error for large files
Otis,
I haven't tried it yet, but what I meant is:
If we divide the content into multiple parts, then words will be split across
different SOLR documents. If the main document contains 'Hello World', then these
two words may end up in different documents, and searching for
'Hello world' won't give me the required search result unless I use OR in the
query.
Thanks,
Siddharth
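A minimal sketch of the chunking idea being discussed, assuming hypothetical
field names (id, file_id, text) and that the text can be cut at arbitrary
positions; a small overlap between consecutive chunks keeps a two-word phrase
such as 'Hello World' intact in at least one chunk even when it falls near a
boundary:

import java.util.ArrayList;
import java.util.List;

import org.apache.solr.client.solrj.SolrServer;
import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
import org.apache.solr.common.SolrInputDocument;

public class ChunkedIndexer {

    // Cut the text into fixed-size pieces with a small overlap, so a phrase
    // that straddles a cut still appears whole in at least one piece.
    static List<String> chunk(String text, int size, int overlap) {
        List<String> parts = new ArrayList<String>();
        int start = 0;
        while (start < text.length()) {
            int end = Math.min(text.length(), start + size);
            parts.add(text.substring(start, end));
            if (end == text.length()) {
                break;
            }
            start = end - overlap;
        }
        return parts;
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical Solr URL, for illustration only.
        SolrServer server = new CommonsHttpSolrServer("http://localhost:8983/solr");

        String content = "...";      // the large file's text, loaded elsewhere
        String fileId = "file-42";   // hypothetical identifier for the original file

        List<String> parts = chunk(content, 1000000, 1000);
        for (int i = 0; i < parts.size(); i++) {
            SolrInputDocument doc = new SolrInputDocument();
            doc.addField("id", fileId + "-" + i);  // unique key per chunk
            doc.addField("file_id", fileId);       // ties the chunks back to one file
            doc.addField("text", parts.get(i));    // assumed indexed text field
            server.add(doc);                       // each add() serializes only one small chunk
        }
        server.commit();
    }
}

At query time a phrase query for "Hello World" then matches the chunk that
contains both words, and results can be collapsed back to the original file via
the file_id field.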
-----Original Message-----
From: Otis Gospodnetic [mailto:otis_gospodne...@yahoo.com]
Sent: Tuesday, February 17, 2009 9:58 AM
To: solr-user@lucene.apache.org
Subject: Re: Outofmemory error for large files
Siddharth,
At the end of your email you said:
"One option I see is to break the file in chunks, but with this, I won't be
able to search with multiple words if they are distributed in different
documents."
Unless I'm missing something unusual about your application, I don't think the
above is necessarily the case.