ter.
Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
----- Original Message ----
> From: Justin <[EMAIL PROTECTED]>
> To: solr-user@lucene.apache.org
> Sent: Monday, March 3, 2008 2:43:03 PM
> Subject: out of memory every time
>
> I'm indexing a large number of documents.
To: solr-user@lucene.apache.org
Subject: Re: out of memory every time
Just guessing, but I'd say it has something to do with the dynamic fields...
I ran a similar operation (docs ranged from 1K to 2MB). For the
initial indexing, I wrote a job to submit about 100,000 documents to
solr, committing after every 10 docs. I never sent any optimize
commands. I also used
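A minimal sketch of that kind of batch job follows (the update URL, the one-XML-file-per-doc directory layout, and the error handling are assumptions; the 10-doc commit interval and skipping <optimize/> match what's described above):

    import java.io.File;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.file.Files;

    // Batch-indexing sketch: POST each single-doc XML file to the stock
    // XML update handler, committing every 10 docs, never optimizing.
    public class BatchPost {
        static final String UPDATE_URL = "http://localhost:8983/solr/update"; // assumed

        public static void main(String[] args) throws Exception {
            int sent = 0;
            for (File doc : new File(args[0]).listFiles()) {
                post(Files.readAllBytes(doc.toPath()));
                if (++sent % 10 == 0) {
                    post("<commit/>".getBytes("UTF-8")); // commit every 10 docs
                }
            }
            post("<commit/>".getBytes("UTF-8")); // final commit; no <optimize/>
        }

        static void post(byte[] body) throws Exception {
            HttpURLConnection conn =
                (HttpURLConnection) new URL(UPDATE_URL).openConnection();
            conn.setDoOutput(true);
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "text/xml; charset=UTF-8");
            OutputStream out = conn.getOutputStream();
            out.write(body);
            out.close();
            if (conn.getResponseCode() != 200) {
                throw new RuntimeException("Solr returned " + conn.getResponseCode());
            }
            conn.getInputStream().close(); // drain so the connection can be reused
        }
    }

Posting one small request at a time like this keeps at most one document in client memory, so the client itself never becomes the bottleneck.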
On Mon, 2008-03-03 at 21:43 +0200, Justin wrote:
> I'm indexing a large number of documents.
>
> As a server I'm using the /solr/example/start.jar
>
> No matter how much memory I allocate it fails around 7200 documents.
How do you allocate the memory?
Something like:
java -Xms512M -Xmx1500M -jar start.jar
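A quick way to check that the flags actually take effect is to print the JVM's max heap (a trivial sketch, not from the thread):

    public class HeapCheck {
        public static void main(String[] args) {
            // maxMemory() reports (approximately) the -Xmx ceiling
            System.out.println("max heap: "
                + Runtime.getRuntime().maxMemory() / (1024 * 1024) + " MB");
        }
    }

Run it with the same flags (java -Xms512M -Xmx1500M HeapCheck). Note that the flags have to come before -jar start.jar; anything after -jar is passed to the application as arguments, and the JVM silently falls back to its default heap.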
I'm indexing a large number of documents.
As a server I'm using the /solr/example/start.jar
No matter how much memory I allocate it fails around 7200 documents.
I am committing every 100 docs, and optimizing every 300.
All of my XML files contain one doc each, and range in size from 2k to 700k.
when