could be to index a summary and point to the large doc
in the file system or database.
Cheers,
Jon
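A minimal sketch of the summary-plus-pointer idea Jon describes, assuming SolrJ 1.3 and made-up field names (summary and file_path are not from the thread):

  import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
  import org.apache.solr.common.SolrInputDocument;

  public class SummaryIndexer {
    public static void main(String[] args) throws Exception {
      CommonsHttpSolrServer server =
          new CommonsHttpSolrServer("http://localhost:8983/solr");
      SolrInputDocument doc = new SolrInputDocument();
      doc.addField("id", "doc-1");                            // hypothetical schema
      doc.addField("summary", "short extract of the large document");
      doc.addField("file_path", "/data/archive/bigdoc.txt");  // pointer, not content
      server.add(doc);   // only the small summary goes through Solr
      server.commit();
    }
  }

Search hits then return file_path, and the application fetches the full document from the file system or database.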
-----Original Message-----
From: David Thibault [mailto:[EMAIL PROTECTED]
Sent: Saturday, February 23, 2008 9:50 PM
To: solr-user@lucene.apache.org
Subject: Re: Indexing very large files.
Thanks. I'
hing strings
From: David Thibault [mailto:[EMAIL PROTECTED]
Sent: Thursday, February 21, 2008 7:58 PM
To: solr-user@lucene.apache.org
Subject: Re: Indexing very large files.
All,
A while back I was running into an issue with a Java heap out of memory
error while indexing large files. I figured out that was my own error due
to a misconfiguration of my Netbeans memory settings.
However, now that is fixed and I have stumbled upon a new error. When
trying to upload file
irection.
>
> Otis
> --
> Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
>
> ----- Original Message -----
> From: David Thibault <[EMAIL PROTECTED]>
> To: solr-user@lucene.apache.org
> Sent: Wednesday, January 16, 2008 1:31:23 PM
> Subject: Re: Indexing very large files.
Yonik,
I pulled SimplePostTool apart, pulled out the main() and the postFiles(), and
now use it directly in Java via postFile() -> postData(). It seems to work
OK. Maybe I should upgrade to v1.3 and try doing things directly through
Solrj. Is 1.3 stable yet? Might that be a better plan altogether?
From your stack trace, it looks like it's your client running out of
memory, right?
SimplePostTool was meant as a command-line replacement to curl to
remove that dependency, not as a recommended way to talk to Solr.
-Yonik
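A rough sketch of what a leaner client could do instead: stream the file to /update with chunked transfer encoding so the whole document is never held in client memory. The URL and the pre-built <add><doc> XML file are assumptions:

  import java.io.FileInputStream;
  import java.io.InputStream;
  import java.io.OutputStream;
  import java.net.HttpURLConnection;
  import java.net.URL;

  public class StreamingPost {
    public static void main(String[] args) throws Exception {
      URL url = new URL("http://localhost:8983/solr/update");
      HttpURLConnection conn = (HttpURLConnection) url.openConnection();
      conn.setRequestMethod("POST");
      conn.setDoOutput(true);
      conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
      conn.setChunkedStreamingMode(8192);            // stream; don't buffer the body
      InputStream in = new FileInputStream(args[0]); // an <add><doc>...</doc></add> file
      OutputStream out = conn.getOutputStream();
      byte[] buf = new byte[8192];
      for (int n; (n = in.read(buf)) != -1; ) {
        out.write(buf, 0, n);                        // copy 8KB at a time
      }
      out.close();
      in.close();
      System.out.println("HTTP " + conn.getResponseCode());
    }
  }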
On Jan 16, 2008 4:29 PM, David Thibault <[EMAIL PROTECTED]> wrote:
> OK,
le.com/reader/shared/16849249410805339619
-----Original Message-----
From: David Thibault [mailto:[EMAIL PROTECTED]
Sent: Thursday, 17 January 2008 8:30 AM
To: solr-user@lucene.apache.org
Subject: Re: Indexing very large files.
OK, I have now bumped my tomcat JVM up to 1024MB min and 1500MB max. For
some reason Walter's suggestion helped me get past the 8MB file upload to
Solr but it's still choking on a 32MB file. Is there a way to set
per-webapp JVM settings in Tomcat, or is setting the overall Tomcat JVM heap
the only option?
Walter and all,
I had been bumping up the heap for my Java app (running outside of Tomcat)
but I hadn't yet tried bumping up my Tomcat heap. That seems to have helped
me upload the 8MB file, but it's crashing while uploading a 32MB file now. I
just bumped Tomcat to 1024MB of heap, so I'm not sure
Nice signature...=)
This error means that the JVM has run out of heap space. Increase the
heap space. That is an option on the "java" command. I set my heap to
600 Meg and do it this way with Tomcat 6:
JAVA_OPTS="-Xmx600M" tomcat/bin/startup.sh
wunder
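To confirm a given JVM actually picked up the flag, Runtime reports the ceiling. A one-off check (the class name is made up):

  public class HeapCheck {
    public static void main(String[] args) {
      long max = Runtime.getRuntime().maxMemory(); // roughly the -Xmx ceiling
      System.out.println("max heap: " + (max / (1024 * 1024)) + " MB");
    }
  }

Run it with and without -Xmx to see the difference; inside Tomcat the same call can go in a scratch JSP.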
The PS really wasn't related to your OOM, and raising that shouldn't
have changed the behavior. All that happens if you go beyond 10,000
tokens is that the rest gets thrown away.
But we're beyond my real knowledge level about SOLR, so I'll defer
to others. A very quick-n-dirty test as to whether y
I tried raising the 10000 under <indexDefaults> as well as <mainIndex> in
solrconfig.xml and still no luck. I'm trying to
upload a text file that is about 8 MB in size. I think the following stack
trace still points to some sort of overflowed String issue. Thoughts?
Solr returned an error: Java heap space java.lang.OutOfMemoryError: Java heap space
I think your PS might do the trick. My JVM doesn't seem to be the issue,
because I've set it to -Xmx512m -Xms256m. I will track down the solr config
parameter you mentioned and try that. Thanks for the quick response!
Dave
On 1/16/08, Erick Erickson <[EMAIL PROTECTED]> wrote:
P.S. Lucene by default limits the maximum field length
to 10K tokens, so you have to bump that for large files.
Erick
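In Solr that limit is the <maxFieldLength> setting in solrconfig.xml; against raw Lucene it's a setter on IndexWriter. A sketch against the 2.x-era API (the RAMDirectory and analyzer choice are just placeholders):

  import org.apache.lucene.analysis.standard.StandardAnalyzer;
  import org.apache.lucene.index.IndexWriter;
  import org.apache.lucene.store.RAMDirectory;

  IndexWriter writer = new IndexWriter(new RAMDirectory(), new StandardAnalyzer(), true);
  writer.setMaxFieldLength(Integer.MAX_VALUE); // default keeps only the first 10,000 tokens
  // ... writer.addDocument(doc) for each large document ...
  writer.close();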
On Jan 16, 2008 11:04 AM, Erick Erickson <[EMAIL PROTECTED]> wrote:
I don't think this is a StringBuilder limitation, but rather your Java
JVM doesn't start with enough memory. i.e. -Xmx.
In raw Lucene, I've indexed 240M files
Best
Erick
On Jan 16, 2008 10:12 AM, David Thibault <[EMAIL PROTECTED]> wrote:
All,
I just found a thread about this on the mailing list archives because I'm
troubleshooting the same problem. The kicker is that it doesn't take such
large files to kill the StringBuilder. I have discovered the following:
By using a text file made up of 3,443,464 bytes or less, I get no errors.
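That threshold is about what capacity doubling predicts if the client reads the file into a growing StringBuilder: during a resize the old and the new char[] are both live at once. Back-of-the-envelope, assuming ASCII text (1 byte per char on disk, 2 bytes per char in memory):

  long chars = 3443464L;          // file size ~ char count for ASCII
  long oldArray = chars * 2;      // current backing char[] in bytes, ~6.9 MB
  long newArray = oldArray * 2;   // doubled replacement array, ~13.8 MB
  System.out.println((oldArray + newArray) / (1024 * 1024) + " MB live during one resize");

With a small default heap, one more doubling past that size is enough to tip over.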
Legal discovery can have requirements like this. --wunder
On 9/7/07 4:47 AM, "Brian Carmalt" <[EMAIL PROTECTED]> wrote:
Lance Norskog wrote:
> Now I'm curious: what is the use case for documents this large?
> Thanks,
> Lance Norskog
It is a fringe use case, but could become relevant for us. I was told to
explore the possibilities, and that's what I'm doing. :)
Since I haven't heard any suggestions as to how to
On 6-Sep-07, at 2:26 AM, Brian Carmalt wrote:
> Hallo again,
> I checked out the solr source and built the 1.3-dev version and
> then I tried to index the same file to the new server.
> I do get a different exception trace, but the result is the same.
Note that StringBuilder expands capacity by allocating a larger array and copying the old contents over.
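A small illustration of why that matters, and the usual workaround of sizing the builder up front (the 8 MB capacity is just an example):

  // grows by doubling: during each resize the old and new arrays coexist,
  // so peak usage can be roughly 3x the content size
  StringBuilder grows = new StringBuilder();               // default capacity 16
  // preallocated: one backing array, no resize copies at all
  StringBuilder sized = new StringBuilder(8 * 1024 * 1024);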
Now I'm curious: what is the use case for documents this large?
Thanks,
Lance Norskog
On Thu, 2007-09-06 at 11:26 +0200, Brian Carmalt wrote:
Hallo again,
I checked out the solr source and built the 1.3-dev version and then I
tried to index the same file to the new server.
I do get a different exception trace, but the result is the same.
java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:2882)
Moin Thorsten,
I am using Solr 1.2.0. I'll try the svn version out and see if that helps.
Thanks,
Brian
Which version of Solr do you use?
http://svn.apache.org/viewvc/lucene/solr/trunk/src/java/org/apache/solr/handler/XmlUpdateRequestHandler.java?view=markup
The trunk version of the XmlUpdateRequestHandler
On Thu, 2007-09-06 at 08:55 +0200, Brian Carmalt wrote:
Hello again,
I run Solr on Tomcat under windows and use the tomcat monitor to start
the service. I have set the minimum heap
size to be 512MB and the maximum to be 1024MB. The system has 2 gigs of
RAM. The error that I get after sending
approximately 300 MB is:
java.lang.OutOfMemoryError: Java heap space
On Wed, 05 Sep 2007 17:18:09 +0200
Brian Carmalt <[EMAIL PROTECTED]> wrote:
> I've been trying to index a 300MB file to Solr 1.2. I keep getting out of
> memory heap errors.
> Even on an empty index with one gig of VM memory it still won't work.
Hi Brian,
VM != heap memory.
VM = OS memory
heap memory = Java memory, set via -Xmx
On 9/5/07, Brian Carmalt <[EMAIL PROTECTED]> wrote:
> I've been trying to index a 300MB file to Solr 1.2. I keep getting out of
> memory heap errors.
300MB of what... a single 300MB document? Or does that file represent
multiple documents in XML or CSV format?
-Yonik
Hello all,
I will apologize up front if this comes twice.
I've been trying to index a 300MB file to Solr 1.2. I keep getting out of
memory heap errors.
Even on an empty index with one gig of VM memory it still won't work.
Is it even possible to get Solr to index such large files?
Do I need to