From: Erick Erickson
Sent: 09 March 2020 21:13
To: solr-user@lucene.apache.org
Subject: Re: OutOfMemory error solr 8.4.1
I’m 99% certain that something in your custom jar is the culprit; otherwise
we’d have seen a _lot_ of these. TIMED_WAITING is usually just a listener
thread, but they shouldn’t be accumulating.
> java.lang.Thread.sleep(Native Method)
> org.apache.http.impl.client.IdleConnectionEvictor$1.run(IdleConnectionEvictor.java:66)
> java.lang.Thread.run(Thread.java:748)
>
> Thanks and Regards,
> Srinivas Kashyap
>
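For context, the IdleConnectionEvictor frames in the trace above belong to
Apache HttpClient: every client built with idle-connection eviction enabled
owns one background evictor thread that spends its life in TIMED_WAITING
inside Thread.sleep(). If scheduler code builds a new client on every run
and never closes it, those threads accumulate until thread creation fails.
A minimal sketch of the safe pattern, assuming SolrJ 8.x and a made-up core
URL:

    import org.apache.solr.client.solrj.SolrQuery;
    import org.apache.solr.client.solrj.impl.HttpSolrClient;

    public class SharedClientExample {
        // One client for the whole JVM; it owns its HttpClient (and any
        // evictor thread) until close() is called.
        private static final HttpSolrClient CLIENT =
            new HttpSolrClient.Builder(
                "http://localhost:8983/solr/mycore") // hypothetical URL
                .build();

        public static void main(String[] args) throws Exception {
            try {
                System.out.println(CLIENT.query(new SolrQuery("*:*"))
                    .getResults().getNumFound());
            } finally {
                // Releases the connection pool and its background thread(s).
                CLIENT.close();
            }
        }
    }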
> -----Original Message-----
> From: Erick Erickson
> Sent: 06 March 2020
> To: solr-user@lucene.apache.org
> Subject: Re: OutOfMemory error solr 8.4.1
I assume you recompiled the jar file? Re-using the same one compiled against 5x
is unsupported; nobody will be able to help until you recompile.

Once you’ve done that, if you still have the problem you need to take a thread
dump to see if your custom code is leaking threads; that’s my number one
suspect.
Hi Erick,

We have custom code, schedulers that run delta imports on our cores, and I
have added that custom code as a jar placed in
server/solr-webapp/WEB-INF/lib. Basically we fetch the JNDI datasource
configured in jetty.xml (Oracle) and create connection objects.
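None of the scheduler code appears in the thread, so the following is only
an illustrative sketch of the pattern Erick hints at below: reuse one
ScheduledExecutorService for all delta imports instead of spawning a new
thread (or executor) per run, and close JDBC connections deterministically.

    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    public class DeltaImportScheduler {
        // A single shared pool; creating a fresh executor or Thread on
        // every trigger is the classic way to exhaust native thread stacks.
        private static final ScheduledExecutorService POOL =
            Executors.newScheduledThreadPool(1);

        public static void start() {
            POOL.scheduleAtFixedRate(DeltaImportScheduler::runDeltaImport,
                                     0, 10, TimeUnit.MINUTES);
        }

        private static void runDeltaImport() {
            // Hypothetical body: look up the JNDI datasource, open a
            // connection, run the delta query, and always close the
            // connection (try-with-resources) so neither connections
            // nor threads leak.
        }

        public static void shutdown() {
            POOL.shutdown(); // stop the scheduler when the core unloads
        }
    }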
This one can be a bit tricky. You’re not running out of overall memory, but you
are running out of memory to allocate thread stacks. Which implies that, for
some reason, you are creating a zillion threads. Do you have any custom code?
You can take a thread dump and see what your threads are doing.
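jstack <pid> (or kill -3 on the Solr process) produces the dump Erick is
describing. The same census is available in-process through the JDK's
ThreadMXBean; a small sketch:

    import java.lang.management.ManagementFactory;
    import java.lang.management.ThreadInfo;
    import java.lang.management.ThreadMXBean;

    public class ThreadCensus {
        public static void main(String[] args) {
            ThreadMXBean mx = ManagementFactory.getThreadMXBean();
            System.out.println("live threads: " + mx.getThreadCount());
            // Print each thread's state and top frame to spot who is
            // mass-producing threads.
            for (ThreadInfo info : mx.dumpAllThreads(false, false)) {
                StackTraceElement[] st = info.getStackTrace();
                System.out.printf("%s (%s)%s%n",
                    info.getThreadName(), info.getThreadState(),
                    st.length > 0 ? " at " + st[0] : "");
            }
        }
    }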
-----Original Message-----
From: Constantijn Visinescu [mailto:baeli...@gmail.com]
Sent: Wednesday, September 09, 2009 11:35 PM
To: solr-user@lucene.apache.org
Subject: Re: OutOfMemory error on solr 1.3

Just wondering, how much memory are you giving your JVM?

On Thu, Sep 10, 2009 at 7:46 AM, Francis Yakin wrote:
>
> I am having an OutOfMemory error on our slave servers. I would like to
> know if someone has the same issue and has a solution for this.
>
> SEVERE: Error during auto-warming of
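A trivial way to answer Constantijn's question from inside the JVM itself
(a sketch, not part of the original thread):

    public class HeapCheck {
        public static void main(String[] args) {
            // maxMemory() reflects -Xmx; if no -Xmx was passed, the
            // JVM default of that era was often surprisingly small.
            long max = Runtime.getRuntime().maxMemory();
            long total = Runtime.getRuntime().totalMemory();
            long free = Runtime.getRuntime().freeMemory();
            System.out.printf("max=%dMB committed=%dMB used=%dMB%n",
                max >> 20, total >> 20, (total - free) >> 20);
        }
    }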
https://issues.apache.org/jira/browse/SOLR-291
Thank you,
Koji
Regards,
Francis
-----Original Message-----
From: Koji Sekiguchi [mailto:k...@r.email.ne.jp]
Sent: Wednesday, June 17, 2009 8:28 PM
To: solr-user@lucene.apache.org
Subject: Re: OutOfMemory error on solr slaves

Francis Yakin wrote:
> We are experiencing "OutOfMemory" errors frequently on our slaves; this is
> the error:
>
> SEVERE: Error during auto-warming of
> key:org.apache.solr.search.queryresult...@a8c6f867:java.lang.OutOfMemoryError:
> allocLargeObjectOrArray - Object size: 5120080, Num elements: 1280015
----- Original Message -----
> From: Francis Yakin
> To: "solr-user@lucene.apache.org"
> Sent: Tuesday, May 5, 2009 1:50:07 PM
> Subject: RE: OutOfMemory error
>
> Here is the cache section of solrconfig.xml:
>
> [cache definitions stripped by the mail archive]
To: solr-user@lucene.apache.org
Subject: Re: OutOfMemory error

Hi Francis,

How big are your caches? Please paste the relevant part of the config.
Which of your fields do you sort by? Paste definitions of those fields from
schema.xml, too.

Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
I'm guessing (and it's only a guess) that you have some field
that's a datestamp and that you're sorting on it in your warmup
queries. If so, there are possibilities.

It would help a lot if you'd tell us more about the structure of
your index and what your autowarm queries look like; otherwise
we're just guessing.
On Tue, Feb 17, 2009 at 1:10 PM, Otis Gospodnetic
<otis_gospodne...@yahoo.com> wrote:
> Right. But I was trying to point out that a single 150MB Document is not
> in fact what the o.p. wants to do. For example, if your 150MB represents,
> say, a whole book, should that really be a single document?

--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
From: Shalin Shekhar Mangar
To: solr-user@lucene.apache.org
Sent: Tuesday, February 17, 2009 2:48:08 PM
Subject: Re: Outofmemory error for large files
On Tue, Feb 17, 2009 at 10:26 AM, Otis Gospodnetic
<otis_gospodne...@yahoo.com> wrote:
> Siddharth,
>
> But does your 150MB file represent a single Document? That doesn't sound
> right.
>
Otis, SolrJ writes the whole XML in memory before writing it to the server.
That may be one reason behind Siddharth's problem.
Sent: Tuesday, February 17, 2009 12:39:53 PM
Subject: RE: Outofmemory error for large files

Otis,

I haven't tried it yet, but what I meant is: if we divide the content into
multiple parts, then words will be split across different Solr documents.
If the main document contains 'Hello World', those words may end up in
different documents, and searching for 'Hello world' won't give me the
required search result unless I use OR in the query.
Thanks,
Siddharth
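The thread leaves this unresolved. One common workaround, not proposed by
anyone above and offered here only as a sketch, is to overlap chunk
boundaries by a few words so a phrase that straddles a split still appears
intact in at least one chunk document:

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;

    public class OverlappingChunker {

        // Split text into chunks of chunkWords words, each overlapping
        // the previous chunk by overlapWords (assumes chunkWords >
        // overlapWords), so a phrase crossing a boundary survives whole
        // in at least one chunk.
        static List<String> chunk(String text, int chunkWords,
                                  int overlapWords) {
            String[] words = text.split("\\s+");
            List<String> chunks = new ArrayList<>();
            int step = chunkWords - overlapWords;
            for (int start = 0; start < words.length; start += step) {
                int end = Math.min(start + chunkWords, words.length);
                chunks.add(String.join(" ",
                    Arrays.asList(words).subList(start, end)));
                if (end == words.length) {
                    break;
                }
            }
            return chunks;
        }

        public static void main(String[] args) {
            // "Hello World" survives in at least one chunk despite the
            // split into 4-word documents.
            for (String c : chunk("a b Hello World c d e f", 4, 2)) {
                System.out.println(c);
            }
        }
    }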
-----Original Message-----
From: Otis Gospodnetic [mailto:otis_gospodne...@yahoo.com]
Sent: Tuesday, February 17, 2009 9:58 AM
To: solr-user@lucene.apache.org
Subject: Re: Outofmemory error for large files

Siddharth,

At the end of your email you said:

"One option I see is to break the file in chunks, but with this, I won't be
able to search with multiple words if they are distributed in different
documents."

Unless I'm missing something unusual about your application, I don't think
the above is really the case.
: nearly 100 percent and no queries were answered. I found out that
: "warming" the server with serial queries, not parallel ones, bypassed
: this problem (not to be confused with warming the caches!). So after a

Note that you can have Solr do this automatically for you in both the
firstSearcher and newSearcher event listeners in solrconfig.xml.
Hi,
Chris Hostetter wrote:
This is a fairly typical Lucene issue (ie: not specific to Solr)...
Ah, I see. I should really pay more attention to Lucene. But when
working with Solr I sometimes forget about the underlying technology.
On 6/14/06, Chris Hostetter <[EMAIL PROTECTED]> wrote:
Off the top of my head, I don't remember if omitting norms for fields
reduces the amount of resident memory needed by the index.
It does indeed. 1 byte per document for the indexed field.
-Yonik
This is a fairly typical Lucene issue (ie: not specific to Solr)...
Sorting on a field requires building a FieldCache for every document --
regardless of how many documents match your query. This cache is reused
for all searches that sort on that field.

For things like Integers and Floats, the cache needs one entry, four bytes,
per document in the index.
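A back-of-the-envelope estimate of what that costs, with an assumed index
size (illustration only):

    public class FieldCacheEstimate {
        public static void main(String[] args) {
            long maxDoc = 20_000_000L; // assumed number of documents
            // An int/float FieldCache entry is 4 bytes per document,
            // whether or not the document matches any query.
            long perSortField = maxDoc * 4;
            // Norms, when not omitted, cost 1 byte per document per
            // indexed field (per Yonik's note above).
            long normsPerField = maxDoc;
            System.out.printf(
                "int sort field: %d MB, norms per field: %d MB%n",
                perSortField >> 20, normsPerField >> 20);
        }
    }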