On Mar 15, 2012, at 6:39 AM, Husain, Yavar wrote:

Thanks a ton.
________________
From: Li Li [fancye...@gmail.com]
Sent: Thursday, March 15, 2012 12:11 PM
To: Husain, Yavar
Cc: solr-user@lucene.apache.org
Subject: Re: Solr out of memory exception

it seems you are using 64bit jvm (a 32bit jvm can only allocate about 1.5GB).
you should enable pointer compression ...
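Li Li's "pointer compression" is HotSpot's compressed-oops feature. A minimal sketch of enabling it explicitly, assuming a Tomcat-style JAVA_OPTS setup (the flag name is standard HotSpot; the heap sizes are illustrative, not from the thread):

```shell
# Compressed oops store object references as 4 bytes instead of 8 on a
# 64-bit JVM (usable for heaps up to roughly 32GB), which can shrink the
# heap footprint noticeably. Heap sizes below are illustrative.
JAVA_OPTS="$JAVA_OPTS -server -Xms2g -Xmx2g -XX:+UseCompressedOops"
```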
-
why should I enable pointer compression?

-- Original --
From: "Li Li";
Date: Thu, Mar 15, 2012 02:41 PM
To: "Husain, Yavar";
Cc: "solr-user@lucene.apache.org";
Subject: Re: Solr out of memory exception

it seems you are using 64bit jvm (a 32bit jvm can only allocate about 1.5GB) ...
... space allocated, but then how come on a
different server with exactly the same system and solr configuration & memory it is
working fine?

-----Original Message-----
From: Li Li [mailto:fancye...@gmail.com]
Sent: Thursday, March 15, 2012 11:11 AM
To: solr-user@lucene.apache.org
Subject: Re: Solr out of memory exception
how much memory is allocated to the JVM?
On Thu, Mar 15, 2012 at 1:27 PM, Husain, Yavar wrote:
> Solr is giving an out of memory exception. Full indexing was completed fine.
> Later, while searching, maybe when it tries to load the results in memory, it
> starts giving this exception. Though with the same ...
> ..." caused "SEVERE:
> java.lang.OutOfMemoryError: Java heap space", the solr still works, and i
> can continue searching other words.
>
> ------ Original Message ------
> From: "Daniel Brügge";
> Sent: Tuesday, March 6, 2012 6:35 PM
> To: ...
Maybe the index is too big and you need to add more memory to the JVM via
the -Xmx parameter. See also
http://wiki.apache.org/solr/SolrPerformanceFactors#OutOfMemoryErrors
Daniel
On Tue, Mar 6, 2012 at 10:01 AM, C.Yunqin <345804...@qq.com> wrote:
> sometimes when i search a simple word ,like "i
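As a sketch of Daniel's suggestion, assuming the example Jetty distribution (the start command is commented out; paths and sizes are illustrative):

```shell
# Illustrative only: start Solr's bundled Jetty with a 2GB max heap.
#   java -Xmx2g -jar start.jar
# Under Tomcat the same limit goes into JAVA_OPTS instead:
JAVA_OPTS="$JAVA_OPTS -Xmx2g"
```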
I commented the autocommit option and tried uploading the file (a smaller
file now 5 million records) and I hit an oom again:
Jun 17, 2011 2:32:59 PM org.apache.solr.common.SolrException log
SEVERE: java.lang.OutOfMemoryError: Java heap space
On Fri, Jun 17, 2011 at 1:30 AM, pravesh wrote:
> If you are sending whole CSV in a single HTTP request using curl, why not
> consider sending it in smaller chunks?
Smaller chunks should not matter - Solr streams from the input (i.e.
the whole thing is not buffered in memory).
It could be related to ...
I did that , but when I split them into 5 mill records, the first file went
through fine, when I started processing the second file SOLR hit an OOM
again:
org.apache.solr.common.SolrException log
SEVERE: java.lang.OutOfMemoryError: Java heap space
at
org.apache.lucene.index.FreqProxTermsWri
If you are sending whole CSV in a single HTTP request using curl, why not
consider sending it in smaller chunks?
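A hypothetical sketch of pravesh's suggestion: split the CSV into chunks (repeating the header on each) and post them one request at a time. The file names, chunk size, and Solr URL are assumptions, and a tiny sample file stands in for the real 20-million-row one:

```shell
# Build a tiny stand-in CSV; in practice this is the big export.
printf 'id,name\n1,a\n2,b\n3,c\n4,d\n' > big.csv

head -n 1 big.csv > header.csv             # keep the CSV header row
tail -n +2 big.csv | split -l 2 - chunk_   # 2 rows/chunk here; use e.g. -l 1000000 for real data

for c in chunk_*; do
  cat header.csv "$c" > part.csv
  # The actual upload would be (commented so the sketch runs offline):
  # curl 'http://localhost:8983/solr/update/csv?commit=true' \
  #      -H 'Content-type: text/csv; charset=utf-8' --data-binary @part.csv
  echo "would post $c ($(wc -l < part.csv) lines incl. header)"
done
```

Committing per chunk also keeps the uncommitted buffer bounded on the Solr side.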
--
View this message in context:
http://lucene.472066.n3.nabble.com/SOlR-Out-of-Memory-exception-tp3074636p3075091.html
Sent from the Solr - User mailing list archive at Nabble.com.
Yes Eric, after changing the lock type to Single, I got an OOM after loading
5.5 million records. I am using the curl command to upload the csv.
H, are you still getting your OOM after 7M records? Or some larger
number? And how are you using the CSV uploader?
Best
Erick
On Thu, Jun 16, 2011 at 9:14 PM, jyn7 wrote:
> We just started using SOLR. I am trying to load a single file with 20 million
> records into SOLR using the CSV uploader ...
I should also add that reducing the cache and autowarm sizes (or not using
them at all) drastically reduces memory consumption when a new searcher is
being prepared after a commit. The memory usage will spike at these events.
Again, use a monitoring tool to get more information on your specific setup.
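For illustration, those cache and autowarm settings live in solrconfig.xml; a sketch of dialing them down (element and class names are standard Solr; the values are illustrative, not a recommendation from this thread):

```xml
<!-- Smaller caches, and autowarmCount="0" so nothing is copied into the
     new searcher's caches during warm-up after a commit. -->
<filterCache      class="solr.LRUCache" size="512" initialSize="512" autowarmCount="0"/>
<queryResultCache class="solr.LRUCache" size="512" initialSize="512" autowarmCount="0"/>
<documentCache    class="solr.LRUCache" size="512" initialSize="512" autowarmCount="0"/>
```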
Bing Li,
One should be conservative when setting Xmx. Also, just setting Xmx might not
do the trick at all, because the garbage collector might also be the issue
here. Configure the JVM to output debug logs of the garbage collector and
monitor the heap usage (especially the tenured generation) ...
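A sketch of the GC logging Markus describes, using standard HotSpot (Java 6 era) flags; the log path is an assumption:

```shell
# Emit GC details and timestamps to a log file so heap behaviour
# (especially the tenured generation) can be inspected after the fact.
JAVA_OPTS="$JAVA_OPTS -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -Xloggc:/tmp/solr-gc.log"
```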
Dear Adam,
I also got the OutOfMemory exception. I changed the JAVA_OPTS in catalina.sh
as follows.
...
if [ -z "$LOGGING_MANAGER" ]; then
  JAVA_OPTS="$JAVA_OPTS -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager"
else
  JAVA_OPTS="$JAVA_OPTS -server -Xms8096m -Xmx80...
By adding more servers I mean adding more searchers (slaves) behind the load
balancer, not sharding.
Sharding is required when your index size grows beyond about 50GB.
-
Thanx:
Grijesh
Is anyone familiar with the environment variable, JAVA_OPTS? I set
mine to a much larger heap size and never had any of these issues
again.
JAVA_OPTS = -server -Xms4048m -Xmx4048m
Adam
On Wed, Jan 19, 2011 at 3:29 AM, Isan Fulia wrote:
> Hi all,
> By adding more servers do u mean sharding of index ...
Hi all,
By adding more servers do u mean sharding of the index? And after sharding,
how will my query performance be affected?
Will the query execution time increase?
Thanks,
Isan Fulia.
On 19 January 2011 12:52, Grijesh wrote:
>
> Hi Isan,
>
> It seems your index size 25GB is much more compared to your total RAM ...
Hi Isan,
It seems your index size of 25GB is much more compared to your total RAM
size of 4GB.
You have to do 2 things to avoid the Out Of Memory problem:
1- Buy more RAM, add at least 12 GB more.
2- Increase the memory allocated to solr by setting the XMX value; allocate
at least 12 GB to solr.
B...
Hi Grijesh, all,
We have only a single master and are using a multicore environment with the
sizes of the various indexes being 675MB, 516MB, 3GB, and 25GB.
The number of documents in the 3GB index is roughly around 14 lakhs (1.4
million) and in the 25GB index roughly around 7 lakhs (700,000).
Queries are fired very frequently.
ramBufferSize ...
On which server [master/slave] does the Out of Memory occur?
What is your index size [GB]?
How many documents do you have?
What is the query rate per second?
How are you indexing?
What is your ramBufferSize?
-
Thanx:
Grijesh
... free memory on the host, try using the -Xmx command line
parameter to raise the max amount of memory solr can use (-Xmx2g for example).
- Original Message
From: Isan Fulia
To: markus.jel...@openindex.io
Cc: solr-user@lucene.apache.org
Sent: Tue, January 18, 2011 9:04:31 PM
Subject: Re
Hi Markus,
We don't have any XMX memory settings as such. Our java version is 1.6.0_19
and our solr version is the 1.4 developer version. Can you please help us out?
Thanks,
Isan.
On 18 January 2011 19:54, Markus Jelsma wrote:
> Hi
>
> I haven't seen one like this before. Please provide JVM settings and Solr
Hi
I haven't seen one like this before. Please provide JVM settings and Solr
version.
Cheers
On Tuesday 18 January 2011 15:08:35 Isan Fulia wrote:
> Hi all,
> I got the following error on solr with m/c configuration 4GB RAM and
> Intel Dual Core Processor.Can you please help me out.
>
> jav