Hi,
I'm running Solr in Tomcat. I am trying to upgrade to Solr 4.4, but I
can't get it to work. Could someone point me at what I'm doing wrong?
tomcat context:
crossContext="true">
value="/opt/solr4.4/solr_address" override="true" />
core.properties:
name=address
collection=address
coreNod
Hi,
I have set up SolrCloud with Tomcat, using Solr 4.1.
I have ZooKeeper running on 192.168.1.10,
a Tomcat running solr_myidx on 192.168.1.10 on port 8080,
and a Tomcat running solr_myidx on 192.168.1.11 on port 8080.
My solr.xml is like this:
hostPort="8080" hostContext="solr_myidx" zkClientTi
Hi,
In the solr admin web interface, when looking at the statistics of a
collection (this page: http://{ip}:8080/{index}/#/collection1), there is
"Current" under Optimized.
What does it mean?
Thanks.
e method to POST solved the issue for me (maybe I was hitting a
> GET limit somewhere?).
>
> -Luis
>
>
> On Tue, Apr 16, 2013 at 7:38 AM, Marc des Garets wrote:
>
>> Did you find anything? I have the same problem but it's on update requests
>> only.
>>
>>
Did you find anything? I have the same problem but it's on update
requests only.
The error does indeed come from the SolrJ client; it is SolrJ logging this
error. There is nothing in Solr itself, and it performs the update correctly.
The documents being updated are fairly small and simple.
On 04/15/2013 07:
taken or merged the new config, then I would suggest making
> sure that the update log is not enabled (or make sure you do hard commits
> relatively frequently rather than only soft commits).
>
> -- Jack Krupansky
>
> -Original Message-
> From: Marc De
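For reference, the quoted advice about the update log and hard commit frequency corresponds to settings in solrconfig.xml. A sketch under Solr 4.x conventions (the 15-second interval is illustrative, not a recommendation):

```xml
<updateHandler class="solr.DirectUpdateHandler2">
  <!-- the transaction log; it grows until a hard commit truncates it -->
  <updateLog>
    <str name="dir">${solr.ulog.dir:}</str>
  </updateLog>
  <!-- hard commit periodically so the update log stays small; keep the
       current searcher so this does not invalidate caches -->
  <autoCommit>
    <maxTime>15000</maxTime>
    <openSearcher>false</openSearcher>
  </autoCommit>
</updateHandler>
```

Removing the `<updateLog>` element entirely disables the transaction log, which is the other option Jack mentions.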
> What is your index size, and what is your performance measured in
> queries per second?
>
> 2013/4/11 Marc Des Garets
>
>> Big heap because of a very large number of requests across more than 60
>> indexes and hundreds of millions of documents (all indexes together). My problem
>> is
same config.
On 04/10/2013 07:38 PM, Shawn Heisey wrote:
> On 4/10/2013 9:48 AM, Marc Des Garets wrote:
>> The JVM behavior is now radically different and doesn't seem to make
>> sense. I was using ConcMarkSweepGC. I am now trying the G1 collector.
>>
>> The perm gen wen
Hi,
I run multiple Solr indexes in a single Tomcat (one webapp per index). All
the indexes were Solr 3.5 and I have upgraded a few of them (about half) to
Solr 4.1.
The JVM behavior is now radically different and doesn't seem to make
sense. I was using ConcMarkSweepGC. I am now trying the G1 c
specify the individual
components directly, e.g. to get the equivalent of StandardAnalyzer, but
without the StopFilter:
Steve
> -Original Message-
> From: Marc Des Garets [mailto:marc.desgar...@192.com]
> Sent: Friday, September 09, 2011 6:21 AM
> To: sol
Hi,
I have a simple field defined like this:
Which I use here:
In solr 1.4, I could do:
?q=(middlename:a*)
And I was getting all documents where middlename = A or where middlename
starts with the letter A.
In Solr 3.3, I get only results where middlename starts with the lette
Hi,
I am doing a really simple query on my index (it's running in tomcat):
http://host:8080/solr_er_07_09/select/?q=hash_id:123456
I am getting the following exception:
HTTP Status 500 - null java.lang.IllegalArgumentException at
java.nio.Buffer.limit(Buffer.java:249) at
org.apache.lucene
Perfect. Thank you for your help.
-Original Message-
From: Shalin Shekhar Mangar [mailto:shalinman...@gmail.com]
Sent: 08 March 2010 12:57
To: solr-user@lucene.apache.org
Subject: Re: question about mergeFactor
On Mon, Mar 8, 2010 at 5:31 PM, Marc Des Garets
wrote:
>
> If I
Hello,
On the solr wiki, here:
http://wiki.apache.org/solr/SolrPerformanceFactors
It is written:
mergeFactor Tradeoffs
High value merge factor (e.g., 25):
Pro: Generally improves indexing speed
Con: Less frequent merges, resulting in a collection with more index
files which may slow
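The wiki passage quoted above maps to a setting in solrconfig.xml. A sketch (the value 25 mirrors the wiki's example, not a recommendation; in older Solr versions this lives under `<indexDefaults>` rather than `<indexConfig>`):

```xml
<indexConfig>
  <!-- higher values: faster indexing, more segment files per merge,
       potentially slower searches until segments are merged -->
  <mergeFactor>25</mergeFactor>
</indexConfig>
```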
Just curious, have you checked if the hanging you are experiencing is not
garbage collection related?
-Original Message-
From: Erick Erickson [mailto:erickerick...@gmail.com]
Sent: 13 January 2010 13:33
To: solr-user@lucene.apache.org
Subject: Re: Problem comitting on 40GB index
That's
ekhar Mangar [mailto:shalinman...@gmail.com]
Sent: 12 January 2010 07:49
To: solr-user@lucene.apache.org
Subject: Re: update solr index
On Mon, Jan 11, 2010 at 7:42 PM, Marc Des Garets
wrote:
>
> I am running solr in tomcat and I have about 35 indexes (between 2 and
> 80 millions documents
Hi,
I am running Solr in Tomcat and I have about 35 indexes (between 2 and
80 million documents each). Currently, if I try to update a few documents
in an index (let's say the one which contains 80 million documents)
while Tomcat is running and therefore receiving requests, I am getting
a few very
Subject: Re: very slow add/commit time
How many MB of cache have you set in your solrconfig.xml?
On Tue, Nov 3, 2009 at 12:24 PM, Marc Des Garets
wrote:
> Hi,
>
>
>
> I am experiencing a problem with an index of about 80 millions
documents
> (41Gb). I am trying to update docum
Hi,
I am experiencing a problem with an index of about 80 million documents
(41 GB). I am trying to update documents in this index using SolrJ.
When I do:
solrServer.add(docs); // docs is a List<SolrInputDocument> that contains
1000 documents (takes 36 sec)
solrServer.commit(false,false); // either ne
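As a side note on the add pattern above: splitting a large document list into fixed-size batches before each add() call is a common way to keep individual requests small. A minimal sketch of just the batching logic in plain Java (the SolrJ calls themselves are shown only as comments, so the helper stands alone):

```java
import java.util.ArrayList;
import java.util.List;

public class BatchSplitter {

    // Split a list into consecutive sub-lists of at most batchSize elements.
    static <T> List<List<T>> split(List<T> docs, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < docs.size(); i += batchSize) {
            batches.add(docs.subList(i, Math.min(i + batchSize, docs.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> docs = new ArrayList<>();
        for (int i = 0; i < 2500; i++) docs.add(i);

        List<List<Integer>> batches = split(docs, 1000);
        // 2500 docs in batches of 1000 -> 3 batches: 1000, 1000, 500
        System.out.println(batches.size());
        System.out.println(batches.get(2).size());
        // In real indexing code, each batch would be sent with
        // solrServer.add(batch); followed by a single commit at the end.
    }
}
```

Sending one commit after all batches, rather than one per batch, is usually the cheaper pattern.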