Hello again,
I finally managed to add/update a Solr single core by using the Perl CPAN Solr module by
Timothy Garafola. But I am unable to actually update or add anything in a
multicore environment!
I was wondering if I am doing something incorrectly or if there is an issue
at this point. Should I be editing
I have 8 GB of RAM and 4 CPUs at 2.4 GHz each. Is this the information
you were looking for?
Otis Gospodnetic wrote:
>
> You'll need to provide more information about your environment and index
> if you want guesstimates.
>
> Otis
> --
> Sematext -- http://sematext.com/ -- Lucene - Solr -
Hi. We are experimenting with installing Tomcat 5.5 from Red Hat
Repositories.
Tomcat 5.5, Java 1.5, Solr 1.2, and RHEL 4
When I try to access solr, the following error occurs:
SEVERE: Exception starting filter SolrRequestFilter
java.lang.UnsupportedClassVersionError: unsupported classversio
You're running an older JVM than what was used to compile the code.
-Yonik
On Tue, Apr 8, 2008 at 1:00 PM, Richard Lichlyter-Klein
<[EMAIL PROTECTED]> wrote:
>
> Hi. We are experimenting with installing Tomcat 5.5 from Red Hat
> Repositories.
>
> Tomcat 5.5, Java 1.5, Solr 1.2, and RHEL 4
>
>
Ok. Just to give some feedback.
I reindexed with less precision as you told me and it's working really fast.
Thanks for your help!
Jonathan
On Fri, Apr 4, 2008 at 6:02 PM, Chris Hostetter <[EMAIL PROTECTED]>
wrote:
>
> : Looking into the code it seems like a Lucene problem, more than Solr. It
>
Thanks.
Yonik Seeley wrote:
>
> You're running an older JVM than what was used to compile the code.
>
> -Yonik
>
> On Tue, Apr 8, 2008 at 1:00 PM, Richard Lichlyter-Klein
> <[EMAIL PROTECTED]> wrote:
>>
>> Hi. We are experimenting with installing Tomcat 5.5 from Red Hat
>> Repositories.
>>
From the client side, multicore should behave exactly the same as
multiple single-core servers running next to each other.
I'm not familiar with the Perl client, but it will need to be
configured for each core -- rather than one client that talks to
multiple cores.
while you install solr at:
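To make that concrete, here is a rough sketch of the idea (Python rather than Perl, untested; the core names and the standard multicore URL layout below are assumptions about your setup):

    import urllib.request

    # Hypothetical sketch: each core gets its own base URL, and the client
    # is pointed at one core at a time, exactly as if they were separate
    # Solr servers.
    CORES = {
        "core0": "http://localhost:8983/solr/core0",
        "core1": "http://localhost:8983/solr/core1",
    }

    def update(core, xml):
        # POST an update command (add/commit/delete) to the given core.
        req = urllib.request.Request(
            CORES[core] + "/update",
            data=xml.encode("utf-8"),
            headers={"Content-Type": "text/xml; charset=utf-8"},
        )
        return urllib.request.urlopen(req).read()

    update("core0", "<add><doc><field name='id'>1</field></doc></add>")
    update("core0", "<commit/>")   # docs are not searchable until committed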
I just was wondering, has anybody dealt with trying to "translate" the data
from a big, legacy DB schema to a Solr installation? What I mean is, our
company has (drawn from a big data warehouse) a series of 6 tables A, B, C, D,
E, and F of product information that we've currently been making se
Hi,
I am trying to search through a distributed index. When I enter this
link:
http://wil1devsch1.cs.tmcs:8983/select?shards=wil1devsch1.cs.tmcs:8983,wil1devsch1.cs.tmcs:8080&q=pizza
it always gives me results from the index stored on 8983 and not on
8080.
Is there anything wrong in what
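For comparison, the usual form of a distributed request looks like the sketch below (untested; note that each shard entry normally includes the webapp path, e.g. host:port/solr, and the request itself goes to /solr/select -- whether that matches your deployment is an assumption):

    import urllib.parse, urllib.request

    # Hypothetical sketch of a distributed query across two shards.
    params = urllib.parse.urlencode({
        "shards": "wil1devsch1.cs.tmcs:8983/solr,wil1devsch1.cs.tmcs:8080/solr",
        "q": "pizza",
    })
    url = "http://wil1devsch1.cs.tmcs:8983/solr/select?" + params
    print(urllib.request.urlopen(url).read().decode("utf-8"))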
Hello everyone. I downloaded the latest nightly build from
http://people.apache.org/builds/lucene/solr/nightly/. When I tried to
compile it, I got the following errors:
[javac] Compiling 189 source files to
/home/csweb/apache-solr-nightly/build/core
[javac]
/home/csweb/apache-solr-nightly/src
I'm testing Solr search performance using LoadRunner.
The index contains 5M+ docs, about 10.7 GB large.
CPU: 3.2 GHz x 2, RAM: 16 GB
The results are disappointing:
max: 19s, min: 1.5s, avg: 11.7s
But the QTime is around 1s
(simple query without facet or MLT, just fetching the top 50 IDs),
so it seems that XMLWriter is the
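To make the QTime-versus-wall-clock distinction concrete, here is a small sketch (untested; the host, the query, and the ID field name are placeholders):

    import json, time, urllib.parse, urllib.request

    # QTime only measures the search on the server; wall-clock time also
    # includes writing the stored fields and sending them over the network.
    params = urllib.parse.urlencode({
        "q": "pizza",      # placeholder query
        "rows": 50,
        "fl": "ID",
        "wt": "json",      # switch between "xml" and "json" to compare writers
    })
    url = "http://localhost:8983/solr/select?" + params

    start = time.time()
    body = urllib.request.urlopen(url).read()
    wall = time.time() - start

    qtime_ms = json.loads(body.decode("utf-8"))["responseHeader"]["QTime"]
    print("wall-clock %.3fs, QTime %dms, %d bytes" % (wall, qtime_ms, len(body)))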
On 08/04/2008, at 23:13, 李银松 wrote:
I'm testing Solr search performance using LoadRunner.
The index contains 5M+ docs, about 10.7 GB large.
CPU: 3.2 GHz x 2, RAM: 16 GB
The results are disappointing:
max: 19s, min: 1.5s, avg: 11.7s
But the QTime is around 1s
(simple query without facet or MLT, just fetching the top 50
limiting the results? How?
I had set rows=50 and fl=ID.
Is that what you meant by limiting the results?
When I switch to writing JSON data, the result is better:
max: 14.8s, min: 0.1s, avg: 8.7s
Correction: QTime is around 1-4s, not 1s.
2008/4/9, Leonardo Santagada <[EMAIL PROTECTED]>:
>
>
> On 08/04/2008, at
Most of the time seems to be spent by the writer fetching and writing the docs.
Can those docs be prefetched?
2008/4/9, Leonardo Santagada <[EMAIL PROTECTED]>:
>
>
> On 08/04/2008, at 23:13, 李银松 wrote:
>
> > I'm testing solr search performance using LoadRunner
> > the index contains 5M+ docs , about 10.7G
800MB does not seem that big. Since all of your 6 tables have product
information it should not be very difficult to join them together and import
them into one Solr index. Again, all of this depends on what you're
searching on and what you want to display as results.
Have you taken a look at http
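Just to illustrate the "join and import" idea, a very rough sketch (untested; the tables, columns, and Solr field names are made up and must match your actual warehouse and schema.xml):

    import sqlite3, urllib.request
    from xml.sax.saxutils import escape

    # Hypothetical: flatten each joined product row into one Solr document.
    conn = sqlite3.connect("warehouse.db")   # stand-in for the real warehouse
    rows = conn.execute("""
        SELECT a.product_id, a.name, b.description, c.price
          FROM table_a a
          JOIN table_b b ON b.product_id = a.product_id
          JOIN table_c c ON c.product_id = a.product_id
    """)

    docs = []
    for product_id, name, description, price in rows:
        docs.append(
            "<doc>"
            "<field name='id'>%s</field>"
            "<field name='name'>%s</field>"
            "<field name='description'>%s</field>"
            "<field name='price'>%s</field>"
            "</doc>" % (product_id, escape(name), escape(description), price)
        )

    def post(xml):
        # POST to the update handler; commit at the end to make docs visible.
        req = urllib.request.Request(
            "http://localhost:8983/solr/update",
            data=xml.encode("utf-8"),
            headers={"Content-Type": "text/xml; charset=utf-8"},
        )
        return urllib.request.urlopen(req).read()

    post("<add>%s</add>" % "".join(docs))
    post("<commit/>")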
On 09/04/2008, at 00:24, 李银松 wrote:
Most of the time seems to be spent by the writer fetching and writing the
docs.
Can those docs be prefetched?
There is a cache in Solr... if you really want, you could make the
cache and the JVM heap as big as your memory; it should probably fit most of
the 10 GB i
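(The cache referred to here is presumably the documentCache configured in solrconfig.xml; a sketch only, with placeholder sizes rather than recommendations:)

    <!-- solrconfig.xml: caches the stored fields of recently fetched documents -->
    <documentCache
        class="solr.LRUCache"
        size="16384"
        initialSize="4096"/>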
I will try what you suggested!
Thanks a lot~
On 2008-4-9, Leonardo Santagada <[EMAIL PROTECTED]> wrote:
>
>
> On 09/04/2008, at 00:24, 李银松 wrote:
>
> > most of time seems to be used for the writer getting and writing the
> > docs
> > can those docs prefetched?
> >
>
>
> There is a cache on solr... if you
On Tue, Apr 8, 2008 at 8:56 PM, swarag <[EMAIL PROTECTED]> wrote:
> I am trying to search through a distributed index and when I enter this
> link:
>http://wil1devsch1.cs.tmcs:8983/select?shards=wil1devsch1.cs.tmcs:8983,wil1devsch1.cs.tmcs:8080&q=pizza
> But it always gives me results from the i
: most of time seems to be used for the writer getting and writing the docs
: can those docs prefetched?
as mentioned, the documentCache can help you out in the common case, but
1-4 seconds for just the XMLWriting seems pretty high ...
1) how are you timing this (ie: what exactly are you measur
: I just was wondering, has anybody dealt with trying to "translate" the
: data from a big, legacy DB schema to a Solr installation? What I mean
there's really no general answer to that question -- it all comes down to
what you want to query on, and what kinds of results you want to get
out.
We are using the Chain Collapse patch as well. Will that not work over a
distributed index?
swarag wrote:
>
> Hi,
> I am trying to search through a distributed index and when I enter this
> link:
>
> http://wil1devsch1.cs.tmcs:8983/select?shards=wil1devsch1.cs.tmcs:8983,wil1devsch1.cs.tmcs:8080
Hi all,
I want to limit the search results by 2 numerical fields, A and B, where Solr
returns a result only when the value in field A or B is non-zero. Is this
possible, or do I need to change the document and schema? Or do I need to
change the schema as well as the query?
Thank you,
Vinci
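One common way to express that (a sketch only, assuming A and B are indexed with a sortable numeric type such as sint and are never negative; adjust to your schema) is a filter query alongside the main query:

    q=<your query>
    fq=A:[1 TO *] OR B:[1 TO *]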