Hello everyone,
I have implemented facet search with search across all fields, and it's working
fine with the following URL:
http://localhost:8983/solr/select?q=cd&facet=true&facet.field=media&facet.limit=-1&start=0&rows=15&facet.mincount=1&qt=LM
Query: cd
But when I tried it with a query that contains AND, it
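For comparison, here is a rough, untested sketch of the same faceted request issued
through SolrJ instead of a raw URL, with a boolean query passed as q. The second term
"dvd" and the core name "core1" are made up for illustration, and the client class is
from a recent SolrJ (the SolrJ of this era used CommonsHttpSolrServer instead):

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.response.QueryResponse;

public class FacetAndQueryExample {
    public static void main(String[] args) throws Exception {
        // Hypothetical core name; the original single-core URL has no core segment.
        HttpSolrClient client =
                new HttpSolrClient.Builder("http://localhost:8983/solr/core1").build();

        SolrQuery query = new SolrQuery();
        query.setQuery("cd AND dvd");   // SolrJ URL-encodes the query string for you
        query.setFacet(true);
        query.addFacetField("media");
        query.setFacetLimit(-1);
        query.setFacetMinCount(1);
        query.setStart(0);
        query.setRows(15);
        query.set("qt", "LM");          // mirrors the qt=LM custom handler in the URL above

        QueryResponse response = client.query(query);
        System.out.println(response.getFacetField("media").getValues());
        client.close();
    }
}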
Is there a way to use RAMDirectory with SOLR? If you can point me to
documentation, that would be great.
Thanks,
S
The XppUpdateRequestHandler was removed this afternoon... make sure your
solrconfig.xml does not include:
class="solr.XppUpdateRequestHandler" />
holler if you have problems!
ryan
Matthew Runo wrote:
> Hello!
> I'm having a horrible time getting the current svn head build of Solr to
> run.
Hello!
I'm having a horrible time getting the current svn head build of Solr
to run. I even remembered to do an 'ant clean' this time... but no luck.
I have set up solr_home via a JAVA_OPTS flag, and am using Tomcat 6...
[EMAIL PROTECTED]:/opt/tomcat]$ echo $JAVA_OPTS
-Dsolr.solr.home=/opt/so
As a reference:
I have several million records with about 20 fields each. One of them is
100 bytes to 1 KB, and the rest are 20-50 bytes. There is a reliable 5%
performance difference between pulling just the unique key field and
pulling all of the fields.
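If it helps to reproduce that comparison, here is an untested SolrJ sketch that runs
the same query twice, once with fl restricted to the unique key and once with all
stored fields. The core name "core1", the key field "id" and the row count are
assumptions, and the client class is from a recent SolrJ rather than the one current
at the time:

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrClient;

public class FieldListComparison {
    public static void main(String[] args) throws Exception {
        HttpSolrClient client =
                new HttpSolrClient.Builder("http://localhost:8983/solr/core1").build();

        // Pull only the unique key field (assumed here to be "id").
        SolrQuery keyOnly = new SolrQuery("*:*");
        keyOnly.setFields("id");
        keyOnly.setRows(1000);

        // Pull every stored field for the same documents.
        SolrQuery allFields = new SolrQuery("*:*");
        allFields.setFields("*");
        allFields.setRows(1000);

        long t0 = System.nanoTime();
        client.query(keyOnly);
        long keyOnlyMs = (System.nanoTime() - t0) / 1000000;

        long t1 = System.nanoTime();
        client.query(allFields);
        long allFieldsMs = (System.nanoTime() - t1) / 1000000;

        System.out.println("id only: " + keyOnlyMs + " ms, all fields: " + allFieldsMs + " ms");
        client.close();
    }
}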
-Original Message-
From: Geert-Jan B
Setting maxBufferedDocs to something smaller (say, 300) might be a
better way of limiting your memory usage. I have difficulties with
the odd huge document when using the default maxBufferedDocs=1000 (in
the next Solr version, there should be an option to limit indexing
based on memory us
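(For context: in Solr this is the <maxBufferedDocs> setting in solrconfig.xml. The
rough, untested sketch below just shows the equivalent knob on the underlying Lucene
IndexWriter, using the Lucene 2.x-era API of that time; the index path and field are
made up.)

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.store.FSDirectory;

public class BufferedDocsSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical index path; Solr keeps its index under the data dir.
        IndexWriter writer = new IndexWriter(
                FSDirectory.getDirectory("/tmp/index"), new StandardAnalyzer(), true);

        // Flush after every 300 buffered documents rather than the 1000 in the
        // example config, which keeps the in-memory buffer (and the odd huge
        // document) smaller.
        writer.setMaxBufferedDocs(300);

        Document doc = new Document();
        doc.add(new Field("id", "1", Field.Store.YES, Field.Index.UN_TOKENIZED));
        writer.addDocument(doc);

        writer.close();
    }
}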
yeah, that makes sense.
so, all in all, could scanning all the fields and loading the 10 fields add
up to cost about the same as, or even more than, performing the initial query?
(Just making sure)
I am wondering if the following change to the schema would help in this
case:
current setup:
It's possible t
On Dec 27, 2007 11:01 AM, Britske <[EMAIL PROTECTED]> wrote:
> after inspecting solrconfig.xml I see that I already have enabled lazy field
> loading by:
> <enableLazyFieldLoading>true</enableLazyFieldLoading> (I guess it was
> enabled by default)
>
> Since any query returns about 10 fields (which differ from query to query),
> would this mean t
after inspecting solrconfig.xml I see that I already have enabled lazy field
loading by:
<enableLazyFieldLoading>true</enableLazyFieldLoading> (I guess it was
enabled by default)
Since any query returns about 10 fields (which differ from query to query),
would this mean that only these 10 of about 2000-4000 fields are retrieved /
loaded?
T
From a Lucene perspective, it's certainly possible to do lazy field
loading. That is, when loading a document you can determine at
run time what fields to load, even on a per-document basis. I'm
not entirely sure how to accomplish this in Solr, but I'd give
long odds that there's a way.
I did
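(For anyone curious what that looks like at the Lucene level: the per-document,
load-time selection goes through the FieldSelector API of Lucene 2.x/3.x, which was
later removed in Lucene 4. A minimal, untested sketch, with made-up field names and
index path:)

import java.util.Collections;
import java.util.HashSet;
import java.util.Set;

import org.apache.lucene.document.Document;
import org.apache.lucene.document.FieldSelector;
import org.apache.lucene.document.SetBasedFieldSelector;
import org.apache.lucene.index.IndexReader;
import org.apache.lucene.store.FSDirectory;

public class LazyFieldLoadingSketch {
    public static void main(String[] args) throws Exception {
        IndexReader reader = IndexReader.open(FSDirectory.getDirectory("/tmp/index"));

        // Eagerly load only the fields this particular query needs;
        // fields not named in either set are skipped entirely.
        Set<String> wanted = new HashSet<String>();
        wanted.add("id");
        wanted.add("price");
        FieldSelector selector =
                new SetBasedFieldSelector(wanted, Collections.<String>emptySet());

        // The selector is applied per document, at load time.
        Document doc = reader.document(0, selector);
        System.out.println(doc.get("id"));

        reader.close();
    }
}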
Yonik Seeley wrote:
>
> On Dec 27, 2007 9:45 AM, Britske <[EMAIL PROTECTED]> wrote:
>> I am using SolrJ to communicate with SOLR. My Solr-queries perform within
>> range (between 50 ms and 300 ms), judging by the Solr log output on
>> my (Windows) command line.
>>
>> However I discover
On Dec 27, 2007 9:45 AM, Britske <[EMAIL PROTECTED]> wrote:
> I am using SolrJ to communicate with SOLR. My Solr-queries perform within
> range (between 50 ms and 300 ms), judging by the Solr log output on
> my (Windows) command line.
>
> However I discovered that the following command at all
Hi,
I am using SolrJ to communicate with SOLR. My Solr-queries perform within
range (between 50 ms and 300 ms), judging by the Solr log output on
my (Windows) command line.
However I discovered that the following command at all times takes
significantly longer than the number outputted i
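(If the number in the log is QTime, the gap is usually the work that happens after
the search itself: retrieving stored fields, writing the response, network transfer
and SolrJ parsing. Below is an untested sketch that prints both numbers side by side,
using a recent SolrJ client class and a made-up core name:)

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.response.QueryResponse;

public class QTimeVsWallClock {
    public static void main(String[] args) throws Exception {
        HttpSolrClient client =
                new HttpSolrClient.Builder("http://localhost:8983/solr/core1").build();

        SolrQuery query = new SolrQuery("cd");
        query.setRows(15);

        long start = System.currentTimeMillis();
        QueryResponse response = client.query(query);
        long wallClockMs = System.currentTimeMillis() - start;

        // QTime is the time Solr itself reports for handling the request
        // (the same number that shows up in the Solr log).
        System.out.println("QTime (server): " + response.getQTime() + " ms");
        System.out.println("Elapsed (client): " + wallClockMs + " ms");

        client.close();
    }
}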