Hello, I'm not sure if it's the smartest solution, but if your request goes
through a programming layer, you could rewrite it using a regular expression:
query="apple" is rewritten as query="title:apple AND text:apple", for example.
I don't know if it's clever performance-wise, but it works fine, although
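Something like this minimal Java sketch is what I mean (the class name, field list and
regex are only illustrative assumptions, and it ignores quoted phrases and parentheses):

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class MultiFieldRewriter {
    // Fields every bare term should be expanded to (purely an example list).
    private static final String[] FIELDS = {"title", "text"};

    // A bare term: a word not already prefixed by "somefield:" and not followed by ":".
    private static final Pattern BARE_TERM = Pattern.compile("(?<![\\w:])(\\w+)\\b(?!:)");

    public static String rewrite(String q) {
        Matcher m = BARE_TERM.matcher(q);
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            String term = m.group(1);
            // Leave boolean operators untouched so AND/OR/NOT still work.
            if (term.equals("AND") || term.equals("OR") || term.equals("NOT")) {
                m.appendReplacement(out, term);
                continue;
            }
            // "apple" -> "(title:apple AND text:apple)", as in the example above.
            StringBuilder expanded = new StringBuilder("(");
            for (int i = 0; i < FIELDS.length; i++) {
                if (i > 0) expanded.append(" AND ");
                expanded.append(FIELDS[i]).append(':').append(term);
            }
            expanded.append(')');
            m.appendReplacement(out, Matcher.quoteReplacement(expanded.toString()));
        }
        m.appendTail(out);
        return out.toString();
    }

    public static void main(String[] args) {
        System.out.println(rewrite("apple"));          // (title:apple AND text:apple)
        System.out.println(rewrite("apple OR pear"));  // (title:apple AND text:apple) OR (title:pear AND text:pear)
    }
}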
On 30-Jul-07, at 3:34 PM, Daniel Naber wrote:
Hi,
I want to search multiple fields by default (which is not supported by
StandardRequestHandler), but I also want to be able to use Lucene's
boolean syntax (AND/OR/NOT). This doesn't seem to be supported by
DisMaxRequestHandler. I will need to cop
On 30-Jul-07, at 11:35 AM, David Whalen wrote:
Hi Yonik!
I'm glad to finally get to talk to you. We're all very impressed
with solr and when it's running it's really great.
We increased the heap size to 1500M and that didn't seem to help.
In fact, the crashes seem to occur more now than ever.
Hi,
I want to search multiple fields by default (which is not supported by
StandardRequestHandler), but I also want to be able to use Lucene's
boolean syntax (AND/OR/NOT). This doesn't seem to be supported by
DisMaxRequestHandler. I will need to copy or extend StandardRequestHandler
and modify
On 7/30/07, David Whalen <[EMAIL PROTECTED]> wrote:
> Hi All.
>
> I am using facets to help me build an ajax-driven tree for
> search results. When the search is first run, all I need to
> do is show the counts per facet, for example
>
> search results for "fred"
> +--A (102)
> +--B (234)
> +--C (
Hi All.
I am using facets to help me build an ajax-driven tree for
search results. When the search is first run, all I need to
do is show the counts per facet, for example
search results for "fred"
+--A (102)
+--B (234)
+--C (721)
+--D (512)
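(Concretely, the counts above come from a request along these lines -- "category" here is
just a stand-in for whatever field actually drives the A/B/C/D buckets:)
curl 'http://localhost:8983/solr/select?q=fred&rows=0&facet=true&facet.field=category'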
sounds simple, but I also need to break-down the resu
On 7/30/07, David Whalen <[EMAIL PROTECTED]> wrote:
> Yonik:
>
> > If that's not the problem, you could decrease memory usage
> > due to faceting by upgrading to Solr 1.2 and using
> > facet.enum.cache.minDf
>
> Is it hard to upgrade from 1.1 to 1.2? We were considering
> making that change if it
Yonik:
> If that's not the problem, you could decrease memory usage
> due to faceting by upgrading to Solr 1.2 and using
> facet.enum.cache.minDf
Is it hard to upgrade from 1.1 to 1.2? We were considering
making that change if it wouldn't cost us a lot of downtime.
can you help me understand
On 7/30/07, Kevin Holmes <[EMAIL PROTECTED]> wrote:
> Jul 30, 2007 3:05:03 PM org.apache.solr.core.SolrException log
> SEVERE: java.io.IOException: Lock obtain timed out:
> SimpleFSLock@/tmp/lucene-f4cca35f5bee7bbcd8238c7ef8697193-write.lock
> at org.apache.lucene.store.Lock.obtain(Lock.jav
This might be relevant too?
Jul 30, 2007 3:05:22 PM org.apache.solr.core.SolrException log
SEVERE: Error during auto-warming of
key:[EMAIL PROTECTED]:java.lang.OutOfMemoryError: Java heap space
Jul 30, 2007 3:05:25 PM org.apache.solr.core.SolrException log
SEVERE: Error during auto-warming of
k
Grep for PERFORMANCE in the logs to make sure that you aren't running
into a scenario where more than one searcher is warming in the
background.
If that's not the problem, you could decrease memory usage due to
faceting by upgrading to Solr 1.2 and using facet.enum.cache.minDf
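For example (the log path, field name and threshold below are only placeholders):
grep PERFORMANCE /path/to/container/logs/*   # look for "PERFORMANCE WARNING: Overlapping onDeckSearchers"
...&facet=true&facet.field=your_facet_field&facet.enum.cache.minDf=25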
-Yonik
On 7/30/07,
debiandos:~# curl -i
http://localhost:8983/solr/select/?q=superduperobscuretestingstring
HTTP/1.1 200 OK
Date: Mon, 30 Jul 2007 19:20:40 GMT
Server: Jetty/5.1.11RC0 (Linux/2.6.18-4-686 i386 java/1.5.0_11)
Content-Type: text/xml; charset=UTF-8
Content-Length: 272
[XML response body; the archive stripped the markup, leaving the responseHeader values followed by the echoed q=superduperobscuretestingstring]
David,
If "nothing on port 8983 responds", your servlet container is
certainly the first thing that should be checked, because that is
what's listening on port 8983.
First, let's figure out what version of Jetty you're using
and how it is started -- which will lead you to the log
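A couple of generic checks (just a sketch; adjust paths and names for your setup):
netstat -ln | grep 8983    # is anything still listening on the port?
ps aux | grep -i jetty     # how was it started, and with which JVM options?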
These might be relevant too:
Jul 30, 2007 3:05:03 PM org.apache.solr.core.SolrException log
SEVERE: java.io.IOException: Lock obtain timed out:
SimpleFSLock@/tmp/lucene-f4cca35f5bee7bbcd8238c7ef8697193-write.lock
at org.apache.lucene.store.Lock.obtain(Lock.java:69)
at
org.apache.
Just got this:
Jul 30, 2007 3:02:14 PM org.apache.solr.core.SolrException log
SEVERE: java.lang.OutOfMemoryError: Java heap space
Jul 30, 2007 3:02:30 PM org.apache.solr.core.SolrException log
SEVERE: java.lang.OutOfMemoryError: Java heap space
Kevin Holmes
eNR Services, Inc.
20 Glover Av
On 7/30/07, David Whalen <[EMAIL PROTECTED]> wrote:
> We increased the heap size to 1500M and that didn't seem to help.
> In fact, the crashes seem to occur more now than ever. We're
> constantly restarting solr just to get a response.
>
> I don't know enough to know where the log files are to ans
Hi Yonik!
I'm glad to finally get to talk to you. We're all very impressed
with solr and when it's running it's really great.
We increased the heap size to 1500M and that didn't seem to help.
In fact, the crashes seem to occur more now than ever. We're
constantly restarting solr just to get a r
It may be related to the out-of-memory errors you were seeing.
Severe errors like that should never be ignored.
Do you see any other warning or severe errors in your logs?
-Yonik
On 7/30/07, David Whalen <[EMAIL PROTECTED]> wrote:
> Guys:
>
> Can anyone help me? Things are getting serious at my
Guys:
Can anyone help me? Things are getting serious at my
company and heads are going to roll.
I need to figure out why solr just suddenly stops responding
without any warning.
DW
> -Original Message-
> From: David Whalen [mailto:[EMAIL PROTECTED]
> Sent: Friday, July 27, 2007 1
many third-party parsers (like Xerces, I think) handle the BOM automatically, but in
general it should be removed.
rev 561050
ryan
I'm pretty sure that's a BOM: http://en.wikipedia.org/wiki/Byte_Order_Mark
The web.xml in src/webapp also seems to have it, so I would assume that
any/all builds from the Solr kit would have the same problem no matter where
you downloaded it from.
It looks like the BOM got checked in here
It was one I downloaded.
However, a quick inspection of the source file indicates the same flaw in the
source web.xml:
From http://svn.apache.org/repos/asf/lucene/solr/trunk/src/webapp/WEB-INF/web.xml:

[the quoted first line of web.xml, which begins with a three-byte UTF-8 BOM before the XML declaration]

^^^ this is what prevents this from running unmodified on OAS. I suspect
other a
Jason P. Weiss wrote:
I had some trouble getting the current production build (1.2.0) working
on 10gR3 (10.1.3.0.0).
I had to remove 3 bad characters off of the front of the web.xml file
and re-jar the WAR file. It worked perfectly after that minor
modification.
Was this a .war you downloade
I had some trouble getting the current production build (1.2.0) working
on 10gR3 (10.1.3.0.0).
I had to remove 3 bad characters off of the front of the web.xml file
and re-jar the WAR file. It worked perfectly after that minor
modification.
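For anyone hitting the same thing, a rough sketch of how one might check for and strip the
BOM before re-jarring (this assumes the standard three-byte UTF-8 BOM, GNU tools, and
illustrative file names):
head -c 3 WEB-INF/web.xml | xxd     # prints "efbb bf" if a UTF-8 BOM is present
tail -c +4 WEB-INF/web.xml > web.xml.tmp && mv web.xml.tmp WEB-INF/web.xml
jar uf solr.war WEB-INF/web.xml     # put the cleaned file back into the WAR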
Jason
-Original Message-
From: Chris Hostetter
I will take a stab at patching the MoreLikeThis handler - but given
that I have never touched a single line of Solr code, this might fail
miserably :)
Maybe there is a kind soul who could provide a new patch for
SOLR-236 that includes field collapsing with MLT?
On 30/07/07, Ryan McKinley <[EMAIL
Hi David,
We're running Solr 1.1 and we're seeing intermittent cases where
Solr stops responding to HTTP requests. It seems like the listener
on port 8983 just doesn't respond.
When we started using solr we encountered the same problem. We are
currently running solr 1.0 (!) with tomcat 5.5 o
Nuno Leitao wrote:
Hi,
I have a 1.3 Solr with the field collapsing patch (SOLR-236 -
http://issues.apache.org/jira/browse/SOLR-236).
Collapsing works great, but only using the dismax and standard query
handler - I haven't managed to get it to work using the MoreLikeThis
handler though - I a
Hi,
I have a 1.3 Solr with the field collapsing patch (SOLR-236 - http://
issues.apache.org/jira/browse/SOLR-236).
Collapsing works great, but only using the dismax and standard query
handler - I haven't managed to get it to work using the MoreLikeThis
handler though - I am going for a sim
I think I have the same question as Arnaud. For example, my dismax query has
qf=title^5 description^2. Now if I search for "Java developer", I want to
make sure that the results have at least "java" or "developer" in the title.
Is this possible with dismax query?
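Would something along these lines work? (the extra clause would have to be generated by
our own front end, not by dismax itself, and the field names and values are just an example):
qt=dismax
q=Java developer
qf=title^5 description^2
fq=title:(java OR developer)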
On 7/30/07, Chris Hostetter <[EMAI
Hi All.
I'm still hoping to get some insight into how I can solve this
issue. If Jetty is the problem I'll happily get rid of it, but
I'd feel better if I could do some tests first to be sure I'm
solving the problem.
Has anyone else had this problem in the past?
Thanks,
DW
> -Original M
: I am new to Solr and am trying to use Jetty and the example with 13 million records.
: While running it, I get the error -
: java.lang.OutOfMemoryError: Java heap space
: Any recommendation? We have a million transactions, so would it be better to
: use Tomcat?
millions of records take up memory. wh
: Is it possible to get the values from the ValueSource (or from
: getFieldCacheCounts) sorted by its natural order (from lowest to
: highest values)?
well, an inverted term index is already a data structure listing terms
from lowest to highest and the associated documents -- so if you want to
it
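As an illustration, a rough sketch of walking a field's terms in that natural order with
the Lucene API of that era (the field name and index path are placeholders):

import org.apache.lucene.index.IndexReader;
import org.apache.lucene.index.Term;
import org.apache.lucene.index.TermEnum;

public class TermWalk {
    public static void main(String[] args) throws Exception {
        IndexReader reader = IndexReader.open("/path/to/index");   // placeholder path
        // Position the enumeration at the first term of the field; terms come back
        // in their natural (lowest to highest) order.
        TermEnum terms = reader.terms(new Term("price", ""));
        try {
            do {
                Term t = terms.term();
                if (t == null || !t.field().equals("price")) break;
                System.out.println(t.text() + " (docFreq=" + terms.docFreq() + ")");
            } while (terms.next());
        } finally {
            terms.close();
            reader.close();
        }
    }
}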
: Is it possible to specify precisely one or more mandatory fields in a
: DismaxRequestHandler?
what would the semantics of making a field mandatory mean? considering your
specific example...
:
: text^0.5 features^1.0 name^1.2 sku^1.5 id^10.0 manu^1.1 cat^1.4
:
:
i noticed this message while catching up on some mail backlog ... i don't
know anything about Oracle's app server, but some creative googling for
the error message you cited suggests to me that "Failed in uploading
archive. Invalid archive file:" is a common error message preamble anytime
Oracle's
Hello,
You need to start your Jetty or Tomcat server with higher JVM memory
settings.
On this page you can find some explanation
http://www.caucho.com/resin-3.0/performance/jvm-tuning.xtp
The 2 parameters that are important are -Xms and -Xmx.
So instead of starting Jetty with
java -jar start.jar
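you could pass explicit heap settings, for example (the 512m/1024m values below are only
an illustration -- size them to your index and traffic):
java -Xms512m -Xmx1024m -jar start.jar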