what is the purpose of multiple solr webapps

2008-09-02 Thread sanraj25

Hi

I want to know: what is the purpose of multiple Solr webapps? If we set
things up that way, can we load a larger amount of data? Please help me. In
what situations should we go for multiple Solr webapps?
Thanks in advance

-Santhanaraj R
-- 
View this message in context: 
http://www.nabble.com/what-is-the-purpose-of-multiple-solr-weapps-tp19265499p19265499.html
Sent from the Solr - User mailing list archive at Nabble.com.



Re: what is the purpose of multiple solr webapps

2008-09-02 Thread Neeti Raj
Hi Santhanaraj

I have used multiple Solr webapps in a Tomcat container for my projects.
My idea was to have a single Solr server catering to multiple projects with
completely different databases/indexes.
As far as Solr goes, only the Solr directory (containing bin, conf and
data) needs to be separated out for each project.

   - Single point of maintenance.
   - But that also means a server start/stop will impact all the
   affiliated projects.

Read more about this on -
http://wiki.apache.org/solr/MultipleIndexes
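
For a rough sketch of how this looks on Tomcat (the file names and paths
below are made up): each webapp gets its own context fragment under
conf/Catalina/localhost/, pointing the shared solr.war at its own Solr home,
e.g. solr-projectA.xml:

  <Context docBase="/opt/solr/solr.war" debug="0" crossContext="true">
    <!-- each project gets its own solr/home with its own conf/ and data/ -->
    <Environment name="solr/home" type="java.lang.String"
                 value="/opt/solr/home-projectA" override="true"/>
  </Context>

A second fragment (say, solr-projectB.xml) points at a different Solr home,
so the projects share one container but keep separate configs and indexes.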

- Neeti
On Tue, Sep 2, 2008 at 1:06 PM, sanraj25 <[EMAIL PROTECTED]> wrote:

>
> Hi
>
> I want to know: what is the purpose of multiple Solr webapps? If we set
> things up that way, can we load a larger amount of data? Please help me. In
> what situations should we go for multiple Solr webapps?
> Thanks in advance
>
> -Santhanaraj R
> --
> View this message in context:
> http://www.nabble.com/what-is-the-purpose-of-multiple-solr-weapps-tp19265499p19265499.html
> Sent from the Solr - User mailing list archive at Nabble.com.
>
>


Searching with Wildcards

2008-09-02 Thread Brian Carmalt
Hello all, 

I need to get wildcard searches with highlighting up and running. I'd
like to get it to work with a DismaxHandler, but I'll settle for
starting with the StandardRequestHandler. I've been reading some of
the past mails on wildcard searches and SOLR-195. It seems I need to
change the default behavior for wildcards from PrefixFilter to a
PrefixQuery.
I know that I will have to deal with TooManyClauses exceptions, but I
want to play around with it.

I have read that this can only be done by modifying the code, but I
can't seem to find the correct section. Can someone point me in the
right direction? Thanks.

- Brian 



Re: Searching with Wildcards

2008-09-02 Thread Erik Hatcher
Probably your best bet is to create a new QParser(Plugin) that uses  
Lucene's QueryParser directly.  We probably should have that available  
anyway in the core, just so folks coming from Lucene Java have the  
same QueryParser.
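
Something along these lines should work as a starting point -- a rough,
untested sketch against the 1.3-era QParser API (the class name and the
hard-coded default field are placeholders; double-check the signatures
against your checkout):

  import org.apache.lucene.queryParser.ParseException;
  import org.apache.lucene.queryParser.QueryParser;
  import org.apache.lucene.search.Query;
  import org.apache.solr.common.params.SolrParams;
  import org.apache.solr.common.util.NamedList;
  import org.apache.solr.request.SolrQueryRequest;
  import org.apache.solr.search.QParser;
  import org.apache.solr.search.QParserPlugin;

  // Delegates parsing to Lucene's QueryParser, so prefix/wildcard terms
  // expand into scoring clauses (which highlighting can see, and which can
  // throw TooManyClauses) instead of Solr's constant-score prefix filter.
  public class LuceneDirectQParserPlugin extends QParserPlugin {
    public void init(NamedList args) {}

    public QParser createParser(String qstr, SolrParams localParams,
                                SolrParams params, SolrQueryRequest request) {
      return new QParser(qstr, localParams, params, request) {
        public Query parse() throws ParseException {
          // req is the request field inherited from QParser;
          // "text" is a placeholder default field -- use whatever your schema uses
          QueryParser parser =
              new QueryParser("text", req.getSchema().getQueryAnalyzer());
          return parser.parse(getString());
        }
      };
    }
  }

Register it with a <queryParser> element in solrconfig.xml and select it per
request with the defType parameter.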


Erik

On Sep 2, 2008, at 7:11 AM, Brian Carmalt wrote:


Hello all,

I need to get wildcard searches with highlighting up and running. I'd
like to get it to work with a DismaxHandler, but I'll settle for
starting with the StandardRequestHandler. I've been reading some of
the past mails on wildcard searches and SOLR-195. It seems I need to
change the default behavior for wildcards from PrefixFilter to a
PrefixQuery.
I know that I will have to deal with TooManyClauses exceptions, but I
want to play around with it.

I have read that this can only be done by modifying the code, but I
can't seem to find the correct section. Can someone point me in the
right direction? Thanks.

- Brian




Check on Solr 1.3?

2008-09-02 Thread Jon Baer

Hi,

Was wondering if there was an update on the push for a final 1.3?
I wanted to build a final .war, but I'm wondering about the status and whether
I should hold off ... everything in trunk seems promising; any major issues?


Thanks.

- Jon


RE: Performance of putting the solr data in SAN.

2008-09-02 Thread Yongjun Rong
 Hi,
  I did not get any response from this mailing list about this question.
Does that mean no one on this list has used Solr with a SAN? Please reply
to me if you use Solr with a SAN.
  Thank you very much.
  Yongjun Rong

-Original Message-
From: Yongjun Rong [mailto:[EMAIL PROTECTED] 
Sent: Friday, August 29, 2008 1:18 PM
To: solr-user@lucene.apache.org
Cc: Mitch Stewart
Subject: Performance of putting the solr data in SAN.

Hi,
  I'm just wondering if anybody has experience with putting the Solr
data on a SAN instead of on local disk. Is there a big performance penalty?
Please share your experiences.
  Thank you very much.
  Yongjun Rong


Re: Check on Solr 1.3?

2008-09-02 Thread Grant Ingersoll
Barring anything major, it should be out this week or early next.   
We've published one Release Candidate so far, and another one will be  
available today.  I have been announcing this on solr-dev, but have  
held back on making a broad announcement just yet.


-Grant

On Sep 2, 2008, at 10:39 AM, Jon Baer wrote:


Hi,

Was wondering if there was an update on the push for a final 1.3?
I wanted to build a final .war, but I'm wondering about the status and whether
I should hold off ... everything in trunk seems promising; any major issues?


Thanks.

- Jon





Re: Performance of putting the solr data in SAN.

2008-09-02 Thread Nuno Leitao
Hi.

It all depends on what SAN you will be using, and also on the size of your
index compared to the amount of RAM on your server(s).

For example, if your index is only a couple of GB and your server has
sufficient RAM, there will likely be no noticeable performance difference
between a SAN and DAD (Directly Attached Disks).

If however your index is large (as in, many multiples of the total RAM
on your server(s)) and the QPS (queries per second) or document update
rate is large, then you will generally see a difference.

When it comes to SAN versus DAD - there are so many variables that it
is difficult to start explaining the pros and cons. Generally
speaking, in most cases if you have a good RAID level (say, 10) and
lots of directly attached disks (5+) then DAD performance can easily
reach or beat the performance of an average SAN.

However, a good FC SAN with lots of spindles and lots of cache will
most likely outperform an average DAD setup.

It's up to you to talk to your hardware people and decide - there's no
"best practice" for this as such.

--Nuno

2008/9/2 Yongjun Rong <[EMAIL PROTECTED]>:
>  Hi,
>  I did not get any response from this mailing list about this question.
> Does that mean no one on this list has used Solr with a SAN? Please reply
> to me if you use Solr with a SAN.
>  Thank you very much.
>  Yongjun Rong
>
> -Original Message-
> From: Yongjun Rong [mailto:[EMAIL PROTECTED]
> Sent: Friday, August 29, 2008 1:18 PM
> To: solr-user@lucene.apache.org
> Cc: Mitch Stewart
> Subject: Performance of putting the solr data in SAN.
>
> Hi,
>  I'm just wondering if anybody has experience with putting the Solr
> data on a SAN instead of on local disk. Is there a big performance penalty?
> Please share your experiences.
>  Thank you very much.
>  Yongjun Rong
>


Re: Using score in FunctionQuery

2008-09-02 Thread Chris Hostetter

: Is it possible to use Lucene similarity score in a FunctionQuery (in dismax
: boost function, for example) as part of a function?

I don't think so ... not directly anyway.

FunctionQueries just contribute to the final Lucene score as an additional 
Query clause.  

(but there might be a ValueSource I'm not immediately remembering that you
could use to do something like that in your own custom handler or component)


-Hoss



Re: Question about autocomplete feature

2008-09-02 Thread Chris Hostetter
: First I decided to make it work for the Solr example. So I pasted the
: snippet into schema.xml. Then I edited exampledocs/hd.xml and added the
: "ac" field to each doc. The value of the "ac" field is a copy of the "name" field:

you didn't need to do that, you could have just used a copyField to make 
"ac" get a copy of "name" for all docs.

: Then I cleaned the Solr index, posted hd.xml and restarted the Solr server.
: But when I try to search for "samsu" (part of the word "samsung") I
: still get no result. It seems like Solr treats the "ac" field like a
: regular field.

what exactly did you query for?  if your URL looked like this...

   http://localhost:8983/solr/select?q=samsu

...then you are just searching the default field (and it doesn't sound 
like you changed that to "ac") ... you either need to change the default 
field in your schema, or be explicit in the URL...

   http://localhost:8983/solr/select?q=ac:samsu

...in general, adding the debugQuery=true option to your URLs can help you 
diagnose problems like this.  Among other things it will show you a string 
representing the query Solr is building after it parses your input (and 
would most likely have said "text:samsu")


-Hoss



Adding multiple documents to the server at once.

2008-09-02 Thread Erik Holstad
Hi!
I'm trying to add multiple documents to the server at the same time.
It works fine when I know how many I want to add, because then I
can create that many SolrInputDocuments to store them in.

But if I don't know the total number in advance, is there a good way
of doing this? Reusing one document only gives me the last insert.


Regards Erik
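
For reference, the usual pattern is to collect the documents in a
java.util.List and send the whole collection in a single add() call. A
minimal SolrJ sketch (the URL, field names, and loop bound are placeholders):

  import java.util.ArrayList;
  import java.util.List;
  import org.apache.solr.client.solrj.SolrServer;
  import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
  import org.apache.solr.common.SolrInputDocument;

  public class BatchAdd {
    public static void main(String[] args) throws Exception {
      SolrServer server = new CommonsHttpSolrServer("http://localhost:8983/solr");
      List<SolrInputDocument> docs = new ArrayList<SolrInputDocument>();
      // Create a fresh document per record instead of reusing one instance.
      for (int i = 0; i < 10; i++) {
        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", "doc-" + i);
        doc.addField("name", "example " + i);
        docs.add(doc);
      }
      server.add(docs);   // add(Collection) sends all of them in one request
      server.commit();
    }
  }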


"background merge hit exception"

2008-09-02 Thread Chris Harris
I've made some changes to my Solr setup, and now I'm getting the
"background merge hit exception" pasted at the end of this message.
The most notable changes I've made are:

Update to r690989 (Lucene r688745)

Change a few things in my schema. In particular, I was previously
storing my main document text and the metadata fields in a single
"body" field, like this:

[schema excerpt lost: the XML field definitions were stripped by the list archive]

whereas I'm now using "body" as a sort of alias that just gets
redirected to other fields, like this:

[schema excerpt lost: the XML field and copyField definitions were stripped by the list archive]
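
For illustration only (this is not the poster's actual schema; the field and
type names below are invented): the setup described above is typically a
non-indexed, non-stored "body" field whose incoming content is fanned out to
real fields via copyField, e.g.:

  <field name="body" type="text" indexed="false" stored="false"/>
  <field name="body_text" type="text" indexed="true" stored="true"/>
  <field name="body_shingles" type="shingle_text" indexed="true" stored="false"/>
  <copyField source="body" dest="body_text"/>
  <copyField source="body" dest="body_shingles"/>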

When I was indexing with this new setup, things were initially fine,
and segments seemed to be merging fine. I ran into trouble when I sent
an <optimize/>, though. I think in an earlier run I also got a very
similar exception just from document adds, without an explicit
<optimize/>.

I'm also running with a shingle-related patch
(https://issues.apache.org/jira/browse/LUCENE-1370 /
https://issues.apache.org/jira/browse/SOLR-744) and the rich document
handler patch, though I've used these before without trouble.

Is it possible that my schema change is illegitimate? Am I not allowed
to have non-indexed, non-stored fields, for example?

Anyway, here is my stack trace:

*

background merge hit exception: _1h:C2552 _1i:C210->_1i _1j:C266->_1i
_1k:C214->_1i _1l:C329->_1i _1m:C231->_1i _1n:C379->_1i _1o:C447
_1p:C453->_1p _1q:C485->_1p into _1r [optimize]

java.io.IOException: background merge hit exception: _1h:C2552
_1i:C210->_1i _1j:C266->_1i _1k:C214->_1i _1l:C329->_1i _1m:C231->_1i
_1n:C379->_1i _1o:C447 _1p:C453->_1p _1q:C485->_1p into _1r [optimize]
at org.apache.lucene.index.IndexWriter.optimize(IndexWriter.java:2303)
at org.apache.lucene.index.IndexWriter.optimize(IndexWriter.java:2233)
at 
org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:355)
at 
org.apache.solr.update.processor.RunUpdateProcessor.processCommit(RunUpdateProcessorFactory.java:77)
at 
org.apache.solr.handler.XmlUpdateRequestHandler.processUpdate(XmlUpdateRequestHandler.java:228)
at 
org.apache.solr.handler.XmlUpdateRequestHandler.handleRequestBody(XmlUpdateRequestHandler.java:125)
at 
org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:135)
at org.apache.solr.core.SolrCore.execute(SolrCore.java:1156)
at 
org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:341)
at 
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:272)
at 
org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1089)
at 
org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:365)
at 
org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
at 
org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:181)
at 
org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:712)
at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:405)
at 
org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:211)
at 
org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
at 
org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:139)
at org.mortbay.jetty.Server.handle(Server.java:285)
at 
org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:502)
at 
org.mortbay.jetty.HttpConnection$RequestHandler.content(HttpConnection.java:835)
at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:641)
at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:208)
at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:378)
at 
org.mortbay.jetty.bio.SocketConnector$Connection.run(SocketConnector.java:226)
at 
org.mortbay.thread.BoundedThreadPool$PoolThread.run(BoundedThreadPool.java:442)
Caused by: java.lang.NullPointerException
at java.lang.System.arraycopy(Native Method)
at 
org.apache.lucene.store.BufferedIndexOutput.writeBytes(BufferedIndexOutput.java:49)
at 
org.apache.lucene.index.FieldsWriter.writeField(FieldsWriter.java:215)
at 
org.apache.lucene.index.FieldsWriter.addDocument(FieldsWriter.java:268)
at 
org.apache.lucene.index.SegmentMerger.mergeFields(SegmentMerger.java:359)
at org.apache.lucene.index.SegmentMerger.merge(SegmentMerger.java:138)
at 
org.apache.lucene.index.IndexWriter.mergeMiddle(IndexWriter.java:3988)
at org.apache.lucene.index.IndexWriter.merge(IndexWriter.java:3637)
at 
org.apache.lucene.index.ConcurrentMergeScheduler.doMerge(ConcurrentMergeScheduler.java:214)
at 
org.apache.lucene.index.ConcurrentMergeScheduler$MergeThread.run(ConcurrentMergeScheduler.java:269)


Re: Adding multiple documents to the server at once.

2008-09-02 Thread Erik Holstad
Never mind :)

On Tue, Sep 2, 2008 at 11:23 AM, Erik Holstad <[EMAIL PROTECTED]> wrote:

> Hi!
> I'm trying to add multiple documents to the server at the same time.
> It works fine when I know how many I want to add, because then I
> can create that many SolrInputDocuments to store them in.
>
> But if I don't know the total number in advance, is there a good way
> of doing this? Reusing one document only gives me the last insert.
>
>
> Regards Erik
>


Re: Performance of putting the solr data in SAN.

2008-09-02 Thread Jeryl Cook
Using a SAN to persist your data doesn't affect performance.

On 9/2/08, Yongjun Rong <[EMAIL PROTECTED]> wrote:
>  Hi,
>   I did not get any response from this mailing list about this question.
> Does that mean no one on this list has used Solr with a SAN? Please reply
> to me if you use Solr with a SAN.
>   Thank you very much.
>   Yongjun Rong
>
> -Original Message-
> From: Yongjun Rong [mailto:[EMAIL PROTECTED]
> Sent: Friday, August 29, 2008 1:18 PM
> To: solr-user@lucene.apache.org
> Cc: Mitch Stewart
> Subject: Performance of putting the solr data in SAN.
>
> Hi,
>   I'm just wondering if anybody has experience with putting the Solr
> data on a SAN instead of on local disk. Is there a big performance penalty?
> Please share your experiences.
>   Thank you very much.
>   Yongjun Rong
>


-- 
Jeryl Cook
/^\ Pharaoh /^\
http://pharaohofkush.blogspot.com/
"Whether we bring our enemies to justice, or bring justice to our
enemies, justice will be done."
--George W. Bush, Address to a Joint Session of Congress and the
American People, September 20, 2001


Building a multilevel query.

2008-09-02 Thread Erik Holstad
Hi!
I want to do a query that first queries on one specific field and,
for all documents that match, applies a second query.

For example, say we have a "type" field where one of the options
is "user", and a "title" field that includes the names of the users.

I want to find all documents where the "type" field is "user" and the
name Erik appears in the title field.

Is this possible? I have been playing with faceting, but I can't get
facet.query to work, and otherwise I just get all the results.

Regards Erik
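
For what it's worth, this doesn't need faceting -- it is an ordinary
filtered/boolean query. A sketch against the standard request handler
(field names taken from the question; host and port are the usual example
defaults):

  http://localhost:8983/solr/select?q=title:Erik&fq=type:user

or, equivalently, q=type:user AND title:Erik. The fq form caches the
type:user restriction as a filter, which helps when the same restriction is
reused across many different queries.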