Deduplication patch not working in nightly build

2009-01-05 Thread Marc Sturlese

Hey there,
I was using the Deduplication patch with the Solr 1.3 release and everything
was working perfectly. Now I upgraded to a nightly build (20th December) to be
able to use the new facet algorithm and other stuff, and DeDuplication is not
working any more. I have followed exactly the same steps to apply the patch
to the source code. I am getting this error:

WARNING: Error reading data 
com.mysql.jdbc.CommunicationsException: Communications link failure due to
underlying exception: 

** BEGIN NESTED EXCEPTION ** 

java.io.EOFException

STACKTRACE:

java.io.EOFException
at com.mysql.jdbc.MysqlIO.readFully(MysqlIO.java:1905)
at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:2404)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2862)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:771)
at com.mysql.jdbc.MysqlIO.nextRow(MysqlIO.java:1289)
at com.mysql.jdbc.RowDataDynamic.nextRecord(RowDataDynamic.java:362)
at com.mysql.jdbc.RowDataDynamic.next(RowDataDynamic.java:352)
at com.mysql.jdbc.ResultSet.next(ResultSet.java:6144)
at
org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.hasnext(JdbcDataSource.java:294)
at
org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.access$400(JdbcDataSource.java:189)
at
org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator$1.hasNext(JdbcDataSource.java:225)
at
org.apache.solr.handler.dataimport.EntityProcessorBase.getNext(EntityProcessorBase.java:229)
at
org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(SqlEntityProcessor.java:76)
at
org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:351)
at
org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:193)
at
org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:144)
at
org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:334)
at
org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:407)
at
org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:388)


** END NESTED EXCEPTION **
Last packet sent to the server was 202481 ms ago.
at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:2563)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2862)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:771)
at com.mysql.jdbc.MysqlIO.nextRow(MysqlIO.java:1289)
at com.mysql.jdbc.RowDataDynamic.nextRecord(RowDataDynamic.java:362)
at com.mysql.jdbc.RowDataDynamic.next(RowDataDynamic.java:352)
at com.mysql.jdbc.ResultSet.next(ResultSet.java:6144)
at
org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.hasnext(JdbcDataSource.java:294)
at
org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.access$400(JdbcDataSource.java:189)
at
org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator$1.hasNext(JdbcDataSource.java:225)
at
org.apache.solr.handler.dataimport.EntityProcessorBase.getNext(EntityProcessorBase.java:229)
at
org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(SqlEntityProcessor.java:76)
at
org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:351)
at
org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:193)
at
org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:144)
at
org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:334)
at
org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:407)
at
org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:388)
Jan 5, 2009 10:06:16 AM org.apache.solr.handler.dataimport.JdbcDataSource
logError
WARNING: Exception while closing result set
com.mysql.jdbc.CommunicationsException: Communications link failure due to
underlying exception: 

** BEGIN NESTED EXCEPTION ** 

java.io.EOFException

STACKTRACE:

java.io.EOFException
at com.mysql.jdbc.MysqlIO.readFully(MysqlIO.java:1905)
at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:2351)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2862)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:771)
at com.mysql.jdbc.MysqlIO.nextRow(MysqlIO.java:1289)
at com.mysql.jdbc.RowDataDynamic.nextRecord(RowDataDynamic.java:362)
at com.mysql.jdbc.RowDataDynamic.next(RowDataDynamic.java:352)
at com.mysql.jdbc.RowDataDynamic.close(RowDataDynamic.java:150)
at com.mysql.jdbc.ResultSet.realClose(ResultSet.java:6488)
at com.mysql.jdbc.ResultSet.close(ResultSet.java:736)
at
org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.close(JdbcDataSource.java:312)
at
org.apache.solr.handler.dataimpo

Re: Solr 1.3.0 with Jetty 6.1.14

2009-01-05 Thread Jacob Singh
Hi,

I did this.  The only option I've found is to use Matt's attached solution.

I suggest just using MultiCore/CoreAdmin though.

Best,
Jacob

On Mon, Jan 5, 2009 at 8:47 AM, gwk  wrote:
> Hello,
>
>
> I'm trying to get multiple instances of Solr running with Jetty as per
> the instructions on http://wiki.apache.org/solr/SolrJetty; however, I've
> run into a snag. According to the page, you set the solr/home parameter
> as follows:
>
> <env-entry>
>   <env-entry-name>solr/home</env-entry-name>
>   <env-entry-value>*My Solr Home Dir*</env-entry-value>
>   <env-entry-type>java.lang.String</env-entry-type>
> </env-entry>
>
> However, as MattKangas mentions on the wiki, using this method to set
> the JNDI parameter makes it global to the JVM, which is bad for running
> multiple instances. But reading the 6.1.14 documentation for the EnvEntry
> class constructors shows that with this version of Jetty you can supply
> a scope, so I've tried this with the following configuration:
>
> <Configure class="org.mortbay.jetty.webapp.WebAppContext">
>   <New class="org.mortbay.jetty.plus.naming.EnvEntry">
>     <Arg><Ref id="..."/></Arg>
>     <Arg>/solr/home</Arg>
>     <Arg><SystemProperty name="..." default="."/>/my/solr/home/dir</Arg>
>     <Arg type="boolean">true</Arg>
>   </New>
> </Configure>
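
For reference, a sketch in Java of the scoped constructor gwk's XML maps onto (the argument order is inferred from the configuration above; the variable names are illustrative, not from the thread):

    import javax.naming.NamingException;
    import org.mortbay.jetty.plus.naming.EnvEntry;
    import org.mortbay.jetty.webapp.WebAppContext;

    public class ScopedEnvEntry {
        public static void main(String[] args) throws NamingException {
            WebAppContext solrWebApp = new WebAppContext();
            // args: scope, JNDI name, value, overrideWebXml -- scoping to one
            // webapp keeps the entry out of the JVM-global namespace
            new EnvEntry(solrWebApp, "/solr/home", "/my/solr/home/dir", true);
        }
    }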
>
> But unfortunately this doesn't seem to work. If I set the first argument
> to null (an empty <Arg/>), it works for one instance (as it's in JVM
> scope), but when I set it to the WebAppContext scope, Solr logs:
>
> org.apache.solr.core.SolrResourceLoader locateInstanceDir
> INFO: No /solr/home in JNDI
> org.apache.solr.core.SolrResourceLoader locateInstanceDir
> INFO: solr home defaulted to 'solr/' (could not find system property or
> JNDI)
>
> Am I doing something wrong here? Any help will be appreciated.
>
> Regards,
>
> gwk
>
>
>



-- 

+1 510 277-0891 (o)
+91  33 7458 (m)

web: http://pajamadesign.com

Skype: pajamadesign
Yahoo: jacobsingh
AIM: jacobsingh
gTalk: jacobsi...@gmail.com


Re: cannot allocate memory for snapshooter

2009-01-05 Thread Brian Whitman
On Sun, Jan 4, 2009 at 9:47 PM, Mark Miller  wrote:

> Hey Brian, I didn't catch what OS you are using on EC2 by the way. I
> thought most UNIX OS's were using memory overcommit - A quick search brings
> up Linux, AIX, and HP-UX, and maybe even OSX?
>
> What are you running over there? EC2, so Linux I assume?
>

This is on debian, a 2.6.21 x86_64 kernel.


Re: Deduplication patch not working in nightly build

2009-01-05 Thread Marc Sturlese

Thanks, I will have a look at my JdbcDataSource. Anyway it's weird because
using the 1.3 release I don't have that problem...

Shalin Shekhar Mangar wrote:
> 
> Yes, initially I figured that we are accidentally re-using a closed data
> source. But Noble has pinned it right. I guess you can try looking into
> your JDBC driver's documentation for a setting which keeps the connection
> alive for longer.
> 
> On Mon, Jan 5, 2009 at 5:29 PM, Noble Paul നോബിള്‍ नोब्ळ् <
> noble.p...@gmail.com> wrote:
> 
>> I guess the indexing of a doc is taking too long (maybe because of
>> the de-dup patch) and the resultset gets closed automatically (timed
>> out)
>> --Noble
>>
>> On Mon, Jan 5, 2009 at 5:14 PM, Marc Sturlese 
>> wrote:
>> >
>> > Doing this fix I get the same error :(
>> >
>> > I am going to try to set up the last nightly build... let's see if I
>> > have better luck.
>> >
>> > The thing is it stops indexing at around doc number 150,000... and
>> > gives me that MySQL exception error... Without the DeDuplication patch
>> > I can index 2 million docs without problems...
>> >
>> > I am pretty lost with this... :(
>> >
>> >
>> > Shalin Shekhar Mangar wrote:
>> >>
>> >> Yes I meant the 05/01/2009 build. The fix is a one-line change.
>> >>
>> >> Add the following as the last line of DataConfig.Entity.clearCache():
>> >> dataSrc = null;
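
For context, a sketch of where that one-liner lands (only the final assignment is the fix; the comment stands in for whatever DataConfig.Entity.clearCache() already does on trunk):

    // org.apache.solr.handler.dataimport.DataConfig.Entity (sketch)
    void clearCache() {
        // ... existing cache-clearing logic ...
        dataSrc = null; // the fix: drop the reference so a closed data
                        // source is not reused on the next import run
    }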
>> >>
>> >>
>> >>
>> >> On Mon, Jan 5, 2009 at 4:22 PM, Marc Sturlese
>> >> wrote:
>> >>
>> >>>
>> >>> Shalin you mean I should test the 05/01/2009 nightly? Maybe with this
>> >>> one it works? If the fix you did is not really big, can you tell me
>> >>> where in the source it is and what it is for? (I have been debugging
>> >>> and tracing the dataimporthandler source a lot and I would like to
>> >>> know what the improvement is about, if it is not a problem...)
>> >>>
>> >>> Thanks!
>> >>>
>> >>>
>> >>> Shalin Shekhar Mangar wrote:
>> >>> >
>> >>> > Marc, I've just committed a fix which may have caused the bug. Can
>> you
>> >>> use
>> >>> > svn trunk (or the next nightly build) and confirm?
>> >>> >
>> >>> > On Mon, Jan 5, 2009 at 3:10 PM, Noble Paul നോബിള്‍ नोब्ळ् <
>> >>> > noble.p...@gmail.com> wrote:
>> >>> >
>> >>> >> looks like a bug w/ DIH with the recent fixes.
>> >>> >> --Noble
>> >>> >>
>> >>> >> On Mon, Jan 5, 2009 at 2:36 PM, Marc Sturlese
>> >>> 
>> >>> >> wrote:
>> >>> >> >
>> >>> >> > Hey there,
>> >>> >> > I was using the Deduplication patch with the Solr 1.3 release and
>> >>> >> > everything was working perfectly. Now I upgraded to a nightly build
>> >>> >> > (20th December) to be able to use the new facet algorithm and other
>> >>> >> > stuff and DeDuplication is not working any more. I have followed
>> >>> >> > exactly the same steps to apply the patch to the source code. I am
>> >>> >> > getting this error:
>> >>> >> >
>> >>> >> > WARNING: Error reading data
>> >>> >> > com.mysql.jdbc.CommunicationsException: Communications link
>> failure
>> >>> due
>> >>> >> to
>> >>> >> > underlying exception:
>> >>> >> >
>> >>> >> > ** BEGIN NESTED EXCEPTION **
>> >>> >> >
>> >>> >> > java.io.EOFException
>> >>> >> >
>> >>> >> > STACKTRACE:
>> >>> >> >
>> >>> >> > java.io.EOFException
>> >>> >> >at com.mysql.jdbc.MysqlIO.readFully(MysqlIO.java:1905)
>> >>> >> >at
>> >>> com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:2404)
>> >>> >> >at
>> com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2862)
>> >>> >> >at
>> com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:771)
>> >>> >> >at com.mysql.jdbc.MysqlIO.nextRow(MysqlIO.java:1289)
>> >>> >> >at
>> >>> >> com.mysql.jdbc.RowDataDynamic.nextRecord(RowDataDynamic.java:362)
>> >>> >> >at
>> >>> com.mysql.jdbc.RowDataDynamic.next(RowDataDynamic.java:352)
>> >>> >> >at com.mysql.jdbc.ResultSet.next(ResultSet.java:6144)
>> >>> >> >at
>> >>> >> >
>> >>> >>
>> >>>
>> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.hasnext(JdbcDataSource.java:294)
>> >>> >> >at
>> >>> >> >
>> >>> >>
>> >>>
>> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.access$400(JdbcDataSource.java:189)
>> >>> >> >at
>> >>> >> >
>> >>> >>
>> >>>
>> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator$1.hasNext(JdbcDataSource.java:225)
>> >>> >> >at
>> >>> >> >
>> >>> >>
>> >>>
>> org.apache.solr.handler.dataimport.EntityProcessorBase.getNext(EntityProcessorBase.java:229)
>> >>> >> >at
>> >>> >> >
>> >>> >>
>> >>>
>> org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(SqlEntityProcessor.java:76)
>> >>> >> >at
>> >>> >> >
>> >>> >>
>> >>>
>> org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:351)
>> >>> >> >at
>> >>> >> >
>> >>> >>
>> >>>
>> org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:193)
>> >>> >> >at
>> >>> >> >
>> >>> >>
>> >>>
>> org.apache.s

Re: Deduplication patch not working in nightly build

2009-01-05 Thread Marc Sturlese


Yeah, looks like it, but... if I don't use the DeDuplication patch everything
works perfectly. I can create my indexes using full-import and delta-import
without problems. The JdbcDataSource of the nightly is pretty similar to the
1.3 release's...
The DeDuplication patch doesn't touch the dataimporthandler classes... that's
why I thought the problem was not there (but I can't say it for sure...)

I was thinking the problem has something to do with the
UpdateRequestProcessorChain, but I don't know how this part of the source
works...

I am really interested in updating to the nightly build, as I think the new
facet algorithm and SolrDeletionPolicy are really great stuff!

>>Marc, I've just committed a fix which may have caused the bug. Can you use
>>svn trunk (or the next nightly build) and confirm? 
You mean the last nightly build?

Thanks


Noble Paul നോബിള്‍ नोब्ळ् wrote:
> 
> looks like a bug w/ DIH with the recent fixes.
> --Noble
> 
> On Mon, Jan 5, 2009 at 2:36 PM, Marc Sturlese 
> wrote:
>>
>> Hey there,
>> I was using the Deduplication patch with the Solr 1.3 release and
>> everything was working perfectly. Now I upgraded to a nightly build
>> (20th December) to be able to use the new facet algorithm and other
>> stuff and DeDuplication is not working any more. I have followed
>> exactly the same steps to apply the patch to the source code. I am
>> getting this error:
>>
>> WARNING: Error reading data
>> com.mysql.jdbc.CommunicationsException: Communications link failure due
>> to
>> underlying exception:
>>
>> ** BEGIN NESTED EXCEPTION **
>>
>> java.io.EOFException
>>
>> STACKTRACE:
>>
>> java.io.EOFException
>>at com.mysql.jdbc.MysqlIO.readFully(MysqlIO.java:1905)
>>at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:2404)
>>at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2862)
>>at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:771)
>>at com.mysql.jdbc.MysqlIO.nextRow(MysqlIO.java:1289)
>>at
>> com.mysql.jdbc.RowDataDynamic.nextRecord(RowDataDynamic.java:362)
>>at com.mysql.jdbc.RowDataDynamic.next(RowDataDynamic.java:352)
>>at com.mysql.jdbc.ResultSet.next(ResultSet.java:6144)
>>at
>> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.hasnext(JdbcDataSource.java:294)
>>at
>> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.access$400(JdbcDataSource.java:189)
>>at
>> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator$1.hasNext(JdbcDataSource.java:225)
>>at
>> org.apache.solr.handler.dataimport.EntityProcessorBase.getNext(EntityProcessorBase.java:229)
>>at
>> org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(SqlEntityProcessor.java:76)
>>at
>> org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:351)
>>at
>> org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:193)
>>at
>> org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:144)
>>at
>> org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:334)
>>at
>> org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:407)
>>at
>> org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:388)
>>
>>
>> ** END NESTED EXCEPTION **
>> Last packet sent to the server was 202481 ms ago.
>>at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:2563)
>>at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2862)
>>at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:771)
>>at com.mysql.jdbc.MysqlIO.nextRow(MysqlIO.java:1289)
>>at
>> com.mysql.jdbc.RowDataDynamic.nextRecord(RowDataDynamic.java:362)
>>at com.mysql.jdbc.RowDataDynamic.next(RowDataDynamic.java:352)
>>at com.mysql.jdbc.ResultSet.next(ResultSet.java:6144)
>>at
>> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.hasnext(JdbcDataSource.java:294)
>>at
>> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.access$400(JdbcDataSource.java:189)
>>at
>> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator$1.hasNext(JdbcDataSource.java:225)
>>at
>> org.apache.solr.handler.dataimport.EntityProcessorBase.getNext(EntityProcessorBase.java:229)
>>at
>> org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(SqlEntityProcessor.java:76)
>>at
>> org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:351)
>>at
>> org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:193)
>>at
>> org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:144)
>>at
>> org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:334)
>>at
>> org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:407)
>>at
>> org.apache.s

Re: synonyms.txt file updated frequently

2009-01-05 Thread Alexander Ramos Jardim
2009/1/3 Grant Ingersoll 

>
> On Jan 2, 2009, at 10:25 AM, Alexander Ramos Jardim wrote:
>
>  Grant,
>>
>
>
>> 2. SynonymTokenFilterFactory does the "synonyms.txt" parse and creates the
>> SynonymTokenFilter instance. If I want the SynonymTokenFilter to reload
>> synonyms.txt file from time to time, I will need to put the file load and
>> parsing strategies inside my new TokenFilter, right?
>>
>
> I think it can go in the SynonymFilterFactory.
>
>
How will I make SynonymFilterFactory aware of the file changes, and how will
it destroy every filter that is using that file and recreate it?
As far as I understand, the SynonymFilterFactory does not know the
SynonymFilter instances it creates, as it isn't called all the time, so I
wouldn't have an efficient way to apply my "timer".
Just let me know if I am misunderstanding something.


>
>
>
>>
>> 2008/12/30 Grant Ingersoll 
>>
>>  I'd probably write a new TokenFilter that was aware of the reload policy
>>> (in a generic way) such that I didn't have to go through a whole core
>>> reload
>>> every time.  Are you just using them during query time or also during
>>> indexing?
>>>
>>> -Grant
>>>
>>>
>>> On Dec 30, 2008, at 8:12 AM, Alexander Ramos Jardim wrote:
>>>
>>> Hello guys,
>>>

 As the title suggests, I must update my synonyms.txt file frequently. What
 is the best approach? Should I send a commit after the file is updated?
 Does Solr need to be restarted after the file changes?

 --
 Alexander Ramos Jardim


>>> --
>>> Grant Ingersoll
>>>
>>> Lucene Helpful Hints:
>>> http://wiki.apache.org/lucene-java/BasicsOfPerformance
>>> http://wiki.apache.org/lucene-java/LuceneFAQ
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>
>> --
>> Alexander Ramos Jardim
>>
>
> --
> Grant Ingersoll
>
> Lucene Helpful Hints:
> http://wiki.apache.org/lucene-java/BasicsOfPerformance
> http://wiki.apache.org/lucene-java/LuceneFAQ
>
>
>
>
>
>
>
>
>
>
>


-- 
Alexander Ramos Jardim


Re: Spell Checker Reloading Issue

2009-01-05 Thread Navdeep

Hi 

Thanks for your response. 
Please find attached:
1) schema.xml and solrconfig.xml

In the solrconfig.xml file, we are changing the parts below.

PART 1:

<requestHandler name="dismaxrequest" class="solr.SearchHandler">
  <lst name="defaults">
    <str name="spellcheck.onlyMorePopular">false</str>
    <str name="spellcheck.extendedResults">false</str>
    <str name="spellcheck.count">10</str>
    <str name="echoParams">explicit</str>
    <float name="tie">0.01</float>
    <str name="qf">statusName_product_s^1.0 productId_product_s^1.0
      iSBN10_product_s^1.0 iSBN13_product_s^1.0 prdMainTitle_product_s^1.0
      prdKeywords_product_s^1.0 productDescription_product_s^1.0
      prdMainSubTitle_product_s^1.0 contentTypeId_product_s^1.0
      editionTypeId_product_s^1.0 statusId_product_s^1.0 formatId_product_s^1.0
      audienceId_product_s^1.0 eraId_product_s^1.0 extentTypeId_product_s^1.0
      divisionId_product_s^1.0 productPrice_product_s^1.0 basePrice_product_s^1.0
      catalogPrice_product_s^1.0 editionName_product_s^1.0
      productSource_product_s^1.0 ageRange_product_s^1.0
      prdPublishingDate_product_s^1.0 productCopyright_product_s^1.0
      productExtentName_product_s^1.0 parentTaxonomy_product_s^1.0
      parentGroup_product_s^1.0 IndexId_s^1.0 productURL_product_s^1.0
      websiteURL_product_s^1.0 productContributors_product_s^1.0
      relatedFiles_product_s^1.0 relatedLinks_product_s^1.0 awards_product_s^1.0
      imprints_product_s^1.0 product_product_s^1.0 documents_product_s^1.0
      taxonomyPathElement_product_s^1.0</str>
    <str name="bq">english^90 hindi^123 Glorious^2000 highlighting^1000
      maths^100 ab^12 erer^4545</str>
    <str name="fl">*,score</str>
  </lst>
  <arr name="last-components">
    <str>spellcheck</str>
  </arr>
</requestHandler>

PART 2:

<searchComponent name="spellcheck" class="solr.SpellCheckComponent">
  <str name="queryAnalyzerFieldType">textSpell</str>
  <lst name="spellchecker">
    <str name="name">default</str>
    <str name="classname">solr.spelling.FileBasedSpellChecker</str>
    <str name="sourceLocation">./spellings.txt</str>
    <str name="characterEncoding">UTF-8</str>
    <str name="spellcheckIndexDir">./spellcheckerFile</str>
    <str name="accuracy">0.7</str>
  </lst>
</searchComponent>

Thanks 
Navdeep




Grant Ingersoll-6 wrote:
> 
> Can you share your configuration, or at least the relevant pieces?
> 
> -Grant
> On Jan 5, 2009, at 9:24 AM, Navdeep wrote:
> 
>>
>> Hi all
>>
>> we are facing an issue in spell checker with the Solr server. We are
>> changing the below given attributes of the SolrConfig.xml file:
>>
>> 1) Accuracy
>> 2) Number of Suggestions
>>
>> we are rebuilding solr indexes using "spellcheck.build=true" :
>> URL used for POST_SOLR_URL=
>> "select? 
>> q 
>> = 
>> *:*&spellcheck 
>> .q=flavro&spellcheck=true&spellcheck.build=true&qt=dismaxrequest"
>>
>> After performing the above steps, when we are trying to perform the
>> final search for keyword, it is not working.
>> Please share your thoughts on this issue.
>>
>> -- 
>> View this message in context:
>> http://www.nabble.com/Spell-Checker-Reloading-Issue-tp21291873p21291873.html
>> Sent from the Solr - User mailing list archive at Nabble.com.
>>
> 
> --
> Grant Ingersoll
> 
> Lucene Helpful Hints:
> http://wiki.apache.org/lucene-java/BasicsOfPerformance
> http://wiki.apache.org/lucene-java/LuceneFAQ
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
http://www.nabble.com/file/p21292901/schema.xml schema.xml 
http://www.nabble.com/file/p21292901/solrconfig.xml solrconfig.xml 
-- 
View this message in context: 
http://www.nabble.com/Spell-Checker-Reloading-Issue-tp21291873p21292901.html
Sent from the Solr - User mailing list archive at Nabble.com.



Re: Issue with Java Client code

2009-01-05 Thread Erik Hatcher

On Jan 5, 2009, at 8:25 AM, Kalidoss MM wrote:

Is it possible to issue the commit to the Solr Server from that Java code
itself?


Of course... using CommonsHttpSolrServer#commit

Erik
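
For SolrJ 1.3 that looks like the following (the URL and core name are illustrative, matching the Core0 setup described in this thread):

    import org.apache.solr.client.solrj.SolrServer;
    import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;

    public class RemoteCommit {
        public static void main(String[] args) throws Exception {
            // point at the jetty-hosted Solr core, not the embedded server
            SolrServer solr =
                new CommonsHttpSolrServer("http://localhost:8983/solr/Core0");
            solr.commit(); // tells the jetty instance to reopen its searcher
        }
    }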





I have tried the same by issuing the command in the terminal
(/solr/bin/./commit) and it worked..

Please let me know: is it possible to do the same in Java code itself?


kalidoss.m,

On Mon, Jan 5, 2009 at 6:18 PM, Erik Hatcher >wrote:




On Jan 5, 2009, at 7:33 AM, Kalidoss MM wrote:

 We have created a Java EmbeddedSolrServer client. I am able to
add, delete, and update the Solr content - at the same time I can't
search the updated content from the running Solr (jetty) web
interface.

 My requirement is: all searches need to happen from/by the running web
Solr (jetty, 8983) and all writes should happen from the Java client
code.

 Both (jetty and the Java client) are using 'Core0' as the core name, and
both the data directory, schema, and solrconfig are the same. - is there
any fix available??


 Case1:
 1) Solr started on port 8983 as Core0,
 2) ran a Java client (Core0) to add one record, say "hitest", with
 commit,
 3) when I search for hitest, I am not getting any result,
 4) after I restart Solr (8983) and search for 'hitest' I am getting
 the result.

   - both Solr and the Java client are using the same data directory,
     schema.xml, solrconfig.xml
   fyi: even this Java client works when Solr is not started
  fyi: even this java client is working when solr is not started


 Is it possible to handle the read/search by the web interface, and all
 writes from the Java client (without HTTP)?



You'll need to issue the <commit/> to the Solr server (not the embedded
one) for it to take place there.  A commit to EmbeddedSolrServer will make
newly added documents visible through _that_ SolrServer, but not to any
other process (such as Solr via jetty) pointing at the Lucene index.

  Erik






Subscribe Me

2009-01-05 Thread kalidoss


Thanks,
kalidoss.m,




Re: collectionDistribution vs SolrReplication

2009-01-05 Thread Noble Paul നോബിള്‍ नोब्ळ्
The default IndexDeletionPolicy keeps only the last commit
(KeepOnlyLastCommitDeletionPolicy). Files belonging to older commits
are removed. If the files are needed longer for replication, they are
leased. The lease is extended 10 secs at a time. Once all the slaves
have copied, the lease is no longer extended and the files are purged.

In the snapshot-based system, unless the snapshots are deleted from
the file system, the old files will continue to live on the disk.
--Noble
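
A rough sketch of that lease idea (illustrative only, not Solr's actual deletion-policy class; Lucene's IndexDeletionPolicy interface is real, but the reservation bookkeeping here is an assumption):

    import java.util.List;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;
    import org.apache.lucene.index.IndexCommit;
    import org.apache.lucene.index.IndexDeletionPolicy;

    public class LeasingDeletionPolicy implements IndexDeletionPolicy {
        // segments file name -> wall-clock time until which the commit is leased
        private final Map<String, Long> leases = new ConcurrentHashMap<String, Long>();

        /** Replication would call this, extending the lease 10 secs at a time. */
        public void reserve(String segmentsFileName) {
            leases.put(segmentsFileName, System.currentTimeMillis() + 10000L);
        }

        public void onInit(List commits) { onCommit(commits); }

        public void onCommit(List commits) {
            long now = System.currentTimeMillis();
            // the last element is the newest commit; always keep it
            for (int i = 0; i < commits.size() - 1; i++) {
                IndexCommit commit = (IndexCommit) commits.get(i);
                Long leasedUntil = leases.get(commit.getSegmentsFileName());
                if (leasedUntil == null || leasedUntil < now) {
                    commit.delete(); // lease lapsed (or never taken): purge
                }
            }
        }
    }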

On Mon, Jan 5, 2009 at 6:59 PM, Mark Miller  wrote:
> Noble Paul നോബിള്‍ नोब्ळ् wrote:
>>
>> * SolrReplication does not create snapshots. So you have less cleanup
>> to do. The script-based replication results in more disk space
>> consumption (especially if you do frequent commits)
>>
>
> Doesn't SolrReplication effectively take a snapshot by using a custom
> IndexDeletionPolicy to keep the right index files around? Isn't that
> maintaining a snapshot?
>
> Could you elaborate on the difference Noble?
>
> - Mark
>



-- 
--Noble Paul


Re: Deduplication patch not working in nightly build

2009-01-05 Thread Noble Paul നോബിള്‍ नोब्ळ्
looks like a bug w/ DIH with the recent fixes.
--Noble

On Mon, Jan 5, 2009 at 2:36 PM, Marc Sturlese  wrote:
>
> Hey there,
> I was using the Deduplication patch with Solr 1.3 release and everything was
> working perfectly. Now I upgraded to a nigthly build (20th december) to be
> able to use new facet algorithm and other stuff and DeDuplication is not
> working any more. I have followed exactly the same steps to apply the patch
> to the source code. I am geting this error:
>
> WARNING: Error reading data
> com.mysql.jdbc.CommunicationsException: Communications link failure due to
> underlying exception:
>
> ** BEGIN NESTED EXCEPTION **
>
> java.io.EOFException
>
> STACKTRACE:
>
> java.io.EOFException
>at com.mysql.jdbc.MysqlIO.readFully(MysqlIO.java:1905)
>at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:2404)
>at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2862)
>at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:771)
>at com.mysql.jdbc.MysqlIO.nextRow(MysqlIO.java:1289)
>at com.mysql.jdbc.RowDataDynamic.nextRecord(RowDataDynamic.java:362)
>at com.mysql.jdbc.RowDataDynamic.next(RowDataDynamic.java:352)
>at com.mysql.jdbc.ResultSet.next(ResultSet.java:6144)
>at
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.hasnext(JdbcDataSource.java:294)
>at
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.access$400(JdbcDataSource.java:189)
>at
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator$1.hasNext(JdbcDataSource.java:225)
>at
> org.apache.solr.handler.dataimport.EntityProcessorBase.getNext(EntityProcessorBase.java:229)
>at
> org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(SqlEntityProcessor.java:76)
>at
> org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:351)
>at
> org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:193)
>at
> org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:144)
>at
> org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:334)
>at
> org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:407)
>at
> org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:388)
>
>
> ** END NESTED EXCEPTION **
> Last packet sent to the server was 202481 ms ago.
>at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:2563)
>at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2862)
>at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:771)
>at com.mysql.jdbc.MysqlIO.nextRow(MysqlIO.java:1289)
>at com.mysql.jdbc.RowDataDynamic.nextRecord(RowDataDynamic.java:362)
>at com.mysql.jdbc.RowDataDynamic.next(RowDataDynamic.java:352)
>at com.mysql.jdbc.ResultSet.next(ResultSet.java:6144)
>at
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.hasnext(JdbcDataSource.java:294)
>at
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.access$400(JdbcDataSource.java:189)
>at
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator$1.hasNext(JdbcDataSource.java:225)
>at
> org.apache.solr.handler.dataimport.EntityProcessorBase.getNext(EntityProcessorBase.java:229)
>at
> org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(SqlEntityProcessor.java:76)
>at
> org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:351)
>at
> org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:193)
>at
> org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:144)
>at
> org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:334)
>at
> org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:407)
>at
> org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:388)
> Jan 5, 2009 10:06:16 AM org.apache.solr.handler.dataimport.JdbcDataSource
> logError
> WARNING: Exception while closing result set
> com.mysql.jdbc.CommunicationsException: Communications link failure due to
> underlying exception:
>
> ** BEGIN NESTED EXCEPTION **
>
> java.io.EOFException
>
> STACKTRACE:
>
> java.io.EOFException
>at com.mysql.jdbc.MysqlIO.readFully(MysqlIO.java:1905)
>at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:2351)
>at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2862)
>at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:771)
>at com.mysql.jdbc.MysqlIO.nextRow(MysqlIO.java:1289)
>at com.mysql.jdbc.RowDataDynamic.nextRecord(RowDataDynamic.java:362)
>at com.mysql.jdbc.RowDataDynamic.next(RowDataDynamic.java:352)
>at com.mysql.jdbc.RowDataDynamic.close(RowDataDynamic.java:150)
>at com.mysql.jdbc.R

Re: collectionDistribution vs SolrReplication

2009-01-05 Thread Jacob Singh
Has there been a discussion anywhere about a "binary log" style
replication scheme (a la MySQL)? Wherein every write request goes to
the master, and the slaves read in a queue of the requests and
update themselves one record at a time instead of wholesale? Or is
this just not worth the development time?

Best,
Jacob

On Mon, Jan 5, 2009 at 10:26 AM, Noble Paul നോബിള്‍ नोब्ळ्
 wrote:
> The default IndexDeletionPolicy keeps only the last commit
> (KeepOnlyLastCommitDeletionPolicy). Files belonging to older commits
> are removed. If the files are needed longer for replication, they are
> leased. The lease is extended 10 secs at a time. Once all the slaves
> have copied, the lease is no longer extended and the files are purged.
>
> In the snapshot-based system, unless the snapshots are deleted from
> the file system, the old files will continue to live on the disk.
> --Noble
>
> On Mon, Jan 5, 2009 at 6:59 PM, Mark Miller  wrote:
>> Noble Paul നോബിള്‍ नोब्ळ् wrote:
>>>
>>> * SolrReplication does not create snapshots. So you have less cleanup
>>> to do. The script-based replication results in more disk space
>>> consumption (especially if you do frequent commits)
>>>
>>
>> Doesn't SolrReplication effectively take a snapshot by using a custom
>> IndexDeletionPolicy to keep the right index files around? Isn't that
>> maintaining a snapshot?
>>
>> Could you elaborate on the difference Noble?
>>
>> - Mark
>>
>
>
>
> --
> --Noble Paul
>



-- 

+1 510 277-0891 (o)
+91  33 7458 (m)

web: http://pajamadesign.com

Skype: pajamadesign
Yahoo: jacobsingh
AIM: jacobsingh
gTalk: jacobsi...@gmail.com


Re: Deduplication patch not working in nightly build

2009-01-05 Thread Noble Paul നോബിള്‍ नोब्ळ्
I guess the indexing of a doc is taking too long (maybe because of
the de-dup patch) and the resultset gets closed automatically (timed
out).
--Noble

On Mon, Jan 5, 2009 at 5:14 PM, Marc Sturlese  wrote:
>
> Doing this fix I get the same error :(
>
> I am going to try to set up the last nightly build... let's see if I have
> better luck.
>
> The thing is it stops indexing at around doc number 150,000... and gives me
> that MySQL exception error... Without the DeDuplication patch I can index 2
> million docs without problems...
>
> I am pretty lost with this... :(
>
>
> Shalin Shekhar Mangar wrote:
>>
>> Yes I meant the 05/01/2009 build. The fix is a one-line change.
>>
>> Add the following as the last line of DataConfig.Entity.clearCache():
>> dataSrc = null;
>>
>>
>>
>> On Mon, Jan 5, 2009 at 4:22 PM, Marc Sturlese
>> wrote:
>>
>>>
>>> Shalin you mean I should test the 05/01/2009 nightly? Maybe with this one
>>> it works? If the fix you did is not really big, can you tell me where in
>>> the source it is and what it is for? (I have been debugging and tracing
>>> the dataimporthandler source a lot and I would like to know what the
>>> improvement is about, if it is not a problem...)
>>>
>>> Thanks!
>>>
>>>
>>> Shalin Shekhar Mangar wrote:
>>> >
>>> > Marc, I've just committed a fix which may have caused the bug. Can you
>>> use
>>> > svn trunk (or the next nightly build) and confirm?
>>> >
>>> > On Mon, Jan 5, 2009 at 3:10 PM, Noble Paul നോബിള്‍ नोब्ळ् <
>>> > noble.p...@gmail.com> wrote:
>>> >
>>> >> looks like a bug w/ DIH with the recent fixes.
>>> >> --Noble
>>> >>
>>> >> On Mon, Jan 5, 2009 at 2:36 PM, Marc Sturlese
>>> 
>>> >> wrote:
>>> >> >
>>> >> > Hey there,
>>> >> > I was using the Deduplication patch with the Solr 1.3 release and
>>> >> > everything was working perfectly. Now I upgraded to a nightly build
>>> >> > (20th December) to be able to use the new facet algorithm and other
>>> >> > stuff and DeDuplication is not working any more. I have followed
>>> >> > exactly the same steps to apply the patch to the source code. I am
>>> >> > getting this error:
>>> >> >
>>> >> > WARNING: Error reading data
>>> >> > com.mysql.jdbc.CommunicationsException: Communications link failure
>>> due
>>> >> to
>>> >> > underlying exception:
>>> >> >
>>> >> > ** BEGIN NESTED EXCEPTION **
>>> >> >
>>> >> > java.io.EOFException
>>> >> >
>>> >> > STACKTRACE:
>>> >> >
>>> >> > java.io.EOFException
>>> >> >at com.mysql.jdbc.MysqlIO.readFully(MysqlIO.java:1905)
>>> >> >at
>>> com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:2404)
>>> >> >at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2862)
>>> >> >at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:771)
>>> >> >at com.mysql.jdbc.MysqlIO.nextRow(MysqlIO.java:1289)
>>> >> >at
>>> >> com.mysql.jdbc.RowDataDynamic.nextRecord(RowDataDynamic.java:362)
>>> >> >at
>>> com.mysql.jdbc.RowDataDynamic.next(RowDataDynamic.java:352)
>>> >> >at com.mysql.jdbc.ResultSet.next(ResultSet.java:6144)
>>> >> >at
>>> >> >
>>> >>
>>> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.hasnext(JdbcDataSource.java:294)
>>> >> >at
>>> >> >
>>> >>
>>> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.access$400(JdbcDataSource.java:189)
>>> >> >at
>>> >> >
>>> >>
>>> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator$1.hasNext(JdbcDataSource.java:225)
>>> >> >at
>>> >> >
>>> >>
>>> org.apache.solr.handler.dataimport.EntityProcessorBase.getNext(EntityProcessorBase.java:229)
>>> >> >at
>>> >> >
>>> >>
>>> org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(SqlEntityProcessor.java:76)
>>> >> >at
>>> >> >
>>> >>
>>> org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:351)
>>> >> >at
>>> >> >
>>> >>
>>> org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:193)
>>> >> >at
>>> >> >
>>> >>
>>> org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:144)
>>> >> >at
>>> >> >
>>> >>
>>> org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:334)
>>> >> >at
>>> >> >
>>> >>
>>> org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:407)
>>> >> >at
>>> >> >
>>> >>
>>> org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:388)
>>> >> >
>>> >> >
>>> >> > ** END NESTED EXCEPTION **
>>> >> > Last packet sent to the server was 202481 ms ago.
>>> >> >at
>>> com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:2563)
>>> >> >at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2862)
>>> >> >at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:771)
>>> >> >at com.mysql.jdbc.MysqlIO.nextRow(MysqlIO.java:1289)
>>> >> >at
>>> >> com.mysql.jdbc.RowDataDynamic.nextRecord(RowDataDynamic.java:362)
>>> >> >at
>>> 

Re: Deduplication patch not working in nightly build

2009-01-05 Thread Marc Sturlese

Shalin you mean I should test the 05/01/2009 nightly? Maybe with this one
it works? If the fix you did is not really big, can you tell me where in the
source it is and what it is for? (I have been debugging and tracing the
dataimporthandler source a lot and I would like to know what the improvement
is about, if it is not a problem...)

Thanks!


Shalin Shekhar Mangar wrote:
> 
> Marc, I've just committed a fix which may have caused the bug. Can you use
> svn trunk (or the next nightly build) and confirm?
> 
> On Mon, Jan 5, 2009 at 3:10 PM, Noble Paul നോബിള്‍ नोब्ळ् <
> noble.p...@gmail.com> wrote:
> 
>> looks like a bug w/ DIH with the recent fixes.
>> --Noble
>>
>> On Mon, Jan 5, 2009 at 2:36 PM, Marc Sturlese 
>> wrote:
>> >
>> > Hey there,
>> > I was using the Deduplication patch with the Solr 1.3 release and
>> > everything was working perfectly. Now I upgraded to a nightly build
>> > (20th December) to be able to use the new facet algorithm and other
>> > stuff and DeDuplication is not working any more. I have followed
>> > exactly the same steps to apply the patch to the source code. I am
>> > getting this error:
>> >
>> > WARNING: Error reading data
>> > com.mysql.jdbc.CommunicationsException: Communications link failure due
>> to
>> > underlying exception:
>> >
>> > ** BEGIN NESTED EXCEPTION **
>> >
>> > java.io.EOFException
>> >
>> > STACKTRACE:
>> >
>> > java.io.EOFException
>> >at com.mysql.jdbc.MysqlIO.readFully(MysqlIO.java:1905)
>> >at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:2404)
>> >at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2862)
>> >at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:771)
>> >at com.mysql.jdbc.MysqlIO.nextRow(MysqlIO.java:1289)
>> >at
>> com.mysql.jdbc.RowDataDynamic.nextRecord(RowDataDynamic.java:362)
>> >at com.mysql.jdbc.RowDataDynamic.next(RowDataDynamic.java:352)
>> >at com.mysql.jdbc.ResultSet.next(ResultSet.java:6144)
>> >at
>> >
>> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.hasnext(JdbcDataSource.java:294)
>> >at
>> >
>> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.access$400(JdbcDataSource.java:189)
>> >at
>> >
>> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator$1.hasNext(JdbcDataSource.java:225)
>> >at
>> >
>> org.apache.solr.handler.dataimport.EntityProcessorBase.getNext(EntityProcessorBase.java:229)
>> >at
>> >
>> org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(SqlEntityProcessor.java:76)
>> >at
>> >
>> org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:351)
>> >at
>> >
>> org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:193)
>> >at
>> >
>> org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:144)
>> >at
>> >
>> org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:334)
>> >at
>> >
>> org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:407)
>> >at
>> >
>> org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:388)
>> >
>> >
>> > ** END NESTED EXCEPTION **
>> > Last packet sent to the server was 202481 ms ago.
>> >at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:2563)
>> >at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2862)
>> >at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:771)
>> >at com.mysql.jdbc.MysqlIO.nextRow(MysqlIO.java:1289)
>> >at
>> com.mysql.jdbc.RowDataDynamic.nextRecord(RowDataDynamic.java:362)
>> >at com.mysql.jdbc.RowDataDynamic.next(RowDataDynamic.java:352)
>> >at com.mysql.jdbc.ResultSet.next(ResultSet.java:6144)
>> >at
>> >
>> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.hasnext(JdbcDataSource.java:294)
>> >at
>> >
>> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.access$400(JdbcDataSource.java:189)
>> >at
>> >
>> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator$1.hasNext(JdbcDataSource.java:225)
>> >at
>> >
>> org.apache.solr.handler.dataimport.EntityProcessorBase.getNext(EntityProcessorBase.java:229)
>> >at
>> >
>> org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(SqlEntityProcessor.java:76)
>> >at
>> >
>> org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:351)
>> >at
>> >
>> org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:193)
>> >at
>> >
>> org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:144)
>> >at
>> >
>> org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:334)
>> >at
>> >
>> org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:407)
>> >at
>> >
>> org.apache.solr.handler.dataimport.DataImporter$1

Questions about UUID type

2009-01-05 Thread Dingding Ye
Hi.

I'm confused by the UUID type comment. It says:

 /**
   * Generates a UUID if val is either null, empty or "NEW".
   *
   * Otherwise it behaves much like a StrField but checks that the value
given
   * is indeed a valid UUID.
   *
   * @param val The value of the field
   * @see org.apache.solr.schema.FieldType#toInternal(java.lang.String)
   */

However, I found that if I don't specify the field, it reports the
exception:

org.apache.solr.common.SolrException: Document [test] missing required
field: id

Sure enough, when I set it to "NEW", it works as expected.

I want to know: is this the expected behavior, or possibly a bug?

The relevant code is:

    for (SchemaField field : schema.getRequiredFields()) {
      if (out.getField(field.getName()) == null) {
        // Here because the default value is null. I think we should also
        // judge whether it's a UUID field.
        if (field.getDefaultValue() != null) {
          out.add(field.createField(field.getDefaultValue(), 1.0f));
        } else {
          String id = schema.printableUniqueKey(out);
          String msg = "Document [" + id + "] missing required field: "
              + field.getName();
          throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, msg);
        }
      }
    }
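
A sketch of the guard being suggested (illustrative, not committed code; UUIDField here means org.apache.solr.schema.UUIDField, and the fragment slots into the loop quoted above):

    if (field.getDefaultValue() != null) {
        out.add(field.createField(field.getDefaultValue(), 1.0f));
    } else if (field.getType() instanceof UUIDField) {
        // no value supplied: UUIDField.toInternal() turns "NEW" into a
        // freshly generated UUID instead of failing the document
        out.add(field.createField("NEW", 1.0f));
    } else {
        String id = schema.printableUniqueKey(out);
        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST,
            "Document [" + id + "] missing required field: " + field.getName());
    }

A schema-level workaround, assuming the field is declared with the uuid type, is to give it default="NEW" in schema.xml so the existing default-value branch fires.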

Thanks for help.

Best regards.

sishen


Re: Spell Checker Reloading Issue

2009-01-05 Thread Grant Ingersoll

Can you share your configuration, or at least the relevant pieces?

-Grant
On Jan 5, 2009, at 9:24 AM, Navdeep wrote:



Hi all

we are facing an issue in spell checker with the Solr server. We are
changing the below given attributes of the SolrConfig.xml file:

1) Accuracy
2) Number of Suggestions

we are rebuilding solr indexes using "spellcheck.build=true":
URL used for POST_SOLR_URL=
"select?q=*:*&spellcheck.q=flavro&spellcheck=true&spellcheck.build=true&qt=dismaxrequest"

After performing the above steps, when we are trying to perform the final
search for keyword, it is not working.
Please share your thoughts on this issue.

--
View this message in context: 
http://www.nabble.com/Spell-Checker-Reloading-Issue-tp21291873p21291873.html
Sent from the Solr - User mailing list archive at Nabble.com.



--
Grant Ingersoll

Lucene Helpful Hints:
http://wiki.apache.org/lucene-java/BasicsOfPerformance
http://wiki.apache.org/lucene-java/LuceneFAQ












Issue with Java Client code

2009-01-05 Thread Kalidoss MM
 Hi,

 We have created a Java EmbeddedSolrServer client. I am able to
add, delete, and update the Solr content - at the same time I can't
search the updated content from the running Solr (jetty) web
interface.

My requirement is: all searches need to happen from/by the running web
Solr (jetty, 8983) and all writes should happen from the Java client code.

Both (jetty and the Java client) are using 'Core0' as the core name, and both
the data directory, schema, and solrconfig are the same. - is there any fix
available??

Case1:
1) Solr started on port 8983 as Core0,
2) ran a Java client (Core0) to add one record, say "hitest", with
commit,
3) when I search for hitest, I am not getting any result,
4) after I restart Solr (8983) and search for 'hitest' I am getting
the result.

  - both Solr and the Java client are using the same data directory,
    schema.xml, solrconfig.xml
  fyi: even this Java client works when Solr is not started

Is it possible to handle the read/search by the web interface, and all writes
from the Java client (without HTTP)?

Thanks in advance,
Kalidoss.m,


Re: collectionDistribution vs SolrReplication

2009-01-05 Thread Mark Miller

Noble Paul നോബിള്‍ नोब्ळ् wrote:

* SolrReplication does not create snapshots. So you have less cleanup
to do. The script-based replication results in more disk space
consumption (especially if you do frequent commits)
  
Doesn't SolrReplication effectively take a snapshot by using a custom 
IndexDeletionPolicy to keep the right index files around? Isn't that 
maintaining a snapshot?


Could you elaborate on the difference Noble?

- Mark


Re: Deduplication patch not working in nightly build

2009-01-05 Thread Shalin Shekhar Mangar
Marc, I've just committed a fix which may have caused the bug. Can you use
svn trunk (or the next nightly build) and confirm?

On Mon, Jan 5, 2009 at 3:10 PM, Noble Paul നോബിള്‍ नोब्ळ् <
noble.p...@gmail.com> wrote:

> looks like a bug w/ DIH with the recent fixes.
> --Noble
>
> On Mon, Jan 5, 2009 at 2:36 PM, Marc Sturlese 
> wrote:
> >
> > Hey there,
> > I was using the Deduplication patch with Solr 1.3 release and everything
> was
> > working perfectly. Now I upgraded to a nigthly build (20th december) to
> be
> > able to use new facet algorithm and other stuff and DeDuplication is not
> > working any more. I have followed exactly the same steps to apply the
> patch
> > to the source code. I am geting this error:
> >
> > WARNING: Error reading data
> > com.mysql.jdbc.CommunicationsException: Communications link failure due
> to
> > underlying exception:
> >
> > ** BEGIN NESTED EXCEPTION **
> >
> > java.io.EOFException
> >
> > STACKTRACE:
> >
> > java.io.EOFException
> >at com.mysql.jdbc.MysqlIO.readFully(MysqlIO.java:1905)
> >at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:2404)
> >at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2862)
> >at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:771)
> >at com.mysql.jdbc.MysqlIO.nextRow(MysqlIO.java:1289)
> >at
> com.mysql.jdbc.RowDataDynamic.nextRecord(RowDataDynamic.java:362)
> >at com.mysql.jdbc.RowDataDynamic.next(RowDataDynamic.java:352)
> >at com.mysql.jdbc.ResultSet.next(ResultSet.java:6144)
> >at
> >
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.hasnext(JdbcDataSource.java:294)
> >at
> >
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.access$400(JdbcDataSource.java:189)
> >at
> >
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator$1.hasNext(JdbcDataSource.java:225)
> >at
> >
> org.apache.solr.handler.dataimport.EntityProcessorBase.getNext(EntityProcessorBase.java:229)
> >at
> >
> org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(SqlEntityProcessor.java:76)
> >at
> >
> org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:351)
> >at
> >
> org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:193)
> >at
> >
> org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:144)
> >at
> >
> org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:334)
> >at
> >
> org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:407)
> >at
> >
> org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:388)
> >
> >
> > ** END NESTED EXCEPTION **
> > Last packet sent to the server was 202481 ms ago.
> >at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:2563)
> >at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2862)
> >at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:771)
> >at com.mysql.jdbc.MysqlIO.nextRow(MysqlIO.java:1289)
> >at
> com.mysql.jdbc.RowDataDynamic.nextRecord(RowDataDynamic.java:362)
> >at com.mysql.jdbc.RowDataDynamic.next(RowDataDynamic.java:352)
> >at com.mysql.jdbc.ResultSet.next(ResultSet.java:6144)
> >at
> >
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.hasnext(JdbcDataSource.java:294)
> >at
> >
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.access$400(JdbcDataSource.java:189)
> >at
> >
> org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator$1.hasNext(JdbcDataSource.java:225)
> >at
> >
> org.apache.solr.handler.dataimport.EntityProcessorBase.getNext(EntityProcessorBase.java:229)
> >at
> >
> org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(SqlEntityProcessor.java:76)
> >at
> >
> org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:351)
> >at
> >
> org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:193)
> >at
> >
> org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:144)
> >at
> >
> org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:334)
> >at
> >
> org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:407)
> >at
> >
> org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:388)
> > Jan 5, 2009 10:06:16 AM org.apache.solr.handler.dataimport.JdbcDataSource
> > logError
> > WARNING: Exception while closing result set
> > com.mysql.jdbc.CommunicationsException: Communications link failure due
> to
> > underlying exception:
> >
> > ** BEGIN NESTED EXCEPTION **
> >
> > java.io.EOFException
> >
> > STACKTRACE:
> >
> > java.io.EOFException
> >at com.mysql.jdbc.MysqlIO.readFully(MysqlIO.java:1905)
> >at com.mysql.jdbc.MysqlIO.reuseAndReadPa

Spell Checker Reloading Issue

2009-01-05 Thread Navdeep

Hi all 

we are facing an issue in spell checker with solr server. We are changing
the below given attributes of SolrConfig.xml file 

1) Accuracy 
2) Number of Suggestions

we are rebuilding solr indexes using "spellcheck.build=true" :
URL used for POST_SOLR_URL=
"select?q=*:*&spellcheck.q=flavro&spellcheck=true&spellcheck.build=true&qt=dismaxrequest"

After performing the above steps, when we are trying to perform the final
search for keyword, it is not working.
Please share your thoughts on this issue.

-- 
View this message in context: 
http://www.nabble.com/Spell-Checker-Reloading-Issue-tp21291873p21291873.html
Sent from the Solr - User mailing list archive at Nabble.com.



Re: synonyms.txt file updated frequently

2009-01-05 Thread Grant Ingersoll
I haven't fully thought it through, but I was thinking that, in the
create code in the Factory (where it returns the new TokenFilter),
you would simply check to see if the file is new, and if it is, reload
it and recreate the SynonymMap, accounting for threading issues, of
course, and possibly a timing mechanism such that you aren't
constantly reloading.


The old filters will just go away when they are done, just as they  
always do.
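
A minimal sketch of that pattern in plain Java (the interval, the file handling, and the class shape are all illustrative assumptions, not Grant's actual code; a real factory's create() would delegate to something like this):

    import java.io.File;

    // Stat the file at most once per interval, rebuild the shared object when
    // its mtime changes, and publish it with a volatile write so filters that
    // were already built keep using the old copy until they are done.
    public abstract class ReloadingResource<T> {

        private final File file;
        private final long checkIntervalMs; // throttle for lastModified() checks
        private volatile T current;         // what create() would hand out
        private long lastChecked;
        private long lastModified;

        protected ReloadingResource(File file, long checkIntervalMs) {
            this.file = file;
            this.checkIntervalMs = checkIntervalMs;
        }

        /** Subclass does the parse, e.g. building a SynonymMap from synonyms.txt. */
        protected abstract T load(File f);

        public T get() {
            long now = System.currentTimeMillis();
            if (current == null || now - lastChecked > checkIntervalMs) {
                synchronized (this) { // one thread reloads; the rest wait briefly
                    if (current == null || file.lastModified() != lastModified) {
                        lastModified = file.lastModified();
                        current = load(file);
                    }
                    lastChecked = now;
                }
            }
            return current;
        }
    }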




On Jan 5, 2009, at 6:10 AM, Alexander Ramos Jardim wrote:


2009/1/3 Grant Ingersoll 



On Jan 2, 2009, at 10:25 AM, Alexander Ramos Jardim wrote:

Grant,





2. SynonymTokenFilterFactory does the "synonyms.txt" parse and  
creates the
SynonymTokenFilter instance. If I want the SynonymTokenFilter to  
reload
synonyms.txt file from time to time, I will need to put the file  
load and

parsing strategies inside my new TokenFilter, right?



I think it can go in the SynonymFilterFactory.


How will I make SynonymFilterFactory aware of the file changes, and how
will it destroy every filter that is using that file and recreate it?
As far as I understand, the SynonymFilterFactory does not know the
SynonymFilter instances it creates, as it isn't called all the time, so I
wouldn't have an efficient way to apply my "timer".

Just let me know if I am misunderstanding something.








2008/12/30 Grant Ingersoll 

I'd probably write a new TokenFilter that was aware of the reload  
policy
(in a generic way) such that I didn't have to go through a whole  
core

reload
every time.  Are you just using them during query time or also  
during

indexing?

-Grant


On Dec 30, 2008, at 8:12 AM, Alexander Ramos Jardim wrote:

Hello guys,



As the title suggests I must update my synonyms.txt file  
frequently.

What
is
the best approach? Should I send a commit after the file is  
updated?

Does
Solr need to be restarted after the file changes?

--
Alexander Ramos Jardim



--
Grant Ingersoll

Lucene Helpful Hints:
http://wiki.apache.org/lucene-java/BasicsOfPerformance
http://wiki.apache.org/lucene-java/LuceneFAQ














--
Alexander Ramos Jardim



--
Grant Ingersoll

Lucene Helpful Hints:
http://wiki.apache.org/lucene-java/BasicsOfPerformance
http://wiki.apache.org/lucene-java/LuceneFAQ














--
Alexander Ramos Jardim


--
Grant Ingersoll

Lucene Helpful Hints:
http://wiki.apache.org/lucene-java/BasicsOfPerformance
http://wiki.apache.org/lucene-java/LuceneFAQ












Re: Deduplication patch not working in nightly build

2009-01-05 Thread Shalin Shekhar Mangar
Yes, initially I figured that we are accidentally re-using a closed data
source. But Noble has pinned it right. I guess you can try looking into your
JDBC driver's documentation for a setting which keeps the connection
alive for longer.
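
For MySQL's Connector/J, a starting point might look like this (the property and server-variable names are real MySQL knobs, but the values, and whether they cure this particular import, are untested assumptions):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class KeepAliveExample {
        public static void main(String[] args) throws Exception {
            // socketTimeout=0 disables the client-side read timeout;
            // autoReconnect papers over connections dropped between queries
            Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost/mydb?autoReconnect=true&socketTimeout=0",
                "user", "password");
            // give the server more patience with a client that reads the
            // streaming resultset slowly (e.g. while de-dup hashes each doc)
            Statement st = conn.createStatement();
            st.execute("SET SESSION net_write_timeout = 600"); // seconds
            st.close();
            conn.close();
        }
    }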

On Mon, Jan 5, 2009 at 5:29 PM, Noble Paul നോബിള്‍ नोब्ळ् <
noble.p...@gmail.com> wrote:

> I guess the indexing of a doc is taking too long (maybe because of
> the de-dup patch) and the resultset gets closed automatically (timed
> out)
> --Noble
>
> On Mon, Jan 5, 2009 at 5:14 PM, Marc Sturlese 
> wrote:
> >
> > Doing this fix I get the same error :(
> >
> > I am going to try to set up the last nightly build... let's see if I have
> > better luck.
> >
> > The thing is it stops indexing at around doc number 150,000... and gives me
> > that MySQL exception error... Without the DeDuplication patch I can index 2
> > million docs without problems...
> >
> > I am pretty lost with this... :(
> >
> >
> > Shalin Shekhar Mangar wrote:
> >>
> >> Yes I meant the 05/01/2009 build. The fix is a one-line change.
> >>
> >> Add the following as the last line of DataConfig.Entity.clearCache():
> >> dataSrc = null;
> >>
> >>
> >>
> >> On Mon, Jan 5, 2009 at 4:22 PM, Marc Sturlese
> >> wrote:
> >>
> >>>
> >>> Shalin you mean I should test the 05/01/2009 nightly? Maybe with this
> >>> one it works? If the fix you did is not really big, can you tell me
> >>> where in the source it is and what it is for? (I have been debugging
> >>> and tracing the dataimporthandler source a lot and I would like to
> >>> know what the improvement is about, if it is not a problem...)
> >>>
> >>> Thanks!
> >>>
> >>>
> >>> Shalin Shekhar Mangar wrote:
> >>> >
> >>> > Marc, I've just committed a fix which may have caused the bug. Can
> you
> >>> use
> >>> > svn trunk (or the next nightly build) and confirm?
> >>> >
> >>> > On Mon, Jan 5, 2009 at 3:10 PM, Noble Paul നോബിള്‍ नोब्ळ् <
> >>> > noble.p...@gmail.com> wrote:
> >>> >
> >>> >> looks like a bug w/ DIH with the recent fixes.
> >>> >> --Noble
> >>> >>
> >>> >> On Mon, Jan 5, 2009 at 2:36 PM, Marc Sturlese
> >>> 
> >>> >> wrote:
> >>> >> >
> >>> >> > Hey there,
> >>> >> > I was using the Deduplication patch with the Solr 1.3 release and
> >>> >> > everything was working perfectly. Now I upgraded to a nightly build
> >>> >> > (20th December) to be able to use the new facet algorithm and other
> >>> >> > stuff and DeDuplication is not working any more. I have followed
> >>> >> > exactly the same steps to apply the patch to the source code. I am
> >>> >> > getting this error:
> >>> >> >
> >>> >> > [MySQL stack trace omitted; identical to the one in the original message]

Re: Issue with Java Client code

2009-01-05 Thread Kalidoss MM
Is it possible to issue the commit to the Solr server from that Java code
itself?

I have tried the same by issuing the command in a terminal
(/solr/bin/./commit) and it worked.

Please let me know, is it possible to do the same in Java code itself?

kalidoss.m,

On Mon, Jan 5, 2009 at 6:18 PM, Erik Hatcher wrote:

>
> On Jan 5, 2009, at 7:33 AM, Kalidoss MM wrote:
>
>>   We have created a Java EmbeddedSolrServer client. I am able to add,
>> delete, and update the Solr content - at the same time I am not able to
>> search the updated content from the running Solr (jetty) web interface.
>>
>>   My requirement is: all searches need to happen from the running web
>> Solr (jetty, 8983) and all writes should happen from the Java client code.
>>
>>   Both (jetty and the Java client) are using 'Core0' as the core name, and
>> both the data directory, schema, and solrconfig are the same. - is there
>> any fix available??
>>
>>   Case 1:
>>   1) Solr started on port 8983 as Core0,
>>   2) ran a Java client (Core0) to add one record, say "hitest", with commit,
>>   3) when I search for hitest, I am not getting any result,
>>   4) after I restart Solr (8983) and search for 'hitest' I am getting the
>> result.
>>
>>  -
>> - both Solr and the Java client are using the same data directory,
>>   schema.xml, solrconfig.xml
>>   fyi: this Java client even works when Solr is not started
>>
>>   is it possible to handle the read/search by the web interface, and all
>> writes from the Java client (without HTTP)?
>>
>
> You'll need to issue the <commit/> to the Solr server (not the embedded
> one) for it to take place there.  A commit to EmbeddedSolrServer will make
> newly added documents visible through _that_ SolrServer, but not to any
> other process (such as Solr via jetty) pointing at the Lucene index.
>
>Erik
>
>


Re: Issue with Java Client code

2009-01-05 Thread Erik Hatcher


On Jan 5, 2009, at 7:33 AM, Kalidoss MM wrote:
   We have created a Java EmbeddedSolrServer client. I am able to add,
delete, and update the Solr content - at the same time I am not able to
search the updated content from the running Solr (jetty) web interface.

   My requirement is: all searches need to happen from the running web
Solr (jetty, 8983) and all writes should happen from the Java client code.

   Both (jetty and the Java client) are using 'Core0' as the core name, and
both the data directory, schema, and solrconfig are the same. - is there
any fix available??

   Case 1:
   1) Solr started on port 8983 as Core0,
   2) ran a Java client (Core0) to add one record, say "hitest", with commit,
   3) when I search for hitest, I am not getting any result,
   4) after I restart Solr (8983) and search for 'hitest' I am getting
the result.

  -
 - both Solr and the Java client are using the same data directory,
   schema.xml, solrconfig.xml
   fyi: this Java client even works when Solr is not started

   is it possible to handle the read/search by the web interface, and all
writes from the Java client (without HTTP)?


You'll need to issue the <commit/> to the Solr server (not the
embedded one) for it to take place there.  A commit to
EmbeddedSolrServer will make newly added documents visible through
_that_ SolrServer, but not to any other process (such as Solr via
jetty) pointing at the Lucene index.


Erik
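
In SolrJ, committing against the HTTP server rather than the embedded one is
a one-liner; a minimal sketch, assuming the jetty instance runs at the URL
shown and the core is named core0:

import org.apache.solr.client.solrj.SolrServer;
import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;

public class CommitToHttpSolr {
    public static void main(String[] args) throws Exception {
        // talks to the jetty instance over HTTP, not to the embedded server
        SolrServer solr = new CommonsHttpSolrServer("http://localhost:8983/solr/core0");
        solr.commit(); // makes that Solr reopen its searcher and see the new documents
    }
}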



Re: Deduplication patch not working in nightly build

2009-01-05 Thread Shalin Shekhar Mangar
Yes I meant the 05/01/2009 build. The fix is a one-line change.

Add the following as the last line of DataConfig.Entity.clearCache()
dataSrc = null;
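
In context, the change is just the last line here; the surrounding method
body is only a sketch, not the actual DataConfig source:

  // in org.apache.solr.handler.dataimport.DataConfig.Entity
  void clearCache() {
      // ... existing cache-clearing logic ...
      dataSrc = null; // drop the reference so a closed data source is never re-used
  }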



On Mon, Jan 5, 2009 at 4:22 PM, Marc Sturlese wrote:

>
> Shalin, you mean I should test the 05/01/2009 nightly? Maybe with this one
> it works? If the fix you did is not really big, can you tell me where in
> the source it is and what it is for? (I have been debugging and tracing the
> dataimporthandler source a lot and I would like to know what the
> improvement is about, if it is not a problem...)
>
> Thanks!
>
>
> Shalin Shekhar Mangar wrote:
> >
> > Marc, I've just committed a fix for an issue which may have caused the
> > bug. Can you use svn trunk (or the next nightly build) and confirm?
> >
> > On Mon, Jan 5, 2009 at 3:10 PM, Noble Paul നോബിള്‍ नोब्ळ् <
> > noble.p...@gmail.com> wrote:
> >
> >> looks like a bug w/ DIH with the recent fixes.
> >> --Noble
> >>
> >> On Mon, Jan 5, 2009 at 2:36 PM, Marc Sturlese 
> >> wrote:
> >> >
> >> > Hey there,
> >> > I was using the Deduplication patch with the Solr 1.3 release and
> >> > everything was working perfectly. Now I upgraded to a nightly build
> >> > (20th December) to be able to use the new facet algorithm and other
> >> > stuff, and DeDuplication is not working any more. I have followed
> >> > exactly the same steps to apply the patch to the source code. I am
> >> > getting this error:
> >> >
> >> > [MySQL stack trace omitted; identical to the one in the original message]

Re: synonyms.txt file updated frequently

2009-01-05 Thread Alexander Ramos Jardim
2009/1/5 Grant Ingersoll 

> I haven't fully thought it through, but I was thinking that, in the create
> code in the Factory (where it returns the new TokenFilter), you would
> simply check to see if the file is new, and if it is, reload it and recreate
> the SynonymMap, accounting for threading issues, of course, and possibly a
> timing mechanism such that you aren't constantly reloading.
>
> The old filters will just go away when they are done, just as they always
> do.
>
>
So, every time something is queried or indexed, the filters are rebuilt?
Is that right?


>
>
>
> On Jan 5, 2009, at 6:10 AM, Alexander Ramos Jardim wrote:
>
>  2009/1/3 Grant Ingersoll 
>>
>>
>>> On Jan 2, 2009, at 10:25 AM, Alexander Ramos Jardim wrote:
>>>
>>> Grant,
>>>


>>>
>>>  2. SynonymTokenFilterFactory does the "synonyms.txt" parse and creates
>>>  the SynonymTokenFilter instance. If I want the SynonymTokenFilter to
>>>  reload the synonyms.txt file from time to time, I will need to put the
>>>  file load and parsing strategies inside my new TokenFilter, right?


>>> I think it can go in the SynonymFilterFactory.
>>>
>>>
>> How will I make SynonymFilterFactory aware of the file changes, and how
>> will it destroy every filter that is using that file and recreate it?
>> As far as I can understand, the SynonymFilterFactory does not know the
>> SynonymFilter instances it creates, as it isn't called all the time for
>> doing something, so I wouldn't have an efficient way to apply my "timer".
>> Just let me know if I am misunderstanding something.
>>
>>
>>
>>>
>>>
>>>
 2008/12/30 Grant Ingersoll 

> I'd probably write a new TokenFilter that was aware of the reload policy
> (in a generic way) such that I didn't have to go through a whole core
> reload every time.  Are you just using them during query time or also
> during indexing?
>
> -Grant
>
>
> On Dec 30, 2008, at 8:12 AM, Alexander Ramos Jardim wrote:
>
> Hello guys,
>
>
>> As the title suggests I must update my synonyms.txt file frequently.
>> What
>> is
>> the best approach? Should I send a commit after the file is updated?
>> Does
>> Solr need to be restarted after the file changes?
>>
>> --
>> Alexander Ramos Jardim
>>
>>
>>  --
> Grant Ingersoll
>
> Lucene Helpful Hints:
> http://wiki.apache.org/lucene-java/BasicsOfPerformance
> http://wiki.apache.org/lucene-java/LuceneFAQ
>
 --
 Alexander Ramos Jardim


>>> --
>>> Grant Ingersoll
>>>
>>> Lucene Helpful Hints:
>>> http://wiki.apache.org/lucene-java/BasicsOfPerformance
>>> http://wiki.apache.org/lucene-java/LuceneFAQ
>>>
>>
>> --
>> Alexander Ramos Jardim
>>
>
> --
> Grant Ingersoll
>
> Lucene Helpful Hints:
> http://wiki.apache.org/lucene-java/BasicsOfPerformance
> http://wiki.apache.org/lucene-java/LuceneFAQ


-- 
Alexander Ramos Jardim
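
A rough sketch of the check-and-swap pattern Grant describes (the class and
method names are hypothetical and the real SynonymMap type is stubbed with
Object; the factory's create() would call currentMap() when building each
new filter):

import java.io.File;

public class ReloadingSynonymSource {
    private static final long CHECK_INTERVAL_MS = 60 * 1000; // throttle file checks
    private final File synonymsFile;
    private volatile long lastChecked;
    private volatile long lastLoaded;
    private volatile Object synonymMap; // stands in for the real SynonymMap

    public ReloadingSynonymSource(File synonymsFile) {
        this.synonymsFile = synonymsFile;
        reload();
    }

    // called from the factory's create() before each new filter is built
    public Object currentMap() {
        long now = System.currentTimeMillis();
        if (now - lastChecked > CHECK_INTERVAL_MS) {
            synchronized (this) {
                if (now - lastChecked > CHECK_INTERVAL_MS) {
                    lastChecked = now;
                    if (synonymsFile.lastModified() > lastLoaded) {
                        // existing filters keep the old map until they fall out of use
                        reload();
                    }
                }
            }
        }
        return synonymMap;
    }

    private void reload() {
        // parse synonyms.txt and build a fresh map here (version-specific)
        synonymMap = new Object();
        lastLoaded = System.currentTimeMillis();
    }
}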


Re: Deduplication patch not working in nightly build

2009-01-05 Thread Marc Sturlese

Doing this fix I get the same error :(

I am going to try to set up the last nightly build... let's see if I have
better luck.

The thing is it stops indexing at around doc number 150,000... and gives me
that MySQL exception error... Without the DeDuplication patch I can index 2
million docs without problems...

I am pretty lost with this... :(


Shalin Shekhar Mangar wrote:
> 
> Yes I meant the 05/01/2009 build. The fix is a one-line change.
> 
> Add the following as the last line of DataConfig.Entity.clearCache()
> dataSrc = null;
> 
> 
> 
> On Mon, Jan 5, 2009 at 4:22 PM, Marc Sturlese
> wrote:
> 
>>
>> Shalin, you mean I should test the 05/01/2009 nightly? Maybe with this one
>> it works? If the fix you did is not really big, can you tell me where in
>> the source it is and what it is for? (I have been debugging and tracing
>> the dataimporthandler source a lot and I would like to know what the
>> improvement is about, if it is not a problem...)
>>
>> Thanks!
>>
>>
>> Shalin Shekhar Mangar wrote:
>> >
>> > Marc, I've just committed a fix for an issue which may have caused the
>> > bug. Can you use svn trunk (or the next nightly build) and confirm?
>> >
>> > On Mon, Jan 5, 2009 at 3:10 PM, Noble Paul നോബിള്‍ नोब्ळ् <
>> > noble.p...@gmail.com> wrote:
>> >
>> >> looks like a bug w/ DIH with the recent fixes.
>> >> --Noble
>> >>
>> >> On Mon, Jan 5, 2009 at 2:36 PM, Marc Sturlese
>> 
>> >> wrote:
>> >> >
>> >> > Hey there,
>> >> > I was using the Deduplication patch with the Solr 1.3 release and
>> >> > everything was working perfectly. Now I upgraded to a nightly build
>> >> > (20th December) to be able to use the new facet algorithm and other
>> >> > stuff, and DeDuplication is not working any more. I have followed
>> >> > exactly the same steps to apply the patch to the source code. I am
>> >> > getting this error:
>> >> >
>> >> > [MySQL stack trace omitted; identical to the one in the original message]

RE: Ngram Repeats

2009-01-05 Thread Feak, Todd
To get the unique brand names, you are wandering into the Facet query
territory that I mentioned.

You could consider a separate index, and that will probably provide the best 
performance. Especially if you are hitting it on a per-keystroke basis to 
update that auto-complete box. Creating a separate index also allows you to 
scale this section of your search infrastructure separately, if necessary.

You *can* put the separate index within the same Tomcat instance if you need 
to. The context snippets in Tomcat can be used to provide a different URL for 
those queries.

-Todd Feak
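
For reference, a facet request of that sort might look like the following
(the field name brandName is a placeholder, and facet.prefix support should
be checked against your Solr version):

http://localhost:8983/solr/select?q=*:*&rows=0&facet=true&facet.field=brandName&facet.prefix=as&facet.limit=10

Each facet value comes back exactly once with its count, so "asus" appears a
single time no matter how many products carry it.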

-Original Message-
From: Jeff Newburn [mailto:jnewb...@zappos.com] 
Sent: Wednesday, December 24, 2008 2:30 PM
To: solr-user@lucene.apache.org
Subject: Re: Ngram Repeats

You are correct on the layout.  The reason we are trying to do the ngrams is
we want to do a drop-down box for autocomplete.  The ngrams are extremely
fast and the recommended way to do this according to the user group.  They
work wonderfully except for this one issue.  So do we basically have to do a
separate index for this, or is there a dedup setting to only return unique
brand names?


On 12/24/08 7:51 AM, "Feak, Todd"  wrote:

> It sounds like you want to get a list of "brands" that start with a particular
> string, out of your index. But your index is based on products, not brands. Is
> that correct?
> 
> If so, that has nothing to do with NGrams (or even tokenizing for that
> matter). I think you should be doing a Facet query instead of a standard
> query. Take a look at Facets on the Solr Wiki.
> 
> http://wiki.apache.org/solr/SolrFacetingOverview
> 
> -Todd Feak
> -Original Message-
> From: Jeff Newburn [mailto:jnewb...@zappos.com]
> Sent: Wednesday, December 24, 2008 7:39 AM
> To: solr-user@lucene.apache.org
> Subject: Ngram Repeats
> 
> I have set up an ngram filter and have run into a problem.  Our index is
> basically composed of products as the unique id.  Each product also has a
> brand name assigned to it.  There are much fewer unique brand names than
> products in the index.  I tried to set up an ngram based on the brand name
> but it is returning the same brand name over and over for each product.
> Essentially if you try for the brand name starting with "as" you will get
> the brand "asus" 15 times.  Is there a way to make the ngram only return
> unique brand names?  I have attached the configuration below.
> 
>  <fieldType name="..." positionIncrementGap="1">
>    <analyzer>
>      <tokenizer class="..."/>
>      <filter class="..." minGramSize="1" maxGramSize="20"/>
>    </analyzer>
>  </fieldType>
>
> -Jeff




Re: Deduplication patch not working in nightly build

2009-01-05 Thread Marc Sturlese

Yeah, it looks like it, but... if I don't use the DeDuplication patch
everything works perfectly. I can create my indexes using full-import and
delta-import without problems. The JdbcDataSource of the nightly is pretty
similar to the 1.3 release...
The DeDuplication patch doesn't touch the dataimporthandler classes... that's
why I thought the problem was not there (but I can't say it for sure...)

I was thinking that the problem has something to do with the
UpdateRequestProcessorChain but I don't know how this part of the source
works...

Any advice on how I could sort it out? I am really interested in updating to
the nightly build as I think the new facet algorithm and SolrDeletionPolicy
are really great stuff!

Thanks


Noble Paul നോബിള്‍ नोब्ळ् wrote:
> 
> looks like a bug w/ DIH with the recent fixes.
> --Noble
> 
> On Mon, Jan 5, 2009 at 2:36 PM, Marc Sturlese 
> wrote:
>>
>> Hey there,
>> I was using the Deduplication patch with the Solr 1.3 release and
>> everything was working perfectly. Now I upgraded to a nightly build
>> (20th December) to be able to use the new facet algorithm and other
>> stuff, and DeDuplication is not working any more. I have followed
>> exactly the same steps to apply the patch to the source code. I am
>> getting this error:
>>
>> [MySQL stack trace omitted; identical to the one in the original message]
>> Jan 5, 2009 10:06:16 AM org.apache.solr.handler.dataimport.JdbcData

Re: collectionDistribution vs SolrReplication

2009-01-05 Thread Shalin Shekhar Mangar
The problem with that approach is that unlike databases, a commit is an
expensive operation in Lucene right now. It is not very practical to commit
per document, therefore log replication offers very little.

On Tue, Jan 6, 2009 at 12:07 AM, Jacob Singh  wrote:

> Has there been a discussion anywhere about a "binary log" style
> replication scheme (a la mysql)?  Wherein every write request goes to
> the master, and the slaves read in a queue of the requests and
> update themselves one record at a time instead of wholesale?  Or is
> this just not worth the development time?
>
> Best,
> Jacob
>
> On Mon, Jan 5, 2009 at 10:26 AM, Noble Paul നോബിള്‍ नोब्ळ्
>  wrote:
> > The default IndexDeletionPolicy just keeps the last commit only
> > (KeepOnlyLastCommitDeletionPolicy) .Files belonging to older commits
> > are removed. If the files are needed longer for replication, they are
> > leased . The lease is extended 10 secs at a time. Once all the slaves
> > have copied the lease is never extended and the files will be purged.
> >
> > In the snapshot based system , unless the snapshots are deleted from
> > the file system the old files will continue to live on the disk
> > --Noble
> >
> > On Mon, Jan 5, 2009 at 6:59 PM, Mark Miller 
> wrote:
> >> Noble Paul നോബിള്‍ नोब्ळ् wrote:
> >>>
> >>> * SolrReplication does not create snapshots . So you have less cleanup
> >>> to do. The script based replication results is more disk space
> >>> consumption (especially if you do frequent commits)
> >>>
> >>
> >> Doesn't SolrReplication effectively take a snapshot by using a custom
> >> IndexDeletionPolicy to keep the right index files around? Isn't that
> >> maintaining a snapshot?
> >>
> >> Could you elaborate on the difference Noble?
> >>
> >> - Mark
> >>
> >
> >
> >
> > --
> > --Noble Paul
> >
>
>
>
> --
>
> +1 510 277-0891 (o)
> +91  33 7458 (m)
>
> web: http://pajamadesign.com
>
> Skype: pajamadesign
> Yahoo: jacobsingh
> AIM: jacobsingh
> gTalk: jacobsi...@gmail.com
>



-- 
Regards,
Shalin Shekhar Mangar.


Solr 1.3.0 with Jetty 6.1.14

2009-01-05 Thread gwk

Hello,


I'm trying to get multiple instances of Solr running with Jetty as per
the instructions on http://wiki.apache.org/solr/SolrJetty; however, I've
run into a snag. According to the page you set the solr/home parameter
as follows:


<env-entry>
   <env-entry-name>solr/home</env-entry-name>
   <env-entry-value>My Solr Home Dir</env-entry-value>
   <env-entry-type>java.lang.String</env-entry-type>
</env-entry>


However, as MattKangas mentions on the wiki, using this method to set
the JNDI parameter makes it global to the JVM, which is bad for running
multiple instances. Reading the 6.1.14 documentation for the EnvEntry
class constructors shows that with this version of Jetty you can supply
a scope, so I've tried this with the following configuration:


   
   
   
<New class="org.mortbay.jetty.plus.naming.EnvEntry">
   <Arg><Ref id="..."/></Arg>
   <Arg>/solr/home</Arg>
   <Arg>/my/solr/home/dir</Arg>
   <Arg type="boolean">true</Arg>
</New>


But unfortunately this doesn't seem to work. If I set the first argument
to null, it works for one instance (as it's in JVM scope), but
when I set it to the WebAppContext scope, Solr logs:

org.apache.solr.core.SolrResourceLoader locateInstanceDir
INFO: No /solr/home in JNDI
org.apache.solr.core.SolrResourceLoader locateInstanceDir
INFO: solr home defaulted to 'solr/' (could not find system property or
JNDI)

Am I doing something wrong here? Any help will be appreciated.

Regards,

gwk




Delete / filter / hide query results

2009-01-05 Thread DODMax

Hello,

I'm trying to resolve a problem for which I've seen several posts but I have
not found any suitable answer.

I need to filter my results according to complex rights management, such
that it can't be expressed as part of a field or anything like that. So
let's say that the only way to know whether a user has access rights is by
calling something like accessRights(sessionID, docID), where docID is
stored in a field.

My first idea was to simply delete results from rsp.getResults() after the
search process, but it breaks faceting (especially facet counts). Since I
don't see any way to delete results from faceting, it's not a good answer
(anyway, it's ugly to build facets only to delete them afterwards).

I then decided to use a custom SearchComponent called right after index
querying (before faceting is done), but from what I have read it's not a
good idea to delete results there because they are stored in more than one
place and it could break the caching system (I suppose that if I delete
results for user A they will be deleted for user B too if he makes the same
query, although he does have access rights). Anyway, I don't really
understand where results are stored in ResponseBuilder; DocSet / DocList
are pretty obscure.

So here I am: I need a way to filter results from the cache for each query
based on an external function call, but I don't see how without having to
write my own QueryComponent.


Thanks for your help.
-- 
View this message in context: 
http://www.nabble.com/Delete---filter---hide-query-results-tp21287332p21287332.html
Sent from the Solr - User mailing list archive at Nabble.com.



Solrj, drill down, facets.

2009-01-05 Thread Yevgeniy Belman
I am playing with SolrJ, trying to run through a few scenarios, one of which
is to "drill down" into a facet. Here, my facet is "manu". I want to narrow
the search by requesting anything that matches "ipod" and falls into the
"apple" manufacturer facet. I am new to Solr/Lucene; my apologies for the
basic question.

Through trial and error I got this line to narrow the facet:
query.add("fq", "manu:apple");

but what does this line do? I didn't notice any result difference.
query.addFacetQuery("manu:apple");


Re: Solrj, drill down, facets.

2009-01-05 Thread Erik Hatcher


On Jan 5, 2009, at 3:59 PM, Yevgeniy Belman wrote:

Thru trial and error i got this line to narrow the facet:
query.add("fq", "manu:apple");

but what does this line do? I didn't notice any result difference.
query.addFacetQuery("manu:apple");


addFacetQuery is equivalent to using &facet.query, which is not the  
same as fq (filter query). It is a bit confusing, no question.


Erik
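
To make the difference concrete, a small SolrJ sketch (the server URL is a
placeholder, and the behavior notes are one reading of it, so verify against
your version):

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.SolrServer;
import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
import org.apache.solr.client.solrj.response.QueryResponse;

public class FacetDrillDown {
    public static void main(String[] args) throws Exception {
        SolrServer solr = new CommonsHttpSolrServer("http://localhost:8983/solr");
        SolrQuery query = new SolrQuery("ipod");

        // fq restricts the result set (and the facet counts) to matching docs
        query.addFilterQuery("manu:apple");

        // facet.query leaves the results alone; it only reports a count
        query.setFacet(true);
        query.addFacetQuery("manu:apple");

        QueryResponse rsp = solr.query(query);
        System.out.println("docs matching the narrowed search: "
                + rsp.getResults().getNumFound());
        System.out.println("count for the facet query: "
                + rsp.getFacetQuery().get("manu:apple"));
    }
}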



Re: Spell Checker Reloading Issue

2009-01-05 Thread Grant Ingersoll
Can you describe what "not working" means?  You're not getting suggestions,
or you're getting exceptions?  Is there any error in your log?

If you add &debugQuery=true to your query, does it show that the spell
component was run?  (I think it should.)


Do your logs show the Spell Checker being initialized or anything else?

On Jan 5, 2009, at 10:21 AM, Navdeep wrote:



Hi

Thanks for your response.
Please find attached:
1) schema.xml and solrconfig.xml

In the solrconfig.xml file, we are changing the parts below...

PART 1:

<requestHandler name="dismaxrequest" class="solr.DisMaxRequestHandler">
   <lst name="defaults">
      <str name="spellcheck.onlyMorePopular">false</str>
      <str name="spellcheck.extendedResults">false</str>
      <str name="spellcheck.count">10</str>
      <str name="echoParams">explicit</str>
      <float name="tie">0.01</float>
      <str name="qf">statusName_product_s^1.0 productId_product_s^1.0
         iSBN10_product_s^1.0 iSBN13_product_s^1.0 prdMainTitle_product_s^1.0
         prdKeywords_product_s^1.0 productDescription_product_s^1.0
         prdMainSubTitle_product_s^1.0 contentTypeId_product_s^1.0
         editionTypeId_product_s^1.0 statusId_product_s^1.0 formatId_product_s^1.0
         audienceId_product_s^1.0 eraId_product_s^1.0 extentTypeId_product_s^1.0
         divisionId_product_s^1.0 productPrice_product_s^1.0 basePrice_product_s^1.0
         catalogPrice_product_s^1.0 editionName_product_s^1.0
         productSource_product_s^1.0 ageRange_product_s^1.0
         prdPublishingDate_product_s^1.0 productCopyright_product_s^1.0
         productExtentName_product_s^1.0 parentTaxonomy_product_s^1.0
         parentGroup_product_s^1.0 IndexId_s^1.0 productURL_product_s^1.0
         websiteURL_product_s^1.0 productContributors_product_s^1.0
         relatedFiles_product_s^1.0 relatedLinks_product_s^1.0 awards_product_s^1.0
         imprints_product_s^1.0 product_product_s^1.0 documents_product_s^1.0
         taxonomyPathElement_product_s^1.0</str>
      <str name="...">english^90 hindi^123 Glorious^2000 highlighting^1000
         maths^100 ab^12 erer^4545</str>
      <str name="fl">*,score</str>
   </lst>
   <arr name="last-components">
      <str>spellcheck</str>
   </arr>
</requestHandler>

PART 2:

<searchComponent name="spellcheck" class="solr.SpellCheckComponent">
   <str name="queryAnalyzerFieldType">textSpell</str>
   <lst name="spellchecker">
      <str name="name">default</str>
      <str name="classname">solr.spelling.FileBasedSpellChecker</str>
      <str name="sourceLocation">./spellings.txt</str>
      <str name="characterEncoding">UTF-8</str>
      <str name="spellcheckIndexDir">./spellcheckerFile</str>
      <float name="accuracy">0.7</float>
   </lst>
</searchComponent>

Thanks
Navdeep




Grant Ingersoll-6 wrote:


Can you share your configuration, or at least the relevant pieces?

-Grant
On Jan 5, 2009, at 9:24 AM, Navdeep wrote:



Hi all

we are facing an issue in spell checker with solr server. We are
changing
the below given attributes of SolrConfig.xml file

1) Accuracy
2) Number of Suggestions

we are rebuilding solr indexes using "spellcheck.build=true" :
URL used for POST_SOLR_URL=
"select?
q
=
*:*&spellcheck
.q=flavro&spellcheck=true&spellcheck.build=true&qt=dismaxrequest"

After performing the above steps, when we are trying to perform the
final
search for keyword, it is not working.
Please share your thoughts on this issue.

--
View this message in context:
http://www.nabble.com/Spell-Checker-Reloading-Issue-tp21291873p21291873.html
Sent from the Solr - User mailing list archive at Nabble.com.



--
Grant Ingersoll

Lucene Helpful Hints:
http://wiki.apache.org/lucene-java/BasicsOfPerformance
http://wiki.apache.org/lucene-java/LuceneFAQ













http://www.nabble.com/file/p21292901/schema.xml schema.xml
http://www.nabble.com/file/p21292901/solrconfig.xml solrconfig.xml
--
View this message in context: 
http://www.nabble.com/Spell-Checker-Reloading-Issue-tp21291873p21292901.html
Sent from the Solr - User mailing list archive at Nabble.com.



--
Grant Ingersoll

Lucene Helpful Hints:
http://wiki.apache.org/lucene-java/BasicsOfPerformance
http://wiki.apache.org/lucene-java/LuceneFAQ












Re: DataImport

2009-01-05 Thread Performance

I have been following this tutorial but I can't seem to get past an error
related to not being able to load the DB2 driver.  The user has all the
right config to load the JDBC driver and Squirrel works fine.  Do I need to
update any path within Solr?



muxa wrote:
> 
> Looked through the tutorial on data import, section "Full Import
> Example".
> 1) Where is this dataimport.jar? There is no such file in the
> extracted example-solr-home.jar.
> 2) "Use the solr folder inside example-data-config folder as your
> solr home." What does this mean? Anyway, there is no folder
> example-data-config.
>  Regards, Mihails
> 

-- 
View this message in context: 
http://www.nabble.com/DataImport-tp17730791p21301571.html
Sent from the Solr - User mailing list archive at Nabble.com.



Re: Spell Checker Reloading Issue

2009-01-05 Thread Navdeep

Hi 

Thanks for the concern, Grant.
My issue is resolved.

The problem was that the spell checker stopped working after changing the
accuracy or the number of suggestions in the solrconfig.xml file.

The solution: we have to add "&spellcheck.build=true" to the command so that
it regenerates the spell-check indexes every time we change the parameters
in the solrconfig.xml file.

Now my spell checker is working fine.

Thanks for your response.


Grant Ingersoll-6 wrote:
> 
> Can you describe what "not working" means?  You're not getting suggestions,
> or you're getting exceptions?  Is there any error in your log?
> 
> If you add &debugQuery=true to your query, does it show that the spell
> component was run?  (I think it should.)
> 
> Do your logs show the Spell Checker being initialized or anything else?
> 
> On Jan 5, 2009, at 10:21 AM, Navdeep wrote:
> 
>>
>> Hi
>>
>> Thanks for your response.
>> Please find the attached.
>> 1) schema.xml and solrconfig.xml
>>
>> In solrconfig.xml file, we are changing the below parts ...
>>
>> [solrconfig.xml excerpts omitted; quoted in full earlier in this thread]
>>
>> Thanks
>> Navdeep
>>
>>
>>
>>
>> Grant Ingersoll-6 wrote:
>>>
>>> Can you share your configuration, or at least the relevant pieces?
>>>
>>> -Grant
>>> On Jan 5, 2009, at 9:24 AM, Navdeep wrote:
>>>

 Hi all

 we are facing an issue in spell checker with solr server. We are
 changing
 the below given attributes of SolrConfig.xml file

 1) Accuracy
 2) Number of Suggestions

 we are rebuilding solr indexes using "spellcheck.build=true" :
 URL used for POST_SOLR_URL=
 "select?
 q
 =
 *:*&spellcheck
 .q=flavro&spellcheck=true&spellcheck.build=true&qt=dismaxrequest"

 After performing the above steps, when we are trying to perform the
 final
 search for keyword, it is not working.
 Please share your thoughts on this issue.

 -- 
 View this message in context:
 http://www.nabble.com/Spell-Checker-Reloading-Issue-tp21291873p21291873.html
 Sent from the Solr - User mailing list archive at Nabble.com.

>>>
>>> --
>>> Grant Ingersoll
>>>
>>> Lucene Helpful Hints:
>>> http://wiki.apache.org/lucene-java/BasicsOfPerformance
>>> http://wiki.apache.org/lucene-java/LuceneFAQ
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>> http://www.nabble.com/file/p21292901/schema.xml schema.xml
>> http://www.nabble.com/file/p21292901/solrconfig.xml solrconfig.xml
>> -- 
>> View this message in context:
>> http://www.nabble.com/Spell-Checker-Reloading-Issue-tp21291873p21292901.html
>> Sent from the Solr - User mailing list archive at Nabble.com.
>>
> 
> --
> Grant Ingersoll
> 
> Lucene Helpful Hints:
> http://wiki.apache.org/lucene-java/BasicsOfPerformance
> http://wiki.apache.org/lucene-java/LuceneFAQ
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 

-- 
View this message in context: 
http://www.nabble.com/Spell-Checker-Reloading-Issue-tp21291873p21303979.html
Sent from the Solr - User mailing list archive at Nabble.com.



Re: DataImport

2009-01-05 Thread Noble Paul നോബിള്‍ नोब्ळ्
The driver can be put directly into the WEB-INF/lib of the solr web
app or it can be put into ${solr.home}/lib dir.

or, if something is really screwed up, you can try the old-fashioned way
of putting your driver jar into JAVA_HOME/lib/ext.

--Noble
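
For example, if the jars go into the Solr home lib directory, the layout
would look like this (the jar names shown are the usual DB2 universal driver
jars; use whatever yours are called):

  solr-home/
    lib/
      db2jcc.jar
      db2jcc_license_cu.jar
    conf/
      solrconfig.xml
      data-config.xml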


On Tue, Jan 6, 2009 at 7:05 AM, Performance  wrote:
>
> I have been following this tutorial but I can't seem to get past an error
> related to not being able to load the DB2 Driver.  The user has all the
> right config to load the JDBC driver and Squirrel works fine.  Do I need to
> update and path within Solr?
>
>
>
> muxa wrote:
>>
>> Looked through the tutorial on data import, section "Full Import
>> Example".
>> 1) Where is this dataimport.jar? There is no such file in the
>> extracted example-solr-home.jar.
>> 2) "Use the solr folder inside example-data-config folder as your
>> solr home." What does this mean? Anyway, there is no folder
>> example-data-config.
>>  Regards, Mihails
>>
>
> --
> View this message in context: 
> http://www.nabble.com/DataImport-tp17730791p21301571.html
> Sent from the Solr - User mailing list archive at Nabble.com.
>
>



-- 
--Noble Paul


Setting up DataImportHandler for Oracle datasource on JBoss

2009-01-05 Thread The Flight Captain

I am having trouble setting up an Oracle datasource. Can anyone help me
connect to the datasource?

My solrconfig.xml:

...
<requestHandler name="/dataimport"
                class="org.apache.solr.handler.dataimport.DataImportHandler">
  <lst name="defaults">
    <str name="config">data-config.xml</str>
  </lst>
</requestHandler>
...

My data-config.xml:

<dataConfig>
  <dataSource driver="oracle.jdbc.OracleDriver"
              url="jdbc:oracle:thin:@hostname:port:service"
              user="username"
              password="password"/>
</dataConfig>

I have placed the oracle driver on the classpath of JBoss.

I am getting the following errors in the server.log on startup:

2009-01-06 17:03:12,756 ERROR [STDERR] 6/01/2009 17:03:12
org.apache.solr.handler.dataimport.DataImportHandler inform
SEVERE: Exception while loading DataImporter
org.apache.solr.handler.dataimport.DataImportHandlerException: Exception
occurred while initializing context Processing Document # 
at
org.apache.solr.handler.dataimport.DataImporter.loadDataConfig(DataImporter.java:176)
at
org.apache.solr.handler.dataimport.DataImporter.<init>(DataImporter.java:93)
at
org.apache.solr.handler.dataimport.DataImportHandler.inform(DataImportHandler.java:106)
at
org.apache.solr.core.SolrResourceLoader.inform(SolrResourceLoader.java:311)
at org.apache.solr.core.SolrCore.<init>(SolrCore.java:480)
at
org.apache.solr.core.CoreContainer$Initializer.initialize(CoreContainer.java:119)
at
org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:69)
at
org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:275)
at
org.apache.catalina.core.ApplicationFilterConfig.setFilterDef(ApplicationFilterConfig.java:397)
at
org.apache.catalina.core.ApplicationFilterConfig.<init>(ApplicationFilterConfig.java:108)
at
org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:3720)
at
org.apache.catalina.core.StandardContext.start(StandardContext.java:4358)
at
org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:752)
at 
org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:732)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:553)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at
org.apache.tomcat.util.modeler.BaseModelMBean.invoke(BaseModelMBean.java:297)
at 
org.jboss.mx.server.RawDynamicInvoker.invoke(RawDynamicInvoker.java:164)
at org.jboss.mx.server.MBeanServerImpl.invoke(MBeanServerImpl.java:659)
at 
org.apache.catalina.core.StandardContext.init(StandardContext.java:5300)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at
org.apache.tomcat.util.modeler.BaseModelMBean.invoke(BaseModelMBean.java:297)
at 
org.jboss.mx.server.RawDynamicInvoker.invoke(RawDynamicInvoker.java:164)
at org.jboss.mx.server.MBeanServerImpl.invoke(MBeanServerImpl.java:659)
at
org.jboss.web.tomcat.service.TomcatDeployer.performDeployInternal(TomcatDeployer.java:301)
at
org.jboss.web.tomcat.service.TomcatDeployer.performDeploy(TomcatDeployer.java:104)
at org.jboss.web.AbstractWebDeployer.start(AbstractWebDeployer.java:375)
at org.jboss.web.WebModule.startModule(WebModule.java:83)
at org.jboss.web.WebModule.startService(WebModule.java:61)
at
org.jboss.system.ServiceMBeanSupport.jbossInternalStart(ServiceMBeanSupport.java:289)
at
org.jboss.system.ServiceMBeanSupport.jbossInternalLifecycle(ServiceMBeanSupport.java:245)
at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at
org.jboss.mx.interceptor.ReflectedDispatcher.invoke(ReflectedDispatcher.java:155)
at org.jboss.mx.server.Invocation.dispatch(Invocation.java:94)
at org.jboss.mx.server.Invocation.invoke(Invocation.java:86)
at
org.jboss.mx.server.AbstractMBeanInvoker.invoke(AbstractMBeanInvoker.java:264)
at org.jboss.mx.server.MBeanServerImpl.invoke(MBeanServerImpl.java:659)
at
org.jboss.system.ServiceController$ServiceProxy.invoke(ServiceController.java:978)
at $Proxy0.start(Unknown Source)
at org.jboss.system.ServiceController.start(ServiceController.java:417)
at sun.reflect.GeneratedMethodAccessor9.invoke(Unknown Source)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at
org.jboss.mx.interceptor.ReflectedDispatcher.invoke(ReflectedDispatcher.java:155)
at org.jboss.mx.serve

Re: Setting up DataImportHandler for Oracle datasource on JBoss

2009-01-05 Thread Noble Paul നോബിള്‍ नोब्ळ्
the <document> tag and the rest of the stuff is missing in your data-config file
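
A minimal complete data-config for reference (the entity name, query, and
field mappings are placeholders to fill in):

<dataConfig>
  <dataSource driver="oracle.jdbc.OracleDriver"
              url="jdbc:oracle:thin:@hostname:port:service"
              user="username"
              password="password"/>
  <document>
    <entity name="item" query="select ... from ...">
      <field column="..." name="..."/>
    </entity>
  </document>
</dataConfig>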

On Tue, Jan 6, 2009 at 12:50 PM, The Flight Captain
 wrote:
>
> I am having trouble setting up an Oracle datasource. Can anyone help me
> connect to the datasource?
>
> My solrconfig.xml:
>
> [solrconfig.xml and data-config.xml as above]
>
> I have placed the oracle driver on the classpath of JBoss.
>
> I am getting the following errors in the server.log on startup:
>
> [JBoss startup stack trace omitted; identical to the one in the original message]