Re: "background merge hit exception"
In fact this (the root-cause NPE) is a Lucene bug -- I have a small test case showing it. It can happen when you have compressed text fields (Store.COMPRESS) in the index. I'll open an issue and fix it. Thank you for raising this!

Mike

Chris Harris wrote:

I've made some changes to my Solr setup, and now I'm getting the "background merge hit exception" pasted at the end of this message. The most notable changes I've made are:

* Update to r690989 (Lucene r688745)
* Change a few things in my schema. In particular, I was previously storing my main document text and the metadata fields in a single "body" field, like this:

  * *

  whereas I'm now using "body" as a sort of alias that just gets redirected to other fields, like this:

  * stored="false" /> *

When I was indexing with this new setup, things were initially fine, and segments seemed to be merging fine. I ran into trouble when I sent an <optimize/>, though. I think in an earlier run I also got a very similar exception just from document adds, without an explicit <optimize/>. I'm also running with a shingle-related patch (https://issues.apache.org/jira/browse/LUCENE-1370 / https://issues.apache.org/jira/browse/SOLR-744) and the rich document handler patch, though I've used these before without trouble. Is it possible that my schema change is illegitimate? Am I not allowed to have non-indexed, non-stored fields, for example? Anyway, here is my stack trace:

background merge hit exception: _1h:C2552 _1i:C210->_1i _1j:C266->_1i _1k:C214->_1i _1l:C329->_1i _1m:C231->_1i _1n:C379->_1i _1o:C447 _1p:C453->_1p _1q:C485->_1p into _1r [optimize]
java.io.IOException: background merge hit exception: _1h:C2552 _1i:C210->_1i _1j:C266->_1i _1k:C214->_1i _1l:C329->_1i _1m:C231->_1i _1n:C379->_1i _1o:C447 _1p:C453->_1p _1q:C485->_1p into _1r [optimize]
        at org.apache.lucene.index.IndexWriter.optimize(IndexWriter.java:2303)
        at org.apache.lucene.index.IndexWriter.optimize(IndexWriter.java:2233)
        at org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:355)
        at org.apache.solr.update.processor.RunUpdateProcessor.processCommit(RunUpdateProcessorFactory.java:77)
        at org.apache.solr.handler.XmlUpdateRequestHandler.processUpdate(XmlUpdateRequestHandler.java:228)
        at org.apache.solr.handler.XmlUpdateRequestHandler.handleRequestBody(XmlUpdateRequestHandler.java:125)
        at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:135)
        at org.apache.solr.core.SolrCore.execute(SolrCore.java:1156)
        at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:341)
        at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:272)
        at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1089)
        at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:365)
        at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
        at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:181)
        at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:712)
        at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:405)
        at org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:211)
        at org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
        at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:139)
        at org.mortbay.jetty.Server.handle(Server.java:285)
        at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:502)
        at org.mortbay.jetty.HttpConnection$RequestHandler.content(HttpConnection.java:835)
        at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:641)
        at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:208)
        at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:378)
        at org.mortbay.jetty.bio.SocketConnector$Connection.run(SocketConnector.java:226)
        at org.mortbay.thread.BoundedThreadPool$PoolThread.run(BoundedThreadPool.java:442)
Caused by: java.lang.NullPointerException
        at java.lang.System.arraycopy(Native Method)
        at org.apache.lucene.store.BufferedIndexOutput.writeBytes(BufferedIndexOutput.java:49)
        at org.apache.lucene.index.FieldsWriter.writeField(FieldsWriter.java:215)
        at org.apache.lucene.index.FieldsWriter.addDocument(FieldsWriter.java:268)
        at org.apache.lucene.index.SegmentMerger.mergeFields(SegmentMerger.java:359)
        at org.apache.lucene.index.SegmentMerger.merge(SegmentMerger.java:138)
        at org.apache.lucene.index.IndexWriter.mergeMiddle(IndexWriter.java:3988)
        at org.apache.lucene.index.IndexWriter.merge(IndexWriter.java:3637)
        at org.apache.lucene.index.ConcurrentMergeScheduler.doMerge(ConcurrentMergeScheduler.java:214)
        at org.apache.lucene.index.ConcurrentMergeScheduler$MergeThread.run(ConcurrentMergeScheduler.java:269)
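For context, a compressed stored field of the kind implicated here is created in Lucene 2.x roughly like this (a minimal sketch; the field name and text are illustrative):

  import org.apache.lucene.document.Document;
  import org.apache.lucene.document.Field;

  public class CompressedFieldExample {
      public static Document makeDoc(String text) {
          Document doc = new Document();
          // Store.COMPRESS compresses the stored value on disk; merging
          // segments that contain such fields is what hit the NPE above.
          doc.add(new Field("body", text, Field.Store.COMPRESS, Field.Index.TOKENIZED));
          return doc;
      }
  }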
Re: SolrJ - SolrServer#commit() doesn't return
Hey. Sorry for the late response. The usage is zero. This problem seems to have to do with the amount of data being indexed. I've just now removed a field "text", which simply stored a copy of all other fields for use as a standard search field. (Now I run the search over specific fields with different boost factors.) Currently I'm indexing the data again; it is nearly finished, and so far only 1 retry was necessary instead of 4.

- Machisuji

zayhen wrote:
> How is your disk and memory usage during these thread hang-ups?
>
>> zayhen wrote:
>> > Are you using any postCommit/postOptimize eventListener?
>> >
>> > I had some problems using them, where I ran into a scenario in which
>> > the commit/optimize thread never ended.
>> >
>> > 2008/8/26 Machisuji <[EMAIL PROTECTED]>
>> >
>> >> Hey.
>> >>
>> >> I've been working with Solr for a few days now, and as long as I
>> >> didn't work with too much data, everything was fine.
>> >>
>> >> However, now that I want to index really all of the data, I have
>> >> problems with SolrJ not returning from a call to
>> >> CommonsHttpSolrServer's commit().
>> >> I try to upload data from online shops -- to be more precise: name,
>> >> category, price, and description of tens of millions of items.
>> >> After a few million items the call to commit() no longer returns and
>> >> simply does nothing. At least the CPU usage on the computer running
>> >> the Solr server falls to 0%.
>> >>
>> >> I always add 10,000 items at a time by calling
>> >> SolrServer#add(Collection) followed by SolrServer#commit().
>> >> Does anyone have an idea what the problem could be here?
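As a point of reference, a minimal SolrJ sketch of the batching pattern described in this thread (the loop bound and field values are illustrative; committing once at the end, or relying on autoCommit in solrconfig.xml, avoids piling up commit and merge work every 10,000 documents):

  import java.util.ArrayList;
  import java.util.List;
  import org.apache.solr.client.solrj.SolrServer;
  import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
  import org.apache.solr.common.SolrInputDocument;

  public class BatchIndexer {
      public static void main(String[] args) throws Exception {
          SolrServer server = new CommonsHttpSolrServer("http://localhost:8983/solr");
          List<SolrInputDocument> batch = new ArrayList<SolrInputDocument>();
          for (int i = 0; i < 1000000; i++) {        // stand-in for iterating the shop items
              SolrInputDocument doc = new SolrInputDocument();
              doc.addField("id", Integer.toString(i));
              doc.addField("name", "item " + i);     // illustrative field values
              batch.add(doc);
              if (batch.size() == 10000) {           // add in batches of 10,000, as above
                  server.add(batch);
                  batch.clear();
              }
          }
          if (!batch.isEmpty()) server.add(batch);
          server.commit();                           // a single commit at the end
      }
  }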
Re: "background merge hit exception"
OK, I opened: https://issues.apache.org/jira/browse/LUCENE-1374

Mike
Re: distributed search mechanism
Thank you for the answer. The http://wiki.apache.org/solr/DistributedSearchDesign page was last edited by Yonik Seeley on 2008-02-27, which appears to be the date of a major commit (according to JIRA), and he did not amend the "current approach" part of the page, so the "Multi-phased approach, allowing for inconsistency" design is quite probably still valid? The description of the process is a bit abstract, but I'll use it anyway (unless someone tells me it is actually out of date...).

Thanks again,
-- Grégoire Neuville
Re: distributed search mechanism
On Wed, Sep 3, 2008 at 7:35 PM, Grégoire Neuville <[EMAIL PROTECTED]> wrote:
> The http://wiki.apache.org/solr/DistributedSearchDesign page was last
> edited by Yonik Seeley on 2008-02-27, which appears to be the date of a
> major commit (according to JIRA), and he did not amend the "current
> approach" part of the page, so the "Multi-phased approach, allowing for
> inconsistency" design is quite probably still valid? The description of
> the process is a bit abstract, but I'll use it anyway (unless someone
> tells me it is out of date...)

Yes, AFAIK, the multi-phased approach is what is being used currently.

-- Regards, Shalin Shekhar Mangar.
Errors compiling latest Solr 1.3 update
Hi all,

First of all, sorry for my English.

I'm not sure that it's a problem, but after the last update from SVN (Solr 1.3 dev) I can't compile the Solr source. I think it is a problem with my workspace, but I'd like to know whether anyone else has the same problem. The classes that have the problem are SnowballPorterFilterFactory and SolrCore.

Thanks,

Raul
still looking for multicore.xml?
I'm up to date on trunk (r691646), and multicore.xml has been removed and solr.xml added (I saw the notice [1]). When I start Solr with "java -Dsolr.solr.home=multicore -jar start.jar", however, I see the following in the output:

...
Sep 3, 2008 3:54:54 PM org.apache.solr.servlet.SolrDispatchFilter init
INFO: SolrDispatchFilter.init()
Sep 3, 2008 3:54:54 PM org.apache.solr.core.SolrResourceLoader locateInstanceDir
INFO: JNDI not configured for solr (NoInitialContextEx)
Sep 3, 2008 3:54:54 PM org.apache.solr.core.SolrResourceLoader locateInstanceDir
INFO: using system property solr.solr.home: multicore
Sep 3, 2008 3:54:54 PM org.apache.solr.servlet.SolrDispatchFilter initMultiCore
INFO: looking for multicore.xml: /usr/local/solr/example/multicore/multicore.xml
Sep 3, 2008 3:54:54 PM org.apache.solr.core.SolrResourceLoader locateInstanceDir
INFO: JNDI not configured for solr (NoInitialContextEx)
Sep 3, 2008 3:54:54 PM org.apache.solr.core.SolrResourceLoader locateInstanceDir
INFO: using system property solr.solr.home: multicore
Sep 3, 2008 3:54:54 PM org.apache.solr.core.SolrResourceLoader <init>
INFO: Solr home set to 'multicore/'
Sep 3, 2008 3:54:54 PM org.apache.solr.core.SolrResourceLoader createClassLoader
INFO: Reusing parent classloader
Sep 3, 2008 3:54:54 PM org.apache.solr.servlet.SolrDispatchFilter init
SEVERE: Could not start SOLR. Check solr/home property
java.lang.RuntimeException: Can't find resource 'solrconfig.xml' in classpath or 'multicore/conf/', cwd=/usr/local/solr/example
        at org.apache.solr.core.SolrResourceLoader.openResource(SolrResourceLoader.java:168)
...

So it appears to be looking for multicore.xml, still. If I put my old multicore.xml in the multicore directory, it runs fine; solr.xml is ignored. Do I have an odd configuration somewhere that might cause this?

Gabriel

[1] http://mail-archives.apache.org/mod_mbox/lucene-solr-user/200808.mbox/[EMAIL PROTECTED]
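For reference, the new-style solr.xml that replaces multicore.xml looks roughly like this (a sketch based on the stock multicore example; the core names and instanceDir values are illustrative):

  <solr persistent="false">
    <cores adminPath="/admin/cores">
      <core name="core0" instanceDir="core0" />
      <core name="core1" instanceDir="core1" />
    </cores>
  </solr>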
Re: Building a multilevel query.
I think in order to do this you'd need to run two queries. We do this as well. For example: facet on the product types that match a query term, then for each product type run another query to facet on the subcategories.

Thanks for your time!

Matthew Runo
Software Engineer, Zappos.com
[EMAIL PROTECTED] - 702-943-7833

On Sep 2, 2008, at 5:20 PM, Erik Holstad wrote:

> Hi!
> I want to do a query that first queries on one specific field and, for all
> those that match the input, does a second query. For example, if we have a
> "type" field where one of the options is "user", and a "title" field that
> includes the names of the users, then I want to find all data with the
> "type" field = user where the name Erik is in the title field. Is this
> possible? I have been playing with faceting, but I can't get the
> facet.query to work, and otherwise I just get all the results.
> Regards Erik
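Following up on the two-query approach above, a minimal sketch of the requests (the field names "type", "title", and "subcategory" are illustrative; note that for the specific question asked, a single query with a filter query may already suffice):

  # One query with a filter: all docs of type "user" with "erik" in the title
  http://localhost:8983/solr/select?q=title:erik&fq=type:user

  # Two-phase faceting: first facet on type for a term...
  http://localhost:8983/solr/select?q=erik&rows=0&facet=true&facet.field=type
  # ...then, for each type of interest, facet on its subcategories
  http://localhost:8983/solr/select?q=erik&fq=type:user&rows=0&facet=true&facet.field=subcategory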
Re: still looking for multicore.xml?
> So it appears to be looking for multicore.xml, still. If I put my old
> multicore.xml in the multicore directory, it runs fine. solr.xml is
> ignored. Do I have an odd configuration somewhere that might cause
> this?

Looking at the code in trunk, everything appears to be fine. Did you run "ant example" before starting the server? Otherwise it's probably picking up some old jars/class files.

Lars
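For reference, a minimal clean-rebuild sequence (a sketch; "clean" and "example" are targets in the stock Solr build file, and the paths assume the example layout):

  svn up
  ant clean example
  cd example
  java -Dsolr.solr.home=multicore -jar start.jar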
Re: Errors compiling latest Solr 1.3 update
I can compile it successfully. The Lucene jars have been updated, so make sure you update the lib directory too.

On Wed, Sep 3, 2008 at 9:30 PM, <[EMAIL PROTECTED]> wrote:
> Hi all,
>
> First of all, sorry for my English.
>
> I'm not sure that it's a problem, but after the last update from SVN
> (Solr 1.3 dev) I can't compile the Solr source. I think it is a problem
> with my workspace, but I'd like to know whether anyone else has the same
> problem. The classes that have the problem are SnowballPorterFilterFactory
> and SolrCore.
>
> Thanks,
>
> Raul

-- Regards, Shalin Shekhar Mangar.
Re: still looking for multicore.xml?
On Wed, Sep 03, 2008 at 06:29:04PM +0200, Lars Kotthoff wrote:
> Looking at the code in trunk, everything appears to be fine. Did you run
> "ant example" before starting the server? Otherwise it's probably picking
> up some old jars/class files.

Ah, right. Thanks.
Solr runs out of File Handles
Hello all,

I'm dealing with an odd problem with Solr using multiple cores. If we start using more than about 40 or so cores, the JVM spews forth errors about lacking file handles. Has anyone else seen this problem, and what might the solution be?

Best Regards, Martin Owens

SEVERE: java.lang.RuntimeException: Error opening solrconfig.xml
        at org.apache.solr.core.SolrResourceLoader.openResource(SolrResourceLoader.java:144)
        at org.apache.solr.core.Config.<init>(Config.java:72)
        at org.apache.solr.core.SolrConfig.<init>(SolrConfig.java:94)
        at org.apache.solr.core.MultiCore.createCore(MultiCore.java:151)
        at org.apache.solr.core.MultiCore.load(MultiCore.java:122)
        at org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:78)
        at org.mortbay.jetty.servlet.FilterHolder.doStart(FilterHolder.java:99)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40)
        at org.mortbay.jetty.servlet.ServletHandler.initialize(ServletHandler.java:594)
        at org.mortbay.jetty.servlet.Context.startContext(Context.java:139)
        at org.mortbay.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1218)
        at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:500)
        at org.mortbay.jetty.webapp.WebAppContext.doStart(WebAppContext.java:448)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40)
        at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:147)
        at org.mortbay.jetty.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:161)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40)
        at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:147)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40)
        at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:117)
        at org.mortbay.jetty.Server.doStart(Server.java:210)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40)
        at org.mortbay.xml.XmlConfiguration.main(XmlConfiguration.java:929)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.mortbay.start.Main.invokeMain(Main.java:183)
        at org.mortbay.start.Main.start(Main.java:497)
        at org.mortbay.start.Main.main(Main.java:115)
Caused by: java.io.FileNotFoundException: solr/EverestLucene/conf/solrconfig.xml (Too many open files)
        at java.io.FileInputStream.open(Native Method)
        at java.io.FileInputStream.<init>(FileInputStream.java:106)
        at org.apache.solr.core.SolrResourceLoader.openResource(SolrResourceLoader.java:133)
        ... 29 more

Sep 3, 2008 12:28:07 PM org.apache.solr.core.SolrResourceLoader <init>
INFO: Solr home set to 'solr/EverestLucene/'
Sep 3, 2008 12:28:07 PM org.apache.solr.core.SolrResourceLoader createClassLoader
INFO: Reusing parent classloader
Sep 3, 2008 12:28:07 PM org.apache.solr.common.SolrException log
SEVERE: java.lang.RuntimeException: Error opening solrconfig.xml
        at org.apache.solr.core.SolrResourceLoader.openResource(SolrResourceLoader.java:144)
        at org.apache.solr.core.Config.<init>(Config.java:72)
        at org.apache.solr.core.SolrConfig.<init>(SolrConfig.java:94)
        at org.apache.solr.core.MultiCore.createCore(MultiCore.java:151)
        at org.apache.solr.core.MultiCore.load(MultiCore.java:122)
        at org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:78)
        at org.mortbay.jetty.servlet.FilterHolder.doStart(FilterHolder.java:99)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40)
        at org.mortbay.jetty.servlet.ServletHandler.initialize(ServletHandler.java:594)
        at org.mortbay.jetty.servlet.Context.startContext(Context.java:139)
        at org.mortbay.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1218)
        at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:500)
        at org.mortbay.jetty.webapp.WebAppContext.doStart(WebAppContext.java:448)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40)
        at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:147)
        at org.mortbay.jetty.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:161)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40)
        at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:147)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:40)
        at org.mortbay.
Re: Solr runs out of File Handles
On Wed, Sep 3, 2008 at 1:50 PM, Martin Owens <[EMAIL PROTECTED]> wrote:
> I'm dealing with an odd problem with solr using multi-cores. If we start
> using more than about 40 or so cores the java spews forth errors about
> lacking file handles.

- Increase the number of file descriptors available to the system (this is operating system specific).
- Change to use the compound file format, which uses fewer files (and hence fewer descriptors) per index.

-Yonik
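For reference, a minimal sketch of both options (the limit value is illustrative, and how to raise it persistently is OS specific):

  # Raise the per-process open-file limit before starting the JVM (Linux/Unix):
  ulimit -n 8192

  <!-- In solrconfig.xml, per core: switch to the compound file format -->
  <mainIndex>
    <useCompoundFile>true</useCompoundFile>
  </mainIndex>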
RE: Errors compiling latest Solr 1.3 update
Hi Shalin,

I too think that it is a problem with the jar files, but I downloaded the lib directory again and it doesn't work.

This is my SVN link: http://svn.apache.org/repos/asf/lucene/solr/trunk and I also tried http://svn.apache.org/repos/asf/lucene/solr/branches/branch-1.3/

Is that correct?

-----Original Message-----
From: Shalin Shekhar Mangar [mailto:[EMAIL PROTECTED]
Sent: Wednesday, September 3, 2008 6:56 PM
To: solr-user@lucene.apache.org
Subject: Re: Errors compiling latest Solr 1.3 update

I can compile it successfully. The Lucene jars have been updated, so make sure you update the lib directory too.

On Wed, Sep 3, 2008 at 9:30 PM, <[EMAIL PROTECTED]> wrote:
> Hi all,
>
> First of all, sorry for my English.
>
> I'm not sure that it's a problem, but after the last update from SVN
> (Solr 1.3 dev) I can't compile the Solr source. I think it is a problem
> with my workspace, but I'd like to know whether anyone else has the same
> problem. The classes that have the problem are SnowballPorterFilterFactory
> and SolrCore.
>
> Thanks,
>
> Raul

-- Regards, Shalin Shekhar Mangar.
Re: Errors compiling latest Solr 1.3 update
Did you run clean first? Can you share the errors? Note, it compiles for me.

On Sep 3, 2008, at 2:15 PM, <[EMAIL PROTECTED]> wrote:

> Hi Shalin,
>
> I too think that it is a problem with the jar files, but I downloaded the
> lib directory again and it doesn't work.
>
> This is my SVN link: http://svn.apache.org/repos/asf/lucene/solr/trunk
> and I also tried
> http://svn.apache.org/repos/asf/lucene/solr/branches/branch-1.3/
>
> Is that correct?
Highlighting Unindexed Fields
http://wiki.apache.org/solr/FieldOptionsByUseCase says that a field needs to be both stored and indexed for highlighting to work. Unless I'm very confused, though, I just tested and highlighting worked fine (on trunk) for a stored, *non-indexed* field. So is this info perhaps out of date? Assuming it's correct that indexing the field is not required for highlighting, there isn't any highlighting performance benefit from indexing the field, is there? I guess if you have indexed and termVectors and termPositions then you'll see a highlighting speedup, but not from indexed alone.
RE: Question about autocomplete feature
Did you reindex after the change?

> Date: Wed, 27 Aug 2008 23:43:05 +0300
> From: [EMAIL PROTECTED]
> To: solr-user@lucene.apache.org
> Subject: Question about autocomplete feature
>
> Hello.
>
> I'm trying to implement the autocomplete feature using the snippet posted
> by Dan.
> (http://mail-archives.apache.org/mod_mbox/lucene-solr-user/200807.mbox/[EMAIL PROTECTED])
>
> Here is the snippet:
>
> pattern="([^a-z0-9])" replacement="" replace="all" />
> maxGramSize="100" minGramSize="1" />
>
> pattern="([^a-z0-9])" replacement="" replace="all" />
> pattern="^(.{20})(.*)?" replacement="$1" replace="all" />
> ...
> required="false" />
>
> First I decided to make it work for the Solr example, so I pasted the
> snippet into schema.xml. Then I edited exampledocs/hd.xml and added the
> "ac" field to each doc. The value of the "ac" field is a copy of the
> name field:
>
> SP2514N
> Samsung SpinPoint P12 SP2514N - hard drive - 250 GB - ATA-133
> Samsung SpinPoint P12 SP2514N - hard drive - 250 GB - ATA-133
> Samsung Electronics Co. Ltd.
> electronics
> hard drive
> 7200RPM, 8MB cache, IDE Ultra ATA-133
> NoiseGuard, SilentSeek technology, Fluid Dynamic Bearing (FDB) motor
> 92
> 6
> true
>
> 6H500F0
> Maxtor DiamondMax 11 - hard drive - 500 GB - SATA-300
> Maxtor DiamondMax 11 - hard drive - 500 GB - SATA-300
> Maxtor Corp.
> electronics
> hard drive
> SATA 3.0Gb/s, NCQ
> 8.5ms seek
> 16MB cache
> 350
> 6
> true
>
> Then I cleaned the Solr index, posted hd.xml, and restarted the Solr
> server. But when I try to search for "samsu" (the beginning of the word
> "samsung") I still get no results. It seems like Solr treats the "ac"
> field like a regular field.
>
> What did I do wrong?
>
> Thanks in advance.
>
> --
> Aleksey Gogolev
> developer, dev.co.ua
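A sketch of what such an EdgeNGram-based autocomplete type typically looks like, reconstructed from the attributes that survive in the quoted snippet (the factory class names and the fieldType/tokenizer choices are assumptions, not necessarily Dan's original):

  <fieldType name="autocomplete" class="solr.TextField">
    <analyzer type="index">
      <tokenizer class="solr.KeywordTokenizerFactory"/>
      <filter class="solr.LowerCaseFilterFactory"/>
      <!-- strip everything that is not a lowercase letter or digit -->
      <filter class="solr.PatternReplaceFilterFactory"
              pattern="([^a-z0-9])" replacement="" replace="all"/>
      <!-- index every prefix of the value: "s", "sa", ..., "samsung..." -->
      <filter class="solr.EdgeNGramFilterFactory"
              minGramSize="1" maxGramSize="100"/>
    </analyzer>
    <analyzer type="query">
      <tokenizer class="solr.KeywordTokenizerFactory"/>
      <filter class="solr.LowerCaseFilterFactory"/>
      <filter class="solr.PatternReplaceFilterFactory"
              pattern="([^a-z0-9])" replacement="" replace="all"/>
      <!-- truncate the query to its first 20 characters -->
      <filter class="solr.PatternReplaceFilterFactory"
              pattern="^(.{20})(.*)?" replacement="$1" replace="all"/>
    </analyzer>
  </fieldType>

  <field name="ac" type="autocomplete" indexed="true" stored="true" required="false"/>

With a type like this, q=ac:samsu should match documents whose "ac" value starts with "samsu" -- which is also why reindexing after the schema change, as asked above, is essential.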
Re: Highlighting Unindexed Fields
On 3-Sep-08, at 1:29 PM, Chris Harris wrote:

> http://wiki.apache.org/solr/FieldOptionsByUseCase says that a field
> needs to be both stored and indexed for highlighting to work. Unless
> I'm very confused, though, I just tested and highlighting worked fine
> (on trunk) for a stored, *non-indexed* field. So is this info perhaps
> out of date?

Good point. An analyzer/token filter chain is required to be defined, but actually indexing isn't strictly necessary.

> Assuming it's correct that indexing the field is not required for
> highlighting, there isn't any highlighting performance benefit from
> indexing the field, is there? I guess if you have indexed and
> termVectors and termPositions then you'll see a highlighting speedup,
> but not from indexed alone.

True.

-Mike
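For reference, a minimal sketch of the case under discussion (the field name and type are illustrative; the field's type must still define an analyzer so the highlighter can tokenize the stored text):

  <field name="body" type="text" indexed="false" stored="true" />

  http://localhost:8983/solr/select?q=ipod&hl=true&hl.fl=body

Here the query matches on other (indexed) fields, while the highlighter re-analyzes the stored "body" value to mark up the query terms.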
Re: Realtime Search for Social Networks Collaboration
Hi Yonik,

The SOLR 2 list looks good. The question is, who is going to do the work? I tried to simplify the scope of Ocean as much as possible to make it possible (if slowly, over time) for me to eventually finish what is mentioned on the wiki.

I think SOLR is very cool and was a major step forward when it came out. I also think it now has a lot of things which make integration difficult to do properly. I did try to integrate and received a lukewarm response, and so decided to just move ahead separately until folks have time to collaborate. We probably should try to integrate SOLR and Ocean somehow; however, we may want to simply reduce the scope a bit and figure out what is needed most, with the main use case being social networks.

I think the problem with integration with SOLR is that it was designed with a different problem set in mind than Ocean -- originally the CNET shopping application. Facets were important; realtime was not needed, because pricing doesn't change very often. I designed Ocean for social networks and, looking further into the future, for realtime-messaging-based mobile applications.

SOLR needs to be backward compatible and support its existing user base. How do you plan on doing this for a SOLR 2 if the architecture is changed dramatically? SOLR solves a problem set that is very common, which makes SOLR very useful in many situations. However, I wanted Ocean to be like GData. So I wanted the scalability of Google, which SOLR doesn't quite have yet, and the realtime, and then I figured the other stuff could be added later -- stuff people currently seem to spend a lot of time on in the SOLR community (spellchecker, DB imports, many others).

I did use some of the SOLR terminology in building Ocean, like snapshots! But most of it is a digression. I tried to use schemas, but they just made the system harder to use. For distributed search I prefer serialized objects, as this enables things like SpanQueries and payloads without writing request handlers and such. Also, there is no need to write new request handlers and deploy them (an expensive operation for systems that run on hundreds of servers), as any new classes are simply dynamically loaded by the server from the client.

A lot is now outlined on the wiki at http://wiki.apache.org/lucene-java/OceanRealtimeSearch and there will be a lot more javadocs in the forthcoming patch. The latest code is also available at all times at http://oceansearch.googlecode.com/svn/trunk/trunk/oceanlucene

I do welcome more discussion, and if there are Solr developers who wish to work on Ocean, feel free to drop me a line. Most of all, though, I think it would be useful for social networks interested in realtime search to get involved, as it may be difficult for one company to have enough resources to implement this to a production level. I think this is where open source collaboration is particularly useful.

Cheers,
Jason Rutherglen
[EMAIL PROTECTED]

On Wed, Sep 3, 2008 at 4:56 PM, Yonik Seeley <[EMAIL PROTECTED]> wrote:
> On Wed, Sep 3, 2008 at 3:20 PM, Jason Rutherglen
> <[EMAIL PROTECTED]> wrote:
>> I am wondering if there are social networks (or anyone else) out there
>> who would be interested in collaborating with Apache on realtime search
>> to get it to the point it can be used in production.
>
> Good timing Jason, I think you'll find some other people right here
> at Apache (solr-dev) that want to collaborate in this area:
>
> http://www.nabble.com/solr2%3A-Onward-and-Upward-td19224805.html
>
> I've looked at your wiki briefly, and all the high level goals/features
> seem to really be synergistic with where we are going with Solr2.
>
> -Yonik