Re: Multicore Issue with nightly build
Hello again. I finally managed to add/update a single Solr core using the Perl CPAN Solr module by Timothy Garafola. But I am unable to update or add anything in a multicore environment! I was wondering if I am doing something incorrectly, or if there is an issue at this point? Should I be editing the schema.xml for the specific core?

Thank you,
K

On Mon, Apr 7, 2008 at 12:54 PM, kirk beers <[EMAIL PROTECTED]> wrote:
> Which schema.xml are you referring to? The core0 schema.xml or the main
> schema.xml? Because I get the following error when I use:
>
> camera
>
> I get this error:
>
> org.apache.solr.common.SolrException: ERROR:unknown field 'cat'
>   at org.apache.solr.update.DocumentBuilder.toDocument(DocumentBuilder.java:245)
>   at org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:66)
>   at org.apache.solr.handler.XmlUpdateRequestHandler.processUpdate(XmlUpdateRequestHandler.java:196)
>   at org.apache.solr.handler.XmlUpdateRequestHandler.doLegacyUpdate(XmlUpdateRequestHandler.java:386)
>   at org.apache.solr.servlet.SolrUpdateServlet.doPost(SolrUpdateServlet.java:65)
>   at javax.servlet.http.HttpServlet.service(HttpServlet.java:710)
>   at javax.servlet.http.HttpServlet.service(HttpServlet.java:803)
>   at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:269)
>   at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:188)
>   at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:320)
>   at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:215)
>   at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:188)
>   at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:213)
>   at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:174)
>   at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
>   at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:117)
>   at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:108)
>   at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:151)
>   at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:874)
>   at org.apache.coyote.http11.Http11BaseProtocol$Http11ConnectionHandler.processConnection(Http11BaseProtocol.java:665)
>   at org.apache.tomcat.util.net.PoolTcpEndpoint.processSocket(PoolTcpEndpoint.java:528)
>   at org.apache.tomcat.util.net.LeaderFollowerWorkerThread.runIt(LeaderFollowerWorkerThread.java:81)
>   at org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:689)
>   at java.lang.Thread.run(Thread.java:619)
>
> On Mon, Apr 7, 2008 at 11:50 AM, Thomas Arni <[EMAIL PROTECTED]> wrote:
> > Please make sure that you do NOT have a field called "category" in
> > the documents you would like to add. For example:
> >
> > camera
> >
> > I am almost sure you have some documents
> > which have this field "category" instead of "cat".
> >
> > You can also add the field "category" to your schema.xml file and copy
> > it to the "cat" field.
> >
> > kirk beers said the following on 07/04/2008 15:40:
> > > Hi Ryan,
> > >
> > > I reinstalled the multicore setup and I have it running and working
> > > properly. The cores newswire2 etc. contained indexes from a prior
> > > Lucene application which did not seem to work in the multicore setup.
> > >
> > > Now that I have multicore running, are there any instructions on how to
> > > add/update individual cores with new docs? I have set core0 as a
> > > default within multicore to make it updateable. But now I keep getting
> > > errors from curl saying it does not recognize specific field names
> > > like 'cat', which seem to be declared in both.
> > > I am likewise using the following command line:
> > >
> > > curl -d @add.xml http://localhost:8080/solr/update
> > >
> > > Here are the contents of add.xml:
> > >
> > > 9885A004
> > > Canon PowerShot SD500
> > > camera
> > > 3x optical zoom
> > > aluminum case
> > > 6.4
> > > 329.95
> > >
> > > Here is the core0 schema:
> > >
> > > sortMissingLast="true" omitNorms="true"/>
> > > multiValued="false" required="true"/>
> > > multiValued="false" />
> > > multiValued="false" />
> > > multiValued="false" />
> > > multiValued="false" />
> > >
> > > id
> > > name
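Thomas Arni's suggestion in the quoted message above (declare a "category" field and copy it into "cat") would look roughly like the following schema.xml fragment. The field type and the indexed/stored attributes here are illustrative assumptions, since the poster's actual schema is not fully legible in the archive:

```xml
<!-- Sketch: accept documents that use "category" and fold it into "cat". -->
<!-- Attributes are assumptions, not the poster's real schema. -->
<field name="cat"      type="string" indexed="true" stored="true" multiValued="true"/>
<field name="category" type="string" indexed="true" stored="true"/>

<!-- copyField duplicates the value at index time, so searches on "cat"
     also match documents that were posted with a "category" field. -->
<copyField source="category" dest="cat"/>
```

The alternative, as Thomas notes, is simply to fix the documents so they use the field name the schema already declares.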
Re: Number of docs per segments
I have 8 GB of RAM and 4 CPUs at 2.4 GHz each. Is this the information you were looking for?

Otis Gospodnetic wrote:
> You'll need to provide more information about your environment and index
> if you want guesstimates.
>
> Otis
> --
> Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
>
> - Original Message
> From: swarag <[EMAIL PROTECTED]>
> To: solr-user@lucene.apache.org
> Sent: Monday, April 7, 2008 2:31:55 PM
> Subject: Number of docs per segments
>
> I have about 15 million documents totalling 27GB. I would like to know how
> many segments I would need to split them into. I am trying to achieve a
> QPS of 100.
> --
> View this message in context:
> http://www.nabble.com/Number-of-docs-per-segments-tp16538528p16538528.html
> Sent from the Solr - User mailing list archive at Nabble.com.

--
View this message in context: http://www.nabble.com/Number-of-docs-per-segments-tp16538528p16565488.html
Sent from the Solr - User mailing list archive at Nabble.com.
Solr & Tomcat 5.5
Hi. We are experimenting with installing Tomcat 5.5 from Red Hat Repositories.

Tomcat 5.5, Java 1.5, Solr 1.2, and RHEL 4

When I try to access solr, the following error occurs:

SEVERE: Exception starting filter SolrRequestFilter
java.lang.UnsupportedClassVersionError: unsupported classversion 49.0
	at java.lang.ClassLoader.defineClass0(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:539)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:123)
	at org.apache.catalina.loader.WebappClassLoader.findClassInternal(WebappClassLoader.java:1812)
	at org.apache.catalina.loader.WebappClassLoader.findClass(WebappClassLoader.java:866)
	at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1319)
	at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1198)
	at org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:209)
	at org.apache.catalina.core.ApplicationFilterConfig.setFilterDef(ApplicationFilterConfig.java:304)
	at org.apache.catalina.core.ApplicationFilterConfig.<init>(ApplicationFilterConfig.java:77)
	at org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:3600)
	at org.apache.catalina.core.StandardContext.start(StandardContext.java:4193)
	at org.apache.catalina.manager.ManagerServlet.start(ManagerServlet.java:1173)
	at org.apache.catalina.manager.HTMLManagerServlet.start(HTMLManagerServlet.java:549)
	at org.apache.catalina.manager.HTMLManagerServlet.doGet(HTMLManagerServlet.java:105)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:689)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:802)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:252)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:173)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:213)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:178)
	at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:524)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:126)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:105)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:107)
	at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:148)
	at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:869)
	at org.apache.coyote.http11.Http11BaseProtocol$Http11ConnectionHandler.processConnection(Http11BaseProtocol.java:664)
	at org.apache.tomcat.util.net.PoolTcpEndpoint.processSocket(PoolTcpEndpoint.java:527)
	at org.apache.tomcat.util.net.LeaderFollowerWorkerThread.runIt(LeaderFollowerWorkerThread.java:80)
	at org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:684)
	at java.lang.Thread.run(Thread.java:534)

Any suggestions? Thanks in advance for your help.

Richard
--
View this message in context: http://www.nabble.com/Solr---Tomcat-5.5-tp16566490p16566490.html
Sent from the Solr - User mailing list archive at Nabble.com.
Re: Solr & Tomcat 5.5
You're running an older JVM than what was used to compile the code.

-Yonik

On Tue, Apr 8, 2008 at 1:00 PM, Richard Lichlyter-Klein <[EMAIL PROTECTED]> wrote:
> Hi. We are experimenting with installing Tomcat 5.5 from Red Hat
> Repositories.
>
> Tomcat 5.5, Java 1.5, Solr 1.2, and RHEL 4
>
> When I try to access solr, the following error occurs:
>
> SEVERE: Exception starting filter SolrRequestFilter
> java.lang.UnsupportedClassVersionError: unsupported classversion 49.0
> [stack trace snipped; see the original message above]
>
> Any suggestions?
>
> Thanks in advance for your help.
>
> Richard
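Yonik's diagnosis can be checked directly: every .class file records the bytecode "major version" in bytes 6-7 of its header, and 49 means the code was compiled for Java 5, which a 1.4 JVM refuses to load. A sketch, using a fabricated header for illustration rather than a real Solr class file:

```shell
# A class file starts with magic CAFEBABE, then a 2-byte minor and a
# 2-byte major version. Simulate a Java-5 header (major = 0x31 = 49):
printf '\xca\xfe\xba\xbe\x00\x00\x00\x31' > /tmp/Fake.class

# Read bytes 6-7 as hex and convert big-endian hex to decimal:
hexmajor=$(od -An -j6 -N2 -tx1 /tmp/Fake.class | tr -d ' ')
major=$((16#$hexmajor))
echo "class file major version: $major"   # 49 -> compiled for Java 5
```

On a real install you would run the `od` step against a class extracted from the Solr WAR; if it reports 49 while `java -version` inside Tomcat's environment shows 1.4, the fix is pointing Tomcat at a Java 5 JVM. (The `Thread.java:534` frame in the trace also suggests a 1.4 runtime, though line numbers alone are not proof.)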
Re: Date range performance
OK, just to give some feedback: I reindexed with less precision as you suggested and it's working really fast. Thanks for your help!

Jonathan

On Fri, Apr 4, 2008 at 6:02 PM, Chris Hostetter <[EMAIL PROTECTED]> wrote:
> : Looking into the code it seems like a Lucene problem, more than Solr. It is
> : in the RangeQuery and RangeFilter classes. The problem with changing this to
> : have a sorted index and then binary search is that you have to sort it,
> : which is slow. Unless we can store the ordered index somewhere and reuse it,
> : it will be even slower than now. And if we store it, we will have to face
> : the problem of updating the ordered index with new terms.
>
> FWIW: Lucene Term enumeration is already indexed, it's just not a binary
> search tree (the details escape me at the moment, but there is an
> interval value of N somewhere in the code, and every Nth Term is loaded
> into memory so a TermEnum.seek can skip ahead N terms at a time).
>
> But the number of unique terms can be a bottleneck ... rounding to the
> level of precision you absolutely need can save you in these cases by
> reducing the number of unique terms.
>
> -Hoss
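For reference, the "less precision" fix means rounding timestamps at index time so millions of distinct values collapse into a few terms, and Solr's DateMath syntax lets the query endpoints use matching precision. A sketch; the field name `timestamp` is an assumption:

```shell
# Millisecond-precision endpoints: the range enumeration may have to walk
# one indexed term per distinct timestamp.
fine='timestamp:[NOW-1MONTH TO NOW]'

# Day-precision endpoints: NOW/DAY rounds down to midnight. If the values
# were also indexed at day precision, the range covers ~30 terms.
coarse='timestamp:[NOW/DAY-1MONTH TO NOW/DAY]'

echo "fine:   $fine"
echo "coarse: $coarse"
```

The query-side rounding only helps if the indexed values were rounded too; the saving comes from fewer unique terms in the index, which is exactly Hoss's point above.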
Re: Solr & Tomcat 5.5
Thanks.

Yonik Seeley wrote:
> You're running an older JVM than what was used to compile the code.
>
> -Yonik
>
> [remainder of quoted thread snipped; see the original messages above]

--
View this message in context: http://www.nabble.com/Solr---Tomcat-5.5-tp16566490p16569252.html
Sent from the Solr - User mailing list archive at Nabble.com.
Re: Multicore Issue with nightly build
From the client side, multicore should behave exactly the same as multiple single-core servers running next to each other. I'm not familiar with the perl client, but it will need to be configured for each core -- rather than one client that talks to multiple cores.

While you install solr at:
  http://host/context

you will access each core at:
  http://host/context/coreX
  http://host/context/coreY

ryan

On Apr 8, 2008, at 9:51 AM, kirk beers wrote:
> Hello again. I finally managed to add/update a single Solr core using the
> Perl CPAN Solr module by Timothy Garafola. But I am unable to update or add
> anything in a multicore environment! I was wondering if I am doing something
> incorrectly, or if there is an issue at this point? Should I be editing the
> schema.xml for the specific core?
>
> [remainder of quoted thread snipped; see the original messages above]
>
> Here is the error I get:
>
> [EMAIL PROTECTED]:~/Desktop/tomcat-solr/apache-solr-nightly/example/exampledocs$ c
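Concretely, Ryan's per-core URLs mean the earlier `curl -d @add.xml http://localhost:8080/solr/update` has to name the core in the path. A sketch, assuming a core named `core0` under the `solr` context as in this thread:

```shell
CORE=core0                                    # assumed core name
URL="http://localhost:8080/solr/$CORE/update"
echo "posting to $URL"

# Against a live server you would post the document, then commit:
#   curl "$URL" -H 'Content-Type: text/xml' --data-binary @add.xml
#   curl "$URL" -H 'Content-Type: text/xml' --data-binary '<commit/>'
```

The Perl client would likewise be pointed at `http://localhost:8080/solr/core0` rather than the bare context, one client instance per core.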
Solr + Complex Legacy Schema -- Best Practices?
I was just wondering, has anybody dealt with trying to "translate" the data from a big, legacy DB schema to a Solr installation? What I mean is, our company has (drawn from a big data warehouse) a series of 6 tables A, B, C, D, E, and F of product information that we've currently been making searchable via a home-brew setup. Since that is kind of error-prone and involves a lot of manual data entry, we're looking at making it searchable via Solr. I know I've seen postings here where people were dealing with large DB collections, but those were fairly simple 1-to-1 setups, whereas this setup has a bunch of foreign-key constraints and composite IDs. Is this the kind of thing where best practice would still be to join it all together and index that, or would we be better off trying to work out a multi-core setup? I should add, we are *not* trying to search the DB live. We are only looking to do something like dump its contents once a day (via some external script) and index that.

The tables are pretty big (about 17k, 287k, 127k, 50k, 56k, and 29k rows respectively), so a join is not ideal as you can imagine (~800MB actually), but we certainly can do it if we need to.
Distributed Search
Hi, I am trying to search through a distributed index, and when I enter this link:

http://wil1devsch1.cs.tmcs:8983/select?shards=wil1devsch1.cs.tmcs:8983,wil1devsch1.cs.tmcs:8080&q=pizza

it always gives me results from the index stored on 8983 and not on 8080. Is there anything wrong in what I am doing?
--
View this message in context: http://www.nabble.com/Distributed-Search-tp16577204p16577204.html
Sent from the Solr - User mailing list archive at Nabble.com.
Nightly build compile error?
Hello everyone. I downloaded the latest nightly build from http://people.apache.org/builds/lucene/solr/nightly/. When I tried to compile it, I got the following errors:

[javac] Compiling 189 source files to /home/csweb/apache-solr-nightly/build/core
[javac] /home/csweb/apache-solr-nightly/src/java/org/apache/solr/handler/admin/MultiCoreHandler.java:93: cannot find symbol
[javac] symbol  : variable CREATE
[javac] location: class org.apache.solr.common.params.MultiCoreParams.MultiCoreAction
[javac]     if (action == MultiCoreAction.CREATE) {
[javac] /home/csweb/apache-solr-nightly/src/java/org/apache/solr/handler/admin/MultiCoreHandler.java:95: cannot find symbol
[javac] symbol  : variable NAME
[javac] location: interface org.apache.solr.common.params.MultiCoreParams
[javac]     dcore.init(params.get(MultiCoreParams.NAME),
[javac] /home/csweb/apache-solr-nightly/src/java/org/apache/solr/handler/admin/MultiCoreHandler.java:96: cannot find symbol
[javac] symbol  : variable INSTANCE_DIR
[javac] location: interface org.apache.solr.common.params.MultiCoreParams
[javac]     params.get(MultiCoreParams.INSTANCE_DIR));
[javac] /home/csweb/apache-solr-nightly/src/java/org/apache/solr/handler/admin/MultiCoreHandler.java:99: cannot find symbol
[javac] symbol  : variable CONFIG
[javac] location: interface org.apache.solr.common.params.MultiCoreParams
[javac]     String opts = params.get(MultiCoreParams.CONFIG);
[javac] /home/csweb/apache-solr-nightly/src/java/org/apache/solr/handler/admin/MultiCoreHandler.java:103: cannot find symbol
[javac] symbol  : variable SCHEMA
[javac] location: interface org.apache.solr.common.params.MultiCoreParams
[javac]     opts = params.get(MultiCoreParams.SCHEMA);
[javac] /home/csweb/apache-solr-nightly/src/java/org/apache/solr/handler/admin/MultiCoreHandler.java:164: unqualified enumeration constant name required
[javac]     case PERSIST: {
[javac] /home/csweb/apache-solr-nightly/src/java/org/apache/solr/handler/component/QueryComponent.java:356: cannot find symbol
[javac] symbol  : class ShardFieldSortedHitQueue
[javac] location: class org.apache.solr.handler.component.QueryComponent
[javac]     ShardFieldSortedHitQueue queue = new ShardFieldSortedHitQueue(sortFields, ss.getOffset() + ss.getCount());
[javac] /home/csweb/apache-solr-nightly/src/java/org/apache/solr/handler/component/QueryComponent.java:356: cannot find symbol
[javac] symbol  : class ShardFieldSortedHitQueue
[javac] location: class org.apache.solr.handler.component.QueryComponent
[javac]     ShardFieldSortedHitQueue queue = new ShardFieldSortedHitQueue(sortFields, ss.getOffset() + ss.getCount());
[javac] /home/csweb/apache-solr-nightly/src/java/org/apache/solr/handler/component/QueryComponent.java:491: cannot find symbol
[javac] symbol  : method join(java.util.ArrayList,char)
[javac] location: class org.apache.solr.common.util.StrUtils
[javac]     sreq.params.add("ids", StrUtils.join(ids, ','));
[javac] Note: Some input files use or override a deprecated API.
[javac] Note: Recompile with -Xlint:deprecation for details.
[javac] Note: Some input files use unchecked or unsafe operations.
[javac] Note: Recompile with -Xlint:unchecked for details.
[javac] 9 errors

Am I doing something wrong?
--
View this message in context: http://www.nabble.com/Nightly-build-compile-error--tp16577739p16577739.html
Sent from the Solr - User mailing list archive at Nabble.com.
How to improve Solr search performance
I'm testing Solr search performance using LoadRunner. The index contains 5M+ docs, about 10.7GB large. CPU: 3.2GHz x 2, RAM: 16GB. The results are dispiriting: max 19s, min 1.5s, avg 11.7s. But the QTime is around 1s (simple query without facet or mlt, just fetching the top 50 IDs). So it seems that XMLWriter is the bottleneck. Am I right? Or is there anything I can do to improve the performance?
Re: How to improve Solr search performance
On 08/04/2008, at 23:13, 李银松 wrote:

> I'm testing Solr search performance using LoadRunner. The index contains
> 5M+ docs, about 10.7GB large. CPU: 3.2GHz x 2, RAM: 16GB. The results are
> dispiriting: max 19s, min 1.5s, avg 11.7s. But the QTime is around 1s
> (simple query without facet or mlt, just fetching the top 50 IDs). So it
> seems that XMLWriter is the bottleneck. Am I right? Or is there anything I
> can do to improve the performance?

Are you limiting the results? Because if you are not, writing even a tiny portion of those 10GB can take 11 secs for sure. If you can, please see the difference between writing XML and writing JSON data.

--
Leonardo Santagada
Re: How to improve Solr search performance
Limiting the results? How? I had set rows=50 and fl=ID -- is that what you meant by limiting the results?

When I switch to writing JSON data, the result is better: max 14.8s, min 0.1s, avg 8.7s.

Correction: QTime is around 1-4s, not 1s.

2008/4/9, Leonardo Santagada <[EMAIL PROTECTED]>:
> Are you limiting the results? Because if you are not, writing even a tiny
> portion of those 10GB can take 11 secs for sure. If you can, please see
> the difference between writing XML and writing JSON data.
>
> --
> Leonardo Santagada
Re: How to improve Solr search performance
Most of the time seems to be spent by the writer getting and writing the docs. Can those docs be prefetched?

2008/4/9, Leonardo Santagada <[EMAIL PROTECTED]>:
> Are you limiting the results? Because if you are not, writing even a tiny
> portion of those 10GB can take 11 secs for sure. If you can, please see
> the difference between writing XML and writing JSON data.
>
> --
> Leonardo Santagada
Re: Solr + Complex Legacy Schema -- Best Practices?
800MB does not seem that big. Since all six of your tables contain product information, it should not be very difficult to join them together and import them into one Solr index. Again, all of this depends on what you're searching on and what you want to display as results.

Have you taken a look at http://wiki.apache.org/solr/DataImportHandler ? Although it is still in development, we've used it successfully to import data from a large collection of joined tables.

On Wed, Apr 9, 2008 at 2:31 AM, Tkach <[EMAIL PROTECTED]> wrote:
> I was just wondering, has anybody dealt with trying to "translate" the
> data from a big, legacy DB schema to a Solr installation? [...]
>
> The tables are pretty big (about 17k, 287k, 127k, 50k, 56k, and 29k rows
> respectively), so a join is not ideal as you can imagine (~800MB actually),
> but we certainly can do it if we need to.

--
Regards,
Shalin Shekhar Mangar.
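For the foreign-key case the poster describes, DataImportHandler's data-config.xml can express the joins as nested entities, where a child entity's query references columns from the parent row. A hedged sketch: every table, column, and connection detail below is an invented placeholder, not the poster's warehouse schema:

```xml
<dataConfig>
  <dataSource driver="com.mysql.jdbc.Driver"
              url="jdbc:mysql://localhost/warehouse"
              user="solr" password="secret"/>
  <document>
    <!-- Parent entity: one Solr document per product row. -->
    <entity name="product" query="SELECT id, name FROM product">
      <field column="id"   name="id"/>
      <field column="name" name="name"/>
      <!-- Child entity: runs once per parent row; ${product.id} is the
           parent column reference, flattening the join into the doc. -->
      <entity name="feature"
              query="SELECT description FROM feature
                     WHERE product_id = '${product.id}'">
        <field column="description" name="features"/>
      </entity>
    </entity>
  </document>
</dataConfig>
```

Note the per-row child query can be slow against a 287k-row table; a single pre-joined SQL query in one entity is the alternative when the database does joins cheaply.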
Re: How to improve Solr search performance
On 09/04/2008, at 00:24, 李银松 wrote:

> Most of the time seems to be spent in the writer fetching and writing the
> docs; can those docs be prefetched?

There is a cache in Solr. If you really want to, you could make the cache and the JVM as big as your memory; that should let most of the 10GB index fit. How to configure this is covered on the Solr wiki pages.

Do you need all the fields on the documents? You could also make the index contain only the things needed for search, and retrieve the actual record data from your database when the user selects an item from the results, or something like that.

Probably tomorrow someone with more Solr knowledge will give you a better answer.

Good luck,
--
Leonardo Santagada
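For reference, the cache Leonardo mentions for stored documents is configured in solrconfig.xml. A sketch of the relevant stanza (the sizes here are illustrative only, not tuned recommendations):

```xml
<!-- Caches stored fields of documents fetched from the index, so
     repeated hits skip the disk read during response writing.
     Size it toward your hot document count if RAM allows. -->
<documentCache
    class="solr.LRUCache"
    size="512000"
    initialSize="512000"
    autowarmCount="0"/>
```

A larger JVM heap is set separately on the container (e.g. via -Xmx); the cache only helps if the heap can actually hold it.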
Re: How to improve Solr search performance
I will try what you suggested! Thanks a lot~

On 2008-4-9, Leonardo Santagada <[EMAIL PROTECTED]> wrote:
>
> On 09/04/2008, at 00:24, 李银松 wrote:
> > Most of the time seems to be spent in the writer fetching and writing
> > the docs; can those docs be prefetched?
>
> There is a cache in Solr. If you really want to, you could make the cache
> and the JVM as big as your memory; that should let most of the 10GB index
> fit. How to configure this is covered on the Solr wiki pages.
>
> Do you need all the fields on the documents? You could also make the index
> contain only the things needed for search, and retrieve the actual record
> data from your database when the user selects an item from the results, or
> something like that.
>
> Probably tomorrow someone with more Solr knowledge will give you a better
> answer.
>
> Good luck,
> --
> Leonardo Santagada
Re: Distributed Search
On Tue, Apr 8, 2008 at 8:56 PM, swarag <[EMAIL PROTECTED]> wrote:
> I am trying to search through a distributed index and when I enter this
> link:
> http://wil1devsch1.cs.tmcs:8983/select?shards=wil1devsch1.cs.tmcs:8983,wil1devsch1.cs.tmcs:8080&q=pizza
> it always gives me results from the index stored on 8983 and not on
> 8080.
> Is there anything wrong in what I am doing?

A shard should be specified more like wil1devsch1.cs.tmcs:8983/solr

But that shouldn't be causing your problem. Make sure you are using the standard request handler with one of the latest builds of Solr. It looks like the shards param is simply being ignored.

-Yonik
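To make Yonik's point about the shard format concrete, here is a small sketch that builds the corrected request URL. The host names come from the original mail; the /solr context path is an assumption about how the webapps are deployed:

```python
from urllib.parse import urlencode

# Per Yonik's note, each shard entry should include the webapp path,
# i.e. host:port/solr, not just host:port.
shards = [
    "wil1devsch1.cs.tmcs:8983/solr",
    "wil1devsch1.cs.tmcs:8080/solr",
]

# urlencode escapes the colons, commas, and slashes for safe transport.
params = urlencode({"shards": ",".join(shards), "q": "pizza"})
url = "http://wil1devsch1.cs.tmcs:8983/solr/select?" + params
print(url)
```

If the shards parameter is still ignored after this fix, checking that the request is going through the standard request handler (as Yonik suggests) is the next step.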
Re: How to improve Solr search performance
: most of time seems to be used for the writer getting and writing the docs
: can those docs prefetched?

As mentioned, the documentCache can help you out in the common case, but 1-4 seconds for just the XML writing seems pretty high...

1) How are you timing this (ie: what exactly are you measuring)?

2) How many stored fields do each of your documents have? (Not how many are in your schema.xml; how many do each of your docs really have in them.) Having *lots* of stored fields can slow down retrieval of the Document (and Document retrieval is delayed until response writing), so if you have thousands, that might account for it.

If your use case is to only ever return the "ID" field, then not storing anything else will help keep your total index size smaller and should speed up the response writing.

-Hoss
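Hoss's "return only the ID field" advice can also be approximated on the query side with the fl parameter, so Solr serializes just one stored field per hit. A sketch building such a request (the field name "id" and the query term are assumptions for illustration):

```python
from urllib.parse import urlencode

# Restricting the field list to the unique-key field means Solr only
# has to load and write one stored field per document in the response,
# which isolates how much of the wall-clock time is response writing.
params = urlencode({"q": "pizza", "fl": "id", "rows": 50})
url = "http://localhost:8983/solr/select?" + params
print(url)
```

Comparing the wall-clock time of this request against the same query with the full field list is one way to answer Hoss's question 1 about what is actually being measured.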
Re: Solr + Complex Legacy Schema -- Best Practices?
: I just was wondering, has anybody dealt with trying to "translate" the
: data from a big, legacy DB schema to a Solr installation? What I mean

There's really no general answer to that question -- it all comes down to what you want to query on, and what kinds of results you want to get out.

If you want your queries to result in lists of "products", then you should have one Document per product. If you want to be able to query on the text of user reviews, then you need to flatten all the user reviews for each product into the Document for each product.

Sometimes you'll want two types of Documents ... one Document per product, containing all the text of all the user reviews, and one Document per user review, with the product information duplicated in each. Then you can search for...

  q=reviewtext:solr&fq=doctype:product&fq=producttype:camera

...to get a list of all the products that are cameras and contain the word solr in the text of *a* review, or you can search for...

  q=reviewtext:solr&fq=doctype:review&fq=producttype:camera

...to get a list of all the reviews that contain the word solr and are about products that are cameras.

Your use cases and goals will be different than everyone else's.

-Hoss
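The dual-document-type pattern Hoss describes implies a few shared fields in schema.xml so both kinds of documents can live in one index. A sketch (field names follow the queries above; the field types are assumptions):

```xml
<!-- Discriminator so queries can filter to one document kind -->
<field name="doctype"     type="string" indexed="true" stored="true"/>
<!-- Duplicated product metadata, present on both doc types -->
<field name="producttype" type="string" indexed="true" stored="true"/>
<!-- Full review text: searchable but not stored, per the advice
     about keeping stored fields minimal -->
<field name="reviewtext"  type="text"   indexed="true" stored="false"/>
```

Each indexed document would then carry doctype=product or doctype=review, which is what makes the two fq examples above select the right subset.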
Re: Distributed Search
We are using the Chain Collapse patch as well. Will that not work over a distributed index?

swarag wrote:
>
> Hi,
> I am trying to search through a distributed index and when I enter this
> link:
> http://wil1devsch1.cs.tmcs:8983/select?shards=wil1devsch1.cs.tmcs:8983,wil1devsch1.cs.tmcs:8080&q=pizza
> it always gives me results from the index stored on 8983 and not on
> 8080.
> Is there anything wrong in what I am doing?

--
View this message in context: http://www.nabble.com/Distributed-Search-tp16577204p16580104.html
Sent from the Solr - User mailing list archive at Nabble.com.
Return the result only field A or field B is non-zero?
Hi all,

I want to limit the search results by two numerical fields, A and B, so that Solr returns a result only when the value in field A or field B is non-zero. Is this possible, or do I need to change the documents and schema? Or do I need to change the schema as well as the query?

Thank you,
Vinci

--
View this message in context: http://www.nabble.com/Return-the-result-only-field-A-or-field-B-is-non-zero--tp16580681p16580681.html
Sent from the Solr - User mailing list archive at Nabble.com.
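One hedged approach, assuming A and B hold non-negative integers, is a filter query with open-ended range clauses rather than a schema change. A sketch building such a request (field names taken from the question; untested against a live index, and it would miss negative values):

```python
from urllib.parse import urlencode

# Matches documents where A >= 1 or B >= 1, i.e. at least one of the
# two numeric fields is non-zero -- only valid if neither field can
# hold a negative value.
fq = "A:[1 TO *] OR B:[1 TO *]"
params = urlencode({"q": "*:*", "fq": fq})
print(params)
```

Since this is only a filter on top of the main query, no re-indexing or schema edit would be needed if the range-query approach fits the data.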