Cannot get Solr to pick up the solr/home property
Software: Linux CentOS, Tomcat 5.5, Plesk 9.2.1, Solr 1.3.0

This is the error log I get when I stop Solr (or attempt to) and then restart it from Plesk:

INFO: Manager: stop: Stopping web application at '/solr'
Aug 15, 2009 10:49:30 PM org.apache.catalina.core.StandardContext stop
INFO: Container org.apache.catalina.core.ContainerBase.[PSA].[mywebk9.com].[/solr] has not been started
Aug 15, 2009 10:49:31 PM org.apache.catalina.core.ApplicationContext log
INFO: Manager: list: Listing contexts for virtual host 'mywebk9.com'
Aug 15, 2009 10:49:42 PM org.apache.catalina.core.ApplicationContext log
INFO: Manager: start: Starting web application at '/solr'
Aug 15, 2009 10:49:43 PM org.apache.solr.servlet.SolrDispatchFilter init
INFO: SolrDispatchFilter.init()
Aug 15, 2009 10:49:43 PM org.apache.solr.core.SolrResourceLoader locateInstanceDir
INFO: Using JNDI solr.home: /usr/share/tomcat5/solr
Aug 15, 2009 10:49:43 PM org.apache.solr.core.CoreContainer$Initializer initialize
INFO: looking for solr.xml: /usr/share/tomcat5/solr/solr.xml
Aug 15, 2009 10:49:43 PM org.apache.solr.core.SolrResourceLoader <init>
INFO: Solr home set to '/usr/share/tomcat5/solr/'
Aug 15, 2009 10:49:43 PM org.apache.solr.core.SolrResourceLoader createClassLoader
INFO: Adding 'file:/usr/share/tomcat5/solr/lib/jetty-6.1.3.jar' to Solr classloader
Aug 15, 2009 10:49:43 PM org.apache.solr.core.SolrResourceLoader createClassLoader
INFO: Adding 'file:/usr/share/tomcat5/solr/lib/servlet-api-2.5-6.1.3.jar' to Solr classloader
Aug 15, 2009 10:49:43 PM org.apache.solr.core.SolrResourceLoader createClassLoader
INFO: Adding 'file:/usr/share/tomcat5/solr/lib/jetty-util-6.1.3.jar' to Solr classloader
Aug 15, 2009 10:49:43 PM org.apache.solr.core.SolrResourceLoader createClassLoader
INFO: Adding 'file:/usr/share/tomcat5/solr/lib/jsp-2.1/' to Solr classloader
Aug 15, 2009 10:49:43 PM org.apache.solr.servlet.SolrDispatchFilter init
SEVERE: Could not start SOLR. Check solr/home property
java.lang.ExceptionInInitializerError
        at org.apache.solr.core.CoreContainer$Initializer.initialize(CoreContainer.java:117)
        at org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:69)
        at org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:221)
        at org.apache.catalina.core.ApplicationFilterConfig.setFilterDef(ApplicationFilterConfig.java:302)
        at org.apache.catalina.core.ApplicationFilterConfig.<init>(ApplicationFilterConfig.java:78)
        at org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:3635)
        at org.apache.catalina.core.StandardContext.start(StandardContext.java:4222)
        at org.apache.catalina.manager.ManagerServlet.start(ManagerServlet.java:1176)
        at org.apache.catalina.manager.ManagerServlet.doGet(ManagerServlet.java:369)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:690)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:803)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:269)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:188)
        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:210)
        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:172)
        at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:525)
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:117)
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:108)
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:151)
        at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:870)
        at org.apache.coyote.http11.Http11BaseProtocol$Http11ConnectionHandler.processConnection(Http11BaseProtocol.java:665)
        at org.apache.tomcat.util.net.PoolTcpEndpoint.processSocket(PoolTcpEndpoint.java:528)
        at org.apache.tomcat.util.net.LeaderFollowerWorkerThread.runIt(LeaderFollowerWorkerThread.java:81)
        at org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:685)
        at java.lang.Thread.run(Thread.java:636)
Caused by: java.lang.RuntimeException: XPathFactory#newInstance() failed to create an XPathFactory for the default object model: http://java.sun.com/jaxp/xpath/dom with the XPathFactoryConfigurationException: javax.xml.xpath.XPathFactoryConfigurationException: No XPathFactory implementation found for the object model: http://java.sun.com/jaxp/xpath/dom
        at javax.xml.xpath.XPathFactory.newInstance(Unknown Source)
        at org.apache.solr.core.Config.<clinit>(Config.java:41)
        ... 26 more
Aug 15, 2009 10:49:43 PM org.apache.catalina.core.StandardContext filterStart
SEVERE: Excep
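Despite the "Check solr/home property" banner, the solr.home lookup actually succeeded ("Using JNDI solr.home: /usr/share/tomcat5/solr"); the real failure is the nested XPathFactoryConfigurationException. Solr's Config class calls XPathFactory.newInstance() during class initialization, and JAXP resolves the factory through the classpath, so an old or conflicting XML jar visible to the webapp (Tomcat's common or endorsed directories, or the jars in solr/lib) can make the lookup fail exactly like this. Below is a minimal sketch of the same lookup, runnable on a stock JDK; the class and file names are mine, not Solr's:

```java
import java.io.StringReader;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;
import org.xml.sax.InputSource;

public class XPathCheck {
    public static void main(String[] args) throws Exception {
        // This is the call that throws in the stack trace above: JAXP
        // searches the classpath for an XPathFactory implementation and
        // falls back to the JDK's built-in one if nothing else is found.
        XPathFactory factory = XPathFactory.newInstance();
        System.out.println("Resolved factory: " + factory.getClass().getName());

        // Exercise the factory the way an XML config parser would.
        XPath xpath = factory.newXPath();
        String two = xpath.evaluate("1+1",
                new InputSource(new StringReader("<root/>")));
        System.out.println(two); // prints "2" when the factory works
    }
}
```

If this program fails under the same JVM that runs Tomcat, the JVM or endorsed-dirs setup is the problem; if it succeeds, look for stray xml-apis/xalan jars on the webapp's classpath.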
Re: delta-import using a full-import command is not working
Thanks for your response. It is not empty; it contains the following:

#Sat Aug 15 16:44:18 PDT 2009
last_index_time=2009-08-15 16\:44\:17

Noble Paul നോബിള് नोब्ळ्-2 wrote:
>
> actually your dataimport.properties is empty, I guess that is the reason
>
> On Sun, Aug 16, 2009 at 5:19 AM, djain101 wrote:
>>
>> Hi,
>>
>> I am following the example on
>> http://wiki.apache.org/solr/DataImportHandlerFaq#fullimportdelta
>> and I configured my data-config.xml the same way as the example at that
>> link. The findDelta entity tag looks like:
>>
>> [entity element stripped by the mail archive; it sets rootEntity="false"]
>>
>> When I try to index using command=full-import&clean=false, it responds
>> with the message "Indexing completed. Added/Updated: 0 documents.
>> Deleted 0 documents." The generated query looks like:
>>
>> select party_id as id from core_party where LAST_UPDATED_TIMESTAMP >
>> to_date('', '-MM-DD HH24:MI:SS')
>>
>> It looks like Solr is not able to replace ${dataimporter.last_index_time}
>> with the timestamp from dataimport.properties. This file exists in my
>> conf folder and gets updated every time I do a full-import, so there is
>> no issue with the timestamp in dataimport.properties. Somehow, it is not
>> getting replaced in the SQL query.
>>
>> Please help!
>
> --
> -----
> Noble Paul | Principal Engineer | AOL | http://aol.com
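Incidentally, the backslash-escaped colons in the file are normal: java.util.Properties escapes ':' when storing and unescapes it on load, so the handler reads back a clean timestamp. The sketch below shows that round trip plus the variable substitution the import handler is supposed to perform; the String.replace call is my stand-in for the handler's variable resolver (not its actual code), and the 'YYYY-MM-DD HH24:MI:SS' format is an assumption, since the format string quoted in the thread looks truncated:

```java
import java.io.StringReader;
import java.util.Properties;

public class LastIndexTime {
    public static void main(String[] args) throws Exception {
        // dataimport.properties contents as quoted above; Properties.load
        // unescapes the "\:" sequences.
        Properties p = new Properties();
        p.load(new StringReader("last_index_time=2009-08-15 16\\:44\\:17"));
        String last = p.getProperty("last_index_time");
        System.out.println(last); // prints "2009-08-15 16:44:17"

        // The placeholder that should be filled in before the query runs.
        String template = "select party_id as id from core_party where "
                + "LAST_UPDATED_TIMESTAMP > to_date('"
                + "${dataimporter.last_index_time}', 'YYYY-MM-DD HH24:MI:SS')";
        String sql = template.replace("${dataimporter.last_index_time}", last);
        System.out.println(sql);
    }
}
```

An empty to_date('', ...) in the logged query therefore means the variable was never resolved, not that the file was missing or unreadable.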
Re: switching between requestHandler
On Aug 16, 2009, at 1:44 AM, Shalin Shekhar Mangar wrote:
> Use the qt parameter and specify the name of the handler, e.g. qt=site_search_main

Thanks! I totally forgot about that!
how do i - include the items without a facet
My result set for a query has 451 records; 293 of them do not have a value set on 'facet.location_name'. How can I specify searching for these items?

I've tried:

    fq=facet.location_name:""

and, based on
http://wiki.apache.org/solr/SimpleFacetParameters#head-b618fc041ffc0c26ea45bdf086895e5b87061bd4

    fq=-facet.location_name:[* TO *]
    fq=facet.location_name:[* TO *]

but none seem to work.
Re: Whats the maximum limit for Dynamic Fields?
aka, the sky.

Lucas Frare Teixeira .·.
- lucas...@gmail.com
- blog.lucastex.com
- twitter.com/lucastex

On Sun, Aug 16, 2009 at 2:45 AM, Shalin Shekhar Mangar <shalinman...@gmail.com> wrote:
> On Sun, Aug 16, 2009 at 8:43 AM, Ninad Raut wrote:
>>
>> Hi,
>> I want to know what's the maximum limit to how many dynamic fields can
>> be stored per document.
>
> There is no limit.
>
> --
> Regards,
> Shalin Shekhar Mangar.
Which versions?
Which versions of Lucene, Nutch and Solr work together? I've discovered that the Nutch trunk and the Solr trunk use wildly different versions of the Lucene jars, and it's causing me problems. -- http://www.linkedin.com/in/paultomblin
Re: how do i - include the items without a facet
>> how can i specify searching for these items?

fq=-location_name:[* TO *]

Your field name is "location_name", not "facet.location_name".

Cheers
Avlesh

On Mon, Aug 17, 2009 at 5:04 AM, Jonathan Vanasco wrote:
> My result set for a query has 451 records
> 293 of them do not have a value set on 'facet.location_name'
>
> how can i specify searching for these items?
>
> i've tried:
>
>    fq=facet.location_name:""
>
> and based on
> http://wiki.apache.org/solr/SimpleFacetParameters#head-b618fc041ffc0c26ea45bdf086895e5b87061bd4
>
>    fq=-facet.location_name:[* TO *]
>    fq=facet.location_name:[* TO *]
>
> but none seem to work
Re: how do i - include the items without a facet
On Aug 16, 2009, at 9:59 PM, Avlesh Singh wrote:
>> how can i specify searching for these items?
>
> fq=-location_name:[* TO *]
>
> Your field name is "location_name" and not "facet.location_name".

It's both, and that didn't work on either. "location_name" is a text field; a copyField puts it into "facet.location_name". I'm thinking this could be because the field was not entered as NULL but as an empty string?
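The empty-string guess matters for the range query. `[* TO *]` matches any document with at least one indexed term in the field, so a plain string field fed "" still carries a (zero-length) term and is excluded by the negated filter, while an analyzed text field usually produces no tokens for "" and would be matched. A small sketch of building the negated filter as a request URL (host, port, and core path are assumptions, not from the thread):

```java
import java.net.URLEncoder;

public class MissingValueFilter {
    public static void main(String[] args) throws Exception {
        // "[* TO *]" matches any value; the leading "-" negates the clause,
        // keeping only documents with no indexed value in the field.
        String fq = "-location_name:[* TO *]";
        String url = "http://localhost:8080/solr/select?q=*:*"
                + "&fq=" + URLEncoder.encode(fq, "UTF-8");
        System.out.println(url);
    }
}
```

If empty strings are the culprit, stripping them at index time (or adding an explicit `-location_name:""` clause alongside the range filter) may be needed.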
Re: JVM Heap utilization & Memory leaks with Solr
My primary issue is not an OutOfMemory error at run time; it is memory leaks: heap space not being released even after forcing a GC. So as progressively more heap gets utilized, I start running out of memory. The verdict however seems unanimous that there are no known memory leak issues within Solr. I am still looking at my application to analyse the problem. Thank you.

On Thu, Aug 13, 2009 at 10:58 PM, Fuad Efendi wrote:
> Most OutOfMemoryExceptions (if not 100%) happening with SOLR are because of
> http://lucene.apache.org/java/2_4_0/api/org/apache/lucene/search/FieldCache.html
> - it is used internally in Lucene to cache field values by document ID.
>
> My very long-term observation: SOLR can run without any problems for
> days/months, and an unpredictable OOM happens just because someone tried a
> sorted search, which will populate an array with the IDs of ALL documents
> in the index.
>
> The only solution: calculate exactly the amount of RAM needed for the
> FieldCache. For instance, for 100,000,000 documents a single instance of
> FieldCache may require 8*100,000,000 bytes (8 bytes per document ID?),
> which is almost 1Gb (at least!)
>
> I didn't notice any memory leaks after I started to use 16Gb RAM for the
> SOLR instance (almost a year without any restart!)
>
> -----Original Message-----
> From: Rahul R [mailto:rahul.s...@gmail.com]
> Sent: August-13-09 1:25 AM
> To: solr-user@lucene.apache.org
> Subject: Re: JVM Heap utilization & Memory leaks with Solr
>
> *You should try to generate heap dumps and analyze the heap using a tool
> like the Eclipse Memory Analyzer. Maybe it helps spotting a group of
> objects holding a large amount of memory*
>
> The tool that I used also allows capturing heap snapshots. Eclipse had a
> lot of pre-requisites; you need to apply some three or five patches before
> you can start using it. My observation with this tool was that some
> HashMaps were taking up a lot of space, although I could not pin it down
> to the exact HashMap. These would either be weblogic's or Solr's. I will
> anyway give Eclipse's a try and see how it goes. Thanks for your input.
>
> Rahul
>
> On Wed, Aug 12, 2009 at 2:15 PM, Gunnar Wagenknecht wrote:
>> Rahul R schrieb:
>>> I tried using a profiling tool - Yourkit. The trial version was free
>>> for 15 days. But I couldn't find anything of significance.
>>
>> You should try to generate heap dumps and analyze the heap using a tool
>> like the Eclipse Memory Analyzer. Maybe it helps spotting a group of
>> objects holding a large amount of memory.
>>
>> -Gunnar
>>
>> --
>> Gunnar Wagenknecht
>> gun...@wagenknecht.org
>> http://wagenknecht.org/
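Fuad's sizing rule is easy to sanity-check with arithmetic: each FieldCache array costs roughly 8 bytes per document for every field the index has ever sorted on (an assumption for long/double-sized entries; String fields cost considerably more). A back-of-the-envelope sketch using the numbers from the thread:

```java
public class FieldCacheEstimate {
    public static void main(String[] args) {
        long maxDoc = 100000000L; // documents in the index, as in the thread
        int sortFields = 1;       // distinct fields ever used for sorting
        long bytesPerEntry = 8L;  // assumed entry size per document
        long bytes = maxDoc * bytesPerEntry * sortFields;
        // 800,000,000 bytes is about 0.75 GiB, i.e. "almost 1Gb" as quoted
        System.out.printf("FieldCache estimate: %d bytes (%.2f GiB)%n",
                bytes, bytes / (1024.0 * 1024 * 1024));
    }
}
```

These arrays are held for the life of the IndexReader, which is consistent with heap that climbs after the first sorted query and never comes back even after a forced GC: retained cache rather than a leak.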