Quick comment - why so shy with the number of open file descriptors? On some
nothing-special machines from several years ago I had this limit set to 30K+ -
here, for example: http://www.simpy.com/user/otis :)
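
If you want to raise it on a Mac OS X Server box like the one described
below, something along these lines should do it (the numbers are only
illustrative, and double-check the exact commands for your OS version):

  # system-wide ceilings (run as root)
  sysctl -w kern.maxfiles=65536
  sysctl -w kern.maxfilesperproc=65536
  launchctl limit maxfiles 65536 65536

  # soft limit for the shell that launches Tomcat
  ulimit -n 30000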


Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch



----- Original Message ----
> From: vivek sar <vivex...@gmail.com>
> To: solr-user@lucene.apache.org
> Sent: Tuesday, April 14, 2009 3:12:41 AM
> Subject: Re: Question on StreamingUpdateSolrServer
> 
> The machine's ulimit is set to 9000 and the OS has an upper limit of
> 12000 on open files. What would explain this? Has anyone tried Solr with
> 25 cores on the same Solr instance?
> 
> Thanks,
> -vivek
> 
> 2009/4/13 Noble Paul നോബിള്‍  नोब्ळ् :
> > On Tue, Apr 14, 2009 at 7:14 AM, vivek sar wrote:
> >> Some more updates. As I mentioned earlier, we are using multi-core Solr
> >> (up to 65 cores in one Solr instance, each core around 10 GB). This was
> >> opening around 3000 file descriptors (per lsof). I removed some cores
> >> and after some trial and error I found that with 25 cores the system
> >> seems to work fine (around 1400 file descriptors). Tomcat is responsive
> >> even while Solr is indexing (with 25 cores). But as soon as it goes to
> >> 26 cores, Tomcat becomes unresponsive again. The puzzling thing is that
> >> if I stop indexing I can search across all 65 cores, but while indexing
> >> is happening it seems to handle only up to 25 cores.
> >>
> >> 1) Is there a limit on the number of cores a Solr instance can handle?
> >> 2) Does Solr do anything to the existing cores while indexing? I'm
> >> writing to only one core at a time.
> > There is no hard limit (it is Integer.MAX_VALUE). But in reality your
> > mileage depends on your hardware and the number of file handles the OS
> > can open.
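
One way to see where you stand is to watch the descriptor count on the
Tomcat process while cores load and indexing runs - something like this
(the ps/grep pattern is just a guess at how your Tomcat shows up):

  # assumes the Tomcat JVM shows up with "catalina" in its command line
  TOMCAT_PID=$(ps aux | grep '[c]atalina' | awk '{print $2}')
  while true; do
    echo "$(date '+%H:%M:%S')  open files: $(lsof -p $TOMCAT_PID | wc -l)"
    sleep 30
  done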
> >>
> >> We are struggling to figure out why Tomcat stops responding with a high
> >> number of cores while indexing is in progress. Any help is very much
> >> appreciated.
> >>
> >> Thanks,
> >> -vivek
> >>
> >> On Mon, Apr 13, 2009 at 10:52 AM, vivek sar wrote:
> >>> Here is some more information about my setup,
> >>>
> >>> Solr - v1.4 (nightly build 03/29/09)
> >>> Servlet Container - Tomcat 6.0.18
> >>> JVM - 1.6.0 (64 bit)
> >>> OS -  Mac OS X Server 10.5.6
> >>>
> >>> Hardware Overview:
> >>>
> >>> Processor Name: Quad-Core Intel Xeon
> >>> Processor Speed: 3 GHz
> >>> Number Of Processors: 2
> >>> Total Number Of Cores: 8
> >>> L2 Cache (per processor): 12 MB
> >>> Memory: 20 GB
> >>> Bus Speed: 1.6 GHz
> >>>
> >>> JVM Parameters (for Solr):
> >>>
> >>> export CATALINA_OPTS="-server -Xms6044m -Xmx6044m -DSOLR_APP
> >>> -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -Xloggc:gc.log
> >>> -Dsun.rmi.dgc.client.gcInterval=3600000
> >>> -Dsun.rmi.dgc.server.gcInterval=3600000"
> >>>
> >>> Other:
> >>>
> >>> lsof|grep solr|wc -l
> >>>    2493
> >>>
> >>> ulimit -an
> >>>  open files                      (-n) 9000
> >>>
> >>> Tomcat connector (server.xml):
> >>>    <Connector port="8080"
> >>>               connectionTimeout="20000"
> >>>               maxThreads="100" />
> >>>
> >>> Total Solr cores on same instance - 65
> >>>
> >>> useCompoundFile - true
> >>>
> >>> The tests I ran,
> >>>
> >>> While the Indexer is running:
> >>> 1) Go to "http://juum19.co.com:8080/solr" - returns a blank page (no
> >>> error in catalina.out)
> >>>
> >>> 2) Try "telnet juum19.co.com 8080" - returns "Connection closed by
> >>> foreign host"
> >>>
> >>> Stop the Indexer Program (Tomcat is still running with Solr)
> >>>
> >>> 3) Go to "http://juum19.co.com:8080/solr" - works OK, shows the list
> >>> of all the Solr cores
> >>>
> >>> 4) Try telnet - able to Telnet fine
> >>>
> >>> 5) Now comment out all the caches in solrconfig.xml. Try the same
> >>> tests, but Tomcat still doesn't respond.
> >>>
> >>> Is there a way to stop the auto-warmer? I commented out the caches in
> >>> solrconfig.xml but still see the following log:
> >>>
> >>> INFO: autowarming result for searc...@3aba3830 main
> >>> fieldValueCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
> >>>
> >>> INFO: Closing searc...@175dc1e2 main
> >>> fieldValueCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
> >>> filterCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
> >>> queryResultCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
> >>> documentCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
> >>>
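
On stopping the auto-warmer: rather than removing the caches, you should be
able to set autowarmCount="0" on each of them in solrconfig.xml, roughly
like this (sizes are just placeholders; fieldValueCache is created
implicitly, which is probably why it still shows up in your log):

  <query>
    <filterCache      class="solr.LRUCache" size="512" initialSize="512" autowarmCount="0"/>
    <queryResultCache class="solr.LRUCache" size="512" initialSize="512" autowarmCount="0"/>
    <documentCache    class="solr.LRUCache" size="512" initialSize="512" autowarmCount="0"/>
    <!-- also check that no newSearcher/firstSearcher listeners fire warm-up queries -->
    <maxWarmingSearchers>2</maxWarmingSearchers>
    <useColdSearcher>true</useColdSearcher>
  </query>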
> >>>
> >>> 6) Change the Indexer frequency so it runs every 2 min (instead of all
> >>> the time). I noticed that once the commit is done, I'm able to run my
> >>> searches. During the commit and auto-warming period I just get a blank
> >>> page.
> >>>
> >>> 7) Changed from SolrJ to XML update - I still get the blank page
> >>> whenever an update/commit is happening.
> >>>
> >>> Apr 13, 2009 6:46:18 PM
> >>> org.apache.solr.update.processor.LogUpdateProcessor finish
> >>> INFO: {add=[621094001, 621094002, 621094003, 621094004, 621094005,
> >>> 621094006, 621094007, 621094008, ...(6992 more)]} 0 1948
> >>> Apr 13, 2009 6:46:18 PM org.apache.solr.core.SolrCore execute
> >>> INFO: [20090413_12] webapp=/solr path=/update params={} status=0 
> >>> QTime=1948
> >>>
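
Just to be sure we're comparing the same things - the two client paths you
mention look roughly like this in SolrJ 1.4 (the URL, core name, queue size
and thread count below are only placeholders):

  import org.apache.solr.client.solrj.SolrServer;
  import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
  import org.apache.solr.client.solrj.impl.StreamingUpdateSolrServer;
  import org.apache.solr.common.SolrInputDocument;

  public class UpdateClientSketch {
      public static void main(String[] args) throws Exception {
          // plain HTTP client: every add/commit is a separate request
          SolrServer plain =
              new CommonsHttpSolrServer("http://juum19.co.com:8080/solr/core0");

          // streaming client: buffers docs and sends them over a small pool
          // of open connections (here queueSize=20, threadCount=4)
          SolrServer streaming =
              new StreamingUpdateSolrServer("http://juum19.co.com:8080/solr/core0", 20, 4);

          SolrInputDocument doc = new SolrInputDocument();
          doc.addField("id", "621094001");
          streaming.add(doc);
          streaming.commit();
      }
  }

Either way the server does the same work at commit time, which would be
consistent with what you're seeing.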
> >>>
> >>> So, it looks like it's not just StreamingUpdateSolrServer - whenever an
> >>> update/commit is happening I'm not able to search. I don't know if it's
> >>> related to using multi-core. In this test I was using only a single
> >>> thread updating a single core on a single Solr instance.
> >>>
> >>> So, it's clearly related to the indexing process (update, commit and
> >>> auto-warming). As soon as the update/commit/auto-warming is completed
> >>> I'm able to run my queries again. Is there anything that could block
> >>> searching while the update process is in progress - a lock or
> >>> something?
> >>>
> >>> Any other ideas?
> >>>
> >>> Thanks,
> >>> -vivek
> >>>
> >>> On Mon, Apr 13, 2009 at 12:14 AM, Shalin Shekhar Mangar
> >>> wrote:
> >>>> On Mon, Apr 13, 2009 at 12:36 PM, vivek sar wrote:
> >>>>
> >>>>> I index in 10K batches and commit after 5 index cycles (after 50K). Is
> >>>>> there any limitation that prevents searching during commit or
> >>>>> auto-warming? I have 8 CPU cores and only 2 were showing busy (using
> >>>>> top) - so it's unlikely that the CPU was pegged.
> >>>>>
> >>>>>
> >>>> No, there is no such limitation. The old searcher will continue to serve
> >>>> search requests until the new one is warmed and registered.
> >>>>
> >>>> So, CPU does not seem to be an issue. Does this happen only when you use
> >>>> StreamingUpdateSolrServer? Which OS, file system? What JVM parameters are
> >>>> you using? Which servlet container and version?
> >>>>
> >>>> --
> >>>> Regards,
> >>>> Shalin Shekhar Mangar.
> >>>>
> >>>
> >>
> >
> >
> >
> > --
> > --Noble Paul
> >
