Re: Solr on netty

2012-02-22 Thread prasenjit mukherjee
Yonik Seeley wrote: > On Wed, Feb 22, 2012 at 9:27 AM, prasenjit mukherjee wrote: >> Is anybody aware of any effort to port Solr to Netty (or any other async-IO based framework)? >> >> Even on medium load (10 parallel clients) wi

Solr on netty

2012-02-22 Thread prasenjit mukherjee
Is anybody aware of any effort to port Solr to Netty (or any other async-IO based framework)? Even on medium load (10 parallel clients) with 16 shards, performance seems to deteriorate quite sharply compared to another (async-IO based) alternative solution as load in
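For reference on the sharded setup described in this thread, here is a minimal SolrJ sketch of a distributed query using the standard shards parameter. The host names, the query string, and the use of an HttpSolrServer-style client are assumptions for illustration (only a few of the 16 shards are listed).

    import org.apache.solr.client.solrj.SolrQuery;
    import org.apache.solr.client.solrj.SolrServer;
    import org.apache.solr.client.solrj.impl.HttpSolrServer;
    import org.apache.solr.client.solrj.response.QueryResponse;

    public class ShardedQueryExample {
        public static void main(String[] args) throws Exception {
            // The query is sent to one node, which fans it out to every shard
            // listed in the "shards" parameter and merges the results.
            SolrServer solr = new HttpSolrServer("http://shard1:8983/solr");

            SolrQuery q = new SolrQuery("some query terms");
            // Hypothetical shard list; the real deployment would list all 16.
            q.setParam("shards",
                "shard1:8983/solr,shard2:8983/solr,shard3:8983/solr,shard4:8983/solr");
            q.setRows(10);

            QueryResponse rsp = solr.query(q);
            System.out.println("Found " + rsp.getResults().getNumFound() + " docs");
        }
    }

Each client request therefore occupies a thread on the aggregating node while it waits on all shards, which is where a blocking-IO container is felt first under concurrent load.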

Re: effect of continuous deletes on index's read performance

2012-02-06 Thread prasenjit mukherjee
> commit is probably hurting you far more than any index size savings. I'd actually think carefully about whether you need even 10-second commits. If you can stretch that out to minutes, so much the better. But it all depends upon your problem space. Be

Re: effect of continuous deletes on index's read performance

2012-02-05 Thread prasenjit mukherjee
> se commitWithin to limit commit's "performance damage". Otis - Performance Monitoring SaaS for Solr - http://sematext.com/spm/solr-performance-monitoring/index.html >> From: p
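A minimal SolrJ sketch of the commitWithin approach suggested here: instead of issuing an explicit commit after every add, each update carries a commitWithin deadline and Solr coalesces the commits itself. The client URL, field names, and the 10-second window are assumptions, and it assumes a SolrJ version that has the commitWithin overload on add.

    import org.apache.solr.client.solrj.SolrServer;
    import org.apache.solr.client.solrj.impl.HttpSolrServer;
    import org.apache.solr.common.SolrInputDocument;

    public class CommitWithinExample {
        public static void main(String[] args) throws Exception {
            SolrServer solr = new HttpSolrServer("http://localhost:8983/solr");

            SolrInputDocument doc = new SolrInputDocument();
            doc.addField("id", "doc-1");
            doc.addField("text", "example body");

            // No explicit commit: ask Solr to make the doc searchable within
            // 10 seconds. All adds arriving in that window share one commit.
            solr.add(doc, 10000);
        }
    }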

effect of continuous deletes on index's read performance

2012-02-05 Thread prasenjit mukherjee
I have a use case where documents are continuously added at 20 docs/sec (each add is also doing a commit) and docs are continuously deleted at the same rate, so the searchable index size stays roughly constant: ~400K docs (docs for the last 6 hours, ~20*3600*6). Will it have pauses when deletes
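For concreteness, a minimal SolrJ sketch of the indexing pattern described in this post, where every add (and delete) is followed by its own commit; the client URL, field names, and document ids are assumptions. This is the part the replies focus on, since ~20 commits/sec is far more expensive than the adds themselves.

    import org.apache.solr.client.solrj.SolrServer;
    import org.apache.solr.client.solrj.impl.HttpSolrServer;
    import org.apache.solr.common.SolrInputDocument;

    public class PerDocCommitPattern {
        public static void main(String[] args) throws Exception {
            SolrServer solr = new HttpSolrServer("http://localhost:8983/solr");

            // ~20 of these per second in the described workload.
            SolrInputDocument doc = new SolrInputDocument();
            doc.addField("id", "event-123");
            doc.addField("text", "example body");
            solr.add(doc);
            solr.commit();   // one commit per add

            // Deletes arrive at the same rate, keeping the index at
            // roughly 20 * 3600 * 6 = ~432K live docs.
            solr.deleteById("event-expired");
            solr.commit();
        }
    }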

Re: SolrReplication configuration with frequent deletes and updates

2012-02-01 Thread prasenjit mukherjee
Appreciate your reply. Have some more follow-up questions inline. On Thu, Feb 2, 2012 at 12:35 AM, Emmanuel Espina wrote: >> 1. Adds: 20 docs/sec >> 2. Searches: 100 searches/sec >> 3. Deletes: (20*3600*24*7 ~ 12 million) docs/week (basically a cron job which deletes all documents more than

SolrReplication configuration with frequent deletes and updates

2012-02-01 Thread prasenjit mukherjee
I have the following requirements: 1. Adds: 20 docs/sec 2. Searches: 100 searches/sec 3. Deletes: (20*3600*24*7 ~ 12 million) docs/week (basically a cron job which deletes all documents more than 7 days old). I am thinking of having 6 shards (with each having 2 million docs) with 1 master an
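For the weekly cleanup, a minimal SolrJ sketch of the kind of cron-driven delete described above, using Solr date math. The master URL and the date field name "timestamp" are assumptions; the actual field depends on the schema.

    import org.apache.solr.client.solrj.SolrServer;
    import org.apache.solr.client.solrj.impl.HttpSolrServer;

    public class PurgeOldDocs {
        public static void main(String[] args) throws Exception {
            // Run against the master; slaves pick up the change on their
            // next replication poll.
            SolrServer solr = new HttpSolrServer("http://master:8983/solr");

            // Delete everything older than 7 days.
            solr.deleteByQuery("timestamp:[* TO NOW-7DAYS]");

            // One explicit commit after the bulk delete, not one per document.
            solr.commit();
        }
    }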