Re: Aggregated indexing of updating RSS feeds

2011-11-17 Thread sbarriba
Thanks Chris. (Bell rings) The 'params' logging pointer was what I needed. So for reference, it's not a good idea to use a 'wget' command directly in a crontab. I was using: wget http://localhost/solr/myfeed?command=full-import&rows=5000&clean=false ...but moving this into a separate shell script
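
For reference, a minimal sketch of that fix (the script name and wget flags are illustrative, not from the original post): the URL is quoted inside a wrapper script, so the shell no longer treats each '&' as a command separator and all three parameters reach Solr.

    #!/bin/sh
    # myfeed-import.sh (hypothetical name)
    # Quoting the URL keeps '&' from being interpreted by the shell,
    # so command, rows and clean all reach the request handler.
    wget -q -O /dev/null "http://localhost/solr/myfeed?command=full-import&rows=5000&clean=false"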

Re: Aggregated indexing of updating RSS feeds

2011-11-16 Thread sbarriba
All, Can anyone advise how to stop the "deleteAll" event during a full import? As discussed above, using clean=false with Solr 3.4 still seems to trigger a delete of all previously imported data. I want to aggregate the results of multiple imports. Thanks in advance. S
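
When testing this by hand, one way to rule out shell quoting as the culprit is to let curl assemble the query string itself (host and handler path are taken from the thread; this sketch is not part of the original message):

    # -G turns the --data-urlencode pairs into a GET query string,
    # so no manual '&' quoting is needed.
    curl -G "http://localhost/solr/myfeed" \
         --data-urlencode "command=full-import" \
         --data-urlencode "clean=false" \
         --data-urlencode "rows=5000"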

Re: Aggregated indexing of updating RSS feeds

2011-11-09 Thread sbarriba
All, Can anyone advise how to stop the "deleteAll" event during a full import? I'm still unable to determine why repeated full imports seem to delete the previously indexed documents. After investigation, the logs confirm this - see "REMOVING ALL DOCUMENTS FROM INDEX" below. ..but the request I'm making is.. /solr/myfeed
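
A quick way to check both symptoms in the Solr log (the log path is illustrative; it depends on the servlet container and logging setup):

    # Did the clean/deleteAll actually run?
    grep "REMOVING ALL DOCUMENTS FROM INDEX" /path/to/solr.log

    # What parameters did Solr actually receive? If the crontab shell
    # swallowed everything after the first '&', the logged params={...}
    # entry will show only the first parameter.
    grep "params=" /path/to/solr.log | tail -n 20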

Re: Aggregated indexing of updating RSS feeds

2011-11-08 Thread sbarriba
Hi Hoss, Thanks for the quick response. RE point 1) I'd mistyped (sorry) the incremental URL I'm using for updates. Essentially every 5 minutes the system is making an HTTP call to... http://localhost/solr/myfeed?clean=false&command=full-import&rows=5000 ..which, when accessed, returns the following
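
For context, the 5-minute schedule in a crontab would look something like the sketch below (the script path is hypothetical; it refers to the wrapper-script approach described in the 2011-11-17 reply above, which keeps the '&'-separated URL safely quoted):

    # m  h  dom mon dow   command
    */5 *  *   *   *      /usr/local/bin/myfeed-import.sh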

Re: Aggregated indexing of updating RSS feeds

2011-11-07 Thread sbarriba
Thanks Nagendra, I'll take a look. So a question for you et al: will Solr in its default installation ALWAYS delete content for an entity prior to doing a full import? Can you not simply build up an index incrementally from multiple imports (from XML)? I read elsewhere that the 'clean' parameter
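
For reference, the DataImportHandler 'clean' parameter defaults to true for a full-import, which is what removes the entity's existing documents first; passing clean=false is intended to keep them. A small sketch using the handler path quoted in the thread (URLs quoted so the '&' survives the shell):

    # full-import without clearing previously indexed documents
    curl "http://localhost/solr/myfeed?command=full-import&clean=false&rows=5000"

    # DIH status: reports counters for the last run (documents processed/deleted)
    curl "http://localhost/solr/myfeed?command=status"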