Thanks Chris.
The 'params' logging pointer was what I needed. So, for reference: it's not a
good idea to use a 'wget' command directly in a crontab.
I was using:
wget http://localhost/solr/myfeed?command=full-import&rows=5000&clean=false
...but moving this into a separate shell script resolved it. With the URL
unquoted, the shell treats each '&' as a command separator, so everything
after the first one never reaches Solr - which is why clean=false appeared
to be ignored.
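For anyone who hits the same thing, a minimal wrapper sketch (the script name
is mine; the URL matches the one above). The only important detail is quoting
the URL so the shell leaves the '&' characters alone:

#!/bin/sh
# full-import.sh - trigger a Solr DIH full import without losing parameters.
# The quotes stop the shell from treating each '&' as a command separator.
wget -q -O /dev/null 'http://localhost/solr/myfeed?command=full-import&rows=5000&clean=false'

Cron then runs this script instead of the raw wget command.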
All,
Can anyone advise how to stop the "deleteAll" event during a full import?
As discussed above using clean=false with Solr 3.4 still seems to trigger a
delete of all previous imported data. I want to aggregate the results of
multiple imports.
Thanks in advance.
S
All,
Can anyone advise how to stop the "deleteAll" event during a full import?
I'm still unable to determine why repeated full imports seem to delete the
previously indexed data. After investigation, the logs confirm this - see
"REMOVING ALL DOCUMENTS FROM INDEX" below.
...but the request I'm making is:
/solr/myfeed?command=full-import&rows=5000&clean=false
Hi Hoss,
Thanks for the quick response.
RE point 1) I'd mistyped (sorry) the incremental URL I'm using for updates.
Essentially, every 5 minutes the system makes an HTTP call to:
http://localhost/solr/myfeed?clean=false&command=full-import&rows=5000
...which, when accessed, returns the following...
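For completeness, the 5-minute schedule is just a crontab entry invoking the
wrapper script from my earlier message (the path here is illustrative):

# m h dom mon dow  command
*/5 * * * * /usr/local/bin/full-import.sh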
Thanks Nagendra, I'll take a look.
So, a question for you et al.: will Solr in its default installation ALWAYS
delete an entity's content prior to doing a full import?
Can you not simply build up an index incrementally from multiple imports
(from XML)? I read elsewhere that the 'clean' parameter defaults to true for
a full import.
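A sketch for later readers, worth verifying against the DIH docs for your
Solr version: since clean defaults to true for full-import, it has to be
passed explicitly on every request, and the quoting matters as discussed
above:

# Ask DIH to append to the existing index instead of wiping it first.
wget -q -O - 'http://localhost/solr/myfeed?command=full-import&clean=false&rows=5000'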