Well, have you distributed both your script and the configs? And what does "doesn't work" mean?
If you've changed your configs, pushed them to ZooKeeper, and reloaded your collection (by restarting the Solr nodes, issuing the Collections API RELOAD command, or the like), then all the nodes for any collection that uses that configset should be fine. More details are necessary to say anything more, I'm afraid.

Best,
Erick

On Wed, Nov 23, 2016 at 5:06 AM, Lambrou, Ioannis <ioannis.lamb...@ncr.com> wrote:
> Hello all,
>
> I am using Solr 5.5 and I am trying to use
> StatelessScriptUpdateProcessorFactory to append the data I am adding to Solr
> to existing documents. This works fine when I use one shard, but it
> doesn't work on multiple shards.
>
> Is there any way to use it with multiple shards?
>
> This is updateProcessor.js:
>
> function processAdd(cmd) {
>   logger.error(" >> processAdd");
>   var name = "";
>   var startDate = "";
>   var dataCurr = "";
>   var doc = cmd.solrDoc;
>   var previousDoc = null;
>   var previousData = "";
>   var nameDoc = doc.getField("name");
>   if (nameDoc != null) {
>     name = doc.getFieldValue("name").toString();
>     startDate = doc.getFieldValue("start_date_s").toString();
>     dataCurr = doc.getFieldValue("data_ws").toString();
>   }
>   try {
>     var Term = Java.type("org.apache.lucene.index.Term");
>     var previousDocId = req.getSearcher().getFirstMatch(
>         new Term("id", name + "_" + startDate));
>     previousDoc = req.getSearcher().doc(previousDocId);
>     previousData = previousDoc.getField("data_ws").stringValue();
>     doc.setField("data_ws", previousData + dataCurr);
>   } catch (err) {
>     logger.error("error in update processor " + err);
>   }
>   doc.setField("id", name + "_" + startDate);
> }
>
> function processDelete(cmd) {
>   // no-op
> }
>
> function processMergeIndexes(cmd) {
>   // no-op
> }
>
> function processCommit(cmd) {
>   // no-op
> }
>
> function processRollback(cmd) {
>   // no-op
> }
>
> function finish() {
>   // no-op
> }
>
> Thanks and Regards,
> Giannis
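For what it's worth, the "push configs and reload" step described above usually looks something like the following on a 5.x SolrCloud install. The ZooKeeper host, config directory, configset name, and collection name here are placeholders; substitute your own.

```shell
# Upload the changed configset (including updateProcessor.js) to ZooKeeper.
# zkcli.sh ships with Solr under server/scripts/cloud-scripts/.
server/scripts/cloud-scripts/zkcli.sh -zkhost localhost:2181 \
  -cmd upconfig -confdir /path/to/myconfig/conf -confname myconfig

# Reload every replica of the collection so the new script takes effect.
curl "http://localhost:8983/solr/admin/collections?action=RELOAD&name=mycollection"
```

If the RELOAD succeeds, every core backing the collection picks up the new configset; if only some nodes behave as expected after this, that detail would help narrow things down.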