Thanks for the advice; I followed all the pointers you mentioned:

curl 'http://127.0.0.1:8080/solr/newsRSS_DIH?command=full-import&clean=false&commit=true&url=http://www.example.com/news'


Now, I got the following error:

Jan 21, 2013 10:53:02 AM org.apache.solr.core.SolrDeletionPolicy updateCommits
INFO: newest commit = 95
Jan 21, 2013 10:53:02 AM org.apache.solr.handler.dataimport.SolrWriter commit
SEVERE: Exception while solr commit.
org.apache.solr.common.SolrException: Error opening new searcher
        at org.apache.solr.core.SolrCore.openNewSearcher(SolrCore.java:1310)
        at org.apache.solr.core.SolrCore.getSearcher(SolrCore.java:1422)
        at org.apache.solr.core.SolrCore.getSearcher(SolrCore.java:1200)
        at org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:560)
        at org.apache.solr.update.processor.RunUpdateProcessor.processCommit(RunUpdateProcessorFactory.java:87)
        at org.apache.solr.update.processor.UpdateRequestProcessor.processCommit(UpdateRequestProcessor.java:64)
        at org.apache.solr.update.processor.DistributedUpdateProcessor.processCommit(DistributedUpdateProcessor.java:1007)
        at org.apache.solr.update.processor.LogUpdateProcessor.processCommit(LogUpdateProcessorFactory.java:157)
        at org.apache.solr.handler.dataimport.SolrWriter.commit(SolrWriter.java:107)
        at org.apache.solr.handler.dataimport.DocBuilder.finish(DocBuilder.java:304)
        at org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:252)
        at org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:382)
        at org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:448)
        at org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:429)
Caused by: org.apache.lucene.store.AlreadyClosedException: this IndexWriter is closed


2nd question:
I am also planning to put each of these curl commands into its own .sh file and invoke them through cron jobs for scheduling. Any idea how to handle errors/exceptions in the .sh files, since some of the imports may fail with errors like the one above? And is this the right approach for a critical scheduled process that indexes our business data? A rough sketch of what I have in mind follows below.
Appreciate all your advice.
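
This is roughly the per-feed wrapper script I have in mind (the core name, feed URL and log path are placeholders for our real setup, and I am assuming the DIH status response can be grepped for "busy" while an import runs and for the "Indexing failed" rollback message afterwards; please correct me if that is not reliable):

#!/bin/sh
# Rough sketch of one per-feed import script, to be run from cron.
# CORE_URL, FEED_URL and LOG are placeholders for our real values.
CORE_URL="http://127.0.0.1:8080/solr/newsRSS_DIH"
FEED_URL="http://www.example.com/news"
LOG="/var/log/solr/newsRSS_import.log"

# Kick off the import.  -s hides the progress bar, -f makes curl exit
# non-zero on HTTP 4xx/5xx so a failed request can be detected.
# (A feed URL containing '&' or spaces would need URL-encoding first.)
if ! curl -sf "$CORE_URL?command=full-import&clean=false&commit=true&url=$FEED_URL" >>"$LOG" 2>&1
then
    echo "$(date) full-import request failed for $FEED_URL" | tee -a "$LOG" >&2
    exit 1
fi

# full-import returns immediately, so wait until the handler is idle again
# (assuming the status response contains "busy" while an import is running).
while curl -s "$CORE_URL?command=status" | grep -q busy
do
    sleep 30
done

# Check the final status for the rollback message DIH reports on failure.
STATUS=$(curl -s "$CORE_URL?command=status")
if echo "$STATUS" | grep -q "Indexing failed"
then
    echo "$(date) import failed for $FEED_URL, see $LOG" >&2
    echo "$STATUS" >>"$LOG"
    exit 1
fi

echo "$(date) import completed for $FEED_URL" >>"$LOG"
exit 0

If MAILTO is set in the crontab, cron should mail whatever the script writes to stderr on failure, which is how I was hoping to get notified of problems.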




