Hello,

I have noticed that when I run concurrent full-imports using DIH in Solr
1.4, the index ends up corrupted. I see the following in the log files
(a snippet):


<record>
  <date>2010-02-03T17:54:24</date>
  <millis>1265248464553</millis>
  <sequence>764</sequence>
  <logger>org.apache.solr.handler.dataimport.SolrWriter</logger>
  <level>SEVERE</level>
  <class>org.apache.solr.handler.dataimport.SolrWriter</class>
  <method>commit</method>
  <thread>25</thread>
  <message>Exception while solr commit.</message>
  <exception>
    <message>java.io.FileNotFoundException: /solrserver/apache-solr-1.3.0/example/multicore/RET/data/index/_5.cfs (No such file or directory)</message>
    <frame>
      <class>java.io.RandomAccessFile</class>
      <method>open</method>
    </frame>
    <frame>
      <class>java.io.RandomAccessFile</class>
      <method>&lt;init&gt;</method>
      <line>212</line>
    </frame>
    <frame>
      <class>org.apache.lucene.store.FSDirectory$FSIndexInput$Descriptor</class>
      <method>&lt;init&gt;</method>
      <line>552</line>
    </frame>
    <frame>
      <class>org.apache.lucene.store.FSDirectory$FSIndexInput</class>
      <method>&lt;init&gt;</method>
      <line>582</line>
    </frame>
    <frame>
      <class>org.apache.lucene.store.FSDirectory</class>
      <method>openInput</method>
      <line>488</line>
    </frame>
    <frame>
      <class>org.apache.lucene.index.CompoundFileReader</class>
      <method>&lt;init&gt;</method>
      <line>70</line>
    </frame>
    <frame>
      <class>org.apache.lucene.index.SegmentReader</class>
      <method>initialize</method>
      <line>319</line>
    </frame>
    <frame>
      <class>org.apache.lucene.index.SegmentReader</class>
      <method>get</method>
      <line>304</line>
    </frame>
    <frame>
      <class>org.apache.lucene.index.SegmentReader</class>
      <method>get</method>
      <line>234</line>
    </frame>
    <frame>
      <class>org.apache.solr.handler.dataimport.DataImporter$1</class>
      <method>run</method>
      <line>377</line>
    </frame>
  </exception>
</record>


Could this be because the concurrent full-imports are stepping on each
other's toes? It looks as though one full-import request ends up deleting
segment files that another import is still reading.

Is there a way to avoid this? Perhaps a config option? I would like to
retain the flexibility to issue concurrent full-import requests.
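For now I am considering serializing the requests client-side by polling the DIH status endpoint and only triggering a full-import when the handler reports idle. A rough sketch of what I mean (the host, port, core name "RET", and poll interval are just assumptions for illustration, not something from my actual setup):

```python
# Client-side guard: trigger a full-import only when the DataImportHandler
# reports "idle", so two imports never run against the core at once.
# The URL below (host, port, core name "RET") is an assumed example.
import time
import urllib.request
import xml.etree.ElementTree as ET

DIH_URL = "http://localhost:8983/solr/RET/dataimport"

def dih_status(xml_text):
    """Pull the <str name="status"> value ("idle" or "busy")
    out of the handler's XML status response."""
    root = ET.fromstring(xml_text)
    for node in root.iter("str"):
        if node.get("name") == "status":
            return node.text
    return None

def full_import_when_idle():
    # Poll until the handler is idle, then fire the import once.
    while True:
        with urllib.request.urlopen(DIH_URL) as resp:
            if dih_status(resp.read()) == "idle":
                break
        time.sleep(5)
    urllib.request.urlopen(DIH_URL + "?command=full-import")
```

Of course this only works if every client goes through the same guard, which is why I would prefer a server-side config option if one exists.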

I found some discussion of this issue at:
http://old.nabble.com/FileNotFoundException-on-index-td25717530.html

However, from this thread:
http://old.nabble.com/dataimporthandler-and-multiple-delta-import-td19160129.html

I was under the impression that the issue had been fixed in Solr 1.4.

Kindly advise.

Ranjit.
-- 
View this message in context: 
http://old.nabble.com/Solr-1.4%3A-Full-import-FileNotFoundException-tp27446982p27446982.html
Sent from the Solr - User mailing list archive at Nabble.com.