I am using the embedded Solr API for my indexing process. I created a brand new
index with my application without any problem. I then ran my indexer in
incremental mode. This process copies the working index to a temporary Solr
location, adds/updates any records, optimizes the index, and then copies it
back to the working location. No instance of Solr is currently reading this index, and I
commit after every 100,000 rows. The schema.xml and solrconfig.xml files have not
changed.
Here is the method that performs the optimize:
protected void optimizeProducts() throws IOException {
    // Ask the core's update handler for an optimizing commit
    UpdateHandler updateHandler = m_SolrCore.getUpdateHandler();
    CommitUpdateCommand commitCmd = new CommitUpdateCommand(true);
    commitCmd.optimize = true;
    updateHandler.commit(commitCmd);
    log.info("Optimized index");
}
So, during the optimize phase, I get the following stack trace:
java.io.IOException: read past EOF
    at org.apache.lucene.store.BufferedIndexInput.refill(BufferedIndexInput.java:89)
    at org.apache.lucene.store.BufferedIndexInput.readByte(BufferedIndexInput.java:34)
    at org.apache.lucene.store.IndexInput.readChars(IndexInput.java:107)
    at org.apache.lucene.store.IndexInput.readString(IndexInput.java:93)
    at org.apache.lucene.index.FieldsReader.addFieldForMerge(FieldsReader.java:211)
    at org.apache.lucene.index.FieldsReader.doc(FieldsReader.java:119)
    at org.apache.lucene.index.SegmentReader.document(SegmentReader.java:323)
    at org.apache.lucene.index.SegmentMerger.mergeFields(SegmentMerger.java:206)
    at org.apache.lucene.index.SegmentMerger.merge(SegmentMerger.java:96)
    at org.apache.lucene.index.IndexWriter.mergeSegments(IndexWriter.java:1835)
    at org.apache.lucene.index.IndexWriter.optimize(IndexWriter.java:1195)
    at org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:508)
    at ...
There are no exceptions or other signs of a problem during the adds or commits. After
this failure, the index files remain unoptimized.
I know there is not a whole lot to go on here. Is there anything in particular I
should look at?