Okay, I changed the "lockType" to "single", but it had no effect.
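For reference, I think each core needs its own data directory so the two writers don't touch the same index files. A sketch of what I mean (the paths and core names here are just examples, not my actual config):

```xml
<!-- solrconfig.xml of the first core: point dataDir at a core-specific folder -->
<dataDir>/var/lib/tomcat5.5/temp/solr/core0/data</dataDir>

<!-- and set the lock type in the same file, under indexDefaults -->
<indexDefaults>
  <lockType>single</lockType>
</indexDefaults>
```

The second core's solrconfig.xml would get a different dataDir (e.g. .../core1/data). If both cores fall back to the same default data folder, two simultaneous imports would overwrite each other's segment files.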
So now I think that my two DIHs are using the same data folder. Why is that? I thought each DIH used its own index. Or is it simply not possible to import from one table in parallel with more than one DIH?

My exception:

java.io.FileNotFoundException: /var/lib/tomcat5.5/temp/solr/data/index/_5d.fnm (No such file or directory)
        at java.io.RandomAccessFile.open(Native Method)
        at java.io.RandomAccessFile.<init>(RandomAccessFile.java:212)
        at org.apache.lucene.store.SimpleFSDirectory$SimpleFSIndexInput$Descriptor.<init>(SimpleFSDirectory.java:78)
        at org.apache.lucene.store.SimpleFSDirectory$SimpleFSIndexInput.<init>(SimpleFSDirectory.java:108)
        at org.apache.lucene.store.NIOFSDirectory$NIOFSIndexInput.<init>(NIOFSDirectory.java:94)
        at org.apache.lucene.store.NIOFSDirectory.openInput(NIOFSDirectory.java:70)
        at org.apache.lucene.store.FSDirectory.openInput(FSDirectory.java:691)
        at org.apache.lucene.index.FieldInfos.<init>(FieldInfos.java:68)
        at org.apache.lucene.index.SegmentReader$CoreReaders.<init>(SegmentReader.java:116)
        at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:638)
        at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:608)
        at org.apache.lucene.index.IndexWriter$ReaderPool.get(IndexWriter.java:686)
        at org.apache.lucene.index.IndexWriter$ReaderPool.get(IndexWriter.java:662)
        at org.apache.lucene.index.DocumentsWriter.applyDeletes(DocumentsWriter.java:954)
        at org.apache.lucene.index.IndexWriter.applyDeletes(IndexWriter.java:5190)
        at org.apache.lucene.index.IndexWriter.doFlushInternal(IndexWriter.java:4354)
        at org.apache.lucene.index.IndexWriter.doFlush(IndexWriter.java:4192)
        at org.apache.lucene.index.IndexWriter.flush(IndexWriter.java:4183)
        at org.apache.lucene.index.IndexWriter.updateDocument(IndexWriter.java:2647)
        at org.apache.lucene.index.IndexWriter.updateDocument(IndexWriter.java:2601)
        at org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:241)
        at org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:61)
        at org.apache.solr.handler.dataimport.SolrWriter.upload(SolrWriter.java:75)
        at org.apache.solr.handler.dataimport.DataImportHandler$1.upload(DataImportHandler.java:292)
        at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:392)
        at org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:242)
        at org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:180)
        at org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:331)
        at org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:389)
        at org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:370)


Erik Hatcher-4 wrote:
>
> what's the error you're getting?
>
> is DIH keeping some static that prevents it from running across two
> cores separately?  if so, that'd be a bug.
>
>       Erik
>
> On Mar 3, 2010, at 4:12 AM, stocki wrote:
>
>>
>> pleeease help me somebody =( :P
>>
>>
>> stocki wrote:
>>>
>>> Hello again ;)
>>>
>>> I installed Tomcat 5.5 on my Debian server.
>>>
>>> I use two cores and two different DIHs with separate indexes: one for
>>> the normal search feature and the other core for the suggest feature.
>>>
>>> But I cannot start both DIHs with an import command at the same time.
>>> How is this possible?
>>>
>>> thx
>>>
>>
>> --
>> View this message in context:
>> http://old.nabble.com/2-Cores%2C-1-Table%2C-2-DataImporter---%3E-Import-at-the-same-time---tp27756255p27765825.html
>> Sent from the Solr - User mailing list archive at Nabble.com.
>>
>
>

--
View this message in context: http://old.nabble.com/SEVERE%3A-SolrIndexWriter-was-not-closed-prior-to-finalize%28%29%2C-indicates-a-bug----POSSIBLE-RESOURCE-LEAK%21%21%21-tp27756255p27768997.html
Sent from the Solr - User mailing list archive at Nabble.com.