Re: DataImport issue with large number of documents

2010-06-08 Thread Giri
Hi Glen, Thank you very much for the quick response. I would like to try increasing netTimeoutForStreamingResults; is that something I set on the MySQL side or on the Solr side? Giri On Tue, Jun 8, 2010 at 6:17 PM, Glen Newton wrote: > As the index gets larger, the underlying housek
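[Editor's note: netTimeoutForStreamingResults is a MySQL Connector/J (client-side) driver property, so it is set on the Solr side, in the JDBC URL that the DataImportHandler uses, not on the MySQL server. A minimal sketch of the dataSource entry in data-config.xml, assuming a hypothetical host, database, and credentials; the value is in seconds, and the Connector/J default is 600:

    <!-- data-config.xml: raise the streaming-results timeout to 1 hour -->
    <dataSource type="JdbcDataSource"
                driver="com.mysql.jdbc.Driver"
                url="jdbc:mysql://dbhost:3306/mydb?netTimeoutForStreamingResults=3600"
                user="solr"
                password="***"
                batchSize="-1"/>

The property only takes effect when the driver is in streaming mode, which is what batchSize="-1" triggers in DIH.]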

Re: DataImport issue with large number of documents

2010-06-08 Thread Glen Newton
As the index gets larger, the underlying housekeeping of the Lucene index sometimes causes pauses in the indexing. The JDBC connection (and/or the underlying socket) to the MySQL database can time out during these pauses. - If it is not already set, you should add this to your JDBC URL: autoReconnect=true
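[Editor's note: a sketch of what adding the autoReconnect flag to the DIH connection URL might look like; host, database, and credentials are placeholders:

    <dataSource type="JdbcDataSource"
                driver="com.mysql.jdbc.Driver"
                url="jdbc:mysql://dbhost:3306/mydb?autoReconnect=true"
                user="solr"
                password="***"/>

Note that autoReconnect only helps the driver re-establish a dropped connection; it does not prevent the server from closing an idle streaming socket in the first place, which is why the timeout property above is the complementary fix.]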

DataImport issue with large number of documents

2010-06-08 Thread Giri
Hi Group, I have been trying to index about 70 million records into the Solr index. The data comes from a MySQL database, and I am using the DataImportHandler with batchSize set to -1. When I perform a full-import, it indexes about 27 million records and then throws the following exception: Any help
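[Editor's note: for context, a minimal data-config.xml along the lines described; the table, column, and field names here are hypothetical. batchSize="-1" makes Connector/J stream rows one at a time (setFetchSize(Integer.MIN_VALUE)) rather than buffering the full result set in memory, which is essential for a 70-million-row import:

    <dataConfig>
      <dataSource type="JdbcDataSource"
                  driver="com.mysql.jdbc.Driver"
                  url="jdbc:mysql://dbhost:3306/mydb"
                  user="solr"
                  password="***"
                  batchSize="-1"/>
      <document>
        <entity name="record" query="SELECT id, title FROM records">
          <field column="id" name="id"/>
          <field column="title" name="title"/>
        </entity>
      </document>
    </dataConfig>

The trade-off is that a streaming result set holds its connection open for the entire import, which is exactly why long Lucene housekeeping pauses can trip the server-side timeout discussed in the replies above.]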