I'm using the add(MyObject) command from SolrNet in a foreach loop
to add my objects to the index.
In the Catalina log I cannot see anything that helps me out.
It stops at:
28.sep.2009 08:58:40
org.apache.solr.update.processor.LogUpdateProcessor finish
INFO: {add=[12345]} 0 187
28.sep.2009 08:58:40 org.apache.solr.core.SolrCore execute
INFO: [core2] webapp=/solr path=/update params={} status=0 QTime=187
Which indicates nothing wrong.
Are there any other logs that should be checked?
What it seems like to me at the moment is that the foreach is passing
objects (documents) to Solr faster than Solr can add them to the
index, as if I'm eventually running out of connections (to Solr?) or
something.
I'm running another incremental update with other objects where the
foreach isn't quite as fast. That job has added over 100k documents
without failing, and is still going, whereas the problematic job
fails after ~3k.
What I've learned through the day, though, is that the index where my
feed is failing is actually redundant.
I.e., I'm off the hook for now.
Still, I'd like to figure out what's going wrong.
Steinar
There's nothing in that output that indicates something we can help
with over in solr-user land. What is the call you're making to
Solr? Did Solr log anything anomalous?
Erik
On Sep 28, 2009, at 4:41 AM, Steinar Asbjørnsen wrote:
I just posted to the SolrNet group since I have the exact same(?)
problem.
Hope I'm not being rude posting here as well (since the SolrNet
group doesn't seem as active as this mailing list).
The problem occurs when I'm running an incremental feed (self-made)
of an index.
My post:
[snip]
What's happening is that I get this error message (in VS):
"A first chance exception of type
'SolrNet.Exceptions.SolrConnectionException' occurred in SolrNet.DLL"
And the web browser (which I use to start the feed) says:
"System.Data.SqlClient.SqlException: Timeout expired. The timeout
period elapsed prior to completion of the operation or the server is
not responding."
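Since that second exception is a SqlException rather than a Solr error, the timeout looks like it happens on the database read, not on the Solr add: if the feed keeps one SqlDataReader open while pushing documents to Solr, the reader can hit ADO.NET's default 30-second command timeout once the adds slow down. A minimal sketch of raising it (connection string and query are hypothetical stand-ins for the real feed's):

```csharp
using System.Data.SqlClient;

// Hypothetical connection string and query -- adjust to the real feed.
using (var conn = new SqlConnection("Server=remotehost;Database=feed;Integrated Security=true"))
using (var cmd = new SqlCommand("SELECT id, name FROM docs", conn))
{
    cmd.CommandTimeout = 300; // ADO.NET default is 30 s; raise it if the reader stays open long
    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // map the row to MyObject and hand it to SolrNet here
        }
    }
}
```

An alternative worth considering is reading all rows into memory (or paging them) first, so the reader is closed before the slow indexing loop starts.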
At the time of writing my index contains 15k docs, and "lacks" ~700k
docs that the incremental feed should take care of adding to the
index.
The error message appears after 3k docs are added, and before 4k
docs are added.
I'm committing whenever i % 1000 == 0.
In addition, autoCommit is set to:
<autoCommit>
<maxDocs>10000</maxDocs>
</autoCommit>
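For reference, the add/commit cadence described above can be sketched like this (a hedged sketch, not the actual feed code; "solr" and "docs" are assumed names, and Add/Commit are the SolrNet ISolrOperations calls):

```csharp
// Incremental feed loop: add each document, commit every 1000.
// "solr" is an ISolrOperations<MyObject> from SolrNet;
// "docs" is the sequence of objects read from SQL Server.
int i = 0;
foreach (MyObject doc in docs)
{
    solr.Add(doc);
    i++;
    if (i % 1000 == 0)
        solr.Commit(); // explicit commit; autoCommit maxDocs=10000 also applies
}
solr.Commit(); // final commit for the tail of the batch
```

If your SolrNet version has an overload that accepts a collection, buffering documents and adding them in batches would cut the number of HTTP round trips and give Solr fewer, larger updates to absorb.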
More info:
From schema.xml:
<field name="id" type="text" indexed="true" stored="true"
required="true" />
<field name="name" type="string" indexed="true" stored="true"
required="false" />
I'm fetching data from a (remote) Sql 2008 Server, using
sqljdbc4.jar.
And Solr is running on a local Tomcat-installation.
SolrNet version: 0.2.3.0
Solr Specification Version: 1.3.0.2009.08.29.08.05.39
[/snip]
Any suggestions on how to fix this would be much appreciated.
Regards,
Steinar