Hi,

> Hi Markus,
> 
> thanks for your answer.
> I'm using Solr 4.0 and Jetty now and will observe the behavior and my error
> logs next week.
> Tomcat can be a reason, we will see; I'll report back.
> 
> I'm indexing WITHOUT batches, one doc after another. But I would like to try
> batch indexing as well as retrying faulty docs.
> If you index one batch and one doc in the batch is corrupt, what happens to
> the other 249 docs (250/batch total)? Are they indexed and
> updated when you retry the batch, or does the complete batch fail?

The entire batch should fail, but I cannot confirm it. Usually all documents fail
if there is an error somewhere, such as an XML error.
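
Something along these lines could cover both the retry and your 249-docs
question. It's only a rough SolrJ sketch, untested; the retry count, the class
name and the per-document fallback are my own assumptions rather than anything
Solr prescribes:

import java.util.List;

import org.apache.solr.client.solrj.SolrServer;
import org.apache.solr.common.SolrInputDocument;

// Sketch: retry the whole batch a few times, then fall back to adding the
// documents one by one so a single bad document does not take the other
// 249 down with it.
public class BatchIndexer {

    private static final int MAX_RETRIES = 3; // assumed value, tune as needed

    private final SolrServer server; // e.g. a CommonsHttpSolrServer instance

    public BatchIndexer(SolrServer server) {
        this.server = server;
    }

    public void indexBatch(List<SolrInputDocument> batch) {
        for (int attempt = 1; attempt <= MAX_RETRIES; attempt++) {
            try {
                server.add(batch);  // whole batch in a single update request
                return;             // success
            } catch (Exception e) { // SolrServerException / IOException
                // transient errors (like the ProtocolException quoted below)
                // often go away when the same batch is simply sent again
            }
        }
        // batch keeps failing: isolate the offending document(s)
        for (SolrInputDocument doc : batch) {
            try {
                server.add(doc);
            } catch (Exception e) {
                // log and skip only this document
            }
        }
    }
}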

> 
> Regards
> Vadim
> 
> 
> 
> 
> 2011/8/11 Markus Jelsma <markus.jel...@openindex.io>
> 
> > Hi,
> > 
> > We see these errors too once in a while, but there is no real answer on the
> > mailing list here, except one user suspecting Tomcat is responsible
> > (connection timeouts).
> > 
> > Another user proposed limiting the number of documents per batch, but that,
> > of course, increases the number of connections made. We do only 250
> > docs/batch to limit RAM usage on the client and started to see these errors
> > very occasionally.
> > It may be a coincidence... or not.
> > 
> > Anyway, it's really hard to reproduce, if not impossible. It happens when
> > connecting directly as well as when connecting through a proxy.
> > 
> > What you can do is simply retry the batch, and it usually works out fine.
> > At least you don't lose a batch in the process. We retry all failures at
> > least a couple of times before giving up on an indexing job.
> > 
> > Cheers,
> > 
> > > Hello folks,
> > > 
> > > I use Solr 1.4.1, and every 2 to 6 hours I get indexing errors in my
> > > log files.
> > > 
> > > on the client side:
> > > 2011-08-04 12:01:18,966 ERROR [Worker-242] IndexServiceImpl - Indexing
> > > failed with SolrServerException.
> > > Details: org.apache.commons.httpclient.ProtocolException: Unbuffered
> > > entity enclosing request can not be repeated.:
> > > Stacktrace:
> > > org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:469) .
> > > .
> > > on the server side:
> > > INFO: [] webapp=/solr path=/update params={wt=javabin&version=1}
> > > status=0 QTime=3
> > > 04.08.2011 12:01:18 org.apache.solr.update.processor.LogUpdateProcessor
> > > finish
> > > INFO: {} 0 0
> > > 04.08.2011 12:01:18 org.apache.solr.common.SolrException log
> > > SCHWERWIEGEND: org.apache.solr.common.SolrException:
> > > java.io.IOException: Invalid chunk header
> > > .
> > > .
> > > .
> > > I'm indexing ONE document per call, 15-20 documents per second, 24/7.
> > > What may be the problem?
> > > 
> > > Best regards
> > > Vadim
