You could be right.  Going back through the logs, I noticed it used to happen 
less frequently and always towards the end of an optimize operation.  It is 
probably my indexer timing out while waiting for updates to complete during 
optimizes.  The errors grew recently because I upped the indexer thread count 
to 22, so many more timeouts are occurring now.  Our index has also doubled in 
size, so the optimize operation takes a lot longer, which contributes to what 
I'm seeing as well.  I have just changed my optimize frequency from three times 
a day to once a day after reading the following:

Here they are talking about completely deprecating the optimize command in the 
next version of Solr:
https://issues.apache.org/jira/browse/SOLR-3141
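
As a side note, the other thing I'm looking at is raising the client-side read 
timeout so the indexer can ride out a long optimize instead of giving up.  A 
minimal sketch, assuming the indexer talks to the master through SolrJ's 
HttpSolrServer (the URL and timeout values below are just placeholders, not 
our actual settings):

    import org.apache.solr.client.solrj.impl.HttpSolrServer;
    import org.apache.solr.common.SolrInputDocument;

    public class IndexerClient {
        public static void main(String[] args) throws Exception {
            // Point at the master where the indexing happens (placeholder URL).
            HttpSolrServer solr = new HttpSolrServer("http://master:8080/solr");

            // Give slow responses room while an optimize is running:
            // read (socket) timeout and connect timeout, in milliseconds.
            solr.setSoTimeout(300000);        // e.g. 5 minutes
            solr.setConnectionTimeout(15000);

            SolrInputDocument doc = new SolrInputDocument();
            doc.addField("id", "example-1");
            solr.add(doc);
            solr.commit();
        }
    }

Whether bumping the timeout or simply optimizing less often is the better fix 
probably depends on how long the optimize really runs.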


-----Original Message-----
From: Alexandre Rafalovitch [mailto:arafa...@gmail.com] 
Sent: Wednesday, October 10, 2012 11:10 AM
To: solr-user@lucene.apache.org
Subject: Re: anyone have any clues about this exception

Something timed out, the other end closed the connection, this end tried to 
write to the closed pipe and died, and then something tried to catch that 
exception, write its own error, and died even worse? Just guessing really, but 
it sounds plausible (plus a 3-year Java tech-support hunch).
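
For what it's worth, that reading fits the top frame of the trace: Tomcat's 
ResponseFacade.sendError() throws IllegalStateException when the response has 
already been committed.  A rough sketch of the general pattern using the plain 
Servlet API (not Solr's actual code, just an illustration):

    import java.io.IOException;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class SendErrorDemo extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            resp.getWriter().write("partial output");
            resp.flushBuffer();  // the response is committed from here on

            // A filter that later wants to report an error has to check first;
            // sendError() on a committed response throws IllegalStateException.
            if (!resp.isCommitted()) {
                resp.sendError(HttpServletResponse.SC_INTERNAL_SERVER_ERROR,
                        "error while writing the response");
            }
            // Skipping that check and calling sendError() anyway is what
            // produces the IllegalStateException seen in the localhost log.
        }
    }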

If it happens often enough, see if you can run WireShark on that machine's 
network interface and capture the whole network conversation in action.  Often 
there are enough clues in the TCP packets and/or the payloads being 
transmitted.  WireShark is a power tool, so it takes a little while the first 
time, but the learning will pay for itself over and over again.

Regards,
   Alex.

Personal blog: http://blog.outerthoughts.com/
LinkedIn: http://www.linkedin.com/in/alexandrerafalovitch
- Time is the quality of nature that keeps events from happening all at once. 
Lately, it doesn't seem to be working. (Anonymous - via GTD book)


On Wed, Oct 10, 2012 at 11:31 PM, Petersen, Robert <rober...@buy.com> wrote:
> The Tomcat localhost log (not the catalina log) for my Solr 3.6.1 (master) 
> instance contains lots of these exceptions, but Solr itself seems to be doing 
> fine... any ideas?  I'm not seeing these exceptions logged on my slave 
> servers, btw, just on the master, where we do all of our indexing.
>
>
>
> Oct 9, 2012 5:34:11 PM org.apache.catalina.core.StandardWrapperValve invoke
> SEVERE: Servlet.service() for servlet default threw exception
> java.lang.IllegalStateException
>         at org.apache.catalina.connector.ResponseFacade.sendError(ResponseFacade.java:407)
>         at org.apache.solr.servlet.SolrDispatchFilter.sendError(SolrDispatchFilter.java:389)
>         at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:291)
>         at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
>         at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
>         at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
>         at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
>         at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:128)
>         at com.googlecode.psiprobe.Tomcat60AgentValve.invoke(Tomcat60AgentValve.java:30)
>         at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
>         at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
>         at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
>         at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:849)
>         at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:583)
>         at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:454)
>         at java.lang.Thread.run(Unknown Source)
