If the exception can't be reproduced each time for the same query, then I
think it does point to some intermittent network issue, possibly timeout
related.

Let's open a ticket for this and investigate what might be the cause.
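As an aside, the read loop quoted below never checks tuple.EOF, so it keeps reading past the logical end of the stream and never closes it. A minimal sketch of the usual pattern (assuming solrj's streaming API; the drain method name and class are just for illustration):

```java
import java.io.IOException;

import org.apache.solr.client.solrj.io.Tuple;
import org.apache.solr.client.solrj.io.stream.CloudSolrStream;

public class StreamReadLoop {
    // Drain a CloudSolrStream: terminate on the EOF tuple and close the
    // stream in finally so the underlying HTTP connections are released.
    static void drain(CloudSolrStream cstream) throws IOException {
        try {
            cstream.open();
            while (true) {
                Tuple tuple = cstream.read();
                if (tuple.EOF) {
                    break; // end-of-stream marker reached cleanly
                }
                // ... process tuple ...
            }
        } finally {
            cstream.close();
        }
    }
}
```

This won't make a mid-stream network truncation go away, but it rules out the loop itself reading past EOF as a contributing factor.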

Joel Bernstein
http://joelsolr.blogspot.com/

On Thu, May 25, 2017 at 9:28 AM, Yago Riveiro <yago.rive...@gmail.com>
wrote:

> No, this has happened since 6.3.0 (when I started using CloudSolrStream);
> I'm now on 6.5.1 code.
>
> Normally this happens with streams of more than 4M documents.
>
> Could it be network related? Is there any TTL at the connection level in
> CloudSolrStream?
>
> --
>
> /Yago Riveiro
>
> On 25 May 2017 13:14 +0100, Joel Bernstein <joels...@gmail.com>, wrote:
> > I've never seen this error. Is this something you just started seeing
> > recently?
> >
> >
> >
> >
> > Joel Bernstein
> > http://joelsolr.blogspot.com/
> >
> > On Thu, May 25, 2017 at 7:10 AM, Yago Riveiro <yago.rive...@gmail.com
> > wrote:
> >
> > > I have a process that uses the CloudSolrStream to run a streaming
> > > expression
> > > and I can see this exception frequently:
> > >
> > > Caused by: org.apache.http.TruncatedChunkException: Truncated chunk (expected size: 32768; actual size: 1100)
> > >     at org.apache.http.impl.io.ChunkedInputStream.read(ChunkedInputStream.java:200) ~[supernova-2.4.0.jar:?]
> > >     at org.apache.http.conn.EofSensorInputStream.read(EofSensorInputStream.java:137) ~[supernova-2.4.0.jar:?]
> > >     at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:284) ~[?:1.8.0_121]
> > >     at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:326) ~[?:1.8.0_121]
> > >     at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:178) ~[?:1.8.0_121]
> > >     at java.io.InputStreamReader.read(InputStreamReader.java:184) ~[?:1.8.0_121]
> > >     at org.noggit.JSONParser.fill(JSONParser.java:196) ~[supernova-2.4.0.jar:?]
> > >     at org.noggit.JSONParser.getMore(JSONParser.java:203) ~[supernova-2.4.0.jar:?]
> > >     at org.noggit.JSONParser.readStringChars2(JSONParser.java:646) ~[supernova-2.4.0.jar:?]
> > >     at org.noggit.JSONParser.readStringChars(JSONParser.java:626) ~[supernova-2.4.0.jar:?]
> > >     at org.noggit.JSONParser.getStringChars(JSONParser.java:1029) ~[supernova-2.4.0.jar:?]
> > >     at org.noggit.JSONParser.getString(JSONParser.java:1017) ~[supernova-2.4.0.jar:?]
> > >     at org.noggit.ObjectBuilder.getString(ObjectBuilder.java:68) ~[supernova-2.4.0.jar:?]
> > >     at org.noggit.ObjectBuilder.getVal(ObjectBuilder.java:51) ~[supernova-2.4.0.jar:?]
> > >     at org.noggit.ObjectBuilder.getObject(ObjectBuilder.java:128) ~[supernova-2.4.0.jar:?]
> > >     at org.noggit.ObjectBuilder.getVal(ObjectBuilder.java:57) ~[supernova-2.4.0.jar:?]
> > >     at org.noggit.ObjectBuilder.getVal(ObjectBuilder.java:37) ~[supernova-2.4.0.jar:?]
> > >     at org.apache.solr.client.solrj.io.stream.JSONTupleStream.next(JSONTupleStream.java:85) ~[supernova-2.4.0.jar:?]
> > >     at org.apache.solr.client.solrj.io.stream.SolrStream.read(SolrStream.java:207) ~[supernova-2.4.0.jar:?]
> > >
> > > The code snippet running the stream looks like this:
> > >
> > > CloudSolrStream cstream = new CloudSolrStream(....);
> > > cstream.open();
> > >
> > > while (true) {
> > >     Tuple tuple = cstream.read();
> > >     ......
> > > }
> > >
> > > Regards,
> > >
> > >
> > >
> > >
> > > -----
> > > Best regards
> > >
> > > /Yago
> > > --
> > > View this message in context: http://lucene.472066.n3.nabble.com/Truncated-chunk-in-CloudSolrStream-tp4337181.html
> > > Sent from the Solr - User mailing list archive at Nabble.com.
> > >
>
