smeana commented on issue #329:
URL: 
https://github.com/apache/camel-kafka-connector-examples/issues/329#issuecomment-1040041596


   > Hello,
   > 
   > > Hello,
   > > I am getting a message.max.bytes error when trying to read log files (plain text) of 300MB. As far as I know, the connector moves the whole file to Kafka in one record.
   > 
   > I haven't really tried it, but I'd look at using a converter for splitting 
the file.
   > 
   > > Is there any option to split the file into mini-batches and have an atomic transaction? In case the connector fails in the middle of the processing, could it reprocess only from where it left off?
   > 
   > Not yet. I am implementing that feature in Camel Core as part of 
CAMEL-15562. It's progressing, and should be in Core in a few versions if 
everything goes alright.
   > 
   > > Regards
   
   Thanks @orpiske. I was expecting the connector to do that, as files tend to be bigger than Kafka's message.max.bytes.

