aaronphilip commented on issue #12759:
URL: https://github.com/apache/iceberg/issues/12759#issuecomment-2801548512

   > Hi [@peach12345](https://github.com/peach12345), as far as I am aware, Kafka Connect Iceberg does not have DLQ support for sending records to a DLQ once a record reaches the sink task. What you do get is DLQ support at the Kafka converter layer, which checks whether the ConsumerRecord can be converted to a ConnectRecord and, on an incompatibility (which depends on the converter you have configured), pushes the record to the DLQ topic according to the configured error tolerance level and DLQ settings.
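
   (For reference, the converter-layer DLQ behavior described above is driven by Kafka Connect's standard error-handling properties on the connector config; a minimal sketch, with an illustrative topic name:)

```properties
# Standard Kafka Connect error-handling / DLQ properties (connector-level).
# These only apply to failures in the converter/transform stages (e.g. de/serialization),
# not to errors raised later by the sink task, such as a failed table write.
errors.tolerance=all
errors.deadletterqueue.topic.name=iceberg-sink-dlq
errors.deadletterqueue.topic.replication.factor=3
errors.deadletterqueue.context.headers.enable=true
errors.log.enable=true
errors.log.include.messages=true
```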
   
   Hi @kumarpritam863, is adding a DLQ for the sink task something on the roadmap?
   
   From [this chart](https://www.confluent.io/blog/kafka-connect-deep-dive-error-handling-dead-letter-queues/#where-is-error-handling-not-provided-by-kafka-connect) it looks like DLQ support is available for de/serialization errors. However, we'd like to handle errors such as a record that causes a table write to fail because its schema is incompatible with the table's schema.

