shubhamcoc opened a new issue, #1560:
URL: https://github.com/apache/camel-kafka-connector/issues/1560

   Hi, 
   I am trying to back up data from multiple Kafka topics to an S3 bucket. 
Currently, the minio-sink plugin creates randomly named files in the same 
bucket without storing any metadata, so during a restore all the data saved 
in the bucket gets restored into every topic. 
   
   I can see in the [kamelet 
yaml](https://github.com/apache/camel-kafka-connector/blob/main/connectors/camel-minio-sink-kafka-connector/src/main/resources/kamelets/minio-sink.kamelet.yaml)
 that it mentions we have to set a header property called `file` to upload 
the data under a specific filename. Does anyone know how to configure it? 
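   In case it helps frame the question: one way I imagine this could be wired 
up is with Kafka Connect's built-in `InsertHeader` SMT (available since Kafka 
3.0) to attach a `file` header to each record before it reaches the sink. 
This is only a sketch under the assumption that the connector maps Kafka 
record headers onto Camel exchange headers; the connector class and the 
header value below are placeholders, and a constant value would of course 
send every record to the same object: 

```properties
# Hypothetical sink connector config (assumed class name and topic).
connector.class=org.apache.camel.kafkaconnector.miniosink.CamelMiniosinkSinkConnector
topics=my-topic
# InsertHeader SMT: adds a constant "file" header to every record.
transforms=addFileHeader
transforms.addFileHeader.type=org.apache.kafka.connect.transforms.InsertHeader
transforms.addFileHeader.header=file
transforms.addFileHeader.value.literal=my-topic-backup.json
```

   I have not confirmed that the minio-sink kamelet picks this header up, so 
corrections are welcome. 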
   
   I found one property to set the keyName in the endpoint, but it does not 
work as intended. There is a similar question on Stack Overflow: 
https://stackoverflow.com/questions/74662272/kafka-connect-camel-s3-sink-keyname-attribute-behavior.
 
   
   Any help is appreciated. Thanks in advance. Kindly let me know if it is 
possible to store the data in a particular object in a bucket. 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@camel.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
