saranyaeu2987 edited a comment on issue #251: URL: https://github.com/apache/camel-kafka-connector/issues/251#issuecomment-636048341
@oscerd If that's the case, how is the `file` placeholder resolved?

```
NOT RESOLVED --> camel.sink.url: aws-s3://selumalai-kafka-s3?keyName=${date:now:yyyyMMdd-HHmmssSSS}-${exchangeId}
RESOLVED     --> camel.component.aws-s3.accessKey: ${file:/opt/kafka/external-configuration/aws-credentials/aws-credentials.properties:aws_access_key_id}
```

***Basically, I want to preserve all Kafka topic data in S3 without overwriting.*** Any suggestion on:

- How can I add a dynamic part to the URL so that the sink connector writes topic data to different files in S3? **OR**
- How can I prevent overwriting data in S3 when consuming from a Kafka topic?
- Is there some way to write to S3 in batches? (If so, it would be AWESOME!!)
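
For reference, here is a minimal sketch of the sink connector configuration I am describing, shown in plain properties form. The bucket name, the `keyName` expression, and the `file:` placeholder are the ones quoted above; the `name`, `connector.class`, `topics`, and `secretKey` entries are illustrative assumptions, not my exact values.

```
# Sketch only: bucket name, keyName expression and the file: placeholder
# come from the comment above; the remaining keys/values are assumptions.
name=s3-sink-connector
connector.class=org.apache.camel.kafkaconnector.CamelSinkConnector
topics=my-topic

# Dynamic keyName: NOT resolved at runtime in my setup
camel.sink.url=aws-s3://selumalai-kafka-s3?keyName=${date:now:yyyyMMdd-HHmmssSSS}-${exchangeId}

# file: placeholder: resolved as expected
camel.component.aws-s3.accessKey=${file:/opt/kafka/external-configuration/aws-credentials/aws-credentials.properties:aws_access_key_id}
camel.component.aws-s3.secretKey=${file:/opt/kafka/external-configuration/aws-credentials/aws-credentials.properties:aws_secret_access_key}
```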