This is an automated email from the ASF dual-hosted git repository.

acosentino pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/camel-kafka-connector.git


The following commit(s) were added to refs/heads/master by this push:
     new faa3fc7  Added basic concepts adoc
faa3fc7 is described below

commit faa3fc7b6eeef890b130dd71997923114539a164
Author: Andrea Cosentino <anco...@gmail.com>
AuthorDate: Fri May 8 16:57:05 2020 +0200

    Added basic concepts adoc
---
 docs/modules/ROOT/pages/basic-concepts.adoc | 18 ++++++++++++++++++
 docs/modules/ROOT/pages/index.adoc          |  1 +
 2 files changed, 19 insertions(+)

diff --git a/docs/modules/ROOT/pages/basic-concepts.adoc 
b/docs/modules/ROOT/pages/basic-concepts.adoc
new file mode 100644
index 0000000..b57f87d
--- /dev/null
+++ b/docs/modules/ROOT/pages/basic-concepts.adoc
@@ -0,0 +1,18 @@
+[[BasicConcepts-BasicConcepts]]
+= Basic Concepts
+
+In this section we'll explain the basic concepts of Kafka Connect and Camel, and how the two frameworks relate to each other.
+
+As reported on the Apache Kafka website: _Kafka Connect is a tool for scalably 
and reliably streaming data between Apache Kafka and other systems. It makes it 
simple to quickly define connectors that move large collections of data into 
and out of Kafka. Kafka Connect can ingest entire databases or collect metrics 
from all your application servers into Kafka topics, making the data available 
for stream processing with low latency. An export job can deliver data from Kafka topics into secondary storage and query systems or into batch systems for offline analysis._
+
+Connectors exist in two flavors: SourceConnectors import data from another system into Kafka, while SinkConnectors export data from Kafka.
+
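+Below is a minimal, hand-written sketch of a SourceConnector against the plain Kafka Connect API (the class names are illustrative and not taken from Camel Kafka Connector); it only declares which Task class does the copying and how the configuration is handed out to the tasks:
+
+[source,java]
+----
+import java.util.Collections;
+import java.util.List;
+import java.util.Map;
+
+import org.apache.kafka.common.config.ConfigDef;
+import org.apache.kafka.connect.connector.Task;
+import org.apache.kafka.connect.source.SourceConnector;
+
+// Illustrative skeleton: a SourceConnector does not copy data itself,
+// it only says which Task implementation does, and how to configure it.
+public class MySourceConnector extends SourceConnector {
+
+    private Map<String, String> config;
+
+    @Override
+    public void start(Map<String, String> props) {
+        this.config = props;              // keep the connector-level configuration
+    }
+
+    @Override
+    public Class<? extends Task> taskClass() {
+        return MySourceTask.class;        // the class that actually moves the data (sketched below)
+    }
+
+    @Override
+    public List<Map<String, String>> taskConfigs(int maxTasks) {
+        // a real connector would partition the work here; this sketch
+        // simply hands the same configuration to every task
+        return Collections.nCopies(maxTasks, config);
+    }
+
+    @Override
+    public void stop() {
+    }
+
+    @Override
+    public ConfigDef config() {
+        return new ConfigDef();
+    }
+
+    @Override
+    public String version() {
+        return "0.1.0";
+    }
+}
+----
+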
+Connectors do not perform the hard work themselves: their configuration describes the data to be copied, and the job is split into a set of Tasks that can be distributed to workers (by the Connector itself). Tasks also come in two flavors: SourceTask and SinkTask.
+
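+As a companion to the connector sketch above, here is a minimal SourceTask (again illustrative, not taken from the Camel Kafka Connector code base): the Kafka Connect worker calls poll() in a loop and writes every returned record to the topic.
+
+[source,java]
+----
+import java.util.Collections;
+import java.util.List;
+import java.util.Map;
+
+import org.apache.kafka.connect.data.Schema;
+import org.apache.kafka.connect.source.SourceRecord;
+import org.apache.kafka.connect.source.SourceTask;
+
+// Illustrative skeleton of the task that does the actual copying.
+public class MySourceTask extends SourceTask {
+
+    private String topic;
+
+    @Override
+    public void start(Map<String, String> props) {
+        this.topic = props.get("topic");   // hypothetical configuration key
+    }
+
+    @Override
+    public List<SourceRecord> poll() throws InterruptedException {
+        Thread.sleep(1000);                // pretend we are waiting for new data
+        SourceRecord record = new SourceRecord(
+                Collections.singletonMap("source", "demo"),   // source partition
+                Collections.singletonMap("position", 0L),     // source offset
+                topic,
+                Schema.STRING_SCHEMA,
+                "hello from a source task");
+        return Collections.singletonList(record);
+    }
+
+    @Override
+    public void stop() {
+    }
+
+    @Override
+    public String version() {
+        return "0.1.0";
+    }
+}
+----
+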
+An experienced Apache Camel developer will surely recognize the similarity between a SourceConnector and a Camel consumer, and between a SinkConnector and a Camel producer.
+
+Basically, a Camel-Kafka Source Connector is a pre-configured Camel consumer which performs the same action at a fixed rate (every 5s, 10s, ...) and sends the exchanges to Kafka, while a Camel-Kafka Sink Connector is a pre-configured Camel producer which performs the same operation on each message exported from Kafka.
+
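+To make the analogy concrete, the plain Camel route below (a hand-written sketch, the endpoint URIs are only examples) does by hand what a Camel-Kafka Source Connector gives you pre-configured: a timer consumer fires at a fixed rate and every exchange it creates is sent to a Kafka topic.
+
+[source,java]
+----
+import org.apache.camel.builder.RouteBuilder;
+
+// Hand-written equivalent of a Camel-Kafka Source Connector: a consumer
+// firing at a fixed rate whose exchanges end up in a Kafka topic.
+public class TimerToKafkaRoute extends RouteBuilder {
+
+    @Override
+    public void configure() {
+        from("timer:tick?period=5000")                      // fire every 5 seconds
+            .setBody(constant("hello from a Camel consumer"))
+            .to("kafka:mytopic?brokers=localhost:9092");    // send the exchange to Kafka
+    }
+}
+----
+
+Running such a route by hand requires a Camel runtime (for instance camel-main or Camel on Spring Boot) and the camel-kafka component on the classpath; with Camel Kafka Connector the Kafka Connect worker plays that role for you.
+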
+We have some examples in place in an ad-hoc https://github.com/apache/camel-kafka-connector-examples[examples repository].
+
+
diff --git a/docs/modules/ROOT/pages/index.adoc 
b/docs/modules/ROOT/pages/index.adoc
index 09cee1a..5926f32 100644
--- a/docs/modules/ROOT/pages/index.adoc
+++ b/docs/modules/ROOT/pages/index.adoc
@@ -2,6 +2,7 @@
 = Apache Camel Kafka Connector
 
 * xref:about.adoc[What is it?]
+** xref:basic-concepts.adoc[Basic concepts]
 * xref:getting-started.adoc[Getting started]
 ** xref:try-it-out-locally.adoc[Try it locally]
 ** xref:try-it-out-on-openshift-with-strimzi.adoc[Try it on OpenShift cluster]
