This is an automated email from the ASF dual-hosted git repository.

davsclaus pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/camel.git
The following commit(s) were added to refs/heads/master by this push:
     new c8357b9  Polished

c8357b9 is described below

commit c8357b95dddde4a1896eaecbec0cfed81e76178d
Author: Claus Ibsen <claus.ib...@gmail.com>
AuthorDate: Mon Dec 3 08:26:41 2018 +0100

    Polished
---
 .../src/main/docs/google-bigquery-component.adoc   | 25 ++++++++++------------
 .../main/docs/google-bigquery-sql-component.adoc   | 20 ++++++++---------
 2 files changed, 20 insertions(+), 25 deletions(-)

diff --git a/components/camel-google-bigquery/src/main/docs/google-bigquery-component.adoc b/components/camel-google-bigquery/src/main/docs/google-bigquery-component.adoc
index 5af187d..765591e 100644
--- a/components/camel-google-bigquery/src/main/docs/google-bigquery-component.adoc
+++ b/components/camel-google-bigquery/src/main/docs/google-bigquery-component.adoc
@@ -3,8 +3,6 @@

 *Available as of Camel version 2.20*

-### Component Description
-
 The Google Bigquery component provides access to
 https://cloud.google.com/bigquery/[Cloud BigQuery Infrastructure] via
 the https://developers.google.com/api-client-library/java/apis/bigquery/v2[Google Client Services API].
@@ -29,7 +27,7 @@
 for this component:

 [[GoogleBigQuery-AuthenticationConfiguration]]
-### Authentication Configuration
+=== Authentication Configuration

 Google BigQuery component authentication is targeted for use with the GCP Service Accounts.
 For more information please refer to https://cloud.google.com/docs/authentication[Google Cloud Platform Auth Guide]
@@ -48,15 +46,15 @@
 https://developers.google.com/identity/protocols/application-default-credentials

 Service Account Email and Service Account Key can be found in the GCP JSON credentials file as client_email and private_key respectively.
-### URI Format
+=== URI Format

-[source,java]
+[source,text]
 --------------------------------------------------------
 google-bigquery://project-id:datasetId[:tableId]?[options]
 --------------------------------------------------------

-### Options
+=== Options

 // component options: START
 The Google BigQuery component supports 4 options, which are listed below.
@@ -124,7 +122,7 @@
 The component supports 4 options, which are listed below.

 // spring-boot-auto-configure options: END

-### Message Headers
+=== Message Headers

 [width="100%",cols="10%,10%,80%",options="header",]
 |=======================================================================
@@ -136,7 +134,7 @@
 |=======================================================================

-### Producer Endpoints
+=== Producer Endpoints

 Producer endpoints can accept and deliver to BigQuery individual and grouped
 exchanges alike. Grouped exchanges have `Exchange.GROUPED_EXCHANGE` property set.
@@ -148,7 +146,7 @@
 correct suffix or partition decorator.

 Google BigQuery endpoint expects the payload to be either a map or list of maps. A payload containing a map
 will insert a single row and a payload containing a list of map's will insert a row for each entry in the list.

-### Template tables
+=== Template tables

 Reference: https://cloud.google.com/bigquery/streaming-data-into-bigquery#template-tables
@@ -159,13 +157,12 @@
 I.e. the following route will create tables and insert records sharded on a per day basis:

 [source,java]
 ------------------------------------------------------
 from("direct:start")
-.header(GoogleBigQueryConstants.TABLE_SUFFIX, "_${date:now:yyyyMMdd}")
-.to("google-bigquery:sampleDataset:sampleTable")
-
+  .header(GoogleBigQueryConstants.TABLE_SUFFIX, "_${date:now:yyyyMMdd}")
+  .to("google-bigquery:sampleDataset:sampleTable")
 ------------------------------------------------------
 Note it is recommended to use partitioning for this use case.
-### Partitioning
+=== Partitioning

 Reference: https://cloud.google.com/bigquery/docs/creating-partitioned-tables
@@ -173,7 +170,7 @@
 Partitioning is specified when creating a table and if set data will be automatically partitioned into
 separate tables. When inserting data a specific partition can be specified by setting the
 `GoogleBigQueryConstants.PARTITION_DECORATOR` header on the exchange.

-### Ensuring data consistency
+=== Ensuring data consistency

 Reference: https://cloud.google.com/bigquery/streaming-data-into-bigquery#dataconsistency
diff --git a/components/camel-google-bigquery/src/main/docs/google-bigquery-sql-component.adoc b/components/camel-google-bigquery/src/main/docs/google-bigquery-sql-component.adoc
index 36d7cb8..a0e06bb 100644
--- a/components/camel-google-bigquery/src/main/docs/google-bigquery-sql-component.adoc
+++ b/components/camel-google-bigquery/src/main/docs/google-bigquery-sql-component.adoc
@@ -1,9 +1,7 @@
 [[google-bigquery-sql-component]]
 == Google BigQuery Standard SQL Component
-== Google BigQuery Component

 *Available as of Camel version 2.23*

-### Component Description

 The Google Bigquery SQL component provides access to
 https://cloud.google.com/bigquery/[Cloud BigQuery Infrastructure] via
@@ -28,7 +26,7 @@
 for this component:

 [[GoogleBigQuery-AuthenticationConfiguration]]
-### Authentication Configuration
+=== Authentication Configuration

 Google BigQuery component authentication is targeted for use with the GCP Service Accounts.
 For more information please refer to https://cloud.google.com/docs/authentication[Google Cloud Platform Auth Guide]
@@ -47,15 +45,15 @@
 https://developers.google.com/identity/protocols/application-default-credentials

 Service Account Email and Service Account Key can be found in the GCP JSON credentials file as client_email and private_key respectively.
-### URI Format
+=== URI Format

-[source,java]
+[source,text]
 --------------------------------------------------------
 google-bigquery-sql://project-id:query?[options]
 --------------------------------------------------------

 Examples:
-[source,java]
+[source,text]
 --------------------------------------------------------
 google-bigquery-sql://project-17248459:delete * from test.table where id=@myId
 google-bigquery-sql://project-17248459:delete * from ${datasetId}.${tableId} where id=@myId
@@ -66,14 +64,14 @@
 where
 * parameters in form @name are extracted from body or message headers and sent to Google Bigquery

 You can externalize your SQL queries to files in the classpath or file system as shown:
-[source,java]
+[source,text]
 --------------------------------------------------------
 google-bigquery-sql://project-17248459::classpath:delete.sql
 --------------------------------------------------------

-### Options
+=== Options

 // component options: START
 The Google BigQuery Standard SQL component supports 3 options, which are listed below.
@@ -137,7 +135,7 @@
 The component supports 3 options, which are listed below.

 // spring-boot-auto-configure options: END

-### Ouput Message Headers
+=== Ouput Message Headers

 [width="100%",cols="10%,10%,80%",options="header",]
 |=======================================================================
@@ -146,6 +144,6 @@
 |=======================================================================

-### Producer Endpoints
+=== Producer Endpoints

-Google BigQuery SQL endpoint expects the payload to be either empty or a map of query parameters.
+Google BigQuery SQL endpoint expects the payload to be either empty or a map of query parameters.
\ No newline at end of file
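The Producer Endpoints section of the first doc above states that the payload should be a map (one row) or a list of maps (one row per entry). A minimal sketch of building such a body in plain Java — no Camel dependencies, and the class, column, and endpoint names here are illustrative, not from the commit:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class BigQueryPayloadExample {

    // Each map in the list becomes one inserted row; keys are column names.
    public static List<Map<String, Object>> buildRows() {
        List<Map<String, Object>> rows = new ArrayList<>();

        Map<String, Object> row1 = new HashMap<>();
        row1.put("id", 1);
        row1.put("name", "alpha");
        rows.add(row1);

        Map<String, Object> row2 = new HashMap<>();
        row2.put("id", 2);
        row2.put("name", "beta");
        rows.add(row2);

        return rows;
    }

    public static void main(String[] args) {
        List<Map<String, Object>> rows = buildRows();
        // A producer such as .to("google-bigquery:myProject:myDataset:myTable")
        // would insert one row per map in this list.
        System.out.println(rows.size()); // prints 2
    }
}
```

Sending a single `Map` instead of the list would insert exactly one row, per the same section.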
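The SQL doc above says `@name` parameters are extracted from the message body or headers. A sketch, again in plain Java with an illustrative class name, of the body that would satisfy a query such as `delete * from test.table where id=@myId`:

```java
import java.util.HashMap;
import java.util.Map;

public class BigQuerySqlParamsExample {

    // The @myId placeholder in the endpoint's query is resolved from a
    // body (or header) entry whose key is "myId" (without the '@').
    public static Map<String, Object> buildParams(long id) {
        Map<String, Object> params = new HashMap<>();
        params.put("myId", id);
        return params;
    }

    public static void main(String[] args) {
        // This map would be set as the exchange body before sending to e.g.
        // google-bigquery-sql://project-17248459:delete * from test.table where id=@myId
        System.out.println(buildParams(42L).get("myId")); // prints 42
    }
}
```

An empty body is also accepted when the query has no `@name` parameters, per the Producer Endpoints note at the end of the diff.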