This is an automated email from the ASF dual-hosted git repository.

morningman pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/doris.git
 from 441125652c5 [Chore](statistic) do not use memory order relaxed on QueryStatistics and add sync on te… (#38048)
  add 5b6ae3e6a53 [feature](paimon/iceberg)add a docker that can directly pull up all the relevant environments of paimon/iceberg/doris (#38009)

No new revisions were added by this update.

Summary of changes:
 samples/datalake/iceberg_and_paimon/README.md      | 279 ++++++++++++++++++
 .../data/flink-conf/flink-conf.yaml                | 312 +++++++++++++++++++++
 .../data/flink-conf/log4j-cli.properties           |  67 +++++
 .../data/flink-conf/log4j-console.properties       |  70 +++++
 .../data/flink-conf/log4j-session.properties       |  42 +++
 .../data/flink-conf/log4j.properties               |  61 ++++
 .../data/flink-conf/logback-console.xml            |  67 +++++
 .../data/flink-conf/logback-session.xml            |  39 +++
 .../iceberg_and_paimon/data/flink-conf/logback.xml |  58 ++++
 .../iceberg_and_paimon/data/flink-conf/masters     |  18 ++
 .../iceberg_and_paimon/data/flink-conf/workers     |  18 ++
 .../iceberg_and_paimon/data/flink-conf/zoo.cfg     |  36 +++
 .../data/spark-conf/fairscheduler.xml.template     |  31 ++
 .../data/spark-conf/log4j2.properties.template     |  69 +++++
 .../data/spark-conf/metrics.properties.template    | 210 ++++++++++++++
 .../data/spark-conf/spark-defaults.conf            |  43 +++
 .../data/spark-conf/spark-defaults.conf.template   |  27 ++
 .../data/spark-conf/spark-env.sh.template          |  81 ++++++
 .../data/spark-conf/workers.template               |  19 ++
 .../data/table/customer/000000_0}                  | Bin
 .../data/table/customer/000001_0}                  | Bin
 .../data/table/customer/000002_0}                  | Bin
 .../data/table/customer/000003_0}                  | Bin
 .../datalake/iceberg_and_paimon/docker-compose.env |  22 ++
 .../datalake/iceberg_and_paimon/docker-compose.yml | 173 ++++++++++++
 .../iceberg_and_paimon/scripts/start_doris.sh      |  60 ++++
 .../datalake/iceberg_and_paimon/sql/init_doris.sql |  21 ++
 .../iceberg_and_paimon/sql/init_tables.sql         |  53 ++++
 .../iceberg_and_paimon/sql/prepare_data.sql        |   8 +
 samples/datalake/iceberg_and_paimon/start_all.sh   | 121 ++++++++
 .../iceberg_and_paimon/start_doris_client.sh       |  20 ++
 .../iceberg_and_paimon/start_flink_client.sh       |  19 ++
 .../start_spark_iceberg_client.sh                  |  19 ++
 .../start_spark_paimon_client.sh                   |  19 ++
 samples/datalake/iceberg_and_paimon/stop_all.sh    |  19 ++
 35 files changed, 2101 insertions(+)
 create mode 100644 samples/datalake/iceberg_and_paimon/README.md
 create mode 100644 samples/datalake/iceberg_and_paimon/data/flink-conf/flink-conf.yaml
 create mode 100644 samples/datalake/iceberg_and_paimon/data/flink-conf/log4j-cli.properties
 create mode 100644 samples/datalake/iceberg_and_paimon/data/flink-conf/log4j-console.properties
 create mode 100644 samples/datalake/iceberg_and_paimon/data/flink-conf/log4j-session.properties
 create mode 100644 samples/datalake/iceberg_and_paimon/data/flink-conf/log4j.properties
 create mode 100644 samples/datalake/iceberg_and_paimon/data/flink-conf/logback-console.xml
 create mode 100644 samples/datalake/iceberg_and_paimon/data/flink-conf/logback-session.xml
 create mode 100644 samples/datalake/iceberg_and_paimon/data/flink-conf/logback.xml
 create mode 100644 samples/datalake/iceberg_and_paimon/data/flink-conf/masters
 create mode 100644 samples/datalake/iceberg_and_paimon/data/flink-conf/workers
 create mode 100644 samples/datalake/iceberg_and_paimon/data/flink-conf/zoo.cfg
 create mode 100644 samples/datalake/iceberg_and_paimon/data/spark-conf/fairscheduler.xml.template
 create mode 100644 samples/datalake/iceberg_and_paimon/data/spark-conf/log4j2.properties.template
 create mode 100644 samples/datalake/iceberg_and_paimon/data/spark-conf/metrics.properties.template
 create mode 100755 samples/datalake/iceberg_and_paimon/data/spark-conf/spark-defaults.conf
 create mode 100644 samples/datalake/iceberg_and_paimon/data/spark-conf/spark-defaults.conf.template
 create mode 100755 samples/datalake/iceberg_and_paimon/data/spark-conf/spark-env.sh.template
 create mode 100644 samples/datalake/iceberg_and_paimon/data/spark-conf/workers.template
 copy samples/datalake/{hudi/data/customer/000000_0.parquet => iceberg_and_paimon/data/table/customer/000000_0} (100%)
 copy samples/datalake/{hudi/data/customer/000001_0.parquet => iceberg_and_paimon/data/table/customer/000001_0} (100%)
 copy samples/datalake/{hudi/data/customer/000002_0.parquet => iceberg_and_paimon/data/table/customer/000002_0} (100%)
 copy samples/datalake/{hudi/data/customer/000003_0.parquet => iceberg_and_paimon/data/table/customer/000003_0} (100%)
 create mode 100644 samples/datalake/iceberg_and_paimon/docker-compose.env
 create mode 100644 samples/datalake/iceberg_and_paimon/docker-compose.yml
 create mode 100644 samples/datalake/iceberg_and_paimon/scripts/start_doris.sh
 create mode 100644 samples/datalake/iceberg_and_paimon/sql/init_doris.sql
 create mode 100644 samples/datalake/iceberg_and_paimon/sql/init_tables.sql
 create mode 100644 samples/datalake/iceberg_and_paimon/sql/prepare_data.sql
 create mode 100644 samples/datalake/iceberg_and_paimon/start_all.sh
 create mode 100644 samples/datalake/iceberg_and_paimon/start_doris_client.sh
 create mode 100644 samples/datalake/iceberg_and_paimon/start_flink_client.sh
 create mode 100644 samples/datalake/iceberg_and_paimon/start_spark_iceberg_client.sh
 create mode 100644 samples/datalake/iceberg_and_paimon/start_spark_paimon_client.sh
 create mode 100644 samples/datalake/iceberg_and_paimon/stop_all.sh

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org
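
P.S. A quick usage sketch for the new sample, inferred only from the script names added in this change. What each script actually does is an assumption here; the authoritative steps are in samples/datalake/iceberg_and_paimon/README.md, and a working Docker plus docker-compose setup on the host is presumably required:

    cd samples/datalake/iceberg_and_paimon
    bash start_all.sh                    # presumably pulls the images and starts the Doris/Iceberg/Paimon containers
    bash start_doris_client.sh           # presumably opens a SQL client connected to Doris
    bash start_flink_client.sh           # presumably opens a Flink SQL client
    bash start_spark_iceberg_client.sh   # presumably opens a Spark SQL client configured for Iceberg
    bash start_spark_paimon_client.sh    # presumably opens a Spark SQL client configured for Paimon
    bash stop_all.sh                     # presumably stops and removes the containers when finished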