jaetma commented on issue #33688:
URL: https://github.com/apache/airflow/issues/33688#issuecomment-1692847361

   @potiuk 
   
   Thanks for your response, I was able to test this in our dev environment!
   
   After the downgrade, the task run times are back to normal:
   
![image](https://github.com/apache/airflow/assets/123120332/77dbcb75-3f27-45f5-9f4c-8029aba06720)
   
   
   The following lines are from the Airflow task logs for 2.6.3 and 2.7.0:
   ```
   AIRFLOW 2.6.3 TASK RUN LOGS
   
   TASK: fetch_header
   
   9b9bc0eb214e
   *** Found local files:
   ***   * 
/opt/airflow/logs/dag_id=mydagname/run_id=mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195/task_id=fetch_header/attempt=2.log
   [2023-08-23, 19:28:45 UTC] {taskinstance.py:1103} INFO - Dependencies all 
met for dep_context=non-requeueable deps ti=<TaskInstance: 
mydagname.fetch_header 
mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195 
[queued]>
   [2023-08-23, 19:28:45 UTC] {taskinstance.py:1103} INFO - Dependencies all 
met for dep_context=requeueable deps ti=<TaskInstance: mydagname.fetch_header 
mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195 
[queued]>
   [2023-08-23, 19:28:45 UTC] {taskinstance.py:1308} INFO - Starting attempt 2 
of 2
   [2023-08-23, 19:28:45 UTC] {taskinstance.py:1327} INFO - Executing 
<Task(PythonOperator): fetch_header> on 2023-08-22 01:36:30+00:00
   [2023-08-23, 19:28:45 UTC] {standard_task_runner.py:57} INFO - Started 
process 4198 to run task
   [2023-08-23, 19:28:45 UTC] {standard_task_runner.py:84} INFO - Running: 
['***', 'tasks', 'run', 'mydagname', 'fetch_header', 
'mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195', 
'--job-id', '1182100', '--raw', '--subdir', 'DAGS_FOLDER/example/mydagname.py', 
'--cfg-path', '/tmp/tmpqdgbw1tb']
   [2023-08-23, 19:28:45 UTC] {standard_task_runner.py:85} INFO - Job 1182100: 
Subtask fetch_header
   [2023-08-23, 19:28:45 UTC] {task_command.py:410} INFO - Running 
<TaskInstance: mydagname.fetch_header 
mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195 
[running]> on host 9b9bc0eb214e
   [2023-08-23, 19:28:46 UTC] {taskinstance.py:1545} INFO - Exporting env vars: 
AIRFLOW_CTX_DAG_EMAIL='[email protected]' AIRFLOW_CTX_DAG_OWNER='admin' 
AIRFLOW_CTX_DAG_ID='mydagname' AIRFLOW_CTX_TASK_ID='fetch_header' 
AIRFLOW_CTX_EXECUTION_DATE='2023-08-22T01:36:30+00:00' 
AIRFLOW_CTX_TRY_NUMBER='2' 
AIRFLOW_CTX_DAG_RUN_ID='mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195'
   [2023-08-23, 19:28:46 UTC] {logging_mixin.py:150} INFO - { censored }
   [2023-08-23, 19:28:46 UTC] {python.py:183} INFO - Done. Returned value was: 
{ censored }
   [2023-08-23, 19:28:46 UTC] {taskinstance.py:1345} INFO - Marking task as 
SUCCESS. dag_id=mydagname, task_id=fetch_header, 
execution_date=20230822T013630, start_date=20230823T192845, 
end_date=20230823T192846
   [2023-08-23, 19:28:46 UTC] {local_task_job_runner.py:225} INFO - Task exited 
with return code 0
   [2023-08-23, 19:28:46 UTC] {taskinstance.py:2653} INFO - 1 downstream tasks 
scheduled from follow-on schedule check
   
   
   TASK: add_fixed_values_header
   
   9b9bc0eb214e
   *** Found local files:
   ***   * 
/opt/airflow/logs/dag_id=mydagname/run_id=mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195/task_id=add_fixed_values_header/attempt=2.log
   [2023-08-23, 19:28:51 UTC] {taskinstance.py:1103} INFO - Dependencies all 
met for dep_context=non-requeueable deps ti=<TaskInstance: 
mydagname.add_fixed_values_header 
mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195 
[queued]>
   [2023-08-23, 19:28:51 UTC] {taskinstance.py:1103} INFO - Dependencies all 
met for dep_context=requeueable deps ti=<TaskInstance: 
mydagname.add_fixed_values_header 
mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195 
[queued]>
   [2023-08-23, 19:28:51 UTC] {taskinstance.py:1308} INFO - Starting attempt 2 
of 2
   [2023-08-23, 19:28:51 UTC] {taskinstance.py:1327} INFO - Executing 
<Task(PythonOperator): add_fixed_values_header> on 2023-08-22 01:36:30+00:00
   [2023-08-23, 19:28:51 UTC] {standard_task_runner.py:57} INFO - Started 
process 4240 to run task
   [2023-08-23, 19:28:51 UTC] {standard_task_runner.py:84} INFO - Running: 
['***', 'tasks', 'run', 'mydagname', 'add_fixed_values_header', 
'mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195', 
'--job-id', '1182114', '--raw', '--subdir', 'DAGS_FOLDER/example/mydagname.py', 
'--cfg-path', '/tmp/tmpqkq5pev4']
   [2023-08-23, 19:28:51 UTC] {standard_task_runner.py:85} INFO - Job 1182114: 
Subtask add_fixed_values_header
   [2023-08-23, 19:28:52 UTC] {task_command.py:410} INFO - Running 
<TaskInstance: mydagname.add_fixed_values_header 
mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195 
[running]> on host 9b9bc0eb214e
   [2023-08-23, 19:28:52 UTC] {taskinstance.py:1545} INFO - Exporting env vars: 
AIRFLOW_CTX_DAG_EMAIL='[email protected]' AIRFLOW_CTX_DAG_OWNER='admin' 
AIRFLOW_CTX_DAG_ID='mydagname' AIRFLOW_CTX_TASK_ID='add_fixed_values_header' 
AIRFLOW_CTX_EXECUTION_DATE='2023-08-22T01:36:30+00:00' 
AIRFLOW_CTX_TRY_NUMBER='2' 
AIRFLOW_CTX_DAG_RUN_ID='mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195'
   [2023-08-23, 19:28:52 UTC] {python.py:183} INFO - Done. Returned value was: 
{ censored }
   [2023-08-23, 19:28:52 UTC] {taskinstance.py:1345} INFO - Marking task as 
SUCCESS. dag_id=mydagname, task_id=add_fixed_values_header, 
execution_date=20230822T013630, start_date=20230823T192851, 
end_date=20230823T192852
   [2023-08-23, 19:28:53 UTC] {local_task_job_runner.py:225} INFO - Task exited 
with return code 0
   [2023-08-23, 19:28:53 UTC] {taskinstance.py:2653} INFO - 1 downstream tasks 
scheduled from follow-on schedule check
   
   
   TASK: schema_validation_header
   
   9b9bc0eb214e
   *** Found local files:
   ***   * 
/opt/airflow/logs/dag_id=mydagname/run_id=mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195/task_id=schema_validation_header/attempt=1.log
   [2023-08-23, 19:29:01 UTC] {taskinstance.py:1103} INFO - Dependencies all 
met for dep_context=non-requeueable deps ti=<TaskInstance: 
mydagname.schema_validation_header 
mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195 
[queued]>
   [2023-08-23, 19:29:01 UTC] {taskinstance.py:1103} INFO - Dependencies all 
met for dep_context=requeueable deps ti=<TaskInstance: 
mydagname.schema_validation_header 
mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195 
[queued]>
   [2023-08-23, 19:29:01 UTC] {taskinstance.py:1308} INFO - Starting attempt 1 
of 1
   [2023-08-23, 19:29:01 UTC] {taskinstance.py:1327} INFO - Executing 
<Task(PythonOperator): schema_validation_header> on 2023-08-22 01:36:30+00:00
   [2023-08-23, 19:29:01 UTC] {standard_task_runner.py:57} INFO - Started 
process 4293 to run task
   [2023-08-23, 19:29:01 UTC] {standard_task_runner.py:84} INFO - Running: 
['***', 'tasks', 'run', 'mydagname', 'schema_validation_header', 
'mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195', 
'--job-id', '1182141', '--raw', '--subdir', 'DAGS_FOLDER/example/mydagname.py', 
'--cfg-path', '/tmp/tmpx34mwefz']
   [2023-08-23, 19:29:01 UTC] {standard_task_runner.py:85} INFO - Job 1182141: 
Subtask schema_validation_header
   [2023-08-23, 19:29:02 UTC] {task_command.py:410} INFO - Running 
<TaskInstance: mydagname.schema_validation_header 
mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195 
[running]> on host 9b9bc0eb214e
   [2023-08-23, 19:29:03 UTC] {taskinstance.py:1545} INFO - Exporting env vars: 
AIRFLOW_CTX_DAG_EMAIL='[email protected]' AIRFLOW_CTX_DAG_OWNER='admin' 
AIRFLOW_CTX_DAG_ID='mydagname' AIRFLOW_CTX_TASK_ID='schema_validation_header' 
AIRFLOW_CTX_EXECUTION_DATE='2023-08-22T01:36:30+00:00' 
AIRFLOW_CTX_TRY_NUMBER='1' 
AIRFLOW_CTX_DAG_RUN_ID='mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195'
   [2023-08-23, 19:29:03 UTC] {python.py:183} INFO - Done. Returned value was: 
{ censored }
   [2023-08-23, 19:29:03 UTC] {taskinstance.py:1345} INFO - Marking task as 
SUCCESS. dag_id=mydagname, task_id=schema_validation_header, 
execution_date=20230822T013630, start_date=20230823T192901, 
end_date=20230823T192903
   [2023-08-23, 19:29:03 UTC] {local_task_job_runner.py:225} INFO - Task exited 
with return code 0
   [2023-08-23, 19:29:03 UTC] {taskinstance.py:2653} INFO - 1 downstream tasks 
scheduled from follow-on schedule check
   
   
   TASK: generate_sql_header
   
   9b9bc0eb214e
   *** Found local files:
   ***   * 
/opt/airflow/logs/dag_id=mydagname/run_id=mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195/task_id=generate_sql_header/attempt=1.log
   [2023-08-23, 19:29:08 UTC] {taskinstance.py:1103} INFO - Dependencies all 
met for dep_context=non-requeueable deps ti=<TaskInstance: 
mydagname.generate_sql_header 
mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195 
[queued]>
   [2023-08-23, 19:29:08 UTC] {taskinstance.py:1103} INFO - Dependencies all 
met for dep_context=requeueable deps ti=<TaskInstance: 
mydagname.generate_sql_header 
mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195 
[queued]>
   [2023-08-23, 19:29:08 UTC] {taskinstance.py:1308} INFO - Starting attempt 1 
of 1
   [2023-08-23, 19:29:08 UTC] {taskinstance.py:1327} INFO - Executing 
<Task(PythonOperator): generate_sql_header> on 2023-08-22 01:36:30+00:00
   [2023-08-23, 19:29:08 UTC] {standard_task_runner.py:57} INFO - Started 
process 4345 to run task
   [2023-08-23, 19:29:08 UTC] {standard_task_runner.py:84} INFO - Running: 
['***', 'tasks', 'run', 'mydagname', 'generate_sql_header', 
'mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195', 
'--job-id', '1182162', '--raw', '--subdir', 'DAGS_FOLDER/example/mydagname.py', 
'--cfg-path', '/tmp/tmp03kw7z3y']
   [2023-08-23, 19:29:08 UTC] {standard_task_runner.py:85} INFO - Job 1182162: 
Subtask generate_sql_header
   [2023-08-23, 19:29:08 UTC] {task_command.py:410} INFO - Running 
<TaskInstance: mydagname.generate_sql_header 
mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195 
[running]> on host 9b9bc0eb214e
   [2023-08-23, 19:29:09 UTC] {taskinstance.py:1545} INFO - Exporting env vars: 
AIRFLOW_CTX_DAG_EMAIL='[email protected]' AIRFLOW_CTX_DAG_OWNER='admin' 
AIRFLOW_CTX_DAG_ID='mydagname' AIRFLOW_CTX_TASK_ID='generate_sql_header' 
AIRFLOW_CTX_EXECUTION_DATE='2023-08-22T01:36:30+00:00' 
AIRFLOW_CTX_TRY_NUMBER='1' 
AIRFLOW_CTX_DAG_RUN_ID='mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195'
   [2023-08-23, 19:29:09 UTC] {python.py:183} INFO - Done. censored
   [2023-08-23, 19:29:09 UTC] {taskinstance.py:1345} INFO - Marking task as 
SUCCESS. dag_id=mydagname, task_id=generate_sql_header, 
execution_date=20230822T013630, start_date=20230823T192908, 
end_date=20230823T192909
   [2023-08-23, 19:29:10 UTC] {local_task_job_runner.py:225} INFO - Task exited 
with return code 0
   [2023-08-23, 19:29:10 UTC] {taskinstance.py:2653} INFO - 1 downstream tasks 
scheduled from follow-on schedule check
   
   
   TASK: execute_sql
   
   9b9bc0eb214e
   *** Found local files:
   ***   * 
/opt/airflow/logs/dag_id=mydagname/run_id=mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195/task_id=execute_sql/attempt=1.log
   [2023-08-23, 19:29:14 UTC] {taskinstance.py:1103} INFO - Dependencies all 
met for dep_context=non-requeueable deps ti=<TaskInstance: 
mydagname.execute_sql 
mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195 
[queued]>
   [2023-08-23, 19:29:14 UTC] {taskinstance.py:1103} INFO - Dependencies all 
met for dep_context=requeueable deps ti=<TaskInstance: mydagname.execute_sql 
mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195 
[queued]>
   [2023-08-23, 19:29:14 UTC] {taskinstance.py:1308} INFO - Starting attempt 1 
of 1
   [2023-08-23, 19:29:14 UTC] {taskinstance.py:1327} INFO - Executing 
<Task(PostgresOperator): execute_sql> on 2023-08-22 01:36:30+00:00
   [2023-08-23, 19:29:14 UTC] {standard_task_runner.py:57} INFO - Started 
process 4380 to run task
   [2023-08-23, 19:29:14 UTC] {standard_task_runner.py:84} INFO - Running: 
['***', 'tasks', 'run', 'mydagname', 'execute_sql', 
'mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195', 
'--job-id', '1182181', '--raw', '--subdir', 'DAGS_FOLDER/example/mydagname.py', 
'--cfg-path', '/tmp/tmp_7byh4dx']
   [2023-08-23, 19:29:14 UTC] {standard_task_runner.py:85} INFO - Job 1182181: 
Subtask execute_sql
   [2023-08-23, 19:29:15 UTC] {task_command.py:410} INFO - Running 
<TaskInstance: mydagname.execute_sql 
mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195 
[running]> on host 9b9bc0eb214e
   [2023-08-23, 19:29:15 UTC] {taskinstance.py:1545} INFO - Exporting env vars: 
AIRFLOW_CTX_DAG_EMAIL='[email protected]' AIRFLOW_CTX_DAG_OWNER='admin' 
AIRFLOW_CTX_DAG_ID='mydagname' AIRFLOW_CTX_TASK_ID='execute_sql' 
AIRFLOW_CTX_EXECUTION_DATE='2023-08-22T01:36:30+00:00' 
AIRFLOW_CTX_TRY_NUMBER='1' 
AIRFLOW_CTX_DAG_RUN_ID='mydagname__bot_extract_sales__v10678132plzv__2023-08-21T20:36:30.234195'
   [2023-08-23, 19:29:15 UTC] {sql.py:265} INFO - censored
   [2023-08-23, 19:29:16 UTC] {base.py:73} INFO - Using connection ID 'conn' 
for task execution.
   [2023-08-23, 19:29:16 UTC] {base.py:73} INFO - Using connection ID 'conn' 
for task execution.
   [2023-08-23, 19:29:16 UTC] {sql.py:374} INFO - censored
   [2023-08-23, 19:29:16 UTC] {sql.py:383} INFO - Rows affected: 1
   [2023-08-23, 19:29:16 UTC] {taskinstance.py:1345} INFO - Marking task as 
SUCCESS. dag_id=mydagname, task_id=execute_sql, execution_date=20230822T013630, 
start_date=20230823T192914, end_date=20230823T192916
   [2023-08-23, 19:29:16 UTC] {local_task_job_runner.py:225} INFO - Task exited 
with return code 0
   [2023-08-23, 19:29:16 UTC] {taskinstance.py:2653} INFO - 0 downstream tasks 
scheduled from follow-on schedule check


   AIRFLOW 2.7.0 TASK RUN LOGS
   
   TASK: fetch_header
   
   90c66b7612c1
   *** Found local files:
   ***   * 
/opt/airflow/logs/dag_id=mydagname/run_id=mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083/task_id=fetch_header/attempt=1.log
   [2023-08-23, 18:23:04 UTC] {taskinstance.py:1159} INFO - Dependencies all 
met for dep_context=non-requeueable deps ti=<TaskInstance: 
mydagname.fetch_header 
mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083 
[queued]>
   [2023-08-23, 18:23:04 UTC] {taskinstance.py:1159} INFO - Dependencies all 
met for dep_context=requeueable deps ti=<TaskInstance: mydagname.fetch_header 
mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083 
[queued]>
   [2023-08-23, 18:23:04 UTC] {taskinstance.py:1361} INFO - Starting attempt 1 
of 1
   [2023-08-23, 18:23:04 UTC] {taskinstance.py:1382} INFO - Executing 
<Task(PythonOperator): fetch_header> on 2023-08-22 01:36:29+00:00
   [2023-08-23, 18:23:04 UTC] {standard_task_runner.py:57} INFO - Started 
process 32217 to run task
   [2023-08-23, 18:23:04 UTC] {standard_task_runner.py:84} INFO - Running: 
['***', 'tasks', 'run', 'mydagname', 'fetch_header', 
'mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083', 
'--job-id', '1180119', '--raw', '--subdir', 'DAGS_FOLDER/example/mydagname.py', 
'--cfg-path', '/tmp/tmp6qr0sdyy']
   [2023-08-23, 18:23:04 UTC] {standard_task_runner.py:85} INFO - Job 1180119: 
Subtask fetch_header
   [2023-08-23, 18:23:35 UTC] {task_command.py:415} INFO - Running 
<TaskInstance: mydagname.fetch_header 
mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083 
[running]> on host 90c66b7612c1
   [2023-08-23, 18:24:06 UTC] {taskinstance.py:1660} INFO - Exporting env vars: 
AIRFLOW_CTX_DAG_EMAIL='[email protected]' AIRFLOW_CTX_DAG_OWNER='admin' 
AIRFLOW_CTX_DAG_ID='mydagname' AIRFLOW_CTX_TASK_ID='fetch_header' 
AIRFLOW_CTX_EXECUTION_DATE='2023-08-22T01:36:29+00:00' 
AIRFLOW_CTX_TRY_NUMBER='1' 
AIRFLOW_CTX_DAG_RUN_ID='mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083'
   [2023-08-23, 18:24:06 UTC] {logging_mixin.py:151} INFO - {censored }
   [2023-08-23, 18:24:06 UTC] {python.py:194} INFO - Done. Returned value was: 
{ censored }
   [2023-08-23, 18:24:06 UTC] {taskinstance.py:1400} INFO - Marking task as 
SUCCESS. dag_id=mydagname, task_id=fetch_header, 
execution_date=20230822T013629, start_date=20230823T182304, 
end_date=20230823T182406
   [2023-08-23, 18:24:06 UTC] {local_task_job_runner.py:228} INFO - Task exited 
with return code 0
   [2023-08-23, 18:24:06 UTC] {taskinstance.py:2784} INFO - 1 downstream tasks 
scheduled from follow-on schedule check
   
   
   TASK: add_fixed_values_header
   
   90c66b7612c1
   *** Found local files:
   ***   * 
/opt/airflow/logs/dag_id=mydagname/run_id=mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083/task_id=add_fixed_values_header/attempt=1.log
   [2023-08-23, 18:25:22 UTC] {taskinstance.py:1159} INFO - Dependencies all 
met for dep_context=non-requeueable deps ti=<TaskInstance: 
mydagname.add_fixed_values_header 
mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083 
[queued]>
   [2023-08-23, 18:25:22 UTC] {taskinstance.py:1159} INFO - Dependencies all 
met for dep_context=requeueable deps ti=<TaskInstance: 
mydagname.add_fixed_values_header 
mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083 
[queued]>
   [2023-08-23, 18:25:22 UTC] {taskinstance.py:1361} INFO - Starting attempt 1 
of 1
   [2023-08-23, 18:25:22 UTC] {taskinstance.py:1382} INFO - Executing 
<Task(PythonOperator): add_fixed_values_header> on 2023-08-22 01:36:29+00:00
   [2023-08-23, 18:25:22 UTC] {standard_task_runner.py:57} INFO - Started 
process 32288 to run task
   [2023-08-23, 18:25:22 UTC] {standard_task_runner.py:84} INFO - Running: 
['***', 'tasks', 'run', 'mydagname', 'add_fixed_values_header', 
'mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083', 
'--job-id', '1180123', '--raw', '--subdir', 'DAGS_FOLDER/example/mydagname.py', 
'--cfg-path', '/tmp/tmpxx8qvkq_']
   [2023-08-23, 18:25:22 UTC] {standard_task_runner.py:85} INFO - Job 1180123: 
Subtask add_fixed_values_header
   [2023-08-23, 18:25:48 UTC] {task_command.py:415} INFO - Running 
<TaskInstance: mydagname.add_fixed_values_header 
mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083 
[running]> on host 90c66b7612c1
   [2023-08-23, 18:26:12 UTC] {taskinstance.py:1660} INFO - Exporting env vars: 
AIRFLOW_CTX_DAG_EMAIL='[email protected]' AIRFLOW_CTX_DAG_OWNER='admin' 
AIRFLOW_CTX_DAG_ID='mydagname' AIRFLOW_CTX_TASK_ID='add_fixed_values_header' 
AIRFLOW_CTX_EXECUTION_DATE='2023-08-22T01:36:29+00:00' 
AIRFLOW_CTX_TRY_NUMBER='1' 
AIRFLOW_CTX_DAG_RUN_ID='mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083'
   [2023-08-23, 18:26:12 UTC] {python.py:194} INFO - Done. Returned value was: 
{ censored }
   [2023-08-23, 18:26:12 UTC] {taskinstance.py:1400} INFO - Marking task as 
SUCCESS. dag_id=mydagname, task_id=add_fixed_values_header, 
execution_date=20230822T013629, start_date=20230823T182522, 
end_date=20230823T182612
   [2023-08-23, 18:26:12 UTC] {local_task_job_runner.py:228} INFO - Task exited 
with return code 0
   [2023-08-23, 18:26:12 UTC] {taskinstance.py:2784} INFO - 1 downstream tasks 
scheduled from follow-on schedule check
   
   
   TASK: schema_validation_header
   
   90c66b7612c1
   *** Found local files:
   ***   * 
/opt/airflow/logs/dag_id=mydagname/run_id=mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083/task_id=schema_validation_header/attempt=1.log
   [2023-08-23, 18:27:28 UTC] {taskinstance.py:1159} INFO - Dependencies all 
met for dep_context=non-requeueable deps ti=<TaskInstance: 
mydagname.schema_validation_header 
mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083 
[queued]>
   [2023-08-23, 18:27:28 UTC] {taskinstance.py:1159} INFO - Dependencies all 
met for dep_context=requeueable deps ti=<TaskInstance: 
mydagname.schema_validation_header 
mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083 
[queued]>
   [2023-08-23, 18:27:28 UTC] {taskinstance.py:1361} INFO - Starting attempt 1 
of 1
   [2023-08-23, 18:27:28 UTC] {taskinstance.py:1382} INFO - Executing 
<Task(PythonOperator): schema_validation_header> on 2023-08-22 01:36:29+00:00
   [2023-08-23, 18:27:28 UTC] {standard_task_runner.py:57} INFO - Started 
process 32360 to run task
   [2023-08-23, 18:27:28 UTC] {standard_task_runner.py:84} INFO - Running: 
['***', 'tasks', 'run', 'mydagname', 'schema_validation_header', 
'mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083', 
'--job-id', '1180127', '--raw', '--subdir', 'DAGS_FOLDER/example/mydagname.py', 
'--cfg-path', '/tmp/tmp7fjqwb36']
   [2023-08-23, 18:27:28 UTC] {standard_task_runner.py:85} INFO - Job 1180127: 
Subtask schema_validation_header
   [2023-08-23, 18:27:54 UTC] {task_command.py:415} INFO - Running 
<TaskInstance: mydagname.schema_validation_header 
mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083 
[running]> on host 90c66b7612c1
   [2023-08-23, 18:28:20 UTC] {taskinstance.py:1660} INFO - Exporting env vars: 
AIRFLOW_CTX_DAG_EMAIL='[email protected]' AIRFLOW_CTX_DAG_OWNER='admin' 
AIRFLOW_CTX_DAG_ID='mydagname' AIRFLOW_CTX_TASK_ID='schema_validation_header' 
AIRFLOW_CTX_EXECUTION_DATE='2023-08-22T01:36:29+00:00' 
AIRFLOW_CTX_TRY_NUMBER='1' 
AIRFLOW_CTX_DAG_RUN_ID='mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083'
   [2023-08-23, 18:28:20 UTC] {python.py:194} INFO - Done. Returned value was: 
{ censored }
   [2023-08-23, 18:28:20 UTC] {taskinstance.py:1400} INFO - Marking task as 
SUCCESS. dag_id=mydagname, task_id=schema_validation_header, 
execution_date=20230822T013629, start_date=20230823T182728, 
end_date=20230823T182820
   [2023-08-23, 18:28:20 UTC] {local_task_job_runner.py:228} INFO - Task exited 
with return code 0
   [2023-08-23, 18:28:20 UTC] {taskinstance.py:2784} INFO - 1 downstream tasks 
scheduled from follow-on schedule check
   
   
   TASK: generate_sql_header
   
   *** Found local files:
   ***   * 
/opt/airflow/logs/dag_id=mydagname/run_id=mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083/task_id=generate_sql_header/attempt=1.log
   [2023-08-23, 18:29:32 UTC] {taskinstance.py:1159} INFO - Dependencies all 
met for dep_context=non-requeueable deps ti=<TaskInstance: 
mydagname.generate_sql_header 
mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083 
[queued]>
   [2023-08-23, 18:29:32 UTC] {taskinstance.py:1159} INFO - Dependencies all 
met for dep_context=requeueable deps ti=<TaskInstance: 
mydagname.generate_sql_header 
mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083 
[queued]>
   [2023-08-23, 18:29:32 UTC] {taskinstance.py:1361} INFO - Starting attempt 1 
of 1
   [2023-08-23, 18:29:32 UTC] {taskinstance.py:1382} INFO - Executing 
<Task(PythonOperator): generate_sql_header> on 2023-08-22 01:36:29+00:00
   [2023-08-23, 18:29:32 UTC] {standard_task_runner.py:57} INFO - Started 
process 32423 to run task
   [2023-08-23, 18:29:32 UTC] {standard_task_runner.py:84} INFO - Running: 
['***', 'tasks', 'run', 'mydagname', 'generate_sql_header', 
'mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083', 
'--job-id', '1180131', '--raw', '--subdir', 'DAGS_FOLDER/example/mydagname.py', 
'--cfg-path', '/tmp/tmpq9g7hxl_']
   [2023-08-23, 18:29:32 UTC] {standard_task_runner.py:85} INFO - Job 1180131: 
Subtask generate_sql_header
   [2023-08-23, 18:30:00 UTC] {task_command.py:415} INFO - Running 
<TaskInstance: mydagname.generate_sql_header 
mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083 
[running]> on host 90c66b7612c1
   [2023-08-23, 18:30:41 UTC] {taskinstance.py:1660} INFO - Exporting env vars: 
AIRFLOW_CTX_DAG_EMAIL='[email protected]' AIRFLOW_CTX_DAG_OWNER='admin' 
AIRFLOW_CTX_DAG_ID='mydagname' AIRFLOW_CTX_TASK_ID='generate_sql_header' 
AIRFLOW_CTX_EXECUTION_DATE='2023-08-22T01:36:29+00:00' 
AIRFLOW_CTX_TRY_NUMBER='1' 
AIRFLOW_CTX_DAG_RUN_ID='mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083'
   [2023-08-23, 18:30:41 UTC] {python.py:194} INFO - Done. Returned value was: 
censored
   [2023-08-23, 18:30:41 UTC] {taskinstance.py:1400} INFO - Marking task as 
SUCCESS. dag_id=mydagname, task_id=generate_sql_header, 
execution_date=20230822T013629, start_date=20230823T182932, 
end_date=20230823T183041
   [2023-08-23, 18:30:42 UTC] {local_task_job_runner.py:228} INFO - Task exited 
with return code 0
   [2023-08-23, 18:30:42 UTC] {taskinstance.py:2784} INFO - 0 downstream tasks 
scheduled from follow-on schedule check
   
   
   TASK: execute_sql
   
   90c66b7612c1
   *** Found local files:
   ***   * 
/opt/airflow/logs/dag_id=mydagname/run_id=mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083/task_id=execute_sql/attempt=1.log
   [2023-08-23, 18:32:47 UTC] {taskinstance.py:1159} INFO - Dependencies all 
met for dep_context=non-requeueable deps ti=<TaskInstance: 
mydagname.execute_sql 
mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083 
[queued]>
   [2023-08-23, 18:32:47 UTC] {taskinstance.py:1159} INFO - Dependencies all 
met for dep_context=requeueable deps ti=<TaskInstance: mydagname.execute_sql 
mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083 
[queued]>
   [2023-08-23, 18:32:47 UTC] {taskinstance.py:1361} INFO - Starting attempt 1 
of 1
   [2023-08-23, 18:32:47 UTC] {taskinstance.py:1382} INFO - Executing 
<Task(PostgresOperator): execute_sql> on 2023-08-22 01:36:29+00:00
   [2023-08-23, 18:32:47 UTC] {standard_task_runner.py:57} INFO - Started 
process 32534 to run task
   [2023-08-23, 18:32:47 UTC] {standard_task_runner.py:84} INFO - Running: 
['***', 'tasks', 'run', 'mydagname', 'execute_sql', 
'mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083', 
'--job-id', '1180141', '--raw', '--subdir', 'DAGS_FOLDER/example/mydagname.py', 
'--cfg-path', '/tmp/tmpfmvj5dnz']
   [2023-08-23, 18:32:47 UTC] {standard_task_runner.py:85} INFO - Job 1180141: 
Subtask execute_sql
   [2023-08-23, 18:33:22 UTC] {task_command.py:415} INFO - Running 
<TaskInstance: mydagname.execute_sql 
mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083 
[running]> on host 90c66b7612c1
   [2023-08-23, 18:33:43 UTC] {taskinstance.py:1660} INFO - Exporting env vars: 
AIRFLOW_CTX_DAG_EMAIL='[email protected]' AIRFLOW_CTX_DAG_OWNER='admin' 
AIRFLOW_CTX_DAG_ID='mydagname' AIRFLOW_CTX_TASK_ID='execute_sql' 
AIRFLOW_CTX_EXECUTION_DATE='2023-08-22T01:36:29+00:00' 
AIRFLOW_CTX_TRY_NUMBER='1' 
AIRFLOW_CTX_DAG_RUN_ID='mydagname__bot_extract_sales__v10678108plzv__2023-08-21T20:36:29.046083'
   [2023-08-23, 18:33:43 UTC] {sql.py:274} INFO - Executing: censored
   [2023-08-23, 18:33:43 UTC] {base.py:73} INFO - Using connection ID 'conn' 
for task execution.
   [2023-08-23, 18:33:43 UTC] {base.py:73} INFO - Using connection ID 'conn' 
for task execution.
   [2023-08-23, 18:33:43 UTC] {sql.py:418} INFO - Running statement: censored
   [2023-08-23, 18:33:43 UTC] {sql.py:427} INFO - Rows affected: 1
   [2023-08-23, 18:33:43 UTC] {taskinstance.py:1400} INFO - Marking task as 
SUCCESS. dag_id=mydagname, task_id=execute_sql, execution_date=20230822T013629, 
start_date=20230823T183247, end_date=20230823T183343
   [2023-08-23, 18:33:43 UTC] {local_task_job_runner.py:228} INFO - Task exited 
with return code 0
   [2023-08-23, 18:33:44 UTC] {taskinstance.py:2784} INFO - 0 downstream tasks 
scheduled from follow-on schedule check
   ``` 
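   
   The slowdown shows up as gaps between consecutive log lines: on 2.7.0 there is roughly 31 s between `Started process` and `Running <TaskInstance ...>`, and another ~31 s before `Exporting env vars`, while on 2.6.3 the same steps happen within a second. To make those gaps easy to spot, here is a small throwaway script (just a sketch I put together, not part of Airflow) that only assumes the `[YYYY-MM-DD, HH:MM:SS UTC]` timestamp format shown in the logs above:
   
   ```python
   import re
   import sys
   from datetime import datetime
   
   # Matches the "[2023-08-23, 18:23:04 UTC]" prefix used in the task logs above.
   TS_RE = re.compile(r"\[(\d{4}-\d{2}-\d{2}), (\d{2}:\d{2}:\d{2}) UTC\]")
   
   def gaps(log_text):
       """Yield (seconds_since_previous_line, line) for each timestamped log line."""
       prev = None
       for line in log_text.splitlines():
           m = TS_RE.search(line)
           if not m:
               continue
           ts = datetime.strptime(" ".join(m.groups()), "%Y-%m-%d %H:%M:%S")
           yield (ts - prev).total_seconds() if prev else 0.0, line.strip()
           prev = ts
   
   if __name__ == "__main__":
       # Usage: paste a task log into stdin, e.g. `python log_gaps.py < task.log`
       for delta, line in gaps(sys.stdin.read()):
           marker = "  <-- slow" if delta >= 10 else ""
           print(f"+{delta:5.0f}s  {line[:100]}{marker}")
   ```
   
   Piping the 2.7.0 `fetch_header` log through it flags the two ~31 s pauses, i.e. about a minute of per-task overhead that simply isn't there on 2.6.3.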
   
   

