[
https://issues.apache.org/jira/browse/AIRFLOW-660?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Aizhamal Nurmamat kyzy updated AIRFLOW-660:
-------------------------------------------
Component/s: database (was: db)
> Impossible to record second task failure
> ----------------------------------------
>
> Key: AIRFLOW-660
> URL: https://issues.apache.org/jira/browse/AIRFLOW-660
> Project: Apache Airflow
> Issue Type: Bug
> Components: database
> Affects Versions: 1.8.0
> Reporter: Alexander Shorin
> Priority: Blocker
>
> {code}
> /var/log/airflow/airflow_scheduler_err.log.10: [SQL: 'INSERT INTO task_fail (task_id, dag_id, execution_date, start_date, end_date, duration) VALUES (%(task_id)s, %(dag_id)s, %(execution_date)s, %(start_date)s, %(end_date)s, %(duration)s)'] [parameters: {'task_id': 'test_task', 'end_date': datetime.datetime(2016, 11, 30, 14, 38, 39, 197485), 'execution_date': datetime.datetime(2016, 11, 30, 0, 0), 'duration': 331.723087, 'start_date': datetime.datetime(2016, 11, 30, 14, 33, 7, 474398), 'dag_id': 'test_dag'}]
> /var/log/airflow/airflow_scheduler_err.log.10-Process DagFileProcessor314-Process:
> /var/log/airflow/airflow_scheduler_err.log.10-Traceback (most recent call last):
> /var/log/airflow/airflow_scheduler_err.log.10-  File "/usr/local/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
> /var/log/airflow/airflow_scheduler_err.log.10-    self.run()
> /var/log/airflow/airflow_scheduler_err.log.10-  File "/usr/local/lib/python2.7/multiprocessing/process.py", line 114, in run
> /var/log/airflow/airflow_scheduler_err.log.10-    self._target(*self._args, **self._kwargs)
> /var/log/airflow/airflow_scheduler_err.log.10-  File "/usr/local/lib/python2.7/site-packages/airflow/jobs.py", line 318, in helper
> /var/log/airflow/airflow_scheduler_err.log.10-    pickle_dags)
> /var/log/airflow/airflow_scheduler_err.log.10-  File "/usr/local/lib/python2.7/site-packages/airflow/utils/db.py", line 56, in wrapper
> /var/log/airflow/airflow_scheduler_err.log.10-    session.commit()
> /var/log/airflow/airflow_scheduler_err.log.10-  File "/usr/local/lib/python2.7/site-packages/sqlalchemy/orm/session.py", line 813, in commit
> /var/log/airflow/airflow_scheduler_err.log.10-    self.transaction.commit()
> /var/log/airflow/airflow_scheduler_err.log.10-  File "/usr/local/lib/python2.7/site-packages/sqlalchemy/orm/session.py", line 390, in commit
> /var/log/airflow/airflow_scheduler_err.log.10-    self._assert_active(prepared_ok=True)
> /var/log/airflow/airflow_scheduler_err.log.10-  File "/usr/local/lib/python2.7/site-packages/sqlalchemy/orm/session.py", line 214, in _assert_active
> /var/log/airflow/airflow_scheduler_err.log.10-    % self._rollback_exception
> /var/log/airflow/airflow_scheduler_err.log.10:InvalidRequestError: This Session's transaction has been rolled back due to a previous exception during flush. To begin a new transaction with this Session, first issue Session.rollback(). Original exception was: (psycopg2.IntegrityError) duplicate key value violates unique constraint "task_fail_pkey"
> /var/log/airflow/airflow_scheduler_err.log.10-DETAIL: Key (task_id, dag_id, execution_date)=(test_dag, test_task, 2016-11-30 00:00:00) already exists.
> {code}
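> The constraint behaviour can be reproduced outside Airflow. Below is a minimal, self-contained sketch: it uses the stdlib {{sqlite3}} module in place of Postgres, with the {{task_fail}} table name and column names taken from the log above; the {{record_failure}} helper and everything else is illustrative, not Airflow's actual code. The composite primary key means a second failure for the same (task_id, dag_id, execution_date) can never be inserted, and unless the caller rolls back after the {{IntegrityError}}, the session is dead, which is the {{InvalidRequestError}} in the traceback.
> {code}
import datetime
import sqlite3

# Schema mirroring the "task_fail_pkey" constraint from the log: the
# composite primary key allows only one failure row per
# (task_id, dag_id, execution_date).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE task_fail (
        task_id        TEXT,
        dag_id         TEXT,
        execution_date TIMESTAMP,
        start_date     TIMESTAMP,
        end_date       TIMESTAMP,
        duration       REAL,
        PRIMARY KEY (task_id, dag_id, execution_date)
    )
""")


def record_failure(conn, task_id, dag_id, execution_date, duration):
    """Illustrative helper: insert a task_fail row, report if it was accepted."""
    try:
        # "with conn" commits on success and rolls back on exception,
        # so the connection stays usable after a rejected insert.
        with conn:
            conn.execute(
                "INSERT INTO task_fail (task_id, dag_id, execution_date, duration) "
                "VALUES (?, ?, ?, ?)",
                (task_id, dag_id, execution_date, duration),
            )
        return True
    except sqlite3.IntegrityError:
        # Letting this escape without a rollback is what leaves the ORM
        # session unusable in the scheduler log above.
        return False


when = datetime.datetime(2016, 11, 30).isoformat()
first = record_failure(conn, "test_task", "test_dag", when, 331.723087)
second = record_failure(conn, "test_task", "test_dag", when, 12.3)
print(first, second)  # the duplicate key makes the second insert fail
{code}
> A surrogate auto-increment key on task_fail (instead of the composite primary key) would allow every failure to be recorded; which fix is right here is for the maintainers to decide.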
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)