potiuk commented on code in PR #46891: URL: https://github.com/apache/airflow/pull/46891#discussion_r2213049623
##########
airflow-core/tests/unit/api_fastapi/core_api/routes/public/test_dag_run.py:
##########

```diff
@@ -82,6 +88,7 @@
 @pytest.fixture(autouse=True)
 @provide_session
 def setup(request, dag_maker, session=None):
+    clear_db_connections()
```

Review Comment:
I did some bisecting and investigation and I think I know why it worked before and what minimum change (adding flush()) should be done to make it work on 3.13.

1) The session object is not changing -> as expected. Regardless of whether `create_session` is called in the session fixture or in `clear_db_connections`, we retrieve the same session stored in the thread local (the same object is printed throughout the whole pytest run):

<img width="937" height="780" alt="Screenshot 2025-07-17 at 11 54 53" src="https://github.com/user-attachments/assets/3801e426-06fc-4f59-b810-715134dbd0f2" />

2) But when running this command:

```
pytest airflow-core/tests/unit/api_fastapi/core_api/routes/public/test_dag_run.py::TestPatchDagRun::test_patch_dag_run airflow-core/tests/unit/api_fastapi/core_api/routes/public/test_dag_run.py::TestGetDagRunAssetTriggerEvents
```

* On Python 3.12 -> everything is fine
* On Python 3.13 -> the same command fails

3) Either adding commit() in `configure_git_connection_for_dag_bundle` or adding `clear_db_connections` in the setup of `test_dag_run.py` fixes the tests.

4) Finally I found out that what also solves the problem is adding flush() after yielding from `configure_git_connection_for_dag_bundle` - just before `clear_db_connections`.
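Point 1's observation (the same session object printed throughout the run) is exactly what SQLAlchemy's `scoped_session` gives: within one thread, the registry always hands back the identical `Session` instance. A minimal sketch (plain SQLAlchemy, not Airflow's actual session setup):

```python
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker

engine = create_engine("sqlite://")
Session = scoped_session(sessionmaker(bind=engine))

# Two "separate" lookups in the same thread return the very same object,
# which is why the identical session id shows up throughout the pytest run.
s1 = Session()
s2 = Session()
print(s1 is s2)  # True
```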
So it seems that `session.query(Connection).delete()` did not delete the object that was not flushed - which is EXPECTED - because when you add an object and do not flush it, it is in the pending state; only after flush is it moved to the persistent state.

5) After a bit closer look, I think I know exactly what happens. On Python 3.13 we skip initialization of flask plugins, logging the "Some Airflow 2 plugins have been detected in your environment" info instead. And the thing is that this initialization happens every time we run those tests, when test dags are created via dag_maker:

```
[2025-07-17T10:38:32.664+0000] {dagbag.py:620} INFO - Filling up the DagBag from /dev/null
[2025-07-17T10:38:32.674+0000] {collection.py:83} INFO - Creating ORM DAG for test_dag1
[2025-07-17T10:38:32.714+0000] {collection.py:83} INFO - Creating ORM DAG for test_dag2
[2025-07-17T10:38:32.791+0000] {app.py:131} INFO - Some Airflow 2 plugins have been detected in your environment. Currently FAB provider does not support Python 3.13, so you cannot use Airflow 2 plugins with Airflow 3 until FAB provider will be Python 3.13 compatible.
```
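The pending-vs-persistent transition described above can be observed directly with SQLAlchemy's `inspect()`. A minimal sketch with a toy model - `Conn` here is an illustrative stand-in, not Airflow's real `Connection`:

```python
from sqlalchemy import Column, Integer, String, create_engine, inspect
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Conn(Base):  # illustrative stand-in for airflow.models.Connection
    __tablename__ = "conn"
    id = Column(Integer, primary_key=True)
    conn_id = Column(String)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = Session(engine)

c = Conn(conn_id="git_default")
session.add(c)

st = inspect(c)
print(st.pending, st.persistent)   # True False: added but not yet flushed

session.flush()
print(st.pending, st.persistent)   # False True: flush issued the INSERT
```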
```
[2025-07-17T10:38:33.070+0000] {collection.py:83} INFO - Creating ORM DAG for source_dag
```

This is done via:

```python
@pytest.fixture
def unauthorized_test_client(request):
    with conf_vars(
        {
            (
                "core",
                "auth_manager",
            ): "airflow.api_fastapi.auth.managers.simple.simple_auth_manager.SimpleAuthManager",
        }
    ):
        app = create_app()
        auth_manager: SimpleAuthManager = app.state.auth_manager
        token = auth_manager._get_token_signer().generate(
            auth_manager.serialize_user(SimpleAuthManagerUser(username="dummy", role=None))
        )
        yield TestClient(
            app, headers={"Authorization": f"Bearer {token}"}, base_url=f"{BASE_URL}{get_api_path(request)}"
        )
```

And when `initialize_flask_plugins` is called, flush is executed automatically during "permissions syncing".
That's why on Python 3.10-3.12, when the FAB provider is present and the FAB app gets initialized, this flush (and commit) was executed automatically via other fixtures - and this is why it worked previously.

The interesting thing is that we CANNOT just flush() before yielding - because that would leave the database locked, and some of the tests executing "patch" would fail with "database locked". They use the test FastAPI client, and that client uses an "unscoped" session - a different session, not the same as the thread-local one - so when the REST API call is executed via the test client, it finds the database locked.

So the minimum change to fix it is to (just in case) `flush` the session just before `clear_db_connections` - in case neither flush nor commit was called by any of the other fixtures. This is what I am doing now (and added an appropriate comment).
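The minimal fix described above - flush after the fixture's yield, just before clearing connections - can be sketched roughly like this. The model, session setup, and fixture body are simplified stand-ins, not Airflow's real code; the generator is driven manually here instead of by pytest:

```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Conn(Base):  # illustrative stand-in for airflow.models.Connection
    __tablename__ = "conn"
    id = Column(Integer, primary_key=True)
    conn_id = Column(String)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = Session(engine)

def configure_git_connection_for_dag_bundle():
    """Shaped like the pytest fixture discussed above (illustrative)."""
    session.add(Conn(conn_id="git_default"))  # pending, not yet flushed
    yield
    # The fix: make sure pending rows are persisted before the bulk DELETE,
    # since on 3.13 nothing else (e.g. FAB permission syncing) flushes now.
    session.flush()
    session.query(Conn).delete()  # what clear_db_connections does
    session.commit()

gen = configure_git_connection_for_dag_bundle()
next(gen)        # fixture setup
next(gen, None)  # fixture teardown: flush + clear
print(session.query(Conn).count())  # 0: the connection was actually cleared
```

Without the `session.flush()` in teardown, the DELETE issued to the database could miss the still-pending row, which matches the behavior seen in the failing tests.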
