amoghrajesh commented on code in PR #59711:
URL: https://github.com/apache/airflow/pull/59711#discussion_r3071544131


##########
airflow-core/src/airflow/models/trigger.py:
##########
@@ -417,15 +426,14 @@ def handle_event_submit(event: TriggerEvent, *, task_instance: TaskInstance, ses
     :param task_instance: The task instance to handle the submit event for.
     :param session: The session to be used for the database callback sink.
     """
+    from airflow.sdk.serde import serialize
     from airflow.utils.state import TaskInstanceState
 
     # Get the next kwargs of the task instance, or an empty dictionary if it doesn't exist
     next_kwargs = task_instance.next_kwargs or {}
 
-    # Add the event's payload into the kwargs for the task
-    next_kwargs["event"] = event.payload
-
     # Update the next kwargs of the task instance
+    next_kwargs["event"] = serialize(event.payload)

Review Comment:
   Post-mortem on the issue we hit.
   
   The test above verified what the API server wrote to the DB, not what an old worker read back from the exec API. The full round-trip is:
   
   Trigger fires on new server
       → `handle_event_submit` writes sdk serde to DB (tested)
       → Old worker calls `/execution/task-instances/{id}/run`
       → exec API serves `next_kwargs` as plain sdk serde dict
       → Old worker calls `BaseSerialization.deserialize(next_kwargs)`
       → KeyError: __var (not tested)
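   The failure at the last step can be sketched with toy stand-ins. The envelope key names below are illustrative assumptions, not Airflow's actual wire formats, but they show why `BaseSerialization.deserialize` blows up on data it did not write:
   
   ```python
   # Toy sketch of the version-skew failure described above. The envelope
   # key names are illustrative assumptions, not real Airflow formats.
   
   def new_sdk_serde_serialize(payload):
       # New server: wraps the payload in an sdk-serde style envelope.
       return {"__classname__": type(payload).__name__, "__data__": payload}
   
   def old_base_serialization_deserialize(encoded):
       # Old worker: expects BaseSerialization's {"__type", "__var"} envelope
       # and indexes it directly, so an unknown envelope raises KeyError.
       return encoded["__var"]
   
   stored = new_sdk_serde_serialize({"status": "success"})  # written to DB
   try:
       old_base_serialization_deserialize(stored)  # old worker reads it back
   except KeyError as err:
       print(f"KeyError: {err}")  # the failure mode the test never exercised
   ```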
   
   
   The test was a rolling-upgrade test, but it only checked one direction: can the new server read old data? The missing direction was: can an old worker read new data coming back from the API?
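
   A rolling-upgrade check that covers both directions could look like this sketch. The codec functions are hypothetical placeholders for the real server/worker serializers, not Airflow APIs:

   ```python
   # Hypothetical sketch of a bidirectional compatibility check; the four
   # codec functions below are placeholders, not real Airflow code.

   def old_format_encode(payload):
       # Stand-in for what an old server writes (BaseSerialization-style).
       return {"__type": "dict", "__var": payload}

   def new_format_encode(payload):
       # Stand-in for what the new server writes (sdk-serde style).
       return {"__classname__": "dict", "__data__": payload}

   def new_format_decode(encoded):
       # Stand-in for the new reader, which accepts both envelopes.
       if "__var" in encoded:
           return encoded["__var"]
       return encoded["__data__"]

   def old_format_decode(encoded):
       # Stand-in for the old worker's reader: only knows the old envelope.
       return encoded["__var"]

   payload = {"status": "success"}

   # Direction 1 (was tested): new server reads old data.
   assert new_format_decode(old_format_encode(payload)) == payload

   # Direction 2 (was missing): old worker reads new data.
   try:
       old_format_decode(new_format_encode(payload))
   except KeyError:
       print("old worker cannot read new data")  # the uncovered failure
   ```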



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
