obarisk opened a new issue, #53634:
URL: https://github.com/apache/airflow/issues/53634

   ### Apache Airflow Provider(s)
   
   google
   
   ### Versions of Apache Airflow Providers
   
   16.1.0
   
   ### Apache Airflow version
   
   3.0.3
   
   ### Operating System
   
   debian
   
   ### Deployment
   
   Other Docker-based deployment
   
   ### Deployment details
   
   _No response_
   
   ### What happened
   
   For a DAG scheduled on an Asset (no time-based schedule), `logical_date`
   is not available in `dag_run` / `context` for asset-triggered runs.
   
   `GCSToBigQueryOperator` uses `context['logical_date']` to generate the
   `job_id`, so it raises an error for DAGs triggered by an asset.
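   The failing pattern, and one possible defensive shape for it, can be sketched as follows. This is an illustrative helper only, not the provider's actual code; the name `pick_job_timestamp` and the fallback to the current UTC time are assumptions for the sketch:
   
   ```python
   from datetime import datetime, timezone
   
   
   def pick_job_timestamp(context: dict) -> datetime:
       """Return a timestamp usable for job_id generation.
   
       Hypothetical helper: for asset-triggered runs, `logical_date`
       can be missing or None in the context, so instead of raising
       (the behavior reported here), fall back to the current UTC time.
       """
       logical_date = context.get("logical_date")
       if logical_date is not None:
           return logical_date
       # Asset-triggered run: no logical_date available, use "now" instead
       return datetime.now(timezone.utc)
   
   
   # Asset-triggered run: empty context, falls back instead of raising
   print(pick_job_timestamp({}))
   # Scheduled/manual run: the provided logical_date is used as-is
   print(pick_job_timestamp({"logical_date": datetime(2025, 7, 1, tzinfo=timezone.utc)}))
   ```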
   
   ### What you think should happen instead
   
   no error
   
   ### How to reproduce
   
   Triggering the DAG directly works (a `logical_date` is set).
   Triggering the DAG by updating the asset raises the error.
   
   ```python
   from airflow.sdk import DAG, Asset
   from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
     GCSToBigQueryOperator
   )
   
   source_asset = Asset("/source_data")
   
   with DAG(
     dag_id="no_logical_date",
     schedule=[source_asset],
     tags=["debug"]
   ):
     gcs_to_gbq = GCSToBigQueryOperator(
       task_id="gcs_to_gbq",
       bucket="example",
       source_objects=["some_data.csv"],
       destination_project_dataset_table="demo.example_table",
       gcp_conn_id="gcp-default",
     )
   ```
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

