CloudDataFusionStartPipelineOperator.partial().expand_kwargs() not working in airflow 2.9.1 #43614
-
Thanks for opening your first issue here! Be sure to follow the issue template! If you are willing to raise a PR to address this issue, please do so; no need to wait for approval.
-
What version of the Google provider are you using?
-
Not sure how to check, but we import CloudDataFusionStartPipelineOperator from airflow.providers.google.cloud.operators.datafusion and we use GCP Cloud Composer, @RNHTTR. The Data Fusion version we use in the pipeline is 6.9.2, with 8192 MB memory and 4 virtual cores, and driver resources of 8192 MB memory and 4 virtual cores.
-
You can see the provider's version in the Airflow UI. There is also a changelog for the Google provider (https://airflow.apache.org/docs/apache-airflow-providers-google/stable/changelog.html), which is the provider that interfaces with Cloud Data Fusion. The Google team tests and runs the providers (including system-testing them); they also manage Cloud Data Fusion and Composer, so Composer should be your support channel, especially since you have paid support with them. I suggest you open a support ticket with them: this looks like a problem that squarely falls under the support of the service you pay for, rather than under volunteer-run, free open-source software. Converting this into a discussion in case more discussion is needed.
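If you prefer checking from code rather than the UI, here is a minimal sketch using only the standard library; it assumes the provider package is installed in the same Python environment your Airflow runs in:

```python
# Minimal sketch: print the installed Google provider version.
# Assumes apache-airflow-providers-google is installed in this environment.
from importlib.metadata import version

print(version("apache-airflow-providers-google"))
```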
-
Also @VladaZakharova -> maybe you and your team know something about it?
-
Apache Airflow version
Other Airflow 2 version (please specify below)
If "Other Airflow 2 version" selected, which one?
2.9.1
What happened?
Our DAG worked fine in Airflow 2.5.3, but when we created a new Composer 2.9.5 environment with Airflow 2.9.1, the same DAG and code stopped working. Only one task fails: the one that receives its arguments dynamically from the previous task:

```python
CloudDataFusionStartPipelineOperator.partial(
    task_id='start_cdfpipeline_name',
    location='us-central1',
    pipeline_timeout=900,
).expand_kwargs(previous_task.output)
```

It fails with `airflow.exceptions.AirflowException: Starting a pipeline failed with code 400`.
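For reference, here is a minimal self-contained sketch of the dynamic-mapping pattern involved. The DAG name, the `build_pipeline_kwargs` task, and the pipeline/instance names and runtime args are hypothetical placeholders, not taken from our actual DAG:

```python
# Sketch of CloudDataFusionStartPipelineOperator with dynamic task mapping.
# All names below are placeholders for illustration only.
import pendulum

from airflow.decorators import dag, task
from airflow.providers.google.cloud.operators.datafusion import (
    CloudDataFusionStartPipelineOperator,
)


@dag(schedule=None, start_date=pendulum.datetime(2024, 1, 1), catchup=False)
def start_cdf_pipelines():
    @task
    def build_pipeline_kwargs():
        # expand_kwargs() expects a list of dicts; each dict becomes the
        # keyword arguments of one mapped task instance.
        return [
            {
                "pipeline_name": "pipeline_a",
                "instance_name": "my-datafusion-instance",
                "runtime_args": {"input.path": "gs://my-bucket/a"},
            },
            {
                "pipeline_name": "pipeline_b",
                "instance_name": "my-datafusion-instance",
                "runtime_args": {"input.path": "gs://my-bucket/b"},
            },
        ]

    # Static arguments go into partial(); per-instance arguments are mapped.
    CloudDataFusionStartPipelineOperator.partial(
        task_id="start_cdf_pipeline",
        location="us-central1",
        pipeline_timeout=900,
    ).expand_kwargs(build_pipeline_kwargs())


start_cdf_pipelines()
```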
What you think should happen instead?
No response
How to reproduce
Please help us fix this issue.
Operating System
N/A
Versions of Apache Airflow Providers
2.9.1
Deployment
Official Apache Airflow Helm Chart
Deployment details
No response
Anything else?
No response
Are you willing to submit PR?
Code of Conduct