```
[2023-04-29 11:51:26,584: INFO/ForkPoolWorker-1] Task airflow.executors.celery_executor.execute_command[2a8bbfc1-4c3c-4123-bdc8-36dc6fe7799d] succeeded in 59.32762966118753s: None
```
Hello,

I deployed Airflow with Helm in Kubernetes. All other configuration files are the defaults. I can see some logs in Datadog. However, the task logs are not sent to Datadog: I can see that `/opt/airflow/logs` contains a lot of task log files, but they are never shipped. It seems only stdout logs are uploaded to Datadog. What is the right way to upload the task logs, and what is missing here?
The "Containerized" part of the documentation is not very clear on this: https://docs.datadoghq.com/integrations/airflow/?tab=containerized#troubleshooting
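For context, my current understanding (from Datadog's Kubernetes log collection docs) is that the Agent only tails container stdout/stderr by default, so files written to `/opt/airflow/logs` are invisible to it unless a file-tail config is added. A sketch of the kind of config I believe is needed; the mount path and service/source names here are my assumptions, not something from my actual deployment:

```yaml
# Sketch of a Datadog Agent logs config (e.g. conf.d/airflow.d/conf.yaml),
# assuming the Airflow logs volume is also mounted into the Agent pod at
# /opt/airflow/logs and log collection is enabled on the Agent.
logs:
  - type: file
    path: /opt/airflow/logs/**/*.log   # tail the task log files on disk
    service: airflow                   # assumed service name
    source: airflow                    # assumed source for pipeline matching
```

Is something like this the intended approach, or is there a supported way to ship Airflow task logs without mounting the log directory into the Agent?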