
Adding ingress_annotations to awx spec causes deployment failure on Operator 2.11.0+ #1713

Closed
3 tasks done
USB-Coffee opened this issue Feb 15, 2024 · 7 comments · Fixed by #1715

Comments

@USB-Coffee

Please confirm the following

  • I agree to follow this project's code of conduct.
  • I have checked the current issues for duplicates.
  • I understand that the AWX Operator is open source software provided for free and that I might not receive a timely response.

Bug Summary

An existing production instance of AWX fails to deploy with either of the last two operator releases, with no changes to any file within the AWX operator repository.
Checking out 2.11.0 or 2.12.0 and running make deploy fails with a stacktrace; re-deploying 2.10.0 in the same environment succeeds.

I do not believe this is a duplicate of #14876
Happy to try and extract more information if anybody has an idea.

AWX Operator version

2.10.0

AWX version

23.6.0

Kubernetes platform

kubernetes

Kubernetes/Platform version

v1.26.11 +k3s2

Modifications

no

Steps to reproduce

pull the latest version of the operator
git checkout 2.11.0
make deploy

Expected results

New operator deploys successfully

Actual results

Stacktrace and errors when attempting to run the "Apply Resources" task.
I have narrowed it down to roles/installer/tasks/resources_configuration.yml:203, but I can't yet see anything obvious causing it.

Additional information

I have tried deleting and re-cloning the repository to rule out modified files and local errors as much as possible. The same failure occurs with 2.12.0.

I have also attempted to set no_log vars to false, but no additional information was provided that looked useful (and some tasks did not obey the new value).

Operator Logs

PLAY RECAP *********************************************************************
localhost : ok=63 changed=0 unreachable=0 failed=1 skipped=61 rescued=0 ignored=0


olicy": {}, "f:ingress_tls_secret": {}, "f:ingress_type": {}, "f:ipv6_disabled": {}, "f:ldap_cacert_secret": {}, "f:loadbalancer_port": {}, "f:loadbalancer_protocol": {}, "f:no_log": {}, "f:nodeport_port": {}, "f:postgres_configuration_secret": {}, "f:postgres_init_container_resource_requirements": {}, "f:postgres_resource_requirements": {}, "f:postgres_storage_class": {}, "f:postgres_storage_requirements": {".": {}, "f:requests": {".": {}, "f:storage": {}}}, "f:projects_existing_claim": {}, "f:projects_persistence": {}, "f:projects_storage_access_mode": {}, "f:projects_storage_size": {}, "f:replicas": {}, "f:route_tls_termination_mechanism": {}, "f:set_self_labels": {}, "f:task_extra_env": {}, "f:task_privileged": {}, "f:task_resource_requirements": {}, "f:web_extra_env": {}, "f:web_resource_requirements": {}}}, "manager": "kubectl-client-side-apply", "operation": "Update", "time": "2023-02-08T17:19:38Z"}, {"apiVersion": "awx.ansible.com/v1beta1", "fieldsType": "FieldsV1", "fieldsV1": {"f:status": {"f:adminPasswordSecret": {}, "f:adminUser": {}, "f:broadcastWebsocketSecret": {}, "f:image": {}, "f:postgresConfigurationSecret": {}, "f:secretKeySecret": {}, "f:version": {}}}, "manager": "OpenAPI-Generator", "operation": "Update", "subresource": "status", "time": "2024-01-15T16:24:21Z"}, {"apiVersion": "awx.ansible.com/v1beta1", "fieldsType": "FieldsV1", "fieldsV1": {"f:spec": {"f:ingress_annotations": {}}}, "manager": "kubectl-patch", "operation": "Update", "time": "2024-01-23T16:15:19Z"}, {"apiVersion": "awx.ansible.com/v1beta1", "fieldsType": "FieldsV1", "fieldsV1": {"f:metadata": {"f:labels": {".": {}, "f:app.kubernetes.io/component": {}, "f:app.kubernetes.io/managed-by": {}, "f:app.kubernetes.io/name": {}, "f:app.kubernetes.io/operator-version": {}, "f:app.kubernetes.io/part-of": {}}}}, "manager": "OpenAPI-Generator", "operation": "Update", "time": "2024-02-14T14:57:31Z"}, {"apiVersion": "awx.ansible.com/v1beta1", "fieldsType": "FieldsV1", "fieldsV1": 
{"f:status": {".": {}, "f:conditions": {}}}, "manager": "ansible-operator", "operation": "Update", "subresource": "status", "time": "2024-02-14T15:29:30Z"}], "name": "awx", "namespace": "awx", "resourceVersion": "154866716", "uid": "64969ac5-ee3a-4c30-93e9-407e2a31ec9d"}, "spec": {"admin_password_secret": "awx-admin-password", "admin_user": "admin", "auto_upgrade": true, "bundle_cacert_secret": "awx-custom-certs", "create_preload_data": true, "ee_extra_env": "- name: SOCIAL_AUTH_USERNAME_IS_FULL_EMAIL\n value: \"True\"\n", "ee_resource_requirements": {}, "extra_settings": [{"setting": "SOCIAL_AUTH_USERNAME_IS_FULL_EMAIL", "value": "True"}], "garbage_collect_secrets": false, "hostname": "hostname_redacted", "image_pull_policy": "IfNotPresent", "ingress_annotations": "traefik.ingress.kubernetes.io/router.middlewares: default-redirect@kubernetescrd", "ingress_tls_secret": "awx-secret-tls", "ingress_type": "ingress", "ipv6_disabled": false, "ldap_cacert_secret": "awx-custom-certs", "loadbalancer_ip": "", "loadbalancer_port": 80, "loadbalancer_protocol": "http", "no_log": true, "nodeport_port": 30080, "postgres_configuration_secret": "awx-postgres-configuration", "postgres_init_container_resource_requirements": {}, "postgres_keepalives": true, "postgres_keepalives_count": 5, "postgres_keepalives_idle": 5, "postgres_keepalives_interval": 5, "postgres_resource_requirements": {}, "postgres_storage_class": "awx-postgres-volume", "postgres_storage_requirements": {"requests": {"storage": "8Gi"}}, "projects_existing_claim": "awx-projects-claim", "projects_persistence": true, "projects_storage_access_mode": "ReadWriteMany", "projects_storage_size": "8Gi", "replicas": 1, "route_tls_termination_mechanism": "Edge", "set_self_labels": true, "task_extra_env": "- name: SOCIAL_AUTH_USERNAME_IS_FULL_EMAIL\n value: \"True\"\n", "task_privileged": false, "task_resource_requirements": {}, "web_extra_env": "- name: SOCIAL_AUTH_USERNAME_IS_FULL_EMAIL\n value: \"True\"\n", 
"web_resource_requirements": {}}, "status": {"adminPasswordSecret": "awx-admin-password", "adminUser": "admin", "broadcastWebsocketSecret": "awx-broadcast-websocket", "conditions": [{"lastTransitionTime": "2024-02-12T15:26:47Z", "reason": "", "status": "False", "type": "Successful"}, {"lastTransitionTime": "2024-02-14T15:18:35Z", "reason": "Failed", "status": "False", "type": "Failure"}, {"lastTransitionTime": "2024-02-14T15:29:30Z", "reason": "Running", "status": "True", "type": "Running"}], "image": "quay.io/ansible/awx:23.6.0", "postgresConfigurationSecret": "awx-postgres-configuration", "secretKeySecret": "awx-secret-key", "version": "23.6.0"}}]}\n\r\nTASK [installer : Set actual old postgres configuration secret name] ***********\r\ntask path: /opt/ansible/roles/installer/tasks/migrate_data.yml:3\nskipping: [localhost] => {"changed": false, "false_condition": "old_pg_config['resources'] | length", "skip_reason": "Conditional result was False"}\n\r\nTASK [installer : Store Database Configuration] ********************************\r\ntask path: /opt/ansible/roles/installer/tasks/migrate_data.yml:7\nskipping: [localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\n\r\nTASK [installer : Set Default label selector for custom resource generated postgres] ***\r\ntask path: /opt/ansible/roles/installer/tasks/migrate_data.yml:16\nskipping: [localhost] => {"changed": false, "false_condition": "old_pg_config['resources'] | length", "skip_reason": "Conditional result was False"}\n\r\nTASK [installer : Get the postgres pod information] ****************************\r\ntask path: /opt/ansible/roles/installer/tasks/migrate_data.yml:21\nskipping: [localhost] => {"changed": false, "false_condition": "old_pg_config['resources'] | length", "skip_reason": "Conditional result was False"}\n\r\nTASK [installer : Set the resource pod name as a variable.] 
********************\r\ntask path: /opt/ansible/roles/installer/tasks/migrate_data.yml:31\nskipping: [localhost] => {"changed": false, "false_condition": "old_pg_config['resources'] | length", "skip_reason": "Conditional result was False"}\n\r\nTASK [installer : Scale down Deployment for migration] *************************\r\ntask path: /opt/ansible/roles/installer/tasks/migrate_data.yml:35\nskipping: [localhost] => {"changed": false, "false_condition": "old_pg_config['resources'] | length", "skip_reason": "Conditional result was False"}\n\r\nTASK [installer : Set pg_dump command] *****************************************\r\ntask path: /opt/ansible/roles/installer/tasks/migrate_data.yml:38\nskipping: [localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\n\r\nTASK [installer : Set pg_restore command] **************************************\r\ntask path: /opt/ansible/roles/installer/tasks/migrate_data.yml:49\nskipping: [localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\n\r\nTASK [installer : Stream backup from pg_dump to the new postgresql container] ***\r\ntask path: /opt/ansible/roles/installer/tasks/migrate_data.yml:57\nskipping: [localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\n\r\nTASK [installer : Set flag signifying that this instance has been migrated] ****\r\ntask path: /opt/ansible/roles/installer/tasks/migrate_data.yml:86\nskipping: [localhost] => {"changed": false, "false_condition": "old_pg_config['resources'] | length", "skip_reason": "Conditional result was False"}\n\r\nTASK [installer : Load Route TLS certificate] **********************************\r\ntask path: /opt/ansible/roles/installer/tasks/install.yml:74\nskipping: [localhost] => {"changed": false, "false_condition": 
"ingress_type | lower == 'route'", "skip_reason": "Conditional result was False"}\n\r\nTASK [installer : Wait for awxrestore to complete] *****************************\r\ntask path: /opt/ansible/roles/installer/tasks/install.yml:80\nok: [localhost] => {"api_found": true, "attempts": 1, "changed": false, "resources": []}\n\r\nTASK [installer : Include resources configuration tasks] ***********************\r\ntask path: /opt/ansible/roles/installer/tasks/install.yml:94\nincluded: /opt/ansible/roles/installer/tasks/resources_configuration.yml for localhost\n\r\nTASK [installer : Get the current resource task pod information.] **************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:2\nok: [localhost] => {"api_found": true, "changed": false, "resources": []}\n\r\nTASK [installer : Set the resource pod as a variable.] *************************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:15\nok: [localhost] => {"ansible_facts": {"awx_task_pod": {}}, "changed": false}\n\r\nTASK [installer : Set the resource pod name as a variable.] 
********************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:23\nok: [localhost] => {"ansible_facts": {"awx_task_pod_name": ""}, "changed": false}\n\r\nTASK [installer : Set user provided control plane ee image] ********************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:27\nskipping: [localhost] => {"changed": false, "false_condition": "control_plane_ee_image | default([]) | length", "skip_reason": "Conditional result was False"}\n\r\nTASK [installer : Set Control Plane EE image URL] ******************************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:33\nok: [localhost] => {"ansible_facts": {"_control_plane_ee_image": "quay.io/ansible/awx-ee:latest"}, "changed": false}\n\r\nTASK [installer : Check for Receptor CA Secret] ********************************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:37\nok: [localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\n\r\nTASK [installer : Delete old Receptor CA Secret] *******************************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:50\nskipping: [localhost] => {"changed": false, "false_condition": "receptor_ca['resources'][0]['type'] != \"kubernetes.io/tls\"", "skip_reason": "Conditional result was False"}\n\r\nTASK [installer : Create tempfile for receptor-ca.key] *************************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:56\nskipping: [localhost] => {"changed": false, "false_condition": "receptor_ca['resources'][0]['type'] != \"kubernetes.io/tls\"", "skip_reason": "Conditional result was False"}\n\r\nTASK [installer : Copy Receptor CA key from old secret to tempfile] ************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:61\nskipping: [localhost] => {"censored": "the 
output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\n\r\nTASK [installer : Create tempfile for receptor-ca.crt] *************************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:66\nskipping: [localhost] => {"changed": false, "false_condition": "receptor_ca['resources'][0]['type'] != \"kubernetes.io/tls\"", "skip_reason": "Conditional result was False"}\n\r\nTASK [installer : Copy Receptor CA cert from old secret to tempfile] ***********\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:71\nskipping: [localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\n\r\nTASK [installer : Create New Receptor CA secret] *******************************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:76\nskipping: [localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\n\r\nTASK [installer : Read New Receptor CA Secret] *********************************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:81\nskipping: [localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\n\r\nTASK [installer : Set receptor_ca variable] ************************************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:88\nskipping: [localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\n\r\nTASK [installer : Remove tempfiles] ********************************************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:92\nskipping: [localhost] => {"changed": false, "false_condition": "receptor_ca['resources'][0]['type'] != 
\"kubernetes.io/tls\"", "skip_reason": "Conditional result was False"}\n\r\nTASK [installer : Create tempfile for receptor-ca.key] *************************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:102\nskipping: [localhost] => {"changed": false, "false_condition": "not receptor_ca['resources'] | default([]) | length", "skip_reason": "Conditional result was False"}\n\r\nTASK [installer : Generate Receptor CA key] ************************************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:107\nskipping: [localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\n\r\nTASK [installer : Create tempfile for receptor-ca.crt] *************************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:111\nskipping: [localhost] => {"changed": false, "false_condition": "not receptor_ca['resources'] | default([]) | length", "skip_reason": "Conditional result was False"}\n\r\nTASK [installer : Generate Receptor CA cert] ***********************************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:116\nskipping: [localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\n\r\nTASK [installer : Create Receptor CA secret] ***********************************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:122\nskipping: [localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\n\r\nTASK [installer : Read Receptor CA secret] *************************************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:127\nskipping: [localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", 
"changed": false}\n\r\nTASK [installer : Set receptor_ca variable] ************************************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:134\nskipping: [localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\n\r\nTASK [installer : Remove tempfiles] ********************************************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:138\nskipping: [localhost] => {"changed": false, "false_condition": "not receptor_ca['resources'] | default([]) | length", "skip_reason": "Conditional result was False"}\n\r\nTASK [installer : Check for Receptor work signing Secret] **********************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:147\nok: [localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\n\r\nTASK [installer : Create tempfile for receptor work signing private key] *******\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:157\nskipping: [localhost] => {"changed": false, "false_condition": "not receptor_work_signing['resources'] | default([]) | length", "skip_reason": "Conditional result was False"}\n\r\nTASK [installer : Generate Receptor work signing private key] ******************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:162\nskipping: [localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\n\r\nTASK [installer : Create tempfile for receptor work signing public key] ********\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:166\nskipping: [localhost] => {"changed": false, "false_condition": "not receptor_work_signing['resources'] | default([]) | length", "skip_reason": "Conditional result was 
False"}\n\r\nTASK [installer : Generate Receptor work signing public key] *******************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:171\nskipping: [localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\n\r\nTASK [installer : Create Receptor work signing Secret] *************************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:178\nskipping: [localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\n\r\nTASK [installer : Read Receptor work signing Secret] ***************************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:183\nskipping: [localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\n\r\nTASK [installer : Set receptor_work_signing variable] **************************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:190\nskipping: [localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\n\r\nTASK [installer : Remove tempfiles] ********************************************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:194\nskipping: [localhost] => {"changed": false, "false_condition": "not receptor_work_signing['resources'] | default([]) | length", "skip_reason": "Conditional result was False"}\n\r\nTASK [installer : Apply Resources] *********************************************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:203\nok: [localhost] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\nok: [localhost] => (item=None) => 
{"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\nok: [localhost] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\nok: [localhost] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\nok: [localhost] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\nok: [localhost] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\nfailed: [localhost] (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\nfatal: [localhost]: FAILED! => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\n\r\nPLAY RECAP *********************************************************************\r\nlocalhost : ok=63 changed=0 unreachable=0 failed=1 skipped=61 rescued=0 ignored=0 \n","job":"6258034522662604773","name":"awx","namespace":"awx","error":"exit status 2","stacktrace":"github.com/operator-framework/ansible-operator-plugins/internal/ansible/runner.(*runner).Run.func1\n\tansible-operator-plugins/internal/ansible/runner/runner.go:269"}
{"level":"error","ts":"2024-02-14T15:30:14Z","msg":"Reconciler error","controller":"awx-controller","object":{"name":"awx","namespace":"awx"},"namespace":"awx","name":"awx","reconcileID":"602ce0e0-833e-4537-bbd2-61f7cb35741a","error":"event runner on failed","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:329\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:274\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:235"}

@kurokobo
Contributor

@USB-Coffee
Could you provide the spec for your AWX? Also, if you're okay with adding no_log: false to your AWX and then retrying the upgrade, we can gather more meaningful logs instead of censored ones.

Just for your information: this complicated log:

TASK [installer : Apply Resources] *********************************************\r\ntask path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:203\nok: [localhost] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\nok: [localhost] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\nok: [localhost] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\nok: [localhost] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\nok: [localhost] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\nok: [localhost] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\nfailed: [localhost] (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}\nfatal: [localhost]: FAILED! => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}

can be formatted by replacing both \r\n and \n with line breaks:

TASK [installer : Apply Resources] *********************************************
task path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:203
ok: [localhost] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [localhost] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [localhost] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [localhost] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [localhost] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
ok: [localhost] => (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
failed: [localhost] (item=None) => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
fatal: [localhost]: FAILED! => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}

Now we can easily see that the seventh loop iteration failed. The seventh iteration of this task (Apply Resources) is for networking/ingress.yaml.j2, so I believe something went wrong with the ingress-related specs of your AWX.

I suspect this PR, which was released in 2.11.0: #1377
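The replacement described above can be done with a couple of string operations; a minimal sketch in Python (the sample string below is abbreviated from the log, not the full output):

```python
# Turn the escaped \r\n and \n sequences in a raw operator log line
# back into real line breaks so the per-task output becomes readable.
raw = (
    'TASK [installer : Apply Resources] ***\\r\\n'
    'task path: /opt/ansible/roles/installer/tasks/resources_configuration.yml:203\\n'
    'ok: [localhost] => (item=None) => {"changed": false}'
)
formatted = raw.replace("\\r\\n", "\n").replace("\\n", "\n")
print(formatted)
```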

@USB-Coffee
Author

I should have thought of re-formatting the logs, sorry!

That does help, but the AWX config only has a single ingress defined.
The host is running K3s + Traefik ingress. The only ingress change is a forced redirect to HTTPS via an annotation... which I put in the AWX spec...

AWX config:

---
apiVersion: awx.ansible.com/v1beta1
kind: AWX
metadata:
  name: awx
spec:
  admin_user: admin
  admin_password_secret: awx-admin-password

  ingress_type: ingress
  ingress_tls_secret: awx-secret-tls
  hostname: awx.hostname.domain.com
  ingress_annotations: | 
    traefik.ingress.kubernetes.io/router.middlewares: default-redirect@kubernetescrd

  postgres_configuration_secret: awx-postgres-configuration

  postgres_storage_class: awx-postgres-volume
  postgres_storage_requirements:
    requests:
      storage: 8Gi

  projects_persistence: true
  projects_existing_claim: awx-projects-claim

  postgres_init_container_resource_requirements: {}
  postgres_resource_requirements: {}
  web_resource_requirements: {}
  task_resource_requirements: {}
  ee_resource_requirements: {}

  ldap_cacert_secret: awx-custom-certs
  bundle_cacert_secret: awx-custom-certs

  extra_settings:
    - setting: SOCIAL_AUTH_USERNAME_IS_FULL_EMAIL
      value: "True"


  # Uncomment to reveal "censored" logs
  #  no_log: false

Deleting the offending ingress_annotations lines (ingress_annotations: | / traefik.ingress.kubernetes.io/router.middlewares: default-redirect@kubernetescrd) from the AWX config in Kubernetes before running the upgrade resolved the issue, and 2.11.0 deployed successfully!
That must be what breaks the 'backwards compatible changes' mentioned in #1377.

Thanks for the quick reply! Hopefully this can assist somebody else too 😀

@kurokobo
Contributor

@USB-Coffee
Okay, I can reproduce this issue by simply attempting a fresh installation with ingress_annotations, instead of upgrading.
Could you please rename this issue to something like "Adding ingress_annotations causes deployment failure on Operator 2.11.0+"?

@USB-Coffee USB-Coffee changed the title awx-operator upgrading from 2.10.0 to 2.11.0 or 2.12.0 fails to deploy Adding ingress_annotations to awx spec causes deployment failure on Operator 2.11.0+ Feb 15, 2024
@kurokobo
Contributor

Affected case

Defining ingress_annotations at the end of the YAML file without a trailing newline:

---
spec:
  ...
  ingress_type: ingress
  ingress_hosts:
    - hostname: awx-demo.example.com
  ingress_annotations: |
    environment: testing⛔

or defining ingress_annotations as a single-line string:

---
spec:
  ...
  ingress_type: ingress
  ingress_hosts:
    - hostname: awx-demo.example.com
  ingress_annotations: 'environment: testing'

Affected AWX CR

ingress_annotations is stored as a single-line string.

$ kubectl -n awx get awx awx -o yaml
apiVersion: awx.ansible.com/v1beta1
kind: AWX
...
spec:
  ...
  ingress_annotations: 'environment: testing'
  ingress_hosts:
  - hostname: awx.example.com
    tls_secret: awx-secret-tls

This causes the following broken definition for the Ingress resource (the annotations and spec keys are combined onto a single line):

---
apiVersion: 'networking.k8s.io/v1'
kind: Ingress
metadata:
  ...
  annotations:
    environment: testingspec:
  rules:
    ...

Unaffected case

Defining ingress_annotations at the end of the YAML file with a trailing newline:

---
spec:
  ...
  ingress_type: ingress
  ingress_hosts:
    - hostname: awx-demo.example.com
  ingress_annotations: |
    environment: testing

or defining other specs after ingress_annotations:

---
spec:
  ...
  ingress_type: ingress
  ingress_annotations: |
    environment: testing
  ingress_hosts:
    - hostname: awx-demo.example.com

Unaffected AWX CR

ingress_annotations is stored as a multi-line string.

$ kubectl -n awx get awx awx -o yaml
apiVersion: awx.ansible.com/v1beta1
kind: AWX
...
spec:
  ...
  ingress_annotations: |
    environment: testing
  ingress_hosts:
  - hostname: awx.example.com
    tls_secret: awx-secret-tls

In this case, a valid definition is generated:

---
apiVersion: 'networking.k8s.io/v1'
kind: Ingress
metadata:
  ...
  annotations:
    environment: testing
spec:
  rules:
    ...

@kurokobo
Contributor

kurokobo commented Feb 15, 2024

@USB-Coffee
Hi, thanks for your support. I sent PR #1715 to address this.
I believe you can revert your ingress_annotations after upgrading by being careful to include a newline at the end.
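As a defensive measure, the value can be normalized before it is applied; a small sketch (the helper name is hypothetical and not part of the operator):

```python
def safe_ingress_annotations(value: str) -> str:
    # Ensure the annotations block ends with a newline, since a missing
    # trailing newline is what triggers this rendering bug.
    return value if value.endswith("\n") else value + "\n"

print(safe_ingress_annotations("environment: testing"))
```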

This issue is helpful for my guide as well; thanks again for using my guide and reporting this 😃

@USB-Coffee
Author

No problem, thanks for the help + fix 🙂

@cnu80

cnu80 commented Jun 18, 2024

Hi, I found this closed issue after I tried to update our AWX spec from:

  ingress_annotations: |
    cert-manager.io/cluster-issuer: "acme-as-vault01"

to

  ingress_annotations: |
    cert-manager.io/cluster-issuer: "acme-as-vault01"
    cert-manager.io/private-key-size: 4096

I tried it out with different AWX operator versions:

2.18.0
2.17.0
2.15.1
2.12.1

Every time, AWX was gone when I added the second annotation.

Any hint? thanks
