

"unable to decode an event from the watch" errors during pod termination #428

Open
nabadger opened this issue Jun 10, 2024 · 0 comments
Labels
bug Something isn't working

nabadger commented Jun 10, 2024

Operator Version, Kind and Kubernetes Version

  • Operator version: v2.4.1
  • Kind: N/A
  • Kubernetes version: 1.27

YAML Manifest File

N/A

Output Log

2024-06-10 16:28:06    W0610 15:28:06.232834       1 reflector.go:462] k8s.io/[email protected]/tools/cache/reflector.go:229: watch of *v1.Secret ended with: an error on the server ("unable to decode an event from the watch stream: context canceled") has prevented the request from succeeding
2024-06-10 16:37:37    W0610 15:37:37.304037       1 reflector.go:462] k8s.io/[email protected]/tools/cache/reflector.go:229: watch of *v1alpha2.Module ended with: an error on the server ("unable to decode an event from the watch stream: context canceled") has prevented the request from succeeding
2024-06-10 16:39:32    W0610 15:39:32.458770       1 reflector.go:462] k8s.io/[email protected]/tools/cache/reflector.go:229: watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: context canceled") has prevented the request from succeeding
2024-06-10 16:39:32    W0610 15:39:32.458856       1 reflector.go:462] k8s.io/[email protected]/tools/cache/reflector.go:229: watch of *v1.Secret ended with: an error on the server ("unable to decode an event from the watch stream: context canceled") has prevented the request from succeeding
2024-06-10 16:39:32    W0610 15:39:32.458910       1 reflector.go:462] k8s.io/[email protected]/tools/cache/reflector.go:229: watch of *v1alpha2.AgentPool ended with: an error on the server ("unable to decode an event from the watch stream: context canceled") has prevented the request from succeeding
2024-06-10 16:39:32    W0610 15:39:32.458946       1 reflector.go:462] k8s.io/[email protected]/tools/cache/reflector.go:229: watch of *v1alpha2.Module ended with: an error on the server ("unable to decode an event from the watch stream: context canceled") has prevented the request from succeeding
2024-06-10 16:39:32    W0610 15:39:32.459014       1 reflector.go:462] k8s.io/[email protected]/tools/cache/reflector.go:229: watch of *v1alpha2.Workspace ended with: an error on the server ("unable to decode an event from the watch stream: context canceled") has prevented the request from succeeding
2024-06-10 16:39:32    W0610 15:39:32.459064       1 reflector.go:462] k8s.io/[email protected]/tools/cache/reflector.go:229: watch of *v1alpha2.Project ended with: an error on the server ("unable to decode an event from the watch stream: context canceled") has prevented the request from succeeding

Output of relevant kubectl commands

N/A

Steps To Reproduce

I believe this only happens when the replica count is set to 1, so in the Helm chart it can be reproduced with:

replicaCount: 1

When the pod is terminated, the errors above are produced.

With only one pod there is no leader-election support; I'm fairly sure this error does not show up with more than one pod.

However, running operators with a single pod is still fairly common.

Expected Behavior

The operator should terminate gracefully with no errors.

Actual Behavior

The operator terminates with errors. I have not confirmed whether this also happens when leader election is in use (>1 replica), but I don't recall seeing these errors in that configuration.

Additional Context

N/A

References

Community Note

  • Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request.
  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request.
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment.
@nabadger nabadger added the bug Something isn't working label Jun 10, 2024