
Return code when the cluster is not accessible or there is an issue with authentication #450

Open

xzizka opened this issue Feb 20, 2023 · 4 comments · Fixed by #527

@xzizka commented Feb 20, 2023

Hello,

During our tests we found that kubent returns exit code 0 both when the cluster does not exist:

xzizka@xzizka:~$ kubent -e 
12:39PM INF >>> Kube No Trouble `kubent` <<<
12:39PM INF version 0.7.0 (git sha d1bb4e5fd6550b533b2013671aa8419d923ee042)
12:39PM INF Initializing collectors and retrieving data
12:40PM INF Retrieved 0 resources from collector name=Cluster
12:40PM ERR Failed to retrieve data from collector error="list: failed to list: Get \"https://127.0.0.1:34748/api/v1/secrets?labelSelector=owner%3Dhelm\": dial tcp 127.0.0.1:34748: connect: connection refused" name="Helm v3"
12:40PM INF Loaded ruleset name=custom.rego.tmpl
12:40PM INF Loaded ruleset name=deprecated-1-16.rego
12:40PM INF Loaded ruleset name=deprecated-1-22.rego
12:40PM INF Loaded ruleset name=deprecated-1-25.rego
12:40PM INF Loaded ruleset name=deprecated-1-26.rego
12:40PM INF Loaded ruleset name=deprecated-future.rego
xzizka@xzizka:~$ echo $?
0

and when there is an issue with authentication:

xzizka@xzizka:~$ kubent -e 
12:40PM INF >>> Kube No Trouble `kubent` <<<
12:40PM INF version 0.7.0 (git sha d1bb4e5fd6550b533b2013671aa8419d923ee042)
12:40PM INF Initializing collectors and retrieving data
12:40PM ERR Failed to initialize collector: <nil> error="failed to assemble client config: error loading config file \"/home/xzizka/.kube/config\": illegal base64 data at input byte 1532"
12:40PM ERR Failed to initialize collector: <nil> error="failed to assemble client config: error loading config file \"/home/xzizka/.kube/config\": illegal base64 data at input byte 1532"
12:40PM INF Loaded ruleset name=custom.rego.tmpl
12:40PM INF Loaded ruleset name=deprecated-1-16.rego
12:40PM INF Loaded ruleset name=deprecated-1-22.rego
12:40PM INF Loaded ruleset name=deprecated-1-25.rego
12:40PM INF Loaded ruleset name=deprecated-1-26.rego
12:40PM INF Loaded ruleset name=deprecated-future.rego
xzizka@xzizka:~$ echo $?
0

An error message is logged in both cases, but I would not expect an exit code of 0 when kubent was not able to connect to the cluster at all.
The definition of the -e parameter is:

-e, --exit-error                      exit with non-zero code when issues are found

I would consider an unsuccessful connection to the cluster to be an issue. Do you agree with this position?
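
To illustrate what I would expect, here is a minimal sketch of the kind of exit-code handling I have in mind; the function and variable names are illustrative, not kubent's actual internals:

package main

import (
	"fmt"
	"os"
)

// collectResources stands in for kubent's collector phase; here it always
// fails, the way it does when the cluster is unreachable or the kubeconfig
// cannot be parsed.
func collectResources() ([]string, error) {
	return nil, fmt.Errorf("failed to assemble client config")
}

func main() {
	exitError := true // corresponds to the -e / --exit-error flag

	findings, err := collectResources()
	if err != nil {
		// A connection or authentication failure should be fatal,
		// not only logged.
		fmt.Fprintln(os.Stderr, "ERR", err)
		os.Exit(1)
	}
	if exitError && len(findings) > 0 {
		os.Exit(2) // non-zero when deprecated resources are found
	}
	// exit code 0 only when the scan actually completed cleanly
}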

Thank you
Ondrej

@stepanstipl stepanstipl self-assigned this Feb 21, 2023
@stepanstipl stepanstipl added the bug Something isn't working label Feb 21, 2023
@stepanstipl stepanstipl added this to the 0.8.0 milestone Feb 21, 2023
@stepanstipl (Contributor)

Hi @xzizka, this is definitely not desirable and we should look into fixing it. Thanks for reporting this 👍.

@github-actions

This issue has not seen any activity in the last 60 days and has been marked as stale.

@github-actions github-actions bot added the stale label Apr 23, 2023
@dark0dave dark0dave self-assigned this Sep 5, 2023
dark0dave added commits that referenced this issue (Sep 5–8, 2023)
…og level for silent resource failures

Signed-off-by: dark0dave <[email protected]>
@stepanstipl (Contributor)

I believe only half of this is fixed; see #527 for details.

@stepanstipl stepanstipl reopened this Sep 8, 2023
@dark0dave (Collaborator)

#527 (review)

Still a few tests to add here.
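
For example, an end-to-end test along these lines could pin the behaviour down; the binary path and the invalid-kubeconfig fixture are assumptions on my side, not existing files in the repo:

package main_test

import (
	"os"
	"os/exec"
	"testing"
)

// Run a built kubent binary against an unparseable kubeconfig and expect
// a non-zero exit code instead of the current exit code 0.
func TestExitCodeOnBadKubeconfig(t *testing.T) {
	cmd := exec.Command("./kubent", "-e")
	cmd.Env = append(os.Environ(), "KUBECONFIG=testdata/invalid-kubeconfig")
	if err := cmd.Run(); err == nil {
		t.Fatal("expected a non-zero exit code when the client config cannot be loaded")
	}
}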
