This repository has been archived by the owner on Oct 14, 2024. It is now read-only.

Not working with kube-image-keeper mutating webhook #664

Open
ppapp92 opened this issue May 28, 2024 · 1 comment

Comments


ppapp92 commented May 28, 2024

What happened:

kubeclarity-runtime-k8s-scanner throws an error when trying to scan a Docker image.

What you expected to happen:

Expect the Docker image in the cluster to be scanned successfully.

How to reproduce it (as minimally and precisely as possible):

  1. Install kube-image-keeper
  2. Run a KubeClarity runtime scan against a namespace that has images cached by kube-image-keeper
  3. The scan errors out because the kube-image-keeper mutating webhook rewrites the image URL to localhost:7439/, which the scanner cannot reach (see the sketch after this list)
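For illustration, the rewrite is visible directly on any pod in the cached namespace. A minimal sketch, assuming kubectl access to the cluster; the pod and namespace names are hypothetical, and the image digest is taken from the error log below:

```sh
# Inspect the image reference of a container that kube-image-keeper has mutated
# (pod and namespace names are placeholders; replace with your own).
kubectl get pod typesense-0 -n my-namespace \
  -o jsonpath='{.spec.containers[0].image}'
# The webhook has prefixed the original reference with its proxy address, e.g.:
# localhost:7439/typesense/typesense@sha256:035ccfbc3fd8fb9085ea205fdcb62de63eaefdbebd710e88e57f978a30f2090d
```

The scanner then tries to pull that rewritten reference directly, which is where the failure below comes from.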

Are there any error messages in KubeClarity logs?

kubeclarity-kubeclarity-wait-for-pg-db kubeclarity-kubeclarity-postgresql:5432 - accepting connections
kubeclarity 
kubeclarity 2024/05/28 21:07:14 /build/backend/pkg/database/scheduler.go:58 record not found
kubeclarity [1.032ms] [rows:0] SELECT * FROM "scheduler" ORDER BY "scheduler"."id" LIMIT 1
kubeclarity 2024/05/28 21:07:14 Serving kube clarity runtime scan a p is at http://:8888
kubeclarity 2024/05/28 21:07:14 Serving kube clarity a p is at http://:8080
kubeclarity time="2024-05-28T21:07:46Z" level=warning msg="Vulnerabilities scan of imageID \"localhost:7439/typesense/typesense@sha256:035ccfbc3fd8fb9085ea205fdcb62de63eaefdbebd710e88e57f978a30f2090d\" has failed: &{failed to analyze image: failed to run job manager: failed to run job: failed to create source analyzer=syft: unable to load image: unable to use OciRegistry source: failed to get image descriptor from registry: Get \"https://localhost:7439/v2/\": dial tcp [::1]:7439: connect: connection refused; Get \"http://localhost:7439/v2/\": dial tcp [::1]:7439: connect: connection refused TBD}" func="github.com/openclarity/kubeclarity/runtime_scan/pkg/scanner.(*Scanner).HandleScanResults" file="/build/runtime_scan/pkg/scanner/scanner.go:415" scanner id=24da9132-749b-4e9d-943d-327af7a67275
kubeclarity time="2024-05-28T21:09:30Z" level=warning msg="Vulnerabilities scan of imageID \"localhost:7439/typesense/typesense@sha256:035ccfbc3fd8fb9085ea205fdcb62de63eaefdbebd710e88e57f978a30f2090d\" has failed: &{failed to analyze image: failed to run job manager: failed to run job: failed to create source analyzer=syft: unable to load image: unable to use OciRegistry source: failed to get image descriptor from registry: Get \"https://localhost:7439/v2/\": dial tcp [::1]:7439: connect: connection refused; Get \"http://localhost:7439/v2/\": dial tcp [::1]:7439: connect: connection refused TBD}" func="github.com/openclarity/kubeclarity/runtime_scan/pkg/scanner.(*Scanner).HandleScanResults" file="/build/runtime_scan/pkg/scanner/scanner.go:415" scanner id=776eb73a-263a-423d-aa5f-01f25a901dee
kubeclarity 
kubeclarity 2024/05/28 21:09:36 /build/backend/pkg/database/refresh_materialized_views.go:155 SLOW SQL >= 200ms
kubeclarity [1907.530ms] [rows:0] REFRESH MATERIALIZED VIEW CONCURRENTLY packages_view;
kubeclarity 
kubeclarity 2024/05/28 21:09:36 /build/backend/pkg/database/refresh_materialized_views.go:155 SLOW SQL >= 200ms
kubeclarity [1912.167ms] [rows:0] REFRESH MATERIALIZED VIEW CONCURRENTLY vulnerabilities_view;
kubeclarity 
kubeclarity 2024/05/28 21:09:37 /build/backend/pkg/database/application.go:236 record not found
kubeclarity [6.892ms] [rows:0] SELECT * FROM "applications" WHERE applications.id = 'b17e8e84-3330-5f16-93aa-3b425dd46e40' ORDER BY "applications"."id" LIMIT 1
kubeclarity-kubeclarity-wait-for-sbom-db + curl -sw '%{http_code}' http://kubeclarity-kubeclarity-sbom-db:8081/healthz/ready -o /dev/null
kubeclarity-kubeclarity-wait-for-sbom-db + '[' 200 -ne 200 ]
kubeclarity-kubeclarity-wait-for-grype-server + curl -sw '%{http_code}' http://kubeclarity-kubeclarity-grype-server:8080/healthz/ready -o /dev/null
kubeclarity-kubeclarity-wait-for-grype-server + '[' 200 -ne 200 ]
Stream closed EOF for kubeclarity-test/kubeclarity-kubeclarity-6ddcd445b8-pnvdt (kubeclarity-kubeclarity-wait-for-pg-db)
Stream closed EOF for kubeclarity-test/kubeclarity-kubeclarity-6ddcd445b8-pnvdt (kubeclarity-kubeclarity-wait-for-sbom-db)
Stream closed EOF for kubeclarity-test/kubeclarity-kubeclarity-6ddcd445b8-pnvdt (kubeclarity-kubeclarity-wait-for-grype-server)
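The connection-refused errors above are consistent with the rewritten registry address being reachable only on the node itself, not from inside an ordinary pod's network namespace. A rough way to confirm that nothing answers on localhost:7439 from a regular pod (a sketch; curlimages/curl is just a convenient image that ships curl):

```sh
# Start a throwaway pod (not on the host network) and probe the rewritten
# registry address; expect "connection refused", matching the scanner error above.
kubectl run registry-check -i --rm --restart=Never \
  --image=curlimages/curl --command -- curl -v http://localhost:7439/v2/
```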

Anything else we need to know?:

Environment:

  • Kubernetes version: EKS 1.28
  • Helm version (use helm version): v3.14.4
  • KubeClarity version: latest
  • KubeClarity Helm Chart version: latest
  • Cloud provider or hardware configuration: AWS

Thank you for your contribution! This issue has been automatically marked as stale because it has no recent activity in the last 60 days. It will be closed in 14 days, if no further activity occurs. If this issue is still relevant, please leave a comment to let us know, and the stale label will be automatically removed.

@ramizpolic ramizpolic changed the title Not working with kube-image-keeper mutating webhook [kubeclarity] Not working with kube-image-keeper mutating webhook Aug 8, 2024
@ramizpolic ramizpolic changed the title [kubeclarity] Not working with kube-image-keeper mutating webhook Not working with kube-image-keeper mutating webhook Aug 8, 2024
@ramizpolic ramizpolic transferred this issue from openclarity/openclarity Aug 8, 2024
@ramizpolic ramizpolic transferred this issue from another repository Aug 10, 2024