quay / clair

Vulnerability Static Analysis for Containers

Home Page: https://quay.github.io/clair/

Not finding any CVEs despite Trivy and Grype finding many

HariSekhon opened this issue

Description of Problem / Feature Request

I've integrated Clair, Trivy and Grype into my pipelines, but Clair is the only one that finds no CVEs for a Debian 11-based docker image.

Expected Outcome

Expected Clair to find roughly the same CVEs as the other tools.

Actual Outcome

Grype:

[2023-05-16T20:09:36.879Z] [0000]  INFO grype version: 0.61.1
[2023-05-16T20:11:28.472Z] [0110]  INFO identified distro: Debian GNU/Linux 11 (bullseye) form-lib=syft
...
[2023-05-16T20:11:56.687Z] [0139]  INFO found 498 vulnerabilities for 260 packages

Trivy:

[2023-05-16T20:09:41.208Z] Total: 490 (UNKNOWN: 1, LOW: 318, MEDIUM: 65, HIGH: 95, CRITICAL: 11)

Clair:

[2023-05-16T20:09:36.304Z] + clairctl -D report --host http://<fqdn>:8080 <fqdn>/<custom>/onboarding:8bccae9c83eee9291f95485fdd0ce5ab4bcb030c
[2023-05-16T20:09:36.879Z] 2023-05-16T20:09:36Z DBG fetching ref=<fqdn>/<custom>/onboarding:8bccae9c83eee9291f95485fdd0ce5ab4bcb030c
[2023-05-16T20:09:36.879Z] 2023-05-16T20:09:36Z DBG using text output
[2023-05-16T20:09:41.157Z] 2023-05-16T20:09:40Z DBG found manifest digest=sha256:30280c8edce7364a56902dde59d7e0ae21fbd900156b16a6ac0bcb237fa8297b ref=<fqdn>/<custom>/onboarding:8bccae9c83eee9291f95485fdd0ce5ab4bcb030c
[2023-05-16T20:09:41.157Z] 2023-05-16T20:09:40Z DBG requesting index_report attempt=1 digest=sha256:30280c8edce7364a56902dde59d7e0ae21fbd900156b16a6ac0bcb237fa8297b ref=<fqdn>/<custom>/onboarding:8bccae9c83eee9291f95485fdd0ce5ab4bcb030c
[2023-05-16T20:09:42.097Z] 2023-05-16T20:09:41Z DBG body="{\"code\":\"not-found\",\"message\":\"index report for manifest \\\"sha256:30280c8edce7364a56902dde59d7e0ae21fbd900156b16a6ac0bcb237fa8297b\\\" not found\"}" digest=sha256:30280c8edce7364a56902dde59d7e0ae21fbd900156b16a6ac0bcb237fa8297b method=GET path=/indexer/api/v1/index_report/sha256:30280c8edce7364a56902dde59d7e0ae21fbd900156b16a6ac0bcb237fa8297b ref=<fqdn>/<custom>/onboarding:8bccae9c83eee9291f95485fdd0ce5ab4bcb030c status="404 Not Found"
[2023-05-16T20:09:42.097Z] 2023-05-16T20:09:41Z DBG don't have needed manifest digest=sha256:30280c8edce7364a56902dde59d7e0ae21fbd900156b16a6ac0bcb237fa8297b manifest=sha256:30280c8edce7364a56902dde59d7e0ae21fbd900156b16a6ac0bcb237fa8297b ref=<fqdn>/<custom>/onboarding:8bccae9c83eee9291f95485fdd0ce5ab4bcb030c
[2023-05-16T20:09:42.357Z] 2023-05-16T20:09:42Z DBG found manifest digest=sha256:30280c8edce7364a56902dde59d7e0ae21fbd900156b16a6ac0bcb237fa8297b ref=<fqdn>/<custom>/onboarding:8bccae9c83eee9291f95485fdd0ce5ab4bcb030c
[2023-05-16T20:09:42.357Z] 2023-05-16T20:09:42Z DBG found layers count=13 digest=sha256:30280c8edce7364a56902dde59d7e0ae21fbd900156b16a6ac0bcb237fa8297b ref=<fqdn>/<custom>/onboarding:8bccae9c83eee9291f95485fdd0ce5ab4bcb030c
[2023-05-16T20:09:43.758Z] 2023-05-16T20:09:43Z DBG requesting index_report attempt=2 digest=sha256:30280c8edce7364a56902dde59d7e0ae21fbd900156b16a6ac0bcb237fa8297b ref=<fqdn>/<custom>/onboarding:8bccae9c83eee9291f95485fdd0ce5ab4bcb030c
[2023-05-16T20:09:43.758Z] 2023-05-16T20:09:43Z DBG body="{\"code\":\"not-found\",\"message\":\"index report for manifest \\\"sha256:30280c8edce7364a56902dde59d7e0ae21fbd900156b16a6ac0bcb237fa8297b\\\" not found\"}" digest=sha256:30280c8edce7364a56902dde59d7e0ae21fbd900156b16a6ac0bcb237fa8297b method=GET path=/indexer/api/v1/index_report/sha256:30280c8edce7364a56902dde59d7e0ae21fbd900156b16a6ac0bcb237fa8297b ref=<fqdn>/<custom>/onboarding:8bccae9c83eee9291f95485fdd0ce5ab4bcb030c status="404 Not Found"
[2023-05-16T20:12:20.318Z] 2023-05-16T20:12:15Z DBG digest=sha256:30280c8edce7364a56902dde59d7e0ae21fbd900156b16a6ac0bcb237fa8297b method=POST path=/indexer/api/v1/index_report ref=<fqdn>/<custom>/onboarding:8bccae9c83eee9291f95485fdd0ce5ab4bcb030c status="201 Created"
[2023-05-16T20:12:20.319Z] 2023-05-16T20:12:15Z DBG setting validator digest=sha256:30280c8edce7364a56902dde59d7e0ae21fbd900156b16a6ac0bcb237fa8297b path=/indexer/api/v1/index_report/sha256:30280c8edce7364a56902dde59d7e0ae21fbd900156b16a6ac0bcb237fa8297b ref=<fqdn>/<custom>/onboarding:8bccae9c83eee9291f95485fdd0ce5ab4bcb030c validator="\"7a5f5333aeda3d3d3c679da74d74cab5\""
[2023-05-16T20:12:20.319Z] 2023-05-16T20:12:15Z DBG digest=sha256:30280c8edce7364a56902dde59d7e0ae21fbd900156b16a6ac0bcb237fa8297b method=GET path=/matcher/api/v1/vulnerability_report/sha256:30280c8edce7364a56902dde59d7e0ae21fbd900156b16a6ac0bcb237fa8297b ref=<fqdn>/<custom>/onboarding:8bccae9c83eee9291f95485fdd0ce5ab4bcb030c status="200 OK"   
[2023-05-16T20:12:20.319Z] onboarding:8bccae9c83eee9291f95485fdd0ce5ab4bcb030c ok

(company-specific info has been anonymized via anonymize.pl)

That last line seems to indicate that the docker image is ok and has no CVEs, but that is clearly contradicted by the other two scanners on the exact same docker image:tag.
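
To double-check what the matcher actually returned, the raw vulnerability report can be pulled straight from the matcher API path shown in the clairctl debug output above. A minimal sketch, assuming jq is installed and using the manifest digest from the log (the vulnerabilities field name is an assumption about the report JSON; eyeballing the raw output works just as well):

# fetch the raw vulnerability report for the digest seen in the clairctl debug output above
# (sketch: digest copied from the log; jq assumed to be installed)
DIGEST=sha256:30280c8edce7364a56902dde59d7e0ae21fbd900156b16a6ac0bcb237fa8297b
curl -s "http://<fqdn>:8080/matcher/api/v1/vulnerability_report/${DIGEST}" | jq '.vulnerabilities | length'
# 0 here would confirm the matcher really found nothing, rather than clairctl swallowing the results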

Environment

  • Clair version/image: 4.3.6
  • Clair client name/version: clairctl version v4.6.0-7-g36990912
  • Host OS: Linux (GKE)
  • Kernel (e.g. uname -a): Linux 5.10.162+
  • Kubernetes version (use kubectl version): v1.22.17-gke.5400
  • Network/Firewall setup: GCP

Reproduce

The complete config to reproduce this is in my Kubernetes-configs repo, specifically this directory:

https://github.com/HariSekhon/Kubernetes-configs/tree/master/clair/base

which can be instantly deployed to Kubernetes:

git clone git@github.com:HariSekhon/Kubernetes-configs

cd Kubernetes-configs/clair/base

kustomize build | kubectl apply -f -

and then run this from any pod container on Kubernetes:

clairctl -D report --host http://clair.clair.svc.cluster.local:8080 eu.gcr.io/<project>/<image>:<tag>

e.g.

git clone git@github.com:HariSekhon/DevOps-Bash-tools bash-tools

# launch a GCloud SDK container on the cluster and drop me into a bash shell on it
bash-tools/kubernetes/kubectl_gcloud_sdk.sh

# inside the GCloud SDK container, download clairctl
curl -Lo clairctl https://github.com/quay/clair/releases/download/v4.6.1/clairctl-linux-amd64 && chmod +x clairctl

./clairctl -D report --host http://clair.clair.svc.cluster.local:8080 eu.gcr.io/<project>/<image>:<tag>

Live Settings

$ env | grep CLAIR | grep -v -e HOST -e PORT
CLAIR_MODE=combo
CLAIR_CONF=/etc/clair/config.yaml

/etc/clair/config.yaml:

    ---
    introspection_addr: 0.0.0.0:8089
    http_listen_addr: 0.0.0.0:8080
    log_level: info
    indexer:
      connstring: postgres://clair:clair@clair-postgresql.clair.svc.cluster.local/clair?sslmode=disable
      scanlock_retry: 10
      layer_scan_concurrency: 5
      migrations: true
    matcher:
      indexer_addr: clair.clair.svc.cluster.local:8080
      connstring: postgres://clair:clair@clair-postgresql.clair.svc.cluster.local/clair?sslmode=disable
      max_conn_pool: 100
      run: ""
      migrations: true
      updater_sets:
        - alpine
        - aws
        - debian
        - oracle
        - photon
        - pyupio
        - rhel
        - suse
        - ubuntu
    matchers:
      names:
        - alpine
        - aws
        - debian
        - oracle
        - photon
        - python
        - rhel
        - suse
        - ubuntu
        - crda
      config:
        crda:
          url: https://gw.api.openshift.io/api/v2/
          source: clair-sample-instance
          key: 207c527cfc2a6b8dcf4fa43ad7a976da
    notifier:
      indexer_addr: http://clair.clair.svc.cluster.local:8080/
      matcher_addr: http://clair.clair.svc.cluster.local:8080/
      connstring: postgres://clair:clair@clair-postgresql.clair.svc.cluster.local/clair?sslmode=disable
      migrations: true
      delivery_interval: 1m
      poll_interval: 5m
    trace:
      name: jaeger
      probability: 1
      jaeger:
        agent:
          endpoint: localhost:6831
        service_name: clair
    metrics:
      name: prometheus

I just tried this on the alpine base image too, same situation:

$ ./clairctl -D report --host http://clair.clair.svc.cluster.local:8080 alpine
2023-05-16T21:16:12Z DBG fetching ref=alpine
2023-05-16T21:16:12Z DBG using text output
2023-05-16T21:16:13Z DBG found manifest digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine
2023-05-16T21:16:13Z DBG requesting index_report attempt=1 digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine
2023-05-16T21:16:14Z DBG digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda method=GET path=/indexer/api/v1/index_report/sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine status="200 OK"
2023-05-16T21:16:14Z DBG manifest may be out-of-date digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda manifest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine
2023-05-16T21:16:14Z DBG found manifest digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine
2023-05-16T21:16:14Z DBG found layers count=1 digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine
2023-05-16T21:16:14Z DBG requesting index_report attempt=2 digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine
2023-05-16T21:16:14Z DBG digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda method=GET path=/indexer/api/v1/index_report/sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine status="200 OK"
2023-05-16T21:16:15Z DBG digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda method=POST path=/indexer/api/v1/index_report ref=alpine status="201 Created"
2023-05-16T21:16:15Z DBG setting validator digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda path=/indexer/api/v1/index_report/sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine validator="\"7a5f5333aeda3d3d3c679da74d74cab5\""
2023-05-16T21:16:15Z DBG digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda method=GET path=/matcher/api/v1/vulnerability_report/sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine status="200 OK"
alpine ok

but Trivy and Grype both report the same two findings:

$ trivy image alpine
2023-05-16T22:15:06.074+0100    INFO    Vulnerability scanning is enabled
2023-05-16T22:15:06.074+0100    INFO    Secret scanning is enabled
2023-05-16T22:15:06.074+0100    INFO    If your scanning is slow, please try '--scanners vuln' to disable secret scanning
2023-05-16T22:15:06.074+0100    INFO    Please see also https://aquasecurity.github.io/trivy/v0.41/docs/secret/scanning/#recommendation for faster secret detection
2023-05-16T22:15:06.893+0100    INFO    Detected OS: alpine
2023-05-16T22:15:06.893+0100    INFO    Detecting Alpine vulnerabilities...
2023-05-16T22:15:07.034+0100    INFO    Number of language-specific files: 0

alpine (alpine 3.17.3)

Total: 2 (UNKNOWN: 0, LOW: 0, MEDIUM: 2, HIGH: 0, CRITICAL: 0)

┌────────────┬───────────────┬──────────┬───────────────────┬───────────────┬────────────────────────────────────────────────────────────┐
│  Library   │ Vulnerability │ Severity │ Installed Version │ Fixed Version │                           Title                            │
├────────────┼───────────────┼──────────┼───────────────────┼───────────────┼────────────────────────────────────────────────────────────┤
│ libcrypto3 │ CVE-2023-1255 │ MEDIUM   │ 3.0.8-r3          │ 3.0.8-r4      │ Input buffer over-read in AES-XTS implementation on 64 bit │
│            │               │          │                   │               │ ARM                                                        │
│            │               │          │                   │               │ https://avd.aquasec.com/nvd/cve-2023-1255                  │
├────────────┤               │          │                   │               │                                                            │
│ libssl3    │               │          │                   │               │                                                            │
│            │               │          │                   │               │                                                            │
│            │               │          │                   │               │                                                            │
└────────────┴───────────────┴──────────┴───────────────────┴───────────────┴────────────────────────────────────────────────────────────┘
$ grype alpine
 ✔ Vulnerability DB        [no update available]
 ✔ Loaded image
 ✔ Parsed image
 ✔ Cataloged packages      [16 packages]
 ✔ Scanning image...       [2 vulnerabilities]
   ├── 0 critical, 0 high, 2 medium, 0 low, 0 negligible
   └── 2 fixed
NAME        INSTALLED  FIXED-IN  TYPE  VULNERABILITY  SEVERITY
libcrypto3  3.0.8-r3   3.0.8-r4  apk   CVE-2023-1255  Medium
libssl3     3.0.8-r3   3.0.8-r4  apk   CVE-2023-1255  Medium

Have I configured Clair wrong, or is this a bug or limitation? I can see alpine in the list of matchers and updater_sets.
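
One quick check (a sketch, not something I've confirmed) is to dump the index report Clair generated for the alpine manifest, using the indexer API path visible in the debug output above, and look at whether it detected the Alpine distribution and the installed apk packages at all: if the distribution isn't recognised, no matcher will ever apply.

# dump the index report for the alpine manifest digest from the clairctl debug output above
# (sketch: digest copied from the log; inspect the detected distribution and package list by eye)
DIGEST=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda
curl -s "http://clair.clair.svc.cluster.local:8080/indexer/api/v1/index_report/${DIGEST}" | jq .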

And in the Clair pod logs it does look like the alpine vulnerability data is being updated:

{"level":"info","component":"alpine/Updater.Fetch","updater":"alpine-community-v3.11-updater","time":"2023-05-16T21:18:01Z","message":"database unchanged since last fetch"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"alpine-community-v3.11-updater","time":"2023-05-16T21:18:01Z","message":"vulnerability database unchanged"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"alpine-community-v3.11-updater","time":"2023-05-16T21:18:01Z","message":"finished update"}
{"level":"info","component":"alpine/Updater.Fetch","updater":"alpine-community-v3.8-updater","time":"2023-05-16T21:18:01Z","message":"database unchanged since last fetch"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"alpine-community-v3.8-updater","time":"2023-05-16T21:18:01Z","message":"vulnerability database unchanged"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"alpine-community-v3.8-updater","time":"2023-05-16T21:18:01Z","message":"finished update"}
{"level":"info","component":"alpine/Updater.Fetch","updater":"alpine-community-v3.7-updater","time":"2023-05-16T21:18:02Z","message":"database unchanged since last fetch"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"alpine-community-v3.7-updater","time":"2023-05-16T21:18:02Z","message":"vulnerability database unchanged"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"alpine-community-v3.7-updater","time":"2023-05-16T21:18:02Z","message":"finished update"}
{"level":"info","component":"alpine/Updater.Fetch","updater":"alpine-community-v3.12-updater","time":"2023-05-16T21:18:02Z","message":"database unchanged since last fetch"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"alpine-community-v3.12-updater","time":"2023-05-16T21:18:02Z","message":"vulnerability database unchanged"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"alpine-community-v3.12-updater","time":"2023-05-16T21:18:02Z","message":"finished update"}
{"level":"info","component":"alpine/Updater.Fetch","updater":"alpine-main-v3.11-updater","time":"2023-05-16T21:18:02Z","message":"database unchanged since last fetch"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"alpine-main-v3.11-updater","time":"2023-05-16T21:18:02Z","message":"vulnerability database unchanged"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"alpine-main-v3.11-updater","time":"2023-05-16T21:18:02Z","message":"finished update"}
{"level":"info","component":"alpine/Updater.Fetch","updater":"alpine-main-v3.7-updater","time":"2023-05-16T21:18:02Z","message":"database unchanged since last fetch"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"alpine-main-v3.7-updater","time":"2023-05-16T21:18:02Z","message":"vulnerability database unchanged"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"alpine-main-v3.7-updater","time":"2023-05-16T21:18:02Z","message":"finished update"}
{"level":"info","component":"alpine/Updater.Fetch","updater":"alpine-main-v3.3-updater","time":"2023-05-16T21:18:02Z","message":"database unchanged since last fetch"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"alpine-main-v3.3-updater","time":"2023-05-16T21:18:02Z","message":"vulnerability database unchanged"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"alpine-main-v3.3-updater","time":"2023-05-16T21:18:02Z","message":"finished update"}

Does this occur against a current release? 4.3 is pretty old.

Yes, I've just upgraded the Kubernetes deployment to the Clair 4.6.1 docker image and re-run clairctl after the pod was replaced with the new version, and I still got the same result:

$ ./clairctl -D report --host http://clair.clair.svc.cluster.local:8080 alpine
2023-05-16T23:22:50Z DBG fetching ref=alpine
2023-05-16T23:22:50Z DBG using text output
2023-05-16T23:22:50Z DBG found manifest digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine
2023-05-16T23:22:50Z DBG requesting index_report attempt=1 digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine
2023-05-16T23:22:51Z DBG digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda method=GET path=/indexer/api/v1/index_report/sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine status="200 OK"
2023-05-16T23:22:51Z DBG manifest may be out-of-date digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda manifest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine
2023-05-16T23:22:51Z DBG found manifest digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine
2023-05-16T23:22:51Z DBG found layers count=1 digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine
2023-05-16T23:22:51Z DBG requesting index_report attempt=2 digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine
2023-05-16T23:22:51Z DBG digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda method=GET path=/indexer/api/v1/index_report/sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine status="200 OK"
2023-05-16T23:22:53Z DBG digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda method=POST path=/indexer/api/v1/index_report ref=alpine status="201 Created"
2023-05-16T23:22:53Z DBG setting validator digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda path=/indexer/api/v1/index_report/sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine validator="\"72c02bd8d137de68c2a998932cc427a2\""
2023-05-16T23:22:53Z DBG digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda method=GET path=/matcher/api/v1/vulnerability_report/sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine status="200 OK"
alpine ok

Can you post server logs and the JSON output of clairctl?
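
Something along these lines would do (a rough sketch: the namespace/deployment names are assumptions based on the config above, and the exact JSON output option should be checked against clairctl report --help):

# capture the server logs around a fresh report run
# (sketch: assumes a "clair" deployment in the "clair" namespace)
kubectl -n clair logs deploy/clair --since=15m > clair-server.log

# re-run the report and keep the full client output (debug lines included)
./clairctl -D report --host http://clair.clair.svc.cluster.local:8080 alpine > clairctl-output.txt 2>&1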

This can also be reproduced by comparing against the vulnerabilities found by AWS ECR, which uses Clair under the hood.
In our setup we use Harbor, which scans with Trivy and then replicates the images to ECR, which scans on push.
The difference in findings is stark:
for the same image, Trivy finds 148 items while ECR/Clair reports only 12.

It's impossible to help any further without the relevant server logs and the JSON output.

As for ECR, I'd advise you to take it up with their support; we don't operate ECR.

I just tried to run this again quickly, but it looks like it hit an error:

$ kubectl port-forward svc/clair 8080:8080 &

$ clairctl -D report --host http://localhost:8080 alpine
2023-06-01T17:45:55+01:00 DBG fetching ref=alpine
2023-06-01T17:45:55+01:00 DBG using text output
2023-06-01T17:45:58+01:00 DBG found manifest digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine
2023-06-01T17:45:58+01:00 DBG requesting index_report attempt=1 digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine
Handling connection for 8080
2023-06-01T17:46:00+01:00 DBG digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda method=GET path=/indexer/api/v1/index_report/sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine status="404 Not Found"
2023-06-01T17:46:00+01:00 DBG don't have needed manifest digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda manifest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine
2023-06-01T17:46:00+01:00 DBG found manifest digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine
2023-06-01T17:46:00+01:00 DBG found layers count=1 digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine
2023-06-01T17:46:01+01:00 DBG requesting index_report attempt=2 digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine
Handling connection for 8080
2023-06-01T17:46:01+01:00 DBG digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda method=GET path=/indexer/api/v1/index_report/sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine status="404 Not Found"
Handling connection for 8080
2023-06-01T17:46:03+01:00 DBG digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda method=POST path=/indexer/api/v1/index_report ref=alpine status="201 Created"
2023-06-01T17:46:03+01:00 DBG setting validator digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda path=/indexer/api/v1/index_report/sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine validator="\"72c02bd8d137de68c2a998932cc427a2\""
E0601 17:46:03.091917   92963 portforward.go:391] error copying from local connection to remote stream: read tcp4 127.0.0.1:8080->127.0.0.1:56990: read: connection reset by peer
2023-06-01T17:46:03+01:00 DBG digest=sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda method=GET path=/matcher/api/v1/vulnerability_report/sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda ref=alpine status="404 Not Found"
alpine error alpine(sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda): unexpected return status: 404

kubectl logs

{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"RHEL8-openstack-17-including-unpatched","time":"2023-06-01T16:34:50Z","message":"finished update"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"debian/updater/jessie","time":"2023-06-01T16:34:50Z","message":"vulnerability database unchanged"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"debian/updater/jessie","time":"2023-06-01T16:34:50Z","message":"finished update"}
{"level":"info","updater":"debian/updater/stretch","component":"libvuln/updates/Manager.driveUpdater","time":"2023-06-01T16:34:50Z","message":"vulnerability database unchanged"}
{"level":"info","updater":"debian/updater/stretch","component":"libvuln/updates/Manager.driveUpdater","time":"2023-06-01T16:34:50Z","message":"finished update"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"debian/updater/wheezy","time":"2023-06-01T16:34:50Z","message":"starting update"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"debian/updater/bullseye","time":"2023-06-01T16:34:50Z","message":"starting update"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"debian/updater/buster","time":"2023-06-01T16:34:50Z","message":"starting update"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"debian/updater/wheezy","time":"2023-06-01T16:34:50Z","message":"vulnerability database unchanged"}
{"level":"info","updater":"debian/updater/wheezy","component":"libvuln/updates/Manager.driveUpdater","time":"2023-06-01T16:34:50Z","message":"finished update"}
{"level":"info","updater":"debian/updater/bullseye","component":"libvuln/updates/Manager.driveUpdater","time":"2023-06-01T16:34:51Z","message":"vulnerability database unchanged"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"debian/updater/bullseye","time":"2023-06-01T16:34:51Z","message":"finished update"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"debian/updater/buster","time":"2023-06-01T16:34:51Z","message":"vulnerability database unchanged"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"debian/updater/buster","time":"2023-06-01T16:34:51Z","message":"finished update"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"RHEL7-storage-ceph-3-including-unpatched","ref":"3c249929-a060-4ed0-a92a-6f79c1063246","time":"2023-06-01T16:34:52Z","message":"successful update"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"RHEL7-storage-ceph-3-including-unpatched","time":"2023-06-01T16:34:52Z","message":"finished update"}
{"level":"info","updater":"RHEL8-rhel-8-including-unpatched","component":"rhel/Updater.Parse","time":"2023-06-01T16:34:52Z","message":"starting parse"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"rhel-container-updater","ref":"64aaaf5c-af98-4105-9d0d-d228e9d0ab0f","time":"2023-06-01T16:35:38Z","message":"successful update"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"rhel-container-updater","time":"2023-06-01T16:35:38Z","message":"finished update"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"RHEL7-storage-gluster-3-including-unpatched","ref":"04259941-cc4f-4a8b-a657-c6d233e1058f","time":"2023-06-01T16:35:50Z","message":"successful update"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"RHEL7-storage-gluster-3-including-unpatched","time":"2023-06-01T16:35:50Z","message":"finished update"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"RHEL9-rhel-9-including-unpatched","ref":"5f01296d-26b3-4106-bb08-2b8816ecb2f3","time":"2023-06-01T16:36:59Z","message":"successful update"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"RHEL9-rhel-9-including-unpatched","time":"2023-06-01T16:36:59Z","message":"finished update"}
{"level":"info","updater":"RHEL6-rhel-6-including-unpatched","component":"libvuln/updates/Manager.driveUpdater","ref":"57e8fd00-0d64-4b8c-b3f2-b371a2d23130","time":"2023-06-01T16:37:54Z","message":"successful update"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"RHEL6-rhel-6-including-unpatched","time":"2023-06-01T16:37:54Z","message":"finished update"}
{"level":"info","updater":"RHEL7-rhel-7-including-unpatched","component":"libvuln/updates/Manager.driveUpdater","ref":"df731e8e-1229-4ef7-ae95-aa67c29394f0","time":"2023-06-01T16:39:06Z","message":"successful update"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"RHEL7-rhel-7-including-unpatched","time":"2023-06-01T16:39:06Z","message":"finished update"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"RHEL8-rhel-8-including-unpatched","ref":"ce305f90-5500-40aa-959a-80b3b9d447aa","time":"2023-06-01T16:40:20Z","message":"successful update"}
{"level":"info","component":"libvuln/updates/Manager.driveUpdater","updater":"RHEL8-rhel-8-including-unpatched","time":"2023-06-01T16:40:20Z","message":"finished update"}
{"level":"info","component":"libvuln/updates/Manager.Run","retention":10,"time":"2023-06-01T16:40:20Z","message":"GC started"}
{"level":"info","component":"httptransport/New","request_id":"193a9932ac6df8df","remote_addr":"127.0.0.1:56574","method":"GET","request_uri":"/indexer/api/v1/index_report/sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda","status":404,"duration":2141.806282,"time":"2023-06-01T16:46:00Z","message":"handled HTTP request"}
{"level":"info","component":"httptransport/New","request_id":"c2f192cdcce9c801","remote_addr":"127.0.0.1:56584","method":"GET","request_uri":"/indexer/api/v1/index_report/sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda","status":404,"duration":1.338389,"time":"2023-06-01T16:46:01Z","message":"handled HTTP request"}
{"level":"info","manifest":"sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda","request_id":"12d7bc91d06d8de3","component":"libindex/Libindex.Index","time":"2023-06-01T16:46:01Z","message":"index request start"}
{"level":"info","request_id":"12d7bc91d06d8de3","component":"indexer/controller/Controller.Index","manifest":"sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda","time":"2023-06-01T16:46:01Z","message":"starting scan"}
{"level":"info","manifest":"sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda","state":"CheckManifest","request_id":"12d7bc91d06d8de3","component":"indexer/controller/Controller.Index","error":"failed to upsert index report: ERROR: null value in column \"manifest_id\" of relation \"indexreport\" violates not-null constraint (SQLSTATE 23502)","time":"2023-06-01T16:46:03Z","message":"failed persisting index report"}
{"level":"info","request_id":"12d7bc91d06d8de3","component":"libindex/Libindex.Index","manifest":"sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda","time":"2023-06-01T16:46:03Z","message":"index request done"}
{"level":"info","request_id":"12d7bc91d06d8de3","component":"httptransport/New","remote_addr":"127.0.0.1:56594","method":"POST","request_uri":"/indexer/api/v1/index_report","status":201,"duration":1733.370372,"time":"2023-06-01T16:46:03Z","message":"handled HTTP request"}
{"level":"info","component":"httptransport/New","request_id":"e5902acd0f66b763","remote_addr":"127.0.0.1:56594","method":"GET","request_uri":"/matcher/api/v1/vulnerability_report/sha256:c0669ef34cdc14332c0f1ab0c2c01acb91d96014b172f1a76f3a39e63d1f0bda","status":404,"duration":570.820485,"time":"2023-06-01T16:46:03Z","message":"handled HTTP request"}

It looks like there's something up with the indexer, but the provided logs don't offer any clues as to why, other than the database error, and at a glance I don't see how that error could happen. Debug logs filtered down to a single request_id would be most helpful.
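
For example, since the server logs are JSON lines with a request_id field, something like this would isolate the failing request (a sketch: the namespace/deployment names are assumptions, the request_id is the one from the failing POST above, and log_level would need to be raised to debug in the config to get full detail):

# pull only the log lines belonging to the failing index request
# (sketch: assumes a "clair" deployment in the "clair" namespace and that jq is available)
kubectl -n clair logs deploy/clair | jq -c 'select(.request_id == "12d7bc91d06d8de3")'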

Closing due to age.