fahedouch commented on issue #34244:
URL: https://github.com/apache/airflow/issues/34244#issuecomment-4199512705

   Hey @potiuk,
   
   This is a problem I've been spending time on — specifically the "hundreds of 
CVEs, not enough people to assess them all" part. The approach I've been 
exploring is using the system's own context (how it's deployed, what data it 
handles, what controls are in place) to score each CVE's actual relevance, 
rather than relying on raw CVSS alone.
   
   For a project like Airflow where you have 700+ dependencies, I think the 
realistic workflow is two-tiered:
   
   1. **Automated triage** — a tool scores every CVE against the deployment 
context and flags the ones that are likely relevant. This cuts the manual 
review pile from hundreds to maybe 20-30.
   2. **Human assessment** — maintainers (or security-minded contributors) 
review only the flagged subset and produce authoritative VEX statements.
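   To make the scoring idea in step 1 concrete, here's a toy sketch of context-aware triage. This is my own illustration of the general technique, not vens's actual model — the factor names, weights, and threshold are all assumptions:

   ```python
   # Illustrative only: a toy context-aware triage score, NOT vens's real scoring model.
   from dataclasses import dataclass

   @dataclass
   class Context:
       internet_exposed: bool   # is the component reachable from untrusted networks?
       component_in_use: bool   # is the vulnerable package actually imported/executed?
       sensitive_data: bool     # does the deployment handle sensitive data?

   def triage_score(cvss_base: float, ctx: Context) -> float:
       """Scale a raw CVSS base score (0-10) by deployment context.

       Unreachable or unused components are heavily discounted, so most
       findings fall below the review threshold and only a small subset
       gets flagged for human assessment.
       """
       score = cvss_base
       if not ctx.component_in_use:
           score *= 0.1         # vulnerable code path never executes
       if not ctx.internet_exposed:
           score *= 0.5         # no untrusted attack surface
       if ctx.sensitive_data:
           score *= 1.2         # higher impact if exploited
       return min(score, 10.0)

   REVIEW_THRESHOLD = 4.0       # findings above this go to a human

   # A "critical" CVE in an unused transitive dependency drops out of the pile:
   unused = Context(internet_exposed=True, component_in_use=False, sensitive_data=True)
   print(triage_score(9.8, unused))  # well below the threshold
   ```

   The point is the shape of the computation, not the specific weights: the same CVSS 9.8 finding lands on either side of the threshold depending on context.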
   
   I've built an open-source tool called 
[vens](https://github.com/venslabs/vens) that handles step 1 — it takes a Trivy 
or Grype report plus a YAML file describing the system context, and outputs 
CycloneDX VEX with risk scores. It wouldn't replace the human step you 
described, but it could dramatically reduce the surface area maintainers need 
to look at.
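   For reference, the kind of authoritative statement step 2 would produce looks like this in CycloneDX VEX (the `analysis.state` and `justification` values are from the CycloneDX spec; the CVE id and detail text are placeholders):

   ```json
   {
     "bomFormat": "CycloneDX",
     "specVersion": "1.5",
     "vulnerabilities": [
       {
         "id": "CVE-2023-XXXXX",
         "analysis": {
           "state": "not_affected",
           "justification": "code_not_reachable",
           "detail": "Vulnerable function is never invoked by Airflow's usage of this dependency."
         }
       }
     ]
   }
   ```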
   
   Would it be useful if I ran it against a recent Airflow image and shared the 
results here? That would give a concrete sense of how many CVEs get filtered 
down and whether the triage quality is good enough to be worth integrating into 
your workflow.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
