Manufacturers of products with digital elements shall:
(1) identify and document vulnerabilities and components contained in products with digital elements, including by drawing up a software bill of materials in a commonly used and machine-readable format covering at the very least the top-level dependencies of the products;
(2) in relation to the risks posed to products with digital elements, address and remediate vulnerabilities without delay, including by providing security updates; where technically feasible, new security updates shall be provided separately from functionality updates;
(3) apply effective and regular tests and reviews of the security of the product with digital elements;
(4) once a security update has been made available, share and publicly disclose information about fixed vulnerabilities, including a description of the vulnerabilities, information allowing users to identify the product with digital elements affected, the impacts of the vulnerabilities, their severity and clear and accessible information helping users to remediate the vulnerabilities; in duly justified cases, where manufacturers consider the security risks of publication to outweigh the security benefits, they may delay making public information regarding a fixed vulnerability until after users have been given the possibility to apply the relevant patch;
(5) put in place and enforce a policy on coordinated vulnerability disclosure;
(6) take measures to facilitate the sharing of information about potential vulnerabilities in their product with digital elements as well as in third-party components contained in that product, including by providing a contact address for the reporting of the vulnerabilities discovered in the product with digital elements;
(7) provide for mechanisms to securely distribute updates for products with digital elements to ensure that vulnerabilities are fixed or mitigated in a timely manner and, where applicable for security updates, in an automatic manner;
(8) ensure that, where security updates are available to address identified security issues, they are disseminated without delay and, unless otherwise agreed between a manufacturer and a business user in relation to a tailor-made product with digital elements, free of charge, accompanied by advisory messages providing users with the relevant information, including on potential action to be taken.
This requirement mandates transparency and traceability of product composition and its known vulnerabilities.
It requires manufacturers to:
Identify all components, including open-source, third-party, and proprietary ones.
Document these components in a Software Bill of Materials (SBOM) — a formal, structured record similar to an ingredient list for software.
Keep an up-to-date inventory of components and their vulnerabilities to enable faster response when new threats emerge.
Use standardized, machine-readable formats (e.g., SPDX, CycloneDX) so SBOMs can be shared, analyzed, and integrated across tools.
In essence, this clause enforces software supply chain visibility — knowing what’s inside your product, monitoring its weaknesses, and proving it through an auditable SBOM.
Organizational Actions
Define a Software Composition and Vulnerability Management Policy outlining:
Roles and responsibilities (Security, Engineering, Compliance).
SBOM generation frequency and storage location.
Processes for identifying and remediating component vulnerabilities.
Appoint ownership:
Engineering – maintains SBOM and integrates generation tools.
Security – monitors vulnerabilities and coordinates remediation.
Compliance – ensures SBOM availability for audits or customers.
Integrate SBOM into the SSDLC, ensuring each release includes:
Component inventory.
Known vulnerability references (CVEs).
Licensing information (for open source).
Conduct periodic validation audits to ensure SBOM completeness and accuracy.
Policy / Process Updates
Update the SSDLC Policy to include SBOM generation as part of the build and release phase.
Include a requirement in the Supply Chain Software Evaluation Procedure that all third-party libraries must appear in the SBOM.
Update Vulnerability Management Policy to require mapping vulnerabilities to SBOM components.
Define retention and update cycles for SBOMs (e.g., regenerate after every new release or patch).
Mandate that SBOMs are machine-readable and exportable in SPDX or CycloneDX formats.
Technical Implementations
SBOM Generation and Management
Integrate automated tools in CI/CD pipelines such as:
Syft (open-source) or Anchore for SBOM generation.
OWASP Dependency-Track or Snyk for continuous vulnerability tracking.
Include at least top-level dependencies, but aim to expand to transitive dependencies for higher maturity.
Store SBOMs in version-controlled repositories with metadata (build ID, version, timestamp).
Enable API-based access to SBOMs for authorized parties (auditors, customers).
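For illustration, a minimal sketch of an automated SBOM build step, assuming the Syft CLI is available on the build agent (output-flag syntax may vary by Syft version) and that the source directory, build ID, and output location are placeholders:

    import json
    import subprocess
    from datetime import datetime, timezone
    from pathlib import Path

    def generate_sbom(source_dir: str, build_id: str, version: str, out_dir: str = "sboms") -> Path:
        """Generate a CycloneDX SBOM with Syft and store it alongside build metadata."""
        out_path = Path(out_dir)
        out_path.mkdir(parents=True, exist_ok=True)
        sbom_file = out_path / f"sbom-{version}-{build_id}.cdx.json"

        # Invoke the Syft CLI (assumed to be on PATH) and write CycloneDX JSON.
        subprocess.run(
            ["syft", source_dir, "-o", f"cyclonedx-json={sbom_file}"],
            check=True,
        )

        # Record build metadata next to the SBOM so it can be traced back to a release.
        metadata = {
            "build_id": build_id,
            "version": version,
            "generated_at": datetime.now(timezone.utc).isoformat(),
            "sbom_file": sbom_file.name,
        }
        (out_path / f"sbom-{version}-{build_id}.meta.json").write_text(json.dumps(metadata, indent=2))
        return sbom_file

The resulting files would then be committed or archived in the version-controlled SBOM repository described above.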
Vulnerability Identification
Link SBOM components with vulnerability sources like:
NVD (National Vulnerability Database)
OSV (Open Source Vulnerability Database)
Vendor security advisories
Automate vulnerability scans using SCA tools (e.g., Dependency-Check, Snyk, Mend, or Black Duck).
Correlate findings with product risk levels to prioritize remediation.
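A minimal sketch of linking one SBOM component to a vulnerability source, here the public OSV API; the package name, ecosystem, and version below are illustrative:

    import json
    import urllib.request

    def query_osv(name: str, version: str, ecosystem: str = "PyPI") -> list[dict]:
        """Return known OSV vulnerability records for a single component version."""
        payload = json.dumps({
            "package": {"name": name, "ecosystem": ecosystem},
            "version": version,
        }).encode()
        req = urllib.request.Request(
            "https://api.osv.dev/v1/query",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp).get("vulns", [])

    # Example: check one dependency taken from the SBOM.
    for vuln in query_osv("requests", "2.19.1"):
        print(vuln["id"], vuln.get("summary", ""))

In practice the loop would iterate over every component in the SBOM and feed the results into the SCA or vulnerability-tracking tool of choice.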
Documentation Requirements
Software Composition and Vulnerability Management Policy
Describes SBOM creation, ownership, and maintenance processes.
SBOM Files
Machine-readable files (SPDX/CycloneDX) containing component inventories.
Release Notes / Change Logs
Include SBOM reference and summary of newly added/removed components.
Vulnerability Management Report
Links SBOM components to discovered CVEs and remediation status.
Tool Configuration Records
CI/CD scripts or screenshots proving automation for SBOM generation.
Audit Reports
Evidence of periodic SBOM verification and review.
Common Pitfalls and Readiness Gaps
SBOMs generated manually or inconsistently.
Missing transitive (nested) dependencies.
Using non-standard formats (making them unreadable by security tools).
No linkage between SBOM components and vulnerability tracking tools.
Lack of version control or history of SBOM changes.
SBOMs not regenerated after each release or patch.
Absence of ownership — no one accountable for updates or validation.
Tools and Frameworks to Help
SBOM Generation: Syft, Anchore, SPDX tools, CycloneDX CLI
Vulnerability Tracking: OWASP Dependency-Track, Snyk, Mend, Black Duck
CI/CD Integration: GitHub Actions, GitLab CI, Jenkins plugins for SBOM
Standards / Formats: SPDX (ISO/IEC 5962:2021), CycloneDX, SWID Tags
Governance / Frameworks: NIST SSDF (SP 800-218), ENISA Software Supply Chain Security Framework
This requirement ensures software supply chain transparency and accountability.
To comply:
Maintain a comprehensive, machine-readable SBOM.
Automate SBOM generation and vulnerability mapping in the CI/CD pipeline.
Assign clear ownership and verification responsibility.
Keep documentation and evidence ready for audits.
Properly implemented, this builds customer trust, simplifies incident response, and fulfills CRA’s core goal of ensuring secure, traceable, and trustworthy software components.
This requirement mandates timely and effective vulnerability remediation through a structured and transparent patch management process.
In simple terms, manufacturers must:
Detect, assess, and fix vulnerabilities quickly based on the severity and potential impact.
Deliver security patches promptly, independent of feature releases, to minimize exposure time.
Avoid coupling security fixes with functional updates (e.g., new features), since bundling them delays remediation and increases exposure.
Establish an organizational process that prioritizes patch delivery and tracks remediation progress.
In essence, this clause enforces a continuous vulnerability response capability — ensuring that known weaknesses are addressed without operational or bureaucratic delay.
Organizational Actions
Define a Vulnerability Remediation and Patch Management Policy outlining:
Timeframes for patching based on risk severity (e.g., critical – 7 days, high – 14 days, medium – 30 days).
Ownership and escalation procedures for delayed patches.
Testing, approval, and rollback procedures for updates.
Clear separation of security updates from feature releases.
Assign clear responsibilities:
Security Team – performs vulnerability triage, severity classification, and risk analysis.
Engineering Team – develops and validates security patches.
Product Management / Release Management – ensures timely distribution of patches to customers.
Implement a defined approval and release flow for emergency security updates (out-of-band patches).
Track and report patch SLAs to ensure measurable accountability and continuous improvement.
Policy / Process Updates
Update the SSDLC Maintenance Phase to explicitly require timely security patching independent of functionality updates.
Include vulnerability remediation timelines within the Vulnerability Management Policy.
Establish a Patch Release Procedure that:
Allows for emergency patches outside normal release cycles.
Ensures separate packaging and signing of security patches.
Define risk-based prioritization aligned with CVSS scores and internal Business Impact Assessment (BIA).
Require periodic vulnerability review meetings to track outstanding fixes and exceptions.
Integrate this process into the Product Security Incident Response Process (PSIRP) for coordinated vulnerability response.
Technical Implementations
Automated Vulnerability Tracking and Prioritization
Integrate SCA, SAST, and DAST tools into CI/CD pipelines for early detection.
Use centralized vulnerability dashboards (e.g., Dependency-Track, Jira Security workflows).
Automatically correlate vulnerabilities with SBOM components for traceability.
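As an illustration of risk-based prioritization, a small sketch that derives remediation due dates from CVSS severity, reusing the example SLA windows from the policy section above (the 7/14/30-day figures are illustrative, and the 90-day value for low severity is an added placeholder):

    from datetime import date, timedelta

    # Example SLA windows, mirroring the illustrative figures in the policy section.
    SLA_DAYS = {"critical": 7, "high": 14, "medium": 30, "low": 90}

    def severity_from_cvss(score: float) -> str:
        """Map a CVSS v3 base score onto the SLA severity buckets."""
        if score >= 9.0:
            return "critical"
        if score >= 7.0:
            return "high"
        if score >= 4.0:
            return "medium"
        return "low"

    def remediation_due(found_on: date, cvss_score: float) -> date:
        return found_on + timedelta(days=SLA_DAYS[severity_from_cvss(cvss_score)])

    print(remediation_due(date(2025, 1, 10), 9.8))  # -> 2025-01-17 for a critical finding

The computed due date can be attached to the tracking ticket so overdue items surface automatically in the dashboard.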
Secure Update Management
Separate security patch pipelines from general release pipelines to enable rapid delivery.
Digitally sign all security updates and verify authenticity at installation.
Ensure backward compatibility testing and rollback functionality for emergency patches.
Maintain versioning and changelogs distinguishing between:
Security Updates (risk mitigation)
Feature Enhancements (new functionality).
Telemetry and Monitoring
Implement telemetry to track update success rates and adoption among users.
Collect anonymized metrics on patch application status to identify lagging systems.
Establish alerting for overdue patch deployments or unresolved vulnerabilities.
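A minimal sketch of the adoption metric, assuming anonymized telemetry records containing a device identifier, installed version, and last-seen date (the data shape is hypothetical):

    from datetime import date

    # Hypothetical telemetry records: (device_id, installed_version, last_seen)
    FLEET = [
        ("dev-001", "2.4.1", date(2025, 3, 1)),
        ("dev-002", "2.4.0", date(2025, 3, 1)),
        ("dev-003", "2.4.1", date(2025, 2, 27)),
    ]

    def adoption_rate(records, patched_version: str) -> float:
        """Share of reporting devices already running the patched version."""
        patched = sum(1 for _, version, _ in records if version == patched_version)
        return patched / len(records) if records else 0.0

    def lagging_devices(records, patched_version: str):
        """Devices to flag for follow-up or alerting."""
        return [device for device, version, _ in records if version != patched_version]

    print(f"Adoption: {adoption_rate(FLEET, '2.4.1'):.0%}")   # e.g. 67%
    print("Lagging:", lagging_devices(FLEET, "2.4.1"))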
Documentation Requirements
Vulnerability Remediation & Patch Management Policy: Defines timelines, ownership, and escalation for vulnerability fixes.
Patch Release Procedure: Step-by-step process for developing, validating, and releasing security patches independently.
Vulnerability Tracking Log: Records vulnerabilities, severity, status, and patch release dates.
Risk Assessment Reports: Maps vulnerabilities to business impact and risk treatment decisions.
Change Management Records: Evidence of testing, approval, and deployment of security updates.
Release Notes: Clearly distinguish between functionality updates and security patches.
Audit Trail: Historical log of vulnerability detections, fixes, and communication.
Common Pitfalls and Readiness Gaps
Security fixes delayed because they’re bundled with feature releases.
Lack of defined SLAs for patch timelines.
Absence of emergency (out-of-band) patch release processes.
Patches released without digital signing or integrity verification.
No evidence or audit trail of patch distribution.
Customers not informed of available security updates or their criticality.
Poor coordination between development, QA, and security teams leading to bottlenecks.
Tools and Frameworks to Help
Vulnerability Tracking: Jira Security, OWASP Dependency-Track, Snyk, Mend
Patch Automation: Jenkins, GitHub Actions, GitLab CI, AWS CodePipeline
Update Signing / Verification: Sigstore, Cosign, GPG, Notary v2
Compliance & Governance: NIST SP 800-40 Rev. 4 (Guide to Enterprise Patch Management Planning), ISO/IEC 30111 (Vulnerability Handling), ISO/IEC 27034 (Application Security)
Monitoring & Reporting: Grafana, Prometheus, ELK Stack, Vanta or Drata dashboards
This requirement ensures rapid and transparent vulnerability remediation to protect users and maintain product integrity.
To comply:
Establish a formal patch management policy with defined SLAs and ownership.
Separate security patches from feature updates for faster response.
Automate vulnerability tracking, testing, and delivery in CI/CD.
Maintain signed, verifiable update packages and audit records.
When implemented effectively, this requirement transforms vulnerability response from a reactive activity into a proactive resilience process, reinforcing user trust and CRA compliance readiness.
This requirement mandates continuous, systematic, and verifiable security testing throughout the product lifecycle — not just before release.
Manufacturers must ensure that products with digital elements are:
Tested regularly to identify new vulnerabilities introduced through updates, dependencies, or environmental changes.
Reviewed effectively through structured methodologies (e.g., code reviews, threat modeling, penetration testing).
Continuously validated against evolving threats and attack techniques.
In essence, this clause enforces proactive security assurance — turning security testing from a one-time activity into an ongoing process that evolves with the product.
Organizational Actions
Establish a Security Testing Program aligned with product lifecycle stages:
Pre-release: static/dynamic testing, penetration testing, fuzzing.
Post-release: regression testing, patch verification, red-team simulations.
Define testing cadence based on risk and criticality:
Critical systems – quarterly penetration tests.
Standard products – at least annually or with each major update.
Assign ownership:
Security Engineering Team – defines testing strategy and tools.
QA / Test Automation Teams – execute recurring validation.
External Auditors or Red Teams – perform independent assessments.
Integrate security testing results into vulnerability management workflow for continuous tracking and remediation.
Implement a metrics-based validation framework — measuring defect density, remediation time, and recurring issue trends.
Policy / Process Updates
Update the SSDLC Policy to require security testing at every phase (design → deployment → maintenance).
Define a Security Testing and Validation Procedure that includes:
Types of testing (SAST, DAST, IAST, SCA, fuzzing, penetration testing).
Test coverage requirements.
Testing frequency and ownership.
Evidence retention for auditability.
Integrate continuous testing into CI/CD pipelines using automation tools.
Require independent third-party testing for high-risk products before release.
Update the Vulnerability Management Policy to include how test findings are classified and remediated.
Introduce a Post-Deployment Review Policy for verifying that patches or new releases have not introduced regressions.
Technical Implementations
Automated and Manual Security Testing
SAST (Static Application Security Testing): Integrate tools (e.g., Semgrep, SonarQube, Checkmarx) for code-level scanning.
DAST (Dynamic Application Security Testing): Simulate runtime attacks (e.g., OWASP ZAP, Burp Suite).
SCA (Software Composition Analysis): Detect vulnerabilities in dependencies (e.g., Snyk, Mend, Dependency-Track).
Fuzz Testing: Use automated fuzzers (e.g., AFL, libFuzzer, BooFuzz) to uncover edge-case vulnerabilities.
Penetration Testing: Perform both internal (white-box) and external (black-box) tests.
Regression and Revalidation: After every security patch or major update, rerun relevant test suites.
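For Python components, a minimal fuzz-harness sketch using Atheris (Google's coverage-guided fuzzer for Python) illustrates the fuzz-testing item above; parse_config is a hypothetical stand-in for the real target function:

    import sys
    import atheris

    def parse_config(data: bytes) -> dict:
        """Hypothetical parser under test; replace with the real target function."""
        text = data.decode("utf-8", errors="ignore")
        return dict(line.split("=", 1) for line in text.splitlines() if "=" in line)

    def test_one_input(data: bytes) -> None:
        # The fuzzer feeds arbitrary bytes; the target must not crash or hang.
        parse_config(data)

    if __name__ == "__main__":
        atheris.instrument_all()              # enable coverage instrumentation
        atheris.Setup(sys.argv, test_one_input)
        atheris.Fuzz()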
Secure Review Mechanisms
Conduct peer code reviews following Code Review Guidelines, focusing on security logic and data handling.
Integrate threat modeling reviews during design and post-release (e.g., STRIDE or PASTA methodologies).
Establish automated test gates in CI/CD pipelines, preventing release of code failing defined security thresholds.
Implement continuous monitoring and testing through runtime security tools (e.g., WAFs, RASP, or runtime analyzers).
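A sketch of the automated test gate mentioned above: a pipeline step that fails the build when high-severity findings exceed a defined threshold. It assumes the scanner exports a JSON report with a severity field per finding; the exact schema varies by tool, so the field lookups below are illustrative normalization:

    import json
    import sys

    # Findings at or above these severities block the release; the threshold is a policy choice.
    BLOCKING_SEVERITIES = {"CRITICAL", "HIGH", "ERROR"}
    MAX_BLOCKING_FINDINGS = 0

    def gate(report_path: str) -> int:
        with open(report_path) as fh:
            findings = json.load(fh).get("results", [])
        blocking = [
            f for f in findings
            if str(f.get("severity", f.get("extra", {}).get("severity", ""))).upper()
            in BLOCKING_SEVERITIES
        ]
        for f in blocking:
            print(f"BLOCKING: {f.get('check_id', f.get('id', 'unknown'))}")
        return 1 if len(blocking) > MAX_BLOCKING_FINDINGS else 0

    if __name__ == "__main__":
        sys.exit(gate(sys.argv[1]))  # non-zero exit fails the CI job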
Testing Infrastructure
Maintain isolated test environments mirroring production setups.
Automate test execution scheduling using CI/CD orchestrators.
Enforce test result logging and integrity verification for audits.
Documentation Requirements
Security Testing Policy / Procedure
Defines required testing types, frequency, and scope across product lifecycle.
Testing Schedule and Plan
Documents planned test frequency, responsible teams, and timelines.
Test Execution Reports
Detailed results of SAST/DAST/SCA, fuzzing, and penetration testing.
Vulnerability Remediation Reports
Evidence that findings were addressed and retested.
Threat Modeling Records
Capture evolving threat landscape and design responses.
Post-Release Review Reports
Evaluate product security posture after deployment or updates.
Audit Logs and Metrics
Quantitative data on test coverage, frequency, and issue closure times.
Common Pitfalls and Readiness Gaps
Treating penetration testing as a one-time compliance exercise.
Absence of recurring testing schedule post-deployment.
Reliance on manual testing only — no automation or integration into CI/CD.
Failure to revalidate after patches or major feature changes.
Lack of clear ownership for testing responsibilities.
Missing traceability from test findings to remediation closure.
Tools and Frameworks to Help
SAST: SonarQube, Semgrep, Fortify, Checkmarx
DAST: OWASP ZAP, Burp Suite, Netsparker
SCA: Snyk, Mend, Dependency-Track
Fuzzing: AFL, BooFuzz, libFuzzer
Penetration Testing: Metasploit, Cobalt Strike (controlled use), Kali Linux
Automation / CI-CD Integration: Jenkins, GitLab CI, GitHub Actions
Frameworks / Standards: OWASP SAMM, NIST SP 800-115 (Technical Testing), ISO/IEC 27034-1 (Application Security)
This requirement enforces ongoing validation of product security posture — ensuring vulnerabilities are caught and remediated early and consistently.
To comply:
Conduct automated and manual security testing regularly.
Integrate testing into CI/CD to ensure continuous security assurance.
Maintain testing records, reports, and metrics for CRA audits.
Use independent or third-party assessments for critical releases.
When implemented effectively, this transforms testing from a compliance checkbox into a continuous trust-building mechanism, ensuring the product remains resilient and compliant throughout its lifecycle.
This requirement establishes transparency and responsible disclosure obligations.
Manufacturers must inform users and the public about fixed vulnerabilities—ensuring users can understand:
What was vulnerable,
How severe it was,
Whether their product is affected, and
What steps they must take to remain secure.
It also introduces a risk-based exception: disclosure can be delayed only if immediate publication could increase exploitation risk before users patch their systems.
In short, the CRA expects manufacturers to adopt a structured, transparent, and responsible vulnerability disclosure policy (VDP) that balances security benefit vs. exploitation risk.
Organizational Actions
Establish a Vulnerability Disclosure Policy (VDP):
Define how and when vulnerability information is disclosed publicly.
Align with ISO/IEC 29147 (Vulnerability Disclosure) and ISO/IEC 30111 (Vulnerability Handling Process).
Define Clear Disclosure Timelines:
Information about fixed vulnerabilities must be disclosed promptly once the security update is released, unless publication poses undue risk.
Delays should be formally justified and time-limited (e.g., until 80–90% patch adoption).
Define Roles and Responsibilities:
Product Security Team → prepares advisories, severity ratings, and patch guidance.
Communications / PR Team → manages public messaging and press releases.
Legal / Compliance → reviews wording and risk-based justification for any disclosure delays.
Develop a Standard Template for Security Advisories (a worked example follows this list):
Include:
Product name and version affected
CVE identifier (if applicable)
Vulnerability description (high-level, non-exploitative)
CVSS or severity rating
Impact summary
Recommended remediation (patch or configuration)
References or links to additional resources
Coordinate Disclosure:
Share information with affected partners, integrators, and supply chain entities before public release.
Publish advisories via the manufacturer’s security page, RSS feed, and/or trusted vulnerability databases (e.g., NVD, CERTs).
Notify national authorities or CSIRTs, as required under CRA and national transpositions.
Maintain a Disclosure Log:
Track all vulnerability advisories, including date of release, disclosure decision (immediate or delayed), and justification.
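As the worked example referenced above, the advisory template fields can be captured as a structured record so advisories render consistently across channels; every value shown is an illustrative placeholder:

    from dataclasses import dataclass, field

    @dataclass
    class SecurityAdvisory:
        product: str
        affected_versions: str
        fixed_version: str
        cve_id: str | None
        severity: str            # e.g. a CVSS v3.1 rating
        description: str         # high-level, non-exploitative
        impact: str
        remediation: str
        references: list[str] = field(default_factory=list)

    advisory = SecurityAdvisory(
        product="ExampleGateway",
        affected_versions="<= 2.3.4",
        fixed_version="2.3.5",
        cve_id="CVE-2025-XXXXX",          # placeholder identifier
        severity="High (CVSS 8.1)",
        description="Improper input validation in the admin API.",
        impact="Remote attacker may bypass authentication checks.",
        remediation="Upgrade to 2.3.5 or apply the vendor hotfix.",
        references=["https://security.example.com/advisories/2025-001"],
    )

From such a record the same content can be rendered as a web advisory, an email notification, or machine-readable metadata.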
Policy / Process Updates
Vulnerability Disclosure Policy:
Defines public communication principles, timing, roles, and exceptions.
Specifies criteria for delayed disclosure and approval authority.
Security Advisory Publication Procedure:
Outlines advisory content, review process, distribution channels, and notification mechanisms.
Incident Response & Coordination Policy:
Integrate the disclosure process into the broader vulnerability handling workflow (detect → fix → disclose).
Risk Assessment Procedure for Delayed Disclosure:
Establish criteria to assess when publication poses more harm than benefit (e.g., active exploitation risk, incomplete patch rollout).
Record Keeping / Audit Trail:
Maintain evidence of disclosures, justifications, and timing to demonstrate regulatory compliance.
Technical and Communication Implementation
Disclosure Channels
Dedicated Security Advisory Portal: A public webpage or RSS feed (e.g., security.example.com/advisories).
Email Alerts / Mailing Lists: Notify registered users or partners directly.
Machine-readable Format: Provide advisory metadata in formats like CSAF (Common Security Advisory Framework).
CVE Assignment: Request or assign CVEs to ensure traceability and industry visibility.
Disclosure Workflow Example
Vulnerability confirmed and patched internally.
CVE requested or assigned.
Advisory drafted with vulnerability ID, impact, fix instructions, and timeline.
Internal legal and technical review.
Advisory released publicly (unless delayed for justified risk reasons).
Monitor for exploitation and patch adoption metrics.
Delayed Disclosure Criteria
Disclosure may be postponed only if:
Public release would provide attackers with exploitable details before users can apply patches.
Patch deployment among users is still low, and early disclosure increases risk of exploitation.
Ongoing coordinated disclosure with external researchers or CERTs requires synchronization.
All delays must:
Be documented,
Include risk rationale, and
Be limited in duration.
Documentation Requirements
Vulnerability Disclosure Policy
Outlines disclosure principles, timing, and exceptions.
Security Advisory Template
Standard format for communicating fixed vulnerabilities.
Disclosure Log
Records advisory release dates, severity, and justification for delays.
CVE Management Procedure
Defines how CVEs are requested and tracked.
Risk Assessment for Delayed Disclosure
Evidence supporting any non-immediate publication.
Public Security Portal Evidence
URLs, screenshots, or communication proof of advisories published.
Metrics Reports
Patch adoption rate, time-to-disclose, time-to-notify.
Common Pitfalls and Readiness Gaps
No formal or published vulnerability disclosure policy.
Inconsistent advisory formats or missing severity ratings.
No CVE assignment, reducing credibility and traceability.
Publishing advisories only to limited audiences (not public).
Over-sharing exploit details, increasing attacker knowledge.
Delaying disclosure without documented justification or defined timeline.
No evidence of user notification (email, RSS, or dashboard).
Tools, Frameworks, and Best Practices
Disclosure Standards: ISO/IEC 29147, ISO/IEC 30111, FIRST PSIRT Services Framework
Advisory Formats: CSAF (Common Security Advisory Framework)
CVE Management: MITRE CVE Program, CNA membership
Publication Platforms: Security advisory webpage, GitHub Security Advisories, RSS feeds
Risk Scoring Tools: CVSS calculator (FIRST), EPSS (Exploit Prediction Scoring System)
Patch Management Systems: Jira, ServiceNow, or custom PSIRT dashboards
This requirement establishes transparency, accountability, and trust between manufacturers and users.
It ensures that users are informed, empowered, and able to act to protect their systems, while also mandating responsible communication to prevent premature exposure.
To comply:
Establish and publish a Vulnerability Disclosure Policy.
Provide detailed, structured advisories for every fixed vulnerability.
Use CVE/CSAF for standardization.
Delay disclosure only under formal, documented, and justified conditions.
When properly implemented, this obligation strengthens user trust, regulatory confidence, and product credibility across the EU market.
This requirement obliges manufacturers to establish, publish, and maintain a Coordinated Vulnerability Disclosure (CVD) policy, ensuring that:
Security researchers and external parties have a clear, safe, and legal channel to report vulnerabilities responsibly.
Manufacturers acknowledge, assess, and respond to such reports promptly and transparently.
Vulnerability information is handled in a structured and cooperative way — protecting users while avoiding premature public disclosure.
In essence, this is the CRA’s way of enforcing the “responsible disclosure” model, encouraging collaboration instead of confrontation between manufacturers and the security community.
It requires a formal policy, secure reporting channels, defined timelines, and response workflows that demonstrate commitment to vulnerability transparency and resolution.
Organizational Actions
Create a Coordinated Vulnerability Disclosure (CVD) Policy
Define how external researchers or users can responsibly report vulnerabilities.
Include contact information, expected response timelines, scope, and safe harbor statements.
Designate a PSIRT (Product Security Incident Response Team)
Responsible for receiving, triaging, and coordinating vulnerability handling and communication.
Must operate according to ISO/IEC 30111 (Vulnerability Handling Process).
Set Up Secure Reporting Channels
Provide an email alias (e.g., security@company.com) or a web submission form using encryption (PGP key).
Optionally integrate with platforms such as HackerOne, Bugcrowd, or Intigriti.
Establish a Disclosure Workflow
Step 1: Researcher reports issue securely.
Step 2: Manufacturer acknowledges receipt (within 7 days recommended).
Step 3: Vulnerability is verified and triaged (assign CVSS score).
Step 4: Fix is developed and validated.
Step 5: Researcher is notified and patch released.
Step 6: Public disclosure coordinated with researcher (usually 90 days max).
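A small sketch of deadline tracking for this workflow, using the 7-day acknowledgment and 90-day coordinated-disclosure windows mentioned above as defaults (both are recommendations, not regulatory deadlines):

    from dataclasses import dataclass
    from datetime import date, timedelta

    ACK_WINDOW = timedelta(days=7)          # recommended acknowledgment window
    DISCLOSURE_WINDOW = timedelta(days=90)  # typical coordinated-disclosure default

    @dataclass
    class VulnReport:
        report_id: str
        received: date

        @property
        def ack_due(self) -> date:
            return self.received + ACK_WINDOW

        @property
        def disclosure_due(self) -> date:
            return self.received + DISCLOSURE_WINDOW

    report = VulnReport("VR-2025-014", date(2025, 4, 1))
    print(report.ack_due)         # 2025-04-08
    print(report.disclosure_due)  # 2025-06-30

The same dates also feed the time-to-acknowledge and time-to-disclose metrics tracked for continuous improvement.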
Define Roles and Responsibilities
PSIRT Lead: Oversees communication and disclosure coordination.
Engineering: Validates and fixes the vulnerability.
Legal / Compliance: Ensures policy language and liability coverage.
Communications: Manages public-facing advisories and updates.
Include a Safe Harbor Clause
Encourage good-faith research by stating that no legal action will be taken against those who comply with policy terms.
Align with EU guidance on coordinated vulnerability disclosure (e.g., ENISA's CVD guidelines and Article 12 of the NIS2 Directive).
Maintain Metrics and Tracking
Track time-to-acknowledge, time-to-fix, and time-to-disclose metrics for continuous improvement.
Policy / Process Updates
Coordinated Vulnerability Disclosure Policy (CVDP):
Publicly available document detailing how vulnerabilities are reported, validated, and disclosed.
Specifies scope (e.g., products, versions, domains covered).
Includes contact details, acknowledgment timelines, and researcher recognition process.
PSIRT Operating Procedure:
Internal document describing the workflow for triaging, validating, fixing, and disclosing vulnerabilities.
Defines escalation paths, severity scoring (CVSS), and disclosure coordination.
Legal/Compliance Framework:
Adds safe harbor language ensuring researchers are protected if they follow policy guidelines.
Defines handling of sensitive or export-controlled information during disclosure.
Risk Management and Recordkeeping:
Establish audit logs of received reports, validation outcomes, and response metrics.
Technical and Communication Implementation
Practical Implementation Steps
Create a Public-Facing Security Page
Example: https://security.company.com/disclosure
Include:
Vulnerability reporting email/form.
PGP key for encrypted communication.
Scope of testing (what’s allowed and what’s not).
Expected response times.
Safe harbor clause.
Secure Vulnerability Intake
Use encrypted mailboxes or web forms.
Automate ticket creation and PSIRT notification on submission.
Automate Coordination and Tracking
Use systems like Jira Security, ServiceNow Vulnerability Response, or DefectDojo to track lifecycle.
Establish Disclosure Timeline
Default: 90 days between report and public disclosure (industry standard, e.g., Google Project Zero).
Allow flexibility for critical patches requiring faster turnaround.
Communicate with Researcher
Acknowledge report within 7 days.
Provide regular status updates.
Credit researchers (if desired) upon disclosure publication.
Documentation Requirements
CVD Policy (Public)
Outlines vulnerability reporting process and contact info.
PSIRT SOP (Internal)
Details handling and coordination workflow.
Safe Harbor Statement
Legally defines protection for good-faith researchers.
Vulnerability Intake Log
Records each report, receipt date, and assigned owner.
Fix Verification Record
Evidence that vulnerability was validated and remediated.
Disclosure Record
Tracks disclosure date, CVE assignment, and public notice.
Researcher Communication Log
Maintains all correspondence and acknowledgments.
Common Pitfalls and Readiness Gaps
No public channel or clear contact for reporting vulnerabilities.
Reports sent to general support emails, leading to delays or ignored submissions.
Legal team discourages researchers, deterring responsible disclosure.
No defined response timeline or researcher acknowledgment.
Inconsistent CVE handling or tracking.
Lack of internal PSIRT capability or ownership.
No safe harbor clause → chilling effect on legitimate research.
Tools, Frameworks, and Best Practices
Disclosure Standards: ISO/IEC 29147, ISO/IEC 30111, ENISA Coordinated Vulnerability Disclosure Guidelines
Workflow Automation: Jira Security, ServiceNow VR, DefectDojo, GitHub Security Advisories
CVE Assignment: MITRE CVE Program (CNA coordination)
Public Disclosure Formats: CSAF, CVRF (Common Vulnerability Reporting Framework)
Encryption Tools: GPG, Keybase, OpenPGP
Bug Bounty / Reporting Platforms: HackerOne, Bugcrowd, Intigriti
Metrics & KPIs: Time-to-acknowledge, Time-to-fix, Disclosure completion rate
This requirement cements a trust-based security collaboration model between manufacturers and the cybersecurity community.
To comply:
Publish a Coordinated Vulnerability Disclosure Policy (CVDP).
Provide secure reporting channels (email, web form, PGP).
Protect researchers with safe harbor language.
Operate a PSIRT with structured triage and response workflows.
Maintain transparency, traceability, and timely communication.
When implemented properly, a strong CVD program reduces exploitation risk, improves patch quality, and builds lasting trust among customers, regulators, and the research community.
This requirement mandates that the manufacturer have a secure, reliable, and timely update mechanism in place.
Security updates must reach end users quickly and securely to minimize exposure time.
Updates should be authentic, verified, and tamper-proof, ensuring integrity and origin.
Where appropriate, automatic updates must be supported, especially for critical vulnerabilities, to shorten the window of exposure between disclosure and patching.
Secure Update Infrastructure
Implement a digitally signed update process (e.g., code-signing using PKI).
Use TLS or equivalent encryption for update delivery channels.
Validate the signature of update packages before installation.
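A minimal sketch of client-side verification before installation, assuming updates are signed with an RSA key pair and shipped with a detached PKCS#1 v1.5 / SHA-256 signature; file names are placeholders and key distribution is out of scope:

    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding
    from cryptography.exceptions import InvalidSignature

    def verify_update(package_path: str, signature_path: str, pubkey_path: str) -> bool:
        """Return True only if the detached signature matches the update package."""
        with open(pubkey_path, "rb") as fh:
            public_key = serialization.load_pem_public_key(fh.read())
        with open(package_path, "rb") as fh:
            package = fh.read()
        with open(signature_path, "rb") as fh:
            signature = fh.read()
        try:
            public_key.verify(signature, package, padding.PKCS1v15(), hashes.SHA256())
            return True
        except InvalidSignature:
            return False

    if not verify_update("update-2.3.5.bin", "update-2.3.5.sig", "vendor_pub.pem"):
        raise SystemExit("Update rejected: signature verification failed")

Dedicated signing tooling such as Sigstore/Cosign or Notary achieves the same goal with managed key handling; the point is that installation proceeds only after verification succeeds.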
Timeliness and Automation
Define and document Service Level Objectives (SLOs) for security patch release timelines.
Implement automatic update mechanisms for security-critical patches (with rollback or user control if needed).
Integrity and Verification
Maintain an update manifest that includes cryptographic hashes of update packages.
Conduct post-deployment verification checks to confirm installation success.
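A sketch of a manifest check, assuming the manifest is a JSON file mapping package file names to their SHA-256 digests (the file layout is an assumption for illustration):

    import hashlib
    import json

    def verify_against_manifest(manifest_path: str, package_path: str) -> bool:
        """Compare the package's SHA-256 digest with the value recorded in the manifest."""
        with open(manifest_path) as fh:
            manifest = json.load(fh)          # e.g. {"update-2.3.5.bin": "ab12..."}
        expected = manifest.get(package_path)
        if expected is None:
            return False
        digest = hashlib.sha256()
        with open(package_path, "rb") as fh:
            for chunk in iter(lambda: fh.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest() == expected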
Communication and Transparency
Notify users about the nature of updates, including whether they are security or functional.
Provide clear release notes identifying fixed vulnerabilities (aligning with Requirement 4 of CRA Annex I Part II).
Testing and QA
Integrate update validation into your CI/CD pipelines to ensure patches do not break core functionality.
Perform regression and rollback testing before release.
Documentation
Maintain documentation on:
Update procedures and version control.
Security validation steps.
Emergency patch deployment procedures.
Example of Evidence
Policy on Secure Software Update Management.
Technical documentation showing signed update packages.
Update logs demonstrating timely release of security patches.
Communication templates sent to users notifying them of updates.
Testing reports and QA documentation from CI/CD pipelines.
Link with Other CRA Obligations
Ties to Requirement 2 (address and remediate vulnerabilities without delay).
Ties to Requirement 4 (disclosure of fixed vulnerabilities).
Supports Requirement 1 (SBOM and component tracking) since updates rely on component visibility.
This requirement ensures that security updates are delivered to users promptly and at no additional cost, maintaining product security throughout its lifecycle.
Manufacturers must not only provide the updates but also communicate clearly with users about:
What the update fixes,
Why it matters (the security context), and
What users should do (any required action, such as restarting, disconnecting, or verifying installation).
The obligation also emphasizes free dissemination, except in special cases (e.g., tailor-made products covered by contractual agreements with business users).
Timely Dissemination
Define an internal SLA for security update release (e.g., within 30 days of vulnerability validation).
Ensure updates are rolled out automatically or made easily available for download without user delay.
Maintain an internal tracking mechanism for update distribution completion.
Free Distribution
Ensure that security updates are not monetized or gated behind paid support tiers.
Confirm that licensing or subscription models explicitly include the cost of ongoing security updates.
Clear User Communication
Each update should be accompanied by an advisory message that includes:
Update purpose (e.g., “Security patch addressing a critical vulnerability in [component]”).
Severity level or CVE reference.
Guidance on user actions (restart, verification, etc.).
Where applicable, provide release notes or publish advisories on a public vulnerability disclosure page.
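A sketch of a reusable advisory message built from the fields above; the wording and placeholders are illustrative:

    ADVISORY_TEMPLATE = """\
    Security update available for {product} {fixed_version}

    Purpose: security patch addressing {severity} vulnerability {cve_id} in {component}.
    Action required: {user_action}
    Details: {advisory_url}
    """

    message = ADVISORY_TEMPLATE.format(
        product="ExampleGateway",
        fixed_version="2.3.5",
        severity="a critical",
        cve_id="CVE-2025-XXXXX",      # placeholder identifier
        component="the admin API",
        user_action="install the update and restart the device.",
        advisory_url="https://security.example.com/advisories/2025-001",
    )
    print(message)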
Internal Controls
Establish a Release Communication Process involving engineering, security, and communications teams.
Use template-based advisories to maintain consistency and compliance.
Track and store all published security advisories for audit purposes.
Exceptions for Tailor-Made Solutions
For products developed specifically for business customers, agreements can specify other update terms.
However, even in those cases, timely remediation remains an obligation under CRA.
Example of Evidence
Policy on Vulnerability Management and Patch Dissemination (referencing free and timely security updates).
Records of security update notifications sent to customers.
Public security advisory web pages with version details and severity ratings.
Update SLA logs showing release timelines relative to vulnerability discovery.
Templates of user advisories showing clear and accessible information.
Link with Other CRA Obligations
Requirement 2: Supports the timely remediation of vulnerabilities.
Requirement 4: Aligns with public disclosure and transparency of fixed vulnerabilities.
Requirement 7: Complements secure and automatic update mechanisms.