Last reviewed: May 1, 2026
Pillar Guide · Updated 2026 · 16 min read
Pass FDA 510(k) Cybersecurity on the First Submission
TL;DR
- A first-pass-clean cybersecurity package needs eight specific artifacts, structured exactly as the 2026 final guidance lays out.
- Predicate divergence is the most-cited deficiency in 510(k) cybersecurity - the predicate-comparison narrative must address every cyber surface added beyond the predicate.
- Run a 30-day pre-submission readiness check before eSTAR upload. Most rejections are visible in the package before it reaches the reviewer.
- Rapid response to deficiencies (when they happen) is a separate skill - have a playbook ready, not just hope.
The eight artifacts a clean 510(k) cybersecurity package contains
- Security risk assessment aligned with AAMI TIR57 / SW96.
- Threat model with four architecture views and traceability matrix.
- SBOM in CycloneDX or SPDX with vulnerability disclosure paths.
- Penetration test report with full attack surface coverage and Letter of Attestation.
- Architecture views (global, multi-patient harm, updateability, security use-case).
- Cybersecurity labeling and user-facing security information.
- Postmarket cybersecurity management plan.
- Predicate-comparison cybersecurity narrative.
Predicate divergence: the silent killer
The most common 510(k) cybersecurity deficiency is the same every cycle: the device has connectivity or features the predicate did not, and the submission does not address the resulting cybersecurity delta. Reviewers look for an explicit predicate-comparison cybersecurity narrative that lists every interface and data flow added beyond the predicate, identifies the new threats those create, and shows the controls that mitigate them. Without this narrative, even a strong threat model and pen test will draw a deficiency because the reviewer cannot tell what is 'new' from what was inherited.
The 30-day pre-submission readiness check
A focused review against the eight artifacts catches the 80% of issues that cause first-cycle deficiencies. Run it 30 days before planned eSTAR upload.
Day 1-5: Artifact completeness
- All eight artifacts present and version-controlled.
- Each artifact has clear authorship, date, and version.
- Cross-references between artifacts resolve (no broken links).
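The Day 1-5 presence check is mechanical enough to script. The sketch below assumes a flat submission folder and hypothetical artifact filenames (adapt both to your own document control scheme); it only checks that each of the eight artifacts exists, not that its content is adequate.

```python
from pathlib import Path

# Hypothetical filenames for the eight artifacts -- placeholders,
# not an FDA-mandated naming convention.
REQUIRED_ARTIFACTS = [
    "security_risk_assessment.pdf",
    "threat_model.pdf",
    "sbom.json",
    "pen_test_report.pdf",
    "architecture_views.pdf",
    "cybersecurity_labeling.pdf",
    "postmarket_plan.pdf",
    "predicate_comparison.pdf",
]

def missing_artifacts(package_dir: str) -> list[str]:
    """Return the required artifact filenames not found in the package folder."""
    present = {p.name for p in Path(package_dir).iterdir() if p.is_file()}
    return [name for name in REQUIRED_ARTIFACTS if name not in present]
```

Running this on day 1 rather than day 29 leaves time to commission a missing artifact instead of slipping the upload date.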
Day 6-15: Content depth
- Threat model: four architecture views present, trust boundaries consistent across views, STRIDE per element.
- Pen test: Letter of Attestation present, scope covers full attack surface, retest evidence for high/critical.
- SBOM: machine-readable, transitive deps, VEX or equivalent triage statement.
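For reference, a machine-readable SBOM in CycloneDX looks roughly like the fragment below. This is an illustrative sketch, not a complete or validated BOM: the component name, version, and vulnerability ID are placeholders, and a real SBOM would enumerate all components including transitive dependencies. The `analysis` block shows the VEX-style triage statement the checklist asks for.

```json
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.5",
  "version": 1,
  "components": [
    {
      "type": "library",
      "name": "openssl",
      "version": "3.0.13",
      "purl": "pkg:generic/openssl@3.0.13"
    }
  ],
  "vulnerabilities": [
    {
      "id": "CVE-2024-XXXX",
      "analysis": {
        "state": "not_affected",
        "justification": "code_not_reachable",
        "detail": "Vulnerable cipher suite is compiled out of the device build."
      }
    }
  ]
}
```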
Day 16-25: Traceability
- Threat → control → verification → ISO 14971 harm chain unbroken.
- Predicate-comparison narrative present and complete.
- Postmarket plan describes monitoring cadence, vulnerability triage SLA, and patch deployment.
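The Day 16-25 traceability check can also be partially automated. The sketch below assumes the matrix has been exported to a list of rows with illustrative field names (not any FDA template); it flags every threat whose threat → control → verification → harm chain is missing a link.

```python
# Illustrative field names -- map these to your own matrix columns.
REQUIRED_LINKS = ("threat_id", "control_id", "verification_id", "harm_id")

def broken_rows(matrix: list[dict]) -> list[str]:
    """Return the threat IDs whose traceability chain is missing at
    least one of: control, verification, or ISO 14971 harm link."""
    return [
        row.get("threat_id", "<unlabeled>")
        for row in matrix
        if any(not row.get(field) for field in REQUIRED_LINKS)
    ]
```

An empty result is necessary but not sufficient: the links must also point at the right controls and harms, which only the reviewer-eye read-through can confirm.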
Day 26-30: Reviewer-eye read-through
A senior MedTech regulatory reader who did not write the package reads it cold. If they cannot answer "what does this device do, what are the cyber threats, and how are they controlled" in 15 minutes from the package alone, the package is not ready.
Common rejection patterns we still see in 2026
- IT-style threat model presented as medical device threat model - no patient-safety mapping.
- Pen test scoped to web app only because the device is "just a sensor with an app."
- SBOM present but cloud backend excluded.
- Postmarket plan describes monitoring intent without cadence, SLAs, or named owners.
- AI/ML feature added but no AI-specific cybersecurity content - no MITRE ATLAS mapping, no model-supply-chain review.
- Cybersecurity labeling missing entirely or buried in a single sentence in the IFU.
If a deficiency does come
You have 180 days. A fast response matters more than a long one. See our Deficiency Letter Response Playbook for the structure that gets through second-cycle review without a third.
Frequently asked questions
What is the actual first-pass clearance rate for cyber-relevant 510(k)s?
There is no hard public number. Industry estimates put first-cycle clean cybersecurity at roughly 30-40% across all submitters. Specialist MedTech cyber teams report 80%+ first-cycle clean when the eight artifacts are built to the 2026 guidance. The delta is not luck.
Do all 510(k) submissions need this full package?
Any device meeting the Section 524B definition of a cyber device does: a device that includes software validated, installed, or authorized by the sponsor, has the ability to connect to the internet, and contains technological characteristics that could be vulnerable to cybersecurity threats. In 2026 that covers most 510(k)s.
How does this differ from De Novo and PMA?
510(k) leans heavily on the predicate-comparison narrative because there is a predicate. De Novo replaces the predicate comparison with a security architecture justification, since no predicate exists. PMA requires the deepest documentation, but the artifact list is similar.
What if our predicate is older and has no cybersecurity evidence?
Common situation. The predicate-comparison narrative becomes more important, not less - you describe the new cyber surface introduced by your device against the predicate baseline and show the cyber risks and controls for that delta. Reviewers do not expect you to retroactively cyber-document the predicate.
Can we file a 510(k) without a Letter of Attestation if our pen tester does not provide one?
You can file, but expect a deficiency citing the missing attestation. Easier to require it from your pen test vendor up front than to litigate it after submission.
Is third-party pen testing required, or can we self-test?
The guidance does not require third-party testing, but in practice independent testing is what reviewers find credible. Self-tested submissions consistently draw deeper scrutiny on methodology and scope.
What happens if we file before our SBOM is complete?
Expect an Additional Information (AI) request specifically calling for the SBOM. It is easier to delay the submission a week than to consume 60+ days of the deficiency clock on a missing artifact.
Does this package satisfy EU MDR cybersecurity expectations too?
Mostly. EU MDR Annex I cybersecurity requirements overlap heavily with FDA 524B; IEC 81001-5-1 and IEC 62443-4-1 add lifecycle process evidence requirements that you can document in a process-evidence appendix. One artifact set, multi-region acceptance.
How long does it take to build the eight artifacts from scratch?
8-14 weeks for a connected Class II device with engaged engineering. Threat modeling and pen testing can run in parallel after the architecture intake.
What is the single highest-leverage thing to get right?
The traceability matrix. If a reviewer can follow threat → control → verification → residual risk → ISO 14971 harm without leaving the document, most other deficiencies do not get raised because the answer is visible.
