
Biometrics are everywhere: clinicians unlock mobile apps with Face ID, service teams use device biometrics on laptops, and some clinical systems are experimenting with fingerprint or iris-based access for speed and convenience. Done well, biometric authentication can reduce password fatigue and improve workflow.
But biometrics aren’t magic. A classic class of attacks—sometimes nicknamed the “gummy bear” or “gummy finger” attack—demonstrates a hard truth: many biometric sensors can be fooled by presentation attacks (spoofed biometric inputs). Early research showed that gelatin-based “gummy” fingers were accepted by multiple fingerprint systems (Matsumoto et al., “Impact of Artificial ‘Gummy’ Fingers…,” SPIE). The point isn’t the candy—it’s the risk model.
This guide reframes biometrics for medical device cybersecurity: where biometrics belong, how they fail, and what controls (and evidence) you should implement to support secure-by-design expectations.
What “gummy fingers” really prove
The “gummy finger” concept is one example of a broader category: presentation attacks, where an attacker presents an artificial biometric sample to the sensor. Standards bodies explicitly treat this as a distinct problem space, often addressed with presentation attack detection (PAD), sometimes called liveness detection (ISO/IEC 30107-1, the presentation attack detection framework).
In practical terms, the lesson for MedTech is:
- Biometrics can be spoofed—especially if PAD is weak or absent.
- Biometric data can’t be “rotated” like a password. If a biometric template leaks, the impact can follow a user for life.
- Biometrics are best used to unlock something stronger (like a device-bound cryptographic key), not as a standalone magic credential.
Why this matters in medical device ecosystems
Even if your shipped medical device doesn’t contain a biometric sensor, biometrics may still be part of your product’s ecosystem:
- Mobile apps used by clinicians and patients (biometric unlock to access app sessions)
- Cloud portals used by clinical staff, customer admins, and manufacturer support
- Service tooling (field laptops/tablets, remote support workflows, provisioning tools)
- Manufacturing and test systems (privileged consoles and access control)
In all of these, authentication failures can become safety and compliance problems: unauthorized configuration changes, exposure of sensitive health data, or compromised update pipelines.
Biometrics are not secrets (and should not be treated like passwords)
A reliable approach is to treat biometrics as an activation factor—a convenient way to authorize use of a stronger, device-bound authenticator. NIST’s digital identity guidance prefers local biometric verification (on the user’s device) over centralized biometric comparison, and includes requirements for handling biometric samples safely (NIST SP 800-63B).
This is why “platform biometrics” are generally safer than building your own biometric backend:
- Templates are kept in a hardware-backed component (for example, Apple describes storing biometric template data in the Secure Enclave; see Apple Platform Security: Biometric security).
- The server gets a strong authentication result (often public-key based), not raw biometric data.
- It’s easier to prove how biometric data is protected—because you can rely on documented platform behaviors.
Medical device biometric security: practical control checklist
1) Use biometrics as part of MFA, not the only control
For high-impact actions—therapy configuration, admin functions, service access, account recovery—use multi-factor authentication. Biometrics should typically unlock a device-bound key (e.g., passkeys / WebAuthn) rather than act as a standalone “password replacement.”
2) Require presentation attack detection where it matters
If your solution uses biometric sensors directly (or depends on them), define when PAD is required and how it’s validated. ISO/IEC 30107-1 provides a framework and terminology for PAD expectations.
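One way to express a PAD requirement in software is as an explicit gate evaluated before—and independently of—the match decision, so a suspected spoof can never be overridden by a strong match score. A hedged sketch; the score scales and thresholds are illustrative, not drawn from ISO/IEC 30107:

```python
from dataclasses import dataclass

@dataclass
class CaptureResult:
    pad_score: float    # 0.0 = certain spoof, 1.0 = certain live (illustrative scale)
    match_score: float  # similarity to the enrolled template (illustrative scale)

# Thresholds are set per risk assessment and validated against PAD test data.
PAD_THRESHOLD = 0.90
MATCH_THRESHOLD = 0.80

def accept(capture: CaptureResult) -> bool:
    # PAD is checked first and independently: a suspected presentation
    # attack is rejected even if the biometric match itself is strong.
    if capture.pad_score < PAD_THRESHOLD:
        return False
    return capture.match_score >= MATCH_THRESHOLD

# A spoofed sample that "matches" perfectly is still rejected.
assert not accept(CaptureResult(pad_score=0.20, match_score=0.99))
assert accept(CaptureResult(pad_score=0.95, match_score=0.90))
```

Structuring the requirement this way also makes it testable: negative tests can present high-match/low-PAD inputs and verify rejection.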
3) Design safe fallback paths
In clinical environments, users may be gloved, injured, or unable to use a biometric. Fallback must be secure and auditable:
- Prefer strong alternative factors (hardware keys, passkeys, or robust MFA), not weak knowledge-based questions.
- Lock down account recovery (this is a common attacker route).
- Log fallback usage and review it.
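The “log fallback usage and review it” point can be implemented as a structured audit record emitted on every fallback authentication. A minimal sketch; the event schema and field names are illustrative, not from any standard:

```python
import json
import logging
from datetime import datetime, timezone

logger = logging.getLogger("auth.audit")

def record_fallback_auth(user_id: str, method: str, success: bool) -> dict:
    """Emit a structured, reviewable audit record whenever a user
    authenticates via a fallback path instead of biometrics."""
    entry = {
        "event": "fallback_authentication",
        "user_id": user_id,
        "method": method,  # e.g. "hardware_key", "totp"
        "success": success,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    # Elevated level so periodic reviews and alerting can filter on it.
    logger.warning(json.dumps(entry))
    return entry

record_fallback_auth("clin-0042", "hardware_key", True)
```

Consistent, machine-parseable entries like this are what make the “review it” half of the control feasible at scale.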
4) Protect biometric data like high-value regulated data
Biometric databases are high-consequence targets. Large breaches have included millions of fingerprint records (e.g., the OPM incident; see the DNI/NCSC case study on OPM) and biometric platform exposures (e.g., SecurityInfoWatch’s reporting on the BioStar 2 exposure).
For MedTech: avoid storing raw biometric images. If your design touches biometric templates at all, you need a strong reason, strong protection, and strong documentation.
5) Pair biometric auth with authorization and auditability
Biometrics help answer “who are you?” They don’t automatically answer “what are you allowed to do?” Enforce:
- Role-based authorization for functions and data
- High-fidelity logging for privileged actions
- Alerting on unusual access patterns
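The authentication/authorization distinction above can be enforced with an explicit permission check on every privileged action, independent of how the user authenticated. A minimal role-based sketch; the roles and permissions are hypothetical:

```python
# Minimal role-based authorization check layered on top of authentication.
# Roles and permission names here are illustrative, not from any standard.
PERMISSIONS = {
    "clinician": {"view_data", "start_session"},
    "service_tech": {"view_data", "update_firmware"},
    "admin": {"view_data", "start_session", "update_firmware", "manage_users"},
}

def is_authorized(role: str, action: str) -> bool:
    """Authentication answered 'who are you?'; this answers
    'what are you allowed to do?'. Unknown roles get nothing."""
    return action in PERMISSIONS.get(role, set())

assert is_authorized("admin", "manage_users")
# Authenticated (even biometrically) does not mean authorized:
assert not is_authorized("clinician", "update_firmware")
```

Pairing a check like this with high-fidelity logging of each authorization decision gives you the audit trail that reviewers expect for privileged actions.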
Related reading: ACLs in Cybersecurity
What to document for FDA-facing credibility
If biometrics are part of your device ecosystem, reviewers and auditors will look for a clear thread from risk → controls → verification. Capture:
- Threat modeling for spoofing/presentation attacks, bypass, fallback abuse, and credential recovery abuse
- Architecture showing where biometric decisions occur (local device vs server) and what data is stored
- Requirements for MFA, PAD, and high-risk actions
- Verification evidence demonstrating controls work (including negative tests)
- Postmarket operations for monitoring, incident response, and updates
If you want help building that evidence package:
- FDA Premarket Cybersecurity Services
- Medical Device Threat Modeling Services
- FDA-Compliant Vulnerability & Penetration Testing
- FDA Postmarket Cybersecurity Services
Key takeaways
- The “gummy finger” story is a reminder that biometrics can be spoofed via presentation attacks (Matsumoto et al., SPIE).
- Use biometrics as a convenience layer that unlocks stronger, device-bound authentication—not as a standalone credential (NIST SP 800-63B).
- Presentation attack detection (PAD) and secure fallback paths are essential in clinical workflows (ISO/IEC 30107-1).
- Biometric data breaches are high-impact because biometrics cannot be rotated like passwords (DNI/NCSC: OPM case study).
- For FDA-facing credibility, document threats, controls, and verification evidence with clean traceability.
FAQs
Is biometric authentication safe for medical device apps?
It can be, when used appropriately—typically as a local unlock for a stronger device-bound authenticator and paired with MFA for high-risk actions.
What is the “gummy bear” or “gummy finger” biometric attack?
It’s a type of presentation attack where an artificial biometric sample is used to fool a sensor. It’s widely discussed in biometric security research as a reminder that sensors need anti-spoofing controls (see the Matsumoto et al. SPIE paper).
What is liveness detection (PAD), and do we need it?
PAD (presentation attack detection) aims to detect spoofed biometric inputs at the capture device. If your system relies on biometrics for access to sensitive or safety-impacting functions, you should define PAD expectations and validate them (ISO/IEC 30107-1).
Should we store biometric templates in the cloud?
Generally, avoid it unless you have a compelling, documented reason and strong protections. Prefer platform biometrics, where matching happens locally and the server receives an authentication result—not the biometric itself (NIST SP 800-63B).
What’s the biggest mistake teams make with biometrics?
Treating biometrics like a secret that “can’t be guessed.” Biometrics can be spoofed and can’t be rotated after a breach—so build layered controls and strong recovery security.
Book a Discovery Session
If biometrics are part of your device ecosystem and you need defensible requirements, testing, and FDA-ready documentation, we can help.
Conclusion
Biometrics can reduce friction—but they don’t remove risk. The “gummy finger” lesson is simple: assume biometrics can be spoofed, design with PAD and MFA where needed, protect biometric data aggressively, and produce verification evidence that proves your controls work in real clinical workflows.