The Applied Science Blog

Where Science Meets Product Innovation

Actionable Recommendations to Avoid FDA Delays

Usability Can Make or Break FDA Clearance

Bringing a medical device to market isn’t just about great engineering; it’s about making sure real people can use it safely and effectively. Many companies underestimate this until it’s too late. Human factors and usability issues are among the top reasons FDA 510(k) submissions get delayed or rejected, forcing teams into costly redesigns, repeat testing, and months of lost market time.


From infusion pump overdoses caused by overly complex programming to diagnostic devices recalled for confusing interfaces, history shows that poor usability isn’t just an inconvenience—it can harm patients and derail product launches. In fact, FDA data shows that a significant share of first-round submissions receive additional information requests, often due to usability-related deficiencies.


Most of the insights in this article draw on the FDA’s official guidance document Applying Human Factors and Usability Engineering to Medical Devices and industry analysis such as the FDA Law Blog. A special thanks to the FDA team for making these resources available and for the ongoing feedback that helps manufacturers understand and meet usability expectations.


In the following section, we translate those lessons into practical, real-world recommendations backed by actual case studies. These are the usability practices that can help you avoid costly delays, build safer devices, and navigate FDA clearance with confidence.


1. Start Early with Iterative Usability Testing (Formative + Summative)

 Recommendation: Begin formative (exploratory) usability testing early in development and follow with a robust summative (validation) test.


Why it matters: Without early iteration, final validation often uncovers critical flaws that require redesign and repeat testing—leading to costly FDA additional information (AI) requests and delays. Approximately 30% of 510(k)s fail initial review, and human factors deficiencies are a leading cause.


Example: A firm rushed a summative usability test due to schedule pressure, only to have FDA reject it for insufficient task risk analysis—forcing a full re‑test and delaying submission by 6+ months, costing tens of thousands in additional work.


2. Precisely Define User Groups & Critical Tasks

Recommendation: Define all intended user types separately (e.g. nurses, physicians, home users) and conduct usability validation with each group, covering all critical safety tasks.


Why: The FDA typically expects at least 15 participants per distinct user group, and combining groups without justification invites review questions and delays.


Example: A patient monitor developer grouped nurses and cardiologists together in usability testing. FDA flagged this grouping and requested additional data, delaying the submission while additional group-specific testing was conducted.
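One way to keep the per-group expectation visible during planning is to track the recruiting plan as structured data and check it programmatically. A minimal sketch, not an official FDA tool; the group names and helper below are illustrative:

```python
# Illustrative sketch (not an official FDA tool): check a draft
# validation plan against the commonly cited expectation of at
# least 15 participants per distinct user group.
MIN_PER_GROUP = 15  # minimum cited in FDA human factors guidance

def under_recruited(plan: dict) -> list:
    """Return user groups whose planned sample falls below the minimum."""
    return [group for group, n in plan.items() if n < MIN_PER_GROUP]

# Hypothetical recruiting plan: group name -> planned participants
plan = {"nurses": 15, "cardiologists": 12, "home users": 16}
print(under_recruited(plan))  # ['cardiologists']
```

Keeping the check in the plan itself makes it obvious when a group (here, cardiologists) needs more recruits before the summative study can proceed.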

3. Design the Task Workflow, Not Just the UI

Recommendation: Address use-related risk through design changes first. Training and labeling should be secondary mitigations, and must themselves be validated to confirm that users follow the instructions correctly.


Why: FDA doesn’t accept “we’ll fix in training” unless backed by evidence. Real-world usability safety comes from intuitive design, not instruction manuals.


Example: A PCA infusion pump caused serious overdoses through a multi-step programming interface. The manufacturer initially relied on training, but FDA required a redesign. Afterward, programming steps were reduced by 55% and errors dropped by 56%, resolving the safety issue.


4. Simulate Realistic Use Conditions in Validation

Recommendation: Conduct validation tests under conditions mimicking actual settings: noise, lighting, PPE usage, time pressure, fatigue, etc.

Why: Tests that fail to reflect clinical realities may be rejected, forcing repeat testing. FDA requires test environments to be “sufficiently realistic” for the device’s risk context.


Example: A pulse oximeter was tested under ideal lighting but failed in low-light nighttime conditions, a problem discovered only post-launch through reported usability failures, leading to complaints and corrective action.
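One lightweight way to make environmental coverage auditable is to list the conditions the risk analysis requires for each critical task and diff them against what the test protocol actually covers. A sketch with hypothetical tasks and conditions:

```python
# Illustrative sketch: compare environment conditions required by the
# risk analysis against those covered in the test protocol. Tasks and
# conditions below are hypothetical examples.
REQUIRED = {
    "read_spo2_display": {"dim nighttime lighting", "ambient alarm noise"},
    "replace_battery": {"gloved hands (PPE)", "time pressure"},
}

def uncovered(protocol: dict) -> dict:
    """Required conditions missing from the protocol, keyed by task."""
    gaps = {}
    for task, required in REQUIRED.items():
        missing = required - protocol.get(task, set())
        if missing:
            gaps[task] = sorted(missing)
    return gaps

protocol = {"read_spo2_display": {"dim nighttime lighting", "ambient alarm noise"}}
print(uncovered(protocol))  # flags the uncovered 'replace_battery' conditions
```

A gap report like this would have surfaced the missing low-light condition before validation rather than after launch.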


5. Avoid Test Bias: Don’t Coach or Cherry-Pick Data

Recommendation: During summative testing, do not coach participants. Report all critical task outcomes, and pre-define data exclusion criteria, replacing participants if needed.

Why: Coaching participants or excluding “bad” data damages study integrity. FDA reviewers will question manipulated results, triggering additional information requests or test repetition.


Example: One company trimmed validation data by excluding a participant who made a critical error—without justification. FDA raised the issue, demanded full raw data review, and the submission stalled while the study was redone with a full, transparent dataset.
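The “report everything” principle can be enforced mechanically: tally every recorded critical-task outcome and allow exclusions only from a list fixed in the protocol before testing. A sketch with hypothetical participant records:

```python
# Illustrative sketch: report every recorded critical-task outcome and
# allow exclusions only from a pre-registered list. Records are hypothetical.
from collections import Counter

results = [
    ("P01", "program_dose", "success"),
    ("P02", "program_dose", "use_error"),
    ("P03", "program_dose", "close_call"),
    ("P04", "program_dose", "success"),
]

# Exclusions must be defined in the protocol before testing begins,
# e.g. a participant who withdrew before the session.
PREDEFINED_EXCLUSIONS = set()

reported = [r for r in results if r[0] not in PREDEFINED_EXCLUSIONS]
summary = Counter(outcome for _, _, outcome in reported)
print(dict(summary))  # {'success': 2, 'use_error': 1, 'close_call': 1}
```

Because the exclusion set is fixed up front, a post-hoc decision to drop a participant who made a critical error is visible as a protocol deviation, not a quiet data edit.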


6. Anchor Usability in a Structured Use‑Risk Analysis & Documentation

Recommendation: Create a detailed Use‑Related Risk Analysis (URRA) and document observed usability issues with root‑cause analyses and mitigations. Maintain a Usability Engineering File per IEC 62366‑1/ISO 14971.


Why: FDA deficiencies often cite a poor URRA or superficial error documentation. A strong, traceable analysis reduces questions and accelerates clearance.


Example: A device submitted with a generic FMEA instead of a task-focused URRA was flagged, costing weeks to construct a proper use-error analysis and delaying clearance. Another company’s submission included a clearly traceable error-cause-mitigation table, which impressed reviewers and sped review.
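A traceable URRA table can be kept as structured records so that every observed use error links to a root cause and a mitigation. A minimal sketch; the field names are illustrative, not taken from IEC 62366-1 itself:

```python
# Illustrative sketch: a minimal traceable use-error record in the spirit
# of a URRA. Field names are hypothetical, not taken from IEC 62366-1.
from dataclasses import dataclass

@dataclass
class UseRiskEntry:
    task: str
    use_error: str
    root_cause: str
    mitigation: str
    residual_risk_acceptable: bool

urra = [
    UseRiskEntry(
        task="Program bolus dose",
        use_error="Entered 55 mg instead of 5.5 mg",
        root_cause="Decimal key easy to miss on keypad",
        mitigation="Hard dose limit plus confirmation screen above 10 mg",
        residual_risk_acceptable=True,
    ),
]

# Any entry with unacceptable residual risk blocks submission readiness.
open_items = [e.task for e in urra if not e.residual_risk_acceptable]
print(open_items)  # []
```

Because each record carries its own error-cause-mitigation chain, the table exported from such records is exactly the traceable artifact reviewers look for.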

Why These Practices Work—Backed by Data

  • Up to 30% of 510(k)s fail the first review, and human factors issues are a critical failure point in many cases.
     
  • FDA has stated that even a single critical task triggers human factors requirements, even if the predicate device didn’t require them.
     
  • Companies integrating usability engineering early and documenting comprehensively can often clear FDA in one cycle—a rare success, but achievable with rigor.

Real-World Example: Successful vs. Failed Approaches

  • Success Case: A home-use therapeutic device underwent early formative testing, had separate summative studies for patients and clinicians, incorporated realistic home environment simulations, and provided a clean URRA with corrective design iterations. FDA cleared it with no human factors questions—enabling on-schedule market entry.
     
  • Failure Case: A Class II device used expedited UI development and deferred usability work until late. The validation test used a single user group, was conducted in quiet lab conditions, and glossed over a user error. FDA sent a human factors deficiency letter citing all these issues, delaying clearance by 4–6 months while new studies and analyses were conducted.

By Diana Saltymakova | Product Development Scientist | Published on July 29, 2025

Go-to platform for peer-reviewed R&D support in MedTech, diagnostics, and instrumentation
About Us

The Applied Science

FLEXIBLE R&D SUPPORT FOR MEDTECH AND INSTRUMENTATION COMPANIES.


Build investor confidence and accelerate development with trusted scientific reviews and hands-on execution.  

Request Service

Featured Blog Posts

#

How Peer Review Reduces Errors

#

Journey of Medical Device

#

Jobs-to-Be-Done

Get Monthly Insights on Product Development

Smart strategies for innovators, and sharp thinking for the experts who support them.

  • Home
  • Request Service
  • Execution Support
  • Peer Review
  • Usability Review
  • Become Our Expert
  • Contact Us
  • Applied Science Blog
  • Privacy Policy

The Applied Science

The Applied Science is a registered partnership based in Ontario, Canada. We specialize in peer-reviewed consulting, technical validation, and regulatory support for medical devices, diagnostics, and scientific instrumentation. As a platform built by scientists, we consult MedTech, biotech, and engineering innovators. Our services include usability testing, feasibility analysis, risk reviews, and regulatory compliance consulting, all backed by senior scientific oversight. We support startups, scale-ups, and manufacturers across Canada and the United States with flexible delivery models — remote, on-site, or hybrid. Trusted by founders. Refer us to your investors. Recommend this platform to your clients. This is the go-to destination for expert-led product development, scientific guidance, and MedTech success.

Copyright© 2025 The Applied Science Network, Inc. All rights reserved.  

contact us at myappliedscience@gmail.com

This website uses cookies.

We use cookies to analyze website traffic and optimize your website experience. By accepting our use of cookies, your data will be aggregated with all other user data.

DeclineAccept