AI in Medical Devices: Post-Market Surveillance and FDA Guidance
Artificial intelligence (AI) is transforming medical technology. Yet, unlike static hardware or conventional software, AI-enabled medical devices (AIMDs) evolve. Their algorithms can drift, degrade, or even introduce new risks once deployed in real-world clinical environments.

For founders and product leads, this presents a challenge: how do you maintain regulatory compliance while allowing your AI model to improve and adapt?

This article outlines what you need to know about post-market surveillance (PMS) and change control for AIMDs, including the latest expectations from the U.S. Food and Drug Administration (FDA).

Understanding Post-Market Surveillance for AIMDs

Traditional devices undergo limited updates once approved. AIMDs, however, can continue to learn and change after deployment. Post-market surveillance must therefore be dynamic, proactive, and evidence-based.

The FDA now emphasises a Total Product Lifecycle (TPLC) approach for AI and machine learning (AI/ML)-enabled devices. This means oversight continues beyond market approval, requiring manufacturers to track device performance, manage risk, and maintain transparency throughout deployment.

Core monitoring requirements

Effective PMS plans should include:

  • Data drift monitoring: Track how real-world input data differs from training data. Changes in clinical practices or demographics can shift model accuracy.
  • Bias detection: Evaluate performance across subgroups such as age, sex, or ethnicity to identify unintentional bias.
  • Root cause analysis: Investigate whether detected issues stem from data, algorithms, or user behaviour.
  • Performance analytics: Continuously measure sensitivity, specificity, error rates, and calibration to spot degradation early.
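Data drift monitoring, the first component above, is often quantified with the Population Stability Index (PSI), which compares the distribution of an input feature in deployment against the training set. The sketch below is a minimal illustration: the bin count, the synthetic normal data, and the conventional PSI thresholds are industry rules of thumb rather than FDA-mandated values.

```python
import numpy as np

def population_stability_index(expected, observed, bins=10):
    """PSI between a training ('expected') and deployment ('observed') sample
    of one input feature. Common rule of thumb (a convention, not a regulatory
    threshold): < 0.1 stable, 0.1-0.25 moderate shift, > 0.25 major shift."""
    # Quantile-based bin edges give roughly equal expected counts per bin.
    edges = np.quantile(expected, np.linspace(0.0, 1.0, bins + 1))
    # Clip deployment outliers into range so every value lands in a bin.
    observed = np.clip(observed, edges[0], edges[-1])
    e_frac = np.histogram(expected, bins=edges)[0] / len(expected)
    o_frac = np.histogram(observed, bins=edges)[0] / len(observed)
    # Floor the fractions to avoid log(0) on empty bins.
    e_frac = np.clip(e_frac, 1e-6, None)
    o_frac = np.clip(o_frac, 1e-6, None)
    return float(np.sum((o_frac - e_frac) * np.log(o_frac / e_frac)))

rng = np.random.default_rng(42)
train = rng.normal(0.0, 1.0, 5000)    # stands in for the training-data feature
stable = rng.normal(0.0, 1.0, 5000)   # deployment data with no drift
shifted = rng.normal(0.5, 1.0, 5000)  # mean shift, e.g. a demographic change

psi_stable = population_stability_index(train, stable)
psi_shifted = population_stability_index(train, shifted)
```

In practice a check like this would run per feature on a schedule, with breaches feeding the alerting and root cause analysis steps described later.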

These components ensure emerging issues are identified and corrected before they affect patient safety or product effectiveness.

FDA Expectations and Change Control

Recognising that AI systems evolve, the FDA introduced guidance on Predetermined Change Control Plans (PCCPs). PCCPs allow manufacturers to make defined algorithmic updates without submitting an entirely new market application each time.

What a PCCP includes

  1. Description of proposed modifications: Outline which parameters or processes may change.
  2. Modification protocol: Define how those changes will be verified, validated, and approved.
  3. Impact assessment: Evaluate potential safety or performance implications and specify acceptance criteria.

If a proposed modification falls outside the PCCP’s scope or could significantly affect safety or effectiveness, a new submission (510(k), De Novo, or PMA supplement) is required.

Version control and traceability

Every AI model update must be documented. Maintain robust version control under your Quality Management System (QMS), clearly stating which version is validated and released. A transparent version history supports traceability, rollback capability, and audit readiness.
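One lightweight way to make a model release traceable is to record, for every version, a content hash of the released weights together with pointers into the QMS. The sketch below is illustrative only: the field names, document IDs, and PCCP reference format are assumptions, not a prescribed FDA or ISO record layout.

```python
import hashlib
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass(frozen=True)
class ModelRelease:
    """Hypothetical QMS-linked release record for one validated model version."""
    version: str               # semantic version of the released model
    weights_sha256: str        # content hash of the exact released weights
    training_data_ref: str     # pointer to the frozen training dataset
    validation_report_id: str  # QMS document ID holding the validation evidence
    pccp_change_ref: str       # which PCCP modification (if any) this falls under
    released_at_utc: str       # release timestamp for audit trails

def release_record(version, weights_bytes, data_ref, report_id, pccp_ref):
    """Build an immutable release record; the hash ties the record to the artefact."""
    return ModelRelease(
        version=version,
        weights_sha256=hashlib.sha256(weights_bytes).hexdigest(),
        training_data_ref=data_ref,
        validation_report_id=report_id,
        pccp_change_ref=pccp_ref,
        released_at_utc=datetime.now(timezone.utc).isoformat(),
    )

rec = release_record("2.1.0", b"<model weights>", "dataset/v5", "VAL-0042", "PCCP-MOD-3")
print(json.dumps(asdict(rec), indent=2))
```

Because the record is frozen and hash-anchored, any later question of "which weights were validated and released?" can be answered by recomputing the hash, which supports the rollback and audit goals above.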

Implementing a Monitoring Infrastructure

Building an effective PMS system for AI devices requires collaboration across technical, regulatory, and clinical teams.

Key steps

  • Data acquisition: Capture inputs, predictions, and contextual metadata from deployed devices.
  • Automation and analytics: Use dashboards and alerts to detect performance drift or anomalies.
  • Update workflow: Define when to deploy updates, mitigations, or rollbacks, aligned with PCCP criteria.
  • Integration with QMS: Link monitoring results to CAPA (Corrective and Preventive Actions) processes.
  • Security and integrity: Protect data through encryption, access controls, and integrity checks.
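The automation and update-workflow steps above amount to comparing monitored metrics against pre-agreed limits and escalating breaches into the QMS. A minimal sketch of that check follows; the metric names, threshold values, and CAPA wording are illustrative assumptions, and real limits would come from your PCCP acceptance criteria.

```python
# Illustrative acceptance limits; in practice these come from the PCCP.
THRESHOLDS = {"sensitivity": 0.90, "specificity": 0.85, "psi": 0.25}

def evaluate_metrics(metrics):
    """Return an alert for each monitored metric that breaches its limit.

    Performance metrics must stay above their floor; drift scores such as
    PSI must stay below their ceiling.
    """
    alerts = []
    for name, value in metrics.items():
        limit = THRESHOLDS.get(name)
        if limit is None:
            continue  # no acceptance criterion defined for this metric
        breached = value > limit if name == "psi" else value < limit
        if breached:
            alerts.append({
                "metric": name,
                "value": value,
                "limit": limit,
                "action": "open CAPA investigation",
            })
    return alerts

# Example monitoring snapshot: sensitivity has dipped below its floor.
alerts = evaluate_metrics({"sensitivity": 0.87, "specificity": 0.91, "psi": 0.12})
```

Routing each alert into a CAPA record (rather than an ad hoc fix) is what links the monitoring dashboard back to the QMS, as the integration step above requires.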

Team responsibilities

  • Software engineers / DevOps: Maintain the infrastructure for data logging and updates.
  • Regulatory affairs: Oversee documentation, reporting, and compliance.
  • Clinical experts: Review flagged results for medical relevance.
  • Data scientists: Analyse drift, bias, and algorithmic performance.

Common Pitfalls in AI Device Compliance

  1. No pre-defined change plan: Without a clear PCCP, even small algorithm updates may require new submissions.
  2. Ignoring bias: Focusing only on overall accuracy can conceal subgroup errors.
  3. Weak documentation: Failing to record performance results, update logs, or mitigations risks non-compliance.
  4. Poor version control: Lack of traceability undermines validation and regulatory confidence.
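Pitfall 2 is easy to demonstrate numerically: a model can post a healthy overall sensitivity while performing badly on a small subgroup. The fabricated numbers below are purely illustrative, but the arithmetic shows why subgroup stratification is essential.

```python
import numpy as np

# Synthetic cohort of 100 true positives: 90 in majority group A,
# 10 in minority group B. (Illustrative data, not real device output.)
y_true = np.array([1] * 100)
group = np.array(["A"] * 90 + ["B"] * 10)
# Predictions: A detects 86/90; B detects only 3/10.
y_pred = np.array([1] * 86 + [0] * 4 + [1] * 3 + [0] * 7)

def sensitivity(y_t, y_p):
    """True positive rate: detected positives over all true positives."""
    tp = np.sum((y_t == 1) & (y_p == 1))
    fn = np.sum((y_t == 1) & (y_p == 0))
    return tp / (tp + fn)

overall = sensitivity(y_true, y_pred)  # 89/100 = 0.89, looks acceptable
by_group = {g: sensitivity(y_true[group == g], y_pred[group == g])
            for g in np.unique(group)}  # A ≈ 0.956, B = 0.30
```

An overall sensitivity of 0.89 could pass a naive acceptance check even though group B's 0.30 would be clinically unacceptable, which is exactly the subgroup error that aggregate metrics conceal.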

Avoiding these mistakes saves time, protects patients, and strengthens investor and regulator trust.

Aligning AIMD Risk Management with ISO 14971

AI introduces risks beyond those of standard software. Manufacturers should align their change-control process with ISO 14971 (Application of risk management to medical devices), incorporating AI-specific considerations from AAMI CR34971.

Risk assessments must address not only the technical but also the ethical and clinical implications of algorithmic updates, including potential bias and model transparency.

Final Steps for Compliance

To maintain compliance and protect patients:

  1. Establish a continuous PMS plan that collects real-world evidence and tracks drift or bias.
  2. Implement a PCCP with defined boundaries, testing criteria, and documentation.
  3. Ensure traceability for every algorithmic version under your QMS.
  4. Engage multidisciplinary teams (technical, regulatory, and clinical) to maintain alignment across updates.

Regulatory scrutiny of AI in medical devices is increasing. By embedding surveillance and change control into your design and quality systems, you can innovate confidently while meeting FDA expectations.

LFH supports MedTech innovators in navigating medical device compliance, from AIMD lifecycle management to full-system regulatory strategy. If you’re developing or updating an AI medical device, our regulatory experts can help you build robust change-control processes and meet international standards with confidence.

FAQs

Does the FDA require post-market surveillance for AIMDs?

Yes. The FDA expects manufacturers to include a PMS plan to monitor performance, manage drift, and mitigate bias once the product is on the market.

When do AI updates need FDA review?

If an algorithm change falls outside the approved PCCP or could impact safety or effectiveness, a new submission is required.

What data should be collected for monitoring?

Inputs, outputs, metadata, subpopulation performance, versioning information, and user feedback.

Contact Us

If you’d like more information, please feel free to contact us by email at info@LFHregulatory.co.uk or phone on +44 (0)1484662575.