How to CE Mark Your AI Software Medical Device: Notified Body Insights on MDR & AI Act

Expert webinar with Dr. Engin Çamer & Alireza Sheikhi Nasrabadi (SZUTEST NB2975) and Dr. Hatem Rabeh (Clinical Evaluation Navigator)

Artificial intelligence is rapidly transforming healthcare, but bringing an AI-based medical device to the European market requires a clear regulatory strategy. In this webinar, we discuss how to CE mark an AI software medical device under the EU Medical Device Regulation (MDR 2017/745) with direct insights from a Notified Body — including what reviewers actually evaluate during conformity assessment and the most frequent pitfalls that delay approval.

Key Takeaways from This Webinar

Data governance is the #1 issue — The most common nonconformity in AI device submissions is insufficient or non-representative training data. If your dataset does not match your claimed intended use, patient population, and indications, expect a rejection.
Clinical evaluation has three pillars for SaMD — Notified Bodies evaluate AI software against valid clinical association (is the output clinically relevant?), technical validation (does the algorithm perform as specified?), and clinical validation (does it work on real patients?).
Continuous learning is not allowed under MDR — You must certify a locked version of your AI model. Any retraining or model update requires full revalidation, change assessment, and potentially a new Notified Body review before deployment.
The AI Act requires human-in-the-loop — For high-risk AI applications including medical devices, fully automated evolving systems are prohibited. Human oversight, validation intervals, and controlled retraining cycles are mandatory.
PMCF is critical for AI devices — Most AI medical devices qualify as novel, meaning a specific PMCF study (not just general surveillance) is expected — even if the initial certification was based on a clinical investigation.
Subgroup analysis is expected by default — Notified Bodies systematically check for performance across age, sex, disease stage, and clinical subpopulations. Missing subgroup data may lead to contraindications or intended use restrictions.
Software updates can trigger reclassification — Adding an AI module to existing CE-marked software may change the risk class, require a complete re-evaluation, and open new cybersecurity risk vectors that must be addressed.
Synthetic data can be used — with conditions — Synthetic data may supplement real-world data to enhance model robustness, but it requires proper validation, justification, and sandbox testing before market deployment.
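To make the subgroup-analysis expectation concrete, the following is a minimal sketch of how a manufacturer might stratify a binary classifier's performance by subpopulation before submission. The record format, subgroup labels, and function name here are illustrative assumptions, not a prescribed method from the webinar or any standard.

```python
from collections import defaultdict

def subgroup_performance(records):
    """Per-subgroup sensitivity and specificity from
    (subgroup, y_true, y_pred) records with binary labels."""
    counts = defaultdict(lambda: {"tp": 0, "fp": 0, "tn": 0, "fn": 0})
    for group, y_true, y_pred in records:
        c = counts[group]
        if y_true and y_pred:
            c["tp"] += 1
        elif y_true and not y_pred:
            c["fn"] += 1
        elif not y_true and y_pred:
            c["fp"] += 1
        else:
            c["tn"] += 1
    out = {}
    for group, c in counts.items():
        pos, neg = c["tp"] + c["fn"], c["tn"] + c["fp"]
        out[group] = {
            "n": pos + neg,
            # None when a subgroup has no positives/negatives —
            # itself a red flag for non-representative data
            "sensitivity": c["tp"] / pos if pos else None,
            "specificity": c["tn"] / neg if neg else None,
        }
    return out

# Hypothetical validation records stratified by age group
records = [
    ("age<65", 1, 1), ("age<65", 1, 1), ("age<65", 0, 0), ("age<65", 0, 1),
    ("age>=65", 1, 0), ("age>=65", 1, 1), ("age>=65", 0, 0), ("age>=65", 0, 0),
]
results = subgroup_performance(records)
```

A gap like the one this toy data shows (lower sensitivity in the older cohort) is exactly what a reviewer would expect to see discussed — and, if unresolved, may end up as an intended-use restriction.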

What We Cover in This Session

Common classification mistakes in AI software medical device submissions
Gaps in technical documentation: risk management, cybersecurity, and GSPR mapping
When to use Article 61(1) vs Article 61(10) for clinical evidence
Expected performance and safety metrics for AI-based medical devices
How to monitor AI behaviour after CE marking: bias, drift, and model updates
Which software modifications require a new Notified Body review
Interaction between the MDR and the EU AI Act
Realistic timelines and regulatory costs for startups and SMEs
Use of synthetic data and retrospective vs prospective clinical evidence
Change control and PCCP for AI systems under MDR and AI Act
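As a concrete illustration of the post-market drift monitoring mentioned above, one common technique is the Population Stability Index (PSI), which compares the distribution of a model input (or output score) in live use against the validation baseline. The thresholds and function below are a hedged sketch under common industry conventions, not a requirement stated by the Notified Body.

```python
import math

def psi(expected, actual, bins=10):
    """Population Stability Index between a baseline (expected)
    and a live (actual) sample of a numeric feature.
    Rule of thumb: < 0.1 stable, 0.1-0.25 moderate shift, > 0.25 major shift."""
    lo, hi = min(expected), max(expected)
    # Equal-width bin edges derived from the baseline range
    edges = [lo + (hi - lo) * i / bins for i in range(1, bins)]

    def proportions(values):
        counts = [0] * bins
        for v in values:
            counts[sum(v > e for e in edges)] += 1
        # Small floor avoids log(0) for empty bins
        return [max(c / len(values), 1e-6) for c in counts]

    e, a = proportions(expected), proportions(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

# Hypothetical baseline distribution captured at certification time
baseline = [i / 100 for i in range(100)]
```

Under a locked-model regime, a PSI alert would not trigger automatic retraining; it would feed the change-control and PCCP process, with any model update revalidated before deployment.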

Meet the Speakers

Dr. Engin Çamer

Clinical Unit Manager — SZUTEST GmbH, Notified Body 2975

Dr. Engin Çamer is a medical doctor working as an internal clinician for SZUTEST, a Notified Body designated in Germany under the EU MDR. With approximately 15 years of experience in the medical device field, he is directly involved in evaluating clinical data for AI-based medical devices, including clinical evaluation reports, benefit-risk documentation, and PMCF plans submitted by manufacturers seeking CE marking.

Alireza Sheikhi Nasrabadi

Lead Auditor & Product Reviewer — SZUTEST GmbH, Notified Body 2975

Alireza Sheikhi Nasrabadi is a Lead Auditor and Product Reviewer for active medical devices at SZUTEST GmbH. He specializes in the intersection of the MDR and the AI Act, focusing on AI-specific risk profiles, data governance, model validation, and cybersecurity assessment. He works directly with manufacturers to facilitate the certification process for innovative AI-based medical devices.

Who Should Watch This Webinar?

This session is designed for anyone developing, certifying, or managing AI-based medical devices under the EU MDR:

  • AI/ML Engineers building software medical devices and preparing technical documentation for MDR
  • Regulatory Affairs Managers navigating CE marking for AI-based SaMD under MDR and the AI Act
  • Clinical Evaluation Specialists designing clinical validation strategies for AI software
  • Quality Managers managing change control, PMCF, and post-market monitoring for AI systems
  • MedTech Startups & SMEs planning their first CE marking submission for an AI medical device
  • CTOs and Product Owners who need to understand the regulatory constraints on continuous learning and model updates

Also Watch: SOTA under MDR

In our previous webinar, we explored how Notified Bodies interpret State of the Art (SOTA) during clinical evaluation review — a fundamental topic for any medical device submission.

Watch the SOTA Webinar →

Continue the Conversation

Join the Clinical Evaluation Navigator community to access weekly expert calls, discuss real-world challenges with regulatory professionals, and get direct guidance from Dr. Hatem Rabeh on AI medical devices, clinical evaluation, and MDR compliance.

Join the Community

Free to join • Weekly Friday calls • 100+ published articles