When Post-Market Data Contradicts Your Clinical Conclusions
You approved a device based on pre-market clinical data that showed clear benefit. Two years after launch, your PMCF reveals something different. The benefit you claimed is not holding. The risks you minimized are appearing more frequently. Now what?
This is not a theoretical problem. It happens more often than regulatory teams expect. The clinical evaluation that passed your Notified Body review becomes questionable when real-world data starts flowing back.
The tension is immediate. Your conformity assessment is based on conclusions drawn from limited pre-market evidence. Your CE mark depends on those conclusions remaining valid. But the post-market data is telling you something else.
Most manufacturers do not know how to handle this situation correctly. They either ignore the contradiction, justify it away, or panic and withdraw the device too quickly.
Let me show you how this unfolds in practice and what the regulation actually requires you to do.
Why Pre-Market and Post-Market Data Diverge
Pre-market clinical data comes from controlled conditions. You select patients carefully. You follow protocols strictly. You monitor compliance closely. The environment is artificial by design.
Post-market data reflects reality. Patients are less selected. Usage varies widely. Training is inconsistent. Follow-up is incomplete. The device operates in the wild, not in a lab.
MDR Article 61 establishes that clinical evaluation is a continuous process. It does not stop at CE marking. The regulation explicitly requires manufacturers to update clinical evaluation throughout the device lifecycle based on post-market data.
This is not optional. This is a legal requirement.
When MDCG 2020-6 describes the iterative nature of clinical evaluation, it means your conclusions are never final. They remain provisional until confirmed or revised by post-market evidence.
But here is where manufacturers misunderstand the process. They think updating the clinical evaluation means adding new data to the report. It does not. Updating means re-evaluating your original conclusions in light of new evidence.
If the new evidence contradicts your conclusions, you cannot simply append it to the file and move on.
What Contradiction Actually Looks Like
Contradiction is not always dramatic. It shows up in subtle patterns that regulatory teams miss until an auditor points them out.
Your pre-market data showed a complication rate of 2% in 100 patients. Your PMCF survey now shows 6% in 300 patients. Is that a contradiction? Maybe. It depends on patient selection, follow-up duration, and how complications were defined.
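Whether a jump from 2% to 6% is statistically meaningful can be checked before any regulatory conclusion is drawn. A minimal sketch, using only the standard library and the illustrative counts above (a two-proportion z-test is one common choice, not a mandated method):

```python
import math

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sided two-proportion z-test with a pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)                      # pooled event rate
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))          # two-sided p-value
    return z, p_value

# Illustrative figures from the text: 2/100 pre-market vs 18/300 post-market
z, p = two_proportion_ztest(2, 100, 18, 300)
print(f"z = {z:.2f}, p = {p:.3f}")  # p is around 0.11: suggestive, not conclusive on its own
```

With these numbers the difference does not reach significance at the 0.05 level, which is exactly why "maybe" is the honest answer: the statistics must be weighed together with patient selection, follow-up duration, and complication definitions.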
Your clinical investigation concluded that the device is equivalent to a predicate in terms of safety. Your vigilance data now shows three serious incidents in one year that never appeared with the predicate. Is that a contradiction? Probably. But you need to analyze if the incidents are device-related or patient-related.
Your literature review identified no significant long-term risks. Your PMCF study at 24 months reveals a delayed complication pattern. That is a clear contradiction. Your original risk-benefit conclusion was based on incomplete evidence.
The common mistake is treating each of these as isolated observations instead of what they are: challenges to your original clinical conclusions.
Manufacturers update their PMCF reports and vigilance summaries without revisiting the clinical evaluation conclusions. They document the divergence but fail to act on it.
So what should you do when you spot the divergence?
The Correct Response Sequence
First, stop and assess. Do not immediately conclude that your device is unsafe or that your CE mark is invalid. Do not rush to notify your Notified Body before you understand what is happening.
Start with a structured analysis. Why is the post-market data different? Is it due to patient population differences? Usage conditions? Follow-up completeness? Data collection methods?
In many cases, the divergence is not a true contradiction. It is a reflection of the difference between controlled studies and real-world use. This is expected. The question is whether the divergence changes your risk-benefit conclusion.
If the divergence is significant and suggests your original conclusions were wrong, you must act. Article 61(11) of the MDR requires you to update the clinical evaluation and, if necessary, modify the technical documentation and inform your Notified Body.
This is where most manufacturers hesitate. They fear that updating the clinical evaluation will trigger a reassessment by the Notified Body. They worry about the timeline. They worry about the cost. They worry about market impact.
But hesitation is worse than action. If an auditor or authority finds that you identified a contradiction and did nothing, you face a nonconformity. You may face a suspension. You may face market withdrawal imposed by the authority instead of managed by you.
What the Update Must Include
Updating the clinical evaluation is not about adding a paragraph. It is about re-running the entire appraisal process with the new data integrated from the start.
You go back to your risk-benefit analysis. You recalculate the rates. You reassess the severity. You check if the intended use remains appropriate. You verify if the contraindications need expansion. You confirm if the clinical performance claims are still supported.
If the post-market data suggests your equivalence claim is no longer valid, you may need to generate your own clinical data. If the safety profile has changed, you update the instructions for use and the risk management file.
This is not a small effort. But it is what the regulation demands.
The clinical evaluation is only credible if it reflects all available data. A report that ignores contradictory post-market evidence is not compliant, even if it was approved before.
How Notified Bodies React
When you inform your Notified Body of a significant update to the clinical evaluation based on contradictory post-market data, their response depends on the nature and magnitude of the change.
If the update confirms that the risk-benefit balance remains positive but requires minor adjustments to labeling or risk management, the review is usually straightforward. The Notified Body will verify that you followed the correct process and that the conclusions are justified.
If the update reveals that the original conclusions were based on insufficient evidence or that the risk-benefit balance has shifted negatively, the Notified Body will treat this as a major change. You may need to submit a new technical file. You may need to conduct additional studies. You may face a new conformity assessment.
In extreme cases, if the contradiction suggests the device is no longer safe or effective for its intended use, the Notified Body may suspend your certificate until you resolve the issue.
This sounds severe. But understand the logic. The Notified Body’s role is to verify that the device meets the general safety and performance requirements throughout its lifecycle. If post-market data proves your pre-market conclusions were wrong, the Notified Body must act.
What You Should Communicate
When you approach your Notified Body with contradictory data, present the full analysis. Do not just show the problem. Show how you investigated it, what you concluded, and what you plan to do.
Prepare a summary that includes:
– The original clinical conclusion
– The contradictory post-market data
– The analysis of why the divergence occurred
– The revised risk-benefit conclusion
– The corrective actions planned or already implemented
– The updated clinical evaluation and related documents
This demonstrates that you are managing the situation proactively. Notified Bodies prefer manufacturers who identify and address issues themselves over manufacturers who wait for audits to expose problems.
The Real Risk Is Not Contradiction, It Is Denial
Here is what I observe repeatedly in audits and reviews. Manufacturers have the data. They see the divergence. But they rationalize it away.
They say the PMCF sample size is too small to be meaningful. They say the incidents are user error, not device failure. They say the complication rate is within expected variability. They say the literature now shows different results because the field has evolved.
All of these may be true. But they cannot be assumptions. They must be demonstrated through structured analysis and documented in the clinical evaluation update.
What reviewers look for is not perfection. They look for honesty and rigor. Did you recognize the divergence? Did you investigate it properly? Did you update your conclusions appropriately? Did you act on the findings?
If the answer to all four questions is yes, the contradiction becomes evidence of a robust post-market surveillance system. If the answer to any question is no, the contradiction becomes evidence of noncompliance.
Too often, manufacturers wait until the periodic safety update report or the next surveillance audit to address contradictory data. By then, the delay itself has become a regulatory issue.
Practical Steps When You Spot Divergence
Step one: document the observation immediately. Create a dated record that shows when you identified the potential contradiction and what triggered the concern.
Step two: convene a cross-functional review. Include clinical affairs, regulatory affairs, quality, and risk management. Evaluate whether the divergence is real or artifactual.
Step three: perform a structured analysis. Use statistical methods if appropriate. Compare patient populations. Assess data quality. Check for confounding factors.
Step four: decide on the conclusion. Does the post-market data confirm, refine, or contradict your original conclusions? Be honest. Document the rationale.
Step five: update the clinical evaluation report. Integrate the post-market data from the beginning of the appraisal, not just as an appendix. Revise your conclusions if needed.
Step six: implement corrective actions. Update labeling, instructions for use, risk management file, and any other affected documents.
Step seven: inform your Notified Body. Submit the updated clinical evaluation and explain what changed and why.
Step eight: if required, report to the competent authority. Certain contradictions may constitute reportable events under vigilance or safety obligations.
This sequence takes time. It requires resources. But it is the only compliant path when post-market data contradicts pre-market conclusions.
What This Means for Your PMCF Design
Most PMCF plans are designed to confirm what you already believe. They measure the outcomes you expect. They avoid the questions that might reveal uncomfortable truths.
This is a mistake. Your PMCF should be designed to detect divergence, not just to document compliance.
If your pre-market data showed low complication rates, your PMCF should actively monitor for complications with enough sensitivity to detect increases. If your equivalence claim is based on limited predicate data, your PMCF should compare your device to the predicate in real-world conditions.
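What "enough sensitivity" means can be made concrete at the planning stage. A minimal sketch using a standard one-proportion sample-size formula (the 2% baseline and 4% detection target below are assumed for illustration):

```python
import math

def n_to_detect(p0, p1, z_alpha=1.645, z_beta=0.84):
    """Sample size for a one-sided one-proportion test.
    p0: baseline (pre-market) rate; p1: elevated rate you must be able to detect.
    Defaults correspond to 5% one-sided significance and 80% power."""
    num = (z_alpha * math.sqrt(p0 * (1 - p0)) +
           z_beta * math.sqrt(p1 * (1 - p1))) ** 2
    return math.ceil(num / (p1 - p0) ** 2)

# Example: detect a doubling of a 2% pre-market complication rate
print(n_to_detect(0.02, 0.04))  # on the order of 400 patients
```

A PMCF survey of 50 patients simply cannot answer this question, and running the numbers up front is what separates a surveillance plan designed to learn from one designed to confirm.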
The goal is not to prove you were right. The goal is to learn whether you were right.
When you design PMCF with this mindset, contradictory data becomes valuable information instead of a regulatory problem. You catch issues early. You adjust quickly. You maintain compliance continuously.
A PMCF plan that never reveals contradictions is probably not sensitive enough. Real post-market surveillance should challenge your assumptions, not just confirm them.
The Long-Term View
Clinical evaluation under MDR is fundamentally different from the old directives. It is not a one-time exercise to obtain a CE mark. It is a living process that evolves as evidence accumulates.
When post-market data contradicts pre-market conclusions, it is not a failure. It is the system working as designed. The regulation expects this to happen. It provides the framework to handle it.
What matters is how you respond. Do you treat the contradiction as a threat to your approval, or as information that improves your understanding of the device?
Manufacturers who adopt the second mindset build stronger clinical evaluation files, maintain better relationships with Notified Bodies, and avoid the regulatory crises that come from ignoring divergent data.
The question is not whether your post-market data will contradict your pre-market conclusions. The question is whether you are prepared to recognize it and act on it when it does.
Peace,
Hatem
Clinical Evaluation Expert for Medical Devices
Follow me for more insights and practical advice.
Frequently Asked Questions
What is a Clinical Evaluation Report (CER)?
A CER is a mandatory document under MDR 2017/745 that demonstrates the safety and performance of a medical device through systematic analysis of clinical data. It must be updated throughout the device lifecycle based on PMCF findings.
How often should the CER be updated?
The CER should be updated whenever significant new clinical data becomes available, after PMCF activities, and when the device or its intended purpose changes. As a baseline, high-risk devices (class III and implantables) are typically updated at least annually, and lower-risk devices every two to five years, aligned with the post-market surveillance cycle.
What causes CER rejection by Notified Bodies?
Common reasons include inadequate equivalence demonstration, insufficient clinical data for claims, poorly structured SOTA analysis, missing gap analysis, and lack of clear benefit-risk determination. Structure and logical flow are as important as the data itself.
Which MDCG guidance documents are most relevant for clinical evaluation?
Key documents include MDCG 2020-5 (Equivalence), MDCG 2020-6 (Sufficient Clinical Evidence), MDCG 2020-13 (CEAR Template), MDCG 2020-7 (PMCF Plan), and MDCG 2020-8 (PMCF Evaluation Report).
Need Expert Help with Your Clinical Evaluation?
Get personalized guidance on MDR compliance, CER writing, and Notified Body preparation.
References
– Regulation (EU) 2017/745 (MDR), Article 61: Clinical Evaluation
– MDCG 2020-6: Regulation (EU) 2017/745: Clinical Evidence Needed for Medical Devices Previously CE Marked under Directives 93/42/EEC or 90/385/EEC
– MDCG 2020-7: Post-Market Clinical Follow-up (PMCF) Plan Template
– MDCG 2020-8: Post-Market Clinical Follow-up (PMCF) Evaluation Report Template
Contradictory post-market data may require immediate CER revision. Learn the triggers in our guide on CER update frequency under MDR.
Related Resources
Read our complete guide to PMCF under EU MDR: PMCF Plan & Report under EU MDR
Or explore Complete Guide to Clinical Evaluation under EU MDR