When Your Equivalence Claim Collapses During Review
You submitted an equivalence-based clinical evaluation. The Notified Body came back with a major nonconformity. Your device is not equivalent. The entire clinical evaluation path must restart. This happens more often than most manufacturers expect, and it usually stems from the same structural mistakes in how equivalence was approached from the beginning.
Under MDR, equivalence is not a shortcut. It is a rigorous path that requires you to prove that your device and the equivalent device are so similar that clinical data from one can support the safety and performance of the other. When done correctly, it saves time and reduces the burden of generating new clinical data. When done poorly, it collapses under scrutiny, often months into the conformity assessment process.
The consequences are not minor. A rejected equivalence claim means your device has no valid clinical evidence base. You cannot proceed with certification. You must either generate new clinical data or find another equivalent device and restart the demonstration. Both options delay market access and increase cost significantly.
Most equivalence failures are not due to bad luck. They are due to misunderstanding what equivalence actually requires under MDR and MDCG 2020-5.
What Equivalence Actually Means
The conditions for claiming equivalence are set out in Article 61 of MDR and Annex XIV Part A, and detailed in MDCG 2020-5. The guidance breaks equivalence into three pillars: technical equivalence, biological equivalence, and clinical equivalence. All three must be demonstrated. If one pillar fails, the entire equivalence claim fails.
Technical equivalence means the devices have the same design, materials, specifications, and performance characteristics. Biological equivalence means they interact with the body in the same way, with the same materials contacting the same tissues for the same duration. Clinical equivalence means they are used for the same clinical condition, in the same patient population, at the same site in the body, with the same intended purpose.
This is not a checklist you fill out superficially. Each pillar requires evidence, analysis, and justification. If the devices differ on any parameter that could influence safety or performance, you must demonstrate that the difference is not clinically significant.
Equivalence is not about similarity. It is about demonstrating that differences between devices do not affect clinical outcomes. The burden of proof is on the manufacturer.
Where Equivalence Claims Collapse
Most failures occur in one of three areas: incomplete technical comparison, incorrect biological risk assessment, or mismatched clinical indications.
Technical Comparison: The Devil in the Details
Technical equivalence is often treated as a box-ticking exercise. Manufacturers prepare comparison tables showing dimensions, materials, and features. But reviewers do not take these tables at face value. They look for the differences that matter.
A catheter with the same material and diameter but a different tip geometry is not technically equivalent if the tip geometry influences insertion success or vascular trauma risk. A suture with the same tensile strength but a different coating is not equivalent if the coating affects tissue reaction or knot security. A surgical mesh with the same polymer but a different pore size is not equivalent if pore size influences tissue ingrowth or infection risk.
The failure happens when manufacturers list parameters without analyzing whether those parameters influence performance. Reviewers expect a risk-based analysis. For every difference, you must explain why it does not matter clinically. If you cannot explain it, the equivalence claim is not valid.
A common pitfall: equivalence tables that list device characteristics without assessing whether differences affect clinical outcomes. Reviewers will not accept superficial comparisons.
Biological Equivalence: When Materials Are Not Enough
Biological equivalence is often misunderstood as material equivalence. Manufacturers assume that using the same ISO 10993-compliant materials guarantees biological equivalence. This is incorrect.
Biological equivalence depends on the nature, duration, and location of tissue contact. A device made of the same material but used in a different anatomical location is not biologically equivalent. A device with the same material but longer contact time is not equivalent. A device with the same material but different surface treatment is not equivalent if the treatment alters biocompatibility.
I see equivalence claims fail when manufacturers ignore these distinctions. A wound dressing used on intact skin is not biologically equivalent to the same dressing used on an open wound, even if the materials are identical. The biological interaction is fundamentally different. The biocompatibility assessment must reflect this.
Reviewers will ask: Does the device contact the same tissue type? For the same duration? Under the same mechanical or chemical conditions? If the answer to any of these questions is no, you must demonstrate that the difference does not introduce new biological risks.
Clinical Equivalence: The Most Frequent Point of Failure
Clinical equivalence is where most claims collapse. Manufacturers assume that similar intended purposes or overlapping indications are sufficient. They are not.
MDCG 2020-5 is explicit. The devices must be used for the same clinical condition, in the same patient population, at the same site in the body, with the same clinical objectives. Any deviation requires justification.
A hip implant designed for osteoarthritis is not clinically equivalent to a hip implant designed for trauma reconstruction, even if the design is similar. The patient populations differ. The biomechanical demands differ. The success criteria differ. The clinical data from one cannot be extrapolated to the other without additional evidence.
A coronary stent used in the left anterior descending artery is not automatically equivalent to the same stent used in peripheral arteries. The vessel anatomy, lesion characteristics, and clinical outcomes differ. The equivalence claim must address these differences or it will be rejected.
Reviewers do not guess. They look at the claimed indications for use, the patient populations studied, and the clinical endpoints measured. If these do not align between the subject device and the equivalent device, the equivalence claim fails.
Clinical equivalence is the highest bar. It requires alignment on indication, population, site of use, and clinical objective. Partial overlap is not enough.
Contract and Access Requirements
Even when the three pillars of equivalence are demonstrated, practical barriers can prevent the claim from being valid.
You must have access to the technical, biological, and clinical data of the equivalent device. This means either you own the data, or you have a written agreement with the manufacturer of the equivalent device that allows you to reference their data for regulatory purposes.
Without this access, your equivalence claim is empty. You cannot reference public literature alone. You need the full dataset: design specifications, material certificates, biocompatibility test reports, clinical study protocols, raw data, and post-market data.
Many manufacturers discover this requirement too late. They identify an equivalent device on the market, prepare a comparison table, and submit the clinical evaluation assuming the publicly available information is sufficient. It is not. Notified Bodies will request proof of data access. If you cannot provide it, the equivalence claim is invalid.
This is not a technicality. It reflects the fundamental principle of equivalence: you are relying on someone else’s clinical evidence to support your device. That reliance must be formalized and traceable.
A common pitfall: equivalence claims without formal data access agreements. Reviewers will reject claims based on publicly available information alone.
When Equivalence Fails: What Happens Next
When an equivalence claim is rejected, you are left with no valid clinical evidence. The clinical evaluation cannot be completed. The conformity assessment stops.
You have two options. Generate new clinical data through your own clinical investigation, or identify a new equivalent device and restart the equivalence demonstration. Both paths require time and resources.
If you choose to conduct a clinical investigation, you must design a study that addresses the evidence gaps. This often takes years and carries substantial cost. If you choose to find a new equivalent device, you must ensure that all three pillars of equivalence are met, data access is secured, and the justification is stronger than the first attempt.
The pressure is high. Market timelines slip. Competitors gain advantage. Internal stakeholders question the regulatory strategy. This is the reality of an equivalence failure.
The best approach is to avoid the failure from the beginning by treating equivalence as what it is: a demanding regulatory path that requires rigorous analysis, complete data access, and conservative judgment.
How to Approach Equivalence Correctly
Start by asking whether equivalence is the right path. If your device has novel features, modified materials, or expanded indications, equivalence may not be feasible. It may be faster and more defensible to generate your own clinical data.
If you proceed with equivalence, conduct the three-pillar analysis methodically. For technical equivalence, list every specification and assess whether differences affect performance. For biological equivalence, map tissue contact scenarios and compare biological risks. For clinical equivalence, align indications, populations, sites, and objectives exactly.
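One way to keep the three-pillar analysis honest is to treat the comparison as structured data in which every difference must carry an explicit clinical-impact justification, so that unjustified gaps surface before submission rather than during review. The sketch below is purely illustrative: the parameter names, example values, and field structure are hypothetical, not taken from MDCG 2020-5.

```python
from dataclasses import dataclass

@dataclass
class Parameter:
    """One row of an equivalence comparison table (illustrative structure)."""
    pillar: str              # "technical", "biological", or "clinical"
    name: str                # e.g. "tip geometry" (hypothetical example)
    subject_device: str
    equivalent_device: str
    justification: str = ""  # required whenever the two values differ

    def differs(self) -> bool:
        return self.subject_device != self.equivalent_device

def unjustified_differences(params):
    """Return every differing parameter that lacks a documented justification."""
    return [p for p in params if p.differs() and not p.justification.strip()]

# Hypothetical comparison entries for a catheter-like device
comparison = [
    Parameter("technical", "tip geometry", "tapered", "rounded"),   # gap: no rationale
    Parameter("biological", "contact duration", "<24 h", "<24 h"),
    Parameter("clinical", "patient population", "adults", "adults"),
]

for p in unjustified_differences(comparison):
    print(f"[{p.pillar}] '{p.name}' differs but has no clinical-impact justification")
```

The point of the exercise is not the tooling but the discipline: a difference with an empty justification field is exactly the kind of gap a reviewer will find, so it should block your internal sign-off first.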
Secure data access early. Do not assume public information is sufficient. Negotiate formal agreements with the equivalent device manufacturer. Document the scope of data sharing and ensure you have access to the full technical and clinical datasets.
Prepare the equivalence justification as a standalone document. Include comparison tables, risk analyses, and evidence of data access. Make it transparent and traceable. Reviewers must be able to follow your reasoning and verify your conclusions independently.
Finally, test your equivalence claim internally before submission. Have someone outside the project team review the justification. If they identify gaps or inconsistencies, address them before the Notified Body does.
Equivalence is not a shortcut. It is a precise regulatory path that requires complete data, rigorous analysis, and conservative judgment. Treat it as such.
Final Thought
Equivalence under MDR works when the technical, biological, and clinical alignment is complete and the data access is formal. It fails when manufacturers treat it as a workaround instead of a demonstration. The regulation is clear. The guidance is detailed. The failures are predictable.
If you cannot demonstrate equivalence convincingly, do not force it. The cost of a rejected claim is higher than the cost of generating your own clinical data from the start.
In the next part of this series, we will look at how to structure the clinical evidence base when equivalence is not an option, and what literature appraisal must cover to satisfy MDR requirements.
Peace,
Hatem
Clinical Evaluation Expert for Medical Devices
Follow me for more insights and practical advice.
Frequently Asked Questions
What is a Clinical Evaluation Report (CER)?
A CER is a mandatory document under MDR 2017/745 that demonstrates the safety and performance of a medical device through systematic analysis of clinical data. It must be updated throughout the device lifecycle based on PMCF findings.
How often should the CER be updated?
The CER should be updated whenever significant new clinical data becomes available, after PMCF activities, when there are changes to the device or intended purpose, and at minimum during annual reviews as part of post-market surveillance.
What causes CER rejection by Notified Bodies?
Common reasons include inadequate equivalence demonstration, insufficient clinical data for claims, poorly structured SOTA analysis, missing gap analysis, and lack of clear benefit-risk determination. Structure and logical flow are as important as the data itself.
Which MDCG guidance documents are most relevant for clinical evaluation?
Key documents include MDCG 2020-5 (Equivalence), MDCG 2020-6 (Sufficient Clinical Evidence), MDCG 2020-13 (CEAR Template), MDCG 2020-7 (PMCF Plan), and MDCG 2020-8 (PMCF Evaluation Report).
References
– MDR 2017/745, Article 61(5)
– MDCG 2020-5: Clinical Evaluation – Equivalence