UDI and Clinical Evaluation: The Gap Reviewers Find First
I opened a clinical evaluation report last month. Well-written. Strong SOTA. Clear benefit-risk. But the UDI in the header did not match the UDI in EUDAMED. The reviewer flagged it immediately. Not as a minor issue. As a traceability failure that called the entire scope of the CER into question.
That one mismatch triggered a chain of questions. Which device was actually evaluated? Which version? Which intended use? The manufacturer had to prove that the CER covered the exact device registered in EUDAMED. It delayed the submission by weeks.
Most teams treat UDI as an administrative task. A code to generate. A label to print. They do not see it as a clinical evaluation anchor. But reviewers do. They verify that the device evaluated clinically is the same device uploaded to EUDAMED. If the connection breaks, the whole file loses credibility.
Why UDI Matters in Clinical Evaluation
UDI is not just for supply chain tracking. It is the regulatory identity of your device. MDR Article 27 requires that every device entering the market has a unique identifier. That identifier must link the physical device to its technical documentation, its clinical evaluation, and its post-market data.
When a Notified Body or competent authority reviews your file, they start with EUDAMED. They pull the UDI. They check the Basic UDI-DI. Then they open your CER. If the UDI in the CER does not match, they stop reading. They assume the CER was written for a different device or an older version.
UDI is the regulatory thread that ties clinical evaluation to device identity. If it breaks, reviewers cannot confirm that the CER covers the registered device.
This is not a theoretical risk. I have seen submissions rejected because the UDI in the CER referenced an older device model. The manufacturer assumed the differences were minor. The reviewer did not. They asked for proof that the clinical data applied to the current version. The manufacturer could not provide it quickly. The review stalled.
What Reviewers Verify
Reviewers follow a simple logic. They want to confirm that the device you evaluated is the device you will place on the market. UDI is the easiest way to verify that. Here is what they check.
1. UDI Consistency Across Documents
The UDI in your CER must match the UDI in your technical documentation. It must match the UDI in EUDAMED. If you updated the device and issued a new UDI-DI, the CER must reflect that change. If it does not, the reviewer assumes the CER is outdated.
I worked on a case where the manufacturer updated the device software. The change triggered a new UDI-DI. But the CER still referenced the old UDI. The Notified Body asked if the clinical evaluation covered the new software version. The manufacturer said yes. The reviewer asked for evidence. The manufacturer pointed to a change analysis in the technical file. But that analysis was not referenced in the CER. It took three rounds of questions to close the gap.
2. Traceability to EUDAMED
EUDAMED is the single source of truth for device registration. When a reviewer opens your EUDAMED entry, they see the UDI-DI, the device description, the intended use, and the classification. They compare that information to your CER. If the intended use in EUDAMED is broader than the intended use in the CER, they flag it. If the classification differs, they flag it.
This happens more often than it should. Teams register the device in EUDAMED before finalizing the CER. They adjust the intended use during the clinical evaluation process. But they forget to update EUDAMED. The mismatch creates doubt. Reviewers assume either the EUDAMED entry is wrong or the CER is incomplete.
Example: the EUDAMED entry describes a broader intended use than the CER. The reviewer asks which intended use is correct. The manufacturer must clarify and update one or both documents.
3. Link Between UDI and Device Versions
Devices evolve. Software updates. Material changes. New accessories. Each significant change may require a new UDI-DI. If your CER does not clearly state which UDI-DI it covers, reviewers cannot confirm that the clinical data applies to the current version.
I reviewed a CER that covered three device models. Each model had a different UDI-DI. But the CER only mentioned the UDI-DI of the first model in the header. The other two were buried in an appendix. The reviewer missed them initially. During the audit, they asked if the CER covered all three models. The manufacturer pointed to the appendix. The reviewer said the CER structure was unclear. They requested a revised version with all UDI-DIs listed upfront.
How UDI Failures Break Clinical Traceability
When UDI traceability fails, it creates a cascade of problems. The reviewer cannot confirm that the clinical data matches the device. They question the validity of the equivalence claim. They ask for additional studies. They request updates to the PMCF plan. What started as a labeling inconsistency becomes a clinical evaluation gap.
Here is a real example. A manufacturer submitted a CER for a Class IIb surgical instrument. The CER referenced a UDI-DI that was retired six months earlier. The manufacturer had updated the device design and issued a new UDI-DI. But the CER was not updated. The Notified Body asked if the clinical evaluation covered the current version. The manufacturer said yes. The reviewer asked for the change analysis. The manufacturer provided it. But the analysis concluded that the changes were not clinically significant. The reviewer disagreed. They asked for a clinical justification. The manufacturer had to commission an additional literature review. The submission was delayed by three months.
A UDI mismatch is not just a documentation error. It signals a potential gap in clinical traceability. Reviewers treat it as a red flag that requires full investigation.
How to Maintain UDI-Clinical Evaluation Alignment
Keeping UDI and clinical evaluation aligned requires process discipline. It is not enough to update the CER when you issue a new UDI. You need to verify alignment at every stage of the device lifecycle.
1. Lock UDI Early in the CER Process
Do not finalize the CER until you know the final UDI-DI. If the device is still under development, use a provisional UDI and clearly state that the CER will be updated once the final UDI is issued. Do not submit the technical file with a provisional UDI unless you have a clear plan to update it before market release.
2. Cross-Check CER and EUDAMED Before Submission
Before you submit the technical file, open your EUDAMED entry. Compare the UDI, intended use, classification, and device description to the CER. If anything does not match, fix it. Do not assume the reviewer will overlook small inconsistencies. They will not.
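That pre-submission cross-check can even be scripted. Below is a minimal sketch assuming you can export the CER header and the EUDAMED entry as simple dictionaries; the field names and UDI-DI values are illustrative assumptions, not any official export format.

```python
# Minimal consistency check between a CER header and an EUDAMED record.
# Field names, record structure, and UDI-DI values are illustrative
# assumptions; adapt them to however your team exports this data.

FIELDS_TO_CHECK = ["udi_di", "intended_use", "classification", "device_description"]

def find_mismatches(cer: dict, eudamed: dict) -> list[str]:
    """Return the fields where the CER and the EUDAMED entry disagree."""
    mismatches = []
    for field in FIELDS_TO_CHECK:
        # Normalise whitespace and case so only substantive differences surface.
        if cer.get(field, "").strip().lower() != eudamed.get(field, "").strip().lower():
            mismatches.append(field)
    return mismatches

cer = {
    "udi_di": "08700000000017",
    "intended_use": "Single-use surgical instrument for soft-tissue dissection",
    "classification": "IIb",
    "device_description": "Stainless-steel dissector, 18 cm",
}
# Same entry, but EUDAMED states a broader intended use than the CER.
eudamed = dict(cer, intended_use="Surgical instrument for dissection and retraction")

print(find_mismatches(cer, eudamed))  # ['intended_use']
```

Even a crude check like this catches the exact mismatch pattern described above before a reviewer does.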
3. Update the CER When You Issue a New UDI
If you update the device and issue a new UDI-DI, update the CER. State clearly which version the CER covers. If the clinical evaluation still applies to the new version, explain why in a change analysis. Reference that analysis in the CER. Do not force the reviewer to search for it.
4. Track UDI Changes in Your Quality System
UDI changes should trigger a review of clinical documentation. Build that check into your change control process. When the regulatory team issues a new UDI, the clinical team should receive an automatic flag. They verify whether the CER needs updating. If it does, they update it before the device ships.
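The automatic flag described above can be sketched in a few lines. This is an assumption-laden illustration, not a real QMS integration: the `ChangeControl` class, its task queue, and the device names are all hypothetical.

```python
# Sketch of a change-control hook: issuing a new UDI-DI automatically
# opens a clinical-documentation review task in the same step, so the
# CER check cannot be forgotten. Class and field names are hypothetical.

from dataclasses import dataclass, field

@dataclass
class ChangeControl:
    open_tasks: list[str] = field(default_factory=list)

    def issue_new_udi_di(self, device: str, old_udi: str, new_udi: str) -> None:
        # Registering the new identifier and raising the clinical review
        # flag happen together; neither can ship without the other.
        self.open_tasks.append(
            f"Verify CER for {device}: does it cover {new_udi} (was {old_udi})?"
        )

qms = ChangeControl()
qms.issue_new_udi_di("Dissector X", "08700000000017", "08700000000024")
print(qms.open_tasks[0])
```

The design point is simple: the UDI change and the clinical review task are one transaction, not two steps that different teams must remember to coordinate.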
Example: the UDI is updated in EUDAMED but not in the CER. The reviewer discovers the mismatch during document review. The manufacturer must prove the CER still applies to the current version.
What This Means for PMCF and Post-Market Surveillance
UDI traceability does not stop at the CER. It extends into PMCF and post-market surveillance. When you collect post-market data, you need to link that data to a specific UDI. If your UDI tracking is weak, you cannot prove which device version generated which clinical outcome.
Reviewers check this connection during PMCF plan reviews. They want to see that your PMCF data will be tagged with UDI. They want to know how you will aggregate data across device versions if you issue multiple UDI-DIs. If your plan does not address this, they will ask for clarification.
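The UDI tagging reviewers look for amounts to one rule: every PMCF record carries the UDI-DI of the device version that produced it, so outcomes can be split per version on demand. A minimal sketch, with illustrative record structures and values:

```python
# Sketch of UDI-tagged PMCF records. Each outcome carries the UDI-DI of
# the device version that generated it. Record fields, outcome labels,
# and UDI-DI values are illustrative assumptions.

from collections import defaultdict

pmcf_records = [
    {"udi_di": "08700000000017", "outcome": "success"},
    {"udi_di": "08700000000017", "outcome": "complication"},
    {"udi_di": "08700000000024", "outcome": "success"},
]

def outcomes_by_version(records: list[dict]) -> dict:
    """Group outcome counts per UDI-DI so each device version is analysable on its own."""
    per_version = defaultdict(lambda: defaultdict(int))
    for rec in records:
        per_version[rec["udi_di"]][rec["outcome"]] += 1
    return {udi: dict(counts) for udi, counts in per_version.items()}

print(outcomes_by_version(pmcf_records))
```

With that structure in place, answering "which outcomes belong to which version?" is a query, not a months-long re-analysis.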
I worked with a manufacturer who collected post-market data for three device versions. Each version had a different UDI-DI. But the PMCF database did not track UDI. When the Notified Body reviewed the PMCF report, they could not tell which outcomes applied to which version. The manufacturer had to re-analyze the data and resubmit the report. It added months to the review cycle.
Final Reflection
UDI is not just a code. It is the regulatory anchor that ties your device to its clinical evaluation. When that anchor slips, reviewers lose confidence in the entire file. They question whether the CER covers the right device. They ask for proof. They delay approval.
The fix is simple. Verify UDI alignment before submission. Cross-check EUDAMED and the CER. Update the CER when you issue a new UDI. Build UDI tracking into your PMCF system. Treat UDI as part of clinical traceability, not just labeling compliance.
Because when reviewers open your file, they will check the UDI first. If it does not match, they will question everything else.
Next post: How EUDAMED transparency changes the way competent authorities audit your technical file.
Peace,
Hatem
Clinical Evaluation Expert for Medical Devices
Follow me for more insights and practical advice.
Frequently Asked Questions
What is a Clinical Evaluation Report (CER)?
A CER is a mandatory document under MDR 2017/745 that demonstrates the safety and performance of a medical device through systematic analysis of clinical data. It must be updated throughout the device lifecycle based on PMCF findings.
How often should the CER be updated?
The CER should be updated whenever significant new clinical data becomes available, after PMCF activities, when there are changes to the device or intended purpose, and at minimum during annual reviews as part of post-market surveillance.
What causes CER rejection by Notified Bodies?
Common reasons include inadequate equivalence demonstration, insufficient clinical data for claims, poorly structured SOTA analysis, missing gap analysis, and lack of clear benefit-risk determination. Structure and logical flow are as important as the data itself.
Which MDCG guidance documents are most relevant for clinical evaluation?
Key documents include MDCG 2020-5 (Equivalence), MDCG 2020-6 (Sufficient Clinical Evidence), MDCG 2020-13 (CEAR Template), MDCG 2020-7 (PMCF Plan), and MDCG 2020-8 (PMCF Evaluation Report).
Need Expert Help with Your Clinical Evaluation?
Get personalized guidance on MDR compliance, CER writing, and Notified Body preparation.
References
– MDR 2017/745, Article 27
– MDCG 2018-1, Guidance on Basic UDI-DI and changes to UDI-DI
Deepen Your Knowledge
Read Complete Guide to Clinical Evaluation under EU MDR for a comprehensive overview of clinical evaluation under EU MDR 2017/745.