Your PMCF Report Fails Before the First Review

Written by Hatem Rabeh, MD, MSc Ing

I have seen PMCF reports rejected before the reviewer reaches page ten. Not because the data was weak. Not because the surveillance was insufficient. Because the structure made it impossible to verify compliance. The evaluator could not trace how clinical safety and performance were actually confirmed through post-market evidence.

This happens more often than manufacturers realize. A PMCF report can contain valid data, thorough analysis, and real evidence. But if the structure does not align with what MDCG 2020-8 requires, the reviewer cannot perform their assessment. They cannot map your findings to your risk management. They cannot verify that your clinical evaluation was updated with post-market evidence. The report becomes unusable.

MDCG 2020-8 exists to standardize how manufacturers demonstrate that ongoing surveillance confirms the clinical safety and performance established at the pre-market stage. It is not just a fill-in template. It is a framework that reflects the logical steps a reviewer must take to verify MDR compliance.

Most deficiencies I see are not about missing data. They are about broken logic chains. The report fails to show how post-market findings feed back into benefit-risk determination, how they validate or challenge equivalence claims, and how they inform the clinical evaluation update.

What MDCG 2020-8 Actually Requires

MDCG 2020-8 provides guidance on the structure and content of a PMCF Evaluation Report under MDR Article 61 and Annex XIV Part B. It is aligned with the broader clinical evaluation requirements in MDR Annex XIV and cross-references MDCG 2020-13 on clinical evaluation.

The report must demonstrate two things clearly:

First, that the PMCF activities defined in your PMCF plan were actually executed. This means showing what methods were used, what data was collected, and how the data meets the objectives stated in the plan.

Second, that the findings from PMCF were systematically evaluated and fed into your clinical evaluation and risk management. This is where most reports break down. The data is there. The integration is not.

Reviewers expect to see a logical flow from raw post-market data to clinical conclusions. If that flow is unclear, they cannot verify compliance. If they cannot verify compliance, the report gets rejected.

Common Deficiency
The PMCF report presents data tables, complaint summaries, and literature findings, but does not analyze what those findings mean for the device’s clinical safety and performance. The reviewer is left to guess how post-market evidence impacts the benefit-risk profile.

The Structure Reviewers Expect

MDCG 2020-8 outlines seven sections. Each section serves a specific function in the evaluation logic. Missing one or misunderstanding its purpose compromises the entire report.

Section 1: Device Description and Scope

This section identifies the device covered by the report. It includes the device name, model numbers, intended purpose, indications, contraindications, and the scope of the PMCF evaluation.

The scope must be explicit. If your PMCF plan covers multiple device variants, the report must clarify which variants are addressed and whether the data can be generalized across them. If equivalence to another device was claimed, state it here and reference the justification from your clinical evaluation.

Reviewers use this section to verify alignment between the PMCF report, the PMCF plan, and the clinical evaluation report. If the scope is vague, they cannot confirm that the PMCF activities actually addressed the device in question.

Section 2: Methods of PMCF

Here you describe the PMCF methods actually implemented. This includes proactive studies, registry participation, systematic literature surveillance, complaint and vigilance analysis, customer feedback mechanisms, and any other data sources used.

The key is to connect each method to the objectives defined in the PMCF plan. If the plan stated that you would conduct a prospective study to monitor long-term implant performance, this section must describe the study design, enrollment, follow-up periods, and data collection protocol.

Reviewers compare this section against the PMCF plan to assess whether the manufacturer followed through. If methods deviate from the plan, you must explain why and justify that the deviation still meets the PMCF objectives.

What I see often is a generic description of methods without linking them to specific clinical questions. The report states that complaints were reviewed and literature was monitored, but does not explain how those activities targeted the residual risks and gaps identified during pre-market evaluation.

Key Insight
Your methods section should read like an executed protocol, not a list of activities. Each method must be traceable to a clinical objective from your PMCF plan.
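One way to enforce that traceability before submission is a simple cross-check between the executed methods and the plan objectives. The sketch below is illustrative only; the objective IDs, method names, and data structure are hypothetical, not anything defined by MDCG 2020-8.

```python
# Hypothetical traceability check: every executed PMCF method should map to
# a plan objective, and every plan objective should be covered by a method.

plan_objectives = {
    "OBJ-1": "Confirm long-term implant performance",
    "OBJ-2": "Monitor residual risk of infection",
}

executed_methods = [
    {"method": "Prospective cohort study", "objective": "OBJ-1"},
    {"method": "Complaint trend analysis", "objective": "OBJ-2"},
    {"method": "Literature surveillance",  "objective": None},  # untraced
]

def untraced_methods(methods, objectives):
    """Return methods that cannot be traced to any plan objective."""
    return [m["method"] for m in methods if m["objective"] not in objectives]

def uncovered_objectives(methods, objectives):
    """Return plan objectives that no executed method addresses."""
    covered = {m["objective"] for m in methods}
    return [oid for oid in objectives if oid not in covered]

print(untraced_methods(executed_methods, plan_objectives))
print(uncovered_objectives(executed_methods, plan_objectives))
```

Anything returned by either function is a gap the reviewer will find first; either link the method to an objective or justify the deviation explicitly in the report.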

Section 3: Results of PMCF Activities

This is where you present the raw findings. What data did you collect? How many complaints were analyzed? How many publications were reviewed? What adverse events were reported? What patient outcomes were observed in your study?

The results section must be comprehensive and structured. Data should be presented clearly, with tables, figures, or summaries that allow the reviewer to understand the volume and nature of the evidence collected.

But presenting data is not enough. Reviewers expect you to organize results by the clinical questions they address. If one of your PMCF objectives was to confirm safety in a specific patient subgroup, present the data relevant to that subgroup separately and clearly.

Many reports dump all collected data into one section without organizing it by clinical relevance. This makes it nearly impossible for a reviewer to evaluate whether the PMCF objectives were met.

Section 4: Analysis and Interpretation of PMCF Results

This is the most critical section and the one most commonly mishandled. Here you analyze what the post-market findings mean for your device’s clinical safety and performance.

You must interpret the data in the context of your pre-market benefit-risk assessment. Does the post-market evidence confirm your initial conclusions? Does it reveal new risks or side effects? Does it validate the clinical performance you claimed?

If your device was determined to be equivalent to another device, you must assess whether post-market data supports that equivalence claim. If real-world use shows differences in outcomes, those differences challenge equivalence and must be addressed.

Reviewers expect to see explicit statements. Not implications. Not hints. Direct conclusions about whether clinical safety and performance are confirmed or need reassessment.

I have reviewed reports where this section was one paragraph summarizing that no major issues were found. That is not analysis. That is avoidance.

Common Deficiency
The analysis section restates the results without interpreting them. There is no explicit assessment of whether the benefit-risk profile remains favorable. The reviewer cannot determine if post-market data confirms or contradicts the pre-market clinical evaluation.

Section 5: Impact on Benefit-Risk Determination

This section must demonstrate how PMCF findings influence your benefit-risk conclusion. If the post-market data confirms the expected safety and performance, state it explicitly and reference the specific findings that support that conclusion.

If new risks emerged, describe them, assess their severity and frequency, and explain how they impact the overall benefit-risk balance. If mitigations were implemented, describe them here.

Reviewers are looking for a clear statement: based on post-market evidence, is the benefit-risk profile still acceptable for the intended use and target population?

If your answer is yes, you must justify it with specific data. If your answer is no, you must explain what actions you are taking.

This section is where regulatory scrutiny intensifies. If the logic is weak, if the conclusions are not traceable to the data, the report will be challenged.

Section 6: Conclusions

The conclusions section summarizes the key findings from PMCF and states whether the device continues to meet its intended clinical performance and safety profile under real-world conditions.

This section should also address whether the PMCF plan remains appropriate or needs revision. If gaps were identified, if new risks emerged, if the surveillance methods proved insufficient, state what changes will be made.

Reviewers expect actionable conclusions. Not vague statements about ongoing monitoring. If your PMCF revealed something important, the conclusion must reflect it and propose a clear path forward.

Section 7: Date of Next PMCF Evaluation Report

MDR requires periodic updates. This section states when the next PMCF evaluation report will be issued. The frequency depends on the device risk class, the maturity of the clinical evidence, and the findings from the current report.

If your current PMCF identified uncertainties or emerging risks, the next report may need to be produced sooner than the standard cycle. Justify the chosen periodicity based on the clinical and regulatory context.
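That reasoning can be made explicit as a small planning rule. The intervals below are invented planning defaults for illustration, not values prescribed by MDR or MDCG 2020-8; your actual periodicity must be justified in the report itself.

```python
from datetime import date, timedelta

# Hypothetical default cycles per risk class (months). These are placeholder
# values for illustration, NOT regulatory requirements.
DEFAULT_INTERVAL_MONTHS = {"III": 12, "IIb": 12, "IIa": 24, "I": 60}

def next_report_due(issue_date, risk_class, emerging_risk=False):
    """Propose a next-report date; shorten the cycle if new risks emerged."""
    months = DEFAULT_INTERVAL_MONTHS[risk_class]
    if emerging_risk:
        months = max(6, months // 2)  # halve the cycle, floor at 6 months
    return issue_date + timedelta(days=months * 30)

# A class IIa device whose current PMCF surfaced an emerging risk:
print(next_report_due(date(2024, 1, 15), "IIa", emerging_risk=True))
```

Whatever logic you apply, state it in this section: the reviewer should see why the chosen date follows from the device class and the findings, not just the date itself.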

What Breaks the Evaluation Chain

The structure outlined in MDCG 2020-8 is designed to create a logical chain from data collection to clinical conclusions. When one link in that chain is weak or missing, the entire report loses credibility.

The most common break happens between Section 3 and Section 4. Data is presented, but it is never analyzed in the context of clinical safety and performance. The reviewer is expected to infer what the data means. That is not acceptable under MDR.

Another common issue is the disconnect between the PMCF report and the clinical evaluation report. The PMCF report generates findings, but those findings never appear in the updated clinical evaluation. The feedback loop is broken.

Reviewers notice this immediately. They check whether the conclusions in your PMCF report are reflected in the clinical evaluation update. If they are not, the entire post-market surveillance system appears dysfunctional.

Key Insight
Your PMCF report is not a standalone document. It is part of a continuous clinical evaluation process. If the findings do not feed into your CER, risk management, and IFU updates, the system is incomplete.

How to Align with Reviewer Expectations

Start by mapping your PMCF report structure against MDCG 2020-8 before you write a single sentence. Each section must serve its intended purpose. Do not merge sections. Do not skip sections. Do not assume the reviewer will fill in the gaps.
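That mapping can be turned into a mechanical pre-writing check. The headings below follow this article's own seven-section summary of MDCG 2020-8; adapt the strings to your template's exact wording.

```python
# Minimal completeness check: does the draft contain every expected section?

EXPECTED_SECTIONS = [
    "Device Description and Scope",
    "Methods of PMCF",
    "Results of PMCF Activities",
    "Analysis and Interpretation of PMCF Results",
    "Impact on Benefit-Risk Determination",
    "Conclusions",
    "Date of Next PMCF Evaluation Report",
]

def missing_sections(draft_headings):
    """Return expected sections absent from the draft, in template order."""
    present = {h.strip().lower() for h in draft_headings}
    return [s for s in EXPECTED_SECTIONS if s.lower() not in present]

draft = ["Device Description and Scope", "Methods of PMCF", "Conclusions"]
print(missing_sections(draft))
```

A non-empty result means the evaluation chain is already broken before any data is presented.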

When presenting data, organize it by clinical question. Make it easy for the reviewer to see how each data source addresses a specific objective from your PMCF plan.

When analyzing data, state your conclusions explicitly. Do not imply. Do not suggest. State whether the post-market evidence confirms clinical safety and performance or not.

When assessing benefit-risk, reference specific findings that support your conclusion. If you claim the benefit-risk remains favorable, point to the data that demonstrates it.

And finally, ensure that your PMCF report conclusions are reflected in your clinical evaluation report. The two documents must be aligned. If your PMCF identified a new risk, that risk must appear in your updated CER and risk management file.

This alignment is not automatic. It requires active coordination between clinical affairs, regulatory affairs, and quality management. If those teams are not communicating, the documentation will not align, and the reviewer will notice.

Why This Matters Beyond Compliance

Following the structure in MDCG 2020-8 is not just about passing a Notified Body review. It is about building a post-market surveillance system that actually works.

When your PMCF report follows a clear evaluation structure, it becomes a tool for decision-making. It helps you see whether your device performs as expected. It helps you identify risks before they become crises. It helps you justify continued market access.

When the structure is weak, the report becomes a formality. A document produced to check a regulatory box but never used to inform real decisions. That is when post-market surveillance fails.

Reviewers recognize this. They can tell when a PMCF report was written to satisfy a requirement versus when it was written to genuinely evaluate clinical evidence. The difference is in the logic, the traceability, and the willingness to state clear conclusions.

A strong PMCF report does not hide uncertainties. It does not downplay risks. It presents evidence, interprets it honestly, and draws conclusions that inform the next steps. That is what MDCG 2020-8 is designed to support.

If your PMCF report cannot do that, it will fail the review. Not because the data is missing. Because the evaluation is incomplete.

Peace,
Hatem
Clinical Evaluation Expert for Medical Devices
Follow me for more insights and practical advice.

Frequently Asked Questions

What is a Clinical Evaluation Report (CER)?

A CER is a mandatory document under MDR 2017/745 that demonstrates the safety and performance of a medical device through systematic analysis of clinical data. It must be updated throughout the device lifecycle based on PMCF findings.

How often should the CER be updated?

The CER should be updated whenever significant new clinical data becomes available, after PMCF activities, and when the device or its intended purpose changes. For higher-risk devices such as class III and implantables, Notified Bodies generally expect updates at least annually; for lower-risk classes, the update frequency must be defined and justified in the post-market surveillance plan.

What causes CER rejection by Notified Bodies?

Common reasons include inadequate equivalence demonstration, insufficient clinical data for claims, poorly structured SOTA analysis, missing gap analysis, and lack of clear benefit-risk determination. Structure and logical flow are as important as the data itself.

Which MDCG guidance documents are most relevant for clinical evaluation?

Key documents include MDCG 2020-5 (Equivalence), MDCG 2020-6 (Sufficient Clinical Evidence), MDCG 2020-7 (PMCF Plan Template), MDCG 2020-8 (PMCF Evaluation Report Template), and MDCG 2020-13 (Clinical Evaluation Assessment Report Template).

Need Expert Help with Your Clinical Evaluation?

Get personalized guidance on MDR compliance, CER writing, and Notified Body preparation.

References:
– MDR 2017/745 Article 61, Annex XIV Part B
– MDCG 2020-8: Post-Market Clinical Follow-up (PMCF) Evaluation Report Template
– MDCG 2020-13: Clinical Evaluation Assessment Report Template

Related Resources

Read our complete guide to PMCF under EU MDR: PMCF Plan & Report under EU MDR

Or explore Complete Guide to Clinical Evaluation under EU MDR