
Common Non Conformities in MDR Clinical Evaluation


Dr Nikhil Khadabadi, MD, MS, MRCS

Chief Medical Officer

Clinician-led excellence in clinical evidence and MDR compliance

Ex TÜV SÜD Clinical Reviewer · MDR clinical evidence and PMCF expert · Board member and active advisor to several robotics-focused medical device companies

"With over fifteen years of experience across orthopaedic surgery, notified body review and industry, I bring frontline medical and regulatory insight to every evidence program. At Eclevar MedTech we focus on clinical data that is clear, defendable and ready for review."

"I have sat on both sides of the table. I have reviewed clinical files for a Notified Body and I have authored and defended them for manufacturers while working with several Notified Bodies. Most nonconformities are predictable and preventable once you know how reviewers think. My simple advice is: keep a clean structure that is searchable, traceable, and defensible."

What this article gives you

  • The top patterns of nonconformity we see in technical documentation
  • Concrete fixes you can implement
  • Copy ready snippets for your plans and reports
  • A link to a companion cheat sheet on common deficiencies and quick remedies
Client Success Story

Nihon Kohden Corporation

"Eclevar provided thorough explanations about document preparation policy and relevant guidelines, sharing their expertise in a way that allowed us to create documentation perfectly adapted to our situation and needs. The team responded promptly and arranged meetings in a timely manner."

Y.G., Product Manager for EEG-1200K

Read Full Story

1. General clinical documentation quality

"The CER is only partially searchable and lacks bookmarks or clickable contents. Updated sections are not provided with tracked changes alongside a clean version, which makes review and traceability difficult."

Frequent issues

  • CER and CEP files are not fully searchable or easy to navigate
  • There is no clickable table of contents or bookmarks in long PDFs
  • Updated documents are sent without a tracked changes version
  • Final signed versions of key documents are missing at module closure
  • The CER does not clearly state the primary regulatory approach for the clinical evaluation
  • Author CVs and DOIs are missing, unsigned, or out of date
  • The gap between literature search cut off and submission is too long for higher risk devices

Practical fixes

  • Make every CER and CEP fully searchable with clickable contents and bookmarks
  • Always submit tracked and clean versions together, with a two line change note
  • Add a short Pathway Statement in section one

Example
“This clinical evaluation is based on clinical data from the literature and manufacturer-generated investigations, supported by PMS and PMCF. No equivalence route is used. Article 61(10) is not applied.”

  • Keep a small Author Dossier in the appendix with signed CVs and current DOIs
  • Align your literature search cut off with device risk and the pace of state of the art changes
Reviewer mindset: If I cannot find it in under three clicks I assume it is missing.

2. Managing device families and variants in the clinical evaluation

"The CER puts nine device models into one report without explaining the differences between them or how each one is supported."

Frequent issues

  • Multiple models included in one CER without clear model level support
  • Evidence pooled across sizes or indications with no explanation
  • Worst case or highest risk variants used without scientific justification
  • Claims written as family wide when evidence supports only specific models
  • No device family map linking each model to the supporting data

Recommended actions

  • Build a Device Family Matrix listing each model, size, configuration, intended use, and the evidence that supports it
  • If using a highest risk model, provide a short bridging justification showing why it represents the other variants, using preclinical and clinical comparisons
  • Mark every clinical claim as Family level or Model specific
  • When a model lacks direct data, explain how bridging, similarity testing, or limited equivalence supports it
Reviewer mindset: If I cannot see how each model is supported I cannot sign off on any of them.
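The family map itself can be kept honest with a simple completeness check. Below is a minimal sketch in Python, not a regulatory template: the model names, configurations, and evidence labels are entirely hypothetical, but the idea is that holding the matrix as structured data makes a missing-evidence gap impossible to overlook.

```python
# Illustrative sketch only: a device family matrix as structured data.
# All model names, configurations, and evidence labels are hypothetical.
family_matrix = [
    {"model": "A-100", "configuration": "standard", "intended_use": "adult",
     "claim_level": "family", "evidence": ["Literature set 1", "PMCF registry"]},
    {"model": "A-100S", "configuration": "small", "intended_use": "adult",
     "claim_level": "model specific", "evidence": ["Bridging rationale to A-100"]},
    {"model": "A-200", "configuration": "extended", "intended_use": "adult",
     "claim_level": "model specific", "evidence": []},  # gap: no mapped support
]

def unsupported_models(matrix):
    """Return the models that have no supporting evidence mapped to them."""
    return [row["model"] for row in matrix if not row["evidence"]]

print(unsupported_models(family_matrix))  # a reviewer-style gap check
```

Running the check before submission surfaces exactly the question a reviewer would ask: which models does the file not cover?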

3. Clinical Evaluation Plan essentials

"The CEP lists “improve outcomes” as the main clinical benefit for a class IIb implant but does not define measurable outcomes or acceptance thresholds."

Frequent issues

  • Intended purpose statement is vague and not certificate ready
  • Indications and contraindications are incomplete or scattered across documents
  • Clinical benefits are written as general intentions, not measurable outcomes
  • No clear link between benefits and outcome parameters
  • Benefit risk acceptance thresholds are missing or not taken from the state of the art
  • The CEP and CER are not aligned on the outcomes used to judge success

Practical fixes

  • Write one clear intended purpose statement that you would be comfortable printing on the certificate
  • List intended patient groups and intended users in simple language in the CEP and ensure they match the IFU
  • Convert each clinical benefit into one or more measurable outcomes
  • Define acceptance thresholds for performance and safety based on the state of the art and copy them into the CEP
  • Ensure the CER uses the same outcomes and thresholds when presenting device data
  • For low risk devices where numbers are harder to set, explain clearly how you will still judge whether the device does what you claim
Reviewer mindset: If I cannot see how you will measure the benefit I cannot judge the benefit risk.
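The benefit-to-outcome mapping can also be kept machine-checkable so the CEP and CER never drift apart. A minimal sketch, with hypothetical benefit names, outcome parameters, and thresholds: each claimed benefit must carry at least one measurable outcome with an explicit acceptance threshold, and anything that does not is flagged.

```python
# Hypothetical benefit-to-outcome map; every benefit needs at least one
# measurable outcome with an explicit acceptance threshold.
cep_benefits = {
    "pain relief": [
        {"outcome": "VAS pain score reduction at 6 months", "threshold": ">= 2.0 points"},
    ],
    "restored mobility": [
        {"outcome": "range of motion improvement at 12 months", "threshold": ">= 10 degrees"},
    ],
    "improved quality of life": [],  # gap: no measurable outcome defined
}

def unmeasurable_benefits(benefits):
    """Return benefits that lack a measurable outcome or a threshold."""
    return [name for name, outcomes in benefits.items()
            if not outcomes or any(not o.get("threshold") for o in outcomes)]
```

Any benefit returned by this check is exactly the kind of "general intention" a reviewer will query.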

4. State of the art that truly drives thresholds

"The CER provides a long background on the disease but does not define outcome ranges from the literature. The device cannot be compared to any concrete benchmark and the benefit risk conclusion is not supported by state of the art data."

Frequent issues

  • The state of the art section reads like a textbook chapter rather than a focused evidence review
  • Alternative treatment options are listed but not compared in a structured way
  • Safety and performance outcomes from the literature are described narratively, not turned into numbers
  • No explicit acceptance ranges are defined for key outcomes
  • The CEP does not reuse these ranges as thresholds so the CER cannot make direct comparisons
  • For legacy devices, the file does not clearly show how newer alternatives have evolved and what that means for benefit risk

Practical fixes

  • Keep the medical background short and create a separate focused state of the art section
  • Identify the main alternative treatments and summarize their key safety and performance outcomes in a small table
  • Turn those outcomes into numeric ranges or clear qualitative thresholds that you will use as acceptance criteria
  • Copy these thresholds into the CEP so they become the planned benchmarks for the device
  • In the CER compare your device data directly against these thresholds and state clearly whether you meet them or where you differ and why
  • For legacy devices show how newer alternatives influence what is now an acceptable benefit risk balance
Reviewer mindset: If you do not show me the state of the art thresholds I cannot see whether your device truly keeps up with current practice.
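Once the literature ranges are turned into numeric acceptance criteria, the CER comparison becomes mechanical. A minimal sketch, using made-up outcome names and limits: the thresholds dictionary stands in for the ranges extracted from the state of the art, and the function reports, per outcome, whether the device data meets them.

```python
# Hypothetical acceptance thresholds derived from state-of-the-art ranges;
# outcome names and limits are illustrative, not from any real file.
thresholds = {
    "revision_rate_2yr": {"max": 0.05},   # upper bound of literature range
    "pain_improvement": {"min": 2.0},     # minimal clinically important difference
    "infection_rate": {"max": 0.02},
}

device_results = {"revision_rate_2yr": 0.03, "pain_improvement": 2.4,
                  "infection_rate": 0.03}

def check_against_sota(results, thresholds):
    """Compare device data against each state-of-the-art acceptance limit."""
    verdicts = {}
    for outcome, limits in thresholds.items():
        value = results.get(outcome)
        ok = value is not None
        if ok and "max" in limits:
            ok = value <= limits["max"]
        if ok and "min" in limits:
            ok = value >= limits["min"]
        verdicts[outcome] = ok
    return verdicts
```

A `False` verdict is not automatically a failure of the evaluation, but it is a deviation the CER must explicitly discuss rather than leave for the reviewer to discover.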

5. Making your literature search reviewer ready

"The literature search uses only one database. Studies are excluded as not relevant without explanation and quality is rated as high medium or low without saying what that means for this device."

Frequent issues

  • Only one scientific database is searched
  • Search strategy and filters are not fully documented
  • Excluded studies are marked as “not relevant” with no reason given
  • Appraisal terms like high, medium, low quality are used without device specific definitions
  • No separation of evidence for the device under evaluation, equivalent devices, and the state of the art
  • The final number of included studies in each evidence group is not clearly stated
  • Full text copies of included studies are not available for review

Practical fixes

  • Use more than one scientific database that fits your clinical field
  • Document the full search strategy, including dates and filters, in a repeatable format
  • For every excluded paper provide a short reason such as wrong population, wrong device, or wrong endpoint
  • Define in advance what “long enough follow up” means for your device and what “clinically significant” means for your claims
  • Rate each study using these device specific criteria and classify studies as pivotal or supportive
  • Keep separate lists for the device under evaluation, equivalent or similar devices, and the state of the art
  • Ensure full text copies of all included papers are available for the reviewer
Reviewer mindset: If I cannot follow how you searched and judged the evidence I cannot rely on your clinical conclusions.
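The "no unexplained exclusions" rule is easy to enforce at the point of data entry. A minimal sketch, assuming an in-house screening log; the reason categories shown are examples, not a prescribed taxonomy, and the point is simply that a vague "not relevant" is rejected before it can reach the report.

```python
# Sketch of an exclusion log that refuses vague reasons. The reason
# categories are illustrative examples, not a mandated taxonomy.
ALLOWED_REASONS = {"wrong population", "wrong device", "wrong endpoint",
                   "wrong study type", "duplicate publication"}

def record_exclusion(log, paper_id, reason):
    """Append an exclusion entry; reject 'not relevant'-style reasons."""
    if reason not in ALLOWED_REASONS:
        raise ValueError(f"Exclusion reason must be specific, got: {reason!r}")
    log.append({"paper": paper_id, "reason": reason})
    return log
```

With this in place, the PRISMA-style flow in the CER can state a concrete reason for every excluded paper by construction.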

6. Data transferability

“A pivotal study was conducted outside the Union without clear alignment to ISO 14155. No gap analysis was provided, so the reviewer could not judge whether the results were valid and transferable to the intended European population.”

Frequent issues

  • No clear justification when a clinical investigation has not been performed
  • Pivotal studies conducted outside the Union with no formal gap analysis
  • Studies not aligned with ISO 14155 or MDR Annex XV without discussion of limitations
  • No explanation of how population differences, practice patterns, or follow up influence transferability
  • Missing summaries of authority interactions for MDR related studies including approvals, conditions, or refusals

Practical fixes

  • When no clinical investigation is performed provide a clear point by point rationale explaining why the existing evidence is sufficient
  • For studies conducted outside the Union prepare a simple gap analysis comparing study setting, population, practice, and follow up to European use and explain the impact on transferability
  • For studies not aligned to ISO 14155 or MDR Annex XV, describe the main deviations and their implications for data quality and bias
  • State clearly which study results you consider pivotal and which are used only as supportive
  • Summarise the key outcomes of authority correspondence for MDR related investigations, including decisions, major questions, and how they were addressed
Reviewer mindset: If you do not explain how external studies map to European practice I cannot treat their results as reliable evidence for your device.

7. Making PMS and PMCF truly feed the CER

"A company with five years on the market proposed PMCF through literature only. There were no sales or complaint volumes, no timelines, and the PMCF endpoints did not match the outcomes defined in the CEP."

Frequent issues

  • PMS sections show narrative comments but no yearly volumes or trends
  • PMCF plans rely only on literature even when the device has substantial market history
  • PMCF endpoints do not match the benefit and safety outcomes set in the CEP
  • Sample sizes and timelines for PMCF activities are not stated
  • Serious incidents, FSCAs, recalls, reportable events, and CAPAs are not clearly summarised
  • Links between PMS and PMCF findings and CER updates are not described

Practical fixes

  • Report sales volumes by year and main markets so trends can be seen
  • Show total complaints and key complaint categories by year
  • List severe adverse events, deaths, FSCAs, recalls, reportable events, and CAPAs in a simple table
  • Run queries in the main vigilance databases and write a short summary of what you found
  • In the PMCF plan define sample sizes, timelines, and endpoints that match the benefit and safety outcomes in your CEP
  • Explain how and when PMCF results will be reviewed and used to update the CER and SSCP
  • If PMCF is not applicable for a specific device give a clear rationale and describe how PMS activities and literature surveillance will still update the clinical picture
Reviewer mindset: If PMS and PMCF do not change anything in your CER I question whether they are truly being done.
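Showing volumes "so trends can be seen" usually means normalising complaints against exposure. A minimal sketch with invented yearly figures: complaints are expressed per 10,000 units sold, which is what makes a rising trend visible even while absolute sales grow.

```python
# Hypothetical yearly figures for illustration only.
sales      = {2021: 12000, 2022: 15000, 2023: 18000}  # units sold per year
complaints = {2021: 30,    2022: 33,    2023: 54}

def complaint_rates_per_10k(sales, complaints):
    """Complaints per 10,000 units sold, per year, rounded to one decimal."""
    return {year: round(complaints.get(year, 0) / units * 10_000, 1)
            for year, units in sales.items()}

print(complaint_rates_per_10k(sales, complaints))
# {2021: 25.0, 2022: 22.0, 2023: 30.0} -> the 2023 uptick is the signal
```

In this invented example the absolute complaint count grows every year, but only the normalised rate shows that 2022 actually improved and 2023 genuinely worsened, which is the kind of trend statement a PSUR or CER update needs.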

Request Free Trial of MILO EDC and Survey to Collect RWE Data

Transform your clinical data collection with Milo Healthcare's comprehensive EDC and survey platform designed specifically for MDR compliance and real-world evidence gathering.

Request Free Trial

Guillaume Charles

Clinical Project Manager

8. Keeping IFU, SSCP, and claims aligned with evidence

"Marketing materials promise faster recovery but the CER does not assess this claim, the SSCP does not show quantitative risks over time, and the expected lifetime of the implant is not described."

Frequent issues

  • Marketing and website claims are not backed up by analysis in the CER
  • The SSCP lists risks but does not give quantitative figures or relate them to time
  • The patient section of the SSCP is missing even though it is required
  • Lifetime information for implants is unclear or not included where it should be
  • New claims appear in brochures that are not reflected in any clinical document

Practical fixes

  • Build a simple claims matrix that lists every external claim and points to supporting analysis and references in the CER
  • In the SSCP provide quantitative risks and link them to time, for example early versus late events, or rates per patient year
  • State success rates for key outcomes in the SSCP using clear numbers taken from the CER
  • Include a patient friendly SSCP where it is required and check that the language matches the main clinical messages
  • For implants describe the expected lifetime in line with the legal basis, the risk analysis, and the available clinical and preclinical data
  • When new claims are added in marketing update the claims matrix and check that the CER and SSCP still support them
Reviewer mindset: If your claims do not match your CER and SSCP I cannot trust the story you tell to patients and clinicians.
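The claims matrix is another artefact that pays off when held as data rather than prose. A minimal sketch, with hypothetical claims and references: each external claim points at the CER analysis that supports it, and anything without a reference is flagged before marketing publishes it.

```python
# Sketch of a claims matrix; claims, sources, and CER references are
# hypothetical examples, not taken from any real file.
claims_matrix = [
    {"claim": "Pain reduction at six months", "source": "website",
     "cer_ref": "CER section 7.3"},
    {"claim": "Faster recovery", "source": "brochure page 2",
     "cer_ref": None},  # gap: no supporting analysis in the CER
]

def unsupported_claims(matrix):
    """Return external claims that no CER analysis backs up."""
    return [row["claim"] for row in matrix if not row["cer_ref"]]
```

Running this check whenever a brochure is updated catches exactly the "faster recovery" situation described in the quote above.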

9. Risk management alignment

"The IFU lists device specific adverse events that are not discussed in the CER and the risk management report shows no clear clinical input."

Frequent issues

  • Residual risks and device specific adverse events do not match across the IFU, CER, and risk management file
  • New risks identified in clinical data are not carried through into the risk management documentation
  • Contraindications and warnings appear in the IFU without a clear link to the underlying risk evaluations
  • There is no visible evidence that someone with clinical expertise has contributed to the risk management process
  • Trend information from PMS and PMCF is not used to review and update risk estimates

Practical fixes

  • Cross check that every device specific adverse event seen in clinical data appears consistently in the IFU, the CER, and the risk management file
  • When a contraindication or warning is added in the IFU record it as a risk control in the risk report and explain how it reduces the residual risk
  • Ensure at least one person involved in risk management has clinical experience with the device and include a short CV in the file
  • Use PMS and PMCF findings to review and adjust risk estimates and document the outcome of that review
  • Add a short alignment table listing key risks, associated adverse events, risk controls, and where they are described in the IFU and CER
Reviewer mindset: If your risks, adverse events, and IFU content do not line up I cannot accept your overall benefit risk conclusion.
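The alignment table can be backed by a set-based cross-check. A minimal sketch with invented event names: given the adverse events seen in clinical data, it reports which documents fail to mention each one, which is the discrepancy a reviewer spots first.

```python
# Sketch: cross-check that every adverse event seen in clinical data
# appears in the IFU, the CER, and the risk management file.
# All event names are hypothetical.
clinical_events  = {"infection", "device migration", "allergic reaction"}
ifu_events       = {"infection", "device migration"}
cer_events       = {"infection", "device migration", "allergic reaction"}
risk_file_events = {"infection", "allergic reaction"}

def misaligned_events(clinical, ifu, cer, risk_file):
    """Map each document to the clinical events it fails to mention."""
    docs = {"IFU": ifu, "CER": cer, "risk file": risk_file}
    return {name: sorted(clinical - listed)
            for name, listed in docs.items() if clinical - listed}
```

An empty result is the state to aim for before module submission; anything else identifies the precise document and event to fix.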

Self score: Are you review ready?

Give yourself two points for each Yes and zero for each No.

  • ➤ Can a reviewer reach your acceptance thresholds from the state of the art in under one minute?
  • ➤ Do your PMCF endpoints mirror your CEP outcomes?
  • ➤ Does every clinical claim in marketing point to a cited paragraph in the CER?
  • ➤ Does your device family matrix prove coverage for each model?
  • ➤ Can you show signed author CVs and DOIs less than twelve months old?

Your score

Eight to ten means you are ready for scrutiny

Five to seven means tighten alignment

Zero to four means rebuild your core evidence logic

Trusted by Leading Brands

Partnership Success Story

Testimonial from RegenLab on a chronic wound trial
with more than 14 sites across Europe

With Milo, we transformed the way data becomes clinical data across multiple sites, ensuring high-quality, real-time data capture and seamless integration with clinical workflows. This innovation has significantly improved data accuracy, patient engagement, and regulatory compliance in chronic wound clinical trials, enabling faster insights and better patient outcomes.

Watch Now
Eclevar x RegenLab Partnership

How Eclevar can support your team

Clinical strategy and authoring

  • Author or refresh CEP and CER with clear pathway statements, benefit to outcome mapping, and state of the art driven thresholds
  • Build device family matrices, bridging rationales, and PMCF strategies that feed back into the CER and risk file

Evidence operations

  • Run multi database literature programs with full traceability and appraisal tailored to your device class
  • Design studies, protocols, and Milo Health EDC workflows so that data flows directly into your clinical reports and submissions

PMS and vigilance analytics

  • Set up sales and complaint trend frameworks that drop straight into CERs, PSURs, and SSCPs
  • Link PMS and PMCF findings to clear actions in your risk management, labelling, and next CER update

Labelling and claims governance

  • Write and update SSCPs including the patient part with quantitative risks and success rates aligned to the CER
  • Maintain a claims matrix that ties marketing and website language back to specific evidence and citations in your clinical documentation

Ready to avoid these nonconformities for your device?

Most manufacturers do not fail MDR clinical evaluation because of safety issues. They fail because the evidence story is unclear, scattered or hard for a reviewer to trace.
That is exactly the gap we help close.

When you work with us you do not just receive documents.
You get clear logic, measurable outcomes, reviewer level traceability and real world data pipelines that stand up to scrutiny.

If you want:

  • your next CER or CEP to pass with fewer questions
  • a PMCF plan that actually updates your evidence
  • a reviewer ready state of the art with real thresholds
  • a second opinion on where your file is weak

Dr. Nikhil Khadabadi

Book a twenty minute clinical evidence diagnostic to identify your top three gaps and the next actions to fix them.

I will personally review your situation with you. No obligation. No generic sales call.

Book your consultation
If you want a reviewer view on your clinical evidence before your notified body gives you theirs, book a call and I will walk you through it.
