
One Technical File or Two? What the EU AI Act Means for Your MDR Documentation
The countdown has reached its final months. By August 2026, the EU AI Act will be fully enforceable, and for the MedTech sector the stakes are uniquely high. If your medical device uses a neural network for diagnostics, a fuzzy logic controller for drug delivery, or even a simple machine-learning algorithm for signal processing, you are no longer just a "Medical Device Manufacturer." You are now a "High-Risk AI Provider."
Two regulators. Two frameworks. Two sets of documentation requirements, or so it seems. The good news (and it is genuinely good news) is that these two frameworks are more harmonizable than they first appear. You don't need two compliance departments; you need one well-structured approach.
This article breaks down where the MDR and the EU AI Act overlap, where they diverge, and what a harmonized compliance strategy actually looks like in practice in 2026.
The "High-Risk" Convergence
Under Article 6 of the AI Act, any product that already requires a third-party conformity assessment under the MDR (in practice, Class IIa and above) is automatically classified as high-risk AI.
This creates a dual set of obligations over the same product: the MDR focuses on clinical safety and performance, while the AI Act focuses on data governance, transparency, and fundamental rights.
The challenge for 2026 is merging these two into a single Technical File.
3 Pillars of a Harmonized Strategy
1. The Unified Quality Management System (QMS)
Don't build a new QMS for AI. Instead, extend your ISO 13485 framework.
MDR requirement: Risk management via ISO 14971.
AI Act requirement: Systematic risk management throughout the AI lifecycle.
The harmonized move: Integrate AI-specific risks (like algorithmic bias or "hallucinations") into your existing ISO 14971 risk table. Treat a "biased algorithm" with the same clinical rigor as a "hardware failure."
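To make the harmonized move concrete, here is a minimal sketch of a single ISO 14971-style risk register that holds hardware and AI hazards side by side. The entries, scoring scale, and source tags are illustrative, not drawn from any particular standard's template:

```python
from dataclasses import dataclass

@dataclass
class RiskEntry:
    hazard: str
    source: str          # traceability tag: which framework flags this hazard (illustrative)
    severity: int        # 1 (negligible) .. 5 (catastrophic)
    probability: int     # 1 (improbable) .. 5 (frequent)
    mitigation: str

    @property
    def risk_score(self) -> int:
        # Simple severity x probability index, as in a classic risk matrix
        return self.severity * self.probability

# One register, two regulatory sources: AI hazards sit next to hardware hazards
register = [
    RiskEntry("Sensor hardware failure", "MDR / ISO 14971", 4, 2,
              "Redundant sensing, watchdog reset"),
    RiskEntry("Algorithmic bias against under-represented cohort", "AI Act Art. 10", 4, 3,
              "Stratified validation sets, periodic bias audit"),
    RiskEntry("Model output on out-of-distribution input", "AI Act Art. 15", 5, 2,
              "OOD detection, clinician-in-the-loop review"),
]

# The same review threshold applies regardless of whether the hazard is physical or algorithmic
for entry in sorted(register, key=lambda e: e.risk_score, reverse=True):
    print(f"{entry.risk_score:>2}  {entry.hazard}  [{entry.source}]")
```

The point of the sketch is the single data structure: an auditor sees one prioritized list, not two parallel risk files.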
2. Technical Documentation: The "Living" File
The AI Act demands unprecedented transparency. You must be able to explain how your AI reached a conclusion, keep the Annex IV technical documentation up to date across the model's lifecycle, and automatically log events so that decisions remain traceable after deployment.
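One practical building block of a "living" file is an append-only inference log that ties every automated decision back to a frozen model build. The sketch below shows one possible record schema; the field names and the model version string ("cardionet-2.4.1") are hypothetical:

```python
import hashlib
import json
import time

def log_inference(model_version: str, input_bytes: bytes,
                  output_label: str, confidence: float) -> str:
    """Serialise one inference as an audit record (illustrative schema)."""
    record = {
        "ts": time.time(),                                        # when the decision was made
        "model_version": model_version,                           # which frozen model produced it
        "input_sha256": hashlib.sha256(input_bytes).hexdigest(),  # what it saw, without storing raw patient data
        "output": output_label,                                   # what it concluded
        "confidence": round(confidence, 3),                       # how sure it was
    }
    return json.dumps(record)

# Example: one JSON line per automated decision, appended to a write-once log
line = log_inference("cardionet-2.4.1", b"<ecg frame bytes>",
                     "atrial_fibrillation", 0.9234)
```

Hashing the input rather than storing it keeps the log reviewable without turning it into a second repository of personal health data.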
3. Data Governance as a Clinical Requirement
The AI Act mandates that training, validation, and testing datasets be "relevant, representative, and to the best extent possible, free of errors."
In 2026, "dirty data" isn't just a technical hurdle; it's a legal liability. Our engineering teams focus on Data Pedigree: ensuring every byte used to train your MedTech model is documented, ethically sourced, and compliant with both the GDPR and the new AI mandates.
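What "data pedigree" can look like in practice is one provenance record per training-data artifact. The sketch below is a minimal example; the file path, site names, and field names are hypothetical, and a real record would follow your own QMS templates:

```python
import hashlib
from datetime import date

def pedigree_record(path: str, raw_bytes: bytes, source: str,
                    legal_basis: str, representative_of: str) -> dict:
    """One provenance entry per training-data artifact (field names are illustrative)."""
    return {
        "file": path,
        "sha256": hashlib.sha256(raw_bytes).hexdigest(),  # tamper-evident fingerprint of the bytes
        "source": source,                                 # where the data came from
        "legal_basis": legal_basis,                       # GDPR ground for processing
        "representative_of": representative_of,           # population the batch claims to cover
        "recorded": date.today().isoformat(),
    }

# Hypothetical example entry for one dataset shard
entry = pedigree_record(
    "ecg/batch_017.parquet",
    b"<raw parquet bytes>",
    source="Hospital A, anonymised ECG export",
    legal_basis="explicit consent (GDPR Art. 9(2)(a))",
    representative_of="adults 18-90, three EU sites",
)
```

The checksum makes each entry verifiable against the actual bytes, and the representativeness field forces the coverage claim to be written down rather than assumed.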
The Notified Body Reality
The biggest bottleneck of 2026 isn't the law—it's the auditors.
Notified Bodies (NBs) are now being designated to assess both MDR and AI Act compliance simultaneously.
By presenting a Harmonized Technical File, you reduce the friction for your Notified Body.
Instead of asking them to read two separate books, you’re handing them a single, coherent narrative of safety and transparency.
Conclusion: Don't Wait for August
The "Brussels Effect" means that even if you are a US-based firm, if you want to sell in the EU, your AI must be "MDR+AI Act Ready."
At Thaumatec, we bridge the gap between complex embedded engineering and rigorous regulatory strategy.
We don't just write the code; we ensure the code is defensible in front of an auditor.