Medtech Insight is part of Pharma Intelligence UK Limited

Leaked EU AI Act Promises Good News For SMEs And Start-Ups

Executive Summary

The draft EU Artificial Intelligence Act has been leaked on LinkedIn, and it contains good news for medtech small and medium-sized enterprises (SMEs).

Small and medium-sized enterprises (SMEs), including those in the medtech sector, will benefit from priority access to regulatory sandboxes and simplified technical documentation requirements, according to the leaked version of the EU’s AI Act.

The European Parliament and Council of the EU reached a provisional agreement on the proposed AI Act at the end of 2023. The final text now needs to be formally adopted by the EU’s co-legislators but had not been made public until the leaked document became available. (Also see "Medtech Further Examines Repercussions With EU On Cusp Of Adopting World-First AI Act" - Medtech Insight, 14 Dec, 2023.)

There are also key changes for high-risk AI systems. However, these are unlikely to affect medical devices, which remain within the legislation’s scope because they are risk-assessed under other EU harmonized legislation (i.e. the MDR and IVDR).

The 892-page EU AI Act was leaked on LinkedIn, where it has been reshared nearly 500 times. Rather appropriately, professionals have been using AI, in the form of ChatGPT, to digest the content.

According to the document, manufacturers will self-assess whether products are high risk through an AI Impact Assessment.

But the AI Act has consistently stipulated that products undergoing third-party conformity assessment under the Medical Device Regulation (MDR) and In Vitro Diagnostic Regulation (IVDR) are automatically considered high-risk anyway.

What is clear is that manufacturers of high-risk devices will have to maintain certain elements of technical documentation for a period of 10 years. The draft AI Act has also put in place specific conditions for real world testing.

As for next steps, the text will be discussed by the Telecommunications and Information Society Working Party (which handles the creation of the digital single market in Europe) on 24 January 2024, with formal adoption by COREPER, the Committee of the Permanent Representatives of the Governments of the Member States to the European Union, expected on 2 February 2024.

No Change To High-Risk Classification

A significant change in the final text of the AI Act is that systems will not be considered high-risk if they do not pose a significant risk of harm to the health, safety, or fundamental rights of natural persons, including by not materially influencing the outcome of decision-making. This observation was made by Barry Scannell, consultant, and Leo Moore, partner, in their article for law firm William Fry.

The principle that AI products will be considered high-risk under the AI Act if they undergo third-party conformity assessment under other relevant Union harmonization legislation (i.e. the MDR) remains in the leaked draft agreement. So it is likely that all medical devices will still be considered high-risk AI systems.

In addition to this, Scannell and Moore reported, AI systems will always be considered high-risk if they perform profiling of natural persons. In this context, profiling refers to the “automated processing of personal data” used to evaluate certain personal aspects, to analyze or predict aspects such as health.

AI medical devices will likely also be processing personal data relating to health, and therefore will fall into this category as well.

Manufacturers are required to register their AI systems in EU databases “under Article 60”, even if they are not high-risk, highlighted Scannell and Moore.

"Where AI is a safety component of any devices regulated by EU MDR or IVDR it has the potential to be classified as high-risk AI under the AI act." - Barry Scannell, consultant lawyer, William Fry

MDR Legislative Interplay

The European Parliament’s provisional agreement amendments suggested that requirements for high-risk AI systems could be integrated into the legislative text of the MDR and IVDR.

The draft text states that flexibility will be afforded to allow manufacturers to make operational decisions on how to ensure a product’s compliance with Union harmonization legislation in the “best way”.

“For example, a decision by the manufacturer to integrate a part of the necessary testing and reporting processes, information and documentation required under this Regulation into already existing documentation and procedures”, the document says.

The text suggests that there will be amendments made to standards rather than legislation to clarify this situation.

SME Priority For Regulatory Sandboxes, Conditions For Real World Testing

High-risk AI systems can be tested in real world conditions, provided manufacturers develop a comprehensive plan approved by the market surveillance authority, said Scannell and Moore.

To achieve approval for the testing plan, the plan must adhere to certain conditions, including: EU database registration; appointment of an EU manufacturer or legal representative; adherence to data transfer safeguards; and ensuring reversibility of AI systems.

Testing should not exceed six months unless extended for a valid reason. Any serious incidents should be reported to the national market surveillance authority, asserted Scannell and Moore.

Participants must give documented informed consent, with full awareness of the testing’s nature, Scannell and Moore noted.

Simplified Technical Documentation For SMEs

The technical documentation burden for SMEs has been somewhat alleviated.

A simplified form will be provided by the commission for SMEs submitting documentation under Annex IV, which lists the technical documentation required, reported Scannell and Moore.

The commission has confirmed that notified bodies will accept this form when assessing the conformity of devices.

Documentation Retention For High-Risk Devices

High-risk AI system manufacturers must keep specific technical documentation for 10 years after product launch, including: 

  • details about the quality management system;
  • records of any changes approved by notified bodies, if relevant;
  • any decisions or documents issued by notified bodies, where applicable; and
  • the EU declaration of conformity.
