Author:
Sarah de Heer is a doctoral candidate at the Faculty of Law at Lund University. In her doctoral research, she examines the impact of AI-driven medical devices in precision medicine on the right to health in Sweden. More specifically, Sarah scrutinises the extent to which the conformity assessment procedure, which allows medical devices to be placed on the internal market of the European Union, can safeguard the quality pillar under the right to health. Sarah’s doctoral research is funded by the Wallenberg AI, Autonomous Systems and Software Program – Humanity and Society (WASP-HS).
Imagine the following: you visit your General Practitioner because you are experiencing unexplained weight loss and belly pain that spreads to the back. Your General Practitioner decides to test your blood using an AI-driven medical device in precision medicine that predicts the likelihood of pancreatic cancer. As your blood sample demonstrates a high likelihood of pancreatic cancer, the multidisciplinary team suggests starting chemotherapy. Since multiple combinations of medicinal drugs can be included in the chemotherapy treatment plan, the tumour in your pancreas is tested with another AI-driven medical device in precision medicine to determine its sensitivity to different medicinal products. Based on these results, you are given the medicinal products that have shown high sensitivity and are expected to give you the best results. After treatment, the tumour appears to be in remission.
The above scenario appears promising: quicker and more accurate diagnosis and treatment thanks to the combination of precision medicine and AI. While precision medicine tailors diagnosis and treatment to the data of an individual belonging to a specific (sub)group of the general population, this tailored approach to medicine is time-consuming and costly. Adding AI allows precision medicine to become faster and more scalable. However, the addition of AI – while undoubtedly revolutionary for the healthcare sector – has its own disadvantages stemming from the inherent characteristics of AI, which include inaccuracy. The question then becomes: what if the AI-driven medical device in precision medicine is inaccurately trained, leading to incorrect output? Going back to the scenario: imagine that the AI-driven medical devices incorrectly predict that you do not have cancer, or recommend a cancer treatment that does not suit you best and actually results in severe and irreversible side effects.
Unfortunately, this is not an unlikely scenario. Training an AI-driven medical device in precision medicine to identify, for instance, pancreatic cancer requires the training, validation, and testing datasets to be complete and accurate. While this is a particularly demanding task, it is a vital one to ensure the health of all individuals in the population. If the datasets on which the AI-driven medical device in precision medicine is trained only include data from certain segments of the population, the device would enhance the health of individuals who resemble those datasets while leaving individuals belonging to other demographics behind. Consequently, the overall health of some individuals may improve while it simultaneously deteriorates for others. As such, the right to health may either be enhanced or impeded depending on whether the individual resembles the dataset used.
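To make this concern concrete, consider the following minimal Python sketch. It is purely hypothetical – the groups, labels, and predictions are invented for illustration and do not stem from any real device – but it shows how a seemingly acceptable overall accuracy can conceal very poor performance for an under-represented group:

```python
# Hypothetical illustration: overall accuracy can hide poor performance for
# demographic groups that are under-represented in the training data.
from collections import defaultdict

# Each record: (demographic group, true diagnosis, model prediction) - invented values
predictions = [
    ("group_A", 1, 1), ("group_A", 0, 0), ("group_A", 1, 1), ("group_A", 0, 0),
    ("group_A", 1, 1), ("group_A", 0, 0), ("group_A", 1, 1), ("group_A", 0, 0),
    ("group_B", 1, 0), ("group_B", 0, 1),  # under-represented group, misclassified
]

correct, total = defaultdict(int), defaultdict(int)
for group, truth, prediction in predictions:
    total[group] += 1
    correct[group] += int(truth == prediction)

print(f"Overall accuracy: {sum(correct.values()) / sum(total.values()):.0%}")  # 80%
for group in total:
    print(f"Accuracy for {group}: {correct[group] / total[group]:.0%}")  # group_B: 0%
```

In this invented example, the device looks reliable on average, yet every patient from the smaller group is misdiagnosed – precisely the kind of harm that aggregate figures can mask.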
The right to health, a fundamental right, rests upon four pillars, namely 1) availability, 2) accessibility, 3) acceptability, and 4) quality.[1] While AI-driven medical devices in precision medicine may affect the pillars of availability, accessibility and acceptability both positively and negatively, this doctoral project specifically scrutinises the quality pillar of the right to health. However, quality is not a fixed standard with a clear interpretation or methodology. Rather, given the ever-changing nature of medical science and legal requirements[2], the notion of ‘quality’ is a moving target that is continuously evolving.[3] Moreover, as the interpretation of quality hinges on the context in which it is used, there is no single global interpretation. Zooming in on the European Union, the provision of health care should be – amongst other things – safe and effective.[4]
In short, the safety aspect of quality requires that healthcare materials, including AI-driven medical devices in precision medicine, do not harm patients or inflict avoidable injuries.[5] The effectiveness requirement under the quality pillar entails that the provision of health care should be based on evidence[6] and aimed at improving an individual’s health condition[7]. Both the safety and the effectiveness of AI-driven medical devices in precision medicine hinge heavily upon the datasets used for training, testing and validation.
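As a purely illustrative sketch of what documenting this dependence could look like – the demographic categories, proportions, and tolerance threshold below are hypothetical assumptions, not requirements drawn from the Medical Devices Regulation or the Artificial Intelligence Act – a manufacturer might, for instance, compare the composition of the training data with that of the intended patient population:

```python
# Hypothetical sketch: flag demographic groups that are under-represented in a
# training dataset relative to the intended patient population.

# Invented example proportions (share of each group in the data / population)
training_shares = {"age_under_50": 0.75, "age_50_plus": 0.25}
population_shares = {"age_under_50": 0.40, "age_50_plus": 0.60}

# Assumed tolerance: flag a group whose share in the training data is less
# than half of its share in the intended patient population.
TOLERANCE = 0.5

for group, population_share in population_shares.items():
    training_share = training_shares.get(group, 0.0)
    if training_share < TOLERANCE * population_share:
        print(f"Under-represented: {group} "
              f"(training {training_share:.0%} vs population {population_share:.0%})")
```

Such a check is, of course, only a starting point; whether evidence of this kind suffices is exactly what the conformity assessment procedure discussed below is meant to determine.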
To ensure the safety and effectiveness of medical devices placed on the internal market of the European Union, the manufacturer needs to successfully complete the conformity assessment procedure under the Medical Devices Regulation[8]. Where the medical device includes an AI component, the Artificial Intelligence Act complements this conformity assessment procedure.[9] Thus, the regulatory framework governing the conformity assessment procedure for AI-driven medical devices in precision medicine comprises the Medical Devices Regulation and the Artificial Intelligence Act. Under the conformity assessment procedure, the manufacturer needs to submit evidence demonstrating the safety and effectiveness of their AI-driven medical device in precision medicine to a third party, namely the Notified Body.[10]
During the conformity assessment procedure, this so-called ‘conformity assessment body’ reviews the evidence submitted by the manufacturer. Notified Bodies issue certificates indicating conformity with the requirements laid down in the Medical Devices Regulation and the Artificial Intelligence Act.[11] As such, Notified Bodies are indispensable in verifying the evidence attesting to the safety and effectiveness of AI-driven medical devices in precision medicine. After having successfully completed the conformity assessment procedure, the manufacturer may draw up the EU Declaration of Conformity[12] and affix the CE marking to the AI-driven medical device in precision medicine[13]. Both demonstrate compliance with EU law, including the provisions on safety and effectiveness. Thus, the EU Declaration of Conformity – and consequently the CE marking – carries significant legal weight: by drawing it up, the manufacturer assumes responsibility for the device’s compliance.[14]
However, the question is to what extent the manufacturer can provide evidence attesting to the safety and effectiveness of an AI-driven medical device in precision medicine. Moreover, to what extent can the Notified Body accurately examine the evidence submitted by the manufacturer of the AI-driven medical device in precision medicine? Given the inaccuracy surrounding AI, the manufacturer and the Notified Body face an immense task: they are to ensure the ‘quality’ pillar under the right to health with respect to AI-driven medical devices in precision medicine, thereby enhancing the health of individuals across all demographics of the population.
In short, the use of AI-driven medical devices in precision medicine may lead to quicker and more accurate health care. Their use, however, brings challenges, especially when AI-driven medical devices in precision medicine are trained with improper datasets that may – for instance – be incomplete or unrepresentative. This may enhance the ‘quality’ pillar under the right to health for those individuals who are represented in the datasets, while simultaneously leading to a decline in quality for those individuals who do not resemble the datasets. Moreover, the four pillars, ‘availability’, ‘accessibility’, ‘acceptability’, and ‘quality’, are interlinked. This means that enhancing quality may also positively affect the other pillars. The other side of the coin, however, is that a decline in quality may have a negative impact on the other pillars of ‘availability’, ‘accessibility’, and ‘acceptability’. As such, individuals matching the datasets may experience an overall strengthening of their right to health, while individuals not mirroring the datasets may see their overall right to health backsliding.
Thus, caution is needed when implementing AI-driven medical devices in precision medicine in the healthcare sector. Specifically, it is vital that the regulatory regime is fit to address the problems associated with faulty datasets. Furthermore, Notified Bodies – in their capacity of overseeing the conformity assessment procedure – ought to be given the tools to ensure that quality, and thus the overall right to health, is enhanced not only for a select group of the population but for everyone.
[1] Office of the High Commissioner for Human Rights, CESCR General Comment No. 14: The Right to the Highest Attainable Standard of Health (Art. 12) (Document E/C.12/2000/4), para. 12.
[2] Santa Slokenberga, ‘The standard of care and implications for paediatric decision-making. The Swedish viewpoint’ in Clayton Ó Néill and others (eds), Routledge Handbook of Global Health Rights (Taylor & Francis Group 2021) 122, 128.
[3] Alicia Ely Yamin, ‘The right to health’ in Jackie Dugard, Bruce Porter and Daniela Ikawa, Research Handbook on Economic, Social and Cultural Rights As Human Rights (Edward Elgar Publishing Limited 2020) 159, 162-163.
[4] Additionally, the provision of healthcare should also be 1) patient-centred, 2) timely, 3) efficient, and 4) equitable. Institute of Medicine, Crossing the Quality Chasm: A New Health System for the 21st Century (National Academy Press 2001), 39-40.
[5] European Commission, Future EU agenda on quality of health care with a special emphasis on patient safety (Publications Office 2014), 25-26.
[6] Helen Hughes, ‘Patient Safety and Human Rights’ in Clayton Ó Néill and others (eds), Routledge Handbook of Global Health Rights (Taylor & Francis Group 2021) 259, 261.
[7] European Commission, Future EU agenda on quality of health care with a special emphasis on patient safety (Publications Office 2014), 25-26.
[8] For more information about the conformity assessment procedure of medical devices, please see Article 52 Medical Devices Regulation.
[9] For more information about the conformity assessment procedure for AI systems, please see Article 43 Artificial Intelligence Act.
[10] Article 53 Medical Devices Regulation and Article 43 Artificial Intelligence Act.
[11] Article 56 Medical Devices Regulation and Points 3.2 and 4.6, subpara. 2 Annex VII Artificial Intelligence Act.
[12] Article 10(6) Medical Devices Regulation.
[13] Article 20(1) Medical Devices Regulation.
[14] Article 19(1) Medical Devices Regulation.