Giorgetti, C., Giorgetti, A., & Boscolo-Berto, R. (2025). Establishing new boundaries for medical liability: The role of AI as a decision-maker. Advances in Clinical and Experimental Medicine, 34(10), 1601-1606. https://doi.org/10.17219/acem/208596
Establishing new boundaries for medical liability: The role of AI as a decision-maker
Giorgetti, Claudia; Giorgetti, Arianna; Boscolo-Berto, R.
2025
Abstract
The introduction of artificial intelligence (AI) in healthcare has created novel challenges for the field of medical malpractice. As healthcare professionals increasingly rely on AI in their decision-making processes, traditional medicolegal assessments may struggle to adapt. It is essential to examine AI's role in clinical care, both its current applications and future advancements, to clarify accountability for diagnostic and therapeutic errors. Clinical decision support systems (CDSSs), in particular, unlike traditional medical technologies, act as co-decision makers alongside physicians: they process patient information, medical knowledge, and learned patterns to generate a decision output (e.g., a suggested diagnosis), which the physician should then evaluate. In light of the AI Act, CDSSs cannot function fully autonomously; instead, physicians are assigned an oversight role. It is questionable, however, whether it is always appropriate to assign full responsibility, and consequently liability, to the physician. This is especially true when oversight is limited to reviewing CDSS-generated outputs in a manner that leaves no real control in the physician's hands. Future research should aim to define clear liability allocation frameworks and design workflows that ensure effective oversight, thereby preventing unfair liability burdens.


