Mateusz Grochowski, Agnieszka Jabłonowska, Francesca Lagioia, Giovanni Sartor (2021). Algorithmic Transparency and Explainability for EU Consumer Protection: Unwrapping the Regulatory Premises. Critical Analysis of Law, 8(1), 43-63.
Algorithmic Transparency and Explainability for EU Consumer Protection: Unwrapping the Regulatory Premises
Francesca Lagioia; Giovanni Sartor
2021
Abstract
The principles of transparency and explainability are landmarks of the current EU approach to artificial intelligence. Both are invoked in the policy guidelines as values governing algorithmic decision-making, while providing rationales for existing normative provisions on information duties, access rights, and control powers. This contribution addresses the debate on transparency and explainability from the EU consumer market perspective. The consumers’ position relative to algorithmic decision-making is considered, and the risks they face concerning mass surveillance, exploitation, and manipulation are discussed. The concept of algorithmic opacity is analyzed, distinguishing technology-based opacity, which is intrinsic to design choices, from relational opacity toward users. The response of EU law is then considered. The emerging approach to algorithmic transparency and explainability is connected to the broader regulatory goals concerning transparency in consumer markets. It is argued that EU law focuses on adequate information being provided to lay consumers (exoteric transparency), rather than on understandability to experts (esoteric transparency). A discussion follows on the benefits of transparency, on its costs, and on the extent to which transparency can be implemented without affecting performance. Finally, the merits of a transparency-based regulation of algorithms are discussed and insights are provided on regulating transparency and explainability within the EU law paradigm.
File | Type | License | Size | Format
---|---|---|---|---
36279-Article Text-93526-1-10-20210402.pdf (open access) | Publisher's PDF version | Open Access License: Creative Commons Attribution (CC BY) | 276 kB | Adobe PDF