A methodology to guide companies in using Explainable AI-driven interfaces in manufacturing contexts


Abstract

The increasing integration of artificial intelligence (AI) technologies in manufacturing processes is raising users' need to understand and interpret the decision-making processes of complex AI systems. Traditional black-box AI models often lack transparency, making it challenging for users to comprehend the reasoning behind their outputs. In contrast, Explainable Artificial Intelligence (XAI) techniques provide interpretability by revealing the internal mechanisms of AI models, making them more trustworthy and facilitating human-AI collaboration. To promote the dissemination of XAI models, this paper proposes a matrix-based methodology for designing XAI-driven user interfaces in manufacturing contexts. It helps map users' needs and identify the “explainability visualization types” that best fit the end users' requirements in the specific context of use. The proposed methodology was applied in the XMANAI European Project (https://ai4manufacturing.eu), which aims to create a novel AI platform supporting XAI-driven decision making in manufacturing plants. Results showed that the methodology can guide companies in the correct implementation of XAI models, realizing the full potential of AI while ensuring human oversight and control.
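
As a purely illustrative sketch of the matrix-based idea described in the abstract (not code from the paper; the need categories, visualization types, and scores below are hypothetical), a needs-to-visualization mapping matrix could be represented as follows:

```python
# Hypothetical sketch of a matrix mapping user needs to
# "explainability visualization types"; all names and scores are
# illustrative and not taken from the XMANAI methodology itself.

# Candidate explainability visualization types (columns of the matrix).
VISUALIZATIONS = ["feature importance bar chart", "decision tree plot", "what-if panel"]

# User needs collected for a given manufacturing context (rows of the matrix),
# each scored on how well every visualization type supports it (0-3).
NEEDS_MATRIX = {
    "understand why a machine was flagged for maintenance": [3, 2, 1],
    "explore alternative production schedules":             [1, 1, 3],
    "justify quality decisions to the shop-floor team":     [2, 3, 1],
}

def best_visualization(needs_matrix, visualizations):
    """Return the visualization type with the highest total score across all needs."""
    totals = [sum(scores[i] for scores in needs_matrix.values())
              for i in range(len(visualizations))]
    return visualizations[totals.index(max(totals))]

if __name__ == "__main__":
    print(best_visualization(NEEDS_MATRIX, VISUALIZATIONS))
```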
Year: 2024
Conference: 5th International Conference on Industry 4.0 and Smart Manufacturing (ISM 2023)
Pages: 3112-3120
Authors: Grandi, Fabio; Zanatto, Debora; Capaccioli, Andrea; Napoletano, Linda; Cavallaro, Sara; Peruzzini, Margherita
Citation: Grandi, F., Zanatto, D., Capaccioli, A., Napoletano, L., Cavallaro, S., Peruzzini, M. (2024). A methodology to guide companies in using Explainable AI-driven interfaces in manufacturing contexts. DOI: 10.1016/j.procs.2024.02.127
Files in this item:

2024 - ProCS A methodology to guide companies in using explanable AI-XMANAI.pdf
Open access
Type: Publisher's version (PDF)
License: Open Access license. Creative Commons Attribution - NonCommercial - NoDerivatives (CC BY-NC-ND)
Size: 927.14 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/972701