HADA: An automated tool for hardware dimensioning of AI applications

De Filippo, A; Borghesi, A; Boscarino, A; Milano, M
2022

Abstract

In recent years, the uptake of Artificial Intelligence (AI) in industry has been increasing. For many AI techniques, such as Deep Learning, optimization, and planning, computational and storage requirements are significant. Determining the right hardware (HW) architecture, on premise or in the cloud, and its dimensioning for AI algorithms remains a crucial problem. Searching for the optimal solution is often challenging, as it is not trivial to anticipate the behavior of an algorithm on diverse architectures. This is especially true if the AI application must respect quality-of-service constraints or budgets. In this scenario, an automated decision support tool that matches algorithms, user constraints, and HW resources would be a great advantage for companies and practitioners working with AI applications. In this paper, we tackle this challenge with an approach that relies on the Empirical Model Learning paradigm, based on the integration of Machine Learning (ML) models into an optimization problem. The key idea is to integrate domain knowledge held by experts with data-driven models that learn the relationships between HW requirements and AI algorithm performance. In particular, the approach starts by benchmarking multiple AI algorithms on different HW resources, generating data used to train ML models; then, optimization is used to find the best HW configuration that respects user-defined constraints (e.g., budget, time, solution quality). In the experimental evaluation, we validate our approach on a complex problem, namely online algorithms for energy systems, an area characterized by uncertainty and tight HW and real-time constraints. Results show the effectiveness of our approach and its flexibility: we can train the ML models only once and reuse them in the optimization model to tackle a variety of problems, determined by different data instances and user-defined constraints. (C) 2022 Elsevier B.V. All rights reserved.
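
The following is a minimal, illustrative sketch of the pipeline described in the abstract: learn performance models from HW benchmark data, then search for the cheapest HW configuration that satisfies user-defined constraints. It is not the authors' HADA implementation; all column names, candidate configurations, pricing, and thresholds below are hypothetical assumptions, and the constrained search is a plain enumeration rather than the Empirical Model Learning encoding used in the paper.

# Minimal sketch (not the authors' HADA implementation) of the idea in the abstract:
# train ML models on HW benchmarks, then pick the cheapest HW configuration that
# satisfies user-defined runtime and solution-quality constraints.
from itertools import product

import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# 1) Benchmark data: each row is one run of an AI algorithm on a HW configuration.
#    (Hypothetical schema and values, for illustration only.)
bench = pd.DataFrame({
    "cores":            [2, 4, 8, 16, 2, 4, 8, 16],
    "memory_gb":        [4, 8, 16, 32, 8, 16, 32, 64],
    "runtime_s":        [120, 70, 40, 25, 110, 65, 38, 22],
    "solution_quality": [0.90, 0.92, 0.95, 0.96, 0.91, 0.93, 0.95, 0.97],
})

features = ["cores", "memory_gb"]
time_model = RandomForestRegressor(random_state=0).fit(bench[features], bench["runtime_s"])
quality_model = RandomForestRegressor(random_state=0).fit(bench[features], bench["solution_quality"])

# 2) "Optimization" step: here a plain enumeration over candidate configurations;
#    the paper instead embeds the learned models inside a declarative optimization model.
def hourly_cost(cores, memory_gb):
    # Hypothetical linear cloud-pricing model.
    return 0.05 * cores + 0.01 * memory_gb

candidates = [{"cores": c, "memory_gb": m} for c, m in product([2, 4, 8, 16], [4, 8, 16, 32, 64])]

max_runtime_s = 60     # user-defined time constraint (assumed value)
min_quality = 0.94     # user-defined solution-quality constraint (assumed value)

feasible = []
for cfg in candidates:
    x = pd.DataFrame([cfg])[features]
    pred_time = float(time_model.predict(x)[0])
    pred_quality = float(quality_model.predict(x)[0])
    if pred_time <= max_runtime_s and pred_quality >= min_quality:
        feasible.append((hourly_cost(**cfg), cfg, pred_time, pred_quality))

if feasible:
    cost, best, t, q = min(feasible, key=lambda item: item[0])
    print(f"Cheapest feasible configuration: {best} "
          f"(predicted runtime {t:.0f}s, quality {q:.2f}, cost {cost:.2f} $/h)")
else:
    print("No HW configuration satisfies the given constraints.")
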
HADA: An automated tool for hardware dimensioning of AI applications / De Filippo, A; Borghesi, A; Boscarino, A; Milano, M. - In: KNOWLEDGE-BASED SYSTEMS. - ISSN 0950-7051. - Electronic. - 251:(2022), pp. 109199.1-109199.18. [10.1016/j.knosys.2022.109199]
Files in this record:
HADA: an Automated Tool for Hardware Dimensioning of AI Applications.pdf
Embargo until 10/06/2024
Type: Postprint
License: Open Access License. Creative Commons Attribution - NonCommercial - NoDerivatives (CC BY-NC-ND)
Size: 3.45 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/894554
Citations
  • PMC: ND
  • Scopus: 5
  • Web of Science (ISI): 0