Model Agnostic Solution of CSPs via Deep Learning: A Preliminary Study

Galassi Andrea; Lombardi Michele; Mello Paola; Milano Michela
2018

Abstract

Deep Neural Networks (DNNs) have been shaking the AI scene thanks to their ability to excel at Machine Learning tasks without relying on complex, hand-crafted features. Here, we probe whether a DNN can learn how to construct solutions of a Constraint Satisfaction Problem (CSP) without any explicit symbolic information about the problem constraints. We train a DNN to extend a feasible partial solution by making a single, globally consistent variable assignment. The training is done over intermediate steps of the construction of feasible solutions. From a scientific standpoint, we are interested in whether a DNN can learn the structure of a combinatorial problem even when trained on (arbitrarily chosen) construction sequences of feasible solutions. In practice, the network could also be used to guide a search process, e.g., to take into account (soft) constraints that are implicit in past solutions or hard to capture in a traditional declarative model. This research line is still at an early stage, and a number of complex issues remain open. Nevertheless, we already have intriguing results on the classical Partial Latin Square and N-Queens completion problems.
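The record does not include code; as a rough illustration of the idea described in the abstract, the sketch below (not the authors' implementation) trains a small feed-forward network to score possible (cell, value) assignments for a partial Latin square in PyTorch. The network class, encoding, layer sizes, and training loop are all assumptions made for illustration only.

```python
# Minimal, illustrative sketch (assumptions throughout, not the paper's code):
# a feed-forward network that, given a partial Latin square of order n encoded
# as a one-hot tensor, outputs one logit per possible (cell, value) assignment.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AssignmentScorer(nn.Module):
    def __init__(self, n: int, hidden: int = 256):
        super().__init__()
        self.n = n
        # Input: n*n cells, each a length-n one-hot vector (all zeros = unassigned).
        # Output: n*n*n logits, one per (row, column, value) assignment.
        self.net = nn.Sequential(
            nn.Linear(n * n * n, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n * n * n),
        )

    def forward(self, partial: torch.Tensor) -> torch.Tensor:
        # partial: (batch, n, n, n) one-hot encoding of the current partial solution
        return self.net(partial.flatten(start_dim=1))


def training_step(model, optimizer, partial, target):
    # Supervised step: 'target' holds the index of the next assignment
    # (row * n * n + col * n + value), taken from an arbitrarily chosen
    # construction sequence of a known feasible solution.
    logits = model(partial)
    loss = F.cross_entropy(logits, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


# Shape check on dummy data for order-4 squares.
n = 4
model = AssignmentScorer(n)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
partial = torch.zeros(8, n, n, n)            # batch of 8 empty boards
target = torch.randint(0, n * n * n, (8,))   # dummy "next assignment" labels
print(training_step(model, opt, partial, target))
```

At inference time one would presumably mask the logits of already-assigned cells and pick the highest-scoring remaining assignment to extend the partial solution, possibly inside a search procedure, in the spirit of the guidance role mentioned in the abstract.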
2018
Integration of Constraint Programming, Artificial Intelligence, and Operations Research
pp. 254-262
Model Agnostic Solution of CSPs via Deep Learning: A Preliminary Study / Galassi Andrea, Lombardi Michele, Mello Paola, Milano Michela. - PRINT. - (2018), pp. 254-262. (Paper presented at the 15th International Conference on the Integration of Constraint Programming, Artificial Intelligence, and Operations Research, held in Delft, The Netherlands, June 26–29, 2018) [10.1007/978-3-319-93031-2_18].

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/635303

Citations
  • Scopus: 13
  • Web of Science (ISI): 16