Embedding Decision Trees and Random Forests in Constraint Programming

Bonfietti, Alessio; Lombardi, Michele; Milano, Michela
2015

Abstract

In past papers, we have introduced Empirical Model Learning (EML) as a method to enable Combinatorial Optimization on real-world systems that are impervious to classical modeling approaches. The core idea of EML is to embed a Machine Learning model in a traditional combinatorial model. So far, the method has been demonstrated using Neural Networks and Constraint Programming (CP). In this paper we add one more technique to the EML arsenal by devising methods to embed Decision Trees (DTs) in CP. In particular, we propose three approaches: 1) a simple encoding based on meta-constraints; 2) a method using attribute discretization and a global table constraint; 3) an approach based on converting a DT into a Multi-valued Decision Diagram, which is then fed to an mdd constraint. Finally, we show how to embed in CP a Random Forest, a powerful type of ensemble classifier based on DTs. The proposed methods are compared in an experimental evaluation, highlighting their strengths and weaknesses.
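The three approaches are only described at a high level in this record, so no encoding from the paper is reproduced here. As a purely illustrative sketch of the first idea (encoding each root-to-leaf path of the tree with implication-style meta-constraints), the snippet below embeds a hypothetical two-attribute decision tree in a CP model using Google OR-Tools CP-SAT; the toy tree, the attribute bounds and every name in the code are assumptions of this sketch, not material taken from the paper.

from ortools.sat.python import cp_model

# Hypothetical toy decision tree over two integer attributes (not from the paper):
#   if x1 <= 3:    -> class 0
#   elif x2 <= 5:  -> class 1
#   else:          -> class 0
model = cp_model.CpModel()
x1 = model.NewIntVar(0, 10, "x1")
x2 = model.NewIntVar(0, 10, "x2")
y = model.NewIntVar(0, 1, "y")  # class predicted by the embedded tree

# Meta-constraint style encoding: one Boolean literal per root-to-leaf path;
# when a literal is true, the branch conditions of that path and the class
# of its leaf are enforced.
leaf_a = model.NewBoolVar("leaf_a")  # x1 <= 3            -> y = 0
leaf_b = model.NewBoolVar("leaf_b")  # x1 > 3 and x2 <= 5 -> y = 1
leaf_c = model.NewBoolVar("leaf_c")  # x1 > 3 and x2 > 5  -> y = 0

model.Add(x1 <= 3).OnlyEnforceIf(leaf_a)
model.Add(y == 0).OnlyEnforceIf(leaf_a)

model.Add(x1 > 3).OnlyEnforceIf(leaf_b)
model.Add(x2 <= 5).OnlyEnforceIf(leaf_b)
model.Add(y == 1).OnlyEnforceIf(leaf_b)

model.Add(x1 > 3).OnlyEnforceIf(leaf_c)
model.Add(x2 > 5).OnlyEnforceIf(leaf_c)
model.Add(y == 0).OnlyEnforceIf(leaf_c)

# The paths of a decision tree partition the input space, so half-reified
# implications plus an exactly-one constraint on the path literals are
# enough to link (x1, x2) to the class the tree would predict.
model.Add(leaf_a + leaf_b + leaf_c == 1)

# Example query: find an input that the embedded tree classifies as class 1.
model.Add(y == 1)
solver = cp_model.CpSolver()
status = solver.Solve(model)
if status in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    print("x1 =", solver.Value(x1), "x2 =", solver.Value(x2), "y =", solver.Value(y))

The table- and MDD-based approaches mentioned in the abstract would instead replace the per-path implications with a single global constraint (a table over the discretized attributes, or an mdd constraint built from the converted tree), and a Random Forest embedding would repeat one such block per tree and aggregate the per-tree class variables with a voting constraint.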
Year: 2015
Published in: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Pages: 74-90
Bonfietti, A., Lombardi, M., Milano, M. (2015). Embedding Decision Trees and Random Forests in Constraint Programming. Springer Verlag [10.1007/978-3-319-18008-3_6].

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/554980

Warning: the displayed data have not been validated by the university.

Citations
  • Scopus: 27
  • Web of Science (ISI): 25