
A New Propagator for Two-Layer Neural Networks in Empirical Model Learning / Michele Lombardi; Stefano Gualandi. - PRINT. - 8124:(2013), pp. 448-463. (Paper presented at the Eighteenth International Conference on Principles and Practice of Constraint Programming, held in Quebec City, Canada, October 8-12, 2013) [10.1007/978-3-642-40627-0_35].

A New Propagator for Two-Layer Neural Networks in Empirical Model Learning

LOMBARDI, MICHELE
2013

Abstract

This paper proposes a new propagator for a set of Neuron Constraints representing a two-layer network. Neuron Constraints are employed in the context of the Empirical Model Learning technique, which enables optimal decision making over complex systems beyond the reach of most conventional optimization techniques. The approach is based on embedding a Machine Learning-extracted model into a combinatorial model. Specifically, a Neural Network can be embedded in a Constraint Model by simply encoding each neuron as a Neuron Constraint, which is then propagated individually. The price for such simplicity is the lack of a global view of the network, which may lead to weak bounds. To overcome this issue, we propose a new network-level propagator based on a Lagrangian relaxation, which is solved with a subgradient algorithm. The approach is tested on a thermal-aware dispatching problem on multicore CPUs, where it leads to a massive reduction of the size of the search tree, only partially countered by an increased propagation time.
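The abstract mentions solving a Lagrangian relaxation with a subgradient algorithm to obtain network-level bounds. The sketch below illustrates the generic projected-subgradient scheme on a toy 0-1 problem — it is not the paper's propagator, and the function names (`lagrangian_bound`, `subgradient_ascent`) and the instance are purely illustrative. Relaxing a coupling constraint with a multiplier makes the inner minimization decompose per variable, and ascending on the multiplier tightens the resulting lower bound.

```python
# Illustrative sketch of a projected-subgradient method for a Lagrangian
# dual (NOT the paper's network-level propagator).
# Toy problem: min c.x over x in {0,1}^n subject to a.x >= b.
# Relaxing a.x >= b with multiplier lam >= 0 gives
#   L(lam) = min_x c.x + lam * (b - a.x),
# a valid lower bound on the optimum for every lam >= 0.

def lagrangian_bound(c, a, b, lam):
    # The inner minimization decomposes per variable: set x_i = 1
    # exactly when its reduced cost c_i - lam*a_i is negative.
    n = len(c)
    x = [1 if c[i] - lam * a[i] < 0 else 0 for i in range(n)]
    value = sum(c[i] * x[i] for i in range(n)) \
        + lam * (b - sum(a[i] * x[i] for i in range(n)))
    return value, x

def subgradient_ascent(c, a, b, steps=100, step0=1.0):
    # Maximize L(lam) over lam >= 0 with diminishing steps step0/t,
    # projecting onto the nonnegative orthant after each update.
    lam, best = 0.0, float("-inf")
    n = len(c)
    for t in range(1, steps + 1):
        value, x = lagrangian_bound(c, a, b, lam)
        best = max(best, value)                      # best bound so far
        g = b - sum(a[i] * x[i] for i in range(n))   # subgradient at lam
        lam = max(0.0, lam + (step0 / t) * g)        # projected step
    return best

# Toy instance: min 3x1 + 2x2 + 4x3  s.t.  2x1 + 3x2 + x3 >= 4
bound = subgradient_ascent([3, 2, 4], [2, 3, 1], 4)
```

On this toy instance the true optimum is 5 (take x1 = x2 = 1), while the dual optimum is 3.5, so the iterates approach 3.5 from below; the gap between the two is the integrality gap typical of Lagrangian bounds on 0-1 problems.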
2013
Lecture Notes in Computer Science - Principles and Practice of Constraint Programming
pp. 448-463
Michele Lombardi; Stefano Gualandi
Files in this product:
Any attachments are not displayed

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/390366
Warning: the displayed data have not been validated by the university.

Citations
  • Scopus: 5
  • Web of Science: 5