Michele Lombardi, Stefano Gualandi (2013). A New Propagator for Two-Layer Neural Networks in Empirical Model Learning. DOI: 10.1007/978-3-642-40627-0_35.
A New Propagator for Two-Layer Neural Networks in Empirical Model Learning
Lombardi, Michele; Gualandi, Stefano
2013
Abstract
This paper proposes a new propagator for a set of Neuron Constraints representing a two-layer network. Neuron Constraints are employed in the context of the Empirical Model Learning technique, which enables optimal decision making over complex systems, beyond the reach of most conventional optimization techniques. The approach is based on embedding a Machine Learning-extracted model into a combinatorial model. Specifically, a Neural Network can be embedded in a Constraint Model by simply encoding each neuron as a Neuron Constraint, which is then propagated individually. The price for such simplicity is the lack of a global view of the network, which may lead to weak bounds. To overcome this issue, we propose a new network-level propagator based on a Lagrangian relaxation, solved with a subgradient algorithm. The approach is tested on a thermal-aware dispatching problem on multicore CPUs, where it leads to a massive reduction in the size of the search tree, which is only partially countered by an increased propagation time.
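To make the network-level reasoning concrete, the following is a minimal, illustrative sketch (not the paper's exact formulation) of how a Lagrangian relaxation solved by a subgradient method can produce a bound on the output of a two-layer tanh network with box-constrained inputs: the coupling constraints between the inputs and the hidden pre-activations are relaxed with multipliers, the relaxed problem decomposes into independent one-dimensional subproblems, and the multipliers are updated by subgradient ascent. The function names, the tanh activation, and the grid-based inner minimization are assumptions made for this sketch; only NumPy is assumed.

```python
# Illustrative sketch, not the paper's formulation: subgradient ascent on a
# Lagrangian dual that lower-bounds the output of a two-layer tanh network
#   y = w2 . tanh(W1 x + b1) + b2,   with x in the box [xlo, xhi].
# The coupling constraints z_j = W1[j] . x + b1[j] are relaxed with
# multipliers lam_j, so the relaxed problem splits into independent
# 1-D minimizations over each z_j and a linear minimization over x.
import numpy as np

def lagrangian_lower_bound(W1, b1, w2, b2, xlo, xhi, iters=200, step0=1.0):
    m, _ = W1.shape
    # Interval bounds on the pre-activations implied by the input box.
    zlo = np.minimum(W1 * xlo, W1 * xhi).sum(axis=1) + b1
    zhi = np.maximum(W1 * xlo, W1 * xhi).sum(axis=1) + b1

    lam = np.zeros(m)                    # Lagrange multipliers
    grid = np.linspace(0.0, 1.0, 101)    # sample points inside each [zlo_j, zhi_j]
    best = -np.inf
    for t in range(iters):
        # Subproblem 1: for each hidden unit, minimize
        #   w2[j] * tanh(z) + lam[j] * z   over z in [zlo[j], zhi[j]].
        # (Coarse grid search; an exact 1-D minimization would be needed
        # for a sound bound, but a grid keeps the sketch short.)
        Z = zlo[:, None] + (zhi - zlo)[:, None] * grid[None, :]
        vals = w2[:, None] * np.tanh(Z) + lam[:, None] * Z
        idx = vals.argmin(axis=1)
        zstar = Z[np.arange(m), idx]
        inner1 = vals[np.arange(m), idx].sum()

        # Subproblem 2: minimize (-lam^T W1) . x over the input box.
        c = -(lam @ W1)
        xstar = np.where(c > 0, xlo, xhi)
        inner2 = c @ xstar

        # Dual value: with exact inner minimizations this is a valid lower
        # bound on the network output for any x in the box and any lam.
        dual = inner1 + inner2 + b2 - lam @ b1
        best = max(best, dual)

        # Subgradient step on lam (normalized, diminishing step size).
        g = zstar - (W1 @ xstar + b1)
        norm = np.linalg.norm(g)
        if norm < 1e-9:
            break
        lam = lam + (step0 / (1.0 + t)) * g / norm
    return best

# Tiny usage example with random data (illustrative only).
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
w2, b2 = rng.normal(size=4), 0.5
print(lagrangian_lower_bound(W1, b1, w2, b2, xlo=-np.ones(3), xhi=np.ones(3)))
```

In a setting like the paper's, a bound of this kind would be used by the network-level propagator to tighten the domain of the network output variable beyond what propagating each Neuron Constraint individually can achieve.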