L. Benini (2005). Power-aware Computing Systems.
Power-aware Computing Systems
Benini, Luca
2005
Abstract
Rapidly increasing chip densities and processor speeds have made energy dissipation a leading concern in computer design. The problem raised by energy consumption is especially severe for a whole class of computing devices that has recently become almost ubiquitously available: mobile devices such as notebooks, PDAs, and mobile phones. On the one hand, these devices are equipped with only a very limited power supply, so any computation on them should be especially careful about resource usage. Worse still, battery technology for these devices has not kept pace with advances in processor technology and the growing complexity of software. On the other hand, cooling mechanisms are becoming more and more important. Recent trends suggest that processor power consumption doubles every four years, and cooling costs rise exponentially with heat. Future processors will be hotter than light bulbs and will require energy-management solutions more cost-effective than the cooling fans processors use today. Again, this problem is especially severe for space-constrained devices.

Decisions at the hardware level influence how software will be able to affect power dissipation. Similarly, decisions at the software level, that is, at the operating system, compiler, virtual machine, or application level, influence the processor's workload. However, most current research is done in a vacuum, making assumptions about the execution environment that do not take the effect of other techniques into account. The sheer number of combined hardware/software decisions is the first reason why low-power design is a complex optimization problem. The problem is further complicated by the fact that decisions along any one dimension involve tradeoffs along others. One tradeoff is between power and performance. Another is that minimizing power may introduce sharp power variations that impair chip reliability. The third reason low-power design is nontrivial is that no decision exists in isolation. A compiler can emit code to slow down the processor during memory stalls, but an operating system can override the compiler's decision by speeding up the processor at exactly those times. Without synergy between the compiler and the operating system, frequency thrashing can occur, resulting in power and performance degradation. More generally, without synergy among decisions at different levels of design, the effect of any power-reduction technique will be short-lived. For these three reasons, the problem of developing an integrated approach to power management remains unsolved, even though researchers are addressing power issues at every level of design.

We invited a broad range of researchers from all communities that deal with problems related to the power consumption of resource-constrained devices. Our goals are twofold. On the one hand, we plan to initiate the development of a framework for modeling multi-layer approaches to power-consumption reduction. Such a framework would benefit all the communities involved, since it would allow a realistic approximation of system behavior. On the other hand, we expect the seminar to result in ample cross-fertilization between the different research areas and to produce recommendations to the research community about possible areas for projects and collaborations.
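The frequency-thrashing scenario mentioned in the abstract can be made concrete with a minimal sketch. The following Python toy simulation is an assumption-laden illustration, not material from the seminar report: the operating points, the power model, and both policies (a compiler hint that requests a low frequency during memory stalls and a utilization-driven OS governor that overrides it) are invented purely to show how two uncoordinated DVFS decisions can cancel each other out.

```python
# Toy illustration (not from the seminar report) of frequency thrashing:
# a compiler-driven DVFS hint asks for a low clock frequency during
# memory-stall phases, while a utilization-based OS governor, which still
# sees the stalled core as "busy", keeps pushing the frequency back up.
# All operating points, power numbers, and policies are invented.

FREQS_MHZ = [600, 1800]        # hypothetical low/high DVFS operating points
SWITCH_COST = 0.1              # toy energy overhead per frequency transition


def compiler_hint(phase):
    """Compiler policy: request the low frequency during memory stalls."""
    return FREQS_MHZ[0] if phase == "memory_stall" else FREQS_MHZ[1]


def os_governor(utilization):
    """OS policy: request the high frequency whenever utilization looks high."""
    return FREQS_MHZ[1] if utilization > 0.7 else FREQS_MHZ[0]


def simulate(phases, coordinated):
    freq = FREQS_MHZ[1]
    switches = 0
    energy = 0.0
    for phase in phases:
        # A core waiting on memory is not idle, so a purely utilization-based
        # governor still reports high utilization during a stall.
        utilization = 0.9
        want_compiler = compiler_hint(phase)
        if coordinated:
            target = want_compiler              # OS honors the compiler's hint
        else:
            if want_compiler != freq:           # compiler hint applied first...
                switches += 1
                freq = want_compiler
            target = os_governor(utilization)   # ...then the OS overrides it
        if target != freq:
            switches += 1
            freq = target
        # Toy power model: power grows quadratically with frequency and each
        # phase has unit duration; switch overhead is added at the end.
        energy += (freq / 1000.0) ** 2
    return switches, energy + switches * SWITCH_COST


workload = ["compute", "memory_stall"] * 10     # alternating phases

for coordinated in (False, True):
    switches, energy = simulate(workload, coordinated)
    label = "coordinated" if coordinated else "uncoordinated"
    print(f"{label:>13}: {switches} frequency switches, toy energy = {energy:.1f}")
```

In the uncoordinated run, every stall triggers a down-and-up switch pair and the stall still executes at the high frequency, so the compiler's intended savings are cancelled while extra transition overhead is paid; the coordinated run, in which the governor honors the stall hint, delivers the savings the compiler hint was meant to achieve. This is only a sketch of the interaction the abstract describes, under the stated assumptions.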