Traditional appliance recognition models in non-intrusive load monitoring (NILM) are constrained by a static label space, making them unable to recognize unknown or newly introduced appliance types. Incremental learning provides a potential solution by enabling continuous model adaptation; however, its application in NILM is often hindered by catastrophic forgetting, where learning new classes degrades performance on previously learned ones. To address this challenge, we propose a novel class-incremental learning method based on a Nearest Class Mean Forest (NCM-Forest). The proposed approach redesigns the random forest structure by replacing axis-aligned splits with dynamic, centroid-based partitions. This design allows new classes to be incorporated seamlessly through centroid updates, while a partial subtree retraining strategy effectively balances stability and plasticity. Extensive experiments on three NILM datasets demonstrate that our method achieves robust and scalable incremental recognition with an accuracy of up to 93.33 ± 1.52%. Furthermore, deployment on edge devices confirms its practicality, featuring low memory footprint, rapid model updates, and strong potential for real-time edge-based NILM applications.
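The centroid-based partitioning described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the `NCMNode` class, the running-mean update, and the appliance labels are hypothetical, and a full NCM-Forest would compose many such nodes into randomized trees with partial subtree retraining.

```python
import numpy as np

class NCMNode:
    """Nearest-class-mean node: routes a sample to whichever class
    centroid is closest (illustrative sketch, single node only)."""

    def __init__(self):
        self.centroids = {}  # class label -> running-mean feature vector
        self.counts = {}     # class label -> number of samples seen

    def update(self, x, label):
        """Incrementally update (or create) the centroid for `label`."""
        x = np.asarray(x, dtype=float)
        if label not in self.centroids:
            # A new appliance class is incorporated simply by adding
            # its centroid -- no retraining of existing splits.
            self.centroids[label] = x.copy()
            self.counts[label] = 1
        else:
            self.counts[label] += 1
            # Running-mean update: mu <- mu + (x - mu) / n
            self.centroids[label] += (x - self.centroids[label]) / self.counts[label]

    def route(self, x):
        """Return the label of the nearest centroid for sample `x`."""
        x = np.asarray(x, dtype=float)
        return min(self.centroids,
                   key=lambda c: np.linalg.norm(x - self.centroids[c]))

# Usage: two known classes, then a new one added on the fly.
node = NCMNode()
node.update([1.0, 0.0], "kettle")
node.update([0.0, 1.0], "fridge")
node.update([5.0, 5.0], "microwave")  # previously unseen class
print(node.route([4.0, 4.5]))         # -> microwave
```

Because centroids are running means, the per-update cost and memory footprint are constant per class, which is consistent with the edge-deployment properties the abstract reports.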
Yan, Z., Wang, Z., Hao, P., Nardello, M., Brunelli, D., Wen, H. (2026). A Lightweight and Forgetting-Resistant Approach for NILM: Incremental Appliance Recognition Using NCM-Forest. IEEE Transactions on Instrumentation and Measurement, 1, 1-11 [10.1109/tim.2026.3662894].
A Lightweight and Forgetting-Resistant Approach for NILM: Incremental Appliance Recognition Using NCM-Forest
Brunelli, Davide
2026


