Expanding the variety of non-intrusive load monitoring training data: Introducing and benchmarking a novel data augmentation technique
Abstract
Energy consumption monitoring is an important asset for demand-side management systems. Although smart meters provide accurate, high-sampling-rate measurements of various power variables across diverse appliances, their high cost prevents large-scale deployment. Non-Intrusive Load Monitoring (NILM) reduces monitoring cost by computationally disaggregating a single aggregate measurement into appliance-level load measurements. Deep neural network NILM techniques have shown high performance when trained and tested on the same building. However, NILM approaches suffer from low generalization capability, since they must identify a wide variety of load profiles from the few supervised training samples available. Data augmentation has shown promising results in addressing this data scarcity and low generalization. Data augmentation refers to techniques that artificially increase the size of the training set by creating transformed versions of the existing data. In this study, a new and easy-to-understand transformation technique for introducing variability into the training set was extensively tested. This technique was used for data augmentation and compared with two other methods. The experiments demonstrated improvements in both the F1-score and the NDE metric for all data augmentation techniques compared to the baseline without augmentation. Notably, the proposed data augmentation method (OFFSETAUG) yielded even larger improvements than the other two algorithms. Based on these findings, data augmentation is a straightforward and valuable means of enhancing the performance of low-frequency NILM tasks.
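To illustrate the general idea of augmenting load-monitoring training data by transformation, the following is a minimal sketch. It assumes an offset-style transformation in the spirit suggested by the name OFFSETAUG (the paper's actual algorithm is not detailed in the abstract): shifted copies of a power trace are generated and clipped at zero, since power readings are non-negative. The function name `offset_augment` and the sample values are illustrative only.

```python
import numpy as np

def offset_augment(power_trace, offsets):
    """Create shifted copies of a load trace (hypothetical sketch of an
    offset-based augmentation; clip at zero because power is non-negative)."""
    return [np.clip(power_trace + o, 0.0, None) for o in offsets]

# Toy aggregate power trace in watts (illustrative values)
trace = np.array([0.0, 15.0, 120.0, 118.0, 5.0])

# Each offset yields one additional training example
augmented = offset_augment(trace, offsets=[-10.0, 10.0])
```

Each offset produces one transformed copy of the original trace, so the training set grows linearly with the number of offsets chosen.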