Virtsionis-Gkalinikis, N., Nalmpantis, C. & Vrakas, D. SAED: self-attentive energy disaggregation. Mach Learn (2021). https://doi.org/10.1007/s10994-021-06106-3
The field of energy disaggregation deals with the approximation of appliance-level electric consumption using only the aggregate consumption measurement of a mains meter. Recent research developments have used deep neural networks and outperformed previous methods based on Hidden Markov Models. On the other hand, deep learning models are computationally heavy and require huge amounts of data. The main objective of the current paper is to incorporate the attention mechanism into neural networks in order to reduce their computational complexity. Two versions of the attention mechanism are utilized, named Additive and Dot Attention. The experiments show that they perform on par, while the Dot mechanism is slightly faster. The two versions of self-attentive neural networks are compared against two state-of-the-art energy disaggregation deep learning models. The experimental results show that the proposed architecture achieves faster or equal training and inference times, with only a minor performance drop depending on the device or the dataset.
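To make the distinction between the two attention variants named in the abstract concrete, the sketch below implements generic scaled dot-product and additive (Bahdanau-style) self-attention in NumPy. The layer sizes, weight shapes, and function names are illustrative assumptions, not the authors' exact SAED architecture; the dot variant avoids the extra learned projection of the additive score, which is one reason it can be slightly faster.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dot_attention(h):
    """Scaled dot-product self-attention over a sequence h of shape (T, d).

    Scores are pairwise dot products of the hidden states, scaled by sqrt(d).
    (Illustrative sketch; no learned query/key/value projections here.)
    """
    d = h.shape[-1]
    scores = h @ h.T / np.sqrt(d)       # (T, T) similarity matrix
    weights = softmax(scores, axis=-1)  # each row is a distribution over positions
    return weights @ h, weights

def additive_attention(h, W1, W2, v):
    """Additive self-attention: score(i, j) = v . tanh(W1 h_i + W2 h_j).

    W1, W2 have shape (k, d) and v has shape (k,); all are assumed
    learned parameters in a real model (random here for illustration).
    """
    proj_i = h @ W1.T                                   # (T, k)
    proj_j = h @ W2.T                                   # (T, k)
    scores = np.tanh(proj_i[:, None, :] + proj_j[None, :, :]) @ v  # (T, T)
    weights = softmax(scores, axis=-1)
    return weights @ h, weights

# Toy usage with a random "sequence of hidden states".
rng = np.random.default_rng(0)
h = rng.normal(size=(5, 8))                  # T=5 timesteps, d=8 features
out_dot, w_dot = dot_attention(h)
W1 = rng.normal(size=(4, 8))
W2 = rng.normal(size=(4, 8))
v = rng.normal(size=4)
out_add, w_add = additive_attention(h, W1, W2, v)
```

Both variants return an output of the same shape as the input sequence; they differ only in how the (T, T) attention weights are scored.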