I. Pierros, I. Vlahavas. “Architecture-Agnostic Time-Step Boosting: A Case Study in Short-Term Load Forecasting”, in Proceedings of the International Conference on Artificial Neural Networks (ICANN 2022), Bristol, UK, 6–9 September 2022. LNCS, vol. 13531, pp. 556–568, Springer, 2022. https://doi.org/10.1007/978-3-031-15934-3_46

Author(s): Ioannis Pierros, Ioannis Vlahavas

Availability:

Keywords: Machine learning, Time series, Forecasting, Neural networks, Energy demand, Short-term load forecasting

Tags:

Abstract: Time series forecasting is important both for short-term operations planning and for deciding a company’s long-term growth strategy. High accuracy is the hardest challenge, but fast training also matters because a model may go through thousands of iterations. In this paper, we propose Time-Step Boosting, a streamlined methodology that can be applied to any type of neural network for demand forecasting and that adjusts the model’s weights during training to optimize it towards the time steps that are most difficult to predict. First, we calculate the per-time-step error; we then train the model anew, using these errors as weights when calculating the loss during training. We apply Time-Step Boosting to short-term demand forecasting, a task that is necessary for the smooth operation of every component in the energy sector: deviations require costly emergency actions to restore the production-demand balance and to avoid damaging substations or even overloading the electrical grid. Although forecasting systems have advanced in recent years, they often fail to accurately predict the peaks and lows, which are of utmost importance. Our methodology demonstrates considerable improvements in convergence speed and forecasting performance on next-day hourly load forecasting for multiple European countries and six U.S. states, with Multilayer Perceptrons, Long Short-Term Memory networks, Convolutional Neural Networks, and state-of-the-art models, showcasing its applicability to more complex architectures.
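The core idea described in the abstract (measure each time step’s error after a first training pass, then retrain with those errors as loss weights) can be sketched roughly as follows. This is a minimal NumPy illustration of the general weighting scheme, not the paper’s implementation; the helper names, the MAE-based weights, and the normalization to a mean weight of 1 are assumptions for the example.

```python
import numpy as np

def time_step_weights(y_true, y_pred, eps=1e-8):
    """Derive per-time-step weights from a first training pass.

    y_true, y_pred: (n_samples, horizon) arrays, e.g. a 24-hour
    next-day hourly load horizon. Harder time steps (larger mean
    absolute error) receive proportionally larger weights.
    """
    errors = np.abs(y_true - y_pred).mean(axis=0)            # MAE per time step
    return errors / (errors.sum() + eps) * errors.size       # mean weight == 1

def weighted_mse(y_true, y_pred, weights):
    """Loss for the second training pass: errors on hard-to-predict
    time steps contribute more to the total loss."""
    return np.mean(weights * (y_true - y_pred) ** 2)

# Toy usage: later hours are noisier, so they get larger weights.
rng = np.random.default_rng(0)
y = rng.normal(size=(32, 24))
pred = y + rng.normal(scale=np.linspace(0.1, 1.0, 24), size=(32, 24))
w = time_step_weights(y, pred)
loss = weighted_mse(y, pred, w)
```

In a framework such as Keras or PyTorch, `weighted_mse` would simply replace the unweighted loss in the second training run, leaving the architecture untouched, which is what makes the scheme architecture-agnostic.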