Energy management systems (EMSs) collect and analyze building energy consumption data to save energy on the demand side and to generate schedules for power generation and energy storage systems (ESSs) on the supply side.
Short-term load forecasting (STLF) predicts the electric load, typically at hourly resolution, up to one week in advance.
Accurate STLF provides economic benefits by charging storage at night, when electricity prices are relatively low, and discharging it during the day, when prices are high.
STLF is a challenging task due to complex energy consumption patterns and uncertain external factors.
Proposed Model
This study constructs an ANN (Artificial Neural Network)-based STLF model for accurately forecasting electric energy consumption of a building or building clusters.
General factors such as calendar data, weather information, and historical electric loads are used as inputs so that the STLF model can serve as a baseline model for other buildings or building clusters.
The model predicts the electric load at 30-min intervals for five different types of buildings, using several cases as test sets.
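A minimal sketch of how these general input factors might be assembled into 30-min-interval training features is shown below; the column names, the weather variables, and the lag choices (one day = 48 steps, one week = 336 steps) are illustrative assumptions rather than the study's exact specification.

```python
# Hypothetical feature construction for 30-min-interval STLF (illustrative only).
import pandas as pd

def build_features(load: pd.Series, weather: pd.DataFrame) -> pd.DataFrame:
    """load: 30-min electric load indexed by timestamp; weather: aligned weather data."""
    df = pd.DataFrame(index=load.index)
    # Calendar factors
    df["hour"] = df.index.hour + df.index.minute / 60.0      # 0.0, 0.5, ..., 23.5
    df["day_of_week"] = df.index.dayofweek
    df["is_weekend"] = (df.index.dayofweek >= 5).astype(int)
    df["month"] = df.index.month
    # Weather factors (assumed variables)
    df["temperature"] = weather["temperature"]
    df["humidity"] = weather["humidity"]
    # Historical loads: same interval one day (48 steps) and one week (336 steps) earlier
    df["load_lag_1d"] = load.shift(48)
    df["load_lag_1w"] = load.shift(336)
    # Target: electric load at the current 30-min interval
    df["target"] = load
    return df.dropna()
```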
The performance of various activation functions and numbers of hidden layers is compared extensively to construct an optimal ANN-based STLF model.
Significance
The study aims to improve the performance of STLF in a smart grid context, which is critical for ensuring the reliability of power system equipment and for preparing against losses caused by power failures and overloading.
The study's primary contributions are to build an accurate ANN-based STLF model, to consider general factors so the model can be applied to other buildings or building clusters, and to extensively compare the performance of various activation functions and numbers of hidden layers in constructing an optimal ANN-based STLF model.
The numbers of layers and nodes, as well as the activation function, affect network performance; the number of hidden layers determines whether the network is deep or shallow.
ReLU (Rectified Linear Unit) has been the conventional choice of activation function when the number of hidden layers is two or more, but ReLU can leave neurons deactivated (the dying-ReLU problem) and slow down learning.
ANN-based load forecasting models are constructed using all combinations of 1 to 10 hidden layers with five activation functions: ReLU, LReLU (Leaky ReLU), PReLU (Parametric ReLU), ELU (Exponential Linear Unit), and SELU (Scaled Exponential Linear Unit).
The number of hidden neurons is set to 2/3 of the input-layer size plus the output-layer size, which gives 81 nodes for the forecasting model.
Xavier initialization is used to set the initial weights of each neuron's inputs, and the learning rate and the number of training epochs are important hyperparameters to consider.
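A rough tf.keras sketch of this configuration is given below; the 81-node hidden layers, the 1 to 10 layer range, the five activation functions, and Xavier (Glorot) initialization follow the text, while the single-output regression head, the Adam optimizer, and the learning rate are assumptions.

```python
# Sketch of the ANN construction described above (assumptions noted in comments).
import tensorflow as tf

HIDDEN_NODES = 81  # ~ 2/3 * input-layer size + output-layer size, per the rule above

def build_stlf_model(input_dim: int, n_hidden_layers: int, activation: str) -> tf.keras.Model:
    model = tf.keras.Sequential()
    model.add(tf.keras.Input(shape=(input_dim,)))
    init = "glorot_uniform"                      # Xavier initialization
    for _ in range(n_hidden_layers):             # 1 to 10 hidden layers are compared
        if activation in ("relu", "elu", "selu"):
            model.add(tf.keras.layers.Dense(HIDDEN_NODES, activation=activation,
                                            kernel_initializer=init))
        else:
            # LReLU / PReLU are applied as separate layers after a linear Dense layer
            model.add(tf.keras.layers.Dense(HIDDEN_NODES, kernel_initializer=init))
            model.add(tf.keras.layers.LeakyReLU() if activation == "lrelu"
                      else tf.keras.layers.PReLU())
    model.add(tf.keras.layers.Dense(1))          # load forecast for one 30-min interval (assumed)
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),  # assumed settings
                  loss="mse")
    return model
```

Looping this builder over the five activation names and hidden-layer counts 1 through 10 covers the 5 x 10 = 50 activation/depth combinations described above.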
Experiment
Figures: average CVRMSE and average MAPE of the forecasting models.
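For reference, the two reported metrics are conventionally defined as below; this NumPy sketch uses the standard formulas, which may differ in minor details from the paper's exact computation.

```python
# Standard CVRMSE and MAPE definitions (sketch; both expressed in percent).
import numpy as np

def cvrmse(actual: np.ndarray, predicted: np.ndarray) -> float:
    """Coefficient of variation of the RMSE: RMSE divided by the mean actual load."""
    rmse = np.sqrt(np.mean((actual - predicted) ** 2))
    return 100.0 * rmse / np.mean(actual)

def mape(actual: np.ndarray, predicted: np.ndarray) -> float:
    """Mean absolute percentage error."""
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))
```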
SELU exhibits the best performance, and ELU exhibits the worst performance in most cases.
SELU's self-normalizing property enables models to train faster and more effectively than with other activation functions.
ANN models with SELU appear among the best-performing configurations more frequently than models with other activation functions.
ANN models with one hidden layer generally have poor predictive performance.
Figure: average ranking value of the number of hidden layers for SELU.
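One plausible way to compute such an average ranking across test cases is sketched below; the DataFrame layout and the use of CVRMSE as the ranking criterion are assumptions for illustration, not the study's stated procedure.

```python
# Hypothetical ranking aggregation across test cases (illustrative only).
import pandas as pd

def average_ranking(cvrmse_by_case: pd.DataFrame) -> pd.Series:
    """Rows indexed by (activation, n_hidden_layers); one CVRMSE column per test case."""
    ranks = cvrmse_by_case.rank(axis=0, method="min")  # rank configurations within each case
    return ranks.mean(axis=1).sort_values()            # lower average rank = better overall
```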