
Efficient Residential Electric Load Forecasting via Transfer Learning and Graph Neural Networks

Computer-Nerd 2023. 3. 3.
Authors Di Wu, Weixuan Lin
Title Efficient Residential Electric Load Forecasting via Transfer Learning and Graph Neural Networks
Publication IEEE Transactions on Smart Grid
Volume x
Issue x
Pages x
Year 2022
DOI https://doi.org/10.1109/TSG.2022.3208211

Introduction

Background

  • Electric load forecasting is crucial for the efficient operation of modern power grids.
  • Short-term load forecasting (STLF) is of great interest as it can assist real-time energy dispatching.
  • The complexity of modern power systems has increased, making accurate STLF more challenging.
  • STLF was traditionally implemented for industrial and commercial customers to optimize demand response (DR) programs. DR programs for residential households received little attention because individual households lacked the capacity to meet the market threshold.
  • In recent years, aggregators for residential households have been developed, and STLF at the household level has become essential for offering flexibility to distribution system operators and optimizing resources for monetary compensation.

Previous Research

  • Classical methods and machine learning methods can be used to tackle STLF.
  • Machine learning-based methods are more capable of capturing complex nonlinear relationships.
  • A number of machine learning-based methods have been reported to achieve accurate STLF, including support vector regression (SVR), kernel-based methods, feed-forward neural networks (FNNs), and deep residual networks.
  • Recurrent neural networks (RNNs) and their variants excel in solving sequence problems.
  • STLF that exploits spatial correlations to improve prediction accuracy is termed spatial-temporal STLF.
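As a toy illustration (not from the paper), spatial correlations between households can be captured in a graph whose adjacency matrix connects households with similar load profiles. The sketch below builds such a correlation-thresholded adjacency matrix from hypothetical hourly load series; the households, values, and the threshold of 0.5 are all invented for illustration.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def correlation_adjacency(loads, threshold=0.5):
    """Build a symmetric adjacency matrix: two households are connected
    when the absolute correlation of their load series exceeds `threshold`."""
    n = len(loads)
    adj = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            r = pearson(loads[i], loads[j])
            if abs(r) >= threshold:
                adj[i][j] = adj[j][i] = abs(r)
    return adj

# Hourly load series (kWh) for three hypothetical households.
loads = [
    [1.0, 2.0, 3.0, 2.0, 1.0],
    [1.1, 2.1, 2.9, 2.2, 0.9],   # similar pattern to household 0
    [3.0, 1.0, 0.5, 1.0, 3.0],   # roughly opposite pattern
]
adj = correlation_adjacency(loads)
```

Note that such a predetermined adjacency matrix requires a hand-picked similarity measure and threshold; this limitation is exactly what motivates GNN models that learn latent correlations, as discussed later in the summary.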

Graph Neural Networks

  • Graph neural network (GNN) models have demonstrated their effectiveness on spatial-temporal time-series prediction.
  • GNN-based models can be used to explore the complex spatial correlation between households to achieve accurate spatial-temporal electric load forecasting.

Significance

  • Accurate STLF can significantly improve the control and planning of modern electric grid operations.
  • Electric load forecasting supports the day-to-day operations of utility companies, such as scheduling electricity generation and transmission, and real-time energy dispatching.
  • Forecasting errors can lead to large costs for utility companies.

Proposed Model

  • The proposed model is an attentive transfer framework that ensembles knowledge from Graph Neural Network (GNN) models trained in source and target domains.
  • A GNN-based STLF model takes the graph-structured data of the last h time steps as input and predicts the spatial-temporal data of the next time step.
  • The adjacency matrix for STLF is difficult to predetermine, so a GNN model that can discover latent correlations is preferable.
  • The attentive knowledge transfer framework consists of a base GNN network learned from the target domain, GNN models learned from source domains, and an attention network that learns to assign weights.
  • The attention network, based on a multi-layer perceptron (MLP), learns to weight the models’ outputs.
  • The complete model’s output is the weighted summation of the outputs from the source models and the base model.
  • Parameters in the base model are updated according to the loss between the base model’s output and the ground-truth values of the target domain.
  • Parameters in the attention network are updated according to the loss between the complete model’s final output and the ground-truth values of the target domain.
  • Back-propagation is performed separately in the base GNN network and in the attention network.
  • The mean absolute error (MAE) is used as the loss function.
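The weighted-summation step can be sketched as follows. This is purely illustrative: in the paper the attention network is an MLP over the input data, so the raw `scores` below are hypothetical stand-ins for its outputs, and the forecast values are invented.

```python
import math

def softmax(scores):
    """Normalize raw attention scores into weights that sum to 1."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attentive_ensemble(predictions, scores):
    """Weighted summation of per-model forecasts (the base model plus
    each source-domain model). `scores` stand in for the MLP attention
    network's raw outputs for the current input."""
    weights = softmax(scores)
    return sum(w * p for w, p in zip(weights, predictions))

def mae(preds, targets):
    """Mean absolute error, the loss function used in the paper."""
    return sum(abs(p - t) for p, t in zip(preds, targets)) / len(preds)

# Next-step forecasts for one household (kWh):
preds = [2.4, 2.0, 3.1]     # [base model, source model 1, source model 2]
scores = [1.5, 0.2, -0.3]   # hypothetical attention-network outputs
y_hat = attentive_ensemble(preds, scores)
```

The two separate back-propagations fall out naturally from this structure: the base model's parameters receive gradients from `mae(base_output, target)`, while the attention network's parameters receive gradients from `mae(y_hat, target)`, with the source models held fixed.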

Experiment

  • The attentive transfer model (Compl_Pre) consistently outperforms the conventional baselines and other models in all target areas and all metrics, with an average improvement of 9.79%, 14.37%, 9.99%, and 13.42% in the metrics of Indv.MAPE, Aggr.MAPE, Indv.MAE, and Aggr.MAE, respectively.
  • The attentive transfer model with a pre-trained base model (Compl_Pre) outperforms the variant without pre-training (Base_noPre), as well as the source-average model, across all areas and metrics.
  • The performance of the models is influenced by the target data from different areas, and the best source model varies over different target areas.
  • The attention network assigns higher weights to the most suitable models based on the pattern of the input data, and the weight pattern varies markedly across different target test sets.
  • The proposed attentive transfer model (Compl_Pre) has a reasonable inference time of less than one second, making it suitable for hour-ahead prediction.
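The metric names suggest that "Indv." scores are averaged per-household errors while "Aggr." scores are computed on the summed load across households; assuming that interpretation, they could be computed as below. The forecast and actual values are invented for illustration.

```python
def mape(preds, targets):
    """Mean absolute percentage error (%)."""
    return 100.0 * sum(abs(p - t) / abs(t)
                       for p, t in zip(preds, targets)) / len(preds)

def mae(preds, targets):
    """Mean absolute error."""
    return sum(abs(p - t) for p, t in zip(preds, targets)) / len(preds)

# Forecasts and actuals for two households over three hours (kWh).
preds = [[1.0, 2.2, 3.0], [0.9, 1.0, 2.0]]
actual = [[1.1, 2.0, 2.8], [1.0, 1.2, 1.8]]

# Individual metrics: score each household, then average.
n = len(preds)
indv_mae = sum(mae(p, a) for p, a in zip(preds, actual)) / n
indv_mape = sum(mape(p, a) for p, a in zip(preds, actual)) / n

# Aggregate metrics: sum loads across households first, then score.
agg_pred = [sum(h[t] for h in preds) for t in range(len(preds[0]))]
agg_act = [sum(h[t] for h in actual) for t in range(len(actual[0]))]
aggr_mae = mae(agg_pred, agg_act)
aggr_mape = mape(agg_pred, agg_act)
```

Aggregating before scoring lets individual households' over- and under-predictions cancel, which is why the paper reports both views of the error.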
