  • Open Access
  • Article

A Review of Deep Learning-based Power Load Forecasting Methods

  • Qian Ma,   
  • Di Wu *,   
  • Xin Luo

Received: 25 Mar 2025 | Accepted: 30 Aug 2025 | Published: 16 Dec 2025

Abstract

Accurate power load forecasting is essential for efficient smart grid operation and dispatch optimization. In recent years, the rapid advancement of deep learning has drawn significant attention from both the academic and industrial communities to its application in power load forecasting. This paper provides a comprehensive overview of the deep learning models used in this field. First, it introduces common deep learning models, including convolutional neural networks, graph neural networks, recurrent neural networks, generative adversarial networks, and autoencoders. Second, it analyzes and discusses in detail the power load forecasting models built on these deep learning approaches. Third, public power load datasets are presented and used to evaluate four representative forecasting models experimentally. Finally, the paper summarizes future development trends in the field.

References 

  • 1.

    Chen, W.; Wang, Z.D.; Hu, J.; et al. Privacy-preserving distributed economic dispatch of microgrids using edge-based additive perturbations: An accelerated consensus algorithm. IEEE Trans. Syst. Man Cybern. Syst., 2024, 54: 2638−2650. doi: 10.1109/TSMC.2023.3344885

  • 2.

    Chen, M.Z.; Ma, G.J.; Liu, W.B.; et al. An overview of data-driven battery health estimation technology for battery management system. Neurocomputing, 2023, 532: 152−169. doi: 10.1016/j.neucom.2023.02.031

  • 3.

    Chen, W.; Wang, Z.D.; Ge, Q.B.; et al. Quantized distributed economic dispatch for microgrids: Paillier encryption-decryption scheme. IEEE Trans. Industr. Inform., 2024, 20: 6552−6562. doi: 10.1109/TII.2023.3348816

  • 4.

    Esteves, G.R.T.; Bastos, B.Q.; Cyrino, F.L.; et al. Long term electricity forecast: A systematic review. Procedia Comput. Sci., 2015, 55: 549−558. doi: 10.1016/j.procs.2015.07.041

  • 5.

    Qu, B.G.; Wang, Z.D.; Shen, B.; et al. Event-based joint state and unknown input estimation for energy networks: Handling multi-machine power grids. IEEE Trans. Netw. Sci. Eng., 2023, 10: 253−264. doi: 10.1109/TNSE.2022.3206720

  • 6.

    Xie, K.; Yi, H.; Hu, G.Y.; et al. Short-term power load forecasting based on Elman neural network with particle swarm optimization. Neurocomputing, 2020, 416: 136−142. doi: 10.1016/j.neucom.2019.02.063

  • 7.

    Fan, C.D.; Ding, C.K.; Zheng, J.H.; et al. Empirical Mode Decomposition based Multi-objective Deep Belief Network for short-term power load forecasting. Neurocomputing, 2020, 388: 110−123. doi: 10.1016/j.neucom.2020.01.031

  • 8.

    Deng, S.; Chen, F.L.; Wu, D.; et al. Quantitative combination load forecasting model based on forecasting error optimization. Comput. Electr. Eng., 2022, 101: 108125. doi: 10.1016/j.compeleceng.2022.108125

  • 9.

    Bento, P.M.R.; Pombo, J.A.N.; Calado, M.R.A.; et al. Optimization of neural network with wavelet transform and improved data selection using bat algorithm for short-term load forecasting. Neurocomputing, 2019, 358: 53−71. doi: 10.1016/j.neucom.2019.05.030

  • 10.

    Zhang, L.P.; Wu, D.; Luo, X. An error correction mid-term electricity load forecasting model based on seasonal decomposition. In 2023 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Honolulu, HI, USA, 1–4 October 2023; IEEE: New York, 2023; pp. 2415–2420. doi: 10.1109/SMC53992.2023.10394531

  • 11.

    Liao, W.L.; Wang, S.X.; Bak-Jensen, B.; et al. Ultra-short-term interval prediction of wind power based on graph neural network and improved bootstrap technique. J. Mod. Power Syst. Clean Energy, 2023, 11: 1100−1114. doi: 10.35833/MPCE.2022.000632

  • 12.

    Jiang, C.; Lu, G.; Ma, X.; et al. Robust load prediction of power network clusters based on cloud-model-improved transformer. arXiv preprint arXiv: 2407.20817, 2024. doi: 10.48550/arXiv.2407.20817.

  • 13.

    Shakiba, F.M.; Shojaee, M.; Azizi, S.M.; et al. Real-time sensing and fault diagnosis for transmission lines. Int. J. Netw. Dyn. Intell., 2022, 1: 36−47. doi: 10.53941/ijndi0101004

  • 14.

    Yin, J.B.; Wang, Y.Y.; Chen, K.Y. A novel graph based sequence forecasting model for electric load of campus. In 2021 2nd International Conference on Artificial Intelligence and Information Systems, Chongqing, China, 28–30 May 2021; ACM: New York, 2021; p. 89. doi: 10.1145/3469213.3470291

  • 15.

    He, Y.P.; Jiang, L.H.; Wu, D. Orthogonal-aware constraint neural network compression based on low-rank representation learning. In 2024 International Conference on Networking, Sensing and Control (ICNSC), Hangzhou, China, 18–20 October 2024; IEEE: Red Hook, NY, 2024; pp. 1–6. doi: 10.1109/ICNSC62968.2024.10759942.

  • 16.

    Qiao, X.C.; Xu, C.X.; Wang, Y.S.; et al. SDI: A sparse drift identification approach for force/torque sensor calibration in industrial robots. Neurocomputing, 2025, 620: 129292. doi: 10.1016/j.neucom.2024.129292

  • 17.

    Ma, C.; Cheng, P.; Cai, C.X. Localization and mapping method based on multimodal information fusion and deep learning for dynamic object removal. Int. J. Netw. Dyn. Intell., 2024, 3: 100008. doi: 10.53941/ijndi.2024.100008

  • 18.

    Wang, J.W.; Zhuang, Y.; Liu, Y.S. FSS-Net: A fast search structure for 3D point clouds in deep learning. Int. J. Netw. Dyn. Intell., 2023, 2: 100005. doi: 10.53941/ijndi.2023.100005

  • 19.

    Wu, D.; Shang, M.S.; Luo, X.; et al. An L1-and-L2-norm-oriented latent factor model for recommender systems. IEEE Trans. Neural Netw. Learn. Syst., 2022, 33: 5775−5788. doi: 10.1109/TNNLS.2021.3071392

  • 20.

    Huang, T.; Liang, C.; Wu, D.; et al. A debiasing autoencoder for recommender system. IEEE Trans. Consum. Electron., 2024, 70: 3603−3613. doi: 10.1109/TCE.2023.3281521

  • 21.

    Wu, P.S.; Wang, Z.D.; Zheng, B.X.; et al. AGGN: Attention-based glioma grading network with multi-scale feature extraction and multi-modal information fusion. Comput. Biol. Med., 2023, 152: 106457. doi: 10.1016/j.compbiomed.2022.106457

  • 22.

    Ehab, W.; Huang, L.N.; Li, Y.M. UNet and variants for medical image segmentation. Int. J. Netw. Dyn. Intell., 2024, 3: 100009. doi: 10.53941/ijndi.2024.100009

  • 23.

    Zheng, C.Y.; Tao, Y.F.; Zhang, J.J.; et al. TISE-LSTM: A LSTM model for precipitation nowcasting with temporal interactions and spatial extract blocks. Neurocomputing, 2024, 590: 127700. doi: 10.1016/J.NEUCOM.2024.127700

  • 24.

    Khedher, M.I.; Jmila, H.; El-Yacoubi, M.A. On the formal evaluation of the robustness of neural networks and its pivotal relevance for ai-based safety-critical domains. Int. J. Netw. Dyn. Intell., 2023, 2: 100018. doi: 10.53941/ijndi.2023.100018

  • 25.

    Ahmed, S.F.; Alam, M.S.B.; Hassan, M.; et al. Deep learning modelling techniques: Current progress, applications, advantages, and challenges. Artif. Intell. Rev., 2023, 56: 13521−13617. doi: 10.1007/s10462-023-10466-8

  • 26.

    Mnih, V.; Kavukcuoglu, K.; Silver, D.; et al. Playing Atari with deep reinforcement learning. arXiv preprint arXiv: 1312.5602, 2013. doi: 10.48550/arXiv.1312.5602.

  • 27.

    Devlin, J.; Chang, M.W.; Lee, K.; et al. BERT: Pre-training of deep bidirectional transformers for language understanding. In Proceedings of 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), Minneapolis, MN, USA, 2–7 June 2019; ACL, 2019; pp. 4171–4186. doi: 10.18653/v1/N19-1423.

  • 28.

    Camacho, J.D.; Villaseñor, C.; Gomez-Avila, J.; et al. Auto-compression transfer learning methodology for deep convolutional neural networks. Neurocomputing, 2025, 630: 129661. doi: 10.1016/j.neucom.2025.129661

  • 29.

    Hinton, G.E.; Osindero, S.; Teh, Y.W. A fast learning algorithm for deep belief nets. Neural Comput., 2006, 18: 1527−1554. doi: 10.1162/neco.2006.18.7.1527

  • 30.

    Hong, Y.Y.; Rioflorido, C.L.P.P. A hybrid deep learning-based neural network for 24-h ahead wind power forecasting. Appl. Energy, 2019, 250: 530−539. doi: 10.1016/j.apenergy.2019.05.044

  • 31.

    Liang, Y.P.; Tian, L.L.; Zhang, X.; et al. Multi-dimensional adaptive learning rate gradient descent optimization algorithm for network training in magneto-optical defect detection. Int. J. Netw. Dyn. Intell., 2024, 3: 100016. doi: 10.53941/ijndi.2024.100016

  • 32.

    Hu, L.; Yang, Y.; Tang, Z.H.; et al. FCAN-MOPSO: An improved fuzzy-based graph clustering algorithm for complex networks with multiobjective particle swarm optimization. IEEE Trans. Fuzzy Syst., 2023, 31: 3470−3484. doi: 10.1109/TFUZZ.2023.3259726

  • 33.

    Zhou, J.; Cui, G.Q.; Hu, S.D.; et al. Graph neural networks: A review of methods and applications. AI Open, 2020, 1: 57−81. doi: 10.1016/j.aiopen.2021.01.001

  • 34.

    Xu, X.Y.; Pang, G.S.; Wu, D.; et al. Joint hyperbolic and Euclidean geometry contrastive graph neural networks. Inf. Sci., 2022, 609: 799−815. doi: 10.1016/j.ins.2022.07.060

  • 35.

    Gori, M.; Monfardini, G.; Scarselli, F. A new model for learning in graph domains. In Proceedings of 2005 IEEE International Joint Conference on Neural Networks, Montreal, QC, Canada, 31 July 2005–4 August 2005; IEEE: New York, 2005; pp. 729–734. doi: 10.1109/IJCNN.2005.1555942

  • 36.

    Scarselli, F., Gori, M.; Tsoi, A.C.; et al. The graph neural network model. IEEE Trans. Neural Netw., 2009, 20: 61−80. doi: 10.1109/TNN.2008.2005605

  • 37.

    Guo, Z.W.; Wang, H. A deep graph neural network-based mechanism for social recommendations. IEEE Trans. Industr. Inform., 2021, 17: 2776−2783. doi: 10.1109/TII.2020.2986316

  • 38.

    Zhang, X.M.; Liang, L.; Liu, L.; et al. Graph neural networks and their current applications in bioinformatics. Front. Genet., 2021, 12: 690049. doi: 10.3389/fgene.2021.690049

  • 39.

    Yasunaga, M.; Ren, H.Y.; Bosselut, A.; et al. QA-GNN: Reasoning with language models and knowledge graphs for question answering. In Proceedings of 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 6–11 June 2021; ACL, 2021; pp. 535–546. doi: 10.18653/v1/2021.naacl-main.45.

  • 40.

    Nakamura, T.; Taki, K.; Nomiya, H.; et al. A shape-based similarity measure for time series data with ensemble learning. Pattern Anal. Appl., 2013, 16: 535−548. doi: 10.1007/s10044-011-0262-6

  • 41.

    Larrea, M.; Porto, A.; Irigoyen, E.; et al. Extreme learning machine ensemble model for time series forecasting boosted by PSO: Application to an electric consumption problem. Neurocomputing, 2021, 452: 465−472. doi: 10.1016/j.neucom.2019.12.140

  • 42.

    Zhang, X.L.; Zhong, C.K.; Zhang, J.J.; et al. Robust recurrent neural networks for time series forecasting. Neurocomputing, 2023, 526: 143−157. doi: 10.1016/j.neucom.2023.01.037

  • 43.

    Zhu, X.L.; Zhao, S.; Yang, Y.D.; et al. A real-time ensemble classification algorithm for time series data. In 2017 IEEE International Conference on Agents (ICA), Beijing, China, 6–9 July 2017; IEEE: New York, 2017; pp. 145–150. doi: 10.1109/AGENTS.2017.8015322

  • 44.

    Fulcher, B.D. Feature-based time-series analysis. In Feature Engineering for Machine Learning and Data Analytics; Dong, G.Z.; Liu, H., Eds.; CRC Press: Boca Raton, 2018; pp. 87–116.

  • 45.

    Adhikari, R.; Agrawal, R.K. An introductory study on time series modeling and forecasting. arXiv preprint arXiv: 1302.6613, 2013. doi: 10.48550/arXiv.1302.6613.

  • 46.

    Song, M.L.; Li, Y.; Pedrycz, W. Time series prediction with granular neural networks. Neurocomputing, 2023, 546: 126328. doi: 10.1016/j.neucom.2023.126328

  • 47.

    Lecun, Y.; Bottou, L.; Bengio, Y.; et al. Gradient-based learning applied to document recognition. Proc. IEEE, 1998, 86: 2278−2324. doi: 10.1109/5.726791

  • 48.

    Zhang, X.; Zhang, X.; Wang, W. Convolutional neural network. In Intelligent Information Processing with Matlab; Zhang, X.; Zhang, X.; Wang, W., Eds.; Springer: Singapore, 2023; pp. 39–71. doi: 10.1007/978-981-99-6449-9_2

  • 49.

    Li, X.; Li, M.L.; Yan, P.F.; et al. Deep learning attention mechanism in medical image analysis: Basics and beyonds. Int. J. Netw. Dyn. Intell., 2023, 2: 93−116. doi: 10.53941/ijndi0201006

  • 50.

    Indolia, S.; Goswami, A.K.; Mishra, S.P.; et al. Conceptual understanding of convolutional neural network-a deep learning approach. Procedia Comput. Sci., 2018, 132: 679−688. doi: 10.1016/j.procs.2018.05.069

  • 51.

    Ruderman, A.; Rabinowitz, N.C.; Morcos, A.S.; et al. Pooling is neither necessary nor sufficient for appropriate deformation stability in CNNs. arXiv preprint arXiv: 1804.04438, 2018. doi: 10.48550/arXiv.1804.04438

  • 52.

    Akhtar, N.; Ragavendran, U. Interpretation of intelligence in CNN-pooling processes: A methodological survey. Neural Comput. Appl., 2020, 32: 879−898. doi: 10.1007/s00521-019-04296-5

  • 53.

    Yang, D.J.; Cao, J.J.; Ma, Y.Z.; et al. Circular FC: Fast Fourier transform meets fully connected layer for convolutional neural network. In Proceedings of the 30th International Conference on Neural Information Processing, Changsha, China, 20–23 November 2023; Springer: Berlin/Heidelberg, 2024; pp. 483–494. doi: 10.1007/978-981-99-8126-7_38

  • 54.

    Aloysius, N.; Geetha, M. A review on deep convolutional neural networks. In 2017 International Conference on Communication and Signal Processing (ICCSP), Chennai, India, 6–8 April 2017; IEEE: New York, 2017; pp. 588–592. doi: 10.1109/ICCSP.2017.8286426

  • 55.

    Goodfellow, I.J.; Pouget-Abadie, J.; Mirza, M.; et al. Generative adversarial nets. In Proceedings of the 28th International Conference on Neural Information Processing Systems, Montreal, Canada, 8–13 December 2014; MIT Press: Cambridge, 2014; pp. 2672–2680.

  • 56.

    Dou, J.; Song, Y. An improved generative adversarial network with feature filtering for imbalanced data. Int. J. Netw. Dyn. Intell., 2023, 2: 100017. doi: 10.53941/ijndi.2023.100017

  • 57.

    Salvaris, M.; Dean, D.; Tok, W.H. Generative adversarial networks. In Deep Learning with Azure: Building and Deploying Artificial Intelligence Solutions on the Microsoft AI Platform; Salvaris, M.; Dean, D.; Tok, W.H., Eds.; Apress: Berkeley, 2018; pp. 187–208. doi: 10.1007/978-1-4842-3679-6_8

  • 58.

    Wang, C.; Wang, Z.D.; Ma, L.F.; et al. Subdomain-alignment data augmentation for pipeline fault diagnosis: An adversarial self-attention network. IEEE Trans. Industr. Inform., 2024, 20: 1374−1384. doi: 10.1109/TII.2023.3275701

  • 59.

    Chai, L.; Wang, Z.D.; Chen, J.Q.; et al. Synthetic augmentation for semantic segmentation of class imbalanced biomedical images: A data pair generative adversarial network approach. Comput. Biol. Med., 2022, 150: 105985. doi: 10.1016/j.compbiomed.2022.105985

  • 60.

    Wang, K.F.; Gou, C.; Duan, Y.J.; et al. Generative adversarial networks: Introduction and outlook. IEEE/CAA J. Autom. Sin., 2017, 4: 588−598. doi: 10.1109/JAS.2017.7510583

  • 61.

    Rumelhart, D.E.; Hinton, G.E.; Williams, R.J. Learning internal representations by error propagation. In Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1: Foundations; Rumelhart, D.E.; McClelland, J.L., Eds.; MIT Press: Cambridge, 1986; pp. 318–362.

  • 62.

    Blaschke, T.; Olivecrona, M.; Engkvist, O.; et al. Application of generative autoencoder in De Novo molecular design. Mol. Inform., 2018, 37: 1700123. doi: 10.1002/minf.201700123

  • 63.

    Pawar, K.; Attar, V.Z. Assessment of autoencoder architectures for data representation. In Deep Learning: Concepts and Architectures; Pedrycz, W.; Chen, S.M., Eds.; Springer: Cham, 2020; pp. 101–132. doi: 10.1007/978-3-030-31756-0_4

  • 64.

    Li, P.Z.; Pei, Y.; Li, J.Q. A comprehensive survey on design and application of autoencoder in deep learning. Appl. Soft Comput., 2023, 138: 110176. doi: 10.1016/j.asoc.2023.110176

  • 65.

    Lipton, Z.C.; Berkowitz, J.; Elkan, C. A critical review of recurrent neural networks for sequence learning. arXiv preprint arXiv: 1506.00019, 2015. doi: 10.48550/arXiv.1506.00019.

  • 66.

    Pascanu, R.; Gülçehre, C.; Cho, K.; et al. How to construct deep recurrent neural networks. In 2nd International Conference on Learning Representations, Banff, AB, Canada, 14–16 April 2014; 2014.

  • 67.

    Fang, W.; Zhang, F.H.; Ding, Y.W.; et al. A new sequential image prediction method based on LSTM and DCGAN. Comput. Mater. Continua, 2020, 64: 217−231. doi: 10.32604/cmc.2020.06395

  • 68.

    Cahuantzi, R.; Chen, X.Y.; Güttel, S. A comparison of LSTM and GRU networks for learning symbolic sequences. In Intelligent Computing; Arai, K., Ed.; Springer: Cham, 2023; pp. 771–785. doi: 10.1007/978-3-031-37963-5_53

  • 69.

    He, Y.L.; Wang, P.F.; Zhu, Q.X. Improved Bi-LSTM with distributed nonlinear extensions and parallel inputs for soft sensing. IEEE Trans. Industr. Inform., 2024, 20: 3748−3755. doi: 10.1109/TII.2023.3313631

  • 70.

    Ni, P.S.; Sheng, J.L.; Jiang, L.Z.; et al. Sequential ISAR images classification using CNN-Bi-LSTM method. In 2022 3rd China International SAR Symposium (CISS), Shanghai, China, 2–4 November 2022; IEEE: New York, 2022; pp. 1–5. doi: 10.1109/CISS57580.2022.9971386

  • 71.

    Chen, W.; Wang, Z.D.; Dong, H.L.; et al. Privacy-preserving distributed economic dispatch of microgrids over directed networks via state decomposition: A fast consensus algorithm. IEEE Trans. Industr. Inform., 2024, 20: 4092−4102. doi: 10.1109/TII.2023.3321027

  • 72.

    Shabani, N.; Wu, J.; Beheshti, A.; et al. A comprehensive survey on graph summarization with graph neural networks. IEEE Trans. Artif. Intell., 2024, 5: 3780−3800. doi: 10.1109/TAI.2024.3350545

  • 73.

    Ekambaram, V.N.; Fanti, G.C.; Ayazifar, B.; et al. Spline-like wavelet filterbanks for multiresolution analysis of graph-structured data. IEEE Trans. Signal Inf. Process. Netw., 2015, 1: 268−278. doi: 10.1109/TSIPN.2015.2480223

  • 74.

    LeFevre, K.; Terzi, E. GraSS: Graph structure summarization. In Proceedings of the 2010 SIAM International Conference on Data Mining, Columbus, OH, USA, 29 April 2010–1 May 2010; Society for Industrial and Applied Mathematics: Philadelphia, 2010; pp. 454–465. doi: 10.1137/1.9781611972801.40

  • 75.

    Sun, M. PP-GNN: Pretraining Position-aware Graph Neural Networks with the NP-hard metric dimension problem. Neurocomputing, 2023, 561: 126848. doi: 10.1016/j.neucom.2023.126848

  • 76.

    Jin, W.; Ma, Y.; Liu, X.R.; et al. Graph structure learning for robust graph neural networks. In Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, CA, USA, 6–10 July 2020; ACM: New York, NY, USA, 2020; pp. 66–74. doi: 10.1145/3394486.3403049.

  • 77.

    Ding, Y.L.; Fu, M.H.; Luo, P.; et al. Network learning for biomarker discovery. Int. J. Netw. Dyn. Intell., 2023, 2: 51−65. doi: 10.53941/ijndi0201004

  • 78.

    Wu, F.; Zhang, T.Y.; de Souza, A.H., Jr.; et al. Simplifying graph convolutional networks. In 36th International Conference on Machine Learning, Long Beach, CA, USA, 9–15 June 2019; PMLR, 2019; pp. 6861–6871.

  • 79.

    Weinberger, K.Q.; Sha, F.; Zhu, Q.H.; et al. Graph Laplacian regularization for large-scale semidefinite programming. In Proceedings of the 20th International Conference on Neural Information Processing Systems, Vancouver, BC, Canada, 4–7 December 2006; MIT Press: Cambridge, 2006; pp. 1489–1496.

  • 80.

    Defferrard, M.; Bresson, X.; Vandergheynst, P. Convolutional neural networks on graphs with fast localized spectral filtering. In Proceedings of the 30th International Conference on Neural Information Processing Systems, Barcelona, Spain, 5–10 December 2016; Curran Associates Inc.: Red Hook, 2016; pp. 3844–3852.

  • 81.

    Hamilton, W.L.; Ying, R.; Leskovec, J. Inductive representation learning on large graphs. In Proceedings of the 31st International Conference on Neural Information Processing Systems, Long Beach, CA, USA, 4–9 December 2017; Curran Associates Inc.: Red Hook, 2017; pp. 1025–1035.

  • 82.

    Veličković, P.; Cucurull, G.; Casanova, A.; et al. Graph attention networks. In Proceedings of the 6th International Conference on Learning Representations, Vancouver, BC, Canada, 30 April 2018–3 May 2018; OpenReview.net, 2018.

  • 83.

    Liao, W.L.; Bak-Jensen, B.; Pillai, J.R.; et al. A review of graph neural networks and their applications in power systems. J. Mod. Power Syst. Clean Energy, 2022, 10: 345−360. doi: 10.35833/MPCE.2021.000058

  • 84.

    Yang, J.; Li, Y.P.; Wang, G.Y.; et al. An end-to-end knowledge graph fused graph neural network for accurate protein-protein interactions prediction. IEEE/ACM Trans. Comput. Biol. Bioinform., 2024, 21: 2518−2530. doi: 10.1109/TCBB.2024.3486216

  • 85.

    Wang, Y.; Yuan, Y.; Wu, D. A node-collaboration-informed graph convolutional network for precise representation to undirected weighted graphs. In 2023 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Honolulu, HI, USA, 1–4 October 2023; IEEE: New York, 2023; pp. 811–816. doi: 10.1109/SMC53992.2023.10394596

  • 86.

    Ma, Y.; Hao, J.Y.; Yang, Y.D.; et al. Spectral-based graph convolutional network for directed graphs. arXiv preprint arXiv: 1907.08990, 2019. doi: 10.48550/arXiv.1907.08990.

  • 87.

    Li, Y.L. A survey of EEG analysis based on graph neural network. In 2021 2nd International Conference on Electronics, Communications and Information Technology (CECIT), Sanya, China, 27–29 December 2021; IEEE: New York, 2021; pp. 151–155. doi: 10.1109/CECIT53797.2021.00034

  • 88.

    Chen, L.; Luo, X. Tensor distribution regression based on the 3D conventional neural networks. IEEE/CAA J. Autom. Sin., 2023, 10: 1628−1630. doi: 10.1109/JAS.2023.123591

  • 89.

    Alicja, K.; Maciej, S. Can AI see bias in X-ray images?. Int. J. Netw. Dyn. Intell., 2022, 1: 48−64. doi: 10.53941/ijndi0101005

  • 90.

    Yuan, Z.F.; Li, Y.; Liu, Y.; et al. Unsupervised ship detection in SAR imagery based on energy density-induced clustering. Int. J. Netw. Dyn. Intell., 2023, 2: 100006. doi: 10.53941/ijndi.2023.100006

  • 91.

    Wang, T.; Chen, Q.M.; Lang, X.; et al. Detection of oscillations in process control loops from visual image space using deep convolutional networks. IEEE/CAA J. Autom. Sin., 2024, 11: 982−995. doi: 10.1109/JAS.2023.124170

  • 92.

    Dhillon, A.; Verma, G.K. Convolutional neural network: A review of models, methodologies and applications to object detection. Prog. Artif. Intell., 2020, 9: 85−112. doi: 10.1007/s13748-019-00203-0

  • 93.

    Wang, D.L.; Gan, J.; Mao, J.Q.; et al. Forecasting power demand in China with a CNN-LSTM model including multimodal information. Energy, 2023, 263: 126012. doi: 10.1016/j.energy.2022.126012

  • 94.

    Tian, C.J.; Ma, J.; Zhang, C.H.; et al. A deep neural network model for short-term load forecast based on long short-term memory network and convolutional neural network. Energies, 2018, 11: 3493. doi: 10.3390/en11123493

  • 95.

    Rafi, S.H.; Nahid-Al-Masood; Deeba, S.R.; et al. A short-term load forecasting method using integrated CNN and LSTM network. IEEE Access, 2021, 9: 32436−32448. doi: 10.1109/ACCESS.2021.3060654

  • 96.

    Shao, X.R.; Kim, C.S.; Sontakke, P. Accurate deep model for electricity consumption forecasting using multi-channel and multi-scale feature fusion CNN–LSTM. Energies, 2020, 13: 1881. doi: 10.3390/en13081881

  • 97.

    Alhussein, M.; Aurangzeb, K.; Haider, S.I. Hybrid CNN-LSTM model for short-term individual household load forecasting. IEEE Access, 2020, 8: 180544−180557. doi: 10.1109/ACCESS.2020.3028281

  • 98.

    Agga, A.; Abbou, A.; Labbadi, M.; et al. CNN-LSTM: An efficient hybrid deep learning architecture for predicting short-term photovoltaic power production. Electr. Power Syst. Res., 2022, 208: 107908. doi: 10.1016/j.jpgr.2022.107908

  • 99.

    Qu, J.Q.; Qian, Z.; Pei, Y. Day-ahead hourly photovoltaic power forecasting using attention-based CNN-LSTM neural network embedded with multiple relevant and target variables prediction pattern. Energy, 2021, 232: 120996. doi: 10.1016/j.energy.2021.120996

  • 100.

    Lee, W.; Kim, K.; Park, J.; et al. Forecasting solar power using long-short term memory and convolutional neural networks. IEEE Access, 2018, 6: 73068−73080. doi: 10.1109/ACCESS.2018.2883330

  • 101.

    Wu, Q.Y.; Guan, F.; Lv, C.; et al. Ultra-short-term multi-step wind power forecasting based on CNN-LSTM. IET Renew. Power Gener., 2021, 15: 1019−1029. doi: 10.1049/rpg2.12085

  • 102.

    Wang, K.J.; Qi, X.X.; Liu, H.D. Photovoltaic power forecasting based LSTM-Convolutional Network. Energy, 2019, 189: 116225. doi: 10.1016/j.energy.2019.116225

  • 103.

    Alsharekh, M.F.; Habib, S.; Dewi, D.A.; et al. Improving the efficiency of multistep short-term electricity load forecasting via R-CNN with ML-LSTM. Sensors, 2022, 22: 6913. doi: 10.3390/s22186913

  • 104.

    Le, T.; Vo, M.T.; Vo, B.; et al. Improving electric energy consumption prediction using CNN and Bi-LSTM. Appl. Sci., 2019, 9: 4237. doi: 10.3390/app9204237

  • 105.

    Ullah, F.U.M.; Ullah, A.; Haq, I.U.; et al. Short-term prediction of residential power energy consumption via CNN and multi-layer Bi-directional LSTM networks. IEEE Access, 2020, 8: 123369−123380. doi: 10.1109/ACCESS.2019.2963045

  • 106.

    Wu, K.H.; Wu, J., Feng, L.; et al. An attention-based CNN-LSTM-BiLSTM model for short-term electric load forecasting in integrated energy system. Int. Trans. Electr. Energy Syst., 2021, 31: e12637. doi: 10.1002/2050-7038.12637

  • 107.

    Abou Houran, M.; Salman Bukhari, S.M.; Zafar, M.H.; et al. COA-CNN-LSTM: Coati optimization algorithm-based hybrid deep learning model for PV/wind power forecasting in smart grid applications. Appl. Energy, 2023, 349: 121638. doi: 10.1016/j.apenergy.2023.121638

  • 108.

    Khan, Z.A.; Hussain, T.; Ullah, A.; et al. Towards efficient electricity forecasting in residential and commercial buildings: A novel hybrid CNN with a LSTM-AE based framework. Sensors, 2020, 20: 1399. doi: 10.3390/s20051399

  • 109.

    Sajjad, M.; Khan, Z.A.; Ullah, A.; et al. A novel CNN-GRU-based hybrid approach for short-term residential load forecasting. IEEE Access, 2020, 8: 143759−143768. doi: 10.1109/ACCESS.2020.3009537

  • 110.

    Khan, Z.A.; Ullah, A.; Ullah, W.; et al. Electrical energy prediction in residential buildings for short-term horizons using hybrid deep learning strategy. Appl. Sci., 2020, 10: 8634. doi: 10.3390/app10238634

  • 111.

    Kim, J.; Moon, J.; Hwang, E.; et al. Recurrent inception convolution neural network for multi short-term load forecasting. Energy Build., 2019, 194: 328−341. doi: 10.1016/j.enbuild.2019.04.034

  • 112.

    Liu, J.; Shi, Q.; Han, R.L.; et al. A hybrid GA–PSO–CNN model for ultra-short-term wind power forecasting. Energies, 2021, 14: 6500. doi: 10.3390/en14206500

  • 113.

    Zhang, C.; Peng, T.; Nazir, M.S. A novel integrated photovoltaic power forecasting model based on variational mode decomposition and CNN-BiGRU considering meteorological variables. Electr. Power Syst. Res., 2022, 213: 108796. doi: 10.1016/j.jpgr.2022.108796

  • 114.

    Hu, S.; Lu, J.B.; Zhou, S.E. Learning regression distribution: Information diffusion from template to search for visual object tracking. Int. J. Netw. Dyn. Intell., 2024, 3: 100006. doi: 10.53941/ijndi.2024.100006

  • 115.

    Zhang, J.S.; Feng, Y.Q.; Wang, C.; et al. Multi-domain clustering pruning: Exploring space and frequency similarity based on GAN. Neurocomputing, 2023, 542: 126279. doi: 10.1016/j.neucom.2023.126279

  • 116.

    Tian, C.L.; Ye, Y.Y.; Lou, Y.L.; et al. Daily power demand prediction for buildings at a large scale using a hybrid of physics-based model and generative adversarial network. Build. Simul., 2022, 15: 1685−1701. doi: 10.1007/s12273-022-0887-y

  • 117.

    Huang, L.; Li, L.X.; Wei, X.Y.; et al. Short-term prediction of wind power based on BiLSTM–CNN–WGAN-GP. Soft Comput., 2022, 26: 10607−10621. doi: 10.1007/s00500-021-06725-x

  • 118.

    Li, F.Y.; Zheng, H.F.; Li, X.M. A novel hybrid model for multi-step ahead photovoltaic power prediction based on conditional time series generative adversarial networks. Renew. Energy, 2022, 199: 560−586. doi: 10.1016/j.renene.2022.08.134

  • 119.

    Zhou, D.J.; Ma, S.X.; Hao, J.R.; et al. An electricity load forecasting model for Integrated Energy System based on BiGAN and transfer learning. Energy Rep., 2020, 6: 3446−3461. doi: 10.1016/j.egyr.2020.12.010

  • 120.

    Huang, X.Q.; Li, Q.; Tai, Y.H.; et al. Time series forecasting for hourly photovoltaic power using conditional generative adversarial network and Bi-LSTM. Energy, 2022, 246: 123403. doi: 10.1016/j.energy.2022.123403

  • 121.

    Moon, J.; Jung, S.; Park, S.; et al. Conditional tabular GAN-based two-stage data generation scheme for short-term load forecasting. IEEE Access, 2020, 8: 205327−205339. doi: 10.1109/ACCESS.2020.3037063

  • 122.

    Pan, X.P.; Zhou, J.Y.; Sun, X.R.; et al. A hybrid method for day-ahead photovoltaic power forecasting based on generative adversarial network combined with convolutional autoencoder. IET Renew. Power Gener., 2023, 17: 644−658. doi: 10.1049/rpg2.12619

  • 123.

    Zhang, G.Q.; Guo, J.F. A novel ensemble method for residential electricity demand forecasting based on a novel sample simulation strategy. Energy, 2020, 207: 118265. doi: 10.1016/j.energy.2020.118265

  • 124.

    Bu, X.Y.; Wu, Q.W.; Zhou, B.; et al. Hybrid short-term load forecasting using CGAN with CNN and semi-supervised regression. Appl. Energy, 2023, 338: 120920. doi: 10.1016/j.apenergy.2023.120920

  • 125.

    Wang, Z.H.; Wang, C.; Cheng, L.; et al. An approach for day-ahead interval forecasting of photovoltaic power: A novel DCGAN and LSTM based quantile regression modeling method. Energy Rep., 2022, 8: 14020−14033. doi: 10.1016/j.egyr.2022.10.309

  • 126.

    Ma, G.J.; Wang, Z.D.; Liu, W.B.; et al. Estimating the state of health for lithium-ion batteries: A particle swarm optimization-assisted deep domain adaptation approach. IEEE/CAA J. Autom. Sin., 2023, 10: 1530−1543. doi: 10.1109/JAS.2023.123531

  • 127.

    Su, Y.X.; He, Q.Y.; Chen, J.; et al. A residential load forecasting method for multi-attribute adversarial learning considering multi-source uncertainties. Int. J. Electr. Power Energy Syst., 2023, 154: 109421. doi: 10.1016/j.ijepes.2023.109421

  • 128.

    Zang, S.S.; Jin, H.D.; Yu, Q.H.; et al. Video summarization using U-shaped non-local network. Int. J. Netw. Dyn. Intell., 2024, 3: 100013. doi: 10.53941/ijndi.2024.100013

  • 129.

    Chen, Y.H.; Kloft, M.; Yang, Y.; et al. Mixed kernel based extreme learning machine for electric load forecasting. Neurocomputing, 2018, 312: 90−106. doi: 10.1016/j.neucom.2018.05.068

  • 130.

    Alipour, M.; Aghaei, J.; Norouzi, M.; et al. A novel electrical net-load forecasting model based on deep neural networks and wavelet transform integration. Energy, 2020, 205: 118106. doi: 10.1016/j.energy.2020.118106

  • 131.

    Sabri, M.; El Hassouni, M. Photovoltaic power forecasting with a long short-term memory autoencoder networks. Soft Comput., 2023, 27: 10533−10553. doi: 10.1007/s00500-023-08497-y

  • 132.

    Kaur, D.; Islam, S.N.; Mahmud, M.A. A variational autoencoder-based dimensionality reduction technique for generation forecasting in cyber-physical smart grids. In 2021 IEEE International Conference on Communications Workshops (ICC Workshops), Montreal, QC, Canada, 14–23 June 2021; IEEE: New York, 2021; pp. 1–6. doi: 10.1109/ICCWorkshops50388.2021.9473748

  • 133.

    Moradzadeh, A.; Moayyed, H.; Zare, K.; et al. Short-term electricity demand forecasting via variational autoencoders and batch training-based bidirectional long short-term memory. Sustain. Energy Technol. Assess., 2022, 52: 102209. doi: 10.1016/j.seta.2022.102209

  • 134.

    Zhang, Y.; Qin, C.; Srivastava, A.K.; et al. Data-driven day-ahead PV estimation using autoencoder-LSTM and persistence model. IEEE Trans. Ind. Appl., 2020, 56: 7185−7192. doi: 10.1109/TIA.2020.3025742

  • 135.

    Yang, Y.; Wang, Z.J.; Gao, Y.C.; et al. An effective dimensionality reduction approach for short-term load forecasting. Electr. Power Syst. Res., 2022, 210: 108150. doi: 10.1016/j.epsr.2022.108150

  • 136.

    Zheng, K.H.; Li, P.; Zhou, S.L.; et al. A multi-scale electricity consumption prediction algorithm based on time-frequency variational autoencoder. IEEE Access, 2021, 9: 90937−90946. doi: 10.1109/ACCESS.2021.3071452

  • 137.

    Khan, M.; Naeem, M.R.; Al-Ammar, E.A.; et al. Power forecasting of regional wind farms via variational auto-encoder and deep hybrid transfer learning. Electronics, 2022, 11: 206. doi: 10.3390/electronics11020206

  • 138.

    Dairi, A.; Harrou, F.; Sun, Y.; et al. Short-term forecasting of photovoltaic solar power production using variational auto-encoder driven deep learning approach. Appl. Sci., 2020, 10: 8400. doi: 10.3390/app10238400

  • 139.

    Fazlipour, Z.; Mashhour, E.; Joorabian, M. A deep model for short-term load forecasting applying a stacked autoencoder based on LSTM supported by a multi-stage attention mechanism. Appl. Energy, 2022, 327: 120063. doi: 10.1016/j.apenergy.2022.120063

  • 140.

    Jiao, R.H.; Huang, X.J.; Ma, X.H.; et al. A model combining stacked auto encoder and back propagation algorithm for short-term wind power forecasting. IEEE Access, 2018, 6: 17851−17858. doi: 10.1109/ACCESS.2018.2818108

  • 141.

    Liu, P.; Zheng, P.J.; Chen, Z.Y. Deep learning with stacked denoising auto-encoder for short-term electric load forecasting. Energies, 2019, 12: 2445. doi: 10.3390/en12122445

  • 142.

    Yu, D.; Li, J.Y. Recent progresses in deep learning based acoustic models. IEEE/CAA J. Autom. Sin., 2017, 4: 396−409. doi: 10.1109/JAS.2017.7510508

  • 143.

    Chen, H.H.; Zhu, M.Y.; Hu, X.; et al. Multifeature short-term power load forecasting based on GCN-LSTM. Int. Trans. Electr. Energy Syst., 2023, 2023: 8846554. doi: 10.1155/2023/8846554

  • 144.

    Chen, H.H.; Zhu, M.Y.; Hu, X.; et al. Research on short-term load forecasting of new-type power system based on GCN-LSTM considering multiple influencing factors. Energy Rep., 2023, 9: 1022−1031. doi: 10.1016/j.egyr.2023.05.048

  • 145.

    Hu, L.; Pan, X.Y.; Tang, Z.H.; et al. A fast fuzzy clustering algorithm for complex networks via a generalized momentum method. IEEE Trans. Fuzzy Syst., 2022, 30: 3473−3485. doi: 10.1109/TFUZZ.2021.3117442

  • 146.

    Karimi, A.M.; Wu, Y.H.; Koyuturk, M.; et al. Spatiotemporal graph neural network for performance prediction of photovoltaic power systems. In Proceedings of the 35th AAAI Conference on Artificial Intelligence, 2–9 February 2021; AAAI Press: Washington, DC, USA, 2021; pp. 15323–15330. doi: 10.1609/aaai.v35i17.17799

  • 147.

    Jiang, Y.M.; Liu, M.S.; Li, Y.Y.; et al. Enhanced neighborhood node graph neural networks for load forecasting in smart grid. Int. J. Mach. Learn. Cybern., 2024, 15: 129−148. doi: 10.1007/s13042-023-01796-8

  • 148.

    Wang, Y.F.; Rui, L.X.; Ma, J.H.; et al. A short-term residential load forecasting scheme based on the multiple correlation-temporal graph neural networks. Appl. Soft Comput., 2023, 146: 110629. doi: 10.1016/j.asoc.2023.110629

  • 149.

    Arastehfar, S.; Matinkia, M.; Jabbarpour, M.R. Short-term residential load forecasting using graph convolutional recurrent neural networks. Eng. Appl. Artif. Intell., 2022, 116: 105358. doi: 10.1016/j.engappai.2022.105358

  • 150.

    Song, Y.; Tang, D.Y.; Yu, J.S.; et al. Short-term forecasting based on graph convolution networks and multiresolution convolution neural networks for wind power. IEEE Trans. Industr. Inform., 2023, 19: 1691−1702. doi: 10.1109/TII.2022.3176821

  • 151.

    Zou, Y.Q.; Feng, W.J.; Zhang, J.T.; et al. Forecasting of short-term load using the MFF-SAM-GCN model. Energies, 2022, 15: 3140. doi: 10.3390/en15093140

  • 152.

    Zhang, Q.Y.; Chen, J.H.; Xiao, G.; et al. TransformGraph: A novel short-term electricity net load forecasting model. Energy Rep., 2023, 9: 2705−2717. doi: 10.1016/j.egyr.2023.01.050

  • 153.

    Hebrail, G.; Berard, A. Individual Household Electric Power Consumption. UCI Machine Learning Repository, 2006. Available online: https://doi.org/10.24432/C58K54 (accessed on 16 April 2024).

  • 154.

    Salam, A.; El Hibaoui, A. Power Consumption of Tetouan City. UCI Machine Learning Repository, 2018. Available online: https://doi.org/10.24432/C5B034 (accessed on 16 April 2024).

  • 155.

    Ong, S.; Clark, N. Commercial and Residential Hourly Load Profiles for all TMY3 Locations in the United States [Data Set]. 2014. Available online: https://doi.org/10.25984/1788456 (accessed on 16 April 2024).

  • 156.

    Murray, D.; Stankovic, L.; Stankovic, V. An electrical load measurements dataset of United Kingdom households from a two-year longitudinal study. Sci. Data, 2017, 4: 160122. doi: 10.1038/sdata.2016.122

  • 157.

    Zhou, X.Y.; Pan, Z.S.; Hu, G.Y.; et al. Stock market prediction on high-frequency data using generative adversarial nets. Math. Probl. Eng., 2018, 2018: 4907423. doi: 10.1155/2018/4907423

How to Cite
Ma, Q.; Wu, D.; Luo, X. A Review of Deep Learning-based Power Load Forecasting Methods. International Journal of Network Dynamics and Intelligence 2025, 4 (4), 100027. https://doi.org/10.53941/ijndi.2025.100027.
Copyright & License
Copyright (c) 2025 by the authors.