Mid-Term Power Load Forecasting of a Statistically Modified Long-term Data by using the LSTM
Keywords:
Load Forecasting, Long Short-Term Memory, Machine Learning, Short-Term Load Forecasting, Time-Series Analysis

Abstract
The surplus power produced by power plants, which is considered a generation loss, can be avoided by estimating the expected load consumption, leading to financial gains for companies producing electrical energy. An accurate estimate of the power load supports reliable power system management decisions and an accompanying reduction of the gas emitted from power plants. This work aims to create an integrated deep learning model based on a time-series index to estimate future values of electric power consumption by applying Long Short-Term Memory (LSTM) networks. The dataset was taken directly from PJM Interconnection, a regional transmission organization in the United States; it consists of hourly power consumption, in megawatts, for Chicago and much of northern Illinois. A statistical test was used to evaluate the dataset. Three different statistical functions were used for resampling the dataset: the mean, minimum, and maximum functions. After fitting, the proposed model predicts the power load one year ahead on a daily basis. When the minimum function was used in the resampling process, the model attained a Mean Absolute Percentage Error (MAPE) of 3.84% and a coefficient of determination (R-squared) of 0.8.
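To make the described workflow concrete, the sketch below shows one way it could be wired together with pandas and Keras: the hourly load series is resampled to daily values with the minimum function, a small LSTM is fitted on lagged windows, and the final year is held out to compute MAPE and R-squared. The file name, column name, look-back window, and layer sizes are illustrative assumptions; they are not the architecture or hyperparameters used in the paper.

```python
# Minimal sketch of the abstract's pipeline (assumed: pandas/Keras, a CSV with a
# datetime index and a single load column named "load_MW" -- names are illustrative).
import numpy as np
import pandas as pd
from tensorflow import keras

# Load the hourly load data.
df = pd.read_csv("pjm_hourly_load.csv", index_col=0, parse_dates=True)

# Resample hourly data to daily values with one of the three statistical
# functions used in the study: mean, minimum, or maximum.
daily = df["load_MW"].resample("D").min()   # swap .min() for .mean() or .max()

# Standardize and build supervised sequences: the previous `lookback` days
# are used to predict the next day's load (window length is an assumption).
lookback = 30
values = daily.to_numpy(dtype="float32")
mu, sigma = values.mean(), values.std()
scaled = (values - mu) / sigma
X = np.stack([scaled[i:i + lookback] for i in range(len(scaled) - lookback)])
y = scaled[lookback:]
X = X[..., np.newaxis]                      # shape: (samples, timesteps, features)

# Hold out the last 365 days as the one-year-ahead daily forecasting horizon.
X_train, X_test = X[:-365], X[-365:]
y_train, y_test = y[:-365], y[-365:]

# A small LSTM regressor (layer size and training settings are placeholders).
model = keras.Sequential([
    keras.layers.Input(shape=(lookback, 1)),
    keras.layers.LSTM(64),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X_train, y_train, epochs=50, batch_size=32, verbose=0)

# Evaluate with the two metrics reported in the abstract: MAPE and R-squared,
# computed on the original megawatt scale.
y_pred = model.predict(X_test, verbose=0).ravel() * sigma + mu
y_true = y_test * sigma + mu
mape = np.mean(np.abs((y_true - y_pred) / y_true)) * 100
r2 = 1 - np.sum((y_true - y_pred) ** 2) / np.sum((y_true - y_true.mean()) ** 2)
print(f"MAPE: {mape:.2f}%  R-squared: {r2:.3f}")
```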