LSTM-based Multivariate Time-Series Analysis: A Case of Journal Visitors Forecasting
Anggie Wahyu Saputra(1*); Aji Prasetya Wibawa(2); Utomo Pujianto(3); Agung Bella Putra Utama(4); Andrew Nafalski(5);
(1) Universitas Negeri Malang
(2) Universitas Negeri Malang
(3) Universitas Negeri Malang
(4) Universitas Negeri Malang
(5) University of South Australia
(*) Corresponding Author
Abstract
Forecasting is the process of predicting future values based on previous patterns. Forecasting is never perfectly accurate because the future is inherently uncertain; however, an appropriate method can keep the error rate low and yield a useful forecast. This study aims to determine the effect of increasing the number of hidden layers and neurons on the performance of the long short-term memory (LSTM) forecasting method. LSTM performance is measured by root mean square error (RMSE) across several architectural scenarios. The LSTM algorithm is considered capable of handling long-term dependencies in its input and of forecasting over relatively long horizons. Across all models, the best result was an RMSE of 0.699, obtained by Model 1 with 2 hidden layers and 64 neurons. Increasing the number of hidden layers significantly affects the RMSE when 16 and 32 neurons are used in Model 1.
Keywords: Forecasting; Multivariate; Long Short-term Memory; Sessions.
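The architectural scenarios described in the abstract (varying the number of stacked LSTM hidden layers and the neurons per layer, then comparing RMSE) can be illustrated with a minimal sketch. The Python/Keras code below is not the authors' implementation: the build_lstm helper, the synthetic data, and the training settings (Adam optimizer, 5 epochs, batch size 32) are assumptions for illustration only, and the study's journal-visitor dataset and preprocessing are not reproduced here. RMSE is computed as the square root of the mean squared error between actual and predicted values.

# Minimal sketch (assumed setup, not the authors' code): a stacked LSTM for
# multivariate forecasting, evaluated with RMSE.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

def build_lstm(n_hidden_layers=2, n_neurons=64, n_timesteps=7, n_features=4):
    """Stack n_hidden_layers LSTM layers with n_neurons units each."""
    model = Sequential()
    for i in range(n_hidden_layers):
        kwargs = {"return_sequences": i < n_hidden_layers - 1}  # stacked layers need sequences
        if i == 0:
            kwargs["input_shape"] = (n_timesteps, n_features)   # first layer defines input shape
        model.add(LSTM(n_neurons, **kwargs))
    model.add(Dense(1))  # single forecast value (e.g. next-period sessions)
    model.compile(optimizer="adam", loss="mse")
    return model

# Synthetic stand-in data: 500 windows of 7 time steps with 4 features each.
rng = np.random.default_rng(0)
X = rng.random((500, 7, 4)).astype("float32")
y = rng.random((500, 1)).astype("float32")

# Example scenario: 2 hidden layers and 64 neurons, as in the best-performing model.
model = build_lstm(n_hidden_layers=2, n_neurons=64)
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

# RMSE = sqrt(mean((y_true - y_pred)^2)), the metric used in the study.
y_pred = model.predict(X, verbose=0)
rmse = float(np.sqrt(np.mean((y - y_pred) ** 2)))
print(f"RMSE: {rmse:.3f}")

Other scenarios would be generated by calling build_lstm with different n_hidden_layers and n_neurons values and comparing the resulting RMSE scores.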
DOI: https://doi.org/10.33096/ilkom.v14i1.1106.57-62
Copyright (c) 2022 Anggie Wahyu Saputra, Aji Prasetya Wibawa, Utomo Pujianto, Agung Bella Putra Utama, Andrew Nafalski
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.