Soft Computing-Based Generalized Least Deviation Method Algorithm for Modeling and Forecasting COVID-19 using Quasilinear Recurrence Equations
DOI:
https://doi.org/10.52866/ijcsm.2024.05.03.028
Keywords:
Time Series Forecasting, Loss Function Minimization, COVID-19 Time Series
Abstract
This study introduces an advanced algorithm based on the Generalized Least Deviation Method
(GLDM) tailored for the univariate time series analysis of COVID-19 data. At the core of this approach is the
optimization of a loss function, strategically designed to enhance the accuracy of the model’s predictions. The
algorithm leverages second-order terms, which are crucial for capturing the complexities inherent in time series data. Our
findings show that optimizing the loss function and effectively exploiting second-order model dynamics yields a
marked improvement in predictive performance. This advancement leads to a robust and practical forecasting tool,
significantly enhancing the accuracy and reliability of univariate time series forecasts in the context of monitoring
COVID-19 trends.
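The abstract describes fitting a second-order quasilinear recurrence to a univariate series by minimizing a deviation-based loss. The paper's exact GLDM formulation is not reproduced here; the following is a minimal illustrative sketch under assumed choices: a recurrence with lags x_{t-1}, x_{t-2} plus their quadratic cross-terms, a least-absolute-deviation loss, and a generic derivative-free optimizer (all names, `fit_gldm`, `features`, `forecast`, are hypothetical).

```python
# Hypothetical sketch of a GLDM-style second-order quasilinear recurrence fit.
# Assumed model (not taken from the paper's equations):
#   x_t ≈ a1*x_{t-1} + a2*x_{t-2} + a3*x_{t-1}^2 + a4*x_{t-1}*x_{t-2} + a5*x_{t-2}^2
# fitted by minimizing the sum of absolute deviations (least-deviation loss).
import numpy as np
from scipy.optimize import minimize

def features(x):
    """Build the quasilinear regressor matrix from lagged values."""
    x1, x2 = x[1:-1], x[:-2]  # x_{t-1} and x_{t-2} aligned with targets x_t
    return np.column_stack([x1, x2, x1**2, x1 * x2, x2**2])

def fit_gldm(series):
    """Estimate coefficients by least absolute deviation (illustrative only)."""
    X, y = features(series), series[2:]
    loss = lambda a: np.abs(y - X @ a).sum()  # L1 (least deviation) loss
    res = minimize(loss, np.zeros(X.shape[1]), method="Nelder-Mead",
                   options={"maxiter": 20000, "xatol": 1e-10, "fatol": 1e-10})
    return res.x

def forecast(series, coeffs, steps):
    """Iterate the fitted recurrence forward for multi-step forecasts."""
    x = list(series[-2:])
    for _ in range(steps):
        x1, x2 = x[-1], x[-2]
        x.append(float(coeffs @ np.array([x1, x2, x1**2, x1 * x2, x2**2])))
    return x[2:]
```

A derivative-free optimizer is used here only because the L1 loss is non-smooth; a linear-programming reformulation would solve the same least-deviation problem exactly.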
License
Copyright (c) 2024 Mostafa Abotaleb
This work is licensed under a Creative Commons Attribution 4.0 International License.