
Table 2 above presents a numerical comparison
between the new algorithm and the HS Algorithm
for n=10000.
5 Conclusion
In this study, a novel scaled nonlinear conjugate
gradient method is developed that, under certain
assumptions, attains global convergence for
uniformly convex functions. In addition, it satisfies
the essential descent condition commonly required
of standard gradient algorithms. The utility of the
proposed scaled variants was demonstrated in the
reported numerical experiments. The obtained
results show that, at least on the chosen set of test
problems, the developed algorithm reduces NOI by
an overall 61% and IRS by 78.82% relative to the
standard Hestenes-Stiefel (HS) algorithm. The
relative utility of the new approach for n = 1000
and n = 10000 is presented in Table 3.
Table 3. Relative utility of the new approach, n = 1000, 10000

Tools            NOI     IRS
HS-Algorithm     100%    100%
New-Algorithm    39%     21.18%
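The scaled conjugate gradient scheme summarized above can be illustrated with a minimal sketch. The scaling factor and line search below are generic stand-ins (a Barzilai-Borwein-style spectral scaling and an Armijo backtracking search), not the paper's exact formulas; the Hestenes-Stiefel parameter and the descent-condition restart are the standard ingredients the conclusion refers to.

```python
import numpy as np

def scaled_hs_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    # Generic sketch of a scaled Hestenes-Stiefel CG loop. The scaling
    # theta is a spectral (Barzilai-Borwein-style) choice assumed here
    # for illustration; the paper's own scaling rule is not reproduced.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # start along steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search satisfying an Armijo sufficient decrease
        alpha, rho, c1 = 1.0, 0.5, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= rho
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        beta_hs = g_new.dot(y) / d.dot(y)    # Hestenes-Stiefel parameter
        theta = s.dot(s) / s.dot(y)          # spectral scaling (assumed)
        d = -theta * g_new + beta_hs * d
        if g_new.dot(d) >= 0:                # enforce the descent condition
            d = -g_new                       # restart with steepest descent
        x, g = x_new, g_new
    return x, k

# Usage on a simple uniformly convex quadratic
A = np.diag([1.0, 10.0])
f = lambda x: 0.5 * x.dot(A).dot(x)
g = lambda x: A.dot(x)
x_star, iters = scaled_hs_cg(f, g, np.array([3.0, -2.0]))
```

The explicit descent check `g_new.dot(d) >= 0` mirrors the role of the descent condition discussed above: whenever the scaled HS direction fails it, the iteration falls back to steepest descent.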
The new method is promising and merits further
exploration on a wider spectrum of problems:
extending it to constrained optimization; exploring
parallel and distributed implementations for
scalability, [8], [20], [21]; developing adaptive
parameter-tuning schemes; and analyzing
sensitivity to initial conditions. It is also
worth integrating the method with machine learning
models, [22], assessing robustness to noisy
objectives, comparing it with state-of-the-art
methods, exploring real-world applications, and
developing user-friendly interfaces for wider
accessibility. These directions aim to enhance the
algorithm's versatility, efficiency, and practical
applicability across diverse optimization scenarios.
References:
[1] E. D. Dolan and J. J. Moré, “Benchmarking
optimization software with performance
profiles,” Math. Program. Ser. B, vol. 91, no.
2, pp. 201–213, 2002, doi:
10.1007/s101070100263.
[2] B. A. Hassan and H. A. Alashoor, “On image
restoration problems using new conjugate
gradient methods,” Indonesian Journal of
Electrical Engineering and Computer
Science, vol. 29, no. 3, p. 1438, Mar. 2023,
doi: 10.11591/ijeecs.v29.i3.pp1438-1445.
[3] H. M. Khudhur and A. A. M. Fawze, “An
improved conjugate gradient method for
solving unconstrained optimization and
image restoration problems,” Int. J.
Mathematical Modeling Numerical
Optimization, vol. 13, no. 3, pp. 313–325,
2023, doi: 10.1504/IJMMNO.2023.132286.
[4] B. A. Hassan and H. Sadiq, “Efficient New
Conjugate Gradient Methods for Removing
Impulse Noise Images,” Eur. J. Pure Appl.
Math., vol. 15, no. 4, pp. 2011–2021, Oct.
2022, doi: 10.29020/nybg.ejpam.v15i4.4568.
[5] S. Aji, A. B. Abubakar, A. I. Kiri, and A.
Ishaku, “A Spectral Conjugate Gradient-like
Method for Convex Constrained Nonlinear
Monotone Equations and Signal Recovery,”
Nonlinear Convex Anal. Optim., vol. 1, no. 1,
pp. 1–23, 2022.
[6] M. R. Hestenes and E. Stiefel, “Methods of
conjugate gradients for solving linear
systems,” Journal of Research of the
National Bureau of Standards, vol. 49, no. 6,
pp. 409–436, Dec. 1952, doi:
10.6028/jres.049.044.
[7] N. Andrei, “An accelerated conjugate
gradient algorithm with guaranteed descent
and conjugacy conditions for unconstrained
optimization,” Optimization Methods and
Software, vol. 27, no. 4–5, 2012, doi:
10.1080/10556788.2010.501379.
[8] M. M. Abed, U. Öztürk, and H. M. Khudhur,
“Spectral CG Algorithm for Solving Fuzzy
Nonlinear Equations,” Iraqi J. Computer
Science and Math., vol. 3, no. 1, pp. 1–10,
Jan. 2022, doi:
10.52866/ijcsm.2022.01.01.001.
[9] Y. A. Laylani, B. A. Hassan, and H. M.
Khudhur, “Enhanced spectral conjugate
gradient methods for unconstrained
optimization,” Computer Science, vol. 18,
no. 2, pp. 163–172, 2023.
[10] N. Andrei, “Numerical comparison of
conjugate gradient algorithms for
unconstrained optimization,” Stud.
Informatics Control, vol. 16, pp. 333–352,
2007.
[11] H. N. Jabbar and B. A. Hassan, “Two-
versions of descent conjugate gradient
methods for large-scale unconstrained
optimization,” Indonesian Journal of
Electrical Engineering and Computer
Science, vol. 22, no. 3, p. 1643, Jun. 2021,
WSEAS TRANSACTIONS on MATHEMATICS
DOI: 10.37394/23206.2023.22.101
Isam H. Halil, Issam A.R. Moghrabi,
Ahmed A. Fawze, Basim A. Hassan, Hisham M. Khudhur