Two new conjugate gradient methods for unconstrained optimization problems
Keywords: global convergence, unconstrained optimization, strong Wolfe conditions, descent direction, step length
Zheng Y. and Zheng B. modified the Dai-Liao conjugate gradient method to obtain two new Dai-Liao-type conjugate gradient methods. These methods were shown to satisfy the descent condition under the strong Wolfe line search, and their global convergence for general objective functions was also guaranteed. In this work, two new conjugate gradient methods for unconstrained nonlinear optimization problems are introduced, in line with the work of Zheng Y. and Zheng B., by changing the first term in the AyO-CG method. The descent properties of these methods are established, and convergence analysis under the strong Wolfe conditions shows that they are globally convergent. A comparison of the numerical performance of these methods with the two modified Dai-Liao-type methods, based on the Dolan-Moré performance profile, shows that our methods compare favorably with them.
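To fix notation, a generic Dai-Liao-type method of the family discussed above updates the search direction as d_{k+1} = -g_{k+1} + beta_k d_k with beta_k = (g_{k+1}^T y_k - t g_{k+1}^T s_k) / (d_k^T y_k), where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k. The following is a minimal sketch of that iteration on a small convex quadratic; the test problem, the parameter t = 0.1, and the simple Armijo backtracking step (used here in place of a full strong Wolfe line search for brevity) are illustrative assumptions, not the specific methods proposed in the paper.

```python
import numpy as np

# Illustrative test problem: f(x) = 0.5 x^T A x - b^T x with A symmetric
# positive definite, so the unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])

def f(x):
    return 0.5 * x @ A @ x - b @ x

def grad(x):
    return A @ x - b

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-8, max_iter=5000):
    """Generic Dai-Liao-type CG sketch with Armijo backtracking.

    The papers discussed above use the strong Wolfe line search; backtracking
    is substituted here only to keep the example short.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking: halve alpha until sufficient decrease holds.
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        denom = d @ y
        # Dai-Liao parameter: beta = (g_{k+1}^T y_k - t g_{k+1}^T s_k) / (d_k^T y_k)
        beta = 0.0 if abs(denom) < 1e-12 else (g_new @ y - t * (g_new @ s)) / denom
        d = -g_new + beta * d
        if g_new @ d >= 0.0:
            d = -g_new  # restart if the new direction is not a descent direction
        x, g = x_new, g_new
    return x

x_star = dai_liao_cg(f, grad, np.array([0.0, 0.0]))
print(np.round(x_star, 4))  # minimizer of the quadratic, i.e. A^{-1} b
```

The restart to the steepest-descent direction is a standard safeguard; the methods analyzed in the paper instead guarantee the descent property directly under the strong Wolfe conditions.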
Copyright (c) 2023 Annals of Mathematics and Computer Science
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.