ICI Bucharest
(National Institute for R & D in Informatics)
Center for Advanced Modeling and Optimization
8-10 Averescu Blvd.
011455 Bucharest 1, Romania
Abstract: In this paper we propose and analyze another hybrid conjugate gradient algorithm in which the parameter β_k is computed as a convex combination of β_k^HS (Hestenes-Stiefel) and β_k^DY (Dai-Yuan), i.e. β_k^C = (1 − θ_k) β_k^HS + θ_k β_k^DY. The parameter θ_k in the convex combination is computed in such a way that the direction corresponding to the conjugate gradient algorithm is the Newton direction and the secant equation is satisfied. The algorithm uses the standard Wolfe line search conditions. Numerical comparisons with conjugate gradient algorithms on a set of 750 unconstrained optimization problems, some of them from the CUTE library, show that this hybrid computational scheme outperforms the Hestenes-Stiefel and the Dai-Yuan conjugate gradient algorithms, as well as some other known hybrid conjugate gradient algorithms. Comparisons with CG_DESCENT by Hager and Zhang and LBFGS by Liu and Nocedal show that CG_DESCENT is more robust than our algorithm, and that LBFGS is the top performer among these algorithms.
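The convex combination described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it computes the standard Hestenes-Stiefel and Dai-Yuan parameters and blends them with a given θ_k clipped to [0, 1]; the paper's actual θ_k is derived from the Newton direction and the secant equation, which is not reproduced here.

```python
def dot(a, b):
    """Plain Euclidean inner product of two vectors given as lists."""
    return sum(x * y for x, y in zip(a, b))

def beta_hybrid(g_new, g_old, d, theta):
    """Convex combination beta_C = (1 - theta)*beta_HS + theta*beta_DY.

    g_new, g_old : gradients at the new and current iterates
    d            : current search direction
    theta        : combination parameter (here supplied by the caller;
                   in the paper it is computed from the secant equation)
    """
    y = [gn - go for gn, go in zip(g_new, g_old)]  # y_k = g_{k+1} - g_k
    denom = dot(y, d)                              # common denominator y_k^T d_k
    beta_hs = dot(g_new, y) / denom                # Hestenes-Stiefel
    beta_dy = dot(g_new, g_new) / denom            # Dai-Yuan
    t = min(max(theta, 0.0), 1.0)                  # keep the combination convex
    return (1.0 - t) * beta_hs + t * beta_dy
```

At θ_k = 0 the formula reduces to the pure Hestenes-Stiefel parameter, and at θ_k = 1 to the pure Dai-Yuan parameter, so the hybrid interpolates between the two classical methods.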
Keywords: Unconstrained optimization, hybrid conjugate gradient method, Newton direction, numerical comparisons.
CITE THIS PAPER AS:
Neculai ANDREI, A Hybrid Conjugate Gradient Algorithm for Unconstrained Optimization as a Convex Combination of Hestenes-Stiefel and Dai-Yuan, Studies in Informatics and Control, ISSN 1220-1766, vol. 17 (1), pp. 55-70, 2008.