A Hybrid Conjugate Gradient Algorithm for Unconstrained Optimization as a Convex Combination of Hestenes-Stiefel and Dai-Yuan

Neculai ANDREI
ICI Bucharest
(National Institute for R&D in Informatics)
Center for Advanced Modeling and Optimization

8-10 Averescu Blvd.
011455 Bucharest 1, Romania
nandrei@ici.ro

Abstract: In this paper we propose and analyze another hybrid conjugate gradient algorithm in which the parameter β_k is computed as a convex combination of β_k^{HS} (Hestenes-Stiefel) and β_k^{DY} (Dai-Yuan), i.e. β_k^{C} = (1 − θ_k)β_k^{HS} + θ_k β_k^{DY}. The parameter θ_k in the convex combination is computed in such a way that the direction corresponding to the conjugate gradient algorithm is the Newton direction and the secant equation is satisfied. The algorithm uses the standard Wolfe line search conditions. Numerical comparisons with conjugate gradient algorithms using a set of 750 unconstrained optimization problems, some of them from the CUTE library, show that this hybrid computational scheme outperforms the Hestenes-Stiefel and the Dai-Yuan conjugate gradient algorithms, as well as some other known hybrid conjugate gradient algorithms. Comparisons with CG_DESCENT by Hager and Zhang [17] and LBFGS by Liu and Nocedal [22] show that CG_DESCENT is more robust than our algorithm, and that LBFGS is the top performer among these algorithms.

Keywords: Unconstrained optimization, hybrid conjugate gradient method, Newton direction, numerical comparisons.
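The abstract's convex combination can be sketched in a few lines. The helper below is an illustrative reconstruction, not the paper's implementation: it evaluates the Hestenes-Stiefel and Dai-Yuan coefficients from the current gradient pair and search direction and mixes them with a given θ_k. The paper derives θ_k from the Newton direction and the secant equation; here θ is simply an input, clamped to [0, 1] to keep the combination convex.

```python
import numpy as np

def hybrid_beta(g_new, g_old, d, theta):
    """Convex combination of the Hestenes-Stiefel and Dai-Yuan
    conjugate gradient parameters (illustrative sketch).

    g_new, g_old : gradients at the new and current iterates
    d            : current search direction d_k
    theta        : mixing parameter in [0, 1] (the paper computes it
                   from the Newton direction / secant equation;
                   here it is supplied by the caller)
    """
    y = g_new - g_old                 # y_k = g_{k+1} - g_k
    denom = d @ y                     # d_k^T y_k, shared by both formulas
    beta_hs = (g_new @ y) / denom     # Hestenes-Stiefel coefficient
    beta_dy = (g_new @ g_new) / denom # Dai-Yuan coefficient
    theta = min(max(theta, 0.0), 1.0) # enforce convexity of the mix
    return (1.0 - theta) * beta_hs + theta * beta_dy
```

With θ = 0 the scheme reduces to Hestenes-Stiefel and with θ = 1 to Dai-Yuan; the resulting β_k^{C} then drives the usual direction update d_{k+1} = −g_{k+1} + β_k^{C} d_k under Wolfe line search conditions.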

CITE THIS PAPER AS:
Neculai ANDREI, A Hybrid Conjugate Gradient Algorithm for Unconstrained Optimization as a Convex Combination of Hestenes-Stiefel and Dai-Yuan, Studies in Informatics and Control, ISSN 1220-1766, vol. 17 (1), pp. 55-70, 2008.