A Hybrid Conjugate Gradient Algorithm with Modified Secant Condition for Unconstrained Optimization as a Convex Combination of Hestenes-Stiefel and Dai-Yuan Algorithms

Neculai ANDREI1,2
1 Center for Advanced Modeling and Optimization, National Institute for R&D in Informatics, (ICI)
8-10, Averescu Avenue, Bucharest 1, Romania
2 Academy of Romanian Scientists
54, Splaiul Independenţei, Bucharest 5, Romania

Abstract: Another hybrid conjugate gradient algorithm is suggested in this paper. The parameter $\beta_k$ is computed as a convex combination of the $\beta_k^{HS}$ (Hestenes-Stiefel) and $\beta_k^{DY}$ (Dai-Yuan) formulae, i.e. $\beta_k^{C} = (1-\theta_k)\beta_k^{HS} + \theta_k\beta_k^{DY}$. The parameter $\theta_k$ in the convex combination is computed in such a way that the direction corresponding to the conjugate gradient algorithm is the Newton direction and the pair $(s_k, y_k)$ satisfies the modified secant condition given by Zhang et al. [32] and Zhang and Xu [33], where $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$. The algorithm uses the standard Wolfe line search conditions. Numerical comparisons with conjugate gradient algorithms show that this hybrid computational scheme outperforms a variant of the hybrid conjugate gradient algorithm given by Andrei [6], in which the pair $(s_k, y_k)$ satisfies the secant condition $\nabla^2 f(x_{k+1})\, s_k = y_k$, as well as the Hestenes-Stiefel and the Dai-Yuan conjugate gradient algorithms, and the hybrid conjugate gradient algorithms of Dai and Yuan. A set of 750 unconstrained optimization problems is used, some of them from the CUTE library.
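For reference, the standard Hestenes-Stiefel and Dai-Yuan update parameters and the convex-combination form described in the abstract can be summarized as follows (a sketch in the usual conjugate gradient notation with $g_k = \nabla f(x_k)$; the specific rule for selecting $\theta_k$ from the Newton direction and the modified secant condition is developed in the full text):

$$
d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k^{C} d_k,
$$
$$
\beta_k^{HS} = \frac{g_{k+1}^T y_k}{y_k^T d_k}, \qquad
\beta_k^{DY} = \frac{\|g_{k+1}\|^2}{y_k^T d_k}, \qquad
\beta_k^{C} = (1-\theta_k)\beta_k^{HS} + \theta_k\beta_k^{DY}, \quad \theta_k \in [0,1].
$$

When $\theta_k = 0$ the scheme reduces to the Hestenes-Stiefel algorithm, and when $\theta_k = 1$ it reduces to the Dai-Yuan algorithm.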

MSC: 49M07, 49M10, 90C06, 65K.

Keywords: Unconstrained optimization, hybrid conjugate gradient method, Newton direction, numerical comparisons.

Neculai Andrei graduated in Mathematics from the “A.I. Cuza” University of Iasi and in Electrical Engineering from the “Politehnica” University of Bucharest. He received his PhD degree in 1984 for his contributions to the digraph approach to large-scale sparse linear dynamic systems. He has been a senior scientific researcher at the National Institute for R&D in Informatics – ICI since 1973. He is the author of 13 books and textbooks, as well as over 50 published papers and 300 technical reports in the area of mathematical modeling and optimization. He is the key architect of the ALLO language and compiler for modeling in mathematical programming, as well as of several professional packages for linear programming and large-scale unconstrained optimization. His main current scientific interests centre on modeling for mathematical programming and on large-scale non-linear optimization: conjugate gradient methods, interior point methods, and penalty and barrier approaches. Dr. Andrei is a founding member of the Operations Research Society of Romania; a founding member of the Center for Advanced Modeling and Optimization; a member of the Editorial Board of Computational Optimization and Applications – Springer Verlag (since 1992); a member of the Computational Optimization and Applications – Optimization Forum; and a member of the Editorial Board of the Journal of Systems Science and Complexity – Science Press – China and Allerton Press – USA (since 2001). He is Editor-in-Chief of Advanced Modeling and Optimization – An Electronic International Journal – ICI Press (since 1999). Dr. Andrei is an Alexander von Humboldt fellow and a member of ERCIM and of the Academy of Romanian Scientists (since 2001).

CITE THIS PAPER AS:
Neculai ANDREI, A Hybrid Conjugate Gradient Algorithm with Modified Secant Condition for Unconstrained Optimization as a Convex Combination of Hestenes-Stiefel and Dai-Yuan Algorithms, Studies in Informatics and Control, ISSN 1220-1766, vol. 17 (4), pp. 373-392, 2008.