A Hybrid Conjugate Gradient Algorithm with Modified Secant Condition for Unconstrained Optimization as a Convex Combination of Hestenes-Stiefel and Dai-Yuan Algorithms
Neculai ANDREI1,2
1 Center for Advanced Modeling and Optimization, National Institute for R&D in Informatics, (ICI)
8-10, Averescu Avenue, Bucharest 1, Romania
2 Academy of Romanian Scientists
54, Splaiul Independenţei, Bucharest 5, Romania
Abstract: Another hybrid conjugate gradient algorithm is suggested in this paper. The parameter $\beta_k$ is computed as a convex combination of the $\beta_k^{HS}$ (Hestenes-Stiefel) and $\beta_k^{DY}$ (Dai-Yuan) formulae, i.e. $\beta_k^{C} = (1-\theta_k)\beta_k^{HS} + \theta_k \beta_k^{DY}$. The parameter $\theta_k$ in the convex combination is computed in such a way that the direction corresponding to the conjugate gradient algorithm is the Newton direction and the pair $(s_k, y_k)$ satisfies the modified secant condition given by Zhang et al. [32] and Zhang and Xu [33], where $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$. The algorithm uses the standard Wolfe line search conditions. Numerical comparisons with conjugate gradient algorithms show that this hybrid computational scheme outperforms a variant of the hybrid conjugate gradient algorithm given by Andrei [6], in which the pair $(s_k, y_k)$ satisfies the classical secant condition $\nabla^2 f(x_{k+1}) s_k = y_k$, as well as the Hestenes-Stiefel and the Dai-Yuan conjugate gradient algorithms and the hybrid conjugate gradient algorithms of Dai and Yuan. A set of 750 unconstrained optimization problems is used, some of them from the CUTE library.
MSC: 49M07, 49M10, 90C06, 65K.
Keywords: Unconstrained optimization, hybrid conjugate gradient method, Newton direction, numerical comparisons.
Neculai Andrei graduated in Mathematics from "A.I. Cuza" University – Iasi, and in Electrical Engineering from "Politehnica" University – Bucharest. He took his PhD degree in 1984 for his contributions to the Digraph Approach of Large-Scale Sparse Linear Dynamic Systems. He has been a senior scientific researcher at the National Institute for R&D in Informatics – ICI since 1973. He is the author of 13 books and textbooks, as well as over 50 published papers and 300 technical reports in the area of mathematical modeling and optimization. He is the key architect of the ALLO language and compiler for modeling in mathematical programming, as well as of several professional packages for linear programming and large-scale unconstrained optimization. His main current scientific interests centre on modeling for mathematical programming and on large-scale nonlinear optimization, conjugate gradient methods, interior point methods, and penalty and barrier approaches. Dr. Andrei is a founding member of the Operations Research Society of Romania; a founding member of the Center for Advanced Modeling and Optimization; a member of the Editorial Board of Computational Optimization and Applications – Springer Verlag (since 1992); a member of Computational Optimization and Applications – Optimization Forum; and a member of the Editorial Board of the Journal of Systems Science and Complexity – Science Press – China and Allerton Press – USA (since 2001). He is Editor-in-Chief of Advanced Modeling and Optimization – An Electronic International Journal – ICI Press (since 1999). Dr. Andrei is an Alexander von Humboldt fellow, a member of ERCIM, and a member of the Academy of Romanian Scientists (since 2001).
CITE THIS PAPER AS:
Neculai ANDREI, A Hybrid Conjugate Gradient Algorithm with Modified Secant Condition for Unconstrained Optimization as a Convex Combination of Hestenes-Stiefel and Dai-Yuan Algorithms, Studies in Informatics and Control, ISSN 1220-1766, vol. 17 (4), pp. 373-392, 2008.
1. Introduction
Let us consider the nonlinear unconstrained optimization problem

$$\min_{x \in \mathbb{R}^n} f(x), \qquad (1)$$

where $f:\mathbb{R}^n \rightarrow \mathbb{R}$ is a continuously differentiable function, bounded from below. As we know, for solving this problem, starting from an initial guess $x_0 \in \mathbb{R}^n$, a nonlinear conjugate gradient method generates a sequence $\{x_k\}$ as

$$x_{k+1} = x_k + \alpha_k d_k, \qquad (2)$$

where $\alpha_k > 0$ is obtained by line search and the directions $d_k$ are generated as

$$d_{k+1} = -g_{k+1} + \beta_k s_k, \qquad d_0 = -g_0. \qquad (3)$$

In (3) $\beta_k$ is known as the conjugate gradient parameter, $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$. Consider $\|\cdot\|$ the Euclidean norm and define $g_k = \nabla f(x_k)$. The line search in the conjugate gradient algorithms is often based on the standard Wolfe conditions:

$$f(x_k + \alpha_k d_k) - f(x_k) \le \rho \alpha_k g_k^T d_k, \qquad (4)$$

$$g_{k+1}^T d_k \ge \sigma g_k^T d_k, \qquad (5)$$

where $d_k$ is a descent direction and $0 < \rho \le \sigma < 1$.
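To make the recurrences (2)-(5) concrete, the following is a minimal sketch (not the code used in the paper) of a generic nonlinear conjugate gradient loop in Python. The helper names `nonlinear_cg` and `beta_rule` are our own illustrative choices, and SciPy's `line_search`, which enforces the strong Wolfe conditions, is used here as a stand-in for the standard Wolfe line search of (4)-(5).

```python
import numpy as np
from scipy.optimize import line_search

def nonlinear_cg(f, grad, x0, beta_rule, tol=1e-6, max_iter=1000):
    """Minimize f from x0; beta_rule(g_new, g_old, s, y) returns beta_k."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # d_0 = -g_0
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:  # stop when the gradient is small
            break
        alpha = line_search(f, grad, x, d, gfk=g)[0]  # Wolfe-type line search
        if alpha is None:                     # line search failed: restart
            d, alpha = -g, 1e-4
        x_new = x + alpha * d                 # iteration (2)
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g           # s_k = x_{k+1}-x_k, y_k = g_{k+1}-g_k
        d = -g_new + beta_rule(g_new, g, s, y) * s  # direction update (3)
        x, g = x_new, g_new
    return x
```

Any of the parameter formulae discussed below can be supplied as `beta_rule`.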
Different conjugate gradient algorithms correspond to different choices for the scalar parameter $\beta_k$. The methods of Fletcher and Reeves (FR) [19], of Dai and Yuan (DY) [13] and the Conjugate Descent (CD) proposed by Fletcher [18]:

$$\beta_k^{FR} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2}, \qquad \beta_k^{DY} = \frac{\|g_{k+1}\|^2}{y_k^T s_k}, \qquad \beta_k^{CD} = \frac{\|g_{k+1}\|^2}{-g_k^T s_k},$$

have strong convergence properties, but they may have modest practical performance due to jamming. On the other hand, the methods of Polak and Ribière [27] and Polyak (PRP) [28], of Hestenes and Stiefel (HS) [23] and of Liu and Storey (LS) [25]:

$$\beta_k^{PRP} = \frac{g_{k+1}^T y_k}{\|g_k\|^2}, \qquad \beta_k^{HS} = \frac{g_{k+1}^T y_k}{y_k^T s_k}, \qquad \beta_k^{LS} = \frac{g_{k+1}^T y_k}{-g_k^T s_k},$$

may not always be convergent, but they often have better computational performance.
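As a companion to the formulae above, the sketch below writes each of the six classical choices for $\beta_k$ as a small Python function with the signature used by `nonlinear_cg` from the previous sketch; the function names are ours, and the denominators follow the $s_k$, $y_k$ notation of (3).

```python
# g_new, g_old, s, y are assumed to be 1-D NumPy arrays: g_{k+1}, g_k, s_k, y_k.

def beta_fr(g_new, g_old, s, y):   # Fletcher-Reeves
    return g_new.dot(g_new) / g_old.dot(g_old)

def beta_dy(g_new, g_old, s, y):   # Dai-Yuan
    return g_new.dot(g_new) / y.dot(s)

def beta_cd(g_new, g_old, s, y):   # Conjugate Descent (Fletcher)
    return g_new.dot(g_new) / (-g_old.dot(s))

def beta_prp(g_new, g_old, s, y):  # Polak-Ribiere-Polyak
    return g_new.dot(y) / g_old.dot(g_old)

def beta_hs(g_new, g_old, s, y):   # Hestenes-Stiefel
    return g_new.dot(y) / y.dot(s)

def beta_ls(g_new, g_old, s, y):   # Liu-Storey
    return g_new.dot(y) / (-g_old.dot(s))
```

For example, `nonlinear_cg(f, grad, x0, beta_hs)` runs the Hestenes-Stiefel method under this sketch.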
In this paper we focus on hybrid conjugate gradient methods. These algorithms have been devised to use the attractive features of the above conjugate gradient algorithms. They are defined by (2) and (3), where the parameter $\beta_k$ is computed as a projection or as a convex combination of different conjugate gradient algorithms, as in Table 1.
Table 1. Hybrid conjugate gradient algorithms.
The hybrid computational schemes perform better than the classical conjugate gradient algorithms [5]. In [6] we presented another hybrid conjugate gradient algorithm as a convex combination of the Hestenes-Stiefel and the Dai-Yuan algorithms, where the parameter in the convex combination is computed in such a way that the direction corresponding to the conjugate gradient algorithm is the Newton direction and the pair $(s_k, y_k)$ satisfies the classical secant condition. Numerical experiments showed that this computational scheme outperforms the Hestenes-Stiefel and the Dai-Yuan conjugate gradient algorithms, as well as some other hybrid conjugate gradient algorithms [6]. In this paper, motivated by a result given by Zhang, Deng and Chen [32] and Zhang and Xu [33] concerning a better approximation of the curvature of the objective function along $s_k$ using the modified secant condition, we present another variant of the hybrid conjugate gradient algorithm for unconstrained optimization which performs much better and is more robust than the variant using the classical secant condition.
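For illustration only, the sketch below shows the generic form of such a convex combination of the HS and DY parameters. The computation of $\theta_k$ from the Newton-direction and modified-secant-condition requirements is the subject of Section 2 and is not reproduced here, so `theta` is simply taken as an input and clipped to [0, 1].

```python
def beta_hybrid(g_new, g_old, s, y, theta):
    """Convex combination (1 - theta) * beta_HS + theta * beta_DY, theta in [0, 1]."""
    beta_hs = g_new.dot(y) / y.dot(s)        # Hestenes-Stiefel term
    beta_dy = g_new.dot(g_new) / y.dot(s)    # Dai-Yuan term
    theta = min(max(theta, 0.0), 1.0)        # keep the combination convex
    return (1.0 - theta) * beta_hs + theta * beta_dy
```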
The structure of the paper is as follows. Section 2 introduces our hybrid conjugate gradient algorithm, HYBRIDM, as a convex combination of the HS and DY algorithms with the modified secant condition. Section 3 presents the algorithm, and in Section 4 its convergence analysis, both for uniformly convex functions and for general functions, is given. In Section 5 some numerical experiments and Dolan-Moré performance profiles [17] corresponding to this new hybrid conjugate gradient algorithm are presented. The performance profiles correspond to a set of 750 unconstrained optimization problems from the CUTE test problem library [10], as well as some other problems presented in [1]. It is shown that this hybrid conjugate gradient algorithm outperforms the classical HS and DY conjugate gradient algorithms, as well as the hybrid variants hDY and hDYz.
6. Conclusion
A large variety of conjugate gradient algorithms is known. In this paper we have presented a new hybrid conjugate gradient algorithm in which the parameter $\beta_k$ is computed as a convex combination of $\beta_k^{HS}$ and $\beta_k^{DY}$. The parameter $\theta_k$ in the convex combination is computed in such a way that the direction corresponding to this algorithm is the Newton direction. Using the modified secant condition we obtain an algorithm which proved to be more efficient than the algorithm based on the classical secant condition. For uniformly convex functions our algorithm is globally convergent. For general nonlinear functions we proved the global convergence of a variant of the algorithm using the strong Wolfe line search. The performance profile of our algorithm was higher than those of the well-established conjugate gradient algorithms HS and DY, and also of the hybrid variants hDY and hDYz, for a set of 750 unconstrained optimization problems. Additionally, the proposed hybrid conjugate gradient algorithm is more robust than the HS and DY conjugate gradient algorithms.