Studies in Informatics and Control
Vol. 18, No. 3, 2009

Accelerated Conjugate Gradient Algorithm with Modified Secant Condition for Unconstrained Optimization

Neculai ANDREI
Abstract

Conjugate gradient algorithms are powerful methods for solving large-scale unconstrained optimization problems, characterized by low memory requirements and strong local and global convergence properties. More than 25 variants of the conjugate gradient method are known. In this paper we propose a fundamentally different method, in which the well-known parameter β_k is computed from an approximation of the Hessian/vector product obtained through a modified secant condition. To compute the search direction, the method uses both the gradient and the function values at two successive iteration points, achieving high-order accuracy in approximating the second-order curvature of the minimizing function. For steplength computation, the method exploits the observation that the step lengths in conjugate gradient algorithms may differ from 1 by two orders of magnitude and tend to vary in an unpredictable manner; accordingly, we suggest an acceleration scheme able to improve the efficiency of the algorithm. Under common assumptions, the method is proved to be globally convergent. It is shown that for uniformly convex functions the convergence of the accelerated algorithm is still linear, but the reduction in function values is significantly improved. Numerical comparisons with some conjugate gradient algorithms (including CG_DESCENT by Hager and Zhang [19], CONMIN by Shanno and Phua [29], SCALCG by Andrei [3-5], and LBFGS by Liu and Nocedal [22]) on a set of 750 unconstrained optimization problems, some of them from the CUTE library, show that the suggested algorithm outperforms these conjugate gradient algorithms and LBFGS.

MSC: 49M07, 49M10, 90C06, 65K
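The abstract does not reproduce the formulas themselves. For orientation only, a widely used modified secant condition of the kind referred to here (due to Zhang, Deng and Chen) replaces the standard secant relation B_{k+1} s_k = y_k with

B_{k+1} s_k = \tilde{y}_k, \qquad \tilde{y}_k = y_k + \frac{\eta_k}{s_k^{T} u_k}\, u_k, \qquad \eta_k = 6\left(f_k - f_{k+1}\right) + 3\left(g_k + g_{k+1}\right)^{T} s_k,

where s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k, and u_k is any vector with s_k^{T} u_k \neq 0 (commonly u_k = s_k). For sufficiently smooth f, \tilde{y}_k^{T} s_k approximates the curvature s_k^{T} \nabla^{2} f(x_{k+1}) s_k to higher order than y_k^{T} s_k does, which is the sense in which such conditions use function values in addition to gradients; whether the paper adopts exactly this form is not stated in the abstract.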

Keywords

Unconstrained optimization, conjugate gradient method, Newton direction, modified secant condition, numerical comparisons
