Studies in Informatics and Control
Vol. 14, No. 4, 2005

Conjugate Gradient Algorithm with Scaled Memoryless BFGS Preconditioner for Unconstrained Optimization

Neculai Andrei
Abstract

This work presents and analyzes a scaled memoryless BFGS preconditioned conjugate gradient algorithm, and its implementation, for solving unconstrained optimization problems. The basic idea is to combine the scaled memoryless BFGS method and the preconditioning technique within the framework of the conjugate gradient method. The preconditioner, which is also a scaled memoryless BFGS matrix, is reset when a restart criterion holds. The computational scheme is embedded into the Beale-Powell restart philosophy. The parameter scaling the gradient is selected either as the spectral gradient or in an anticipative manner, by means of a formula using the function values at two successive points. Under very mild conditions it is shown that, for strongly convex functions, the algorithm is globally convergent. Computational results and performance profiles, for a test set consisting of over 500 unconstrained optimization problems, show that this new scaled conjugate gradient algorithm substantially outperforms known conjugate gradient methods, including the spectral conjugate gradient of Birgin and Martínez [4], the conjugate gradient methods of Fletcher and Reeves [10], Polak and Ribière [17], and Hestenes and Stiefel [13], as well as the recent CG_DESCENT conjugate gradient method with guaranteed descent of Hager and Zhang [11,12].
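The core idea, a search direction d = -H g where H is a memoryless BFGS update of a scaled identity θI, can be sketched as follows. This is an illustrative Python sketch based only on the abstract, not the paper's SCALCG implementation: the spectral scaling θ = sᵀs / (yᵀs), the restart-to-steepest-descent guard, and the backtracking line search in the demo are all assumptions, and the function names are hypothetical.

```python
from typing import List

def dot(a: List[float], b: List[float]) -> float:
    return sum(u * v for u, v in zip(a, b))

def scaled_memoryless_bfgs_direction(g, s, y):
    """Search direction d = -H g, where H is the BFGS inverse-Hessian
    update of theta*I built only from the last step s = x_{k+1} - x_k and
    gradient change y = g_{k+1} - g_k (hence "memoryless"); theta is taken
    here as the spectral scaling s^T s / (y^T s) (an assumption)."""
    ys = dot(y, s)
    if ys <= 1e-12:
        # Curvature information unusable: restart with steepest descent
        # (illustrating the restart idea mentioned in the abstract).
        return [-gi for gi in g]
    theta = dot(s, s) / ys                       # spectral scaling parameter
    gs, gy, yy = dot(g, s), dot(g, y), dot(y, y)
    cy = theta * gs / ys
    cs = (1.0 + theta * yy / ys) * gs / ys - theta * gy / ys
    return [-theta * gi + cy * yi - cs * si
            for gi, yi, si in zip(g, y, s)]

# Demo on a convex quadratic f(x) = x1^2 + 10*x2^2, with a simple
# backtracking Armijo line search (the paper's line-search details
# are not assumed here).
f = lambda x: x[0] ** 2 + 10.0 * x[1] ** 2
grad = lambda x: [2.0 * x[0], 20.0 * x[1]]

x = [5.0, 1.0]
g = grad(x)
d = [-gi for gi in g]                            # first direction: steepest descent
for _ in range(100):
    if dot(g, g) < 1e-12:
        break
    t, fx, slope = 1.0, f(x), dot(g, d)          # slope < 0: d is a descent direction
    while f([xi + t * di for xi, di in zip(x, d)]) > fx + 1e-4 * t * slope:
        t *= 0.5                                 # backtrack until Armijo condition holds
    x_new = [xi + t * di for xi, di in zip(x, d)]
    g_new = grad(x_new)
    s = [a - b for a, b in zip(x_new, x)]
    y = [a - b for a, b in zip(g_new, g)]
    x, g = x_new, g_new
    d = scaled_memoryless_bfgs_direction(g, s, y)
```

On a strongly convex quadratic like this, the iterates converge rapidly to the minimizer at the origin; the guard on yᵀs keeps the implicit preconditioner positive definite, so every direction produced is a descent direction.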

Keywords

Unconstrained optimization, conjugate gradient method, spectral gradient method