
Accelerated Conjugate Gradient Algorithm with Modified Secant Condition for Unconstrained Optimization

Neculai ANDREI1,2
1 Research Institute for Informatics, Center for Advanced Modeling and Optimization
8-10, Averescu Avenue, Bucharest 1, Romania
2 Academy of Romanian Scientists
54, Splaiul Independentei, Bucharest 5, Romania

Abstract: Conjugate gradient algorithms are very powerful methods for solving large-scale unconstrained optimization problems, characterized by low memory requirements and strong local and global convergence properties. More than 25 variants of conjugate gradient methods are known. In this paper we propose a fundamentally different method, in which the well-known parameter $\beta_k$ is computed by an approximation of the Hessian/vector product through a modified secant condition. For search direction computation, the method uses both the gradient and the function values at two successive iteration points and achieves high-order accuracy in approximating the second-order curvature of the minimizing function. For steplength computation the method exploits the fact that the step lengths in conjugate gradient algorithms may differ from 1 by two orders of magnitude and tend to vary in an unpredictable manner. Thus, we suggest an acceleration scheme able to improve the efficiency of the algorithm. Under common assumptions, the method is proved to be globally convergent. It is shown that for uniformly convex functions the convergence of the accelerated algorithm is still linear, but the reduction in function values is significantly improved. Numerical comparisons with some conjugate gradient algorithms (including CG_DESCENT by Hager and Zhang [19], CONMIN by Shanno and Phua [29], SCALCG by Andrei [3-5], and LBFGS by Liu and Nocedal [22]) using a set of 750 unconstrained optimization problems, some of them from the CUTE library, show that the suggested algorithm outperforms these conjugate gradient algorithms and LBFGS.

MSC: 49M07, 49M10, 90C06, 65K

Keywords: Unconstrained optimization, conjugate gradient method, Newton direction, modified secant condition, numerical comparisons.

CITE THIS PAPER AS:
Neculai ANDREI, Accelerated Conjugate Gradient Algorithm with Modified Secant Condition for Unconstrained Optimization, Studies in Informatics and Control, ISSN 1220-1766, vol. 18 (3), pp. 211-232, 2009.

Introduction


Let us consider the nonlinear unconstrained optimization problem

$\min f(x), \quad x \in \mathbb{R}^n,$                                                                                                (1.1)

where $f : \mathbb{R}^n \rightarrow \mathbb{R}$ is a continuously differentiable function, bounded from below. As we know, for solving this problem, starting from an initial guess $x_0 \in \mathbb{R}^n$, a nonlinear conjugate gradient method generates a sequence $\{x_k\}$ as

$x_{k+1} = x_k + \alpha_k d_k,$                                                                                                       (1.2)

where $\alpha_k > 0$ is obtained by line search and the directions $d_k$ are generated as

$d_{k+1} = -g_{k+1} + \beta_k s_k, \quad d_0 = -g_0.$                                                                            (1.3)

In (1.3) $\beta_k$ is known as the conjugate gradient parameter, $s_k = x_{k+1} - x_k$ and $g_k = \nabla f(x_k)$. Consider $\| \cdot \|$ the Euclidean norm and define $y_k = g_{k+1} - g_k$. The line search in the conjugate gradient algorithms is often based on the standard Wolfe conditions:

$f(x_k + \alpha_k d_k) - f(x_k) \le \rho \alpha_k g_k^T d_k,$                                                                   (1.4)

$g(x_k + \alpha_k d_k)^T d_k \ge \sigma g_k^T d_k,$                                                                                                      (1.5)

where $d_k$ is a descent direction and $0 < \rho < \sigma < 1$.

The search direction $d_k$, assumed to be a descent one, plays the main role in these methods. Different conjugate gradient algorithms correspond to different choices for the scalar parameter $\beta_k$. On the other hand, the stepsize $\alpha_k$ guarantees the global convergence in some cases and is crucial for efficiency. Plenty of conjugate gradient methods are known, and an excellent survey of these methods, with special attention to their global convergence, is given by Hager and Zhang [20]. A numerical comparison of conjugate gradient algorithms (1.2) and (1.3) with Wolfe line search (1.4) and (1.5), for different formulae for the parameter $\beta_k$, including the Dolan and Moré performance profile, is given in [6].
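
To make the recurrence (1.2)-(1.3) concrete, the following short sketch (in Python, for illustration only) implements a generic nonlinear conjugate gradient iteration with a Wolfe line search. It is not the algorithm proposed in this paper: the conjugate gradient parameter below is the classical Polak-Ribière choice with nonnegative truncation, used here merely as a placeholder for the modified-secant formula derived in section 2, and SciPy's line_search routine (which enforces the Wolfe conditions (1.4)-(1.5)) is assumed to be available.

import numpy as np
from scipy.optimize import line_search   # enforces the standard Wolfe conditions (1.4)-(1.5)

def cg_minimize(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic nonlinear CG sketch (illustrative PR+ variant, not the paper's method)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                        # d_0 = -g_0
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:                         # Wolfe search failed: restart along -g
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0] or 1e-4
        x_new = x + alpha * d                     # iteration (1.2)
        g_new = grad(x_new)
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)   # PR+ parameter (placeholder choice)
        d = -g_new + beta * d                     # classical update; the paper's (1.3) uses s_k = alpha_k d_k
        x, g = x_new, g_new
    return x

# Example usage on the Rosenbrock function:
# from scipy.optimize import rosen, rosen_der
# x_star = cg_minimize(rosen, rosen_der, np.array([-1.2, 1.0]))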

In [23] Jorge Nocedal articulated a number of open problems in conjugate gradient algorithms. Two of them seem especially important. One refers to the direction computation, in order to take the problem structure into account. In particular, when the problem is partially separable the idea is to use partitioned updating, as in quasi-Newton methods [18]. The second one focuses on the step length. Intensive numerical experiments with conjugate gradient algorithms showed that the step length may differ from 1 by up to two orders of magnitude, being larger or smaller than 1 depending on how the problem is scaled. Moreover, the sizes of the step length tend to vary in a totally unpredictable way. This is in contrast with the Newton and quasi-Newton methods, as well as with the limited memory quasi-Newton methods, which usually admit the unit step length for most of the iterations and require only very few function evaluations for step length determination.

In this paper we present a conjugate gradient algorithm which addresses these open problems. The structure of the paper is as follows. In section 2 we present a conjugate gradient algorithm with modified secant condition. The idea of this algorithm is to use the Newton direction for the computation of $\beta_k$ in (1.3). This leads us to a formula for $\beta_k$ which contains the Hessian of the minimizing function.
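
As background on the modified secant condition (the exact variant adopted in this paper is given in section 2), the condition introduced by Zhang, Deng and Chen [30], studied further in [31] and used by Yabe and Takano [32], asks the new Hessian approximation $B_{k+1}$ to incorporate function-value as well as gradient information:

\[
  B_{k+1} s_k = \tilde{y}_k, \qquad
  \tilde{y}_k = y_k + \frac{\theta_k}{s_k^T u_k}\, u_k, \qquad
  \theta_k = 6\bigl(f(x_k) - f(x_{k+1})\bigr) + 3\,(g_k + g_{k+1})^T s_k,
\]

where $u_k$ is any vector with $s_k^T u_k \ne 0$ (for example $u_k = s_k$). For sufficiently smooth $f$, this condition approximates the curvature $s_k^T \nabla^2 f(x_{k+1}) s_k$ to higher order than the classical secant condition $B_{k+1} s_k = y_k$, which only uses gradient information.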

In section 3 we present the convergence of the algorithm both for uniformly convex functions and for general nonlinear functions. We prove that, under common assumptions, if the direction is a descent one then the method is globally convergent. In section 4 we present an acceleration scheme for the algorithm. The idea of this computational scheme is to take advantage of the fact that the step lengths $\alpha_k$ in conjugate gradient algorithms are often very different from 1.

Therefore, we modify $\alpha_k$ in such a manner as to improve the reduction of the function values along the iterations. In section 5 we present the ACGMSEC algorithm and we prove that for uniformly convex functions the convergence of the accelerated algorithm is still linear, but the reduction in function values is significantly improved. Numerical comparisons of our algorithm with some other conjugate gradient algorithms, including CG_DESCENT by Hager and Zhang [19], CONMIN by Shanno and Phua [29], SCALCG by Andrei [3-5], and the limited memory quasi-Newton method LBFGS by Liu and Nocedal [22], are presented in section 6. For this we use a set of 750 unconstrained optimization problems presented in [1], some of them from the CUTE library [10]. We show that the suggested algorithm outperforms the above conjugate gradient algorithms and LBFGS.
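
The acceleration scheme itself is developed in section 4. As a rough illustration of the general idea (rescaling the line-search step by minimizing a one-dimensional quadratic model built from two gradients; the notation and safeguards below are ours, not necessarily those of the paper), a single accelerated update could look as follows:

import numpy as np

def accelerated_step(grad, x, d, alpha, g):
    """One accelerated update x -> x_{k+1}, given the Wolfe line-search step alpha.

    Illustration only: the step along d is rescaled by minimizing a quadratic
    model of f along d, built from the gradients at x and at x + alpha*d.
    """
    z = x + alpha * d                     # trial point produced by the line search
    gz = grad(z)
    slope = g @ d                         # phi'(0) < 0 for a descent direction d
    curv = (gz - g) @ d / alpha           # finite-difference estimate of d^T Hess f(x) d
    if curv > 1e-16:                      # positive curvature: the quadratic model has a minimizer
        t_star = -slope / curv            # its minimizer along d
        return x + t_star * d             # accelerated step (reduces to (1.2) when t_star == alpha)
    return z                              # otherwise keep the ordinary step (1.2)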

REFERENCES

  1. Andrei, N., Test functions for unconstrained optimization. http://www.ici.ro/camo/neculai/HYBRID/evalfg.for
  2. Andrei, N., An acceleration of gradient descent algorithm with backtracking for unconstrained optimization, Numerical Algorithms, Vol. 42, 2006, pp. 63-73.
  3. Andrei, N., Scaled conjugate gradient algorithms for unconstrained optimization, Computational Optimization and Applications, 38 (2007), pp. 401-416.
  4. Andrei, N., Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, Optimization Methods and Software, 22 (2007), pp. 561-571.
  5. Andrei, N., A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, Applied Mathematics Letters, 20 (2007), pp. 645-650.
  6. Andrei, N., Numerical comparison of conjugate gradient algorithms for unconstrained optimization, Studies in Informatics and Control, 16 (2007), pp. 333-352.
  7. Andrei, N., Acceleration of conjugate gradient algorithms for unconstrained optimization. Submitted.
  8. Andrei, N., Accelerated conjugate gradient algorithm with finite difference Hessian / vector product approximation for unconstrained optimization, ICI Technical Report, March 4, 2008.
  9. Birgin, E., J.M. Martínez, A spectral conjugate gradient method for unconstrained optimization, Applied Math. and Optimization, 43, 2001, pp. 117-128.
  10. Bongartz, I., A.R. Conn, N.I.M. Gould and P.L. Toint, CUTE: constrained and unconstrained testing environments, ACM Trans. Math. Software, 21, 1995, pp. 123-160.
  11. Dai, Y.H., L.Z. Liao, New conjugacy conditions and related nonlinear conjugate gradient methods, Appl. Math. Optim., 43 (2001), pp. 87-101.
  12. Dai, Y.H., Y. Yuan, An efficient hybrid conjugate gradient method for unconstrained optimization, Ann. Oper. Res., 103 (2001), pp. 33-47.
  13. Dai, Y.H., Han, J.Y., Liu, G.H., Sun, D.F., Yin, X., Yuan, Y., Convergence properties of nonlinear conjugate gradient methods, SIAM Journal on Optimization, 10 (1999), pp. 348-358.
  14. Daniel, J.W., The conjugate gradient method for linear and nonlinear operator equations, SIAM J. Numer. Anal., 4 (1967), pp. 10-26.
  15. Dolan, E.D., J.J. Moré, Benchmarking optimization software with performance profiles, Math. Programming, 91 (2002), pp. 201-213.
  16. Gilbert, J.C., J. Nocedal, Global convergence properties of conjugate gradient methods for optimization, SIAM J. Optim., 2 (1992), pp. 21-42.
  17. Goldstein, A.A., On steepest descent, SIAM J. Control, Vol. 3, 1965, pp. 147-151.
  18. Griewank, A., Ph.L. Toint, On the unconstrained optimization of partially separable objective functions, in Nonlinear Optimization 1981 (M.J.D. Powell, Ed.), Academic Press, London, pp. 301-312.
  19. Hager, W.W., H. Zhang, A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM Journal on Optimization, 16 (2005), pp. 170-192.
  20. Hager, W.W., H. Zhang, A survey of nonlinear conjugate gradient methods, Pacific Journal of Optimization, 2 (2006), pp. 35-58.
  21. Hestenes, M.R., E.L. Stiefel, Methods of conjugate gradients for solving linear systems, J. Research Nat. Bur. Standards, 49 (1952), pp. 409-436.
  22. Liu, D.C., J. Nocedal, On the limited memory BFGS method for large scale optimization, Mathematical Programming, 45 (1989), pp. 503-528.
  23. Nocedal, J., Conjugate gradient methods and nonlinear optimization, in Linear and nonlinear Conjugate Gradient related methods, L. Adams and J.L. Nazareth (eds.), SIAM, 1996, pp. 9-23.
  24. Perry, A., A modified conjugate gradient algorithm, Discussion Paper no. 229, Center for Mathematical Studies in Economics and Management Science, Northwestern University, (1976).
  25. Polak, E., G. Ribière, Note sur la convergence de méthodes de directions conjuguées, Rev. Française Informat. Recherche Opérationnelle, 3e Année, 16 (1969), pp. 35-43.
  26. Polyak, B.T., The conjugate gradient method in extreme problems, USSR Comp. Math. Math. Phys., 9 (1969), pp. 94-112.
  27. Schlick, T., A. Fogelson, TNPACK – A truncated Newton minimization package for large-scale problems: I Algorithm and usage, ACM Transactions on Mathematical Software, 18 (1992), pp. 46-70.
  28. Schlick, T., A. Fogelson, TNPACK – A truncated Newton minimization package for large-scale problems: II Implementation examples, ACM Transactions on Mathematical Software, 18 (1992), pp. 71-111.
  29. Shanno, D.F., K.H. Phua, Algorithm 500, Minimization of unconstrained multivariate functions, ACM Trans. Math. Soft., 2 (1976), pp. 87-94.
  30. Zhang, J.Z., N.Y. Deng, L.H. Chen, New quasi-Newton equation and related methods for unconstrained optimization, J. Optim. Theory Appl., 102 (1999), pp. 147-167.
  31. Zhang, J.Z., C.X. Xu, Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations, Journal of Computational and Applied Mathematics, 137 (2001), pp. 269-278.
  32. Yabe, H., M. Takano, Global convergence properties of nonlinear conjugate gradient methods with modified secant condition, Computational Optimization and Applications, 28 (2004), pp. 203-225.