
A Numerical Study on Efficiency and Robustness of Some Conjugate Gradient Algorithms for Large-scale Unconstrained Optimization

Neculai ANDREI 1,2

1 Center for Advanced Modeling and Optimization,
ICI Bucharest (National Institute for R&D in Informatics),
8-10 Averescu Blvd., 011455 Bucharest 1, Romania

2 Academy of Romanian Scientists,
54 Splaiul Independenţei, Bucharest 5, Romania

Abstract: A numerical evaluation and comparison, using performance profiles, of some representative conjugate gradient algorithms for solving a large variety of large-scale unconstrained optimization problems is carried out. For this intensive numerical study we selected eight well-known conjugate gradient algorithms: Hestenes-Stiefel (HS), Polak-Ribière-Polyak (PRP), CONMIN, ASCALCG, CG-DESCENT, AHYBRIDM, THREECG and DESCON. These algorithms differ in many respects; however, they share a number of common concepts, which makes the numerical comparison meaningful. In all algorithms the initial search direction is the negative gradient computed at the initial point, and the step length is computed by the Wolfe line search conditions. Except for CONMIN and CG-DESCENT, all the algorithms in this study implement an acceleration scheme that modifies the step length in a multiplicative manner to improve the reduction of the function values along the iterations. The numerical study is based on a set of 800 artificial large-scale unconstrained optimization test functions of different complexity and with different structures of their Hessian matrices. A detailed evaluation based on performance profiles shows that all of these algorithms are able to solve difficult large-scale unconstrained optimization problems. However, comparisons using only artificial test problems are weak and depend on arbitrary choices concerning the stopping criteria of the algorithms and on the decision of whether an algorithm has found a solution. Drawing definitive conclusions from comparisons based only on artificial test problems is an illusion. Using some real unconstrained optimization applications, however, we can reach a more confident conclusion about the efficiency and robustness of the optimization algorithms considered in this numerical study.
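The performance-profile methodology mentioned above can be sketched in a few lines. This is a minimal illustration of the standard Dolan-Moré construction, not the author's actual benchmarking code; the timing matrix below is a made-up example, with rows as problems, columns as solvers, and infinity marking a failure.

```python
import numpy as np

# Hypothetical costs (e.g. CPU time): rows = problems, columns = solvers.
# np.inf marks a failure on that problem.
T = np.array([
    [1.0,    2.0, np.inf],
    [3.0,    1.5, 3.0],
    [2.0,    2.0, 4.0],
    [np.inf, 5.0, 2.5],
])

def performance_profile(T, taus):
    """For each solver s and threshold tau, the fraction of problems on
    which the solver's cost is within a factor tau of the best solver:
    rho_s(tau) = |{p : r_{p,s} <= tau}| / n_problems."""
    best = np.min(T, axis=1, keepdims=True)   # best cost on each problem
    ratios = T / best                         # performance ratios r_{p,s}
    return np.array([[np.mean(ratios[:, s] <= tau)
                      for s in range(T.shape[1])]
                     for tau in taus])

# rho[i, s] = fraction of problems solver s handles within factor taus[i]
rho = performance_profile(T, taus=[1.0, 2.0, 4.0])
```

The value of rho at tau = 1 measures efficiency (fraction of problems on which a solver is fastest), while its limit for large tau measures robustness (fraction of problems solved at all), which is exactly the efficiency/robustness distinction drawn in the abstract.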

Keywords: Large-scale unconstrained optimization; conjugate gradient algorithms; numerical comparisons; benchmarking; performance profiles; data profiles; MINPACK-2 applications.

Neculai ANDREI, A Numerical Study on Efficiency and Robustness of Some Conjugate Gradient Algorithms for Large-scale Unconstrained Optimization, Studies in Informatics and Control, ISSN 1220-1766, vol. 22 (4), pp. 259-284, 2013.