Conjugate gradient methods represent an important class of unconstrained optimization algorithms with strong local and global convergence properties and modest memory requirements. This family includes many variants, well known in the literature, with important convergence properties and good numerical efficiency. The purpose of this paper is to present these algorithms together with their performance, measured by Dolan and Moré performance profiles, on a wide variety of large-scale unconstrained optimization problems. Comparisons with well-established limited-memory quasi-Newton and truncated Newton methods are also presented.
unconstrained optimization, conjugate gradient, hybrid conjugate gradient, scaled conjugate gradient, conjugacy condition, numerical comparisons, Dolan-Moré profile.