Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent

Abstract

Recently, a new nonlinear conjugate gradient scheme was developed which satisfies the descent condition $g_k^T d_k \le -\frac{7}{8}\|g_k\|^2$ and which is globally convergent whenever the line search fulfills the Wolfe conditions. This article studies the convergence behavior of the algorithm; extensive numerical tests and comparisons with other methods for large-scale unconstrained optimization are given.
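
For concreteness, the direction update behind CG_DESCENT (reference 19 below) is $d_{k+1} = -g_{k+1} + \bar\beta_k d_k$, where $\bar\beta_k = \max\{\beta_k^N, \eta_k\}$ with $\beta_k^N = (y_k - 2 d_k \|y_k\|^2 / d_k^T y_k)^T g_{k+1} / d_k^T y_k$, $y_k = g_{k+1} - g_k$, and $\eta_k = -1/(\|d_k\| \min\{\eta, \|g_k\|\})$. The following is a minimal Python/NumPy sketch of that update together with a standard weak Wolfe check, using the default parameters reported in reference 19 ($\delta = 0.1$, $\sigma = 0.9$, $\eta = 0.01$); the function names are illustrative, not the library's API, and the released code additionally employs an "approximate Wolfe" line search variant not shown here.

    import numpy as np

    def hz_direction(g_new, g_old, d_old, eta=0.01):
        """Sketch of the CG_DESCENT direction update (Hager and Zhang 2005)."""
        y = g_new - g_old                       # y_k = g_{k+1} - g_k
        dy = d_old @ y                          # d_k^T y_k
        # beta_k^N = (y_k - 2 d_k ||y_k||^2 / d_k^T y_k)^T g_{k+1} / d_k^T y_k
        beta = (y - 2.0 * d_old * (y @ y) / dy) @ g_new / dy
        # Truncation max(beta_k^N, eta_k) preserves global convergence
        eta_k = -1.0 / (np.linalg.norm(d_old) * min(eta, np.linalg.norm(g_old)))
        return -g_new + max(beta, eta_k) * d_old

    def weak_wolfe(f, grad, x, d, alpha, delta=0.1, sigma=0.9):
        """Standard weak Wolfe conditions with CG_DESCENT's default parameters."""
        f0, g0d = f(x), grad(x) @ d
        x_new = x + alpha * d
        return (f(x_new) <= f0 + delta * alpha * g0d    # sufficient decrease
                and grad(x_new) @ d >= sigma * g0d)     # curvature condition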

References

  1. Al-Baali, M. 1985. Descent property and global convergence of the Fletcher-Reeves method with inexact line search. IMA J. Numer. Anal. 5, 121--124.
  2. Al-Baali, M. and Fletcher, R. 1984. An efficient line search for nonlinear least squares. J. Optim. Theory Appl. 48, 359--377.
  3. Bongartz, I., Conn, A. R., Gould, N. I. M., and Toint, P. L. 1995. CUTE: Constrained and unconstrained testing environments. ACM Trans. Math. Softw. 21, 123--160.
  4. Cohen, A. I. 1972. Rate of convergence of several conjugate gradient algorithms. SIAM J. Numer. Anal. 9, 248--259.
  5. Dai, Y. H. and Liao, L. Z. 2001. New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43, 87--101.
  6. Dai, Y. H. and Yuan, Y. 1999. A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10, 177--182.
  7. Dai, Y. H. and Yuan, Y. 2000. Nonlinear Conjugate Gradient Methods. Shanghai Science and Technology, Beijing.
  8. Dai, Y. H. and Yuan, Y. 2001. An efficient hybrid conjugate gradient method for unconstrained optimization. Ann. Oper. Res. 103, 33--47.
  9. Daniel, J. W. 1967. The conjugate gradient method for linear and nonlinear operator equations. SIAM J. Numer. Anal. 4, 10--26.
  10. Dolan, E. D. and Moré, J. J. 2002. Benchmarking optimization software with performance profiles. Math. Program. 91, 201--213.
  11. Fletcher, R. 1987. Practical Methods of Optimization, Vol. 1: Unconstrained Optimization. Wiley & Sons, New York.
  12. Fletcher, R. and Reeves, C. 1964. Function minimization by conjugate gradients. Comput. J. 7, 149--154.
  13. Gilbert, J. C. and Nocedal, J. 1992. Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2, 21--42.
  14. Goldstein, A. A. 1965. On steepest descent. SIAM J. Control 3, 147--151.
  15. Golub, G. H. and O'Leary, D. P. 1989. Some history of the conjugate gradient and Lanczos algorithms: 1948--1976. SIAM Rev. 31, 50--100.
  16. Hager, W. W. 1988. Applied Numerical Linear Algebra. Prentice-Hall, Englewood Cliffs, N.J.
  17. Hager, W. W. 1989. A derivative-based bracketing scheme for univariate minimization and the conjugate gradient method. Comput. Math. Appl. 18, 779--795.
  18. Hager, W. W. and Zhang, H. 2004. CG_DESCENT user's guide. Tech. Rep., Dept. Math., Univ. Florida.
  19. Hager, W. W. and Zhang, H. 2005. A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16, 170--192.
  20. Hager, W. W. and Zhang, H. 2006. A survey of nonlinear conjugate gradient methods. Pacific J. Optim. 2, 35--58.
  21. Han, J., Liu, G., Sun, D., and Yin, H. 2001. Two fundamental convergence theorems for nonlinear conjugate gradient methods and their applications. Acta Math. Appl. Sinica 17, 38--46.
  22. Han, J. Y., Liu, G. H., and Yin, H. X. 1997. Convergence of Perry and Shanno's memoryless quasi-Newton method for nonconvex optimization problems. OR Trans. 1, 22--28.
  23. Hestenes, M. R. and Stiefel, E. L. 1952. Methods of conjugate gradients for solving linear systems. J. Res. Nat. Bur. Standards 49, 409--436.
  24. Hirst, H. 1989. n-step quadratic convergence in the conjugate gradient method. Ph.D. thesis, Dept. Math., Penn. State Univ., State College, Penn.
  25. Lemaréchal, C. 1981. A view of line-searches. In Optimization and Optimal Control. Vol. 30. Springer-Verlag, Heidelberg, 59--79.
  26. Liu, D. C. and Nocedal, J. 1989. On the limited memory BFGS method for large scale optimization. Math. Program. 45, 503--528.
  27. Liu, Y. and Storey, C. 1991. Efficient generalized conjugate gradient algorithms, Part 1: Theory. J. Optim. Theory Appl. 69, 129--137.
  28. Moré, J. J. and Sorensen, D. C. 1984. Newton's method. In Studies in Numerical Analysis, G. H. Golub, Ed. Mathematical Association of America, Washington, D.C., 29--82.
  29. Moré, J. J. and Thuente, D. J. 1994. Line search algorithms with guaranteed sufficient decrease. ACM Trans. Math. Softw. 20, 286--307.
  30. Nocedal, J. 1980. Updating quasi-Newton matrices with limited storage. Math. Comp. 35, 773--782.
  31. Perry, J. M. 1977. A class of conjugate gradient algorithms with a two step variable metric memory. Tech. Rep. 269, Center for Mathematical Studies in Economics and Management Science, Northwestern Univ.
  32. Polak, E. and Ribière, G. 1969. Note sur la convergence de méthodes de directions conjuguées. Rev. Française Informat. Recherche Opérationnelle 3, 35--43.
  33. Polyak, B. T. 1969. The conjugate gradient method in extremal problems. USSR Comp. Math. Math. Phys. 9, 94--112.
  34. Powell, M. J. D. 1977. Restart procedures for the conjugate gradient method. Math. Program. 12, 241--254.
  35. Powell, M. J. D. 1984. Nonconvex minimization calculations and the conjugate gradient method. In Lecture Notes in Mathematics. Vol. 1066. Springer-Verlag, Berlin, 122--141.
  36. Powell, M. J. D. 1986. Convergence properties of algorithms for nonlinear optimization. SIAM Rev. 28, 487--500.
  37. Ramasubramaniam, A. 2000. Unconstrained optimization by a globally convergent high precision conjugate gradient method. M.S. thesis, Dept. Math., Univ. Florida.
  38. Shanno, D. F. 1978. On the convergence of a new conjugate gradient algorithm. SIAM J. Numer. Anal. 15, 1247--1257.
  39. Shanno, D. F. 1985. Globally convergent conjugate gradient algorithms. Math. Program. 33, 61--67.
  40. Shanno, D. F. and Phua, K. H. 1980. Remark on Algorithm 500. ACM Trans. Math. Softw. 6, 618--622.
  41. Wang, C., Han, J., and Wang, L. 2000. Global convergence of the Polak-Ribière and Hestenes-Stiefel conjugate gradient methods for the unconstrained nonlinear optimization. OR Trans. 4, 1--7.
  42. Wolfe, P. 1969. Convergence conditions for ascent methods. SIAM Rev. 11, 226--235.
  43. Wolfe, P. 1971. Convergence conditions for ascent methods II: Some corrections. SIAM Rev. 13, 185--188.
