Abstract
An analysis is given of preconditioned nonlinear conjugate gradient methods in which the preconditioning matrix is the exact Hessian matrix at each iteration (or a nearby matrix). It is shown that the order of convergence of certain preconditioned methods is less than that of Newton's method when exact line searches are used, and an example is given.
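The abstract names the class of methods analyzed but, being a database record, gives no algorithmic detail. As a hedged illustration only, the sketch below shows one common variant of a Hessian-preconditioned nonlinear conjugate gradient iteration with a numerically "exact" line search. The preconditioned Fletcher–Reeves choice of the parameter beta, the strictly convex test function `quartic`, and all function names are assumptions made for this sketch; they are not taken from the paper and need not match the specific methods whose convergence order is analyzed there.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative strictly convex test problem (assumption, not from the paper).
# Its Hessian is positive definite everywhere, so it is a valid preconditioner.
def quartic(x):
    return x[0]**4 + 0.5 * x[0]**2 + x[0] * x[1] + x[1]**2

def quartic_grad(x):
    return np.array([4.0 * x[0]**3 + x[0] + x[1], x[0] + 2.0 * x[1]])

def quartic_hess(x):
    return np.array([[12.0 * x[0]**2 + 1.0, 1.0],
                     [1.0, 2.0]])

def hessian_preconditioned_cg(f, grad, hess, x0, tol=1e-10, max_iter=100):
    """Nonlinear CG preconditioned by the exact Hessian at each iterate,
    with an (approximately) exact line search along each direction."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    y = np.linalg.solve(hess(x), g)   # preconditioned gradient M_k^{-1} g_k
    d = -y                            # first direction is the Newton step
    gy_old = g @ y
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # "Exact" line search: minimize f along the ray x + alpha * d.
        alpha = minimize_scalar(lambda a: f(x + a * d)).x
        x = x + alpha * d
        g = grad(x)
        y = np.linalg.solve(hess(x), g)
        gy_new = g @ y
        beta = gy_new / gy_old        # preconditioned Fletcher-Reeves beta
        d = -y + beta * d
        gy_old = gy_new
    return x

x_star = hessian_preconditioned_cg(quartic, quartic_grad, quartic_hess,
                                   x0=[2.0, -1.0])
print(x_star)  # approaches the minimizer at the origin
```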
| Field | Value |
| --- | --- |
| Original language | English |
| Pages (from-to) | 658-665 |
| Number of pages | 8 |
| Journal | SIAM Journal on Scientific Computing |
| Volume | 17 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - May 1996 |
Keywords
- Conjugate gradient methods
- Newton's method
- Preconditioning
- Unconstrained optimization
ASJC Scopus subject areas
- Computational Mathematics
- Applied Mathematics